AI class homework 3

These are my notes for homework 3 of the AI class.

== Homework ==

== Naive Bayes Laplacian Smoothing ==


<youtube>Lj9ku_w8JAE</youtube>


Note: as a hint, the size of the vocabulary is 11.


<youtube>evtCdmjcZ4I</youtube>
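
The hint matters because the vocabulary size sits in the denominator of the Laplace-smoothed estimate: P(word | class) = (count(word, class) + k) / (total words in class + k · |V|). Here is a minimal sketch of that formula, using made-up counts rather than the homework's actual data (only the vocabulary size of 11 comes from the hint):

<syntaxhighlight lang="python">
def laplace_word_prob(word, word_counts, vocab_size, k=1):
    """Add-k (Laplace) smoothed estimate of P(word | class):
    (count(word) + k) / (total words in the class + k * vocab_size)."""
    total = sum(word_counts.values())
    return (word_counts.get(word, 0) + k) / (total + k * vocab_size)

# Hypothetical per-class counts, NOT the homework's data; the hint says |V| = 11.
spam_counts = {"secret": 3, "offer": 1, "link": 2, "click": 1, "sports": 1, "is": 1}
print(laplace_word_prob("secret", spam_counts, vocab_size=11))  # (3+1)/(9+11) = 0.2
print(laplace_word_prob("today", spam_counts, vocab_size=11))   # (0+1)/(9+11) = 0.05
</syntaxhighlight>

With k = 1 a word that never appears in a class still gets a small non-zero probability, so a single unseen word no longer zeroes out the whole product.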


== Naive Bayes 2 ==


<youtube>VqJVQlsuGoA</youtube>


Note: Use the probabilities from the previous part (so Laplace smoothing still applies).


<youtube>LRQKhmXpDLI</youtube>
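
For this part the smoothed per-word probabilities get multiplied together under the Naive Bayes independence assumption, along with a smoothed class prior. A rough sketch of that classification step, again with hypothetical counts in place of the homework's message data:

<syntaxhighlight lang="python">
def naive_bayes_classify(message, class_word_counts, class_message_counts,
                         vocab_size, k=1):
    """Pick the class maximising P(class) * prod_i P(word_i | class), with
    Laplace smoothing applied to both the word probabilities and the prior."""
    n_messages = sum(class_message_counts.values())
    n_classes = len(class_message_counts)
    best_class, best_score = None, -1.0
    for c, counts in class_word_counts.items():
        total_words = sum(counts.values())
        # Smoothed prior: (messages in class + k) / (all messages + k * #classes)
        score = (class_message_counts[c] + k) / (n_messages + k * n_classes)
        for w in message.split():
            score *= (counts.get(w, 0) + k) / (total_words + k * vocab_size)
        if score > best_score:
            best_class, best_score = c, score
    return best_class

# Hypothetical counts, NOT the homework's data (vocabulary size 11 per the hint).
class_word_counts = {
    "spam": {"secret": 3, "offer": 1, "link": 2, "click": 1, "sports": 1, "is": 1},
    "ham":  {"sports": 3, "play": 2, "today": 2, "went": 1, "secret": 1, "event": 1},
}
class_message_counts = {"spam": 3, "ham": 5}
print(naive_bayes_classify("secret link", class_word_counts,
                           class_message_counts, vocab_size=11))  # "spam" here
</syntaxhighlight>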


== Maximum Likelihood ==


<youtube>9SDMNmgIhBE</youtube>


<youtube>3lA9jrqw7_4</youtube>
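
As a refresher for these questions: for the simple cases in the class, the maximum likelihood estimates are just the obvious empirical quantities, the observed frequency for a Bernoulli/coin parameter, and the sample mean and average squared deviation (dividing by N, not N − 1) for a Gaussian. A small sketch with made-up numbers, not the homework's:

<syntaxhighlight lang="python">
def bernoulli_mle(flips):
    """ML estimate of P(heads): the observed frequency, count(heads) / N."""
    return sum(flips) / len(flips)

def gaussian_mle(xs):
    """ML estimates for a Gaussian: mu = sample mean, sigma^2 = average
    squared deviation from the mean (dividing by N, not N - 1)."""
    n = len(xs)
    mu = sum(xs) / n
    var = sum((x - mu) ** 2 for x in xs) / n
    return mu, var

# Hypothetical data for illustration only.
print(bernoulli_mle([1, 1, 0, 1]))         # 0.75
print(gaussian_mle([3.0, 4.0, 5.0, 8.0]))  # (5.0, 3.5)
</syntaxhighlight>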


== Linear Regression ==


<youtube>rIO9zynD__M</youtube>


Note: The question is whether the line can be fit ''exactly''.


<youtube>yTYQg1XiBEQ</youtube>
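
One way to answer the "can the line fit exactly?" question is to compute the least-squares line in closed form and check whether every residual is zero. A sketch of that check, with hypothetical data standing in for the homework's points:

<syntaxhighlight lang="python">
def least_squares_line(xs, ys):
    """Closed-form least-squares fit of y = w0 + w1 * x:
    w1 = (N*sum(xy) - sum(x)*sum(y)) / (N*sum(x^2) - sum(x)^2)
    w0 = (sum(y) - w1*sum(x)) / N"""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxy = sum(x * y for x, y in zip(xs, ys))
    sxx = sum(x * x for x in xs)
    w1 = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    w0 = (sy - w1 * sx) / n
    return w0, w1

def fits_exactly(xs, ys, tol=1e-9):
    """True if every point lies on the least-squares line (all residuals zero)."""
    w0, w1 = least_squares_line(xs, ys)
    return all(abs(y - (w0 + w1 * x)) <= tol for x, y in zip(xs, ys))

# Hypothetical data: y = 2x + 1 exactly, so this prints True.
print(fits_exactly([1, 2, 3, 4], [3, 5, 7, 9]))
</syntaxhighlight>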


== Linear Regression 2 ==


<youtube>5gIXtI82Olk</youtube>


<youtube>ynxLGEE_Bgo</youtube>


== K Nearest Neighbours ==


<youtube>MhDJ47KG_Oc</youtube>


<youtube>01qBi27m3Ss</youtube>
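
For reference, k-nearest-neighbour classification is just a majority vote among the k training points closest to the query. A sketch using Euclidean distance and made-up 2-D points rather than the homework's:

<syntaxhighlight lang="python">
from collections import Counter
import math

def knn_classify(query, points, labels, k):
    """Majority vote over the k training points closest to the query
    (Euclidean distance; ties are resolved arbitrarily by Counter)."""
    nearest = sorted(range(len(points)),
                     key=lambda i: math.dist(query, points[i]))[:k]
    votes = Counter(labels[i] for i in nearest)
    return votes.most_common(1)[0][0]

# Hypothetical 2-D points with two classes.
points = [(1, 1), (2, 1), (1, 2), (6, 6), (7, 6), (6, 7)]
labels = ["minus", "minus", "minus", "plus", "plus", "plus"]
print(knn_classify((2, 2), points, labels, k=3))  # "minus"
</syntaxhighlight>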


== K Nearest Neighbours 2 ==


<youtube>SAG4-uC9BnE</youtube>


<youtube>IjzpuYn7Szc</youtube>


== Perceptron ==


<youtube>-fpVTLGoxZ4</youtube>


<youtube>P88qJlIRnwI</youtube>
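
And for the perceptron questions, the underlying update rule: whenever an example lands on the wrong side of the current boundary, add (label × input) to the weights and repeat until nothing is misclassified. A sketch with hypothetical linearly separable data, not the homework's:

<syntaxhighlight lang="python">
def perceptron_train(points, labels, epochs=100):
    """Perceptron learning rule: for each misclassified example x with label
    y in {-1, +1}, update w <- w + y*x and b <- b + y. This only converges
    when the data are linearly separable."""
    dim = len(points[0])
    w, b = [0.0] * dim, 0.0
    for _ in range(epochs):
        errors = 0
        for x, y in zip(points, labels):
            activation = sum(wi * xi for wi, xi in zip(w, x)) + b
            if y * activation <= 0:          # wrong side of (or on) the boundary
                w = [wi + y * xi for wi, xi in zip(w, x)]
                b += y
                errors += 1
        if errors == 0:                      # every example classified correctly
            break
    return w, b

# Hypothetical linearly separable data (labels are -1 / +1).
points = [(2, 1), (3, 2), (0, 3), (-1, 2)]
labels = [1, 1, -1, -1]
print(perceptron_train(points, labels))
</syntaxhighlight>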
