AI class homework 3

These are my notes for homework 3 of the AI class.


= Homework =
== Naive Bayes Laplacian Smoothing ==
<youtube>Lj9ku_w8JAE</youtube>
Note: the size of the vocabulary is 11.
<youtube>evtCdmjcZ4I</youtube>
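For the hint above, here is a minimal sketch of how a Laplace (add-k) smoothed word probability would be computed. The <code>spam</code> messages and the function name are made-up illustrations; only the vocabulary size of 11 comes from the hint.
<syntaxhighlight lang="python">
from collections import Counter

def smoothed_word_prob(word, class_messages, vocab_size, k=1):
    """Laplace-smoothed estimate of P(word | class).

    class_messages: the messages of one class, each a list of words.
    vocab_size: number of distinct words across all classes (11 per the hint).
    k: smoothing strength (k = 1 is standard Laplace smoothing).
    """
    counts = Counter(w for msg in class_messages for w in msg)
    total = sum(counts.values())
    return (counts[word] + k) / (total + k * vocab_size)

# Hypothetical toy messages, not the homework data.
spam = [["offer", "is", "secret"], ["click", "secret", "link"]]
print(smoothed_word_prob("secret", spam, vocab_size=11))  # (2 + 1) / (6 + 11)
</syntaxhighlight>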
== Naive Bayes 2 ==
<youtube>VqJVQlsuGoA</youtube>
Note: Use the probabilities from the previous part (so Laplace smoothing still applies).
<youtube>LRQKhmXpDLI</youtube>
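A minimal sketch of how the smoothed probabilities from the previous part would be combined to classify a message, assuming a two-class SPAM/HAM setup; the priors, probabilities and message below are placeholder numbers, not the homework's values.
<syntaxhighlight lang="python">
import math

def log_score(words, prior, word_probs):
    """Unnormalised log-posterior: log P(class) + sum of log P(word | class)."""
    return math.log(prior) + sum(math.log(word_probs[w]) for w in words)

# Hypothetical Laplace-smoothed values (the priors would be smoothed the same way).
spam_prior, ham_prior = 0.5, 0.5
spam_probs = {"secret": 3 / 17, "sports": 1 / 17}
ham_probs = {"secret": 2 / 20, "sports": 4 / 20}

message = ["secret", "sports"]
spam_score = log_score(message, spam_prior, spam_probs)
ham_score = log_score(message, ham_prior, ham_probs)
print("SPAM" if spam_score > ham_score else "HAM")
</syntaxhighlight>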
== Maximum Likelihood ==
<youtube>9SDMNmgIhBE</youtube>
<youtube>3lA9jrqw7_4</youtube>
== Linear Regression ==
<youtube>rIO9zynD__M</youtube>
Note: The question is whether the line can be fit EXACTLY.
<youtube>yTYQg1XiBEQ</youtube>
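To make the question concrete, here is a minimal sketch that fits a least-squares line with the closed-form slope/intercept formulas and then checks whether every residual is zero, i.e. whether the line fits the data exactly. The data points are placeholders, not the homework's.
<syntaxhighlight lang="python">
def fit_line(xs, ys):
    """Closed-form least-squares fit of y = w0 + w1 * x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    w1 = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    w0 = my - w1 * mx
    return w0, w1

def fits_exactly(xs, ys, tol=1e-9):
    """True if the least-squares line passes through every point, i.e. an exact fit."""
    w0, w1 = fit_line(xs, ys)
    return all(abs(y - (w0 + w1 * x)) < tol for x, y in zip(xs, ys))

# Hypothetical points, not the homework values: the first set is collinear, the second is not.
print(fits_exactly([1, 2, 3], [2, 4, 6]))  # True  -> the line fits exactly
print(fits_exactly([1, 2, 3], [2, 5, 6]))  # False -> the best-fit line leaves residuals
</syntaxhighlight>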
== Linear Regression 2 ==
<youtube>5gIXtI82Olk</youtube>
<youtube>ynxLGEE_Bgo</youtube>
== K Nearest Neighbours ==
<youtube>MhDJ47KG_Oc</youtube>
<youtube>01qBi27m3Ss</youtube>
== K Nearest Neighbours 2 ==
<youtube>SAG4-uC9BnE</youtube>
<youtube>IjzpuYn7Szc</youtube>
== Perceptron ==
<youtube>-fpVTLGoxZ4</youtube>
<youtube>P88qJlIRnwI</youtube>
