25. Perceptron algorithm

Step 1: Start with a random line
with blue and red sides.
Step 2: Pick a large number
(number of repetitions, or epochs): 1000
Step 3: Repeat 1000 times:
- Pick a random point
- If the point is correctly classified:
  - Do nothing
- If the point is incorrectly classified:
  - Move the line towards the point
Step 4: Enjoy your line that
separates the data!
36. Algorithm

Step 1: Start with a random line
of equation ax + by + c = 0
Step 2: Pick a large number
(number of repetitions, or epochs): 1000
Step 3: Pick a small number
(learning rate): 0.01
Step 4: Repeat 1000 times:
- Pick a random point
- If the point is correctly classified:
  - Do nothing
- If the point is incorrectly classified:
  - Move the line towards the point:
    - Add 0.01 to a
    - Add 0.01 to b
    - Add 0.01 to c
Step 5: Enjoy your fitted line!

Figure it out!
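One Step-4 update can be sketched in Python. The sign of the nudge (the question the slide leaves open) depends on which side of the line the misclassified point sits; this sketch assumes, as the later slides do, that red is the positive side, i.e. a point (p, q) is predicted red when ap + bq + c > 0. The function name and test values are illustrative, not from the slides.

```python
def perceptron_step(a, b, c, p, q, label, lr=0.01):
    """One update of the line ax + by + c = 0 for the point (p, q).

    Assumes red is the positive side: ap + bq + c > 0 predicts red.
    """
    prediction = 'red' if a * p + b * q + c > 0 else 'blue'
    if prediction == label:
        return a, b, c                    # correctly classified: do nothing
    if label == 'red':                    # red point in the blue area
        return a + lr, b + lr, c + lr     # add 0.01 to a, b, and c
    return a - lr, b - lr, c - lr         # blue point in the red area: subtract
```

For example, a blue point at (2, 2) with the line x + y - 1 = 0 is on the red side, so the step subtracts 0.01 from each coefficient, moving the line towards the point.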
Perceptron trick
61. Perceptron algorithm

Step 1: Start with a random line
of equation ax + by + c = 0
Step 2: Pick a large number
(number of repetitions, or epochs): 1000
Step 3: Pick a small number
(learning rate): 0.01
Step 4: Repeat 1000 times:
- Pick a random point (p, q)
- If the point is correctly classified:
  - Do nothing
- If the point is incorrectly classified, move the line towards the point:
  - If the point is blue, in the red area:
    - Subtract 0.01p from a
    - Subtract 0.01q from b
    - Subtract 0.01 from c
  - If the point is red, in the blue area:
    - Add 0.01p to a
    - Add 0.01q to b
    - Add 0.01 to c
Step 5: Enjoy your line!
69. Perceptron algorithm

Step 1: Start with a random line
of equation ax + by + c = 0
Step 2: Pick a large number
(number of repetitions, or epochs): 1000
Step 3: Pick a small number
(learning rate): 0.01
Step 4: Repeat 1000 times:
- Pick a random point (p, q)
- If the point is correctly classified:
  - Do nothing
- If the point is blue, in the red area:
  - Subtract 0.01p from a
  - Subtract 0.01q from b
  - Subtract 0.01 from c
- If the point is red, in the blue area:
  - Add 0.01p to a
  - Add 0.01q to b
  - Add 0.01 to c
Step 5: Enjoy your line!

The same update, written in terms of the equation (the red area is the side
where ap + bq + c > 0):
- If the point is correctly classified:
  - Do nothing
- If the point is blue, and ap + bq + c > 0:
  - Subtract 0.01p from a
  - Subtract 0.01q from b
  - Subtract 0.01 from c
- If the point is red, and ap + bq + c < 0:
  - Add 0.01p to a
  - Add 0.01q to b
  - Add 0.01 to c
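The whole Step 1–5 loop above can be sketched in Python, under the same convention that red is the ap + bq + c > 0 side of the line. The `perceptron` name and the sample points are illustrative.

```python
import random

def perceptron(points, epochs=1000, lr=0.01):
    """Fit the line ax + by + c = 0 with the perceptron trick.

    Each point is (p, q, label); red is assumed to be the positive side,
    i.e. ap + bq + c > 0 predicts red.
    """
    a, b, c = random.random(), random.random(), random.random()  # random line
    for _ in range(epochs):
        p, q, label = random.choice(points)   # pick a random point
        score = a * p + b * q + c
        if label == 'blue' and score > 0:     # blue point in the red area
            a -= lr * p
            b -= lr * q
            c -= lr
        elif label == 'red' and score < 0:    # red point in the blue area
            a += lr * p
            b += lr * q
            c += lr
        # otherwise the point is correctly classified: do nothing
    return a, b, c
```

On linearly separable data the loop stops changing the line once every point lands on its own side, since correctly classified points trigger no update.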
90. Log-loss error

[Figure: a bad line gives a large log-loss error; a good line gives a small
log-loss error. Minimizing the log-loss using calculus (gradient descent)
yields the logistic regression algorithm.]
91. Logistic regression algorithm

Step 1: Start with a random line
of equation ax + by + c = 0
Step 2: Pick a large number
(number of repetitions, or epochs): 1000
Step 3: Pick a small number
(learning rate): 0.01
Step 4: Repeat 1000 times:
- Pick a random point (p, q) with label y and prediction ŷ
- If the point is correctly classified:
  - Move the line away from the point
- If the point is incorrectly classified:
  - Move the line towards the point
- In either case:
  - Add 0.01(y - ŷ)p to a
  - Add 0.01(y - ŷ)q to b
  - Add 0.01(y - ŷ) to c
Step 5: Enjoy your fitted line!
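A runnable sketch of this update, assuming y = 1 for red points and y = 0 for blue points, with ŷ taken to be the sigmoid of ap + bq + c (the usual logistic-regression prediction). The function name and sample data are illustrative.

```python
import math
import random

def logistic_regression(points, epochs=1000, lr=0.01):
    """Fit the line ax + by + c = 0 with the slide's update rule.

    Each point is (p, q, y) with y = 1 for red and y = 0 for blue (assumed).
    """
    a, b, c = random.random(), random.random(), random.random()  # random line
    for _ in range(epochs):
        p, q, y = random.choice(points)                    # pick a random point
        y_hat = 1 / (1 + math.exp(-(a * p + b * q + c)))   # sigmoid prediction
        # Every point nudges the line: away when correct, towards when not,
        # because (y - y_hat) is small and of the right sign either way.
        a += lr * (y - y_hat) * p
        b += lr * (y - y_hat) * q
        c += lr * (y - y_hat)
    return a, b, c
```

Unlike the perceptron, the update never skips a point: correctly classified points still give a small (y - ŷ), which is what makes every point participate in Step 4.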