I am currently following this tutorial http://stevenmiller888.github.io/mind-how-to-build-a-neural-network/ on building a neural network, but I'm getting confused by the backpropagation section.
Delta output sum = S'(sum) * (output sum margin of error)
Delta output sum = S'(1.235) * (-0.77)
Delta output sum = -0.13439890643886018
I just can't figure out how to get the delta output sum.
The sigmoid of 1.235 is 0.7747.
0.7747 * -0.77 = -0.596519 != -0.13439890643886018
This is the GPU fanboy wars board.
We don't discuss actual technology here.
You don't plug 1.235 into the sigmoid function, you plug it into the derivative (gradient) of the sigmoid function, you math illiterate idiot. The apostrophe ' is notation telling you that S' is the derivative (gradient) of S, the sigmoid function 1/(1+e^(-x)). S'(x) = (e^x)/((e^x+1)^2). This gives 0.174544 * (-0.77) = -0.13439890643886018
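Not part of the tutorial, but here's a quick sanity check of that arithmetic in Python (1.235 and -0.77 are the tutorial's numbers):

```python
import math

def sigmoid(x):
    # S(x) = 1 / (1 + e^(-x))
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_prime(x):
    # Derivative of the sigmoid: S'(x) = S(x) * (1 - S(x)),
    # algebraically the same as e^x / (e^x + 1)^2.
    s = sigmoid(x)
    return s * (1.0 - s)

output_sum = 1.235        # the tutorial's pre-activation sum
margin_of_error = -0.77   # target output minus actual output

delta_output_sum = sigmoid_prime(output_sum) * margin_of_error
print(sigmoid(output_sum))        # ~0.7747  (what OP plugged in, wrongly)
print(sigmoid_prime(output_sum))  # ~0.174544 (what the formula needs)
print(delta_output_sum)           # ~-0.1343989
```

Swap `sigmoid_prime` for `sigmoid` and you reproduce OP's wrong -0.5965 instead.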
>>60789989
S'(x) is the derivative of S(x)
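My own numerical check, not from the thread: a centered finite difference on S agrees with the closed form S(x)(1-S(x)), so the two expressions for S' really are the same function.

```python
import math

def S(x):
    # The sigmoid itself.
    return 1.0 / (1.0 + math.exp(-x))

def S_prime(x):
    # Closed form of the derivative: S'(x) = S(x) * (1 - S(x))
    return S(x) * (1.0 - S(x))

def numerical_derivative(f, x, h=1e-6):
    # Centered finite difference: (f(x+h) - f(x-h)) / (2h)
    return (f(x + h) - f(x - h)) / (2.0 * h)

for x in [-2.0, 0.0, 1.235, 3.0]:
    print(x, S_prime(x), numerical_derivative(S, x))  # the two columns closely agree
```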
>>60790085
apparently some real mathematicians do post here, >>60790094
>>60790103
>mathematicians
>>60790105
I've published math papers in optimization and signal processing and I post here.
>>60790116
Optimize THIS
*unzips dick*
>>60790280
Sorry, it's too feminine for my tastes
>>60789989
i had to run an iteration forward and then back, all on paper, back when i did AI at university, and it was easy compared to the HMM question
>>60790105
more like someone who didn't fail maths in middle school
>>60789989
Copy this and you'll figure out what your math is doing wrong.
https://github.com/codeplea/genann/blob/master/genann.c
>>60790280
I'd use a Genetic Algorithm, but you'd be dropped first.
>>60792008
>this is how actual CS students do math
>>60793148
>laughing girls
But anon, girls can't do math past high school geometry