Related articles

문과생도 이해하는 딥러닝 (3) - 오차 역전파, 경사하강법 (Deep Learning Even Humanities Majors Can Understand (3) - Backpropagation and Gradient Descent): https://sacko.tistory.com/19

Backpropagation 설명 예제와 함께 완전히 이해하기 (Completely Understanding Backpropagation, Explained with Examples): http://jaejunyoo.blogspot.com/2017/01/backpropagation.html

Gradient Descent and Backpropagation (Ken Chen, LinkedIn): https://www.linkedin.com/pulse/gradient-descent-backpropagation-ken-chen

 

As we know, the loss function is a function of the weights and biases. Its gradient can therefore be calculated by taking the derivative of the loss with respect to each weight and bias, which tells us how much each variable contributes to the total error. However, because the layers are composed together (each layer's output feeds into the next), the influence of an early-layer weight on the loss is nested and accumulated through every later layer, and computing these gradients directly is not straightforward. This is where backpropagation comes to the rescue: it applies the chain rule layer by layer.
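To make that nesting concrete, here is a minimal sketch (my own illustration, not taken from the articles above; all variable names are hypothetical) of the chain rule for a two-layer network with scalar parameters, one sigmoid hidden unit, and a squared-error loss:

import numpy as np

# Tiny 2-layer network with scalar activations and squared-error loss:
#   h = sigmoid(w1 * x + b1)
#   y = w2 * h + b2
#   L = 0.5 * (y - t)^2
# The gradient of L w.r.t. w1 must pass through both y and h,
# which is exactly the "nested and accumulated" relationship above.

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x, t = 0.5, 1.0                       # input and target (arbitrary values)
w1, b1, w2, b2 = 0.8, 0.1, -0.4, 0.2  # arbitrary parameters

# Forward pass
z1 = w1 * x + b1
h = sigmoid(z1)
y = w2 * h + b2
loss = 0.5 * (y - t) ** 2

# Gradients via the chain rule, computed from the output backward
dL_dy = y - t                  # dL/dy
dL_dw2 = dL_dy * h             # dL/dw2 = dL/dy * dy/dw2
dL_db2 = dL_dy                 # dy/db2 = 1
dL_dh = dL_dy * w2             # error propagated back one layer
dL_dz1 = dL_dh * h * (1 - h)   # sigmoid'(z1) = h * (1 - h)
dL_dw1 = dL_dz1 * x            # dL/dw1 = dL/dz1 * dz1/dw1
dL_db1 = dL_dz1

print(dL_dw1, dL_db1, dL_dw2, dL_db2)

Note that dL_dw1 can only be computed after the error has been passed back through y and h; organizing that backward pass efficiently is what backpropagation does.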

If I were asked to describe the backpropagation algorithm in one sentence, it would be this: propagate the total error backward through the connections of the network layer by layer, compute the contribution (gradient) of every weight and bias to the total error at each layer, and then use the gradient descent algorithm to optimize the weights and biases, eventually minimizing the total error of the neural network.
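As a rough sketch of that one-sentence description (assuming a two-layer tanh network trained on toy data with mean-squared error; the setup and names are my own, not from the quoted article), the loop below performs a forward pass, propagates the error backward layer by layer, and applies a gradient descent update:

import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: learn y = sin(x) on a few points
X = rng.uniform(-3, 3, size=(64, 1))
T = np.sin(X)

# Two-layer network: input -> tanh hidden layer -> linear output
W1 = rng.normal(0, 0.5, size=(1, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, size=(16, 1)); b2 = np.zeros(1)
lr = 0.05  # learning rate

for step in range(2000):
    # Forward pass
    H = np.tanh(X @ W1 + b1)           # hidden activations
    Y = H @ W2 + b2                    # predictions
    loss = np.mean((Y - T) ** 2)

    # Backward pass: propagate the total error layer by layer
    dY = 2 * (Y - T) / len(X)          # dL/dY
    dW2 = H.T @ dY                     # contribution of output weights
    db2 = dY.sum(axis=0)
    dH = dY @ W2.T                     # error propagated to the hidden layer
    dZ1 = dH * (1 - H ** 2)            # tanh'(z) = 1 - tanh(z)^2
    dW1 = X.T @ dZ1                    # contribution of hidden weights
    db1 = dZ1.sum(axis=0)

    # Gradient descent update
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(f"final loss: {loss:.4f}")

Each dW term is a weight's "contribution to the total error" from the sentence above, and the last four lines of the loop are the gradient descent step that uses those contributions to reduce the loss.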
