## What is backpropagation really doing? | Deep learning, chapter 3

1. #### Skylight

I remember seeing this a year back and not really understanding what was going on. I have recently started Andrew Ng's Machine Learning course, and now this video feels a lot clearer 😀

2. #### timkolm2

@3Blue1Brown Why was there no explanation on how the biases should be adjusted?
Otherwise great series!

3. #### Ryan Denziloe

The stochastic gradient descent analogy of the quickly stumbling drunk man is just perfect.

4. #### Deepak

I like that pi.

5. #### Raver

I love these moments in maths or other sciences where you have a singular moment of pure enlightenment. You got me at 7:37… I got goosebumps when the first column of "+" appeared. This is just brilliant.

6. #### Ajay Hemanth

very good explanation …

7. #### user 53503

Love this channel

8. #### mondlos

for non-natives: "bang for your buck" = the effect or value you get for the effort you put in

9. #### mondlos

1. Gradient descent (step): you compute the negative gradient of your multidimensional cost function at a (random) starting point, move to the point the negative gradient vector points to, and compute the gradient again. You repeat this until you find a point where the magnitude of the gradient vector is very small (approaching zero), meaning you descend until you reach a local minimum of the cost function.

2. Impact on the result, a.k.a. bang for the buck: focusing on a single training input, each weight should be nudged by an amount proportional to the activation of the neuron it connects from; the learning rate alpha is the proportionality factor. Let's focus on the last layer: for a single neuron there, you compute the desired changes to the activations of the neurons in the previous layer. You do this for every neuron in the last layer and sum all of these desired changes for each neuron in the previous layer.

3. Backpropagation: it's called backpropagation because you first compute the weight adjustments for the last layer, then move backwards through the hidden layers one at a time, from the last hidden layer to the first.
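Step 1 above (descend until the gradient nearly vanishes) can be sketched in a few lines of plain Python. This is only an illustration under made-up assumptions: the cost function, starting point, and learning rate `alpha` are invented here, not taken from the video.

```python
# A minimal sketch of the gradient-descent loop: repeatedly step against the
# gradient until its magnitude approaches zero (a local minimum).

def gradient_descent(grad, x0, alpha=0.1, tol=1e-6, max_iters=10_000):
    """Follow the negative gradient of a cost function until it nearly vanishes."""
    x = list(x0)
    for _ in range(max_iters):
        g = grad(x)
        if sum(gi * gi for gi in g) ** 0.5 < tol:   # |gradient| ~ 0: stop
            break
        x = [xi - alpha * gi for xi, gi in zip(x, g)]  # move downhill
    return x

# Toy example: C(x, y) = (x - 3)^2 + (y + 1)^2, whose gradient vanishes at (3, -1).
grad_C = lambda p: [2 * (p[0] - 3), 2 * (p[1] + 1)]
minimum = gradient_descent(grad_C, [0.0, 0.0])
```

For this toy bowl-shaped cost the loop converges to (3, -1); a real network does the same thing, except the "point" is the full vector of all weights and biases.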

10. #### Awesome Kid

is a cost function the same as a loss function?

11. #### - RedBlazerFlame -

<!DOCTYPE html>
<html>
<head>
<title>What is backpropagation really doing? | Deep learning, chapter 3</title>
</head>
<body>
<p class="topic" id="plan"> … </p>

<p class="topic" id="Stochastic gradient descent">It would take a long time to compute the average cost over all the data in your training set. So instead of averaging the cost over everything in your dataset, you take a mini-batch and compute the cost on that. Sure, it is less accurate, but you get a significant computational speedup!</p>

</body>
</html>
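The mini-batch idea in the comment above can be sketched as follows. The toy data, the one-parameter model (fitting the slope `w` of `y = w*x`), the batch size, and the learning rate are all made-up assumptions for illustration, not anything from the video.

```python
# A minimal sketch of mini-batch stochastic gradient descent: instead of
# averaging the cost gradient over the whole training set, average it over a
# small random batch each step.
import random

data = [(x, 2.0 * x) for x in range(100)]  # toy dataset; the "true" slope is 2

def grad_on_batch(w, batch):
    # derivative w.r.t. w of the mean squared error (w*x - y)^2 over the batch
    return sum(2 * (w * x - y) * x for x, y in batch) / len(batch)

random.seed(0)
w, alpha = 0.0, 1e-4
for step in range(2000):
    batch = random.sample(data, 10)      # a mini-batch, not the full dataset
    w -= alpha * grad_on_batch(w, batch)
```

Each step is noisier than a full-dataset gradient (the drunk man from the analogy), but each step is 10x cheaper here, and `w` still stumbles toward the true slope of 2.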

12. #### 남재현

Um… the subtitles translate "bias" as "prejudice"… and several other words are translated with meanings that don't match the actual context…

13. #### bfbfbfb bfbfbf

What an awful subtitle translation. Whoever made it is a vandal: without engaging with the topic at all, they just ran the English subtitles through Google Translate and called it done.

You are great….

15. #### Sarim Mehdi

Quite frustrating that I have to end up here on this guy's channel. I don't like his video format (what's up with this weird music, as if this is some super mysterious stuff? It actually turns away a lot of people, and I don't think he realizes he is making it look way more complicated than it should be). However, I don't see any proper video on backpropagation, so I have to make use of this one. Still disliking because of the poor presentation of rather simple ideas. Why don't people use a simple pen-and-paper approach, as should be the case? Weird animations like these just scare a lot of people away.

16. #### DowzerWTP72

I'm forcing myself to watch all these videos before going on to write my own code to do this! I've got some awesome ideas that I can't wait to implement, plus in my next year of University, I will be doing computer vision and machine learning, so having this under my belt will be really great! I just wanna start programming though!!

17. #### Vitalijs Fescenko

Not drunk, just stochastic

18. #### Phil Myday

Wonderful videos! You would have been my favorite teacher in education.

19. #### generaldefence

I use Brave Rewards on you, man, you're that worth it 😀

20. #### Sanwal Yousaf

The song in the background reminds me of the theme song of CityVille, the Facebook game from 2010.

21. #### Charlie Angkor

Once I took the description of the visual system by Hubel and Wiesel and programmed a neural network to simulate it, and it was amazing. Then I used the tricks I learned from the brain to program a language cortex. That is even more amazing, because I just flush a language corpus into it, without even telling it what encoding it is in or where the word boundaries are, and it learns all the phrases, words, suffixes, and prefixes by itself. How do I figure out what I programmed when I don't understand neural network theory at all? Like, what it's called and so on? I find it boring to study and a lot of mental effort. I'm too lazy to decipher the ideas from the mathematical formulae.

22. #### Tianze Wang

So, how could we make a donation?

23. #### mixbaal0

Wittgenstein said something like "What can be said at all can be said clearly; whereof one cannot speak, thereof one must be silent." You speak very, very clearly. Congratulations!

24. #### Mario G.

11:51 I think there's a mistake in one of the equations. You have z superscript L superscript L instead of just z superscript L.

25. #### Zachary Thatcher

What kind of resources do you need to apply neural networks and AI to researching other fields? I know that learning is computationally expensive, but if I am doing image object recognition on a large dataset, is there any description of time or space complexity as a function of the size of the dataset or the number of levels/nodes in a neural network?

26. #### Bat YOK

You must be wicked smart! Cause the way you explain things is just awesome

27. #### Lyaman Agabekova

Thank you for the explanation. Amazing channel!