Backpropagation-inspired learning

If there is one thing we can learn from the backpropagation method in neural networks, it is how to reverse-engineer our daily learning process.

For example, suppose you want to learn a complex topic. You could start by reading the technical references (like Wikipedia) and so on. The problem is that those kinds of materials are often messy and jumbled with even more technical terms (ugh) and jargon. You will be overwhelmed before you reach an understanding of that particular topic.

Let’s drill into a more concrete example:

Say I would like to grasp the concept behind Convolutional Neural Networks (CNNs), and I know that the father of the CNN is the director of Facebook AI Research, Yann LeCun. Normally, I could do a deep dive through his Google Scholar page or his group’s publication site. But that could take more time than I imagine just to understand the general concept and perhaps some comparisons. The easiest thing I could do instead is browse YouTube and watch his interviews and presentations about CNNs. Usually, these presentations are delivered to a less technical audience (unless you watch an academic one, and even that is simpler than starting from his papers). Certainly, he will tone down the technicalities (and add a people-friendly touch) so the audience can understand the message he is trying to convey.

Usually, you will be able to climb the lower gradient of the complexity through his presentations and interviews. Next, I go to a discussion site like Quora for more detail, before finally working up to his publications.

This way, you can learn faster and better. Remember, the key here is FAST. I’m not saying you do not need to learn the nitty-gritty or the bits and bobs of that particular topic, but this approach will push you towards a better understanding, faster.

Elon Musk also uses a similar method to become an expert-generalist in many, many fields. He starts with the knowledge tree first and gets to the details later. The structure is the most important part when you try to understand a substantial concept.

This easy hack is inspired by the backpropagation training method in neural networks: learn the result or general picture first, then work backwards under the curtains.
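
To make the analogy concrete, here is a minimal sketch (a hypothetical toy example, not from the original post) of why backpropagation runs "backwards": the gradient computation starts from the final result (the loss) and peels back towards the earlier details, one layer at a time.

```python
# Tiny two-layer chain: x -> h = w1 * x -> y = w2 * h, loss = (y - target)^2
x, target = 2.0, 10.0
w1, w2 = 1.5, 0.5

# Forward pass: compute the "big picture" result first.
h = w1 * x                     # hidden value
y = w2 * h                     # prediction
loss = (y - target) ** 2       # final result

# Backward pass: start at the output and work towards the inputs.
dloss_dy = 2 * (y - target)    # gradient at the very end
grad_w2 = dloss_dy * h         # outer layer first
grad_w1 = dloss_dy * w2 * x    # then the deeper detail, via the chain rule

print(grad_w1, grad_w2)        # gradients obtained by working backwards
```

The learning hack follows the same direction of travel: get the output-level picture first, then chase the chain back into the details only as far as you need.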