Self-supervised learning is great for both AI and humans
Self-supervised methods take a “fill in the blank” approach: the model attempts to predict a missing piece of information given the surrounding context (a missing word in a sentence, a missing image patch, etc.).
The model’s answer is then compared to the original (which we must have), and the difference becomes the loss value to be minimized.
At scale, this allows models to build useful representations and extract semantics from data. Most modern deep neural networks are pre-trained this self-supervised way before they become useful for the target downstream task.
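To make the mechanism concrete, here is a minimal sketch of masked-token training, assuming PyTorch; the toy model, vocabulary size, and masking rate are illustrative stand-ins, not from any particular system.

```python
import torch
import torch.nn as nn

VOCAB_SIZE = 1000  # hypothetical vocabulary size
MASK_ID = 0        # hypothetical id reserved for the [MASK] token
EMBED_DIM = 64

# A toy encoder standing in for a real sequence model (e.g. a Transformer);
# it maps each token position to a distribution over the vocabulary.
model = nn.Sequential(
    nn.Embedding(VOCAB_SIZE, EMBED_DIM),
    nn.Linear(EMBED_DIM, VOCAB_SIZE),
)

tokens = torch.randint(1, VOCAB_SIZE, (8, 16))  # fake batch: 8 "sentences" of 16 tokens

# Hide ~15% of the tokens; the model must fill in the blanks.
mask = torch.rand(tokens.shape) < 0.15
inputs = tokens.clone()
inputs[mask] = MASK_ID

logits = model(inputs)  # shape: (batch, seq_len, vocab)

# Compare the model's guesses at the masked positions with the
# original tokens we hid; the difference is the loss to minimize.
loss = nn.functional.cross_entropy(logits[mask], tokens[mask])
loss.backward()
```

A real setup (BERT-style masked language modeling, for instance) would use a Transformer encoder and run this loop over massive corpora, which is where the useful representations emerge.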
What I realized is that humans learn best the same way! Do not simply read or listen. Instead, try to answer a tough question to the best of your current ability (even when starting a new topic: guess, find analogies, etc.).
Then compare your answer with the notes (the ground truth), and notice your mistakes and the pieces your answer was missing. Then attempt once more to answer the question perfectly on your own. Repeat.
It is the act of forcing ourselves to formulate a coherent answer that organizes the representations of information in our minds. Merely reading or listening attentively, however focused, doesn’t come close.
The best way to learn is to iteratively re-create the knowledge and spot the gaps in understanding. By the way, this is why it’s often said that teaching is a great way to enhance our own understanding.