A research team at Apple published a study in October comparing the visual representations learned by supervised and self-supervised algorithms. The title is “Do Self-Supervised and Supervised Methods Learn Similar Visual Representations?” From the abstract:
We find that the methods learn similar intermediate representations through dissimilar means, and that the representations diverge rapidly in the final few layers. We investigate this divergence, finding that it is caused by these layers strongly fitting to the distinct learning objectives. We also find that SimCLR’s objective implicitly fits the supervised objective in intermediate layers, but that the reverse is not true.
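Comparisons like this are typically made with a layer-wise similarity metric such as centered kernel alignment (CKA), which scores how alike two sets of activations are regardless of rotation or scale. Below is a minimal NumPy sketch of linear CKA for illustration; it is not the authors' code, and the random matrices stand in for real layer activations:

```python
import numpy as np

def linear_cka(X, Y):
    """Linear CKA between representation matrices
    X (n_samples, d1) and Y (n_samples, d2)."""
    # Center each feature dimension across samples.
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    # CKA = ||Y^T X||_F^2 / (||X^T X||_F * ||Y^T Y||_F)
    cross = np.linalg.norm(Y.T @ X, "fro") ** 2
    return cross / (np.linalg.norm(X.T @ X, "fro") *
                    np.linalg.norm(Y.T @ Y, "fro"))

rng = np.random.default_rng(0)
base = rng.standard_normal((256, 64))  # stand-in for one layer's activations
# Linear CKA is invariant to orthogonal transforms, so a
# rotated copy of the same representation scores near 1.
q, _ = np.linalg.qr(rng.standard_normal((64, 64)))
sim_same = linear_cka(base, base @ q)
# An unrelated random representation scores much lower.
sim_diff = linear_cka(base, rng.standard_normal((256, 64)))
print(sim_same, sim_diff)
```

Applied to real networks, this kind of metric is what lets one say that two models "learn similar intermediate representations" while diverging in the final layers.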
Check It Out: Apple ML Study Compares Supervised Versus Self-Supervised Learning