"However, even billions of neurons and dozens of training decisions can result in a stable representation. Any language model worth its salt can complete the analogy “husband is to wife as king is to ____”."

Beware that this isn't really the case: https://blog.esciencecenter.nl/king-man-woman-king-9a7fd2935a85
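For anyone curious what the linked post is getting at: the classic demo computes the vector king − man + woman and returns the nearest word by cosine similarity, but the raw nearest neighbor is typically king itself; "queen" surfaces because common implementations quietly exclude the input words from the candidates. A minimal sketch of that effect, using invented 2-d toy vectors (real embeddings are hundreds of dimensions, and the numbers here are chosen purely to mimic the phenomenon):

```python
import math

# Hypothetical toy vectors (dim 0 ~ "royalty", dim 1 ~ "gender").
# Values are invented to reproduce the effect described in the linked
# post, not taken from any real embedding model.
vecs = {
    "king":  (2.0, 0.5),
    "queen": (1.0, -1.5),
    "man":   (0.2, 0.6),
    "woman": (0.2, -0.6),
    "apple": (0.1, 0.05),
}

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.hypot(*u) * math.hypot(*v))

def analogy(a, b, c, exclude_inputs=True):
    """Nearest word to vec(b) - vec(a) + vec(c) by cosine similarity."""
    target = tuple(vecs[b][i] - vecs[a][i] + vecs[c][i] for i in range(2))
    candidates = [w for w in vecs
                  if not (exclude_inputs and w in (a, b, c))]
    return max(candidates, key=lambda w: cosine(vecs[w], target))

# "man is to king as woman is to ___"
print(analogy("man", "king", "woman"))                        # queen (inputs excluded: the usual demo)
print(analogy("man", "king", "woman", exclude_inputs=False))  # king (the raw nearest neighbor)
```

Whether the second behavior counts as "the analogy failing" or just an artifact of vector arithmetic on frequent words is exactly the debate the linked post wades into.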


Transformers >> word2vec on language tasks, including analogies. Also, the larger point is that, despite being vast and complicated, models converge to pretty similar representations of relationships we understand. You can pick any model from the SuperGLUE leaderboard and recover social concepts like gender or royalty (or the Big Five model of personality). It would be bad news if NLP studies like the altruism one here were largely dependent on the choice of language model; I don't think that is the case.
