Tripping on LLMs
As a rising senior in high school, I was intrigued by the concept of Transformers when TalktoTransformer came out. It was wild that a model could spit out text that sort of made sense in the abstract but made absolutely no sense when you tried to decipher what exactly it was generating. As an experiment, I spun up a dozen blogs with GPT-2 generated content, posted them to small Facebook groups, and played with SEO, reaching 100m+ hits in a year.
It showed me that most content isn't really relevant beyond the headline, the banner image, and in a few cases, the introduction. 99% of views never made it past the first splash page, i.e. never read the article.
Coming back to the current stage of LLMs: we're at a place where the outputs actually do make sense in the abstract and can do legitimate work. They can send real emails, summarize and reason with good accuracy, and do semantic search far better than Google.
However, I believe this stage is fundamentally no different from the stage self-driving has been stuck at for the past seven years.
You could do "self-driving" with Comma.ai in 2015. Realistically, you're driving on the highway most of the time, so in that sense "self-driving" has been solved. In the past eight years we've incrementally gotten better on functionality (street driving, better reliability, safety, etc.). Self-driving keeps improving, but we're optimizing for the last 0.001% of the problem, having already solved most of it.
I believe LLMs are in a similar situation, where 99% of the functionality is there: copywriting, basic coding, search, etc.
But you won't get truly great copywriting, just like GPT-4 doesn't always write code that compiles and still gets instructions wrong here and there. Being 99% of the way there is promising, though.
The last 1% comes from alignment, bigger models, and more specialization, which could take years while the market cools down.
What does this mean realistically? The world doesn't end just yet, which gives wrappers more time to compete and cut costs.
Right now, though, I'm focused on creating new ways to monetize LLM wrappers. More to come.