A Short Aside On Large Language Models And Kojima’s Prophecy
Among all the chatter about large language models and their ongoing impact on society and business, and with memories of writing for Ars Technica floating back into consciousness, I couldn’t help but be reminded of a piece of art well before its time that talked, at length, about the implications of our evolving information age as both creation and distribution become largely automated.
I’m talking, of course, about Metal Gear Solid 2. A script written circa 1999, for of all things a video game, proves to be a most prophetic window into the good and the bad (depending on how you interpret the content) that stand before us as machines take a more active role in creating and curating human-ingestible content.
Among the major questions the game asked:
- What value does content hold in a world where that content is automatically generated?
- What is the point of content in a world where it is, in effect, free and increasingly meaningless?
- How does one even begin to control a machine that can move orders of magnitude faster than humans?
- How does that lack of control, and of value, drive us back to human-first interactions offline?
All very relevant thought exercises as we attempt to adapt to and exploit these new tools for our collective benefit (while controlling the consequences of said exploitation).
There’s an excellent analysis of the game’s original theory on YouTube that provides incredible context and color commentary on the moment, which I’d recommend watching as a basis for understanding the correlation:
You can find the source clip online, as well:
While I have so much more to say on this topic, so many ways I think said content could be updated, and so many ideas of how my teams are going to take advantage of the new tools available, I couldn’t help but come back to this seminal part of my adolescence and, if only for a moment, rue the implications of the next great work revolution.
Perhaps a longer rue is in order.