AI & Externalizing Knowledge
I’ve had AI on the brain recently. One thread of thought is the age-old question – is AI a valuable tool, or does it replace & atrophy my thinking muscles?
This is a debate that goes back throughout history. Socrates, famously, was skeptical of the great technological invention of… [wait for it] writing. Writing, he worried, would weaken your memory and make it harder to achieve ‘true wisdom.’ Instead of storing the wisdom of the elders in your head, you’d have to rely on external crutches for that knowledge. Writing, according to Socrates, was no substitute for dialogue. This is where the trouble began.
It’s not just Socrates, though. Whenever a new technology has come along, we’ve had a storied tradition of folks who worry that it’ll be the end of everything. The Luddites, for instance, were so opposed to new technology that they went about the countryside physically destroying looms. The thing is, they weren’t entirely incorrect.
Specifically, the scenarios the doomers worried about often did come to pass, even if the catastrophic fallout did not. The loom *did* automate textile creation, and it *did* displace the livelihoods of many. The fall of religion as an institution did, in fact, co-occur with a rise in individualism, loneliness, and lack of community. Short-form content today does seem to have various damaging effects on a developing teenager’s psyche. The thing is, we’re remarkably good at dealing with change.
Going back to Socrates – in hindsight, yes, writing has been tremendously valuable to civilizational and material endeavours. It forms the basis of long-distance coordination and communication, and of making a society legible to a state. But, also, Socrates was not wrong.
Writing was the second instance of knowledge being externalized. The revolution here was trusting that the correctness of your belief rested on the veracity of some symbols carved into tablets, possibly written at a great remove in time and place. Writing, in contrast to discussion, increases the ‘distance’ that ideas have to travel. Knowledge can definitely reach much further than before, but it has also become more precarious: it’s harder to verify whether anything you know is true. Alphabetizing a discussion makes it less legible to you, dear reader. This, now, is knowledge twice removed.
Twice removed, you ask? What was the first one? Well, arguably, the first instance of this externalization was developing communication in the first place – i.e., relying on other people for knowledge. You can stretch Socrates’ critiques and apply them here. After all, isn’t depending on discussion to draw out one’s ideas a form of crutch? The only source of ground truth should be yourself, right? Wouldn’t people with “real knowledge” simply be able to generate these discussion-drawn ideas independently?
(Having stripped away communication with others, surely we’ve now arrived at a source of ideas that’s not externalized at all? Hm, not really. How do you know your personal memory is reliable? After all, there’s every chance you’re living in a dream, or a simulation, or are in some other manner being ‘deceived’, right? The only thing you can truly guarantee is that you’re thinking, and that, because of this, you exist. But not much beyond this atomic fact.)
At the same time, this externalization of knowledge has been the basis of civilization. Communication becomes possible over ever greater stretches of time and space. You get regularities in the landscape that others can build upon. It’s the basis for coordination and communication over great distances. The printing press, the telephone, the internet – all of these have shrunk the world down, to the point where even the idea of a nation-state (hundreds of millions of people big) is coherent and feasible. (Not that bigger – the size of your coherent entity – is necessarily better; I actually think it’s worse in this case.)
And yet, Socrates was not wrong. Outsourcing your knowledge does make it more brittle; less reliable. Gell-Mann amnesia is real. You’re far likelier to believe falsehoods in various domains of your life. People often hold wacky beliefs about things they don’t confront directly; on practical matters they deal with first-hand, though, they tend to be reliable. You need to know how to drive a car yourself. The backstop, the source of ground truth, has to be you. Remember, gremlins exist, and they’re always in the system.
So, yeah. The externalization of knowledge makes it more brittle; less reliable. It has always done that. And at the same time, it’s enabled ever greater cathedrals to be erected on the foundations we have today. Whether that’s a good thing or a bad thing is irrelevant. It’s simply what this externalization has always done. And these are the dynamics we’d also expect to see with AI.
Thought this could use some work? Tell me about it here :)

