AI and the implicate order

Now that the age of AI – Artificial Intelligence – is upon us, so are fears of being replaced. Workers in disciplines as varied as medicine, engineering, law, and scientific analysis are running scared, and they should be.

History teaches that any groundbreaking new technology disrupts society and renders many occupations obsolete. So it will be with the advent of AI. While it is impossible to predict precisely where, when, and how the working world will change, it most certainly will. And those changes will benefit the technologies first and humanity second, despite our expectations otherwise.

As Professor Neil Postman wrote in his prescient 1992 book Technopoly, new technology does not simply change culture; it creates its own. While rendering older technologies obsolete, it retrieves their resonance to create a comfortable cultural continuum between the past and the present. Thus it is that we call the digital computing devices in our pockets “phones,” even though what they provide goes far beyond voice transmission, and we still refer to “horsepower” when talking about cars.

Human culture repeatedly makes the mistake of adopting technologies before fully understanding their implications. New technologies bring a new social order, and that order is both explicate and implicate. The explicate order includes the aspects and consequences that we intend and can properly foresee: changes that are desired and planned. The implicate order (it’s notable that the root of the word “implicate” means “folded up and hidden within”) contains effects that are unforeseen and revealed only as the use of a new technology unfolds. The implicate order is consistently more powerful, and what it eventually reveals is often unexpected and surprising.

All events and actions, including new technologies, propagate probability waves that intersect and interact with each other, producing inconceivable complexity. We Buddhists call this Karma. Both the implicate and the explicate orders of Karma propagate simultaneously, but their aspects are not simultaneously evident.

For example, when television first became popular, it was ballyhooed as an educational resource of potentially unexcelled power. The entire globe, it was predicted, would now have access to information and educational opportunity hitherto unavailable. Instead, to quote Newton Minow, the head of the Federal Communications Commission during John F. Kennedy’s presidential administration, television became “a vast wasteland.” It turns out that the implicate order of television did not primarily promote education, but entertainment and commerce instead.

It is impossible to fully anticipate the implicate order of AI. Plenty of science fiction stories have tried to do so, and if they are correct, we are looking at a rather dark human future. In the Matrix movies, human beings have become little more than batteries. In the Terminator films, humanity is the declared enemy of technology. For all the explicate benefits we intend from AI, there is an equal measure of implicate liability.

The implicate order is beyond the looking glass, hidden until we fully penetrate the future. But although it may be as yet unrevealed, it nonetheless exists in all its fullness, having been propagated simultaneously with the explicate order. Just as the internal combustion engine transformed human society in unexpected ways, so too will AI technologies.

There’s no putting the AI genie back in the lamp, and meanwhile, we’re being quite careless with the wishes we want granted. At this point, we want it all, but as any thoughtful reader of the human story knows, that’s not how it works.
