Discussion about this post

Ro

This is so interesting. ‘Tech is a tool, not a telos’ sums up very well one of the main things that is going wrong here.

What’s very peculiar, watching the current catastrophe unfold, is that what one sees is not so much the desire to build tools, or to use them, or even to make them ubiquitous in harmful ways (though there is that), but something much more like a full-blown fantasy, a kind of game being played. The system of government is like some puzzle to be defeated within a game. The agencies are challenges, little nodes you bust open, and the opponent in the game is society itself (this society, but ultimately global society). Then, once you have busted that up, you rebuild a new structure on top, replacing all the bits with the things you control instead of those your enemy controls. Then you’ve completed the game. The vast complexity of an actual civilization, or the possibility of consequences, isn’t relevant, because you are the player: you stand completely outside the game. The consequences aren’t yours; they are for others.

The people in charge of this, the termites chewing away at the wires, aren’t primarily the people that make things but the people that hire other people to make things, which they end up selling. That might be why the idea of destroying so many things that other people have made, millions of people over generations, that all people in the society are dependent on, comes so naturally. If you are primarily a salesman and not a maker, you are likely not to respect the work that’s gone into anything someone else has made. Indeed, that gets in your way, because you can’t make money off something somebody else has made—you need a space for the things you want to sell.

They are enamored of tech, but they are not excited or impressed enough by science to refrain from destroying, in a few days, much of what’s necessary for the practice of science. They like to throw ideas out there, but they use ideas for selling, not for making. The truth of the ideas isn’t of interest; only how people respond to them matters. These salesmen don’t have to build, so they don’t have to deeply know the inner workings of things or think like engineers about what things were made for. This makes it much easier to perceive the things that exist, including other people, as obstacles to be defeated as part of the game, obstacles to completion. Remove those things so you can sell your things instead.

I suppose this is why they don’t seem to understand how, in good science fiction, the futuristic gadgets are there to drive the narrative into the fine-grained details of the human situation: it’s the subjective beings that matter in the story, not the tech, which is a tool for exploring something human (or, if the character is another kind of conscious entity, the value of that being’s subjectivity). That’s the whole point: the twists introduced by technology are of interest only if the subjectivities the story explores are valued for themselves. The tech oligarchs speak of their interest in technology like bad science fiction, where everything is perceived from the outside, characters are just an excuse to talk about gadgets, and the admiration goes to the objects, because they are excitingly futuristic.

But this caused me to have a scary thought. If the Nazi mass murder reflected the norms of efficiency and productivity of industrial manufacturing, and if this crowd turns to mass murder, as one might suppose from the way they talk, what form would it take? Possibly a lot of the destruction, and potential death, will simply come from failure to understand most of the reality of a society, and from mucking around to open up the markets they want. But if they get caught up in an urge for further and further control, then I suppose the models in their heads will dictate what fate they want to inflict on all the inconvenient humans. It wouldn’t be industrial and efficient like the Nazi murder factories but something else, like cutting people off from the goods they need to survive—using some kind of algorithm to decide who is worthy of persistence.

I realize that thought’s a little far out, but these people genuinely seem far out: they imagine they can do things that they can’t possibly do, like live forever. They are like the Nazis in that they have invented deranged conceptions of the human body and humanity itself that they seem driven to experiment with, and they don’t seem inhibited by the usual moral constraints.

Muckledger

“cutting people off from the goods they need to survive—using some kind of algorithm to decide who is worthy of persistence”

An AI trying to optimize society without moral constraints in its training set would find, like politicians without moral constraints, that non-participants in the GDP are most expendable: the disabled, the non-working elderly boomers, the poor who have already fallen through the defunded safety net.
