19 Comments
May 2, 2023 · Liked by Peco, Ruth Gaskovski

Excellent point about past technological changes having had the time to be "proved" - the rate of change with AI is astounding (today it was reported that AI can write a mediocre scientific paper in response to data). Obviously huge ramifications for society are on the horizon, but the creators and regulators don't seem to see the obvious (or are aware but arrogant).

As to the point about when this will all come crashing down - for me, it hinges on limits. Since limits are there for our good and flourishing (as I think I mentioned before), the constant transgression of them (of which transhumanism is the transgression of limits taken to the extreme) means that eventually things are going to burn out or collapse.

Think of stress testing. The repeated pushing of a material to its limit of resistance/resilience eventually causes it to break. Same with fundamentally limited human beings, and also with the wider Creation (also limited) - which sure is groaning quite loudly at the moment.

author

Agreed, Hadden. It’s to a great extent a problem of stress tolerance. When do things crack? When do things break?

Every generation, as it grows older, tends to look on the younger with skepticism, and it might seem easy to dismiss concerns about tech/transhumanism as merely that. But the objective reality has changed; tech actually is progressing faster; the repetitive stress on people and society must invariably increase.

Something I didn’t mention in the article, but which are additional major drivers are profit and military competition (esp between the US and China). These are two more reasons, I think, why many don’t want to put on the brakes.

author

And BTW, I just read one of your excellent articles on farming and limits, which captured well some of the ways nature supports human identity. And the point about leaving the land fallow got me thinking about whether that principle might be fruitful elsewhere - in tech, economics, personal life, etc. I will have to think about that a bit more.


Thanks, Peco,

Yes, I think the concept of fallow could have many fruitful applications elsewhere. The practice of a pastor taking a sabbatical is one example (extended time off from normal work to read, pray, reflect, write, study and grow in virtue).

I look forward to seeing where your thoughts lead on this.

May 2, 2023 · Liked by Peco, Ruth Gaskovski

Two verses of Scripture jumped to mind while reading this reflection (probably obvious ones). Regarding overcoming the weaknesses inherent in being human:

"And lest I should be exalted above measure by the abundance of the revelations, a thorn in the flesh was given to me, a messenger of Satan to buffet me, lest I be exalted above measure. Concerning this thing I pleaded with the Lord three times that it might depart from me. And He said to me, “My grace is sufficient for you, for My strength is made perfect in weakness.”

And regarding 'ChatGPT brains' (as if the advent of the internet hasn't already shown clearly that more information is hardly the solution to our suffering and strife):

"And though I have the gift of prophecy, and understand all mysteries and all knowledge...but have not love, I am nothing."

author

“..but have not love, I am nothing.”

I could not agree more.


Beautifully put. I do feel the transhumanist pull at times, so exciting and triumphant. Yet I also see the costs of acceleration first hand in my yoga students, especially the younger ones, who seem more afflicted each year. The cracks are definitely showing.

author

There has been a tendency to see the impact of tech in terms of the mind (scattered attention, etc.), but I have been wondering how long it will be before that trickles into the body. Your yoga-student observation is very interesting in that sense. Mind and body are so intimately connected that even small but persistent disturbances in the former could easily translate into instabilities and imprecision at the muscular and sensory level.


As one example: sitting with eyes fixated down and forwards is a posture of vigilance for the human animal - almost a hunting stance. Sustaining this for hours bathes the bodymind in mild arousal, dampens the breath, closes down peripheral awareness. A suitable position for praying to the gods of eternal growth.


Peco, this is one of the best pieces I have read in a long time. Thank you. D

author

Glad it resonated!

May 3, 2023 · Liked by Peco

I often wonder what life would be like had we not adopted past technologies. There is an assumption that it would be stunted, but I'm not so sure. We seem to lose bits and pieces of ourselves with each new tech, and the fact that tech changes are happening faster only means we are losing ourselves faster.

author

Thanks, Rita. I was just pondering the same question last evening. And I also wondered: What if we had adopted new technologies, but at a slower pace, and more discerningly?

We might still have a world with digital devices, but they might operate differently, or we might approach them differently. E.g., teens aren't allowed to consume alcohol or to drive before a certain age. What if we took the same approach with social media?

Suggestions like this might sound old-fashioned nowadays. But I am thinking with a pre-millennial mind.

May 3, 2023 · Liked by Peco

Doesn't sound old-fashioned to me, Peco - I'm well into my 6th decade myself. I hear what you are saying about introducing tech at a slower pace. As well, I would suggest we distinguish between levels of tech based on whether or not they bring us into more harmony/connection with nature. I'm giving tech some leeway here, as mostly I think it has had, and continues to have, a deleterious effect on us and the world.

May 3, 2023 · Liked by Peco, Ruth Gaskovski

great article. some of the points yer making dovetail into what I’ve been crudely trying to point out in these online discussions surrounding AI.

the naysayers are dismissing the critics as throwback Chicken Littles, who are always in a moral panic over change.

fair enough, but even if AI were 100 years in the future, we are still facing immense problems.

to put it in a practical sense, technology is progressing so rapidly that we can’t possibly adapt to it fast enough, and it’s driving us insane.

it’s less about progress and even more about speed! even the most ardent tech progressive must admit that. the one limitation that tech cannot help solve is that of our own evolution and adaptation.

I feel like the online arguments are completely missing the larger point (of course they are)...

author

Yes, it's strange how the problem of adaptation never seems to come up - even as the evidence is starting to emerge that the stress of this adaptation is impacting us. And to think we're still at the start of this hyper-acceleration.


right? it’s the one obvious blind spot and also the one the materialists should be most in line with: evolution and adaptation...


The analogy of the dome and pendentives is an apt one, and perhaps in ways you may not have considered. We are in a time where the dome is so far elevated that those fully in it can no longer see the foundations, but seem eager to hack them away. This is the transhumanist project - to liberate "humanity" (indeed they seem to think just *being* human is itself an artificial constraint) from its foundations on the mistaken notion that the foundations are keeping it down, not holding it up.

And yet, the higher things go the more weight the foundations have to bear, and the more will be lost when it all comes crashing down.

That being said, there is a trap ahead that you may want to consider. Each iteration brings added complexity and abstraction. Each "advancement" requires more sophistication, and especially more power. We see the smartphones; we don't see the massive data centers without which the black glass boxes are nothing more than paperweights. We certainly don't see into the massive data centers, the vast concrete bunkers clustered in nodes and surrounded by armed guards, or the vast electrical power draws these centers require. We also do not see the miles and miles of redundant fiber and copper lines that connect them to each other. We no doubt notice the proliferation of antennae, transmitters, and towers, but not what goes into each of them.

In rocketry there's what's known as the tyranny of the rocket fuel equation: "the more you need, the more you need." That is, we know how much energy is required to lift a 1 kg load to an orbital altitude of X with a given orbital velocity, or how much extra is required to push it out of orbit, etc., and that tells us how much of any given fuel we need. But that fuel needs to be contained in something, and you have to lift that too. And you have to lift the fuel while you burn it (and you have to carry the oxidizer along with the fuel). So you need to add fuel to lift the fuel and the fuel containers too. And so on.
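(For readers curious about the math behind that "tyranny", a minimal sketch - this is just the standard Tsiolkovsky rocket equation, offered as background rather than anything from the article itself:

$$\Delta v = v_e \ln\frac{m_0}{m_f} \qquad\Longrightarrow\qquad \frac{m_0}{m_f} = e^{\Delta v / v_e}$$

where $m_0$ is the fully fueled mass, $m_f$ the dry mass after burnout, and $v_e$ the exhaust velocity. Because the required mass ratio grows exponentially with the desired change in velocity, every extra kilogram of payload or tankage multiplies the propellant needed to lift it - the more you need, the more you need.)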

All this AI isn't residing on a couple of desktops, it's residing in swarms of data centers consuming vast amounts of power. And the more we push it, the more of these we create, the more we must feed them... the more you need, the more you need... There are limits. We may not know what they are, but there are limits. And we will hit them, or the dome will collapse in the trying.

author

These are helpful reflections/analogies. Thank you!
