On charismatic megatraumas - David Wallace-Wells in NYT:
‘In 2023 — just as ChatGPT was hitting 100 million monthly users, with a large minority of them freaking out about living inside the movie “Her” — the artificial intelligence researcher Katja Grace published an intuitively disturbing industry survey that found that one-third to one-half of top A.I. researchers thought there was at least a 10 percent chance the technology could lead to human extinction or some equally bad outcome.
A couple of years later, the vibes are pretty different. Yes, there are those still predicting rapid intelligence takeoff, along both quasi-utopian and quasi-dystopian paths. But as A.I. has begun to settle like sediment into the corners of our lives, A.I. hype has evolved, too, passing out of its prophetic phase into something more quotidian — a pattern familiar from our experience with nuclear proliferation, climate change and pandemic risk, among other charismatic megatraumas.’
(…)
‘This week the longtime A.I. booster Eric Schmidt, too, shifted gears to argue that Silicon Valley needed to stop obsessing over A.G.I. and focus instead on practical applications of the A.I. tools in hand. Altman’s onetime partner and now sworn enemy Elon Musk recently declared that for most people, the best use for his large language model, Grok, was to turn old photos into microvideos like those captured by the Live feature on your iPhone camera. And these days, Aschenbrenner doesn’t seem to be working on safety and catastrophic risk; he’s running a $1.5 billion A.I. hedge fund instead. In the first half of 2025, it turned a 47 percent profit.
So far, so normal. But there is plenty that already feels pretty abnormal, too. According to some surveys, more than half of Americans have used A.I. tools — a pretty remarkable uptake, given that it was only after the dot-com crash that the internet as a whole reached the same level. A third of Americans, it has been reported, now use A.I. every single day. If the biggest education story of the year has been the willing surrender of so many elite universities to Trump administration pressure campaigns, another has been the seeming surrender of so many classrooms to A.I., with high school and college students and even their teachers and professors increasingly dependent on A.I. tools.’
(…)
‘Venture capitalists now like to talk about embodied A.I., by which they mean robots, which would be a profound shift from software to hardware and even infrastructure; in Ukraine, embodied A.I. in the form of autonomous drone technology is perhaps the most important front in the war.’
(…)
‘This isn’t a blueprint of the world to come, just one speculative glimpse. Perhaps the course of the past year should reassure us that we’re not about to sleepwalk into an encounter with Skynet. But it probably shouldn’t give us that much confidence that we have all that clear an idea of what’s coming next.’
Read the article here.
In war and in education, AI is going to be important.
A professor told me that, thanks to AI, most of his students' papers contained less nonsense, but the dullness was immense.
I predicted that AI would win the Nobel Prize in Literature, but maybe it will win the Nobel Peace Prize first.