Atomic bomb test at Bikini Atoll in 1946 (Wikimedia Commons photo).
One of the reasons I miss Robert Anton Wilson and Robert Shea, besides the obvious ones, is that I wonder what they would make of advances in computers in general, and AI in particular. (They were both fascinated by personal computers; there's a little bit about this in my Robert Shea book.) At age 69, I love my smartphone and marvel at the computer I can carry in my pocket. Technology is wasted on the young. I was a teenager in the 1970s, when our second TV was a black-and-white set with antenna ears, and mobile music meant eight-track tapes.
Anyway, here are a couple of things that caught my eye, alarming or darkly funny, depending on your temperament:
1. "AIs can’t stop recommending nuclear strikes in war game simulations." The lead sentence: "Advanced AI models appear willing to deploy nuclear weapons without the same reservations humans have when put into simulated geopolitical crises."
Via Jesse Walker, who writes, "I was rooting for the resolution of WAR GAMES and instead they kept giving us the setup for THE TERMINATOR."
2. Scott Alexander mentions one of the winners of the ACX forecasting contest and then writes, "Seems potentially bad that so many of the people who win forecasting contests are professionally involved in some form of worrying about AI killing us. Hopefully that’s just a coincidence."
