Saturday Links: Safe driving, SaaS replacement, and AGI

After a refreshing holiday break, it's back to work and back to posting (thank you for your patience!). Much has happened in the past few weeks, so I'll be doing a bit of triage and catch-up. Here is the first batch of links:

  • Human drivers are to blame for the most serious Waymo collisions. Detailed analysis from Timothy Lee of Waymo's published data. As many suspected, journeys in fully self-driving cars appear to be many times safer than those with human drivers. Even in the few serious crashes, human drivers were in many cases the ones at fault.
  • Klarna Plans to 'Shut Down SaaS Providers' and Replace Them With Internally Built AI. The Tech World Is Pretty Skeptical. The company reports having replaced its Salesforce CRM and Workday HR systems with in-house code focused on the subsets of features it needs. This is part of an ongoing purge of third-party SaaS software to cut costs. I'm really not surprised (and actually predicted this to a number of people earlier this year). AI makes integration and migration easier; it makes code creation and maintenance easier; and it makes it increasingly untenable to pay millions each year to multiple SaaS providers. The last 15-20 years have seen a relentless march toward vertical, specialized, and now increasingly high-cost software businesses, with margins of 75% and above for many. My guess is that over the next 5-10 years, achievable margins in many categories will come down to more like 15-20%.
  • NaNoWriMo in crisis after AI usage comments. The strength of feeling about the use of AI in writing still runs deep. The non-profit that runs the annual "Write a Novel" challenge each November is under heavy fire for comments it made about the potential use of AI to complete the challenge. Its initial statement suggested that AI would be acceptable for disabled or marginalized writers who otherwise would not be able to participate. The backlash was swift: general condemnation for approving of what many in the community see as a technology that damages their craft (and possibly steals their work), and specific anger from disabled writers who felt their situation was being used to promote AI. I once wrote a book draft during NaNoWriMo and would consider doing it again. It's a unique experience, so it's a shame this backlash will damage it. I can understand people's anger; the status of text used for AI training still has not really been clarified. In the long run, though, I can't see how even professional novels will avoid AI in their creation. It will be hard to draw a line between a spellchecker, a grammar checker, a suggestion generator, or many other layers of help. Tools almost always change our crafts forever.
  • Meanwhile, in the music industry... Black Eyed Peas to Debut AI-Powered Member at Las Vegas Residency. The Black Eyed Peas have announced that they are adding a virtual band member to their lineup in 2025. The entity will sing and perform, including vocalizing in real time for duets, and will apparently react to audience participation. It's still unclear exactly how this will manifest, but there's no doubt that this kind of augmentation of human bands will appear more and more. After all, we've now had 25-30 years of mega-touring stages (thank you, U2's Zoo TV and the early 90s), and there is only so much you can build before replacing the performers themselves.
  • Are we able to agree on what we mean by "AGI"? This is a nice, compact X thread from Andrej Karpathy on the definition of AGI. He notes that when asked for a definition of AGI, many people wave their hands and say there is no agreed definition or that the goalposts keep moving. I agree with his response to that. It's actually relatively clear: "the existence of a highly autonomous system that outperforms humans at most economically valuable work". Every part of the sentence does work: 1) it just needs to exist; it doesn't need to be deployed widely (that's a different step, and a choice); 2) it needs to be capable of doing this work by itself, not as a copilot to a human; 3) it should be better than a human; and 4) the number and type of tasks should be economically relevant and numerous (otherwise it is not "general"). The comment thread is an interesting reflection on variants of the statement.

Wishing you a wonderful weekend.