Sunday Links: Parrot, Apple AI, and building LLMs

A little late again due to travel during the week, but here are this week's links:

  • CodeParrot: An AI-powered tool that Transforms Figma Files into Production Ready Code. One of the more widely talked-about tools this week is Parrot, which takes Figma (interface design) files and turns them into code. The Y Combinator company offers some interactive code-development features similar to GitHub Copilot, but it is novel in that it can use your carefully crafted Figma project as a guide. These types of tools tend to work well because they are grounded (it's relatively easy to tell whether the result is good from whether it runs), they build on existing blocks (code library components), and there is likely to be a large amount of relevant training data. For the company, though, the challenge is that this may end up being a common feature across many AI coding tools.
  • Apple Intelligence in 5mins. Probably the biggest strategy news of the week was Apple's WWDC event announcing (amongst little else) its Apple Intelligence products. A lot of strategy takes have already been penned (Stratechery nails it in my view [in this free piece and some paywalled content]), and Wall Street liked what it saw. In a nutshell, Apple is executing its platform play to a tee. The device maker's intelligence tools focus on staying local to your device where possible, adding small bits of incremental value via a boosted Siri, and providing a framework for other AI services to be brought in. The first of these is a partnership with OpenAI that gives users the option to send more complex queries to OpenAI. All of this lets Apple neatly sidestep some of the very hard (and controversial) frontier use cases, like unconstrained image and text generation, while having other AI vendors compete for access to Apple's user base to serve that need. That gives the company time to improve its own models, and its aim will be to have AI providers pay to be on the platform, much in the same way that Google pays it $20B a year to be the default search engine. The only hiccup in the plan would be if one of those AI services becomes so good that users care more about it than about the device it runs on.
  • Musk Threatens to Ban Staff From Owning Apple Products Over OpenAI Deal. As an addendum to the previous story, Elon Musk's X/Twitter response to the Apple keynote was to raise concerns about Apple handing queries and data to OpenAI. From what Apple actually announced, it seems likely that the new AI-powered Siri will not do that automatically and will instead ask the user for confirmation each and every time it happens, something that is likely to get quite annoying. Perhaps Elon should be looking ahead to the day when he hopes Apple might integrate his Grok AI with the same interface OpenAI is now using.
  • How to build an LLM. If you've wondered how LLMs actually get built, this is a nice video summary that gets the main concepts across in an hour (there's a toy sketch of the core idea just after this list). You might want to order a few GPUs before you get started, though!
  • Mistral AI raises 600 mln euros in the latest funding round. In funding news, one of Europe's challenger AI model makers, Mistral, has raised another significant chunk of funding. The company has so far been quite a bit more capital efficient than some of its LLM competitors (emphasizing carefully chosen data over brute force), but there is still a large funding gap relative to some of the bigger players. Market sentiment for LLM investment also seems to be turning a little, given the continued parity between top models (which points to potential long-term commoditization). Congrats to the team for putting more cash in the bank, though!
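
As promised above, here is a minimal, hypothetical sketch of the central idea behind building an LLM: training a model by next-token prediction over a corpus of text. Everything below (the toy corpus, the TinyLM model with a GRU standing in for the transformer stack, the hyperparameters) is my own illustration, not code from the linked video.

```python
# Toy next-token-prediction training loop: the core objective an LLM is trained on.
import torch
import torch.nn as nn

text = "hello world, hello llm"                 # toy corpus; real models use trillions of tokens
vocab = sorted(set(text))
stoi = {ch: i for i, ch in enumerate(vocab)}
data = torch.tensor([stoi[ch] for ch in text])  # character-level "tokenization" for simplicity

class TinyLM(nn.Module):
    def __init__(self, vocab_size, dim=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.rnn = nn.GRU(dim, dim, batch_first=True)  # stand-in for the transformer blocks
        self.head = nn.Linear(dim, vocab_size)         # maps hidden states to next-token logits

    def forward(self, x):
        h, _ = self.rnn(self.embed(x))
        return self.head(h)

model = TinyLM(len(vocab))
opt = torch.optim.AdamW(model.parameters(), lr=1e-2)

# Training loop: predict token t+1 from tokens 0..t, repeated over the corpus.
for step in range(200):
    x = data[:-1].unsqueeze(0)                  # inputs:  tokens 0..N-2
    y = data[1:].unsqueeze(0)                   # targets: tokens 1..N-1 (shifted by one)
    logits = model(x)
    loss = nn.functional.cross_entropy(logits.reshape(-1, len(vocab)), y.reshape(-1))
    opt.zero_grad()
    loss.backward()
    opt.step()
```

The GRU is only there to keep the sketch short and runnable; the point is that scaling this same next-token objective up to transformer architectures, web-scale data, and large GPU clusters (plus a later fine-tuning and alignment stage) is what turns it into an LLM.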

Thank you, and I wish you a great week.