Guillaume Bonnissent’s Insurance Technology Diary

Episode 66: My two-part prediction for 2026


Freddie said it best: I want to ride my bicycle. Every day I get on my two-wheeler and pedal into the office. Well, almost. I do it every day, but it’s not my bicycle I use for commuting. It’s a Boris bike.

The performance benefits I could gain from riding my carbon-fibre, drop-handled, narrow-tyred, highly geared racing bike to the office simply don’t justify the wear and tear, or the need to haul it up in the lift. Conceivably I could shave 90 seconds off my daily ride in each direction, when Embankment and a couple of other major thoroughfares are clear. Usually they aren’t, and three minutes is well within the margin of error.

Similarly, there’s no point in driving my Lamborghini inside the congestion zone. With the speed limit down at 20 mph, it would be like using the proverbial sledgehammer to crack a joke. (Okay, I admit I drive a boring family wagon, but you get the point.)

This leads in nicely to the first part of my big insurance technology prediction for 2026:

AI is going to be a big deal.

Now, before you roll your eyes and write that off as the safest and most anodyne prediction you’ve ever heard, hear me out until the second part.

AI is already a big deal. Between the beginning of 2025 and now, the scope of the AI-possible has expanded beyond my wildest blue-sky thinking. It has very nearly blown my mind. We’re doing things today with LLMs and agentic AI which simply were not possible last January, or even in June.

That means any grittier prediction I could make is likely either to underestimate what the year ahead will bring, or – perhaps more likely – to overshoot it massively because of the roses tinting my justifiably optimistic eyes.

But big deal or not, I still expect the AI excitement bubble to start shrinking in 2026. We will begin to move through and beyond it.

This shift will become evident in areas where AI is already firmly ensconced, as the fact of it is forgotten. I don’t mean that the use of AI will be abandoned or curtailed (quite the opposite), but it will lose its place at the centre of attention. People will stop talking about Spotify’s AI and the AI that drives the Netflix algorithm, and go back simply to talking about Spotify and Netflix.

It’s a bit like our longstanding lack of wonder at the incredible gears that make it possible to propel ourselves along the ground much faster than our feet can walk us. Instead, we just talk about bicycles.

This evolution won’t be completed this year, but soon enough everyone will simply take for granted that AI is everywhere, making people’s jobs easier.

Meanwhile, an extremely practical AI uncertainty is likely to remain unresolved in 2026. How will the pricing of AI (by which I mean Large Language Models like ChatGPT and Claude) change over time?

Many leading LLM companies are running losses. ChatGPT owner OpenAI is forecast (by ChatGPT when I asked) to deliver an operating loss of around $8 billion in 2025, for example. The AI companies must move towards profit, which implies price increases.

However, competition, particularly from China, including in hardware and infrastructure, could push prices down dramatically. Remember when solar panels were expensive?

One way that technology developers and their clients can ensure that AI doesn’t cost them more than it has to, regardless of the price, is to use the right solution for the problem. No one buys a Ferrari to park at the station every day.

Look first at the pain point that demands a solution (maybe it’s the three-month lead time for product development). Then see if AI is the right solution. I believe that in 2026 we will more often decide that it is.
As with any tool, different AIs are suited to different tasks. So we next need to decide which one to deploy, and which version. Does the problem demand the most expensive AI to deliver 100% accuracy at the speed of light, or would 95% and a one-minute lag suffice?

Each new AI model is incrementally more expensive, but often the application will work just as well with an older or cheaper version. It’s just like my commute: it doesn’t need a £3k carbon-fibre racing bike, and driving to the London office daily in a Ferrari would be bonkers. Either would simply be cash wasted.
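The 95%-versus-100% trade-off above can be sketched as a simple routing rule: pick the cheapest model that is good enough for the job, rather than defaulting to the frontier. All model names, prices, accuracy figures and latencies below are made up for illustration, not real vendor numbers.

```python
# Toy "right-sized model" picker: choose the cheapest model tier that
# meets a task's minimum accuracy and maximum latency requirements.
# Every figure here is hypothetical, for illustration only.

MODELS = [
    # (name, cost in USD per 1k tokens, estimated accuracy, latency in seconds)
    ("small-legacy", 0.0005, 0.90, 2),
    ("mid-tier",     0.0030, 0.95, 5),
    ("frontier",     0.0300, 0.99, 20),
]

def choose_model(min_accuracy: float, max_latency_s: float):
    """Return the cheapest qualifying (name, cost, accuracy, latency), or None."""
    candidates = [m for m in MODELS
                  if m[2] >= min_accuracy and m[3] <= max_latency_s]
    return min(candidates, key=lambda m: m[1]) if candidates else None

# 95% accuracy with a one-minute lag suffices for many back-office tasks:
print(choose_model(0.95, 60)[0])   # -> mid-tier, at a tenth of frontier cost
# Only a genuinely hard problem justifies paying for the top tier:
print(choose_model(0.99, 60)[0])   # -> frontier
```

The point of the sketch is the ordering: the constraint check comes first, and price breaks the tie, so the expensive model is only selected when nothing cheaper will do.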

So here’s my whole prediction for the year ahead:
AI will be a big deal in 2026, but the price will start to matter.