AI systems such as ChatGPT now answer questions like ‘Which home contents insurance suits my 80 square metre flat?’ directly in a chat format, delivering comparisons and tailored recommendations.
This is enabled by large language models (LLMs), which grasp language contextually, respond in natural language, and access current data.
For insurers, online visibility is undergoing a transformation.
While traditional search engines depend on keywords and backlinks, AI systems favour content that is machine-readable, semantically clear, and deemed trustworthy.
At the ERGO Innovation Lab, we have been working with the AI specialist ECODYNAMICS to examine AI search. Our analysis of more than 33,000 search results reveals three key patterns:
- Modular content wins: LLMs find content structured as FAQs, tables, and calculators particularly useful. Such modular content is often published by intermediaries such as brokers and comparison portals, while pages relying solely on classic search engine optimisation (SEO) are losing visibility.
- Semantic structure outperforms brand recognition: Content that is consistently linked and logically structured forms semantic fields in the model's vector space. Essentially, LLMs organise information by semantic proximity rather than by keywords: terms such as ‘vehicle’, ‘damage’, or ‘comprehensive insurance’ are grouped close together in vector space because they frequently appear in the same motor insurance contexts (see the sketch after this list).
- Trust is crucial: AI systems prioritise sources with traceable authorship, clear origins, and verifiable facts. Visibility is achieved not only through technology but also through reputation.
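
To make the idea of semantic proximity in the second pattern concrete, the following minimal sketch embeds a few insurance terms and compares them pairwise. It assumes the open-source sentence-transformers library and the publicly available all-MiniLM-L6-v2 model; these are illustrative choices, not the tooling used in our analysis.

```python
# Illustrative sketch: terms used in similar contexts end up close together
# in embedding space, which is what "semantic proximity" refers to.
# Assumption: sentence-transformers and the all-MiniLM-L6-v2 model are available.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

terms = ["vehicle", "damage", "comprehensive insurance", "home contents insurance"]
embeddings = model.encode(terms)  # one vector per term

# Pairwise cosine similarity: the motor-insurance terms should score higher
# with each other than with the unrelated home-contents term.
scores = util.cos_sim(embeddings, embeddings)
for i, a in enumerate(terms):
    for j, b in enumerate(terms):
        if i < j:
            print(f"{a!r} vs {b!r}: {float(scores[i][j]):.2f}")
```

The same mechanism is why consistently linked, logically structured content forms recognisable semantic fields: the surrounding context, not the brand name, determines where a page's content lands in the vector space.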
For insurers, this means that classic SEO remains the foundation but is no longer sufficient on its own to ensure visibility. Clean HTML, fast loading times, and accessible (barrier-free) pages are essential. In addition, dialogue-oriented formats that AI systems can integrate directly are becoming more important.
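
As one example of what a dialogue-oriented, machine-readable format can look like, the sketch below builds schema.org FAQPage markup (JSON-LD) in Python. The schema.org vocabulary is a public standard; the question and answer texts are placeholders for illustration, not ERGO content.

```python
# Illustrative sketch: an FAQ block expressed as schema.org FAQPage markup
# (JSON-LD), a format crawlers and AI systems can parse directly.
# The question/answer texts below are placeholders, not real policy advice.
import json

faq_page = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": "Which home contents insurance suits an 80 square metre flat?",
            "acceptedAnswer": {
                "@type": "Answer",
                "text": "Cover is usually based on the size of the flat and the "
                        "value of its contents; a common rule of thumb is a fixed "
                        "sum insured per square metre.",
            },
        }
    ],
}

# Embedded on a page inside a <script type="application/ld+json"> tag, this
# gives AI systems clean question-answer pairs instead of unstructured prose.
print(json.dumps(faq_page, indent=2, ensure_ascii=False))
```

Structured question-and-answer blocks like this are one way to make content modular and directly quotable, which is exactly what the patterns above reward.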