Since the release of ChatGPT in November 2022, there has been much speculation about whether OpenAI’s latest large language model (LLM) spells doom for Google Search. The sentiment has only intensified with recent reports that Microsoft is preparing to integrate ChatGPT into its Bing search engine.
There are several reasons to believe that a ChatGPT-powered Bing (or any other search engine) will not seriously threaten Google’s search near-monopoly. LLMs have several critical problems to solve before they can make a dent in the online search industry. Meanwhile, Google’s share of the search market, its technical ability and its financial resources will help it remain competitive (and possibly dominant) as conversational LLMs start to make their mark in online search.
The real (and less discussed) potential of LLMs such as ChatGPT is the “unbundling” of online search, and this is where the opportunities for Microsoft and other companies lie. By integrating ChatGPT into successful products, companies can chip away at Google Search’s use cases.
Integrating ChatGPT in search engines
While ChatGPT is a remarkable technology, it has several fundamental problems, which are also present in other LLMs. This is why Google, which already has similar technology, has taken a conservative approach toward integrating conversational LLMs into its search engine.
A company like Microsoft might be able to solve these problems by using its highly efficient Azure cloud and developing suitable LLM architectures, training techniques and complementary tools.
Microsoft and OpenAI might also be able to solve the truthfulness problem by adding automated guardrails that fact-check ChatGPT’s answers before showing them in Bing results.
However, nothing prevents Google from doing the same thing. Google has immense data and compute resources and a highly talented AI team. Google also has the advantage of being the default search engine on Chrome, most Android devices and Safari (included with macOS and iOS devices). This means that unless it's significantly better, a ChatGPT-powered Bing will not convince users to go out of their way to make the switch from Google Search.
Unbundling search
People use Google Search to solve various problems, from locating nearby restaurants to finding academic papers, retrieving news articles, querying historical information, looking for coding advice and more.
ChatGPT and other LLMs can also solve some of these problems. We’re already seeing this happen in software development. When programmers need help writing code for a specific problem, they usually search for it on Google or visit a coding forum such as Stack Overflow.
Today, thanks to GitHub Copilot and OpenAI Codex, they just need to write a textual description in their integrated development environment (IDE), such as Visual Studio Code or GitHub Codespaces, and have the LLM automatically generate code for them. This helps developers stay in the flow by avoiding the switch from their IDE to Google Search. This is an example of “unbundling” some of the work that Google Search currently does.
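The workflow looks roughly like this: the developer writes a natural-language comment, and the assistant suggests an implementation inline. The code below is an illustrative example of the kind of suggestion such a tool might produce, not actual Copilot or Codex output.

```python
# Prompt the developer types as a comment in the IDE:
# "parse an ISO 8601 date string and return the day of the week"

# The kind of completion a code assistant might suggest (illustrative):
from datetime import datetime

def day_of_week(iso_date: str) -> str:
    # fromisoformat handles strings like "2023-01-09";
    # %A formats the full weekday name, e.g. "Monday".
    return datetime.fromisoformat(iso_date).strftime("%A")
```

The key point is that the question never leaves the editor: no browser tab, no search results page, no forum thread.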
There are many other opportunities to unbundle search through LLMs, such as developing assistants for academic papers, essays and other content creation. Unbundling search in this way has several benefits.
The future of online search
For many use cases, Google’s list of blue links will remain the dominant tool. For example, if you want to do a precise search in specific domains and timeframes, Google’s search technology is better than current LLMs.
Unbundling will not pose an existential threat to Google Search just yet. In fact, the history of large platforms such as Craigslist and Amazon shows that unbundling usually results in the expansion of a market (and Google already has a stake in many of those markets). However, unbundling will weaken Google’s hold on the online information market to a degree.
And in the long run, LLMs can trigger more profound shifts in the market.
