Microsoft's Bing AI: Factual Errors Revealed in Last Week's Launch Demo

Last week, Microsoft unveiled AI chatbot technology for its Bing search engine. With more than a million people signing up to try the tool in the first 48 hours, the company was eager to showcase the AI's capabilities. During the demo, however, Bing AI's analyses of earnings reports from Gap and Lululemon contained several factual errors. Experts attribute these errors to a phenomenon they call "hallucination." While Microsoft has acknowledged the mistakes and pledged to improve the technology, the incident raises concerns about the accuracy and reliability of AI-driven tools.


Factual Errors in Bing AI's Demo

In the demo, the Bing AI chatbot analyzed earnings reports from Gap and Lululemon. Its responses contained several errors: some figures were missing, and others appeared to be fabricated. Independent search researcher Dmitri Brereton identified additional factual problems in responses about vacuum cleaner specifications and travel plans to Mexico. These errors highlight the tendency of tools built on large language models to "make stuff up," a behavior known among AI experts as "hallucination."
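
Fabricated figures of the kind Brereton documented can sometimes be caught with a simple cross-check against the source document. The sketch below is a minimal, hypothetical illustration (not a description of how Bing or any reviewer actually verified the demo): it flags any number in a generated answer that never appears in the source text, using made-up excerpts standing in for a real earnings report and chatbot response.

```python
import re


def extract_numbers(text: str) -> set[str]:
    """Pull numeric tokens (e.g. '37.4%', '4.24') out of a piece of text."""
    return set(re.findall(r"\d+(?:\.\d+)?%?", text))


def flag_unsupported_figures(source: str, answer: str) -> set[str]:
    """Return figures in the model's answer that never appear in the
    source document -- a crude signal that they may be hallucinated."""
    return extract_numbers(answer) - extract_numbers(source)


# Hypothetical excerpt standing in for a real earnings report.
source_report = "Gross margin was 37.4% and net revenue was $4.24 billion."

# Hypothetical chatbot answer that invents an operating-margin figure.
chatbot_answer = "Gross margin was 37.4%, with an operating margin of 5.9%."

print(flag_unsupported_figures(source_report, chatbot_answer))
# {'5.9%'}
```

A check like this only catches numbers with no support in the source at all; it would not catch a figure that exists in the report but is quoted in the wrong context, which is part of why hallucination remains hard to detect automatically.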


Rushing to Incorporate Generative AI into Search Engines

Microsoft and Google are racing to build new forms of generative AI into their search engines. Since OpenAI released ChatGPT to the public in November and it exploded in popularity, companies have been eager to demonstrate progress. OpenAI has raised billions of dollars from Microsoft, while competing startups such as Stability AI and Hugging Face have also reached billion-dollar valuations in private funding rounds.


Google Reluctant to Add AI-Generated Responses to Its Search Engine

While Google has been reluctant to add AI-generated responses to its search engine, citing reputational risk and safety concerns, Microsoft stressed the near-term benefits of releasing the technology to the public. Microsoft CEO Satya Nadella emphasized the importance of getting the technology out of the lab, stating that "you have to get these things out safely." The errors revealed during the demo, however, underscore the need for caution and continued improvement of the underlying AI.


Microsoft's Response to the Errors

In response to the errors, Microsoft acknowledged that the AI may make mistakes during its preview period. The company stressed that user feedback is essential to improving the technology and said it is analyzing the findings to strengthen the AI's capabilities. As Bing AI and other AI-driven tools continue to evolve, accuracy and reliability will be essential to building trust and credibility with users.