The U.K.’s Competition and Markets Authority (CMA) has officially launched an antitrust investigation into Amazon’s recent $4 billion investment in the AI startup Anthropic. The move follows the CMA’s invitation for comments on Google’s earlier investments in Anthropic, an initial $300 million followed by an additional $2 billion. The investigation comes amid broader scrutiny of major tech companies’ strategies to gain influence over emerging AI technologies through substantial investments rather than outright acquisitions.
Founded in 2021, San Francisco-based Anthropic is known for developing large language models and its chatbot, Claude, which rivals other prominent AI systems such as OpenAI’s ChatGPT and Google’s Bard. Despite its short history, Anthropic has rapidly accumulated $10 billion in funding, reflecting the intense interest and competition in the AI sector. Its structure as a public benefit corporation (PBC) sets it apart from other AI firms and adds a further layer of interest to its strategic partnerships.
The CMA’s investigation will determine whether Amazon’s investment in Anthropic could undermine competition in the U.K. market. The regulator has 40 working days to decide whether the investment meets the criteria for a more detailed examination under merger regulations. This scrutiny aligns with the CMA’s broader review of similar investments by major tech players, including Microsoft’s significant financial commitments to AI startups and its recent acqui-hire of the Inflection AI team.
Anthropic has stated that the investment from Amazon does not grant the tech giant any direct influence over its operations or governance. The company insists it remains independent and continues to collaborate with a range of partners. The outcome of the CMA’s investigation will be crucial in shaping how large tech investments in AI startups are regulated, reflecting ongoing concerns about market concentration and competitive dynamics in the rapidly evolving AI industry.