Little Bay Beach, Anguilla (Getty Images)

There are now more than 1 million “.ai” websites, contributing an estimated $70 million to Anguilla’s government revenue last year

Data from Domain Name Stat reveals that the top-level domain originally assigned to the British Overseas Territory of Anguilla passed the milestone in early January.

From Sandisk shareholders to vibe coders, AI is making — and breaking — fortunes at a rapid pace.

One unlikely beneficiary has been the British Overseas Territory of Anguilla, which lucked into a fortune when ICANN, the Internet Corporation for Assigned Names and Numbers, assigned the island the “.ai” top-level domain in the mid-1990s. Since ChatGPT’s launch at the end of 2022, the gold rush for websites to associate themselves with burgeoning AI technology has brought a flood of revenue to the island of just ~15,000 people.

In 2023, Anguilla generated 87 million East Caribbean dollars (~$32 million) from domain name sales, some 22% of its total government revenue that year, with 354,000 “.ai” domains registered.

As of January 2, 2026, the number of “.ai” domains surpassed 1 million, per data from Domain Name Stat — suggesting that the nation’s revenue from “.ai” has likely soared, too. This is confirmed in the government’s 2026 budget address, in which Cora Richardson Hodge, the premier of Anguilla, said, “Revenue from domain name registration continues to exceed expectations.”

The report notes that receipts from the sale of goods and services came in well ahead of expectations, thanks primarily to revenue from “.ai” domains; the category is forecast to hit EC$260.5 million (~$96.4 million) for the latest year. In 2023, domain name registrations made up about 73% of that wider category. Assuming a similar share this year would suggest the territory has raked in around $70 million from “.ai” domains in the past year.
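The back-of-the-envelope math behind that ~$70 million figure can be sketched as follows, using the figures from the article; carrying the 73% share forward from 2023 is the stated assumption, not a reported number:

```python
# Rough check of the ~$70M ".ai" revenue estimate.
# Figures come from the budget report; the 73% share is carried over from 2023.
goods_and_services_usd = 96.4e6  # forecast category receipts, ~EC$260.5M converted to USD
domain_share = 0.73              # ".ai" registrations' share of the category in 2023

estimate = goods_and_services_usd * domain_share
print(f"Estimated '.ai' revenue: ~${estimate / 1e6:.0f}M")  # ~$70M
```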

Anguilla typically charges $140 for a two-year domain registration, creating a steady stream of income, as some 90% of domains renew after two years. But auctions for expired “.ai” domains, sold via domain name registrar Namecheap, are where bigger numbers roll in — for example, the domain “you.ai” was bought for $700,000 last September, and in the past week alone, 31 expired “.ai” domains sold for a combined ~$1.2 million, per domain sale tracker NameBio.

$26B

Nvidia is planning on spending $26 billion to train its own AI open-weights models, according to a 2025 financial filing. Wired was first to report the information. Nvidia has released several of its own AI models, including the Nemotron reasoning model, as well as specialized ones for specific tasks.

Nvidia making its own large frontier models could allow the company to go head-to-head against some of its biggest AI customers.


Musk blurs the boundaries of his companies even more with joint xAI-Tesla AI agent project

Tesla and SpaceX CEO Elon Musk said Wednesday that Tesla and xAI, which is part of SpaceX, would work on a joint AI agent project called “Macrohard,” also referred to as “Digital Optimus,” as part of Tesla’s $2 billion investment in xAI. The collaboration would pair Grok with what Musk described as a real-time computer-controlling AI agent running on Tesla hardware.

In his post, Musk said Grok would serve as the higher-level “System 2” reasoning layer, directing “Digital Optimus,” a faster “System 1” system that processes the last five seconds of screen video and keyboard and mouse inputs to take action. He claimed the system would run inexpensively on Tesla’s low-cost AI4 chip alongside more expensive Nvidia chips at xAI, and suggested it could, “in principle,” emulate the function of entire companies. “No other company can yet do this,” he said.

Business Insider reported earlier Wednesday that Tesla was taking up the AI agent mantle as xAI’s similar project stalled, but Musk’s post suggests the initiatives are more intertwined than previously understood.

The collaboration marks the latest example of Musk’s companies working closely together, further blurring the lines between Tesla and the recently merged SpaceX–xAI entity.


Meta doubles down on custom inference chips after reportedly scrapping training chip

Meta said today that it’s expanding its custom silicon development to include four new generations of Meta Training and Inference Accelerator (MTIA) chips. The announcement comes just weeks after The Information reported that the social media company had scrapped its most advanced AI training chip, dubbed Olympus, after facing design challenges. In the meantime, it signed outside chip deals with Nvidia and Advanced Micro Devices.

Early in its recent conference call, Broadcom CEO Hock Tan sought to reassure investors that the custom chip specialist’s relationship with the social media giant was only getting stronger.

“Now contrary to recent analyst reports, Meta’s custom accelerator MTIA road map is alive and well,” he said. “We’re shipping now.”

The new road map suggests Meta’s in-house chips will focus more on inference, which has more predictable workloads, over training — a technically more demanding area dominated by Nvidia:

“MTIA 300 will be used for ranking and recommendations training, and is already in production. MTIA 400, 450 and 500 will be capable of handling all workloads, but we will primarily use these chips to support GenAI inference production in the near future and into 2027.”

Meta CFO Susan Li told attendees at Morgan Stanley’s tech conference earlier this month that the company “eventually” plans to expand its custom chip design to include training models.


Google completes acquisition of Wiz — its biggest ever

Today Google said it has completed its $32 billion acquisition of cybersecurity startup Wiz, the largest deal in the company’s history.

“This acquisition is an investment by Google Cloud to improve cloud security and enable organizations to build fast and securely across any cloud or AI platform,” the company wrote in the press release.

The companies agreed to the all-cash purchase last year, after quite a bit of back-and-forth.

Alphabet updated acquisitions chart (Sherwood News)


Sherwood Media, LLC produces fresh and unique perspectives on topical financial news and is a fully owned subsidiary of Robinhood Markets, Inc., and any views expressed here do not necessarily reflect the views of any other Robinhood affiliate, including Robinhood Markets, Inc., Robinhood Financial LLC, Robinhood Securities, LLC, Robinhood Crypto, LLC, or Robinhood Money, LLC.