Once, the world’s richest men competed over yachts, jets and private islands. Now the size-measuring contest of choice is clusters. Just 18 months ago, OpenAI trained GPT-4, its then state-of-the-art large language model (LLM), on a network of around 25,000 then state-of-the-art graphics processing units (GPUs) made by Nvidia. Now Elon Musk and Mark Zuckerberg, bosses of X and Meta respectively, are waving their chips in the air: Mr Musk says he has 100,000 GPUs in one data centre and plans to buy 200,000. Mr Zuckerberg says he’ll get 350,000.