Forget remote server farms in the desert. The next battle for AI supremacy is being fought in your backyard. A quiet suburb just 20 miles from Dallas is now the epicentre of a $2 billion gamble that will dictate how fast your ChatGPT responds, how accurately your AI assistant works, and how seamlessly artificial intelligence integrates into your life.

The player? DataBank, a Dallas-based operator that has just secured a colossal loan to build a new breed of data centre. Their secret weapon? Proximity. They're not building for AI training; they're building for AI *use*. And it’s a move that has every major tech company from Amazon to Meta scrambling for a piece of the action.

The "Inference" Revolution: AI Moves Next Door

Until now, the AI boom has been powered by colossal, isolated facilities built to *train* complex models. But training is only half the story. The real magic—and the next trillion-dollar challenge—is called "inference." This is the moment you ask a question and the AI answers. It requires lightning speed and ultra-low latency, which means the computers doing the work can't be thousands of miles away.
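The distance argument comes down to simple physics: signals in optical fibre travel at roughly two-thirds the speed of light, so every extra kilometre adds unavoidable delay. A rough back-of-the-envelope sketch (the fibre speed is a standard approximation, and the distances are illustrative, not figures from this article):

```python
# Why inference workloads move close to users: propagation delay alone
# penalises distant facilities, before any compute time is counted.
# Assumption: signals travel through optical fibre at roughly 200,000 km/s
# (about two-thirds of the speed of light in a vacuum).

FIBRE_SPEED_KM_PER_S = 200_000  # approximate signal speed in fibre

def round_trip_ms(distance_km: float) -> float:
    """Minimum round-trip propagation delay in milliseconds."""
    return 2 * distance_km / FIBRE_SPEED_KM_PER_S * 1000

# A remote facility ~2,000 km away vs a metro-edge site ~30 km away
remote = round_trip_ms(2000)  # 20.0 ms of delay before any work is done
local = round_trip_ms(30)     # 0.3 ms

print(f"remote: {remote:.1f} ms, local: {local:.1f} ms")
```

Twenty milliseconds may sound small, but a single AI response can involve many round trips, and real networks add routing and queuing delays on top of this floor, so the gap compounds quickly.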

"Why do data centre users want to be close to metro markets? Because that's where most of the fibre is and that's where people are," explains Raul Martynek, CEO of DataBank. "Who consumes these products? People." His company's new 240-megawatt campus in Red Oak, Texas, is fully leased by a single, unnamed hyperscaler—one of the giants like Google or Microsoft—betting big on this very idea.

From 9% to 37%: The Staggering Data Centre Shift

The numbers reveal an industry on the brink of transformation. In 2025, just 9% of global data centre workloads were for inference computing. By 2030, experts at JLL predict that figure will explode to 37%.

"It's on the forefront of a lot of minds," says Carl Beardsley of JLL. "The next phase is once your models have learned... to convert that over to inference and be closer to the population centre." This isn't a minor trend; it's a fundamental re-engineering of the internet's physical backbone, moving critical computing power from the middle of nowhere to the edge of major cities.

Why a $2 Billion Loan Signals a Nervous Market

Securing this financing wasn't simple. Even as demand soars, lenders are getting cold feet about the enormous debts piling up in the AI sector. DataBank's deal with Mitsubishi UFJ Financial Group and others hit snags, forcing the company to break off a $600 million portion for a fourth building and seek private Wall Street investors instead.

"It took us longer than we originally anticipated," Martynek admits, pointing to a broader slowdown in bank syndication for mega-projects. This caution around AI financing reveals a market that's both red-hot and increasingly risky, as the financial world tries to keep pace with technology's breakneck speed.

The inference-optimised data centres in Red Oak are slated to begin opening in late 2026. When they do, they won't just be buildings full of servers. They will be the invisible engines in your neighbourhood that make the AI of tomorrow feel instant, intuitive, and utterly seamless. The age of distant AI is over. The era of local, lightning-fast intelligence is just beginning.