
AI Adoption Creating Demand For ‘Last-Mile’ Data Centers

Artificial intelligence computing demand is shifting as more people use the technology, and it is expected to push data centers closer to population centers. 

Big Tech’s AI arms race has sparked a data center megacampus boom — a wave of planned projects with hundreds of megawatts of capacity, built primarily to provide the computing power for “training” new AI models.
Now, as corporations and consumers increasingly embrace AI products and services, this adoption is driving a surge in demand for the computing through which users interact with these models — known as inference. 
In the months ahead, the need for AI inference computing is expected to account for a higher share of data center demand than ever before. The demand shift from training to inference is already underway, and industry leaders at Bisnow’s DICE: National event last month said it will fundamentally transform the data center development landscape.  
Instead of the massive, centralized clusters of concentrated computing power developed for AI training, inference will often be deployed in smaller facilities near major metro areas or specific end users. In the parlance of the data center industry, inference is about to spark an explosion of demand at the “edge” — from 50-megawatt data centers built on the outskirts of major cities to dumpster-sized modular data centers deployed in a rural Walmart parking lot. 
The coming increase in inference demand creates an opportunity for data center developers and providers, but Harrison Street Managing Director Michael Hochanadel said it also represents a significant challenge for the industry.
“There is going to be a flood,” Hochanadel said at the event, held at The National Conference Center in Leesburg, Virginia. “Inference is going to become way more important, and we don’t currently have the infrastructure to support the compute that is going to handle that inference.”
An estimated 80% of AI data center workloads today are for training, with just 20% devoted to inference. Yet a growing body of evidence suggests this gap is narrowing. 
Data center REIT Equinix reported better-than-expected quarterly results in April on the back of an increase in AI inference demand. Last week, Nvidia reported a “sharp jump in inference demand” over the preceding three months, a development CEO Jensen Huang said marked an inflection point in the evolution of AI computing. 
“We’ve entered an era where inference is going to be a significant part of the compute workload,” Huang said on the company’s earnings call Wednesday.
With inference projected to account for the bulk of AI data center demand by the end of the decade, the infrastructure needed to support those workloads will look different from the training-oriented projects that have dominated development pipelines for nearly three years. 
While the need for massive blocks of power has pushed the largest training-focused data center projects well beyond the borders of traditional hubs, much of the data center capacity needed to support AI inference will need to be located closer to the industry’s constrained major markets and other large population centers.
The reason: latency, or the amount of time it takes data to travel from a data center to an end user. Latency isn’t a significant consideration when it comes to AI training, but it is critical for most inference use cases. While the computing to build an AI model to detect fraud for a financial firm can happen anywhere, applying that model to flag fraudulent transactions in real time requires computing near where the bulk of those transactions take place. 
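A back-of-envelope calculation (not from the article) shows why distance alone pushes inference toward the edge: light in optical fiber travels at roughly two-thirds the speed of light in a vacuum, so the physical distance to a data center sets a hard floor on round-trip latency before any routing or compute time is added. The figures below are illustrative assumptions, not measurements.

```python
# Illustrative sketch: propagation delay as a lower bound on latency.
# Assumes signal speed in fiber of ~200,000 km/s (about 0.67c).

FIBER_SPEED_KM_PER_MS = 200.0  # kilometers traveled per millisecond in fiber


def min_round_trip_ms(distance_km: float) -> float:
    """Lower bound on round-trip latency from propagation delay alone."""
    return 2 * distance_km / FIBER_SPEED_KM_PER_MS


# An edge site 50 km from the user vs. a remote megacampus 2,000 km away:
print(min_round_trip_ms(50))    # 0.5 ms floor
print(min_round_trip_ms(2000))  # 20.0 ms floor
```

Real-world latency is several times these floors once routing, queuing and processing are included, which is why latency-sensitive inference favors facilities near the end user.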
“We will see an evolution as the AI thesis plays out … we’ll start to see some of that capacity come back to core markets,” said Angela Harding, associate director at Macquarie Asset Management and a member of Aligned Data Centers’ board of directors.
“You are seeing a true differentiation between an AI training facility, which is just going where the power is and is not latency-sensitive, and inference — when end users are starting to use the model and it does need that low latency, those deployments will start to come back closer to end users.”

Bisnow/Adam Knobloch
Cushman & Wakefield’s Bill Collins, Harrison Street’s Michael Hochanadel, National Real Estate Advisors’ T.J. Hofheimer, Macquarie Asset Management’s Angela Harding, Wharton Equity Partners’ Adam Krupp and Mintz’s Jeffrey Moerdler at Bisnow’s DICE: National event May 20, 2025, in Virginia.
As inference fuels demand for smaller data centers close to areas with large concentrations of customers, the evolution of the digital infrastructure ecosystem to support AI will mirror the evolution of the logistics networks that developed with the emergence of e-commerce, Hochanadel said. 
The hyperscale megacampuses being developed today are the digital equivalent of big-box distribution centers for companies like Amazon and Walmart: massive facilities typically located along key highways but far from metro areas. While these hubs are the heart of e-commerce and retail giants’ logistics systems, delivering products to customers quickly required a network of strategically located regional hubs and “last-mile” logistics facilities in key population centers.
The next wave of data center development, Hochanadel said, will be the build-out of such last-mile edge data centers needed to get data to customers at the speeds they require.
“There’s going to be this last-mile piece, which is going to be 20-to-50-megawatt inference design locations,” he said. “I think it’s going to be required across all the major metros.”
The shift toward inference is already fueling this kind of edge development, said Adam Krupp, managing director at Wharton Equity Partners.
He pointed to his firm’s project near Cincinnati where, although power has not yet been secured for the sub-100-megawatt site, colocation providers have proactively approached them about deploying what he terms “infill” data centers for inference computing.
“Right around 50 megawatts seems to be the sweet spot,” Krupp said. “The focus is going to be on these smaller infill sites going forward.”
Yet build-out of such midsized data centers near industry hubs isn’t the only way inference demand is starting to drive development at the edge. 
Kanan Joshi, global head of digital infrastructure for CVC DIF, said inference is already serving as a growth driver for colocation providers in lower-tier markets like Boise. 
Joshi points to CVC DIF’s portfolio company ValorC3, a colocation provider focused on smaller markets with facilities in Boise, Oklahoma City and St. George, Utah. The firm’s traditional tenants were local enterprises, but Joshi said there has been growing demand from hyperscalers and other large tech firms trying to ensure they will be able to offer low-latency AI products and services to users in these smaller markets.
She said the firm pursued a greenfield expansion project in Boise based largely on this demand. 
“Now we are seeing a lot of the tech companies and internet and SaaS companies coming into these tier-two and tier-three markets, trying to get ahead of their AI deployments,” Joshi said. “The existing data centers that we had were too small … we’re talking about building less than 10 megawatts, but that’s still pretty large for these smaller markets.”
Inference is also boosting demand for containerized, modular edge data centers, even far from large population hubs, said Doug Recker, founder and president of Duos Edge AI, which provides edge computing solutions primarily for educational institutions and hospitals.
Recker said demand has shot up, with 15 deployments expected by the end of the year across small markets in Texas.
Rather than hyperscalers, Recker said inference demand for modular edge units is coming from enterprise and institutional customers that want to deploy their own AI computing in areas with insufficient connectivity or colocation options. And he said new use cases will continue to emerge, particularly in agriculture and as firms with national footprints like Walmart look to deploy AI computing at their facilities in markets with little in the way of digital infrastructure. 
“The Walmarts, the big super stores that are out in these markets, they have so much data that’s computed now — they keep eyes on you to know which product you pick off the shelf and put it in your cart — and all that compute has to be localized,” Recker said. “You’re going to start seeing them put small little hubs in their parking lot. AI is really a factor out there … there’s definitely a need and several use cases already.” 
