
February 24, 2026

Vertiv Q4 orders +252% as hyperscalers plan $650bn spend
atNorth 300MW Sollefteå Sweden campus targeting H1 2028
FERC EL25-49 pushes PJM non-firm interconnect “connect while building”
NVIDIA BlueField DPUs bring zero-trust OT security with Siemens

The sharpest tell in today’s tape is how quickly the power-and-gear supply chain is being pulled into the data centre boom. In one place, the numbers are almost comical: Vertiv says Q4 orders jumped 252%, with data centres now 80–85% of revenue and backlog covering more than 100% of its 2026 sales guidance. When Vertiv, Eaton, GE Vernova and Legrand all talk about swelling backlogs while hyperscalers plan $650bn of data-centre and power-equipment spend this year, “AI buildout” stops being a narrative and starts being an industrial cycle.

The Big Stories

The clearest near-term winners are the firms that touch every MW of new compute. Electrical equipment makers Vertiv, Eaton, GE Vernova and Legrand reported large order increases and expanding backlogs as hyperscalers plan $650bn in data-centre and power-equipment spending this year. Vertiv’s 252% Q4 order growth, paired with a backlog that already covers more than 100% of its 2026 revenue guidance, is a blunt indicator: demand is outrunning delivery capacity, and the constraint is increasingly “how fast can you ship and install,” not “is there demand.” The investor implication is straightforward: this cycle is leaking out of the REIT/operator universe and into electrification and thermal management balance sheets.

In the Nordics, the scale just keeps climbing. atNorth will develop a 300MW campus on a 50-hectare plot at Hamre Industrial Park in Långsele, Sollefteå, Sweden, targeting operations in H1 2028. The company is leaning on modular design, renewable power, and heat-reuse partnerships—pretty much the Nordic playbook—but 300MW is a reminder that “green compute” regions aren’t staying boutique. Watch the timing: H1 2028 puts this in the next wave of capacity that arrives after today’s grid queues and supply-chain bottlenecks have (hopefully) been forced to adapt.

Grid interconnection workarounds are becoming policy, not just engineering. Enchanted Rock is pushing flexible, dispatchable onsite generation co-located with hyperscale data centres to accelerate interconnection, arguing a 500MW site could operate three to five years sooner and cut grid costs by $78m per GW. The notable detail is regulatory: FERC Order EL25-49 tells PJM to establish non-firm pathways—effectively “connect while building”—with PJM first to implement and other regions circling similar ideas. If this sticks, it’s a shift in bargaining power: the fastest projects won’t necessarily be those with the best substation, but those that can show up with credible, dispatchable onsite capacity and operational discipline.
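As a back-of-envelope check on the figures above, the cited $78m-per-GW grid-cost saving scales linearly to the 500MW example site. A minimal sketch (the per-GW and site-size numbers are from the briefing; everything else is arithmetic, not a quoted figure):

```python
# Back-of-envelope on the Enchanted Rock claim: $78m grid-cost savings
# per GW of co-located dispatchable generation, applied to a 500MW site.
SAVINGS_PER_GW_USD = 78_000_000  # cited saving per GW
SITE_MW = 500                    # cited example site size

site_gw = SITE_MW / 1000
grid_savings = site_gw * SAVINGS_PER_GW_USD

# Implied avoided grid cost for the 500MW site, before counting the
# value of operating three to five years sooner.
print(f"Implied grid-cost saving: ${grid_savings / 1e6:.0f}m")
```

At the cited rate, a 500MW site implies roughly $39m in avoided grid costs; the larger economic lever is still the three-to-five-year acceleration in revenue-generating operation.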

Cybersecurity is increasingly being productised around “AI factories,” and the OT layer is next. NVIDIA is teaming with Akamai, Forescout, Palo Alto Networks, Xage Security and Siemens to bring AI-driven zero-trust to OT environments using BlueField DPUs, with demos at S4x26 in Miami. The pitch—agentless segmentation, local inspection/enforcement at the edge, and centralized AI threat analysis—reads like an attempt to make OT security scale the way hyperscale networking does. Why it matters: as more power gear, cooling, and facility controls become remotely managed and instrumented for efficiency, the OT attack surface becomes a board-level risk, not an ops footnote.

Cooling is heading toward a less intuitive destination: hotter water. NVIDIA says its Vera Rubin processor can be cooled with 45°C water, eliminating the need for chillers and improving AI data-centre efficiency. Vendors across the stack are lining up behind liquid cooling, with the market pegged near $3bn in 2025 and projected to reach $7bn by 2029. The practical takeaway is that “cooling” is becoming a system design decision (water temps, heat rejection strategy, facility layout), not a bolt-on—yet another reason the capex mix is tilting toward mechanical/electrical complexity rather than just shells and racks.
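The liquid-cooling market figures above imply a steep compound growth rate. A quick derivation (the $3bn/2025 and $7bn/2029 endpoints are from the briefing; the CAGR is computed here, not quoted):

```python
# Implied compound annual growth rate (CAGR) behind the cited
# liquid-cooling market trajectory: $3bn in 2025 to $7bn in 2029.
start_value_bn = 3.0  # cited 2025 market size, $bn
end_value_bn = 7.0    # cited 2029 projection, $bn
years = 2029 - 2025

# CAGR = (end / start)^(1/years) - 1
cagr = (end_value_bn / start_value_bn) ** (1 / years) - 1
print(f"Implied CAGR 2025-2029: {cagr:.1%}")
```

That works out to roughly 24% a year, growth more typical of an early-adoption technology segment than of mature facility equipment.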

