As they ramp up their application of artificial intelligence (AI), organisations are quickly realising the need to rein in energy consumption and other associated costs.
With the accelerated adoption of AI, discussions have shifted from moving workloads into data centres to building or outfitting data centres specifically to support AI workloads.
Companies now look more closely at how efficiently their data centres run their power, cooling, and water systems, said Matthew Hardman, Asia-Pacific CTO at Hitachi Vantara.
Their conversations have transitioned towards power consumption and sustainability, Hardman said in a video call with FutureCIO.
Enterprises need to build capacity and capabilities, and figure out how to scale out. Providers of AI as a service, in particular, are having more detailed discussions around capacity, cooling, and density, he said.
Energy is the obvious hotspot, since training GPUs (graphics processing units) consume a lot of power, he noted. Cooling is another key area, as traditional networks generate significant heat.
Another factor organisations tend to overlook is data efficiency, he said.
Companies presume they need as much data as possible to train their AI models, but end up storing loads of data that are not necessarily relevant or effective, he noted. For example, they may be storing data that is not suitable for training specific models.
AI workloads aren’t a linear increase over traditional IT workloads. They reflect a significant change in datacentre capacity requirements, said Matthew Oostveen, Pure Storage’s vice president and CTO of Asia-Pacific Japan.

Conventional enterprise applications typically operate at rack densities of between 4kW (kilowatts) and 10kW. In comparison, modern AI environments regularly exceed 100kW per rack, driven by high-performance GPUs and continuous data movement, Oostveen said, citing figures from the International Energy Agency (IEA).
The agency further forecasts that electricity demand from data centres will more than double globally by 2030, hitting some 945 terawatt-hours (TWh). This figure is more than the entire power consumption of Japan today.
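A back-of-envelope calculation illustrates the scale of that jump in rack density. The figures below are taken from the ranges quoted above; the continuous-draw assumption is a simplification for illustration only.

```python
# Rough comparison of annual energy per rack, assuming continuous draw.
# 4-10 kW is the conventional enterprise range; 100 kW is the floor the
# article cites for modern AI racks. Illustrative, not vendor data.

CONVENTIONAL_KW_PER_RACK = (4, 10)   # typical enterprise rack density
AI_KW_PER_RACK = 100                 # modern AI racks regularly exceed this

def annual_energy_mwh(kw: float, hours: float = 8760) -> float:
    """Energy in MWh for a rack drawing `kw` continuously for `hours` (one year)."""
    return kw * hours / 1000

low, high = (annual_energy_mwh(kw) for kw in CONVENTIONAL_KW_PER_RACK)
ai = annual_energy_mwh(AI_KW_PER_RACK)

print(f"Conventional rack: {low:.0f}-{high:.0f} MWh/year")
print(f"AI rack:           {ai:.0f} MWh/year")
print(f"Ratio vs a 10 kW rack: {ai / high:.0f}x")
```

Even against the top of the conventional range, a single AI rack draws an order of magnitude more energy, which is why facility-level demand forecasts move so sharply.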
Power emerges as the main constraint
AI is the most significant driver of this increase, with electricity demand from AI-optimised data centres projected to more than quadruple by 2030, IEA said.
In advanced economies, data centres are projected to account for more than 20% of the growth in electricity demand between 2025 and 2030. This places the power sector in these economies back on a growth path, after years of stagnant or declining demand, IEA noted.
The effects will be particularly evident in some countries, such as Japan, where data centres will account for more than half of the growth in electricity demand. In Malaysia, these facilities will contribute upwards of one-fifth of growth in demand for electricity.
Energy has become the defining constraint for AI at scale, said Oostveen.
A single GenAI (generative AI) query can consume up to 10 times more electricity than a traditional web search, he said, noting that AI-specific electricity demand is growing at 30% annually in some regions.
In Asia-Pacific Japan, Pure Storage is already seeing AI plans run into real-world grid limitations.
“The implication is clear: every watt consumed by inefficient infrastructure is a watt unavailable to GPUs,” he said.
This, for one, is driving the move away from power-intensive spinning disks towards modern all-flash platforms, which can deliver higher performance per watt, he noted. Flash storage can use one-tenth the power and up to 94% less space than legacy disk, he said.
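The "performance per watt" argument above can be sketched with simple arithmetic. The one-tenth power ratio comes from the figure quoted above; the 50kW baseline for a legacy disk estate is a hypothetical assumption for illustration, not a vendor number.

```python
# Illustrative estimate of power freed by moving a disk estate to all-flash,
# using the "one-tenth the power" figure quoted in the article.
# DISK_ESTATE_KW is a hypothetical baseline, not a measured value.

DISK_ESTATE_KW = 50.0      # assumed draw of a legacy spinning-disk estate
FLASH_POWER_RATIO = 0.1    # flash at roughly one-tenth the power of disk

flash_kw = DISK_ESTATE_KW * FLASH_POWER_RATIO
freed_kw = DISK_ESTATE_KW - flash_kw

print(f"Flash estate draw:    {flash_kw:.1f} kW")
print(f"Power freed for GPUs: {freed_kw:.1f} kW")
```

Under these assumptions, 45kW of facility power budget is released, which can be reallocated to GPU racks rather than storage overhead.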
Oostveen believes energy efficiency, not raw performance, will determine a company’s competitive advantage in AI.
Pointing to research from Gartner, he said that, by 2027, some 40% of existing AI data centres will be operationally constrained by power availability.
Hardman urged organisations to be clear about the value they want to derive from AI, so they can better manage their infrastructure and datacentre costs.
“AI isn’t the product, it’s the outcome,” he said. “Is it going to add value, help you create new user experience or benefits for customers? If you’re not starting with that, then you’re not ready to start AI.”
Transitioning from proof-of-concept to production also is not straightforward, he noted.
Enterprises cannot simply purchase the latest GPU or AI model and expect a corresponding increase in output.
They also need to think about how to incorporate the systems into their environment, including managing governance and complexity and addressing potential barriers.
Energy and cooling are just part of the challenge around AI, Hardman said. Organisations also need to look at issues such as data governance and new attack vectors.
They have to monitor AI's responses to people and ensure it does not reveal information it should not, he said. Rogue AI use also should be managed.
Furthermore, traditional capacity planning is difficult with AI, Oostveen said, as compute demand can spike overnight when new models, agents, or edge inference use cases emerge.
This makes fixed, long-term infrastructure bets increasingly risky, he said.

He recommends that organisations move towards modular, subscription-based infrastructure models that allow them to scale capacity and performance incrementally.
This enables enterprises to redirect budgets as technology evolves, rather than lock their capital into assets that may be obsolete within a year, he said.
He also underscored the importance of building vendor-agnostic AI stacks.
“Infrastructure should make it possible to swap models, inference engines, or cloud providers in weeks, not years,” Oostveen said. “The goal is to ensure infrastructure agility keeps pace with software innovation, rather than becoming the bottleneck that slows AI adoption or inflates costs.”
Singapore to enforce energy efficiency
And while the need to control costs may be a strong motivation for companies to run more efficient data centres as their AI adoption accelerates, in Singapore, operating such facilities efficiently will soon be a legal requirement.
The Asian nation has just unveiled plans to introduce legislation as part of efforts to ensure data centres and cloud platforms are managed sustainably.
Singapore currently hosts more than 1.4 gigawatts of datacentre capacity, with demand for more expected to grow alongside the adoption of AI.
The country already has one of the highest concentrations of data centres in the region, said Senior Minister of State for Digital Development and Information Tan Kiat How.
The Singapore minister noted that data centres are foundational to the local digital economy, and essential to support its AI ambitions, but also are intensive users of resources, especially power and water.
It is, hence, necessary to ensure growth in Singapore’s datacentre capacity is sustainable, said Tan, who was speaking at the launch of Singtel Digital InfraCo - NVIDIA Centre of Excellence for Applied AI this week.
Data centres should tap green energy sources and take steps to improve their energy efficiency, he said, adding that the Singapore government has provided grants to help data centre operators transition to more energy-efficient IT equipment.
The IT Energy Efficiency standard also was released last June to guide them on choosing energy-efficient IT equipment and operating IT systems more efficiently.
To further drive such initiatives, the Ministry of Digital Development and Information will unveil the Digital Infrastructure Act to establish baseline energy efficiency requirements for all data centres, including existing and new facilities.
Slated to be introduced later this year, the new legislation will impose PUE (power usage effectiveness) mandates on data centres, amongst other requirements.
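PUE itself is a simple ratio: total facility energy divided by the energy delivered to IT equipment, with 1.0 as the theoretical ideal. The sketch below shows the calculation; the 1.3 figure in the example is a common industry benchmark used here as a placeholder, not the value Singapore will mandate.

```python
# PUE (power usage effectiveness) = total facility energy / IT equipment energy.
# Overhead such as cooling and power distribution pushes it above the ideal of 1.0.
# The 1.3 example below is a placeholder benchmark, not Singapore's mandated value.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Return the PUE ratio; lower is better, 1.0 is the theoretical minimum."""
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment load must be positive")
    return total_facility_kwh / it_equipment_kwh

# Example: 1.3 GWh of total facility energy against 1.0 GWh of IT load.
print(f"PUE: {pue(1_300_000, 1_000_000):.2f}")
```

A mandate of this kind effectively caps how much energy a facility may spend on overhead for every unit of energy that reaches the IT equipment.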
“By systematically raising the energy efficiency and sustainability of all our data centres, we can create more headroom to support the growing demands of Singapore’s digital economy, particularly with the push for AI adoption,” Tan said.
The new Act also will include requirements to bolster the security and resilience of data centres as well as major cloud service providers, he said. These will encompass mandated measures to manage security risks, minimise disruptions, and establish business continuity.
Finding sovereignty in AI
Data sovereignty is no longer just about compliance, Oostveen said. It has become a critical business risk tied to service continuity, trust, and control.
AI and data sovereignty complicate planning because they introduce new risks that sit alongside performance, cost, and scale, he said.
“Organisations now have to plan not just for how much infrastructure they need, but also where data and AI workloads are legally allowed to reside, who ultimately controls them, and how exposed they are to geopolitical or regulatory disruption,” he noted.
Companies will need to look at sovereignty as a design constraint, not an afterthought, as they work to mitigate the additional risks, he said.
Understand which data and AI workloads are truly sensitive, and keep them under sovereign control, he advised.
He also pointed to the use of hybrid and multi-cloud models to balance the need for resilience, flexibility, and innovation.
“Platforms need to be portable and governed by design, so organisations can adapt to changing regulations or jurisdictions without rearchitecting their environments,” Oostveen said. “Those that plan for sovereignty upfront, rather than reacting to mandates later, will be better positioned to scale AI safely while maintaining control, availability, and trust.”
