Fueled by our increasingly digital lives, the importance of data centers has risen substantially over the last decade. They are the backbone of the Internet. The latest estimates suggest that the average connected person now accesses data center servers 3,000 to 4,000 times per day, as demand for digital services grows. Major digital operators are implementing a significant push to reduce the energy consumption of all data center components to mitigate their growing environmental impact. Semiconductor technologies are at the center of these efforts.
Summary of what happens every minute on the Internet
- 28,000 subscribers watching films/programs on Netflix
- €1.6 million spent on online shopping
- 500 hours of video content uploaded onto YouTube
- 2 million swipes made on Tinder
- 700,000 images shared on Instagram
- 9,000 new connections made on LinkedIn
- 5,000 downloads from TikTok
There are nearly 8,000 major data centers globally as of January 2021. A study by The Shift Project estimates that the digital industry as a whole emits around 4% of the world’s greenhouse gases, with its energy consumption increasing by nearly 6% year-on-year.
Data centers and climate change
The world’s largest data center operators are taking bold steps to reduce their energy consumption. The Climate Neutral Data Center Pact, which represents data center operators and trade associations, aims to make data centers climate neutral by 2030. The initiative was formed in 2021, and its signatories include the world’s biggest data center operators: Google, IBM, NTT, Amazon Web Services, Microsoft, and Intel. There are two complementary ways to achieve that objective. One is to generate the electricity they require solely from renewable sources, such as wind or solar. The other is to curb electricity usage by making their systems more efficient.
Electricity usage within data centers
One of the key challenges faced by any electronic system, including those in modern data centers, is the efficient use of power. Minimizing the data center’s power usage effectiveness (PUE), the ratio of total facility power to the power that actually reaches the IT equipment, is therefore a primary objective of every data center operator; the ideal value is 1.0. Although the utility supplies AC power, most of the circuitry in a data center (including the servers) runs on DC, so the incoming power must first be converted. Each subsystem uses a different voltage, which requires further conversion steps. Finally, each voltage must be routed to the specific location within the server where it is used.
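To make the metric concrete, here is a minimal sketch of how PUE is computed. The figures are purely illustrative assumptions, not measurements from any specific facility:

```python
# Power usage effectiveness (PUE): total facility power divided by the
# power delivered to the IT equipment. The ideal value is 1.0, meaning
# every watt drawn by the site reaches the servers.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Return the PUE ratio for a facility."""
    return total_facility_kw / it_equipment_kw

# Illustrative example: the site draws 1,500 kW in total, of which
# 1,000 kW powers the IT equipment; the rest goes to cooling,
# power conversion losses, lighting, and so on.
example = pue(1500.0, 1000.0)
print(f"PUE = {example:.2f}")  # 1.50, i.e. 50% overhead on top of IT load
```

A PUE of 1.50 means that for every kilowatt of useful computing, another half kilowatt is spent on overhead, which is exactly what operators try to drive down.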
All these power conversions come with a ‘transaction cost’ in terms of lost power. This is analogous to the way that people are charged a fee when they change euros into yen or dollars. In the currency case, the transaction fee is the only thing lost, but that is not true for power conversions. Lost power does not simply disappear: it manifests itself as heat. The less efficient the device, the more heat is produced, a double penalty for inefficiency, since removing that heat requires yet more energy for cooling.
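The compounding effect of those conversion steps can be sketched numerically. The per-stage efficiencies and the three-stage chain below are illustrative assumptions, not figures from the article:

```python
# Each conversion stage (e.g. AC-DC front end, intermediate bus
# converter, point-of-load regulator) loses a fraction of its input
# power as heat. Losses multiply through the chain.

def chain_efficiency(stage_efficiencies):
    """Overall efficiency of cascaded conversion stages."""
    eff = 1.0
    for e in stage_efficiencies:
        eff *= e
    return eff

# Hypothetical three-stage chain: 96%, 95%, and 92% efficient stages.
stages = [0.96, 0.95, 0.92]
overall = chain_efficiency(stages)

input_kw = 100.0
heat_kw = input_kw * (1.0 - overall)
print(f"overall efficiency: {overall:.1%}")          # about 83.9%
print(f"heat dissipated: {heat_kw:.1f} kW per {input_kw:.0f} kW input")
```

Even with each stage above 90%, roughly one sixth of the input power in this sketch ends up as heat that the cooling system must then remove, which is the double penalty described above.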
Conventional cooling methods include heat sinks, but these add size and weight. That is anathema for data centers, where floor space is very expensive: the more computing power that can be packed into the least space, the greater the profitability operators can derive. Cooling fans are another possibility, but they consume electricity, adding to operational expenses as well as the environmental impact.
Higher-performance semiconductors – the gateway to enhanced server efficiency
The development of new wide-bandgap materials, such as silicon carbide (SiC) and gallium nitride (GaN), is helping to improve data center PUE. These materials enable significant improvements in energy efficiency compared with traditional silicon-based power devices. Semiconductors made from wide-bandgap materials can switch at much higher speeds than their predecessors and convert power far more efficiently. These characteristics also mean that the power conversion hardware needs fewer and smaller components, resulting in savings in weight and space.
This has enabled the volume of the power subsystem to be reduced by around 30% compared with current systems that use pure silicon-based devices.
Higher power densities, with concurrent reductions in size, allow for smaller units that are significantly easier to modularize. This simplifies installation and removal, helping to reduce maintenance costs, which is critical for data centers that must operate continuously.
Improving energy efficiency
STMicroelectronics offers a number of solutions that help data centers become more power efficient, from climate control to storage power management. Key among these are silicon-carbide MOSFETs and gallium-nitride HEMTs that can convert power at 98% efficiency or better. These devices enable data center power supplies to attain higher efficiency levels and greater power densities than silicon technologies alone. Consequently, they can form the basis of power converters that comfortably satisfy new European Union efficiency benchmarks, which require 94% efficiency to be maintained even when systems are running at 50% of their maximum power.
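A quick check shows why the 98% device-level figure leaves comfortable headroom against the 94% benchmark. The two-stage supply topology below (e.g. a front-end stage followed by a DC-DC stage) is an illustrative assumption; only the 98% and 94% figures come from the text:

```python
# Sketch: a hypothetical two-stage power supply in which each stage
# achieves 98% efficiency, checked against the EU 94% benchmark that
# must hold even at 50% load.

PER_STAGE_EFFICIENCY = 0.98
EU_BENCHMARK = 0.94

overall = PER_STAGE_EFFICIENCY ** 2  # two cascaded stages
print(f"two-stage efficiency: {overall:.2%}")            # 96.04%
print("meets EU 94% benchmark:", overall >= EU_BENCHMARK)  # True
```

By contrast, two cascaded stages at a typical silicon-class 95% each would yield about 90.3% overall, below the benchmark, which illustrates why per-stage gains from wide-bandgap devices matter so much at the system level.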
Demand for digital services continues to grow, fueled by Artificial Intelligence (AI), 5G, and the Internet of Things (IoT). Keeping power usage under control is an important piece of the sustainability puzzle for data centers. The integration of SiC- and GaN-based technology into data centers enables them to operate at higher efficiencies, maximize floor space, and reduce operating costs across the facility while helping data center operators meet their sustainability goals.