
A New System for Cooling Down Computers Could Revolutionize the Pace of Innovation

In 1965, Gordon Moore, a co-founder of Intel, forecast that computing would increase in power and decrease in price exponentially. For decades, what later became known as Moore’s Law proved true, as microchip processing power roughly doubled and costs dropped every couple of years. But as power increased exponentially, so did the heat produced by packing billions of transistors atop a chip the size of a fingernail.

As electricity passes through those processors, it meets resistance and creates heat. More processors mean higher temperatures, threatening the continued growth of computing power, because as chips get hotter they decrease in efficiency and eventually fail. There’s also an environmental cost. Those chips, and the cooling they require, devour power with an insatiable hunger. Data centers use roughly one percent of the world's electricity. In the United States alone, they consume electricity and water for cooling roughly equivalent to that used by the entire city of Philadelphia in a year.
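The physics behind that heating is ordinary Joule heating; as a reminder of the scaling (textbook physics, not a figure from the study), the power dissipated as heat in a resistive element is

$$ P = I^{2} R $$

where $I$ is the current and $R$ is the resistance, which is why every additional ampere pushed through a chip's interconnects costs disproportionately more heat.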

Now, Swiss researchers have published a study in the journal Nature that describes one solution to the cooling problem. “Data centers consume a huge amount of electricity and water, and as we rely more and more on this data, this consumption is just going to increase,” says Elison Matioli, a professor in the Institute of Electrical Engineering at Ecole Polytechnique Fédérale de Lausanne (EPFL) who led the study. “So finding ways to deal with the dissipated heat or dissipated power is an extremely important issue.”

Previous attempts to cool microchips have relied on metal heat sinks, often combined with fans, that absorb heat and act like an exhaust system. Some data centers rely on fluid flowing through servers to draw away heat. But those systems are designed and fabricated separately and then combined with the chips. Matioli and his team have instead designed and fabricated chips and their fluid cooling systems together. In the new design, the cooling elements are integrated throughout the chip: microchannels for fluid, built into the semiconductor itself, spirit away the heat, save energy, and mitigate the environmental problems created by data centers.
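One way to see why integration helps is a simple thermal-resistance picture (an illustrative simplification, not a calculation from the paper). Heat leaving a chip crosses a chain of thermal resistances, and the chip's temperature rise scales with their sum:

$$ \Delta T \approx q \left( R_{\text{chip}} + R_{\text{interface}} + R_{\text{coolant}} \right) $$

where $q$ is the heat flow. A separately attached heat sink or cold plate adds an interface term; etching the coolant channels directly into the semiconductor effectively removes $R_{\text{interface}}$, so the same heat flow produces a smaller temperature rise.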

Their work also could have important applications in an electrified future, helping eliminate the heat problem and shrink the power converters in cars, solar panels and other electronics. “The proposed technology should enable further miniaturization of electronics, potentially extending Moore's Law and greatly reducing the energy consumption in cooling of electronics,” they write.

Heat produced by chips in electronics has been an issue as far back as the 1980s, according to Yogendra Joshi, an engineering professor at Georgia Tech who was not a part of the study. Early microprocessors, like Intel's first central processing unit released in 1971, didn't create enough heat to require cooling. By the 1990s, fans and heat sinks were integrated into virtually all central processing units—the physical heart of the computer that includes the memory and calculation components—as increased power created increased heat. But relying on metallic heat sinks that draw the heat away and dissipate it through the air increases the temperature of the entire device, creating a loop that just generates more heat. “Electronics typically don't work really well when they are hot,” Matioli adds. “So in a way, you decrease the efficiency of the entire electronics, which ends up heating the chip more.”

Researchers explored microfluidics, the science of controlling fluids in tiny channels, as far back as the early 1990s. Efforts increased after the U.S. Department of Defense’s Defense Advanced Research Projects Agency (DARPA) first became interested in the technology in the late 1990s; the agency took a deeper interest in 2008, as the number of heat-producing transistors on a microprocessor chip went from thousands to billions. Joshi estimates that the agency has spent $100 million on research, including funding what it called ICECool programs at IBM and Georgia Tech beginning in 2012.

Over the years, embedding liquid cooling in chips has been explored through three basic designs. The first two designs did not bring cooling fluid into direct contact with the chip. One used a cold plate lid with microfluidic channels to cool chips. Another featured a layer of material on the back of chips to transfer heat to a fluid-cooled plate without the lid. The third design, the one that Matioli and his team explored, brings the coolant into direct contact with the chip.

Matioli’s research builds on work by Joshi and others. In 2015, Joshi and his team reported cutting fluid channels directly into integrated circuits, yielding temperatures 60 percent lower than air cooling. “Cooling technology is absolutely going to be critical and using fluids other than air is a key part of being able to take away these very large heat rejection requirements put out by the computers,” Joshi says. “And you want to have the coolant where the heat is being produced. The further away it is, the less effective at a very high level it's going to be.”

That’s what Matioli’s research advanced. To test their concept, the team designed a water-cooled chip that converts alternating current (AC) into direct current (DC), with microchannels filled with water built into the same semiconductor substrate. The substrate they used was gallium nitride rather than the typically used silicon, which enabled much greater miniaturization. The result, according to the paper, is cooling power up to 50 times greater than conventional designs.
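As a rough way to read that figure, a textbook convection relation (standard heat-transfer shorthand, not the paper's own analysis) ties the heat a coolant can remove to how much surface it touches and how vigorously it flows:

$$ q = h A \left( T_{\text{chip}} - T_{\text{fluid}} \right) $$

where $h$ is the convective heat-transfer coefficient and $A$ the wetted area. Embedding many narrow channels inside the substrate raises both $h$ and $A$ at once, multiplying the heat removed for the same temperature difference.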

The trick was finding a new way to fabricate chips so the fluid channels, ranging from 20 microns (the width of a human skin cell) to 100 microns wide, were as close as possible to the electronics. They combined those with larger channels on the back of the chip to reduce the pressure needed to make the liquid flow. “The analogy is it's like our bodies,” Matioli says. “We have the larger arteries and the smaller capillaries and that's how the entire body minimizes the pressure necessary to distribute blood.”
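The analogy tracks standard laminar-flow behavior. By the Hagen–Poiseuille relation (textbook fluid mechanics, not a derivation from the study), the pressure needed to push a flow rate $Q$ of a fluid with viscosity $\mu$ through a channel of radius $r$ and length $L$ is

$$ \Delta P = \frac{8 \mu L Q}{\pi r^{4}} $$

so the pressure penalty grows steeply as channels narrow. Routing most of the flow through wide “artery” channels on the back of the chip, and keeping the narrow 20-to-100-micron “capillaries” short, keeps the pumping pressure, and hence the pumping energy, low.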

The cooling technology has the potential to become a key part of power converters ranging from small devices to electric cars. The converter Matioli's team created pushed out more than three times the power of a typical laptop charger but was the size of a USB stick. He compares it to the evolution of a computer that once filled a room and now fits into a pocket. “We could start imagining the same thing for power electronics in applications that go all the way from power supplies to electric vehicles to solar inverters for solar panels and anything related to energy,” Matioli says. “So that opens a lot of possibilities.”

His team is getting interest from manufacturers, but he declined to go into detail. To Joshi, the research is a first step. “There remains more work to be done in scaling up of the approach, and its implementation in actual products.”

In a commentary accompanying the Nature paper, Tiwei Wei, a research scholar at Stanford University who was not a part of the study, also said challenges remained in implementing the design, including studying the longevity of the gallium nitride layer and possible manufacturing issues. But their work, he says, “is a big step towards low-cost, ultra-compact and energy-efficient cooling systems for power electronics.”