Researchers present promising approach to self-cooling microchips


With Nvidia’s forthcoming RTX 30 series expected to draw up to 350W of power, we shudder to think what sort of heat that card is going to kick out.

But cooling is not just a problem for PC builders at home. In a data centre, heat needs to be evacuated from servers as quickly as possible, and water is a good way to do that.

Unfortunately, the amount of water needed is immense. A report from Bloomberg in April revealed that in 2019, Google requested or was granted more than 2.3 billion gallons of water for data centres in three different US states.

There has to be a better way of dealing with heat from chips, right?

Well, that’s where a team of Swiss researchers comes into the picture.

This week Remco van Erp, Reza Soleimanzadeh, Luca Nela, Georgios Kampitsis and Elison Matioli published a paper in the journal Nature titled “Co-designing electronics with microfluidics for more sustainable cooling”.

The premise is simple – integrate a microchip’s cooling into the chip itself rather than having two separate products.

Now, we’re aware that this has been tried before, but this time the researchers suggest building the chip and its cooling together rather than fabricating two units separately and combining them afterwards.

It’s best explained by Tiwei Wei, a mechanical engineer and author of a companion paper.

The engineer says that integrating cooling channels co-fabricated into the chip is a breakthrough.

“The buried channels are therefore embedded right below the active areas of the chip, so that the coolant passes directly beneath the heat sources,” wrote Wei.

The researchers etched and widened coolant channels directly into a chip’s substrate. These channels were then sealed off with copper, and the chip’s other electronics were built on top of them.

The design was tested using water as a coolant and the researchers report that cooling power was up to 50 times greater than other solutions.
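To see why greater cooling power matters, it helps to recall the basic steady-state relationship between a chip’s power draw, its thermal resistance to the coolant, and its junction temperature. The sketch below is purely illustrative: the wattages and thermal-resistance values are assumptions for demonstration, not figures from the paper.

```python
# Illustrative only: steady-state junction temperature from thermal resistance.
# All numeric values below are assumed for demonstration, not from the paper.

def junction_temp(coolant_temp_c: float, power_w: float, r_th_k_per_w: float) -> float:
    """T_junction = T_coolant + P * R_th (simple 1D steady-state model)."""
    return coolant_temp_c + power_w * r_th_k_per_w

# A hypothetical 100 W chip with 25 degC coolant:
conventional = junction_temp(25.0, 100.0, 0.50)  # assumed R_th of a separate cold plate
embedded = junction_temp(25.0, 100.0, 0.01)      # assumed 50x lower R_th with embedded channels

print(conventional)  # 75.0 degC
print(embedded)      # 26.0 degC
```

In this toy model, cutting thermal resistance by a factor of 50 drops the chip from 75 °C to barely above the coolant temperature, which is the kind of headroom that would let a chip run harder without throttling.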

So great then, right? We should start seeing self-cooled chips any day now?

Not really.

There are a few problems with this design, chiefly around bonding the substrate surface to the cooling channels during fabrication. According to Wei, these materials could prove problematic for long-term stability in different environments.

The other problem is that water might not be the best coolant here, given that it doesn’t exactly play well with electricity.

But this is an interesting step toward squeezing even more performance out of increasingly smaller chips while maintaining reasonable temperatures.

[Source – Inverse]

Brendyn Lotz

Brendyn Lotz writes news, reviews, and opinion pieces for Hypertext. His interests include SMEs, innovation on the African continent, cybersecurity, blockchain, games, geek culture and YouTube.