Phononic launches chip-level cooling kit for AI data centers

Phononic has announced the Thermal Kit, an intelligent cooling solution designed for artificial intelligence (AI) data centers. The company claims the kit addresses performance losses caused by thermal throttling and reduces infrastructure overprovisioning costs by delivering targeted, node-level cooling for high-power compute nodes.

According to Phononic, its Thermal Kit combines high-performance thermoelectric coolers with an integrated mechanical and thermal architecture, and is controlled via accessible firmware and software APIs. This setup enables precise, chip-level thermal management for processors and high-bandwidth memory, and operates as a supplement to existing liquid-cooled data center systems. The system is designed to identify and respond to thermal hotspots in milliseconds, aiming to maintain optimal compute temperatures and limit throttling during variable AI workloads.
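Phononic has not published details of its firmware or software APIs, but the behavior described above amounts to a closed-loop controller: poll chip and HBM temperature sensors on a millisecond timescale, and modulate thermoelectric cooler (TEC) drive to hold the die below its throttle point. The sketch below is purely illustrative and assumes everything (sensor readings, the `control_step` proportional controller, the 85 °C target) — none of it reflects Phononic's actual interface.

```python
# Illustrative sketch only (not Phononic's API): a node-level loop that reads
# die/HBM temperatures and raises TEC drive as the hotspot nears a throttle
# threshold, mirroring the millisecond-scale response the article describes.

TARGET_C = 85.0          # assumed throttle threshold for the accelerator die
POLL_INTERVAL_S = 0.005  # millisecond-scale polling, per the article's claim

def read_hotspot_temp_c(sensors):
    """Return the hottest reading across die/HBM sensors (stubbed here)."""
    return max(sensors)

def control_step(temp_c, drive, kp=0.05):
    """Simple proportional controller: raise TEC drive (0..1) as the
    hotspot temperature exceeds the target; clamp to the valid range."""
    error = temp_c - TARGET_C
    return min(1.0, max(0.0, drive + kp * error))

drive = 0.0
for sensors in [[72.0, 80.0], [88.0, 91.0], [86.0, 84.0]]:  # mock telemetry
    temp = read_hotspot_temp_c(sensors)
    drive = control_step(temp, drive)
    print(f"hotspot={temp:.1f}C drive={drive:.2f}")
```

A real implementation would use the vendor's telemetry and actuation interfaces and a tuned controller (e.g. PID with anti-windup) rather than this bare proportional step.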

The company reports that data center operators currently overprovision cooling capacity by up to 78 percent to accommodate unpredictable AI workload spikes, contributing to both wasted energy and capital expenditure. Phononic states its cooling technology can reduce thermal throttling, which it says can otherwise cut performance by up to 30 percent, without redesigning silicon or requiring substantial infrastructure changes. Applications detailed in the release include transformers, generative AI models, large-batch training, and large language model inference.
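As a back-of-envelope illustration of those two figures, the calculation below applies the quoted ratios to assumed baseline numbers. Only the 78 percent overprovisioning and 30 percent throttling-loss ratios come from the release; the 1,000 kW cooling load and 500 TFLOPS node throughput are hypothetical placeholders.

```python
# Back-of-envelope illustration of the figures quoted in the article.
# Only the 0.78 and 0.30 ratios come from the release; other inputs are assumed.
steady_cooling_kw = 1000.0   # hypothetical steady-state cooling need
overprovision_ratio = 0.78   # article: up to 78% extra capacity provisioned
provisioned_kw = steady_cooling_kw * (1 + overprovision_ratio)

peak_tflops = 500.0          # hypothetical peak node throughput
throttle_loss = 0.30         # article: performance drops of up to 30%
throttled_tflops = peak_tflops * (1 - throttle_loss)

print(f"provisioned cooling: {provisioned_kw:.0f} kW "
      f"({provisioned_kw - steady_cooling_kw:.0f} kW of spike headroom)")
print(f"throttled throughput: {throttled_tflops:.0f} of {peak_tflops:.0f} TFLOPS")
```

On these assumptions, nearly 780 kW of cooling capacity sits idle outside workload spikes, and a throttled node delivers 350 of 500 TFLOPS, which is the gap Phononic's claims target.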

Phononic claims additional operational benefits: improved utilization and extended life of high-value assets, along with lower facility energy consumption by running secondary cooling loops at warmer settings to reduce chiller demand. The company also notes that it holds a position in optical transceiver cooling and states its solutions are deployed with tier 1 hyperscalers and major equipment manufacturers globally.

Matt Langman, Senior Vice President and General Manager of Infrastructure Solutions at Phononic, said, “The Thermal Kit is designed to meet one of the biggest challenges of today’s AI data centers: cooling,” adding, “For operators facing unprecedented power demands for today’s AI workloads, it is mission critical to maintain performance through reductions in thermal throttling and optimized energy use of existing liquid-cooled infrastructure. With this breakthrough, customers can unlock higher compute capability and deliver meaningful data center wide ROI.”

Source: Phononic
