Mythic announces analog AI chiplets delivering 100x energy efficiency for data centers

Mythic has announced a $125 million funding round led by DCVC, aimed at deploying its Analog Processing Units (APUs) for data center and edge AI workloads. The company claims its APUs are 100 times more energy-efficient than current graphics processing units (GPUs) and competing AI application-specific integrated circuits (ASICs), and says its architecture addresses the energy challenges inherent to AI computation, particularly for data centers running large language models and other intensive AI inference workloads.

Mythic’s APUs use an in-memory analog computation architecture that eliminates the divide between memory and compute, a bottleneck in conventional von Neumann architectures, where memory and processors are physically and operationally separate. According to Mythic, this unified approach enables its chiplets to reach 120 trillion operations per second per watt, delivering both large energy savings and throughput improvements: a single multiply-accumulate (MAC) operation consumes only 17 femtojoules, which the company says is 1,000 times more efficient than typical GPUs.
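As a sanity check, the two quoted figures are mutually consistent if a MAC is counted as two operations (one multiply plus one add), the usual convention in TOPS ratings:

```python
# Cross-check Mythic's quoted figures: 120 TOPS/W and ~17 fJ per MAC.
# A MAC is conventionally counted as two operations (multiply + add).
ops_per_second_per_watt = 120e12                 # 120 trillion ops/s per watt
joules_per_op = 1.0 / ops_per_second_per_watt    # energy per single operation
femtojoules_per_mac = joules_per_op * 2 * 1e15   # two ops per MAC, joules -> fJ

print(f"{femtojoules_per_mac:.1f} fJ per MAC")   # prints "16.7 fJ per MAC"
```

The result, about 16.7 fJ per MAC, matches the announced ~17 fJ figure.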

The company states its APUs scale with network complexity and can run large language models with one trillion or more parameters. Internal benchmarks cited by Mythic show up to 750 times more tokens per second per watt than NVIDIA’s highest-end GPUs on these workloads. Unlike GPUs, which require high-speed interconnects such as NVLink, Mythic’s APUs need no specialized inter-chip connectivity when scaling to large models. Mythic claims its next-generation devices reach half a cent per million tokens for 100-billion-parameter models and four cents per million tokens for one-trillion-parameter models, representing up to an 80 times cost improvement over modern GPUs.
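Taking the cited per-token costs and the claimed 80x advantage at face value, the implied GPU baseline works out as follows (a back-of-the-envelope check, not a figure published by Mythic):

```python
# Implied GPU baseline from Mythic's claimed costs and "up to 80x" advantage.
apu_cost_100b = 0.005   # dollars per million tokens, 100B-parameter model
apu_cost_1t = 0.04      # dollars per million tokens, 1T-parameter model
advantage = 80          # claimed cost improvement factor

gpu_cost_100b = apu_cost_100b * advantage   # about $0.40 per million tokens
gpu_cost_1t = apu_cost_1t * advantage       # about $3.20 per million tokens
```

In other words, the claim implies GPU inference costs of roughly 40 cents and $3.20 per million tokens for the two model sizes, respectively.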

In addition to data center applications, Mythic targets the automotive, robotics, and defense industries. The company’s Starlight sensing product integrates sub-1 W APUs to enhance imaging applications, reportedly improving signal extraction from noise by a factor of 50 and delivering significant low-light performance gains for mission-critical use cases.
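For readers who think in decibels, a 50x improvement corresponds to roughly 17 dB, assuming the factor refers to a signal-to-noise power ratio (the announcement does not specify the convention):

```python
import math

snr_factor = 50                             # claimed signal-extraction improvement
snr_gain_db = 10 * math.log10(snr_factor)   # power-ratio convention

print(f"{snr_gain_db:.1f} dB")              # prints "17.0 dB"
```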

Mythic has also expanded its software development capabilities. Its CAMP (Compute Analog in-Memory Processing) software development kit (SDK) allows deployment of deep neural network applications on Generation 1 APUs, supporting frameworks such as ONNX, PyTorch, and TensorFlow, as well as NVIDIA’s TensorRT on supported CPU platforms. The SDK was recently rated by a German research study at the highest maturity level among compute-in-memory processors.
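The CAMP SDK’s API is not shown in the announcement, but toolchains that map deep neural networks onto compute-in-memory hardware typically begin by quantizing floating-point weights to 8-bit integers so that each weight fits in an analog memory cell. A minimal, generic sketch of symmetric int8 quantization (all names here are illustrative, not CAMP APIs):

```python
def quantize_int8(weights):
    """Symmetric per-tensor int8 quantization: w is approximated by q * scale,
    with q constrained to the integer range [-127, 127]."""
    scale = max(abs(w) for w in weights) / 127.0
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from int8 codes."""
    return [qi * scale for qi in q]

weights = [0.5, -1.27, 0.031, 1.0]
q, scale = quantize_int8(weights)     # q = [50, -127, 3, 100], scale = 0.01
approx = dequantize(q, scale)         # close to the originals, within one step
```

The quantization error introduced here is one reason maturity ratings for compute-in-memory toolchains, like the one cited above, matter: accuracy depends on how well the SDK calibrates these mappings.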

“Mythic will win based on this fundamental insight: energy efficiency will define the future of AI computing everywhere,” said Taner Ozcelik, CEO of Mythic. “Much as GPUs have become the accelerated computer of choice next to CPUs due to their performance per watt benefits, our insanely energy-efficient APUs will become the accelerated computer of choice next to GPUs.”

“Mythic is taking a radically novel approach to building low-power, low-cost neural nets that can make any product more intelligent,” said Steve Jurvetson, Founder and Managing Director of Future Ventures. “Mythic can do an 8-bit multiply and add (the core element of AI compute) in a single transistor! No digital design can achieve that efficiency. Computation and memory are unified, as in the brain. We are excited to see Mythic continue to outperform all digital chips on calculations/$ and calculations/watt.”
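The mechanism Jurvetson alludes to (multiply-accumulate via current summation in memory cells) can be sketched abstractly: weights act as conductances, inputs as voltages, and Ohm’s and Kirchhoff’s laws do the arithmetic. A toy model of one analog column, for intuition only, not Mythic’s actual circuit:

```python
def analog_mac(voltages, conductances):
    """Model one column of an analog MAC array: each cell passes current
    i = v * g (Ohm's law), and the shared column wire sums the currents
    (Kirchhoff's current law), yielding a dot product 'for free'."""
    return sum(v * g for v, g in zip(voltages, conductances))

# Inputs encoded as voltages, weights stored as cell conductances.
v = [0.1, 0.2, 0.3]
g = [1.0, 2.0, 3.0]
i_out = analog_mac(v, g)   # 0.1*1 + 0.2*2 + 0.3*3 = 1.4 (within float rounding)
```

Because the summation happens on the wire itself, no separate adder logic or memory fetch is needed, which is the source of the claimed efficiency.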

Source: Mythic
