Lenovo has announced a suite of enterprise servers, solutions, and services designed for AI inferencing workloads, expanding its Lenovo Hybrid AI Advantage portfolio. The announcement, made at Tech World at CES 2026 at Sphere in Las Vegas, targets real-time inferencing use cases across cloud, data center, and edge environments, according to Lenovo.
Lenovo positioned the release around running fully trained models on new, unseen data for real-time decisions, and cited market growth expectations for inference infrastructure. Futurum estimates that the global AI inference infrastructure market will grow from $5.0 billion in 2024 to $48.8 billion by 2030, a six-year compound annual growth rate of 46.3%.
The company’s new inferencing-optimized server lineup includes three systems with different deployment footprints. Lenovo ThinkSystem SR675i is positioned for running “full LLMs anywhere” with “massive scalability,” and is also described as suited to accelerated simulation in manufacturing as well as critical healthcare and financial services environments. Lenovo ThinkSystem SR650i is described as a high-density GPU compute platform designed to scale and deploy in existing data centers. For edge sites, Lenovo ThinkEdge SE455i is described as a compact server built for retail, telecom, and industrial environments, with ultra-low latency, rugged reliability, and an operating range of -5°C to 55°C.
Lenovo also tied the servers to supporting infrastructure and consumption options, citing Lenovo Neptune air and liquid cooling and the Lenovo TruScale pay-as-you-go pricing model. On the solution side, Lenovo described its Lenovo Hybrid AI Factory as a “validated modular framework” for building and operating AI solutions at scale, with platforms that combine inferencing servers with storage, networking, software, and orchestration. Announced configurations include Lenovo Hybrid AI Inferencing with Lenovo ThinkAgile HX and Nutanix AI for centralized shared inference in a virtualized environment; Lenovo Hybrid AI Inferencing with Red Hat AI for an “enterprise-grade” platform; and Lenovo Hybrid AI Inferencing with Canonical Ubuntu Pro, described as a streamlined option that can use the ThinkSystem SR650i.
For operations teams, Lenovo said it is also providing Lenovo Hybrid AI Factory Services for inferencing, including advisory, deployment, and managed services aimed at standing up and optimizing inferencing environments. It also cited Lenovo Premier Support for ongoing data center management assistance and TruScale Infrastructure-as-a-Service financing to scale as inferencing and AI operations evolve.
Source: Lenovo