SambaNova has introduced its SN50 AI chip, announced a planned collaboration with Intel focused on AI inference infrastructure and cloud expansion, and reported more than $350 million in strategic Series E financing. SambaNova positions SN50 as a data center-oriented inference accelerator for “agentic AI,” and says the combination of new hardware, manufacturing expansion, and cloud capacity is intended to move autonomous AI agents into production with lower latency and lower total cost of ownership.
According to SambaNova, SN50 delivers five times more compute per accelerator and four times more network bandwidth than the previous generation. The platform links up to 256 accelerators over a multi-terabyte-per-second interconnect, which the company says reduces time-to-first-token and supports larger batch sizes, targeting higher-throughput, lower-latency inference for larger, longer-context models. SN50 is built on SambaNova's Reconfigurable Dataflow Unit (RDU) architecture and includes a three-tier memory architecture intended to support "10T+ parameter models and 10M+ context lengths," along with "resident multi-model memory and agentic caching" to improve utilization and reduce cost-per-token. The company says SN50 "uses existing power and is air cooled" and is expected to ship to customers later in 2026.
The press release calls out data center deployment directly: SoftBank Corp. is named as the first customer expected to deploy SN50 “within its next-generation AI data centers in Japan,” targeting low-latency inference services for sovereign and enterprise customers across Asia-Pacific. The release also says SoftBank already hosts SambaCloud for regional developers, and that anchoring new clusters on SN50 positions SambaNova as “the inference backbone” for SoftBank’s sovereign AI initiatives and future large-scale agentic services.
SambaNova and Intel have announced a planned multi-year strategic collaboration aimed at "high-performance, cost-efficient AI inference solutions" for AI-native companies, model providers, enterprises, and government organizations. According to SambaNova, the collaboration includes scaling its vertically integrated AI cloud on Intel Xeon-based infrastructure optimized for large language and multimodal models, and combining SambaNova systems with Intel CPUs, accelerators, and networking technologies. Intel also plans to make a strategic investment in SambaNova to accelerate the rollout of an Intel-powered AI cloud, according to the release.
On funding, SambaNova says the “oversubscribed” Series E round was led by Vista Equity Partners and Cambium Capital, with participation from Intel Capital, and that proceeds will expand SN50 production, scale SambaCloud, and deepen enterprise software integrations. The release also cites demand across financial services, telecommunications, energy, and sovereign deployments.
Source: SambaNova