Nvidia Unveils Vera Rubin AI Platform, New Open-Source AI Models at CES 2026
Nvidia opened the Consumer Electronics Show (CES) 2026 on Monday with a slate of artificial intelligence (AI) announcements. Chief among them was Vera Rubin, the Santa Clara-based tech giant's new AI platform and the successor to its Blackwell architecture. The company also announced six new chipsets and a supercomputer built on the new architecture, expanded its catalogue of open-source AI models, and shared its progress in the physical AI space. All of these announcements were made during CEO Jensen Huang's keynote session.
Nvidia Introduces Vera Rubin AI Platform
Huang introduced the Vera Rubin platform during his keynote address. Like its predecessor, Blackwell, the new architecture will be the standard for the company's next chipsets targeting AI workflows, enterprise systems, and supercomputers. Notably, the platform is named after American astronomer Vera Florence Cooper Rubin, known for her studies of galaxy rotation curves that provided evidence for dark matter.
"Rubin arrives at exactly the right time, with AI computing demand for both training and inference going through the roof," Huang said. "We are on an annual cadence of delivering a new generation of AI supercomputers, with extreme co-design across six new chips. Rubin is a giant leap toward the next frontier of artificial intelligence."
At Vera Rubin's core is the concept of extreme co-design: Nvidia engineered the platform's components from scratch to share data quickly, reduce costs, and improve efficiency for training and running AI models. The company also launched six key chipset families, which will be integrated into rack-scale systems known as Vera Rubin NVL servers. These include the Nvidia Vera CPU, Nvidia Rubin GPU, Nvidia NVLink 6 Switch, Nvidia ConnectX-9 SuperNIC, Nvidia BlueField-4 data processing unit (DPU), and the Nvidia Spectrum-6 Ethernet Switch.
According to the company's press release, the new architecture will speed up agentic AI, advanced reasoning, and large-scale mixture-of-experts (MoE) model inference. Compared with Blackwell, it is said to lower costs, requiring up to 4x fewer GPUs for the same tasks.
Nvidia also named several companies that will be adopting Vera Rubin-based chipsets in the coming months, including Amazon Web Services (AWS), Anthropic, Dell Technologies, Google, HPE, Lenovo, Meta, Microsoft, OpenAI, Oracle, Perplexity, and Thinking Machines Lab.
Nvidia’s Open Models, Data and Tools
Alongside its system architecture, Nvidia detailed an array of open models, datasets, and tools designed to accelerate AI across industries. Among these releases is Nvidia Alpamayo, a family of open, large-scale reasoning models and simulation frameworks built to support safe, reasoning-based autonomous vehicle development. It comprises reasoning-capable vision-language-action (VLA) models, simulation tools such as AlpaSim, and Physical AI Open Datasets that cover rare and complex driving scenarios.
Alpamayo is part of what Huang called a "ChatGPT moment for physical AI," where machines begin to understand, reason, and act in the real world, including explaining their decisions. The models, simulation frameworks, and datasets are designed to be open so that developers and researchers working on Level 4 advanced driver assistance systems (ADAS) can progress more quickly.
In addition to this, Nvidia highlighted other releases available to the open community, including its Nemotron family for agentic AI, the Cosmos platform for physical AI, Isaac GR00T for robotics, and Clara for biomedical AI.
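For developers wondering what working with these open releases looks like in practice, the sketch below shows one plausible route: loading an openly published Nemotron-family checkpoint with the Hugging Face transformers library. The model ID, hosting location, and chat-template usage are illustrative assumptions, not details confirmed in Nvidia's CES announcement; the actual names and licences should be checked against Nvidia's model catalogue.

```python
# Minimal sketch (not Nvidia's documented workflow): loading an open
# Nemotron-style checkpoint with Hugging Face transformers.
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumption: an example Nemotron checkpoint ID; substitute the actual release.
model_id = "nvidia/Llama-3.1-Nemotron-Nano-8B-v1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Simple chat-style prompt to exercise the model.
messages = [{"role": "user", "content": "Summarise Nvidia's Vera Rubin platform in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Because the checkpoints are open, the same pattern can be adapted for local fine-tuning or evaluation pipelines, subject to each model's licence terms.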