Neo Android: 1X, OpenAI & Nvidia collaboration


OpenAI is planning to utilise 10 million Nvidia GPUs to create an advanced AI model, which will most probably run on the Neo android robot. OpenAI and 1X will no doubt incorporate the Nvidia Modulus platform, which uses machine learning to simulate physical systems, in the development of the Neo android robot.
NVIDIA and OpenAI to create advanced AI model that connects 10 million GPUs together
NVIDIA and OpenAI expand their partnership with a new advanced AI model more powerful than ChatGPT that connects 10 million NVIDIA AI GPUs.

NVIDIA and OpenAI are reportedly aiming to create a technological marvel by combining the power of more than a million NVIDIA AI GPUs and software developed by OpenAI that will link them all together.
For those that don't know, AI systems such as OpenAI's ChatGPT are powered by thousands of NVIDIA AI GPUs. According to reports, NVIDIA has supplied approximately 20,000 of its AI GPUs to OpenAI, but this won't be enough if the company wishes to keep up with its ever-expanding language models that require more and more power.

According to Wang Xiaochuan, a businessman and founder of the Chinese search engine Sogou, OpenAI is already developing a more advanced AI computing model that will have a total capacity of 10 million AI GPUs. While 10 million NVIDIA AI GPUs sounds astronomical (and it is), it's another thing entirely to actually reach that number in physical hardware.

According to reports, NVIDIA is only able to produce one million of its AI GPUs every year, which means that at its current rate, it would take 10 years to reach the total capacity of OpenAI's new advanced AI model.

However, if NVIDIA is able to produce even a fraction of what OpenAI needs, the sheer computing power will be hard to fathom, especially considering NVIDIA's technology that will be able to connect each of the GPUs together, forming an eye-watering amount of computing power.


NVIDIA Modulus

NVIDIA Modulus is an open-source framework for building, training, and fine-tuning Physics-ML models with a simple Python interface.

Using Modulus, engineers can build high-fidelity AI surrogate models that blend the causality of physics described by governing partial differential equations (PDEs) with simulation data from CAE solvers or observed data. Such AI models can predict with near-real-time latency across a parameterized design space.
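The Modulus API itself isn't shown here, but the core idea it describes, training a model against both observed data and a governing PDE residual, can be sketched in plain NumPy. The 1D equation, grid, sample points, and equal loss weighting below are illustrative assumptions for the sketch, not Modulus code:

```python
import numpy as np

# Grid over [0, pi]; an illustrative governing equation u''(x) = -sin(x),
# whose exact solution (with u(0) = u(pi) = 0) is u(x) = sin(x).
x = np.linspace(0.0, np.pi, 101)
h = x[1] - x[0]
source = -np.sin(x)  # right-hand side of the PDE

# A few "observed" data points sampled from the exact solution,
# standing in for CAE solver output or measurements.
data_idx = np.array([10, 30, 50, 70, 90])
data_y = np.sin(x[data_idx])

def physics_informed_loss(u):
    """Blend a data-fit term with a PDE-residual term, the way a
    Physics-ML surrogate's training objective does conceptually."""
    # Data term: mismatch against the observed samples.
    data_loss = np.mean((u[data_idx] - data_y) ** 2)
    # Physics term: central finite difference approximates u'' at
    # interior points; the residual u'' - source should vanish.
    u_xx = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / h**2
    pde_loss = np.mean((u_xx - source[1:-1]) ** 2)
    return data_loss + pde_loss  # equal weighting, an assumption here

exact = np.sin(x)          # satisfies both the data and the physics
naive = np.zeros_like(x)   # satisfies neither
print(physics_informed_loss(exact))  # near zero
print(physics_informed_loss(naive))  # much larger
```

In a real framework the candidate solution `u` would be a neural network evaluated on the grid and the loss would be minimized by gradient descent; the point of the sketch is only how the physics and data terms combine into one objective.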