At its annual customer conference, Nvidia is showing off new ways to model and predict the behavior of clouds and more, in the cloud and elsewhere.
It’s been years since developers discovered that Nvidia’s main product, the GPU, is useful not just for rendering video games but also for high-performance computing of the kind used in 3D modeling, weather forecasting, and the training of AI models. It’s on enterprise applications such as these that CEO Jensen Huang will focus his attention at the company’s GTC 2022 conference this week.
With a range of new hardware and software, Nvidia is hoping to make it easier for CIOs to build digital twins and machine learning models, to secure enterprise computing, and even to speed the adoption of quantum computing.
Digital twins, the numerical models that reflect changes in real-world objects and are useful in design, manufacturing, and service creation, vary in their level of detail. For some applications, a simple database may suffice to record a product’s service history: when it was made, who it shipped to, what modifications have been applied. Others require a full-on 3D model incorporating real-time sensor data that can be used, for example, to provide advance warning of component failure, or of rain. It’s at the high end of that range that Nvidia plays.
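To make the low end of that spectrum concrete, here is a minimal sketch in Python of a service-history-style twin that also ingests live sensor readings. Everything in it, the ProductTwin class, the vibration readings, the 8.0 mm/s warning threshold, is a hypothetical illustration, not part of Nvidia’s stack or any real product’s telemetry.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class ProductTwin:
    # Service-history fields: the "simple database" end of the spectrum.
    serial_number: str
    manufactured: datetime
    shipped_to: str
    modifications: list[str] = field(default_factory=list)
    # Real-time sensor stream, e.g. vibration in mm/s (illustrative only).
    readings: list[float] = field(default_factory=list)

    def ingest(self, vibration_mm_s: float) -> None:
        """Record a sensor reading and flag readings above a hypothetical threshold."""
        self.readings.append(vibration_mm_s)
        if vibration_mm_s > 8.0:  # assumed failure-warning threshold, not a real spec
            print(f"{self.serial_number}: vibration {vibration_mm_s} mm/s, inspect unit")

twin = ProductTwin("SN-1042", datetime(2021, 6, 1), "Acme Corp")
twin.modifications.append("firmware 2.3")
twin.ingest(3.2)   # normal reading
twin.ingest(9.7)   # triggers an advance warning
```

A full-fidelity twin of the kind Nvidia targets replaces the flat list of readings with a physically accurate 3D simulation fed by those same sensor streams.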