At its annual customer conference, Nvidia is showing off new ways to model and predict the behavior of clouds and more, in the cloud and elsewhere.
It’s been years since developers found that Nvidia’s main product, the GPU, is useful not just for rendering video games but also for high-performance computing of the kind used in 3D modeling, weather forecasting, and the training of AI models. It is on enterprise applications such as these that CEO Jensen Huang will focus his attention at the company’s GTC 2022 conference this week.
Nvidia is hoping to make it easier for CIOs building digital twins and machine learning models to secure enterprise computing, and even to speed the adoption of quantum computing with a range of new hardware and software.
Digital twins, numerical models that mirror changes in real-world objects for use in design, manufacturing, and service creation, vary in their level of detail. For some applications, a simple database may suffice to record a product’s service history: when it was made, who it shipped to, what modifications have been applied. Other applications require a full 3D model incorporating real-time sensor data that can be used, for example, to provide advance warning of component failure or of rain. It is at the high end of that range that Nvidia plays.
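At the simple end of that spectrum, a digital twin is little more than a structured record that tracks a product over its lifetime. The sketch below illustrates the idea; all names are hypothetical, chosen for this example, and not part of any Nvidia product or API.

```python
from dataclasses import dataclass, field
from datetime import date

# Hypothetical sketch of the "simple database" end of the digital-twin
# spectrum: a plain service-history record for one physical product.
@dataclass
class ServiceHistoryTwin:
    serial_number: str
    manufactured: date          # when it was made
    shipped_to: str             # who it shipped to
    modifications: list = field(default_factory=list)

    def apply_modification(self, description: str, on: date) -> None:
        # Record each change applied to the physical product,
        # keeping the twin in sync with the real-world object.
        self.modifications.append((on, description))

twin = ServiceHistoryTwin("SN-0001", date(2021, 6, 1), "Acme Corp")
twin.apply_modification("firmware update v2.3", date(2022, 3, 1))
print(len(twin.modifications))  # 1
```

The high end of the range replaces such static fields with a live 3D model fed by streaming sensor data, which is where GPU-accelerated simulation becomes relevant.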