The impact artificial intelligence (AI) is having on enterprise data processes and workloads is well-documented, as is its capability to monitor and manage complex systems. But what is not yet widely recognized is how AI will change data infrastructure, not only in design and architecture, but also in how it's consumed by new generations of smart applications and services.
While infrastructure may seem immutable, the fact is that even the physical plant is highly dynamic, right down to the processing capabilities in servers and networking devices and the media used for storage. Virtualization has only added to this dynamism, to the point where infrastructure can be quickly tailored to meet the needs of any workload.
The latest twist on virtualization is containers, and as The Enterpriser's Project's Kevin Casey showed in a recent report, running AI at scale requires some retooling at the container level. For one thing, AI workloads require a lot of data gathering and processing up front, before you even get to training. Once the model hits production, it must be supported with performance monitoring, metrics collection and a host of other services. Containers can certainly streamline these processes, but they must be optimized for consistency and repeatability to provide the maximum benefit to the workload.
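The consistency and repeatability Casey describes is typically enforced at the image level. As a minimal sketch (the base image, file names, dependency versions and port here are illustrative assumptions, not details from the report), a container image for a model in production might pin every dependency and bake the serving code into the image itself:

```dockerfile
# Illustrative sketch only: image names, versions and paths are assumptions.
FROM python:3.11-slim

# Pin dependency versions so every build of the image is identical --
# this is what makes the container consistent and repeatable.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Bake the serving code into the image rather than mounting it at runtime,
# so the container behaves the same in every environment.
COPY serve_model.py .

# Expose a metrics endpoint so production monitoring can scrape it.
EXPOSE 9090

CMD ["python", "serve_model.py"]
```

The design choice here is that nothing about the container's behavior depends on the host it runs on: dependencies are resolved at build time from a pinned `requirements.txt`, not at deploy time, so the image that passed testing is byte-for-byte the image that serves the model.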