How AI Is Changing Data Infrastructure

The impact artificial intelligence (AI) is having on enterprise data processes and workloads is well documented, as is its capability to monitor and manage complex systems. What is not yet widely recognized, however, is how AI will change data infrastructure itself, not just in design and architecture but also in how it is consumed by new generations of smart applications and services.

While infrastructure may seem immutable, even a physical plant is in fact highly dynamic, right down to the processing capabilities of servers and networking devices and the media used for storage. Virtualization has only added to this dynamism, to the point where infrastructure can be quickly tailored to meet the needs of almost any workload.

The latest twist on virtualization is containers, and as The Enterpriser's Project's Kevin Casey showed in a recent report, running AI at scale requires some retooling at the container level. For one thing, AI workloads require a great deal of data gathering and processing up front, before training even begins. Once a model hits production, it must be supported with performance monitoring, metrics collection and a host of other services. Containers can certainly streamline these processes, but they must be optimized for consistency and repeatability to deliver the maximum benefit to the workload.
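The lifecycle described above can be pictured as a set of small, repeatable stages, each a candidate for its own container. The following is a hypothetical Python sketch, not drawn from the report; the function names and the toy "model" (a simple mean with a drift check) are illustrative assumptions only.

```python
# Hypothetical sketch of a containerized AI workload lifecycle:
# data gathering -> training -> production monitoring.
# Each stage is a small, deterministic function, so it is consistent
# and repeatable -- the properties the article says containers need.

def gather_data(raw):
    """Up-front data gathering/cleaning: keep only numeric records."""
    return [x for x in raw if isinstance(x, (int, float))]

def train(data):
    """Stand-in 'training' step: fit a trivial model (the mean)."""
    return sum(data) / len(data)

def monitor(model, new_data, threshold=1.0):
    """Production monitoring: flag drift when live data strays from the model."""
    drift = abs(sum(new_data) / len(new_data) - model)
    return {"model": model, "drift": drift, "alert": drift > threshold}

if __name__ == "__main__":
    raw = [3, "bad", 5, None, 4]
    cleaned = gather_data(raw)       # drops the non-numeric records
    model = train(cleaned)
    report = monitor(model, [6, 7])  # live data has shifted upward
    print(report)
```

Because each stage takes explicit inputs and returns explicit outputs, rerunning any one of them in a fresh container yields the same result, which is the repeatability the workload depends on.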
