AI Tax: The Hidden Cost of AI Data Center Applications
Abstract
Artificial intelligence and machine learning are experiencing widespread adoption in industry and academia. This has been driven by rapid advances in the applications and accuracy of AI through increasingly complex algorithms and models, which in turn have spurred research into specialized hardware AI accelerators. Given the rapid pace of advances, it is easy to forget that these algorithms and accelerators are often developed and evaluated in a vacuum, without considering the full application environment. This paper emphasizes the need for a holistic, end-to-end analysis of AI workloads and reveals the "AI tax." We deploy and characterize Face Recognition in an edge data center: an AI-centric edge video analytics application built using popular open-source infrastructure and ML tools. Despite using state-of-the-art AI and ML algorithms, the application relies heavily on pre- and post-processing code. As AI-centric applications benefit from the acceleration promised by accelerators, we find that they impose new stresses on the hardware and software infrastructure: storage and network bandwidth become major bottlenecks with increasing AI acceleration. By specializing for AI applications, we show that a purpose-built edge data center can be designed to handle the stresses of accelerated AI at 15% lower TCO than one derived from homogeneous servers and infrastructure.
- Publication: arXiv e-prints
- Pub Date: July 2020
- arXiv: arXiv:2007.10571
- Bibcode: 2020arXiv200710571R
- Keywords: Computer Science - Distributed, Parallel, and Cluster Computing; Computer Science - Performance; I.2; C.4
- E-Print: 32 pages, 16 figures. Submitted to ACM Transactions on Computer Systems.