Ampere Optimized Frameworks
Ampere Model Library (AML)
Ampere AI delivers world-class AI inference solutions. Ampere Optimized AI delivers a significant inference performance benefit, out of the box, to any existing model that runs on our supported frameworks. Ampere AI currently supports the following frameworks, available for free download here or from some of our supporting partners:
Ampere hardware supports native FP16 data format providing nearly 2X speedup over FP32 with almost no accuracy loss for most AI models.
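The effect of the FP16 format can be illustrated with a short, framework-agnostic sketch in NumPy. This is a generic demonstration of half-precision storage and rounding, not Ampere-specific code: casting weights from FP32 to FP16 halves the bytes per element while typically perturbing values only slightly.

```python
import numpy as np

# Simulated FP32 model weights (illustrative stand-in, not an Ampere API).
rng = np.random.default_rng(0)
weights_fp32 = rng.standard_normal(1024).astype(np.float32)

# Casting to FP16 halves the storage per element (4 bytes -> 2 bytes).
weights_fp16 = weights_fp32.astype(np.float16)
print(weights_fp32.itemsize, weights_fp16.itemsize)

# For typical weight magnitudes the rounding error is small, which is
# why accuracy loss is usually negligible for most models.
max_err = np.max(np.abs(weights_fp32 - weights_fp16.astype(np.float32)))
print(f"max rounding error: {max_err:.6f}")
```

Halving the bytes per weight also doubles the effective memory bandwidth for weight traffic, which is one reason the speedup approaches 2X.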
Ampere provides easy-to-use Docker containers that include Computer Vision and Natural Language Processing model examples and benchmarks that enable developers to get started quickly. Download one of our Docker containers today to experience the benefits of our best-in-class performance. Read more about our solutions below.
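Getting started follows the standard Docker workflow. The image name below is an assumption for illustration only; check Ampere's download page for the actual registry path and tag:

```shell
# Hypothetical image path -- substitute the name from Ampere's download page.
docker pull amperecomputingai/pytorch:latest

# Start an interactive shell in the container to run the bundled
# model examples and benchmarks.
docker run --rm -it amperecomputingai/pytorch:latest /bin/bash
```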
System configurations, components, software versions, and testing environments that differ from those used in Ampere's tests may result in measurements different from those obtained by Ampere. The system configurations and components used in our testing are detailed here.
Ampere Altra and Ampere Altra Max, with high-performance Ampere optimized frameworks, offer best-in-class artificial intelligence inference performance for frameworks including TensorFlow, PyTorch, and ONNX Runtime. The Ampere Model Library (AML) offers pretrained models to help accelerate AI development.
Paired with the Ampere optimized inference engine, these frameworks offer significant performance benefits. Click here to view the demo.
Ampere helps customers achieve superior performance for AI workloads by integrating optimized inference layers into common AI frameworks.
This seamless integration with each supported AI framework accelerates inference without any accuracy loss, model conversion, or retraining. The architecture is diagrammed in the figure above. The main components are as follows:
All model types are supported
AML is a collection of optimized AI models pretrained on standard datasets. The library contains scripts for running the most common AI tasks. The models are available for Ampere customers to quickly and seamlessly build into their applications.
AML Benefits Include:
Benchmarking AI architectures with different frameworks
Accuracy testing of AI models on application-specific data
Comparison of AI architectures
Conducting tests on AI architectures
Regression testing of the currently available Ampere Optimized AI images
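The kind of latency benchmarking AML enables can be sketched in plain Python. The timing harness below is a generic illustration (warm-up runs followed by timed iterations), not AML's actual interface, and `dummy_inference` is a hypothetical stand-in for a model's forward pass:

```python
import time

def benchmark(fn, warmup=3, iters=20):
    """Time a callable, returning mean latency in milliseconds."""
    for _ in range(warmup):
        fn()  # warm-up runs so caches and allocators settle before timing
    start = time.perf_counter()
    for _ in range(iters):
        fn()
    elapsed = time.perf_counter() - start
    return elapsed / iters * 1e3

# Hypothetical stand-in workload for a model's forward pass.
def dummy_inference():
    sum(i * i for i in range(10_000))

latency_ms = benchmark(dummy_inference)
print(f"mean latency: {latency_ms:.3f} ms")
```

Running the same harness against the same model under two frameworks (or two precisions, such as FP32 vs. FP16) gives a like-for-like comparison of the kind listed above.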
Ampere Altra Systems
Ampere Altra and Ampere Altra Max systems are flexible enough to meet the needs of any cloud deployment and come packed with Ampere's 80-core Altra or 128-core Altra Max processors.
Microsoft offers a comprehensive line of Azure Virtual Machines that can run a diverse and broad set of Linux workloads such as web servers, open-source databases, in-memory applications, big data analytics, gaming, media, and more.
Equinix Metal, an on-demand digital infrastructure platform, has created Gen3 configurations with Ampere Altra for common workloads, available in minutes on bare metal.
Whether your business is early in its journey or well on its way to digital transformation, Google Cloud can help you solve your toughest challenges.
OCI Ampere A1
Ampere Altra and Oracle Cloud combine predictable performance, near-linear scaling, and secure architecture with the best price-performance in the market in the following shapes: