Amazon Web Services (AWS) has announced that in the coming weeks its new Amazon Elastic Compute Cloud (EC2) G4 instances will feature NVIDIA T4 Tensor Core GPUs. The T4 GPUs will boost the G4 instances, giving AWS customers a versatile platform on which a wide range of AI services can be deployed cost-efficiently.
T4 GPUs are well suited to companies looking for powerful, cost-efficient cloud solutions for deploying machine learning into production. With this announcement, AWS customers will be able to pair G4 instances with NVIDIA GPU acceleration software, including the NVIDIA CUDA-X AI libraries for accelerating deep learning, machine learning, and data analytics. In addition, T4 will be supported by the Amazon Elastic Container Service for Kubernetes, making it easier for customers to deploy, manage, and scale containerized applications on EC2 G4 GPU instances using Kubernetes.
Matt Garman, Vice President of Compute Services at AWS, said: “NVIDIA and AWS have worked together for a long time to help customers run compute-intensive AI workloads in the cloud and create incredible new AI solutions. With our new T4-based G4 instances, we’re making it even easier and more cost-effective for customers to accelerate their machine learning inference and graphics-intensive applications.”
© 2020 CIO Bulletin. All rights reserved.