    Hugging Face Neuron Deep Learning AMI (Ubuntu 22.04)

    Overview

    Hugging Face Neuron Deep Learning AMI (DLAMI) makes it easy to use Amazon EC2 Inferentia & Trainium instances for efficient training and inference of Hugging Face Transformers and Diffusers models. With the Hugging Face Neuron DLAMI, scale your Transformers and Diffusion workloads quickly on Amazon EC2 while reducing your costs, with up to 50% cost-to-train savings over comparable GPU-based DLAMIs.

    This DLAMI is the officially supported and recommended solution from Hugging Face for running training and inference on Trainium and Inferentia EC2 instances. It supports most Hugging Face use cases, including:

    • Fine-tuning and pre-training Transformers models like BERT, GPT, or T5
    • Running inference with Transformers models like BERT, GPT, or T5
    • Fine-tuning and deploying Diffusers models like Stable Diffusion

    This DLAMI is provided at no additional charge to Amazon EC2 users.

    For more information and documentation, visit the Hugging Face Neuron Developer Guide: https://awsdocs-neuron.readthedocs-hosted.com/en/latest/frameworks/torch/torch-neuronx/tutorials/training/bert.html 

    AMI Name format: Hugging Face Neuron Deep Learning AMI (Ubuntu 22.04) ${YYYY-MM-DD}
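    Given this name format, the latest AMI in a Region can be resolved programmatically. A minimal sketch using the AWS CLI (the name filter value is an assumption derived from the format above; adjust the Region as needed):

```shell
# Sketch: look up the most recent Hugging Face Neuron DLAMI in us-east-1.
# The name filter follows the AMI name format above (an assumption).
aws ec2 describe-images \
  --region us-east-1 \
  --owners aws-marketplace \
  --filters 'Name=name,Values=Hugging Face Neuron Deep Learning AMI (Ubuntu 22.04)*' \
  --query 'sort_by(Images, &CreationDate)[-1].ImageId' \
  --output text
```

    The returned AMI ID can then be passed to `aws ec2 run-instances` when launching a Trn1 or Inf2 instance.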

    The AMI includes the following:

    • Supported AWS Service: EC2
    • Operating System: Ubuntu 22.04
    • Compute Architecture: x86
    • EBS volume type: gp2
    • Python version: 3.8
    • Supported EC2 Instances: Trn1/Inf2
    • PyTorch: 1.13 (Neuron SDK)
    • Hugging Face Libraries: transformers, datasets, accelerate, evaluate, diffusers

    Highlights

    • Ready-to-use environment for running Hugging Face Transformers, Datasets, Accelerate, and Diffusers on AWS Trainium & Inferentia
    • Save costs on your training and inference workloads for Hugging Face Transformers and Diffusers models
    • Get the most out of Trainium and Inferentia on Amazon EC2, with accelerated deep learning frameworks and models maintained by Hugging Face

    Details

    Delivery method

    64-bit (x86) Amazon Machine Image (AMI)

    Operating system

    Ubuntu 22.04

    Typical total price

    This estimate is based on use of the seller's recommended configuration (trn1.2xlarge) in the US East (N. Virginia) Region.

    $1.344/hour

    Features and programs

    Financing for AWS Marketplace purchases

    AWS Marketplace now accepts line of credit payments through the PNC Vendor Finance program. This program is available to select AWS customers in the US, excluding NV, NC, ND, TN, & VT.

    Pricing

    Hugging Face Neuron Deep Learning AMI (Ubuntu 22.04)

    Pricing is based on actual usage, with charges varying according to how much you consume. Subscriptions have no end date and may be canceled any time.
    Additional AWS infrastructure costs may apply. Use the AWS Pricing Calculator to estimate your infrastructure costs.

    Usage costs (7)

    Instance type                Product cost/hour   EC2 cost/hour   Total/hour
    trn1.2xlarge (recommended)   $0.00               $1.344          $1.344
    trn1.32xlarge                $0.00               $21.50          $21.50
    trn1n.32xlarge               $0.00               $24.78          $24.78
    inf2.xlarge                  $0.00               $0.758          $0.758
    inf2.8xlarge                 $0.00               $1.968          $1.968
    inf2.24xlarge                $0.00               $6.491          $6.491
    inf2.48xlarge                $0.00               $12.981         $12.981

    Additional AWS infrastructure costs

    Type                                       Cost
    EBS General Purpose SSD (gp2) volumes      $0.10 per GB/month of provisioned storage
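    Using the rates above, a rough monthly bill is hours × hourly rate plus provisioned gp2 storage. A sketch for trn1.2xlarge running 24/7 (730 hours approximates one month; the 512 GB volume size is a hypothetical example):

```shell
# Monthly cost estimate: EC2 hours plus gp2 storage.
# Rates come from the pricing tables above; the volume size is an example.
HOURLY=1.344   # trn1.2xlarge, $/hour
HOURS=730      # ~1 month
GP2_GB=512     # provisioned gp2 storage, GB
GP2_RATE=0.10  # $/GB-month
awk -v h="$HOURLY" -v n="$HOURS" -v g="$GP2_GB" -v r="$GP2_RATE" \
  'BEGIN { printf "EC2: $%.2f  EBS: $%.2f  Total: $%.2f\n", h*n, g*r, h*n + g*r }'
# prints: EC2: $981.12  EBS: $51.20  Total: $1032.32
```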

    Vendor refund policy

    No refunds.

    Legal

    Vendor terms and conditions

    Upon subscribing to this product, you must acknowledge and agree to the terms and conditions outlined in the vendor's End User License Agreement (EULA).

    Content disclaimer

    Vendors are responsible for their product descriptions and other product content. AWS does not warrant that vendors' product descriptions or other product content are accurate, complete, reliable, current, or error-free.

    Usage information


    Delivery details

    64-bit (x86) Amazon Machine Image (AMI)

    Amazon Machine Image (AMI)

    An AMI is a virtual image that provides the information required to launch an instance. Amazon EC2 (Elastic Compute Cloud) instances are virtual servers on which you can run your applications and workloads, offering varying combinations of CPU, memory, storage, and networking resources. You can launch as many instances from as many different AMIs as you need.

    Version release notes

    Platform:

    • Platform: Linux-5.15.0-1066-aws-x86_64-with-glibc2.29
    • Python version: 3.10.12

    Python packages:

    • optimum-neuron: 0.0.26
    • neuron-sdk: 2.20.0
    • optimum: 1.22.0
    • transformers: 4.43.2
    • huggingface_hub: 0.26.2
    • torch: 2.1.2+cu121
    • aws-neuronx-runtime-discovery: 2.9
    • libneuronxla: 2.0.4115.0
    • neuronx-cc: 2.15.128.0+56dc5a86
    • neuronx-distributed: 0.9.0
    • neuronx-distributed-training: 1.0.0
    • optimum-neuron: 0.0.25
    • tensorboard-plugin-neuronx: 2.6.63.0
    • torch-neuronx: 2.1.2.2.3.0
    • transformers-neuronx: 0.12.313

    Neuron Driver:

    • aws-neuronx-collectives: 2.22.26.0-17a033bc8
    • aws-neuronx-dkms: 2.18.12.0
    • aws-neuronx-runtime-lib: 2.22.14.0-6e27b8d5b
    • aws-neuronx-tools: 2.19.0.0

    Additional details

    Usage instructions

    • Launch an instance on either a trn1 or inf2 instance type.
    • Connect to the instance using the "EC2 Instance Connect" button in the AWS console. The user name is ubuntu.
    • Test that the Neuron devices are accessible by running neuron-ls.
    • Test that the Hugging Face libraries are installed by running: python -c 'import torch_neuronx;import transformers;import datasets;import accelerate;import evaluate;import tensorboard;from optimum.neuron import pipeline'
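    The verification steps above can be run as a short shell session once connected (commands taken from the usage instructions; neuron-ls and the import check only succeed on a Trn1/Inf2 instance launched from this AMI):

```shell
# Run on the instance after connecting (user name: ubuntu).

# 1. Confirm the Neuron devices are visible.
neuron-ls

# 2. Confirm the Hugging Face libraries import cleanly.
python -c 'import torch_neuronx;import transformers;import datasets;import accelerate;import evaluate;import tensorboard;from optimum.neuron import pipeline'
```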

    Support

    Vendor support

    AWS infrastructure support

    AWS Support is a one-on-one, fast-response support channel that is staffed 24x7x365 with experienced and technical support engineers. The service helps customers of all sizes and technical abilities to successfully utilize the products and features provided by Amazon Web Services.


    Customer reviews

    Ratings and reviews

    5 out of 5 stars (1 AWS review)
    Paul-HF

    Used it to compile a model to Neuron

    Reviewed on Feb 20, 2024
    Purchase verified by AWS

    I've used this AMI to compile a tiny Llama model to Neuron and it works as expected with the necessary libraries, optimum-neuron among them.
