AWS Storage Blog
Video transcoding at the edge with AWS Snowcone
A customer doing video analysis in remote locations has the following problem: they must capture high-resolution video in the field and then transfer that data to a durable, highly available data store in the cloud for long-term storage. They also want to keep copies of video files in the remote location so that they can view the videos at any time. However, the customer has limited storage capacity in the remote location, and they are looking for a way to transcode the video files into a lower resolution format in the field.
To solve this problem, I show you how to use AWS Snowcone to store and transcode files while in a remote location that has no internet connection. Following that, I use the Snowcone to ship the high-resolution files back to AWS and into Amazon S3. This allows me to be economical with my storage capacity in the remote location while also storing and having access to the video files I have captured. It also allows me to securely transfer the files back to AWS where they can be stored long term in AWS’ durable, cost-effective object storage service, Amazon S3.
Overview
AWS Snowcone is the smallest member of the AWS Snow Family of edge computing, edge storage, and data transfer devices. AWS Snowcone is a small, ultra-portable, and ruggedized device purpose-built for use outside of a traditional data center.
Following are the high-level steps I will use to demonstrate this proof of concept.
- Prepare an Amazon Machine Image (AMI) to use as a template to run EC2 instances on AWS Snowcone.
- Order AWS Snowcone through the AWS Snow Family console.
- Once the device arrives, take it to a “remote” location, that is, a location where there is intermittent or no internet connectivity. Using a standard home router, create a local area network, and connect the Snowcone to it using RJ45 Ethernet cables. Install AWS OpsHub on a workstation (macOS/Windows) with an NFS client. AWS OpsHub is a graphical user interface used to manage AWS Snowcone (and AWS Snowball) devices.
- Capture high-resolution video, in .mov format, using a DSLR camera or similar, and then transfer the video to the laptop.
- Using AWS OpsHub, start the Snowcone’s built-in NFS server. Next, transfer the video data to Snowcone using NFS.
- Using AWS OpsHub, start an EC2 instance and transcode the .mov file into a lower resolution, smaller .mp4 format.
- After transcoding the video, ship the device back to AWS. Once the Snowcone arrives at AWS, the high-resolution video file data is loaded into an S3 bucket.
I depicted this solution in the following architecture diagram:
The following video shows how I conducted the proof of concept, and the next section provides step-by-step instructions.
Step 1 – Prepare an Amazon Machine Image
I launch an EC2 instance using the Ubuntu 16.04 AMI. Once I have accepted the terms and activated the subscription, I click Continue to Configuration. I leave the settings as default.
On the Configure this software screen, I choose Continue to Launch.
On the next page, I choose the action Launch through EC2, and then click Launch.
See the Amazon EC2 user guide for instructions on how to launch an EC2 instance using the launch instance wizard.
I configure my instance with a security group that allows HTTP traffic over port 80 and SSH traffic on port 22. As I launch my EC2 instance, I create a new key pair and download it. I keep this key pair handy as I will need it to log into the AWS Snowcone when it arrives.
Once my EC2 instance is running, I connect to it from a terminal using SSH, for example:
ssh -I "BenSnowconeKeypairSept8.pem" ubuntu@ec2-3-88-52-241.compute-1.amazonaws.com
Next, I configure the instance with the video transcoding software and a web server.
Once connected to the EC2 instance, I install ffmpeg. You can find more information about how to download ffmpeg here.
Install ffmpeg using the following commands:
wget https://johnvansickle.com/ffmpeg/releases/ffmpeg-release-amd64-static.tar.xz
Unpack everything:
tar xvf ffmpeg-release-amd64-static.tar.xz
Delete the zipped file:
rm ffmpeg-release-amd64-static.tar.xz
Confirm that ffmpeg is working by running the following:
./ffmpeg-4.3.1-amd64-static/ffmpeg -version
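Optionally, I copy the static binaries into /usr/local/bin so that I can run ffmpeg from any directory later on (the directory name ffmpeg-4.3.1-amd64-static matches the release I downloaded and may differ for other releases):
sudo cp ffmpeg-4.3.1-amd64-static/ffmpeg ffmpeg-4.3.1-amd64-static/ffprobe /usr/local/bin/
ffmpeg -version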
Install the Apache web server by executing the following commands:
sudo apt-get update
sudo apt-get install apache2
sudo service apache2 restart
Edit the web server index file and create a media directory:
sudo su
cd /var/www/html
echo "Ben's snowcone web server" > index.html
mkdir media
exit
I confirm the web server works by going to the public IP address of the EC2 instance in a browser:
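Alternatively, I can verify the web server from the SSH session itself, assuming Apache is listening on its default port 80:
curl http://localhost/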
The next step is to create an Amazon Machine Image from this instance. See the Amazon EC2 user guide for instructions on how to create an Amazon EBS-backed Linux AMI.
Step 2 – Order the AWS Snowcone
From the AWS Management Console, I navigate to the AWS Snow Family console and click Create Job.
See the AWS Snow Family documentation for instructions on how to create a Snowcone job. As I create the job, I choose the option Import into Amazon S3 and choose device type AWS Snowcone. I specify the S3 bucket where I want to import the data. If the S3 bucket does not exist, I must create an Amazon S3 bucket. I select the option Enable compute with EC2 and add the AMI that I created previously. When prompted to set security, I click on Create/Select IAM role and click Allow. This gives the AWS Snowcone read/write access to resources in my account. AWS Snowcone uses this role to import my data into Amazon S3.
When I receive the Snowcone, it will be in a locked state. To unlock it I need an unlock code and the manifest file, both of which are obtained through the AWS Snow Family dashboard.
Using the AWS Snow Family dashboard, I select the job I created. I select View job details and open the Credentials section. From here, I obtain the client unlock code and a manifest file. I make a copy of the unlock code and store it in a safe place on my laptop. I download the manifest file to my laptop. I will need both the unlock code and the manifest file to unlock the device in the field to begin using the Snowcone.
Now I have created my Snowcone job, and AWS will send a Snowcone to the address I specified.
Step 3 – Set up the AWS Snowcone
Once I receive the Snowcone, I take it to a remote location and begin configuring it. Following are the steps on how to set up video transcoding on the Snowcone device. I will need:
- A Snowcone configured using the preceding instructions.
- A USB-C battery power supply for the Snowcone.
- A router that supports 1 GbE or 10 GbE Ethernet. This is required to create a local area network that will allow a laptop to connect to the Snowcone. In this demonstration, I used a standard home Wi-Fi router with at least one available RJ45 port.
- A USB battery power supply for the router.
- An RJ45 cable to connect the Snowcone to the router.
- A laptop, to interact with the Snowcone over the LAN, using AWS OpsHub and SSH. The laptop must also have an NFS client installed. In this example, I used a macOS laptop.
- A DSLR camera to take high-resolution video, and a USB cable to transfer the video from the camera to the laptop (from where I can transfer it to the Snowcone).
The instructions for setup are as follows:
- Open the rear panel of the Snowcone and connect the USB-C power source.
- Connect one end of the RJ45 Ethernet cable to either of the two available ports on the Snowcone.
- Connect the other end of the RJ45 Ethernet cable to a LAN port on the router.
- Connect the power source to the router and power it on.
- Switch on the Snowcone using the power button on the front panel.
The Snowcone should obtain an IP address from the router using DHCP. This IP address is the address assigned to the device itself, and is visible on the top panel of the Snowcone.
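Before unlocking the device, I can confirm that the laptop can reach the Snowcone over the LAN. For example, assuming the Snowcone received the address 192.168.1.233 (the example address used later in this post):
ping -c 3 192.168.1.233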
To unlock the Snowcone I open AWS OpsHub for Snow Family on my laptop, and choose the Snowcone option.
On the following screen, I provide the IP address of the device, as shown on the top screen of the device. I click Next and enter the unlock code. I choose the manifest file that I previously saved onto my laptop, and click Unlock Device.
After a few minutes, the device should be unlocked and ready to use.
Step 4 – Start an EC2 instance on the Snowcone
For detailed instructions on how to launch an EC2 instance on Snowcone, see the AWS Snowcone documentation. When launching the instance, I use the AMI that I prepared when I created the Snowcone job.
Step 5 – Capture video data and transfer it to the laptop
In the following steps, I will capture video content using a camera and transfer it to the laptop. I will start an NFS server on the Snowcone and connect to an NFS share on the Snowcone from the laptop. I will then copy the video file onto the NFS share. I will then mount the NFS share on the EC2 instance and transcode the video in that share.
Using a DSLR or an equivalent camera capable of generating .mov video files, I capture the video content, and transfer the captured video to my laptop using the USB cable that came with the camera.
From AWS OpsHub, I start the NFS service by clicking on Enable next to Transfer Data. In the Start NFS panel, I leave the options as default, and click Start NFS.
After a few minutes, the NFS service will show as Active in AWS OpsHub.
In AWS OpsHub, in the Transfer Data panel, I click Open in Finder. This opens the NFS share in the finder window on the laptop. Finally, I copy the video files from the desktop to the NFS share.
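If I prefer the terminal to Finder, I can also mount the NFS share on the laptop manually and copy the file from there. The following is a sketch assuming macOS, the example NFS server address 192.168.1.233 and share name bld-snowcone used later in this post, and a hypothetical video file named MyVideo.mov:
sudo mkdir -p /Volumes/snowcone
sudo mount -t nfs -o resvport,nolocks,vers=3 192.168.1.233:/buckets/bld-snowcone /Volumes/snowcone
cp ~/Desktop/MyVideo.mov /Volumes/snowcone/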
Step 6 – Transcode video
In this section I transcode the video file using an EC2 instance running on Snowcone. Then I use a web server running on the instance to view the transcoded files through a web browser. To do this, I launch an Amazon EC2 instance on Snowcone using the AMI I created when I set up the Snowcone job.
From the AWS OpsHub dashboard, in the Start computing panel, I click Get Started. On the next screen, I click Launch Instance. Then, I choose the AMI that I created when I set up the Snowcone job in the Image (AMI) drop down. I leave the other fields as default and click Launch.
Once the EC2 instance state changes from Pending to Running, I log into the EC2 instance using the key pair that I created when I configured the AMI.
ssh -I "BenUbuntuKeypair.pem" ubuntu@10.0.0.4
I can find the public IP address of the instance I am connecting to by navigating to the AWS OpsHub Dashboard, and then by clicking on Compute. From there, I am able to view a list of running instances. I find the public IP address in the row corresponding to the instance I just launched, under the column Public IP.
Once logged in, I connect to the NFS server by mounting a directory to the Snowcone’s NFS share using the Linux mount command. I substitute the IP address with the IP address of the NFS server that is running on the Snowcone, and bld-snowcone with the name of my NFS share.
sudo mkdir /mnt/mydata
sudo mount -t nfs -o vers=3,nolock -v 192.168.1.233:/buckets/bld-snowcone/ /mnt/mydata
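I can confirm that the share is mounted and that the video file I copied earlier is visible (file names and sizes will vary):
df -h /mnt/mydata
ls -lh /mnt/mydata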
I navigate to the mount directory using cd. I transcode the .mov video file in that directory using the ffmpeg tool that I installed on the AMI.
cd /mnt/mydata
ffmpeg -i FileNameOfMOVFile.mov myTranscodedVideo.mp4
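This uses ffmpeg’s default settings for the .mp4 output. If I want more control over the output resolution and file size, I can pass explicit options, for example scaling the video to 1280 pixels wide and raising the compression level (example values only; adjust to suit the footage):
ffmpeg -i FileNameOfMOVFile.mov -vf scale=1280:-2 -c:v libx264 -crf 28 -c:a aac myTranscodedVideo.mp4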
Once the transcoding completes, I copy the transcoded file into the /var/www/html/media directory of the web server that is running on the EC2 instance.
sudo cp myTranscodedVideo.mp4 /var/www/html/media
Finally, using the public IP address assigned to the instance, in this case 10.0.0.4, I open the transcoded media in a web browser by navigating to http://10.0.0.4/media and then selecting the transcoded .mp4 file.
I am now able to view the transcoded video through my browser.
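From the laptop’s terminal, I could also confirm that the file is being served, assuming the instance address 10.0.0.4 shown earlier:
curl -I http://10.0.0.4/media/myTranscodedVideo.mp4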
Now I have demonstrated two examples of compute at the edge: transcoding and hosting a web server.
Step 7 – Ship the Snowcone back to AWS
After I have completed the transcoding exercise, I ship the device back to AWS. Once the device arrives back at AWS, I receive a notification, and the data is loaded into Amazon S3. I can view details of the import process by looking at the AWS Snow Family job dashboard in the AWS Management Console. I select my job and click Download success log to see that files were loaded successfully into S3.
Conclusion
In this post, I demonstrated video transcoding in the field using the edge computing features of AWS Snowcone, by launching an EC2 instance with video transcoding software installed. I demonstrated edge storage, which allows the customer to store both raw and transcoded video files on the device, and I showed how to transfer data onto the Snowcone using the built-in NFS server. Finally, I showed how you can physically transfer data from a remote location into an Amazon S3 bucket in the AWS Cloud. Transcoding the video in the field to a lower resolution matters because it lets you economically maintain local copies of the video in remote locations where storage capacity is limited. Shipping the original high-resolution data back to AWS matters because it allows the customer to perform further detailed analysis on the video later, and provides cost-effective, long-term, highly durable storage of the data.
AWS Snowcone is a versatile service that has a wide range of applications and other features that I did not discuss here. For example, you can use AWS Snowcone with the online data transfer services AWS DataSync or AWS Direct Connect. DataSync simplifies, automates, and accelerates copying large amounts of data to and from AWS Storage services – as well as between AWS Storage services – over the internet, while Direct Connect offers a dedicated network connection.
Thanks for reading this blog post on using the AWS Snowcone for video transcoding. Please don’t hesitate to leave comments or questions in the comments section.