AWS for M&E Blog
How to improve user engagement with real-time AR effects using BytePlus Effects and Amazon IVS
This blog post was co-authored by Chen Xiaoyu, ML Solution Director at BytePlus, Hao Chen at AWS, and Tian Shi at AWS.
According to Sandvine’s Global Internet Phenomena Report, video content accounted for more than 65% of all internet traffic in 2022. This number is up 24% from 2021 and is set to increase with the growing number of platforms available to creators to reach audiences through streaming video.
With such a large, diverse, and growing market, it can be challenging to differentiate your user-generated content (UGC) platform. One way some platforms have found success is by adding features like augmented reality (AR).
BytePlus Effects, an SDK that features an extensive library of AR engagement tools, provides one example of how developers can quickly incorporate these features. The SDK runs entirely offline, meaning that all effects are rendered on the user's device without sending any personal data to the cloud. This approach has several advantages; chief among them, improved performance and user data privacy are paramount in creating a positive experience that establishes trust with users.
BytePlus Effects can also integrate with your live streaming UGC platform. In this blog post, we explore how to use the library to create an iOS application that broadcasts to Amazon Interactive Video Service (Amazon IVS).
Prerequisites
To get started, set up a few prerequisites:
- An AWS account to use the Amazon IVS Console
- An Amazon IVS streaming channel created from the Amazon IVS Console directly or through the Amazon IVS API
Once you have created a streaming channel, note the following pieces of information provided by Amazon IVS:
- The Ingest Server URL
- The Stream Key
- The Playback URL
For a complete guide to Amazon IVS Channels, please refer to the Getting Started with Amazon IVS guide.
SDK installation
For easy SDK installation and integration, we recommend integrating the SDK via CocoaPods. Both Amazon IVS and BytePlus support CocoaPods. For more information, please refer to the Amazon IVS Broadcast SDK (iOS) Installation Guide and BytePlus Effect SDK (iOS) Installation Guide.
Integration steps
By default, the Amazon IVS iOS broadcast SDK provides an API to control and capture device camera images for streaming. To integrate with the BytePlus Effect SDK, we instead let the application logic control the device camera, use the BytePlus Effect SDK to process each image, and then pass the processed image to the IVS Broadcast SDK via CustomImageSource for streaming. In this blog post, we demonstrate the integration steps using Objective-C code, which works with both the Amazon IVS iOS broadcast SDK and the BytePlus Effect SDK.
- Setup Broadcast Session
- Setup CustomImageSource configuration
- Setup Video Capture
  - In this example, we will let the BytePlus Effect SDK access cameras via BEVideoCapture. However, application developers are free to use other input sources of their choice.
- Process Captured Image using the BytePlus Effect SDK
- Send Processed Image Buffer to CustomImageSource
- Start Broadcast
Setup broadcast session
Detailed SDK instructions can be found on the Amazon IVS Documentation page and the sample application available on GitHub.
Setup CustomImageSource with broadcast configuration
CustomImageSource is an advanced feature of the Amazon IVS Broadcast SDK. It allows application developers to submit image buffers from custom input sources for streaming.
Broadcast configuration
To use CustomImageSource as the input source for the broadcast session, we first need to set up the broadcast configuration. In this example, we use the preset configuration standardPortrait.
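A minimal sketch of this step, assuming the Amazon IVS Broadcast SDK is linked via CocoaPods; the preset accessor follows the SDK's IVSPresets API:

```objectivec
#import <AmazonIVSBroadcast/AmazonIVSBroadcast.h>

// Start from the standardPortrait preset; it can be tuned further
// (bitrate, resolution, etc.) before the session is created.
IVSBroadcastConfiguration *config = [IVSPresets configurations].standardPortrait;
```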
Mixer slot configuration
To use CustomImageSource as an input source, we use the mixer feature in the Broadcast SDK to create a custom mixer configuration. The mixer is a video processing unit that takes multiple input sources (slots) and generates a single output for ingestion. For demonstration purposes, this example uses a single slot.
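A sketch of the slot setup, assuming `config` is the IVSBroadcastConfiguration built from the standardPortrait preset; the slot name "customSlot" is our own choice:

```objectivec
// Create a single mixer slot that prefers a user-provided image source.
NSError *slotError = nil;
IVSMixerSlotConfiguration *slot = [[IVSMixerSlotConfiguration alloc] init];
[slot setName:@"customSlot" error:&slotError];
slot.preferredVideoInput = IVSDeviceTypeUserImage;
config.mixer.slots = @[slot];
```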
Finally, we will set up the Broadcast session with the previous configurations.
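The session setup could then look like the following sketch, where `self` is assumed to adopt IVSBroadcastSessionDelegate and the custom image source feeds the slot defined above:

```objectivec
// Create the broadcast session with the configuration above, then attach
// a custom image source bound to the "customSlot" mixer slot.
NSError *sessionError = nil;
self.broadcastSession = [[IVSBroadcastSession alloc] initWithConfiguration:config
                                                               descriptors:nil
                                                                  delegate:self
                                                                     error:&sessionError];
self.customSource = [self.broadcastSession createImageSourceWithName:@"customSlot"];
[self.broadcastSession.mixer addDevice:self.customSource];
```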
Setup video capture
For demo purposes, we use the BytePlus Effect SDK’s API BEVideoCapture to set up video capture.
BytePlus Effect SDK initialization
Capture video image from device camera
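As a rough outline of these two steps (class and selector names follow the BytePlus Effect SDK sample apps; exact initialization arguments such as license and resource paths vary by SDK version):

```objectivec
// Outline only: initialize the effect manager and start camera capture.
// BEEffectManager normally also requires license/resource setup not shown here.
self.manager = [[BEEffectManager alloc] init];
self.capture = [[BEVideoCapture alloc] init];
self.capture.delegate = self;   // frames arrive via the videoCapture proxy method
[self.capture startRunning];
```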
Process captured image using BytePlus Effect SDK
The following videoCapture proxy method, provided by BEVideoCapture, passes the captured CMSampleBufferRef from the device camera for processing.
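A sketch of that callback; the exact selector is defined by the SDK's delegate protocol, so this shape is illustrative:

```objectivec
// Illustrative delegate method: unwrap the camera frame and hand it to
// the effect-processing routine.
- (void)videoCapture:(BEVideoCapture *)capture
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    [self processWithCVPixelBuffer:pixelBuffer];
}
```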
Processed buffer from camera using BytePlus Effect SDK
The actual image processing happens in the processWithCVPixelBuffer function that follows.
Set up input and output image buffer
Process the image
We use self.manager, which is the BEEffectManager object we set up earlier during the BytePlus Effect SDK initialization.
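An outline of processWithCVPixelBuffer follows; note that the effect-manager call shown is a placeholder, since the concrete BEEffectManager processing entry point (texture- or buffer-based) depends on the SDK version:

```objectivec
// Outline: lock the input buffer, run the BytePlus effect pipeline via
// self.manager, and return the processed buffer. processPixelBuffer: is a
// placeholder for the SDK's actual processing method.
- (CVPixelBufferRef)processWithCVPixelBuffer:(CVPixelBufferRef)pixelBuffer {
    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
    CVPixelBufferRef processed = [self.manager processPixelBuffer:pixelBuffer];
    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
    return processed;
}
```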
Send processed image buffer to CustomImageSource
Once the processed image buffer is ready, we send it to the CustomImageSource in the broadcast session set up in the previous steps.
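A sketch of the submission, assuming `processedBuffer` is the CVPixelBufferRef returned by the processing step; onSampleBuffer: is the IVSCustomImageSource method for submitting frames:

```objectivec
// Wrap the processed pixel buffer in a CMSampleBuffer and submit it.
// In a real app, timing should be carried over from the capture buffer.
CMVideoFormatDescriptionRef formatDesc = NULL;
CMVideoFormatDescriptionCreateForImageBuffer(kCFAllocatorDefault, processedBuffer, &formatDesc);
CMSampleTimingInfo timing = {
    .duration = kCMTimeInvalid,
    .presentationTimeStamp = CMClockGetTime(CMClockGetHostTimeClock()),
    .decodeTimeStamp = kCMTimeInvalid
};
CMSampleBufferRef sample = NULL;
CMSampleBufferCreateReadyWithImageBuffer(kCFAllocatorDefault, processedBuffer,
                                         formatDesc, &timing, &sample);
[self.customSource onSampleBuffer:sample];
CFRelease(sample);
CFRelease(formatDesc);
```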
Start video broadcasting
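With everything wired up, starting the stream is a single call. Substitute the ingest server URL and stream key noted in the Prerequisites (placeholders below):

```objectivec
// Start streaming to the Amazon IVS channel's ingest endpoint.
NSError *startError = nil;
NSURL *ingestURL = [NSURL URLWithString:@"rtmps://<ingest-server>/app/"];
[self.broadcastSession startWithURL:ingestURL
                          streamKey:@"<stream-key>"
                              error:&startError];
```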
Integration result
Face shaping demo. Left: source iPhone application; right: HLS output from Amazon IVS channel
Image filtering demo. Left: source iPhone application; right: HLS output from Amazon IVS channel
This integration can be customized further. The BytePlus Effect SDK offers versatile integration options to process a variety of image sources, such as BEImageCapture or BELocalVideoCapture.
Application architecture
To conclude, we integrated both the Amazon IVS Broadcast SDK and the BytePlus Effect SDK into an iOS mobile application. Using the BytePlus Effect SDK, we processed video images from the device camera and submitted them to the IVS Broadcast SDK for streaming.
About BytePlus
Born from ByteDance technology, BytePlus helps clients maximize growth through a wide range of intelligent technology solutions. A dedicated team of specialists works hand-in-hand with customers to help create better products, deliver better experiences, and realize business growth.