AWS Announces Amazon SageMaker Edge Manager

AWS recently announced Amazon SageMaker Edge Manager, a new capability of Amazon SageMaker that makes it easy for customers to prepare, run, monitor, and update machine learning models on fleets of edge devices such as smart cameras, robots, and industrial machines.

Amazon SageMaker Edge Manager is one of the nine significant updates to the cloud-based machine learning platform Amazon SageMaker announced during the annual re:Invent conference. With Edge Manager, the company delivers a solution that lets its customers more easily deploy and manage models at the edge.

Furthermore, Amazon SageMaker Edge Manager extends capabilities previously only available in the cloud by sampling models' input and output data from edge devices and sending it to the cloud - allowing developers to continuously improve model quality by using Amazon SageMaker Model Monitor for drift detection, then relabel the data and retrain the models. 
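For reference, drift detection with Amazon SageMaker Model Monitor starts from a baseline computed over a known-good dataset, which new data is then compared against. The sketch below shows that baseline step with the SageMaker Python SDK; the role ARN and S3 paths are placeholders, and the idea that data sampled from edge devices would later be analyzed against this baseline is an assumption about the workflow described above, not code from AWS.

```python
# A minimal baseline-suggestion sketch with the SageMaker Python SDK.
# The role ARN and S3 URIs are placeholders; the baseline dataset stands in for
# "known-good" data against which edge-captured samples could be compared.
from sagemaker.model_monitor import DefaultModelMonitor
from sagemaker.model_monitor.dataset_format import DatasetFormat

monitor = DefaultModelMonitor(
    role="arn:aws:iam::123456789012:role/SageMakerRole",  # placeholder role
    instance_count=1,
    instance_type="ml.m5.xlarge",
    volume_size_in_gb=20,
    max_runtime_in_seconds=1800,
)

# Compute baseline statistics and constraints; later monitoring jobs flag drift
# when new data (e.g. samples captured from the edge fleet) violates them.
monitor.suggest_baseline(
    baseline_dataset="s3://my-bucket/training/baseline.csv",   # placeholder
    dataset_format=DatasetFormat.csv(header=True),
    output_s3_uri="s3://my-bucket/model-monitor/baseline/",    # placeholder
)
```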

Developers can train or import a model in Amazon SageMaker, and subsequently let Amazon SageMaker Edge Manager optimize it for their hardware platform using Amazon SageMaker Neo – a service launched two years ago. Neo converts models into an efficient standard format that is executed on the device by a low-footprint runtime. Currently, Neo supports devices based on chips manufactured by Ambarella, ARM, Intel, NVIDIA, NXP, Qualcomm, TI, and Xilinx.
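As an illustration of what such a compilation step looks like, the sketch below starts a Neo compilation job through boto3. The job name, role ARN, S3 paths, framework, and target device are placeholder assumptions, not details from the announcement.

```python
# A minimal sketch of compiling a trained model with Amazon SageMaker Neo via boto3.
# All names, ARNs, and S3 URIs below are placeholders.
import boto3

sm = boto3.client("sagemaker")

sm.create_compilation_job(
    CompilationJobName="image-classifier-neo",                # hypothetical job name
    RoleArn="arn:aws:iam::123456789012:role/SageMakerRole",   # placeholder role
    InputConfig={
        "S3Uri": "s3://my-bucket/models/model.tar.gz",        # trained model artifact
        "DataInputConfig": '{"data": [1, 3, 224, 224]}',      # expected input shape
        "Framework": "MXNET",                                 # framework used for training
    },
    OutputConfig={
        "S3OutputLocation": "s3://my-bucket/compiled/",       # where Neo writes the result
        "TargetDevice": "jetson_nano",                        # example supported edge target
    },
    StoppingCondition={"MaxRuntimeInSeconds": 900},
)
```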

Amazon SageMaker Edge Manager then packages the model and stores it in Amazon Simple Storage Service (S3), from where it can be deployed to the intended devices. The on-device models are managed by the Amazon SageMaker Edge Manager agent, which communicates with the AWS Cloud for model deployment and with the developers' application for model management.
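To make the packaging and fleet setup concrete, here is a hedged sketch of the corresponding boto3 calls. The job, fleet, and device names, the role ARN, and the S3 locations are placeholders, and the sketch assumes the compilation job from the previous example has completed.

```python
# A minimal sketch of packaging a Neo-compiled model with SageMaker Edge Manager
# and registering a device fleet via boto3. All names and ARNs are placeholders.
import boto3

sm = boto3.client("sagemaker")

# Package the compiled model; Edge Manager writes the package to S3.
sm.create_edge_packaging_job(
    EdgePackagingJobName="image-classifier-edge-pkg",
    CompilationJobName="image-classifier-neo",         # Neo job from the previous sketch
    ModelName="image-classifier",
    ModelVersion="1.0",
    RoleArn="arn:aws:iam::123456789012:role/SageMakerRole",
    OutputConfig={"S3OutputLocation": "s3://my-bucket/edge-packages/"},
)

# Create a fleet and register a device that the on-device agent reports against.
sm.create_device_fleet(
    DeviceFleetName="smart-cameras",
    RoleArn="arn:aws:iam::123456789012:role/SageMakerRole",
    OutputConfig={"S3OutputLocation": "s3://my-bucket/edge-fleet-data/"},
)
sm.register_devices(
    DeviceFleetName="smart-cameras",
    Devices=[{"DeviceName": "camera-001", "IotThingName": "camera-001"}],
)
```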

Julien Simon, an artificial intelligence & machine learning evangelist at AWS, states in a blog post:

Indeed, you can integrate this agent with your application, so that it may automatically load and unload models according to your prediction requests. This enables a variety of scenarios, such as freeing all resources for a large model whenever needed, or working with a collection of smaller models that cohabit in memory.

Source: https://www.youtube.com/watch?v=zS0Q3bdsLiU (screenshot)

Vin Sharma, GM of Machine Learning Inference Service at AWS, said in a video on Amazon SageMaker Edge Manager:

Today, many developers use either hardware-specific tools or framework-specific tools to convert and optimize their trained models to run on the target hardware. This process can take many months for developers to hand-tune each model to fit each device's specific hardware constraints. 

Amazon SageMaker Edge Manager makes this easy by using Amazon SageMaker Neo. Neo compiles the models for a wide variety of target devices and a wide range of operating environments, including Linux, Windows, Android, and even iOS and macOS, across a variety of target hardware based on CPUs, GPUs, and embedded SoCs.

Currently, Amazon SageMaker Edge Manager is available in a limited number of AWS Regions across Europe, North America, and Asia Pacific. Furthermore, sample notebooks are available on GitHub, and pricing details can be found on the pricing page.
