AWS Batch Introduces Multi-Container Jobs for Large-Scale Simulations

AWS recently announced support for multi-container jobs in AWS Batch, available through the management console. This new feature simplifies the process of running simulations, particularly for testing complex systems such as those used in autonomous vehicles and robotics.

According to the cloud provider, multi-container jobs accelerate development times by reducing the effort required for job preparation and eliminating the need for custom tooling to integrate the work of multiple teams. Danilo Poccia, chief evangelist (EMEA) at AWS, writes:

Traditionally, AWS Batch only allowed single-container jobs and required extra steps to merge all components into a monolithic container. It also did not allow using separate "sidecar" containers, which are auxiliary containers that complement the main application by providing additional services like data logging. This additional effort required coordination across multiple teams (...) because any code change meant rebuilding the entire container.

A set of batch management capabilities, AWS Batch helps developers, scientists, and engineers run batch computing jobs on the cloud, dynamically provisioning the quantity and type of compute resources required based on the volume and specific resource requirements of the jobs submitted.
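
For context, submitting work to AWS Batch is a single API call against an existing job queue and job definition; the service then takes care of provisioning matching compute. The sketch below uses the Python SDK (boto3); the job, queue, and definition names are hypothetical.

import boto3

batch = boto3.client("batch")

# Submit a job to an existing queue using a registered job definition;
# AWS Batch provisions compute resources that match the job's requirements.
response = batch.submit_job(
    jobName="nightly-simulation-run",    # hypothetical job name
    jobQueue="simulation-queue",         # assumes this queue already exists
    jobDefinition="vehicle-simulation",  # assumes this definition is registered
)
print("Submitted job:", response["jobId"])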

According to AWS, the new feature makes it easier to run large-scale simulations in areas like autonomous vehicles and robotics, workloads that are usually divided between the simulation itself and the system under test that interacts with the simulation. IPG Automotive, MORAI, and Robotec.ai are among the AWS customers already running multi-container jobs. Poccia adds:

Using multi-container jobs accelerates development times by reducing job preparation efforts and eliminates the need for custom tooling to merge the work of multiple teams into a single container. It also simplifies DevOps by defining clear component responsibilities so that teams can quickly identify and fix issues in their own areas of expertise without distraction.

Because multiple containers can now run within a single job, developers no longer need to rebuild a system into a monolithic container before executing batch jobs. Instead, they can define multiple smaller, modular containers representing distinct system components using the AWS Management Console, CLI, or SDKs.
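
As a rough illustration of what defining such a job looks like via the SDK, the sketch below registers a job definition with a simulation container and a system-under-test container using boto3. The ecsProperties/taskProperties structure follows the shape documented for the feature, but the names, images, and resource values are hypothetical and would need adjusting for a real account.

import boto3

batch = boto3.client("batch")

# Register a job definition whose task runs two containers:
# the simulation and the system under test that interacts with it.
response = batch.register_job_definition(
    jobDefinitionName="vehicle-simulation",  # hypothetical name
    type="container",
    ecsProperties={
        "taskProperties": [
            {
                "containers": [
                    {
                        "name": "simulation",
                        "image": "123456789012.dkr.ecr.us-east-1.amazonaws.com/simulation:latest",  # hypothetical image
                        "essential": True,
                        "resourceRequirements": [
                            {"type": "VCPU", "value": "2"},
                            {"type": "MEMORY", "value": "4096"},
                        ],
                    },
                    {
                        "name": "system-under-test",
                        "image": "123456789012.dkr.ecr.us-east-1.amazonaws.com/sut:latest",  # hypothetical image
                        "essential": True,
                        "resourceRequirements": [
                            {"type": "VCPU", "value": "1"},
                            {"type": "MEMORY", "value": "2048"},
                        ],
                    },
                ],
            }
        ]
    },
)
print("Registered:", response["jobDefinitionArn"])

Keeping each component in its own container means a code change to the system under test only requires rebuilding that one image, not the whole simulation stack.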

AWS is not the only cloud provider offering a set of batch management capabilities: Microsoft provides Azure Batch, a service that assists developers in managing compute-intensive work across a scalable collection of VMs; Batch is Google Cloud's managed service for scheduling, queuing, and executing batch processing workloads. However, neither currently supports multi-container jobs.

The new feature is offered in all regions where AWS Batch is available, and there are no additional costs for using AWS Batch or multi-container jobs.

 
