
Developing Software to Manage Distributed Energy Systems at Scale

Functional programming techniques can make software more composable, reliable, and testable. For systems at scale, trade-offs in edge vs. cloud computing can impact speed and security.

Héctor Veiga Ortiz and Natalie DellaMaria spoke about Tesla’s virtual power plant at QCon San Francisco 2022 and QCon Plus December 2022.

The Tesla Energy Platform is a microservices cloud architecture using functional programming, as Veiga Ortiz and DellaMaria explained:

Functional programming enables us to move fast while maintaining confidence in quality. Much of our application logic is built with pure functions. This enables us to rely on lightweight unit tests that are quick to run and give us confidence in our code without needing to stand up resource-heavy system or integration tests.
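A minimal sketch of the idea, with a hypothetical pure function (not from Tesla's codebase): because the output depends only on the inputs, it can be verified with a plain assertion instead of a heavyweight integration test.

```scala
// Hypothetical example: a pure function that clamps a requested power
// setpoint to a device's allowed range. No I/O, no mutable state, so it
// can be unit-tested in isolation.
def clampSetpointKw(requested: Double, minKw: Double, maxKw: Double): Double =
  math.max(minKw, math.min(maxKw, requested))
```

A lightweight test is then just `assert(clampSetpointKw(12.0, 0.0, 10.0) == 10.0)`, with no system setup required.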

Strongly typed languages like Scala allow developers to model business logic with powerful types that express use cases in a more readable and understandable way, DellaMaria mentioned.
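As an illustration of this style (the domain names here are hypothetical), business cases can be modeled as a sealed algebraic data type, so the compiler warns when a pattern match does not handle every case:

```scala
// Hypothetical sketch: device commands modeled as a sealed ADT. Because the
// trait is sealed, the compiler checks the match below for exhaustiveness.
sealed trait DeviceCommand
final case class SetPowerKw(kw: Double) extends DeviceCommand
case object EnterStandby extends DeviceCommand

def describe(cmd: DeviceCommand): String = cmd match {
  case SetPowerKw(kw) => s"set power to $kw kW"
  case EnterStandby   => "enter standby"
}
```

Adding a new command subtype then surfaces every unhandled match at compile time rather than at runtime.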

The immutability of variables reduces possible side effects, resulting in fewer bugs and more readable code, as Veiga Ortiz explained:

For example, instead of throwing exceptions, which is an expensive operation because the runtime needs to collect information about the stack trace, we model errors using the type Either[A,B] where the type A represents the Error/Exception and type B represents your successful object returned from a computation.

We also use Option[T] to represent the existence (or not) of an object. When you combine these powerful simple types with category theory and effect libraries such as Cats, you can express complicated business logic in simple for-comprehension blocks, boosting productivity and ensuring your code is doing what you expect at compile time.
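A small sketch of the pattern described above, using plain Scala's right-biased `Either` rather than Cats (the parsing functions are illustrative, not Tesla's API): each step returns `Either[String, A]`, and the for-comprehension short-circuits on the first `Left`, so errors are ordinary values instead of thrown exceptions.

```scala
// Hypothetical telemetry reading parsed and validated without exceptions.
final case class Telemetry(batteryKwh: Double)

def parseKwh(raw: String): Either[String, Double] =
  raw.toDoubleOption.toRight(s"not a number: $raw")

def validate(kwh: Double): Either[String, Double] =
  if (kwh >= 0) Right(kwh) else Left(s"negative reading: $kwh")

// The for-comprehension chains the Eithers: the first Left stops the flow.
def readTelemetry(raw: String): Either[String, Telemetry] =
  for {
    kwh   <- parseKwh(raw)
    valid <- validate(kwh)
  } yield Telemetry(valid)
```

With an effect library such as Cats, the same shape extends to `IO`, `Validated`, and other effect types, which is what makes the for-comprehension style compose across a whole service.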

DellaMaria mentioned that when making decisions about cloud vs. edge computing, speed and security are often considered. It is often quicker to iterate in the cloud layer before moving logic down to the edge; however, some features make the most sense implemented locally on the device. Because Tesla is vertically integrated, she noted, they can release cloud-based features quickly, learn from them, and at any time choose to move that implementation down to the device.

InfoQ interviewed Héctor Veiga Ortiz and Natalie DellaMaria about the Tesla Energy Platform.

InfoQ: What purpose does the Tesla Energy Platform serve?

Héctor Veiga Ortiz and Natalie DellaMaria: The Tesla Energy Platform provides software services that enable real-time control of millions of IoT devices and support a variety of user experiences. Its main purpose is to abstract complexities from the devices, like telemetry collection or device control, into simple and usable primitives through APIs. Having a simple set of primitives opens the door to other applications to create experiences such as Storm Watch or Virtual Power Plants.

InfoQ: How does the architecture of the Tesla Energy Platform look?

Veiga Ortiz and DellaMaria: Applications within the Tesla Energy Platform fall into three logical domains: Asset and Authorization Management to manage device relationships and authorization models, Telemetry for the ingestion and exposure of real time data, and Control to enable smart controls, configuration updates and user features.

All these services run on Kubernetes deployments and expose their APIs through gRPC or HTTP ingresses. Most of our Kubernetes deployments use horizontal pod autoscalers to react to load changes and use the appropriate resources. Horizontal pod autoscalers and Kubernetes cluster node autoscalers help us use the necessary amount of resources at any given time, and therefore keep cost to the minimum required.
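For readers unfamiliar with the mechanism, a horizontal pod autoscaler is declared as a Kubernetes manifest like the illustrative one below (the deployment name and thresholds are hypothetical, not Tesla's configuration); it scales the replica count between fixed bounds based on observed CPU utilization.

```yaml
# Illustrative HorizontalPodAutoscaler (autoscaling/v2): scales the
# hypothetical "telemetry-ingest" deployment between 2 and 20 replicas,
# targeting 70% average CPU utilization across pods.
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: telemetry-ingest
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: telemetry-ingest
  minReplicas: 2
  maxReplicas: 20
  metrics:
    - type: Resource
      resource:
        name: cpu
        target:
          type: Utilization
          averageUtilization: 70
```

Paired with a cluster node autoscaler, this is what lets resource usage, and therefore cost, track actual load.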

InfoQ: How do you trade off between edge and cloud computing?

Veiga Ortiz and DellaMaria: For the past 20 years, edge devices were considered low-powered machines, able only to report data from their installed sensors. On the other hand, server-side computing (either cloud or on-prem) was the sole place where that reported data could be processed. In recent years, newer devices have gained more CPU and memory resources for computation, which is blurring the line about where a computation should happen.

Another important aspect in this regard is cost: if you need to process more data in the cloud or on-prem, you presumably need to allocate more resources for it, increasing the overall cost. However, running the computation on the edge makes it virtually free, as you have already paid for the resources. This new paradigm opens possibilities to create an even larger distributed system, where part of the processing now happens at the edge.

InfoQ: What do you expect the future will bring for energy cloud systems?

Veiga Ortiz and DellaMaria: Energy cloud systems will continue to grow, increase energy security, and help accelerate the transition to renewable energy by intelligently controlling energy generation and storage. More and more features will run on the devices themselves rather than in the cloud. We do think cloud systems will continue to be a critical component in supporting user experiences and providing relevant information to devices to enable them to make autonomous decisions.
