
QCon New York 2023: Living on the Edge with Erica Pisani

Erica Pisani, senior software engineer at Netlify, presented Living on the Edge at the 2023 QCon New York conference.

In an edge computing infrastructure, availability zones are defined as one or more data centers located in dedicated geographic regions and operated by providers such as AWS, Google Cloud, and Microsoft Azure. Pisani further defined: the edge as data centers that live outside of an availability zone; an edge function as a function that is executed in one of these data centers; and data on the edge as data that is cached, stored, or accessed at one of these data centers. This arrangement improves performance, especially for users who are located far from any availability zone.

After showing global maps of AWS availability zones and edge locations, Pisani then provided a step-by-step overview of the communication between a user, edge location and origin server:

Edge Request Path

Source: Living on the Edge presentation by Erica Pisani

As shown in the diagram, a user makes a request via a browser or mobile application, and the request first arrives at the nearest edge location. In the best-case scenario, the edge location responds to the request and the transaction is complete. However, if the data at the edge location is outdated or invalidated relative to the origin server, the edge location must communicate with that origin server to obtain the latest data before responding to the user. This, of course, imposes a latency cost on that user, but subsequent users making the same request will benefit from the freshly cached data.

Pisani then discussed various challenges in web application functionality that edge functions can address, along with corresponding solutions. These related to: high-traffic pages that need to serve localized content; user session validation taking too much time in the request; and routing a third-party integration request to the correct region.
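The first of these cases, serving localized content, can be illustrated with a short sketch. The example below assumes Netlify's Edge Functions API, in which the handler receives the incoming Request along with a Context object carrying geolocation data resolved at the edge location; the country codes and greetings are purely illustrative.

```typescript
// A minimal sketch of serving localized content from an edge function,
// assuming Netlify's Edge Functions API (default-exported handler that
// receives the Request and a Context with geo data). Copy is illustrative.
import type { Context } from "@netlify/edge-functions";

export default async (request: Request, context: Context) => {
  // context.geo is populated at the edge location handling the request,
  // so no round trip to the origin server is needed to localize.
  const country = context.geo?.country?.code ?? "US";

  const greetings: Record<string, string> = {
    CA: "Welcome, Canadian visitors!",
    DE: "Willkommen!",
    US: "Welcome!",
  };

  return new Response(greetings[country] ?? greetings.US, {
    headers: { "content-type": "text/plain" },
  });
};
```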

Pisani introduced the AWS Snowball Edge Device, a physical device that provides cloud computing available for places with unreliable and/or non-existent Internet access or as a way of migrating data to the cloud.

She wrapped up the presentation by enumerating some of the limitations of edge computing, namely: lower available CPU time; blocking on an external network request; limited integration with other cloud services; and smaller caches.

InfoQ spoke with Pisani to learn more about edge computing.

InfoQ: How long have you been with Netlify, and what are your current responsibilities, that is, what do you do on a day-to-day basis?

Erica Pisani: I've been with Netlify for a little over a year (started in March 2022). Currently I work on the Composable Tooling team, which focuses on providing developers the tools they need to enhance their Netlify sites with the functionality that they need.

In my day-to-day, this means that I'm researching tools and workflows that developers likely want when building their sites, and creating abstractions within Netlify to make it simple to plug everything together without needing to deeply understand all the features Netlify has.

InfoQ: How do developers get started writing edge functions in terms of access to servers, required accounts, etc.?

Pisani: It depends on the service provider that a developer wants to use, but generally speaking, it's quite easy to get started.

Vendors like Cloudflare (Workers), Netlify (Edge Functions), and Vercel (Edge Functions) make it extremely simple to write and deploy edge functions and only require a free account to get started.

All of these vendors also have some great examples that give folks a sense of how quickly they can set one up.
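As a rough illustration of how little code is involved, the following is a minimal edge function sketched in the Cloudflare Workers module syntax; the response text is arbitrary, and deployment details such as the project name and routes are assumptions left to the account's onboarding setup.

```typescript
// A minimal "hello world" edge function in the Cloudflare Workers module
// syntax. The response body is arbitrary; routing/deployment configuration
// is assumed to come from the project's own setup.
export default {
  async fetch(request: Request): Promise<Response> {
    const url = new URL(request.url);
    return new Response(`Hello from the edge! You requested ${url.pathname}`, {
      headers: { "content-type": "text/plain" },
    });
  },
};
```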

I can't speak for Google Cloud Platform or Microsoft Azure as my experience with them has been very limited, but I imagine that they are similar to AWS in that they're slightly more involved than the vendors mentioned above.

In AWS' case, a developer would have to do some additional setup, such as setting specific triggers in CloudFront to invoke a Lambda@Edge function and ensuring proper IAM permissions are set.

For those interested in taking a look at Lambda@Edge, AWS provides a tutorial in its documentation, but again, it's generally straightforward to write and publish an edge function.
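To give a rough idea of what such a function looks like once the CloudFront trigger and IAM permissions are in place, the sketch below shows a Lambda@Edge handler attached to a viewer-request event; the header it adds is purely illustrative.

```typescript
// A sketch of a Lambda@Edge handler for a CloudFront viewer-request
// trigger. The event shape (Records[0].cf.request) follows the CloudFront
// event structure; the header added here is illustrative only.
import type {
  CloudFrontRequestEvent,
  CloudFrontRequestResult,
} from "aws-lambda";

export const handler = async (
  event: CloudFrontRequestEvent
): Promise<CloudFrontRequestResult> => {
  // The request CloudFront is about to forward to the cache/origin.
  const request = event.Records[0].cf.request;

  // Tag the request before it continues.
  request.headers["x-edge-processed"] = [
    { key: "X-Edge-Processed", value: "true" },
  ];

  // Returning the request lets it proceed; returning a response-shaped
  // object instead would short-circuit the request at the edge.
  return request;
};
```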

InfoQ: Will there be support for other languages besides TypeScript and JavaScript to write edge functions?

Pisani: There already is! Again, it depends on the vendor, but from the ones I know: along with JavaScript, Cloudflare Workers support Rust, C, and C++, and AWS Lambda@Edge supports Python.

InfoQ: In your presentation, you recommended using REST over GraphQL when discussing favoring caching generalized requests over personalized ones. Please explain why, especially since GraphQL solves the problem of under-fetching and over-fetching of data with REST.

Pisani: Great question! To add some context for readers: this came up when talking about deploying an API layer at the edge in order to help boost performance for mobile applications and backend services. By having the API layer at the edge, more generalized responses can be stored there, allowing for faster responses going forward.

In this context, the main reason I recommended REST is that while GraphQL allows developers to fetch exactly what they need, the various permutations that can result from this, especially when requesting fields on deeply nested data, can lead to the response being highly personalized.

Ideally we want more generalized, rather than personalized, responses because more requests (and the users/services making those requests) can benefit from the cached values.

With a higher number of personalized requests and a smaller cache available at an edge location compared to a data center in a cloud provider's availability zone (AZ), it's more likely that a request will need to be made to the origin server in an AZ, which is potentially very distant from the edge location handling the request.

When this happens, it diminishes the effectiveness of the edge location in enhancing performance by reducing the physical distance a request needs to travel.
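A small sketch may help make the caching argument concrete. The edge handler below, written against a Workers-style fetch API, proxies a plain REST GET and marks the response as cacheable by shared caches; because the cache key is just the URL, every user in the region can reuse it, whereas a personalized GraphQL POST body would not be cached this way by default. The origin hostname and max-age are assumptions for illustration.

```typescript
// Sketch: generalized REST responses cache well at the edge because a GET
// for a fixed resource yields one cache key per URL. Origin hostname and
// cache lifetime below are hypothetical.
export default {
  async fetch(request: Request): Promise<Response> {
    // e.g. GET /api/products/123, the same URL for every user, so the
    // edge cache needs only one entry per resource.
    const origin = new URL(request.url);
    origin.hostname = "api.example.com"; // hypothetical origin REST API

    const upstream = await fetch(origin.toString(), { method: "GET" });

    // Re-wrap the body so the headers are mutable, then mark the response
    // as cacheable by shared caches (the edge/CDN) for five minutes.
    const response = new Response(upstream.body, upstream);
    response.headers.set("Cache-Control", "public, s-maxage=300");
    return response;
  },
};
```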

InfoQ: What’s on the horizon for edge computing?

Pisani: I think we're going to see more data, complex workloads, and applications hosted at edge locations over time. The performance benefits can be massive when applied well, particularly in regions of the world where there's only one availability zone data center but a number of edge locations scattered across that region.

Edge computing infrastructure becoming available over 5G networks through services like AWS Wavelength also looks like it could have a massive impact on how services are delivered, and on what kinds of services become reliable as a result of the ultra-low latency this makes available.

I think we may start to see more "on-premises" hardware (such as AWS Snowball Edge) and local-first applications (Carl Sverre had an excellent talk on this at QCon NYC) being adopted as a form of edge computing as well, especially in areas that are under-served with respect to reliable internet access.

The 9th annual QCon New York conference was held June 13-15, 2023, at the New York Marriott at the Brooklyn Bridge in Brooklyn, New York. C4Media, a software media company and creator of InfoQ and QCon, organized this three-day event focused on unbiased content and information for the enterprise software development community.

More details on QCon New York 2023 may be found in the Day One, Day Two and Day Three daily recaps.
