

Streamlining Cloud Development with Deno


Summary

Ryan Dahl discusses Deno Runtime, Deno KV: a datastore anchored by ACID transactions and powered by FoundationDB, Deno Queues, and NPM in Deno.

Bio

Ryan Dahl studied mathematics at UCSD and the University of Rochester before pivoting to software engineering. In 2009, he created Node.js and guided the project through its foundational years. Throughout his career, Dahl has ventured into diverse areas of software engineering, from server infrastructure to machine learning research. He currently serves as the co-founder and CEO of Deno Land Inc.

About the conference

Software is changing the world. QCon empowers software development by facilitating the spread of knowledge and innovation in the developer community. A practitioner-driven conference, QCon is designed for technical team leads, architects, engineering directors, and project managers who influence innovation in their teams.

Transcript

Dahl: I'm going to give you a whirlwind tour of Deno and a subset of the features that are built into it. These days I'm working on Deno. The web is incredibly important. Year after year, it just becomes more important. This wasn't super clear back around 2010. There was a time when maybe iPhone apps were going to replace the web. It wasn't super obvious. I think now in 2023, it's pretty clear that the web is here to stay, and, indeed, is the medium of human information. This is how you read your newspaper. This is how you interact with your bank. This is how you talk to your friends. This is how humans communicate with each other. It is incredibly important. We build so much infrastructure on the web at this point, and the web is a very large set of different technologies. I think it is a very fair bet to say that the web is still going to be here 5 years from now, if not 10 or 20 years from now. That is maybe unsurprising to you all, but technology is very difficult to predict. There are very few situations where you can say, 5 years from now, this technology is going to be here. I think this is a rare exception. We are pretty sure that JavaScript, the central programming language of the web, will be here in the future, because it is inherently tied to the web. I think that further work on JavaScript is necessary. The tools that I and others developed 10-plus years ago with Node.js, and npm, and the whole ecosystem around that, are aging fairly poorly. I think further investment here is necessary.

The Node Project

With Node, my goal was to give developers an easy way to build fast servers. The core idea in Node, which was relatively new at the time, was to force people to use async I/O. You had to use non-blocking sockets, as opposed to threading. With JavaScript, this was all bundled up and packaged in a nice way that was easily understandable to frontend developers, because they were used to non-blocking JavaScript in websites. They were used to handling buttons with on-click callbacks. That is very similar to how servers are handled in Node: when you get a new connection, you get an on-connection callback. I think it's clear these days that more can be done beyond just doing async I/O. That is pretty table stakes. Yet, developing servers is still hard. It's still something that we're stumbling over. We have a larger perspective these days on what it means to build an optimal server. It's not just something that runs on your laptop, it's something that needs to run across the entire world. Being able to configure and manage that is a pretty complex situation. I'm sure you all have dealt with complex cloud configurations in some form or another. I'm sure many of you have dealt with Terraform. I am sure you have thought about how to get good latency worldwide. Having a single instance of your application server in U.S.-East is not appropriate for all applications. Sometimes you need to serve users locally, say in Tokyo, and you are bounded by speed-of-light problems. You have to be executing code close to where users reside.
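
To make that callback style concrete, here is a minimal sketch, not code from the talk, of a Node-style TCP echo server, where the connection handler is a callback rather than a blocked thread:

    // Minimal sketch of Node's callback-driven networking (illustrative only).
    import net from "node:net";

    const server = net.createServer((socket) => {
      // The "on-connection" callback: invoked once per new client, no threads.
      socket.on("data", (chunk) => socket.write(chunk)); // non-blocking echo
    });

    server.listen(8000, () => console.log("listening on :8000"));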

There is just an insane amount of software and workflows and toolchains, especially in the JavaScript space, where with Node we created a small core and let a broad ecosystem develop around it. That was not an unreasonable theory, and it was certainly pretty successful, but because of this wide range of tooling options in front of you when you sit down to program some server-side JavaScript, it takes a lot to understand what's going on. Should you be using ESLint, or TSLint, or Prettier? There are all these open questions that mean you have to be an expert in the ecosystem in order to proceed correctly. Those are things that can be solved at the platform layer. Of course, I think we've all heard about the npm supply chain security problems, where people can publish any sort of code to the registry. There are all sorts of nasty effects where, as soon as you link a dependency, you are potentially executing untrusted post-install scripts on your local device, without ever agreeing to that, without really understanding that. That is a very bad situation. We cannot be executing untrusted code from the internet. This is a very bad problem.

Deno: Next-Gen JavaScript Runtime

In some sense, Deno is a continuation of the Node project. It is pursuing the same goals, but with an expanded perspective on what optimal servers actually entail. Deno is a next-generation JavaScript runtime. It is secure by default. I say JavaScript runtime because, at the bottom layer, it is JavaScript, but it treats TypeScript and JSX natively. It really does understand TypeScript in particular: it understands the types, can do type checking, and handles that natively, so that you are not left trying to configure that on top of the platform. It has all sorts of tooling built into it: testing, linting, formatting, just to name a small subset. It is backwards compatible with Node and npm. It's trying to thread the needle between keeping true to web standards, keeping the gap between browsers and server-side JavaScript as small as possible, and also being compatible with existing npm modules and existing Node software. That's a difficult thing to do, because the gap between the npm and Node ecosystem and where web browser standards are specified is quite large.

Demo (Deno)

In Deno, you can run URLs directly from the command line. I'll just type it out: https://deno.land/std@0.150.0/examples/gist.ts. Actually, let me curl this URL first, just to give you an idea. This is some source code. This is a little program that posts gists to GitHub. It takes a file as a parameter and uploads it. I'm going to run this file directly, in a secure way, with Deno, so I'm just going to deno run that command line. What you'll see is that, first of all, it doesn't just execute this thing. You immediately hit this permission prompt, where it says Deno is trying to access this gist token: I have an environment variable that allows me to post to my GitHub. Do you want to allow or deny access to this? I will say yes. Then it fails, because I haven't actually provided a file name. What I'm going to do is upload my /etc/passwd file, very secure. I will allow it to read the environment variable. Then you see that it's trying to access the file /etc/passwd; I will allow that. Now it's trying to post to api.github.com; I will allow that. Finally, it has actually uploaded the gist, and luckily, my /etc/passwd file is shadowed here and doesn't actually contain any secret information, so it's not such a big deal.

That's just an example of running programs securely: obviously, if a program is going to be accessing your operating system, reading gist tokens and whatnot, you need to opt into that. There is no secure way for a program to read environment variables and access the internet in some unbounded fashion. By restricting this, and at least being able to observe and click through these things and agree, very much like in the web browser, when you access a website and it says, I want to access your location API or your webcam, in Deno you opt into operating system access. That is what is meant by secure by default, and that is what is meant by running URLs directly. It's not just URLs it can run, it can also execute npm packages. You can do deno run npm:cowsay; cowsay is a little npm package that prints out some ASCII art. I'm just going to have it say hello. For some reason, the npm package cowsay wants access to my current working directory, so we'll allow that. It's also trying to read its own package.json file, which is a little bit weird. It's also trying to read a .cow file, which I assume is the ASCII art. Once I grant that access, it can indeed print out this stuff. Of course, you can bypass these permission prompts with flags like --allow-read, which lets the program read files from your disk without prompting, while still not granting write access, network access, or environment variables. In this way you can execute things with some degree of confidence.
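
As an aside not shown in the talk, the same permission model is also visible from code: Deno exposes a Deno.permissions API for querying and requesting permissions at runtime. A minimal sketch:

    // Query the current state of a permission (illustrative sketch).
    const read = await Deno.permissions.query({ name: "read", path: "/etc/passwd" });
    console.log(read.state); // "granted", "prompt", or "denied"

    // Explicitly request a permission; this triggers the interactive prompt.
    const net = await Deno.permissions.request({ name: "net", host: "api.github.com" });
    if (net.state === "granted") {
      // network access to api.github.com is now allowed
    }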

Deno is a single executable, so I will just demo that. It's a relatively large executable, but would you ever have known that if I hadn't told you? It's 100 megabytes, but it contains quite a lot of functionality. It is pretty tight. It doesn't link to too many system libraries; this is what it's linked to. It is this one file that you need for all of your TypeScript and JavaScript functionality. We ship on Mac, Linux, and Windows, all fully supported. I think it's a nice property that this executable is all you need. You only need that one file. When you install Deno, you literally install a single executable file. I mentioned that Deno takes web standards seriously. This is a subset of the APIs that Deno supports. Of course, globalThis, the somewhat annoying name for the global scope that TC39 invented: that is the web standard. WebAssembly, of course. Web Crypto is supported. Web Workers are supported; that's how you do parallel computation in Deno, just like in the browser. TransformStream, EventTarget, AbortController, the location object, FormData, Request and Response. The window variable, window.close, localStorage. It goes quite deep.

Just to demo this a little bit, I want to gzip a file with web standards here. Let me do deno init qcon10. I'm using 10 because I've been doing this for hours now; I've got a lot of QCon directories here. Let me open up VS Code and make that big enough that you're able to view it. What I've run is the deno init script, which creates a few files here. What I can do is deno run main.ts, and that adds two numbers together. No big deal. I'm going to delete this stuff. What I want to do is actually open up two files. I'll open up /etc/passwd. Then I'm going to open up a file, out.gz. I'm going to compress that text file into a gzip file, but using web streams here. So Deno.open on /etc/passwd is the first one. That's an async operation, and I'm going to await it, top-level await here. This is the source file that we're reading from. Then I'm going to open up a destination file, Deno.open on out.gz in the current directory, and we want write set to true and create set to true. We'll let Copilot do a lot of the work there. This src file has a property on it, readable, which is actually a readable stream. I can do pipeThrough, again a web standard API, with a new CompressionStream for gzip. The problem with Copilot is you can't really trust it. We're going to pipeThrough this gzip CompressionStream, and then pipeTo the destination's writable. This is a very web-standard way of gzipping a file. Let's try to run that guy. Of course, it's trying to access /etc/passwd, so we have to allow that. Yes, it's trying to get write access to this out.gz file. There. Hopefully, we've created an out.gz file, and hopefully that file is slightly smaller than the passwd file, which appears to be the case. The web standard APIs go very deep. In particular, when it comes to streaming APIs and web servers, Deno has it all covered. If you want to stream a response and pipe it through something else, all of that is dealt with using standard APIs.
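
Reconstructed from the description in this demo, the whole program is roughly:

    // main.ts: gzip a file using only web-standard stream APIs.
    const src = await Deno.open("/etc/passwd"); // readable source (top-level await)
    const dst = await Deno.open("out.gz", { write: true, create: true });

    await src.readable
      .pipeThrough(new CompressionStream("gzip")) // web-standard compression
      .pipeTo(dst.writable); // closes the destination when finished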

I mentioned that Deno has Node compatibility. This was not an original goal of Deno; Deno set out blazing new trails. What we found over time is that there is a lot of code that depends on npm and Node, and that people are unable to operate effectively in this world without being able to link to npm packages, and npm packages, of course, are built on top of Node's built-in APIs. We actually have this all pretty much implemented at this point. There are, of course, many places where this falls through. Just as a demo of this, we're going to import the Node file system module, so import from node:fs, and I'll use readFileSync. Before, I was using the Deno APIs, the global Deno object that provides all of them. Here, I'm just going to do readFileSync on /etc/passwd, the same file as before. This returns a Node Buffer, not to be confused with a web-standard ArrayBuffer; very different things. I'll just console.log it, as a demo of running Node code. Of course, the security still applies here. Even though you're going through the built-in Node APIs, you do not actually have access to the file system without allowing that, so you have to grant it either through the permission prompt or through flags. There we go. We've read /etc/passwd now, and output a Buffer. This fs example is rather superficial, but the compatibility goes pretty deep. It's not 100% complete, but it is deep.
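
The demo code, approximately:

    // Using Node's built-in fs module from Deno via the node: specifier.
    import { readFileSync } from "node:fs";

    const buf = readFileSync("/etc/passwd"); // returns a Node Buffer
    console.log(buf); // still gated by --allow-read or a permission grant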

The Best Way to Build npm Packages: DNT

I do want to mention this project called DNT. DNT stands for Deno Node Transform. One of the big problems with npm packages is that you need to provide different transpilation targets. Are you building an ESM module? Are you building a CommonJS module? Which engines are you targeting? It's pretty hard to code that up properly by hand; there are a lot of things to get wrong. At this point, we are thinking of npm packages as essentially compilation targets, rather than things that people write by hand, because it is so complex and so easy to get wrong. DNT takes care of this for you. It will output a proper npm module with CommonJS and ESM support, create tests for you, and do all of this in a cross-platform way that can be tested on Node, polyfilling anything that is not available there. I just wanted to advertise this: even if you're not using Deno, this is great tooling for distributing npm packages.
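
For a sense of what that looks like, a minimal DNT build script follows the shape below. This is a sketch along the lines of the dnt README, and the package name is hypothetical:

    // build_npm.ts: transform a Deno module into a publishable npm package.
    import { build, emptyDir } from "https://deno.land/x/dnt/mod.ts";

    await emptyDir("./npm");

    await build({
      entryPoints: ["./mod.ts"],
      outDir: "./npm",
      shims: { deno: true }, // polyfill Deno globals so tests run on Node
      package: {
        // fields for the generated package.json
        name: "my-package", // hypothetical name
        version: "0.1.0",
      },
    });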

Express Demo

As a slightly more serious example, let's do some Express coding here. In Deno, you don't need to set up a package.json file, you can just import things. You can use a package.json file, but at some point it's just boilerplate. Let me do this: import express from npm:express, a slightly different import here. Then I'm going to do const app = express(), then app.listen on port 3000, and console.log listening on http://localhost:3000. Just throwing together my little Express app here. Let's see if this thing is running. For some reason, it wants to access the CWD, I think this must be something weird with Express, and also environment variables. Also, of course, it wants to listen on port 3000. Once we've jumped through those hoops, we should be able to curl localhost on port 3000 and get Hello World. In order to not have to repeat all that every time, let me just add some command line flags here: --allow-read --allow-env --allow-net, but not --allow-write. There are a couple of red squigglies here because I'm using TypeScript; the transparent TypeScript support is giving me some errors, saying this response has an any type. The problem with Express, because it's a dated package, is that it doesn't actually distribute TypeScript types itself. We have this admittedly somewhat nasty little pragma that you need here, where you link the import statement to its types with an @deno-types comment pointing at the npm:@types package. Modern npm packages will actually distribute types and reference them in their package.json, so this shouldn't be necessary. Just because Express is so old, you have to give the compiler a hint about how to do this.
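
Put together, the demo code looks roughly like this, including the @deno-types hint for Express's separately published types:

    // @deno-types="npm:@types/express@^4"
    import express from "npm:express";

    const app = express();

    app.get("/", (req, res) => {
      res.send("Hello World");
    });

    app.listen(3000, () => console.log("listening on http://localhost:3000"));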

We're still getting one little squiggly here. Now we have our proper little Express server. In fact, let me make this a little bit better. We've got this --watch flag built into Deno, so let me just add my flags to this deno task. Deno task is something like npm scripts, so I can just do deno task dev, and now it's going to reload the server every time I change any files here. If I want to change that from Hello World to just Hello, it will reload. We've got our little Express server.
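
For reference, the task lives in the deno.json that deno init created; a hypothetical version with this demo's flags might read:

    {
      "tasks": {
        "dev": "deno run --watch --allow-read --allow-env --allow-net main.ts"
      }
    }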

We'll keep working with that example, but just as an interlude, here's some of the tooling built into Deno. The one that I want to point out here is standalone executables. Let's take this little Express server that we've built and try deno compile main.ts. What is this going to do? It's going to take all of the dependencies, including all of the Express and npm dependencies, package them up into a single executable file, and output it, so that we can distribute this single file that is this tiny little Express server. I'm going to do this. Looks like it worked. There's my qcon10 executable. Now, it's still prompting me for permissions. What I actually want to do is provide a bit more: --allow-net --allow-env --allow-read, baking these permissions into the compiled output so that we can run it without flags. Now we've got this little distributable qcon10 file that is an ARM64 macOS executable. You can package this up and send it around; it is hermetic. It will persist into the future. It doesn't depend on any external things. It is not downloading packages from npm. It includes all of the dependencies. What's super cool, though, is that we can do this trick with --target. Target allows you to cross-compile for different platforms. Let's say that we wanted to output qcon10.exe for distribution on Windows. We'll provide a target and output qcon10.exe. There we go: we've got our exe file that should be executable on Windows and contains all of these dependencies, just like the previous one. Similarly, you can target Linux, of course. This is extremely convenient for distributing code. You do not want to distribute a tarball with all of these dependencies. You want to email somebody one file and say, execute this thing.
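
The invocations in this part of the demo look roughly like the following; x86_64-pc-windows-msvc is one of the target triples deno compile accepts:

    deno compile --allow-net --allow-env --allow-read main.ts
    deno compile --allow-net --allow-env --allow-read \
      --target x86_64-pc-windows-msvc --output qcon10.exe main.ts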

Deno Deploy: The Easiest Serverless Platform

Deno is a company. We are also building a commercial product. All of the stuff that I've been showing you is open source, MIT licensed, very free stuff. We take that infrastructure, combine it with some proprietary pieces, and we are building a serverless platform. This is the easiest serverless platform you have ever seen. It is quite robust. It is, first and foremost, serverless, so scaling to zero is quite an important goal for anything that we develop inside Deno Deploy. It, of course, supports npm packages, and Deno software in general. It has built-in storage and compute, which gets really interesting when you start digging into it; those have serverless aspects to them too. It is low latency everywhere. If you have a user in Japan, when you deploy to Deno Deploy, they will get served locally in Japan, rather than having to make a round-the-world hop to your application server. It is focused on JavaScript. Because of that, we can deliver very fast cold start times; we are just starting up a little isolate. This is not a general-purpose serverless system like Lambda. I started this talk by saying that JavaScript has a future unlike other technologies. Maybe semiconductor technology is going to be here 5 years from now, but among these high-level technologies, JavaScript is pretty unique in that we can say it is going to be there in the future. That is why we feel comfortable building a system specifically for JavaScript, in a way we wouldn't for, say, Python. This system is production ready and serving real traffic. For example, Netlify Edge Functions is built on top of Deno Deploy. This is actually supporting quite a lot of load.

Let's take our Express server, deploy it to Deno Deploy, and see what that's like. What I'm going to do is go to the website, dash.deno.com. This is what Deno Deploy looks like. I'm going to create a new project, a new blank project here. It's given me a name, hungry-boar-45. To make it easier to remember, we'll call it qcon10. Now I have to deploy to it. I could go look up the command for this, but I know what it is: deployctl. What I'm going to do is run deployctl deploy, give it the project name, and then give it the main file that it's going to run, main.ts. Before I execute this, let me delete those executables that I output earlier, so it doesn't inadvertently upload them. Let me also give it the --prod flag. This uploads my JavaScript code to Deno Deploy and makes a deployment. Before you know it, you've got this subdomain, which we can curl, and it gives me a Hello response. This maybe seems quite trivial, but this is deployed worldwide in the time it takes to run this command. When you access this subdomain, and if you access it 2 years from now, hopefully this will continue to be persisted. This is running in a serverless way. It costs us essentially nothing to run a site that gets no traffic.
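
The deployment command, approximately:

    deployctl deploy --project=qcon10 --prod main.ts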

Deno KV: Simple Data Storage for JavaScript

To take this example one step further: I mentioned that there is some state involved here. What we're doing right now is just responding with Hello. Let's make this slightly more realistic. Obviously, real websites are going to be accessing databases and whatnot. You can, of course, connect to your Postgres database. You can, of course, connect to MongoDB. What we've added in Deno Deploy is Deno KV, which is a very simple serverless database geared towards application state, specifically in JavaScript. Critically, it is zero configuration to set up; it will just be there out of the box. It does have ACID transactions, so you can use this for real, consistent state. It scales to zero, of course. It's built into Deno Deploy. Under the hood, it's FoundationDB, a very robust industrial database that runs, for example, Snowflake, and iCloud at Apple: a very serious key-value database that provides all of these primitives. We, of course, are not engineering a database from scratch; that would be absolute insanity. What we are trying to do is provide a very simple state mechanism that you can use in JavaScript, for very simple state that you might need to persist. You would be surprised at how useful this actually is. It's a key-value database, and it operates on JavaScript objects. It's important to note that the keys are not strings but JavaScript arrays. You can think of it like a hierarchical directory structure. The elements of those key arrays can be strings, or numbers, or actually byte arrays. The values can be any JavaScript object, not just JSON-serializable ones. You can dump Date objects in there. You can dump BigInts. You can dump Uint8Arrays. It's very transparently nice.

Let's try to add a little bit of state to the system. First of all, I'll open the database: Deno.openKv is the access point for this, and you have to await it, so const kv = await Deno.openKv(). I'm getting a little red squiggly line around this, because it's asking, what is this openKv thing? This stuff is still in beta, behind the unstable flag. I need to go into my settings and enable the unstable APIs in Deno. I just click this thing, and it should go away. What I want to do is set a really simple value here. As I said, the keys are arrays. I'm going to set a key called counter, which is an array with a single string in it. I'm going to set this to the value 0. I have to await this. Then I'm going to get this counter back and get 0, so const v, and I'll just console.log that. Let's run this file again and see if I haven't messed this up, so deno task dev. It's giving me another warning here, openKv is not a function; again, this is because I don't have the unstable flag, so let me add --unstable in here and try again. There we go. Now when I run this file, at the top level, I've stored a key and gotten a value back out of it. That's interesting. Let me just give a name to this key here, and delete this code. Actually, let me save some of this code.
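
The code at this point, roughly:

    // Open the default KV database (behind the --unstable flag at the time).
    const kv = await Deno.openKv();

    await kv.set(["counter"], 0); // keys are arrays, values are JS objects
    const entry = await kv.get(["counter"]);
    console.log(entry.value); // 0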

Let me get this counter out of here and return it in the response of this Express server. I'm going to await this kv.get, and then I'm going to return Hello, maybe I'll say counter, with the value of that counter. I've got a little red squiggly here saying that await expressions are only allowed inside async functions. Got to turn this handler into an async function. That should make it happy. Let's just test it: curl localhost. I'm getting counter is zero. Then the question is, how do I increment this counter? We're going to deploy this to Deno Deploy, where it's going to be propagated all around the world. How are we going to increment that counter? Let me show you. You can do atomic transactions; I mentioned the ACID transactions in KV. You can do kv.atomic() with a sum operation: we're going to increment this counter key that we've got, by 1. Let's see if that works. We'll curl this, and it is not working: we have to commit the transaction. Curl again. It has crashed, because there is a non-U64 value at this key in the database. Let me just use a different key here. Counter 1, counter 2, counter 3: we are now incrementing this value in the database. My claim here is that, although this is just a counter, it can now be deployed to this serverless environment. I'll just run the same command that I did before, this deployctl thing: deployctl deploy --prod --project=qcon10 main.ts.
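
Reconstructed, the incrementing handler looks something like this, assuming the app and kv objects from earlier; note that sum() requires the stored value to be a KvU64, which is why a fresh key was needed:

    app.get("/", async (req, res) => {
      // Atomically increment; sum() stores the result as a Deno.KvU64.
      await kv.atomic().sum(["counter2"], 1n).commit();

      const entry = await kv.get<Deno.KvU64>(["counter2"]);
      res.send(`Hello counter ${entry.value?.value ?? 0n}`); // unwrap the bigint
    });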

My claim here is that, although this is still a very simple server, it is somewhat stateful: every time I access this qcon subdomain, the counter is increasing quite rapidly. This thing is deployed worldwide. If I go into my project page up here on Deno Deploy, there's this KV tab, and inside of it you can see some details. The write region for the KV values is U.S.-East. There are read replicas for this. Actually, the only one that's turned on right now is U.S.-East-4, but we've got two others available. We could turn on the Los Angeles read replica; in fact, let's just do that, it takes a second. This will replicate the data to Los Angeles, so that reads of that value can be served from there. I'll leave it at that. Suffice to say that this is pretty useful in many situations where you need to share state between different isolates running in different regions. I don't think it takes the place of a real relational database. If you have a users table and that sort of stuff, you probably want to manage that with migrations. But there are many situations, session storage in particular, where something like this becomes very valuable.

Deno Queues: Simple At Least Once Delivery

We also have KV queues. This is a simple way of doing at-least-once delivery. It is particularly useful for webhooks, where you don't want to do a bunch of computation inside the webhook handler itself. What you want to do is enqueue something and process it asynchronously, outside the webhook request. I'll just leave the link: https://deno.com/blog/queues.
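
A minimal sketch of the API, from the Deno Queues documentation rather than this talk:

    const kv = await Deno.openKv();

    // Consumer: invoked at least once per message, possibly more than once.
    kv.listenQueue(async (msg) => {
      console.log("processing", msg);
    });

    // Producer: enqueue work (e.g. from a webhook handler) and return fast.
    await kv.enqueue({ type: "webhook", payload: "..." });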

Questions and Answers

Participant 1: Can I get your spicy take on Bun?

Dahl: JavaScript is super important. There is a lot of room for improvement here. Deno started off, as I mentioned, not Node compatible, operating in a world beyond npm, pushing out what is possible. I think the Bun project has created some good competition in that sense. It's just like, no, actually, npm compatibility is very important for a lot of stuff. I think that pushes our work in Deno to be receptive to the needs of npm users. We can't just elbow our way into a developer community and say that you need to use HTTPS imports for everything. It's been very good in that sense. I really like Rust. I'm very confident that a very heavily typed, very complex system like what we are building can be managed quite well in Rust. We continue to develop with urgency, and we will see where this goes in the future.

Participant 2: I'm curious if you've thought about the psychology of the permission system, in the sense that users or developers will just say allow all. How do you deal with that decision fatigue?

Dahl: Of course, when you're developing your own software, you're just going to allow all, because you know what you're running. The JavaScript V8 engine provides a very secure sandbox. I think it's a real loss if you just throw that away and say, no, you have unrestricted access to the operating system, bar nothing. It gets in your way, but I think that's good. Sure, maybe it introduces decision fatigue, and maybe people do allow all. That's their problem. Better to have the choice than not. I think there's work we can do on the ergonomics to make it a bit more usable and user friendly. Generally, we are pretty confident in our secure-by-default design.

Participant 3: On the slide where you were talking about supporting web standard APIs, that makes a ton of sense. I read between the lines that the goal is to allow people to use the same code in the browser and on the server in some cases, which seems like a noble goal. But then your import statements are a little bit different, so they wouldn't work in a browser. I'm curious how those pieces fit together.

Dahl: We are touching the file system. We are creating servers. We're doing all sorts of stuff that is not going to run in the web browser. We're writing in TypeScript in particular, which is not going to run in the web browser. But just because you're doing something differently on server-side JavaScript doesn't mean we need to reinvent the world. I think a big problem in the Node world, for example, has been the HTTP request API. When I made that API, there was no fetch; I just invented it: let's import http and make a request. That was fine until fetch got invented. Then there was this huge gap for many years. These days Node is fixing things; I think in the next release fetch is actually going to be very well supported. Just because you can't run TypeScript in the web browser, or you don't have npm imports there, doesn't mean that we should throw off all compatibility. Think of it as a Venn diagram: we want the overlap to be as large as possible. Yes, of course, it's a server-side JavaScript system; it is not going to do exactly the same things as browsers. But let's not make it two distinct, separate systems. That creates a lot of confusion for users.

Participant 4: I'm wondering if Deno KV would be part of the open source implementation, or is that a commercial product?

Dahl: You saw me running it locally. The question is, what is it doing there? Locally, it's actually using a SQLite database behind the scenes. The purpose of that is testing, so that you can develop your KV applications, check the semantics, and run your unit tests without having to set up some mock KV server. That is all open source. You're free to take it and do what you want, or ignore it entirely. The FoundationDB backing comes in when you deploy to Deno Deploy. There are also ways to connect to the hosted FoundationDB from outside Deno Deploy, for example from Node; that's called KV Connect. The hosted version is a proprietary commercial operation. But again, you're free to ignore it, you can run it locally, and the SQLite version of it is open source.

 


 

Recorded at:

Mar 27, 2024
