Five Reasons to Start Working in the Cloud

Key Takeaways

  • A cloud IDE workflow offers excellent convenience: it is accessible from any device and any location.
  • When configured correctly, it is arguably more secure than a local workflow as no code or tools are stored locally.
  • Onboarding speed increases dramatically, as complete workstations can be automated and stood up with a single command.
  • Local operating systems become irrelevant, allowing users to use the hardware and operating systems that best suit their workflow.
  • Cloud-based development opens up a world of flexibility by abstracting your engineering environment from all local resources and dependencies.

With the recent announcement of products such as Visual Studio Codespaces and GitHub Codespaces, it is clear there is a demand for cloud-based engineering workflows. While working on a local machine may feel safe and familiar, the cloud IDE workflow offers users significant benefits such as power and flexibility.

For several months, I have been using self-hosted cloud IDEs on a range of public cloud providers. I have been doing so via the open-source version of Code Server from the talented folks over at Coder. Working in the cloud is now my primary workflow for my personal and professional work as a DevOps engineer. With a remotely accessible instance of VSCode and its built-in terminal, all I need is a web browser to get to work.

Whether self-hosted or managed, a cloud IDE may be precisely what you or your company needs to boost productivity to the next level. This article covers five reasons why.

One: Convenience

Working in the cloud is not a new concept. We've all been using products like Google Drive, Dropbox, iCloud and Office 365 for the last decade or so. The convenience of storing your work in the cloud and being able to come back and pick up right where you left off is a game-changer.

With cloud IDEs hitting the scene, we engineers can now enjoy the same convenience and productivity. No longer am I tied down to local hardware, forever shuffling around keys, juggling different versions of tooling and languages, and trying to achieve compatibility between my Mac and Linux machines.

With my remote cloud IDE, I can work from any device and location as long as I have an internet connection and browser. While a required internet connection may sound like a downside, in today's world of distributed teams, pair programming and CI/CD workflows, an internet connection is generally already a must in most scenarios.

Offloading my workflow to a remote instance also means the associated load is no longer a burden on my local machine. Lowering the CPU, memory and hard drive load significantly improves performance, thermals, and battery life on my laptop.

Working in the browser has other convenient benefits, the main one being an isolated work environment. For instance, I use Firefox as a personal browser and Chrome as a work browser, so I can now contain my entire work environment within a single browser. Doing so provides an exceptional level of convenience, but also touches on my second point: security.

Two: Security

As previously mentioned, working in the cloud means my work environment and its related resources remain isolated in a browser profile. Containing my work in this way means no tools, keys or code live on my local filesystem. I'm sure you've heard the "you must encrypt your drive in case of a stolen system" lecture during onboarding. While I still encrypt all my drives (and recommend you do too), my local hardware is now substantially less interesting to a would-be attacker.

Having my workload execute from a remote network also means my local network holds little value if compromised. All keys and permissions to execute workloads are locked down and authenticated from my remote instance, not my local network and computer terminals.

How the network stack is configured and locked down is a vital part of maintaining optimal security while working in the cloud. There are a few different ways to go about this.

The simplest way to securely access a cloud IDE, with no extra configuration required, is to forward the local port of the IDE instance through an SSH tunnel. That way, it is never exposed to the public internet, and all work is conducted through an encrypted tunnel. While straightforward and secure, this approach requires the correct SSH keys and an SSH client on every device.
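As a minimal sketch, the port-forward looks like this (the hostname is a placeholder, and code-server's default port of 8080 is assumed):

```shell
# Forward local port 8080 through an encrypted SSH tunnel to the IDE's port
# on the remote instance. -N means "no remote command, forwarding only".
# Run on the local machine (my-ide.example.com is a placeholder):
#
#   ssh -N -L 8080:localhost:8080 dev@my-ide.example.com
#
# The IDE is then reachable at http://localhost:8080, while the remote port
# stays off the public internet. ssh -G resolves the configuration without
# connecting, which is a handy way to sanity-check the forwarding spec:
ssh -G -L 8080:localhost:8080 dev@my-ide.example.com | grep -i localforward
```

The same pattern works for any port the IDE happens to listen on; only the forwarding spec changes.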

As mentioned in part one, a key benefit of a cloud IDE is the ability to access it from any device. To achieve this, I do need to expose a web server via the public internet.

The easiest way to accomplish this is with something like Caddy, a simple web server with automatic HTTPS via Let's Encrypt. In a matter of minutes, I can have a cloud IDE served securely over TLS.
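A minimal sketch of that setup, assuming code-server is listening on localhost:8080 and using a placeholder domain; Caddy provisions and renews the certificate automatically:

```shell
# Write a minimal Caddyfile (ide.example.com is a placeholder domain).
cat > Caddyfile <<'EOF'
ide.example.com {
    reverse_proxy localhost:8080
}
EOF

# Then, on the instance (requires ports 80/443 reachable and a DNS record
# pointing the domain at this host), start Caddy with:
#
#   caddy run --config Caddyfile
```

With just those two lines of configuration, Caddy terminates TLS at the edge and proxies traffic to the IDE.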

I, however, choose to house my remote instances in private networks. Traffic is then routed to my IDE instance port via a TLS encrypted load balancer at the edge. This keeps the instance itself off the internet.

Whether choosing to route directly to the instance or via a load balancer, I now have a password-protected cloud IDE running over TLS.

Basic authentication is still a little lacklustre by today's standards. To bring security up to an acceptable level, I recommend implementing MFA. The best way to do so is with a reverse proxy that supports it, such as Cloudflare Access or OAuth2 Proxy.
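A sketch of the OAuth2 Proxy route: every request to the IDE must first pass an OAuth/OIDC login, and MFA is then enforced by the identity provider. The client ID and secret are placeholders obtained from your OAuth provider:

```shell
# Front the IDE with oauth2-proxy (invocation shown for reference; the
# client ID/secret placeholders come from your OAuth provider):
#
#   oauth2-proxy \
#     --provider=github \
#     --upstream=http://localhost:8080 \
#     --http-address=0.0.0.0:4180 \
#     --email-domain='*' \
#     --client-id="$OAUTH_CLIENT_ID" \
#     --client-secret="$OAUTH_CLIENT_SECRET" \
#     --cookie-secret="$COOKIE_SECRET"
#
# Generate a 32-byte, URL-safe cookie secret for the session cookie:
COOKIE_SECRET=$(openssl rand -base64 32 | tr -- '+/' '-_')
echo "$COOKIE_SECRET"
```

The TLS-terminating web server or load balancer then points at the proxy (port 4180 here) instead of the IDE directly.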

Now I have a universally browser-accessible cloud IDE workstation with TLS, password, and MFA protection.

Implementing these security procedures only applies when self-hosting the cloud IDE, of course. Managed services, such as Visual Studio Codespaces, generally take care of the security measures mentioned above, with excellent quality assurance.

Another notable security benefit is the ability to have different instances for different work environments. Separate instances keep the security measures for each work environment isolated. Gone are the days of holding several sets of keys and access rights on a single machine, creating a blast radius that could take out multiple systems and networks if compromised.

One last point on security: regardless of whether you are using a self-hosted or managed service, make sure the underlying platform meets your security compliance requirements. Certifications like SOC 1 Type II, SOC 2 Type II and, in particular, ISO/IEC 27001:2013 are paramount for security assurance. Equally important, though, is that your chosen public cloud provider or managed service is on your company's trusted vendor list. You don't want to wake up to a message in your inbox from a security architect!

Three: Speed

A major constraint in delivering value when starting at a new company is the onboarding process. "What access do I need? What specific tools, languages and versions does my role require? Are these versions up-to-date with what my colleagues are currently using?"

Getting yourself into a position to deliver value for the company can take days or weeks. Even then, tooling and versioning are still subject to change amongst individual contributors.

This is where the speed of delivery and productivity comes into play with cloud workflows. With tools like Terraform and Packer, the whole stack from the underlying infrastructure platform, to the tooling and access requirements on the instance itself, can be standardized and automated.

Imagine a scenario in which there are three different teams for a particular product: an ops team, backend team, and frontend team. Each has requirements for tooling, languages, versioning and access for their respective layer of the product.

In a cloud workflow, we can have three ready-to-go workstation images on the shelf for each team. We treat these images like products, with owners and maintainers.

When a new team member joins, let's say the backend team, they now run a single command to spin up a new cloud IDE instance from the "backend team" image. In a matter of minutes, they are delivering value with a fully functional VSCode instance running on a secure VPS loaded with the correct tools, versions and access they require.
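As a hypothetical sketch of that single command: a thin wrapper maps a team name to its pre-baked image and calls the cloud provider's CLI (DigitalOcean's doctl here; the wrapper name, image naming convention, sizing and region are all assumptions):

```shell
# Map a team name to its Packer-built custom image and create a workstation
# from it. Image names like "backend-team-ide" are placeholders.
new_workstation() {
  team="$1"
  image="${team}-team-ide"
  doctl compute droplet create "${USER}-${team}-ide" \
    --image "$image" \
    --size s-4vcpu-8gb \
    --region sgp1 \
    --wait
}

# A new backend engineer would then run:
#
#   new_workstation backend
```

The same wrapper serves the ops and frontend teams; only the image name differs, and the image itself carries the team's tooling, versions and access.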

From a team perspective, having a centralized image that all cloud workstations build from drastically decreases non-value add work. It does so by removing drift between tooling and versioning across workstations. When a team decides it's time to upgrade to a new version of Go, for instance, the maintainers of that team's image update the version, and all associated instances upgrade from the new image release. Now we have an opinionated workflow in which everyone is working with the same tools and versions.

There is also a drastic increase in terms of technical delivery speed. Rather than working from a domestic home or office internet connection, the workload is running in a high-speed data center. That Docker image that used to take two minutes to pull down now takes 15 seconds.

The required speed to work in the cloud is not high. I've worked over a 10 Mbps connection from Sydney, Australia, on a cloud IDE housed in a DigitalOcean data center in Singapore, and it responded as if it were a native app running on my local machine.

Four: Operating Systems

A point of contention for some time now is that much of the tech industry supplies and works on Macs, while the systems we build generally run on Linux. I understand the appeal of the Apple ecosystem: a standardized set of hardware and software with excellent support. However, it does create an engineering dilemma.

While some may argue that BSD and GNU are not that different, the truth is, they are different enough. To accurately test against Linux from my Mac, I need to run VMs, containers, or CI/CD builds.

Even when running containers on a Mac, I am still chewing up extra resources running a Linux VM under the hood to use Docker, because the Docker daemon interacts directly with the Linux kernel to create its namespaces and control groups. What I'm getting at is that engineering work generally makes more technical sense on a Linux base.

With a cloud IDE workflow, my local operating system becomes irrelevant. I can still use my company-supplied Mac along with all the excellent productivity apps like Mail, Calendar, Slack, and of course, Spotify. At the same time, my engineering workstation, IDE included, is a high-powered Linux-based VPS running in a browser tab. For that real native zen mode, I like to give the IDE its own workspace and go full screen!

The above is, of course, applicable to Windows as well. I'm not judging…

Five: Flexibility

My last point, and also a recap, is flexibility. Ultimately, what cloud IDE workflows provide is flexibility and freedom.

I can now work from Mac, Windows, Linux, iOS, Android; it doesn't matter. The tools I need to get my engineering work done are now on a remote instance and accessible from any location or device in the world. Yes, I have tested this from my Google Pixel 4!

Self-hosted workstations do initially involve a little more legwork to get automated and stood up, but as a result, I have fine-grained control over exactly what I need. I can achieve vertical elasticity based on the resource load. I can (and do) mount my home directories to separate immutable drives, allowing me to blow away, recreate or change my underlying instance. I can choose whether to run my cloud IDE as a container or native daemon. Plus, I can move to different public cloud providers with ease.
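The immutable home-drive trick can be sketched like so (the device path, filesystem and mount point are assumptions): after a one-time format and mount of an attached block-storage volume, an fstab entry keeps /home independent of the instance itself:

```shell
# One-time setup on the instance, run as root (/dev/sdb is a placeholder
# for the attached volume):
#
#   mkfs.ext4 /dev/sdb
#   mount /dev/sdb /home
#
# Persist the mount across reboots and rebuilds; "nofail" lets the instance
# boot even if the volume is temporarily detached:
fstab_entry='/dev/sdb  /home  ext4  defaults,nofail  0  2'
echo "$fstab_entry"   # append this line to /etc/fstab on the instance
```

Because the home directory lives on its own volume, the instance underneath can be destroyed, resized or replaced without losing any work.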

If all you need is a remote IDE to write and test code, then the several managed services out there are an easy way to work remotely and securely. They have the benefits of freedom and flexibility, without the extra overhead.

Whether you choose to go down the self-hosted or managed service path, there is a whole new world of power and flexibility that a cloud workflow can provide.

Spending days setting up new machines and backing up hard drives is a thing of the past. We live in a world where, with the press of a button, I can stream a movie in 4K from a catalog of thousands in a data center hundreds of miles away. With technology like this, it was only a matter of time before we started working in the cloud and not just on the cloud. I am happy to say that day is here, and it's damn impressive!

About the Author

Ben Vilnis grew up around computers from a young age as his father opened a computer store in the ’90s which then grew into an ISP and hosting company in the early 2000s. He very quickly found an affinity for computer technology and started building computers and websites during high school. In his senior year, he discovered his passion for Linux, system administration and networking. Skip forward several years, and Ben now works at Envato as a DevOps engineer and has a strong passion for composable infrastructure, Lean IT, and Platform-as-a-Product. Ben is also a lifelong musician and singer/songwriter and serves as a volunteer firefighter in the New South Wales Rural Fire Brigade.
