I've been on something of a streak committing stuff to my open source project, POP Forums. This app has been OSS for 15 years now, through several rewrites, countless improvements and deployments to CoasterBuzz and (less frequently) PointBuzz. Now that I'm back in a job where I'm not writing code, permanently I assume, it's important to me to stay in it, if not for street cred, then simply to engage in a creative endeavor that's relevant to my job.
In the years of neglect, one of the things that bothered me about the app was that I never really made it into something that could scale out. For non-nerds, scaling out means running the app on many "servers" (in quotes because it's all virtualized these days), so when your browser talks to it, it can be a different server every time. This is good for handling load, certainly, but it's also nice just to have that redundancy.
That gets to the point of this post: it's so flipping easy to do this these days. Logically, the app has to do a few things. It has to respond to requests from web browsers, it has to do stuff in the background (like index a thread for searching), and it has to persist data somewhere, like a database. To make it faster, I've been storing data in memory when it doesn't have to change often. No point in going to a database for that. The problem is that if you're running it on a multi-node arrangement, you can't refer to stuff in memory, because there are a bunch of servers and they don't all know what's current. So you have to use a separate thing, a cache, to keep that stuff, and every server uses that instead of its own memory.
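To make that concrete, here's a minimal sketch of the cache-aside pattern described above. Everything here is illustrative, not POP Forums code: the `SharedCache` class is a hypothetical stand-in for a distributed cache like Redis that every web node talks to, and `get_forum_title`/`load_from_db` are made-up names for the read path.

```python
# Sketch of cache-aside: check the shared cache first, fall back to the
# database on a miss, then repopulate the cache so every node benefits.
# SharedCache stands in for a real distributed cache (e.g. Redis).
import time


class SharedCache:
    """Stands in for a distributed cache shared by all web nodes."""

    def __init__(self):
        self._store = {}  # key -> (value, expiration time)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() > expires_at:
            del self._store[key]  # expired, treat as a miss
            return None
        return value

    def set(self, key, value, ttl_seconds=60):
        self._store[key] = (value, time.monotonic() + ttl_seconds)


def get_forum_title(cache, forum_id, load_from_db):
    """Cache-aside read: any node that misses reloads from the database
    and writes the result back to the shared cache for all nodes."""
    key = f"forum:{forum_id}:title"
    title = cache.get(key)
    if title is None:
        title = load_from_db(forum_id)  # the slow path, hits the database
        cache.set(key, title, ttl_seconds=300)
    return title
```

Because the cache lives outside any one server's memory, it doesn't matter which node handles a given request; a miss on any of them refreshes the data for all of them.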
You'll also want to use some kind of third-party entity to index and search all of your stuff. There are lots of choices for that in a cloud world, like ElasticSearch running on all kinds of stuff (like AWS), or Azure Search. Then you'll need something to run all of that background stuff outside of your web serving, because you don't want multiple copies of that running. This is that "serverless" thing that's all the rage, like Azure Functions or AWS Lambdas.
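The hand-off between the web app and the background work can be sketched in a few lines. This is an assumption-laden illustration, not the app's actual code: `queue.Queue` stands in for a durable cloud queue (like Azure Storage queues), `search_index` stands in for a search service, and the function names are invented.

```python
# Sketch of queue-driven background work: the web app only enqueues a
# message; a separate worker (a Function/Lambda in the cloud) picks it
# up later, so indexing never runs inside the web request.
import queue

work_queue = queue.Queue()  # stand-in for a durable cloud queue service
search_index = {}           # stand-in for ElasticSearch / Azure Search


def enqueue_index_request(thread_id):
    """The web app does only this much, then returns to the browser."""
    work_queue.put({"job": "index", "thread_id": thread_id})


def run_worker_once():
    """One pass of the background worker. In the cloud, a queue-triggered
    function would be invoked automatically when a message arrives."""
    message = work_queue.get_nowait()
    if message["job"] == "index":
        # Pretend to fetch the thread's text and index it for search.
        search_index[message["thread_id"]] = "indexed"
```

The important property is that only the worker consumes the queue, so you get exactly the "don't run multiple copies of the background stuff" behavior, no matter how many web nodes are enqueuing.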
If it weren't enough that you can provision all of this stuff with a few clicks (and automate the provisioning), you can also run it all locally with emulators and Docker containers. Specifically, I'm able to run locally with the web server associated with .NET Core, Docker containers for Redis caching and ElasticSearch, and the Azure emulators for Functions and storage (for queues). It's like magic, and it all just runs and works together. When I commit code, there are free mechanisms to make it all build and deploy into the cloud and be running a few minutes later.
This is not a recent revelation, mind you, but when I think about how hard and expensive it was to do this back in the day, it's crazy. If you're starting something up, you can run all of this for less than $200 per month, when you used to have to spend thousands, before you even had a single customer. And more to the point, you can run it all on a single laptop by running a few command line entries to spin up the stuff virtually. You don't even need to install stuff anymore.
The evolution of software development in recent years reinforces my view of what the job is really about for most people: composition. We'll always need people who are really good at writing algorithms and managing memory, but I suspect that the overwhelming majority of the work these days involves composing solutions. The best people in most jobs are those who write the best glue.