There is a lot of speculation, some fear, some excitement about the role of generative AI in software development. I haven't been writing a ton of code lately, but I recently finished the demo code for my Code Camp talk this year (still need to do the deck). With that in mind, and my GitHub Copilot subscription that I keep paying ten bucks a month for but don't use much, I started to ask it some questions. The talk is about using code to manipulate moving lights. The basics are easy: it's just changing numbers in an array. But what if I wanted to do cross fades between lights? What if I wanted them to be log or exponential instead of linear? What if I wanted to spread the fade evenly across 10 lights?
It knew how to answer every single question. Then it started suggesting questions that led to better answers.
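Questions like those boil down to fade-curve math. Here's a minimal sketch of that math in Python, assuming DMX-style 0–255 intensity values; the function names and the even-spread approach are my own illustration, not Copilot's actual answers or the talk's demo code:

```python
def linear_fade(start, end, t):
    """Intensity at progress t (0.0 to 1.0) along a straight line."""
    return start + (end - start) * t

def exponential_fade(start, end, t, power=2.0):
    """Ease-in curve: slow start, fast finish. power=1.0 is linear."""
    return start + (end - start) * (t ** power)

def crossfade(level_a, level_b, t):
    """Fade light A down to zero while light B comes up from zero."""
    return linear_fade(level_a, 0, t), linear_fade(0, level_b, t)

def spread_fade(num_lights, t):
    """Spread one fade evenly across num_lights: each light gets its
    own slice of the overall progress, so they come up in sequence."""
    levels = []
    for i in range(num_lights):
        # Map overall progress t into this light's local 0-1 window.
        local = (t * num_lights) - i
        levels.append(max(0.0, min(1.0, local)))
    return levels
```

So `crossfade(255, 255, 0.5)` has both lights meeting at half intensity, and `spread_fade(10, t)` walks a fade down a row of ten fixtures as `t` goes from 0 to 1.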
I couldn't believe it. That's math, and it would have taken me a long time to figure that stuff out on my own. I've used AI before to generate markup and style sheets to get layouts I described with words, but this was something else.
So what does this actually mean? Well, there's context I would add. First off, most people don't write a lot of code that requires algorithms. If I had to guess, most coding is pretty boring and routine. More to the point, it's usually about composition, making different blocks work together for some outcome. This stuff is super easy for AI to do, and I've done it quite a bit in what little experience I have with it. At the moment, that's where the real value is.
The bigger question is usually about how much code it can write to reach those bigger outcomes. That's not a thing yet, but even if it could, it depends entirely on the human input. There are always rules that govern the outcomes, and you have to be explicit about as many of those as possible. So yes, it could get to a point where product managers could write enough requirements to get the outcomes, but then they also have to figure out where the edge cases are, and test to make sure stuff still works.
I think that long before anyone will be "replaced," people will be needed to learn the best way to use the tools, and that will still be a special and well developed skillset. I would also assume that there will be some kind of regulation, or at least an ethical standard, about how the machines write and deploy code, because, you know, Terminator. The tech may move fast, but as we know, humans do not. We have the technology to end poverty, racism and such, but we're not there.
I'm kind of excited, though, about using this more to try things that I may not otherwise be confident enough to try.