In fact, in my opinion, one of the often-overlooked benefits of AI tools is "psychological support". When you're stuck at work, it gives you a push; even if it's not completely right, it's enough to get you moving. The feeling of "no longer fighting alone at work" matters more than many people think.
"Great news, boss! We invented this new tool that allows nontechnical people to write code in English! Now anyone can deploy applications, and we don't have to hire all those expensive developers!"
"Wow, show it to me!"
"OK here it is. We call it COBOL."
Having lived through several overhyped, knee-jerk corporate silver bullets (further amplified by the press), I expect this one will play out about as well as the previous ones.
And by that, I mean corps will make poor decisions that will be negative for thought workers while never really threatening executive compensation.
I see this latest one somewhat like the TFA author does: it's a HUGE opportunity for intelligent, motivated builders. If our jobs are at risk now or have already been lost, then we might as well take this time to build some of the things we've thought about making before but were too busy (or too fatigued) to do.
In the process, we may not only develop nice incomes that are independent of PHB decisions, but some will even build things that these same companies will later want to buy for $$$.
Hi all - I write a lot for myself but typically don't share, hence the stream-of-consciousness style.
But I thought this might be worth blogifying just for the sake of adding some counter-narrative to the doomerism I see a lot regarding the value of software developers. Feel free to tear it apart :)
One thing that every working dev needs to realize, for their benefit, is that the AI gold rush is leading many, many companies to find ways to trumpet their use of AI for no other reason than to please investors. After all, if it's the big new thing, then of course you need to get on the train or get left behind. Investors are very important. They can provide money. Companies need money.
We might think, "Yeah, but so many of these dumb AI corpo-initiatives are doomed to fail!" and that's correct but the success/fail metric is not based on whether the initiatives' advertised purpose is effective. If investors respond positively in the near term, that's success. This is likely why Logitech embedded AI in their mouse software (Check and see if Logi+ AI Agent is in your task manager) https://news.logitech.com/press-releases/news-details/2024/N...
The near term crash (which will happen) in AI stuff will be because of this dynamic. All it means is that phase one of the grift cycle is completing. In the midst of this totally predictable, repeatable process, a dev's job is to get gud at whatever is truly emerging as useful. There are devs who are showing huge productivity gains through thoughtful use of AI. There are apps that leverage AI to do new and exciting things. None of this stuff happens without the human. Be that human!
My take is based purely on the title; I'm in the security space, not a developer, though I did study development during my degree.
I would say that when the fundamentals are easier to learn, it becomes a great time to learn anything. I remember spending so much of my degree's software development work trying to fix bugs and get things explained by trawling through online forums, like many of us have, looking for different ways of having concepts explained to me and how to apply them.
LLMs give us a fairly powerful tool that can act as a sort of tutor: asking questions, getting feedback on code blocks, understanding concepts, figuring out where my code went wrong, and so on. We can ask it all of the dumb questions we used to go trawling for.
But I can't speak to how this translates when you're a more intermediate developer.
Metaphors are fun, and they "feel" meaningful, but… you still need to back them up.
> mechanized farm equipment
Sure, that could be a valid analogy.
Or maybe we invented CAD software for mechanical engineering, where we were making engineering drawings by hand before?
And that doesn't quite ring the same way in terms of obsoleting engineers…
The farming quote is interesting, but the Jevons paradox requires a highly elastic demand curve, and demand for food is inelastic.
The open questions right now are how much demand there is for more software, and where AI capabilities plateau.
I'm not sure if I agree with every aspect of the framing here; specifically, I don't think the efficiency gains are anywhere on par with a combine harvester.
However, I do agree that the premium shifts from mere "coding" ability -- we already had a big look into this with the offshoring wave two decades ago -- to domain expertise, comprehension of the business logic, ability to translate fluidly between different kinds of technical and nontechnical stakeholders, and original problem-solving ability.
Essentially it's the same as it always was: low-code and no-code solutions implemented by non-technical people have always resulted in engineers having to come in behind them to clean up the mess. I've had quite the lucrative career doing just that.
I don't think that now is the best time to learn software development, but I do think that now is the best time to learn computer science.
LLMs are amazing tools, but they don't (yet) replace deep understanding or critical thinking. And yeah, it is a super fun time to be learning/building.
Upwork is already filling up with people who have vibe-coded their way into a pit and need experienced developers to pull them out.
All of this is good reason that orgs _shouldn't_ be laying off developers, but none of it is a reason that they won't/aren't. In any case, I see more "if they're remote why can't they be on the low-wage side of the planet" at the moment, than I do "use AI instead of a developer", although they are no doubt related.
The more awkward truth is that most of what developers have been paid to do in the 21st century was, from the larger perspective, wasted. We mostly spent a lot of developer time in harvesting attention, not in actually making anything truly useful.
> The day we stop valuing human contribution is the day alignment has failed.
Unfortunately, that's many businesses already, even before AI. It's all just one big factory line. Always has been (to those at the top).
Now really feels like a good time to start learning how to code. I used to get completely lost reading documentation, but with Copilot, I just type a few lines and it helps fill in the logic. It feels like having a more experienced person sitting next to me.
That said, I still try to figure out the logic myself first, then let AI help polish or improve it. It is a bit slower, but when something breaks, at least I know why.
AI has definitely lowered the barrier. But whether you can actually walk through the door still depends on you.
Here are my 2¢ on using AI at work. I’m using Claude code and my typical tmux/neovim setup.
1. I use AI to find my way in a sprawling micro(service|frontend) system that I am new to. This helps cut down massively on the “I know what to do, I just can’t figure out where.” I started a new job where everyone has years of context as to how things fit together and I have none. I feel strongly that I need to give an honest effort at finding things on my own before asking for help, and AI certainly helps there.
2. Anything I stumble on in a dev/deployment process that leans too heavily on "good behavior/hygiene," I try to automate immediately for myself and then clean up to share with the team. In the past, I might have tried to adopt the common practice, but now it's less effort to simply automate it away.
3. There is value in using AI in the same manner as I use vim macros: I use the planning mode heavily and iterate like crazy until I’m satisfied with the flow. If the task has a lot of repetition, I typically do the first one myself then let the AI take a whack at one or two. If I don’t like the process, I update the plan. Once I see things going smoothly, I give the AI the ok to finish the rest (making atomic commits so that it’s not just one big ball of wax). This is pretty similar to how I record macros (make one change yourself, record the macro on the next line, test it out for a line or 2, re-record if necessary, test again, plow through the rest).
4. When I come across something that needs to be fixed/could be improved but isn’t related to my task at hand, I do a few minutes of research and planning with the AI, and instead of coding a solution, we create a todo document or an issue in a tracking system. This wasn’t happening before because of the context switching required to write good documentation for later. Now it’s more of the same thing but akin to a dry run of a script.
5. I can quickly generate clear, easy-to-read reports so teammates can give me feedback on work in flight. Think of a doc with before-and-after screenshots of changes throughout an app, produced by a Playwright script and a report generator that I can rerun in under a minute whenever I want.
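For the curious, here's a rough sketch of what that kind of script can look like (Python, since Playwright ships an official Python API). The routes, viewport, and output paths are made up for illustration; the real thing is tailored to our app, but the shape is the same:

    # screenshot_report.py -- capture before/after screenshots and stitch a report
    from pathlib import Path
    from playwright.sync_api import sync_playwright

    PAGES = {  # hypothetical routes to shoot; swap in your own
        "dashboard": "http://localhost:3000/dashboard",
        "settings": "http://localhost:3000/settings",
    }

    def capture(label: str, out_dir: str = "screens") -> None:
        """Screenshot each page into screens/<label>/ (label = 'before' or 'after')."""
        out = Path(out_dir) / label
        out.mkdir(parents=True, exist_ok=True)
        with sync_playwright() as p:
            browser = p.chromium.launch()
            page = browser.new_page(viewport={"width": 1280, "height": 800})
            for name, url in PAGES.items():
                page.goto(url, wait_until="networkidle")
                page.screenshot(path=str(out / f"{name}.png"), full_page=True)
            browser.close()

    def write_report(md_path: str = "report.md") -> None:
        """Emit a Markdown doc showing before/after images side by side."""
        lines = ["# UI changes in flight\n\n"]
        for name in PAGES:
            lines.append(f"## {name}\n\n")
            lines.append("| before | after |\n|---|---|\n")
            lines.append(
                f"| ![before](screens/before/{name}.png) "
                f"| ![after](screens/after/{name}.png) |\n\n"
            )
        Path(md_path).write_text("".join(lines))

    if __name__ == "__main__":
        capture("after")  # run capture("before") on the main branch first
        write_report()

Run the capture once on the old build and once on the branch, regenerate the report, and drop the doc wherever feedback happens.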
I'm finding that I really enjoy skipping the tedious stuff, and I'm also producing higher-quality work because I have more bandwidth. It helps me collaborate more with my non-dev peers because it lowers the barrier to sharing.
Important to note that in my experimenting, I haven’t had great luck with winding it up and setting it loose on a task. Too often it felt like being a junior engineer again, doomed to throw spaghetti at the wall. Once I started using AI as an assistant, I felt things really started to click. Software development is about writing code, but it’s about a lot of other things too. It’s nice when the AI can help write code, but it’s fantastic when it helps you accomplish the other things.
This article seems to ignore the 6% sector unemployment, massive layoffs, and the terrible interview processes.
It's like "we invented Fortran so there will be no need for so many developers"
> And like mechanized farm equipment, LLMs are cheap, plentiful, getting smaller every day, and- most importantly- require no training to operate.
I... assume that was meant sarcastically, but it's not at all clear from context, I think.
> What do you do while awaiting the agents writing your code?
I browse the web. Eventually, I review the agent code and more often than not, I rewrite it.
> and historical romance novels will rightly remember us as rugged and sexy.
Damn straight we are.
This call to arms reminds me of https://www.braveclojure.com/, which was also a definite inspiration for me.
>> "ChadGPT"
There actually is a ChadGPT, but I assume the OP meant ChatGPT.
Reading this reminded me how much the learning curve is flattening. You can now learn by doing and debugging AI output. That’s a very different entry point from five years ago. Less lonely, more interactive.
I learned programming in 2013. I was asking questions on Stack Overflow constantly while learning; people there were super friendly and supportive and answered my questions. SO was pivotal for me learning so fast. I've been a game programmer ever since. This year I learned web dev and made my first commercial web app. SO is totally dead now, though, utterly useless these days, but I used ChatGPT to fill the same role and it worked great. It's a shame about SO.
The management at my corporate job literally say in our town halls that they expect AI to increase productivity and reduce costs. Makes logical sense to me; the glory days of high wages are over.
If you’re going to use LLMs to learn software development, great! Welcome!
Just, don’t skip out on learning the fundamentals. There’s no royal road to knowledge and skill. No shortcuts. No speed running, downloading kung fu, no passing go.
Why?
Because the only thing LLMs do is hallucinate. Often what they generate is what you’re looking for. It’s the right answer!
But if you don't know what an L1 cache is or how to lay out data for SIMD, no amount of yelling at the bot is going to fix the poor performance, the security errors, or the logic errors. If you don't know what to ask, you won't know what you're looking at. And you won't know how to fix it.
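To make that concrete, here's a minimal sketch of what "laying out data" even means, using NumPy as a stand-in (Python doesn't expose SIMD directly, so the point here is the memory layout, not the library):

    import numpy as np

    n = 1_000_000

    # "Array of structs": a million tiny Python dicts scattered around the heap.
    # Every access chases a pointer, so the cache and vector units mostly sit idle.
    points = [{"x": float(i), "y": float(i)} for i in range(n)]
    dot_slow = sum(p["x"] * p["y"] for p in points)

    # "Struct of arrays": two contiguous float64 buffers. The same math streams
    # through cache lines sequentially, and the vectorized kernel can chew on it.
    xs = np.arange(n, dtype=np.float64)
    ys = np.arange(n, dtype=np.float64)
    dot_fast = float(np.dot(xs, ys))

    # Same answer (up to float rounding), wildly different memory behavior.
    assert abs(dot_fast - dot_slow) < 1e-6 * dot_fast

An LLM will happily generate either version; knowing why the second one is fast is the fundamentals part.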
So just remember to learn the fundamentals while you’re out there herding the combine space harvesters… or whatever it is kids do these days.
Engineering problems are human problems. For now, I guess? I saw a misinfo headline recently to the tune of "AI is after all the water in X country," and I thought about the Anthropic paper where the model blackmailed the engineer. And then the Matrix. Cry, lol?
> and now with far greater reach and speed than ever before
I heard that before: Borland Delphi, Microsoft FrontPage, Macromedia Flash, and so on. I learned how, in five years or so, these new technologies would dominate everything.
Then I learned that two scenarios exist. One of them is "being replaced by a tool", the other is "being orphaned by a tool". You need to be prepared for both.
Now might be the last time to learn software development.
“The only weights I use are at the gym”
Lol
The hardest part is debugging.
>Now might be the best time to learn software development
Always has been.
Nice article! Reflects my views as well!
> LLMs really are like combine harvesters; allowing one to do the work of many.
Heck, I'm so tired of statements like this. The work of many who? It's already a lot for an LLM to automate or help with the boring/tedious parts of my job; I have yet to see it take over the work of 2, 5, or 10 of my colleagues. Just knowing the awful lot these tireless folks do, I couldn't ever imagine doing their jobs on top of mine. IMO such statements have a very short shelf life.