Greetings, dear community!
Inspired by how helpful this topic from last year seemed to be for so many, I decided to create another one for 2025.
I think we all know what the job market has been like for self-taught developers over the last couple of years: pretty much gruesome, with occasional unexpected strokes of luck just enough to keep us from giving up. At the end of the day, nothing’s going to replace years of hard work and dedication; all we really want is some reassurance that our labor isn’t in vain. So, what does 2025 look like? Is it pretty much the same as last year, or are we seeing some changes?
It seems to me, and I could be completely mistaken, that a new skill slowly but surely becoming essential for employed programmers is leveraging AI to increase productivity. Companies know AI is still bad at even straightforward software development, but they’re also aware of how much more productive each developer could be if human problem-solving skills were combined with the impressive data-processing capabilities of AI systems. What do you think of this? My view is that this is only something for advanced programmers to consider, since AI could become a crutch for beginner developers if they never learn to code without it. Do you think it’s worthwhile for intermediate or advanced developers to learn how to incorporate AI into their workflow to increase productivity?
I would like to be clear: I don’t mean using AI to write code for you, but rather to speed up mundane tasks such as debugging.
Personally, I dislike all forms of AI in programming (what aspiring developer doesn’t?) and still believe in the good old, 100% human way of building software, with all the difficulties and delays that come with it. Still, one must adapt to a changing world if they intend to find employment with their skills. I would love to hear thoughts on this from novices and experts alike.
Many thanks for taking the time to read this overly long topic description!
Keep codin’, keep goin’,
Nicolas, 18