I'm worried about AI making learning to code obsolete

Have you guys heard of GPT-3? Check out this link to see what it can do.

In this example, the AI creates a complete React app from a plain-English sentence, in this case something like "enter a todo item and save it, and show an input button".
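For context on how small that demo app really is, here's a hand-written plain-JavaScript sketch of roughly the same logic (the function names are my own invention, not the demo's actual output):

```javascript
// Minimal version of the one-sentence todo demo:
// "enter a todo item and save it, and show an input button".
function addTodo(todos, text) {
  // The "save" step: return a new list with the item appended.
  return todos.concat([text]);
}

function renderTodos(todos) {
  // The demo renders a React list; here we just build a numbered string.
  return todos.map((t, i) => `${i + 1}. ${t}`).join("\n");
}

let todos = [];
todos = addTodo(todos, "buy milk");
todos = addTodo(todos, "walk the dog");
console.log(renderTodos(todos));
```

The point being: the entire app fits in a dozen lines, which is part of why a one-sentence prompt can produce it.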

Here is the page of the AI that can do this, along with other examples of its capabilities.

Some of the applications are just insane: it can write Amazon reviews with a positive or negative sentiment, and it was used to create a mental health platform for teens where the AI can pick up on cues that a user is in danger of self-harm, suicide, etc. Just incredible stuff.

Now, I'm learning programming and just finished my third cert here at freeCodeCamp, and seeing this is a little discouraging to me. To any of the more experienced programmers: what do you think about this? Will coding become obsolete soon? Something about seeing a concept that took me weeks to learn being done in a second with just a plain-English sentence gives me some insecurities, if I'm being frank.


AI applications are created by programmers.

Technological progress will make some specific types of programming less relevant over time, but programming as a discipline is becoming more ubiquitous.


Yep. AI is not magic. Someone needs to program it. And there are still lots of things that AI really sucks at.


Yes, I agree. What do you think about my main question, though: its effect on learning programming? Maybe you don't know enough to answer; I clearly don't.

It makes me think that after doing these freeCodeCamp certs I'll start learning some AI programming as well.


It is worth noting that learning actual AI is rather hard. It requires lots of stats. Learning to use AI is easier, but it still requires statistical understanding so you can avoid using it incorrectly.

This is true; sometimes the AI will return something not to the client's specs due to "ambiguity." This is GPT-3, an exponential increase over previous iterations. I saw a graph somewhere, in the last "Two Minute Papers" YouTube video if you know it: the neural network is expected to reach a "human" level of neural connections soon. :o Crazy.

Cold Fusion, flying cars, and robot maids have been ‘coming soon’ for decades. I tend to worry about now now and worry about soon soon. Something is always “just about to make it big”.


I get what you're trying to say, but if I type "create a todo app" and I get a functioning app, that's not some utopian future thing that may or may not be on the verge of happening. It's happening now, and it's a legit long-term worry for any current or would-be future self-taught programmer, imho.


If you want to only code simple applications that can be described in single sentences, sure. Professional programmers don’t get paid to write simple applications that can be described in single sentences, as a general rule. The complexity gap between a ToDo app and making freeCodeCamp’s full website is massive.


I suggest taking a look at some of the conversations here about whether services like Wix will make web development code-free. Basically, it’s fine that a program can create a todo app, because you won’t be building a todo app. The nature of programming is that things that can be automated will be and should be automated. Most of what I do are things that people don’t know exactly how to do yet, let alone know how to write a script to do.


Of course not; my example was clearly pointing out what the current capabilities are with an input of only one sentence. Of course programming worthwhile apps would require sustained effort beyond what an AI could presently spit out from a simple plain-English description. I lack real working experience in this field, which is maybe why this worries me a bit. I guess I'll just wait and see how things progress. I agree with you 100%, by the way.

Just today, during one of my various quantum chemistry courses, there were mentions of the evolution of computational tools.
There are massive calculations involved, and a chemist will be needed for a few more years to make all the necessary approximations based on empirical experience, so that those calculations can run in human time (maybe a couple of days max) and give usable results.

It's not yet time for everything to be automated. Programmers (like the people building those tools) will be needed for some time yet.

Anyway, programming interfaces that interpret natural language exist already, but you still need a programmer to actually put in all the edge cases and stuff, and write a good paragraph or two pointing those out.
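To make that edge-case point concrete, here's a hedged sketch (my own illustration, not output from any actual natural-language tool) of the decisions a one-sentence spec like "save a todo item" never mentions:

```javascript
// "Save a todo item" sounds simple, but a programmer still has to
// decide the edge cases the one-sentence spec leaves out.
function addTodo(todos, text) {
  if (typeof text !== "string") {
    throw new TypeError("todo must be a string");
  }
  const trimmed = text.trim();
  if (trimmed === "") {
    return todos; // ignore empty input instead of saving a blank item
  }
  if (todos.includes(trimmed)) {
    return todos; // skip duplicates -- or should they be allowed? A human decides.
  }
  return todos.concat([trimmed]);
}
```

Each of those `if` branches is a judgment call that has to come from a person, because the plain-English sentence doesn't specify it.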

Waiting to see how it progresses is always the best bet with fancy new technology. It can be fun to chase the hottest new thing, but it's always a risk to do so. Those technologies rarely grow as fast as people say they will.

Specifically for AI, it's important to realize that it's based on statistics. Statistics are strongest when you remain inside the bounds of known data. This means it's extremely difficult for AI to do novel things. (A great example of this is some of the work on getting AIs to write stories.)
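You can see the "inside the bounds of known data" limitation even in a toy statistical language model. This is a deliberately tiny bigram predictor of my own (vastly simpler than GPT-3, which uses a neural network, but the same pattern-recognition-from-counts idea):

```javascript
// Toy bigram "language model": pure pattern recognition from counts.
// It can only predict words it has already seen follow a given word;
// outside the training data, the statistics are silent.
function trainBigrams(text) {
  const counts = {};
  const words = text.split(/\s+/);
  for (let i = 0; i < words.length - 1; i++) {
    const a = words[i];
    const b = words[i + 1];
    counts[a] = counts[a] || {};
    counts[a][b] = (counts[a][b] || 0) + 1;
  }
  return counts;
}

function predictNext(counts, word) {
  const followers = counts[word];
  if (!followers) return null; // novel input: no data, no prediction
  // Pick the most frequently seen follower.
  return Object.keys(followers).sort((x, y) => followers[y] - followers[x])[0];
}

const model = trainBigrams("the cat sat on the mat the cat ran");
```

Here `predictNext(model, "the")` gives `"cat"` because that's the most common pattern in the training text, while `predictNext(model, "unicorn")` gives `null`: the model has nothing to say about anything it hasn't seen.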

I hope you don't feel like we're jumping on you and shutting you down. This is just a couple of professional developers saying, "Yeah… we're not remotely worried about AI taking our jobs."


Yes, it is not a utopian future.
Nothing will exist forever.

If you go one step further, you know you will die in some years.
Even the earth will die in some years.
So everything will be obsolete in the future.

So what? Instead of talking about “coding”, I’m talking about “software engineering”. Instead of coding, it is more useful to learn the skill of software engineering, building software from scratch, starting with reaching out to potential customers, collecting requirements etc. Converting all this stuff into code is just one tiny step.

I originally wrote a long post about how GPT-3 isn't some super fancy technology that will take over software development jobs and massively affect the growing job market. However, I realized there is really only one point to be made about how software is actually made, and it makes GPT-3 a useful tool at best rather than a threat to the entire field.

Businesses and users pay for software to be made, but don’t actually know what they need.

Simply put, even if GPT-3 could build software of any level of complexity (it can't), it won't know what to build unless it's told specifically how to build it. The level of abstraction GPT-3 offers is too high to build entire systems. As such, you still need software developers to tell it what they need, and even then GPT-3 won't know enough to do the full job.

GPT-3 is a tool to write out boilerplate code for simpler common use-cases it can understand. Beyond that you need something that is flexible enough to build what you need. That flexible something is code.

Finally, to provide some quick "insight" into how GPT-3 actually works: it works the same as existing AI technologies, just on a vastly larger scale. It uses statistics to power what is essentially pattern recognition. As such, it is no more a threat than Google-powered searches are a threat to actual experts on a topic. You need context, experience, creative thinking, and people skills to build things, none of which GPT-3 possesses.

You could say the "easy" jobs are taken up by GPT-3, and that could be true, but the same is also true of no-code development platforms and CMS systems.


I like this point. Sure, things with “AI” and GPT-3 will be easier to write (like an entire technical textbook), but it will still be up to a human to guide it to create something useful.

Yes! This kind of reminds me of the functionality an IDE provides with auto-complete and IntelliSense. Sure, it types out some stuff for us, but we still need to check and use the right code to complete our intended function.