Feeling discouraged because of AI

Hey, guys. Aspiring web developer here. I’ve been learning to code on and off for a few years and it’s something I really love doing. I love the entire process, I love figuring things out and finding solutions by myself.

Recently, however, with the AI stuff happening, I’m feeling like it’s a hopeless dream. I don’t really want to use ChatGPT, to me personally it feels like I won’t retain any information if I don’t learn things the “hard” way. But if it keeps getting incorporated into companies, I don’t know if I’ll ever be able to compete against my peers who do use it.

I get that it’s a useful tool, but if it just gives me the answers, I won’t remember them and I’d feel like I’d just cheated on my high school exam.

I don’t know many people who work in tech and so I don’t know if my fears are right or not. I don’t have a university degree but I was hoping to make a solid portfolio with personal projects and try to get hired, but now I don’t know if it’s worth it. I’m not aiming for FAANG or anything, so can someone tell me if it’s possible to continue on without using this tool?

Sure, it's possible; nothing is stopping you but you. Don't mean for that to sound harsh if it does. Whether we like it or not, AI is not going away and will only get better. Have you tried it? It does more than just give you answers; it can also give explanations and break things down if you have a hard time understanding documentation.

There's nothing saying you have to use it, and you are definitely not going to be disqualified from a job because you don't use it. ChatGPT is far from 100% accurate, so people still have to double-check the answers it gives.

Like you said, AI is a tool, not something that will replace developers. I'm sure you can look through the forum for more AI-related posts and answers, as variations of this question have been asked frequently.

A note about the cons of using AI in learning
I don't know if it is the right answer, but according to the recommendation in the link above, it's good to avoid using AI while learning so you don't develop blind spots in your knowledge. You won't learn how to figure out problems on your own if you become reliant on it while learning. Learning things the hard way, without AI, is fine.

So it is fine not to use AI, at least until you start working.
There are many ways of using AI anyway. If you don't like using it to solve your problems, you can instead use it for tedious tasks you don't enjoy, like writing unit tests.
Or use AI tools like GitHub Copilot that offer suggestions while you are coding, kind of like autocompletion, instead of directly giving you the full solution.
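To make the "delegate the tedious stuff" idea concrete, here is a minimal sketch of the kind of repetitive unit-test boilerplate you might ask an AI to draft and then review yourself. The `slugify` helper and its tests are entirely hypothetical, just to illustrate the shape of the work:

```python
import re
import unittest

def slugify(text: str) -> str:
    """Lowercase, trim, and replace runs of non-alphanumerics with '-'."""
    text = text.strip().lower()
    text = re.sub(r"[^a-z0-9]+", "-", text)
    return text.strip("-")

class TestSlugify(unittest.TestCase):
    # Tests like these are tedious to type out but easy to verify by eye,
    # which makes them a reasonable thing to delegate to an AI assistant.
    def test_basic(self):
        self.assertEqual(slugify("Hello World"), "hello-world")

    def test_punctuation(self):
        self.assertEqual(slugify("C++ & Rust!"), "c-rust")

    def test_whitespace(self):
        self.assertEqual(slugify("  spaced  out  "), "spaced-out")
```

You could run this with `python -m unittest`. The point isn't the specific helper; it's that mechanical test cases are low-risk to generate as long as you read and verify each assertion yourself.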

You can always figure out how to use AI in a way that is appropriate for you later, you don’t have to use AI yet while you are learning. So don’t worry about it for now, and just keep learning.

Disclaimer: I personally haven't used AI and don't really understand it in depth, so take what I say with a grain of salt.

No, but I’ve seen many videos of people prompting it for different use cases (such as having it conduct a mock job interview for you, etc.), so I get how it’s used.

That’s the reason I worded the question like that, I haven’t used it and I don’t want to be a prompt engineer, so… While I know it’s not going to replace developers, I was curious whether or not people would look down on me for not using it at a potential workplace.

What about after that? With the necessity for speed and productivity, it will give others an advantage when working, which is great and all, but I’m still not a fan of how fast-paced it makes everything. Will it become a requirement on the job to use it, even if I want to slow down?

Thank you guys for replying, by the way.

Hi @mtrgoob !

Welcome to the forum!

Here are my thoughts on ChatGPT.

It’s a tool that will help you, not replace you.
You will still have to learn the fundamentals of software and learn how to problem-solve.

When I was first learning to code, I wasn’t sure entirely what the job looked like and would have been fearful just like you about AI.

But now that I work as a developer, I don’t fear tools like ChatGPT.

A large part of development involves the following:

  • Collect and document requirements from the client.
  • Communicate with the client on deliverables and milestones, asking any clarifying questions you need.
  • Convert those business requirements into technical documentation.
  • Have design mockups created by the designers.
  • Break the technical requirements into individual tickets that can be developed and reviewed.

More skilled developers will be able to take those business requirements, think about the best approaches to building out the project, ask the necessary clarifying/follow-up questions, and break the work into smaller pieces for the entire team to work on. They will also be there to guide the team through technical considerations and help more junior members with their work.

When you first start off as a junior, you will begin with pretty detailed tasks that you need to code out. As you gain more experience and move up the ladder, you will start working on larger/more complex features and take more ownership of, and input on, projects.

How does this fit in with ChatGPT?

Well, ChatGPT is great with smaller tasks.

Even though AI will continue to improve over time, it is not going to be able to replace all of those steps I just listed.

There is a lot of designing and planning that goes into building a new product and service that developers are needed for.

If you are interested in programming, I would suggest continuing to learn and grow.

Hope that helps!


AI is nothing new.
The hype over AI blows it out of all proportions.

If you find yourself blocked you can use an AI to help you get past the block.
You can use it to save some time here and there on basic tasks.

Bing can be great as an AI search engine. (most of the time)
Notion AI is great for brainstorming and generating placeholder text.
I’ve heard Copilot is also good, but I have yet to use it.
(I think all of these AIs use ChatGPT to some extent.)

The way you use an AI for help is often like how you’d use a human for help.
Sometimes it’s better to ask an AI first.
For example, if I asked Bing
‘What does HTML stand for?’
Bing could easily give me a quick answer and link me to posts with more information on the subject, which tend to be written by humans.

But if I wanted help with breaking up and structuring a unique project
(as a learner),
I would ask the forum, since AIs are unable to teach, mentor, or structure their answers in a way that takes the learner’s experience, abilities, and mood into account.

If I were you, I would try ChatGPT.
You will likely find it not as competent as you imagine it to be.
Which can be disappointing.

If you’re trying to solve problems without ever making reference to existing (good) solutions (via google, chatgpt or any other resource) then I think you would find yourself struggling in the job.

You naturally retain things you use regularly and forget things you use infrequently (and have to relearn them when you use them again). You don’t need to memorise all this stuff, and the stuff you do remember you will remember naturally through practice.

The web is built on open source software and forums where people share knowledge and solutions. There’s no such thing as “cheating” in this context.

I can understand the concern, but there is a fine line between using a tool and relying on a tool. This goes for a lot of things, not just development. AI is just a new fancy tool.

AI technologies for developers (and people in general!) are in an interesting spot. They are tools that can easily be used in a wide variety of situations, but have issues in a lot of these situations as well. Because of that they can easily become a crutch, but they aren’t a great crutch.

An apt comparison is the internet itself. If you want to be a developer, odds are you won’t look up every single problem you hit, copy-paste the answer you find, and move on. It’s a great crutch to find solutions to all your problems via the internet, but again it isn’t perfect, and it can actually create plenty of issues if you take this approach without considering what you’re actually doing. AI is very similar.

I think it only feels fast-paced; it actually isn’t. GitHub’s Copilot has been out for over a year, but most people haven’t heard much about it because it’s developer-focused and not as customer-facing as something like ChatGPT.

It takes time for people to learn how to use a tool correctly, so things can’t move too fast while everyone is still figuring it out. Even today, with all these new AI technologies, the space is very much a bubble: lots of cash is going into the field, but that doesn’t instantly mean it’s the future, only a potential future. It’s highly possible many of these investments will fail; some will probably be useful in the future, but most of it is hype.

As a tech worker, you’ll be at the forefront of finding what works and what doesn’t. It’s possible you will find it somewhat useful, but not perfect.

From a learner perspective you’ll find it as another resource to utilize when necessary.

See what the future holds, but in the meantime explore it and see what you find.

Thanks to everyone for replying. I really appreciate it.

Of course, it’s just that my preferred method is the “old-fashioned” way, like reading the docs and googling around for different answers. I agree re-inventing the wheel on the job gets you nowhere (though I’d argue it’s essential for learning).

I think this hits the nail on the head regarding my initial concern. What if after a few versions it becomes the de facto “crutch” and people just rely on it to write (most of) their code for them? I like Copilot, for example, but my preference is to write most of my code myself (based on references, docs, and blogs). I hope the hype won’t push people into relying on it if they don’t want to.

Once again thank you guys for replying, sorry if this is another one of those threads that feels like it’s going in circles. I’ll try to be a bit more positive about the future lol.


The “push” would come from businesses, since businesses can try to enforce the use of such technologies. It’s hard to imagine a business that forces you to use AI tools to write all your code, because the technology is too inaccurate. It might be offered, provided, or even required to be available, but at the end of the day the developer doing the job is the one who decides what actually gets written.

It’s actually more realistic to find businesses that forbid these tools, as they come with some inherent risks around security, privacy, bias, and simple effectiveness.

Ultimately, “capitalist Darwinism” will kick in. Companies relying too heavily on bad tech (AI or otherwise) will underperform those using the right tech over time. Such things take time to play out, though, and right now “AI is hot”, so we will have to see where things settle.

In the meantime, use what you want and whatever makes you feel productive. We can worry about the big picture once it takes clearer shape in the coming years.


I don’t think it is all bad either. Let’s say you have two tasks, they are equally important and time-sensitive. One is boring and requires a lot of boilerplate, one is fun and creative. If you can write the boring boilerplate code faster using a tool then why not? You now have more time to spend on the task you actually care about. Using tools that can do menial tasks for you is just common sense.

If an employer can prove you are more productive with the tool than without it, then they have an economic incentive to force you to use it. The issue, in the end, is quality vs quantity. If they want low-quality code in large quantities, then with or without a tool, the work you do will not be very satisfying work.

You can minutely handcraft a watch over a period of a month, or you can stand next to an assembly line vomiting out cheap digital watches. You are still “making” watches, but the two jobs are worlds apart.

Let ChatGPT be your rubber duck.


Or I could just use a rubber duck :smile:


I’ve seen dev managers say they’ll fire juniors for using AI. AI isn’t taking jobs. There’s no reason to worry about this. Just keep learning and practicing.


This topic was automatically closed 182 days after the last reply. New replies are no longer allowed.