Algorithm challenges and computational efficiency


Hi everyone,

I started FreeCodeCamp about a month ago, and I’m having a great time. I have learned so much, especially doing the API projects (the random quote machine, twitch stream viewer, etc.).

I’m currently doing the intermediate algorithm challenges. While I haven’t had a CS algorithms class, I do know a bit about big-O notation and algorithms from reading a good chunk of Sedgewick’s algorithms book on my own. I’m a bit worried about the efficiency of my solutions.

For example, in the sorted union challenge I need to remove duplicates from the final array. I definitely see a way to do it in O(n^2) time, but I’m pretty sure that if I use a tree set (as in Java) to keep track of the unique values, it can be done in O(n log n) time, because looking up a value in a tree set only takes O(log n).
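For what it’s worth, the same O(n log n) bound is reachable without a tree set: sort a copy of the array, then skip adjacent duplicates in a single pass. A sketch, assuming the values are numbers (the function name is made up for illustration):

```javascript
// Dedupe in O(n log n): sort a copy, then keep each value
// only when it differs from the previous one kept.
function dedupeSorted(arr) {
  const sorted = [...arr].sort((a, b) => a - b); // O(n log n)
  const unique = [];
  for (const value of sorted) {
    // single O(n) pass over the sorted copy
    if (unique.length === 0 || unique[unique.length - 1] !== value) {
      unique.push(value);
    }
  }
  return unique;
}

console.log(dedupeSorted([1, 3, 2, 3, 5, 1])); // [1, 2, 3, 5]
```

The trade-off is that this loses the original insertion order, which a tree set (or JavaScript’s `Set`, which preserves insertion order) would not.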

Because JavaScript doesn’t have a tree set built into the language, I think I’ll just take the easy way out on this problem to keep up my pace. Still, I somehow feel like I’m going to get burned down the road when I interview for a web development job if I don’t have a solid grasp of algorithms, and I’m a bit worried that the challenges on FreeCodeCamp might not cut it, given that optimal efficiency isn’t stressed. Does anyone have advice on where I can learn or practice algorithms that would sufficiently prepare me?



Yeah, I was wondering about basic algorithms myself the other day. I can remember back in the day (when dinosaurs roamed the Earth) when I had to make all those data structures and algorithms from scratch, in Pascal, and then in C. It’s a shame that a lot of people will never learn how to do a bubble sort or build a linked list from scratch. Maybe we should.

As far as efficiency goes, I don’t know how much they worry about it for frontend JavaScript. Waiting for the internet to respond and send what’s needed is probably going to be the biggest factor, by at least an order of magnitude. For frontend JS, you probably aren’t going to have to sort through 1.3 billion pieces of data. Computers are so fast now that anything you need to do will be over before your AJAX call gets back. Any hardcore coding and massive data crunching is probably going to happen on the backend. And I’m sure there are libraries that will build these data structures for you.

But I do feel like something is being lost. But then again, when I was learning in the 80s, there were probably people saying that I wasn’t really learning because I wasn’t doing assembly. And when those guys were learning there were a couple of old guys standing around saying that real programmers use punch cards.

But it probably would be good for people to learn these things - for the way of thinking and for interviews.


I understand your concerns about efficiency. In high-level languages like JavaScript you can definitely adjust the way your code solves a problem, but you can’t optimize it the way you would in, say, C or C++. The challenges on freeCodeCamp are there to get new developers into the whole developer mindset, not so much into computational efficiency, though it’s a good skill to learn.


The chances of an interviewer for a web development job asking you questions beyond “can you improve the efficiency of this a little bit?” are vanishingly small. Those questions in an interview for a web development role could be taken as a sign that the interview process (and by extension the role at that company) is borked.


You’re absolutely on the right track to think about algorithmic performance. Being able to gauge asymptotic efficiency and improve it is an important skill for programmers, and it’s considered essential to computer science education: an analysis-of-algorithms course is required for every undergraduate CS degree, and every single algorithm in any CS course is accompanied by a discussion of its complexity.

JavaScript has Set and Map classes that are very useful; in fact, they make short work of a few freeCodeCamp problems. I encourage you to use them as much as possible.

I cannot imagine a programming interview that ignores algorithmic complexity. Binary search in its various guises is quite popular among interviewers precisely because it is O(log n). Many interview questions are purposely designed to expose weaknesses in algorithms and data structures, or to allow a variety of solutions that can be progressively improved and optimized.
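As a refresher, here is the classic iterative form of binary search over a sorted array of numbers; each iteration halves the remaining range, which is where the O(log n) comes from:

```javascript
// Iterative binary search on a sorted array of numbers.
// Returns the index of target, or -1 if it's not present.
function binarySearch(sorted, target) {
  let lo = 0;
  let hi = sorted.length - 1;
  while (lo <= hi) {
    const mid = Math.floor((lo + hi) / 2);
    if (sorted[mid] === target) return mid; // found it
    if (sorted[mid] < target) lo = mid + 1; // discard left half
    else hi = mid - 1;                      // discard right half
  }
  return -1; // not found
}

console.log(binarySearch([1, 3, 5, 7, 9, 11], 7)); // 3
```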

Many interviewers are looking for candidates who use an incremental problem-solving approach: first quickly find some solution to the problem, then analyze its performance and improve it.

It’s easy enough to see on many websites what companies are asking in programming interviews these days.


For general programming if needed for the role, possibly. For web development roles though? Sure, it has some importance, but there are a million and one more important things that’ll affect web dev projects long before all but basic algorithmic optimisations come into play. Web Dev is not CS. Unless the role was highly specialised [or for an architect?], I’d be incredibly surprised to be given (or be instructed to give) that kind of question.


You should watch this technical interview by Google, it might change your opinion on that:


See, the issue here is Google. Using Google as a model for what a normal interview should look like is insanity. Google are interviewing for positions at Google; the mega-scale problems Google [and a handful of other usual suspects] have are not the problems of most any other company.

I’m not saying they’re bad or stupid questions in any way, just that using Google as a model of the normal questions web devs should expect at interview is nuts.


You should be prepared for the most difficult of interviews. You never know the difficulty of the questions you may be asked at a technical interview, so you should be fully prepared for the hardest ones rather than risk bombing the interview.


Sure, you don’t know exactly, but you can almost always just ask in advance and get an honest answer from the company you’re going for an interview with. Most people will tell you.

The problem with the stereotypical Google-style, whiteboard-out-the-algorithms coding interview is that a) everyone hates them, b) they’re not reflective of the job, and c) you end up optimising for candidates who train to do exactly that.


But being prepared for the hardest interviews is the best way. Your prospective employer is free to lie to you about the interview questions, and besides, are you gonna walk into a technical interview and say “I don’t know how”? That’s gonna harm your chances of getting the job. Just be prepared for anything.


If the interviewer is deliberately lying to interviewees for no real reason (selecting candidates on the basis of better rote-memorization skills?), then there are larger issues at play.

Also, there is nothing wrong with having as wide a spread of knowledge as possible, but this does not negate that, generally, an interview for a web dev role that involved algorithmic complexity as anything more than a very minor part would be strange, as it would seem to be placing undue weight on something that has little bearing on the role.


If you are not prepared for the most difficult of challenges an interviewer could offer - why should they hire you?