I’m just a kid (66 years old). I had a bad experience with my algorithms course at Stanford, being cheated out of my grad certs in Advanced Systems and Databases by a faculty that broke their own written rules, apparently to favor on-campus students. But I still love the school. I cannot name many specific books, but you might want to check out anything by John Ousterhout. And even though I got into an argument with Jeffrey Ullman over that algorithms course, he and other Stanford professors are quite amazing. You might want to check out his book on finite automata.

For networking, my professor was Craig Partridge. He’s a nice guy, very personable, and very, very knowledgeable, and he is referenced in most networking books. One all-time favorite writer on network programming is, of course, the late W. Richard Stevens. I believe I have three of his books: one on TCP/IP, another explaining the source code of a TCP/IP implementation, and another on network programming. These, of course, are in C.

As for databases, my professor was Serge Abiteboul. In his pictures on the web he’s smiling, but in person he’s quite serious and a little stern, and very, very good; very rigorous. If you study his works, prepare to study relational calculus and relational algebra, and comparisons of the expressive power of both from a mathematical perspective. Be ready for database optimization, distributed transactions, etc.

For machine learning, Andrew Ng is great. Geoffrey Hinton is a perfectionist: rigorous, not so cheerful, somewhat stern, but you’d be surprised to find a pretty genius sense of humor. Anything by him is bound to be great. Oh, and if there’s anything by Nick Parlante, he was my teacher at Stanford for Software Engineering in C.
All of this may seem really, really old, as I studied there in the mid-1990s. But for me, I kind of go back to the mid-1970s with BASIC, Fortran IV, COBOL, and old assemblers: IBM’s BAL, Modcomp, Cyber, 8080, 6502, etc. I saw a new language called Pascal arrive by way of UCSD :-). I was an IBM mainframe programmer when the IBM PC first became available, and I started programming it in Microsoft compiled BASIC and Pascal as well as dBASE II. Actually, I sort of reverse engineered that database, which is good for learning but not good for actual production work. I dabbled in Lisp and Prolog but did not get into neural nets until the 1990s, when I wanted to use them with TradeStation to build stock investment systems. Then in 2001 I went through a divorce and didn’t get back into neural nets until just a few years ago, perhaps 2016 or so.
Going back to programming, the reason I brought up John Ousterhout, a professor I never studied under, is that there are many perspectives and strong opinions on whether object-oriented programming is good or evil, and whether it’s better to use statically typed languages or ones like JavaScript, where you have enough freedom to shoot yourself in the foot. You may find it interesting how languages evolved. Sounds boring? Well, originally you plugged in cables and flipped switches to program in binary. Then assemblers let you use mnemonics like LDA for “load accumulator” or JMP to jump to an address; most used hexadecimal rather than binary, and macros let you reuse code very easily. Then there were compilers that translated languages into binary code, or actually into assembler. It then became helpful to compile programs into modules and reuse those modules by storing them in libraries, so you had link-loaders that would combine that object code into binary executables. Languages like COBOL and Fortran were compiled, while BASIC took a different approach: you ran your BASIC program inside a BASIC interpreter, which would read each line of code, translate it on the fly, execute it, then go to the next line and do the same, and so on. It ran slower, especially if you had a FOR loop that ran a million times, since the interpreter might re-interpret that line a million times. And people started to optimize, both compiled and interpreted code. Some produced “just-in-time” (JIT) compilers so their programming language would enjoy an interpreter’s ability to run code immediately while enjoying the speed of a compiler.
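You can actually feel that interpreter-versus-compiler difference in Python itself, since its built-in `eval` will happily re-parse a string every time, while `compile` translates it once up front. A minimal sketch (the expression and iteration count are just illustrative):

```python
import time

expr = "x * 2 + 1"
N = 100_000

# Interpreter-style: the source line is re-parsed on every pass through the loop
t0 = time.perf_counter()
total = 0
for x in range(N):
    total += eval(expr)  # parsed and compiled from scratch each iteration
interp_time = time.perf_counter() - t0

# Compiler-style: translate once, then just execute the pre-compiled code
code = compile(expr, "<expr>", "eval")
t0 = time.perf_counter()
total2 = 0
for x in range(N):
    total2 += eval(code)  # already compiled; only execution remains
compiled_time = time.perf_counter() - t0

# Same results, but the compile-once version avoids all that re-parsing
assert total == total2
```

On any machine, the compile-once loop runs dramatically faster, which is exactly why old BASIC interpreters felt slow inside tight loops and why JIT compilers were worth inventing.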
One programming language called Forth introduced the idea of a TIL, or threaded interpreted language. It was stack oriented, sort of the way HP calculators use a stack (RPN) rather than algebraic notation. You push a couple of numbers onto the stack and tell it to add; the Forth interpreter pops the top two numbers off the stack, adds them, and pushes the result back onto the top. And when laser printers became popular, a language called PostScript was also designed as a somewhat stack-oriented language. HP had its own printer language (PCL), and that seemed to overtake PostScript.
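That push-pop-push cycle is easy to sketch in a few lines of Python. This is not real Forth, just a toy evaluator for the stack idea described above (the function name and supported operators are my own invention):

```python
def forth_eval(program):
    """Evaluate a tiny Forth-like program: numbers are pushed onto the
    stack; '+', '-', and '*' pop the top two numbers and push the result."""
    stack = []
    ops = {
        "+": lambda a, b: a + b,
        "-": lambda a, b: a - b,
        "*": lambda a, b: a * b,
    }
    for token in program.split():
        if token in ops:
            b = stack.pop()          # top of stack
            a = stack.pop()          # next item down
            stack.append(ops[token](a, b))
        else:
            stack.append(int(token))  # a number: just push it
    return stack

# "2 3 +" pushes 2 and 3, then '+' pops both and pushes 5;
# "4 *" then pushes 4 and multiplies, leaving [20] on the stack.
forth_eval("2 3 + 4 *")
```

PostScript works on the same principle, which is why a PostScript file full of tokens like `moveto` and `lineto` reads as data followed by the operations that consume it.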
But you can start to see how many languages developed over time. UCSD came up with UCSD Pascal, which could be compiled on one machine into something called “p-code,” similar to the way Java compiles into bytecode. The Pascal runtime would then run the p-code version of your program. Borland came up with Turbo Pascal, a version that compiled into machine code and ran really fast for that time, and it became popular with students at universities in the early 1980s. I used it and tutored many students in that language myself.
But then a CSci student friend of mine told me about this cool new operating system called “Unix.” I bought a book, read it cover to cover, and then resigned my job to volunteer at Fresno’s Veterans Hospital, working for a neurologist and a psychologist doing research into the effects of damage to the parietal region of the brain, specifically how it affected the speed and accuracy of a subject’s ability to detect left-right and right-left movements visually. Programming in C on the PC and on the VAX running 4.2BSD Unix came naturally, and I had a great time doing it.

And then I went into systems totally unprepared. At Fresno State, a system administrator handed me 9-track tapes for a VAX 11/730 with 1 meg of RAM and 110 megs of disk. Not gigs. Not terabytes. Megs. The machine was being switched back and forth between Unix and VMS on different days of the week. The sysadmin left for L.A., and for me it was sink or swim. I learned fast, but sometimes, with the politics of a university, it could be a living hell. Other times it was a joy, and I became close to several Indonesian students. This started me down a path that would change my life forever. I ended up at an Indonesian church, married to an Indonesian woman, and I learned the language. I moved to the Bay Area and worked for NASA, Ingres, Oracle, PeopleSoft, and then a startup called Clickmarks. During that decade and a few years, I was fortunate enough to study at Stanford and to be trained professionally by several companies in systems, databases, project management, customer support, people management, and various applications. That, and the wonderful experience of being a dad to an amazing daughter. Then I had the gut-wrenching experience of divorce. I returned home to Fresno to heal and deepened my web skills. Years later, I remarried and returned to software architecture and development in PL/SQL and Perl, and later supported Epic systems at a local healthcare organization for a few years.
My parents became frail, and by the time we moved in to care for them, we had a baby son. I lost my mother in 2014 and worked on developing my virtualization and DevOps skills as well as mobile development, and then returned to studying machine learning and artificial intelligence, which I am still doing today.
So, what language do I focus on now? Python mostly. The libraries available to Python are amazing. I’m about equally comfortable in Windows, Mac, Linux, and most variants of Unix.
I told you I was old. But I think you’re in a great place for studying today’s languages.
I’d look for books on concepts, including old ones: structured programming, modular programming, DRY (don’t repeat yourself), object-oriented programming and why it’s loved and hated, functional programming and the use of immutable variables and their effect on multi-threaded programming, and the concept of thread-safe programming. One thing of extreme value is the set of concepts around UX/UI and design, but be careful not to confuse the two. Also data modeling, and software development as it relates to project management: concepts such as Agile and Kanban. As for PMI-style project management, you may want to limit yourself to the 1997 edition rather than the whole PMBOK, which could overwhelm you with things entirely unrelated to software development. You may want to study product lifecycles, versioning, oh, and Git, GitHub, etc. And Docker. And consider Kubernetes and microservices. Docker containers are sort of like tiny, mini virtual machines, but not really. They’re actually more like isolated processes that ordinarily serve one function. They don’t require booting an operating system, yet they have much of the isolation of a separate machine, because they run inside an already running Linux kernel.
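A minimal Dockerfile makes that “one process, no OS boot” idea concrete. This is just a sketch; the base image tag and the `app.py` script are hypothetical placeholders:

```dockerfile
# Base image: pre-built filesystem layers, shared with other containers;
# nothing here boots, it just provides files the process will see.
FROM python:3.12-slim

# A hypothetical single-file service copied into the image.
COPY app.py /app/app.py

# The one process this container exists to run.
# When it exits, the container exits; there is no OS to shut down.
CMD ["python", "/app/app.py"]
```

Building and running this with `docker build` and `docker run` starts in a fraction of a second precisely because the only thing launched is that one `python` process, isolated by the kernel’s namespaces and cgroups rather than by a hypervisor.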
Well, computer science in general is so wildly massive. And by the time you barely begin to learn a big part of it, there will be so much new stuff invented that you will have more to learn than when you started.
I hope this massively gigantic reply was helpful and not irrelevant or boring or overly random. But I think freeCodeCamp is an awesome place to learn.
I’m finishing up my specialization from Andrew Ng through DeepLearning.AI on Coursera, and I am going through an AI and machine learning bootcamp at this time, too. The latter is expensive, but I wanted the practical training in addition to the theoretical training I got through Andrew Ng. Plus, I thought it might be nice to have both Stanford and Caltech on my resume. So there’s a little bit of a vanity thing going on with me.
In the end, here I am at age 66 wondering what I am going to do when I grow up. When you say you’re “old,” I hope you’re not old like me, but it is great to be actively learning at 66. It would be cool to be much younger, but then I would not have the kids or wife I have now, and that’s a blessing I would never want to give up.
Best wishes from a very verbose man.