You wouldn’t buy food if it were past its use-by date, would you? So why would you use out-of-date code?
As I look for answers to the questions that come up while I learn, I have found it very important to check when an article was published. An article’s publication year gives some indication of how reliable the information is. Sure, some coding methods might not change, but others do, along with the technologies they interact with; and developments in code, especially in cryptography, can be crucial to the security of what you produce.
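Password storage is a good illustration of advice that has aged badly. A minimal sketch, using only Python’s standard library (the password, salt size, and iteration count here are illustrative assumptions, not a definitive recommendation):

```python
import hashlib
import os

# An early-2000s tutorial might suggest storing passwords like this.
# Unsalted MD5 is fast to brute-force and trivially cracked with rainbow tables.
legacy_hash = hashlib.md5(b"hunter2").hexdigest()

# A more modern stdlib approach: salted PBKDF2, deliberately slow.
salt = os.urandom(16)
modern_hash = hashlib.pbkdf2_hmac("sha256", b"hunter2", salt, 600_000)

print(legacy_hash)          # same output for every user with this password
print(modern_hash.hex())    # differs per user, thanks to the random salt
```

Because each user gets a fresh salt, two accounts with the same password end up with different stored hashes, which is exactly what the legacy approach fails to provide.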
You might find an article that demonstrates how to create a database login script, but if it was written in 2002, it might not be fit for purpose, no matter how easy it is to learn from.
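To make that concrete, here is a minimal sketch of why a 2002-era login script can be dangerous. The table, credentials, and function names are made up for illustration; the old style builds the SQL by string formatting, while the modern style uses the driver’s placeholders:

```python
import sqlite3

# Toy in-memory database standing in for a real user store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, pw TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def login_2002_style(name, pw):
    # SQL built by string interpolation, as many old tutorials showed.
    # Attacker-controlled input becomes part of the query: injectable.
    query = f"SELECT * FROM users WHERE name = '{name}' AND pw = '{pw}'"
    return conn.execute(query).fetchone() is not None

def login_parameterised(name, pw):
    # Placeholders keep user input as data, never as SQL.
    query = "SELECT * FROM users WHERE name = ? AND pw = ?"
    return conn.execute(query, (name, pw)).fetchone() is not None

# The classic injection string bypasses the old version but not the new one.
attack = "' OR '1'='1"
print(login_2002_style("alice", attack))     # True: logged in with no password
print(login_parameterised("alice", attack))  # False: treated as a literal string
```

The parameterised version still accepts the genuine password, of course; the only thing it refuses is the injection.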
But I also acknowledge that coders need to code for those who refuse to update their systems. For how long should that be done, though? We’ve been in a 64-bit world for a long, long time, yet so many programs are still served only as 32-bit applications. Come on! Let me use my PC’s full capacity! Is web development going to be a long drag towards the simplicity of coding for modern tech?
Had Microsoft not released patches during the WannaCrypt disaster, perhaps that would have forced people and organisations that should know better to get with the times and upgrade, or switch to Linux. By the same token, if people with old systems find that they cannot access their favourite sites, or any sites at all, that might give them a reason to do themselves a favour and get a new computer.
EDIT: Ah, yes, there are also people who may be oblivious to the need for periodic upgrades of technology, but eventually they are forced to change. A case in point is the switch-off of analogue TV signals, which forced people either to buy a new TV or a digital receiver. Had the analogue system not been taken away, they would still be using it, at a huge cost to broadcasters. Eventually things get switched off.