Chiming in here!
I can say with good certainty that Docker is not going away (or at least containerization, for that matter).
Depending on the maturity and scale of the company you work for, Docker becomes a necessity rather than a nice-to-have. It adds reliability to builds (in both development and, especially, production), and it supports testing extensively. This is because Docker provides reproducible environments: your system is built the exact same way each time you run a container from your specified image.
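To make the reproducibility point concrete, here's a minimal sketch of a Dockerfile for a hypothetical Python service (the file names, versions, and entrypoint are assumptions for illustration, not anything specific):

```dockerfile
# Pin the base image so every build starts from the same environment
FROM python:3.12-slim

WORKDIR /app

# Install dependencies first so this layer is cached between builds
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy in the application code
COPY . .

# The same image runs identically in dev, CI, and production
CMD ["python", "app.py"]
```

Running `docker build -t myapp .` followed by `docker run myapp` produces the same environment on any machine with Docker installed, which is exactly where the build reliability comes from.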
These benefits become even more important in environments that require distribution across multiple regions, say clients in North America and Asia.
That’s where Kubernetes comes in: as @chuckadams mentions, it allows for scale, redeployment, and orchestration.
You can have your system or platform distributed across different regions with redundancies that either scale with use or redeploy on error, each instance rebuilt from its underlying container image (built with Docker, in most cases).
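As a rough sketch of what that looks like in practice, here's a minimal Kubernetes Deployment (all names, the image tag, and the probe endpoint are hypothetical) that keeps three redundant copies running and restarts any container that fails its health check:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: myapp                # hypothetical service name
spec:
  replicas: 3                # redundancy: three copies kept running
  selector:
    matchLabels:
      app: myapp
  template:
    metadata:
      labels:
        app: myapp
    spec:
      containers:
        - name: myapp
          image: registry.example.com/myapp:1.0.0  # the container image built earlier
          livenessProbe:     # Kubernetes restarts the container if this fails
            httpGet:
              path: /healthz
              port: 8080
```

For the "scale with use" part, you'd typically pair a Deployment like this with a HorizontalPodAutoscaler so the replica count grows and shrinks with load.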
I have used both extensively where I work, and I can say that it is definitely worth your time to invest in learning the basics of Docker. It helps with open source projects, and it will improve your prospects at both newer companies and bigger companies that keep up with current tooling.
A small bonus of learning Docker: a lot of the concepts from a basic Docker course will carry over to any orchestration tool as well. Kubernetes and Docker share many of the same core concepts, and container images built with Docker run directly on Kubernetes.
Finally, knowledge of Docker will help you make sense of tools like GitLab Runner and GitHub Actions (among other CI/CD pipelines), both of which typically run their jobs inside containers.
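For example, a GitHub Actions workflow can run an entire job inside a Docker image; here's a minimal sketch (the workflow name, image, and test commands are assumptions for illustration):

```yaml
# .github/workflows/ci.yml (hypothetical)
name: CI
on: [push]

jobs:
  test:
    runs-on: ubuntu-latest
    container: python:3.12-slim   # the whole job runs inside this Docker image
    steps:
      - uses: actions/checkout@v4
      - run: pip install -r requirements.txt
      - run: python -m pytest
```

GitLab CI works the same way: an `image:` key at the top of `.gitlab-ci.yml` tells the runner which container to execute your jobs in. Once you understand Docker images, both systems become much easier to reason about.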