Explain to me like I'm 5, why I should use Docker locally

I can see some benefit to virtualization technologies when it comes to deployment, as they allow you to specify an application’s environment more explicitly, however I always find such technologies painful to use locally.

Explain to me like I’m 5 why I should try to use Docker locally when developing.
:smiley:

It means you can specify exactly the environment your code runs in, and set it up in exactly the same way over and over again. The key thing is that it gives you replicable builds.

You might not need or want the extra complexity this entails if it is just you. Containers are a bit painful to use, particularly if you don’t have a team/person dedicated to managing the processes involved. In a team, they can remove the “works on my computer” problem. Docker shines when you need to deploy and redeploy applications, often as part of a team. Then if something goes wrong in the state of the application, you can just destroy the container and redeploy it (this is where tooling like Kubernetes comes in, because managing lots of containers is much more painful without that extra orchestration layer. Which is another huge layer of complexity!)

The problem isn’t “it works on my computer”, it’s getting the same thing that’s working on my computer into production while still being easy to use locally.

Something as simple as hot reloading becomes a nightmare when I approach Docker, and at that point any sense of environment stability is thrown out the window because the cost to developer productivity ends up being too high.

I do think I’m just not using it properly, but without getting past the point mentioned above, it has so far been a super massive pain haha.


Hot reloading is working fine for me. Maybe you should try changing your workdir, adding restart: always, and using depends_on.
I use Docker locally only for big projects, when I have multiple servers running with docker compose, so I don’t need to manage four accounts trying to get free hosted services for mongo, redis, backend, and frontend, nor do I need to run every service one by one locally.

And when it is working, I know it will work in production on every server in the world :).
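
For reference, a minimal sketch of that kind of local setup (a hypothetical Node backend using nodemon as the file watcher; the service names, images, and ports are placeholders, not from any particular project):

```yaml
# docker-compose.yml (dev sketch, names and ports are placeholders)
services:
  backend:
    build: ./backend
    working_dir: /app
    command: npx nodemon server.js   # file watcher running inside the container
    volumes:
      - ./backend:/app               # bind mount so edits on the host trigger a reload
      - /app/node_modules            # keep container-installed deps out of the mount
    ports:
      - "3000:3000"
    environment:
      - MONGO_URL=mongodb://mongo:27017/app
      - REDIS_URL=redis://redis:6379
    restart: always
    depends_on:
      - mongo
      - redis

  mongo:
    image: mongo:7
    volumes:
      - mongo-data:/data/db

  redis:
    image: redis:7

volumes:
  mongo-data:
```

With the source bind-mounted into the container, edits on the host are picked up by the watcher straight away, and depends_on starts the databases before the backend.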

So with Docker I’ll need to set up a different Dockerfile for local development than for deployment, as development requires extra tooling and a different setup than production. In this sense the two environments differ in a lot of regards, which doesn’t seem much better than what I’m doing right now, but with extra overhead.

With a setup of multiple services, they all need to be run locally though, right? Do I just need a beefy machine, or some fancier service beyond just docker-compose so I could swap out locally run services for external ones (like an actual mongodb server somewhere so I don’t have to run it)? Or could docker-compose actually do this?

Just use docker compose; you should have two docker compose files, one for dev and one for prod.
That way you can run different images for each if you want.

If you want to use the same images, you can just change the volumes, so you would have the same images but with different data.

If you need to share persisted data from one image across different environments, you would have to use the same volume.
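
As a minimal sketch of that (the image, service, and volume names here are made up), keep a base docker-compose.yml for prod and layer a dev file on top of it:

```yaml
# docker-compose.yml -- base/prod definition (image name is a placeholder)
services:
  web:
    image: myapp:latest
    ports:
      - "8080:8080"
    volumes:
      - app-data:/data

volumes:
  app-data:
```

```yaml
# docker-compose.dev.yml -- dev overrides: locally built image, separate data volume
services:
  web:
    image: myapp:dev
    build: .
    volumes:
      - ./:/app              # bind mount the source for local work
      - app-data-dev:/data   # different volume so dev doesn't touch prod data

volumes:
  app-data-dev:
```

Then docker compose up uses just the base file, while docker compose -f docker-compose.yml -f docker-compose.dev.yml up merges the dev file over it.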


The only point of using Docker is that the project will run on your computer, and, if you build it, it will run on someone else’s computer too.

Because your computer is not the same as the computer someone else has, and Docker sits in between.

That’s it.

This also means you’ll be sure that if it runs locally, it runs remotely.

On the other hand, if you develop on your own, you’re not sharing and caring, and you want to go fast, I see no reason to use Docker.

Ideally, you don’t do this, because the whole point is to have the same system from end to end. Practically speaking though, development containers do often come with the extra tooling you mention and other conveniences. In that case, consider using environment variables and bind mounts containing your local overrides rather than building a whole new container.

If it really takes building a whole new container, consider at least deriving your dev container from your prod one rather than making it a sibling.
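
A rough sketch of that “derive, don’t sibling” idea (assuming a prod image that is a Node app with npm available; the image, file, and command names are hypothetical):

```dockerfile
# Dockerfile.dev -- a dev image built on top of the prod image, so the base OS,
# runtime, and application dependencies stay identical to production.
FROM myapp:latest

# Add dev-only tooling on top of the prod layers.
RUN npm install --global nodemon

# Source gets bind-mounted over /app by compose; config differences come in
# through environment variables rather than a separate sibling Dockerfile.
ENV NODE_ENV=development
WORKDIR /app
CMD ["nodemon", "server.js"]
```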

Plenty of use cases for local. Here’s one of mine: I have to develop with three different versions of PHP, which means running three different versions of FPM on three different webservers. Managing all those on one box is a massive pain in the ass, so I use Docker to keep them separated.
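
A sketch of what that can look like with compose (the PHP versions, paths, and ports here are just examples, not necessarily the ones described above):

```yaml
# docker-compose.yml -- one FPM container per PHP version, each paired with its own nginx
services:
  php74:
    image: php:7.4-fpm
    volumes:
      - ./app-legacy:/var/www/html

  php82:
    image: php:8.2-fpm
    volumes:
      - ./app-current:/var/www/html

  web74:
    image: nginx:alpine
    ports:
      - "8074:80"
    volumes:
      - ./app-legacy:/var/www/html
      - ./nginx/php74.conf:/etc/nginx/conf.d/default.conf
    depends_on:
      - php74

  # ...and a web82 service on its own port, pointing at php82
```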

If you have a Linux box, Docker is literally as fast as running things “native”, because it is native, just in a different namespace. On Mac or Windows, it has the overhead of running in a Linux VM, but all the containers run in that one VM, so it’s still quite a bit less overhead than if you were to use Vagrant or VMware or whatnot.

@chuckadams

gotta say…you are correct.

The reason to use Docker is that it creates containers for code and its dependencies. It allows you to install tons of libraries into the container instead of putting them on your entire system. It also allows you to create “live or production” containers and “development” containers, and then you can use those containers either to test the environments independently or to work on them without affecting the live versions.

If you don’t use the virtual environment, then when your system updates it will probably upgrade these libraries and break your code. This can be a real nightmare when your production code isn’t written for a new library, etc. However, you will still have to support production until you can upgrade it away. What if one of your users calls in a bug with production and now you need to use that environment? This is why you use Docker: you usually need a replica of the production environment for bug investigation/testing, and you need a working “dev” copy to upgrade to the latest libraries.

The other reason is that so many services innately understand what to do with a Docker container, and you can instantly push the entire “environment” to them and install all the dependencies instantly. The container will then run on your host and will be running EXACTLY what you had on your dev computer. It lets you do things like zero-downtime upgrades, load balancing, and quickly spinning up more copies of the environment.
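
A tiny sketch of the “dependencies live in the image, not on your system” point (a hypothetical Python app; the file names are placeholders):

```dockerfile
# Dockerfile -- the runtime and libraries are pinned inside the image,
# so a host OS upgrade can't silently change what the app runs against.
FROM python:3.11-slim

WORKDIR /app

# The pinned dependency list lives in the image, not on the host machine.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .
CMD ["python", "main.py"]
```

To dig into a production bug you can rebuild or pull the exact image production is running, and the upgrade work happens in a separate dev image without touching it.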

Obviously, this is much more complicated than a “5 year old” explanation, but this is why you use it. You may or may not need that functionality when you’re plunking around, but I guarantee that when you start doing paid work you will be using these things all day.