I'm new to Docker. Even though I've done the tutorials on the Docker site, I'm still not sure how I should create the Dockerfile.
Basically, I'll be using PuTTY on Windows to do all my work, so I'm rather confused about how things should be done. Should I create the Docker image from my git clone locally first, before uploading it to AWS using PuTTY on Windows?
My next question is: how can I write the Dockerfile so that all of my NamedWebApp goes into a directory, say /jsbackend? I'm really stuck here, so I'd appreciate some help.
I'm also new to Node.js, React, and Redux, so I've just come across package.json, which contains all of a project's dependencies. My question is: do I still need to install the dependencies when I build the Docker image, or will the built image already contain package.json so that things will just run?
Please help me out on this, because I'm really lost.
Thanks.
It's not going to be straightforward to help here, unless you just want the answers without explanations.
I'd recommend trying out different things and getting some hands-on experience. Once you do, you'll get the hang of it.
About the directory: you can put anything at any path inside your Docker image.
You'll need to learn about local Docker images vs. remote images hosted in an image registry like Docker Hub, to understand how to use Docker for deployments, including cloud deployments such as AWS.
Node.js packages need to be installed into your Docker image too; only then will the app run. Anything the app needs at runtime must be present in the app's Docker image.
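To make both points concrete, here is a minimal sketch of a Dockerfile for a Node.js app that puts everything under /jsbackend and installs the dependencies into the image. The base image, the assumption that package.json sits at the repo root, and the `npm start` entry point are all assumptions — adjust to your project:

```dockerfile
# Sketch only — assumes a Node.js app with package.json at the repo root.
FROM node:18-alpine

# Everything below runs relative to /jsbackend inside the image.
WORKDIR /jsbackend

# Copy the app source from the build context into the image.
COPY . .

# Install the dependencies listed in package.json INTO the image,
# so the container has everything it needs at runtime.
RUN npm install

CMD ["npm", "start"]
```

You'd build it from the repo root with something like `docker build -t namedwebapp .` (the tag name is a placeholder).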
For your first question: technically no, depending on how you want the Dockerfile to operate. You'll need a way to automate uploading your Docker image each time there's a new build (i.e. whenever you make changes to that repo). Alternatively, you could build your image so that it "borrows" the changing code when you run it, by mounting it as a volume (which relates to your second question).
You can mount a directory at the point of `docker run`; this lets you use a stable environment with a bind mount of your repository's directory, so changes on the host show up inside the container. However, if you'd rather copy the code in at build time, you can set the working directory and/or copy it in via: COPY <local_directory_where_/jsbackend_is_located> /jsbackend
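As a sketch of the run-time mount option (the image name and start command are placeholders for whatever your app actually uses):

```shell
# Bind-mount the current checkout into the container at /jsbackend,
# so code changes on the host are visible inside without rebuilding.
docker run --rm -it \
  -v "$(pwd)":/jsbackend \
  -w /jsbackend \
  namedwebapp npm start
```

Run from the repo root; `-w /jsbackend` makes the mounted directory the working directory inside the container.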
As for the third question, this is tricky, and the same principles apply. If you plan on cloning the repo into the Docker image (i.e. a fresh clone), then yes, you'll need to run your package install during the build. Alternatively, if you only need to update the packages every so often, I'd again suggest doing a build whenever the packages change: install locally, then copy the result into the image.
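One pattern worth knowing here (a sketch, assuming npm and a standard layout): copy package.json in first and install, then copy the rest of the source. Docker caches each layer, so `npm install` only re-runs when package.json changes, not on every code edit:

```dockerfile
FROM node:18-alpine
WORKDIR /jsbackend

# Copy only the dependency manifests first, so this layer stays cached
# until package.json (or package-lock.json) actually changes.
COPY package*.json ./
RUN npm install

# Now copy the rest of the source; edits here don't invalidate the install layer.
COPY . .

CMD ["npm", "start"]
```

This keeps rebuilds fast without having to install packages locally and copy them in.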
Hopefully that makes sense
tldr;
Image building: push to git --> trigger an automatic image build remotely (runner / actions) --> updated image you can pull from anywhere.
To prevent oddities in package installation: run the image with a volume mount of the repo root folder.
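The "build remotely, pull from anywhere" flow above boils down to a few commands; in a CI runner these would live in the workflow script. The registry host and tag names below are placeholders:

```shell
# Build from the repo root, tag for a registry, and push.
docker build -t namedwebapp:latest .
docker tag namedwebapp:latest your-registry.example.com/namedwebapp:latest
docker push your-registry.example.com/namedwebapp:latest

# On the deployment host (e.g. your AWS instance), pull and run it:
docker pull your-registry.example.com/namedwebapp:latest
docker run -d your-registry.example.com/namedwebapp:latest
```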