New to DevOps: AWS, GitLab, Docker, Node, React, Storybook

I have just recently started working at a company as a Jr. UI Engineer (I am a Sr. UX/UI Designer with experience in front-end JS technologies, and I took this role as a challenge and a chance to learn more). When I saw what GitLab and DevOps can do, it blew my mind, and I added DevOps to my to-learn list. I honestly did not know DevOps tooling had gotten so good and so automated over the years; I remember the FileZilla FTP days and just using GitHub, etc.

Here are some of my questions, but before I ask them, I humbly thank you in advance if you decide to offer some direction and knowledge.

I will try to be as detailed as I can be.

Technology I am planning to use: GitLab (private), Docker, npm (private registry), AWS Elastic Beanstalk for full apps (Node, React, Express, EC2, S3, DynamoDB, etc.), and EC2 for UI libraries with a Storybook build (React, React Native, Electron, etc.)

So with that in mind:

How can I deploy to AWS EC2 via GitLab?
I hear lots of Docker talk around, but to be perfectly honest, I only have a rough idea of what it does, and I am not even sure whether I need it.

On merges to the dev, stage, and master branches, I want Storybook to build, and I want to deploy that build to AWS EC2.

.gitlab-ci.yml file, storybook-build job:

build storybook:
  stage: storybook
  script:
    - npm run storybook-build
  only:
    - master
    - stage
    - dev

Storybook spits out a storybook-static folder containing an index.html, some JS and CSS files, etc.


I have AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY added to my GitLab CI/CD variables, and the key has super admin access.

So I would need to take this build and deploy it to EC2, right? But how? Maybe Docker plays a role here?
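From what I have read so far, I am guessing the missing piece is a separate deploy job: the build job would expose storybook-static/ as an artifact, and a later job would copy it to the instance over SSH. A rough sketch of what I have in mind (assuming I add SSH_PRIVATE_KEY and EC2_HOST as GitLab CI/CD variables; those names and the web-root path are just my placeholders):

deploy storybook:
  stage: deploy
  image: alpine:latest
  before_script:
    # install an SSH client and rsync inside the job container
    - apk add --no-cache openssh rsync
    # write the deploy key from a CI/CD variable and trust the host
    - mkdir -p ~/.ssh
    - echo "$SSH_PRIVATE_KEY" | tr -d '\r' > ~/.ssh/id_rsa
    - chmod 600 ~/.ssh/id_rsa
    - ssh-keyscan "$EC2_HOST" >> ~/.ssh/known_hosts
  script:
    # copy the static build into the web root on the EC2 instance
    - rsync -avz --delete storybook-static/ "ec2-user@$EC2_HOST:/var/www/storybook/"
  only:
    - master
    - stage
    - dev

Is that roughly the right direction, or is this where Docker comes in?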

How can I publish my npm package via GitLab, with semantic versioning, on merge to the master branch?

I have created an npm token and added it to the GitLab CI/CD variables as NPM_TOKEN.
I also have my project ID added to the GitLab CI/CD variables as CI_PROJECT_ID.

.gitlab-ci.yml file, build job:

build:
  stage: build
  script:
    - npm run build
    - npm config set '//registry.npmjs.org/:_authToken' "${NPM_TOKEN}"
    - echo '//gitlab.com/api/v4/projects/${CI_PROJECT_ID}/packages/npm/:_authToken=${NPM_TOKEN}'>.npmrc
    - npm publish
  artifacts:
    paths:
      - build/
  only:
    - master

I get an error:

 npm ERR! code E401
 npm ERR! 401 Unauthorized - PUT https://registry.npmjs.org/@organization/package-name - You must be logged in to publish packages.
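
Update: from more digging, I suspect I am mixing two registries here. The npm config set line authenticates against registry.npmjs.org, while the .npmrc line points at GitLab's registry, and because it is single-quoted, ${CI_PROJECT_ID} and ${NPM_TOKEN} are written literally instead of being expanded. If I target GitLab's own npm registry instead, the docs suggest something like this (an untested sketch; CI_PROJECT_ID and CI_JOB_TOKEN are predefined by GitLab, and the package's @scope plus a publishConfig in package.json would have to point at the same registry URL):

publish:
  stage: publish
  script:
    # authenticate against this project's npm registry with the predefined
    # job token; double quotes so the shell expands the variables
    - echo "//gitlab.com/api/v4/projects/${CI_PROJECT_ID}/packages/npm/:_authToken=${CI_JOB_TOKEN}" > .npmrc
    - npm publish
  only:
    - master

For the semantic versioning half, the semantic-release package looks like the standard tool, but I have not wired it up yet.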

Is there any way I can automate project creation on AWS via GitLab or some other tech?
For example, whenever I start a new project, I would like to automate the setup with templates, prepopulated variables added in the correct places, a super admin added, etc.

Thank you so much in advance!

If you’re using Docker, I suggest skipping EC2 and going straight to ECS, possibly using AWS Fargate for a streamlined experience. With ECS, you build a Docker container and then just deploy the container, without worrying about spinning up or configuring a VM in EC2. It’s nifty :slight_smile:


Thank you for your reply, @chuckadams!
Are there any resources, tutorials, etc. you can direct me to?

You probably want a tutorial on Docker first. I just went straight to the docs, but others have different styles, so you may need to google for Docker tutorials – there are oodles of them. If you’re familiar with basic system administration like installing packages, Docker will be super simple. Docker Compose is something you’ll want to pick up sooner or later, but it’s a really simple layer on top for connecting multiple containers.

I haven’t used ECS or Fargate myself, only gotten secondhand praise for them. All the cloud providers have services for running containers, so definitely don’t think you have to be locked in to Amazon ECS. In fact, that’s kind of the point of using containers: you can take them and run them pretty much anywhere.
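
To give you a taste of the GitLab side of it: once you have a Dockerfile in the repo, a build-and-push job against GitLab's built-in container registry looks roughly like this (a sketch, not something from your setup; the CI_REGISTRY* and CI_COMMIT_SHORT_SHA variables are predefined by GitLab):

build image:
  stage: build
  image: docker:latest
  services:
    - docker:dind
  variables:
    # let the dind service generate TLS certs in a shared volume
    DOCKER_TLS_CERTDIR: "/certs"
  script:
    # log in to GitLab's container registry with predefined credentials
    - echo "$CI_REGISTRY_PASSWORD" | docker login -u "$CI_REGISTRY_USER" --password-stdin "$CI_REGISTRY"
    # build the image from the repo's Dockerfile and tag it with the commit SHA
    - docker build -t "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA" .
    - docker push "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA"
  only:
    - master

From there, deploying is mostly a matter of pointing whatever runs your containers (ECS or anything else) at that image.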

Then there’s the really serverless stuff like AWS Lambda or Azure Functions, though I find the tooling around them really substandard; in the serverless project I’m doing, I’d really rather be using containers.


Really helpful! Thank you so much!

Probably, but I’m not sure you want to invest your time in automating this. I don’t know how many projects you plan on making, but odds are each one will be different. If you do plan on building the same infrastructure for every project, then you can look into writing custom scripts that automate parts (or all) of your project creation using the AWS CLI. All the major cloud vendors provide a CLI and REST APIs for automating their cloud, so you could even code your infrastructure creation.
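
As a trivial illustration (I’m on Google Cloud, so treat the AWS specifics as a loose sketch with made-up names), such a script might start out like:

#!/bin/sh
# hypothetical bootstrap script for a new project; all names are placeholders
set -e
PROJECT_NAME="$1"

# create a bucket for the project's static assets
aws s3 mb "s3://${PROJECT_NAME}-assets"

# create a DynamoDB table with a simple string primary key
aws dynamodb create-table \
  --table-name "${PROJECT_NAME}-data" \
  --attribute-definitions AttributeName=id,AttributeType=S \
  --key-schema AttributeName=id,KeyType=HASH \
  --billing-mode PAY_PER_REQUEST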

I’d consider this more of a “boilerplate architecture” script than anything else; if you want something along the lines of infrastructure as code or GitOps, you should use a more purpose-built technology.

PS. I use Google Cloud, not AWS, but they are comparable at this level.
