LernaJS Monorepos Project [Docker, GitLab]

Language
date
Jan 24, 2023
thumbnail
i99DEV_Docker_application_and_NodeJs_application_and_GitLab_app_d61a6876-b6b5-4550-b9b3-e70cc04eabfd.png
slug
leranjs_monorepos_project_docker_gitlab
author
status
Public
tags
GitLab
Docker
DevOps
summary
This post covers using a LernaJS monorepo structure, a GitLab CI/CD pipeline, and Docker to automate the deployment of a project's components to a server.
type
Post
updatedAt
Aug 21, 2023 05:57 PM
Status
Done
Person

Project Structure

my-project/
ā”œā”€ā”€ packages/
ā”‚   ā”œā”€ā”€ api/
ā”‚   ā”‚   ā”œā”€ā”€ package.json
ā”‚   ā”‚   ā”œā”€ā”€ src/
ā”‚   ā”‚   ā”‚   ā”œā”€ā”€ index.js
ā”‚   ā”‚   ā”‚   ā”œā”€ā”€ routes/
ā”‚   ā”‚   ā”‚   ā”‚   ā”œā”€ā”€ index.js
ā”‚   ā”‚   ā”‚   ā”‚   ā”œā”€ā”€ users.js
ā”‚   ā”‚   ā”‚   ā”œā”€ā”€ models/
ā”‚   ā”‚   ā”‚   ā”‚   ā”œā”€ā”€ index.js
ā”‚   ā”‚   ā”‚   ā”‚   ā”œā”€ā”€ user.js
ā”‚   ā”œā”€ā”€ admin-dashboard/
ā”‚   ā”‚   ā”œā”€ā”€ package.json
ā”‚   ā”‚   ā”œā”€ā”€ src/
ā”‚   ā”‚   ā”‚   ā”œā”€ā”€ index.js
ā”‚   ā”‚   ā”‚   ā”œā”€ā”€ components/
ā”‚   ā”‚   ā”‚   ā”‚   ā”œā”€ā”€ index.js
ā”‚   ā”‚   ā”‚   ā”‚   ā”œā”€ā”€ user-management.js
ā”‚   ā”œā”€ā”€ web/
ā”‚   ā”‚   ā”œā”€ā”€ package.json
ā”‚   ā”‚   ā”œā”€ā”€ src/
ā”‚   ā”‚   ā”‚   ā”œā”€ā”€ index.js
ā”‚   ā”‚   ā”‚   ā”œā”€ā”€ pages/
ā”‚   ā”‚   ā”‚   ā”‚   ā”œā”€ā”€ index.js
ā”‚   ā”‚   ā”‚   ā”‚   ā”œā”€ā”€ about.js
ā”‚   ā”‚   ā”‚   ā”œā”€ā”€ components/
ā”‚   ā”‚   ā”‚   ā”‚   ā”œā”€ā”€ index.js
ā”‚   ā”‚   ā”‚   ā”‚   ā”œā”€ā”€ header.js
ā”œā”€ā”€ lerna.json
ā”œā”€ā”€ package.json
In this example, the project is structured with a top-level packages directory that contains the different components of the project, each in its own subdirectory. For example, the api component is in the packages/api directory, the admin-dashboard component is in the packages/admin-dashboard directory, and the web component is in the packages/web directory. Each component has its own package.json file and contains its own source code, organized in a way that makes sense for that component.
At the root of the project, there is a lerna.json configuration file for LernaJS and a package.json file for the entire project. The lerna.json file contains information about the project's Lerna configuration and the packages that are managed by Lerna. The package.json file contains information about the entire project, such as its dependencies and scripts.
This is just one example of how a project structure can be designed using LernaJS, and the exact structure of the project will depend on the specific requirements of the project.

Issue

The issue at hand is related to managing and deploying a monorepo project that contains multiple components, specifically an API, an admin dashboard, and a web component. The project is utilizing LernaJS, a tool for managing monorepos, to organise and manage the different components within the project.
GitLab is being used for continuous integration and deployment (CI/CD) to automate the process of building and pushing Docker images to a server. The pipeline is configured to build and package the project's different components into separate Docker images, and then push those images to the server for deployment.
sequenceDiagram
    participant Dev as Developer
    participant GitLab as GitLab CI/CD
    participant Docker as Docker
    participant Server as Deployment Server
    
    Dev->>GitLab: Push code changes
    GitLab->>GitLab: Build and test changes
    GitLab->>Docker: Build and package into Docker images
    Docker->>Server: Push images to server
    Server->>Server: Deploy images to production

Solution

Set Up and Configure LernaJS

First, verify that LernaJS is correctly set up and configured for the project. This includes ensuring that the project's lerna.json file is properly configured and that the packages directory is set up correctly.
Here's an example of what the lerna.json file might look like for a project that has three packages, api, admin-dashboard, and web:
{
  "lerna": "3.22.1",
  "version": "0.1.0",
  "packages": [
    "packages/api",
    "packages/admin-dashboard",
    "packages/web"
  ],
  "npmClient": "yarn",
  "useWorkspaces": true
}
In this example, the lerna.json file specifies Lerna version 3.22.1 and a starting version of 0.1.0. The packages array lists the locations of the packages within the project: api, admin-dashboard, and web live in packages/api, packages/admin-dashboard, and packages/web respectively. npmClient is set to yarn, which tells Lerna to use Yarn as the package manager, and useWorkspaces is set to true, which tells Lerna to use Yarn's workspaces feature to manage the packages.
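With npmClient set to yarn and useWorkspaces enabled, the root package.json also needs a workspaces field (Yarn additionally requires the workspace root to be private). A minimal sketch, with the project name and Lerna version as assumptions, might look like:

```json
{
  "name": "my-project",
  "private": true,
  "workspaces": [
    "packages/*"
  ],
  "devDependencies": {
    "lerna": "^3.22.1"
  }
}
```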
Here's a diagram of the project structure using Mermaid syntax:
graph LR
    subgraph packages
        api(api)
        admin-dashboard(admin-dashboard)
        web(web)
    end
    lerna[lerna.json]
    api --> lerna
    admin-dashboard --> lerna
    web --> lerna
āš ļø
When it comes to handling breaking changes, one approach you can take is to disable the use of workspaces in Lerna and instead manage the dependencies of each package individually. This allows you to update the dependencies of each package separately and test them individually before updating the rest of the packages.
To disable workspaces in Lerna, set the useWorkspaces property in the lerna.json file to false.
{
  "lerna": "3.22.1",
  "version": "0.1.0",
  "packages": [
    "packages/api",
    "packages/admin-dashboard",
    "packages/web"
  ],
  "npmClient": "yarn",
  "useWorkspaces": false
}
With this configuration, Lerna will not use the workspaces feature and will not automatically link dependencies between packages. Instead, you will need to manage the dependencies of each package individually by editing the package.json file of each package.
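For example, if two packages depend on a shared library, each package's package.json pins its own version independently, so one package can be upgraded and tested before the others. The library name and versions below are purely illustrative:

```json
{
  "name": "api",
  "version": "0.1.0",
  "dependencies": {
    "some-shared-lib": "2.1.0"
  }
}
```

After editing each package.json, running lerna bootstrap installs the dependencies for every package.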

Dockerfile

A Dockerfile is a script that contains instructions for building a Docker image. Each package in your project will likely require its own Dockerfile, as the build process and dependencies may differ between packages.
Here is an example of a Dockerfile for a package called api:
# Use an official Node.js runtime as the base image
FROM node:14

# Set the working directory
WORKDIR /app

# Copy package.json and package-lock.json
COPY package*.json ./

# Install dependencies
RUN npm ci

# Copy the rest of the source code
COPY . .

# Expose the port the app will run on
EXPOSE 3000

# Start the app
CMD ["npm", "start"]
This Dockerfile starts from the official Node.js 14 runtime as the base image. It sets the working directory to /app and copies the package.json and package-lock.json files, then runs npm ci to install the dependencies. Next it copies the rest of the source code, exposes port 3000, and finally starts the app with npm start.
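The frontend packages can follow the same pattern, but a multi-stage build is a common alternative for packages like web that compile down to static assets. This sketch assumes a build script that outputs to dist/, which may differ from the project's actual setup:

```dockerfile
# Stage 1: build the static assets
FROM node:14 AS builder
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
RUN npm run build

# Stage 2: serve them with nginx
FROM nginx:alpine
COPY --from=builder /app/dist /usr/share/nginx/html
EXPOSE 80
```

The final image contains only nginx and the built assets, which keeps it much smaller than an image carrying the full Node.js toolchain.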

Docker Compose

Docker Compose is a tool for defining and running multi-container Docker applications. It allows you to define the services that make up your application in a docker-compose.yml file, and then start and stop those services with a single command.
Here is an example of a docker-compose.yml file for an application that includes the api, admin-dashboard, and web packages:
version: "3"
services:
  api:
    build:
      context: packages/api
    ports:
      - "3000:3000"
  admin-dashboard:
    build:
      context: packages/admin-dashboard
    ports:
      - "3001:3000"
  web:
    build:
      context: packages/web
    ports:
      - "3002:3000"
In this example, the docker-compose.yml file defines three services: api, admin-dashboard, and web. Each service is built from the corresponding package directory, and maps the host port to the container port.
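If the frontends call the API at runtime, the services can also be linked with depends_on and environment variables. The variable name API_URL below is an assumption about how the frontends are configured:

```yaml
version: "3"
services:
  api:
    build:
      context: packages/api
    ports:
      - "3000:3000"
  web:
    build:
      context: packages/web
    ports:
      - "3002:3000"
    depends_on:
      - api
    environment:
      - API_URL=http://api:3000
```

Inside the Compose network, services reach each other by service name, so the web container talks to http://api:3000 rather than a host address.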
When adding Docker to a project, the file structure of the project will likely change to include the necessary files for building and running the Docker images. Here's an example of what the file structure of a project that includes the api, admin-dashboard, and web packages, as well as Docker, might look like:
my-project/
ā”œā”€ā”€ packages/
ā”‚   ā”œā”€ā”€ api/
ā”‚   ā”‚   ā”œā”€ā”€ src/
ā”‚   ā”‚   ā”œā”€ā”€ package.json
ā”‚   ā”‚   ā”œā”€ā”€ package-lock.json
ā”‚   ā”‚   ā”œā”€ā”€ Dockerfile
ā”‚   ā”œā”€ā”€ admin-dashboard/
ā”‚   ā”‚   ā”œā”€ā”€ src/
ā”‚   ā”‚   ā”œā”€ā”€ package.json
ā”‚   ā”‚   ā”œā”€ā”€ package-lock.json
ā”‚   ā”‚   ā”œā”€ā”€ Dockerfile
ā”‚   ā”œā”€ā”€ web/
ā”‚   ā”‚   ā”œā”€ā”€ src/
ā”‚   ā”‚   ā”œā”€ā”€ package.json
ā”‚   ā”‚   ā”œā”€ā”€ package-lock.json
ā”‚   ā”‚   ā”œā”€ā”€ Dockerfile
ā”œā”€ā”€ lerna.json
ā”œā”€ā”€ .gitlab-ci.yml
ā”œā”€ā”€ docker-compose.yml
ā”œā”€ā”€ README.md
ā””ā”€ā”€ ...

GitLab CI/CD

GitLab CI/CD (Continuous Integration and Continuous Deployment) is a built-in feature of GitLab that allows you to automatically build, test, and deploy your code changes. It is based on the popular open-source tool, GitLab Runner, which runs the jobs defined in the project's .gitlab-ci.yml file.
Here's an example of a .gitlab-ci.yml file for a project that includes the api, admin-dashboard, and web packages, as well as Docker:
stages:
  - build
  - test
  - deploy

build_api:
  stage: build
  script:
    - cd packages/api
    - docker build -t my-project-api .
  artifacts:
    paths:
      - packages/api

test_api:
  stage: test
  script:
    - cd packages/api
    - docker run my-project-api npm test

deploy_api:
  stage: deploy
  script:
    - cd packages/api
    - docker push my-project-api
  environment:
    name: production
    url: my-project-api.example.com

build_admin_dashboard:
  stage: build
  script:
    - cd packages/admin-dashboard
    - docker build -t my-project-admin-dashboard .
  artifacts:
    paths:
      - packages/admin-dashboard

test_admin_dashboard:
  stage: test
  script:
    - cd packages/admin-dashboard
    - docker run my-project-admin-dashboard npm test

deploy_admin_dashboard:
  stage: deploy
  script:
    - cd packages/admin-dashboard
    - docker push my-project-admin-dashboard
  environment:
    name: production
    url: my-project-admin-dashboard.example.com
    
The .gitlab-ci.yml file follows the same pattern for the web package: the build, test, and deploy stages mirror those of the api and admin-dashboard packages, with the job names and commands adjusted accordingly. You can therefore continue the .gitlab-ci.yml file by adding build, test, and deploy jobs for the web package, plus any other stages or commands your deployment requires.
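Following that pattern, the web jobs might look like this (mirroring the api and admin-dashboard jobs above):

```yaml
build_web:
  stage: build
  script:
    - cd packages/web
    - docker build -t my-project-web .
  artifacts:
    paths:
      - packages/web

test_web:
  stage: test
  script:
    - cd packages/web
    - docker run my-project-web npm test

deploy_web:
  stage: deploy
  script:
    - cd packages/web
    - docker push my-project-web
  environment:
    name: production
    url: my-project-web.example.com
```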

GitLab CI/CD & Docker Compose

When using GitLab CI/CD with Docker and Docker Compose, the .gitlab-ci.yml file is used to define the pipeline for building, testing and deploying the application. The pipeline typically includes several stages, such as build, test, and deploy. For each stage, the pipeline runs a set of commands that are defined in the .gitlab-ci.yml file. The pipeline can be triggered by different events, such as pushing code changes to the GitLab repository or merging code into the master branch.
In the build stage, the pipeline uses the Dockerfile for each package to build the corresponding Docker image. The pipeline runs the docker build command along with the location of the package's Dockerfile as an argument. Once the images are built, the pipeline can also tag them with a version number or the Git commit hash.
In the test stage, the pipeline runs tests against the built images. This can be done by starting the images in containers and running the test commands against the application inside each container.
In the deploy stage, the pipeline uses docker-compose to deploy the application. The pipeline runs the docker-compose up -d command to start the services defined in the docker-compose.yml file. The pipeline can also run docker-compose down command to stop the services and update the images with new versions.
Here's an example of how the stages might be defined in the .gitlab-ci.yml file:
stages:
  - build
  - test
  - deploy

build:
  stage: build
  script:
    - docker build -t api -f packages/api/Dockerfile packages/api
    - docker build -t admin-dashboard -f packages/admin-dashboard/Dockerfile packages/admin-dashboard
    - docker build -t web -f packages/web/Dockerfile packages/web

test:
  stage: test
  script:
    - docker-compose -f docker-compose.test.yml up --exit-code-from api
    - docker-compose -f docker-compose.test.yml down
In this example, the test stage uses docker-compose to run the services defined in a separate docker-compose.test.yml file. The up command is run with the --exit-code-from option, which allows the pipeline to fail if the tests fail. The down command is used to stop the services after the tests are completed.
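A docker-compose.test.yml for this setup might override the api service's command to run its test suite; the file below is a sketch of that idea, assuming the images were tagged api, admin-dashboard, and web in the build stage:

```yaml
version: "3"
services:
  api:
    image: api
    command: npm test
  admin-dashboard:
    image: admin-dashboard
  web:
    image: web
```

Because the pipeline runs up with --exit-code-from api, the job's result is taken from the exit code of the api container's test run.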
The deploy stage is similar to the build stage, but instead of building the images, it pulls them from a registry (or reuses the ones just built) and runs them with the docker-compose up -d command. The pipeline can also run docker-compose down to stop the services before updating the images with new versions.
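A deploy job along those lines might look like this, assuming the runner has access to the target Docker host and that a registry may or may not be in use:

```yaml
deploy:
  stage: deploy
  script:
    # Pull newer images if a registry is used; fall back to local images
    - docker-compose pull || true
    - docker-compose up -d
  environment:
    name: production
```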
It's worth noting that this is just an example and the pipeline can vary depending on the requirements of the project, but this is a general overview of how GitLab CI/CD, Docker and Docker Compose can be used together to build, test and deploy a project.

Conclusion

Using Lerna with a monorepo, GitLab CI/CD, Docker, and Docker Compose can help you manage and deploy a complex application with multiple packages. Lerna manages the dependencies and versioning of the packages, GitLab CI/CD automates the build, test, and deployment process, Docker containerizes the application so it can run in different environments, and Docker Compose defines and runs multi-container applications.
To resolve package version conflicts, one approach is to use Yarn workspaces or npm workspaces to manage the dependencies; another is to disable workspaces in Lerna and manage the dependencies of each package individually.
It's important to verify that the pipeline is properly configured and running the correct commands, and that the server is properly configured and the deployment is working as expected. This ensures the application is accessible and running correctly for end users.
Overall, using these tools can help to simplify the management and deployment of a complex application, and ensure that it stays up-to-date and stable.