Stator
🚀 Quick Start
The interactive CLI will guide you through setting up your project.
```sh
npm run get-started
```
📚 About the Project
Have you ever started a new project by yourself?
If so, you probably know that it is tedious to set up all the necessary tools.
Just like you, I enjoy coding the most, not writing boilerplate.
Say hi to stator, a full-stack TypeScript template that enforces conventions, handles releases, automates deployments and much more!
If you want more details about how this idea was implemented, I recommend reading the series of blog articles I wrote on the topic.
Demo Application
🦄 This template includes a demo todo application that serves as an example of sound patterns. Of course, you won't be creating a todo application for your project, but you can use this as an example of useful patterns and learn how to use the technologies presented in this project.
Technical Stack
For a detailed list of all those technologies, you can read this blog article.
| Deployment | Database | Backend | Frontend | Testing | Conventions |
| --- | --- | --- | --- | --- | --- |
| DigitalOcean App Platform | Postgres | Nest | React | jest | commitlint |
| semantic-release | Mongo | Fastify | React Router | cypress | eslint |
| docker-compose | TypeORM | Swagger | Redux | | prettier |
| | NestJs CRUD | ReDoc | Redux Toolkit | | |
| | | | Material UI | | |
💥 Getting Started
Prerequisites
- Docker Compose
- Node.js v14.x
Copy the template
This repository is a repository template, which means you can use the **Use this template** button at the top to create your own project based on it.

*Note: If you have an existing repository, this will require more work. I would recommend using the **Use this template** button and migrating your current code to the newly created project.*
Make it yours
You will now want to make this project yours by replacing all organization and project naming occurrences with your own names. Thankfully, we have a script just for that:
```sh
npm run rename-project -- --organization {YOUR_ORGANIZATION_NAME} --project {YOUR_PROJECT_NAME}
```
*Note: I highly recommend that the project name be the same as your git repository name.*

On completion, a confirmation message will be displayed.
Run the application
First, install the dependencies:

```sh
npm i
```

Then, run the whole stack:

```sh
npm run postgres
npm start api
npm start webapp
```

Finally, why not test it:

```sh
npm test api && npm run e2e webapp-e2e
```
For a full list of available commands, consult the `package.json`.
Continuous Integration
This template integrates GitHub Actions for its Continuous Integration. The existing workflows are under `.github/workflows`.

Currently, the CI will ensure all your apps work properly by building and testing them.

For your pull requests, it will create a review application that hosts your whole stack on a VM.
Once everything is ready, a new comment will be added to your pull request with the deployment URL.
When the PR is closed, your review app will be destroyed, as its purpose will have been served.
Its sacrifice will be for the greater good, and also your wallet.
To have the CI working, you must:

- (Optional) If you want review apps to work, follow the instructions provided by the `get-started` CLI.
- (Optional) Link your repository with Codecov by inserting your `CODECOV_TOKEN` in GitHub secrets.
- (Optional) Insert your Nx Cloud access token in GitHub secrets under `NX_CLOUD_TOKEN`. This enables caching and faster build times.
Deployment
The application can be deployed in two different ways, depending on your objectives.
Digital Ocean App Platform
For a simple and fast deployment, the new App Platform from Digital Ocean makes it easy to work with monorepos. For our todo app, the config file lies under `.do/app.yaml`. There, you can change the configuration of the different apps being deployed. The spec can be found here.
To deploy this full stack application yourself, follow the steps below:
- Create an account on Digital Ocean Cloud (this is a sponsored link) and enable GitHub access
- Install the doctl CLI
- Run `doctl apps create --spec .do/app.yaml`
- View the build, logs, and deployment URL here
Once done, your app will be hooked to master branch commits as defined in the spec. Therefore, on merge, the application will update. To update the spec of the application, first get the application id with `doctl apps list`, then simply run `doctl apps update <app id> --spec .do/app.yaml`.
⚙️ Implementation
Database
Postgres
There are two databases available: Postgres and Mongo.
To ensure your developers don't get into any trouble while installing those, they are already pre-configured with `docker-compose.yml` files.
By default, the project uses postgres. If this is what you want, you're good to go; everything will work out of the box.
Migrations
By default, automatic synchronization between your models and the database is activated.
This means that changes to your models will be automatically reflected in your database schema.
If you would like to control your migrations manually, you can do so by setting `synchronize` to `false` in the `orm-config.ts` file.
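If you opt for manual control, the relevant setting lives in your TypeORM connection options. Here is a minimal sketch of what such a configuration could look like, assuming a Postgres connection; the values and file globs below are illustrative and not the template's actual `orm-config.ts`:

```ts
import { ConnectionOptions } from "typeorm"

// Minimal sketch, assuming a Postgres connection similar to the template's defaults.
// Credentials, entity globs, and migration paths are illustrative assumptions.
const ormConfig: ConnectionOptions = {
  type: "postgres",
  host: "localhost",
  port: 5432,
  username: "postgres",
  password: "postgres",
  database: "stator",
  entities: ["dist/**/*.entity.js"],
  migrations: ["dist/migrations/*.js"],
  // Disable automatic schema sync to take control of migrations manually.
  synchronize: false,
}

export default ormConfig
```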
Generate a migration from your modified schemas:

```sh
npm run typeorm -- migration:generate -n {MIGRATION_NAME}
```
This will check the difference between models for your defined entities and your database schemas. If it finds changes, it will generate the appropriate migration scripts.
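For reference, a generated migration is simply a class implementing TypeORM's `MigrationInterface`. A simplified sketch (the class name and SQL below are made up for illustration, since the real file is produced by the CLI from your entity changes) looks like this:

```ts
import { MigrationInterface, QueryRunner } from "typeorm"

// Simplified sketch of a generated migration; the actual SQL depends on your entity changes.
export class AddTodoText1620000000000 implements MigrationInterface {
  public async up(queryRunner: QueryRunner): Promise<void> {
    await queryRunner.query(`ALTER TABLE "todo" ADD "text" character varying NOT NULL`)
  }

  public async down(queryRunner: QueryRunner): Promise<void> {
    await queryRunner.query(`ALTER TABLE "todo" DROP COLUMN "text"`)
  }
}
```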
Run all pending migrations:

```sh
npm run typeorm -- migration:run
```
To get all the information on migrations, consult the TypeORM documentation.
Mongo [NOT RECOMMENDED]
If you would like to use MongoDB, even though it is absolutely not recommended because it currently doesn't work well with TypeORM, you can still do so by updating the connection info under `./apps/api/src/config/configuration.ts`.
You simply need to replace `type: "postgres"` with `type: "mongo"`.
Make sure you run the mongo container using the command `npm run mongo`.
Data seeding
If you want your database to be pre-populated with data, it is very easy to do so.
For Postgres, add your SQL statements to the `apps/database/postgres/init.sql` file.
For Mongo, add your mongo statements to the `apps/database/mongo/mongo-entrypoint/seed-data.js` file.
Backend
We are using cutting-edge technologies to ensure that you get the best development experience one could hope for. To communicate with the database, we make use of the great typeorm. We use the code-first approach, which means that defining your models also defines the corresponding tables in your database. Here is an example:
```ts
import { Column, Entity } from "typeorm"
import { RootEntity } from "./root.entity"
import { MinLength } from "class-validator"

@Entity()
export class Todo extends RootEntity {
  @Column()
  @MinLength(5, { always: true })
  text: string
}
```
To serve your API requests, we make use of nest alongside fastify to ensure blazing fast performance.
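For context, here is a minimal sketch of how a Nest application is typically bootstrapped with Fastify; the module path and port are assumptions, not necessarily the template's exact setup:

```ts
import { NestFactory } from "@nestjs/core"
import { FastifyAdapter, NestFastifyApplication } from "@nestjs/platform-fastify"

// The module path is an illustrative assumption.
import { AppModule } from "./app/app.module"

// Create a Nest application backed by Fastify instead of the default Express adapter.
async function bootstrap() {
  const app = await NestFactory.create<NestFastifyApplication>(AppModule, new FastifyAdapter())
  await app.listen(3333)
}

bootstrap()
```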
To reduce the boilerplate commonly found around creating a new entity, we are using the nestjsx/crud plugin that will generate all necessary routes for CRUD operations.
Here is an example from our todo app:
```ts
import { Controller } from "@nestjs/common"
import { Crud, CrudController } from "@nestjsx/crud"
import { Todo } from "@stator/models"

import { TodosService } from "./todos.service"

@Crud({ model: { type: Todo } })
@Controller("todos")
export class TodosController implements CrudController<Todo> {
  constructor(public service: TodosService) {}
}
```
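The matching service can stay just as thin. Below is a sketch of what it might look like using `@nestjsx/crud-typeorm`; the template's actual service may differ slightly:

```ts
import { Injectable } from "@nestjs/common"
import { InjectRepository } from "@nestjs/typeorm"
import { TypeOrmCrudService } from "@nestjsx/crud-typeorm"
import { Repository } from "typeorm"

import { Todo } from "@stator/models"

// Sketch of a CRUD service: TypeOrmCrudService supplies the data-access logic
// used by the routes generated from the @Crud decorator.
@Injectable()
export class TodosService extends TypeOrmCrudService<Todo> {
  constructor(@InjectRepository(Todo) repo: Repository<Todo>) {
    super(repo)
  }
}
```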
Of course, you're probably wondering if this actually works. To convince you, we have implemented integration tests that perform real requests using supertest.
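To give you an idea, here is a hedged sketch of such a test using jest and supertest; the module path, route shape, and response fields are assumptions rather than the template's actual test file:

```ts
import { INestApplication } from "@nestjs/common"
import { Test } from "@nestjs/testing"
import * as supertest from "supertest"

// The import path is an illustrative assumption.
import { AppModule } from "../src/app/app.module"

describe("todos (integration)", () => {
  let app: INestApplication

  beforeAll(async () => {
    // Build a testing module from the real application module so requests hit real code.
    const moduleRef = await Test.createTestingModule({ imports: [AppModule] }).compile()
    app = moduleRef.createNestApplication()
    await app.init()
  })

  afterAll(async () => {
    await app.close()
  })

  it("creates a todo and reads it back", async () => {
    const created = await supertest(app.getHttpServer())
      .post("/todos")
      .send({ text: "write the docs" })
      .expect(201)

    await supertest(app.getHttpServer()).get(`/todos/${created.body.id}`).expect(200)
  })
})
```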
Can I view the generated endpoints? Well, of course, you can!
We now have generated Swagger documentation that is viewable with the beautiful redoc.
Once you navigate to localhost:3333, you will see the generated API documentation.
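For reference, exposing the generated OpenAPI document in a Nest application generally follows the pattern below; the title, version, mount path, and the exact ReDoc wiring are assumptions and may differ from the template's setup:

```ts
import { INestApplication } from "@nestjs/common"
import { DocumentBuilder, SwaggerModule } from "@nestjs/swagger"

// Sketch: build an OpenAPI document from the decorated controllers and serve it.
// The title, version, and mount path are illustrative assumptions.
export function setupApiDocumentation(app: INestApplication): void {
  const config = new DocumentBuilder().setTitle("stator API").setVersion("1.0").build()
  const document = SwaggerModule.createDocument(app, config)
  SwaggerModule.setup("docs", app, document)
}
```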
Frontend
For our webapp, we're using the very popular react alongside redux-toolkit and react-router. We highly recommend that you use function components as demonstrated in the example.
To further reduce the necessary boilerplate, you can generate hooks based on your API swagger by running `npm run generate-api-redux`.
When you add new entities to your API, you should also add them in the output file property of the `tools/generators/open-api-config.ts` file.
If you would like to avoid this, you can generate a single file by removing both properties [`outputFiles`, `filterEndpoints`].
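To make this more concrete, here is a rough sketch of what such a codegen configuration can look like; every value below is an illustrative assumption and not the template's actual `tools/generators/open-api-config.ts`:

```ts
import type { ConfigFile } from "@rtk-query/codegen-openapi"

// Rough sketch of an OpenAPI codegen configuration; all paths, endpoints,
// and filters below are assumptions for illustration only.
const config: ConfigFile = {
  schemaFile: "http://localhost:3333/docs-json", // assumed swagger JSON endpoint
  apiFile: "./src/store/empty-api.ts",           // assumed base RTK Query api slice
  apiImport: "emptyApi",
  hooks: true,
  outputFiles: {
    "./src/store/todos-api.ts": {
      // Only endpoints matching this filter end up in this output file.
      filterEndpoints: [/todo/i],
    },
  },
}

export default config
```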
This script will generate the required RTK Query code and caching keys so your data remains up to date while performing CRUD operations.
For a complete example of CRUD operations, consult the `apps/webapp/src/pages/todos-page.tsx` file.
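To give a rough idea of how the generated hooks are consumed, a minimal function component could look like the sketch below; the hook names are hypothetical and depend on your swagger operation ids:

```tsx
import React from "react"

// Hypothetical generated hooks; the real names come from the OpenAPI operation ids.
import { useGetTodosQuery, useCreateTodoMutation } from "../store/todos-api"

export function TodoList() {
  const { data: todos = [], isLoading } = useGetTodosQuery()
  const [createTodo] = useCreateTodoMutation()

  if (isLoading) return <p>Loading...</p>

  return (
    <div>
      <button onClick={() => createTodo({ text: "try out stator" })}>Add todo</button>
      <ul>
        {todos.map(todo => (
          <li key={todo.id}>{todo.text}</li>
        ))}
      </ul>
    </div>
  )
}
```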
In our example, we are using material-ui, but you could replace that with any other framework.
We also use axios to simplify our request handling, as it works very well with TypeScript.
General
We strongly believe that typing helps create a more robust program; thus, we use TypeScript.
To facilitate and optimize the usage of the monorepo, we make use of NX.
eslint enforces excellent standards, and prettier helps you apply them.
Commit messages must abide by those guidelines. If you need help following them, simply run `npm run commit` and you will be prompted with an interactive menu.
File and directory names are enforced by the custom-made `enforce-file-folder-naming-convention.ts`.
Branch names are enforced before you even commit to ensure everyone adopts the same standard: `{issue-number}-{branch-work-title-kebab-case}` (e.g., `42-add-todo-filtering`).
For end-to-end testing, we use the notorious cypress.
We also have a pre-built CI toolkit for you that will build and run the tests.