# Vertical Slice API Template
This is an ASP.NET Core template based on Vertical Slice Architecture, CQRS, Minimal APIs, API Versioning, and Swagger. Create a new project based on this template by clicking the Use this template button above, or by installing and running the associated NuGet package (see Getting Started for full details).
## ⭐ Support
If you like this project, feel free to ⭐ this repository. It helps out :)
Thanks a bunch for supporting me!
## Install
To install the vertical slice api template from the NuGet registry, run this .NET CLI command:

```bash
dotnet new install Vertical.Slice.Template
```
Or, to install the template locally, clone this repository and run this command in its root:

```bash
dotnet new install .
```
## Features
- ✅ Using Vertical Slice Architecture as a high-level architecture
- ✅ Using the CQRS pattern on top of the MediatR library
- ✅ Using the Mapperly source generator for mappings
- ✅ Using Minimal APIs for handling requests
- ✅ Using FluentValidation and a validation pipeline behaviour on top of MediatR
- ✅ Using Postgres on top of EF Core
- ✅ Using different levels of tests: Unit Tests, Integration Tests and End-to-End Tests
- ✅ Logging with Serilog, plus Elasticsearch and Kibana for collecting and searching structured logs
- ✅ Using Microsoft Tye and PM2 for running the application
- ✅ Using Docker and docker-compose for deployment
- 🚧 Using OpenTelemetry for collecting Metrics and Distributed Tracing
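As a rough sketch of how a FluentValidation pipeline behaviour on top of MediatR can look (the type names here are illustrative, not necessarily the template's exact code; the `Handle` signature follows recent MediatR versions):

```csharp
// A minimal sketch of a FluentValidation pipeline behaviour for MediatR
// (illustrative names, not the template's actual types).
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
using FluentValidation;
using MediatR;

public class ValidationBehavior<TRequest, TResponse> : IPipelineBehavior<TRequest, TResponse>
    where TRequest : notnull
{
    private readonly IEnumerable<IValidator<TRequest>> _validators;

    public ValidationBehavior(IEnumerable<IValidator<TRequest>> validators) =>
        _validators = validators;

    public async Task<TResponse> Handle(
        TRequest request,
        RequestHandlerDelegate<TResponse> next,
        CancellationToken cancellationToken)
    {
        // Run every registered validator for this request type;
        // ValidateAndThrowAsync throws a ValidationException on failure.
        foreach (var validator in _validators)
        {
            await validator.ValidateAndThrowAsync(request, cancellationToken);
        }

        // All validators passed, continue to the request handler.
        return await next();
    }
}
```

Such a behaviour is typically registered once, e.g. with `services.AddTransient(typeof(IPipelineBehavior<,>), typeof(ValidationBehavior<,>));`, so every request passes through validation before reaching its handler.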
## Libraries
- ✔️ .NET 8 - .NET Framework and .NET Core, including ASP.NET and ASP.NET Core
- ✔️ Npgsql Entity Framework Core Provider - Npgsql has an Entity Framework (EF) Core provider. It behaves like other EF Core providers (e.g. SQL Server), so the general EF Core docs apply here as well
- ✔️ FluentValidation - Popular .NET validation library for building strongly-typed validation rules
- ✔️ Swagger & Swagger UI - Swagger tools for documenting APIs built on ASP.NET Core
- ✔️ Serilog - Simple .NET logging with fully-structured events
- ✔️ Polly - .NET resilience and transient-fault-handling library that allows developers to express policies such as Retry, Circuit Breaker, Timeout, Bulkhead Isolation, and Fallback in a fluent and thread-safe manner
- ✔️ Scrutor - Assembly scanning and decoration extensions for Microsoft.Extensions.DependencyInjection
- ✔️ Opentelemetry-dotnet - The OpenTelemetry .NET Client
- ✔️ Newtonsoft.Json - Json.NET is a popular high-performance JSON framework for .NET
- ✔️ AspNetCore.Diagnostics.HealthChecks - Enterprise HealthChecks for the ASP.NET Core Diagnostics Package
- ✔️ NSubstitute - A friendly substitute for .NET mocking libraries
- ✔️ StyleCopAnalyzers - An implementation of StyleCop rules using the .NET Compiler Platform
- ✔️ Mapperly - A .NET source generator for generating object mappings, with no runtime reflection
- ✔️ Mediator - A high-performance implementation of the Mediator pattern in .NET using source generators
- ✔️ NewID - NewId generates sequential unique identifiers that are 128-bit (16 bytes) and fit nicely into a Guid
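As a tiny illustrative example of the last one, the NewId types ship in the MassTransit namespace:

```csharp
using System;
using MassTransit;

// Generates a sequential, index-friendly identifier and converts it to a Guid.
Guid id = NewId.NextGuid();
```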
## Getting Started
- This application uses HTTPS for hosting APIs; to set up a valid certificate on your machine, you can create a self-signed certificate (see the Dev Certificate section under Setup below).
- Install git - https://git-scm.com/downloads.
- Install .NET 8.0 - https://dotnet.microsoft.com/download/dotnet/8.0.
- Install Visual Studio, Rider or VSCode.
- Run `dotnet new install Vertical.Slice.Template` to install the project templates.
- Now, running `dotnet new --list` should show `Vertical.Slice.Template` in the template list.
- Create a folder for your solution and cd into it (the template will use the folder name as the project name).
- Run `dotnet new vsa` (the short name) or `dotnet new Vertical.Slice.Template -n <YourProjectName>` to create a new project from the template.
- Open the `<YourProjectName>.sln` solution and make sure it compiles.
- Navigate to `src/App/<YourProjectName>.Api` and run `dotnet run` to launch the back end (ASP.NET Core Web API).
- Open https://localhost:4000/swagger in a web browser to see the Swagger UI.
## Setup

### Dev Certificate
This application uses HTTPS for hosting APIs. To set up a valid certificate on your machine, you can create a self-signed certificate; see the ASP.NET Core docs on enforcing HTTPS for more details.
- Setup on Windows with PowerShell:
```powershell
dotnet dev-certs https --clean
dotnet dev-certs https -ep $env:USERPROFILE\.aspnet\https\aspnetapp.pfx -p <CREDENTIAL_PLACEHOLDER>
dotnet dev-certs https --trust
```
- Setup on Linux and WSL:
```bash
dotnet dev-certs https --clean
dotnet dev-certs https -ep ${HOME}/.aspnet/https/aspnetapp.pfx -p <CREDENTIAL_PLACEHOLDER>
dotnet dev-certs https --trust
```
Note: `dotnet dev-certs https --trust` is only supported on macOS and Windows. On Linux you need to trust certificates in the way supported by your distribution, and you will likely also need to trust the certificate in your browser (with this certificate we don't get an exception on the HTTPS port for a missing certificate, but the browser will still show the certificate as not trusted).
## Conventional Commit
In this app I use Conventional Commits, and to enforce their rules I use conventional-changelog/commitlint and typicode/husky with a pre-commit hook. To read more about the setup, see the commitlint docs.
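For example, commit messages that follow the convention look like this (the scopes and descriptions here are only illustrative):

```
feat(products): add get-products-by-page endpoint
fix(api): return 404 when the product id does not exist
chore: update nuget dependencies
```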
Here I configured a husky hook for conventional commits:
- Initialize npm (this creates a `package.json` file):

```bash
npm init
```
- Install Husky:

```bash
npm install husky --save-dev
```
- Add `prepare` and `install-dev-cert-bash` scripts for installing and activating the husky hooks to the package.json file:

```bash
npm pkg set scripts.prepare="husky install && dotnet tool restore"
npm pkg set scripts.install-dev-cert-bash="curl -sSL https://aka.ms/getvsdbgsh | bash /dev/stdin -v vs2019 -l ~/vsdbg"
```
- Install CommitLint:

```bash
npm install --save-dev @commitlint/config-conventional @commitlint/cli
```
- Create the `commitlint.config.js` file with this content:

```js
module.exports = { extends: ['@commitlint/config-conventional'] };
```
- Create the Husky folder:

```bash
mkdir .husky
```
- Link Husky and CommitLint:

```bash
npx husky add .husky/commit-msg 'npx --no -- commitlint --edit ${1}'
```
- Activate and install all husky hooks with these commands:

```bash
npm run prepare

# this command should run in git-bash on Windows or bash on Linux
npm run install-dev-cert-bash
```
## Formatting
For formatting I use belav/csharpier, but you can also use `dotnet format`; either can be integrated with your preferred IDE.
Here I configured a husky hook for formatting:
- Initialize npm (this creates a `package.json` file):

```bash
npm init
```
- Install Husky:

```bash
npm install husky --save-dev
```
- To install a tool for local access only (for the current directory and subdirectories), it has to be added to a manifest file, so we create one by running the `dotnet new` command:

```bash
dotnet new tool-manifest
```
- Add the required packages as dependencies with `dotnet tool install`; this records them in the `dotnet-tools.json` file in the `.config` directory created in the preceding step:

```bash
dotnet tool install csharpier
dotnet tool install dotnet-format
```
- Add a `prepare` script to the package.json file for installing and activating the husky hooks that we will add in the next steps, and for restoring the dotnet tools we installed in the previous step:

```bash
npm pkg set scripts.prepare="husky install && dotnet tool restore"
```
- Create the Husky folder:

```bash
mkdir .husky
```
- Add formatting and linting hooks to husky:

```bash
npx husky add .husky/pre-commit "dotnet format && git add -A ."

# Or using csharpier
npx husky add .husky/pre-commit "dotnet csharpier . && git add -A ."
```
- Activate and install all husky hooks with this command:

```bash
npm run prepare
```
## Analyzers

For Roslyn analyzers I use several analyzers, configured in the `.editorconfig` file (a small example follows the list below):
- StyleCop/StyleCop
- JosefPihrt/Roslynator
- meziantou/Meziantou.Analyzer
- Microsoft.VisualStudio.Threading.Analyzers
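As a small illustrative example (SA1309 and RCS1036 are real StyleCop/Roslynator rule IDs, but the severities here are just placeholders), individual analyzer rules can be tuned in `.editorconfig`:

```ini
[*.cs]
# StyleCop: field names should not begin with an underscore (placeholder severity)
dotnet_diagnostic.SA1309.severity = none
# Roslynator: remove redundant empty line (placeholder severity)
dotnet_diagnostic.RCS1036.severity = suggestion
```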
## Application Structure
In this project I used vertical slice architecture (see Restructuring to a Vertical Slice Architecture) together with a feature folder structure.
- We treat each request as a distinct use case or slice, encapsulating and grouping all concerns from front-end to back.
- When we add or change a feature in an n-tier application, we typically touch many different "layers": we change the user interface, add fields to models, modify validation, and so on. Instead of coupling across a layer, we couple vertically along a slice, so each change affects only one slice.
- We minimize coupling between slices and maximize coupling within a slice.
- With this approach, each of our vertical slices can decide for itself how to best fulfill the request. New features only add code; we're not changing shared code and worrying about side effects. The CQRS pattern is a good match for implementing vertical slice architecture.
Here I also used CQRS to decompose features into very small parts, which:

- maximizes performance, scalability and simplicity
- makes adding new features very easy, without any breaking changes in other parts of our code; new features only add code
- makes the code easy to maintain: any change only affects one command or query (one slice), avoiding breaking changes in other parts
- gives us better separation of concerns and cross-cutting concerns (with the help of MediatR behavior pipelines), instead of a big service class doing a lot of things
Using CQRS, our code becomes more aligned with SOLID principles, especially with:

- the Single Responsibility Principle, because the logic responsible for a given operation is enclosed in its own type
- the Open/Closed Principle, because adding a new operation doesn't require editing existing types; instead, you add a new file with a new type representing that operation
Here, instead of some technical splitting (for example, a folder or layer for our services, controllers and data models), which increases dependencies between these technical concerns and forces us to jump between layers or folders, we cut each business functionality into vertical slices, and within each of these slices we have a technical folder structure specific to that feature (command, handlers, infrastructure, repository, controllers, data models, ...).
Usually, when we work on a given functionality, we need some technical things, for example:
- API endpoint (Controller)
- Request Input (Dto)
- Request Output (Dto)
- Some class to handle the request, for example a Command and Command Handler or a Query and Query Handler
- Data Model
Now we can keep all of these things beside each other, which decreases the jumping and dependencies between layers or folders.
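As a rough sketch (simplified, and not the template's exact code), a GetProductById slice can keep its query, result, handler and minimal API endpoint beside each other:

```csharp
// A simplified sketch of a GetProductById slice (illustrative code,
// not the template's actual types).
using System;
using System.Threading;
using System.Threading.Tasks;
using MediatR;
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Routing;

// Request input and output (DTOs) for the slice.
public record GetProductById(Guid Id) : IRequest<GetProductByIdResult>;

public record GetProductByIdResult(Guid Id, string Name, decimal Price);

// The class that handles the request.
public class GetProductByIdHandler : IRequestHandler<GetProductById, GetProductByIdResult>
{
    public Task<GetProductByIdResult> Handle(GetProductById query, CancellationToken ct)
    {
        // Placeholder: a real handler would load the product from the database.
        return Task.FromResult(new GetProductByIdResult(query.Id, "Sample product", 10m));
    }
}

// The minimal API endpoint for the slice.
public static class GetProductByIdEndpoint
{
    public static IEndpointRouteBuilder MapGetProductByIdEndpoint(this IEndpointRouteBuilder app)
    {
        app.MapGet(
            "/api/v1/products/{id:guid}",
            async (Guid id, IMediator mediator, CancellationToken ct) =>
                Results.Ok(await mediator.Send(new GetProductById(id), ct)));

        return app;
    }
}
```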
Keeping such a split works great with CQRS: it segregates our operations and slices the application code vertically instead of horizontally. In our CQRS pattern, each command/query handler is a separate slice. This is where we can reduce coupling between layers: each handler can be a separate code unit, even copy/pasted. Thanks to that, we can tune a specific method without following general conventions (e.g. use a custom SQL query or even a different storage). In a traditional layered architecture, changing the core generic mechanism in one layer can impact all methods.
### Folder Structure
```
src
│   Directory.Build.props
│   Directory.Build.targets
│   Directory.Packages.props
│
├───Vertical.Slice.Template
│   │   CatalogsMetadata.cs
│   │   readme.md
│   │   Vertical.Slice.Template.csproj
│   │
│   ├───Products
│   │   │   ProductConfigurations.cs
│   │   │   ProductMappingProfiles.cs
│   │   │
│   │   ├───Data
│   │   │       ProductEntityTypeConfigurations.cs
│   │   │       SieveProductReadConfigurations.cs
│   │   │
│   │   ├───Dtos
│   │   │   └───v1
│   │   │           ProductDto.cs
│   │   │
│   │   ├───Features
│   │   │   ├───CreatingProduct
│   │   │   │   └───v1
│   │   │   │           CreateProduct.cs
│   │   │   │           CreateProductEndpoint.cs
│   │   │   │           ProductCreated.cs
│   │   │   │
│   │   │   ├───GettingProductById
│   │   │   │   └───v1
│   │   │   │           GetProductById.cs
│   │   │   │           GetProductByIdEndpoint.cs
│   │   │   │
│   │   │   └───GettingProductsByPage
│   │   │       └───v1
│   │   │               GetProductsByPage.cs
│   │   │               GetProductsByPageEndpoint.cs
│   │   │
│   │   ├───Models
│   │   │       Product.cs
│   │   │
│   │   └───ReadModel
│   │           ProductReadModel.cs
│   │
│   └───Shared
│       │   DefaultProblemDetailMapper.cs
│       │
│       ├───Data
│       │   │   CatalogsDbContext.cs
│       │   │   CatalogsDbContextDesignFactory.cs
│       │   │
│       │   └───Migrations
│       │       └───Catalogs
│       │               20230502202201_InitialCatalogsMigration.cs
│       │               20230502202201_InitialCatalogsMigration.Designer.cs
│       │               CatalogsDbContextModelSnapshot.cs
│       │
│       ├───Extensions
│       │   ├───WebApplicationBuilderExtensions
│       │   │       WebApplicationBuilderExtensions.Infrastrcture.cs
│       │   │       WebApplicationBuilderExtensions.ProblemDetails.cs
│       │   │       WebApplicationBuilderExtensions.Storage.cs
│       │   │       WebApplicationBuilderExtensions.Versioning.cs
│       │   │
│       │   └───WebApplicationExtensions
│       │           WebApplicationExtensions.Infrastructure.cs
│       │
│       └───Workers
│               MigrationWorker.cs
│               SeedWorker.cs
│
├───Vertical.Slice.Template.Api
│   │   appsettings.Development.json
│   │   appsettings.json
│   │   appsettings.test.json
│   │   CatalogsApiMetadata.cs
│   │   Program.cs
│   │   Vertical.Slice.Template.Api.csproj
│   │
│   ├───Extensions
│   │   └───WebApplicationBuilderExtensions
│   └───Properties
│           launchSettings.json
│
├───Vertical.Slice.Template.ApiClient
│   │   ClientsMappingProfile.cs
│   │   nswag.json
│   │   swagger.json
│   │   Vertical.Slice.Template.ApiClient.csproj
│   │
│   ├───Catalogs
│   │   │   CatalogsApiClientOptions.cs
│   │   │   CatalogsClient.cs
│   │   │   ICatalogsClient.cs
│   │   │   Product.cs
│   │   │
│   │   └───Dtos
│   │           CreateProductClientDto.cs
│   │           GetGetProductsByPageClientDto.cs
│   │
│   ├───Extensions
│   │       ServiceCollectionExtensions.cs
│   │
│   └───RickAndMorty
│       │   IRickAndMortyClient.cs
│       │   RickAndMortyClient.cs
│       │   RikAndMortyApiClientOptions.cs
│       │
│       ├───Dtos
│       │       CharacterResponseClientDto.cs
│       │       LocationClientDto.cs
│       │       OriginClientDto.cs
│       │
│       └───Model
│               Character.cs
│               Location.cs
│               Origin.cs
```
## Vertical Slice Flow
TODO
## How to Run
For running and debugging this application, we can use our preferred dev environment, for example Visual Studio, VSCode or Rider. For me it's Rider: just open the Vertical.Slice.Template.sln solution file in the IDE, then run or debug the application.
### Using PM2
For running all microservices and controlling their running mode, we can use the PM2 tool. To install pm2 globally on our system, use this command:

```bash
npm install pm2 -g
```
After installing pm2 on our machine, we can run all of our microservices by executing the command below in the root of the application, using the pm2.yaml file:

```bash
pm2 start pm2.yaml
```
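If you need a starting point, a pm2 process file can look roughly like this (a minimal sketch; the service name and path are assumptions, not necessarily the repository's actual pm2.yaml):

```yaml
apps:
  - name: catalogs-api      # placeholder service name
    script: dotnet          # run the dotnet CLI as the process
    args: run
    cwd: ./src/Vertical.Slice.Template.Api
    interpreter: none       # dotnet is a binary, not a node script
```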
Some useful PM2 commands:

```bash
pm2 -h
pm2 list
pm2 logs
pm2 monit
pm2 info pm2.yaml
pm2 stop pm2.yaml
pm2 restart pm2.yaml
pm2 delete pm2.yaml
```
### Using Tye
We can run our microservices with Microsoft's new tool, Project Tye.

Project Tye is an experimental developer tool that makes developing, testing, and deploying microservices and distributed applications easier.
To install Tye as a local tool alongside our existing .NET tools, we can use the following command:

```bash
dotnet tool install Microsoft.Tye --version "0.11.0-alpha.22111.1"
```
This adds the tool to the .NET tools manifest file, which you then check in to the repository. To install all of the tools listed in the manifest file, run the dotnet tool restore command:

```bash
dotnet tool restore
```
To install Tye globally on our machine, use this command:

```bash
dotnet tool install -g Microsoft.Tye --version "0.11.0-alpha.22111.1"
```
Or, if you already have Tye installed and want to update it:

```bash
dotnet tool update -g Microsoft.Tye
```
After installing Tye, we can run our microservices with the following command in the root of our project:

```bash
tye run
```
One of the key features of tye run is a dashboard for viewing the state of your application; navigate to http://localhost:8000 to see it. We can also run docker images with Tye, and Tye makes deploying your application to Kubernetes very simple, with minimal knowledge or configuration required.
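For reference, a minimal tye.yaml could look roughly like this (an illustrative sketch; the application name and project path are assumptions, not necessarily the repository's actual file):

```yaml
name: vertical-slice-template
services:
  - name: catalogs-api   # placeholder service name
    project: src/Vertical.Slice.Template.Api/Vertical.Slice.Template.Api.csproj
```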
## Contribution

The application is in development status. Feel free to submit a pull request or create an issue.
## License

This project is under the MIT license.