
WebApi with .NET

Samples and resources of how to design WebApi with .NET

Support

Feel free to create an issue if you have any questions or requests for more explanation or samples. I also take Pull Requests!

πŸ’– If this repository helped you - I'd be more than happy if you join the group of my official supporters at:

πŸ‘‰ Github Sponsors

Prerequisites

  1. Install .NET Core SDK 3.1 from link.
  2. Install one of the following IDEs:
    • Visual Studio - link - for Windows only. Community edition is available for free,
    • Visual Studio for Mac - link - for macOS only. Available for free,
    • Visual Studio Code - link - with the C# plugin - cross-platform support. Available for free,
    • Rider - link - cross-platform support. Paid, but free options are available (for open source, students, user groups, etc.)

Project Configuration

Routing

From the documentation: "Routing is responsible for matching incoming HTTP requests and dispatching those requests to the app's executable endpoints."

In other words, routing is responsible for finding the exact endpoint based on the request parameters - usually by URL pattern matching.

An endpoint executes the logic that creates an HTTP response based on the request.

To use routing and endpoints, you need to call the UseRouting and UseEndpoints extension methods on the app builder in the Startup.Configure method. That registers routing in the middleware pipeline.

Note that those methods have to be called in the order presented above - UseRouting before UseEndpoints - otherwise routing won't be registered properly.
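
A minimal sketch of that registration (with a placeholder endpoint added just to illustrate the order) could look like this:

public class Startup
{
    public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
    {
        // routing has to be registered before the endpoints that use it
        app.UseRouting();

        // endpoints are defined afterwards
        app.UseEndpoints(endpoints =>
        {
            endpoints.MapGet("/", async context =>
                await context.Response.WriteAsync("Hello routing!"));
        });
    }
}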

Route templates

Templates add flexibility to the supported URL definitions.

The simplest option is a static URL, where the template is just a fixed path (a short endpoint sketch follows the list), eg:

  • /Reservations/List
  • /GetUsers
  • /Orders/ByStatuses/Closed
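
A minimal, hedged sketch of mapping such static templates with endpoint routing (handlers shortened to plain text responses):

app.UseEndpoints(endpoints =>
{
    // static templates - the request path has to match exactly
    endpoints.MapGet("/Reservations/List", async context =>
        await context.Response.WriteAsync("All reservations"));

    endpoints.MapGet("/Orders/ByStatuses/Closed", async context =>
        await context.Response.WriteAsync("Closed orders"));
});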

Route parameters

Static URLs are fine for list endpoints, but not when we'd like to get a specific record.
To allow dynamic matching (eg. a reservation by id) we need to use parameters. They can be added using the {parameterName} syntax, eg.

  • /Reservations/{id}
  • /users/{id}/orders/{orderId}

They don't have to replace a whole URL segment. You can also do eg.:

  • /Reservations?status={reservationStatus}&user={userId} - this will get parameters from the query string and match eg. /Reservations?status=Open&user=123, ending up with the status parameter equal to Open and userId equal to 123,
  • /Download/{fileName}.{extension} - this will match eg. /Download/testFile.txt and end up with two route data parameters - fileName with the testFile value and extension with txt, respectively (see the sketch after this list),
  • /Configuration/{entityType}Dictionary - this will match /Configuration/OrderStatusDictionary and will have the entityType parameter with the OrderStatus value.
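
As referenced above, a minimal sketch of reading such route values inside an endpoint registered in UseEndpoints (parameter names follow the /Download/{fileName}.{extension} template) could look like this:

endpoints.MapGet("/Download/{fileName}.{extension}", async context =>
{
    // route data parameters are exposed through Request.RouteValues
    var fileName = context.Request.RouteValues["fileName"];
    var extension = context.Request.RouteValues["extension"];

    await context.Response.WriteAsync($"Downloading {fileName}.{extension}");
});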

You can also add catch-all parameters - {**parameterName} - which can be used as a fallback when no other route matches:

  • /Reservations/{id}/{**reservationPath} - this will match eg. /Reservations/123/changeStatus/confirmed and will have reservationPath parameter with changeStatus/confirmed value

It's also possible to make the parameter optional by adding ? after its name:

  • /Reservations/{id?} - this will match both the /Reservations and /Reservations/123 routes

Route constraints

Route template parameters can contain constraints to narrow down the matched results. To use them you need to add the constraint name after the parameter name: {parameter:constraintName}. There are a number of predefined route constraints, eg:

  • /Reservations/{id:guid} - will match eg. /Reservations/632863d2-5cbf-4c9f-92e1-749d264d965e but won't match eg. /Reservations/123,
  • /Reservations/top/{limit:int:min(1):max(10)} - this will only allow integers between 1 and 10 for the limit parameter, so it will allow getting at most the top 10 reservations,
  • /Inbox?from={fromEmailAddress:regex(\\[A-Z0-9._%+-]+@[A-Z0-9.-]+\.[A-Z]{2,4})} - regex can also be used to eg. check an email address or provide a more advanced format check. This will match /[email protected] and will have the fromEmailAddress parameter with the [email protected] value,
  • see more constraint examples in the route constraint documentation.

Note - a failing constraint will result in a 400 - BadRequest status code; however, the messages are generic and not user friendly. So if you'd like to make them more related to your business case, it's suggested to move that check to validation inside the code.
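
A minimal sketch of such in-code validation inside a controller action (using the same reservation id format as the custom constraint sample below) could look like this:

[HttpGet("{id}")]
public IActionResult Get(string id)
{
    // validate the business id format in code
    // to return a user-friendly message instead of the generic 400 response
    if (id.Split("|").Count(part => !string.IsNullOrWhiteSpace(part)) != 3)
    {
        return BadRequest("Reservation id should consist of 3 non-empty parts separated by '|'");
    }

    // (...)
    return Ok();
}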

You can also define your own custom constraint. A sample use case would be validating your business id format.

See the sample below that validates whether a reservation id is built from 3 non-empty parts split by |:

public class ReservationIdConstraint : IRouteConstraint
{
    public bool Match(
        HttpContext httpContext,
        IRouter route,
        string routeKey,
        RouteValueDictionary values,
        RouteDirection routeDirection)
    {
        if (routeKey == null)
        {
            throw new ArgumentNullException(nameof(routeKey));
        }

        if (values == null)
        {
            throw new ArgumentNullException(nameof(values));
        }

        if (!values.TryGetValue(routeKey, out var value) || value == null)
        {
            return false;
        }
        
        var reservationId = Convert.ToString(value, CultureInfo.InvariantCulture);
        
        return reservationId.Split("|").Where(part => !string.IsNullOrWhiteSpace(part)).Count() == 3;
    }
}

You need to register it in Startup.ConfigureServices using the AddRouting method:

public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        // registers controllers in dependency injection container
        services.AddControllers();

        services.AddRouting(options =>
        {
            options.ConstraintMap.Add("reservationId", typeof(ReservationIdConstraint));
        });
    }

    // (...)
}

Then you can use it in a route:

  • /Reservations/{id:reservationId} - this will match /Reservations/RES|123|01 (and get the id parameter with the value RES|123|01) but won't match /Reservations/123.

Routing pipeline

Routing is split into the following steps:

  • request URL parsing
  • matching is performed against the registered routes (it's done in parallel, so the order of registration doesn't matter)
  • from the matching routes, all that do not match the route constraints are removed (eg. a route parameter defined as int was not numeric)
  • the single best match (the most concrete one) is selected, if possible, from the remaining routes. If there is still more than one match, an exception is thrown. If there was only a single match but the value did not match the constraint, an exception will also be thrown.

Having eg. the following routes:

  • /Clients/List
  • /Clients/{id}
  • /Reservations/{id:alpha}
  • /Reservations/{id:int}
  • /Reservations/List

and trying to match /Reservations/List, the routing process will find the following matching templates:

  • /Reservations/{id:alpha}
  • /Reservations/{id:int}
  • /Reservations/List

It matched the Reservations part and then both {id} routes (as List could just be a string id) and the concrete List part.

Then constraints will be verified and we'll end up with two routes (as {id:int} does not match because List is not an integer).

  • /Reservations/{id:alpha}
  • /Reservations/List

From this set both are matching, but List is more concrete, so it will be selected.

Accordingly:

  • trying to match Reservations/abcde routing will match /Reservations/{id:alpha} route,
  • trying to match Reservations/123 routing will match /Reservations/{id:int} route.

Routing with endpoints

ASP.NET Core allows defining raw endpoints without the need to use controllers. They can be defined inside the UseEndpoints method by calling the MapGet, MapPost etc. methods:

public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
    }

    public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
    {
        // registers routing in middleware pipeline
        app.UseRouting();
        
        // defines endpoints to be routed
        app.UseEndpoints(endpoints =>
        {
             endpoints.MapGet("/Reservations/{id}", async context =>
             {
                 var id = context.Request.RouteValues["id"];
                 await context.Response.WriteAsync($"Reservation with {id}!");
             });
        });
    }
}

Using endpoints currently requires a lot of bare-bones code. This will change with .NET 5, where endpoints will get a set of useful methods that will make them a first-class citizen. See more in the accepted API review: link.

Routing with controllers

HTTP requests can be mapped to controllers in two ways: conventionally or through attributes.

Conventional controllers routing

Conventional routing is done by calling the MapControllerRoute method inside UseEndpoints. It allows providing a route template (pattern), name and controller action mapping.

public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
    }

    public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
    {
        // registers routing in middleware pipeline
        app.UseRouting();
        
        // defines endpoints to be routed
        app.UseEndpoints(endpoints =>
        {
            // defines concrete routing to single controller action
            endpoints.MapControllerRoute(name: "blog",
                pattern: "Reservations/{id}",
                defaults: new { controller = "Reservations", action = "Get" });
            
            // defines "catch-all" routing that will route all requests
            // matching `/Controller/Action` or `/Controller/Action/id`
            endpoints.MapControllerRoute(name: "default",
                pattern: "{controller=Home}/{action=Index}/{id?}");
        });
    }
}

An important thing to note is that controllers should have the Controller suffix in their name (eg. ReservationsController), but routes should be defined without it (so Reservations).
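
A short sketch of that naming convention (assuming the "blog" route defined above) could look like this:

// the class name has the Controller suffix...
public class ReservationsController : Controller
{
    // ...but the route template references just "Reservations"
    public IActionResult Get(int id)
    {
        // (...)
        return Ok();
    }
}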

Routing with attributes

Controllers are derived from the MVC pattern concept. They are responsible for the orchestration between requests (inputs) and models. Routing can be defined by putting attributes on top of the method and controller definitions.

If you want to use controllers then you should also call AddControllers in ConfigureServices (to register them in the dependency injection container) and MapControllers inside UseEndpoints to map the controllers' route configuration.

public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        // registers controllers in dependency injection container
        services.AddControllers();
    }

    public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
    {
        // registers routing in middleware pipeline
        app.UseRouting();
        
        // defines endpoints to be routed
        app.UseEndpoints(endpoints =>
        {
            // maps controllers routes to endpoints
            endpoints.MapControllers();
        });
    }
}

Route attribute

The most generic attribute is [Route]. It defines the routes that will direct to the method it's marking.

public class ReservationsController : Controller
{
    [Route("")]
    [Route("Reservations")]
    [Route("Reservations/List")]
    [Route("Reservations/List/{status?}")]
    public IActionResult List(string status)
    {
        //(...)
    }

    [Route("Reservations/Summary")]
    [Route("Reservations/Summary/{userId?}")]
    public IActionResult Summary(int? userId)
    {
        // (...)
    }
}

In this example routes:

  • /, /Reservations, /Reservations/List, /Reservations/List/Open will be routed to the List method,
  • /Reservations/Summary, /Reservations/Summary/123 will be routed to the Summary method.

An important note is that you should not use action, area, controller, handler or page as route template variables (eg. /Reservations/{page}). Those names are reserved for the internals of the routing logic. Using them will make routing fail.

HTTP methods attributes

ASP.NET Core also provides more specific attributes - [HttpGet], [HttpPost], [HttpPut], [HttpDelete], [HttpHead], [HttpPatch] - representing HTTP methods. Besides the URL routing, they also perform matching based on the HTTP method. Normally, when using them, you should add a [Route] attribute on the controller that adds a prefix for all the routes defined by the HTTP verb attributes.

A sample of the most common CRUD controller definition:

[Route("api/[controller]")]
[ApiController]
public class ReservationsController : ControllerBase
{
    [HttpGet]
    public IActionResult List([FromQuery] string filter)
    {
        //(...)
    }

    [HttpGet("{id}")]
    public IActionResult Get(int id)
    {
        // (...)
    }
    
    [HttpPost]
    public IActionResult Create([FromBody] CreateReservation request)
    {
        // (...)
    }

    [HttpPut("{id}")]
    public IActionResult Put(int id, [FromBody] UpdateReservation request)
    {
        // (...)
    }

    [HttpDelete("{id}")]
    public IActionResult Delete(int id)
    {
        // (...)
    }
}

Using [Route("api/[controller]")] will define the route based on the controller name - in this case it will be /api/Reservations. By convention WebApi routes usually start with an /api prefix. The prefix is optional and can have a different value. If you'd like, you could also add a suffix, eg. [Route("api/[controller]/open")], if you'd like to have a dedicated controller for open reservations. The benefit of using [controller] is that when you rename the controller the route will also be updated. If you want to avoid accidental route name changes then you should use a concrete route, eg. [Route("api/reservations")].

Having that:

  • GET /api/Reservations will be routed to the List method. The value for the filter parameter, because of the [FromQuery] attribute, will be mapped from the request query string. For GET /api/Reservations?filter=open it will have the open value; for the default route GET /api/Reservations it will be null,
  • GET /api/Reservations/123 will be routed to the Get method. The value of the id parameter will be taken, by convention, from the route parameter,
  • POST /api/Reservations will be routed to the Create method. The value for the request parameter, because of the [FromBody] attribute, will be mapped from the request body (so eg. JSON sent from the client),
  • PUT /api/Reservations/123 will be routed to the Put method,
  • DELETE /api/Reservations/123 will be routed to the Delete method.

It's not mandatory to use a route prefix. Most of the time it's useful, but when you have nesting inside the API it's worth setting the routes up manually, eg.

[ApiController]
public class UserReservationsController : ControllerBase
{
    [HttpGet("api/users/{userId}/reservations")]
    public IActionResult List(int userId, [FromQuery] string filter)
    {
        //(...)
    }

    [HttpGet("api/users/{userId}/reservations/{id}")]
    public IActionResult Get(int userId, int id)
    {
        // (...)
    }
    
    [HttpPost("api/users/{userId}/reservations/{id}")]
    public IActionResult Create(int userId, [FromBody] CreateReservation request)
    {
        // (...)
    }

    [HttpPut("api/users/{userId}/reservations/{id}/status")]
    public IActionResult Put(int userId, int id, [FromBody] UpdateReservationStatus request)
    {
        // (...)
    }
}

Links

REST

Let's go back in time. In 2000 Roy Fielding wrote a doctoral dissertation titled "Architectural Styles and the Design of Network-based Software Architectures". This dissertation gave rise to "REpresentational State Transfer" - REST. Roy created REST as an architectural style based on the principles that make the Internet so successful. The World Wide Web runs on HTTP, which has a number of conventions that provide the basis for scalability, fault tolerance, and loose coupling. REST and HTTP are not the same thing, but REST fully embraces HTTP. It means that it uses verbs, status codes, headers, and resources identified by URIs in order to fulfill the constraints that together compose the so-called RESTful style. What are those constraints?

The Six Constraints of REST

REST, like any other architectural style, describes constraints that, composed together, define the basis of the RESTful style.

Client-server

This constraint mainly specifies that there's a distinction between a client and a server. This separation allows the components to evolve independently, thus improving portability and scalability.

Stateless

Each request must have all the information necessary for its correct completion. It means that all the state needed for a given web request is contained within the request itself, as a part of the URI, query string parameters, body, or headers. Since there is no session-related dependency, each server can handle any request, thus the API can be easily scaled. Removing all server-side state synchronization logic also makes REST APIs less complex.

Cacheable

The server should label what data within a response to a request can be cached and what cannot. If a response can be cached, then a client cache is given the right to reuse that response data for later, equivalent requests. Following this constraint gives the potential to partially or completely eliminate some interactions, thus improving performance and scalability and also decreasing latency.
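
In ASP.NET Core such labelling can be done, for example, with the [ResponseCache] attribute, which sets the Cache-Control response header (a minimal sketch, assuming a 60-second cache duration):

public class ReservationsController : ControllerBase
{
    [HttpGet("{id}")]
    [ResponseCache(Duration = 60)] // results in Cache-Control: public,max-age=60
    public IActionResult Get(int id)
    {
        // (...)
        return Ok();
    }
}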

Layered System

The client can make a request and the response could come from a web server, a load balancer, a cache, etc. For the client, it doesn't really matter where the data is coming from as long as it gets the requested information. In other words, before the server completes the response, it can perform additional operations that the client does not need to know.

Code on demand

This is the only optional constraint. Most of the time, the server will be sending static representations of resources in the form of XML or JSON, but on demand it can send additional code (eg. JavaScript) that can be executed on the client side. This simplifies clients by reducing the number of features required to be pre-implemented.

Uniform interface

The server should provide an API that will be well understood by all applications communicating with it. By designing one interface, we should respond to the needs of all applications that use it. In order to obtain such a uniform interface, four additional constraints must be met.

Identification of resources

On the basis of a single request, the server can identify the resource it concerns. For that purpose the Uniform Resource Identifier - URI - is most often used. It distinguishes a resource from any other, and through it interaction with that resource takes place. In the example below we have an address pointing to a specific employee with id 123. This address is the URI, which is the identifier, and the returned employee is the resource.

GET http://example.org/employees/123
200 OK
{
  "employeeId": 123,
  "firstName": "John",
  "lastName": "Doe"
}

Manipulation of resources through representations

The server can return a response in various formats (media types) like HTML, XML, JSON etc. That format is the representation of the identified resource that the client can understand and manipulate. It is possible for the client to request a specific representation that fits its needs. This is accomplished via the Accept header.

GET http://example.org/employees/123
Accept: application/xml
200 OK
<?xml version="1.0" encoding="UTF-8"?>
<employee>
    <employeeId type="integer">123</employeeId>
    <firstName>John</firstName>
    <lastName>Doe</lastName>
</employee>

Clients are also allowed to indicate the representation of the data they send to the server. This is accomplished via the Content-Type header. The server response should not be affected by the chosen format.

POST http://example.org/employees
Content-type: application/json
{
  "firstName": "John",
  "lastName": "Doe"
}
201 Created
Location: http://example.org/employees/123

Self-descriptive messages

A message, which is a request or a response, is considered self-descriptive when it contains all the information necessary to complete the task. In other words, it should contain all the information that the recipient needs to understand it. Below is an example of a self-descriptive message. It contains information about the protocol, the host, which type of action needs to be performed (HTTP method), and the desired resource representation to be returned (Accept header). Such a message will be well understood by the server.

GET /employees/123 HTTP/1.1
Host: example.org
Accept: application/json

The server can respond accordingly. That message is also self-descriptive. It tells the client that the operation was successful by returning the appropriate status code. It also tells how to interpret the message body by specifying the Content-Type header.

HTTP/1.1 200 OK
Content-Type: application/json
{
  "employeeId": 123,
  "firstName": "John",
  "lastName": "Doe"
}

Hypermedia as the engine of application state (HATEOAS)

Together, the first three uniform interface constraints imply the fourth. It can be summarised as: sending self-descriptive messages to uniquely identified resources, using representations, changes the state of the application. This constraint allows comparing a RESTful API to a website. As a website is a collection of links leading to subsequent subpages, HATEOAS says that the same can be done with an API. Also think of it as a situation in an office when you want to start a new business. You can't just go there and "POST" a new company. You must submit an application for creating a new company, and then you will receive an answer like "Thank you for submitting an application. Here are the next possible steps that you can perform: cancellation of the application, address change, financing".

POST http://example.org/companies
{
  "name": "NewOne",
  "address": "Example 5",
  "owner": {
    "firstName": "John",
    "lastName": "Doe"
  }
}
HTTP/1.1 201 Created
{
  "companyId": 1234,
  "name": "NewOne",
  "address": "Example 5",
  "owner": {
    "firstName": "John",
    "lastName": "Doe"
  },
  "_links":{
    "self":{
      "href": "http://example.org/companies/1234",
      "method": "GET"
    },
    "cancellation":{
      "href": "http://example.org/companies/1234",
      "method": "DELETE"
    }
  }
}

API Versioning

Filters

Middleware

Inversion of Control and Dependency Injection

API Testing

Projects structure

Logging

General

Log Levels

By default in .NET Core there are six log levels plus None (available through the LogLevel enum):

  • Trace (value 0) - the most detailed and verbose information about the application flow,
  • Debug (1) - useful information during the development process (eg. local environment bug investigation),
  • Information (2) - important information about the application flow that can be useful for diagnostics,
  • Warning (3) - a potentially unexpected application event or an error that's not blocking the flow (eg. the operation was successfully saved to the database but the notification failed, or a transient error occurred but succeeded after a retry),
  • Error (4) - an unexpected application error - eg. no record found to update, database timeout, argument exception, etc.,
  • Critical (5) - critical events that require immediate action, like an application or system crash, running out of disk space, or a database in an irrecoverable state,
  • None (6) - means no logs at all; usually used in the configuration to disable logging for a selected category.

It's important to keep in mind that Trace and Debug should not be used in production and should be used only for development/debugging purposes (Trace is disabled by default). Because of their characteristics, to be effective they may contain sensitive application information (eg. system secrets, PII/GDPR data). Because of that, we need to be sure that they are disabled in the production environment, as keeping them on may end up in a security leak. As they're also verbose, keeping them on a production system may significantly increase the cost of log storage. Plus, too many logs make them noisy and hard to read.
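
A short sketch of how those levels look in code (assuming an ILogger instance called logger and a few illustrative variables; the method names map one-to-one to the enum values):

logger.LogTrace("Entering {Method} with {Request}", nameof(Create), request);
logger.LogDebug("Resolved pricing strategy {Strategy}", strategy.GetType().Name);
logger.LogInformation("Created reservation with {ReservationId}", reservationId);
logger.LogWarning("Notification for {ReservationId} failed, retrying", reservationId);
logger.LogError(exception, "Failed to update reservation {ReservationId}", reservationId);
logger.LogCritical("Cannot connect to the database, shutting down");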

Log Categories

Each logger instance needs to have an assigned category. Categories allow grouping log messages (as the category is added to each log entry). By convention, the category should be passed as the type parameter of ILogger. Usually it's the class into which we're injecting the logger, eg.

[Route("api/Reservations")]
public class ReservationsController: Controller
{
    private readonly ILogger<ReservationsController> logger;

    public ReservationsController(ILogger<ReservationsController> logger)
    {
        this.logger = logger;
    }

    [HttpPost]
    public async Task<IActionResult> Create([FromBody] CreateReservationRequest request)
    {
        var reservationId = Guid.NewGuid();

        // (...) 

        logger.LogInformation("Created reservation with {ReservationId}", reservationId);
        
        
        return Created("api/Reservations", reservationId);
    }

}

A log category created with the type parameter will contain the full type name (so eg. LoggingSamples.Controllers.ReservationsController).

It's also possible (however not recommended) to define it through the ILoggerFactory.CreateLogger(string categoryName) method:

[Route("api/Reservations")]
public class ReservationsController: Controller
{
    private readonly ILogger logger;

    public ReservationsController(ILoggerFactory loggerFactory)
    {
        this.logger = loggerFactory.CreateLogger("LoggingSamples.Controllers.ReservationsController");
    }
}

Categories are useful for searching through logs and diagnosing issues. As mentioned in the previous section, it's also possible to define different log levels per category in the configuration.

Eg. if you have a default log level Information and you need to investigate issues occurring in a specific controller (eg. ReservationsController) then you can change the log level to Debug for a dedicated category.

{
  "Logging": {
    "LogLevel": {
      "Default": "Information",
      "LoggingSamples.Controllers.ReservationController": "Debug"
    }
  }
}

Then for all categories but LoggingSamples.Controllers.ReservationsController you'll have logs for Information and above (Information, Warning, Error, Critical), and for LoggingSamples.Controllers.ReservationsController also Debug.

The other example is to disable logs from a selected category - eg.:

  • because you noticed that it is logging some sensitive information and you need to change that quickly,
  • you want to mute some unimportant system logs,
  • you want to make sure that logs from a specific category (eg. LoggingSamples.Controllers.AuthenticationController) won't ever be logged in production.

{
  "Logging": {
    "LogLevel": {
      "Default": "Information",
      "LoggingSamples.Controllers.AuthenticationController": "None"
    }
  }
}

Log Scopes

Besides categories, it's possible to define logging scopes. They allow adding a set of custom information to each log entry.

Scopes are disabled by default - if you'd like to use them then you need to toggle them on in configuration:

{
  "Logging": {
    "IncludeScopes": true,
    "LogLevel": {
      "Default": "Information"
    }
  }
}

Having that, you can use the ILogger.BeginScope method to define one or more logging scopes.

The first potential use case is to always add the entity type and identifier to all logs in the business logic, so there's no need to add them to each entry - eg. the reservation id during its update. You can also create nested scopes.

[HttpPut]
public async Task<IActionResult> Create(Guid id, [FromBody] UpdateReservationRequest request)
{
    using(logger.BeginScope("For {EntityType}", "Reservation")
    {
         using(logger.BeginScope("With {EntityId}", id)
         {    
              logger.LogInformation("Starting reservation update process for {request}", request);
              // (...)
         }
    }
    
    return OK();
}

You can also create scopes in an aspect-oriented way - eg. in middleware, to inject scopes globally.

An example would be injecting as logging scope information from request eg. client IP, user id.

The sample below shows how to inject a CorrelationId into the logger scope.

public class CorrelationIdMiddleware
{
    private readonly RequestDelegate next;
    private readonly ILogger logger;

    public CorrelationIdMiddleware(RequestDelegate next, ILoggerFactory loggerFactory)
    {
        this.next = next;
        logger = loggerFactory.CreateLogger<CorrelationIdMiddleware>();
    }

    public async Task Invoke(HttpContext context /* other scoped dependencies */)
    {
        var correlationID = Guid.NewGuid();

        using (logger.BeginScope($"CorrelationID: {CorrelationID}", correlationID))
        {
            await next(context);
        }
    }
}

Log Events

The other option for grouping logs is log events. They are normally used to group logs by purpose - eg. updating an entity, starting a controller action, not finding an entity, etc. To define them you need to provide a standardized list of int event ids, eg.

public class LogEvents
{
    public const int InvalidRequest = 911;
    public const int ConflictState = 112;
    public const int EntityNotFound = 1000;
}

Sample usage:

[HttpPut]
public IActionResult Update([FromBody] UpdateReservation request)
{
    logger.LogInformation("Initiating reservation creation for {seatId}", request?.SeatId);

    if (request?.SeatId == null || request?.SeatId == Guid.Empty)
    {
        logger.LogWarning(LogEvents.InvalidRequest, "Invalid {SeatId}", request?.SeatId);

        return BadRequest("Invalid SeatId");
    }

    if (request?.ReservationId == null || request?.ReservationId == Guid.Empty)
    {
        logger.LogWarning(LogEvents.InvalidRequest, "Invalid {ReservationId}", request?.ReservationId);

        return BadRequest("Invalid ReservationId");
    }

    // (...)
    return Created("api/Reservations", reservation.Id);
}

Links

Serilog

Links

NLog

Links

Elastic Stack - Kibana, LogStash etc.

Links

CorrelationId

Links

Docker

To set up the Docker configuration you need to create a Dockerfile (usually located in the root project folder).

Docker allows defining the complete build and runtime setup. It also allows multistage builds. Having that, you can use different tools for building the binaries in the first stage. Then, in the next stage, you can just copy the prepared binaries and host them in the final image. Thanks to that, the final Docker image is smaller and more secure, as it doesn't contain eg. source code and build tools.

Microsoft provides Docker images that can be used as a base for the Docker configuration. You can choose from various images, but usually you use either:

  • mcr.microsoft.com/dotnet/core/sdk:3.1 - Debian based,
  • mcr.microsoft.com/dotnet/core/sdk:3.1-alpine - Alpine based, which is trimmed to have only basic tools preinstalled.

It's recommended to start with Alpine, as it's much smaller, and use the regular image if you need more advanced configuration that Alpine lacks. There are also Windows containers, but they're rarely used. For most cases a Linux-based image will be the first option to choose.

Sample DOCKERFILE

See an example DOCKERFILE:

########################################
#  First stage of multistage build
########################################
#  Use build image with label `builder`
########################################
FROM mcr.microsoft.com/dotnet/core/sdk:3.1-alpine AS builder

# Setup working directory for project
WORKDIR /app

# Copy project files
COPY *.csproj ./

# Restore nuget packages
RUN dotnet restore

# Copy project files
COPY . ./

# Build project with Release configuration
# and no restore, as we did it already
RUN dotnet build -c Release --no-restore

## Test project with Release configuration
## and no build, as we did it already
#RUN dotnet test -c Release --no-build


# Publish project to output folder
# and no build, as we did it already
RUN dotnet publish -c Release --no-build -o out

########################################
#  Second stage of multistage build
########################################
#  Use the runtime image as the final one
#    that won't have source code
########################################
FROM mcr.microsoft.com/dotnet/core/aspnet:3.1-alpine

# Setup working directory for project
WORKDIR /app

# Copy published in previous stage binaries
# from the `builder` image
COPY --from=builder /app/out .

# Set URL that App will be exposed
ENV ASPNETCORE_URLS="http://*:5000"

# sets entry point command to automatically 
# run application on `docker run`
ENTRYPOINT ["dotnet", "DockerContainerRegistry.dll"]

Debugging application inside Docker

All modern IDEs allow debugging ASP.NET Core applications that run inside local Docker. See links:

Links

Storage

EntityFramework

Dapper

Azure

App Services

Links

Azure ARM Templates

Links

Azure Key Vault

Links

AWS

CI/CD

Azure DevOps Pipelines

Setting up Docker Resources

Azure DevOps has a built-in AzureCLI@1 task that's able to run Azure CLI commands.

To use it, you need to configure an Azure Resource Manager connection. It's possible to do that either with the default service principal or by setting up a custom one with a set of permissions.

To allow new resource group creation you need to add at least the Microsoft.Resources/subscriptions/resourcegroups/write permission at the subscription level. You can do that through the Access Control (IAM) section (Home => Subscriptions => Select subscription => IAM). Then you need to assign a role that has that permission (eg. Contributor - but beware, using it might be dangerous, as it has high-level access permissions; someone with access to Azure DevOps could get access to subscription management). You can also define your own custom role with a minimum set of permissions.

A sample usage would be creating a new resource group and an Azure Container Registry:

parameters:
  vmImageName: 'ubuntu-16.04'
  resourceGroupName: ''
  imageRepository: ''
  subscription: ''

stages:
  - stage: create_azure_group_and_azure_docker_registry
    displayName: Create Azure Group And Azure Docker Registry
    jobs:
      - job: create_azure_group_and_azure_docker_registry
        pool:
          vmImage: ${{ parameters.vmImageName }}
        steps:
          - task: AzureCLI@1
            displayName: Create Resource Group
            inputs:
              azureSubscription: ${{ parameters.subscription }}
              scriptLocation: 'inlineScript'
              inlineScript: az group create --name ${{ parameters.resourceGroupName }} --location northeurope

          - task: AzureCLI@1
            displayName: Create Azure Container Registry
            inputs:
              azureSubscription: ${{ parameters.subscription }}
              scriptLocation: 'inlineScript'
              inlineScript: az acr create --resource-group ${{ parameters.resourceGroupName }} --name ${{ parameters.imageRepository }} --sku Basic

Sample usage of this template would look like:

variables:
  vmImageName: 'ubuntu-16.04'
  imageRepository: dockercontainerregistrysample
  dockerRegistryServiceConnection: AzureDockerRegistry
  resourceGroupName: WebApiWithNetCore
  subscription: AzureWebApiWithNetCore

stages:
  - template: AzureDevOps/Stages/CreateAzureGroupAndAzureDockerRegistry.yml
    parameters:
      imageRepository: $(imageRepository)
      resourceGroupName: $(resourceGroupName)
      subscription: $(subscription)
      vmImageName: $(vmImageName)

Links:

Building and pushing image to Docker Registry

Template for building and pushing Docker image

Set up the universal template as follows (with eg. the filename BuildAndPublishDocker.yml):

parameters:

  - name: imageRepository

  - name: dockerRegistryServiceConnection

  - name: tag
    type: string
    
  - name: vmImageName
    default: 'ubuntu-16.04'
    
  - name: dockerfilePath
    default: DOCKERFILE

######################################################
#   Stage definition
######################################################
stages:
  - stage: build_and_push_docker_image
    displayName: Build and push Docker image
    jobs:
      - job: Build
        displayName: Build job
        pool:
          vmImage: ${{ parameters.vmImageName }}
        steps:
          - checkout: self
  
          - task: Docker@2
            displayName: Build a Docker image
            inputs:
              command: build
              repository: ${{ parameters.imageRepository }}
              dockerfile: ${{ parameters.dockerfilePath }}
              containerRegistry: ${{ parameters.dockerRegistryServiceConnection }}
              tags: |
                ${{ parameters.tag }}
  
          - task: Docker@2
            displayName: Push a Docker image to container registry
            condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/main'))
            inputs:
              command: push
              repository: ${{ parameters.imageRepository }}
              dockerfile: ${{ parameters.dockerfilePath }}
              containerRegistry: ${{ parameters.dockerRegistryServiceConnection }}
              tags: |
                ${{ parameters.tag }}

Azure Docker Registry

Before running the pipeline, you need to perform the following steps manually using Azure Cloud Shell:

  1. Create Azure Resource Group, eg.:

    az group create --name WebApiWithNETCore --location westus

  2. Create Azure Container Registry, eg.

    az acr create --resource-group WebApiWithNETCore --name dockercontainerregistrysample --sku Basic

  3. Set up a service connection in Azure DevOps. See more in the documentation

Use the defined stage template and define the needed variables, eg.:

variables:
  # image version (tag) variables
  major: 1
  minor: 0
  patch: 0
  build: $[counter(variables['minor'], 0)] #this will reset when we bump patch
  tag: $(major).$(minor).$(patch).$(build)
  vmImageName: 'ubuntu-16.04'
  dockerfilePath: CD/DockerContainerRegistry/DOCKERFILE
  imageRepository: dockercontainerregistrysample
  dockerRegistryServiceConnection: AzureDockerRegistry

stages:
  - template: AzureDevOps/Stages/BuildAndPublishDocker.yml
    parameters:
      imageRepository: $(imageRepository)
      dockerRegistryServiceConnection: $(dockerRegistryServiceConnection)
      tag: $(tag)
      vmImageName: $(vmImageName)
      dockerfilePath: $(dockerfilePath)

See more in the pipeline definition: link.

Links:

Docker Hub

Before running the pipeline, you need to manually:

  1. Create an account and sign in to Docker Hub.
  2. Create a repository (this will be your image name), selecting your Git repository.
  3. Set up a service connection in Azure DevOps. See more in the documentation

Use the defined stage template and define the needed variables, eg.:

variables:
  # image version (tag) variables
  major: 1
  minor: 0
  patch: 0
  build: $[counter(variables['minor'], 0)] #this will reset when we bump patch
  tag: $(major).$(minor).$(patch).$(build)
  vmImageName: 'ubuntu-16.04'
  dockerfilePath: CD/DockerContainerRegistry/DOCKERFILE
  imageRepository: oskardudycz/dockercontainerregistrysample
  dockerRegistryServiceConnection: DockerHubDockerRegistry

stages:
  - template: AzureDevOps/Stages/BuildAndPublishDocker.yml
    parameters:
      imageRepository: $(imageRepository)
      dockerRegistryServiceConnection: $(dockerRegistryServiceConnection)
      tag: $(tag)
      vmImageName: $(vmImageName)
      dockerfilePath: $(dockerfilePath)

Publishing application to App Services

Links

Links

Github Actions

Building and pushing image to Docker Registry

Docker Hub

Before running the pipeline:

  1. Create an account and sign in to Docker Hub.
  2. Go to Account Settings => Security: link and click New Access Token.
  3. Provide the name of your access token, save it and copy the value (you won't be able to see it again, you'll need to regenerate it).
  4. Go to your GitHub secrets settings (Settings => Secrets, url https://github.com/{your_username}/{your_repository_name}/settings/secrets/actions).
  5. Create two secrets (they won't be visible to other users and will be used in the workflow):
  • DOCKERHUB_USERNAME - with the name of your Docker Hub account (do not confuse it with your GitHub account),
  • DOCKERHUB_TOKEN - with the pasted value of the token generated in point 3.

Then add a new file in the .github/workflows repository folder - e.g. build_and_publish_docker_to_docker_hub.yml.

name: Build And Publish Docker To DockerHub

on: [push]

jobs:
  build:
    runs-on: ubuntu-latest

    steps:
      - name: Check Out Repo
        uses: actions/checkout@v1

      - name: Login to DockerHub
        uses: docker/login-action@v1
        with:
          # Use secrets defined in the GitHub repository,
          # based on the token generated in DockerHub
          username: ${{ secrets.DOCKERHUB_USERNAME }}
          password: ${{ secrets.DOCKERHUB_TOKEN }}

      - name: Set up Docker Buildx
        id: buildx
        uses: docker/setup-buildx-action@v1

      - name: Build and push
        id: docker_build
        uses: docker/build-push-action@v2
        with:
          # build image in pull requests
          # publish only if branch is `main`
          push: ${{ github.ref == 'refs/heads/main'}}
          # define which tag the Docker image should be published with
          tags: oskardudycz/webapi_net_core_github_actions:latest
          # path to your project subfolder
          context: ./CD/DockerContainerRegistry
          # path to Dockerfile
          file: ./CD/DockerContainerRegistry/DOCKERFILE

      - name: Image digest
        run: echo ${{ steps.docker_build.outputs.digest }}

Links

Caching

GraphQL

Links

CQRS

OAuth

Links
