European ASP.NET 4.5 Hosting BLOG

BLOG about ASP.NET 4, ASP.NET 4.5 Hosting and Its Technology - Dedicated to European Windows Hosting Customer

European ASP.NET Core Hosting :: What exactly is Entity Framework Core (EF)?

clock July 18, 2023 07:59 by author Peter

The Entity Framework (EF) Core is a lightweight, extendable, open-source, cross-platform version of the well-known Entity Framework data access technology. We'll look at the Code First technique in EF Core using a .NET 7 API in this article. This process begins with the definition of your model classes; EF Core then generates the database and its structure based on these models.

Why do we require EF Core?

Microsoft's EF Core is an open-source lightweight ORM that provides a mechanism to connect with relational databases using programming concepts. Here are some of the reasons why it is advantageous.

  • Object-Relational Mapping: EF Core bridges relational databases and object-oriented programming. It reduces the amount of boilerplate code required to map database tables and columns to object properties; EF Core handles this mapping automatically.
  • Language Integrated Query (LINQ): LINQ is a strong querying language included with .NET. It enables developers to write database queries in C# syntax, and EF Core converts those queries into efficient database queries (a short example follows this list).
  • Testability: EF Core's design allows developers to mock or replace the database context during unit testing. This lets developers build isolated tests that do not rely on a physical database, making testing more efficient and reliable.
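To make the LINQ point concrete, here is a minimal sketch of a query written against the DbSet we define later in this article; EF Core translates it into SQL at runtime (the method name is illustrative).

using System.Collections.Generic;
using System.Linq;

// A minimal sketch: EF Core translates this LINQ query into SQL.
// EmployeeDbContext and Employee are the classes defined later in this article.
public static List<Employee> GetActiveEmployees(EmployeeDbContext dbContext)
{
    return dbContext.Employees
        .Where(e => e.IsActive)       // becomes a SQL WHERE clause
        .OrderBy(e => e.LastName)     // becomes ORDER BY
        .ToList();                    // executes the query and materializes entities
}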
Let's begin
We will create a new project.

After project creation, we will have a structure like this.

Next, we will install the required nuget packages.

Install the following packages,

    Microsoft.EntityFrameworkCore
    Microsoft.EntityFrameworkCore.Design
    Microsoft.EntityFrameworkCore.SqlServer

Now we will create a folder called 'Models' in the solution, and inside it, we will add another folder called Entities which will have entity classes from which our database will be created.

We will create a class "Department.cs" with the following properties.
public class Department
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string Description { get; set; }
    public bool IsActive { get; set; }
    public DateTime CreatedDate { get; set; }
    public DateTime UpdatedDate { get; set; }
}


Now we will create a class called "Employee.cs".
public class Employee
{
    public int Id { get; set; }
    public string FirstName { get; set; }
    public string LastName { get; set; }
    public int DepartmentId { get; set; }

    public bool IsActive { get; set; }
    public DateTime CreatedDate { get; set; }
    public DateTime UpdatedDate { get; set; }
}
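If you want EF Core to treat DepartmentId as a foreign key with a navigation property back to Department, you could optionally add the following line to the Employee class; it is not required for the rest of this walkthrough.

public Department Department { get; set; } // optional navigation property; EF Core infers the relationship from DepartmentId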

Now we will create an "EmployeeDbContext.cs".
public class EmployeeDbContext : DbContext
{
    public EmployeeDbContext(DbContextOptions<EmployeeDbContext> options) : base(options)
    {
    }

    public DbSet<Department> Departments { get; set; }
    public DbSet<Employee> Employees { get; set; }
}

As we can see, this EmployeeDbContext class extends from the DbContext class that comes in the "Microsoft.EntityFrameworkCore" framework, and then we have the DbSets for the Departments and Employees as per the entities we created.

Now we will configure the DB context in the "Program.cs".

We will add the following code.
builder.Services.AddDbContext<EmployeeDbContext>(options =>
{
    options.UseSqlServer("Server=RAVI;Database=EFCoreCodeFirstDB;Trusted_Connection=True;MultipleActiveResultSets=true;TrustServerCertificate=True;");
});
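Hardcoding the connection string is fine for a quick demo; in practice you may prefer to read it from appsettings.json. A minimal sketch, assuming a ConnectionStrings:DefaultConnection entry exists in appsettings.json:

builder.Services.AddDbContext<EmployeeDbContext>(options =>
{
    // assumes appsettings.json contains a ConnectionStrings:DefaultConnection entry
    options.UseSqlServer(builder.Configuration.GetConnectionString("DefaultConnection"));
});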


After that, we will run the migrations commands that will help us maintain the DB state.

Run the following command in the solution directory.

“dotnet ef migrations add InitialCreate”.

After running it, the output will look something like this.

In the solution explorer, you will see the following folder.

Note. If the above command throws an error, please run the following command first.
“dotnet tool install --global dotnet-ef”

After this, we will run the following command.
“dotnet ef database update”

You should see the following output.

After this, you can check SQL Server to see that your database has been created along with your tables.

Now we will add a controller called "DepartmentController.cs".
[Route("api/[controller]")]
[ApiController]
public class DepartmentController: ControllerBase {
  private readonly EmployeeDbContext _dbContext;
  public DepartmentController(EmployeeDbContext dbContext) {
    _dbContext = dbContext;
  }

  [HttpGet]
  public IActionResult Get() {
    var departments = _dbContext.Departments.ToList();
    return Ok(departments);
  }

  [HttpPost]
  public IActionResult Post([FromBody] Department department) {
    _dbContext.Departments.Add(department);
    _dbContext.SaveChanges();
    return Ok();
  }

  [HttpPut]
  public IActionResult Put([FromBody] Department department) {
    var dept = _dbContext.Departments.FirstOrDefault(d => d.Id == department.Id);
    if (dept == null)
      return NotFound();

    dept.Name = department.Name;
    dept.Description = department.Description;
    dept.UpdatedDate = DateTime.Now;
    _dbContext.SaveChanges();
    return Ok(dept);
  }
}


In our DepartmentController.cs, we have three REST endpoints: an HTTP GET that returns the list of all departments, an HTTP POST that saves a department to the DB, and an HTTP PUT that updates a department in the DB.

Similarly, we will have an employee controller as well, which will save, update and get employees.
[Route("api/[controller]")]
[ApiController]
public class EmployeeController: ControllerBase {
  private readonly EmployeeDbContext _dbContext;
  public EmployeeController(EmployeeDbContext dbContext) {
    _dbContext = dbContext;
  }

  [HttpGet]
  public IActionResult Get() {
    var employees = _dbContext.Employees.ToList();
    return Ok(employees);
  }

  [HttpPost]
  public IActionResult Add([FromBody] Employee employee) {
    _dbContext.Employees.Add(employee);
    _dbContext.SaveChanges();
    return Ok();
  }

  [HttpPut]
  public IActionResult Update([FromBody] Employee employee) {
    var emp = _dbContext.Employees.FirstOrDefault(d => d.Id == employee.Id);
    if (emp == null)
      return NotFound();

    emp.FirstName = employee.FirstName;
    emp.LastName = employee.LastName;
    emp.DepartmentId = employee.DepartmentId;
    emp.UpdatedDate = DateTime.Now;
    _dbContext.SaveChanges();
    return Ok(emp);
  }
}

Now let's run our API, and then we will hit the POST endpoint of the department's controller.

Now we will try to get all the departments by hitting the Get endpoint of departments.

Now we will create a new employee using the Post endpoint of the employee controller.


Now we will hit the Get endpoint of the employee controller.

Now we can verify the same values from the database as well.

As we can verify, our values are stored in the database as well.
We have seen how to create a .NET 7 API and use EF Core to interact with the relational database to store and get the data.



European ASP.NET Core Hosting :: Push Notification From C#.Net Using FCM HTTP V1

clock July 4, 2023 09:19 by author Peter

In this article, we will send push notifications to Android devices from C#/.NET using the Firebase Cloud Messaging (FCM) HTTP V1 API.

This article covers using the FCM HTTP V1 API instead of the legacy FCM API.

Legacy FCM API :
https://fcm.googleapis.com/fcm/send

Latest FCM Http V1:
https://fcm.googleapis.com/v1/projects/myproject-b5ae1/messages:send

The latest FCM HTTP V1 API provides better security and support for new client platform versions.
Section 1. Create or log in to a Firebase Console account.
URL: https://console.firebase.google.com/

After creating an account or logging in, click on Add Project.

Section 2. Add Firebase to your Android project
Section 3. Go to Project Settings

Section 4. Enable FCM HTTP V1

Section 5. Create a .Net Web API project; write this code to generate a Bearer token for calling FCM HTTP V1 API.

Code
using System.IO;
using Google.Apis.Auth.OAuth2;
using System.Net.Http.Headers;
using System.Text;
using System.Web;
using System.Web.Configuration;
using System.Net.Http;
using System.Web.Http;

//----------Generating Bearer token for FCM---------------

string fileName = System.Web.Hosting.HostingEnvironment.MapPath("~/***-approval-firebase-adminsdk-gui00-761039f087.json"); //Download from Firebase Console ServiceAccount

string scopes = "https://www.googleapis.com/auth/firebase.messaging";
var bearertoken = ""; // Bearer Token in this variable

using(var stream = new FileStream(fileName, FileMode.Open, FileAccess.Read)) {
  bearertoken = GoogleCredential
    .FromStream(stream) // Loads key file
    .CreateScoped(scopes) // Gathers scopes requested
    .UnderlyingCredential // Gets the credentials
    .GetAccessTokenForRequestAsync().Result; // Gets the Access Token
}


Section 6. Follow this Step to Download the Json file from the Firebase Console Account (Project Settings -> Service Account -> Generate New Private Key) & Place the Json File in your Project directory.

Section 7. Place the downloaded JSON File in your .net Project directory.

Section 8. Install Google.Api.Auth from Nuget Package Manager

Section 9. Use the authorization token generated above via C# code to call the FCM HTTP V1 API, and get the FCM token from the Android device. Test using Postman; a push notification will be sent to the Android device.

Section 10. On successful completion of the above steps, implement further: create these model classes for storing request and response data.

Section 11. Call the FCM HTTP V1 API using the Authorization token generated & get the FCM id / Token from the Android device.

Section 12. Entire Code
#region FCM Auth & Send Notification To Mobile //notify FCM Code
public class Data {
  public string body { get; set; }
  public string title { get; set; }
  public string key_1 { get; set; }
  public string key_2 { get; set; }
}

public class Message {
  public string token { get; set; }
  public Data data { get; set; }
  public Notification notification { get; set; }
}

public class Notification {
  public string title { get; set; }
  public string body { get; set; }
}

public class Root {
  public Message message { get; set; }
}

public void GenerateFCM_Auth_SendNotifcn()
{
  //----------Generating Bearer token for FCM---------------

  string fileName = System.Web.Hosting.HostingEnvironment.MapPath("~/***-approval-firebase-adminsdk-gui00-761039f087.json"); //Download from Firebase Console ServiceAccount

  string scopes = "https://www.googleapis.com/auth/firebase.messaging";
  var bearertoken = ""; // Bearer Token in this variable
  using(var stream = new FileStream(fileName, FileMode.Open, FileAccess.Read))
  {
    bearertoken = GoogleCredential
      .FromStream(stream) // Loads key file
      .CreateScoped(scopes) // Gathers scopes requested
      .UnderlyingCredential // Gets the credentials
      .GetAccessTokenForRequestAsync().Result; // Gets the Access Token

  }

  ///--------Calling FCM-----------------------------

  var clientHandler = new HttpClientHandler();
  var client = new HttpClient(clientHandler);

  client.BaseAddress = new Uri("https://fcm.googleapis.com/v1/projects/***-approval/messages:send"); // FCM HttpV1 API

  client.DefaultRequestHeaders.Accept.Clear();
  client.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));

  //client.DefaultRequestHeaders.Accept.Add("Authorization", "Bearer " + bearertoken);
  client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", bearertoken); // Authorization Token in this variable

  //---------------Assigning Of data To Model --------------

  Root rootObj = new Root();
  rootObj.message = new Message();

  rootObj.message.token = "cEM68BIdTomaE0R2dbaO:APA91bG8XfOjU_GSPNQYCrJ4wzE7VmMPEsyudwtE41VWKzJcoT2f3wbKsKCHwk5s078ZL31mM258-BzdZPRNXAlc_fyzCzj2txLQvQ3u7jggDPHjYIMlHRgspXT0CudfK"; //FCM Token id

  rootObj.message.data = new Data();
  rootObj.message.data.title = "Data Title";
  rootObj.message.data.body = "Data Body";
  rootObj.message.data.key_1 = "Sample Key";
  rootObj.message.data.key_2 = "Sample Key2";
  rootObj.message.notification = new Notification();
  rootObj.message.notification.title = "Notify Title";
  rootObj.message.notification.body = "Notify Body";

  //-------------Convert Model To JSON ----------------------

  var jsonObj = new JavaScriptSerializer().Serialize(rootObj); // JavaScriptSerializer requires the System.Web.Script.Serialization namespace

  //------------------------Calling Of FCM Notify API-------------------

  var data = new StringContent(jsonObj, Encoding.UTF8, "application/json");
  data.Headers.ContentType = new MediaTypeHeaderValue("application/json");

  var response = client.PostAsync("https://fcm.googleapis.com/v1/projects/**-approval/messages:send", data).Result; // Calling The FCM httpv1 API

  //---------- Deserialize Json Response from API ----------------------------------

  var jsonResponse = response.Content.ReadAsStringAsync().Result;
  var responseObj = new JavaScriptSerializer().DeserializeObject(jsonResponse);

}

#endregion



European ASP.NET Core Hosting :: Upload Files using ASP.NET Core API

clock June 26, 2023 10:55 by author Peter

In contemporary web applications, it is frequently necessary to upload files. Whether it involves uploading images, documents, or any other file format, ensuring a seamless user experience when dealing with uploads is crucial. This article will delve into the process of incorporating file upload functionality into an ASP.NET Core Web API.

Setting up the Project

To begin, we will set up a new ASP.NET Core Web API project. We'll also install the necessary NuGet packages required for file handling, such as
    Microsoft.EntityFrameworkCore.SqlServer
    Microsoft.EntityFrameworkCore
    Microsoft.EntityFrameworkCore.Tools


The tools leveraged for this tutorial are:
    Visual Studio Community Edition 16.4.5
    .NET 6.0
    Entity Framework Core
    Web API

The entire source code can be downloaded from GitHub.

Creating the Model and Database Context
In this section, we'll define the model class representing the uploaded files and the corresponding database context for storing the file data. We'll use Entity Framework Core to interact with the database and store the file information as byte arrays.

Create a model class to represent the image entity. In this example, let's call it ImageEntity.cs:
public class ImageEntity
{
    [Key]
    public int Id { get; set; }

    public string FileName { get; set; }

    public byte[] Data { get; set; }
}

Then, create a database context class, AppDbContext.cs, that inherits from DbContext and includes a DbSet for the ImageEntity:
public class AppDbContext : DbContext
{
    public AppDbContext(DbContextOptions<AppDbContext> options)
        : base(options)
    {
    }

    public DbSet<ImageEntity> Images { get; set; }
}


Handling Multiple File Uploads in the Controller
In this segment, we will develop a controller action that receives an IFormFile object to manage file uploads; a multi-file variant is sketched after the single-file example below. We will extract the file data and store it in the database utilizing the database context.

Create a controller, ImagesController.cs, with two actions: one for uploading the file and another for retrieving the file:
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.EntityFrameworkCore;

namespace FileUploadsPOC.Controllers
{
    [Route("api/[controller]")]
    [ApiController]
    public class ImagesController : ControllerBase
    {
        private readonly AppDbContext _dbContext;


        public ImagesController(AppDbContext dbContext)
        {
            _dbContext = dbContext;

        }

        [HttpPost("upload")]
        public async Task<IActionResult> Upload(IFormFile file)
        {
            if (file == null || file.Length == 0)
                return BadRequest("No file uploaded.");

            var imageEntity = new ImageEntity
            {
                FileName = file.FileName
            };

            using (var memoryStream = new MemoryStream())
            {
                await file.CopyToAsync(memoryStream);
                imageEntity.Data = memoryStream.ToArray();
            }

            await _dbContext.Images.AddAsync(imageEntity);
            await _dbContext.SaveChangesAsync();

            return Ok(imageEntity.Id);
        }

        [HttpGet("download/{id}")]
        public async Task<IActionResult> Download(int id)
        {
            var imageEntity = await _dbContext.Images.FirstOrDefaultAsync(image => image.Id == id);

            if (imageEntity == null)
                return NotFound();

            var fileContentResult = new FileContentResult(imageEntity.Data, "application/octet-stream")
            {
                FileDownloadName = imageEntity.FileName
            };

            return fileContentResult;
        }
    }
}
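The action above handles a single IFormFile; the same pattern extends to several files in one request. A minimal sketch of a multi-file variant for the same controller (the route name is illustrative):

[HttpPost("upload-multiple")]
public async Task<IActionResult> UploadMultiple(List<IFormFile> files)
{
    if (files == null || files.Count == 0)
        return BadRequest("No files uploaded.");

    var imageEntities = new List<ImageEntity>();

    foreach (var file in files)
    {
        var imageEntity = new ImageEntity { FileName = file.FileName };

        using (var memoryStream = new MemoryStream())
        {
            await file.CopyToAsync(memoryStream);
            imageEntity.Data = memoryStream.ToArray();
        }

        imageEntities.Add(imageEntity);
    }

    await _dbContext.Images.AddRangeAsync(imageEntities);
    await _dbContext.SaveChangesAsync();

    return Ok(imageEntities.Select(e => e.Id));
}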

Setup the configuration
Make sure to configure your database connection string in the appsettings.json file:
"ConnectionStrings": {
    "DefaultConnection": "Server=localhost;Database=FileUploadPOC;Trusted_Connection=SSPI;Encrypt=false;TrustServerCertificate=true"
  }

Configure your database connection and add the required services to Program.cs
builder.Services.AddDbContext<AppDbContext>(options =>
            options.UseSqlServer(builder.Configuration.GetConnectionString("DefaultConnection")));

Create the Database Migration and Update Database
Open the Package Manager Console and execute the below commands.

Add-Migration FileUpload (FileUpload is the name of the migration, you can provide any name at your convenience)

Once the build succeeds, execute the below command.
Update-database

Testing the Upload and Download Functionality
To ensure our file upload functionality works as expected, we'll test it using a tool like Postman. I will leave this to you.

Following the steps outlined in this blog post, you can easily enable this feature in your web applications. Happy coding!



European ASP.NET Core Hosting :: Understanding Bitwise Enums in C#

clock June 19, 2023 08:13 by author Peter

What are Bitwise Enums?
A bitwise enum is an enum type that combines multiple enum values into a single value using bitwise operations. Each enum value is assigned its own bit flag, represented by a power of 2. By assigning these flags, we can represent multiple enum values in one variable and use bitwise operators to combine or mask them.
How do we define Bitwise Enums?

To define a bitwise enum, add the [Flags] attribute to indicate that the enum values can be combined using bitwise operations. Here's an example:
[Flags]
enum DaysOfWeek {
  None = 0,
  Monday = 1,
  Tuesday = 2,
  Wednesday = 4,
  Thursday = 8,
  Friday = 16,
  Saturday = 32,
  Sunday = 64
}

In the above code, we define a DaysOfWeek enum where each day of the week is assigned a unique power of 2. The None value is assigned 0, indicating no days are selected.

Combining Bitwise Enums

To combine multiple enum values, we use the bitwise OR (|) operator. Here's an example,

DaysOfWeek selectedDays = DaysOfWeek.Monday | DaysOfWeek.Wednesday | DaysOfWeek.Friday;

The above code combines the Monday, Wednesday, and Friday enum values using the bitwise OR operator.

Checking for Enum Values

To check if a specific enum value is set in a bitwise enum, we use the bitwise AND (&) operator. Here's an example:

Console.WriteLine("selected Days", selectedDays);

if ((selectedDays & DaysOfWeek.Monday) != 0) {
  Console.WriteLine("Monday is selected.");
}

selectedDays ^= DaysOfWeek.Wednesday;

In the above code, we check if the Monday value is set in the selectedDays variable using the bitwise AND operator.
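The framework also provides Enum.HasFlag, which expresses the same check without writing the bitwise AND by hand:

// Equivalent check using Enum.HasFlag (same result, slightly more readable)
if (selectedDays.HasFlag(DaysOfWeek.Monday)) {
  Console.WriteLine("Monday is selected.");
}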

Removing Enum Values
To remove a specific enum value from a bitwise enum, we use the bitwise XOR (^) operator. Here's an example:
selectedDays ^= DaysOfWeek.Wednesday;

Console.WriteLine(selectedDays);


In the above code, we remove the Wednesday value from the selectedDays variable using the bitwise XOR operator. Note that XOR actually toggles the flag: it removes the value if it is currently set but adds it if it is not. To remove a flag unconditionally, combine bitwise AND with the complement, e.g. selectedDays &= ~DaysOfWeek.Wednesday;.

In C#, bitwise enums offer a potent method for efficiently encoding and manipulating flag-based enumerations. By giving enum values distinct bit flags and using bitwise operations, we can combine, check, and remove particular values within a compact representation. Bitwise enums provide convenience and flexibility for scenarios such as days of the week, permissions, or other flag-based settings. With a firm grasp of bitwise enums, you can use this feature in your C# projects to handle challenging flag-based scenarios.



European ASP.NET Core Hosting :: HostedService in .Net Core

clock June 13, 2023 10:22 by author Peter

In .NET Core, a HostedService is a class that represents a background task or service running asynchronously within an application. Its purpose is to start when the application starts and stop when the application stops. HostedService is an integral component of the generic host in .NET Core, enabling functionality such as executing background processes, managing scheduled jobs, and monitoring system resources.

The tools which I have leveraged for this tutorial are below.
    VS 2022 Community Edition (64-bit)
    .Net 7.0
    Console App

The entire source code can be downloaded from GitHub.
Here's an example of creating a HostedService in .NET Core.
using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;

public class MyHostedService : IHostedService, IDisposable
{
    private readonly ILogger<MyHostedService> _logger;
    private Timer _timer;

    public MyHostedService(ILogger<MyHostedService> logger)
    {
        _logger = logger;
    }

    public Task StartAsync(CancellationToken cancellationToken)
    {
        _logger.LogInformation("MyHostedService is starting.");

        _timer = new Timer(DoWork, null, TimeSpan.Zero, TimeSpan.FromSeconds(5));

        return Task.CompletedTask;
    }

    private void DoWork(object state)
    {
        _logger.LogInformation("Doing some work...");
        // Perform your background processing or task here
    }

    public Task StopAsync(CancellationToken cancellationToken)
    {
        _logger.LogInformation("MyHostedService is stopping.");

        _timer?.Change(Timeout.Infinite, 0);

        return Task.CompletedTask;
    }

    public void Dispose()
    {
        _timer?.Dispose();
    }
}

In this example, the MyHostedService class implements the IHostedService interface, which requires the StartAsync and StopAsync methods. The ILogger is injected into the service's constructor for logging purposes.

The StartAsync method is invoked upon application startup. Within this method, you can initialize and commence any background tasks or processing. In this example, a Timer is instantiated to simulate a recurring task occurring every 5 seconds.

The StopAsync method is called when the application is shut down. Its purpose is to gracefully halt any ongoing work and perform resource cleanup. In the provided example, the Timer is stopped.

The Dispose method is implemented to properly release any resources utilized by the service.

To utilize the MyHostedService, it is necessary to register it within the dependency injection container of your application. Here's an illustration of how to configure and execute the HostedService in a console application.
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;

var host = new HostBuilder()
.ConfigureServices((hostContext, services) =>
{
    services.AddHostedService<MyHostedService>();
})
.ConfigureLogging(logging =>
{
    logging.ClearProviders();
    logging.AddConsole();
})
.Build();

await host.RunAsync();


In this scenario, the AddHostedService method is employed to enlist MyHostedService as a hosted service within the dependency injection container.

Upon executing the console application, MyHostedService will initiate and operate in the background until the application is halted.
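.NET Core also ships an abstract BackgroundService base class that implements IHostedService for you, so only ExecuteAsync needs to be overridden. As an alternative sketch of the same timer-style work (not part of the original sample; register it the same way with AddHostedService):

using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;

// A minimal sketch using BackgroundService instead of implementing IHostedService directly
public class MyBackgroundService : BackgroundService
{
    private readonly ILogger<MyBackgroundService> _logger;

    public MyBackgroundService(ILogger<MyBackgroundService> logger)
    {
        _logger = logger;
    }

    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        while (!stoppingToken.IsCancellationRequested)
        {
            _logger.LogInformation("Doing some work...");
            await Task.Delay(TimeSpan.FromSeconds(5), stoppingToken);
        }
    }
}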

Conclusion
Please take note that this example relies on Microsoft.Extensions.Hosting and Microsoft.Extensions.Logging namespaces, which are frequently utilized in .NET Core applications. Ensure you include the necessary NuGet packages (Microsoft.Extensions.Hosting and Microsoft.Extensions.Logging.Console) in your project.

This sample serves as a fundamental illustration of a HostedService in .NET Core. It can be customized and expanded to cater to your specific requirements.



European ASP.NET Core Hosting - HostForLIFE :: Utilizing NLog for logging in ASP.NET Core WebAPI

clock June 5, 2023 10:36 by author Peter

Logging is an integral element of the development process for all applications, including web APIs. It assists developers in monitoring and analyzing the flow of their applications and in identifying and resolving issues. NLog is a prominent logging framework in the ASP.NET Core ecosystem, providing a robust and flexible logging solution. This article examines how to implement logging in an ASP.NET Core WebAPI with NLog.

What exactly is NLog?

NLog is an adaptable and extensible platform for logging in .NET applications. It supports multiple logging destinations, including files, databases, and email. NLog is extremely configurable, enabling developers to tailor logging behavior to their particular needs.

Configuring NLog for ASP.NET Core WebAPI
Follow the steps below to get started with NLog in an ASP.NET Core WebAPI project:

Step 1. Add NLog dependencies
Add the required NLog dependencies to your project to get started. The NuGet Package Manager in Visual Studio can be used to search for and install the following packages:
    NLog
    NLog.Web.AspNetCore

These packages provide the essential logging functionality and ASP.NET Core integration.

Step 2. Configure NLog

Create a nlog.config file in the project's root directory. This file defines NLog's configuration, including log targets and rules. Here is an example of a simple nlog.config file:

<?xml version="1.0" encoding="utf-8"?>
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    autoReload="true" internalLogLevel="Trace" internalLogFile="${basedir}\logs\internallog.log" throwConfigExceptions="true">
    <!-- enable asp.net core layout renderers -->
    <extensions>
        <add assembly="NLog.Extended"/>
    </extensions>

    <targets>
        <!-- File Target for all log messages with basic details -->
        <target xsi:type="File" name="logfile" fileName="${basedir}/logs/nlog-${date:format=yyyy-MM-dd}.log"
            layout="${longdate}|${event-properties:item=EventId_Id:whenEmpty=0}|${level:uppercase=true}|${logger}|${message} ${exception:format=tostring}" />
        <!-- Other targets like console, database, etc. can be added here -->
    </targets>
    <rules>
        <logger name="*" minlevel="Trace" writeTo="logfile" />
        <!-- Other rules can be added here -->
    </rules>
</nlog>

In this example, we define a file target called "logfile" that writes log messages to a file named nlog-YYYY-MM-DD.log within your application's logs subdirectory. You can add more targets such as console, database, or email as per your requirements. The minlevel attribute specifies the minimum log level to be captured, whereas the writeTo attribute specifies the destination(s) to which log messages should be written.

Step 3. Configure NLog within the ASP.NET Core application.
To configure NLog within your ASP.NET Core WebAPI application, open the Program.cs file and make the following modifications:
using NLog.Web;
using NLog.Extensions.Logging;

var logger = NLogBuilder.ConfigureNLog("nlog.config").GetCurrentClassLogger();
try
{
    var builder = WebApplication.CreateBuilder(args);

    // Add services to the container.
    builder.Services.AddControllers();
    // To learn more about configuring Swagger/OpenAPI, visit https://aka.ms/aspnetcore/swashbuckle.
    builder.Services.AddEndpointsApiExplorer();
    builder.Services.AddSwaggerGen();

    // Configure logging
    builder.Host.ConfigureLogging(loggingBuilder =>
    {
        loggingBuilder.ClearProviders();
        loggingBuilder.AddNLog();
    });

    var app = builder.Build();

    // Configure the HTTP request pipeline.
    if (app.Environment.IsDevelopment())
    {
        app.UseSwagger();
        app.UseSwaggerUI();
    }

    app.UseHttpsRedirection();
    app.UseAuthorization();
    app.MapControllers();
    app.Run();
}
catch (Exception exception)
{
    // NLog: catch configuration/startup errors
    logger.Error(exception, "Stopped program because of exception");
    throw;
}
finally
{
    // Ensure internal timers/threads are flushed and terminated prior to application exit (avoid segmentation fault on Linux)
    NLog.LogManager.Shutdown();
}


Step 4. Use the logger in controllers or services
To use NLog for logging, create an instance of the Logger class within your controllers or services via LogManager. For instance:
using NLog;

[ApiController]
[Route("api/[controller]")]
public class SampleController : ControllerBase
{
    private static readonly Logger _logger = LogManager.GetCurrentClassLogger();

    [HttpGet]
    public IActionResult Get()
    {
        _logger.Info("SampleController: Get method called");

        // ...

        return Ok();
    }
}


NLog entries are written to the specified file by default. Depending on the requirements of your application, you can configure additional targets, such as databases, email, and custom targets.
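Because AddNLog registers NLog as a logging provider, you can alternatively inject the framework's ILogger<T> instead of using LogManager directly; entries still flow to the targets defined in nlog.config. A minimal sketch (the controller name here is illustrative):

using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Logging;

[ApiController]
[Route("api/[controller]")]
public class WeatherController : ControllerBase
{
    private readonly ILogger<WeatherController> _logger;

    public WeatherController(ILogger<WeatherController> logger)
    {
        _logger = logger;
    }

    [HttpGet]
    public IActionResult Get()
    {
        _logger.LogInformation("WeatherController: Get method called");
        return Ok();
    }
}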

To view the logs, you can either manually open the log file or use tools such as NLog Viewer, which provides a graphical interface for monitoring and analyzing the logs.

This article demonstrated how to use NLog to implement logging in an ASP.NET Core WebAPI. We installed NLog, configured it to write logs to a file, and implemented logging in WebAPI controllers. NLog simplifies the process of logging and log administration in ASP.NET Core projects, which is an essential aspect of application development.



European ASP.NET Core Hosting - HostForLIFE :: ElasticSearch and .NET Core API for Effective CRUD Operations

clock May 30, 2023 08:56 by author Peter

Description of ElasticSearch
ElasticSearch is a scalable, open-source search and analytics engine developed on top of Apache Lucene. It is designed to manage vast amounts of data and provide distributed search capabilities that are both rapid and distributed. ElasticSearch is frequently used for a variety of purposes, such as full-text search, log analytics, real-time analytics, and data visualization.

Principal Elements of ElasticSearch

  • Distributed Architecture: ElasticSearch is designed to distribute data across multiple nodes, allowing for horizontal scalability and high availability. It employs sharding and replication techniques to distribute and replicate data throughout the cluster.
  • Full-Text Search: ElasticSearch provides robust full-text search capabilities, enabling complex queries across structured and unstructured data. It allows for filtering, faceting, highlighting, and relevance scoring.
  • Real-Time Data: ElasticSearch is designed to handle the ingestion and analysis of real-time data. It enables near-real-time indexing and searching, making it suitable for applications that require current information and prompt responses.
  • Schema-less: ElasticSearch is schema-less, meaning documents can be indexed and searched without a predefined schema. It dynamically maps and indexes data based on its structure, providing flexibility and simplifying the indexing process.
  • RESTful API: ElasticSearch provides a comprehensive RESTful API that enables interaction with the search engine via HTTP requests. This facilitates ElasticSearch's integration with numerous applications and programming languages.
  • Aggregations: ElasticSearch provides robust aggregation capabilities for performing data analytics. Aggregations permit the grouping, filtering, and calculation of data sets, facilitating the extraction of valuable information from indexed documents.
  • Elastic Stack Integration: ElasticSearch integrates seamlessly with other Elastic Stack components, such as Logstash for data ingestion, Kibana for data visualization and dashboarding, and Beats for lightweight data shipping. This ecosystem offers a comprehensive data management and analysis solution.


Enterprise search, e-commerce, content management, log analysis, cybersecurity, and other industries utilize ElasticSearch extensively. Its scalability, adaptability, and robust search capabilities make it a popular option for applications requiring the efficient and rapid retrieval of structured and unstructured data.

Here is an example of conducting CRUD operations for managing products using ElasticSearch and the .NET Core API:

Install NEST (Elasticsearch.Net & NEST) NuGet packages.

Specify Product Model:

using Nest;

public class Product
{
    [Keyword]
    public string Id { get; set; }

    [Text]
    public string Name { get; set; }

    [Number]
    public decimal Price { get; set; }

    // Add additional properties as needed
}

Configure ElasticSearch Connection:

using Nest;

public class ElasticSearchConfig
{
    private readonly string _elasticSearchUrl = "http://localhost:9200";
    private readonly string _indexName = "products"; // Name of your index

    public ElasticClient GetClient()
    {
        var settings = new ConnectionSettings(new Uri(_elasticSearchUrl))
            .DefaultIndex(_indexName);

        return new ElasticClient(settings);
    }
}


Create the Product Service:
using Nest;

public class ProductService
{
    private readonly ElasticClient _elasticClient;

    public ProductService(ElasticClient elasticClient)
    {
        _elasticClient = elasticClient;
    }

    public async Task<bool> AddProduct(Product product)
    {
        var indexResponse = await _elasticClient.IndexDocumentAsync(product);
        return indexResponse.IsValid;
    }

    public async Task<Product> GetProduct(string id)
    {
        var searchResponse = await _elasticClient.GetAsync<Product>(id);
        return searchResponse.Source;
    }

    public async Task<bool> UpdateProduct(Product product)
    {
        var updateResponse = await _elasticClient.UpdateAsync<Product>(product.Id, u => u.Doc(product));
        return updateResponse.IsValid;
    }

    public async Task<bool> DeleteProduct(string id)
    {
        var deleteResponse = await _elasticClient.DeleteAsync<Product>(id);
        return deleteResponse.IsValid;
    }
}
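Since full-text search is ElasticSearch's main strength, you might also add a search method to the ProductService. A sketch using NEST's SearchAsync; the Match query on the Name field and the page size are illustrative choices, not part of the original sample:

public async Task<IReadOnlyCollection<Product>> SearchProductsByName(string term)
{
    // A simple full-text match on the Name field; adjust fields and size as needed
    var searchResponse = await _elasticClient.SearchAsync<Product>(s => s
        .Query(q => q
            .Match(m => m
                .Field(f => f.Name)
                .Query(term)))
        .Size(20));

    return searchResponse.Documents;
}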


Create the Product Controller:
using Microsoft.AspNetCore.Mvc;

[ApiController]
[Route("api/products")]
public class ProductController : ControllerBase
{
    private readonly ProductService _productService;

    public ProductController(ProductService productService)
    {
        _productService = productService;
    }

    [HttpPost]
    public async Task<IActionResult> Create(Product product)
    {
        var success = await _productService.AddProduct(product);
        if (success)
        {
            return Ok();
        }
        return BadRequest("Failed to create product.");
    }

    [HttpGet("{id}")]
    public async Task<IActionResult> Get(string id)
    {
        var product = await _productService.GetProduct(id);
        if (product != null)
        {
            return Ok(product);
        }
        return NotFound();
    }

    [HttpPut]
    public async Task<IActionResult> Update(Product product)
    {
        var success = await _productService.UpdateProduct(product);
        if (success)
        {
            return Ok();
        }
        return BadRequest("Failed to update product.");
    }

    [HttpDelete("{id}")]
    public async Task<IActionResult> Delete(string id)
    {
        var success = await _productService.DeleteProduct(id);
        if (success)
        {
            return Ok();
        }
        return BadRequest("Failed to delete product.");
    }
}

Register Dependencies in Startup.cs:
using Nest;

public class Startup
{
    // ...

    public void ConfigureServices(IServiceCollection services)
    {
        // ...

        // Add ElasticSearch configuration and services
        services.AddSingleton<ElasticSearchConfig>();
        services.AddScoped<ElasticClient>(serviceProvider =>
        {
            var config = serviceProvider.GetRequiredService<ElasticSearchConfig>();
            return config.GetClient();
        });

        // Add ProductService
        services.AddScoped<ProductService>();

        // ...
    }

    // ...
}


With these code examples, you can create a .NET Core API for CRUD operations on products using ElasticSearch as the data store.



European ASP.NET Core Hosting - HostForLIFE :: Encryption and Decryption Using C# in an ASP.NET Windows Console Application

clock May 23, 2023 07:52 by author Peter

I have explained how to encrypt and decrypt text using the AES Encryption standard in this article.
Open Visual Studio and create a new console application.

Provide a project name and choose the location to store the project information, and click next.

Choose the .Net framework based on your project requirement, then click Create.

Once the project has been created, right-click on the project name, choose Add, and click on New Item.
Add the class file to the existing project.

From the C# node, choose the Class template and provide a class file name, for example AesOperation.cs (the class is consumed as AesOperation later in Program.cs).
We have now added the class file to our project solution.

After the class file has been created successfully, we write the following code in it.

Use the below code to encrypt the given text input.
public static string EncryptString(string plainText)
{
    byte[] array;
    using (Aes aes = Aes.Create())
    {
        aes.Padding = PaddingMode.PKCS7;
        aes.KeySize = 256;
        aes.Key = new byte[32];  // demo only: a fixed all-zero key
        aes.IV = new byte[16];   // demo only: a fixed all-zero IV
        ICryptoTransform encryptor = aes.CreateEncryptor(aes.Key, aes.IV);
        using (MemoryStream memoryStream = new MemoryStream())
        {
            using (CryptoStream cryptoStream = new CryptoStream((Stream)memoryStream, encryptor, CryptoStreamMode.Write))
            {
                using (StreamWriter streamWriter = new StreamWriter((Stream)cryptoStream))
                {
                    streamWriter.Write(plainText);
                }
                array = memoryStream.ToArray();
            }
        }
    }
    return Convert.ToBase64String(array);
}

Use the below code to decrypt the given text input.
public static string DecryptString(string cipherText)
{
    byte[] buffer = Convert.FromBase64String(cipherText);
    using (Aes aes = Aes.Create())
    {
        aes.Padding = PaddingMode.PKCS7;
        aes.KeySize = 256;
        aes.Key = new byte[32];
        aes.IV = new byte[16];
        ICryptoTransform decryptor = aes.CreateDecryptor(aes.Key, aes.IV);
        using (MemoryStream memoryStream = new MemoryStream(buffer))
        {
            using (CryptoStream cryptoStream = new CryptoStream((Stream)memoryStream, decryptor, CryptoStreamMode.Read))
            {
                using (StreamReader streamReader = new StreamReader((Stream)cryptoStream))
                {
                    return streamReader.ReadToEnd();
                }
            }
        }
    }
}
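Note that both methods above use a fixed all-zero key and IV, which keeps the demo simple but is not suitable for protecting real data. In practice you would generate the key material randomly and store it securely; a minimal sketch (how you persist the key is up to you):

using (Aes aes = Aes.Create())
{
    aes.KeySize = 256;
    aes.GenerateKey();   // fills aes.Key with cryptographically random bytes
    aes.GenerateIV();    // fills aes.IV with a random initialization vector

    byte[] key = aes.Key;
    byte[] iv = aes.IV;
    // persist key and iv securely; both are required again for decryption
}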


Then open the "Program.cs" file to consume the encryption and decryption method.
using System;
using Encrypt__Decrypt__AES_Operation__file;

namespace EncryptionDecryptionUsingSymmetricKey
{
class Program
{

        public static void Main(string[] args)
        {

            while (true)
            {
                ProcessEncryptDecrypt();
            }
        }


        public static void ProcessEncryptDecrypt()
        {
            int iChoice = 0;
            string strPwd = string.Empty;
            var encryptedString = string.Empty;
            Console.WriteLine("Enter your choice:");
            Console.WriteLine("1.Decryption   2.Encryption  3.Exit ");
            Console.WriteLine("...............");
            iChoice = Convert.ToInt32(Console.ReadLine());

            if (iChoice == 1)
            {
                Console.WriteLine("Enter the Password:");
                strPwd = Convert.ToString(Console.ReadLine());
                encryptedString = AesOperation.EncryptString(strPwd);
                Console.WriteLine($"encrypted string : {encryptedString}");


            }
            else if (iChoice == 2)
            {
                Console.WriteLine("Enter the Password:");
                strPwd = Convert.ToString(Console.ReadLine());
                var decryptedString = AesOperation.DecryptString(strPwd);
                Console.WriteLine($"decrypted string : {decryptedString}");
            }
            else
            {
                Environment.Exit(0);
            }
        }

}
}


After the successful implementation of the above code, run the application; the output looks like the below screenshot.

HostForLIFE ASP.NET Core Hosting

European best, cheap and reliable ASP.NET hosting with instant activation. HostForLIFE.eu is #1 Recommended Windows and ASP.NET hosting in European Continent. With 99.99% Uptime Guaranteed of Relibility, Stability and Performace. HostForLIFE.eu security team is constantly monitoring the entire network for unusual behaviour. We deliver hosting solution including Shared hosting, Cloud hosting, Reseller hosting, Dedicated Servers, and IT as Service for companies of all size.

 



European ASP.NET Core Hosting - HostForLIFE :: Utilisation of FluentValidation in ASP.NET Core

clock May 16, 2023 07:19 by author Peter

In the previous article, I introduced the fundamentals of FluentValidation in .NET Core using a Console Application. In this article, I will introduce some ASP.NET Core-based usages.

Suppose we have a class named Student.
public class Student
{
    public int Id { get; set; }
    public string Name { get; set; }
    public List<string> Hobbies { get; set; }
}

Now, we want to create an API to query students' hobbies.

So, we create a QueryStudentHobbiesDto class to define the request parameters.
public class QueryStudentHobbiesDto
{
    public int? Id { get; set; }
    public string Name { get; set; }
}


And let's create the validator first.
public class QueryStudentHobbiesDtoValidator: AbstractValidator<QueryStudentHobbiesDto>
{
    public QueryStudentHobbiesDtoValidator()
    {
        RuleSet("all", () =>
        {
            RuleFor(x => x.Id).Must(CheckId).WithMessage("id must greater than 0");
            RuleFor(x => x.Name).NotNull().When(x=>!x.Id.HasValue).WithMessage("name could not be null");
        });

        RuleSet("id", () =>
        {
            RuleFor(x => x.Id).NotNull().WithMessage("id could not be null")
                     .GreaterThan(0).WithMessage("id must greater than 0");
        });

        RuleSet("name", () =>
        {
            RuleFor(x => x.Name).NotNull().WithMessage("name could not be null");
        });
    }

    private bool CheckId(int? id)
    {
        return !id.HasValue || id.Value > 0;
    }
}


Let's begin with a familiar way.

Manual validation
You can regard this usage as a copy of the Console App sample that I showed you in the last article.
// GET api/values/hobbies1
[HttpGet("hobbies1")]
public ActionResult GetHobbies1([FromQuery]QueryStudentHobbiesDto dto)
{
    var validator = new QueryStudentHobbiesDtoValidator();
    var results = validator.Validate(dto, ruleSet: "all");

    return !results.IsValid
               ? Ok(new { code = -1, data = new List<string>(), msg = results.Errors.FirstOrDefault().ErrorMessage })
               : Ok(new { code = 0, data = new List<string> { "v1", "v2" }, msg = "" });
}


What we need to do are three steps.
    Create a new instance of the validator.
    Call the Validate method
    Return something based on the result of the Validate method.

After running up this project, we may get the following result.

Most of the time, we create a new instance directly. It is not a very good choice. We may use Dependency Injection in the next section.

Dependency Injection(DI)
There are two ways to use DI here. One is using the IValidator directly, the other one is using a middle layer to handle this, such as the BLL layer.
Use IValidator directly

What we need to do is inject IValidator<QueryStudentHobbiesDto> and call the Validate method to handle.
private readonly IValidator<QueryStudentHobbiesDto> _validator;
public ValuesController(IValidator<QueryStudentHobbiesDto> validator)
{
    this._validator = validator;
}

// GET api/values/hobbies5
[HttpGet("hobbies5")]
public ActionResult GetHobbies5([FromQuery]QueryStudentHobbiesDto dto)
{
    var res = _validator.Validate(dto, ruleSet: "all");

    return !res.IsValid
               ? Ok(new { code = -1, data = new List<string>(), msg = res.Errors.FirstOrDefault().ErrorMessage })
               : Ok(new { code = 0, data = new List<string> { "v1", "v2" }, msg = "" });
}


And don't forget to add the following code in the Startup class.
public void ConfigureServices(IServiceCollection services)
{
    //inject validator
    services.AddSingleton<IValidator<QueryStudentHobbiesDto>, QueryStudentHobbiesDtoValidator>();
}


When we run it up, we will get the same result in a manual way.

Use a middle layer

We also can create a service class to handle the business logic.
public interface IStudentService
{
    (bool flag, string msg) QueryHobbies(QueryStudentHobbiesDto dto);
}

public class StudentService : IStudentService
{
    private readonly AbstractValidator<QueryStudentHobbiesDto> _validator;
    //private readonly IValidator<QueryStudentHobbiesDto> _validator;

    public StudentService(AbstractValidator<QueryStudentHobbiesDto> validator)
    //public StudentService(IValidator<QueryStudentHobbiesDto> validator)
    {
        this._validator = validator;
    }

    public (bool flag, string msg) QueryHobbies(QueryStudentHobbiesDto dto)
    {
        var res = _validator.Validate(dto, ruleSet: "all");

        if(!res.IsValid)
        {
            return (false, res.Errors.FirstOrDefault().ErrorMessage);
        }
        else
        {
            //query ....

            return (true, string.Empty);
        }
    }
}

Go back to the controller.
private readonly IStudentService _service;
public ValuesController(IStudentService service)
{
    this._service = service;
}

// GET api/values/hobbies4
[HttpGet("hobbies4")]
public ActionResult GetHobbies4([FromQuery]QueryStudentHobbiesDto dto)
{
    var (flag, msg) = _service.QueryHobbies(dto);

    return !flag
        ? Ok(new { code = -1, data = new List<string>(), msg })
        : Ok(new { code = 0, data = new List<string> { "v1", "v2" }, msg = "" });
}

There is also an easier way, which plugs into ASP.NET Core's model binding and validation, shown next.

Validator customization

Using the CustomizeValidatorAttribute to configure how the validator will be run.
// GET api/values/hobbies2
[HttpGet("hobbies2")]
public ActionResult GetHobbies2([FromQuery][CustomizeValidator(RuleSet = "all")]QueryStudentHobbiesDto dto)
{
    return Ok(new { code = 0, data = new List<string> { "v1", "v2" }, msg = "" });
}

Let's run it up.

It didn't seem to work!

We should add FluentValidation in the Startup class so that we can enable this feature!
public void ConfigureServices(IServiceCollection services)
{
    //others...

    services.AddMvc().SetCompatibilityVersion(CompatibilityVersion.Version_2_1)
            //when using CustomizeValidator, should add the following code.
            .AddFluentValidation(fv =>
            {
                fv.RegisterValidatorsFromAssemblyContaining<Startup>();
                //fv.RunDefaultMvcValidationAfterFluentValidationExecutes = false;
                //fv.ImplicitlyValidateChildProperties = true;
            });
}


At this time, when we visit api/values/hobbies2, we can see that we do not get the query result but the validation message.

However, we don't want to return this default message format to the users; we want the same response shape as in the previous samples.
How can we format the result?

We could use Filter to deal with the validated result here.
public class ValidateFilterAttribute : ResultFilterAttribute
{
    public override void OnResultExecuting(ResultExecutingContext context)
    {
        base.OnResultExecuting(context);

        //model valid not pass
        if(!context.ModelState.IsValid)
        {
            var entry = context.ModelState.Values.FirstOrDefault();

            var message = entry.Errors.FirstOrDefault().ErrorMessage;

            //modify the result
            context.Result = new OkObjectResult(new
            {
                code = -1,
                data = new JObject(),
                msg= message,
            });
        }
    }
}


And mark the attribute at the action method.
// GET api/values/hobbies3
[HttpGet("hobbies3")]
[ValidateFilter]
public ActionResult GetHobbies3([FromQuery][CustomizeValidator(RuleSet = "all")]QueryStudentHobbiesDto dto)
{
    // if the model isn't valid, this Ok result is never produced; the ValidateFilter rewrites the response instead
    return Ok(new { code = 0, data = new List<string> { "v1", "v2" }, msg = "" });
}


And we will get what we want.

This article introduced three ways to use FluentValidation in ASP.NET Core.

I hope this will help you!




European ASP.NET Core Hosting - HostForLIFE :: Ways To Optimize Performance In ASP.NET Core Applications

clock May 15, 2023 09:17 by author Peter

Distributed cache plays a significant role in optimizing the efficacy of the application by caching data in memory and reducing database calls. Simply put, caching is the act of storing the result of an operation so that subsequent requests deliver results more quickly.

When do we perform caching?

  • When the processing is sluggish.
  • Multiple computations will be performed.
  • When the output is identical for a given input, we know we do not need to recompute it because it will always be the same.
  • When your host charges for database access. By caching the response and decreasing the number of superfluous server requests, you can save money. For instance, the Google APP engine provides a fixed number of reads and writes to the data repository per day; if you exceed this limit, you must pay, even if the website does not receive a great deal of traffic.  

With NCache, there are numerous methods to optimize performance in ASP.NET Core applications.

  • Response Caching
  • Distributed Caching
  • Object Caching
  • Session Caching
  • SignalR

What exactly is NCache?
NCache is an open-source distributed cache for.NET, Java, and Node.js that operates in memory. NCache is incredibly quick and scalable, and it caches application data to minimize database access. NCache is utilized to address the performance issues associated with data storage, databases, and the scalability of.NET, Java, and Node.js applications.

To get started with NCache, kindly follow the installation instructions.

Response Caching
Response caching stores the server response for a request so that subsequent requests can be served from the cache instead of hitting the server. For example, assume we have cascading country and state dropdown lists in our web form: based on the country selected, a request is sent to the server to get the respective state list.

In this case, the country and state data rarely change, so it is unnecessary to send country and state requests to the server every time the web form loads; we can save the response in the cache on the first call and serve the consecutive calls from the cache instead of hitting the server. We can cache any response from an HTTP request, whether it is XML, text, JSON, or HTML. We get complete control over the cached response using the Cache-Control header; for example, cache-control: max-age=45 implies that the server response is valid for 45 seconds.

There are different ways to cache the response.
    In-memory
    Response cache via HTTP Header
    Distributed Caching

1. In-Memory
In this approach, the response content will be cached in-memory on the server side and served for all subsequent calls. We can use built-in <cache> tags to define the section of the views to be cached.
Since the content is cached in-memory, it will be cleared when the application restarts.

2. Response cache via HTTP Header

This approach caches the response content on the browser's end. It can reduce the server hits because the browser cache serves the subsequent calls.

3. Distributed Caching
In this approach, the response cache is distributed across several servers in a web farm that is external to an application. For subsequent calls, any server can be responded to. NCache provides a powerful distributed cache service to implement this approach, improving the application's performance by reducing server hits.
NCache as a response caching middleware

NCache is one of the best response caching middleware for .NET web applications, providing the following benefits as a distributed cache.

  • 100% .NET- It is one of the distributed caches in the market, which is entirely built with .NET.
  • Fast and Scalable- It is very fast since it uses an in-memory distributed cache. We can scale it linearly, so NCache will be a great choice when we experience performance challenges in our ASP.NET Core application during high traffic.
  • High Availability- It supports peer clustering architecture, so there is no data loss when the cache server is down.

Distributed Caching
Distributed Caching is used to overcome the drawbacks of in-memory caching mechanisms regarding reliability and scalability. It will improve the application's performance when cloud services host the app. In the distributed cache, the cache is shared by multiple app servers and maintained as an external service.

It has several advantages,
    It doesn't use local memory
    Reliability -The cached data is available in the different data center
    It is consistent across requests to multiple servers

By default, ASP.NET Core has a distributed caching mechanism that persists the data in memory. The main drawback is the memory limit; the cached data will be cleared during the server restarts. Another approach is using the SQL Server, which has a drawback in performance; it will impact the application's performance when the load is high. So, the best approach is distributed caching with NCache. NCache is one of the distributed caching service providers. Using NCache, we can configure the ASP.NET Core application locally and in Azure as a distributed cache cluster.

The AddNCacheDistributedCache extension method configures NCache as a provider for Distributed Caching for the ASP.NET Core web application.

Download the NuGet package and add the below code to the program.cs file.
builder.Services.AddNCacheDistributedCache(configuration => {
    configuration.CacheName = "democache";
    configuration.EnableLogs = true;
    configuration.ExceptionsEnabled = true;
});
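Once registered, the cache is consumed through the standard IDistributedCache abstraction. A minimal sketch; the service name, cache key, and expiration are illustrative:

using System;
using System.Threading.Tasks;
using Microsoft.Extensions.Caching.Distributed;

public class CountryService
{
    private readonly IDistributedCache _cache;

    public CountryService(IDistributedCache cache)
    {
        _cache = cache;
    }

    public async Task<string> GetCountriesJsonAsync()
    {
        // serve from the distributed cache when possible
        var cached = await _cache.GetStringAsync("countries");
        if (cached != null)
            return cached;

        var json = "[\"Germany\",\"France\"]"; // in a real app, load this from the database
        await _cache.SetStringAsync("countries", json, new DistributedCacheEntryOptions
        {
            AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(30)
        });

        return json;
    }
}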


If there is any update in the record from the database, it will cache the new data.

Session Storage in ASP.NET Core using NCache

HTTP is a stateless protocol, which means information does not persist across multiple requests; we need to use a session storage mechanism for the ASP.NET Core application to persist information across requests. By default, ASP.NET Core uses in-memory storage for the session information; this is fine to some extent, but whenever there is high traffic, this session storage may fail due to some limitations.

Memory Limitation
Since the session information is stored directly in memory, memory space might be a major concern here; if memory usage exceeds the limit, the session storage will fail.

Session Loss
If the application server goes down, session loss occurs.

ASP.NET Core provides two options to perform session caching,
    Session storage provider and  
    IDistributedCache interface

Both of them are fast in managing the session information; however, both are stand-alone caches with a single point of failure and no option for data replication when the server goes down.

So, the perfect solution to overcome this issue is storing the session information in distributed session storage. NCache is an in-memory distributed Cache provider developed natively in .NET and .NET Core; it's fast and scalable.

Configuring the NCache on the default Sessions Storage provider and IDistributed Cache interface is simple. Know more about session caching in NCache here.
Advantages of using NCache

1. Data replication capability

NCache provides data replication capability so that the session information will be available in the different centers for different regions; if any region went down, it would reroute the traffic from one region to another.

2. It also offers session sharing between separate cache clusters.
Configuring NCache as a Session Storage Provider

1. Install the following NuGet Package Manager
a.    For Open source
Install-Package AspNetCore.Session.NCache.OpenSource

b.    For Professional
Install-Package AspNetCore.Session.NCache.Professional

c.    For Enterprise
Install-Package AspNetCore.Session.NCache

2. Configure the session using the AddNCacheSession extension method, and add the below code to the program.cs file.
builder.Services.AddNCacheSession(configuration => {
    configuration.CacheName = "[your clustered cache name]";
    configuration.EnableLogs = true;
    configuration.SessionAppId = "demoAppSession";
    configuration.SessionOptions.IdleTimeout = 5;
    configuration.SessionOptions.CookieName = "AspNetCore.Session";
});


3. Configure the NCache settings in the appsettings.json file
{
  "NCacheSettings": {
    "SessionAppId": "demoAppSession",
    "SessionOptions": {
      "CookieName": "AspNetCore.Session",
      "CookieDomain": null,
      "CookiePath": "/",
      "CookieHttpOnly": "True",
      "IdleTimeout": "5",
      "CookieSecure": "None",
      "useJsonSerialization": true
    },

    "CacheName": "demoClusteredCache",
    "EnableLogs": "True",
    "RequestTimeout": "90"
  }
}

Reference: ASP.NET Core Session Provider Configuration | NCache Docs (alachisoft.com)

Object Caching
Object Caching is one of the techniques to cache the object data to improve the ASP.NET Core application performance.
Assume we have an ASP.NET Core Web API service that returns a list of available countries in the world; in this case, whenever there is a request from the client, the service will fetch the country list from the database and return it to the client. In this case, we know the Country object data will not be frequently updated. So, caching this object in a distributed cache will avoid unnecessary database trips and improve the application's performance.

NCache provides two APIs to achieve object caching,
    NCache API
    IDistributedCache API

1. NCache API
NCache API has an extension method GetCache which is used to connect to an instance of the NCache within the application.
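As a rough sketch of that pattern (the exact namespaces and method names depend on the NCache client version you install, so treat this as illustrative only):

using Alachisoft.NCache.Client;

// Connect to a running cache by name, then store and read an object
ICache cache = CacheManager.GetCache("democache");
cache.Insert("country:1", "Germany");
string country = cache.Get<string>("country:1");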

2. IDistributedCache API
If your application already uses the default distributed cache, the IDistributedCache API of NCache can be easily configured on top of the default one.

We already discussed IDistributedCache in the above DistributedCache section.

What is ASP.NET Core SignalR?

SignalR is used to push content from the server side to clients instantly and manages the connection seamlessly. It is an open-source Microsoft API. It uses remote procedure calls (RPC) to call the client from the server. Hubs establish the core connection between the server and the client and have complete control over the connections.

Use cases for SignalR,
    Real-time monitoring applications
    Chat application
    Scoreboard application
    IoT Device control and so on.  

SignalR Backplane
A backplane is a shared bus or repository. We can configure it for a web farm so that, instead of each web server sending messages directly to its own clients, messages are broadcast through the backplane to all the web servers connected to it.

Main bottlenecks with SignalR Backplane,
    A database used as the SignalR backplane is slow; it cannot scale out as the applied load increases.
    The database can slow down under high load, which may lead to a single point of failure.

Integrating a scalable NCache in-memory distributed cache into our ASP.NET Core SignalR application can overcome all these bottlenecks and improve performance.

Configuring NCache as a backplane for ASP.NET Core SignalR application,

1. Download and install the NCache Nuget Package
dotnet add package AspNetCore.SignalR.NCache --version 5.3.1

2. Configure NCache settings in appsettings.json
"NCacheConfiguration": {
  "CacheName": "myLocalCache",
  "ApplicationID": "chatApplication"
},

3. Using the AddNCache extension method, we can easily configure our application and add the below code in the program.cs.
builder.Services.AddSignalR().AddNCache(ncacheOptions => {
    ncacheOptions.CacheName = builder.Configuration["NCacheConfiguration:CacheName"];
    ncacheOptions.ApplicationID = builder.Configuration["NCacheConfiguration:ApplicationID"];
});

We have seen different techniques for improving the performance of ASP.NET Core applications: response caching, distributed caching, SignalR backplane caching, and session and object caching with NCache. We also covered what each of these is, why it is used, and the advantages of using it with NCache, which is super-fast, linearly scalable, and built on .NET to overcome performance bottlenecks in our ASP.NET Core web applications.




About HostForLIFE

HostForLIFE is European Windows Hosting Provider which focuses on Windows Platform only. We deliver on-demand hosting solutions including Shared hosting, Reseller Hosting, Cloud Hosting, Dedicated Servers, and IT as a Service for companies of all sizes.

We have offered the latest Windows 2019 Hosting, ASP.NET 5 Hosting, ASP.NET MVC 6 Hosting and SQL 2019 Hosting.

