European ASP.NET 4.5 Hosting BLOG

BLOG about ASP.NET 4, ASP.NET 4.5 Hosting and Its Technology - Dedicated to European Windows Hosting Customer

European ASP.NET Core 9.0 Hosting - HostForLIFE :: Explaining IAuthorizationFilter in .NET Core

October 28, 2024 07:45 by author Peter

We will discuss ASP.NET Core IAuthorizationFilter in this article.

What is the Authorization Filter?
The Authorization Filter allows us to apply authorization rules to controllers and actions within our application. Authorization Filters in ASP.NET Core are responsible for checking whether a user is allowed to perform an action or access a resource. These filters run before the action method is executed, ensuring the user has permission to access the method.

Authorization filters are executed after the routing but before model binding and other action filters. If the authorization fails (e.g., the user does not have the required permissions), the filter short-circuits the request, and the action method does not execute.
IAuthorizationFilter

In ASP.NET Core, IAuthorizationFilter is an interface that allows you to perform authorization checks on an action or a controller before the action executes. Implementing this interface gives you the ability to enforce security and authorization rules within your application.

Advantages of IAuthorizationFilter

  • Centralized Authorization Logic: By using filters, you can centralize your authorization logic, making it easier to maintain and manage. You can apply the same filter to multiple controllers or actions.
  • Separation of Concerns: Filters support the separation of concerns in your application. They promote clean architecture by isolating authorization logic from business logic.
  • Flexibility: Since filters can be applied at the action or controller level, you have the flexibility to handle different authorization requirements for different endpoints easily.
  • Access to the Filter Context: The `IAuthorizationFilter` provides access to the `AuthorizationFilterContext`, which contains information about the current request, the action being executed, and route data. This allows for dynamic authorization checks based on the context.
  • Built-in to ASP.NET Core: It is part of the ASP.NET Core framework, which means it is well-integrated with the rest of the framework and benefits from the built-in dependency injection features.
  • Chain of Responsibility: Multiple filters can be applied to a single action. This allows for a chain of responsibility pattern, where different filters can be called in a defined order.

Disadvantages of IAuthorizationFilter

  • Performance Impact: If your authorization logic involves heavy or synchronous operations (like database calls), it can slow down request processing. This is critical if the logic is not optimized or you make external calls during authorization checks.
  • Complexity in Conditional Logic: Complex authorization requirements that depend on various conditions may lead to complicated filter implementations, making it harder to read and maintain.
  • Limited to Action Execution: Authorization filters are only executed before the action method is called. You cannot use them to enforce rules after action execution, such as in the response pipeline.
  • Difficulty in Testing: If your authorization logic is embedded within the filters, unit testing may become more complex since you'll need to mock the entire filter environment.

Example

public class CustomAuthorizationFilter : IAuthorizationFilter
{
    public void OnAuthorization(AuthorizationFilterContext context)
    {
        // Implement your authorization logic here
        var user = context.HttpContext.User;

        if (!user.Identity.IsAuthenticated)
        {
            // User is not authenticated; short-circuit the request.
            // ForbidResult typically produces a 403. For unauthenticated callers, a 401
            // (UnauthorizedResult or ChallengeResult) is often the more appropriate choice.
            context.Result = new ForbidResult();
        }
    }
}


// Register the filter in Startup.cs

public void ConfigureServices(IServiceCollection services)
{
    services.AddControllers(options =>
    {
        options.Filters.Add<CustomAuthorizationFilter>();
    });
}

// .NET 6 and later (Program.cs, minimal hosting model)

builder.Services.AddScoped<CustomAuthorizationFilter>();

[ApiController]
[Route("[controller]")]
[ServiceFilter(typeof(CustomAuthorizationFilter))]
public class HomeController : ControllerBase
{
    [HttpGet]
    public IActionResult GetAuthorized()
    {
        return Ok("You are authorized!");
    }
}
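
For heavy or I/O-bound checks (the performance concern noted above), ASP.NET Core also offers the asynchronous counterpart IAsyncAuthorizationFilter. The sketch below is an illustrative assumption, not part of the original example; it can be registered the same way as the synchronous filter.

public class CustomAsyncAuthorizationFilter : IAsyncAuthorizationFilter
{
    public async Task OnAuthorizationAsync(AuthorizationFilterContext context)
    {
        var user = context.HttpContext.User;

        if (user.Identity == null || !user.Identity.IsAuthenticated)
        {
            // Short-circuit unauthenticated callers; a 401 is usually appropriate here
            context.Result = new UnauthorizedResult();
            return;
        }

        // Placeholder for an awaited permission check (e.g., a database or API lookup)
        await Task.CompletedTask;
    }
}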

 

 

HostForLIFE ASP.NET Core 9.0 Hosting

European best, cheap and reliable ASP.NET hosting with instant activation. HostForLIFE.eu is the #1 recommended Windows and ASP.NET hosting in the European continent, with a 99.99% uptime guarantee of reliability, stability and performance. The HostForLIFE.eu security team is constantly monitoring the entire network for unusual behaviour. We deliver hosting solutions including Shared hosting, Cloud hosting, Reseller hosting, Dedicated Servers, and IT as a Service for companies of all sizes.

 



European ASP.NET Core 9.0 Hosting - HostForLIFE :: Serialization and DeSerialization in C#

October 22, 2024 06:50 by author Peter

Serialization is the process of transforming an item into a format (XML, JSON, SOAP, or binary) that is suitable for easy storage or transmission. The opposite procedure, known as deserialization, transforms the serialized data back into an object.

Types of Serialization Process

  • XML Serialization
  • JSON Serialization
  • Binary Serialization
  • SOAP Serialization

1. XML Serialization
XML serialization in .NET involves the process of transforming an object into an XML format, which facilitates easy storage, transfer, or sharing across different systems. In contrast to binary or SOAP serialization, XML serialization primarily focuses on the public fields and properties of an object, thereby providing a more straightforward and transparent method for data representation.

Features

  • Transform an object into XML format.
  • It is the most widely used form of serialization for interoperability.
  • Public properties and fields undergo serialization, while private ones remain excluded.
  • Frequently employed in web services, configuration files, and for storage that is easily readable by humans.

Step 1. Create a class for XML serialization like the one below.
EmployeeDetails.cs
public class EmployeeDetails
{
    public string Name { get; set; }
    public int Age { get; set; }
    public string Address { get; set; }
    public string Hobbies { get; set; }
    public string EmployeeType { get; set; }
}


Step 2. Serialize the class EmployeeDetails.cs and save it according to the specified path.
public partial class MainWindow : Window
{
    EmployeeDetails employee;

    // Specify the file path where the XML file will be saved
    string filePath = @"C:\Users\peter\Desktop\Peter\SerailizedPath\"; // Change this to your desired location
    public MainWindow()
    {
        InitializeComponent();
        employee = new EmployeeDetails
        {
            Name = "Peter",
            Age = 30,
            Address = "London",
            EmployeeType = "Permanent",
            Hobbies = "Cricket"
        };
    }
    private void XmlSerialization_Click(object sender, RoutedEventArgs e)
    {
        // Create the XmlSerializer for the EmployeeDetails type
        XmlSerializer serializer = new XmlSerializer(typeof(EmployeeDetails));
        // Serialize the object to the specified location
        using (StreamWriter writer = new StreamWriter(Path.Combine(filePath, "Employee.xml")))
        {
            serializer.Serialize(writer, employee);
        }
    }
}
Step 3. DeSerialize the class EmployeeDetails.cs like below.
private void XmlDESerialization_Click(object sender, RoutedEventArgs e)
{
    XmlSerializer serializer = new XmlSerializer(typeof(EmployeeDetails));
    using (StreamReader reader = new StreamReader(Path.Combine(filePath, "Employee.xml")))
    {
        EmployeeDetails employeeDetails = (EmployeeDetails)serializer.Deserialize(reader);
        MessageBox.Show($"Name: {employeeDetails.Name}, Age: {employeeDetails.Age}, Address: {employeeDetails.Address}, EmployeeType: {employeeDetails.EmployeeType}, Hobbies: {employeeDetails.Hobbies}");
    }
}

2. JSON Serialization

In C#, JSON serialization is the process of transforming an object into a JSON string, while deserialization involves converting a JSON string back into an object. The System.Text.Json or Newtonsoft.Json libraries can be utilized for these operations.

Features

  • Transform an object into JSON format.
  • It is commonly utilized for web APIs, configuration files, and efficient data storage.
  • JSON serialization is supported by both the System.Text.Json and Newtonsoft.Json libraries.

Step 1. Create a class for JSON serialization like the one below.
EmployeeDetails.cs
public class EmployeeDetails
{
    public string Name { get; set; }
    public int Age { get; set; }
    public string Address { get; set; }
    public string Hobbies { get; set; }
    public string EmployeeType { get; set; }
}


Step 2. Serialize the EmployeeDetails.cs class into JSON format and save it to the specified location based on your requirements.
public partial class MainWindow : Window
{
    EmployeeDetails employee;

    // Specify the file path where the json file will be saved
    string filePath = @"C:\Users\peter\Desktop\Peter\SerailizedPath\"; // Change this to your desired location

    public MainWindow()
    {
        InitializeComponent();
        employee = new EmployeeDetails
        {
            Name = "Peter",
            Age = 30,
            Address = "London",
            EmployeeType = "Permanent",
            Hobbies = "Cricket"
        };
    }

    private void JsonSerialization_Click(object sender, RoutedEventArgs e)
    {
        string jsonString = JsonConvert.SerializeObject(employee, Formatting.Indented);
        File.WriteAllText(Path.Combine(filePath, "Employee.json"), jsonString);
        Debug.WriteLine("Object serialized to employee.json");
    }

    private void JsonDESerialization_Click(object sender, RoutedEventArgs e)
    {
        string jsonString = File.ReadAllText(Path.Combine(filePath, "Employee.json"));
        EmployeeDetails employee = JsonConvert.DeserializeObject<EmployeeDetails>(jsonString);
        MessageBox.Show($"Name: {employee.Name}, Age: {employee.Age}, Address: {employee.Address}, EmployeeType: {employee.EmployeeType}, Hobbies: {employee.Hobbies}");
    }
}
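
The same round trip can also be done with the built-in System.Text.Json library instead of Newtonsoft.Json. The handler below is a minimal sketch (the method name is an assumption) that reuses the employee object and filePath field from the example above:

private void SystemTextJsonSerialization_Click(object sender, RoutedEventArgs e)
{
    var options = new System.Text.Json.JsonSerializerOptions { WriteIndented = true };

    // Serialize the object to an indented JSON string and write it to disk
    string jsonString = System.Text.Json.JsonSerializer.Serialize(employee, options);
    File.WriteAllText(Path.Combine(filePath, "Employee.json"), jsonString);

    // Deserialize it back into an EmployeeDetails instance
    EmployeeDetails restored = System.Text.Json.JsonSerializer.Deserialize<EmployeeDetails>(
        File.ReadAllText(Path.Combine(filePath, "Employee.json")));
    MessageBox.Show($"Name: {restored.Name}, Age: {restored.Age}");
}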

3. Binary Serialization
Binary serialization refers to the method of transforming an object into a binary format, enabling it to be saved in a file, transmitted across a network, or exchanged between applications. After the object has been serialized into binary, it can subsequently be deserialized, restoring it to its original state. Note that BinaryFormatter is marked obsolete in recent .NET versions and is disabled by default in many project types because it is insecure; the example below is for learning purposes only.

Features

  • Transform an object into a binary representation.
  • Optimized for size, though not easily interpretable by humans.
  • It necessitates that the object is annotated with the [Serializable] attribute.
  • Frequently employed for saving objects to storage or for communication between applications.

Step 1. Create a class for Binary serialization like the one below.
EmployeeDetails.cs
[Serializable]
public class EmployeeDetails
{
    public string Name { get; set; }
    public int Age { get; set; }
    public string Address { get; set; }
    public string Hobbies { get; set; }
    public string EmployeeType { get; set; }
}


Step 2. Serialize the EmployeeDetails.cs class in binary format and save it to the specified location based on your requirements.
public partial class MainWindow : Window
{
    EmployeeDetails employee;

    // Specify the file path where the Binary file will be saved
    string filePath = @"C:\Users\Peter\Desktop\Peter\SerailizedPath\";  // Change this to your desired location

    public MainWindow()
    {
        InitializeComponent();
        employee = new EmployeeDetails
        {
            Name = "Peter",
            Age = 30,
            Address = "London",
            EmployeeType = "Permanent",
            Hobbies = "Cricket"
        };
    }

    [Obsolete]
    private void BinarySerialization_Click(object sender, RoutedEventArgs e)
    {
        BinaryFormatter formatter = new BinaryFormatter();

        using (FileStream stream = new FileStream(Path.Combine(filePath, "Employee.dat"), FileMode.Create))
        {
            formatter.Serialize(stream, employee);
        }

        Debug.WriteLine("Object serialized to employee.dat");
    }
}


Step 3. DeSerialize the class EmployeeDetails.cs like below.

[Obsolete]
private void BinaryDESerialization_Click(object sender, RoutedEventArgs e)
{
    BinaryFormatter formatter = new BinaryFormatter();
    using (FileStream stream = new FileStream(Path.Combine(filePath, "employee.dat"), FileMode.Open))
    {
        EmployeeDetails person = (EmployeeDetails)formatter.Deserialize(stream);
        MessageBox.Show($"Name: {person.Name}, Age: {person.Age}, Address: {person.Address}, EmployeeType: {person.EmployeeType}, Hobbies: {person.Hobbies}");
    }
}

4. SOAP Serialization

It refers to the method of transforming an object into the XML format defined by the Simple Object Access Protocol (SOAP), facilitating its transmission across a network, particularly within web services. While SOAP was widely utilized in earlier web service architectures, its prevalence has diminished in recent years, largely due to the increasing adoption of RESTful APIs and JSON serialization.

Features

  • Utilizes the SOAP (Simple Object Access Protocol) format.
  • Traditionally employed in web services.

Step 1. Add the “SoapFormatter” NuGet package as shown below.

Step 2. Create a class for SOAP serialization like the one below. Note that SoapFormatter, like BinaryFormatter, requires the class to be marked with the [Serializable] attribute.

EmployeeDetails.cs

[Serializable]
public class EmployeeDetails
{
    public string Name { get; set; }
    public int Age { get; set; }
    public string Address { get; set; }
    public string Hobbies { get; set; }
    public string EmployeeType { get; set; }
}


Step 3. Serialize the EmployeeDetails.cs class into Soap format and save it to the specified location based on your requirements.
public partial class MainWindow : Window
{
    EmployeeDetails employee;
    // Specify the file path where the Soap file will be saved
    string filePath = @"C:\Users\peter\Desktop\Peter\SerailizedPath\"; // Change this to your desired location
    public MainWindow()
    {
        InitializeComponent();
        employee = new EmployeeDetails
        {
            Name = "Peter",
            Age = 30,
            Address = "London",
            EmployeeType = "Permanent",
            Hobbies = "Cricket"
        };
    }
    private void SoapSerialization_Click(object sender, RoutedEventArgs e)
    {
        SoapFormatter formatter = new SoapFormatter();
        using (FileStream stream = new FileStream(Path.Combine(filePath, "Employee.soap"), FileMode.Create))
        {
            formatter.Serialize(stream, employee);
        }
        Debug.WriteLine("Object serialized to employee.soap");
    }
}

Step 4. DeSerialize the class EmployeeDetails.cs like below.
private void SoapDESerialization_Click(object sender, RoutedEventArgs e)
{
    SoapFormatter formatter = new SoapFormatter();
    using (FileStream stream = new FileStream(Path.Combine(filePath, "employee.soap"), FileMode.Open))
    {
        EmployeeDetails person = (EmployeeDetails)formatter.Deserialize(stream);
        MessageBox.Show($"Name: {person.Name}, Age: {person.Age}, Address: {person.Address}, EmployeeType: {person.EmployeeType}, Hobbies: {person.Hobbies}");
    }
}


Note. The results of all the aforementioned formats can be observed by executing the application as demonstrated below. Various serialization methods cater to distinct use cases influenced by requirements for performance, interoperability, and readability.

Serialization methods




European ASP.NET Core 9.0 Hosting - HostForLIFE :: Obtain Every SQL Server Instance in C#

October 18, 2024 08:13 by author Peter

We'll be using SQL SMO (SQL Server Management Objects).

First of all, you need to add a reference to the Microsoft.SqlServer.Smo.dll file, which is located at:

  • For 64-bit Windows 7:[Your drive]:\Program Files (x86)\Microsoft SQL Server\100\SDK\Assemblies\Microsoft.SqlServer.Smo.dll.
  • For 32-bit Windows 7:[Your drive]:\Program Files\Microsoft SQL Server\100\SDK\Assemblies\Microsoft.SqlServer.Smo.dll.

Now that we've added it to our project, we can start coding!

Add a ListBox control to your project.
DataTable dataTable = SmoApplication.EnumAvailableSqlServers(true);
listBox1.ValueMember = "Name";
listBox1.DataSource = dataTable;


By running this code alone, you'll get all the available SQL Server instances inside your ListBox control.

OK, let's develop it further and see what databases our instances have. To do this, add another ListBox. Then, in listBox1's SelectedIndexChanged event, check which server is selected; for that, you first need to create a Server object.

After this, you'll iterate through this server to populate all the databases in the newly created Listbox.

Here is the code to do that.
listBox2.Items.Clear();
if (listBox1.SelectedIndex != -1) {
    string serverName = listBox1.SelectedValue.ToString();
    Server server = new Server(serverName);
    try {
        foreach (Database database in server.Databases) {
            listBox2.Items.Add(database.Name);
        }
    } catch (Exception ex) {
        string exception = ex.Message;
    }
}
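
Both snippets above assume the SMO namespaces are imported; a minimal set of using directives would be:

using System.Data;                        // DataTable
using Microsoft.SqlServer.Management.Smo; // SmoApplication, Server, Database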

After we run the project we'll be getting our Databases.

Hope it helps!




European ASP.NET Core 9.0 Hosting - HostForLIFE :: ASP.NET Core: Customizing HTTP Headers with Middleware

October 14, 2024 08:04 by author Peter

In this post, we'll use custom middleware to alter the HttpResponse. After developing the custom middleware, we will modify the response headers; the middleware will alter the HttpResponse every time a response is returned to the client.

public class CustomResponseMiddleWare
{
    private readonly RequestDelegate _next;
    public CustomResponseMiddleWare(RequestDelegate next)
    {
        _next = next;
    }
    public async Task InvokeAsync(HttpContext context)
    {
        context.Response.Headers["CustomHeader"] = "This middleware added by Peter";
        await _next(context);
    }
}

We have created custom middleware and added a new header in the InvokeAsync method.

Program.cs
using SampleHttpResponseModify;

var builder = WebApplication.CreateBuilder(args);
// Add services to the container
builder.Services.AddControllers();
builder.Services.AddEndpointsApiExplorer();
builder.Services.AddSwaggerGen();
var app = builder.Build();
// Use custom middleware
app.UseMiddleware<CustomResponseMiddleWare>();
// Configure the HTTP request pipeline
if (app.Environment.IsDevelopment())
{
    app.UseSwagger();
    app.UseSwaggerUI();
}
app.UseHttpsRedirection();
app.UseAuthorization();
app.MapControllers();
app.Run();


Register the middleware in Program.cs.
app.UseMiddleware<CustomResponseMiddleWare>();

When we run the application and inspect the response, we can see the custom header that has been added. Now let us try to modify the HTTP headers only when a particular API is called.
Say we want to add an extra header only when the weather API is called; then we have to add a condition in the middleware, like below.
public class CustomResponseMiddleWare
{
    private readonly RequestDelegate _next;
    public CustomResponseMiddleWare(RequestDelegate next)
    {
        _next = next;
    }
    public async Task InvokeAsync(HttpContext context)
    {
        context.Response.Headers["CustomHeader"] = "This middleware added by Peter";
        if (context.Request.Path.Value == "/WeatherForecast")
        {
            context.Response.Headers["CustomeHeadernparticularRequest"] = "This is header added when this path found";
        }
        await _next(context);
    }
}

We have added the condition below, so the extra header is added only when this path is found.
if (context.Request.Path.Value == "/WeatherForecast")
{
    context.Response.Headers["CustomeHeadernparticularRequest"] = "This is header added when this path found";
}

Response from the WeatherForecast API: we can see the additional header being returned.

Another learning point here is the sequence of middleware in Program.cs. If we move the UseMiddleware call to an incorrect position in the pipeline, we may find that this middleware is not invoked because of the wrong order of middleware registrations.
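
A related caveat: response headers can only be modified before the response starts being sent. If you ever need to set a header after calling _next, queue the change with Response.OnStarting instead. A minimal sketch (the header value is illustrative):

public async Task InvokeAsync(HttpContext context)
{
    // Queue the header mutation so it runs just before the response is sent,
    // which is safe even if later middleware has already started producing the body.
    context.Response.OnStarting(() =>
    {
        context.Response.Headers["CustomHeader"] = "Added via OnStarting";
        return Task.CompletedTask;
    });

    await _next(context);
}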

Conclusion
We have learned how to modify HTTP headers using custom middleware. Keep learning and keep enjoying.




European ASP.NET Core 9.0 Hosting - HostForLIFE :: ​Simple Load Balancer in .NET Core with YARP

October 8, 2024 08:55 by author Peter

Load balancing is essential in distributed systems and web applications to ensure that traffic is efficiently distributed across multiple servers or resources.

Reverse Proxy

A reverse proxy is a server that sits between client devices (e.g., browsers) and the backend servers, forwarding client requests to the appropriate server and then returning the server's response to the client.

Sticky Sessions

Sticky sessions (also known as Session Affinity) in YARP are a feature that ensures requests from a particular client are always routed to the same backend server. This is particularly important for applications that rely on server-side session data, such as when storing user state or authentication information in memory on specific servers.

YARP (Yet Another Reverse Proxy) Setup

  • create a new ASP.NET Core web application
  • Install the Yarp.ReverseProxy NuGet package

Program.cs
var builder = WebApplication.CreateBuilder(args);

builder.Services.AddReverseProxy().LoadFromConfig(builder.Configuration.GetSection("ReverseProxy"));

var app = builder.Build();
app.MapReverseProxy();

app.Run();


appsettings.json
{
  "ReverseProxy": {
    "Routes": {
      "api-route": {
        "ClusterId": "api-cluster",
        "Match": {
          "Path": "{**catch-all}"
        },
        "Transforms": [
          { "PathPattern": "{**catch-all}" }
        ]
      }
    },
    "Clusters": {
      "api-cluster": {
        "SessionAffinity": {
          "Enabled": "true",
          "AffinityKeyName": "Key1",
          "Cookie": {
            "Domain": "localhost",
            "Expiration": "03:00:00",
            "IsEssential": true,
            "MaxAge": "1.00:00:00",
            "SameSite": "Strict",
            "SecurePolicy": "Always"
          }
        },
          "LoadBalancingPolicy": "RoundRobin",
          "Destinations": {
            "destination1": {
              "Address": "https://localhost:7106"
            },
            "destination2": {
              "Address": "https://localhost:7107"
            }
          }
        }
      }
  },
  "Logging": {
    "LogLevel": {
      "Default": "Information",
      "Microsoft.AspNetCore": "Warning"
    }
  },
  "AllowedHosts": "*"
}

Destination Address: The destination address should point to the base URL of your hosted web application.
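
To make the round-robin behaviour easy to observe, each destination can identify itself in its response. The following minimal sketch is an assumption (not part of the original setup); run two instances of it on ports 7106 and 7107 to match the destinations configured above:

var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

// Report which backend instance handled the request
app.MapGet("/", (HttpContext context) =>
    $"Handled by the instance listening on port {context.Connection.LocalPort}");

app.Run();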

Output
Here is the console output of round-robin policy.

Load balancer

Summary
Load balancing helps in improving the scalability, performance, availability, and security of web applications and services.




European ASP.NET Core 9.0 Hosting - HostForLIFE :: ​Implementing CORS in .NET Core 8

October 4, 2024 07:56 by author Peter

By default, browsers prevent web pages from sending requests to a domain other than the one that served them (the same-origin policy). Cross-Origin Resource Sharing, or CORS, is the mechanism that relaxes this restriction in a controlled way. Building safe and useful apps in modern web development requires knowing how to properly implement CORS, especially when utilizing Angular as a front-end framework and .NET Core 8 as a backend. This article describes the best practices and typical pitfalls for configuring CORS in a .NET Core 8 environment.

Assume you have a friend who lives next door (another website) and a toy box at home (your website). Your mom (the browser) has a rule that states your friend can only play with their own toys. Your friend wants to play with your toys (data) in your toy box. This is to ensure that everything is secure and safe. Assume that your friend's website is funfriend.com and that yours is mycooltoys.com. CORS checks are performed if funfriend.com requests to borrow a toy from mycooltoys.com.

  • Is funfriend.com allowed to borrow toys from mycooltoys.com?
  • Did they follow the rules about how to ask?
  • If everything is good, then your friend can borrow the toy!

Setting Up CORS in .NET Core 8

  • Install Necessary Packages: If you haven't already, ensure you have the required .NET Core packages installed. In most cases, the default setup will suffice, but you can install any specific CORS libraries if needed.
  • Configure CORS in Startup: In .NET Core 8, the configuration of CORS is typically done in the Program.cs file. Here’s a simple setup.

    var builder = WebApplication.CreateBuilder(args);
    // Add CORS services
    builder.Services.AddCors(options =>
    {
        options.AddPolicy("AllowAngularApp",
            builder => builder.WithOrigins("https://your-angular-app.com")
                              .AllowAnyMethod()
                              .AllowAnyHeader()
                              .AllowCredentials());
    });
    var app = builder.Build();
    // Use CORS policy
    app.UseCors("AllowAngularApp");
    app.MapControllers();
    app.Run();


  • Allowing Specific Origins: For production environments, it’s crucial to specify the exact origin rather than using AllowAnyOrigin(), which is a common pitfall. Limiting allowed origins enhances security.

options.AddPolicy("AllowAngularApp",
    builder => builder.WithOrigins("https://your-angular-app.com")
                      .AllowAnyMethod()
                      .AllowAnyHeader());

  • Handling Preflight Requests: Ensure your server can handle preflight requests. These are OPTIONS requests sent by browsers to check permissions. By enabling CORS and handling these requests, you ensure that your application can respond correctly.
  • Allow Credentials: If your Angular application needs to send cookies or HTTP authentication information, you need to set AllowCredentials() in your CORS policy. Be cautious with this feature, as it requires that the origin is explicitly specified and cannot be set to AllowAnyOrigin().
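
Besides applying a policy globally with UseCors, a named policy can also be applied to individual controllers or actions with the [EnableCors] attribute. A minimal sketch, assuming the "AllowAngularApp" policy registered above and a hypothetical controller:

using Microsoft.AspNetCore.Cors;
using Microsoft.AspNetCore.Mvc;

[ApiController]
[Route("api/[controller]")]
[EnableCors("AllowAngularApp")] // only this controller uses the named policy
public class ProductsController : ControllerBase
{
    [HttpGet]
    public IActionResult Get() => Ok(new[] { "value1", "value2" });
}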

Advanced CORS Configuration

  • Customizing Allowed Methods: You can customize allowed methods if your API uses specific HTTP methods.

    options.AddPolicy("AllowAngularApp",
        builder => builder.WithOrigins("https://your-angular-app.com")
                          .WithMethods("GET", "POST", "PUT", "DELETE")
                          .AllowAnyHeader()
                          .AllowCredentials());


Setting Exposed Headers: If your API returns custom headers that the client needs to access, specify these using WithExposedHeaders.

options.AddPolicy("AllowAngularApp",
    builder => builder.WithOrigins("https://your-angular-app.com")
                      .AllowAnyMethod()
                      .AllowAnyHeader()
                      .WithExposedHeaders("X-Custom-Header")
                      .AllowCredentials());

Logging CORS Requests: For debugging purposes, you can log CORS requests to track any issues that arise. Here’s a simple logging middleware.
app.Use(async (context, next) =>
{
    if (context.Request.Headers.ContainsKey("Origin"))
    {
        var origin = context.Request.Headers["Origin"];
        Console.WriteLine($"CORS request from: {origin}");
    }
    await next();
});

Best Practices

  • Limit Origins: Always specify the exact origins that are permitted to interact with your API. Avoid using wildcards (*) as they expose your API to potential security risks.
  • Use HTTPS: Ensure both your .NET Core backend and Angular frontend are served over HTTPS. This secures data in transit and enhances trustworthiness.
  • Regularly Review CORS Policies: As your application grows and evolves, periodically review your CORS configurations to ensure they align with current security requirements.
  • Test CORS Configurations: Use tools like Postman or browser developer tools to test your CORS setup. Check for errors and ensure your API is returning the expected headers.
  • Document Your API: Clearly document the CORS policies and allowed origins in your API documentation. This helps other developers understand how to interact with your API correctly.

Common Pitfalls

  • Misconfigured Allowed Origins: One of the most frequent mistakes is misconfiguring allowed origins. Double-check the exact URLs, including the protocol (HTTP vs. HTTPS) and any potential trailing slashes.
  • Forgetting to Apply CORS Middleware: Ensure that the UseCors middleware is applied before any endpoints are mapped. Placing it after endpoint mapping can lead to unexpected behaviors.
  • AllowAnyOrigin with AllowCredentials: This combination is not allowed and will cause CORS requests to fail. If you need credentials, specify the exact origins.
  • Not Handling OPTIONS Requests: Ignoring the preflight OPTIONS requests can lead to issues when your API is accessed from different origins. Ensure your server can properly handle these requests.

Example
Default

Allowing Specific Origins

Conclusion
Implementing CORS in .NET Core 8 for Angular applications is crucial for creating a secure and functional web application. By following best practices and being aware of common pitfalls, you can ensure that your CORS setup is both effective and secure. Regularly revisiting your CORS configuration as your application evolves will help maintain security and functionality in the long run.

Happy Coding!




European ASP.NET Core 9.0 Hosting - HostForLIFE :: Using the Shopping Cart Discount Function to Better Understand the Concept of a Rule Engine

September 30, 2024 09:30 by author Peter

Hello everyone, I hope everything is going well for you. We will examine the idea of a rule engine in this post, as well as how NRules (a .NET package) implements it. To make it easier to understand, we will build a new .NET 8 console application that demonstrates a shopping cart discount feature based on the user's membership level (Normal/Silver/Gold/VIP).

What is a Rule Engine?

A software program that carries out one or more business rules in a runtime production setting is called a rule engine. The policies may be derived from corporate guidelines, statutory regulations, or other sources. You may think of a rule engine as an advanced translator of if/then statements. A rule engine's main advantage is that its business rules may be written in a non-programmer's language and can be changed without affecting the application code underneath.

NRules - An Introduction
NRules is an open-source .NET rule engine that supports defining business rules separate from the system logic. It allows for rules to be written in C#, using internal DSL to express rule conditions and actions. NRules is highly extensible, can integrate with any .NET application, and supports dynamic rule compilation.
Shopping Cart Discount Logic Using NRules

In this example, we will create a simple console application in C# that uses NRules to apply different discount percentages based on the user's membership type (Normal, Silver, Gold, VIP).

Step 1. Set Up the Project
First, create a new Console Application in Visual Studio or using the .NET CLI. Then, add the NRules package and NRules.Runtime package via NuGet.

dotnet add package NRules
dotnet add package NRules.Runtime

Step 2. Define the Domain Model
Create classes for Customers and Orders
public class Customer
{
    public string MembershipType { get; set; }
}

public class Order
{
    public Customer Customer { get; set; }
    public decimal TotalAmount { get; set; }
    public decimal DiscountedAmount { get; set; }
}


Step 3. Define the Rules
Create a class MembershipDiscountRule for discount rules. We'll consolidate the rules into a single class for simplicity.
using RuleEngine_ShoppingCartDiscount.Models;
using NRules.Fluent.Dsl;
using System;

namespace RuleEngine_ShoppingCartDiscount
{
    public class MembershipDiscountRule : Rule
    {
        public override void Define()
        {
            Order order = null;

            When()
                .Match<Order>(() => order,
                              o => o.Customer != null,
                              o => o.DiscountedAmount == 0); // Ensure discount is not already applied

            Then()
                .Do(ctx => ApplyDiscount(order))
                .Do(ctx => ctx.Update(order));
        }

        private void ApplyDiscount(Order order)
        {
            var discount = order.Customer.MembershipType switch
            {
                "Normal" => 0.90m, // 10% discount
                "Silver" => 0.80m, // 20% discount
                "Gold" => 0.75m, // 25% discount
                "VIP" => 0.70m, // 30% discount
                _ => 1.00m // No discount
            };

            order.DiscountedAmount = order.TotalAmount * discount;
            Console.WriteLine($"Applied {((1 - discount) * 100)}% discount for {order.Customer.MembershipType} member. Total now: {order.DiscountedAmount}");
        }
    }
}


Step 4. Configure and Run the Rule Engine


In your Main method, set up the rule repository, compile the rules, create a session, and create the list of Customers (Normal, Silver, Gold, VIP).

// See https://aka.ms/new-console-template for more information
// Load rules
using NRules;
using NRules.Fluent;
using RuleEngine_ShoppingCartDiscount;
using RuleEngine_ShoppingCartDiscount.Models;

var repository = new RuleRepository();
repository.Load(x => x.From(typeof(MembershipDiscountRule).Assembly));

// Compile rules
var factory = repository.Compile();

// Create a session
var session = factory.CreateSession();

var customers = new List<Customer>
{
    new Customer { MembershipType = "Normal" },
    new Customer { MembershipType = "Silver" },
    new Customer { MembershipType = "Gold" },
    new Customer { MembershipType = "VIP" }
};

// Create customer and order
foreach (var customer in customers)
{
    var order = new Order { Customer = customer, TotalAmount = 100 };

    // Insert facts into rules engine's memory
    session.Insert(order);

    // Start match/resolve/act cycle
    session.Fire();

    Console.WriteLine($"Final amount to pay: {order.DiscountedAmount}\n");
}

Console.ReadLine();


Step 5. Test the Application

Run the console application, and it will print the discounted price for every membership type.

Code Explanation

  • Rules Definition: A single rule handles all membership types by using a switch expression to determine the discount rate based on the membership type.
  • Rule Engine Setup: Rules are loaded and compiled, and a session is created into which each order is inserted as a fact.
  • Execution: The session.Fire() method triggers the rule engine, which evaluates the inserted facts against the compiled rules and applies the appropriate discount.




European ASP.NET Core 9.0 Hosting - HostForLIFE :: Docker and Kubernetes: Containerizing React JS and.NET Core Applications

September 24, 2024 07:54 by author Peter

This tutorial will use a React JS front end and a .NET Core Web API backend to develop a prototype product application. Additionally, we will use Docker and Kubernetes to containerize both applications.

Example of a Product Application: Web API for Backend (.NET Core)
Step 1. Create a new Web API for Product Management using .NET Core.

Step 2. Install the NuGet package for the EF Core in-memory database provider (Microsoft.EntityFrameworkCore.InMemory).

Step 3. Add the product class inside the entities folder.
namespace ProductManagementAPI.Entities
{
    public class Product
    {
        public int Id { get; set; }
        public string Name { get; set; }
        public decimal Price { get; set; }
    }
}

Step 4. Create an AppDbContext class inside the data folder with an in-memory connection and a DB set property.
using Microsoft.EntityFrameworkCore;
using ProductManagementAPI.Entities;
namespace ProductManagementAPI.Data
{
    public class AppDbContext : DbContext
    {
        public DbSet<Product> Products { get; set; }

        public AppDbContext(DbContextOptions<AppDbContext> options)
            : base(options)
        {
        }
        protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
        {
            // This check prevents configuring the DbContext if options are already provided
            if (!optionsBuilder.IsConfigured)
            {
                // Configure the in-memory database here, if needed
                optionsBuilder.UseInMemoryDatabase("InMemoryDb");
            }
        }
        protected override void OnModelCreating(ModelBuilder modelBuilder)
        {
            // Optionally configure entity mappings here
        }
    }
}


Step 5. Add a product repository inside the repositories folder.
IProductRepository
using ProductManagementAPI.Entities;
namespace ProductManagementAPI.Repositories
{
    public interface IProductRepository
    {
        Task<List<Product>> GetAllProductsAsync();
        Task<Product> GetProductByIdAsync(int id);
        Task AddProductAsync(Product product);
        Task UpdateProductAsync(Product product);
        Task DeleteProductAsync(int id);
    }
}

ProductRepository
using Microsoft.EntityFrameworkCore;
using ProductManagementAPI.Data;
using ProductManagementAPI.Entities;
namespace ProductManagementAPI.Repositories
{
    public class ProductRepository : IProductRepository
    {
        private readonly AppDbContext _context;
        public ProductRepository(AppDbContext context)
        {
            _context = context;
        }
        public async Task<List<Product>> GetAllProductsAsync()
        {
            return await _context.Products.ToListAsync();
        }
        public async Task<Product> GetProductByIdAsync(int id)
        {
            return await _context.Products
                .AsNoTracking()
                .FirstOrDefaultAsync(p => p.Id == id);
        }
        public async Task AddProductAsync(Product product)
        {
            if (product == null)
            {
                throw new ArgumentNullException(nameof(product));
            }

            _context.Products.Add(product);
            await _context.SaveChangesAsync();
        }
        public async Task UpdateProductAsync(Product product)
        {
            if (product == null)
            {
                throw new ArgumentNullException(nameof(product));
            }

            _context.Entry(product).State = EntityState.Modified;
            await _context.SaveChangesAsync();
        }
        public async Task DeleteProductAsync(int id)
        {
            var product = await _context.Products.FindAsync(id);
            if (product == null)
            {
                throw new KeyNotFoundException("Product not found.");
            }
            _context.Products.Remove(product);
            await _context.SaveChangesAsync();
        }

    }
}

Step 6. Create a new product controller with different action methods that we used to perform different operations using our front-end application after invoking the same.
using Microsoft.AspNetCore.Mvc;
using ProductManagementAPI.Entities;
using ProductManagementAPI.Repositories;
namespace ProductManagementAPI.Controllers
{
    [ApiController]
    [Route("api/[controller]")]
    public class ProductsController : ControllerBase
    {
        private readonly IProductRepository _repository;
        public ProductsController(IProductRepository repository)
        {
            _repository = repository;
        }
        [HttpGet]
        public async Task<IActionResult> GetAllProducts()
        {
            var products = await _repository.GetAllProductsAsync();
            return Ok(products); // Returns only the list of products
        }
        [HttpGet("{id}")]
        public async Task<IActionResult> GetProductById(int id)
        {
            var product = await _repository.GetProductByIdAsync(id);
            if (product == null)
            {
                return NotFound();
            }
            return Ok(product); // Returns only the product data
        }
        [HttpPost]
        public async Task<IActionResult> AddProduct([FromBody] Product product)
        {
            if (product == null)
            {
                return BadRequest();
            }
            await _repository.AddProductAsync(product);
            return CreatedAtAction(nameof(GetProductById), new { id = product.Id }, product);
        }
        [HttpPut("{id}")]
        public async Task<IActionResult> UpdateProduct(int id, [FromBody] Product product)
        {
            if (product == null || id != product.Id)
            {
                return BadRequest();
            }
            await _repository.UpdateProductAsync(product);
            return NoContent();
        }
        [HttpDelete("{id}")]
        public async Task<IActionResult> DeleteProduct(int id)
        {
            await _repository.DeleteProductAsync(id);
            return NoContent();
        }
    }

}

Step 7. Register our services inside the service container and configure the middleware.
using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.Options;
using ProductManagementAPI.Data;
using ProductManagementAPI.Repositories;
var builder = WebApplication.CreateBuilder(args);
// Add services to the container.
builder.Services.AddScoped<IProductRepository, ProductRepository>();
builder.Services.AddDbContext<AppDbContext>();
builder.Services.AddCors(options => {
    options.AddPolicy("CORSPolicy", builder => builder.AllowAnyOrigin().AllowAnyMethod().AllowAnyHeader());
});

// Configure in-memory database
builder.Services.AddDbContext<AppDbContext>(options =>
    options.UseInMemoryDatabase("InMemoryDb"));

builder.Services.AddControllers();
// Learn more about configuring Swagger/OpenAPI at https://aka.ms/aspnetcore/swashbuckle
builder.Services.AddEndpointsApiExplorer();
builder.Services.AddSwaggerGen();

var app = builder.Build();

// Configure the HTTP request pipeline.
app.UseCors("CORSPolicy");
app.UseSwagger();
app.UseSwaggerUI();
app.UseHttpsRedirection();
app.UseAuthorization();
app.MapControllers();
app.Run();

Step 8. Finally, run the application and use Swagger UI to execute different API endpoints.

Sample Product Application: Frontend (React JS)

Let’s create a client application using React JS and consume the above API endpoints within it.

Step 1. Create a new React JS application with the help of the following command.
npx create-react-app react-netcore-crud-app

Step 2. Navigate to your project directory.
cd react-netcore-crud-app

Step 3. Install Axios to consume the backend API, and Bootstrap for styling.
npm install axios
npm install bootstrap


Step 4. Add the following components and services.
Product list component
// src/components/ProductList/ProductList.js
import React, { useState, useEffect } from 'react';
import ProductListItem from './ProductListItem';
import productService from '../../services/productService';
const ProductList = () => {
    const [products, setProducts] = useState([]);
    useEffect(() => {
        fetchProducts();
    }, []);
    const fetchProducts = async () => {
        try {
            const productsData = await productService.getAllProducts();
            setProducts(productsData);
        } catch (error) {
            console.error('Error fetching products:', error);
        }
    };
    const handleDelete = async (id) => {
        try {
            await productService.deleteProduct(id);
            fetchProducts(); // Refresh product list
        } catch (error) {
            console.error('Error deleting product:', error);
        }
    };
    const handleEdit = () => {
        fetchProducts(); // Refresh product list after editing
    };
    return (
        <div className="container">
            <h2 className="my-4">Product List</h2>
            <ul className="list-group">
    {Array.isArray(products) && products.length > 0 ? (
        products.map(product => (
            <ProductListItem key={product.id} product={product} onDelete={() => handleDelete(product.id)} onEdit={handleEdit} />
        ))
    ) : (
        <p>No products available</p>
    )}
</ul>
        </div>
    );
};
export default ProductList;


Product list item component

// src/components/ProductList/ProductListItem.js
import React, { useState } from 'react';
import productService from '../../services/productService';
const ProductListItem = ({ product, onDelete, onEdit }) => {
    const [isEditing, setIsEditing] = useState(false);
    const [editedName, setEditedName] = useState(product.name);
    const [editedPrice, setEditedPrice] = useState(product.price);
    const handleEdit = async () => {
        setIsEditing(true);
    };
    const handleSave = async () => {
        const editedProduct = { ...product, name: editedName, price: parseFloat(editedPrice) };
        try {
            await productService.updateProduct(product.id, editedProduct);
            setIsEditing(false);
            onEdit(); // Refresh product list
        } catch (error) {
            console.error('Error updating product:', error);
        }
    };
    const handleCancel = () => {
        setIsEditing(false);
        // Reset edited values
        setEditedName(product.name);
        setEditedPrice(product.price);
    };
    return (
        <li className="list-group-item">
            {isEditing ? (
                <div className="row">
                    <div className="col">
                        <input type="text" className="form-control" value={editedName} onChange={e => setEditedName(e.target.value)} required />
                    </div>
                    <div className="col">
                        <input type="number" className="form-control" value={editedPrice} onChange={e => setEditedPrice(e.target.value)} required />
                    </div>
                    <div className="col-auto">
                        <button className="btn btn-success me-2" onClick={handleSave}>Save</button>
                        <button className="btn btn-secondary" onClick={handleCancel}>Cancel</button>
                    </div>
                </div>
            ) : (
                <div className="d-flex justify-content-between align-items-center">
                    <span>{product.name} - ${product.price}</span>
                    <div>
                        <button className="btn btn-danger me-2" onClick={onDelete}>Delete</button>
                        <button className="btn btn-primary" onClick={handleEdit}>Edit</button>
                    </div>
                </div>
            )}
        </li>
    );
};
export default ProductListItem;


Product service
// src/services/productService.js
import axios from 'axios';
const baseURL = 'https://localhost:7202/api/products';
const productService = {
    getAllProducts: async () => {
        try {
            const response = await axios.get(
              baseURL,
              {
                timeout: 3000,
                headers: {
                  Accept: 'application/json',
                },
              },
            );

            return response.data;
          } catch (err) {
            if (err.code === 'ECONNABORTED') {
              console.log('The request timed out.');
            } else {
              console.log(err);
            }
          }
    },
    addProduct: async (product) => {
        const response = await axios.post(baseURL, product);
        return response.data;
    },
    deleteProduct: async (id) => {
        const response = await axios.delete(`${baseURL}/${id}`);
        return response.data;
    },
    updateProduct: async (id, product) => {
        const response = await axios.put(`${baseURL}/${id}`, product);
        return response.data;
    }
};
export default productService;


App component
// src/App.js
import React, { useState } from 'react';
import ProductList from './components/ProductList/ProductList';
import ProductForm from './components/ProductForm/ProductForm';
function App() {
    const [refresh, setRefresh] = useState(false);
    const handleProductAdded = () => {
        setRefresh(!refresh); // Toggle refresh state to trigger re-render
    };
    return (
        <div>
            <ProductList key={refresh} />
            <ProductForm onProductAdded={handleProductAdded} />
        </div>
    );
}
export default App;


Step 5. Run the application with npm start and perform the different CRUD operations.

Docker Files for Application
Docker file for backend application (.NET Core).
# Use the official .NET Core SDK as a parent image
FROM mcr.microsoft.com/dotnet/sdk:6.0 AS build

WORKDIR /app

# Copy the project file and restore any dependencies (use .csproj for the project name)
COPY *.csproj ./
RUN dotnet restore

# Copy the rest of the application code
COPY . .

# Publish the application
RUN dotnet publish -c Release -o out

# Build the runtime image
FROM mcr.microsoft.com/dotnet/aspnet:6.0 AS runtime

WORKDIR /app
COPY --from=build /app/out ./

# Expose the port your application will run on
EXPOSE 80

# Start the application
ENTRYPOINT ["dotnet", "ProductManagementAPI.dll"]


  • Line 1-2: Uses the official .NET Core SDK image (mcr.microsoft.com/dotnet/sdk:6.0) as a base.
  • Line 4: Sets the working directory to /app.
  • Line 6-7: Copies the project file(s) (*.csproj) into the container.
  • Line 8: Runs dotnet restore to restore dependencies specified in the project file(s).
  • Line 10-11: Copies the rest of the application code into the container.
  • Line 13-14: Publishes the application in Release configuration (dotnet publish -c Release -o out), outputting to the out directory.
  • Line 16-17: Uses the official .NET Core ASP.NET runtime image (mcr.microsoft.com/dotnet/aspnet:6.0) as a base.
  • Line 19-20: Sets the working directory to /app and Copies the published output from the build stage (from /app/out) into the /app directory of the runtime stage.
  • Line 22-23: Exposes port 80 to allow external access to the application.
  • Line 25-26: Specifies dotnet ProductManagementAPI.dll as the entry point command to start the application.

Docker file for frontend application (React JS).
FROM node:16-alpine
WORKDIR /app
COPY . .
RUN npm install
RUN npm run build
EXPOSE 3000
CMD ["npm", "start"]

  • Line 1: Specifies the base image, Node.js version 16 on Alpine Linux.
  • Line 2: sets /app as the working directory for subsequent commands.
  • Line 3: Copies the contents of your local directory into the container's working directory.
  • Line 4: Installs the project dependencies inside the container.
  • Line 5: Build the production version of your React app.
  • Line 6: This exposes port 3000, which is where the application will run.
  • Line 7: Starts the React application with npm start (the development server).

Next, replace the hard-coded backend URL in the product service with a value read from an environment variable.
// src/services/productService.js
import axios from 'axios';

//const baseURL = 'https://localhost:31912/api/products';
const baseURL = process.env.REACT_APP_API_URL;

const productService = {
    getAllProducts: async () => {
        try {
            const response = await axios.get(
              baseURL,
              {
                timeout: 3000,
                headers: {
                  Accept: 'application/json',
                },
              },
            );

            return response.data;
          } catch (err) {
            if (err.code === 'ECONNABORTED') {
              console.log('The request timed out.');
            } else {
              console.log(err);
            }
          }
    },
    addProduct: async (product) => {
        const response = await axios.post(baseURL, product);
        return response.data;
    },
    deleteProduct: async (id) => {
        const response = await axios.delete(`${baseURL}/${id}`);
        return response.data;
    },
    updateProduct: async (id, product) => {
        const response = await axios.put(`${baseURL}/${id}`, product);
        return response.data;
    }
};

export default productService;

Containerize the front-end and back-end application
Step 1. Build the docker images
docker build -t productbackendapp:latest .

docker build -t productfrontendapp:latest .

Step 2. Create a deployment and service YAML files for the backend application.

deployment.yml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: product-management-api
  labels:
    app: product-management-api
spec:
  replicas: 1
  selector:
    matchLabels:
      app: product-management-api
  template:
    metadata:
      labels:
        app: product-management-api
    spec:
      containers:
      - name: product-management-api
        image: productbackendapp:latest
        imagePullPolicy: Never
        ports:
        - containerPort: 80

service.yml
apiVersion: v1
kind: Service
metadata:
  name: product-management-api-service
  labels:
    app: product-management-api
spec:
  type: NodePort
  ports:
    - protocol: TCP
      port: 80
      targetPort: 80
  selector:
    app: product-management-api
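
Note: the frontend ConfigMap below points the client at http://localhost:30191/api/products, but a NodePort service without an explicit nodePort gets a randomly assigned port in the 30000-32767 range. If you want the port to match that URL, one option (a sketch, not part of the original manifest) is to pin it in the spec above:
spec:
  type: NodePort
  ports:
    - protocol: TCP
      port: 80
      targetPort: 80
      nodePort: 30191   # pinned so it matches the URL in backend-configmap.yml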


Step 3. Create deployment, service, and backend ConfigMap YAML files for the frontend application.

deployment.yml

apiVersion: apps/v1
kind: Deployment
metadata:
  name: react-client-deployment
spec:
  replicas: 1
  selector:
    matchLabels:
      app: react-client
  template:
    metadata:
      labels:
        app: react-client
    spec:
      containers:
      - name: react-client
        image: productfrontendapp:latest
        imagePullPolicy: Never
        ports:
        - containerPort: 3000
        env:
        - name: REACT_APP_API_URL
          valueFrom:
            configMapKeyRef:
              name: backend-config
              key: REACT_APP_API_URL


service.yml
apiVersion: v1
kind: Service
metadata:
  name: react-client-service
spec:
  selector:
    app: react-client
  ports:
    - protocol: TCP
      port: 80
      targetPort: 3000
  type: NodePort


backend-configmap.yml
apiVersion: v1
kind: ConfigMap
metadata:
  name: backend-config
data:
  REACT_APP_API_URL: "http://localhost:30191/api/products"

Step 4. Apply all of the above files one by one using kubectl commands. (Note: make sure a local Kubernetes cluster is running alongside the Docker daemon, since imagePullPolicy: Never expects the locally built images.)
kubectl apply -f deployment.yml
kubectl apply -f service.yml
kubectl apply -f backend-configmap.yml
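
Because the backend and frontend manifests use the same file names, one convenient layout (the backend/ and frontend/ folder names here are an assumption, not something the article defines) is to keep each set in its own folder and apply all five files:
kubectl apply -f backend/deployment.yml
kubectl apply -f backend/service.yml
kubectl apply -f frontend/backend-configmap.yml
kubectl apply -f frontend/deployment.yml
kubectl apply -f frontend/service.yml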


Step 5. Verify that the deployments, services, pods, and ConfigMap are up and running with the help of kubectl commands.
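The standard kubectl read commands cover everything created above:
kubectl get deployments
kubectl get services
kubectl get pods
kubectl get configmaps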

Step 6. Access the backend and frontend product applications through their services.
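Assuming a local cluster such as Docker Desktop's Kubernetes, NodePort services are reachable on localhost. Look up the assigned node ports, then call each application (the <nodeport> placeholders stand for whatever ports kubectl reports, or 30191 for the backend if you pinned it):
kubectl get svc
curl http://localhost:<backend-nodeport>/api/products
# open http://localhost:<frontend-nodeport> in a browser for the React client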

 

Conclusion
In this article, we created a product management backend application using .NET Core, exposing the API endpoints required for CRUD operations. We then built the frontend application using React JS and consumed the backend API from it with the help of Axios. Finally, we containerized both applications with Docker and deployed them to Kubernetes.

HostForLIFE ASP.NET Core 9.0 Hosting

European Best, cheap and reliable ASP.NET hosting with instant activation. HostForLIFE.eu is #1 Recommended Windows and ASP.NET hosting in the European continent. With 99.99% uptime guaranteed for reliability, stability and performance. The HostForLIFE.eu security team is constantly monitoring the entire network for unusual behaviour. We deliver hosting solutions including Shared hosting, Cloud hosting, Reseller hosting, Dedicated Servers, and IT as a Service for companies of all sizes.



European ASP.NET Core 9.0 Hosting - HostForLIFE :: ASP.NET Core IActionFilter Explanation

clock September 20, 2024 07:30 by author Peter

We will discuss ASP.NET Core IActionFilter in this article. Okay, let's get going.

What are Filters in ASP.NET Core?

Filters add cross-cutting concerns such as logging, authentication, authorization, exception handling, and caching.

Filters let us run cross-cutting logic:

  • Before a controller action method handles an HTTP request.
  • After a controller action method has handled the HTTP request.
  • After the response has been prepared, but before it is delivered to the client.

Action Filters in ASP.NET Core
In ASP.NET Core, action filters run both before and after an action method executes. They carry out operations such as logging, modifying the action's arguments, and changing the result.

IActionFilter

IActionFilter is an interface in ASP.NET Core that lets you observe and control the actions running in your application. It provides methods for running custom behavior both before and after an action method is invoked. IActionFilter is an effective tool for cross-cutting concerns, but its use has both benefits and drawbacks.

Advantages of IActionFilter

  • Cross-Cutting Concerns: It allows you to handle cross-cutting concerns like logging, validation, caching, or authentication in a clean and centralized manner instead of scattering those concerns across multiple action methods.
  • Reusable Logic: Filters can be reused across multiple controllers and actions, promoting DRY (Don't Repeat Yourself) principles in your codebase.
  • Separation of Concerns: By using action filters, you can separate concerns between the action logic and the filtering logic, leading to cleaner, more maintainable code.
  • Execution Order Control: Action filters can be ordered using the `Order` property, providing flexibility in the execution sequence of filters, which is essential for scenarios where the order of execution matters.
  • Access to Action Context: Action filters have access to the action context, allowing them to inspect and modify request data easily or affect the result that gets returned.

Disadvantages of IActionFilter

  • Complexity: Introducing filters can add complexity to your application. When multiple filters are used, understanding the execution flow can become difficult, especially with dependencies between filters.
  • Performance Overhead: Each action filter adds a slight overhead to the execution of requests, which can accumulate, particularly in high-throughput applications. It's essential to ensure that filters are efficient and necessary.
  • Tight Coupling: If not used judiciously, filters may lead to tight coupling between your API's action methods and the filters, making it harder to test and maintain individual components.
  • Limited Scope: Filters operate on action methods, meaning they cannot be applied globally to things like middleware or other request pipelines. If you need a broader application of logic, you might need a different approach.
  • Difficulty in Testing: While filters can be reusable, they can also complicate unit tests. Mocking dependencies or asserting functionality across multiple layers can become challenging.

Example
TimeActionFilter logs the execution time of action methods. The OnActionExecuting method captures the start time, while OnActionExecuted calculates the elapsed time and logs it.
// Create TimeActionFilter
using System.Diagnostics;
using Microsoft.AspNetCore.Mvc.Filters;

public class TimeActionFilter : IActionFilter
{
    private Stopwatch stopwatch;

    public void OnActionExecuting(ActionExecutingContext filterContext)
    {
        // Runs before the action method: start timing
        stopwatch = Stopwatch.StartNew();
        Debug.WriteLine("Stopwatch Started");
    }

    public void OnActionExecuted(ActionExecutedContext filterContext)
    {
        // Runs after the action method: stop timing and log the elapsed time
        stopwatch.Stop();
        var elapsedMilliseconds = stopwatch.ElapsedMilliseconds;
        Debug.WriteLine($"Action '{filterContext.ActionDescriptor.DisplayName}' executed in {elapsedMilliseconds} ms");
    }
}
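
Besides the global registration shown next, the same filter can be applied to a single controller or action with the TypeFilter attribute (a sketch; ProductsController is an assumed name, not a class from this article):

[TypeFilter(typeof(TimeActionFilter))]
public class ProductsController : ControllerBase
{
    // Only the actions in this controller are timed
}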

// Register the filter in Startup.cs

public void ConfigureServices(IServiceCollection services)
{
    services.AddControllers(options =>
    {
        options.Filters.Add(typeof(TimeActionFilter));
    });
}
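
If the project uses the minimal hosting model (ASP.NET Core 6 and later, where there is no Startup class), the equivalent registration in Program.cs would look roughly like this sketch:

var builder = WebApplication.CreateBuilder(args);

// Register the filter globally for every controller action
builder.Services.AddControllers(options =>
{
    options.Filters.Add<TimeActionFilter>();
});

var app = builder.Build();
app.MapControllers();
app.Run();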

HostForLIFE ASP.NET Core 9.0 Hosting

European Best, cheap and reliable ASP.NET hosting with instant activation. HostForLIFE.eu is #1 Recommended Windows and ASP.NET hosting in the European continent. With 99.99% uptime guaranteed for reliability, stability and performance. The HostForLIFE.eu security team is constantly monitoring the entire network for unusual behaviour. We deliver hosting solutions including Shared hosting, Cloud hosting, Reseller hosting, Dedicated Servers, and IT as a Service for companies of all sizes.




European ASP.NET Core 9.0 Hosting - HostForLIFE :: Use and Examples of the Response Cache Attribute in .NET Core

clock September 11, 2024 06:52 by author Peter

Response caching is the technique by which a browser or other client stores a server's response so that future requests for the same resource can be served more quickly. It also spares the server from processing and producing the same response over and over again. ASP.NET Core uses the ResponseCache attribute to set the response caching headers. In addition, we can use the Response Caching Middleware to control the caching behavior from the server side. Clients and proxies read the response caching headers to determine whether and how to cache the server's response. These caching headers are defined by the HTTP 1.1 caching specification, which browsers, clients, and proxies are expected to honor.

What Does "caching" Mean?
The process of temporarily storing frequently used data so that it can be rapidly retrieved is known as caching. The application's overall speed will be improved because there won't be as much need to fetch the same data from the database or other storage devices.

ASP.NET Core Caching Types

ASP.NET Core supports multiple caching mechanisms; they are listed below, and a minimal in-memory caching sketch follows the list.
  • In-Memory Caching: The most basic type of caching, in-memory caching works well with a single server. The main memory of the web server houses the data. It is quick and appropriate for data that doesn't require persistence past the web server process' lifetime or require a lot of memory. It works well for storing modest volumes of information.
  • Distributed Caching: Applications that must share data across multiple servers in load-balancing or multi-server environments are best suited for distributed caching. It entails keeping data in an external system, like NCache, SQL Server, Redis, and so forth. Although it requires more work than in-memory caching, large-scale applications must use it to guarantee consistency between requests and sessions.
  • Response Caching: Response Caching is the process of keeping the result of a request-response cycle in the cache in order to serve the resource from the cache in response to subsequent requests rather than having to generate the response from scratch. This method can greatly enhance the performance of a web application, particularly for resources that are costly to produce and don't change frequently.
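
For the first item, a minimal in-memory caching sketch (reusing the EmployeeDto and _employeeService names from the example later in this article; GetByIdCached and the 60-second lifetime are assumptions) could look like this:

// Program.cs (minimal hosting model): register the in-memory cache
builder.Services.AddMemoryCache();

// In a class that receives IMemoryCache (_memoryCache) via constructor injection
public EmployeeDto GetByIdCached(Guid id)
{
    // Cache each employee for 60 seconds under a per-id key
    return _memoryCache.GetOrCreate($"employee:{id}", entry =>
    {
        entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromSeconds(60);
        return _employeeService.GetById(id);
    });
}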

HTTP Based Response Caching
Now let's talk about the different HTTP Cache Directives and how to control the way caching behaves. Cache-control is the primary header parameter that we utilize to specify whether or not a response can be cached. The cache-control header should be respected and adhered to by clients, proxy servers, and browsers when it shows up in the response.

Let's now examine the standard cache-control directives.

  • public: denotes the ability for a cache to hold the response locally on the client or in a shared location.
  • private: denotes that the response may only be stored in a client-side private cache and not in a shared cache.
  • no-cache: instructs a cache not to serve a stored response for any request without first revalidating it with the origin server.
  • no-store: instructs a cache not to store the response at all.

Browsers and clients interpret no-cache and no-store differently, even though they sound and often behave similarly. We will discuss this in more depth as we go through the examples.

A few more headers, in addition to cache control, can regulate the caching behavior.

Pragma: used for backward compatibility with the HTTP 1.0 specification, where it carries the no-cache directive. If we supply the cache-control header, the pragma header is ignored.

Vary: tells caches that a stored response may only be reused if every header field listed in Vary exactly matches the incoming request. If any of those fields change, the server generates a new response.

Illustrations of HTTP Cache Directives
We will now create an ASP.NET Core application to demonstrate how the cache directives work. Let's add a controller action method to an existing ASP.NET Core Web API project.
public record EmployeeDto
{
    public Guid Id { get; init; }
    public string Name { get; init; }
    public EmployeeType Type { get; init; }
    public string Mno { get; init; }
    public decimal Salary { get; init; }
    public DateTime CurrentDate { get; init; } = DateTime.Now;
}

[HttpGet("{id}")]
public IActionResult GetById(Guid id)
{
    var emp = _employeeService.GetById(id);
    if (emp == null)
    {
        return NotFound();
    }

    return Ok(emp);
}

ResponseCache Attribute
The ResponseCache attribute for an ASP.NET Core application specifies the properties for configuring the relevant response caching headers. This attribute can be used for specific endpoints or at the controller level.

Let's update the API endpoint with the ResponseCache attribute.
[HttpGet("{id}")]
[ResponseCache(Duration = 120, Location = ResponseCacheLocation.Any)]
public IActionResult GetById(Guid id)
{
    var emp = _employeeService.GetById(id);

    if (emp == null)
    {
        return NotFound();
    }

    return Ok(emp);
}

The Duration property produces the max-age directive, which here sets the cache duration to two minutes (120 seconds). Similarly, the Location property sets the location portion of the cache-control header. Because we set the Location to Any, which corresponds to the public directive, both the client and any shared caches along the way can cache the response.

Let us now access the API endpoint and confirm these contents within the response headers.
cache-control: public,max-age=120

Furthermore, if we invoke the endpoint repeatedly and the browser serves the response from its cache, the browser's developer tools indicate this in the status line:
Status Code: 200 (from disk cache)

Let's now examine the various ResponseCache parameter options.

All we have to do is update the Location property to ResponseCacheLocation.Client in order to change the cache location to private.
[ResponseCache(Duration = 60, Location = ResponseCacheLocation.Client)]

By doing this, the cache-control header value will be altered to private, indicating that the response can only be cached by the client.
cache-control: private,max-age=60

Let's now set the Location parameter to ResponseCacheLocation.None:
[ResponseCache(Duration = 60, Location = ResponseCacheLocation.None)]

This sets the cache-control and pragma headers to no-cache, so the client cannot use a cached response without revalidating it with the server:

cache-control: no-cache,max-age=60

We can confirm that in this configuration the server always generates a fresh response and the browser never uses the cached one.

NoStore Property
Let's now set the ResponseCache attribute's NoStore property to true.
[ResponseCache(Duration = 60, Location = ResponseCacheLocation.Any, NoStore = true)]

This will cause the response's cache-control header to be set to no-store, telling the client not to cache the response.
cache-control: no-store

Keep in mind that this takes precedence over the Location value we set. In this case, the client will not store the response in its cache at all.

Though the cache-control no-cache and no-store values might produce identical test results, different browsers, clients, and proxies interpret these headers in different ways. No-cache simply means that the client should not use a cached response without revalidating with the server, whereas no-store instructs clients or proxies to not store the response or any portion of it anywhere.

The VaryByHeader property of the ResponseCache attribute can be used to set the vary header.
[ResponseCache(Duration = 60, Location = ResponseCacheLocation.Any, VaryByHeader = "User-Agent")]

Here, we set the VaryByHeader property to User-Agent, which means the cached response is reused only while requests come from the same client device (the same User-Agent). If the User-Agent value changes, the server generates a fresh response. Let us confirm this.

Let's first see if the response headers contain the vary header.
cache-control: public,max-age=60
vary: User-Agent

Consequently, we can force the server to send a fresh response for a different device by configuring the VaryByHeader property to User-Agent.

VaryByQueryKeys Property
When the specified query string parameters change, we can force the server to send a new response by using the VaryByQueryKeys property of the ResponseCache attribute. Naturally, if we set the value to "*", the server generates a fresh response whenever any query string parameter changes.

For instance, if the ID value in the URI changes, we might want to produce a fresh response.
    …/emp?id=53C68EE5-107B-42DB-821E-E1F893C5BDA3
    …/emp?id=6E5692FA-EEF6-426A-B280-EF444CB2BA1E

To do this, let's alter the Get action to add the id parameter and supply the ResponseCache attribute's VaryByQueryKeys property.
[ResponseCache(Duration = 60, Location = ResponseCacheLocation.Any, VaryByQueryKeys = new[] { "id" })]

Recall that in order to use the VaryByQueryKeys property, we must enable the Response Caching Middleware; otherwise, the code raises a runtime exception.

Response Cache Middleware
The Response Caching Middleware in an ASP.NET Core application determines when a response can be cached, stores it, and serves subsequent requests from the cache.

The Response Caching Middleware can be enabled by adding a few lines of code to the Program class.

// service inject
builder.Services.AddResponseCaching();

// in the middleware
app.UseResponseCaching();

First, we register the required services with the AddResponseCaching() method. Afterwards, we add the middleware to the request pipeline with the UseResponseCaching() method.

That is all. Now that the Response Caching Middleware has been turned on, the VaryByQueryKeys property ought to function.
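
For context, here is a minimal Program.cs sketch of where the two calls typically sit (the controller wiring is assumed from a standard Web API template). Note that if CORS is used, UseCors() must be called before UseResponseCaching():

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddControllers();
builder.Services.AddResponseCaching();   // register the response caching services

var app = builder.Build();

// app.UseCors() would go here, before the caching middleware
app.UseResponseCaching();                // add the middleware to the pipeline

app.MapControllers();
app.Run();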

Now that the application is running, let's check the response cache.

It is evident that if the query string remains unchanged, we will receive a cached response; however, if we modify the query string, the server will send a fresh response. Let's examine the response cache and modify the value of the query string.


Take note that the VaryByQueryKeys property does not have a corresponding HTTP header; this feature is handled entirely by the Response Caching Middleware.

That covers the ResponseCache attribute and the Response Caching Middleware in ASP.NET Core.

Happy coding!

HostForLIFE ASP.NET Core 9.0 Hosting

European Best, cheap and reliable ASP.NET hosting with instant activation. HostForLIFE.eu is #1 Recommended Windows and ASP.NET hosting in the European continent. With 99.99% uptime guaranteed for reliability, stability and performance. The HostForLIFE.eu security team is constantly monitoring the entire network for unusual behaviour. We deliver hosting solutions including Shared hosting, Cloud hosting, Reseller hosting, Dedicated Servers, and IT as a Service for companies of all sizes.

 


