European ASP.NET 4.5 Hosting BLOG

BLOG about ASP.NET 4, ASP.NET 4.5 Hosting and Its Technology - Dedicated to European Windows Hosting Customer

European ASP.NET Core 9.0 Hosting - HostForLIFE :: ​Simple Load Balancer in .NET Core with YARP

clock October 8, 2024 08:55 by author Peter

Load balancing is essential in distributed systems and web applications to ensure that traffic is efficiently distributed across multiple servers or resources.

Reverse Proxy

A reverse proxy is a server that sits between client devices (e.g., browsers) and the backend servers, forwarding client requests to the appropriate server and then returning the server's response to the client.

Sticky Sessions

Sticky sessions (also known as session affinity) in YARP ensure that requests from a particular client are always routed to the same backend server. This is particularly important for applications that rely on server-side session data, such as user state or authentication information held in memory on specific servers.

YARP (Yet Another Reverse Proxy) Setup

  • Create a new ASP.NET Core web application.
  • Install the Yarp.ReverseProxy NuGet package (a sample CLI command is shown below).
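
If you use the .NET CLI, the package can be added with:
dotnet add package Yarp.ReverseProxy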

Program.cs
var builder = WebApplication.CreateBuilder(args);

builder.Services.AddReverseProxy().LoadFromConfig(builder.Configuration.GetSection("ReverseProxy"));

var app = builder.Build();
app.MapReverseProxy();

app.Run();


appsettings.json
{
  "ReverseProxy": {
    "Routes": {
      "api-route": {
        "ClusterId": "api-cluster",
        "Match": {
          "Path": "{**catch-all}"
        },
        "Transforms": [
          { "PathPattern": "{**catch-all}" }
        ]
      }
    },
    "Clusters": {
      "api-cluster": {
        "SessionAffinity": {
          "Enabled": "true",
          "AffinityKeyName": "Key1",
          "Cookie": {
            "Domain": "localhost",
            "Expiration": "03:00:00",
            "IsEssential": true,
            "MaxAge": "1.00:00:00",
            "SameSite": "Strict",
            "SecurePolicy": "Always"
          }
        },
          "LoadBalancingPolicy": "RoundRobin",
          "Destinations": {
            "destination1": {
              "Address": "https://localhost:7106"
            },
            "destination2": {
              "Address": "https://localhost:7107"
            }
          }
        }
      }
  },
  "Logging": {
    "LogLevel": {
      "Default": "Information",
      "Microsoft.AspNetCore": "Warning"
    }
  },
  "AllowedHosts": "*"
}

Destination Address: The destination address should point to the base URL of your hosted web application.
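
Each destination above is just an ordinary ASP.NET Core application. As a minimal sketch (an assumption for illustration, not part of the original setup), a destination app that reports which instance handled the request makes the round-robin behavior easy to observe:

// Program.cs of a destination app - run one copy on port 7106 and another on 7107
var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

// Return which host/port served the request so repeated calls through the
// proxy show how requests are distributed.
app.MapGet("/", (HttpContext context) =>
    $"Handled by destination listening on {context.Request.Host}");

app.Run();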

Output
Here is the console output of the round-robin policy.

Load balancer

Summary
Load balancing helps in improving the scalability, performance, availability, and security of web applications and services.

HostForLIFE ASP.NET Core 9.0 Hosting

European Best, cheap and reliable ASP.NET hosting with instant activation. HostForLIFE.eu is #1 Recommended Windows and ASP.NET hosting in the European Continent, with 99.99% uptime guaranteed for reliability, stability and performance. The HostForLIFE.eu security team is constantly monitoring the entire network for unusual behaviour. We deliver hosting solutions including Shared hosting, Cloud hosting, Reseller hosting, Dedicated Servers, and IT as a Service for companies of all sizes.



European ASP.NET Core 9.0 Hosting - HostForLIFE :: ​Implementing CORS in .NET Core 8

clock October 4, 2024 07:56 by author Peter

Cross-Origin Resource Sharing (CORS) is a security feature that prevents web pages from sending requests to a domain other than the one that served them. Knowing how to implement CORS properly is essential for building safe and useful modern web applications, especially when using Angular as a front-end framework and .NET Core 8 as a backend. This article describes best practices and common pitfalls for configuring CORS in a .NET Core 8 environment.

Assume you have a friend who lives next door (another website) and a toy box at home (your website). Your mom (the browser) has a rule that states your friend can only play with their own toys. Your friend wants to play with your toys (data) in your toy box. This is to ensure that everything is secure and safe. Assume that your friend's website is funfriend.com and that yours is mycooltoys.com. CORS checks are performed if funfriend.com requests to borrow a toy from mycooltoys.com.

  • Is funfriend.com allowed to borrow toys from mycooltoys.com?
  • Did they follow the rules about how to ask?
  • If everything is good, then your friend can borrow the toy!

Setting Up CORS in .NET Core 8

  • Install Necessary Packages: If you haven't already, ensure you have the required .NET Core packages installed. In most cases, the default setup will suffice, but you can install any specific CORS libraries if needed.
  • Configure CORS in Startup: In .NET Core 8, the configuration of CORS is typically done in the Program.cs file. Here’s a simple setup.

    var builder = WebApplication.CreateBuilder(args);
    // Add CORS services
    builder.Services.AddCors(options =>
    {
        options.AddPolicy("AllowAngularApp",
            policy => policy.WithOrigins("https://your-angular-app.com")
                            .AllowAnyMethod()
                            .AllowAnyHeader()
                            .AllowCredentials());
    });
    var app = builder.Build();
    // Use CORS policy
    app.UseCors("AllowAngularApp");
    app.MapControllers();
    app.Run();


  • Allowing Specific Origins: For production environments, it’s crucial to specify the exact origin rather than using AllowAnyOrigin(), which is a common pitfall. Limiting allowed origins enhances security.

options.AddPolicy("AllowAngularApp",
    policy => policy.WithOrigins("https://your-angular-app.com")
                    .AllowAnyMethod()
                    .AllowAnyHeader());

  • Handling Preflight Requests: Ensure your server can handle preflight requests. These are OPTIONS requests sent by browsers to check permissions. By enabling CORS and handling these requests, you ensure that your application can respond correctly.
  • Allow Credentials: If your Angular application needs to send cookies or HTTP authentication information, you need to set AllowCredentials() in your CORS policy. Be cautious with this feature, as it requires that the origin is explicitly specified and cannot be set to AllowAnyOrigin().

Advanced CORS Configuration

  • Customizing Allowed Methods: You can customize allowed methods if your API uses specific HTTP methods.

    options.AddPolicy("AllowAngularApp",
        policy => policy.WithOrigins("https://your-angular-app.com")
                        .WithMethods("GET", "POST", "PUT", "DELETE")
                        .AllowAnyHeader()
                        .AllowCredentials());


Setting Exposed Headers: If your API returns custom headers that the client needs to access, specify these using WithExposedHeaders.

options.AddPolicy("AllowAngularApp",
    policy => policy.WithOrigins("https://your-angular-app.com")
                    .AllowAnyMethod()
                    .AllowAnyHeader()
                    .WithExposedHeaders("X-Custom-Header")
                    .AllowCredentials());

Logging CORS Requests: For debugging purposes, you can log CORS requests to track any issues that arise. Here’s a simple logging middleware.
app.Use(async (context, next) =>
{
    if (context.Request.Headers.ContainsKey("Origin"))
    {
        var origin = context.Request.Headers["Origin"];
        Console.WriteLine($"CORS request from: {origin}");
    }
    await next();
});

Best Practices

  • Limit Origins: Always specify the exact origins that are permitted to interact with your API. Avoid using wildcards (*) as they expose your API to potential security risks.
  • Use HTTPS: Ensure both your .NET Core backend and Angular frontend are served over HTTPS. This secures data in transit and enhances trustworthiness.
  • Regularly Review CORS Policies: As your application grows and evolves, periodically review your CORS configurations to ensure they align with current security requirements.
  • Test CORS Configurations: Use tools like Postman or browser developer tools to test your CORS setup. Check for errors and ensure your API is returning the expected headers (see the example after this list).
  • Document Your API: Clearly document the CORS policies and allowed origins in your API documentation. This helps other developers understand how to interact with your API correctly.
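
For example, a preflight request can be simulated from the command line with curl to check which CORS headers the API returns (the URLs below are placeholders for your own API and front-end origins):

curl -i -X OPTIONS https://your-api.com/api/resource \
  -H "Origin: https://your-angular-app.com" \
  -H "Access-Control-Request-Method: GET"

If the policy is configured correctly, the response includes an Access-Control-Allow-Origin header for the allowed origin.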

Common Pitfalls

  • Misconfigured Allowed Origins: One of the most frequent mistakes is misconfiguring allowed origins. Double-check the exact URLs, including the protocol (HTTP vs. HTTPS) and any potential trailing slashes.
  • Forgetting to Apply CORS Middleware: Ensure that the UseCors middleware is applied before any endpoints are mapped. Placing it after endpoint mapping can lead to unexpected behaviors.
  • AllowAnyOrigin with AllowCredentials: This combination is not allowed and will cause CORS requests to fail. If you need credentials, specify the exact origins.
  • Not Handling OPTIONS Requests: Ignoring the preflight OPTIONS requests can lead to issues when your API is accessed from different origins. Ensure your server can properly handle these requests.

Example
Default

Allowing Specific Origins

Conclusion
Implementing CORS in .NET Core 8 for Angular applications is crucial for creating a secure and functional web application. By following best practices and being aware of common pitfalls, you can ensure that your CORS setup is both effective and secure. Regularly revisiting your CORS configuration as your application evolves will help maintain security and functionality in the long run.

Happy Coding!




European ASP.NET Core 9.0 Hosting - HostForLIFE :: Using the Shopping Cart Discount Function to Better Understand the Concept of a Rule Engine

clock September 30, 2024 09:30 by author Peter

Hello everyone, I hope everything is going well for you. In this post, we will examine the idea of a rule engine and how NRules (a .NET package) implements it. To make it easy to follow, we will build a new .NET 8 console application that demonstrates a shopping cart discount feature for users according to their membership status (Normal/Silver/Gold/VIP).

What is a Rule Engine?

A software program that carries out one or more business rules in a runtime production setting is called a rule engine. The policies may be derived from corporate guidelines, statutory regulations, or other sources. You may think of a rule engine as an advanced translator of if/then statements. A rule engine's main advantage is that its business rules may be written in a non-programmer's language and can be changed without affecting the application code underneath.

NRules - An Introduction
NRules is an open-source .NET rule engine that supports defining business rules separately from the system logic. It allows rules to be written in C#, using an internal DSL to express rule conditions and actions. NRules is highly extensible, can integrate with any .NET application, and supports dynamic rule compilation.
Shopping Cart Discount Logic Using NRules

In this example, we will create a simple console application in C# that uses NRules to apply different discount percentages based on the user's membership type (Normal, Silver, Gold, VIP).

Step 1. Set Up the Project
First, create a new Console Application in Visual Studio or using the .NET CLI. Then, add the NRules package and NRules.Runtime package via NuGet.

dotnet add package NRules
dotnet add package NRules.Runtime

Step 2. Define the Domain Model
Create classes for Customers and Orders
public class Customer
{
    public string MembershipType { get; set; }
}

public class Order
{
    public Customer Customer { get; set; }
    public decimal TotalAmount { get; set; }
    public decimal DiscountedAmount { get; set; }
}


Step 3. Define the Rules
Create a class MembershipDiscountRule for discount rules. We'll consolidate the rules into a single class for simplicity.
using RuleEngine_ShoppingCartDiscount.Models;
using NRules.Fluent.Dsl;
using System;

namespace RuleEngine_ShoppingCartDiscount
{
    public class MembershipDiscountRule : Rule
    {
        public override void Define()
        {
            Order order = null;

            When()
                .Match<Order>(() => order,
                              o => o.Customer != null,
                              o => o.DiscountedAmount == 0); // Ensure discount is not already applied

            Then()
                .Do(ctx => ApplyDiscount(order))
                .Do(ctx => ctx.Update(order));
        }

        private void ApplyDiscount(Order order)
        {
            var discount = order.Customer.MembershipType switch
            {
                "Normal" => 0.90m, // 10% discount
                "Silver" => 0.80m, // 20% discount
                "Gold" => 0.75m, // 25% discount
                "VIP" => 0.70m, // 30% discount
                _ => 1.00m // No discount
            };

            order.DiscountedAmount = order.TotalAmount * discount;
            Console.WriteLine($"Applied {((1 - discount) * 100)}% discount for {order.Customer.MembershipType} member. Total now: {order.DiscountedAmount}");
        }
    }
}


Step 4. Configure and Run the Rule Engine


In your Main method, set up the rule repository, compile the rules, create a session, and create the list of Customers (Normal, Silver, Gold, VIP).

// See https://aka.ms/new-console-template for more information
// Load rules
using NRules;
using NRules.Fluent;
using RuleEngine_ShoppingCartDiscount;
using RuleEngine_ShoppingCartDiscount.Models;

var repository = new RuleRepository();
repository.Load(x => x.From(typeof(MembershipDiscountRule).Assembly));

// Compile rules
var factory = repository.Compile();

// Create a session
var session = factory.CreateSession();

var customers = new List<Customer>
{
    new Customer { MembershipType = "Normal" },
    new Customer { MembershipType = "Silver" },
    new Customer { MembershipType = "Gold" },
    new Customer { MembershipType = "VIP" }
};

// Create customer and order
foreach (var customer in customers)
{
    var order = new Order { Customer = customer, TotalAmount = 100 };

    // Insert facts into rules engine's memory
    session.Insert(order);

    // Start match/resolve/act cycle
    session.Fire();

    Console.WriteLine($"Final amount to pay: {order.DiscountedAmount}\n");
}

Console.ReadLine();


Step 5. Test the Application

Run the console application; it will print the discounted price for each membership type. The expected output is sketched below.
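
Based on the rule above (a 100-unit order for each membership type), the console output should look roughly like this; the exact decimal formatting may differ:

Applied 10.00% discount for Normal member. Total now: 90.00
Final amount to pay: 90.00

Applied 20.00% discount for Silver member. Total now: 80.00
Final amount to pay: 80.00

Applied 25.00% discount for Gold member. Total now: 75.00
Final amount to pay: 75.00

Applied 30.00% discount for VIP member. Total now: 70.00
Final amount to pay: 70.00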

Code Explanation

  • Rules Definition: A single rule handles all membership types by using a switch expression to determine the discount rate based on the membership type.
  • Rule Engine Setup: Rules are loaded and compiled, and a session is created where the order is inserted as a fact.
  • Execution: The session.Fire() method triggers the rule engine, which evaluates the inserted facts against the compiled rules and applies the appropriate discount.




European ASP.NET Core 9.0 Hosting - HostForLIFE :: Docker and Kubernetes: Containerizing React JS and .NET Core Applications

clock September 24, 2024 07:54 by author Peter

This tutorial will use React JS web forms and the .NET Core Web API to develop a prototype product application backend. Additionally, we will use Docker and Kubernetes to containerize both applications.

Example of a Product Application: Web API for Backend (.NET Core)
Step 1. Create a new Web API for Product Management using .NET Core.

Step 2. Install the NuGet package for the Entity Framework Core in-memory database provider.
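
If you use the .NET CLI, this is typically:
dotnet add package Microsoft.EntityFrameworkCore.InMemory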

Step 3. Add the product class inside the entities folder.
namespace ProductManagementAPI.Entities
{
    public class Product
    {
        public int Id { get; set; }
        public string Name { get; set; }
        public decimal Price { get; set; }
    }
}

Step 4. Create an AppDbContext class inside the data folder with an in-memory connection and a DB set property.
using Microsoft.EntityFrameworkCore;
using ProductManagementAPI.Entities;
namespace ProductManagementAPI.Data
{
    public class AppDbContext : DbContext
    {
        public DbSet<Product> Products { get; set; }

        public AppDbContext(DbContextOptions<AppDbContext> options)
            : base(options)
        {
        }
        protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
        {
            // This check prevents configuring the DbContext if options are already provided
            if (!optionsBuilder.IsConfigured)
            {
                // Configure the in-memory database here, if needed
                optionsBuilder.UseInMemoryDatabase("InMemoryDb");
            }
        }
        protected override void OnModelCreating(ModelBuilder modelBuilder)
        {
            // Optionally configure entity mappings here
        }
    }
}


Step 5. Add a product repository inside the repositories folder.
IProductRepository
using ProductManagementAPI.Entities;
namespace ProductManagementAPI.Repositories
{
    public interface IProductRepository
    {
        Task<List<Product>> GetAllProductsAsync();
        Task<Product> GetProductByIdAsync(int id);
        Task AddProductAsync(Product product);
        Task UpdateProductAsync(Product product);
        Task DeleteProductAsync(int id);
    }
}

ProductRepository
using Microsoft.EntityFrameworkCore;
using ProductManagementAPI.Data;
using ProductManagementAPI.Entities;
namespace ProductManagementAPI.Repositories
{
    public class ProductRepository : IProductRepository
    {
        private readonly AppDbContext _context;
        public ProductRepository(AppDbContext context)
        {
            _context = context;
        }
        public async Task<List<Product>> GetAllProductsAsync()
        {
            return await _context.Products.ToListAsync();
        }
        public async Task<Product> GetProductByIdAsync(int id)
        {
            return await _context.Products
                .AsNoTracking()
                .FirstOrDefaultAsync(p => p.Id == id);
        }
        public async Task AddProductAsync(Product product)
        {
            if (product == null)
            {
                throw new ArgumentNullException(nameof(product));
            }

            _context.Products.Add(product);
            await _context.SaveChangesAsync();
        }
        public async Task UpdateProductAsync(Product product)
        {
            if (product == null)
            {
                throw new ArgumentNullException(nameof(product));
            }

            _context.Entry(product).State = EntityState.Modified;
            await _context.SaveChangesAsync();
        }
        public async Task DeleteProductAsync(int id)
        {
            var product = await _context.Products.FindAsync(id);
            if (product == null)
            {
                throw new KeyNotFoundException("Product not found.");
            }
            _context.Products.Remove(product);
            await _context.SaveChangesAsync();
        }

    }
}

Step 6. Create a new product controller with different action methods that we used to perform different operations using our front-end application after invoking the same.
using Microsoft.AspNetCore.Mvc;
using ProductManagementAPI.Entities;
using ProductManagementAPI.Repositories;
namespace ProductManagementAPI.Controllers
{
    [ApiController]
    [Route("api/[controller]")]
    public class ProductsController : ControllerBase
    {
        private readonly IProductRepository _repository;
        public ProductsController(IProductRepository repository)
        {
            _repository = repository;
        }
        [HttpGet]
        public async Task<IActionResult> GetAllProducts()
        {
            var products = await _repository.GetAllProductsAsync();
            return Ok(products); // Returns only the list of products
        }
        [HttpGet("{id}")]
        public async Task<IActionResult> GetProductById(int id)
        {
            var product = await _repository.GetProductByIdAsync(id);
            if (product == null)
            {
                return NotFound();
            }
            return Ok(product); // Returns only the product data
        }
        [HttpPost]
        public async Task<IActionResult> AddProduct([FromBody] Product product)
        {
            if (product == null)
            {
                return BadRequest();
            }
            await _repository.AddProductAsync(product);
            return CreatedAtAction(nameof(GetProductById), new { id = product.Id }, product);
        }
        [HttpPut("{id}")]
        public async Task<IActionResult> UpdateProduct(int id, [FromBody] Product product)
        {
            if (product == null || id != product.Id)
            {
                return BadRequest();
            }
            await _repository.UpdateProductAsync(product);
            return NoContent();
        }
        [HttpDelete("{id}")]
        public async Task<IActionResult> DeleteProduct(int id)
        {
            await _repository.DeleteProductAsync(id);
            return NoContent();
        }
    }

}

Step 7. Register our services inside the service container and configure the middleware.
using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.Options;
using ProductManagementAPI.Data;
using ProductManagementAPI.Repositories;
var builder = WebApplication.CreateBuilder(args);
// Add services to the container.
builder.Services.AddScoped<IProductRepository, ProductRepository>();
builder.Services.AddCors(options => {
    options.AddPolicy("CORSPolicy", policy => policy.AllowAnyOrigin().AllowAnyMethod().AllowAnyHeader());
});

// Configure the in-memory database (a single DbContext registration is enough)
builder.Services.AddDbContext<AppDbContext>(options =>
    options.UseInMemoryDatabase("InMemoryDb"));

builder.Services.AddControllers();
// Learn more about configuring Swagger/OpenAPI at https://aka.ms/aspnetcore/swashbuckle
builder.Services.AddEndpointsApiExplorer();
builder.Services.AddSwaggerGen();

var app = builder.Build();

// Configure the HTTP request pipeline.
app.UseCors("CORSPolicy");
app.UseSwagger();
app.UseSwaggerUI();
app.UseHttpsRedirection();
app.UseAuthorization();
app.MapControllers();
app.Run();

Step 8. Finally, run the application and use Swagger UI to execute different API endpoints.

Sample Product Application: Frontend (React JS)

Let’s create a client application using React JS and consume the above API endpoints within it.

Step 1. Create a new React JS application with the help of the following command.
npx create-react-app react-netcore-crud-app

Step 2. Navigate to your project directory.
cd react-netcore-crud-app

Step 3. Install Axios to call the backend API and Bootstrap for styling purposes.
npm install axios
npm install bootstrap


Step 4. Add the following components and services.
Product list component
// src/components/ProductList/ProductList.js
import React, { useState, useEffect } from 'react';
import ProductListItem from './ProductListItem';
import productService from '../../services/productService';
const ProductList = () => {
    const [products, setProducts] = useState([]);
    useEffect(() => {
        fetchProducts();
    }, []);
    const fetchProducts = async () => {
        try {
            const productsData = await productService.getAllProducts();
            setProducts(productsData);
        } catch (error) {
            console.error('Error fetching products:', error);
        }
    };
    const handleDelete = async (id) => {
        try {
            await productService.deleteProduct(id);
            fetchProducts(); // Refresh product list
        } catch (error) {
            console.error('Error deleting product:', error);
        }
    };
    const handleEdit = () => {
        fetchProducts(); // Refresh product list after editing
    };
    return (
        <div className="container">
            <h2 className="my-4">Product List</h2>
            <ul className="list-group">
                {Array.isArray(products) && products.length > 0 ? (
                    products.map(product => (
                        <ProductListItem key={product.id} product={product} onDelete={() => handleDelete(product.id)} onEdit={handleEdit} />
                    ))
                ) : (
                    <p>No products available</p>
                )}
            </ul>
        </div>
    );
};
export default ProductList;


Product list item component

// src/components/ProductList/ProductListItem.js
import React, { useState } from 'react';
import productService from '../../services/productService';
const ProductListItem = ({ product, onDelete, onEdit }) => {
    const [isEditing, setIsEditing] = useState(false);
    const [editedName, setEditedName] = useState(product.name);
    const [editedPrice, setEditedPrice] = useState(product.price);
    const handleEdit = async () => {
        setIsEditing(true);
    };
    const handleSave = async () => {
        const editedProduct = { ...product, name: editedName, price: parseFloat(editedPrice) };
        try {
            await productService.updateProduct(product.id, editedProduct);
            setIsEditing(false);
            onEdit(); // Refresh product list
        } catch (error) {
            console.error('Error updating product:', error);
        }
    };
    const handleCancel = () => {
        setIsEditing(false);
        // Reset edited values
        setEditedName(product.name);
        setEditedPrice(product.price);
    };
    return (
        <li className="list-group-item">
            {isEditing ? (
                <div className="row">
                    <div className="col">
                        <input type="text" className="form-control" value={editedName} onChange={e => setEditedName(e.target.value)} required />
                    </div>
                    <div className="col">
                        <input type="number" className="form-control" value={editedPrice} onChange={e => setEditedPrice(e.target.value)} required />
                    </div>
                    <div className="col-auto">
                        <button className="btn btn-success me-2" onClick={handleSave}>Save</button>
                        <button className="btn btn-secondary" onClick={handleCancel}>Cancel</button>
                    </div>
                </div>
            ) : (
                <div className="d-flex justify-content-between align-items-center">
                    <span>{product.name} - ${product.price}</span>
                    <div>
                        <button className="btn btn-danger me-2" onClick={onDelete}>Delete</button>
                        <button className="btn btn-primary" onClick={handleEdit}>Edit</button>
                    </div>
                </div>
            )}
        </li>
    );
};
export default ProductListItem;


Product service
// src/services/productService.js
import axios from 'axios';
const baseURL = 'https://localhost:7202/api/products';
const productService = {
    getAllProducts: async () => {
        try {
            const response = await axios.get(
              baseURL,
              {
                timeout: 3000,
                headers: {
                  Accept: 'application/json',
                },
              },
            );

            return response.data;
          } catch (err) {
            if (err.code === 'ECONNABORTED') {
              console.log('The request timed out.');
            } else {
              console.log(err);
            }
          }
    },
    addProduct: async (product) => {
        const response = await axios.post(baseURL, product);
        return response.data;
    },
    deleteProduct: async (id) => {
        const response = await axios.delete(`${baseURL}/${id}`);
        return response.data;
    },
    updateProduct: async (id, product) => {
        const response = await axios.put(`${baseURL}/${id}`, product);
        return response.data;
    }
};
export default productService;


App component
// src/App.js
import React, { useState } from 'react';
import ProductList from './components/ProductList/ProductList';
import ProductForm from './components/ProductForm/ProductForm';
function App() {
    const [refresh, setRefresh] = useState(false);
    const handleProductAdded = () => {
        setRefresh(!refresh); // Toggle refresh state to trigger re-render
    };
    return (
        <div>
            <ProductList key={refresh} />
            <ProductForm onProductAdded={handleProductAdded} />
        </div>
    );
}
export default App;
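
App.js imports a ProductForm component that is not shown in the post. A minimal sketch of it, assuming it only needs the addProduct call from productService and the onProductAdded callback used above, might look like this:

// src/components/ProductForm/ProductForm.js (illustrative sketch)
import React, { useState } from 'react';
import productService from '../../services/productService';
const ProductForm = ({ onProductAdded }) => {
    const [name, setName] = useState('');
    const [price, setPrice] = useState('');
    const handleSubmit = async (e) => {
        e.preventDefault();
        try {
            await productService.addProduct({ name, price: parseFloat(price) });
            setName('');
            setPrice('');
            onProductAdded(); // Let the parent refresh the product list
        } catch (error) {
            console.error('Error adding product:', error);
        }
    };
    return (
        <form className="container my-4" onSubmit={handleSubmit}>
            <h2>Add Product</h2>
            <div className="row">
                <div className="col">
                    <input type="text" className="form-control" placeholder="Name" value={name} onChange={e => setName(e.target.value)} required />
                </div>
                <div className="col">
                    <input type="number" className="form-control" placeholder="Price" value={price} onChange={e => setPrice(e.target.value)} required />
                </div>
                <div className="col-auto">
                    <button type="submit" className="btn btn-primary">Add</button>
                </div>
            </div>
        </form>
    );
};
export default ProductForm;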


Step 5. Run the application using the following command and perform the different CRUD operations.
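
For a create-react-app project, this is:
npm start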

Dockerfiles for the Application
Dockerfile for the backend application (.NET Core).
# Use the official .NET Core SDK as a parent image
FROM mcr.microsoft.com/dotnet/sdk:6.0 AS build

WORKDIR /app

# Copy the project file and restore any dependencies (use .csproj for the project name)
COPY *.csproj ./
RUN dotnet restore

# Copy the rest of the application code
COPY . .

# Publish the application
RUN dotnet publish -c Release -o out

# Build the runtime image
FROM mcr.microsoft.com/dotnet/aspnet:6.0 AS runtime

WORKDIR /app
COPY --from=build /app/out ./

# Expose the port your application will run on
EXPOSE 80

# Start the application
ENTRYPOINT ["dotnet", "ProductManagementAPI.dll"]


  • Line 1-2: Uses the official .NET Core SDK image (mcr.microsoft.com/dotnet/sdk:6.0) as a base.
  • Line 4: Sets the working directory to /app.
  • Line 6-7: Copies the project file(s) (*.csproj) into the container.
  • Line 8: Runs dotnet restore to restore dependencies specified in the project file(s).
  • Line 10-11: Copies the rest of the application code into the container.
  • Line 13-14: Publishes the application in Release configuration (dotnet publish -c Release -o out), outputting to the out directory.
  • Line 16-17: Uses the official .NET Core ASP.NET runtime image (mcr.microsoft.com/dotnet/aspnet:6.0) as a base.
  • Line 19-20: Sets the working directory to /app and Copies the published output from the build stage (from /app/out) into the /app directory of the runtime stage.
  • Line 22-23: Exposes port 80 to allow external access to the application.
  • Line 25-26: Specifies dotnet ProductManagementAPI.dll as the entry point command to start the application.

Dockerfile for the frontend application (React JS).
FROM node:16-alpine
WORKDIR /app
COPY . .
RUN npm install
RUN npm run build
EXPOSE 3000
CMD ["npm", "start"]

  • Line 1: Specifies the base image, using Node.js version 16 (Alpine).
  • Line 2: sets /app as the working directory for subsequent commands.
  • Line 3: Copies the contents of your local directory into the container's working directory.
  • Line 4: Installs the project dependencies inside the container.
  • Line 5: Build the production version of your React app.
  • Line 6: This exposes port 3000, which is where the application will run.
  • Line 7: The command starts the React application with the development server (npm start).

Next, replace the hard-coded backend URL in the product service with a value read from an environment variable.
// src/services/productService.js
import axios from 'axios';

//const baseURL = 'https://localhost:31912/api/products';
const baseURL = process.env.REACT_APP_API_URL;

const productService = {
    getAllProducts: async () => {
        try {
            const response = await axios.get(
              baseURL,
              {
                timeout: 3000,
                headers: {
                  Accept: 'application/json',
                },
              },
            );

            return response.data;
          } catch (err) {
            if (err.code === 'ECONNABORTED') {
              console.log('The request timed out.');
            } else {
              console.log(err);
            }
          }
    },
    addProduct: async (product) => {
        const response = await axios.post(baseURL, product);
        return response.data;
    },
    deleteProduct: async (id) => {
        const response = await axios.delete(`${baseURL}/${id}`);
        return response.data;
    },
    updateProduct: async (id, product) => {
        const response = await axios.put(`${baseURL}/${id}`, product);
        return response.data;
    }
};

export default productService;

Containerize the front-end and back-end application
Step 1. Build the docker images
docker build -t productbackendapp:latest .

docker build -t productfrontendapp:latest .

Step 2. Create deployment and service YAML files for the backend application.

deployment.yml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: product-management-api
  labels:
    app: product-management-api
spec:
  replicas: 1
  selector:
    matchLabels:
      app: product-management-api
  template:
    metadata:
      labels:
        app: product-management-api
    spec:
      containers:
      - name: product-management-api
        image: productbackendapp:latest
        imagePullPolicy: Never
        ports:
        - containerPort: 80

service.yml
apiVersion: v1
kind: Service
metadata:
  name: product-management-api-service
  labels:
    app: product-management-api
spec:
  type: NodePort
  ports:
    - protocol: TCP
      port: 80
      targetPort: 80
  selector:
    app: product-management-api


Step 3. Create deployment, service, and backend-config map YAML files for the frontend application.

deployment.yml

apiVersion: apps/v1
kind: Deployment
metadata:
  name: react-client-deployment
spec:
  replicas: 1
  selector:
    matchLabels:
      app: react-client
  template:
    metadata:
      labels:
        app: react-client
    spec:
      containers:
      - name: react-client
        image: productfrontendapp:latest
        imagePullPolicy: Never
        ports:
        - containerPort: 3000
        env:
        - name: REACT_APP_API_URL
          valueFrom:
            configMapKeyRef:
              name: backend-config
              key: REACT_APP_API_URL


service.yml
apiVersion: v1
kind: Service
metadata:
  name: react-client-service
spec:
  selector:
    app: react-client
  ports:
    - protocol: TCP
      port: 80
      targetPort: 3000
  type: NodePort


backend-configmap.yml
apiVersion: v1
kind: ConfigMap
metadata:
  name: backend-config
data:
  REACT_APP_API_URL: "http://localhost:30191/api/products"

Step 4. Apply all the above files one by one using kubectl commands. (Note: make sure Kubernetes is running on your system with the Docker daemon.)
kubectl apply -f deployment.yml
kubectl apply -f service.yml
kubectl apply -f backend-configmap.yml


Step 5. Verify that the deployments, services, pods, and config map are up and running with the help of kubectl commands.
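
The standard kubectl get commands can be used for this check:
kubectl get deployments
kubectl get services
kubectl get pods
kubectl get configmaps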

Step 6. Access the backend and frontend product applications through their NodePort services.

 

Conclusion
In this article, we created a product management backend application using .NET Core with the different API endpoints required to perform CRUD operations. Later on, we created the front-end application using React JS and consumed the back-end API from it with the help of Axios. We also containerized both applications with the help of Docker and Kubernetes.




European ASP.NET Core 9.0 Hosting - HostForLIFE :: ASP.NET Core IActionFilter Explanation

clock September 20, 2024 07:30 by author Peter

We will discuss ASP.NET Core IActionFilter in this article. Okay, let's get going.

What Are Filters in ASP.NET Core?

Filters add cross-cutting concerns such as logging, authentication, authorization, exception handling, and caching.

Filters let us run cross-cutting logic at the following points:

  • Before a controller action method handles an HTTP request.
  • After a controller action method has handled an HTTP request.
  • After the response has been prepared, but before it is delivered to the client.

Action Filters in ASP.NET Core
In ASP.NET Core, action filters run both before and after an action method. They carry out operations such as logging, modifying the arguments passed to the action, and changing the result.

IActionFilter

IActionFilter is an interface in ASP.NET Core that lets you see and control the actions that are running in your application. It offers ways to provide unique behavior both before and after an action method is called. IActionFilter is an effective tool for cross-cutting issues, but there are benefits and drawbacks to its use.

Advantages of IActionFilter

  • Cross-Cutting Concerns: It allows you to handle cross-cutting concerns like logging, validation, caching, or authentication in a clean and centralized manner instead of scattering those concerns across multiple action methods.
  • Reusable Logic: Filters can be reused across multiple controllers and actions, promoting DRY (Don't Repeat Yourself) principles in your codebase.
  • Separation of Concerns: By using action filters, you can separate concerns between the action logic and the filtering logic, leading to cleaner, more maintainable code.
  • Execution Order Control: Action filters can be ordered using the `Order` property, providing flexibility in the execution sequence of filters, which is essential for scenarios where the order of execution matters.
  • Access to Action Context: Action filters have access to the action context, allowing them to inspect and modify request data easily or affect the result that gets returned.

Disadvantages of IActionFilter

  • Complexity: Introducing filters can add complexity to your application. When multiple filters are used, understanding the execution flow can become difficult, especially with dependencies between filters.
  • Performance Overhead: Each action filter adds a slight overhead to the execution of requests, which can accumulate, particularly in high-throughput applications. It's essential to ensure that filters are efficient and necessary.
  • Tight Coupling: If not used judiciously, filters may lead to tight coupling between your API's action methods and the filters, making it harder to test and maintain individual components.
  • Limited Scope: Filters operate on action methods, meaning they cannot be applied globally to things like middleware or other request pipelines. If you need a broader application of logic, you might need a different approach.
  • Difficulty in Testing: While filters can be reusable, they can also complicate unit tests. Mocking dependencies or asserting functionality across multiple layers can become challenging.

Example
TimeActionFilter logs the execution time of action methods. The OnActionExecuting method captures the start time, while OnActionExecuted calculates the elapsed time and logs it.
// Create TimeActionFilter

using System.Diagnostics;
using Microsoft.AspNetCore.Mvc.Filters;

public class TimeActionFilter : IActionFilter
{
    private Stopwatch stopwatch;

    public void OnActionExecuting(ActionExecutingContext filterContext)
    {
        // Capture the start time before the action runs
        stopwatch = Stopwatch.StartNew();
        Debug.WriteLine("Stopwatch Started");
    }

    public void OnActionExecuted(ActionExecutedContext filterContext)
    {
        // Stop the timer after the action has run and log the elapsed time
        stopwatch.Stop();
        var elapsedMilliseconds = stopwatch.ElapsedMilliseconds;
        Debug.WriteLine($"Action '{filterContext.ActionDescriptor.DisplayName}' executed in {elapsedMilliseconds} ms");
    }
}

// Register the filter in Startup.cs

public void ConfigureServices(IServiceCollection services)
{
    services.AddControllers(options =>
    {
        options.Filters.Add(typeof(TimeActionFilter));
    });
}
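
In newer ASP.NET Core versions that use the minimal hosting model (no Startup.cs), the same registration goes into Program.cs; a minimal sketch:

var builder = WebApplication.CreateBuilder(args);
builder.Services.AddControllers(options =>
{
    options.Filters.Add<TimeActionFilter>();
});

var app = builder.Build();
app.MapControllers();
app.Run();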





European ASP.NET Core 9.0 Hosting - HostForLIFE :: Use and Examples of the Response Cache Attribute in .NET Core

clock September 11, 2024 06:52 by author Peter

Response caching is the technique by which a browser or other client stores a server's response in memory. As a result, requests for the same resources will be processed more quickly in the future. Additionally, this spares the server from processing and producing the same response over and over again. ASP.NET Core uses the ResponseCache attribute to set the response caching headers. Moreover, we can use the Response Caching Middleware to control the caching behavior from the server side. Clients and other proxies read the response caching headers to determine how to cache the server response. The HTTP 1.1 caching specification describes how browsers, clients, and proxies should honor these caching headers.

What Does "caching" Mean?
The process of temporarily storing frequently used data so that it can be rapidly retrieved is known as caching. The application's overall speed will be improved because there won't be as much need to fetch the same data from the database or other storage devices.

ASP.NET Core Caching Types

ASP.NET Core supports multiple types of caching mechanisms. They are listed below.
  • In-Memory Caching: The most basic type of caching, in-memory caching works well with a single server. The main memory of the web server houses the data. It is quick and appropriate for data that doesn't require persistence past the web server process' lifetime or require a lot of memory. It works well for storing modest volumes of information.
  • Distributed Caching: Applications that must share data across multiple servers in load-balancing or multi-server environments are best suited for distributed caching. It entails keeping data in an external system, like NCache, SQL Server, Redis, and so forth. Although it requires more work than in-memory caching, large-scale applications must use it to guarantee consistency between requests and sessions.
  • Response Caching: Response Caching is the process of keeping the result of a request-response cycle in the cache in order to serve the resource from the cache in response to subsequent requests rather than having to generate the response from scratch. This method can greatly enhance the performance of a web application, particularly for resources that are costly to produce and don't change frequently.

HTTP Based Response Caching
Now let's talk about the different HTTP Cache Directives and how to control the way caching behaves. Cache-control is the primary header parameter that we utilize to specify whether or not a response can be cached. The cache-control header should be respected and adhered to by clients, proxy servers, and browsers when it shows up in the response.

Let's now examine the standard cache-control directives.

  • public: denotes the ability for a cache to hold the response locally on the client or in a shared location.
  • private: denotes that the response may only be stored in a client-side private cache and not in a shared cache.
  • no-cache: The no-cache directive instructs a cache not to use a stored response without first revalidating it with the origin server.
  • no-store: The no-store directive instructs a cache not to store the response at all.

Browsers and clients understand no-cache and no-store differently, despite the fact that they sound and even behave similarly. We will discuss this in more depth as we go through the instances.

A few more headers, in addition to cache control, can regulate the caching behavior.

The pragma header exists for backward compatibility with the HTTP 1.0 specification, where it behaves like the no-cache directive. Clients disregard the pragma header if we supply the cache-control header.

Vary: This tells the cache that it may serve a cached response only if the listed header fields exactly match those of the previous request. The server generates a new response if any of the fields change.

Illustrations of HTTP Cache Directives
We will now create an ASP.NET Core application to demonstrate how the cache directives work. Now let's add a controller action method to an ASP.NET Core Web API project that has been created.
public record EmployeeDto
{
    public Guid Id { get; init; }
    public string Name { get; init; }
    public EmployeeType Type { get; init; }
    public string Mno { get; init; }
    public decimal Salary { get; init; }
    public DateTime CurrentDate { get; init; } = DateTime.Now;
}

[HttpGet("{id}")]
public IActionResult GetById(Guid id)
{
    var emp = _employeeService.GetById(id);
    if (emp == null)
    {
        return NotFound();
    }

    return Ok(emp);
}

ResponseCache Attribute
The ResponseCache attribute for an ASP.NET Core application specifies the properties for configuring the relevant response caching headers. This attribute can be used for specific endpoints or at the controller level.

Let's update the API endpoint with the ResponseCache attribute.
[HttpGet("{id}")]
[ResponseCache(Duration = 120, Location = ResponseCacheLocation.Any)]
public IActionResult GetById(Guid id)
{
    var emp = _employeeService.GetById(id);

    if (emp == null)
    {
        return NotFound();
    }

    return Ok(emp);
}

This Duration property produces the max-age value, which sets the cache duration to two minutes (120 seconds). In a similar manner, the Location property sets the location portion of the cache-control header. Because we set the location to Any, which corresponds to the public directive of the cache-control header, both the client and intermediate shared caches can cache the response.

Let us now access the API endpoint and confirm these contents within the response headers.
cache-control: public,max-age=120

Furthermore, if we invoke the endpoint repeatedly and the browser serves the response from its cache, this is indicated in the status code:
Status code: 200 (from disk cache)

Let's now examine the various ResponseCache parameter options.

All we have to do is update the Location property to ResponseCacheLocation.Client in order to change the cache location to private.
[ResponseCache(Duration = 60, Location = ResponseCacheLocation.Client)]

By doing this, the cache-control header value will be altered to private, indicating that the response can only be cached by the client.
cache-control: private,max-age=60

Let's now set the Location parameter to ResponseCacheLocation.None:
[ResponseCache(Duration = 60, Location = ResponseCacheLocation.None)]

This will cause the client to be unable to use a cached response without revalidating with the server, as it will set the cache-control and pragma headers to no-cache:

cache-control: no-cache,max-age=60

We can confirm that the server always generates a fresh response in this configuration, and the browser never uses the cached response.

NoStore Property
Let's now set the ResponseCache attribute's NoStore property to true.
[ResponseCache(Duration = 60, Location = ResponseCacheLocation.Any, NoStore = true)]

This will cause the response's cache-control header to be set to no-store, telling the client not to cache the response.
cache-control: no-store

Keep in mind that this takes precedence over the Location value we set. In this case, the client won't store the response in its cache at all.

Though the cache-control no-cache and no-store values might produce identical test results, different browsers, clients, and proxies interpret these headers in different ways. No-cache simply means that the client should not use a cached response without revalidating with the server, whereas no-store instructs clients or proxies to not store the response or any portion of it anywhere.

The VaryByHeader property of the ResponseCache attribute can be used to set the vary header.
[ResponseCache(Duration = 60, Location = ResponseCacheLocation.Any, VaryByHeader = "User-Agent")]

Here, we set the VaryByHeader property to User-Agent, which means that the cached response is used only as long as the request originates from the same client device. If the User-Agent value changes, a different client device is involved, and the server generates a fresh response. Let us confirm this.

Let's first see if the response headers contain the vary header.
cache-control: public,max-age=60
vary: User-Agent

Consequently, we can force the server to send a fresh response for a different device by configuring the VaryByHeader property to User-Agent.

VaryByQueryKeys Property
When the specified query string parameters change, we can force the server to send a new response by utilizing the VaryByQueryKeys property of the ResponseCache attribute. Naturally, if we set the value to "*," we can create a fresh response each time a query string parameter changes.

For instance, if the ID value in the URI changes, we might want to produce a fresh response.
    …/emp?id=53C68EE5-107B-42DB-821E-E1F893C5BDA3
    …/emp?id=6E5692FA-EEF6-426A-B280-EF444CB2BA1E

To do this, let's alter the Get action to add the id parameter and supply the ResponseCache attribute's VaryByQueryKeys property.
[ResponseCache(Duration = 60, Location = ResponseCacheLocation.Any, VaryByQueryKeys = new[] { "id" })]

Recall that in order to set the VaryByQueryKeys property, we must activate the Response Caching Middleware. The code will raise a runtime exception if it doesn't.

Response Cache Middleware
When a response can be cached, the ASP.NET Core application's Response Caching Middleware establishes this and stores and serves the response from the cache.

The Response Caching Middleware can be enabled by adding a few lines of code to the Program class.

// service inject
builder.Services.AddResponseCaching();

// in the middleware
app.UseResponseCaching();

The AddResponseCaching() method registers the required caching services. Afterwards, the UseResponseCaching() method adds the middleware to the request pipeline.

That is all. Now that the Response Caching Middleware has been turned on, the VaryByQueryKeys property ought to function.

Now that the application is running, let's check the response cache.

It is evident that if the query string remains unchanged, we will receive a cached response; however, if we modify the query string, the server will send a fresh response. Let's examine the response cache and modify the value of the query string.


Take note that the VaryByQueryKeys property does not have a corresponding HTTP header; the Response Caching Middleware manages this feature on the server.

With that, we have covered the main response caching options in ASP.NET Core.

Happy coding!


 



European ASP.NET Core 9.0 Hosting - HostForLIFE :: An Explanation of RSA Encryption and Decryption in ASP.NET Core and .NET Framework

clock September 2, 2024 08:39 by author Peter

The RSA algorithm is an asymmetric cryptography algorithm. Asymmetric means that it operates on a key pair: as implied by the name, the private key is kept secret while the public key can be distributed to everybody. I've used the BouncyCastle package for RSA encryption and decryption in the sample below.

Notes

  • Key Size: The example uses a 2048-bit key, which is a common and secure size for RSA.
  • Encoding: Data is encoded as UTF-8 before encryption and decoded back after decryption.
  • Security: Always handle and store keys securely. Exposing private keys or mishandling encrypted data can compromise security.
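
The methods below expect PEM-encoded keys (the publicKeyPem and pemPrivateKey placeholders). If you still need a key pair, a minimal sketch of generating a 2048-bit pair as PEM strings with BouncyCastle (after installing the package in Step 1) could look like the following; GenerateRsaKeyPairPem is just an illustrative name:

using System.IO;
using Org.BouncyCastle.Crypto;
using Org.BouncyCastle.Crypto.Generators;
using Org.BouncyCastle.OpenSsl;
using Org.BouncyCastle.Security;

public static (string PublicKeyPem, string PrivateKeyPem) GenerateRsaKeyPairPem()
{
    // Generate a 2048-bit RSA key pair
    var generator = new RsaKeyPairGenerator();
    generator.Init(new KeyGenerationParameters(new SecureRandom(), 2048));
    AsymmetricCipherKeyPair keyPair = generator.GenerateKeyPair();

    // Write the public key as a PEM string
    string publicKeyPem;
    using (var writer = new StringWriter())
    {
        new PemWriter(writer).WriteObject(keyPair.Public);
        publicKeyPem = writer.ToString();
    }

    // Write the private key as a PEM string
    string privateKeyPem;
    using (var writer = new StringWriter())
    {
        new PemWriter(writer).WriteObject(keyPair.Private);
        privateKeyPem = writer.ToString();
    }

    return (publicKeyPem, privateKeyPem);
}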

Step 1. First, you need to install the BouncyCastle package. You can do this via NuGet.
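
Depending on which BouncyCastle distribution you prefer, this is typically one of:
dotnet add package BouncyCastle.Cryptography
dotnet add package Portable.BouncyCastle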

Step 2. Import the required namespaces into the service.
using System.IO;
using System.Security.Cryptography;
using System.Text;
using Org.BouncyCastle.Crypto;
using Org.BouncyCastle.Crypto.Encodings;
using Org.BouncyCastle.Crypto.Engines;
using Org.BouncyCastle.Crypto.Parameters;
using Org.BouncyCastle.OpenSsl;
using Org.BouncyCastle.Security;


Encryption Complete method
public string EncryptRSA(string plaintext)
{
    string encryptedText = "";
    byte[] plaintextBytes = Encoding.UTF8.GetBytes(plaintext);

    // Load the public key from a PEM string or file
    string publicKeyPem = @""; // Replace your public key here
    AsymmetricKeyParameter publicKey;
    using (var reader = new StringReader(publicKeyPem))
    {
        PemReader pemReader = new PemReader(reader);
        publicKey = (AsymmetricKeyParameter)pemReader.ReadObject();
    }
    // Initialize the RSA engine for encryption with the public key
    IAsymmetricBlockCipher rsaEngine = new Pkcs1Encoding(new RsaEngine());
    rsaEngine.Init(true, publicKey); // true for encryption
    // Encrypt the data
    byte[] encryptedData = rsaEngine.ProcessBlock(plaintextBytes, 0, plaintextBytes.Length);
    // Convert the encrypted data to a Base64 string for easy transmission/storage
    encryptedText = Convert.ToBase64String(encryptedData);
    return encryptedText;
}


Breakdown of the encryption method

Step 1
plaintextBytes: The input string is converted to a byte array using UTF-8 encoding. This byte array represents the data to be encrypted.
byte[] plaintextBytes = Encoding.UTF8.GetBytes(plaintext);

Step 2

StringReader: The publicKeyPem string is passed to a StringReader to create a text reader.
PemReader: This reads the PEM-formatted key and converts it into an AsymmetricKeyParameter object.
using (var reader = new StringReader(publicKeyPem))
{
    PemReader pemReader = new PemReader(reader);
    publicKey = (AsymmetricKeyParameter)pemReader.ReadObject();
}

Step 3

  • IAsymmetricBlockCipher: This interface represents the RSA encryption engine.
  • Pkcs1Encoding: This wraps the RsaEngine to add PKCS#1 padding, which is commonly used in RSA encryption.
  • Init Method: The RSA engine is initialized for encryption by passing true along with the public key.

IAsymmetricBlockCipher rsaEngine = new Pkcs1Encoding(new RsaEngine());
rsaEngine.Init(true, publicKey); // true for encryption


Step 4

  • ProcessBlock: This method processes the data (encrypts it) using the initialized RSA engine. It takes the plaintext bytes and returns the encrypted byte array.
  • Convert.ToBase64String: The encrypted byte array is converted to a Base64 string. Base64 encoding is used to make the encrypted data easier to transmit or store, as it converts binary data into ASCII string format.

byte[] encryptedData = rsaEngine.ProcessBlock(plaintextBytes, 0, plaintextBytes.Length);
encryptedText = Convert.ToBase64String(encryptedData);

Output Sample for RSA Encryption

Complete decryption method
public string DecryptRSA(string encryptedText)
{
    string decryptedText = "";
    try
    {

        string pemPrivateKey = @""; // Replace your private key here
        RsaPrivateCrtKeyParameters keyPair;
        using (var reader = new StringReader(pemPrivateKey))
        {
            keyPair = (RsaPrivateCrtKeyParameters)new PemReader(reader).ReadObject();
        }
        var rsaParams = DotNetUtilities.ToRSAParameters(keyPair);
        using (var rsa = new RSACryptoServiceProvider())
        {
            rsa.ImportParameters(rsaParams);
            // Convert encrypted text from Base64
            byte[] encryptedData = Convert.FromBase64String(encryptedText);
            // Decrypt the data
            byte[] decryptedData = rsa.Decrypt(encryptedData, RSAEncryptionPadding.Pkcs1);
            return Encoding.UTF8.GetString(decryptedData);
        }
    }
    catch { }
    return decryptedText;
}


Output sample for decryption
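Before the conclusion, here is a quick round-trip usage sketch. It assumes the two methods live on the same class (RsaCryptoService is just an illustrative name) and that the placeholder PEM strings have been replaced with a matching public/private key pair:

var crypto = new RsaCryptoService();               // hypothetical class holding EncryptRSA/DecryptRSA
string cipherText = crypto.EncryptRSA("Hello, RSA!");
string plainText  = crypto.DecryptRSA(cipherText); // "Hello, RSA!"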


Conclusion

  • The method loads a public RSA key from a PEM string.
  • It initializes an RSA encryption engine using BouncyCastle.
  • The plaintext is encrypted using the public key, and the resulting encrypted data is returned as a Base64-encoded string.

This encryption method ensures that the plaintext is securely transformed into an encrypted format using the RSA algorithm, which can then only be decrypted by the corresponding private key.




European ASP.NET Core 9.0 Hosting - HostForLIFE :: Stop ASP.NET Replay and Session Fixation Attacks

clock August 26, 2024 07:28 by author Peter

Session management is an essential component of web application security. In this post we address a widespread issue that lets potential attackers reuse existing session IDs to gain unauthorized access to ASP.NET sessions even after the user has logged out. We'll go over how to apply best practices such as SSL/TLS and secure cookie settings, how to properly invalidate sessions upon logout, and how to regenerate session IDs upon login. By following these guidelines you can safeguard your ASP.NET application against replay and session fixation attacks and provide a safer environment for your users.

How to Handle the Problem
1. Upon logout, invalidate the session

Make sure that when the user logs out, the session is appropriately invalidated. You can accomplish this by using Session.Abandon() in your logout procedure.
Example in the logout action
public ActionResult Logout()
{
    // Clear the session
    Session.Abandon();
    Session.Clear();
    // Clear authentication cookies
    FormsAuthentication.SignOut();
    // Redirect to the login page or home page
    return RedirectToAction("Login", "Account");
}

Explanation

  • Session.Abandon() marks the session as abandoned, which means that the session will no longer be used and a new session will be created for the next request.
  • Session.Clear() removes all items from the session.
  • FormsAuthentication.SignOut() logs the user out and clears the authentication ticket.

2. Regenerate the Session ID Upon Login
It's important to regenerate the session ID after a successful login to prevent session fixation attacks. This ensures that any previous session ID is no longer valid.

Example

public ActionResult Login(LoginViewModel model)
{
    if (ModelState.IsValid)
    {
        // Authenticate the user
        var isAuthenticated = Membership.ValidateUser(model.Username, model.Password);
        if (isAuthenticated)
        {
            // Regenerate session ID to prevent session fixation
            SessionIDManager manager = new SessionIDManager();
            string newSessionId = manager.CreateSessionID(HttpContext.Current);
            bool redirected = false;
            bool isAdded = false;
            manager.SaveSessionID(HttpContext.Current, newSessionId, out redirected, out isAdded);
            // Set authentication cookie
            FormsAuthentication.SetAuthCookie(model.Username, model.RememberMe);
            return RedirectToAction("Index", "Home");
        }
    }
    return View(model);
}

Explanation
The SessionIDManager is used to create a new session ID, which is then saved to the current session. This ensures that after logging in, the user’s session ID is different from the one used before authentication.

3. Enforce Session ID expiration
Set a shorter session timeout to minimize the risk of an old session ID being used.

Web.config setting

<system.web>
    <sessionState timeout="20" />
</system.web>

Explanation
The timeout attribute specifies the number of minutes a session can be idle before it is abandoned. A shorter timeout can reduce the risk of session reuse.

4. Use SSL/TLS
Ensure that your application uses SSL/TLS to protect the session ID in transit. This prevents attackers from capturing the session ID via network sniffing.
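One common way to enforce this in a classic ASP.NET MVC project (a sketch, not from the original article; adjust to your own FilterConfig) is to register the RequireHttps filter globally so every non-HTTPS request is redirected:

using System.Web.Mvc;

public class FilterConfig
{
    public static void RegisterGlobalFilters(GlobalFilterCollection filters)
    {
        // Redirect all HTTP requests to HTTPS.
        filters.Add(new RequireHttpsAttribute());
    }
}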

5. Secure Cookies
Mark the session cookies as HttpOnly and Secure to prevent client-side access to the session ID and ensure they are only transmitted over secure connections.

Web.config setting

<system.web>
    <authentication mode="Forms">
        <forms requireSSL="true" />
    </authentication>
    <sessionState cookieSameSite="Strict" />
</system.web>

Explanation

  • requireSSL="true" ensures that cookies are only sent over HTTPS.
  • cookieSameSite="Strict" helps prevent CSRF attacks by limiting the conditions under which cookies are sent.
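The snippet above does not show how to mark the cookies HttpOnly; one way to do that globally in classic ASP.NET (an addition, not part of the original configuration) is the httpCookies element:

<system.web>
    <!-- Marks cookies HttpOnly and ensures they are only sent over HTTPS. -->
    <httpCookies httpOnlyCookies="true" requireSSL="true" />
</system.web>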

Summary
By ensuring that the session is properly invalidated on logout, regenerating the session ID upon login, setting session expiration policies, and securing your application with SSL/TLS and secure cookies, you can effectively mitigate the risk of session fixation and replay attacks. This will enhance the overall security of your ASP.NET application.




European ASP.NET Core 9.0 Hosting - HostForLIFE :: How to Begin ASP.NET Core Integration Testing?

clock August 21, 2024 08:52 by author Peter

Integration testing evaluates several software components together, from beginning to end, rather than in isolation. In a structured development process, integration testing takes unit-tested modules as input, aggregates them into a larger set, and runs integration tests according to the test plan to produce output results that feed into system testing. Integration testing is crucial because it lets you exercise the system under realistic conditions and gain valuable insight into how it behaves.

Difference between unit testing and integration testing
In unit testing, we concentrate on a small section of code, typically one function or method. The aim is to verify that this piece of code works and delivers the desired outcome. External dependencies such as databases, APIs, or services are typically mocked so that the logic can be tested with fictitious data.

Integration testing examines how various system components interact with one another. Integration tests, as opposed to unit tests, verify that the integrated components function as intended in a realistic setting by using real or in-memory databases and other services. For a deeper understanding, let's dive right into the code.

Setting Up the .NET Core Project for Integration Testing
In this project, we will create a test environment using Docker containers, spinning up the test database from the Postgres Docker image. Create a test project inside your solution using the .NET xUnit template.

Let's install all the necessary Nuget packages. Below is a list of packages that are required for testing.

  • Microsoft.AspNetCore.Mvc.Testing
  • AutoFixture
  • AutoFixture.AutoMoq
  • Testcontainers
  • Testcontainers.PostgreSql

To set up the test environment, we need the WebApplicationFactory class provided by Microsoft.

Understanding the WebApplicationFactory

WebApplicationFactory<TEntryPoint> is used to create a TestServer for integration tests. `TEntryPoint` refers to the entry point class of the System Under Test (SUT), which is usually the Program.cs class.

To use this in our testing project, we first need to expose the Program.cs class. The reason for this is to inform the testing project that this is the entry point of the system.
There are two ways to expose the Program.cs class to the testing project.
First, add the below XML in your starting project (WebAPI project).

The second is to create a partial class with the name public partial class Program { }

Setting up the Postgres container for our testing environment requires a customized version of the WebApplicationFactory class. The version shown below inherits from WebApplicationFactory and implements the IAsyncLifetime interface. With this implementation, all resources acquired by the test environment, including the Docker container, are released once the tests in Visual Studio Test Explorer have finished, ensuring that the Postgres container is disposed of appropriately.
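Here is a minimal sketch of such a factory (TWebAppFactory matches the name referenced later in this post; AppDbContext and the Npgsql EF Core provider are assumptions about your application):

using Microsoft.AspNetCore.Hosting;
using Microsoft.AspNetCore.Mvc.Testing;
using Microsoft.AspNetCore.TestHost;
using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.DependencyInjection.Extensions;
using Testcontainers.PostgreSql;
using Xunit;

public class TWebAppFactory : WebApplicationFactory<Program>, IAsyncLifetime
{
    private readonly PostgreSqlContainer _dbContainer =
        new PostgreSqlBuilder().WithImage("postgres:latest").Build();

    protected override void ConfigureWebHost(IWebHostBuilder builder)
    {
        builder.ConfigureTestServices(services =>
        {
            // Swap the application's database registration for one that
            // points at the throwaway Postgres container (AppDbContext is illustrative).
            services.RemoveAll(typeof(DbContextOptions<AppDbContext>));
            services.AddDbContext<AppDbContext>(options =>
                options.UseNpgsql(_dbContainer.GetConnectionString()));
        });
    }

    // Started before the tests in the class run...
    public Task InitializeAsync() => _dbContainer.StartAsync();

    // ...and disposed once they have finished, releasing the container.
    public new Task DisposeAsync() => _dbContainer.DisposeAsync().AsTask();
}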

We also need to set up Docker Compose to fetch the latest PostgreSQL image. Right-click on the solution, then choose the "Container Orchestration Support" option and select Docker Compose. This will automatically add the required files to your solution. Add the below to your docker-compose.yml file.
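A minimal compose entry with placeholder credentials might look like this:

services:
  postgres:
    image: postgres:latest
    environment:
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: postgres
      POSTGRES_DB: testdb
    ports:
      - "5432:5432"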

Test classes implement the IClassFixture<T> interface to signal that the class contains tests and to share object instances among the tests in the class. To do that, let's implement IClassFixture<TWebAppFactory> and create a BaseIntegrationTest class.

To get a fully functional test environment, we must create the database and the necessary tables inside it after pulling the Postgres image, so include migration code in the BaseIntegrationTest class's constructor as well (see the sketch below).
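A minimal sketch of that base class, under the same assumptions as above (AppDbContext is illustrative):

using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.DependencyInjection;
using Xunit;

public abstract class BaseIntegrationTest : IClassFixture<TWebAppFactory>
{
    protected readonly HttpClient Client;
    protected readonly AppDbContext DbContext;

    protected BaseIntegrationTest(TWebAppFactory factory)
    {
        Client = factory.CreateClient();

        var scope = factory.Services.CreateScope();
        DbContext = scope.ServiceProvider.GetRequiredService<AppDbContext>();
        DbContext.Database.Migrate(); // create the database schema inside the container
    }
}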

Now the test environment is ready, so let's move on to the actual test and create a test class; an example follows.
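An illustrative test class (the "/api/products" endpoint is a placeholder for whatever your API exposes):

using Xunit;

public class ProductApiTests : BaseIntegrationTest
{
    public ProductApiTests(TWebAppFactory factory) : base(factory)
    {
    }

    [Fact]
    public async Task GetProducts_ReturnsSuccessStatusCode()
    {
        var response = await Client.GetAsync("/api/products");
        response.EnsureSuccessStatusCode();
    }
}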

Now let's run the test from the Visual Studio Test Explorer; make sure Docker Desktop is installed and running on your PC. When we run the test, it will first fetch the Postgres image with the configured tag (for example, postgres:latest) from Docker Hub and set up the container with the configuration provided in the docker-compose.yml file.


Docker fetched the PostgreSQL image for us.

Here, you can also create containers, and those containers will automatically be disposed of after the test finishes.

As you can see from the result, our test has passed as well.

Conclusion
Integration testing ensures that different components of an application work together correctly, validating end-to-end functionality and detecting issues not covered by unit tests. It is crucial for identifying integration problems and verifying that the system meets its requirements as a whole. Thanks for reading!



European ASP.NET Core 9.0 Hosting - HostForLIFE :: ASP.NET Core API Integration with Stripe for Subscription Payments

clock August 19, 2024 07:56 by author Peter

Step 1: Create an account on Stripe and obtain credentials
You must create a Stripe account and obtain your API keys before you can begin the integration. Follow these steps.

  • Visit stripe.com to register or log in to Stripe.
  • Open the Dashboard and find the API area.
  • Make a copy of your publishable key and secret key. These keys are going to be used for Stripe application authentication.

Step 2: Create an application using the .NET API
To integrate Stripe, create a new ASP.NET Core API application. If the .NET SDK isn't already installed on your computer, download it from the .NET website.
Launch a terminal or command prompt.

The command to start a new API project is as follows.
dotnet new webapi -n SubscriptionSystem

Navigate to the project directory.
cd SubscriptionSystem

Step 3: Set Up Stripe in .NET
To use Stripe, install the Stripe NuGet package in your .NET application.

Install the Stripe package.
dotnet add package Stripe.net

Add your Stripe secret key to the configuration.

Open appsettings.json and add the following.
{
  "Stripe": {
    "SecretKey": "your_stripe_secret_key"
  }
}
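One step not shown explicitly above: Stripe.net authenticates through the static StripeConfiguration.ApiKey property, so the secret key from appsettings.json needs to be assigned at startup. A minimal way to do that in Program.cs is:

using Stripe;

// Program.cs — make the secret key from configuration available to Stripe.net.
StripeConfiguration.ApiKey = builder.Configuration["Stripe:SecretKey"];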

Step 4: Implement Subscription Functionality
Now, let's implement the subscription functionality. We will create a StripeController to handle the subscription process.

Creating the DTOs
First, create the necessary DTOs for handling Stripe data.

PaymentDto.cs

namespace SubscriptionSystem.Dtos
{
    public class PaymentDto
    {
        public string PaymentMethodId { get; set; }
        public string CustomerId { get; set; }
    }
}

StripePaymentRequestDto.cs

namespace SubscriptionSystem.Dtos
{
    public class StripePaymentRequestDto
    {
        public string Email { get; set; }
        public string PaymentMethodId { get; set; }
    }
}


StripeProductDto.cs
namespace SubscriptionSystem.Dtos
{
    public class StripeProductDto
    {
        public string Id { get; set; }
        public string Name { get; set; }
        public long Amount { get; set; }
        public string Currency { get; set; }
        public string Interval { get; set; }
    }
}


SubscriptionDto.cs
namespace SubscriptionSystem.Dtos
{
    public class SubscriptionDto
    {
        public string SubscriptionId { get; set; }
        public string CustomerId { get; set; }
        public string ProductId { get; set; }
    }
}

Creating the Service Interface and Implementation
Create an interface for the Stripe service and its implementation.

IStripeService.cs
using SubscriptionSystem.Dtos;

namespace SubscriptionSystem.Interfaces
{
    public interface IStripeService
    {
        Task<string> CreateCustomerAsync(string email, string paymentMethodId);
        Task<string> CreateSubscriptionAsync(string customerId, string priceId);
        Task CancelSubscriptionAsync(string subscriptionId);
        Task<StripeProductDto> CreateProductAsync(string name, long amount, string currency, string interval);
    }
}

StripeService.cs
using Stripe;
using SubscriptionSystem.Dtos;
using SubscriptionSystem.Interfaces;

namespace SubscriptionSystem.Services
{
    public class StripeService : IStripeService
    {
        public async Task<string> CreateCustomerAsync(string email, string paymentMethodId)
        {
            var options = new CustomerCreateOptions
            {
                Email = email,
                PaymentMethod = paymentMethodId,
                InvoiceSettings = new CustomerInvoiceSettingsOptions
                {
                    DefaultPaymentMethod = paymentMethodId
                }
            };
            var service = new CustomerService();
            var customer = await service.CreateAsync(options);
            return customer.Id;
        }

        public async Task<string> CreateSubscriptionAsync(string customerId, string priceId)
        {
            var options = new SubscriptionCreateOptions
            {
                Customer = customerId,
                Items = new List<SubscriptionItemOptions>
                {
                    new SubscriptionItemOptions { Price = priceId }
                },
                Expand = new List<string> { "latest_invoice.payment_intent" }
            };
            var service = new SubscriptionService();
            var subscription = await service.CreateAsync(options);
            return subscription.Id;
        }

        public async Task CancelSubscriptionAsync(string subscriptionId)
        {
            var service = new SubscriptionService();
            await service.CancelAsync(subscriptionId);
        }

        public async Task<StripeProductDto> CreateProductAsync(string name, long amount, string currency, string interval)
        {
            var productOptions = new ProductCreateOptions
            {
                Name = name,
            };
            var productService = new ProductService();
            var product = await productService.CreateAsync(productOptions);

            var priceOptions = new PriceCreateOptions
            {
                UnitAmount = amount,
                Currency = currency,
                Recurring = new PriceRecurringOptions { Interval = interval },
                Product = product.Id,
            };
            var priceService = new PriceService();
            var price = await priceService.CreateAsync(priceOptions);

            return new StripeProductDto
            {
                Id = price.Id,
                Name = product.Name,
                Amount = price.UnitAmount.Value,
                Currency = price.Currency,
                Interval = price.Recurring.Interval
            };
        }
    }
}

Creating the Controller
Create a StripeController to handle the subscription process.
StripeController.cs

using Microsoft.AspNetCore.Mvc;
using SubscriptionSystem.Dtos;
using SubscriptionSystem.Interfaces;

namespace SubscriptionSystem.Controllers
{
    [ApiController]
    [Route("api/[controller]")]
    public class StripeController : ControllerBase
    {
        private readonly IStripeService _stripeService;

        public StripeController(IStripeService stripeService)
        {
            _stripeService = stripeService;
        }

        [HttpPost("create-customer")]
        public async Task<IActionResult> CreateCustomer([FromBody] StripePaymentRequestDto paymentRequest)
        {
            var customerId = await _stripeService.CreateCustomerAsync(paymentRequest.Email, paymentRequest.PaymentMethodId);
            return Ok(new { CustomerId = customerId });
        }

        [HttpPost("create-subscription")]
        public async Task<IActionResult> CreateSubscription([FromBody] SubscriptionDto subscriptionDto)
        {
            var subscriptionId = await _stripeService.CreateSubscriptionAsync(subscriptionDto.CustomerId, subscriptionDto.ProductId);
            return Ok(new { SubscriptionId = subscriptionId });
        }

        [HttpPost("cancel-subscription")]
        public async Task<IActionResult> CancelSubscription([FromBody] SubscriptionDto subscriptionDto)
        {
            await _stripeService.CancelSubscriptionAsync(subscriptionDto.SubscriptionId);
            return NoContent();
        }

        [HttpPost("create-product")]
        public async Task<IActionResult> CreateProduct([FromBody] StripeProductDto productDto)
        {
            var product = await _stripeService.CreateProductAsync(productDto.Name, productDto.Amount, productDto.Currency, productDto.Interval);
            return Ok(product);
        }
    }
}

Step 5: Handling Stripe Webhooks
Stripe webhooks allow your application to receive notifications about changes to your customer's subscription status. To handle webhooks.

  1. Create a Webhook Endpoint: This endpoint will receive webhook events from Stripe.
  2. Verify the Webhook Signature: Ensure that the event is from Stripe by verifying the signature.

Setting up the webhook in StripeController.cs. This action goes inside the controller shown above and additionally requires a using Stripe; directive for EventUtility, Events, Subscription, and StripeException.

[HttpPost("webhook")]
public async Task<IActionResult> Webhook()
{
    var json = await new StreamReader(HttpContext.Request.Body).ReadToEndAsync();
    try
    {
        var stripeEvent = EventUtility.ConstructEvent(
            json,
            Request.Headers["Stripe-Signature"],
            "your_stripe_webhook_secret"
        );

        // Handle the event
        if (stripeEvent.Type == Events.CustomerSubscriptionCreated)
        {
            var subscription = stripeEvent.Data.Object as Subscription;
            // Handle the subscription creation
        }
        else if (stripeEvent.Type == Events.CustomerSubscriptionDeleted)
        {
            var subscription = stripeEvent.Data.Object as Subscription;
            // Handle the subscription cancellation
        }

        return Ok();
    }
    catch (StripeException e)
    {
        return BadRequest();
    }
}

Update the Dependency Injection in the Program.cs

Make sure to register the Stripe service in the dependency injection container.

Program.cs

builder.Services.AddScoped<IStripeService, StripeService>();
builder.Services.AddScoped<ProductService>();
builder.Services.AddScoped<SubscriptionService>();

Conclusion
In this article, we went through the steps to integrate Stripe for subscription payments in an ASP.NET Core API application. This includes setting up Stripe, creating the necessary DTOs, implementing the service interface, and creating a controller to handle subscription functions. This template allows you to easily manage customer creation, product development, and subscription lifecycles. Stripe’s integration enhances your application by providing secure and reliable payment processing.


 



About HostForLIFE

HostForLIFE is European Windows Hosting Provider which focuses on Windows Platform only. We deliver on-demand hosting solutions including Shared hosting, Reseller Hosting, Cloud Hosting, Dedicated Servers, and IT as a Service for companies of all sizes.

We have offered the latest Windows 2019 Hosting, ASP.NET 5 Hosting, ASP.NET MVC 6 Hosting and SQL 2019 Hosting.

