European ASP.NET 4.5 Hosting BLOG

BLOG about ASP.NET 4, ASP.NET 4.5 Hosting and Its Technology - Dedicated to European Windows Hosting Customer

European ASP.NET Core Hosting - HostForLIFE :: Discover how to integrate Firebase with .NET

April 14, 2025 07:19 by author Peter

Firebase provides a free real-time database on which we can perform the standard CRUD operations: Insert, Update, Get, and Delete. Below, I've broken down how to set up a project in the Firebase console. The first step is to open the Firebase console and log into your account. The second step is to open the Dashboard, click Create Project, and enter the name of your project.

 

The third step is to create a Realtime Database: select test mode and press the "Enable" button. Your database has now been built, and you can use this configuration for CRUD operations. If you don't yet have a Firebase account, register for one first; the Firebase console will guide you through creating a project.

The steps below take you through the Firebase integration with .NET. First, create a .NET project, then install the NuGet package FireSharp, version 2.0.4.

Now, you will need credentials to perform the CRUD operation.

To connect with your real-time database, copy the base path from the console app.

Next, we need the AuthSecret key, which you can fetch from Project Settings > Service Accounts > Database Secrets.


Lastly, let’s write code in .NET.
// Firebase configuration
IFirebaseConfig config = new FirebaseConfig()
{
    AuthSecret = "**********x8Ed6HVU0YXlXW-L75ho4ps",
    BasePath = "https://we****.firebaseio.com/"
};

// Initialize the Firebase client
IFirebaseClient client = new FireSharp.FirebaseClient(config);

// Create a user object
User user = new User()
{
    Id = 1,
    FirstName = "Test 1",
    LastName = "Test 2"
};

// Insert data
var setResponse = client.Set("User/" + user.Id, user);

// Update data
var updateResponse = client.Update("User/" + user.Id, user);

// Retrieve data
var getResponse = client.Get("User/" + user.Id);

// Delete data
var deleteResponse = client.Delete("User/" + user.Id);
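To consume what Get returns, FireSharp can map the JSON body back onto the model via its `ResultAs<T>` helper. A brief sketch, assuming the same `client` and `User` model as above:

```csharp
// Fetch the record and deserialize the response body into the User model.
FirebaseResponse response = client.Get("User/" + user.Id);
User fetched = response.ResultAs<User>();

Console.WriteLine($"{fetched.FirstName} {fetched.LastName}");
```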


To explore more classes, please visit the official doc of Firebase Admin .NET SDK: firebase.google.com/docs/reference/admin/dotnet



European ASP.NET Core Hosting - HostForLIFE :: Creating Custom Components in Blazor

April 11, 2025 09:33 by author Peter

Blazor is an open-source web framework created by Microsoft that enables developers to build interactive web apps with C# and .NET. Blazor uses a component-based architecture to build modular and reusable user interface components. Custom components are essential for building complex, reusable web applications. In this article, we will use examples to demonstrate how to develop custom components in Blazor.

Prerequisites
Before we begin, ensure you have the following set up on your development environment:

  • Visual Studio 2022.
  • Basic knowledge of C# and HTML.

Understanding Components in Blazor
Components in Blazor are similar to user controls in other web frameworks. They are self-contained pieces of code that contain both markup and logic. Components can be composed and nested to create complex UI elements. In Blazor, components can be created using Razor syntax or using C# code. There are two types of components in Blazor:

  • Razor Components: These are defined using Razor syntax (.razor files) and allow for a mix of HTML and C# code.
  • Code-Behind Components: These are defined using C# classes and are more suitable for more complex logic or when you want to separate the UI and C# code.

In this article, we'll focus on creating custom Razor components.

Step 1. Create a New Blazor Project
Let's start by creating a new Blazor project. Open Visual Studio and follow these steps:

  • Click on "Create a new project."
  • In the "Create a new project" dialog, search for "Blazor WebAssembly App," select the "Blazor WebAssembly App" template and click on "Next".
  • Choose a name and location for your project, and click "Next".
  • Choose the ".NET 7.0" option from the framework and click "Create" to generate the project.

Step 2. Add a Custom Component
In this example, we'll create a simple custom component that displays a welcome message with the ability to customize the name.

  • Right-click on the "Pages" folder in your Blazor project, and select "Add" > "New Folder." Name the folder "Components."
  • Right-click on the newly created "Components" folder, and select "Add" > "New Item."
  • In the "Add New Item" dialog, search for "Razor Component" and select the "Razor Component" template.
  • Name the component "WelcomeMessage.razor" and click "Add."

Step 3. Define the Custom Component
Now, let's define the content of our custom component. Open the "WelcomeMessage.razor" file and replace its content with the following code.
@code {
    [Parameter] public string Name { get; set; } = "Guest";
}
<h3>Welcome, @Name!</h3>

In this code, we have a simple Razor component with a parameter named "Name." The parameter represents the name of the user to display in the welcome message. We've set a default value of "Guest" in case the name is not provided.

Step 4. Using the Custom Component

Now that we have our custom component defined let's use it in one of our existing Blazor pages. Open the "Index.razor" file located in the "Pages" folder and add the following line at the top of the file to import the "WelcomeMessage" component.
@page "/"

@using YourAppName.Components


Next, add the following code within the existing <div> tag in the "Index.razor" file:
<WelcomeMessage Name="Peter" />

This line of code will render the "WelcomeMessage" component with the name "Peter".

Step 5. Build and Run the Application

With the custom component in place, we can now build and run the application to see it in action. Press Ctrl + F5 or click the "Start Debugging" button in Visual Studio to build and run the application.
Once the application loads in your browser, you should see the welcome message, "Welcome, Peter!" If you don't see the name, check if you've correctly implemented the custom component.

How to Create Reusable Components?
One of the main benefits of using custom components in Blazor is the ability to create reusable UI elements. To create a reusable component, you can define it in a separate file and import it into other components as needed. Here's an example of a reusable component that displays a button.

Create a new component named SubmitButton and add the code below.
<button class="@ButtonClass" @onclick="OnClick">@ButtonText</button>

@code {
    [Parameter]
    public string ButtonText { get; set; } = "Button";

    [Parameter]
    public string ButtonClass { get; set; } = "btn-primary";

    [Parameter]
    public EventCallback<MouseEventArgs> OnClick { get; set; }
}


This component takes three parameters: the button text, the button class, and a callback that is triggered when the button is clicked. The default values for the button text and class are set in the component, but they can be overridden when the component is used.

To use this component in your application, you can add the following code to a Razor page.
<SubmitButton ButtonText="Click Me" ButtonClass="btn-success" OnClick="HandleClick" />
@code {
    private async Task HandleClick(MouseEventArgs args)
    {
        // Handle the button click event
    }
}


This will render a button with the text "Click Me" and the class "btn-success". When the button is clicked, the HandleClick method will be called.

Conclusion

Custom components are a powerful feature of Blazor that allow developers to create reusable and modular UI elements. By creating custom components, developers can build complex web applications more efficiently and with greater flexibility. In this article, we explored how to create custom components in Blazor using examples. We hope this article has been helpful in understanding how to create custom components in Blazor.

HostForLIFE ASP.NET Core Hosting

European best, cheap, and reliable ASP.NET hosting with instant activation. HostForLIFE.eu is the #1 recommended Windows and ASP.NET hosting provider on the European continent, with a 99.99% uptime guarantee for reliability, stability, and performance. The HostForLIFE.eu security team constantly monitors the entire network for unusual behaviour. We deliver hosting solutions including shared hosting, cloud hosting, reseller hosting, dedicated servers, and IT as a Service for companies of all sizes.



European ASP.NET Core Hosting - HostForLIFE :: ASP.NET Core Advanced APIs: Versioning, EF Core, and Middleware

April 7, 2025 10:15 by author Peter

ASP.NET Core Web API is my preferred framework for creating scalable web services, and I can't wait to tell you about it. In this tutorial, I'll walk you through how I built an API using Entity Framework Core, Dependency Injection, API versioning, and a little extra code to log requests and responses.

We're looking at sophisticated concepts for developing APIs that expand as your program gets larger, so this won't be your typical beginner's assignment. Let's get started.

Step 1. Setting Up the Project
First, I fired up Visual Studio and created a new ASP.NET Core Web API project. Here’s how I did it.

  • Open Visual Studio and click on Create a new project.
  • I chose ASP.NET Core Web API and clicked Next.
  • Named the project EmployeeManagementAPI—you can name it whatever you like—and clicked Create.
  • I selected .NET 7.0 and hit Create.

Once Visual Studio had set up the basic project structure, I was ready to roll. Next, it was time to integrate Entity Framework Core so I could store and manage employee data in a database.

Step 2. Integrating Entity Framework Core

I needed to hook up a database to store employee records. For this, I went with Entity Framework Core because it’s super flexible and easy to work with.

Installing EF Core

First things first, I installed the required packages via Package Manager Console.
Install-Package Microsoft.EntityFrameworkCore.SqlServer
Install-Package Microsoft.EntityFrameworkCore.Tools


With that out of the way, I moved on to creating a DbContext to represent the database. I created a folder called Data and added a new class called EmployeeContext. Here’s the code I put in.
using Microsoft.EntityFrameworkCore;
using EmployeeManagementAPI.Models;

namespace EmployeeManagementAPI.Data
{
    public class EmployeeContext : DbContext
    {
        public EmployeeContext(DbContextOptions<EmployeeContext> options)
            : base(options)
        {
        }

        public DbSet<Employee> Employees { get; set; }
    }
}


Next, I needed an Employee model. In the Models folder, I added the Employee.cs class.
namespace EmployeeManagementAPI.Models
{
    public class Employee
    {
        public int Id { get; set; }
        public string FirstName { get; set; }
        public string LastName { get; set; }
        public string Department { get; set; }
        public decimal Salary { get; set; }
    }
}


Configuring the Database Connection
With the DbContext and model in place, I needed to configure the connection string. I added the connection string in appsettings.json like this.

"ConnectionStrings": {
  "EmployeeConnection": "Server=(localdb)\\mssqllocaldb;Database=EmployeeManagementDB;Trusted_Connection=True;"
}

Then, in Program.cs, I added the following line to register EmployeeContext with Dependency Injection.

builder.Services.AddDbContext<EmployeeContext>(options =>
    options.UseSqlServer(builder.Configuration.GetConnectionString("EmployeeConnection")));


Running Migrations
Finally, I created the database using EF Core migrations. Here’s what I did.

  • Add-Migration InitialCreate
  • Update-Database

This created the Employees table in the database. With the database ready, it was time to move on to the service layer.

Step 3. Building the Service Layer
Rather than dumping all the logic into the controller, I created a service layer to handle employee operations. This helps keep the code cleaner and easier to maintain.
Creating the Service Interface and Implementation

In the Services folder, I added an interface, IEmployeeService, and its implementation, EmployeeService. Here's what I came up with.

First, the interface.

public interface IEmployeeService
{
    Task<IEnumerable<Employee>> GetAllEmployeesAsync();
    Task<Employee> GetEmployeeByIdAsync(int id);
    Task AddEmployeeAsync(Employee employee);
    Task UpdateEmployeeAsync(Employee employee);
    Task DeleteEmployeeAsync(int id);
}


Then, I implemented the interface in EmployeeService.cs.

public class EmployeeService : IEmployeeService
{
    private readonly EmployeeContext _context;

    public EmployeeService(EmployeeContext context)
    {
        _context = context;
    }

    public async Task<IEnumerable<Employee>> GetAllEmployeesAsync()
    {
        return await _context.Employees.ToListAsync();
    }

    public async Task<Employee> GetEmployeeByIdAsync(int id)
    {
        return await _context.Employees.FindAsync(id);
    }

    public async Task AddEmployeeAsync(Employee employee)
    {
        _context.Employees.Add(employee);
        await _context.SaveChangesAsync();
    }

    public async Task UpdateEmployeeAsync(Employee employee)
    {
        _context.Entry(employee).State = EntityState.Modified;
        await _context.SaveChangesAsync();
    }

    public async Task DeleteEmployeeAsync(int id)
    {
        var employee = await _context.Employees.FindAsync(id);
        if (employee != null)
        {
            _context.Employees.Remove(employee);
            await _context.SaveChangesAsync();
        }
    }
}


Now, I needed to register this service in Program.cs so it could be injected into the controllers.

builder.Services.AddScoped<IEmployeeService, EmployeeService>();


Step 4. Building the Employee Controller
With the service layer ready, I moved on to the controller. In the Controllers folder, I created EmployeesController.cs.

[Route("api/[controller]")]
[ApiController]
public class EmployeesController : ControllerBase
{
    private readonly IEmployeeService _employeeService;

    public EmployeesController(IEmployeeService employeeService)
    {
        _employeeService = employeeService;
    }

    [HttpGet]
    public async Task<IActionResult> GetAllEmployees()
    {
        var employees = await _employeeService.GetAllEmployeesAsync();
        return Ok(employees);
    }

    [HttpGet("{id}")]
    public async Task<IActionResult> GetEmployeeById(int id)
    {
        var employee = await _employeeService.GetEmployeeByIdAsync(id);
        if (employee == null)
        {
            return NotFound();
        }
        return Ok(employee);
    }

    [HttpPost]
    public async Task<IActionResult> AddEmployee([FromBody] Employee employee)
    {
        await _employeeService.AddEmployeeAsync(employee);
        return CreatedAtAction(nameof(GetEmployeeById), new { id = employee.Id }, employee);
    }

    [HttpPut("{id}")]
    public async Task<IActionResult> UpdateEmployee(int id, [FromBody] Employee employee)
    {
        if (id != employee.Id)
        {
            return BadRequest();
        }
        await _employeeService.UpdateEmployeeAsync(employee);
        return NoContent();
    }

    [HttpDelete("{id}")]
    public async Task<IActionResult> DeleteEmployee(int id)
    {
        await _employeeService.DeleteEmployeeAsync(id);
        return NoContent();
    }
}


This controller was straightforward and tied everything together. I now had a fully functional API for managing employees.

Step 5. Adding API Versioning
As the API grew, I knew I’d need to implement API versioning to ensure backward compatibility. I installed the versioning package.
Install-Package Microsoft.AspNetCore.Mvc.Versioning


Next, I configured versioning in Program.cs.
builder.Services.AddApiVersioning(options =>
{
    options.AssumeDefaultVersionWhenUnspecified = true;
    options.DefaultApiVersion = new ApiVersion(1, 0);
    options.ReportApiVersions = true;
});


Now, I could version my controllers like this.

[ApiVersion("1.0")]
[Route("api/v{version:apiVersion}/[controller]")]
[ApiController]
public class EmployeesV1Controller : ControllerBase
{
    // Version 1.0 controller code
}
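Out of the box, this setup reads the version from the URL segment defined in the route. If clients should also be able to send the version in a query string or header, the versioning package lets you combine readers. A hedged sketch (the `X-Api-Version` header name is my own choice, not from the article):

```csharp
builder.Services.AddApiVersioning(options =>
{
    options.AssumeDefaultVersionWhenUnspecified = true;
    options.DefaultApiVersion = new ApiVersion(1, 0);
    options.ReportApiVersions = true;

    // Accept the version from the URL segment, the query string, or a custom header.
    options.ApiVersionReader = ApiVersionReader.Combine(
        new UrlSegmentApiVersionReader(),
        new QueryStringApiVersionReader("api-version"),
        new HeaderApiVersionReader("X-Api-Version"));
});
```

With this in place, `GET /api/employees?api-version=1.0` and a request carrying `X-Api-Version: 1.0` both resolve to the same controller as `/api/v1/employees`.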


Step 6. Custom Middleware for Logging
One thing I always like to do is log requests and responses, especially when working with APIs. So, I wrote some custom middleware to log incoming requests and outgoing responses.

Here’s what my middleware looked like.

public class RequestLoggingMiddleware
{
    private readonly RequestDelegate _next;
    private readonly ILogger<RequestLoggingMiddleware> _logger;

    public RequestLoggingMiddleware(RequestDelegate next, ILogger<RequestLoggingMiddleware> logger)
    {
        _next = next;
        _logger = logger;
    }

    public async Task InvokeAsync(HttpContext context)
    {
        _logger.LogInformation($"Incoming request: {context.Request.Method} {context.Request.Path}");
        await _next(context);
        _logger.LogInformation($"Outgoing response: {context.Response.StatusCode}");
    }
}


Then, I registered this middleware in Program.cs.
app.UseMiddleware<RequestLoggingMiddleware>();

Now, every request and response was being logged, which made debugging much easier.
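As a small refinement, middleware is often exposed through an extension method so Program.cs reads more fluently. A sketch (the `UseRequestLogging` name is my own, not part of the original code):

```csharp
public static class RequestLoggingMiddlewareExtensions
{
    // Wraps UseMiddleware so callers can simply write app.UseRequestLogging().
    public static IApplicationBuilder UseRequestLogging(this IApplicationBuilder app)
        => app.UseMiddleware<RequestLoggingMiddleware>();
}
```

The registration in Program.cs then becomes `app.UseRequestLogging();`.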

Conclusion

And there you have it—an advanced Employee Management API built with ASP.NET Core Web API. We covered a lot of ground, from integrating Entity Framework Core to creating a solid service layer, and even added some extra touches like API versioning and custom middleware.

This is the kind of architecture that scales well and keeps things organized. If you’ve made it this far, your API is in great shape for future growth.

Next Steps

  • Consider adding authentication and authorization to secure the API (I recommend using JWT).
  • Look into caching to improve performance, especially for frequently accessed data.
  • Write unit tests for your services and controllers to ensure your API behaves as expected.

Happy coding!




European ASP.NET Core Hosting - HostForLIFE :: Using IOptions Pattern with Records in .NET 9.0

March 24, 2025 09:00 by author Peter

In modern .NET applications, effective configuration management is essential for ensuring flexibility and a clear separation of concerns. The IOptions<T> pattern is the preferred approach for injecting configuration settings into services. With .NET 9.0, records provide a concise and immutable way to represent configuration models. In this blog, we’ll explore how to utilize the IOptions<T> pattern efficiently with records.

Why Choose Records for Configuration?
Using records in C# for configuration settings offers several advantages:

  • Immutability: Prevents unintentional modifications to configuration values.
  • Value-based equality: Instances with identical values are treated as equal.
  • Concise syntax: Reduces boilerplate code compared to traditional classes.
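These properties can be seen in a small console sketch (not from the article; the simplified `AppSettings` record here is for illustration only):

```csharp
using System;

// Minimal record illustrating why records suit configuration:
// value-based equality and init-only immutability.
public record AppSettings
{
    public string ApplicationName { get; init; } = "";
    public int MaxRequests { get; init; }
}

public static class Demo
{
    public static void Main()
    {
        var a = new AppSettings { ApplicationName = "MyCoolApp", MaxRequests = 100 };
        var b = new AppSettings { ApplicationName = "MyCoolApp", MaxRequests = 100 };

        // Value-based equality: two instances with identical values compare equal.
        Console.WriteLine(a == b); // True

        // Immutability: properties are init-only, so "changing" a value means
        // creating a copy with a `with` expression; the original is untouched.
        var c = a with { MaxRequests = 200 };
        Console.WriteLine(a.MaxRequests); // 100
        Console.WriteLine(c.MaxRequests); // 200
    }
}
```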

Configuring Settings in .NET 9.0
Step 1. Create the Configuration Record
Rather than using a class, define a record to represent your configuration settings.
public record AppSettings
{
    public const string SectionName = "AppSettings";
    public required string ApplicationName { get; init; }
    public required int MaxRequests { get; init; }
    public required LoggingSettings Logging { get; init; }
}

public record LoggingSettings
{
    public required string LogLevel { get; init; }
}


Step 2. Configure Settings in appsettings.json
Specify your configuration values within the appsettings.json file:

{
  "AllowedHosts": "*",
  "AppSettings": {
    "ApplicationName": "MyCoolApp",
    "MaxRequests": 100,
    "Logging": {
      "LogLevel": "Information"
    }
  }
}

Step 3. Register Configuration in Program.cs
Use the Configure method to bind the configuration to the record.
builder.Services.Configure<AppSettings>(builder.Configuration.GetSection(AppSettings.SectionName));
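Optionally, the binding can be made fail-fast. A hedged sketch using the OptionsBuilder API (attribute-based checks require the Microsoft.Extensions.Options.DataAnnotations package; this is an addition, not part of the original steps):

```csharp
// Bind the section and validate eagerly at startup rather than on first use.
builder.Services.AddOptions<AppSettings>()
    .Bind(builder.Configuration.GetSection(AppSettings.SectionName))
    .ValidateDataAnnotations() // honors [Required], [Range], etc. on the record
    .ValidateOnStart();        // throws at startup if the section is invalid
```

With ValidateOnStart, a missing or malformed AppSettings section surfaces as a startup failure instead of a runtime exception on the first resolve.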

Step 4. Inject Configuration with IOptions<T>

Inject IOptions<AppSettings> into services, controllers, or components.
public class RecordService
{
    private readonly AppSettings _settings;

    public RecordService(IOptions<AppSettings> settings)
    {
        _settings = settings.Value;
    }

    public AppSettings GetAppSettingsAsync()
    {
        return _settings;
    }
}

Step 5. Configure Service Registration
Register the service in Program.cs:
builder.Services.AddSingleton<RecordService>();

Step 6. Utilize the Service
Resolve the service and call its methods.

[Route("api/[controller]")]
[ApiController]
public class RecordsController : ControllerBase
{
    private readonly RecordService _recordService;

    public RecordsController(RecordService recordService)
    {
        _recordService = recordService;
    }


    [HttpGet]
    [Route("settings")]
    public IActionResult GetAppSettings()
    {
        return Ok(_recordService.GetAppSettingsAsync());
    }
}


Execute the Code
Execute the code and see the result for yourself.


Alternative: Using IOptionsSnapshot<T> and IOptionsMonitor<T>

  • IOptionsSnapshot<T>: Provides a scoped instance of options, making it useful for refreshing values between requests in a web application.
  • IOptionsMonitor<T>: Enables real-time configuration updates without requiring an application restart.
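For example, a service that should always see the latest values can take IOptionsMonitor<T> instead. A minimal sketch (the `MonitoringService` name is my own, not from the article):

```csharp
using Microsoft.Extensions.Options;

public class MonitoringService
{
    private AppSettings _current;

    public MonitoringService(IOptionsMonitor<AppSettings> monitor)
    {
        _current = monitor.CurrentValue;

        // OnChange fires when the configuration source (e.g. appsettings.json)
        // is reloaded, so _current always reflects the latest values.
        monitor.OnChange(updated => _current = updated);
    }

    public AppSettings Current => _current;
}
```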

Conclusion
Leveraging records with the IOptions pattern in .NET 9.0 results in cleaner, immutable, and more efficient configuration management. Whether you use IOptions<T>, IOptionsSnapshot<T>, or IOptionsMonitor<T>, this approach fosters a well-organized and maintainable application.

Do you incorporate records for configuration in your .NET applications? Feel free to share your thoughts in the comments!


 



European ASP.NET Core Hosting - HostForLIFE :: AI and .NET: Using .NET 9 to Create Intelligent Apps

March 10, 2025 08:21 by author Peter

The way we create applications is evolving as a result of the convergence of artificial intelligence (AI) and modern software development. With the introduction of .NET 9, Microsoft continues to provide developers with tools and frameworks that make it simple to integrate AI capabilities into robust, scalable applications.

In this post, we'll examine how .NET 9 facilitates AI-driven development, with an emphasis on machine learning models, and walk through an intelligent sentiment analysis API project with sample code.

Why Develop AI with .NET 9?
Better performance, enhanced support for cloud-native apps, and an ecosystem well suited to AI workloads are just a few of the new features in .NET 9. Its straightforward APIs, improved AOT (Ahead-of-Time) compilation, and better container integration make it an excellent option for creating intelligent, high-performance applications. Additionally, libraries like ML.NET, together with compatibility with well-known AI frameworks such as TensorFlow and ONNX, let developers include AI directly in their .NET solutions.

Setting the Stage: Tools and Prerequisites

To follow along, you’ll need,

  • Visual Studio or VS Code with .NET 9 SDK: Install the latest version from the official Microsoft site, for coding and debugging. I will be using VS Code for the demo project.
  • ML.NET: A machine learning framework for .NET (install via NuGet: Microsoft.ML).
  • A Basic Understanding of ASP.NET Core: For building the API.

Project Overview: Sentiment Analysis API

Let’s build a simple ASP.NET Core Web API that uses ML.NET to do sentiment analysis on text submitted by users. The API will predict if a text expresses positive or negative sentiment, which is a great starting point for AI-driven apps.

Step 1. Setting Up the Project
Create a new ASP.NET Core Web API project.
dotnet new webapi -n dotnet9SentimentApi -f net9.0
cd dotnet9SentimentApi

Open the folder in VS Code and review the generated project structure.

Next, add the ML.NET NuGet package. You can run the following command, or install it from the VS Code UI.

dotnet add package Microsoft.ML

Step 2. Define the Data Models
Create a folder named Models and add two classes: SentimentData and SentimentPrediction.

SentimentData.cs: Represents the input data for the model.
namespace dotnet9SentimentApi.Models;

public record SentimentData
{
    public string Text { get; set; } = string.Empty;
}


SentimentPrediction.cs: Represents the model’s output. Note that the Prediction property must be mapped to ML.NET’s "PredictedLabel" output column, or the prediction engine will not populate it.
namespace dotnet9SentimentApi.Models;

using Microsoft.ML.Data;

public record SentimentPrediction
{
    [ColumnName("PredictedLabel")]
    public bool Prediction { get; set; } // True = Positive, False = Negative
    public float Probability { get; set; }
}

Step 3. Training a Simple Sentiment Model with ML.NET
For simplicity, we’ll train on a small, hard-coded dataset here; in a real project, ML.NET lets you load training data from a file or database. Create a SentimentModelTrainer.cs class in a Services folder. The TrainModel method below builds the pipeline and fits the model:
private void TrainModel()
    {
        // Simulated training data (in practice, load from a file or database)
        var data = BuildTrainingData();

        var dataView = _mlContext.Data.LoadFromEnumerable(data);

        // Enhanced pipeline with text preprocessing
        var pipeline = _mlContext.Transforms.Text.FeaturizeText("Features", new Microsoft.ML.Transforms.Text.TextFeaturizingEstimator.Options
        {
            WordFeatureExtractor = new Microsoft.ML.Transforms.Text.WordBagEstimator.Options(),
            StopWordsRemoverOptions = new Microsoft.ML.Transforms.Text.StopWordsRemovingEstimator.Options()
        }, nameof(SentimentTrainingData.Text))
            .Append(_mlContext.BinaryClassification.Trainers.LbfgsLogisticRegression(labelColumnName: nameof(SentimentTrainingData.Sentiment)));

        // Train the model
        _model = pipeline.Fit(dataView);

        // Optional: Save the model for reuse
        _mlContext.Model.Save(_model, dataView.Schema, "sentiment_model.zip");
    }

Note. In a real-world scenario, you’d train with a larger dataset (e.g., from a CSV file) and save the model using _mlContext.Model.Save().

Complete SentimentModelTrainer.cs class

using System;
using dotnet9SentimentApi.Models;
using Microsoft.ML;

namespace dotnet9SentimentApi.Services;

public class SentimentModelTrainer
{
    private readonly MLContext _mlContext;
    private ITransformer _model;

    public SentimentModelTrainer()
    {
        _mlContext = new MLContext();
        TrainModel();

    }

    private void TrainModel()
    {
        // Simulated training data (in practice, load from a file or database)
        var data = BuildTrainingData();

        var dataView = _mlContext.Data.LoadFromEnumerable(data);
        // Enhanced pipeline with text preprocessing
        var pipeline = _mlContext.Transforms.Text.FeaturizeText("Features", new Microsoft.ML.Transforms.Text.TextFeaturizingEstimator.Options
        {
            WordFeatureExtractor = new Microsoft.ML.Transforms.Text.WordBagEstimator.Options(),
            StopWordsRemoverOptions = new Microsoft.ML.Transforms.Text.StopWordsRemovingEstimator.Options()
        }, nameof(SentimentTrainingData.Text))
            .Append(_mlContext.BinaryClassification.Trainers.LbfgsLogisticRegression(labelColumnName: nameof(SentimentTrainingData.Sentiment)));

        // Train the model
        _model = pipeline.Fit(dataView);

        // Optional: Save the model for reuse
        _mlContext.Model.Save(_model, dataView.Schema, "sentiment_model.zip");
    }

    public SentimentPrediction Predict(string text)
    {
        var predictionEngine = _mlContext.Model.CreatePredictionEngine<SentimentData, SentimentPrediction>(_model);
        return predictionEngine.Predict(new SentimentData { Text = text });
    }

    //build training data
    private List<SentimentTrainingData> BuildTrainingData()
    {
        return new List<SentimentTrainingData>
        {
            new() { Text = "I love this product!", Sentiment = true },
            new() { Text = "This is terrible.", Sentiment = false },
            new() { Text = "The weather is nice!", Sentiment = true },
            new() { Text = "Horrible service, never again.", Sentiment = false },
            new() { Text = "Absolutely fantastic experience!", Sentiment = true },
            new() { Text = "It’s a complete disaster.", Sentiment = false },
            new() { Text = "I’m so happy with this!", Sentiment = true },
            new() { Text = "Disappointing and awful.", Sentiment = false },
            new() { Text = "I’m so impressed!", Sentiment = true },
            new() { Text = "I’m never coming back.", Sentiment = false },
            new() { Text = "I’m so excited!", Sentiment = true },
            new() { Text = "I’m so disappointed.", Sentiment = false },
            new() { Text = "I’m so pleased with this!", Sentiment = true },
            new() { Text = "I’m so upset.", Sentiment = false },
            new() { Text = "I’m so satisfied with this!", Sentiment = true },
            new() { Text = "I’m so angry.", Sentiment = false },
            new() { Text = "I’m so grateful for this!", Sentiment = true },
            new() { Text = "I’m so annoyed.", Sentiment = false },
            new() { Text = "I’m so thankful for this!", Sentiment = true }
        };
    }

}
public record SentimentTrainingData
{
    public string Text { get; set; } = string.Empty;
    public bool Sentiment { get; set; }
}


Step 4. Build the API Controller
Create a SentimentController in the Controllers folder.
using dotnet9SentimentApi.Models;
using dotnet9SentimentApi.Services;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;

namespace dotnet9SentimentApi.Controllers
{
    [Route("api/[controller]")]
    [ApiController]
    public class SentimentController : ControllerBase
    {
        private readonly SentimentModelTrainer _modelTrainer;

        public SentimentController(SentimentModelTrainer modelTrainer)
        {
            _modelTrainer = modelTrainer;
        }

        [HttpPost("analyze")]
        public ActionResult<SentimentPrediction> AnalyzeSentiment([FromBody] SentimentData input)
        {
            if (string.IsNullOrEmpty(input.Text))
                return BadRequest("Text cannot be empty.");

            var prediction = _modelTrainer.Predict(input.Text);
            return Ok(new { Sentiment = prediction.Prediction ? "Positive" : "Negative", Confidence = prediction.Probability });
        }
    }
}

Step 5. Register Services in the Program.cs
Update Program.cs to use minimal APIs and register the service.
// Add services to the container
builder.Services.AddControllers();
builder.Services.AddSingleton<SentimentModelTrainer>();

// Configure the HTTP request pipeline
app.UseHttpsRedirection();
app.MapControllers();

The complete code for Program.cs:
using dotnet9SentimentApi.Services;

var builder = WebApplication.CreateBuilder(args);

// Add services to the container.
// Learn more about configuring OpenAPI at https://aka.ms/aspnet/openapi
builder.Services.AddOpenApi();

// Add services to the container
builder.Services.AddControllers();
builder.Services.AddSingleton<SentimentModelTrainer>();

var app = builder.Build();

// Configure the HTTP request pipeline.
if (app.Environment.IsDevelopment())
{
    app.MapOpenApi();
}

// Configure the HTTP request pipeline
app.UseHttpsRedirection();
app.MapControllers();

app.Run();

Our solution is ready to test.

The folder structure looks as shown below.

Test the API
dotnet run

You can test the API with Postman or with an .http file in VS Code. I will test using a VS Code .http file.
@dotnet9SentimentApi_HostAddress = http://localhost:5288

# post analyze sentiment
POST {{dotnet9SentimentApi_HostAddress}}/api/sentiment/analyze
Content-Type: application/json

{
  "text": "I am not happy with the result!"
}

###

The result is shown below

The result is not as expected, which indicates an issue with the sentiment analysis model's training or configuration. Given the tiny dataset and simplistic setup in the original sample, the model isn't learning meaningful patterns.

However, we can improve the result with more training data and testing.

Enhancing the Project
Scalability: Deploy to Azure with container support for cloud-native scaling.
Model Accuracy: Use a real dataset (e.g., IMDB reviews) and train with more sophisticated algorithms like FastTree.
Performance: Cache predictions for frequently analyzed text using MemoryCache.
Integration: Extend with Azure Cognitive Services or TensorFlow.NET for advanced AI capabilities.
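As a sketch of the caching idea above, predictions for frequently repeated text could be memoized with IMemoryCache. The wrapper class below and its registration are illustrative assumptions, not part of the original sample:

```csharp
using Microsoft.Extensions.Caching.Memory;

// Hypothetical caching wrapper around the SentimentModelTrainer from Step 3.
// Repeated requests for the same text reuse the cached prediction.
public class CachedSentimentService
{
    private readonly SentimentModelTrainer _trainer;
    private readonly IMemoryCache _cache;

    public CachedSentimentService(SentimentModelTrainer trainer, IMemoryCache cache)
    {
        _trainer = trainer;
        _cache = cache;
    }

    public SentimentPrediction Predict(string text)
    {
        // GetOrCreate runs the factory only on a cache miss.
        return _cache.GetOrCreate(text, entry =>
        {
            entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(10);
            return _trainer.Predict(text);
        })!;
    }
}
```

To wire it up, you would register `builder.Services.AddMemoryCache();` and `builder.Services.AddSingleton<CachedSentimentService>();`, then inject the wrapper into the controller in place of the trainer.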

HostForLIFE ASP.NET Core Hosting

European best, cheap, and reliable ASP.NET hosting with instant activation. HostForLIFE.eu is the #1 recommended Windows and ASP.NET hosting in the European continent, with a 99.99% uptime guarantee for reliability, stability, and performance. The HostForLIFE.eu security team is constantly monitoring the entire network for unusual behaviour. We deliver hosting solutions including shared hosting, cloud hosting, reseller hosting, dedicated servers, and IT as a Service for companies of all sizes.



European ASP.NET Core Hosting - HostForLIFE :: Using Delegate Chaining in .NET to Create a Chain of Responsibility

clock March 6, 2025 06:46 by author Peter

One of the most underappreciated yet effective design patterns is the Chain of Responsibility (CoR) pattern. It enables us to create adaptable, extensible processing pipelines, much like ASP.NET Core middleware or HttpClient handlers.

How .NET Uses Chain of Responsibility?

The CoR pattern is heavily used in .NET, especially in ASP.NET Core and HttpClient.

  • ASP.NET Core Middleware
    • Middleware components are chained together to process HTTP requests.
    • Each middleware can decide to process the request, pass it to the next middleware, or short-circuit the pipeline.
  • HttpClient Handlers
    • It uses a chain of HttpMessageHandler instances to process HTTP requests.
    • Each handler can modify the request, pass it to the next handler, or short-circuit the pipeline.
  • Validation Pipelines: Libraries like FluentValidation use a similar pattern to chain validation rules.
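The short-circuiting behaviour described above can be sketched with plain delegates, independent of ASP.NET. The names here are illustrative only:

```csharp
using System;
using System.Collections.Generic;

// A minimal middleware-style pipeline: each component receives the request
// and a 'next' delegate, and may short-circuit by not calling next.
public static class MiniPipeline
{
    public static Action<string> Build(List<Func<Action<string>, Action<string>>> components)
    {
        // Terminal handler at the end of the chain.
        Action<string> app = request => Console.WriteLine($"Handled: {request}");

        // Compose in reverse so the first registered component runs first.
        for (int i = components.Count - 1; i >= 0; i--)
            app = components[i](app);

        return app;
    }
}
```

Usage might look like this: the first component logs and always passes the request on; the second blocks certain requests by not calling `next`:

```csharp
var pipeline = MiniPipeline.Build(new()
{
    next => req => { Console.WriteLine("Logging"); next(req); },
    next => req => { if (req.Contains("blocked")) Console.WriteLine("Short-circuited"); else next(req); }
});

pipeline("hello");   // Logging, then Handled: hello
pipeline("blocked"); // Logging, then Short-circuited (terminal handler never runs)
```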

Delegate-Based CoR for Validation Rules
Instead of hardcoding validation logic, we dynamically add validation rules using delegate chaining.

Define the Delegate Pipeline with Short-Circuit support.

public class ValidationPipeline
{
    private Func<User, Task<bool>> _pipeline = user => Task.FromResult(true); // Default: Always passes
    public void AddRule(Func<User, Task<bool>> rule)
    {
        var previous = _pipeline;
        _pipeline = async user => await previous(user) && await rule(user); // Chain with AND condition
    }
    public async Task<bool> ValidateAsync(User user) => await _pipeline(user);
}


Dynamically Add Rules
Validate Pipeline
public class FeatureToggleService : IFeatureToggleService
{
    private readonly ValidationPipeline _validateRules;
    private readonly IFeatureManagerSnapshot _featureManager;
    public FeatureToggleService(IFeatureManagerSnapshot featureManager)
    {
        _validateRules = new ValidationPipeline();
        _featureManager = featureManager;
    }
    public async Task<bool> CanAccessFeatureAsync(User user)
    {
        _validateRules.AddRule(async user => await _featureManager.IsEnabledAsync("CustomGreeting"));
        _validateRules.AddRule(user => Task.FromResult(user.Role == "Admin"));
        _validateRules.AddRule(user => Task.FromResult(user.HasActiveSubscription));

        return await _validateRules.ValidateAsync(user);
    }
}
public interface IFeatureToggleService
{
    Task<bool> CanAccessFeatureAsync(User user);
}

How Can It Help?

  • Dynamic & Extensible: Add/remove rules without modifying existing logic.
  • Follows Open-Closed Principle (OCP): New rules can be added without modifying old code.
  • Composable: Chain rules like ASP.NET Core middleware.
  • Async-Support: Works well with async validation.  
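A quick usage sketch of the ValidationPipeline above; the User record here is a minimal stand-in assumed for illustration:

```csharp
var pipeline = new ValidationPipeline();
pipeline.AddRule(user => Task.FromResult(user.Role == "Admin"));
pipeline.AddRule(user => Task.FromResult(user.HasActiveSubscription));

// All chained rules must pass (AND semantics); a failing rule short-circuits the rest.
bool allowed = await pipeline.ValidateAsync(new User("Admin", true));  // true
bool denied  = await pipeline.ValidateAsync(new User("Guest", true));  // false

public record User(string Role, bool HasActiveSubscription);
```

Because each AddRule call wraps the previous delegate with `&&`, rules evaluate in registration order and later rules never run once one fails.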




European ASP.NET Core Hosting - HostForLIFE :: Comparing ASP.NET SOAP Services (ASMX) and ASP.NET Core SOAP APIs with Code

clock February 17, 2025 07:51 by author Peter

SOAP (Simple Object Access Protocol) services have long been an integral part of enterprise application development, especially in industries such as financial services, healthcare, and government, where structured communication is required. Ziggy Rafiq compares two critical approaches to SOAP service implementation in the .NET ecosystem: ASMX and CoreWCF-based ASP.NET Core SOAP APIs. The purpose of this article is to help developers choose the best SOAP implementation for their application by providing practical examples of both.

The following are some of the things you will learn from this article:

  • How to maintain and enhance legacy SOAP services with ASMX.
  • How to create scalable, cross-platform SOAP APIs in ASP.NET Core with CoreWCF.
  • How to select the right framework based on your application's requirements.

This guide is invaluable for developers transitioning from legacy SOAP services to modern solutions or integrating SOAP with REST and gRPC.

The ASP.NET SOAP Web Services (ASMX)
ASP.NET SOAP Web Services (ASMX), part of the .NET Framework, provide a way to expose methods as SOAP services that clients can consume over HTTP. These services were the go-to solution for SOAP-based communication in early ASP.NET applications.

ASMX Features
Built-in support for SOAP: ASMX services support SOAP out of the box, automatically generating SOAP envelopes for methods annotated with the [WebMethod] attribute.
Auto-generated WSDL: When you create an ASMX service, a WSDL file is automatically generated, which clients can use to understand how the service works.
Platform limitations: In cross-platform environments, ASMX services are less flexible because they require IIS (Internet Information Services) to host them, making them more limited.

ASMX Web Service Code Example
The following steps will guide you through creating an ASMX web service in the .NET Framework:
1. Visual Studio should be used to create an ASP.NET Web Forms project.

2. Add a .asmx file (such as CalculatorService.asmx) to your project for the calculator service.

 

3. The service should be implemented with the [WebMethod] attribute.
An example of a simple calculator service is as follows:
using System.Web.Services;

namespace ASMXService
{
    /// <summary>
    /// Summary description for CalculatorService
    /// </summary>
    [WebService(Namespace = "http://tempuri.org/")]
    [WebServiceBinding(ConformsTo = WsiProfiles.BasicProfile1_1)]
    [System.ComponentModel.ToolboxItem(false)]
    // To allow this Web Service to be called from script, using ASP.NET AJAX, uncomment the following line.
    // [System.Web.Script.Services.ScriptService]
    public class CalculatorService : System.Web.Services.WebService
    {

        [WebMethod]
        public int Add(int a, int b)
        {
            return a + b;
        }

        [WebMethod]
        public int Subtract(int a, int b)
        {
            return a - b;
        }

    }
}


How to Create and Run a Program?

  • Visual Studio should be used to create an ASP.NET Web Forms project.
  • Make sure your project contains an .asmx file.
  • The above code should be added to your service.
  • You can view the auto-generated WSDL by visiting the .asmx file in the browser after running the project.

How Do ASP.NET Core SOAP APIs Work?
In ASP.NET Core, Microsoft's cross-platform framework, SOAP services aren't built in, but developers can use CoreWCF to create SOAP-based APIs. The CoreWCF project brings WCF-like functionality to .NET Core, allowing developers to develop SOAP APIs in a modern, scalable, and cross-platform environment.

CoreWCF SOAP APIs for ASP.NET Core

  • Requires CoreWCF for SOAP implementation: In contrast to ASMX, ASP.NET Core does not come with SOAP support by default but can be added using CoreWCF.
  • Cross-platform support: CoreWCF services can be run on Windows, Linux, and macOS, making them suitable for modern cloud-native applications.
  • Integration with modern features: ASP.NET Core features such as middleware, dependency injection, and performance scalability are integrated into CoreWCF.

ASP.NET Core SOAP API Code Example
The following steps will help you create a SOAP API in ASP.NET Core using CoreWCF:

Step 1. The following NuGet packages are required to install CoreWCF:
dotnet add package CoreWCF
dotnet add package CoreWCF.Http


Step 2. Use the [ServiceContract] and [OperationContract] attributes to define the service contract:
using CoreWCF;

namespace CoreWCFService.Contracts.Interfaces;

[ServiceContract]
public interface ICalculatorService
{
    [OperationContract]
    double Add(double a, double b);

    [OperationContract]
    double Subtract(double a, double b);

    [OperationContract]
    double Multiply(double a, double b);

    [OperationContract]
    double Divide(double a, double b);

}

Step 3. Implement the service by creating a class that implements the service contract:
using CoreWCFService.Contracts.Interfaces;

namespace CoreWCFService.Contracts;
public class CalculatorService : ICalculatorService
{
    public double Add(double a, double b) => a + b;
    public double Subtract(double a, double b) => a - b;
    public double Multiply(double a, double b) => a * b;
    public double Divide(double a, double b) => b != 0 ? a / b : throw new DivideByZeroException("Cannot divide by zero.");

}

Step 4. Configure CoreWCF in Program.cs by adding the following lines:
using CoreWCF;
using CoreWCF.Configuration;
using CoreWCFService.Contracts;
using CoreWCFService.Contracts.Interfaces;

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddServiceModelServices();
builder.Services.AddServiceModelMetadata();

builder.Services.AddSingleton<CalculatorService>();


builder.Services.AddOpenApi();

var app = builder.Build();


((IApplicationBuilder)app).UseServiceModel(builder =>
{
    builder.AddService<CalculatorService>();
    builder.AddServiceEndpoint<CalculatorService, ICalculatorService>(
        new BasicHttpBinding(), "/CalculatorService");
});


app.MapGet("/calculate/add/{a}/{b}", (double a, double b, CalculatorService service) =>
{
    return Results.Ok(new { Result = service.Add(a, b) });
}).WithName("AddNumbers");

app.MapGet("/calculate/subtract/{a}/{b}", (double a, double b, CalculatorService service) =>
{
    return Results.Ok(new { Result = service.Subtract(a, b) });
}).WithName("SubtractNumbers");

app.MapGet("/calculate/multiply/{a}/{b}", (double a, double b, CalculatorService service) =>
{
    return Results.Ok(new { Result = service.Multiply(a, b) });
}).WithName("MultiplyNumbers");

app.MapGet("/calculate/divide/{a}/{b}", (double a, double b, CalculatorService service) =>
{
    if (b == 0)
        return Results.BadRequest("Cannot divide by zero.");

    return Results.Ok(new { Result = service.Divide(a, b) });
}).WithName("DivideNumbers");


if (app.Environment.IsDevelopment())
{
    app.MapOpenApi();
}

app.UseHttpsRedirection();
app.Run();

# CoreWCFService.http Example
@CoreWCFService_HostAddress = http://localhost:5071

GET {{CoreWCFService_HostAddress}}/calculate/add/15/10
Accept: application/json
# Expected response: { "Result": 25 }

###

GET {{CoreWCFService_HostAddress}}/calculate/subtract/20/5
Accept: application/json
# Expected response: { "Result": 15 }

###

GET {{CoreWCFService_HostAddress}}/calculate/multiply/20/5
Accept: application/json
# Expected response: { "Result": 100 }

###

GET {{CoreWCFService_HostAddress}}/calculate/divide/20/4
Accept: application/json
# Expected response: { "Result": 5 }

###

GET {{CoreWCFService_HostAddress}}/calculate/divide/50/0
Accept: application/json
# Expected response: 400 Bad Request with "Cannot divide by zero."


Step 5. Test the SOAP API. After running the application, navigate to /CalculatorService?wsdl to view the WSDL. Then, use tools like Postman or SOAP UI to test the SOAP service.
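As a rough sketch, the SOAP endpoint can also be exercised with a raw HttpClient POST. The SOAPAction URI and element names below assume the default http://tempuri.org/ contract namespace that WCF applies when none is specified, so verify them against the generated WSDL first:

```csharp
using System;
using System.Net.Http;
using System.Text;

// Hand-built SOAP 1.1 envelope for the Add operation (assumed element names;
// check /CalculatorService?wsdl for the authoritative contract).
var envelope = """
    <s:Envelope xmlns:s="http://schemas.xmlsoap.org/soap/envelope/">
      <s:Body>
        <Add xmlns="http://tempuri.org/">
          <a>2</a>
          <b>3</b>
        </Add>
      </s:Body>
    </s:Envelope>
    """;

using var http = new HttpClient();
using var request = new HttpRequestMessage(HttpMethod.Post, "http://localhost:5071/CalculatorService")
{
    Content = new StringContent(envelope, Encoding.UTF8, "text/xml")
};
// BasicHttpBinding expects the SOAPAction header for SOAP 1.1 requests.
request.Headers.TryAddWithoutValidation("SOAPAction", "\"http://tempuri.org/ICalculatorService/Add\"");

var response = await http.SendAsync(request);
Console.WriteLine(await response.Content.ReadAsStringAsync());
```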

Differentiating ASMX from ASP.NET Core

Feature                   | ASP.NET SOAP Web Services (ASMX)            | ASP.NET Core SOAP APIs (CoreWCF)
Framework                 | .NET Framework                              | ASP.NET Core
Cross-Platform Support    | No                                          | Yes
Middleware and DI Support | No                                          | Yes
Performance               | Moderate                                    | High
SOAP Support              | Built-in                                    | Requires CoreWCF
Ideal Use Case            | Maintaining legacy applications and systems | Building modern applications and systems

When to Choose Which?

Use ASP.NET SOAP Web Services (ASMX) when:

  • You maintain legacy applications that rely heavily on ASMX.
  • Migrating to ASP.NET Core isn't feasible or cost-effective for your project.

Use ASP.NET Core SOAP APIs (CoreWCF) when:

  • You are building new SOAP-based services.
  • You need a scalable, cross-platform solution.
  • You want to integrate SOAP with modern technologies such as REST, gRPC, or message queues.

Summary

While ASMX Web Services can still be used to maintain legacy applications, ASP.NET Core SOAP APIs, which are driven by CoreWCF, provide greater performance, flexibility, and support for contemporary development techniques. CoreWCF is the ideal option for contemporary enterprise applications since it can produce scalable, cross-platform SOAP services. By using CoreWCF, developers can easily combine their SOAP solutions with more recent technologies like REST and gRPC, future-proofing their systems.

The code for this article can be found on Ziggy Rafiq's GitHub Repository https://github.com/ziggyrafiq/SOAP-Services-Comparison  This is for developers who need to maintain legacy SOAP services while transitioning to modern, scalable SOAP solutions or integrating SOAP into a broader ecosystem of modern web services.



 

 



European ASP.NET Core Hosting - HostForLIFE :: Configuring Dynamic CORS with JSON in .NET Core

clock February 10, 2025 06:26 by author Peter

A crucial component of web applications is Cross-Origin Resource Sharing (CORS), which permits or prohibits the sharing of resources between different origins (domains). In a modern web application, the client and server are frequently hosted on separate domains, which can lead to CORS problems when the client sends HTTP requests to the server. In this post, we'll look at how to configure CORS in .NET Core to dynamically allow multiple origins from your appsettings.json file.

Configure CORS in the appsettings.json

To allow multiple origins, we must first specify the permitted origins in the appsettings.json file. This is where we store the list of URLs (origins) from which cross-origin requests are permitted.

{
  "Cors": {
    "AllowedOrigins": [
      "https://example1.com",
      "https://example2.com",
      "https://example3.com"
    ]
  }
}


Read the Configuration in Program.cs
var builder = WebApplication.CreateBuilder(args);
// Get allowed origins from appsettings.json
var allowedOrigins = builder.Configuration
    .GetSection("Cors:AllowedOrigins")
    .Get<string[]>();
// Add CORS policy
builder.Services.AddCors(options =>
{
    options.AddPolicy("AllowSpecificOrigins",
        builder => builder.WithOrigins(allowedOrigins) // Apply multiple origins dynamically
                          .AllowAnyMethod()
                          .AllowAnyHeader());
});
// Add services to the container (e.g., AddControllers)
builder.Services.AddControllers();
var app = builder.Build();
// Use CORS policy
app.UseCors("AllowSpecificOrigins");
// Configure the HTTP request pipeline
app.MapControllers();
app.Run();


Apply CORS Policy in the Middleware

// Apply the CORS policy globally
app.UseCors("AllowSpecificOrigins");
// Other middleware (e.g., UseRouting, UseEndpoints)
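One caveat worth noting: if the Cors:AllowedOrigins section is missing from configuration, Get<string[]>() returns null and WithOrigins will fail at startup. A defensive variant of the registration (a sketch) is:

```csharp
// Fall back to an empty array when the section is absent,
// so the app still starts (and simply allows no cross-origin requests).
var allowedOrigins = builder.Configuration
    .GetSection("Cors:AllowedOrigins")
    .Get<string[]>() ?? Array.Empty<string>();

builder.Services.AddCors(options =>
{
    options.AddPolicy("AllowSpecificOrigins", policy =>
        policy.WithOrigins(allowedOrigins)
              .AllowAnyMethod()
              .AllowAnyHeader());
});
```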


Conclusion
Using appsettings.json to manage CORS settings in your .NET 9 application allows for greater flexibility and better maintainability. You can easily add or remove origins without changing your application's code. This is particularly useful when deploying your application to different environments (development, staging, production) with different origin policies. By following these steps, you can dynamically configure and manage multiple allowed origins for your application.


 



European ASP.NET Core Hosting - HostForLIFE :: KnightMoves.SqlObjects: An Improved .NET SQL Builder

clock February 5, 2025 05:47 by author Peter

KnightMoves.SqlObjects is a .NET NuGet package library that implements an object-based SQL builder. This package takes a different approach from other SQL builders, which rely on string manipulation techniques like concatenation and interpolation. Instead of printing SQL strings, this package wraps the syntax of the SQL language with C# objects, so a SQL query is composed entirely of objects. Compared to string manipulators, this enables a far more powerful experience and set of capabilities.

Syntax Matching
Some ORMs and SQL builders use method names that are similar but different from the SQL language. This library matches the SQL syntax almost exactly, with some minor exceptions. The strategy for this library is that when you're using the SQL builder, you are able to think and code in SQL instead of trying to remember the new terminology of the SQL builder.

Let's dive in with some examples.

Examples
First, create a simple Console application in Visual Studio and add KnightMoves.SqlObjects NuGet package library from https://nuget.org.

Once you have a basic console application generated, you can add your code to the Main() method of the Program.cs file.

Start with importing the namespace.
using KnightMoves.SqlObjects;

Next, you add the code below to the Main() method.

The fluent SQL builder is available through the static TSQL class so you can begin there and code as much as though you’re coding in SQL.
var sql = TSQL

   .SELECT()
   .STAR()
   .FROM("Products")
   .Build()

;

Console.WriteLine(sql);

Run the application to see how the SQL is built. Here's the output:
SELECT
   *
FROM [Products]


That used a basic SELECT * but there are various ways to specify the columns of the select list. The most basic way is to use the COLUMN() method for each column you specify.

var sql = TSQL

   .SELECT()
     .COLUMN("ProductID")
     .COLUMN("ProductName")
   .FROM("Products")
   .Build()

;

Console.WriteLine(sql);


Here's the output:
SELECT
 [ProductID],
 [ProductName]
FROM [Products]


But we’re just getting started. You can provide a collection of column names and pass that to the COLUMNS() method (notice it is plural) and it will use those names to create the list of columns.

var columns = new List<string> { "ProductID", "ProductName" };

var sql = TSQL

     .SELECT()
       .COLUMNS(columns)
     .FROM("dbo", "Products", "p")
     .Build()

;

Console.WriteLine(sql);


Output
SELECT
 [ProductID],
 [ProductName]
FROM [dbo].[Products] p


If you know SQL well, then you know that there are all manner of things you can do in the select list to make it a more robust query. This library handles them. Let’s start with a simple alias using .AS().

var sql = TSQL

   .SELECT()
     .COLUMN("ProductID").AS("Id")
     .COLUMN("ProductName")
   .FROM("Products")
   .Build()

;


Output
SELECT
 [ProductID] AS [Id],
 [ProductName]
FROM [Products]


You can see it correctly produces the line [ProductID] AS [Id]

Do you need to specify the schema and a multipart identifier? Easy. Suppose you’re using dbo as the schema and p as an alias for the Products table. Then you can do so like this.
var sql = TSQL

   .SELECT()
     .COLUMN("p", "ProductID", "Id")
     .COLUMN("p", "ProductName")
   .FROM("dbo", "Products", "p")
   .Build()

;

Console.WriteLine(sql);

Output
SELECT
 [p].[ProductID] AS [Id],
 [p].[ProductName]
FROM [dbo].[Products] p

You can also see an alternative to provide the alias. Instead of using .AS() you can provide the alias as a third parameter to the COLUMN() method.

It’s a pain to keep repeating the COLUMN() method call, and we know that we can use a collection of column names, but what if we need to prefix them with the table alias? Easy, we can do it like this.
var columns = new List<string> { "ProductID", "ProductName" };

var sql = TSQL

     .SELECT()
       .COLUMNS("p", columns)
     .FROM("dbo", "Products", "p")
     .Build()

;

Console.WriteLine(sql);

Output
SELECT
 [p].[ProductID],
 [p].[ProductName]
FROM [dbo].[Products] p


The use of aliases becomes more important when you’re joining tables. So, let’s give that a try by joining Products and Categories.
var sql = TSQL

     .SELECT()
       .COLUMN("p", "ProductID")
       .COLUMN("c", "CategoryName")
     .FROM("dbo", "Products", "p")
     .INNERJOIN("dbo", "Categories", "c").ON("c", "CategoryID").IsEqualTo("p", "CategoryID")
     .Build()

;

Console.WriteLine(sql);


Output
SELECT
 [p].[ProductID],
 [c].[CategoryName]
FROM [dbo].[Products] p
INNER JOIN [dbo].[Categories] c ON [c].[CategoryID] = [p].[CategoryID]


If you need to join more tables, then all you have to do is slap another INNERJOIN() call exactly where you normally would if you’re coding in SQL with the schema and alias like so.
var sql = TSQL

 .SELECT()
   .COLUMN("p", "ProductID")
   .COLUMN("p", "ProductName")
   .COLUMN("c", "CategoryName")
   .COLUMN("s", "CompanyName")
 .FROM("dbo", "Products", "p")
 .INNERJOIN("dbo", "Categories", "c").ON("c", "CategoryID").IsEqualTo("p", "CategoryID")
 .INNERJOIN("dbo", "Suppliers", "s").ON("s", "SupplierID").IsEqualTo("p", "SupplierID")
 .Build()

;

Console.WriteLine(sql);

Output
SELECT
 [p].[ProductID],
 [p].[ProductName],
 [c].[CategoryName],
 [s].[CompanyName]
FROM [dbo].[Products] p
INNER JOIN [dbo].[Categories] c ON [c].[CategoryID] = [p].[CategoryID]
INNER JOIN [dbo].[Suppliers] s ON [s].[SupplierID] = [p].[SupplierID]


Notice that throughout this demo, you can see that when you're using this library, you can think in SQL terms. Some things will deviate slightly, such as the use of COLUMN() instead of just literally typing in the column name where it belongs and later you’ll see that we use a fluent method call for operators such as IsEqualTo() instead of the = string character, but the thought process is the same. You're thinking in SQL even though you're coding in C#.

For further assistance, because the library is SQL in C# dressing, its methods and signatures pop up in the IntelliSense features of the IDE, where you can search through the options to find what you're looking for easily.


We are barely scratching the surface here. The library implements all DML statements of Microsoft's T-SQL language, which is fully documented here: KnightMoves.SqlObject Documentation. Head on over there to get started and see what you can do with the basics. Stay tuned for other articles in this series, where we’ll cover more and more features of this robust library. Thanks for reading this far. I sincerely hope you enjoy this library as much as I enjoyed making it.




European ASP.NET Core Hosting - HostForLIFE :: Dynamic Rules Engine for .NET Financial App User Workflows

clock January 20, 2025 07:55 by author Peter

To implement a Rules Engine for user-specific workflows in a .NET financial application, you must create a system that enables flexible, dynamic rule creation and execution based on user context (such as roles, account details, and transaction types). A rules engine offers an abstraction layer over business logic that makes upgrades, maintenance, and user-specific adaptations simple.

1. Understand the Financial Domain
Financial applications typically deal with transactions, account balances, regulatory requirements, fraud detection, and other rules that can be user- or context-specific. In this example, let’s assume we're building a rules engine for managing financial transactions where rules need to be applied based on.

  • Account type (savings, checking, business, etc.).
  • Transaction type (deposit, withdrawal, transfer, etc.).
  • Transaction amount.
  • User role (admin, regular user, auditor, etc.).
  • User-specific preferences (risk appetite, investment profile).

2. Define the Rule Structure

A rule typically contains.

  • Condition: The condition or predicate that must be true for the rule to execute (e.g., transaction amount > $500).
  • Action: The result or effect if the rule is triggered (e.g., send a notification, log an event, or block the transaction).
  • User Context: The context in which the rule is evaluated (e.g., user role, account type).

In the financial system, rules might look like.

  • If the transaction amount is > $1000 and the account type is business, fraud detection is triggered.
  • If the account balance is < $50 and the withdrawal request is for $100, block the withdrawal.

3. Create a Rule Interface
Create a base interface for rules that can be implemented for different types of rules.
public interface IRule
{
    bool Evaluate(UserContext context, Transaction transaction);
    void Execute(Transaction transaction);
}

4. Define Specific Rule Implementations
Implement specific rules based on the financial domain.

Example: Transaction Amount Limit Rule.
public class TransactionAmountLimitRule : IRule
{
    private readonly decimal _limit;

    public TransactionAmountLimitRule(decimal limit)
    {
        _limit = limit;
    }

    public bool Evaluate(UserContext context, Transaction transaction)
    {
        return transaction.Amount > _limit;
    }

    public void Execute(Transaction transaction)
    {
        Console.WriteLine($"Transaction amount exceeds the limit of {_limit}. Action required.");
    }
}

Example: Account Type and Fraud Detection Rule.
public class FraudDetectionRule : IRule
{
    public bool Evaluate(UserContext context, Transaction transaction)
    {
        return transaction.Amount > 1000 && context.AccountType == "Business";
    }

    public void Execute(Transaction transaction)
    {
        Console.WriteLine($"Fraud detection triggered for transaction of {transaction.Amount} on business account.");
        // Integrate with a fraud detection system here
    }
}


5. Create the User Context and Transaction Classes
Define classes to represent user and transaction data. These objects will be passed to the rules engine to evaluate whether a rule should fire.
public class UserContext
{
    public string Role { get; set; }
    public string AccountType { get; set; }
    public decimal AccountBalance { get; set; }

    // Other user-specific data...
}

public class Transaction
{
    public decimal Amount { get; set; }
    public string TransactionType { get; set; }
    public string AccountId { get; set; }

    // Other transaction details...
}

6. Rules Engine Execution
The core of the rules engine is to evaluate the conditions and execute the rules. You can use a chain of responsibility pattern, a strategy pattern, or a simple loop to apply the rules.
public class RulesEngine
{
    private readonly List<IRule> _rules;

    public RulesEngine()
    {
        _rules = new List<IRule>();
    }

    public void AddRule(IRule rule)
    {
        _rules.Add(rule);
    }

    public void ExecuteRules(UserContext context, Transaction transaction)
    {
        foreach (var rule in _rules)
        {
            if (rule.Evaluate(context, transaction))
            {
                rule.Execute(transaction);
            }
        }
    }
}


7. Implementing User-Specific Workflow
Depending on the user’s role or account type, different rules might be triggered. For example:

  • A "Premium User" might have different transaction limits.
  • An "Admin" may be exempt from certain fraud detection rules.

var userContext = new UserContext
{
    Role = "Regular",
    AccountType = "Business",
    AccountBalance = 1200
};

var transaction = new Transaction
{
    Amount = 1500,
    TransactionType = "Withdrawal",
    AccountId = "12345"
};

// Initialize rules engine and add rules
var rulesEngine = new RulesEngine();
rulesEngine.AddRule(new TransactionAmountLimitRule(1000));
rulesEngine.AddRule(new FraudDetectionRule());

// Execute rules for the given context and transaction
rulesEngine.ExecuteRules(userContext, transaction);

8. User-Specific Workflow Example
In practice, rules can be set up to dynamically adjust based on the user context.
if (userContext.Role == "Premium")
{
    rulesEngine.AddRule(new TransactionAmountLimitRule(5000)); // Higher limit for premium users
}
else if (userContext.Role == "Admin")
{
    rulesEngine.AddRule(new NoFraudDetectionRule()); // Admin users may not be subject to fraud detection
}

// Then, execute the rules for the user's specific context
rulesEngine.ExecuteRules(userContext, transaction);


9. Persisting and Managing Rules Dynamically
For flexibility, you can store the rules in a database, and even allow rules to be edited via an admin UI. This allows you to modify workflows without changing the application code.

  • Store rules as JSON, XML, or database records.
  • Load rules dynamically based on the user or transaction type.
  • Allow admins to manage rules from a UI or API.

For dynamic rule evaluation:

public class DynamicRuleLoader
{
    public IEnumerable<IRule> LoadRules(UserContext userContext)
    {
        // Query a database or external source to load the rules applicable to this user
        return new List<IRule>
        {
            new TransactionAmountLimitRule(1000),
            new FraudDetectionRule()
        };
    }
}
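The loader above returns a hardcoded list; in practice the rule definitions would come from JSON or database records, as described in the bullets. The sketch below shows one possible way to map stored definitions to rule instances. Note that the `RuleDefinition` shape, the JSON field names, and the minimal stand-in types (`IRule`, `UserContext`, `Transaction`, `TransactionAmountLimitRule`) are illustrative assumptions, not part of the article's code.

```csharp
using System;
using System.Collections.Generic;
using System.Text.Json;

// Minimal stand-ins for the types used earlier in this article (assumed shapes).
public class UserContext { public string Role { get; set; } }
public class Transaction { public decimal Amount { get; set; } }

public interface IRule
{
    bool Evaluate(UserContext context, Transaction transaction);
    void Execute(Transaction transaction);
}

public class TransactionAmountLimitRule : IRule
{
    private readonly decimal _limit;
    public TransactionAmountLimitRule(decimal limit) => _limit = limit;
    public bool Evaluate(UserContext c, Transaction t) => t.Amount > _limit;
    public void Execute(Transaction t) => Console.WriteLine($"Blocked: amount exceeds {_limit}");
}

// A rule definition as it might be stored as JSON or a database record (hypothetical schema).
public class RuleDefinition
{
    public string RuleType { get; set; }
    public decimal Parameter { get; set; }
    public string AppliesToRole { get; set; }
}

public class JsonRuleLoader
{
    public IEnumerable<IRule> LoadRules(string json, UserContext context)
    {
        var definitions = JsonSerializer.Deserialize<List<RuleDefinition>>(json);
        foreach (var def in definitions)
        {
            // Skip rules that target a different role than this user's.
            if (def.AppliesToRole != null && def.AppliesToRole != context.Role)
                continue;

            // Map each stored definition to a concrete rule instance.
            if (def.RuleType == "TransactionAmountLimit")
                yield return new TransactionAmountLimitRule(def.Parameter);
        }
    }
}
```

With this approach, an admin UI only has to edit the stored `RuleDefinition` records; the application picks up the new rules on the next load without a code change.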


10. Testing and Maintenance

  • Unit Testing: Each rule can be unit tested independently by mocking user contexts and transactions.
  • Performance Considerations: Optimize for performance if the rules engine is large or if rules are complex. For example, caching user-specific rules, batching rule evaluations, or implementing a priority queue for rule execution.
  • Audit Logging: For financial systems, ensure that rule evaluations and their outcomes are logged for compliance and auditing purposes.
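The first bullet can be illustrated with a small self-contained check: a rule is just an object, so it can be exercised directly against hand-built contexts and transactions, with no engine or database involved. The type shapes below are assumptions repeated here so the example compiles on its own; a real project would use a test framework such as xUnit or NUnit instead of plain exceptions.

```csharp
using System;

// Assumed shapes of the article's types, repeated so this example is self-contained.
public class UserContext { public string Role { get; set; } }
public class Transaction { public decimal Amount { get; set; } public bool Flagged { get; set; } }

public interface IRule
{
    bool Evaluate(UserContext context, Transaction transaction);
    void Execute(Transaction transaction);
}

public class TransactionAmountLimitRule : IRule
{
    private readonly decimal _limit;
    public TransactionAmountLimitRule(decimal limit) => _limit = limit;
    public bool Evaluate(UserContext c, Transaction t) => t.Amount > _limit;
    public void Execute(Transaction t) => t.Flagged = true;  // e.g. flag the transaction for review
}

public static class RuleTests
{
    public static void Run()
    {
        var rule = new TransactionAmountLimitRule(1000);
        var context = new UserContext { Role = "Regular" };

        // A transaction over the limit should trigger the rule...
        var big = new Transaction { Amount = 1500 };
        if (!rule.Evaluate(context, big)) throw new Exception("expected rule to trigger");

        // ...and one under the limit should not.
        var small = new Transaction { Amount = 500 };
        if (rule.Evaluate(context, small)) throw new Exception("expected rule not to trigger");

        // Executing the rule should apply its effect.
        rule.Execute(big);
        if (!big.Flagged) throw new Exception("expected transaction to be flagged");
    }
}
```

Because each rule is tested in isolation like this, a failing compliance scenario can usually be traced to a single rule rather than to the engine as a whole.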

Incorporating a dynamic and flexible rules engine for user-specific workflows in a .NET-based financial application can significantly enhance the system's ability to handle diverse business logic, user contexts, and complex transaction scenarios. By leveraging a well-structured rules engine, financial institutions can ensure that transaction processing, fraud detection, and user-specific workflows are handled efficiently and consistently.

The system can be tailored to meet evolving business demands, regulatory compliance, and user preferences by defining rules based on a variety of user factors (like roles, account types, and preferences) and transaction characteristics (like amount, type, and status). Business rules can be updated without requiring significant modifications to the underlying software thanks to the separation of business rules from core application logic, which also makes maintenance and future scaling easier.

Additionally, businesses can modify and adapt the workflow to changing requirements or user-specific scenarios by putting in place a dynamic rules-loading system and providing administrative interfaces for controlling rules. In addition to improving user experience and operational efficiency, this guarantees that the application remains responsive to evolving requirements and regulatory changes.

In the end, this strategy not only gives financial institutions the opportunity to automate and streamline decision-making procedures, but it also permits increased control, transparency, and auditability—all of which are essential for upholding compliance and confidence in the heavily regulated financial sector.

HostForLIFE ASP.NET Core Hosting

European best, cheap and reliable ASP.NET hosting with instant activation. HostForLIFE.eu is the #1 recommended Windows and ASP.NET hosting in the European continent, with a 99.99% uptime guarantee for reliability, stability and performance. The HostForLIFE.eu security team is constantly monitoring the entire network for unusual behaviour. We deliver hosting solutions including Shared hosting, Cloud hosting, Reseller hosting, Dedicated Servers, and IT as a Service for companies of all sizes.
