European ASP.NET 4.5 Hosting BLOG

BLOG about ASP.NET 4, ASP.NET 4.5 Hosting and Its Technology - Dedicated to European Windows Hosting Customer

European ASP.NET Core Hosting :: Error When Published ASP.NET Core? See Below Tips!

clock November 5, 2019 05:38 by author Scott

Over the past few years, we have discussed common errors you can encounter when publishing .NET Core; the most common is the 502.5 – Process Failure error.

Startup errors with ASP.NET Core don’t provide much information either, at least not in a production environment. Here are 7 tips for understanding and fixing those errors.

1. There are two types of startup errors.

There are unhandled exceptions originating outside of the Startup class, and exceptions from inside of Startup. These two error types can produce different behavior and may require different troubleshooting techniques.

2. ASP.NET Core will handle exceptions from inside the Startup class.

If code in the ConfigureServices or Configure methods throws an exception, the framework will catch the exception and continue execution.

Although the process continues to run after the exception, every incoming request will generate a 500 response with the message “An error occurred while starting the application”.

Two additional pieces of information about this behavior:

- If you want the process to fail in this scenario, call CaptureStartupErrors on the web host builder and pass the value false.

- In a production environment, the “error occurred” message is all the information you’ll see in a web browser. The framework deliberately avoids giving away error details in a response, because that information might give an attacker too much to work with. You can change the environment setting using the environment variable ASPNETCORE_ENVIRONMENT, but see the next two tips first. You don’t have to change the entire environment to see more error details.

3. Set detailedErrors in code to see a stack trace.

The following bit of code allows for detailed error messages, even in production, so use it with caution.

public static IWebHostBuilder CreateWebHostBuilder(string[] args) =>
    WebHost.CreateDefaultBuilder(args)
           .CaptureStartupErrors(true) // the default
           .UseSetting("detailedErrors", "true")
           .UseStartup<Startup>();

4. Alternatively, set the ASPNETCORE_DETAILEDERRORS environment variable.

Set the value to true and you’ll also see a stack trace, even in production, so use with caution.
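If the app is hosted behind IIS with the ASP.NET Core Module, one convenient place to set this is the aspNetCore element of web.config (a sketch only; the processPath and arguments values are placeholders matching the web.config shown in tip 6 below):

<aspNetCore processPath="dotnet" arguments=".\codes.dll">
  <environmentVariables>
    <environmentVariable name="ASPNETCORE_DETAILEDERRORS" value="true" />
  </environmentVariables>
</aspNetCore>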

5. Unhandled exceptions outside of the Startup class will kill the process.

Perhaps you have code inside of Program.cs to run schema migrations or perform other initialization tasks which fail, or perhaps the application cannot bind to the desired ports. If you are running behind IIS, this is the scenario where you’ll see a generic 502.5 Process Failure error message.

These types of errors can be a bit more difficult to track down, but the following two tips should help.

6. For IIS, turn on standard output logging in web.config.

If you are already logging with other tools, you might be able to capture the output there too, but if all else fails, ASP.NET Core writes exception information to stdout. By setting the log flag to true and creating the output directory, you’ll get a file containing the exception and a stack trace to help track down the problem.

The following shows the web.config file created by dotnet publish and is typically the config file in use when hosting .NET Core in IIS. The attribute to change is the stdoutLogEnabled flag.

<system.webServer>
  <handlers>
    <add name="aspNetCore" path="*" verb="*" modules="AspNetCoreModule" resourceType="Unspecified" />
  </handlers>
  <aspNetCore processPath="dotnet" arguments=".\codes.dll"
              stdoutLogEnabled="true" stdoutLogFile=".\logs\stdout" />
</system.webServer>

Important: Make sure to create the logging output directory.

Important: Make sure to turn logging off after troubleshooting is complete.

7. Use the dotnet CLI to run the application on your server.

If you have access to the server, it is sometimes easier to go directly to the server and use dotnet to witness the exception in real time. There’s no need to turn on logging or set and unset environment variables.
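For example, assuming the published output shown in the web.config above (the main assembly is codes.dll), you can run the app from the publish folder and watch the console for the exception:

dotnet .\codes.dll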

Summary

Debugging startup errors in ASP.NET Core is a simple case of finding the exception. In many cases, #7 is the simplest approach and doesn’t require code or environment changes. FYI, we also support the latest ASP.NET Core in our hosting environment. Feel free to visit our site at https://www.hostforlife.eu.



European ASP.NET Core Hosting - HostForLIFE.eu :: Get Currency Format Using Google In ASP.NET

clock May 21, 2019 11:00 by author Peter

Many times, I have encountered a vital question: "How can I change 20000.00 to $20,000.00?" This calls for a simple currency formatter. You can use either jQuery or server-side code to show the currency in this format. In this post, I will show you how to format the currency input from the user in ASP.NET.
Format Currency using jQuery

First, start with jQuery. Google has provided a very simple way to format currency; you can read up on it separately, or simply follow the steps below.

To continue, you have to download two JavaScript files: one is jquery.formatCurrency-1.4.0.js and the second is jquery.min.js. I have included these JS files with the sample code attached to this post.

So, let's start with some sample code. First, create a new project and add a new page; name it whatever you want. Then, add a new text box. We will hook into the text box losing focus (jQuery's focusout event), so we don't need an extra button to show the formatted currency.

Add the downloaded JS files to the header section of your web form, and then paste the following code into your page.

JS
    <script src="jquery.min.js"></script> 
    <script src="jquery.formatCurrency-1.4.0.js"></script> 
    <script type="text/javascript"> 
            $(document).ready(function () { 
                $('.text').focusout(function () { 
                    $('.text').formatCurrency(); 
                    $('.text').formatCurrency('.currencyLabel'); 
                }); 
            });        
    </script> 


HTML
    <div> 
         <p>Currency Formatter</p> 
     <asp:TextBox runat="server" ID="txtPrice" CssClass="text"></asp:TextBox> 
         Show by Jquery: <span class="currencyLabel"></span> 
    </div> 


Check the CssClass of the text box. That class is what the jQuery selector uses to call formatCurrency(): it formats the value in the text box and also writes the formatted output to the span.
Format Currency using C#

I hope it's clear to you how to format currency with jQuery. Now, let's see how to do this using C#. Don't worry, C# has a built-in format string for this. For C#, we'll use an extra button and display the output in a label.

    <asp:TextBox ID="txtCurrency" runat="server"></asp:TextBox> 
    <asp:Button ID="btnChange" Text="Format" runat="server" OnClick="btnChange_Click"   /> 
    <asp:Label ID="lblShow" runat="server"></asp:Label> 
     
    protected void btnChange_Click(object sender, EventArgs e) 
    { 
        lblShow.Text = (Convert.ToDouble(txtCurrency.Text)).ToString("C2"); 
    } 


Note that this formatting only applies to numeric types such as decimal and double, so you should add a check that the user input actually parses as a number, as shown below.
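A minimal sketch of such a guard, reusing the txtCurrency, btnChange and lblShow controls from the markup above:

    protected void btnChange_Click(object sender, EventArgs e) 
    { 
        // Only format the value if the input parses as a number. 
        if (double.TryParse(txtCurrency.Text, out double price)) 
        { 
            lblShow.Text = price.ToString("C2"); 
        } 
        else 
        { 
            lblShow.Text = "Please enter a valid number."; 
        } 
    } 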

 



European ASP.NET Core Hosting - HostForLIFE.eu :: Dependency Injection For Quartz.NET In .NET Core

clock May 14, 2019 12:18 by author Peter

Quartz.NET is a handy library that allows you to schedule recurring tasks by implementing the IJob interface. Its limitation is that, by default, it supports only a parameterless job constructor, which complicates injecting external services into a job, e.g., for implementing the repository pattern. In this article, we'll take a look at how to tackle this problem using the standard .NET Core DI container.

The whole project referred to in the article is provided in an accompanying GitHub repository. To follow the code in the article more easily, you might want to take a look at it.

Project Overview
Let's take a look at the initial solution structure.

The project QuartzDI.Demo.External.DemoService represents some external dependency we have no control over. For the sake of simplicity, it does quite a humble job.

The project QuartzDI.Demo is our working project, which contains a simple Quartz.NET job.
    public class DemoJob : IJob 
    { 
        private const string Url = "https://i.ua"; 
     
        public static IDemoService DemoService { get; set; } 
     
        public Task Execute(IJobExecutionContext context) 
        { 
            DemoService.DoTask(Url); 
            return Task.CompletedTask; 
        } 
    } 


It is set up in a straightforward way:
    var props = new NameValueCollection 
    { 
        { "quartz.serializer.type", "binary" } 
    }; 
    var factory = new StdSchedulerFactory(props); 
    var sched = await factory.GetScheduler(); 
    await sched.Start(); 
    var job = JobBuilder.Create<DemoJob>() 
        .WithIdentity("myJob", "group1") 
        .Build(); 
    var trigger = TriggerBuilder.Create() 
        .WithIdentity("myTrigger", "group1") 
        .StartNow() 
        .WithSimpleSchedule(x => x 
            .WithIntervalInSeconds(5) 
            .RepeatForever()) 
    .Build(); 
    await sched.ScheduleJob(job, trigger); 


We provide our external service via the job's static property.
    DemoJob.DemoService = new DemoService(); 

As the project is a console application, during the course of the article we'll have to manually install all the needed infrastructure, which helps build a more thorough understanding of what .NET Core actually brings to the table.

At this point, our project is up and running. And what is most important is that it's dead simple, which is great. But we pay for that simplicity at the cost of application inflexibility, which is fine if we want to leave it as a small tool, but that's often not the case for production systems. So let's tweak it a bit to make it more flexible.
Creating a Configuration File

One of the inflexibilities is that we hard-code the URL we call into a DemoJob. Ideally, we would like to change it and also change it depending on our environment. .NET Core comes with an appsettings.json mechanism for that matter.

In order to start working with the .NET Core configuration mechanism, we have to install a couple of NuGet packages.

  • Microsoft.Extensions.Configuration 
  • Microsoft.Extensions.Configuration.FileExtensions 
  • Microsoft.Extensions.Configuration.Json 

Let's create a file with that name and extract our URL into it:
    { 
      "connection": { 
        "Url": "http://i.ua" 
      } 
    } 

Now, we can extract our value from the config file as below.
    var builder = new ConfigurationBuilder() 
                    .SetBasePath(Directory.GetCurrentDirectory()) 
                    .AddJsonFile("appsettings.json", true, true); 
    var configuration = builder.Build(); 
    var connectionSection = configuration.GetSection("connection"); 
    DemoJob.Url = connectionSection["Url"];
 

Note that to make it happen, we had to change the URL from constant to property.
    public static string Url { get; set; } 

Using Constructor Injection

Injecting a service via a static property is fine for a simple project, but for a bigger one it carries several disadvantages: the job might be invoked without the service ever being provided (and thus fail), or the dependency might be swapped out while the object is in use. To address these issues, we should employ constructor injection.

Although there is nothing wrong with Pure Dependency Injection and some people argue that you should strive for it, in this article, we'll use a built-in .NET Core DI container which comes with a NuGet package Microsoft.Extensions.DependencyInjection.

Now, let us specify the service we depend on as a constructor argument.
    private readonly IDemoService _demoService; 
    public DemoJob(IDemoService demoService) 
    { 
        _demoService = demoService; 
    } 

In order to invoke a job constructor that takes parameters, Quartz.NET provides the IJobFactory interface. Here's our implementation.
    public class DemoJobFactory : IJobFactory 
    { 
        private readonly IServiceProvider _serviceProvider; 
     
        public DemoJobFactory(IServiceProvider serviceProvider) 
        { 
            _serviceProvider = serviceProvider; 
        } 
     
        public IJob NewJob(TriggerFiredBundle bundle, IScheduler scheduler) 
        { 
            return _serviceProvider.GetService<DemoJob>(); 
        } 
     
        public void ReturnJob(IJob job) 
        { 
            var disposable = job as IDisposable; 
            disposable?.Dispose(); 
        } 
    } 


Let's register our dependencies.
    var serviceCollection = new ServiceCollection(); 
    serviceCollection.AddScoped<DemoJob>(); 
    serviceCollection.AddScoped<IDemoService, DemoService>(); 
    var serviceProvider = serviceCollection.BuildServiceProvider(); 


The final piece of the puzzle is to make Quartz.NET use our factory. IScheduler has property JobFactory just for that matter.
    sched.JobFactory = new DemoJobFactory(serviceProvider); 

Using Options Pattern


Now, we can pull the same trick with configuration options. Again, our routine starts with a NuGet package, this time Microsoft.Extensions.Options.

Let's create a strongly typed definition for configuration options.
    public class DemoJobOptions 
    { 
        public string Url { get; set; } 
    } 


Now, we populate them as below.
    serviceCollection.AddOptions(); 
    serviceCollection.Configure<DemoJobOptions>(options => 
    { 
        options.Url = connectionSection["Url"]; 
    }); 


And inject them into a constructor. Note that we inject IOptions<T>, not the options instance directly.
    public DemoJob(IDemoService demoService, IOptions<DemoJobOptions> options) 
    { 
        _demoService = demoService; 
        _options = options.Value; 
    } 
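
For completeness (the original snippet doesn't show them), the job also needs the private fields behind that constructor, and Execute can then read the configured URL instead of the hard-coded constant. A minimal sketch:

    private readonly IDemoService _demoService; 
    private readonly DemoJobOptions _options; 

    public Task Execute(IJobExecutionContext context) 
    { 
        // Use the URL supplied via IOptions<DemoJobOptions> instead of the hard-coded constant. 
        _demoService.DoTask(_options.Url); 
        return Task.CompletedTask; 
    } 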


Conclusion
In this article, we've seen how we can leverage .NET Core functionality to make our use of Quartz.NET more flexible.

 

 



European ASP.NET Core Hosting :: How to Use Dapper Asynchronously in ASP.NET Core 2.1

clock November 12, 2018 08:13 by author Scott

In this post, we're going to create a very simple ASP.NET Core 2.1 application which uses Dapper to access data. There's already a sample project worked up over on GitHub, and you might want to use that to follow along here.

Step 1: Get the NuGet Package

First things first, we need to grab the NuGet package for Dapper. In Visual Studio, you can do this by right-clicking on your project file, selecting Manage NuGet Packages, and then searching for the Dapper package.
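
Alternatively, if you prefer the command line, the same package can be added with the .NET CLI from the project folder:

dotnet add package Dapper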

With that installed, let's try creating a repository.

Step 2: Creating an Employee Class and Repository

For this demo, I am not going to go over how to create a database or show a demo database with sample data; I don't have one available and it's a pain to make one. So let's assume we have a table Employee with columns for FirstName, LastName, ID, and DateOfBirth. We can make a corresponding C# class for this table, like so:

public class Employee
{
    public int ID { get; set; }
    public string FirstName { get; set; }
    public string LastName { get; set; }
    public DateTime DateOfBirth { get; set; }
}

Now we need a sample repository. Let's call it EmployeeRepository and give it an interface so we can use ASP.NET Core's Dependency Injection setup.

Here's the interface:

public interface IEmployeeRepository
{
    Task<Employee> GetByID(int id);
    Task<List<Employee>> GetByDateOfBirth(DateTime dateOfBirth);
}

Now we can work up a skeleton implementation of this repository. Here's what we're starting with:

public class EmployeeRepository : IEmployeeRepository
{
    public async Task<Employee> GetByID(int id)
    {
        throw new NotImplementedException(); // implemented in Step 5
    }

    public async Task<List<Employee>> GetByDateOfBirth(DateTime dateOfBirth)
    {
        throw new NotImplementedException(); // implemented in Step 6
    }
}

We also need to update our project's Startup file to include our new repository in the services layer:

public class Startup
{
    public Startup(IConfiguration configuration)
    {
        Configuration = configuration;
    }

    public IConfiguration Configuration { get; }

    public void ConfigureServices(IServiceCollection services)
    {
        services.AddTransient<IEmployeeRepository, EmployeeRepository>();

        services.AddMvc().SetCompatibilityVersion(CompatibilityVersion.Version_2_1);
    }

    public void Configure(IApplicationBuilder app, IHostingEnvironment env)
    {
        //...
    }
}

Next, we need to enable this repository to use Dapper. Before we can do that, however, we need it to be aware of what connection string we are using.

Step 3: Injecting IConfiguration

Very often in ASP.NET Core projects, Connection Strings are defined in the appSettings.json file:

{
  "Logging": {
    "LogLevel": {
      "Default": "Debug",
      "System": "Information",
      "Microsoft": "Information"
    }
  },
  "ConnectionStrings": {
    "MyConnectionString": "MyConnectionString"
  }
}

The problem is: how do we pass that connection string to the repository so it can create a SqlConnection object for Dapper to use?

ASP.NET Core introduces a new IConfiguration object which can be injected into other classes. That injected instance contains a method called GetConnectionString, which we can use to obtain our connection string from the appSettings.json file. So, let's inject IConfiguration like so:

public class EmployeeRepository : IEmployeeRepository
{
    private readonly IConfiguration _config;

    public EmployeeRepository(IConfiguration config)
    {
        _config = config;
    }

    //Remainder of file is unchanged
}

Step 4: Creating a SqlConnection

With the injected IConfiguration now available to our repository, we can create a Dapper-enabled SqlConnection object that all of our repository's methods can utilize.

public class EmployeeRepository : IEmployeeRepository
{
    private readonly IConfiguration _config;

    public EmployeeRepository(IConfiguration config)
    {
        _config = config;
    }

    public IDbConnection Connection
    {
        get
        {
            return new SqlConnection(_config.GetConnectionString("MyConnectionString"));
        }
    }

    //Remainder of file is unchanged
}

Step 5: Employee by ID

Let's first create a method to return employees by their ID.

To start, let's remember that the way Dapper works is by processing raw SQL and then mapping it to an object. Because our table columns and object properties share the same names, we don't need to do any special mapping here.
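
If the column names did not match the property names, one simple option is to alias the columns in the SQL so Dapper can still map them by name. A hypothetical sketch (the EMP_ID/FIRST_NAME column names are made up for illustration; they are not part of the sample project):

string sQuery = @"SELECT EMP_ID AS ID,
                         FIRST_NAME AS FirstName,
                         LAST_NAME AS LastName,
                         DATE_OF_BIRTH AS DateOfBirth
                  FROM Employee
                  WHERE EMP_ID = @ID";
var result = await conn.QueryAsync<Employee>(sQuery, new { ID = id });
return result.FirstOrDefault();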

Here's the implementation of our GetByID method:

public class EmployeeRepository : IEmployeeRepository
{
    //...

    public async Task<Employee> GetByID(int id)
    {
        using (IDbConnection conn = Connection)
        {
            string sQuery = "SELECT ID, FirstName, LastName, DateOfBirth FROM Employee WHERE ID = @ID";
            conn.Open();
            var result = await conn.QueryAsync<Employee>(sQuery, new { ID = id });
            return result.FirstOrDefault();
        }
    }
}

Step 6: Employees by Date of Birth

We also need to get all employees born on a particular date. Since we are now returning a collection of employees rather than a single one, the implementation changes very slightly.

public class EmployeeRepository : IEmployeeRepository
{
    //...

    public async Task<List<Employee>> GetByDateOfBirth(DateTime dateOfBirth)
    {
        using (IDbConnection conn = Connection)
        {
            string sQuery = "SELECT ID, FirstName, LastName, DateOfBirth FROM Employee WHERE DateOfBirth = @DateOfBirth";
            conn.Open();
            var result = await conn.QueryAsync<Employee>(sQuery, new { DateOfBirth = dateOfBirth });
            return result.ToList();
        }
    }
}

Step 7: Implement the Controller

The final step is creating a controller into which our EmployeeRepository can be injected. Here it is:

[Route("api/[controller]")]
[ApiController]
public class EmployeeController : ControllerBase
{
   
private readonly IEmployeeRepository _employeeRepo;

   
public EmployeeController(IEmployeeRepository employeeRepo)
   
{
       
_employeeRepo = employeeRepo;
   
}

   
[HttpGet]
   
[Route("{id}")]
   
public async Task<ActionResult<Employee>> GetByID(int id)
   
{
       
return await _employeeRepo.GetByID(id);
   
}

   
[HttpGet]
   
[Route("dob/{dateOfBirth}")]
   
public async Task<ActionResult<List<Employee>>> GetByID(DateTime dateOfBirth)
   
{
       
return await _employeeRepo.GetByDateOfBirth(dateOfBirth);
   
}
}

Summary

That's it! We've implemented Dapper into our ASP.NET Core 2.1 application!



European ASP.NET Core Hosting :: How to Measure and Report the Response Time of ASP.NET Core

clock November 8, 2018 09:57 by author Scott

Performance is a buzzword for APIs. One of the most important and measurable parameters of the API performance is the response time. In this article, we will understand how to add code to measure the response time of an API and then return the response time data to the end client.

What is the need for this?

So, let's take a moment to think about why we would ever need such a feature to measure the response time of an API. The following points have been the inspiration for writing code to capture response time.

  • You need to define the SLA (Service Level Agreement) for your API with your clients. The clients need to understand how much time the API takes to respond. The response time data collected over time can help us decide on an SLA for our API.
  • Management is interested in reports on how fast or slow the application is. You need data to corroborate your claims, so it is worth having reports on the performance of the application to share with stakeholders.
  • The client needs to have the information of the Response time of the API so that they can track how much time is spent on the client and the Server.

You might also have encountered similar requests in your project and it is worthwhile looking at an approach to capture the response time for the API.

Where to add the code?

Let's explore a couple of approaches to capture the response time of our API, focusing mostly on the time spent in the API itself. Our objective is to calculate the time elapsed, in milliseconds, from the moment the request is received by the ASP.NET Core runtime to the moment the response is processed and sent back from the server.

What factors are we ignoring?

It's important to understand that this discussion doesn't include the time spent on the network, in IIS, or in application pool startup. If the application pool wasn't up and running, the first request can affect the overall response time of the API. There is an Application Initialization module we could make use of, but that is out of scope for this article.

First Attempt

One very naive approach to capturing the response time of an API would be to add code at the start and end of every API method and then measure the delta to calculate the response time, as shown below.

// GET api/values/5   
[HttpGet]  
public IActionResult Get() {  
    // Start the watch   
    var watch = new Stopwatch();  
    watch.Start();  
    // Your actual Business logic   
    // End the watch  
    watch.Stop();  
    var responseTimeForCompleteRequest = watch.ElapsedMilliseconds;  
} 

This code should be able to calculate the time spent in an operation, but it doesn't seem to be the right approach, for the following reasons:

If an API has a lot of operations, then we need to add this code in multiple places, which is bad for maintainability.

This code measures only the time spent in the method itself; it doesn't measure the time spent on other activities like middleware, filters, controller selection, action method selection, model binding, etc.

Second Attempt

Let's try to improve the above code by centralizing it in one place so that it is easier to maintain. We need to execute the response time calculation code before and after a method executes. If you have worked with earlier versions of ASP.NET Web API, you will be familiar with the concept of filters: filters allow you to run code before or after specific stages in the request processing pipeline.

We will implement a filter for calculating the Response time as shown below. We will create a Filter and use the OnActionExecuting to start the timer and then stop the timer in method OnActionExecuted, thus calculating the response time of the API.

public class ResponseTimeActionFilter: IActionFilter { 
    private const string ResponseTimeKey = "ResponseTimeKey"; 
    public void OnActionExecuting(ActionExecutingContext context) { 
        // Start the timer  
        context.HttpContext.Items[ResponseTimeKey] = Stopwatch.StartNew(); 
    } 
    public void OnActionExecuted(ActionExecutedContext context) { 
        Stopwatch stopwatch = (Stopwatch) context.HttpContext.Items[ResponseTimeKey]; 
        // Calculate the time elapsed  
        var timeElapsed = stopwatch.Elapsed; 
    } 
}  

This is still not a reliable technique for calculating the response time, as it doesn't address measuring the time spent in middleware, controller selection, action method selection, model binding, etc. The filter pipeline runs after MVC selects the action to execute, so it effectively doesn't instrument the time spent in the rest of the ASP.NET Core pipeline.

Third Attempt

We will use ASP.NET Core middleware to calculate the response time of the API.

So, what is Middleware?

Basically, middleware are software components that handle requests and responses. Middleware is assembled into an application pipeline and handles each incoming request. Each component does the following:

  • Chooses whether to pass the request to the next component in the pipeline. 
  • Can perform work before and after the next component in the pipeline is invoked.

If you have worked with HTTP modules or HTTP handlers in ASP.NET, then you can think of middleware as their replacement in ASP.NET Core. Some examples of middleware are:

  • MVC Middleware
  • Authentication
  • Static File Serving
  • Caching
  • CORS

 

 

We want to start the timer once the request enters the ASP.NET Core pipeline and stop it once the response has been processed by the pipeline. Custom middleware at the start of the request pipeline is the best way to get access to the request as early as possible and to stay involved until the last step in the pipeline has executed.

We will build a response time middleware which we will add as the first middleware in the request pipeline, so that the timer starts as soon as the request enters the ASP.NET Core pipeline.

What to do with the Response time data?

Once we capture the response time data we can process data in the following ways.

  1. Add the Response time data to a Reporting database or an analytics solution.
  2. Write the Response time data to a log file.
  3. Pass the response time data to a message queue which can further be processed by another application for reporting and analytics.
  4. Send the Response time information to the client applications consuming our Rest API using the Response headers.
  5. There may be other useful ways of using the response time data. Please leave a comment and tell me how you process the response time data in your application.

Let's write the code

We will write the code considering the following points.

  • Calculating the response time data for the API
  • Reporting the data back to client applications by passing the data in the Response headers.

Full code snippet for the ResponseTimeMiddleware is shown below.

public class ResponseTimeMiddleware { 
    // Name of the Response Header, Custom Headers starts with "X-" 
    private const string RESPONSE_HEADER_RESPONSE_TIME = "X-Response-Time-ms"; 

    // Handle to the next Middleware in the pipeline 
    private readonly RequestDelegate _next; 

    public ResponseTimeMiddleware(RequestDelegate next) { 
        _next = next; 
    } 

    public Task InvokeAsync(HttpContext context) { 
        // Start the Timer using Stopwatch 
        var watch = new Stopwatch(); 
        watch.Start(); 

        context.Response.OnStarting(() => { 
            // Stop the timer information and calculate the time 
            watch.Stop(); 
            var responseTimeForCompleteRequest = watch.ElapsedMilliseconds; 

            // Add the Response time information in the Response headers. 
            context.Response.Headers[RESPONSE_HEADER_RESPONSE_TIME] = responseTimeForCompleteRequest.ToString(); 
            return Task.CompletedTask; 
        }); 

        // Call the next delegate/middleware in the pipeline 
        return this._next(context); 
    } 
}

Explanation of the code

The interesting part happens in the InvokeAsync method. We use the Stopwatch class to start timing once the request enters the first middleware, and stop it once the request has been processed and the response is ready to be sent back to the client. The OnStarting method gives us the opportunity to register a delegate that will be invoked just before the response headers are sent to the client.

Lastly, we add the response time information in a custom header. We use X-Response-Time-ms as the response header name; as a convention, custom headers start with "X-".
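
The article doesn't show the registration step, but the middleware only runs if it is added to the pipeline. A minimal sketch (assuming the ResponseTimeMiddleware class above), registering it first in Startup.Configure so the timer covers everything that follows:

public void Configure(IApplicationBuilder app) 
{ 
    // Register the response time middleware first so it wraps the rest of the pipeline. 
    app.UseMiddleware<ResponseTimeMiddleware>(); 

    app.UseMvc(); 
}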

Conclusion

In this article, we understood how to leverage ASP.NET middleware to manage cross-cutting concerns like measuring the response time of the APIs. There are various other useful use cases of using middleware which can help to reuse code and improve the maintainability of the application.




European ASP.NET Core Hosting - HostForLIFE.eu :: How to Add Localisation to ASP.NET Core Application

clock January 25, 2017 06:07 by author Scott

In this post I'll walk through the process of adding localisation to an ASP.NET Core application using the recommended approach with resx resource files.

Introduction to Localisation

Localisation in ASP.NET Core is broadly similar to the way it works in ASP.NET 4.X. By default, you define a number of .resx resource files in your application, one for each culture you support. You then reference resources via a key, and depending on the current culture, the appropriate value is selected from the closest matching resource file.

While the concept of a .resx file per culture remains in ASP.NET Core, the way resources are used has changed quite significantly. In the previous version, when you added a .resx file to your solution, a designer file would be created, providing static strongly typed access to your resources through calls such as Resources.MyTitleString.

In ASP.NET Core, resources are accessed through two abstractions, IStringLocalizer and IStringLocalizer<T>, which are typically injected where needed via dependency injection. These interfaces have an indexer, that allows you to access resources by a string key. If no resource exists for the key (i.e. you haven't created an appropriate .resx file containing the key), then the key itself is used as the resource.

Consider the following example:

using Microsoft.AspNetCore.Mvc; 
using Microsoft.Extensions.Localization;

public class ExampleClass 
{
    private readonly IStringLocalizer<ExampleClass> _localizer;
    public ExampleClass(IStringLocalizer<ExampleClass> localizer)
    {
        _localizer = localizer;
    }

    public string GetLocalizedString()
    {
        return _localizer["My localized string"];
    }
}

In this example, calling GetLocalizedString() will cause the IStringLocalizer<T> to check the current culture and see whether we have an appropriate resource file for ExampleClass containing a resource with the name/key "My localized string". If it finds one, it returns the localised version; otherwise, it returns "My localized string" itself.

The idea behind this approach is to allow you to design your app from the beginning to use localisation, without having to do up front work to support it by creating the default/fallback .resx file. Instead, you can just write the default values, then add the resources in later.

Personally, I'm not sold on this approach - it makes me slightly twitchy to see all those magic strings around which are essentially keys into a dictionary. Any changes to the keys may have unintended consequences, as I'll show later in the post.

Adding localisation to your application

For now, I'm going to ignore that concern, and dive in using Microsoft's recommended approach. I've started from the default ASP.NET Core Web application without authentication.

The first step is to add the localisation services in your application. As we are building an MVC application, we'll also configure View localisation and DataAnnotations localisation. The localisation packages are already referenced indirectly by the Microsoft.AspNetCore.MVC package, so you should be able to add the services and middleware directly in your Startup class:

public void ConfigureServices(IServiceCollection services) 
{
    services.AddLocalization(opts => { opts.ResourcesPath = "Resources"; });

    services.AddMvc()
        .AddViewLocalization(
            LanguageViewLocationExpanderFormat.Suffix,
            opts => { opts.ResourcesPath = "Resources"; })
        .AddDataAnnotationsLocalization();
}

These services allow you to inject the IStringLocalizer service into your classes. They also allow you to have localised view files (so you can have views with names like MyView.fr.cshtml) and to inject the IViewLocalizer, so you can use localisation in your view files. Calling AddDataAnnotationsLocalization configures the validation attributes to retrieve resources via an IStringLocalizer.

The ResourcesPath parameter on the options object specifies the folder of our application in which resources can be found. So if the root of our application is ExampleProject, we have specified that our resources will be stored in the folder ExampleProject/Resources.

Configuring these services is all that is required to allow you to use the localisation services in your application. However, you will typically also need some way to select the current culture for a given request.

To do this, we use the RequestLocalizationMiddleware. This middleware uses a number of different providers to try and determine the current culture. To configure it with the default providers, we need to decide which cultures we support, and which is the default culture.

Note that the configuration example in the documentation didn't work for me, though the Localization.StarterWeb project they reference did, and is reproduced below.

public void ConfigureServices(IServiceCollection services) 
{
    // ... previous configuration not shown

    services.Configure<RequestLocalizationOptions>(
        opts =>
        {
            var supportedCultures = new[]
            {
                new CultureInfo("en-GB"),
                new CultureInfo("en-US"),
                new CultureInfo("en"),
                new CultureInfo("fr-FR"),
                new CultureInfo("fr"),
            };

            opts.DefaultRequestCulture = new RequestCulture("en-GB");
            // Formatting numbers, dates, etc.
            opts.SupportedCultures = supportedCultures;
            // UI strings that we have localized.
            opts.SupportedUICultures = supportedCultures;
        });
}

public void Configure(IApplicationBuilder app) 
{
    app.UseStaticFiles();
    var options = app.ApplicationServices.GetService<IOptions<RequestLocalizationOptions>>();
    app.UseRequestLocalization(options.Value);

    app.UseMvc(routes =>
    {
        routes.MapRoute(
            name: "default",
            template: "{controller=Home}/{action=Index}/{id?}");
    });
}

Using localisation in your classes

We now have most of the pieces in place to start adding localisation to our application. We don't yet have a way for users to select which culture they want to use, but we'll come to that shortly. For now, lets look at how we go about retrieving a localised string.

Controllers and services

Whenever you want to access a localised string in your services or controllers, you can inject an IStringLocalizer<T> and use its indexer property. For example, imagine you want to localise a string in a controller:

public class HomeController: Controller 
{
    private readonly IStringLocalizer<HomeController> _localizer;

    public HomeController(IStringLocalizer<HomeController> localizer)
    {
        _localizer = localizer;
    }

    public IActionResult Index()
    {
        ViewData["MyTitle"] = _localizer["The localised title of my app!"];
        return View(new HomeViewModel());
    }
}

Calling _localizer[] will look up the provided string based on the current culture and the type HomeController. Assuming we have configured our application as discussed previously, the HomeController resides in the ExampleProject.Controllers namespace, and we are currently using the fr culture, so the localizer will look for either of the following resource files:

  • Resources/Controllers.HomeController.fr.resx
  • Resources/Controllers/HomeController.fr.resx

If a resource exists in one of these files with the key "The localised title of my app!" then it will be used, otherwise the key itself will be used as the resource. This means you don't need to add any resource files to get started with localisation - you can just use the default language string as your key and come back to add .resx files later.

Views

There are two kinds of localisation of views. As described previously, you can localise the whole view, duplicating it and editing as appropriate, and providing a culture suffix. This is useful if the views need to differ significantly between different cultures.

You can also localise strings in a similar way to that shown for the HomeController. Instead of an IStringLocalizer<T>, you inject an IViewLocalizer into the view. This handles HTML encoding a little differently, in that it allows you to store HTML in the resource and it won't be encoded before being output. Generally you'll want to avoid that however, and only localise strings, not HTML.

The IViewLocalizer uses the name of the view file to find the associated resources, so for the HomeController's Index.cshtml view, with the fr culture, the localiser will look for:

  • Resources/Views.Home.Index.fr.resx
  • Resources/Views/Home/Index.fr.resx

The IViewLocalizer is used in a similar way to IStringLocalizer<T> - pass in the string in the default language as the key for the resource:

@using Microsoft.AspNetCore.Mvc.Localization
@model AddingLocalization.ViewModels.HomeViewModel
@inject IViewLocalizer Localizer
@{
    ViewData["Title"] = Localizer["Home Page"];
}
<h2>@ViewData["MyTitle"]</h2> 

DataAnnotations

One final common area that needs localisation is DataAnnotations. These attributes can be used to provide validation, naming and UI hints of your models to the MVC infrastructure. When used, they provide a lot of additional declarative metadata to the MVC pipeline, allowing selection of appropriate controls for editing the property etc.

Error messages for DataAnnotation validation attributes all pass through an IStringLocalizer<T> if you configure your MVC services using AddDataAnnotationsLocalization(). As before, this allows you to specify the error message for an attribute in your default language in code, and use that as the key to other resources later.

public class HomeViewModel 
{
    [Required(ErrorMessage = "Required")]
    [EmailAddress(ErrorMessage = "The Email field is not a valid e-mail address")]
    [Display(Name = "Your Email")]
    public string Email { get; set; }
}

Here you can see we have three DataAnnotation attributes, two of which are ValidationAttributes, and the DisplayAttribute, which is not. The ErrorMessage specified for each ValidationAttribute is used as a key to lookup the appropriate resource using an IStringLocalizer<HomeViewModel>. Again, the files searched for will be something like:

  • Resources/ViewModels.HomeViewModel.fr.resx
  • Resources/ViewModels/HomeViewModel.fr.resx

A key thing to be aware of is that the DisplayAttribute is not localised using the IStringLocalizer<T>. This is far from ideal, but I'll address it in my next post on localisation.

Allowing users to select a culture

With all this localisation in place, the final piece of the puzzle is to actually allow users to select their culture. The RequestLocalizationMiddleware uses an extensible provider mechanism for choosing the current culture of a request, and it comes with three providers built in:

  • QueryStringRequestCultureProvider
  • AcceptLanguageHeaderRequestCultureProvider
  • CookieRequestCultureProvider

These allow you to specify a culture in the querystring (e.g. ?culture=fr-FR), via the Accept-Language header in a request, or via a cookie. Of the three approaches, using a cookie is the least intrusive, as it is seamlessly sent with every request and does not require the user to set the Accept-Language header in their browser or to add a value to the querystring with every request.

Again, the Localization.StarterWeb sample project provides a handy implementation that shows how you can add a select box to the footer of your project to allow the user to set the language. Their choice is stored in a cookie, which is handled by the CookieRequestCultureProvider for each request. The provider then sets the CurrentCulture and CurrentUICulture of the thread for the request to the user's selection.

To add the selector to your application, create a partial view _SelectLanguagePartial.cshtml in the Shared folder of your Views:

@using System.Threading.Tasks
@using Microsoft.AspNetCore.Builder
@using Microsoft.AspNetCore.Localization
@using Microsoft.AspNetCore.Mvc.Localization
@using Microsoft.Extensions.Options

@inject IViewLocalizer Localizer
@inject IOptions<RequestLocalizationOptions> LocOptions

@{
    var requestCulture = Context.Features.Get<IRequestCultureFeature>();
    var cultureItems = LocOptions.Value.SupportedUICultures
        .Select(c => new SelectListItem { Value = c.Name, Text = c.DisplayName })
        .ToList();
}

<div title="@Localizer["Request culture provider:"] @requestCulture?.Provider?.GetType().Name"> 
    <form id="selectLanguage" asp-controller="Home"
          asp-action="SetLanguage" asp-route-returnUrl="@Context.Request.Path"
          method="post" class="form-horizontal" role="form">
        @Localizer["Language:"] <select name="culture"
                                        asp-for="@requestCulture.RequestCulture.UICulture.Name" asp-items="cultureItems"></select>
        <button type="submit" class="btn btn-default btn-xs">Save</button>

    </form>
</div> 

We want to display this partial on every page, so update the footer of your _Layout.cshtml to reference it:

<footer> 
    <div class="row">
        <div class="col-sm-6">
            <p>&copy; 2016 - Adding Localization</p>
        </div>
        <div class="col-sm-6 text-right">
            @await Html.PartialAsync("_SelectLanguagePartial")
        </div>
    </div>
</footer> 

Finally, we need to add the controller code to handle the user's selection. This currently maps to the SetLanguage action in the HomeController:

[HttpPost]
public IActionResult SetLanguage(string culture, string returnUrl) 
{
    Response.Cookies.Append(
        CookieRequestCultureProvider.DefaultCookieName,
        CookieRequestCultureProvider.MakeCookieValue(new RequestCulture(culture)),
        new CookieOptions { Expires = DateTimeOffset.UtcNow.AddYears(1) }
    );

    return LocalRedirect(returnUrl);
}

And that's it! If we fire up the home page of our application, you can see the culture selector in the bottom right corner. At this stage, I have not added any resource files, but if I trigger a validation error, you can see that the resource key is used for the resource itself.

My development flow is not interrupted by having to go and mess with resource files; I can just develop the application using the default language and add resx files later in development. If I later add appropriate resource files for the fr culture, and a user changes their culture via the selector, I can see the effect of localisation in the validation attributes and other localised strings.

As you can see, the validation attributes and page title are localised, but the label field 'Your Email' has not, as that is set in the DisplayAttribute.

Summary

In this post I showed how to add localisation to your ASP.NET Core application using the recommended approach of providing resources for the default language as keys, and only adding additional resources as required later.

In summary, the steps to localise your application are roughly as follows:

1. Add the required localisation services
2. Configure the localisation middleware and, if necessary, a culture provider
3. Inject IStringLocalizer<T> into your controllers and services to localise strings
4. Inject IViewLocalizer into your views to localise strings in views
5. Add resource files for non-default cultures
6. Add a mechanism for users to choose their culture



European ASP.NET Core Hosting - HostForLIFE.eu :: Dependency Injection in ASP.NET Core

clock January 16, 2017 11:12 by author Scott

One of the nice things that the new ASP.NET Core stack brings to the table, is Dependency Injection (DI) as a first-class citizen, right out of the box. DI is nothing new, even for ASP.NET, but in the earlier versions, it wasn't baked into the platform, and developers were forced to jump through hoops in order to enable it.

Let's look at the status quo and how things are changing for the better with the new DI system in ASP.NET Core...

Status quo

Because of the history of ASP.NET, the timelines and factoring of its different products, like WebForms, MVC, SignalR, Katana (OWIN) and Web API, they've each had their own way of doing DI. Some products have extensibility points that you can leverage in order to plug in an Inversion of Control (IoC) container:

  • Web API: System.Web.Http.Dependencies.IDependencyResolver and System.Web.Http.Dependencies.IDependencyScope
  • MVC: System.Web.Mvc.IDependencyResolver
  • SignalR: Microsoft.AspNet.SignalR.IDependencyResolver

While others, like WebForms and Katana, don't. Some will argue that the IDependencyResolver-type abstraction, which is essentially an implementation of the Service Locator pattern, is an anti-pattern and should be avoided, but that's a discussion for another day.

There are also other ways of achieving DI within some of the frameworks; MVC has IControllerFactory and IControllerActivator, Web API has IHttpControllerActivator etc. All of these are extensibility points that you can implement in order to leverage DI in your controllers.

Implementing these abstractions yourself isn't something that you typically want, or should have to, do. Most IoC containers have already implemented these adapters for you and ship them as NuGet packages. If we take Autofac as an example, some adapters include:

  • Autofac.Mvc4
  • Autofac.Mvc5
  • Autofac.Owin
  • Autofac.WebApi
  • Autofac.WebApi2
  • Autofac.SignalR
  • Autofac.Web (WebForms)

As you can see, it quickly starts to add up - and this is just for a single container! Imagine if I'd compiled a list for the gazillion different IoC containers in the .NET space. Each of the adapters needs to be maintained, updated, versioned etc. That's a big burden on the adapter maintainers and the community in general.

On the consuming side of this, for a typical web application using MVC, SignalR and Web API, you'd end up needing three (or more) of these adapters, in order to leverage DI across the application.

The future

Even though a lot of ideas and code have been carried forward from Katana, ASP.NET Core is by all means a re-imagining, re-write, re-EVERYTHING of the entire current ASP.NET stack. Hell, it's even triggered a re-jigging of the entire .NET (Core) platform and tooling. This means it's the perfect time to bring DI into the platform itself, and to let all components on top benefit from a single, unified way of doing DI.

Say hello to IServiceProvider! Even though the interface itself isn't new (it's been living in mscorlib under the System namespace since .NET 1.1), it's found new life in the ASP.NET Core DI system. It's also accompanied by a couple of new interfaces; IServiceCollection, which is essentially a builder for an IServiceProvider and IServiceScope, which is intended for resolving services within a specific lifetime scope, like per-request.

In order for things to Just Work™, out of the box, Microsoft have implemented a lightweight IoC container that ships with the ASP.NET Core hosting layer. It's in the Microsoft.Extensions.DependencyInjection NuGet package.

When ASP.NET Core is bootstrapped, it creates an instance of IServiceCollection and passes it to user code using the ConfigureServices method of the Startup class:

public class Startup 
{
    public void ConfigureServices(IServiceCollection services)
    {
        // This method gets called by the runtime.
        // Use this method to add services to the container.

         // Adds the services MVC requires to run.
        services.AddMvc();

        // Add some custom services
        services.AddSingleton<ICache, Cache>();
        services.AddScoped<IDatabaseSession, DatabaseSession>();
    }

    // ...
}

In this method, you're free to add whatever services your application needs, and they will magically be available for constructor injection across the board. Different components in the stack also ship with extension methods to conveniently add the services the component needs to the collection, like AddMvc (shown above), AddCors, AddEntityFramework etc.
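
For example, a hypothetical ContactsController (not from the original post) could now take the custom services registered above straight in its constructor; a minimal sketch:

public class ContactsController : Controller 
{
    private readonly ICache _cache;
    private readonly IDatabaseSession _databaseSession;

    // Both services are resolved from the container and injected automatically.
    public ContactsController(ICache cache, IDatabaseSession databaseSession)
    {
        _cache = cache;
        _databaseSession = databaseSession;
    }
}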

Now, it's important to note that the default implementation, living in Microsoft.Extensions.DependencyInjection is a deliberately lightweight, feature poor (is that a word?), fast, implementation of an IoC container. It has just the amount of features needed for the runtime/platform/framework to compose itself and run. A "lowest common denominator" feature set, if you will. If you want more advanced features, like many do, Microsoft actively encourages you to Bring Your Own Container (BYOC), or layer the functionality on top, which I've done with Scrutor. This brings us back to IoC container adapters.

If you want to use a third party container, you have to, like before, implement your own version of IServiceProvider (and its accompanying interfaces), or use an adapter that someone in the community has already provided. Several of these adapters are already available.

The difference this time is that you only need a single adapter to enable DI across the board. To plug in the adapter, you have to change the return type of the ConfigureServices method to IServiceProvider and return the adapter implementation. Using StructureMap.Dnx as an example, let's look at our startup class again:

public class Startup 
{
    public IServiceProvider ConfigureServices(IServiceCollection services)
    {
        // This method gets called by the runtime.
        // Use this method to add services to the container.

        // Adds the services MVC requires to run.
        services.AddMvc();

        // Add some custom services
        services.AddSingleton<ICache, Cache>();
        services.AddScoped<IDatabaseSession, DatabaseSession>();

        // Create an instance of a StructureMap container.
        var container = new Container();

        // Here we can add stuff to container, using StructureMap-specific APIs...

        // Populate the StructureMap container with
        // services from the IServiceCollection.
        container.Populate(services);

        // Resolve the StructureMap-specific IServiceProvider
        // and return it to the runtime.
        return container.GetInstance<IServiceProvider>();
    }

    // ...
}

By doing this, all components will resolve their services from the StructureMap container, and you'll be able to utilize the full feature set of StructureMap, like its awesome diagnostics, property injection, convention-based registrations, profiles, decoration etc.

This post turned out longer than I expected, just to show a couple of lines of code at the end, but I thought it would be interesting to put everything in perspective, and hopefully you found that useful too. As you can see, the DI story has been greatly simplified in this new world, while still allowing you, as an application, library or framework developer, to utilize DI across the board with minimal effort.

 

 



European ASP.NET Core 1.0 Hosting - HostForLIFE.eu :: Setup Angular 2 in ASP.NET Core 1.0

clock October 12, 2016 00:11 by author Scott

This tutorial aims to get you started with Angular 2 in ASP.NET Core using Visual Studio 2015. With the release of Angular 2 and ASP.NET Core RC, building SPAs is becoming interesting.

I have compiled the steps involved in starting to learn Angular 2. This is a detailed explanation, and things will feel much easier by the end of the article.

Create Your ASP.NET Core Project

Open Visual Studio 2015 Community Edition Update 3, select New Web Project, name it "ngCoreContacts", and select the "Empty" project template. Don't forget to install the new web tools for ASP.NET Core 1.0.

I used Visual Studio 2015 Community Edition Update 3 (must update), TypeScript 2.0 (must), the latest NPM, and Gulp.

Setup ASP.NET Core to Serve Static Files

ASP.NET Core is designed as a pluggable framework: you include and use only the necessary packages, instead of pulling in too much initial stuff.

Let's create an HTML file named "index.html" under the wwwroot folder.

Right click on the wwwroot folder, select Add New Item, and create an index.html file. This HTML page will act as the default page.

<!DOCTYPE html>
<html>
<head>
    <meta charset="utf-8" />
    <title>Angular 2 with ASP.NET Core</title>
</head>
<body>
    <h1>Demo of Angular 2 using ASP.NET Core with Visual Studio 2015</h1>
</body>
</html>

For ASP.NET Core to serve static files, we need to add the StaticFiles middleware in the Configure method of Startup.cs. Ensure that packages are restored properly.

project.json is redesigned to make it better, we have Static Files middleware to serve static assets like HTML, JS files etc.

public void Configure(IApplicationBuilder app)
        {
            app.UseDefaultFiles();
            app.UseStaticFiles();
        }

 

{
  "dependencies": {
    "Microsoft.NETCore.App": {
      "version": "1.0.1",
      "type": "platform"
    },
    "Microsoft.AspNetCore.Diagnostics": "1.0.0",
    "Microsoft.AspNetCore.Server.IISIntegration": "1.0.0",
    "Microsoft.AspNetCore.Server.Kestrel": "1.0.1",
    "Microsoft.Extensions.Logging.Console": "1.0.0",
    "Microsoft.AspNetCore.StaticFiles": "1.0.0"
  },

  "tools": {
    "Microsoft.AspNetCore.Server.IISIntegration.Tools": "1.0.0-preview2-final"
  },

  "frameworks": {
    "netcoreapp1.0": {
      "imports": [
        "dotnet5.6",
        "portable-net45+win8"
      ]
    }
  },

  "buildOptions": {
    "emitEntryPoint": true,
    "preserveCompilationContext": true,
    "compile": {
      "exclude": [ "node_modules" ]
    }
  },

  "runtimeOptions": {
    "configProperties": {
      "System.GC.Server": true
    }
  },

  "publishOptions": {
    "include": [
      "wwwroot",
      "web.config"
    ]
  },

  "scripts": {   
    "postpublish": [ "dotnet publish-iis --publish-folder %publish:OutputPath% --framework %publish:FullTargetFramework%" ]
  }
}

Run the application now; ASP.NET Core renders the static HTML page.

Delete this index.html page; we will be injecting it dynamically later. Up to now, you have seen a demonstration of "wwwroot" as the root folder for ASP.NET Core web apps.

Setup Angular 2 in ASP.NET Core

Angular 2 famously claims to be ONE framework for MOBILE and DESKTOP apps. There won't be any breaking changes after the final release.

This tutorial refers to the 5 MIN QUICKSTART for getting started; it's more focused on other lightweight code editors, but here we are using Visual Studio 2015 Community Edition Update 3 for its built-in TypeScript tooling and other features.

We will be using webpack as the module bundler; it's an excellent alternative to the systemJS approach. To learn more about its inner workings, read "webpack and Angular 2".

The majority of the webpack scripting is based on AngularClass’s angular2-webpack-starter, which I have modified for ASP.NET Core web apps.

Adding NPM Configuration file for Angular 2 Packages

The Angular 2 team publishes its code changes through NPM rather than a CDN or any other source, so we need to add an NPM configuration file (package.json) to this ASP.NET Core application.

Right-click “ngCoreContacts” and add a new “NPM Configuration File”; by default a package.json is added to the ASP.NET Core project. This acts as the Node Package Manager (NPM) file and is a must for adding the Angular 2 packages.

From the Angular 2 quickstart mentioned above, we need to add the dependencies required for Angular 2 in the ASP.NET Core application. Copy and paste the configuration below into the package.json file.

{
    "version": "1.0.0",
    "description": "ngcorecontacts",
    "main": "wwwroot/index.html",
  "scripts": {
    "build:dev": "webpack --config config/webpack.dev.js --progress --profile",   
    "build:prod": "webpack --config config/webpack.prod.js  --progress --profile --bail",
    "build": "npm run build:dev",   
    "server:dev:hmr": "npm run server:dev -- --inline --hot",
    "server:dev": "webpack-dev-server --config config/webpack.dev.js --progress --profile --watch --content-base clientsrc/",
    "server:prod": "http-server dist --cors",
    "server": "npm run server:dev",
    "start:hmr": "npm run server:dev:hmr",
    "start": "npm run server:dev",
    "version": "npm run build",
    "watch:dev:hmr": "npm run watch:dev -- --hot",
    "watch:dev": "npm run build:dev -- --watch",
    "watch:prod": "npm run build:prod -- --watch",
    "watch:test": "npm run test -- --auto-watch --no-single-run",
    "watch": "npm run watch:dev",   
    "webpack-dev-server": "webpack-dev-server",
    "webpack": "webpack"
  },
  "dependencies": {
    "@angular/common": "~2.0.1",
    "@angular/compiler": "~2.0.1",
    "@angular/core": "~2.0.1",
    "@angular/forms": "~2.0.1",
    "@angular/http": "~2.0.1",
    "@angular/platform-browser": "~2.0.1",
    "@angular/platform-browser-dynamic": "~2.0.1",
    "@angular/router": "~3.0.1",
    "@angular/upgrade": "~2.0.1",
    "angular-in-memory-web-api": "~0.1.1",
    "@angularclass/conventions-loader": "^1.0.2",
    "@angularclass/hmr": "~1.2.0",
    "@angularclass/hmr-loader": "~3.0.2",
    "@angularclass/request-idle-callback": "^1.0.7",
    "@angularclass/webpack-toolkit": "^1.3.3",
    "assets-webpack-plugin": "^3.4.0",
    "core-js": "^2.4.1",
    "http-server": "^0.9.0",
    "ie-shim": "^0.1.0",
    "rxjs": "5.0.0-beta.12",
    "zone.js": "~0.6.17",
    "@angular/material": "^2.0.0-alpha.9",
    "hammerjs": "^2.0.8"
  },
  "devDependencies": {
    "@types/hammerjs": "^2.0.33",
    "@types/jasmine": "^2.2.34",
    "@types/node": "^6.0.38",
    "@types/source-map": "^0.1.27",
    "@types/uglify-js": "^2.0.27",
    "@types/webpack": "^1.12.34",
    "angular2-template-loader": "^0.5.0",
    "awesome-typescript-loader": "^2.2.1",
    "codelyzer": "~0.0.28",
    "copy-webpack-plugin": "^3.0.1",
    "clean-webpack-plugin": "^0.1.10",
    "css-loader": "^0.25.0",
    "exports-loader": "^0.6.3",
    "expose-loader": "^0.7.1",
    "file-loader": "^0.9.0",
    "gh-pages": "^0.11.0",
    "html-webpack-plugin": "^2.21.0",
    "imports-loader": "^0.6.5",
    "json-loader": "^0.5.4",
    "parse5": "^1.3.2",
    "phantomjs": "^2.1.7",
    "raw-loader": "0.5.1",
    "rimraf": "^2.5.2",
    "source-map-loader": "^0.1.5",
    "string-replace-loader": "1.0.5",
    "style-loader": "^0.13.1",
    "sass-loader": "^3.1.2",   
    "to-string-loader": "^1.1.4",
    "ts-helpers": "1.1.1",
    "ts-node": "^1.3.0",
    "tslint": "3.15.1",
    "tslint-loader": "^2.1.3",
    "typedoc": "^0.4.5",
    "typescript": "2.0.3",
    "url-loader": "^0.5.7",
    "webpack": "2.1.0-beta.22",
    "webpack-dev-middleware": "^1.6.1",
    "webpack-dev-server": "^2.1.0-beta.2",
    "webpack-md5-hash": "^0.0.5",
    "webpack-merge": "^0.14.1"
  }
}

Right after saving this, ASP.NET Core starts restoring the packages. It downloads all packages listed in the dependencies and devDependencies sections of the package.json above.

Sometimes in Solution Explorer you might see “Dependencies – not installed”; don’t worry, this is a bug in the tooling. All the npm packages are actually installed.

Add a TypeScript configuration file – a must for Angular 2 in ASP.NET Core using TypeScript

Since we are writing the Angular 2 app in TypeScript, we obviously need a TypeScript configuration file. It controls how TypeScript is transpiled to JavaScript, how modules are resolved, and that the output targets the ES5 standard.

Add “tsconfig.json” in the project, copy paste below configuration.

{
  "compilerOptions": {
    "target": "es5",
    "module": "commonjs",
    "moduleResolution": "node",
    "emitDecoratorMetadata": true,
    "experimentalDecorators": true,
    "allowSyntheticDefaultImports": true,
    "sourceMap": true,
    "noEmitHelpers": true,
    "strictNullChecks": false,
    "baseUrl": "./clientsrc",
    "paths": [],
    "lib": [
      "dom",
      "es6"
    ],
    "types": [
      "hammerjs",     
      "node",     
      "source-map",
      "uglify-js",
      "webpack"
    ]
  },
  "exclude": [
    "node_modules",
    "dist"
  ],
  "awesomeTypescriptLoaderOptions": {
    "forkChecker": true,
    "useWebpackText": true
  },
  "compileOnSave": false,
  "buildOnSave": false,
  "atom": { "rewriteTsconfig": false }
}

It’s mandatory to install TypeScript 2.0 to work with Angular 2.

At present typings.json is not required because we are using @types packages with TypeScript 2. However, if you’re using other packages that don’t have @types entries, then a typings.json has to be added.

Use Webpack as Module Bundler

Webpack is a powerful module bundler. A bundle is a JavaScript file that incorporates assets that belong together and should be served to the client in response to a single file request. A bundle can include JavaScript, CSS styles, HTML, and almost any other kind of file.

Webpack roams over your application source code, looking for import statements, building a dependency graph, and emitting one (or more) bundles. With plugin “loaders” Webpack can preprocess and minify different non-JavaScript files such as TypeScript, SASS, and LESS files.
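
As a tiny illustration (the component and template file names below are hypothetical, not taken from the repo), a plain TypeScript import is enough for webpack to follow a dependency, and a loader lets it pull in a non-JavaScript asset as well:

// Illustrative only - 'contact-list.component.html' is a hypothetical file name.
import { Component } from '@angular/core';                  // a normal module import: webpack follows it into @angular/core
const template = require('./contact-list.component.html');  // a non-JS asset: raw-loader turns the HTML file into a plain string

@Component({
  selector: 'contact-list',
  template: template   // the HTML string ends up inside the JS bundle
})
export class ContactListComponent { }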

In package.json, we added the webpack-related packages under “devDependencies”; they perform all the bundling work.

What webpack does is described in a JavaScript configuration file known as webpack.config.js. As always, applications run in Development, Test and Production environments.

Some functionality is common and some is specific to an environment. We will focus on the development and production environments and write the configuration accordingly.

The development environment should have source maps for debugging the TypeScript files; minifying the JS, CSS and other bundles is not necessary.

The production environment should minify bundles to reduce loading time and should not include source maps. Webpack 2 also does tree shaking, i.e. it eliminates unused code to further reduce bundle sizes.
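
As a tiny illustration of tree shaking (the module and function names below are made up for the example):

// utils.ts - hypothetical module, used only to illustrate tree shaking
export function usedHelper(): number {
  return 1;   // imported somewhere in the app, so it stays in the production bundle
}

export function neverImported(): number {
  return 2;   // never imported anywhere, so a tree-shaking production build can drop it
}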

webpack.config.js – Based on the environment set in process.env.NODE_ENV, it runs either the dev or the prod configuration.
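
Here is a minimal sketch of what that switch can look like (assuming the dev/prod files live in ./config, as the npm scripts above suggest); the actual file in the repo may be more elaborate:

// webpack.config.js (project root) - a minimal sketch of the environment switch
switch (process.env.NODE_ENV) {
  case 'prod':
  case 'production':
    module.exports = require('./config/webpack.prod.js');
    break;
  case 'dev':
  case 'development':
  default:
    module.exports = require('./config/webpack.dev.js');
}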

webpack.common.js – Before the environment-specific configuration runs, it performs the tasks common to both environments (a trimmed-down sketch of this file follows the list below):

  • Webpack splits the Angular 2 app into 3 bundles: polyfills (to maintain backward compatibility with older browsers), vendor (third-party JS, CSS, HTML, SCSS, images, JSON etc. in one file) and boot (application-specific files)
  • Resolves imports based on various file extensions
  • Webpack itself doesn’t know what to do with a non-JavaScript file. We teach it to process such files into JavaScript with loaders. For this, we have written loaders for TS, HTML, JSON, fonts and images
  • Any static assets placed in “clientsrc/assets” will be copied to the assets folder using CopyWebpackPlugin
  • CleanWebpackPlugin cleans the “wwwroot/dist” folder every time we run it, so that we get a fresh set of files.
  • I told you above to delete the index.html file; now clientsrc/index.html will be moved to wwwroot using HtmlWebpackPlugin. In addition, webpack injects the bundle files (the polyfills, vendor and boot JS files) and includes them as script references in the HTML.
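
A trimmed-down sketch of such a webpack.common.js is shown below. It is not the complete file from the starter; the option names follow the webpack 1-style syntax that the 2.1 beta still accepts, and the entry file names are assumed to match the clientsrc layout described later in this article.

// config/webpack.common.js - trimmed-down sketch, not the complete file from the repo
var webpack = require('webpack');
var HtmlWebpackPlugin = require('html-webpack-plugin');
var CopyWebpackPlugin = require('copy-webpack-plugin');
var CleanWebpackPlugin = require('clean-webpack-plugin');

module.exports = {
  // split the app into three bundles: polyfills, third-party vendors and our own code
  entry: {
    'polyfills': './clientsrc/polyfills.browser.ts',
    'vendor': './clientsrc/vendor.browser.ts',
    'boot': './clientsrc/boot.ts'
  },

  // resolve imports that omit the file extension
  resolve: {
    extensions: ['.ts', '.js', '.json']
  },

  module: {
    loaders: [
      // teach webpack how to turn non-JavaScript files into modules
      { test: /\.ts$/, loaders: ['awesome-typescript-loader', 'angular2-template-loader'] },
      { test: /\.html$/, loader: 'raw-loader' },
      { test: /\.json$/, loader: 'json-loader' },
      { test: /\.css$/, loaders: ['to-string-loader', 'css-loader'] },
      { test: /\.(png|jpe?g|gif|svg|woff|woff2|ttf|eot|ico)$/, loader: 'file-loader' }
    ]
  },

  plugins: [
    // start every build with a clean wwwroot/dist folder
    new CleanWebpackPlugin(['wwwroot/dist']),
    // copy static assets from clientsrc/assets to the output assets folder
    new CopyWebpackPlugin([{ from: 'clientsrc/assets', to: 'assets' }]),
    // generate index.html in the output and inject the three bundle script tags
    new HtmlWebpackPlugin({ template: 'clientsrc/index.html', chunksSortMode: 'dependency' })
  ]
};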

Now let’s look at webpack.dev.js, used for development (a minimal sketch of this file follows the list below):

  • Running “webpack-dev-server” – this runs the entire application in memory; any change to a source file is applied immediately
  • Loads the application in debug mode with source maps. Everything runs in memory, i.e. the HTML, JS and static files are served from memory.
  • Runs the application on localhost port 3000. The port can be changed at your convenience.
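
A minimal sketch of such a webpack.dev.js, assuming webpack-merge and the common config sketched above:

// config/webpack.dev.js - minimal sketch for the development build
var webpackMerge = require('webpack-merge');
var commonConfig = require('./webpack.common.js');

module.exports = webpackMerge(commonConfig, {
  // source maps so the original TypeScript can be debugged in the browser
  devtool: 'cheap-module-source-map',

  output: {
    path: './wwwroot/dist',
    publicPath: 'http://localhost:3000/',
    filename: '[name].bundle.js',
    sourceMapFilename: '[name].map'
  },

  // webpack-dev-server keeps everything in memory and rebuilds on change
  devServer: {
    port: 3000,
    historyApiFallback: true,
    watchOptions: { aggregateTimeout: 300, poll: 1000 }
  }
});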

Now let’s look at webpack.prod.js, used for production (a minimal sketch follows the list below):

  • Merges all the bundle files and copies them to wwwroot.
  • Minifies all files so they load faster, using the UglifyJsPlugin plugin
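
And a minimal sketch of such a webpack.prod.js, under the same assumptions:

// config/webpack.prod.js - minimal sketch for the production build
var webpack = require('webpack');
var webpackMerge = require('webpack-merge');
var commonConfig = require('./webpack.common.js');

module.exports = webpackMerge(commonConfig, {
  // no source maps; write real, hashed bundle files into wwwroot/dist
  output: {
    path: './wwwroot/dist',
    publicPath: 'dist/',
    filename: '[name].[chunkhash].bundle.js'
  },

  plugins: [
    // minify the bundles to reduce download size
    new webpack.optimize.UglifyJsPlugin({ compress: { warnings: false } })
  ]
});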

Writing Angular 2 application

Until now we created the ASP.NET Core app and added the tsconfig file and the webpack configuration. Now it’s time to write the Angular 2 application.

In the GitHub repo you can see the “clientsrc” folder. It contains the Angular 2 app, which gets bundled using the webpack configurations we wrote.

The “clientsrc” folder has index.html, polyfills.browser.ts, vendor.browser.ts and, most importantly, boot.ts.

Inside it there is an app folder containing the HTML, the Angular 2 components and the root-level module (app.module.ts), which is loaded while bootstrapping the application.
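
For orientation, here is roughly what the smallest versions of these files look like; the exact contents in the repo are richer, but the shape is the same:

// clientsrc/boot.ts - bootstraps the root module in the browser
import { platformBrowserDynamic } from '@angular/platform-browser-dynamic';
import { AppModule } from './app/app.module';

platformBrowserDynamic().bootstrapModule(AppModule);

// clientsrc/app/app.module.ts - root module: declares the components and names the bootstrap component
import { NgModule } from '@angular/core';
import { BrowserModule } from '@angular/platform-browser';
import { AppComponent } from './app.component';

@NgModule({
  imports: [BrowserModule],
  declarations: [AppComponent],
  bootstrap: [AppComponent]
})
export class AppModule { }

// clientsrc/app/app.component.ts - root component; its selector matches the <my-app> tag in index.html
import { Component } from '@angular/core';

@Component({
  selector: 'my-app',
  template: '<h1>ngCoreContacts - Angular 2 on ASP.NET Core</h1>'
})
export class AppComponent { }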

Some of these files might not be interesting right now; we will focus on them in separate articles later.

Running the application

Before running, make sure you have run the command “npm install”. It might not be needed, but it ensures all packages are installed.

Now let’s run the application in development mode

  • From the command line (in the same directory as package.json), type “npm start” and hit Enter. This starts webpack-dev-server, which loads the application and listens on localhost:3000.
  • When the console says “bundle is now VALID”, open a browser and navigate to http://localhost:3000 to see the application load.

Notice the wwwroot folder: we don’t see any files copied there, because everything is running in memory.

Now that the application runs properly in the browser, let’s understand how the Angular 2 app loads:

  • When the browser starts rendering the index.html page, it encounters the <my-app>Loading…</my-app> tag.
  • Angular’s platformBrowserDynamic module then bootstraps the AppModule in clientsrc/app through the line platformBrowserDynamic().bootstrapModule(AppModule).
  • AppModule then loads app.component.ts, which is listed as the bootstrap entry in @NgModule.
  • The AppComponent then resolves the <my-app> tag through its selector and renders the UI from the TypeScript code.

When we enter “npm start” in the console to run the application, the scripts section of package.json points it to the command below:

webpack-dev-server --config config/webpack.dev.js --progress --profile --watch --content-base clientsrc/

This invokes webpack-dev-server, runs the development config and watches for changes in the clientsrc folder. Any change in this folder reloads the application with the updated code.

Running the application in Production mode

Assuming the application is now ready to be deployed, we need a PROD build. For this, run the command:

// builds the app and copies the output into wwwroot
npm run build:prod

Now if you look at the wwwroot folder, you see the HTML and JS bundle files. This wwwroot folder can be deployed on any web server such as IIS or nginx.

You can either press F5 to run from the Visual Studio IDE or run the command npm run server:prod.




