Dotnet Core Hosted Service

Problem Statement: Develop a hosted service that runs inside a .NET Core app. The service needs to support dependency injection and use the app's logging configuration.

Frameworks:
ASP.NET Core 3.0 (this works in 2.2 as well)

Step 1:
In Startup.cs, register the hosted service with the service collection:

services.AddHostedService<EmailBackgroundService>();

Step 2:
I just want to highlight a couple of key areas from the snippet below.

  • Hosted services are singletons, so you cannot inject scoped or transient services directly.
  • IServiceProvider can be used to create a scope, from which you can resolve the required services from the dependency container.
  • The cancellation token is key to handling graceful shutdown while keeping the service running in the background.
public class EmailBackgroundService : BackgroundService
{
    private readonly IServiceProvider _serviceProvider;
    private readonly ILogger<EmailBackgroundService> _logger;

    public EmailBackgroundService(IServiceProvider serviceProvider, ILogger<EmailBackgroundService> logger)
    {
        _serviceProvider = serviceProvider;
        _logger = logger;
    }

    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        while (!stoppingToken.IsCancellationRequested)
        {
            await RunSomeProcess();
            await Task.Delay(5000, stoppingToken);
        }
    }

    private async Task RunSomeProcess()
    {
        // Create a scope so scoped services can be resolved from this singleton background service.
        using var providerScope = _serviceProvider.CreateScope();
        var userService = providerScope.ServiceProvider.GetService<IUserService>();
        _logger.Log(LogLevel.Information, $"{userService.GetUserId()}");
    }
}

Have fun!

Entity Framework Core Override Conventions

Problem Statement: In our project, we use the code-first approach to generate migrations. There is one behavior I couldn't control through the Fluent API.

EF Core generates indexes on foreign keys. This is the default behavior, but I always like to have control over index creation.

Current Frameworks:

ASP.NET Core 2.2 API with EF Core.

Solution 1:

I can delete the unwanted indexes after every migration is generated, but this manual step might be missed one day, so I started looking at automating it.

Solution 2:

I found that EF Core configuration gives you the option to replace one of its internal services, so I replaced the convention set builder to remove the foreign key index convention.

public static IServiceCollection InitDatabaseContext(this IServiceCollection services, string connectionString)
{
    services.AddDbContext<AppDbContext>(opts =>
    {
        opts.UseSqlServer(connectionString);
        opts.ReplaceService<IConventionSetBuilder, CustomSetBuilder>();
    });
    return services;
}

public class CustomSetBuilder : SqlServerConventionSetBuilder
{
    public CustomSetBuilder(RelationalConventionSetBuilderDependencies dependencies, ISqlGenerationHelper sqlGenerationHelper)
        : base(dependencies, sqlGenerationHelper)
    {
    }

    public override ConventionSet AddConventions(ConventionSet conventionSet)
    {
        // Remove the convention that creates an index for every foreign key.
        var et = conventionSet.ForeignKeyAddedConventions.FirstOrDefault(f => f is ForeignKeyIndexConvention);
        if (et != null)
            conventionSet.ForeignKeyAddedConventions.Remove(et);
        return base.AddConventions(conventionSet);
    }
}

Testing:
I generated a migration for my project with the default conventions, and then generated another one after removing this convention. The only difference was that the default indexes on foreign keys were no longer created.

Note: Please test thoroughly before you add this to your project.

Have fun!

.Net Core Projects Code Coverage

Out of the box, code coverage features are only supported by:
  • Visual Studio Ultimate
  • ReSharper

I used the approach below to generate a coverage report locally and also in an Azure DevOps build pipeline.

Step 1:
Add the code coverage NuGet package to every test project (the script below relies on the Coverlet MSBuild integration).
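If the package isn't referenced yet, adding it looks like this (MyApp.Tests is a placeholder project name; I'm assuming the package is coverlet.msbuild, since that is what provides the /p:CollectCoverage switch used by the script):

cd MyApp.Tests
dotnet add package coverlet.msbuild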

Step 2:
I created a JavaScript file that does all the heavy lifting: it runs the unit tests with code coverage, converts the results into a report, and opens the report.

// code-coverage.js
var commandExists = require('command-exists');
const execSync = require('child_process').execSync;
const rimraf = require('rimraf');

function runCoverageReport() {
    console.log('started code coverage');
    rimraf.sync('TestResults');
    let result = execSync('dotnet test /p:CollectCoverage=true /p:CoverletOutputFormat=cobertura /p:CoverletOutput=./TestResults/');
    console.log(result.toString());
    console.log('finished code coverage');
}

function generateReport() {
    console.log('started report generation');
    let result = execSync('reportgenerator "-reports:./**/TestResults/coverage.cobertura.xml" "-targetdir:./TestResults/CoverageReport/"');
    console.log(result.toString());
    console.log('finished report generation');
}

function openReport() {
    const osPlatform = process.platform;
    console.log(`opening report for ${osPlatform}`);
    if (osPlatform === 'darwin') {
        execSync('open ./TestResults/CoverageReport/index.htm');
    }
    else if (osPlatform === 'win32') {
        // process.platform is 'win32' even on 64-bit Windows
        execSync('start ./TestResults/CoverageReport/index.htm');
    }
}

async function main() {
    try {
        await commandExists('coverlet');
        console.log('coverlet command found and running coverage report');
    } catch (e) {
        console.log('installing coverlet.....');
        execSync('dotnet tool install --global coverlet.console');
    }
    runCoverageReport();
    try {
        await commandExists('reportgenerator');
        console.log('reportgenerator command found and running report');
    } catch (e) {
        console.log('installing report generator.....');
        execSync('dotnet tool install --global dotnet-reportgenerator-globaltool');
    }
    generateReport();
    openReport();
}

main();

Step 3:
You have a couple of options to run this. I always prefer to create an npm script command in package.json (the script also needs the command-exists and rimraf packages as devDependencies):


"scripts" : {
"code:coverage" : "node code-coverage.js"
},
"devDependencies": {
"command-exists": "^1.2.8"
}
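Then run it with:

npm run code:coverage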

Finally
I have tested this on both Windows and Mac.

One Last Thing

I also integrated this into the Azure DevOps build pipeline:

  • For the test runner task, pass the same coverage arguments used in the script above.
  • Add a Publish Code Coverage Results task and point it at the generated Cobertura output, as sketched below.
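If you use YAML pipelines, the two steps look roughly like this (a sketch; the project glob and file paths are assumptions, so adjust them to your repo layout):

steps:
- task: DotNetCoreCLI@2
  displayName: 'Run tests with code coverage'
  inputs:
    command: test
    projects: '**/*Tests.csproj'
    arguments: '/p:CollectCoverage=true /p:CoverletOutputFormat=cobertura /p:CoverletOutput=./TestResults/'

- task: PublishCodeCoverageResults@1
  displayName: 'Publish code coverage'
  inputs:
    codeCoverageTool: 'Cobertura'
    summaryFileLocation: '$(System.DefaultWorkingDirectory)/**/TestResults/coverage.cobertura.xml'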

Happy Coding!

View PDF Files in an Ionic App

Requirement: Generate a PDF stream on the server side and show the PDF inside an Ionic app.

Steps:
1. Install these two plugins, Document Viewer and File, from Ionic Native (install commands are sketched below):

Document Viewer
File Api
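A sketch of the install commands, assuming the Ionic Native (v4-style) packages imported in the snippet below:

ionic cordova plugin add cordova-plugin-document-viewer
npm install --save @ionic-native/document-viewer

ionic cordova plugin add cordova-plugin-file
npm install --save @ionic-native/file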

2. Get the PDF stream as an array buffer or blob. You need to set the proper response type on the HTTP GET request.

3. Now create and show the PDF file from the server response.

import { File } from '@ionic-native/file';
import { DocumentViewer } from '@ionic-native/document-viewer';
import { Http, Headers, RequestOptions, Response, ResponseContentType } from '@angular/http';

export class DocumentViewerService
{
    // Document here is the app's own model (with databaseId and title), not the DOM Document.
    constructor(private file: File, private documentViewer: DocumentViewer, private http: Http)
    {
    }

    public openDocument(document: Document)
    {
        if (!document)
        {
            this.showError("Document selected is null");
            return;
        }
        let path = "documents/" + document.databaseId;
        let headers = new Headers({
            'Accept': 'application/pdf'
        });
        // Request the PDF as an array buffer so it can be written out as a file.
        let options = new RequestOptions({ headers, responseType: ResponseContentType.ArrayBuffer });
        let observableResponse = this.http.get(appConstants.apiUrl + path, options);
        observableResponse.subscribe((response: Response) => {
            this.showDocument(response, document);
        }, (error: any) => {
        });
    }

    private showDocument(response: Response, document: Document)
    {
        // Turn the buffer into a PDF blob, write it to the data directory and open it in the viewer.
        let buffer = response.arrayBuffer();
        let pdfBlob = new Blob([buffer], { type: 'application/pdf' });
        this.file.writeFile(this.file.dataDirectory, document.databaseId + ".pdf", pdfBlob, { replace: true }).then(c => {
            this.documentViewer.viewDocument(this.file.dataDirectory + document.databaseId + ".pdf", "application/pdf",
                { print: { enabled: true }, bookmarks: { enabled: true }, email: { enabled: true }, title: document.title });
        });
    }
}

Send JSON in HTTP GET with Angular and Web API

Requirement: Need to send a complex object in an HTTP GET request using Angular and receive the JSON in Web API.

Frontend
First prepare a URLSearchParams object and add the key/value pairs.
Then pass the RequestOptions to the Angular http service.

The final URL format will be:
{{API-URL}}/api/folders/370/documents?filter={ 'startDate':'08/01/2005', 'endDate':'08/31/2005','currentPage': '2', 'pageLimit': '50'}

import { Injectable } from '@angular/core';
import { Observable } from 'rxjs/Observable';
import { Http, Headers, RequestOptions, Response, URLSearchParams } from '@angular/http';

@Injectable()
export class DocumentService
{
    constructor(private http: Http)
    {
    }

    loadDocuments(folderId: number, currentPage: number): Observable<Response>
    {
        let headers = new Headers({
            'Content-Type': 'application/json',
            'Accept': 'application/json'
        });
        let requestOptions = new RequestOptions({ headers });
        requestOptions.params = this.getFilterData(currentPage);
        // appConstants.apiUrl is the app's own base API URL constant.
        let path = "folders/" + folderId + "/documents";
        return this.http.get(appConstants.apiUrl + path, requestOptions);
    }

    private getFilterData(currentPage: number): any
    {
        // The filter object is serialized to JSON and sent as a single query parameter.
        let params: URLSearchParams = new URLSearchParams();
        let filterValue = JSON.stringify({ currentPage: currentPage, pageLimit: 50 });
        params.set('filter', filterValue);
        return params;
    }
}

On the API Side
Get the query parameter and deserialize it to a view model.
Your server-side view model should match the JSON object sent in the request.

var filterData = Request.GetQueryNameValuePairs();
if (filterData == null || !filterData.Any())
{
    throw new ArgumentException("Invalid request: search parameters are missing.");
}
// DocumentFilterViewModel is a view model matching the filter JSON (see the sketch below).
var viewModel = JsonConvert.DeserializeObject<DocumentFilterViewModel>(filterData.First().Value);
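The view model class isn't shown here; a minimal sketch matching the filter JSON above (DocumentFilterViewModel is just an assumed name) could look like this:

public class DocumentFilterViewModel
{
    public DateTime StartDate { get; set; }
    public DateTime EndDate { get; set; }
    public int CurrentPage { get; set; }
    public int PageLimit { get; set; }
}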

Customize bash in Mac terminal to show version control information

Highlights: I came across a nice utility, vcprompt, which shows git source control information in the macOS bash terminal.

Steps:
Install vcprompt using brew
brew install vcprompt

Now add this code to the .bashrc file in your home directory.
You can look up different colors and use them as variables with a "$" prefix.
You can pass different format options to the vcprompt -f flag.

NO_COLOR='\e[0m'
RED='\e[0;31m'
export PS1="\n\u:\w \[$RED\] \$(vcprompt -f [%b])\[$NO_COLOR\] \n→ "

Once you are done with the above changes, either restart the terminal or run this command to reload the configuration file.

source ~/.bashrc

Result: You will now see the project's current branch and other information when you navigate to a folder that is under source control.

Xml DataType With Entity Framework

Requirement: Need to save an object collection in a SQL Server xml data type column.
Note: SQL Server 2016 added JSON support as an alternative.

Steps:
1. Declare the Employee and Address classes.
2. Create an EmployeeMapping of type EntityTypeConfiguration, which is used by the DbContext.
Note: The Addresses collection should be ignored in the mapping to avoid a "key required" error on the Address entity.

Note: this is not foolproof code.
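For reference, the table this mapping targets looks roughly like this (a sketch of the expected schema, not actual generated output):

CREATE TABLE [dbo].[Employees]
(
    [Id]      INT IDENTITY (1, 1) NOT NULL PRIMARY KEY,
    [Name]    NVARCHAR (MAX) NULL,
    [Address] XML NULL -- holds the serialized Addresses collection
);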

using System;
using System.Collections.Generic;
using System.ComponentModel.DataAnnotations.Schema;
using System.Data.Entity;
using System.Data.Entity.ModelConfiguration;
using System.IO;
using System.Linq;
using System.Xml.Serialization;

namespace EntityFrameworkXmlDemo
{
    internal class Program
    {
        public static void Main(string[] args)
        {
            Console.WriteLine("Welcome");
            bool retrieve = true;
            var context = new SampleContext("connection string here");
            if (retrieve)
            {
                var employee = context.Employees.FirstOrDefault(e => e.Id == 1);
                var addresses = employee.Addresses;
            }
            else
            {
                var employee = new Employee()
                {
                    Name = "Fantastic"
                };
                employee.Addresses.Add(new Address()
                {
                    Name = "Newyork",
                    Zip = 1234
                });
                employee.Addresses.Add(new Address()
                {
                    Name = "California",
                    Zip = 7658
                });
                // Serialize the Addresses collection into the xml-typed Address column before saving.
                employee.SerializeAddress();
                context.Employees.Add(employee);
                context.SaveChanges();
            }
            Console.ReadKey();
        }
    }

    public class SampleContext : DbContext
    {
        public DbSet<Employee> Employees { get; set; }

        public SampleContext(string connectionString) : base(connectionString)
        {
        }

        protected override void OnModelCreating(DbModelBuilder modelBuilder)
        {
            modelBuilder.Configurations.Add(new EmployeeMapping());
            base.OnModelCreating(modelBuilder);
        }
    }

    public class EmployeeMapping : EntityTypeConfiguration<Employee>
    {
        public EmployeeMapping()
        {
            this.ToTable("Employees");
            this.HasKey(d => d.Id);
            this.Property(d => d.Id).HasDatabaseGeneratedOption(DatabaseGeneratedOption.Identity);
            this.Property(d => d.Name);
            this.Property(d => d.Address).HasColumnType("xml");
            // Ignore the collection so EF doesn't try to map Address as an entity.
            this.Ignore(d => d.Addresses);
        }
    }

    public class Employee
    {
        private IList<Address> _addresses;

        public int Id { get; set; }
        public string Name { get; set; }

        // Backing column that stores the serialized addresses as xml.
        public string Address { get; set; }

        public IList<Address> Addresses
        {
            get
            {
                _addresses = DeSerializeAddress();
                return _addresses;
            }
            set => _addresses = value;
        }

        public void SerializeAddress()
        {
            var serializer = new XmlSerializer(typeof(List<Address>));
            var stringWriter = new StringWriter();
            serializer.Serialize(stringWriter, _addresses);
            Address = stringWriter.ToString();
        }

        public IList<Address> DeSerializeAddress()
        {
            if (!string.IsNullOrEmpty(Address))
            {
                var serializer = new XmlSerializer(typeof(List<Address>));
                var stringReader = new StringReader(Address);
                return (IList<Address>)serializer.Deserialize(stringReader);
            }
            return _addresses;
        }

        public Employee()
        {
            _addresses = new List<Address>();
        }
    }

    public class Address
    {
        public string Name { get; set; }
        public int Zip { get; set; }
    }
}

Deploy Node Web Application to IIS

Requirement: Need to deploy a Node web application to on-premise IIS 8.5.

Steps:
1. Create an app pool user and assign the user to the IIS_IUSRS group. It's easier to maintain permissions when we work with Windows groups. This is a special group created for IIS, and it makes life easier when we move applications to different Windows servers.
2. Create a virtual directory for your application and give IIS_IUSRS full permissions on this directory (see the icacls sketch after this list).
3. Now create a website or application for the virtual directory and set it to use the app pool identity created in step 1.
4. Now download and install the iisnode MSI. Please pay attention to the documentation related to URL Rewrite.
5. After you install the MSI, run the batch file "C:\Program Files\iisnode\setupsamples.bat" with admin privileges.
6. Open IIS Manager and expand the Default Web Site; you should see the samples deployed under a node application. This gave me a pretty good understanding of how iisnode works.
7. I have two different Node web application setups, covered below:
  • The Node application serves both static content and the API.
  • The Node application serves the API only, and static files are served by IIS.
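Granting the permissions from step 2 can be scripted as well; a rough sketch using icacls (the folder path is just a placeholder):

icacls "C:\inetpub\wwwroot\my-node-app" /grant "IIS_IUSRS:(OI)(CI)F" /T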

Node Serving Both Static Files and API

Create a web.config file in the virtual directory and add these settings.
rewrite: any request that comes to the web site will be served by app.js.
handlers: iisnode is an HTTP handler that will execute app.js using node.

Note: What if you don't have app.js in the root directory? Please see the next section.
Please remove any hardcoded port number from the entry file; the port should be passed in from the process: "server.listen(process.env.PORT);"

<configuration>
  <system.webServer>
    <handlers>
      <add name="iisnode" path="app.js" verb="*" modules="iisnode" />
    </handlers>
    <rewrite>
      <rules>
        <rule name="sample">
          <match url="/*" />
          <action type="Rewrite" url="app.js" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
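For reference, a minimal sketch of an entry file that works under iisnode (a plain Node http server here; your real app.js will differ):

// app.js - iisnode passes the port (actually a named pipe) via process.env.PORT
const http = require('http');

const server = http.createServer((req, res) => {
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.end('Hello from node behind IIS');
});

server.listen(process.env.PORT);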

Node Serving API and IIS Serving Static Content

Create a web.config file in the virtual directory and add these settings.
default document: IIS serves the default document from the subfolder (new); index.html will load all CSS and other static files.
rewrite: URLs that start with api/ will be served by index.js, which lives in the server subfolder.

Note: Please remove any hardcoded port number from index.js; the port should be passed in from the process: "server.listen(process.env.PORT);"

<configuration>
  <system.webServer>
    <handlers>
      <add name="iisnode" path="server/index.js" verb="*" modules="iisnode" />
    </handlers>
    <rewrite>
      <rules>
        <rule name="sample">
          <match url="api/*" />
          <action type="Rewrite" url="server/index.js" />
        </rule>
      </rules>
    </rewrite>
    <defaultDocument enabled="true">
      <files>
        <add value="new/index.html" />
      </files>
    </defaultDocument>
  </system.webServer>
</configuration>

Hope this helps.

Using SQL Variables Inside an IN Clause

Problem: I need to use a variable holding a comma-separated list inside an IN clause.


Case 1:
DECLARE @customerIds VARCHAR(200) = '123,456,789'
SELECT * From Customers Where Id in (????)
Case 2:
DECLARE @customerNames VARCHAR(200) = 'abc, def, ghi'
SELECT * From Customers Where FirstName in (????)

Solution for Case 1:
Since we already have comma-separated integers, we parse the variable into an integer table.


CREATE FUNCTION [dbo].[StringSplitInt]
(
	@Idlist varchar(800),
	@delim char(1)
)
RETURNS @returntable TABLE
(
	Id int not null
)
AS
BEGIN
	DECLARE @list varchar(800) = RTRIM(LTRIM(@Idlist)) + @delim;
	DECLARE @idString varchar(10), @id INT;
	WHILE (LEN(@list) > 0)
	BEGIN
		SET @idString = SUBSTRING(@list, 1, CHARINDEX(@delim, @list,1)-1);
		SET @list = SUBSTRING(@list, LEN(@idString)+2, LEN(@list) - LEN(@idString)-1)
		INSERT @returntable(Id)
		SELECT CAST(@idString AS INT)
	END
	RETURN
END

Go

DECLARE @customerIds VARCHAR(200) = '123,456,789'
SELECT * FROM Customers WHERE Id IN (SELECT Id FROM StringSplitInt(@customerIds, ','))

Solution for Case 2:
Now we parse the string variable into a string table.


CREATE FUNCTION [dbo].[StringSplitString]
(
	@Idlist varchar(800),
	@delim char(1)
)
RETURNS @returntable TABLE
(
	Id VARCHAR(MAX) COLLATE Latin1_General_CI_AI
)
AS
BEGIN
	DECLARE @list varchar(800) = RTRIM(LTRIM(@Idlist)) + @delim;
	DECLARE @idString varchar(100);
	WHILE (LEN(@list) > 0)
	BEGIN
		SET @idString = SUBSTRING(@list, 1, CHARINDEX(@delim, @list,1)-1);
		SET @list = SUBSTRING(@list, LEN(@idString)+2, LEN(@list) - LEN(@idString)-1)
		INSERT @returntable(Id)
		SELECT LTRIM(RTRIM(@idString)) -- trim so values like ' def' match 'def'
	END
	RETURN
END

Go

DECLARE @customerNames VARCHAR(200) = 'abc, def, ghi'
SELECT * FROM Customers WHERE FirstName IN (SELECT Id FROM StringSplitString(@customerNames, ','))

Note: You might get a collation mismatch error between the FirstName column and the Id column returned by the function (I need to read a little more about this). I solved the problem by defining the function's column with our SQL Server collation setting.
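As an alternative to changing the function, you can also force the collation in the query itself; a sketch (assuming the mismatch is between the database default and Latin1_General_CI_AI):

DECLARE @customerNames VARCHAR(200) = 'abc, def, ghi'
SELECT * FROM Customers
WHERE FirstName COLLATE Latin1_General_CI_AI IN (SELECT Id FROM StringSplitString(@customerNames, ','))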

Autocompletion for git commands

Requirement: It would be great to get autocompletion for git commands in the Mac terminal.

Install Git if you don't have it already.

Now we need to download the bash completion file with all the git commands. Open the terminal and run this command from your home directory.

curl -o .git-completion.bash https://raw.githubusercontent.com/git/git/master/contrib/completion/git-completion.bash

Now check for a .bash_profile file in your home directory. Open the terminal; this command shows hidden files as well.

ls -a

Create file .bash_profile if it doesn’t exist.

touch .bash_profile

Open .bash_profile in a text editor of your choice.

open .bash_profile

Now append this code to the .bash_profile file.


if [ -f ~/.git-completion.bash ]; then
    . ~/.git-completion.bash
fi

Save it and close the terminal.

Verify:
Open the terminal.
Type "git h" and press Tab.
This should complete the command to "git help".

Verify 2:

You can also get completion for git command options.
Type "git log --a" and press Tab.
This should list all possible options starting with "--a".