Using Group Policy Editor to enable or disable desktop backgrounds

System administrators tend to be hard-headed when it comes to corporate computer personalization, and locking down desktop backgrounds is one of the most common restrictions they apply. If you have elevated privileges on your computer, there is an easy way to get around this.

Open the Group Policy Editor by typing gpedit.msc in the Run box and hitting Enter.

gpedit-desktop-wallpaper


Navigate to User Configuration > Administrative Templates > Desktop. Click Desktop again, then double-click Desktop Wallpaper.

Ensure that the Not Configured option is selected. This will enable the option to change the desktop background.

Adding a default value for a datetime column in SQL Server

The easiest way to add a default value for a DateTime column is to use the SQL Server Management Studio (SSMS) GUI.

  1. Put your table in design view (right-click the table in Object Explorer > Design)
  2. Add a column to the table (or click the column you want to update if it already exists)
  3. In Column Properties, enter (getdate()) in the Default Value or Binding field as pictured below
Default Datetime
Adding a default value of datetime

For an existing column, you can use this code

ALTER TABLE YourTable ADD CONSTRAINT DF_YourTable DEFAULT GETDATE() FOR YourColumn
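If the column doesn't exist yet, you can add it together with its default in a single statement. A sketch (table, column and constraint names here are placeholders):

```sql
ALTER TABLE YourTable
ADD YourColumn DATETIME NOT NULL
    CONSTRAINT DF_YourTable_YourColumn DEFAULT GETDATE();
```

Naming the constraint explicitly (rather than letting SQL Server generate a name) makes it easier to drop or alter later.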


HTTP GETs with HttpWebRequest, WebClient and HttpClient

The .NET Framework offers you three different classes to consume REST APIs: HttpWebRequest, WebClient and HttpClient.
A lot of developers get confused trying to determine which class is best to use (hence the numerous questions on Stack Overflow).
In order to understand the differences between the three classes, let's walk through a simple example of validating an email address via a web service. In our example, we're going to pass an email address to an EmailValidatorClient. The client has three methods for retrieving the response, using either HttpWebRequest, WebClient or HttpClient.

HttpWebRequest:

HttpWebRequest and HttpWebResponse are the most common base classes for making HTTP requests. They give you full control over the request, which also means that you can screw things up if you don't know what you're doing.
public static string ValidateEmailWithHTTPRequest(string email)
{
    string formattedUrl = BuildUrl(email);
    string responseString;

    // HTTP GET
    var webRequest = (HttpWebRequest)WebRequest.Create(formattedUrl);
    webRequest.Method = "GET";
    webRequest.Headers.Add("X-Mashape-Key", "KEY");

    using (WebResponse webResponse = webRequest.GetResponse())
    using (var streamReader = new StreamReader(webResponse.GetResponseStream()))
    {
        // The using blocks dispose the response and reader, so no explicit Close() is needed
        responseString = streamReader.ReadToEnd();
    }
    return responseString;
}
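The BuildUrl helper is never shown in these snippets, so here is a guess at its shape: a minimal sketch assuming the validation service takes the email as a query-string parameter. The endpoint URL is a placeholder, not the real service.

```csharp
using System;

public static class UrlBuilder
{
    // Hypothetical helper: escapes the email and appends it to a placeholder endpoint
    public static string BuildUrl(string email)
    {
        const string baseUrl = "https://example-validator.example.com/validate"; // placeholder
        return baseUrl + "?email=" + Uri.EscapeDataString(email);
    }
}
```

Escaping with Uri.EscapeDataString matters because characters like `@` are not valid raw inside a query-string value.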


Pros
  • Provides full customization of the request
  • Supports asynchronous requests (BeginGetResponse/EndGetResponse), so you can keep the user interface thread responsive
  • Supports HTTP compression
Cons
  • More control means more code to write
  • Requires you to have a deeper understanding of the request/response paradigm
WebClient:
WebClient is a high-level abstraction on top of HttpWebRequest, which has the advantage of simplifying common tasks. You can see how simple it is to use in the code below: we simply call DownloadString(requestUrl) and we have our response.
public static string ValidateEmailWithWebClient(string email)
{
    var requestUrl = BuildUrl(email);
    using (var client = new WebClient())
    {
        client.Headers.Add("X-Mashape-Key", "KEY");
        var response = client.DownloadString(requestUrl);
        return response;
    }
}


Pros
  • Useful if you want a simple way to call an endpoint without needing to customize the request
  • Provides progress tracking and allows you to cancel download operations
Cons
  • Does not grant you as much control over the request
  • WebClient's synchronous methods block the calling thread, so your app might become unresponsive during a large download
  • WebClient doesn't support HTTP compression
  • WebClient ignores the HTTP Content-Type header's charset value when you use it to get the response text; you need to explicitly set the encoding
  • Unit testing code that calls WebClient is a pain
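The charset point above has an easy workaround: set WebClient's Encoding property before calling DownloadString. A minimal sketch (the helper name is mine, not from the original):

```csharp
using System.Net;
using System.Text;

public static class WebClientHelper
{
    // Force UTF-8 so DownloadString decodes the body correctly regardless of
    // the charset the server declares, which WebClient would otherwise ignore
    public static string DownloadUtf8String(string requestUrl)
    {
        using (var client = new WebClient())
        {
            client.Encoding = Encoding.UTF8;
            return client.DownloadString(requestUrl);
        }
    }
}
```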
HttpClient
HttpClient is the newest of the three. It requires .NET 4.5 or later (or a NuGet package on .NET 4.0) and provides another layer of abstraction on top of HttpWebRequest and HttpWebResponse.
The main difference between HttpClient and WebClient is that, while WebClient simplifies the process of making requests, HttpClient adds extra functionality on top of that high-level abstraction.
HttpClient provides powerful functionality with better syntax support for newer threading features; for example, it supports the await keyword. It also enables concurrent downloads with better compiler checking and code validation.
public static async Task<string> ValidateEmailWithHTTPClient(string email)
{
    string requestUrl = BuildUrl(email);
    using (var client = new HttpClient())
    {
        client.DefaultRequestHeaders.Accept.Clear();
        client.DefaultRequestHeaders.Add("X-Mashape-Key", "KEY");

        // HTTP GET
        HttpResponseMessage response = await client.GetAsync(requestUrl);
        if (response.IsSuccessStatusCode)
        {
            // Await instead of .Result to avoid blocking the thread (and potential deadlocks)
            return await response.Content.ReadAsStringAsync();
        }
    }
    return string.Empty;
}


Pros
  • A single HttpClient instance supports concurrent requests. To get concurrency with WebClient, you need to create a fresh instance per concurrent request, which can get awkward when you introduce custom headers, cookies, and authentication schemes.
  • HttpClient lets you write and plug in custom message handlers. This enables mocking in unit tests and the creation of custom pipelines
  • HttpClient has a richer and extensible type system for headers and content.
Cons
  • It doesn’t support progress reporting.
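To illustrate the unit-testing point above, here is a hedged sketch of a stub message handler that returns a canned response without touching the network. The class name and JSON body are made up for the example:

```csharp
using System.Net;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

public class StubHandler : HttpMessageHandler
{
    protected override Task<HttpResponseMessage> SendAsync(
        HttpRequestMessage request, CancellationToken cancellationToken)
    {
        // Short-circuit the pipeline: no real HTTP request is ever made
        var response = new HttpResponseMessage(HttpStatusCode.OK)
        {
            Content = new StringContent("{\"isValid\":true}")
        };
        return Task.FromResult(response);
    }
}
```

Code under test can then be constructed with `new HttpClient(new StubHandler())` and exercised without a live endpoint.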
Other libraries

Microsoft brings bash to Windows 10

At Build, its 2016 annual developer conference, Microsoft announced that it is bringing the Bash shell to Windows 10.

bash-windows

This is pretty significant and will definitely be a welcome addition to an ever-changing and more inclusive Microsoft. Developers will now be able to write .sh scripts on Windows, as well as use editors such as Emacs or Vi to edit their code.

Bash will be available with the Windows 10 Anniversary Update this summer, but is available to Windows Insiders before that.

VS Extension of the day : CodeMaid

Working with poorly written, long code files can be a pain. It's especially painful when you're either attempting to track down a bug or introduce a new feature. In comes CodeMaid!

Let's be practical. CodeMaid will not automatically fix bugs for you, nor will it write features for you. It will, however, help you navigate through your code more quickly. Frankly, that's all I needed.

CodeMaid is very well documented, so I won't duplicate everything on their website. However, I do want to highlight one feature which I use a lot to get a general sense of what is happening in a class.

Code Digging

CodeMaid gives you a visual tree of the file you’re looking at and gives you the ability to quickly switch between sorting methods to get a better overview.

Digging     Digging sort order

If you like to visualize code, then I would definitely recommend this Visual Studio extension. You can download it from the Visual Studio Gallery or from their site.

Bulk upload into SQL Server using SQLBulkCopy – Part 2

In Part 1 of this series, we looked at the basics of how the SQLBulkCopy class works. It's time to set up our situation drill and dig into some real code.

Situation Drill: You've just been hired as the lead developer on a short-term project for a multinational retail corporation. Your manager wants a reporting application that is able to import a large XML file each day and extract only certain fields into a SQL Server database. Develop a prototype app which uses SQL Bulk Copy to import this XML file into the database.

Download source files here.

Our goal is to populate a table called TopOrders with the contents of an XML file. So let's begin by taking a look at the XML file that we want to import. The file comprises a list of orders. For the purposes of this drill, we will extract three fields: OrderID, CustomerID and ShipAddress.

xml
Download XML file here

TopOrders schema

Run the following script in a database of your choice to set up your target table.

CREATE TABLE [dbo].[TopOrders] (
    [NewOrderID]         INT           NOT NULL,
    [NewCustomerId]      NVARCHAR (50) NULL,
    [NewShippingAddress] NVARCHAR (50) NULL,
    PRIMARY KEY CLUSTERED ([NewOrderID] ASC)
);

The next step is to set up the projects in Visual Studio. Add two console applications and one class library project so the solution looks something like this.

bulkcopy
Importer.Console.DataTable – implementation of SQLBulkCopy using a DataTable

Importer.Console.DataReader – implementation of SQLBulkCopy with an IDataReader

Importer.Console.Core – common files

Configuring SQLBulkCopy

Next, create strongly typed classes to represent the SqlBulkCopy settings and column mappings.

public class ColumnMapping
    {
        public string SourceColumn { get; set; }
        public string DestinationColumn { get; set; }
    }
public class SqlBulkCopySettings
    {
        public int BatchSize { get { return 10; } }
        public int NotifySize { get { return BatchSize*5; } }
        public int NumberOfMergeAttemptsBeforeGivingUp { get { return 2; } }
        public string DestinationTableName { get { return "TopOrders"; } }
        public List<ColumnMapping> ColumnMappings { get; private set; }
 
 
        public static SqlBulkCopySettings GetSettings()
        {
            return new SqlBulkCopySettings
            {
                ColumnMappings = new List<ColumnMapping>
                {
                    new ColumnMapping
                    {
                        SourceColumn = "OrderID",
                        DestinationColumn = "NewOrderID"
                    },
                    new ColumnMapping
                    {
                        SourceColumn = "CustomerID",
                        DestinationColumn = "NewCustomerId"
                    },
                    new ColumnMapping
                    {
                        SourceColumn = "ShipAddress",
                        DestinationColumn = "NewShippingAddress"
                    }
 
                }
            };
        }
    }

In our main program (Importer.Console.DataTable), we'll begin by setting up the connection string to our target database and reading the XML data into a DataTable.

string connectionString = @"Data Source=[Server];Integrated Security=True";
var sourceDataTable = GetSourceDataTable();
SqlBulkCopySettings settings = SqlBulkCopySettings.GetSettings();

private static DataTable GetSourceDataTable()
{
    DataSet dataSet = new DataSet();
    dataSet.ReadXml(@"c:\bulkupload\orders.xml");
    var sourceDataTable = dataSet.Tables[0];
    return sourceDataTable;
}

The code below is simple to understand. The main method opens a connection to the target database. A new instance of SqlBulkCopy is created and assigned the necessary properties for the operation (for more details about these properties, refer to Part 1 of this series). The WriteToServer method is then called to complete the operation.

try
{
    using (SqlConnection destinationConnection = new SqlConnection(connectionString))
    {
        destinationConnection.Open();

        Logger.Info("Database opened");

        // Pass the open connection itself rather than the connection string,
        // so SqlBulkCopy reuses it instead of opening a second connection
        using (SqlBulkCopy bulkCopy = new SqlBulkCopy(destinationConnection))
        {
            bulkCopy.SqlRowsCopied += bulkCopy_SqlRowsCopied;
            bulkCopy.NotifyAfter = settings.NotifySize;
            bulkCopy.BatchSize = settings.BatchSize;
            bulkCopy.DestinationTableName = settings.DestinationTableName;
            foreach (var columnMapping in settings.ColumnMappings)
            {
                bulkCopy.ColumnMappings.Add(columnMapping.SourceColumn, columnMapping.DestinationColumn);
            }
            bulkCopy.WriteToServer(sourceDataTable);

            Logger.Info("Bulk copy setup success");
        }
    }
}
catch (Exception exception)
{
    Logger.Error("Bulk copy was unable to complete the operation with the following error {0}", exception.Message);
}


Result
toporders


Using a DataTable works well until you have millions of records at once and start running out of memory. An alternative to the DataTable is to use an implementation of IDataReader.

Watch out for part 3 in this series to learn how to implement IDataReader to stream custom objects in bulk to your database.

Bulk upload into SQL Server using SQLBulkCopy – Part 1

Uploading large files can be made easier with the help of SQLBulkCopy. SQLBulkCopy is a simple, easy-to-use tool for transferring simple or complicated data from one data store to another. This article looks primarily at the two main ways (via DataTable and IDataReader) of uploading a large data source into SQL Server.

Send-large-files-at-one-go

How it works

SQLBulkCopy essentially works by configuring properties on the class to determine how the transfer to SQL Server will be done. After the settings are complete, you call the WriteToServer() method, which accepts either a DataTable or an IDataReader. Here is a list of the most important properties:

Construction

BatchSize

BatchSize is an integer property that determines how many records are sent to the server in each batch. If you don't set this property, all the records are sent in a single batch.

 bulkCopy.BatchSize = 50;
ColumnMappings

The ColumnMappings property allows you to map columns from your source to the destination (SQL Server). You only need to set this property if the column names are different. The code below matches ShipAddress from the source to the destination column NewShippingAddress.

bulkCopy.ColumnMappings.Add("ShipAddress", "NewShippingAddress");  
DestinationTableName

The DestinationTableName property determines which table you want to copy your data to.

bulkCopy.DestinationTableName = "TopOrders"; 
SqlRowsCopied & NotifyAfter

One cool thing about the bulk copy object is that it has an event, SqlRowsCopied, which is raised after a specific number of records are uploaded. You can set the NotifyAfter property to any integer value; it has a default value of 0.

bulkCopy.SqlRowsCopied += new SqlRowsCopiedEventHandler(OnSqlRowsTransfer);
bulkCopy.NotifyAfter = 100;

private static void OnSqlRowsTransfer(object sender,SqlRowsCopiedEventArgs e)
{
        Console.WriteLine("Copied {0} so far...", e.RowsCopied);
}   
WriteToServer

The WriteToServer method copies your source table data to a destination table. It accepts an array of DataRows, a DataTable or an IDataReader. With a DataTable you can also specify the state of the rows that need to be processed.

The following code will process only the rows from the sourceData DataTable whose RowState is Added into the destination table.

bulkCopy.WriteToServer(sourceData, DataRowState.Added);

In Part 2 of this article, we'll look at a fully working program which demonstrates how to upload a big dataset using DataTables and SQLBulkCopy.

How to fix “The VPN Client Driver has Encountered an Error” in Windows 8

Disclaimer: Do this at your own risk. Be sure to back up your registry before attempting this on your system.

Due to compatibility problems, users may experience errors when trying to establish a connection with the Cisco AnyConnect client on Windows 8. Upon trying to connect, they may get an error such as “The VPN Client driver has encountered an error” or “Cannot initiate VPN.”

Fortunately, there is an easy registry edit that has been found to frequently resolve this error:

  1. Go to the Windows 8 Start Screen by pressing the Windows key.
  2. Type regedit to initiate an application search, then press Enter to launch regedit.
  3. Navigate to HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\vpnva
  4. Locate the key DisplayName.
  5. Right-click this key and select Modify.
  6. In the value data field, erase everything and replace it with Cisco AnyConnect VPN Virtual Miniport Adapter for Windows x64
    • If you are running a 32-bit system, perform the same steps, but instead replace the entry with Cisco AnyConnect VPN Virtual Miniport Adapter for Windows x86
  7. Restart the Cisco AnyConnect client. The issue should now be resolved.
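The steps above can also be scripted as a .reg file you merge once (a sketch for the 64-bit value; as with the manual edit, back up your registry first):

```reg
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\vpnva]
"DisplayName"="Cisco AnyConnect VPN Virtual Miniport Adapter for Windows x64"
```

On a 32-bit system, change the value's suffix to x86 as described in step 6.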

IIS 7 taking over 400 error message from Web API

Recently I've run into a problem on one of our staging servers, where an API endpoint returns IIS error pages instead of the custom error content generated in code (in this case a JSON object).

Overview
So essentially, our front-end application accepts an SMS number and POSTs the accepted input to the ASP.NET Web API for validation and persistence in the back end.
[DataContract]
public class Sms
{
    [DataMember(Name = "model")]
    public SmsDTO Model { get; set; }

    [DataMember(Name = "mobile")]
    [Required]
    [SmsNumber]
    public string Mobile { get; set; }
}
The validation of the model is done via the built-in model validation features in MVC/Web API. A custom attribute, [SmsNumber], inherits from ValidationAttribute and is responsible for this functionality.
public class SmsNumberAttribute : ValidationAttribute
{
    protected override ValidationResult IsValid(object value, ValidationContext validationContext)
    {
        string number = (string)value;
        if (string.IsNullOrEmpty(number))
        {
            return base.IsValid(value, validationContext);
        }
        if (SmsNumberPatternManager.Instance.IsNumberValid(number))
        {
            return ValidationResult.Success;
        }
        return new ValidationResult(string.Format("{0} is not a valid SMS Number. Please enter a valid number.", value));
    }
}
Expected
Ultimately, what we want is that when a user enters an invalid number, we give them a descriptive message which explains the error.
Actual
What is actually happening on the staging server is this.
The idea is that the request should generate a 400 error, but still provide appropriately formatted error information – in this case JSON – to the client. This works fine on my development machine on Windows 7 with IIS 7.5, but fails in the staging environment on IIS 7 on Windows Server 2008.
On the staging server, the response that the server generates is not my JSON object, but rather an IIS HTML error page.
Response on Staging Server
Cache-Control:public, no-store, max-age=0, s-maxage=0
Content-Length:11
Content-Type:text/html
Server:Microsoft-IIS/7.0
Vary:*
X-AspNet-Version:4.0.30319
X-AspNetMvc-Version:4.0
X-Powered-By:ASP.NET
Response on Local Development Box
Cache-Control:public, no-store, max-age=0, s-maxage=0
Content-Encoding:gzip
Content-Length:108
Content-Type:application/json; charset=utf-8
Server:Microsoft-IIS/7.5
Vary:*
X-AspNet-Version:4.0.30319
X-AspNetMvc-Version:4.0
X-Powered-By:ASP.NET
Why is this happening?
This occurs because the error is trapped by ASP.NET, but then ultimately still handled by IIS which looks at the 400 status code and returns the stock IIS error page.
The Solution

Enter Response.TrySkipIisCustomErrors

There's a solution to this problem with the deftly named TrySkipIisCustomErrors property on the Response object, which is tailor-made for this particular scenario. In a nutshell, when this property is set to true at any point in the request, it prevents IIS from injecting its custom error pages.
The last lines of my SmsNumberAttribute will now look like this:
public class SmsNumberAttribute : ValidationAttribute
{
    protected override ValidationResult IsValid(object value, ValidationContext validationContext)
    {
        .
        .
        HttpContext.Current.Response.TrySkipIisCustomErrors = true;
        return new ValidationResult(string.Format("{0} is not a valid SMS Number. Please enter a valid number.", value));
    }
}

Reading web.config key settings in HTML markup

In order for HTML pages to recognize settings from our web.config, we'll need to tell IIS to treat .html files as dynamic pages.

This can be done by the following:
....<system.web>
    <compilation ...>
        <buildProviders>
            <add extension=".html" 
                 type="System.Web.Compilation.PageBuildProvider" />
        </buildProviders>
    ....

and

....<system.webServer>
    <handlers>
        <remove name="WebServiceHandlerFactory-Integrated" />
        <add name="PageHandlerFactory-Integrated-HTML" path="*.html" 
             verb="GET,HEAD,POST,DEBUG" type="System.Web.UI.PageHandlerFactory" 
             resourceType="Unspecified" preCondition="integratedMode" />
    </handlers>....
You can then call any setting from your web.config in your .html page like this:
<%=ConfigurationManager.AppSettings["MyAttribute"]%>
or
<%$ AppSettings: MyAttribute %>

This uses the expression builder syntax, which means you can declaratively assign appSettings values.
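For completeness, the setting being read would live under appSettings in web.config; a sketch with a made-up value for the MyAttribute key used above:

```xml
<configuration>
  <appSettings>
    <add key="MyAttribute" value="some-value" />
  </appSettings>
</configuration>
```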