
Monday 28 May 2012

Could not load file or assembly 'msshrtmi' or one of its dependencies

Sometimes after publishing my Azure solution I get a yellow screen of death giving a "System.BadImageFormatException" exception with a description of "Could not load file or assembly 'msshrtmi' or one of its dependencies. An attempt was made to load a program with an incorrect format."

I tried everything to get rid of this msshrtmi complaint: building again, rebuilding, cleaning the solution, even restarting my computer (and that fixes everything in Windows, ever!).

Very strange, as my newly published Azure solution is fine but my development site is now at the mercy of this mysterious msshrtmi assembly... whatever the hell that is.

Kill msshrtmi! Kill it with fire!

So it's an assembly, right? Assemblies go in the bin folder... let's look there... There it is! msshrtmi.dll. DELETE IT!

I don't know what it is and I didn't put it there so I deleted it and all is back to normal. Excellent.


How to turn off logging on Combres

I have been using Combres for my .NET app to minify, combine and compress my JavaScript and CSS. So far I have found it awesome.

It really does do everything it claims. It minifies your scripts and CSS, then combines them so that all the code is sent via one HTTP request (well, one for CSS and one for JavaScript). It also handles GZip or Deflate compression should the request accept it. It is easy to set up since it is now available through NuGet, and using it will make YSlow smile when it scans your site.

Logging

One thing that is annoying though is that Combres seems to log everything it does: "Use content from content's cache for Resource x... Use content from content's cache for Resource y". This is fine in development but unnecessary in production, so I wanted to turn it off but couldn't find out how in any documentation.

The way I managed to turn off logging was to find the line that Combres put in the web.config, which looks like this:

<combres definitionUrl="~/App_Data/combres.xml" logProvider="Combres.Loggers.Log4NetLogger" />

You simply need to remove the logProvider so it looks like this:

<combres definitionUrl="~/App_Data/combres.xml" />

If you still want logging on your development environment you can simply remove the logProvider attribute in a web.config transform.
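If you use web.config transforms, something along these lines in Web.Release.config should do it (a sketch assuming the standard XDT transform syntax; adjust it to match where the combres element sits in your config):

<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <combres xdt:Transform="RemoveAttributes(logProvider)" />
</configuration>

That way logging stays on locally but is stripped out when you publish.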


Tuesday 22 May 2012

Export and back up your SQL Azure databases nightly into blob storage

With Azure I have always believed, if you can do it with the Azure Management Portal then you can do it with a REST API. So I thought it would be a breeze to make an automated job to run every night to export and back up my SQL Azure database into a BACPAC file in blob storage. I was surprised to find that scheduling bacpac exports of your SQL Azure databases is not documented in the Azure Service Management API. Maybe it is because bacpac exporting and importing is in beta? Never mind. I now have a worker role successfully backing up my databases, and here's how:

It is a REST API, so you can't have nice WCF handle all your POST data for you, but there is still a trick to avoid writing out all your XML parameters by hand: add a service reference and use the strongly typed classes it generates.

Go to your worker role or console application and add a service reference to your particular DACWebService (it varies by region):

  • North Central US: https://ch1prod-dacsvc.azure.com/DACWebService.svc
  • South Central US: https://sn1prod-dacsvc.azure.com/DACWebService.svc
  • North Europe: https://db3prod-dacsvc.azure.com/DACWebService.svc
  • West Europe: https://am1prod-dacsvc.azure.com/DACWebService.svc
  • East Asia: https://hkgprod-dacsvc.azure.com/DACWebService.svc
  • Southeast Asia: https://sg1prod-dacsvc.azure.com/DACWebService.svc

Once you import this Service Reference you will have some new classes that will come in handy in the following code:

//requires: Microsoft.WindowsAzure, Microsoft.WindowsAzure.ServiceRuntime,
//System.Net, System.Runtime.Serialization and System.Web (for HttpException)

//these details are passed into my method but here is an example of what is needed
var dbServerName = "qwerty123.database.windows.net";
var dbName = "mydb";
var dbUserName = "myuser";
var dbPassword = "Password!";

//storage connection is in my ServiceConfig
//I know the CloudStorageAccount can be obtained in one line of code
//but this way is necessary to be able to get the StorageAccessKey later
var storageConn = RoleEnvironment.GetConfigurationSettingValue("Storage.ConnectionString");
var storageAccount = CloudStorageAccount.Parse(storageConn);

//1. Get your blob storage credentials
var credentials = new BlobStorageAccessKeyCredentials();
//e.g. https://myStore.blob.core.windows.net/backups/mydb/2012-05-22.bacpac
credentials.Uri = string.Format("{0}backups/{1}/{2}.bacpac",
    storageAccount.BlobEndpoint,
    dbName,
    DateTime.UtcNow.ToString("yyyy-MM-dd"));
credentials.StorageAccessKey = ((StorageCredentialsAccountAndKey)storageAccount.Credentials)
                                   .Credentials.ExportBase64EncodedKey();

//2. Get the DB you want to back up
var connectionInfo = new ConnectionInfo();
connectionInfo.ServerName = dbServerName;
connectionInfo.DatabaseName = dbName;
connectionInfo.UserName = dbUserName;
connectionInfo.Password = dbPassword;

//3. Fill the object required for a successful POST
var export = new ExportInput();
export.BlobCredentials = credentials;
export.ConnectionInfo = connectionInfo;

//4. Create your request (using the West Europe service URL from the list above)
var request = WebRequest.Create("https://am1prod-dacsvc.azure.com/DACWebService.svc/Export");
request.Method = "POST";
request.ContentType = "application/xml";
using (var stream = request.GetRequestStream())
{
    var dcs = new DataContractSerializer(typeof(ExportInput));
    dcs.WriteObject(stream, export);
}

//5. make the POST!
using (var response = (HttpWebResponse)request.GetResponse())
{
    if (response.StatusCode != HttpStatusCode.OK)
    {
        throw new HttpException((int)response.StatusCode, response.StatusDescription);
    }
}

This code could run in a worker role or scheduled task set to run at, say, 2am each night. It is important that you have appropriate logging and notifications in place in the event of failure.
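For illustration, here is a minimal sketch of what that scheduling could look like in a worker role's Run method (BackUpDatabase is a hypothetical stand-in for the export code above; it needs System.Threading and System.Diagnostics):

public override void Run()
{
    while (true)
    {
        //sleep until the next 2am UTC
        var now = DateTime.UtcNow;
        var nextRun = now.Hour < 2
            ? now.Date.AddHours(2)
            : now.Date.AddDays(1).AddHours(2);
        Thread.Sleep(nextRun - now);

        try
        {
            BackUpDatabase(); //hypothetical method wrapping the export code above
        }
        catch (Exception ex)
        {
            //log and alert - a silent failure here means no back ups
            Trace.TraceError(ex.ToString());
        }
    }
}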

Conclusion

This sends off the request to start the back up / export of the database into a bacpac file. A successful response is no indication that the back up itself succeeded, only that the request was submitted. If your credentials are wrong you will still get a 200 OK response, but the back up will fail silently later.

To see if it has been successful you can check on the status of your exports via the Azure Management Portal, or by waiting a short while and having a look in your blob storage.
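If you want to automate that check too, here is a rough sketch using the 1.x storage client to see whether today's bacpac blob has turned up (reusing the storageAccount, dbName and blob naming convention from the code above):

var client = storageAccount.CreateCloudBlobClient();
var blob = client.GetBlobReference(string.Format("backups/{0}/{1}.bacpac",
    dbName, DateTime.UtcNow.ToString("yyyy-MM-dd")));
try
{
    //FetchAttributes throws if the blob does not exist yet
    blob.FetchAttributes();
    Trace.TraceInformation("Back up found: {0} bytes", blob.Properties.Length);
}
catch (StorageClientException)
{
    Trace.TraceWarning("Back up not there yet - check again later");
}

Bear in mind the export takes a while to run, so don't check immediately after submitting the request.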

I have not covered importing because, really, exporting is the boring yet important activity that must happen regularly (such as nightly). Importing is the one you do on the odd occasion when there has been a disaster, and the Azure Management Portal is well suited to such an occasion.


Friday 18 May 2012

How to get Azure storage Account Key from a CloudStorageAccount object

I have a CloudStorageAccount object and I want to get the AccountKey out of it. It seems like you should be able to get the storage key out of a CloudStorageAccount, but I did struggle a bit at first.

I first used the CloudStorageAccount.FromConfigurationSetting(string) method and then played about in debug mode to see if I could find it.

I then found that that method doesn't return the correct type of object, which left me unable to find my Azure storage access key. I tried the same thing using CloudStorageAccount.Parse(string) instead, and this did give access to the Azure storage access key.

//this method of getting your CloudStorageAccount is no good here
//var account = CloudStorageAccount.FromConfigurationSetting("StorageConnectionStr");

//these two lines do...
var accountVal = RoleEnvironment.GetConfigurationSettingValue("StorageConnectionStr");
var account = CloudStorageAccount.Parse(accountVal);

//and then you can retrieve the key like this:
var key = ((StorageCredentialsAccountAndKey)account.Credentials)
          .Credentials.ExportBase64EncodedKey();

It is strange that CloudStorageAccount.FromConfigurationSetting doesn't give you access to the account key in your credentials but CloudStorageAccount.Parse does. Oh well, hope that helps.


Thursday 10 May 2012

Set maxlength on a textarea

It's annoyed me for quite a while that you can set a maximum length on an <input type="text" /> but not on a textarea. Why? Are they so different?

I immediately thought I was going to have to write some messy JavaScript, but then I learned that HTML5 now supports maxlength on textareas, and I'm only considering modern browsers! Wahoo!

Then I learned that IE9 doesn't support it, so JavaScript it is...

JavaScript textarea maxlength limiter

What works well for me is binding the keyup and blur events on my textarea to a function that removes any characters over the maxlength provided. The code looks like this:

$('#txtMessage').bind('keyup blur', function () {
    var $this = $(this);
    var len = $this.val().length;
    //attr() returns a string, so parse it before comparing
    var maxlength = parseInt($this.attr('maxlength'), 10);
    if (maxlength && len > maxlength) {
        $this.val($this.val().slice(0, maxlength));
    }
});

Conclusion

It works quite well because if the browser already supports maxlength on a textarea there will be no interruption because the value of the textarea will not go over that maxlength.

The keyup event doesn't fire when the user pastes text in with the mouse, but that is where the blur event comes in. Also, pressing Enter makes a new line in a textarea rather than submitting the form, so the user has to click the submit button (blurring the textarea) anyway.

Beware though that this is for usability only; a nasty user could easily bypass it, so ensure you are also checking the length server side if it matters to you.
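If your back end happens to be ASP.NET MVC, one way to enforce this server side is a StringLength data annotation on your model (a sketch; the model and the 500-character limit are hypothetical):

using System.ComponentModel.DataAnnotations;

public class MessageModel
{
    //enforced on the server when ModelState.IsValid is checked,
    //regardless of what the browser or any script allowed through
    [StringLength(500)]
    public string Message { get; set; }
}

The same limit then applies whether the request came from your textarea or from someone poking your form directly.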
