Friday 23 July 2010

Need Virtual Server hosting in the UK or USA? Try ElasticHosts

If you’re looking to host Virtual Servers in the cloud with good response times from Europe or the USA, you should take a look at ElasticHosts. Here’s why:

  • By my calculations they are just about the cheapest of the options available in the UK. I compared them against RackSpace, CloudHosts (part of UKFast), Amazon and FlexiScale. Only FlexiScale came close.
  • Incredible flexibility in sizing your server. Forget about Small, Large and Extra Large: with ElasticHosts you get a slider control for CPU, Memory and Hard Disk, allowing you to resize your machine as your needs vary.
  • You can either choose pay-as-you-go pricing, with no minimum commitment, or, if demand is predictable, take out monthly subscriptions and save money. You can even mix and match: if, say, you typically use 5GB of bandwidth in a month but sometimes burst up to 7GB, you can subscribe for 5GB a month and use pay-as-you-go pricing for the extra.
  • They have a REST API giving complete programmatic control over your pool of servers, allowing you to do everything from stopping and starting servers, through provisioning completely new servers on the fly, to copying drives from the cloud to your machine.
  • They offer a free, no commitment, five-day trial to get you hooked.

I’ve saved the best till last: ElasticHosts Customer Service is amazing. Fronted by Elastigirl herself (aka Anna Griffiths, her cover as Helen Parr now being thoroughly blown), they’ve been quick to respond whether I’ve called or emailed.

The first time I got in touch, I was rather cheeky: the server they offer for the free trial is clocked at 2GHz with 1GB RAM. This rather struggled as I loaded it with Windows Server 2008, so I called asking if there was anything they could do to make the trial server more representative of the system we might be looking to subscribe to.

“Sure”, said Anna, “What would you like?”

I was stunned! And continued to be so as she upped the memory, hard disk capacity and bandwidth allowance, and gave me the use of a static IP address.

If I had one suggestion to make, it would be that they set up a support forum to supplement their email and telephone support. I’m sure they’ve already answered several of the questions I asked many times over, and I always prefer to bother Google before I bother a support person. Added to that, it would be nice to have a place to share some of the titbits that I discovered – like how to call the ElasticHosts REST API from .Net (you have to force WebRequest to include the Basic Authentication header – see the sketch below).
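For anyone who wants the .Net trick itself, here’s a rough sketch. The endpoint URL and credential values below are placeholders rather than the real ElasticHosts details (their API documentation has the genuine ones); the part that matters is the Authorization header. WebRequest normally holds Basic credentials back until the server sends a 401 challenge, so you build the header yourself to force it onto the first request:

using System;
using System.IO;
using System.Net;
using System.Text;

class ElasticHostsApiSketch
{
    static void Main()
    {
        // Placeholder endpoint and credentials - substitute the values for
        // your own account from the ElasticHosts API documentation.
        const string url = "https://api.elastichosts.com/servers/list";
        const string user = "your-account-uuid";
        const string password = "your-api-key";

        var request = (HttpWebRequest)WebRequest.Create(url);

        // Force pre-emptive Basic Authentication: WebRequest won't send the
        // credentials until it receives a 401 challenge, so add the header
        // by hand.
        string token = Convert.ToBase64String(
            Encoding.ASCII.GetBytes(user + ":" + password));
        request.Headers["Authorization"] = "Basic " + token;

        using (var response = request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            Console.WriteLine(reader.ReadToEnd());
        }
    }
}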

So give ElasticHosts a try. I’m sure you’ll be pleasantly surprised.

Wednesday 21 July 2010

Debugging Windows batch files when used as scheduled tasks

Here’s a little gem that I pieced together thanks to Google this morning: how to troubleshoot a Windows batch file when it’s run as a scheduled task.

We’re using SQL Express, and we want to make sure all our databases are safely backed up on a regular schedule. One thing that Microsoft cut out of SQL Server when pruning it to create the free version is SQL Agent, the tool that enables you to run scheduled tasks against the database.

No big deal: following Greg Robidoux’s advice I created a stored procedure to backup a database, and then wrote a batch file that used SQLCMD to execute it for each database on the server. Add to the batch file a call to RoboCopy to transfer the backups to our NAS drive, then set up a scheduled task against the batch file, and I’m done, I thought.

If only!

The first problem was how to get the task to run under the Local System account – I didn’t want to use a standard account, because then I’d have the hassle of password management (I’m using Windows Server 2003 here – if I were on Windows Server 2008 R2 I could use Managed Service Accounts and have the server take care of the passwords). Going through the Add Scheduled Task UI doesn’t give you the option of using the Local System account.

For that, I discovered, you need to use the at command:

at 20:00 /every:m,t,w,th,f,s,su "D:\Backups\DoBackups.bat"

does the job, scheduling my batch file for 8:00pm every day of the week.

OK. Scheduled task appears in the list. Run it to check that it works. Uh oh!

The task clearly failed to complete, but how was I to find out why? Clicking Advanced > View Log in the scheduled tasks window brings up the log file – the completely useless log file that tells you that your task started and then stopped straight away “with an error code of (2)”. Right – could you be more … specific?

So I pushed further into the murky world of bat file programming.

Joshua Curtis saved the day. His post on Redirecting Output to a File in Windows Batch Scripts gave me exactly what I needed.

First, I refactored my batch script into DoBackupsCore.bat. Then, in DoBackups.bat I wrote this:

echo [%date% - %time%] Log start > D:\Backups\log.txt
CALL "D:\Backups\DoBackupsCore.bat" >> D:\Backups\log.txt 2>&1

On the first line, the > operator redirects output to the log file, erasing anything already in it. On the second line, the >> operator redirects the output of my actual backup script to the log file, but appends to what’s already there. The really clever part of this magic spell is the last four characters: 2>&1. I refer you to Joshua for the details, but essentially it redirects the error stream to the same place as standard output, so error messages end up in the log file along with the successful outcomes.

So I got what I needed: lovely, wordy error messages enabling me to fix my script and go home with that lovely backed-up feeling.

Friday 16 July 2010

My PDC 2010 Prediction: Free Enthusiast App Hosting on Windows Azure

So Microsoft have announced that there will, after all, be a Professional Developers Conference this year. It will, however, be an austerity PDC, chopped down to two days, held on Microsoft’s campus rather than in a big conference city, and with a much narrower focus. This is billed to be a Cloud Conference. And I think I know what Mr Steve “Developers, Developers, Developers” Ballmer is going to be announcing.

Think about what Microsoft have revealed over the last couple of weeks:

  • WebMatrix, a new streamlined IDE aimed at enthusiasts wanting to produce web applications. And skipping hand-in-hand with WebMatrix comes Razor, a new syntax for creating web views.
  • SQL Server Compact Edition 4, a new version of the embeddable, in-process edition of SQL Server that has been enabled to run inside ASP.Net.
  • IIS Express, a version of IIS 7.5 with full support for the new IIS Integrated pipeline, SSL, etc. but capable of running in a locked down environment with no admin privileges.

Then there are the rumours about KittyHawk, a tool aimed at tech-savvy business users wanting to produce Silverlight applications without any coding.

Add all this up, and what do you get? Hundreds of enthusiasts with shiny new web applications, eager to share them with the world – but not wanting to go through the pain of ftp-ing their creation to a web host.

Hence my prediction: At PDC 2010 Microsoft will announce free hosting for enthusiast-scale web applications within Windows Azure. And they’ll throw in tools to publish web applications from WebMatrix and Visual Studio Express to Azure at the click of a button.

Take a look at the top two feature requests on mygreatwindowsazureidea.com (the official feature request list for Windows Azure): Make it less expensive to run my very small service on Windows Azure and Continue Azure offering free for Developers. Microsoft have seemed pretty serious about addressing issues on such lists recently.

Google have a head-start in this arena: they’ve been offering this kind of free hosting in their AppEngine for years. But with their Visual Studio tooling, I think Microsoft could clean up.

Wednesday 14 July 2010

C# 4 broke my code! The Pitfalls of COM Interop and Extension methods

A couple of weeks back I got the go-ahead from my boss to upgrade to Visual Studio 2010. What with Microsoft’s legendary commitment to backwards compatibility, and the extended beta period for the 2010 release, I wasn’t expecting any problems. The Upgrade Wizard worked its magic without hiccup, the code compiled, everything seemed fine.

[Image: Banana Skin - Chris Sharp / FreeDigitalPhotos.net]

Then I ran our suite of unit tests. 6145 passed. 15 failed.

When I perused the logs of the failing tests they all had this exception in common:

COMException: Type mismatch. (Exception from HRESULT: 0x80020005 (DISP_E_TYPEMISMATCH))

How could that happen? Nothing should have changed: though I was compiling in VS 2010, I was still targeting .Net 3.5. The runtime behaviour should be identical. Yet it looked like COM Interop itself was broken. Surely not?

A footprint in the flowerbed

I dug deeper. In every case, I was calling an extension method on a COM object – an extension method provided by Microsoft’s VSTO Power Tools, no less.

These Power Tools are painkillers that Microsoft made available for courageous souls programming against the Office Object Model using C#. Up until C# 4, calling a method with optional parameters required copious use of Type.Missing – one use for every optional parameter you didn’t wish to specify. Here’s an example (you might want to shield your eyes):

var workbook = application.Workbooks.Add(Missing.Value);
workbook.SaveAs(
    "Test.xls", 
    Missing.Value, 
    Missing.Value, 
    Missing.Value, 
    Missing.Value, 
    Missing.Value, 
    XlSaveAsAccessMode.xlNoChange, 
    Missing.Value, 
    Missing.Value, 
    Missing.Value, 
    Missing.Value, 
    Missing.Value);

The Power Tools library provides extension methods that hide the optional parameters:

using Microsoft.Office.Interop.Excel.Extensions;

var workbook = application.Workbooks.Add();
workbook.SaveAs(
    new WorkbookSaveAsArgs { FileName = "Test.xls"} );

Can you see what the problem is yet? I couldn’t. So I thought I’d try a work-around.

A big part of C# 4 is playing catch-up with VB.Net by improving support for COM Interop. The headline feature is that fancy new dynamic keyword that makes it possible to do late-bound COM calls, but the one that’s relevant here is the support for optional and named parameters. The nice thing is that, whereas using the dynamic keyword requires you to target .Net 4.0, optional and named parameters are a compiler feature, so you can make use of them even if you are targeting .Net 3.5.

That meant that I didn’t need Power Tools any more. In C# 4 I can just rewrite my code like this:

var workbook = application.Workbooks.Add();
workbook.SaveAs( Filename: "Test.xls" );

And that worked. COM Interop clearly wasn’t broken. But why was the code involving extension methods failing?

Looking again over the stack traces of the exceptions in my failed tests I noticed something very interesting. The extension methods didn’t appear. The call was going directly from my method into the COM Interop layer. I was running a Debug build, so I knew that the extension method wasn’t being optimised out.

The big reveal

Then light dawned.

Take a look at the signature of the SaveAs method as revealed by Reflector:

void SaveAs(
   [In, Optional, MarshalAs(UnmanagedType.Struct)] object Filename,
    /* 11 other parameters snipped */)

Notice in particular how Filename is typed as object rather than string. This always happens to optional parameters in COM Interop to allow the Type.Missing value to be passed through if you opt out of supplying a real value.

Now when C# 3.0 was compiling my code and deciding what I meant by

workbook.SaveAs(new WorkbookSaveAsArgs { ... } );

it had to choose between a call to the SaveAs instance method with 12 parameters and a call to the SaveAs extension method with a single parameter. Since C# 3.0 has no notion of optional parameters, a single-argument call can’t possibly match the instance method, so the extension method wins hands down.

But the landscape changes in C# 4.0. Now the compiler knows about optional parameters. It’s as if a whole bunch of overloads have been added to the SaveAs method on Workbook with 0 through to 12 parameters. Which method does the compiler pick this time? Remember that it will always go for an instance method over an extension method. Since new WorkbookSaveAsArgs { … } is a perfectly good object, the compiler chooses the SaveAs instance method, completely ignoring the extension method.
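To see that resolution rule in isolation, here’s a small stand-alone sketch. None of these types are the real interop or Power Tools classes (they’re hypothetical stand-ins), but compile it with the C# 4.0 compiler and the instance method wins every time:

using System;

// Hypothetical stand-ins, purely to illustrate the overload-resolution rule.
public class Workbook
{
    // Mimics the COM-generated signature: the optional parameter is an object.
    public void SaveAs(object Filename = null)
    {
        Console.WriteLine("Instance method, Filename = " + (Filename ?? "<missing>"));
    }
}

public class WorkbookSaveAsArgs
{
    public string FileName { get; set; }
}

public static class WorkbookExtensions
{
    // Mimics the Power Tools helper: a strongly-typed extension method.
    public static void SaveAs(this Workbook workbook, WorkbookSaveAsArgs args)
    {
        Console.WriteLine("Extension method, FileName = " + args.FileName);
    }
}

class Program
{
    static void Main()
    {
        // The argument converts happily to object, so the instance method is
        // applicable, and an applicable instance method always beats an
        // extension method: the extension method is never even considered.
        new Workbook().SaveAs(new WorkbookSaveAsArgs { FileName = "Test.xls" });
        // Prints: Instance method, Filename = WorkbookSaveAsArgs
    }
}

In the real interop assembly the instance method has 12 parameters, so under C# 3.0 a single-argument call could only ever mean the extension method; it’s C# 4.0’s optional parameter support that lets the instance method muscle back in.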

All seems well until we hit Run and Excel is handed the WorkbookSaveAsArgs instance by COM Interop. Not knowing what to make of it, Excel spits it back out in the form of a Type Mismatch exception.

Mystery solved.

The moral

So watch out if your code uses extension methods on COM objects and you’re planning on upgrading to VS 2010. Don’t assume that your code works the same as before just because it compiled OK. Unit tests rule!

Update: Microsoft have confirmed that this is indeed a breaking change that they didn’t spot before shipping.