Archive for the ‘ClickOnce Deployment’ Category

Click Once deployment and VS2013 Update 3

September 5, 2014

As many of you know, I still answer the occasional question about Click Once deployment. This is a deployment feature that lets you host a client or console application deployment on a web server, a file share, or even in Azure blob storage, and perform automatic updates when there are changes.

For the benefit of all those who used this technology, I try to keep up with what Microsoft’s doing with it. That’s not as hard as it might seem, as they rarely do anything with it. (I can sense you all nodding your heads.) However, there are a couple of changes in Update 3 for VS2013 that are pertinent specifically to Click Once deployment. I’ve known about this for a couple of months, so I’ve already gotten over the shock. If you need a few minutes to recover, please go ahead. I’ll be here when you come back.

Problem using WinRT APIs in a WPF application

The first thing is the problem referenced here. Scott Hanselman had a blog entry showing how to call WinRT APIs in Windows 8 from C# desktop applications (WPF), but nobody could get it to work. The error was “Resolve Manifest Files task failed” when trying to build the WPF application. To eliminate this problem, go to your project’s Properties, go to the Publish tab, and then select Application Files. Set the publish status of all “*.winmd” files to “Exclude”.

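If you prefer to edit the project file directly, the Application Files publish status is persisted in the .csproj as a PublishFile item. If memory serves, the generated entry looks roughly like this (the file name here is just an example):

<ItemGroup>
  <PublishFile Include="Windows.winmd">
    <Visible>False</Visible>
    <Group>
    </Group>
    <TargetPath>
    </TargetPath>
    <PublishState>Exclude</PublishState>
    <IncludeHash>True</IncludeHash>
    <FileType>File</FileType>
  </PublishFile>
</ItemGroup>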

Signing a .NET 4.0 or lower app with a SHA256 code-signing certificate

The second issue shows up when you use one of the new SHA256 code-signing certificates to sign an application that targets .NET 4.0 or earlier. When you do this, the hash for the deployment manifest (.application file) is generated using the SHA256 algorithm, which cannot be understood by .NET 4.0 and below. Running the application on a machine that only has .NET 4.0 or below results in exceptions like “The application is improperly formatted”, “The manifest may not be valid”, “Manifest XML signature is not valid”, or “SignatureDescription could not be created for the signature algorithm supplied”.

With Update 3 in VS2013, Microsoft has updated the build tasks to generate the hash using the SHA1 algorithm if the target .NET version for the application is below .NET 4.5, but still use a SHA256 hash for .NET 4.5 and above. So you no longer have to install .NET 4.5 just because you’re using a SHA256 certificate.
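If you want to confirm which algorithm a given deployment was signed with, the deployment manifest is just XML: open the .application file in a text editor and look at the digest and signature method URIs. A SHA-1 manifest contains entries along these lines, while a SHA-256 one uses the …xmlenc#sha256 and …xmldsig-more#rsa-sha256 URIs instead:

<DigestMethod Algorithm="http://www.w3.org/2000/09/xmldsig#sha1" />
<SignatureMethod Algorithm="http://www.w3.org/2000/09/xmldsig#rsa-sha1" />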

Summary

In this article, I discussed a couple of issues Microsoft has fixed in Click Once deployment with VS2013 Update 3. If you have seen any of the issues mentioned above, please install the update and let me know if you still have any problems.

Azure Blob Storage, Click Once deployment, and recursive file uploads

July 17, 2014

In this blog post, I am going to show you how to upload a folder and all of its contents from your local computer to Azure blob storage, including subdirectories, retaining the directory structure. This can have multiple uses, but I want to call out one use that people still using Click Once deployment might appreciate.

I used to be the (only) Click Once MVP, and still blog about it once in a while. Click Once is a Microsoft technology that allows you to host the deployment of your client application, console application, or VSTO add-in on a file share or web site. When updates are published, the user picks them up automatically. This can be very handy for those people still dealing with these technologies, especially since Microsoft removed the Setup & Deployment package feature from Visual Studio after VS2010 and replaced it with a lame version of InstallShield (example of lameness: it wouldn’t let you deploy 64-bit applications). But I digress.

I wrote a blog article showing how you can host your Click Once deployment in Azure blob storage very inexpensively. (It’s even cheaper now.) The problem is you have to get your deployment up to Blob Storage, and for that, you need to write something to upload it, use something like Cerebrata’s Azure Management Studio, or talk the Visual Studio team and ClickOnce support into adding an option to the Publish page for you. I tried the latter — what a sales job I did! “Look! You can work Azure into Click Once and get a bunch of new Azure customers!” “Oh, that’s a great idea. But we have higher priorities.” (At least I tried.)

Having been unsuccessful with my rah! rah! campaign, I thought it would be useful if I just provided you with the code to upload a folder and everything in it. You can create a new Windows Forms app (or WPF or Console, whatever makes you happy) and ask the user for two pieces of information:

  • Local folder name where the deployment is published. (For those of you who don’t care about ClickOnce, this is the local folder that you want uploaded.)
  • The name of the Blob Container you want the files uploaded to.

Outside of a Click Once deployment, there are all kinds of uses for this code. You can store some of your files in Blob Storage as a backup, and use this code to update the files periodically. Of course, if you have an excessive number of files, you are going to want to run the code in a background worker and have it send progress back to the UI and tell it what’s going on.

Show me the code

I’m going to assume you understand recursion. If not, check out the very instructive Wikipedia article. Or put the code in and just step through it. I think recursion is really cool; I once wrote a program in COBOL that would simulate recursion that went up to 10 levels deep. (For you youngsters, COBOL is not a recursive programming language.)

In your main method, you need to add all of the following code (up until the next section).

First, you need to set up your connection to the Azure Storage Account and to the blob container that you want to upload your files to. Assuming you have the connection string to your storage account, here’s what you need to do.

First you’re going to get an instance of the CloudStorageAccount you’re going to use. Next, you get a reference to the CloudBlobClient for that storage account. This is what you use to access the actual blob storage. And lastly, you will get a reference to the container itself.

The next thing I always do is call CreateIfNotExists() on the container. This does nothing if the container exists, but it does save you the trouble of creating the container out in Blob Storage in whatever account you’re using if you haven’t already created it. Or if you have forgotten to create it. If you add this, it makes sure that the container exists and the rest of your code will run.
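One note before the code: these snippets assume the Azure storage client library of that era (the WindowsAzure.Storage NuGet package), with usings along these lines at the top of your file:

using System;
using System.Collections.Generic;
using System.IO;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;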

//get a reference to the container where you want to put the files, 
//  create the container if it doesn't exist
CloudStorageAccount cloudStorageAccount = CloudStorageAccount.Parse(connectionString);
CloudBlobClient cloudBlobClient = cloudStorageAccount.CreateCloudBlobClient();
CloudBlobContainer cloudBlobContainer = cloudBlobClient.GetContainerReference(containerName);
cloudBlobContainer.CreateIfNotExists();

I also set the permissions on the container. If this code actually creates the container, the default access is private, and nobody will be able to get to the blobs without either using a security token with the URL or having the storage account credentials.

//set access level to "blob", which means user can access the blob 
//  but not look through the whole container
//  this means the user must have a URL to the blob to access it
BlobContainerPermissions permissions = new BlobContainerPermissions();
permissions.PublicAccess = BlobContainerPublicAccessType.Blob;
cloudBlobContainer.SetPermissions(permissions);

So now we have our container reference (cloudBlobContainer) set up and the container is ready for use.

Next we need to get a list of files to upload, and then we need to upload them. I’ve factored this into multiple methods. Here are the top commands:

List<string> listOfFiles = GetListOfFilesToUpload(folderPath);
string status = UploadFiles(listOfFiles, folderPath);

After this finishes, all of your files are uploaded. Let’s look at the methods called.

GetListOfFilesToUpload(string folderName)

This is the hard part – getting the list of files. This method calls the recursive routine. It starts with instantiating the list of files that will be the final result. Then it retrieves the list of files in the requested directory and adds them to the list. This is “the root”. Any files or directories in this root will be placed in the requested container in blob storage. [folderName] is the path to the local directory you want uploaded.

//this is going to end up having a list of the files to be uploaded
//  with the file names being in the format needed for blob storage
List<string> listOfFiles = new List<string>();

//get the list of files in the top directory and add them to our overall list
//these will have no path in blob storage because they go in the root, so they will be like "mypicture.jpg"
string[] baseFiles = Directory.GetFiles(folderName);
for (int i = 0; i < baseFiles.Length; i++)
{
    listOfFiles.Add(Path.GetFileName(baseFiles[i]));
}

Files will be placed in the same relative path in blob storage as they are on the local computer. For example, if D:\zAzureFiles\Images is our “root” upload directory, and there is a file with the full path of “D:\zAzureFiles\Images\Animals\Wolverine.jpg”, the path to the blob will be “Animals/Wolverine.jpg”.

Next we need to get the directories under our “root” upload directory, and process each one of them. For each directory, we will call GetFolderContents to get the files and folders in each directory. GetFolderContents is our recursive routine. So here is the rest of GetListOfFilesToUpload:

//call GetFolderContents for each folder to retrieve everything under the top directory
string[] directories = Directory.GetDirectories(folderName);
for (int i = 0; i < directories.Length; i++)
{
    // an example of a directory : D:\zAzureFiles\Images\NatGeo (root is D:\zAzureFiles\Images)
    // topDir gives you just the directory name, which is NatGeo in this example
    string topDir = GetTopDirectory(directories[i]);
    //GetFolderContents is recursive
    List<string> oneList = GetFolderContents(directories[i], topDir);
    //you have a list of files with blob storage paths for everything under topDir (incl subfolders)
    //  (like topDir/nextfolder/nextfolder/filename.whatever)
    //add the list of files under this folder to the overall list
    //eventually, when it works its way back up to the top,
    //  it will end up with a complete list of files
    //  under the top folder, with relative paths
    foreach (string fileName in oneList)
    {
        listOfFiles.Add(fileName);
    }
}

And finally, return the list of files.

return listOfFiles;

GetTopDirectory(string fullPath)

This is a helper method that just pulls off the last directory. For example, it reduces “D:\zAzureFiles\Images\Animals” to “Animals”. This is used to pass the folder name to the next recursion.

private string GetTopDirectory(string fullPath)
{
    int lastSlash = fullPath.LastIndexOf(@"\");
    string topDir = fullPath.Substring(lastSlash + 1, fullPath.Length - lastSlash - 1);
    return topDir;
}
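As an aside, the framework can do this in one line: Path.GetFileName returns the last segment of a path even when that segment is a directory name, as long as there’s no trailing separator (which is the case for paths returned by Directory.GetDirectories). So the method body could be reduced to the line below; I’m spelling it out above because it makes the string handling explicit.

return Path.GetFileName(fullPath);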

GetFolderContents(string folderName, string blobFolder)

This is the recursive routine. It returns List<string>, which is a list of all the files in and below the folderName passed in, which is the full path to the local directory being processed, like D:\zAzureFiles\Images\Animals\.

This is similar to GetListOfFilesToUpload; it gets a list of files in the folder passed in and adds them to the return object with the appropriate blob storage path. Then it gets a list of subfolders to the folder passed in, and calls GetFolderContents for each one, adding the items returned from the recursion into the return object before returning up a level of recursion.

This sets the file names to what they will be in blob storage, i.e. the relative path to the root. So a file on the local computer called D:\zAzureFiles\Images\Animals\Tiger.jpg would have a blob storage path of Animals/Tiger.jpg.

returnList is the List<String> returned to the caller.

List<string> returnList = new List<string>();

//process the files in folderName, set the blob path
//  note: use "/" as the separator; that's the delimiter blob storage treats
//  as a virtual folder boundary (matching the "Animals/Tiger.jpg" example above)
string[] fileLst = Directory.GetFiles(folderName);
for (int i = 0; i < fileLst.Length; i++)
{
    string fileToAdd = string.Empty;
    if (blobFolder.Length > 0)
    {
        fileToAdd = blobFolder + "/" + Path.GetFileName(fileLst[i]);
    }
    else
    {
        fileToAdd = Path.GetFileName(fileLst[i]);
    }
    returnList.Add(fileToAdd);
}

//for each subdirectory in folderName, call this routine to get the files under each folder
//  and then get the files under each folder, etc., until you get to the bottom of the tree(s)
//  and have a complete list of files with paths
string[] directoryLst = Directory.GetDirectories(folderName);
for (int i = 0; i < directoryLst.Length; i++)
{
    //again, build the blob path with "/" as the separator
    string topDir = blobFolder + "/" + GetTopDirectory(directoryLst[i]);
    List<string> oneLevelDownList = GetFolderContents(directoryLst[i], topDir);
    foreach (string oneItem in oneLevelDownList)
    {
        returnList.Add(oneItem);
    }
}
return returnList;

UploadFiles(List<string> listOfFiles, string folderPath)

This is the method that actually uploads the files to Blob Storage. This assumes you have a reference to the cloudBlobContainer instance that we created at the top.

[listOfFiles] contains the files with relative paths to the root. For example “Animals/Giraffe.jpg”. [folderPath] is the folder on the local drive that is being uploaded. In our examples, this is D:\zAzureFiles\Images. Combining these gives us the path to the file on the local drive. All we have to do is set the reference to the location of the file in Blob Storage, and upload the file. Note – the FileMode.Open refers to the file on the local disk, not to the mode of the file in Blob Storage.

internal string UploadFiles(List<string> listOfFiles, string folderPath)
{
    string status = string.Empty;
    //now, listOfFiles has the list of files you want to upload
    foreach (string oneFile in listOfFiles)
    {
        CloudBlockBlob blob = cloudBlobContainer.GetBlockBlobReference(oneFile);
        string localFileName = Path.Combine(folderPath, oneFile);
        blob.UploadFromFile(localFileName, FileMode.Open);
    }
    status = "Files uploaded.";
    return status;
}

Summary

So you have the following:

  • The code for the calling routine that sets the reference to the cloudBlobContainer and makes sure the container exists. This calls GetListOfFilesToUpload and UploadFiles to, well, get the list of files to upload and then upload them.
  • GetListOfFilesToUpload calls GetFolderContents (which is recursive), and ultimately returns a list of the files as they will be named in Blob Storage.
  • GetFolderContents – the recursive routine that gets the list of files in the specified directory, and then calls itself with each directory found to get the files in the directory.
  • UploadFiles is called with the list of files to upload; it uploads them to the specified container.
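In case it helps to see how the pieces fit together, here is a minimal sketch (not from the original post) of a class that wires it all up; it assumes the methods above are members of the same class, and that the connection string, container name, and folder path come from your UI or config:

internal class FolderUploader
{
    //container reference used by UploadFiles; set up in UploadFolder
    private CloudBlobContainer cloudBlobContainer;

    internal void UploadFolder(string connectionString, string containerName, string folderPath)
    {
        //get a reference to the container, creating it if it doesn't exist
        CloudStorageAccount cloudStorageAccount = CloudStorageAccount.Parse(connectionString);
        CloudBlobClient cloudBlobClient = cloudStorageAccount.CreateCloudBlobClient();
        cloudBlobContainer = cloudBlobClient.GetContainerReference(containerName);
        cloudBlobContainer.CreateIfNotExists();

        //build the list of relative blob paths, then upload the files
        List<string> listOfFiles = GetListOfFilesToUpload(folderPath);
        string status = UploadFiles(listOfFiles, folderPath);
    }

    //GetListOfFilesToUpload, GetFolderContents, GetTopDirectory,
    //  and UploadFiles (shown above) go here
}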

If the files already exist in Blob Storage, they will be overwritten. For those of you doing ClickOnce, this means it will overlay the appname.application file (the deployment manifest) and the publish.htm if you are generating it.

One other note to those doing ClickOnce deployment – if you publish your application to the same local folder repeatedly, it will keep creating versions under Application Files. This code uploads everything from the top folder down, so if you have multiple versions under Application Files, it will upload them all over and over. You might want to move them or delete them before running the upload.

This post provided and explained the code for uploading a folder and all of its sub-items to Azure Blob Storage, retaining the folder structure. This can be very helpful for people using ClickOnce deployment and hosting their files in Blob Storage, and for anyone else wanting to upload a whole directory of files with minimal effort.

Bay Area Azure events in March and April 2014

March 14, 2014

There are several upcoming Windows Azure events in the SF Bay Area. All of these events are free and open to everyone. Food and drinks will be provided, so please register if you’re coming so we can make sure we have enough food!

March 18: A Real Story of Azure Migration

On March 18, Giscard Biamby is coming to speak about his company’s experience migrating one of their larger legacy applications to Windows Azure and how they implemented continuous delivery. It’s always interesting to hear these stories from real customers rather than from Microsoft marketing. For more details or to register, click here.

March 29: Global Windows Azure Bootcamp

On March 29, I will be running a Global Windows Azure Bootcamp at Foothill College in Los Altos Hills with the help of several of my friends. This is a free event run by community leaders worldwide on the same day. So far, there are over a hundred locations confirmed. Everyone is welcome. If you know nothing about Azure and would like to have an opportunity to learn a bit and have people available to help you with some hands-on labs, this is a great opportunity. Also, if you’re already doing Azure and have questions about your project, feel free to attend this bootcamp and take advantage of the opportunity to ask some experts for advice.

I’ll be presenting an overview of Windows Azure. Neil Mackenzie will be speaking on IAAS and Networking. Eugene Chuvyrov and Fabien Lavocat will be showing how to use Mobile Services from an iOS device and a Windows device. The rest of the day will be hands-on labs. For more details or to register, click here.

April 2: Vittorio Bertocci on Identity and Active Directory

On April 2nd, the Windows Azure meetup in San Francisco will be hosting Vittorio Bertocci from Microsoft. Vittorio will be in SF for the Microsoft \\build conference (April 2nd through April 4th). Vittorio is brilliant, and is a vibrant, entertaining speaker, focusing on Identity and Active Directory in Windows Azure. He spoke last year, and we had a huge turnout, lots of conversation and audience participation, and it was a great event. This should be another great meetup. For more details or to register, click here.

April 22nd: Deep Dive on Windows Azure Web Sites

On April 22nd, Aidan Ryan will be speaking at the Windows Azure meetup in San Francisco, doing a deep dive on Windows Azure Web Sites. This feature becomes more robust every day, and Aidan will cover the basics as well as the recent releases, such as web jobs. He’s also promised to incorporate anything new announced at \\build. For more details or to register, click here.

I feel very fortunate to live in the San Francisco Bay Area, where so many opportunities for keeping up with technology are available. I hope I’ll see you at one or more of these events!

Fun stuff to do if you’re in San Francisco for the BUILD conference

June 21, 2013

Is there any tourist-y stuff to do in San Francisco?

I have to start by saying what’s fun for some people will not be fun for everyone. I’m not going to repeat all the San Francisco treats (such as Rice-a-Roni) for you; that’s what guidebooks are for. Everyone knows about Fisherman’s Wharf, Pier 39, Alcatraz, the Golden Gate Park, the California Academy of Sciences at the park, and of course the famous Golden Gate Bridge (the best view of which is from the north side of the bridge, from the battlements in the Marin Headlands). For people who like to shop, the Westfield Center is on Market and Powell St, and Union Square is two blocks up Powell.

There’s also the Letterman Center for the Digital Arts in the Presidio, home to Lucasfilm and ILM. (Take your picture with the Yoda fountain!). If you have a car, you can drive north on 101, and take the Lucas Valley Drive exit and go west to Nicasio, and drive by the entrance to Skywalker Ranch. (I’m not posting the # here, I don’t want the double-L’s coming after me (the Lucasfilm Lawyers)). (Did you notice I closed all of my parentheses correctly? Good to know that learning LISP was relevant to my life.) Oh, by the way, you can’t see anything from the entrance, and they have a lot of security cameras, so don’t try climbing over the fence and running for the main house. (Don’t ask.)

Any tech events going on around \\build?

So let’s talk about fun things to do if you’re a tech person coming to \\build – you know, the parties, the events happening at the same time, where you can see people you haven’t seen since the last conference or MVP summit? Here’s a list I managed to cobble together. If you know of any I’ve missed, please add them in the comments so I can go to them, too. Be sure to check the event pages themselves in case there are any changes after I post this.

Monday, 24 June

  • Microsoft .NET client-side development futures / panel discussion. Microsoft offices, SF, 6:30 p.m.
    Discuss Microsoft .NET client-side development and the future thereof with Laurent Bugnion and Ward Bell (both are Silverlight MVPs). There will also be 1 or 2 guys from Xamarin joining in. More info here.
  • Preparing applications for production environments. Microsoft offices, Mountain View, 6:30 p.m.
    You need a car to get from SF to this meetup in Silicon Valley about preparing applications for production environments. More info here.

Tuesday, 25 June

  • Vittorio Bertocci speaking about Identity/AD/ACS and Windows Azure. Microsoft offices, SF, 6:30 p.m. 
    Come see Vittorio Bertocci, a superstar from Microsoft who is the expert in Identity/AD/ACS in Windows Azure! He’s always entertaining and informative, and great at answering questions. This event is kindly being sponsored by AppDynamics, so there will be pizza and drinks; please sign up ahead of time so we get enough pizza! More info here.
  • Bay Area F# User Group meetup. Microsoft offices, SF, 6:30 p.m.
    Meetup of the Bay Area F# user group. More info here.
  • Xamarin Welcome Party, 7:00 p.m.-midnight
    This is conveniently about a block from the Microsoft offices, and I suspect their numbers will coincidentally increase at about the time the two meetups end. More info here.

Wednesday, 26 June

  • Glenn Block speaking about scriptcs. GitHub HQ, SF, 7:00 p.m.
    Come see Glenn Block from Microsoft talk about scriptcs at the second SF .NET User Group (yes, there’s two, don’t ask). GitHub is on 4th St; I hear they are letting in the first 55 people who have RSVP’d to the meetup. More info here.
  • \\Build Welcome Reception
    If the pattern of the past few \\build conferences holds, the \\build welcome reception will be Wednesday night. I’ll post more information when I find out if I’m right or not.

Thursday, 27 June

  • Deep Fried Bytes party. Thirsty Bear Brewing Co., South of Market, 8:00-10:00 p.m.
    To get tickets to this, you have to track down Chris Woodruff or Keith Elder at the \\build conference. More info here.
  • \\Build Attendee Party
    This is another educated guess. If the pattern holds, there will be an Attendee party on Thursday night. I’ll post details here when/if I get them!

How do I find the Microsoft office in San Francisco?

Several of these events are at the Microsoft offices in San Francisco. They are very generous with their space, and we who run the meetups and user groups really appreciate their support, especially that of Bruno Terkaly with DPE for hosting all of our SF meetups.

The office is about two blocks from the Moscone Center, where the \\build conference is being held, on Market Street where Powell runs into Market. Of course, they don’t have a big sign on the street that says Microsoft; you have to be “in the know” to find it. Fortunately, Microsoft Research (apparently in the same location) has a very nice page here that shows you where it is.

Is Rice-a-Roni really the San Francisco Treat they claim it is?

Yes it is. Do you have the Rice-a-Roni song in your head yet?

Windows 8 and ClickOnce : the definitive answer revisited

April 14, 2013

Since I posted the article titled Windows 8 and Click Once: the definitive answer, it became clear that it was not actually the definitive answer.

I got a ping on Twitter from Phil Haack at GitHub telling me that this did not fix their Smart Screen filter problem.

After talking to him, and seeing his build and signing commands, I discovered they recently changed their signing certificate. For those of you who remember the early days of ClickOnce (2005) when you changed the signing certificate and everybody had to uninstall and reinstall the application, this seemed too likely an indicator to ignore.

Reputation

I didn’t talk in my article about “reputation” (and I should have, so I duly apologize here for that). In my first conversations with Microsoft, they mentioned that an application had a reputation, and this reputation had some bearing on the appearance of the Smart Screen Filter, and this reputation was built based on how many people had installed your application.

When I asked how many people had to install your application before the Smart Screen filter stopped interrupting the running of the application, I could not get a clear answer. Of course, this makes sense that they wouldn’t want to make their algorithm public, because you could publish your app, install it X number of times yourself, and make it reputable. (I’m not suggesting you do that, but if you do, please post back and tell us your results. Inquiring minds want to know.)

Since we’ve been in business for a few years, and have well over a thousand customers (probably thousands) who have installed the desktop application, this didn’t impact us. The reason I didn’t mention it in the blog post is because I created a new Windows Forms test application and deployed it solely for the purpose of testing the information in the article, and had no problem with the Smart Screen Filter. I installed the application maybe a dozen times while messing with the build commands, so I figured, “Wow, the number of installs required is pretty small.” Haha!

So on behalf of Phil, I pinged my contact at Microsoft, and he went off to investigate. After a bit of research, he found some information internal to Microsoft. I won’t quote it directly in case I shouldn’t, but the gist of it was this: The digital certificate information may be taken into account when determining the reputation of the application. A-HA! I thought to myself (and immediately started humming that song, “Take On Me”.)

So the problem at GitHub is probably due to the certificate being updated right about the same time they start signing their assembly for customers using Windows 8. I expect that fairly soon, as people install or get updates (if they are using automatic updates), their reputation will be sterling and nobody will ever see the Smart Screen Filter again when installing GitHub.

Knowing this, it makes sense that my test application didn’t get stopped even though it was a brand new application. I signed it with my company’s signing certificate, which has been in use for several months.

Which leads me to another issue that came up when talking to Phil. I noticed that rather than using PostBuild or BeforePublish commands, he was using AfterCompile commands to sign his executable. I asked him about it.

PostBuild, BeforePublish, and AfterCompile, oh my!

Apparently, when Phil signed his executable using PostBuild or BeforePublish commands, any user who installed it got the dreaded “exe has a different computed hash than specified in the manifest” error. He found that using AfterCompile instead fixed the problem.

I went back to Microsoft, and they soon verified the problem, and said it is due to WPF applications having a different set of targets and execution order, so the standard AfterBuild/BeforePublish commands don’t quite work. So the bottom line is this: The signing of the exe doesn’t work right with BeforePublish or PostBuild if you are using VS2012 and you have a WPF application. In that case, you must use AfterCompile. So in the original post, use case #3, but put in AfterCompile instead of BeforePublish.

If you are using VS2010, OR you have a Windows Forms or Console application, you can use PostBuild or BeforePublish with no problem.
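For reference, here is what use case #3 looks like with that change: it’s the BeforePublish target from the original post (reproduced further down this page), with only the target name swapped to AfterCompile. The certificate file name and password are placeholders; adjust the paths to match your project:

<Target Name="AfterCompile">
  <Exec Command="&quot;C:\Program Files (x86)\Microsoft SDKs\Windows\v7.0A\bin\signtool.exe&quot; sign /f &quot;$(ProjectDir)MyCert.pfx&quot; /p mypassword /v &quot;$(ProjectDir)obj\x86\$(ConfigurationName)\$(TargetFileName)&quot;" />
</Target>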

Hopefully we now have the definitive answer to handling the Smart Screen filter and signing a ClickOnce application that will be run on a Windows 8 machine.

Thanks to Zarko Hristovski and Paul Keister, who also reported the problem with the BeforePublish command, and who verified that AfterCompile worked for them. Thanks to Phil Haack for the answer to a problem I didn’t know existed yet. And thanks to Saurabh Bhatia at Microsoft for his help with Windows 8 and ClickOnce.

Tech Days San Francisco, 2-3 May 2013, through Azure-colored glasses

April 9, 2013

Living in the San Francisco Bay Area is awesome if you work in tech. There are so many companies springing up all the time and so many interesting places to work. The hard part of working in tech is keeping up with the current technologies and learning the new skills that can help you advance your career. A great way to do that is to keep your eyes open for conferences, dev days, tech days, etc., in your area, sign up and go. There are so many great opportunities being offered by the community leaders in your area.

A really interesting opportunity is coming up in the San Francisco Bay Area in early May – Tech Days SF. While primarily for IT Pros, there are also sessions that will be interesting to developers. What developer couldn’t benefit from knowing more on the IT Pro side? I was recently talking to another Azure MVP, and we agreed that now with all of the features in Windows Azure, it would behoove us to learn about virtual networks and some of the other IT-type features we never had to know when just doing software development.

There are some great speakers coming, which I doubly appreciate, because I managed to poach Glenn Block from Microsoft to speak at the Azure Meetup in San Francisco the night before (5/1) about mobile services (official announcement coming soon). And there is going to be a wide variety of topics; here is a random selection that just coincidentally seem Azure-related or Azure-useful:

  • Windows Azure
  • Managing the Cloud from the CmdLine
  • Microsoft IT – Adopted O365 and Azure
  • Windows Azure Virtual Machines (IAAS)
  • PowerShell Tips and Tricks (You can use PowerShell scripts with Windows Azure)
  • Manage Server 2012 Like a Pro or Better, Like an Evil Overlord (I like the title)

This is just one of many opportunities available to keep your skills up-to-date. So check it out, sign up, and go expand your knowledge!

(Reminder – There’s also a Global Windows Azure Bootcamp in San Francisco on 4/27!)

Windows 8 and ClickOnce : the definitive answer

February 24, 2013

There have been a lot of copies of Windows 8 sold since it came out a few months ago, and the Surface Pro was just released. (In fact, I’m writing this on my brand new Surface Pro, which I really like, but that’s a subject for another time.)

If you’re using ClickOnce deployment, you’re probably wondering how (or if) it’s going to work with Windows 8. I’ve worked with Saurabh Bhatia at Microsoft to ensure that this article will cover what you need to know. We use ClickOnce at GoldMail (whose product is now called Point Across) for our desktop product and VSTO applications, as well as several internal utility applications, so I’ve also tested this on our products to make sure it’s accurate.

If you are hosting your deployment on a file share or on an intranet, you won’t have to make any changes. You can go get ice cream now while the rest of us soldier on.

If you are hosting your deployment on the internet, you will eventually get calls from your customers who have upgraded to Windows 8 or purchased a Windows 8 machine. So let’s talk about that.

I’m not going to talk about the bootstrapper right now; that’s going to come up later. For now, let’s concentrate on the ClickOnce application itself. When a user installs a ClickOnce application on Windows 8, here’s what happens:

  • ClickOnce gets the manifest, checks the certificate, and shows the ClickOnce prompt with “trusted publisher” or “unknown publisher” (depending on your signing certificate).
  • The user clicks the Install button.
  • It checks the certificate on the executable. If it’s not signed, the Smart Screen Filter is triggered.

So here’s what the user experience looks like when you install a ClickOnce application on Windows 8:

You get the standard install prompt:

The publisher is known because I am signing the deployment with a signing certificate purchased from a Certificate Authority – in this case, Verisign.

If you click Install, it shows the standard install dialog and actually installs the application. But then it shows a blue band across your screen saying, “Windows SmartScreen prevented an unrecognized app from starting. Running this app might put your PC at risk.”

There is a small “More Info” link under the warning, and a big “OK” button on the bottom of the dialog. Which one would you click? Which one would your customers click? Most people will click the OK button.

If the user clicks OK, the dialog closes, and nothing else happens. Now let’s say the user goes to TileWorld (I’m borrowing David Pogue’s name for the new Windows 8 interface formerly known as Metro). The user can see the application there in the list of apps because it actually got installed. If he clicks on it to run it, nothing happens. So congratulations! The user has installed your application, but he can’t run it.

What happens if the user clicks “More Info” instead of “OK”? He sees the following screen, and he can choose “Run Anyway” or “Don’t run”.

For “Publisher”, it says “Unknown publisher” – this is referring to the executable, which is not specifically signed. Only the manifests are signed. This has never been a requirement for ClickOnce deployments. Until now.

If the user chooses “Run Anyway”, it will run the application. Yay! And when he goes back to TileWorld and tries to run it from there the next time, it will work and will not prompt him again. Yay!

So let’s say he clicks “Run Anyway”, and now he has no problem running your application. What happens when an update is published and he installs it? Uh-oh. The smart screen filter interrupts again, and he has to select “More Info” and “Run Anyway” again.

Is there a way to circumvent your ClickOnce application being captured and stopped by the Smart Screen Filter? Yes. Otherwise, this would be a much shorter (and depressing) article. All you have to do is sign the application executable after building it and before deploying it. For this, you need your signing certificate and signtool.exe, which ships with the Windows SDK. There are three points in the build/publish process at which you can do this:

1. Post-publish

2. Post-build

3. Pre-publish

#1: Signing the application executable post-publish

To do it post-publish, you have to do the following:

  • a. Publish the files to a local directory.
  • b. Use signtool to sign the exe for the application.
  • c. Use mage or mageUI to re-sign the application manifest (.exe.manifest).
  • d. Use mage or mageUI to re-sign the deployment manifest (.application).
  • e. Copy the files to the deployment location.

If you’ve already automated your deployment with a script and msbuild, this may be the choice you make. If you publish directly from Visual Studio, the other two options are easier.
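As a rough sketch of steps b through d with placeholder file names (mage.exe ships in the Windows SDK alongside signtool.exe; if your publish options append .deploy to the assemblies, you’ll need to account for that when signing the exe):

signtool sign /f MyCert.pfx /p mypassword "Application Files\MyApp_1_0_0_1\MyApp.exe"
mage -Update "Application Files\MyApp_1_0_0_1\MyApp.exe.manifest" -CertFile MyCert.pfx -Password mypassword
mage -Update MyApp.application -AppManifest "Application Files\MyApp_1_0_0_1\MyApp.exe.manifest" -CertFile MyCert.pfx -Password mypassword

The order matters: signing the exe changes its hash, so the application manifest has to be re-signed after the exe, and the deployment manifest after the application manifest.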

#2: Signing the application executable post-build

To do this, you define a post-build command in your project. Assuming your certificate (pfx file) is in the top level of your project, you can use something like this:

"C:\Program Files (x86)\Microsoft SDKs\Windows\v7.0A\bin\signtool.exe" sign /f "$(ProjectDir)TestWin8CO_TemporaryKey.pfx" /p nightbird /v "$(ProjectDir)obj\x86\$(ConfigurationName)\$(TargetFileName)"

  • The double quotes are required.
  • “C:\Program Files (x86)\Microsoft SDKs\Windows\v7.0A\bin\signtool.exe” is the path to the signtool application, used to sign the executable.
  • $(ProjectDir) points to the top directory of the project. The subfolder “\obj\x86” will vary depending on your build output path. The above was created and tested on VS2010. On VS2012, my subfolder is just \obj.
  • $(ConfigurationName) is the build configuration name, such as Debug or Release – this is required because it signs it in the obj directory and has to know which folder to use.
  • $(TargetFileName) is the name of the application executable.
  • TestWin8CO_TemporaryKey.pfx is the name of my certificate file, which is in the top folder of my project.
  • /p nightbird – this is the password for my temporary certificate

I have specified the full path to signtool.exe. I tried to do this with one of the msbuild variables that points to the location of the .NET framework files, but it doesn’t work – it doesn’t translate the variable until after it executes the statement. If you print it out in the post-build command, it shows the right location in the Visual Studio output window, but gives you an error that it can’t find it when it actually runs the statement. I’m saving you some time here, because I messed around with that for quite a while trying to get it to work, and after I asked Saurabh at Microsoft, he couldn’t get it to work without specifying the whole path, either. So if you get it to work with an msbuild variable, let me know how.

After you’ve created your version of the post-build command, you need to put it in the project properties. Double-click on Properties and click on the Build Events tab. Put your command in the Post-build event command line box.

Now build the project, and the output window will show the results.

If you now publish the application and put the files in the deployment directory, the user can install it and will not see the Smart Screen Filter. Yay!

What if you have multiple programmers working on the application, and they all build and run the application? Every programmer must have signtool.exe in the exact same location for this post-build command to work for everybody. If you have a 32-bit machine, the folder for the “Microsoft SDKs” is under “C:\Program Files”, without the “(x86)” on the end. And someone might actually install Windows to a drive other than C. If their signtool.exe file is not in the same location, they can’t build and run the application, which means they can’t put in changes and test them.

Only the person publishing the application really needs this build command to work. So how do you execute this only for the person publishing the application? You can set up a pre-publish command.

#3: Signing the application executable pre-publish (recommended solution)

The pre-publish command is executed after building the application and right before publishing it. There is no box for this under Build Events, so you have to add it to the project yourself. (Be sure to clear out the post-build event command line before doing this.)

To add a pre-publish command, right-click on the project in Visual Studio and select “Unload Project”.

Now right-click on the project again and select “Edit yourprojectname.csproj”.

It will open the csproj file in Visual Studio so you can edit it. Go down to the bottom and add a new section before the </Project> line. You’re going to put your pre-publish command line in this section.

<Target Name="BeforePublish">

</Target>

So what do you put in this section? You are going to specify a command to execute, so you have to use Exec Command, and put the command to execute in double quotes. Since you can’t put double-quotes inside of double-quotes (at least, not if you want it to work), you need to change the double-quotes in your command to &quot; instead. So my build command from above now looks like this:

<Exec Command="&quot;C:\Program Files (x86)\Microsoft SDKs\Windows\v7.0A\bin\signtool.exe&quot; sign /f &quot;$(ProjectDir)TestWin8CO_TemporaryKey.pfx&quot; /p nightbird /v &quot;$(ProjectDir)obj\x86\$(ConfigurationName)\$(TargetFileName)&quot;" />

After making this match your parameters, save the csproj file and then close it. Then right-click on the project and reload it.

Now if you build your project, you won’t see anything about signing the application executable in the output window. It will only do it if you publish, and there won’t be logging letting you know it signed it. How do you know if it worked? Go to the folder you published to, and look in the Application Files folder. Locate the application executable in the folder for the new version. Right-click on it, choose properties. Look for a tab called “Digital Signatures”. If it’s not found, it’s not signed. If you do see it, go to that tab; it will show the signature list and the signer of the certificate. You can double-click on the signer and then view the signing certificate.
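If you’d rather check from the command line, signtool can verify the signature too; the /pa switch tells it to use the standard Authenticode verification policy. A sketch, with a placeholder path (drop or account for the .deploy extension if your publish settings append it):

signtool verify /pa /v "Application Files\MyApp_1_0_0_1\MyApp.exe"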

How will the application work after publishing it with a signed executable?

If you sign your executable and your deployment with a valid certificate from a Certificate Authority like Verisign using one of the methods above, when the user clicks install, it will install without stopping and showing the SmartScreen filter, and updates will do the same. Yay!

Do I have to use a certificate from a Certificate Authority to circumvent the Smart Screen Filter?

Yes.

Is there any workaround?

No.

If you try the old tried and true “install the certificate in the trusted publishers store on the client computer”, you will find that this does not circumvent the Smart Screen Filter. You must have a certificate from a valid Certificate Authority. Without one, your customer will get the Smart Screen filter when he installs the application, and every time he installs an update.

What about the bootstrapper (setup.exe)?

The bootstrapper (setup.exe) is signed the same way as the ClickOnce deployment; this happens when you publish. When run, this installs the prerequisites and then calls the ClickOnce application installation. If your certificate is not from a valid CA, the Smart Screen Filter will catch it. This isn’t as critical a problem as the ClickOnce deployment itself because in most cases, your users will only run this the first time.

What about VSTO applications?

If your VSTO application is deployed via a file share or the intranet zone, you will not be impacted. If your VSTO application is deployed via the Internet zone, you may be impacted.

There is no executable for a VSTO application, just an assembly, so you don’t have to do any extra signing. However, the following is true:

If you sign your deployment with a certificate from a CA, everything will work fine, and the Smart Screen filter will not interrupt either the setup.exe or the vsto file from installing the app or keep the app from running.

If you are using a test certificate, setup.exe will be caught by the Smart Screen filter. If you click ‘Run Anyway’, it will install the prerequisites, but it will not let you install the VSTO application.

If you install the test certificate in the Trusted Publishers store, setup.exe will still be caught by the Smart Screen filter, but the VSTO application can be installed and run. This is strongly advised against, as installing the certificate on the user’s machine introduces a significant security risk.

Which method do you recommend?

The advantage of the post-build command is that it is transparent. You can easily go into the Build properties and see there is a post-build command. A pre-publish command is kind of hidden in the project file. However, everybody has to have signtool.exe in the same place, and for us that’s a non-starter. Also, if I did leave the post-build command in there, someone might change it to match their path and check in the change, causing a problem when we actually try to build the application for production.

I used the post-build methods to test my build command until I got it to work, and then ported it to a pre-publish command. 

To summarize, a flowchart:

In summary, here’s a flowchart to help you easily see whether your users will get the Smart Screen filter when they install your application on Windows 8.

One last note: The first version of VS2012 had a bug where the bootstrapper created when publishing a ClickOnce application would not work on a Windows XP machine. This problem was fixed in the first update.

[edit: Fixed build paths, some \’s were missing. Added recommendation. –Robin 2.26.2013]

[edit: After publishing this article, I heard from a couple of people who were still having problems. Please check out the next blog article about this if you are still having problems with the Smart Screen filter, or getting the dreaded “exe has a different computed hash than the manifest” error. –Robin 4.14.2013]

Host your ClickOnce deployment in Azure for pennies per month

July 18, 2011

A while back, I wrote an article that shows you how to host your ClickOnce deployment in Windows Azure Blob Storage. The article assumed that you already had a Windows Azure account.

Since prequels are so popular in Hollywood (Star Wars I-III, anyone?), I thought I would write a prequel to explain how much it costs to host your deployment in Azure, and how to sign up for an Azure account and create the storage account. Hopefully, this article will be more popular than Jar Jar Binks.

Show me the money

How much does it cost to host your ClickOnce deployment in Windows Azure Storage? Well, for a pay-as-you-go account, here are the costs as of today, which I found by going here and clicking on “Pay-As-You-Go”.

Windows Azure Storage

  • $0.15 per GB stored per month
  • $0.01 per 10,000 storage transactions

Data Transfers

  • North America and Europe regions: $0.15 per GB out
  • Asia Pacific region: $0.20 per GB out
  • All inbound data transfers are at no charge.

Let’s take an example. Let’s say we have a deployment consisting of 30 files and a total size of 30MB. We have 100 customers, and we are going to publish a new version every month, starting in January, and all 100 customers are going to update to every version. At the end of the year, how much will this have cost us?

Put your mathlete hats on and get out your calculators. Ready? Here we go…

The storage cost for one month = $0.15/GB * 30MB * 1GB/1000MB = $0.0045. So January will be (1*value), February will be (2*value) because we’ll have two versions, March will be (3*value), and so on until December, when it hits (12*value) because we have 12 versions stored. That’s 1+2+…+12 = 78 version-months in all, so the total cost of storing the deployment files for the year will be 78 * $0.0045 = $0.351. This is affordable for most people.

Let’s talk about the storage transactions. If you have a file bigger than 32MB, it is one transaction per 4MB and one at the end of the list of blocks. If the file is smaller than 32MB, it’s 1 transaction for that file. All of the files in our case are less than 32MB. So when we upload a new version of the deployment, here are the costs:

Storage Transaction cost when uploading once = 30 files * $.01/10000 = $0.00003.

Data Transfer costs are free going up, so nothing to calculate there. How about coming back down to your customer?

Transaction cost when downloading once = 30 files * $.01/10000 = $0.00003.

Data transfer cost when downloading once = 30 MB * 1GB/1000MB * $0.15/GB = $0.0045

Now you’re wishing you’d paid attention in all of those math classes, aren’t you? And we’re not done yet. Let’s calculate our total for the entire year.

  • $0.00036 = Storage Transaction cost for uploading 12 versions throughout the year.
  • $0.00 = Data Transfer cost for uploading 12 versions.
  • $0.351 = Storage for 12 versions uploaded once per month and retained throughout the year.
  • $0.036 = Storage Transaction cost for downloading 12 versions for 100 customers.
  • $5.40 = Data Transfer cost when downloading 12 versions for 100 customers.

So our grand total is $5.78736, which is an average of about 48 cents per month.

For more detailed information on Windows Azure storage costs, check out this blog entry from the Windows Azure Storage Team; it was written before they eliminated the Data Transfer cost of uploading to blob storage, so don’t include that cost. Thanks to Neil Mackenzie for clarification, and for providing the link to the Windows Azure Storage Team blog.

Hook me up with an Azure account

You have three basic options.

  1. If you have an MSDN subscription either through your company or because you are a BizSpark customer, you probably get an MSDN benefit that more than covers your ClickOnce deployment costs. The sign-up mechanics will be a little different, but the way you set up your storage account will be the same, so the information below should work for you as well as for those who have no MSDN account. You will have to give your credit card to cover any charges over the free usage benefit.
  2. If you want to try this out for free without giving your credit card, you can sign up for a free 30-day Azure pass. At the end of 30 days, you will have to delete the storage account and set it up on a real account if you want to continue using it. (If you use the same storage account name on the new account, the URL will be the same and your users will be able to pick up updates even though you changed accounts.)
  3. If you sign up for a pay-as-you-go account, you have to give your credit card, but you get a free benefit which would make my deployment example free for the first 3 months. Then at the end of 3 months, it will start charging your credit card, and you will not have to move your storage account. Let’s take a look at how to sign up for this type of account.

Go to http://www.microsoft.com/windowsazure/offers/ and you should land on the Windows Azure Platform Offers page shown in Figure 1.


Figure 1: Windows Azure Platform Offers

Click on the Pay-As-You-Go tab and then click the Buy button on the right. Next, you will be given a choice to sign up for a new Windows Live account, or use one you already have (Figure 2).


Figure 2: Sign up or sign in.

They are going to send you e-mail on this account, so be sure it’s an account you actually check periodically. After logging in with your Windows Live account, you will be prompted for your profile information (Figure 3).


Figure 3: Profile information.

Fill in your address and phone number and click the Next button. You will be prompted for company information (Figure 4). I think you’ll find that a lot of people work for “n/a”. I doubt Microsoft looks at that information, but you can amuse yourself by putting in the name of the most popular fruit in America, just in case someone IS looking at the company names — give them a surprise. Although, it is widely reported that Apple uses Windows Azure Storage for their new iCloud service, so it might not surprise them at all. (Google would definitely surprise them!)


Figure 4: Company information

Now they will ask for your Service Usage Address. (You can check the box to use the information you filled in on the profile page.) This is displayed in Figure 5.


Figure 5: Service Usage Address.

Fill in the information and click Finish. Next you will get directions to close this page and go to the Services page. You will find yourself at the Customer Portal for the Microsoft Online Services (Figure 6).


Figure 6: Customer Portal for Microsoft Online Services

Now you get to pick a plan. If you pick the Windows Azure Platform Introductory Special, they provide some benefit for free for the first 90 days. This benefit covers our ClickOnce deployment example above, so it would be free for the first three months, and then would cost you as noted above. If you’re nuts and you don’t like free stuff and just want to pay now, you can select the Windows Azure Platform Consumption offer. Click the Buy Now button on your selection; you will be prompted to log in again and then taken to the Pricing and Online Subscription Agreement screen (Figure 7).


Figure 7: Pricing and Online Subscription Agreement.

Fill in your subscription name. Pick something that you like and can remember. Then read the Online Subscription agreement as carefully as you read all of these things, check the box and hit the Next button. If you don’t read it carefully, and Microsoft comes to your house to pick up your firstborn child, don’t say I didn’t warn you.

Next comes the hard part. Fill in your credit card information and click the Submit button. If your credit card information is correct, you will be sent to the Azure portal (Figure 8).

I now have an Azure account! How do I set up my new storage account?

This is the Windows Azure Portal, which you can reach through this URL: http://windows.azure.com


Figure 8: Windows Azure Portal

This screen is where you manage all of your Azure services. You can define services, set up databases, and set up storage accounts, which is what we’re here to do. Click on the ‘New Storage Account’ icon at the top of the screen as shown in Figure 9.


Figure 9: Create a new storage account

Next you will be prompted for your new storage account name (Figure 10). This will be used in the URLs for accessing your deployment, so you should probably think twice before making it something like “myapplicationsux” or “mypornpix”. The name must have only lowercase letters and numbers. After you fill it in, it will tell you if it’s already used. If it doesn’t give you any errors, it’s available.

As for the region, you will be asked to either choose a region, choose an affinity group, or create a new affinity group. This is not something you can change later, so choose wisely. (Unlike Walter Donovan in Indiana Jones and the Last Crusade, if you choose poorly, you will not instantly grow ancient and disintegrate.)


Figure 10: Create a new storage account

An affinity group is basically specifying a location and naming it. You can then choose the affinity group when setting up other services to ensure that your compute instances and your data are in the same region, which will make them as performant as possible.

Just in case you ever want to use this account for something other than Blob Storage, I recommend setting up an affinity group. Select the radio button for “Create or choose an affinity group”, and then select the dropdown. Then you can select the location – be sure to use the dropdown. Mine defaulted to “anywhere in the US”, but it’s better to select a specific region, such as North Central or South Central, or whatever region is closest to you. Then click OK to go ahead and create the storage account. You should now see your storage account in the Windows Azure Portal (Figure 11).


Figure 11: Storage Account

You can assign a custom DNS entry to your storage account by clicking the Add Domain button on the top of the screen and following the instructions.

The URL for accessing your blob storage is on the right side of the screen. Mine is robindotnet.blob.core.windows.net. On the right are also the View buttons for retrieving the primary access key that you will need to set up a client application to access your blob storage. With these two pieces of information, you should be able to view your data.

For uploading and maintaining your files in blob storage, I use Cloud Storage Studio from Cerebrata which is excellent, but not free. There are free storage explorers available, such as the Azure Storage Explorer from CodePlex and the Cloudberry Explorer for Azure Blob Storage.

You should be good to go. Now go read the article on how to actually put your ClickOnce deployment in your new storage account, and start racking up those pennies.

How to host a ClickOnce deployment in Azure Blob Storage

February 13, 2011

Now that Microsoft Azure is becoming more widely used, I’m going to do some blogging about it, since I’ve had an opportunity to work with it quite a bit. What better place to start than to do a crossover blog entry on both ClickOnce deployment and Microsoft Azure? So I’m going to show you how to host your ClickOnce deployment in your Azure Blob Storage.

To do this, you need an application that you can use to manage blob storage. I use Cloud Storage Studio from Cerebrata in my example. A free application recommended by Cory Fowler (Microsoft Azure MVP) is the Azure Storage Explorer from CodePlex.

Here is a video that explains this process in detail, complete with screenshots. There is a summary below.

To summarize:

Create a container in blob storage for your ClickOnce deployment. You’ll need the container name when setting your URL. I selected ‘clickoncetest’. The only characters allowed are lowercase letters, numbers, and the hyphen (-).
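If you would rather create the container from code than in a storage tool, here is a sketch, again assuming the WindowsAzure.Storage client library, with placeholder credentials. ClickOnce clients read the deployment anonymously, so the container gets public read access at the blob level.

    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Auth;
    using Microsoft.WindowsAzure.Storage.Blob;

    class CreateDeploymentContainer
    {
        static void Main()
        {
            var account = new CloudStorageAccount(
                new StorageCredentials("goldmailrobin", "YOUR_PRIMARY_ACCESS_KEY"), useHttps: true);
            var client = account.CreateCloudBlobClient();

            // Container names: lowercase letters, numbers, and hyphens only.
            var container = client.GetContainerReference("clickoncetest");
            container.CreateIfNotExists();

            // Blob-level public access: anyone can read individual blobs,
            // but cannot list the container or write to it.
            container.SetPermissions(new BlobContainerPermissions
            {
                PublicAccess = BlobContainerPublicAccessType.Blob
            });
        }
    }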

In your project properties, set your Publishing Folder Location to somewhere on your local drive. Set the Installation Folder URL to the URL that will point to the container in blob storage that is going to host your deployment.

For example, I set the first one to E:\__Test\clickoncetest. My account is goldmailrobin, so my installation URL will be http://goldmailrobin.blob.core.windows.net/clickoncetest/

Publish your application. Then go to the local folder and copy the files and folders up to the container in blob storage. When you are finished, in the root of that container you should have the deployment manifest (yourapp.application), the bootstrapper (setup.exe), and publish.htm if you included it. You should also have a folder called “Application Files”.

In “Application Files”, you should see the ‘versioned folders’ that contain the assemblies for each version of your application that you have published.

When doing updates, you need to upload the versioned folder for the new update, and replace the files in the root folder (yourapp.application, setup.exe, and publish.htm).
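If you would rather script that copy step, here is a minimal sketch of a recursive upload, with the same library assumption and placeholder paths and credentials. It walks the publish folder and preserves the folder structure (including “Application Files”) in the blob names.

    using System.IO;
    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Auth;
    using Microsoft.WindowsAzure.Storage.Blob;

    class DeploymentUploader
    {
        static void Main()
        {
            var account = new CloudStorageAccount(
                new StorageCredentials("goldmailrobin", "YOUR_PRIMARY_ACCESS_KEY"), useHttps: true);
            var container = account.CreateCloudBlobClient().GetContainerReference("clickoncetest");

            UploadFolder(container, @"E:\__Test\clickoncetest", @"E:\__Test\clickoncetest");
        }

        static void UploadFolder(CloudBlobContainer container, string rootFolder, string currentFolder)
        {
            foreach (string file in Directory.GetFiles(currentFolder))
            {
                // Blob name = the path relative to the publish root, with forward slashes.
                string blobName = file.Substring(rootFolder.Length).TrimStart('\\').Replace('\\', '/');
                CloudBlockBlob blob = container.GetBlockBlobReference(blobName);
                using (FileStream fs = File.OpenRead(file))
                {
                    blob.UploadFromStream(fs);
                }
            }
            // Recurse into subdirectories such as "Application Files".
            foreach (string subFolder in Directory.GetDirectories(currentFolder))
            {
                UploadFolder(container, rootFolder, subFolder);
            }
        }
    }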

If you have any problems, you can check the MIME types on the published files and make sure they are right. These can be changed for each file if needed. With ClickOnce deployments, you should definitely be using the option that appends .deploy to all of your assemblies, so you should not have any files with unknown extensions. If you want to double-check, the MIME types for a ClickOnce deployment are explained here.
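As a sketch of what fixing a MIME type looks like in code (same library assumption, example blob name): the standard ClickOnce mappings are application/x-ms-application for .application, application/x-ms-manifest for .manifest, and application/octet-stream for .deploy.

    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Auth;
    using Microsoft.WindowsAzure.Storage.Blob;

    class FixMimeType
    {
        static void Main()
        {
            var account = new CloudStorageAccount(
                new StorageCredentials("goldmailrobin", "YOUR_PRIMARY_ACCESS_KEY"), useHttps: true);
            var container = account.CreateCloudBlobClient().GetContainerReference("clickoncetest");

            // Pull the blob's current properties, correct the content type, push it back.
            CloudBlockBlob blob = container.GetBlockBlobReference("yourapp.application");
            blob.FetchAttributes();
            blob.Properties.ContentType = "application/x-ms-application";
            blob.SetProperties();
        }
    }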

Remember that with Blob Storage, the biggest cost factor is not going to be storing the files; it is going to be transferring them to and from the client.

How about a bootstrapper package for SQLServer Express 2008 R2?

February 6, 2011

When publishing a ClickOnce application, you can open the Prerequisites dialog and select any of the packages to be installed prior to the ClickOnce application. You would probably select the .NET Framework that your application targets and Windows Installer 3.1. You would also select the SQLServer Express 2008 database software if you are using a SQLExpress database in your application.

Last summer, Microsoft released SP-1 for SQLServer 2008 and several people posted to the MSDN ClickOnce Forum and Stack Overflow asking where they could get a new, updated bootstrapper package. I decided to pursue it and see if I could track it down.

I’ll just ask Microsoft.

Saurabh Bhatia, who’s kind enough to help me answer the most difficult ClickOnce questions, routed me to the SQLServer Express program manager, Krzysztof Kozielczyk. Krzysztof said he totally agreed that Microsoft should release a new version of the bootstrapper package every time they updated SQLExpress, but they were busy working on the upcoming releases and he did not have the resources to work on it. It’s hard to know what to say when you tell someone “You should have this” and they say “You’re right, we should, I’m sorry we don’t.” (I’m going to have to remember to use that on my boss in the future.)

According to Krzysztof, the problem is that they can’t just create the package and release it. They have to create the bootstrapper package and test it in a bunch of combinations of different variables, such as operating system version, SQLServer Express version (to test that the upgrade works), .NET version, number of olives in a jar, and kinds of mayonnaise available on the west coast, among others. Then if it passes mustard (haha), they have to find somewhere to post it, make it available, and then keep it updated. So more people are involved than just his team, and at that time, everyone in DevDiv at Microsoft was working on the upcoming release of VS2010.

Persistence is one of my best qualities, so I continued over the following months to badger, er, try to convince Krzysztof to at least help me create the package and provide it to the community, especially after R2 came out. Every time someone posted the request to one of the forums, I would take a screenshot and send it to him. He was unfailingly kind, but stuck to his guns. There’s even been a bug filed in Connect for this issue. (On the bright side, Krzysztof did answer the occasional SQLExpress question from the forums for me.)

Success! (and disclaimers)

Well, we’ve had quite a bit of back and forth lately, and I’m happy to report that I now have a bootstrapper package that I can use to install SQLServer Express 2008 R2 as a prerequisite to a ClickOnce application. (I skipped 2008 SP-1). Krzysztof did not provide the solution, but by peppering the guy with questions, I have finally created a working package. So big thanks to Krzysztof for continuing to answer my questions and put up with the badgering over the past few months. Now he doesn’t need to avoid me when I go to the MVP Summit at the end of this month!

Disclaimer: This did not come from Microsoft; I figured it out and am posting it for the benefit of the community. Microsoft has no responsibility or liability for this information. I tested it (more on that below), but you must try it out and make sure it works for you rather than assuming. Legally, I have to say this: Caveat emptor. There is no warranty expressed or implied. Habeas corpus. E pluribus unum. Quid pro quo. Veni vidi vici. Ad infinitum. Cogito ergo sum. That is all of the Latin-ish legalese I can come up with to cover my you-know-what. (I have more sympathy for Krzysztof now.)

I tested the package as the prerequisite for a ClickOnce application with both 32-bit and 64-bit Windows 7. I tested it with no SQLExpress installed (i.e. new installation) and with SQLExpress 2008 installed (to test that it updates it to R2). I did not test it with a setup & deployment package, but I’m certain it will work. Maybe one of you will try it out and let me know. I did not try all of the combinations listed above for Microsoft testing, although I did eat some olives out of a jar while I was testing it.

Enough talk, show me the goods

Here are the instructions on creating your own SQLServer 2008 R2 prerequisite package. You must have a SQLServer 2008 prerequisite package in order to do this. If you don’t have one, and you don’t have VS2010, install the Express version (it’s free). I think you can also install the Windows 7 SDK and it will provide those prerequisite packages (I’m guessing, since they show up under the Windows folder under SDKs).

I didn’t post the whole package because there is a program included in the bootstrapper package called SqlExpressChk.exe that you need, and it is not legal for me to distribute it. (I don’t think my Latin would withstand the scrutiny of a Microsoft lawyer.)

First, locate your current bootstrapper packages. On my 64-bit Windows 7 machine, they are under C:\Program Files (x86)\Microsoft SDKs\Windows\v7.0A\Bootstrapper\Packages. If you don’t have the \en\ folder, presumably you have one for a different language; just wing it and substitute that folder in the following instructions.

  1. Copy the folder called “SQLExpress2008” to a new folder called “SQLExpress2008R2”.
  2. Download the zip file with the product.xml and package.xml you need from here and unzip it.
  3. In the new \SQLExpress2008R2\ folder, replace the product.xml file with my version.
  4. In the \SQLExpress2008R2\en\ folder, replace the package.xml file with my version.
  5. Delete the two SQLExpr*.exe files in the \SQLExpress2008R2\en\ folder.
  6. Download the 32-bit install package from here and put it in the \SQLExpress2008R2\en\ folder.
  7. Download the 64-bit install package from here and put it in the \SQLExpress2008R2\en\ folder.
  8. Close and re-open Visual Studio. You should be able to go to the Publish tab of your application and click on the Prerequisites button and see “SQLExpress2008R2” in the list.

Do NOT remove the SqlExpressChk.exe file from the \SQLExpress2008R2\ folder, or the eula.rtf from the \SQLExpress2008R2\en\ folder.

If you’re using ClickOnce deployment, don’t forget that it does not install updates to the prerequisites automatically – it only updates the ClickOnce application. You will either have to have your customers install the prerequisite before they upgrade the ClickOnce application (just ask them to run the setup.exe again), or programmatically uninstall and reinstall the application for them, including the new prerequisite. (The code for that is in this MSDN article.)