Archive for February, 2011

Azure Tools/SDK 1.3 and IIS Logging

February 16, 2011

Running in Azure with Tools/SDK version 1.2, I had working code that ensured my IIS logs and IIS Failed Request logs were transferred automatically to blob storage, where I could easily view them. After upgrading to Azure Tools/SDK 1.3, my IIS logs no longer showed up in blob storage. After looking around, I found this MSDN article, which says this is a known problem: the processes involved do not have the right permissions to access the folders where the logs are located.

The article also says this: "To read the files yourself, log on to the instance with a remote desktop connection." I thought, "Great, at least I can get to them and view them." Well, not so much. You can RDP into the instance and track down the IIS logs, but the IIS Failed Request logs are not created.

The article blithely throws this solution your way: "To access the files programmatically, create a startup task with elevated privileges that manually copies the logs to a location that the diagnostic monitor can read." Doesn't that sound easy? Not so much.

I started a thread in the MSDN Forum and my good friend Peter Kellner opened up a problem ticket with Azure support. So I finally have a solution, with input from and my thanks to Steve Marx (Microsoft Azure Team), Andy Cross, Christian Weyer (MVP), Ruidong Li (Microsoft Azure support), Neil Mackenzie (Azure MVP), and Cory Fowler (Azure MVP). Sometimes it takes a village to fix a problem. I have to give most of the credit to Ruidong Li, who took the information from Steve Marx and Christian Weyer on startup tasks and PowerShell and ran with it.

I’m going to give all the info for getting the IIS logs and IIS Failed Request logs working. The basic information is available from many sources, including this article by Andy Cross.

For the IIS Failed Request logs, you have to put this in your web.config, inside the <system.webServer> section.

<!-- This is so the Azure web role will write to the IIS failed request logs -->
<tracing>
  <traceFailedRequests>
    <add path="*">
      <traceAreas>
        <add provider="ASP" verbosity="Verbose" />
        <add provider="ASPNET"
          areas="Infrastructure,Module,Page,AppServices"
          verbosity="Verbose" />
        <add provider="ISAPI Extension" verbosity="Verbose" />
        <add provider="WWW Server"
          areas="Authentication,Security,Filter,StaticFile,CGI,Compression,Cache,RequestNotifications,Module"
          verbosity="Verbose" />
      </traceAreas>
      <failureDefinitions timeTaken="00:00:15" statusCodes="400-599" />
    </add>
  </traceFailedRequests>
</tracing>

In the OnStart method of your WebRole, you need this. (The default initial configuration already includes the IIS logs and IIS Failed Request logs as directory data sources, so all you have to do is set a transfer period.)

//from http://blog.bareweb.eu/2011/01/implementing-azure-diagnostics-with-sdk-v1-3/

// Get the diagnostics connection string from the service configuration.
string wadConnectionString = "Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString";
CloudStorageAccount storageAccount =
    CloudStorageAccount.Parse(RoleEnvironment.GetConfigurationSettingValue(wadConnectionString));
RoleInstanceDiagnosticManager roleInstanceDiagnosticManager =
    storageAccount.CreateRoleInstanceDiagnosticManager(RoleEnvironment.DeploymentId, 
    RoleEnvironment.CurrentRoleInstance.Role.Name, RoleEnvironment.CurrentRoleInstance.Id);
// Obtain a reference to the initial default configuration.
DiagnosticMonitorConfiguration config = DiagnosticMonitor.GetDefaultInitialConfiguration();
config.ConfigurationChangePollInterval = TimeSpan.FromSeconds(30.0);

// Transfer the IIS logs and IIS Failed Request logs to blob storage every minute.
config.Directories.ScheduledTransferPeriod = TimeSpan.FromMinutes(1.0);

//set the configuration for use
roleInstanceDiagnosticManager.SetCurrentConfiguration(config);  

If you publish it at this point, you get no logs in blob storage, and if you RDP into the instance there will be no IIS Failed Request logs. So let’s add a startup task (per Christian Weyer and Steve Marx) to “fix” the permissions.

First, create a small file called FixDiag.cmd with Notepad; this holds the command that will be executed when the role instance starts up. Add this file to your project and set the Build Action to "Content" and "Copy to Output Directory" to "Copy Always" so the file will be included in the deployment when you publish your application to Azure. Here are the contents of the file:

powershell -ExecutionPolicy Unrestricted .\FixDiagFolderAccess.ps1 >> C:\output.txt

This runs a script called FixDiagFolderAccess.ps1 and appends its output to C:\output.txt. I found the output file to be really helpful when trying to figure out whether my script was actually working. So what does the PowerShell script look like?

Here’s the first bit. This loads the Microsoft.WindowsAzure.ServiceRuntime assembly. If it’s not available, it waits a few seconds and loops around and tries again. Then it gets the folder where the Diagnostics information is stored.

echo "Output from Powershell script to set permissions for IIS logging."

Add-PSSnapin Microsoft.WindowsAzure.ServiceRuntime

# wait until the azure assembly is available
while (!$?)
{
    echo "Failed, retrying after five seconds..."
    sleep 5

    Add-PSSnapin Microsoft.WindowsAzure.ServiceRuntime
}

echo "Added WA snapin."

# get the DiagnosticStore folder and the root path for it 
$localresource = Get-LocalResource "DiagnosticStore"
$folder = $localresource.RootPath

echo "DiagnosticStore path"
$folder

Here's the second part, which handles the Failed Request log files. Following Christian's lead, I'm just setting this to give full access to the folders. What's new is that I'm creating a placeholder file in the FailedReqLogFiles\Web folder. If you don't do that, MonAgentHost.exe will come around and delete the empty Web folder that was created during app startup, and when IIS tries to write the failed request log, it will give a "directory not found" error.

# Set the ACLs on the DiagnosticStore folder to allow full access by anybody.
# You can do a little trial & error to change this if you want to.

$acl = Get-Acl $folder

$rule1 = New-Object System.Security.AccessControl.FileSystemAccessRule(  
    "Administrators", "FullControl", "ContainerInherit, ObjectInherit",   
    "None", "Allow")
$rule2 = New-Object System.Security.AccessControl.FileSystemAccessRule(  
    "Everyone", "FullControl", "ContainerInherit, ObjectInherit",   
    "None", "Allow")

$acl.AddAccessRule($rule1)
$acl.AddAccessRule($rule2)

Set-Acl $folder $acl

mkdir $folder\FailedReqLogFiles\Web
"placeholder" >$folder\FailedReqLogFiles\Web\placeholder.txt

Next, let’s handle the IIS logs. The credit for this goes to Rudy with Microsoft Azure Support. This creates a placeholder file, and then it retrieves the folders under Logfiles\Web looking for one that starts with W3SVC – this is the folder the IIS logs will end up in. There won’t be any the first time because the folder is not created until the web application is loaded the first time.

In the first incarnation of this code, it just waited a specific amount of time and then set the folder ACLs. The problem was that there seems to be some kind of race condition: if the permissions weren't set before the Azure process that copies the logs started up, you could set the permissions on the folders all day long and the log files would never be transferred. So this code forces a download of the default page, which causes IIS logging to start, and that beats the race condition (or whatever the problem is). At any rate, this works every time.

mkdir $folder\Logfiles\Web
"placeholder" >$folder\Logfiles\Web\placeholder.txt

# Get a list of the directories for the regular IIS logs. 
# You have to wait until they are actually created,
#   which is why there's a loop here. 
# Just keep looking until you find the folder(s).

$dirs = [System.IO.Directory]::GetDirectories($folder + "\Logfiles\Web\", "W3SVC*")

# Get the instance's IP address so we can request a page from the site,
#   which forces IIS to start logging.
$ip = (gwmi Win32_NetworkAdapterConfiguration | ? { $_.IPAddress -ne $null }).ipaddress

$ip
echo "dirs.count"
$dirs.count
while ($dirs.Count -le 0)
{
    Sleep 10
    # Request the default page; this kicks off IIS logging.
    $bs = (new-object System.Net.WebClient).DownloadData("http://" + $ip[0])
    echo "in the loop"

    $dirs = [System.IO.Directory]::GetDirectories($folder + "\Logfiles\Web\", "W3SVC*")
    echo "dirs"
    $dirs
    echo "dirs[0]"
    $dirs[0]
    echo "dirs.count"
    $dirs.count
}

echo "after while loop"
echo "dirs[0]"
$dirs[0]

Now that there’s a folder and you know where it is, set the permissions on it.

# Now set the ACLs on the "first" directory you find. (There's only ever one.)

$acl = Get-Acl $dirs[0]
$acl.AddAccessRule($rule1)
$acl.AddAccessRule($rule2)

Set-Acl $dirs[0] $acl

This PowerShell script should be called FixDiagFolderAccess.ps1 (it needs to match the name specified in FixDiag.cmd). Add it to your project and, as before, set the Build Action to "Content" and "Copy to Output Directory" to "Copy Always".
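
If you want to double-check the project file, the two entries should end up looking roughly like this inside an <ItemGroup> in your .csproj; this is a minimal sketch of what Visual Studio generates when you set those two properties.

<Content Include="FixDiag.cmd">
  <CopyToOutputDirectory>Always</CopyToOutputDirectory>
</Content>
<Content Include="FixDiagFolderAccess.ps1">
  <CopyToOutputDirectory>Always</CopyToOutputDirectory>
</Content>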

In your Service Configuration file, you need to add these two attributes to the <ServiceConfiguration> element. In order for the -ExecutionPolicy flag to work, you must be running on Windows Server 2008 R2 (osFamily 2), which has PowerShell 2.0.

osFamily="2" osVersion="*"
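
In context, the opening tag of the file ends up looking something like this (the service name here is just a placeholder):

<ServiceConfiguration serviceName="MyAzureService"
    xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration"
    osFamily="2" osVersion="*">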

In your Service Definition file, you will need to specify the startup task. This goes right inside the opening <WebRole> element. As recommended by Steve Marx, I'm running this as a background task; that way, if there is a problem and it loops forever or won't finish for some reason, I can still RDP into the machine. Also, in order to set the ACLs, I need to run this with elevated permissions.

    <Startup>
      <Task commandLine="FixDiag.cmd" executionContext="elevated" taskType="background" />
    </Startup>

So that should set you up for a web application. You can get your IIS logs and IIS Failed Request logs transferred automatically to blob storage where you can view them easily. And if you RDP into your instances, you can look at both sets of logs that way as well.

What if you have a WCF service and no default web page? In the line that calls WebClient.DownloadData, just append the name of your service, so it looks like this:

$bs = (new-object System.Net.WebClient).DownloadData("http://" + $ip[0] + "/MyService.svc")

What if your WCF service or web application only exposes https endpoints? I don’t know. I’m still searching for an answer to that question. I tried using https instead of http, and I get some error about it being unable to create the trust relationship. At this point, I’ve spent so much time on this, I have to just enable RDP on the services with https endpoints and access the logging by RDP’ing into the instance. If you have any brilliant ideas, please leave a comment.

[Edit 3/8/2011] I figured out for the https endpoints that rather than call DownloadData using the IP address, use the DNS name of the service, along with the service name. For example, say you are Contoso.com, and you have an SSL certificate. Your services will likely have DNS entries with the domain name contoso.com, like bingservice.contoso.com. If your service is BingService.svc, the URL would be https://bingservice.contoso.com/BingService.svc. If I put that into the DownloadData statement, it works. Since it’s https, of course it has to access the service with the same domain as the SSL certificate.
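
In other words, for an https endpoint the DownloadData line in the startup script becomes something like this (using the hypothetical Contoso names from above):

$bs = (new-object System.Net.WebClient).DownloadData("https://bingservice.contoso.com/BingService.svc")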

How to host a ClickOnce deployment in Azure Blob Storage

February 13, 2011

Now that Microsoft Azure is becoming more widely used, I’m going to do some blogging about it, since I’ve had an opportunity to work with it quite a bit. What better place to start than to do a crossover blog entry on both ClickOnce deployment and Microsoft Azure? So I’m going to show you how to host your ClickOnce deployment in your Azure Blob Storage.

To do this, you need an application that you can use to manage blob storage. I use Cloud Storage Studio from Cerebrata in my example. A free application recommended by Cory Fowler (Microsoft Azure MVP) is the Azure Storage Explorer from CodePlex.

Here is a video that explains this process in detail, complete with screenshots. There is a summary below.

To summarize:

Create a container in blob storage for your ClickOnce deployment. You'll need the container name when setting your URL. I selected 'clickoncetest'. The only characters allowed are lowercase letters, numbers, and the hyphen (-).

In your project properties, set your Publishing Folder Location to somewhere on your local drive. Set the Installation Folder URL to the URL that will point to the container in blob storage that is going to host your deployment.

For example, I set the first one to E:\__Test\clickoncetest. My account is goldmailrobin, so my installation URL will be http://goldmailrobin.blob.core.windows.net/clickoncetest/

Publish your application. Then go to the local folder and copy the files and folders up to the container in blob storage. When you are finished, in the root of that container you should have the deployment manifest (yourapp.application), the bootstrapper (setup.exe), and publish.htm if you included it. You should also have a folder called "Application Files".

In “Application Files”, you should see the ‘versioned folders’ that contain the assemblies for each version of your application that you have published.

When doing updates, you need to upload the versioned folder for the new update, and replace the files in the root folder (yourapp.application, setup.exe, and publish.htm).

If you have any problems, you can check the MIME types on the published files and make sure they are right. These can be changed for each file if needed. With ClickOnce deployments, you should definitely be using the option that appends .deploy to all of your assemblies, so you should not have any files with unknown extensions. If you want to double-check, the MIME types for a ClickOnce deployment are explained here.
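
If you'd rather script the upload (and the MIME types) than copy the files with a storage tool, here's a minimal sketch using the storage client library that shipped with the 1.3 SDK. The account key, the local folder, and the MIME type list are placeholders and assumptions; adjust them for your own deployment, and see the article above for the full list of MIME types.

using System;
using System.IO;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

class DeploymentUploader
{
    static void Main()
    {
        // Placeholder credentials -- substitute your own storage account and key.
        CloudStorageAccount account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=http;AccountName=goldmailrobin;AccountKey=YOUR_KEY");
        CloudBlobClient client = account.CreateCloudBlobClient();
        CloudBlobContainer container = client.GetContainerReference("clickoncetest");
        container.CreateIfNotExist();

        // The local publish folder from the example above.
        string localFolder = @"E:\__Test\clickoncetest";
        foreach (string file in Directory.GetFiles(localFolder, "*", SearchOption.AllDirectories))
        {
            // Blob name = path relative to the publish folder, with forward slashes.
            string blobName = file.Substring(localFolder.Length + 1).Replace('\\', '/');
            CloudBlob blob = container.GetBlobReference(blobName);
            blob.Properties.ContentType = GetMimeType(file);
            blob.UploadFile(file);
            Console.WriteLine("Uploaded " + blobName);
        }
    }

    // Only the common ClickOnce extensions; everything else falls back to octet-stream.
    static string GetMimeType(string file)
    {
        switch (Path.GetExtension(file).ToLowerInvariant())
        {
            case ".application": return "application/x-ms-application";
            case ".manifest": return "application/x-ms-manifest";
            case ".deploy": return "application/octet-stream";
            case ".htm": return "text/html";
            default: return "application/octet-stream";
        }
    }
}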

Remember that with blob storage, it is not the storage of the files that will be the biggest cost factor; it is the transfer of the files to and from the client.

How about a bootstrapper package for SQLServer Express 2008 R2?

February 6, 2011

When publishing a ClickOnce application, you can open the Prerequisites dialog and select any of the packages to be installed prior to the ClickOnce application. You would probably select the .NET Framework that your application targets and Windows Installer 3.1. You would also select the SQLServer Express 2008 database software if you are using a SQLExpress database in your application.

Last summer, Microsoft released SP-1 for SQLServer 2008, and several people posted to the MSDN ClickOnce Forum and Stack Overflow asking where they could get a new, updated bootstrapper package. I decided to pursue it and see if I could track one down.

I’ll just ask Microsoft.

Saurabh Bhatia, who’s kind enough to help me answer the most difficult ClickOnce questions, routed me to the SQLServer Express program manager, Krzysztof Kozielczyk. Krzysztof said he totally agreed that Microsoft should release a new version of the bootstrapper package every time they updated SQLExpress, but they were busy working on the upcoming releases and he did not have the resources to work on it. It’s hard to know what to say when you tell someone “You should have this” and they say “You’re right, we should, I’m sorry we don’t.” (I’m going to have to remember to use that on my boss in the future.)

According to Krzysztof, the problem is that they can't just create the package and release it. They have to create the bootstrapper package and test it against a bunch of combinations of different variables, such as operating system version, SQLServer Express version (to test that the upgrade works), .NET version, number of olives in a jar, and kinds of mayonnaise available on the west coast, among others. Then if it passes mustard (haha), they have to find somewhere to post it, make it available, and then keep it updated. So more people are involved than just his team, and at that time, everyone in DevDiv at Microsoft was working on the upcoming release of VS2010.

Persistence is one of my best qualities, so I continued over the following months to badger (er, try to convince) Krzysztof to at least help me create the package and provide it to the community, especially after R2 came out. Every time someone posted the request to one of the forums, I would take a screenshot and send it to him. He was unfailingly kind, but stuck to his guns. There's even been a bug filed on Connect for this issue. (On the bright side, Krzysztof did answer the occasional SQLExpress question from the forums for me.)

Success! (and disclaimers)

Well, we’ve had quite a bit of back and forth lately, and I’m happy to report that I now have a bootstrapper package that I can use to install SQLServer Express 2008 R2 as a prerequisite to a ClickOnce application. (I skipped 2008 SP-1). Krzysztof did not provide the solution, but by peppering the guy with questions, I have finally created a working package. So big thanks to Krzysztof for continuing to answer my questions and put up with the badgering over the past few months. Now he doesn’t need to avoid me when I go to the MVP Summit at the end of this month!

Disclaimer: This did not come from Microsoft; I figured it out and am posting it for the benefit of the community. Microsoft has no responsibility or liability for this information. I tested it (more on that below), but you must try it out and make sure it works for you rather than assuming. Legally, I have to say this: Caveat emptor. There is no warranty expressed or implied. Habeas corpus. E pluribus unum. Quid pro quo. Veni, vidi, vici. Ad infinitum. Cogito ergo sum. That is all of the Latin-ish legalese I can come up with to cover my you-know-what. (I have more sympathy for Krzysztof now.)

I tested the package as the prerequisite for a ClickOnce application on both 32-bit and 64-bit Windows 7. I tested it with no SQLExpress installed (i.e. a new installation) and with SQLExpress 2008 installed (to verify that it updates it to R2). I did not test it with a setup & deployment package, but I'm certain it will work. Maybe one of you will try it out and let me know. I did not try all of the combinations listed above for Microsoft's testing, although I did eat some olives out of a jar while I was testing it.

Enough talk, show me the goods

Here are the instructions for creating your own SQLServer 2008 R2 prerequisite package. You must have a SQLServer 2008 prerequisite package in order to do this. If you don't have one, and you don't have VS2010, install the Express version (it's free). I think you can also install the Windows 7 SDK and it will provide those prerequisite packages (I'm guessing, since they show up under the Windows folder under SDKs).

I didn’t post the whole package because there is a program included in the bootstrapper package called SqlExpressChk.exe that you need, and it is not legal for me to distribute it. (I don’t think my Latin would withstand the scrutiny of a Microsoft lawyer.)

First, locate your current bootstrapper packages. On my 64-bit Windows 7 machine, they are under C:\Program Files (x86)\Microsoft SDKs\Windows\v7.0A\Bootstrapper\Packages. If you don't have the \en\ folder, presumably you have one for a different language; just wing it and substitute that folder in the following instructions.

  1. Copy the folder called “SQLExpress2008” to a new folder called “SQLExpress2008R2”.
  2. Download the zip file with the product.xml and package.xml you need from here and unzip it.
  3. In the new \SQLExpress2008R2\ folder, replace the product.xml file with my version.
  4. In the \SQLExpress2008R2\en\ folder, replace the package.xml file with my version.
  5. Delete the two SQLExpr*.exe files in the \SQLExpress2008R2\en\ folder.
  6. Download the 32-bit install package from here and put it in the \SQLExpress2008R2\en\ folder.
  7. Download the 64-bit install package from here and put it in the \SQLExpress2008R2\en\ folder.
  8. Close and re-open Visual Studio. You should be able to go to the Publish tab of your application and click on the Prerequisites button and see “SQLExpress2008R2” in the list.

Do NOT remove the SqlExpressChk.exe file from the \SQLExpress2008R2\ folder, or the eula.rtf from the \SQLExpress2008R2\en\ folder.

If you’re using ClickOnce deployment, don’t forget that it does not install updates to the prerequisites automatically – it only updates the ClickOnce application. You will either have to have your customers install the prerequisite before they upgrade the ClickOnce application (just ask them to run the setup.exe again), or programmatically uninstall and reinstall the application for them, including the new prerequisite. (The code for that is in this MSDN article.)