Saturday, December 11, 2010

Windows Azure Platform Monitoring services

Did you know that Microsoft has a public service status dashboard, where you can quickly and easily check the status of all Azure data centers and services? Yes, there is such a Service Dashboard. It is located at the following address:

http://www.microsoft.com/windowsazure/support/status/servicedashboard.aspx

So the next time you experience trouble with the live Windows Azure environment, first check out the Service Dashboard, and then you can contact Windows Azure support to report live site issues here. Do not forget to get your Subscription ID first, and always include it when reporting issues to the support team!

Windows Azure Storage Tips

Windows Azure is a great platform. It has different components (like Compute, Storage, SQL Azure, AppFabric) which can be used independently. So, for example, you can use just Windows Azure Storage (be it Blob, Queue or Table) without even using Compute (Windows Azure Roles), SQL Azure or AppFabric. And using just Windows Azure Storage is worthwhile. The price is very competitive with other cloud storage providers (such as Amazon S3).

To use Windows Azure Storage from within your Windows Forms application, you just need to add a reference to the Microsoft.WindowsAzure.StorageClient assembly. This assembly is part of the Windows Azure SDK.
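
Creating the account object from a connection string is then a one-liner. Here is a minimal sketch (the account name and key are placeholders – use the values from your own storage account):

    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.StorageClient;

    // The account name and key below are placeholders.
    CloudStorageAccount account = CloudStorageAccount.Parse(
        "DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=<your-key>");
    CloudBlobClient blobClient = account.CreateCloudBlobClient();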

O.K. Assuming you have created a new Windows Forms application, added a reference to that assembly, tried to create your CloudStorageAccount using the static Parse or TryParse method, and now try to build your application. Don’t be surprised: you will get the following error (warning):

Warning    5    The referenced assembly "Microsoft.WindowsAzure.StorageClient, Version=1.1.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35, processorArchitecture=MSIL" could not be resolved because it has a dependency on "System.Web, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" which is not in the currently targeted framework ".NETFramework,Version=v4.0,Profile=Client". Please remove references to assemblies not in the targeted framework or consider retargeting your project.   

And you will not be able to build.

Well, some of you may not know, but with Service Pack 1 of .NET Framework 3.5, Microsoft introduced a new concept named the “.NET Framework Client Profile”, which is available for .NET Framework 3.5 SP1 and .NET Framework 4.0. The short version of what the Client Profile is follows:

The .NET Framework 4 Client Profile is a subset of the .NET Framework 4 that is optimized for client applications. It provides functionality for most client applications, including Windows Presentation Foundation (WPF), Windows Forms, Windows Communication Foundation (WCF), and ClickOnce features. This enables faster deployment and a smaller install package for applications that target the .NET Framework 4 Client Profile.

For the full version – check out the inline links.

So what do we do in order to use Microsoft.WindowsAzure.StorageClient from within our Windows Forms application? Go to the project Properties and, from “Target Framework” on the “Application” tab, select “.NET Framework 4” and not the “* Client Profile” one:

ClientProfile

The gotcha is that the default setting in Visual Studio is to use the Client Profile of the .NET Framework. You cannot change this option from the “New Project” wizard, so all new projects you create target the .NET Framework Client Profile (if you choose a .NET Framework 4 or 3.5 project template).

Slides from my talks on Windows Azure Topics

The year 2010 was good for me. With the support of Microsoft Bulgaria & Martin Kulov (Microsoft Regional Director & Visual Studio ALM MVP) I established a Windows Azure User Group here in Bulgaria, and I presented a couple of introductory and a couple of deep-dive talks covering Windows Azure, SQL Azure, Windows Azure Storage Services, and Developing and Deploying Windows Azure Applications with Visual Studio 2010. The most recent is my talk at Microsoft PDC Local (Sofia, Bulgaria), where I showed my “Azure Video Converter” demo application: a proof-of-concept application demonstrating the power of Windows Azure and how developers can use Windows Azure worker roles to execute third-party software that requires no registry access or administrative privileges (generic x-copy deployment apps).

All the slides can be downloaded or viewed online here.

Looking forward to even more exciting content in the coming 2011!

New book on Windows Azure released

During the past couple of months I was a technical reviewer for a new book on Windows Azure – Microsoft Azure: Enterprise Application Development by Packt Publishing. I can guess what the most common question would be. And the answer is: the book covers the Windows Azure Platform up to version 1.2 of the Tools & SDK, and the screenshots are from the old Management Portal. But hey, everything written in that book is still accurate and up to date. It just does not cover the new features announced at PDC 2010.

Wednesday, November 24, 2010

Azure Video Converter Update

I’ve just updated the Azure Video Converter project. It is closer to what I was planning it to be. The main Worker Role is now stripped of predefined, hard-coded executable files. The project utilizes Windows Azure Queues to decouple the processing logic from the requesting logic. I also added a very simple Windows Forms application that you can use to convert files. Please visit the project home page and documentation to learn more if you are interested.
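
If you are curious how the decoupling works, the requesting side boils down to dropping a message on a queue, which the worker role later picks up. A rough sketch (the queue name and message format here are simplified assumptions, not the project’s exact contract):

    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.StorageClient;

    // Simplified illustration - the real project defines its own message contract.
    CloudStorageAccount account = CloudStorageAccount.Parse("UseDevelopmentStorage=true");
    CloudQueue queue = account.CreateCloudQueueClient().GetQueueReference("conversionrequests");
    queue.CreateIfNotExist();
    queue.AddMessage(new CloudQueueMessage("input.avi;output.flv"));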

Hope you like it!

Friday, November 5, 2010

Windows Azure development fabric port issues / port walking

So you wanted to get your hands on Windows Azure, downloaded all the tools/SDK, walked through some webcasts and blog posts and created your first Cloud Service project with a single Web Role. Now you notice that it runs mostly on port 81, but sometimes on 82, and wonder why. Here is the answer.

By default, when you add a WebRole to your Cloud Service project, an HttpIn input endpoint is automatically assigned to it and bound to port 80. The InputEndpoints for a Role in Windows Azure are defined in the ServiceDefinition.csdef file, but you can easily see and manipulate them via a nice UI. To check and manipulate all settings for a Role, just right click on the Role leaf under the Roles folder in your CloudService project and select Properties:

azure_role_settings

From there, you can change your Endpoints:

azure_role_endpoints

As you can see, you can change your endpoints right from this properties window. You can also change a lot of other settings here.

Here is where you have to start being careful and know exactly what you are doing! The endpoints defined here are the endpoints which will be used when you deploy your Cloud Service to the cloud! If you change the port here, either accidentally or intentionally, the new port will be used, and your web application will not be accessible on port 80 anymore! Be aware of that!

But why isn’t the port 80 when I run locally? Why does my application run on port 81 or even 82?

There is only one simple answer to that question: your port 80 is already bound to another application when your package is deployed to the development fabric. When you hit F5 (or Ctrl+F5) to run your project locally, a local Development Fabric is started, your cloud service application is packed into a deployment package and deployed to the Development Fabric. The Development Fabric then looks into the service definition and searches for the required endpoints. It finds out that your service requires port 80 as an endpoint and tries to bind to port 80 on localhost (127.0.0.1). It always binds to the 127.0.0.1 IPv4 address. If port 80 is already occupied, instead of throwing an error and making your life harder, it just tries the next port (81). If that port is also occupied, the Dev Fabric tries the next one (82), and so on until a free port is found. Please be aware that this only happens on the local Development Fabric. This process will never happen in the live Windows Azure environment (either staging or production)!
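
Conceptually, the port walking amounts to something like the following (an illustrative sketch only – this is not the actual Development Fabric code):

    using System.Net;
    using System.Net.Sockets;

    // Try to bind to the requested port; on failure, walk to the next one.
    static int FindFreePort(int startPort)
    {
        for (int port = startPort; ; port++)
        {
            try
            {
                var listener = new TcpListener(IPAddress.Loopback, port);
                listener.Start();
                listener.Stop();
                return port;   // free - the deployment binds here (80, 81, 82, ...)
            }
            catch (SocketException)
            {
                // occupied - try the next port
            }
        }
    }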

What is the reason for my port 80 being occupied?

Applications that most often occupy port 80 include, but are not limited to:

  • IIS (Is there a Microsoft web developer who hasn’t enabled IIS on a development machine?)
  • Skype. Yes, Skype! (Tools -> Options -> Advanced -> Connection: [x] Use port 80 and 443 as alternatives for incoming connections). This option is enabled by default. And if you happen to be running Skype before starting IIS, for example, your IIS will not be able to bind to port 80 and you might wonder why. The development fabric, however, just searches for the next available port and binds to it.
  • Apache Web Server
  • Any other Web server

Why does the development fabric sometimes bind to the next-next available port? Why does it bind to 82 when no application is bound to 81?

Sometimes the cycle between destroying a deployment and creating the next one is too short, and port 81 (for example) is still occupied by a deployment that is still in the destroying stage while the next deployment is searching for a port to bind to. That’s why 82 is taken.

Another issue might be that a connection to that port still exists. The connection state might be CLOSE_WAIT, but that also prevents binding to the port. You can see a list of all connections with their statuses with the following command:

netstat -an

Everything said so far is 100% valid for a Worker Role too. Let’s say you have a worker role which exposes an IP endpoint and implements a custom TCP server. You have an on-premises application which uses a custom TCP-based protocol, you have developed your custom TCP server to run in the cloud, and you have exposed the endpoint for it. When deployed to Windows Azure, your endpoints will be exactly the same as defined in your service definition. However, when running locally, your endpoints might be different. You should expect this behavior, and if it is unwanted, always check the network connections with the “netstat” command or Sysinternals TcpView.
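
By the way, instead of hard-coding the port from the service definition, you can ask the runtime for the endpoint your instance actually got. A minimal sketch (assuming an endpoint named “TcpIn” in ServiceDefinition.csdef):

    using System.Net;
    using System.Net.Sockets;
    using Microsoft.WindowsAzure.ServiceRuntime;

    // "TcpIn" is an assumed endpoint name from the service definition.
    IPEndPoint endpoint =
        RoleEnvironment.CurrentRoleInstance.InstanceEndpoints["TcpIn"].IPEndpoint;
    TcpListener listener = new TcpListener(endpoint);
    listener.Start();   // the custom TCP server listens on whatever port it was assigned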

You can also check the current active endpoints for your development fabric deployment. Just right click on the Windows Azure icon in the system tray and select “Show Development Fabric UI”. Then navigate to your deployment and select the Service Details node:

azure_dev_fabric_ui

Friday, October 29, 2010

Windows Azure Update

I thought I knew a lot about Windows Azure! But the PDC 2010 keynote totally changed my view. The Windows Azure team seems to have done tremendous work lately! Really tremendous! And once again, Microsoft proved that they listen to the community, listen to what we want.

Here is a list of features that will soon be available, but I suggest that you watch at least the keynote of day one (http://player.microsoftpdc.com)! If you want to fast-forward to the Windows Azure part – go straight to 1hr 20min:

  • Elevated privileges
  • Full IIS
  • Remote Desktop
  • VM Role
  • New, much richer and better management portal

… and much, much more!

I can hardly wait to get my hands on it!

Don’t forget to visit the Windows Azure team blog!

Saturday, October 23, 2010

Microsoft Productivity Power Tools for Visual Studio 2010 has been Updated

It’s great that we have this feature pack from Microsoft and that they are constantly improving it. Just go and download the latest version from here.

Productivity Power Tools features at a glance:

  • Solution Navigator.
    A powerful tool window which merges functionality from Solution Explorer, Class View, Object Explorer, Call Hierarchy, Navigate To and Find Symbol References!
  • Tab Well UI.
    Working with multiple files has never been easier than with the Tab Well UI. The feature I use the most – pinned tabs. It allows you to pin certain files so you never lose them :)
  • Searchable Add Reference dialog.
    My favorite! Your default Add Reference dialog is replaced with a better user experience. Never scroll for an assembly again – just search for it!
  • HTML Copy
    Another favorite. You never have to use external tools or add-ins! Just select any code and copy it – it is automatically copied as HTML. If you paste into an RTF editor, your code formatting is preserved; if you paste into a plain text editor (or back inside Visual Studio), no bloated HTML is pasted.

There are so many more features – just go and get it! It’s free!

Saturday, October 9, 2010

Equivalent of Unix’s ls -lah | grep something for Windows

Hello! Ever since I started using Linux-based platforms, I have loved them for their rich command shells. It’s been very easy for me to find what I am looking for using command piping and the grep command. A very common combination I’ve used is “ls -lah | grep something”, which searches for a specific file in the current folder, or “ps -ax | grep processname”, which searches for a specific process name in the list of all running processes. Recently, on Windows, I had to run “netstat -an” very often to check whether a specific port is occupied and by which executable. However, that Windows OS is Windows Server, and it has lots of services running, so finding a specific port was terribly hard. A quick search gave me the desired result.

You can use the Windows command “find” exactly the same way you use “grep” in Linux/Unix! So finding out whether a specific port is occupied looks like this:

netstat -anb | find ":80"

This will list all lines containing “:80”, which basically means all occupied port-80 connections!

Great stuff!

Friday, October 8, 2010

Convert video files in Windows Azure /using FFMPEG/

There was a question on the Windows Azure MSDN forums that pushed me to do this sample! The question was “Can I use the Windows Azure environment to convert video files?” The simple answer is “YES”! But how to achieve that?

Frankly, I had been thinking about that for a while. Yes, I really had! I’ve been using FFMPEG in my projects, but Linux-based ones with PHP, and I was wondering how it works on Windows. And the time to figure that out came!

I created a very light demo project on how to process video files using FFMPEG in Windows Azure; you can download it from here.

What you have to know: the FFMPEG Windows binary is a single executable which brings all codecs (up to the release date) with it. You have to put it in your Azure role (a worker role, preferably) and execute it via the Process.Start(ProcessStartInfo psi) overload. I have included the binary and a sample video for your convenience!

Here is what I do:

Assembly asm = Assembly.GetExecutingAssembly();
string path = asm.Location;
path = path.Substring(0, path.LastIndexOf(@"\") + 1);

// Build a temporary output file name with an .flv extension.
string tmpName = Path.GetTempFileName();
tmpName = tmpName.Substring(0, tmpName.Length - 4) + ".flv";

ProcessStartInfo psi = new ProcessStartInfo();
psi.FileName = string.Format(@"""{0}ffmpeg\ffmpeg.exe""", path);
psi.Arguments = string.Format(@"-i ""{0}ffmpeg\MVI_1042.AVI"" -y ""{1}""", path, tmpName);
psi.CreateNoWindow = false;
psi.ErrorDialog = false;
psi.UseShellExecute = false;
psi.WindowStyle = ProcessWindowStyle.Hidden;
psi.RedirectStandardOutput = true;
psi.RedirectStandardError = true;

try
{
    // Start the process with the info we specified.
    using (Process exeProcess = Process.Start(psi))
    {
        // FFMPEG writes its progress to standard error. Drain standard output
        // asynchronously and read standard error on this thread, so that no
        // pipe buffer can fill up and deadlock the child process.
        StringBuilder output = new StringBuilder();
        exeProcess.OutputDataReceived += (s, args) =>
        {
            if (args.Data != null) output.AppendLine(args.Data);
        };
        exeProcess.BeginOutputReadLine();

        string errString = exeProcess.StandardError.ReadToEnd();
        exeProcess.WaitForExit();

        Trace.WriteLine(output.ToString());
        Trace.TraceError(errString);
        byte[] fileBytes = File.ReadAllBytes(tmpName);
    }
}
catch (Exception e)
{
    Trace.TraceError(e.Message);
}

Of course, you can improve a lot of things about this demo application, but it was created in just a couple of minutes! One thing you may want to consider is using Azure Blob storage as a store for your executable: each time you start a conversion process, just check whether the blob has changed. That way you will be able to update to the most recent version of FFMPEG without even touching your deployed Azure service!
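
Here is a rough sketch of that idea (the “tools” container and blob name are hypothetical, and connectionString stands for your storage connection string): download the executable again only when the blob is newer than the local copy.

    using System;
    using System.IO;
    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.StorageClient;

    // "tools" / "ffmpeg.exe" are hypothetical names; connectionString is yours.
    CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
    CloudBlob blob = account.CreateCloudBlobClient()
        .GetContainerReference("tools").GetBlobReference("ffmpeg.exe");
    blob.FetchAttributes();   // populates blob.Properties, including LastModifiedUtc

    string localPath = Path.Combine(Path.GetTempPath(), "ffmpeg.exe");
    if (!File.Exists(localPath) ||
        File.GetLastWriteTimeUtc(localPath) < blob.Properties.LastModifiedUtc)
    {
        blob.DownloadToFile(localPath);   // grab the latest FFMPEG build
    }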

Wednesday, September 29, 2010

Tip on using the SQL Azure migration wizard

If you’ve been developing for, or just playing around with, Windows Azure and using SQL Azure, you’ve inevitably been using the SQL Azure Migration Wizard. If not – go ahead and download it! It’s the ultimate tool to migrate your SQL Server data to SQL Azure and vice versa.

I would like to share a tip about something you have most probably noticed but were not sure what it was. For the last couple of releases, the SQL Azure Migration Wizard has relied on SQL Server 2008 R2 bits (Express also works). There is a small problem when you also have an earlier version of SQL Server and/or Management Studio installed. The tool from Management Studio that the SQL Azure Migration Wizard uses is called bcp. It is a command-line tool for bulk copying SQL Server tables. And there is a difference between the version that comes with SQL Server 2008 R2 and the one that comes with SQL Server 2008 and earlier: the most recent one has the command-line option “-d”, while the others don’t. The trick is to change your PATH environment variable and remove anything related to older versions of SQL Server, like this:

C:\Program Files (x86)\Microsoft SQL Server\90\Tools\Binn

which is where the old bcp resides. If you run bcp from that folder you will see:

C:\Program Files (x86)\Microsoft SQL Server\90\Tools\Binn>bcp
usage: bcp {dbtable | query} {in | out | queryout | format} datafile
  [-m maxerrors]            [-f formatfile]          [-e errfile]
  [-F firstrow]             [-L lastrow]             [-b batchsize]
  [-n native type]          [-c character type]      [-w wide character type]
  [-N keep non-text native] [-V file format version] [-q quoted identifier]
  [-C code page specifier]  [-t field terminator]    [-r row terminator]
  [-i inputfile]            [-o outfile]             [-a packetsize]
  [-S server name]          [-U username]            [-P password]
  [-T trusted connection]   [-v version]             [-R regional enable]
  [-k keep null values]     [-E keep identity values]
  [-h "load hints"]         [-x generate xml format file]

There is no “-d” option, which selects a database; that option exists in the SQL Server 2008 R2 version:

C:\>bcp
usage: bcp {dbtable | query} {in | out | queryout | format} datafile
  [-m maxerrors]            [-f formatfile]          [-e errfile]
  [-F firstrow]             [-L lastrow]             [-b batchsize]
  [-n native type]          [-c character type]      [-w wide character type]
  [-N keep non-text native] [-V file format version] [-q quoted identifier]
  [-C code page specifier]  [-t field terminator]    [-r row terminator]
  [-i inputfile]            [-o outfile]             [-a packetsize]
  [-S server name]          [-U username]            [-P password]
  [-T trusted connection]   [-v version]             [-R regional enable]
  [-k keep null values]     [-E keep identity values]
  [-h "load hints"]         [-x generate xml format file]
  [-d database name]

The point is that, when an earlier version of SQL Server Management Studio is installed, its folder appears earlier in the PATH environment variable. When you run the SQL Azure Migration Wizard, it tries to launch bcp.exe without specifying a folder, relying on the fact that the folder will be part of the PATH environment variable. But the earlier version comes first, so that bcp is executed, and you get errors in the SQL Azure Migration Wizard. To avoid those errors and run everything smoothly, just remove the old folder from the PATH variable.
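
By the way, a quick way to see which copy of bcp.exe your PATH resolves to (on Windows Vista / Server 2008 and later) is:

where bcp

The first path printed is the one that gets executed.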

How do you edit the PATH variable?

  1. Click on START, then navigate to “Computer”.
  2. Right click on “Computer” and select “Properties”.
  3. From the properties window select “Advanced system settings”:

image

  4. In the new window that pops up, the “Advanced” tab will be selected. There is an “Environment Variables” button at the lower right corner. Click on it:

image

  5. Edit the “PATH” variable, which is under “System variables”. Do not edit the one under “User variables”:

image

Good luck and enjoy the cloud!

Tuesday, August 31, 2010

How to publish your Windows Azure application right from Visual Studio 2010

Windows Azure is an emerging technology that will gain a bigger share of our lives as software developers and IT pros. With earlier releases of the Windows Azure Tools for Visual Studio, deploying an application into the Azure environment was an almost painful process: the standard Publish process created an Azure package and opened the Windows Azure portal for us to publish that package manually. This option still exists in Visual Studio 2010, and is the only option in Visual Studio 2008. However, there is a new, slick option that allows us to publish / deploy our Azure package right from within Visual Studio. This post is about that particular option.

Before we begin, let’s make sure we have installed the most recent version of Windows Azure Tools for Visual Studio.

For the purpose of the demo I will create a very simple CloudDemo application. Just select “File” –> “New” –> “Project”, and then choose “Cloud” from “Installed Templates”. The only available template is “Windows Azure Cloud Service C#”:

01_NewProject

A new window will pop up – a wizard for the initial configuration of the Roles for our service. Just add one ASP.NET Web Role:

02_addWebRole

Assuming this is the cloud project we want to deploy, let’s first go through the Windows Azure Web Role deployment checklist before we continue (it is a common mistake to miss configuring the DiagnosticsConnectionString setting of the WebRole).

Now it is time to publish our Windows Azure service with that single ASP.NET WebRole. There is an initial configuration that must be performed once; after that, every new version we publish will be just a single click away!

Right click on the Windows Azure service project in our solution and choose “Publish” from the context menu:

03_publishMenu

This will pop up a new window that will help us publish our project:

04_publish_mainScreen

There are two options to choose from: “Create Service Package Only” and “Deploy your Cloud Service to Windows Azure”. We are interested in the second one. Now we have to configure our credentials for deploying to Windows Azure. The deployment process uses the Windows Azure Service Management API, which works with client certificate authentication, and there is a neat option for generating client certificates for use with Windows Azure. In the window that is still open (Publish Cloud Service), open the drop down right below “Credentials” and choose “Add …”:

05_publish_mainScreen_addCredential

Another window “Cloud Service Management Authentication” will open:

06_publish_addCredentialWindow

Within this window we will have to create a certificate for authentication. Open the drop down and choose “<Create…>”:

07_publish_addCredentialWindowCreate

This option will automatically create a certificate for us (we just have to name it). Once the certificate is created, we select it from the drop down menu and proceed to step (2) of the wizard, which is uploading our certificate to the Windows Azure portal. For this task, the wizard offers an easy path: by clicking on the “Copy the full path” link, the certificate’s full path is automatically copied to our clipboard:

08_publish_addCredentialAlmostFinal

Now we have to log in to the Windows Azure portal (http://windows.azure.com/) (but don’t close any Visual Studio 2010 window, as we will be coming back to it) and upload the certificate to the appropriate project. First we must select the project to which we will assign the certificate:

09_AzurePortal_SelectProject

Then we click on the “Account” tab and navigate to the “Manage my API certificates” link:

10_AzurePortal_Account

Here, we simply click Browse, paste the copied path to the certificate, and then click Upload:

11_AzurePortal_UploadCertificate

Please note that there is a small chance of encountering an error of type “The certificate is not yet valid” during the upload process. If it happens, wait a minute or two and try to upload it again. The reason for this error is that your computer’s clock might not be as accurate and synchronized as the Windows Azure servers’. Your clock may be a minute or more ahead of the actual time, so your generated certificate is valid from a point in time which has not yet occurred on the Windows Azure servers. When you upload the certificate you will see it in the list of installed certificates:

13_AzurePortal_UploadedCertificate

After you upload the certificate successfully, you have to go back to the “Account” tab and copy the Subscription ID to your clipboard:

12_AzurePortal_SubscibtinId

Going back to Visual Studio’s “Cloud Service Management Authentication” window, you have to paste your Subscription ID into the field for it:

14_publish_CloseToOK

As the last step of configuring our account, we have to define a meaningful name for it, so that when we see it in the drop down list of installed credentials, we will know which service this account is for. For this project I chose the name “WindowsAzureCloudDemoCert”. When we are ready and hit the OK button, we go back to the “Publish Cloud Service” window and select “WindowsAzureCloudDemoCert” from the Credentials drop down. An authentication attempt is then made against the Azure service to validate the credentials. If everything is fine, we will see the details for our account, such as the account name, the slots for deployment (production & staging) and the storage accounts associated with that service account:

 15_publish_OK

When you hit OK, the publish process will start. A successful publish process finishes in about 10 minutes. A friendly “Windows Azure Activity Log” window within Visual Studio shows the process steps and history:

16_published

Well, as I said, there is an initial process of configuring credentials. Once you have set everything up, the publish process is just a matter of choosing the credentials and the Hosted Service slot for deployment (production or staging).
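
For the curious: under the hood, the wizard talks to the Windows Azure Service Management API, authenticating each request with that certificate plus the Subscription ID. A minimal sketch of such a call, listing the hosted services (the subscription ID is a placeholder, and I assume the certificate was created with the subject name “WindowsAzureCloudDemoCert” as above):

    using System;
    using System.IO;
    using System.Net;
    using System.Security.Cryptography.X509Certificates;

    string subscriptionId = "00000000-0000-0000-0000-000000000000"; // placeholder

    // Load the management certificate from the CurrentUser\Personal store.
    X509Store store = new X509Store(StoreName.My, StoreLocation.CurrentUser);
    store.Open(OpenFlags.ReadOnly);
    X509Certificate2 cert = store.Certificates
        .Find(X509FindType.FindBySubjectName, "WindowsAzureCloudDemoCert", false)[0];
    store.Close();

    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(
        "https://management.core.windows.net/" + subscriptionId + "/services/hostedservices");
    request.Headers.Add("x-ms-version", "2009-10-01"); // Service Management API version
    request.ClientCertificates.Add(cert);

    using (WebResponse response = request.GetResponse())
    using (StreamReader reader = new StreamReader(response.GetResponseStream()))
    {
        Console.WriteLine(reader.ReadToEnd()); // XML list of hosted services
    }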

Have a great time developing for Windows Azure!

Tuesday, August 24, 2010

How to debug your application (http protocol) using Fiddler

Fiddler has been out there for a while, but recently I discovered that it is either unknown or underused, so I decided to write a short post on what it is and how we can easily debug HTTP traffic (for example, WCF service calls) using it.

Before diving into the essentials, I would like to mention what it is. Fiddler is a debugging HTTP proxy (you can read more on what a proxy server is here). Being so, when started, it automatically configures the user’s system to use it.

So, here is a “Welcome” screen of Fiddler:

fiddler_01_main

Here you can see the Web Sessions list (1) and the working area (2, 3), which is split into action tabs (2) and an information window (3). Of the action tabs, you will spend most of your time inside “Inspectors” while you debug your HTTP traffic.

Once you download, install and run it, you can check the Internet connection settings, and you will notice that it has automatically configured the system to use an HTTP proxy at address 127.0.0.1 (which is the local machine) and port 8888:

fiddler_02_connections

Please note that, although the shortest way to get to “Internet Options” is from the “Tools” –> “Options” menu in Internet Explorer, these options are not just “Internet Explorer options”. These are system-wide internet connectivity options, and all Windows-based programs read them. So when this proxy is set, any program that relies on the Windows settings (and not its own settings) will use that proxy for HTTP connections.

So let’s see how to inspect a single HTTP request. Just install and run Fiddler. You can run Fiddler either from the Windows Start menu or from Internet Explorer’s “Tools” menu. Once you run it, just leave it open and navigate to a website of your choice. You will be surprised how many HTTP requests are issued for a single web page to load. Every single image you see, every stylesheet, every JavaScript file loaded initiates an HTTP request. So pick one of the requests (preferably one of the first, which loads the HTML) and go to Inspectors in the action tabs:

fiddler_03_inspectors

You see the selected web session (1), and you click on Inspectors (2). The Inspectors information area is split horizontally into two parts. The first part (3, 4) inspects the HTTP request issued by the client application, while the second (bottom) part (5, 6) inspects the server response. In the image, 3 and 5 are the different types of inspectors, while 4 & 6 are the information panes that show the information structured by the selected inspector (3 & 5). You will easily see that you can inspect the request by viewing it in:

  • Headers – showing only headers
  • TextView – showing a plain text view of the request
  • WebForms – showing only the variable names & values sent with the request (if you have forms submitting data)
  • HexView – showing a hexadecimal representation of the request
  • Auth – showing just the authentication headers (if present), so you can inspect issues with authentication
  • Raw – showing the raw HTTP request (headers + body as sent by the client application)
  • XML – showing the request as an XML tree. It is very useful for visualizing XML-RPC or SOAP communication, as it displays the XML in a tree structure.

For the response part of the screen you will see four additional types of displayed information:

  • Transformer – showing general information on the response, as well as response size
  • Caching – showing whether the response sent is cached
  • Privacy – showing any information, if present, regarding P3P privacy
  • ImageView – showing the actual image, if the response returned an image

During development we are most likely to run our application at “localhost”. What is localhost? You can find the long explanation here: http://en.wikipedia.org/wiki/Localhost. Basically, it is your computer, and localhost always resolves to the IPv4 address 127.0.0.1 or the IPv6 address ::1. Here comes the tricky part. You can read the FAQ section on debugging localhost. As explained there, if your application launches on http://localhost:45960/WebSite1/Default.aspx, you can change “localhost” to “ipv4.fiddler” (http://ipv4.fiddler:45960/WebSite1/Default.aspx) and you will see the same page, but the traffic will go through Fiddler.

The issue comes from the way localhost is interpreted by applications. Although a system proxy is configured, applications (especially .NET-based ones) automatically bypass the proxy for this address, and you have no means (in .NET) to instruct an application to use the proxy server when connecting to localhost. So you have to use another name to access your application. The tricky part is that the Cassini web server (the Visual Studio development server) binds only to the IPv4 address 127.0.0.1, so you must somehow add another hostname that resolves to 127.0.0.1. The easiest, smoothest way to do that – without writing any code, without altering any configuration setting of your project, without changing the registry – is to rely on the way the Windows DNS resolver works. On every Windows machine (well, at least since Windows 2000 / XP) there is one single plain text file, located in the same place on all Windows platforms (x86, x64) regardless of the Windows version:

fiddler_04_hosts

c:\windows\system32\drivers\etc\hosts

It is a file without an extension. In more recent versions of Windows you have to be an administrator (run Notepad as administrator) in order to save changes to that file. It is very simple, and its first lines are as follows:

# Copyright (c) 1993-2009 Microsoft Corp.
#
# This is a sample HOSTS file used by Microsoft TCP/IP for Windows.
#
# This file contains the mappings of IP addresses to host names. Each
# entry should be kept on an individual line. The IP address should
# be placed in the first column followed by the corresponding host name.
# The IP address and the host name should be separated by at least one
# space.
#
# Additionally, comments (such as these) may be inserted on individual
# lines or following the machine name denoted by a '#' symbol.
#
# For example:
#
#      102.54.94.97     rhino.acme.com          # source server
#       38.25.63.10     x.acme.com              # x client host

# localhost name resolution is handled within DNS itself.
#    127.0.0.1       localhost
#    ::1             localhost

I open that file, and add a single line at the bottom:

127.0.0.1 developdemo

Now, whenever I type http://developdemo/, I will open localhost (127.0.0.1). And whenever a Windows application makes an HTTP request to “developdemo”, it will be requesting information from localhost. And if Fiddler is running, it will log any and all HTTP requests going to and from the developdemo host.
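
To see it in action from code, here is a minimal sketch (port 45960 is the example port from the Fiddler FAQ above; yours will differ):

    using System;
    using System.IO;
    using System.Net;

    // "developdemo" is the alias added to the hosts file; unlike "localhost",
    // requests to it are routed through the Fiddler proxy.
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(
        "http://developdemo:45960/WebSite1/Default.aspx");

    using (WebResponse response = request.GetResponse())
    using (StreamReader reader = new StreamReader(response.GetResponseStream()))
    {
        Console.WriteLine(reader.ReadToEnd()); // this session now shows up in Fiddler
    }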

Well, I have given you the basics, and I hope that you now know there is a way to debug your HTTP-based application.

Thursday, July 1, 2010

Windows Azure Web Role Deployment Checklist

When you develop for Windows Azure for the first time, there are a couple of things you have to be careful about in order to deploy your application successfully.

To make this checklist more complete, I will use a Silverlight application with WCF RIA Services and Entity Framework 4.0. It is great that Windows Azure already supports .NET 4.0, along with all the goodies that come with it. I will show the application in another post; for now, it is more important what we have to do in order to successfully deploy our application. I will not put the items in an ordered list, because if you fail any of them, your application will cycle through “initializing – busy – stopping – initializing – busy – stopping”.

You can download the source solution from here.

· Diagnostics connection string

Whenever you choose to create a new Web Role, a WebRole.cs file is added to your web application. It is a very important part of your Windows Azure Web Role: it gives you the possibility to intercept the Start and Stop events of your Web Role and do some initialization or cleanup, and it also starts the Diagnostics Monitor, which is something you can’t live in Azure without! The Diagnostics Monitor uses Azure Storage to save the diagnostics logs; that’s why it needs a connection string, which is set to use development storage by default.
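
For reference, the generated WebRole.cs looks roughly like this (an abbreviated sketch – the exact template varies between SDK versions):

    using System.Linq;
    using Microsoft.WindowsAzure.Diagnostics;
    using Microsoft.WindowsAzure.ServiceRuntime;

    public class WebRole : RoleEntryPoint
    {
        public override bool OnStart()
        {
            // Start the Diagnostics Monitor with the connection string name
            // defined in ServiceConfiguration.cscfg.
            DiagnosticMonitor.Start("DiagnosticsConnectionString");

            // Recycle the role instance when a configuration setting changes.
            RoleEnvironment.Changing += (sender, e) =>
            {
                if (e.Changes.Any(ch => ch is RoleEnvironmentConfigurationSettingChange))
                {
                    e.Cancel = true;
                }
            };

            return base.OnStart();
        }
    }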

It is very easy to change that connection string – just right click on your Web Role in the Cloud Service project, choose Properties, then select Settings from the left navigation tabs. Now click on the […] button located right next to the value string, and a new window will appear:

You have to enter your storage credentials. A good practice is to always use HTTPS endpoints. You can get the account name and account key from the Azure developer portal. Once you create an Azure Storage service, you choose the account name, and the keys are generated by the portal:

You will have to go to Windows Azure (1) and then select your Storage service (2). Your account name (4) and account keys (5, 6) are located within the main Cloud Storage settings area (3). You can use either of the keys – Primary (5) or Secondary (6) – in the configuration.

You are done. Of course, the Windows Azure Storage service can be accessed from any place with an internet connection, so you can now start and run your application to check whether it will run smoothly on the local development fabric.

· References in Web Role

Dealing with references can be really annoying in Web Roles & Worker Roles! You have to be very careful: you have to set Copy Local = True for each and every assembly that you reference and that is not part of the core .NET Framework 4.0. This of course includes any third-party assemblies (such as NetAdvantage for .NET). More interestingly, you have to set Copy Local to True even for the following assemblies:

Microsoft.WindowsAzure.Diagnostics

Microsoft.WindowsAzure.StorageClient

The “WindowsAzure” part of the names of these assemblies would make you think that they must exist on the Azure VM, but they don’t! However, Microsoft.WindowsAzure.ServiceRuntime is part of the Azure VM deployment, and you can safely leave it as Copy Local = False.

Working with WCF RIA Services, you will also have to set this attribute (Copy Local) for all RIA assemblies:

System.ServiceModel.DomainServices.EntityFramework

System.ServiceModel.DomainServices.Hosting

System.ServiceModel.DomainServices.Hosting.OData

System.ServiceModel.DomainServices.Server

· Target CPU Architecture

Windows Azure uses 64-bit CPUs, and the operating system (Windows Server 2008 based) is x64. So any and all assemblies that run as part of your Web Role should be compiled as either “Any CPU” or “x64”. In rare situations, when you use a Worker Role, you can use x86 assemblies, but you can only execute them in a separate process that is not part of the Role! The target CPU architecture is managed from the “Configuration Manager”. You can launch the Configuration Manager by right clicking on the solution in Solution Explorer and selecting “Configuration Manager”. You will see the following dialog:

Most probably you've never seen this dialog before, but it is something you have to pay attention to when developing for Azure!

· Web.config issues

As you may already know, the Windows Azure Guest OS is Windows Server 2008 based. That means your web roles run on IIS 7, so any configuration that is not compatible with it will cause your web role to not start and begin cycling.

Well, this is the first, not fully complete, checklist you have to go through even before you deploy your WebRole!

Tuesday, June 15, 2010

Display PHP error messages on IIS 7.0 / 7.5

Have you ever tried running PHP on IIS with FastCGI? Yes, it runs, and it does so smoothly. However, regardless of the php.ini settings for “display_errors” and “error_reporting”, starting with IIS 7.0 you will most probably see only “Internal Server Error 500.XX” for any error generated by PHP (even if it is just a warning or a notice). And yes, the hard workaround is to turn on Failed Request Tracing on the site and examine the FRT log files – which I’m sure you don’t want!

The solution is here: http://serverfault.com/questions/69839/show-php-error-message-on-iis-7

The reason:

With IIS7, it doesn't pass the errors through by default. It's "existingResponse" that needs to be set.

The solution (run this line from an elevated command prompt):

c:\windows\system32\inetsrv\appcmd.exe set config "{sitename}" -section:system.webServer/httpErrors /existingResponse:"PassThrough" /commit:apphost

and do not forget to replace {sitename} with the real site name.

Monday, April 26, 2010

New features in SQL Azure

An amazing set of new features has been announced recently. These features include:
  • 50 GB Database SKU, available upon request
  • MARS (Multiple Active Result Sets). This is a very useful feature, since lots of developers have been having connection issues when using the Entity Framework: the EF connection string enables MARS by default, and developers have had tough times identifying the resulting connection errors (see the sample connection string below).
  • An ALTER rename process, for symmetry in renaming databases
You can read more details at SQL Azure team blog:
http://blogs.msdn.com/sqlazure/archive/2010/04/16/9997517.aspx
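
As a quick illustration of the MARS point above, here is a simple example of a SQL Azure connection string with MARS explicitly enabled (server, database and credentials are hypothetical):

    using System.Data.SqlClient;

    // Hypothetical server/database/credentials - note MultipleActiveResultSets=True.
    string connectionString =
        "Server=tcp:myserver.database.windows.net;Database=mydb;" +
        "User ID=myuser@myserver;Password=<your-password>;Encrypt=True;" +
        "MultipleActiveResultSets=True;";

    using (SqlConnection connection = new SqlConnection(connectionString))
    {
        connection.Open(); // multiple active commands may now share this connection
    }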

IIS Compression in Windows Azure is now possible

As noted in a recent blog post from Steve Marx, the Dynamic Compression module is now enabled in Windows Azure. That change was made in the March release of Windows Azure Guest OS 1.2.

Friday, April 23, 2010

Imagine Cup local finals judge

I am honored to be a judge at the local finals of the Imagine Cup student competition 2010. I will be part of the jury for the Software Design category.

Good luck to all teams!

May the technology be with you!

Tuesday, April 6, 2010

Solution to Silverlight Design Time issues in a Windows Azure Cloud Service

I just want to share a blog post from the Azure team, where they explain how to deal with a bunch of irritating issues that you would hit when developing a Silverlight-based application for the cloud. You can read the full story here: http://blogs.msdn.com/jnak/archive/2010/03/23/fixing-the-silverlight-design-time-in-a-windows-azure-cloud-service.aspx

Saturday, March 20, 2010

Official Silverlight 4 Tools for VS 2010 RC is out

It was not long ago that VS 2010 RC became available. One unfortunate thing about it was that the SL4 BETA tools were not supported on VS 2010 RC, and there were tons of questions about when to expect a version of the SL4 tools that would work with VS 2010 RC. Well, here they are:

http://msdn.microsoft.com/en-us/library/cc838244(VS.96).aspx

Silverlight 4 RC Developer Runtime + Silverlight 4 Tools for VS 2010 RC + WCF RIA Services RC.

I am not sure whether to be happy or sad about the last one. I have been exploring .NET RIA Services since their early CTP, and I have a project that uses Silverlight 3 RTM + WCF RIA Services Beta, all developed in VS 2008. What was my surprise when I saw the installer splash screen of the Silverlight 4 RC tools! It was so kind to tell me that I have an earlier version of WCF RIA Services, which would be uninstalled for the sake of the new RC version. However, WCF RIA Services RC will only support Silverlight 4 and Visual Studio 2010 RC. So I decided not to make that big upgrade step for my project, because it is a bit more complex than just a Silverlight application :). Anyway, I know that sooner or later I will have to upgrade everything to the 2010 versions, because either way WCF RIA Services will not support Silverlight 3.

The only way to “touch” the Silverlight 4 RC tools without breaking all my previous work is to use a VM to play around in. I just feel I have too many VMs :) I am saving one separate physical HDD just for all my VMs …

Monday, February 8, 2010

Visual Studio 2010 side-by-side with Visual Studio 2008

Well, BETA 2 of Visual Studio 2010 was announced with a GO-LIVE license, upgrade capability to RTM, and side-by-side operation with existing VS 2008 installations.

Well, maybe if you are using Visual Studio just to compile and run your “Hello World” console or web application, you will have no issues running side-by-side.

However, if you really want to explore all the new technologies while still keeping your “old” projects running, you simply can’t use both Visual Studios side-by-side. Here is one particular case in which you have to choose whether to stay on VS 2008 or go to VS 2010 and migrate your projects to 2010, because they will no longer be supported in VS 2008. The special case is called:

WCF RIA Services

RIA Services is designed to make our Silverlight lives easier, but instead it brings only trouble (when used side-by-side). And why? Because WCF RIA Services works either with VS 2010 and SL4 OR with VS 2008 SP1 and SL3. There is no side-by-side! If you have existing project(s) in VS 2008 SP1 with WCF RIA Services, you simply can’t go and explore VS 2010, SL4 and WCF RIA Services within VS 2010.

Although you might want to say this is a limitation of WCF RIA Services – yes, it is. But I can use both VS 2010 and VS 2008 SP1 for Windows Azure projects. And for me, side-by-side means “side-by-side” without any “but”.

So for me so far – no side-by-side.

Sunday, February 7, 2010

MySQL hosted on Windows Azure

People often ask whether MySQL is supported on Windows Azure. The simple answer is YES, you can run MySQL on Windows Azure! Great!

But is it worth it? I would say NO! And here are my thoughts on that.

First, take a sneak peek at the presentation by Mohit Srivastava and Tushar Shanbhag from PDC’09: Developing PHP and MySQL Applications with Windows Azure. Or download the slides and take a quick look at the “OK, you can run MySQL on Windows Azure” part. After an hour of amazing talk, we will be almost convinced that we definitely can run MySQL on Windows Azure.

BUT…

I would question the value of bringing MySQL to Azure!

What is Windows Azure? Here are just a couple of quotes from Introducing the Windows Azure platform and Introducing Windows Azure white papers:

Windows Azure is designed to support applications that scale out, running multiple copies of the same code across many commodity servers.

The intrinsic support for scale-out applications and scale-out data that Windows Azure provides can handle much larger loads than more conventional Web technologies.

The answer grows out of the primary Windows Azure goal of supporting massively scalable applications. Traditional relational databases can scale up, handling more and more users by running the DBMS on ever-larger machines. But to support truly large numbers of simultaneous users, storage needs to scale out, not up.

In summary, the goal of Windows Azure is to be a highly scalable, highly reliable, highly available, elastic cloud system. And these white papers describe how that scalability and reliability are achieved.

Now, a quick look at the SLA (Service Level Agreement) that assures our business of system uptime and availability. What we are most interested in when running MySQL on Windows Azure is the Windows Azure Compute SLA. Its abstract says:

For compute, we guarantee that when you deploy two or more role instances in different fault and upgrade domains your Internet facing roles will have external connectivity at least 99.95% of the time.

Additionally, we will monitor all of your individual role instances and guarantee that 99.9% of the time we will detect within two minutes when a role instance’s process is not running and initiate corrective action.

What does that mean for you when you are running MySQL? Rewind back to Developing PHP and MySQL Applications with Windows Azure. What did they do? They deployed a single-instance worker role that runs MySQL! Now I have a couple of questions for you, if you are still going to host your MySQL in Windows Azure:

  1. Will you have an SLA for your MySQL deployment (which is the purpose of using Windows Azure)?
    • NO! Because your MySQL will be running in a single-instance role. You simply can’t run MySQL on multiple instances! Well, you can, but you can’t really work with it.
  2. Will your MySQL be easily scalable (which is the purpose of using Windows Azure)?
    • NO! Because scaling out in Windows Azure is achieved by just increasing the number of instances running within your role. And if you want to “scale out” MySQL in Windows Azure, you will have to deploy an entirely new worker role with a single instance.

For me, it is just not worth bothering with running MySQL on Windows Azure. You are going to lose all the strengths of Windows Azure and will utilize it just as virtual server hosting. Just give up on MySQL if you want to go with Windows Azure, and refactor your PHP code to use SQL Azure!

And by the way … almost the same issues apply to running PHP. However, they are easier to handle. Here is an important thing you need to know in order to avoid hours of debugging PHP code that is not debuggable: use a database to store your sessions! If you want to scale out on Azure, do not ever use the default PHP sessions – move them to a database, and let that database be SQL Azure. Moving sessions to a database is the easier option; you could also move the sessions to Windows Azure Tables.

If you want to write files to a “hard drive” – use Windows Azure Drive, or go directly to Windows Azure Blob storage.

Monday, January 25, 2010

Jason Beres to speak in Sofia on Visual Studio 2010 and .NET 4

Next week we will host an amazing session. It will be held on the 2nd of February 2010 at the Elieff Center for Education and Culture, 1 University Park Str., Studentski Grad, 1700 Sofia.

“Building Better Experiences using Visual Studio 2010 and .NET 4”

Visual Studio 2010 is right around the corner, and it’s a completely new IDE based on WPF, chock full of exciting new features and productivity enhancers. So now is a great time to figure out the answers to these two key questions:

  • What’s in Visual Studio 2010 and .NET 4 for me?
  • What is it going to take to learn all of this?

In this talk, Jason will answer those questions. You’ll get a full run-down of the new features in Visual Studio 2010, including the new testing and debugging features, new data options, new deployment scenarios, new code editor features and many other new time savers and coding enhancers. You’ll also get an understanding of the new language features in Visual Basic and C#, as well as the new support for the F# language and the new project support for Silverlight 4, Windows Azure, SharePoint and Microsoft Office applications. At the conclusion of this talk, you’ll know exactly why you need Visual Studio 2010 and .NET 4, and how they will improve your coding experience and your ability to build even better software in 2010 and beyond!

Jason Beres is the VP of Product Management for Infragistics, the world leader in user interface development tools and experts in the User Experience (UX) market. Jason is a founder of the Florida .NET User Groups and the founder of the Central New Jersey .NET User Group; he is a Microsoft MVP and a member of the INETA Speakers Bureau. Jason is the author of several books on .NET development, including the recently published Silverlight 3 Programmer’s Reference from Wrox Press and the upcoming Silverlight 4 Professional from Wrox Press. Jason is a national and international conference speaker and a frequent columnist for several .NET publications, and he keeps very active in the .NET community.

The expected duration of the lecture is about 1.5 hours, and there will be a catering party afterwards – excellent for networking!

See you there!