Saturday, December 17, 2011

Windows Azure basics (part 1 of n)

We live in dynamic times. Buzzwords such as cloud computing, elastic scale, reliability and their synonyms are taking up more and more space in our daily life. People (developers) want to move to the cloud, but they are often confused by all the new terms. In this part 1 of [we-will-see-at-the-end-how-many] articles I will try to explain the Windows Azure terms in non-geeky words.

First of all, what is Cloud Computing? It is when computing power (namely CPU, RAM, storage, networking) is delivered as a service via a network (usually the internet), and not as a product (a server that we buy).

Cloud computing is a marketing term for technologies that provide computation, software, data access, and storage services that do not require end-user knowledge of the physical location and configuration of the system that delivers the services. A parallel to this concept can be drawn with the electricity grid, wherein end-users consume power without needing to understand the component devices or infrastructure required to provide the service.

So what is Windows Azure? Is it the new server operating system from Microsoft? Is it the new hosting solution? Is it the new workstation OS? Well, Windows Azure is Microsoft's Cloud Computing platform. It delivers various cloud services: Compute, Database, Storage, CDN, Caching and Access Control, to name a few.

The next part of the article focuses on the Windows Azure Compute services.

Windows Azure Guest OS? When we talk about cloud computing, we inevitably talk about virtualization – virtualization at a very large scale. And when we talk about virtualization, we have a Host OS and a Guest OS. When we talk about the Windows Azure OS, we talk about the Windows Azure Guest OS. This is the operating system installed on the Virtual Machines that run in the cloud. The Windows Azure Guest OS has 2 families – OS Family 1 and OS Family 2. Windows Azure Guest OS Family 1 is based on Windows Server 2008 SP1 x64, and Family 2 is based on Windows Server 2008 R2. Every Guest OS is 64-bit. You can get the full list of Windows Azure Guest OS releases here.
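The Guest OS family is picked in the service configuration file, not installed by hand. A minimal sketch of the relevant attributes (the service name is made up; as far as I know, osVersion="*" opts you into automatic Guest OS upgrades):

```xml
<!-- ServiceConfiguration.cscfg (fragment) -->
<ServiceConfiguration serviceName="MyService"
    xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration"
    osFamily="2"
    osVersion="*">
  <!-- roles, settings, etc. -->
</ServiceConfiguration>
```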

Windows Azure Cloud Service, or Hosted Service. The Hosted Service is the essence of your Cloud application:

A hosted service in Windows Azure consists of an application that is designed to run in the hosted service and XML configuration files that define how the hosted service should run.

A hosted service can have one or more Roles.

Now we come to the Roles. Our cloud application can be a web-based application, a background processing application, or some legacy application which is hard to migrate – or a mix of the three. In order to make things easy for developers, Microsoft has defined three distinct types of "Roles" – Web Role, Worker Role and VM Role. You can read a bit more about Roles here. The main idea is that a Role defines an application's living environment. The Role contains all the code that our application consists of, and it defines the environment where our application will live: how many CPUs will be installed; the amount of RAM; the volume of local storage; whether it will be a full IIS or a background worker; whether it will run Windows Azure Guest OS 1.x or 2.x; whether it will have ports open for communication with the outer world (e.g. TCP port 80 for a Web Role); whether it will have internal TCP ports open for communication between roles; what certificates the environment will have; environment variables; etc.

The Role is like a template for our cloud application. When we configure our Cloud Service (or Azure Hosted Service), we set the number of instances involved for each Role.

An Instance is a single Virtual Machine (VM) which has all the properties defined by the Role and has our application code deployed. When I mentioned that the Role defines the number of CPUs, the RAM and the local storage, I was referring to the configuration of each VM where our code will be deployed. There are five predefined VM configurations which we can use:

Virtual Machine Size | CPU Cores | Memory  | Cost Per Hour
Extra Small          | Shared    | 768 MB  | $0.04
Small                | 1         | 1.75 GB | $0.12
Medium               | 2         | 3.5 GB  | $0.24
Large                | 4         | 7 GB    | $0.48
Extra Large          | 8         | 14 GB   | $0.96

More information on Virtual Machine sizes can be found here.
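As a sketch of where these knobs live (the role and service names below are made up): the VM size is declared once in the service definition, while the instance count is set in the service configuration:

```xml
<!-- ServiceDefinition.csdef (fragment): the Role picks its VM size -->
<WebRole name="MyWebRole" vmsize="Small">
  <!-- sites, endpoints, certificates, ... -->
</WebRole>

<!-- ServiceConfiguration.cscfg (fragment): how many instances of that Role to run -->
<Role name="MyWebRole">
  <Instances count="2" />
  <!-- configuration settings, ... -->
</Role>
```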

And here comes the beauty of the Cloud. We code once. We set the overall parameters once. And we deploy once! If it turns out that we need more servers, we just raise the number of instances for our role. We do it live; there is no downtime. Windows Azure will automatically launch as many VMs as we requested, configure them for our application, deploy our code to each and every one of them, and finally join them to the cluster of our highly available and reliable cloud application. When we don't need (let's say) 10 servers anymore, we can easily instruct Windows Azure that we only need 2 from now on, and that's it. The cloud will automatically shut down 8 servers and remove them, so we won't keep paying extra money.

It is important to note, though, that the Role defines the size of the VM for all of its Instances. We cannot have instances of the same Role with different VM sizes. This is by design. If we defined our Role to use an Extra Large VM, then all our instances will be running on that size of VM.

Key takeaways

I hope that this article helped you understand a couple of basic terms about Windows Azure. You should now be able to confidently answer the following questions:

  • What is Windows Azure?
  • What is Windows Azure Hosted Service (or just Hosted Service)?
  • What is a Role?
  • What is a Role Instance (or just Instance)?

Wednesday, December 14, 2011

Optimize your database cursors (considering SQL Azure)

Yeah, I know most of the DBAs (if not all) say to avoid using cursors in your SQL Server code, but there are still some things which you can only achieve via cursors. You can read a lot of discussions on whether to use cursors or not – is it good, is it bad.

My post is not about arguing what is good and what is bad. My post is about a tiny little option which, if your logic allows, you can use to optimize how your cursor works.

So we are using cursors, for good or bad. Everything might work just fine while we are using an on-premise SQL Server and the server is not under heavy load. Our stored procedures which use cursors execute in a matter of seconds. There is nothing unusual. Then we deploy our application to The Cloud, and of course we utilize SQL Azure as our DB backend. Now strange things begin happening. Our stored procedures crash with timeout exceptions. If we log in to the server and use the good "sp_who3" (yes, this works in SQL Azure!) to see the processes running, we notice that some procedures report a SOS_SCHEDULER_YIELD wait. You can read a lot of information on what that means. As per the definition:

Occurs when a task voluntarily yields the scheduler for other tasks to execute. During this wait the task is waiting for its quantum to be renewed.

Most of the resources you will find explaining what a lot of SOS_SCHEDULER_YIELD waits mean will suggest high CPU load, non-optimized queries, etc. But we look at our code and there is nothing unusual. Also, as this is SQL Azure, we can't see the actual CPU load of the OS. We can't add more CPU or more RAM. What do we do now?

Well, review our cursor logic once again! If it is the case that we only read the cursor's data, only read forward, never backward, and never change the cursor's data (update/delete), then there is a pretty good chance that we can use the FAST_FORWARD keyword when declaring our cursors:

Specifies a FORWARD_ONLY, READ_ONLY cursor with performance optimizations enabled. FAST_FORWARD cannot be specified if SCROLL or FOR_UPDATE is also specified.
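As a sketch (the table and column names are made up for illustration), the only change needed is in the DECLARE statement:

```sql
-- A read-only, forward-only cursor declared with FAST_FORWARD.
DECLARE @id INT;

DECLARE ref_cursor CURSOR FAST_FORWARD FOR
    SELECT Id FROM dbo.ReferenceData;

OPEN ref_cursor;
FETCH NEXT FROM ref_cursor INTO @id;

WHILE @@FETCH_STATUS = 0
BEGIN
    -- ... do read-only work with @id ...
    FETCH NEXT FROM ref_cursor INTO @id;
END

CLOSE ref_cursor;
DEALLOCATE ref_cursor;
```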

It is an amazing performance booster and load relief! And we will most probably never again see the SOS_SCHEDULER_YIELD status for our procedures.

Most (if not all) of the cursors I've written never read backward or update data, so I was pretty amazed to see the performance difference this keyword makes. I will for sure use it from now on, whenever possible!

Monday, December 5, 2011

Microsoft Windows Azure gets ISO/IEC 27001:2005 certification

It’s a great step toward proving that Microsoft is a reliable cloud partner to announce that Microsoft has passed the ISO/IEC 27001:2005 certification. It is a very strong information security certification which proves that our data is securely and reliably stored in the cloud.

You can find the official certificate on the certification authority’s website here. As you can read, the scope of the certification is as follows:

The Information Security Management System for Microsoft Windows Azure including development, operations and support for the compute, storage (XStore), virtual network and virtual machines services, in accordance with Windows Azure ISMS statement of applicability dated September 28, 2011. The ISMS meets the criteria of ISO/IEC 27001:2005 ISMS requirements Standard.

This means that the SQL Azure, CDN, ACS, Caching and Service Bus services are not yet covered by this certification. But I believe it is work in progress and very soon we will see an update on that part. Still, the most important parts – where our code resides (Compute) and where our data lives (Storage) – are covered.

You can read the original blog post by Steve Plank here.

As there are some additional steps, the full information about this certification will become available in January 2012.

Sunday, October 23, 2011

Unity Windows Azure Settings Injector

This is my first combined CodePlex / NuGet package contribution. This small piece of code is meant to help those of you who are using Unity as Policy Injection / DI framework and are considering moving to the cloud.

This project includes auto resolvers for the following Windows Azure Configuration Settings:

  • LocalStorage
  • Setting (both string setting and Azure Storage Connection string)

The source code is located at CodePlex: http://uasi.codeplex.com while the single line installation is located at NuGet:

PM> Install-Package UASI

The NuGet package will automatically add references to Unity assemblies (if not already added), to UASI assembly, and will make necessary configuration changes to your web.config / app.config file. It will also add (commented out) the following simple usage scenario for your project:

<unity xmlns="http://schemas.microsoft.com/practices/2010/unity">
  <sectionExtension type="Unity.AzureSetting.Injector.SectionExtensionInitiator, UASI" />
  <!-- Below is a sample usage of the package -->
  <container>
    <register type="IStorageHelper" mapTo="StorageHelper">
      <lifetime type="singleton" />
      <constructor>
        <param name="connectionString">
          <azureSetting key="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString"
                        type="ConnectionString" />
        </param>
      </constructor>
      <property name="RootFolderName">
        <azureSetting key="LocalStore" type="LocalStorage" />
      </property>
    </register>
  </container>
</unity>

Hope that it will work smoothly with your projects!


What to expect:

The project roadmap includes implementations of an IPEndPoint resolver (for both internal and external endpoints) and a real-world sample usage showcase.


Stay tuned for updates.

Friday, October 21, 2011

ServiceReference.ClientConfig build management in Silverlight projects

I’ve blogged before about web.config transformations and how I would like to see them in other project types. While Windows Azure cloud service projects already have a similar feature implemented, I wonder why there is no out-of-the-box support for other project types.

Here I will reveal a powerful, yet simple implementation of such transformations for your ServiceReferences.ClientConfig files. You will no longer wonder what your service endpoints are and which endpoint you are using. And this is not Windows Azure specific – it is relevant for any Silverlight project.

Just implement these simple steps:

1. Edit your .csproj file of the Silverlight application. Add the following block:

<UsingTask TaskName="TransformXml"
           AssemblyFile="$(MSBuildExtensionsPath32)\Microsoft\VisualStudio\v10.0\Web\Microsoft.Web.Publishing.Tasks.dll" />
<Target Name="BeforeBuild"
        Condition="exists('ServiceReferences.$(Configuration).ClientConfig')">
  <!-- Generate transformed app config in the intermediate directory -->
  <TransformXml Source="ServiceReferences.ClientConfig"
                Destination="$(TargetDir)\ServiceReferences.ClientConfig"
                Transform="ServiceReferences.$(Configuration).ClientConfig" />
  <!-- Force build process to use the transformed configuration file from now on. -->
  <ItemGroup>
    <Content Remove="ServiceReferences.ClientConfig" />
    <ContentWithTargetPath Include="$(TargetDir)\ServiceReferences.ClientConfig">
      <TargetPath>ServiceReferences.ClientConfig</TargetPath>
    </ContentWithTargetPath>
  </ItemGroup>
</Target>

right after:

<Import Project="$(MSBuildExtensionsPath32)\Microsoft\Silverlight\$(SilverlightVersion)\Microsoft.Silverlight.CSharp.targets" />

2. Add a new XML file to your project. Name it ServiceReferences.[BuildConfiguration].ClientConfig, where [BuildConfiguration] can be the name of *any* build configuration you have defined for your project. The default build configurations are “Debug” and “Release”, but you may add as many as you like to suit your development/testing/staging/live environments. Remember to set “Build Action” to “None” and “Copy to output directory” to “Never”:



3. Add the required content in that custom file. For example, if you want to just change an endpoint for a service, you will have something like this (ServiceReferences.Debug.ClientConfig):

<?xml version="1.0" encoding="utf-8"?>
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <system.serviceModel>
    <client>
      <endpoint address="http://127.0.0.1:81/DummyService.svc"
                xdt:Transform="SetAttributes" />
    </client>
  </system.serviceModel>
</configuration>

For more information on the supported XML transformations, please take a look at the MSDN documentation for web.config transformations. Do not panic! The documentation is for “web.config” transformations, but these are just XML transformations that can transform any XML file.

Move Silverlight applications to the Cloud

While moving a typical ASP.NET application to the cloud might require more actions and has more potential points of breakage, a Silverlight application is much easier to move to the Cloud.

At the last Windows Azure User Group meeting we covered the most common scenarios for Silverlight applications and moved them to Windows Azure. We had a Silverlight application communicating with WCF services, an application that uses WCF RIA Services, and an application that uses media services (a video player). We moved entire applications into the cloud and showed how one can leverage the Windows Azure CDN to achieve a better user experience in terms of application load.

Here you can find the source files for the Demos I used: http://bit.ly/okjFNn 

Here you can find the slides: http://bit.ly/nugWoP

The TCP server I used for the demos is based on the “A very basic TCP Server written in C#” article on CodeProject. The second TCP demo (SilverCloudBase_05_Sockets_Adv) is a slightly edited version of that server, extended to support message broadcasting to all connected clients.

The session recording will be available soon, so stay tuned!

Thursday, October 13, 2011

Upcoming features for SQL Azure

Some amazing news has been announced recently at SQL PASS conference.

Key announcements on SQL Azure included the availability of new CTPs for SQL Azure Reporting and SQL Azure Data Sync (now publicly available), as well as a look at the upcoming Q4 2011 Service Release for SQL Azure. 

According to the post from the Windows Azure Team, the SQL Azure Q4 2011 Service Release will be available by the end of 2011 and is aimed at simplifying elastic scale-out needs.

Key features include:

  • The maximum database size for individual SQL Azure databases will be expanded 3x from 50 GB to 150 GB.
  • Federation. With SQL Azure Federation, databases can be elastically scaled out using the sharding database pattern based on database size and the application workload.  This new feature will make it dramatically easier to set up sharding, automate the process of adding new shards, and provide significant new functionality for easily managing database shards.
  • New SQL Azure Management Portal capabilities.  The service release will include an enhanced management portal with significant new features including the ability to more easily monitor databases, drill-down into schemas, query plans, spatial data, indexes/keys, and query performance statistics.
  • Expanded support for user-controlled collations.

Read more details here and here (SQL Azure Reporting CTP) or watch the Keynote from the conference.

Clouds are coming to Seattle next month

Technical Content, Technical Experts

The Cloud Experience track at SIC is for experienced developers who want to learn how to leverage the cloud for mobile, social and web app scenarios.  No matter what platform or technology you choose to develop for, these sessions will provide you with a deeper understanding of cloud architecture, back end services and business models so you can scale for user demand and grow your business.

Register today using the promo code “azure 200” and attend SIC for only $150 (a $200 savings).

  • Attend a full day of technical sessions and learn more about leveraging the cloud for mobile, web and social scenarios. View the list of confirmed Cloud Experience speakers.  Sessions include:
    • Great Mobile Apps Make Money – Intro to Cloud Experience Track
    • Mobile + Cloud, Building Mobile Applications with Windows Azure
    • Zero to Hero: Windows Phone, Android, iOS Development in the Cloud
    • Building Web Applications with Windows Azure
    • Building Social Games on Windows Azure
  • Cloud Experience speakers and technical experts will be available to provide technical assistance and resources for developing, deploying and managing mobile, social and web apps in the cloud.

Seattle Interactive Conference (SIC): November 2-3, 2011, The Conference Center at WSCC

Tuesday, October 4, 2011

Slides and Recording from last user group meeting / Identity and Access Control in the Cloud

Hello all. Last weekend we had great cross-user-group sessions and a party at Bansko, Bulgaria. Here we have 8 user groups focused on various Microsoft technologies. I, particularly, am engaged with the Windows Azure User Group Bulgaria and was talking on Identity and Access Control in the Cloud. It was a very good talk with a very good audience. The slides can be viewed/downloaded here. While I had the good intention to stream live over Live Meeting, there were some technical problems beyond my control, but I made a recording using Camtasia Studio from TechSmith. You can download the full-sized video from here, and I also made a lower quality version.

Looking forward to our next meeting, which will be held soon – the topic will be interesting to most Web and Silverlight developers considering moving to the cloud.

Monday, October 3, 2011

Windows Azure SDK 1.5 Update released!

It appeared that there is a bug in the Windows Azure SDK 1.5, which was released in September after Build Conference. The development team has been working hard to provide a fix for this issue. You can read more about the issue here and here, and download the new update from here (just click “Get Tools and SDK”).

Note that this is just an update – you do not need to uninstall v1.5 of the SDK to apply it. But you will have to update the reference to Microsoft.WindowsAzure.StorageClient.dll in all your projects that were upgraded to v1.5.

Monday, September 26, 2011

Geo-Replication for your Windows Azure Storage accounts at no cost!

Amazing news came from the BUILD conference. One such piece of news is the great new feature of Windows Azure Storage – geo-replication for your data at *no additional cost*!

Isn’t it great? Out of the box, without additional cost, Microsoft maintains an additional copy of all your storage data in another data center in the same region. For example, if you selected West Europe as your storage account location, the geo-replica is kept in North Europe! Everything is done without breaking the current durability guarantees of the Windows Azure Storage services. Understand that there are multiple (likely 3!) copies of your data at both data centers – hundreds of kilometers away – at no cost (0 Euro)! So in case a major disaster happens in your “primary” data center, all of your data is recoverable at the second one!

All of this – for durability! Make sure you won’t lose any data stored in Windows Azure Storage!

No, currently you can’t use the secondary copy of your data – you don’t even see it in the management portal. So you can’t use this feature for a geo-distributed application! But hey, there is the Windows Azure CDN to fulfill that purpose!

Thursday, September 15, 2011

Windows Azure Development Cookbook by Neil Mackenzie

Recently a new book on Windows Azure was published: Windows Azure Development Cookbook! And it’s not just yet another book on the subject! The great thing about this book is that its author is a fellow Azure MVP – Neil Mackenzie – and the team of technical reviewers includes Brent Stineman, another great name on the Windows Azure arena. Both of them are amongst the top answerers in the Windows Azure forums, so be sure that this book includes a lot of gems from both real-world Azure application development and real-world questions and answers. The fabulous world we live in allows us to get this book in formats for a variety of e-book readers, so we can have it with us wherever we go! Another good read for any Azure enthusiast!

Windows Azure SDK 1.5 / Windows Azure Tools for Visual Studio 1.5

It’s a great time, and it will become even greater! The new Windows Azure SDK 1.5 and Windows Azure Tools for Visual Studio 2010 are already LIVE! You can download them directly from the inline link (http://www.microsoft.com/download/en/details.aspx?id=27422) or use the Web Platform Installer. Another exciting release is the Windows Azure AppFabric SDK 1.5!

Amongst all the new goodies in Windows Azure SDK are:

  • A re-architected emulator, which enables higher fidelity between local and cloud development.
  • Support for uploading service certificates in csupload.exe.
  • A new csencrypt.exe tool to manage remote desktop encryption passwords.
  • Enhancements in the Windows Azure Tools for Visual Studio for developing and deploying cloud applications.
  • The ability to create ASP.NET MVC3 Web Roles and manage multiple service configurations in one cloud project.
  • Improved validation of Windows Azure packages to catch common errors like missing .NET assemblies and invalid connection strings.

Even greater news is the totally new and fresh Windows Azure Toolkit for Windows 8!

You can read the full story here: http://blogs.msdn.com/b/windowsazure/archive/2011/09/14/just-announced-build-new-windows-azure-toolkit-for-windows-8-windows-azure-sdk-1-5-geo-replication-for-azure-storage-and-more.aspx

Wednesday, August 17, 2011

Windows Azure Configuration Settings per build configuration (2)

It was not too long ago when I blogged about how to configure your Cloud Service project to respect build configurations and transform your CSCFG file based on this.

Things change though! The Windows Azure team is not sleeping and is constantly improving the development experience for the platform. One of the new features introduced with the latest Windows Azure Tools for Visual Studio 2010 (August 2011 update) is the Configurations manager:

This new manager allows you to keep more than one service configuration! Please remember that a Service Configuration is not a Service Definition! You still have a single Service Definition, but you can have multiple Service Configurations. Using it is easy – just right-click on your Cloud project and select “Manage Configurations”:

There are additional options in the typical Role configuration screens, so you can manage the configuration settings per configuration:

For each configuration you create, the manager automatically adds a configuration file to the project. The naming convention follows the one we are used to seeing in web projects:

Now, when you deploy your cloud project, the Publish screen has some more features, one of them being a drop-down menu with all the configurations you have:

You may also have noticed that right from the Publish screen there is also an option to choose the Build Configuration.

Although this is a great step toward making a developer’s life easier, I would note a couple of cons of this new feature:

  • There is no option to choose which configuration to use when debugging. If you use the default configuration files, which are created with a new project, then the “.Local.cscfg” is automatically used with the local compute emulator. But if you have an existing project and you add a new service configuration, then the original, or “Default”, one is used. The “Default” configuration is the one that has no additional extension (i.e. ServiceConfiguration.cscfg).
  • We are forced to maintain two copies of the same file (configuration settings). For me particularly, my approach from before this tools update was better, in that you only had to put the changes in the different configuration files and not maintain the full set of properties.
  • This one totally concerns my project, which still had my transformations in place while using the new tools, but with only a single configuration in the Configuration Manager: the deployment did not take my transformed file, but the original configuration file defined in the Configuration Manager! So be warned: if you were using the configuration transformation solution from my previous blog post, remove it and use the new Configuration Manager instead!

2011-08-23 Update on the first bullet:
There is an option to choose which configuration is active during development/debugging. Go to the Properties of the Cloud project, then select the “Development” tab. There you will see a similar screen with options to select the “development/debugging” configuration:

Thursday, June 23, 2011

All ingress traffic in Windows Azure is free from 1st of July

This is amazing news for everyone benefiting from the Windows Azure platform! Starting July the 1st, all ingress (inbound) traffic will be free! No time restrictions (off-peak, on-peak), no geographic restrictions – all inbound traffic will be free of charge!

The original information source is here: http://blogs.msdn.com/b/windowsazure/archive/2011/06/22/announcing-free-ingress-for-all-windows-azure-customers-starting-july-1st-2011.aspx

Go and enjoy cloud development with Windows Azure!

Monday, June 13, 2011

Windows Azure Configuration Settings per build configuration

Don’t you like the neat feature that web projects have – applying different configuration settings based on build configuration? If you create a new Web Application project, you surely have noticed the 3 web.config files – web.config, web.debug.config & web.release.config. This is a feature of web projects, which uses nice xml transform task to shape your final web.config file according to your build configuration:


For instance, you put your development connection string in your web.config file, and you put your production connection string in your web.release.config file. Now you don’t have to manually edit web.config when you deploy – just build in Release configuration and you are ready!
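As a sketch of what such a transform file looks like (the connection string name and server are made up for illustration):

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- web.release.config: swaps in the production connection string for Release builds -->
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <connectionStrings>
    <add name="MyDb"
         connectionString="Server=prod-sql;Database=MyDb;Integrated Security=True"
         xdt:Transform="SetAttributes" xdt:Locator="Match(name)" />
  </connectionStrings>
</configuration>
```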

Now, don’t you want to have something like that, but for your ServiceConfiguration.cscfg in your Cloud Service (Azure) project? I want! Here are the steps you have to follow to achieve this:

Full source code for the given sample can be downloaded from here.

1. Open the folder of your Cloud Service project and manually add the new config file (i.e. ServiceConfiguration.Release.cscfg) – we will edit this file later. You have to do this manually because you can’t just Add New File to this project; the project template for the Cloud Service project does not allow it.

2. Unload your Cloud Service project and select “Edit XXXX.ccproj”

3. You have to manually include the ServiceConfiguration.Release.cscfg file to the project. So locate the ItemGroup section where your project files are included:

  <ItemGroup>
    <ServiceDefinition Include="ServiceDefinition.csdef" />
    <ServiceConfiguration Include="ServiceConfiguration.cscfg" />
  </ItemGroup>

And add the new file within that ItemGroup section:

    <None Include="ServiceConfiguration.Release.cscfg" />

4. Navigate to the last line, just before

</Project>

and after

 <Import Project="$(CloudExtensionsDir)Microsoft.CloudService.targets" />

Now add the following code:

  <UsingTask TaskName="TransformXml" 
             AssemblyFile="$(MSBuildExtensionsPath32)\Microsoft\VisualStudio\v10.0\Web\Microsoft.Web.Publishing.Tasks.dll" />
  <PropertyGroup>
    <ServiceConfigurationTransform>ServiceConfiguration.$(Configuration).cscfg</ServiceConfigurationTransform>
  </PropertyGroup>
  <Target Name="TransformServiceConfiguration" 
          BeforeTargets="CopyServiceDefinitionAndConfiguration" 
          Condition="exists('$(ServiceConfigurationTransform)')">
    <!-- Generate transformed service config in the intermediate directory -->
    <TransformXml Source="@(ServiceConfiguration)" 
                  Destination="$(IntermediateOutputPath)%(Filename)%(Extension)" 
                  Transform="$(ServiceConfigurationTransform)" />
    <!--Force build process to use the transformed configuration file from now on.-->
    <ItemGroup>
      <ServiceConfiguration Remove="ServiceConfiguration.cscfg" />
      <ServiceConfiguration Include="$(IntermediateOutputPath)ServiceConfiguration.cscfg" />
    </ItemGroup>
  </Target>

5. Save and close the ccproj file, and reload your project.


6. Now let’s edit the ServiceConfiguration.Release.cscfg file. Let’s assume we have a “MySetting” configuration setting which we want to alter based on the build configuration. Edit the .Release.cscfg file as follows:

<?xml version="1.0" encoding="utf-8"?>
<sc:ServiceConfiguration serviceName="AzureSettingsSample"
  xmlns:sc="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration"
  xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <sc:Role name="SimpleWorker" xdt:Locator="Match(name)">
    <sc:ConfigurationSettings>
      <sc:Setting name="MySetting" value="Release"
        xdt:Transform="SetAttributes" xdt:Locator="Match(name)" />
    </sc:ConfigurationSettings>
  </sc:Role>
</sc:ServiceConfiguration>

The additional namespace declaration is required so that the TransformXml task recognizes the nodes and attributes.


Voilà! You can change as many settings as you would like, and never again mess with commenting out production settings or forget to change the “UseDevelopmentStorage=true” diagnostics connection string!


The only thing you have to remember is that you have to edit the ccproj file each time you add a new build configuration and want to include a new ServiceConfiguration.BuildConfig.cscfg file. You can have as many as you would like.


Credits go to Oleg Sych’s post on configuration settings.


Again, the full source code can be downloaded from here.

Monday, June 6, 2011

Windows Azure User Group Meeting

Our next meeting is scheduled for June 8th, 2011, at 18:30 local time (GMT+3, daylight saving time). I'll be talking about diagnostics & monitoring of Windows Azure apps and will also share some troubleshooting tips & tricks. Location as usual: Microsoft Bulgaria Office, Sofia, 55 Nikola Vapcarov Blvd. The meeting will also be broadcast via Windows Live Meeting at the following address: https://www.livemeeting.com/cc/mvp/join?id=HT6WMG&role=attend&pw=xfK%24Z%2Bk2j

One of the participants will have a chance to win a license for Cerebrata's Cloud Storage Studio (a US $69.99 value), so don't miss this event, either face-to-face or online!

Monday, May 30, 2011

GITCA's 24 hours in the cloud is coming

In case you missed the PDC 2010 Local (Bulgaria), where I showed how to build a scalable video converter using a Windows Azure Worker Role & Windows Azure Storage, or you would like to refresh your knowledge of Azure, please join GITCA's 24 Hours in the Cloud event on June 1st. My session is scheduled for 10:00 P.M. PDT (06:00 AM GMT, June 2nd), but there are also a lot of good sessions which you might want to watch. During the event you can ask questions using the official event hashtag: #24HitC. To join the event, simply visit this site on June 1st: http://vepexp.microsoft.com/24HitC/ !

See you there!

Tuesday, April 19, 2011

Table Valued Parameter procedures with SQL Azure

Yes, it's supported, and it's fairly easy to use a table-valued parameter in stored procedures with SQL Azure. Here is a quick introduction on how to do it.
In order to use a table-valued parameter in a stored procedure, we first need to create a custom user-defined table type (UDT). Here is my very simple table UDT:
CREATE TYPE ReferenceIds AS TABLE
(
 Id INT
)

Now let’s create a stored procedure that accepts that type:

CREATE PROCEDURE [dbo].[upGetRefIds]
(
 @references ReferenceIds readonly
)
AS
BEGIN
 SELECT Count(Id) From @references
END
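Before writing any .NET code, you can sanity-check the type and the procedure with a plain T-SQL batch (illustrative; run it in the same database where both were created):

DECLARE @ids ReferenceIds;
INSERT INTO @ids (Id) VALUES (2), (12), (2342);
EXEC [dbo].[upGetRefIds] @references = @ids;
-- The SELECT inside the procedure returns 3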

It is important to note that a UDT parameter can only be an input parameter, and it must be explicitly marked as READONLY.
Finally, let's write some ADO.NET:

using (SqlConnection con = new SqlConnection(
    ConfigurationManager.ConnectionStrings["AzureSQL"].ConnectionString))
{
    using (SqlCommand cmd = con.CreateCommand())
    {
        cmd.CommandText = "upGetRefIds";
        cmd.CommandType = CommandType.StoredProcedure;

        // Build a DataTable whose schema matches the ReferenceIds table type.
        DataTable dt = new DataTable();
        dt.Columns.Add("Id", typeof(int));
        dt.Rows.Add(2);
        dt.Rows.Add(12);
        dt.Rows.Add(2342);

        con.Open();

        // The DataTable is passed as a table-valued parameter.
        cmd.Parameters.AddWithValue("@references", dt);
        var result = cmd.ExecuteScalar();
        System.Diagnostics.Debug.WriteLine(result.ToString());
    }
}

Maybe you already noticed – the @references parameter is passed as a regular DataTable, which has one column defined of type integer (matching our user-defined table type). This is the only "special" trick needed to make the magic work!
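If you prefer to be explicit rather than relying on AddWithValue's type inference, you can declare the parameter as structured yourself (a sketch; when executing ad-hoc SQL text instead of a stored procedure you would also need to set the parameter's TypeName to the UDT name):

SqlParameter tvp = cmd.Parameters.Add("@references", SqlDbType.Structured);
tvp.Value = dt;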

That’s it. As simple as that!

Saturday, April 2, 2011

Microsoft MVP for Windows Azure

Yes! It is not an April Fools' joke! It's a fact! It is a great honor for me to be awarded the Microsoft MVP Award for Windows Azure!

And of course, when there are awards and winners and prizes, there are also "thanks". My great thanks go to Martin Kulov (Microsoft Regional Director & Microsoft MVP for ALM), who is a great guy and an incredible Microsoft influencer! I am proud to know him! Of course, huge gratitude also goes to my family for supporting me in all my initiatives!

So what’s next? Even more Windows Azure User Group meetings and even more community activities. Stay tuned for updates!

Slides and code from Microsoft Days’ 2011 Bulgaria / SQL Azure Agent

And here they are. The PowerPoint presentation can be downloaded from: SqlAzureAgent_MSDays2011_20110330.pptx, and the code is located at: http://sqlazureagent.codeplex.com/. Go for it! Download, build, run, change, play! If you have questions: just ask!

Tuesday, March 29, 2011

Bug in CodePlex prevents publishing a release package

While I was completing my next CodePlex project, which will be published soon, I discovered a strange bug. There was a nasty JavaScript error that was preventing me from publishing a release. Initially I thought it was a browser-specific issue, but I tried with every major browser for Windows (FF, IE, GC, Opera) and the problem was still there. And now, when I am really close to release, I just wanted that release published.

In the end it turned out to be the DatePicker JavaScript in the Release manager, which was crashing under certain circumstances and preventing users from publishing a release.
To reproduce the issue, set the "preferred languages" of your browser (any browser) to "Bulgarian, Bulgaria (bg-BG)" and remove all other languages. In that case the JavaScript gets/sets the date format "dd.mm.yyyy 'г.'", but that trailing 'г.' (including the single quotes) breaks the JavaScript for the page. Frankly, I am not entirely sure whether the system locale settings also affect this behavior – all my machines have their system locale set to Bulgarian. However, when I remove "Bulgarian, Bulgaria (bg-BG)" from the browser's preferred languages and leave only English, everything works fine.

I reported this issue to the CodePlex team and hope that they will fix it soon.

Meanwhile, if you are a CodePlex enthusiast and have projects there, keep in mind that locale / language settings might affect your experience!

Sunday, March 27, 2011

SQL Azure: Enterprise Application Development reviewed

As I blogged earlier this year, there are two books on Windows Azure from Packt publishing. I was personally involved as technical reviewer with one of them, and now I am sharing my feedback on the second.
Microsoft SQL Azure: Enterprise Application Development is the second one in the "Azure" series. Published right after "Microsoft Azure: Enterprise Application Development", the book is the perfect complement to it. Reading Microsoft SQL Azure, you will learn the basics of cloud services (i.e. what a cloud is, what types of clouds there are, and who the big players are). You will, of course, catch up with Windows Azure, as it is briefly described, in case you missed the "Microsoft Azure" book.
Focusing on the SQL Azure service itself, the book covers all the steps required to leverage a cloud-based RDBMS. All the information you find there is well structured and accompanied by a good number of screenshots and sample SQL statements. You will not miss any of the features delivered by SQL Azure. All the answers are there – what is the security model of SQL Azure; how to connect and execute queries against the cloud (how to use SQL Server Management Studio); how you can use SQL Server Integration Services (a.k.a. SSIS) and what the limitations are; how to sync your cloud data with on-premise data; which tools are supported by SQL Azure; and many more questions and answers. I could hardly find a question about SQL Azure that this book does not answer!
I would highly recommend this book as a complement to "Microsoft Azure: Enterprise Application Development". These two books are the complete guide to developing applications for Microsoft's cloud!

Thursday, March 24, 2011

Retrospective

Looking back at 2010, when I was a technical evangelist for Infragistics, I can't say that I regret anything. Evangelism is not a job. Evangelism is belief! That's my honest opinion. I was an evangelist before I stepped officially into that role, and I continue to be an evangelist now. I'm inspired by a presentation by Guy Kawasaki (The Art of the Start – a great session, by the way, worth watching in full!), where he says something like "You don't hire evangelists, they find you". You can read more of his great posts on his blog.

So, back to the subject: one of the projects I was involved in during 2010 was the new MyWorldMaps reporting site. We went through a lot of changes during the technical implementation, but in the end it became a really neat Silverlight 4 application that uses MVVM, Azure (my passion) and, of course, the amazing DataVisualization suite from Infragistics. I can only be proud to have been part of it. And now I see it is so liked by the original requester, Brian Hitney from Microsoft.

I worked on maybe the toughest part of it – Distribution Stats. It is based on the Motion Framework (officially announced here), which in turn is the result of the hard work of the DataVisualization team. This type of chart shows you multidimensional data. How many is multi? Four! Yes, four dimensions! The X-axis shows you cumulative hits for a given browser/region. The Y-axis shows you the percentage each item occupies of the total accumulated hits. The third dimension is obvious – time (the data changes over time). And finally, the change in size of a bubble represents the relative change in hits at a given time point compared to the previous one. It is so great to see that it finally went LIVE. Good work all, and thanks, Infragistics!

Wednesday, March 23, 2011

SQL Azure limitations

During my talks on SQL Azure, and in the sessions I've been to (only locally in Bulgaria), we always hear about limitations like "excessive resource usage", "long running transactions", "idle time". But it is very hard to find out officially what the exact numbers behind these statements are. Now that I am preparing for my session next week at Microsoft Developer Days 2011, I went hunting for the numbers. And here they are (keep in mind that these numbers are subject to change at any time without notification):

    • Excessive Memory Usage: When there is memory contention, sessions consuming more than 16 megabytes (MB) for more than 20 seconds are terminated in descending order of how long the resource has been held, i.e. the oldest session is terminated first. Termination of sessions stops as soon as the required memory is available. When a connection is lost due to this reason, you will receive error code 40553.
    • Idle Connections: Connections to your SQL Azure database that are idle for 30 minutes or longer will be terminated. Since there is no active request, SQL Azure does not return any error.
    • Transaction Termination: SQL Azure kills all transactions after they run for 24 hours. If you lose a connection due to this reason, you will receive error code 40549.
    • Lock Consumption: Sessions consuming greater than one million locks are terminated. When this happens, you will receive error code 40550. You can query the sys.dm_tran_locks dynamic management view (DMV) to obtain information about the current state of locking in SQL Azure.
    • Log File Size: Transactions consuming excessive log resources are terminated. The maximum permitted log size for a single transaction is 1-gigabyte (GB). When the connection is lost due to this reason, you will receive error code 40552.

Full list of limitations and a very good reading: http://social.technet.microsoft.com/wiki/contents/articles/sql-azure-connection-management-in-sql-azure.aspx
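As a quick illustration, the sys.dm_tran_locks DMV mentioned above can be queried like any other view (column names per the SQL Server documentation):

SELECT request_session_id, resource_type, request_mode, request_status
FROM sys.dm_tran_locks
ORDER BY request_session_id;

This shows which sessions currently hold or are waiting for locks – handy when chasing error code 40550.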

Important note on Role recycling when you have IntelliTrace enabled

I was recently using IntelliTrace to catch some crashes of a service deployed to Azure. I must say I am astonished by the simplicity of using IntelliTrace. Although I have pretty good logging (apparently not that good, though), my system was restarting and I couldn't figure out why. Once I enabled IntelliTrace, it took me (literally) about 1 minute to catch the issue (including the time to download the IntelliTrace log). However, I noticed that my role was not recycled after a crash. I wouldn't have connected the two if I hadn't come across this post in the Windows Azure forums (written by the Windows Azure support guys):

When IntelliTrace is enabled on a role that is running in Windows Azure and that role crashes, the role is not restarted. This is to improve the information available to IntelliTrace. We do not recycle the role when it is collecting traces, instead it is put into the “Unresponsive” state.

For more information about IntelliTrace, see the following:
Blog Post: Using IntelliTrace to debug Windows Azure Cloud Services
MSDN Article: Debugging with IntelliTrace

Wednesday, February 2, 2011

Azure books from Packt Publishing

As I already mentioned, I was honored to be a technical reviewer of one of the first books on the Azure subject: Microsoft Azure: Enterprise Application Development from Packt Publishing. The process of technical reviewing takes time and requires attention. You can't just sit and say "hey, I will be reviewing this book". You often go through the references, check the correctness of provided links, run the provided code to check for errors, and carefully read the content looking for missing or misspelled technical terms / details. I can only imagine what the process of writing a book is like. It literally takes a year or more. So don't be surprised if you don't see screenshots of the new Silverlight portal, or some of the new stuff that sneaked in during PDC 2010! Above all, this book was written at the time of the first commercial availability of Windows Azure, and it correctly represents everything that was available in Windows Azure SDK 1.2 – which, by the way, remains technically accurate for Windows Azure SDK 1.3. You simply have more features available since PDC 2010; nothing in the existing features has changed so drastically that it would invalidate the book's content. Microsoft Azure: Enterprise Application Development covers all aspects and modules (if I may say so) of Windows Azure and how an enterprise can leverage the platform to build highly scalable and reliable solutions on top of Microsoft's cloud!

There is another book, which focuses on SQL Azure – Microsoft SQL Azure: Enterprise Application Development. While you will find one chapter on the Windows Azure platform in general and hosting an ASP.NET application in the cloud, the book focuses solely on the one and only relational database management system as a service – SQL Azure! I have the honor of having a copy of that book and will write an abstract overview of the full content. While it may take a bit more of my time, I will just share with you that this book will give you the answers to questions like: How can I use SQL Azure with SSIS / SSRS? Can I use SQL Azure without paying for Windows Azure (for sure you can)? Can I use SQL Azure with my PHP application? What about syncing on-premise with cloud data? And much more! So stay tuned for a full overview of this new book on SQL Azure!