Thursday, August 14, 2014

Azure PowerShell IaaS bulk add Endpoints

There are scenarios where your Azure VMs need a lot of endpoints. Of course, you always have to be aware of the limits that come with each Azure service. But you also don't want to add 20 (or 50) endpoints via the management portal; that would be too painful.

Luckily, you can very easily add as many endpoints as you need with the following simple PowerShell script:


# Sign in and select the subscription that owns the VM
Add-AzureAccount
Select-AzureSubscription -SubscriptionName "Your_Subscription_Name"
$vm = Get-AzureVM -ServiceName "CloudServiceName" -Name "VM_Name"

# Add TCP endpoints 6100-6120, then push all changes to Azure in one call
for ($i = 6100; $i -le 6120; $i++)
{
    $EndpointName = "FtpEndpoint_" + $i
    Add-AzureEndpoint -Name $EndpointName -Protocol "tcp" -PublicPort $i -LocalPort $i -VM $vm
}
$vm | Update-AzureVM
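If you later need to verify or roll back these endpoints, the same VM object can be queried and cleaned up. Here is a small sketch using the classic Azure module (the service, VM, and endpoint names match the placeholders above; run it against your own subscription):

```powershell
# List all endpoints currently configured on the VM
Get-AzureVM -ServiceName "CloudServiceName" -Name "VM_Name" |
    Get-AzureEndpoint

# Remove the endpoints created by the loop above, if you need to roll back
$vm = Get-AzureVM -ServiceName "CloudServiceName" -Name "VM_Name"
for ($i = 6100; $i -le 6120; $i++)
{
    Remove-AzureEndpoint -Name ("FtpEndpoint_" + $i) -VM $vm
}
$vm | Update-AzureVM
```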


You can also find the whole script as a Gist.


Of course, you can combine this script with the non-interactive OrgID login for Azure PowerShell to fully automate your process.

Wednesday, August 13, 2014

Azure PowerShell non-interactive login

An interesting and, for automation scenarios, very important topic is how to authenticate a PowerShell script by providing credentials non-interactively.

Luckily, since a recent version of Azure PowerShell (0.8.6), you can provide an additional -Credential parameter to the Add-AzureAccount command (hopefully the documentation will be updated soon to reflect this additional parameter). This is very helpful and is the key to enabling non-interactive PowerShell automation with organizational accounts (non-interactive management with PowerShell has always been possible with a Management Certificate).

In order to provide proper credentials to Add-AzureAccount, we need to protect our password and store it in a file that can be used later. For this we can use the following simple PowerShell command:

Read-Host -AsSecureString | ConvertFrom-SecureString | Out-File d:\tmp\securestring.txt


Next we have to use the previously saved password to construct the credentials needed for Add-AzureAccount:

# Use the saved password
$password = Get-Content d:\tmp\securestring.txt | ConvertTo-SecureString
# Currently (August 13th, 2014) only organizational accounts are supported (also with custom domain).
# Microsoft Accounts (Live ID) are not supported.
$username = "user@tenant.onmicrosoft.com" # or user@yourdomain.com if 'yourdomain.com' is registered with AAD
$mycred = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $username, $password
Add-AzureAccount -Credential $mycred


The whole PowerShell script can also be found in the following Gist.


Credits go to Jamie Thomson and fellow MVP Mike Wood for their contributions on StackOverflow.

Friday, December 20, 2013

Windows Azure – secrets of a Web Site

Windows Azure Web Sites are, I would say, the highest form of Platform-as-a-Service. Per the documentation, they are "The fastest way to build for the cloud". It really is. You can start easily and fast: within minutes you will have your Web Site running in the cloud in a high-density shared environment. And within minutes you can scale out to 10 Large instances reserved only for you! And this is huge: 40 CPU cores with a total of 70GB of RAM, just for your web site. I would say you will need to re-engineer your site before going that big. So what are the secrets?

Project KUDU

What very few people know or realize is that Windows Azure Web Sites runs on Project KUDU, which is publicly available on GitHub. Yes, that's right: Microsoft has released Project KUDU as an open source project, so we can all peek inside, learn, and even submit patches if we find something wrong.

Deployment Credentials

There are multiple ways to deploy your site to Windows Azure Web Sites: starting from plain old FTP, going through Microsoft's Web Deploy, and ending at automated deployment from popular source code repositories like GitHub, Visual Studio Online (formerly TFS Online), DropBox, BitBucket, a local Git repo, and even an external provider that supports the Git or Mercurial source control systems. And this is all thanks to the KUDU project. As we know, the Windows Azure Management portal is protected by (very recently) Windows Azure Active Directory, and most of us use our Microsoft Accounts (formerly known as Windows Live ID) to log in. Well, GitHub, FTP, Web Deploy, etc. know nothing about Live ID. So, in order to deploy a site, we actually need deployment credentials. There are two sets of deployment credentials. User-level deployment credentials are bound to our personal Live ID; we set a user name and password, and these are valid for all web sites and subscriptions the Live ID has access to. Site-level deployment credentials are auto-generated and are bound to a particular site. You can learn more about deployment credentials on the WIKI page.

KUDU console

I'm sure very few of you knew about the live streaming logs feature and the development console in Windows Azure Web Sites. And yet they are there. For every site we create, we get a domain name like

http://mygreatsite.azurewebsites.net/

And behind each site, one additional mapping is automatically created:

https://mygreatsite.scm.azurewebsites.net/

Which currently looks like this:

A key and very important fact: this console runs under HTTPS and is protected by your deployment credentials! This is KUDU! Now you see there are a couple of menu items, like Environment, Debug Console, Diagnostics Dump, and Log Stream. The titles are pretty much self-explanatory. I highly recommend that you jump in and play around; you will be amazed! Here, for example, is a screenshot of the Debug Console:

Nice! This is a command prompt that runs on your Web Site. It has the security context of your web site, so it is pretty restricted. But it also has PowerShell! Yes, it does. In its alpha version, though, you can only execute commands which do not require user input. Still something!
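To give you an idea, here are a few harmless things you can try from the console's PowerShell. These are my own suggestions, not an official list; anything that prompts for input will not work:

```powershell
# Inspect the site's content from inside the Kudu debug console
Get-ChildItem D:\home\site\wwwroot

# Look at the diagnostic log folder of the site
Get-ChildItem D:\home\LogFiles

# Commands like Read-Host, which wait for user input, will NOT work here
```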

Log Stream

The last item in the menu of your KUDU magic is Streaming Logs:

Here you can watch, in real time, all the logging of your web site. OK, not all, but everything you've sent to System.Diagnostics.Trace.WriteLine(string message) will show up here. Not the IIS logs; your application's logs.
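The Trace API is a plain .NET call, so what your application writes looks like this (shown here in PowerShell syntax for brevity; in an ASP.NET app it is the same call from C#, and application logging has to be switched on for the site before the stream picks anything up):

```powershell
# Writing to System.Diagnostics.Trace - these messages (not the IIS logs)
# are what the Log Stream displays, once application logging is enabled
[System.Diagnostics.Trace]::WriteLine("Order processed successfully")
[System.Diagnostics.Trace]::TraceError("Payment gateway timed out")
```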

Web Site Extensions

This big thing, which I described in my previous post, is developed entirely using KUDU Site Extensions; it is an Extension! And if you have played around with it, you might already have noticed that it actually runs under

https://mygreatsite.scm.azurewebsites.net/dev/wwwroot/

So what are Web Site Extensions? In short, these are small (web) apps you can write and install as part of your deployment. They will run in a separate, restricted area of your web site and will be protected by your deployment credentials behind HTTPS-encrypted traffic. You can learn more by visiting the Web Site Extensions WIKI page on the KUDU project. This is another interesting part of KUDU where I suggest you go, investigate, and play around!

Happy holidays!

Wednesday, December 4, 2013

Reduce the trial-deploy-test time with Windows Azure Web Sites and Visual Studio Online

Visual Studio Online

Not long ago, Visual Studio Online went GA. What is not so widely mentioned is the hidden gem: a preview version of the actual Visual Studio IDE! Yes, this thing that we use to develop code has now gone online as a preview (check the Preview Features page on the Windows Azure Portal).

- What can we do now?
- Live, real-time changes to a Windows Azure Web Site!
- Really!? How?

First you need to create a new VSO account if you don't already have one (please waste no time and get yours here!). Then you need to link it to your Azure subscription! Unfortunately (or should I say "ironically"?), account linking (and creating one from within the Azure management portal) is not available for an MSDN benefit account, as per the FAQ here.

Link an existing VSO account

Once you get (or if you already have) a VSO account, you can link it to your Azure subscription. Just sign in to the Azure Management portal with the same Microsoft Account (Live ID) used to create the VSO account. There you should be able to see Visual Studio Online in the left-hand navigation bar. Click on it. A page will appear asking you to create a new or link an existing VSO account. Pick the name of your VSO account and link it!


Enable VSO for an Azure Web Site

You have to enable VSO for each Azure Web Site you want to edit. This can be done by navigating to the target Azure Web Site in the Azure Management Portal. Then go to Configure, scroll down to find "Edit Site in Visual Studio Online", and switch this setting to ON. Wait for the operation to complete!

Edit the Web Site in VSO

Once Edit in VSO is enabled for your web site, navigate to the dashboard of this Web Site in the Windows Azure Management Portal. A new link, "Edit this Web Site", will appear in the right-hand set of links:

The VSO IDE is protected with your deployment credentials (if you don't know what your deployment credentials are, please take a few minutes to read through this article).

And there you go: your Web Site, your IDE, your Browser! What? You said that I forgot to deploy my site first? Well, Visual Studio Online is Visual Studio Online, so you can do "File –> New" and it works! Oh yes, it works:

Every change you make here is immediately (in real time) reflected on the live site! This is the ultimate, fastest way to troubleshoot issues with your JavaScript / CSS / HTML (Views). And if you are doing PHP/Node.js, just edit your files on the fly and see the changes in real time! No need to re-deploy or re-package. No need to even have an IDE installed on your machine; just a modern Browser! You can edit your site even from your tablet!

Where is the catch?

Oh, catch? What do you mean by "Where is the catch"? The source control? There is integrated Git support! You can either link your web site to a Git repository (GitHub, or a VSO project with Git-based source control), or just work with a local Git repository. The choice is really yours! And now you have fully integrated source control over your changes!

Tuesday, October 15, 2013

Windows Azure Migration cheat-sheet

I was recently asked whether I have a cheat-sheet for migrating applications to Windows Azure. The truth is that everything is in my head, and I usually go with "it should work": quickly build, pack, and deploy, then troubleshoot the issues. However, there are certain rules that must be obeyed before making any attempt to port to Windows Azure. Here I will try to outline some of them.

Disclaimer

What I describe here is absolutely my sole opinion, based on my experience. You are free to follow these instructions at your own risk. I describe key points in migrating an application to the Windows Azure Platform-as-a-Service offering – the regular Cloud Services with Web and/or Worker Roles. This article is not intended for migrations to Infrastructure Services (or Windows Azure Virtual Machines).

Database

If you work with Microsoft SQL Server, it should be relatively easy. Just download, install, and run the SQL Azure Migration Wizard against your local database. It is The tool that will migrate your database, or will point you to the features you are using that are not compatible with SQL Azure. The tool is regularly updated (the latest version is from a week before I wrote this blog entry!).

Migrating the schema and data is one side of things. The other side of database migration is in your code: how you use the database. For instance, SQL Azure does not accept the "USE [DATABASE_NAME]" statement. This means you cannot change the database context on the fly; you can only establish a connection to a specific database, and once the connection is established, you can work only in the context of that database. Another limitation, which comes as a consequence of the first one, is that 4-part names are not supported. This means all your statements must refer to database objects omitting the database name:

[schema_name].[table_name].[column_name],

instead of

[database_name].[schema_name].[table_name].[column_name].

Another issue you might face is the lack of support for SQLCLR. I once worked with a customer who had developed a .NET assembly and installed it in their SQL Server to get some useful helper functions. Well, this will not work on SQL Azure.

Last, but not least: you (1) shall never expect SQL Azure to perform better than, or even equal to, your local database installation, and (2) you have to be prepared for the so-called transient errors in SQL Azure and handle them properly. You'd better get to know the Performance Guidelines and Limitations for Windows Azure SQL Database.
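Handling transient errors usually boils down to retrying with a short delay. In .NET code you would typically reach for the Transient Fault Handling Application Block, but the idea itself fits in a few lines. Here is a naive sketch in PowerShell (not production code; the server name, database, retry count, and back-off are all illustrative, and Invoke-Sqlcmd comes from the SQL Server PowerShell module):

```powershell
# Naive retry loop for transient SQL Azure errors (illustrative only)
$maxRetries = 3
for ($attempt = 1; $attempt -le $maxRetries; $attempt++)
{
    try
    {
        Invoke-Sqlcmd -ServerInstance "yourserver.database.windows.net" `
                      -Database "yourdb" -Query "SELECT 1"
        break   # success - stop retrying
    }
    catch
    {
        if ($attempt -eq $maxRetries) { throw }   # give up after the last attempt
        Start-Sleep -Seconds (2 * $attempt)       # simple linear back-off
    }
}
```

In real code you would also inspect the error number to retry only on genuinely transient failures, rather than on every exception.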

Codebase

Logging

When we target our own server (that includes co-located/virtual/shared/etc.), we usually use the local file system (or a local database?) to write logs. Owning a server makes diagnostics and tracing super easy. This is not really the case when you move to Windows Azure. There is a feature of the Windows Azure Diagnostics Agent that transfers your logs to blob storage, which will let you move the code without changes. However, I challenge you to rethink your logging techniques. First of all, I would encourage you to log almost everything, of course using different logging levels which you can adjust at runtime. Pay special attention to Windows Azure Diagnostics, and don't forget: you can still write your own logs, but why not throw some useful log information at System.Diagnostics.Trace?

Local file system

This is a tough one and almost always requires code changes, and even re-architecting some parts of the application. When going into the cloud, especially the Platform-as-a-Service one, do not use the local file system for anything other than temporary storage and static content that is part of your deployment package. Everything else should go to blob storage. And there are many great articles on how to use blob storage here.
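For reference, pushing a file to blob storage instead of writing it to the local disk is nearly a one-liner with the Azure PowerShell storage cmdlets. A sketch (the account name, key, container, and file path are placeholders for your own values):

```powershell
# Upload a file to blob storage instead of the local file system
$ctx = New-AzureStorageContext -StorageAccountName "mystorageaccount" `
                               -StorageAccountKey "your_account_key"

Set-AzureStorageBlobContent -File "D:\data\report.pdf" `
                            -Container "reports" `
                            -Blob "report.pdf" `
                            -Context $ctx
```

In application code (C#, PHP, Node.js, etc.) the equivalent is a couple of calls against the storage client library for your platform.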

Now you will probably say, "Well, yeah, but when I put everything into blob storage, isn't that vendor lock-in?" And I will reply: it depends on how you implement it! Yes, I already mentioned it will certainly require code changes, and if you want to do it the best way and avoid vendor lock-in, it will probably also require an architecture change to how your code works with files. And by the way, the file system is also "vendor lock-in", isn't it?

Authentication / Authorization

It would not be me if I didn't plug in here. Your application will typically use Forms Authentication. Since you are redesigning your app anyway, I highly encourage you to rethink your authentication/authorization system and take a look at Claims! I have a number of posts on claims-based authentication and Azure ACS (Introduction to Claims, Securing ASMX web services with SWT and claims, Identity Federation and Sign-out, Federated authentication – mobile login page for Microsoft Account (Live ID), Online Identity Management via Azure ACS, Creating a Custom Login page for federated authentication with Azure ACS, Unified identity for web apps – the easy way). And here are a couple of blogs I would recommend you follow in this direction:

Other considerations

For the moment I can't dive deeper into the Azure ocean of knowledge to pull out everything really important that fits all types of applications; if something comes up, I will update the content. Things like COM/COM+/GDI+/Server Components/Local Reports should all work in a regular WebRole/WorkerRole environment, where you also have full control over manipulating the operating system! Windows Azure Web Sites is far more restrictive (to date) in terms of what you can execute there and which parts of the operating system you have access to.

Here is something for you to think on. I worked with a customer who was building an SPA application to run in Windows Azure. They had designed a scaling bottleneck into their core. The system manipulates some files and is designed to keep object graphs of those files in memory. It is also designed so that the end user may upload as many files as they want during the course of their interaction with the system, and the back end keeps a single object graph of all the files the user submitted in memory. This object graph cannot be serialized. Here is the situation:

In Windows Azure we (usually, and to comply with the SLA) have at least 2 instances of our server. These instances are load balanced using a round-robin algorithm. The end user comes to our application, logs in, and uploads a file. Works, works, works; every request is routed to a different server. Now the user uploads a new file, and again, and again … each request still goes to a different server.

And here is the question:

What happens when the server side code wants to keep a single object graph of all files uploaded by the end user?

The solution: I leave it to your brains!

Conclusion

Having in mind the above-mentioned key points in moving an application to Windows Azure, I highly encourage you to play around and test. I might update this blog post if something rather important emerges from the deep ocean of Azure knowledge. But for the moment, these are the most important checkpoints for your app.

If you have questions – you are more than welcome to comment!