tag:blogger.com,1999:blog-51773053108279782432024-03-05T06:08:46.370+02:00Anton Staykov's BlogAnton Staykovhttp://www.blogger.com/profile/01413558839144725133noreply@blogger.comBlogger143125tag:blogger.com,1999:blog-5177305310827978243.post-19170031729167246342014-12-14T23:28:00.001+02:002014-12-14T23:43:23.863+02:00Experimenting with Azure Stream Analytics<p>Just a little over a month ago Microsoft <a href="http://weblogs.asp.net/scottgu/azure-announcing-new-real-time-data-streaming-and-data-factory-services">announced</a> the public preview of a new service – <a href="http://azure.microsoft.com/en-us/services/stream-analytics/">Stream Analytics</a> – a service designed to process huge amounts of streamed data in (near) real time. In its current state the service integrates with Azure Event Hubs and Azure Blob storage as data source streams (also called Inputs), and with Event Hubs, Blob Storage and Azure SQL Database as possible write targets (also called Outputs). With its SQL-like query language, you can design your stream processor to slice and dice your real-time input data and turn it into trustworthy information. </p> <p>Now comes the power of the cloud. In a couple of easy steps and a couple of hours you can bring up a reliable infrastructure that can handle tens of thousands of events/messages per second. I was really curious how far it could go in a simple test, so I quickly made up a test scenario. The base for my experiment is the <a href="http://azure.microsoft.com/en-us/documentation/articles/stream-analytics-get-started/">getting started tutorial here</a>. There is a small issue with the “Start the Job” step. It describes that you must go to the “configure” section of your job in order to adjust the job output start time. This setting, however, is not located under the Configure section; it is configured in the window where you start your job:</p> <p>Now. 
To make things more interesting, I made the following adjustments:</p> <ul> <li>Scaled my Event Hub to 10 scale units, thus achieving a potential target of 10,000 events per second.</li> <li>Changed the Event Hub sample code a bit to pump out more messages.</li> <li>Created a small PowerShell script to help me start N simultaneous instances of my command line app.</li> <li>Ran everything on a VM in the same Azure DC (West Europe) where my Event Hub and Stream Analytics job are running.</li></ul> <p>Here are the code changes to the original Service Bus Event Hub demo code.</p> <p>I stripped out all unnecessary code (i.e. creating the Event Hub – I have already created it and I know it is there; parsing command line arguments; etc.). My final Program.cs looks like this:</p><pre class="brush: csharp;"> static string eventHubName;<br /><br /> static void Main(string[] args)<br /> {<br /> System.Net.ServicePointManager.DefaultConnectionLimit = 1024;<br /> eventHubName = "salhub";<br /> Console.WriteLine("Start sending ...");<br /> Stopwatch sw = new Stopwatch();<br /> sw.Start();<br /> Parallelize();<br /> sw.Stop();<br /> Console.WriteLine("Completed in {0} ms", sw.ElapsedMilliseconds);<br /> Console.WriteLine("Press enter key to stop worker.");<br /> Console.ReadLine();<br /> }<br /><br /> static void Parallelize()<br /> {<br /> // start 25 concurrent tasks, each sending 2 000 events<br /> Task[] tasks = new Task[25];<br /> for (int i = 0; i < 25; i++)<br /> {<br /> tasks[i] = Task.Run(() => Send(2000));<br /> }<br /> Task.WaitAll(tasks);<br /> }<br /><br /> public static void Send(int eventCount)<br /> {<br /> Sender s = new Sender(eventHubName, eventCount);<br /> s.SendEvents();<br /> }<br /><br /></pre><br /><p>Now, with this single command line app, I am sending 25 x 2 000, or 50 000 messages in parallel. 
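The Sender class above comes from the original Event Hubs demo and is not shown here. Purely as an illustration – this is my own sketch, not the actual demo code – a minimal sender built on the Microsoft.ServiceBus NuGet package could look roughly like this (the connection string lookup and the JSON payload shape are my assumptions):

```csharp
using System;
using System.Text;
using Microsoft.ServiceBus.Messaging;

// Hypothetical minimal Sender. Assumes the Service Bus connection string
// is configured in app.config under "Microsoft.ServiceBus.ConnectionString".
class Sender
{
    private readonly EventHubClient client;
    private readonly int eventCount;

    public Sender(string eventHubName, int eventCount)
    {
        // Creates a client for the given Event Hub path
        this.client = EventHubClient.Create(eventHubName);
        this.eventCount = eventCount;
    }

    public void SendEvents()
    {
        for (int i = 0; i < eventCount; i++)
        {
            // Illustrative JSON payload; the real demo's payload may differ
            byte[] payload = Encoding.UTF8.GetBytes(
                String.Format("{{\"deviceId\":\"dev-{0}\",\"value\":{1}}}", i % 10, i));
            client.Send(new EventData(payload));
        }
    }
}
```

Each of the 25 tasks above then constructs one such Sender and pushes its 2 000 events to the hub.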
To make things funnier, I ran this single console app in pseudo-parallel by just starting it 20 times with this simple PowerShell script:</p><pre class="brush: ps;">for($i=1; $i -le 20; $i++)<br />{<br /> start .\BasicEventHubSample.exe <br />}<br /></pre><br /><p>Thus I start the processes at almost the same time, wait for all of them to finish (i.e. for every process to send all of its messages) and take the result of the slowest one. Twenty times 50 000 messages makes 1 000 000 messages. Of course all the measurements are then a little approximate, but they are good enough to give me an idea of the possibilities in my hands – without the need to invest in expensive hardware or develop complex solutions. One more thing – I started my Stream Analytics job before starting my data-pumping command line executables, just to make sure the stream processor is already there when I start bombarding it with data.</p><br /><p>Please note a couple of things. First of all, Stream Analytics is in preview, so there might be issues and glitches. But the end results are just astonishing. Looking at the graphs for both the Event Hub and Stream Analytics is just awesome. By the way, the last thing I proved is that the <a href="http://msdn.microsoft.com/en-us/library/azure/dn741336.aspx">new service tiers of Azure SQL Database</a> are also awesome. With this amount of data flowing through Stream Analytics, it had no issues writing the results into a single Basic (5 DTU) database! The results were arriving in my SQL Database table in real time – they were already coming in the moment I switched from the running command line programs to SQL Server Management Studio.</p><br /><p>Bottom line: in my last try, I pumped 1 000 000 events into the Event Hub in just about 75 seconds! That makes a little over 13 000 events per second, with just a couple of lines of code. 
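The Stream Analytics query itself is not shown in this post. As a sketch only – the input/output aliases and the DeviceId field are assumptions, not the tutorial's actual names – a minimal aggregation for this kind of test could look like this:

```sql
-- Hypothetical Stream Analytics query: count events per device over a
-- 5-second tumbling window. 'salinput' and 'saloutput' are the Input and
-- Output aliases configured on the job; DeviceId is an illustrative
-- field from the event payload.
SELECT
    DeviceId,
    COUNT(*) AS EventCount,
    System.Timestamp AS WindowEnd
INTO
    saloutput
FROM
    salinput
GROUP BY
    DeviceId,
    TumblingWindow(second, 5)
```

With the SQL Database output configured, each window emits one row per device into the target table.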
How cool is it to look at a graph like this:</p><br /><p><img src="http://i.imgur.com/4HNRWFY.png"></p><br /><p>How cool is it to look at graphs like the Azure Event Hubs one:</p><br /><p><img src="http://i.imgur.com/XDyYqNU.png"></p><br /><p>Azure Event Hubs – millions of messages. How long would it have taken us to create a local test lab able to process that amount of data?</p><br /><p>We must not forget some of the known issues and limitations of Stream Analytics <a href="http://azure.microsoft.com/en-us/documentation/articles/stream-analytics-limitations/">as listed here</a>, the most important being:</p><br /><ul><br /><li>Geographic availability (Central US and West Europe)</li><br /><li>Streaming Unit quota (12 streaming units per region per subscription!)</li><br /><li>UTF-8 as the only supported encoding for CSV and JSON input sources</li><br /><li>Really neat performance metrics such as latency are not currently provided</li></ul><br /><p>With this baseline, I am convinced that Azure Event Hubs can really deliver a throughput of millions of events per second, and that Stream Analytics can really process that amount of data. </p> Anton Staykovhttp://www.blogger.com/profile/01413558839144725133noreply@blogger.com0tag:blogger.com,1999:blog-5177305310827978243.post-36018626147468403772014-12-06T23:52:00.000+02:002014-12-06T23:52:38.305+02:00Easy authentication in Azure Web Sites<p>For a couple of years now (3-4) I have been strongly evangelizing single sign-on, federated identity, claims-based authentication and so on. There are at least two major points to support that:</p> <p>You (as a developer) don't want to be responsible for the leak of tens or hundreds of thousands of passwords and personal data. 
That responsibility is just too high.</p> <p>Living in the 21st century, there is hardly a single Internet user who does not have at least two online identities that can be used for authentication (Google, Microsoft, Facebook, Yahoo, etc.)</p> <p>Having said that, I have also written a number of articles on claims-based authentication, custom login pages, etc. In all of them the user had to go through some learning curve. That is not the case today! Today Microsoft is thinking about developers and lets them focus on the application itself and its business logic, without caring about authentication at all! Do not forget that you can run .NET (ASP.NET WebForms, MVC and even ASP.NET vNext!), Java, Node.js, PHP and Python on Azure Web Sites today! With three easy steps, you can protect your Web Site with <a href="http://azure.microsoft.com/en-us/services/active-directory/">Azure Active Directory</a>!</p> <p>What is <a href="http://azure.microsoft.com/en-us/services/active-directory/">Azure Active Directory</a>? It is the identity management system behind all Office 365 subscriptions, Dynamics CRM Online subscriptions, Microsoft Intune and all Azure subscriptions! You may not even have realized it, but with every Azure subscription comes one default <a href="http://azure.microsoft.com/en-us/services/active-directory/">Azure Active Directory</a>. So, if you are using Azure – regardless of whether it is an MSDN benefit, a regular pay-as-you-go subscription or a free trial – you already have one <a href="http://azure.microsoft.com/en-us/services/active-directory/">Azure Active Directory</a> tenant! If you wish, you can learn a bit more about how to manage your <a href="http://technet.microsoft.com/en-us/library/hh967611.aspx">Azure Active Directory here</a>.</p> <p>So, dear user, you have <a href="http://azure.microsoft.com/en-us/documentation/articles/web-sites-create-deploy/">created your Azure Web Site</a> and now you have to protect it with your Azure Active Directory tenant. 
Here are the three easy steps you have to follow:</p> <p>1. Navigate to the Configure tab of your Web Site. Scroll down to the Authentication / Authorization section and click Configure:</p><img src="http://i.imgur.com/30QUwBI.png" width="629" height="436"> <p>3. Select your Azure Active Directory (if you have not changed anything, the name of your directory will most probably be “Default Directory”) and choose “Create new application”:</p> <p><img src="http://i.imgur.com/Fgy1nVg.png"></p> <p>Done:</p> <p><img src="http://i.imgur.com/lCvW4OP.png" width="623" height="432"></p> <p>Now your site is protected by Azure Active Directory and has automatic claims authentication. You don't have to worry about salting and hashing users' passwords, nor about how users would reset their passwords, and so on. Protecting your site has never been easier!</p> <p>What are the catches? <img class="wlEmoticon wlEmoticon-smile" style="border-top-style: none; border-bottom-style: none; border-right-style: none; border-left-style: none" alt="Smile" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgxEjzpODMtiTl82eHdTHsgZV8aziQOF-zHZ_CUx8h_3DYSUB2CmIKdtIhRz20l-JDReRMBUpTg1vQwNPwbG0GDbedCXQuA_bGe7WeHZNYDpwoUoiDepRUj-7ImR1zY7cEly2gPFQtKJDV3/?imgmax=800"> There is always a catch! First of all, the service is still in preview and has some limitations:</p> <ul> <li>You can only protect your site with your Azure Active Directory. You can add Microsoft Accounts (e.g. <a href="mailto:someone@hotmail.com">someone@hotmail.com</a>) to your Azure Active Directory, but not any other external identities (e.g. Facebook, Google, Yahoo).</li> <li>With the current release, all users in the configured directory will have access to the application. </li> <li>With the current release, the whole site is placed behind the login requirement (you cannot define “public” pages, although it is relatively easy to do this in a web.config file). 
</li> <li>Headless authentication/authorization for API scenarios or service-to-service scenarios is not currently supported. </li> <li>With the current release there is no distributed log-out, so logging the user out will only do so for this application and not for all global sessions (which means that if the user comes back, he/she will automatically be logged in again). </li></ul> <p>Quick, easy, and it works across the whole stack of supported platforms on Azure Web Sites (.NET, PHP, Java, Node.js, Python).</p> Anton Staykovhttp://www.blogger.com/profile/01413558839144725133noreply@blogger.com0tag:blogger.com,1999:blog-5177305310827978243.post-7013726929199723442014-09-13T16:32:00.001+03:002014-09-13T16:32:18.709+03:00Give me your e-mail to tell you if you are being hacked!<h3>History</h3> <p>A lot of accounts at public services have recently been hacked, exploited, publicly listed, etc. With every single account breach there are at least five services that tell you “check if your account has been hacked” and ask you for your e-mail or account username – almost never asking for your password. Here I will try to explain why you, dear user, should avoid using any of these services, even if the operator behind the service seems respectable, like the “<a href="https://www.bsi.bund.de/">Bundesamt für Sicherheit in der Informationstechnik</a>” (the German Agency for Information Security), which also offers the service <a href="https://www.sicherheitstest.bsi.de/">“Check if your account exists in the hackers' networks that we monitor”</a>.</p> <h3>Problem</h3> <p>This year started with a lot of account breaches at different public services (mainly e-mail services). One such breach <a href="https://www.bsi.bund.de/DE/Presse/Pressemitteilungen/Presse2014/Mailtest_21012014.html">was announced by the very same German Agency for Information Security</a>, where they so kindly offer you the free service of checking whether your account is subject to any identity theft. 
Then came the <a href="http://bgr.com/2014/05/27/ebay-hack-145-million-accounts-compromised/">eBay accounts breach</a>. Then the <a href="https://www.apple.com/pr/library/2014/09/02Apple-Media-Advisory.html">iCloud celebrity accounts breach</a>. Then the <a href="http://www.ibtimes.com/5-million-gmail-usernames-passwords-hacked-posted-russian-bitcoin-forum-report-1684368">Google accounts breach</a>. And probably many more in between. With every massively and hysterically announced account breach come a dozen sites that tell you</p> <blockquote> <p><em><font size="3">You should immediately change your password!</font></em></p></blockquote> <p>and </p> <blockquote> <p><em><font size="3">Hey, gimme your e-mail, I will tell you if it is hacked!</font></em></p></blockquote> <p>while promising </p> <blockquote> <p><font size="3">I will not save your e-mail address anywhere, you can trust me!</font></p></blockquote> <p>While the first warning makes some sense, none of the others does! </p> <blockquote> <p><font size="4"><em>For your own good and safe Internet browsing, never use any service that pretends to tell you whether your account has been hacked!</em></font></p></blockquote> <p>Why? Here is the story of “Why?”</p> <h3>How the attacks work</h3> <p>Without pretending to be a thorough analysis, let me tell you how these attacks (for hacking user accounts) work.</p> <p>Online user identities are usually composed of three main components:</p> <ul> <li>A service (Facebook, Google, Microsoft, eBay, Apple, etc.)</li> <li>A username / login</li> <li>A password</li></ul> <p>In order to “hack” your account, the attacker first has to focus on a service. This is the easiest part. Just follow the security reports from one or more monitoring agencies (like <a href="http://www.symantec.com/security_response/publications/threatreport.jsp">Symantec</a>, the <a href="https://isc.sans.edu/">SANS Institute</a>, or any other) for a couple of months and watch which service comes up most often. 
Or just pick one. </p> <p>OK, the attacker has identified the service to attack. Say it is Facebook. What next? Now he/she has to hack tens of millions of accounts. Using techniques like a <a href="http://en.wikipedia.org/wiki/Brute-force_attack">brute-force</a> attack to identify both login name and password will simply not work. Period. Nobody does this today! The attacker will look for other techniques to obtain – be careful here – <strong>your login name</strong>! Exactly! Your e-mail address. The very same e-mail address that those “friendly” services ask you to give them so they can check whether your account has been hacked / hijacked! </p> <p>By giving your login name / e-mail address to a “let me check this for you” service, you simply fill the attacker's database with <strong>real accounts </strong>that can later be used for password cracking!</p> <p>Now, because you, dear user, have left your e-mail address with such a service, you are already a potential target! <strong>Please, never give your e-mail address or login name to any service of this kind!</strong> Not even to the German Agency for Information Security. Even if the service seems trustworthy, using it does you no good at all – it only serves its owners' purposes.</p> <p>We have slowly come to the last component of an online identity that an attacker has to crack to solve the puzzle – <strong>the password</strong>. Your precious “123456”. Again, passwords are (almost) never cracked using brute force. Attackers usually use dictionaries of the most widely used passwords – the so-called <a href="http://en.wikipedia.org/wiki/Dictionary_attack">dictionary attack</a>. Simple words, no (or few) special characters, no (or few) capital letters. An <a href="http://www.securityweek.com/apple-says-no-icloud-breach-targeted-hack-against-celebrities">analysis report</a> shows that even the recent iCloud security breach was committed using dictionaries. 
</p> <h4>Next steps</h4> <p>OK, now what? </p> <p>First and foremost, never give your account (e-mail address / login name) to third parties! The worst that could happen: you become a primary target for attacks, even if you were safe until now. The least that could happen: you are entered into a list for further monitoring – spam, hack attacks, etc. Lists of valid e-mail addresses are traded (sold for real money!) over the Internet every day!</p> <p>To make sure you are secure online, never use a dictionary word in your password! Your password should not consist of a single word! Most online services already have mechanisms to prevent you from using weak passwords. Trust these “password strength” indicators and never let your password be in the “weak zone”. </p> <p>Well, be careful and always think about your own Internet safety! And never, ever give your account from one service (say Google) to another service (say the German Agency for Information Security). For your Google account, trust only Google. For your Facebook account, trust only Facebook, etc.</p> <p>If you see a report of an account hack or security breach, never rush to any service other than the very one you use – the one responsible for your account. Most of the big players on the market already have forensic tools in place; make sure you know them and know how to use them!</p> <h4>Google Account </h4> <p>If you use Google, navigate to the security section of your account. 
When you are logged in with your Google account on any of Google's services, click on the little arrow next to your e-mail and select “Account”:</p> <p><img src="http://i.imgur.com/dYUZWBi.png"></p> <p>Then navigate to Security:</p> <p><img src="http://i.imgur.com/4BcWiBG.png" width="615" height="215"></p> <p>This page has a “Recent activity” section, which shows really good and interesting information.</p> <h4>Microsoft Account (former Windows Live ID / Hotmail) </h4> <p>If you use Microsoft services, the “Recent activity” information is in a similar place. Log in with your Microsoft account on any of Microsoft's services (Hotmail/Outlook, OneDrive) and click on your name:</p> <p><img src="http://i.imgur.com/xiYxmzo.png" width="283" height="193"></p> <p>Under “Account settings” you will find “Recent Activity”:</p> <p><img src="http://i.imgur.com/yn8QyaM.png" width="357" height="244"></p> <h4>Final notes</h4> <p>Again, never leave (enter, give away) your personal account information to anyone on the Internet!</p> <p>Use strong passwords. It is not that important to change the password often! It is important to use a strong password and to regularly check the account activity section. Change your password only if you see suspicious activity in the recent activity, or if you receive a legitimate message from your service provider that you have to change your password – like the e-mail all eBay users received in May 2014:</p> <p><img src="http://i.imgur.com/GGIDqSU.png" width="593" height="486"></p> <p>When you receive such an e-mail, first check its authenticity – check the sender and reply-to addresses in the message properties, and check for official information on the sender's (in that case eBay's) public Internet site. Never click on any link directly from the e-mail. 
Just navigate to the service as usual and change your password.</p> <p>When you enter your account information (login and password), <strong><font size="4">always</font></strong> check that you are doing it on the provider's sign-in page by verifying the web page's SSL certificate! All the big players pay for an Extended Validation certificate, which makes the address bar / certificate path green and displays their name (EV stands for Extended Validation):</p> <p><img src="http://i.imgur.com/9SfBwlh.png"></p> <p><img src="http://i.imgur.com/JLzCPaK.png"></p> <p>Others just save a couple of hundred dollars and do not pay for Extended Validation, while still providing a trusted and encrypted connection with the site: </p> <p><img src="http://i.imgur.com/rNGHdim.png"></p> <p><img src="http://i.imgur.com/DHPErba.png"></p> <p><img src="http://i.imgur.com/I2276Ry.png"></p> <p><strong>NEVER ENTER YOUR CREDENTIALS</strong> if the SSL connection is not verified or not trusted:</p> <p><img src="http://i.imgur.com/oP6S9yr.png"></p> Anton Staykovhttp://www.blogger.com/profile/01413558839144725133noreply@blogger.com0tag:blogger.com,1999:blog-5177305310827978243.post-61012032190772522902014-08-14T10:14:00.001+03:002014-08-14T10:14:02.634+03:00Azure PowerShell IaaS bulk add Endpoints<p>There are scenarios where your VMs on the Azure cloud will need a lot of endpoints. Of course, you always have to be aware of the <a href="http://azure.microsoft.com/en-us/documentation/articles/azure-subscription-service-limits/">limits that come with each Azure service</a>. But you also don't want to add 20 (or 50) endpoints via the management portal – that would be too painful. 
</p> <p>Luckily, you can extremely easily add as many endpoints as you want using the following simple PowerShell script:</p><pre><br />Add-AzureAccount<br />Select-AzureSubscription -SubscriptionName "Your_Subscription_Name"<br />$vm = Get-AzureVM -ServiceName "CloudServiceName" -Name "VM_Name"<br />for ($i=6100; $i -le 6120; $i++)<br />{<br /> $EndpointName = "FtpEndpoint_"<br /> $EndpointName += $i<br /> Add-AzureEndpoint -Name $EndpointName -Protocol "tcp" -PublicPort $i -LocalPort $i -VM $vm<br />}<br />$vm | Update-AzureVM<br /><br /></pre><br /><p>You can also find the whole script as a <a href="https://gist.github.com/astaykov/22fd38e3593bbf273329#file-azure-powershell-bulk-add-endpoints">Gist</a>. </p><br /><p>Of course, you can use this script in combination with a <a href="http://bit.ly/1usEF36">non-interactive OrgID login to Azure PowerShell</a> to fully automate your process.</p> Anton Staykovhttp://www.blogger.com/profile/01413558839144725133noreply@blogger.com1tag:blogger.com,1999:blog-5177305310827978243.post-45383860884588006342014-08-13T19:35:00.001+03:002014-08-13T19:42:05.344+03:00Azure PowerShell non-interactive login<p>An interesting topic, and a very important one for automation scenarios, is how to authenticate a PowerShell script by providing credentials non-interactively. </p> <p>Luckily, with a recent version of Azure PowerShell (0.8.6) you can provide an additional <strong><em>–credential</em></strong> parameter to the <a href="http://msdn.microsoft.com/en-us/library/azure/dn722528.aspx">Add-AzureAccount</a> command (hopefully the documentation will soon be updated to reflect this additional parameter). 
This is very helpful and is the key to enabling non-interactive PowerShell automation with organizational accounts (non-interactive management with PowerShell has always been possible with a management certificate).</p> <p>In order to provide proper credentials to Add-AzureAccount, we need to protect our password and store it in a file that can be used later. For this we can use the following simple PowerShell command:</p><pre>read-host -assecurestring | convertfrom-securestring | out-file d:\tmp\securestring.txt<br /><br /></pre><br /><p>Next we use the previously saved password to construct the credentials needed for Add-AzureAccount:</p><pre># use the saved password <br />$password = cat d:\tmp\securestring.txt | convertto-securestring <br /># currently (August 13th, 2014) only organizational accounts are supported (also with custom domain). <br /># Microsoft Accounts (Live ID) are not supported <br />$username = "user@tenant.onmicrosoft.com" # or user@yourdomain.com if 'yourdomain.com' is registered with AAD <br />$mycred = new-object -typename System.Management.Automation.PSCredential -argumentlist $username,$password <br />Add-AzureAccount -credential $mycred<br /><br /></pre><br /><p>The whole PowerShell script can also be found in the following <a href="https://gist.github.com/astaykov/4805e070a3ad47a883da#file-non-interactive-azure-manage">Gist</a>.</p><br /><p>Credits go to <a href="http://sqlblog.com/blogs/jamie_thomson/">Jamie Thomson</a> and fellow MVP <a href="http://mvwood.com/">Mike Wood</a> for their contributions on <a href="http://stackoverflow.com/questions/25206485/add-azureaccount-credential-not-working-as-id-hoped">StackOverflow</a>.</p> Anton Staykovhttp://www.blogger.com/profile/01413558839144725133noreply@blogger.com0tag:blogger.com,1999:blog-5177305310827978243.post-44796358646973209242013-12-20T16:40:00.001+02:002013-12-22T03:03:06.218+02:00Windows Azure – secrets of a Web Site<p><a 
href="http://www.windowsazure.com/en-us/services/web-sites/">Windows Azure Web Sites</a> are, I would say, the highest form of Platform-as-a-Service. As per the documentation, “<em>The fastest way to build for the cloud</em>”. It really is. You can start easily and fast – in minutes you will have your Web Site running in the cloud in a high-density shared environment. And within minutes you can scale to 10 Large instances reserved only for you! And this is huge – that is 40 CPU cores with a total of 70 GB of RAM, just for your web site. I would say you will need to re-engineer your site before going that big. So what are the secrets?</p> <h2>Project KUDU</h2> <p>What very few people know or realize is that Windows Azure Web Sites runs <a href="https://github.com/projectkudu/kudu/wiki">Project KUDU</a>, which is publicly available on <a href="https://github.com/projectkudu/kudu/wiki">GitHub</a>. Yes, that's right – Microsoft has released Project KUDU as an open source project, so we can all peek inside, learn, and even submit patches if we find something wrong.</p> <h2>Deployment Credentials</h2> <p>There are multiple ways to deploy your site to Windows Azure Web Sites: starting with plain old FTP, going through Microsoft's Web Deploy, and ending with automated deployment from popular source code repositories like GitHub, Visual Studio Online (former TFS Online), Dropbox, Bitbucket, a local Git repo, and even an external provider that supports the Git or Mercurial source control systems. And all this thanks to the KUDU project. As we know, the Windows Azure Management portal is (since very recently) protected by Windows Azure Active Directory, and most of us use our Microsoft accounts to log in (formerly known as Windows Live ID). Well, GitHub, FTP, Web Deploy, etc. know nothing about Live ID. So, in order to deploy a site, we actually need deployment credentials. There are two sets of deployment credentials. 
User-level deployment credentials are bound to our personal Live ID; we set a user name and password, and these are valid for all web sites and subscriptions the Live ID has access to. Site-level deployment credentials are auto-generated and bound to a particular site. You can learn more about deployment credentials on the <a href="https://github.com/projectkudu/kudu/wiki/Deployment-credentials">WIKI page</a>.</p> <h2>KUDU console</h2> <p>I'm sure very few of you knew about the live streaming logs feature and the development console in Windows Azure Web Sites. And yet they are there. For every site we create, we get a domain name like</p> <p><a href="http://mygreatsite.azurewebsites.net/">http://mygreatsite.azurewebsites.net/</a></p> <p>And behind each site, one additional mapping is automatically created:</p> <p><a href="https://mygreatsite.scm.azurewebsites.net/">https://mygreatsite.scm.azurewebsites.net/</a></p> <p>Which currently looks like this:</p> <p><img src="http://i.imgur.com/ETlCLoA.png"></p> <p>A key and very important fact – this console runs under HTTPS and is protected by your deployment credentials! This is KUDU! Now you see there are a couple of menu items, like Environment, Debug Console, Diagnostics Dump and Log Stream. The titles are pretty much self-explanatory. I highly recommend that you jump in and play around – you will be amazed! Here, for example, is a screenshot of the Debug Console:</p> <p><img src="http://i.imgur.com/ZHqLlut.png"></p> <p>Nice! This is a command prompt that runs on your Web Site. It has the security context of your web site – so pretty restricted. But it also has PowerShell! Yes, it does. Though in its alpha version you can only execute commands that do not require user input. Still something!</p> <h2>Log Stream</h2> <p>The last item in the menu of your KUDU magic is Streaming Logs:</p> <p><img src="http://i.imgur.com/pJCHNt8.png"></p> <p>Here you can watch, in real time, all the logging of your web site. OK, not all. 
But everything you've sent to System.Diagnostics.Trace.WriteLine(string message) will come here. Not the IIS logs – your application's logs.</p> <p><strong>Web Site Extensions</strong></p> <p>The big thing I described in my previous post is all developed using KUDU Site Extensions – it is an extension! And if you played around with it, you might already have noticed that it actually runs under</p> <p><a href="https://mygreatsite.scm.azurewebsites.net/dev/wwwroot/">https://mygreatsite.scm.azurewebsites.net/dev/wwwroot/</a></p> <p>So what are Web Site Extensions? In short, these are small (web) apps you can write and install as part of your deployment. They run under a separate, restricted area of your web site and are protected by the deployment credentials behind HTTPS-encrypted traffic. You can learn more by visiting the <a href="https://github.com/projectkudu/kudu/wiki/Azure-Site-Extensions">Site Extensions WIKI page on the KUDU project</a>. This is another interesting part of KUDU which I suggest you go investigate and play around with!</p> <p>Happy holidays!</p> Anton Staykovhttp://www.blogger.com/profile/01413558839144725133noreply@blogger.com0tag:blogger.com,1999:blog-5177305310827978243.post-13677269901233128572013-12-04T14:44:00.000+02:002013-12-04T14:44:51.857+02:00Reduce the trail-deploy-test time with Windows Azure Web Sites and Visual Studio Online<h2>Visual Studio Online</h2> <p>Not long ago <a href="http://www.windowsazure.com/en-us/services/visual-studio-online/">Visual Studio Online</a> went GA. What is not so widely mentioned is the hidden gem – a preview version of the actual Visual Studio IDE! Yes, the thing we use to develop code has now gone online as a preview (check the <a href="http://www.windowsazure.com/en-us/services/preview/">Preview Features page on the Windows Azure Portal</a>). </p> <p>- What can we do now? <br>- Live, real-time changes to a Windows Azure Web Site!<br>- Really!? 
How?</p> <p>First you need to create a new VSO account, if you don't already have one (please waste no time and <a href="http://go.microsoft.com/fwlink/?LinkId=307137">get yours here</a>!). Then you need to link it to your Azure subscription! Unfortunately (or should I say “<strong>ironically</strong>”?), account linking (and creation from within the Azure management portal) is not available for an MSDN benefit account, as per the <a href="http://www.visualstudio.com/set-up-billing-for-your-account-vs#CannotLinkVSOAccount">FAQ here</a>. </p> <h2>Link an existing VSO account</h2> <p>Once you get (or if you already have) a VSO account, you can link it to your Azure subscription. Just sign in to the Azure Management portal with the same Microsoft account (Live ID) used to create the VSO account. There you will be able to see Visual Studio Online in the left-hand navigation bar. Click on it. A page will appear asking you to create a new VSO account or link an existing one. Pick the name of your VSO account and link it!</p><img style="margin: 0px" src="http://i.imgur.com/zMy810A.png"> <h2>Enable VSO for an Azure Web Site</h2> <p>You have to enable VSO for each Azure Web Site you want to edit. This can be achieved by navigating to the target Azure Web Site inside the Azure Management Portal. Then go to <strong><em>Configure</em></strong>. Scroll down, find “Edit Site in Visual Studio Online” and switch this setting to ON. Wait for the operation to complete!</p> <p><img src="http://i.imgur.com/LZw8spA.png"></p> <h2>Edit the Web Site in VSO</h2> <p>Once Edit in VSO is enabled for your web site, navigate to the dashboard of this Web Site in the Windows Azure Management Portal. 
A new link will appear in the right-hand set of links – “Edit this Web Site”:</p> <p><img src="http://i.imgur.com/nrY6W7q.png"></p> <p>The VSO IDE is protected with your deployment credentials (if you don’t know what your deployment credentials are, please take a few minutes to read through <a href="https://github.com/projectkudu/kudu/wiki/Deployment-credentials">this article</a>). </p> <p>And there you go – your Web Site, your IDE, your Browser! What? You said that I forgot to deploy my site first? Well. Visual Studio Online<strong> is</strong> Visual Studio Online. So you can do “File –> New” and it works! Oh, yes it works: </p> <p><img src="http://i.imgur.com/2URJ8Wj.png"></p> <p>Every change you make here is immediately (in real time) reflected on the site! This is the ultimate, fastest way to troubleshoot issues with your JavaScript / CSS / HTML (Views). And if you are doing PHP/Node.js – just edit your files on the fly and see the changes in real time! No need to re-deploy, re-package. No need to even have an IDE installed on your machine – just a modern Browser! You can edit your site even from your tablet!</p> <h2>Where is the catch?</h2> <p>Oh, catch? What do you mean by “Where is the catch”? The source control? There is integrated Git support! You can either link your web site to a Git repository (GitHub / a VSO project with Git-based Source Control), or just work with a local Git repository. The choice is really yours! And now you have fully integrated source control over your changes!</p> Anton Staykovhttp://www.blogger.com/profile/01413558839144725133noreply@blogger.com0tag:blogger.com,1999:blog-5177305310827978243.post-67523168885253515372013-10-15T13:22:00.001+02:002013-10-15T13:22:03.489+02:00Windows Azure Migration cheat-sheet<p>I was recently asked whether I have a cheat-sheet for migrating applications to Windows Azure. The truth is that everything is in my head and I usually go with “it should work” – quickly build, pack and deploy.
Then troubleshoot the issues. However, there are certain rules that must be obeyed before making any attempt to port to Windows Azure. Here I will try to outline some.</p> <h3>Disclaimer</h3> <blockquote> <p>What I describe here is absolutely my sole opinion, based on my experience. You are free to follow these instructions at your own risk. I describe key points in migrating an application to the Windows Azure Platform-as-a-Service offering – the regular Cloud Services with Web and/or Worker Roles. This article is not intended for migrations to Infrastructure Services (or Windows Azure Virtual Machines).</p></blockquote> <h3>Database</h3> <p>If you work with Microsoft SQL Server it should be relatively easy going. Just download, install and run against your local database the <a href="http://sqlazuremw.codeplex.com/">SQL Azure Migration Wizard</a>. It is <strong>The</strong> tool that will migrate your database or will point you to features you are using that are not compatible with SQL Azure. The tool is regularly updated (the latest version is from a week before I wrote this blog entry!). </p> <p>Migrating schema and data is one side of things. The other side of database migration is in your code – how you use the database. For instance SQL Azure does not accept the “<strong>USE [DATABASE_NAME]</strong>” statement. This means you cannot change the database context on the fly. You can only establish a connection to a specific database. And once the connection is established, you can work only in the context of that database. Another limitation, which comes as a consequence of the first one, is that 4-part names are not supported. Meaning that all your statements must refer to database objects omitting the database name:</p> <p>[schema_name].[table_name].[column_name], </p> <p>instead of </p> <p>[database_name].[schema_name].[table_name].[column_name].</p> <p>Another issue you might face is the lack of support for SQLCLR.
I once worked with a customer who had developed a .NET assembly and installed it in their SQL Server to provide some helpful functions. Well, this will not work on SQL Azure.</p> <p>Last, but not least: (1) never expect SQL Azure to perform better than, or even equal to, your local database installation, and (2) you have to be prepared for the so-called <strong><em>transient</em></strong> errors in SQL Azure and handle them properly. You'd better get to know the <a href="http://msdn.microsoft.com/en-us/library/windowsazure/jj156164.aspx">Performance Guidelines and Limitations for Windows Azure SQL Database</a>.</p> <h3>Codebase</h3> <h4>Logging</h4> <p>When we target our own server (that includes co-located/virtual/shared/etc.) we usually use the local file system (or a local database?) to write logs. Owning a server makes diagnostics and tracing super easy. This is not really the case when you move to Windows Azure. There is a feature of the Windows Azure Diagnostics Agent to transfer your logs to blob storage, which will let you just move the code without changes. However I do challenge you to rethink your logging techniques. First of all I would encourage you to log almost everything, of course using different logging levels which you can adjust at runtime. Pay special attention to <a href="http://msdn.microsoft.com/en-us/library/windowsazure/dn186185.aspx">Windows Azure Diagnostics</a> and don’t forget – you can still write your own logs, but why not throw some useful log information at System.Diagnostics.Trace.</p> <h4>Local file system</h4> <p>This is a tough one and almost always requires code changes, and even re-architecting some parts of the application. When going into the cloud, especially the Platform-as-a-Service one, do not use the local file system for anything but temporary storage and static content that is part of your deployment package. Everything else should go to blob storage.
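As a sketch of what that code change might look like (my illustration, not from the original post – the connection string, container name and method are placeholders), using the Windows Azure storage client library of that era:

```csharp
using System.IO;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class BlobFileStore
{
    // Hypothetical connection string; in a real role you would read it
    // from the service configuration instead of hard-coding it.
    const string ConnectionString =
        "DefaultEndpointsProtocol=https;AccountName=YOURACCOUNT;AccountKey=YOURKEY";

    // Instead of File.Copy / File.WriteAllBytes to a local path,
    // push the file into a blob container.
    public static void SaveUserFile(string localPath)
    {
        CloudStorageAccount account = CloudStorageAccount.Parse(ConnectionString);
        CloudBlobClient client = account.CreateCloudBlobClient();

        // One container per logical "folder"; created on first use.
        CloudBlobContainer container = client.GetContainerReference("userfiles");
        container.CreateIfNotExists();

        // The blob name takes the place of the local file name.
        CloudBlockBlob blob = container.GetBlockBlobReference(Path.GetFileName(localPath));
        using (FileStream fs = File.OpenRead(localPath))
        {
            blob.UploadFromStream(fs);
        }
    }
}
```

The point of isolating this behind a method like SaveUserFile is exactly the vendor lock-in concern discussed next: keep a single seam in your code where files are stored, and swapping the storage backend later stays cheap.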
And there are many great articles on how to use blob storage <a href="http://www.windowsazure.com/en-us/develop/net/how-to-guides/blob-storage/">here</a>.</p> <p>Now you will probably say “Well, yeah, but when I put everything into blob storage isn’t it <strong>vendor lock-in</strong>?” And I will reply – that depends on how you implement it! Yes, I already mentioned it will certainly require code changes and, if you want to do it the best way and avoid vendor lock-in, it will probably also require architecture changes in how your code works with files. And by the way, the file system is also “vendor lock-in”, isn’t it?</p> <h3>Authentication / Authorization</h3> <p>It would not be me if I didn’t plug in here. Your application will typically use Forms Authentication. When you redesign your app anyway, I highly encourage you to rethink your authentication/authorization system and take a look into Claims! I have a number of posts on Claims-based authentication and Azure ACS (<a href="http://blogs.staykov.net/2012/05/introduction-to-claims.html">Introduction to Claims</a>, <a href="http://blogs.staykov.net/2012/05/secure-your-asmx-webservices-with-swt.html">Securing ASMX web services with SWT and claims</a>, <a href="http://blogs.staykov.net/2013/04/identity-federation-and-sign-out.html">Identity Federation and Sign-out</a>, <a href="http://blogs.staykov.net/2013/04/federated-authenticationmobile-login.html">Federated authentication – mobile login page for Microsoft Account (live ID)</a>, <a href="https://www.simple-talk.com/cloud/security-and-compliance/online-identity-management-via-windows-azure-access-control-service/">Online Identity Management via Azure ACS</a>, <a href="https://www.simple-talk.com/cloud/development/creating-a-custom-login-page-for-federated-authentication-with-windows-azure-acs/">Creating Custom Login page for federated authentication with Azure ACS</a>, <a href="https://www.simple-talk.com/cloud/development/unified-identity-for-web-apps-8211-the-easy-way/">Unified identity for web apps – the easy way</a>). And here are a couple of blogs I would recommend you follow in this direction:</p> <ul> <li>Dominic Baier: <a title="http://leastprivilege.com/" href="http://leastprivilege.com/">http://leastprivilege.com/</a></li> <li>Vittorio Bertocci: <a title="http://www.cloudidentity.com/blog/" href="http://www.cloudidentity.com/blog/">http://www.cloudidentity.com/blog/</a></li></ul> <h3>Other considerations</h3> <p>At the moment I can't dive deeper into the Azure ocean of knowledge to pull out something else really important that fits all types of applications. If that happens, I will update the content. Things like COM/COM+/GDI+/Server Components/Local Reports – everything should work in a regular WebRole/WorkerRole environment, where you also have full control to manipulate the operating system! Windows Azure Web Sites is far more restrictive (to date) in terms of what you can execute there and what part of the operating system you have access to.</p> <p>Here is something for you to think on: I worked with a customer who was building an SPA application to run in Windows Azure. They had designed a scaling bottleneck into their core. The system manipulates some files. It is designed to keep object graphs of those files in-memory. It is also designed in a way that the end user may upload as many files as they want during the course of their interaction with the system. And the back-end keeps a single object graph of all the files the user submitted in-memory. This object graph cannot be serialized. Here is the situation:</p> <blockquote> <p>In Windows Azure we (usually, and to comply with the SLA) have at least 2 instances of our server. These instances are load balanced using a round-robin algorithm. The end user comes to our application, logs in and uploads a file. Works, works, works – every request is routed to a different server. Now the user uploads a new file, and again, and again … each request still goes to a different server.
</p></blockquote> <p>And here is the question: </p> <blockquote> <p>What happens when the server side code wants to keep a single object graph of all files uploaded by the end user?</p></blockquote> <p>The solution: I leave it to your brains!</p> <h3>Conclusion</h3> <p>With the above-mentioned key points for moving an application to Windows Azure in mind, I highly encourage you to play around and test. I might update this blog post if something rather important comes out from the deep ocean of Azure knowledge I have. But for the moment, these are the most important check-points for your app. </p> <p>If you have questions – you are more than welcome to comment!</p> Anton Staykovhttp://www.blogger.com/profile/01413558839144725133noreply@blogger.com0tag:blogger.com,1999:blog-5177305310827978243.post-4462723542936805872013-08-22T07:57:00.001+02:002013-08-22T07:57:30.297+02:00Azure SessionAffinity plugin update<p>An important update for the <a href="https://github.com/richorama/AzurePluginLibrary/tree/master/plugins/SessionAffinity4" target="_blank">SessionAffinity4</a> plugin if you use an Azure SDK newer than 2.0 (that is, 2.1 and later). The first thing to note is that you need to install this plugin (as any other in the <a href="https://github.com/richorama/AzurePluginLibrary" target="_blank">AzurePluginLibrary</a> project) for each version of the Azure SDK you have.</p> <p>If you were using the plugin with Azure SDK 2.0, the location of the plugin is the following:</p> <blockquote> <p><font size="2" face="Consolas"><strong>C:\Program Files\Microsoft SDKs\Windows Azure\.NET SDK\v2.0\bin\plugins</strong></font></p></blockquote> <p>For v. 2.1 of the Azure SDK, the new location is:</p> <blockquote> <p><font size="2" face="Consolas"><strong>C:\Program Files\Microsoft SDKs\Windows Azure\.NET SDK\v2.1\bin\plugins</strong></font></p></blockquote> <p>However the plugin has a dependency on the Microsoft.WindowsAzure.ServiceRuntime assembly.
And as the 2.1 SDK ships a new version of that assembly, the plugin will fail to start. The solution is extremely simple. Just browse to the plugin folder and locate the configuration file:</p> <blockquote> <p><font size="2" face="Consolas"><strong>SessionAffinityAgent4.exe.config</strong></font></p></blockquote> <p>It will look like this:</p><pre class="brush: xml;"><?xml version="1.0" encoding="utf-8" ?><br /><configuration><br /> <startup useLegacyV2RuntimeActivationPolicy="true"> <br /> <supportedRuntime version="v4.0" sku=".NETFramework,Version=v4.5" /><br /> </startup><br /></configuration><br /></pre><br /><p>Add the following additional configuration:</p><pre class="brush: xml;"> <runtime><br /> <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1"><br /> <dependentAssembly><br /> <assemblyIdentity name="Microsoft.WindowsAzure.ServiceRuntime" publicKeyToken="31bf3856ad364e35" /><br /> <bindingRedirect oldVersion="1.0.0.0-2.1.0.0" newVersion="2.1.0.0" /><br /> </dependentAssembly><br /> </assemblyBinding><br /> </runtime><br /></pre><br /><p>So the final configuration file will look like this:</p><pre class="brush: xml;"><?xml version="1.0" encoding="utf-8" ?><br /><configuration><br /> <startup useLegacyV2RuntimeActivationPolicy="true"><br /> <supportedRuntime version="v4.0" sku=".NETFramework,Version=v4.5" /><br /> </startup><br /> <runtime><br /> <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1"><br /> <dependentAssembly><br /> <assemblyIdentity name="Microsoft.WindowsAzure.ServiceRuntime" publicKeyToken="31bf3856ad364e35" /><br /> <bindingRedirect oldVersion="1.0.0.0-2.1.0.0" newVersion="2.1.0.0" /><br /> </dependentAssembly><br /> </assemblyBinding><br /> </runtime><br /></configuration><br /></pre><br /><p>Now repackage your cloud service and deploy.
</p><br /><p>Please remember – only update the configuration file located in the v<strong> 2.1</strong> folder of the Azure SDK!</p><br /><p>Happy Azure coding!</p> Anton Staykovhttp://www.blogger.com/profile/01413558839144725133noreply@blogger.com0tag:blogger.com,1999:blog-5177305310827978243.post-85757878306533169312013-08-21T23:21:00.001+02:002013-08-21T23:23:35.433+02:00Running Java Jetty server on Azure with AzureRunMe<p>The <a href="https://github.com/RobBlackwell/AzureRunMe" target="_blank">AzureRunMe</a> project has existed for a while. There are a lot of commercial projects (Java, Python, and others) running on Azure using it. The most common scenario for running Java on Azure uses the Apache Tomcat server. Let's see how we can use Jetty to run our Java application in a Cloud Service.</p> <p>First we will need Visual Studio. Yep … there are still options for our deployment (such as the size of the Virtual Machine, to name one) which require recompilation of the whole package and are not just configuration options. But, you can use the <a href="http://www.microsoft.com/visualstudio/eng/products/visual-studio-express-products" target="_blank">free Express version</a> (I think you will need both the Web and the Windows Desktop versions). And yes, it is absolutely free and you can use it to build your AzureRunMe package for Azure deployment. Along with Visual Studio, you also have to install the latest version (or the latest supported by the AzureRunMe project) of the <a href="http://www.windowsazure.com/en-us/downloads/?sdk=net" target="_blank">Windows Azure SDK for .NET</a>.</p> <p>Then get the latest version of <a href="https://github.com/RobBlackwell/AzureRunMe" target="_blank">AzureRunMe from GitHub</a>. Please go through the <a href="https://github.com/RobBlackwell/AzureRunMe/blob/master/README.md" target="_blank">Readme</a> to get to know the AzureRunMe project overall.</p> <p>Next is to get the JRE for Windows ZIP package.
If you don't have it already on your computer, you have to <a href="http://www.oracle.com/technetwork/java/javase/downloads/index.html" target="_blank">download it from Oracle's site</a> (no direct link, because Oracle wants you to accept the license agreement first). I got the Server JRE version. Have the ZIP handy.</p> <p>Now let's get <a href="http://www.eclipse.org/jetty/" target="_blank">Jetty</a>. The version I got is <a href="http://download.eclipse.org/jetty/stable-9/dist/" target="_blank">9.0.5</a>. </p> <p>Now let's get our hands dirty.</p> <p>Create a folder structure similar to the following one:</p> <p><img src="http://i.imgur.com/N10hXdE.png"></p> <p>As per the AzureRunMe requirements – my application is prepared to run from a single folder. I have java-1.7, jetty-9.0.5 and runme.bat in that folder. To prepare my application for AzureRunMe I create two zip files:</p> <ul> <li><strong>java-1.7.zip</strong> – the Java folder as is</li> <li><strong>jetty-9.0.5.zip</strong> – contains both runme.bat + the jetty-9.0.5 folder</li></ul> <p>I have also put a WAR file of my application into Jetty's webapps folder. It will later be automatically deployed by the Jetty engine itself. I then upload these two separate ZIP files into a blob container of my choice (for the example I named it <strong>deploy</strong>). The content of the runme.bat file is as simple as this:</p> <blockquote> <p><font face="Consolas">@echo off<br>REM Starting Jetty with deployed app<br>cd jetty-9.0.5<br>..\java-1.7\jre\bin\java -jar start.jar jetty.port=8080</font></p></blockquote> <p>It just starts the Jetty server. </p> <p>Now let's jump to Visual Studio to create the package. Once you've installed Visual Studio and downloaded the latest version of <a href="https://github.com/RobBlackwell/AzureRunMe" target="_blank">AzureRunMe</a>, you have to open the AzureRunme.sln file (Visual Studio Solution file). Usually just double-click on that file and it will automatically open with Visual Studio.
There are very few configuration settings you need to set before you create your package. Right-click on the <strong>WorkerRole</strong> item which is under <strong>AzureRunMe</strong>:</p> <p><img src="http://i.imgur.com/FFLBq0q.png"></p> <p>This will open the Properties pages:</p> <p><img src="http://i.imgur.com/ZV8SA6z.png"></p> <p>On the first page we configure the number of Virtual Machines we want running for us, and their size. One more option to configure – the Diagnostics Connection String. Here just replace <strong>YOURACCOUNTNAME</strong> and <strong>YOURACCOUNTKEY</strong> with the respective values of your Azure Storage Account credentials.</p> <p>Now move to the <strong>Settings</strong> tab:</p> <p><img src="http://i.imgur.com/vbjfSVb.png"></p> <p>Here we have to set a few more things:</p> <ul> <li><strong>Packages</strong>: the most important one. This is a semicolon (<strong>;</strong>) separated list of packages to deploy. Packages are downloaded and unzipped in the order of appearance in the list. I have set two packages (the zip files that I created earlier): deploy/java-1.7.zip;deploy/jetty-9.0.5.zip</li> <li><strong>Commands</strong>: this is again a semicolon (<strong>;</strong>) separated list of batch files or single commands to execute when everything is ready. In my case this is the <strong>runme.bat</strong> file, which was in the jetty-9.0.5.zip package.</li> <li>Update the storage credentials in 3 different places.</li></ul> <p>For more information and a description of each setting, please refer to the <a href="https://github.com/RobBlackwell/AzureRunMe" target="_blank">AzureRunMe project's documentation</a>. </p> <p>Final step.
Right-click on the AzureRunMe item with the cloud icon and select "Create Package":</p> <p><img src="http://i.imgur.com/mkBYTha.png"></p> <p>If everything is fine you will get a nice set of files which you will use to deploy your Jetty server in Azure:</p> <p><img src="http://i.imgur.com/bDr4eKU.png"></p> <p>You can refer to the <a href="http://www.windowsazure.com/en-us/manage/services/cloud-services/how-to-create-and-deploy-a-cloud-service/#deploy" target="_blank">online documentation here</a>, if you have doubts on how to deploy your cloud service package.</p> Anton Staykovhttp://www.blogger.com/profile/01413558839144725133noreply@blogger.com0tag:blogger.com,1999:blog-5177305310827978243.post-34098546280806540222013-07-24T15:50:00.001+02:002013-07-24T15:50:31.831+02:00SessionAffinity plugin for Windows Azure<p>In a <a href="http://blogs.staykov.net/2013/07/session-affinity-and-windows-azure.html" target="_blank">previous post</a> I reviewed what Session Affinity is and why it is so important for your Windows Azure (Cloud Service) deployments. I also introduced the <a href="https://github.com/richorama/AzurePluginLibrary/tree/master/plugins/SessionAffinity" target="_blank">SessionAffinity</a> and <a href="https://github.com/richorama/AzurePluginLibrary/tree/master/plugins/SessionAffinity4" target="_blank">SessionAffinity4</a> plugins, part of the <a href="https://github.com/richorama/AzurePluginLibrary" target="_blank">Azure Plugin Library project</a>.
Here I will describe what this plugin is and how it works.</p> <p>The SessionAffinity plugin is based around Microsoft's <a href="http://www.iis.net/downloads/microsoft/application-request-routing" target="_blank">Application Request Routing</a> module, which can be installed as an add-on in Microsoft's web server – <a href="http://www.iis.net/home" target="_blank">IIS (Internet Information Services).</a> This module has dependencies on the following other (useful) modules:</p> <ul> <li><a href="http://www.iis.net/downloads/microsoft/url-rewrite" target="_blank">URL Rewrite</a> – similar to <a href="http://httpd.apache.org/docs/current/mod/mod_rewrite.html" target="_blank">Apache's mod_rewrite</a>. You can even translate most of Apache's mod_rewrite rules to IIS URL Rewrite rules;</li> <li><a href="http://www.iis.net/downloads/microsoft/web-farm-framework" target="_blank">Web Farm Framework</a> - simplifies the provisioning, scaling, and management of multiple servers;</li> <li><a href="http://www.iis.net/downloads/microsoft/application-request-routing" target="_blank">ARR</a> - enables Web server administrators, hosting providers, and Content Delivery Networks (CDNs) to increase Web application scalability and reliability through rule-based routing, client and host name affinity, load balancing of HTTP server requests, and distributed disk caching;</li> <li>External Cache</li></ul> <p>The two most important features of ARR that will help us achieve Session Affinity are URL Rewrite and load balancing. Of course they only make sense when there is a Web Farm of servers to manage.</p> <p>Here is a basic diagram which illustrates what happens to your (non-.NET) deployment when you use the SessionAffinity plugin:</p> <p><img src="http://i.imgur.com/LIB3HFi.png"></p> <p>First and most important of all – the SessionAffinity plugin only works with Worker Roles! This is the type of role you will use when you want to deploy a non-.NET web server (Apache, Apache Tomcat, NGINX, etc.)
. This is very important. A Web Role is a special kind of role, and the internal Azure infrastructure does additional things to the IIS configuration which literally mess with the Session Affinity plugin and ARR configurations. So, use only Worker Roles when you want Session Affinity.</p> <p>The plugin itself consists of two main modules: </p> <p>Installer bootstrapper – takes care of installing the ARR module and all its dependencies</p> <p>SessionAffinityAgent.exe – a .NET-based console application which is both a configuration utility and a watchdog service. It completes the initial configuration of ARR – sets the load balancing algorithm to Weighted Round Robin and configures cookie-based affinity! The second important job of this application is to monitor the Azure environment for changes via the <a href="http://msdn.microsoft.com/en-us/library/microsoft.windowsazure.serviceruntime.roleenvironment.changed.aspx" target="_blank">RoleEnvironment.Changed</a> event. This event occurs when any change to the role environment happens – instances are added or removed, a configuration setting is changed, and so on. You can read more about handling Role Environment changes in <a href="http://brentdacodemonkey.wordpress.com/2011/09/24/leveraging-the-roleentrypoint-year-of-azure-week-12/" target="_blank">this excellent blog post</a>. When you happen to add more role instances (or remove any), all the ARR modules on all the instances must be re-configured to include all the VMs in the Web Farm. This is what the Session Affinity agent does by constantly monitoring the environment.</p> <p>With this setup there is now an ARR module installed on each of the instances. Each ARR module knows how many servers there are in total.
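The watchdog behavior described above – reacting to RoleEnvironment.Changed and re-configuring ARR when the topology changes – can be sketched like this (my simplified illustration, not the plugin's actual source; ReconfigureWebFarm is a hypothetical helper):

```csharp
using System;
using System.Linq;
using Microsoft.WindowsAzure.ServiceRuntime;

public class WorkerRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        // Fires on any environment change; we only care about topology
        // changes (instances added or removed).
        RoleEnvironment.Changed += (sender, e) =>
        {
            bool topologyChanged = e.Changes
                .OfType<RoleEnvironmentTopologyChange>()
                .Any();

            if (topologyChanged)
            {
                ReconfigureWebFarm();
            }
        };
        return base.OnStart();
    }

    private static void ReconfigureWebFarm()
    {
        // Hypothetical helper: enumerate the current instances and rewrite
        // the ARR / Web Farm Framework server list accordingly.
        foreach (RoleInstance instance in RoleEnvironment.CurrentRoleInstance.Role.Instances)
        {
            Console.WriteLine("Instance in farm: " + instance.Id);
        }
    }
}
```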
There is also a software load balancer (part of the Web Farm Framework), which also knows all the servers (role instances).</p> <p>These are the components in a single instance:</p> <p><img src="http://i.imgur.com/uurkDHO.png"></p> <p>Web requests going to port 80 are accepted by the local IIS site. It has a configured URL Rewrite rule which transfers the request to the local Web Farm Framework. The Web Farm Framework is aware of all the servers configured in the setup (all role instances). It checks whether an affinity cookie exists in the request. If such a cookie does not exist, a random server is chosen from the pool and a new cookie is created to keep track of which server was assigned to the user. The request is finally redirected internally to the Apache listening on port 8080. This information is synchronized across all servers that are part of the Web Farm. The next request will have the cookie and the user will be sent to the same server.</p> <p>Here is a simple flow diagram for a web request that goes over public port 80 to the cloud service deployed with the Session Affinity plugin:</p> <p><img src="http://i.imgur.com/gNMTvyc.png"></p> <p>The SessionAffinity4 plugin (the one that works with Windows Server 2012 / OS Family 3) has one configurable option: </p> <blockquote> <p><strong>Two10.WindowsAzure.Plugins.SessionAffinity4.ArrTimeOutSeconds</strong></p></blockquote> <p>As its name suggests, this is a timeout in seconds. A timeout for what? If we look at the flow chart we will see that there are a lot of things happening. The last one is waiting for a response from the Apache web server (on port 8080). This timeout indicates how long the ARR module will wait for a response from Apache before returning a timeout error (HTTP 50x). If you don't set a value for this setting, 180 seconds (3 minutes) is considered a reasonable time to wait. If you want, you can change this value in the Service Configuration file.
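For illustration, a sketch of how that setting might look in the ServiceConfiguration.cscfg (the service and role names here are made up; only the `Setting` element is taken from the post):

```xml
<ServiceConfiguration serviceName="MyService"
    xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
  <Role name="MyWorkerRole">
    <Instances count="2" />
    <ConfigurationSettings>
      <!-- Wait up to 5 minutes for the back-end server before returning HTTP 50x -->
      <Setting name="Two10.WindowsAzure.Plugins.SessionAffinity4.ArrTimeOutSeconds" value="300" />
    </ConfigurationSettings>
  </Role>
</ServiceConfiguration>
```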
For example, if you have some heavy pages or long-running operations you may want to increase the timeout. Be careful: the error page returned to the end user is a standard HTTP 500 error page! So it is better that your server never times out, or at least you should configure <strong>ArrTimeOutSeconds</strong> to a value greater than the processing time of your slowest page.</p> Anton Staykovhttp://www.blogger.com/profile/01413558839144725133noreply@blogger.com1tag:blogger.com,1999:blog-5177305310827978243.post-7287624983931648482013-07-23T17:34:00.001+02:002013-07-23T17:34:02.602+02:00Session Affinity and Windows Azure<p>Everybody speaks about the <a href="http://blogs.technet.com/b/microsoft_blog/archive/2013/06/24/partners-in-the-enterprise-cloud.aspx" target="_blank">recently announced partnership between Microsoft and Oracle on the Enterprise Cloud</a>. Java has been a <a href="http://msdn.microsoft.com/en-us/library/windowsazure/hh694271" target="_blank">first-class citizen for Windows Azure</a> for a while and was available via tools like <a href="https://github.com/RobBlackwell/AzureRunMe" target="_blank">AzureRunMe</a> even before that. Most of the customers I've worked with are using <a href="http://tomcat.apache.org/" target="_blank">Apache Tomcat</a> as a container for Java web applications. The biggest problem they face is that Apache Tomcat relies on Session Affinity. </p> <p>What is Session Affinity and why is it so important in Windows Azure? Let's rewind back a little to <a href="http://blogs.staykov.net/2012/03/windows-azure-basics-part-2-of.html" target="_blank">this post I've written</a>. Take a look at the abstracted network diagram:</p> <p><img src="http://i.imgur.com/XDHdnRm.jpg"></p> <p>So we have 2 (or more) servers that are responsible for handling web requests (Web Roles) and a Load Balancer (LB) in front of them. Developers have no control over the LB. And it uses one and only one load balancing algorithm – Round Robin.
This means that requests are evenly distributed across all the servers behind the LB. Let's go through the following scenario:</p> <ul> <li>I am web user X who opens the web application deployed in Azure. </li> <li>The Load Balancer (LB) redirects my web request to Web Role Instance 0. </li> <li>I submit a login form with user name and password. This is a second request. It goes to Web Role Instance 1. This server now creates a session for me and knows who I am. </li> <li>Next I click the "my profile" link. The request goes back to Web Role Instance 0. This server knows nothing about me and redirects me to the login page again! Or even worse – shows some error page.</li></ul> <p>This is what will happen if there is no Session Affinity. Session Affinity means that if I hit Web Role Instance 0 the first time, I will hit it every time after that. There is no Session Affinity provided by Azure! And in my personal opinion, Session Affinity does not fit well (does not fit at all) in the Cloud World. But sometimes we need it. And most of the time (if not in all cases), it is when we run non-.NET code on Azure. For .NET there are things like <a href="http://msdn.microsoft.com/en-us/library/aa478952.aspx" target="_blank">Session State Providers</a>, which make a developer's life easier! So the issue remains mainly for non-.NET servers (Apache, Apache Tomcat, etc.). </p> <p>So what to do when we want Session Affinity with non-.NET web servers? Use the <a href="https://github.com/richorama/AzurePluginLibrary/tree/master/plugins/SessionAffinity" target="_blank">SessionAffinity</a> or <a href="https://github.com/richorama/AzurePluginLibrary/tree/master/plugins/SessionAffinity4" target="_blank">SessionAffinity4</a> plugin. This is basically the same "product", but the first one is for use with Windows Server 2008 R2 (OS Family = 2) while the second one is for Windows Server 2012 (OS Family = 3).
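To make the round-robin problem above concrete, here is a small self-contained simulation (my own illustration, not from the post): a pure round-robin balancer sends consecutive requests to different instances, while a cookie-style sticky balancer pins a user to one instance.

```csharp
using System.Collections.Generic;

public class Balancer
{
    private readonly string[] servers;
    private int next; // round-robin cursor
    private readonly Dictionary<string, string> affinity =
        new Dictionary<string, string>();

    public Balancer(params string[] servers)
    {
        this.servers = servers;
    }

    // Pure round robin: picks the next server regardless of who the user is.
    public string RoundRobin()
    {
        string server = servers[next % servers.Length];
        next++;
        return server;
    }

    // Cookie-based affinity: the first request picks a server via round robin,
    // every later request carrying the same cookie lands on that same server.
    public string Sticky(string cookie)
    {
        string server;
        if (!affinity.TryGetValue(cookie, out server))
        {
            server = RoundRobin();
            affinity[cookie] = server;
        }
        return server;
    }
}
```

With two instances, the login request and the subsequent "my profile" request go through RoundRobin() and land on different instances – exactly the broken scenario above – while Sticky("user-x") returns the same instance on every call.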
</p> <p>I will explain in a next post what the architecture of these plugins is and how exactly they work.</p> Anton Staykovhttp://www.blogger.com/profile/01413558839144725133noreply@blogger.com0tag:blogger.com,1999:blog-5177305310827978243.post-65230130685067807532013-05-09T21:22:00.001+02:002013-05-09T21:25:53.259+02:00Active Directory in Azure – Step by Step<p>Ever since Windows Azure Infrastructure Services were announced in preview I keep hearing the questions "How do I run Active Directory in an Azure VM? And then join other computers to it?". This article assumes that you already know how to install and configure the Active Directory Domain Services role, promote a server to Domain Controller, join computers to a domain, <a href="http://msdn.microsoft.com/en-us/library/windowsazure/jj156074.aspx" target="_blank">Create and manage Azure Virtual Networks</a>, <a href="http://www.windowsazure.com/en-us/manage/windows/tutorials/virtual-machine-from-gallery/" target="_blank">Create and manage Azure Virtual Machines</a> and <a href="http://www.windowsazure.com/en-us/manage/services/networking/add-a-vm-to-a-virtual-network/" target="_blank">add them to a Virtual Network</a>.</p> <blockquote> <p><em>Disclaimer: Use this solution at your own risk. What I describe here is purely my practical observation and is based on repeatable reproduction. Things might change in the future.</em></p></blockquote> <p>The foundation pillar for my setup is the following (totally mine!) statement: <font color="#ff0000">The first Virtual Machine you create in an empty Virtual Network in Windows Azure will get the <strong>4th</strong> IP Address in the sub-net range. That means that if your sub-net address space is <strong>192.168.0.0/28</strong>, the very first VM to boot in that network will get IP Address <strong>192.168.0.4</strong>.
The given VM will always get this IP Address across intentional reboots, accidental restarts, system healing (hardware failure and VM re-instantiating) etc., as long as there is no other VM booting while that first one is down.</font></p> <p>First, let's create the virtual network. Given the knowledge from my foundation pillar, I will create a virtual network with two separate addressing spaces! One addressing space would be 192.168.0.0/29. This will be the addressing space for my Active Directory and Domain Controller. The second one will be 172.16.0.0/22. Here I will add my client machines. </p> <p>Next is one of the most important parts – assigning a DNS server for my Virtual Network. I will set the IP Address of my DNS server to 192.168.0.4! This is because I know (assume) the following:</p> <ul> <li>The very first machine in a sub-network will always get the 4th IP address from the allocated pool;</li> <li>I will place only my AD/DC/DNS server in my AD-designated network;</li></ul> <p>Now divide the network into address spaces as described and define the subnets. I use the following network configuration which you can import directly (however please note that you must have already created the <strong>AffinityGroup</strong> referred to in the network configuration!
Otherwise network creation will fail):</p><pre class="brush: xml;"><NetworkConfiguration <br /> xmlns:xsd="http://www.w3.org/2001/XMLSchema" <br /> xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" <br /> xmlns="http://schemas.microsoft.com/ServiceHosting/2011/07/NetworkConfiguration"><br /> <VirtualNetworkConfiguration><br /> <Dns><br /> <DnsServers><br /> <DnsServer name="NS" IPAddress="192.168.0.4" /><br /> </DnsServers><br /> </Dns><br /> <VirtualNetworkSites><br /> <VirtualNetworkSite name="My-AD-VNet" AffinityGroup="[Use Existing Affinity Group Name]"><br /> <AddressSpace><br /> <AddressPrefix>192.168.0.0/29</AddressPrefix><br /> <AddressPrefix>172.16.0.0/22</AddressPrefix><br /> </AddressSpace><br /> <Subnets><br /> <Subnet name="ADDC"><br /> <AddressPrefix>192.168.0.0/29</AddressPrefix><br /> </Subnet><br /> <Subnet name="Clients"><br /> <AddressPrefix>172.16.0.0/22</AddressPrefix><br /> </Subnet><br /> </Subnets><br /> </VirtualNetworkSite><br /> </VirtualNetworkSites><br /> </VirtualNetworkConfiguration><br /></NetworkConfiguration><br /></pre><br /><p>Now create a new VM from the gallery – picking your favorite OS Image. Assign it to sub-net <strong>ADDC</strong>. Wait for it to be provisioned. RDP to it. Add the Active Directory Domain Services server role. Configure AD. Add the DNS server role (this will be required by the AD Role). Ignore the warning that a DNS server requires a fixed IP Address. Do <strong>not</strong> change the network card settings! Configure everything, restart when asked. Promote the computer to Domain Controller. Voilà! Now I have a fully operational AD DS + DC.</p><br /><p>Let's add some clients to it. Create a new VM from the gallery. When prompted, add it to the <strong>Clients</strong> sub-net. When everything is ready and provisioned, log in to the VM (RDP). Change the system settings – Join a domain. Enter your configured domain name. Enter a domain administrator account when prompted. Restart when prompted. Voilà! 
Now my new VM is joined to my domain.</p><br /><p>Why does it work? Because I have:</p><br /><ul><br /><li>Defined the DNS address for my Virtual Network to be the IP Address 192.168.0.4</li><br /><li>Created a dedicated Address Space for my AD/DC, which is 192.168.0.0/29</li><br /><li>Placed my AD/DC designated VM in its dedicated address space</li><br /><li>Created a dedicated Address Space for client VMs, which does not overlap with the AD/DC designated Address Space</li><br /><li>Put client VMs only in their designated Address Space (sub-net) and never in the sub-net of the AD/DC</li></ul><br /><p>Of course, you will get the same result with a single Address Space and two sub-nets, as long as you are careful how you configure the DNS for the Virtual Network and which sub-net you put your AD and your Client VMs in.</p><br /><p>This scenario is validated, replayed, reproduced tens of times, and is being used in production environments in Windows Azure. However – use it at your own risk.</p> Anton Staykovhttp://www.blogger.com/profile/01413558839144725133noreply@blogger.com9tag:blogger.com,1999:blog-5177305310827978243.post-45465291226185657972013-05-08T00:58:00.001+02:002013-05-08T00:58:17.028+02:00Windows Azure Basics–Compute Emulator<p>Following the first two posts of the series “Windows Azure Basics” (<a href="http://blogs.staykov.net/2011/12/windows-azure-basics-part-1-of-n.html" target="_blank">general terms</a>, <a href="http://blogs.staykov.net/2012/03/windows-azure-basics-part-2-of.html" target="_blank">networking</a>) here comes another one. Interestingly enough, I find that a lot of people are confused about what exactly the compute emulator is and what those strange IP Addresses and port numbers are that we see in the browser when launching a local deployment. 
</p> <p>If you haven’t read <a href="http://blogs.staykov.net/2012/03/windows-azure-basics-part-2-of.html" target="_blank">Windows Azure Basics – part 2 Networking</a>, I strongly advise you to do so, as the rest of this post assumes you are well familiar with the networking components of a real Azure deployment.</p> <p>A real world Windows Azure deployment has the following important components:</p> <ul> <li>Public facing IP Address (VIP) <li>Load Balancer (LB) with Round Robin routing algorithm <li>Number of Virtual Machines (VM) representing each instance of each role, each with its own internal IP address (DIP – Direct IP Address) <li>Open ports on the VIP <li>Open ports on each VM</li></ul> <p>In order to provide developers with an environment as close to the real one as possible, the compute emulator needs to simulate all of these components. So let's take a look at what happens when we launch a Cloud Service (a.k.a. Hosted Service) locally.</p> <h2>VIP Address</h2> <p>The VIP address for our cloud service will be 127.0.0.1. That is the public IP Address (VIP) of the service, via which all requests to the service shall be routed.</p> <h2>Load Balancer</h2> <p>The next thing to simulate is the Azure Load Balancer. There is a small software-emulated Load Balancer, part of the Compute Emulator. You will not see it, you are not able to configure it, but you must be aware of its presence. It binds to the VIP (127.0.0.1). Now the trickiest thing is to find the appropriate ports to bind. You can configure different Endpoints for each of your roles. Only the<strong> Input Endpoints</strong> are exposed to the world, so only these will be bound to the local VIP (127.0.0.1). If you have a web role, the default web port is 80. However, very often this socket (127.0.0.1:80) is already occupied on a typical web development machine. So, the compute emulator tries to bind to the next available port, which is 81. 
In most cases port 81 will be free, so the "public" address for viewing/debugging will be <a href="http://127.0.0.1:81/">http://127.0.0.1:81/</a>. If port 81 is also occupied, the compute emulator will try the next one – 82, and so on, until it successfully binds to the socket (127.0.0.1:<strong>XX</strong>). So when we launch a cloud service project with a web role we will very often see the browser opening this weird address (<a href="http://127.0.0.1:81">http://127.0.0.1:81</a>). The process is the same for all Input Endpoints of the cloud service. Remember, the <strong>Input endpoints</strong> are unique per service, so an <strong>Input Endpoint</strong> cannot be shared by more than one Role within the same cloud service.</p> <p>Now that we have the load balancer launched and bound to the correct sockets, let's see how the Compute Emulator emulates multiple instances of a Role.</p> <h2>Web Role</h2> <p>Web Roles are web applications that run within IIS. For the web roles, the compute emulator uses IIS Express (and can be configured to use full IIS if it is installed on the developer machine). The Compute Emulator will create a dedicated virtual IP Address on the local machine for each instance of a role. These are the DIPs of the web role. A local DIP looks something like 127.255.0.0. Each local "instance" then gets the next IP address (i.e. 127.255.0.0, 127.255.0.1, 127.255.0.2 and so on). It is interesting that the IP Addresses begin at 0 (127.255.0.0). Then it will create a separate web site in IIS Express (local IIS) binding it to the created Virtual IP Address and port 82. The emulated load balancer will then use round robin to route all requests coming to 127.0.0.1:81 to these virtual IP Addresses. 
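<p>The port-probing behavior described here is easy to picture in code. The following is an illustrative C# sketch of the idea (my own toy code, not the emulator's actual implementation): try to bind the requested port on the VIP and, if it is busy, walk to the next one:</p>

```csharp
using System;
using System.Net;
using System.Net.Sockets;

public class PortWalkDemo
{
    // Sketch of "port walking": attempt to bind the preferred port; on failure,
    // keep trying the next port until a bind succeeds.
    public static int FindFreePort(IPAddress vip, int preferredPort)
    {
        for (int port = preferredPort; port <= IPEndPoint.MaxPort; port++)
        {
            var listener = new TcpListener(vip, port);
            try
            {
                listener.Start();   // bind succeeded - this port is free
                listener.Stop();
                return port;
            }
            catch (SocketException)
            {
                // port is busy - "walk" to the next one
            }
        }
        throw new InvalidOperationException("No free port found.");
    }

    public static void Main()
    {
        int port = FindFreePort(IPAddress.Loopback, 80);
        Console.WriteLine("Emulated LB would bind to 127.0.0.1:" + port);
    }
}
```

<p>On a machine where port 80 is already taken (by IIS, for example), a probe like this lands on 81 – which is exactly the address the emulator ends up showing in the browser.</p>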
</p> <blockquote> <p><em>Note: You will not see the DIP virtual address when you run the <strong>ipconfig</strong> command</em>.</p></blockquote> <p>Here is how my IIS Express looks when I have my cloud service launched locally:</p> <p><img src="http://i.imgur.com/BHd9qrp.png"></p> <h2>Worker role</h2> <p>This one is easier. The DIP Addressing is the same, however the compute emulator does not need IIS (nor IIS Express). It just launches the worker role code in separate processes, one for each instance of the worker role.</p> <h2>The emulator UI</h2> <p>When you launch a local deployment, the Compute Emulator and Storage Emulator are launched. You can bring up the Compute Emulator UI by right-clicking on the small azure-colored Windows icon in the tray area:</p> <p><img src="http://i.imgur.com/jU5d040.png"></p> <p>For the purpose of this post I've created a sample Cloud Service with a Web Role (2 instances) and a Worker Role (3 instances). Here is the Compute Emulator UI for my service. And if I click on "Service Details" I will see the "public" addresses for my service:</p> <p><img src="http://i.imgur.com/6GyP8oE.png"></p> <h2>Known issues</h2> <p>One very common issue is the so-called <strong><em>port walking</em></strong>. As I already described, the compute emulator tries to bind to the requested port. If that port isn't available, it tries the next one and so on. This behavior is known as "port walking". Under certain conditions we may see port walking even between consecutive runs of the same service – i.e. on the first run the compute emulator binds to 127.0.0.1:81, on the next run it binds to 127.0.0.1:82. The reasons vary, but the obvious one is "the port is busy by another process". Sometimes the Windows OS does not free up the port fast enough, so port 81 seems busy to the compute emulator. It then goes for the next port. So don't be surprised if you see different ports when debugging your cloud service. 
It is normal.</p> <p>Another issue is that sometimes the browser opens the DIP Address (<a href="http://127.255.0.X:82/">http://127.255.0.X:82/</a>) instead of the VIP one (<a href="http://127.0.0.1:81/">http://127.0.0.1:81/</a>). I haven't been able to find a pattern for that behavior, but if you see a DIP when you debug your web roles, switch manually to the VIP. It is important to always use our service via the VIP address, because this way we also test our application's cloud readiness (distributing calls amongst all instances, instead of just one). If the problem persists, try restarting Visual Studio, the Compute Emulator or the computer itself. If the issue still persists, open a question at <a href="http://stackoverflow.com/questions/tagged/azure" target="_blank">StackOverflow</a> or the <a href="http://social.msdn.microsoft.com/Forums/en-US/windowsazuredevelopment/threads" target="_blank">MSDN Forum</a> describing the exact configuration you have, ideally providing a Visual Studio solution that consistently reproduces the problem. I would also be interested in seeing a consistently repeatable issue. </p> <blockquote> <p><em><strong>Tip for the post</strong>: If you want to change the development VIP address ranges (so that it does not use 127.0.0.1) you can check out the following file:</em></p> <p><em>%ProgramFiles%\Microsoft SDKs\Windows Azure\Emulator\devfabric\DevFC.exe.config</em></p> <p><em>DevFC stands for "Development Fabric Controller". But please be careful with what you do with this file. 
Always make a backup of the original configuration before you change any setting!</em></p></blockquote> <p>Happy Azure coding!</p> Anton Staykovhttp://www.blogger.com/profile/01413558839144725133noreply@blogger.com2tag:blogger.com,1999:blog-5177305310827978243.post-86142214513221354712013-04-08T12:09:00.001+02:002013-04-08T12:09:58.818+02:00Bending the Windows Azure Media Services–H.264 Baseline profile<blockquote> <p>Disclaimer: What I will describe here is not officially supported by Microsoft and by Windows Azure Media Services. This means that if a task fails you cannot open a support ticket, nor can you complain. I discovered this hidden feature by digging deeply into the platform. Use the code and task preset at your own risk and responsibility. And note that what works now may not work tomorrow.</p></blockquote> <p>Exploring the boundaries of <a href="http://www.windowsazure.com/en-us/home/scenarios/media/" target="_blank">Windows Azure Media Services</a> (WAMS), and following questions on <a href="http://stackoverflow.com/questions/15870198/h-264-baseline-profile-azure-media-services" target="_blank">StackOverflow</a> and the respective <a href="http://social.msdn.microsoft.com/Forums/en-US/MediaServices/thread/95ec8895-4a73-4a0c-8505-3ca5d8bbe13e" target="_blank">MSDN Forums</a>, it appears that WAMS previously supported H.264 Baseline Profile and had a task preset for it. But now it only has Main Profile and High Profile <a href="http://msdn.microsoft.com/en-us/library/windowsazure/jj129582.aspx" target="_blank">task presets</a>. And because the official documentation says that <a href="http://msdn.microsoft.com/en-us/library/windowsazure/hh973634.aspx#export_formats" target="_blank">Baseline Profile is a supported output format</a>, I don’t see anything wrong in exploring how to achieve that.</p> <p>So what can we do to encode a video into H.264 Baseline profile if we really want to? 
Well, use the following Task Preset at your own will (and risk <img class="wlEmoticon wlEmoticon-smile" style="border-top-style: none; border-left-style: none; border-bottom-style: none; border-right-style: none" alt="Smile" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjwBurh_EoKKxOpqc3JJgtRXyPKgBHrMCkvfzARZiF7RZmY2I1jtizcv_jQwwHW3L6XjL_ep1LlwdZbVs45gKfOJdXJGp4YcIpt-c27W6FygOlIAUoAsupnG67fydcQr7J6C39gqe5Hdzq8/?imgmax=800"> ):</p><pre class="brush: xml;"><?xml version="1.0" encoding="utf-16"?><br /><!--Created with Expression Encoder version 4.0.4276.0--><br /><Preset<br /> Version="4.0"><br /> <Job /><br /> <MediaFile<br /> WindowsMediaProfileLanguage="en-US"<br /> VideoResizeMode="Letterbox"><br /> <OutputFormat><br /> <MP4OutputFormat<br /> StreamCompatibility="Standard"><br /> <VideoProfile><br /> <BaselineH264VideoProfile<br /> RDOptimizationMode="Speed"<br /> HadamardTransform="False"<br /> SubBlockMotionSearchMode="Speed"<br /> MultiReferenceMotionSearchMode="Speed"<br /> ReferenceBFrames="True"<br /> AdaptiveBFrames="True"<br /> SceneChangeDetector="True"<br /> FastIntraDecisions="False"<br /> FastInterDecisions="False"<br /> SubPixelMode="Quarter"<br /> SliceCount="0"<br /> KeyFrameDistance="00:00:05"<br /> InLoopFilter="True"<br /> MEPartitionLevel="EightByEight"<br /> ReferenceFrames="4"<br /> SearchRange="32"<br /> AutoFit="True"<br /> Force16Pixels="False"<br /> FrameRate="0"<br /> SeparateFilesPerStream="True"<br /> SmoothStreaming="False"<br /> NumberOfEncoderThreads="0"><br /> <Streams<br /> AutoSize="False"<br /> FreezeSort="False"><br /> <StreamInfo><br /> <Bitrate><br /> <ConstantBitrate<br /> Bitrate="4000"<br /> IsTwoPass="False"<br /> BufferWindow="00:00:04" /><br /> </Bitrate><br /> </StreamInfo><br /> </Streams><br /> </BaselineH264VideoProfile><br /> </VideoProfile><br /> <AudioProfile><br /> <AacAudioProfile<br /> Level="AacLC"<br /> Codec="AAC"<br /> Channels="2"<br /> BitsPerSample="16"<br /> 
SamplesPerSecond="44100"><br /> <Bitrate><br /> <ConstantBitrate<br /> Bitrate="160"<br /> IsTwoPass="False"<br /> BufferWindow="00:00:00" /><br /> </Bitrate><br /> </AacAudioProfile><br /> </AudioProfile><br /> </MP4OutputFormat><br /> </OutputFormat><br /> </MediaFile><br /></Preset><br /></pre><br /><p>You can quickly check whether it works for you by using the <a href="https://github.com/RobBlackwell/MediaServicesCommandLineTools/tree/master/RunTask" target="_blank">RunTask</a> command-line tool, part of the <a href="https://github.com/RobBlackwell/MediaServicesCommandLineTools" target="_blank">MediaServicesCommandLineTools</a> project. The <a href="https://github.com/RobBlackwell/MediaServicesCommandLineTools/blob/master/etc/H264_BaselineProfile.xml" target="_blank">H264_BaselineProfile.xml</a> is provided for reference in the etc folder of the project. You can tweak the Audio and Video bitrates at will by editing the XML.</p> Anton Staykovhttp://www.blogger.com/profile/01413558839144725133noreply@blogger.com0tag:blogger.com,1999:blog-5177305310827978243.post-41668753918302088022013-04-06T16:09:00.001+02:002013-04-06T16:09:40.503+02:00Federated Authentication–Mobile Login Page for Microsoft Live Id<p>Say you are developing a web site, which will have desktop users, mobile users, all kinds of users. Because you respect your users, you let them log in to your site using their existing credentials. One of which happens to be Microsoft Account (formerly known as Microsoft Live ID). Also, because you really enjoy the Windows Azure platform and the fact that Azure Access Control Service is totally free with no catch, you implemented your federated login using Azure ACS. You also implemented a custom login page for your users. </p> <p>Now you noticed that Microsoft Account does not recognize mobile users 100% and you have better logic for determining mobile user agents. 
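<p>For illustration, such "better logic" could be as simple as the C# helper below. The class name and the pattern list are mine and deliberately simplistic – a real implementation would use a maintained device database rather than a hand-rolled regex:</p>

```csharp
using System;
using System.Text.RegularExpressions;

public static class MobileDetector
{
    // Illustrative (not exhaustive) set of markers commonly found in
    // mobile browser user-agent strings.
    static readonly Regex MobilePattern = new Regex(
        @"(iphone|ipod|android|blackberry|windows phone|opera mini|symbian)",
        RegexOptions.IgnoreCase | RegexOptions.Compiled);

    public static bool IsMobile(string userAgent)
    {
        return !string.IsNullOrEmpty(userAgent) && MobilePattern.IsMatch(userAgent);
    }
}
```

<p>In an ASP.NET application you would feed it Request.UserAgent and pick the login URL accordingly.</p>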
You also want to forcibly redirect your mobile users to the mobile login page for Microsoft Account. But how?</p> <p>Well, since you already implemented a custom login page, you already know what this URL is:</p> <p><a href="https://[namespace].accesscontrol.windows.net/v2/metadata/IdentityProviders.js?protocol=wsfederation&realm=[realm]&reply_to=[reply_to]&context=&request_id=&version=1.0&callback">https://[namespace].accesscontrol.windows.net/v2/metadata/IdentityProviders.js?protocol=wsfederation&realm=[realm]&reply_to=[reply_to]&context=&request_id=&version=1.0&callback</a>= <p>This is the URL where you get the JSON feed of registered Identity Providers for your relying party application. When you retrieve it, you have a LoginUrl for Live ID looking similar to this one: <p><a href="https://login.live.com/login.srf?wa=wsignin1.0&wtrealm=https%3a%2f%2faccesscontrol.windows.net%2f&wreply=https%3a%2f%2f[namespace].accesscontrol.windows.net%2fv2%2fwsfederation&wp=MBI_FED_SSL&wctx=[encrypted">https://login.live.com/login.srf?wa=wsignin1.0&wtrealm=https%3a%2f%2faccesscontrol.windows.net%2f&wreply=https%3a%2f%2f[namespace].accesscontrol.windows.net%2fv2%2fwsfederation&wp=MBI_FED_SSL&wctx=[encrypted</a>] <p>Now, you can add one more parameter to the query string to force a very lightweight (mobile) login page for Microsoft Account. This parameter is <strong><em><font color="#ff0000">pcexp</font></em></strong> and the value should be <strong><em><font color="#ff0000">false</font></em></strong>. 
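<p>Appending the parameter is a one-liner. Here is a minimal C# sketch (the helper name is mine; the LoginUrl is the one read from the JSON feed above, which in practice always carries a query string already):</p>

```csharp
using System;

public static class LiveIdLoginUrl
{
    // Sketch: force the lightweight (mobile) Live ID page by appending
    // pcexp=false to the LoginUrl taken from the ACS identity-provider feed.
    public static string ForceMobile(string loginUrl)
    {
        if (string.IsNullOrEmpty(loginUrl)) throw new ArgumentNullException("loginUrl");
        // Handle the bare case too, even though the ACS LoginUrl
        // normally already contains wa, wtrealm, wreply, etc.
        string separator = loginUrl.Contains("?") ? "&" : "?";
        return loginUrl + separator + "pcexp=false";
    }
}
```

<p>For mobile user agents you would redirect to the rewritten URL instead of the original one.</p>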
So now your LoginUrl for Microsoft Account (Live ID) will look similar to this one: <p><a href="https://login.live.com/login.srf?wa=wsignin1.0&wtrealm=https%3a%2f%2faccesscontrol.windows.net%2f&wreply=https%3a%2f%2f[namespace].accesscontrol.windows.net%2fv2%2fwsfederation&wp=MBI_FED_SSL&wctx=[encrypted]&pcexp=false">https://login.live.com/login.srf?wa=wsignin1.0&wtrealm=https%3a%2f%2faccesscontrol.windows.net%2f&wreply=https%3a%2f%2f[namespace].accesscontrol.windows.net%2fv2%2fwsfederation&wp=MBI_FED_SSL&wctx=[encrypted]<b><font color="#ff0000">&pcexp=false</font></b></a> <p>That’s perfect! It works! Thanks! <p>But… but you also have a WML version of your site. And you recognize and respect these user agents too. Well, there is a solution to this issue too. The solution is to replace the whole domain and login page, but keep the query string intact. So, if the original login Url is this: <p><a href="https://login.live.com/login.srf?wa=wsignin1.0&wtrealm=https%3a%2f%2faccesscontrol.windows.net%2f&wreply=https%3a%2f%2f[namespace].accesscontrol.windows.net%2fv2%2fwsfederation&wp=MBI_FED_SSL&wctx=[encrypted"><strong><font color="#ff0000">https://login.live.com/login.srf?</font></strong>wa=wsignin1.0&wtrealm=https%3a%2f%2faccesscontrol.windows.net%2f&wreply=https%3a%2f%2f[namespace].accesscontrol.windows.net%2fv2%2fwsfederation&wp=MBI_FED_SSL&wctx=[encrypted</a>] <p>Replace <strong><font color="#ff0000">login.live.com/login.srf?</font></strong> with <font color="#ff0000">mid.live.com/si/login.aspx?</font>. 
The result is: <p><a href="https://mid.live.com/si/login.aspx?wa=wsignin1.0&wtrealm=https%3a%2f%2faccesscontrol.windows.net%2f&wreply=https%3a%2f%2f[namespace].accesscontrol.windows.net%2fv2%2fwsfederation&wp=MBI_FED_SSL&wctx=[encrypted"><strong><font color="#ff0000">https://mid.live.com/si/login.aspx?</font></strong>wa=wsignin1.0&wtrealm=https%3a%2f%2faccesscontrol.windows.net%2f&wreply=https%3a%2f%2f[namespace].accesscontrol.windows.net%2fv2%2fwsfederation&wp=MBI_FED_SSL&wctx=[encrypted</a>] <p>Done. Happy coding! <p>Please respect your users and their existing online identities! Do not ask them to create new username/password combinations if they don’t explicitly want to! Anton Staykovhttp://www.blogger.com/profile/01413558839144725133noreply@blogger.com0tag:blogger.com,1999:blog-5177305310827978243.post-55616988093168542832013-04-05T12:06:00.001+02:002013-04-05T12:08:01.684+02:00Bending the Azure Media Services – clip or trim your media files<blockquote> <p>Disclaimer: What I will describe here is not officially supported by Microsoft and by Windows Azure Media Services. This means that if a task fails you cannot open a support ticket, nor can you complain. I discovered this hidden feature by digging deeply into the platform. Use the code and task preset at your own risk and responsibility. And note that what works now may not work tomorrow.</p></blockquote> <p>So, we have <a href="http://www.windowsazure.com/en-us/home/features/media-services/" target="_blank">Windows Azure Media Services</a>, which can transcode (convert from one video/audio format to another), package and deliver content. How about more advanced operations, such as clipping or trimming? I want, let’s say, to cut off the first 10 seconds of my video. And the last 5 seconds. Can I do it with Windows Azure Media Services? 
Yes I can, today (5 April 2013).</p> <p>The easiest way to start with Media Services is by using the <a href="https://github.com/RobBlackwell/MediaServicesCommandLineTools" target="_blank">MediaServicesCommandLineTools</a> project from GitHub. It has a very neat program – <a href="https://github.com/RobBlackwell/MediaServicesCommandLineTools/tree/master/RunTask" target="_blank">RunTask</a>. It expects two parameters: a partial (last N characters) Asset Id and the path to a task preset. It will then display a list of available Media Processors to execute the task with. You choose the Media Processor and you are done! </p> <p>So which task preset is for Clipping or Trimming? You will not find that type of task on the list of <a href="http://msdn.microsoft.com/en-us/library/windowsazure/hh973619.aspx" target="_blank">Task Presets for Azure Media Services</a>. But you will find a couple of interesting task presets in the <a href="https://github.com/RobBlackwell/MediaServicesCommandLineTools" target="_blank">MediaServicesCommandLineTools</a> project under the <a href="https://github.com/RobBlackwell/MediaServicesCommandLineTools/tree/master/etc" target="_blank">etc</a> folder. Let's take a look at <a href="https://github.com/RobBlackwell/MediaServicesCommandLineTools/blob/master/etc/Clips.xml" target="_blank">Clips.xml</a>:</p><pre class="brush: xml;"><?xml version="1.0" encoding="utf-16"?><br /><!--Created with Expression Encoder version 4.0.4276.0--><br /><Preset<br /> Version="4.0"><br /> <Job /><br /> <MediaFile><br /> <Sources><br /> <Source<br /> AudioStreamIndex="0"><br /> <Clips><br /> <Clip<br /> StartTime="00:00:04"<br /> EndTime="00:00:10" /><br /> </Clips><br /> </Source><br /> </Sources><br /> </MediaFile><br /></Preset><br /></pre><br /><p>It is a very simple XML file with two attribute values that are interesting for us. Namely <strong>StartTime</strong> and <strong>EndTime</strong>. These attributes define the points in time where to start clipping and where to end it. 
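<p>For completeness, submitting such a preset through the Media Services .NET SDK might look like the sketch below. The job and asset names are placeholders of mine, and – per the disclaimer above – clipping presets are not officially supported, so the task may simply fail:</p>

```csharp
// Sketch using the Windows Azure Media Services .NET SDK (legacy API).
using System.IO;
using System.Linq;
using Microsoft.WindowsAzure.MediaServices.Client;

class ClipJob
{
    static void SubmitClipTask(CloudMediaContext context, IAsset inputAsset)
    {
        // Windows Azure Media Encoder understands Expression Encoder
        // style presets such as Clips.xml
        IMediaProcessor encoder = context.MediaProcessors
            .Where(p => p.Name == "Windows Azure Media Encoder")
            .ToList()
            .OrderByDescending(p => p.Version)
            .First();

        string preset = File.ReadAllText("Clips.xml"); // the preset shown above

        IJob job = context.Jobs.Create("Clip 00:00:04 - 00:00:10");
        ITask task = job.Tasks.AddNew("Clipping task", encoder, preset, TaskOptions.None);
        task.InputAssets.Add(inputAsset);
        task.OutputAssets.AddNew("Clipped output", AssetCreationOptions.None);
        job.Submit();
    }
}
```

<p>This mirrors what RunTask does for you from the command line.</p>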
With the given settings (StartTime: 00:00:04, EndTime: 00:00:10) the resulting media asset will be a video clip with a length of 6 seconds, which starts at the 4th second of the original clip and ends at the 10th second of the original.</p><br /><p>As you can also see, I haven’t removed an important comment in the XML – "Created with Expression Encoder version 4.0.4276.0". Yes, I used Expression Encoder 4 Pro to create a custom job preset. You can try that too!</p><br /><p>Stay tuned for more “media services bending tips”.</p> Anton Staykovhttp://www.blogger.com/profile/01413558839144725133noreply@blogger.com3tag:blogger.com,1999:blog-5177305310827978243.post-57383583338430297902013-04-04T09:28:00.001+02:002013-04-04T21:49:15.686+02:00Identity Federation and Sign-Out<p>We live in the 21st century, don’t we? I am a firm believer that from now on no user shall ever create a new username/password combination again. Ever! There are already enough existing online identity providers – such as Google, Yahoo, Facebook, Microsoft Account (formerly known as Live ID), Office365, OpenId, Twitter, LinkedIn, national identity providers such as <a href="https://nemlog-in.dk/visomlogin.aspx" target="_blank">NemID</a> in Denmark, and so on, and so on.</p> <p>I do believe that every single internet user has a profile with at least one of these Identity Providers. And if you, dear reader, do not have any existing online profile, please do leave a comment, but be honest!</p> <p>All of us – developers, architects, decision makers – by all means we shall respect this fact!</p> <p>I do respect it. In every single project I face I do my best to convince decision makers that it is always better to respect users and give them the opportunity to use an existing online identity when there is a need to protect some parts of the application we develop. 
And the way I do it is by evangelizing <a href="http://www.windowsazure.com/en-us/manage/services/other/manage-acs/" target="_blank">Windows Azure Access Control Service</a>, which is now part of <a href="http://www.windowsazure.com/en-us/home/features/identity/" target="_blank">Windows Azure Active Directory</a>. I’ve written a number of articles on that subject (<a href="http://blogs.staykov.net/2012/05/introduction-to-claims.html" target="_blank">Introduction to Claims</a>, <a href="http://blogs.staykov.net/2012/05/secure-your-asmx-webservices-with-swt.html" target="_blank">Securing ASMX web services with Claims and SWT tokens</a>, <a href="https://www.simple-talk.com/cloud/security-and-compliance/online-identity-management-via-windows-azure-access-control-service/" target="_blank">Online Identity Management via Windows Azure ACS</a>, <a href="https://www.simple-talk.com/cloud/development/unified-identity-for-web-apps-8211-the-easy-way/" target="_blank">Unified Identity for Web Apps – the easy way</a>, <a href="https://www.simple-talk.com/cloud/development/creating-a-custom-login-page-for-federated-authentication-with-windows-azure-acs/" target="_blank">Creating custom login page for Federated Authentication with Windows Azure ACS</a>) and yet I see people unaware of this service who want to implement their own ASP.NET Membership Provider.</p> <p>I also see people willing to embrace the service. They make their way through the <a href="http://visualstudiogallery.msdn.microsoft.com/e21bf653-dfe1-4d81-b3d3-795cb104066e" target="_blank">Identity and Access Tool for Visual Studio</a> 2012 and create their first web application with federated login. While the tool is great at its core – it does what it is supposed to do – it hides a lot of process information and does not give you a complete log of what it did. 
There is one very neat option – create a local Controller with custom Login View:</p> <p><img src="https://public.bn1.livefilestore.com/y1pVGudLpUp9Yqr0GDPJAkq3iZ7ec6HVN1JXyOmtfY2beZfxYpD1AQKJ1IW_6IOOoScPFWlx7U_J08UkVp42aypFg/IdentityAndAccessTool_Configuration_CustomLogin.png?psid=1"></p> <p>While this option is great, it misses one very core feature – the <strong>log off</strong> feature! So you happily created your federated sign-in, configured Identity Providers, etc. Now you log in to test. Next you click the default <strong>[log off]</strong> link in your web app. And … you are still logged in! "What the heck?", you will ask.</p> <p>Well, when using Federated Log-in, we also have to use a Federated Log-Off (or Sign Out). For this, we have to edit our default log-off method and add one single line. Imagine the default Log Off method looks like:</p><pre class="brush: csharp;">[HttpPost] <br />[ValidateAntiForgeryToken] <br />public ActionResult LogOff() <br />{ <br /> WebSecurity.Logout(); <br /> return RedirectToAction("Index", "Home"); <br />}<br /></pre><br /><p>We only have to add:</p><pre class="brush: csharp;"> FederatedAuthentication.WSFederationAuthenticationModule.SignOut();<br /></pre><br /><p>So the final Log Off will be like this:</p><pre class="brush: csharp;">[HttpPost]<br />[ValidateAntiForgeryToken]<br />public ActionResult LogOff()<br />{<br /> WebSecurity.Logout();<br /> FederatedAuthentication.WSFederationAuthenticationModule.SignOut();<br /> return RedirectToAction("Index", "Home");<br />}<br /></pre><br /><p>And voilà! We are done. Now we can also successfully log off the web application. 
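<p>A possible refinement (my own sketch, not part of the sample above): if you also want the user signed out at the identity provider's side, with the browser redirected back to your site afterwards, you can use the static FederatedSignOut helper instead. The action name below is hypothetical:</p>

```csharp
using System;
using System.IdentityModel.Services;
using System.Web.Mvc;

public partial class AccountController : Controller
{
    [HttpPost]
    [ValidateAntiForgeryToken]
    public void LogOffEverywhere()
    {
        var module = FederatedAuthentication.WSFederationAuthenticationModule;
        string issuer = module.Issuer;   // the STS configured in web.config
        string returnUrl = Url.Action("Index", "Home", null, Request.Url.Scheme);

        // Clears the local session and issues a wsignout1.0 request to the
        // STS, which redirects the browser back to returnUrl when done.
        WSFederationAuthenticationModule.FederatedSignOut(
            new Uri(issuer), new Uri(returnUrl));
    }
}
```

<p>Whether the identity provider itself honors the sign-out request is up to that provider.</p>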
Note that the <a href="http://msdn.microsoft.com/en-us/library/system.identitymodel.services.federatedauthentication.aspx" target="_blank">FederatedAuthentication</a> type is part of the <a href="http://msdn.microsoft.com/en-us/library/system.identitymodel.services.aspx" target="_blank">System.IdentityModel.Services</a> assembly and you must add a reference to it.</p><br /><p>A couple of things to pay attention to and remember:</p><br /><ul><br /><li>The Identity and Access menu item (a result of the Identity and Access tool installation) will <strong>only</strong> be visible for web projects targeting the <strong>4.5 Framework</strong>! <br /><li>You have to reference <a href="http://msdn.microsoft.com/en-us/library/system.identitymodel.services.aspx" target="_blank">System.IdentityModel.XX (4.0.0.0)</a> assemblies and not <a href="http://msdn.microsoft.com/en-us/library/microsoft.identitymodel.aspx" target="_blank">Microsoft.IdentityModel.XX (3.5.0.0)</a> assemblies in your project. If you fail to do so, you may see unexpected behavior and even errors and failures. Very often, if you upgrade your project from a .NET Framework version prior to 4.5 to .NET Framework 4.5, there are references left to Microsoft.IdentityModel.XX – remove them explicitly! <br /><li>Do respect your users’ existing online identities! The users will respect you, too!</li></ul> Anton Staykovhttp://www.blogger.com/profile/01413558839144725133noreply@blogger.com2tag:blogger.com,1999:blog-5177305310827978243.post-57906460388966907202013-04-03T14:05:00.001+02:002013-04-03T14:05:17.119+02:00A journey with Windows Azure Media Services–Smooth Streaming, HLS<p>Back in January Scott Gu <a href="http://weblogs.asp.net/scottgu/archive/2013/01/22/announcing-release-of-windows-azure-media-services.aspx" target="_blank">announced</a> the official release of <a href="http://www.windowsazure.com/en-us/home/scenarios/media/" target="_blank">Windows Azure Media Services</a>. 
It is an amazing platform that was out in the wild (as a CTP, or Community Technology Preview) for less than a year. Before it was RTW, I created a small project to demo its functionality. The source code is public on <a href="https://github.com/astaykov/WaMediaWeb" target="_blank">GitHub</a> and the live site is public on <a href="http://wamediaweb.azurewebsites.net/" target="_blank">Azure Web Sites</a>. I actually linked my GitHub repo with the Website on Azure so that every time I push to the Master branch, I get a new deployment on the WebSite. Pretty neat!</p> <p>In its current state Windows Azure Media Services supports the VOD (or Video On Demand) scenario only. Meaning that you can upload your content (also known as <strong><em>ingest</em></strong>), convert it into various formats, and deliver it to your audience on demand. What you cannot currently do is publish Live Streaming – i.e. from your Web Cam, or from your Studio.</p> <p>This blog post will provide no direct code samples. Rather than code samples, my aim is to outline the valid workflows for achieving different goals. For code samples you can take a look at the <a href="http://www.windowsazure.com/en-us/develop/media-services/" target="_blank">official getting started guide</a>, <a href="https://github.com/astaykov/WaMediaWeb" target="_blank">my web project code</a>, or the <a href="https://github.com/RobBlackwell/MediaServicesCommandLineTools" target="_blank">MediaServicesCommandLineTools project on GitHub</a>, which I also contribute to.</p> <p>With the current proposition from Azure Media Services you can encode your media assets into ISO-MP4 / H.264 (AVC) video with AAC-LC Audio, the <a href="http://www.iis.net/downloads/microsoft/smooth-streaming" target="_blank">Smooth Streaming</a> format to deliver the greatest experience to your users, or even the <a href="https://developer.apple.com/resources/http-streaming/" target="_blank">Apple HTTP Live Streaming format</a> (or just HLS). 
All from the comfort of your chair at home or in the office, without a big spend on expensive hardware. Getting the results, however, can sometimes be tricky, and the platform does not help you with very detailed error messages (which I hope will change in the very near future).</p> <p>You can sometimes achieve the same task (goal) in different ways. Windows Azure Media Services currently works with 4 media processors:</p> <ul> <li>Windows Azure Media Encryptor </li> <li>Windows Azure Media Encoder</li> <li>Windows Azure Media Packager</li> <li>Storage Decryption</li></ul> <p>When you want to complete a task, you always provide a <strong><em>task preset</em></strong> and a <strong><em>media processor</em></strong> that will complete the given task. It is really important to pay attention to this detail, because giving a task preset to the wrong processor will end in an error and task failure.</p> <h3>So, how do we get (create/encode to) Smooth Streaming content?</h3> <p>Say we have an MP4 video source – H.264 (AVC) video codec + AAC-LC audio codec. It is best if we have multiple MP4 files representing the same content at different bitrates. Now we can use the <strong><em>Windows Azure Media Packager</em></strong> and the <a href="http://msdn.microsoft.com/en-us/library/windowsazure/hh973635.aspx" target="_blank">MP4 To Smooth Streams task preset</a>.</p> <p>If we don’t have an MP4 source, but we have any other <a href="http://msdn.microsoft.com/en-us/library/windowsazure/hh973634.aspx#import_formats" target="_blank">supported import format</a> (unfortunately MOV is not a supported format), we can use the <strong><em>Windows Azure Media Encoder</em></strong> to transcode our media into either a single MP4 (H.264) file, or directly into a Smooth Streaming source. 
<a href="http://msdn.microsoft.com/en-us/library/windowsazure/jj129582.aspx" target="_blank">Here is a full list of the short-named task presets</a> that can be used with the Windows Azure Media Encoder. To directly create a Smooth Streaming asset, we can use any of the <strong><em>VC1 Smooth Streaming XXX</em></strong> task presets, or any of the <strong><em>H264 Smooth Streaming XXX</em></strong> task presets. That will generate a Smooth Streaming asset encoded with either the VC-1 video profile or the H.264 (AVC) video codec.</p> <h3>OK, how about Apple HTTP Live Streaming (or HLS)?</h3> <p>Well, Apple HLS is similar to Smooth Streaming. However, there is a small detail: it only supports the H.264 video codec! The most standard way of creating an Apple HLS asset is by using the <strong>Windows Azure Media Packager</strong> and the XML task preset for “<a href="http://msdn.microsoft.com/en-us/library/windowsazure/hh973635.aspx" target="_blank">Convert Smooth Streams to Apple HTTP Live Streams</a>”. Please take note of the media processor – it is the Windows Azure Media Packager. This task will also only accept as input a valid Smooth Streaming asset encoded with the H.264 (AVC) video codec! Do not forget that you could have created Smooth Streams with the <strong>VC-1 video profile</strong> codec, which are perfectly valid, playable Smooth Streams, but they will fail to convert to Apple HTTP Live Streams.</p> <h3>Hm, can’t we get all-in-one?</h3> <p>I mean, can’t I have a single media asset and deliver either Apple HTTP Live Streams or Smooth Streams, depending on my client? Sure we can. However, this is a CPU-intensive process called “<strong>dynamic packaging</strong>”. The source must be a multi-bitrate MP4 asset, which consists of multiple MP4 files of the same content at different bitrates. And it requires on-demand streaming reserved units from Media Services. 
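</p><p>As an illustration of the processor + task preset pairing described above, here is a minimal sketch using the Media Services .NET SDK of that era. The account credentials, the choice of input asset, and the “H264 Smooth Streaming 720p” preset string are assumptions for the example, so verify them against the SDK documentation:</p><pre class="brush: csharp;">
using System.Linq;
using Microsoft.WindowsAzure.MediaServices.Client;

// Connect to the Media Services account (name/key are placeholders).
CloudMediaContext context = new CloudMediaContext("accountName", "accountKey");

// Pick the processor by name - pairing the preset with the wrong
// processor is exactly what makes a task fail.
IMediaProcessor encoder = context.MediaProcessors
    .Where(p => p.Name == "Windows Azure Media Encoder")
    .ToList()
    .OrderByDescending(p => p.Version)
    .First();

// Create a job with a single task: the preset string tells the
// processor what to produce (here, an H.264 Smooth Streaming asset).
IJob job = context.Jobs.Create("Encode to Smooth Streaming");
ITask task = job.Tasks.AddNew("Encoding task",
    encoder,
    "H264 Smooth Streaming 720p",
    TaskOptions.None);

// An already ingested asset serves as the input (picked arbitrarily here).
IAsset inputAsset = context.Assets.ToList().First();
task.InputAssets.Add(inputAsset);
task.OutputAssets.AddNew("Smooth Streaming output", AssetCreationOptions.None);

job.Submit();
</pre><p>The same skeleton applies to the Packager scenarios – swap the processor name and hand it the corresponding XML task preset instead of a short-named one. 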
You can read more about dynamic packaging <a href="http://msdn.microsoft.com/en-us/library/windowsazure/jj889436.aspx" target="_blank">here</a>.</p> Anton Staykovhttp://www.blogger.com/profile/01413558839144725133noreply@blogger.com0tag:blogger.com,1999:blog-5177305310827978243.post-91448572709085141862012-10-04T22:02:00.001+02:002012-10-04T22:02:05.971+02:00SQL Azure and Entity Framework<p>Recently I was asked by a friend: “How do I use the Transient Fault Handling Framework against SQL Azure while using Entity Framework?”. How, really?</p> <p>Here are a bunch of resources that describe in detail what transient faults are, how to deal with them, and in particular how to use the TFHF (Transient Fault Handling Framework) along with Entity Framework:</p> <p><a href="http://blogs.msdn.com/b/appfabriccat/archive/2010/12/11/sql-azure-and-entity-framework-connection-fault-handling.aspx">http://blogs.msdn.com/b/appfabriccat/archive/2010/12/11/sql-azure-and-entity-framework-connection-fault-handling.aspx</a></p> <p><a href="http://blogs.msdn.com/b/appfabriccat/archive/2010/10/28/best-practices-for-handling-transient-conditions-in-sql-azure-client-applications.aspx">http://blogs.msdn.com/b/appfabriccat/archive/2010/10/28/best-practices-for-handling-transient-conditions-in-sql-azure-client-applications.aspx</a></p> <p><a href="http://windowsazurecat.com/2010/10/best-practices-for-handling-transient-conditions-in-sql-azure-client-applications/">http://windowsazurecat.com/2010/10/best-practices-for-handling-transient-conditions-in-sql-azure-client-applications/</a></p> <p>A concrete sample from the Windows Azure CAT (CAT stands for Customer Advisory Team) team site:</p><pre class="brush: csharp;">
// Define the order ID for the order we want.
int orderId = 43680;

// Create an EntityConnection.
EntityConnection conn = new EntityConnection("name=AdventureWorksEntities");

// Create a long-running context with the connection.
AdventureWorksEntities context = new AdventureWorksEntities(conn);

try
{
    // Explicitly open the connection inside a retry-aware scope.
    sqlAzureRetryPolicy.ExecuteAction(() =>
    {
        if (conn.State != ConnectionState.Open)
        {
            conn.Open();
        }
    });

    // Execute a query to return an order. Use a retry-aware scope for reliability.
    SalesOrderHeader order = sqlAzureRetryPolicy.ExecuteAction&lt;SalesOrderHeader&gt;(() =>
    {
        return context.SalesOrderHeaders.Where("it.SalesOrderID = @orderId",
            new ObjectParameter("orderId", orderId)).Execute(MergeOption.AppendOnly).First();
    });

    // Change the status of the order.
    order.Status = 1;

    // Delete the first item in the order.
    context.DeleteObject(order.SalesOrderDetails.First());

    // Save changes inside a retry-aware scope.
    sqlAzureRetryPolicy.ExecuteAction(() => { context.SaveChanges(); });

    SalesOrderDetail detail = new SalesOrderDetail
    {
        SalesOrderID = 1,
        SalesOrderDetailID = 0,
        OrderQty = 2,
        ProductID = 750,
        SpecialOfferID = 1,
        UnitPrice = (decimal)2171.2942,
        UnitPriceDiscount = 0,
        LineTotal = 0,
        rowguid = Guid.NewGuid(),
        ModifiedDate = DateTime.Now
    };

    order.SalesOrderDetails.Add(detail);

    // Save changes again inside a retry-aware scope.
    sqlAzureRetryPolicy.ExecuteAction(() => { context.SaveChanges(); });
}
finally
{
    // Explicitly dispose of the context and the connection.
    context.Dispose();
    conn.Dispose();
}
</pre><br /><p>Well, this is the raw source provided. To be honest, I would extract / encapsulate it in some more generalized way (for instance, create some extension methods to call for all CRUD operations; or even better – create my own DataService on top of EF, so my code never works with the bare-bones EF context, but with some contract instead).</p> Anton Staykovhttp://www.blogger.com/profile/01413558839144725133noreply@blogger.com1tag:blogger.com,1999:blog-5177305310827978243.post-27219694396938138172012-10-03T21:52:00.001+02:002012-10-03T21:52:23.671+02:00SQL Azure Federations Talk at SQL Saturday 152 / Bulgaria<p>Last Saturday we had the first edition of SQL Saturday for Bulgaria – <a href="http://sqlsaturday.com/152/eventhome.aspx">SQL Saturday 152</a>. I submitted my talk, “An intro to SQL Azure Federations”, in the early stages of event preparation. I rated it as “beginners”, as it is intended to lay the grounds for scaling out with SQL Azure. However, it turned out that the content is for at least a level-300 technical talk, and the audience should have a foundation in SQL Azure to attend. Anyway, I think it went smoothly and was fun. You can find the <a href="http://sdrv.ms/T2kwyj">slides here</a>. 
And I hope to package up a GitHub project soon with the extensions on EF Code First that I used to get data out of federation members and perform fan-out queries.</p> <p>Already looking forward to the next appearance of SQL Saturday in Bulgaria.</p> Anton Staykovhttp://www.blogger.com/profile/01413558839144725133noreply@blogger.com0tag:blogger.com,1999:blog-5177305310827978243.post-90965687739053017932012-06-08T03:00:00.000+02:002012-06-08T03:00:04.784+02:00Windows Azure v.Next–Azure Websites, Linux on Azure, Persistent VM and much more …<div dir="ltr" style="text-align: left;" trbidi="on">
Building Cloud applications has never been easier! Ever! The recent news announced at the <a href="http://www.meetwindowsazure.com/" target="_blank">MEET Windows Azure</a> event just proved it! The most exciting, the most anticipated, the most wanted release of Windows Azure is now here! <a href="https://www.windowsazure.com/en-us/develop/overview/" target="_blank">Check out the samples</a>, <a href="https://www.windowsazure.com/en-us/develop/downloads/" target="_blank">get the tools</a> and dive into the clouds!<br />
<h2>
Azure Websites</h2>
Did you want to run your Drupal site in <a href="http://www.microsoft.com/windowsazure/" target="_blank">Windows Azure</a>? Or maybe your Joomla project, or the new Umbraco 5 – and don’t forget your small WordPress site. Now you can either build it from scratch, or just deploy it. How to deploy? Do you like Git, or FTP? Whatever you like, whatever you are comfortable with – Windows Azure Websites is the platform to run your site, be it a small site or a large-scale enterprise one! Here is a screenshot showing the sample gallery, where you can choose how to start your site, if you haven’t yet:<br />
<img src="https://public.sn2.livefilestore.com/y1pojmMwn8v_8JRIx1DzgXAoVTCTc1kGY7WqhuhbNwQjzpeSL9uYYlc2hvjMTPZkam-XuaoAnqR3nvg4ulAe3iP4g/AzureWebSites_Gallery.png?psid=1" /><br />
You say that Joomla runs on PHP and MySQL! You are correct. Windows Azure has supported PHP for quite some time, actually (almost) since the beginning, but it is easier now. What about MySQL? Well, have you heard of <a href="http://www.cleardb.com/" target="_blank">ClearDB</a>? A company that has been providing database-as-a-service for MySQL-based applications: a globally distributed, fault-tolerant database as a service. They have been partnering with Microsoft to provide <a href="http://www.cleardb.com/store/azure" target="_blank">MySQL-as-a-service within the Windows Azure data centers</a>. Ironically enough, their site is down at the time I write this blog post. But, trust me, since the MySQL is running in Windows Azure, it will not be down <img alt="Smile" class="wlEmoticon wlEmoticon-smile" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhYQLaZVXE0sMJVI6ANE26-b-b2uAUefSJP7KLeWH8vGIFD2L0E8c0PPrkG6KUAo-3H6SJDlvh3upBU8jENoS6Kk1GPC99HEeF9uvxc-xPc8zgD6BEO2WwzVzgoI8FqLj8Pwn6AkgNOFeEH/?imgmax=800" style="border-bottom-style: none; border-left-style: none; border-right-style: none; border-top-style: none;" />.<br />
Oh, you have noticed – the Windows Azure Portal, reimagined! The whole portal now runs on HTML5 with a METRO-style interface. I have to admit that I like it much better than the old Silverlight-based portal!<br />
<h2>
Persistent VM</h2>
It is not a replacement for the Windows Azure VM Role, which is still stateless. It is a whole new feature named Persistent VM, meaning that all changes you make to your VM after you deploy it to Windows Azure will be reliably persisted across VM reboots, healing, and recycling. How cool is that? Not only that – with the Persistent VM feature, you get an SLA for just 1 instance! What could you use a Persistent VM for? Just imagine – SQL Server, SharePoint, Linux …<br />
<h2>
Linux</h2>
What else can you do with Windows Azure now? You can, for example, run your Linux-based VM! Yes, Linux on Azure! How cool is that? Currently there are 4 distros you can choose from:<br />
<ul>
<li>OpenSUSE 12.1 </li>
<li>CentOS-6.2 </li>
<li>Ubuntu 12.04 </li>
<li>SUSE Linux Enterprise Server 11 SP2</li>
</ul>
But I am sure more will come soon! <br />
<h2>
Virtual Network</h2>
Connecting your own infrastructure to the cloud has never been easier. Windows Azure Virtual Network lets you configure your network topology, including IP addresses, routing tables, and security policies. It uses the IPsec protocol to provide a secure connection between your corporate VPN gateway and Windows Azure. <br />
If I were you, I would go through the new Windows Azure <a href="http://www.microsoft.com/en-us/news/download/presskits/cloud/docs/MeetWindowsAzureFS.docx" target="_blank">Fact Sheet</a>, go for the free trial to check out the Websites, and maybe even try the Linux VMs! <br />
As a side note, something that has really been on my mind for quite a few years – finally we, in Bulgaria, will officially have Windows Azure! </div>Anton Staykovhttp://www.blogger.com/profile/01413558839144725133noreply@blogger.com3tag:blogger.com,1999:blog-5177305310827978243.post-31930877127485899862012-05-17T14:29:00.001+02:002012-05-17T14:29:59.477+02:00Secure your ASMX WebServices with SWT and Claims<p>I was recently involved in an interesting project that was using the plain old ASMX web services. We wanted to migrate it to the <a href="http://www.microsoft.com/windowsazure/" target="_blank">Windows Azure</a> Access Control Service and make use of claims.</p> <p>The way we achieved that is to add an additional SOAP header to the client requests that includes a Simple Web Token (SWT). On the server side, we check for this specific header’s existence, then extract the token, perform some validation checks, and inject a fresh new claims identity into the service instance. One thing to look out for: you have to think of a workaround if your ASMX WebService is a singleton object. My implementation works with non-singleton implementations. I currently get my Simple Web Tokens from the Windows Azure Access Control Service’s WRAP endpoint. I have configured a “Password” service identity, and I play with the rule groups to add additional claims based on the identity used. It is pretty flexible!</p> <p>The result is on … <a href="https://github.com/astaykov/ClaimsASMX" target="_blank">GitHub</a>. I initially wanted it to be on CodePlex, because I have other projects there and am more used to the TFS style of working. But CodePlex’s TFS has been down for quite some time, which was a good excuse to use <a href="https://github.com/astaykov/ClaimsASMX" target="_blank">GitHub</a>. There are some explanations in the Readme.txt file, as well as comments in the code. 
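</p><p>For flavor, here is a minimal sketch of the SOAP-header approach described above, independent of the actual repository code – the header class name, the member names, and the omitted token validation are simplified assumptions for illustration, not the project’s exact implementation:</p><pre class="brush: csharp;">
using System.Web.Services;
using System.Web.Services.Protocols;

// A custom SOAP header carrying the raw Simple Web Token.
public class SwtSoapHeader : SoapHeader
{
    public string RawToken;
}

public class SecuredService : WebService
{
    // ASMX populates this field from the incoming SOAP header.
    public SwtSoapHeader SwtHeader;

    [WebMethod]
    [SoapHeader("SwtHeader", Direction = SoapHeaderDirection.In)]
    public string WhoAmI()
    {
        // Reject calls that did not send the header at all.
        if (SwtHeader == null || string.IsNullOrEmpty(SwtHeader.RawToken))
        {
            throw new SoapException("Missing SWT header.",
                SoapException.ClientFaultCode);
        }

        // Validate signature/expiry/audience, turn the token's
        // key=value pairs into claims, and inject the resulting
        // claims identity into the request (omitted in this sketch).
        // ...
        return User.Identity.Name;
    }
}
</pre><p>On the client, the generated proxy exposes a matching header property, so the caller just assigns a populated SwtSoapHeader (with the token obtained from the WRAP endpoint) before invoking the method. 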
So feel free to get the code, play around with it, ping me if it is not working for some reason, and so on!</p> <p>The project makes extensive use of the <a href="https://github.com/RobBlackwell/webtokens" target="_blank">SWT implementation done by the Two10Degrees team</a>, but I added a compiled assembly reference for convenience.</p> Anton Staykovhttp://www.blogger.com/profile/01413558839144725133noreply@blogger.com0tag:blogger.com,1999:blog-5177305310827978243.post-64755420251686110692012-05-16T21:01:00.001+02:002012-06-07T18:17:36.274+02:00MEET Windows Azure on June the 7th<p>I’ve been following <a href="http://www.microsoft.com/windowsazure/" target="_blank">Windows Azure</a> since its first public CTP at PDC 2008. It was amazing then, it is even more amazing now, and there is more exciting stuff to come (I’m really, really excited!) …</p> <p>Get ready to <a href="http://www.meetwindowsazure.com/" target="_blank">MEET Windows Azure</a> live on June the 7th. Register to watch live (June the 7th, 1PM PDT) <a href="http://register.meetwindowsazure.com/" target="_blank">here</a>. 
Be informed by following the conversation <a href="http://twitter.com/#!/windowsazure" target="_blank">@WindowsAzure</a>, <a href="http://twitter.com/#!/search/%23MEETAzure" target="_blank">#MEETAzure</a>, <a href="http://twitter.com/#!/search/%23WindowsAzure" target="_blank">#WindowsAzure</a></p> <p>And, if you want to be more social, register for the <a href="http://lanyrd.com/2012/meetazure/" target="_blank">Social meet up on Twitter</a> event, <a href="http://www.magnusmartensson.com/post/2012/05/16/Social-meet-up-on-Twitter-for-MEET-Windows-Azure-on-June-7th.aspx" target="_blank">organized by fellow Azure MVP Magnus Martensson</a>.</p> <p>What I can tell you for sure, without breaking my NDA, is that you don’t want to miss that event!</p> <p>See you there!</p> <p><strong>MEET Windows Azure Blog Relay:</strong> <ul> <li>Roger Jennings (<a href="https://twitter.com/#!/rogerjenn">@rogerjenn</a>): <a href="http://oakleafblog.blogspot.se/2012/05/social-meet-up-on-twitter-for-meet.html">Social meet up on Twitter for Meet Windows Azure on June 7th</a></li> <li>Anton Staykov (<a href="https://twitter.com/#!/astaykov">@astaykov</a>): <a href="http://blogs.staykov.net/2012/05/meet-windows-azure-on-june-7th.html">MEET Windows Azure on June the 7th</a></li> <li>Patriek van Dorp (<a href="https://twitter.com/#!/pvandorp">@pvandorp</a>): <a href="http://cloudythoughts.siadis.com/windows-azure/social-meet-up-for-meet-windows-azure-on-june-7th">Social Meet Up for ‘MEET Windows Azure’ on June 7th</a></li> <li>Marcel Meijer (<a href="https://twitter.com/#!/MarcelMeijer">@MarcelMeijer</a>): <a href="http://blogs.msmvps.com/marcelmeijer/blog/2012/05/16/meet-windows-azure-on-june-the-7th">MEET Windows Azure on June the 7th</a></li> <li>Nuno Godinho (<a href="https://twitter.com/#!/NunoGodinho">@NunoGodinho</a>): <a href="http://msmvps.com/blogs/nunogodinho/archive/2012/05/16/social-meet-up-for-meet-windows-azure-on-june-7th.aspx">Social Meet Up for ‘MEET Windows Azure’ on June 
7th</a></li> <li>Shaun Xu (<a href="https://twitter.com/#%21/shaunxu">@shaunxu</a>) <a href="http://blogs.shaunxu.me/archive/2012/05/16/letrsquos-meet-windows-azure.aspx">Let's MEET Windows Azure</a></li> <li>Maarten Balliauw (<a href="https://twitter.com/#!/maartenballiauw">@maartenballiauw</a>): <a href="http://blog.maartenballiauw.be/post/2012/05/17/Social-meet-up-on-Twitter-for-MEET-Windows-Azure-on-June-7th.aspx">Social meet up on Twitter for MEET Windows Azure on June 7th</a></li> <li>Brent Stineman (<a href="https://twitter.com/#!/brentcodemonkey">@brentcodemonkey</a>): <a href="http://brentdacodemonkey.wordpress.com/2012/05/17/meet-windows-azure-june72012/">Meet Windows Azure (aka Learn Windows Azure v2)</a></li> <li>Herve Roggero (<a href="http://twitter.com/hroggero">@hroggero</a>): <a href="http://geekswithblogs.net/hroggero/archive/2012/05/17/social-meet-up-on-twitter-for-meet-windows-azure-on.aspx">Social Meet up on Twitter for Meet Windows Azure on June 7th</a></li> <li>Paras Doshi (<a href="http://twitter.com/#!/paras_doshi">@paras_doshi</a>): <a href="http://parasdoshi.com/2012/05/19/get-started-on-windows-azure-attend-meet-windows-azure-event-online/">Get started on Windows Azure: Attend “Meet Windows Azure” event Online</a></li> <li>Simran Jindal (<a href="http://twitter.com/#!/SimranJindal">@SimranJindal</a>): <a href="http://simranjindal.wordpress.com/2012/05/21/meet-windows-azure-an-online-and-in-person-event-social-meetup-meetazure-beer-for-beer-lovers-on-june-7th-2012/">Meet Windows Azure – an online and in person event, social meetup #MeetAzure (+ Beer for Beer lovers) on June 7th 2012</a></li> <li>Michael Wood (<a href="https://twitter.com/#!/mikewo">@mikewo</a>): <a href="http://mvwood.com/blog/learn-about-windows-azure-and-chat-with-experts-june-7th/">Learn about Windows Azure and Chat with Experts, June 7th</a></li> <li>Shiju Varghese (<a href="http://twitter.com/#!/shijucv/">@shijucv</a>): <a 
href="http://weblogs.asp.net/shijuvarghese/archive/2012/05/28/social-meet-up-on-twitter-for-meet-windows-azure-on-june-the-7th.aspx">Social meet up on Twitter for MEET Windows Azure on June the 7th</a></li> <li>Jeremie Devillard (<a href="http://twitter.com/#!/jeremiedev/">@jeremiedev</a>): <a href="http://jeremiedevillard.wordpress.com/2012/05/16/meet-the-cloudwindows-azure-event-7th-june/">Meet the Cloud–Windows Azure Event 7th June</a></li> <li>Kris van der Mast (<a href="https://twitter.com/#!/KvdM">@KvdM</a>): <a href="http://blog.krisvandermast.com/GetReadyToMeetWindowsAzure.aspx">Get ready to meet Windows Azure</a></li> <li>Mike Martin (<a href="https://twitter.com/#!/techmike2kx">@TechMike2KX</a>): <a href="http://techmike2kx.wordpress.com/2012/05/30/dont-miss-the-online-windows-azure-event-of-the-year-meet-windows-azure-on-june-7th">Don’t miss the online Windows Azure event of the year : MEET Windows Azure on June 7th </a></li> <li>Bill Wilder (<a href="https://twitter.com/#!/codingoutloud">@codingoutloud</a>): <a href="http://blog.codingoutloud.com/2012/06/04/meet-windowsazure-live-streamed-event-june-7-2012/">Get ready to “Meet #WindowsAzure” in a live streamed event June 7 at 4:00 PM Boston time</a></li> <li>Eric Boyd (<a href="https://twitter.com/#!/EricDBoyd">@EricDBoyd</a>): <a href="http://ericdboyd.com/2012/06/05/meet-windows-azure-unveiling-the-latest-platform/">Meet Windows Azure – Unveiling the Latest Platform</a></li> <li>Magnus Mårtensson (<a href="https://twitter.com/#!/noopman">@noopman</a>): <a href="http://www.magnusmartensson.com/post/2012/05/16/Social-meet-up-on-Twitter-for-MEET-Windows-Azure-on-June-7th.aspx">Social meet up on Twitter for MEET Windows Azure on June 7th</a></li></ul> Anton Staykovhttp://www.blogger.com/profile/01413558839144725133noreply@blogger.com0tag:blogger.com,1999:blog-5177305310827978243.post-43370823445464620502012-05-01T21:34:00.001+02:002012-05-01T21:34:52.552+02:00Introduction to Claims<p>It is 21<sup>st</sup> 
century! We live in a digital world where we can do almost everything online. While you might think this is amazing, let me ask you a question: how many online identities do you have? And here I don’t mean only your Google or Facebook account. I mean every single username and password you have ever had to create. I personally have about 8 (eight!) that I <b>actively use</b> (including a digital signature), another 10 or more that I use fairly rarely, and maybe over 20 that I had to create for some reason and then abandoned. As a consumer, this drives me crazy. Just a couple of weeks ago I rejected an invitation to the next business social networking site, from a person whom I really trust, just because that new network did not offer me the option to use any of my existing identities. They required me to create another username and another password of at least 6 characters, containing upper- and lowercase letters and numbers. No thanks. I’m done with creating online identities! <p>As a developer, I also know that the easiest way to build a site that offers some kind of personalization is to use my own authentication and authorization mechanism! But I have left that thinking behind. I decided to step into the <b>present</b> (not even the future!) and pay attention to terms like Identity Provider, Claims, Trust, Relying Party application, and the like. Fortunately for me, there is the Windows Azure Access Control Service (or just ACS), which really helps me build applications that respond to the needs of the customers. Combining the power of ACS with Windows Identity Foundation (or WIF), I can easily create applications that offer consumers the option to use some of their existing online identities (such as Microsoft Live ID, Google, Yahoo, Facebook and others). 
<p>If you want to join me, let me first list the terms you will begin working with on a daily basis: <p>Take a closer look at the following sentence: “I claim that my name is Anton Staykov, and I can prove it by showing you my personal identification card, issued by the Bulgarian government”. It contains almost all the terms and objects you will work with when working with ACS. <p><b>Claim</b> – an assertion about a subject, issued by an Identity Provider. In the sentence above, the claim is “Name” and its value is “Anton Staykov”. <p><b>Identity Provider</b> – an authority that issues security tokens containing claims. The Bulgarian (or any other) government is an Identity Provider that issues passports. And passports are <p><b>Security Tokens</b> – digitally signed objects that contain claims. A token may contain one or more claims. <p>And last, but not least, you, dear reader, are the <b>Relying Party</b>, to which I present my token that contains claims. <p>There is one more player on the scene: the Federation Provider. This, in essence, is an Identity Provider. It acts as a mediator between my application and your Facebook identity. When I want to give you the chance to use your Facebook account to identify yourself to my application, I don’t want to bother with implementation details, which might (or might not) be very different from the details I need to know when I give you the option to sign in with your Live ID. In my application I have a piece of code where I say: look, I only trust tokens and claims that come from that Federation Provider. And that Federation Provider takes care of the implementation details around Facebook, Live ID, Google, OpenID, WS-Federation, etc. And not only does it take care of these details, but even more. 
If one day I decide that I no longer trust Google as an Identity Provider, I just uncheck a checkbox, change nothing in my code, and you will no longer be able to use your Google account to identify yourself to my application. <p>As some final words, I want to share with you findings from studies conducted among online users about their perception of the online shopping experience: <p>· 3 out of 4 online shoppers avoid creating new user accounts <p>· 76% of online shoppers admit to having given incomplete or wrong information when required to create a new user account <p>· 24% of online shoppers abandon a site when it requires registration <p>With this said, I hope I have made you believe (at least a bit) that claims are the way to identify online users. And if you happen to be a developer, I really hope I have lit a small fire that will drive you to investigate at least a bit more about claims and social sign-in.</p> Anton Staykovhttp://www.blogger.com/profile/01413558839144725133noreply@blogger.com0
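<p>The players above (Identity Provider, claims, token, Relying Party) can be sketched in code with the claims object model that .NET later absorbed from WIF into <code>System.Security.Claims</code>. This is a minimal, self-contained illustration, not the ACS wire protocol: the claim values and the custom claim type URI are purely made up for the example.</p><pre class="brush: csharp;">using System;
using System.Security.Claims;

class ClaimsDemo
{
    // The "Identity Provider" side: assert claims about the subject
    // and wrap them into an identity (in real federation this arrives
    // as a signed security token, not as in-process objects).
    static ClaimsPrincipal BuildPrincipal()
    {
        var identity = new ClaimsIdentity(new[]
        {
            new Claim(ClaimTypes.Name, "Anton Staykov"),
            new Claim(ClaimTypes.Country, "Bulgaria"),
            // Hypothetical custom claim type, shown only as an example.
            new Claim("http://example.org/claims/idcard", "BG-0000000")
        }, "Federation");

        return new ClaimsPrincipal(identity);
    }

    static void Main()
    {
        // The "Relying Party" side: inspect claims instead of
        // managing usernames and passwords itself.
        ClaimsPrincipal principal = BuildPrincipal();
        Console.WriteLine("Name claim: " + principal.FindFirst(ClaimTypes.Name).Value);

        foreach (Claim c in principal.Claims)
        {
            Console.WriteLine("{0} = {1}", c.Type, c.Value);
        }
    }
}</pre><p>Notice that the Relying Party never sees a password: it only reads claims. Swapping Google for Live ID (or dropping Google entirely) changes which Identity Provider produced the claims, not this code.</p>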