We had the great privilege of having Jim Priestley in attendance. Jim described the quite remarkable Azure web site he and his team at Radwell International, Inc. built.
Jim describes the system in his blog. In a nutshell, they built an awesome catalog storefront using Azure blob storage and SQL Azure, providing services to 25,000 users a month with hundreds of hits a minute. The site is FAST, returning hundreds of pages in less than a second - it really shows off the power of Windows Azure. Hearing Jim talk makes me confident that Windows Azure is the platform I want to program for going forward.
I am researching the Microsoft Windows Azure Platform. What follows are the notes I am taking as I learn this rich "cloud-based" environment. If you are just getting started with Windows Azure like I am, you may find these notes useful.
To keep the start-up costs of programming Windows Azure applications to a minimum, I am focusing primarily on free resources.
Quick List of Links
- Windows Azure Platform Free 90 Day Trial
- Visual Web Developer 2010 Express
- Windows Azure Tools for Visual Studio
- Windows Azure Platform Management Portal
- Channel 9’s Cloud Cover
- Windows Azure Platform Training Kit
- Overview of Building an Application that Runs in a Hosted Service
- Tracing to Azure Compute Emulator SDK V1.3
- Microsoft’s Website Spark program
- Tips and Tools for a Better Azure Deployment Lifecycle
Get a Windows Azure Account
First, sign up for the Windows Azure Platform Free 90 Day Trial or otherwise create a Windows Azure Platform account.
Get Visual Studio
There are a variety of ways to program applications for Windows Azure. I happen to be fond of Visual Studio, having programmed in that environment for many years. The quickest and least expensive way to acquire Visual Studio is to download and install Visual Web Developer 2010 Express.
After installing Visual Web Developer 2010 Express, create a new project. Within the New Project dialog, navigate to Installed Templates, Visual C#, and click Cloud. Visual Studio will then guide you through the process of installing the Windows Azure Tools for Visual Studio.
After installing the Windows Azure Tools for Visual Studio, run through the Code Quick Start tutorial, Create and deploy an ASP.NET application in Windows Azure. The tutorial will walk you through creating a very simple ASP.NET application and deploying it to Windows Azure. Read the tutorial very carefully, as there is a lot of important information to digest all at once.
Get Some Costs!
There’s a note in the Code Quick Start tutorial about configuring your VM to use the Extra-Small size. As an experiment, I left the size at the default setting of Small and deployed anyway. After I deployed, I let the sample application sit in staging on the Windows Azure Platform, and it went unused for two days. It turns out that the free trial includes 25 compute hours for Small VMs. By the time I shut the sample application down, it had accumulated 52 compute hours, and I was billed for 27 hours of overage at $0.12 per hour. Keep this in mind as you work with this platform. If you’re concerned about costs, pay attention to the details.
Tip: To change the VM size, use Solution Explorer to navigate to the cloud project that contains the service configuration and definition files (ending in .cscfg and .csdef). Open the Roles folder, open the WebRole’s properties, and change the size using the user interface.
Get Some Training
Download and install the Windows Azure Platform Training Kit. The hands-on labs use a dependency checker to make sure you have all the correct software installed and running. If you’re using Visual Web Developer 2010 Express, the dependency checker will indicate that you do not have Visual Studio installed. Just ignore this and continue past the check. The hands-on labs run perfectly fine using the Express version of Visual Studio.
The first hands-on lab (Exercise 1: Building Your First Windows Azure Application) walks through the process of setting up a guest book web application. The first exercise steps you through building a class library that serves as a data access layer (DAL). You will also create a simple web page, within the context of a Web Role, that uses the class library and uses Windows Azure resources to create storage objects. The second exercise sets up a Worker Role to process uploaded bitmaps into thumbnails. The third exercise walks through the deployment of the finished application to Windows Azure.
Microsoft Windows Azure Basics
Windows Azure is a platform-as-a-service (PaaS) environment that consists of three components: Fabric, Compute and Storage. Fabric refers to the network of interconnected servers and their connections; the Compute and Storage components are themselves part of the Fabric. The Compute component provides a computation environment via Worker Role services. A special kind of Worker Role, called a Web Role, simplifies creating ASP.NET front-end applications. The Storage component provides three types of scalable data storage: Tables, Blobs and Queues.
- Tables - structured storage in the form of non-relational tables.
  - You can use a Table to persist and index your application’s objects.
  - Tables have no schema beyond their required identifier properties. Typically, you’ll use a Table to efficiently store and retrieve objects using LINQ and data binding with an ObjectDataSource.
  - Important properties:
    - PartitionKey - unique identifier for the partition. By definition, all table entities are organized by partition to support load balancing across storage nodes.
    - RowKey - unique identifier for an entity within a given partition. Every insert, update, and delete operation requires the RowKey.
    - Timestamp - maintained on the server side to record the time an entity was last modified. This property is required but used only by Windows Azure to provide optimistic concurrency.
- Blobs - unstructured storage in the form of block and page blobs, which are optimized for streaming.
  - You upload a block blob as a set of blocks and commit it using their block IDs. Blobs of up to 64 MB can be uploaded in a single operation.
  - Larger blobs, up to 1 TB in size, can be stored as page blobs.
- Queues - storage of messages that may be read by any client who has access.
  - A queue can contain an unlimited number of messages. Each message must be less than 8 KB in size.
  - Typically, you’ll use a queue to share messages between application roles (see the Worker Role discussion below).
- Other Windows Azure Storage offerings
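The Table identifier properties above can be seen in a typical entity class. Here is a minimal sketch, assuming the v1.x Microsoft.WindowsAzure.StorageClient library from the era of these notes; the GuestBookEntry name and its properties are illustrative, not the lab's exact code.

```csharp
using System;
using Microsoft.WindowsAzure.StorageClient;

// TableServiceEntity supplies the PartitionKey, RowKey and Timestamp
// properties described above.
public class GuestBookEntry : TableServiceEntity
{
    public GuestBookEntry()
    {
        // All entries share one partition; the RowKey orders entities
        // newest-first by inverting the current tick count.
        PartitionKey = "GuestBookEntry";
        RowKey = string.Format("{0:d19}",
            DateTime.MaxValue.Ticks - DateTime.UtcNow.Ticks);
    }

    public string Message { get; set; }
    public string PhotoUrl { get; set; }
    public string ThumbnailUrl { get; set; }
}
```

Because the RowKey is derived from the clock, two inserts in the same tick would collide; a production scheme would append a GUID or similar suffix.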
In the first exercise, much of the code that deals with Table storage is encapsulated within a standard C# class library that serves as a DAL. Default.aspx begins by creating objects from this library and also calls Windows Azure methods directly via Microsoft.WindowsAzure.StorageClient and other libraries. In the class library, the DataContext class provides a bridge between the data-binding logic and the collection of POCOs (plain old CLR objects). The DataSource in the library provides the actual LINQ query to retrieve a list of POCOs.
During initialization, Windows Azure applications typically retrieve an instance of a CloudStorageAccount storage account. The initialization of CloudStorageAccount requires a connection string. Usually this comes from the application’s configuration settings ("DataConnectionString"). To simplify the retrieval of the storage account’s configuration settings, initialize Windows Azure by calling SetConfigurationSettingPublisher in the Application_Start method of Global.asax.
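The Application_Start wiring looks something like the following sketch, assuming the v1.x Microsoft.WindowsAzure and ServiceRuntime assemblies:

```csharp
using System;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.ServiceRuntime;

public class Global : System.Web.HttpApplication
{
    protected void Application_Start(object sender, EventArgs e)
    {
        // Tell CloudStorageAccount how to resolve setting names such as
        // "DataConnectionString" from the role's service configuration.
        CloudStorageAccount.SetConfigurationSettingPublisher(
            (configName, configSetter) =>
                configSetter(
                    RoleEnvironment.GetConfigurationSettingValue(configName)));
    }
}
```

With the publisher in place, any later call to CloudStorageAccount.FromConfigurationSetting("DataConnectionString") resolves against the role's configuration rather than web.config.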
To initialize the DAL to use Windows Azure storage, in the DataSource:
- Call CloudTableClient.CreateTablesFromModel.
  - Pass the type of the DAL’s DataContext.
  - The base type for the DAL’s DataContext is Microsoft.WindowsAzure.StorageClient.TableServiceContext.
  - Do this only once per session; use a static constructor for the DataSource class.
  - Note: Microsoft no longer recommends using CloudTableClient.CreateTablesFromModel; rather, you should use CloudTableClient.CreateTable.
- For each DataSource, get a new instance of the DAL’s DataContext.
  - Set the DataContext’s RetryPolicy to an appropriate value, typically 3 retries with 1 second between each retry.
  - Do this in the DataSource’s instance constructor, where the DataContext is created.
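The initialization steps above can be sketched as follows. This assumes the v1.x StorageClient library; GuestBookDataSource and GuestBookDataContext (a TableServiceContext subclass) are illustrative names, not necessarily the lab's exact classes.

```csharp
using System;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

public class GuestBookDataSource
{
    private static readonly CloudStorageAccount storageAccount;
    private readonly GuestBookDataContext context;

    static GuestBookDataSource()
    {
        storageAccount =
            CloudStorageAccount.FromConfigurationSetting("DataConnectionString");

        // Create the tables once per process, not on every request.
        CloudTableClient.CreateTablesFromModel(
            typeof(GuestBookDataContext),
            storageAccount.TableEndpoint.AbsoluteUri,
            storageAccount.Credentials);
    }

    public GuestBookDataSource()
    {
        // A fresh DataContext per DataSource, with a modest retry policy:
        // 3 attempts, 1 second apart.
        context = new GuestBookDataContext(
            storageAccount.TableEndpoint.AbsoluteUri,
            storageAccount.Credentials);
        context.RetryPolicy = RetryPolicies.Retry(3, TimeSpan.FromSeconds(1));
    }
}
```

The static constructor is what guarantees the table-creation call runs only once per session, as the list above recommends.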
When it is time to save the user’s entries from the user interface, create a new POCO and initialize it with the values from the user interface, then create a new DataSource and add the POCO to it.
To work with Blob storage in Windows Azure, first get a storage account (CloudStorageAccount) from the configuration setting "DataConnectionString", just as when you requested a CloudTableClient above. Get a blob client from the storage account by calling CreateCloudBlobClient, then get a CloudBlobContainer from the blob client by calling GetContainerReference.
Create the container if it doesn’t exist already by calling CreateIfNotExist. Create a unique name for the blob and pass that along when requesting a blob from the blob client using GetBlockBlobReference. Set the ContentType for the blob (usually something like image/pjpeg). Then upload the blob by calling the blob’s UploadFromStream method. The stream can come from an asp:FileUpload web control.
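Pulled together, the blob upload steps look roughly like this snippet from a page code-behind. It assumes the v1.x StorageClient library; the "guestbookpics" container name and the FileUpload1 control are illustrative.

```csharp
using System;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

// Get the storage account and a blob client, then the container.
CloudStorageAccount storageAccount =
    CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
CloudBlobClient blobStorage = storageAccount.CreateCloudBlobClient();
CloudBlobContainer container =
    blobStorage.GetContainerReference("guestbookpics");
container.CreateIfNotExist();

// Upload under a unique name, preserving the uploaded content type.
string uniqueBlobName = string.Format("image_{0}.jpg", Guid.NewGuid());
CloudBlockBlob blob = container.GetBlockBlobReference(uniqueBlobName);
blob.Properties.ContentType = FileUpload1.PostedFile.ContentType;
blob.UploadFromStream(FileUpload1.FileContent);

// blob.Uri now holds the address to record in the entry's POCO.
```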
When it is time to save the user’s entries from the user interface, include the blob’s address (from the blob’s Uri property) in the POCO. In this way, the blob is associated with the entry object in Table storage.
Message Queue storage objects are created in much the same way as blob and table storage: call the storage account’s CreateCloudQueueClient method, then create a CloudQueue object by calling the client’s GetQueueReference, passing a name that serves as an address for the queue.
When it is time to save the user’s entries from the user interface, create a new CloudQueueMessage and pass along a string that includes the message. This message will be retrieved by a separate Worker Role and serves as a trigger to convert the blob into a thumbnail.
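Queuing the trigger message can be sketched as follows, again assuming the v1.x StorageClient library. The "guestthumbs" queue name and the comma-separated message format are illustrative; the blob and entry variables stand for the blob and POCO created in the earlier steps.

```csharp
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

CloudStorageAccount storageAccount =
    CloudStorageAccount.FromConfigurationSetting("DataConnectionString");

CloudQueueClient queueStorage = storageAccount.CreateCloudQueueClient();
CloudQueue queue = queueStorage.GetQueueReference("guestthumbs");
queue.CreateIfNotExist();

// The Worker Role parses this message to locate the blob and the
// table entity whose thumbnail URL it should update.
var message = new CloudQueueMessage(
    string.Format("{0},{1},{2}", blob.Uri, entry.PartitionKey, entry.RowKey));
queue.AddMessage(message);
```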
In Windows Azure, a common pattern is to create a Worker Role that waits for a message to arrive in the Queue, processes the message and then performs some action on the message. For instance, you could use a worker role to convert an uploaded picture file into a thumbnail. The Web Role would store the uploaded picture file in a blob and then place a message on the queue. The Worker Role would see the message, convert the file into a thumbnail, store the thumbnail into the blob and then update the associated POCO's thumbnail's Uri property.
Just as in the Web Role, the Worker Role needs a storage account (CloudStorageAccount) from the configuration setting "DataConnectionString". Only this time the setting publisher is initialized by calling SetConfigurationSettingPublisher in the OnStart method of the WorkerRole class.
During the OnStart method, get blob storage and queue objects in exactly the same manner as the Web Role by using the storage account’s CreateCloudBlobClient and CreateCloudQueueClient methods, respectively. Set the permissions for the blob container using the CloudBlobContainer.
In the hands-on lab, the worker role project carries a reference to the DAL. The worker role then uses the same DAL as the web role and is able to manipulate the same Table storage.
A worker role provides a Run method that is called to initiate processing. More information on the lifecycle methods of worker roles is in the MSDN Library article Overview of Building an Application that Runs in a Hosted Service. Typically, a worker role checks for messages in the queue by calling the GetMessage method. If there are no messages, simply pause the worker role for one second before checking again; should the Run method ever exit, Windows Azure will restart the worker role.
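The polling loop described above amounts to a sketch like this, assuming the v1.x SDK; the queue field is initialized in OnStart as described earlier, and ProcessThumbnail is an illustrative helper, not an SDK method.

```csharp
using System.Threading;
using Microsoft.WindowsAzure.StorageClient;

public override void Run()
{
    while (true)
    {
        CloudQueueMessage msg = queue.GetMessage();
        if (msg != null)
        {
            ProcessThumbnail(msg);      // convert the blob to a thumbnail
            queue.DeleteMessage(msg);   // remove it so it is not reprocessed
        }
        else
        {
            Thread.Sleep(1000);         // no work - pause for one second
        }
    }
}
```

Deleting the message only after processing succeeds means a message that fails mid-conversion reappears on the queue and gets retried, which is the usual reason for this ordering.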
To convert an uploaded picture file into a thumbnail, you can use the .NET System.Drawing library for simplicity. If this were a production application, a more robust technique would be called for. One possibility is reviewed in Jeff Prosise’s blog post Silverlight’s Big Image Problem (and What You Can Do About It).
Debugging Windows Azure Applications
For development purposes, the Windows Azure Compute Emulator comes with the SDK tools and provides a Windows Azure environment simulation. In this simulation, you can run and test an application on your local computer before you actually deploy it to the Azure cloud. The Compute Emulator also provides a GUI to assist you with monitoring trace and other information. This GUI has a handy shortcut in the form of an icon in the Windows system tray.
For a step-by-step guide to writing to the diagnostic logs that appear in the Compute Emulator’s monitor window, review the MSDN article How to Initialize the Windows Azure Diagnostic Monitor. Basically, you write to the trace logs using the methods provided by System.Diagnostics.Trace. The SDK provides a “listener” library that you hook into your application via web.config.
Note that web pages actually run in a different application pool than the web and worker roles. Consequently, if you wish to write trace messages from a web page code-behind such as default.aspx.cs, you’ll need to add a listener to web.config. Although the exercises in the hands-on lab use System.Diagnostics.Trace.TraceInformation to write trace information to the Azure Compute Emulator diagnostic monitor, the correct listener is not referenced in the lab and you will not see the trace information in the logs. More information on this issue is available in Andy Cross’s article Tracing to Azure Compute Emulator SDK V1.3. To view trace output from the code in the hands-on lab, add the following listener to web.config in the system.diagnostics trace listeners section.
<add type="Microsoft.ServiceHosting.Tools.DevelopmentFabric.Runtime.DevelopmentFabricTraceListener, Microsoft.ServiceHosting.Tools.DevelopmentFabric.Runtime, Version=1.3.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" name="DevFabricListener"/>
For development purposes, the Windows Azure Storage Emulator comes with the SDK tools and provides a Windows Azure environment simulation of Blob, Table and Queue storage. The Storage Emulator also provides a GUI to assist you with viewing blobs and table data. This GUI is available from Visual Studio’s Server Explorer.
This is one area where using Visual Web Developer 2010 Express poses a challenge. Visual Web Developer 2010 Express does not provide a Server Explorer. Instead, a Data Explorer is provided and although Windows Azure Storage appears, I was unable to view anything. Visual Studio 2010 Professional does not have this issue.
It is possible to get a free version of Visual Studio 2010 Professional by joining Microsoft’s Website Spark program.
Deploying Windows Azure Applications
There are many things to consider when deploying applications to Windows Azure. At minimum, you’ll need to create a Storage Account and a Hosted Service. For the best performance and lowest costs, you'll also want to create an Affinity Group so that the Storage Account and the Hosted Service are deployed to the same location. Using the Windows Azure Management Portal found at https://windows.azure.com/, perform the following steps to create your application’s initial structure in Windows Azure:
- Create a Storage Account
- Create an Affinity Group
- Copy the Primary access key from the Properties Pane
- Create a Hosted Service
- Choose the Affinity Group you created in the step above
- Select the Do not Deploy option
The reason to defer deployment is so you can go back to the application source and make any last-minute changes, such as providing the AccountName and AccountKey in the connection strings listed in ServiceConfiguration.cscfg using the Primary access key from the step above.
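The connection strings in ServiceConfiguration.cscfg end up looking something like this sketch; the account name and key values are placeholders, and the Diagnostics setting name assumes the SDK 1.3 plugin naming.

```xml
<ConfigurationSettings>
  <Setting name="DataConnectionString"
           value="DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=REPLACE_WITH_PRIMARY_ACCESS_KEY" />
  <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString"
           value="DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=REPLACE_WITH_PRIMARY_ACCESS_KEY" />
</ConfigurationSettings>
```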
One way to move your application from Visual Studio into Windows Azure is to create a package and then upload it. After right-clicking the cloud project in Solution Explorer, select Publish. Then choose Create Service Package Only in the Deploy Windows Azure project dialog. Visual Studio will first build your project and then create the Service Package which you’ll then upload using the Windows Azure Management Portal:
- From the Portal’s Home page, select Hosted Services, Storage Accounts & CDN.
- Select Hosted Services, then locate your service in the list.
- Right click the service and select New Staging Deployment.
- Use the dialog to name your deployment - use a version identifier of some kind.
- Use the dialog to point to the package and configuration files stored on your local computer.
The Portal will then upload your package and configuration files, prep the fabric and deploy your application. After the deployment completes, Windows Azure generates a temporary unique URL and makes it available in the properties panel. Click this to test your application in staging. Once your application is deployed in Windows Azure staging, you can do the following:
- Deploy to Production
- Works the same way as deploying to Staging.
- Alternatively, swap the VIP between Staging and Production (see below).
- Configure the hosted service
- Upgrade the hosted service
- Upload an updated package
- Upload an updated CSCFG file
- Instructions to upgrade the hosted service
- Promote Staging to Production - Virtual IP Swap (VIP-Swap)
- The Virtual IP (VIP) address of staging is swapped with the VIP address of production
- Instructions to swap VIPs
I’ve grown quite fond of the open source applications I’ve been working with over the years. Wordpress (the engine behind this blog) is all done in PHP under Apache and uses MySQL for the back end. WebProtege (a wonderful ontology editor from Stanford) uses a servlet container for its collaboration magic - I like Tomcat a lot. Given that I wanted these two applications to be managed from the same space, I dug a little and found a vendor that provides Apache and private Tomcat services.
I tend to go way overboard sometimes, and this time was fairly typical. For years, I’ve wanted to work in the above environment, but in a local, offline way. From my early days of researching internet technologies, I wanted to set up an environment where I could develop on a “development” server and then “promote” to production. Recently, I began to construct a VirtualPC server to provide the development server function. This server uses Microsoft Windows XP (that’s the OS I know best). I installed Apache 2.2, which listens on port 80 just like any web server, and Tomcat 6.0, which listens on port 8080. I use mod_jk to hook up the two application servers so that, to the end user, all web applications look like they’re in the same space.
I’m programming offline against this VirtualPC (yes, it is very responsive this way). I modified my Hosts file so client applications/browsers always think they’re talking to “www.slholmes.org” even though they’re talking to a computer running locally in a virtual machine. (VirtualPC networking was tricky, but it boiled down to setting up Microsoft’s Loopback Network Adapter.)
Once I got some sample apps (including WordPress) running under both PHP (Apache) and Java (Tomcat), I realized that the applications I planned to deploy would all have security accounts, authentication and authorization requirements. I dug a little and found Josso - Java Open Single Sign-On. Josso is a security system divided into two main components. The first is a Gateway, a web service that runs under Tomcat. The other is a set of Agents that “front-end” web applications to provide authentication and authorization services.
Today, after much configuration, head scratching and research, I successfully logged into a sample PHP web application, then switched over to a protected Tomcat application. Josso did the heavy lifting, realized I was already logged in and presented the protected Tomcat application without requiring me to log in again under some other context. This is two completely different webapp technologies using the same security engine.
Next steps. As I mentioned, WordPress uses MySQL as the backend. WordPress also has a fairly elaborate “Account” sub-system. I’m going to attempt to hook up Josso to the WordPress authorization sub-system and see how far I can get. I would like to offer up “accounts” for registered users so they can write their own blogs and work with the other web applications all on the same system. We’ll see how far I can get.
Where am I?
Recently, I configured my Blackberry and a variety of software to let my friends and family know approximately where I am at any given moment. I tried a variety of applications and techniques and settled on those that work best for me. Automated location tracking is still in its infancy, but already many applications are up and running and providing interesting, useful services.
In the very near future, my hope is that my Blackberry might make suggestions for a good restaurant in my area or might let me know that one of my friends is nearby. After some initial exploratory research, I found many services are already set up to do exactly this. Most of them are just getting started so I lowered my expectations a little and set out to get just some basic functionality happening.
When setting up automated location tracking for the first time, it helps if you already have a very good idea of what you want to accomplish. Then it is simply a matter of finding the right tools to get it all working. In my case, I simply wanted to update my friends and family on my whereabouts while keeping the necessity of joining yet more social networks to a minimum. I came up with a small list of tasks I needed to research, install and configure:
- My Blackberry would get my current location from its built-in GPS.
- My Blackberry would send my current location to the Interweb, somehow.
- My status on FaceBook would be updated with my current location.
- A pretty little map on my Blog would be updated.
Naturally, the idea would be that I didn’t touch the Blackberry to do any of this. It just happened and it would be timely and accurate.
If you look at the list, you can divide the tasks into two processes. First, my location is detected and pushed to a central location somewhere on the web. Second, my location is pulled from that central location and web sites are updated. That central location turned out to be Yahoo’s Fire Eagle service, which I’ll discuss in more detail later in this post.
Pushing Your Location to Fire Eagle
Installing and configuring the web applications required to accomplish the four little items listed above is not difficult, but it does involve following a number of steps for each application - an orchestration, if you will. The applications require no tender exchange, i.e. they’re free! All you need is a BlackBerry with GPS and the Interweb.
Blackberry Location Tracking from GPS using MoosTrax
Most models of Blackberry have a built-in GPS along with one or two built-in GPS applications. You may have already played around with yours a little. As a matter of fact, it helps to get one of the built-in applications up and running if for no other reason than to make sure the GPS is fully functional. Later on, if you are trying to troubleshoot some weirdness, you can go into the built-in application first and make sure the GPS is still operational.
In order to send location information to the Interweb, I wanted just a simple program that runs on a Blackberry, pings whatever satellite is in my area and sends my current geographical location to the web periodically. MoosTrax does this very well and has just the right amount of functionality. It works for the most part in the background, i.e. hands off. To install MoosTrax, use your Blackberry to navigate to http://moostrax.com/ota and download MoosTrax’s BlackBerry application.
After the installation, you’ll probably go looking for the MoosTrax application. LOL! There isn’t one, at least not on the Blackberry Curve. Instead, you will use BlackBerry’s Options application to configure a small number of MoosTrax settings.
So, if you’re like me, you’re already probably confused and ready to throw in the towel, right? Fortunately, detailed instructions are available at MoosTrax’s support site. Follow those instructions carefully. After installation, you’ll need to link up with your account on MoosTrax web site and make some changes to the Blackberry MoosTrax settings.
If you have trouble connecting to your account at moostrax.com, turn off the Connect with BES option. In order to save your BlackBerry’s battery, change MoosTrax to update every 240 seconds (4 minutes) and only when your location changes more than 260 meters. These settings send a good amount of information to your MoosTrax account and you’ll be able to view fairly accurate plots of trips you might take.
Once you successfully log in with the MoosTrax Blackberry application, you’ll be able to see your location on the web when you log into your MoosTrax.com online web account. It is at that site where you will connect the MoosTrax application with Yahoo’s Fire Eagle application.
Yahoo’s Fire Eagle
Fire Eagle is Yahoo’s centralized “hub” for location data. An application on the web or in our case, the BlackBerry, updates Fire Eagle with location information. Fire Eagle then routes that information to other applications on the Web.
Fire Eagle really has two major objectives - to make it easy for people to capture, manage and use their location all over the web and to make it easy for developers to build applications that can use that location without having to do all kinds of incredibly complicated work building updaters and working with very complicated geo data.
- Tom Coates, Head of Yahoo’s Fire Eagle project
Fire Eagle provides a nice friendly way to manage all the web applications out there that work with your location data. For each service you subscribe to you may specify how much or how little the application can know about your whereabouts.
After creating an account with Fire Eagle, return to MoosTrax and navigate your browser to the Extras section of your account. You’ll see an option there to configure MoosTrax with Fire Eagle. Follow the instructions and test it all out. If you use the settings as I recommend, MoosTrax will only update Fire Eagle once every four minutes and then only if you have moved more than 260 meters.
When you’re testing things, be sure to spend plenty of time on your accounts at both MoosTrax and Fire Eagle. Make sure you understand why MoosTrax is doing what it does to avoid confusion later on. Also, MoosTrax has a fairly active online support forum for those times when you just aren’t sure about what you’re seeing. I found it is best to be patient. Monitor your progress over several days and make adjustments to everything as you go.
Pulling Your Location from Fire Eagle
Once you are successfully “pushing” your location information to Fire Eagle, you can start using that information to do interesting things by using one of the many web applications listed in Fire Eagle’s Application Gallery. I’ll describe two applications that perform items 3 and 4 from my little list at the start of this post.
Geoupdater - Status Update for FaceBook
Geoupdater is a great little web application that posts a simple status message to your FaceBook account. The status includes a link to a map of where you are. When I connected Geoupdater to Fire Eagle, I set the Read Level setting to my current city because I didn’t want to bombard my friends and family with useless location information. I pass through a small number of city level locations on my way to and from work so my updates to FaceBook are relatively few.
My co-workers are hooked into FaceBook as well, so now they know if I’m at the office or at home. I love the fact that my friends, family and co-workers know approximately where I am at all times. They love it too! When I go on a trip outside my region, things get even more exciting! (Actually, most people probably just hide my status. And I know who they are, too.)
blogloc - Embeddable Google Map for your Blog
blogloc provides a great little service for your blog. After you create an account with their online web site, you can hook up blogloc with Fire Eagle. Then you visit their FAQ page where you can get a custom copy/paste code block for your blog’s template. I spent a little time with mine and got just the right size and zoom level so that my readers (you!) can know approximately where I am at the moment.
blogloc’s embeddable map uses Google Maps and is highly interactive. You can zoom all the way down to street level if you like. However, I have the Fire Eagle to blogloc connection set to city level so the map only shows some random point in my vicinity and not my precise location.
So, Now What?
I’ve played around with a variety of other web applications, and I’m not too terribly sure about most of them. I do feel like I am prepared for the next big thing once it comes along. Friends on Fire seems really nice, but the thumbtack that shows my location on their map looks pretty lonely - no friends.
As a matter of fact, most of the web sites, services and other applications using location data have little to no content. And where there is some information about a restaurant or some other place of interest nearby, the information usually looks computer generated and lacks links to more info. This is sure to change as the Semantic Web takes off and makes searching for things far more sophisticated.
Hey! The Semantic Web and Geo Location working together!