Monday, September 06, 2010

Windows Azure and Cloud Computing Posts for 9/4/2010+

A compendium of Windows Azure, Windows Azure Platform Appliance, SQL Azure Database, AppFabric and other cloud-computing articles.

Update 9/6/2010 (Labor Day): New items marked
• Update 9/5/2010: New items marked

Note: This post is updated daily or more frequently, depending on the availability of new articles in the following sections:

To use the above links, first click the post’s title to display the single article you want to navigate.


Cloud Computing with the Windows Azure Platform was published on 9/21/2009. Order today from Amazon or Barnes & Noble (in stock).

Read the detailed TOC here (PDF) and download the sample code here.

Discuss the book on its WROX P2P Forum.

See a short-form TOC, get links to live Azure sample projects, and read a detailed TOC of electronic-only chapters 12 and 13 here.

Wrox’s Web site manager posted on 9/29/2009 a lengthy excerpt from Chapter 4, “Scaling Azure Table and Blob Storage” here.

You can now freely download by FTP and save the following two online-only PDF chapters of Cloud Computing with the Windows Azure Platform, which have been updated for SQL Azure’s January 4, 2010 commercial release:

  • Chapter 12: “Managing SQL Azure Accounts and Databases”
  • Chapter 13: “Exploiting SQL Azure Database's Relational Features”

HTTP downloads of the two chapters are available at no charge from the book's Code Download page.


Tip: If you encounter articles from MSDN or TechNet blogs that are missing screen shots or other images, click the empty frame to generate an HTTP 404 (Not Found) error, and then click the back button to load the image.

Azure Blob, Drive, Table and Queue Services

•• Erik Ejlskov Jensen pointed me to Chris J. T. Auld’s Windows Azure Drives + SQL Compact 4: Part #1 post of 8/20/2010 that explains how to deploy SQL Server Compact 4.0 CTP 1 to Windows Azure storage with the *.sdf file on a Windows Azure Drive. This post is part of the answer to whether Ben Cull’s SuperQ (see post below) will run under Windows Azure.

Here’s the start of Chris Auld’s article:

So there’s not a whole lot of stuff on the interweb that gets you started with drives, and SQL Compact 4 is only just out into the open. My need was for a lightweight database for a sample app I’m building with Chris Klug for our session at Tech Ed New Zealand.

I thought I’d run quickly through how I got things going.

Getting the Bits
I grabbed the bits using the Microsoft Web Platform Installer. I wanted two things: 1) SQL Compact 4 CTP1 and 2) the Web Matrix tool.
It’s hopefully fairly obvious why I needed SQL Compact. The reason I wanted Web Matrix is because it’s currently the only tool I could find that will allow me to create and edit SQL Compact databases.


I’ll be using Visual Studio for much of this post; I’m guessing y’all know how to get your mittens on that now. …
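
Chris’s full walkthrough is worth reading; the core of the approach is to mount a Windows Azure Drive (a VHD stored as a page blob) and point the SQL Compact connection string at an *.sdf file on the mounted volume. Here’s a minimal sketch of that pattern using the StorageClient CloudDrive API from the current SDK; the local resource name, drive path and database filename are my placeholder assumptions, not Chris’s actual code:

// Rough sketch (not Chris's code): mount a Windows Azure Drive and open a SQL Compact
// database stored on it. Assumes a local resource named "DriveCache" in the service
// definition and a "DataConnectionString" setting that points at a storage account.
using System.IO;
using System.Data.SqlServerCe;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.ServiceRuntime;
using Microsoft.WindowsAzure.StorageClient;

public static class SqlCompactOnAzureDrive
{
    public static SqlCeConnection Open()
    {
        // CloudDrive needs a local disk cache before any drive can be mounted.
        LocalResource cache = RoleEnvironment.GetLocalResource("DriveCache");
        CloudDrive.InitializeCache(cache.RootPath, cache.MaximumSizeInMegabytes);

        var account = CloudStorageAccount.Parse(
            RoleEnvironment.GetConfigurationSettingValue("DataConnectionString"));

        // The drive is a VHD held in a page blob; create it on first use.
        CloudDrive drive = account.CreateCloudDrive("drives/sqlce.vhd");
        try { drive.Create(64); } catch (CloudDriveException) { /* already exists */ }

        // Mount returns the drive letter (e.g. "a:\") the VHD is surfaced on.
        string root = drive.Mount(cache.MaximumSizeInMegabytes, DriveMountOptions.None);

        var connection = new SqlCeConnection("Data Source=" + Path.Combine(root, "app.sdf"));
        connection.Open();
        return connection;
    }
}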

The TechEd New Zealand session that “Chris Squared” presented on 9/1/2010 was:

  • WPH401 From Phone Zero to Phone Hero in 60 minutes
  • Track: Windows Phone Development
  • Speaker(s): Chris Auld, Chris Klug
  • Time: Wednesday, September 1 14:55 - 15:55 Theatre

Join Chris Squared (Chris Auld & Chris Klug) from Intergen as they build a real Windows Phone 7 app from scratch in 60 minutes. This is a hard core, dual data projector, coding marathon. Chris and Chris will build a Windows Phone 7 series application including Windows Azure hosted push notifications, a rich Silverlight UI and partner integration. You’ll see how to take an idea from concept to the Windows Phone Marketplace in just an hour. In other words, not your general 'Hello World' application.

Chris Auld also presented COS200 The Business of Cloud Computing, COS202 Next Generation Business Applications in the Cloud with Microsoft Dynamics CRM 2011, COS304 Azure Storage Deep Dive, and DEV307 Taking the Covers off Dynamics CRM 2011. Chris was a bit busy during those two days.

Chris and Chris presented WPH401 as DEV359 at TechEd Australia 2010 at the end of August. There’s a 00:03:11 video interview with Chris Auld at TechEd New Zealand 2010 here, but neither TechEd Online New Zealand nor Australia has the video of the Phone Zero presentation. Very strange.

Chris Klug’s illustrated preview post about the Phone Zero to Phone Hero in 60 minutes session is here. Zero To Phone Hero - Key points - part 1 is a post-TechEd article with code snippets.

Chris Klug said in response to a comment:

I have put some of my code in a blog post now. We will however not be publishing all the code as the application is going to go public and be put on the Windows phone 7 Marketplace as soon as it launches...


•• John Waters announced Our [Falafel Mobile’s] Journey to the Cloud Begins in this 9/5/2010 post:

Today, we embarked on our journey into the Cloud, more specifically the Windows Azure Cloud. This journey goes hand in hand with our venture into the Mobile market. Mobile apps and Cloud services make a perfect match. The devices roam across the face of the planet, running a variety of Operating Systems, from the Apple iPhone and iPad devices in all their incarnations, to the slew of Android machines hitting the market, and now the upcoming Windows Phones.

Falafel Software is writing software for all these environments, and now we are writing services for them as well. We will be providing a range of services to support our mobile offerings, including SQL Azure for database storage, REST WCF services, BLOB file storage in the Cloud, and Notifications.

It appears to me that Windows Phone 7 will lead many developers to take advantage of Windows Azure storage.

Ben Cull proposed SuperQ – A Portable Persistent .NET Queue on 9/6/2010 (local date/time):

SuperQ is much like the Queue Storage for Windows Azure, except it automatically creates a local db instead of using the cloud. It is also self-polling, so all you need is a callback method and you’re ready to go. This allows you to create quick and easy task schedulers and background worker processes.

Step #1 – Ensure you have SQL Compact Edition 4 Installed

To let SuperQ build a database automatically in your App_Data folder, we have opted to use SQL Compact Edition 4.

Use the Web Platform Installer to download and install SQL Compact Edition 4. If you’ve never used Web Platform Installer, I strongly urge you to try it, it makes installing common software a breeze.

Otherwise you can download SQL Compact Edition 4 directly from here: http://www.microsoft.com/downloads/details.aspx?FamilyID=0d2357ea-324f-46fd-88fc-7364c80e4fdb&displaylang=en

Step #2 – Reference the SuperQ Library

Download and reference the SuperQ library from here: http://github.com/downloads/bjcull/SuperQ/SuperQ_v0.1.dll

If you’re interested in the inner workings of the queue, or would like to help out, please feel free to jump over to http://github.com/bjcull/SuperQ and grab the latest source code for yourself.

On a side note, either me or lukencode will probably be doing a blog post soon on how to implement some of the features that SuperQ makes use of, such as Async Callbacks and SQL Compact DB Creation.

Step #3 – Use the Queue

You now have access to one of the most abbreviated and super of queues known to man. Here are a few common methods to get you going:

  • Create or Open a Queue

[Code screenshot: creating or opening a queue]

This will create a queue in your data folder (Usually App_Data) with the name provided. If it already exists it just opens it. Sorry about the namespace, it might be changed in future.

  • Push a message onto the queue

[Code screenshot: pushing a message onto the queue]

Yep, you can push whatever class you like onto the queue and you’re guaranteed to get it back.

  • Get a message off the queue

[Code screenshot: getting a message off the queue]

Keep in mind that SuperQ is designed to use only one type at a time per queue. Please also remember to delete the message from the queue once you have successfully processed it. See below.

  • Delete the message from the queue

[Code screenshot: deleting the message from the queue]

To make sure that the queue message will survive even if your program crashes, you must delete the message from the queue within 30 seconds (soon to be an option); otherwise the message will become available to other GetMessage calls.

  • Automatically receive a callback when a message arrives

If you do not want to poll the queue manually, you can setup a callback method that will be called each time there is a message waiting to be processed.

First create the callback method like so:

[Code screenshot: the callback method]

Now you can start and stop the callback like so:

[Code screenshot: starting and stopping the callback]

Can I have an example of all of the above?

No problem imaginary heading friend, SuperQ can help. Here’s a quick Console App combining all the above methods:

[Code screenshot: a console app combining the methods above]
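
Ben’s code screenshots didn’t survive the trip to this compendium. For comparison, here’s what the same create/push/get/delete cycle looks like against the Windows Azure Queue storage API that SuperQ models itself on; this is ordinary StorageClient code from the Windows Azure SDK (not SuperQ’s own API), and it shows the same visibility-timeout behavior Ben describes:

// Equivalent cycle against Windows Azure Queue storage (StorageClient), for comparison with SuperQ.
using System;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

class QueueComparison
{
    static void Main()
    {
        // Development storage here; substitute a real connection string for the cloud.
        CloudStorageAccount account = CloudStorageAccount.DevelopmentStorageAccount;
        CloudQueueClient client = account.CreateCloudQueueClient();

        // Create or open the queue.
        CloudQueue queue = client.GetQueueReference("orders");
        queue.CreateIfNotExist();

        // Push a message onto the queue.
        queue.AddMessage(new CloudQueueMessage("Order 1 for Contoso"));

        // Get a message; it stays invisible to other readers for the visibility timeout.
        CloudQueueMessage message = queue.GetMessage(TimeSpan.FromSeconds(30));
        if (message != null)
        {
            Console.WriteLine("Processing: {0}", message.AsString);

            // Delete it once processed, or it reappears after the timeout expires.
            queue.DeleteMessage(message);
        }
    }
}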

Is this a potential replacement for Azure queues? There’s no indication in the SQL Server Compact 4.0 CTP documentation or blogs that it’s deployable to Windows Azure.

•• See the preceding post for the answer to whether SQL Compact 4 CTP 1 is deployable to Windows Azure.


<Return to section navigation list> 

SQL Azure Database, Codename “Dallas” and OData

Darryl Miller posted the following tweet on 8/24/2010 (missed when posted):

[Tweet screenshot]

Others have said “LightSwitch is to {Silverlight | ASP.NET | Visual Studio} what MS Access is to database applications” but most didn’t mean it as a compliment.


• James Coenen-Eyre (@bondigeek) posted on 9/5/2010 WCF Data Services, OData & jQuery. If you are an ASP.NET developer you should be embracing these technologies…:

… and even if you are not an ASP.NET developer you should.

I have talked a lot of late about WCF Data Services, OData and jQuery and with good reason. These technologies have caused a radical shift in how I develop for the web. So much so that the concept of traditional asp.net data binding is a distant memory now.

Now don’t get me wrong: in the early days of asp.net, data binding was a bit of a godsend, but it also had its limitations of being tied pretty heavily to the server.

I now find myself being far more conscious of the end user experience and wanting to push as much processing on to the client as possible and it is the likes of WCF Data Services, OData and jQuery that facilitate this and make it a truly pleasant experience.

If I am boring you with all my talk of OData, I make no apologies. It’s simple, elegant and yet powerful. You cannot downplay or underestimate the achievement that the folks at Microsoft have pulled off. Pop on over to the odata.org site and check it out. It is well documented.

Now of course jQuery is not a Microsoft effort but they have embraced it and I tip my hat to them for doing so. jQuery is your friend, use it, use it wisely and allow it to open up your world as an asp.net developer.

One of the lessons I have learned over the past decade is not to stick to one technology or technology provider. Sure I have my bias (and that is Microsoft) but I don’t go through life with blinkers on. I attribute that attitude to my good friend Peter in Germany who I remember at the age of 15 (Peter is at least 10 years my senior) telling me at the time to open up my horizons musically as I had Genesis on high rotation. Cut me some slack folks that was back in 1980 Smile

Back to the subject at hand and the powerful combination of all these technologies. Here is the full list, which might look daunting to a junior, but these are the strings you will need in your bow to develop real-world, scalable applications (with a bias for Microsoft. No LAMPs here folks):

  • SQL Server
  • LINQ
  • ASP.NET
  • C#
  • OData Protocol
  • WCF Data Services
  • ADO.NET Entity Framework
  • jQuery
  • HTML
  • CSS

Yes it’s a pretty lengthy list and reads more like a shopping list for potential employers but these are things you should be familiar with. If you are not and you develop for the web then I would encourage you to explore them.

Case Study

So let’s take a look at a real-world example of tying all of these technologies together.

The sample application below manages a list of products. You can add, edit, delete or disable a product. Now although the sample is hosted in an iframe so you may not notice, there are no traditional post backs going on here at all. All interaction with the server is via ajax calls facilitated by jQuery.

In addition, the list is paged and there is no traditional asp.net data binding going on here at all. All the data access and paging is being performed on the server; however, rendering the UI and accessing the data is all being handled by javascript using jQuery on the client.

The diagram below shows how all the different parts relate to each other. This follows a traditional client-server type application in that everything from the WCF Data Service layer down is on the server and the rest is client-side code. The beauty of this is that we can very easily create a new client application (in Silverlight perhaps) to talk to the backend.

[Architecture diagram]

Advantages

So what are the advantages we are getting out of architecting our application this way?

  • Performance – The application is very responsive to the client. We let the client browser handle all the rendering of the application and we only receive or transmit the data that we need to between the client and server.
  • Standards – The data is returned to the client as json so we can deal with this however we like in a standard way. We are not limited to json, we can also return the data as xml. Try out this link in your browser – http://samples.bondigeek.com/services/PagerService.svc/Products. You should get back the list of products as an RSS feed.
  • Data Modelling made easy – It could not be easier. Create your database tables, configure relationships, drag and drop on to a new ADO.NET Entity Framework Data Model and boom, you’re done. (See the next point.) If our model changes, fine, we just alter the database, refresh the model and we are good to go.
  • Reduction in code – Once we bind our model to our service exposing the data is only a few lines of code. The snippet below is all we need to expose our Products and enable querying them. The QueryInterceptor is an additional feature that I have added that will automatically filter out items marked as deleted when I execute additional queries. (See the next point)
public class PagerService : DataService<PagerSample.services.BlogSamplesEntities>
{
    // This method is called only once to initialize service-wide policies.
    public static void InitializeService(DataServiceConfiguration config)
    {

        config.SetEntitySetAccessRule("Products", EntitySetRights.AllRead);

        config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V2;
    }

    [QueryInterceptor("Products")]
    public Expression<Func<Product, bool>> OnQueryProducts()
    {
        return lo => lo.IsDeleted == false;
    }

}
  • Extensibility – One of the beauties of web development is the ability to change things very quickly without the need for recompilation. This solution is no different and also applies to WCF Data Services. We can query the data from the client directly from a URL thanks to the OData Protocol. The javascript below does all the data retrieval, paging, sorting and binding of the data. The key point to note here is the URL being constructed. When the page is initially rendered this is the URL that gets executed: “/services/PagerService.svc/Products?$orderby=IsDisabled asc, ProductName asc&$skip=0&$top=10&$inlinecount=allpages”. This handles everything for us: the sorting and the paging. We are driving this from the client but executing it on the server. No need for any state to be maintained on the server.
function GetProducts() {

    var skip = "&$skip=" + ((productsPage - 1) * pageSize);
    var top = "&$top=" + pageSize;

    $("#products-table tbody").children().remove();

    var url = "/services/PagerService.svc/Products?$orderby=IsDisabled asc, " + currentProductSortColumn + " " + currentProductSortDirection + skip + top + "&$inlinecount=allpages";

    $.ajax({
        type: "GET",
        url: url,
        async: false,
        contentType: "application/json; charset=utf-8",
        dataType: "json",
        success: function (msg) {
            totalProducts = msg.d.__count;
            products = msg.d.results;
            var rows = "";
            for (var i = 0; i <= products.length - 1; i++) {

                var className = "odd-row";
                if (i % 2 == 0) {
                    className = "even-row";
                }

                var buttons = "<a href='#' class='sprite edit-small edit-product-category' productcategoryid='" + products[i]["ProductId"] + "'/>";
                if (products[i]["IsDisabled"] == false) {
                    buttons += "<a href='#' class='sprite disable-small enable-product-category' productcategoryid='" + products[i]["ProductId"] + "'/>";
                }
                else {
                    buttons += "<a href='#' class='sprite enable-small enable-product-category' productcategoryid='" + products[i]["ProductId"] + "'/>";
                    className += " disabled";
                }
                buttons += "<a href='#' class='sprite delete-small delete-product-category' productcategoryid='" + products[i]["ProductId"] + "'/>";

                rows += "<tr><td class='" + className + "'>" + products[i]["ProductName"] + "</td><td class='" + className + "'>" + products[i]["ProductTitle"] + "</td><td class='" + className + " select-column'>" + buttons + "</td></tr>";
            }
            $("#products-table tbody").append(rows);
            ApplyProductClickHandlers();

            if (totalProducts > pageSize) {
                $("#product-pager").children().remove();
                $("#product-pager").append(PagerHtml(totalProducts, productsPage));

                $("#product-pager a").click(function () {
                    if ($(this).hasClass("next")) {
                        productsPage = Number(productsPage) + 1;
                    }
                    else if ($(this).hasClass("previous")) {
                        productsPage = Number(productsPage) - 1;
                    }
                    else {
                        productsPage = $(this).attr("page");
                    }
                    GetProducts();
                });                

                $("#product-pager").removeClass("hidden");
            }
            else {
                // Hide the pager when everything fits on one page.
                $("#product-pager").addClass("hidden");
            }
        },
        error: function (xhr) {
            ShowError(xhr);
        }
    });

}
  • Dynamic SQL and LINQ – No need to write another stored procedure (unless you really need to for performance). Once you get your teeth into the LINQ way of doing things you will rarely go back. The beauty of this is that if your model changes, and changes to the extent that it would break something, you will know at compile time, not the next time someone executes that once-in-a-blue-moon stored procedure you created and forgot to update, causing your site to bomb out. Take for instance the example below; if I decided to change a field name or even a table, then the code below would not compile and I would know straight away.
[WebGet]
public void EnableDisableProduct(int productCategoryId)
{
    using (BlogSamplesEntities ctx = new BlogSamplesEntities())
    {
        var item = (from p in ctx.Products
                    where p.ProductId == productCategoryId
                    select p).SingleOrDefault();

        if (item != null)
        {
            item.IsDisabled = !item.IsDisabled;
        }

        ctx.SaveChanges();
    }
}
  • Lean and Clean HTML – Although not a product of the technology we are using, combining all of these technologies together does make it very easy for us to write Lean and Clean mark-up with none of the usual asp.net mechanisms getting in the way. We have full and complete control of our mark-up.

Well I am going to conclude this post for now and I hope I have given you a taste of what makes these technologies so important and why as a Web Developer embracing them will make your life a whole lot easier.


• Lynn Langit (@llangit) posted her SQL Azure [Slide] Deck – Updated for September 2010 on 9/5/2010:

Here’s my SQL Azure deck for September presentations – updated with all August 2010 release changes.  I’ll be presenting this several times in September and October and will post the deck recording later during this time period.  Enjoy!


View SQL Azure September 2010 on SlideShare


• Fabrice Marguerie explained Sesame: Microsoft Dallas support in a 9/3/2010 post about the latest Sesame upgrade:

With the release of Microsoft Codename "Dallas" CTP3 and the features announced in the previous post, it became easier to add support for Dallas to Sesame Data Browser.

The new version of Sesame published today allows you to browse Dallas datasets, as demonstrated below.

Creating a connection to a Dallas dataset in Sesame is easy. You just need a dataset URL and an account key.

The Dallas portal

In order to get an account key and URLs, you need to visit https://www.sqlazureservices.com and sign in with a Windows Live ID. You'll then be able to subscribe to datasets, such as AP Online (Associated Press) or business information from InfoGroup:

Dallas Subscriptions

Note: Only CTP3 services are supported in Sesame.

Once you have a subscription, you can visit its preview page. This is where you'll find the URL you'll use in Sesame.
Here, for example, is the page for AP Online:

Dallas URL AP Online

The URL for AP Online is highlighted in blue on the above picture.

Your account key is available on the "Account keys" page:

Dallas Account Keys

Dallas in Sesame

Now that you have an account key and a URL, you can create an OData connection in Sesame:

Sesame Dallas Connection

Here is the result for AP Online if you click on GetBreakingNewsCategories:

Sesame Dallas GetBreakingNewsCategories

Copy a category ID and click on GetBreakingNewsContentByCategory.
Paste the category ID in categoryId and type 5 for count:

Sesame Dallas GetBreakingNewsContentByCategory

After clicking on Open, you'll get data in a grid, as usual:

Sesame Dallas GetBreakingNewsContentByCategory data

You can also locate the news items on a map:

Sesame Dallas Map

Known bugs
  • A bug in Dallas CTP3 prevents the "Load more" button from working correctly in all cases.
  • Dallas CTP3 does not support all filter operations. Namely, EndsWith, StartsWith, Contains, DoesNotContain and IsContainedIn don't work with Dallas datasets.

Please give this new version a try. As always, your feedback and suggestions are welcome!


• Infragistics explained Building the oMovies Browser: Netflix oData using Infragistics NetAdvantage Silverlight Controls in this 10/2010 tutorial:

The oMovies Browser is a small project demonstrating how to interface with the Netflix oData catalog using Infragistics Silverlight controls. The oMovies project makes use of the xamGrid, xamSlider and Infragistics Virtual Collection class to facilitate paging.

In the browser you are able to navigate through the list of Netflix genres and view the associated titles for that genre:

Genres

You are also able to browse through the highest rated titles for a given year:

Genres

… The anonymous author continues with a detailed, fully illustrated tutorial. …

Code Download

If you’d like to dive ever further into the oMovies browser, you can download the code here.


BrowserBoy posted Q&A: SQL Server business intelligence meets the cloud with Mark Kromer on 9/4/2010:

In the first edition of our SQL in Five series, Microsoft data platform technology specialist Mark Kromer answers five questions on the state of cloud-based business intelligence. Read on for Mark’s thoughts on what “BI in the cloud” really means, the potential of SQL Azure and how Microsoft’s PowerPivot technology fits into the mix.

Gartner recently noted a general lack of understanding amongst IT professionals when it comes to “BI in the cloud.” How would you define it?

Mark Kromer: Interestingly, Gartner’s definition states that only one portion of an analytical application needs to be in the cloud to be classified as “BI in the cloud.” To me, this means that the basic value proposition of BI is the same, which I typically define as allowing knowledge workers in your business access to knowledge from data that enables them to make better business decisions. The only difference is that perhaps it is the data warehouse, or a data source, or the analytical engine that is in the cloud.

A present state Microsoft example would be for an analyst in your business to open Excel 2010 with the PowerPivot add-in and connect to a cloud-based SQL Azure database. The analyst would then use the PowerPivot in-memory analysis services engine to perform mining and analytics on that data set from within Excel.

Where would you say the most potential lies in terms of the benefits of cloud-based BI? It has to be scalability, right?

Kromer: When I read the question as “potential” for cloud-based BI, I think of the idea that Microsoft has spoken about for several years now, which is making BI pervasive and available to your entire organization through self-service BI.

In terms of scaling to a larger number of users — yes, I agree. But in terms of back-end processing and data warehousing, I think there are still a number of performance concerns to address in the cloud. Most of the data sets that I can think of for cloud-based BI at this time would probably entail providing answers to business questions that are focused on departmental-sized data sets as opposed to multi-terabyte data warehouses and multi-terabyte hosted OLAP cubes.

Microsoft recently upped the database storage capacity for SQL Azure Business Edition from 10 GB to 50 GB. How important is this increase in regards to BI applications and data warehouses?

Kromer: Well, 50 GB is still pretty small for a data warehouse. But in terms of using that as a data source for PowerPivot — that is very important to enable cloud-based SQL Server for many more workloads. SQL Azure databases are not exact replicas of the SQL Server database engine, and they can be used with an on-premises version of SQL Server Integration Services. But from my perspective, 50 GB is a bit small for most data warehouses that I see today.

PowerPivot has SQL Azure functionality built-in, and Microsoft demoed some of those cloud capabilities at TechEd this year. What are some of the things folks can do with PowerPivot and Azure right now, and what are the benefits?

Kromer: Since SQL Azure does not currently facilitate SQL Server Analysis Services, using PowerPivot is a great option for analysis on a SQL Azure database. PowerPivot uses a local in-memory version of Analysis Services, so you can load up your Excel 2010 spreadsheet with data from Azure and run analyses and pivots right from Excel with PowerPivot.

My first attempt at SQL Azure was to create a small database that I used to build a sample ASP.NET webpage as my data source. Since you can connect to SQL Azure as a data source, that allows you programmatic access from technologies like ASP.NET as a SqlDataSource. There are a number of companies that I am talking to who are looking at database consolidation projects and are running trials with SQL Azure by migrating smaller database workflows over to SQL Azure databases to see if that will provide the cost savings they are looking for.
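
(For readers who haven’t tried it: programmatic access to SQL Azure really is just ordinary ADO.NET with a cloud-style connection string. The sketch below is illustrative only; the server, database and credential values are placeholders.)

// Minimal sketch: querying a SQL Azure database with plain ADO.NET.
// Server, database, user and password values are placeholders.
using System;
using System.Data.SqlClient;

class SqlAzureSample
{
    static void Main()
    {
        const string connectionString =
            "Server=tcp:yourserver.database.windows.net;" +
            "Database=SampleDb;" +
            "User ID=youruser@yourserver;" +
            "Password=yourpassword;" +
            "Trusted_Connection=False;Encrypt=True;";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("SELECT TOP 10 Name FROM Products", connection))
        {
            connection.Open();
            using (SqlDataReader reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    Console.WriteLine(reader.GetString(0));
                }
            }
        }
    }
}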

These features are being pegged as having business value right now, but cloud deployments have started slowly – particularly with smaller organizations. What are your thoughts on the current state of cloud computing in regards to BI and SQL Server in general?

Kromer: The adoption rates for cloud computing customers of Microsoft, Amazon, Google, and so on continue to grow. The rate of adoption that we have seen was very much the expectation, where smaller pockets of business workloads from teams that did not have dedicated IT staff and budget were the vanguards in their companies. That resourcefulness demonstrated an agility and flexibility that CIOs and IT departments are taking notice of and are beginning to build business cases that take into account the cost savings of cloud computing, while factoring in the culture shift that comes from giving up complete data center control.

In terms of SQL Server and Microsoft BI, the SQL Azure product team at Microsoft is benefiting from large R&D investments. The cloud business model at Microsoft is going to continue to grow and cloud versions of other areas of the Microsoft data platform, including the BI components, will adopt more and more of the cloud computing paradigm.

Editor’s note: For more from Mark, check out his blog at MSSQLDUDE.

Mark Kromer has over 16 years of experience in IT and software engineering and is a leader in the business intelligence, data warehouse and database communities. Currently, he is the Microsoft data platform technology specialist for the mid-Atlantic region, having previously been the Microsoft Services senior product manager for BI solutions and the Oracle principal product manager for project performance analytics.


D[oug] Rehnstrom started a thread that asked about Microsoft Sync Framework Power Pack for SQL Azure on 8/30/2010 in the SyncFx - Microsoft Sync Framework Developer Discussions forum (missed when posted):

Has there been an update to the Sync Framework Power Pack for SQL Azure since the November CTP? 

Liam Cavanagh replied on the same date:

No not yet.  We have updated the Sync Framework 2.1 that includes a SQL Azure Provider but have not updated the Power Pack UI.  We should have an update to this toward the end of this calendar year.

Liam is Senior Program Manager, SQL Azure and Sync Framework - http://msdn.microsoft.com/sync/. His group posted an updated Sync Framework 2.1 on 8/18/2010 (see Windows Azure and Cloud Computing Posts for 8/18/2010+.)


Bill McColl asked What do you need? Speed, Scale, Power, or all three? as a deck for his Cloud Supercomputing: Navigating The "Big Data" Space post of 9/4/2010:

Databases, Data Warehouses, Map Reduce, Hadoop, Realtime Data Warehouses,... What is the right technology for your "big data" analytics challenges? The best fit will depend on your specific requirements on three fronts:

  • Speed (How long can you afford to wait for the results of your big data processing? Is latency of several hours acceptable, or do you need analytics insight at the speed of your business transactions and events, i.e. in seconds?)
  • Scale (How much data are you gathering? How much do you need to process per second, per minute,  or per hour?)
  • Power (How complex do your analytics algorithms need to be? Do you need only simple summaries and statistics, or do you need powerful algorithms that can carry out deep analytics? Is simple batch processing enough, or do you need the power of continuous, always-on analytics?)

Cloud supercomputing is all about big data. Knowing where you need to be on these three axes is the key step in figuring out the right technology solution for your needs.

Watch the video here.


Pablo Castro answered on 9/3/2010 a question from Dan Dyck on the OData Mailing List about JSON OData examples, which I’ve reformatted below:

Hi Pablo,

Thanks for the [previous] reply.

I have actually read that link (a few times). I did also find these links: http://www.odata.org/developers/protocols/operations and http://www.odata.org/media/6655/[mc-apdsu][1].htm, which I didn't feel was enough for me to hit the ground running.

More specifically, what I'm looking for are real examples of PUT, POST and DELETE. Also, here are some examples I found in the second link: One method says I can do it this way:

POST /service.svc/Customers HTTP/1.1
Host: host
Content-Type: application/json
Accept: application/json
Content-Length: nnn

{
  "__metadata": { "uri": "Customers(\'ASDFG\')" },
  "CustomerID": "ASDFG",
  "CompanyName": "Contoso Widgets",
  "Address": { "Street": "58 Contoso St", "City": "Seattle" },
  "Orders": [
     { "__metadata": {"uri": "Order(1)"} },
     { "__metadata": {"uri": "Order(2)"} }
  ]
}
And the other says this way:
POST /service.svc/Customers HTTP/1.1
Host: host
Content-Type: application/json
Accept: application/json
Content-Length: nnn

{
  "CustomerID": "ASDFG",
  "CompanyName": "Contoso Widgets",
  "Address": { "Street": "58 Contoso St", "City": "Seattle" },
  "Orders": [
        {
           "OrderID": 1,
           "ShippedDate": "\/Date(872467200000)\/"
        },
        {
           "OrderID": 2,
           "ShippedDate": "\/Date(875836800000)\/"
        }
  ]
}

Note the missing "__metadata" in the last one. Are they both correct? Is there a preferred method?

Thanks again,

Dan


Pablo Castro replied:

I'll talk to folks to see what we can do around having better examples on JSON, maybe in more concrete terms using jQuery or something like that instead of just the HTTP requests/responses.

As for your specific question below, strictly speaking both are correct.

They have different meanings though; let me walk through a few aspects:

  • [T]he __metadata property is optional on insert. If you don't need to specify a type name then you can just leave this out altogether (e.g. it would be needed if there is inheritance in the model and the target set was of a base type that had subtypes defined).
  • [T]he first example inserts references to purchase orders. That is, it specifies links to existing orders, so the operation will create a customer entity and then link the two orders indicated in the payload to it.
  • [T]he second example is what we sometimes informally call a deep insert, where you send a whole tree that needs to be created on the server. The server is expected to decompose the tree into the independent entities, wire them up together as indicated through the navigation properties, and then insert the whole graph.
  • [N]ote that you can combine the two things. In a fancy operation you may be creating several nodes in a tree and also binding (linking) other nodes to existing entities.

Below I'm including minimal but actual working POST, PUT and DELETE requests against the public read-write demo service. You can try these yourself, feel free to reuse the same session and all (or use "(S(readwrite))" in the segment in a browser to get redirected to a fresh session).

Unfortunately the state of the art in cross-domain access from JavaScript will get in the way of trying most of this in JavaScript unless you host the page and service in the same domain.

-pablo

POST request:

POST http://services.odata.org/(S(ywikib3o4emvct3xt3sttaxq))/OData/OData.svc/Categories HTTP/1.1
content-type: application/json
accept: application/json
Host: services.odata.org
Content-Length: 26

{ ID: 99, Name: 'newone' }

PUT request:

PUT http://services.odata.org/(S(ywikib3o4emvct3xt3sttaxq))/OData/OData.svc/Categories(99) HTTP/1.1
content-type: application/json 
accept: application/json 
Host: services.odata.org 
Content-Length: 22 

{ Name: 'updatedone' }

DELETE request:

DELETE http://services.odata.org/(S(ywikib3o4emvct3xt3sttaxq))/OData/OData.svc/Categories(99) HTTP/1.1
accept: application/json 
Host: services.odata.org
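
If you’d rather issue these requests from .NET code than from Fiddler or a raw socket, a plain HttpWebRequest is enough. The sketch below reproduces Pablo’s POST example; the session segment in the URL is the one from the examples above and will eventually expire, so substitute a fresh one:

// Sketch: issuing the JSON POST above against the OData read-write demo service.
using System;
using System.IO;
using System.Net;
using System.Text;

class ODataJsonPost
{
    static void Main()
    {
        string url = "http://services.odata.org/(S(ywikib3o4emvct3xt3sttaxq))/OData/OData.svc/Categories";
        string body = "{ ID: 99, Name: 'newone' }";

        var request = (HttpWebRequest)WebRequest.Create(url);
        request.Method = "POST";
        request.ContentType = "application/json";
        request.Accept = "application/json";

        byte[] payload = Encoding.UTF8.GetBytes(body);
        request.ContentLength = payload.Length;
        using (Stream stream = request.GetRequestStream())
        {
            stream.Write(payload, 0, payload.Length);
        }

        using (var response = (HttpWebResponse)request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            // A successful insert returns 201 Created plus the new entity as JSON.
            Console.WriteLine("{0}: {1}", (int)response.StatusCode, reader.ReadToEnd());
        }
    }
}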


<Return to section navigation list> 

AppFabric: Access Control and Service Bus

Bibhore Singhal wrote Hybrid Cloud with Windows Azure Service Bus in 8/2010 as an HCL Technologies white paper. From the author’s introduction in this 9/6/2010 post to the HCL blogs:

Windows Azure Service B[us], also known as ‘App Fabric’, acts as a communication bridge between Cloud and On-Premise applications. It can connect applications to Cloud behind firewalls and network proxies. Besides connection, it is important to optimize the connectivity between Service B[us] and client applications for obtaining the best results.

Connection usage and correct connection mode are important parameters for optimizing the performance of Service BUS. Service response is directly proportional to the available network bandwidth. Poor local network infrastructure and incorrect combination of Service B[us] bindings can become a bottleneck that may degrade performance of the application.

A high-speed network connection with sufficient bandwidth at disposal for service is recommended to achieve maximum performance with Service B[us]. TCP-based bindings require some special ports to be opened while HTTP-based bindings do not have such requirements. Therefore, HTTP-based bindings are recommended if the underlying network firewall does not allow special ports to be opened.

It is a difficult task to debug a service hosted on Service B[us] since there is no direct control on the input data. However, some special coding constructs can help developers debug Service BUS implementations. It is imperative to analyze and discuss network limitations with the organization security and IT team before going for a Service B[us] implementation in the local network.

Download HCL’s Whitepaper: Implementation of Hybrid Application via Service BUS Whitepaper.

It’s not clear to me why Bibhore uses “BUS” instead of “Bus” throughout the paper; perhaps he believes it’s a three-letter acronym for a technology name.

• chco posted on 9/3/2010 an excerpt from Juval Lowy’s Programming WCF Services, Third Edition (O’Reilly Media, 2010) as How to use the Service Bus in Windows Communication Foundation (WCF) Services to the O’Reilly Answers blog:

The following is an excerpt from the O'Reilly publication Programming WCF Services, Third Edition. Below the author speaks about the Windows Azure AppFabric Service Bus (or just service bus for short) which is arguably the most accessible, powerful, and needed piece of the new Windows Azure Cloud Computing initiative.


In its first release, the service bus has several manifestations: as a relay service, as an events hub, and as a message buffer. While the service bus is designed to address some tough connectivity issues, it also provides an attractive solution for scalability, availability, and security issues by lowering the technology entrance barrier and making what used to be an advanced communication scenario mainstream and mundane.

As a relay service, the service bus addresses the challenge of Internet connectivity. Truth be told, Internet connectivity is difficult. Often, the service is located behind firewalls (both software and hardware firewalls) and behind a load balancer, and its address is dynamic and can be resolved only on the local network and cannot be translated to outside addressing. Virtualization adds a new dimension to connectivity along with maintaining transport sessions across machines. This situation is depicted in Figure 11-1.

Figure 11-1. Internet connectivity challenges

Moreover, since it is typically inadvisable to open the intranet to callers from the Internet, businesses often resort to using DMZs, but in turn incur the increased complexity of deployment, managing multiple sets of client credentials, and the unwanted traffic of unauthorized calls. Commonplace solutions for Internet connectivity from dynamic DNS to static IPs are often cumbersome, do not reflect changes in real time, have severe scalability and throughput implications, and are potentially insecure unless done by experts.

When it comes to having the service call back to the client, the problem is compounded by all the connectivity issues surrounding the client, virtually a mirror image of Figure 11-1. And yet, callbacks, events, and peer-to-peer calls are often an integral part of many applications, ranging from consumer machines such as non-real-time games, interactive communication, and media sharing to roaming business machines using ad hoc connections and full-fledged cross-intranet business-to-business applications.

What Is a Relay Service?

The solution for the Internet connectivity challenge is simple—since it is so difficult to connect the client to the service directly, avoid doing that (at least initially) and instead use a relay service. The relay service is a service residing in the cloud assisting in the connectivity, relaying the client calls to the service. Such a relay approach does require both the client and the service intranets to allow connecting to the cloud, but since the cloud constitutes neutral territory for both the client and the service, most environments, from consumer home machines to small intranets and large businesses, do allow calls out to the Internet, because that is perceived as an integral part of conducting business. A cloud-based relay service (as you will see later) also provides other benefits in terms of scalability, security, and administration.
Figure 11-2 shows how the relay service operates.

Figure 11-2. The service bus as a relay service

First, both the service and the client must establish connection against the relay service independently (steps 1 and 2 in Figure 11-2) and authenticate against the relay service. At this point, the relay also records where the service is and how to best call back to it. When the client calls the relay service (step 3), the relay service forwards the call (the client message) to the service (step 4).

The Windows Azure AppFabric Service Bus

While the sequence of Figure 11-2 sounds straightforward, in practice it involves a considerable amount of intricate network programming, messaging and standards know-how, and security expertise, as well as administration, IT operations, dedicated connectivity, and infrastructure. Such a solution is simply out of reach for the vast majority of applications. This is exactly the gap the Microsoft Windows Azure AppFabric Service Bus is designed to fill when acting as a ready-made relay service, hosted and managed at a Microsoft data center:

  • The service bus acts as a DMZ in the cloud, allowing only authenticated services (and potentially clients) to connect and relay messages.
  • The service bus provides a single place to manage the client and service credentials.
  • The service bus is the front end of the service, encapsulating and isolating the services from malicious callers lurking on the Internet.
  • The service bus is the one repelling various attacks, from DoS to replay attacks, while the identity and location of the actual service is completely hidden.
  • The service bus can offload from the service the need to manage and interact with the client credentials.

In addition to relaying messages and providing security management, the Windows Azure AppFabric Service Bus offers additional services such as a services registry and access control.

The main difference between connecting to a regular WCF service and using the relay service is in hosting. In the case of a relay service, the service must connect to the service bus, authenticate itself, and listen to calls from the relay service before the client sends its requests. This means you must use self-hosting and either launch the host explicitly or use an NT Service as a host. When using the WAS, you must deploy your service on a machine running Application Server AppFabric and configure the service for auto-start.
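
To make the hosting difference concrete, here is a minimal sketch of self-hosting a relayed WCF endpoint with the current Windows Azure AppFabric SDK. The service namespace, contract, issuer name and secret below are placeholders, and the credential plumbing has varied slightly between SDK releases, so treat this as an illustration rather than the book’s code:

// Sketch: self-hosting a WCF endpoint on the AppFabric Service Bus relay.
// The namespace, contract and credentials are placeholders.
using System;
using System.ServiceModel;
using Microsoft.ServiceBus;

[ServiceContract]
public interface IEchoService
{
    [OperationContract]
    string Echo(string text);
}

public class EchoService : IEchoService
{
    public string Echo(string text) { return text; }
}

class RelayHost
{
    static void Main()
    {
        // The relayed address lives in the cloud, under your service namespace.
        Uri address = ServiceBusEnvironment.CreateServiceUri("sb", "yournamespace", "Echo");

        // Authenticate the listener against the Access Control Service.
        var credentials = new TransportClientEndpointBehavior();
        credentials.CredentialType = TransportClientCredentialType.SharedSecret;
        credentials.Credentials.SharedSecret.IssuerName = "owner";
        credentials.Credentials.SharedSecret.IssuerSecret = "yourIssuerSecret";

        var host = new ServiceHost(typeof(EchoService));
        var endpoint = host.AddServiceEndpoint(typeof(IEchoService), new NetTcpRelayBinding(), address);
        endpoint.Behaviors.Add(credentials);

        // Opening the host makes the outbound connection to the relay and starts listening.
        host.Open();
        Console.WriteLine("Listening at {0} - press ENTER to exit", address);
        Console.ReadLine();
        host.Close();
    }
}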


Note: Steps 2 and 3 in Figure 11-2 may be combined into a single call. The reason I broke it into two logical steps is that authenticating both the service and the client is done against another facility of Windows Azure AppFabric, called the Access Control Service (ACS). Presently, you can use the ACS to authenticate the client or the service applications (as discussed later in the chapter).

You can also use the ACS for authorization and even to build a kind of two-tier authentication system where user claims are converted to ACS tokens for application authentication. Presently, this requires a great deal of understanding of ACS and related topics such as federated and claims-based security, which are outside the scope of this chapter.


Learn more about this topic from Programming WCF Services, Third Edition.


Programming WCF Services is the authoritative, bestselling guide to Microsoft's unified platform for developing service-oriented applications on Windows. The third edition of this thoroughly practical book provides insight, not documentation, to help you learn the topics and skills you need for building WCF-based applications. Written by Microsoft software legend Juval Löwy, this new edition is revised for the latest productivity-enhancing features for C# 4.0 and .NET 4.



Vittorio Bertocci (@vibronet) reported a serious book typo at the end of his The First Dead-Tree Copy of the Book is Here! post of 9/4/2010:

[I know, you must be sick of book-related posts: please bear with me just a bit longer, I think we’re almost done :-)]

[UPDATED]

[Photo: the first printed copy on the author’s work setup]

YES! I just found under the door mat – wait for it – the very first printed copy of Programming Windows Identity Foundation! Considering that exactly today it’s 9 years that I work in Microsoft, I can think of very few better ways to celebrate :-)

In the picture above I re-created the work setup I used for most of the writing: Visual Studio on the tablet screen, the manuscript in the portrait screen, reference material (specs etc) on the old Relisys from Italy (before you ask: hooked to the tablet via DisplayLink). And of course the whiteboard, with the Enrico la Talpa that I use for christening ALL whiteboards, real or virtual. Ryan and David, please do not add any details about that ;-)

The book is slightly thinner than I expected, but I get that feeling for every book: it’s a function of the amount of work they require, the thickness is never enough. The pry bar on the cover is awesome, I can’t believe it was never used before by the Microsoft Press covers team… I feel privileged :-)

The production guys did a superb job with the illustrations. I was real worried that some would be just too hard to read, but they ended up looking great and perfectly readable even when featuring multiple keys.


OK, the physical copy is here; now the question becomes, when is it going to be there? I.e., when are the printed copies going to be available? I know as much as you in this respect. As silly as it sounds, a couple of weeks ago I bought a copy of the book on Amazon and I’ve been polling the shipments page ever since. I doubt that they fired up the printing machinery and printed just this copy for making me happy, hence I’d consider this a sign that the printed copies are just about to hit the warehouses.

Ahh, I was supposed to still do a couple of things before checking out: but now I think I’ll just get downstairs, open a beer and leaf through this bad boy. I’ve got to enjoy this bliss while that lasts, soon enough I’ll enter the phase in which I’ll be unsatisfied with the way I wrote this and that :-)

[UPDATE 9/30 - 6:30 PM PST] Not even 30 mins, bliss already over! Found a typo which completely changes the meaning of the sentence. Page 234: "the new version of the protocol is now backward compatible with OAuth 1.0 and is largely based on OAuth WRAP. " should be "the new version of the protocol is not backward compatible with OAuth 1.0 and is largely based on OAuth WRAP. " [Emphasis added.]

I remember being excited when the author’s copies of my first book, Using Access [1.0] for Windows, Special Edition, arrived in 1993. See Windows Azure and Cloud Computing Posts for 9/3/2010+ for the complete TOC.

<Return to section navigation list>

Live Windows Azure Apps, APIs, Tools and Test Harnesses

•• Tom Hollander shows you how to Get logging in Windows Azure with Enterprise Library in this 9/6/2010 post:

Hi again – yes I know it’s been a while. Recently I’ve started a new role in Microsoft which involves helping customers deploy applications on Windows Azure, Microsoft’s cloud computing platform. I thought it may be fitting if I kick this off with a post that bridges my (now quite) old role with my new one and talk about using Enterprise Library with Windows Azure.

One of the great things about Windows Azure is that it includes the .NET Framework in its entirety – so for the most part, .NET code you’ve written in the past will run fine in the cloud. However there are some important differences between your own servers and the cloud that you need to be aware of as a developer or designer. For example, while it is technically still possible to log to a flat file or the Windows Event Log, it’s generally impractical to access the resulting logs. As a result, logging and instrumentation is one of those things which you need to do a little differently in the cloud.

Using Windows Azure Diagnostics

It so happens that Windows Azure includes its own diagnostics infrastructure. It’s well documented elsewhere (try the official documentation or this MSDN Magazine article) but in a nutshell it involves high-performance buffers built on ETW which collect log events from your application. These events, along with other diagnostic sources such as event logs and performance counters, can be transferred to Windows Azure Storage (tables and blobs) either at predefined intervals or on-demand.

The most commonly described approach to logging to Windows Azure Logs is to use the standard System.Diagnostics Trace class as follows:

Trace.TraceInformation("Trace works fine, but you're somewhat limited...");

Next, you’ll need to configure System.Diagnostics to log to Windows Azure Diagnostics via the DiagnosticMonitorTraceListener:

<system.diagnostics>
  <trace autoflush="true">
    <listeners>
      <add type="Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener,
                 Microsoft.WindowsAzure.Diagnostics, Version=1.0.0.0, Culture=neutral,
                 PublicKeyToken=31bf3856ad364e35"
           name="AzureDiagnostics" />
    </listeners>
  </trace>
</system.diagnostics>

Finally, you’ll need to configure and start Windows Azure Diagnostics. In this example, let’s ask it to transfer all logs to Windows Azure Storage (specifically, the WADLogsTable table) every 5 minutes by modifying the WebRole or WorkerRole class as follows:

public override bool OnStart()
{
    StartDiagnostics();
    RoleEnvironment.Changing += RoleEnvironmentChanging;
    return base.OnStart();
}

private void StartDiagnostics()
{
    // Get default initial configuration.
    var config = DiagnosticMonitor.GetDefaultInitialConfiguration();

    config.Logs.ScheduledTransferLogLevelFilter = LogLevel.Undefined;
    config.Logs.ScheduledTransferPeriod = TimeSpan.FromMinutes(5);

    // Start the diagnostic monitor with the modified configuration.
    DiagnosticMonitor.Start("DiagnosticsConnectionString", config);
}

Of course, you’ll also need to make sure you define a connection to your Windows Azure Storage account in your ServiceConfiguration.cscfg file.

Bringing in the Logging Application Block

There’s nothing wrong with any of the above code, but if you’re used to using Enterprise Library’s Logging Application Block for your logging needs, you’ll soon find that you’re somewhat limited. For example, you won’t automatically get access to things like the Process ID and machine name, you won’t get the ability to customise your message format and you won’t get the advanced routing and filtering options. The great news is that since the Logging Application Block is built on System.Diagnostics, you can use it in your cloud applications and configure it to write to Windows Azure logs.

First, let’s look at your logging code. There’s nothing special you need to do here and any existing EntLib logging code should work– since the Logging Application Block’s API is abstracted from the chosen logging mechanism you just log like you always did, either using the old school Logger facade or the new fangled DI alternatives. Make sure you choose an appropriate Severity level, as both Enterprise Library and Windows Azure Diagnostics can be configured to use this to filter messages or decide which ones to transfer to storage.

Logger.Write("Get Logging in Windows Azure with Enterprise Library!", 
"General", 1, 0, System.Diagnostics.TraceEventType.Information);

Now, let’s look at your configuration file. You can delete the <system.diagnostics> section we described earlier and use Enterprise Library’s configuration instead. You can use either the config tool or an XML editor to edit the config, but the important thing is to choose a System.Diagnostics Trace Listener and configure it to use the same DiagnosticMonitorTraceListener we used before. I’ve included just the <listeners> section of the Logging Application Block’s configuration below; in your app you can continue to use whatever combination of sources, formatters, filters and listeners you like.

<listeners>
  <add listenerDataType="Microsoft.Practices.EnterpriseLibrary.Logging.Configuration.SystemDiagnosticsTraceListenerData, Microsoft.Practices.EnterpriseLibrary.Logging, Version=5.0.414.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"
    type="Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener, Microsoft.WindowsAzure.Diagnostics, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"
    name="Azure Diagnostics Trace Listener" />
</listeners>

Finally, you’ll still need to configure and start the Diagnostic Monitor when your role starts. The code in the first example will continue to do the trick (although keep in mind you may prefer to configure it differently by applying log transfer filters or choosing to transfer logs to storage only on-demand).
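
For example, a slightly more selective variant of the earlier StartDiagnostics method might look like the sketch below. It uses the same DiagnosticMonitorConfiguration members shown above, but the filter level and transfer interval are illustrative values you would tune for your own application:

// Variation on the earlier StartDiagnostics: filter what gets transferred to storage.
private void StartDiagnostics()
{
    var config = DiagnosticMonitor.GetDefaultInitialConfiguration();

    // Only ship Warning-and-above trace events, and do so every minute.
    config.Logs.ScheduledTransferLogLevelFilter = LogLevel.Warning;
    config.Logs.ScheduledTransferPeriod = TimeSpan.FromMinutes(1);

    DiagnosticMonitor.Start("DiagnosticsConnectionString", config);
}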

You should now be ready to deploy your application to the cloud and see it in action. Remember, with Windows Azure Diagnostics, the log events will only be transferred to Windows Azure Storage after the specified interval or on-demand (requiring additional code or Powershell scripts), so you may need to be more patient than you’re used to. Once the transfer is complete, you should see the events in your WADLogsTable table. The easiest way to view the data in the table is to use Visual Studio 2010’s Server Explorer, but there are many other web- and Windows-based tools that can do the same and more. The data in the log files will have a little bit of XML padding, but you’ll see your formatted data sitting proudly in the middle:

<TraceSource>General</TraceSource>
<Object>Timestamp: 9/6/2010 5:05:38 AM
Message: Get Logging in Windows Azure with Enterprise Library
Category: General
Priority: 1
EventId: 0
Severity: Information
Title:
Machine: RD00155D3134BD
App Domain: /LM/W3SVC/1273337584/ROOT-1-129282230874485863
ProcessId: 1724
Process Name: d:\windows\system32\inetsrv\w3wp.exe
Thread Name: 
Win32 ThreadId:2560
Extended Properties: </Object>

To summarise, while Windows Azure does require you to do a few things a little differently, most of your existing code and favourite techniques should still work fine. With just a small configuration change and some startup code, your Enterprise Library Logging code will feel right at home in the cloud.


•• Marius Oiaga posted a summary of Windows Azure Code Samples Available for Download to Softpedia on 9/6/2010:

Microsoft traditionally makes available a variety of resources designed to streamline development of applications for its platform, with Windows Azure making no exception to this rule.

In addition to the software development kit (SDK) offered by the Redmond company, devs building projects for its Cloud platform can also access a number of samples via the Microsoft Download Center.

“Windows Azure Architecture Guide – Part 1 – Code Samples” and “Developing Applications for the Cloud – Part 2 – Code Samples” are up for grabs, free of charge, from the software giant. However, the two resources have different purposes, according to the description provided by Microsoft.

“These samples illustrate the steps towards migration of a typical application built with .NET to the Windows Azure platform,” the company revealed for Windows Azure Architecture Guide – Part 1.

“To check for software prerequisites needed to run the samples, run CheckDependencies.cmd in the sample scenarios folder once all content is extracted to your local file system".

“This batch file launches a dependency checking tool that reports any components that are missing in your system, and it provides links if needed for obtaining, installing and configuring the missing components. Not all of samples require the same components to run,” the software giant explained.

Essentially, developers will get no less than four samples. The resource is designed to help customers take an on-premises application into the Cloud, and optimize it following the migration to Windows Azure.

With the “Developing Applications for the Cloud – Code Samples,” the focus is placed on creating a new application tailored to Windows Azure from the get-go.
“The samples are packaged as a self-extractable zip file. When run, you'll have to accept the End User Licensing Agreement to proceed,” Microsoft explained.

“After accepting the license, files will be extracted to the chosen destination. The samples will require Windows Identity Foundation and IIS7 to work. You can use the "CheckDependencies" command batch file to identify potential missing components in your system".

“Running CheckDependencies command batch file is optional. It is not an installer.”

The Windows Azure Software Development Kit (June 2010) Refreshed on September 1, 2010 is available for download here.

Windows Azure Tools for Microsoft Visual Studio is available for download here.


Team Crossword: I started this TeamCrossWord (@teamcrossword) puzzle on 8/3/2010 and finished it on 8/5/2010. It took me about an hour overall, without help from my friends:

imageimage Notice that this social game is “powered by Windows Azure.” Click the FAQ link at the bottom of the page for social instructions, which I’ve repeated here:

How do I start a new puzzle?
Head to the TeamCrossword.com homepage, log in with Facebook and choose the puzzle you want to work on. Don't forget to invite friends!

How do I solve the puzzle?
You can move around the puzzle grid with either the arrow keys or by clicking on the squares in the grid. You can also click on clues in the Across and Down boxes. RETURN will move you to the next clue sequentially based on your current direction, and SPACE will change your current direction.

I'm stumped! How can I check answers?
You can only check your answers once the entire grid is filled out by clicking "Show Wrong". We know "other" online crossword puzzles let you cheat and see what's wrong while you're in the middle, but we think you should ask your friends for help instead.

How do I invite friends?
Click the "Invite Friends" button to post a link to your Facebook wall, click the Email link to launch your email client, or else just paste the game URL where ever you want!

If you have other thoughts or concerns:
Please share! Check out our feedback and support forum and let us know what's on your mind.

imageSocial gaming is likely to be a major-scale application for PaaS (and IaaS) cloud purveyors, but will need monetization by advertising or hourly fees. The Fuse Labs homepage is here.


Rob Blackwell (@RobBlackwell) posted AzureRunMe v1.0.0.8 to CodePlex on 8/5/2010, claiming “CloudDrive support, better logging and diagnostics”:

AzureRunMe
imageRun your Java, Ruby, Python, Clojure or <insert language of your choice> project on Windows Azure Compute.
Introduction

imageThere are a number of code samples that show how to run Java, Ruby, Python etc on Windows Azure, but they all vary in approach and complexity. I thought there ought to be a simplified, standardised way.

I wanted something simple that took a self contained ZIP file, unpacked it and just executed a batch file. All the role information, like IP address and port, could be passed as environment variables (%IPAddress%, %Http%, etc.).

I wanted ZIP files to be stored in Blob store to allow them to be easily updated with all Configuration settings in the Azure Service Configuration.

I wanted real time tracing of stdio, stderr, debug, log4j etc.

AzureRunMe was born ...

Getting Started

Organise your project so that it can all run from under one directory and has a batch file at the top level.

In my case, I have a directory called c:\foo. Under that I have copied the Java Runtime JRE. I have my JAR files in a subdirectory called test and a runme.bat above those that looks like this:

cd test
..\jre\bin\java -cp Test.jar;lib\* Test %http%

I can bring up a console window using cmd and change directory into Foo. Then I can try things out locally by typing:

C:\Foo> set http=8080
C:\Foo> runme.bat

The application runs and serves a web page on port 8080.

I package the jre directory as jre.zip and the test directory along with the runme.bat file together as dist.zip.

Having two ZIP files saves me time - I don't have to keep uploading the JRE each time I change my Java application.

My colleague has a ruby.zip file containing Ruby and Mongrel and his web application in rubyapp.zip in a similar way.

Upload the zip files to blob store. Create a container called "packages" and put them in there. The easiest way to do this is via Cerebrata Cloud Studio. Another alternative is to use the UploadBlob command line app distributed with this project.
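
If you’d rather script the upload than use a GUI tool, a minimal StorageClient sketch would do it (the account credentials and local paths below are placeholders):

using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

class UploadPackages
{
    static void Main()
    {
        // Use the same storage account that AzureRunMe's DataConnectionString points at.
        CloudStorageAccount account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=mykey");

        CloudBlobContainer container = account.CreateCloudBlobClient()
                                              .GetContainerReference("packages");
        container.CreateIfNotExist();

        // Upload the two ZIPs described above; the local paths are illustrative.
        container.GetBlobReference("jre.zip").UploadFile(@"c:\foo\jre.zip");
        container.GetBlobReference("dist.zip").UploadFile(@"c:\foo\dist.zip");
    }
}
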
The next step is to build and deploy AzureRunMe.

Load Azure RunMe in Visual Studio and build.

Change the ServiceConfiguration.cscfg file:

  • Update DiagnosticsConnectionString with your Windows Azure Storage account details so that AzureRunme can send trace information to storage.
  • Update DataConnectionString with your Windows Azure Storage account details so that AzureRunme can get ZIP files from Blob store.
  • Change the TraceConnectionString to your AppFabric Service Bus credentials so that you can use the CloudTraceListener to trace your applications.

By default, Packages is set to "packages\jre.zip;packages\dist.zip", which means download and extract jre.zip, then download and extract dist.zip, before executing runme.bat.
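
The AzureRunMe source on CodePlex is the authority for how these settings are consumed, but conceptually the worker role just reads them at startup, along these lines (a rough sketch, not Rob’s actual code):

using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.ServiceRuntime;

public static class RunMeSettings
{
    // e.g. "packages\jre.zip;packages\dist.zip" - each entry is downloaded and extracted in order.
    public static string[] GetPackages()
    {
        return RoleEnvironment.GetConfigurationSettingValue("Packages").Split(';');
    }

    // The storage account that holds the "packages" container.
    public static CloudStorageAccount GetDataAccount()
    {
        return CloudStorageAccount.Parse(
            RoleEnvironment.GetConfigurationSettingValue("DataConnectionString"));
    }
}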

Click on AzureRunMe and Publish your Azure package. Sign into the Windows Azure Developer Portal at http://windows.azure.com

Create a New Hosted Service and upload the package and config to your Windows Azure account. You are nearly ready to go.

Change the app.config file for TraceConsole to include your own service bus credentials.

Run the TraceConsole locally on your desktop machine. Now run the Azure instance by clicking on Run in the Windows Azure Developer Portal.

Deployment might take some time (maybe 10 minutes or more), but after a while you should see trace information start spewing out in your console app. You should see that it's downloading your ZIP files and extracting them. Finally, it should run your runme.bat file.

If all goes well your app should now be running in the cloud!

Rob continues with Environmental Variables, Compiling AzureRunMe, Diagnostics, Packages, Commands, Default Connection Limit, Advanced Tracing, Cloud Drives, Batch File Tricks and Issues topics. Rob claims:

    • Clojure + Compojure: Works
    • Tomcat: Works. Contact me if you want to know how, but it's much neater than the Tomcat Solution Accelerator because your WAR files can just come from Blob Store!
    • Restlet: Works
    • Jetty: Runs, but not with NIO support.
    • JBOSS: Not extensively tested, but a basic system comes up for web serving.
    • ApacheDS LDAP Server: Doesn’t work because it uses java.nio


imageSee Microsoft Research released for download v0.9 of their Generic Worker implementation for Windows Azure on 9/4/2010 in the Windows Azure Infrastructure section below.


The Windows Azure Team reminded developers that Ensuring High Availability of Web Applications in Windows Azure requires more than one instance per role in this 9/4/2010 post:

imageTo prevent your Windows Azure applications from potentially experiencing a loss of availability during platform upgrades, we recommend deploying more than one instance of your web-facing roles. Doing so also ensures that you qualify for the Windows Azure Compute Service Level Agreement (SLA), which guarantees 99.95% external connectivity for Internet-facing roles when two or more role instances are deployed for a given application. Please click here to download the full Windows Azure Compute SLA.

<Return to section navigation list> 

Visual Studio LightSwitch

MSDN published preliminary Deploying LightSwitch Applications documentation on 9/3/2010:

image22Deployment is the process by which you distribute a finished application. In Visual Studio LightSwitch, you deploy applications by using ClickOnce deployment. ClickOnce is a deployment technology that enables you to create self-updating applications that can be installed and run with minimal user interaction.

imageThe process of deploying a LightSwitch application differs depending on the application type and deployment topology that you select.

  • Desktop client, 2-tier deployment creates an application that runs on the end-user’s Windows desktop; the database and server components run on a networked computer.

  • Desktop client, 3-tier deployment creates an application that runs on the end-user’s Windows desktop; the database and server components run on an Internet Information Services (IIS) server.

  • Browser client, 3-tier deployment creates an application that runs in the end-user’s web browser; the database and server components run on an Internet Information Services (IIS) server.

The application type can be chosen in the Application Designer. For more information, see How to: Change the Deployment Topology and Application Type.

There are two ways to deploy a LightSwitch application, by either publishing or packaging it. In either case, the LightSwitch Publish Application Wizard guides you through the deployment process.

  • A published application can be run on client computers immediately after the wizard has been completed. The application is ready to install and the installation automatically deploys the database schema to SQL Server.

  • A packaged application means that everything that is required to run the application is bundled together, but additional steps must be taken to make the application available to the user.

Updates to a LightSwitch application can be deployed by running the wizard again. 3-tier browser clients need only re-navigate to the web page to get the updated version. 2-tier desktop clients will automatically receive the updates the next time they are run.

Note: In the Beta release, automatic updates for 3-tier desktop applications are not yet enabled. You will have to uninstall and reinstall the application to obtain updates.

Publishing a 2-tier desktop application

Publishing a 2-tier application requires that you have access to a networked computer to host the application and SQL server. You enter the UNC path for the location to host the application on the Publish Output page of the LightSwitch Publish Application Wizard. For the How do you want to publish the default database? option, select Publish directly to the database now.

On the Database Configuration page of the wizard, click the … button and enter the connection properties for your SQL Server database. The database server can be a different computer on the network. For the database name, enter the application name.

Once published, users can install the application from the UNC share by double-clicking the Setup.exe file.

Note: For the Beta release, you will need to pre-configure the computer, following the instructions in the Install.htm file. The file is published to the same location as the Setup.exe file.

Publishing Updates

To publish updates to the application, run the LightSwitch Publish Application Wizard again without making any changes. The next time the user runs the application they will automatically receive the update from the file share.

Uninstalling

An end user can uninstall a 2-tier desktop application from the Programs and Features or Add and Remove Programs applet in the Windows Control Panel.

Publishing a 3-tier application

Publishing a 3-tier application requires that the user have access to an Internet Information Services (IIS) server that has MSDeploy and remote servicing installed, as well as to a SQL Server instance. The publishing process is the same for both desktop and browser applications. On the Publish Output page of the LightSwitch Publish Application Wizard, select the Remotely publish to a server now option. You will have to know the Service URL, Site\Application name, User Name and Password. You can get this information from the IIS administrator. You should also check the Allow untrusted certificate option.

On the Database Configuration page of the wizard, click the … button and enter the connection properties for your SQL Server database. The database server can be a different computer on the network. For the database name, enter the application name.

Once published, users can install the application from the web site specified by the Site\Application name by clicking the <ApplicationName> link, where <ApplicationName> is the display name of your application. For a desktop application, the user will be prompted to install. For a browser application, the application will open in the web browser.

Publishing Updates

To publish updates to the application, run the LightSwitch Publish Application Wizard again without making any changes. For browser clients, the next time that the user runs the application they will automatically see the new version. For desktop clients, you must uninstall and then reinstall the application.

Uninstalling

An end user can uninstall a 3-tier desktop application from the Programs and Features or Add and Remove Programs applet in the Windows Control Panel. Browser applications must be uninstalled by the IIS administrator.

Packaging a 2-tier application

A packaged 2-tier application generates a set of files that must be placed on the Internet Information Services (IIS) host, as well as a SQL script that must be run to configure the database. On the Publish Output page of the LightSwitch Publish Application Wizard, select the Create a script file to install and configure the database option.

On the Database Configuration page of the wizard, select the This will be a new database named: option and enter a name for the database.

Once published, a .zip file that contains the package is placed in the Publish directory.

On the server where the application will be installed, deploy the database schema by using sqlcmd.exe command tool. The Sqlcmd.exe tool is installed with SQL Server, typically in the \Microsoft SQL Server\100\Tools\Binn directory under the Program Files or Program File (x86) directory. Using the tool, enter a command similar to the following: sqlcmd.exe -i MyApplication.sql -S server\folder. Replace MyApplication with the name of your application, and replace server\folder with the database location on the server. The required SQL Script file (.sql) is generated in the Publish directory.

Modify the generated web.config file to reflect the database location used in the previous step. The web.config file can be located in the Application Files subdirectory of the Publish directory. Update the intrinsic database connection string entry so that it is similar to the following: <connectionStrings> <add name="_IntrinsicData" connectionString="Data Source=SERVERNAME\SQLEXPRESS;Initial Catalog=MyApplication;Integrated Security=True;Pooling=True;Connect Timeout=30;User Instance=False" /> </connectionStrings>. Replace MyApplication with the name of your application, and replace SERVERNAME with the name of the database server.

Copy the contents of the Publish directory to the server and install the application.

Publishing Updates

To publish updates to the application, run the LightSwitch Publish Application Wizard again. The next time the user runs the application they will automatically receive the updates.

Uninstalling

An end user can uninstall a 2-tier desktop application from the Programs and Features or Add and Remove Programs applet in the Windows Control Panel. The server components must be uninstalled by the IIS administrator.

Packaging a 3-tier application

A packaged 3-tier application generates the set of files that must be placed on the Internet Information Services (IIS) host, as well as a SQL script that must be run to configure the database. On the Publish Output page of the LightSwitch Publish Application Wizard, select the Create a package on disk option, provide a name for the web site, and provide the location where the publish wizard output will be placed. The default location is the Publish subdirectory under your project directory.

On the Database Configuration page of the wizard, select the This will be a new database named: option and enter a name for the database.

Once published, a .zip file that contains the package is placed in the Publish directory. Once this package has been created, a server administrator can use MSDeploy to deploy the application to Internet Information Services (IIS) and SQL servers. The administrator can open inetmgr, navigate to the location where the application will be deployed, right-click the location and select Deploy: Import Application. This launches a wizard that the administrator can run to deploy the application.

Publishing Updates

To publish updates to the application, run the LightSwitch Publish Application Wizard again without making any changes. For browser clients, the next time that the user runs the application they will automatically see the new version. For desktop clients, you must uninstall and then reinstall the application.

Uninstalling

An end user can uninstall a 3-tier desktop application from the Programs and Features or Add and Remove Programs applet in the Windows Control Panel. Browser applications must be uninstalled by the IIS administrator.

See Also:


Johannes Kebeck posted a two-part tutorial for LightSwitch with Bing Maps. Here’s Switch on the Light - Bing Maps and LightSwitch (Part 2/2) of 8/25/2010 (missed when posted):

image In the first part we have used Visual Studio 2010 LightSwitch to create a database and a user interface to select, add, edit and delete data without writing a single line of code. The one thing that bothers me for now is that in the summary list on the left half of the screen we only have the data from the AddressLine1 field.

image_thumb[36]

image22I would like to have the complete address-string in this summary – partially because it is more informative but also because we will use this as the address-string that we hand over to the Bing Maps Geocoder later. To do this we need to stop the debugger and make some changes in the table itself. …

Johannes continues with a lengthy tutorial.


Johannes Kebeck posted a two-part tutorial for LightSwitch with Bing Maps. Here’s Switch on the Light - Bing Maps and LightSwitch (Part 1/2) of 8/25/2010:

Introduction

image Microsoft Visual Studio LightSwitch has just been released as a public beta and it claims to be the “SIMPLEST WAY TO BUILD BUSINESS APPLICATIONS FOR THE DESKTOP, WEB AND CLOUD”. That’s quite a bold statement and it certainly deserves a closer look. And while we’re at it we will of course have a look at the use of Bing Maps in LightSwitch projects – both for desktop and web applications. Yes, that’s right! At the end of this you will see the Bing Maps Silverlight Control in a desktop application:

image

Prerequisites

image22You will find the download of LightSwitch here. On MSDN you will also find a couple of “How Do I?” videos, tutorials and a great training kit. I will come back to this training kit a little bit later. If you already have Visual Studio 2010 installed, LightSwitch will snuggle into the existing environment, otherwise it will create a new installation of Visual Studio 2010.

The First Steps

Once you have installed LightSwitch you will find a new template for VB.NET and C# development in your Visual Studio:

image

Let’s start our journey by creating a new project. I choose the Visual Basic template but you will find out shortly that it doesn’t really matter because you can do a lot of things without writing any code at all. …

Johannes continues with a conventional LightSwitch beginner’s tutorial.


Paul Patterson (@PaulPatterson) posted Microsoft LightSwitch – Creating and Using a Query on 9/3/2010:

imageThis post extends my last post where I created a new Schedule table and then created a relationship between that and my Customer table. Now I want to make my application a bit more effective and efficient by creating a query to help me better visualize my scheduling data.

(Thank you Beth Massi, those video tutorial walk-through posts are fantastic!)

What is a LightSwitch Query?

image22A query is something that returns a bunch of data based on some criteria or filter. In LightSwitch, queries can be created and then used for screens or even in code. I want to use queries as a way of getting only the data I need to see immediately. More specifically, I want to create a query that will only provide me with scheduling information for today, as well as a query for scheduling information in the future.

I don’t really need to see past scheduling, because I don’t live in the past, although I do learn from it; hence how proficient I have become at software development (yes, I’ve made a LOT of mistakes…in the past…I’m perfect now! ). So, with my superior skill of learning from my mistakes, I will make a whole bunch of mistakes during this demonstration – and learn from them. I am going to create a query that will only return schedule items that are scheduled in the future.

Creating a Query

In LightSwitch, when a query is created, it is added to the application model, meaning that the query is essentially baked into the design of the entity the query is created for. For what I want to achieve, I need to create a query on my collection of Schedule records.

In my LightSwitch application Solution Explorer I expand the Application Data node of the Data Sources folder. I right-click on my Schedules collection and click the Add Query menu item.

Adding a query to my Schedules collection

LightSwitch opens a query designer window for a query titled Query1. I immediately rename this query to FutureSchedule.

Updated query name and display name in query designer

With that out of the way, I now need to add some criteria to my query.

Filter Conditions

I want to only display schedule items that are either for today, or in the future. To do this, I am going to add a filter condition to the query. A filter condition is what helps narrow my query results. In this case, I am going to filter my query results to only display schedule items that have a date of today or greater.

So, I click on the Add Filter link.

Click the Add Filter button

Clicking the Add Filter button presents me with a filter criteria where I can enter the conditions of my filter.

The Filter Condition

In the first drop down I have an option to select a value of Where or Where Not. The condition I want is to return records where there is a StartDateTime value that is greater than or equal to today’s date. So, I:

  • select Where from the first drop down,
  • then select StartDateTime from the second drop down, which is a list of the attributes of the Schedule entity,
  • select the greater than or equal to operator value in the next drop down, and
  • …select…wait a minute. I see a problem, can you see a problem?

Hmm, I immediately see that there is going to be an issue here. My requirement is to filter using a value that will change day to day. I can’t simply stick a comparison to a literal date in there because the day is going to change each day.

Looking more at that list of comparison values, I see an item named Parameter. This might be how I can solve this problem. If I can somehow pass the current date to the query, and then have the query use that date as a comparison value, the query should be able to filter correctly with each passing day. Let’s give it a go, shall we?

For now, I am going to select Literal as the comparison value, and leave the defaulted value in the filter condition. I’ll update this condition once I have my parameter all set up.

My temporary filter condition

Parameters

Parameters are a way of passing values to a query when the application is running. For me, I want to pass the current date as a parameter to the query, and then have the query use that parameter as the filter condition.

In the query designer, I click the Add Parameter button.

Click the Add Parameter button

Clicking the Add Parameter button presents me with a textbox and a parameter type dropdown box. I enter StartDateTimeParameter as the name of my parameter, and select DateTime as the type of parameter.

The StartDateTimeParameter Parameter

In the Properties window for the parameter, I update the Display Name property to  Schedule From so that if I ever have a need to be prompted for a parameter, the prompt will be better named.

Cool, now that I have my property created, I go and update the filter I previously created so that it will use the parameter value instead of a literal value.

Updated Filter Condition

One more tweak left – sorting.

Sorting

I want to make sure that query results show my most recent schedule. I need to order the results by StartDateTime, in ascending order.

No problem, I click the Add Sort button in the query designer window.

My Sort criteria

I select StartDateTime as the field I want to sort on, and then select Ascending as the order by which I want to sort the field.
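
Taken together, the filter, the parameter and the sort amount to what you would write by hand in LINQ roughly like this (a hypothetical illustration, not the code LightSwitch generates):

using System;
using System.Collections.Generic;
using System.Linq;

// Stand-in for the Schedule entity from the earlier post.
public class Schedule
{
    public DateTime StartDateTime { get; set; }
}

public static class ScheduleQueries
{
    // Hand-written equivalent of the designer-built FutureSchedule query:
    // keep items on or after the supplied date, earliest first.
    public static IEnumerable<Schedule> FutureSchedule(
        IEnumerable<Schedule> schedules, DateTime startDateTimeParameter)
    {
        return schedules
            .Where(s => s.StartDateTime >= startDateTimeParameter)
            .OrderBy(s => s.StartDateTime);
    }
}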

The Screen

… Armed with my query, I now want to create a screen that shows my query results. So, on the menu bar at the top of the query designer, I click the Add Screen… button.

Click the Add Screen button

A simple list of scheduled items is all I need, so in the Add New Screen window I select the Editable Grid Screen template.  LightSwitch defaults a Screen Name of EditableScheduleGrid, which is fine. Then from the Screen Data drop down box I select the Schedule query I just created. I click the OK button.

Creating the screen for the query

LightSwitch opens the screen designer for my new EditableScheduleGrid screen. The first thing I do is update the Display Name property for the screen to say My Schedule.

Changing the Display Name for the screen

Almost there. I need to now somehow get the current date to the parameter for use by the query. To do that, I have to first create a property for my screen to hold the date.

When LightSwitch created my new screen it recognized that the data for the screen is my query. LightSwitch also saw that there is a parameter for the query, so it added the parameter as a property of the screen. I can see this because it is listed on the left of the designer.

Parameter query, and screen property showing in designer

I like how this is done automagically. I still need to wire the parameter to the screen property. This is easy, all I do is click the screen property (StartDateTimeParameter in grey), and LightSwitch wires the two together – as shown by the little arrowed line connecting the two.

Query parameter and screen property wired together

With the StartDateTimeParameter screen property selected, I notice that the Property Type for the StartDateTimeParameter screen property is shown as a DateTime? value. Not sure what it means if that value has a question mark on the end of it, so I select the DateTime value (without the question mark).

Updating the StartDateTimeParameter property type

Okay, now to get that date in there.

There are two things that I need to do next. The first is to call the screen to open it, and then pass the parameter to the screen that I am opening. So, I open the screen designer for the screen I know will open first in the application. In this case, the screen is named CustomerList.

I open the CustomerList screen in the screen designer. In the screen designer I expand the Screen Command Bar tree node and click the Add button. I then click to add a New Button.

Adding a new button to the screen command bar.

Now I have a new button that will show up at the top of the screen when the application runs. I need to now set some properties on this button so it shows up all nice like.

With the new button selected, I head over to the properties pane and make some edits. I enter My Schedule in the Display Name property.

Updates to the new button properties

Over on the left of the designer, I see the Button object added to my screen.

The Button object

I click on it to view the properties of the object. I edit the Method Name property to say ShowMySchedule. Below that, I click the Edit Execute() Code link.

The new Method Name for the button, and the Edit Execute() Code link

LightSwitch opens the CustomerList.vb code window with a stub created for my new method.

image

It is here that I am going to open my new screen and pass the current date as a parameter to the new screen. Here is what I enter into the method.

image
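
The screenshots of the method body aren’t reproduced here; as a rough illustration of the idea (in C# rather than Paul’s VB, reusing his method and screen names, and with a generated signature that may differ slightly in Beta 1), the stub ends up doing something like this:

using System;

public partial class CustomerList
{
    // The other half of this partial class is generated by LightSwitch.
    // When the My Schedule button's command runs, open the schedule screen
    // and hand it today's date as the StartDateTime query parameter.
    partial void ShowMySchedule_Execute()
    {
        this.Application.ShowEditableScheduleGrid(DateTime.Today);
    }
}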

Cool, now let’s hit the old F5 key and see what happens….

SCORE!! My application opened. And because the first screen has that button added to the top, the new button shows up as expected.

The button showing on the opening screen.

When I click on the button, the code fires and opens the new screen showing my filtered Schedule collection, ordered by start date. Awesome!

The new Schedule screen using the current date parameter passed to the FutureSchedule query.

Cool! Now with a few tweaks to the screen I have an excellent scheduling system available.

I agree that Beth Massi’s video tutorials are great. See Windows Azure and Cloud Computing Posts for 9/2/2010+ (scroll down.)


Dan Boris offers another approach to LightSwitch Query and Details Screens in this 9/4/2010 post:

image22In my first posting on LightSwitch I showed how to set up a screen to create a new record. This isn't very useful without a way to view and edit records. In this posting I will show how to set up a search and details screen.

I am going to start with the same Employee table I created in the first post:

clip_image002

Next I created a new Search screen for Employees. When you run the app you will get the new search screen:

clip_image004

On this screen you can browse records, search through them, export them to Excel and click the Employee name to edit the record. This brings up a couple of questions. First, where does the edit screen come from? It's not the same as the Create screen we started with. It turns out LightSwitch dynamically creates the edit screen when you view the record. You can easily override this by creating a new Details Screen like this:

clip_image006

If you keep the Use as Default Details Screen option checked it will become the details/edit screen that opens when you click on a record.

The next question is how do you change the field that you click to view a record? This is not entirely intuitive; you need to make the change in the table view. Select the table, and then at the bottom of the properties window set the Summary Property to the one that you want to be clickable. You can also change the Display Name which will become the name at the top of the column. In this example I am using the Id as the Summary Property:

clip_image008

To finish this you need to go back to the editor for the SearchEmployee screen and add the FirstName into the field list:

clip_image010

Now when you run the application the Id will be clickable to take you to the detail/edit screen.

clip_image012


The Light Switch Blog posted yet another 00:03:40 Visual Studio LightSwitch Beta Demo ScreenCap video by Fernando Machado Piriz on 9/4/2010:

image22This demo shows how to create a simple application to record books loaned to friends by using Visual Studio LightSwitch. LightSwitch is a tool to create business applications from scratch or from existing databases. It automatically generates a three-tier application’s forms, source code and database. The generated application uses Silverlight 4.0, WCF RIA Domain Services, ASP.NET 4.0, IIS and SQL Server.


• Jayaram Krishnaswamy posted a Microsoft LightSwitch Application using SQL Azure Database tutorial to the Pakt Publishing site in August 2010:

image Microsoft LightSwitch is the latest standalone product from Microsoft belonging to the Visual Studio suite of products. It is expressly targeted at tech-savvy non-programmers (hobbyists) who want to develop line-of-business applications using Microsoft databases or other sources. This is a new game plan by Microsoft to catch a wider audience as well as encourage them to use its cloud offerings. If priced right and supported well, this may help businesses develop applications on their own or call in junior-level programmers with some knowledge of individual technologies like databases, user interfaces, mouse clicks, window navigation and so on.

image22The LightSwitch Beta 1 was released to the general public on August 23rd, while MSDN members and Microsoft insiders have been using it for a couple of months. This article by Jayaram Krishnaswamy shows how you can download and install the program. The article also shows how you can develop a simple database application using this product, retrieving data from the cloud-hosted relational database, SQL Azure.

(For more resources on Microsoft, see here.)

Your computer has to satisfy the system requirements that you can look up at the product site (while downloading), and you should have an account on Microsoft Windows Azure Services. Although this article retrieves data from SQL Azure, you can retrieve data from a local server or other data sources as well. However, it is presently limited to SQL Server databases.

The article content was developed using Microsoft LightSwitch Beta 1, SQL Azure database, on an Acer 4810TZ-4011 notebook with Windows 7 Ultimate OS.

Installing Microsoft LightSwitch

The LightSwitch beta is now available at this site here, the file name is vs_vslsweb.exe: http://www.microsoft.com/visualstudio/en-us/lightswitch

When you download and install the program you may run into the problem of some requirement not being present. While installing the program for this article there was an initial problem: Microsoft LightSwitch requires Microsoft SQL Server Compact 3.5 SP2, and although this was already present on the computer, the installer did not recognize it. In addition to SP2, Microsoft SQL Server Compact 3.5 SP1 and SQL Server Compact 4.0 were also present. After removing Microsoft SQL Server Compact SP1 and SP2, and then installing SQL Server Compact 3.5 SP2 again, the program installed without further problems. Please review this link (http://hodentek.blogspot.com/2010/08/are-you-ready-to-see-light-with.html) for more detailed information. The next image shows the Compact products presently installed on this machine.

Microsoft LightSwitch Application using SQL Azure Database

Creating a LightSwitch Program

After installation you may not find a shortcut that displays an icon for Microsoft LightSwitch. But you may find a Visual Studio 2010 shortcut as shown. The Visual Studio 2010 Express is a different product which is free to install. You cannot create a LightSwitch application with Visual Studio 2010 Express.

Microsoft LightSwitch Application using SQL Azure Database

Click on Microsoft Visual Studio 2010 shown highlighted. This opens the program with a splash screen. After a while the user interface displays the Start Page as shown. You can have more than one instance open at a time.

Microsoft LightSwitch Application using SQL Azure Database

The Recent Projects is a catalog of all projects in the Visual Studio 2010 default project directory. Just as you cannot develop a LightSwitch application with VS 2010 Express, you cannot open a project developed in VS 2010 Express with the LightSwitch interface as you will encounter the message shown.

Microsoft LightSwitch Application using SQL Azure Database

This means that LightSwitch projects are isolated in the development environment although the same shell program is used.

When you click File | New Project you will see the New Project window displayed as shown here. Make sure you set the target to .NET Framework 4.0, otherwise you may not see any projects; it is strictly .NET Framework 4.0 for now. Also, trying to create a new web site via File | New Web Site will not show any templates no matter which .NET Framework you have chosen. In order to see Team Project you must have a Team Foundation Server present. In what follows we will be creating a LightSwitch application (the default name is Application1 for both C# and VB).

From what is displayed you will see more Silverlight project templates than LightSwitch project templates. In fact you have just one template either in C# or in VB. …

Microsoft LightSwitch Application using SQL Azure Database

Jay continues with a step-by-step tutorial for accessing SQL Azure databases and using it to create a simple LightSwitch application.


<Return to section navigation list> 

Windows Azure Infrastructure

• Microsoft Research released for download v0.9 of their Generic Worker implementation for Windows Azure on 9/4/2010:

imageThe Generic Worker is a worker-role implementation for Windows Azure that eases deployment, instantiation, and remote invocation of existing .NET applications within Azure without changing their source code. The Generic Worker framework enables multiple .NET methods and dependency libraries to be registered as applications and dynamically downloaded into a virtual machine on demand when a request to run an application is received. The Generic Worker framework provides a command-line client, .NET APIs, and a Trident workflow activity to invoke any registered application, with support for automatic transfer of simple parameters and files between desktop and the virtual machine.

imageGet the download details here. The download includes a Readme.txt file with a reference (but no link) to a Bridging the Gap between Desktop and the Cloud for eScience Applications paper by Yogesh Simmhan, et al., that provides “details of the Generic Worker architecture and usage context.” From its introduction:

The widely discussed scientific data deluge creates not only a need to computationally scale an application from a local desktop or cluster to a supercomputer, but also the need to cope with variable data loads over time. Cloud computing offers a scalable, economic, on-demand model well matched to the evolving eScience needs. Yet cloud computing creates gaps that must be crossed to move science applications to the cloud.

In this article, we propose a Generic Worker framework to deploy and invoke science applications in the Cloud with minimal user effort and predictable, cost-effective performance. Our framework is an evolution of Grid computing application factory pattern and addresses the distinct challenges posed by the Cloud such as efficient data transfers to and from the Cloud, and the transient nature of its VMs.

We present an implementation of the Generic Worker for the Microsoft Azure Cloud and evaluate its use in a genome sequencing application pipeline. Our results show that the user overhead to port and run the application seamlessly across desktop and the Cloud can be substantially reduced without significant performance penalties, while providing on-demand scalability.

image

Figure 1. Generic Worker Architecture on Microsoft Azure. Numbered arrows are order of application deployment (R1-3) and execution (1-10).

image

Figure 2: Program lifecycle of a Generic Worker instance

image Thanks to Brent Stineman (@BrentCodeMonkey) for the heads-up. As Brent points out in his Microsoft Cloud Digest - September 5, 2010, this might “be the first look at the upcoming “VM role” in azure.”


Michael Wood posted Learning Azure with resources for developers new to Azure on 9/6/2010:

imageToday I received a contact on my blog from someone asking for some good resources to start learning Azure.  I thought the answer I sent might be useful to others, so I’m posting it here.  Hope it helps.

image I picked up Windows Azure by just jumping into anything I could find to help me get the basics, then explored from there. There is a book coming out this month “Azure in Action” that I would highly recommend. You can get an advance copy from the Manning Early Access Program (http://manning.com/hay/). The content they have currently is complete, but I know the final book is going to have quite a few changes. If you buy now you can get started, then get the full book when it is released for a lower price than the full book will be later when it is officially published.

I also highly recommend checking out the Windows Azure Platform Training Kit. This material was created by Microsoft and has many hands on labs where it can step you through a lot of very common operations and concepts in Azure. (http://www.microsoft.com/downloads/en/details.aspx?FamilyID=413E88F8-5966-4A83-B309-53B7B77EDF78&displaylang=en)

There is also the Windows Azure online User Group. They have a virtual meeting every month. You can find out more about them at http://www.azureug.net/.

There is also the Windows Azure Boot Camps. They have been given all over the world and the idea for them was cooked up by Brian Prince, one of the co-authors of the book I recommended above. If you can’t find a boot camp scheduled around your area, all the materials and some train the trainer videos can be found on the boot camp website. (http://azurebootcamp.com/)

And don’t forget that they sometimes run offers for free trial accounts.  Keep an eye on http://www.microsoft.com/windowsazure/offers/ for “Introductory offers”.  This will get you a few online hours to play with.  Remember that with the Windows Azure SDK you can pretty much do 90% of your work local and don’t really need a cloud account.  Only when you want to push your work up to the cloud do you need the Azure account.


The OnCloudComputing blog reported Ernst & Young Survey on Cloud Computing in India - Over 72% of India’s IT infrastructure companies will adopt cloud computing in a big way over the next 2-3 years on 9/4/2010:

Key highlights:

  • 81% of Indian CIOs are familiar with cloud computing concepts
  • 72% Indian CIOs cited data privacy and security issues as a concern area for their business while adopting cloud computing services
  • 58% Indian CIOs consider ability to focus on core activities and usage based payments as the top business benefits of cloud computing services
  • 92% of Indian CIOs are inclined to buy cloud services from data center service providers and IT systems integrators

image Over 72% of India’s IT infrastructure companies will adopt cloud computing in a big way over the next 2-3 years, according to a new study by Ernst & Young.

The study titled “Cloud adoption in India-Infrastructure as a Service (IaaS)” is based on interviews with 50 chief information officers (CIOs) from leading SMBs, enterprises and IaaS ecosystem players.

According to Milan Sheth, Partner, Technology Advisory Practice, Ernst & Young, “We have seen significant interest in the potential of IaaS for the Indian market across industry segments. Virtualization, often seen as the first step in a cloud strategy, has begun to be more widely implemented across data centers and this could contribute to a rapid implementation of IaaS in India.”

Currently, the Indian market does not have a mature ecosystem that supports cloud IaaS. The SMB segment considers cloud IaaS for its “true” benefits while large enterprises perceive benefits on the operational side that are generally derived from an outsourcing model. Over 68% of the respondents have a positive mindset towards cloud computing, with 24% regarding it as a driver of the next wave of IT innovation and 44% believing that it will mature in a few years. Interestingly, cost is not the decision factor for adoption of cloud IaaS.

“The ongoing impact of open source, the whole concept of SaaS meeting the cloud, will prove to be a major trend for cloud adoption in India. High awareness levels and the positive perception of cloud indicate a market that will see a robust growth rate in the next 2 years,” Mr. Sheth added.

Ecosystem maturity, customer awareness of services and connectivity are identified as the major challenges for adoption of cloud IaaS in India. Vendor maturity and the capability to provide cloud services are also areas of concern, in addition to the challenges generally associated with the shift to cloud IaaS.


Windows Azure Compute, North & South Central US reported [Yellow] Windows Azure Service Degradation on 9/3/2010:

imageSep 4 2010 12:28AM Service maintenance is underway and it impacts all datacenters in all regions. As a result, certain provisioning calls might be failing. Next update within one hour or as events warrant.

Sep 4 2010 2:40AM Service functionality has been restored. It is important to note that only certain provisioning calls might have been failing throughout the duration of the maintenance. Existing storage accounts were still accessible, and running deployments were not impacted.

Note that maintenance was conducted on all regions simultaneously.

<Return to section navigation list> 

Windows Azure Platform Appliance (WAPA)

image See Bibhore Singhal wrote Hybrid Cloud with Windows Azure Service Bus in 8/2010 as an HCL Technologies white paper in the Windows Azure Platform Appliance (WAPA) section above.


<Return to section navigation list> 

Cloud Security and Governance

JP Morgenthal posted a Cybersecurity: It’s More Than Worms, Hacking and Phishing essay on 9/5/2010:

To put things into perspective, let’s analogize about some information technology related initiatives.  In the realm of things, accounting is like a lake, integration is like a bay and cyber security is like the Pacific Ocean.  The scope of understanding required to be a cyber security expert is so vast that it fills volumes just trying to define it, let alone protect it.

The reason cyber security is so vast is that it is a strategy for mitigating risk from breach of confidentiality, lack of integrity and lack of availability of information systems and networks.  Consider the number of threats that target these three things, and then consider that this number covers only the known threats.  Also, know that new threats are being uncovered daily.  Moreover, threats are not all technological; some of them are socially engineered, which makes them all the more difficult to defend against.

When I talk to individuals about their cyber security strategy, most times these days the answer reflects a tactical requirement to understand the nature of the threat and take some action to institute some protection.  If these organizations continue to operate in this manner with the number of growing threats, they will soon be all consumed just trying to keep up, let alone operate the networks and systems.  The sheer magnitude of threat management will in and of itself result in the worst threat of all—denial of service.

In my recent talk at the Air Force IT Conference I discussed using a Security Incident and Event Management (SIEM) tool combined with a Governance, Risk & Compliance (GRC) tool to assist with automation of handling of cyber threats.  The SIEM does a great job of allowing real incidents to be recognized, which can then be driven to closure using the GRC.  In this case, closure does not simply mean handled, it means completely mitigated.

Another growing trend I see in cyber security is to allow threat management to be designed outside of the realm of enterprise architecture.  When this occurs, security is implemented in a silo manner usually related to the operational focus of the IT group implementing the security architecture and solution.  For example, the network group focuses on network security, while the application group focuses on access control and authorization.  While common, this is perhaps the greatest weakness in a cyber security strategy and can be easily exploited by an attacker.  The cyber security solution must be implemented in an integrated manner watching horizontally from network through application.

For certain, if someone wants to breach or limit access to your systems and data, they will find a way to do so that is not being watched.  Which brings me to the next most important point about cyber security—you cannot watch everything all the time.  We’ve all seen the spy movies where they time the video camera movements so they can sneak into the building undetected.  Even with the best tools on the market today, you may only be made aware of a breach after it has occurred and once it has been correlated with other known events that highlight the likelihood of breach.  At this point, stopping the breach is but a bullet point on the post-mortem slide and all attention must now be focused on the impact of the breach.

So, let’s see what we’ve got so far:

  1. We don’t know what we don’t know
  2. We can’t watch everything at all times
  3. Many are simply trying to understand the nature of the threat
  4. Current security architectures are being implemented in silo manner

And, now the cherry on top, not everything that is a threat is intentional.  As if cyber security wasn’t complex enough, now we’re not just policing for cyber criminal activity, we’re fending off and responding to: uneducated users, careless utility and maintenance personnel, suppliers and vendors, and general defects.

As I like to say, resistance is futile, so instead, implement a strategy that keeps you out of the fray as much as possible.  Implement and ensure compliance with your security policies, educate your organization on things they can do to minimize the opportunity for a cyber security incident, catalog and value your assets, and implement tools in a concerted and integrated manner.  Moreover, make it a policy to revisit this lifecycle at least two times annually.

One point I’d like to expand upon from above is catalog and value your assets.  Most organizations I have worked with do not do this as a general activity.  Cyber security threat management is a risk management activity.  You value your assets and apply resources to protect from the most critical and high-valued ones to the least critical and lower-valued ones.  If you’re in the business of arbitrage, where seconds equate to big dollars, deploy most of your budget protecting the trading systems and networks.  If you’re in government, you’re going to make sure that the systems required to operate the government get the most resources applied first.  Knowing where to apply your resources is the most critical aspect of the mission.

To me, cyber/info security may resemble the epitome of architecture as it requires more depth and more breadth than any other branch of architecture.  Moreover, to do it right, requires the architect to have experience and understanding in multiple facets of IT architecture including network, storage, server, virtualization, application and data.  The unfortunate truth, however, is that a complete security architecture is still viewed as a "nice to have", not a "must have".  Most business and IT executives feel comfortable knowing that they have a deadbolt on the front and back door and the windows are locked.


Brian Loesgen linked to Great Azure Security Resources on 9/4/2010:

imageIf you have questions about security and Windows Azure, there’s a really good compilation of resources available here.

There’s a lot of great content there about all things security-related about Azure, and developing secure apps on, and leveraging, the Azure platform.

A particularly interesting read is Windows Azure Security Overview, which talks about security from the Azure-side (fabric controller, facilities, network, everything!). This is more than most people need to know, but if you ever wondered what goes on under the hood, you’ll find this of interest.

See the compilation page, read and learn more at http://msdn.microsoft.com/en-us/library/ff934690.aspx.


<Return to section navigation list> 

Cloud Computing Events

The Copenhagen .NET Users Group announced on 9/6/2010 an Extreme Scaling with SQL Azure meeting on 9/29/2010 from 2:00 PM to 4:00 PM (GMT +1) at Valtech, Kanonbådsvej 2, København K 1437, Denmark. Here’s Bing’s translation of the Danish Event Details:

imageSQL Azure is Microsoft's new strategy for storing your data in the cloud, but what do you do when it exceeds the much-vaunted 10/50 GB limit? This is where sharding, or separation, comes into play. This session will show you how it can be done in an OLTP system and point out some of the common pitfalls in SQL Azure that we have discovered by evaluating SQL Azure as an alternative to on-site SQL Server for different customers.
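
Sharding itself is a pattern rather than a SQL Azure feature; a minimal sketch of the routing idea in C# (the connection strings and the simple modulo scheme are placeholders; real systems usually keep a shard map) looks like this:

using System;
using System.Data.SqlClient;

public static class ShardRouter
{
    // One SQL Azure database per shard, each kept under the 10/50 GB cap.
    private static readonly string[] Shards =
    {
        "Server=tcp:myserver.database.windows.net;Database=Orders_Shard0;User ID=user@myserver;Password=...;Encrypt=True;",
        "Server=tcp:myserver.database.windows.net;Database=Orders_Shard1;User ID=user@myserver;Password=...;Encrypt=True;",
        "Server=tcp:myserver.database.windows.net;Database=Orders_Shard2;User ID=user@myserver;Password=...;Encrypt=True;"
    };

    public static SqlConnection OpenConnectionFor(int customerId)
    {
        // Simple hash routing: every row for a given customer lives in one shard,
        // so single-customer OLTP work never spans databases.
        string connectionString = Shards[Math.Abs(customerId) % Shards.Length];
        var connection = new SqlConnection(connectionString);
        connection.Open();
        return connection;
    }
}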


image• Brent Stineman reported that the Twin Cities Code Camp will occur on 10/9 and 10/10/2010 at the University of Minnesota - Keller Hall, 4-192 EE/CS Building, 200 Union Street SE, Minneapolis, MN 55455 and he’ll present Integrating the Cloud and On Premise Apps Using the Windows Azure AppFabric on 10/10:

image72Hybrid solutions are the most accessible, and many argue the most common cloud computing solutions. In this session we'll discuss/demonstrate the features of the Azure AppFabric and how they can be used to help integrate your existing and cloud applications. We will go over different use cases as well as how to help secure them from prying eyes.

Visit http://tccc9.eventbrite.com/ to register.


Joey deVilla announced on 9/4/2010 the Fredericton [New Brunswick] .NET User Group / OData on September 22nd meeting:

fredericton .net user group

The Fredericton .NET User Group’s site is live (and looking good, too!). If you live in or around Fredericton and are a .NET developer, be sure to bookmark both their site and Twitter account. They’ve got announcements about upcoming meetings as well as developer job listings for New Brunswick and Nova Scotia.

(And don’t forget to register for TechDays Halifax, taking place November 2 – 3, while the early bird rate of CAD$349.99 still applies!)

Next Meeting: OData (Wednesday, September 22nd)

open data protocol

On Wednesday, September 22nd, the Fredericton .NET User Group’s meeting topic will be OData, presented by Andrew Trevors of SwiftRadius, an IT consulting company and Microsoft Certified Partner in Fredericton. OData is short for the Open Data Protocol, a web protocol for querying and updating data that frees it from the silos that exist in applications today. It applies and builds upon Web technologies such as HTTP, Atom Publishing Protocol (AtomPub) and JSON to provide access to information from a variety of applications, services, and stores.

The protocol emerged from experiences implementing AtomPub clients and servers in a variety of products over the past several years. OData is being used to expose and access information from a variety of sources including, but not limited to, relational databases, file systems, content management systems and traditional Web sites.

OData is consistent with the way the Web works – it makes a deep commitment to URIs for resource identification and commits to an HTTP-based, uniform interface for interacting with those resources (just like the Web).   This commitment to core Web principles allows OData to enable a new level of data integration and interoperability across a broad range of clients, servers, services, and tools.

OData is released under the Open Specification Promise to allow anyone to freely interoperate with OData implementations.
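
As a concrete illustration of those conventions, any HTTP client can issue an OData query against a service root; the service URI and entity set below are hypothetical:

using System;
using System.Net;

class ODataQuerySample
{
    static void Main()
    {
        // $filter, $orderby and $top are standard OData system query options
        // layered on a plain HTTP GET against a hypothetical service.
        string uri = "http://example.org/MyService.svc/Customers" +
                     "?$filter=" + Uri.EscapeDataString("City eq 'Fredericton'") +
                     "&$orderby=Name&$top=10";

        using (var client = new WebClient())
        {
            // Ask for JSON; omit the Accept header to get the default AtomPub feed.
            client.Headers[HttpRequestHeader.Accept] = "application/json";
            Console.WriteLine(client.DownloadString(uri));
        }
    }
}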

This article also appears in Canadian Developer Connection.


TechDays 2010 Vancouver will convene 9/14 and 9/15/2010 at Vancouver Convention Centre - West Building, 999 Canada Place, Vancouver BC V6C 3C1, Canada:

image Join us in Vancouver September 14 & 15 at the Vancouver Convention & Exhibition Centre. Like Tech·Ed, Tech·Days brings you an intensive learning experience where the depth of the technology training is matched only by the breadth of the topics covered—while sparing you the additional expenses or environmental concerns of air travel.

Tech•Days is also all about connection, with 50 sessions at the 200+ level delivered by local Microsoft experts and consultants who will be readily available to meet, chat and answer your most relevant questions.
And the learning doesn’t end there. At the Birds-of-a-Feather lunches, you can propose your own topics for discussion with your peers and the Microsoft people who know the technologies best.

Developing for Three Screens and the Cloud track

The way individuals use applications has changed. No longer can an application reside only on the desktop or only on the web. Consumers are used to running applications on their mobile devices, leveraging the web where appropriate, as well as using a desktop application where it makes sense. Furthermore, the infrastructure to support these applications, and even the application itself, may reside on premise or in the Cloud. In this track you will learn how to develop and enhance your applications for the three key platforms - web, mobile, and desktop - and how to leverage the Cloud where it makes sense. You will also learn how Visual Studio 2010, SharePoint 2010, Windows Phone, Silverlight and other technologies can help enhance the application experience for your users.

The common Three Screens and the Cloud track has only one Windows Azure-related session:

DEV307: Using Microsoft Visual Studio 2010 to Build Applications That Run on Windows Azure

Day 1 - 2:20pm - 3:25pm

A platform is only as powerful as the tools that let you build applications for it. This session focuses on using demos, not slides, to show the best way to use Visual Studio 2010 to develop Windows Azure applications. Learn tips, tricks and solutions to common problems when creating or moving an existing application to run on Windows Azure. Come see how Visual Studio 2010 supports all parts of the development cycle as we show how to take an ASP.NET application running on IIS and make it a scalable cloud application running on Windows Azure.

Session Lead: Cory Fowler

However, Vancouver’s Local Flavor track has two:

LFV210 | BREAKOUT: The Economically Unstoppable Cloud
CloudCamp.org has hosted over 120 events in the past two years. In this session, co-founder Dave Nielsen shares insights and lessons learned from pivotal discussions among hundreds of experts around the world at CloudCamps he has personally facilitated.

Using skits, stats, photos & facts, Dave will help you see why “Cloud” is not just a buzzword, but an inevitable movement sucking us into a new age of scale & information, reshaping the way businesses & consumers use technology around the world.

If you’re a startup, developer, IT Professional, ISV, VAR, consultant, or manager of an IT department ... you need to see this presentation so you can learn how to take advantage of today’s cloud while preparing for the competition it will bring tomorrow.

LFV322 | BREAKOUT: Dynamics CRM, SharePoint, and Windows Azure – Better Together
As adoption of SharePoint and Dynamics CRM grows, many companies want to increase their integration of CRM and SharePoint platforms. This session will demonstrate different strategies that can be used to integrate the two platforms to deliver tightly integrated solutions using both products. This session will also take it into the clouds and demonstrate how you can build an online solution using Windows Azure, CRM Online, and BPOS.
Speaker: Shan McArthur


Peter Silva reported “Zeus discusses virtualization, networking, VMworld, F5 Networks and many other topics surrounding Cloud Computing” as a preface to his VMworld2010: Interview with Yankee Group’s Zeus Kerravala post of 9/4/2010:

I get some analyst insight during this informative interview with Zeus Kerravala, Sr. VP of Research at Yankee Group. Zeus discusses virtualization, networking, VMworld, F5 Networks and many other topics surrounding Cloud Computing.


Mark Wilson reported "Steve Ballmer will be the guest speaker at a special UK TechDays Future of Cloud Development event in London’s Docklands on 10/5/2010" in his Upcoming events (including special #uktechdays) post of 9/3/2010:

We’re having difficulties scheduling WSUG events right now.  Without going into all the gory details, Microsoft’s funding for rooms, etc. is not available in the way that it has been in the past, so we need to find another way to do things…

Now that the summer holidays are over, I’d like to organise a “virtual” user group meeting, over Live Meeting – and have had some conversations with Microsoft about a session on “Azure for IT Pros” (how can we integrate our on-premise infrastructure with Windows Azure, etc.).  Please leave a comment if you think this will be of interest.

In the meantime, I wanted to tell you about a Microsoft-hosted event that may be of interest, although it may also be a bit “developery” for some Windows Server admins.

In any case, Steve Ballmer will be the guest speaker at a special UK TechDays “Future of Cloud Development” event in London’s Docklands on 5 October.

The site has not gone live yet, but you can register on the event page or at 0870 166 6670, quoting event reference 9886; you’ll also need the invitation key: 6D4723.

More details of the session content can be found below:

  • A lap around Windows Phone 7 (Mike Ormond) – In this session Microsoft will take a look at Windows Phone 7 and the developer ecosystem, from the capabilities and unique features of the platform to the development frameworks and tools you have at your disposal. Along the way they’ll build a simple application or two and explore how people can purchase your finished masterpiece.
  • A lap around the Windows Azure Platform (Eric Nelson) – Hear how the Windows Azure Platform provides a scalable compute and storage environment with Windows Azure, secure connectivity with Service Bus and Access Control Service, and a relational database with SQL Azure. Learn about these new services and see demos that show how to build applications that run in and take advantage of Microsoft’s new cloud platform.
  • We’re Not on XP Any More – A Windows 7 Application in 60 Minutes (Mike Taulty) – In this code-only session Microsoft will use Visual Studio 2010 and any .NET assembly that we can beg, borrow, steal or even build in order to put together a simple, modern Windows 7 application from scratch, using the journey to provide pointers on how your applications can shine by using features that Windows XP only dreamt about (when it wasn’t dreaming of electric sheep in its world limited by 2 processor cores, 4 GB of RAM and GDI-based graphics).
  • Keynote: New opportunities and compelling experiences – Microsoft’s Chief Executive Officer, Steve Ballmer, will talk about new opportunities to deliver seamless experiences across many screens and a cloud, and why now is such an exciting time for developers.
  • IE9: The Best Browser for Windows (Martin Beeby) – In this session Microsoft will use IE9 and a sprinkling of JavaScript and HTML5 to show you how to create an integrated and immersive experience maximizing the full power of your visitors’ Windows 7 PCs.

TechDays will be held at The London ICC (International Conference Centre), One Western Gateway, Royal Victoria Dock, London E16 1XL, United Kingdom. Welcome time is 11:00 AM GMT. Here’s the Event Overview:

The Future of Cloud, Phone and App Development will illustrate why now is such an exciting time for developers using Microsoft tools, platforms and services.

Along with a fantastic, varied agenda, we are delighted to welcome Steve Ballmer, who will deliver the keynote speech detailing the new opportunities offered by seamless experiences across a multitude of devices and the cloud.

Software development is an exciting and rapidly changing game, and if you've got a big idea, there are boundless opportunities for creativity.

Our special UK TechDays event is specifically designed to equip you, as a developer, with the knowledge and understanding of emerging technologies so that you can forge ahead of the rest to bring your ideas to fruition.

Covering Windows Azure, Windows 7 and the soon-to-be-released Windows Phone 7 and IE9, we'll show you how multiple screens and a cloud will change the way you think about development.

Join us at the ICC in London's Docklands - we look forward to seeing you there.


<Return to section navigation list> 

Other Cloud Computing Platforms and Services

No significant articles today (VMworld is over).

<Return to section navigation list> 

  

3 comments:

Vittorio said...

Hi Roger,
just to clarify: this is not my first book (it's in fact the 3rd) but I am excited nonetheless :-)

ErikEJ said...

Re SuperQ on Windows Azure: See this blog using Windows Azure Drive - http://www.syringe.net.nz/2010/07/20/WindowsAzureDrivesSQLCompact4NdashPart1.aspx

Roger Jennings (--rj) said...

Erik,

Thanks for the heads-up on this post. Just what I was looking for. Post updated for "Windows Azure Drives + SQL Compact 4: Part 1" but Chris's promised Part 2 appears to be missing.

--rj