Monday, May 23, 2011

Windows Azure and Cloud Computing Posts for 5/23/2011+

A compendium of Windows Azure, Windows Azure Platform Appliance, SQL Azure Database, AppFabric and other cloud-computing articles.


Note: This post is updated daily or more frequently, depending on the availability of new articles in the following sections:

To use the above links, first click the post’s title to display the single article you want to navigate.


Azure Blob, Drive, Table and Queue Services

Yves Goeleven (@yvesgoeleven) explained Improving throughput with NServiceBus on Windows Azure in a 5/19/2011 post:

One of the things that has always bothered me personally about the ‘NServiceBus – Azure queue storage’ relationship is throughput: the number of messages that I could transfer from one role to the other per second was rather limited.

This is mainly because Windows Azure storage throttles you at the HTTP level: every queue accepts only 500 HTTP requests per second and queues up the remaining requests. Given that you need 3 requests per message (the sending role performs 1 POST request; the receiving role performs 1 GET and 1 DELETE request), throughput is quite limited: you can transfer less than a hundred messages per second.

One of the first things you can do to increase throughput is to use the SendMessages() operation on the unicast bus. This operation groups all messages passed into it into one single message and sends it across the wire. Mind that queue storage also limits message size to 8KB, so in effect you can achieve a maximum improvement of a factor of 10, given that you have reasonably small messages and use binary formatting.

Secondly, I’ve added support to the queue for reading in batches, using the GetMessages operation on the cloud queue client. By default the queue reads 10 messages at a time, but you can use a new configuration setting called BatchSize to control the number of messages to be read. Mind that the BatchSize setting also influences the MessageInvisibleTime, as I multiply this number by the batch size to define how long the messages have to stay invisible, since overall processing time may now take longer.
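As a rough sketch of that scaling (illustrated here in Java; the transport itself is .NET, and the 30-second MessageInvisibleTime is a hypothetical value, not one from the post):

```java
import java.time.Duration;

class BatchReadTiming {
    // The transport reads up to BatchSize messages at once (via GetMessages)
    // and scales the invisibility window accordingly, since processing a whole
    // batch can take BatchSize times longer than processing a single message.
    static Duration effectiveInvisibleTime(Duration messageInvisibleTime, int batchSize) {
        return messageInvisibleTime.multipliedBy(batchSize);
    }

    public static void main(String[] args) {
        Duration perMessage = Duration.ofSeconds(30); // hypothetical per-message invisible time
        System.out.println(effectiveInvisibleTime(perMessage, 10)); // default BatchSize of 10 -> PT5M
    }
}
```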

In the future I may consider even more improvements to increase throughput of queue storage, like using multiple queues at a time to overcome the 500-requests-per-second limit. But even then, there is another throttle in place at the storage account level, which limits all storage operation requests to 5000 requests per second (this includes table storage and blob storage requests), so I may have to add support for using multiple storage accounts. But as Rinat Abdullin already pointed out to me on Twitter, this might have grave consequences for both overall latency and costs. So before I continue with this improvement I have a question for you: do you think the additional latency and costs are warranted?
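The ceilings implied by those throttles are easy to compute (a small illustrative calculation in Java, using only the figures quoted above):

```java
class ThroughputCeilings {
    // Theoretical per-queue ceiling; real-world throughput is lower still.
    static int messagesPerQueuePerSecond(int requestsPerQueue, int requestsPerMessage) {
        return requestsPerQueue / requestsPerMessage;
    }

    // Number of fully loaded queues that saturate one storage account.
    static int queuesPerAccount(int requestsPerAccount, int requestsPerQueue) {
        return requestsPerAccount / requestsPerQueue;
    }

    public static void main(String[] args) {
        // 500 requests/sec per queue, 3 requests per message (1 post + 1 get + 1 delete),
        // 5000 requests/sec per storage account, as quoted in the post.
        System.out.println(messagesPerQueuePerSecond(500, 3)); // 166
        System.out.println(queuesPerAccount(5000, 500));       // 10
    }
}
```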


<Return to section navigation list> 

SQL Azure Database and Reporting

Kevin Remde (@kevinremde) asked So you want to learn SQL Azure? in a 5/23/2011 post:

Recently a great series of 15 introductory videos, all focusing on SQL Azure, was completed and made available on Microsoft Showcase.   (You may recall that I blogged about SQL Azure in my “Cloudy April” series a few weeks ago.)

Here are the episodes:

The videos are short, single-subject, high-quality, and easy to consume and even share.

Code samples are also all publicly available at http://sqlazure.codeplex.com.

Kevin is an IT Pro Evangelist with Microsoft Corporation.


<Return to section navigation list> 

MarketPlace DataMarket and OData

No significant articles today.


<Return to section navigation list> 

Windows Azure AppFabric: Access Control, WIF and Service Bus

The Windows Azure AppFabric Team offered a Recap of TechEd North America in a 5/23/2011 post:

Last week at the TechEd North America conference we made some exciting announcements regarding the Windows Azure AppFabric May CTP and the upcoming June CTP.

Following are several resources that can help you learn more about these announcements and the related technologies.

Press coverage:

You can watch all the TechEd sessions on-demand on Channel 9.

To get a good overview of the announcements you can watch Norm Judah’s foundational session:
From Servers to Services: On-Premise and in the Cloud – AppFabric announcements and cool demos start at minute 25.

To get a high level overview of our middleware strategy and roadmap you can watch Seetharaman Harikrishnan’s session:
An Overview of the Microsoft Middleware Strategy.

Here are the other sessions covering our technologies:

Don’t forget that we are also frequently updating our Windows Azure AppFabric Learning Series available on CodePlex, which provides an easy way to learn and get started with Windows Azure AppFabric.


Clemens Vasters (@clemensv, pictured below) reported Service Bus May 2011 CTP Client Bits now on NuGet in a 5/22/2011 post:

(This post was written by my coworker Eric Lam, who pulled this all together – he wrote it as an email to the team – I’m just ripping that off.)

NuGet (http://www.nuget.org/) is an open source package manager for .NET started by Microsoft. For some context there is a really good TechEd talk here.

To install:

1. Unless you have it already (for instance as a side-effect of installing ASP.NET MVC3), install NuGet (it’s a VS extension) from here

2. Right click on the project you want to add Service bus support to and select “Add Library Package Reference”


3. The NuGet manager will show up. Search for “AppFabric” to see all our AppFabric packages available right now (note: Cache is there too!)


4. Click install, and that’s it! The package does a number of things:

  • Copies Microsoft.ServiceBus.dll and Microsoft.ServiceBus.Messaging.dll to your local project directories
  • Adds the references to your project
  • If you have an App.Config and/or Web.Config, adds the necessary WCF bindings/extensions for project consumption.

Some things to note for the package:

  • The packages require the projects to use the .NET 4.0 Full Profile (the Client Profile doesn’t work).
  • For sample code you will need to install it on a C# Console Application project.
  • This is only available for the CTP right now, not yet for the production SDK.

Let us know if you have any feedback or comments.


Keith Bauer posted AppFabric Service Bus – Things You Should Know – Part 1 of 3 (Naming Your Endpoints) to the AppFabric CAT Blog on 5/20/2011:

In this mini-series, I’ll share experiences from a recent project and explore the decisions that were made as we worked to design and deliver a solution which leverages the relay capability of the Windows Azure AppFabric Service Bus. The first part of this mini-series highlights the insights that were uncovered when considering the naming strategy used to identify thousands of Service Bus endpoints.

Introduction

After being involved in a few Service Bus deployments and participating in many architecture design reviews, I have observed that there are two primary types of Service Bus patterns that emerge when considering use of the “relay service”. First, there are implementations that expose just a few Service Bus endpoints, which are then consumed by many clients; the second is almost the exact opposite, where you will find many (e.g. thousands of) endpoints, but each one is consumed by only a few clients. While other architectures are certainly possible (i.e. many endpoints, each with many clients, or few endpoints, each with few clients), I have not yet run across many examples of these compared with the other two.

Figure 1 illustrates the Service Bus “relay service” architecture for the “many endpoints and few clients” pattern, which is indicative of the solution I found myself deeply engaged in. In particular, we had thousands of endpoints to register with the Service Bus, and we needed a naming system that made this not only possible, but easily manageable as well.


Figure 1 – Many Endpoints and Few Clients

Naming Your Endpoints

Now that we have a basic understanding of our “relay service” pattern, we need to think about a naming strategy for all of these endpoints. What is the best approach for naming 1,000 endpoints, 10,000 endpoints, or even 100,000 endpoints which all need to be registered with the Service Bus?

Before we consider the naming solution we used, we need clarity on one very important rule regarding Service Bus endpoints. The rule is this:
Once a service is registered, no other service can listen on any URI scoped at a lower level than your service.

In a hierarchical context, this rule means that if you register a node as an endpoint, then any child nodes cannot also be registered as endpoints. For example, if I registered [sb://krbcorp.servicebus.windows.net/detroit/facility1] then I would not be able to register [sb://krbcorp.servicebus.windows.net/detroit/facility1/orderservice]. However, if I did not register [sb://krbcorp.servicebus.windows.net/detroit/facility1] then I could have registered both [sb://krbcorp.servicebus.windows.net/detroit/facility1/orderservice] and [sb://krbcorp.servicebus.windows.net/detroit/facility1/shippingservice] as endpoints on the Service Bus.
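Purely as an illustration of that rule (sketched in Java; this is not the actual Service Bus registration logic), the check amounts to a prefix test on the endpoint URIs:

```java
import java.util.List;

class EndpointScopeRule {
    // Per the rule above: once an endpoint is registered, no other service can
    // listen on a URI scoped at a lower level (underneath it in the hierarchy).
    static boolean canRegister(List<String> registered, String candidate) {
        String c = trimSlash(candidate) + "/";
        return registered.stream().noneMatch(r -> c.startsWith(trimSlash(r) + "/"));
    }

    // Normalize trailing slashes so "a/b" and "a/b/" compare the same way.
    private static String trimSlash(String uri) {
        return uri.endsWith("/") ? uri.substring(0, uri.length() - 1) : uri;
    }

    public static void main(String[] args) {
        List<String> registered = List.of("sb://krbcorp.servicebus.windows.net/detroit/facility1");
        System.out.println(canRegister(registered, "sb://krbcorp.servicebus.windows.net/detroit/facility1/orderservice")); // false
        System.out.println(canRegister(registered, "sb://krbcorp.servicebus.windows.net/detroit/facility2/orderservice")); // true
    }
}
```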

Given this rule and the need to manage thousands of endpoints, we ended up with a naming solution that did not use city names, facility names, or any other type of name; rather, we used GUIDs to uniquely identify each location, followed by a descriptive name of the service. This structure allowed us to uniquely identify each endpoint on the Service Bus by a GUID (representing a specific hosting location) and the name of the service. Here is an example:
[sb://krbcorp.servicebus.windows.net/B3B4D086-BEB9-4773-97D3-064B0DD306EA/myservice1]

There are a few reasons we selected this naming structure:

1. Using city and/or facility names, which admittedly are very descriptive and used in many examples, can cause a bit of trouble because they may not be unique when you have thousands of locations hosting a service that must be registered on the Service Bus. This is particularly true for ISVs, which may have thousands of clients all over the world. Apparently city names like Franklin, Springfield, and Salem are among the most commonly used city names in the world. We could have added another level in the hierarchy (e.g. state) to help with some of this, but even that was not guaranteed to be unique. We decided against this approach in order to avoid exceptions to our taxonomy. As it turned out, there was even more meta-data we wanted to capture about the location in which our service was hosted, and this additional meta-data embedded in the address had no bearing on the rules of the Service Bus.

2. A GUID (representing the hosted location) plus a service name (representing the service) guaranteed that each registered endpoint would be unique – a requirement of the Service Bus.

3. The coding required for the automatic registration of an endpoint using GUIDs was a piece of cake. We simply generated a new GUID for each hosted service, concatenated this to our base address, and then appended the service name prior to registering the endpoint.

4. Appending the service name (i.e. myservice1) after the GUID provides us with the capability of registering more than one service per location, which non-coincidentally helps us meet the requirements of the Service Bus rule stated above.

5. And perhaps the most important… we didn’t have to worry about creating and using a taxonomy that could cause more discussion than actual work!
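The address construction from step 3 can be sketched as follows (in Java for illustration; the namespace and service names are the examples from this article, and buildEndpointAddress is a hypothetical helper, not Service Bus API):

```java
import java.util.UUID;

class EndpointNaming {
    // Assemble a unique address: base namespace, then a GUID identifying the
    // hosting location, then the service name.
    static String buildEndpointAddress(String serviceNamespace, UUID locationId, String serviceName) {
        return String.format("sb://%s.servicebus.windows.net/%s/%s",
                serviceNamespace, locationId.toString().toUpperCase(), serviceName);
    }

    public static void main(String[] args) {
        UUID locationId = UUID.randomUUID(); // generated once per hosted location
        System.out.println(buildEndpointAddress("krbcorp", locationId, "myservice1"));
        // e.g. sb://krbcorp.servicebus.windows.net/B3B4D086-BEB9-4773-97D3-064B0DD306EA/myservice1
    }
}
```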

What else did we do?

Since we had a need to capture additional meta-data for each registered endpoint, and we elected to not have this reflected in the endpoint address, we decided to create a custom meta-data service repository using SQL Azure. This repository is populated by a custom WCF service hosted in Azure, which is called whenever a new endpoint is registered on the Service Bus. This allows us to maintain a rich set of data for each registered endpoint and keeps the naming process simple and effective.

Conclusion

This wraps up part 1 of 3 in the Service Bus mini-series. I hope this bit of insight helps as you explore the Service Bus offerings, especially if you find yourself in a scenario that considers various approaches for naming thousands of registered endpoints.

Authored by: Keith Bauer


The AppFabric Team Blog announced More Service Bus Videos and Code Samples Available! on 5/19/2011:

We just released two additional videos and the accompanying code samples to our Windows Azure AppFabric Learning Series available on CodePlex.

The new videos and code samples are:

Name | Link to Sample Code | Link to Video
Windows Azure AppFabric - How to use Service Bus Queues | Click here | Click here
Windows Azure AppFabric - How to use Service Bus Topics | Click here | Click here

Another video and code sample to be released soon: How to use Service Bus Relay.

The learning series is an easy way for you to learn about AppFabric and get started with coding using the samples.

As a reminder, the Service Bus features discussed in these videos are available in the CTP version of the service in our LABS preview environment at: http://portal.appfabriclabs.com/.

It is free to use, so sign up, log in, and get started!

Don’t forget to send us feedback and ask questions on the Windows Azure AppFabric CTP forum.

If you haven’t signed up for Windows Azure AppFabric you can take advantage of our free trial offer. Just click on the image below and get started today!

free trial offer


Sam Vanhoutte described Using AppFabric topics & subscriptions to implement cloud-scale publish-subscribe in a 5/19/2011 post to his Codit blog:

Earlier this week, the first CTP of the Service Bus enhancements was released: the May CTP.  This CTP was announced at TechEd and on the AppFabric Team Blog.  One week prior to that, Codit was accepted into the Technology Adoption Program (TAP) of the Windows Azure AppFabric team.  In this program, we will work closely together with the product team to build our cloud integration offering.

Being BizTalk experts since our inception, we are highly interested in this release of the AppFabric Service Bus, since this one provides real ‘hardcore’ messaging capabilities, just like BizTalk has.  These capabilities make asynchronous, loosely coupled scenarios possible on a cloud-scale service bus.

This post dives into one of the most interesting features towards flexible routing and decoupling of messages: topics & subscriptions. 

Terminology

Queues

AppFabric Service Bus queues can be compared with the storage queues or with the message buffers of AppFabric V1, but they offer much more complex and rich functionality (sessions, dead-letter queues (DLQ), deferring of messages, duplicate detection, transactions…)

Some enhancements over the Azure storage queues:

  • Large message sizes (currently: 256KB – storage queues: 8KB)
  • Unlimited TTL (time to live – storage queues: max 7d)
  • Correlation, request/reply
  • Grouping of messages

Topics & subscriptions

Topics & subscriptions allow you to implement real publish/subscribe patterns on the Windows Azure AppFabric platform.  A sender/publisher submits his messages to a topic (this means he does not need to know anything about any potential subscribers/destinations), and linked with a topic, you can add multiple subscriptions. 
A subscription will register for messages that match the filter for that subscription.  A filter is defined as a SQL92 or LINQ expression, and that filter is matched against the properties of a message (BrokeredMessage).  If the filter matches, the message will be marked for delivery on that subscription.
A new concept is the use of filter actions.  These can be configured on a subscription to update the properties of a message. 

The following schema shows an overview of topics and subscriptions. 

Subscriptions

Someone familiar with BizTalk will immediately get this picture.  I would make the following analogy with BizTalk:

  • AppFabric Topic = BizTalk Receive Agent (Receive Port)
  • AppFabric Subscription = BizTalk Subscription
  • AppFabric Filter Action = A very lightweight BizTalk pipeline (only influencing message properties)
  • Message properties = Message Context

Implementing a pub/sub sample.

In this blog post, I am implementing a scenario where a client application (a WPF app – the publisher block in the diagram above) sends messages to a topic.  These messages are objects of a custom class: Material.  On these messages, the client application adds some properties, which will be used to match the filters against.  The client can send multiple messages over a long period, which will allow us to test creating subscriptions on the fly.

At the other end of the Service Bus, I have multiple receivers that each either listen on an existing subscription or create a subscription on the fly and receive from it.  I can start up different receivers, and messages will be delivered to them as long as the message properties match the filter of the subscription.

The code

Registering the Service Bus namespace

The blog post by David Ingham of the AppFabric team shows how you can register on the AppFabric labs portal and how you can download the SDK. You will need the name of the service bus namespace, an issuer name (owner is default, but should not be used in production) and the corresponding issuer secret key.

Creating the client application

  • Create a new WPF Forms application.
  • Add the Microsoft.ServiceBus and the Microsoft.ServiceBus.Messaging assemblies to the project.  Make sure you use the v2.0.0.0, because the v1.0.0.0 still references the old (?) version.
  • I designed the client form as displayed further down this article.
  • When clicking the submit button, I check whether an interval is required between sending messages.  If that is the case, I send messages in a for loop with a Thread.Sleep between each send.  If no interval is needed, I do a Parallel.For, so that all messages are sent in parallel to the topic.
  • When starting up the application, I initialize two important objects: the ServiceBusNamespaceClient (which will be used for management of the namespace) and the MessagingFactory (which will be used to send and receive messages).

private void Initialize()
{
    // Read issuer name & secret
    string issuerName = ConfigurationManager.AppSettings["issuerName"];
    string issuerKey = ConfigurationManager.AppSettings["issuerKey"];
    string sbNamespace = ConfigurationManager.AppSettings["sbnamespace"];

    // Create credentials
    SharedSecretCredential sbCredentials =
        TransportClientCredentialBase.CreateSharedSecretCredential(issuerName, issuerKey);

    // Namespace client (management) and messaging factory (send/receive)
    Uri sbAddress = ServiceBusEnvironment.CreateServiceUri("sb", sbNamespace, "");
    sbClient = new ServiceBusNamespaceClient(sbAddress, sbCredentials);
    msgFactory = MessagingFactory.Create(sbAddress, sbCredentials);
}

  • The implementation for sending the message looks like this.  The object being sent is of type Material; it will be serialized using the standard serializer.  I create a TopicClient for that specific topic, and from that client I create a sender.

private void SubmitToTopic(Material material, string action, int sequence)
{
    TopicClient topicClient = null;
    lock (msgFactory)
    {
        topicClient = msgFactory.CreateTopicClient(TOPICNAME);
    }
    var sender = topicClient.CreateSender();
    BrokeredMessage message = BrokeredMessage.CreateMessage(material);
    message.Properties["Type"] = material.MaterialType;
    message.Properties["Number"] = material.Number;
    message.Properties["Region"] = material.Region;
    message.Properties["Action"] = action;
    message.Properties["Sequence"] = sequence;
    sender.Send(message);
}

Creating the subscribers

  • In the subscribing application, I also add the initialization code as in the client app.
  • After collecting some information from the user through the console, I start listening on the subscription, using the following code:

var subscriptionClient = msgFactory.CreateSubscriptionClient(topicName, subscriptionName);
var receiver = subscriptionClient.CreateReceiver(ReceiveMode.ReceiveAndDelete);
bool dontstop = true;
while (dontstop)
{
    BrokeredMessage message = null;
    while (receiver.TryReceive(TimeSpan.FromSeconds(10), out message))
    {
        var material = message.GetBody<Material>();
        Console.WriteLine("Material received from subscription " + subscriptionName);
        Console.WriteLine(material);
    }
    Console.WriteLine("If you want to continue polling, enter X");
    dontstop = Console.ReadLine().ToLower() == "x";
}
  • In the subscribing application, I also give the user the ability to create a subscription and immediately start listening on it. To do so, I execute the following code prior to the code above.  I ask the user to enter a SQL expression that I use to define the filter for the subscription.

Console.WriteLine("Please enter your new subscription name.");
string subscriptionName = Console.ReadLine();
Console.WriteLine("Please enter your filter expression.");
string filter = Console.ReadLine();
sbClient.GetTopic(topicName).AddSubscription(subscriptionName, new SqlFilterExpression(filter));

Creating the queues and topics

At this moment there is no out-of-the-box tooling to create queues, topics and subscriptions.  Therefore, I wrote a simple WPF application to list, create, and delete these objects in an AppFabric namespace. 

The application at work

The following screenshots show a simple test where I am sending a message with the Region set to EU (left) and I am listening on the EUSubscription with filter (Region = ‘EU’) on the right.



Filters

The filter expressions that can be applied are very rich.  A SQL syntax is used to create these filters (the underlying store of the Service Bus is SQL Azure).  I tried out different types of filters.
These filters have been successfully tested as SqlFilterExpressions:

  • Region = ‘EU’
  • Sequence > 3 AND Sequence < 7
  • Number LIKE ‘LG%’
  • Region IN (‘US’, ‘EMEA’, ‘EU’)

Conclusions

This first release of the Service Bus enhancements is very promising and will form the basis for the entire middleware platform of the future.  This release provides real, rich messaging capabilities, allowing you to build decoupled, asynchronous connections.

Things we might (not sure yet) expect over time are:

  • Pipelines (like in BizTalk) on topics, queues and subscriptions.
  • Advanced Filter Actions
  • Content based filtering (through Filter Actions?)
  • Tooling

<Return to section navigation list> 

Windows Azure VM Role, Virtual Network, Connect, RDP and CDN

imageNo significant articles today.


<Return to section navigation list> 

Live Windows Azure Apps, APIs, Tools and Test Harnesses

Wade Wegner (@wadewegner) Merged a Pull Request for the iOS Toolkit from Fredrik Olsson on 5/23/2011:

A few days ago, Fredrik Olsson opened a pull request against our GitHub repository for the Windows Azure Toolkit for iOS.  A pull request provides a way to tell others about changes you’ve made to a repository – by sending me a pull request, Fredrik was notifying me of his updates so that I could review and, if appropriate, merge them into the main repository. 

Today I merged Fredrik’s pull request.  To me, this is exciting for two reasons: 1) it brings solid updates to our library, and 2) it emphasizes one of the core value propositions for using GitHub – collaborative development!  This goes back to some of our initial conversations as we were planning these toolkits.  We spent a lot of time discussing where to store the source code for the iOS toolkit, and we ultimately chose GitHub because we felt it gave us the best opportunity for receiving updates from the community – Fredrik’s pull request validates this choice!

Fredrik made some good updates to the library – some of them we had planned to make soon, but a few we hadn’t.  These updates include (in his own words):

  • Proper use of WA (Windows Azure) as prefix, to avoid name clashes for common names such as Queue, Blob, etc.
  • TableEntity used valueForKey: and setValue:forKey:, thus overriding KVO functionality from NSObject. Renamed to objectForKey: and setObject:forKey: as the implementation intends.
  • "Block" generally changed to CompletionHandler, blocks should be named after their intention, with fallback to "block" only for truly generic functionality such as collection enumerations.
  • Use of the verb "get" changed to "fetch". "Get" is by convention reserved for out-arguments of methods.
  • Base64 helper converted to proper categories on NSString and NSData, this is a pattern that should have been applied more broadly.

You can see the full pull request here (along with our commentary).

Thank you for your contributions, Fredrik.  Keep them coming!


Matthew Weinberger (@MattNLM) reported Microsoft Windows Azure Team Reaches Out to ISVs in a 5/20/2011 post to the TalkinCloud blog:

Microsoft Windows Azure, the built-in-Redmond PaaS offering, is looking for a few good ISVs to expand its cloud reach. To that end, Microsoft has released the Windows Azure ISV Business Economics package, a collection of case studies and how-tos designed to show developers the business benefits of developing for its cloud.

Here’s the list of the package’s contents, taken directly from the official blog entry:

  • A Windows Azure platform overview including cost examples of ISVs using Windows Azure for their SaaS applications
    • An ISV overview deck
    • “Getting Started with Windows Azure” guidance document
  • Exclusive analyst reports
    • Gartner: Forecast – Public Cloud Services, Worldwide and Regions, Industry Sectors, 2009-2014
    • IDC: Worldwide Software as a Service 2010–2014 Forecast: Software Will Never Be the Same
    • Forrester: The Top Five Changes For Application Development In 2010
  • A collection of Microsoft whitepapers and successful ISV case studies
    • Whitepapers
      • The Economics of the Cloud, Nov. 2010
      • Securing the Microsoft Cloud, Aug. 2010
    • ISV Case Studies
      • Quest Software, U.S., 3,500 employees
      • Softeng, Spain, 30 employees
      • SharpCloud, U.K., 50 employees
      • Holtl Retail Solutions, Germany, 85 employees
      • NVoicePay, U.S., 49 employees
      • Sitecore, Denmark, 200 employees

The part that really sticks out to me is the variance in size of the companies Microsoft has decided to turn into case studies. The list leads with Quest Software, a veritable giant compared to the other companies on it. It really sounds like Microsoft isn’t after the large, established players so much as smaller, specialized software startups.

That actually fits with Microsoft’s Windows Azure ISV recruitment strategy of late: back in February 2011, the company began offering free Azure services to entice developers into at least giving the platform a shot. While that deal is up very soon – June 30, 2011, to be exact – it seems obvious that Microsoft really, really wants to keep that small developer influx going


<Return to section navigation list> 

Visual Studio LightSwitch

Michael Washington (@adefwebserver) reported he reached A Time To Pivot in a 5/21/2011 post:

Sometimes you need to make a change, a “pivot” so to speak. The pivot I decided to make is to concentrate on LightSwitch applications moving forward.

Why have I come to this conclusion? Consider the Silverlight client for my open source project, http://ADefHelpDesk.com. The problem is that I started this project a year ago! The reasons I haven’t finished it:

  • So much time is spent handling basic stuff, and as more code is added things move more slowly, because you are afraid of breaking something
  • Following the various “patterns” requires even more code
  • It is just not “fun”, because I am not working on “the problem domain” but on “plumbing” (like spending time mining for iron ore rather than building cars)
  • Time is money, and this project has already taken twice as many hours as the original ASP.NET code

Simply put, when I originally started working on the project, it was fun and exciting. Later on it took me longer and longer to summon the energy to work on it, because I knew I would spend 4 hours and only get “a little bit more done”. I was swimming deeper and deeper into thousands of lines of code, and the more I worked on it, the more I knew I would be spending time working on stuff that is not “fun”.

I do not have to have fun when I am being paid, but, with Open Source, if it isn’t fun, I’m not doing it!

Also, I originally thought that LightSwitch could not be used for projects that had intensive UIs. However, after writing two articles on the subject, I now realize that you can make a custom LightSwitch “Shell” (to remove the normal LightSwitch menu and headers) and completely create the entire UI for your LightSwitch application.

So, it is possible for me to create the exact UI that I planned for http://ADefHelpDesk.com in LightSwitch.

The Future Of The Visual Studio “Empire”


I also feel that Visual Studio is making a “long term bet” on LightSwitch. The reason I believe this, has nothing to do with Silverlight, and everything to do with HTML 5.

Steve Anonsen replied to a post about LightSwitch and HTML, and indicated that it is a “scenario… …and we want to support”. Yes I am reading way too much into that simple comment, but, it is clear that there is nothing stopping a future version of LightSwitch from creating Silverlight and/or HTML pages.

When LightSwitch is able to also create HTML applications, it will be a tool that allows you to create applications faster than PHP, Ruby, etc. What this would mean is that you could create web applications faster than with any other technology, period. This will solidify and further the dominance of the colossal Visual Studio “Empire”.

Microsoft creates “Tools”. It is one thing to create a “Tree House” using only a hammer and nails, but, you need the proper “Tools” if you want to create a “House you can live in”.

Visual Studio is already the best “Development Tool” out there, and as you can see, Microsoft is only planning to accelerate its dominance.

It’s nice to see that the managers in the Microsoft Developer Tools Division are firmly at the wheel, wide awake, with the “ship” pointed in the right long-term direction.

The Road Ahead

LightSwitch is just a “framework”. My decision is really no different than my decision years ago to concentrate on making DotNetNuke websites rather than normal ASP.NET websites. That decision has allowed me to complete far more projects than if I had to code everything from scratch. Underneath the framework, it is just normal ASP.NET code.

With LightSwitch, you are still creating normal ASP.NET / Entity Framework/ WCF RIA / Silverlight code. My decision to concentrate on using the LightSwitch framework is simply that I want to complete more projects faster.


Michael Washington (@adefwebserver) posted Two Ways To Call LightSwitch From a Custom Silverlight Control to the Visual Studio Lightswitch Help blog on 5/21/2011:

It is easy to create a LightSwitch application using your own custom user interface (UI), composed entirely of Silverlight Custom Controls. The article This Is How LightSwitch Does MVVM covers the basics of how you bind Silverlight Custom Controls to LightSwitch Properties and Collections. This article demonstrates two ways to raise LightSwitch Methods (in a normal Silverlight MVVM application these would usually be Commands) from a Silverlight Custom Control.

image

We first start with the project from the article: LightSwitch Procedural Operations: “Do This, Then Do That”.

image

In that project, we built an application that allowed us to select a Book and two Sites, and transfer books between them. First we press the “Get Existing Inventory” button to see the available inventory at the two sites. After entering the amount to transfer, pressing the “Transfer Inventory” button makes the transfer.

Using A Silverlight Custom Control

image

The first step is to add a Silverlight Class Library to the LightSwitch solution.

image

We can open the Silverlight project in Expression Blend and design the UI. The only thing we cannot use is Expression Blend Behaviors. In this example, we are using Alan Beasley’s buttons from one of his projects.

image

Both of the methods to call LightSwitch from a Silverlight control require references to:

  • Microsoft.LightSwitch – ..\ClientGenerated\Bin\Debug\Microsoft.LightSwitch.dll
  • Microsoft.LightSwitch.Client – ..\ClientGenerated\Bin\Debug\Microsoft.LightSwitch.Client.dll

The Sheel Method

image

Sheel Shah, from the LightSwitch team, showed me this method. We just double-click on the button in the designer, and enter the following code:

using System.Windows.Controls;
using Microsoft.LightSwitch.Presentation;
namespace SilverlightUI
{
    public partial class InventoryUI : UserControl
    {
        public InventoryUI()
        {
            InitializeComponent();
        }
        private void TransfeerInventoryButton_Click
            (object sender, System.Windows.RoutedEventArgs e)
        {
            // Get a reference to the LightSwitch DataContext 
            var objDataContext = (IContentItem)this.DataContext;
            // Get a reference to the LightSwitch Screen
            var Screen = 
                (Microsoft.LightSwitch.Client.IScreenObject)objDataContext.Screen;
            // Call the Method on the LightSwitch screen
            Screen.Details.Dispatcher.BeginInvoke(() =>
            {
                Screen.Details.Methods["TransfeerInventory"]
                    .CreateInvocation(null).Execute();
            });
        }
    }
}

That’s it. The advantage is that it really cannot get any simpler. The only disadvantage is that the code can break (for example, if the method name or signature changes), and the breakage won’t show up at compile time, only at runtime.
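Because this approach looks the screen method up by its string name, one possible mitigation (not from the original article; the guard logic here is only a sketch, and the assumption that the `Methods` indexer returns null for an unknown name is mine) is to verify the name before invoking it:

```csharp
// Hypothetical defensive variant of the click handler (sketch only).
// Assumes Screen.Details.Methods[...] returns null for an unknown name.
private void TransfeerInventoryButton_Click(object sender, System.Windows.RoutedEventArgs e)
{
    var objDataContext = (IContentItem)this.DataContext;
    var screen = (Microsoft.LightSwitch.Client.IScreenObject)objDataContext.Screen;

    const string methodName = "TransfeerInventory";
    screen.Details.Dispatcher.BeginInvoke(() =>
    {
        var method = screen.Details.Methods[methodName];
        if (method == null)
        {
            // Surface the mismatch immediately during testing instead of failing silently.
            System.Windows.MessageBox.Show("Screen method not found: " + methodName);
            return;
        }
        method.CreateInvocation(null).Execute();
    });
}
```

A smoke test that clicks each button once would catch a renamed method just as well; only an interface-based approach gives a true compile-time check.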

The Karol Method

image

Karol Zadora-Przylecki, from the LightSwitch team, showed me this method.

First we switch to File View (you have to click on the LightSwitch project in the Visual Studio Solution Explorer first to see that button).

image

We create a class file in the Common project with the following code:

using System;
namespace LightSwitchApplication.UserCode
{
    public interface IInventoryManagement
    {
        void TransfeerInventory();
        void GetExistingInventory();
    }
}

image

We go to the LightSwitch screen that we plan to use our Silverlight Custom Control on, and click Write Code.

image

We enter code to inherit from the Interface we just created.

We use this code for the implementation:

    // Interface
    void UserCode.IInventoryManagement.TransfeerInventory()
    {
        TransfeerInventory_Execute();
    }
    void UserCode.IInventoryManagement.GetExistingInventory()
    {
        GetExistingInventory_Execute();
    }

image

We add a reference to the Common project in our Silverlight project.

We are then able to use the following code for the button event:

    private void GetInventoryButton_Click
        (object sender, System.Windows.RoutedEventArgs e)
    {
        // Get a reference to the LightSwitch DataContext 
        var objDataContext = (IContentItem)this.DataContext;
        // Get a reference to the LightSwitch Screen
        var Screen = 
            (Microsoft.LightSwitch.Client.IScreenObject)objDataContext.Screen;
        // Cast the LightSwitch Screen to our custom Interface 
        // (that we put in the "common" project)
        var InventoryManagement = 
            (LightSwitchApplication.UserCode.IInventoryManagement)Screen;
        // Call the Method on the LightSwitch screen
        Screen.Details.Dispatcher.BeginInvoke(() =>
        {
            InventoryManagement.GetExistingInventory();
        });
    }

See the article at this link for the steps on adding a Silverlight Custom Control to your LightSwitch screen.

When we run the project, we can call any Method (Command) on our LightSwitch screen.

It is suggested that you also read: This Is How LightSwitch Does MVVM, to fully understand how all of this is simply part of the MVVM pattern.

Download

You can download the code at this link.


Michael Washington also described Using The Document Toolkit for LightSwitch on 5/19/2011:

image

The Document Toolkit for LightSwitch is a commercial LightSwitch control extension. It is one of the first to become available, and is a great example of an extension that allows you to do amazing things easily with LightSwitch.

image

Click here for directions to install the control extension.

The Resume Application

To put the Document Toolkit for LightSwitch through its paces, let’s build a LightSwitch application that will allow us to organize resumes. This application will allow us to enter information about job candidates, and upload and store their Microsoft Word resumes in the application. The Document Toolkit for LightSwitch will allow us to view the resumes.

image

First we create a new LightSwitch Desktop application.

image

We use the schema above for the table. Notice that WordDocument has a type of Binary.

image

When we create a List and Details screen, we then have the opportunity to change it to use the Document Toolkit for LightSwitch extension.

image

Next, we add a button.

image

Enter the following code for the button:

using System;
using System.Linq;
using System.IO;
using System.IO.IsolatedStorage;
using System.Collections.Generic;
using Microsoft.LightSwitch;
using Microsoft.LightSwitch.Framework.Client;
using Microsoft.LightSwitch.Presentation;
using Microsoft.LightSwitch.Presentation.Extensions;
using Microsoft.LightSwitch.Threading;
using System.Windows.Controls;
namespace LightSwitchApplication
{
    public partial class ResumeDatasListDetail
    {
        partial void UploadDocument_Execute()
        {
            // Write your code here.
            Dispatchers.Main.Invoke(() =>
            {
                var dlg = new OpenFileDialog();
                if (dlg.ShowDialog() == true)
                {
                    // load document into byte[] (== binary);
                    // Stream.Read may return fewer bytes than requested, so loop until the buffer is full
                    var data = new byte[dlg.File.Length];
                    using (var stream = dlg.File.OpenRead())
                    {
                        int offset = 0, read = 1;
                        while (offset < data.Length && read > 0)
                        {
                            read = stream.Read(data, offset, data.Length - offset);
                            offset += read;
                        }
                    }
                    // assign loaded document to Document property
                    this.ResumeDatas.SelectedItem.WordDocument = data;
                }
            });
        }
    }
}

image

When we run the application, we can add a new person…

image

We can then click the Upload Document button…

image

The document shows up in the Document Viewer (note: it will have a watermark; to remove the watermark, you will need to purchase the product at http://firstfloorsoftware.com/documenttoolkit).

Click Save to save the record and the document.
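The article only covers uploading. As a hypothetical counterpart (not part of the original article, and the button name is invented), a second screen button could write the stored binary back out through Silverlight's SaveFileDialog:

```csharp
// Hypothetical "DownloadDocument" button for the same screen (sketch only).
partial void DownloadDocument_Execute()
{
    Dispatchers.Main.Invoke(() =>
    {
        // WordDocument is the Binary property defined in the table above
        var data = this.ResumeDatas.SelectedItem.WordDocument;
        if (data == null)
        {
            return; // no document stored for this record
        }
        var dlg = new SaveFileDialog
        {
            DefaultExt = ".doc",
            Filter = "Word documents|*.doc;*.docx|All files|*.*"
        };
        if (dlg.ShowDialog() == true)
        {
            // write the stored byte[] back out to the chosen file
            using (var stream = dlg.OpenFile())
            {
                stream.Write(data, 0, data.Length);
            }
        }
    });
}
```

As with the upload handler, the dialog runs on the main dispatcher, since Silverlight only allows file dialogs in response to user-initiated actions such as a button click.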


<Return to section navigation list> 

Windows Azure Infrastructure and DevOps

The Windows Azure App Fabric Team announced Fraud Prevention Measure for New Windows Azure Platform Accounts on 5/23/2011:

As a fraud prevention measure, we have recently implemented a verification process for newly purchased Windows Azure platform accounts. When a new customer first logs on to the Windows Azure platform developer portal, the customer must provide us with a cell phone number, to which we then send a 4-digit code via text message. The customer must then enter this 4-digit code on the Developer Portal for verification purposes. This is a one-time-only process. Existing Windows Azure platform customers with established accounts are not required to complete this verification process, even when purchasing a new Windows Azure platform subscription. Once the customer has successfully passed this cell phone verification, they will not be challenged again.

However, due to limitations of this process, certain markets and cell phone carriers are presently not supported. In these instances, the customer will receive a message that they need to contact support for manual verification. We apologize for this inconvenience. We are currently working hard to expand the list of supported markets and carriers.


David Lemphers (@davidlem) reported My “Business Impact of Cloud Computing” Session Video is Available! in a 5/23/2011 post:

A couple of weeks ago, I was very fortunate to be invited to OpenStack’s Design Summit for 2010, not only to attend and meet some amazing folks who I believe are at the leading edge of cloud computing (both as a technology shift and as an emerging billion-dollar industry), but also to speak on the topic of the business impact of cloud computing.

My Session and My Slides

The more I work with clients and business partners, the more passionate I become around the business transformation that becomes possible through cloud computing. As organizations start to embrace cloud technologies, and transform into on-demand businesses, the whole landscape of product and services supply and demand is changing, both through evolution and revolution. It’s so exciting!

David is a former member of the Windows Azure team who’s now a cloud specialist with PwC, a.k.a. PricewaterhouseCoopers.


Klint Finley (@klintron) reported Forrester Wave on PaaS Puts Microsoft and Salesforce.com at the Head of the Pack in a 5/19/2011 post to the ReadWriteCloud blog:

Forrester has released its Wave report on platform-as-a-service providers for 2011. It looks at how these providers serve two different types of user: professional developers and business developers.

By professional developers, Forrester means those we usually think of as developers: people who actually know programming languages and write code. By business developers, Forrester means business experts who use visual programming tools to snap together applications from existing components.

Many of the usual suspects, like Engine Yard and Makara (owned by Red Hat), aren't covered because they are focused mostly on Web, and not enterprise, development. Forrester evaluated PaaS offerings from:

Forrester concluded that Microsoft and Salesforce.com lead the pack for coders with Azure and Force.com respectively. However, Forrester does express some reservations about Salesforce.com's acquisition of Heroku. The report warns that Heroku could add complexity and confusion to Force.com, and notes that Salesforce.com has not expressed plans to move Heroku from Amazon Web Services to its own cloud.

For business developers, Salesforce.com was the only leader, though Caspio and WorkXpress received high marks for services specifically targeted at this market. The report says the companies didn't make the "leaders" cut because of "weaker strategy scores typical of new vendors." We covered Caspio here.

The report emphasizes that the PaaS market is still immature, with new companies springing up constantly and existing companies being acquired.


<Return to section navigation list> 

Windows Azure Platform Appliance (WAPA), Hyper-V and Private/Hybrid Clouds

image

No significant articles today.


<Return to section navigation list> 

Cloud Security and Governance

David Navetta (pictured below) and Nicole Friess wrote "Privacy by Design": A Key Concern for VCs and Start-Ups on 5/23/2011 for the Information Law Group:

The privacy landscape appears to be shifting toward a model that promotes greater consumer awareness of and control over data. Reflecting its consumer protection mission, the FTC’s report Protecting Consumer Privacy in an Era of Rapid Change, issued December 1, 2010, urges companies to adopt a "privacy by design" approach. Senators John Kerry (D-MA) and John McCain (R-AZ) introduced their "Commercial Privacy Bill of Rights," which adopts some of the FTC’s privacy by design principles, requiring companies to implement privacy protections when developing their products and services. The foundational principles of privacy by design, originally developed by Information and Privacy Commissioner of Canada Ann Cavoukian, address the effects of the increasing complexity of data usage. With data now ubiquitously available, as well as processed and stored on a multinational level, privacy by design is becoming internationally recognized as fundamental for the protection of privacy and data integrity.

Although privacy by design isn’t set in stone (yet), start-up companies seeking to collect and use personal information as part of their business plan may want to consider incorporating privacy by design into their everyday business practices. Similarly, as part of their due diligence process, venture capital firms scrutinizing start-ups seeking to leverage personal information would be well-advised to determine if privacy is being “baked into” the products and services being offered by such start-ups. It may be both difficult and costly for companies to implement privacy protections retroactively if privacy concerns are overlooked during the early stages of business planning. Start-ups have the advantage of building privacy protections into their business models from the outset, which can keep those companies out of trouble in the form of litigation or agency enforcement. Privacy-conscious VCs will be more inclined to fund start-ups that reduce risk by proactively addressing privacy issues and potential liability. In turn, VCs that scrutinize whether privacy is part of a start-up’s business plan will be able to better protect their investment (and their investors).

So what does privacy by design mean? How can start-up companies incorporate privacy by design principles into their business practices to attract VC funding? How should privacy and security legal risks (and solutions) be written into a start-up’s business plan? This post tries to answer these questions.

The authors continue with the following topics:

  • Step 1 - Understand Your Business Model.
  • Step 2: Understand Your Market.
  • Step 3 – Understand the Legal Risk Environment.
  • Step 4 – Integrate Privacy by Design.

And conclude:

As consumers express an increased demand for privacy protections, entrepreneurs should ask themselves if their products and services provide consumers with notice and choice as to how their data is collected and handled, and tailor their business practices accordingly. Companies are wise to understand their business model and the market in order to tailor their products and services accordingly.

Consumer outcry has caused companies such as Google and Facebook to retroactively change their privacy practices – a process that can be costly, with attendant negative publicity. Anticipating and preventing privacy violations before they happen mitigates both the risk that such invasions will occur and the costs of remediation. This means having a thorough understanding of the privacy legal risk environment. Doing so is difficult because the environment is in upheaval; therefore companies would be wise to seek professional advice to navigate the legal and regulatory landscape at both the state and federal level.

A start-up company has the advantage of being able to develop and implement a privacy program early, and bake privacy into the design of their products and services, thereby ensuring that these substantive privacy protections become a foundational part of its business model. Employees can be trained early regarding the need for privacy and network security, which helps foster a consumer-protective enterprise culture. Privacy by design makes privacy an essential component of the core product or service a company delivers. Spotting privacy issues and addressing concerns before launch aligns products and services with consumer expectations and can save everyone – entrepreneurs and VCs alike – from future headaches.


David Aiken (@thedavidaiken) posted a list of Cloud Security Links from TechEd North America on 5/19/2011:

Yesterday I had the pleasure of being the person at the front in a discussion of Cloud Security at Microsoft’s TechEd conference. Big thanks to those who came along to the last session of the day. During the discussion, I was asked for a list of links to find out more information about Microsoft’s cloud security. This probably isn’t exhaustive, but here goes:

image

General Cloud Security including data center security:

Office 365 Security

Windows Azure Platform Security

THIS POSTING IS PROVIDED “AS IS” WITH NO WARRANTIES, AND CONFERS NO RIGHTS


<Return to section navigation list> 

Cloud Computing Events

Hanuk Kommalapati described a Few interesting TechEd 2011 Sessions for On-Demand Viewing in a 5/20/2011 post:

Keynote

Tech·Ed North America 2011 Keynote Address

Don't miss the opportunity to hear from two senior Microsoft leaders -- Corporate Vice President, Server and Tools Marketing, Robert Wahbe and Corporate Vice President, Visual Studio, Jason Zander -- as they showcase a broad array of technology in the 2011 Tech·Ed Keynote.

Cloud Computing

Introducing the Windows Azure Platform by David Chappell

Cloud computing looks like the biggest change to hit our industry in many years. But taking advantage of the shift requires understanding this new approach and how to exploit it. In this presentation, David Chappell looks at the Windows Azure platform and what it means for organizations that…

Moving Applications to the Cloud by Scott Densmore

You have been building applications on the Microsoft platform for years. You master ASP.NET, SQL Server, Active Directory, and the .NET Framework. What does the Windows Azure Platform represent? What are the important considerations for moving your apps, your skills and practices to the cloud? This…

Using Windows Azure Virtual Machine Role by Vijay Rajagopalan

Learn how to use the Virtual Machine Role on Windows Azure. We’ll demonstrate best practices for building and uploading customized OS images and deploying and managing services using those images.

Microsoft SQL Azure Overview: Tools, Demos and Walkthroughs of Key Features

This session is jam-packed with hands-on demonstrations lighting up SQL Azure with new and existing applications. We start with the steps to creating a SQL Azure account and database, then walk through the tools to connect to it. Then we open Microsoft Visual Studio to connect to Microsoft .NET…

Building Scalable Database Solutions Using Microsoft SQL Azure Database Federations by Cihan Biyikoglu

SQL Azure provides an information platform that you can easily provision, configure and use to power your cloud applications. In this session we explore the patterns and practices that help you develop and deploy applications that can exploit the full power of the elastic, highly available, and…

All the Cloud Computing sessions can be found at: All TechEd 2011 Sessions on Cloud Computing.

Database and BI

Microsoft SQL Server: The Data and BI Platform for Today and Tomorrow by Quentin Clark

Foundational Sessions bridge the general topics outlined in the keynote address and the in-depth coverage in breakout sessions by sharing the company’s vision, strategy and roadmap for particular products and technologies, and are delivered by Microsoft senior executives. Attend this session for a…

A Lap around Microsoft Business Intelligence by Pej Javaheri

Familiar with the database functionality of Microsoft SQL Server or the collaboration functionality of Microsoft SharePoint, but have not had the chance to explore the broad and integrated Business Intelligence capabilities offered across these products and Microsoft Excel? This session provides…

Performance Tuning and Optimization in Microsoft SQL Server 2008 R2 and SQL Server Code-Named "Denali" by Adam Machanic

Understanding when to search for performance problems and how to tune your queries to correct them is one of the most persistent questions that face DBAs. Join Adam and Mike to explore a methodology for discovering and improving performance problems. Along the way you'll see examples that take…

Creating Self-Service Analytic BI Applications with Microsoft SharePoint 2010 by Peter Myers

SharePoint 2010 is the premier BI portal solution. In this session learn how business users can be empowered to create self-service analytic applications by harnessing the power of SharePoint. Topics cover an introduction to the Chart and Status List Web Parts and the three SharePoint service…

Abundantly "Crescent": Demos Galore by Sean Boon

What is Microsoft project code-named "Crescent" really capable of? How can it address the needs of your business? This session covers a variety of demonstrations against REAL DATA, across various industry verticals, to enable you to see what's possible when deploying this new technology. If you like…

Microsoft SQL Server Code-Named "Denali" AlwaysOn Series, Part 1: Introducing the Next Generation High Availability Solution by Santosh Balasubramanian

In this session we talk about the new high availability solution that is being introduced in SQL Server code-named "Denali". The session provides an overview of AlwaysOn and introduces the key new features and capabilities that will help businesses achieve the high availability SLA for mission…

Microsoft SQL Server Code-Named "Denali" AlwaysOn Series, Part 2: Building a Mission-Critical High Availability Solution Using AlwaysOn by Justin Erickson

In this session we walk you through the steps to deploy a high availability solution using AlwaysOn. This is a demo-heavy presentation; the experts from the product development team walk you through a high availability solution architecture and deployment, explain key architectural concepts and…

All the Database and BI sessions can be found at: All TechEd 2011 Sessions on Database and BI

Other

An Overview of the Microsoft Web Stack by Scott Hanselman

Oh yes. Building web applications on the Microsoft stack continues to evolve. There are lots of great tools to leverage, but it can be difficult to keep up with all the options. In this technical and fast-paced session, learn from Scott Hanselman how the pieces fit together. We look at Microsoft…

A Lap around Microsoft Silverlight 5 by Pete Brown

Come see what’s new and exciting with the Silverlight 5 beta. Learn about features for business application development, visualization and casual gaming. In this demo- and code-focused developer session, we hit the major new features in Silverlight 5, and get you well on your way to being productive…

Four Ways to Leverage the Microsoft Lync 2010 Client APIs in Your Applications by Albert Kooiman

This session walks you through the capabilities of the Lync 2010 client SDK. We demonstrate how easy it is to embed Lync 2010 functionality in your Windows or Microsoft Silverlight application, as well as what possibilities you have to extend and customize the Lync client with your own business…

Moving Your App and Skills from Windows Forms to Microsoft Silverlight (and WPF) by Pete Brown

Let's tackle moving a Windows Forms application to Silverlight. We cover the differences between the platforms and overall approaches, look at what is successful, and then run through an application migration using a data-oriented Windows Forms application. At the end of this session, you'll have seen…

Browse all the sessions from TechEd 2011 at http://channel9.msdn.com/Events/TechEd/NorthAmerica/2011


Eric Nelson (@ericnel) listed Windows Azure Platform announcements from TechEd Atlanta May 2011 in a 5/20/2011 post:

I will update the following as I know I’ve missed some – but it is a good starting point :-)

Drop me a comment if you have good links to share.

Eric

Windows Azure

SQL Azure

AppFabric

In the press:


Brian Loesgen summarized nine FREE Western US Azure Bootcamps in a 5/19/2011 post:

If you’re looking for a quick way to get started with Windows Azure, we’re running a series of “bootcamps” in the western US.

There is no charge for these events, and if you’ve wanted to learn more about Windows Azure, this is a great way to do it.

Space is limited, register today (links are below)!

DATE          | LOCATION           | TIME                  | LINK
May 31, 2011  | San Francisco, CA  | 8:30 AM – 6:00 PM PDT | 1032486466
June 2, 2011  | Los Angeles, CA    | 8:30 AM – 6:00 PM PDT | 1032486471
June 3, 2011  | Mountain View, CA  | 8:30 AM – 6:00 PM PDT | 1032486469
June 6, 2011  | Irvine, CA         | 8:30 AM – 6:00 PM PDT | 1032486473
June 7, 2011  | San Diego, CA      | 8:30 AM – 5:30 PM PDT | 1032486472
June 14, 2011 | Bellevue, WA       | 8:30 AM – 6:00 PM PDT | 1032486474
June 16, 2011 | Portland, OR       | 8:30 AM – 6:00 PM PDT | 1032486475
June 20, 2011 | Denver, CO         | 8:30 AM – 6:00 PM MDT | 1032487656
June 21, 2011 | Salt Lake City, UT | 8:30 AM – 5:00 PM MDT | 1032486470


<Return to section navigation list> 

Other Cloud Computing Platforms and Services

Maureen O’Gara asserted “SAP users will be able to take their SAP licenses and applications to the cloud” as a deck for her SAP Moves to Amazon article of 5/23/2011 for the Cloud Computing Journal:

SAP is moving its software to Amazon - which is still recovering its dignity after crashing and losing data.

SAP's beginning with all of its BusinessObjects business intelligence and so-called Rapid Deployment widgetry. Initially the software will just work on Linux but Windows instances are promised along with SAP's signature ERP.

When all that's finally toted up it will represent a big chunk of SAP's change.

SAP running wholesale in production on Amazon's public cloud - as opposed to just test and development - is supposed to do a lot to burnish the image of both the public cloud as an enterprise vehicle and Amazon, which says it hasn't lost any customers because of that unfortunate upgrade run amok.

Ironically, SAP complained to Bloomberg afterwards that the Amazon outage made it harder to sell clients on SaaS, but it's backpedaling on that now. It was the only company to say such a thing.

Anyway, SAP users will be able to take their SAP licenses and applications to the cloud. SAP says it has fully tested and benchmarked Amazon's resources and certified them by the same standards it applies to on-premise physical servers and virtual platforms. They reportedly worked very hard to get things just right. SAP optimized for Amazon and Amazon for SAP. Deployment should be less time-consuming.

The alliance makes Amazon Web Services a certified SAP global technology partner, and the pair will support the stuff so SAP users can cut their infrastructure costs and TCO. Third-party providers and strategic integrators like Capgemini will offer planning, deployment, migration and other customer support services.

SAP's Rapid Deployment solutions include CRM, business communications management, Sybase Mobile Sales for SAP CRM, supplier relationship management, supplier network collaboration, treasury and risk management, NetWeaver Master Data Management, IT Service Desk Operation and analytics.

By the way, SAP figures it should have a thousand customers for its relaunched SaaS Business ByDesign ERP solution by the end of the year.


Czaroma Roman reported Microsoft, SAP Team Up To Make Cloud Management and App Development Easier in a 5/21/2011 post to CloudTimes:

Expanding their long-term partnership, SAP and Microsoft announced their plans to improve integration between SAP software and Microsoft virtualization and cloud computing technologies, for their mutual customer base.

The announcement was made at SAPPHIRE® NOW, held in Orlando, Florida, May 15-18, 2011, where Microsoft was also named SAP Global Technology Partner of the Year.

The parties’ plan focuses on two key areas: first, to help .NET Framework developers build applications connecting to SAP more easily; and second, to help customers harness the power of the cloud.

With this integration, developers can expect more support and integration between the development worlds of SAP and Microsoft. They intend to make business processes from SAP software more easily consumed and extended by .NET developers, simplifying the overall application development process. This will help redefine the SAP/Microsoft developer landscape with shorter development cycles, lower costs and openness into core applications.

Microsoft and SAP are also planning to provide integration between upcoming landscape management software from SAP, Microsoft System Center and Microsoft Windows Server Hyper-V technology, bringing greater agility for cloud management and deployments.

With SAP and Microsoft’s connected offerings, customers will be able to easily scale their deployments in their own data centers or through private clouds. It’s an integration that enhances flexibility, scalability and management in the cloud.

In the future, SAP and Microsoft plan to continue their collaboration to support deployment of SAP applications for hybrid computing scenarios on the Windows Azure Platform. This will enable companies to embrace cloud computing on their terms.


<Return to section navigation list> 
