Monday, October 05, 2009

Windows Azure and Cloud Computing Posts for 10/1/2009+

Windows Azure, Azure Data Services, SQL Azure Database and related cloud computing topics now appear in this weekly series.

••• Update 10/4/2009: Mike Amundsen: REST Trade-offs; Guy Rosen: State of the Cloud – October 2009; Ron Schmelzer: Private Cloud Definitions; Sean Lin: AreYouFollowingMeToo; Harish Ranganathan: SQL Azure and the ASP.NET GridView; Jesper@Bitbucket: DDoS Attack was the problem; David Gristwood: Azure CTP, Commercial Release and Future features.
•• Update 10/3/2009: Charles Babcock and Dana Moore: Platform as a Service; Pingdom: Azure Tables miss SLA target; Eugenio Pace: Claims Based Identity & Access Control Guide; Bill Crounse, MD: American Well Systems’ use of HealthVault; Darryl K. Taft: IBM’s Hadoop for Massive Mashups; John Moore: The need for Personal Health Records; and more.
• Update 10/2/2009: Don Schlicting: SQL Azure overview; Murray Gordon: Azure Projects on CodePlex; James Staten: Cloud computing maturity; Mary Grush: Cloud safety and security; Stacey Higginbotham: Comments on Staten’s report; Andrea DiMaio: Cloud storage geopolitics; Michele Leroux Bustamante: Claims-based security; and more.

Note: This post is updated daily or more frequently, depending on the availability of new articles in the following sections:

To use the above links, first click the post’s title to display the post as a single article; the section links then navigate within it.

Cloud Computing with the Windows Azure Platform published 9/21/2009. Order today from Amazon or Barnes & Noble (in stock).

Read the detailed TOC here (PDF) and download the sample code here.

Discuss the book on its WROX P2P Forum.

See a short-form TOC, get links to live Azure sample projects, and read a detailed TOC of electronic-only chapters 12 and 13 here.

Wrox’s Web site manager posted on 9/29/2009 a lengthy excerpt from Chapter 4, “Scaling Azure Table and Blob Storage” here.

Azure Blob, Table and Queue Services

No significant new posts on this topic today.

<Return to section navigation list> 

SQL Azure Database (SADB, formerly SDS and SSDS)

••• See Harish Ranganathan’s post about SQL Azure and the ASP.NET GridView in the Live Windows Azure Apps, Tools and Test Harnesses section.

Don Schlicting’s Cloud Database with Microsoft SQL Azure article of 10/2/2009 for Database Journal is an introduction to and basic tutorial for SADB.

Johan Danforth announced a new Release 0.1.1 on CodePlex of SQL Azure Explorer dated 10/1/2009 with the “same functionality but with better UI performance (async loading of databases, tables, etc.).”
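The async loading Johan mentions is the standard cure for a frozen object-explorer UI. Here’s a minimal sketch of the pattern — my illustration, not SQL Azure Explorer’s actual code — that enumerates database names on a BackgroundWorker thread and hands the results back to the UI:

    using System;
    using System.Collections.Generic;
    using System.ComponentModel;
    using System.Data.SqlClient;

    class MetadataLoader
    {
        // Queries sys.databases on a worker thread; for SQL Azure, the
        // connection string must point at the master database.
        public void LoadDatabasesAsync(string connectionString, Action<IList<string>> onLoaded)
        {
            var worker = new BackgroundWorker();
            worker.DoWork += (s, e) =>
            {
                var names = new List<string>();
                using (var cn = new SqlConnection(connectionString))
                using (var cmd = new SqlCommand("SELECT name FROM sys.databases", cn))
                {
                    cn.Open();
                    using (var rdr = cmd.ExecuteReader())
                        while (rdr.Read())
                            names.Add(rdr.GetString(0));
                }
                e.Result = names;
            };
            // RunWorkerCompleted is marshaled back to the UI thread in WinForms/WPF,
            // so the callback can populate a TreeView directly.
            worker.RunWorkerCompleted += (s, e) => onLoaded((IList<string>)e.Result);
            worker.RunWorkerAsync();
        }
    }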


<Return to section navigation list> 

.NET Services: Access Control, Service Bus and Workflow

Eugenio Pace announces Claims based Identity & Access Control Guide – Early drafts available on CodePlex in this 10/2/2009 post:

We finally have a CodePlex site for sharing early content with you all. Check the downloads section for:

  • A few intro chapters (some of the “theory”, technologies and protocols behind claims based identity)
  • The first scenario (roughly described in my post here, but better and nicer, and written in English :-))
  • The sample code for this first chapter

The first scenario is fairly basic. However, I think people with little previous experience with Claims will find this really useful. Those who are very experienced will probably not find a lot of new content at this point.


There’s nothing specific to .NET Services Access Control for Windows Azure that I’ve found so far in the guide. See Keith Brown’s .NET Access Control Service takes a REST post of 9/22/2009 and Pedro Felix’s comment about Azure and AD FS v2 of 9/23/2009.

• Carl Franklin and Richard Campbell question Michele Leroux Bustamante about the claims-based security model in .NET Rocks’ Show #486.

The Geneva Team Blog reports on 10/1/2009 AD FS v2.0 Passes Liberty Alliance SAML 2.0 Interoperability Testing:

Interoperability of identity systems is an important consideration for a large percentage of customers. With this in mind we chose to participate in 8 weeks of SAML 2.0 testing, which was conducted by the Drummond Group Inc. As previously announced, we entered testing with three profiles, IdP Lite, SP Lite and EGov 1.5.

Today the test results were made public, and we are thrilled to announce we have passed. We are very proud of this accomplishment, and all the hard work the [Active Directory Federation Services] (AD FS) team did to make this happen.

Vittorio Bertocci adds his spin to the ADFS v2.0 celebration on 9/30/2009 with It’s official: ADFS 2.0 passes Liberty Alliance SAML 2.0 interop tests with IBM, SAP, Novell, Ping, Siemens, Entrust:

Well, in the last 12 months we certainly covered a lot of ground!

Last October we announced that we were going to support the SAML protocol in ADFS 2 (at the time announced under the codename Geneva Server).

Today we are backing that claim (pun intended) with the results of the latest Liberty Alliance Interoperability Testing, which demonstrate that ADFS 2’s SAML 2.0 protocol implementation interoperates with the corresponding products from Entrust, IBM, Novell, Ping Identity, SAP, and Siemens.

As a rightfully proud team explains in the Geneva blog, the test included the three main profiles IdP Lite, SP Lite and EGov 1.5; and it was pretty much the Cartesian product of all vendors & test cases, which kind of explains why I haven’t seen my good friend Caleb Baker as often in the last few weeks :-) [Emphasis Vibro’s.]

Congratulations to the Federated Identity team for this important milestone. Thanks to their efforts & commitment, the question “does ADFS2 interoperate with X?” just became exceedingly easy to answer :-).

<Return to section navigation list> 

Live Windows Azure Apps, Tools and Test Harnesses

••• Harish Ranganathan’s Taking your Northwind Database to SQL Azure and binding it to an ASP.NET Grid View – Part II of 8/29/2009 continues with instructions for using the ASP.NET GridView’s wizard to bind a local Northwind table and then change the connection string to rebind the GridView to a SQL Azure table:

In the previous post we examined getting access to SQL Azure, creating your first database, accessing it with SQL Server Management Studio and then migrating the Northwind database schema to SQL Azure using the SQL Azure Migration Wizard Beta.

As explained earlier, the SQL Azure Migration Wizard migrates the schema of your database after tuning it for working with SQL Azure.  However, we would still need to migrate the Data to our SQL Azure Server.  At the moment, the step I took was to open the instnwnd.sql script in SQL Server Management Studio (SSMS) and copy the Insert statements alone to execute.  Note that you cannot run all the scripts directly onto the SQL Azure portal like I explained earlier due to the limitations / formats supported currently in SQL Azure. …
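To make the rebinding step concrete, here’s roughly what the connection-string swap looks like; the server name, user and password below are placeholders of mine, and SQL Azure accepts only SQL authentication in the user@server form rather than integrated security:

    Local (wizard-generated):
      Data Source=.\SQLEXPRESS;Initial Catalog=Northwind;Integrated Security=True

    SQL Azure (hypothetical server and credentials):
      Server=tcp:yourserver.database.windows.net;Database=Northwind;User ID=youruser@yourserver;Password=yourpassword;Trusted_Connection=False;Encrypt=True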

For more details about uploading schemas and data to SQL Azure, see my Using the SQL Azure Migration Wizard with the AdventureWorksLT2008 Sample Database (updated 9/21/2009) and electronic-only chapters

  • Chapter 12: Managing SQL Azure Accounts, Databases, and DataHubs
  • Chapter 13: Exploiting SQL Azure Database’s Relational Features

from “Cloud Computing with the Windows Azure Platform” Short-Form Table of Contents and SQL Azure Chapters, which should be posted the week of 10/5/2009. I’ll update the preceding and this post when that happens.

Sean Lin posted the source code for his live Azure Are You Following Me Too demo project to CodePlex on 10/4/2009:

Project Description: This is a simple ASP.NET MVC app that runs on Windows Azure. Basically, it gets a list of twitter users who you are following [but] who aren't following you.
Demo: A live demo is running on http://ruf.cloudapp.net.
Dependencies: This project is dependent on tweetsharp, a .NET Twitter API client library.
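The heart of the app is a set difference between two ID lists. Here’s a minimal sketch of that logic — mine, not Sean’s code; tweetsharp wraps the Twitter friends/followers calls that would supply the lists:

    using System.Collections.Generic;
    using System.Linq;

    static class FollowerDiff
    {
        // IDs you follow ("friends") minus IDs following you ("followers")
        // = the people who aren't following you back.
        public static IEnumerable<long> NotFollowingBack(
            IEnumerable<long> friendIds, IEnumerable<long> followerIds)
        {
            var followers = new HashSet<long>(followerIds);
            return friendIds.Where(id => !followers.Contains(id));
        }
    }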

Pingdom reports September uptime and downtime for my live OakLeaf Systems Azure Table Services Test Harness project:

Didn’t make the 99.95% SLA for two instances again (or the 99.9% SLA for one instance, for that matter).
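For scale: a 30-day month contains 43,200 minutes, so the 99.9% SLA allows only about 43.2 minutes of downtime per month and the 99.95% SLA about 21.6 minutes.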

John Moore’s The Follow-up Visit post of 10/3/2009 describes the agonies of filling out multiple paper personal health histories for affiliated providers instead of linking to an online Personal Health Record (PHR) to receive treatment for a broken wrist.

Don’t miss John’s link to Jonathan Rauch’s If Air Travel Worked Like Healthcare article for the National Journal Magazine that’s subtitled “Fasten your seat belts – it’s going to be a bumpy flight.” Great read.

Bill Crounse, MD’s Aloha to Health ICT Past and Future post of 10/2/2009 describes American Well Systems’ use of HealthVault:

As of January, every citizen in the state of Hawaii now has access to on-line medical services (telephone, messaging, and virtual web visits) from participating physicians.  Services are available twenty-four hours a day, seven days a week.  The technology powering all this comes from American Well and Microsoft HealthVault.

Dr. Crounse is Senior Director, Worldwide Health for Microsoft.

Joseph Goedert reports Blumenthal Hints at Meaningful Use for electronic health records (EHR) required by ARRA in his 10/2/2009 post to the Health Data Management blog:

In a new letter to the industry updating federal activities, David Blumenthal, M.D., national coordinator for health information technology, explains why "meaningful use" of electronic health records is the crux of the Medicare and Medicaid incentives in the American Recovery and Reinvestment Act [ARRA]. He also gives some hints on how a proposed rule due by year-end will define meaningful use.

"By focusing on 'meaningful use,' we recognize that better health care does not come solely from the adoption of technology itself, but through the exchange and use of health information to best inform clinical decisions at the point of care," Blumenthal writes. "Meaningful use of EHRs, we anticipate, will also enable providers to reduce the amount of time spent on duplicative paperwork and gain more time to spend with their patients throughout the day.  It will lead us toward improvements and sustainability of our health care system that can only be attained with the help of a reliable and secure nationwide electronic health information system.

Joe continues with more quotes from Blumenthal's full letter that’s available in the Healthcare IT chief takes on meaning of 'meaningful' post at Healthcare IT News, not the link in Joe’s post.

• Murray Gordon’s Great Windows Azure projects on CodePlex post of 10/1/2009 lists the following Azure sample applications:

    • Client Cloud Services: Client Cloud Services (CCS) is a set of Windows Azure-based services that help application developers integrate licensing, trial management, feedback, error reporting and product usage into their applications.
    • Azure Storage Manager: Azure Storage Manager helps you organize your Azure Storage Accounts and modify data on them. It is written in C# 3.5 and has no other dependencies (like Azure Dev Tools). Get your Azure log files easily and delete tables with one click.
    • Azure File Upload Utility: This utility allows users to take the data from a delimited flat file and upload it to their Azure Table storage account. It provides an easy-to-use GUI to read a flat file from the user’s computer and upload it into specified Azure Table storage.
    • Azure Database Upload Utility: This utility allows users to take the data from a SQL Server database and upload it to their Azure Table storage account. It provides an easy-to-use GUI to read data from a SQL Server and upload it into specified Azure Table storage.

Murray is a Microsoft ISV Architect Evangelist.

Mary Jo Foley reports on 10/1/2009 Microsoft adds consumer-friendly face to its HealthVault platform:

Microsoft launched on October 1 a beta of a new MSN service aimed at helping consumers manage their own health information.

The new service, known as My Health Info, is based on Microsoft’s HealthVault. HealthVault is a software and services platform that is hosted on Windows Azure, Microsoft’s cloud-computing environment. The service makes use of Silverlight and allows users to search for health-related topics via Bing. [Emphasis added.]

My Health Info is designed to allow consumers to store all kinds of personal health information, like children’s vaccination schedules, prescription records, blood sugar levels, etc. It also will allow consumers to monitor “topical areas of interest,” like swine flu. The service can be configured to maintain separate records on multiple people, so that a user could manage information on multiple family members. …

Mary Jo notes that “Serving ads on health content is also a huge opportunity for the company.” Here’s the MSN front-end’s page for authorizing data display and entry from my HealthVault account:

I wasn’t able to view individual blood chemistry lab report values in the My Lab Results widget, which indicates there’s more work to do on the Silverlight UI.

<Return to section navigation list> 

Windows Azure Infrastructure

•• Mary Jo Foley copied slide 32 of David Gristwood’s Understanding The Azure Platform Sept 2009 presentation in her Microsoft opens Chicago and Dublin datacenters; preps for more hosted offerings article of 10/1/2009. I neglected to add a copy of the slide in the item below.

The slide includes details of Windows Azure and SQL Azure features in the current (August 2009) CTPs, what’s expected to be in the Release to Web (RTW) commercial versions, and what you can expect in the future. Here’s a larger, more readable copy (click image for full-size capture):

 

•• Mike Amundsen explores REST Trade-Offs on 10/3/2009 with links to Fielding’s thesis:

Fielding's Chapter 5 describes the REST architectural style. [T]here's quite a bit of information in this one chapter of his 160+ page dissertation. [C]urrently, the part that interests me is 5.1 - Deriving REST. [I]n it, he identifies six primary constraints, the roles they play in the REST style, and the trade-offs involved with each.
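For reference, the six constraints Fielding derives in section 5.1 are client-server, stateless, cache, uniform interface, layered system, and the optional code-on-demand.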

Fielding introduces Chapter 5, “Representational State Transfer (REST)” with the following:

This chapter introduces and elaborates the Representational State Transfer (REST) architectural style for distributed hypermedia systems, describing the software engineering principles guiding REST and the interaction constraints chosen to retain those principles, while contrasting them to the constraints of other architectural styles. REST is a hybrid style derived from several of the network-based architectural styles described in Chapter 3 and combined with additional constraints that define a uniform connector interface. The software architecture framework of Chapter 1 is used to define the architectural elements of REST and examine sample process, connector, and data views of prototypical architectures.

•• Charles Babcock’s Platform As A Service: What Vendors Offer article of 10/3/2009 for InformationWeek claims "Web app development will move to online platforms, despite the trade-offs that approach often requires.” Charles analyzes PaaS offerings from:

  • Salesforce.com (Force.com)
  • Microsoft (Windows Azure)
  • WaveMaker (Studio)
  • Engine Yard
  • Google (App Engine)
  • IBM (Blue Cloud)

The PaaS analysis is a companion to Andrew Conry-Murray’s From Amazon To IBM, What 12 Cloud Computing Vendors Deliver article of 9/5/2009.

Dana Moore’s The Risks And Benefits Of Platform As A Service sidebar for Charles’ article is an excerpt from the Coming of Age in the Era of Cloud Computing article of the same date for Dr. Dobb’s. In brief, here are the

Risks:

  • Vendor Lock-In
  • Technical Immaturity
  • Privacy And Control
  • Misjudging "Flexibility Versus Power"

Benefits:

  • Testing Is Deployment
  • Dynamic Allocation
  • Internal Entrepreneurship

Of course, Dana provides more detail for the bullet points.

James Urquhart continues his series about operating-system bloat in virtualized servers with Cloud computing and the big rethink: Part 2 of 10/2/2009:

In the opening post of this series, I joined Chris Hoff and others in arguing that cloud computing will change the way we package server software, with an emphasis on lean "just enough" systems software. This means that the big, all-purpose operating system of the past will either change dramatically or disappear altogether, as the need for a "handle all comers" systems infrastructure is redistributed both up and down the execution stack.

The reduced need for specialized software packaged with bloated operating systems in turn means the virtual server is a temporary measure; a stopgap until software "containers" adjust to the needs of the cloud-computing model. In this post, I want to highlight a second reason why server virtualization (and storage and network virtualization) will give way to a new form of resource virtualization.

I'll start by pointing out one of the unexpected (for me at least) effects of cloud computing on data center design. Truth be told, this is actually an effect of mass virtualization, but as cloud computing is an operations model typically applied to virtualization, the observation sticks for the cloud. …

Stacey Higginbotham reports Despite the Hype, There’s No Rush to Cloud Computing Yet on 10/2/2009 for GigaOm, which is based on Forrester’s “TechRadar™ For Infrastructure & Operations Professionals: Cloud Computing, Q3 2009” publication:

When it comes to cloud computing, the good news for companies offering everything that isn’t software as a service is that there’s still plenty of time to get your offerings into the market, according to a report released today from Forrester that takes the current temperature of several sections of the industry. The bad news is that adoption of one of the oldest cloud-based services — infrastructure as a service — is still pretty darn low, with only 4 percent of small businesses and 3 percent of enterprise customers even trying it.


Stacey continues with an analysis of the Forrester data.

• James Staten’s Assessing The Maturity Of Cloud Computing Services post of 10/2/2009 to the Forrester blog describes Forrester’s “TechRadar™ For Infrastructure & Operations Professionals: Cloud Computing, Q3 2009,” a new, US$1,749 Tech Radar report:

The number one challenge in cloud computing today is determining what it really is, what categories of services exist within the definition and business model and how ready these options are for enterprise consumption. Forrester defines cloud computing as a standardized IT capability (services, software, or infrastructure) delivered via Internet technologies in a pay-per-use, self-service way.

While definition is crucial to having a fruitful discussion of cloud, the proper taxonomy and maturity of these options is more important when planning your investment strategy. To this aim, Forrester has just published our latest Tech Radar that maps the existing cloud service categories (not the individual vendors within each category) along maturity and value impact lines to help you build your strategic roadmap.

Udayan Bannerjee debunks the Cloud does not require change in programming model – Microsoft claim on 10/1/2009:

In two of my previous posts I have highlighted why I think cloud computing needs a change in thinking. However, in a recent discussion Walid Abu-Hadba (of Microsoft) clearly stated that Microsoft’s cloud strategy assumes that they are going to retain the existing programming model for cloud. That is, programmers can develop their application without bothering about where it is going to be deployed.

So Udayan restates “why I think the programming model will change. I have based my thoughts on how Google App Engine is structured – after all Google has the largest cloud computing infrastructure”:

  • You do not have to worry about object-relation[al] (O-R) mapping. Big Table implementation of Google allows you to save the objects directly.
  • You will not need web server – app server – database server. All of them become redundant.
  • Designing for heavy transaction volume will not be necessary – it is built in. Instead …

    • …individual transaction performance will have to be optimized.
    • …designing for high data volume will be required.
    • …impact of data contention across transactions needs to be checked.
    • …compute intensive algorithm needs to be broken up.
  • Heterogeneity is the order of the day. Consuming services from different sources and providing services for others to use will be the starting point.
  • Approach to debugging will be totally different. I am not sure how it will be done.
  • In many situations, storing – searching – retrieving unstructured data may be advantageous compared to structured data.

I propose to elaborate each of these points in my future posts.

Is this Microsoft’s first major strategic decision in the post-Bill Gates era? Let us see how it pans out.

It will be interesting to see how Udayan makes his case.

Ellen Rubin explains Why Cloud is at the Top of the CIO’s Priorities in this 10/1/2009 post:

In the most difficult economic climate in decades, CIOs are reevaluating their strategies and looking for new ways to reduce data center costs and overhead while improving responsiveness to business requirements. Cloud computing has emerged as a much more agile and efficient approach than what companies have done in the past: adding more compute, storage and networking capacity or trying to get more out of what they already own.

Cloud computing did not emerge from a vacuum, but has its origins in three technology "megatrends" that most CIOs are already familiar with. These developments were all born out of the same need -- to drive down costs, simplify data center operations and allow IT to be as agile as possible. As these megatrends have become pervasive, they've helped put the cloud in the CIO's strike zone:

Ellen expands on each of the preceding “megatrends.”

Greg Ness’s James Urquhart: Infrastructure 2.0 Young Turk post of 10/1/2009 says:

James was just admitted to the Infrastructure 2.0 Working Group but has been discussing infrastructure 2.0 issues as long as anyone. I’ve requested a picture, but until I get one I’ll substitute … an image from Wikipedia of prominent Young Turk Ahmet Riza. …

and also offers “a collection of related articles/blogs by James, tracing back to 2006.”

Mary Jo Foley reports Microsoft opens Chicago and Dublin datacenters; preps for more hosted offerings on 10/1/2009:

Just a week after celebrating the opening of its “chiller-free” Dublin datacenter, Microsoft is turning on its $500 million, 700,000-square-foot Chicago one.

Phase one of the Chicago datacenter opened on September 30. Microsoft is turning on power in phases there so “customers today will enjoy top-notch performance and availability while we control costs for Microsoft and its shareholders,” according to a September 28 post on the Microsoft datacenters blog.

The Chicago datacenter is one of the largest datacenters in the world to make use of shipping containers, according to the company. Each of these containers holds 1,800 to 2,500 servers, which Microsoft officials have said enables the company to better conserve energy and take advantage of new power-efficiency mechanisms. …

Photo credit: ZDNet

Rich Miller provides another perspective on the Chicago data center in his Microsoft Unveils Its Container-Powered Cloud post of 9/30 to the Data Center Knowledge blog:

As the bay door opens at Microsoft’s enormous new Chicago data center, the future backs in on a trailer. Forty-foot long containers packed with servers are unloaded with winches, and stacked two-high onto “air skates” that float on compressed air. Using the air skates, as few as four employees can move the 60-ton stack into place in Microsoft’s “container canyon” in the lower floor of the facility in Northlake, Ill.

Within eight hours, the new container is fully installed, hooked up to chilled water, power and a network connection. “These hold 2,000 servers each, and they can be deployed in hours,” said Kevin Timmons, Microsoft’s general manager for data center operations. “That’s an awesome, awesome thing.”


A look at one of the double-decker data center containers housed at the massive new Microsoft data center near Chicago. The facility includes both raised-floor space and plug-n-play bays for containers packed with servers.

Photo credit: Data Center Knowledge

Rich says in his Photo Tour: A Container Data Center post of 10/1/2009:

What does a container data center look like? We have additional photos of Microsoft’s new data center in Chicago, which is optimized for 40-foot data center containers holding thousands of servers each. Check out our photo tour of the new facility, which provides additional views of the “container canyon” and the high-density server configurations inside the containers, plus a link to video of a container being installed.


James Hamilton compares Microsoft’s Chicago and Dublin data centers in his Microsoft Chicago is Live post of 10/1/2009:

Early industry rumors were that Rackable Systems (now SGI but mark me down as confused on how that brand change is ever going to help the company) had won the container contract for the lower floor of Chicago. It appears that the Dell Data Center Solutions team now has the business and 10 of the containers are from DCS.

The facility is reported to be a ½ billion dollar facility of 700,000 square feet. The upper floor is a standard data center whereas the lower floor is the world’s largest containerized deployment. Each container holds 2,000 servers and ½MW of critical load. The entire lower floor when fully populated will house 112 containers and 224,000 servers. …

Unlike Dublin, which uses a very nice air-side economization design, Chicago is all water cooled with water-side economization but no free air cooling at all.

See the Barton George item in the Other Cloud Computing Platforms and Services section regarding DCS and data centers for Windows Azure.

James gives Microsoft props:

The Chicago facility takes a page from advanced material handling and slides the containers on air skates over the polished concrete floor. Just 4 people can move a 2 container stack into place. It’s a very nice approach.

Alessandro Perilli claims Microsoft prepares Azure to compete with Amazon EC2 in this 9/20/2009 post to the Virtualization.info blog:

Our attention of course also focuses on the many shapes that IaaS clouds can have, like the Server-as-a-Service (nobody ever used this term so far, but the industry may do it at a point) or the imminent Desktop-as-a-Service architectures (DaaS).

Thus virtualization.info closely monitors both IaaS service providers (like Amazon, IBM, Rackspace, tuCloud, etc.) and IaaS technology providers (like Citrix, Desktone, Skytap, VMware, etc.).

Soon enough we are going to cover Microsoft as well. [Emphasis added.]

Right now the Microsoft cloud computing effort, called Azure, is recognized as a Platform-as-a-Service (PaaS) architecture that competes with Google AppEngine and, maybe, one day, with the result of a complex technology merge between VMware, SpringSource and Terremark.

But Microsoft is moving to extend the Azure capabilities to also become a IaaS cloud, which can compete with Amazon Elastic Computing Cloud (EC2), the Rackspace Cloud, etc. And, again, with whatever VMware and Terremark are planning to do together.

I’ve always contended that Azure competes with the Google App Engine, not Amazon Web Services. See my Lobbying Microsoft for Azure Compute, Storage and Bandwidth Billing Thresholds for Developers post of 9/3/2009 and earlier A Comparison of Azure and Google App Engine Pricing post of July 19, 2009.

Simon Munro reports that Windows Azure Is Not Cheap in this 9/30/2009 post to his personal blog:

As the official release of Azure looms, and the initial pricing model is understood, a lot of technical people are crunching numbers to see how much it will cost to host a solution on Azure.  It seems that most of the people doing the comparisons are doing them against smaller solutions to be hosted, not in some corporate on-premise data centre, but on any one of hundreds of public .net hosting providers out there.

This is not surprising since the type of person that is looking at the pre-release version of Azure is also the kind of person that has hundreds of ideas for the next killer website, if only they could find the time and find someone who is a good designer to help them (disclaimer:  I am probably one of those people).  So they look at the pricing model from the perspective of someone who has virtually no experience in running a business and is so technically capable that they have misconceptions about how a small business would operate and maintain a website.

Unsurprisingly they find that Azure works out more expensive than the cost of (perceived) equivalent traditional hosting. …

Microsoft needs to convince the geeks out there that Azure comes with a whole lot more that is very important to smaller businesses and not available from traditional hosting. So Microsoft needs to help us understand the costs, and not just the technology, in order for us to convince our customers that although Azure is not cheap, it makes good financial sense.

My view is that “Microsoft needs to convince the geeks” [i.e., Azure developers] by competing with Google App Engine on the cost of running development and demonstration projects (see the previous item.)
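The arithmetic behind the “not cheap” perception is simple: at the announced US$0.12/hour compute rate, one small Azure instance running 24×7 costs roughly $0.12 × 720 hours = $86.40 per month before storage and bandwidth charges, versus $10 or less per month for garden-variety shared .NET hosting.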

Reuven Cohen claims The Business of Cloud Computing is Booming in this 10/1/2009 post that’s subtitled: “Try to look beyond the hype and see if people are actually making money or getting work done.”

Getting a feel for the pulse of Cloud Computing can be a difficult endeavor. According to recent Google search trends "Cloud Computing" is at an all time high in terms of raw search queries. This has also been confirmed by analysts’ reports such as Gartner's Hype Cycle, which shows Cloud computing at what they aptly describe as the peak of inflated expectations. For me, tools like Google Trends & Google Insights help shed light on Cloud Computing from a search engine point of view, but sadly do little in translating into actual financial facts and figures - the stuff that actually matters.


Trying to look beyond the hype and see if people are actually making money or getting work done should be the real litmus test in terms of gauging the business opportunity for cloud computing -- and at the end of the day it's probably a better statistic. But then again these sorts of "real world" revenue & sales pipeline stats are not nearly as easy to get. So I thought I'd take a moment, and discuss some of the recent success we've seen in our segment of the cloud world.

<Return to section navigation list> 

Cloud Security and Governance

Microsoft’s patterns & practices team released patterns & practices: Azure Security Guidance to CodePlex on 9/8/2009 but only 64 downloads were reported as of 10/3/2009. I’m making the assumption that most folks didn’t hear about the move contemporaneously, so here’s a repeat of an earlier OakLeaf Cloud Security and Governance item:


According to authors J.D. Meier, Prashant Bansode and Paul Enfield:

This is the evolving knowledge base for the Azure Security Guide project currently in research.

Both the guide and this Knowledge Base will contain proven practices for building secure distributed applications on the Azure platform. Azure is Microsoft's offering in the cloud computing space. The guide will contain proven practices, end-to-end application scenarios, guidelines, questions & answers, videos and task-based how-to articles. This guide will be a collaborative effort between patterns & practices, product teams, and industry experts.

NOTE: Information contained here is in flux. This project is in its research phase and all information is unofficial and subject to change.

The authors continue with “What’s New,” “About the Project,” and “Frames” sections.

Joseph Goedert’s Survey Highlights Power, Limits of EHR Data post of 10/2/2009 analyzes PricewaterhouseCoopers’ recent Transforming healthcare through secondary use of health data report:

More than three-quarters of 732 surveyed executives at provider, payer and pharmaceutical organizations believe secondary use of data from electronic health records will be their organizations' greatest asset during the next five years.

But respondents also cite multiple barriers to best use of de-identified and aggregated health information. They also cite the necessity of guidelines for the usage of secondary data that is to be shared. New York consulting firm PricewaterhouseCoopers conducted the e-mail survey in June, getting replies from 482 providers, 136 insurers and 114 pharmaceutical/life sciences organizations.

Sixty-five percent of surveyed providers use secondary data to some degree, as do 54% of payers and 66% of pharmaceutical firms.  Besides EHRs, this data can come from claims, clinical trials, laboratory and radiology reports, employers, and disease management companies. Those surveyed expect their use of such data to rapidly grow and already report such benefits as quality improvements, reduced costs, increased revenue and higher patient/member satisfaction.

Forty-one percent of surveyed pharmaceutical companies report having partial access to some clinical data in electronic health records and 10% report full access, leaving 49% with no access.

Only 39% of surveyed insurers offer personal health records to members, but few use them. Only 12% of these payers report that a majority of their members have filled out a PHR; 59% report less than 5% of members have done so. …

PricewaterhouseCoopers offers a substantial amount of analytical healthcare-reform content here.

Andrea DiMaio’s The Boundaries of Cloud Computing: World, Nation or Jurisdiction? post of 10/2/2009 describes geo-dependent jurisdictional issues with data stored in the cloud:

I have been writing before about the geopolitics of cloud computing (see here and here), and I am planning to write a fully fledged note for Gartner clients. This topic keeps emerging with clients at state, provincial and local level, who expect future IT investments to be directly or indirectly connected to economic recovery, and can’t quite articulate how cloud computing can help locally in that respect.

Besides that, yesterday I had an exchange with a few Gartner colleagues about the thorny issue of data location. Vendors like Google claim they will be able to provide government clients in the US cloud services that comply with federal security requirements (FISMA) and run on servers that are physically located in the US. Our discussion was whether this will be enough also for state & local as well as international government clients to be moving their email or desktop applications into “the cloud”. Of course this does not apply only to Google but to any vendor providing cloud services of sort. …

• Mary Grush describes her Safety & Service in the Skies post of 10/1/2009 to Campus Technology as “Three SaaS providers talk about why cloud computing is more secure than you think.”

AS COLLEGES AND UNIVERSITIES rely more heavily on software as a service (SaaS), they're putting more critical data in the cloud. What are the security issues, and how are cloud providers responding? CT went to three higher ed SaaS vendors -- Google, IBM, and TopSchool -- and asked them to share their thoughts about the state of security in cloud computing. Our panelists for this virtual roundtable discussion were:

  • Anthony Hill, CTO for SaaS-based student lifecycle management provider TopSchool. Previously he was CIO of Golden Gate University (CA), where he led a major initiative to move the university's IT services to the cloud.
  • Jeff Keltner, business development manager at Google, responsible for Google Apps in the education sector worldwide.
  • Dennis Quan, director of autonomic computing in the IBM Software Group. He launched the IBM/Google Cloud Computing partnership in 2007.

Robert Rowley, MD’s ONC elaborates on security and privacy post of 10/1/2009 discusses how ARRA extends protection of patient medical information beyond that required by HIPAA:

The federal Office of the National Coordinator (ONC) for healthcare IT receives advice from two committees: the HIT Policy Committee and the HIT Standards Committee. The Policy Committee has been developing a policy framework for development of a nationwide health IT infrastructure, including standards for the exchange of patient medical information. They have already made finalized recommendations to the ONC for defining Meaningful Use.

In their September 18, 2009, meeting, the HIT Policy Committee focused their attention on privacy and security, declaring these to be “foundational requirements for appropriate management and exchange of individuals’ health data.” The committee sought testimony and comments in four broad categories: (1) individual choice/control and data segmentation; (2) use, disclosure, secondary use, and data stewardship; (3) aggregate data use, de-identification/re-identification, and models for data storage; and (4) transparency, accountability and audit.

The overview that was presented reviewed how ARRA “changes the game,” extending privacy and security beyond what was previously covered by HIPAA. Custodians of personal health information (PHI), such as EHR vendors, or anyone involved in the collection and transmission of PHI, need a Business Associate (BA) agreement. Breach notification requirements now extend beyond EHRs, and include PHR vendors (which were not included under HIPAA previously). One principle highlighted was that an individual has the right to restrict disclosure of PHI, and to limit the use and request for PHI to the “minimum necessary” information for the purposes intended – mainly, this applies to health plan and other third-party payor, restricting PHI disclosure to only what is needed for bill payment. …

Hearsay Podcasts present on 10/1/2009 a Chris Hoff on Cloud Security, Disruptive Technologies and PCI DSS podcast:

Dennis Fisher talks with Chris Hoff of Cisco about the challenges of cloud deployment and security, cloud use cases and the effect that disruptive technologies can have on enterprises, vendors and standards organizations.

Bill Brenner’s 5 Mistakes a Security Vendor Made in the Cloud of 9/30/2009 chronicles an unnamed security vendor’s SaaS missteps:

When security experts sound the alarm about enterprises embracing cloud computing with little understanding of the risks, it's usually a case where the expert -- working for a vendor -- is making a pitch for their employer's products. That's all well and good, but here's the problem -- some of them have trouble keeping their own side of the cloud clean.

That, according to Nils Puhlmann, co-founder of the Cloud Security Alliance and previously CISO for such entities as Electronic Arts and Robert Half International. …

"This major security vendor basically did everything you can possibly do wrong when rolling out the latest version of its SaaS (software as a service) product, leading to users uninstalling their solution in large numbers," he said.

The post continues with the gory details.

<Return to section navigation list> 

Cloud Computing Events

Sergio Pereira reports a CouchDB Presentation at the next Chicago ALT.NET group meeting:

In this month's Chicago ALT.NET meeting we will be taking a look at Apache CouchDB. I quote from the official site:

Apache CouchDB is a document-oriented database that can be queried and indexed in a MapReduce fashion using JavaScript. CouchDB also offers incremental replication with bi-directional conflict detection and resolution.

CouchDB provides a RESTful JSON API that can be accessed from any environment that allows HTTP requests.

6:30 pm Get Comfy With CouchDB

CouchDB is one of the more mature schema-less map/reduce object dbs out there. In this talk we'll cover the basics of what CouchDB is, and why it's cool, and then we'll run through a sample application. The application will show off LINQ to Couch, basic persistence, views and full-text search with CouchDB-Lucene.

When: 10/14/2009 6:00 PM to 8:30 PM CT   
Where:  Redpoint Technologies, 233 South Wacker Dr., Suite 750 (Sears Tower), Chicago, IL 60606, USA
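Because CouchDB’s API is plain HTTP plus JSON, no CouchDB-specific driver is needed to try it from .NET. Here’s a minimal sketch — assuming a local CouchDB on its default port 5984 and a database name I made up:

    using System;
    using System.Net;

    class CouchDemo
    {
        static string Send(string url, string method, string json)
        {
            using (var client = new WebClient())
            {
                client.Headers[HttpRequestHeader.ContentType] = "application/json";
                return client.UploadString(url, method, json);
            }
        }

        static void Main()
        {
            // PUT /demo creates the database (CouchDB returns an error, and
            // WebClient throws, if it already exists); PUT /demo/first-doc
            // creates a document with that ID.
            Send("http://localhost:5984/demo", "PUT", "");
            Send("http://localhost:5984/demo/first-doc", "PUT",
                 "{\"type\":\"note\",\"text\":\"hello couch\"}");

            // GET returns the document's JSON, including its _id and _rev fields.
            using (var client = new WebClient())
                Console.WriteLine(client.DownloadString("http://localhost:5984/demo/first-doc"));
        }
    }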

Shrikant Sarda and Trae Chancellor will present a Cloud Computing for Your Enterprise Webinar on 10/7/2009 at 4:30 PM India Standard Time:

According to Shrikant and Trae, this webinar will teach the following:

    • What cloud computing is and what’s behind its momentum
    • What makes Force.com a complete platform
    • How Force.com enables faster app-building at ½ the cost
    • How to start building your first Force.com app

Register for the webinar here.

<Return to section navigation list> 

Other Cloud Computing Platforms and Services

••• Jesper@Bitbucket’s On our extended downtime, Amazon and what’s coming post of 10/4/2009 updates the Something is going on with our storage (again) static page cited below with the reason for the downtime: a DDoS UDP attack followed by “something like a TCP SYNFLOOD. Amazon employed new techniques for filtering our traffic and everything is fine again now.”

Update 10/5/2009: Cade Metz’s DDoS attack rains down on Amazon cloud article of 10/5/2009 for The Register provides more background on Bitbucket and its 19-hour outage.

Guy Rosen delivers his State of the Cloud – October 2009 post on 10/3/2009:

With each new month comes a new State of the Cloud post. In case you have joined recently, State of the Cloud is a regular report on the adoption of cloud infrastructures, comparing the market share held by each provider. As always, please refer to the first post in the series for methodology, data sets and caveats.

Snapshot for October 2009

[Chart: Cloud Provider Comparison]

Guy continues with Monthly Growth and Trends charts.

Update 10/5/2009: Sam Johnston (@samj) responds to Guy’s chart in three 10/5/2009 Tweets:

    • Counting the number of *web sites* hosted per #cloud provider is misleading IMO - this is but one of a myriad applications.
    • Furthermore, making comparisons between generic infrastructure services and task-specific platforms isn't telling the whole story.
    • A fairer comparison would be [number of] instances of Rackspace Cloud *Servers* (not *Sites*) to Amazon EC2.

Guy Rosen (@guyro) responds with a 10/5/2009 Tweet:

@samj The stats do count Rackspace Servers not Sites. As for web being one of many apps - true. See caveats on the 1st - http://bit.ly/mtYUw

••• Ron Schmelzer climbs on the definitions bandwagon with his Private Clouds: A Valuable Concept or Buzzword Bingo? post of 10/4/2009:

Every once in a while, the machinery of marketing goes haywire and starts labeling all manner of things with inappropriate terminology. The general rationale of most marketers is that if there's a bandwagon rolling along somewhere and gaining some traction in the marketplace, it's best to jump on it while it's rolling. After all, much of the challenge of marketing products is getting the attention of your target customer in order to get an opportunity to pitch products or services to them. Of course, if it doesn't work with one bandwagon, as the old adage goes, try, try again. This is why we often see the same products marketed with different labels and categories applied to them. Sure, the vendors will insist that they have indeed developed some new add-on or tweaked a user interface to include the new concept front and center, but at the very core of it, the products remain fundamentally unchanged. …

He continues with a detailed list of these five private cloud concepts:

  1. Company-Owned and Operated, Location-independent, Virtualized (Homogeneous) Service Infrastructure
  2. Virtualization Plus Dynamic Provisioning (Elasticity)
  3. Governed, Virtualized, Location-Independent Services
  4. Internal Business Model for Pay on Demand Consumption of Location-Independent, Virtualized Resources
  5. Marketing Hype, Pure and Simple

and concludes with the “ZapThink Take.”

Ron is founder and senior analyst of ZapThink.

BitBucket encountered a serious problem with Amazon Web Services’ Elastic Block Storage (EBS) that resulted in about a day’s outage, as described in BitBucket’s Something is going on with our storage (again) static Web page of 10/3/2009:

Last night (around 23:00) the [I/O] requests going to the disks we have running on the Amazon EBS started queuing up, as the disk didn't answer the questions it got fast enough. This led to high load, and no responses. I've moved the instance, disks and such to another availability-zone, but it still seems to be faulty, laggy and just not working. As this is a major problem with the underlying architecture used to run bitbucket.org, it's out of our hands, and there is really not much we can do. …

The page continues with more details on the BitBucket team’s attempt to remedy the situation, but ends with this plaintive paragraph:

UPDATE 3: Amazon has acknowledged that there is in fact a problem. They're investigating, and we're waiting for the go-ahead.

Darryl K. Taft reports IBM's M2 Project Taps Hadoop for Massive Mashups for eWeek.com on 10/2/2009:

IBM is using Hadoop to enable ad hoc analytics at Web scale in an effort called Massive Mashups, or M2.

In other words, IBM is making Hadoop accessible to business professionals to enable them to gain access to analytics on the fly and to present the results in an easily accessible way, said Rod Smith, IBM Software Group's vice president of emerging Internet technology, during a keynote presentation at the Hadoop World: NYC conference here on Oct. 2.

• Krishnan Subramanian analyzes CohesiveFT’s response to Amazon’s Virtual Private Cloud in his CohesiveFT Rocks With VPNCubed For vCloud post of 10/2/2009:

On Wednesday, CohesiveFT (our previous coverage of VPNCubed here and here) announced the release of VPNCubed for VMware's vCloud ecosystem. VPN-Cubed provides a security perimeter for the IT infrastructure, whether it is inside a single cloud, multiple clouds, or a hybrid cloud-datacenter ecosystem.

VPN-Cubed uses the popular Open Source VPN software, OpenVPN, to act as an encrypted LAN inside a single cloud or as an encrypted WAN across multiple clouds. This allows the cloud based clusters or the hybrid cloud-enterprise datacenter ecosystem to appear as a single physical network.

In March of this year, CohesiveFT announced the release of VPNCubed for EC2. With this release Amazon EC2 customers could create their own secure private network either within an EC2 region or across several regions. A couple of months back, Amazon released Amazon Virtual Private Cloud and at that time I wondered what it means to CohesiveFT. …

Cloudera Announces Beta Release of Cloudera Desktop in this press release of 10/2/2009:

Cloudera, the commercial Hadoop company, has announced the release of Cloudera Desktop, a unified graphical user interface for Hadoop applications. The initial release of Cloudera Desktop includes tools for job and cluster management.

Cloudera Desktop makes Hadoop easier to use and manage. Business analysts, developers and administrators can use the Cloudera Desktop user interface to create and submit jobs, to monitor cluster health and to browse the data stored on a Hadoop cluster. Experienced users can choose between the command-line tools provided by the open source project or the Cloudera Desktop GUI. …

Barton George says in his Talking about Dell’s Cloud efforts post of 10/1/2009:

Last Friday I got together with Michael Coté of RedMonk and John Willis of Canonical for a podcast.  We met up at a nearby coffee shop and chatted about a whole bunch o’ stuff.

You can listen to the actual podcast on Cote’s blog.

Some of the topics we tackle:

  • What I’ll be doing at Dell as Cloud Evangelist.
  • Dell’s cloud building business, focused on a small group of hyper-scale customers (Azure and Facebook being a couple I can name), delivering a high volume of highly customized machines for these customers. [Emphasis added.]
  • Some of the learnings we’ve gained from working with this group.
  • Our intent to take this effort to a much wider group of customers and offer complete cloud solutions made up of hardware, third party software, a reference architecture and services.
  • Dell’s other major cloud effort:  providing Support services as a service.
  • Recent industry events and upcoming cloud conferences.
  • And last, but not least, John and Cote introduce me to the wild and wonderful world of Pokens.

Michael Coté tells his side of the podcast story in Dell in the Clouds, Barton George – IT Management & Cloud Podcast #56 on 9/25/2009:

For this episode, John and I are joined by special guest Barton George, Dell Cloud Evangelist. You might remember Barton from the Profilers in Courage series at CloudCampAustin. As you can imagine, we spend most of our time talking about Dell’s current doings in cloud computing:

Randy Bias claims Amazon’s EC2 Generating $220M+ Annually in this 10/1/2009 post to his Cloudscaling blog:

Today I’m going to tell you how much revenue Amazon’s Elastic Compute Cloud (EC2) is generating. After all, you, my regular readers, come to this blog for its insight, original thinking, and gems of wisdom. Today I have something particularly juicy for you: real-world numbers on Amazon’s EC2 size and revenue.

Recently Ruv Cohen talked about how cloud computing is taking off and Guy Rosen made an attempt to derive EC2 usage information. What I have today is actual verified EC2 numbers plus some guesses and a rough model of its current annual usage. Not only do I have a pretty good line on Amazon EC2’s size and usage, but we can use that to infer revenue to a fairly close approximation. Once we have EC2’s revenue numbers we also have a good notion of the overall size of the current infrastructure cloud computing (IaaS) market. Looking at Amazon’s most recent 10-Q we can even make a guess about the annual growth rate. …
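As a back-of-the-envelope check (my arithmetic, not Randy’s model): at the then-current $0.10/hour small-instance rate, $220M/year works out to $220,000,000 ÷ ($0.10 × 8,760 hours) ≈ 250,000 small-instance-equivalents running around the clock; sorting out the actual mix of larger instance types and partial-month usage is what Randy’s verified-instance model is for.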

Rich Karpinski asks and answers How real is the cloud? Evidence from IBM, Amazon in this 10/1/2009 post to the TelephonyUnfiltered blog:

Yesterday, we ran a story on a Yankee Group survey showing that the economy had slowed many companies’ plans to adopt so-called cloud computing efforts. Oracle’s outspoken boss Larry Ellison has also been pounding the approach, dubbing it “water vapor” in one report. But there’s some proof of momentum and growth as well.

CIO.com published a story this week touting Amazon’s pioneering cloud computing effort as much more than a mere “toy,” contending that Amazon Web Services is provisioning 50,000 EC2 server instances per day, or a run rate of more than 18 million annual provisioned instances. IBM, meanwhile, this week is opening to a more public beta its test and development cloud, according to GigaOm, with a planned private cloud service in the offing as well.

What appears to be at work here is some initial enterprise caution, obviously due to the economy but also credited to legitimate concerns about moving mission critical applications out into the cloud. …

Krishnan Subramanian reports on 10/1/2009 IBM Opens Their Test And Development Cloud For Public Beta:

IBM jumped on the cloud bandwagon with their IBM Smart Business Services Cloud. Unlike Amazon Web Services, which is a general purpose cloud, IBM took the approach of task-based clouds built around a workload, targeting enterprise customers wary of Amazon’s public clouds. They positioned their cloud solutions as something which IBM can deploy quickly and securely either inside the enterprise firewall or on IBM Cloud. They are optimized for workloads as diverse as software development and virtual desktops, as well as smarter traffic management and smarter retail.

The IBM Smart Business cloud portfolio, developed based on the conversations IBM has had with their enterprise customers, is meant to help clients turn complex business processes into simple services. To accomplish this, Smart Business clouds bring sophisticated automation technology and self-service to specific digital tasks as diverse as software development and testing; desktop and device management; and collaboration. …

Bill McNee’s CODA and Salesforce.com Team Up to form FinancialForce.com Research Alert of 9/30/2009 from Saugatuck Technology analyzes this new jointly-owned company:

On September 30th, 2009, UK-based CODA (a division of Unit 4 Agresso) announced that it has formed a new company (FinancialForce.com) in partnership with Salesforce.com. With this announcement, the development team and CODA 2go SaaS-based financial offering (now called FinancialForce) migrates to FinancialForce.com, which will operate out of San Mateo, CA and Harrogate, UK. Ownership of the new entity is shared between Unit 4 Agresso and Salesforce.com, although Saugatuck believes that Salesforce.com’s equity stake is 20 percent or less of the new firm.

Built from the ground up on Force.com, and fully integrated with Salesforce CRM, FinancialForce has some impressive technological advantages that will serve it well over time. When deployed in conjunction with Salesforce CRM, clients are provided an integrated / unified data model, and the ability to leverage Force.com’s strong customization and workflow capabilities. Further, the creation of the new entity and expanded relationship with Salesforce.com provides FinancialForce.com some useful go-to-market and operating benefits, from branding to lead-generation to Salesforce providing hosting and tier 1 customer support for the new offering. …

Bill continues with “Why Is It Happening?” and “Market Impact” conclusions.

Judith Hurwitz explains Why is Larry Ellison afraid of the cloud? in this 9/30/2009 post to her Troposphere blog:

I just finished watching Larry Ellison’s conversation with Ed Zander at the Churchill Club, a Silicon Valley business and technology forum. While these types of dialogues are not rare in the industry, I found this one to be particularly insightful. I think we will look back at this conversation as a watershed moment regarding the role of hardware, software, integration, and the cloud.

In case you missed seeing this video on YouTube, I recommend that you watch it: http://www.youtube.com/watch?v=rmrxN3GWHpM. If you don’t have time, let me summarize the key points that I heard and give you my take. …

<Return to section navigation list> 

Note: Blogger throws HTTP 400 errors when updating posts with added labels, so new labels currently aren’t present for items added on 10/2/2009 or later. The problem report is here.
