Thursday, September 03, 2009

Windows Azure and Cloud Computing Posts for 8/31/2009+

Windows Azure, Azure Data Services, SQL Azure Database and related cloud computing topics now appear in this weekly series.

Updated 9/3/2009:
• Updated 9/2/2009: George Huey’s SQL Azure Migration Wizard, Steve Marx on instant Windows Azure CTP invitations.

Note: This post is updated daily or more frequently, depending on the availability of new articles in the following sections:

To use these links, click a post title to display the single article you want to read.

Azure Blob, Table and Queue Services

Elisa Flasko announces ADO.NET Data Services v1.5 CTP2 Now Available in this 9/1/2009 post to the ADO.NET Team blog:

ADO.NET Data Services v1.5 CTP2 is now available for download! This release (v1.5) targets the .NET Framework 3.5 SP1 & Silverlight 3 platforms and provides new client and server side features for data service developers. 

What’s included in CTP2?

This release includes updates to the features that were in the CTP1 release of ADO.NET Data Services v1.5 plus a few additional new features and a number of bug fixes including:

  • Projections
  • Data Binding updates
  • Feed Customization (aka "Web Friendly Feeds") updates
  • Server Driven Paging (SDP) client library support
  • Enhanced BLOB Support client library support
  • Request Pipeline
  • "Data Service Provider" Interface updates

For more information and to watch the Getting Started video, check out the ADO.NET Data Services Team Blog.
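Of the new features, projections are probably the most visible to client developers: LINQ Select expressions now translate to the $select query option, so only the requested properties come over the wire. Here’s a minimal sketch, assuming a hypothetical Northwind-style service and Customer entity type (neither is from the CTP documentation):

    using System;
    using System.Data.Services.Client;
    using System.Linq;

    public class Customer
    {
        public int CustomerID { get; set; }
        public string CompanyName { get; set; }
        public string ContactName { get; set; }
    }

    class ProjectionDemo
    {
        static void Main()
        {
            var ctx = new DataServiceContext(
                new Uri("http://example.com/Northwind.svc"));

            // Translates to .../Customers?$select=CustomerID,CompanyName,
            // so the ContactName property never crosses the wire.
            var names = from c in ctx.CreateQuery<Customer>("Customers")
                        select new { c.CustomerID, c.CompanyName };

            foreach (var n in names)
                Console.WriteLine("{0}: {1}", n.CustomerID, n.CompanyName);
        }
    }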

Mike Flasko adds more technical detail about Astoria 1.5 CTP2 in his ADO.NET Data Services v1.5 CTP2 – Now Available for Download post of 9/1/2009 to the ADO.NET Data Services Team Blog.

Microsoft’s Jay Haridas delivers .NET and ADO.NET Data Service Performance Tips for Windows Azure Tables in this detailed thread of 8/29/2009 in the Windows Azure forum. Jay’s topics include (a configuration sketch follows the list):

  1. Default .NET HTTP connections is set to 2
  2. Turn off 100-continue (saves 1 roundtrip)
  3. To improve performance of ADO.NET Data Service deserialization
  4. Turn entity tracking off for query results that are not going to be modified
  5. All about unconditional updates/deletes
  6. Turning off Nagle may help Inserts/Updates
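Most of these tips reduce to a few lines of configuration at role startup plus one setting on the data service context. A minimal sketch, assuming the standard System.Net and ADO.NET Data Services client libraries; the connection limit and account URI are arbitrary examples:

    using System;
    using System.Net;
    using System.Data.Services.Client;

    class TableTuningDemo
    {
        static void Main()
        {
            // Tip 1: the default of two concurrent HTTP connections per host
            // throttles parallel table requests; raise it.
            ServicePointManager.DefaultConnectionLimit = 48;

            // Tip 2: skip the Expect: 100-continue handshake, saving a roundtrip.
            ServicePointManager.Expect100Continue = false;

            // Tip 6: Nagle's algorithm batches small writes and can delay
            // inserts/updates; turn it off.
            ServicePointManager.UseNagleAlgorithm = false;

            // Tip 4: don't track entities you won't modify.
            var ctx = new DataServiceContext(
                new Uri("http://myaccount.table.core.windows.net"));
            ctx.MergeOption = MergeOption.NoTracking;
        }
    }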

Jay should turn this thread into an Azure Team blog post.

Pablo Castro adds his 2 cents in ADO.NET Data Services v1.5 CTP2! of 9/1/2009. Pablo’s been unusually quiet on the blog front lately.

Phani Raju from the Astoria team writes about the new Web Friendly Feeds features in his Its ALIVE !!! ADO.NET Data Services V1.5 CTP 2 is now in the wild!! post of 8/31/2009.

<Return to section navigation list> 

SQL Azure Database (SADB, formerly SDS and SSDS)

Wade Wegner’s SQL Azure Migration Wizard post of 9/1/2009 is a highly favorable review of (his “good friend and coworker”) George Huey’s SQL Azure Migration Wizard (see below). The post also includes a 00:17:09 video showing the tool in action.

George’s Wizard uses regular expressions stored in a .config file to conform generated SQL Server 2005 or 2008 T-SQL scripts to the capabilities of the SQL Azure August 2009 CTP.
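The approach is straightforward to picture: read the generated script, run it through a list of pattern/replacement pairs, and write the result. A toy sketch with a single invented rule (the Wizard’s real patterns live in its .config file and cover far more cases):

    using System.IO;
    using System.Text.RegularExpressions;

    class ScriptScrubber
    {
        static void Main()
        {
            string script = File.ReadAllText("Northwind.sql");

            // Invented example rule: SQL Azure has no user-controllable
            // filegroups, so strip ON [PRIMARY] clauses from the script.
            script = Regex.Replace(script, @"\s+ON\s+\[PRIMARY\]",
                                   string.Empty, RegexOptions.IgnoreCase);

            File.WriteAllText("Northwind.azure.sql", script);
        }
    }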

Wade (@WadeWegner) and George are architects in the Developer & Platform Evangelism division at Microsoft.

George Huey posted his SQL Azure Migration Wizard to CodePlex on 8/31/2009:

The SQL Azure Migration Wizard helps you migrate your local SQL Server 2005 / 2008 databases into SQL Azure. The wizard walks you through the selection of your SQL objects, creates SQL scripts suitable for SQL Azure, and allows you to edit / deploy to SQL Azure.

Thanks to Maarten Balliauw for the heads-up. I’ll test it later today and update this post with the results.

Julien Hanssens offers a no-install test of his CTP Release “SQL Azure Manager” v0.1 project as of 8/25/2009:

With the official introduction of SQL Azure (August CTP) on August 18th 2009 it wasn’t really intuitive to connect to your SQL Azure Database. Sure, there is an official workaround but that honestly just doesn’t cut it at the moment.

So to quickly start working without any fuss, Maarten Balliauw and I figured we could quickly and easily create a functional tool that would allow us to easily manage our cloud databases. This became the project SQL Azure Manager, which can be roughly defined as:

SQL Azure Manager is a community effort to quickly enable connecting to your SQL Azure database(s) and perform basic tasks.

And it does just that. Note that it is a first conceptual release. And that it is highly unstable. But it does the trick. At least at a bare minimum. And for the time being that is enough.

There’s more information on the project in Julien’s earlier SQL Azure Manager post.

<Return to section navigation list> 

.NET Services: Access Control, Service Bus and Workflow

Nick Randolph sent me a heads-up today with the following list of his posts, most of which are about .NET Services:

That’s a lotta posts for two or three days. I’ve added Nick’s site to my feed reader so I don’t miss his future posts.

Maarten Balliauw calls Matias Woloski’s .NET Service Bus – Remote Desktop over Firewalls! post a “cool use for the .NET Service Bus! TCP over HTTP” in a 9/1/2009 Tweet. Matias says:

Today was holiday in Argentina but I had to work on some pending stuff (yeah, lucky me). I didn’t want to travel to the office but I had to access a SQL Server that was hosted at Southworks LAN and we don’t have inbound ports open to connect to our workstations through RDP (port 3389). So…. the .Net Service Bus came to the rescue! Last week David Aiken told me about this cool project hosted on codeplex http://socketshifter.codeplex.com. He told me “these people are streaming video over the service bus”…

So yesterday before leaving the office I opened up the socketshifter server, configured my service bus account and allowed some ports to be redirected. Today I connected from home and here is a nice screenshot of RDP to a Southworks LAN machine (connected to localhost:1000). Isn’t it cool??!!?!
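For readers who haven’t seen the relay pattern socketshifter builds on: a WCF service opens an outbound connection to the .NET Service Bus and listens on a public cloud address, so no inbound firewall ports are required on the LAN. A bare-bones sketch, with a placeholder contract and namespace, and credential setup omitted:

    using System;
    using System.ServiceModel;
    using Microsoft.ServiceBus;

    [ServiceContract]
    public interface IEcho
    {
        [OperationContract]
        string Echo(string text);
    }

    public class EchoService : IEcho
    {
        public string Echo(string text) { return text; }
    }

    class RelayHost
    {
        static void Main()
        {
            // The public endpoint lives in the cloud; the listener dials out,
            // so no inbound ports are opened on the local network.
            Uri address = ServiceBusEnvironment.CreateServiceUri(
                "sb", "yourNamespace", "echo");

            var host = new ServiceHost(typeof(EchoService));
            host.AddServiceEndpoint(typeof(IEcho), new NetTcpRelayBinding(), address);
            host.Open();   // credential configuration omitted for brevity

            Console.WriteLine("Listening on {0}", address);
            Console.ReadLine();
            host.Close();
        }
    }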

<Return to section navigation list> 

Live Windows Azure Apps, Tools and Test Harnesses

•• My Electronic Health Record Data Required for Proposed ARRA “Meaningful Use” Standards post of 9/3/2009 provides detailed background information about the Department of Health and Human Services’ proposed requirements and measurement criteria for the “Meaningful Use” standards that enable health providers to claim federal subsidies for implementing Electronic Health/Medical Record applications.

The post also includes more information about the use of HealthVault’s Personal Health Record (PHR) in conjunction with Web application providers, such as PassportMD.

George Huey’s SQL Azure Migration Wizard conforms SQL Server 2005 or 2008 T-SQL scripts to SQL Azure requirements. See posts in the SQL Azure Database (SADB) section.

Alin Imrie reviews Project Riviera - Windows Azure Code Samples in this 8/31/2009 post:

Project Riviera is a comprehensive code sample that demonstrates how to develop a multi-tenant, highly scalable line-of-business application on the Windows Azure Platform.

This sample was developed by the Global Partner Architecture Team in the Developer & Platform Evangelism group at Microsoft in collaboration with Cumulux (a Cloud ISV partner). Riviera uses a Customer Loyalty Management scenario for illustration purposes, but many building blocks are applicable to a range of line-of-business applications.

Click here to view a screencast of Riviera, architecture details and other related information.

Danielle Morrill’s Announcing This Week's Netbook Contest Category: Windows Azure post for Twilio of 8/31/2009 reports:

… [Y]ou have until midnight on Sunday, September 6th to submit your Twilio + Windows Azure app for a chance to win... drum roll please...

The winner will receive:

  • a Dell Mini Netbook
  • Windows 7
  • $500 of Windows Azure credit

Thanks to Steve Marx for the heads up Tweet.

<Return to section navigation list> 

Windows Azure Infrastructure

•• @PracticeFusion notes in this 9/3/2009 Tweet:

Review of worst patient privacy data breaches in 2009 ... http://tinyurl.com/lhjws6 ... reveals none associated with cloud-based EHRs.

• Mary Jo Foley expands on HealthVault’s RTM status in her Microsoft HealthVault service sheds its beta tag post of 9/2/2009 to ZDNet’s “All About Microsoft” blog:

… I asked Microsoft what changes it had made to HealthVault between the beta and final releases of the service. Microsoft officials sent via e-mail the following response from David Cerino, General Manager of Microsoft’s Health Solutions Group:

“In order to make the migration out of Beta, Microsoft products need to meet a series of internal compliance requirements across the areas of Accessibility, Interoperability, Security, Privacy, Software Integrity, Geopolitical and Intellectual Property. HealthVault made a number of updates, most notably in the area of Accessibility, where the team has placed a tremendous amount of focus over the last two releases, enabling new scenarios in low vision, vision impaired, color blindness, mobility and hearing.”

Even though HealthVault is no longer in beta, Cerino noted that Microsoft plans to continue to add more features to the software and service through regularly released new updates.

On the HealthVault blog, Microsoft published the release notes for the final version of HealthVault. The software/service is ready for deployment in production and pre-production environments, according to the post. A .Net software-development kit for the final release will be available “shortly,” according to the company.

The .NET SDK should be quite informative about HealthVault’s use of Windows Azure Platform features.

Steve Marx announces instant tokens in his No More Waiting: Use Windows Azure Right Now! of 9/1/2009:

Until this week, using the Windows Azure CTP meant signing up and then waiting a couple of days for an invitation code to arrive by email.  No more.  You can now register for access and receive an invitation code right there on the spot.  No email, no waiting, no excuses. Go register now and build something cool!

David Lemphers asks What’s Your Cloud Address? in this 9/1/2009 post:

Now, addressing is an issue for any public-facing Internet application, but for clouds, the issues increase because of the dynamic nature of deploying applications that require publicly addressable endpoints at huge scale.

Let’s start by thinking about a traditional web application, whether it be large or small. At the early design phase, you are able to identify the front end roles that will require public IP space. You can then start capacity planning to identify your peak load forecasts, how many front end nodes you will need to support max load, then make the appropriate request and allocation. As the process for allocating IP space from the public range can take many months, you want to conclude this process early in the design phase.

Secondly, there’s the scarcity of resources. If you rely on IPv4, you’re going to have to be very sparing in your use of public space, as IPv4 is not only very scarce, but it’s expensive. IPv6 on the other hand is abundant, simplifies your network strategy (as you don’t have to necessarily NAT addresses etc), and is easy to acquire. On the flip side, you need to do more planning and up front design to ensure your system, both hardware and software, can accommodate IPv6 addressing. …

Wintergreen Research offers their 519-page Worldwide Cloud Computing Market Opportunities and Segment Forecasts 2009 to 2015 through Research and Markets as of 9/1/2009 for US$3,400 (PDF) or US$6,800 (site license). From the abstract’s last paragraph:

Cloud computing markets at $36 billion in 2008 are expected to reach $160.2 billion by 2015. Market growth is fuelled by ease of information access provided by the cloud. Faster development systems are available to line of business analysts through point and click application development tools. Industry specific applications are evolving in the context of the ability to build applications from SOA components.

Joe McKendrick’s Another view: cloud not ready to take on SOA heavy lifting post of 9/1/2009 to ZDNet’s Service Oriented blog begins:

Anne Thomas Manes penned a thoughtful response to my recent post on “Cloud: the SOA we always wanted, but never had?” Anne agrees with my premise that cloud computing will boost the viability of SOA in business contexts, but takes issue with points I made about cloud finally delivering some long-sought SOA promises — including being understood by the business, being technology agnostic, necessitating provider-consumer contracts, or building trust between service providers and consumers. It’s too soon to make these assertions, she states:

“As far as I can tell, cloud computing is none of these things. It should be. But cloud is too nascent for such assertions. Besides, in order to achieve these characteristics in cloud-based systems, organizations have to 1- design them that way, and 2- develop the contracts and trust described. You won’t achieve these characteristics automagically just by deploying a system to EC2, Force.com, or some other cloud provider.”

David Linthicum posits The Cloud Reality Is Setting In by this 8/31/2009 post to InformationWeek’s Intelligent Enterprise segment:

Now that I work almost exclusively in the world of cloud computing, including SaaS, I see a much higher level of skepticism out there around cloud computing. This is best reflected by this recent CIO.com survey, which highlights the fact that reality is setting in.

"The June 2009 survey, 'CIO On-Demand Services Survey,' reveals that cloud computing fears regarding security, data management, total cost of ownership, regulatory and compliance issues, and vendor lock-in have actually increased as compared with results from a similar survey in August 2008."

The driving force behind this is that guys like yours truly have been pushing more of the downsides of cloud computing, because almost everyone else is pushing the upsides. There are always downsides with upsides, and it's important to understand those before jumping in with both feet into any sort of technical change, including cloud computing.

Dave continues with a lengthy analysis, which concludes: “A bit of skepticism is healthy, but arm yourself with the facts.”

Steve Kovsky reports on the results of a poll of 2,878 Daily Tech blog readers in his Cloud Computing’s Future Remains Murky post of 8/31/2009:

Among the 2,878 DailyTech readers who took part in the poll, 36 percent indicated that they “already have virtualization technology in place,” compared to only 12 percent that said they had no plans to virtualize. However, this 3-to-1 ratio in favor of virtualization did not extend to cloud computing, an advanced form of virtualization in which the physical location of data and computing resources typically resides outside the corporate campus, often with a third-party service provider at a remote location. …

[O]nly 8 percent of respondents indicated immediate plans to implement cloud technology. In contrast, 29 percent said cloud computing isn’t “quite ready for primetime.”

Barton George interviewed Forrester Research analyst James Staten at the Cloud World/Open Source World conference and shared the video in his Forrester’s James Staten Explains the Cloud post to his Dell blog:

At Cloud World/Open Source World earlier this month I grabbed some time with Forrester’s “Mr. Cloud” James Staten.  I wanted to get his take on Cloud Computing and what was hot and what is not:

Some of the things James talks about:

  • How the conversation about cloud has changed over the last year.
  • He spends a lot of time telling people what the cloud is not.
  • The three things they’ve learned (coming soon to Forrester report near you):
    • First thing to do in the cloud is test and development
    • Organizations can take short term web promotions and marketing efforts and drop them into the cloud (witness Wendy’s 99c promotion)
    • Put apps that are triggered by revenue into the cloud
  • Rather than “Public vs. Private” clouds, Forrester segments it into “internal vs. hosted vs. public”
  • Cloud is not an all or nothing proposition, it’s another tool in the toolkit.

Gregory Ness asks Will Virtualization Undermine Network Equipment Vendors? in this 8/30/2009 post to the Seeking Alpha blog:

As you evaluate the promise of networking and virtualization stocks the question of how virtualization takes shape could make all the difference. The fortunes of companies like VMware (VMW), Cisco (CSCO), Microsoft (MSFT), HP (HPQ) and others can fluctuate as their offerings appeal to IT buyers.

And the first wave of virtualization projects in many enterprises has had very little to do with the network. That is not necessarily an encouraging sign. Recently I asked the question of whether or not most of today's networks could even keep up with the new automated systems.

That is why a recent blog discussion between Arista's Gourlay, Cisco’s Sultan and myself, followed by a blog from F5’s MacVittie is interesting. It is filled with implications for the future of the network (vendors and careers) and the evolution of IT.

Maria Spinola continues her cloud computing essay series with The Five Characteristics of Cloud Computing, “Including a look at the possible delivery and deployment models,” of 8/31/2009:

In the previous article Cloud Equals SaaS, Grid, Utility Computing, Hosting...? I made the following statement:

"SaaS is one of the three possible Cloud Computing delivery modes; however, to be considered Cloud Computing, any of those delivery modes must have certain specific characteristics"

So, in this article we will look at those specific characteristics that define exactly what Cloud Computing is, so that next time you will be able to evaluate whether a specific offer is truly Cloud Computing or simply a pre-existing offering that has the Cloud label slapped on it.

Jay Fry’s Bill Coleman: Cloud is still v1.0, but he has an idea or two for entrepreneurs Q&A session with Bill Coleman elicited this statement regarding the status of public cloud computing:

Unfortunately, I think that we are still in the "1.0" version of simple, public cloud implementations. These will evolve to support more general applications with more sophisticated data models and interoperability with enterprises over the next five years. Where we are just beginning to see something real is in private clouds for the enterprise. Today, it is mostly a lot of market hype to sell low-end virtualization capabilities and "bulldoze and replace" sales. However, that will change. I do think that companies such as CA have the opportunity to catalyze that part of the market, which will ultimately commoditize the economics of enterprise IT.

<Return to section navigation list> 

Cloud Security and Governance

Tim Green’s Catbird reports whether cloud security meets compliance standards post of 9/1/2009 explains:

Catbird is adding a feature to its security platform that gives cloud users a reading on how well their data use complies with specific regulatory requirements.

Called vCompliance, the new feature maps the security controls that Catbird’s V-Security software imposes on virtual networks to the controls required by the Payment Card Industry (PCI) standard, the Department of Defense Information Assurance Certification and Accreditation Process (DIACAP), Sarbanes-Oxley and the Health Insurance Portability and Accountability Act (HIPAA). …

Alex Meisel defines what a distributed Web application firewall should look like in his Safety in the Cloud(s): ‘Vaporizing’ the Web Application Firewall to Secure Cloud Computing article of 9/1/2009 for Computer Security Review:

Cloud computing was not designed for security, although organizations such as the Cloud Security Alliance (CSA) and the Open Web Application Security Project (OWASP) are making great strides in helping the industry solve the myriad security problems confronting cloud computing. The benchmark guidelines established by the CSA in the document, Guidance for Critical Areas of Focus in Cloud Computing, are a great first step. This article is intended to pick up where the CSA guide left off in terms of defining what a distributed Web application firewall (dWAF) should look like in order to meet the standards set within the CSA document.

In order to accurately outline how a dWAF is possible while maintaining all the benefits of a completely virtualized environment -- reduced IT overhead, flexible footprint management, virtually unlimited scalability -- a brief overview of cloud technology is needed. Far more than simply maximizing current hardware resources to benefit from unused CPU power, today there are three main technologies available in a cloud that provide the backbone for real productivity gains and compelling business services for companies that don’t want to invest in the hardware scaling burdens common today. …

Robert Rowley, MD’s HHS Certification vs. CCHIT post of 9/1/2009 analyzes the change in certification authorities for “meaningful use” of Electronic Medical/Health Records (EMR):

Electronic Health Record (EHR) certification has undergone a fundamental change, as a result of the Health IT Policy Committee developing a national health IT framework based on “Meaningful Use.” Unlike approaches to certification in the past, created by the vendor industry to codify features that “seemed like a good idea at the time,” the HHS approach is more methodical. It started from a high level deliberative process, which identified a set of national priorities to help focus performance improvement efforts in healthcare delivery. From there, “meaningful use” definitions were elaborated, and are the basis of a new type of certification, called HHS Certification.

Lori MacVittie explains Securing the Other Side of the Cloud in this 9/1/2009 post by asking and answering “Why would miscreants bother with other routes when they can go straight to the source?” Her take:

People concerned with security of the cloud are generally worried about illegitimate access of the applications and data they may deploy in the cloud. That’s a valid concern given the needs of certain vertical industries to comply with privacy-focused regulations like HIPAA and PCI DSS. It’s an extremely valid concern given research and studies showing just how vulnerable most web sites and applications are. Hint: it’s more than you probably think it is, and it’s likely your application is vulnerable to exploitation.

Moving an application from your environment to the cloud doesn’t make it any less or more vulnerable to exploitation. The same code and platforms that made it vulnerable inside your data center make it vulnerable outside your data center.

Joseph Goedert reports on 8/31/2009 an Effort to Certify Health I.T. Security by the Health Information Trust Alliance (HITRUST):

The Health Information Trust Alliance, an industry consortium working to improve the understanding of information security issues in health care, is spearheading efforts to develop health I.T. security certification programs.

Called HITRUST, the alliance last spring unveiled its Common Security Framework for electronic health information. The framework is an attempt to standardize health I.T. security best practices, standards and regulations in a single certifiable tool. The framework includes a best-practices security implementation manual, a cross-referenced standards and regulations matrix, and a readiness assessment toolkit.

But information security professionals and other purchasers often are confused as to which functions a particular product supports. The new certification program is an effort to classify security products by functionality to help providers and payers better understand the products that will help organizations fix the security gaps they have, says Daniel Nutkis, CEO of HITRUST. Criteria will focus on helping organizations determine a product's capabilities, functionality, effectiveness and support of security practices. …

HITRUST is seeking additional vendors and industry stakeholders to participate in the initiative. More information is available at hitrustalliance.net/csfready.

David Linthicum warns Service governance for the cloud: Ignore at your peril in this 8/31/2009 post to InfoWorld’s Cloud Computing blog: “Because the cloud service belongs to someone else, many architects don't think about its implications to delivery in the enterprise”:

Cloud computing needs service or SOA governance -- there is really no question about that. However, those architects charged with extending portions of their architecture out to the clouds are missing the service governance boat. The result is a complex federated system, with components existing inside and outside of the firewall, that is almost impossible to manage.

The trouble seems to be that many architects and other IT managers don't consider cloud computing platforms as "their architecture," so they don't put mechanism around those services to manage change and enforce operational policies. However, as on-premise systems become dependent on cloud-delivered services, they are exposed. Thus, any changes to those services, operational exceptions, or operational violations stop the system in its tracks.

Robert Lemos lists Five lessons from Microsoft on cloud security in this post of 8/25/2009 to InfoWorld’s Cloud Computing blog that I missed when originally posted:

While Google, Amazon, and Salesforce.com have gotten the most attention as cloud service providers, Microsoft -- with its 300 products and services delivered from its datacenters -- has a large cloud bank all its own.

In May, the company released a paper on its approach to cloud services and how the company plans to secure those services. The paper -- penned by Microsoft's Global Foundation Services, the group responsible for overseeing the company's software-as-a-service infrastructure -- spells out the current dangers for online services, including a growing interdependence between customers and the companies that serve them and more sophisticated attacks on Internet services. …

Kevin L. Jackson’s Pentagon Reviews Unisys Stealth post of 8/31/2009 describes “cryptographic bit-splitting” technology for increasing the security of cloud computing networks. Kevin claims the technique includes the following advantages (a toy illustration of the splitting idea follows the list):

  • Enhanced security from moving shares of the data to different locations on one or more data depositories or storage devices (different logical, physical or geographical locations)
  • Shares of data can be split physically and under the control of different personnel, reducing the possibility of compromising the data
  • A rigorous combination of the steps is used to secure data, providing a comprehensive process for maintaining the security of sensitive data
  • Data is encrypted with a secure key and split into one or more shares
  • Lack of a single physical location towards which to focus an attack
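To make the splitting idea concrete, here’s a toy two-share illustration using a one-time pad; this is my own sketch of the general technique, not Unisys Stealth’s actual algorithm:

    using System;
    using System.Security.Cryptography;
    using System.Text;

    class BitSplitDemo
    {
        static void Main()
        {
            byte[] data = Encoding.UTF8.GetBytes("sensitive record");

            // Share A is pure randomness; share B is the data XORed with it.
            // Either share alone reveals nothing about the plaintext.
            byte[] shareA = new byte[data.Length];
            RandomNumberGenerator.Create().GetBytes(shareA);

            byte[] shareB = new byte[data.Length];
            for (int i = 0; i < data.Length; i++)
                shareB[i] = (byte)(data[i] ^ shareA[i]);

            // Store the shares in different locations; XOR them to recover.
            byte[] restored = new byte[data.Length];
            for (int i = 0; i < data.Length; i++)
                restored[i] = (byte)(shareA[i] ^ shareB[i]);

            Console.WriteLine(Encoding.UTF8.GetString(restored));
        }
    }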

Kevin’s company, Dataline LLC, “is also leveraging this technology.”

Craig Balding’s Cloud Cartography & Side Channel Attacks post of 8/30/2009 reviews the “Hey, You, Get Off of My Cloud: Exploring Information Leakage in Third-Party Compute Clouds” research paper by UC San Diego’s Thomas Ristenpart, Hovav Shacham and Stefan Savage and MIT’s Eran Tromer, whose abstract reads:

Third-party cloud computing represents the promise of outsourcing as applied to computation. Services, such as Microsoft’s Azure and Amazon’s EC2, allow users to instantiate virtual machines (VMs) on demand and thus purchase precisely the capacity they require when they require it.  In turn, the use of virtualization allows third-party cloud providers to maximize the utilization of their sunk capital costs by multiplexing many customer VMs across a shared physical infrastructure. However, in this paper, we show that this approach can also introduce new vulnerabilities.

Using the Amazon EC2 service as a case study, we show that it is possible to map the internal cloud infrastructure, identify where a particular target VM is likely to reside, and then instantiate new VMs until one is placed co-resident with the target. We explore how such placement can then be used to mount cross-VM side-channel attacks to extract information from a target VM on the same machine.

Mary Monahan claims “The targets are getting bigger, the fraudsters bolder, and we all have a whole lot more at stake to lose” in her Data Breach Trends - Mary Monahan, Javelin Strategy & Research podcast of 8/24/2009:

This is the message from Mary Monahan, Managing Partner and Research Director at Javelin Strategy & Research. In a discussion of current data breach trends, Monahan touches upon:

  • How breaches in 2009 are trending differently from 2008;
  • What public and private sector organizations need to do to prevent breaches;
  • What to watch for as we approach 2010.

Monahan has 10 years of financial services industry experience. Her banking background includes extensive managerial experience working with growth businesses, strategizing and implementing cross-sectional financial plans to accommodate multiple projective scenarios. As a college educator, Ms. Monahan's work focused on current issues in accounting and economics.

<Return to section navigation list> 

Cloud Computing Events

The Microsoft Virtualization Team’s participation at VMworld 2009, 8/31 to 9/3/2009, is the current subject of the Microsoft Virtualization Events page.

When: 8/31 to 9/3/2009
Location: Booth #2422, Moscone Center, San Francisco, CA, USA

Reuven Cohen’s PR: CloudFutures, Cloud Computing for Software Vendors post of 8/31/2009 describes the Cloud Computing for Software Vendors conference coming to San Jose’s Crowne Plaza hotel on 10/5 and 10/6/2009:

Every traditional software vendor I've spoken with sees the cloud coming and understands that it's a game-changer -- but until now they've all been on their own to come up with a "cloud strategy". And a lot of vendors have resorted to a knee-jerk "everything has to be SaaS" strategy. But that's not the only way to play -- just for one example, how about taking the existing product and making it more IaaS friendly? That's much easier in many ways (no new multi-tenancy issues, for example), but it tends to get overlooked. …

The organizers have also offered a discount code for Elastic Vapor readers -- if you register by web or phone, you can use the discount code ENOMALY for $200 off.

When: 10/5 and 10/6/2009
Location: Crowne Plaza Hotel, 282 Almaden Boulevard, San Jose, CA USA 

<Return to section navigation list> 

Other Cloud Computing Platforms and Services

The Xen Project Launches New Open Cloud Initiative press release of 8/31/2009, the first day of VMworld 2009, reports:

[The Xen Project T]oday formally announced the Xen® Cloud Platform (XCP) initiative – a powerful new community-led effort to build on the growing leadership of the Xen hypervisor in today’s cloud, and deliver a secure and proven open source infrastructure platform for the federated cloud services of tomorrow. The Xen Cloud Platform will accelerate the use of cloud infrastructure for enterprise customers by providing open source virtual infrastructure technology that makes it easy for service providers to deliver secure, customizable, multi-tenant cloud services that work seamlessly with the virtualized application workloads customers are already running in their internal datacenters and private clouds, without locking them into any particular vendor.

Dana Gardner delivers IBM’s view in Open Group points standards at service-orientation architecture needs and approaches, an 8/31/2009 BriefingsDirect post: “This guest BriefingsDirect post comes courtesy of Heather Kreger, IBM’s lead architect for SOA Standards in the IBM Software Group.”

Last week The Open Group announced two new standards for architects; actually, more appropriately, for service architects, SOA architects, and cloud architects. These standards are intended to help organizations deploy service-based solutions rapidly and reliably, especially in multi-vendor environments.

These standards are the first product in a family of standards being developed for architects by The Open Group’s SOA Work Group. Other standards currently in development for SOA include the SOA Ontology, SOA Reference Architecture, and Service Oriented Infrastructure.

Maureen O’Gara reports IBM Unveils Industry’s First Public Desktop Cloud: “IBM has partnered with Citrix, Desktone and Wyse as well as VMware in the venture” on 8/31/2009:

At VMworld on Monday IBM announced what it called the industry’s first public desktop cloud. It will be available in the Americas and Europe starting in October.

IBM describes it as its biggest cloud computing endorsement to date enabling anywhere/anytime access for everything from distance learning to business computing.

The new IBM Smart Business Desktop on the IBM Cloud is a subscription service for virtualizing desktop computing resources, providing a logical, rather than physical, method of accessing data, computing power, storage capacity and other resources.

Andre Yee continues the public-versus-private-cloud fracas with Are Private Clouds Really Cloud Computing? of 8/30/2009:

I think the challenge I have is two-fold -

1. What is the definition of a "private cloud"?
2. Is applying virtualization all that's required to legitimately deliver private cloud services?

To examine my conundrum, let’s revisit my simplified definition of cloud computing, which I blogged about previously -

"Cloud Computing is an on-demand delivery model for IT services or applications with the characteristics of multi-tenant hosting, elasticity (variable capacity) and utility based billing."

<Return to section navigation list> 
