Thursday, March 11, 2010

Windows Azure and Cloud Computing Posts for 3/11/2010+

Windows Azure, SQL Azure Database and related cloud computing topics now appear in this daily series.

 
Note: This post is updated daily or more frequently, depending on the availability of new articles in the following sections:

To use these links, first click the post’s title to display the single article, then navigate to the section you want.

Cloud Computing with the Windows Azure Platform was published 9/21/2009. Order today from Amazon or Barnes & Noble (in stock).

Read the detailed TOC here (PDF) and download the sample code here.

Discuss the book on its WROX P2P Forum.

See a short-form TOC, get links to live Azure sample projects, and read a detailed TOC of electronic-only chapters 12 and 13 here.

Wrox’s Web site manager posted on 9/29/2009 a lengthy excerpt from Chapter 4, “Scaling Azure Table and Blob Storage” here.

You can now download and save the following two online-only chapters in Microsoft Office Word 2003 *.doc format by FTP:

  • Chapter 12: “Managing SQL Azure Accounts and Databases”
  • Chapter 13: “Exploiting SQL Azure Database's Relational Features”

HTTP downloads of the two chapters are available from the book's Code Download page; these chapters will be updated for the January 4, 2010 commercial release in February 2010. 

Azure Blob, Table and Queue Services

No significant articles today.

<Return to section navigation list> 

SQL Azure Database (SADB, formerly SDS and SSDS)

Liam Cavanagh’s My Great Sync Idea post of 3/11/2010 announces:

We have created a new website to allow you to submit and vote on ideas for new sync features as they relate to sync.  You can find the site here: http://www.mygreatwindowsazureidea.com/forums/44459-sql-azure-data-sync-feature-voting

We would love to hear about the things that you would like to see us focus on in the Sync Framework group, and although this is primarily meant for features relating to SQL Azure and Windows Azure, please also feel free to add general Sync Framework-specific ideas.  You are also able to vote for others' ideas. We have put a few things on the list just to get you started, but feel free to add your own ideas. This will help us to better understand what you need from the Sync Framework team and to build plans around how we make the things that "bubble to the top" a reality for our customers in the future.

I was surprised not to find a TechEd North America 2010 session about SQL Azure Data Sync. See my 15 Azure-Related Sessions Scheduled for TechEd 2010 post of 3/11/2010.

Stephen Forte’s Telerik Releases a new Visual Entity Designer post of 3/11/2010 describes Telerik’s replacement for the LINQ to SQL and Entity Framework Object/Relational Mapping (O/RM) tools for SQL Azure, SQL Server, Oracle, MySQL, etc.:

Love LINQ to SQL but are concerned that it is a second-class citizen? Need to connect to databases other than SQL Server? Think that the Entity Framework is too complex? Want a domain model designer for data access that is easy, yet powerful? Then the Telerik Visual Entity Designer is for you.

Built on top of Telerik OpenAccess ORM, a very mature and robust product, Telerik’s Visual Entity Designer is a new way to build your domain model that is very powerful and also really easy to use. How easy? I’ll show you here.

First Look: Using the Telerik Visual Entity Designer

To get started, you need to install the Telerik OpenAccess ORM Q1 release for Visual Studio 2008 or 2010. You don’t need to use any of the Telerik OpenAccess wizards, designers, or using statements. Just right click on your project and select Add|New Item from the context menu. Choose “Telerik OpenAccess Domain Model” from the Visual Studio project templates. …


Steve continues with a detailed tutorial for using the Visual Entity Designer with SQL Server 2008 and promises: “In future posts I will show how to use the Visual Designer with some other scenarios. Stay tuned.”
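
If you haven’t seen OpenAccess before, the payoff of the designer approach is that the generated domain model is queryable with plain LINQ. Here’s a minimal sketch of what a query against a designer-generated model might look like; the MyDomainModel context and its Customers endpoint are hypothetical stand-ins for whatever the designer generates from your schema, not names from Steve’s tutorial:

    using System;
    using System.Linq;

    class Program
    {
        static void Main()
        {
            // MyDomainModel is the OpenAccessContext-derived class the designer
            // generates from your database; Customers is one of its IQueryable
            // endpoints. Both names are hypothetical.
            using (var context = new MyDomainModel())
            {
                var usCustomers = from c in context.Customers
                                  where c.Country == "USA"
                                  orderby c.CompanyName
                                  select c;

                foreach (var customer in usCustomers.Take(10))
                    Console.WriteLine(customer.CompanyName);
            }
        }
    }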

Mark Rendle takes a Socratic/dialectic (Q&A) approach to an SQL Azure-related analysis in his RDBMS vs NoSQL post of 3/11/2010, which begins:

There's an awful lot of buzz around new-fangled "NoSQL" or "non-relational" databases at the moment. I'm hearing a lot of people asking why they'd choose one of these new-fangled storage models over the relational databases they know and (mostly) like. This post aims to shed some light on the choices available, and to maybe give some guidance if you're trying to make a decision. …

And concludes after a lengthy and fictitious Q&A session:

  • If constant consistency is a key requirement then you should probably use an RDBMS.
  • If horizontal scalability is important and eventual consistency is good enough, then a NoSQL solution might be better.
  • If you're deploying an application to Windows Azure or Amazon Web Services, then you should probably compare the costs of Windows Azure Storage Services or SimpleDB to SQL Azure or Elastic RDBMS for your particular application.
  • If you're writing an application for a mobile platform, you're back to limited storage space so you might get back to RDBMS and normalising for efficient use of that space.
  • If you're writing a small application you want to knock out quickly using Entity Framework or NHibernate or ActiveRecord then use SQL Server Compact or SQLite, or skip that stuff altogether and serialise object graphs to files on the hard-disk.

In fact, if you really design your architecture with an open mind (or beginner's mind), you'll come up with a combination of technologies and paradigms that properly meet different parts of the requirements; maybe use SQL Server 2008 R2 for stock control, invoicing and geo-spatial search operations, but Azure Table Storage for auditing and text files for logging. And why not evaluate distributed Document Stores for the awesome customer review and discussion features that you're adding in phase two?
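
Rendle’s last option above – skipping the database entirely and serializing object graphs to disk – is less exotic than it sounds for small applications. Here’s a minimal C# sketch using DataContractJsonSerializer from .NET 3.5; the Note type and file path are purely illustrative:

    using System.Collections.Generic;
    using System.IO;
    using System.Runtime.Serialization;
    using System.Runtime.Serialization.Json;

    // An illustrative object graph to persist without any database.
    [DataContract]
    public class Note
    {
        [DataMember] public string Title { get; set; }
        [DataMember] public List<string> Tags { get; set; }
    }

    public static class NoteStore
    {
        static readonly DataContractJsonSerializer Serializer =
            new DataContractJsonSerializer(typeof(List<Note>));

        // Serialize the whole graph to a file on the hard disk.
        public static void Save(List<Note> notes, string path)
        {
            using (var stream = File.Create(path))
                Serializer.WriteObject(stream, notes);
        }

        // Read it back; the file is the entire "database."
        public static List<Note> Load(string path)
        {
            using (var stream = File.OpenRead(path))
                return (List<Note>)Serializer.ReadObject(stream);
        }
    }

This obviously gives up concurrent access and querying, which is exactly Rendle’s point: it’s only appropriate when the application is small enough not to need them.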

<Return to section navigation list> 

AppFabric: Access Control and Service Bus

No significant articles today.

<Return to section navigation list>

Live Windows Azure Apps, APIs, Tools and Test Harnesses

The Register (@ElReg) reviews my Cloud Computing with the Windows Azure Platform book in its Geeks Guide 2 Cloud Computing post of 3/11/2010:


Cloud Computing with the Windows Azure Platform is written for aspiring Windows Azure programmers by Roger Jennings, author of over 30 books on Microsoft technologies. It offers an excellent overview of cloud computing and provides a structured tutorial format that guides you through Jennings’ approach for programming Windows Azure Storage Services and web, worker, and .NET Services applications.

The book is split into four parts of two to five chapters each. It kicks off with Introducing the Windows Azure Platform, four chapters devoted to generic cloud computing topics, the Windows Azure infrastructure, and Azure Storage Services. This leads nicely into part two, Taking Advantage of Cloud Services in the Enterprise, where Jennings begins to examine the various issues that .NET developers and IT managers face when adopting Azure Cloud Services. This section consists of four chapters and includes topics such as security, privacy, regulatory compliance, and backup and recovery.

We start to get our hands dirty in part three as we begin tackling advanced Azure Service techniques. Over three chapters Jennings introduces programming .NET Services members – Access Control Services (ACS), .NET Service Bus (SB) and Workflow Services (WF) – with the Microsoft .NET Services SDK. All the source code used in this section and throughout the book is available for free download at the Wrox website.

Cloud Computing with the Windows Azure Platform actually concludes after part three, as part four, Working with SQL Azure Services, which contains the final two chapters, is available online only, free at www.wrox.com. Simply put, the SQL Azure Database Community Technology Preview came too late for Jennings to include the final chapters in print. While this is unfortunate, the chapters are quite simple to access, and as long as you can tolerate printed-out sheets, it shouldn’t cause too much of a problem.

You can grab Cloud Computing with the Windows Azure Platform right now for £16.19, saving 40 per cent on the RRP, at Register Books.

Thanks to ElReg for the review. A second update to the SQL Azure chapters 12 and 13 for the 2010 release version is in the editing stage and will be available shortly.

Eugenio Pace delivers a preview of Azure Guidance – Scenario background – Part I in this 3/10/2010 post:

Here’s the “fiction non-fiction” (I love that term :-)) that will become the backdrop for our initial scenario.

Adatum: Business overview

Adatum is a 5,000-employee manufacturing company with a large portfolio of applications. Their criticality ranges from “peripheral” to “mission critical”, with lots of “in between”. A significant portion of the IT budget in Adatum is allocated to “middle” and “peripheral” applications. From an IT perspective, Adatum is mainly a Microsoft technologies shop. Most applications were and are built with Microsoft technologies and tools. They do have, however, some legacy systems built on other platforms (e.g. AS400, UNIX boxes, etc.).

Adatum’s suppliers and in-house developers are skilled, trained and proficient with the MSFT products, frameworks and tools (SQL Server, Windows Server, .NET, Visual Studio, etc.). IT operations and administrators are very familiar with Windows-based systems and the tools used to manage them (e.g. System Center, AD, etc.).


The platform “spectrum”. Giving up control for more economies of scale.

Adatum wants to optimize its IT portfolio. By “optimize” they mean “directing their investments to the capabilities that differentiate them from their competition”. Adatum is not better than its competitors because it can run e-mail better. However, they do differentiate from the competition on other things: better supply chain, better quality controls, etc. Adatum would like to have efficient and effective e-mail but then spend most of their budget on the systems that support these important processes in the company. One way they are considering doing this optimization is by selectively deploying applications to the cloud.

Some challenges in Adatum today

An optimization exercise: moving capabilities to an optimal environment

Deploying new applications in Adatum today takes too long. Even for simple applications the acquisition, provisioning and deployment processes can take several weeks as the requirements are analyzed, procurement is involved, RFPs are sent to suppliers, networks are configured, etc.

Also, many of their existing applications don’t make the best use of the infrastructure. Many run on underutilized servers, yet they find it difficult to deploy new services on the same hardware with the needed SLAs while making the best use of their infrastructure. In some cases they’ve used virtualization, but not for everything. Also, less critical applications typically get less attention from IT staff, leading to increased availability problems, lower user satisfaction and higher costs.

Adatum sees the cloud as one way of streamlining these processes. They believe they can leverage the economies of scale of Windows Azure, pushing further standardization and automation throughout.

Eugenio continues with details about Adatum’s Goals and some concerns.

The Windows Azure Team’s Real World Windows Azure: Interview with Jacqueline McGlade, Executive Director for the European Environment Agency post of 3/11/2010 begins:

As part of the Real World Windows Azure series, we talked to Jacqueline McGlade, Executive Director for the European Environment Agency (EEA), about using the Windows Azure platform to deliver the agency's environmental awareness communication platform and the benefits that Windows Azure provides. Here's what she had to say:

MSDN: Tell us about the EEA and the services you provide.

McGlade: We provide independent and reliable information on the environment for policy makers and the general public to help improve the environment and to raise environmental awareness across Europe.

MSDN: What was the biggest challenge the EEA faced prior to implementing Windows Azure?

McGlade: We collect environmental data from 600 partner organizations, as well as individuals, and we wanted to make that data easily accessible and understandable. Citizens have become increasingly responsive to changes in their local environment, and this increased demand for information raised a number of technical issues. For example, the EEA Web site servers, which we host on-premises, were nearly at capacity.

MSDN: Can you describe the solution you built with Windows Azure to help meet capacity demands for the environmental information that you're disseminating to European citizens?

McGlade: Eye On Earth is a two-way communication platform that gathers information from a variety of sources and allows citizens to take a more active role in information exchange through a dynamic and user-friendly Web site. Data is fed into Microsoft SQL Azure every hour, which supports rapid retrieval of data, making it possible for Eye On Earth to process and deliver information in near-real time. Citizens can check air quality and bathing water quality for destinations across Europe and, because Windows Azure is hosted in Microsoft data centers, we don't have to worry about capacity issues. Even during peak periods, such as when citizens are planning their summer holidays, we can rapidly scale up and handle the traffic. …

Figure 1: Eye On Earth aggregates complex environmental data from across Europe.

Philipos Sakellaropoulos posted a description and the source code for GPS Runner Maps: My First Windows Azure Application to The Code Project on 12/20/2009, but it didn’t show up in a Google alert until 3/11/2010:

It is a "cloud" Web application to display GPS tracks on Google or Bing maps, based on the latest Windows Azure Software Development Kit (November 2009). This code may be useful to you if you want to start a new Windows Azure application or want to find code to display GPS data with server code on a Google/Bing map. It is not a complete application such as professionally designed sites like Every Trail. The application uses ASP.NET providers that work with Azure Table Storage (the AspProviders sample from Windows Azure Code Samples), so it is not so much different from a traditional Web application. It uses Table Storage to store data. SQL Azure was not available when I started developing. I would highly recommend SQL Azure because Table Storage has several limitations (although it is supposed to be faster).
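
For readers who haven’t used the StorageClient library from the November 2009 SDK, here’s a minimal sketch of the kind of Table Storage entity and context a GPS-track application might define; the TrackPoint class, the TrackPoints table name, and the partitioning scheme are my assumptions for illustration, not code from Philipos’ project:

    using System;
    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.StorageClient;

    // One GPS fix. PartitionKey groups points by track; RowKey orders them
    // chronologically within the track (zero-padded ticks sort correctly).
    public class TrackPoint : TableServiceEntity
    {
        public TrackPoint() { }
        public TrackPoint(string trackId, DateTime fixTimeUtc)
            : base(trackId, fixTimeUtc.Ticks.ToString("d19")) { }

        public double Latitude { get; set; }
        public double Longitude { get; set; }
        public double Elevation { get; set; }
    }

    public class TrackPointContext : TableServiceContext
    {
        public TrackPointContext(string baseAddress, StorageCredentials credentials)
            : base(baseAddress, credentials) { }

        public void AddPoint(TrackPoint point)
        {
            // Assumes the TrackPoints table was created at startup, e.g. with
            // CloudTableClient.CreateTableIfNotExist("TrackPoints").
            AddObject("TrackPoints", point);
            SaveChanges();
        }
    }

The one-entity-per-point design illustrates the limitation Philipos mentions: Table Storage gives you fast key-based retrieval per track, but no server-side joins or aggregates.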

Ron Callari asks Will Steve Ballmer's Microsoft & Obama's Administration Share A Cloud? in this 3/11/2010 post to the InventorSpot blog:

Much has been written about Steve Ballmer's almost fanatic focus on cloud computing this month. With catchphrases like "we're all in" and "the cloud fuels Microsoft and Microsoft fuels the cloud" he makes it sound like his company is the lone pioneer in the field. Or when he does allude to Google, Amazon and Apple it is done in a dismissive manner. Could this ramp-up of 'all-things-cloud' be part of a larger alliance Ballmer is solidifying with the Obama Administration?

Ballmer's email to staff

Coming off a speech he made at the University of Washington, and coupled with the roll-out of a "Microsoft-Cloud Services" Web site and a memorandum sent to his staff (where he mentioned the word "cloud" no less than two dozen times), Ballmer's obsessive cloud-focus was reported and analyzed in hundreds of news articles and blogs. Oddly and less reported was a curious visit made to Microsoft's office by Obama's number one IT guru, Vivek Kundra. The meeting date overlapped their presentations made at UW. Curious timing wouldn't you say?


Upon further investigation (based on my own speculation, of course) I knitted together the two stories to see if Obama, Kundra and Ballmer might possibly be floating a trial balloon to consider a collaborative cloud initiative.

Same Speaking Venue/ Same Topic

In back-to-back appearances, speaking at the same university on the same date, Kundra highlighted the Obama Administration's IT plan to modernize the government's technology infrastructure by shifting to (yes, you guessed it) cloud computing. In his speech, Kundra explained some of the basics of what the US government would require from cloud computing services. Kundra specifically referenced the efforts developed by Microsoft to make its cloud-computing services comply with the guidelines of governmental agencies.

I don’t think so. Ron continues with these related topics:

  • Microsoft's Building Its Cloud To Protect Governments …
  • Microsoft's Cloud Computing Extends To China …
  • Why The US Government Needs Cloud Computing …

Eric Picard (@ericpicard) makes the following points in his Why marketers can't ignore the cloud computing revolution post of 3/11/2010 to iMedia Connection:

    • Cloud services have significant implications for online advertising -- and for the evolution of marketing in general
    • Cloud storage systems could enable you to handle petabytes of data and enable you to build your analytics capabilities in the cloud
    • Once you have your online marketing data in the cloud, you can start doing analysis in much more valuable ways

Eric’s the advertising technology advisor to the Advertising Platform Engineering team at Microsoft.

<Return to section navigation list> 

Windows Azure Infrastructure

David Tan’s Microsoft's Cloud Bet post of 3/11/2010 to the CTOEdge blog is an even-handed analysis of the prospects for Microsoft’s Windows Azure and SQL Azure programs:

According to a speech delivered at the University of Washington last week, Microsoft is "All In for the cloud."  The real question is whether it is time to fold, call, or maybe raise the stakes even higher.

Microsoft has never been the front-runner in any on-line related strategy. Its late arrival to the Internet is well documented and goes down as one of the few major mistakes Bill Gates made in guiding the company. This is different, though. With this shot across the bow, Ballmer has raised the stakes for everyone from players like Salesforce.com to Google and let the world know that Microsoft strongly believes the future of computing is in the cloud.

I guess the question is whether or not they can pull it off. A couple of months back I wrote about Microsoft’s Azure platform.  With this week’s announcement, the Microsoft vision is becoming even clearer. With the release of Azure, Microsoft is attempting to herd their partners and developers down the same path they are taking. By steering their own company, and essentially force-feeding developers and partners the way they need to work, Microsoft is building a cloud army to take the battle to their competitors.

Whatever you think of Microsoft, they have a very good network of partners on both the infrastructure and application development side. Microsoft’s greatest strength has always been enabling its partners to get customers onto the Microsoft platform. The suite of tools at a partner’s disposal to build a cloud-based computing solution is truly unparalleled in the industry. Then, once your application is on the Azure cloud, it’s only logical that it will interact with Office on-line much more easily than it will with Google Apps. Once that happens, Microsoft has you again. This time in the cloud, and this time probably for good.

There is no denying that Microsoft is facing stronger competition than it has in a long time, and frankly Google and others are forcing it to alter its business strategy. The ironic part is that the cloud approach probably makes them more nimble as a company than they have ever been in the past. In the long run, that might prove to be a very bad thing for competitors. [Emphasis removed.]

I also recommend David’s Azure: Platform of the Future post of 11/20/2009. David is Chief Technology Officer and Co-Founder of CHIPS Computer Consulting LLC (CHIPS).

Lori MacVittie’s I CAN HAS DEFINISHUN of SoftADC and vADC? post of 3/11/2010 analyzes today’s approaches to Application Delivery Controllers (ADCs):

In the networking side of the world, vendors often seek to differentiate their solutions not just based on features and functionality, but on form-factor, as well. Using a descriptor to impart an understanding of the deployment form-factor of a particular solution has always been quite common: appliance, hardware, platform, etc… Sometimes these terms come from analysts, other times they come from vendors themselves. Regardless of where they originate, they quickly propagate and unfortunately often do so without the benefit of a clear definition.

A reader recently asked a question that reminded me that we’ve done just that as cloud computing and virtualization creep into our vernacular. Quite simply the question was, “What’s the definition of a Soft ADC and vADC?” That’s actually an interesting question as it’s more broadly applicable than just to ADCs. For example, for the last several years we’ve been hearing about “Soft WOC (WAN Optimization Controller)” in addition to just plain old WOC, and the definition of Soft WOC is very similar to Soft ADC. The definitions are, if not well understood and often used, consistent across the entire application delivery realm – from WAN to LAN to cloud.

So this post is to address the question in relation to ADC more broadly, as there’s an emerging “xADC” model that should probably be mentioned as well. Let’s start with the basic definition of an Application Delivery Controller (ADC) and go from there, shall we?

Lori continues with definitions of:

  • ADC
  • SOFT ADC
  • VIRTUAL ADC (vADC)
  • ADC as a SERVICE

and concludes:

It may seem to follow logically that any version of an ADC (or network solution) is “as good” as the next, given that the core functionality is almost always the same regardless of form factor. There are, however, pros and cons to each form-factor that should be taken into consideration when designing an architecture that may take advantage of an ADC. In some cases a Soft ADC or vADC will provide the best value, in others a traditional hardware ADC, and in many cases the highly-scalable and flexible architecture will take advantage of both in the appropriate places within the architecture.

*Some solutions offered “as a service” are more akin to SaaS in that they are truly web services, regardless of underlying implementation, that are “portable” because they can be accessed from anywhere, though they cannot be “moved” or integrated internally as private solutions.

Rich Miller’s Microsoft To Cut Data Center Costs in Half post to the Data Center Knowledge blog of 3/11/2010 begins:

A look at one of the double-decker data center containers in the Microsoft Chicago data center.

Microsoft’s data center containers are packed with thousands of servers. But perhaps the most important thing they are containing is cost. After years of investing up to $500 million in each data center project, Microsoft plans to spend about $250 million or less on each data center going forward.

“I have a goal for the team to reduce costs by 50 percent or more in the next-generation builds,” said Kevin Timmons, who heads Microsoft’s data center operations team. “We’re on track to meet that goal. I can’t be any happier about that.”

After a period of industry growth in which greater scale usually was accompanied by a higher price tag, Timmons said his goal is for Microsoft’s data center network to be “incredibly scalable at awesome cost effectiveness.” …

Rich continues with additional analysis of cost savings with containerization.

tbtechnt points to a new Windows Azure Discussion Poll that emphasizes Windows Azure and SQL Azure economics and billing in this 3/10/2010 post to the Microsoft TechNet blog:

If you have been using Windows Azure in 2010 and have a live application or had a live application running on Windows Azure in 2010, please let us know about your "Azure experience".

Just respond to a few simple questions with your quick poll responses - it will take you 5 minutes.

http://surveys.polldaddy.com/s/DC946D92A16F9A9E/

I registered my usual complaints about the lack of Google-competitive free compute hours, storage and data ingress/egress thresholds. See my OakLeaf Systems: A Comparison of Azure and Google App Engine Pricing (7/19/2009) and OakLeaf Systems: Lobbying Microsoft for Azure Compute, Storage and ... (9/8/2009) posts for more background on this issue.

Mike Kirkwood lays down the commandments for cloud computing APIs in his Cloud Religion: Do's, Do Not's, and a Glimpse of Nirvana post of 3/10/2010:

As the cloud is getting more players and interfaces, best and worst practices are emerging. As the market grows and more companies try to plug in, the cloud may benefit from guiding principles.

Similar to new technology movements in the past, a natural process is underway to define "what is good", which, for some in the industry, equates to "what is open". Like religion itself, open can be defined in ways that are uplifting, or on the other side of the coin, restricting. Also, we learn again, nothing is free.

Mike addresses the following topics:

    1. Cloud APIs Must Walk on Water
    2. Do Unto Others as You Would Have Done To You
    3. Do Not Covet Thy Neighbors Network Resource
    4. Nirvana: Smells Like Services Orientation

Bernard Golden uses an analogy between diagnosing failures of automotive Electronic Control Units (ECUs) and data center components in his feature-length Cloud and The Death of the Sysadmin post to NetworkWorld’s DataCenter blog of 3/10/2010:

For as long as there have been computer systems, there have been system administrators. These hardy souls are the glue of data centers, provisioning and managing systems: a mix of hardware, storage, OS, middleware, and application software. The best system administrators are like human Swiss Army Knives, armed with every skill needed to keep a system up and humming. In some respects, system administrators function like auto mechanics, needing keen diagnostic skills and repair capabilities to keep a complex mixture of disparate systems operating as a whole.

Of course, data centers have become far more complex over the past decade as systems have been deconstructed into functional components that are segregated into centralized groupings. For example, storage has, in many organizations, been migrated to centralized storage like a SAN or NAS. This has inevitably meant that personnel become more specialized in their tasks and skills. However, for every organization that has a separate storage group, there's another in which the arrival of centralized storage has just meant a new set of tasks heaped upon the sysadmin group.

Even in those IT organizations in which functions like networking and storage have been separated from system management functions, sysadmins still monitor, manage, and repair the software stack. IT organizations still rely on human insight, skill, and experience, to keep apps up and running.

I just read an IEEE article, though, that caused me to reconsider the future of sysadmins. The article, surprisingly, did not address developments in IT, but instead looked at the developments in automobiles; specifically, how autos are now rolling data centers in and of themselves. It notes that today's high-end cars (meaning, the low-end cars five to 10 years from now) have around 100 million lines of code in them distributed among 70 to 100 Electronic Control Units (ECU)-essentially, special purpose computers devoted to tasks like lighting, engine management, and yes (as Toyota is now grappling with), braking. Today's low-end cars have around 30 to 60 ECUs. In the near future, the article quotes research firm Frost and Sullivan, cars will contain 200 to 300 million lines of code.

He continues building his analogy and concludes:

Tomorrow's sysadmin won't be someone who knows enough about a bunch of different things and can write glue scripts. He or she will be a systems engineer, akin to a physician-someone doing assessment, diagnosis, and treatment of highly sophisticated software agglomerations. My only question: will there be enough such people available? No one has yet explored a future where computerization is so pervasive that the entire globe cannot graduate enough technical talent to manage it.

Bernard Golden is CEO of consulting firm HyperStratus, which specializes in virtualization, cloud computing and related issues. He is also the author of "Virtualization for Dummies," the best-selling book on virtualization to date.

<Return to section navigation list> 

Cloud Security and Governance

David Linthicum asserts “In their cloud adoption, many enterprises are not considering obvious issues that could substantially hurt them” in his 3 cloud computing mistakes you can avoid today post of 3/11/2010 to InfoWorld’s Cloud Computing blog:

Companies and individuals implementing cloud computing these days are doing some things correctly and many other things incorrectly. Here are [abbreviated versions of] the top three mistakes I'm seeing and how you can avoid them:

  1. Not considering a public cloud …
  2. Security and governance as afterthoughts …
  3. No continuation of business strategy …

Eric Chabrow claims “Agencies to Pool Resources to Accredit Technology, Services” in his NIST Guidance Seen Saving Government Millions post of 3/11/2010:

As the government looks to deploy cloud computing and other new technologies securely, just-issued guidance from the National Institute of Standards and Technology shows how agencies can pool resources to qualify technologies and services for purchase and, in turn, save taxpayers millions of dollars, says a senior NIST computer scientist.

Traditionally, each agency had been required to have its own officer make a judgment on whether the technology or service being acquired met certain IT security standards. But Special Publication 800-37, Guide for Applying the Risk Management Framework to Federal Information Systems: A Security Life Cycle Approach, shows how agencies can either have their authorization official team up with counterparts from other agencies - known as joint authorization - or piggyback on the work performed by another agency's authorization official - leveraged authorization - to qualify IT products and services for acquisition.

Ron Ross - who led the team that wrote the revised SP 800-37, which NIST released late last month - explained in an interview with GovInfoSecurity.com (transcript below) how the new process works:

"For example, the GSA (General Services Administration) may go out and accredit or authorize cloud providers information systems. Then, there may be a string of other federal agencies that decide somewhere down the line, after that authorization is completed, that they also want to use that cloud providers services. Instead of having to go back and having each of those agencies do a complete reauthorization for their own purposes, they can now use the documentation and evidence used as part of that first agency's authorization, and use that as the basis of their risk decision."

"This has the potential to save the federal government literally millions of dollars so every agency doesn't have to go forward and do the same process over and over and over again. …

<Return to section navigation list> 

Cloud Computing Events

My 15 Azure-Related Sessions Scheduled for TechEd 2010 OakLeaf blog post of 3/11/2010 reports the results of my search on the keyword “Azure”:

TechEd 2010’s Application Server & Infrastructure, Cloud Computing & Online Services, Developer Tools, Languages & Frameworks, and Windows Embedded tracks report 15 sessions when searching with the keyword “Azure” as of 3/11/2010.

15 session titles and abstracts follow. Watch for addition of presenter names and new sessions as we get closer to June.

Maegan reports on 3/9/2010 about a Data Cluster Meetup on 3/14/2010 at 6:00 PM CST at Opal Divine’s Freehouse in Austin, TX:

Austin, TX may be the live music capital of the world, but next weekend Rackspace, together with Infochimps, WolframAlpha, Factual, and knowmore, are putting together an event that will prove it’s not just about the music.

Data geeks from all over the nation will come together to discuss the latest developments in the world of data during birds-of-a-feather sessions, talks and pure and simple mingling (not to mention munching on free food) at the Data Cluster Meetup (Sunday, March 14, 6pm at Opal Divine’s Freehouse).

Not excited yet? Read on…

Non-relational Database Smackdown
Stu Hood of the Cassandra project will lead a discussion that will debate the merits of various non-relational databases. Any CouchDB or MongoDB users out there? RSVP and get in touch to be involved in the panel.

Birds-of-a-feather
There will be five birds-of-a-feather sessions going on concurrently. Each discussion topic was chosen so that you’ll be able to find one that you are most interested in:

  1. Operations (managing data) – Stu Hood of Rackspace and the Apache Cassandra project will lead a discussion on non-relational databases
  2. Analytics (exploring data) – [No moderators locked in, interested? Email info@infochimps.org]
  3. Web Applications (humanizing data) – [No moderators locked in, interested? Email info@infochimps.org]
  4. Visualization (seeing data) – [No moderators locked in, interested? Email info@infochimps.org]
  5. Data Commons (freeing data) – Infochimps’s own Flip Kromer, together with Factual’s Gil Elbaz will lead a discussion on building a cross-domain data commons.

Mingling
The best part of this event is the people. You’ll have time to talk, eat, and network with some of the greatest minds in the data world and exchange cutting edge ideas.

If you’re a really smart data geek, you can’t miss out on this chance to immerse yourself in the world you love. RSVP now at http://datacluster.infochimps.org. Afterwards, check out our Facebook event page for more information on who’s coming and the latest updates.

Bill Zack will present a CSD25CAL: Windows Azure Design Patterns Webcast on 3/26/2010 at 10:00 AM PST according to this preview from WorkTankSeattle:

One of the challenges in adopting a new platform is finding usable design patterns that work for developing effective solutions. The Catch-22 is that design patterns are discovered and not invented. Nevertheless it is important to have some guidance on what design patterns make sense early in the game.

This webcast attacks the problem through a set of application scenario contexts, Azure features and solution examples. It is unique in its approach and the fact that it includes the use of features from all components of the Windows Azure Platform including the Windows Azure OS, Windows Azure AppFabric and SQL Azure. In this webcast you will learn about the components of the Windows Azure Platform that can be used to solve specific business problems.

Click the preview page’s Register button to register for the Webcast.

Bill is an Architect Evangelist with Microsoft. He comes to this role after serving as a Solutions Architect in the Financial Services Unit of Microsoft Consulting Services.

Catherine Eibner announces in her 3/11/2010 Azure BizSpark Camp Sydney: Chance to Win $5000 post that the camp will be held 4/10 and 4/11/2010:


The current economic downturn is putting many entrepreneurs under increasing pressure, making it critical to find new resources and ways to reduce costs and inefficiencies. Microsoft BizSparkCamp for Windows Azure is designed to offer the following assistance to entrepreneurs.

  • Chance to win cash prize of $5000
  • Learn and build new applications in the cloud, or use interoperable services that run on Microsoft infrastructure to extend and enhance your existing applications, with the help of on-site advisors
  • Get entrepreneurial coaching from a panel of industry experts
  • Generate marketing buzz for your brand
  • Create an opportunity to be highlighted at an upcoming launch

We are inviting nominations from BizSpark Startups interested in the Windows Azure Platform that target one or more of the following:

The Microsoft BizSparkCamp for Windows Azure will be held in Sydney on Saturday the 10th of April ending on Sunday the 11th. This event consists of ½ day of training, 1 day of active prototype/development time, and ½ day for packaging/finishing and reporting out to a panel of judges for various prizes.

Catherine continues with requirements for nominating your team.

<Return to section navigation list> 

Other Cloud Computing Platforms and Services

Chris Kanaracus reports “A number of people who worked on the Drizzle project have gone to the cloud infrastructure vendor” in his Rackspace hires to align with MySQL offshoot article of 3/9/2010 for NetworkWorld’s Software blog:

A number of former Sun Microsystems employees who worked on Drizzle, an offshoot of the MySQL open-source database, have ended up at cloud infrastructure provider Rackspace, where they will continue their efforts, developer Jay Pipes wrote in a blog post Monday.

Pipes left a post as community relations manager for MySQL in October 2008 to begin working on Drizzle. He had become frustrated by "the slow pace of change in the MySQL engineering department and its resistance to transparency."

Drizzle was of interest to Pipes "because it was not concerned with backwards compatibility with MySQL, it wasn't concerned with having a roadmap that was dependent on the whims of a few big customers, and it was very much interested in challenging the assumptions built into a 20-year-old code base," he wrote.

Shortly after Oracle closed its acquisition of Sun earlier this year, Pipes learned he and others would be out of a job.

"I don't know whether [Oracle CEO Larry Ellison] understands that cloud computing and infrastructure-as-a-service, platform-as-a-service, and database-as-a-service will eventually put his beloved Oracle cash cow in its place or not," he wrote. "But what I do know is that Rackspace is betting that providing these services is what the future of technology will be about." …

<Return to section navigation list> 
