Wednesday, February 25, 2009

2nd Government Cloud Computing Survey - A Sneak Peek

This month, we're in the middle of collecting data for our 2nd Government Cloud Computing Survey. To pique your curiosity (and to entice your participation), here is a sneak peek at some of the results.

First, I want to thank the Government customer community for a significant increase in participation. Although industry participation has improved as well, the main purpose of this survey is to better understand how cloud computing can best address Government customer needs. We believe that this increase in Government participation has improved our measurement of customers' familiarity with cloud computing, resulting in an increase in the percentage of customers reporting unfamiliarity with it.

In an effort to better understand the issues and challenges that customers hope to address with cloud computing, we expanded the issues and challenges list in the second survey. Initial data seems to indicate that this move has improved our ability to provide insight into this aspect of the market. While the top two issues continue to be datacenter and budget limitations, organizational changes and the resulting growth of the stakeholder community are also of critical importance.

Finally, the data shows that security remains the dominant barrier to adoption, with 71% of respondents naming it as the top customer concern.

If you haven't completed the survey yet, please take a few minutes to do so. Complete results will be presented at the 2nd International Cloud Computing Conference and Expo in New York City, March 30 - April 1, 2009.

Monday, February 23, 2009

Government could save billions with cloud computing

According to a recent study published by MeriTalk, Red Hat and DLT Solutions, the Federal government could save $6.6 billion by using cloud computing or software-as-a-service.

"Looking at 30 federal agencies, the study assumes every agency is starting from scratch with new technology. So instead of buying new software, agencies could save money using open-source instead of proprietary software, virtualization technologies instead of new servers, and cloud computing instead of buying software and hardware.

The numbers are based on federal agency budgets, using assumptions from other studies about federal computing resources."

Peter Tseronis, deputy associate chief information officer for the US Department of Energy, commented that even though critics may quibble with the numbers, the big message is that federal agencies should be looking at new ways to save money with IT investments.

My personal recommendation to those agencies considering cloud computing solutions would be for them to send a representative to play in the Cloud Computing Wargame at this year's FOSE 2009 convention. This not only presents information on the technology, but it also provides a "no risk" means of comparing cloud computing options to more traditional IT approaches.

Reservations are required to participate in the wargame. Online registration is available, or you can contact the registrar at 800-746-0099 or by email. There will also be a Cloud Computing Capstone Event on Wednesday, March 11th from 4:30-5:30 pm. The Capstone Event will feature a panel discussion, the announcement of wargame winners, and a review of lessons learned.

Space is limited, so please make your reservations today. The wargame session schedule is listed below.

Tuesday March 10th

  • Session 1: 10:30 - 12:00
  • Session 2: 12:30 - 2:00
  • Session 3: 2:30 - 4:00

Wednesday March 11th

  • Session 1: 10:30 - 12:00 (DoD Theme)
  • Session 2: 12:30 - 2:00 (Intel Theme)
  • Session 3: 2:30 - 4:00 (Civil & Healthcare Theme)

Thursday March 12th

  • Session 1: 10:30 - 12:00

Thursday, February 19, 2009

Cloud Games at FOSE 2009


Booz Allen Hamilton is launching its Cloud Computing Wargame (CCW)™ at FOSE, March 10-12, 2009, in Washington, DC. The CCW is designed to simulate the major decision factors that government agencies' IT executives and Program Managers must take into consideration when moving to a cloud environment. FOSE attendees will have the opportunity to step up and be a "player."

The Government Cloud Computing Community (GCCC) web site has a specific forum on this, and more information will be made available as we move to FOSE. This forum is an opportunity to discuss Cloud Computing Wargame features, uses, and more. AND, you will have the opportunity to suggest enhancements to the "game" that can benefit you and other government colleagues looking at cloud transformation.

For a pre-show overview, learn more at the Government Cloud Computing Community website if you are already a member, or join now by sending an email. For those of you fortunate enough to play, as well as for bystanders, a survey will be handed out and will also be available online at the Government Cloud Computing Community web site! Results will be posted.

Monday, February 16, 2009

IBM and Amazon

According to the Amazon Web Services (AWS) site, you can now use DB2, Informix, WebSphere sMash, WebSphere Portal Server or Lotus Web Content Management on Amazon's EC2 cloud.

"This relationship will enable you to bring your own IBM licenses to Amazon EC2, utilize IBM’s 'Development' AMIs, or leverage the 'Production' Amazon EC2 running IBM service. All of these options will enable you to get started quickly using popular IBM platform technologies in the Amazon EC2 environment."

To me, this move was clearly aimed at beating Microsoft's Azure platform into the cloud, and it represents a huge leap forward for this nascent industry. Now an enterprise can easily use these IBM products as a utility. As IBM adds more of its products into the mix, cloud transition offerings (enterprise-to-cloud or cloud-to-enterprise) will follow.

This also represents a huge turning point for cloud interoperability. IBM's enterprise footprint paired with Amazon's established lead in infrastructure-as-a-service could create a de facto standard within the cloud computing community.

I'm not sure if this is good or bad, but it is definitely a huge event.

Friday, February 13, 2009

A Berkeley View of Cloud Computing

Yesterday, Berkeley released their View of Cloud Computing, arguing that cloud computing provides an elasticity of resources, without paying a premium for large scale, that is unprecedented in the history of IT. Specifically, cloud computing provides:
  • The illusion of infinite computing resources available on demand, thereby eliminating the need for Cloud Computing users to plan far ahead for provisioning.

  • The elimination of an up-front commitment by Cloud users, thereby allowing companies to start small and increase hardware resources only when there is an increase in their needs.

  • The ability to pay for use of computing resources on a short-term basis as needed (e.g., processors by the hour and storage by the day) and release them as needed, thereby rewarding conservation by letting machines and storage go when they are no longer useful.
In the paper, they provide an equation that captures the value trade-off for a web business with varying demand over time and revenue proportional to user hours.
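As I recall it, the trade-off compares expected profit from cloud user-hours against profit from datacenter user-hours at the datacenter's actual utilization (this is paraphrased from memory; see the paper for their exact formulation and symbols):

```latex
% Paraphrased from the Berkeley paper; moving to the cloud makes sense when
UserHours_{cloud} \times (revenue - Cost_{cloud}) \;\ge\;
UserHours_{datacenter} \times \left( revenue - \frac{Cost_{datacenter}}{Utilization} \right)
```

Dividing the datacenter cost by utilization reflects the fact that you pay for provisioned capacity whether or not it is used, while cloud resources are paid for only when consumed.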

Of particular interest is their "Top 10" table of cloud computing opportunities and obstacles.

If you're serious about exploring the use of cloud computing, this paper is well worth your time.

Wednesday, February 11, 2009

Cloud Economic Models

One of the most important drivers of cloud computing in the Federal space is its perceived "compelling" economic value. Some initial insight on the economic argument is now available on the Government Cloud Computing Community website. According to Booz Allen Hamilton, cloud computing changes much of what is now understood about estimating IT costs and the current process for deriving economic benefits. Calling cloud computing "a game changer," the noted management consulting firm says that virtualization, elasticity, and massive scale in the cloud break linear cost relationships and enable infrastructure compression factors of 4:1 to 10:1. This compression of infrastructure results in a reduction in operations and maintenance costs.
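To make the compression claim concrete, here is a minimal sketch of how a 4:1 to 10:1 consolidation ratio translates into O&M savings for costs that scale with server count. The server counts and per-server costs below are my own illustrative numbers, not figures from the Booz Allen report:

```python
import math

def compressed_om_cost(servers: int, om_cost_per_server: float,
                       compression: float) -> float:
    """Annual O&M cost after consolidating at the given compression ratio."""
    # A 4:1 ratio means four physical servers collapse onto one host.
    remaining_servers = math.ceil(servers / compression)
    return remaining_servers * om_cost_per_server

# Hypothetical agency: 400 servers at $5,000/year O&M each.
baseline   = compressed_om_cost(400, 5000.0, 1)   # no consolidation: $2,000,000
at_4_to_1  = compressed_om_cost(400, 5000.0, 4)   # 100 servers:      $500,000
at_10_to_1 = compressed_om_cost(400, 5000.0, 10)  # 40 servers:       $200,000
```

The point of the sketch is only that O&M costs tied to physical server count shrink roughly in proportion to the compression factor; real savings depend on which cost categories actually scale that way.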

This new approach to IT also changes the way capital planning and budgeting are conducted.

"Depending on how cloud computing is adopted, funds can migrate out of capital budgets into O&M budgets, away from the cost of bandwidth in the LAN to the (often higher) cost of bandwidth in the WAN. Funds budgeted for software and desktop support become service fees."
For free access to this report and other resources in the Government Cloud Computing Community wiki, send an email.

Tuesday, February 10, 2009

Cloud Computing In Government: From Google Apps To Nuclear Warfare

Today, I want to thank John Foley of InformationWeek for an enjoyable interview and his excellent post, Cloud Computing In Government: From Google Apps To Nuclear Warfare. Our discussion covered cloud-bursting at sea by battleship groups, satellite imagery, and open source software development. Please read his post and comment on other examples of government use of cloud computing.

Monday, February 9, 2009

CloudCamp Federal @ FOSE

Sign up now for CloudCamp Federal @ FOSE, March 10, 2009, 3pm - 8:30pm at the Walter E. Washington Convention Center, 801 Mount Vernon Place NW, Washington, DC. As a follow-up to last November's successful CloudCamp Federal, this event will also be held as an unconference, where attendees can exchange ideas, knowledge and information in a creative and supportive environment, advancing the current state of cloud computing and related technologies.

Although focused on cloud computing within the Federal government context, CloudCamp Federal is an informal, member-supported gathering that relies entirely on volunteers to help with meeting content, speakers, meeting locations, equipment and membership recruitment.

See you there!!!

Friday, February 6, 2009

Thank You NVTC "Cool Tech" and TechBISNOW !!

Thank you to Dede Haas, Chris D'Errico and the Northern Virginia Technology Council for the opportunity to speak at yesterday's NVTC "Cool Tech" Committee meeting! The Agilex facilities were awesome and I couldn't ask for more from the audience. I also want to thank TechBISNOW for the great coverage.

For those unable to attend, NVTC has made my presentation available online at the committee website.

A Significant Event in Cloud Interoperability

On Jan 20th, GoGrid released its API specification under a Creative Commons license.

"The Creative Commons Attribution Share Alike 3.0 license, under which the GoGrid cloudcenter API now falls, allows for the ability to:

  • Share, distribute, display and perform the work

  • Make derivative works

The GoGrid cloudcenter API re-use must, however, fall under the following Share Alike licensing conditions:

  • There must be full attribution to GoGrid, author and licensor

  • There is no implied endorsement by GoGrid of any works derived from the API usage or rework

  • After any transformation, alteration or building upon this work, any distribution must be under the same, a similar or a compatible license

  • You must make it clear to others about the terms of this license. The best way to do this is by linking to the GoGrid Wiki API page (link below)

  • Any of the conditions mentioned previously can be waived with permission from GoGrid

Details on the GoGrid cloudcenter OpenSpec API license can be found within the GoGrid site and is specific to the API only"

As I discussed in my earlier post on cloud interoperability, this move falls under industry option 1 - Release all details of a selected “Proven Technology” to industry and adopt as a basis for an open standard.

A big Thank You goes out to GoGrid. It's exciting to see our industry maturing in a collaborative way.

Wednesday, February 4, 2009

Booz|Allen|Hamilton & Dataline Sponsor 2nd Government Cloud Computing Survey

Dataline, Booz|Allen|Hamilton and the Government Cloud Computing Community have teamed up to sponsor the 2nd Government Cloud Computing Survey. Cloud computing has come a long way since the first survey six months ago, so we are once again asking for your thoughts on this exciting new approach to technology. The rare examples of just a few months ago have turned into a large number of exciting cloud-based deployments, including:
  • GovDelivery for mass email and wireless notices to the public (50+ governmental organizations across 25 states and 13 federal departments) 
  • DC Government use of Google Apps for their 38,000+ employees for email, document collaboration, intranet, and calendars.
  • Census and Human Genome project data stored in the Amazon cloud
  • Acquisition solutions at 65 separate government agencies
  • Financial tracking at the State Department
Please give us your views, thoughts and plans through the survey. Results will be made available through this blog and the SOA-Ring and Government Cloud Computing Community wikis.

Tuesday, February 3, 2009

Gartner Lays Out 7-year Plan for Cloud Computing

According to Gartner's new report, cloud computing will go through three phases over seven years before it matures as an industry:

- Phase 1: 2007 to 2011 — Pioneers and Trailblazers - A market development phase when technology providers with the strongest market "vision" will garner the most success among early adopters.

- Phase 2: 2010 to 2013 — Market Consolidation - The market will become overcrowded with a broad range of solutions from large and small vendors, and competitive pressure will drive many weaker players from the market, resulting in acquisition activity. By 2013 this technology will be the preferred, but not the exclusive, choice for the majority of opportunistic and architecturally simple application development efforts among Global 2000 enterprises.

- Phase 3: 2012 to 2015 and Beyond — Mainstream Critical Mass and Commoditization - A small number of large providers will dominate the market, providing de facto standards. These vendors will primarily leverage proprietary technologies developed during the previous five years, but they will also widely support intracloud application programming interfaces to establish a technology "fabric," linking cloud-based solutions across vendor platforms.

This outlook definitely says that cloud computing is here to stay.


I guess the blogosphere does have some clout! From Lydia Leong in her Feb 4th blog post.

"Gartner recently put out a press release titled “Gartner Says Cloud Application Infrastructure Technologies Need Seven Years to Mature“, based on a report from my colleague Mark Driver. That’s gotten a bunch of pickup in the press and in the blogosphere. I’ve read a lot of people commenting about how the timeline given seems surprisingly conservative, and I suspect it’s part of what has annoyed Reuven Cohen into posting, “Cloud computing is for everyone — except stupid people.”

The confusion, I think, is over what the timeline actually covers.

Cloud computing in general already has substantial business uptake, with potential radical acceleration due to the economic downturn. ... I have far more clients suddenly willing to consider taking even big risks to leap into the cloud, than I have clients who actually have projects well-suited to the public cloud and who will realize substantial immediate cost savings from that move.

On the flip side, for those who have public-facing Web infrastructure, cloud services are now a no-brainer. ...Traditional hosting providers who don’t make the transition near-immediately are going to get eaten alive."

Cloud Interoperability Magazine Launches

My congratulations go out today to Reuven Cohen on the launch of Cloud Interoperability Magazine. The site will focus on Cloud Computing, standardization efforts, emerging technologies, and infrastructure APIs. As the new home for the Cloud Computing Interoperability Forum, the outlet promises to enhance industry dialog around this important subject.

Reuven Cohen, through his Elastic Vapor blog, has been a key cloud computing thought leader and I look forward to the success of this new on-line venue.

Monday, February 2, 2009

Why Can't We Eliminate the "Technology Refresh" RFP?

In order to maintain life cycle and technology, the Navy is upgrading server farms at fifteen (15) sites, and at any future sites, throughout the Far East, Europe and Middle East regions. According to the RFP:

"The Server Farm Refresh is focused on upgrading hardware that is already out of warranty and also improving the data services, performance and future capabilities while still meeting the needs of the Fleets." 

In outlining the service's requirement, the RFP specifies a solution that shall not:
  • Require a significant increase in staffing levels;
  • Introduce the requirement for senior skill sets that may not be available or exceedingly costly to obtain; or
  • Exceed a 10% increase in seat cost of $2244 per year.
In their proposals, offerors are directed to include:
  • all KVM and UPS devices and associated peripherals;
  • administrators and systems engineers;
  • the capability to power up, power down, reboot, install operating systems and applications, and perform administrative functions remotely across the enterprise;
  • a solution that is secure, scalable, manageable and supportable through 2015;
  • a minimum 5-year warranty on provided hardware;
  • a design in which a component failure will not cause outages for customers;
  • redundant/load-balanced servers for critical devices;
  • a target service availability of 99.999% or greater;
  • the capacity to support future growth of sites, users, and services without major disruption or overhaul of the infrastructure;
  • scalability/upgradability to the latest technology;
  • the ability to support industry changes to operating systems or software upgrades without requiring expenditure of additional funds; and
  • a COTS-based solution.
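Two of the RFP's numeric requirements are easy to put in concrete terms. The $2,244 seat cost and the 99.999% availability target come straight from the RFP text above; the conversions themselves are just arithmetic:

```python
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

def max_seat_cost(current: float, max_increase: float) -> float:
    """Ceiling on annual seat cost given the allowed percentage increase."""
    return round(current * (1 + max_increase), 2)

def downtime_budget_minutes(availability: float) -> float:
    """Allowed downtime per year at a given availability target."""
    return round((1 - availability) * MINUTES_PER_YEAR, 2)

seat_ceiling = max_seat_cost(2244.0, 0.10)           # $2,468.40 per seat per year
downtime = downtime_budget_minutes(0.99999)          # about 5.26 minutes per year
```

In other words, offerors must deliver "five nines" (roughly five minutes of downtime a year) while adding no more than about $224 per seat, which is exactly the kind of constraint that utility-style cloud infrastructure is built to satisfy.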
Is it just me, or is this RFP a commercial for cloud computing?

This procurement explicitly procures new technology to refresh old technology. All that does is guarantee another refresh a few years from now. Why not instead outline a competition for the design and delivery of a private cloud that meets these requirements and all applicable DoD directives? Isn't that much simpler and more direct?

What do you think?