Wednesday, December 31, 2008

December NCOIC Plenary Presentations

Presentations from the NCOIC Cloud Computing sessions held earlier this month have been posted online in the Federal Cloud Computing wiki. The event featured speakers from IBM, Cisco, Microsoft, HP, and Salesforce. Access is free, but registration is required.


Cloud Computing for Net-Centric Operations
  • NCOIC - Bob Marcus (Leader, NCOIC Cloud Computing Team), "Cloud Computing for Net-Centric Operations" (slides)
  • HP - Shawn Sami (Chief Technologist, HP Software Federal), "Secure Cloud Computing and RACE" (slides)
  • Salesforce.com - Peter Coffee (Director of Platform Research), "Building Mission-Critical SaaS Applications" (slides)
  • IBM - David Lindquist (Cloud Computing Chief Architect and IBM Fellow), "IBM's Perspective on Cloud Computing" (slides)
  • Cisco - Glenn Dasmalchi (Chief of Staff, Cisco CTO Office) and James Urquhart (Marketing Manager for Cloud Computing), "Cloud Computing: Trends and Opportunity" (slides)
  • Microsoft - Stan Freck (Director, Software + Services – US Public Sector), "Software + Services – the next generation of computing" (slides)

    NCOIC Cloud Computing Ad Hoc Team Sessions (8:00 - 5:00)

    Morning: Enterprise Cloud Computing, Standards, and Open Source
  • NCOIC - Bob Marcus, "Open Questions for Enterprise Cloud Computing" (slides)
  • Dataline - Kevin Jackson, "Report on Enterprise Cloud Computing Session at the World Summit of Cloud Computing" (slides)
  • Elastra - Stuart Charlton, cloud middleware using markup languages (ECML, EDML) (slides)
  • Eucalyptus - Sunil Soman, open source infrastructure similar to Amazon's, "Eucalyptus Open Source Cloud Computing System" (slides)
  • 3Tera - Bert Armijo, Cloudware architecture, "Clouds Don't have Boundaries" (slides)
  • Sun - Joseph Mangin, "Cloud Computing" (slides)
  • Open Cloud Consortium - Robert Grossman, "The Open Cloud Consortium" (slides)
  • Open Grid Forum - Craig Lee, "Cloud, Standards, and Net-Centric Operations from an OGF Perspective" (slides)

Afternoon: Cloud Computing for Tactical Networks

  • DoD OSD NII - John Daly, "Cloud Computing for Tactical Networks" (slides)
  • Dataline - Kevin Jackson, "SOA-R and Cloud Computing in the US Government" (slides)
  • DARPA - James Snyder (discussed at the NCOIC Plenary Panel), "Disruption Tolerant Networking Program" (slides)

Tuesday, December 30, 2008

Booz Allen Hamilton Launches "Government Cloud Computing Community"


As a follow-up to a Washington, DC Executive Summit event, Booz Allen Hamilton recently launched an online government cloud computing collaboration environment. In an effort to expand the current dialog around government cloud computing, the strategy and technology consulting firm wants to build a community "to exchange ideas, share lessons learned, and generally support the government’s progress in leveraging Cloud Computing." The current topics listed on the site are themselves very interesting and include:
  • Cloud Computing Case Studies
  • Cloud Economic Models
  • Enterprise Architecture
  • Cloud Computing War Game
Welcome aboard, BAH! I expect that the mere presence of your site will heighten interest in cloud computing within the Federal marketplace.

For additional information or to join the community, send an email to cloudcomputing@bah.com.

Monday, December 29, 2008

Is Google Losing Documents?

John Dvorak posted this question on his blog Saturday, and as of Sunday evening it had drawn 52 responses! This is not a good thing for building confidence in cloud computing. Or is it?

The story is that users of Google Docs were receiving the message “Sorry! We are experiencing technical difficulties and cannot show all of your documents.” Apparently the documents were restored by Saturday evening, but this incident just reinforces two points:

  • Just like any other human enterprise, cloud computing isn't infallible; and
  • The Google cloud was apparently able to restore all the documents.
From the thread, it seems that Google Docs was down for about a day. Without knowing exactly what happened, I'd say this recovery was as good as any other global enterprise could have managed.

So what's the beef with cloud computing? I don't see any. I do, however, see a problem with the document recovery delay. The issue is that a cloud service should be designed in a way that makes such failures invisible to the users.

This is why I'm a fan of cryptographic data splitting with geographic dispersal. With this approach, even after a complete failure in one geographic location, data can be reconstituted and served immediately from the multiple remaining storage locations without loss of service. This type of service level cannot be provided by current RAID solutions.

Friday, December 26, 2008

Cryptographic Data Splitting? What's that?

Cryptographic data splitting is a new approach to securing information. The process encrypts data and then distributes it, randomly or deterministically, across multiple shares. The distribution can also incorporate fault-tolerant bits, key splitting, authentication, integrity checks, share reassembly, key restoration, and decryption.
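
To make the splitting step concrete, here is a minimal sketch in Python of the simplest flavor: an XOR-based scheme in which every share is required for reconstruction. This is not any vendor's actual algorithm; commercial implementations encrypt the data first and typically use threshold (k-of-n) secret sharing or erasure coding so that the fault-tolerant bits allow recovery even when some shares are lost.

```python
import os
from functools import reduce

def xor_bytes(a, b):
    """Byte-wise XOR of two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

def split(data, n):
    """Split data into n shares. In this simple XOR scheme, ALL n shares
    are needed to rebuild, and any single share looks like random noise."""
    shares = [os.urandom(len(data)) for _ in range(n - 1)]
    # The final share is the data XORed with every random share.
    shares.append(reduce(xor_bytes, shares, data))
    return shares

def combine(shares):
    """XOR all the shares together to recover the original data."""
    return reduce(xor_bytes, shares)

if __name__ == "__main__":
    # In a real system the payload would already be encrypted at this point.
    secret = b"sensitive payload"
    shares = split(secret, 4)   # disperse these to four separate locations
    assert combine(shares) == secret
```

A threshold scheme such as Shamir's secret sharing replaces the all-or-nothing XOR step so that any k of the n shares suffice, which is what makes the kind of geographic dispersal described in the December 29 post above survivable.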

Most security schemes have one or more of the following drawbacks:
  • Log-in and password access often does not provide adequate security.
  • Public-key cryptographic systems rely on the user for security.
  • Private keys stored on a hard drive can be accessible to others or exposed through the Internet.
  • Private keys stored on a computer system with an archiving or backup system can result in copies of the private key traveling through multiple storage devices or other systems.
  • In biometric cryptographic systems, the smartcard or portable computing device can be lost or damaged.
  • A malicious person could steal a mobile user's smartcard or portable computing device and use it to effectively steal the mobile user's digital credentials.
  • A computing device's Internet connection may expose the file where biometric information is stored, making it susceptible to compromise through user inattentiveness to security or malicious intruders.
  • A single physical location exists toward which to focus an attack.

Cryptographic data splitting has multiple advantages over current, widely used security approaches because:

  • Enhanced security from moving shares of the data to different locations (different logical, physical, or geographical locations) on one or more data depositories or storage devices.
  • Shares of data can be split physically and placed under the control of different personnel, reducing the possibility of compromising the data.
  • A rigorous combination of the steps provides a comprehensive process for maintaining the security of sensitive data.
  • Data is encrypted with a secure key and split into one or more shares.
  • No single physical location exists toward which to focus an attack.

Because of these and other advantages, this approach seems to be a natural fit for cloud computing.

Tuesday, December 23, 2008

Now really. Should the Obama administration use cloud computing?

It's amazing what a little radio time will do!

Since Sunday's broadcast, I've been asked numerous times about my real answer to the question "Will 'Cloud Computing' Work In White House?". Although I would never presume to be in a position to advise the President-elect, I'm more than happy to add my voice to the Center for Strategic and International Studies (CSIS) and the distinguished list of contributors that recently released the report of the CSIS Commission on Cybersecurity for the 44th Presidency.

I truly believe that cloud computing technology can be used to implement some of their recommendations. One in particular is their recommendation for a National Office for Cyberspace (NOC) and a new National Security Council Cybersecurity Directorate (NSCCD). Along with the relevant agencies, these organizations would:

"Assume expanded authorities, including revised Federal Information Security management Act (FISMA) authorities, oversight of the Trusted Internet Connections (TIC) initiative, responsibility for the Federal Desktop Core Configuration (FDCC) and acquisition reform, and the ability to require agencies to submit budget proposals relating to cyberspace to receive its approval prior to submission to OMB."

As widely discussed in cloud computing circles, infrastructure-as-a-service (IaaS), platform-as-a-service (PaaS), and software-as-a-service (SaaS) are all required components of desktop-as-a-service (DaaS). Applied to a private government cloud, this approach could readily be adapted to deliver the Federal Desktop Core Configuration (FDCC). (Thanks go to Avner Algom of the Israeli Association of Grid Technologies for this insightful graphic.)

As I discussed on the NPR program, cryptographic data splitting could also aid in the management and protection of information in the cloud. As proposed in the CSIS report, the NOC and NSCCD would:

"Manage both a new federated regulatory approach for critical cyber infrastructure and a collaborative cybersecurity network across the federal government"

This would be akin to a "Federated Service Oriented Architecture" where a governance and security layer would be used to simultaneously improve cross-agency collaboration and inter-agency security. Couldn't this actually be the basis for a governmental private cloud? By developing and implementing appropriate standards and protocols for the government-wide, federated SOA layer, the NOC and NSCCD could quickly implement the suggested federated regulatory approach.

As emphasized repeatedly in the CSIS report, cyberspace is a vital asset for the nation. The report also stresses international engagement in order to establish international norms for cyberspace security. What better way to set those norms than to work diligently toward a global, interoperable, secure cloud computing infrastructure?

Sunday, December 21, 2008

NPR "All Things Considered" considers Government Cloud Computing


My personal thanks to Andrea Seabrook, Petra Mayer, and National Public Radio for their report "Will 'Cloud Computing' Work In White House?" on today's "All Things Considered". When I started this blog there was doubt about cloud computing being anything but a fad that would just disappear in a few weeks. Now it's clear that an important dialog is underway on the merits of this new approach for the Federal government.

I look forward to continuing the dialog and as always welcome your comments.

Thursday, December 18, 2008

HP Brings EDS Division into its cloud plans

The Street reported earlier this week that Hewlett-Packard's EDS division has won a $111 million contract with the Department of Defense (DoD) that could eventually support the U.S. military's cloud-computing efforts. EDS confirmed it will work with DISA officials to conduct security reviews on DoD networks, databases, systems, and applications. It will also evaluate DoD security policies, which could further boost the department's cloud strategy. Even though security is universally seen as a key barrier to cloud adoption, this move seems to indicate a willingness on DISA's part to work through the issues, which could support eventual widespread cloud deployments. Since EDS also holds the Navy Marine Corps Intranet (NMCI) contract through 2010, the hard-won experience EDS has gained from that painful chapter could be leveraged to accelerate cloud services to the desktop. It could also give HP a leg up in the upcoming Navy Consolidated Afloat Networks and Enterprise Services (CANES) and Next Generation Enterprise Network (NGEN) competitive bids.

Wednesday, December 17, 2008

Cloud Computing and the Process Integration Era

The Industry Advisory Council (IAC) is a non-profit, non-partisan organization dedicated to fostering improved communications and understanding between government and industry. Through its affiliation with the American Council for Technology (ACT), IAC provides a forum for industry to collaborate with and advise government executives on IT issues.


In fulfilling this role, the ACT-IAC Transition Study Group recently released a paper titled "Returning Innovation to the Federal Government with Information Technology". Since the Obama administration's stated goals include the "use [of] technology to create a more transparent and connected democracy" and the employment of technology "to solve our nation's most pressing problems", this group's recommendations should certainly be considered.


For this audience, their statement that information technology creates two major categories of performance breakthroughs in government bears attention. According to ACT-IAC, service-oriented architecture, software-as-a-service, and cloud computing promise to create significant opportunities for reducing budgets, improving process quality, and relieving staffing needs.

"The government has thousands of systems that cannot work together and were never designed to do so. This is because components of processes were automated in 1990s era PC or client server technology. Today’s technologies such as service-oriented architecture constructs, use standards and end-to-end process integration to automate processes in a manner that reduces operating costs and errors. These technologies free up labor to focus on problem solving."

The Transition Study Group sees the appointment of the proposed national Chief Technology Officer as a key step towards realizing these and other improvements in our government. They also suggest that the CTO can provide the leadership required to orchestrate innovation within and across Federal agencies. A need for significant changes in federal IT investment processes is also highlighted.

First, I'd like to thank ACT-IAC and the Transition Study Group for the insightful and thought-provoking recommendations offered in their paper. Second, I would like to ask our government decision makers to read this paper and seriously consider the study group's views and recommendations.

Tuesday, December 16, 2008

The Tactical Cloud

When cloud computing first came into vogue, there was a rather serious discussion about the private cloud concept. The whole idea of cloud computing seemed to argue against implementing such a capability behind organizational walls. Although in some circles the idea of a "private cloud" is being subsumed by the more acceptable "enterprise cloud", last week's discussions at the Network Centric Operations Industry Consortium brought up a different cloud concept - the "tactical cloud".

Now before you shout foul, please hear me out.

First of all, the discussion centered on how the US Department of Defense (DoD) could leverage cloud computing technology. In the view of many, the development of a "private" or "enterprise" cloud for the DoD is a fait accompli. Besides, the DoD already operates multiple private internets (NIPRNet, SIPRNet, JWICS, etc.), so private clouds seem like an appropriate evolution. Enterprise clouds, however, seemed to overlook the need for military formations to simultaneously leverage this new IT approach and operate independently.

Individual units could combine their IT infrastructure virtually using cloud computing concepts. One use case hypothesized the use of high-fidelity tactical simulations, run faster than real time, to help commanders better evaluate tactical options before committing to a course of action. This "tactical cloud" would also need to seamlessly reach back and interact with the DoD's "enterprise cloud". There could even be situations where the "tactical cloud" would link to a public cloud in order to access information or leverage an infrastructure-as-a-service offering. A naval formation seems to be the perfect environment for a tactical or "battlegroup cloud". Although each ship would normally operate its IT infrastructure independently, certain situations could be better served by linking all the resources into a virtual supercomputer.

Even more interesting is the fact that the conversations quickly turned to the tactical needs of police, firefighters, medical professionals, and homeland security organizations. If the DoD could improve its operations with a "tactical cloud", couldn't these other operating units benefit as well?

So tell me. Is there yet another cloud formation to consider?

Monday, December 15, 2008

"Cloud Musings" Now on SYS-CON Media "Cloud Computing Journal" !!

I'm happy to announce that a recent "Cloud Musings" article, "Commercial vs Federal Cloud Computing", has been reposted on SYS-CON Media's "Cloud Computing Journal".

Thank you SYS-CON for making selected "Cloud Musings" articles available to your audience. I am honored by your support and look forward to providing my personal insights to your readers.

SYS-CON Media, founded in 1994, is widely recognized in the Internet-technology and magazine publishing industries as the world's leading publisher of i-technology magazines, electronic newsletters, and accompanying i-technology breaking news, education and information Web portals. Cloud Computing Journal is their publication targeting the cloud computing community.

Friday, December 12, 2008

How to make clouds interoperable and standard !!

This has been a huge part of my life over the past few weeks! This is my personal view.

WARNING: DON'T EXPECT THE ANSWER TO BE FOUND BELOW !!!

There are three basic approaches to this issue being aggressively discussed right now:
  • “Standards Body” Approach
  • “Adopt Proven Technology” Approach
  • “Customer Driven” Approach
All three approaches have value, and all three have problems, including:
  • Global agreement on standards could be a long process
  • "Proven technology" may mean "proprietary technology"
  • Multiple industry customers could drive divergent, industry-specific cloud standards

So what is an embryonic industry to do? A hybrid of course !! Options include, but are not limited to:

  • Release all details of a selected "proven technology" to industry and adopt it as the basis for an open standard
  • Embrace multiple standards, with each set optimized for a particular industry's ROI model
  • Issue an "interoperability rating" after a standards-body technical review

(And we've only just begun)

Thursday, December 11, 2008

The Tension between Public and Private Clouds

Last week, during discussions on cloud interoperability and standards in Israel, I saw for the first time a real dichotomy in the value of public (external) and private (internal) clouds. This tension seemed to arise from the fact that a CIO considering moving applications to an outside cloud vendor would probably give highest priority to legacy applications. The logic was that since older applications are more costly to maintain internally, moving them to the cloud would represent a high-value option. This high-value option for the customer, however, seemed to present a worst-case scenario for public cloud providers. Is this true?

The general lack of internal metering for applications would also make an internal vs. external ROI business case a fairly difficult task. Could these tensions actually lead to different business models for internal and external cloud purveyors?
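
To make the metering problem concrete, here is a toy internal-vs-external cost comparison in Python. Every figure is an invented placeholder, not a benchmark; the point is that the external estimate needs per-application usage numbers that an unmetered internal shop simply cannot supply.

```python
def internal_annual_cost(servers, cost_per_server, admin_overhead):
    """Fully loaded internal cost: hardware plus operations staff."""
    return servers * cost_per_server + admin_overhead

def external_annual_cost(cpu_hours, per_cpu_hour, gb_months, per_gb_month):
    """Metered cloud cost: pay only for measured compute and storage."""
    return cpu_hours * per_cpu_hour + gb_months * per_gb_month

internal = internal_annual_cost(servers=20, cost_per_server=8000,
                                admin_overhead=150000)

# cpu_hours and gb_months are exactly the inputs an unmetered legacy
# application cannot provide -- the crux of the ROI difficulty.
external = external_annual_cost(cpu_hours=120000, per_cpu_hour=0.10,
                                gb_months=6000, per_gb_month=0.15)

print("internal: $%s, external: $%s" % (internal, external))
```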

Wednesday, December 10, 2008

Cloud Computing for Continuity of Operations (COOP)

Recently, I've been focusing on cloud computing for COOP. The way I look at it, many government agencies already use commercial shared facilities as COOP sites, and the cloud simply represents a virtual shared facility. Although there are still many security, privacy, and policy issues to address, it seems to me that cloud computing could provide a cost-effective and efficient COOP capability for certain operational and mission subsets.

A major key to success would be identifying which non-critical applications and infrastructure resources an agency could migrate to a cloud platform. By definition, these would be resources the agency could go without for two days. In general, applications, storage, and server resources related to ad-hoc projects, research and development efforts, and certain peak-time requirements could clearly be considered.
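
As a thought experiment, the two-day rule above can be expressed as a simple triage filter. The inventory fields and sample entries below are invented for illustration; a real agency would drive this from its application portfolio and recovery-time objectives.

```python
OUTAGE_TOLERANCE_HOURS = 48  # "could go without for two days"

# Hypothetical application inventory with tolerable-outage estimates.
inventory = [
    {"name": "payroll",          "max_outage_hours": 4},
    {"name": "R&D sandbox",      "max_outage_hours": 120},
    {"name": "ad-hoc reporting", "max_outage_hours": 72},
    {"name": "public web site",  "max_outage_hours": 8},
]

# Anything the agency can live without for 48+ hours is a candidate
# for migration to a cloud-based COOP platform.
candidates = [app["name"] for app in inventory
              if app["max_outage_hours"] >= OUTAGE_TOLERANCE_HOURS]

print("Cloud COOP candidates:", candidates)  # ['R&D sandbox', 'ad-hoc reporting']
```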

To ensure operational and contractual flexibility, only solutions that can work across multiple cloud infrastructures should be considered, and many commercial vendors can provide multi-cloud support for continuity of operations requirements. After appropriate technical and cost/benefit trades, vendors could be selected and SLAs negotiated. Pending agency policy reviews and appropriate contracting vehicles, cloud-based COOP could then be put in place.

Is this a valid approach? Are there alternatives? As always, your suggestions and recommendations are welcomed.

Tuesday, December 9, 2008

NCOIC Plenary Session

Hopping a plane to the west coast today to attend the NCOIC Plenary in Costa Mesa, California. The first day's "Cloud Computing for Net-Centric Operations" agenda includes:

  • David Ryan, Chief Architect, HP Federal Technology Solution Group - "Secure Cloud Computing"
  • Peter Coffee, Salesforce.com Director of Platform Research - "Building Mission-Critical SaaS Applications"
  • David Lindquist, IBM Cloud Computing Chief Architect and IBM Fellow - "IBM's Perspective on Cloud Computing"
  • Glenn Dasmalchi, Chief of Staff, Cisco CTO Office - "Cloud Computing: Trends and Opportunity"
  • Stan Freck, Microsoft Director, Software + Services – US Public Sector - "Software + Services – the next generation of computing"

The second-day team sessions are on Enterprise Cloud Computing and on Cloud Computing for Tactical Networks and Rapid Deployment. Briefings are expected from Stuart Charlton (Elastra), Daniel Nurmi (Eucalyptus), Bert Armijo (3Tera), and someone from Sun Microsystems. Carl Consumano, OSD NII, will also be presenting.

A planning discussion for a possible full-day Conference on "Cloud Computing for Tactical Networks and Rapid Deployment" in the DC area next year is also on the docket.

Dataline named "Top 100 Cloud Computing Company"

SYS-CON's Cloud Computing Journal included Dataline in its expanded list of the most active players in the cloud ecosystem. In adding Dataline to the "Top 100" list, Jeremy Geelan noted that "...the role this company fills as a mid-level Federal System Integrator is crucial to the adoption of these technologies by the public sector". In a related announcement, Dataline was also named a "Cloud Computing Technology Contributor of 2009".
Jeremy Geelan is Sr. Vice-President of SYS-CON Media & Events. He is Conference Chair of the International Cloud Computing Conference & Expo series and founder of Cloud Computing Journal. He is also executive producer and presenter of "Power Panels with Jeremy Geelan" on SYS-CON.TV.
Thank you, Jeremy, for including Dataline.

Monday, December 8, 2008

Autoscaling into the cloud - Good or Bad?

I always saw the ability to autoscale into a cloud infrastructure as a good thing. George Reese presented a differing view on the O'Reilly blog recently.

"Auto-scaling is the ability (with certain cloud infrastructure management tools like enStratus—in a limited beta through the end of the year) to add and remove capacity into a cloud infrastructure based on actual usage. No human intervention is necessary.

It sounds amazing—no more overloaded web sites. Just stick your site into the cloud and come what may! You just pay for what you use.

But I don't like auto-scaling."

While I agree with the need for an enterprise to do capacity planning, I think the discussion goes far beyond an overloaded website. I believe the real value of autoscaling lies in supporting a service-oriented architecture (SOA), especially when services are auto-discovered and workflows are created on the fly with mash-ups.
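
For readers new to the term, here is a bare-bones sketch of the control loop behind autoscaling. It is a generic illustration, not how enStratus or any particular tool actually works, and the three callables are hypothetical hooks into a cloud provider's API.

```python
import time

SCALE_UP_AT = 0.75    # add capacity above 75% average utilization
SCALE_DOWN_AT = 0.25  # shed capacity below 25%
MIN_INSTANCES, MAX_INSTANCES = 2, 20

def autoscale(get_utilization, add_instance, remove_instance, count):
    """Threshold-based scaling loop; no human intervention required."""
    while True:
        load = get_utilization()  # e.g., mean CPU across the server pool
        if load > SCALE_UP_AT and count < MAX_INSTANCES:
            add_instance()        # provision one more cloud server
            count += 1
        elif load < SCALE_DOWN_AT and count > MIN_INSTANCES:
            remove_instance()     # release a server, and its cost
            count -= 1
        time.sleep(60)            # re-evaluate once a minute
```

Reese's capacity-planning objection is visible right in the loop: nothing in it asks whether the new load is legitimate or affordable, it simply reacts. But in an SOA where mash-ups create workflows on the fly, demand is inherently unforecastable, and that is exactly where a reactive loop earns its keep.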

Thursday, December 4, 2008

Cloudera must be reading the script!

"Cloud computing leapt out as the most obvious way to address enterprise large data problems" - Ken Pierce, IT Specialist, DIA-DS/C4ISR

"We view Hadoop as the key enabler...[in] optimizing the [cloud infrastructure] platform to ingest and present information effectively in the petascale." - Robert Ames, Director & Deputy CTO, IBM Federal

Successful mission accomplishment in the DoD, DHS, and the Intelligence Community revolves around the ability to process "Big Data". Hadoop is all about processing "Big Data".

The ability to process big data is crucial to mission accomplishment because it is the core technology for processing terabyte-sized datasets with online applications. This capability is also needed to enable low latency in automated decision tools. Since a typical software engineer has never used a thousand machines in parallel to process a petabyte of data, new software tools are critical to the successful implementation of solutions in this domain. That's where Hadoop comes in.

Apache Hadoop is a Java software framework that supports data-intensive distributed applications. This open source implementation of Google's distributed file system and MapReduce technologies enables applications to work with thousands of nodes and petabytes of data. Cloudera was founded to provide enterprise-level support to users of Apache Hadoop. The company has extensive experience and deep expertise in the commercial use of open source software and Hadoop.
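
For a feel of the programming model, here is the canonical MapReduce "word count" written as a Hadoop Streaming job, which lets both phases be plain Python scripts reading stdin and writing stdout. It is a toy stand-in for the petabyte-scale jobs discussed above, and the paths in the run command are illustrative.

```python
#!/usr/bin/env python
# Canonical word count for Hadoop Streaming. Run with something like:
#   hadoop jar hadoop-streaming.jar -input in/ -output out/ \
#     -mapper "wordcount.py map" -reducer "wordcount.py reduce" \
#     -file wordcount.py
import sys

def mapper():
    # Each mapper receives a slice of the input and emits (word, 1) pairs.
    for line in sys.stdin:
        for word in line.split():
            print("%s\t1" % word)

def reducer():
    # Hadoop sorts the pairs by key, so equal words arrive consecutively.
    current, total = None, 0
    for line in sys.stdin:
        word, count = line.rsplit("\t", 1)
        if word != current and current is not None:
            print("%s\t%d" % (current, total))
            total = 0
        current = word
        total += int(count)
    if current is not None:
        print("%s\t%d" % (current, total))

if __name__ == "__main__":
    mapper() if sys.argv[1] == "map" else reducer()
```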

During the Cloud Computing Summit, I met Christophe Bisciglia, a Google alumnus who recently founded Cloudera. During his time at Google, Christophe created and managed the Academic Cloud Computing Initiative. Its success led to an extensive partnership with the National Science Foundation (NSF) that makes Google-hosted Hadoop clusters available for research and education worldwide. Our discussions quickly focused on how Hadoop makes the automation of intelligence exploitation feasible.

I can't wait to see the fruit of this potential marriage.

Wednesday, December 3, 2008

Animoto = Automated Imagery PED

Over the past two days, I've spent quite a bit of time with Stevie Clifton, Co-founder & CTO of Animoto. Besides being one of the coolest things I've seen in years, Animoto is giving us a glimpse of automated imagery PED (Processing, Exploitation, Dissemination). First an introduction.

Animoto Productions, a self-described "bunch of techies and film/TV producers who decided to lock themselves in a room together and nerd out", has released a web application that automatically generates professionally produced videos. The site uses patent-pending technology and high-end motion design to deliver fully customized orchestration of user-selected images and music. Using Cinematic Artificial Intelligence technology, "it analyzes and combines user-selected images and music with the same sophisticated post-production skills & techniques that are used in television and film." Their AWS spike story is now famous in the cloud computing community.

Now let's fast-forward 5, no, 2 years. A UAV is flying over the US southern border, streaming live video to an intelligence center. Using frame-grabbing technology, it forwards a series of still images to the automated intelligence exploitation center. A face recognition program matches one of the images to a known smuggler, which kicks off automatic queries of NCIC and NLETS. Timestamp information is also used to create an audio track from the many high-fidelity microphones in the area. The audio, still frames, and automatic query data are then sent to the Animoto engine, which uses the available metadata to produce an intelligence video and transmit it, in near real time, to the nearest CBP unit for appropriate interdiction.

WOW!!
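
For the software-minded, here is a skeleton of that hypothetical pipeline. Every function is an invented stub (there is no public NCIC/NLETS query API or Animoto rendering API to call); the sketch only shows the automated hand-offs from frame grab to delivered video.

```python
# All functions below are placeholder stubs invented for illustration.
def match_face(frame):            return "known_smuggler_017"        # watch-list hit
def query_ncic_nlets(subject_id): return {"subject": subject_id, "records": 2}
def area_audio_at(timestamp):     return b"time-aligned audio bytes"
def render_intel_video(frames, audio, metadata):
    return {"frames": frames, "audio": audio, "meta": metadata}
def dispatch_to_nearest_unit(video):
    print("interdiction package sent to CBP:", video["meta"])

def process_frames(frames, timestamp):
    """One pass through the hypothetical exploitation chain."""
    subject = match_face(frames[0])            # face recognition on a still
    if subject:
        records = query_ncic_nlets(subject)    # automated records queries
        audio = area_audio_at(timestamp)       # audio from area microphones
        video = render_intel_video(frames, audio, records)  # Animoto-style engine
        dispatch_to_nearest_unit(video)        # near-real-time delivery

process_frames(frames=[b"still-image-bytes"], timestamp=1228348800)
```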

By the way, Animoto uses Amazon Web Services with RightScale to provide its service.

Monday, December 1, 2008

World Summit of Cloud Computing 2008

Video by Animoto using cloud computing technology. (Done in 20 minutes, for free!)