Wednesday, December 31, 2008

December NCOIC Plenary Presentations

Presentations from the NCOIC Cloud Computing sessions held earlier this month have been posted on-line in the Federal Cloud Computing wiki. The event featured speakers from IBM, Cisco, Microsoft, HP, and Salesforce. Access is free, but registration is required.


Cloud Computing for Net-Centric Operations
  • NCOIC - Bob Marcus (Leader, NCOIC Cloud Computing Team) - "Cloud Computing for Net-Centric Operations" (Slides)
  • HP - Shawn Sami (Chief Technologist, HP Software Federal) - "Secure Cloud Computing and RACE" (Slides)
  • Salesforce.com - Peter Coffee (Director of Platform Research) - "Building Mission-Critical SaaS Applications" (Slides)
  • IBM - David Lindquist (Cloud Computing Chief Architect and IBM Fellow) - "IBM's Perspective on Cloud Computing" (Slides)
  • Cisco - Glenn Dasmalchi (Chief of Staff, Cisco CTO Office) and James Urquhart (Marketing Manager for Cloud Computing) - "Cloud Computing: Trends and Opportunity" (Slides)
  • Microsoft - Stan Freck (Director, Software + Services – US Public Sector) - "Software + Services – the next generation of computing" (Slides)

    NCOIC Cloud Computing Ad Hoc Team Sessions (8:00 - 5:00)

    Morning: Enterprise Cloud Computing, Standards, and Open Source
  • NCOIC - Bob Marcus - "Open Questions for Enterprise Cloud Computing" (Slides)
  • Dataline - Kevin Jackson - "Report on Enterprise Cloud Computing Session at the World Summit of Cloud Computing" (Slides)
  • Elastra - Stuart Charlton, Cloud middleware using markup languages (ECML, EDML) (Slides)
  • Eucalyptus - Sunil Soman, Open source infrastructure similar to Amazon - "Eucalyptus Open Source Cloud Computing System" (Slides)
  • 3Tera - Bert Armijo, Cloudware architecture - "Clouds Don't Have Boundaries" (Slides)
  • Sun - Joseph Mangin - "Cloud Computing" (Slides)
  • Open Cloud Consortium - Robert Grossman - "The Open Cloud Consortium" (Slides)
  • Open Grid Forum - Craig Lee - "Cloud, Standards, and Net-Centric Operations from an OGF Perspective" (Slides)

Afternoon: Cloud Computing for Tactical Networks

  • DoD OSD NII - John Daly - "Cloud Computing for Tactical Networks" (Slides)
  • Dataline - Kevin Jackson - "SOA-R and Cloud Computing in the US Government" (Slides)
  • DARPA - James Snyder (discussed at the NCOIC Plenary Panel) - "Disruption Tolerant Networking Program" (Slides)

Tuesday, December 30, 2008

Booz|Allen|Hamilton Launches "Government Cloud Computing Community"


As a follow-up to a Washington, DC Executive Summit event, Booz Allen Hamilton recently launched an on-line government cloud computing collaboration environment. In an effort to expand the current dialog around government cloud computing, the strategy and technology consulting firm wants to build a community "to exchange ideas, share lessons learned, and generally support the government’s progress in leveraging Cloud Computing." The current topics listed on the site are themselves very interesting and include:
  • Cloud Computing Case Studies;
  • Cloud Economic Models;
  • Enterprise Architecture; and
  • Cloud Computing War Game
Welcome aboard, BAH! I expect that the mere presence of your site will heighten interest in cloud computing within the Federal marketplace.

For additional information or to join the community, send an email to cloudcomputing@bah.com

Monday, December 29, 2008

Is Google Losing Documents?

John Dvorak posted this question on his blog Saturday and as of Sunday evening had 52 responses! This is not a good thing for building confidence in cloud computing. Or is it?

The story is that users of Google Docs were receiving the message “Sorry! We are experiencing technical difficulties and cannot show all of your documents.” Apparently the documents were restored by Saturday evening, but this incident just reinforces two points:

  • Just like any other human enterprise, cloud computing isn't infallible; and
  • The Google cloud was apparently able to restore all the documents.
In the thread it seems that Google Docs was down for about one day. Not knowing exactly what happened, that recovery seems to have been as good as any other global enterprise could have managed.

So what's the beef with cloud computing? I don't see any. I do, however, see a problem with the document recovery delay. The issue is that a cloud service should be designed in a way that makes such failures invisible to the users.

This is why I'm a fan of cryptographic data splitting combined with geographic dispersal. With this approach, even after a complete failure in one geographic location, data can be reconstituted and served immediately from the remaining storage locations without loss of service. This type of service level cannot be provided by current RAID solutions.

Friday, December 26, 2008

Cryptographic Data Splitting? What's that?

Cryptographic data splitting is a new approach to securing information. The process encrypts data and then distributes it, randomly or deterministically, across multiple shares. The distribution can also incorporate fault-tolerance bits, key splitting, authentication, integrity checks, share reassembly, key restoration and decryption.

Most security schemes have one or more of the following drawbacks:
  • Log-in and password access often does not provide adequate security.
  • Public-key cryptographic systems rely on the user for security.
  • Private keys stored on a hard drive are accessible to others or through the Internet.
  • Private keys stored on a computer system configured with an archiving or backup system can result in copies of the private key traveling through multiple computer storage devices or other systems.
  • Biometric cryptographic systems are vulnerable to loss of or damage to the smartcard or portable computing device.
  • A malicious person could steal a mobile user's smartcard or portable computing device and use it to effectively steal the mobile user's digital credentials.
  • The computing device's connection to the Internet may provide access to the file where the biometric information is stored, making it susceptible to compromise through user inattentiveness to security or malicious intruders.
  • There is a single physical location toward which to focus an attack.

Cryptographic data splitting has multiple advantages over current, widely used security approaches because:

  • Enhanced security from moving shares of the data to different locations on one or more data repositories or storage devices (different logical, physical or geographical locations).
  • Shares of data can be split physically and placed under the control of different personnel, reducing the possibility of compromising the data.
  • A rigorous combination of the steps is used to secure data, providing a comprehensive process for maintaining the security of sensitive data.
  • Data is encrypted with a secure key and split into one or more shares.
  • There is no single physical location toward which to focus an attack.

Because of these and other advantages, this approach seems to be a natural for cloud computing.
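
To make the splitting idea concrete, here is a minimal Python sketch: XOR-based n-of-n secret sharing plus simulated geographic dispersal. This is a toy illustration only, not any vendor's patented implementation, and the function and location names are my own.

    import os

    def split_secret(data, n_shares=3):
        """Split data into n XOR shares. All n shares are required to
        rebuild the data; any subset of fewer shares is statistically
        indistinguishable from random noise."""
        if n_shares < 2:
            raise ValueError("need at least two shares")
        shares = [os.urandom(len(data)) for _ in range(n_shares - 1)]
        last = bytearray(data)
        for share in shares:
            for i, b in enumerate(share):
                last[i] ^= b
        return shares + [bytes(last)]

    def recover_secret(shares):
        """XOR all shares back together to reconstitute the original data."""
        out = bytearray(len(shares[0]))
        for share in shares:
            for i, b in enumerate(share):
                out[i] ^= b
        return bytes(out)

    if __name__ == "__main__":
        document = b"Sensitive agency records"
        shares = split_secret(document, n_shares=3)

        # Simulated geographic dispersal: one share per storage location.
        locations = {"us-east": shares[0], "us-west": shares[1], "eu-west": shares[2]}

        # Compromising any single location yields only random-looking bytes;
        # gathering the shares from every location restores the document.
        assert recover_secret(list(locations.values())) == document
        print("document reconstituted from dispersed shares")

Note that this toy n-of-n scheme still needs every share to rebuild the data; a real deployment would layer encryption, key splitting and fault-tolerant (erasure-coded or threshold) shares on top, so that the data also survives the loss of an entire location.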

Tuesday, December 23, 2008

Now really. Should the Obama administration use cloud computing?

It's amazing what a little radio time will do!

Since Sunday's broadcast, I've been asked numerous times about my real answer to the question "Will 'Cloud Computing' Work In White House". Although I would never presume to be in a position to advise the President-elect, I'm more than happy to add my voice to the Center for Strategic and International Studies (CSIS) and the distinguished list of contributors that recently released the report of the CSIS Commission on Cybersecurity for the 44th Presidency.

I truly believe that cloud computing technology can be used to implement some of their recommendations. One in particular is their recommendation for a National Office for Cyberspace (NOC) and a new National Security Council Cybersecurity Directorate (NSCCD). Along with the relevant agencies, these organizations would:

"Assume expanded authorities, including revised Federal Information Security management Act (FISMA) authorities, oversight of the Trusted Internet Connections (TIC) initiative, responsibility for the Federal Desktop Core Configuration (FDCC) and acquisition reform, and the ability to require agencies to submit budget proposals relating to cyberspace to receive its approval prior to submission to OMB."



As widely discussed in cloud computing circles, Infrastructure-as-a-service (IaaS), platform-as-a-service (PaaS) and software-as-a-service (SaaS) are all the required components for desktop-as-a-service (DaaS). If applied to a private government cloud, this approach could be easily adopted for the Federal Desktop Core Configuration (FDCC). (Thanks goes to Avner Algom of the Israeli Association of Grid Technologies for this insightful graphic)



As I discussed on the NPR program, cryptographic data splitting could also aid in the management and protection of information in the cloud. As proposed in the CSIS report, the NOC and NSCCD would:



"Manage both a new federated regulatory approach for critical cyber infrastructure and a collaborative cybersecurity network across the federal government"



This would be akin to a "Federated Service Oriented Architecture" where a governance and security layer would be used to simultaneously improve cross-agency collaboration and inter-agency security. Couldn't this actually be the basis for a governmental private cloud? By developing and implementing appropriate standards and protocols for the government-wide, federated SOA layer, the NOC and NSCCD could quickly implement the suggested federated regulatory approach.



As emphasized repeatedly in the CSIS report, cyberspace is a vital asset for the nation. International engagement to establish international norms for cyberspace security is also stressed. What better way to set these norms than to work diligently toward establishing a global, interoperable, secure cloud computing infrastructure?

Sunday, December 21, 2008

NPR "All Things Considered" considers Government Cloud Computing


My personal thanks to Andrea Seabrook, Petra Mayer and National Public Radio for their report "Will 'Cloud Computing' Work In White House?" on today's "All Things Considered". When I started this blog there was doubt about cloud computing being anything but a fad that would just disappear in a few weeks. Now it's clear that an important dialog is underway on the merits of this new approach for the Federal government.

I look forward to continuing the dialog and as always welcome your comments.

Thursday, December 18, 2008

HP Brings EDS Division into its cloud plans

The Street reported earlier this week that Hewlett-Packard's EDS division has won a $111 million contract with the Department of Defense (DoD) that could eventually support the U.S. military's cloud-computing efforts. EDS confirmed it will work with DISA officials to conduct security reviews on DoD networks, databases, systems and applications. It will also evaluate DoD security policies, which could further boost the department's cloud strategy. Even though security is universally seen as a key barrier to cloud adoption, this move seems to indicate a willingness on DISA's part to work through the issues and support eventual widespread cloud deployments. Since EDS also holds the Navy-Marine Corps Intranet (NMCI) contract through 2010, the hard-won experience that EDS has gained from that painful chapter could possibly be leveraged to accelerate cloud services to the desktop. It could also give HP a leg up on the upcoming Navy Consolidated Afloat Networks and Enterprise Services (CANES) and Next Generation Enterprise Networks (NGEN) competitive bids.

Wednesday, December 17, 2008

Cloud Computing and the Process Integration Era

The Industry Advisory Council (IAC) is a non-profit, non-partisan organization dedicated to fostering improved communications and understanding between government and industry. Through its affiliation with the American Council for Technology (ACT), IAC provides a forum for industry to collaborate with and advise government executives on IT issues.


In fulfilling this role, the ACT-IAC Transition Study Group recently released a paper titled "Returning Innovation to the Federal Government with Information Technology". Since the Obama administration's stated goals include the "use [of] technology to create a more transparent and connected democracy" and the employment of technology "to solve our nation's most pressing problems", this group's recommendations should certainly be considered.


For this audience, their statements about information technology creating two major categories of performance breakthroughs in government bear attention. According to ACT-IAC, service-oriented architecture, software-as-a-service and cloud computing promise to create significant opportunities for reducing budgets, improving process quality, and relieving staffing needs.



"The government has thousands of systems that cannot work together and were never designed to do so. This is because components of processes were automated in 1990s era PC or client server technology. Today’s technologies such as service-oriented architecture constructs, use standards and end-to-end process integration to automate processes in a manner that reduces operating costs and errors. These technologies free up labor to focus on problem solving."

The Transition Study Group sees the appointment of the proposed national Chief Technology Officer as a key step towards realizing these and other improvements in our government. They also suggest that the CTO can provide the leadership required to orchestrate innovation within and across Federal agencies. A need for significant changes in federal IT investment processes is also highlighted.

First, I'd like to thank ACT-IAC and the Transition Study Group for the insight and thought-provoking recommendations offered in their paper. Second, I would like to ask our government decision makers to read this paper and to seriously consider the study group's views and recommendations.

Tuesday, December 16, 2008

The Tactical Cloud

When cloud computing first came into vogue, there was a rather serious discussion about the private cloud concept. The whole idea of cloud computing seemed to argue against implementing such a capability behind organizational walls. Although in some circles the idea of a "private cloud" is being subsumed by the more acceptable "enterprise cloud", last week's discussions at the Network Centric Operations Industry Consortium brought up a different cloud concept - the "tactical cloud".

Now before you shout foul, please hear me out.

First of all, the discussion was centered on how the US Department of Defense (DoD) could possibly leverage cloud computing technology. In the view of many, the development of a "private" or "enterprise" cloud for the DoD is a fait accompli. Besides, the DoD has multiple private internets (NIPRnet, SIPRnet, JWICS, etc.), so private clouds seem like an appropriate evolution. Enterprise clouds, however, seemed to overlook the need for military formations to simultaneously leverage this new IT approach and operate independently. Individual units could combine their IT infrastructure virtually using cloud computing concepts. One use case hypothesized the use of high-fidelity tactical simulations, run faster than real time, to help commanders better evaluate tactical options before committing to a course of action. This "tactical cloud" would also need to seamlessly reach back and interact with the DoD's "enterprise cloud". There could even be situations where the "tactical cloud" would link to a public cloud in order to access information or leverage an infrastructure-as-a-service offering. A naval formation seems to be the perfect environment for a tactical or "battlegroup cloud". Although each ship would normally operate its IT infrastructure independently, certain situations could be better served by linking all the resources into a virtual supercomputer.

Even more interesting is the fact that the conversations quickly started addressing the tactical needs of police, firefighters, medical professionals and homeland security organizations. If the DoD could improve its operations with a "tactical cloud", couldn't these other operating units benefit as well?

So tell me. Is there yet another cloud formation to consider?

Monday, December 15, 2008

"Cloud Musings" Now on SYS-CON Media "Cloud Computing Journal" !!

I'm happy to announce that a recent "Cloud Musings" article, "Commercial vs Federal Cloud Computing", has been reposted on SYS-CON Media's "Cloud Computing Journal".

Thank you SYS-CON for making selected "Cloud Musings" articles available to your audience. I am honored by your support and look forward to providing my personal insights to your readers.

SYS-CON Media, founded in 1994, is widely recognized in the Internet-technology and magazine publishing industries as the world's leading publisher of i-technology magazines, electronic newsletters, and accompanying i-technology breaking news, education and information Web portals. Cloud Computing Journal is their publication targeting the cloud computing community.

Friday, December 12, 2008

How to make clouds interoperable and standard !!

This has been a huge part of my life over the past few weeks! This is my personal view.

WARNING: DON'T EXPECT THE ANSWER TO BE FOUND BELOW !!!

There are three basic approaches to this issue being aggressively discussed right now:
  • “Standards Body” Approach
  • “Adopt Proven Technology” Approach
  • “Customer Driven” Approach
All three approaches have value and all three have their problems, including:
  • Global agreement on standards could be a long process
  • “Proven Technology” may mean “Proprietary Technology”
  • Multiple industry customers could result in industry-linked standards for the cloud

So what is an embryonic industry to do? A hybrid of course !! Options include, but are not limited to:

  • Release all details of a selected “Proven Technology” to industry and adopt it as the basis for an open standard
  • Embrace multiple standards, each set optimized for industry ROI models
  • “Interoperability Rating” issued after standards body technical review

(And we've only just begun)

Thursday, December 11, 2008

The Tension between Public and Private Clouds

Last week, during discussions on cloud interoperability and standards in Israel, I saw for the first time a real dichotomy in the value of public (external) and private (internal) clouds. This tension seemed to arise from the fact that a CIO considering moving applications to an outside cloud vendor would probably set the highest priority on legacy applications. The logic was that since it was more costly to maintain older applications internally, moving those applications to the cloud would represent a high-value option. This customer high-value option, however, seemed to present a worst-case success scenario for public cloud providers. Is this true?

The general lack of internal metering for applications would also make an Internal vs. External ROI business case a fairly difficult task. Could these tensions actually lead to different business models for internal and external cloud purveyors?

Wednesday, December 10, 2008

Cloud Computing for Continuity of Operations (COOP)

Recently, I've been focusing on cloud computing for COOP. The way I look at it, many government agencies are already using commercial shared facilities as COOP sites, and the cloud simply represents a virtual shared facility. Although there are still many security, privacy and policy issues that need to be addressed, it seems to me that cloud computing could still provide a cost-effective and efficient COOP capability for certain operational and mission subsets.

A major key to success would be in the identification of which non-critical applications and infrastructure resources an agency could migrate to a cloud platform. By definition, these would be resources the agency could go without for two days. In general, applications, storage and server resources related to ad-hoc projects, research and development efforts, and certain peak-time requirements could clearly be considered.

To ensure operational and contractual flexibility, only solutions that could work across multiple cloud infrastructures should be considered. Many commercial vendors can provide multi-cloud support for continuity of operations requirements.

After appropriate technical and cost/benefit trades, vendors could be selected and SLAs negotiated. Pending agency policy reviews and appropriate contracting vehicles, cloud-based COOP could then be put in place.

Is this a valid approach? Are there alternatives? As always, your suggestions and recommendations are welcomed.

Tuesday, December 9, 2008

NCOIC Plenary Session

Hopping a plane to the west coast today to attend the NCOIC Plenary in Costa Mesa, California. First day "Cloud Computing for Net-Centric Operations" agenda includes:

  • David Ryan, Chief Architect HP Federal Technology Solution Group-"Secure Cloud Computing"
  • Peter Coffee, Salesforce.com Director of Platform Research- "Building Mission-Critical SaaS Applications"
  • David Lindquist, IBM Cloud Computing Chief Architect and IBM Fellow - "IBM's Perspective on Cloud Computing"
  • Glenn Dasmalchi, Chief of Staff, Cisco CTO Office - "Cloud Computing: Trends and Opportunity"
  • Stan Freck, Microsoft Director, Software + Services – US Public Sector- "Software + Services – the next generation of computing"

The second day team sessions are on Enterprise Cloud Computing and Cloud Computing for Tactical Networks and Rapid Deployment. Briefings are expected from Stuart Charlton (Elastra), Daniel Nurmi (Eucalyptus), Bert Armijo (3Tera) and someone from Sun Microsystems. Carl Consumano, OSD NII, will also be presenting.

A planning discussion for a possible full-day Conference on "Cloud Computing for Tactical Networks and Rapid Deployment" in the DC area next year is also on the docket.

Dataline named "Top 100 Cloud Computing Company"

SYS-CON's Cloud Computing Journal included Dataline in its expanded list of the most active players in the cloud ecosystem. In adding Dataline to the "Top 100" list, Jeremy Geelan noted that "...the role this company fills as a mid-level Federal System Integrator is crucial to the adoption of these technologies by the public sector". In a related announcement, Dataline was also named one of the "Cloud Computing Technology Contributors of 2009".
Jeremy Geelan is Sr. Vice-President of SYS-CON Media & Events. He is Conference Chair of the International Cloud Computing Conference & Expo series and founder of Cloud Computing Journal. He is also executive producer and presenter of "Power Panels with Jeremy Geelan" on SYS-CON.TV.
Thank you Jeremy for including Dataline.

Monday, December 8, 2008

Autoscaling into the cloud- Good or Bad?

I always saw the ability to autoscale into a cloud infrastructure as a good thing. George Reese presented a differing view on the O'Reilly blog recently.

"Auto-scaling is the ability (with certain cloud infrastructure management tools like enStratus—in a limited beta through the end of the year) to add and remove capacity into a cloud infrastructure based on actual usage. No human intervention is necessary.

It sounds amazing—no more overloaded web sites. Just stick your site into the cloud and come what may! You just pay for what you use.

But I don't like auto-scaling."

While I agree with the need for an enterprise to do capacity planning, I think that the discussion goes far beyond an overloaded website. I believe that the real value of autoscaling lies in the support of a service-oriented architecture (SOA), especially when services are auto-discovered and workflows are created on the fly with mash-ups.
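
To illustrate what I mean by autoscaling, here is a minimal sketch of the control loop such a tool runs. It is illustrative only; the provider hooks are hypothetical stand-ins for whatever API a product like enStratus or EC2 would actually expose, and the thresholds are made up for the example.

    import random

    # Hypothetical provider hooks -- stand-ins for a real cloud API.
    class FakeCloudProvider:
        def __init__(self, instances=2):
            self.instances = instances

        def current_utilization(self):
            # A real implementation would aggregate monitoring data.
            return random.uniform(0.2, 0.95)

        def launch_instance(self):
            self.instances += 1

        def terminate_instance(self):
            self.instances -= 1

    TARGET_HIGH = 0.80   # add capacity above this utilization
    TARGET_LOW = 0.30    # shed capacity below this utilization
    MIN_INSTANCES, MAX_INSTANCES = 2, 20

    def autoscale_step(cloud):
        """One pass of the autoscaling loop: measure, then add or shed capacity."""
        load = cloud.current_utilization()
        if load > TARGET_HIGH and cloud.instances < MAX_INSTANCES:
            cloud.launch_instance()
            action = "scale out"
        elif load < TARGET_LOW and cloud.instances > MIN_INSTANCES:
            cloud.terminate_instance()
            action = "scale in"
        else:
            action = "hold"
        print(f"utilization={load:.2f} instances={cloud.instances} ({action})")

    if __name__ == "__main__":
        cloud = FakeCloudProvider()
        for _ in range(10):          # in practice this loop runs on a timer
            autoscale_step(cloud)

In an SOA setting the same loop would watch per-service demand rather than raw machine utilization, growing the footprint of whichever services a newly composed workflow suddenly needs.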

Thursday, December 4, 2008

Cloudera must be reading the script!

"Cloud computing leapt out as the most obvious way to address enterprise large data problems" - Ken Pierce, IT Specialist, DIA-DS/C4ISR

"We view Hadoop as the key enabler...[in] optimizing the [cloud infrastructure] platform to ingest and present information effectively in the petascale." - Robert Ames, Director & Deputy CTO, IBM Federal

Successful mission accomplishment in the DoD, DHS and Intelligence Communities revolves around their ability to process "Big Data". Hadoop is all about processing "Big Data".

The ability to process big data is crucial to mission accomplishment because this is the core technology for processing terabyte-sized datasets with on-line applications. This capability is also needed to enable low latency in automated decision tools. Since a typical software engineer has never used a thousand machines in parallel to process a petabyte of data, new software tools are critical to the successful implementation of solutions in this domain. That's where Hadoop comes in.

Apache Hadoop is a Java software framework that supports data intensive distributed applications. This open source implementation of Google's distributed file system and MapReduce technologies enables applications to work with thousands of nodes and petabytes of data. Cloudera was founded to provide enterprise-level support to users of Apache Hadoop. They have extensive experience and deep expertise in the commercial use of open source software and Hadoop.
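
For readers new to Hadoop, the programming model it implements is easy to sketch. Below is a toy word-count job written as Hadoop Streaming-style mapper and reducer scripts in Python; the file names and the streaming invocation mentioned afterward are illustrative, not a prescription.

    #!/usr/bin/env python
    # mapper.py -- the "map" step: read raw input lines, emit (word, 1) pairs.
    import sys

    for line in sys.stdin:
        for word in line.strip().split():
            print(f"{word}\t1")

    #!/usr/bin/env python
    # reducer.py -- the "reduce" step: Hadoop delivers the pairs sorted by key,
    # so counts for each word arrive together and can be summed in one pass.
    import sys

    current_word, count = None, 0
    for line in sys.stdin:
        word, value = line.rstrip("\n").split("\t")
        if word != current_word:
            if current_word is not None:
                print(f"{current_word}\t{count}")
            current_word, count = word, 0
        count += int(value)
    if current_word is not None:
        print(f"{current_word}\t{count}")

Run against a handful of files this is trivial; the point of Hadoop is that the same two scripts, submitted through the streaming jar (roughly: hadoop jar hadoop-streaming.jar -input ... -output ... -mapper mapper.py -reducer reducer.py), are farmed out across thousands of nodes and petabytes of data without the programmer managing the parallelism.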

During the Cloud Computing Summit, I met Christophe Bisciglia, a Google alumnus who recently founded Cloudera. During his time at Google, Christophe created and managed the Academic Cloud Computing Initiative. His success led to an extensive partnership with the National Science Foundation (NSF), which makes Google-hosted Hadoop clusters available for research and education worldwide. Our discussions quickly focused on how Hadoop made the automation of intelligence exploitation feasible.

I can't wait to see the fruit of this potential marriage.

Wednesday, December 3, 2008

Animoto = Automated Imagery PED

Over the past two days, I've spent quite a bit of time with Stevie Clifton, Co-founder & CTO of Animoto. Besides being one of the coolest things I've seen in years, Animoto is giving us a glimpse of automated imagery PED (Processing, Exploitation, Dissemination). First an introduction.

Animoto Productions, a self-described "bunch of techies and film/TV producers who decided to lock themselves in a room together and nerd out", has released a web application that automatically generates professionally produced videos. The site uses their patent-pending technology and high-end motion design to deliver fully customized orchestration of user-selected images and music. By using Cinematic Artificial Intelligence technology, "it analyzes and combines user-selected images and music with the same sophisticated post-production skills & techniques that are used in television and film." Their AWS spike story is now famous in the cloud computing community.

Now let's fast-forward 5, no, 2 years. A UAV is flying over the US southern border streaming live video to an intelligence center. Using frame-grabbing technology, it forwards a series of still images to the automated intelligence exploitation center. The face recognition program matches one of the images to a known smuggler, which kicks off an automatic query of NCIC and NLETS. Timestamp information is also used to create an audio track from the many high-fidelity microphones in the area. The audio, still frames and automatic query data are then sent to the Animoto engine, which uses the available meta-data to produce an intelligence video and transmits it, in near-real-time, to the nearest CBP unit for appropriate interdiction.

WOW!!

By the way, Animoto uses Amazon Web Services with RightScale to provide its service.

Sunday, November 30, 2008

2008 World Summit of Cloud Computing

After an uneventful trip, I'm now in Israel for the World Summit. With over 500 people expected to attend, it promises to be an exciting time. Unfortunately, I arrived too late to attend a reception by Google, but the first day's line-up is impressive! Speakers include:

  • Stevie Clifton, Co-Founder & CTO, Animoto - The power of cloud that enables a new business
  • Nir Antebi, Senior Software Engineer, Intel - How Intel's Cloud Computing System Accelerates Chip Design
  • Russ Daniels, Vice-President & CTO, HP Cloud Services Strategy - Designing the Cloud: Services that anticipate our needs
  • Dr. Owen O'Malley, Hadoop Architect and Apache VP for Hadoop, Yahoo! - Yahoo! and Cloud

I'll be blogging on the insights that are sure to come. Stay tuned.

Wednesday, November 26, 2008

NCOIC Cloud Working Group

The NCOIC will be holding a cloud computing working group on December 10th during its plenary session in Costa Mesa, CA. The session focus will be "Requirements for Enterprise Cloud Computing". The current agenda is:
  • 8:00 - 9:00 HP, David Ryan (Chief Architect, HP Federal Technology Solution Group)
    "Secure Cloud Computing"
  • 9:00 - 10:00 Salesforce.com, Peter Coffee (Director of Platform Research)
    "Building Mission-Critical SaaS Applications"
  • 10:00 - 11:00 IBM, David Lindquist (Cloud Computing chief architect and IBM Fellow)
    "IBM's Perspective on Cloud Computing"
  • 11:00 - 12:00 Cisco, Glenn Dasmalchi (Chief of Staff, Cisco CTO Office)
    "Cloud Computing: Trends and Opportunity"

A follow-up session is also planned for December 11th.

Tuesday, November 25, 2008

IBM Rating Clouds

According to Cloud Computing Journal and Red Herring, IBM will now rate other cloud providers. Using the "Resilient Cloud Validation" program, IBM will validate a provider's facilities, applications, data, staff, processes and business strategy in order to weed out untrustworthy providers.

"IBM announced that Allscripts, a leader in delivering innovation technologies that improve the health of patients and the bottom line of physicians and other healthcare organizations, is the first company to begin the certification process. The designation is expected to enable Allscripts to enhance the current online data backup service it provides to better serve the needs of the 150,000 physicians who use the company's electronic health records, e-prescribing and practice management solutions. Next Spring, Allscripts will release a new online backup service, powered by IBM, which will provide a simple, easy to deploy remote data protection service, helping to ensure that sensitive patient information and medical documentation will be encrypted, securely stored away from the customer location, and easily recovered at a moment's notice."

Monday, November 24, 2008

Cloud Computing vs. Cloud Services

In September, Frank Gens provided an excellent overview of the new "Cloud Computing Era". In preparing for an upcoming talk, I re-read the post and found myself appreciating it even more. His description of "cloud computing" and "cloud services" really highlights the difference between the commercial cloud computing market and the Federal cloud computing market.

(Paraphrased from Frank Gens' article)

When people talk about “cloud computing”, they are usually referring to things like software-as-a-service (SaaS) and storage or server capacity as a service. They may also talk about the many “non-IT” business and consumer services like shopping, banking, selling, collaborating, communicating and being entertained. In reality, these things represent an on-line delivery and consumption model for business and consumer services. These users are not explicitly buying “cloud computing”, but the “cloud services” that are enabled by cloud computing environments. Cloud computing is actually the emerging IT development, deployment and delivery model that enables real-time delivery of products, services and solutions over the Internet.

Federal government customers do use the Internet, but the vast majority of their business is done using private internets. In the DoD, for instance, we call these private networks NIPRnet, SIPRnet and JWICS. These customers are, however, very interested in learning about how emerging cloud computing models can be used within and between all of these networks.

The epiphany here is that, for the most part, the commercial cloud computing market is about making money providing cloud services while the Federal marketplace is about making money helping Federal customers design and build cloud computing infrastructures.

I may be oversimplifying this, but I welcome your thoughts.

Friday, November 21, 2008

Inaugural "Inside the Cloud" Survey

Appistry and CloudCamp recently released results from the first "Inside the Cloud" survey. Key takeaways were:
  • Amazon perceived as cloud leader, with twice as many votes as Google
  • Infrastructure providers seen as leading the innovation; with apps and business models tied for 2nd place
  • Security, reliability and fulfillment of the scalability promise are most likely to keep cloud developers up at night
  • Cloud most attractive when it is most likely to reduce costs and improve scalability
  • The best-suited app for the cloud is still anybody’s guess

Thursday, November 20, 2008

FIAC Presentation Mentions Cloud Computing

At the recent Federal Information Assurance Conference, Bob Gourley, CTO of Crucial Point LLC and former Defense Intelligence Agency CTO, provided his views on the state of Federal IT. His cloud computing related points were:
  • Cloud computing (use of computation services from “the grid”) will accelerate
  • Government IT powerhouses (like NSA, NGA, DIA, DISA, IMO) will deliver more and more capability to users via the grid
As an intelligence community meritorious service award winner and an InfoWorld 2007 "Top 25 Most Influential CTO" honoree, Gourley offers a presentation that makes for a very interesting read.

Wednesday, November 19, 2008

Sun Cloud Czar

Earlier this week it was announced that Sun Senior Vice President Dave Douglas has been appointed to lead the company's cloud computing efforts. A JDJ article also stated that, in addition to becoming Sun's Cloud Czar, he will lead Sun's efforts to capitalize on Network.com, the NetBeans developer platform, and the StarOffice portfolio. The unit will build upon Sun's existing online developer community to establish the company as a leader in cloud computing.

Tuesday, November 18, 2008

Enomaly: Startup of the Week

Congratulations to Enomaly and Reuven Cohen for being named Startup of the Week by InformationWeek !!! Reuven and I collaborate quite a bit, and his blog, Elastic Vapor, is a staple for anyone interested in the latest cloud computing news. As one of the CloudCamp originators, he was also a welcomed participant in last week's CloudCamp Federal.

If you missed his comments last week on CloudCamp Federal, you should really go and take a read.

"The spooks in the room also had an interesting take on things. The US is being beaten, and beaten badly by upstart cloud programs coming out of China and Russia and the level of red tape on the beltway was doing more harm then good. Also the concept of Russia being able take control of millions of zombie PC's at moment notice seem to be troubling. Another point of contention was that China has been able to create million server clouds with little or no competition from the US. On the flip side they also assured me that there is a lot more going on, but they couldn't talk about it. It was clear the use of distributed cloud technology represented one of the biggest opportunities within the military IT organizationsand the likelihood of some small cloud upstart or even Google or Amazon getting the job was slim."


Monday, November 17, 2008

Cloud Computing at DoD, DISA, DIA, CENTCOM and NCOIC

At CloudCamp last week, Military Information Technology distributed free copies of its November issue to all attendees. The issue contains a very informative article by Cheryl Gerber titled "Computing in the Clouds". The article is a MUST READ and was written from one-on-one interviews on cloud computing with:
  • Ken Pierce, IT Specialist with DIA-DS/C4ISR
  • Robert Ames, Director and Deputy Chief Technology Officer, IBM Federal
  • Colonel Joe Means, RACE Program Manager at DISA
  • Alfred Rivera, Director DISA Computing Services Directorate
  • Dave Jackson, CEO Cluster Resources
  • Mike Kochanik, CollabNet VP of Worldwide Market Development
  • Tim May, Apptis SVP Corporate Development
  • Phil Horovitz, Apptis CTO
  • Herb Kelsey, Managing Director, SBU Advisors
  • Bob Lozano, Appistry Founder and Chief Strategist
The big take-aways are as follows:

  • The Defense Information Systems Agency (DISA) is using cloud computing to streamline operational expenses while providing users with fast, customized self-service.
  • The Defense Intelligence Agency (DIA) is using cloud computing to meet increasing demands to process large data on networks more rapidly while realizing budgetary efficiency
  • Reliability enhancements from cloud computing helped the US Central Command (CENTCOM) mitigate the impact of an unplanned cessation of data transport.
  • Cloud computing is seen by the Department of Defense (DoD) and Intelligence Community (IC) as an obvious way to address enterprise large data problems.
  • The DOD and DISA are leveraging cloud computing technologies for a global application lifecycle management solution and software development collaboration
  • The Network-Centric Operations Industry Consortium (NCOIC), a consortium formed to support those who design and deliver systems for warriors, first-responders, and others that seek to maximize information age capabilities, has recently formed an Enterprise Cloud Computing group to address current mission requirements and to identify key areas of concern.

Wednesday, November 12, 2008

CloudCamp Federal was AWESOME !!

Just got home from CloudCamp Federal. What an event!! The over 100 attendees definitely made the statement that the Fed is interested in cloud computing. As expected, cloud security was a hot topic, but there were also sessions on cloud standards, cloud interoperability, Hadoop, and cloud computing education. Video to follow. Many thanks to the organizers and sponsors. Let's do it again !!

DISA taps CollabNet to manage DoD cloud software development

This week, the Defense Information Systems Agency (DISA) announced that it will be using CollabNet tools to manage application development for RACE. The Computerworld article said that DISA will use the SourceForge Enterprise ALM service to manage source code, releases and documents in software development projects. Rob Vietmeyer, DISA's Net-Centric Enterprise Services (NCES) Chief Engineer, said that the agency also plans to test CollabNet's Cubit tool set to manage the Rapid Access Computing Environment (RACE) cloud computing infrastructure. DISA looks to use Cubit to manage and distribute the image libraries in RACE so that images can automatically be moved between development, testing and operational environments.

CollabNet is a key technology in Dataline's SOA-R solution and will be discussing its RACE program activities at today's CloudCamp Federal.

Today is CloudCamp Federal !!

Today we kick off the first CloudCamp Federal. Reuven Cohen, Bob Lozano, Brand Niemann and over 150 other cloud computing enthusiasts are expected to attend. We also plan to do a CloudCamp first by podcasting from the event. More to follow !!

Tuesday, November 11, 2008

CloudCamp "Sold Out" !! More Tickets Added

We are happy to announce that CloudCamp Federal (aka DC), taking place tomorrow at Apptis headquarters in Chantilly, VA (directions), is sold out. But don't fret, we've added 25 more spots. So tell your friends and co-workers to register right away.

Also note: The directions on Google Maps are incorrect. Visit the Apptis website for accurate directions. More details at: www.cloudcamp.com/dc or download the CloudCamp Welcome Guide (pdf)

President-Elect Obama: Good for Cloud Computing

In his article, "What Does Obama Revolution Mean to Cloud Computing", Krishnan Subramanian lays out a rosy picture for cloud computing:
  • Protecting the openness of Internet: crucial for cloud computing innovation and ensuring vendor diversity.
  • Safeguarding our right to privacy: By safeguarding this right, Barack Obama's administration can help remove users' concerns about putting their data in the clouds.
  • Opening up government to its citizens and bringing the government into the 21st century: By moving important public government data to the Cloud, it is possible to provide a more transparent form of government.
  • Deploy next generation broadband and extend it to every single American including those who live in rural areas: Ubiquitous availability of broadband is required for the success of Cloud Computing.
  • Broad adoption of standards based health record systems: A healthcare SaaS system. Perfect for the cloud.
  • Climate friendly energy development and deployment: A lower cost of infrastructure means a much higher savings for the customers of Cloud Computing technologies.

If the President-elect follows through on these policies, the Federal marketplace will truly be the place for cloud vendors to thrive.

Monday, November 10, 2008

Only 9 tickets left for CloudCamp Federal !!

Only 9 tickets left from the original allotment of 150 for CloudCamp Federal this week. We're working to free up some more, but don't leave it to chance. Go to www.cloudcamp.com/dc to reserve your spot today !!

Valiant Angel - A Perfect PED Application for Cloud Computing

A few weeks ago in an article titled "Why the Cloud? Processing, Exploitation and Dissemination", I described two instances where the newspaper industry used cloud computing in order to process, exploit and disseminate information. In that article I postulated that this is one reason why the intelligence community is interested in cloud computing.

Well Signal Magazine has recently highlighted a specific program where the DoD could use a similar approach. In a project called Valiant Angel, the Joint Intelligence Laboratory, located at the U.S. Joint Forces Command in Norfolk, Virginia, is working on a way for commanders to handle massive amounts of full-motion video being provided from unmanned platforms.

In addition to storing, moving, communicating and accessing large amounts of data, Valiant Angel is also addressing full-motion video processing, exploitation and dissemination (PED). Col. Chuck Mehle II, USA, commander of the Joint Transformation Command for Intelligence (JTC-I), JFCOM, notes that this aspect of full-motion video management is important because currently different people have different meanings for PED.

“For some, PED might be sticking yellow stickies on a screen after the event happens, and then taking notes about it as the video is displayed to them. But Valiant Angel gives them the capability—the John Madden-type capability—to drop the icon onto that video frame. Let’s say there is an event such as an IED [improvised explosive device] placement or an IED detonation or defusing. Immediately across the Defense Department, intelligence enterprise alerts will go out to those people who subscribe to that type of event, that geographic region, that timeframe, you name it,” he explains.

Such a requirement is perfect for cloud computing. An ability to meet surge storage and computing requirements by securely scaling into an on-demand cloud computing infrastructure could actually make projects like Valiant Angel affordable. Since it would be costly and impractical to buy the equipment needed to meet all possible tactical situations, a platform with an inherent ability to expand into a commercially provided cloud infrastructure could be ideal. It's very difficult to predict real-time tactical processing, exploitation and dissemination requirements. Cloud computing approaches could definitely provide a better option.

Friday, November 7, 2008

CloudCamp Federal 2008 - Don't miss out !!!

Tickets are going fast for CloudCamp Federal 2008 on November 12th in Chantilly, Virginia !! Representatives from the following organizations are already registered to attend.


3TERA
Amazon Web Services (AWS)
AOL
Appistry
Apptis
Booz Allen Hamilton
Center for Information Policy and E-Government
CollabNet, Inc.
Computer Sciences Corporation
Data Domain, Inc.
Defense Information Systems Agency
Environmental Protection Agency
George Mason University
GIS Enterprise Solutions
Great-Circle Technologies, Inc.
IBM Corporation
KPMG LLP
Microsoft
MITRE
Northrop Grumman
Raytheon
Red Hat
RightScale
Salesforce.com
Security First Corporation
ServerVault
SI International
The Boeing Company
Verizon

Register today at www.cloudcamp.com/dc !!

Wednesday, November 5, 2008

Private Clouds

Yesterday in eWeek, Chris Preimesberger provided a very good read in "Why Private Cloud Computing is Beginning to Get Traction".

"Private cloud computing is a different take on the mainstream version, in that smaller, cloudlike IT systems within a firewall offer similar services, but to a closed internal network. This network may include corporate or division offices, other companies that are also business partners, raw-material suppliers, resellers, production-chain entities and other organizations intimately connected with a corporate mothership."

Tuesday, November 4, 2008

Important Cloud Computing Events

Mark your calendar for the following cloud computing events. These are specifically targeted to organizations looking to leverage cloud computing technologies and techniques in support of national security requirements.

Also, for my readers in the United States: please vote today!!

Monday, November 3, 2008

Forrester: Embrace Cloud Computing to Cut Costs

"Forrester Research advises CFOs to take a close look at cloud computing for messaging and collaboration and enterprise applications. The payoffs could be noticeable during the current economic downturn."

In a recent report, analyst Ted Schadler supports this advice with the following observations:
  • Cloud-based SaaS services enable businesses to get up and running "in a flash"
  • Shipping IT tasks to cloud computing specialists enables your staff to focus on more important business processes.
  • CFOs like the by-the-drink payment plans of cloud computing because they keep cash in the bank longer.

Friday, October 31, 2008

Government still wary of cloud computing

Federal News Radio interviewed Ron Markezich, a corporate vice president of Microsoft, Mike Bradshaw, president of Google Federal, and Michael Farber, a partner with Booz Allen, on the government's approach to cloud computing. Key comments:
  • Michael Farber - many agencies understand this approach, but few have figured out the best way.
  • Ron Markezich - Agencies need to understand how the cloud could fit into their architecture. The focus must be on the benefits.
  • Mike Bradshaw - The consumer model is less expensive than other models because you don't have to focus as much on the infrastructure.
You can download Jason Miller's report to hear the entire interview.
To learn more about cloud computing in the Federal space, sign up for CloudCamp Federal, Nov 12, 2008, 3-9 pm in Chantilly, VA.

Thursday, October 30, 2008

Microsoft Azure

With the announcement of Azure, Microsoft has finally made its cloud computing plans public. Maybe Larry Ellison is now ready to revise his opinion, huh? While this announcement is definitely a good thing, it also seems to be a defensive move on the software giant's part. In a USA Today article, Ray Valdes, an analyst at researcher Gartner, said that Microsoft's Web services strategy still isn't cohesive. It's "taking every major asset of intellectual property, and cloud-enabling it to some degree."

Key components of the Azure Services Platform include:

  • Windows Azure for service hosting and management, low-level scalable storage, computation and networking;
  • Microsoft SQL Services for a wide range of database services and reporting;
  • Microsoft .NET Services which are service-based implementations of familiar .NET Framework concepts such as workflow and access control;
  • Live Services for a consistent way for users to store, share and synchronize documents, photos, files and information across their PCs, phones, PC applications and Web sites;
  • Microsoft SharePoint Services and Microsoft Dynamics CRM Services for business content, collaboration and rapid solution development in the cloud.

When this is coupled with Amazon's October 1st announcement to host Microsoft solutions on its EC2 platform, one can only wonder how long it will take before agencies just stop buying licenses and paying for maintenance on these products.

Wednesday, October 29, 2008

Federal Grants from the Cloud

In case you missed it, the Department of the Interior has announced that it plans to build a cloud computing platform to manage the processing and distribution of government grants.

"Grants.gov is re-aligning its business efforts to allow it to focus principally on its core business
This means that Grants.gov will no longer be in the ownership and management of IT
As a consequence, Grants.gov anticipates pursuing the acquisition of a "cloud computing" environment to include but not limited to service-as-a-service (SaaS), platform-as-a-service (PaaS) capability to fulfill its mission needs."

In a requirement description, the agency also clearly outlined its requirements.

"From a mission perspective, a cloud computing environment possessing the follow capabilities would be considered as a viable candidate to establish a relationship with Grants.gov:

  • An established capability (technology and staff) to develop (with business rules), test, deploy, host, manage, and maintain forms on a single integrated technological environment (minimizing development & deployment costs, & allowing for rapid forms deployment)
  • Delivers a compelling user experience
  • Built-in Scalability (up and down on demand), Reliability, and Security
  • Built-in integration with web services and databases (maximum leveraging of existing software & third-party web services)
  • Supports applicant collaboration (particularly in fellowship & complex/multi-project settings to meet applicant and grantor expectations)
  • Deep application instrumentation (to allow for highly granular analysis of user activities to enable future cost recovery models for grantors based on system usage vice flat or subscription fees)
  • Supports existing grantor and applicant system-to-system connectivity within the federal grants community
  • Ability to advise Grants.gov on forms development & management oversight to minimize duplication"

Budget pressures are sure to make this route a popular option for many agencies.

Tuesday, October 28, 2008

Economist.com : Let it rise

This week, The Economist provides an insightful special report on cloud computing.


"Computing is fast becoming a “cloud”—a collection of disembodied services accessible from anywhere and detached from the underlying hardware. The chances are that much of business and everyday computing will one day be mediated by this ethereal cloud."

I highly recommend the report, as it presents an excellent case for the disruptive nature of cloud computing.

"In the years to come companies are likely to venture much farther. For one, operators of computing clouds such as Amazon and Google have shown that this is a far more efficient way of running IT systems. Secondly, many firms will find they have no choice. The way in which their IT infrastructure has grown is proving unsustainable. Most corporate data centres today are complex warrens of underused hardware that require more and more people, space and power to keep them going. The current economic malaise will increase the pressure on companies to become more efficient. More has to be done with less, which is cloud computing’s main promise."



Monday, October 27, 2008

Some More Cloud Computing Survey Results

As promised, here are some more results from the MIT/"Cloud Musings" on-line survey!

Please remember, THIS IS NOT A SCIENTIFIC SURVEY !! The purpose is only to get a sense of the government cloud computing marketplace.

Total responses - 121

Type of Respondent Organization
Federal Government - 41%
Industry - 37%
State/Local Government - 20%
Educational Institution - 2%

Familiarity with Cloud Computing
Somewhat familiar - 63%
Not at all - 21%
Very familiar - 17%

Geographic Distribution
North East - 41%
North Central - 17%
West - 15%
Southeast - 9%
South Central - 9%
OCONUS - 9%

Challenges to address with Cloud Computing
Capital Budget Limitations - 24%
Storage Limitations - 15%
Event driven information requirements - 13%
Ubiquitous information access - 13%
Composite application requirements (Mash-ups) - 10%
Data center limitations-  9%
Operational spikes exceed IT infrastructure capacity - 9%
Other - 7%

Main Concern
Security - 54%
Finance/Budget - 12%
Unfamiliar with technology - 12%
Access - 8%
Adoption by organization - 8%
Contracting vehicle - 4%
Use with existing grid - 4%

This is just a snapshot, but there are a few take-aways:
  • The federal government is definitely interested in cloud computing technology. State agencies are exploring the possibilities as well
  • 21% of the respondents knew nothing about cloud computing.
  • Respondents were concentrated in the northeast US.
Capital budget limitations are the leading driver of interest in cloud computing
  • Security is the main concern

Friday, October 24, 2008

Steve Ballmer comments on Microsoft's cloud plans

On October 17th in "Redmond Channel Partner Online", a Microsoft partner community publication, Kurt Mackie reported on Microsoft CEO Steve Ballmer's comments on the company's vision for syncing up applications in the Internet cloud. Although Ballmer held back many of the details, he did say that Microsoft is planning to release big news on this topic at its Professional Developers Conference, which is scheduled to take place on Oct. 27 in Los Angeles.

According to Ballmer, Microsoft's applications will not have to be rewritten to run in the cloud, "but we'll encourage developers to do something to make apps more manageable at a higher level."

"It's a transformation for our business, but I don't think we'll be buying all of the world's data servers," he said. Instead, Microsoft will have to "service-enable our partners and customers."

Ballmer also denied that Google was Microsoft's most significant competition. He said that Microsoft's main challenges are the open source business model and getting good at advertising for the consumer market.

Thursday, October 23, 2008

Why the Cloud? Processing, Exploitation and Dissemination

So why is the intelligence community so interested in cloud computing? Three letters: PED (Processing, Exploitation, Dissemination). Take these two real life examples from the publishing industry.

Jim Staten of Forrester Research provided an example of how the New York Times leveraged the cloud. The Times wanted to make its historic archives available for online access. They needed to process 11 million articles and turn them into .pdf files. Initial estimates outlined that hundreds of servers and about 4 TB of storage would be necessary. The IT organization at the Times estimated a months-long delay before beginning, the need for a significant budget, and highlighted the difficulty of locating the computing resources. The project manager gave Amazon Web Services a try, kicking off 100 EC2 instances and 4 terabytes of S3 storage. The job was finished the next day at a total cost of $240.

Another hard example comes from the Washington Post. Peter Harkins, a Senior Engineer at The Washington Post, used the Amazon Elastic Compute Cloud (Amazon EC2) to launch 200 server instances to process 17,481 pages of non-searchable PDF images into a searchable online library. With a processing speed of approximately 60 seconds per page, the job was completed within nine hours, and web portal access was provided to the public 26 hours later. Harkins ruminates, “EC2 made it possible for this project to happen at the speed of breaking news. I used 1,407 hours of virtual machine time for a final expense of $144.62.” The database of Hillary Clinton's 1993-2001 schedule is publicly available at: http://projects.washingtonpost.com/2008/clinton-schedule/.

Examples like these show how cloud computing techniques can be used to revolutionize PED processes. By increasing the use of automation and focusing analysts on higher-level exploitation tasks, near-real-time exploitation and dissemination of critical intelligence products could be enabled in the very near term.

Wednesday, October 22, 2008

World Summit of Cloud Computing: "Enterprise Cloud Computing" work group

To leverage attendees of the World Summit of Cloud Computing, a kick-off meeting of the "Enterprise Cloud Computing" work group will be held near Tel Aviv, Israel on December 3, 2008. The stated goal of this meeting will be to exchange information about best practices, use cases, standardizations, success stories, emerging technologies, and issues in Enterprise Cloud Computing.

This is clearly a grassroots attempt at creating a framework to facilitate the future global movement toward Cloud Computing. As put by Avner Algom of the Israeli Association of Grid Technologies:

"The rapid growth of this technology will require a trusted source of objective information to enable enterprises to understand the opportunities and risks in this area. An Enterprise Cloud Computing Group can play this role and provide value to both vendors and customers."

Bob Marcus of the Network Centric Operations Industry Consortium is supporting this effort by providing questions on key issues from many enterprise and government representatives. According to Mr. Marcus, current areas of interest to NCOIC members include:

  • Security of applications and data in public Clouds
  • Availability, risk management, and SLAs for public Clouds
  • Interoperability between public Clouds and enterprise systems
  • Management and governance of services across public Clouds and the enterprise
  • Best practices for migrating appropriate applications to Cloud environments
  • Use cases and patterns for Cloud deployments

According to Mr. Algom, he has already received intense interest from many groups. So far, he is expecting participation from the following organizations:

  • 451 Group
  • Forrester
  • BMC Software
  • Sun
  • Amazon
  • IBM
  • CohesiveFT

By the way, the summit has collected quite an impressive list of speakers, including:

  • Christophe Bisciglia, Senior Software Engineer, Google
  • Paul Strong, Distinguished Research Scientist, eBay
  • Simone Brunozzi, Web Services Evangelist, Amazon Web Services
  • Dr. Owen O'Malley, Hadoop Architect and Apache VP for Hadoop, Yahoo
  • Charles Brett, Principal Analyst, Forrester
  • Nati Shalom, Founder & CTO, GigaSpaces
  • Peter Nickolov, President, COO and CTO, 3Tera
  • Russ Daniels, Vice President & CTO, Cloud Services Strategy, HP
  • Jon Mountjoy, Developer Relations Manager, Salesforce.com
  • Richard Zippel, Vice President of Technology in the Chief Technologist's Office, Sun Microsystems

Tuesday, October 21, 2008

Cloud Package Management

In his post "Missing in the Cloud: package management", Dave Rosenberg highlights a critical issue in the adoption of cloud computing by government agencies.

"I dare say that a standard needs to be introduced--or at least a quasi-standard like we see for Linux with Yum, RPM and Synaptic (essentially flavors of the same ideal.)

Since Amazon doesn't currently offer this feature, I wonder what vendor will step in to fill this void. So far all the Cloud app guys have taken different approaches which will certainly introduce some additional complexity related to portability (which also needs to be standardized.)"


There was, in fact, quite a bit of feedback on this in the MIT Cloud Computing Survey.

Matthew Small from RightScale agrees and puts it this way:

"It's a lot of work. Our ServerTemplate model has abstracted the configuration of the server from the base image that must be launched in the host. This provides for interoperability on public and private clouds. My assumption is that eventually there will be a standard "cloud computing unit" of measurement, but every host and vendor now has their own way of doing things and I don't expect that to stop."

"An IT architect at a large IT services company" had the following opinion on interoperability between cloud and enterprise systems:

"The handwriting is on the wall: the cloud will win. The economics are absurdly on the side of the cloud. But as enterprise architects mull things over, they'll want some backup or an alternative in the case one of their cloud providers goes down. If the payroll system is down on Tuesday, who cares? But if it's down on Friday, the enterprise will have a revolt on its hands. (And payroll has long been outsourced.) Right now no two cloud offerings are alike, so anyone indulging in the cloud is instantly locked into a vendor. For the cloud to truly create commodity computing, there must be standards. Standards that are coordinated and define various levels of service and what the interfaces look like (why can't they appear as services?) etc. This will be a hideously complex undertaking but the market will force it so that service consumers will have choice. Otherwise there is no true competition. I see this taking years, and the market will in large measure determine whose approach defines the standard. (Remember ISO OSI? It was all the rage way back when, but TCP/IP buried it in the dust via sheer force of market presence.)"

Bob Marcus and the Network Centric Operations Industry Consortium (NCOIC) are currently addressing their member concerns through discussions around the following topics:

Standardizations Needed

  • APIs between Cloud layers (e.g. PaaS and IaaS)
  • Interoperability across Clouds
  • Interoperability between public Clouds and enterprise systems

Implementation Guidelines

  • Best practices for migrating appropriate applications to Cloud environments
  • Use cases and patterns for Cloud deployments
  • Organizational support within the enterprise for Cloud Computing

Robust Cloud Operations

  • Security of applications and data in public Clouds
  • Availability, risk management, and SLAs for public Clouds
  • Governance of services across public Clouds and the enterprise

The entire list of NCOIC questions is in the SOA-R wiki. If you are a member of NCOIC, please work with the newly formed Enterprise Cloud Computing Group to provide answers to these important concerns. If you're not a member, please provide your comments here or directly to Bob Marcus at robert.marcus@sri.com. (You could also consider joining the NCOIC.)

Monday, October 20, 2008

PlugIntoTheCloud.com

Information Week has just launched PlugIntoTheCloud.com as their cloud computing destination. In his Non Linear Thinking blog, Bill Martin calls it a movement aimed at "providing a source and forum for IT pros, and the general population, to understand what the cloud computing trend/phenomena means to them and their companies".

John Foley, InformationWeek Editor-at-Large, in his announcement of the site's launch, explained the importance of cloud computing this way:

"We created it to address the growing need among IT pros to better understand what this trend means to them and their companies. Just this week, Gartner listed cloud computing second, right behind virtualization, on its list of the top 10 strategic technologies for 2009.

Our research tells us that business technologists are intrigued by cloud computing, but not yet swayed. InformationWeek Analytics (our in-depth reports business) surveyed 456 business technology professionals to gauge their plans for cloud computing. Among the respondents, 20% were considering cloud services, while another 28% said they didn't know enough about them. In other words, nearly half are still mulling it over. Of the rest, 18% said they were already using cloud services and 34% had no interest."

My sincere thanks goes to Mr. Foley for including "Cloud Musings" on the site's "Favorite Blog" list. I am honored and look forward to continuing this discussion on the use of cloud computing technologies within the public sector.

Friday, October 17, 2008

Is the cloud computing hype bad?


"It’s too simplistic to say cloud hype is bad . If we are technically expert is might irritate us with its breadth and abstraction, but we are not the only audience.  Somehow the idea has to  cross the corridor into other business departments and that’s just as likely to be via a Business Week article or even (dare I say it?) an airline in-flight magazine.  Whether we like it or not, repeatedly promoting a basic collective term through broader media has a long history of overcoming corporate resistance and inertia in ways IT departments can’t do alone. ‘The cloud’ is a BIG idea, its a reasonable visual metaphor and most of all its not an acronym.  It may not be perfect, but if it captures the imaginations of a broader audience of decision makers we should cut it some slack.  IT must remember that even its biggest ideas compete for mind-share with other major strategic change and improvement options - like moving to the Chinese market, restructuring the finances, building a new headquarters, or re-branding."

Thursday, October 16, 2008

Stop the FUD (Fear, Uncertainty and Doubt) !!

Dan Morrill! Count me in!!

In his excellent article, "Cloud Computing is Scary - But the FUD Has to Stop", Dan makes some strong points:

  • It is time to start embracing where business is going and to make sure it gets there in the safest way possible.
  • Rather than creating their own FUD, security professionals need to work out ways to make cloud computing safer.
  • While we might struggle with new technology, it is time for information security folks to step up to the plate and get smart on how the technology works, what the risks are, and how those risks can be reasonably addressed by good security solutions.
  • There are some great resources for good information; Cloud Ave is one of them, and Trend Micro, IBM, Google, Amazon, Microsoft, Oracle and others have all figured out that this can be a very useful technology that helps companies expand and contract according to business needs and market conditions.
Hear! Hear!

Wednesday, October 15, 2008

IBM, Microsoft and Google

On October 6th, IBM launched its cloud services initiative. This is a:

 "[C]ompany-wide initiative that extends its traditional software delivery model toward a mix of on-premise and cloud computing applications with new software, services and technical resources for clients and Independent Software Vendors (ISVs). IBM's new cloud services can help businesses of all sizes more easily adopt cloud computing models to better manage data, lower operational costs and make collaboration easier."

Since IBM has partnered on cloud computing with Google, this commentary on Microsoft by Michael Vizard may be telling:

"Just about everything that Microsoft does and says about cloud computing comes across as fairly reactionary. Essentially, Microsoft has let Google set the tone for much of the last two years and every service that it rolls out is compared and contrasted to something Google already did...."

" ...Unfortunately, it looks like Microsoft is pretty much asleep at the wheel when it comes to cloud computing in the channel. Of course, one of these days Google is going to wake up to the same opportunity. And before Microsoft realizes, a large swath of the channel might find common cause with Google simply because it might actually be trying to make an effort."


But Google has issues as well. From the same commentary:

"Right now, it doesn’t look like Google even knows how to spell the word channel. But that’s not likely to stay the case forever. In the meantime, Microsoft is pretty much ignoring a coalition of the willing at its own peril."


Microsoft clearly has a chance to leverage its dominance in the application space by helping its partners port millions of applications to the cloud using the Microsoft Cloud OS. But will it move before Google wakes up and smells the channel?


When it comes to channels, IBM also knows a few things, so this will indeed be interesting.