"Cloud computing leapt out as the most obvious way to address enterprise large data problems" - Ken Pierce, IT Specialist, DIA-DS/C4ISR
"We view Hadoop as the key enabler...[in] optimizing the [cloud infrastructure] platform to ingest and present information effectively in the petascale." - Robert Ames, Director & Deputy CTO, IBM Federal
Successful mission accomplishment in the DoD, DHS, and Intelligence Communities revolves around the ability to process "Big Data". Hadoop is all about processing "Big Data".
The ability to process big data is crucial to mission accomplishment because it underpins the processing of terabyte-sized datasets by online applications. The same capability is needed to enable low-latency automated decision tools. Since the typical software engineer has never used a thousand machines in parallel to process a petabyte of data, new software tools are critical to the successful implementation of solutions in this domain. That's where Hadoop comes in.
Apache Hadoop is a Java software framework that supports data-intensive distributed applications. This open source implementation of Google's distributed file system and MapReduce technologies enables applications to work with thousands of nodes and petabytes of data. Cloudera was founded to provide enterprise-level support to users of Apache Hadoop; the company has extensive experience and deep expertise in the commercial use of open source software and Hadoop.
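To make the MapReduce idea concrete, here is a minimal sketch of the canonical "word count" job written against the Hadoop Java API. The map function emits a count of one for every word it sees, and the reduce function sums those counts per word; Hadoop itself handles splitting the input, scheduling tasks across the cluster, and shuffling intermediate results. The input and output paths here are hypothetical command-line arguments, not anything from a specific deployment.

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Map phase: emit (word, 1) for every token in this node's input split.
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {

    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // Reduce phase: sum the counts for each word across all mappers.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {

    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class); // local pre-aggregation before the shuffle
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    // Hypothetical paths: args[0] is the HDFS input directory, args[1] the output directory.
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

The appeal for petabyte-scale work is that this same jar runs unchanged on a single workstation or a thousand-node cluster; the programmer writes only the map and reduce logic, and the framework supplies the parallelism.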
During the Cloud Computing Summit, I met Christophe Bisciglia, a Google alumnus who recently founded Cloudera. During his time at Google, Christophe created and managed the Academic Cloud Computing Initiative. His success led to an extensive partnership with the National Science Foundation (NSF) which makes Google-hosted Hadoop clusters available for research and education worldwide. Our discussions quickly focused on how Hadoop makes the automation of intelligence exploitation feasible.
I can't wait to see the fruit of this potential marriage.