This got me thinking. Last week the President of the United States started following me on Twitter. Now I realize that it’s not really President Obama on the other side of that virtual table, but the event brought to mind the Six Degrees of Separation concept.
In case you don’t remember, the theory states that everyone is six or fewer steps away, by way of introduction, from any other person in the world. That random thought led me to wonder whether all this social media has changed the math so much that even I can be directly connected to the leader of the free world. A quick internet search revealed that the number of links between two random people has indeed shrunk:
- Facebook puts it at 4.74 degrees of separation: Scientists at Facebook and the University of Milan reported that the average number of acquaintances separating any two people in the world was not six but 4.74. The experiment took a month and involved all of Facebook’s 721 million users.
- Twitter shows it’s 4.67 steps: In a study of 5.2 million Twitter friendships (friend and follower relationships), Sysomos found the most common friendship distance is five steps (the average distance is 4.67 steps). The second most common friendship distance is four steps.
- LinkedIn is built on 3 degrees of separation: Although LinkedIn has yet to conduct a study on the scale of Facebook’s or Twitter’s, LinkedIn, perhaps more than any other social network, was built around the idea of degrees of separation. In LinkedIn’s case, the 3 degrees are:
- You already know them
- You know someone that knows them
- You know someone that knows someone that knows them
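The “degrees of separation” these studies measure is just the shortest-path distance in a friendship graph, which you can compute with a breadth-first search. Here is a minimal sketch on a toy graph; the names and connections are hypothetical, purely for illustration:

```python
from collections import deque

# A tiny, made-up friendship graph (each key lists that person's friends)
friends = {
    "you":   ["alice", "bob"],
    "alice": ["you", "carol"],
    "bob":   ["you", "dave"],
    "carol": ["alice", "potus"],
    "dave":  ["bob"],
    "potus": ["carol"],
}

def degrees_of_separation(graph, start, target):
    """Breadth-first search: hops along the shortest path, or None."""
    seen = {start}
    queue = deque([(start, 0)])
    while queue:
        person, dist = queue.popleft()
        if person == target:
            return dist
        for friend in graph[person]:
            if friend not in seen:
                seen.add(friend)
                queue.append((friend, dist + 1))
    return None  # no path: the two people are not connected at all

print(degrees_of_separation(friends, "you", "potus"))  # → 3
```

The Facebook and Twitter numbers above are simply this calculation averaged over billions of such pairs.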
Apart from the gee-whiz factor of being closer to the President, this also means that using Twitter can get you that much closer to a new idea, a new opportunity or a new lease on life.
Since Twitter is enabled by cloud computing, what else has this modern miracle made possible? One benefit is that cloud-enabled search algorithms have freed us from the hidden biases built into the world’s relational databases. Since the 1980s, data on just about everything has been managed, processed and modeled using relational databases, which were in turn built on the assumption that humans were smart enough to know in advance how that data would be used. Those assumptions were ingrained in the database design, schema and tables, and the reliance on structured data types was limiting in its own right.
Today, processes built on top of unstructured data types, using technologies like MapReduce, have virtually eliminated the need for a priori knowledge about how data can or will be used. As shown by companies like Google and Simudyne, this has heralded a revolution in information access and process simulation. The automation that cloud computing brings to acquiring information technology resources has also changed the world’s economic model: the barrier of raising capital that has stopped many new ideas has been virtually driven out of existence. Today, the only thing between an idea and a new unicorn business is a little time and a credit card.
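The reason MapReduce needs no a priori schema is that the “map” step simply emits key–value pairs from raw records, and the “reduce” step aggregates by key afterward. A minimal in-memory sketch of the idea, counting words across documents (real systems like Hadoop distribute these two phases across many machines):

```python
from itertools import groupby
from operator import itemgetter

def map_phase(docs):
    # Map: emit (word, 1) for every word -- no up-front schema required
    for doc in docs:
        for word in doc.lower().split():
            yield (word, 1)

def reduce_phase(pairs):
    # Shuffle: sort and group identical keys; Reduce: sum each group's counts
    counts = {}
    for key, group in groupby(sorted(pairs, key=itemgetter(0)), key=itemgetter(0)):
        counts[key] = sum(v for _, v in group)
    return counts

docs = ["the cloud changes everything", "the cloud scales"]
print(reduce_phase(map_phase(docs)))
# → {'changes': 1, 'cloud': 2, 'everything': 1, 'scales': 1, 'the': 2}
```

Notice that nothing about the eventual question (word frequency) was baked into how the documents were stored.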
This Simudyne video shows a simulation of an Iberian fuels value chain. Over a two-week period, engineers used operations data from 2012 to 2013 to build a concept model of the value chain. It’s based on Simudyne’s Providence platform, software that is being used to model financial flows across the whole of Britain’s banking system.
Speaking of time, cloud computing has altered that too. The parallel processing technologies pioneered by cloud computing have shortened everything from the time it takes to find a reference document to the time it takes to map the human genome.
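The core trick is embarrassingly simple: split a big job into chunks, run the chunks on separate processors at once, and combine the partial results. A minimal sketch using Python’s standard library, where the expensive per-record task is a stand-in for real work like aligning a genome read:

```python
from concurrent.futures import ProcessPoolExecutor

def count_hits(chunk):
    # Stand-in for an expensive per-record computation
    return sum(1 for x in chunk if x % 7 == 0)

def parallel_count(data, workers=4):
    # Split the data into chunks and fan them out across worker processes,
    # then combine the partial counts -- same answer as a serial loop
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(count_hits, chunks))

if __name__ == "__main__":
    data = list(range(1_000_000))
    print(parallel_count(data))
```

With four workers the wall-clock time for a genuinely expensive task approaches one quarter of the serial time; cloud platforms extend the same pattern to thousands of machines.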
Dr. Matt Huentelman of the Translational Genomics Research Institute (TGen) likens this form of high-performance computing to a time machine that gives him more time to do whatever is needed to help his cancer patients. To advance health through genomic sequencing and personalized medicine, TGen uses robust, scalable high-performance computing along with powerful big data analytics tools: a Dell | Hadoop platform and the Dell Statistica solution. The increased performance provided by a Dell HPC cluster, Dell PowerEdge M1000e blades, Dell PowerEdge M420 blade servers and Intel processors accelerates results, enabling researchers to expand treatments to a larger number of patients. In other industry verticals, time savings like this have been transformed into new business models and other important leading-edge discoveries.
So, in short, cloud computing has brought society and its ideas closer together, revolutionized the way we do business, and eliminated barriers caused by lack of money and lack of time. If that isn’t changing everything, what is?
© Copyright Kevin L. Jackson 2015