Tuesday, November 29, 2016

Europe: NCTA CloudMASTER® Hotspot

The ongoing digital transformation continues to generate steady demand for workers with increasingly sophisticated digital skills. This demand is multi-dimensional: workers with highly specialized skills are much sought after, and the European Commission estimates that the EU could face a shortage of around 800,000 information and communications technology (ICT) specialists by 2020. A third dimension is the growing need to reskill the existing workforce, especially in light of the Fourth Industrial Revolution and the incorporation of the Internet of Things (IoT) and cyber-physical systems into the industrial production process. The “smart factory” also opens up new possibilities for individualized, efficient customer care and smoother communication with suppliers across supply chain logistics, all built on cloud-based platforms and artificial intelligence.

According to the Jacques Delors Institute in Berlin, digital skills are now needed in almost all types of work. In a recent contribution to the debate on the European Union, the Institute examined the societal changes being driven by these broad shifts. To address the continent-wide impact of digital transformation, the think tank is proposing a Europe-wide strategy to reskill workers for the requirements of connected production.


The core mission of the Jacques Delors Institute is to produce analyses and policy proposals targeting European decision-makers and the wider public. The work of the Jacques Delors Institute is inspired by the action and ideas of Jacques Delors, and organized around three axes:
  • "European Union and citizens", which covers questions of policy, institutions and civil society, focusing in particular on the themes of participatory democracy, European institutions, European political parties and European identity.
  • "Competition, cooperation, solidarity", covering economic, social and regional issues with a specific focus on the European budget, intra-EU solidarity, agriculture, cohesion policy, economic governance, and energy policy.
  • "European external action", bringing together work with an international dimension, including EU-US relations, EU relations with neighbors, and extra-EU regional integration.


These changes, in essence, combine platform-based communication with cloud computing, improved sensor technology and the application of sophisticated algorithms to the large, unstructured pools of data those sensors generate. This combination makes it possible to link up an almost infinite number of interconnected physical objects. One of the main economic impacts is seen in the industrial production process and the emergence of the “smart factory,” which enables companies to manufacture individualized products to marketplace demand in real time.
In response to the need to reskill European workers for the requirements of connected production, opportunities to obtain NCTA CloudMASTER® training and certification in Europe are expanding rapidly. In fact, training programs for this cloud computing certification have recently been announced by:
  • New Horizons for delivery in Austria, Czech Republic, Germany, Ireland, Latvia, Romania, Slovakia, Spain, Switzerland, United Kingdom and Bulgaria;
  • Firebrand Training in the United Kingdom, Germany, Belgium, Austria and Luxemburg; and
  • Fast Lane in Spain, Italy and Germany.


CloudMASTER® training and certification comprises three courses with exams:
  • NCTA Cloud Technologies, which provides an overview of cloud computing models and the landscape of technologies used both in the cloud and by users of cloud services;
  • NCTA Cloud Operations, which covers the management of cloud operations and addresses application demand for compute power, CPU scaling, and both structured and unstructured storage requirements; and
  • NCTA Cloud Architecture, which includes hands-on experience with OpenShift, OpenStack, VMware, Amazon Web Services, Azure and Rackspace, and provides a framework for assessing application performance needs against the business requirements of Return on Investment (ROI), Total Cost of Ownership (TCO) and Key Performance Indicators (KPIs).

The list of training sites is growing fast, so if your country isn’t listed you probably won’t have long to wait. Visit the NCTA CloudMASTER® registration site for more information.

( This content is being syndicated through multiple channels. The opinions expressed are solely those of the author and do not represent the views of GovCloud Network, GovCloud Network Partners or any other corporation or organization.)




Cloud Musings
( Thank you. If you enjoyed this article, get free updates by email or RSS - © Copyright Kevin L. Jackson 2016)



Sunday, November 27, 2016

Smart Manufacturing Is Cloud Computing



As cloud computing simultaneously transforms multiple industries, many have wondered how this trend will affect manufacturing. Often characterized as “staid,” this vertical is rarely cited when leading-edge technological change is the topic. That view, however, misses the revolutionary nexus of cloud computing and the manufacturing industry: two cloud-driven concepts, the Digital Thread and the Digital Twin, are now driving this vertical’s future.

Digital Thread is a communication framework that connects traditionally siloed elements in manufacturing processes in order to provide an integrated view of an asset throughout the manufacturing lifecycle. Digital thread implementation also requires business processes that help weave data-driven decision management into the manufacturing culture.

A Digital Twin is a virtual representation of a manufacturer’s product, used in product design, simulation, monitoring, optimization and servicing. It is created in the same computer-aided design (CAD) and modeling software that designers and engineers use in the early stages of product development, but unlike an ordinary model it is retained for later stages of the product’s lifecycle, such as inspection and maintenance.
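As a rough sketch of the idea (the class and field names below are hypothetical, not drawn from any particular CAD or PLM product), a digital twin can be modeled as a virtual object that carries its design-time parameters and keeps absorbing field telemetry through the product's later lifecycle stages:

```python
from dataclasses import dataclass, field

@dataclass
class DigitalTwin:
    """A minimal virtual representation of one physical product unit."""
    serial_number: str
    design_params: dict                                  # captured from the CAD model at design time
    sensor_history: list = field(default_factory=list)   # readings from the fielded unit

    def ingest(self, reading: dict) -> None:
        """Record a telemetry reading reported by the physical asset."""
        self.sensor_history.append(reading)

    def needs_inspection(self, temp_limit_c: float = 90.0) -> bool:
        """Flag the unit if any recorded temperature exceeded its design limit."""
        return any(r.get("temp_c", 0) > temp_limit_c for r in self.sensor_history)

# A twin created at design time, then updated from field telemetry
twin = DigitalTwin("PUMP-0042", {"rated_rpm": 3000})
twin.ingest({"temp_c": 85.2, "rpm": 2950})
twin.ingest({"temp_c": 93.1, "rpm": 2990})
print(twin.needs_inspection())  # True: one reading exceeded the 90 °C limit
```

The key design point is the one the article makes: unlike a throwaway simulation model, the twin persists and accumulates data, so inspection and maintenance decisions can be made against the unit's actual history.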

Figure 1 – The smart manufacturing landscape (source: http://www.industryweek.com/systems-integration/journey-smart-manufacturing-revolutio)

When successfully combined, these processes can deliver on the promise of Smart Manufacturing, which includes:
·         Ability to receive published data from equipment using secure open standards, analyze and aggregate the data, and trigger process controls back to equipment, systems of record and process workflows across the enterprise and value chain connected via A2A and B2B open standards.
·         Autonomous and distributed decision support at the device, machine and factory level.
·         Ubiquitous use of mined information throughout the product value chain including end-to-end value chain visibility for each product line connecting manufacturer to customers and supplier network.
·         Enhanced information- and analytics-based decision making on large amounts of raw data gathered from the smart manufacturing equipment and processes.
·         New levels of efficiency to support new services and business models, including mass customization (highly configured products) and product-as-a-service; and
·         Provide a broad portfolio of these advanced capabilities to manufacturers of all sizes and in all industry sectors, at acceptable levels of cost and implementation complexity.

Although at first glance these goals seem overly ambitious, they are being realized today because technologies and integration standards have come together to fuel this revolution. Required building blocks include:
·         Smart machines and advanced robotics – These machines recognize product configurations and diagnostic information, and make decisions and solve problems without human intervention.
·         Industrial Internet of Things (IIoT) – Devices with network and internet connectivity that are active participants in event-driven, self-healing manufacturing processes integrated with open standards that support connectivity.
·         Cloud services – On-demand information technology services that can be rapidly provisioned and released with minimal management effort or service provider interaction.
·         Enterprise integration platforms - Platforms that have the ability to receive data broadcast from equipment via secure open standards. These applications analyze and aggregate the data, and trigger process controls, history recording, and work flows that enable business processes across value chain systems that can then be integrated via application-to-application (A2A) and business-to-business (B2B) open standards.

Digital Thread and Digital Twin also enable the evolution in manufacturing often referred to as Industry 4.0. This next phase increases manufacturing efficiencies while reducing both cost and time of delivery. Bringing together data, cloud computing and cyber-physical systems, it is the latest step in a progression that runs through:
·         Industrialization, where machines supported human work;
·         Optimization, where assembly lines increased productivity;
·         Automation, where machines largely replaced humans; and
·         Digitalization, where information technology becomes an integral part of manufacturing.
Cloud computing has extended many benefits to manufacturing because those businesses can now:
·         Rely more on standard cloud services, allowing them to focus on business-critical functions;
·         Reduce capital expenditures significantly;
·         Shed the burden of licensing, deploying, and maintaining baseline IT services like email, collaboration, unified communications, and human resource management; and
·         Enhance operational flexibility through rapid IT scalability.

In summary, the combination of digital thread, digital twin and cloud computing enables both smart manufacturing and Industry 4.0. If your company isn’t deeply leveraging all of these concepts today, it doesn’t yet understand modern manufacturing.

This post was brought to you by IBM Global Technology Services. For more content like this, visit Point B and Beyond.












Tuesday, November 22, 2016

George Youmans, Jr.: The CloudMASTER Fashionista!


So how could a NCTA Certified CloudMASTER accelerate his career in the fashion industry?

To answer that question, you would need to catch up with George Youmans, Jr. He has been with fashion giant Ralph Lauren since October 2012. That was around the time he decided to complete the NCTA CloudMASTER curriculum. After graduation George was first promoted to Senior IT Technician and then to Senior Technologist.



So why would a company like Ralph Lauren even need a cloud computing specialist? 

Ralph Lauren installed interactive window displays at London’s most up-market department store, Harrods. Shoppers could use their smartphones to activate an interactive map that led them directly to the Fashion Lab, where they could buy all of the items they saw on display. When the store was closed, users could still access information about the Ralph Lauren collection from the Harrods website.



Ralph Lauren also conducted a technology trial, embedding RFID tags in clothing so that items could be detected by the dressing-room mirror. Details about each item are displayed on the mirror (several languages are supported), and the system synchronizes with inventory and point-of-sale systems. The mirror can also mimic the lighting of various environments: options include white, dusk, club and aquarium, along with settings tailored to the Ralph Lauren brand such as "Fifth Avenue Daylight," "East Hampton Sunset" and "Evening at the Polo Bar".




The luxury fashion brand has also joined the race to produce fashionable products for the wearable-technology market. Its men-only PoloTech Shirt was designed to read vital signs such as heart rate and variability, breathing depth and recovery, intensity of movement, energy output, stress levels, steps taken and calories burned. Conductive threads are woven into the compression top, and a lightweight module snaps in around the left rib cage to relay information via Bluetooth. The companion smartphone app offers live fitness monitoring and workouts tailored to how your body is reacting. The shirt itself is 70% cotton, 21% nylon and, for elasticity, 9% spandex.


PoloTech Shirt

Sounds like a great place to accelerate a technology career doesn’t it!

A NCTA CloudMASTER® helps organizations transform customer experiences through:
  • Customer understanding;
  • Top-line growth; and
  • Customer touch points.
They help companies optimize internal processes through:
  • Process digitization;
  • Worker enablement; and
  • Performance management.
They can also transform a company’s core functions and activities through:
  • Digital modifications to the business;
  • Creation of new digital businesses; and
  • Digital globalization.

IT staff of the future need the skills of a businessperson to stay current, as their company's software requirements and the options for satisfying them will be deep, varied, and changing quickly.  The IT department five years from now will also need to keep pace with nearly constant change. The more complex and interconnected technology environments become, the more a general understanding and knowledge of how it all works together will be valued.

If you want to secure your IT future, become a CloudMASTER® today.









Sunday, November 20, 2016

Is Cloud Interoperability a Myth?

Photo credit: Shutterstock
As the industry matures, cloud computing will increasingly rely on interoperability in order to grow and deliver more value to industry. Assuming this is a fact, what does it mean when eighteen major OpenStack vendors come together to work through the challenges involved with achieving enterprise interoperability? Events at the OpenStack Summit in Barcelona helped provide a window into the promise of tomorrow's interoperable environment.
In cloud computing, interoperability generally refers to the ability of service models from different cloud service providers to work together. Specifically:
·         Infrastructure-as-a-service
o   Access Mechanism - defines how users and/or software developers may access a cloud service,
o   Virtual Resources - delivery of a service as a complete software stack installed as a virtual machine,
o   Network - addressing and API,
o   Storage - management and organization of storage,
o   Security - authentication, authorization, user accounts and encryption,
o   Service-Level Agreement - architecture format, monitoring,
·         Platform-as-a-Service
o   The exchange of data and services among different platforms hosted on different cloud infrastructures;
o   Data compatibility among different platforms,
o   Portability between platforms
o   Data transfer procedures (e.g. packing, copying, instantiating, installing, deployment and customization)
·         Software-as-a-Service
o   Interoperability among applications in the same cloud,
o   Data exchange and operation calls in applications on different cloud-computing environments
o   Software programs that are distributed in different cloud environments and integrate data and applications in cloud in a unified way, and
o   Migration of applications from one cloud environment to another
If this isn’t enough of a challenge, one would also need to specifically address the many embedded and overriding interoperability aspects, including:

·         Technical interoperability - development of standards of communication, transport and representation;
·         Semantic interoperability - the use of various different terms to describe similar concepts may cause problems in communication, execution of programs and data transfers;
·         Political/Human interoperability - the decision to make resources widely available has implications for organizations, their employees and end-users;
·         Interoperability of communities or societies - there is an increasing need to require access to information from a wide range of sources and communities; and
·         International interoperability - in international matters there are variations in standards, communication problems, language barriers, differences in communication styles, and a lack of a common basis.

As one may imagine, the rapid growth of cloud computing and the global proliferation of service providers have created a many-to-many interoperability quagmire that no single vendor can tame. Knowing this, the OpenStack Interop Challenge looks to cultivate success by leveraging open source cloud technology as a common integration layer. Participants include AT&T, Canonical, Cisco, DreamHost, Deutsche Telekom, Fujitsu, HPE, Huawei, IBM, Intel, Linaro, Mirantis, OSIC, OVH, Rackspace, Red Hat, SUSE and VMware. The goal was to publicly demonstrate how OpenStack delivers on the promise of interoperability across on-premises, public and hybrid cloud deployments.


Boris Renski, co-founder of Mirantis, argues that interoperability doesn't start at the infrastructure layer.
Although you might expect this strategy to greatly simplify the integration challenge, contrarian views are out there. One of the most vocal belongs to Boris Renski, co-founder of Mirantis and a member of the OpenStack board of directors. He believes interoperability does not necessarily start at the IaaS layer, and that applications can be built to be interoperable across different infrastructure platforms. Quoting his OpenStack Summit keynote:

"Even across Mirantis-powered OpenStack clouds like AT&T and the Volkswagen cloud, they are both based on the same distribution, but the underlying reference architectures are dramatically different…Volkswagen can't throw something at AT&T and it will just work."

I’m happy to report, though, that the participating OpenStack cloud vendors announced a successful completion of the interoperability challenge. While this is clearly a baby step on the long and treacherous road to cloud interoperability, it is worth noting because the achievement also led to the creation of automated tools for deploying applications across a variety of OpenStack environments. The effort generated significant collateral on cloud computing interoperability best practices and is expected to drive even further interoperability collaboration across the OpenStack community.




This post was brought to you by IBM Global Technology Services. For more content like this, visit Point B and Beyond.






Wednesday, November 9, 2016

Should Data Centers Think?

As cloud computing becomes the information technology mainstream, data center technology is accelerating at breakneck speed. Concepts like software-defined infrastructure, data center analytics and Nonvolatile Memory Express (NVMe) over Fabrics are changing the very nature of data center management. According to industry research firm IDC, organizations will spend an additional $142.8 billion on infrastructure for both public and private cloud environments in the next three years (2016-2018) to boost efficiency and business agility.

To support this rapidly evolving space, Intel announced a “Cloud for All” initiative last year to help businesses get the most out of their cloud infrastructure. Specific goals for this initiative include:
  • Investing in the ecosystem to accelerate enterprise-ready, easy-to-deploy software defined infrastructure (SDI) solutions;
  • Optimizing SDI solutions to deliver highly efficient clouds across a range of workloads by taking full advantage of Intel platform capabilities; and
  • Aligning the industry and engaging the community through open industry standards, solutions and routes to market to accelerate cloud deployment.
As cloud infrastructure management moves toward these new paradigms, those at the leading edge are exploring how to make data centers think for themselves. Industry leaders like Dr. Brian Womack, Director of Distributed Analytics Solutions in Intel’s Data Center Solutions Group, and Das Kamhout, Senior Principal Engineer at Intel, are using data, artificial intelligence frameworks and machine learning to build exactly that. Two key components of their vision are SNAP and TAP.

SNAP is a powerful open data center telemetry framework. It can be used to easily collect, process, and publish telemetry data at scale. It enables better data center scheduling and workload management through access to underlying telemetry data and platform metrics. The framework greatly improves system administrator control of the intelligent use of data center infrastructure in cloud environments by:
  • Empowering systems to expose a consistent set of telemetry data;
  • Simplifying telemetry ingestion across ubiquitous storage systems;
  •  Improving the deployment model, packaging and flexibility for collecting telemetry;
  • Allowing flexible processing of telemetry data on agent (e.g. machine learning); and
  • Providing powerful clustered control of telemetry workflows across small or large clusters.
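The collect-process-publish workflow such a framework automates can be sketched in a few lines. This is an illustrative pattern only, not Snap's actual plugin API (which is written in Go); the threshold and metric names are made up for the example:

```python
import json
import os
import time

def collect() -> dict:
    """Gather a small, consistent set of platform metrics (1-minute load average here)."""
    load1, load5, load15 = os.getloadavg()
    return {"ts": time.time(), "load1": load1}

def process(sample: dict, threshold: float = 4.0) -> dict:
    """Tag the sample so a downstream scheduler or workload manager can act on it."""
    sample["overloaded"] = sample["load1"] > threshold
    return sample

def publish(sample: dict) -> str:
    """Serialize for a message bus or storage backend; here we just return JSON."""
    return json.dumps(sample)

# One iteration of the telemetry workflow a framework like SNAP runs at data center scale
print(publish(process(collect())))
```

A real framework adds what the sketch omits: plugin discovery, a consistent metric catalog across thousands of hosts, and clustered scheduling of these workflows.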
Trusted Analytics Platform (TAP) makes the SNAP telemetry usable by providing the tools, components and services necessary in the creation of advanced analytics and machine learning solutions. TAP makes these resources accessible in one place for data scientists, application developers and system operators. An open-source software platform optimized for performance and security, TAP simplifies solution development through the delivery of a collaborative and flexible integrated environment.



With TAP, interactive analysis, modeling and algorithmic process flows on any type of raw data, streamed in real time or processed in batches, are possible using either a GUI or a text-based shell. These models and flows can be used for batch processing or integrated into applications. TAP includes REST APIs usable by any web-capable language (e.g., Python, Java, PHP, Ruby, JavaScript) over HTTP, as well as a Python API for server-local access. It operates on most data stores and file systems, including cluster federations that can enable data sharing (with security). The integrated operations management tools in TAP allow monitoring and control from top to bottom. In support of trust, TAP Security follows layered-security and defense-in-depth principles to provide transparent encryption and decryption, as well as fine-grained access authorization based on a variety of authentication mechanisms and assurance levels.

Used in combination, SNAP and TAP could make sentient data centers a reality.

Visit Chip Chat to hear more about creating a data center that thinks for itself!









Monday, October 31, 2016

For Top Cyber Threats, Look in the Mirror


A recent report by Praetorian, a cybersecurity company headquartered in Austin, TX, focused on threats that resulted in data compromise or access to sensitive information. Based on a review of 100 separate internal penetration test engagements, the study identified the five most prevalent threats to corporate data. The amazing thing about these weaknesses is that the top four all rely on stolen credentials, and the fifth helps an attacker use those stolen credentials more effectively. In other words, the enemy is right there in the mirror! The study spanned 75 unique organizations and focused only on security weaknesses that were used to obtain a full network compromise.
Where are your pain points?

The most prevalent threat is something we’ve all heard of before: weak domain user passwords. Most corporate environments use Microsoft’s Active Directory to manage employee accounts and access, but Active Directory only requires passwords to be a specific length and contain specific character sets, so fully addressing password weakness requires third-party software.
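To illustrate what going beyond length-and-character-set rules means, here is a minimal sketch of the kind of check third-party password-filter tools add. The banned list below is a tiny, invented stand-in for a real breached-password corpus, and the thresholds are illustrative, not any product's defaults:

```python
import re

# Stand-in for a real breached/common-password database
BREACHED = {"password1!", "winter2016!", "companyname1"}

def meets_policy(password: str) -> bool:
    """Length + character classes (what AD enforces) plus a dictionary check (what it doesn't)."""
    if len(password) < 12:
        return False
    # Require at least three of four character classes
    classes = [r"[a-z]", r"[A-Z]", r"[0-9]", r"[^A-Za-z0-9]"]
    if sum(bool(re.search(c, password)) for c in classes) < 3:
        return False
    # The step Active Directory skips: reject known-breached passwords
    return password.lower() not in BREACHED

print(meets_policy("Winter2016!"))           # False: too short, and on the breached list
print(meets_policy("Correct-Horse-42-staple"))  # True: long, mixed classes, not breached
```

The dictionary check matters because passwords like "Winter2016!" pass every built-in complexity rule while being among the first guesses in a credential-stuffing attack.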

The next most common corporate threat is Broadcast Name Resolution Poisoning. Using this vector, an attacker responds to broadcast name-resolution requests (e.g. LLMNR, NetBIOS, mDNS) with its own IP address. When this happens, the credentials of a user accessing network resources can instead be transmitted to the attacker’s system.

The next big no-no is when system administrators use the same Local Admin password on every system. If an attacker is able to compromise the LM/NT hash of that password, the attacker can use the hash itself to authenticate and execute commands on other systems sharing the same password. Using the hash, an attacker doesn’t need the actual password at all!

Microsoft Windows operating systems have another embedded password weakness. Believe it or not, the operating system stores domain credentials in cleartext within the memory of the Local Security Authority Subsystem Service (LSASS) process. Although exploiting this weakness requires Local Admin or SYSTEM-level access, it ranks high on the threat list.

The last threat enhances all of the others: insufficient network access controls. Many organizations don’t restrict network access based on business requirements, enabling unfettered attacker mobility after only a single system on the internal network has been compromised.
These threat vectors, last updated by Praetorian in June 2016, were evaluated as part of a complete corporate network compromise kill chain.  They also highlight the importance of understanding the cybersecurity threat.  Although the mirror is a good place to start improving on network security, you must also work to identify all your organization’s security pain points.  With that knowledge you can more effectively enhance your team’s defenses and eventually evolve towards a better understanding of your security threat environment.

If you are serious about protecting your data, download the full report and read about the effective strategies your company can use to protect itself. If you are a CISO or corporate executive, IBM also provides some excellent information on how to secure the C-suite, along with an interactive tool that can help you analyze your threats, protect your users and save your data from these and many other security challenges.




This post was brought to you by IBM Global Technology Services. For more content like this, visit Point B and Beyond.






Friday, October 28, 2016

Your Choice: Cloud Technician or Digital Transformer

The CompTIA Cloud+ certification validates the skills and expertise of IT practitioners in implementing and maintaining cloud technologies. This is exactly what it takes to become a good cloud technician. In the past few years, however, the National Cloud Technologists Association (NCTA) has recognized that evolving market demands have changed cloud computing technology in at least 13 ways:
  1. Variable pricing – Cloud service providers charge different prices at different times based on demand
  2. Pre-emptable machines – Providers offer a lower price for machines that can be shut down and restarted later without aborting the assigned task
  3. Shift from hardware to algorithms – The hardware is bundled into the software price
  4. Use of reserved instances – The user buys compute power in advance
  5. Buying in bulk – Pricing is based on aggregated use, even if it is sporadic in nature
  6. Shared data sources – Cloud providers offer shared data sources along with commodity hardware
  7. Autoscaling – Newer software layers offered by cloud vendors handle infrastructure scaling automatically, with billing by service request instead of by machine
  8. Graphics processing units – GPUs are now available for jobs requiring heavy-duty parallel computation
  9. Much improved analytics – Tools for monitoring the performance of your systems
  10. Many more options – A significant increase in the options available for various business requirements and loads
  11. “Bare metal” servers – Servers that aren’t virtualized
  12. Containers – Technologies like Docker make deploying software much easier and faster; the cloud spins up a new instance with a container-ready version of the OS at the bottom
  13. A growing proliferation of exotic and specialized options, all offering anything you need with the extra phrase “as a service”
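Items 1 and 4 in the list above are easiest to see with some arithmetic. The rates below are invented for illustration, not any provider's actual pricing; the point is only the trade-off structure:

```python
def monthly_cost(hours: float, rate_per_hour: float,
                 upfront: float = 0.0, term_months: int = 1) -> float:
    """Effective monthly cost: hourly usage plus any upfront fee amortized over the term."""
    return hours * rate_per_hour + upfront / term_months

# On-demand: pay-as-you-go at the full hourly rate (~730 hours in a month)
on_demand = monthly_cost(hours=730, rate_per_hour=0.10)

# A 1-year reserved instance: lower hourly rate in exchange for an upfront commitment
reserved = monthly_cost(hours=730, rate_per_hour=0.06, upfront=175.0, term_months=12)

print(f"on-demand ${on_demand:.2f}/mo vs reserved ${reserved:.2f}/mo")
```

For steady, predictable loads the reserved instance wins; for sporadic loads the unused commitment erases the saving, which is exactly why a cloud professional now needs pricing judgment, not just provisioning skills.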

This means cloud computing isn’t just about technology. It is about leading organizations through the Digital Transformation era. This is why the NCTA CloudMASTER® certification was created.
Digital transformation is the profound and accelerating transformation of business activities, processes, competencies and models to fully leverage the changes and opportunities of digital technologies and their impact across society in a strategic and prioritized way. Executives in all industries are using digital advances such as analytics, mobility, social media and smart embedded devices as well as improving their use of traditional technologies such as ERP to change customer relationships, internal processes and value propositions.

Serving as “Digital Transformers”, NCTA CloudMASTER® holders:
  • Help the organization transform customer experiences through
    • Customer understanding;
    • Top-line growth; and
    • Customer touch points.
  • Optimize internal processes through
    • Process digitization;
    • Worker enablement; and
    • Performance management.
  • Transform a company’s core functions and activities through
    • Digital modifications to the business;
    • Creation of new digital businesses; and
    • Digital globalization.

This means that if you want to have an IT career in five years, you must strive to be a Digital Transformer, not just a cloud technician. Our society is experiencing a fundamental shift in information technology’s overarching mission, with the support-and-maintain mind-set giving way to a more strategic, software-centric vision for IT. IT staff of the future need the skills of a businessperson to stay current, as their company's software requirements and the options for satisfying them will be deep, varied, and changing quickly. The IT department five years from now will also need to keep pace with nearly constant change. CloudMASTER® training and certification comprises three courses with exams:
  • NCTA Cloud Technologies that provide an overview of cloud computing that will help you develop a deep understanding of the models and understand the landscape of technologies used in the cloud and those employed by users of cloud services. You will receive multiple points of view, firsthand experience and a foundation in managing industry leading cloud services like Amazon Web Services, Drupal, Wordpress, Google Docs and Digital Ocean.
  • NCTA Cloud Operations that helps you study the management of cloud operations and addresses the application need for compute power, managing CPU scaling, and meeting both structured and unstructured storage requirements. You will learn how to painlessly deploy fairly complex applications that scale across multiple instances in cloud technologies including Windows Azure Chef, Chef Solo, Linux and Windows Tools.
  • NCTA Cloud Architecture that includes hands-on experience with OpenShift, OpenStack, VMware, Amazon Web Services, Azure and Rackspace, and provides a framework to assess application performance needs while addressing business requirements of Return on Investment (ROI), Total Cost of Ownership (TCO) and Key Performance Indicators (KPIs). Groups will complete a cloud assessment of Fortune 100 firms using public information and make presentations to the client.
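The CPU-scaling topic the Operations course touches on can be illustrated with a short sketch. This is a toy example, not course material; the function name, thresholds and bounds here are all illustrative assumptions:

```python
import math

def desired_instances(current, cpu_pct, target_pct=60, min_n=2, max_n=10):
    """Size a fleet so average per-instance CPU moves toward target_pct.

    Proportional rule: if 4 instances run at 90% CPU against a 60% target,
    the fleet should grow to ceil(4 * 90 / 60) = 6 instances.
    """
    if cpu_pct <= 0:
        return min_n  # idle fleet: shrink to the floor
    desired = math.ceil(current * cpu_pct / target_pct)
    return max(min_n, min(max_n, desired))  # clamp to the allowed range
```

A real deployment would feed this decision from a monitoring metric and act on it through the provider's autoscaling API rather than computing it by hand.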

The more complex and interconnected cloud environments become, the more a general understanding of how it all works together will be valued.  IT staff will no longer be the ones responsible for “managing the plumbing”; they will be the people thinking of new ways to monetize, share, and use corporate data for organizational success.


So which future do you want for you and your family?



( This content is being syndicated through multiple channels. The opinions expressed are solely those of the author and do not represent the views of GovCloud Network, GovCloud Network Partners or any other corporation or organization.)



Cloud Musings
( Thank you. If you enjoyed this article, get free updates by email or RSS - © Copyright Kevin L. Jackson 2016)



Thursday, October 20, 2016

Watson Crowdsources Cloud Computing


Recently I've been doing quite a bit of analysis work using the IBM Watson cognitive business platform. The really exciting thing about this opportunity is the way data can seem to have a conversation with you.  This got me wondering whether social media data could carry on a conversation as well.  Given my almost unhealthy interest in cloud computing, I ran a one-week experiment to "crowdsource the internet" and see whether it held any interesting cloud computing insights. To narrow the volume of documents down to a reasonable number, I limited providers to those on the most recent Gartner IaaS Magic Quadrant:
  • Microsoft
  • Amazon Web Services
  • Google
  • VMware
  • IBM
  • Rackspace
  • Verizon
  • CSC
  • Interoute
  • CenturyLink
  • Dimension Data
  • Fujitsu
  • Joyent
  • NTT Communications
  • Virtustream


Leveraging Watson, I gathered cloud computing-related social media documents. According to Watson, in one 24-hour period, 46,869 documents mentioned these cloud service providers (CSPs) a total of 57,997 times. Google dominated the online conversation with 73% of all mentions; Microsoft was a distant second at 17%.
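The "share of voice" figures above are simply each provider's mentions divided by total mentions. A minimal sketch (the counts below are made up for illustration; they are not Watson's actual numbers):

```python
def share_of_voice(mention_counts):
    """Return each provider's share of total mentions, as a percentage."""
    total = sum(mention_counts.values())
    return {name: round(100 * count / total, 1)
            for name, count in mention_counts.items()}

# Hypothetical counts chosen to mirror the 73% / 17% split reported above.
sample = {"Google": 730, "Microsoft": 170, "All others": 100}
```

With real data, `mention_counts` would come from an entity-mention query against the gathered document set.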

Figure 1- Social media cloud computing "Share of Voice"


At this point I took a look at overall industry sentiment. From this vantage point, Interoute outshines all rivals for positive sentiment.  Of particular note, however, Dimension Data simultaneously held the crown for the largest percentage of negative and the lowest percentage of positive sentiment (centered mostly on the dropout of a rider from its Tour de France team and a recent internal restructuring). The Dell/EMC cloud provider Virtustream doesn’t even seem to be present in social media conversations.



Figure 2- Customer Sentiment Regarding Cloud Service Providers
Figure 3 - Cloud Service Model "Share of Voice"
Microsoft dominated the segment of the conversation that specifically addressed the three standard cloud computing service models (Infrastructure-as-a-Service [IaaS], Platform-as-a-Service [PaaS], Software-as-a-Service [SaaS]). Over 53% of the working set referenced Microsoft, with second-place AWS coming in at 13.5%. Software-as-a-Service is the unsurprising overall service model leader, but Microsoft seems to be edging out AWS for Infrastructure-as-a-Service mentions.  Platform-as-a-Service is a distant laggard, with only three providers (Microsoft, AWS and VMware) represented in social media exchanges.



Figure 4 - Industry Vertical Cloud Computing "Share of Voice"



To glean some business value, the documents were binned across twelve industry verticals and analyzed for share of voice and author sentiment. The initial industry bins were:

  • Construction
  • Manufacturing
  • Wholesale trade
  • Information technology
  • Retail trade
  • Utilities
  • Financial services
  • Educational services
  • Transportation and warehousing
  • Entertainment, accommodation, and food services
  • Healthcare and social services
  • Public administration


Across this set, the entertainment, government, education and healthcare industries seem most interested in the cloud. Surprising to me is that construction industry interest surpasses that of financial services. Google seems to be driving industry-related social media conversations, with Microsoft and IBM rounding out the top three.

Although I wouldn’t use this non-scientific experiment to make any big bets, it does demonstrate how actionable data can be gleaned from the social media stream.  It may also shed a little light on the power of cognitive computing in the business world.

One especially intriguing capability that I didn’t use in this experiment is the use of Watson Explorer technologies with Semantic Analytics.  This solution is currently being used by IBM GTS to deliver “built to purpose” cognitive systems for the information technology industry vertical.
Figure 5 - Cloud Service Provider Industry "Share of Voice"



A key differentiator of this approach is its ability to extract meaning from the fragmented sentences normally found in unstructured IT service ticket description fields. Due to the global nature of GTS Services, this unstructured text is typically in multiple languages. Additionally, due to the different language skill levels of the globally sourced pool of agents, the grammar quality varies. This solution is used by GTS to uncover patterns and trends in the identification of contributing incident causes in order to prescribe appropriate preventative actions.



Digital transformation, coupled with cognitive computing, is accelerating change in almost every industry. In the IT world, at least, cognitive computing promises to bridge the gap between unstructured language data and effective maintenance action by correlating social media chatter and customer sentiment with the root causes of operational IT issues.

This post was brought to you by IBM Global Technology Services. For more content like this, visit Point B and Beyond.




Cloud Musings
( Thank you. If you enjoyed this article, get free updates by email or RSS - © Copyright Kevin L. Jackson 2015)