Sunday, April 8, 2018

Wasabi Hot Innovations Tour: How "Hot Cloud Storage" Changes Everything!


Digital storage requirements are growing exponentially. Budgets simply can’t keep up, and existing Federal Data Center Consolidation Initiative (FDCCI), “Cloud First” Policy, Federal IT Acquisition Reform Act (FITARA) and Modernizing Government Technology (MGT) Act challenges aren’t going away. On top of all that, e-discovery, data privacy and digital forensics have made rapid data access and immutability absolute must-haves.
  • What are your plans for addressing these issues?
  • How can you manage the generation of even more unstructured, IoT and “Big” data?
  • Will you get through your next IG review?

Hot cloud storage changes everything. Here’s how.
  • 80% Cheaper than the cheapest
  • Faster than the fastest
  • Safer than the safest
  • Unlimited free egress – no additional charges for downloading data from the cloud
  • Available as public or private cloud options
  • Immediate access and built-in immutability
  • HIPAA, HITECH, CJIS, SOC-2, ISO 27001, and PCI-DSS certified
  • FedRAMP certification in progress

“Hot cloud storage” is currently being used by Cloud Constellation Corp., Acembly Television Broadcasting, 7 Wonders Cinema and multiple government organizations. Maybe your team should consider it as well.
Featuring:

Kevin L. Jackson, CISSP®, CCSP®

Also quoted by SAP, AT&T, Accenture, Ericsson, Forbes, Dell and others.



Individually scheduled consultations available with on-site experts


( This content is being syndicated through multiple channels. The opinions expressed are solely those of the author and do not represent the views of GovCloud Network, GovCloud Network Partners or any other corporation or organization.)





Cloud Musings
( Thank you. If you enjoyed this article, get free updates by email or RSS - © Copyright Kevin L. Jackson 2016-2018)



Friday, March 9, 2018

(Lack of) Patch Management Highlighted in US Congress


According to the former Equifax CEO’s testimony to Congress, one of the primary causes of this now infamous data breach was the company’s failure to patch a critical vulnerability in the open source Apache Struts Web application framework. Equifax also waited a week to scan its network for apps that remained vulnerable.[1] Would you like to appear at the next Congressional hearing on patch management?

Patch management is the process of identifying, acquiring, installing, and verifying patches for products and systems. Patches not only correct security and functionality problems in software and firmware, but they also introduce new, and sometimes mandatory, capabilities into the organization’s IT environment. Patching is so effective that the CERT® Coordination Center (CERT®/CC) claims that 95 percent of all network intrusions could be avoided by using proper patch management to keep systems up-to-date.

This nightmare true story and compelling endorsement from CERT®/CC, however, mask the ugly operational complexities of implementing patch management. Key enterprise challenges include:
  • Timing, prioritization, and testing of patches often present conflicting requirements. Competing priorities for IT resources, business imperatives, and budget limitations often leave patching tasks on the back burner
  • Technical mechanisms and requirements for applying patches may also conflict and may include:
    • Software that updates itself with little or no enterprise input
    • Use of a centralized management tool
    • Third-party patch management applications
    • Negative or unknown interactions with network access control, health check functions, and other similar technologies
    • User-initiated manual software updates
    • User-initiated patches or version upgrades
  • A typical heterogeneous enterprise environment that includes:
    • Unmanaged or user managed hosts
    • Non-standard IT components that require vendor patching or cannot be patched
    • Enterprise-owned assets that typically operate on non-enterprise networks
    • Smartphones, tablets, and other mobile devices
    • Patching of rehydrating virtual machines
    • Firmware updates

Piling up on these purely operational tasks are the change management steps associated with:
  • Maintaining current knowledge of available patches;
  • Deciding what patches are appropriate for particular systems;
  • Ensuring proper installation of patches;
  • Testing systems after installation; and
  • Documenting all procedures and any specific configurations.

This challenge can also be significantly exacerbated in an IT environment that blends legacy, outsourced and cloud service provider resources. Environment heterogeneity and the sheer volume of patches released are why any patching strategy that relies primarily on manual implementation is untenable.



According to the SANS Institute, meeting the patch management challenge requires the creation of a patch management methodology and the automation of that methodology.[2] The methodology itself should include:
  • A detailed inventory of all hardware, operating systems, and applications that exist in the network and the creation of the process to keep the inventory up-to-date.
  • A process to identify vulnerabilities in hardware, operating systems, and applications.
  • Risk assessment and buy-in from management and business owners.
  • A detailed procedure for testing patches before deployment.
  • A detailed process for deploying patches and service packs, as well as a process for verification of deployment.

As for the automation component, it should deliver a comprehensive, automated server lifecycle approach that can provision and configure software, update patches, and implement configurations that improve security and compliance across physical, virtual and cloud servers.
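
At its simplest, that automation boils down to continuously comparing what is installed against what should be installed. The Python sketch below is a minimal, hypothetical illustration of that comparison; the package names, version numbers, and inventory source are placeholders and do not represent any particular product’s data model.

```python
# Minimal sketch: flag packages whose installed version lags the required,
# patched version. All package names and versions below are hypothetical.
from packaging import version  # third-party "packaging" library

def find_unpatched(installed: dict, required: dict) -> dict:
    """Return packages that are missing or older than the required version."""
    gaps = {}
    for pkg, min_ver in required.items():
        inst = installed.get(pkg)
        if inst is None or version.parse(inst) < version.parse(min_ver):
            gaps[pkg] = {"installed": inst, "required": min_ver}
    return gaps

# In practice the installed inventory would come from an agent, a CMDB, or a
# package-manager query; these literal values are illustrative only.
installed = {"struts2-core": "2.3.30", "openssl": "1.0.1f"}
required  = {"struts2-core": "2.3.32", "openssl": "1.0.1g"}
print(find_unpatched(installed, required))
```

A report like this, generated on a schedule for every managed host, is the raw material that the prioritization, testing, and deployment workflows then consume.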
 
It should also encompass a policy-based approach with support for all major operating systems on physical servers and leading virtualization and cloud platforms. An ability to automate continuous compliance checks and remediate any security or regulatory shortcoming is also paramount. If appropriately implemented, IT staff should be able to manage patching via a web interface. Having this capability increases the server-to-admin ratio, enhances operational productivity, accelerates audit timelines and reduces incident response latency.
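
A policy-based check can be sketched the same way: each rule states an expected configuration value and a remediation hint, and the check is re-run continuously against every managed host. The rules, hosts, and configuration keys below are invented for illustration and do not reflect any specific product’s policy language.

```python
# Hedged sketch of a policy-based compliance check. The policy rules and the
# sample host configurations are hypothetical.
POLICY = [
    {"key": "ssh_root_login", "expected": "no", "remediation": "Set PermitRootLogin no"},
    {"key": "auto_updates",   "expected": "on", "remediation": "Enable unattended upgrades"},
]

def check_host(host_name: str, host_config: dict) -> list:
    """Return one finding for every setting that violates the policy."""
    findings = []
    for rule in POLICY:
        actual = host_config.get(rule["key"])
        if actual != rule["expected"]:
            findings.append({"host": host_name, "key": rule["key"],
                             "actual": actual, "expected": rule["expected"],
                             "remediation": rule["remediation"]})
    return findings

# Illustrative fleet inventory; a real tool would collect this from each server.
fleet = {"web-01": {"ssh_root_login": "yes", "auto_updates": "on"},
         "db-01":  {"ssh_root_login": "no",  "auto_updates": "off"}}
for host, cfg in fleet.items():
    for finding in check_host(host, cfg):
        print(finding)
```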

A leading solution in this space is BladeLogic Server Automation by BMC. It was specifically designed to address the dual enterprise requirements of (1) ensuring compliance with rules and regulations and (2) software patching to reduce security vulnerabilities. In the market for over 10 years, it is a comprehensive server lifecycle automation solution that helps organizations provision and configure software and update patches and configurations to improve security and compliance across physical, virtual and cloud servers. Advanced capabilities include script automation, compliance tracking and the ability to stage and test patches before committing them. The latter feature is used to copy patch bundles to the targeted servers before maintenance windows open. The full-function suite integrates with change management systems to facilitate change record creation. Vulnerability management and remediation are automated by importing vulnerability scan data from vendors like Qualys, Tenable and Rapid7, and mapping the vulnerabilities back to underlying patches in BladeLogic.
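
The scan-import-and-mapping step can be pictured with the generic sketch below. It is not BladeLogic’s actual API or data format, only an illustration of the idea: scanner findings arrive keyed by CVE, a catalog relates each CVE to the patch that remediates it, and the result is a per-host remediation plan that can be staged ahead of a maintenance window.

```python
# Generic, hypothetical sketch: map vulnerability-scanner findings (by CVE) to
# the patches that remediate them, grouped per host for staging.
scan_findings = [
    {"host": "app-01", "cve": "CVE-2017-5638"},   # Apache Struts RCE
    {"host": "web-02", "cve": "CVE-2014-0160"},   # OpenSSL Heartbleed
]

# Hypothetical catalog relating CVEs to the package version that fixes them.
patch_catalog = {
    "CVE-2017-5638": {"package": "struts2-core", "fixed_version": "2.3.32"},
    "CVE-2014-0160": {"package": "openssl",      "fixed_version": "1.0.1g"},
}

def build_remediation_plan(findings: list, catalog: dict) -> dict:
    """Group the required patches by host so bundles can be staged in advance."""
    plan = {}
    for finding in findings:
        patch = catalog.get(finding["cve"])
        if patch:
            plan.setdefault(finding["host"], []).append({**patch, "cve": finding["cve"]})
    return plan

for host, patches in build_remediation_plan(scan_findings, patch_catalog).items():
    print(host, patches)
```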

Secure IT operations start with the identification and prioritization of critical vulnerabilities paired with the capability to deliver multi-tier remediation. These reinforcing goals are why an advanced patch automation solution is a “must have” for today’s modern enterprise.




This post is brought to you by BMC and IDG. The views and opinions expressed herein are those of the author and do not necessarily represent the views and opinions of BMC.





Cloud Musings
( Thank you. If you enjoyed this article, get free updates by email or RSS - © Copyright Kevin L. Jackson 2016-2018)



Tuesday, February 20, 2018

Experience “The Big Pivot”



Graeme Thompson,
SVP/CIO Informatica
The Big Pivot Podcast explores digital transformation and its effect on every business in every industry. In exploring the business benefits of data-driven transformation, it is aimed at CIOs who want to have a real impact on serving customers, empowering employees, and growing revenue. Hosted by Rob O’Regan of IDG, the series features Graeme Thompson, SVP/CIO of Informatica, a CIO who exemplifies a strategic executive leveraging data to drive change throughout his organization. Series guests include:


The New CIO role


A key point highlighted throughout the series is that the CIO’s role has shifted radically. No longer focused on using IT to improve the internal productivity of business functions, today’s best information executives partner with functional peers to deliver new business models and associated revenue streams. For them, the main challenge is a constant battle with legacy technology and legacy thinking. For software companies, this painful shift could be from an established initial license purchase and annual maintenance revenue model to a software subscription model. Product-wise, this may also include shifting from custom offerings to delivering fixed and standardized software services. CIOs must help the C-suite move away from functional-level optimization and towards an enterprise success focus that provides scalable, real-time data integration. This data foundation must also include an analytics platform that extracts application data, delivers visibility across each end-to-end business process (e.g., hire-to-retire, procure-to-pay, campaign-to-opportunity) and optimizes outcomes for customers and the entire company.

While the traditional CIO purview was mostly limited to IT infrastructure details (e.g., the number of servers, wireless access points, and storage devices), the new CIO must be equally aware of the number of corporate database instances, which ones hold customer data and who has access to that data. Data is the foundation of digital transformation, so the CIO must be as focused on the data as they are on the physical aspects of IT. They must effectively leverage digital assets and escalate the use of data above the operational function that creates it to target enterprise-level opportunities. The CIO is in a unique position because they are:
  • Best positioned to understand end-to-end processes and functions; and
  • Close to emerging technologies that enable them to identify profitable IT implementation opportunities.

Manage data as currency


The Big Pivot Episode 6 presents a thought exercise that compares the CIO’s role managing data to that of the CFO managing currency by asking:
  • Does the CFO let each functional organization keep and manage the revenue it makes?
  • Does the CFO leave it up to the goodwill of each functional manager to share their profit with other functional units?
  • Does the CFO only have a vague idea of the amount of money that flows in and out of the corporation?

These questions may seem absurd, but if data is valuable, why doesn’t the CIO manage data like the CFO manages currency?

Data is the foundation of digital business, and digital transformation success is defined by how well an organization leverages its data to create new opportunities. This viewpoint demands the use of secure, timely, accurate, correctly sourced, context applicable and appropriately organized data. While previous implications of bad data were small (an occasional reporting issue or individual process error), today’s businesses and entire industries are dependent on digital assets for success.  Maintaining the value and reliability of underlying data is now an existential priority.

Admittedly, treating data as such a valuable business asset is problematic because most companies try to avoid the political and structural disruption that comes with breaking down the functional silos and norms of the past. A practical way forward is possible, however, by linking technology investment to quantifiable business value. Functional optimizations do not always lead to desired enterprise optimizations so the CIO and senior leadership team must work together in thinking about the best use of data for the benefit of the entire company rather than any specific function. To realize this, changes across the entire system, and not just a functional area, may be required. Avoid the practice of linking data and ownership to the same functional area because this gets in the way of sharing data across the enterprise.

The new data privacy imperative


In 2018, the General Data Protection Regulation (GDPR) and other data privacy regulations are raising the bar for global data security, privacy regulation and control objectives. By building a globally enforceable framework for data lifecycle management, GDPR is forcing many international companies to classify and protect every individual’s data. Some of the challenges of this new regulation include:
  • The need to be able to closely track all data related to the privacy of an individual;
  • A requirement to demonstrate privacy by design in the handling of all privacy information;
  • An ability to demonstrate to regulators and data subjects the preservation of privacy information integrity through every stage of the information lifecycle; and
  • Enforcement of the right of any individual to demand the erasure of data related to them.

To continue operating, companies must augment their traditional data protection practice of protecting the network and data center environment with operational processes that implement data protection that travels with the data. This includes operations in the mobile environment as well. Data must be secured as an asset and not just as a process by-product.

A mandatory requirement of GDPR is the appointment of a Data Protection Officer, a role that serves as a focal point for data protection activities and related educational processes across the enterprise. These individuals are responsible for the handling of personal data and must be able to work closely with other governance functions (e.g., information security, legal, records management, audit). New data protection procedures must also be fitted into project planning, external service contracts, procurement processes, data portability, and new internal processes that uphold data erasure rights.
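
As a hedged illustration of the erasure-rights plumbing mentioned above, the sketch below shows one simple way to track which stores hold a data subject’s records and to process an erasure request with an audit trail. The store names, record layout, and subject identifiers are hypothetical.

```python
# Minimal sketch of a GDPR "right to erasure" request handler. The registry of
# data stores, the record contents, and the subject IDs are hypothetical.
from datetime import datetime, timezone

# Hypothetical registry: which stores hold records keyed by data-subject ID.
data_stores = {
    "crm":       {"subject-123": {"name": "A. Person", "email": "a@example.com"}},
    "marketing": {"subject-123": {"campaign_opt_in": True}},
    "billing":   {"subject-456": {"invoice_total": 99.00}},
}
audit_log = []

def erase_subject(subject_id: str) -> list:
    """Remove the subject's records from every registered store and log the action."""
    touched = []
    for store_name, records in data_stores.items():
        if subject_id in records:
            del records[subject_id]  # in practice: delete, anonymize, or crypto-shred
            touched.append(store_name)
    audit_log.append({"subject": subject_id, "stores": touched,
                      "erased_at": datetime.now(timezone.utc).isoformat()})
    return touched

print(erase_subject("subject-123"))  # -> ['crm', 'marketing']
```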

Analytics changes business


The most important takeaway from this series is understanding the need to tie data insights to business outcomes. At its core, digital transformation means shifting from limited competition based on physical assets to global competition based on digital assets. The CIO sees the business uniquely because, by necessity, they think of the enterprise as a connected system of processes and applications. This viewpoint can be used to effectively and efficiently drive change across the entire enterprise because fundamental business transformations are driven by data analytics that creates opportunities that didn’t previously exist. Incremental change is no longer viable, and products are no longer just a selection of features. Today’s competitive offering must be able to use data to inform itself on how it is being used, fix itself if it sees a problem, and then use that data to solve the same or related problems in every other environment in which it operates.

Andrew McIntyre, VP of Technology for the Chicago Cubs, on how analytics changes business

This post is brought to you by Informatica and IDG. The views and opinions expressed herein are those of the author and do not necessarily represent the views and opinions of Informatica.



Cloud Musings
( Thank you. If you enjoyed this article, get free updates by email or RSS - © Copyright Kevin L. Jackson 2016)



Sunday, February 18, 2018

Innovation At The Seams

by
Kevin L. Jackson & Dez Blanchfield

Today’s real business innovation is happening at the seams of industries. Moreover, after listening to this podcast between Sanjay Rishi, GM Global Cloud Consulting Services at IBM Global Business Services, and Dez Blanchfield, you will understand why Mr. Rishi describes his primary role as delivering cloud-enabled innovation and transformation.

In this fascinating discussion, Sanjay and Dez talk about how organizations embark on cloud journeys through different entry points, namely by:

  • Developing support and engagement systems for customers, employees, and suppliers;
  • Migrating legacy applications into a cloud computing environment;
  • Leveraging rapidly evolving technologies like IoT and blockchain to innovate and transform business; and
  • Delivering business ROI with both speed and innovation.

In pursuing this goal of helping his clients strategize on cloud adoption, he has learned many valuable lessons. One of the most important lessons centers on how enterprise leaders overlook the role of organizational communications when transformation begins. The issue is that communications is quickly relegated to an afterthought and doesn’t get the attention it deserves. In his experience, communications and change management are both essential and serve as the difference between success and failure. Sanjay’s guidance is for leaders not to forget that people’s hearts and minds must change if innovation is to deliver business results.

Another insightful nugget from this podcast is Mr. Rishi’s observation on how the CIO role is shifting from IT to business. This position is less about the back office and technology enablement and more about influencing change within organizations and becoming a catalyst for transformation and innovation. The most significant takeaway here is the need for empathy from the standpoint of understanding what challenges a CIO is going through concerning change and the speed with which change can happen. Transformation creates “haves” and “have-nots” in organizations.  Those pulled into the transformation become the “haves,” and then the masses see themselves as the “have-nots.”


People are hungry for inclusion and to be informed even if they do not participate in influencing an organization’s transformation. In sharing these insights, Sanjay Rishi reinforced his observation that the essence of digital transformation lies in relationship innovation (12:12). He even provided two vivid examples, a European TELCO and a Latin American Bank, to drive home the point. In short, by innovating the organization’s relationship with customers, suppliers, stakeholders, and employees, people can be influenced and effectively led through the investment journey needed to harvest cloud-enabled innovation opportunities.

Organizations must come to grips with the reality of two-speed transformation. The first gear of change is incremental and evolutionary, while the second is revolutionary and built around disruption. First gear delivers needed enhancements and improvements to the existing business while the second wards off the threat of disruption from smaller players and start-ups. Business success is not about slowing down the rate of change. It is about balancing these two rates of change. The dependencies between the two are very significant, and embracing both is essential for success.

Breaking out his crystal ball, Sanjay ended the exchange by telling everyone that the next big thing is an organization’s ability to sense and understand individual behavior in a way that enables the presentation of consumption choices. This vision seems to represent a doubling down on his earlier statements on relationship innovation. According to Mr. Rishi, this capability expands organizations and accelerates life changes for our benefit. Although individual sensing and anticipation of demand certainly have security challenges, he sees the change as positive in that it makes life much more efficient and allows us to harvest the many associated opportunities.


This post was brought to you by IBM Global Technology Services. For more content like this, visit ITBizAdvisor.



Cloud Musings
( Thank you. If you enjoyed this article, get free updates by email or RSS - © Copyright Kevin L. Jackson 2016)



Tuesday, January 30, 2018

Digital Transformation & Intelligent Automation


By Kevin Jackson & Dez Blanchfield

Digital Transformation often needs Intelligent Automation. This type of change is the focus of a recent “Pioneers of Possible” podcast. In a discussion with Elli Hurst, Vice President of Global Automation at IBM Global Business Services (GBS), Dez Blanchfield finds out how her life’s journey inspires her to help IBM clients use Intelligent Automation to enable globally integrated capabilities.

With six years at Price Waterhouse and 24 years at IBM, Elli seems to have moved a long way from her family’s restaurant business. The service industry passion that she learned from her father, however, is still deep in her heart. That care and passion are evident in how she focuses on understanding her client’s desired business outcomes. By using these targeted outcomes as a beacon for every engagement, her team helps clients align and execute on priorities in a manner that delivers a return on investment in months.

For a technology company executive, it is surprising to hear her describe technology as merely “table stakes.” While recognizing the critical and fundamental role that technology plays, Ms. Hurst prioritizes the need for process and people elements to work together with the technology. While automation typically starts with a focus on reducing cost, it moves quickly to the delivery of value. Cost efficiencies exist, but the value gained from the speed at which an enterprise can perform a business process with high quality is more important than the cost savings. According to her, attaining these types of business outcomes and values stems from a strategy that addresses:
  • Impact on the business and to the people that are performing work;
  • How people interact with the technology and automation;
  • How people can help enable automation; and
  • What new skill sets are needed.
The answers to these points are the basis for a successful Digital Workforce Strategy.
Organizations often err by trying to automate what people do. Ms. Hurst’s insight is that automation should be designed to assist people in what they do, which is the real secret to bringing automation forward into the enterprise. Automation always impacts a workforce, and jobs always change. Business value is released, however, when this change frees up innovation and unveils more exciting projects and tasks for that workforce.



The impact of automation on the workforce is not a bad thing; it is a good thing. Elli recommends “Taking it to the Positive” by getting buy-in and engaging the workforce teams impacted by automation. Experience has taught her that while point solutions may deliver 40% increases in efficiency, when used in tandem with a Digital Workforce Strategy they can simultaneously deliver a 95% increase in employee satisfaction. Establishing and executing that strategy is the key to any transformation that uses automation.

Another one of her telling observations is that automation drives the most significant disruptions to back-office repetitive tasks. By looking at the end-to-end business model through an industry lens, her teams have helped clients to impact the external world beyond the back office and through to the client’s customers. Described as enabling digital experience “concentric circles,” this process enables enrichment of a client’s entire business ecosystem.

Automation conversations usually start in a specific area, like robotic process automation, which has been spurred by a back-office disruption. Addressing any disruption like this requires a strategy because global automation is a journey that aligns business processes with rapidly changing technology. The organization must be able to flex and continuously adapt its strategy. The typical 3-5 year strategy is no longer viable. Intelligent automation demands a “fail fast” strategic approach.

Ms. Hurst ended this fascinating conversation by describing the future of automation as the convergence of all technologies at the enterprise level. In her view, the enablement of self-healing, lights-out, information technology platforms will give business executives the ability to couple an integrated view of all business processes with an ability to take immediate and effective action through mobile devices.




This post was brought to you by IBM Global Technology Services. For more content like this, visit IT Biz Advisor.



Cloud Musings
( Thank you. If you enjoyed this article, get free updates by email or RSS - © Copyright Kevin L. Jackson 2016)



Tuesday, January 23, 2018

The Ascent of Object Storage



Over the past few years, the data storage market has changed radically. The traditional hierarchy of directories, sub-directories, and files, referred to as file storage, has given way to object storage, in which data is kept as individual storage objects. While file storage was designed to help humans interact with data, object storage is all about automated efficiency.

User expectation of data usage drives file storage repository design. In this structured data model, all folders and names are organized to support a pre-defined business process or model. The file system also associates a limited amount of metadata (e.g., file name, creation date, creator, file type) with the saved file. Finding individual files is done either manually or programmatically by working through the hierarchy. The file storage approach works well with data collections but can become very cumbersome as data volume grows.

Object storage, on the other hand, is optimized for an unstructured data model. While this approach is not “human-friendly,” it also doesn’t require prior knowledge or expectations of data use. Files are stored as objects in various locations with a unique identifier and a significant amount of metadata. The size of the accompanying metadata can range from kilobytes to gigabytes and often includes a content summary, keywords, key points, comments, locations of associated objects, data protection policies, security, access, geographic locations and more. Enhanced metadata enables a finer level of granularity when protecting, manipulating, and managing stored objects.
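
To make the object-plus-metadata idea concrete, here is a hedged sketch that stores and inspects an object through the widely supported S3-style API (via the boto3 library). The endpoint URL, credentials, bucket name, object key, and metadata values are placeholders; any S3-compatible object store could sit behind them.

```python
# Hedged sketch: write an object with custom metadata to an S3-compatible store
# and read the metadata back without downloading the object body. All endpoint,
# credential, bucket, and metadata values are placeholders.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://s3.example-object-store.com",  # hypothetical endpoint
    aws_access_key_id="ACCESS_KEY",
    aws_secret_access_key="SECRET_KEY",
)

# Each object gets a unique key plus arbitrary descriptive metadata.
s3.put_object(
    Bucket="archive-bucket",
    Key="cases/2018/evidence-0001.pdf",
    Body=b"...binary document content...",  # placeholder payload
    Metadata={
        "summary": "scanned deposition",
        "keywords": "ediscovery,deposition",
        "retention-policy": "7-years",
    },
)

# The metadata travels with the object and can be queried on its own.
head = s3.head_object(Bucket="archive-bucket", Key="cases/2018/evidence-0001.pdf")
print(head["Metadata"])
```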

Specific business, technology, and economic drivers caused this significant market change. Business drivers include:
  • Rapid growth in the amount and importance of unstructured data
  • The need for faster data retrieval based on identifying details incorporated in the object’s metadata
  • The requirement to organize unstructured data resources through the use of text analytics, auto-categorization, and auto-tagging
  • Increased legal and regulatory requirements for scalable data archiving and e-discovery
  • Enhanced business process and model flexibility enabled by the use of a flat storage structure
From a technical point of view, object storage is far superior to file systems. This advantage is primarily due to its unlimited scalability and ability to be managed programmatically. It also offers:
  • Fewer limitations than traditional file- or block-based systems because of the flat data environment
  • Customizable metadata through the arbitrary use of any number of data attributes
  • Global accessibility using HTTP(S) and REST APIs


From an economic point of view, object storage is also more cost-effective than file storage solutions, especially when storing large amounts of data. Since object storage solutions efficiently leverage unlimited scalability, organizations find that it is less costly to store all of their data. This advantage also exists in private cloud implementations, where costs can be even lower than those offered by public cloud providers. Object storage is also much more durable than file-based alternatives.

The marketplace offers plenty of alternatives when object storage is the right answer. Access protocols, technology, and cost, however, vary widely. As shown in Table 1, the cost of storing 1 terabyte of data for one year ranges from a high of $4,300.80 with data striping from QualityTech/QTS to a low of $47.88 from Wasabi, which uses a more advanced erasure-coding approach. While location differences cause some cost variation, most of the variation can be attributed to design architecture and underlying storage technology.
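
The spread between those two figures is easier to appreciate as a quick back-of-the-envelope calculation; the monthly rates below are simply the quoted annual costs divided by twelve, not published price quotes.

```python
# Back-of-the-envelope comparison of the two quoted annual costs for storing
# 1 TB for one year. Derived values only; not published provider pricing.
high_annual = 4300.80  # quoted high end (data striping)
low_annual = 47.88     # quoted low end (erasure coding)

print(f"Implied monthly rate, high end: ${high_annual / 12:,.2f} per TB")
print(f"Implied monthly rate, low end:  ${low_annual / 12:,.2f} per TB")
print(f"Cost ratio, high vs. low:       {high_annual / low_annual:.1f}x")
```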

Although this market survey is not exhaustive by any means, it highlights the importance of being an educated consumer when considering object storage solutions. Other solution aspects worth investigating include:
  • Complexity and performance across provider storage service tiers
  • Data immutability and durability
  • Speed of internal consistency across multiple copies of your data
  • Elapsed time to the delivery of the first byte of requested data
  • Use of active integrity checking
By all objective accounts, object storage is the right storage for large segments of an organization’s data holdings. This reality should lead to more effective due diligence and care when considering your enterprise’s next storage upgrade.





( This content is being syndicated through multiple channels. The opinions expressed are solely those of the author and do not represent the views of GovCloud Network, GovCloud Network Partners or any other corporation or organization.)
Cloud Musings
( Thank you. If you enjoyed this article, get free updates by email or RSS - © Copyright Kevin L. Jackson 2018)



Sunday, January 14, 2018

The Deer Hunters: An Information Technology Lesson


by Kevin Jackson & Dez Blanchfield

In episode four of the “Pioneers Of Possible” podcast series, Dez Blanchfield caught up with Max Michaels, General Manager, IBM Network Services in the studio. Their engaging and insightful discussion ranged from a philosophical story about cooperative deer hunting from Mr. Michaels’ childhood to his professional life experiences with a famous former CEO of PepsiCo and Westinghouse, both of which contributed to the depth and strength Mr. Michaels brings to his role and the management of the IBM Network Services business.


Early in the discussion, Max shared an anecdote about how his grandmother’s personalized retelling of a Jean-Jacques Rousseau tale from the mid-1700s, about a group of hunters who chose to collaborate while tracking a large stag rather than operate independently, influenced him throughout his career. He also recounted how he enjoyed a similarly defining experience working with Michael H. Jordan when the venerable leader was Chairman and CEO of EDS.

Throughout the conversation three endearing characteristics stood out about Max Michaels as a thought leader and innovator - they were:
  • The extent to which he values good business judgment
  • Successes he has earned through unorthodox thinking
  • The critical converging of the IT and networking world
His unorthodox way of thinking was very effectively put into practice at McKinsey when, in 1996, he led a three-person team to win the McKinsey Worldwide Practice Olympics, a company-wide competition to generate new client-ready knowledge ideas. The team ranked #1 among more than 300 McKinsey teams worldwide by showing how to apply the Black-Scholes option-pricing model to any strategic situation. The critical financial insight behind the Black-Scholes model is that it “eliminates risk” by showing how to buy and sell an underlying asset in just the right way. At McKinsey, Mr. Michaels showed how the identification of the right issues leads to correct strategic actions.
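
(For readers unfamiliar with the reference, the standard Black-Scholes price of a European call option is reproduced below purely as background; it is not drawn from the podcast itself.)

```latex
C = S_0 N(d_1) - K e^{-rT} N(d_2), \qquad
d_1 = \frac{\ln(S_0/K) + \left(r + \sigma^2/2\right) T}{\sigma\sqrt{T}}, \qquad
d_2 = d_1 - \sigma\sqrt{T}
```

Here S_0 is the current asset price, K the strike price, r the risk-free rate, σ the volatility, T the time to expiration, and N the standard normal cumulative distribution function.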

The insight displayed by this win is that the business world is not directly comparable to the financial world. When you invest by buying stock in the financial world, you may have little direct input into what happens to make the stock go up or down. The business world is entirely different in that when you invest in a new product, strategy or marketing plan, the investor has a continuing opportunity to change the outcome as customer perceptions and preferences change. Throughout the conversation in this episode of the Pioneers of Possible podcast series, this approach to driving successful outcomes turns out to be key to Max Michaels’ leadership style at IBM Network Services, and the genius behind the “Always-on Initiative,” designed to help enterprises support the always-on nature of day-to-day business. The network is the enabling capability for “Always-on” and serves as a foundational element of the convergence between the information technology (IT) and telecommunications (telecom) worlds.


Before this trend took hold, companies ran IT and telecom networks separately and with separate leadership teams. With the intuition gained earlier, Mr. Michaels is now helping IBM customers move away from focusing on IT outcomes and towards a converged business outcome. This approach, in turn, changes business models in ways that make it possible for organizations to better leverage the convergence of IT and telecom, both within and external to their organizations.

According to Max, the average person in the US is interacting with the network for 16 hours a day. Businesses, therefore, need to deliver their products and services to these individuals through the network. Modern business models rely on the network to facilitate seamless connectivity and the convergence of cloud, the new model for delivering IT, and telecom. Networks enable the cloud, and in the next phase of the evolution of digital business and digital transformation, the cloud, in turn, becomes the network. IBM as a cloud company is leading the way.

This same paradigm drives the transformative effect that the cognitive capabilities of IBM Watson have when incorporated into the network and the cloud, an effect that is positively disruptive in how we all experience the world around us, both in business and in life. Through the lens of network services, this overarching principle also heightens the importance of network and cloud convergence. The explosion of data the world has experienced over the past decade has increased the demand for everything to be software-defined, so that compute, storage and networks can all combine into a single entity that provides value to the end user. Network virtualization and software-defined infrastructure are dramatically and fundamentally changing the entire enterprise managed services world, and that, in essence, is what the IBM Service Platform with Watson delivers.

Please enjoy this episode of the podcast series. We look forward to your feedback and comments through social media as we continue our journey to introduce you to leading IBM innovators and thought leaders.



This post was brought to you by IBM Global Technology Services. For more content like this, visit ITBizAdvisor.



Cloud Musings
( Thank you. If you enjoyed this article, get free updates by email or RSS - © Copyright Kevin L. Jackson 2018)