Friday, July 27, 2018

Cloud Migration Best Practice: Classifying Your Data



In my first post of this series, “Cloud Migration Part One: An Overview,” I provided a high-level summary of how enterprises should migrate applications to the cloud. In this installment, the focus is on enterprise data and why your organization may need to review and reclassify its data before moving anything to the cloud.

Cloud computing has done more than change the way enterprises consume information technology.  It is also changing how organizations need to protect their data.  Some may see this as an “unintended consequence,” but the headlong rush to save money by migrating applications to the cloud has simultaneously uncovered long-hidden application security issues.  This revelation is mostly due to the wide adoption of “Lift & Shift” as a cloud migration strategy.  Using this option typically precludes any modification of the migrating application.  It can also result in the elimination of essential data security controls and lead to grave data breaches.

While the good intentions of all involved are not in doubt, enterprise applications were traditionally developed for deployment into the organization’s own IT infrastructure.  That implicit assumption included reliance on infrastructure-based security controls to protect organizational data.  These generally accepted industry practices were coupled with a cultural propensity to err on the side of caution by protecting most data at generally high levels.  During an implementation, organizations typically used a two-level (sensitive and non-sensitive) or, at most, a four-level data classification model.

Today, the cloud has quickly become the preferred deployment environment for enterprise applications.  This shift to using “other people’s infrastructure” has brought with it tremendous variability in the nature and quality of infrastructure-based data security controls.  It is also forcing companies to shift from infrastructure-centric security to data-centric information security models.  Expanding international electronic commerce, ever-tightening national data sovereignty laws and regional data protection and privacy regulations (e.g., GDPR) have also combined to make many data classification schemas untenable.  In fact, both the Cloud Security Alliance and the International Information Systems Security Certification Consortium, (ISC)², suggest that corporate data may need to be classified across at least eight categories (illustrated in the sketch after the list), namely:
  • Data type (format, structure)
  • Jurisdiction and other legal constraints
  • Context
  • Ownership
  • Contractual or business constraints
  • Trust levels and source of origin
  • Value, sensitivity, and criticality
  • The obligation for retention and preservation
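One way to make a schema of this breadth operational is to record every category explicitly for each data type. The Python sketch below is a minimal illustration; the field names, sensitivity levels, and sample values are assumptions made for this example, not CSA or (ISC)² definitions.

    from dataclasses import dataclass
    from enum import Enum

    class Sensitivity(Enum):              # illustrative levels, not a standard
        PUBLIC = 1
        INTERNAL = 2
        CONFIDENTIAL = 3
        REGULATED = 4

    @dataclass
    class DataClassification:
        """One record per data type, covering the eight suggested categories."""
        data_type: str                    # format and structure
        jurisdictions: list[str]          # legal constraints, e.g. ["EU/GDPR"]
        context: str                      # how and where the data is used
        owner: str                        # the accountable data owner
        contractual_constraints: list[str]
        trust_level: str                  # source of origin and trust placed in it
        sensitivity: Sensitivity          # value, sensitivity, and criticality
        retention_years: int              # retention and preservation obligation

    record = DataClassification(
        data_type="customer PII, relational",
        jurisdictions=["EU/GDPR"],
        context="order processing",
        owner="order-to-cash process owner",
        contractual_constraints=["payment-processor agreement"],
        trust_level="first-party, verified at capture",
        sensitivity=Sensitivity.REGULATED,
        retention_years=7,
    )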

Moving to classify data at this level means that one of the most important initial steps of any cloud computing migration must be a review and possible reclassification of all organizational data.  If this step is bypassed, newly migrated applications simply become data breaches in waiting.  At a minimum, an enterprise should (see the sketch after these steps):
  • Document all key business processes destined for cloud migration;
  • Identify all data types associated with each migrating business process;
  • Explicitly assign the role of “Process Data Owner” to appropriate individuals; and
  • Assign each “Process Data Owner” the task of setting and documenting the minimum required security controls for each data type.
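Those four steps amount to building and then auditing an inventory. A toy sketch, using entirely hypothetical process, owner, and control names, of how the completed inventory can be checked for migration readiness:

    # Hypothetical inventory: business process -> owner, data types, controls.
    processes = {
        "order-to-cash": {
            "owner": "jane.doe",                 # explicit Process Data Owner
            "data_types": {
                "customer PII": ["encryption-at-rest", "access-logging"],
                "payment records": ["tokenization", "encryption-in-transit"],
            },
        },
        "inventory-reporting": {
            "owner": None,                       # gap: no owner assigned yet
            "data_types": {"stock levels": []},  # gap: no controls documented
        },
    }

    def migration_gaps(inventory):
        """Flag processes not ready to migrate: missing owners or controls."""
        for process, info in inventory.items():
            if info["owner"] is None:
                yield f"{process}: no Process Data Owner assigned"
            for data_type, controls in info["data_types"].items():
                if not controls:
                    yield f"{process}/{data_type}: no minimum controls set"

    for gap in migration_gaps(processes):
        print(gap)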

After completing these steps, companies should review and update their IT governance process to reflect any required expansion of their corporate data classification model.  These steps are also aligned with the ISO/IEC 27034-1 framework for implementing cloud application security.  This standard explicitly takes a process approach to specifying, designing, developing, testing, implementing and maintaining security functions and controls in application systems.  It defines application security not as the state of security of an application system (the results of the process) but as “a process an organization can perform for applying controls and measurements to its applications in order to manage the risk of using them.”

In Part 3 of this series, I will discuss application screening and related industry best practices, including:
  • Determining the most appropriate target application deployment environment;
  • Determining each application's business value, key performance indicators and target return on investment;
  • Determining each application's migration readiness; and
  • Deciding the appropriate application migration strategy.



This post was brought to you by IBM Global Technology Services. For more content like this, visit ITBizAdvisor.



Cloud Musings
( Thank you. If you enjoyed this article, get free updates by email or RSS - © Copyright Kevin L. Jackson 2016-2018)



Wednesday, July 25, 2018

Skills Gulf Is Cloud’s Biggest Peril



Ian Moyse, Cloud Industry Thought Leader & Cloud Sales Director at Natterbox

Cloud is undoubtedly the driver of the new tech economy. Be it SaaS, PaaS, IaaS, public, private or hybrid clouds, e-commerce, IoT (Internet of Things) or Big Data, some iteration of cloud technology sits at the back of it. Technology is improving, and falling in cost, at such a speed that it is no longer the preserve of only the large firms; it can empower any organisation, from small to large, from startup to established, to revolutionise its customer offering and to elect to disrupt or be disrupted.

With this speed of technology change comes a need for those supporting the business to adapt quickly and adopt new methodologies, knowledge, and skills to empower a company to take advantage of these new possibilities: switching from Waterfall to Agile, from networking through virtualisation to Docker, from hosting to IaaS and PaaS, and from C through Java to Swift, Hack, and Dart.

A wide range of firms still rely on traditional IT infrastructure (locally deployed server applications and databases) despite the increasingly rapid rate at which companies are migrating to cloud-based systems.  Digital Transformation seems to be on the agenda of most enterprise organisations, bandied about as if it were a switch to flick and a fast thing to undertake. The reality is far different: accepting the change required, and having the skills at hand to achieve it, are barriers impeding a growing number of companies.

Change is hard to accept at the best of times, particularly if you have long been the subject-matter expert on a vendor or technology, only to find it being disrupted at pace and your worth diminishing, either in your own firm or in the general market. Being prepared to let go of many years of acquired skills and to re-start learning a whole range of new ones is hard, and many will resist, defending the status quo and hindering both business change and their own personal progress.

For companies moving applications and services to cloud platforms, migration challenges are among the top constraints affecting IT: there are no automated switchovers on offer, and customised internal or external migrations vary from mild to heavy development changes. For example, migrating a home-grown or proprietary application requires new application development and testing. If taken on with commitment, however, the move can provide faster, more agile application development through DevOps and utilisation of enhanced cloud features and APIs, leading to improved application lifecycle management.

With this, however, comes the need for professionals with the skills and knowledge of the chosen cloud platform to deliver the migration project in a structured and effective manner. Cloud continues to evolve quickly, and even those who adopted it a decade ago are finding they need to keep learning new skills. Witness the usage surge in containers: a recent Robin Systems survey found that 81% of organisations are planning to increase their use.

Big Data has introduced new approaches, tools and skills, and with expected growth of 60% per annum (IDC) it cannot be ignored. With increasing data volumes and continual crunching demands, databases are going to live in the cloud, demanding new platforms and approaches.

With the plethora of changes, from newly coded applications to architectures holding vast data stores in the cloud, greater cyber security expertise is an essential requirement. The human element is recognised as the most vulnerable area of security, so the introduction of so many new skill areas will increase the risk of new security exposures. Software developers in the cloud must understand, and treat with extreme caution, their increased responsibility for security assurance and compliance. With heightened awareness of security threats and breaches, and the introduction of the new GDPR (General Data Protection Regulation) in Europe with its far heftier and more damaging fines, getting this wrong is now going to be catastrophic. It is estimated that less than 5% of cloud applications are ready for GDPR, implying a vast breadth of enhancement work in a very short period.

The perfect storm circling all this comes from the expectation that 30-40% of the corporate workforce will retire in the next decade, combined with a reduction in the number of students taking relevant ICT subjects and in education’s capability to teach the required areas effectively. We have a rapidly increasing need for new technology skills (both to support new technologies and to support digital transformation from old to new) and a large percentage of those with technology backgrounds retiring rather than reskilling, backed by a shrinking capability in education to attract and educate to the level of need required.

Businesses now face pressures like never before!  Markets shift quicker; customers are more fickle and demanding; users are influenced by, or are themselves, millennials (who expect faster, quicker, easier, cheaper from the world they have grown up in); and disruption is all around from newborn firms that can attack with gusto, using all the new-world tech and methods, with no legacies to unchain themselves from.

Companies MUST have access to the skills required to employ the full scope of new tech on offer to their business advantage: to move old, creaking applications to newer form factors and to deliver a better quality of service and user experience, meeting the any-device, any-place, any-time demands of both their employees and their increasingly new breed of customer.
Unless the issue is addressed quickly, expect ‘Supply & Demand’ for these new skills to simultaneously implode and explode, creating a chasm between need and affordability, as those who can do become scarce and valuable commodities, available only to the few who can afford them!


You can follow Ian at www.ianmoyse.cloud

( This content is being syndicated through multiple channels. The opinions expressed are solely those of the author and do not represent the views of GovCloud Network, GovCloud Network Partners or any other corporation or organization.)





Cloud Musings

(Thank you. If you enjoyed this article, get free updates by email or RSS - © Copyright Kevin L. Jackson 2016-2018)



Wednesday, July 18, 2018

Could Budget Sweeps Fix Your Cybersecurity Problem?


A recent roundtable discussion in Washington, DC with Federal IT and Cyber leaders focused on the business drivers, challenges and evolving strategies around cybersecurity in government.  After an opening presentation by Jim Quinn, the lead systems engineer for the Continuous Diagnostics and Mitigation program at the Department of Homeland Security, the discussion highlighted the need for data security. Key takeaways included:
  • A new emphasis on data-level security across government that puts security controls closer to the data itself, rather than focusing on the perimeter (illustrated in the sketch after these takeaways).
  • The urgency around data security is increasing, with 71 percent of agencies having been breached, which is a threefold increase from three years ago.
  • A need to deal with an expanding requirement to add more and more capabilities to mission systems, with the understanding that protecting data is part of the mission.
  • Agencies that only focus their time, energy and budget on meeting various mandates are having trouble keeping up with evolving cyber threats.
  • While agencies have much flexibility in how they acquire, manage and deliver information and services, they are still responsible for protecting their data. Agencies must, therefore, approach data security at the enterprise level.
  • Data security is a matter of law. 44 U.S.C., Sec. 3542 directs agencies to ensure the confidentiality, integrity, and availability of government data.
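To make the first takeaway concrete: putting controls “closer to the data itself” can mean encrypting a sensitive field before it ever reaches storage, so protection travels with the record instead of depending on any one perimeter. Below is a minimal sketch using the widely available Python cryptography package; the record contents are invented, and a real deployment would draw the key from a key-management service rather than generating it inline.

    from cryptography.fernet import Fernet

    # For illustration only: production keys belong in a key-management service.
    key = Fernet.generate_key()
    fernet = Fernet(key)

    record = {"citizen_id": "123-45-6789", "request": "benefits inquiry"}

    # Encrypt the sensitive field itself, not just the disk or network around it.
    record["citizen_id"] = fernet.encrypt(record["citizen_id"].encode())

    # The record can now move across storage tiers or providers; only a holder
    # of the key can recover the protected field.
    plaintext = fernet.decrypt(record["citizen_id"]).decode()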

As I’ve written many times before, organizations need to focus on how to transition to a hybrid IT future.  The overall information technology marketplace is also undergoing these dramatic shifts toward data-centric security.  Data management has moved from the management of structured data into an environment where real-time analysis and reporting of streaming data is essential. 

International commerce is also entering an environment of stricter data management regulations and national data sovereignty laws that, if violated, introduce the possibility of punishing remedies and fines. This rapid progression has also driven a massive change in information technology services. Cloud and managed service providers are meeting this need through the innovative creation and deployment of API accessible, immediately consumable, data manipulation services. Enterprise IT organizations have shown themselves unable to keep pace with the blistering increase in the number and breadth of broader IT marketplace services.  It’s also not cost-effective or even desirable for them to try.

With the recent focus on data-level security and year-end budget sweeps around the corner, shouldn’t your agency be looking at how to better store and protect its data? Mandates around IT Modernization and Cloud Computing aren’t going away soon either.  With cloud and managed service provider data storage solutions so accessible, your current on-premises solution may be hurting your mission in many ways (a worked cost example follows the list), including:
  • High CAPEX driven by significant upfront equipment costs lead to poor ROIs with long payback periods;
  • High OPEX characterized by recurring power, cooling and rack space expenses;
  • Expensive monthly hardware and software maintenance and support fees;
  • Excessive system administration cost and complexity all lead to high ongoing operations expenses;
  • Obsolescence concerns caused by storage vendors that regularly retire products and discontinue support plans, often subjecting customers to costly and disruptive upgrades;
  • High mission operational risk due to an inability to replicate live data to a secondary data center; and
  • Complex legacy storage solutions that are difficult to configure and administer.
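A rough cumulative-cost comparison shows how the first few bullets play out over time. Every figure below is invented purely for illustration:

    # Invented figures, for illustration only.
    capex = 500_000        # upfront storage hardware and installation
    onprem_opex = 120_000  # per year: power, cooling, rack space, support fees
    cloud_opex = 180_000   # per year: equivalent cloud storage, no upfront cost

    for year in range(1, 11):
        onprem_total = capex + onprem_opex * year
        cloud_total = cloud_opex * year
        cheaper = "cloud" if cloud_total < onprem_total else "on-premises"
        print(f"year {year:2d}: on-prem {onprem_total:9,}  cloud {cloud_total:9,}  -> {cheaper}")

    # With these numbers, cloud stays cheaper until cumulative spend crosses
    # over near year 8 (capex / (cloud_opex - onprem_opex) = 8.3), and that is
    # before counting refresh cycles, admin time, or replication hardware.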

Take a minute to think about this. Maybe those year-end sweep dollars would be better spent on improving your mission readiness with a cloud storage solution like Wasabi. Wasabi is hot cloud storage. It’s being used to archive data or to hold a second copy because the price for storage on Wasabi is so low, and with no egress charges its pricing is predictable. It’s also durable, with eleven nines of durability, and offers immutability, so your data is protected from the most common causes of data loss.  Finally, Wasabi is high-performing: six times faster than its competitors. It’s easy to test by signing up for a free trial at wasabi.com.
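Because Wasabi exposes an S3-compatible API, testing it can be as simple as pointing an existing S3 client at a different endpoint. A minimal sketch with Python’s boto3; the bucket name, file, and credentials are placeholders, and you should confirm the endpoint for your account’s region against Wasabi’s documentation.

    import boto3

    # Placeholders: substitute your own keys and bucket.
    s3 = boto3.client(
        "s3",
        endpoint_url="https://s3.wasabisys.com",  # Wasabi's S3-compatible endpoint
        aws_access_key_id="YOUR_ACCESS_KEY",
        aws_secret_access_key="YOUR_SECRET_KEY",
    )

    # The same S3 calls you already use, now writing to Wasabi hot storage.
    with open("db-backup.dump", "rb") as f:
        s3.put_object(Bucket="my-archive", Key="backups/db-backup.dump", Body=f)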





This post was brought to you by Wasabi Hot Storage 



Cloud Musings
( Thank you. If you enjoyed this article, get free updates by email or RSS - © Copyright Kevin L. Jackson 2016-2018)



Monday, July 16, 2018

Cloud Migration Part One: An Overview




Business is all about efficiency and effectiveness.  In today’s world, however, those twin goals almost always lead to cloud migration.  This anecdotal observation is supported by Gartner, which sees worldwide public cloud service revenue jumping to over $300B by 2021.


Independent research from MarketsandMarkets echoes this expectation in its cloud migration services forecast, which sees this market subset growing from $3.17B in 2017 to $9.47B by 2022, at a Compound Annual Growth Rate (CAGR) of 24.5%.  With migration being such a high-priority activity, many organizations are looking for the most efficient and effective cloud migration strategy.
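The forecast is easy to sanity-check against the compound-growth formula, end = start × (1 + rate)^years: at 24.5% per year, $3.17B in 2017 compounds over five years to roughly the 2022 figure.

    start, rate, years = 3.17, 0.245, 5
    print(round(start * (1 + rate) ** years, 2))  # 9.48, matching the ~$9.47B forecast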
In addressing this question for thousands of customers worldwide, IBM Global Technology Services (GTS) has migrated applications in just about every industry.  These migrations have targeted global service providers like AWS and Azure, as well as regional and local ones.  The best practices GTS has honed through these experiences include:
  • Understanding and classifying business-critical data;
  • Executing an efficient process for screening and selecting applications for cloud migration;
  • Following a methodology for discovering the most effective strategy for each application migration; and
  • Selecting the most cost-effective and industry-aligned cloud service provider(s).
Experience has also shown that businesses are in different stages of their “Journey to the Cloud.”  These initial stages often include:
  • Planning and designing common foundational infrastructure services;
  • Deploying pattern- and template-based automation to public clouds;
  • Migrating workloads to the most appropriate cloud through a standardized, repeatable, tool-driven framework (sketched after this list);
  • Monitoring and managing workloads using standardized tools and processes aligned to cloud platforms; and
  • Governing, tracking, managing and optimizing cloud usage and spend.
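A “standardized, repeatable, tool-driven framework” is easiest to picture as a fixed pipeline that every workload passes through. The toy sketch below captures only the shape of that idea; each stage function is a hypothetical stand-in for real discovery, deployment, and governance tooling.

    # Hypothetical stand-ins for real discovery, deployment, and governance tools.
    def plan_foundation(w):  return True  # common infrastructure services designed
    def apply_templates(w):  return True  # pattern/template deployment prepared
    def migrate(w):          return True  # workload moved via standard tooling
    def monitor(w):          return True  # standardized monitoring attached
    def govern(w):           return True  # usage and spend tracked and optimized

    STAGES = [plan_foundation, apply_templates, migrate, monitor, govern]

    def run_pipeline(workload):
        """Every workload passes through the same gated, repeatable stages."""
        for stage in STAGES:
            if not stage(workload):
                return f"{workload}: stopped at {stage.__name__}"
        return f"{workload}: migrated and under governance"

    for workload in ["payroll", "order-entry", "reporting"]:
        print(run_pipeline(workload))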


These best practices and initial stages are common to the most successful cloud migration projects.

This series, presented in four weekly installments, lays out the details of how leading organizations have transformed themselves through cloud migration and how GTS has embedded industry best practices into its hybrid cloud service delivery model.  “Part Two: Classifying Organizational Data,” covers the identification of key business processes and their associated data types.  The article also outlines the importance of identifying process data owners and the required security controls for each data type.  “Part Three: Application Screening,” looks at determining the most appropriate target deployment environment, each application’s business benefit, key performance indicator options and target return on investment.  That segment also shows how to select the most appropriate migration strategy for each application.  “Part Four: Executing The Migration” presents experience-informed guidance on how to effectively and efficiently execute a cloud application migration strategy.  This segment includes selecting the most appropriate cloud service provider and technology services, reviewing and verifying available data security controls and suggested steps for SLA negotiations.  It also addresses business/mission model alignment, organizational change management, and migration project planning.

The series also presents the three most common cloud adoption paths for business, namely:
  • Innovation: Building cloud-native applications using the DevOps model;
  • Agility: Modernizing and migrating legacy applications and infrastructure to a native cloud model; and
  • Stability: Managing workloads and infrastructure in clouds and on premises.




This post was brought to you by IBM Global Technology Services. For more content like this, visit ITBizAdvisor.



Cloud Musings
( Thank you. If you enjoyed this article, get free updates by email or RSS - © Copyright Kevin L. Jackson 2016-2018)



Thursday, July 12, 2018

A Personal Technology for Good Redux: Call for Code


In 2013 I had the opportunity to manage a $2M demonstration of how cloud computing could be used to support natural disaster response. In that NCOIC Geospatial Community Cloud (GCC) demonstration, multiple regional clouds were managed using a cloud brokerage platform in a simulated response to a massive earthquake. Modeled after the disaster that struck Haiti in 2010, the project showed how interoperability and movement of data in an open, cloud-based infrastructure could be used to deliver a global, multidisciplinary disaster response capability. The relief simulation also showed government leaders how data sources from a variety of organizations, coupled with cloud technology, could improve capability and effectiveness while reducing cost, time and risk. These were critical lessons that, back then, I looked forward to maturing.
Now it’s 2018, and technology advances have continued to revolutionize our society.  The democratization of data and information has since changed our lives in many unexpected ways.  A sad fact, though, is that although some government leaders have tried, our global society has not yet found a way to institutionalize the lessons we learned back then.  While cloud computing continues to upend industry norms, the disaster response community is still stuck with antiquated processes and technologies.  This unfortunate reality is but one reason why I have decided to put my energy behind the “Call for Code” initiative.

IBM is the founding member of the Call for Code Global Initiative, which was created by David Clark, a renowned leader in cause-related initiatives. David Clark’s work includes iconic people and humanitarian organizations such as President Nelson Mandela, Muhammad Ali, Prince, the United Nations, Amnesty International, and The Anne Frank Center.  The Call for Code Global Challenge is designed to leverage technology for good by asking software developers to create solutions that significantly improve natural disaster preparedness and relief. The competition encourages developers who want to pay their skills forward on a specific mission to alleviate human suffering.  A broad cross-section of experts and humanitarian and international organizations is supporting this initiative, including the United Nations Human Rights Office and the American Red Cross’ international team. Participants will also benefit from the inaugural Call for Code Global Prize Event & Concert on October 13th, United Nations International Day for Disaster Reduction.  The initiative also boasts some star power: GRAMMY-nominated singer and human rights advocate Andra Day, whose 2015 global smash hit song "Rise Up" quickly became the voice for the voiceless, is leading a celebrity coalition.


Another motivation for joining this initiative was a recent video from Brad Kieserman, Red Cross Vice President of Disaster Operations and Logistics.  In that video, he highlighted the importance of visualizing data in a way that helps responders make better decisions about the movement of resources during a disaster.  His vision of using technology to address unmet disaster needs pointed out, for me, the value of the cloud as the application delivery platform and data repository: the same value proposition we proved back in 2013.

Over the next few months I will be blogging, tweeting, podcasting and vlogging on the many “Call for Code” activities and events.  Please join me in supporting this effort by retweeting, liking and reposting this content to your friends. 

Let’s all work together to help each other when disaster strikes.

This post was brought to you by IBM Developerworks. For more content like this, visit https://developer.ibm.com/code/



Cloud Musings
( Thank you. If you enjoyed this article, get free updates by email or RSS - © Copyright Kevin L. Jackson 2016-2018)



Thursday, May 31, 2018

A Path to Hybrid Cloud



Cloud computing is now an operational reality across every industry.  Organizations that fail to leverage this economic, operational and technology consumption model are merely consigning themselves to irrelevance.  The rapid acceleration of cloud adoption has now ignited a push for the Hybrid Cloud/Hybrid IT model, in which enterprises simultaneously consume information technology services from private clouds, public clouds, community clouds and traditional data center sources. While most see this as a reasonable evolutionary path, others see staying with a single provider or a slow, gradual transition as a more prudent path. I strongly disagree with the latter.

A casual observation of the information technology marketplace reveals that data is continuing to grow at an exponential pace. We have also moved from the management of structured data, through joint analysis of structured and unstructured data, into an environment where real-time analysis and reporting of streaming data is essential. We are also in an environment of stricter data management regulations and national data sovereignty laws that, if violated, introduce the possibility of punishing remedies and fines. This rapid progression has also driven an exponential increase in required (and desired) information technology services. Cloud service providers meet this need through the innovative creation and deployment of API-accessible, immediately consumable data manipulation services. Enterprise IT organizations have shown themselves incapable of matching the blistering increase in the number and breadth of these broader marketplace services.  It’s not cost-effective, or even desirable, for them to try.

Business owners, on the other hand, see these new services as necessary competitive tools.  They can’t wait for the required internal governance processes or IT investment decisions. This tension has been the cause of internal conflict between IT and business and also the underlying cause of Shadow IT, a tendency to stealthily procure and use cloud services without internal IT knowledge or approval. The organizational business goal must be accomplished and to meet this imperative, enterprise IT must drive a radical shift from legacy ideas and culture towards embracing the Hybrid Cloud/Hybrid IT model.

Enterprise IT management must face reality.  The development and rapid execution of a business-supportive IT strategy require a meaningful conversation between IT and business leaders on targeted new business opportunities and any associated differentiating business strategies.  IT leadership must then select the appropriate IT service mix and sources for each necessary business process. This multi-vendor, multi-source selection process should point to the needed Hybrid Cloud/Hybrid IT target end state. The path towards realizing that target should go through at least two pilot processes: one through which success delivers IT operational efficiency and savings, and a second that promises new revenue streams for the business. Ideally executed in parallel, this approach will:
  • Train and educate your IT team on the cloud model and required business processes;
  • Build much-needed rapport and collaboration between the business team and IT team;
  • Accelerate attainment of the Hybrid Cloud/Hybrid IT target end state; and
  • Effectively move the organization down the necessary digital transformation path.


Enterprises that have been successful in completing this transformative process include:
  • CarMax - a Fortune® 500 company with more than 175 stores across the US and over 6 million cars sold;
  • IHG - one of the world’s leading hotel companies, with more than 375,000 people working across almost 100 countries to deliver True Hospitality for everyone; and
  • Smithfield Foods - the world’s largest pork processor and hog producer, committed to providing good food in a responsible way.

In completing their path to hybrid cloud, Smithfield Foods realized:
  • A drop in application response time from 600 ms to 70 ms;
  • No unplanned IT outages;
  • Increased visibility into business key performance indicators;
  • A transition from a reactive to a predictive decision making culture;
  • A 60% reduction in required IT resources; and
  • The desired enablement of business innovation.
To learn more about starting your company’s path towards the hybrid cloud, take a look at the Microsoft Office Modern Workplace episode on Hybrid Cloud.  In it, Julia White, Corporate Vice President of Azure Marketing at Microsoft, and Tim Crawford from AVOA address how organizations can build the right cloud strategy for the business and its impact on digital transformation.



This post was sponsored by Microsoft.



Cloud Musings
( Thank you. If you enjoyed this article, get free updates by email or RSS - © Copyright Kevin L. Jackson 2016-2018)



Tuesday, May 29, 2018

Human-Led Collaboration with Machines


When charged with managing large and complex efforts, an overarching project management task is risk assessment. It involves documenting the current situation, comparing it to the past, and understanding the odds of the past repeating itself. Since the past may never repeat itself, however, an insightful project manager also imagines the odds of every possible future outcome.  Then the odds of past outcomes repeating themselves, and the odds of new future outcomes, are tempered with the PM’s possible actions.  Executing this repetitive and continuous process is just one area where human-machine collaboration can change the future.

Machines do repetitive tasks well. They have perfect recall. Their forte is being able to record and document what has happened and from that, interpolate what will happen. They correlate the past and calculate the likelihood that those things will happen again. They interpolate and calculate the odds of what will happen in the future.
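At bottom, that record-correlate-calculate loop is frequency estimation. Here is a toy sketch of the machine’s side of the collaboration, with invented history and a deliberately simple smoothing choice, since the past may never repeat itself exactly:

    # Invented history: 1 = the risk materialized on a past project, 0 = it did not.
    history = [0, 1, 0, 0, 1, 0, 0, 0, 1, 0]

    def odds_of_repeat(events, smoothing=1):
        """Laplace-smoothed frequency: keeps the estimate away from 0 and 1."""
        hits = sum(events)
        return (hits + smoothing) / (len(events) + 2 * smoothing)

    print(f"estimated probability of recurrence: {odds_of_repeat(history):.2f}")  # 0.33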


Humans imagine things really well. While their recollection of the past can be flawed, their creativity can be breathtaking. They intuit and sometimes see things without those things actually being there. Even with these flaws, though, they can apply imagination to the whitespaces of reality and change the future. Those uniquely human capabilities need cause and structure, a skill referred to as common-sense reasoning.


Since machines, so far, have been unable to exhibit an ability to use common-sense reasoning, this observation becomes the heart of human-machine collaboration. Human-machine collaboration not only supports risk-assessment tasks but can also help in:
  • Resource management
  • Prediction
  • Experimentation.

By augmenting human workers with machine intelligence, the project manager can gain access to more and different analysis. More robust analysis enables more informed decisions, the anticipation of dependencies, and better leadership. Improved leadership is also why leading organizations have reshaped the use of rapid analysis, flexible organizations, and team communication tools.

Cisco Webex Teams was developed to support this shift. Focused on bridging the gap between humans and machines, it uses human priorities to plan and schedule tasks. Webex Teams can also be used to document resource levels, record resource use, and alert humans should any previously set limits be breached. Using artificial intelligence and machine learning, this collaborative tool can even provide schedule and planning option predictions.
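That alerting pattern maps directly onto Webex’s public REST API, which posts a message to a space with a single call. A minimal sketch in Python; the bot token, room ID, and resource figures are placeholders.

    import requests

    WEBEX_TOKEN = "YOUR_BOT_TOKEN"   # placeholder: a Webex bot access token
    ROOM_ID = "YOUR_TEAM_ROOM_ID"    # placeholder: the team's Webex space

    def alert_if_breached(resource, used, limit):
        """Post to the team space when a previously set limit is breached."""
        if used <= limit:
            return
        requests.post(
            "https://webexapis.com/v1/messages",
            headers={"Authorization": f"Bearer {WEBEX_TOKEN}"},
            json={"roomId": ROOM_ID,
                  "text": f"ALERT: {resource} at {used}/{limit} (limit breached)"},
        )

    alert_if_breached("contractor hours", used=430, limit=400)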

By enabling human-machine collaboration, Cisco Webex Teams not only sets a rapid pace towards the future but delivers some of that future today by:
  • Bringing team members together more easily through advanced messaging capabilities and content sharing.
  • Enhancing productivity during team-based meetings by allowing anyone in a space to schedule, start, and record meetings that can include up to 75 video users.
  • Providing the capability to share a whiteboard application or use Cisco Webex Board’s all-in-one wireless presentation, digital whiteboarding, and video conferencing functionalities.
  • Calling team members using the app, an IP phone, or a conference-room video device.
  • Reducing meeting setup friction with integrations to streamline workflows and bots to automate additional actions.

Cisco Webex Teams enables human-led machine collaboration, a partnership in which humans set the strategy and machines execute the tactics.

Read more in the series:

Welcome the New Project Manager!


Building A Collaborative Team

Artificial Intelligence and the Project Manager


This post is brought to you by Cisco and IDG. The views and opinions expressed herein are those of the author and do not necessarily represent the views and opinions of Cisco. 






Cloud Musings
( Thank you. If you enjoyed this article, get free updates by email or RSS - © Copyright Kevin L. Jackson 2016-2018)



Monday, May 28, 2018

Sensomorphic


240 million results are returned in 1.06 seconds (as of May 28, 2018) when you search Google for “cloud computing.” With that much information available, and that many conversations active around the globe:
  • Do we really know what cloud is?
  • Are we confident in knowing what cloud can do?
  • Can we explain why the cloud is changing everything?
If 10 people were asked what cloud computing is and why it is important, we would get at least 12 different answers.
  • Where is the disconnect?
We know leaders want it. CFOs support it. Strategists recommend it. Technical teams request it. Users demand it. Isn't cloud easy? Cloud is often associated with acceleration, cost control, added flexibility, increased agility, lower complexity, and rapid innovation. It takes an incredible amount of work and planning to be simple. CIOs are stating that cloud skills are a top hiring priority in 2018.
  • What do we need to stay relevant?
  • How do we keep up with an industry that is changing every day?
Cloud computing is changing strategies and enabling innovation at every turn. Cloud is changing IT economics. Cloud is blurring the lines and breaking down traditional silos. Cloud is blending roles and redefining boundaries. Regardless of which industry we are in, or the position we hold, cloud computing is changing everything; how we work, how we play, and how we communicate.

Cloud computing is a Transformation, not a Migration.

Migration seems easy because it can be described as a series of things that get done. Migrations seem tangible: from this to that, from here to there. Transformations, interestingly, are mental and emotional. Transformations require a change in mindset. Transformations require constant data that can be continuously compared to expose insights and establish perceived value.  Migrations are planned and executed. Transformations are adopted. Without adoption, transformation fails. Adoption requires a change in mindset, often created from a continuous digestion of highly valued relevant data and insight. This means continuously sensing the environment and continuously changing your actions to better align with goals, which are also changing continuously. We, the authors, call this being:

Sensomorphic.


Businesses and people tasked with adapting and driving change must become sensomorphic. Today, many are flooded with data, yet remain uninformed. Many know they are in the wrong place, yet struggle to know where they are. The only sustainable path for positive transformation is to become sensomorphic. In the world of cloud computing, this means being sensomorphic across many domains, simultaneously. The sensomorphic domains are:

Cloud adoption is a core component of digital transformation. Organizations must align modern technology and current economic models to business strategy. Transformation requires a new approach that balances cost and technology choices with company direction and client consumption models.



Architecting Cloud Computing Solutions presents and explains many critical cloud solution design considerations and technology decisions required to successfully consume the right cloud service and deployment models based on strategic, economic, and technology requirements. This book starts with the fundamentals of cloud computing and its architectural concepts. It then navigates through cloud service models (IaaS, PaaS, and SaaS), deployment models (public, private, community, and hybrid), and implementation options (Enterprise, MSP, and CSP). Each section exposes and discusses key considerations and challenges that organizations face during cloud migration. In later chapters, this book dives into how to leverage DevOps, Cloud-Native, and Serverless architectures in your cloud environment. Discussions include industry best practices for scaling your cloud environment as well as details for managing essential cloud technology service components such as data storage, security controls, and disaster recovery. By the end of this book, you will be well versed in all the design considerations and operational trades needed to adopt cloud services no matter which cloud service provider you choose.

About the authors:

Kevin L. Jackson is a globally recognized cloud computing expert, technology thought leader, and CEO/founder of GovCloud Network, LLC. Mr. Jackson’s commercial experience includes serving as a Vice President at J.P. Morgan Chase and as a Worldwide Sales Executive at IBM. He has deployed mission applications to the US Intelligence Community cloud computing environment (IC ITE), and he has authored and published several cloud computing courses and books. He is a Certified Information System Security Professional (CISSP) and Certified Cloud Security Professional (CCSP).

Scott Goessling is the COO/CTO for Burstorm, and he helped create the world’s first automated Cloud Solution Design platform. He has lived and worked in the Philippines, Japan, India, Mexico, France, and the US. Being an expert in many technologies, Scott also has been a part of several successful start-ups, including a network hardware innovator that was acquired for over $8B. Scott's perspectives combine many real-world experiences.

( This content is being syndicated through multiple channels. The opinions expressed are solely those of the author and do not represent the views of GovCloud Network, GovCloud Network Partners or any other corporation or organization.)





Cloud Musings
( Thank you. If you enjoyed this article, get free updates by email or RSS - © Copyright Kevin L. Jackson 2016-2018)



Friday, May 25, 2018

Artificial Intelligence and the Project Manager


Organizations use teams to create wealth, market share, customer service, competitive advantage, and organizational success. Effective teams accomplish their assigned end goals by engaging in collaboration as a joint learning activity. Enhanced effectiveness is why collaborative tools are so critical to the project manager, and 7 out of 10 IT professionals see collaboration as essential to their organization.

For an information worker operating within a modern team environment, finding information is relatively easy. Any team member can “google” to find just about anything.  High-performance teams, however, know how to work together to brainstorm and collaborate on discovering the right questions. That concept frames the future of collaboration and the project manager’s role. The most effective project managers use artificial intelligence (AI) to apply computational approaches to the collaborative social experience. In layman’s terms, that means using AI to discover the right questions. Research has shown this approach to be a more robust method of helping humans solve increasingly complex business problems.

As AI and collaboration technologies spread intelligence equally to every worker, machine learning technologies provide just-in-time custom learning based on team needs and organizational goals. Collaboration technology should, therefore, also help ease the challenge of connecting physically remote teams to each other. This critical function allows more interaction, more collective learning, more collaboration, and more team success. By embracing this new remote collaboration paradigm, project managers can:
  • Identify and engage critical talent independent of their location. This capability improves the manager’s ability to bring complementary skills into a collaborative environment with the broader team;  
  • Encourage and build healthy relationships with remote team members. Strong relationships are the heart of effective collaboration and leadership;
  • Present and communicate a guiding vision to the team. Providing clarity of purpose enhances collaboration;
  • Work with local and remote team members to jointly prepare a clear mission objective and define group rules of engagement;
  • Connect the project with higher level organizational objectives;
  • Create an atmosphere of safety, trust, and respect that, in turn, encourages multiple perspectives, diverse viewpoints, and creativity;
  • Make everyone’s ideas and suggestions visible and tangible by building prototypes, or drawing diagrams;
  • Provide an easy-to-use infrastructure that enables learning, communication, and collaboration;
  • Remove barriers to high performance by nurturing individual brilliance;
  • Coach for improved teamwork, emotional intelligence, and navigating difficult conversations;
  • Jointly celebrate joint accomplishments; and
  • Capture best practices and things that should be avoided.


These are all reasons why Cisco launched Webex Teams. This collaborative platform uses machine learning to present an intelligent and human-like conversational interface for any application or device. With this capability, project managers can eliminate the friction usually associated with remote team member communications. The solution also embeds Webex Assistant, an AI-enabled service for managing directory and scheduling information, designed to assist, participate, and take actions that support the project manager. Webex Assistant leverages the powerful Webex graph to access better information faster. In doing so, it essentially injects artificial intelligence into every team interaction.




This post is brought to you by Cisco and IDG. The views and opinions expressed herein are those of the author and do not necessarily represent the views and opinions of Cisco. 




Cloud Musings
( Thank you. If you enjoyed this article, get free updates by email or RSS - © Copyright Kevin L. Jackson 2016-2018)



Friday, May 18, 2018

Building A Collaborative Team



Recently, Harvard Business Review cited some insightful research into team behavior at 15 multinational companies. It found that although these teams tended to be large, virtual, diverse, and composed of highly educated specialists, those same four characteristics made it hard for teams to accomplish their goals. It also showed that members of complex teams were less likely (absent other influences) to share knowledge freely, learn from one another, shift workloads to break up bottlenecks, help one another complete jobs on time, or share resources. In other words, to collaborate. The study also looked at teams that exhibited high levels of collaborative behavior. The difference turned out to be in the quality of team leadership.

The eight factors that led to such leadership success were:
  1. Making highly visible investments in facilities that demonstrate their commitment to collaboration.
  2. Demonstrating leadership that models collaborative behavior.
  3. Mentoring and coaching, especially informally, in ways that help people build networks across corporate boundaries.
  4. Ensuring that collaboration skills have been taught to the team.
  5. Building and supporting a strong sense of community.
  6. Assigning team leaders that are both task- and relationship-oriented.
  7. Building on heritage relationships by putting at least a few people who know one another on the team.
  8. Sharply defining team roles and individually assigned tasks.  

This observation means project managers must set an environment that nurtures the exploration of open-ended thought and interactive collaboration. To accomplish this, team interactions cannot be just a series of point-in-time activities. The traditional team meeting must be replaced with continuous interaction and relationship building. To directly address this need, Cisco created the Emerge Engineering Team and TeamTV.




The Emerge Team works to create innovative technology that accelerates the future of work. Since collaboration will be so essential to success, they created TeamTV as a means of exploring the future of collaboration. This next-generation enterprise video collaborative platform integrates with and leverages the WebEx Teams digital collaboration suite. By creating a visually immersive and continuously interactive environment, they’ve discovered the immense value of having a space to interact daily with global teammates as if they were all in one office.
In addition to having a webcam filming the participants, TeamTV provides other useful collaboration tools including:
  • The “team mode” version of TeamTV with all members on-screen;
  • A “popcorn mode” where all members can watch an event or something communally across distances;
  • TeamTV channel ticker, where team-relevant information is available across the bottom of the screen; and
  • A virtual assistant bot with facial recognition technology capable of recognizing team members and serving up relevant email and instant messages. 

Building collaboration across an enterprise is not a quick job. It requires long-term relationship building and trust, a culture in which senior leaders openly exhibit cooperation, and smart near-term decisions on team formation. Legacy practices that may work well with simple, co-located teams are likely to fail when teams grow more complex. Although most factors that impede collaboration today have always been there, the modern teams needed to solve global business challenges require much more diversity, long-distance cooperation, and remote expertise. Project managers would, therefore, do well to update their approach to today’s business challenges by addressing the eight factors listed above.






Read more in the series:

Welcome the New Project Manager!



This post is brought to you by Cisco and IDG. The views and opinions expressed herein are those of the author and do not necessarily represent the views and opinions of Cisco. 




Cloud Musings
( Thank you. If you enjoyed this article, get free updates by email or RSS - © Copyright Kevin L. Jackson 2016-2018)