Friday, July 27, 2018

Cloud Migration Best Practice Part 2: Classifying Your Data



In my first post of this series, “Cloud Migration Part One: An Overview,” I provided a high-level summary of how enterprises should migrate applications to the cloud. In this installment, the focus is on enterprise data and why your organization may need to review and reclassify its data before moving anything to the cloud.

Cloud computing has done more than change the way enterprises consume information technology.  It is also changing how organizations need to protect their data.  Some may see this as an “unintended consequence,” but the headlong rush to save money by migrating applications to the cloud has simultaneously uncovered long-hidden application security issues.  This revelation is mostly due to the wide adoption of “Lift & Shift” as a cloud migration strategy.  That option typically precludes any modification of the migrating application.  It can also strip out essential data security controls and lead to serious data breaches.

While the good intentions of everyone involved are not in doubt, enterprise applications were traditionally developed for deployment into the organization’s own IT infrastructure.  That implicit assumption included reliance on infrastructure-based security controls to protect organizational data.  These generally accepted industry practices were coupled with a cultural propensity to err on the side of caution by protecting most data at generally high levels.  During an implementation, organizations typically used a two-level (sensitive and non-sensitive) or, at most, a four-level data classification model.

Today, the cloud has quickly become the preferred deployment environment for enterprise applications.  This shift to using “other people’s infrastructure” has brought with it tremendous variability in the nature and quality of infrastructure-based data security controls.  It is also forcing companies to shift away from infrastructure-centric security toward data-centric information security models.  Expanding international electronic commerce, ever-tightening national data sovereignty laws, and regional data protection and privacy regulations (e.g., GDPR) have combined to make such simple data classification schemas generally untenable.  In fact, both the Cloud Security Alliance and the International Information Systems Security Certification Consortium (ISC2) suggest that corporate data may need to be classified across at least eight categories (a minimal tagging sketch follows the list), namely:
  • Data type (format, structure)
  • Jurisdiction and other legal constraints
  • Context
  • Ownership
  • Contractual or business constraints
  • Trust levels and source of origin
  • Value, sensitivity, and criticality
  • The obligation for retention and preservation
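
To make these categories concrete, here is a minimal Python sketch of tagging a single data set across the eight dimensions above. This is not an official CSA or ISC2 schema; the class, field names, and sample values are illustrative assumptions only.

```python
# A minimal, hypothetical tagging structure for the eight classification
# dimensions discussed above (not an official CSA or ISC2 schema).
from dataclasses import dataclass

@dataclass
class DataClassification:
    data_type: str                # format / structure, e.g. "structured PII"
    jurisdiction: str             # legal constraints, e.g. "EU (GDPR)"
    context: str                  # business context in which the data is used
    ownership: str                # accountable owner of the data
    contractual_constraints: str  # contractual or business constraints
    trust_level: str              # trust level and source of origin
    sensitivity: str              # value, sensitivity, and criticality
    retention: str                # obligation for retention and preservation

# Example classification for a hypothetical customer-order data set.
customer_orders = DataClassification(
    data_type="structured PII",
    jurisdiction="EU (GDPR)",
    context="e-commerce order history",
    ownership="Sales",
    contractual_constraints="processor agreement with payment provider",
    trust_level="first-party, authenticated customers",
    sensitivity="high",
    retention="7 years",
)
```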

Moving to classify data at this level means that one of the most important initial steps of any cloud computing migration must be a review and possible reclassification of all organizational data.  Bypassing this step leaves newly migrated applications as data breaches waiting to happen.  At a minimum, an enterprise should (a sketch of the resulting registry follows this list):
  • Document all key business processes destined for cloud migration;
  • Identify all data types associated with each migrating business process;
  • Explicitly assign the role of “Process Data Owner” to appropriate individuals; and
  • Assign each “Process Data Owner” the task of setting and documenting the minimum required security controls for each data type.
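
The sketch below, again entirely hypothetical, shows the kind of registry those four steps would produce: each migrating business process mapped to its data types, a named Process Data Owner, and the minimum security controls that owner has documented for each data type.

```python
# A hypothetical registry produced by the four steps above. Process names,
# owners, data types, and controls are placeholder examples.
process_registry = {
    "customer-onboarding": {
        "data_types": ["contact-details", "identity-documents"],
        "process_data_owner": "jane.doe@example.com",
        "minimum_controls": {
            "contact-details": ["encryption-at-rest", "role-based-access"],
            "identity-documents": ["encryption-at-rest", "encryption-in-transit",
                                   "audit-logging", "eu-data-residency"],
        },
    },
}

def controls_for(process: str, data_type: str) -> list:
    """Look up the documented minimum controls before migrating a workload."""
    return process_registry[process]["minimum_controls"][data_type]

print(controls_for("customer-onboarding", "identity-documents"))
```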

After completing these steps, companies should review and update their IT governance process to reflect any required expansion of their corporate data classification model.  These steps also align with the ISO/IEC 27034-1 framework for implementing cloud application security.  This standard explicitly takes a process approach to specifying, designing, developing, testing, implementing and maintaining security functions and controls in application systems.  It defines application security not as the state of security of an application system (the results of the process) but as “a process an organization can perform for applying controls and measurements to its applications in order to manage the risk of using them.”

In Part 3 of this series, I will discuss application screening and related industry best practices, including:
  • Determining the most appropriate target application deployment environment;
  • Determining each application's business value, key performance indicators and target return on investment;
  • Determining each application's migration readiness; and
  • Deciding the appropriate application migration strategy.



This post was brought to you by IBM Global Technology Services. For more content like this, visit ITBizAdvisor.



Cloud Musings
(Thank you. If you enjoyed this article, get free updates by email or RSS - © Copyright Kevin L. Jackson 2016-2018)



Wednesday, July 25, 2018

Skills Gulf Is Cloud’s Biggest Peril



Ian Moyse, Cloud Industry Thought Leader & Cloud Sales Director at Natterbox

Cloud is undoubtedly the driver of the new tech economy. Be it SaaS, PaaS, IaaS, public, private or hybrid clouds, e-commerce, IoT (Internet of Things) or Big Data, some form of cloud technology sits behind it. Technology is improving in capability and falling in cost at such a speed that it is no longer the preserve of only the large firms; it can empower any organisation, from small to large, from startup to established, to revolutionise its customer offering and to elect to disrupt or be disrupted.

With this speed of technology change comes a need for those supporting the business to adapt quickly and adopt new methodologies, knowledge, and skills that let the company take advantage of these new possibilities: switching from Waterfall to Agile, from networking to virtualisation to Docker, from hosting to IaaS and PaaS, and from C, through Java, to Swift, Hack, and Dart.

A wide range of firms still rely on traditional IT infrastructure (locally deployed server applications and databases) despite the increasingly rapid rate at which companies are migrating to cloud-based systems.  Digital transformation seems to be on the agenda of most enterprise organisations, bandied about as if it were a switch to flick and a quick thing to undertake. The reality is very different: accepting the change required, and having the skills at hand to achieve it, are barriers impeding a growing number of companies.

Change is hard to accept at the best of times, particularly if you have long been the subject-matter expert on a vendor or technology, only to find it being disrupted at pace and your worth diminishing, either in your own firm or in the wider market. Being prepared to let go of many years of acquired skills and to start again, learning a whole new range of skills, is hard, and many will resist, defending the status quo and hindering both business change and their own personal progress.

For companies moving applications and services to cloud platforms, migration challenges are one of the top constraints affecting IT: there are no automated switchovers on offer, and customised internal or external migrations can require anything from mild to heavy development changes. For example, migrating a home-grown or proprietary application requires new application development and testing. However, if taken on with commitment, the move can provide faster, more agile application development through DevOps and the use of enhanced cloud features and APIs, leading to improved application lifecycle management.

However, with this comes the need for professionals with the skills and knowledge of the chosen cloud platform to deliver the migration project in a structured and effective manner. Cloud continues to evolve quickly, and even those who adopted it a decade ago are finding they need to keep learning new skills. Take the surge in container usage: a recent Robin Systems survey found that 81% of organisations plan to increase their use of containers.

Big Data has introduced new approaches, tools and skills, and with an expected 60% per annum growth (IDC) it cannot be ignored. With increasing data volumes and continual processing demands, databases are going to live in the cloud and will demand new platforms and approaches.

With the plethora of changes from newly coded applications and architectures holding vast data stores in the cloud, greater cyber security expertise is now an essential requirement. With the human element recognised as the most vulnerable area of security, the introduction of so many new skill areas will increase the risk of new security exposures. Software developers working in the cloud must understand, and treat with extreme caution, their increased responsibility for security assurance and compliance. With the heightened awareness of security threats and breaches, and the introduction of the new GDPR (General Data Protection Regulation) in Europe with its far heftier and more damaging fines, getting this wrong is now going to be catastrophic. It is estimated that less than 5% of cloud applications are ready for GDPR, leaving a vast breadth of enhancement work to be done in a very short period.

The perfect storm circling this comes from the expectation that 30-40% of the corporate workforce will retire in the next decade, combined with a reduction in the number of people studying relevant ICT subjects and in education's capability to teach the required areas effectively. We have a rapidly increasing need for new technology skills (both to support new technologies and to support digital transformation from old to new), a large percentage of those with technology backgrounds retiring rather than reskilling, and an education system struggling to attract and educate people to the level required.

Businesses now face pressures like never before: markets that shift more quickly; more fickle and demanding customers; users who are influenced by, or are themselves, millennials (who expect faster, easier and cheaper from the world they have grown up in); and disruption all around them from newly born firms that can attack with the gusto of new-world technology and methods, with no legacy to unchain themselves from.

Companies MUST have access to the skills required to employ the full scope of new technology on offer to their business advantage: to move old, creaking applications to newer form factors and to deliver a better quality of service and user experience, meeting the demands of any-device, any-place, any-time working for both their employees and their increasingly new breed of customer.
Unless the issue is addressed quickly, you can expect supply and demand for these new skills to simultaneously implode and explode, creating a chasm between need and affordability, as those who can deliver become scarce and valuable commodities, available only to the few who can afford them.


You can follow Ian at www.ianmoyse.cloud

( This content is being syndicated through multiple channels. The opinions expressed are solely those of the author and do not represent the views of GovCloud Network, GovCloud Network Partners or any other corporation or organization.)





Cloud Musings

(Thank you. If you enjoyed this article, get free updates by email or RSS - © Copyright Kevin L. Jackson 2016-2018)



Wednesday, July 18, 2018

Could Budget Sweeps Fix Your Cybersecurity Problem?


A recent roundtable discussion in Washington, DC with Federal IT and Cyber leaders focused on the business drivers, challenges and evolving strategies around cybersecurity in government.  After an opening presentation by Jim Quinn, the lead systems engineer for the Continuous Diagnostics and Mitigation program at the Department of Homeland Security, the discussion highlighted the need for data security. Key takeaways included:
  • A new emphasis on data-level security across government that puts security controls closer to the data itself, rather than focusing on the perimeter.
  • The urgency around data security is increasing, with 71 percent of agencies having been breached, which is a threefold increase from three years ago.
  • Agencies need to deal with an expanding requirement to add more and more capabilities to mission systems, with the understanding that protecting data is part of the mission.
  • Agencies that only focus their time, energy and budget on meeting various mandates are having trouble keeping up with evolving cyber threats.
  • While agencies have much flexibility in how they acquire, manage and deliver information and services, they are still responsible for protecting their data. Agencies must, therefore, approach data security at the enterprise level.
  • Data security is a matter of law. 44 U.S.C., Sec. 3542 directs agencies to ensure the confidentiality, integrity, and availability of government data.

As I’ve written many times before, organizations need to focus on how to transition to a hybrid IT future.  The overall information technology marketplace is also undergoing these dramatic shifts toward data-centric security.  Data management has moved from the management of structured data into an environment where real-time analysis and reporting of streaming data is essential. 
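
As a minimal sketch of what data-centric (rather than perimeter-centric) protection can look like in code, the example below encrypts a record before it is stored anywhere, assuming Python's widely used cryptography library; the record contents are hypothetical.

```python
# Data-level security sketch: the protection travels with the record itself
# rather than relying on the surrounding infrastructure. Assumes the
# third-party "cryptography" package; record contents are hypothetical.
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # in practice, generate and hold keys in a KMS/HSM
cipher = Fernet(key)

record = b'{"ssn": "123-45-6789", "name": "Jane Doe"}'
token = cipher.encrypt(record)   # ciphertext is safe to place in any data store

assert cipher.decrypt(token) == record
```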

International commerce is also entering an environment of stricter data management regulations and national data sovereignty laws that, if violated, introduce the possibility of punishing remedies and fines. This rapid progression has also driven a massive change in information technology services. Cloud and managed service providers are meeting this need through the innovative creation and deployment of API accessible, immediately consumable, data manipulation services. Enterprise IT organizations have shown themselves unable to keep pace with the blistering increase in the number and breadth of broader IT marketplace services.  It’s also not cost-effective or even desirable for them to try.

With the recent focus on data-level security and year-end budget sweeps around the corner, shouldn’t your agency be looking at how to better store and protect its data? Mandates around IT Modernization and Cloud Computing aren’t going away soon either.  With cloud and managed service provider data storage solutions so accessible, your current on-premises solution may be hurting your mission in many ways (a rough cost comparison follows the list), including:
  • High CAPEX, driven by significant upfront equipment costs, leading to poor ROI and long payback periods;
  • High OPEX characterized by recurring power, cooling and rack space expenses;
  • Expensive monthly hardware and software maintenance and support fees;
  • Excessive system administration cost and complexity, leading to high ongoing operations expenses;
  • Obsolescence concerns caused by storage vendors that regularly retire products and discontinue support plans, often subjecting customers to costly and disruptive upgrades;
  • High mission operational risk due to an inability to replicate live data to a secondary data center; and
  • Complex legacy storage solutions that are difficult to configure and administer.
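
For a rough sense of how these factors add up against pay-as-you-go cloud storage, the sketch below runs an entirely hypothetical three-year comparison. Every figure is a placeholder assumption, not a quote or benchmark; substitute your own numbers.

```python
# Hypothetical three-year cost comparison. All figures are placeholder
# assumptions for illustration only.
TB_STORED = 100
YEARS = 3

# On-premises: upfront array purchase plus recurring power, cooling,
# rack space, and maintenance/support fees.
onprem_capex = 120_000                 # assumed array purchase and installation
onprem_opex_per_year = 30_000          # assumed recurring operating costs
onprem_total = onprem_capex + onprem_opex_per_year * YEARS

# Cloud object storage: flat per-TB monthly rate, archive-style usage assumed.
cloud_rate_per_tb_month = 6.0          # assumed list price in USD
cloud_total = cloud_rate_per_tb_month * TB_STORED * 12 * YEARS

print(f"Three-year on-premises estimate: ${onprem_total:,.0f}")
print(f"Three-year cloud storage estimate: ${cloud_total:,.0f}")
```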

Take a minute to think about this. Maybe those year-end sweep dollars would be better spent on improving your mission readiness with a cloud storage solution like Wasabi. Wasabi is hot cloud storage. It is being used to archive data or to hold a second copy, because its storage price is so low and it has made cloud storage pricing predictable with no egress charges. It’s also secure, with 11 nines of durability, and Wasabi offers immutability, so your data is protected from the most common causes of data loss.  Finally, Wasabi is high-performing: six times faster than its competitors. It’s easy to test by signing up for a free trial at wasabi.com.
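
Because Wasabi exposes an S3-compatible API, a trial can be exercised with standard S3 tooling. The sketch below assumes the Python boto3 client and the commonly documented s3.wasabisys.com service endpoint; the bucket name, credentials, and file path are placeholders.

```python
# Minimal sketch of archiving a second copy of a backup to S3-compatible
# object storage such as Wasabi. Endpoint, credentials, bucket, and file
# path are placeholder assumptions.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://s3.wasabisys.com",  # assumed Wasabi service endpoint
    aws_access_key_id="YOUR_ACCESS_KEY",
    aws_secret_access_key="YOUR_SECRET_KEY",
)

s3.upload_file("backups/2018-07-18.tar.gz",   # local file to archive
               "my-archive-bucket",           # hypothetical bucket
               "backups/2018-07-18.tar.gz")   # object key
```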





This post was brought to you by Wasabi Hot Storage 



Cloud Musings
(Thank you. If you enjoyed this article, get free updates by email or RSS - © Copyright Kevin L. Jackson 2016-2018)



Monday, July 16, 2018

Cloud Migration Part 1: An Overview




Business is all about efficiency and effectiveness.  In today’s world, however, those twin goals almost always lead to cloud migration.  This anecdotal observation is supported by Gartner, which sees worldwide public cloud service revenue jumping to over $300B by 2021.


Independent research from MarketsandMarkets echoes this expectation in its cloud migration services forecast, which sees this market subset growing from $3.17B in 2017 to $9.47B by 2022, a Compound Annual Growth Rate (CAGR) of 24.5%.  With migration being such a high-priority activity, many organizations are looking for the most efficient and effective cloud migration strategy.
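
As a quick sanity check on those figures, the snippet below confirms that roughly 24.5% compound annual growth takes $3.17B to about $9.47B over the five years from 2017 to 2022.

```python
# Verify the quoted forecast: CAGR implied by the 2017 and 2022 figures,
# and the value produced by compounding the stated 24.5% rate.
start, end, years = 3.17, 9.47, 5

implied_cagr = (end / start) ** (1 / years) - 1
projected = start * (1 + 0.245) ** years

print(f"Implied CAGR: {implied_cagr:.1%}")             # ~24.5%
print(f"2022 value at 24.5% CAGR: ${projected:.2f}B")  # ~$9.48B
```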
In answering this question for thousands of customers worldwide, IBM Global Technology Services (GTS) has migrated applications in just about every industry.  These migrations have targeted global service providers like AWS and Azure, as well as regional and local ones.  The best practices GTS has honed through these experiences include:
  • Understanding and classifying business-critical data;
  • Executing an efficient process for screening and selecting applications for cloud migration;
  • Following a methodology for discovering the most effective strategy for each application migration; and
  • Selecting the most cost-effective and industry-aligned cloud service provider(s).
Experience has also shown that businesses are in different stages of their “Journey to the Cloud.”  These initial stages often include:
  • Planning and designing common foundational infrastructure services;
  • Automating deployments to public clouds using patterns and templates (see the sketch after this list);
  • Migrating workloads to the most appropriate cloud through a standardized, repeatable, tool-driven framework;
  • Monitoring and managing workloads using standardized tools and processes aligned to cloud platforms; and
  • Governing, tracking, managing and optimizing cloud usage and spend.
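
As one illustration of template-driven automated deployment, the sketch below assumes an AWS target and the boto3 CloudFormation API; the template file, stack name, region, and parameters are hypothetical. Azure Resource Manager templates would play the equivalent role on that platform.

```python
# Hypothetical pattern/template-based deployment to AWS using CloudFormation.
# Template path, stack name, region, and parameters are placeholders.
import boto3

cfn = boto3.client("cloudformation", region_name="us-east-1")

with open("templates/three-tier-web.yaml") as f:  # reusable deployment pattern
    template_body = f.read()

cfn.create_stack(
    StackName="migrated-order-service",
    TemplateBody=template_body,
    Parameters=[{"ParameterKey": "Environment", "ParameterValue": "prod"}],
)
```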


These best practices and initial stages are common to the most successful cloud migration projects.

This series, presented in four weekly installments, lays out the details of how leading organizations have transformed themselves through cloud migration and how GTS has embedded industry best practices into its hybrid cloud service delivery model.  “Part Two: Classifying Organizational Data” covers the identification of key business processes and their associated data types.  The article also outlines the importance of identifying process data owners and the required security controls for each data type.  “Part Three: Application Screening” looks at determining the most appropriate target deployment environment, each application’s business benefit, key performance indicator options and target return on investment.  That segment also shows how to select the most appropriate migration strategy for each application.  “Part Four: Executing the Migration” presents experience-informed guidance on how to effectively and efficiently execute a cloud application migration strategy.  This segment includes selecting the most appropriate cloud service provider and technology services, reviewing and verifying available data security controls, and suggested steps for SLA negotiations.  It also addresses business/mission model alignment, organizational change management, and migration project planning.

The series also presents the three most common cloud adoption paths for business, namely:
  • Innovation: Building cloud-native applications using the DevOps model;
  • Agility: Modernizing and migrating legacy applications and infrastructure to a native cloud model; and
  • Stability: Managing workloads and infrastructure in clouds and on premises




This post was brought to you by IBM Global Technology Services. For more content like this, visit ITBizAdvisor.



Cloud Musings
(Thank you. If you enjoyed this article, get free updates by email or RSS - © Copyright Kevin L. Jackson 2016-2018)



Thursday, July 12, 2018

A Personal Technology for Good Redux: Call for Code


In 2013 I had the opportunity to manage a $2M demonstration of how cloud computing could be used to support natural disaster response. In that NCOIC Geospatial Community Cloud (GCC) demonstration, multiple regional clouds were managed using a cloud brokerage platform in a simulated response to a massive earthquake. Modeled after the disaster that struck Haiti in 2010, the project showed how interoperability and movement of data in an open, cloud-based infrastructure could be used to deliver a global, multidisciplinary disaster response capability. The relief simulation also showed government leaders how data sources from a variety of organizations, coupled with cloud technology, could improve capability and effectiveness while reducing cost, time and risk. These were critical lessons that, back then, I looked forward to maturing.
Now it’s 2018, and technology advances have continued to revolutionize our society.  The democratization of data and information has since changed our lives in many unexpected ways.  A sad fact, though, is that, although some government leaders have tried, our global society has not yet found a way to institutionalize the lessons we learned back then.  While cloud computing continues to upend industry norms, the disaster response community is still stuck with antiquated processes and technologies.  This unfortunate reality is but one reason why I have decided to put my energy behind the “Call for Code” initiative.

IBM is the founding member of the Call for Code Global Initiative, which was created by David Clark, a renowned leader in cause-related initiatives. David Clark's work includes iconic people and humanitarian organizations such as President Nelson Mandela, Muhammad Ali, Prince, the United Nations, Amnesty International, and The Anne Frank Center.  The Call for Code Global Challenge is designed to leverage technology for good by asking software developers to create solutions that significantly improve natural disaster preparedness and relief. The competition encourages developers who want to pay their skills forward to a specific mission: alleviating human suffering.  A broad cross-section of experts from humanitarian and international organizations is supporting this initiative, including the United Nations Human Rights Office and the American Red Cross’ international team. Those involved will also benefit from the inaugural Call for Code Global Prize Event & Concert on October 13th, the United Nations International Day for Disaster Reduction.  The initiative also boasts some star power: GRAMMY-nominated singer and human rights advocate Andra Day, whose 2015 global smash hit song "Rise Up" quickly became the voice for the voiceless, is leading a celebrity coalition.


Another motivation for joining this initiative was a recent video from Brad Kieserman, Red Cross Vice President of Disaster Operations and Logistics.  In that video, he highlighted the importance of visualizing data in a way that helps responders make better decisions about the movement of resources during a disaster.  His vision of using technology to address unmet disaster needs pointed out, for me, the value of the cloud as the application delivery platform and data repository: the same value proposition we proved back in 2013.

Brad Kieserman, Red Cross Vice President of Disaster Operations and Logistics
Over the next few months I will be blogging, tweeting, podcasting and vlogging on the many “Call for Code” activities and events.  Please join me in supporting this effort by retweeting, liking and reposting this content to your friends. 

Let’s all work together to help each other when disaster strikes.

This post was brought to you by IBM Developerworks. For more content like this, visit https://developer.ibm.com/code/



Cloud Musings
(Thank you. If you enjoyed this article, get free updates by email or RSS - © Copyright Kevin L. Jackson 2016-2018)