Tuesday, February 2, 2016

Hybrid Cloud Versus Hybrid IT: What’s the Hype?

(Originally posted on Point B and Beyond)



Once again, the boardroom is in a bitter battle over what edict its members will now levy on their hapless IT organization. On one hand, hybrid cloud is all the rage. Adopting this option promises all the cost savings of public cloud with the security and comfort of private cloud. This environment would not only check the box for meeting the cloud computing mandate, but also position the organization as innovative and industry-leading. Why wouldn’t a forward-leaning management team go all in with cloud?

On the other hand, hybrid IT appears to be the sensible choice for leveraging traditional data center investments. Data center investment business models always promise significant ROI within a fairly short time frame; if not, they wouldn’t have been approved. Shutting down such an expensive initiative early would be an untenable decision. Is this a better option than the hybrid cloud?

Hybrid Cloud Versus Hybrid IT

The difference between hybrid cloud and hybrid IT is more than just semantics. The hybrid cloud model is embraced by those entities and startups that don’t need to worry about past capital investments. These newer companies have more flexibility in exploring newer operational options.
Mature businesses, on the other hand, need to manage the transition to cloud without throwing away their valuable current infrastructure. They also deal more with organizational change management issues and possible employee skill set challenges. The new, bimodal IT model is also a concern for these enterprises, Forbes reported.

This is a tricky dilemma because both hybrid cloud and hybrid IT have been known to deliver some pretty significant advantages. Some of the biggest benefits of moving to an updated cloud or IT environment include:
  • Architectural flexibility that allows you to place business workloads where they make the most sense.
  • Retention of technical control by keeping the final decision on whether, when and where a multitenant IT environment is acceptable.
  • Staff discretion over the use of dedicated servers and network devices that can isolate or restrict access.
  • Tighter management control, which often translates into a better ability to satisfy auditors and meet compliance requirements.
  • Enhanced financial management because the company owns and funds the base configuration with a capex budget while simultaneously gaining the option to consume pay-as-you-go resources with opex funds for unanticipated spikes.
  • More technical stability through the use of dedicated servers for baseline performance and supplemental multitenant cloud servers when needed.
  • Enhanced operating system flexibility for testing and evaluation or providing technical customers the option to choose their preferred environment.
  • Promotion and support of innovation with an ability to spin up and tear down cloud servers quickly and easily for proof of concepts, pilots or software trials.
These hybrid advantages do need to be balanced with the operational challenges of moving to such a nontraditional environment. Things like automated resource provisioning/deprovisioning, cloud ecosystem management, dynamic service pricing and other IT service brokerage skills are new requirements for most organizations. This type of technology shift may also require many personnel and organizational changes.
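To make the financial-management and bursting points above concrete, here is a minimal sketch of the kind of placement logic an IT service broker might automate. The capacity figure, workload fields and function names are all hypothetical, not drawn from any specific cloud API.

```python
# Sketch of hybrid workload placement: baseline demand stays on owned
# (capex-funded) dedicated servers; unanticipated spikes burst to
# pay-as-you-go (opex-funded) multitenant cloud capacity.

DEDICATED_CAPACITY = 100  # requests/sec the owned baseline can absorb

def place_workload(demand, compliance_sensitive):
    """Return a placement plan: how much load goes where."""
    if compliance_sensitive:
        # Tighter management control: keep regulated workloads off
        # multitenant infrastructure entirely.
        return {"dedicated": demand, "cloud_burst": 0}
    on_prem = min(demand, DEDICATED_CAPACITY)
    return {"dedicated": on_prem, "cloud_burst": demand - on_prem}

# Normal day: everything fits on the owned baseline.
print(place_workload(80, compliance_sensitive=False))
# Spike: the overflow is provisioned in the public cloud on demand.
print(place_workload(240, compliance_sensitive=False))
```

Real brokerage logic would of course weigh data gravity, latency and per-provider pricing, but the shape of the decision is the same.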

How to Manage the Shift

Shifting to a hybrid anything comes down to evaluating and managing both traditional and cloud IT, balancing various on-premises and off-premises suppliers, and making dynamic choices about technology on the fly as the business requires new capabilities. All these tasks must be done simultaneously to achieve:
  • User and customer empowerment;
  • Application delivery optimization; and
  • Service-centric IT that accelerates business responsiveness.


So in the end, the hybrid hype is really about delivering business value. Within these models, technology becomes an ecosystem of providers, resources and tools. Interactions between old and new IT need to be devised, modeled, tested, implemented and improved.

Functionally, IT organizations need to manage the end-to-end IT service delivery model. They must be empowered to broker a set of IT services, some of which are on-premises and some of which are off-premises. The task of the IT organization is to offer internal and external customers the price, capacity and speed of provisioning of the external cloud while reducing IT service costs and maintaining the security and governance the company requires.





Cloud Musings
( Thank you. If you enjoyed this article, get free updates by email or RSS - © Copyright Kevin L. Jackson 2015)



Monday, January 4, 2016

What has NIST done for me lately?



According to a study, 82 percent of federal IT professionals surveyed reported that they were using the NIST (National Institute of Standards and Technology) cybersecurity framework to improve their security stance. The survey also demonstrated that the document is being used as a stepping stone to a more secure government. When I first read this, my immediate reaction was a resounding, “So what!” These results tell me that US Federal Government agencies are using US Government guidance to do their US Government job. Isn’t that what you would expect? Making an impression on me would require a study across multiple industry verticals. If other industries were voluntarily using the NIST Framework, that would be saying something!

Wouldn’t you know it, but such an independent study was actually conducted earlier this year. In March of 2015, the National Cybersecurity Institute did a study of Chief Information Security Officers across multiple industries. This survey not only looked into cybersecurity practices of the US government and military, but it also delved into the security practices of other verticals including Energy/Utilities, Consulting, Information Technology and Banking/Finance. When asked about the specific security standards or frameworks their organization used, 53.1 percent of the respondents cited NIST! This response level was higher than those recorded for the Information Technology Infrastructure Library (ITIL), ISO/IEC (International Organization for Standardization/International Electrotechnical Commission) standards, HIPAA (Health Insurance Portability and Accountability Act of 1996), COBIT (Control Objectives for Information and Related Technology) and CMM (Capability Maturity Model). Yes, that impressed me.



When Paul Christman, vice president of federal for Dell Software was interviewed by FedScoop, a leading online publication that covers the US Government market, he said the framework is “just good policy”.

“It applies to schools, universities, hospitals, [the Defense Department], [the Intelligence Community], and civilian agencies. The document doesn’t say ‘This is how the government should protect the government,’ ‘This is how a bank should protect a bank.’ NIST was really trying to say ‘This wasn’t a government program or mandate;’ it’s just good practice.” Kent Landfield, director of standards and technology policy at Intel Security, echoed this sentiment saying that his company was able to fit the NIST recommendations nicely into the information technology security evaluation process as a whole.

And as if NIST had planned to stage an encore performance, two new standards – 800-173, Cloud-Adapted Risk Management Framework: Guide for Applying the Risk Management Framework to Cloud-based Federal Information Systems, and 800-174, Security and Privacy Controls for Cloud-based Federal Information Systems – are currently being drafted for release. According to Dr. Michaela Iorga, senior security technical lead for cloud computing at NIST, these new frameworks are designed to overlay and elaborate upon already-existing standards that lay out the basics for cloud architecture and security. Iorga also suggests that federal organizations use FedRAMP, CSA’s Security, Trust and Assurance Registry (STAR), and other certification and authorization programs to make decisions about cloud computing.

With all this new knowledge in my head, I really gained a new appreciation for NIST. The agency seems to be really taking a lead on protecting cyberspace across the board. As the economic value of our collective digital economy gains in importance, this relatively small agency has placed itself at the vanguard of cybersecurity and is truly living up to its mission:

To promote U.S. innovation and industrial competitiveness by advancing measurement science, standards, and technology in ways that enhance economic security and improve our quality of life.


Thank you NIST!


(This post was written as part of the Dell Insight Partners program, which provides news and analysis about the evolving world of tech. Dell sponsored this article, but the opinions are my own and don’t necessarily represent Dell’s positions or strategies.)






Cloud Musings
( Thank you. If you enjoyed this article, get free updates by email or RSS - © Copyright Kevin L. Jackson 2016)

This blog has been verified by Rise: Re9cb99744b65eb009b71a970003c015e


Wednesday, December 16, 2015

Future Ready in the API economy


The world of business is software. No matter the industry vertical or business model, effective software is the key to business success. An even more important aspect of this reality is the application programming interface (API). If you are unfamiliar with this geeky term, APIs are the glue that connects applications to each other and manages the virtual discussions between you and your customers. APIs are also what enable the business agility and flexibility of a #FutureReady business.
“Application programming interfaces (APIs) have been elevated from a development technique to a business model driver and boardroom consideration. An organization’s core assets can be reused, shared, and monetized through APIs that can extend the reach of existing services or provide new revenue streams. APIs should be managed like a product—one built on top of a potentially complex technical footprint that includes legacy and third-party systems and data.”[1] 
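The “glue” role is easiest to see in a running example. The sketch below is a toy, assuming nothing beyond the Python standard library: one thread exposes an invented customer-status endpoint as a JSON API, and a client consumes it through HTTP rather than touching the service’s internals.

```python
# One process exposes a JSON endpoint; another piece of code consumes
# it. The endpoint path and payload are invented for illustration.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

class CustomerAPI(BaseHTTPRequestHandler):
    def do_GET(self):
        # The API contract: a JSON document, not shared internal state.
        body = json.dumps({"customer": "acme", "status": "active"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), CustomerAPI)  # port 0 = any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# The "virtual discussion": the client only knows the URL and the
# JSON shape, so either side can be replaced, scaled or monetized.
with urlopen(f"http://127.0.0.1:{server.server_port}/customers/acme") as resp:
    data = json.load(resp)
server.shutdown()
print(data)  # {'customer': 'acme', 'status': 'active'}
```

That decoupling, where each side depends only on the published interface, is what lets an API be managed like a product.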

APIs are also at the heart of cloud computing. They are used to provision, de-provision and scale the resources needed to run the software that drives your business. The importance of software, and by extension the APIs that drive software, is evident when you compare the top Dell Future Ready cities to the top software engineering and entrepreneurship capitals. Eight of the top 10

Friday, December 11, 2015

Teradata: Embrace the Power of PaaS




http://www.teradata.co.uk/cloud-overview/?LangType=2057&LangSelect=true









Platform-as-a-Service (PaaS) has always been the unappreciated sibling of the cloud computing service model trio.  Existing in the dark shadow of the most widely adopted Software-as-a-Service (SaaS) and foundationally powerful Infrastructure-as-a-Service (IaaS), the third service model is often misunderstood and widely ignored.

PaaS provides a platform allowing customers to develop, run, and manage web applications without the complexity of building and maintaining the underlying infrastructure. Its unique power lies in developing and deploying applications. Business value statements usually linked to PaaS include:

  • Organizations can innovate faster, turning new ideas into real applications more quickly.
  • Limited resources stay focused because much of the overhead of deploying and managing applications is eliminated.
  • Application development costs drop through economies of scale, enforced standardization and avoidance of the hidden costs of middleware misconfiguration.
  • Software development quality is enhanced because specialists constantly tune, optimize, load-balance and reconfigure PaaS components.
  • Application updates carry less risk and arrive sooner because you retain complete control over how updates reach your production applications.
  • Application uptime is maximized through better data backup, operating system hardening and high-availability deployments.
  • Global scalability becomes cost-efficient by leveraging platform experts whose scaling mechanisms respond to many customer types and situations.
  • Security is enhanced through continual security updates to individual PaaS stack components.
  • Overall project risk drops dramatically because both the cost and the ramifications of introducing new applications and services become predictable.
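The division of labor behind these benefits can be sketched in a few lines. This is a toy, not tied to any particular PaaS vendor: the developer supplies only the application callable (here under the standard WSGI convention), and everything a PaaS absorbs — runtime, load balancing, scaling, patching — is noted in comments.

```python
# What a developer hands to a PaaS: just the application code. The
# platform supplies the runtime, OS hardening, load balancing,
# scaling and security patching listed above.
from wsgiref.util import setup_testing_defaults

def application(environ, start_response):
    """A complete deployable web app under the WSGI convention."""
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"Hello from a PaaS-hosted app\n"]

# Locally you might smoke-test the callable with the standard library...
def call_app(app):
    environ = {}
    setup_testing_defaults(environ)  # fabricate a minimal request
    captured = {}
    def start_response(status, headers):
        captured["status"] = status
    body = b"".join(app(environ, start_response))
    return captured["status"], body

print(call_app(application))  # ('200 OK', b'Hello from a PaaS-hosted app\n')
# ...but on a PaaS you never run or patch a server yourself: the
# platform routes traffic to `application` and scales the instances.
```

Contrast this with IaaS, where the same app would arrive wrapped in an OS image, a web server and a load-balancer configuration that the customer must maintain.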


Figure 1- Through the "Enhanced Services" layer, the Teradata PaaS advantage delivers industry and business process aligned components.
When it comes to big data analytics, Teradata delivers these Platform-as-a-Service advantages through industry and business process aligned components within its PaaS. This valuable

Sunday, December 6, 2015

Why cloud changes everything

How is cloud computing bringing society and its ideas closer together?

This got me thinking. Last week the President of the United States started following me on Twitter. Now I realize that it’s not really President Obama on the other side of that virtual table, but the event brought to mind the Six Degrees of Separation concept.

In case you don’t remember, this theory states that everyone and everything is six or fewer steps away, by way of introduction, from any other person in the world. That random thought led me to wonder if all this social media has actually changed the theory so much that even I can be directly connected to the leader of the free world. A quick internet search revealed that the number of links between two random people has indeed decreased:


  • Facebook shows 4.74 degrees of separation: Scientists at Facebook and the University of Milan reported the average number of acquaintances separating any two people in the world was not six but 4.74. The experiment took a month and involved all of Facebook’s 721 million users.
  • Twitter shows 4.67 steps: In a study of 5.2 million Twitter friendships (friend and follower relationships), Sysomos found the most common friendship distance is five steps (the average distance is 4.67 steps). The second most common friendship distance is four steps.
  • LinkedIn is built on three degrees of separation: Although LinkedIn has yet to do a study of the magnitude of Facebook or Twitter, LinkedIn, perhaps more than any other social network, was built around the idea of degrees of separation. In LinkedIn’s case, the three degrees are:
    • You already know them
    • You know someone that knows them
    • You know someone that knows someone that knows them
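Degrees of separation are just shortest-path lengths in a friendship graph, which breadth-first search computes directly. The toy network below is invented for illustration; the “potus” node stands in for any distant figure.

```python
# Degrees of separation = shortest-path length in a friendship graph,
# found with breadth-first search over a hypothetical network.
from collections import deque

FRIENDS = {
    "you":   ["ana", "bob"],
    "ana":   ["you", "cho"],
    "bob":   ["you"],
    "cho":   ["ana", "potus"],
    "potus": ["cho"],
}

def degrees(graph, start, target):
    """Smallest number of friendship hops from start to target."""
    seen = {start: 0}          # person -> hops from start
    queue = deque([start])
    while queue:
        person = queue.popleft()
        if person == target:
            return seen[person]
        for friend in graph[person]:
            if friend not in seen:
                seen[friend] = seen[person] + 1
                queue.append(friend)
    return None                # the two people are not connected

print(degrees(FRIENDS, "you", "potus"))  # 3
```

The Facebook and Twitter figures quoted above are averages of exactly this quantity, computed over hundreds of millions of nodes rather than five.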
In fact, experiments have shown an average degree of separation of 3.43 between two random Twitter