Tag: cloud adoption

You are on AWS, Azure, or Google’s Cloud. But are you Transforming on the Cloud? | Blog

There is no questioning the ubiquity of cloud delivery models, whether private, public, or hybrid. The cloud has become a crucial technology delivery model across enterprises, and you would be hard-pressed to find an enterprise that has not adopted at least some sort of cloud service.

However, adopting the cloud and leveraging it to transform the business are very different things. In the Cloud 1.0 and Cloud 2.0 waves, most enterprises began their adoption journey with workload lift and shifts, reducing their CapEx and OpEx spend by 30-40 percent over the years. Enamored with these savings and believing their job was done, many stopped there. True, the complexity of the lifted-and-shifted workloads increased in the move from Cloud 1.0 to Cloud 2.0 (e.g., from web portals to collaboration platforms to even ERP systems), but it was still lift and shift, with only minor refactoring.

This fact demonstrates that most enterprises are, unfortunately, treating the cloud as just another hosting model, rather than a transformative platform.

Yet, a few forward-thinking enterprises are now challenging this status quo for the Cloud 3.0 wave. They plan to leverage the cloud as a transformative platform whose native services can be used not only to modernize the existing technology landscape but also to power cloud-based analytics, IoT-centric solutions, advanced architectures, and very heavy workloads. The key difference with these workloads is that they won’t just “reside” on the cloud; they will use the fundamental capabilities of the cloud model for perpetual transformation.

So, what does your enterprise need to do to follow their lead?

Of course, you need to start by building the business case for transformation. Once that is done, and you’ve taken care of the change management aspects, here are the three key technology-centric steps you need to follow:

Redo workloads on the cloud

Many monolithic applications, like data warehouses and sales applications, have already been ported to a cloud model. You need to break down the ones you use based on their importance and the extent of technical debt involved in the transformation needed. Many components may be taken out of the existing cloud and ported in-house or to other cloud platforms, based on the value they can deliver and their architectural complexity. Some components can leverage cloud-based functionalities (e.g., for data analytics) and drive further customer value. You also need to think about extending the functionality of these existing workloads to leverage newer cloud platform features, such as IoT-based data gathering and advanced authentication.
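To make this concrete, here is a minimal Python sketch of one such extension: an existing application pushing its events into a managed cloud streaming service (AWS Kinesis via boto3) so that cloud-native analytics can consume them downstream. The stream name, region, and event fields are illustrative assumptions, not a prescribed design.

```python
# Minimal sketch: an existing application emits events to a managed cloud
# ingestion service (AWS Kinesis) so downstream cloud-native analytics can
# consume them. Stream name, region, and event fields are hypothetical.
import json
import boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")

def publish_event(order_id: str, amount: float) -> None:
    """Send one application event to the hypothetical 'order-events' stream."""
    record = {"order_id": order_id, "amount": amount}
    kinesis.put_record(
        StreamName="order-events",          # hypothetical stream name
        Data=json.dumps(record).encode(),   # payload must be bytes
        PartitionKey=order_id,              # spreads records across shards
    )

if __name__ == "__main__":
    publish_event("ORD-1001", 249.99)
```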

Revisit new builds on the cloud

Our research suggests that only 27 percent of today’s enterprises are meaningfully building and deploying cloud-native workloads, i.e., workloads with self-scaling, tuning, replication, backup, high availability, and cloud-based API integration. You must proactively assess whether your enterprise needs cloud-native architectures to build out newer solutions. Of course, cloud native does not mean every module should leverage the cloud platform, but a healthy portion of the workload should have some elements of cloud adoption.
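As an illustration of the “self-scaling” element, the following is a minimal sketch, assuming an existing AWS Auto Scaling group, of attaching a target-tracking scaling policy with boto3 so the workload scales itself on average CPU utilization. The group name and target value are hypothetical.

```python
# Minimal sketch: attach a target-tracking scaling policy to an existing
# Auto Scaling group so the workload scales itself on average CPU load.
# The group name and target value are hypothetical.
import boto3

autoscaling = boto3.client("autoscaling", region_name="us-east-1")

autoscaling.put_scaling_policy(
    AutoScalingGroupName="web-tier-asg",    # hypothetical existing group
    PolicyName="cpu-target-tracking",
    PolicyType="TargetTrackingScaling",
    TargetTrackingConfiguration={
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "ASGAverageCPUUtilization"
        },
        "TargetValue": 50.0,                # keep average CPU near 50 percent
    },
)
```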

Relook development and IT operations on the cloud

Many enterprises overlook this part, believing the cloud’s inherent efficiency is enough to transform their operating model. Unfortunately, it does not work that way. For cloud-hosted or cloud-based development, you need to relook at your enterprise’s code pipelines, integrations, security, and various other aspects of IT operations. The best practices of the on-premise era continue to be relevant, albeit in a different form (such as tweaks to the established ITSM model). Your developers need to get comfortable with leveraging abstract APIs, rather than worrying about what is under the hood.
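To illustrate what “leveraging abstract APIs” can look like in practice, here is a minimal Python sketch, assuming an AWS environment, in which a build pipeline stores and retrieves an artifact through the managed S3 object-storage API without any knowledge of the infrastructure underneath. The bucket, key, and file names are hypothetical.

```python
# Minimal sketch: a developer persists a build artifact through an abstract
# object-storage API, with no knowledge of the disks, replication, or servers
# behind it. Bucket, key, and file names are hypothetical.
import boto3

s3 = boto3.client("s3")

# Store an artifact produced by the pipeline (hypothetical local build output).
s3.put_object(
    Bucket="release-artifacts-example",
    Key="builds/app-1.4.2.tar.gz",
    Body=open("app-1.4.2.tar.gz", "rb"),
)

# Later, any environment can fetch the same artifact by key.
s3.download_file(
    "release-artifacts-example",
    "builds/app-1.4.2.tar.gz",
    "/tmp/app-1.4.2.tar.gz",
)
```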

The Cloud 3.0 wave needs to leverage the cloud as a transformation platform instead of just another hosting model. Many enterprises limit their cloud journey to migration and transition; this needs to change going forward. Enterprises will also have to decide whether they will ever be able to build this many native services in their private clouds. The answer is probably not. Therefore, the strategic decision to leverage hybrid models will become even more important. Service partners will also need to enhance their offerings beyond migration, transformation during migration, and management; they need to drive continuous evolution of workloads once those are ported or built on the cloud.

Remember, the cloud itself is not magic. What makes it magical is the additional transformation you can derive beyond the cloud platform’s core capabilities.

What has been your experience in adopting cloud services? Please write to me at [email protected].

Hadoop and OpenStack – Is the Sheen Really Wearing off? | Sherpas in Blue Shirts

Despite Hadoop’s and OpenStack’s adoption, our recent discussions with enterprises and technology providers revealed two prominent trends:

  1. Big Data will need more than Hadoop: Along with NoSQL technologies, Hadoop has really taken the Big Data bull by the horns. Indications of a healthy ecosystem are apparent: MapR is witnessing 100 percent bookings growth, Cloudera expects to double in size, and Hortonworks is nearly doubling as well. However, the large vendors that really drive the enterprise market and mindset and sell multiple BI products, such as IBM, Microsoft, and Teradata, acknowledge that Hadoop’s quantifiable impact is as yet limited. Hadoop adoption continues on a project basis, rather than as a commitment toward improved business analytics. Broader enterprise-class adoption remains muted, despite meaningful investments and technology vendors’ focus.

  2. OpenStack is difficult, and enterprises still don’t get it: OpenStack’s vision of making every datacenter a cloud is facing some hurdles. Most enterprises find it hard to develop OpenStack-based clouds themselves. While this helps cloud providers pitch their OpenStack offerings, adoption is far from enterprise class. The OpenStack Foundation’s survey indicates that only approximately 15 percent of organizations utilizing OpenStack are outside the typical ICT industry or academia. Moreover, even cloud service providers, unless really dedicated to the OpenStack cause, are reluctant to invest meaningfully in it. Although most have an OpenStack offering or are planning to launch one, their willingness to push it to clients is subdued.

Why is this happening?

It’s easy to blame these challenges on open source and on contributors’ lack of a coherent strategy or vision. However, that oversimplifies the problem. Both Hadoop and OpenStack suffer from a lack of needed skills and applicability. For example, a few enterprises and vendors believe that Hadoop needs to become more “consumerized” to enable people with limited knowledge of coding, querying, or data manipulation to work with it. The current, esoteric mode of adoption is driving these users away, and the fundamental promise of new-age technologies making consumption easier is being defeated. Despite Hortonworks’ noble (and questioned) attempt to create an “OpenStack-type” alliance in the Open Data Platform, things have not moved smoothly. Apache Spark promises to improve Hadoop consumerization with fast processing and simple programming, but only time will tell.
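To illustrate the “simple programming” claim, here is a minimal PySpark sketch (using the modern DataFrame API) of the classic word count, a job that would otherwise require a hand-written MapReduce program. The input path is a placeholder assumption.

```python
# Minimal sketch: the classic word count, expressed in a few lines of PySpark
# instead of a hand-written MapReduce job. The HDFS path is hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import explode, split, col

spark = SparkSession.builder.appName("wordcount-sketch").getOrCreate()

lines = spark.read.text("hdfs:///data/sample.txt")   # placeholder input path
counts = (
    lines.select(explode(split(col("value"), r"\s+")).alias("word"))
         .groupBy("word")
         .count()
         .orderBy(col("count").desc())
)
counts.show(20)   # print the 20 most frequent words

spark.stop()
```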

OpenStack continues to struggle with a “too tough to deploy” perception within enterprises. Beyond this, there are commercial reasons for the challenges OpenStack is witnessing. Though there are OpenStack-only cloud providers (e.g., Blue Box and Mirantis), most other cloud service providers we have spoken with are only half-heartedly willing to develop and sell OpenStack-based cloud services. Cloud providers with offerings across technologies (such as BMC, CloudStack, OpenStack, and VMware) believe they have to create sales incentives and possibly hire different engineering talent to build cloud services on OpenStack. Many of them believe this is not worth the risk, as they can acquire an “OpenStack-only” cloud provider if real demand arises (as I write, news has arrived that IBM is acquiring Blue Box and Cisco is acquiring Piston Cloud).

Now what?

The success of both Hadoop and OpenStack will depend on simplification in development, implementation, and usage. Hadoop’s challenges lie both in the way enterprises adopt it and in the technology itself. Targeting a complex problem first is the de facto approach for most enterprises, without realizing that it takes time to get the data clearances from the business; this hurts the business’s perception of the value Hadoop can bring. Hadoop’s success will depend not on point solutions developed to store and crunch data, but on the entire value chain of data creation and consumption. That entire process needs to be simplified for more enterprises to adopt it. Hadoop and its key vendors need to move beyond their Web 2.0 obsession to focus on other enterprises. With the increasing focus on real-time technologies, Hadoop should get a further leg up. However, it needs to integrate more closely with existing enterprise investments, rather than becoming a silo. Though still in its infancy, the concept of an “Enterprise Data Hub,” wherein the entire value chain of Big Data-related technologies integrates to deliver the needed service, is something to note.

As for OpenStack, enterprises do not like that they currently require too much external support to adopt it in their internal clouds. If the drop in investments is any indication, this will not take OpenStack very far. Cloud providers want enterprises to consume OpenStack-based cloud services. However, enterprises really want to understand a technology to which they are making a long-term commitment, and they are cautious of anything that requires significant reskilling or has the potential to become a bottleneck in their standardization initiatives. OpenStack must address these challenges. Though most enterprise technologies are tough to consume, the market is definitely moving toward easier deployments and upgrades. Therefore, to really make OpenStack an enterprise-grade offering, its deployment, professional support, knowledge management, and requisite skills must be simplified.

What do you think about Hadoop and OpenStack? Feel free to reach out to me on [email protected].



Implications of the Enterprise Strategic Intent Shift toward Cloud | Sherpas in Blue Shirts

Since the beginning of 2014, Everest Group has seen a real shift in large enterprise CIO organizations’ strategic intent toward cloud services. What are the implications of this shift for the traditional infrastructure outsourcing market?

Timing

First, we expect that this shift will not happen overnight. As organizations work on their cloud plans, it’s clear that migrating some or all of their environment into this next-generation model is a three-to-five-year journey.

Runoff of work from legacy environments

Second, we expect the runoff on traditional outsourced contracts to accelerate. The runoff has been running at about 5-10 percent a year. We expect this to pick up, with something close to 50 percent of workloads shifting over to the cloud in the next three years, and 30 percent of that shift happening in the next two years.

So this is a dramatic runoff of work from legacy environments into the next-generation models. This will put significant pressure on the incumbent service providers in that space.

Who will be the likely winners?

The third implication is the likely winners from this strategic shift. We think that at least for the next two years the Indian players or those with a remote infrastructure management (RIM) model will enjoy substantial benefits. Often a move to cloud or next-generation technologies can be facilitated by a move to a RIM model. So we see RIM continuing its torrid growth.

We also believe the providers with enterprise-quality cloud offerings will be players. One that particularly comes to mind is IBM’s SoftLayer, which we think is well positioned for the shift. It has its own runoff and can grab share from asset-heavy or other legacy providers as runoff occurs there.

We expect to see Microsoft and its Azure platform play an increasingly prominent role in cloud services. It will be interesting to see if AWS, Google, and Microsoft can make the shift from serving rogue IT and business users to enterprise IT. At this time we certainly believe IBM can. And it looks like Microsoft is making deliberate efforts to transition its model. It remains to be seen if AWS and Google are willing to shift their models to better accommodate enterprise IT.


