
Will Cloud Kill The CIO? Survey Says No | Gaining Altitude in the Cloud

Sometimes it’s hard to distinguish fact from hype in enterprise cloud adoption. This is why Everest Group and Cloud Connect continue to conduct our annual joint survey to understand why and how enterprises are migrating to the cloud, and what they are migrating. Check out my blog on InformationWeek for more findings from the Enterprise Cloud Adoption Survey. Here’s an excerpt:

If supermarket tabloids covered enterprise cloud adoption, their headlines would scream “The CIO is Dead,” “Security Concerns are Old News,” and “Cloud Makes Consumption Easy—No External Help Required.” And as we perused these headlines in the checkout line, we would wonder how much truth lay behind the hype.

To distill fact from fiction, the Everest Group launched the Enterprise Cloud Adoption Survey in 2012, in conjunction with Cloud Connect and UBM TechWeb. We have just completed the third annual survey of enterprises and vendors and will share the results in Las Vegas on Monday, March 31, at Cloud Connect Summit, co-located with Interop.

Read more on InformationWeek

You can also download the full survey summary report here.

CSS Corp Cloud Services Making an Impact

For several years we’ve predicted that the cloud would disrupt data centers. But it’s not as simple as lift and shift; it requires an understanding of how to deploy in the cloud and also requires some reengineering. Now some innovators are succeeding in deployment solutions and achieving momentum. One that caught our eye: the explosive growth of CSS Corp Cloud Services.

CSS is off and running. Need evidence? Among other clients, they’re currently working with:

  • 12 Fortune 100 companies
  • 12% of the top 50 U.S. companies
  • 16% of the world’s 25 largest banks
  • 5 of the world’s largest Fortune 350 manufacturing companies

The issue with using any public cloud — especially the lowest-cost cloud, AWS — is that applications must be re-architected to achieve enterprise-grade performance. This has been a significant constraint on the migration of workloads to public clouds.

But we’re now seeing real use cases emerging where companies systematically take production workloads, reengineer them and deploy them into the public cloud in a way that gives them production-quality outcomes — that is, high performance and high resilience.

CSS Corp Cloud Services was an early AWS adopter. The firm invested in toolsets around AWS cloud services and developed a capability for consistently re-architecting and deploying into the AWS public cloud. The company’s public-cloud use cases already span a wide area of processes including Big Data analytics, digital marketing, e-commerce, backup and storage, disaster recovery, application and Web hosting, development and test environments and media/entertainment.

Data centers are not yet an endangered species. But as firms such as CSS master cloud deployment in large corporate enterprises, we believe the rate of disruption will quickly pick up.

IBM Prepares to Deliver Consumption Based, As-a-Service Offerings

IBM in late February launched BlueMix, a billion-dollar investment in a Platform-as-a-Service (PaaS) cloud built on its recent empire-building acquisition of SoftLayer. A TBR analyst says IBM’s as-a-service moves are changing the company’s DNA. My opinion? It’s a lot more significant than that! If Big Blue can integrate all its services in a true consumption-based model, it could set the standard for the new service model industry-wide. The question is: Is this the way of the future?

The dilemma

Here’s the dilemma that faces the market and motivates IBM’s strategy. As I’ve blogged before, customers want consumption-based services. This means:

  • They only want to pay for what they use. They don’t want to pre-commit to volumes because they overpay when they do that. When they use a lot, they’re prepared to pay a lot; when they use a little, they only want to pay a little.
  • They want providers to make it easy to adopt their services. They don’t want big road maps and huge implementation schedules. Easy on, easy off is what they desire.

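The trade-off in the first bullet — overpaying under volume pre-commitments versus paying only for actual use — can be sketched with a toy billing model. All rates and usage figures below are hypothetical, purely for illustration:

```python
# Toy comparison of pre-committed (take-or-pay) vs. consumption-based billing.
# All rates and volumes are made-up numbers for illustration only.

committed_units = 1000   # capacity the customer must pay for regardless of use
committed_rate = 0.08    # $ per unit under a volume pre-commitment
on_demand_rate = 0.10    # $ per unit, pay-as-you-go (typically priced higher)

def committed_cost(units_used):
    # Take-or-pay: the bill never drops below the committed volume.
    return max(units_used, committed_units) * committed_rate

def consumption_cost(units_used):
    # Pay only for what is actually used.
    return units_used * on_demand_rate

for usage in (200, 1000, 2500):
    print(usage, committed_cost(usage), consumption_cost(usage))
```

At low usage (200 units) the committed customer pays $80 for $20 worth of consumption; only at sustained high volume does the pre-commitment win. That asymmetry is exactly why customers "don’t want to pre-commit to volumes."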
We see this fundamental desire for consumption-based services across all service lines. But it leaves traditional service providers hamstrung in meeting customer demands.

There are two routes to the shift to consumption-based services:

  • Service delivery in a multi-tenant world — one platform and all customers use the same thing
  • Supply chain — completely integrate a consumption-based supply chain

The problem with the multi-tenant remedy

The problem is that the multi-tenant path does everything for providers but nothing for customers. Larger, sophisticated companies have different needs, but the multi-tenant platform forces customers to be the same as everyone else. Salesforce and other providers accommodate this issue with configuration vehicles, but fundamentally they have an unyielding standard. So providers must ask their customers to change their needs to meet the product’s standards rather than the product changing to meet the customer’s desires.

It’s a thing of beauty and a joy forever if a provider can get its customers to do that. But there are very few areas, and customers, where that can happen.

The alternative remedy

The alternative is to turn a provider’s entire supply chain into a consumption-based supply chain. This is the IBM strategy.

This path eliminates stranded costs. It also eliminates the problem of misaligned provider/customer interests that create a lot of friction in the market today. The traditional service model creates take-or-pay situations in which the provider has to provide the service whether or not the customer uses it — thus the misaligned interests.

That’s why what IBM is doing is so important. I’ve blogged before about how IBM’s recent acquisitions of SoftLayer, UrbanCode, Green Hat and BigFix were the components for building a complete as-a-service stack from the bare iron up through the platform to business process services. This fundamentally enables IBM to migrate to an end-to-end consumption-based world.

We have yet to see IBM roll this through in its fundamental pricing, but it’s still early; Big Blue has just now assembled the stack. But if it truly goes to market with the end-to-end consumption model, IBM will be able to address market needs much more completely than competitors, which are faced with the dilemma of having to take the risk on stranded costs and effectively price higher because of inefficient delivery models.

That’s what IBM has been putting in place. Is IBM leading the way to the future in services? What do you think?

What If CSC and HCL Get Brave?

CSC and HCL announced an alliance a few weeks ago, which is more of a go-to-market move than a structural change. But what if the twosome were to agree to a follow-on alliance to do something really big — something with huge industry and market consequences? It would be extremely brave and very risky. But it would address the inevitable, whopping market threat facing CSC and position both companies for future growth. Let me paint a picture of what that speculative alliance could look like.

The alliance would address the elephant in the room: CSC losing half its client work 

Such an alliance would first enable CSC to grasp the nettle and really address its big problem — its huge commitment to an asset-heavy outsourcing model. CSC has invested at least $12 billion in this model.

Our research and insights reveal that over 50 percent of the workloads currently in an asset-heavy model can migrate to the cloud over the next three to five years — and are incented to do so. This mass exit would leave CSC with a huge revenue hole.

And that’s only part of the problem. The situation is doubly threatening in that the exit from the asset-heavy model will leave CSC with huge stranded costs on facilities, equipment and people along with the revenue hole.

A really brave alliance with HCL would deal with this situation.

Alliance step one: deal with the people. CSC could move the people servicing clients in the asset-heavy model over to HCL, taking costs down and removing the stranded costs. CSC already has a vehicle in its offer set to catch the cloud work, but it would be replacing this revenue at 50 cents on the dollar; it’s much cheaper to do the work in the cloud than in the asset-heavy model.

Step two: move CSC’s data centers into an industry REIT to deal with the data center overhang. That would leave CSC with a much smaller set of stranded assets in overhead and equipment, allowing it to navigate the inevitable shrinkage of its asset-heavy business.

The emerging CSC 

A brave alliance between CSC and HCL would also make CSC “all in” on the cannibalization of its own footprint. Although CSC is attempting to drive this strategy now, it has conflicting incentives as it fights to maintain revenue in its existing asset-heavy model while standing up new revenue. The speculative alliance I’m describing would send a message internally to the CSC organization and to its external market that CSC is “all in” on the cloud transformation issue.

What would emerge from this alliance strategy would be a cloud-based CSC — a smaller, more profitable, more nimble CSC without the huge write-downs that it likely will incur as the cloud transformation happens naturally over the next few years.

The picture for HCL also makes a lot of sense

Such an alliance would create big growth in HCL’s infrastructure business by delivering significant advantages in economies of scale, market credibility and greater profits to invest.

HCL would pick up CSC as a huge client and capture probably 15 percent of the entire RIMO (Remote Infrastructure Management Outsourcing) market in one fell swoop. It would cement HCL in the undisputed RIMO leadership position, with a wide margin between HCL and TCS, its nearest competitor.

What do you think? Will the twosome be brave and take the risk of a market-changing follow-on alliance?

Enterprise Cloud Adoption: 5 Hard Truths

Originally posted on InformationWeek


Last fall I had the honor of sitting on the selection committee for the inaugural ICE (Innovation in Cloud for Enterprise) Awards, sponsored by the Cloud Connect show and Everest Group. The experience taught me how large enterprises are adopting cloud computing in ways that are often compelling, sometimes surprising, and occasionally breathtaking.

The winner, Revlon, Inc., presented an impressive case for how it leverages cloud to achieve organizational transformation that boosts competitiveness and consumer wallet share.

As impressive as each individual entry was, five recurring themes emerged across the enterprise cloud adoption stories we read. While certainly not scientific, they reflect what enterprises themselves report as important factors in the success of their cloud deployments.

1. Identify a compelling reason to step out of the comfort zone.
We’ve read about the importance of senior management buy-in to achieve success in cloud transformation. But what we found in the award entry submissions is that the truth is even starker: Senior management must believe that cloud adoption is critical to organizational survival.

The high-level driver might be one of the ethereal themes we read about in the tech press: Product or service differentiation, moving to market faster with new services, or getting closer to the customer through big data analysis. However, the visceral driver is always primal: We do this or we’re going to suffer at the hands of our competitors.

Read more on InformationWeek

Big Data Analytics in 2014: 5 Things That Won’t Happen

While talking about a new year’s next cool thing or development is a thoroughly enjoyable ritual, discussing what will not change provides valuable lessons for technology adoption strategy and investment planning, and highlights potential future disruptions.

So what are the five things that will remain more or less the same in 2014 for big data analytics?

  1. Hadoop will NOT REPLACE ETL: The nine-year-old platform has achieved great traction, and its mindshare has significantly increased. Well-known analytics providers such as Cloudera, Hortonworks, and MapR have supported it for a couple of years, and even the big boys such as IBM and Pivotal have embraced it. However, Hadoop’s proponents position it as a panacea for all the ills of big data, while the antagonists are equally up to the task, dismissing it as one of the important, yet small, pieces of the puzzle. Most Hadoop proponents mistake ETL for an “activity” rather than a “process.” The way ETL is performed in a Hadoop set-up may differ, but that does not make ETL redundant or replaceable.

  2. Analytics will still be UNDEMOCRATIC: Innovative data analysis and visualization technology players such as Tableau, QlikView, Alteryx, and Tibco (Spotfire) have gained traction as “end-user friendly” products. And mega providers such as SAP have increased their efforts in this direction (e.g., rebranding SAP Visual Intelligence as SAP Lumira). However, despite significant efforts to “consumerize” big data analysis and move the power out of the ivory towers of data scientists, 2014 will witness only incremental changes in this regard.

  3. Big Data will still be a PROJECT: Organizations always pilot a new technology before putting it into mainstream production. However, this attitude defeats the purpose of big data analytics. To gain real advantage from the deluge of data, companies must ingrain a big data mindset into their DNA, rather than treating it as a siloed “project.” Will 2014 see organizations jettison their age-old habits and wholeheartedly adopt big data analytics? Not according to my market conversations.

  4. Real talent will be TOUGH to find: Every technology transformation comes with “talent imposters,” and organizations desperate for talent will hire some of them and repent later. Unfortunately, many existing data warehousing and business intelligence analysts masquerade as “big data talent.” And the mushrooming of big data certifications and aggressive resume fabrication will not make organizations’ hiring task any easier in 2014.

  5. Integration will be a CHALLENGE: Technology providers such as Attunity, Dell Boomi, Talend, and Informatica have created multiple solutions to integrate disparate data sources for a consistent analysis framework. Most of these solutions work with data sources such as Amazon Redshift, IBM PureData System for Analytics (Netezza), HP Vertica, SAP HANA, and Teradata. However, organizations continue to face challenges in seamlessly integrating these, and are thus unable to extract meaningful value from their big data analytics engagements. While we’ll see major improvement in this area in 2014, a world in which different data sources are seamlessly integrated and analyzed will still be a mirage.

With cloud-based data management, modeling, and analytics disrupting the landscape, coupled with the rise of in-memory computing, the big data market will continue to surprise: we’ll see technology providers entering “unknown” domains, competing with their partners, and even cannibalizing existing offerings.

What are your takes on big data analytics in 2014 and beyond?

Rethinking Outsourced Application Development and Testing

The cloud revolution is breaking down walls surrounding existing outsourcing arrangements. In fact, the business case for some outsourced workloads crumbles in light of opportunities in the cloud. In a traditional data center or IT outsourced infrastructure model, you pay for the capacity 24/7; but in a cloud environment, you pay only for what you use.

By moving workloads to a cloud environment, we estimate that an organization would pay approximately 25-50 percent of the cost of the current 24/7 model.

Those economics are stunning.
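A back-of-the-envelope check shows why the 25-50 percent range is plausible for intermittent workloads. The utilization and rate figures below are hypothetical assumptions, not Everest Group data:

```python
# Rough sanity check on the 25-50% estimate, using hypothetical numbers.
# A dev/test environment used only during business hours sits idle most of the week.

hours_per_week = 24 * 7   # 168 hours billed in an always-on, 24/7 model
active_hours = 10 * 5     # assume 10 hours/day, 5 days/week of actual use
rate_premium = 1.4        # assume per-hour cloud pricing 40% above dedicated capacity

utilization = active_hours / hours_per_week       # fraction of the week in use
relative_cost = utilization * rate_premium        # fraction of the 24/7 bill

print(f"utilization: {utilization:.0%}")                   # → 30%
print(f"cloud cost vs. 24/7 model: {relative_cost:.0%}")   # → 42%
```

Even with cloud charging a healthy per-hour premium, paying for only 50 of 168 hours lands the bill comfortably inside the estimated 25-50 percent band.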

Our analysis suggests that about 50 percent of current outsourced workloads can move to the cloud easily. They are intermittent workloads, so they can be turned off and on and thus easily moved to a variable pricing model such as the cloud. They also don’t have the same security and compliance constraints as production workloads.

That doesn’t mean they will be moved. It just means they have the capability to be moved without significant disruption or a large investment to replatform them.

The diagram below shows the workloads that we think will migrate to the cloud.

Enterprise cloud migration is coming in waves

We believe the first to head for the cloud environment will be the application development and testing environments as this work is intermittent and ideal for the pay-per-use model. Considering that application development and testing comprise 20-25 percent of most outsourced IT infrastructure workloads, we believe the compelling underlying economics of the cloud model will drive these workloads out of their current environments quite quickly.

An enterprise with an IT infrastructure outsourcing contract very likely will want to migrate some workloads out of that environment into a next-generation model such as the cloud. It just makes sense to capture the savings. And in most outsourcing contracts, the customer has the freedom to reduce the outsourced workload by 30 percent before incurring penalties.

We see this as a big opportunity.

It’s also a potential threat.

HP’s Most Difficult Challenge Has Yet to Hit

To date the cloud has not been a major disruption in the traditional outsourcing market. Rather, cloud has attacked the rogue IT or departmental processing market. But we believe that this tide will now turn inward onto the enterprise space, where HP and other infrastructure players live.

How will HP take on cloud disruption? We at Everest Group believe that it is highly likely that over the next three years 30 percent of the existing workloads will move out of the traditional outsourcing space and shift to cloud models. If this proves true, the substantial turnaround work that HP has done to date will not prove adequate to stem this new source of competition and disruption.

The results to date 

As shown in the figure below, CSC, Dell, HP and IBM have significant portfolios of asset-heavy IT infrastructure outsourcing deals. The statistics clearly show that the asset-heavy providers are losing share to asset-light players.

Asset heavy ITO players losing share

So far, the primary attack on the traditional space has been the RIMO (Remote Infrastructure Management Outsourcing) talent-only model.

Performance progress

When Meg Whitman took the reins as CEO, HP started the long process of turning around its business. Recommitting to the services space, she appointed Mike Nefkens, a seasoned EDS veteran, as EVP and general manager of Enterprise Services; he quickly made progress in improving morale and addressing customer satisfaction issues in existing service accounts. Market share losses slowed, and HP reemerged with more competitive offerings and a can-do attitude. But has HP fought the enemy to a standstill only to find a new front with a more deadly enemy?

Like a stone dropped on glass, RIMO competitors smashed the asset-heavy business. What will the new disruption of cloud do to it if — as we predict — it drives a 30%+ run-off of workloads?


Photo credit: Don Debold

Assessing the Cloud’s Clout to Disrupt the Outsourcing World

You’ve heard it … read it in the news … and probably even participated in the discussions about it. “It” is the impact of the cloud model on the third-party service provider marketplace. What are the cloud’s potential disruptive effects? As the cloud and outsourcing meet head on, will the cloud make hiccups or ripples, or will it become a crushing force? How big a threat is cloud to the outsourcing space?

To answer those questions, I think we have to pull apart the services spaces, as the cloud’s impact will differ widely in various areas.

Let’s start with where cloud has the highest potential for disruptive impact.

Impact on IT infrastructure services

The cloud model’s impact on the traditional IT infrastructure outsourcing space has been modest to date and, surprisingly, has not had as strong an impact in disintermediating the existing marketplace as RIMO (Remote Infrastructure Management Outsourcing) had.

Although the impact to date has been modest, the cloud model has a very high potential to disintermediate business in this space and is in a position to drive increased run-off out of traditional infrastructure models.

Threat profile — High. I believe the cloud’s disruptive impact in this area is a question of when, not if.

Impact on the applications space

Here the story is quite different. Unlike the infrastructure space, where cloud is poised to take away existing revenues, in the applications space it has at least as many positive impacts as negative. In fact, it has the potential to be a growth engine.

As companies prepare to evolve and move into the as-a-service world or migrate their applications to the cloud infrastructure models, the substantial amount of SI and applications effort to transition into the new models will create growth in the outsourcing marketplace.

However, there is a caution sign. The kind of work best suited for the cloud model often goes to new market entrants rather than the traditional players. We’ve blogged before about the shift of decision rights into the business stakeholders’ realm rather than the CIO, and this is driving growth in the cloud market. So there are some issues associated with positioning, but it’s driving growth in the overall marketplace.

Threat profile — Initially modest to high disruptive impact for traditional players. But as the traditional providers resolve their positioning issues with the new consumerized decision-making realm in clients’ businesses, the cloud should have a net positive impact on the outsourcing providers’ growth, rather than a disruptive impact.

Impact on the BPO space

On the BPO side, the cloud model is even more immature, and it’s less clear what eventual impact it will have. We certainly see cloud engines being incorporated into the BPO or BPaaS model. But here it’s more a change in one of the component parts than a change to the overall business model.

Threat profile — Modest to flat impact.

Our overall assessment is that the cloud’s clout to disrupt the existing marketplace in the outsourcing world is modest to high — but the changes will carry both negative and positive consequences.

SaaS – The Five Things That Should Happen in 2014

Keying off 2013 market activities and indications, following are five SaaS developments I anticipate we’ll witness in 2014:

1) Blurring of SaaS and on-premise: With the quintessential poster boy of cloud computing and SaaS, Salesforce.com, announcing a partnership with HP whereby customers can choose to have their “dedicated infrastructure” within a Salesforce.com data center, the true SaaS premise is dead. However, rather than quibbling in mindless debate over further defining true SaaS, vendors will realize the potential of the market in which a “dedicated” SaaS solution is required. Though Salesforce.com has always shied away from creating a true on-premise version of its SaaS offering, other vendors do offer both on-premise and SaaS versions. This blurring of boundaries will continue in 2014.

2) Battle of architectures: Oracle, a company that always denounced cloud computing, has suddenly found a love for it, and has acquired (and will continue to acquire) numerous SaaS vendors. (Note that this is nothing different from its on-premise strategy; remember JD Edwards and PeopleSoft?) However, Oracle has long criticized Salesforce.com’s approach of creating application-based multi-tenancy. With the introduction of Oracle 12c (the “c” denoting cloud), Oracle’s marketing machinery is going to town explaining how its database-driven multi-tenancy is better than the typical application-based architecture. In 2014, we should see more lines in the sand being drawn.

3) Indirect sales: Most SaaS vendors are running at a loss, and understand that their sales and marketing expenses (~30-40 percent of revenue) are exorbitant. They will realize the importance of indirect sales channels such as system integrators and partners to further drive adoption of their offerings. Given the strategic partnerships of Salesforce.com, Workday, and NetSuite with large system integrators such as Accenture, Wipro, and Deloitte, and niche providers such as Bluewolf, 2014 should see an increase in the depth of these partnerships.

4) Churn management: Despite soft lock-in, SaaS providers are witnessing high churn rates. To be fair, some of the churn is attributable to clients’ unwillingness to adopt SaaS models once the pilot run is over. In 2014, we’ll see SaaS providers investing more time and energy in maintaining their existing customer relationships.

5) Enhanced functionality: This is a multi-year, multi-decade evolution for the SaaS ecosystem. In the past decade, SaaS providers have included many types of functionality that were earlier considered to be unsuitable for this model. In 2014, vendors will continue to evolve their offerings, including introducing industry-specific vertical flavors whenever possible. However, given the opportunities in the horizontal SaaS space (CRM, salesforce automation, marketing, HCM, etc.), most SaaS vendors will gain the lion’s share of their growth in enhanced horizontal functionality.
