
Digital Is Shading Out Cloud | Sherpas in Blue Shirts

Three years ago, the global services industry was abuzz with talk that cloud computing would set the world on fire. Today, although CIOs and senior executives accept the cloud model and are looking to implement it, they are increasingly excited about infrastructure and the digitization of business. The digital revolution is shading out cloud, capturing the imagination and mindshare of the C-suite.

Cloud is certainly important, but its impact is only starting to gain traction, and already the C-suite is moving on to a new horizon. My, how short our attention spans are.

Although digital can incorporate aspects of cloud computing, its impact dwarfs cloud's in both scale and potential.

I wonder what will be next in line to capture our imaginations, and how quickly it will rise to prominence.


The Cloud Experiment is Over, but are Buyers Waiting for Godot? | Sherpas in Blue Shirts

The cloud experiment is over and the debate in enterprises about its benefits and risks is settled. We know it works, it’s more flexible and cheaper, and it makes it easier for IT to align with business needs. So should buyers put their applications into a cloud environment?

My advice: Don’t rearchitect your legacy applications, which were designed and implemented for a legacy environment, just to port them over to the cloud. Organizations of all sizes have been waiting for providers’ porting solutions. Unfortunately, that’s rather like Samuel Beckett’s tragicomedy “Waiting for Godot,” in which two characters wait day after day for Godot even though they don’t know where or when he might arrive. Buyers wait, thinking cloud porting solutions will arrive in the market, but it just doesn’t happen. That’s because porting is really expensive and really risky.


I’ve blogged in the past about CSS Corp Cloud Services and Redwood Software platforms for easily migrating legacy apps to the cloud. But as we get further into the cloud story, it looks like replatforming offerings will be far rarer than we anticipated. I’m not saying they won’t exist; I’m just saying they won’t be the dominant model.

As the smoke clears from cloud experiments and pilots, the dominant best-practice model for moving into the cloud is shaping up as follows:

  • Look for opportunities to make incremental improvements to your legacy environment. Rework legacy by increasing the level of virtualization and automation in your data center.
  • When you develop new applications, architect them for the cloud environment (a brief sketch of what that can mean in practice follows this list).
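
To make that second point a little more concrete, here is a minimal, hypothetical sketch (in Python) of one cloud-ready design habit: keeping environment-specific settings out of the code and reading them at startup, so the same build artifact can be deployed, scaled, or moved between environments without rework. The variable names (DATABASE_URL, PORT, REGION) and their defaults are illustrative assumptions, not a prescribed standard.

    import os
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class ServiceConfig:
        """Settings a cloud-hosted service needs at runtime."""
        database_url: str
        port: int
        region: str

    def load_config() -> ServiceConfig:
        """Read environment-specific settings from environment variables so the
        same artifact runs unchanged in any environment or region."""
        return ServiceConfig(
            # Injected by the hosting platform, never hard-coded into the app
            database_url=os.environ.get("DATABASE_URL", "sqlite:///local.db"),
            # Sensible defaults so the service also runs locally
            port=int(os.environ.get("PORT", "8080")),
            region=os.environ.get("REGION", "local"),
        )

    if __name__ == "__main__":
        cfg = load_config()
        print(f"Starting stateless service in {cfg.region} on port {cfg.port}")

The point of the sketch is simply that new applications designed this way can move between private and public cloud environments, or scale out, without the costly rearchitecting that legacy applications would require.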

This strategy of adding virtualization and automation may get your legacy environment into a private cloud, but it doesn’t get you into the agile low-cost public cloud environment. However, it allows you to improve the efficiency and resiliency of the existing legacy environment without the huge cost and risk of rearchitecting.

The strategy also helps CIO organizations regain some of the influence and credibility they’ve lost with business units by delivering the new functionality that supports where the business is moving. It enables the organization to be more agile and better aligned, and to do so at lower cost, which significantly relieves the pressure of having to secure a huge amount of funding for a set of high-risk legacy projects.

The fact is, for many legacy applications, the best you can do is make incremental progress. You can move them out of dedicated hardware into virtualized hardware. And other than some potential cost savings, there is little to no business benefit in taking on the risk of reengineering them for a public infrastructure or shared environment.

We saw this same best-practice model with distributed computing: new applications went into distributed computing, and eventually we reached a tipping point where we needed to move legacy apps. I anticipate that new functionality and new work will similarly drive the shift from legacy to cloud.

Going forward, until that tipping point occurs, put your efforts into standing up your organization’s new environment to take full advantage of the business alignment, flexibility, and cost benefits the cloud family offers, and make only incremental changes to your legacy environment. If you wait for a huge re-platforming surge of cloud porting solutions, I believe you’ll be waiting for Godot.

Philips’ Journey to Consumption-based Computing | Sherpas in Blue Shirts

In October 2013, Philips started to transform its IT infrastructure to a truly consumption-based model on the cloud. Alan Nance has been leading this activity in strong collaboration with Philips Procurement. Per the model, service providers charge no start-up or termination fees, and Philips pays only for what it uses. These terms are set out in a charter which all of Philips’ major IT infrastructure providers have signed up to.

One year on, I caught up with Alan to learn more about the transformation and progress to date. The full text of the interview has been published in a new Everest Group report called “Practitioner Perspectives.” In this blog I share some highlights from the interview and look at some of the key drivers for change at Philips.

These drivers include financial synchronicity, elimination of IT infrastructure Capex and speed to market.

Financial synchronicity is needed to bring IT costs in line with corporate revenue. The ultimate aim is to eliminate fixed IT costs altogether, and Philips is on the way to achieving this goal. Although the transformation to consumption-based computing is still in its early days, Philips has already cut €30m of fixed costs, with another €380m to go.

Synchronicity also applies to product development and speed to market: IT working in step with product requirements, and with no Capex. One example is taking Philips’ Smart Air Purifier to China in three months. Philips’ speed-to-market ambitions in this case were achieved by working with Alibaba, which supports the air purifier’s mobile app on its cloud infrastructure. The app allows users to remotely monitor and manage air quality in their homes in real time. By using Alibaba’s cloud, Philips took the product to a major new market without the need to set up new facilities, such as a datacenter, in China. Instead, it tapped into Alibaba’s local presence and capabilities.

Philips’ infrastructure transformation has not been without challenges. Examples include:

  • Ensuring that service providers’ offerings meet regulatory compliance requirements in different countries
  • Developing a capability to monitor, assess and act on the impact of external changes on live services and operations
  • Evaluating products and services for inclusion in Philips’ cloud catalog – this has been more manual and time-consuming than Alan expected

Then there is the need to re-skill staff. Some of the people who are good at design, build, run, and operate need to apply their skills in different ways, such as in service design discussions with the business and in selecting the right components from Philips’ catalog. Some people have been able to make that transition, and some have not.

Despite the challenges, Philips is boldly going where few companies have gone before: a truly consumption-based computing model that is pushing the boundaries of services contracts and outsourcing. As to why Philips is pursuing this model, the answer lies in its business model. First, Philips is creating more and more products with interactive components that share data between central systems and apps on smart handheld devices and mobile phones. In addition to the smart air purifier and its mobile app, examples include tools for sharing medical information between doctors and patients, and Cloud TV, which streams TV channels over the Internet to Philips smart TVs and comes with an app that lets Dropbox users view their photos, videos, and music stored online. Cloud and consumption-based computing are ideal for supporting this business.

The need for agility is underlined in other aspects of the business; Philips is separating its lighting business from its health technology business. The consumption-based computing model is going to make it easier to separate the two companies as resources get divided between the two new entities with little infrastructure Capex impact on Philips. There are also acquisitions, the most recent being that of Volcano Corp., the US medical imaging company that Philips is acquiring for $1.2bn. An agile infrastructure would allow it to incorporate new acquisitions into its main business quickly.

Philips has recognized the role of infrastructure in business agility and is acting upon it. Very few other companies could not benefit from Philips’ model: more and more products and services are being complemented with social and digital interaction channels, and mergers, acquisitions, and divestments are a business reality. The question is: why aren’t more companies following in Philips’ footsteps?

A conversation with Alan Nance, Vice President Technology Transformation at Royal Philips – the first of a new Practitioner Perspectives Series can be accessed by Everest Group registered users here: https://research.everestgrp.com/Product/EGR-2014-4-O-1350/Practitioner-Perspectives-Alan-Nance-Interview.
