Category: Cloud Infrastructure

The Battle for Supremacy in Industry-specific Cloud Has Begun | Blog

In February 2021, Microsoft CEO Satya Nadella proudly announced the addition of three new offerings to the company’s growing portfolio of industry cloud solutions: Microsoft Cloud for Financial Services, Microsoft Cloud for Manufacturing, and Microsoft Cloud for Non-profit. In December 2020, AWS introduced Amazon HealthLake, a HIPAA-eligible service for healthcare and life sciences organizations; this service will compete head-to-head against Google Cloud’s and Microsoft Azure’s AI-powered solutions for the healthcare and life sciences markets. And all of these companies have other industry-specific cloud solutions. It has become quite clear that savage competition is budding in this fast-emerging cloud segment. Indeed, industry-specific cloud has become the epicenter of new investments as all major cloud vendors have declared “industry-first” focuses.

In a recent blog, we explained the basics of what constitutes a true industry cloud solution. Now, let’s take a look at the different types of industry cloud solution providers and their go-to-market strategies.

The current industry cloud solutions marketplace broadly has four kinds of players:

  • Hyperscalers or the traditional IaaS and PaaS players such as AWS, Azure, GCP, and Oracle doubling down on their vertical strategy
  • Traditional industry-agnostic SaaS players such as Salesforce and SAP entering the vertical cloud market
  • Cloud-native vertical SaaS players and micro-SaaS players such as Veeva Systems, which are developing niche functionalities targeting industries’ specific pain points with heavily nuanced solutions
  • Service providers developing their own vertical solutions, such as Accenture’s INTIENT, which caters to the R&D needs of the life sciences industry

Exhibit 1: The converging landscape of industry cloud solution providers

How are these players equipping themselves for the intensifying war?

While all of these types of cloud solution providers have chosen verticalization as their preferred differentiation strategy, each of them is approaching it differently:

  • The hyperscalers and horizontal SaaS players have largely relied on acquisitions and a growing network of niche channel partners – For example, Salesforce’s acquisition of Vlocity is one of the largest industry cloud takeovers to date. GCP’s acquisition of Looker sheds light on its broader strategy of building differentiating competencies in data management, analytics, and AI as an anchor to enter industries like life sciences with disruptive data solutions such as genome data models
  • The vertical-specific players have adopted an IP-led approach complemented by co-innovation partnerships with enterprises – They’ve focused on utilizing their industry expertise to innovate and evolve their portfolio of IP. And they often collaborate with enterprises to co-develop solutions, which allows them to stay close to the industry and better understand its pain points
  • Service providers are carefully aligning their strategies to find a middle path, balancing their portfolios of IP-led solutions while partnering with the hyperscalers, horizontal SaaS players, and vertical-specific providers to add customization on top of their solutions. For instance, we see Accenture playing the role of a crucial strategic partner to SAP’s industry cloud platform while also investing in developing its own vertical-specific offerings, such as INTIENT for the healthcare and life sciences industry.

While service providers currently enjoy an excellent relationship with the hyperscalers and vertical SaaS providers as strategic partners in the cloud, their shared desire to lead the emerging market for industry-specific solutions could cast them as competitors in the near future.

Key trends to watch out for as the battle gets fiercer

  • The IaaS and PaaS players will aggressively compete for market share in a traditionally SaaS-dominated industry cloud market. The industry cloud market is currently dominated by SaaS players, but the hyperscalers are increasingly building enterprise SaaS offerings and investing in their partnership ecosystems. AWS is still catching up in the race, but other leaders like Oracle, Salesforce, and SAP have jumped onto the industry cloud bandwagon with both feet
  • Vertical-specific partnerships, alliances, and acquisitions will quickly emerge as the horizontal players race to build vertical expertise and grab market share
  • Industry-specialist product vendors are strengthening their position by evolving their offerings from on-premises to SaaS and PaaS solutions. Pure-play technology vendors with deep industry expertise, which have traditionally built industry-specific solutions on-premises, are now collaborating with enterprises and hyperscalers to develop and offer SaaS and even PaaS solutions. For instance, healthcare product vendors such as EPIC and Cerner, which have dominated the on-premises Electronic Health Records (EHR) market, are carving out multi-year strategic partnerships with the hyperscalers – Azure in EPIC’s case, and AWS in Cerner’s case – to build new-age cloud-based suites of solutions powered by AI and analytics
  • As the competition intensifies, we will see an increasing number of vertical-specific players trying to diversify their presence across multiple industries to maintain their growth and tackle crowding in the vertical cloud market. For example, Veeva has already begun expanding its Vault offering to the animal health and consumer goods industries

Industry-specific solutions will evolve and improve at a swift rate as tens of thousands of businesses across every industry begin to rely on them as the strategic digital link to their customers. An increasing number of enterprise CEOs are prioritizing end-to-end digital businesses, data-driven operations, and customer-centric growth.

In our next blog, we will analyze the vertical cloud trend from an enterprise point of view, discussing the key implications for enterprises and how they can source industry cloud solutions that best suit their needs. Meanwhile, please feel free to reach out to [email protected] or [email protected] to share any questions and your experiences.

Cloud War Chapter 6: Destination versus Architecture – Are the Cloud Vendors Fighting the Right Battle? | Blog

Most cloud vendors are obsessed with moving clients to their platforms. They understand that although their core services, such as compute, are no longer differentiated, there is still a lot of money to be made just by hosting their clients’ workloads. And they realize that this migration madness has sufficient legs to last for at least three to five years. No wonder migration spend constitutes 50-60 percent of services spend on cloud.

What about the workloads?

The cloud destination – meaning the platform a workload runs on – does not help that workload run better. To be modernized, or even built natively, a workload needs an open architecture underpinned by software that lets applications be built once and run on any platform. Kubernetes has become a standard way of building such applications. Today, every major vendor has its own Kubernetes service, such as Amazon EKS, Azure Kubernetes Service, Google Kubernetes Engine, IBM Cloud Kubernetes Service, and Oracle Container Engine for Kubernetes.

However, Kubernetes services do not by themselves create open and portable applications; they help run containerized workloads. If those workloads are not portable, the Kubernetes services defeat their own purpose. Containerized workloads need to be architected so that user space and kernel space are cleanly separated, which is relevant for both new and modernized workloads. For portability, the system architecture needs to be built on a layer that is open and portable. Without this, the entire migration madness will simply add one more layer to enterprise complexity. Interestingly, any cloud vendor that helps clients build such workloads will almost always become the preferred destination platform, as clients benefit from combining the architecture, build, and run environments.
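
As an illustrative sketch (not from the blog, and with hypothetical endpoint names), one well-known way to keep a containerized workload platform-neutral is to externalize every environment-specific detail into configuration, so the same image runs unchanged on any Kubernetes service:

```python
import os

# Hypothetical illustration: a 12-factor-style workload reads every
# platform-specific binding from its environment rather than hard-coding
# a vendor's endpoints, so the same container image can run on EKS, AKS,
# GKE, or any other Kubernetes service. All names here are made up.
DEFAULTS = {
    "OBJECT_STORE_ENDPOINT": "http://localhost:9000",  # e.g., an S3-compatible gateway
    "QUEUE_ENDPOINT": "http://localhost:5672",
    "DB_DSN": "postgresql://localhost/app",
}

def load_config(env=None):
    """Merge platform-injected settings over portable defaults."""
    env = os.environ if env is None else env
    return {key: env.get(key, default) for key, default in DEFAULTS.items()}

# The workload never names a specific cloud; the deployment (a Kubernetes
# ConfigMap, for instance) supplies the concrete bindings at run time.
config = load_config({"OBJECT_STORE_ENDPOINT": "https://storage.example-cloud.com"})
```

In a real deployment the overriding values would come from the cluster, not an inline dictionary; the point is that the application code contains no vendor-specific layer to unwind later.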

Role of partners of cloud vendors

Cloud vendors need to work with partners to help clients build new generation workloads that are easily movable and platform independent. The partners include multiple entities such as ISVs and service providers.

ISVs need to embed such open and interoperable elements into their solution so that their software can run on any platform. Service providers need to engage with clients and cloud vendors to build and modernize workloads by using open, interoperable principles. As there is significant business in cloud migration, there is a risk that the service partners will get blinded and become more focused on the growth of their own cloud services business than on driving value for the client. This is a short-term strategy that can help service providers meet their targets. But it will not make the service provider a client’s strategic long-term partner.

Role of enterprises

There is significant pressure on enterprise technology teams to migrate workloads to the cloud. Many of the clients we work with take pride in telling us they will move more than 80 percent of their workloads to the cloud in the next two to three years. There is limited deliberation on which workloads need to be rebuilt on portable middleware, or even whether they need a runtime that can support an open and flexible architecture. Unfortunately, many enterprise executives have to show progress in cloud adoption. And though enterprise architects and engineering teams do come together to evaluate how a workload should be moved to the cloud, there is little discussion on building an open architecture for these workloads. A bright spot is that there seem to be good architectural discussions around newer workloads.

Enterprises will soon realize they are hitting the wall of cloud value because they did not meaningfully invest in building a stronger architecture. Their focus in moving their workloads to their primary cloud vendor is overshadowing all the other initiatives they must undertake to make their cloud journey a success.

Do you focus on the architecture or the destination for your workloads? I’d enjoy hearing your experiences and thoughts at [email protected].

GAIA-X Summit 2020: Key Takeaways and the Future of Data Sovereignty in Europe | Blog

In an earlier blog on sovereign cloud, we explained how GAIA-X – an ambitious project led by France and Germany, aimed at creating a high-performance and trustworthy data infrastructure for Europe – is set to play a pivotal role in the evolution of the European cloud market in the coming years. Since then, GAIA-X has experienced a significant increase in the number of member firms, cutting across geographies and industry verticals, in its endeavor to secure data sovereignty.

We participated in the GAIA-X Summit, which was held virtually on November 18 and 19, 2020. The summit highlighted the increasing relevance of GAIA-X both within and outside Europe, as well as for the broader Industry 4.0 agenda. Here are some of the notable insights from the summit.

GAIA-X has experienced a significant increase in member firms and countries – including non-European countries

GAIA-X was launched with 22 member firms (11 French, 11 German) and an initial focus on France and Germany, but, within one year, the project has grown into a colossal multi-industry initiative with 181 members from 18 countries.

Notably, on October 15, 2020, all 27 EU member states signed a joint declaration supporting the European Commission’s cloud and data strategy, which mentioned GAIA-X as a leading example of a public-private initiative for European-federated data infrastructure.

Economists believe GAIA-X will create long-term economic benefits for Europe

The GAIA-X project is expected to reap benefits for European countries in the long run, as several economists, including Jacques Crémer, have placed their bets on a better data-sharing platform that reduces transaction costs. The key ways in which GAIA-X can help improve the European economy are:

  1. Better data-sharing platforms will result in quality AI training data and, thus, a better AI-enabled infrastructure, propelling more automation and cost savings
  2. GAIA-X will reduce codependence among subcontractors by curbing the use of proprietary data exchange methodologies


GAIA-X will help better implement European compliance laws

Compliance and data sovereignty are among the key goals of the GAIA-X project, which is expected to address several challenges related to European laws. Because it is designed on the core European values of openness, transparency, and collaboration, GAIA-X is expected to help better implement European laws, along with the draft European Data Governance Act, which will play a crucial role in defining data flows in Europe.

GAIA-X can play a key role in enabling infrastructure for Industry 4.0

Industries in Europe are in a transformative state, with increasing emphasis on data access, data security, and data sovereignty. The next step in IoT is AI-enabled IoT (AIoT), and industry experts are confident that GAIA-X will enable the data infrastructure needed for this transformation.

Several potential use cases across multiple data ecosystems – within and outside Europe

GAIA-X already has use cases across multiple industries and sectors, including finance, health, public sector, agriculture, energy, mobility, smart living, and Industry 4.0. Some of these use cases include collaborative condition monitoring, shared production, financial big data clusters, smart health connect, smart infrastructure management, digital parking management, and agricultural ecosystem, among many others.

Cloud adoption in Europe will increase through a collaborative data ecosystem

A key value proposition of GAIA-X is a collaborative data ecosystem that can help build a structure with innovation at its core. GAIA-X aims to increase cloud usage in the EU from 24% to 60% in the coming years by developing common standards with other cloud providers that will accelerate market uptake and simultaneously enhance data portability.

Strong financial support for GAIA-X to help the EU tackle data sovereignty and data privacy challenges

The German government is planning to make EUR 200 million available for GAIA-X, while the EU is planning to invest EUR 2 billion in the digital transformation space, some of which will find its way to the GAIA-X project.

With access to these funds, GAIA-X will emerge as a strong competitor to the hyperscalers in Europe and will impact the overall cloud market. The GAIA-X roll-out will further help the EU address the core issues of data privacy and data sovereignty – which have long been a challenge.

Consolidating successes and looking ahead

We believe that GAIA-X has experienced significant growth in the last year, primarily for the following reasons:

  • The project has a strong legal framework with open source software promoting transparency
  • Companies across industries have committed to it
  • It is starting to build a strong customer base with use cases identified, again across industries

We expect GAIA-X to continue its momentum in 2021, driven by the European Data Governance Act and constant legal battles between US tech giants, such as Facebook, Google, and Apple, and the EU. The project also plans to launch six regional hubs by the end of 2020, while another six are in the pipeline for the first half of 2021. These regional hubs will act as entry points for anyone who wants to use GAIA-X and will further strengthen GAIA-X’s data sovereignty proposition, even as more firms and countries hop on the GAIA-X bandwagon.

If you’d like to know more about the GAIA-X project or have any questions or observations, please write to us at [email protected] or [email protected].

Sustainable Business Needs Sustainable Technology: Can Cloud Modernization Help? | Blog

Estimates suggest that enterprise technology accounts for 3-5% of global power consumption and 1-2% of carbon emissions. Although technology systems are becoming more power efficient, optimizing power consumption is a key priority for enterprises to reduce their carbon footprints and build sustainable businesses. Cloud modernization can play an effective part in this journey if done right.

Current practices are not aligned to sustainable technology

The way systems are designed, built, and run impacts enterprises’ electricity consumption and CO2 emissions. Let’s take a look at the three big segments:

  • Architecture: IT workloads, by design, are built for failover and recovery. Though businesses need these backups in case the main systems go down, the duplication results in significant electricity consumption. Most IT systems were built for the “age of deficiency,” wherein underlying infrastructure assets were costly, rationed, and difficult to provision. Every major system has a massive back-up to counter failure events, essentially multiplying electricity consumption.
  • Build: Consider that for each large ERP production system there are 6 to 10 non-production systems across development, testing, and staging. Development, QA, security, and pre-production teams have each ended up building their own environments. Yet, whenever a system was built, the entire infrastructure had to be configured even though the team needed only 10-20% of it. Thus, much of the electricity consumption ended up powering capacity that wasn’t needed at all.
  • Run: Operations teams have to make do with what the upstream teams have given them. They can’t take down systems to save power on their own, as the systems weren’t designed to work that way. So, the run teams ensure every IT system is up and running. Their KPIs are tied to availability and uptime, meaning they are incentivized to make systems “over available” even when they aren’t being used. The run teams didn’t – and still don’t – have real-time insight into the operational KPIs of their systems landscape to dynamically decide which systems to shut off to save power.
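
To make the run-team gap concrete, here is a hedged sketch (thresholds, tiers, and system names are all hypothetical) of the kind of utilization-based policy that real-time insight would enable, flagging systems that could be powered down instead of kept "over available":

```python
# Illustrative only: decide which systems can be powered down based on
# recent utilization, instead of keeping everything running 24x7.
# The 5% threshold and the landscape data below are hypothetical.
IDLE_THRESHOLD = 0.05  # below 5% average CPU counts as idle

def systems_to_power_down(systems):
    """Return names of non-production systems idle enough to shut off."""
    return [
        s["name"]
        for s in systems
        if s["tier"] != "production" and s["avg_cpu"] < IDLE_THRESHOLD
    ]

landscape = [
    {"name": "erp-prod",    "tier": "production", "avg_cpu": 0.40},
    {"name": "erp-staging", "tier": "staging",    "avg_cpu": 0.02},
    {"name": "erp-test-3",  "tier": "test",       "avg_cpu": 0.01},
]
print(systems_to_power_down(landscape))  # -> ['erp-staging', 'erp-test-3']
```

A production policy would of course consider far more signals (scheduled jobs, dependencies, compliance holds), but even this simple rule is impossible without the real-time operational KPIs the blog notes are missing.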

The role of cloud modernization in building a sustainable technology ecosystem

In February 2020, an article published in the journal Science suggested that, despite digital services from large data center and cloud vendors growing sixfold between 2010 and 2018, energy consumption grew by only 6%. I discussed power consumption as an important element of “Ethical Cloud” in a blog I wrote earlier this year.

Many cynics say that cloud just shifts power consumption from the enterprise to the cloud vendor. There’s a grain of truth to that. But I’m addressing a different aspect of cloud: using cloud services to modernize the technology environment and envision newer practices to create a sustainable technology landscape, regardless of whether the cloud services are vendor-delivered or client-owned.

Cloud 1.0 and 2.0: By now, many architects have used cloud’s run-time access to underlying infrastructure, which can definitely address the issues around over-provisioning. Virtual servers on the cloud can be switched on or off as needed, and doing so reduces carbon emissions. Moreover, as cloud instances can be provisioned quickly, they are – by design – fault tolerant, so they don’t rely on excessive back-up systems. They can be designed to go down, and their back-ups turn on immediately without being forever online. The development, test, and operations teams can provision infrastructure as and when needed, and they can shut it down when their work is completed.
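
A back-of-the-envelope sketch makes the on/off argument tangible. All figures here are hypothetical, but the arithmetic shows why running dev/test servers only during working hours, as on-demand provisioning allows, cuts energy use sharply versus always-on:

```python
# Hypothetical figures: 10 dev/test servers at 300 W each, compared
# running 24x7 versus only 10 hours a day on weekdays.
SERVERS = 10
WATTS_PER_SERVER = 300
HOURS_ALWAYS_ON = 24 * 7   # 168 hours/week
HOURS_ON_DEMAND = 10 * 5   # 50 hours/week

def weekly_kwh(hours):
    """Energy drawn by the fleet over the given weekly hours, in kWh."""
    return SERVERS * WATTS_PER_SERVER * hours / 1000

saved = weekly_kwh(HOURS_ALWAYS_ON) - weekly_kwh(HOURS_ON_DEMAND)
print(f"{saved:.0f} kWh saved per week")  # 354 kWh saved per week
```

The real savings depend on workload mix and data-center efficiency, but the ratio – roughly 70% of those servers' energy – is what the on-demand model makes available.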

Cloud 3.0: In the next wave of cloud services, with enabling technologies such as containers, functions, and event-driven applications, enterprises can amplify their sustainable technology initiatives. Enterprise architects will design workloads treating failure as an essential element to be tackled through orchestration of run-time cloud resources, instead of relying on traditional failover methods that promote over-consumption. They can modernize existing workloads that need “always on” infrastructure and underlying services to an event-driven model, where the application code and infrastructure lie idle and come online only when needed. A while back I wrote a blog that talks about how AI can be used to compose an application at run time instead of it always being available.

Server virtualization played an important role in reducing power consumption. Now, by using containers, which are significantly more efficient than virtual machines, enterprises can further reduce their power consumption and carbon emissions. Though cloud sprawl is stretching the operations teams, newer automated monitoring tools are becoming effective in providing a real-time view of the technology landscape, which helps them optimize asset uptime. They can also build infrastructure code within development to make an application aware of when it can let go of IT assets and kill zombie instances, which enables the operations team to focus on automating and optimizing, instead of managing systems that are always on.
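
The "kill zombie instances" idea can be sketched as follows. This is a hypothetical illustration, not a real cloud API: instances record their last meaningful activity, and an automated sweep releases anything idle beyond a grace period:

```python
import time

# Hypothetical sketch: tag each instance with its last meaningful
# activity, then let an automated sweep flag anything idle beyond a
# grace period for teardown. IDs and timings are illustrative.
GRACE_PERIOD = 6 * 3600  # six hours with no activity marks a zombie

def find_zombies(instances, now=None):
    """Return IDs of instances whose last activity is older than the grace period."""
    now = time.time() if now is None else now
    return [i["id"] for i in instances
            if now - i["last_activity"] > GRACE_PERIOD]

now = 1_000_000
fleet = [
    {"id": "vm-app-1",   "last_activity": now - 600},        # active 10 min ago
    {"id": "vm-batch-9", "last_activity": now - 48 * 3600},  # idle for two days
]
print(find_zombies(fleet, now=now))  # -> ['vm-batch-9']
```

In practice the activity signal would come from monitoring tooling rather than a field the application sets, but the principle is the same: the landscape itself tells the run team what can be turned off.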

Moreover, because the entire cloud migration process is getting optimized and automated, power consumption is further reduced. Newer cloud-native workloads are being built in the above model. However, enterprises have large legacy technology landscapes that need to move to an on-demand cloud-led model if they are serious about their sustainability initiatives. Though the business case for legacy modernization does consider power consumption, it mostly focuses on movement of workload from on-premises to the cloud. And it doesn’t usually consider architectural changes that can reduce power consumption, even if it’s a client-owned cloud platform.

When considering next-generation cloud services, enterprises should rethink their modernization journeys beyond a data center exit toward building a sustainable technology landscape. They should leverage cloud-led enabling technologies to fundamentally change the way their workloads are architected, built, and run. Enterprises can only build a sustainable business through sustainable technology once they’ve adopted cloud modernization as a potent force to reduce power consumption and carbon emissions.

This is a complex topic to solve for, but we all have to start somewhere. And there are certainly other options to consider, like greater reliance on renewable energy, reduction in travel, etc. I’d love to hear what you’re doing, whether it’s using cloud modernization to reduce carbon emission, just shifting your emissions to a cloud vendor, or another approach. Please write to me at [email protected].

Demystifying Industry Cloud | Blog

Microsoft recently rolled out its first industry cloud, Microsoft Cloud for Healthcare, combining capabilities across Dynamics 365, Microsoft 365, Power Platform, and Azure, to help improve care management and health data insights. Not so long ago, SAP’s CEO, Christian Klein, counted the company’s newly launched industry cloud among its growth drivers, and rightly so. Since its launch, SAP’s industry cloud has seen a long line of suitors rallying to partner in and build different vertical cloud offerings on top of it. In June 2020, there was Deloitte, then Accenture, and, most recently, Wipro.

Having analyzed this specific market trend throughout 2020, we at Everest Group have realized that, with all kinds of cloud providers jumping on to the industry cloud bandwagon, confusion abounds on what truly is an industry cloud.

So, what’s the buzz about? What is an industry cloud?

Simply put, an industry-specific/vertical cloud is a cloud solution that has been optimized and customized to fit the typical nuances and specific needs of a particular industry vertical’s customers. It is designed to tackle industry-specific constraints such as data protection, retention regulations, and operations with mission-critical applications.

We believe that a true industry cloud solution is characterized by four different layers: the infrastructure and platform layers, followed by an application layer, which is further supplemented by customization and differentiation layers.

The infrastructure layer, dominated by industry-agnostic IaaS players such as AWS, provides the hardware, network, scalability, and compute resources. The platform layer, such as Azure’s PaaS offering, is built on top of this infrastructure layer and provides the development environment for building applications.

The application layer comprises a horizontal cloud application such as Salesforce CRM and, in several cases, hosts development platforms, such as the Salesforce App Cloud, that become the marketplace for building additional functionalities.

The differentiation layer adds vertical nuances to a horizontal application such as built-in industry regulatory compliance. It is here that we see the industry cloud taking shape, but the offerings are still standard and not customized to the needs of specific enterprises.

The customization layer brings in service providers or technology vendors with their vertical expertise and decades of experience in working with enterprises. They partner with providers of differentiated cloud offerings, and build tools and accelerators to further customize them to suit enterprise needs, adding capabilities such as AI and security, personalized dashboards for analytics, and integration services to build a truly industry-specific/vertical cloud offering.
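
The layered composition described above can be sketched as a simple data model. This is purely illustrative (not any vendor's API), and its point is the definitional one the blog makes: the offering only becomes an industry cloud once the differentiation layer adds vertical capabilities on top of the generic stack:

```python
from dataclasses import dataclass, field

# Illustrative model of the industry cloud layers described above.
# Layer contents are hypothetical placeholders.
@dataclass
class IndustryCloudStack:
    infrastructure: str                                   # IaaS: compute, network
    platform: str                                         # PaaS: build environment
    application: str                                      # horizontal SaaS app
    differentiation: list = field(default_factory=list)   # vertical nuances
    customization: list = field(default_factory=list)     # partner add-ons

    def is_industry_cloud(self):
        # A generic stack becomes industry-specific only once the
        # differentiation layer is populated.
        return bool(self.differentiation)

stack = IndustryCloudStack(
    infrastructure="generic IaaS",
    platform="generic PaaS",
    application="horizontal CRM",
)
assert not stack.is_industry_cloud()

stack.differentiation.append("built-in industry regulatory compliance")
stack.customization.append("partner-built analytics dashboards")
assert stack.is_industry_cloud()
```

In the Veeva example that follows, Salesforce supplies the first three fields, Veeva populates the differentiation layer, and service partners fill in customization.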

We have illustrated this architecture in the exhibit below, taking the example of Veeva Systems, a popular provider of industry cloud solutions for life sciences. Veeva started as a CRM application built on the Salesforce platform, designed specifically for the pharmaceutical and biotechnology industries. Salesforce provided the infrastructure, platform, and application layers, while Veeva added a differentiation layer with a data model, application logic, and user interface tailored to support the pharmaceutical and biotechnology industries. It leveraged the standard Salesforce reporting, configuration, and maintenance capabilities.

Exhibit: Understanding the industry cloud architecture through Veeva industry cloud

Over time, Veeva has cultivated an ecosystem of partnerships, including service providers such as Accenture (Veeva CRM partner) and Cognizant (Veeva Vault partner). These partners leverage the Veeva development platform to build additional applications customized to enterprise needs, thereby adding the final customization layer to Veeva’s solutions suite.

Industry cloud is gaining significant traction among industries

Industries such as healthcare and banking – which require rapid and auditable deployment of new features or functionalities to comply with regulatory changes – are rapidly adopting industry cloud. Healthcare continues to lead the charge from a vertical standpoint, but many industries are experiencing an uptick in adoption, including manufacturing, financial services, and retail.

The key reasons for this growth are:

  • Lowered barriers to cloud adoption, along with a ready-to-use environment with tools and services tailored to a specific vertical’s operational requirements
  • Accelerated innovation, lower costs, and reduced risks
  • Efficient handling of data sources and workflows, and compliance with the industry’s unique standards
  • Support for industry-standard APIs, helping companies connect more easily and securely and accelerating the digital transformation (DX) economy
  • Access to industry insights or benchmarks through data aggregation from multiple clients within the same industry

What can organizations expect in the near future?

With the one-cloud-fits-all approach reaching maturity, the next decade will be marked by the depth of vertical expertise and customization capabilities that can complement existing applications and address customer pain points. Different kinds of vendors are developing industry cloud solutions, ranging from hyperscalers such as Azure and GCP, to vertical-specific players such as Veeva. We will cover the industry cloud market in further detail in parts 2 and 3 of this blog series, in which we will answer the following questions:

  • What are the different kinds of industry cloud solution providers and their go-to-market strategies?
  • What should enterprises do, and how should they source industry cloud solutions that best suit their needs?

The battle for industry cloud is only going to get fiercer in the near future. Please follow this space for more blogs on the emerging war and warriors in the industry cloud market. Meanwhile, please feel free to reach out to [email protected] or [email protected] to share any questions and your experiences.


Cloud Wars Chapter 5: Alibaba, IBM, and Oracle Versus Amazon, Google, and Microsoft. Is There Even a Fight? | Blog

Few companies in the history of the technology industry have aspired to dominance the way public cloud vendors Microsoft Azure, Amazon Web Services, and Google Cloud Platform currently do. I’ve covered the MAGs’ (as they’re collectively known) ugly fight across other blogs on industry cloud, low-code, and market share.

However, enterprises and partners increasingly appear to be demanding more options. That’s not because these three cloud vendors have done something fundamentally wrong or their offerings haven’t kept pace. Rather, it’s because enterprises are becoming more aware of cloud services and their potential impact on their businesses, and because Alibaba, IBM, and Oracle have introduced meaningful offerings that can’t be ignored any longer.

What’s changed?

Our research shows that enterprises have moved only about 20 percent of their workloads to the cloud. They started with simple workloads like web portals, collaboration suites, and virtual machines. After this first phase of their journey to the cloud, they realized that they needed to do a significant amount of preparation to be successful. Many enterprises – some believe more than 90 percent – have repatriated at least one workload from public cloud, which opened enterprise leaders’ eyes to the importance of fit-for-purpose choices rather than a generic cloud. So, before they move more complex workloads to the cloud, they want to be sure they get their architectural choices and their cloud partner absolutely right.

Is the market experiencing public cloud fatigue?

When AWS is clocking over US$42 billion in revenue and growing at about 30 percent, Google Cloud has about US$15 billion in revenue and is growing at over 40 percent, and Azure is growing at over 45 percent, it’s hard to argue that there’s public cloud fatigue. However, some enterprises and service partners believe these vendors are engaging in strong-arm tactics to own the keys to the enterprise technology kingdom. In the race to migrate enterprise workloads to their cloud platforms, these vendors are willing to proactively invest millions of dollars – on the condition that the enterprise goes all-in and builds its workload architecture on their specific cloud platform. Notably, while all these vendors extol multi-cloud strategies in public, their actual commitment is questionable. At the same time, this isn’t much different from any other enterprise technology war in which the vendor wants to own the entire pie.

Enter Alibaba, IBM, and Oracle (AIO)

In an earlier blog, I explained that we’re seeing a battle between public cloud providers and software vendors such as Salesforce and SAP. However, this isn’t the end of it. Given enterprises’ increasing appetite for cloud adoption, Alibaba, IBM, and Oracle have meaningfully upped the ante on their offerings. The move to the public cloud space was obvious for IBM and Oracle, as they’re already deeply entrenched in the enterprise technology landscape. While they probably took a lot more time than they should have in building meaningful cloud stories, they’re here now. They’re focused on “industrial grade” workloads that have strategic value for enterprises and on making open source core to their offerings to propagate multi-cloud interoperability. IBM has signed multiple cloud engagements with companies including Schlumberger, Coca-Cola European Partners, and Broadridge. Similarly, Oracle has signed with Nissan and Zoom. And Oracle, much like Microsoft, has the added advantage of having offerings in the business applications market. Alibaba, despite its strong focus on China and Southeast Asia, is increasingly perceived as one of the most technologically advanced cloud platforms.

What will happen now, and what should enterprises do?

As enterprises move deeper into their cloud journeys, they must carefully vet and bet on cloud vendors. As infrastructure abstraction reaches a fever pitch with serverless, event-driven applications, and functions-as-a-service, it becomes relatively easy to meet the lofty ideal of Service-Oriented Architecture – a fully abstracted underlying infrastructure – which is also what a true multi-cloud environment embodies. The cloud vendors realize that, as they provide more abstracted infrastructure services, they risk being easily replaced by API calls applications can make to other cloud platforms. Therefore, cloud vendors will continue to build high-value services that are difficult to switch away from, as I argued in a recent blog on multi-cloud interoperability.
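The portability that abstraction buys can be sketched in a few lines. In this minimal, hypothetical example (the event payload shapes are illustrative, not real AWS or GCP trigger formats), the business logic is a pure function with no provider SDK imports, and each cloud gets only a thin shim that translates its trigger payload into a common event – so moving the function between platforms means rewriting the shim, not the application:

```python
from dataclasses import dataclass

# A cloud-agnostic event that the business logic understands.
@dataclass
class OrderEvent:
    order_id: str
    amount: float

# Pure business logic: no provider SDK imports, trivially portable.
def handle_order(event: OrderEvent) -> dict:
    status = "flagged" if event.amount > 10_000 else "accepted"
    return {"order_id": event.order_id, "status": status}

# Thin, provider-specific shims. The payload shapes below are
# hypothetical stand-ins for each platform's trigger format.
def aws_lambda_handler(aws_event: dict, context=None) -> dict:
    detail = aws_event["detail"]
    return handle_order(OrderEvent(detail["id"], float(detail["amount"])))

def gcp_function_handler(cloud_event: dict) -> dict:
    data = cloud_event["data"]
    return handle_order(OrderEvent(data["orderId"], float(data["amount"])))
```

The point of the sketch is architectural, not vendor-specific: the more of an application that lives in the portable core rather than in the shims, the weaker any single platform’s lock-in becomes – which is precisely why vendors push higher-value, harder-to-abstract services.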

It appears the MAGs are going to dominate this market for a fairly long time. But, given the rapid pace of technology disruption, nothing is certain. Moreover, having alternatives on the horizon will keep MAGs on their toes and make enterprise decisions with MAGs more balanced. Enterprises should keep investing in in-house business and technology architecture talent to ensure they can correctly architect what’s needed in the future and migrate workloads off a cloud platform when and if the time comes. Enterprises should also realize that preferring multi-cloud and actually building internal capabilities for multi-cloud are two very different things. In the long run, most enterprises will have one strategic cloud vendor and two to three others for purpose-built workloads. However, they shouldn’t be suspicious of the cloud vendors and shouldn’t resist leveraging the brilliant native services MAGs and AIO have built.

What has your experience been working with MAGs and AIO? Please share with me [email protected].

Reflections on Cloudera Now and the Battle for Data Platform Supremacy | Blog

The enterprise data market is going through a significant category revision, with native technology vendors – like Cloudera, Databricks, and Snowflake – evolving, cloud hyperscalers increasingly driving enterprises’ digital transformation mandates, and incumbent vendors trying to remain relevant (e.g., the 2019 HPE-MapR deal). This revision has led to leadership changes, acquisitions, and interesting ecosystem partnerships. Is data warehousing the new enterprise data cloud category that will eventually be a part of the cloud-first narrative?

Last month I attended Cloudera Now, Cloudera’s client and analyst event. Read on for my key takeaways from the event and let me know what you think.

  • Diversity and data literacy come to the forefront: Props to Cloudera for addressing key issues up front. In the first session, CEO Rob Bearden and activist and historian Dr. Mary Frances Berry had an honest dialogue about diversity and inclusion in tech. More often than not, tech vendors pay lip service to these issues of the zeitgeist, so it was a refreshing change to see the event kicking off with this important conversation. During the analyst breakout, Rob also took questions on data literacy and how crucial it is going to be as Cloudera aims to become more meaningful to enterprise business users against the backdrop of data democratization.
  • Cloudera seems to be turning around, slowly: After a tumultuous period following its merger with Hortonworks in early 2019, Cloudera has new, yet familiar, leaders in place, with Rob Bearden (previously CEO of Hortonworks) taking over the CEO reins in January 2020. The company reported its FYQ2 2021 results a few weeks before the event: revenue increased 9 percent over the previous quarter, subscription revenue was up 17 percent, and Annualized Recurring Revenue (ARR) grew 12 percent year-over-year. ARR will be key for Cloudera in showcasing stickiness and client retention. While its losses narrowed in FYQ2 2021, it has more ground to cover on profitability.
  • Streaming and ML will be key bets: As the core data warehousing platform market faces more competition, it is important for Cloudera to de-risk its portfolio by expanding revenue from emerging high-growth spend areas. It was good to see streaming and Machine Learning (ML) products growing faster than the company overall. In early October, it also announced its acquisition of Eventador, a provider of cloud-native services for enterprise-grade stream processing, to further augment and accelerate its own streaming platform, DataFlow. The aim is to bring this all together through Shared Data Experience (SDX), Cloudera’s integrated offering for security and governance.
  • We are all living in the hyperscaler economy: Not surprisingly, there was a fair share of discussion around the increasing role of the cloud hyperscalers in the data ecosystem. The hyperscalers’ appetite is voracious; while the likes of Cloudera will partner with these cloud vendors, competition will increase, especially on industry-specific use cases. Will one of the hyperscalers acquire a data warehousing vendor? One can only speculate.
  • Industry-specificity will drive the next wave of the platform growth story: I’ve been saying this for a while – clients don’t buy tools, they buy solutions. Industry context is becoming increasingly important, especially in more regulated and complex industries. For example, after its recent Vlocity acquisition, Salesforce announced Salesforce Industries to expand its industry product portfolio, providing purpose-built apps with industry-specific data models and pre-built business processes. Similarly, Google Cloud has ramped up its industry solutions team by hiring a slew of senior leaders from SAP and elsewhere in the industry. For the data vendors, focusing on high-impact, industry-led use cases – on their own and with partners – will be key to unlocking value for clients and driving differentiation. Cloudera showcased some interesting use cases for healthcare and life sciences, financial services, and consumer goods. Building a long-term product roadmap here will be crucial.

By happenstance, the Cloudera event started the same day its primary competitor, cloud-based data warehousing vendor Snowflake, made its public market debut and more than doubled on day one, making it the largest-ever software IPO. Make of that what you will, but to me it is another sign of validation for the data and analytics ecosystem. Watch this space for more.

I’d enjoy hearing your thoughts on this space. Please email me at: [email protected].

Full disclosure: Cloudera sent a thoughtful package ahead of the event, which included a few fine specimens from the vineyards in La Rioja. I can confirm I wasn’t sampling them while writing this.

IBM Splits Into Two Companies | Blog

IBM announced this week that it is spinning off its legacy Managed Infrastructure business into a new public company, thus creating two independent companies. I highly endorse this move and, in fact, advocated it for years. IBM is a big, successful, proud organization. But it has been apparent for years that it faced significant challenges in trying to manage two very different businesses and operate within two very different operating models.

Read more in my blog on Forbes

Cloud Wars, Chapter 4: Own the “Low-Code,” Own the Client | Blog

In my series of cloud war blog posts, I’ve covered why the war among the top three cloud vendors is so ugly, the fight for industry cloud, and edge-to-cloud. Chapter 4 is about low-code application platforms. And none of the top cloud vendors – Amazon Web Services (AWS), Microsoft Azure (Azure), and Google Cloud Platform (GCP) – want to be left behind.

What is a low-code platform?

Simply put, a platform provides the runtime environment for applications. The platform takes care of applications’ lifecycle as well as other aspects such as security, monitoring, and reliability. A low-code platform, as the name suggests, makes all of this simple so that applications can be built rapidly. It generally relies on a drag-and-drop interface to build applications, with the underlying code hidden from the user. This setup makes it easy to automate workflows, create user-facing applications, and build the middle layer. Indeed, the key reason low-code platforms have become so popular is that they enable non-coders to build applications – almost mirroring the good old What You See Is What You Get (WYSIWYG) tools of the HTML age.

What makes cloud vendors so interested in this?

Cloud vendors realize their infrastructure-as-a-service has become so commoditized that clients can easily switch away, as I discussed in my blog, Multi-cloud Interoperability: Embrace Cloud Native, Beware of Native Cloud. Therefore, these vendors need to have other service offerings to make their propositions more compelling for solving business problems. It also means creating offerings that will drive better stickiness to their cloud platform. And as we all know, nothing drives stickiness better than an application built over a platform. This understanding implies that cloud vendors have to move from infrastructure offerings to platform services. However, building applications on a traditional platform is not an easy task. With talent in short supply, necessary financial investments, and rapid changes in business demand, enterprises struggle to build applications on time and within budget.
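The commoditization argument above can be made concrete with a small sketch. Assuming a hypothetical application that codes against an abstract storage interface rather than any one vendor’s SDK (the in-memory classes below are stand-ins for what would, in practice, wrap something like boto3 or google-cloud-storage), switching providers collapses to swapping one adapter – which is exactly the switchability that pushes vendors up the stack toward stickier platform services:

```python
from abc import ABC, abstractmethod

class ObjectStore(ABC):
    """Abstract storage interface; the application depends on this,
    not on any single vendor's SDK."""
    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...
    @abstractmethod
    def get(self, key: str) -> bytes: ...

# Hypothetical stand-ins for two clouds' commoditized object storage.
# In real use, each would wrap the corresponding vendor SDK.
class CloudAStore(ObjectStore):
    def __init__(self):
        self._blobs: dict[str, bytes] = {}
    def put(self, key: str, data: bytes) -> None:
        self._blobs[key] = data
    def get(self, key: str) -> bytes:
        return self._blobs[key]

class CloudBStore(ObjectStore):
    def __init__(self):
        self._blobs: dict[str, bytes] = {}
    def put(self, key: str, data: bytes) -> None:
        self._blobs[key] = data
    def get(self, key: str) -> bytes:
        return self._blobs[key]

def archive_report(store: ObjectStore, report: bytes) -> bytes:
    # Switching providers is one constructor change at the call site,
    # not an application rewrite.
    store.put("reports/q1", report)
    return store.get("reports/q1")
```

A low-code platform is the opposite of this pattern: the application’s logic, data model, and workflows live inside the vendor’s proprietary builder, so there is no neutral interface to swap behind – hence the stickiness.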

Enter low-code platforms. This is where cloud vendors, which already host a significant amount of their clients’ data, become interested. A low-code platform that runs on their cloud not only enables clients to build applications more quickly, but also helps create stickiness, because low-code platforms are notorious for non-interoperability – it’s very difficult to migrate from one to another. But this isn’t just about stickiness. It’s one more initiative in the journey toward owning enterprises’ technology spend by building a cohesive suite of services. In GCP’s case, this means recognizing that its API-centric assets offer a goldmine for stitching applications together. For example, Apigee helps expose APIs from different sources and platforms; then GCP’s AppSheet, which it acquired last year, can use the data those APIs expose to build business applications. Microsoft, on the other hand, is focusing on its Power Platform to build its low-code client base. Combine that with GitHub and GitLab, which have become the default code stores for modern programming, and there’s no end to the innovation that developers or business leaders can create. AWS is still playing catch-up, but its launch of Honeycode has business written all over it.

What should other vendors and enterprises do?

Much like the other cloud wars around grabbing workload migration, deploying and building applications on specific cloud platforms, and the fight for industry cloud, the low-code battle will intensify. Leading vendors such as Mendix and OutSystems will need to think creatively about their value propositions around partnerships with mega vendors, API management, automation, and integration. Most vendors support deployment on the cloud hyperscalers, and now – with a competing offering in play – they need to tread carefully. Larger vendors like Pega (Infinity Platform), Salesforce (Lightning Platform), and ServiceNow (Now Platform) will need to support the development capabilities of different user personas, add more muscle to their application marketplaces, generate more citizen developer support, and create better integration. The start-up activity in low-code is also at a fever pitch, and it will be interesting to see how it shapes up given the mega cloud vendors’ increasing appetite in this area. We covered this in an earlier research initiative, Rapid Application Development Platform Trailblazers: Top 14 Start-ups in Low-code Platforms – Taking the Code Out of Coding.

Enterprises are still figuring out their cloud transformation journeys. This additional complexity further exacerbates their problems. Many enterprises have their infrastructure teams lead the cloud journey, but these teams don’t necessarily understand platform services – much less low-code platforms – very well. So, enterprises will need to upskill their architects, developers, DevOps, and SRE teams to ensure they understand the impact on their roles and responsibilities.

Moreover, as low-code platforms give more freedom to citizen developers within an enterprise, tools and shadow IT groups can proliferate very quickly. Therefore, enterprises will have to balance the freedom of quick development with defined processes and oversight. Businesses should be encouraged to build simpler applications, but complex business applications should be channeled to established build processes.

Low-code platforms can provide meaningful value if applied right. They can also wreak havoc in an enterprise environment. Enterprises are already struggling to track and make sense of their cloud spend; cloud vendors introducing low-code platform offerings into their service mix is going to make that task even more difficult.

What has your experience been with low-code platforms? Please share with me at [email protected].
