Category: Cloud & Infrastructure

Demystifying Cloud Advisory | Blog

Before embarking on a cloud journey, every enterprise should have its IT landscape assessed by an external advisor or an internal team. But how deep should the evaluation go, and what does it cover? Let’s clear up the confusion about cloud advisory and discover how to start your migration and modernization programs off right.

Starting out

To create a successful migration roadmap, due diligence (cloud discovery and assessment) is critical because this first phase will directly impact migration execution and management. Any action plan to migrate and/or modernize workloads to the cloud must consider the source environment and the business requirements.

Most enterprises typically seek help from cloud consulting service providers who bring in technical expertise as well as proprietary tools, accelerators, and frameworks required to deliver the project.

Determining the assessment extent

Choosing between the two assessment types prevalent in the market depends on the stage of the organization’s cloud transformation journey and the cloud consulting support needed:

  • Low-touch assessment: Often, clients want a quick, high-level assessment before deciding to move to cloud. The scope is restricted to business and IT strategy alignment. The objective is to arrive at a top-line business case looking at Total Cost of Ownership (TCO) and Return on Investment (ROI) using the information gathered from stakeholder interviews, without deploying any discovery tools (a simple illustrative calculation follows this list). These projects typically take one to two months
  • High-touch assessment: This detailed exercise recommends a roadmap that will help clients later migrate workloads to cloud. Discovery of workloads is largely tool-driven, and the migration execution team will reference the analysis and recommendations. Occasionally, service providers also conduct Proofs of Concept (POCs) and migrate a few apps to cloud during this phase, mostly to determine the feasibility of the larger execution program. Projects at this higher level can take up to five months
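
For illustration, here is a minimal sketch of the kind of top-line TCO and ROI arithmetic a low-touch assessment might produce. All figures, the three-year horizon, and the function name are hypothetical placeholders, not outputs of any Everest Group model or discovery tool.

```python
# Illustrative only: a toy top-line business case built from stakeholder-interview
# estimates rather than tool-based discovery.

def simple_cloud_business_case(on_prem_annual_cost, cloud_annual_cost,
                               one_time_migration_cost, years=3):
    """Return multi-year TCO for both options, the savings, and a rough ROI."""
    tco_on_prem = on_prem_annual_cost * years
    tco_cloud = cloud_annual_cost * years + one_time_migration_cost
    savings = tco_on_prem - tco_cloud
    roi = savings / one_time_migration_cost  # return on the migration spend
    return tco_on_prem, tco_cloud, savings, roi

tco_on_prem, tco_cloud, savings, roi = simple_cloud_business_case(
    on_prem_annual_cost=2_000_000,    # hypothetical current annual run cost
    cloud_annual_cost=1_500_000,      # hypothetical annual cloud run cost
    one_time_migration_cost=750_000,  # hypothetical migration program cost
)
print(f"3-year TCO on-premises: ${tco_on_prem:,.0f}")
print(f"3-year TCO on cloud:    ${tco_cloud:,.0f}")
print(f"Savings: ${savings:,.0f} | ROI on migration spend: {roi:.0%}")
```

In a high-touch assessment, these interview-based estimates would typically be replaced with tool-discovered inventory and utilization data.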

Cloud advisory objective and depth

Organizations carry out high-touch assessments to gain an in-depth workload evaluation, resulting in nearly 60 to 70% of clients proceeding with a cloud migration transformation journey. In more than 90% of the cases, we observed clients immediately implementing the decommission/archiving-related recommendations.

The following key activities are conducted in these deep appraisals:

  • Assessing application health: Reviewing application-specific attributes such as availability, criticality, and stability (issues per month) is important to identify the appropriate migration strategy
  • Categorizing using 7Rs analysis: Tagging each workload with the appropriate migration strategy is the major goal. Depending on their characteristics, workloads are segregated using the 7Rs: Rehost, Replatform, Refactor, Rearchitect, Replace, Retain, or Retire (a simplified tagging sketch follows this list). For each application, a target state for each component (database, web server, app server, etc.) might also be identified at this stage
  • Planning migration waves: Applications that must be migrated together are grouped into waves, and this grouping determines how they are moved. The migration plan serves as a reference for the execution team
  • Determining TCO: The cloud advisory service provider can also be tasked with analyzing the costs of migrating and hosting workloads
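
As a simplified illustration of the 7Rs tagging step above, the sketch below assigns a strategy to each discovered workload. The attribute names and decision rules are hypothetical and far coarser than a real assessment, which also weighs interdependencies, compliance, licensing, and cost.

```python
# Illustrative only: a toy 7Rs tagging pass over a discovered application inventory.

from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    business_value: str     # "high" or "low" (hypothetical attribute)
    end_of_life: bool       # application slated for decommissioning
    saas_alternative: bool  # a SaaS product could replace it
    cloud_ready: bool       # runs on supported OS/middleware without changes
    needs_redesign: bool    # monolith that must be rebuilt for a cloud-native target

def classify(w: Workload) -> str:
    """Assign one of the 7Rs to a workload using simplified rules."""
    if w.end_of_life:
        return "Retire"
    if w.business_value == "low" and not w.cloud_ready:
        return "Retain"
    if w.saas_alternative:
        return "Replace"
    if w.needs_redesign:
        return "Rearchitect"   # or Refactor, depending on the depth of change
    if w.cloud_ready:
        return "Rehost"
    return "Replatform"        # minor changes (e.g., a managed database) before moving

inventory = [
    Workload("legacy-reporting", "low", True, False, False, False),
    Workload("order-portal", "high", False, False, True, False),
    Workload("hr-suite", "high", False, True, False, False),
]

for w in inventory:
    print(f"{w.name}: {classify(w)}")
```

The per-component target states mentioned above (database, web server, app server) would hang off each workload record in the same way.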

Choosing an advisor

Almost all service providers have developed cloud advisory capabilities as the market has grown. The majority also leverage proprietary tools and accelerators along with popular third-party cloud migration tools such as Cloudamize, Device42, and Movere.

Everest Group believes the cloud migration and modernization space will continue to evolve in the coming years. Until the dust settles, we see the market grappling with incoherent definitions and interpretations, resulting in dissimilar pricing for advisory services. Understanding what’s involved in the starting assessment will help you select a partner that will set your journey off in the right direction.

To access more information about the future of cloud and cloud management, watch our recent webinar on demand, Hybrid Cloud: The Future of an Ideal Enterprise Architecture. To share your experiences with cloud advisory programs, please reach out to [email protected].

Multi-cloud and Modern Applications: Doomed to Fail | Blog

Are multi-cloud and modern applications a panacea or a problem? As cloud journeys scale and newer ways of building workloads are adopted, the industry is divided over the value of these initiatives. With increasing concerns about their viability, enterprises need to address some key questions before moving forward. Read on to learn more.

In our previous blogs, we covered the dichotomy of multi-cloud, exploring whether it is a strategic choice or confusion, and examined interoperability. Let’s now dive into the debate over these approaches.

While enterprises understand that new digital business models require them to fundamentally change the way they consume cloud and build software, they aren’t necessarily aligned on the best models for the future. Not everyone is completely sold on multi-cloud, and doubts are emerging among large enterprises.

The top five questions enterprises ask are:

  1. Is there a better way to solve business challenges than assuming that multi-cloud and modern applications are the panacea?
  2. Is multi-cloud now a distraction to our technology teams?
  3. Is multi-cloud a case of “fear, uncertainty, and doubt” created by the nexus of cloud vendors and their partners?
  4. How can we succeed in multi-cloud when we barely have skills for one cloud to build, manage, and optimize workloads?
  5. Why should we build modern applications this way if they are so complex to build, operate, and sustain?

These questions are understandable – even if not always well founded. However, unless enterprises get comfortable with and address these challenging issues, they cannot proceed in their cloud or modern applications journey.

What should enterprises do?

Based on our research, we recommend the following three steps to succeed:

  • Acknowledge: First, acknowledge that multi-cloud and modern applications are not a cakewalk but very complex strategic initiatives. Moreover, they may not be relevant for all enterprises or use cases. Stress testing the current operating model, development practices, and existing investments is important before charting this journey. In addition, performing analysis to understand the operating cost of multi-cloud and modern applications is critical
  • Assess: Next, discovering the existing technology and business estate, aligning with future priorities, and understanding in-house talent, program risks, and funding capabilities become important. Once these decisions are made, enterprises need to consider architectural choices and technology stacks. Wrong choices in these critical input areas can derail the multi-cloud and modern applications journey
  • Act: Finally, understand that it is not a foregone conclusion that multi-cloud and modern applications will always benefit or harm your enterprise. In addition to the technology challenges, operating models must change. Therefore, rationalizing tools, realigning teams, prioritizing funnel funding, and transforming talent are critical. Simulating these workloads before they are built and holding cloud vendors and partners contractually accountable are important. Enterprises should also understand that some existing technology investments will become irrelevant, and they will need to buy newer tools across design, build, and run

What should vendors do?

In this complex landscape, cloud providers, service partners, and technology companies have their own incentives and businesses to run, and none have the client’s best interests as their core agenda. Vendors need to build data-driven models to show the value of multi-cloud and modern applications initiatives and remove as much subjectivity and intuition as possible from the process. Moreover, building platforms that can simulate these workloads across the lifecycle, as well as the talent, funding, and process transformation needed for this journey, is important. If the returns are underwhelming, enterprises should not bother going down the multi-cloud and modern applications route.

Suppliers should be proactive enough to let clients know of the operating model changes needed to adopt multi-cloud and modern applications. We believe system integrators have a more strategic role to play here because cloud or tech vendors do not understand the client landscape and have less incentive to drive such fundamental operating model transformation.

In the end, it boils down to the conviction enterprises have in their multi-cloud and modern applications initiatives. Using tools and platforms to stress test can move the decision from a gut feeling to a fact-based one.

Please share your experiences with multi-cloud and modern applications with me at [email protected].

Discover more about our digital transformation research and insights.

Databricks vs Snowflake: A Rivalry to Last or Lunch for Cloud Vendors? | Blog

In the latest tech industry rivalry, the competition between Databricks and Snowflake in the cloud data and analytics space is getting a lot of attention. It joins other marquee rivalries of the past 100 years, such as those between IBM and HP, SAP and Oracle, or AWS and Azure. To learn more about the similarities and differences between these two data platform providers and how to make better buying decisions when choosing between them, read on.

What do Databricks and Snowflake do?

For the uninitiated, Databricks focuses on analyzing data at scale regardless of its location. It can broadly be considered a data and analytics platform that helps enterprises extract value from their data. Snowflake is a cloud-based data warehousing platform that positions itself as a simpler replacement for more complex offerings from traditional vendors such as Oracle and even cloud vendors such as AWS, Microsoft, and Google.

Both platforms apply AI to enterprise data problems. In that sense, they are Enterprise AI companies that plan to transform how enterprises use data, whether by using AI to integrate data lakes and warehouses, crunching data at massive scale to make decisions, or simply serving as an intelligent analytics platform.

Where are the firms today?

Snowflake went public in 2020 in the largest software IPO in history, at a valuation of US$33 billion. Databricks, on the other hand, remains private and recently reached a US$38 billion valuation. While money is less of a problem, mindshare, being first to market, and the threat from cloud hyperscalers are bigger challenges. Both vendors struggle with the significant talent demand-supply mismatch, as we covered in our earlier research.

The management teams of both companies have strong respect for each other. Databricks, for example, acknowledges that Snowflake had a head start. Snowflake, in turn, realizes that some Databricks features need to be built for its platform as well.

What is happening?

The two vendors are well covered in the public arena, and many have written, almost with a romantic spin, about their roots, success, and management backgrounds. The firms have different management styles, with Snowflake run by a professional CEO and Databricks by its founder. However, clients care little about vendors’ internal operating models. They are more concerned about whether to bet on these firms, given that cloud vendors have been reshaping the industry. In addition, these two companies depend on cloud vendors for their own platforms.

Both vendors have taken potshots at each other with competing offerings bearing similar-sounding names, such as Data Ocean from Snowflake and Data Lakehouse from Databricks. They also collaborate and offer connectors to each other’s platforms even as they keep developing their own versions of these offerings. The sales and technical teams of each vendor highlight challenges in the other’s platform to clients: Databricks, for example, points to Snowflake’s proprietary model versus its own open-source platform, while Snowflake emphasizes its faster compute scaling and better data compression.

What will happen?

Developers, operators, and data professionals have strong views on which platform(s) they plan to leverage. Given Snowflake’s approach of building its platform from a warehousing perspective, enterprises find it easier to migrate to; coming from a data lake perspective, Databricks has a tougher battle to fight. Moreover, Snowflake is perceived as simpler to adopt than Databricks. The bigger issue for both vendors is the threat from cloud providers: not only do they offer their platforms on the cloud hyperscalers, but those hyperscalers have built their own suites of data-related offerings.

Both Snowflake and Databricks are still running at a loss. Innovation will be needed to compete with cloud vendors, and innovation is costly. In addition to cloud, another big challenge these two vendors face is the growing trend of data decentralization. As data fabric and data mesh concepts gain traction, building a lake or warehouse may lose relevance. Therefore, both vendors will need to meet data where it is generated or consumed and build connectors to as many platforms as possible. Moreover, as more open-source data platforms gain traction, the earlier powerhouses (Oracle, SAP, Microsoft, and IBM) may decline, which will impact these two vendors as well unless they scale their offerings to these open-source database, messaging, and event platforms.

What should enterprises do?

It’s a known fact that a large number of Databricks clients are customers of Snowflake as well. We recommend the following to enterprises:

  • Segregate the applications: With multi-cloud gaining traction, enterprises are comfortable investing in multiple data platforms as well. They need to segregate workloads running on classical Oracle, SAP, Teradata, and similar platforms from the newer workloads they plan to build or modernize, generally on open-source databases. As the data types supported by applications evolve, enterprises will need help from data vendors
  • Evaluate partner innovation: In addition to the issues around talent availability, enterprises should evaluate the ecosystem around these two vendors. Innovation that other technology and service companies are building for these data platforms should be important decision criteria
  • Bet on architecture: Snowflake and Databricks have fundamentally different views of the data market. Though their offerings may converge, one brings a warehouse perspective and the other a lakehouse perspective. Enterprises should therefore think about their architecture for the future. With architectural complexity on the rise, they should ensure their current data management bets align with their business needs 5-10 years down the road

The market is still divided on cloud’s role in data transformation, given the challenges around cost and latency. However, as these platforms bring down the total cost of ownership by segregating compute and storage, cloud data platforms will witness growing adoption.

The general questions about the best sourcing methods will always persist, irrespective of technology. Enterprises will need to address issues such as lock-in, security, risk management, spend control, and exit strategy in making their purchasing decisions.

What has your experience been in using Snowflake and Databricks? Please reach out to me at [email protected].

Why Companies Are Considering Small Tech Firms for Cloud Services | Blog

Cloud as a concept and then as a reality swept through businesses over the past ten years, and most companies moved many of their applications to public cloud platforms. AWS, Google Cloud Platform, and Microsoft’s Azure (the hyperscale service providers) are now powerful influencers in business. They turned IT into a commodity and then put an as-a-service layer on it, influencing business thinking as well as IT. But companies are now competing in a different way.

Read more in my blog on Forbes

Cloud Transformation: How Much Is Enough? | Blog

With today’s business transformation led by cloud, the migration frenzy remains at a fever pitch. Even though most cloud vendors are now seeing slower growth, it will be years before this juggernaut halts. But can you have too much cloud? The question of how far enterprises should go in their cloud transformation journey is rarely considered. Read on to learn when it may be time for your enterprise to stop and reexamine its cloud strategy.

Enterprises believe cloud will continue to be critical but only one part of their landscape, according to our recently published Cloud State of the Market 2021. Once enterprises commit to the cloud, the next question is: How far should they go? This runs deeper than, and far beyond, asking how much of their workload should run on cloud, when the opportune time is to repatriate workloads from cloud, and whether workloads should be moved between clouds.

Unfortunately, most enterprises are too busy with migration to consider it. Cloud vendors certainly aren’t bringing this question up because they are driving consumption to their platforms. Service partners are not talking about it either, as they have plenty of revenue to make from cloud migration.

When should enterprises rethink the cloud transformation strategy?

The challenges in cloud transformation can manifest in multiple ways depending on the enterprise context. However, our work with enterprises indicates three major common obstacles. It’s time to revisit your cloud journey if your enterprise experiences any of the following:

  • Cloud costs can’t be explained: Cloud cost has become a major issue as enterprises realize they did not plan their journeys well enough or account for the many unknowns at the start. Once that ship has sailed, the focus changes to micromanaging cloud costs and justifying the business case. It is not uncommon for enterprises to see total cost of ownership go up by 20% post cloud migration, and the rising costs are difficult for technology teams to defend
  • Cloud value is not being met: Our research indicates 67% of enterprises do not get value out of their cloud journey. When this occurs, it is a good point to reexamine cloud. Many times, the issue is a poor understanding of cloud at the outset and of the workloads chosen. During the migration frenzy, shortcuts are often taken and modernization debt gets created, diluting the impact cloud transformation can have for enterprises
  • Cloud makes your operations more complex: With the fundamental cloud journey and early architectural input focused more on finding the best technology fits, downstream operational issues are almost always ignored. Our research suggests 40-50% of cloud spend goes to operations, yet enterprises do not think this through upfront. With the inherent complexity of the cloud landscape, accountability may become a challenge, and as teams collapse their operating structures, the problem is exacerbated

What should enterprises do when they’ve gone too far in the cloud?

This question may appear strange given that enterprises are still scaling their cloud initiatives. However, some mature enterprises are already struggling to decide the next steps in their cloud journey. Each enterprise, and each business unit within it, should evaluate the extent of its cloud journey. If any of the points mentioned above are becoming red flags, they must act immediately.

Operating models also should be examined. Cloud value depends on an enterprise’s ways of working and internal structure. Centralization, federation, autonomy, talent, and sourcing models can all influence cloud value. However, changing operating models in pursuit of cloud value should not become a case of putting the cart before the horse.

Enterprises always struggle with the question of where to stop. This challenge is only made worse by the rapid pace of change in cloud. As enterprises go deeper into cloud stacks of different vendors, it will become increasingly difficult to tweak the cloud transformation journey.

Despite these pressures, enterprises should periodically evaluate their cloud journeys. Cloud vendors, system integrators, and other partners will keep pushing more cloud at enterprises. Strong enterprise leadership that can ask and understand the larger question from a commercial, technical, and strategic viewpoint is needed to determine when enough cloud is enough. Therefore, beyond the journey to the cloud and the journey in the cloud, enterprises should now also focus on the journey’s relevance and value.

If you would like to talk about your cloud journey, please reach out to Yugal Joshi at [email protected].

For more insights, visit our Market Insights™ exploring the cloud infrastructure model.

Multi-cloud: Strategic Choice or Confusion? | Blog

The multi-cloud environment is not going away, with most large enterprises favoring this approach. Multi-cloud allows enterprises to select different cloud services from multiple providers because, among other factors, some are better suited to certain tasks than others. While there are valid points both for and against multi-cloud in this ongoing debate, the question remains: Are enterprises making this choice based on strategy or confusion? Let’s look at the issue more closely.

The technology industry has never solved the question of best-of-breed versus bundled/all-in consumption. Many enterprises prefer to use technologies consumed from different vendors, while others prefer to have primary providers with additional supplier support. Our research suggests 90% of large enterprises have adopted a multi-cloud strategy.

The definition of multi-cloud has changed over the years. In the SaaS landscape, enterprise IT has always been multi-cloud, needing Salesforce.com to run customer experience, Workday to run human resources, SAP to run finance, Oracle to run the supply chain, and ServiceNow to run service delivery. The advent of infrastructure platform players such as Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform (GCP) has reinvigorated the best-of-breed versus all-in cloud debate that results in multi-cloud or single-cloud adoption.

In a true multi-cloud world, parts of workloads were expected to run on different clouds seamlessly. Increasingly, however, interoperability is becoming the core discussion in multi-cloud: it is not about splitting workloads to run across clouds, but about ensuring a workload on one cloud can be ported to another. While debating a pedantic definition of multi-cloud is moot, it is important to acknowledge it as the way forward.

Most cloud vendors now realize multi-cloud is here to stay. However, behind closed doors, the push to go all-in is very apparent across the three large vendors. Let’s examine the following pro and anti-multi-cloud arguments:

[Exhibit: pro and anti-multi-cloud arguments]

Both the pro and anti-multi-cloud camps have strong arguments, and beyond the points above there are many more on each side. But the truth is that increasing numbers of enterprises are adopting multi-cloud. So, when an enterprise proactively adopts a multi-cloud strategy, does that reflect a strategic choice or strategic confusion about cloud, its role, and the other factors outlined above?

This is a hard question to answer, and each enterprise will have to carve out its own cloud strategy. However, enterprises should realize this strategy will change in the future. No enterprise will be “forever single cloud,” but most will be “forever multi-cloud.” Once they embark on a multi-cloud strategy, it will be extremely rare for enterprises to go back, whereas they can change a single-cloud strategy more easily.

In enterprises with significant regional or business autonomy, multi-cloud adoption will grow. Enterprises may adopt various cloud vendors for different regions due to their requirements for workloads, regulations, vendor relationships, etc. Instances will continue to exist where some senior leaders support certain cloud vendors, and, as a result, this preference may also lead to multi-cloud adoption.

On many occasions, enterprises may adopt multi-cloud for specific workloads rather than as part of their strategy. They may want data-centric workloads to run on a cloud but may not want to leverage the cloud for other capabilities. Many cloud vendors may play “loss leaders” to get strategic enterprise workloads (e.g., SAP, mainframe) onto their platform to create sticky relationships with clients.

Many software vendors are launching newer offerings proclaiming they work best with clients’ multi-cloud environments. As an ecosystem is built around multi-cloud, it will be hard to change. In addition to AWS, GCP, and MS Azure, other cloud vendors are upping their offerings, as we covered earlier in Cloud Wars Chapter 5: Alibaba, IBM, and Oracle Versus Amazon, Google, and Microsoft. Is There Even a Fight?.

Given that multi-cloud drives “commoditization” of the underlying cloud platforms, large cloud vendors are skeptical of it. Integration layers that shift value accretion to abstraction platforms rather than core cloud services are an additional vendor concern. Eventually, however, a layer on top of these cloud vendor platforms will enable different cloud offerings to work together seamlessly. It will be interesting to see whether cloud platform providers or other vendors end up building such a layer.

We believe system integrators have a good opportunity to own this “meta integration” of multi-cloud and create seamless platforms. However, most system integrators are afraid of upsetting the large cloud vendors by even proactively raising the idea, let alone creating such a service. This reluctance may harm the cloud industry in the long run.

What are your thoughts about multi-cloud as a strategy or a confusion? Please write to me at [email protected].

Will India’s Personal Data Protection Bill Act as a Harbinger for a Sovereign Cloud Initiative? | Blog

Pending legislation intended to protect the privacy of India’s citizens could set the stage for a sovereign cloud initiative and new opportunities in the Indian cloud ecosystem. Is India following the same trajectory as Europe toward data sovereignty? And what benefits could it bring to the country and its people? To learn more about the ripple effects that passing the Personal Data Protection (PDP) Bill could have on the industry, read on.

The passage of the PDP Bill would change data privacy dynamics within India by regulating the use of individuals’ data by the government and private companies. While not expected to come before the Indian Parliament for at least another three months, when the winter session starts in November, the long-delayed and highly debated legislation has larger potential implications.

First brought to the Parliament in 2019, the bill is now with the Joint Parliamentary Committee (JPC) for examination, where five extensions to submit its report on the bill have already been granted.

The most current draft has been criticized by many, including former Justice B.N. Srikrishna, who worked extensively in defining and writing the first draft of the PDP Bill. Justice Srikrishna has highlighted certain provisions in the amended PDP Bill 2019 that he says make it “dangerous” and can turn India into an “Orwellian State.”

The JPC, led by chairperson P.P. Chaudhary, has been tasked with identifying the problems and potential solutions and has so far held talks with Facebook, Twitter, Amazon, Google, Airtel, Jio, Ola, Uber, and Paytm, among other major technology companies.

Definitions and points of contention

Among the points of concern are the definitions of the types of data and where each can be stored and processed. PDP Bill 2019 has segregated personal data into the following sub-categories:

  • Sensitive Personal Data – Defined in Chapter 1, Section 3, Sub-Section 36 as any personal data which may reveal, be related to, or constitute financial data, health data, official identifier, sex life, sexual orientation, biometric data, genetic data, transgender status, intersex status, caste or tribe, religious or political belief or affiliation, and any other data categorized as Sensitive Personal Data under Section 15
  • Critical Personal Data – The government has been given broad discretion to define this type of data. While not final, it is currently stated as “personal data as may be notified by the Central Government to be the Critical Personal Data”

Unlike the original intent that mandated the storage of all personal data within India’s boundaries, the amended bill states that a copy of Sensitive Personal Data needs to be stored locally and can be sent abroad for data processing, under certain regulations.

The revised bill would require Critical Personal Data to be processed as well as stored within India and only sent outside India under certain conditions (outlined in Chapter VII, Section 34, Sub-Section 2 of the draft).
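
To make the residency rules concrete, here is a loose, illustrative encoding of the two categories as a policy lookup. The category keys follow the bill’s terms, but the stances are simplified paraphrases of the draft as summarized above and are not legal guidance.

```python
# Illustrative only: simplified data-residency stances per the draft PDP Bill 2019,
# as summarized in this blog. Not legal guidance.

RESIDENCY_RULES = {
    "sensitive_personal_data": {
        "copy_stored_in_india": True,   # a copy must be stored locally
        "processing_abroad": "allowed under certain regulations",
    },
    "critical_personal_data": {
        "copy_stored_in_india": True,   # stored and processed within India
        "processing_abroad": "only under the conditions in Chapter VII, Section 34(2)",
    },
}

def cross_border_stance(category: str) -> str:
    """Return the simplified cross-border processing stance for a data category."""
    return RESIDENCY_RULES[category]["processing_abroad"]

print(cross_border_stance("critical_personal_data"))
# -> only under the conditions in Chapter VII, Section 34(2)
```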

These data localization amendments have given the tech giants much-needed respite, especially considering the Competition Commission of India (CCI) has tightened its grip on them over the past two years, the latest episode being the debate over WhatsApp’s new privacy policy.

What’s the next logical step?

India’s current path draws a parallel with the European Union (EU), where data privacy across all the European member states is regulated under the General Data Protection Regulation (GDPR). If we follow the analogy closely, the next logical step for India would be to launch its sovereign cloud, in line with the new European sovereign cloud initiative named GAIA-X.

If India goes ahead with a sovereign cloud, it would unlock a new dimension, at least for the public sector, to explore and build on. With the strong government push under the ‘Make in India’ and ‘Digital India’ initiatives, as well as a strong IT workforce, a sovereign cloud platform would not be too distant a dream.

Some of the benefits to India from a sovereign cloud initiative include:

  • Creating a secure and compliant platform for the public sector: India’s sovereign cloud would provide its public sector a secure, reliable, and compliant platform. Government-backed applications like messaging app Sandes and Twitter’s doppelganger Koo can effectively utilize a sovereign cloud platform. It can further be augmented to develop new applications, especially those designed for the public sector
  • Spurring cross-collaboration across various industries: Having a sovereign cloud platform would enable more vertical industries to securely onboard to the platform. With strong guidelines, anonymized and aggregated data sharing could occur, leading to a collaborative ecosystem of data analytics where citizens reap the benefits
  • Delivering community benefits underpinned by healthcare: A sovereign cloud platform like GAIA-X could augment the healthcare sector’s digitization endeavor and pave a compliant way for Electronic Health Records (EHR) creation and their interoperability. Currently, the Indian government is issuing digital vaccination certificates with QR codes and has plans to issue vaccination certificates that will be valid across the globe. Compliance could be hassle-free if India builds a sovereign cloud platform

The only big challenge India might face is not having a successful sovereign cloud initiative of this scale to benchmark against. Europe’s GAIA-X will be the closest counterpart for India’s sovereign cloud initiative, and that, too, is at a nascent stage.

Ripple effects on the Indian cloud ecosystem

With some degree of data localization seemingly inevitable, companies have identified a good business opportunity and are racing to gain a ‘first-mover’ advantage. Various firms have started the ball rolling – from construction giants like Adani and Hiranandani jumping into the data center business to cloud solution providers like Genesys launching new capabilities with localized data storage and data sovereignty as key factors for their contact center solutions.

Once the PDP Bill is enacted into law, we expect a proliferation of data centers and an increased cloud hyperscaler presence in India. A new hyperscaler-backed sovereign cloud initiative is also possible, along with an increased focus by cloud service providers on the legal framework to keep critical data within India’s geographic boundaries.

In the long run, we can see certain service offerings emerging to manage client data, which would be very similar to how the software and services market for GDPR has evolved over the years.

What do you think the next logical steps for the government will be after passing the Personal Data Protection Bill, and how will the law impact the industry? Please share your thoughts with us at [email protected] and [email protected].

Why Is Cloud Migration Reversing from Public to On-Premises Private Clouds? | Blog

Increasingly this year, we see many companies that aggressively migrated their work to the public cloud now looking to move work back to on-premises and private clouds. The mindset that the public cloud saves money because a company pays only for what it uses is theoretical and really an illusion. Realistically, companies tend to buy capacity rather than pay for actual usage. Thus, companies are in a take-or-pay situation similar to the economics of a private cloud or on-premises solution.

Read more in my blog on Forbes

In the “Next Normal” for Consumer Lending, Cloud-first Platforms Offer a Silver Lining | Blog

The post-pandemic world has created a disparate consumer landscape, with many finding themselves with more disposable income from stimulus checks while others struggle financially. What does it mean for consumer lending?

The consumer lending industry is poised to see tremendous post-pandemic growth, but most lenders’ credit risk appetite has shrunk, given the economic uncertainty.

With consumer expectations and behaviors continuing to change, cloud-first consumer lending platforms will play a key role in driving growth in this new era.

Let’s take a look at what’s changing and what innovative solutions lie ahead. Loans delivered through mobile apps, banking without humans, and consumer banks offering more lifestyle experiences are all trends to watch for.

A new market

The lending industry has been battered by a series of challenges over the last year as COVID-19 caused a significant decline in economic activity and consumers across the globe braced for a recession.

With economic hope in sight and consumers ready to release their pent-up demand and make up for lost spending, demand is now rising. Although it is important for lenders to exercise caution, they must address their shrinking book sizes by targeting the right set of customers with the right products.

But today’s consumers are different than in the past. They demand different experiences and loan products. As a result, lenders must rethink their product, experience, and channel strategies and adapt to changing consumer expectations.

Changing consumer behaviors and new consumer segments

The fundamental tenets of Secured, Ubiquitous, Personalized, Easy, and Responsive (SUPER) lending experiences have not changed; however, their relative importance has. As we are seeing everywhere, customers have become more comfortable with digital transactions and expect to shop for loans through an end-to-end digital lending process with minimal human assistance.

Consumers also have grown increasingly concerned about their financial health and credit standing during these times. This has increased demand for small-ticket loans to finance short-term requirements versus large, long-term loans.

Also expect new consumers with limited or no credit history to join the market for the first time. These include the digitally savvy, well-educated Generation Z segment and financially distressed consumers who are seeking loans as a result of government economic revitalization programs during the pandemic.


Both of these groups will be crucial in driving growth in consumer lending, especially in personal loans. According to research by Experian, while the overall growth in personal loans in the U.S. has declined 6 percent in 2020 compared to 12 percent the prior year, Gen-Z consumers saw their personal loan balances grow 33 percent compared to less than five percent for other age groups.

How lenders should prepare

We are moving from a “low demand, low supply” state to a climate of increasing demand while the supply side struggles to keep pace. Besides their conservative approach, lenders also face challenges in managing changing consumer behavior and providing the right products and experiences.

To cater to this new demand pattern and tap into the new customer segments to revive their shrinking loan books, lenders must leverage data and technology thoughtfully in their loan origination process. Here’s how:

  • Understand and adapt to the new consumer behavior: Lenders must equip themselves with a modern data stack and the right technology such as behavioral AI, which will enable a comprehensive view of a consumer’s situation and provide insights. New initiatives like video-based customer service and the use of bots for assistance will resonate well with customers and deepen relationships and trust
  • Nudge consumers towards low-cost, digital-first channels for acquisitions: While traditional marketing and referral sources remain important, digital channels are becoming mandatory for lenders wishing to compete across all consumer segments. Digital acquisition channels increase efficiency and provide a rich source of digitized data that can further be used to assess the customers and push customized product notifications
  • Offer integrated experiences by introducing lending as part of regular buying processes: Lenders must look to integrate their products as sub-sets of consumers’ home, auto, and other buying processes. An ecosystem approach that partners with local businesses, e-commerce players, educational institutions, and other relevant parties and leverages their consumer bases will be key. Next-generation customers will shift towards a mobile-based super-app ecosystem for all their lifestyle and financial needs, and lenders must not miss the opportunity to be an integral part of that ecosystem
  • Offer contextualized products for evolving consumer needs: In today’s scenario, it has become extremely important to create lending products flexible enough to meet evolving customer needs without significantly changing the operations side. To successfully grow their consumer loan books, lenders must be able to identify the financial issues consumers are facing and support them while also protecting their portfolios. One example would be providing advisory services that help existing customers obtain pandemic-related hardship loans
  • Create a unique experience for each hyper-segment: Lenders should reimagine their product portfolios to cater to micro-segments of customers. As consumers become more value-conscious, offering a unique lending product and providing a real-time, hyper-relevant, and personalized experience for each customer will be important (a toy segmentation sketch follows this list)
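
To make the hyper-segmentation idea concrete, here is a toy sketch. The segment labels, attributes, and thresholds are invented for illustration; a real lender would derive micro-segments from much richer behavioral and bureau data.

```python
# Illustrative only: coarse hyper-segmentation of loan applicants used to tailor offers.

def segment(applicant: dict) -> str:
    """Assign an applicant to a hypothetical micro-segment."""
    thin_file = applicant["credit_history_months"] < 12
    if thin_file and applicant["age"] <= 25:
        return "digital-native Gen Z, thin credit file"
    if thin_file and applicant["income_disruption"]:
        return "financially distressed first-time borrower"
    if applicant["ticket_size"] < 5_000:
        return "small-ticket, short-term borrower"
    return "established borrower"

applicants = [
    {"age": 22, "credit_history_months": 3,  "income_disruption": False, "ticket_size": 2_000},
    {"age": 41, "credit_history_months": 6,  "income_disruption": True,  "ticket_size": 8_000},
    {"age": 35, "credit_history_months": 90, "income_disruption": False, "ticket_size": 3_500},
]

for a in applicants:
    print(segment(a))
```

Each segment would then map to its own product terms, channel, and experience, in line with the recommendations above.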

The solution: cloud-first, platform-based operating models

As banks and financial institutions shift towards a digital lending culture with new products, new channels, and new experiences to meet consumer demands, they must have the right level of agility to respond to continuous changes on the demand side.

A platform-based operating model, which is flexible enough to accommodate the changes, will be essential to react to these evolving demands with speed as well as scale. Cloud technology is the infrastructural element that will enable the agility, flexibility, and scalability of the platform.

At the same time, lenders must reimagine their traditional data value-chain with future-ready data exchanges, enabling real-time data analytics and decision support to provide deeper customer and channel insights. Cloud allows lenders to utilize the data and AI technology that hyperscalers and cloud technology vendors offer to make the best use of internal and external data.

Many banks and financial institutions already rely on third-party digital loan origination platforms to modernize their processes, reduce operational costs, and improve customer acquisition and revenue. Such platforms deliver enhanced omnichannel customer experiences and operational efficiency through AI, machine learning, and analytics-rich solutions, including fraud mitigation.

Looking ahead

The consumer lending industry will only evolve further, and a pragmatic approach to adopting cloud, data and analytics, and exponential technologies will help lenders realize value from technology buzzwords and meet customer experience and business performance objectives. Innovative solutions from product vendors across the lending value chain will enable lenders to close the gaps in their current systems.

If you would like to better understand how a platform-centric approach can transform your consumer lending business in this dynamic environment, please reach out to [email protected] or [email protected].

You can also download our complimentary viewpoint Consumer Lending on the Cloud.

Microsoft Goes All in on Industry Cloud and AI with $20 Billion Nuance Deal | Blog

Yesterday’s announcement of Microsoft’s acquisition of Nuance Communications signifies the big tech company’s serious intentions in the US healthcare market.

We’ve been writing about industry cloud and verticalization plays of big technology companies (nicknamed BigTech) for a while now. With the planned acquisition of Nuance Communications for US$19.7 billion, Microsoft has made its most definitive step in the healthcare and verticalization journey.

At a base level, what matters to Microsoft is that Nuance focuses on conversational AI. Over the years, it has become quite the phenomenon among physicians and healthcare providers – 77 percent of US hospitals are Nuance clients. Also, it is not just a healthcare standout – Nuance counts 85 percent of Fortune 100 organizations as customers. Among Nuance’s claims to fame in conversational AI is the fact that it powered the speech recognition engine in Apple’s Siri.

Why Did Microsoft Acquire Nuance?

The acquisition is attractive to Microsoft for the following reasons:

  1. Buy versus build: If Microsoft (under Satya Nadella) can trust itself to build a capability swiftly, it will never buy. Last year, when we wrote about Salesforce’s acquisition of Slack, we highlighted how Microsoft pulled out of its intent to acquire Slack in 2016 and launched Teams within a year. Could Microsoft have built and scaled a speech recognition AI offering?
  2. Conversational AI: Microsoft’s big three competitors – Amazon, Apple, and Google – have a significant head start in speech recognition, the only form of AI that has gone mainstream and is likely to be a US$30 billion market by 2025. Clearly, with mature competition, this was not going to be as easy as “Alexa! Cut slack, build Teams” for Nadella
  3. Healthcare: This is another battleground for which Microsoft has been building up an arsenal. As the US continues to expand on its $3 trillion spend on healthcare, Microsoft wants a share of this sizeable market. That is why it makes sense to peel the healthcare onion a bit more

 

What Role Does Microsoft Want to Play in Healthcare?

While other competitors (read Amazon, Salesforce, and Google) were busy launching healthcare-focused offerings in 2020, Microsoft was already helping healthcare providers use Microsoft Teams for virtual physician visits. Also, Microsoft and Nuance are not strangers, having partnered in 2019 to enable ambient listening capabilities for physician-to-EHR record keeping. Microsoft sees a clear opportunity in the US healthcare industry.

  • Everest Group estimates that technology services spending in US healthcare will grow at a CAGR of 7.5% for the next five years, adding an incremental US$25 billion to an already whopping $56 billion
  • The focus of Microsoft and its competitors is to disrupt the multi-billion-dollar ($40 billion by 2025) healthcare data (Electronic Medical Record) industry
  • Legacy EMR systems have been a major cause of physician burnout, which the likes of Nuance aim to solve
  • Cloud-driven offerings such as Canvas Medical and Amazon Comprehend Medical are already making Epic Systems and Cerner sit up and take notice

It is not without reason that Microsoft launched its cloud for healthcare last year and has followed it up by acquiring Nuance.

What Does it Mean for Healthcare Enterprises?

Under Nadella, Microsoft has developed a sophisticated sales model that takes a portfolio approach to clients. This has helped Microsoft build strong positioning beyond its Office and Windows offerings, even in healthcare. Most healthcare clients are already exposed to its Power Apps portfolio and Intelligent Cloud (including Azure and the cloud for healthcare) in some form. It is only a matter of time (if the acquisition closes without issues) until Nuance becomes part of its suite of offerings for healthcare.

What Does it Mean for Service Providers?

Returning to our earlier point about head starts, this is where Microsoft has a lead over competitors. Our recent research on the System Integrator (SI) ecosystem indicates that Microsoft is head and shoulders above its nearest competitors when it comes to leveraging the SI partnership channel to bring its offerings to enterprises. This can act as a significant differentiator in taking Nuance to healthcare customers, as SI partners can expect favorable terms of engagement.

[Exhibit: Partners' Perceptions]

Lastly, this is not just about healthcare

While augmenting healthcare capabilities and clients is the primary trigger for this purchase, we believe Microsoft aims to go beyond healthcare to achieve the following objectives:

  • Take conversational AI to other industries: Clearly, healthcare is not the only industry warming up to conversational AI. Retail, financial services, and many other industries have scaled usage. Hence, it is not without reason that Mark Benjamin (Nuance’s CEO) will report to Scott Guthrie (Executive Vice President of Cloud & AI at Microsoft) and not Gregory Moore (Microsoft’s Corporate Vice President, Microsoft Health), indicating a broader push
  • Make cloud more intelligent: As mentioned above, Microsoft will pursue full-stack opportunities by combining Nuance’s offerings with its Power Apps and Intelligent Cloud suites. As a matter of fact, it plans to report Nuance’s performance as part of its Intelligent Cloud segment

Microsoft: $2 Trillion and Beyond

This announcement comes against the backdrop of BigTech and platform companies making significant moves toward industry-specific use cases, which will drive the next wave of client adoption and competitive differentiation. Microsoft’s turnaround and acceleration since Nadella took over as CEO in 2014 are commendable (see the exhibit below). It is on the verge of becoming only the second company to achieve US$2 trillion in market capitalization. This move is a bet on its journey beyond the $2 trillion.

[Exhibit: Microsoft's turnaround and acceleration since Nadella took over as CEO in 2014]

What do you make of its move? Please feel free to reach out to [email protected] and [email protected] to share your opinion.

Request a briefing with our experts to discuss the 2022 key issues presented in our 12 days of insights.
