
Productization Emerges in Media Services – A Shift Driven by Technology | Blog

Leveraging the power of Artificial Intelligence (AI) and data and analytics tools, media and communications companies are redefining themselves from being service providers to product creators. Read on to learn how emerging technology is changing the game in media services. 

While technology has played a pivotal role in media services for some time, use cases have increased and grown in complexity in recent years. AI and data and analytics are now widely used across the media planning and buying value chain. Moreover, media agencies are emphasizing the development of in-house tools, solutions, and products based on data and analytics, AI, and the Internet of Things (IoT) to gain a competitive edge and offer a differentiated value proposition to their clients. Let’s explore this shift further.

Rising momentum in adopting emerging technology in the media services value chain

Essentially, the media services value chain consists of four broad components – media strategy and planning, creative development, buying and execution, and analytics and measurement as depicted below:

Exhibit 1: Media planning and buying value chain


In the early days, technology was mainly used for basic tasks such as segmenting audiences, measuring media campaign analytics, and automating simple processes that were focused on a single segment.

Next, the focus shifted to programmatic advertising, automating the media buying process. Further advancements in AI-enabled programmatic media buying have spurred media agencies to invest in media marketplaces.

For example, GroupM launched Premium Marketplace, a unified programmatic marketplace with increased media buying transparency and efficiency. Omnicom Media Group introduced a programmatic marketplace for point-of-purchase screens in the US. Moreover, Publicis Media launched a programmatic marketplace that allows brands to reach underrepresented communities.

Advancements in the ease of use of AI and IoT, combined with the growing prowess of data and analytics, have unleashed intricate new use cases across the entire media services value chain, opening new avenues for media agencies.

As the number of channels and consumer touchpoints continues to proliferate, the emphasis has shifted to analyzing complex audience segments; automating complex processes; developing accelerated, customized, and optimized media plans across channels; and analyzing and maintaining elaborate data sets of user journeys.

AI and a data-driven approach enable audience building at a granular level, creating a single customer view and delivering highly personalized customer experiences.

Exhibit 2: Use cases of technology in the media services domain


Rise of productization in media services

Increasingly, media companies are investing in platforms, in-house solutions, and tools to expand their offerings and reposition themselves beyond pure-play professional services. Agencies under the parent brand also are leveraging these investments to drive greater synergies by maintaining large in-house data sets and collecting user journey data.

As margins in core media services become thinner, the momentum toward productization has opened a new revenue stream for media agencies. Some notable examples of these investments include:

  • Wavemaker’s Maximize, an AI-powered media planning platform, creates media plans to reach multiple audiences and optimize media investments
  • Omnicom Group’s Omni, a precision marketing and insights platform, is designed to identify and define personalized consumer experiences at scale
  • Performics’ suite includes tools for search optimization, customized user journeys, brand discovery, and Amazon advertising bid strategy
  • Tinuiti’s proprietary Mobius suite consists of MobiusOS, Mobius Apps, and MobiusX
  • Havas Media’s Converged, an identity-based planning and buying platform, aims to deliver stronger insights, tighter targeting, and a more consistent customer journey

Other investments and partnerships to elevate the power of these platforms by integrating additional data sources include Omnicom’s Omni partnerships with Walmart Connect and NBCU Audience Insights Hub, and Wavemaker’s work with Amazon Advertising’s Overlapping Audiences API to bring Amazon’s audience insights into media plans and campaigns.

The launch of channel-specific solutions is another example of the growth in productization. This can be seen in GroupM Australia’s launch of a custom-built, data-driven Digital Out of Home (DOOH) planning and activation platform that enables advertisers to deliver audience-based DOOH campaigns across premium sites. Further, GroupM developed Sightline, a dedicated digital out-of-home solution, which combines programmatic technology and AI with GroupM’s buying scale.

In addition to media agencies, neutral ad-tech platforms such as The Trade Desk, Xandr, and LiveRamp have invested in developing in-house platforms. Solimar by The Trade Desk offers cross-channel precision and measurement capabilities and unleashes the power of first-party data. LiveRamp’s Safe Haven enterprise platform enables data gathering from multiple sources, transforms it into a connected customer view, and then activates it at scale.

Is metaverse the next big thing for media services?

Interestingly, as the industry still explores the potential of the metaverse, media agencies have started executing pilot projects in this area. Publicis Groupe now has over 1000 employees dedicated to metaverse projects. WPP has incubated a metaverse foundry, while Wavemaker recently executed a wedding in the metaverse.

The outlook

Technology – supported by in-house data sets and data aggregated through APIs – has shifted media agencies’ focus toward offering products. This has given them new opportunities to unlock additional revenue streams and deliver differentiated value propositions.

Looking ahead, AI, data and analytics, and enablers such as the metaverse will continue to impact the dynamics of the media services value chain, and we can expect to see more investments in the near term.

To discuss technology in media services, please reach out to Nisha Krishan or Aakash Verma.

To explore more about evolving technologies and what we can expect from the metaverse, watch our webinar, What to Expect from the Metaverse in 2023: Moving Beyond Hype.

Low Code No Code (LCNC) Case Study: Working with a Low-code Platform Provider to Develop Product and Pricing Strategy | Blog

Everest Group recently helped a low code no code (LCNC) product company evolve its product and pricing strategy by assessing the market standing of its platform features and commercials. Read on to learn about the approach, assessment dimensions, and outcomes in this case study.

As part of the engagement, we provided the following services to the client:

  • Identified key strengths and development opportunities for the platform and provided short-term and long-term roadmap recommendations for prioritizing features
  • Assessed the leading LCNC commercial models in the industry, along with enterprise adoption patterns and preferences
  • Performed a price benchmarking exercise
  • Projected the pricing evolution over the next 24 months based on demand patterns and macroeconomic trends

Everest Group approach

Everest Group analyzed the platform from two broad perspectives: application development capabilities (declarative tooling to accelerate application development and delivery) and process orchestration capabilities (ability to design, execute, and monitor business processes). In this blog, we’ll focus on the application development capabilities.

Everest Group gathered insights about the platform capabilities from the provider using a comprehensive RFI that collected more than 190 data points across 17 categories, followed by a briefing and demo showcasing the platform’s capabilities. Inputs from these sources were used in conjunction with our existing low-code research and ongoing conversations with ecosystem players to inform the final review across the below assessment dimensions.

Assessment dimensions

Based on our research of low-code platforms published earlier this year, we identified the following five key areas where LCNC platforms can drive competitive differentiation through strategic investments:


Source: Everest Group

Now let’s take a closer look at each of the assessment dimensions and the features/components evaluated in each of these areas.

  • Pre-built templates

One of the foundational elements of an enterprise-ready low-code platform, this dimension takes into account the availability of out-of-the-box templates for common user interface components, widgets, and functional libraries, as well as reusability of user-built templates, availability of industry-specific out-of-the-box solutions, and a robust marketplace supported by multiple partners.

  • Interoperability

Before incorporating any new technology tool into their ecosystems, enterprise IT teams prioritize ease of integration with the existing tech stack during the evaluation process. This dimension assesses the availability of pre-built connectors to common tools in the enterprise technology stack, low-code options to add further integrations, and the ability for developers to use custom scripting to call external application programming interfaces (APIs) where required (see the brief illustrative sketch after this list).

  • Artificial Intelligence (AI) capabilities for application development

As low-code platforms scale in importance from building departmental workflow applications to business-critical enterprise-grade applications, the extent of AI-powered abilities offered is proving to be a key win theme. Through in-house capabilities or integration with external AI providers, low-code platforms aspire to provide AI-powered development assistance, AI-based application management services, AI-based automated testing, AI-powered code quality alerts, and AI-augmented portfolio analysis.

  • Collaborative development

Business users play an increasingly important role in the application lifecycle, collaborating with professional developers to reduce the gap between their requirements and the final product’s functionalities. Therefore, a platform should offer capabilities that facilitate effective collaboration, such as role-based access, project management capabilities, document sharing capabilities, and notifications on app updates, among others.

  • User Interface (UI) and User Experience (UX) capabilities

This dimension considers the support provided to UI components like lists and tables, the ability to support different devices and operating systems, language options, integrations with design tools, reusable screen templates, customizable themes, and navigation ease. UI/UX capabilities are a key swaying factor in enterprise low code buying decisions, especially when building customer-facing applications.
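Returning to the interoperability dimension above, the following minimal sketch illustrates the kind of custom script a low-code developer might write when no pre-built connector exists. It is purely hypothetical: the endpoint, response shape, and the connector-registration hook are illustrative assumptions, not features of any specific LCNC platform.

```python
# Hypothetical custom-script connector calling an external REST API.
# The URL, token handling, and register_connector() hook are illustrative
# assumptions only; real LCNC platforms expose their own extension points.
import json
import urllib.request


def fetch_customer_record(customer_id: str, base_url: str, api_token: str) -> dict:
    """Call an external API and return the parsed JSON response."""
    request = urllib.request.Request(
        url=f"{base_url}/customers/{customer_id}",
        headers={
            "Authorization": f"Bearer {api_token}",
            "Accept": "application/json",
        },
    )
    with urllib.request.urlopen(request, timeout=10) as response:
        return json.load(response)


# A platform would typically let developers register such a script as a
# reusable connector for citizen developers; the call below is illustrative.
# platform.register_connector("crm_customer_lookup", fetch_customer_record)
```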

Pricing insights

The subscription-based pricing model experienced higher adoption than perpetual licensing in 2021 because it results in lower upfront investments and greater flexibility to scale deployments. Of the eight different pricing models uncovered in our research, the user-based licensing model was the most widely adopted. On-premises deployment of low-code and no-code platforms was found to be 25-50% more costly than cloud-based deployment.
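To make that premium concrete, here is a quick, hypothetical calculation; the baseline subscription figure is made up for illustration, and only the 25-50% range comes from the research above.

```python
# Hypothetical illustration of the 25-50% on-premises cost premium.
# The cloud baseline below is an assumed figure, not a benchmark data point.
cloud_annual_cost = 100_000  # assumed cloud subscription cost, USD per year

low_premium, high_premium = 0.25, 0.50
on_prem_low = cloud_annual_cost * (1 + low_premium)
on_prem_high = cloud_annual_cost * (1 + high_premium)

print(f"Estimated on-premises cost: ${on_prem_low:,.0f} to ${on_prem_high:,.0f} per year")
# Estimated on-premises cost: $125,000 to $150,000 per year
```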

Sample outputs

Source: Everest Group


Outcomes


Gap identification – Everest Group helped the client identify key market differentiators, strengths, and limitations across dimensions, as well as the key features for each of these elements

Fine-tuning product vision and roadmap – The insights helped the client prioritize its feature pipeline and advance its messaging to have a greater impact

Product and pricing strategy – The trend analysis helped the client understand the pricing strategies adopted by its competitors and how these vary based on factors like hosting environment and buyer geography

For more information about this LCNC project or to discuss our research on low-code and no-code platforms, please reach out to Manukrishnan SR, Alisha Mittal, or Yugal Joshi.

Also, learn about the top five demand themes – data and AI, cloud, experience, platforms, and security – driving growth for IT service providers in our webinar, IT Service Provider 2023 Forecast: The Top 5 Themes for Growth and Wallet Share.

Is the Insurance Industry Seeing a SaaS Revolution or a SaaS Sprawl Challenge? | Blog

As leading insurance organizations seek to be more data-driven in their business decisions, they are looking for solutions that can seamlessly integrate with their existing insurance technology stack. Technology providers have responded by building capabilities to offer plug-and-play solutions that align with carriers’ immediate priorities to extract more value from investments. With the industry landscape exploding with multiple solution providers offering carrier customization and commercial flexibility, we are witnessing a flood of SaaS solutions across the insurance value chain. Read on to explore the issue of SaaS sprawl, which is quickly becoming one of the industry’s leading pain points.

At the turn of this century, the SaaS revolution shifted the paradigm of how technology could be deployed. The triad of quick deployment, timely upgrades with little/no inconvenience, and cost-effective solutions made SaaS solutions a force to be reckoned with.

A massive influx of point solutions – tools that aim to address a single use case within a business – followed over the years. Today, an insurance technology stack has multiple point solutions assembled atop core systems to bridge gaps in existing capabilities as illustrated below:

Exhibit 1 – Technology stack in the insurance value chain


We are now seeing a massive explosion of SaaS applications in the entire insurance value chain that likely will reach a point where businesses start to see diminishing returns. Estimates suggest that organizations (with more than 1,000 full-time equivalent employees) use more than 100 applications on average at any given time.

This rapid expansion without any/sufficient oversight has led to SaaS sprawl, the unchecked proliferation of third-party applications that can impact the entire organization. Let’s look at this challenge further.

What is causing SaaS sprawl?

The sheer availability of new technology solutions tops the list of contributors to this issue. The willingness to adopt newer solutions has grown exponentially in recent years with the success of cloud-based applications and services. Shadow IT is another contributor.

While the freedom for employees to use applications without explicit approval from a centralized department can boost productivity and drive innovation, it also can have undesirable complications. The growing clout of low-code/no-code capabilities is pushing this trend further.

Infusing intelligence throughout the insurance value chain requires specific capabilities in the five core areas of product development, sales and distribution, underwriting, policy administration, and claims management (as shown in the exhibit above).

Specialized players that focus on one or a few of these areas are emerging, making it extremely difficult for carriers to choose from the many solutions and to manage multiple applications across an end-to-end technology stack.

As more and more insurance companies turn to SaaS solutions to streamline their operations, they can quickly find themselves in a tangled web of different tools and platforms.

Here are some other challenges that hinder organizations:

  1. Each platform may have its own set of features and capabilities, making it difficult for insurance companies to keep track of their applications and avoid duplicating them
  2. SaaS providers seeking to expand their presence in the value chain can potentially dictate the overall vision of an insurance technology stack through aggressive sales tactics that push insurers to buy more tools (potentially causing more overlap). This can drive up the total cost of ownership and make getting the most out of the technology stack an aspirational goal
  3. Issues integrating systems can cause data silos and inhibit the sharing of potentially crucial information, forcing carriers to incrementally spend on integration and custom builds to gather the single view of data and systems that is lacking
  4. Being locked into contracts with providers makes it hard to modernize or move to alternate providers because migration would be costly



Put succinctly, dealing with compatibility issues among applications and their data interplay, while managing contract and upgrade cycles, becomes a precarious juggling act. As a result, the great bundling of the entire insurance technology stack is needed!

At the same time, it does not make sense to put the brakes on development altogether. Restricting employees’ ability to build and buy applications may do more harm than good.

Organizations need to provide the foundation for their teams to think about software applications strategically. Below are recommendations for enterprises to seize the multitude of opportunities:

  • Define a target state vision and align SaaS providers to this vision through a mix of service-level agreements and continuous collaboration
  • Invest in cloud economics capabilities to manage the cost of cloud spend and conduct exercises to rationalize the SaaS environment every year
  • Educate business and technology executives on how to avoid SaaS sprawl

IT service providers have an important role to play as solution orchestrators. Working with a core insurance technology provider that offers tight integration with third-party SaaS solutions can help carriers manage integration, risk, data access, and cost challenges.

If you have questions or would like to discuss SaaS sprawl, please reach out to [email protected], [email protected], and [email protected].

Explore our upcoming webinar, IT Service Provider 2023 Forecast: The Top 5 Themes for Growth and Wallet Share, for emerging themes, challenges, and growth pockets in the technology services market.

Now is the Time to Protect Operational Technology Systems from Cyber Risks | Blog

With growing digitalization and interconnected devices, Operational Technology systems that monitor and control industrial processes in critical infrastructure are increasingly vulnerable to cyber attacks. Learn about the OT security concerns enterprises face and key considerations for selecting an OT security provider in this blog.   

Historically, IT and Operational Technology (OT) systems have been air-gapped, with little or no enterprise spending on Industrial Control Systems (ICS) security. Further, most investments in industrial robots, Supervisory Control and Data Acquisition (SCADA), and Programmable Logic Controller (PLC) systems were made with a multi-decade horizon. This differs from IT investments, where the horizon is typically five years and the technology refresh cycle takes care of the security risk. Enterprises have been unwilling to touch OT systems because these big, monolithic systems ran well for many years, making security vulnerabilities and risks less of a priority.

But OT systems – that power some of the nation’s most critical infrastructure – are at risk.

With the recent pandemic-driven digitization push and enterprises wanting to run resilient supply chains, these large monolithic untouched systems are now interconnected, making them highly prone to cyber attacks. These OT systems have also never been given basic security treatment of frequent patching, regular security updates, and periodic backups, which has further aggravated the issue.

Operational technology systems cyber attacks

Highly public cyber attacks on OT systems have raised awareness about the serious risks these security breaches can have on essential services, as seen in these industry-specific cases:

  • Manufacturing – This segment experienced the second-highest number of ransomware-associated data extortion attacks in 2021. Traditionally, plant machinery and equipment have been designed for performance, not security. But stalled assembly lines or production units directly impact end buyers and can be disastrous for manufacturers
  • Energy, utilities, and water – Threat actors have been targeting the most crucial elements in this critical infrastructure industry, mandating enhanced cybersecurity controls. Securing critical grid assets, substations, distribution pipelines, meters, etc., must be addressed
  • Oil and gas – Digitizing operations for improved efficiency has increased the attack surface and has made this area more vulnerable to threats. During the Russia-Ukraine war, states were reportedly involved in sponsoring attacks, leading governments across the world to alter or create industry regulations and guidelines

Operational technology systems key challenges

The increasing connectivity of operational technology with external networks has further exacerbated the many OT security challenges enterprises face. Major OT security concerns include asset identification, misaligned IT and OT functions, OT threat and asset intelligence, patching legacy infrastructures, OT vulnerability management, and network segmentation.

Based on conversations with more than 100 market participants, Everest Group identified and prioritized the following key enterprise challenges.

Technology vendor snapshot for OT security


Source: Everest Group

Specialist providers can help enterprises navigate security challenges related to OT and ICS. Enterprises should seek technology solutions that allow them to quickly identify vulnerabilities and prioritize actions to reduce and eliminate potential risks.

A provider of choice should offer a single platform for visibility and threat monitoring while ensuring seamless integration with existing enterprise technology investments. The below capabilities illustrate what enterprises should look for when selecting an OT security provider.


By investing in tools that can provide vital intelligence and partnering with providers that offer compatible, industry-specific solutions and a skilled talent pool, enterprises can begin to thwart the growing risks to OT systems – before it is too late.

To discuss Operational Technology Systems and OT security, please reach out to [email protected] and [email protected].

Explore the top five demand themes in technology services – data and AI, cloud, experience, platforms, and security – driving growth for IT service providers in 2023 in our upcoming webinar, IT Service Provider 2023 Forecast: The Top 5 Themes for Growth and Wallet Share.

Five FinTech Trends to Watch for in the New Year | Blog

Since every past economic slowdown in this century has led to accelerated innovation and growth for FinTech firms, 2023 should be no different. We expect financial technology players to answer investors’ demands for increased profitability by tweaking business models and product innovation. To learn what FinTech trends will dominate in the coming year, read on.

FinTech firms have a history of responding to tough economic times by adapting and coming up with new business approaches. Looking back at the downturn in 2008, new FinTech trends emerged, including personal finance management (PFM), insurance aggregators and marketplaces, robo-advisors, crowdfunding, challenger/neo/digital-only banks, and cryptocurrencies. Following the same pattern of innovation, the pandemic-led slowdown has resulted in buy now pay later (BNPL), metaverse payments, decentralized finance (DeFi), and Web 3.0.

Let’s explore the following FinTech trends on the horizon for 2023:

Investors will push for profitability

Rising interest rates and slow economic growth have pushed FinTech investors to demand profitability improvements. As a result, FinTech firms that were built to drive growth at the expense of profitability in order to scale and acquire customers are now forced to adapt their business models and investments. We expect FinTechs to find alternative monetization models. One such alternative that FinTechs are exploring is selling/licensing their technology, such as core systems and machine learning models (that they built and trained), to other financial services firms. Accessing already built and trained machine learning models will enable financial services firms to adopt AI at speed and scale without additional time and expense.

FinTechs will target eliminating operational inefficiencies and data silos in core processes

The last decade has seen FinTechs eat into the front-office surplus of incumbent financial services firms. Now, they are increasingly moving into mid- and back-office processes to streamline these processes and data systems. We see FinTechs targeting the hard problems that incumbent financial services firms are slow to resolve because of legacy systems, data, and established processes. For example, eight of the top 10 retirement plan providers in the US are struggling with legacy mainframe-based technology and processes. Newer firms such as Retirable, Penelope, Smart, and Silvur have entered the market to provide better retirement experiences. Firms like Alto are bringing innovation from the Web 3.0 space to the retirement market by offering Individual Retirement Account (IRA) platforms. These IRA platforms simplify investing in alternative assets, such as start-ups and cryptocurrencies, by using tax-advantaged retirement funds. Beyond the retirement and pension segment, we see similar activity in the treasury, investment banking, group benefits, and specialty insurance markets.

FinTechs will move away from bundling/aggregation to financial ecosystem orchestration

Wallets and super apps are becoming the foundational blocks for enabling ambient banking, which focuses on meeting the business and/or customer at the moment of their need, even when that need crosses into other industries. Firms like Roostify and Ribbon want to orchestrate the end-to-end home-buying experience. Players such as Nomi Health and PayGround seek to simplify the end-to-end healthcare payments experience. We expect to see more vertically integrated FinTech firms at the intersection of financial services and industry experiences (e.g., car buying, small business invoicing and billing, supply chain, loyalty, and travel). Cloud and APIs are two technology components enabling the technical architecture necessary for embedded banking.

FinTechs will tap into the sustainability opportunity

Environmental, social, and governance (ESG) is a major demand theme that represents a relatively untapped market for FinTechs. We expect areas such as carbon credit marketplaces, ESG data and analytics solutions, and ESG customer transparency solutions to dominate most FinTech activity in 2023. Offering ESG reporting and compliance support for small- and mid-size financial services firms is a white space that should see significant growth in 2023.

Payments, wealth management, treasury, Web 3.0, and risk and compliance (RegTech) will be the fastest-growing FinTech segments in 2023

We expect a slowdown in lending, BNPL, and challenger banking because of profitability challenges, whereas segments such as cryptocurrency will see some slowdown due to the tightening of regulatory controls and the FTX collapse, which led to a crash in prices. Markets such as supply chain finance, crowdfunding, PFM, and robo-advisory are becoming saturated and remain highly competitive for new FinTech entrants. Wealth management is an attractive adjacent market for banks, lenders, Non-Banking Financial Companies (NBFCs), and insurance firms. These new entrants in the wealth management space are working with FinTech firms whose platforms are configurable and born in the cloud to assemble their technology stacks. Web 3.0 is an emerging space with a broad ambit across industries and the possibility to better manage the entire asset lifecycle. These assets could be physical assets, digital assets, media, identity, equity, bonds, or even virtual assets in the metaverse. Breaking down process complexity and reducing the costs of operations across payments, treasury, and RegTech areas will drive the growth of FinTech activity.

The FinTech outlook for 2023

In the upcoming year, we expect to see FinTech firms make deliberate moves to increase their profitability to meet investors’ demands. These actions will include firms selling/licensing their machine learning models, eliminating operational inefficiencies through Web 3.0 innovations, focusing on the intersection of financial services and industry experiences, and making sustainability a priority.

If you have questions about FinTech trends or would like to discuss developments in this space, reach out to Ronak Doshi.

Also, watch our webinar, Key Issues for 2023: Rise Above Economic Uncertainty and Succeed, as we explore major concerns, expectations, and key trends expected to amplify in 2023.

2023 Predictions for Tech and Services Demand | Blog

The technology and services market experienced strong growth this year. But we are seeing a slight deceleration at the end of the year as the prospect of a potentially deep recession grows. There is now a slowdown in consulting, particularly strategic consulting, and a slowdown in discretionary spending. Will that continue? Here is an overview of what I predict for the coming year.

Read more in my blog on Forbes

Building Web 3.0 Infrastructure – Moving to a Decentralized Architecture | Blog

Web 3.0 offers great promise for enterprises and service providers. Moving to a decentralized internet can provide users with many benefits. The foundational infrastructure needed to create this truly open technology is evolving. Read on to learn about the features and requirements for the web’s next evolution.

Web 3.0 (or Web3) – a new intelligent and decentralized internet that will be more responsive, interactive, and tailor-made to enhance customer experience – is poised to take off. By allowing for advanced user and device interaction, the next version of the internet will recognize and interpret content and contexts.

Web 3.0 will enable newer business models based on data ownership, where owned digital assets will be the key to unlocking access to the richer areas of the internet, moving away from the concepts of “trust” and “permission” that preside over the current version of the internet.

Seamless lending and borrowing that does not rely on creditworthiness, transaction-based user identification (goodbye Know Your Customer), intelligent marketing, and much more are on the verge of becoming a reality – all enabled by Web3.

Despite its many promising benefits, the transition to Web3 has been slower than initial estimates. But with the underlying technologies’ rising maturity, applications by startups and other enterprises are surging. Established firms also are rapidly exploring Web3 and innovating their existing business models.

Building blocks of Web3

As the industry experiments with Web3 to unlock benefits, the supporting infrastructure that enables the next internet version also is evolving. While creating and interacting with content on websites will continue to be done on existing web servers, the underlying network will facilitate owning and controlling created data.

Let’s take a deep dive into the building blocks of Web 3.0 infrastructure to better understand the following ideal core features required to create this truly open technology:

  • Smart contracts: These pieces of code stored on the underlying blockchain get executed when predefined conditions are met. Smart contracts play a critical role in authenticating certain outcomes without needing an intermediary. The automation can continue the governance-free workflow and, once authenticated, update the blockchain. These contracts are enforceable by code, maintaining transparency and autonomy (see the conceptual sketch after this list).
  • Tokens (fungible and non-fungible): Monetary incentives like these are vital for Web3’s growth. Tokens can be offered to anyone who contributes to the platform’s governance and improvement. Digital assets lie at the core of the decentralized internet vision and serve as proof of authenticity and ownership.
  • Artificial Intelligence (AI): Contextualizing interactions will be critical for hyper-personalization in Web3. AI can bring about increased accuracy, enhanced security, and greater scalability. To achieve these outcomes, Machine Learning (ML) will be used to filter and analyze content to gain meaningful insights, prioritize content, and create user-friendly interfaces in web apps. AI is already changing internet usage in the current Web 2.0 version and will become more dominant in Web3.
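As a conceptual illustration of the smart contract idea described above, the following Python sketch mimics an escrow-style contract that releases funds only when a condition is met, without an intermediary. Real smart contracts run on a blockchain and are typically written in languages such as Solidity; every name here is hypothetical.

```python
# Conceptual sketch only: this class mimics the logic of a simple escrow-style
# smart contract ("code that executes when conditions are met, without an
# intermediary"). The ledger list is a stand-in for on-chain state.
from dataclasses import dataclass, field


@dataclass
class EscrowContract:
    buyer: str
    seller: str
    amount: float
    delivery_confirmed: bool = False
    ledger: list = field(default_factory=list)  # stand-in for blockchain state

    def confirm_delivery(self, confirmed_by: str) -> None:
        # The condition check is encoded in the contract itself, not enforced
        # by a third party.
        if confirmed_by != self.buyer:
            raise PermissionError("Only the buyer can confirm delivery")
        self.delivery_confirmed = True
        self.execute()

    def execute(self) -> None:
        # Once the condition is met, funds are released and the (mock) ledger
        # records the outcome.
        if self.delivery_confirmed:
            self.ledger.append(
                {"event": "funds_released", "from": self.buyer,
                 "to": self.seller, "amount": self.amount}
            )


contract = EscrowContract(buyer="0xBuyer", seller="0xSeller", amount=1.5)
contract.confirm_delivery(confirmed_by="0xBuyer")
print(contract.ledger)  # [{'event': 'funds_released', ...}]
```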

Underlying infrastructure requirements

These foundational elements unlock an ecosystem of open, decentralized infrastructure that anyone can use to build Web3. While AI remains an essential building block, smart contracts and tokens are implemented using blockchain. The underlying infrastructure that can seamlessly run blockchain applications and AI/ML models has the following requirements:

  • Computing: Although the Web3 model can pool excess computational resources, running blockchain nodes will require access to high-performance computing infrastructure. Running AI/ML models to serve customized results requires quantum computing to become prevalent. Moreover, running 3D graphics over Web3 in real time for use cases such as the metaverse would require high-power Graphics Processing Units (GPUs)
  • Storage: AI in Web3 will feed on vast amounts of data to enable personalization. Storing such huge amounts of data requires larger disk sizes, and extremely fast disk read speeds are essential to leverage stored data. The underlying Web 3.0 infrastructure would need to support data from applications that leverage Augmented Reality/Virtual Reality (AR/VR) and other media/video use cases. Finally, Web3 will require decentralized storage solutions to preserve the technology’s sanctity
  • Network: High speed and low latency are foundational requirements for Web3. Higher network speed will determine the data transfer and blockchain update speed, enabling real-time processing. Maintaining low latency and high throughput also becomes critical for transactions and interconnections to prevent delays and enhance the user experience. However, the decentralized nature is a concern because it inherently increases latency, and the latency of leading blockchain networks still needs improvement

To summarize, all the infrastructure elements must be upgraded to create an efficient Web3 network. Different types of nodes (servers that run blockchain applications) will have varying requirements for computing, storage, and speed.

Edge centers thereby become a natural choice for this ecosystem as they will bring computing and storage closer to devices with varying configurations. These centers will enable cost-effective analysis and processing of Web3 application data.

As for reliability, the default peer-to-peer nature of blockchain networks enables Web3 to withstand physical hardware failures and network outages. Thus, the extra cost of redundancy would not give enterprises any sizeable operational benefits.

Evolving landscape

Both cloud providers and dedicated server providers have recognized this space’s potential and have made inroads with various startups such as Ankr and ChainSafe. The higher computation power of bare metal servers makes them a natural choice for multiple Web3 projects and protocols. Without redundancy and resiliency concerns, enterprises can easily grow their applications. Firms that are developing performance-sensitive applications will prefer dedicated hardware resources.

Cloud computing’s established reputation and market stronghold will create tough competition for physical server providers and may help cloud providers attract more enterprises. Cloud computing models can strengthen blockchain and increase Web3 security.

Even leading cloud vendors such as AWS and Google Cloud are launching node services. Their Web3 units will provide support for blockchain developers with tools and platforms that can enhance their blockchain journey. However, expanding presence in the Web3 space will require a lot of effort from these hyperscalers to dismiss centralization concerns of using public cloud computing services.

Since all service providers are encouraging firms to build their applications on their blockchain infrastructure offerings, those that can help enterprises build the next set of innovative applications will stand out as leaders.

With investments pouring in from venture capital, hedge funds, private equity firms, and other investors, Web3 holds great opportunities that strategy, technology, and consulting providers can cash in on by building Web 3.0 businesses for clients.

To discuss the evolving cloud and Web 3.0 infrastructure, contact Mukesh Ranjan and Kaustubh K.

Learn more in our webinar, Web 3.0 and Metaverse: Implications for Sourcing and Technology Leaders.

Evolution of the Web – The Rise of the Decentralization of the Internet | Blog

By democratizing data and giving power back to internet users, Web 3.0 offers many new computing possibilities and growth opportunities for enterprises and IT service providers. Learn more about the evolution of the web and how the decentralization of the internet is poised to shake up business and operating models.

The evolution of the Web – from Web 1.0 to Web 3.0

The first internet version (Web 1.0) was limited in scope and interactivity. Users could only read what was displayed or shown to them, and communication was one-way with website hosts primarily focused on delivering content and information.

With Web 2.0, the internet became all about interactivity and collaboration. Web 2.0’s emphasis on social connectivity and user-generated content completely replaced Web 1.0’s bland read-only web pages. Users now could produce and share information online and were no longer limited to being passive content consumers. Instead of read-only, the web evolved to be read-write, with companies creating platforms to share user-generated content and engage in user-to-user interactions.

Web 2.0, however, led to the web’s primarily advertising-driven business/monetization model. Google, YouTube, and Facebook realized along the way that storing and transmitting the state of billions of users and targeting them for advertising is hugely profitable. The benefits of the internet slowly started skewing heavily towards a handful of these platforms.

With Web 2.0’s innovation curve now in its middle to late phase and its leaders well-established, it is time to explore the new computing possibilities of the next wave. Web 3.0 is evolving as a reaction to the overwhelmingly centralized nature of the internet today and the concentration of power in the hands of a few platforms. Web 3.0, by construct, aims to remove these third-party intermediaries and restore power to users so that they can benefit from internet activities.

Let’s look at the changes in interaction, computation, and information as illustrated below:

Evolution of the web

What is Web 3.0?

Web 3.0 is vastly different from previous generations. At its core, Web 3.0 is not about speed, performance, or convenience. Instead, Web 3.0 is about power; in fact, many Web 3.0 applications are, at least today, slower and less convenient than existing products.

The primary focus of Web 3.0 is who controls the technologies and applications that make up the internet, and how the internet’s benefits can be distributed without handing most of the power to a handful of large companies as we do today. Web 3.0, by construct, aims to remove third-party intermediaries and restore power to users so they can have a more immersive internet experience.

Decentralization of the internet under Web 3.0 will lead to a fairer and more open internet as it will allow anyone with an internet connection to participate and contribute to the ecosystem and enjoy greater benefits from web activities. Web 3.0 will also improve trust and reduce conflicts as users will possess the private key that can access their data, thereby eliminating the need for conflict resolution in the digital world.
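As a minimal illustration of the point about users holding their own private keys, the sketch below uses the open-source Python cryptography package to show how a user-held key can sign data and how anyone can verify that signature without a central identity provider. It is a generic signing example under stated assumptions, not the mechanism of any particular Web 3.0 network.

```python
# Minimal illustration of key-based ownership: the user keeps the private key
# and signs data; anyone can verify the signature with the public key, so no
# central identity provider has to vouch for the user. Requires the third-party
# "cryptography" package (pip install cryptography).
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The user generates and keeps the private key; only the public key is shared.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

claim = b"profile-update: display_name=alice"
signature = private_key.sign(claim)

# Any party (a decentralized app, another user) can verify ownership of the
# claim with the public key alone.
try:
    public_key.verify(signature, claim)
    print("Signature valid: the key holder authorized this data")
except InvalidSignature:
    print("Signature invalid")
```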

We see Web 3.0 becoming a meaningful extension of Web 2.0, with the change being an evolution rather than a revolution with pockets of decentralized applications coexisting with the centralized web of today. The rollout will be gradual with a Web 2.5 stage.

Where is the internet headed?

The core concept of the Web 3.0 movement is decentralized ownership, with blockchain technology providing the underlying architecture for the internet’s next generation. Here are three key trends we see that will shape the future:

  • Decentralized platforms will allow for sustainable ecosystems of third-party applications

The products and services that make up the internet today tend to be produced and controlled by individual corporations. Web 3.0 provides the opportunity to democratize the online experience and ensure that no central entity will take a substantial chunk of the revenue; instead, creators will be able to directly interact with their users, strengthening relationships for both creators and users.

  • Money will become a native feature of the internet

While the past internet was simply a portal to the traditional offline financial system, any user with an internet connection and a phone can now send or receive payments using current software capabilities. Digital payments will unlock new business models that were previously impractical, radically lower the costs of cross-border remittances and other transactions, enable new use cases like machine payments, and expand availability to massive new markets.

  • Users will have more control over their digital identities and data

Web 3.0 is laying the groundwork for personal control of online identities. Today, most online identities belong to big, centralized companies. Web 3.0 will give users control over their data by allowing them to use their identity rather than one provided by a third party, limiting the potential for identity providers such as Facebook, Instagram, and Gmail to collect user data and sell that data to generate money.

Web 3.0 challenges

Business leaders need to realize that Web 3.0 is coming and pay attention to its latest developments. But creating an ecosystem that can end the big tech companies’ monopolies and reimagine how we interact with the internet is a complex undertaking. Web 3.0 still faces significant hurdles in usability, performance, and business and monetization models, as illustrated below:


A new internet era begins

Web 3.0 represents the natural progression of the internet that offers many exciting opportunities for technology and IT service providers. To seize its growth potential, providers need to focus on creating thought capital, hiring techno-creative thinkers, and building a go-to-market strategy to target a broader range of Web 3.0 themes to prepare for the coming transition.

To read more about Web 3.0 leading providers, see Web 3.0 Trailblazers – The Top Start-ups Building the Next Generation of the Internet. To discuss topics related to the evolution of the web, please contact Parul Trivedi, Sandeep Pattathil, and Nikhil Singh.

Learn more in our webinar, Web 3.0 and Metaverse: Implications for Sourcing and Technology Leaders.

Alert on Engineering and IT Hiring Dilemmas for 2023 | Blog

The most dangerous words regarding investing are, “This time, it will be different.” The same words often occur when planning for a recession. But, quite possibly, this time, it will be different. Why do we at Everest Group believe it may be different for the looming recession we now face? Because there is a confluence of trends that are different from past recessions.

Read more in my blog on Forbes

Adobe Acquires Figma – Showcasing Its Intent to Be at the Forefront of the Design and Prototyping Economy

In what can be termed the biggest deal in the emerging design and prototyping market, Adobe announced on September 15 that it will buy Figma for $20 billion. Adobe, which had a market capitalization of $144.67 billion as of that date, will pay half in cash and half in stock. Read on for our analysis of what this means for these players and the segment.

Did we see this coming?

With the adoption of design and prototyping tools in software development picking up immensely post-pandemic, we anticipated mergers and acquisitions in this space. But seeing a design pioneer acquire the game changer in the design space was unexpected.

Adobe was one of the first movers to identify the demand in the design space and build products such as Photoshop, Illustrator, and XD for designers, helping it gain significant market share. The pandemic changed the needs, demand patterns, and ways of working of design stakeholders. Adobe could not keep up with this change, which led to a slowdown in its growth. Figma – a relatively late entrant in the design space – came in as a game changer and emerged as a major competitor for Adobe.

This acquisition reminds us of Meta’s acquisition of WhatsApp in 2014, which enabled Meta to cement its unshakable dominant presence in the instant messaging space by acquiring its biggest competitor.

Figma’s growth story

Figma’s rise to the top has been nothing short of marvelous. This rocket ship launched in 2012 and had its public release in September 2016. Fast forward six years, and Figma will now join forces with another major player in this space – Adobe.

Analyzing the details reveals the following three strengths that led to Figma’s rise:

  • Leadership – Figma’s leadership understood design and designers’ associated pain points. They had a vision for making design accessible to all.
  • Addressing evolving customer needs – To make design collaborative, Figma created all the functionalities needed to build an interface design and packed it into a browser-based user interface. Designers now could be based anywhere and work on the same project.
  • Freemium model – Figma’s core functionalities remained free to use, attracting thousands of early adopters to the platform. Good feedback coupled with Figma’s user-centricity ensured a sharp rise in users.

Why we think Figma is the missing piece of the puzzle for Adobe

This acquisition offers several areas for synergy that can propel the combined entity’s growth in the design space.


What’s in this for Adobe

The deal will provide the following benefits for Adobe:

  • Access to Figma’s customer base: Figma has seen immense growth in its user base in the last two years, with roughly 4 million users at present, including tech majors such as GitHub, Dropbox, and Twitter. Adobe can leverage these connections to augment the growth of its other offerings
  • Expansion of designer and developer community: Adobe has built a very strong designer community over the years to strengthen the design knowledge base to facilitate peer learning. The addition of Figma’s designer and developer community will further boost the collaborative mindset for design advancements
  • Access to innovative pool of talent: Figma as a brand has revolutionized the ways of working for all stakeholders in the design process. Adobe can collaborate with the innovative minds behind this revolution to take its design portfolio to the next level
  • Coverage of end-to-end functionality in design lifecycle: Figma’s special focus on the brainstorming and whiteboarding phase of the design lifecycle with the launch of FigJam will add to Adobe’s capabilities to address product ideation activities. Figma’s offerings will augment Adobe’s design capabilities as well

Implications for the design and prototyping market

Competing design tool vendors – This acquisition puts Adobe and Figma at the front of the pack. They can now boast functionalities spanning across the entire value chain along with a significant user base and customer list. What this means for competitors is that they will have to rely on strong differentiation propositions to create a niche for themselves. We anticipate that the competitors will focus on small and medium enterprises to drive differentiation.

Enterprises – In their quest to standardize and scale design processes across all product teams, enterprises tend to opt for tools that offer end-to-end design lifecycle functionality. This acquisition seems to be a step in the right direction, as enterprises will now have a one-stop design solution.

Service providers – This move will unquestionably facilitate creating the largest base of designers operating in the Adobe plus Figma ecosystem. Demand for these skill sets will only go up from here. We anticipate service providers will ramp up their designer arsenal inorganically and step up on their partnership with Adobe plus Figma to offer design services for enterprises.

Things to look out for

While Dylan Field, CEO and co-founder of Figma, has stressed that Adobe is deeply committed to keeping Figma operating autonomously, it will be interesting to see how the integration shapes up. Adobe can take lessons from Microsoft’s acquisition of GitHub, which shows that large technology companies can successfully acquire smaller players and still retain the core value proposition.

In line with this, we will also watch for potential changes in the following areas:

  • Product strategies: Adobe has similar offerings, such as Adobe XD, that overlap with Figma in many aspects. It will need to make alterations and updates to its value proposition and go-to-market approach to avoid cannibalization.
  • Commercial model: The freemium model is one of the prime contributors to Figma’s large user base, but the Adobe suite has limited freemium offerings. We will look for any commercial model changes that may result from having the two under the same umbrella.

As the market evolves, competitors will need to make strong moves to create spaces for themselves. Adobe’s acquisition of Figma may set the stage for additional mergers and acquisitions to pay attention to.

Stay tuned for our updates on this fast-growing space of design and prototyping tools. To share your thoughts, please contact Swati Ganesh, [email protected], and Ankit Nath, [email protected].

Request a briefing with our experts to discuss the 2022 key issues presented in our 12 days of insights.
