Category: Automation/RPA/AI

Digital Transformations: 5 Emerging Trends in the Intelligent Process Automation Market

The pandemic’s effects on the digital landscape are long-lasting. Businesses are increasingly relying on intelligent process automation (IPA) to promote growth and keep up with competitors. Read on to learn more about five growing trends in the IPA market.

In a world becoming increasingly reliant on technology, financial services organizations are digitizing and automating more processes to keep up with the competition. The intelligent process automation market, growing by about 20% across all fields, is now becoming ubiquitous.

IPA is the automation of business processes using a combination of next-generation technologies, such as robotic process automation (RPA) and cognitive or Artificial Intelligence (AI)-based automation, including Intelligent Document Processing (IDP) and conversational AI. Solution providers are offering solutions across RPA, IDP, and workflow/orchestration, as well as crafting innovative solutions such as digital Centers of Excellence (CoE) and investing more in as-a-Service offerings.

In our recent Intelligent Process Automation (IPA) – Solution Provider Landscape with PEAK Matrix® Assessment 2022 report, our analysts ranked IPA technology vendors and examined the market for IPA solutions. Based on the research, IPA market growth is expected to accelerate to around 25% over the next three years.

Five intelligent process automation market trends enterprises should know

The question of how to become faster, more efficient, and more resilient is the focus for just about any organization undergoing digital transformation. Very often, the answer to this question is, at least in part, intelligent process automation. Looking ahead, we see five emerging IPA trends:

  1. IPA will get smarter

A greater proportion of cognitive elements is finding its way into the intelligent process automation market. About 60% of new automation projects involve more advanced cognitive tools such as IDP, conversational AI, and anomaly detection. As the maturity of AI-based solutions increases, cognitive automation will be in greater demand. All-round adoption of IPA will be fueled by providers entering new geographies and organizations starting intelligent automation (IA) initiatives.

  2. IPA will be more scalable

Although many organizations are trying to adopt intelligent process automation, the real question is whether it can be scaled up, that is, extended across the organization. To help enterprises scale automation, solution providers are investing in expanding their partner ecosystems, strengthening technology capabilities, and enhancing their services portfolios.

Providers are also expected to help enterprises scale up through more effective change management and CoE set-up strategies. Aided by the prevalence of process intelligence solutions to form robust pipelines and orchestration tools to facilitate holistic automation, enterprises are better equipped now to move away from siloed applications of IA to scaled-up automation implementations.

  3. Citizen development will grow

Many organizations are experimenting with what they can do with citizen development, especially given the current talent shortage. Citizen-led development holds the power to disrupt the current state of building automation and to address the issue of talent availability.

Solution and technology providers are also expected to invest substantially in developing the low-code/no-code capabilities of their platforms to enable business users with limited technical exposure to build automation solutions on their own. A few solution providers are implementing citizen development programs in their own organizations and are planning to leverage the learnings to develop effective governance programs for enterprises.

  4. IPA service providers will bring packaged IPA solutions to market

Packaged solutions are gaining traction in the IPA market due to their ease of implementation and quick Return on Investment (RoI). Solutions for Finance and Accounting (F&A) are the most prevalent in the market. These solutions still need training on particular data sets to make them functional for a particular process, but they speed up implementation. Providers are also expected to take conscious steps toward promoting sustainable AI by developing solutions that comply with environmental, social, and governance (ESG) parameters, and they are investing in AI solutions that are transparent about how they work and use data.

  5. IPA service providers will pre-build connectors to legacy and other systems

A host of technologies, including RPA, conversational AI, process mining, and process orchestration, make up the IA ecosystem. Very often, these IA solutions need to talk to various other systems. Many IPA service providers are driving innovation and crafting new solutions to keep pace with the fast-moving IPA market and create a more holistic integration process. One such method is offering enabling capabilities like pre-built connectors for a faster and less complex implementation.
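
To make the idea concrete, below is a minimal, hypothetical Python sketch of what such a connector abstraction might look like. The class and method names are illustrative assumptions, not any vendor’s actual API.

```python
from abc import ABC, abstractmethod


class Connector(ABC):
    """Common interface a provider might expose across pre-built connectors."""

    @abstractmethod
    def fetch_records(self, query: str) -> list[dict]:
        ...

    @abstractmethod
    def push_record(self, record: dict) -> None:
        ...


class LegacyMainframeConnector(Connector):
    """Hypothetical pre-built connector wrapping a legacy system."""

    def __init__(self, host: str, credentials: dict):
        self.host = host
        self.credentials = credentials  # auth handled inside the connector

    def fetch_records(self, query: str) -> list[dict]:
        # A vendor-supplied connector hides screen-scraping or file-transfer
        # details behind a simple call; stubbed here for illustration.
        return [{"id": 1, "status": "OPEN", "query": query}]

    def push_record(self, record: dict) -> None:
        print(f"Writing record {record['id']} to {self.host}")


def sync_open_items(source: Connector, target: Connector) -> None:
    """An automation talks to any system through one interface, which is
    what makes implementation faster and less complex."""
    for record in source.fetch_records("status = OPEN"):
        target.push_record(record)


legacy = LegacyMainframeConnector("legacy-host", {"user": "bot"})
sync_open_items(source=legacy, target=legacy)  # demo: round-trips one record
```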

If you would like to learn more or discuss the intelligent process automation market and IPA trends, reach out to [email protected].

Learn how the healthcare industry is utilizing intelligent automation, digitalization, and telehealth as fundamental driving forces to transform and evolve in the webinar, How Intelligent Document Processing Is Transforming the Healthcare Industry.

Composable Commerce: For Composing the Best-of-Breed Customer Experience

From monolithic to MACH architecture, the next evolution in digital experience is here – composable commerce. Similar to building with Lego blocks, this modular approach allows enterprises to create unique models by selecting “best-of-breed” digital commerce components. Learn how composable commerce is optimizing all aspects of the online shopping experience and which tech providers are pioneering solutions in this rapidly rising area.

Digital commerce growth leads to composable commerce

The COVID-19 pandemic has been a catalyst in accelerating digital platform adoption among enterprises, and modern consumers’ purchasing habits have also changed dramatically due to frequent lockdowns and the increasing convenience of online purchasing.

According to a United Nations Conference on Trade and Development (UNCTAD) report, the average share of internet users who made purchases online increased from 53% before the pandemic to 60% across 66 countries following its onset in 2020/21.

The shopping experience has evolved from brick-and-mortar stores to online and is now moving to unified commerce – an amalgamation of offline and online channels with an ever-evolving myriad of customer touch points such as social commerce, video commerce, and now the metaverse.

Emerging business models such as Direct-to-Consumer (D2C), new and interactive channels, and advancements in technology, especially Artificial Intelligence (AI) and Augmented Reality/Virtual Reality (AR/VR), have fueled digital commerce’s growth. In response, the underlying digital commerce architecture principles have also morphed to meet the pace of change in digital-native customer expectations.

What is composable commerce, and how will it unlock business value?

Until a few years ago, monolithic architecture-based platforms were the de facto choice for any digital commerce storefront. These big, chunky solutions provided standard out-of-the-box features for all customers in a “one-size-fits-all” approach. Enterprises had to be content with these standard features and spend significant budgets and time on customizations to meet their business requirements.

However, major issues with scalability, complexity, longer time to market, and budget made this implementation approach less useful for modern commerce businesses where staying abreast of technological advancements, customer centricity, and nimbleness are of utmost priority.

The next major evolution in digital commerce architecture was MACH (Microservices-based, API-first, Cloud-native, and Headless). This approach overcomes the shortcomings of monolithic architecture and lets enterprises adapt responsively and dynamically to customer expectations.

Exhibit 1: Characteristics of MACH Architecture

Taking a step forward from MACH architecture, the era of composable commerce has dawned. Composable commerce – a modular approach to implementing digital commerce – uses interchangeable building blocks, leveraging the MACH architecture framework. It offers enterprises the choice to select “best-of-breed” digital commerce components such as Product Information Management (PIM), Customer Relationship Management (CRM), and pricing, much like Lego blocks from which users can create unique models. These composable components are specific solutions provided by third-party vendors.

While the microservices approach breaks digital commerce modules down into individual building blocks, composable commerce links these microservices together to realize specific business value. Composable commerce utilizes Packaged Business Capabilities (PBCs) – fully functional, independent components, each serving a defined business capability – as the building blocks for “composing” the unique platform. Each PBC can consist of several microservices, and each can be individually replaced or modified without impacting the entire platform.
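
A minimal Python sketch of this composition idea follows. The PBC interfaces and vendor names are hypothetical, intended only to show how individually replaceable components “compose” into a storefront.

```python
from typing import Protocol


class ProductInfoPBC(Protocol):
    """Packaged Business Capability for product information (PIM)."""

    def get_product(self, sku: str) -> dict: ...


class PricingPBC(Protocol):
    """Packaged Business Capability for pricing."""

    def get_price(self, sku: str, currency: str) -> float: ...


class VendorAPim:
    """Stand-in for a third-party PIM vendor's PBC."""

    def get_product(self, sku: str) -> dict:
        return {"sku": sku, "name": "Running Shoe"}


class VendorBPricing:
    """Stand-in for a different vendor's pricing PBC."""

    def get_price(self, sku: str, currency: str) -> float:
        return 89.99


class Storefront:
    """The 'composed' platform: each PBC can be swapped independently."""

    def __init__(self, pim: ProductInfoPBC, pricing: PricingPBC):
        self.pim = pim
        self.pricing = pricing

    def product_page(self, sku: str, currency: str = "USD") -> dict:
        product = self.pim.get_product(sku)
        product["price"] = self.pricing.get_price(sku, currency)
        return product


# Swapping VendorBPricing for another vendor requires no change to Storefront.
shop = Storefront(pim=VendorAPim(), pricing=VendorBPricing())
print(shop.product_page("SKU-123"))
```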

Exhibit 2: Central Tenets of Composable Commerce

Thus, composable commerce has shifted the focus toward business-centricity. Composable commerce is built around an organization’s unique operating model, strategic priorities, and customer focus. Businesses can select the functionalities essential to their requirements and “compose” them into a custom application for their digital commerce platform. This allows enterprises to focus on the PBCs relevant to their business, which are sometimes unavailable among the “out-of-the-box” features of traditional, bulky monolithic platforms.

Below are some benefits of composable commerce that enterprises can realize.

Exhibit 3: Benefits of Composable Commerce

The number of PBC vendors providing functionalities such as loyalty, promotions, search, reviews and ratings, and analytics is rapidly growing. Enterprises have the flexibility to choose the best vendor for their platform, considering their individual business and technical requirements. They can manage multiple brands and varying business models on the same composable commerce stack to stay nimble in response to market changes. Complex businesses can be launched and managed more efficiently using composable commerce.

Technology providers are already pioneering composable commerce solutions

Technology providers have started working extensively on solutions to help enterprises get on board the composable commerce bandwagon. Below are some examples:

  • Spryker has launched the cloud-native “App Composition Platform,” which gives enterprises seamless access to third-party services and best-of-breed digital commerce vendors
  • Virto’s Atomic Architecture™ allows customers to get a composable, flexible, manageable, customized, and easily updated digital commerce architecture that is fully adaptable to market challenges
  • Elastic Path’s Composable Commerce Hub is an open exchange of composable commerce solutions for digitally driven brands that want to seamlessly create complete digital commerce experiences for their business
  • Fabric provides a configurable and composable commerce solution to rapidly deploy and scale unique brand experiences, product offerings, and services
  • Infosys Equinox is powering Nu Skin to compose unique and delightful digital journeys across ever-evolving channels
  • Avensia’s composable commerce solution, Avensia Excite, is built on commercetools and uses Contentful for CMS, Inriver for PIM, and Apptus eSales for search

Composable commerce outlook

While MACH architecture set a strong foundation for modern digital commerce architecture, composable commerce is the next logical iteration in the digital commerce business model to meet rapidly changing customer expectations and the growing number of touch points. Composable commerce adoption will continue to rise as enterprises move away from the traditional approach of implementing digital commerce solutions. Niche third-party vendors are emerging faster than ever, providing ample choice for enterprises to craft and “compose” their digital commerce stack.

If you have questions about digital commerce, please reach out to Nisha Krishan ([email protected]) and Aakash Verma ([email protected]).

Stay tuned for insights on the digital commerce platform market from our recently launched inaugural Digital Commerce Platform PEAK Matrix® Assessment.

Selecting the Right Low-code Platform: An Enterprise Guide to Investment Decision Making

Enterprise adoption of low-code platforms has been invigorated in recent years by their potential to drive digital transformation. This fast-rising platform solution promises to democratize programming amid today’s talent shortage and help companies develop applications and enhance functionalities faster. While the opportunities are clear, the path to successful adoption is far less clear. Learn the 4Cs approach best-in-class enterprises use for selecting and adopting right-fit low-code platforms in this blog.

As many as 60% of new application development engagements consider low-code platforms, according to Everest Group’s recent market study. Driven by the pandemic, the sudden surge in demand for digital transformation accelerated low-code annual market growth to about 25%. Considering its potential, low code is appropriately being called the “Next Cloud.”

Investor interest has also accelerated, further driving R&D spend for new product development. Funding in 2022 to companies featuring low code in their profiles already amounts to $560 million across 40 rounds.

Platform providers are responding to these elevated expectations with equal fervor by building platforms with deep domain-specific expertise, while others are providing process-specific solutions for enterprises’ customization requirements.

While this market momentum has resulted in a proliferation of low-code platforms to choose from, it has also led to confusion and inefficiencies for enterprises. As more and more enterprises explore the potential of these platforms, IT leaders are faced with numerous questions and concerns, such as:

“How do I select the platform that can address my current and future requirements?”

“Which platform will work best in my specific enterprise IT landscape?”

“How can we optimize the investment in this technology?”

“How do I compare the pricing structures of different low-code platforms?”

“How do we ensure governance and security of the IT estate with these new tech assets?”

Adoption journey and evaluation parameters for low-code platforms

In addition to the high-priority use cases that initiate the adoption, enterprises should consider the platform’s scalability potential, talent availability for support and enhancement, and integration with the broader IT landscape to make the right selection.

Additionally, low-code platforms are intended to address the requirements of the IT function as well as business stakeholders. Considering the drivers, expectations, and requirements of both when making the selection is essential. A collaborative decision-making set-up with the central IT team and key Line-of-Business (LoB) leaders is critical for a successful platform selection. Let’s explore the 4Cs to low code success.

4Cs to low code success

The key steps to ensure successful low-code platform selection and adoption are:

  • Contemplate: Initiate platform adoption by a set of high-priority use cases but plan for scalability at the enterprise level during platform selection
  • Collaborate: Bring together the central IT group to lead the selection and adoption effort and meaningfully involve the LoB stakeholders
  • Compare: Start with business and tech drivers, expectations, and requirements from both IT and business to prioritize and rank platforms and select the best-fit platform
  • Customize: Make small and incremental enhancements post-adoption to broaden the platform’s scope without disrupting daily operations

This approach can provide a roadmap for enterprises with distinct outcomes. We have witnessed enterprises either adopting the best-fit approach resulting in a platform portfolio or leveraging a single platform as a foundation for an enterprise-grade innovation engine.

For instance, the Chief Technology Officer (CTO) of a leading bank in the US invested in establishing a low code Center of Excellence (CoE) that uses different platforms for process automation, IT Service Management (ITSM), and enabling point solutions for business users.

On the other hand, a large US commercial insurer built its entire end-to-end multi-country app on a single low-code platform. This comprehensive, business-critical application managing claims, billing, and collection is accessible by all underwriters and service personnel.

Next, we explore how to best compare platforms based on their offerings and capabilities. The tables below illustrate the top five business and technology-oriented parameters to consider when evaluating platforms, along with their relevance and enterprise expectations.

Technology parameters for low-code platform selection

Factors associated with a platform’s technical robustness are of key importance to IT decision-makers. Integration and UI/UX capabilities are at the top of enterprises’ technology priorities when comparing multiple platforms.

For instance, Appian ships with 150-plus Out-of-the-Box (OOTB) connectors. Appian SAIL, a patented UI architecture, uses declarative UI definitions to generate dynamic, interactive, multi-platform user experiences. It also makes applications more secure, easier to change, future-proofed, and native on the latest devices.


Business parameters for low-code platform selection

Assessing these parameters is important to understand whether low code can be sustained and scaled long-term and whether it addresses business users’ expectations. Pricing and security constructs are at the top of the list for businesses looking to adopt a low-code platform.


Let’s consider Salesforce as a case in point. Salesforce has security built into every layer of the platform. The infrastructure layer comes with replication, backup, and disaster recovery planning. Network services have encryption in transit and advanced threat detection. The application services layer implements identity, authentication, and user permissions. In addition, frequent product updates that align its offering with changing market demands make Salesforce one of the go-to platforms for enterprises’ CRM needs.

Low-code platform outlook

The plethora of options makes it difficult for enterprises to zero in on a particular low-code platform for their investments. Enterprises should also leverage their network of service partners for guidance in this decision-making process.

Talent availability for implementation and enhancement support is critical to keep in mind during the platform selection. For the same reason, multiple system integrators are now taking the route of inorganic growth to bolster their low-code capabilities.

This is the time to hop on the low-code bandwagon and establish low code as the basis for enterprise digital transformation.

Everest Group’s Low-Code Application Development Platforms PEAK Matrix® Assessment 2022 provides an overview of the top 14 platforms based on vision, strategy, and market impact.

To share your thoughts and discuss our research related to low-code platforms, please reach out to [email protected] and [email protected].

Metaverse and ScienceTech: Will These Virtual and Real-World Markets Compete?

Metaverse is the buzz these days. While Metaverse provides an embodied virtual-reality experience, ScienceTech fuses technology and science to solve real problems of humanity. Who will win in the battle for relevance, investments, and talent? To learn more about these virtual and real-world market opportunities and what actions technology and service providers should take, read on.

While they once seemed far out, the Metaverse and ScienceTech are here now. As part of our continued Metaverse research, let’s explore these emerging technologies and whether they will collide or coexist.

ScienceTech brings together technology and science to improve the real world by enhancing living standards and improving equality. It combines technology with physical sciences, life sciences, earth sciences, anthropology, geography, history, mathematics, systems, logic, etc.

Meanwhile, the Metaverse is an emerging concept that uses next-generation advanced technologies such as Augmented Reality (AR)/Virtual Reality (VR), digital assets, spatial computing, and commerce to build an immersive, seamless experience.

Over the past few months, Metaverse has become a hot topic not only in technology circles but also among enterprises. As providers pump billions of dollars into creating the landscape and value realization becomes clearer, Metaverse will grab increasing attention from enterprises, providers, and market influencers.

Its serious market potential can be seen in industry participants collaborating to define standards for interoperability across Metaverse platforms and ecosystems. Everest Group is witnessing great interest in our Metaverse research, and our recent webinar, Web 3.0 and the Metaverse: Implications for Sourcing and Technology Leaders, generated unprecedented client inquiries.

ScienceTech has been around for many years but has been mostly experimental with limited revenue and growth. Technology and service providers have been reluctant to meaningfully scale this business because of its complexity, significant investment requirements, and high risk of failure.

However, the pandemic has changed priorities for enterprises and individuals, making ScienceTech more critical to solving real-life problems. The cloud, an abundance of data, better manufacturing processes, and a plethora of affordable technologies have lowered the cost of enabling and building these offerings.

Competition between Metaverse and ScienceTech

Below are some of the areas where these two emerging fields could conflict:

  • Relevance

Many cynics have decried Metaverse as one more BigTech fantasy trying to take people further away from reality. This cynicism has gained pace in light of the disruptive global pandemic: a make-believe happy world driven by a heavy dose of virtual reality takes humanity’s focus away from the pressing needs of our time.

While not well defined, ScienceTech is generally perceived as distinct from pure-play technology. Some of its ideas have been around for many years, such as device miniaturization, autonomous systems, regenerative medicine, and biosimulation. The core defining principle of ScienceTech is that themes hypothesized, researched, and validated by science are built into solutions through technology. The relevance of ScienceTech may appear far more pressing to many than the make-believe virtual world of Metaverse.

  • Investment

The interesting competition will be for investments. Last year, venture capitalists invested over US$30 billion in crypto-related start-ups. As the Web 3.0 and Metaverse tech landscape becomes more fragmented and crowded, investors may not want to put their money into sub-scale businesses. This can help the ScienceTech space, which is not yet well understood by investors but offers a compelling value proposition.

  • Talent

Technology talent is scarce, and ScienceTech talent is even scarcer. Although Metaverse vendors will continue to attract talent because they can pay top dollar, ScienceTech vendors can offer niche talent more purpose and exciting technologies. In the internet heyday, people bemoaned that bright minds were busy making people click links instead of solving world problems. Metaverse may face that same challenge, and ScienceTech can benefit from this perception. GenZ job seekers want to work in areas where they can make an impact and change the world, and ScienceTech can provide that forum.

What should technology and service providers do?

Both Metaverse providers and ScienceTech companies will thrive, sharing quite a few technology building blocks, namely edge, cloud, Artificial Intelligence (AI), and data. These multiple technologies and trends need not battle. Moreover, the two markets serve different purposes, so Metaverse and ScienceTech will coexist. Technology and service providers will need to invest in both segments and capture and shape the market demand.

Providers need to prioritize where to focus efforts, investments, partnerships, and leadership commitment. A different people strategy will be needed because skilling technology resources on science and vice-versa will not work. They will need to select specific focus areas and hire people from multiple science domains. The R&D group will have to change its constituents and focus on science-aligned technology rather than just Information and Communications Technology.

To be successful, providers also will have to find anchor clients to underwrite some offerings, collaborate to gain real-life industry knowledge, and engage with broader ecosystems such as academia, government, and industry bodies to build market-enabling forums.

To learn more about our Metaverse research and discuss your experiences in these emerging areas, contact [email protected] or contact us.

Visit our upcoming webinars and blogs to learn more about upcoming technologies and trends.

Low-code Market Realities: Understanding Common Myths to Avoid Costly Mistakes

Despite their growth, low-code platforms are still surrounded by much confusion. Many enterprises incorrectly believe that real developers don’t need low code, anyone can do it, and it’s only for simple problems. To debunk three common myths in the low-code market, read on.  

With their increasing importance, low-code platforms are also subject to several myths and misunderstandings. As with every evolving technology, enterprises have many questions about optimally using these platforms.

Based on our conversations with multiple enterprises confirming the lack of understanding about the low-code market, we tackle the common misperceptions below:

Myth #1: Low-code platforms are meant for use by citizen developers

The term low code generally evokes the impression of an HR manager who, tired of following up with the IT team multiple times, decides to create a leave approval workflow application. While this impression is not incorrect, professional developers and enterprise IT teams are key stakeholders in the low-code ecosystem as well.

Professional developers increasingly use low-code platforms to improve their efficiency. Some of these platforms can provide code quality alerts and Artificial Intelligence (AI)-powered recommendations, not to mention custom solutions that require minimal tuning.

The built-in DevOps capabilities in these platforms also encourage users to shift away from the commonly used waterfall model. For example, supply chain management software provider Nimbi significantly reduced its development team from 40 to 24 when it switched to OutSystems from traditional platforms.

We strongly believe central IT teams have a meaningful role in the ecosystem in providing effective oversight and governance, in addition to strategizing the use of the best low-code platforms at the enterprise level. In the absence of centralized governance, low-code platforms may proliferate across the organization, aggravating shadow IT issues and increasing spend.

Myth #2: Low-code development does not require technical skills

As much as we may want to believe otherwise, low-code platforms are not a panacea for the ongoing talent crisis. Misleading promises by certain technology vendors have created a common impression that any user can develop any application using low-code platforms. However, low-code development does not imply zero technical skill requirements.

Most low-code platforms enable the extension of their capabilities through traditional programming languages like Java and C#. Off-the-shelf solutions have their limitations, and most applications need custom logic at some point. Typical job descriptions for low-code developer profiles outline technical qualifications like JavaScript, HTML5, and CSS3, alongside Continuous Integration (CI) and Continuous Delivery (CD) pipeline tools like Jenkins.

Thus, it is unrealistic to expect an army of business users to step in and take over all application development-related needs from the IT organization. Low-code development remains a role with a highly demanding skillset across various technologies.

Myth #3: Low code cannot be used for enterprise-grade development

Many enterprise leaders and service providers believe that low-code platforms are only suitable for small-scale department-level needs. However, our conversations indicate that low-code platforms are being rapidly adopted for critical applications used by millions of users. Here are some examples of how low code is solving complex IT problems around the world:

  • A large US commercial insurer has built its entire end-to-end multi-country comprehensive, business-critical application that manages claims, billing, and collection on Appian
  • One of the largest consumer goods companies in the world built a huge global application for financial management on Microsoft Power Platform

As the adoption of low-code platforms gathers pace, a lot of myths and misunderstandings about low code versus traditional development need to be cleared up. Technology providers and service partners play a key role in helping their clients navigate the abundant options to orchestrate a carefully crafted low-code strategy and select the best low-code platforms.

At Everest Group, we are closely tracking the low-code market. For more insights, see our compendium report on various platform providers, the state of the low-code market report shedding light on the enterprise adoption journey, and a PEAK Matrix assessment comparing 14 leading players in the low-code market.

To share your thoughts and discuss our low-code market research, please reach out to [email protected], [email protected] or [email protected].

You can also attend our webinar, Building Successful Digital Product Engineering Businesses, to explore how enterprises are investing in next-generation technologies and talent and the most relevant skillsets for digital product engineering initiatives.

Is AI Emotion Detection Ready for Prime Time?

Artificial Intelligence (AI) solutions that aim to recognize human emotions can provide useful insights for hiring, marketing, and other purposes. But their use also raises serious questions about accuracy, bias, and privacy. To learn about three common barriers that need to be overcome for AI emotion detection to become more mainstream, read on.

By using machine learning to mimic human intelligence, AI can execute everything from menial, repetitive tasks to those requiring more “human” cognition. Now, AI solutions are popping up that go as far as interpreting human emotion. In solutions where AI and human emotion intersect, does the technology help, or deliver more trouble than value?

While we are starting to see emotion detection using AI in various technologies, several barriers to adoption exist, and serious questions arise as to whether the technology is ready to be widely used. AI that aims to interpret or replace human interactions can be flawed because of underlying assumptions made when the machine was trained. Another concern is the broader question of why anyone would want to have this technology used on them. Is the relationship equal between the organization using the technology and the individual on whom the technology is being used? Concerns like these need to be addressed for this type of AI to take off.

Let’s explore three common barriers to emotion detection using AI:

Barrier #1: Is AI emotion detection ethical for all involved?

Newly launched AI-based solutions that track human sentiment for sales, human resources, instruction, and telehealth can help provide useful insights by understanding people’s reactions during virtual conversations.

During these on-screen conversations, the AI tracks the sentiment of the person, or people, taking the information in, including their reactions and feedback. The person being tracked could be a prospective customer, employee, student, or patient, where it’s beneficial for the person leading the virtual interaction to better understand how the individual receiving the information is feeling and what they could be thinking.

This kind of AI could be viewed as ethical in human resources, telehealth, or educational use cases because tracking reactions such as fear, concern, or boredom could benefit both the person delivering the information and those receiving it. In this situation, the software could help deliver a better outcome for the person being assessed. However, in few other use cases is it advantageous for everyone involved when one person gains a “competitive advantage” over another in a virtual conversation by using AI technology.

Barrier #2: Can discomfort and feelings of intrusion with AI emotion detection be overcome?

This brings us to the next barrier – why should anyone agree to have this software turned on during a virtual conversation? If someone senses an imbalance of control during a virtual conversation, the AI software comes across as incredibly intrusive. If people need to agree to be judged by the AI software in some form or another, many could decline simply because of its invasive nature.

People are becoming more comfortable with technology and what it can do for us; however, people still want to feel like they have control of their decisions and emotions.

Barrier #3: How do we know if the results of emotion detection using AI are accurate?

We put a lot of trust in the accuracy of technology today, yet we don’t always consider how technology develops its abilities. The results of emotion-detecting AI depend heavily on the quality of the inputs used to train it. For example, the technology must consider not only how human emotion varies from person to person but also the vast differences in body language and non-verbal communication from one culture to another. Users will also want to consider the value and impact of the recommendations that come out of the analysis and whether they drive the intended behaviors.
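
One practical check is to disaggregate accuracy by culture or demographic group rather than trusting a single aggregate number. The sketch below assumes labeled evaluation data per group; the records and group names are illustrative only.

```python
from collections import defaultdict

# Illustrative evaluation records: (predicted_emotion, true_emotion, group)
results = [
    ("happy", "happy", "group_a"),
    ("neutral", "happy", "group_b"),
    ("fear", "fear", "group_a"),
    ("happy", "neutral", "group_b"),
]

correct: dict = defaultdict(int)
total: dict = defaultdict(int)
for predicted, actual, group in results:
    total[group] += 1
    correct[group] += int(predicted == actual)

# A large accuracy gap across groups signals culturally biased training data.
for group in sorted(total):
    print(f"{group}: accuracy = {correct[group] / total[group]:.0%}")
```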

Getting accurate data from using this kind of AI software could help businesses better meet the needs of customers and employees, and health and education institutions deliver better services. AI can pick up on small nuances that may otherwise be missed entirely and be useful in job hiring and other decision making.

But inaccurate data could alter what would otherwise have been a genuine conversation. Until accuracy improves, users should focus on whether the analytics interpret the messages correctly and whether overall patterns exist that can be used for future interactions. While potentially promising, AI emotion detection may still have some learning to do before it’s ready for prime time.

Contact us for questions or to discuss this topic further.

Learn more about recent advances in technology in our webinar, Building Successful Digital Product Engineering Businesses. Everest Group experts will discuss the massive digital wave in the engineering world as smart, connected, autonomous, and intelligent physical and hardware products take center stage.

Resilience and Stability in the Workers’ Compensation Industry – This is the Right Time for Claims Transformation to Secure Future Growth

As the workers’ compensation industry emerges from the pandemic, leveraging digital technologies to transform claims handling and taking a customer-centric approach will help carriers maintain profitability. By using automation, analytics, and digitalization, players can differentiate themselves. To learn about the key workers’ compensation trends to pay attention to, read on.

The workers’ compensation industry has remained profitable through the pandemic, with claims severity remaining consistent and frequency continuing to decrease. But reduced net written premiums, low interest rates, and the economic slowdown are creating top-line pressures. Moreover, the sustainability of profits is not guaranteed.

As COVID-19 subsides and most industries return to normal ways of working, the workers’ compensation industry could face difficulties holding on to its gains if it doesn’t chart a dedicated plan to improve productivity, employee experience, and employer mandates to create market differentiation.

Process standardization and simplification are the need of the hour. The workers’ compensation industry must move from the “repent and repair” model to “prevent and prepare” by leveraging business intelligence through an end-to-end real-time data flow across processes to enable a more customer-centric approach toward claims handling.

Currently, efficiency is impacted by a lack of information that results in back-and-forth requests across multiple claims touchpoints. By integrating processes, carriers can obtain real-time data to design standard workflows for Straight-through Processing (STP), exception handling, fraud detection, and claim reserves calculation, and to reduce the overall claims cycle time.

Challenges to overcome for claims transformation 

In addition to concerns and uncertainty about the long-term effects of the pandemic, the workers’ compensation industry faces the challenge of outdated workflows with heritage issues such as:

  • Lack of information at each node in the claims management process that increases cycle time and leads to poor end-user experience
  • Paper-based processes that are roadblocks to enabling a virtual ecosystem consisting of digital payments, paperless documents, e-signature, e-inspection, and sundry processes
  • Cumbersome manual functions that should be automated
  • Lack of a framework for standardizing processes and segregating functions, which forces items that require different handling, such as objectionable and non-objectionable items, through the same workflow
  • Claims not being linked to risk assessment and reporting, which impacts new business and renewals and the mapping of profitable and loss-making segments
  • Inability to benchmark claims, assess claims performance, and understand market impact

To continue growing, workers’ compensation players should look to implement digital transformation and optimize processes to reduce claims turnaround time. Carriers that focus on digital solutions and leverage data through automation and analytics will successfully pivot for the future.

Traditional claims versus digital claims


Exhibit 1: Everest Group

Three workers’ compensation industry trends to watch

1 – The role of automation

Workers’ compensation claims include many workflows that require only minimal manual intervention, where automation can act as an enabler providing numerous benefits such as:

  • Enhanced user experience through conversational Artificial Intelligence (AI) for the First Report of Injury (FROI)
  • Improved data validation and elimination of human error, enabling early fraud detection and reduced leakage
  • Automated claims routing for risk assessment through SOPs for STP, exceptions, and large claims (see the sketch below)
  • Auto-approval of bills based on claim and treatment parameters, shortening handling time
  • Greater claims capacity, a reduced backlog of open claims, faster adjudication, and faster return-to-work solutions
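
To illustrate the claims routing idea referenced above, here is a minimal rules-based triage sketch in Python. The thresholds and field names are assumptions for illustration, not an actual carrier’s rulebook.

```python
def route_claim(claim: dict) -> str:
    """Toy triage rules deciding whether a claim can flow straight through."""
    if claim.get("missing_documents"):
        return "exception-queue"        # incomplete intake needs follow-up
    if claim["estimated_cost"] > 100_000:
        return "large-claims-adjuster"  # high severity gets expert review
    if claim["fraud_score"] > 0.8:
        return "fraud-investigation"    # suspicious claims leave the STP path
    return "straight-through"           # low-risk claims auto-adjudicate


print(route_claim({"estimated_cost": 2_500, "fraud_score": 0.1}))
# -> straight-through
```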

Digital intake can remove friction and deliver a captivating experience for all stakeholders. By focusing on automation, workers’ compensation carriers will not only improve operational efficiency but also reduce operational costs – resulting in bottom-line benefits.

2 – Advancement in analytics

By adopting enhanced analytical capabilities, workers’ compensation carriers can increase their focus on the end-user experience and take a more proactive, prevention-based approach. Here are some ways this can be done:

  • Predictive and prescriptive analytics for insights on safety parameters will help prevent accidents and injuries
  • Predicting risk at claim intake will smooth the claims cycle for all stakeholders
  • Auto-assignment of claims to an adjuster with relevant experience for backend issues
  • Individual and aggregated claims-based rules with persona-specific dashboards for different injury types
  • Implementing Key Performance Indicators (KPIs) to assess analytical potential

Advances in analytics with proper predictive modeling techniques will reduce claims costs and improve claims severity outcomes, which, in turn, will deliver enhanced profitability.

3 – Digital ecosystem and a safe workplace

The evolution of workers’ compensation claims in the future will depend largely on the assessment of intake efficiency, moving away from redundant processes, and instituting digital and data-led workflows. Technology usage will not only depend on the number of datasets with a carrier but also on the value generated to create new models for transforming the entire claims solution.

The cornerstone for transformation should be prevention and preparedness. With many organizations choosing to operate in a hybrid working model post-pandemic, it is imperative to assess the long-term impact of such changes. Internet of Things (IoT) and telematics can help achieve this through initiatives such as:

  • Smart workplaces with sensors, cameras, and other intelligent devices for continuous supervision
  • Digital collaborations for safety training and mechanisms with loss controllers
  • Wearable devices for loss prevention and control
  • Creating digital safety communities

With the pandemic pushing workforces to stay at home, telemedicine has gained popularity, providing employees with medical consultations and reducing away-from-work time. It offers immediate care and a faster inquiry process by making expert medical professionals available. Telemedicine has also helped promote claims advocacy and assisted in intake and triage through digital authorizations and workflows that assign priority to claim categories.

In the end, employers want stability and predictability in final claim costs. Regardless of how these macro trends affect the workers’ compensation industry, the focus should be on creating a safe workplace and taking all measures to prevent injuries and hazards.

To discuss workers’ compensation industry trends, reach out to Abhimanyu Awasthi at [email protected], Somya Bhadola at [email protected], or contact us.

Workplace leaders are also now able to focus more on creating an experience-centric workplace through digital technologies, delivering superior user performance and job satisfaction. Learn more in our webinar, Top Strategies for Creating an Employee-focused Digital Workplace.

The Rise of Machine Learning Operations: How MLOps Can Become the Backbone of AI-enabled Enterprises

We’ve seen enterprises developing and employing multiple disparate AI use cases. But to become a truly AI-enabled enterprise, many standalone use cases need to be developed, deployed, and maintained to solve different challenges across the organization. Machine Learning Operations, or MLOps, offers the promise of seamlessly leveraging the power of AI without the hassle.

Everest Group is launching its MLOps Products PEAK Matrix® Assessment 2022 to gain a better understanding of the competitive service provider landscape. Discover how you can be part of the assessment.

Learn how to participate

With the rise in digitization, cloud, and internet of things (IoT) adoption, our world generates petabytes of data every day that enterprises want to mine to gain business insights, drive decisions, and transform operations.

Artificial Intelligence (AI) and Machine Learning (ML) insights can help enterprises gain a competitive edge but come with developmental and operational challenges. Machine Learning Operations (MLOps) can provide a solution. Let’s explore this more.

While tools for analyzing historical data to gain business insights have become well-adopted and easier to use, using this information to make predictions or judgment calls is a different ball game altogether.

Tools that deliver these capabilities, based on programming languages such as Python, SAS, and R, are known as data science or Machine Learning (ML) tools. Popular frameworks and environments include TensorFlow, PyTorch, and Jupyter notebooks.

Over the past decade, these tools have gained traction and have emerged as attractive options to develop predictive use cases by leveraging vast amounts of data to assist employees in making decisions and delivering consistent outcomes. As a result, enterprises can scale processes without proportionately increasing employee headcount.

Machine Learning differs from traditional IT initiatives in that it does not take a one-size-fits-all approach. Earlier, data science implementation teams operated in silos, worked on different business processes, and leveraged disparate development tools and deployment techniques with limited adherence to IT policies.

While the promised benefits are real, replicating them across geographies, functions, customer segments, and distribution channels, each with its own nuances, calls for a customized approach across these categories.

This led to a plethora of specialized models that individual business teams had to keep track of, as well as significant infrastructure and deployment costs.

Advances in ML have since driven software providers to offer approaches to democratize model development, making it possible to now create custom ML models for different processes and contexts.

MLOps to the rescue

In today’s world, developing multiple models that serve different purposes is less challenging. Enterprises that want to become AI-enabled and deploy AI at scale need to equip individual business teams with model deployment and monitoring capabilities.

As a result, software vendors have started offering a DevOps-style approach to centralize and support the deployment requirements of a vast number of ML models, with individual teams focusing only on developing models best suited to their requirements.

This new rising methodology called MLOps is a structured approach to scaling ML across organizations that brings together skills, techniques, and tools used in data engineering and machine learning.

What’s needed to make it work

Technical Capabilities Required for MLOps

MLOps assists enterprises in decoupling the development and operational aspects in an ML model’s lifecycle by bringing DevOps-like capabilities into operationalizing ML models. Technology vendors are offering MLOps to enterprises in the form of licensable software with the following capabilities:

  • Model deployment: In this important stage, the ability to deploy models on any infrastructure matters. Other features include storing an ML model in a containerized environment and options to scale compute resources
  • Model monitoring: Tracking the performance of models in production is complex and requires carefully designed performance measurement metrics. As soon as models start showing signs of declining prediction accuracy, they are sent back to the development team for review/retraining (see the sketch after this list)
  • Collaboration and platform management: MLOps solutions offer platform-related features such as security, access control, version control, and performance measurement to enhance reusability and collaboration among various stakeholders, including data engineers, data scientists, ML engineers, and central IT functions
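
As a simple illustration of the monitoring capability, the Python sketch below flags a deployed model for retraining once its rolling production accuracy drops below a baseline. The thresholds and structure are assumptions, not any specific vendor’s product.

```python
from collections import deque


class ModelMonitor:
    """Tracks recent prediction accuracy and flags models for retraining."""

    def __init__(self, baseline_accuracy: float, window: int = 500):
        self.baseline = baseline_accuracy
        self.outcomes = deque(maxlen=window)  # rolling window of hits/misses

    def record(self, prediction, actual) -> None:
        self.outcomes.append(prediction == actual)

    def needs_retraining(self) -> bool:
        if len(self.outcomes) < self.outcomes.maxlen:
            return False  # not enough production data to judge yet
        rolling_accuracy = sum(self.outcomes) / len(self.outcomes)
        return rolling_accuracy < self.baseline


monitor = ModelMonitor(baseline_accuracy=0.90)
monitor.record(prediction="approve", actual="approve")
if monitor.needs_retraining():
    print("Send the model back to the development team for review/retraining")
```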

Additionally, MLOps vendors offer support for multiple Integrated Development Environments (IDEs) to promote the democratization of the model development process.

While various vendors offer built-in ML development capabilities within their solutions, connectors are being developed and integrated to support a wide array of ML model file formats.

Additionally, the overall ML lifecycle management ecosystem is rapidly converging, with vendors developing end-to-end ML lifecycle capabilities, either in-house or through partner integrations.

MLOps can promote rapid innovation through robust machine learning lifecycle management, increase productivity, speed, and reliability, and reduce risk – making it a methodology to pay attention to.

Everest Group is launching its MLOps Products PEAK Matrix® Assessment 2022 to gain a better understanding of the competitive landscape. Technology providers can now participate and receive a platform assessment.

Learn how to participate

To share your thoughts on this topic, contact [email protected] and [email protected].

Federated Learning: Privacy by Design for Machine Learning

With cyberattacks and data breaches at all-time highs, consumers are increasingly skeptical about sharing their data with enterprises, creating a dilemma for artificial intelligence (AI), which needs massive data to thrive. The nascent technology of federated learning offers a promising alternative for healthcare, life sciences, banking, finance, manufacturing, advertising, and other industries to unleash the full potential of AI without compromising the privacy of individuals. To learn how you can have all the data you need while protecting consumers, read on.

Privacy preservation with federated learning

The seemingly endless series of massive data breaches that have stripped individuals of their privacy has made the public more aware of the need to protect their data. In the absence of strong governance and guidelines, people are more skeptical than ever about sharing their personal data with enterprises.

This new data-conscious paradigm poses a problem for artificial intelligence (AI), which thrives on huge amounts of data. Unless we can figure out a way to train machines on significantly smaller data sets, protecting the privacy and data of users will remain a key obstacle to intelligent automation.

Federated learning (FL) is emerging as a solution to this problem. Broadly speaking, Federated learning is a method of training machine learning models such that user data never leaves its location, keeping it safe and private. This differs from traditional centralized machine learning methods that require the data to be aggregated in a central location.

Federated learning is a mechanism of machine learning wherein the learning takes place in a decentralized manner across a network of nodes/edge devices, and the results are aggregated in a central server to create a unified model. It essentially decouples model training from centralized data storage.

The Mechanism of Federated Learning

By training the same model across an array of devices, each with its own data set, we get multiple versions of the model which, when combined, create a more powerful and accurate global version for deployment and use.
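
The canonical aggregation step, known as federated averaging, can be sketched in a few lines of Python. Real frameworks layer encryption, device sampling, and communication handling on top; this minimal version shows only the weighted combination.

```python
import numpy as np


def federated_average(local_weights: list, sample_counts: list) -> np.ndarray:
    """Combine locally trained model weights into one global model.

    Each device's update is weighted by how many samples it trained on,
    so larger local datasets contribute proportionally more.
    """
    total = sum(sample_counts)
    return sum(w * (n / total) for w, n in zip(local_weights, sample_counts))


# Three devices trained the same model on their own private data;
# only the weight updates, never the raw data, reach the server.
updates = [np.array([0.2, 1.0]), np.array([0.4, 0.8]), np.array([0.3, 0.9])]
counts = [100, 300, 600]
print(federated_average(updates, counts))  # the unified global model
```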

In addition to training algorithms in a private and secure manner, Federated learning provides an array of other benefits such as:

  • Training across data silos
  • Training on heterogeneous data
  • Lower communication costs
  • Reduced liability

Federated learning applicability and use cases

Based on an Everest Group framework, we have found Federated learning is most suitable, and being adopted at higher rates, in industries where data is an extremely critical asset distributed across different locations and privacy is paramount.

Federated learning is especially beneficial for industries with strict data residency requirements. This makes the healthcare and life sciences industries perfect candidates for its adoption. Federated learning can facilitate multi-institutional collaborations by helping medical institutions overcome the regulatory hurdles that prevent them from pooling patient data in a common location.

The next industries ripe for the adoption of Federated learning are banking and financial services. For instance, it can be used to develop a more comprehensive and accurate fraud analytics solution based on data from multiple financial entities.

Another industry where we see high applicability of Federated learning is manufacturing. By using Federated learning techniques to enable collaboration between different entities across the supply chain, manufacturers can build a more powerful model that helps increase overall supply chain efficiency.

Federated learning also might find increased use in interest-based advertising. With the decision to disable third-party cookies by major internet browsers, marketers are at a loss for targeted advertising and engagement. With Federated Learning, marketers can replace individual identifiers with cohorts or group-based identifiers. These cohorts are created by identifying people with common interests based on individual user data such as browsing habits stored on local machines.

An ecosystem on the rise

Since Google introduced the concept of Federated learning in 2016, there has been a flurry of activity. Given that this is a nascent technology, the ecosystem is currently dominated by big tech and open-source players. We see hyperscalers taking the lead with Microsoft and Amazon Web Services (AWS) making investments to activate Federated learning, followed by Nvidia and Lenovo who are looking at the market from a hardware perspective.

Another segment of players working in this arena are startups that are using Federated learning to build industry-specific solutions. AI companies such as Owkin and Sherpa.ai are pioneering this technology and have developed Federated learning frameworks that are currently operational at a few enterprises’ locations.

The adoption of and need for Federated learning depend on the industry and vary with the use case. Everest Group has developed a comprehensive framework to help you assess and understand the suitability of Federated learning for your use case in our Latest Primer for Federated Learning. The framework is built on four key parameters: data criticality, privacy requirements, regulatory constraints, and data silos/diversity.

Federated learning provides an alternative way to make AI work without compromising the privacy of individuals.

If you are interested in understanding the suitability of federated learning for your enterprise, please share your thoughts with us at [email protected].

Recharge Your AI Initiatives with MLOps: Start Experimenting Now

In this era of industrialization for Artificial Intelligence (AI), enterprises are scrambling to embed AI across a plethora of use cases in hopes of achieving higher productivity and enhanced experiences. However, as AI permeates different functions of an enterprise, managing the entire charter gets tough. Working with multiple Machine Learning (ML) models in both pilot and production can lead to chaos, stretched timelines to market, and stale models. As a result, we see enterprises hamstrung in their efforts to scale AI enterprise-wide.

MLOps to the rescue

To overcome the challenges enterprises face in their ML journeys and ensure successful industrialization of AI, enterprises need to shift from the current method of model management to a faster and more agile format. An ideal solution that is emerging is MLOps – a confluence of ML and information technology operations based on the concept of DevOps.

According to our recently published primer on MLOps, Successfully Scaling Artificial Intelligence – Machine Learning Operations (MLOps), this set of practices is aimed at streamlining ML lifecycle management through enhanced collaboration between data scientists and operations teams. This close partnering accelerates the pace of model development and deployment and helps in managing the entire ML lifecycle.


MLOps is modeled on the principles and practices of DevOps. While continuous integration (CI) and continuous delivery (CD) are common to both, MLOps introduces the following two unique concepts:

  • Continuous Training (CT): Seeks to automatically and continuously retrain the ML models based on incoming data
  • Continuous Monitoring (CM): Aims to monitor the performance of the model in terms of its accuracy and drift (see the sketch after this list)
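
A minimal Python sketch of how CT and CM fit together in one loop follows. The trigger thresholds and stubbed functions are illustrative assumptions, not a particular platform’s API.

```python
class Monitor:
    """Stubbed Continuous Monitoring (CM): accuracy and drift checks."""

    def accuracy(self, model) -> float:
        return 0.82  # stub: would compare live predictions with ground truth

    def drift_score(self, new_data) -> float:
        return 0.1  # stub: would compare new data's distribution to training


def retrain(model, new_data):
    print(f"Continuous Training (CT): retraining on {len(new_data)} records")
    return model  # stub: would fit and return an updated model version


def continuous_training_step(model, new_data, monitor: Monitor):
    """One CT/CM iteration: monitor the model, retrain if a trigger fires."""
    if monitor.drift_score(new_data) > 0.3 or monitor.accuracy(model) < 0.85:
        model = retrain(model, new_data)  # new version is registered/deployed
    return model


model = continuous_training_step("v1", new_data=[{}] * 500, monitor=Monitor())
```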

We are witnessing MLOps gaining momentum in the ecosystem, with hyperscalers developing dedicated solutions for comprehensive machine learning management to fast-track and simplify the entire process. Just recently, Google launched Vertex AI, a managed AI platform, which aims to solve these precise problems in the form of an end-to-end MLOps solution.

Advantages of using MLOps

MLOps bolsters the scaling of ML models by using a centralized system that assists in logging and tracking the metrics required to maintain thousands of models. Additionally, it helps create repeatable workflows to easily deploy these models.

Below are a few additional benefits of employing MLOps within your enterprise:


  • Repeatable workflows: Saves time and allows data scientists to focus on model building thanks to the automated workflows for training, testing, and deployment that MLOps provides. It also aids in creating reproducible ML workflows that accelerate the operationalization of models
  • Better governance and regulatory compliance: Simplifies the process of tracking changes made to a model to ensure compliance with regulatory norms for particular industries or geographies
  • Improved model health: Helps continuously monitor ML models across parameters such as accuracy, fairness, bias, and drift to keep the models in check and ensure they meet thresholds
  • Sustained model relevance and RoI: Keeps models relevant through regular retraining on new incoming data, maintaining an up-to-date model and a sustained Return on Investment (RoI)
  • Increased experimentation: Spurs experimentation by tracking multiple versions of models trained with different configurations, leading to improved variations
  • Trigger-based automated re-training: Helps set up automated re-training of a model based on fresh batches of data or triggers such as performance degradation, plateauing, or significant drift

Starting your journey with MLOps

Implementing MLOps is complex because it requires a multi-functional and cross-team effort across the key elements of people, process, tools/platforms, and strategy underpinned by rigorous change management.

As enterprises embark on their MLOps journey, here are a few key best practices to pave the way for a smooth transition:

  • Build a cross-functional team – Engage team members from the data science, operations, and business front with clearly defined roles to work collaboratively towards a single goal
  • Establish common objectives – Set common goals for the cross-functional team to cohesively work toward, realizing that each of the teams that form an MLOps pod may have different and competing objectives
  • Construct a modular pipeline – Take a modular approach instead of a monolithic one when building MLOps pipelines since the components built need to be reusable, composable, and shareable across multiple ML pipelines
  • Select the right tools and platform – Choose from a plethora of tools that cater to one or more functions (management, modeling, deployment, and monitoring) or from platforms that cater to the end-to-end MLOps value chain
  • Set baselines for monitoring – Establish baselines for automated execution of particular actions to increase efficiency and ensure model health in addition to monitoring ML systems

When embarking on the MLOps journey, there is no one-size-fits-all approach. Enterprises need to assess their goals, examine their current ML tooling and talent, and also factor in the available time and resources to arrive at an MLOps strategy that best suits their needs.

For ML to keep pace with the agility of modern business, enterprises need to start experimenting with MLOps now.

Are you looking to scale AI within your enterprise with the help of MLOps? Please share your thoughts with us at [email protected].

Request a briefing with our experts to discuss the 2022 key issues presented in our 12 days of insights.

