Enterprise adoption of low-code platforms has been invigorated in recent years by their potential to drive digital transformation. These fast-rising platforms promise to democratize programming amid today’s talent shortage and help companies develop applications and enhance functionality faster. While the opportunities are clear, the path to successful adoption is ambiguous. In this blog, learn the 4Cs approach best-in-class enterprises use to select and adopt right-fit low-code platforms.
According to Everest Group’s recent market study, as many as 60% of new application development engagements consider low-code platforms. The pandemic-driven surge in demand for digital transformation has accelerated annual low-code market growth to about 25%. Given this potential, low code is aptly being called the “Next Cloud.”
Investor interest has also accelerated, further driving R&D spend on new product development. Funding activities in 2022 for companies featuring low code in their profiles already amount to $560 million across 40 rounds.
Platform providers are responding to these elevated expectations with equal fervor: some are building platforms with deep domain-specific expertise, while others are providing process-specific solutions for enterprises’ customization requirements.
While these market dynamics have produced a proliferation of low-code platforms to choose from, they have also led to confusion and inefficiencies for enterprises. As more and more enterprises explore the potential of these platforms, IT leaders face numerous questions and concerns, such as:
“How do I select the platform that can address my current and future requirements?”
“Which platform will work best in my specific enterprise IT landscape?”
“How can we optimize the investment in this technology?”
“How do I compare the pricing structures of different low-code platforms?”
“How do we ensure governance and security of the IT estate with these new tech assets?”
In addition to the high-priority use cases that initiate the adoption, enterprises should consider the platform’s scalability potential, talent availability for support and enhancement, and integration with the broader IT landscape to make the right selection.
Additionally, low-code platforms are intended to address the requirements of the IT function as well as business stakeholders. Considering the drivers, expectations, and requirements of both when making the selection is essential. A collaborative decision-making set-up with the central IT team and key Line-of-Business (LoB) leaders is critical for a successful platform selection. Let’s explore the 4Cs to low code success.
The 4Cs approach lays out the key steps to ensure successful low-code platform selection and adoption, and it can provide enterprises with a roadmap to distinct outcomes. We have witnessed enterprises either adopting a best-fit approach resulting in a platform portfolio or leveraging a single platform as the foundation for an enterprise-grade innovation engine.
For instance, the Chief Technology Officer (CTO) of a leading bank in the US invested in establishing a low-code Center of Excellence (CoE) that uses different platforms for process automation, IT Service Management (ITSM), and enabling point solutions for business users.
On the other hand, a large US commercial insurer built its entire end-to-end multi-country app on a single low-code platform. This comprehensive, business-critical application managing claims, billing, and collection is accessible by all underwriters and service personnel.
Next, we explore how to best compare platforms based on their offerings and capabilities. The tables below illustrate the top five business and technology-oriented parameters to consider when evaluating platforms, along with their relevance and enterprise expectations.
Factors associated with a platform’s technical robustness are of key importance to IT decision-makers. Integration and UI/UX capabilities top enterprises’ technology priorities when comparing multiple platforms.
For instance, Appian ships with 150-plus Out-of-the-Box (OOTB) connectors. Appian SAIL, a patented UI architecture, uses declarative UI definitions to generate dynamic, interactive, multi-platform user experiences. It also makes applications more secure, easier to change, future-proofed, and native on the latest devices.
Assessing these parameters is important to understand whether low code can be sustained and scaled long-term and if it addresses the business users’ expectations. Pricing and security constructs are at the top of the list for businesses looking to adopt a low-code platform.
Let’s consider Salesforce as a case in point. Salesforce builds security into every layer of the platform: the infrastructure layer comes with replication, backup, and disaster recovery planning; network services provide encryption in transit and advanced threat detection; and the application services layer implements identity, authentication, and user permissions. In addition, frequent product updates that align its offering with changing market demands make Salesforce one of the go-to platforms for enterprises’ CRM needs.
The plethora of options makes it difficult for enterprises to zero in on a particular low-code platform. Enterprises should also leverage their network of service partners for guidance in this decision-making process.
Talent availability for implementation and enhancement support is critical to keep in mind during the platform selection. For the same reason, multiple system integrators are now taking the route of inorganic growth to bolster their low-code capabilities.
This is the time to hop on the low-code bandwagon and establish low code as the basis for enterprise digital transformation.
Everest Group’s Low-Code Application Development Platforms PEAK Matrix® Assessment 2022 provides an overview of the top 14 platforms based on vision, strategy, and market impact.
The Metaverse is the buzz these days. While the Metaverse provides an embodied virtual-reality experience, ScienceTech fuses technology and science to solve humanity’s real problems. Who will win the battle for relevance, investments, and talent? To learn more about these virtual and real-world market opportunities and the actions technology and service providers should take, read on.
While they once seemed far off, the Metaverse and ScienceTech are here now. As part of our continued Metaverse research, let’s explore these emerging technologies and whether they will collide or coexist.
ScienceTech brings together technology and science to improve the real world by enhancing living standards and improving equality. It combines technology with physical sciences, life sciences, earth sciences, anthropology, geography, history, mathematics, systems, logic, etc.
Meanwhile, the Metaverse is an emerging concept that uses next-generation advanced technologies such as Augmented Reality (AR)/Virtual Reality (VR), digital assets, spatial computing, and commerce to build an immersive, seamless experience.
Over the past few months, the Metaverse has become a hot topic not only in technology circles but also among enterprises. As providers pump billions of dollars into creating the landscape and value realization becomes clearer, the Metaverse will grab increasing attention from enterprises, providers, and market influencers.
Its serious market potential can be seen in industry participants collaborating to define standards for interoperating Metaverse platforms and ecosystems. Everest Group is witnessing great interest in our Metaverse research, and our recent webinar, Web 3.0 and the Metaverse: Implications for Sourcing and Technology Leaders, generated unprecedented client inquiries.
ScienceTech has been around for many years but has been mostly experimental with limited revenue and growth. Technology and service providers have been reluctant to meaningfully scale this business because of its complexity, significant investment requirements, and high risk of failure.
However, the pandemic has changed priorities for enterprises and individuals, making ScienceTech more critical to solving real-life problems. The cloud, an abundance of data, better manufacturing processes, and a plethora of affordable technologies have lowered the cost of enabling and building these offerings.
Below are some of the areas where these two emerging fields could conflict:
Many cynics have decried the Metaverse as one more BigTech fantasy that takes people further away from reality. This cynicism has gained pace in light of the disruptive global pandemic: a make-believe happy world driven by a heavy dose of virtual reality, critics argue, takes humanity’s focus away from the pressing needs of our time.
While not well defined, ScienceTech is generally perceived as different from pure-play technology. Some of its ideas have been around for many years, such as device miniaturization, autonomous systems, regenerative medicine, and biosimulation. The core defining principle of ScienceTech is that themes hypothesized, researched, and validated by science are built through technology. To many, the relevance of ScienceTech may appear far more pressing than the make-believe virtual world of the Metaverse.
The interesting competition will be for investments. Last year, venture capitalists invested over US$30 billion in crypto-related start-ups. As the Web 3.0 and Metaverse tech landscape becomes more fragmented and crowded, investors may not want to put their money into sub-scaled businesses. This can help the ScienceTech space, which is not well understood by investors, but offers a compelling value proposition.
Technology talent is scarce, and ScienceTech talent is even scarcer. Although Metaverse vendors will continue to attract talent because they can pay top dollar, ScienceTech vendors can offer niche talent more purpose and more exciting technologies. In the internet heyday, people bemoaned that bright minds were busy getting users to click links instead of solving world problems. The Metaverse may face the same challenge, and ScienceTech can benefit from this perception. GenZ job seekers want to work where they can impact and change the world, and ScienceTech can provide that forum.
Both Metaverse providers and ScienceTech companies will thrive, and they will share quite a few technology building blocks, namely edge, cloud, Artificial Intelligence (AI), and data. These technologies and trends will not battle one another. Moreover, because the two markets serve different purposes, the Metaverse and ScienceTech will coexist. Technology and service providers will need to invest in both segments to capture and shape market demand.
Providers need to prioritize where to focus efforts, investments, partnerships, and leadership commitment. A different people strategy will be needed because skilling technology resources on science and vice-versa will not work. They will need to select specific focus areas and hire people from multiple science domains. The R&D group will have to change its constituents and focus on science-aligned technology rather than just Information and Communications Technology.
To be successful, providers also will have to find anchor clients to underwrite some offerings, collaborate to gain real-life industry knowledge, and engage with broader ecosystems such as academia, government, and industry bodies to build market-enabling forums.
Despite their growth, low-code platforms are still surrounded by much confusion. Many enterprises incorrectly believe that real developers don’t need low code, anyone can do it, and it’s only for simple problems. To debunk three common myths in the low-code market, read on.
With their increasing importance, low-code platforms are also subject to several myths and misunderstandings. As with every evolving technology, enterprises have many questions about how to use these platforms optimally.
Based on our conversations with multiple enterprises confirming the lack of understanding about the low-code market, we tackle the common misperceptions below:
The term low code generally evokes the impression of an HR manager who, tired of following up with the IT team multiple times, decides to create a leave approval workflow application. While this impression is not incorrect, professional developers and enterprise IT teams are key stakeholders in the low-code ecosystem as well.
Professional developers increasingly use low-code platforms to improve their efficiency. Some of these platforms can provide code quality alerts and Artificial Intelligence (AI)-powered recommendations, not to mention custom solutions that require minimal tuning.
The built-in DevOps capabilities in these platforms also encourage users to shift away from the commonly used waterfall model. For example, supply chain management software provider Nimbi reduced its development team from 40 to 24 developers when it switched to OutSystems from traditional platforms.
We strongly believe central IT teams have a meaningful role in the ecosystem: providing effective oversight and governance, in addition to strategizing the use of the best low-code platforms at the enterprise level. In the absence of centralized governance, low-code platforms may proliferate across the organization, aggravating shadow IT issues and increasing spend.
As much as we may want to believe otherwise, low-code platforms are not a panacea for the ongoing talent crisis. Misleading promises by certain technology vendors have created the impression that any user can develop any application using low-code platforms. However, low-code development does not imply zero technical skill.
Thus, it is unrealistic to expect an army of business users to step in and take over all application development needs from the IT organization. Low-code development remains a role demanding a strong skillset across various technologies.
Many enterprise leaders and service providers believe that low-code platforms are only suitable for small-scale, department-level needs. However, our conversations indicate that low-code platforms are being rapidly adopted for critical applications used by millions of users, solving complex IT problems around the world.
As adoption of low-code platforms gathers pace, many myths and misunderstandings about low code versus traditional development need to be cleared up. Technology providers and service partners play a key role in helping clients navigate the abundant options to orchestrate a carefully crafted low-code strategy and select the best low-code platforms.
At Everest Group, we are closely tracking the low-code market. For more insights, see our compendium report on various platform providers, the state of the low-code market report shedding light on the enterprise adoption journey, and a PEAK Matrix assessment comparing 14 leading players in the low-code market.
You can also attend our webinar, Building Successful Digital Product Engineering Businesses, to explore how enterprises are investing in next-generation technologies and talent and the most relevant skillsets for digital product engineering initiatives.
Financial services organizations are digitizing and automating more processes to keep up with the competition, and intelligent process automation (IPA), whose use has been growing by about 20% across all fields, is now becoming ubiquitous. In a recent report, market research firm Everest Group ranked IPA technology vendors and examined the process automation market.
With inflation in the United States at a 40-year high and unemployment near a 50-year low, these are tough times to attract and retain employees in just about every sector. When you add the growing demand for talent in high tech sectors like big data and AI, you get a job market that’s great for these workers, but tough for companies.
David Rickard of Everest Group, a respected provider of insight for the global BPO industry, says that while countries like India have a lot to offer now, there are some other locales that should be on your radar, including Africa.
This fully comprehensive study of enterprise intelligent automation adoption maturity from respected research firm Everest Group offers actionable insights for intelligent automation strategy and implementation.
Artificial Intelligence (AI) solutions that aim to recognize human emotions can provide useful insights for hiring, marketing, and other purposes. But their use also raises serious questions about accuracy, bias, and privacy. To learn about three common barriers that need to be overcome for AI emotion detection to become more mainstream, read on.
By using machine learning to mimic human intelligence, AI can execute everything from minimal and repetitive tasks to those requiring more “human” cognition. Now, AI solutions are popping up that go as far as to interpret human emotion. In solutions where AI and human emotion intersect, does the technology help, or deliver more trouble than value?
While we are starting to see emotion detection using AI in various technologies, several barriers to adoption exist, and serious questions arise as to whether the technology is ready to be widely used. AI that aims to interpret or replace human interactions can be flawed because of underlying assumptions made when the machine was trained. Another concern is the broader question of why anyone would want to have this technology used on them. Is the relationship equal between the organization using the technology and the individual on whom the technology is being used? Concerns like these need to be addressed for this type of AI to take off.
Let’s explore three common barriers to emotion detection using AI:
Newly launched AI-based solutions that track human sentiment for sales, human resources, instruction, and telehealth can help provide useful insights by understanding people’s reactions during virtual conversations.
While talking through the screens, the AI tracks the sentiment of the person, or people, who are taking the information in, including their reactions and feedback. The person being tracked could be a prospective customer, employee, student, patient, etc., where it’s beneficial for the person leading the virtual interaction to better understand how the individual receiving the information is feeling and what they could be thinking.
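To make the mechanics concrete, here is a minimal sketch in Python of how per-frame sentiment signals from such a tool might be rolled up into a conversation-level read. The emotion labels, scores, and function names are entirely hypothetical illustrations, not the API of any real product:

```python
from collections import defaultdict

def aggregate_emotions(frames):
    """Average per-frame emotion scores into a conversation-level
    summary and pick the dominant emotion.

    `frames` is a list of dicts mapping an emotion label to a score
    in [0, 1] -- one dict per video frame or utterance.
    """
    totals = defaultdict(float)
    for frame in frames:
        for emotion, score in frame.items():
            totals[emotion] += score
    # Average each emotion over the whole conversation
    summary = {e: s / len(frames) for e, s in totals.items()}
    dominant = max(summary, key=summary.get)
    return summary, dominant

# Mock conversation: three frames scored by a hypothetical model
frames = [
    {"engaged": 0.7, "bored": 0.2, "concerned": 0.1},
    {"engaged": 0.5, "bored": 0.4, "concerned": 0.1},
    {"engaged": 0.6, "bored": 0.3, "concerned": 0.1},
]
summary, dominant = aggregate_emotions(frames)
print(dominant)  # -> engaged
```

The session leader would see only the aggregated signal (for example, "the audience is mostly engaged"), which is exactly the kind of summary a sales or telehealth dashboard might surface.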
This kind of AI could be viewed as ethical in human resources, telehealth, or educational use cases because tracking reactions such as fear, concern, or boredom could benefit both the person delivering the information and those receiving it. In these situations, the software could help deliver a better outcome for the person being assessed. However, in few other use cases is it advantageous for everyone involved when one person gains a “competitive advantage” over another in a virtual conversation by using AI technology.
This brings us to the next barrier: why should anyone agree to have this software turned on during a virtual conversation? If someone perceives an imbalance of control during a virtual conversation, the AI software comes across as incredibly intrusive. If people must agree to be judged by AI software in some form, many could decline simply because of its invasive nature.
People are becoming more comfortable with technology and what it can do for us; however, people still want to feel like they have control of their decisions and emotions.
We put a lot of trust in the accuracy of today’s technology, yet we don’t always consider how that technology develops its abilities. The results of emotion-detecting AI depend heavily on the quality of the inputs used to train it. For example, the technology must account not only for how human emotion varies from person to person but also for the vast differences in body language and non-verbal communication across cultures. Users will also want to consider the value and impact of the recommendations that come out of the analysis and whether they drive the intended behaviors.
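One practical way to probe this training-data problem is a simple fairness audit: measure the model’s accuracy separately for each cultural or demographic group in a labeled evaluation set and flag groups that lag behind. The sketch below is a hedged illustration with made-up groups and labels, not a substitute for a rigorous bias audit:

```python
from collections import defaultdict

def accuracy_by_group(records):
    """Compute emotion-prediction accuracy per group.

    `records` is a list of (group, predicted, actual) tuples --
    hypothetical evaluation data used only to illustrate the audit.
    """
    correct, total = defaultdict(int), defaultdict(int)
    for group, predicted, actual in records:
        total[group] += 1
        correct[group] += (predicted == actual)
    return {g: correct[g] / total[g] for g in total}

def flag_accuracy_gaps(per_group, tolerance=0.10):
    """Return groups whose accuracy trails the best group by more than
    `tolerance` -- a hint the training data under-represents them."""
    best = max(per_group.values())
    return sorted(g for g, acc in per_group.items() if best - acc > tolerance)

# Toy audit set: the model does well on group A, poorly on group B
records = [
    ("A", "happy", "happy"), ("A", "bored", "bored"),
    ("B", "happy", "concerned"), ("B", "bored", "bored"),
]
per_group = accuracy_by_group(records)
print(flag_accuracy_gaps(per_group))  # -> ['B']
```

A flagged group is a signal to collect more representative training data for that population before trusting the model’s output in decisions that affect people.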
Getting accurate data from using this kind of AI software could help businesses better meet the needs of customers and employees, and health and education institutions deliver better services. AI can pick up on small nuances that may otherwise be missed entirely and be useful in job hiring and other decision making.
But inaccurate data could alter what would otherwise have been a genuine conversation. Until accuracy improves, users should focus on whether the analytics interpret messages correctly and whether overall patterns exist that can inform future interactions. While potentially promising, AI emotion detection may still have some learning to do before it’s ready for prime time.
Contact us for questions or to discuss this topic further.
Learn more about recent advances in technology in our webinar, Building Successful Digital Product Engineering Businesses. Everest Group experts will discuss the massive digital wave in the engineering world as smart, connected, autonomous, and intelligent physical and hardware products take center stage.
As the workers’ compensation industry emerges from the pandemic, leveraging digital technologies to transform claims handling and taking a customer-centric approach will help carriers maintain profitability. By using automation, analytics, and digitalization, players can differentiate themselves. To learn about the key workers’ compensation trends to pay attention to, read on.
The workers’ compensation industry has remained profitable through the pandemic, with claims severity remaining consistent and frequency continuing to decrease. But reduced net written premiums, low interest rates, and the economic slowdown are creating top-line pressures. Moreover, sustained profitability is not guaranteed.
As COVID-19 subsides and most industries return to normal ways of working, the workers’ compensation industry could struggle to hold on to its gains if it doesn’t chart a dedicated plan to improve productivity, employee experience, and employer mandates to create market differentiation.
Process standardization and simplification are the need of the hour. The workers’ compensation industry must move from the “repent and repair” model to “prevent and prepare” by leveraging business intelligence through an end-to-end real-time data flow across processes to enable a more customer-centric approach toward claims handling.
Currently, efficiency suffers from a lack of information, resulting in back-and-forth requests across multiple claims touchpoints. By integrating processes, carriers can obtain real-time data to design standard workflows for Straight-through Processing (STP), build exception handling, fraud detection, and claim reserve calculation capabilities, and reduce overall claims cycle time.
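The STP idea can be sketched in a few lines: complete, low-risk claims flow straight through, while anything incomplete, large, or suspicious is routed to exception handling. The claim fields, thresholds, and routing labels below are illustrative assumptions, not an actual carrier rulebook:

```python
def triage_claim(claim, stp_amount_limit=10_000, fraud_score_limit=0.3):
    """Route a workers' compensation claim to straight-through
    processing (STP) or exception handling.

    Field names and thresholds are hypothetical, for illustration only.
    """
    # Incomplete intake data is the main cause of back-and-forth requests
    required = ("claimant_id", "injury_date", "employer_id", "amount")
    if any(claim.get(f) is None for f in required):
        return "exception: incomplete intake data"
    # Large claims go to an adjuster rather than auto-adjudication
    if claim["amount"] > stp_amount_limit:
        return "exception: adjuster review (amount above STP limit)"
    # A fraud-detection model's score gates the fast path
    if claim.get("fraud_score", 0.0) > fraud_score_limit:
        return "exception: fraud investigation"
    return "stp: auto-adjudicate and set reserve"

claim = {"claimant_id": "C-101", "injury_date": "2022-03-01",
         "employer_id": "E-7", "amount": 2_500, "fraud_score": 0.05}
print(triage_claim(claim))  # -> stp: auto-adjudicate and set reserve
```

With integrated, real-time data feeding such rules, the fast path handles the routine majority of claims while adjusters focus on genuine exceptions.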
In addition to concerns and uncertainty about the long-term effects of the pandemic, the workers’ compensation industry faces the challenge of outdated workflows and heritage issues.
To keep growing, workers’ compensation industry players should implement digital transformation and optimize processes to reduce claims turnaround time. Carriers that focus on digital solutions and leverage data through automation and analytics will successfully pivot for the future.
Exhibit 1: Traditional claims versus digital claims (Source: Everest Group)
1 – The role of automation
Workers’ compensation claims consist of workflows requiring minimal manual intervention, where automation can work as an enabler providing numerous benefits.
Digital intake can remove friction and deliver a captivating experience for all stakeholders. By focusing on automation, workers’ compensation carriers will not only improve operational efficiency but also reduce operational costs – resulting in bottom-line benefits.
2 – Advancement in analytics
By adopting enhanced analytical capabilities, workers’ compensation carriers can increase their focus on the end-user experience and take a more proactive, prevention-based approach. Proper predictive modeling techniques will reduce claims costs and severity, which, in turn, will deliver enhanced profitability.
3 – Digital ecosystem and a safe workplace
The evolution of workers’ compensation claims in the future will depend largely on the assessment of intake efficiency, moving away from redundant processes, and instituting digital and data-led workflows. Technology usage will not only depend on the number of datasets with a carrier but also on the value generated to create new models for transforming the entire claims solution.
The cornerstone of transformation should be prevention and preparedness. With many organizations choosing to operate in a hybrid working model post-pandemic, it is imperative to assess the long-term impact of such changes. Internet of Things (IoT) and telematics initiatives can help achieve this.
With the pandemic keeping workforces at home, telemedicine has gained popularity, providing employees with medical consultations and reducing away-from-work time. It offers immediate care and a faster inquiry process by making expert medical professionals available. Telemedicine has also helped promote claims advocacy and assisted in intake and triage through digital authorizations and workflows that assign priority to claim categories.
In the end, employers want stability and predictability in final claim costs. Regardless of how these macro trends affect the workers’ compensation industry, the focus should be on creating a safe workplace and taking all measures to prevent injuries and hazards.
Workplace leaders are also now able to focus more on creating an experience-centric workplace through digital technologies, delivering superior user performance and job satisfaction. Learn more in our webinar, Top Strategies for Creating an Employee-focused Digital Workplace.