
Technology Synergy Drives M&A Spike in the Banking and Financial Services Industry | Blog

Technology used to be an enabling strategic pillar for banks and financial services (BFS) organizations. Now, it is the core of these firms’ value creation playbooks. Indeed, BFS firms are building digital capability platforms using modern technologies to create what we have named SUPER — or Secure, Ubiquitous, Personalized, Easy, and Responsive — banking experiences and optimized operations.

This move to digital requires BFS firms to invest disproportionately in building these industry platforms at speed and scale. M&As (mergers and acquisitions) are helping BFS firms accelerate this transformation agenda by capturing cost synergies from mergers and reinvesting them in technology rationalization, modernization, and innovation.

Bloomberg estimated that more than US$500 billion worth of BFS M&A deals happened in 2020. That magnitude is the second highest since the 2008 financial crisis and lags 2019 by only a razor-thin margin due to the pandemic-induced slowdown. Our recent analysis found that eight of the ten largest M&As in the BFS industry in 2020 cited technology synergy as one of the key drivers for the transaction.

Traditionally, acquisitions served as an opportunity to enter new product lines and/or geographies, gain new capabilities, and achieve cost savings and operational efficiencies by modernizing technology and streamlining processes and systems. Recent acquisitions in the BFS space have additionally focused on technology synergy and the ability to weaponize the combined technology estate. Technology synergy is achieved in these M&A transactions by:

  • Acquiring digital capabilities and solutions
  • Achieving the scale at which it makes economic sense to invest in building industry platforms using cloud, APIs, and data & analytics technologies
  • Acquiring digital skills
  • Combining discrete technology components of merged entities to create industry platforms

As mentioned in the image below, leaders at BFS firms undergoing such M&As stress the importance of digital as a lever for these strategic acquisitions. For instance, in the merger of First Citizens BancShares, Inc. and CIT Group, Ellen R. Alemany – Chairwoman and CEO of CIT, who will assume the role of Vice Chairwoman of the combined entity – highlighted how well-positioned the two firms will be to leverage their product portfolio and technology across the franchises, and make additional investments in technology to enhance the customer experience.

Focus on tech synergy causing a spike in M&A activity in BFS

Expansion of the IT estate to build digital capability platforms has created a paradigm shift in business cases for M&As. The platform-based economy not only enables new businesses and systems but also facilitates rapid integration across merged entities.

A notable example is S&P Global’s bid to buy IHS Markit in December 2020, a technology-driven merger in the financial information and credit rating space. It creates an opportunity for two firms with unique and complementary assets to build a formidable data and technology offering. IHS Markit is the industry frontrunner in leveraging platforms for underwriting corporate stocks and bonds and for trade processing. The combined entity will become a data powerhouse for complex financial products, and this will directly fuel growth for S&P’s credit rating service, which accounts for 40-50 percent of its revenue.

Skill and solution acquisition is gradually gaining popularity across multiple deals. Digital identity and security are addressed in Moody’s purchase of Regulatory DataCorp (RDC), a provider of KYC/AML data services, and in Mastercard’s acquisition of RiskRecon for cybersecurity services. In the platforms/technology space, Charles Schwab acquired the technology and intellectual property of the fintech Motif. Customer experience took centerstage in Goldman Sachs’ acquisition of United Capital, with a focus on scaling up its UI/UX products. Talent acquisition is another factor gaining ground across some of these mergers.

In November 2020, PNC Financial Services agreed to acquire the US operations of the Spanish lender BBVA. And most recently, Huntington Bancshares announced its acquisition of TCF Financial. These banks are not only increasing their asset size and market reach but also gearing up to save costs by optimizing their IT estates and branch networks. These cost savings are being funneled into building better digital experiences as more customers opt for online and mobile services for their banking needs.

Similarly, significant deal activity is expected in the asset management space. For example, Macquarie Group is set to buy Waddell & Reed for US$1.7 billion. This traction in asset management is driven not just by pressure on fees and revenue but also by increased costs attributed to technology and digital spending. Asset management firms with deep pockets are already betting heavily on the success of platform- and data-based niche firms. For instance, BlackRock recently purchased minority stakes in the platform-based alternative wealth management firm iCapital Network and the robo-advisor Envestnet.

Our analysis suggests that the M&A trend will pick up among regional and community banks in a bid to gain scale. This is critical to compete with larger players as customer intimacy and relationships move from physical to digital. Combined entities will be better equipped to build new capabilities in robotics, AI/ML, and advanced analytics as banking increasingly digitizes, and will also have a larger talent pool in which a better skill-to-role match can be achieved.

BFS M&As will be a boost to the consulting and IT services industry

M&As will entail increased post-merger integration and consulting spending in the short term. BFS firms will need partners that can create a modernization roadmap for the combined entity. The merged entities can gain significant cost synergies by rationalizing their vendor portfolios and IT estates, as several applications and platforms will become redundant. Hence, a modernization roadmap will enable value creation in the long run.

Of course, the merged entities must also make rapid changes to their working models, delivery strategies, and sourcing decisions to thrive in the new normal. Investments in specific technologies and tools will ensure growth and continuity of operations. Digital acquisition is thus becoming table stakes as firms determine the right valuation even before they formulate the integration strategy.

Large BFS firms are looking at targets that help them create a digital service model for the future. We are already seeing increased M&A activity among regional banks, asset management firms, and brokerage houses. As we inch closer – hopefully – to the end of the pandemic, BFS firms will be eyeing M&A opportunities that deliver technology synergy and associated business transformation benefits. Picking the right segment, target, and timing of these initiatives will be crucial.

Discover even more insights into the BFS industry in our recent research and reports.

Or if you would like to understand more about the impact of the increased M&A activity in the BFS industry, please reach out to us at [email protected] and [email protected].

Advancing from Artificial Intelligence to Humane Intelligence | Blog

I recently came across a news article that said doctors will NOT be held responsible for a wrong decision made based on the recommendations of an artificial intelligence (AI) system. That’s shocking and disturbing at so many levels! Think of the multitude of AI-based decisions possible in banking and financial services, the public sector, and many other industries, and the worrying implications wrong decisions could have on the lives of people and society.

One of the never-ending debates for AI adoption continues to be the ethicality and explainability concerns with the systems’ black box decision making. There are multiple dimensions to this issue:

  1. Definitional ambiguity – Trustworthy, fair and ethical, and repeatable are all characteristics attributed to AI systems in the context of explainability. Most enterprises cite explainability as a concern, but few really know what it means or the degree to which it is required.
  2. Misplaced ownership – While AI systems can be trained, re-trained, tested, and course-corrected, no developer can guarantee bias-free or accurate decision making. So, in case of a conflict, who should be held responsible: the enterprise, the technology provider, the solution developer, or another group?
  3. Rising expectations – AI systems are increasingly being trusted with highly complex, multi-stakeholder decision-making scenarios that are contextual, subjective, open to interpretation, and require emotional intelligence.


Enterprises, particularly highly regulated ones, have hit a roadblock in their AI adoption journeys and scalability plans considering the consequences of wrong decisions made with AI. In fact, one in every three AI use cases fails to reach substantial scale due to explainability concerns.

While the issue may not be a concern for all AI-based use cases, it is usually a roadblock for scenarios with high complexity and high criticality, which lead to irrevocable decisions.


In fact, Hanna Wallach, a senior principal researcher at Microsoft Research in New York City, stated, “We cannot treat these systems as infallible and impartial black boxes. We need to understand what is going on inside of them and how they are being used.”

Progress so far

Last year, Singapore released its Model AI Governance Framework, which provides readily implementable guidance to private sector organizations seeking to deploy AI responsibly. More recently, Google released an end-to-end framework for an internal audit of AI systems. There are many other similar efforts by opponents and proponents of AI alike; however, a feasible solution is still out of sight.

Technology majors and service providers have also made meaningful investments to address the issue, including Accenture (AI fairness Toolkit), HCL (Enterprise XAI Framework), PwC (Responsible AI), and Wipro (ETHICA). Many niche XAI-centric firms that focus solely on addressing the explainability conundrum, particularly for highly regulated industries like healthcare and the public sector, also exist; Ayasdi, Darwin AI, KenSci, and Kyndi deserve a mention.

The solution focus varies from enabling enterprises to compare the fairness and performance of multiple models to letting users set their own ethicality bars. It’s interesting to note that all of these are bolt-on solutions that explain a decision in a human-interpretable format, rather than AI products with explainability embedded.
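
To make the idea of a bolt-on explanation concrete, here is a minimal, purely illustrative sketch using the open-source SHAP library on a generic scikit-learn model; the classifier, synthetic data, and feature handling are assumptions for illustration and do not represent any particular vendor’s product.

```python
# Minimal post-hoc explainability sketch using the open-source SHAP library.
# The classifier and data below are synthetic stand-ins, not a vendor product;
# enterprise XAI tools wrap similar ideas in governance and workflow tooling.
import shap
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Train a generic classifier on synthetic data (a stand-in for, say, a credit model)
X, y = make_classification(n_samples=500, n_features=6, random_state=0)
model = RandomForestClassifier(random_state=0).fit(X, y)

# Explain a single decision: how much each feature pushed the prediction up or down
explainer = shap.TreeExplainer(model)
contributions = explainer.shap_values(X[:1])
print("Per-feature contributions for one decision:", contributions)
```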

The missing link  

Considering this is an artificial form of intelligence, let’s take a step back and analyze how humans make such complex decisions:

  • Bias-free does not exist in the real world: The first thing to appreciate is that humans are not free from biases, and biases by their nature are subjective and open to interpretation.
  • Progressive decision-making approach: A key difference between humans and the machines making such decisions is that, even with all processes in place, humans seek help, pursue guidance in case of confusion, and discuss edge cases that are more prone to wrong decisions. Complex decision making is seldom left to one individual alone; rather, a hierarchy of decision makers is in play, each adding knowledge on top of previous insights to build a decision tree.
  • Emotional Quotient (EQ): Humans have emotions, and even though most decisions require pragmatism, it’s the EQ in human decisions that explains the outcomes in many situations.


These are behaviors that today’s AI systems are not trained to adopt. A disproportionate focus on speed and cost has led to neglecting the human element that ensures accuracy and acceptance. And instead of addressing accuracy as a characteristic, we add another layer of complexity to AI systems with explainability.

And even if the AI system is able to explain how and why it made a wrong decision, what good does that do anyway? Who is willing to put money in an AI system that makes wrong decisions but explains them really well? What we need is an AI system that makes the right decisions, so it does not need to explain them.

AI systems of the future need to be designed with these humane elements embedded in their nature and functionality. This may include pointing out edge cases, “discussing” and “debating” complex cases with other experts (humans or other AI systems), embedding EQ in decision making, and at times even handing a decision back to humans when the system encounters a new scenario where the probability of a wrong decision is higher.

But until we get there, a practical way for organizations to address these explainability challenges is to adopt a hybrid, human-in-the-loop approach. Such an approach relies on subject matter experts (SMEs), such as ethicists, data scientists, regulators, and domain experts, to:

  • Improve learning models’ outcomes over time
  • Check for biases and discrepancies
  • Ensure compliance

In this approach, instead of relying on a large training data set to build the model, the machine learning system is built iteratively with regular inputs from experts.
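
A minimal sketch of what such an iterative, human-in-the-loop build could look like is shown below; the confidence threshold, model choice, and expert-review stub are illustrative assumptions, not a prescribed design.

```python
# Hedged sketch of a human-in-the-loop loop: predictions the model is unsure
# about are handed back to an SME, and the expert's labels are folded into the
# training set before retraining. All data, thresholds, and the review stub are
# illustrative placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression

def expert_review(sample):
    """Stand-in for an SME judgment (ethicist, regulator, domain expert)."""
    return 1  # hypothetical label supplied by the human reviewer

rng = np.random.default_rng(0)
X_train, y_train = rng.random((50, 4)), rng.integers(0, 2, 50)
model = LogisticRegression().fit(X_train, y_train)

CONFIDENCE_THRESHOLD = 0.8  # illustrative; set per decision criticality
for sample in rng.random((10, 4)):
    confidence = model.predict_proba(sample.reshape(1, -1)).max()
    if confidence < CONFIDENCE_THRESHOLD:
        label = expert_review(sample)            # hand the decision back to a human
        X_train = np.vstack([X_train, sample])
        y_train = np.append(y_train, label)
        model.fit(X_train, y_train)              # retrain with the expert's input
```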


In the long run, enterprises need to build a comprehensive governance structure for AI adoption and data leverage. Such a structure will have to institute explainability norms that factor in the criticality of machine decisions, the expertise required, and checks throughout the lifecycle of any AI implementation. The world of the future requires humane intelligence, not just artificial intelligence systems.

We would be happy to hear your thoughts on approaches to AI and XAI. Please reach out to [email protected] for a discussion.

Dilemma of Customers’ Increased Productivity or Service Providers’ Profitability | Blog

I previously blogged about the need for a new third-party services operating model that would be more productive and agile and focus far more on results. Though the industry was on the verge of moving to a new model, adoption was slow at first. Now, traction for the move to a new model is picking up in the marketplace, but enterprise customers and service providers define expectations differently.

Read more of my blog in Forbes

Amazon HealthLake: A Step Further in AWS’ Healthcare Strategy | Blog

It’s been close to a month since Amazon Web Services (AWS) announced Amazon HealthLake at its 2020 annual (and virtual) conference. After observing the reactions from various industry participants, we thought it was time to offer our opinion.

Amazon HealthLake is a HIPAA-eligible service that aims to support interoperability standards and further drive the use of big data analytics in healthcare and life sciences. The service is essentially a data lake tailored for the healthcare and life sciences industry. It will aggregate an organization’s data across various silos and disparate formats into a centralized AWS data lake, and automatically normalize this information using machine learning. It will be capable of identifying each piece of clinical information and tagging and indexing events in a timeline view with standardized labels so they can be easily searched. This structured data can then be offloaded to a service such as Amazon SageMaker to train machine learning models for advanced analytics. HealthLake will also structure all the data into the Fast Healthcare Interoperability Resources (FHIR) industry standard format to enable data sharing throughout the organization.

Exhibit 1: Amazon HealthLake


Source: AWS (https://aws.amazon.com/healthlake/)
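
For teams that want to experiment, the sketch below shows roughly how an organization might provision a HealthLake data store and trigger a bulk import through the AWS SDK for Python (boto3); the data store name, S3 location, and IAM role are hypothetical placeholders, and the exact required parameters may vary by SDK version, so consult the AWS documentation before relying on it.

```python
# Hedged sketch of interacting with Amazon HealthLake via boto3. Resource names,
# the S3 URI, and the IAM role ARN are hypothetical placeholders; error handling,
# job output configuration, and permissions are omitted for brevity.
import boto3

healthlake = boto3.client("healthlake", region_name="us-east-1")

# Create a FHIR R4 data store to hold the normalized, indexed clinical records
datastore = healthlake.create_fhir_datastore(
    DatastoreName="example-clinical-datastore",   # hypothetical name
    DatastoreTypeVersion="R4",
)

# Bulk-import source records from S3; HealthLake normalizes and tags them on ingest
healthlake.start_fhir_import_job(
    DatastoreId=datastore["DatastoreId"],
    InputDataConfig={"S3Uri": "s3://example-bucket/raw-fhir/"},      # placeholder
    DataAccessRoleArn="arn:aws:iam::123456789012:role/ExampleHealthLakeRole",
)
```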

What’s in it for the healthcare and life sciences industry?

This is definitely a positive step for AWS to showcase an industry-specific solution for its clients and prospects. Amazon HealthLake provides a contextualized solution for addressing some critical challenges the healthcare and life sciences industry is facing, namely working with siloed, unstructured, and incomplete data stored across multiple systems, lab reports, medical images, insurance claims, and time-series data (for example, heart ECG or brain EEG traces). Putting data at the center of the business allows the development of innovative products and services and provides the opportunity to revolutionize business models. AWS’s approach enables healthcare industry professionals to focus on mission-critical activities while it manages the data complexity.

Some of the key use cases for HealthLake include:

  • Payers – HealthLake will help health insurance companies predict more accurate insurance premiums, design data-driven insurance policies, and carry out effective claims management by bringing together a complete view of a patient’s medical history.
  • Providers – HealthLake can integrate with other AWS machine learning and analytics services, like Amazon SageMaker and QuickSight, to improve efficiency and reduce hospital waste. Some of the core use cases include population health management, clinical decision support, revenue cycle management, scheduling optimization, reducing unnecessary procedures, and addressing privacy and security requirements.
  • Pharma/Biotech – Clinical researchers are struggling with ever-increasing volumes of data from trial sites, patients, CROs, and other vendors, as well as from newer resources like EHR and wearable technology. HealthLake can help life sciences enterprises revolutionize data-driven R&D, advance clinical research with predictive analytics, and enhance pharmacovigilance.

HealthLake fits perfectly in Amazon’s data-hinged healthcare strategy

Amazon is aiming to transform healthcare by putting in place a well-developed range of integrated technology solutions supported by a second-to-none data asset, similar to its disruptive approach in the retail industry with low costs, high customer convenience, and a great recommendation engine. A key prong of its strategy is healthcare cloud computing, as payers, providers, and life sciences enterprises adopt more cloud computing services to stay on top of the rising volume of patient data.

Amazon HealthLake is another sign that AWS views healthcare as an industry with massive growth potential for its cloud services. AWS has been steadily rolling out HIPAA-eligible computing tools over the past few years in a race with Google Cloud and Microsoft Azure as the industry cloud war intensifies in the nascent healthcare cloud computing space. In 2019, the company announced Amazon Transcribe Medical, a voice transcription service for physicians that inputs text directly into medical records. In 2018, it introduced Amazon Comprehend Medical, a service that uses AI to mine medical records for information that can be used to improve patient treatment and reduce costs.

While AWS’s long-term strategy for healthcare is anyone’s guess right now, it will certainly be an interesting player to watch as well as an exciting one to partner with and compete against.

Next-generation Security Operations Centers | Blog

The rapid pace of digitalization has increased enterprise exposure to a diverse and evolved range of cyberattacks. However, many enterprises make security an afterthought rather than a part of their digital transformation journeys. While they’ve always had a daunting task to make their businesses resilient, the COVID-19 pandemic has only added to their woes. A global shift toward remote working and the sudden expansion of the enterprise perimeter have contributed immensely to these challenges.

Here’s a quick snapshot of some high-level security-related challenges that enterprises will continue to face in 2021:

Exhibit: Security-related challenges that enterprises will continue to face in 2021

To overcome these challenges, which are associated with speed and scalability of security services delivery, enterprises rely on security operations centers (SOCs) to monitor systems and defend against breaches. As the frequency and severity of breaches continue to rise, traditional SOCs and Security Information and Event Management (SIEM) systems based on signatures and rule-based automation are quickly becoming obsolete, as they make it immensely difficult for security analysts to stay on top of internal and external threat-related data.

Consequently, SOCs need to transition to an “Aware” state that is underpinned by cognitive capabilities that help detect, prevent, and resolve incidents at scale to keep pace with evolving adversaries.

What is an Aware SOC?

Simply put, an Aware SOC is underpinned by next-generation SIEM and cognitive technologies – AI and ML along with decision automation – to deliver intelligent security operations. The Aware SOC is built on a single platform that seamlessly integrates solutions from multiple vendors to augment existing capabilities. Designed to secure distributed enterprise architecture, an Aware SOC brings together the best of human + machine capabilities to help enterprises fight against the rising tide of sophisticated cyberattacks.
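
To illustrate the “cognitive” piece in the simplest possible terms, the sketch below scores log-derived events with an unsupervised anomaly detector and routes suspicious ones to an analyst; the features, synthetic data, and threshold are assumptions for illustration, and a real Aware SOC would rely on the SIEM/SOAR platform’s own analytics and playbooks rather than a standalone script.

```python
# Illustrative sketch: ML-based anomaly scoring layered on top of rule-based
# detection, the kind of capability an Aware SOC adds to a traditional SIEM.
# Features and data are synthetic stand-ins, not a production pipeline.
import numpy as np
from sklearn.ensemble import IsolationForest

# Hypothetical features per event: bytes transferred, failed logins, off-hours flag
rng = np.random.default_rng(0)
baseline_events = rng.normal(loc=[500, 1, 0], scale=[100, 1, 0.1], size=(1000, 3))
detector = IsolationForest(contamination=0.01, random_state=0).fit(baseline_events)

new_events = np.array([
    [480, 0, 0],        # looks routine
    [50000, 25, 1],     # large transfer, many failed logins, off-hours
])
for event, score in zip(new_events, detector.decision_function(new_events)):
    action = "escalate to analyst" if score < 0 else "auto-close"
    print(event, "->", action)
```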

The table below shows how enterprises should think about an Aware SOC as an amalgamation of best-of-breed technology and talent:

Exhibit: An Aware SOC as an amalgamation of best-of-breed technology and talent

Security operations done right: Moving to a platform-driven Aware SOC

The pandemic has been a major change agent for enterprises, significantly impacting their security operations. To incorporate speed and scalability into their security operations, enterprises are now re-thinking their SOC architecture. The platform that an enterprise chooses for its security operations has become a pivotal element of its overall security infrastructure, serving as the de facto operating system for other point-based security tools. The shift to a platformized, cloud-first approach, underpinned by SaaS-based tools for monitoring, threat hunting, vulnerability assessment, and incident resolution, is expected to be the springboard of security transformation for medium and large enterprises.

Here’s our view of an architecture for a platform-driven Aware SOC:

Exhibit: Architecture for a platform-driven Aware SOC

Enterprises can find significant value in a platform-driven Aware SOC, which breaks systems down into building blocks and brings in modularity that allows them to scale and manage security controls across environments. The elements of the platform, spanning the data lake and network traffic analysis, also give enterprises enriched insights into their existing and to-be security estates.

Advantages of investing in a platform-driven Aware SOC

Investing in an Aware SOC is a highly strategic decision. Beyond economic benefits, a platform-driven Aware SOC delivers a number of other benefits, including speed, scalability, resiliency, and efficiency. The list below is not all-encompassing but rather a starting point for exploring the benefits of investing in a platform-driven Aware SOC:

  1. Automated security across the enterprise IT estate – ingest alerts across multiple environments and execute automated workflows/playbooks to speed up incident response
  2. Break team silos – playbooks for real-time collaboration capabilities that enable security teams to solve for existing and new threats and breaches
  3. Expedite incident investigations – standardized responses for high-volume attacks such as DDoS, while also helping security analysts adapt to sophisticated one-off attacks

Whether an enterprise is thinking of outsourcing security operations or bolstering them internally, it needs to future-proof its overall cybersecurity strategy. While charting the broader cybersecurity strategy, an enterprise needs to keep a firm sight on its short-, mid-, and long-term business goals. This is where a platform-driven Aware SOC can help. A platformized approach to Aware SOC that stitches the entire security fabric together will go a long way in ensuring that the enterprise’s cybersecurity strategy aligns with business goals such as speed, scalability, and resilience.

Follow this space for more blogs on cybersecurity. Meanwhile, please feel free to reach out to [email protected] and [email protected] to share your experiences and ask any questions you may have.

Data Literacy: An Idea Whose Time Has Come | Blog

The COVID-19 pandemic has accelerated organizations’ move from intuition-based to data-powered decision making. Why? Primarily because the sudden impact of the pandemic exposed blind spots in global supply chains, shuffled demand patterns, and significantly impacted the travel and entertainment industries. Organizations that had the capability to harness insights were able to respond with agility and resilience.

Moving from intuition-based to data-driven decision making

Data literacy is a key element of an organization’s ability to be insights-driven. We define data literacy as enabling an organization’s key decision makers to make data-driven decisions by providing them access to the tools and technologies needed to access and manipulate data, and by training them in how to know, speak, and argue with data.

Data literacy supports data/technology democratization by decentralizing data-driven decision making, empowering users to make faster, more reliable, and actionable decisions.

Data literacy remains in its infancy

Despite its transformative potential, industry commitment to, and spend on, data literacy remains low.


While the current market spend on data literacy is $530-560 million, only 10-15% goes to standalone data literacy engagements. A large percentage of data literacy initiatives are baked into broader data and analytics transformation programs, as data literacy is considered an adjunct to broader people change management processes.

Why is commitment to data literacy lacking?

While there is surely a lack of enterprise education related to data literacy, the larger challenge is for enterprises to provide the right data assets to the right people.

1. The right data – The right data refers to the relevant, high-quality data assets required to perform a specific business task. These assets could be internal or external, first-party or third-party. Organizations struggle to build a consolidated view and repository of their data assets through a centralized data management and governance strategy. Moreover, developing the ability to exploit external data for added insights and decision making remains low on most organizations’ radars. As a result, enterprises’ data assets mostly exist in silos and remain grossly underutilized. Finally, the lack of governance and processes around data quality hinders business users’ trust in insights generated with the existing data assets and technologies.

2. The right people – The right people refers to the business users who need to be empowered to make data-driven decisions. Enterprises generally struggle to find the right people who are trained to work with data and associated technologies such as AI and advanced analytics. The large demand/supply gap for key data and analytics skills further aggravates the problem. This shortage in talent supply is further exacerbated by low project readiness and poor domain understanding of the available talent pool.

Another key challenge is identity and access control to prevent data breaches and civil lawsuits. The prevalence of laws governing data – such as CCPA and GDPR – further impedes enterprises from going all-in on insights-driven decision making.

For enterprises to make data literacy actionable, they need to connect the right data to the right people supported by strong leadership commitment and a centralized governance structure.

Rewards associated with data literacy

Both technology companies and IT service providers have started to offer credible data literacy services. Technology vendors such as Qlik and Tableau provide self-service analytics solutions as well as targeted training programs, like Qlik’s, to address the demand-supply gap. IT service providers, including Accenture and TCS, recognizing the growing need for data literacy, offer solutions that support data/technology democratization.

Data literacy used across the organization leads to insightful decision making, which helps the organization become agile, resilient, and sustainable, increasing its competitive advantage in an aggressive and fast-changing global economy. Data-literate organizations can also sustain growth through normal shifts in the business landscape, as well as during black swan events such as a pandemic.

This blog is the first in a series exploring the core concept of data literacy and its importance for enterprises. If you have any questions about how data literacy can help your organization tackle complex situations, or if you would like to share how insights-driven decision making helped your organization work through a critical period, please write to us at [email protected] and [email protected].

It’s Time for IT Operating Model Transformation in the Medical Device Industry | Blog

As the world is working to contain the spread of COVID-19 through safety measures and vaccine development, the medical device industry also is starting to show signs of recovery from the pandemic’s impact. The decline in elective procedures hit medical device organizations’ revenue hard in the first half of 2020. But increasing demand for COVID-19 diagnostic tests is stabilizing growth and compensating for the negative impact on routine testing procedures. To support growth dynamics and ensure a stable recovery, initiatives focused on cost reduction and operating margin optimization are the top priorities for most medical device organizations. For example:

  • Stryker is looking at organization-wide cost transformation for better margin expansion.
  • Medtronic announced its restructuring plan to save up to US$475 million per year.
  • Baxter expects to achieve an operating margin of 23-24% by 2023 through cost optimization.

 How can IT operating model transformation help?

To achieve significant cost savings, medical device companies need to scale their digital transformations to reap maximum benefits, which mandates IT operating model transformation. Based on a recent survey we conducted with 200 CXOs, 80% of firms that adopt an IT operating model transformation believe they’re on their way to establishing market leadership in their respective industries, expanding to serve new market and customer segments, and achieving over one-and-a-half times more cost savings than those that haven’t initiated a similar transformation. An enterprise-wide IT operating model transformation could help medical device companies realize cost benefits and maintain a competitive edge in the market.

Yet, according to our 2020 research, nearly 78% of enterprises fail to implement their digital transformation as envisioned.

What are medical device companies doing wrong?

Most medical device companies that are transforming IT achieve success in running pilot projects but fail to realize the same impact across the breadth and depth of their enterprises, because they equate IT transformation with either technology overhaul or operating model transformation, instead of pursuing them together. Exclusively focusing on technology transformation fails because it is a siloed approach that relies too much on technology without integrating the necessary operating model changes. On the other hand, changes in only the operating model result in superficial restructuring of business units that are focused on driving tactical and incremental efficiencies, eventually leading to a plateau phase. Clearly, companies need to embrace a balanced focus on both technology and operating model transformation to scale and sustain the transformation benefits.

What’s the best approach to IT operating model transformation?

The exhibit below is a framework that outlines the strategy and solution elements that enable an IT operating model transformation. Medical device companies can leverage this framework as a blueprint to design a holistic, best-in-class approach across both strategy and solution design.

Everest Group’s Target IT Operating Model Framework


When creating the strategic roadmap, medical device enterprises should articulate a vision statement that ties together both business and IT objectives, and also carefully consider how to communicate the vision across the organization. Building an adaptable and agile organization that is resilient and collaborative is instrumental in this long journey, as 70% of enterprises believe organization structure is a barrier to scaling digital initiatives. Furthermore, effectively resourcing the IT operating model transformation and monitoring performance through cross-functional, outcome-based business metrics are keys to its success.

To complement the well-defined strategy with a strong solution framework, medical device companies should establish a strategic partner ecosystem with a fit-for-purpose portfolio of vendors. Traditional IT sourcing partnerships with a cost-centric model should evolve to a value-adding, strategic model focused on driving innovation and enabling business value. To truly scale the digital transformation and break up siloed investments, they should identify strategic partners that will equip them with the right talent, and scale the adoption of relevant next-generation technologies, so that – together – they can drive innovation with a common objective of achieving the defined vision.

As medical device organizations embark on their IT operating model transformations, we suggest a three-stage approach:

  • Lay out the building blocks by painting an organization-wide vision and introducing the agile way of working by building integrated multidisciplinary teams.
  • Enable the IT organization with the right solutions to become a business value orchestrator.
  • Scale the impact by synchronizing effects across solutions and constantly monitoring progress through well-defined business metrics.

Please share your views on IT operating model transformation with us at [email protected] and [email protected].

BioClinica and ERT to Merge: Perspectives on Potential Synergies | Blog

On December 10, 2020, ERT, a clinical end-point data solutions company, announced its merger with BioClinica, a clinical trial management and imaging solutions company. The goal of the resulting enterprise will be to integrate the best of both worlds – ERT’s expertise in electronic Clinical Outcomes Assessment (eCOA), therapeutic expertise in cardiac safety and respiratory, and clinical endpoint measurement through wearables, with BioClinica’s expertise in imaging and clinical trial management solutions. The merger will equip the combined company to deliver data analytics, insights, business intelligence, virtual patient visits, and technology solutions to its clients.

In analyzing this development, we’ve taken a look at the hottest topics in the life sciences industry right now – decentralized and virtual clinical trials.

Virtual clinical trials – a revolution catalyzed by the pandemic

A virtual clinical trial is one in which certain parts of the clinical trial are conducted outside a clinical site, such as patient consent capture, trial data capture, or patient monitoring through sensors or wearables. The benefits to the pharmaceutical company include cost savings, better patient recruitment and retention, and improved data quality.

Earlier this year, we published a blog predicting that the 2020s would be the decade of virtual trials. It seems we were way off the mark – by about nine years. The year 2020 has already seen its fair share of virtual trials, as clinical trials that were put on pause due to lockdown restrictions were rescued by being converted to fully virtual or hybrid trials, such as cases in which clinical experts visited patients at their residences to collect vitals or samples, reducing delays.

#NoGoingBack

The virtual trial momentum isn’t temporary, and there’s increasing focus on virtual trials, even among investors. Moreover, many in the industry have pledged to preserve the progress they’ve made in clinical research due to the pandemic, including virtual trials.

On the same day the news of the merger was announced, the Decentralized Trials & Research Alliance (DTRA) was formed to unite stakeholders with a mission to make clinical trial participation widely accessible by advancing policies, research practices, and new technologies in decentralized patient-focused clinical research. Companies that are part of this alliance include technology vendors such as Medidata Solutions and Oracle Health Sciences, pharma companies such as Pfizer and Roche, CROs such as Parexel and Syneos Health, and others such as Amazon and the US Food and Drug Administration.

M&A and investment activity has increased, too. For example, Medable and Science 37 each received funding during the pandemic to advance their virtual trial offerings. And in November 2020, VirTrial, a telehealth platform for managing decentralized and virtual clinical trials, was acquired by Signant Health, a Clinical Trial Management System (CTMS) vendor, thus augmenting its virtual trial capabilities.

Clearly, virtual trials are a ripe area for M&A and investment activity given their disruptive capabilities and benefits. And we continue to expect more acquisitions, funding, and collaboration in this space in the near future.

What this all means for the merger

Our recently concluded PEAK Matrix assessment on clinical development platforms pointed out that BioClinica’s Cloud platform for clinical development does not have the capability to support virtual trials; we said it needed to invest in remote monitoring and eCOA capabilities to deliver on them. However, the solution does have a broad set of capabilities across the clinical, regulatory, and safety value chains.

As a result of the merger, however, BioClinica will be able to offer virtual trial capabilities to clients. ERT is one of the leading eCOA providers and through its wearable and sensor data capture capabilities, it is well positioned to conduct virtual trials in certain therapy areas. And it will be able to use the BioClinica Cloud offering to give clients a holistic clinical development experience, a win-win-win for ERT, BioClinica, and their clients.

Exhibit 1 shows the combined solution landscape.

Exhibit 1: The merger synergies


The merged entity will be able to showcase an end-to-end clinical development platform with enabling layers for conducting virtual trials. This move is definitely in the right direction, at the most opportune time, and is just another sign of increasing interest in decentralized and virtual trials.

What are your views on this merger? Let us know your thoughts at [email protected] and [email protected].

Uncleared Margin Rules (UMR) as a Catalyst for Change – from Spreadsheets to Digital Compliance Driven by Data and Cloud | Blog

In the wake of the 2008 financial crisis, leaders at the G20 summit laid out the Uncleared Margin Rules (UMR) as part of the financial regulatory reform agenda. The goal of these rules was to increase transparency and reduce the credit risk posed by major participants in the Over the Counter (OTC) derivatives market. UMR introduced fully bilateral Initial Margin (IM) rules, based on theoretical loss, to protect one party against the other party’s default.

The UMR have been rolled out in phases since 2016, and approximately 60 of the largest firms (by assets under management) currently comply with IM rules. Before the onset of COVID-19, 200+ firms were expected to come under the rules’ purview by September 1, 2020, but regulators pushed the timelines to help pandemic-impacted firms focus their resources on managing risks associated with market volatility.

An estimated 1,100 counterparties are expected to come under the combined purview of Phase 5 and Phase 6, which will be rolled out in September 2021 and September 2022, respectively. It is thus inevitable that the significant increase in Newly In-Scope Counterparties (NISCs) will create overwhelming demand on market resources across participants and service providers. To address this demand rush, significant operational and technology-led solutions must be implemented, and most firms plan to engage with external partners to reduce the burden of the additional contractual agreements that must be put in place.

The new rules involve the following major changes across operational processes and legal agreements:

  • Both swap dealers and funds will be required to exchange IM with one another
  • IM must now rest with third-party custodians

If these changes are not made in a timely manner, NISCs will not be able to trade in non-centrally cleared derivatives, limiting their options for both taking on and hedging risks and potentially impacting liquidity in the derivatives markets.

Time for change – fighting the legacy

Firms have historically relied on spreadsheets and siloed legacy technology systems to assess their collateral needs, access valuations, and communicate them to counterparties – a cumbersome method that makes the task of UMR compliance all the more difficult.

Though the rules only apply to new transactions, they may, in fact, create multiple workflows for monitoring both new and legacy transactions. Beyond operational updates, firms will need to negotiate and enter into new legal agreements and modify existing ones. In some cases, they may need to alter their trading strategies and operations to mitigate or steer clear of the rules by using portfolio compression or by simply reducing their use of uncleared products.

Thus, to avoid getting caught in a regulatory bottleneck, firms must act now to:

  1. Determine whether the rules apply to them by calculating the Aggregate Average Notional Amount (AANA) of non-cleared derivatives (see the illustrative sketch after this list)
  2. Identify their IM requirements
  3. Set up a data infrastructure for enhanced transparency and analysis
  4. Choose service providers in the areas of custody, monitoring, and legal services
  5. Create a modern architecture and digital roadmap for UMR or adopt technologies from third-party technology vendors that can be integrated easily into a wide range of asset classes that require IM calculations
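
To make the first step tangible, here is a simple, purely illustrative calculation of AANA as the average of month-end aggregate gross notional amounts over an observation window; the positions, months, and the US$8 billion Phase 6 threshold shown are assumptions for illustration, and firms should confirm the observation window and thresholds that apply in their jurisdiction.

```python
# Hedged sketch: estimating Aggregate Average Notional Amount (AANA) for
# non-cleared derivatives. Positions, observation months, and the threshold are
# illustrative; jurisdictions differ on both the averaging window and the limits.
from collections import defaultdict

# Hypothetical month-end gross notionals (in US$) per non-cleared position
positions = [
    {"trade_id": "FX-001",  "month": "2021-03", "notional": 2_500_000_000},
    {"trade_id": "IRS-002", "month": "2021-03", "notional": 4_000_000_000},
    {"trade_id": "FX-001",  "month": "2021-04", "notional": 2_600_000_000},
    {"trade_id": "IRS-002", "month": "2021-04", "notional": 3_900_000_000},
    {"trade_id": "FX-001",  "month": "2021-05", "notional": 2_550_000_000},
    {"trade_id": "IRS-002", "month": "2021-05", "notional": 4_100_000_000},
]

monthly_totals = defaultdict(float)
for position in positions:
    monthly_totals[position["month"]] += position["notional"]  # aggregate per month

aana = sum(monthly_totals.values()) / len(monthly_totals)       # average over window
PHASE_6_THRESHOLD = 8_000_000_000  # illustrative US$8 billion Phase 6 threshold

print(f"AANA: US${aana:,.0f}")
print("Likely in scope for UMR" if aana > PHASE_6_THRESHOLD else "Below threshold; keep monitoring")
```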

Engineering and system integration complexity is bound to increase with legacy systems (which need to be modernized) and the operational changes needed to meet the regulatory guidelines. Thus, firms need to choose the right set of technology vendors and system integration and consulting partners to support them on their compliance journeys. In fact, even firms that do not cross the US$50 million IM threshold will need systems to monitor their IM thresholds regularly, thereby creating a market for cost-effective technology solutions.

Several technology vendors are increasingly building a strong data and cloud technology infrastructure and value-added digital technologies, such as cognitive technologies and interactive visualization, to help optimize costs and better comply with the rapidly changing regulatory landscape. For example, Finastra and CloudMargin have partnered to deliver an integrated collateral and margin management solution to enterprises of all sizes through a SaaS model, facilitating end-to-end straight-through processing of derivatives transactions and all associated collateral management workflows, from trade booking through settlement.

RegTechs providing a helping hand

UMR Technology and Services Vendor Landscape

AcadiaSoft is leading the way in the regulatory technology market with its UMR Collateral Suite and extensive partnerships with technology and data vendors, such as Bloomberg, Cassini Systems, Capco, Calypso Technology, HazelTree, IHS Markit, Murex, and TriOptima, to support organizations in their UMR compliance journeys. AcadiaSoft and TriOptima have partnered for a Phase 5 soft launch aimed at avoiding a compliance crunch near the deadline. More than 30 firms falling under IM Phase 5 have successfully joined the initiative, while another 25 are scheduled to join before the end of 2020.

IT service providers can tap into such opportunities by collaborating with technology vendors to help create a packaged solution, providing the much-needed implementation and deployment support layered with domain advisory capabilities. A notable case in point is the launch of Wipro’s Standard Initial Margin Method (SIMM) in a box solution in collaboration with Quaternion Risk Management.

Embracing the change

New workflows and requirements are set to be introduced as organizations embark on the journey to become UMR compliant. Rather than considering UMR as an additional regulatory burden, firms should leverage this opportunity to reevaluate and reimagine their existing workstreams and use UMR as a catalyst for change to holistically automate and streamline their collateral management.

If you’d like to share your observations or questions on the fast-evolving technology and services landscape for UMR compliance solutions, please reach out to [email protected], [email protected], and [email protected].
