
Recharge Your AI initiatives with MLOps: Start Experimenting Now | Blog

In this era of industrialization of Artificial Intelligence (AI), enterprises are scrambling to embed AI across a plethora of use cases in hopes of achieving higher productivity and enhanced experiences. However, as AI permeates different functions of an enterprise, managing the entire charter gets tough. Working with multiple Machine Learning (ML) models in both pilot and production can lead to chaos, stretched timelines to market, and stale models. As a result, we see enterprises hamstrung in their efforts to scale AI enterprise-wide.

MLOps to the rescue

To overcome the challenges enterprises face in their ML journeys and ensure successful industrialization of AI, enterprises need to shift from the current method of model management to a faster and more agile format. An ideal solution that is emerging is MLOps – a confluence of ML and information technology operations based on the concept of DevOps.

According to our recently published primer on MLOps, Successfully Scaling Artificial Intelligence – Machine Learning Operations (MLOps), this set of practices is aimed at streamlining ML lifecycle management through enhanced collaboration between data scientists and operations teams. This close partnering accelerates the pace of model development and deployment and helps in managing the entire ML lifecycle.


MLOps is modeled on the principles and practices of DevOps. While continuous integration (CI) and continuous delivery (CD) are common to both, MLOps introduces the following two unique concepts:

  • Continuous Training (CT): Seeks to automatically and continuously retrain ML models based on incoming data
  • Continuous Monitoring (CM): Aims to monitor the performance of the model in terms of its accuracy and drift
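
To make the CT and CM concepts above concrete, here is a minimal, illustrative Python sketch rather than any particular platform's implementation; the model choice, thresholds, and data inputs are assumptions for illustration. It monitors a deployed model's accuracy and feature drift and retrains when either check fails.

```python
# Minimal sketch of CT/CM: monitor accuracy and drift, retrain when thresholds are breached.
# The model, thresholds, and data arrays are illustrative placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from scipy.stats import ks_2samp

ACCURACY_FLOOR = 0.85   # minimum acceptable accuracy before retraining is triggered
DRIFT_P_VALUE = 0.01    # KS-test p-value below which a feature is flagged as drifted

def monitor(model, X_ref, X_new, y_new):
    """Continuous Monitoring: check accuracy on fresh labels and drift on features."""
    accuracy = accuracy_score(y_new, model.predict(X_new))
    # Kolmogorov-Smirnov test per feature against the training (reference) distribution
    drifted = any(
        ks_2samp(X_ref[:, i], X_new[:, i]).pvalue < DRIFT_P_VALUE
        for i in range(X_ref.shape[1])
    )
    return accuracy, drifted

def continuous_training(model, X_ref, y_ref, X_new, y_new):
    """Continuous Training: retrain on combined data when monitoring flags a problem."""
    accuracy, drifted = monitor(model, X_ref, X_new, y_new)
    if accuracy < ACCURACY_FLOOR or drifted:
        X_all = np.vstack([X_ref, X_new])
        y_all = np.concatenate([y_ref, y_new])
        model = LogisticRegression(max_iter=1000).fit(X_all, y_all)
    return model
```

In a production setting, this kind of check would be scheduled or triggered by an orchestration tool rather than run ad hoc, but the logic of monitoring feeding retraining is the same.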

We are witnessing MLOps gaining momentum in the ecosystem, with hyperscalers developing dedicated solutions for comprehensive machine learning management to fast-track and simplify the entire process. Just recently, Google launched Vertex AI, a managed AI platform, which aims to solve these precise problems in the form of an end-to-end MLOps solution.

Advantages of using MLOps

MLOps bolsters the scaling of ML models by using a centralized system that assists in logging and tracking the metrics required to maintain thousands of models. Additionally, it helps create repeatable workflows to easily deploy these models.

Below are a few additional benefits of employing MLOps within your enterprise:


  • Repeatable workflows: Saves time and allows data scientists to focus on model building because of the automated workflows for training, testing, and deployment that MLOps provides. It also aids in creating reproducible ML workflows that accelerate moving models into production
  • Better governance and regulatory compliance: Simplifies the process of tracking changes made to the model to ensure compliance with regulatory norms for particular industries or geographies
  • Improved model health: Helps continuously monitor ML models across different parameters such as accuracy, fairness, bias, and drift to keep the models in check and ensure they meet thresholds
  • Sustained model relevance and RoI: Keeps the model relevant through regular retraining on new incoming data, helping it stay up to date and deliver a sustained Return on Investment (RoI)
  • Increased experimentation: Spurs experimentation by tracking multiple versions of models trained with different configurations, leading to improved variations
  • Trigger-based automated re-training: Helps set up automated re-training of the model based on fresh batches of data or certain triggers such as performance degradation, plateauing or significant drift

Starting your journey with MLOps

Implementing MLOps is complex because it requires a multi-functional and cross-team effort across the key elements of people, process, tools/platforms, and strategy underpinned by rigorous change management.

As enterprises embark on their MLOps journey, here are a few key best practices to pave the way for a smooth transition:

  • Build a cross-functional team – Engage team members from the data science, operations, and business front with clearly defined roles to work collaboratively towards a single goal
  • Establish common objectives – Set common goals for the cross-functional team to cohesively work toward, realizing that each of the teams that form an MLOps pod may have different and competing objectives
  • Construct a modular pipeline – Take a modular approach instead of a monolithic one when building MLOps pipelines since the components built need to be reusable, composable, and shareable across multiple ML pipelines
  • Select the right tools and platform – Choose from a plethora of tools that cater to one or more functions (management, modeling, deployment, and monitoring) or from platforms that cater to the end-to-end MLOps value chain
  • Set baselines for monitoring – In addition to monitoring ML systems, establish baselines that trigger automated execution of particular actions to increase efficiency and ensure model health
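
To illustrate the modular pipeline practice above, here is a small, hypothetical Python sketch; the step names, the shared-context pattern, and the placeholder logic are assumptions, not a prescribed design. Each step is a small, reusable function, so the same components can be recombined across training and retraining pipelines.

```python
# Illustrative sketch of a modular pipeline: each step is a small, reusable function
# that passes a shared context dictionary along. Step names and contents are assumptions.
from typing import Callable, Dict, List

Step = Callable[[Dict], Dict]

def ingest(ctx: Dict) -> Dict:
    ctx["raw_data"] = [0.2, 0.5, 0.9]          # placeholder: load from a feature store or warehouse
    return ctx

def validate(ctx: Dict) -> Dict:
    ctx["data_ok"] = len(ctx.get("raw_data", [])) > 0   # baseline data check before training
    return ctx

def train(ctx: Dict) -> Dict:
    if ctx["data_ok"]:
        ctx["model"] = sum(ctx["raw_data"]) / len(ctx["raw_data"])  # stand-in for a real model fit
    return ctx

def run_pipeline(steps: List[Step]) -> Dict:
    ctx: Dict = {}
    for step in steps:
        ctx = step(ctx)                         # steps are composable and individually testable
    return ctx

# The same validate/train steps can be reused in a retraining pipeline triggered by monitoring.
print(run_pipeline([ingest, validate, train]))
```

In practice, an orchestration tool would schedule and track these steps, but the composability principle carries over.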

When embarking on the MLOps journey, there is no one-size-fits-all approach. Enterprises need to assess their goals, examine their current ML tooling and talent, and also factor in the available time and resources to arrive at an MLOps strategy that best suits their needs.

For ML to keep pace with the agility of modern business, enterprises need to start experimenting with MLOps now.

Are you looking to scale AI within your enterprise with the help of MLOps? Please share your thoughts with us at [email protected].

Democratization of Automation, Post-pandemic | Webinar

Everest Group’s Anil Vijayan and Amardeep Modi will join experts from Microsoft and HCL Technologies to explore how — with the right set of enabling initiatives — enterprises can help enhance the adoption of automation through their business user community, and exponentially scale their automation outcomes.

They will discuss:

  • The key impacts of COVID-19
  • How the automation ecosystem has expanded beyond RPA
  • Methodologies to achieve broader transformation outcomes through automation
  • How organizations are enabling citizen developers to achieve broader automation goals

When

Join us Wednesday, May 26, 2021, at 11 am CST, 12 pm EST, 4 pm GMT, 9:30 pm IST

Where

Live, virtual event

Presenters

Amardeep Modi
Practice Director, Everest Group

Anil Vijayan
Vice President, Everest Group

Joy Trajano
Director, Marketing Operations, Microsoft

Ashvini Sharma
Group Program Manager, Microsoft

Jason Skelton
Sr. Group Marketing Tech Manager, Microsoft

Siddharth Gandhi
Director, Digital Process Operations, HCL Technologies

Rachit Chawla
Director, Digital Process Operations, HCL Technologies

 

Internet of Things Will Connect the Supply Chain in the “Next Normal” | Blog

Imagine a utopia where minimum human intervention is needed to run an entire shop floor. In this world, manufacturers have total control and visibility of all products, machines predict equipment failures and correct them, shelves count inventory, and customers check themselves out. While such a supply chain model seems improbable and far into the future, the likes of Amazon, Walmart, and Toyota are already on their way to achieving this vision. At the center of the supply chain initiatives making this possible is the Internet of Things (IoT).

The supply chain is considered the backbone of a successful enterprise. However, firms find it increasingly challenging to establish a robust supply chain model. The disruptions caused by COVID-19 have made matters worse as ‘disconnected enterprises’ struggle to gain complete supply chain visibility. The pandemic has established that supply chain disruptions and uncertainties will become more frequent going forward.

Supply chain challenges

The current supply chain landscape faces numerous challenges that need to be addressed. These issues are illustrated below:

Challenges in Current Supply chain

Future-proofing the supply chain using IoT

As enterprises strive to develop a resilient supply chain, IoT will occupy the center stage. An interconnected supply chain will bring together suppliers/vendors, logistic providers, manufacturers, wholesalers and retailers, and customers dispersed by geography. The technology ensures improved efficiency, better risk management, end-to-end visibility, and enhanced stakeholder experience.

A seamlessly connected supply chain provides advantages at every stage of the value chain for each of the stakeholders. The exhibit below showcases a connected supply chain ecosystem:

Connected ecosystem for supply chain

Let’s look at how some companies are capturing the benefits of IoT:

  • Real-time location tracking

Using real-time data (captured from GPS coordinates) tracking the movement of raw materials/finished goods, IoT technology aids firms in determining where and when products get delayed. This helps managers ensure route optimization and better plan the delivery schedule. IoT, in combination with blockchain, helps secure the products against fraud. For example, Novo Surgical leverages IoT for optimally tracking and tracing its ‘smart surgical instruments.’ This has reduced errors, decreased surgical instrument loss, increased visibility and efficiency, and improved forecasting of demand for the firm.

  • Equipment monitoring

Sensors on machines constantly collect information on the functioning of the machine, enabling managers to monitor them in real time. By analyzing parameters such as machine temperature and vibration, manufacturers can better predict machine downtime and take necessary actions to mitigate it (a simplified sketch of such a threshold check follows this list). For instance, Toyota partnered with Hitachi to leverage the vendor’s IoT platform and use the data collected to reduce unexpected machine failures and improve the reliability and efficiency of equipment.

  • Smart inventory management

IoT sensors in the warehouse assist in tracking the movement of individual items, providing an efficient way to monitor inventory levels and prevent pilferage. Smart shelves contain weight sensors that monitor the product weight to determine when products are out of stock. Walmart has been leveraging smart shelves in its retail stores to manage its products more efficiently and improve the shopping experience.

  • Warehouse management

IoT sensors can monitor and adjust warehouse parameters such as humidity, temperature, and pressure to avoid spoilage of items. Leading e-commerce players like Amazon and Alibaba have been pioneers in leveraging IoT to optimize warehouse management.
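
As a simple illustration of the equipment-monitoring use case above, the sketch below flags machines whose sensor readings breach configured thresholds. The machine identifiers, readings, and threshold values are hypothetical; a real deployment would stream data from an IoT platform and feed a predictive-maintenance model.

```python
# Hypothetical sketch of threshold-based equipment monitoring on IoT sensor readings.
# Machine IDs, readings, and thresholds are made up for illustration.
TEMP_LIMIT_C = 85.0         # assumed safe operating temperature for this machine class
VIBRATION_LIMIT_MM_S = 7.1  # assumed vibration velocity limit

readings = [
    {"machine_id": "press-01", "temperature_c": 78.2, "vibration_mm_s": 4.3},
    {"machine_id": "press-02", "temperature_c": 91.5, "vibration_mm_s": 8.0},
]

def flag_anomalies(readings):
    """Return readings that breach the configured temperature or vibration thresholds."""
    return [
        r for r in readings
        if r["temperature_c"] > TEMP_LIMIT_C or r["vibration_mm_s"] > VIBRATION_LIMIT_MM_S
    ]

for reading in flag_anomalies(readings):
    # In practice this would raise a maintenance work order or feed a predictive-maintenance model.
    print(f"ALERT: {reading['machine_id']} outside normal operating range: {reading}")
```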

Charting the journey for a connected supply chain

As enterprises aim to future-proof their supply chain, they will need a structured path following these five steps below:

  1. Develop a business case: Enterprises need to determine the current gaps in their supply chain and identify the extent of digitization of their supply chain to develop the business case for a connected supply chain.
  2. Secure buy-in from supply partners: Successful implementation of IoT in the supply chain requires the various partners to collaborate and adopt the technology together. Securing a buy-in from each member of the value chain – vendors/suppliers, OEM players, logistics operators, and retailers – is imperative for firms to realize the complete benefits. Compatibility of the technology platforms leveraged by the various supply partners is essential to develop a seamless supply chain.
  3. Invest in security: Invest in security and data protection initiatives early on to avoid supply chain breaches. Performing regular security and vulnerability assessments across the value chain and investing in next-generation technology-based security solutions is essential.
  4. Leverage other technologies: While IoT has a plethora of benefits across the supply chain, consider leveraging next-generation technologies such as blockchain, artificial intelligence, and edge computing in confluence with IoT to further enhance the capabilities of the use cases.
  5. Partner for implementation: To overcome concerns around skills and address data reconciliation challenges, consider partnering with IoT providers with expertise in the supply chain arena. Service/solution providers also are instrumental in bringing a security layer that can aid in addressing data security concerns and governance issues.

Since IoT is an interplay of multiple devices and machines, a successful IoT implementation requires firms to invest in sensors, cloud/edge infrastructure, IoT connectivity networks, data management and analytics solutions, and application development and management. Enterprises can accelerate their IoT supply chain journeys by partnering with solution providers with strong expertise in IoT products and services capabilities in the supply chain arena.

Are you embarking on your connected supply chain journey? Please share your thoughts and experiences with us at [email protected] and [email protected].

Democratization: Artificial Intelligence for the People and by the People | Blog

Enterprises have identified Artificial Intelligence (AI) as a quintessential enabling technology in the success of digital transformation initiatives to further increase operational efficiency, improve employee productivity, and deliver enhanced stakeholder experience. According to our recent research, Artificial Intelligence (AI) Services – State of the Market Report 2021 | Scale the AI Summit Through Democratization, more than 72 percent of enterprises have embarked on their AI journey. This increased AI adoption is leading us into an era of industrialization of AI.

Talent is the biggest impediment in scaling AI

Enterprises face many challenges in their AI adoption journey, including rising concerns of privacy, lack of proven Return on Investment (RoI) for AI initiatives, increasing need for explainability, and an extreme crunch for skilled AI talent. According to Everest Group’s recent assessment of the AI services market, 43 percent of the enterprises identified limited availability of skilled, mature, and niche AI talent as one of the biggest challenges they face in scaling their AI initiatives.

Lack of skilled AI talent

Enterprises face this talent crunch in using both the open-source ecosystem and hyperscalers’ AI platforms for the reasons below:

  • High demand for open-source machine learning libraries such as TensorFlow, scikit-learn, and Keras due to their ability to let users leverage transfer learning
  • Low project readiness of certified talent across platforms such as SAP Leonardo, Salesforce Einstein, Amazon SageMaker, Azure Machine Learning, and Microsoft Cognitive Services due to lack of domain knowledge and industry contextualization

As per our research in the talent readiness assessment, a large demand-supply gap exists for AI technologies (approximately 25 to 30 percent), hindering an enterprise’s ability to attract and retain talent.

In addition to this technical talent crunch, another aspect where enterprises struggle to find the right talent is the non-technical facet of AI that includes roles such as AI ethicists, behavioral scientists, and cognitive linguists.

As more enterprises adopt AI, the talent challenge is exacerbated even as demand for AI technologies skyrockets. There is an ongoing tussle for AI talent among academia, big tech, and enterprises, and, so far, big tech is coming out on top. These companies have successfully recruited large amounts of AI talent, leaving a shrinking pool for the rest of the enterprises to fish in.

Democratization to overcome the talent problem

We see democratization as a potential solution to overcome this expanding talent gap. As we define it, democratization is primarily concerned with making AI accessible to a wider set of users, specifically non-technical business users. The principle behind the concept of democratization is “AI for all.”

Democratization has to do with educating business users in the basic concepts of data and AI and giving them access to the data and tools that can help build a larger database of AI use-cases, develop insights, and find AI-based solutions to their problems.

Enterprises can leverage Everest Group’s four-step democratization framework to help address talent gaps within the enterprise and empower its employees. Here are the steps to guide a successful democratization initiative:

  • Data democratization: The first step of AI democratization is enabling data access for business users throughout the organization. This helps familiarize them with the data structures and enables them to interpret and analyze the data
  • Data and AI literacy: The next step is embracing initiatives to help business users build general knowledge of AI, understand the implications of AI systems, and successfully interact with them
  • Self-service low-code/no-code tools: Organizations should also invest in tools that provide pre-built components and building blocks in a drag and drop fashion to help business users deploy ML models without having to write extensive code
  • Automation-enabled machine learning (ML): Lastly, enterprises should use automated machine learning (AutoML) to automate some or all of the steps in the model training workflow, such as feature engineering, feature selection, algorithm selection, and hyperparameter optimization (a simple sketch of this automation follows this list)
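
As a small illustration of the automation-enabled ML step above, the sketch below uses scikit-learn's grid search to automate hyperparameter optimization and algorithm selection. It is not a full AutoML product, and the dataset, candidate models, and parameter grids are assumptions chosen for illustration.

```python
# Not a full AutoML platform, but a sketch of the automation it provides: trying multiple
# algorithms and hyperparameters automatically and keeping the best-performing one.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

candidates = [
    (LogisticRegression(max_iter=5000), {"C": [0.1, 1.0, 10.0]}),
    (RandomForestClassifier(random_state=0), {"n_estimators": [100, 300], "max_depth": [None, 10]}),
]

best_score, best_model = 0.0, None
for estimator, grid in candidates:
    search = GridSearchCV(estimator, grid, cv=5)    # automated hyperparameter optimization
    search.fit(X_train, y_train)
    if search.best_score_ > best_score:             # automated algorithm selection
        best_score, best_model = search.best_score_, search.best_estimator_

print(f"Selected {type(best_model).__name__} with CV accuracy {best_score:.3f}")
print(f"Hold-out accuracy: {best_model.score(X_test, y_test):.3f}")
```

Low-code/no-code tools wrap this kind of search behind a visual interface so business users never touch the code, but the underlying automation is similar.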

By following these steps, democratization can help reduce the barriers to entry for AI experimentation, accelerate enterprise-wide adoption, and speed up in-house innovation, among other benefits.

Current state of democratization

The industry as a whole is now in the initial stages of AI democratization, which is heavily focused on data and AI literacy initiatives. Some of the more technologically advanced or well-versed enterprises have been early adopters. The exhibit below presents the current market adoption of the four key elements of democratization and a few industry examples:

current AI adoption

Democratization is essential

As part of their democratization efforts, enterprises must also focus on contextualization, change management, and governance to ensure responsible and successful democratization.

By doing this, companies will not only help solve the persistent AI talent crunch but also ensure faster time to market, empower business users, and increase employee productivity. Hence, democratization is an essential step to ensuring the sustainable, inclusive, and responsible adoption of AI.

What have you experienced in your democratization journey? Please share your thoughts with us at [email protected] or at [email protected].

Microsoft Goes All in on Industry Cloud and AI with $20 Billion Nuance Deal | Blog

Yesterday’s announcement of Microsoft’s acquisition of Nuance Communications signifies the big tech company’s serious intentions in the US healthcare market.

We’ve been writing about industry cloud and verticalization plays of big technology companies (nicknamed BigTech) for a while now. With the planned acquisition of Nuance Communications for US$19.7 billion, Microsoft has made its most definitive step in the healthcare and verticalization journey.

At a base level, what matters to Microsoft is that Nuance focuses on conversational AI. Over the years, it has become quite the phenomenon among physicians and healthcare providers – 77 percent of US hospitals are Nuance clients. Also, it is not just a healthcare standout – Nuance counts 85 percent of Fortune 100 organizations as customers. Among Nuance’s claims to fame in conversational AI is the fact that it powered the speech recognition engine in Apple’s Siri.

Why Did Microsoft Acquire Nuance?

The acquisition is attractive to Microsoft for the following reasons:

  1. Buy versus build: If Microsoft (under Satya Nadella) can trust itself to build a capability swiftly, it will never buy. Last year, when we wrote about Salesforce’s acquisition of Slack, we highlighted how Microsoft pulled out of its intent to acquire Slack in 2016 and launched Teams within a year. Could Microsoft have built and scaled a speech recognition AI offering?
  2. Conversational AI: Microsoft’s big three competitors – Amazon, Apple, and Google – have a significant head start in speech recognition, the only form of AI that has gone mainstream and is likely to be a US$30 billion market by 2025. Clearly, with mature competition, this was not going to be as easy as “Alexa! Cut slack, build Teams” for Nadella
  3. Healthcare: This is another battleground for which Microsoft has been building up an arsenal. As the US continues to expand on its $3 trillion spend on healthcare, Microsoft wants a share of this sizeable market. That is why it makes sense to peel the healthcare onion a bit more

 

What Role Does Microsoft Want to Play in Healthcare?

While other competitors (read Amazon, Salesforce, and Google) were busy launching healthcare-focused offerings in 2020, Microsoft was already helping healthcare providers use Microsoft Teams for virtual physician visits. Also, Microsoft and Nuance are not strangers, having partnered in 2019 to enable ambient listening capabilities for physician-to-EHR record keeping. Microsoft sees a clear opportunity in the US healthcare industry.

  • Everest Group estimates that technology services spending in US healthcare will grow at a CAGR of 7.5% for the next five years, adding an incremental US$25 billion to an already whopping $56 billion
  • The focus of Microsoft and its competitors is to disrupt the multi-billion ($40 billion by 2025) healthcare data (Electronic Medical Record) industry
  • Legacy EMR systems have been a major reason for physician burnout, which the likes of Nuance aim to solve
  • Cloud-driven offerings such as Canvas Medical and Amazon Comprehend Medical are already making Epic Systems and Cerner sit up and take notice

It is not without reason that Microsoft launched its cloud for healthcare last year and has followed it up by acquiring Nuance.

What Does it Mean for Healthcare Enterprises?

Under Nadella, Microsoft has developed a sophisticated sales model that takes a portfolio approach to clients. This has helped Microsoft build a strong positioning beyond its Office and Windows offerings even in healthcare. Most clients in healthcare are already exposed to its Power Apps portfolio and Intelligent Cloud (including Azure and cloud for healthcare) in some form. It is only a question of time (if the acquisition closes without issues) until Nuance becomes part of its suite of offerings for healthcare.

What Does it Mean for Service Providers?

As a rejoinder to our earlier point about head starts, this is where Microsoft has a lead over competitors. Our recent research on the System Integrator (SI) ecosystem indicates that Microsoft is head and shoulders above its nearest competitors when it comes to leveraging the SI partnership channel to bring its offerings to enterprises. This can act as a significant differentiator when it comes to taking Nuance to healthcare customers, as SI partners can expect favorable terms of engagement.

Partners' Perceptions

Lastly, this is not just about healthcare

While augmenting healthcare capabilities and clients is the primary trigger for this purchase, we believe Microsoft aims to go beyond healthcare to achieve the following objectives:

  • Take conversational AI to other industries: Clearly, healthcare is not the only industry warming up to conversational AI. Retail, financial services, and many other industries have scaled usage. Hence, it is not without reason that Mark Benjamin (Nuance’s CEO) will report to Scott Guthrie (Executive Vice President of Cloud & AI at Microsoft) and not Gregory Moore (Microsoft’s Corporate Vice President, Microsoft Health), indicating a broader push
  • Make cloud more intelligent: As mentioned above, Microsoft will pursue full-stack opportunities by combining Nuance’s offerings with its Power Apps and Intelligent Cloud suites. As a matter of fact, it plans to report Nuance’s performance as part of its Intelligent Cloud segment

Microsoft: $2 Trillion and Beyond

This announcement comes against the background of BigTech and platform companies making significant moves toward industry-specific use cases, which will drive the next wave of client adoption and competitive differentiation. Microsoft’s turnaround and acceleration since Nadella took over as CEO in 2014 are commendable (see the image below). It is on the verge of becoming only the second company to achieve $2 trillion in market capitalization. This move is a bet on its journey beyond the $2 trillion.


What do you make of its move? Please feel free to reach out to [email protected] and [email protected] to share your opinion.

Synthetic Data – Catalyst for AI innovation | Blog

With a connected world and connected humans, we are on track for an unprecedented uptick in new data creation. IoT, digitization, and cloud have brought on the generation and storage of zettabytes (ZBs) of data each day. Data has become the new oil, but with some caveats. The tap of this oil is controlled by a few organizations globally, making this data asset scarce and expensive. However, enterprises, in their pursuit of digital transformation, require this data to derive insights for better decision-making.

Shortcut to access data

The next logical question is how we can get hold of this data, which, if utilized to its full potential, has the power to transform enterprises. This is where synthetic data comes into play. Synthetic data is created artificially rather than generated through actual interactions or events. It is usually formed by studying the characteristics of, and relations between, different variables. Three types of synthetic data exist, as shown below.


Exhibit 1: Types of synthetic data

Why is it required now?

With the cultural shift from gut-based to insights-based decision making and the onset of data literacy initiatives, enterprises require timely insights, which in turn require the generation of huge amounts of data. A few instances highlighted below make a strong case for synthetic data.

  1. GDPR imposes stringent regulations on data access, stipulating that a company can utilize personal data only with user consent. This makes it extremely difficult to share data, creating hurdles in solving business problems
  2. AI models and algorithms require extensive labeled data for training purposes. In the case of self-driving cars, millions of miles need to be clocked to test computer vision algorithms. This delays the go-to-market for such products
  3. New product development usually requires a lot of data for testing before a product is introduced in the market. Innovation becomes scarce if quality data from the field is not available

Techniques to generate synthetic data

There are usually three strategies to generate synthetic data. These include some simplistic techniques as well as methods infused heavily with AI.


Exhibit 2: Techniques to generate synthetic data

Sampling from a distribution is the simplest approach: draw random values from a distribution fitted to the real data. Agent-based modeling studies the behavior of the original data, defines its characteristics, and then creates new data within those behavioral constraints. Generative Adversarial Networks (GANs) are synthetic data generation techniques usually used for creating image data. A GAN pairs two deep learning models: a generator, which takes random noise as input and produces candidate images, and a discriminator, which tries to determine whether each image is fake or real. As training progresses, the generator's output becomes increasingly difficult to distinguish from real data.
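
As a simple illustration of the first technique, sampling from a distribution, the sketch below fits a normal distribution to a small, made-up numeric column and draws synthetic values with the same mean and spread; the column name and values are purely illustrative.

```python
# Minimal sketch of the "sampling from a distribution" technique: fit a normal
# distribution to a real numeric column and draw synthetic values that mimic it.
import numpy as np

rng = np.random.default_rng(seed=42)

# Illustrative stand-in for a real column of transaction amounts
real_transaction_amounts = np.array([12.5, 48.0, 33.2, 97.1, 15.8, 62.4, 27.9, 55.0])

mu = real_transaction_amounts.mean()
sigma = real_transaction_amounts.std(ddof=1)

# Draw 1,000 synthetic amounts with the same mean and spread as the real column
synthetic_amounts = rng.normal(loc=mu, scale=sigma, size=1000)
synthetic_amounts = np.clip(synthetic_amounts, a_min=0.0, a_max=None)  # amounts can't be negative

print(f"Real mean/std:      {mu:.2f} / {sigma:.2f}")
print(f"Synthetic mean/std: {synthetic_amounts.mean():.2f} / {synthetic_amounts.std(ddof=1):.2f}")
```

Agent-based modeling and GANs follow the same idea at greater fidelity: learn the structure of the real data, then generate new records that respect it.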

Applications across enterprises

An infinite source of data that mimics the real dataset can provide innumerable opportunities to create test scenarios during development.

Synthetic data benefits enterprises across domains and industries, with some examples shown below.

  1. “Customer is king”: A tag line commonly used in the current environment wherein organizations strive to provide hyper-personalization to customers for better customer retention and to create upsell and cross-sell opportunities. Synthetic data helps enterprises get a detailed analysis of each customer without worrying about GDPR consent requirements. This data has the properties of real data and can be used for simulations
  2. Agile development and DevOps: Software testing and quality assurance often involve a long waiting period to get access to ‘real’ data. Artificially generated data can help eliminate this waiting period, leading to reduced testing time and increased agility during development
  3. Research and product development: Synthetic data can be used to create an understanding of the format of real data that does not exist yet and to build algorithms and preliminary models on top of it. It can also be used as a baseline for product development and reduce time to market
  4. Robotics: Companies often struggle to obtain quality real-life data sets to execute testing. Synthetic data helps in running thousands of simulations, thereby improving the robots and complementing expensive real-life testing
  5. Financial services: Important elements for any financial services enterprise are fraud protection and detection methods, which can be tested and evaluated for their effectiveness using synthetic data

Limitations of synthetic data

However, the use of synthetic data does not come without its own set of limitations.

  1. At its best, synthetic data imitates the real-life data sets but is not an exact replica. This can result in certain data points that are deviations or exceptions to the overall set, leading to skewed modeling outputs
  2. It is also not an easy task to assess the quality of the synthetic data set generated as it often depends on the complexity of the original data. As a result, the quality assessment parameters need to change in accordance with the variation in the original data point, meaning there can’t be a standard framework to be followed for each synthetic data set
  3. It is difficult for business users to trust the credibility of synthetic data due to a lack of technological understanding, leading to slow uptake. This is especially true in industries such as healthcare and food, where there are direct repercussions for human life

Way ahead

Despite these limitations, enterprises should be keen to adopt synthetic data, as it offers an opportunity to disrupt the business landscape by utilizing data to its full potential. It could prove to be the push needed for AI/ML to penetrate enterprises and gain more traction.

If you’ve utilized synthetic data in your enterprise or know about more areas where synthetic data can be advantageous and disadvantageous, please write to us at [email protected] and [email protected]. We’d love to hear your experiences and ideas!

The Evolution from Robotic Surgery to Digital Surgery | Blog

The robotic surgery market has surged over the last decade. According to an article published in JAMA Network Open in early January 2020, robot-assisted surgical procedures accounted for 15.1 percent of all general surgeries in 2018, up from 1.8 percent in 2012. And the market has grown even more since 2018. For example, the utilization rate of Intuitive Surgical’s da Vinci robot in US hospitals has grown more than 400 percent in the last three years.

To capture their piece of the robotic surgery market pie, other MedTech giants, including Johnson & Johnson (J&J), Medtronic, Stryker, and Zimmer Biomet have turned to acquisitions and strategic partnerships. Stryker paid a whopping US$1.65 billion in 2013 to acquire Mako Surgical Corp. Zimmer Biomet bought Medtech for its Rosa Surgical Robot in 2016 for US$132 million. J&J and Medtronic acquired Orthotaxy and Mazor Robotics, respectively, in 2018. And J&J subsequently bought Auris Health and Verb Surgical in 2019.

With all this money being spent on robotic surgery company acquisitions, it is clear that the MedTech giants intended to fight head-on with one another to build the best surgical robot.

But building the best surgical robot does not assure market leadership.  Indeed, robotics is only one aspect of the digital surgery ecosystem. In order to excel in the robotic surgery space, companies need to build solutions that complement their surgical robots with digital technology tools and capabilities.

Transforming from robotic surgery to digital surgery

As you see in the following image, the digital surgery ecosystem consists of imaging, visualization, analytics, and interoperability technologies that enhance the capabilities of surgical robots, enabling companies to unlock the full array of potential benefits robotic surgery has to offer – better precision and control, enhanced surgical visibility, remote surgery, better clinician and patient experiences, etc.

Let’s take a quick look at the value each of the digital technologies can bring to robotic surgery.

  • AI/ML and data analytics will help the robots learn from past procedures and ensure better surgery planning and reasoning. They will also help support cognitive functions such as decision making, problem solving, and speech recognition. One real-world example of AI/ML is Stryker’s Mako robot, which learns from past procedures to ensure better positioning of surgical implants for stability
  • Strong network and connectivity will enable real-time data sharing of patient outcomes, best practices, and support remote surgery at a global level
  • Augmented Reality/Virtual Reality (AR/VR) and advanced visualization technologies enhance surgical visibility beyond the naked eye and improve anatomical education and surgeon training modalities through interactive simulations
  • Remote monitoring, sensors, and wearables can assist in intra-operative and post-operative surgical care through real-time data exchange for better clinical outcomes and reduced care costs


 

Realizing the benefits of digital technologies, MedTech companies are starting to make investments in them to augment their surgical robots. For example, Medtronic in 2020 acquired Digital Surgery, a leader in surgical AI, data and analytics, and digital education and training to strengthen its robotic-assisted surgery platform. Similarly, in 2021, Stryker acquired Orthosensor to enhance its Mako surgical robotics systems with smart sensor technologies and wearables, and Zimmer partnered with Canary Medical to develop smart knee implants. MedTech companies are also starting to change their branding to reflect their move to digital. For example, J&J is positioning its new offerings as digital surgery platforms instead of robotic surgery platforms.

Building a single, connected next-generation digital surgery platform

Building specialized robots for different surgical procedures requires either a huge capital investment to acquire such individualized capabilities or extensive resources and time to develop them in-house. So, it’s neither feasible nor cost-effective to do so. Therefore, it would be ideal for MedTech organizations to focus on developing one robot that supports the entire breadth of surgical procedures.

With their history of robotic acquisitions over the last three years, MedTech giants should be looking at integrating multiple point solutions to build a single, connected next-generation digital surgery platform. The following image depicts our vision of a truly connected digital surgery ecosystem built around a digital surgery platform. It ensures interoperability among all types of surgical robots so they can continually learn and evolve by sharing best practices, surgical procedures, and associated patient data.


J&J has already shared its vision and roadmap for building a next-generation digital surgery platform. It brings together robotics, visualization, advanced instrumentation, connectivity, and data analytics to enable its digital surgery platform to improve outcomes across a broad range of disease states. It has announced its plans to integrate its recently unveiled Ottava platform with the Monarch platform it gained from its 2019 acquisition of Auris Health to build a strong position in the digital surgery market.

With MedTech giants in the initial phase of building their next-generation connected digital surgery ecosystems, they will need the right fit of complementary digital technologies to truly scale their impact – alleviating surgeon workloads, driving productivity, enabling personalization, and improving clinical outcomes. Service providers that bring niche talent and a balanced portfolio of engineering and digital services will be partners of choice for MedTech giants in this journey.

Please share your views on robotic surgery and the digital surgery ecosystem with us at [email protected] and [email protected].
