
Driving Factors for IT Services Recovery in 2024: Insights from Everest Group’s Forces & Foresight™ Research | Blog

Our inaugural Forces & Foresight research uncovers three factors we predict will drive a gradual recovery in tech services this calendar year. Continue reading to understand the projected path for the IT services industry’s rebound and implications for providers.

Learn more about Forces & Foresight™.

After the global financial crisis in 2008 and the COVID-19 pandemic in 2020, the IT services industry took five to eight quarters to recover. Both recessions had specific catalysts, and the IT services growth rebound after each downturn was fueled by weak comparisons and a release of significant pent-up demand (as illustrated in Exhibit 1).

Let’s explore whether the current economic decline will follow a similar trajectory based on Forces & Foresight research that dives deep into comprehensive dynamics to understand the industry implications.

Exhibit 1

Over the past year, the contrasts between past recessions and ongoing macro challenges have been well-researched and discussed. More importantly, the industry is now facing multiple forecasts – pulling in different directions – on how the broader macroeconomic outlook will impact IT services growth.

The following three factors are fueling our foresight:

  1. Stabilizing base

Despite the uncertainty that has impacted enterprise decision-making for the past several quarters, we now see signs of spending stability. The fundamental need to leverage technology for growth or cost reduction is keeping enterprise IT budgets green.

Factors that give us confidence the demand downtrend has at least stabilized include:

    • Enterprises: Everest Group’s Key Issues Study of enterprise priorities captured that over 60% of enterprises (from a sample of 200-plus) plan to increase technology budgets in 2024
    • IT Providers: A notable trend for technology platform players’ performance and outlook is that SaaS/PaaS players are not only reporting double-digit growth and new clients but also foresee growth continuing in 2024
    • Services: We see strong resilience for IT services spending in specific segments such as public sector, healthcare, and energy
  2. Fixing revenue leakage

Over several past quarters, service providers have reported contradictory numbers for revenue growth versus bookings (Exhibit 2).

Our research uncovered some key factors creating this dichotomy, including shifting profiles of existing deals to longer terms and lower annual contract values (ACVs), delayed/canceled commitments, and slow ramp-ups of large deals.

As a result, signings did not translate into revenue growth even after expected ramp-up periods. However, we see signs of stabilization in the latest quarter, especially for mid-tier players, and some stalled projects moving forward. This signals that the net negative effect is reversing and that net positive revenue contribution from bookings may begin.

Exhibit 2

  3. Pockets of additional demand

We see some specific growth areas in 2024. Rising cybersecurity demand is promising (as is the case in challenging periods) and is seeing more takers, such as governments expanding cybersecurity budgets. Engineering, Research, and Development (ER&D) demand in asset-heavy industries like automotive is getting stronger as enterprises strategically invest in next-generation concepts to modernize, such as smart factories. Data and analytics will continue its growth trajectory, with the rise in Artificial Intelligence (AI) fueling more robust use cases.

Beyond these promising areas, we also see signs of a turnaround in segments and sub-segments that were hit hard in recent quarters, prominently the investment banking and high-tech industries and the North American geography.

Note: We are not factoring in benefits from AI and generative AI (gen AI) in our foresight for 2024. While we strongly believe in the immense potential of gen AI, our research suggests that scaled adoption will still take time, and the effect on this year’s growth will not move the needle.

Implications for service providers

Despite the ongoing challenges, the industry has not slumped. The revenue decline seen over one to two quarters compares favorably with the several quarters of decline in previous downturns (Exhibit 3). We predict pivotal growth soon.

Exhibit 3

However, given the “classical economics” nature of the downturn impacting IT services, we anticipate a more gradual recovery than the sharp rebounds that followed the last two event-driven recessions. We expect a diverse mix of industry verticals and geographies to undergo different recovery cycles, making it an interesting and complex scenario.

The key for IT service providers lies in identifying early-recovery pockets and attacking them, rather than staying defensive and potentially over-investing in account management for areas that will recover later in the cycle.

For more insights, supporting data, and trends on IT services growth, please read the report, Forces & Foresight Q1 2024.

Contact Prashant Shukla to learn more at [email protected].

Unlocking Enterprise Preparedness for T+1 Settlement: The Crucial Role of IT and Technology Services Providers | Blog

By partnering with IT and technology services providers, banks and financial institutions can prepare for the new T+1 settlement. This securities trading rule change, which shortens the settlement date by a day, is expected to enhance operational efficiencies and reduce risk. Read on to understand how this updated regulation will impact the industry landscape and rapidly transform critical areas. Reach out to discuss the topic with us.

In today’s ever-evolving financial industry, the shift to T+1 settlement aims to enhance market efficiency, reduce counterparty risk, and align North American markets with global standards.

The transition to a T+1 settlement cycle represents a monumental shift for banks and financial institutions that will impact trade management, resource allocation, and risk mitigation.

Scheduled to go into effect this May in the US and Canada, the amended rule would require trades to be settled just one day after the transaction date, marking a significant departure from the current two-day cycle.
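To make the compression concrete, here is a minimal sketch (illustrative only; it skips weekends but ignores exchange holidays, which a production settlement calendar would also handle) of how the settlement date moves under T+2 versus T+1:

```python
from datetime import date, timedelta

def settlement_date(trade_date: date, cycle_days: int) -> date:
    """Return the settlement date, counting only weekdays after the trade date.
    Simplification: exchange holidays are not considered here."""
    current, remaining = trade_date, cycle_days
    while remaining > 0:
        current += timedelta(days=1)
        if current.weekday() < 5:  # Monday through Friday
            remaining -= 1
    return current

trade = date(2024, 5, 31)         # a Friday
print(settlement_date(trade, 2))  # T+2: settles Tuesday, June 4
print(settlement_date(trade, 1))  # T+1: settles Monday, June 3
```

Under T+1, every post-trade step that previously had two business days to complete, such as allocation, affirmation, funding, and FX for cross-border trades, must now finish within one.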

Let’s explore its ramifications further.

Impact on investors 

Shifting toward instantaneous or faster settlements is a remarkable milestone that will streamline operations, improve risk management, boost liquidity, and provide better counterparty risk management. This will further lead broker-dealers to reduce margins and collateral requirements.

Accelerating settlement cycles will save buyers and sellers time and increase trading volume. The positive impact will vary by the investor type. Large institutional investors like corporations will benefit from more liquidity and reduced margin requirements. Meanwhile, small or retail investors, who contribute significantly to the daily exchange trading volumes, will receive funds or assets post-execution faster. This will bring various operational benefits and improve market risk mitigation.

However, the proposed shift will require significant investor education and resilience to overcome the negative market sentiment caused by affected broker businesses. Investors will grapple with the shorter period for trade settlement, and brokers will need substantial investments to update front-to-back-office systems. Moreover, the higher settlement costs could potentially disappoint investors.

Implications for banks and financial services institutions


The shift to T+1 settlement presents a significant opportunity for the financial industry to enhance operational efficiencies and reduce risk. As Martin Palivec, Head of Securities Services, Canada, Citi, has highlighted, many post-trade processes still rely heavily on manual intervention, indicating a clear need for automation and streamlining. Settlement compression will drive the industry to strengthen and automate these processes, leading to more efficient and reliable operations.

Moreover, improving reporting capabilities is becoming increasingly critical, with clients expecting timeliness and accuracy in trade status updates. This underscores the urgency for organizations to adopt advanced technologies and modernize their post-trade infrastructure.

Role of IT and technology service providers

By partnering with IT and technology services providers, banks and financial institutions can navigate the complexities of T+1 settlement and position themselves for success in the evolving financial landscape.

IT and technology services providers can play a crucial role in assisting banks and financial services institutions in their T+1 journey in the following ways:

  • System Upgrades and Integration: Technology providers can help firms upgrade their existing systems or integrate new systems to support T+1 settlement. This includes implementing real-time processing capabilities, enhancing data management systems, and ensuring interoperability with other market participants
  • Automation and Straight-Through Processing: Automation is key to achieving operational efficiency in a T+1 settlement environment. Technology providers can assist firms in automating trade interfaces, matching and affirming trades in real time, and streamlining settlement workflows to minimize the risk of failed settlements
  • Data Management and Reporting: Accurate and timely data management is critical with the compressed settlement timeline. Technology providers can help firms improve data quality, implement real-time reporting capabilities, and enhance communication with counterparties to reduce the risk of errors and delays
  • Regulatory Compliance and Risk Management: Technology providers can assist firms in ensuring compliance with T+1-related regulatory requirements, including implementing systems to monitor and report trade status and manage risk factors associated with the new settlement cycle

The move to the T+1 settlement represents a significant advancement in the financial industry, necessitating operational and technological adjustments for banks and financial institutions.

As this transition unfolds, the prospect of T+0 settlement looms, with countries like India already making strides in this direction. The adoption of distributed ledger technology (DLT) is expected to be pivotal in enabling T+0 settlement and offering real-time, secure, and transparent transaction processing.

Organizations preparing for T+1 should also consider the potential shift to T+0 in their strategic planning. By embracing technologies and practices supporting T+1 and T+0 settlement, businesses can streamline operations, enhance efficiency, and stay ahead of the curve in an evolving financial landscape.

To learn more about the impact of T+1 settlements in the financial services industry, contact Abhinav Rathaur, [email protected], Kriti Seth, [email protected], and Pranati Dave, [email protected].

Read the blog, Beyond the Hype: Approaching Gen AI in BFSI Enterprises with the Generative AI-EXCEL Framework, to learn about successful gen AI adoption in the BFSI sector.

Beyond the Hype: Approaching Gen AI in BFSI Enterprises with the Generative AI-EXCEL Framework | Blog

To successfully adopt Gen AI in BFSI, enterprises need to consider four fundamental aspects that can lead to responsible and effective deployment. Carefully evaluating each framework component is essential to ensure a positive Gen AI journey. Read on to learn about the Generative AI-EXCEL Framework and the importance of each element, or get in touch.

There is urgency to embrace Generative Artificial Intelligence (Gen AI) across all industries, and the BFSI industry is no exception given the technology’s prevalence. However, a thoughtful approach is required to fully reap the benefits of Gen AI.

Before immersing themselves in various use cases and integrating Gen AI into their operating structure, BFSI enterprises should strategically examine four fundamental components along the Gen AI value chain:

Generative AI-EXCEL framework

  • Enable AI
  • Execute AI
  • Champion AI Operations
  • Lead AI Change Management and Governance

These elements can guide enterprises toward harnessing the full potential of Gen AI in BFSI while ensuring responsible and effective deployment.


Enable AI

Embarking on AI initiatives demands the expertise of AI experts to define a clear vision and strategy. Seeking guidance from Gen AI experts is essential in laying a solid foundation for successful implementation. Assessing organizational readiness through an AI maturity and readiness assessment is recommended as this can provide insights into preparedness levels and potential challenges.

Developing a Gen AI roadmap and conducting a Return on Investment (ROI) analysis further ensures a well-structured approach, allowing organizations to navigate the complexities of integrating Gen AI effectively in their operations. A thoughtful approach to consulting and enablement across the Gen AI value chain is essential before delving into specific use cases, with AI technology partners and tool-selection advisory services helping organizations secure the right resources for success.

Adequate resources are crucial to ensure scalability, allowing Gen AI systems to manage increasing workloads efficiently. There is a significant gap in talent, skills, and domain expertise, especially in Gen AI, that needs to be plugged.

Moreover, hardware and infrastructure compatibility and version compatibility among different Gen AI models and frameworks are essential for seamless operations. Massive datasets play a pivotal role in training large-scale AI models, demanding significant computational power from specialized hardware such as graphics processing units (GPUs) and tensor processing units (TPUs). Balancing these elements is vital to harness the potential of Generative AI effectively.

Execute AI

When developing AI systems, some essential steps include preparing the data, refining features, utilizing and fine-tuning pre-built models, integrating AI with existing systems, creating custom models as needed, and conducting thorough testing to ensure reliability.

The increasing complexity of Gen AI models has led to the emergence of Machine Learning Operations (MLOps) and Large Language Model Operations (LLMOps) as services. These can play a pivotal role in enabling efficient deployment, orchestration, and monitoring of AI models.

Given the possibility of potential biases introduced by Gen AI, it becomes imperative for BFSI enterprises to ensure fairness. Vigilant model monitoring and drift analysis are some ways to achieve this. In addition, optimized performance can be achieved by incorporating accelerators.
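As a simplified illustration of what drift analysis can look like in practice (a sketch only, not a full monitoring stack; the distributions and threshold below are illustrative), a team might compare the model’s production score distribution against a reference window using a two-sample Kolmogorov-Smirnov test:

```python
import numpy as np
from scipy.stats import ks_2samp

def score_drift(reference_scores, production_scores, alpha: float = 0.05) -> dict:
    """Flag drift when production scores differ significantly from the
    reference (validation-time) distribution."""
    result = ks_2samp(reference_scores, production_scores)
    return {
        "ks_statistic": result.statistic,
        "p_value": result.pvalue,
        "drift": result.pvalue < alpha,
    }

rng = np.random.default_rng(7)
reference = rng.normal(0.60, 0.10, 5_000)   # scores captured at validation time
production = rng.normal(0.48, 0.12, 5_000)  # scores observed in production
print(score_drift(reference, production))   # the shifted distribution trips the flag
```

Comparable checks on input feature distributions, paired with fairness metrics across customer segments, help catch both statistical drift and emerging bias before they affect decisions.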

Champion AI Operations

A robust change management strategy is essential for navigating a smooth transition. Leadership communication about AI’s benefits can set a positive tone for adoption. Equipping the workforce with the necessary skills through comprehensive training and upskilling is equally important. Developing a streamlined process for Gen AI adoption can enhance its acceptance rate. Recognizing and reinforcing Gen AI’s contributions can motivate the workforce, ensuring effective and sustainable AI integration.

Lead AI Change Management and Governance

Strong data governance can help address some of the concerns related to source attribution and confidence levels in data and foster trust in Gen AI outcomes.

Gen AI can generate content that is low in authenticity. Model explainability can help make AI decisions more understandable and traceable, boosting user confidence. Furthermore, enforcing compliance, validation, and auditing mechanisms can reinforce AI solutions’ reliability and ethical deployment.

The Gen AI model can potentially produce biased or dangerous results. Other AI models can be used to test results for risky outputs. Enterprises can also use data loss prevention and other security tools to prevent users from inputting sensitive data into prompts in the first place. Maintaining control over data is essential, and multiple levels of security are required.
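As a much-simplified illustration of that last safeguard (real data loss prevention products cover far more identifier types, use context, and sit directly in the request path), a pre-prompt scrubber can redact obvious identifiers before any text reaches a model; the patterns below are illustrative:

```python
import re

# Illustrative patterns only; production DLP tooling covers many more identifier types.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "US_SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "CARD": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def scrub_prompt(prompt: str) -> str:
    """Replace obvious personal identifiers before the prompt is sent to a gen AI service."""
    for label, pattern in PATTERNS.items():
        prompt = pattern.sub(f"[{label} REDACTED]", prompt)
    return prompt

print(scrub_prompt("Customer 123-45-6789 (jane@example.com) disputed a card charge."))
# -> Customer [US_SSN REDACTED] ([EMAIL REDACTED]) disputed a card charge.
```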

In an industry where data security and privacy are paramount, governance becomes a linchpin for safeguarding sensitive information. Beyond regulatory compliance, governance can address critical aspects such as risk management, fairness, transparency, and accountability. With ongoing regulatory uncertainty and evolving laws, it is critically important to exercise caution about data breaches, privacy violations, or biased or discriminatory decisions that can create regulatory liabilities.

By following this Generative AI-EXCEL framework, BFSI enterprises can ensure they have addressed all essential aspects of enabling Gen AI. From identifying the right infrastructure and resources to developing and testing models and ensuring proper change management and governance, thoroughly evaluating each component guarantees a smooth AI transition. This approach will allow BFSI enterprises to harness Gen AI’s power fully.

To discuss Gen AI in BFSI, please reach out to [email protected], [email protected], and [email protected]. Learn more about how we can help your enterprise to leverage Gen AI, or read our report on revolutionizing BFSI workflows with Gen AI.

Decoding Quantum Computing: Uncovering its Potential Impact and Opportunities, Part I | Blog

With their exceptional computing prowess, quantum computers have the potential to revolutionize various sectors by expediting complex problem-solving. In this first blog of our two-part series, we delve into quantum computer types, opportunities for businesses and IT service providers, and their impact on modern cryptographic algorithms. Get in touch to discuss further.

What is quantum computing?

Quantum computing is an innovative approach that leverages the principles of quantum mechanics to solve extremely complex problems much faster than classical computers. Unlike classical computers using bits, quantum computers employ qubits, such as photons or atoms, for information encoding. Quantum computing progressed from 2-qubit systems in the 1980s to tens of qubits in the 2000s, and significant milestones have followed in recent years. Google claimed “quantum supremacy” in 2019 with a 53-qubit processor, and IBM’s 433-qubit chip, IBM Osprey, set a new record. In 2023, Atom Computing unveiled a 1,180-qubit quantum computer.

Quantum bits exhibit numerous types of quantum phenomena. Let’s explore the following:

  • Superposition – Quantum bits, or qubits, can represent both 0 and 1 simultaneously, allowing quantum computers to process information much faster than classical computers
  • Entanglement – Qubits become perfectly correlated, even when separated by large distances, so measuring one qubit immediately reveals the state of its entangled partner

These principles allow quantum computers to perform calculations based on the probability of a qubit’s state before measurement, revolutionizing computing capabilities.
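For readers who want the math behind these two principles, in standard quantum-information notation a qubit in superposition and a maximally entangled two-qubit (Bell) state are written as:

```latex
|\psi\rangle = \alpha|0\rangle + \beta|1\rangle, \quad |\alpha|^2 + |\beta|^2 = 1
\qquad\qquad
|\Phi^{+}\rangle = \tfrac{1}{\sqrt{2}}\left(|00\rangle + |11\rangle\right)
```

Measuring the first qubit of the Bell state as 0 leaves the second qubit in state 0 with certainty, which is the correlation described above.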

Despite substantial investments, current systems face scalability and stability issues. Error correction and fault tolerance also remain complex, with each additional qubit increasing the probability of errors and higher sensitivity to environmental noise. These issues highlight the ongoing hurdles in quantum computing’s path to widespread commercialization.

Quantum computer types

Quantum computers have different architectures, determined by the nature of qubits and quantum mechanics phenomena used to manipulate them. The research and innovation put into these architectures deliver solutions to problems that previously could not be solved due to classical computers’ limited computing capabilities.

Below are some of the most typical architectures that enterprises should be familiar with:

  • Superconducting: Usually made from superconducting materials, these computers use loops and circuits to produce and alter the qubits. They are the most sophisticated and popular quantum computers and can accurately model and simulate the behavior of molecules, materials, chemical reactions, etc. This feature finds practical utility in fields like drug discovery, materials science, and chemistry, where understanding the quantum behavior of complex systems is essential. They also excel in solving optimization problems, such as route optimization, scheduling, and resource allocation, which have applications in logistics, supply chain management, and financial portfolio optimization
  • Trapped ion: These quantum computers use ions trapped and manipulated in an electromagnetic field as qubits. Their long coherence times make them viable for applications requiring high stability and control levels
  • Neutral atom: Similar to trapped ion quantum computers, these use neutral atoms suspended in an ultra-high vacuum by arrays of tightly focused laser beams. They also offer long coherence times and high-fidelity operations, making them suitable for implementing complex quantum algorithms such as simulations and solving optimization problems in logistics and cryptography
  • Quantum dots: These use semiconductor quantum dots or tiny semiconductor nanocrystals as qubits that can confine electrons or other charge carriers, usually manipulated by electrical, magnetic, or microwave pulses. They have the potential for robust scalability and are typically implemented in quantum communication networks and quantum sensing, among other use cases
  • Photonic/optical: Photonic quantum computers leverage photons (or packets of light) to carry and process quantum information. They can play a significant role in quantum communication protocols, enabling secure transmission of information through quantum key distribution (QKD) and quantum teleportation. This ensures the confidentiality and integrity of data, which is essential for various sectors such as finance, defense, and telecommunications

Implications for enterprises and IT service providers  

Quantum computing presents numerous opportunities for enterprises across various industries to revolutionize their operations, drive efficiency, and unlock new possibilities for growth and innovation.

As the field matures, business leaders must prepare to embrace quantum computing in the following five ways:

  1. Educate stakeholders: Enterprise leaders must educate themselves, their teams, and stakeholders about quantum computing, its potential applications, and its implications for their industry. They can organize workshops, training sessions, and seminars to increase awareness and understanding of quantum computing concepts and opportunities
  2. Identify potential use cases: Leaders must understand their respective fields’ most significant challenges and opportunities and actively search for quantum computing use cases. This can be achieved by either having an in-house team of quantum computing experts or collaborating with academia, research institutions, regulatory bodies, and other industry players to stay abreast of the latest quantum computing technology advances
  3. Build a quantum-ready workforce: After identifying relevant quantum computing use cases, leaders must build a dedicated team with expertise in quantum physics, algorithms, hardware, software, and other related fields that can work together to research, design, and implement quantum solutions tailored to their needs. This will enable the enterprise to filter out the hype and focus on areas with real business implications
  4. Invest in research and development: By allocating resources to R&D initiatives focused on quantum computing, enterprises can explore potential use cases, develop proof-of-concept projects, and experiment with quantum algorithms and applications relevant to their industry
  5. Understand technology needs: Enterprises should determine the frequency of their quantum computing usage to help decide whether to purchase/own a quantum computer or utilize cloud-based quantum services provided by computing companies. It is crucial for enterprises to carefully evaluate and choose quantum-computing partners based on their unique requirements

Service providers can play a crucial role in educating enterprises about the potential applications of quantum computing in their specific industry sectors and help them navigate the challenges and benefits associated with its adoption. Enterprises should understand they don’t necessarily need to own or build a quantum computer. Instead, they should embrace quantum computing as a service that provides multiple benefits, such as scalability, elasticity, reduced costs, and increased accessibility.

Furthermore, it’s crucial to communicate to enterprises that quantum computers will not require continuous availability, as they will coexist alongside classical computers. Providers can collaborate with enterprises on R&D initiatives and develop custom algorithms and applications tailored to their business needs. Additionally, providers have an essential role in helping enterprises navigate quantum computing security concerns.

Quantum computing’s impact on modern cryptographic algorithms

Cryptography has served as the cornerstone of secure communication in the digital age, addressing multiple information security aspects, such as confidentiality, data integrity, authentication, and non-repudiation in the presence of third parties or adversaries. Some of the foundational elements of cryptography are:

 

| Algorithm | Description | Use cases | Examples |
|---|---|---|---|
| Hash function/algorithm | Transforms input data into fixed-size strings called hash values | Password hashing, digital signatures, hash-based message authentication codes (HMACs), and data integrity verification | SHA2, SHA3, Blake2 |
| Symmetric algorithms | Use one key for both encryption and decryption | Data encryption, SSL/TLS, MACs, and VPNs | AES, RC6, Blowfish, Twofish |
| Asymmetric algorithms | Use a pair of keys: a public key for encryption and a private key for decryption | HTTPS, digital signatures, email encryption, blockchain, public key infrastructure (PKI) | RSA, DSA, ECDSA, EdDSA, DHKE, ECDH, ElGamal |

Many cryptographic algorithms that enterprises rely on today, such as RSA and ECC, are based on mathematical problems that are computationally difficult for classical computers to solve efficiently.

However, the advent of quantum computing threatens the security of these algorithms. Shor’s algorithm efficiently solves integer prime factorization and discrete logarithm problems, breaking the security of RSA and other asymmetric encryption schemes. Additionally, Grover’s algorithm threatens symmetric cryptographic algorithms and hash functions by offering a quadratic speedup in searching through unsorted databases.
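To put rough numbers on these threats (standard results from the quantum-algorithms literature, not Everest Group estimates):

```latex
\begin{aligned}
\text{Shor:}\quad & \text{factoring and discrete logarithms in polynomial time}
&&\Rightarrow\; \text{RSA, DSA, and ECC are broken outright} \\
\text{Grover:}\quad & O(N) \;\rightarrow\; O(\sqrt{N}) \text{ search}
&&\Rightarrow\; \text{an } n\text{-bit symmetric key retains} \approx n/2 \text{ bits of security}
\end{aligned}
```

This is why, in the table below, symmetric algorithms and hash functions call for larger key and output sizes, while public-key schemes need wholesale replacement with post-quantum alternatives.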

 

| Cryptographic algorithm | Type | Purpose | Impact from large-scale quantum computer |
|---|---|---|---|
| AES | Symmetric key | Encryption | Large key sizes needed |
| SHA-2, SHA-3 | | Hash functions | Large output needed |
| RSA | Public key | Signatures, key establishment | No longer secure |
| ECDSA, ECDH (Elliptic curve cryptography) | Public key | Signatures, key establishment | No longer secure |
| DSA (Finite field cryptography) | Public key | Signatures, key establishment | No longer secure |

Source: Report on Post-Quantum Cryptography (nist.gov)

Quantum computing – a mixed blessing?

Given their immense computational powers, quantum computers have the potential to revolutionize various fields by solving specific problems much faster than classical computers. Rapid technological advancements in the field make it critical for enterprises to understand the technology, determine potential use cases, and prepare for it.

However, the need for robust quantum-safe or post-quantum cryptographic solutions becomes increasingly evident as quantum computing advances. Read our next blog in this series to learn how to navigate quantum computing security concerns.

To discuss further, please contact Prabhjyot Kaur, Kumar Avijit, and Suseel Menon.

Don’t miss the Global Services Lessons Learned in 2023 and Top Trends to Know for 2024 webinar to learn the successes, challenges, and trends that defined the services industry in 2023 and the opportunities for business leaders in 2024.

Generative AI and Cloud Integration Keep Mainframes Alive in the BFSI Industry | Blog

Though a recent Everest Group survey revealed the pressing need for mainframe modernization, the technology is far from dead in the banking, financial services, and insurance (BFSI) industry. The rise of generative AI (gen AI) will encourage more BFSI firms to adopt a comprehensive technical architecture, integrating cloud and mainframe technology at its core. Read on for insights from the survey or get in touch. 

Are BFSI firms really ditching mainframes? The BFSI industry is indeed grappling with the prospect of abandoning this approach. According to an Everest Group survey, about half of the respondent firms have shifted their peripheral tasks away from mainframes.  

Concerns about mainframe system scalability loom large for more than 50% of sizable BFSI firms, while about 60% of smaller firms struggle primarily with finding talent skilled in older programming languages such as COBOL.  

Operational complexity with mainframe systems is also reported as a challenge by over 90% of BFSI respondents, underscoring the pressing need for mainframe modernization. Evolving priorities such as building data-driven workflows, digitalization, and enhancing customer experiences further fuel this urgency. 

Cost efficiency and talent unavailability are the main drivers for mainframe modernization, closely followed by the imperative for innovation. North American firms prioritize core banking and CRM workloads over modernization, while European players emphasize digital channels and payment infrastructure. 

Despite these challenges, mainframes are expected to remain integral to BFSI operations. A significant majority, about 60% of the respondent firms, have not yet started modernizing their core systems. In the coming years, non-core applications will continue to have a higher migration rate than core applications.  

However, industry research underscores that BFSI enterprises are continuing to optimize and enhance their mainframe ecosystems, presenting a promising opportunity for service providers to assist. Let’s explore this further.

Capturing cloud value through a hybrid infrastructure

With mainframes here to stay in the BFSI industry, enterprises can gain a competitive advantage by investing in the private cloud to capture the underserved and large demand for hybrid IT. Hybrid cloud is a constant across all our BFSI industry enterprise conversations.  

Of the respondent BFSI firms, 35% utilize private cloud for their modernization initiatives — mainly in a hybrid cloud environment — while 65% rely on a multi-cloud strategy.  

IBM’s focus on transforming the mainframe’s interface to other environments validates this trend. The launch of the z15 and z16 is the company’s answer to the age of cloud computing. It is an evolution to meet the needs of hybrid cloud deployments, leveraging investment in data, generative Artificial Intelligence (gen AI), and applications, adding features and functionality to complement this strategy. IBM is focusing its messaging on rightsizing over downsizing. The strategy to provide more flexibility, predictability, and cost-effectiveness is evident in the company’s push for Tailored Fit Pricing.

The survey reveals many firms believe the disconnection between mainframe environments and new cloud-native systems and applications is a big challenge. Further investments in technologies like application programming interfaces (APIs), in collaboration with technology and service providers, will help bridge this gap in the coming years. 
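As a simplified sketch of that API-layer pattern (the endpoint, fields, and URL below are hypothetical; products such as IBM z/OS Connect expose mainframe transactions as REST services in a broadly similar way), a thin wrapper lets cloud-native applications consume core-system data without any knowledge of the underlying mainframe:

```python
import requests

# Hypothetical REST facade over a mainframe transaction, e.g. exposed through
# an API gateway; the base URL and response fields are illustrative only.
MAINFRAME_API = "https://api.internal.example-bank.com/core/v1"

def get_account_balance(account_id: str, token: str) -> dict:
    """Fetch a balance held in the core (mainframe) system via its REST facade."""
    response = requests.get(
        f"{MAINFRAME_API}/accounts/{account_id}/balance",
        headers={"Authorization": f"Bearer {token}"},
        timeout=5,
    )
    response.raise_for_status()
    return response.json()  # consumers see plain JSON, not COBOL copybooks or 3270 screens
```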

Will banking’s AI revolution enable cloud-based modernization? 

We expect a symbiotic relationship between gen AI and IT modernization, each complementing the other’s growth. Cloud computing is the foundational block providing the right computational power to run AI applications, while AI’s enhanced speed and efficiency will support cloud migrations.  

BFSI firms are channeling investments into gen AI, crafting use cases to support their modernization initiatives and business operations. The survey found that 40% of the BFSI firms have proof of concepts or use cases for gen AI to support mainframe modernization.  

Firms have moved beyond experimenting with gen AI. Goldman Sachs and Deutsche Bank have started using gen AI to generate and refactor code in their modernization initiatives and are closely watching the impact. They are building and rolling out use cases to improve operational efficiency. We believe that the banks poised for future success are identifying use cases that solve specific business problems aligned with their organization’s strategy. This can enable them to measure the results easily and encourage leadership buy-in.

With mainframe modernization services growing at a steady 4-5%, the ability to adapt and innovate using newer technologies such as gen AI will drive more BFSI firms to adopt a more robust and holistic technical architecture with both cloud and mainframes at its core. 

However, the question remains: will gen AI bring exponential change in the next three years? There is one certainty: the need for a strong IBM, service provider, and hyperscaler value proposition will continue to grow for BFSI clients. 

To discuss mainframe modernization further, please contact [email protected] and [email protected]. Understand more about the future of the enterprise mainframe, or watch our on-demand webinar on the future of generative AI implementation at enterprise level.

Gig Worker Benefits Collective: Transforming Benefits Delivery in the Gig Economy | Blog

The rising trend of gig work, especially among Gen Z and Millennials who favor nontraditional employment, is garnering attention. However, the lack of workplace benefits for many gig workers is a significant concern. A viable proposal for this urgent need is a Gig Worker Benefits Collective. Read on to learn more.

Reach out to discuss this topic further.

With more than 70 million individuals in the US engaged part or full-time in the gig economy, the notable lack of workplace benefits and life insurance coverage for these flex workers has sparked widespread debate.

About half of Gen Z and millennials, who constitute over 50% of the workforce, gravitate toward nontraditional employment. Consequently, employers are reconsidering benefits for gig workers while navigating various legal and tax complexities.

Employees access benefits in the following ways:

  • Employer-provided benefits – Employers typically offer benefits, using them as an incentive to attract top talent
  • Purchasing individual policies – This alternative provides individuals flexibility and control over coverage
  • Government assistance – This serves as a safety net for many workers. However, the ongoing US debate over classifying workers as “employees” versus “independent contractors” complicates matters for gig workers

While this issue appears daunting, finding an answer is possible. Let’s explore this further.

Proposing a solution: The Gig Worker Benefits Collective

One proposal is to create a Gig Worker Benefits Collective for these short-term workers across various employers. This centralized hub would bring together multiple employers or gig platforms under a comprehensive benefits umbrella.

Here are some benefits a Gig Worker Benefits Collective could offer:

  • Centralized administration: A central administration entity or pooled plan provider would manage benefit plans, including health insurance, retirement savings, and other relevant benefits. Technology and digital platforms would streamline administration and communication
  • Flexible benefit options: Through the collective, gig workers could choose from a wide range of benefits based on their needs. Options may include health insurance, retirement plans, disability coverage, and more. Furthermore, this would give gig workers the flexibility to customize benefit packages
  • Cost sharing: Participating employers would contribute to the collective pool to cover administrative expenses, potentially reducing costs for individual employers. The collective could leverage economies of scale to negotiate better rates with insurance providers and service vendors
  • Portability and continuity: This solution would ensure gig workers could continue their benefits even when transitioning between different gigs or employers within the collective. Having seamless benefit continuity would enhance worker satisfaction and retention

The Gig Worker Benefits Collective represents a professional, tech-enabled solution designed to address gig workers’ unique needs, potentially changing the landscape of life insurance and group benefits for the gig economy.

To discuss gig worker benefits further, please contact [email protected] and [email protected].

Watch the webinar, Locations and Workforce Strategy 2024: Insights, Trends, and Key Priorities, for insights into strategic workforce decision-making for 2024.

2024 is an Inflection Point For IT Services | Blog

We are now in a digital world built on operational platforms. In the platform world, looking for stability in processes no longer works, because platforms collapse business processes to align with improving the user experience and objectives and key results (OKRs), thus creating new value. Consequently, IT is at an inflection point: attention is moving away from building out infrastructure and toward achieving demonstrable business value.

Read more in my blog on Forbes

Navigating Cloud Portability and Exit Strategies in Banking and Financial Services | Blog

Cloud service providers are vital partners in helping Banking and Financial Services (BFS) institutions build robust systems for cloud migration and exit strategies to maneuver complex regulatory and operational environments. These approaches promise to ignite technological innovation. Discover actions the world’s leading banks are taking and their importance in today’s financial landscape in this blog. Contact us to discuss further.

In the BFS industry, cloud has become synonymous with innovation and agility. Yet, as the space matures in its digital transformation journey, a crucial pivot is taking place. The focus is not solely on cloud migration but on nimbly and cautiously maneuvering within it. This shift brings to the forefront the importance of cloud portability and exit strategies – concepts rapidly gaining traction as BFS enterprises seek to future-proof technology investments. Let’s explore this further.

The strategic imperative of cloud portability

Cloud portability has risen to a strategic imperative within the BFS space. It encapsulates the capability to seamlessly transition applications and workloads between cloud environments, ensuring operational resilience and uninterrupted compliance. This degree of agility is fundamental in mitigating risks associated with vendor lock-in. Additionally, it enables BFS enterprises to adapt rapidly to evolving regulatory requirements and market conditions.

Major banks have spearheaded the charge towards cloud portability by embracing technologies that allow flexibility. Adopting containerization technologies and microservices architecture, notably through Kubernetes, is a case in point. These technologies provide a layer of abstraction, decoupling applications from the underlying cloud infrastructure, which empowers banks to maneuver digital assets across platforms without the burdens of significant downtime or exorbitant costs.
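The same decoupling idea applies inside application code. The sketch below is an illustrative, application-level analogue (not a Kubernetes example): cloud-specific calls are isolated behind a provider-neutral interface, here with an AWS adapter built on boto3, so a workload can be re-pointed at another provider’s storage without touching business logic:

```python
from abc import ABC, abstractmethod

class ObjectStore(ABC):
    """Provider-neutral interface; business logic depends only on this."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...

class S3Store(ObjectStore):
    """AWS adapter; the cloud-specific dependency stays inside this class."""

    def __init__(self, bucket: str):
        import boto3
        self._bucket = bucket
        self._client = boto3.client("s3")

    def put(self, key: str, data: bytes) -> None:
        self._client.put_object(Bucket=self._bucket, Key=key, Body=data)

    def get(self, key: str) -> bytes:
        return self._client.get_object(Bucket=self._bucket, Key=key)["Body"].read()

def archive_trade_record(store: ObjectStore, trade_id: str, payload: bytes) -> None:
    # A GCS or Azure Blob adapter with the same interface drops in here unchanged.
    store.put(f"trades/{trade_id}.json", payload)
```

Containers and Kubernetes provide the equivalent decoupling at the infrastructure layer, packaging the runtime so the same workload can be scheduled on any conformant cluster.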

For instance, major financial institutions such as Bank of America and JPMorgan Chase have been at the forefront of embracing cloud-native technologies. Bank of America has utilized Kubernetes to enhance its application deployment processes, enabling faster innovation and improved customer service. Similarly, JPMorgan Chase has invested in containerization to streamline its IT infrastructure, demonstrating the significant efficiency and flexibility benefits these technologies offer to the BFS industry.

Navigating exit strategies in a regulated landscape

While less discussed, cloud exit strategies are vital to a comprehensive cloud governance framework. In an industry where strategic pivots or regulatory mandates can necessitate a change in cloud service providers, BFS enterprises must have clear, actionable plans for such eventualities. Crafting a cloud exit strategy involves thoroughly understanding service agreements and ensuring the transition can be executed with minimal disruption to operations and compliance protocols.

Goldman Sachs’ adoption of a multi-cloud strategy exemplifies a preemptive approach to exit planning. By distributing workloads across AWS, Azure, and Google Cloud, it is poised to maintain continuity of service and positioned to negotiate the transition of services, should strategic or regulatory circumstances change.

The formulation of cloud exit strategies is intricately linked to the BFS industry’s regulatory environment. Institutions must have actionable plans to transition away from cloud providers as strategic, regulatory, or operational landscapes evolve.

Over the last decade, regulations such as the General Data Protection Regulation (GDPR) in Europe and the Federal Financial Institutions Examination Council (FFIEC) guidelines in the United States have necessitated that banks maintain strict data governance and security protocols during such transitions.

These regulatory frameworks compel banks to plan their cloud engagements meticulously. For instance, compliance with GDPR requires that any BFS institution operating in or serving customers in the European Union (EU) must ensure its cloud exit strategy does not compromise data protection standards, even during service provider transitions.

For BFS enterprises, investing in cloud portability and a strategic exit plan is a direct response to the industry’s complex risk profile. These strategies protect them against the uncertainties of the cloud market and the evolving regulatory landscape. The goal is to safeguard investments and ensure that cloud engagements remain agile, compliant, and aligned with the overarching business objectives.

How can service providers become strategic partners in this roadmap?

Cloud service providers are pivotal in facilitating the BFS sector’s cloud transitions, having evolved from mere hosts of workloads to strategic partners. Mid-market providers illustrate this evolution by aiding BFS institutions in cloud migration and strategically planning portability and exit. These service providers ensure that cloud architectures are crafted to be vendor-agnostic and that exit strategies are incorporated into the engagement from the outset, aligning with the BFS industry’s stringent standards.

Elements of a comprehensive cloud strategy

The evolving cloud landscape necessitates a proactive and all-encompassing approach to strategy development. A holistic cloud strategy should incorporate the following:

  • Prioritizing open standards and application programming interfaces (APIs) to facilitate easy transition between cloud environments
  • Evaluating technology stacks in detail to uncover and mitigate potential lock-in risks
  • Negotiating transparent and favorable contractual terms with cloud providers that account for the potential need to exit
  • Developing robust business continuity plans that include cloud service transitions

The road ahead

As the BFS sector looks to the future, the trajectory of cloud computing strategies points toward greater flexibility, regulatory compliance, and strategic agility. The increasing importance of cloud portability and exit strategies is set to catalyze a new wave of technological innovation and strategic foresight. The pioneering steps some of the world’s leading banks are already taking demonstrate this evolution.

Large banks have been front-runners in leveraging cloud technology to enhance their financial services. Collaboration with hyperscalers, such as AWS, Azure, and Google Cloud, is part of the broader strategy to adopt a cloud-first approach by distributing workloads across different cloud providers.

Goldman Sachs has been using AWS’s capabilities to innovate in financial data management, leveraging cloud technology for scalability and efficiency and ensuring its architecture supports portability and compliance. This move indicates a broader trend among BFS institutions to harness the power of cloud computing while emphasizing the importance of cloud portability and the ability to adapt and exit in line with strategic and regulatory needs.

Moving forward, collaboration between BFS institutions and cloud service providers is expected to deepen, focusing on creating more robust frameworks for cloud portability and exit strategies. This partnership will be crucial in navigating the modern financial world’s regulatory complexities and operational demands, setting new standards for innovation, security, and customer-centric services in the banking sector.

BFS enterprises that diligently incorporate cloud portability and strategic exit planning into their operational frameworks are setting themselves up for enduring success. They will safeguard current investments and position themselves to leverage future technological advances and adapt to an ever-evolving regulatory landscape. We foresee that these proactive enterprises and service providers will spearhead the next wave of innovation and resilience in the BFS sector’s cloud journey.

To explore how to achieve cloud-first transformation in tandem with safeguarding the existing technology estate, contact Ayan Pandey, [email protected], and Pranati Dave, [email protected].

Don’t miss the webinar, Global Services Lessons Learned in 2023 and Top Trends to Know for 2024, to learn about the successes, challenges, and transformative trends that defined the global services industry in 2023 and discuss the opportunities that lie ahead for business leaders in 2024.

Five Tactics for Technology Service Providers to Guard Profit Margins Against Generative AI Impact | Blog

Technology service providers are committing to gen AI-related benefits to clients without a clear path to realization, which will negatively impact their deal margins. Uncover five strategies providers can use to optimize the productivity benefits of gen AI without denting profit margins in this blog.

The notion that generative AI (gen AI) can negatively impact technology service providers’ profit margins is counterintuitive. Service providers have assumed leveraging gen AI would improve margins by amplifying productivity and efficiency. Unfortunately, this has not been the case.

Most technology service providers commit to delivering gen AI-driven productivity gains to clients. However, they will struggle to achieve these benefits in the short term due to the highly contextual and probabilistic nature of gen AI tools and their niche impact on specific tasks. These benefits are therefore hard to guarantee during solutioning.

Consequently, service providers must adopt other methods to realize these productivity benefits. This often involves a combination of people and intellectual property (IP) assets that cannot be charged to clients, hurting near to mid-term profitability.

Let’s look at the following five options that CEOs and CFOs of technology service providers have to address this issue:

  1. Monetize generative AI assets: Service providers have struggled to monetize their IP and assets because they are mostly used as service enablers and differentiators. Clients do not perceive additional value and are not incentivized to pay for these assets. However, gen AI assets could change this trend. The investment required to build gen AI assets can vary significantly, from weeks to months of effort. Monetizing these assets should go beyond implementation and support fees and focus on IP value. If service providers can demonstrate genuine value, clients will be willing to pay the cost through better service pricing, an innovation fund, or direct monetization
  2. Educate sales teams and clients about the realistic impact of generative AI: Sales teams often fear their competitors are moving faster than them. In this rush, they commit gen AI benefits to clients without involving the practice, solution, or delivery teams. Leaders should coach the sales team and engage with strategic clients to have realistic gen AI discussions. Providers should create gen AI literacy services that peel away the hype of eye-catching Big Tech productivity announcements about gen AI offerings. Educational initiatives can help moderate client expectations and limit margin erosion
  3. Transform deal-related enabling functions: Generative AI has significant potential for summarizing and querying knowledge repositories. Service providers already use it to enhance the sales function through requests for proposal (RFP) bots, translating documents, and understanding service level agreement (SLA) requirements and the nuances of client needs. Accelerating such initiatives across sales, pre-sales, delivery management, deal finance, and business reviews can lower deal-related costs and alleviate margin pressure from productivity commitments
  4. Identify the right clients: Service providers need to identify the right set of clients for committing and delivering gen AI productivity benefits. This crucial aspect of the puzzle is frequently overlooked. The selection process should be driven by data, the client’s AI maturity, and the nature of the relationship and services used. Choosing the right clients can increase providers’ margins in their gen AI service delivery adoption journey. These clients will be more supportive of letting specific units experiment with gen AI productivity benefits, provide realistic feedback, and set internal expectations
  5. Transform effort and pricing models: Service providers are struggling to transform their effort and pricing model in the wake of gen AI committed benefits. Providers continue to rely on traditional approaches mainly because they are uncertain about linking gen AI committed client benefits to effort and pricing to manage deal margins. However, this problem must be solved to prevent it from significantly denting profitability. Apparent solutions, such as delinking price from effort, value-based pricing, and outcome-based commercials, have not worked. Since clients are more accommodating to collaborate with providers in this journey, providers should reignite these conversations, educate procurement and vendor management offices, and refresh their solutioning playbooks

While account and delivery leaders are held financially accountable for managing margins, linking this specifically to gen AI-related commitments can backfire. Instead, the focus should be shifted upstream during deal bids, solutioning, and client engagement.

To share your experience and discuss the impact of generative AI on profit margins for technology service providers, contact [email protected] and [email protected].

Watch this webinar, The Generative AI Odyssey: A Year in Review and What’s Ahead in 2024, to hear our analysts discuss the hype vs. reality of generative AI, production-level use cases, and the future of this transformative technology.

Are Expectations For Gen AI Lower Than Planned? | Blog

I am coming to the conclusion that the hoped-for gen AI spending wave for both tech and tech services could be longer in developing than both the investors and companies are planning on. In a recent blog, I talked about the realization that most (perhaps as high as or above 90%) of the Proof of Concept (POC) initiatives that launched during 2023 will not make it to production in 2024. Yet, the tech firms are plowing hundreds of millions of dollars into building AI tools and building the capabilities for running them. What has changed? And where are there still opportunities for this important technology?

Read more in my blog on Forbes
