
Surge in Onshoring Shapes Global Sourcing Market | Press Release

Despite macroeconomic uncertainties and reduced investor confidence, global sourcing industry witnesses stable growth in 2016

The global sourcing industry has experienced a surge in setup activity in onshore locations, according to Everest Group, a consulting and research firm focused on strategic IT, business services and sourcing. The proportion of onshore versus offshore delivery centers jumped from 45 percent in 2014 to 52 percent for the period of 2015-H1 2016.

Onshore setup activity increased among the top 20 service providers, with North America’s share surpassing 2012 levels after experiencing significant declines in 2013 and 2014 due to a global slowdown. North America is the most favored onshore location followed by Continental Europe.

According to Everest Group, the factors contributing to this rise in onshoring include:

  • a need for a deeper talent pool to support complex services,
  • the desire for easier coordination and better alignment/training with clients,
  • new data security regulations, and
  • tier-2 onshore locations gaining credibility for service delivery.

Overall, the global services market grew at a rate of 8-10 percent in 2015, reaching US$161-166 billion, a slight slowdown compared to the 9-11 percent growth rate of 2014.

“We expect that the global services market growth will be lower in 2016—likely 7-9 percent—due to the overall macroeconomic slowdown, currency fluctuations and volatility in equity and investment markets,” said Anurag Srivastava, vice president and director of the Global Sourcing practice at Everest Group. “Political instability associated with Brexit in the United Kingdom and the Trump presidency in the United States will continue to affect the growth rate as well.”

Global technology spending remained flat in 2015, a statistic that obscures the impact that new technologies are having on the industry.

“Going forward, countries such as India are expected to witness a slowdown in the growth of IT services exports, although digital services will continue to grow at a fast pace,” added Srivastava. “Analytics will be one of the key contributors to growth in the BPS segment; conversely, adoption of technologies such as automation will result in a decline in contract sizes and revenue growth.”

These findings and more are discussed in Everest Group’s recently published report “Global Locations Annual Report 2016: Persistent Growth in Uncertain Times.” This research offers insights into the size and growth of the global services market, global services exports by regions and country, an update of locations activity by region and country, and trends affecting global locations (changes in investment environment and exposure to various risks). It also provides industry-leading comparison and analysis of key changes in maturity, arbitrage and potential of global delivery locations through Everest Group’s unique MAP Matrix™ analysis.

Other Key Findings

  • Asia-Pacific’s (APAC) share of the market has been declining consistently since 2012, but the region continues to account for more than 60 percent of global services FTEs. India and the Philippines account for more than 90 percent of the share in the APAC region. APAC also holds the largest share (more than 70 percent) of the global services market in terms of revenue.
  • India and the Philippines retained their leadership status in the global services market, continuing to hold more than one-third of the share in new delivery center setups globally.
  • Nearshore Europe witnessed strong growth in activity during the period of 2015-H1 2016, emerging as the second largest region after Asia Pacific, with the majority of new center activity in Poland, Ireland and Romania.
  • New center setup activity increased in 2015, surpassing pre-2013 levels and reaching a new high since 2011.
  • All locations witnessed a decrease in global in-house center (GIC) activity during the period of 2015-H1 2016. In cumulative terms, GIC setups continue to outnumber service provider setups; in terms of percentage share, however, service provider setups exceeded GIC setups in H1 2016 for the first time since falling behind in 2013.
  • Among all regions, Nearshore Europe witnessed the largest increase in new center setups in 2015 compared to 2014.

Procurement Analytics 3.0 – Leveraging Technology for Greater Business Outcomes | Webinar

Thursday, November 10 | 11 a.m. – 12 p.m. ET

Vice President of BPS Megan Weis and Senior Analyst Vatsal Gupta will be co-presenters at a November 10 webinar titled Procurement Analytics 3.0 – Leveraging Technology for Greater Business Outcomes. Paul Blake, Senior Manager of Product Marketing at GEP, will also be a lead presenter during the webinar.

To thrive in today’s complex world, buyers of global outsourcing services are looking to their providers for next-generation models that not only lower cost but also leverage new tools and technologies to drive better business outcomes. In procurement, this next-generation model is squarely focused on analytics.

For many enterprises, procurement analytics begins – and ends – with spend analysis. Best-in-class procurement teams, however, are leveraging innovative analytics techniques and technology – in new areas – to drive greater value across their source-to-pay (S2P) processes. How can your procurement team do the same? In this session, experts from Everest Group and GEP will discuss how procurement organizations can apply next-gen analytics tools across S2P processes to improve their team’s overall reach and impact on the enterprise. They will also share key levers and best practices to help you successfully operationalize advanced procurement analytics.

Presenters

  • Megan Weis, VP Business Process Services at Everest Group
  • Vatsal Gupta, Senior Analyst at Everest Group
  • Paul Blake, Senior Manager of Product Marketing at GEP

Register to attend

Don’t Turn Cross-selling in Banking into a Villain | Sherpas in Blue Shirts

A critical factor behind the Wells Fargo fiasco was the incentivizing of employees based on their ability to achieve their sales targets by cross-selling products. While this is the easiest and lowest cost model for defining and measuring sales team performance, it can lead to fraud if left unchecked. In Wells Fargo’s case, over 5,300 employees were fired for fraud that occurred across multiple years and led to the exit of CEO John Stumpf.

The scandal raises serious questions. Did Wells Fargo not have the data and analytics tools needed to identify fraud that had been going on for so long? Did the bank’s processes not have a channel to capture customer feedback on transactions to raise a flag for the fraudulent activity? Can we create employee performance measures other than sales targets?

To answer these questions, I believe banks need to go back to Services Marketing 101:

  1. Measure customer acquisition costs
  2. Develop a mechanism for measuring customer satisfaction (in near real time, on an ongoing basis, for consumers in the age of the connected ecosystem)

If Wells Fargo had measured the cost of acquisition per customer and had the ability to drill down to the sales representative level, it would have realized that the 5,300 fired employees had an unbelievably low cost of customer acquisition for the sales they made over the years – meaning they were doing either amazing or fraudulent work. Either way, the bank would have needed to investigate further.
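The drill-down described above can be sketched in a few lines. This is purely an illustrative sketch, not Wells Fargo's method: the rep names, spend figures and z-score threshold are all invented for the example.

```python
# Hypothetical sketch: flag sales reps whose customer acquisition cost (CAC)
# is anomalously low relative to peers -- a possible signal of fraudulent accounts.
# All data and the threshold below are invented for illustration.
from statistics import mean, stdev

def flag_low_cac(reps, z_threshold=-2.0):
    """reps: list of dicts with 'rep', 'spend', 'customers_acquired'.
    Returns (rep, CAC) pairs whose CAC z-score falls below z_threshold."""
    cacs = [r["spend"] / r["customers_acquired"] for r in reps]
    mu, sigma = mean(cacs), stdev(cacs)
    flagged = []
    for r, cac in zip(reps, cacs):
        if (cac - mu) / sigma < z_threshold:
            flagged.append((r["rep"], round(cac, 2)))
    return flagged

reps = [
    {"rep": "A", "spend": 50_000, "customers_acquired": 100},  # CAC 500
    {"rep": "B", "spend": 50_000, "customers_acquired": 100},  # CAC 500
    {"rep": "C", "spend": 50_000, "customers_acquired": 100},  # CAC 500
    {"rep": "D", "spend": 50_000, "customers_acquired": 100},  # CAC 500
    {"rep": "E", "spend": 50_000, "customers_acquired": 100},  # CAC 500
    {"rep": "F", "spend": 50_000, "customers_acquired": 100},  # CAC 500
    {"rep": "G", "spend": 10_000, "customers_acquired": 400},  # CAC 25 -- outlier
]
print(flag_low_cac(reps))  # [('G', 25.0)]
```

A real implementation would segment by product line and region before comparing reps, but the principle is the same: an acquisition cost far below that of peers is a signal worth investigating, whether the cause is excellence or fraud.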

These days, measuring customer satisfaction after every transaction is the norm in many industries. After every call I make using Skype, the application asks me to rate my experience. The same is true for every Uber ride I take, and each time I book a flight online.

Can’t banks do this? I believe they can. It makes sense for multiple reasons:

  1. In the age of agile development and DevOps, which drive continuous integration and continuous deployment, the customer feedback loop needs to be real time for the customer experience and service design teams to actually drive continuous improvement of their systems
  2. This helps banks develop a rich data set that can be used to drive process and product design and improvements, and also identify fraud
  3. The data can help improve the customer experience, and demonstrates to consumers that their feedback is valuable. Customers can be enticed to leave feedback through offers of loyalty points, which in turn can help improve customer retention
  4. This approach drives customer centricity, and ensures designing processes that are aligned to the needs of customers
  5. Banks can use this data to predict the needs of different customer segments, and help drive personalization of the user experience
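In its simplest batch form, the feedback mechanism described above reduces to aggregating per-transaction ratings by sales representative. A minimal sketch follows; the rep names and 1-5 ratings are hypothetical, and a production system would stream this rather than batch it:

```python
# Hypothetical sketch: aggregate post-transaction satisfaction ratings (1-5)
# per sales representative, so persistently low-rated reps can be reviewed.
from collections import defaultdict

def avg_rating_by_rep(feedback):
    """feedback: iterable of (rep, rating) pairs; returns {rep: average rating}."""
    totals = defaultdict(lambda: [0, 0])  # rep -> [sum of ratings, count]
    for rep, rating in feedback:
        totals[rep][0] += rating
        totals[rep][1] += 1
    return {rep: round(s / n, 2) for rep, (s, n) in totals.items()}

feedback = [("A", 5), ("A", 4), ("B", 2), ("B", 1), ("B", 3)]
print(avg_rating_by_rep(feedback))  # {'A': 4.5, 'B': 2.0}
```

Paired with the cost-of-acquisition view, a rep who books many sales yet scores poorly on satisfaction becomes visible immediately, instead of years later.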

While there are many more reasons why measuring customer satisfaction is valuable for banks and customers alike, let’s dive a little deeper into the idea of using it to measure sales team performance.

Banks can use the customer satisfaction measuring mechanism to capture feedback that enables measurement of the effectiveness and value added by each sales team member across the customer lifetime journey, from onboarding to purchasing products to retiring them.

By embracing a customer-centric design philosophy for all of their internal processes (not just their products and services), including employee performance appraisals with every KPI linked to customer satisfaction, banks will be able to create a truly consumer-centric enterprise.

It is true that the Wells Fargo case has made the idea of cross-selling a villain. But we must realize that the debacle was also caused by other, more pressing issues, such as top management’s failure to respond to the matter in time, a lack of data and analytics solutions to identify fraudulent transactions, and an organizational culture that promoted unethical behavior.

FinTech players in the market are looking to disrupt traditional financial services players by leveraging technology and designing for customers. However, they face challenges in gaining customer trust and loyalty while building scale. Traditional banks boast scale and years of customer trust, but we are witnessing erosion of that trust. While financial services enterprises are investing heavily to embrace the wave of digital disruption from FinTechs, they must ensure that, in pursuing this strategy, they continue to protect the competitive advantage conferred by years of customer trust.

Top Two Procurement Outsourcing Drivers: Cost Reduction, Analytics | Press Release

Procurement outsourcing market matures, with buyers seeking more value, more innovation and broader scope from service providers.

The global multi-process Procurement Outsourcing (PO) market witnessed decent growth of 10 percent in 2015, reaching US$2.3 billion in size, led by strong adoption by North American manufacturing, consumer packaged goods (CPG) and retail segments, according to new research from Everest Group.

PO buyers cite cost reduction and analytics support as their two most crucial needs. In response, service providers are increasingly adopting robotic process automation (RPA) to usher in a new round of cost savings in such areas as administering purchase orders, invoice processing, fraud/duplicate payment detection, claims processing, and conducting arrears review. Similarly, buyers are increasingly asking for analytics solutions because they enable savings and minimize financial and operational risks. Typically, buyers lack in-house analytics capabilities, tools and expertise, so they are increasingly looking to service providers to plug this gap. Buyers list analytics expertise as one of the top three service areas in which they would like to see improvement by their outsourcing partner.
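One of the RPA use cases named above, fraud/duplicate payment detection, often comes down to a simple rule over payment records. The sketch below is a hedged illustration of that idea only; the field names are hypothetical and not drawn from any provider's product:

```python
# Hypothetical sketch of a duplicate-payment check: group payments by
# (vendor, amount, invoice number) and flag every payment after the first
# for a given key. Field names are invented for this example.
from collections import defaultdict

def find_duplicate_payments(invoices):
    """invoices: list of dicts with 'vendor', 'amount', 'invoice_no', 'payment_id'.
    Returns lists of payment_ids that repeat an earlier vendor/amount/invoice_no."""
    seen = defaultdict(list)
    for inv in invoices:
        key = (inv["vendor"], inv["amount"], inv["invoice_no"])
        seen[key].append(inv["payment_id"])
    return [ids[1:] for ids in seen.values() if len(ids) > 1]

invoices = [
    {"vendor": "Acme", "amount": 1200.00, "invoice_no": "INV-001", "payment_id": "P1"},
    {"vendor": "Acme", "amount": 1200.00, "invoice_no": "INV-001", "payment_id": "P2"},  # duplicate
    {"vendor": "Beta", "amount": 950.00, "invoice_no": "INV-002", "payment_id": "P3"},
]
print(find_duplicate_payments(invoices))  # [['P2']]
```

Commercial RPA tools layer fuzzy matching (near-identical amounts, transposed invoice numbers) on top of exact-match rules like this one, but the exact-match pass is typically the first filter.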

Growth in the PO market can also be attributed to an emerging trend of buyers seeking more end-to-end coverage. PO contracts are moving towards multi-tower scope, with an increasing inclusion of finance and accounting, supply chain management and human resources outsourcing processes in addition to traditional procurement processes. 

“Organizations are seeking to transition to a cost+value model of procurement outsourcing, where the entire procurement function shifts from an operational role to a business enabler role,” said Megan Weis, vice president, Business Process Services, at Everest Group. “Service providers play a key role in this transformation effort by providing best-in-class process efficiencies, technology solutions, and supplier relationship management that collectively contribute value far beyond cost arbitrage to the organization. Value-added contributions include risk mitigation, market intelligence, supplier-led innovation and faster speed-to-market of finished products.”

Other key findings:

  • Both organic and inorganic factors contributed to the growth in 2015; however, the organic activity (renewals, scope expansion) was subdued while inorganic activity (new deals) remained strong.
  • Strong evidence of service provider switching was observed, with growing termination rates and a fall in contract renewals.
  • Contractual activity rebounded in traditional industries such as manufacturing, consumer packaged goods (CPG) and retail.
  • In 2015, market activity picked up in the Small and Medium Business (SMB) segment and the mid-market buyer segment.
  • Adoption remained strong in North America.
  • Increasing investment by service providers to enhance category expertise has resulted in buyers becoming more comfortable with outsourcing additional categories.
  • The top five players (Accenture, Capgemini, GEP, IBM and Infosys) together account for more than 70 percent of the PO market.
  • Accenture and IBM continue to lead the market in all geographies and in all major industry segments except healthcare and pharmaceuticals, where GEP commands the top position.

These results and other findings are explored in a recently published Everest Group report: “Procurement Outsourcing (PO) Annual Report – 2016 – Analytics and Beyond.” This report assists key stakeholders (buyers, service providers, and technology providers) in understanding the changing dynamics of the PO market and helps them identify the trends and outlook for 2016-2017. The report provides comprehensive coverage of the global PO market including detailed analysis of market size and growth, buyer adoption trends, PO value proposition, solution characteristics and service provider landscape.

Analytics Pose Competitive Advantage Questions for Service Providers | Sherpas in Blue Shirts

Analytics technology is like a three-legged stool: it rests on three necessary components – talent (data scientists, analysts and project managers), tools (from Excel all the way to Watson) and data. The most powerful of the three is data. The insights that analytics extracts from data are incredibly valuable in addressing business problems. As I recently explained, that value can be extremely profitable for service providers. But when it comes to designing a significant competitive advantage for providers, analytics pose strategic questions.

Should a provider sell analytics as a pure, standalone service? Or should the provider sell analytics embedded into its other service markets? Or should it do both?

And there are additional questions to consider in making that decision:

  • If the provider sells pure analytics as a service, will it cannibalize its business?
  • Or does the provider even have a choice? If it doesn’t do what the customer wants, another provider will grab that opportunity.
  • Will selling pure analytics instead of embedding it in other service offerings cause the provider to forego the opportunity to create differentiated services?

At the core of the decision whether to embed analytics in existing service offerings is the fact that the data service providers have access to is proprietary, owned by their clients. Increasingly, we are seeing that clients recognize the value of their data and are not willing to cede it.

The Analytics Holy Grail

The Holy Grail in the analytics space would be a service provider’s ability to use a client’s proprietary data in a broader context, and therefore as the base of a broader service business. But how can providers achieve this outcome?

We at Everest Group have seen examples where the service provider creates a multi-tenant platform and provides the same or a similar service to multiple clients. The provider extracts the multiple clients’ data, combines it with broader industry data and embeds the analytics from that to provide an enriched service for the clients on that platform.

An example is a service provider that leverages proprietary transportation data from state and local government clients. The provider integrates those clients’ data with broader traffic data and returns actionable insight that those states and municipalities use to reconfigure road systems around accidents and delays.

Another example is analytics services derived from the logistics systems of multiple tenants such as railroads, airlines or shipping businesses. Their combined data allows the provider to identify bottlenecks to the benefit of all. Healthcare analytics is another instance where it makes sense to sell analytics embedded in existing service offerings. Such an offering could compare patient data across large populations to extract information and patterns that allow for better patient treatment or clinical outcomes across a broad population.

Analytics technology is incredibly powerful and valuable, and service providers will enjoy profitability from this service, whether they sell pure analytics as a service or sell embedded analytics. But embedding clients’ proprietary analytics data into existing service offerings is the strategy for getting the Holy Grail.

“This Time it’s Different” are Dangerous Words | Sherpas in Blue Shirts

“This time it’s different” are often thought of as the most dangerous words on Wall Street. I’ve been in the outsourcing services industry since 1983, in the early days of outsourcing pioneer EDS. I watched the rise of the asset-intensive infrastructure space. Then I watched the rise of labor arbitrage and the enormous changes it brought to the industry. And now I’m watching the rise of automation, analytics, cognitive, and cloud bring a similar scale of disruption. I know from experience how seductive “this time it’s different” is to believe in the outsourcing industry.

In 1986, when I left EDS, its net return was 22.5 percent – very similar to the top labor arbitrage firms today. It was the height of the first wave of outsourcing. As that wave continued, the industry moved to large transactions and lower margins, and the business peaked in the mid-1990s. From 2000 on, labor arbitrage took over, and the industry went back to smaller transactions with high margins – very similar to the high margins EDS achieved back in the 1960s and 1970s. Now we’re seeing the rise of the next S-curve – automation, analytics, cognitive computing and cloud – and this space is rapidly growing and gaining share. Labor arbitrage is still growing, but it’s slowing, and profit margins are declining.

I’ve recently had private conversations with some industry executives who have been prophesying the death of labor arbitrage. Some leading executives believe the market is in for a massive shift over the next 18 months to five years in which the labor arbitrage space will be completely disintermediated. I think this is unlikely.

So is it different this time?

Like the asset-intensive space, I think the labor arbitrage space will be disintermediated. But the disintermediation of the asset-intensive space started in 1995-1996, and here we are 20 years later, and we still have asset-intensive outsourcing. Yes, EDS was bought by HP and is now combined with CSC, and IBM is still in the game. There’s still a significant infrastructure market.

I expect 20 years from now that there will still be a meaningful market for labor arbitrage, but it won’t garner the same profits as today. And I expect the shift from labor arbitrage will be a slower move than 18 months to five years in terms of having a dramatic and drastic effect on existing workloads.

Having said that, I do expect the value will move to new areas – just as it did in the past as the market evolved. And we’re currently seeing this happen with automation and other digital technologies. Market capitalization and growth should be in the new models, just like it happened for EDS, HP, CSC and IBM.

Good news and bad news

I predict difficult years ahead for the arbitrage business and radical change for the current players. The only industry leader that successfully migrated from the asset-intensive space to the labor arbitrage space was IBM. Likewise, this time I don’t expect many existing arbitrage players to migrate successfully. We saw massive consolidation in the infrastructure space, and I expect to see consolidation in the labor arbitrage space too.

Yes, I understand this time it’s different in that minority shareholder laws in India will create more resistance to consolidation. But I think the change is an irresistible force meeting a very movable object. I believe that the industry will consolidate, growth will continue to slow, and profit margins will come down, just as they did in the asset-intensive infrastructure space.
At the same time, automation, analytics, cognitive and cloud technologies will shift the industry to new business models, different commercial relationships, different pricing structures, and different kinds of risk sharing – just as happened when labor arbitrage entered the asset-intensive infrastructure space. The good news is that these new models will bring new providers into the services space. The bad news is that these differences create a high barrier for incumbent providers when it comes to changing their offerings. So, once again, this time it won’t be different.

How can we engage?

Please let us know how we can help you on your journey.

Contact Us
