
Appirio to be Wipro’s SaaS Weapon | Sherpas in Blue Shirts

Wipro’s acquisition of Indianapolis-based Appirio – which provides cloud consulting services and helps organizations move applications to the cloud – will expand its portfolio in the SaaS market and highlights a shift in Wipro’s strategy from traditional outsourcing provider to new-age company placing a long-term bet on digital transformation. It will also add further steam to the cloud race among top-tier IT service providers.

Appirio Fast Facts
Appirio was founded in 2006 and has received US$111.7 million in funding from investors including Fidelity Management, General Atlantic, GGV Capital, Salesforce, and Sequoia Capital. It has around 1,250 employees; offices in Dublin, Indianapolis, Jaipur, London, San Francisco, and Tokyo; and renowned clients including Facebook, Home Depot, Honeywell, NYSE, Sony PlayStation, Starbucks, and Toyota, to name just a few.

What can Wipro aim to gain?
Strategic Partner for Salesforce Implementation – Salesforce implementation contributes the most revenue to Wipro’s SaaS implementation services, and Wipro is known for its complex transformation-led projects in the CRM space. The Appirio acquisition will further strengthen its position as a leading Salesforce consultant.

Cloud Transformation Expertise – Wipro will be able to consolidate its existing Salesforce and Workday cloud application practices to launch a new practice with a comprehensive suite of cloud services under the Appirio brand. The new brand, if marketed well, can bring an uptick in Wipro’s CRM market share.

Enhanced Customer Experience – Appirio has established its name in the market by delivering an exceptional customer experience to its clients. Wipro’s global delivery capabilities combined with Appirio’s customer focus can help cross-sell CRM services to its existing client base.

Improved Market Outlook – The acquisition will help Wipro move marginally toward its highly aspirational target of exceeding US$15 billion in revenue by 2020. Wipro forecasted a muted outlook for the July-September 2016 quarter, and posted a 2.6 percent rise in revenue in the previous quarter. Appirio has been a strong competitor to the likes of Accenture and Deloitte, which have gained considerable share in the cloud market, and joining forces with Wipro will help it achieve its vision.

Acquisition Strategy – Including the Appirio acquisition, Wipro has spent US$1.13 billion in the last year on buying companies. It acquired Denmark-based design firm DesignIT for US$95 million, German technology company Cellent for US$78 million, and U.S.-based technology firm HealthPlan Services for US$460 million. Wipro’s strategy to enhance its portfolio of services by acquiring niche startups, a clear departure from its previous strategy of fewer acquisitions, will get a boost from the Appirio purchase.

What should Wipro be careful about?

Topcoder Marketplace Integration – In 2013, Appirio bought Topcoder, a leading crowdsourcing marketplace, which it combined with CloudSpokes to create a community of 1,000,000 designers, developers, and data analytics experts. It will be interesting to see how Wipro integrates Topcoder and aligns with the interests of community members, who are usually averse to working with an IT service provider.

Reaping long-term synergies – As exhibited by many unsuccessful acquisitions in the industry, it is not easy to reap benefits and synergies from a strategic acquisition. It will be worth watching how Wipro expands its digital transformation capabilities as it integrates Appirio into its cloud transformation practice.

Race for Cloud Consulting Acquisitions
While acquisitions in the CRM space have not been as pronounced as in the digital space, providers are increasingly acquiring cloud companies and partners to promote their cloud-first agendas:

  • In September 2015, Accenture bought Cloud Sherpas, a leader in cloud advisory and technology services specializing in Google, Salesforce, and ServiceNow
  • In April 2016, IBM bought Bluewolf, one of Salesforce’s top partners and a global leader in cloud consulting and implementation services
  • In March 2016, IBM acquired Optevia, a U.K.-based SaaS system integrator that specializes in delivering solutions, specifically Microsoft Dynamics, to government entities
  • In January 2016, Accenture acquired Netherlands-based CRMWaypoint, a company that specializes in Salesforce cloud solutions for sales, service, and marketing
  • In January 2016, Capgemini acquired Oinio, a leading European Salesforce partner, to augment its capabilities in Salesforce solutions and platform across Europe and Asia
  • In January 2016, Cognizant acquired KBACE Technologies, a global technology company specializing in cloud strategy and implementation, and a leading Oracle Cloud partner
  • In September 2015, Microsoft purchased the key product and technology assets of its Dynamics CRM partner ADxstudio.

With the completion of this acquisition, what will the future hold for independent cloud consulting firms – and Appirio competitors – such as Acumen Solutions and Celigo? Will top IT service providers continue to quickly expand their capabilities through acquisitions, or instead make long-term investments in reskilling their existing workforces? Only time will tell, but we’ll be sure to keep our eyes on this space.


MACRA Nails it as the Next Big Bang of Reforms in Healthcare | Sherpas in Blue Shirts

On Friday, October 14, the Centers for Medicare & Medicaid Services (CMS) in the United States released a humongous, 2398-page rule to implement new value-based payment programs under the Medicare Access and CHIP Reauthorization Act (MACRA).

This release is a significant step forward in streamlining Medicare payments, and establishing what “value” will mean in the much debated Value-Based Reimbursement (VBR) programs.

Here’s our initial take on this release, in order of what I liked most about the rules.

CMS is making the right noises: As CMS Acting Administrator Andy Slavitt put it, “…changes to the rule were to help physicians focus on delivering care and seeing patients instead of performing administrative tasks.” The emphasis on administrative tasks captures the point of conflict between a right-thinking, efficiency-focused regulator and unnecessarily overburdened physicians.

How is some of this getting addressed?

Reduces confusion over quality improvement: The new set of rules consolidates three existing quality reporting programs (the Physician Quality Reporting System, the Value-Based Payment Modifier, and Meaningful Use) and a new performance category into a single system, the Merit-based Incentive Payment System (MIPS). The definition of “merit,” or value, has never been clearer. Here is a snapshot of the scoring model that defines the four performance categories and their weights:

[Figure: MIPS scoring model showing the four performance categories and their weights]
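To make the weighting mechanics concrete, here is a minimal Python sketch of how a composite MIPS score could be computed from per-category scores. The category names and weights below are illustrative placeholders, not the official figures from the rule.

```python
# Minimal sketch of a weighted MIPS composite score. The weights are
# illustrative placeholders (assumptions), not the official values in the rule.

CATEGORY_WEIGHTS = {
    "quality": 0.60,                     # placeholder weight
    "advancing_care_information": 0.25,  # placeholder weight
    "improvement_activities": 0.15,      # placeholder weight
    "cost": 0.00,                        # placeholder weight
}

def mips_composite_score(category_scores):
    """Weighted sum of per-category scores, each on a 0-100 scale."""
    return sum(weight * category_scores.get(category, 0.0)
               for category, weight in CATEGORY_WEIGHTS.items())

# Example: a clinician strong on quality and average elsewhere.
print(mips_composite_score({
    "quality": 85,
    "advancing_care_information": 70,
    "improvement_activities": 60,
    "cost": 0,
}))  # -> 77.5
```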

Pick Your Pace (PYP): To make the above operational, CMS is allowing providers to pick their own pace (see Andy Slavitt’s blog for more details) and either choose from three data submission options or join an Advanced Alternative Payment Model (APM):

  • Test the program
  • Submit 90 days of data
  • Submit a full year of data

Enabling consortiums: CMS now allows MIPS reporting as a group, enabling smaller providers to get a better deal. This means that clinicians sharing a common Tax Identification Number (irrespective of specialty or practice site) can band together to receive payments based on the group’s performance. This will foster necessary consolidation in the ambulatory space.

Relaxes exclusion norms through APMs: Providers not eligible for MIPS can still receive a bonus payment for meeting performance criteria through qualifying APMs. The inclusion criteria are clearer than before, and the nervousness caused by stringent exclusion norms is largely addressed.

Last, but not least, provides a further fillip to IT: While use of certified EHR technology will continue to give providers brownie points for performance, the following five required measures that CMS has mandated for providers will further boost technology adoption:

  • Security risk analysis
  • E-prescribing
  • Provide patient access
  • Sending summary of care
  • Request/accept summary of care

Net-net, this new rules release is a great step toward settling the debate on “value,” and it will energize the healthcare industry to spend more on technology. As you wade through the 2,398 pages, watch this space for more of our explanations and perspectives on this topic.

Don’t Turn Cross-selling In Banking into A Villain | Sherpas in Blue Shirts

A critical factor behind the Wells Fargo fiasco was the incentivizing of employees based on their ability to achieve their sales targets by cross-selling products. While this is the easiest and lowest cost model for defining and measuring sales team performance, it can lead to fraud if left unchecked. In Wells Fargo’s case, over 5,300 employees were fired for fraud that occurred across multiple years and led to the exit of CEO John Stumpf.

The scandal raises serious questions. Did Wells Fargo not have the data and analytics tools needed to identify fraud that had been going on for so long? Did the bank’s processes not have a channel to capture customer feedback on transactions to raise a flag for the fraudulent activity? Can we create employee performance measures other than sales targets?

To answer these questions, I believe banks need to go back to the basics of services marketing:

  1. Measure customer acquisition costs
  2. Develop a mechanism for measuring customer satisfaction (in almost real time and on an ongoing basis, for consumers in the age of the connected ecosystem)

If Wells Fargo had measured the cost of acquisition per customer and had the ability to drill down to the sales representative level, it would have realized that the 5,300 fired employees had an unbelievably low cost of customer acquisition for the sales they made over the years, meaning they were doing either amazing or fraudulent work. In either case, the bank would have needed to explore further.
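As a simple illustration of the first point, here is a hedged Python sketch of how a bank might compute customer acquisition cost at the sales-representative level and flag implausibly low outliers. The data model, field names, and threshold are hypothetical, not a description of Wells Fargo’s actual systems.

```python
# Illustrative sketch only: compute cost of customer acquisition per sales
# representative and flag reps whose figures look too good to be true.
from statistics import mean, stdev

def acquisition_cost_per_customer(sales_costs, customers_acquired):
    """Both arguments are dicts keyed by a hypothetical sales-rep id."""
    return {rep: cost / max(customers_acquired.get(rep, 0), 1)
            for rep, cost in sales_costs.items()}

def flag_outlier_reps(cac_by_rep, z_threshold=-2.0):
    """Flag reps whose acquisition cost sits far below the peer average."""
    values = list(cac_by_rep.values())
    if len(values) < 2:
        return []
    mu, sigma = mean(values), stdev(values)
    return [rep for rep, cac in cac_by_rep.items()
            if sigma > 0 and (cac - mu) / sigma < z_threshold]
```

Flagged representatives would then be reviewed alongside customer feedback before drawing any conclusion about amazing versus fraudulent work.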

These days, measuring customer satisfaction after every transaction is the norm in many industries. After every call I make using Skype, the application asks me to rate my experience. The same is true for every Uber ride I take, and each time I book a flight online.

Can’t banks do this? I believe they can. It makes sense for multiple reasons:

  1. In the age of agile development and DevOps driving continuous integration and continuous deployment, the customer feedback loop needs to be real time for the customer experience and service design teams to actually drive continuous improvement of their systems
  2. This helps banks develop a rich data set that can be used to drive process and product design and improvements, and also identify fraud
  3. The data can help improve the customer experience, and demonstrates to consumers that their feedback is valuable. Customers can be enticed to leave feedback through offers of loyalty points, which in turn can help improve customer retention
  4. This approach drives customer centricity, and ensures that processes are designed around the needs of customers
  5. Banks can use this data to predict the needs of different customer segments, and help drive personalization of the user experience

While there are many more reasons why measuring customer satisfaction is valuable for banks and customers alike, let’s dive a little deeper into the idea of using it to measure sales team performance.

Banks can use this customer satisfaction measurement mechanism to capture feedback that enables measurement of the effectiveness and value added by each sales team member across the customer lifetime journey, from being onboarded onto the bank’s systems, to purchasing products, to retiring them.
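Below is a minimal sketch of what such feedback capture and roll-up could look like; the event schema, lifecycle stage names, and rating scale are assumptions for illustration only.

```python
# Illustrative sketch: record a satisfaction rating after each interaction and
# aggregate average ratings per sales representative and lifecycle stage.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class FeedbackEvent:
    customer_id: str
    rep_id: str
    stage: str    # hypothetical stages: "onboarding", "purchase", "retirement"
    rating: int   # 1 (poor) to 5 (excellent)

def average_rating_by_rep_and_stage(events):
    totals = defaultdict(lambda: [0, 0])        # key -> [rating sum, count]
    for e in events:
        totals[(e.rep_id, e.stage)][0] += e.rating
        totals[(e.rep_id, e.stage)][1] += 1
    return {key: total / count for key, (total, count) in totals.items()}

# Example: averages like these could feed a representative's appraisal KPIs.
events = [FeedbackEvent("c1", "rep7", "onboarding", 5),
          FeedbackEvent("c2", "rep7", "onboarding", 3),
          FeedbackEvent("c1", "rep7", "purchase", 4)]
print(average_rating_by_rep_and_stage(events))
# -> {('rep7', 'onboarding'): 4.0, ('rep7', 'purchase'): 4.0}
```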

By embracing a customer-centric design philosophy for all of their internal processes (not just their products and services), including performance appraisals in which every employee KPI is linked to customer satisfaction, banks will be able to create a consumer-centric enterprise.

It is true that the Wells Fargo case has made the idea of cross-selling a villain. But we must realize that the debacle was also caused by other, more pressing issues: top management’s failure to respond to the matter in time, the lack of data and analytics solutions to identify fraudulent transactions, and an organizational culture that promoted unethical behavior.

FinTech players are looking to disrupt traditional financial services players by leveraging technology and designing for customers. However, they face challenges in gaining customer trust and loyalty while building scale. Traditional banks boast scale and years of customer trust, but we are witnessing erosion of that trust. While financial services enterprises invest heavily to embrace the wave of digital disruption from FinTechs, they must ensure that, in pursuing this strategy, they continue to protect the competitive advantage of years of customer trust.


Creating A Successful Business Case for RPA in GICs | Sherpas in Blue Shirts

With increasing pressure on Global In-house Centers (GICs) for additional value creation, and the exhaustion of traditional means, Robotic Process Automation (RPA) – an automation technology that can handle rules-based and repetitive tasks without human intervention – is fast emerging as the key lever to drive productivity.

RPA has the potential to reduce GIC headcount by 25-45 percent, depending upon the process type and extent of deployment. This results in significant cost savings for the GIC, including salaries and benefits for delivery team members replaced by the software bots, and non-people costs such as facilities, technology, and other operating expenses. Typical offshore GICs supporting horizontal functions such as F&A from Tier-1 Indian locations are likely to witness cost savings of 20-25 percent through RPA.

Beyond cost savings, RPA provides improved service delivery in the form of process quality, speed, and scalability, and better ability to manage through improved governance, security, and business continuity.

Development of an RPA solution requires substantially less time than comparable technologies such as Enterprise Application Integration (EAI) and BPM workflow solutions. This, in turn, reduces the time for RPA implementation and value realization, and offers a quick return on investment: typically only six to nine months to recover the initial investment. Further, RPA is typically deployed in a phased manner. The relatively short payback period on the initial investment means that subsequent phases can be self-funded from the savings realized in earlier phases.

GICs typically consider a minimum of 15 percent cost savings when developing an RPA business case. The savings are dependent on a number of factors, which can be adjusted suitably to build a favorable business case. Highlighted below are the key factors impacting the business case.

[Figure: Key factors impacting the business case for RPA in GICs]

Potential extent of automation
Headcount reduction due to RPA varies with the potential extent of automation that can be achieved, which in turn impacts the cost savings. As RPA’s sweet spot is transactional/rules-based processes, there is considerable potential for headcount reduction and cost savings when it is deployed to handle these processes.

Number of FTEs replaced per RPA license
The number of FTEs that can be replaced per robot varies by process and by the type of RPA solution. The higher the number of FTEs replaced, the greater the cost savings. Both the number of FTEs replaced per robot and the cost savings can be increased by targeting standard transactional processes with significant volume.

Recurring cost of RPA implementation
Recurring costs for RPA – such as licensing, hosting, and monitoring – vary significantly by vendor and type of solution, in turn impacting the cost savings. The lower the recurring costs, the higher the cost savings.
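To show how these factors interact, here is a rough, back-of-the-envelope Python sketch of an RPA business case. Every input value is a hypothetical assumption, not Everest Group data, and should be replaced with a GIC’s actual figures.

```python
# Illustrative RPA business-case sketch; all inputs are hypothetical assumptions.

def rpa_business_case(baseline_fte_count, loaded_cost_per_fte, automation_share,
                      ftes_replaced_per_bot, annual_cost_per_bot,
                      one_time_implementation_cost):
    ftes_replaced = baseline_fte_count * automation_share
    bots_needed = ftes_replaced / ftes_replaced_per_bot
    gross_annual_savings = ftes_replaced * loaded_cost_per_fte
    recurring_bot_cost = bots_needed * annual_cost_per_bot  # licensing, hosting, monitoring
    net_annual_savings = gross_annual_savings - recurring_bot_cost
    baseline_cost = baseline_fte_count * loaded_cost_per_fte
    return {
        "ftes_replaced": round(ftes_replaced, 1),
        "net_annual_savings": round(net_annual_savings),
        "savings_pct_of_baseline": round(100 * net_annual_savings / baseline_cost, 1),
        "payback_months": round(12 * one_time_implementation_cost / net_annual_savings, 1),
    }

# Hypothetical example: a 200-FTE offshore F&A process, 30 percent automatable.
print(rpa_business_case(baseline_fte_count=200, loaded_cost_per_fte=30_000,
                        automation_share=0.30, ftes_replaced_per_bot=3,
                        annual_cost_per_bot=15_000,
                        one_time_implementation_cost=1_000_000))
# -> roughly 25 percent net savings and an eight-month payback, broadly in line
#    with the ranges discussed above.
```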

For more drill-down details, please refer to Everest Group’s report, Business Case for Robotic Process Automation (RPA) in Global In-house Centers (GICs). This report assesses the business case for adoption of RPA in offshore GICs, with information on cost savings across individual components and the associated payback period. It also analyzes the impact of change in the above factors on the business case, and the threshold limits for each in order to have a justifiable business case. Further, it includes case studies on GICs that have adopted RPA, along with key learnings and implications.


CFOs Must Improve Communication About Resources for Developing Apps | Sherpas in Blue Shirts

The most important investments companies are making right now are around technology. And that’s no wonder. Digital opportunities within application services alone have proven successful in achieving efficiency, growth, and enablement. It’s so easy to book a business reservation or check in for your flight on your smartphone, for instance. And it’s easy to communicate with your grandmother, children, or friends on Facebook, or to share news of a new product through a YouTube video and watch it go viral with millions of people sharing it. But building digital apps is neither easy nor cheap. Simplicity in interactive apps comes at a very high price. Many initiatives are not adequately funded because the CFO does not adequately communicate the investment and effort required.
There are a lot of false expectations around building digital apps.

Read more at Peter’s blog

How Digital Technology is Changing the Status Quo in Services | Sherpas in Blue Shirts

What business wouldn’t want a 700% improvement in developer productivity or a big reduction in testing time or reduced time to market? The move to digital technologies carries the promise of dramatic transformation. With digital technologies, companies can do things that they have never done before. Digital is not a matter of delivering existing services in a different way; it’s delivering a new kind of service that really disrupts and changes the very fabric of how a company does business. That’s the digital promise. In a prior blog, I explained that technology is fundamentally changing how services are currently done. Now let’s look deeper into what the technology is really doing.

With digital technologies, companies have greenfield opportunities: to do new and different things, things that result in breakthrough performance in a critical business function. Let’s look, for example, at the customer onboarding process for a healthcare insurance company. It’s about a two-month process from the point that a new customer agrees to come on board to the point where the customer is fully registered and operating seamlessly in the company’s systems. The promise of the digital breakthrough is that a company can complete that process in 20 minutes – orders of magnitude faster than two months.

And that’s not all. The customer experience is also dramatically different. Customers become frustrated with the bureaucratic effort to enroll in a healthcare system and all the people involved (HR, actuarial, validating, authorizing, registering, etc.). Those multiple departments take tremendous resources and tremendous time. Changing a two-month process involving eight or nine departments to a seamless, 20-minute process delivers tremendous value to the customer, the insurance company and the customer’s employer.

The digital promise moves a business process from the provision of technology to the provision of a service. But the hard truth is that the old shared services IT construct simply won’t enable a company to achieve the digital promise.

In a shared services type of IT structure, the company focuses on unit costs, component excellence, and component cost, as well as trying to optimize each of those nine departments and their interactions. It’s a nightmare. And it’s very costly. Digital breaks that down and makes it “one and done.” For instance, an onboarding employee can work through the app, potentially with a cognitive agent guiding the employee, prepopulating the app, and doing much of the work. Consequently, there is no longer a need for the functions of those nine departments, certainly not all of them.

The implications for an organization’s governance, policies, philosophies, procedures, and people are dramatic, which is why few firms achieve breakthrough performance. The sheer scale of change holds them back from moving in that direction.

A 700% improvement in productivity is astounding and hard to believe, as is a 59-day reduction in time to market and $600 million in savings over three years. These breakthrough performance outcomes may seem extreme, but they were achieved by leading financial institutions. Digital technologies such as analytics, automation, cloud and cognitive (AACC) enable companies to dramatically change their productivity and costs of delivering services.

Achieving such dramatic performance breakthroughs is difficult. But even when companies deploy these technologies in a less comprehensive way, or not in a breakthrough paradigm, they still impact the market. For example, service providers are competitively bidding and able to deliver the same services at 30-50 percent less. While that’s not a 700% improvement in productivity, it is indeed a big change. Interestingly, that improvement is as large as, or larger than, what labor arbitrage delivered 10 or 15 years ago.

The times are changing – again – in the services world.

Dominating themes at the #NASSCOM Design and Engineering Summit 2016 | Sherpas in Blue Shirts

Digital technologies are fundamentally changing the demand ethos of the US$75 billion engineering and R&D (ER&D) global sourcing market, which is expected to grow at a CAGR of more than 18 percent over the next five years. With rapidly evolving consumer needs, an increase in global regulatory pressures, the rise of the shared economy, increasingly complex security needs, and technology’s shift from enabler to disruptor, the following are the major themes I expect to dominate the NASSCOM Design and Engineering Summit 2016, which is being held in Bangalore on October 5 and 6:

    1. The connected digital ecosystem: The proliferation of smart devices and radical improvements in connectivity infrastructure are shaping the evolving digital ecosystem of everything. Orchestrating this connected digital ecosystem and creating products that tap into it are creating a new demand portfolio of ER&D services across industries. Think of rapid consumerization in the healthcare industry with the increasing use of connected smart medical devices, the connected and autonomous vehicles defining the future of mobility in the shared economy, or the convergence of machine-to-machine (M2M) technologies and advanced analytics driving the Industry 4.0 revolution.
    2. Designing for the future: Enterprises must understand the needs of tomorrow’s customers, and will need to push the boundaries of innovation and design thinking to engineer products that sit at the intersection of cutting-edge technologies and re-imagined processes and business models.
    3. Smart, smarter, and smartest: The rise of cognitive computing technologies has pushed the boundaries of process and task automation to create smart products. Research advances in Artificial Intelligence (AI), machine learning, and edge computing will drive development of products that dramatically improve the user experience and provide convenience beyond expectations for consumers and employees alike. This creates demand for a talent model with hybrid skills spanning product engineering and design, domain knowledge, and the ability to leverage cognitive technologies.
    4. Making sense of data: Enterprises are collecting a lot of data through a multitude of external and internal data sources, and are looking at how to use it to enhance product design and engineering processes, reduce costs, improve quality, and meet evolving user expectations. Enterprises in the retail, defense, media, and financial services industries have been at the forefront of using data and analytics to answer these questions. Demand from these industries is driven by the adoption of increasingly sophisticated analytics initiatives that help deliver competitive advantage. Industries including manufacturing, energy, telecoms, and healthcare and life sciences are rapidly adopting big data and analytics technologies.
    5. Real use cases beyond the cool stuff of AR/VR: Augmented reality and virtual reality technologies have great potential in areas such as remote monitoring and predictive maintenance, training, and simulated testing environments. Expect to hear more use cases for AR and VR technologies.
    6. Software-defined everything: “Software eats everything” across all industries – software-defined infrastructure, software-defined manufacturing, software-defined networking, software-defined datacenters, and so on. The delivery of software products as-a-service, the ability to remotely support and maintain customer premise equipment, and the increasing demand for configurable over customized software products are creating a new demand paradigm for ER&D services in the software products industry.
    7. Time-to-market: Speed is the new currency in the product engineering world. Sourcing has enabled enterprises not just to reduce costs but to drive agility and flexibility to respond to market volatility and constantly changing consumer demands. As technology becomes core to all activities, concepts such as agile and DevOps are becoming relevant across the ER&D services industry value chain.
    8. Standards, security, and compliance: Security is among the top three priorities for C-suite executives globally. In the age of connected digital ecosystems, building security into product design is becoming an absolute necessity. Compliance, too, is a critical demand driver for the ER&D services industry.

I look forward to interesting discussions on these and other topics with the engineering services enterprises and vendors during the #NASSCOM Design and Engineering Summit. If you’re there in person, feel free to contact me or my colleagues H Karthik and Bhawesh Tiwari.

Click here to read about Everest Group’s latest research on the engineering services global talent spot, and here, here, and here to check out detailed insights from this research.

The Services Industry is Changing in Radical Ways | Sherpas in Blue Shirts

“The times they are a-changin’,” as the old Bob Dylan song says. In a recent post, I blogged about the dramatic changes we’ll see in the services industry over the next two or three years. Let’s look at two more forces driving industry change.

A major force is the maturity of the labor arbitrage model in services. Ten years ago, it was very difficult and intimidating for a company to set up facilities in India (or another low-cost location), hire local teams to run them, and build the policies and processes that enable operating with excellence. But the market has matured, and that’s no longer the case. It is now much easier for companies to build offshore capabilities themselves rather than relying on third parties. Today there is a ready-made pool of facilities that they can rent, management teams they can hire, and well-understood offshore policies, programs, and tax rules that can be applied.

So the do-it-yourself barrier has dropped as the market has matured, leading to the current growth and capability of GICs, which I described in my recent blog. When you combine this with companies’ desire to shift from unit/service component value to business outcome value, it creates a dramatic shift to bring outsourced work back in house to internal GICs. I expect this trend to continue.

A third factor driving change recently emerged but is already making an impact by giving services customers more choices in models. This change is based on the AACC technologies – automation, analytics, cognitive and cloud. The service models coming with these technologies favor small, cross-functional teams that are aligned directly with the business users around producing business value. Moreover, the teams are organized around end-to-end services. The result? Businesses are no longer looking for large offshore factories of labor.

As the AACC technologies and capabilities become more mature, I believe they will also shift the balance, at least in many areas, toward a do-it-yourself model over a third-party service provider.

So times are changing. I’m not saying that the third-party service model will be eliminated. I’m just saying that this model is about to change in some radical ways.


How a PMO May Cause Your Transformation Initiative to Fail | Sherpas in Blue Shirts

Enterprises have undertaken transformation initiatives for decades, and there is a bevy of books, articles, white papers and consultants that tout how to ensure transformation success. One of the top best practices touted is to use a central Project Management Office (PMO) that takes responsibility for the transformation plan and holds the organization accountable for achieving it. It sounds like a great idea. The problem is it doesn’t work very well.
I’m not saying companies shouldn’t have PMOs. Project Management Offices are known for effectiveness in tracking outcomes, directing resources and investments for the project, documenting decisions and reporting. But they’re not known for orchestrating change. In fact, the track record of PMOs for successfully driving and achieving change in a complex transformation initiative is very low.

Read more at Peter’s blog

Forces Driving Change in the Services Industry | Sherpas in Blue Shirts

The outsourcing industry is at an inflection point. It reminds me of Bob Dylan’s hit song from the 1960s, “The Times They Are A-Changin’.” I believe the industry will undergo dramatic change over the next two to three years.

Three forces are driving the change. The first is a shift in user and company expectations. I’ve blogged before about the watermelon phenomenon. Although performance dashboards indicate green for service levels and KPIs, the true picture is often red when it comes to user satisfaction with IT services.

Why is the satisfaction indicator red? One reason is a shift in users’ perception of what value is. Companies historically thought about value as a function of quality and cost. Quality was defined in service levels. Cost was defined as unit costs (per hour, per transaction, per server, etc.). But users’ focus has shifted from these component measures to the value of the business function.

As an example, consider the business function of employee onboarding. Value used to be viewed as the cost of providing the applications and the quality of those apps (reliability and resilience, and whether they provide the necessary functionality). But the perception of value has shifted. Now it could be tied to how new employees view their onboarding experience and whether or not the process results in employees who are well prepared to work in the company.

Another reason the user satisfaction indicator is red is the dimension of speed. Companies have always been aware of the need for speed, but historically quality took precedence over speed. Increasingly we find users value speed more. It’s not speed in developing an application; it’s the speed to change the business functionality (such as the employee enrollment system in the example above). It doesn’t matter how quickly an IT group or a third-party service provider can make a change to the application or server; it’s how quickly they can change the functionality.

Users no longer associate value with the delivery components that make it up (such as the infrastructure uptime).

This shift in users’ perception of value is a fundamental reason why the services world is changing, and it’s driving different behaviors and decisions.

One way the change is manifesting is that companies that have outsourced processes now want to bring the work back in house so they can focus more on the business value and less on the service delivery components. The preference for insourcing and GICs is gathering momentum. This is evident in statements from leaders at prominent companies, such as:

  • “Shell is clearly on a path to insource our project delivery capabilities.” (Jay Crotts, CIO, Shell)
  • “We believe having our own center and doing more of the work ourselves will lead to lower turnover of people, we’ll have people who really understand our systems, people who are passionate.” (Therace Risch, CIO, JCPenney)

Historically, GICs or captives performed only low-cost delivery work. But they’re now gaining share in the marketplace for sophisticated work. Companies are investing in their GICs as they recognize the need for stronger alignment between IT and business users to drive value. The GICs are still located offshore, but the functions are back in house instead of performed by a third party.

The times, they are a-changing. In my next blog post, I’ll discuss two other forces driving change in the services industry.