
Automation Introduces New Business Risks | Sherpas in Blue Shirts

Automation has the potential to introduce different kinds of business risks, and risk at a different order of magnitude. The new risks manifest differently and carry greater consequences than those in a normal business process. The issue is the difference between type 1 and type 2 errors.

  • Type 1 error. This is an isolated error, such as a mathematical mistake on a single invoice. The consequence is that you under-bill or over-bill a client. Once you reconcile the error, you may have lost a revenue opportunity or may have to rebate the client for the difference in overcharging.
  • Type 2 error. This is a systematic error built into the process itself; an example is under-billing all of your clients. The consequence is often 10 times the impact of a type 1 error, or more, as the sketch below illustrates.
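
To make the order-of-magnitude difference concrete, here is a minimal sketch using entirely hypothetical invoice figures; the point is only that a type 2 error repeats across every client and every billing cycle until someone catches it.

    # Hypothetical illustration of isolated (type 1) vs. systematic (type 2) billing errors.
    # All figures below are made up for the example.

    clients = 500                        # number of clients billed each cycle
    error_per_invoice = 50.00            # a $50 under-billing mistake

    # Type 1: a one-off mistake on a single invoice.
    type1_impact = error_per_invoice     # $50, corrected once reconciled

    # Type 2: the same mistake baked into the automated billing logic, so it
    # repeats on every invoice, every cycle, until someone notices.
    cycles_before_detection = 6
    type2_impact = error_per_invoice * clients * cycles_before_detection

    print(f"Type 1 impact: ${type1_impact:,.2f}")    # $50.00
    print(f"Type 2 impact: ${type2_impact:,.2f}")    # $150,000.00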

We at Everest Group have discussed with clients this impending shift of business processes to a far more automated landscape in which type 2 errors can be inadvertently introduced.

In a previous blog, I talked about automation bias and how people tend to blindly accept or believe whatever comes out of an automated tool. This increases the likelihood that type 2 errors will occur.

As services industrialize and business processes are automated at broad scale, we must be aware of type 2 errors and guard against them. This is why many of the leading firms looking at adopting automation, cognitive computing, and robotics are considering implementing a Center of Excellence (CoE) to help the business understand the changes that accompany automation. A CoE can help educate employees to guard against automation bias and the type 2 errors that could inadvertently be institutionalized in automated approaches to business processes.



Automation Bias | Sherpas in Blue Shirts

We’re at an inflection point in the ITO and BPO services world where we’re about to see a new level of technology: automation. On the whole, automation is a good thing. But there are some significant aspects we should be aware of. One is automation bias. And it’s dangerous.

When we move to automation, whether it’s cognitive computing or replacement of repetitive tasks, the people who are in the process become dependent on the automation. In fact, not only do they become dependent, they start to believe that whatever comes from the computer is truth. They take it for granted that the results are accurate. This is automation bias.

As a simple example, when you use a calculator, you quickly start to trust whatever the calculator results are. We have blind trust in automated tools.

Why is automation bias so dangerous?

A computer will slavishly do what it’s told to do, or will run down the same cognitive analysis it has done in the past. When the world changes, the computer may not recognize that the world has changed. The change can come from one of the data sources, or from an upstream or downstream change in a business process. Although people in the business process should recognize the change, automation bias may cause them to miss it because they believe that everything coming out of the computer is correct. This is a significant business risk.

The fact is automated tools are fallible. We all know that the world constantly changes, and automation bias presents the risk that the computer won’t recognize the change.

We’re on the verge of adopting robotics and automation at a scale we have never attempted before. This will dramatically change how we perform business processes and how we run data centers. Organizations going down the automation path need to be aware of automation bias and build safeguards against it.
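
As one hedged illustration of what such a safeguard could look like, the sketch below checks an automated result against an independently derived baseline before the business accepts it; the figures, tolerance, and baseline source are assumptions invented for the example.

    # Hypothetical safeguard against automation bias: never accept an automated
    # result without an independent sanity check. Thresholds are illustrative only.

    def within_tolerance(automated_total: float,
                         independent_baseline: float,
                         tolerance: float = 0.10) -> bool:
        """Return True if the automated total is within `tolerance` (here 10%)
        of a baseline calculated outside the automated process."""
        if independent_baseline == 0:
            return False  # cannot validate, so always escalate
        deviation = abs(automated_total - independent_baseline) / independent_baseline
        return deviation <= tolerance

    monthly_billing_total = 450_000.00   # output of the automated process
    trailing_average = 510_000.00        # e.g., a trailing 12-month average

    if not within_tolerance(monthly_billing_total, trailing_average):
        print("Deviation outside tolerance; route the result to a human for review")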



The Challenge of Security Services in the Internet of Things | Sherpas in Blue Shirts

The first thing to recognize about Internet of Things (IoT) security is that it is not “one and done.” The fight to secure your IoT environment is an activity that continues in perpetuity. The resources you initially allocate will be substantial, but they will escalate and costs will increase over time. This is a very different way to think about a process than the services world’s normal engineering approach, where a large up-front cost gets smoothed out so that you spend less and less money over time.

In the IoT, we can segment security into two kinds. Each is approached differently, and each requires a different level of funding.

The first segment is security at the edge, or device level. Here you need to be sure that each level is secure and monitored, from the device at the edge all the way through the network and the apps in an ecosystem. Think of this as a hygiene or compliance role in which you need to ensure that security exists, that it’s adequate, that you monitor it for effectiveness, and that any attack is contained to a small segment and can’t spread. Those are the things you need to look for at the compliance level.
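
To make the hygiene role concrete, here is a minimal, hypothetical sketch of a device-level compliance sweep; the inventory fields and rules are illustrative assumptions, not any particular platform’s API.

    # Hypothetical hygiene sweep over an IoT device inventory. The records and
    # compliance rules below are assumptions for illustration only.

    MINIMUM_FIRMWARE = "2.4.0"

    devices = [
        {"id": "sensor-001", "firmware": "2.4.1", "default_password": False, "segment": "plant-a"},
        {"id": "sensor-002", "firmware": "1.9.0", "default_password": True,  "segment": ""},
    ]

    def hygiene_violations(device: dict) -> list:
        """Return the list of hygiene violations found on one edge device."""
        violations = []
        if device["firmware"] < MINIMUM_FIRMWARE:   # naive string compare, adequate for this sketch
            violations.append("outdated firmware")
        if device["default_password"]:
            violations.append("factory default credentials still in use")
        if not device["segment"]:
            violations.append("not isolated to a network segment, so an attack could spread")
        return violations

    for device in devices:
        for violation in hygiene_violations(device):
            print(f"{device['id']}: {violation}")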

The second kind of security is around architecture and end-to-end monitoring. This requires a thoughtful end-to-end view of the objective you want to accomplish through the IoT, how you view security across the total ecosystem, how you architect it into your systems, and how you monitor it at a systems level for the entire process you define within the IoT. This security level typically reports to the chief security officer and requires a different level of thinking, talent, and investment.
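
By contrast with the device-level hygiene sweep above, a hedged sketch of what systems-level monitoring might do is correlate anomalies across the layers of one defined IoT process before escalating; the event shape and escalation rule below are hypothetical.

    # Hypothetical end-to-end monitoring sketch: correlate security events across
    # the layers of a defined IoT process (device -> network -> application)
    # rather than inspecting any single layer in isolation.

    from collections import defaultdict

    events = [
        {"process": "cold-chain", "layer": "device",      "type": "auth_failure"},
        {"process": "cold-chain", "layer": "network",     "type": "unusual_destination"},
        {"process": "cold-chain", "layer": "application", "type": "data_anomaly"},
        {"process": "metering",   "layer": "device",      "type": "auth_failure"},
    ]

    layers_by_process = defaultdict(set)
    for event in events:
        layers_by_process[event["process"]].add(event["layer"])

    # A single anomaly at one layer may be noise; anomalies spanning multiple
    # layers of the same process warrant escalation to the chief security officer.
    for process, layers in layers_by_process.items():
        if len(layers) >= 2:
            print(f"Escalate '{process}': anomalies span {sorted(layers)}")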

If you’re not doing both the hygiene approach and the architectural view, black hats can potentially exploit any hole to corrupt the whole chain.

Even if you believe you have adequate security at each level, that doesn’t mean you’re safe. The inventiveness of the black hats is so robust that you’ll have to continually invest in protection. You first need to invest in architecting your solution from end to end, and then continually monitor and adapt it as new threats emerge.

One thing you can be sure of is that threats will continue to emerge.
