Risk is handled in one of three fundamental ways:

  • Mitigated: You reduce risk by enacting some countermeasure. Network attack risk is reduced when you install a good firewall. Malware risk is reduced when you roll out anti-virus software. Shoplifting loss risk is reduced when you install cameras and hire guards in your store.
  • Transferred: The risk is reduced by paying someone else to assume it. To put it simply: you buy insurance.
  • Accepted: You realize risk cannot be zeroed out even if you spend more money than you might lose to the threats. So you find your “sweet spot” and accept that some risk still remains. We call this residual risk.

In information security, everything is trade-offs. Usually, the trade is resources for risk reduction. Finding the sweet spot is not even the hard part. The hard part is getting management to understand why the sweet spot is found where residual risk is still annoyingly non-zero.
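The sweet-spot trade-off can be sketched with the classic Annualized Loss Expectancy formula (ALE = SLE × ARO). All of the dollar figures and countermeasure names below are made up for illustration; the only real idea is that you pick the option minimizing spend plus residual risk:

```python
def ale(single_loss_expectancy, annual_rate_of_occurrence):
    """Expected annual loss from a threat: ALE = SLE * ARO."""
    return single_loss_expectancy * annual_rate_of_occurrence

# Doing nothing: a $50,000 incident expected to occur 0.4 times per year.
baseline = ale(50_000, 0.4)  # $20,000/year expected loss

# Hypothetical countermeasures: (name, annual cost, residual ALE after it).
options = [
    ("do nothing",          0, baseline),
    ("basic firewall",  5_000,    8_000),
    ("firewall + IDS", 15_000,    3_000),
    ("gold-plated",    40_000,    1_000),  # costs more than it saves
]

# The sweet spot minimizes total cost: what you spend plus what you
# still expect to lose. Note the winner's residual risk is non-zero.
best = min(options, key=lambda o: o[1] + o[2])
print(best[0])  # -> basic firewall
```

Even the winning option leaves $8,000 a year of expected loss on the table, which is exactly the residual risk management has to be talked into accepting.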

There are other trade-offs. Every time we transact with a company, we risk some of our private information in exchange for some benefit that company offers us. We risk the disclosure of a credit card number to gain the benefit of a new FitBit. We risk the privacy of our home address to gain the benefit of having that FitBit delivered to our door. We risk the privacy of our health information by putting that FitBit on our wrist and syncing it to an app on our phone, gaining the benefit of the aid the FitBit provides to our exercise program.

Personally, I consider a FitBit too risky for privacy to be worth the benefits it can provide. Maybe someday FitBit will show me that the benefits can outweigh those risks.


But the ultimate case where I wonder how people are making these risk/benefit decisions comes with things like Alexa, Google Assistant, Siri, and (shudder) Facebook Portal. We’ve already seen cases where voice recordings from people’s homes have been grossly mishandled. What about the cases where they are handled “properly”? Where the “proper” handling of this data is to build a profile of you so detailed, your spouse would be surprised to learn some of it?

Maybe I’m the biggest digital Luddite around. But I will have none of that in my vicinity. When I am unwillingly in its presence, I may do something like this:

Maybe that will tip someone’s risk-acceptance decision the right way. That’s me, always looking for a way to reduce that residual risk.