Quantifying the cost of one user

How far would you walk to save $2? How about driving across town to save yourself time? When contemplating the design of services, we’re often asked to measure their return on investment, in an effort to quantify their value for non-designers.

Designing friction into common experiences, especially within public services, is considered part of the cost of using those systems. People doing the work will tell you, from learned experience, why things need to be complicated. A common refrain is “people need to have skin in the game or they’ll take advantage.” Of what, exactly?

Less than two decades ago, people had to deal with other people. It’s not that computers, algorithms, or systems design didn’t control complicated processes; it’s just that you usually had some semblance of recourse. Ask a parent or grandparent how often they had to “ask to speak to a manager or supervisor” on a call center call, because for a long time those feedback loops were all connected. With outsourcing, feedback mechanisms are divorced from the people doing the work. This leads to a future post where I’ll talk about the ways we’ve deputized service workers to carry the load of customer frustrations.

Valuing the cost of trust

The most direct example of user deception as a revenue strategy is the online subscription that renews automatically but makes it very difficult to cancel without calling. Even when there’s an online way to cancel, platforms and services will use guilt as a mechanism to maintain the relationship. To the credit of some services, they offer discounts or other incentives to give someone another chance to use their tool, service, or platform.

Most large entities don’t have much consideration for individual actors. But every day, you can watch customer interactions where people disempowered by bad policy design bear the wrath of customer feedback, because customers lack any mechanism to express their discontent besides canceling the future business relationship. Part of the reason Twitter works so well as a feedback loop is that it gives customers at varying levels of maturity a vehicle for expressing discontent in ways that get attention. All it takes is one viral customer complaint to cause a brand more headache than it wants, even when the brand’s response turns out to be warranted once you dig into the situation more closely.

How third-party systems erode the value of trust

Governance as a term doesn’t quite capture what happens every time a third-party system gets “skin in the game” by taking on tasks that used to be handled by an individual. While it enables key parties to do more with less, it also erodes the trust that used to live in those person-to-person relationships, and that erosion shows up in the design of services.

COVID-19 has seen more stores deciding to enforce cashless policies, a trend that began in some cities when the banks were boarded up.

Customer trust indexing matters as a metric of customer confidence because so many lifecycles are aimed at retention, yet no one does the job of ensuring relationships are nurtured in ways that sustain the partnership. In aggregate, it’s easier to disappoint one user through deception and blame than to make affordances, because the incentives aren’t present to make the program better or more attractive to stick around, and individual actors aren’t empowered to fix situations.

In a world where algorithms determine outcomes, what used to be person-to-person interactions becomes humans carrying out the will of the brand on behalf of the algorithm. The “logical” choice of a cost-benefit analysis doesn’t always correspond with rational outcomes. Nor can it account for edge cases or stress cases that weren’t part of the initial policy design.

For instance, when airlines were crushed by a wave of cancellations and credit card chargebacks at the initial height of the pandemic, they had to reverse long-standing draconian revenue schemes like change fees, because the design of those punitive policies didn’t fit the reality we now live in. Now you’ll see them trying to entice people to fly with all sorts of incentives, except many haven’t staffed themselves appropriately, trust has eroded among their own staff, and the fissures are showing even as you board the plane.

Why isn’t trust something we consciously design for? Social media platforms’ trust and safety work is purely about mitigating risk, approached not from a design lens but from a business/engineering one. We need designers focused on the design of trust as a function of the service blueprint.
