Wednesday, August 19, 2015

Algorithms Are Judging You

Of course they are. Anyone who's ever had a FICO score was judged by an algo. It's just that now the stakes are higher than ever.
We've visited the author of this piece a couple of times; links below. He tends to come down on the "The internet has not lived up to its promise" side of the spectrum.

From Aeon magazine:
Frank Pasquale is a professor of law at the University of Maryland and a specialist on challenges posed to information law by technology. His book, The Black Box Society (2015), develops a social theory of reputation, search, and finance. 
Digital star chamber 
Algorithms are producing profiles of you. What do they say? You probably don’t have the right to know
In a recent podcast series called Instaserfs, a former Uber driver named Mansour gave a chilling description of the new, computer-mediated workplace. First, the company tried to persuade him to take a predatory loan to buy a new car. Apparently a number cruncher deemed him at high risk of defaulting. Second, Uber would never respond in person to him – it just sent text messages and emails. This style of supervision was a series of take-it-or-leave-it ultimatums – a digital boss coded in advance.

Then the company suddenly took a larger cut of revenues from him and other drivers. And finally, what seemed most outrageous to Mansour: his job could be terminated without notice if a few passengers gave him one-star reviews, since that could drag his average below 4.7. According to him, Uber has no real appeal recourse or other due process in play for a rating system that can instantly put a driver out of work – it simply crunches the numbers.
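
The arithmetic behind that threat is trivial, which is part of what makes it unsettling. Here is a toy sketch of such a rule; it is not Uber's actual code, the 4.7 cutoff is taken from Mansour's account, and the 100-trip window is an assumption:

```python
# Illustrative only: a toy version of the rating rule Mansour describes.
# The 4.7 cutoff comes from his account; the 100-trip window is an assumption.

DEACTIVATION_THRESHOLD = 4.7
WINDOW = 100  # assumed: average taken over the most recent trips

def average_rating(ratings):
    """Mean of the most recent WINDOW ratings (1-5 stars each)."""
    recent = ratings[-WINDOW:]
    return sum(recent) / len(recent)

def driver_active(ratings):
    """A driver stays on the platform only while the rolling average clears the cutoff."""
    return average_rating(ratings) >= DEACTIVATION_THRESHOLD

# Eight one-star trips out of 100 otherwise-perfect trips are enough:
history = [5] * 92 + [1] * 8
print(average_rating(history), driver_active(history))  # 4.68 False
```

A handful of bad days, or a handful of unfair passengers, and the number does the rest.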

Mansour’s story compresses long-standing trends in credit and employment – and it’s by no means unique. Online retailers live in fear of a ‘Google Death Penalty’ – a sudden, mysterious drop in search-engine rankings if they do something judged fraudulent by Google’s spam detection algorithms. Job applicants at Walmart in the US and other large companies take mysterious ‘personality tests’, which process their responses in undisclosed ways. And white-collar workers face CV-sorting software that may understate, or entirely ignore, their qualifications. One algorithmic CV analyser found all 29,000 people who applied for a ‘reasonably standard engineering position’ unqualified.

The infancy of the internet is over. As online spaces mature, Facebook, Google, Apple, Amazon, and other powerful corporations are setting the rules that govern competition among journalists, writers, coders, and e-commerce firms. Uber and Postmates and other platforms are adding a code layer to occupations like driving and service work. Cyberspace is no longer an escape from the ‘real world’. It is now a force governing it via algorithms: recipe-like sets of instructions to solve problems. From Google search to OkCupid matchmaking, software orders and weights hundreds of variables into clean, simple interfaces, taking us from query to solution. Complex mathematics governs such answers, but it is hidden from plain view, thanks either to secrecy imposed by law, or to complexity outsiders cannot unravel.
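
The 'recipe' framing can be made concrete. A minimal, hypothetical sketch of the pattern described here, many weighted variables collapsed into a single ordering behind a clean interface, might look like the following; the feature names and weights are invented for illustration and stand in for the hundreds of signals a real ranker combines:

```python
# Hypothetical sketch of "ordering and weighting variables" behind a simple interface.
# Feature names and weights are invented; real systems combine hundreds of signals.

WEIGHTS = {
    "text_relevance": 0.5,
    "popularity": 0.3,
    "freshness": 0.2,
}

def score(item):
    """Collapse an item's features into a single number via a weighted sum."""
    return sum(weight * item[feature] for feature, weight in WEIGHTS.items())

def rank(items):
    """Return items best-first; this ordering is all the user ever sees."""
    return sorted(items, key=score, reverse=True)

results = rank([
    {"name": "A", "text_relevance": 0.9, "popularity": 0.2, "freshness": 0.4},
    {"name": "B", "text_relevance": 0.6, "popularity": 0.9, "freshness": 0.8},
])
print([r["name"] for r in results])  # ['B', 'A']
```

The user sees only the final ordering; the weights, and the choice of which variables count at all, never surface. That is the opacity the article is describing.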
Algorithms are increasingly important because businesses rarely thought of as high tech have learned the lessons of the internet giants’ successes. Following the advice of Jeff Jarvis’s What Would Google Do?, they are collecting data from both workers and customers, using algorithmic tools to make decisions, to sort the desirable from the disposable. Companies may be parsing your voice and credit record when you call them, to determine whether you match up to ‘ideal customer’ status, or are simply ‘waste’ who can be treated with disdain. Epagogix advises movie studios on what scripts to buy, based on how closely they match past, successful scripts. Even winemakers make algorithmic judgments, based on statistical analyses of the weather and other characteristics of good and bad vintage years.

For wines or films, the stakes are not terribly high. But when algorithms start affecting critical opportunities for employment, career advancement, health, credit and education, they deserve more scrutiny. US hospitals are using big data-driven systems to determine which patients are high-risk – and data far outside traditional health records is informing those determinations. IBM now uses algorithmic assessment tools to sort employees worldwide on criteria of cost-effectiveness, but spares top managers the same invasive surveillance and ranking. In government, too, algorithmic assessments of dangerousness can lead to longer sentences for convicts, or no-fly lists for travellers. Credit-scoring drives billions of dollars in lending, but the scorers’ methods remain opaque. The average borrower could lose tens of thousands of dollars over a lifetime, thanks to wrong or unfairly processed data.
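
None of these scorers publishes its internals, but the failure mode that last sentence points to is easy to illustrate. In the hypothetical sketch below, every field name, weight, and cutoff is invented; the point is only that a single wrongly processed input can flip a borrower from one pricing tier to another:

```python
# Hypothetical credit-style score: all fields, weights, and the cutoff are invented.
# It shows only how one erroneous input can flip the outcome.

WEIGHTS = {
    "on_time_payment_rate": 400,   # fraction between 0 and 1
    "credit_utilization": -150,    # fraction between 0 and 1
    "years_of_history": 10,
}
BASE = 300
PRIME_CUTOFF = 700  # assumed boundary between prime and subprime pricing

def credit_score(record):
    """Weighted sum of the borrower's attributes on top of a base score."""
    return BASE + sum(w * record[f] for f, w in WEIGHTS.items())

accurate = {"on_time_payment_rate": 0.98, "credit_utilization": 0.30, "years_of_history": 8}
# Same borrower, but payments wrongly reported late drag down the on-time rate.
erroneous = dict(accurate, on_time_payment_rate=0.90)

for label, record in (("accurate data", accurate), ("erroneous data", erroneous)):
    score = credit_score(record)
    tier = "prime" if score >= PRIME_CUTOFF else "subprime"
    print(label, round(score), tier)   # 727 prime vs. 695 subprime
```

Compounded over the life of a loan, the gap between those two tiers is the kind of tens-of-thousands-of-dollars loss the article cites.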

This trend toward using more data, in more obscure ways, to rank and rate us, may seem inevitable. Yet the exact development of such computerised sorting methods is anything but automatic. Search engines, for example, are paradigmatic examples of algorithmic technology, but their present look and feel owe a great deal to legal interventions. For example, thanks to Federal Trade Commission action in 2002, United States consumer-protection laws require the separation of advertisements from unpaid, ‘organic’ content. In a world where media firms are constantly trying to blur the distinction between content and ‘native advertising’, that law matters. European Union regulators are now trying to ensure that irrelevant, outdated, or prejudicial material does not haunt individuals’ ‘name search’ results – a critical task in an era when so many prospective employers google those whom they are considering for a job. The EU has also spurred search engines to take human dignity into account – by, for example, approving the request of a ‘victim of physical assault [who] asked for results describing the assault to be removed for queries against her name’....MUCH MORE
Previously:
Nudge This: "The Algorithmic Self"
The writer, Frank Pasquale, is a professor of law at the University of Maryland and the author of the forthcoming book The Black Box Society: The Secret Algorithms That Control Money and Information.
And, on the off chance Bloomberg View's Matt Levine should see this, 38 footnotes! ...
And writing at the Guardian, "Uber and the lawlessness of 'sharing economy' corporates"

Previously on Nudge This:
Nudge This: "The Internet of Things Will Be a Giant Persuasion Machine"
Nudge This: "Yes, You’re Irrational, and Yes, That’s OK"
Behavior: We Are More Rational Than Those who Try To 'Nudge' Us