#coin is using this move as a power bid rather than for justice
catoscloves · 4 months
one of the many things to love about katniss is that she's not spiteful, or revenge minded, no matter how many reasons society and her circumstances give her to be, and always has empathy for others.
and this extends to people she doesn't have much emotional connection to. for as much as she regarded the careers as the capitol's well trained lapdogs, she killed cato out of mercy rather than revenge. she acknowledged that cato and clove could have survived if she and peeta hadn't, and that someone like marvel, who murdered her ally (a twelve year old, the most innocent and vulnerable person in the arena, and someone katniss saw as a sister), might have had a life to go back to, a home with family members waiting for his return. she felt guilty during her interactions with gloss and cashmere because she killed children from their district that they might have mentored. katniss explicitly said she did not like enobaria as a person, but didn't want to exclude her from the protection deal with coin.
katniss also has a kind view of people that, as cinna himself said, she should have despised. the capitol citizens are vapid and privileged and watch children like her die for their entertainment every year. yet katniss still forges a relationship with cinna and is able to be vulnerable with him, sharing her private stories and treasured memories with a man from that very capitol audience. and while she sees her prep team as ignorant and childish pets, she does have affection for them and objected to their being tortured in district thirteen. even though effie literally comes to her district representing the most evil aspect of the capitol and every year sends two children from her home to die, katniss still sees her redeeming qualities, bonds with her, and makes an effort to spare her from the rebels.
no matter who the person is or what they might have done to her/her loved ones, katniss recognizes humanity in everyone and attempts to limit the suffering of others when possible, regardless of her personal opinion about them.
365daysofsasuhina · 5 years
[ 365 Days of SasuHina || Day Two Hundred Seventy-Six: Drained ] [ Uchiha Sasuke, Hyūga Hinata ] [ SasuHina, death ] [ Verse: Of Monsters and Men ] [ AO3 Link ]
Just because she isn’t technically human doesn’t mean she shouldn’t be careful.
Since she was young, Hinata has been...unlike most people she knows. While most humans are rather...well, mundane, she quickly proved to be anything but.
And that is because Hinata is a witch.
The term, of course, varies from land to land, sometimes even province to province. But all that truly matters is what it means. Neither human nor monster, she and those like her walk the delicate line between worlds.
Her mother had been one, having exhibited powers beyond the mortal. Before the birth of her second daughter, she’d begun versing Hinata in their ways.
But...she’d been young. And her lessons were cut tragically short when Hanako passed not long after Hanabi’s arrival into the world. Suddenly they were both without a mother...and Hinata without a hand (or knowing mind) to guide her.
Most of her knowledge, therefore, has been hard-won: picked up from rare books, and fleeting conversations with strangers like her. Over the course of her growing up, Hinata has learned the truth behind her bloodline in bits and pieces...just enough to crudely navigate her way through life.
Her father had indulged her...oddities for a time before removing her from the household, fearing she would taint Hanabi in a similar way. Left to her own devices, with barely any coin or possessions...Hinata was made to fend for herself from the humble age of twelve.
A fairly decent age, all things considered...for hers was a time for rampant diseases, danger, and misfortune. War and roving bandits were common...and so were things humans barely dared to believe were real.
Monsters.
By some grace, Hinata found herself in the care and tutelage of another witch named Kurenai, one whose unique magic manifested in illusions. For each family - and at times, individual - carried their own special trait or skill. In Kurenai’s case, it was the ability to bend the mind to suit her devices, often passing unseen or manipulating others to her bidding. But she was a rather kind witch, doing so only to get by, and never maliciously.
But Hinata faced a dilemma. She had no idea what her line’s power was meant to be, or her own possible unique twist. Memories of her mother, by that time, were vague and foggy...and nothing was remembered to help her realize her potential. Try as she might, Kurenai couldn’t puzzle it out either, trying trick after trick, but with little to show for it. While Hinata could demonstrate basic magical ability...her true strength remained undiscovered.
After four years in the woman’s care, Hinata decided that there was little else to glean. And given that Kurenai had become rather...involved with a city guard, she felt a bit invasive remaining. So, she’d gone on her way, promising to write, but...no longer feeling right remaining in the woman’s care. She was nearly an adult by then, and had mastered enough basic arts to get by...or so she’d hoped.
Passing through a city, Hinata had stopped to rest, all while happening to overhear a fortune teller nearby speaking in riddles and prophecies to a group of young people. A quick glance told her, of course, that this particular woman was a fraud, exhibiting no true magic.
But it was that moment that changed everything.
In a flash, as Hinata looked to a young man among them, she was struck with a vision of his future. To her horror, it was anything but the romance the hag was spouting...but a rather grisly death. Startled and afraid, she’d fled...and reached a conclusion.
That was her ability. To see into the future. And once she realized it, an echo from the past had reached her: words from her mother, about how their eyes - so pale, and so...unusual - could see what others could not. The snippet had slipped through the cracks in her mind...but then, it all made sense.
Hyūga were seers.
With that knowledge in hand, Hinata knew what she had to do. Unlike the pretender in the city, she could offer true readings for those who wanted them...and, so she hoped, earn herself the coin to survive. Town to town she traveled, offering glimpses into futures. Of course...given their era, a great many ended in tragedy, but she managed to remind her clients that the future could always be changed...and it was that alone that kept her from being run out of town.
At least...not so quickly. As time passed, the era of her kind was waning - superstition and a shift in guiding morality had begun to paint her people in a cruel and predatory light.
So, with the coin she’d accrued, Hinata fled and found herself a small cabin just within the boughs of a forest. Close enough to town to take care of her errands...but far enough out to remain hidden, her arts offered only to those deemed trustworthy...and able to pay the now far-higher sum to amend for the lack of clients.
Otherwise, she kept to herself, growing a garden and befriending the locals enough to skate by with enough supplies to last the harsh Winters. Several have now passed, and she stands a young woman of her early twenties, well-versed in life’s struggles, and her arts.
...and it’s now that trouble is afoot.
Stepping from her front door, basket along her arm, she comes up short as a local man, farmer by trade, staggers up her walkway with a look of panic in his eyes. “Milady! M...milady, I -!”
“Slow down,” Hinata softly cautions, seeing his exhaustion. If he ran the whole road from his farm, no wonder he’d be winded! “Take a moment to catch your breath.”
“It...I…” Leaning forward to brace palms on his knees, he struggles for breath. “There’s been a body found, just north of the lake!”
“A body…?”
“Yes! But it, it was…” A deep grimace overcomes his expression. “...drained…!”
Dark brows furrow. “...what do you mean, drained?”
“There’s not a drop of blood left in him! As though some monster sucked him dry! There’s talk of a vampire roaming these parts…!”
At that, Hinata’s eyes widen. A vampire…? As times have changed and humans expanded, she’s heard less and less of the monsters lurking in the moonlight of mankind.
“You can see why folks’d be panicked - everyone’s afraid to leave their homes! And, well...we was wondering if you could...do something.”
“I’m not a hunter of monsters,” Hinata quickly cautions, raising a hand.
“But...was it not in the old tales? Of, er...your kind taming monsters?”
“...that was a long time ago.”
“But milady -!”
“I’ll go investigate...and you’re right, it’s b-best you all stay in your homes. You’ll be safest there.” In truth, she knows that weak wooden cabin doors are nothing to a vampire’s might, but...if they just fed, she can hope they won’t be keen to feed again so soon. “If you can, hang a string of garlic near your door. The strong smell should help ward them off. And if you’ve any silver, arm yourself.”
“Oh, thank you…”
“Go on. Wait for me to find you...I’ll see what’s going on.” Watching him go, Hinata can’t help a feeling of dread in her stomach. Her powers aren’t showing her the future now...and in part, she’s glad not to know. Leaving her basket, she instead takes a dagger of silver and hangs the sheath from her belt.
Time to see just what’s going on.
Following the vague directions, she takes a road that leads to the small local lake, and it doesn’t take long to stumble across the smell of rotting flesh. Nose wrinkling, she parts some forest thicket to reveal the body in question. True to the farmer’s word, it’s pale as death and almost looks...shriveled despite the lack of rot. Cautiously approaching, she turns the body with a hand and finds a bitten imprint at the crook of the neck.
No mistaking it...this is a vampire.
But what to do…? Hinata’s never fought such a creature - never had to. While some Hunters do still track such monsters, there’s no telling if one is near enough to aid before someone else falls victim. Silver may give her an edge...but she’s mortal. Too slow, too weak compared to a monster of moonlight. And though she’s heard tales from her kind of their once-famed ability to manipulate the inhuman...she’s never tried it. Never encountered any such beast to know if she even has the skill.
But when the hairs on the back of her neck stand on end...she knows she’s being watched.
Breath forced to be even and body tense, she grips the handle of her dagger tightly. She’ll likely only have one shot at this...best to make it count.
“Well well...seems I’m not the only rare breed around…”
Spinning, she relies on her hearing to gauge the distance, attempting to bury the blade somewhere useful. But like a bored parent subduing their child, the vampire catches her wrist, and with his simple grip, makes her relinquish the weapon.
“Bit rude to stab someone when you don’t even know their name, isn’t it…?”
“You’ve killed someone,” Hinata replies tersely.
“Eye for an eye leaves the whole world blind. I needed blood, and he was there to provide. Besides...he stunk of sin. Not only will he not be missed...but I can almost guarantee that someone, somewhere, will be happy he’s gone. I try to only kill humans that deserve it, miss witch.”
As he speaks, Hinata looks the man over. He’s pale of skin, with flyaway dark hair and even darker eyes. Taller than her, he’s wiry and sinewy - a far cry from her own short and stout frame. And there’s an aloof tinge to his expression, clearly bored of her.
“...a death is still a death. If he’d done wrong, he needed to be t-tried.”
“Well...sometimes we ought to give the courts a rest, hm? Besides, you humans rarely get this whole ‘justice’ thing right,” he drawls, releasing her wrist and flicking her dagger up with a boot, snatching it in midair and looking it over. “...fine blade. But you’re far too slow to use it.”
“...you need to leave.”
“Oh…?”
“I’ll not let you hurt anyone else.”
At that, the vampire scoffs. “Not let me, eh? And, ah...what, pray tell, will you do to stop me…?” As if to prove his point, he moves in a blur. With a thud, her knife drives into a tree trunk, wrists pinned over her head as he effortlessly subdues her. “...I don’t want to hurt anyone who doesn’t deserve it. I’m on my way through...I’ll not trouble your little gaggle of humans, witchy woman. Just remember...humans always need a monster in the shadows. They just might make one out of you, someday…”
Tensed and panting, Hinata feels a flicker of panic in her veins. And like a spark to tinder, it seems to alight something within her. Jaw setting, a kind of instinct drives her to bark, “Release me!”
As though thrown backwards, the vampire does just that, shock widening his eyes and flaring them red. Immediately, his body language changes from relaxed to battle ready.
Rubbing at a wrist, Hinata watches him warily.
“...what did you just do…?”
“What I had to.” Drawing herself up to her full height, Hinata stares at him. “...I want you to leave. Now. I w-won’t let you harm these people, or make them afraid. They trust me...and I them. It’s my duty to protect them. I don’t want to hurt you...but I’ll do what I must. Now...leave, vampire. And n-never come back.”
Slowly, he lets himself lose his edge, eyes fading back to black. “...I told you, that’s not why I’m here. Didn’t have to go and shove me with...whatever that was. Consider me gone, witch.”
“My name is Hinata.”
“...Sasuke. Not that you apparently care. Be careful with those powers of yours. I meant what I said. There always has to be a monster...and you’ll be first on their list.”
Her jaw tightens. “...I said, go.”
Scowling, he lingers only a moment longer before simply disappearing, rustling underbrush all that betrays his movement.
Only once she’s sure he’s gone does she relax with a sigh, sagging back against the tree behind her. She had no idea what to expect, but...it seems the old tales are true. The ones who walk the night really do obey a witch’s words.
...she’ll have to remember that. But for now, she has a body to bury...and people to reassure.
Beyond the lake, coming to a stop, Sasuke only looks back once he’s sure he’s clear of her. A hand smooths irritably through his hair. He’s only ever encountered a handful of witches...and none ever treated him like that.
...nor did they exhibit a power over him.
Seems the old folktales are true. Well...he’s not keen to see her again, that’s for certain.
...and yet...he can’t help a flicker of morbid curiosity. It’s rare to see a human so plucky...even if she’s not completely human. A strange walker of the line between night and day...one who wanders the twilight between humans and monsters.
For now, at any rate, he’s not about to push his luck...and he’s got somewhere to be. But he nonetheless makes a mental note for a later date.
It just might be a little...fun.
.oOo.

A slightly different twist on the typical Nightwalker story! Rather than modern, this one's more medieval based, a bit like SHM day two this year, only...well, not directly connected. This one's standalone...at least for now!

Anyway, it is SUPER duper late, so I'm gonna go sleep - thanks for reading!
#3yrsago Weapons of Math Destruction: invisible, ubiquitous algorithms are ruining millions of lives
I've been writing about the work of Cathy "Mathbabe" O'Neil for years: she's a radical data-scientist with a Harvard PhD in mathematics, who coined the term "Weapons of Math Destruction" to describe the ways that sloppy statistical modeling is punishing millions of people every day, and in more and more cases, destroying lives. Today, O'Neil brings her argument to print, with a fantastic, plainspoken call to arms called (what else?) Weapons of Math Destruction.
Discussions about big data's role in our society tend to focus on algorithms, but the algorithms for handling giant data sets are all well understood and work well. The real issue isn't algorithms, it's models. Models are what you get when you feed data to an algorithm and ask it to make predictions. As O'Neil puts it, "Models are opinions embedded in mathematics."
Other critical data scientists, like Patrick Ball from the Human Rights Data Analysis Group, have located their critique in the same place. As Patrick once explained to me, you can train an algorithm to predict someone's height from their weight, but if your whole training set comes from a grade three class, and anyone who's self-conscious about their weight is allowed to skip the exercise, your model will predict that most people are about four feet tall. The problem isn't the algorithm, it's the training data and the lack of correction when the model produces erroneous conclusions.
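Ball's grade-three example can be made concrete with a tiny sketch (all numbers below are invented for illustration): fit a height-from-weight line on data drawn only from children, and the model's predictions land around four feet no matter how representative the fit looks on its own training set.

```python
# Illustrative only: a height-from-weight model trained solely on third-graders.
from statistics import mean

# Hypothetical training set: (weight_lbs, height_inches) pairs for children.
train = [(55, 49), (60, 50), (52, 48), (65, 52), (58, 50), (70, 53)]

# Ordinary least-squares line: height = a + b * weight.
mw = mean(w for w, _ in train)
mh = mean(h for _, h in train)
b = sum((w - mw) * (h - mh) for w, h in train) / sum((w - mw) ** 2 for w, _ in train)
a = mh - b * mw

# For any weight in the range the model actually saw, the prediction
# clusters around the training mean -- roughly four feet -- because
# that is all "people" ever were in its data.
print(round(a + b * 60), "inches")
```

The fit itself is fine; the failure is entirely in what the training set left out, which is exactly the distinction the paragraph above draws.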
Like Ball, O'Neil is enthusiastic about the power of data-driven modelling to be a force for good in the world, and like Ball, she despairs at the way that sloppy statistical work can produce gigantic profits for a few companies at the expense of millions of people -- all with the veneer of mathematical objectivity.
O'Neil calls these harmful models "Weapons of Math Destruction," and not all faulty models qualify. For a model to be a WMD, it must be opaque to its subjects, harmful to their interests, and grow exponentially to run at huge scale.
These WMDs are now everywhere. The sleazy for-profit educational system has figured out how to use models to identify desperate people and sucker them into signing up for expensive, useless "educations" that are paid for with punitive student loans, backed by the federal government. That's how the University of Phoenix can be so profitable, even after spending upwards of $1B/year on marketing. They've built a WMD that brings students in at a steady clip despite the fact that they spend $2,225/student on marketing and only $892/student on instruction. Meanwhile, the high-efficacy, low-cost community colleges are all but invisible in the glare and roar of the University of Phoenix's marketing blitzkrieg.
One highly visible characteristic of WMDs is their lack of feedback and tuning. In sports, teams use detailed statistical models to predict which athletes they should bid on, and to deploy those athletes when squaring off against opposing teams. But after the predicted event has occurred, the teams update their models to account for their failings. If you pass on a basketball player who goes to glory for a rival team, you update your model to help you do better in the next draft.
Compare this with the WMDs used against us in everyday life. The largest employers in America use commercial services to run their incoming resumes against a model of a "successful" worker. These models hold your employment future in their hands. If one rejects you and you go on to do brilliant work somewhere else, that fact is never used to refine the model. Everyone loses: job-seekers are arbitrarily excluded from employment, and employers miss out on great hires. Only the WMD merchants in the middle make out like bandits.
It's worth asking how we got here. Many forms of WMD were deployed as an answer to institutional bias -- in criminal sentencing, in school grading, in university admissions, in hiring and lending. The models are supposed to be race- and gender-blind, blind to privilege and connections.
But all too often, the models are trained with the biased data. The picture of a future successful Ivy League student or loan repayer is painted using data-points from the admittedly biased history of the institutions. All the Harvard grads or dutiful mortgage payers are fed to the algorithm, which dutifully predicts that tomorrow's Harvard alums and prime loan recipients will look just like yesterday's -- but now the bias gets the credibility of seeming objectivity.
This training problem is well known in stats, but largely ignored by WMD dealers. Companies that run their own Big Data initiatives, by contrast, are much more careful about refining their models. Amazon carefully tracks those customers who abandon their shopping carts, or who stop shopping after a couple of purchases. They're interested in knowing everything they can about "recidivism" among shoppers, and they combine statistical modelling with anthropology -- seeking out and talking to their subjects -- to improve their system.
The contrast with automated sentencing software -- now widely used in the US judicial system, and spreading rapidly around the world -- could not be more stark. Like Amazon's data scientists, the companies that sell sentencing apps are trying to predict recidivism, and their predictions can send one person to prison for decades and let another go free.
These brokers train their models on the corrupted data of the past. They look at the racialized sentencing outcomes of the past -- the outcomes that sent young black men to prison for years for minor crack possession, while letting rich white men walk away from cocaine possession charges -- "predict" that people from poor neighborhoods, whose family members and friends have had run-ins with the law, will reoffend, and recommend long sentences to keep them away from society.
Unlike Amazon, these companies aren't looking to see whether longer sentences cause recidivism (by causing emotional damage and social isolation) and how prison beatings, solitary confinement and prison rape are related to the phenomenon. If the prison system was run like Amazon -- that is, with a commitment to reducing reoffending, rather than enriching justice-system contractors and satisfying revenge-hungry bigots in the electorate -- it would probably look like a Nordic prison: humane, sparsely populated, and oriented toward rehabilitation, addiction treatment, job training, and psychological counselling.
WMDs have transformed education for teachers and students. In the 1980s, the Reagan administration seized on a report called A Nation at Risk, which claimed that the US was on the verge of collapse due to its falling SAT scores. This was the starter-pistol for an all-out assault on teachers and public education, which continues to this day.
The most visible expression of this is the "value added" assessment of teachers, which uses a battery of standardized tests to assess teachers' performance from year to year. The statistical basis for these assessments is laughable (statistics work on big numbers, not classes of 25 kids -- assessments can swing 90% from one year to the next, making them no better than random number generators). Teachers -- good teachers, committed teachers -- lose their jobs over these tests.
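A quick simulation (with assumed numbers, not real test data) shows why classes of 25 produce rankings no better than random: hold a teacher's true contribution perfectly constant, and the year-to-year class averages still swing by far more than the effect being measured.

```python
# Illustrative simulation: a teacher whose true effect never changes still
# posts wildly different "value added" numbers from a 25-student sample.
import random

random.seed(0)
true_effect = 5.0      # the teacher adds 5 points, every single year
class_size = 25
student_sd = 30.0      # student-to-student spread dwarfs the teacher effect

yearly_scores = []
for year in range(10):
    students = [true_effect + random.gauss(0, student_sd) for _ in range(class_size)]
    yearly_scores.append(sum(students) / class_size)

# The identical teacher looks great in some years, terrible in others,
# purely because n = 25 is far too small to average out student noise.
for year, score in enumerate(yearly_scores, 1):
    print(f"year {year}: {score:+.1f}")
```

With a standard deviation of 30 per student, the standard error of a 25-student class mean is 6 points -- larger than the 5-point effect the assessment claims to detect, which is the "random number generator" problem in the paragraph above.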
Students, meanwhile, are taken away from real learning in order to take more and more tests, and those tests -- which are supposed to measure "aptitude" and thus shouldn't be amenable to expensive preparatory services -- determine their whole futures.
The Nation at Risk report that started it all turned out to be bullshit, by the way -- grounded in another laughable statistical error. Sandia Labs later audited the findings from the report and found that the researchers had failed to account for the ballooning number of students who were taking the SATs, bringing down the average score.
In other words: SATs were falling because more American kids were confident enough to try to go to college: the educational system was working so well that young people who would never have taken an SAT were taking it, and the larger pool of test-takers was bringing the average score down.
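The composition effect Sandia identified is plain arithmetic, easy to verify with a toy calculation (the group sizes and scores below are invented for illustration): every group's performance holds steady, yet the pooled average falls because the pool expands toward students who previously never took the test.

```python
# Toy arithmetic for the "falling average from a growing pool" effect.
def pool_average(groups):
    """groups: list of (num_takers, mean_score) pairs; returns pooled mean."""
    total = sum(n * s for n, s in groups)
    return total / sum(n for n, _ in groups)

# Earlier pool: a smaller, mostly college-bound group of test-takers.
early = [(1_000_000, 1050)]

# Later pool: the original group scores exactly the same, but a large
# new group joins the test at a lower mean.
later = [(1_000_000, 1050), (500_000, 900)]

print(pool_average(early))   # 1050.0
print(pool_average(later))   # 1000.0 -- lower, though no group got worse
```

No individual or subgroup declined; only the mix of who shows up changed, which is exactly the error the Nation at Risk researchers failed to account for.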
WMDs turn the whole of human life into a game of Search Engine Optimization. With SEO, merchants hire companies who claim to have reverse-engineered Google's opaque model and whose advice will move your URL further up in its ranking.
When you pay someone thousands of dollars to prep your kid for the SATs, or to improve your ranking with the "e-score" providers that determine your creditworthiness, jobworthiness, or mortgageworthiness, you're recreating SEO, but for everything. It's a grim picture of the future: WMD makers and SEO experts locked in an endless arms-race to tweak their models to game one another, and all the rest of us being subjected to automated caprice or paying ransom to escape it (for now). In that future, we're all the product, not the customer (much less the citizen).
O'Neil's work is so important because she believes in data science. Algorithms can and will be used to locate people in difficulty: teachers with hard challenges, people in financial distress, people who are struggling in their jobs, students who need educational attention. It's up to us whether we use that information to exclude and further victimize those people, or help them with additional resources.
Credit bureaux, e-scorers, and other entities that model us create externalities in the form of false positives -- from no-fly lists to credit-score errors to job score errors that cost us our careers. These errors cost them nothing to make, and something to fix -- and they're incredibly expensive to us. Like all negative externalities, the cost of cleaning them up (rehabilitating your job, finding a new home, serving a longer prison sentence, etc) is much higher than the savings to the firms, but we bear the costs and they reap the savings.
It's E Pluribus Unum reversed: models make many out of one, pigeonholing each of us as members of groups about whom generalizations -- often punitive ones (such as variable pricing) -- can be made.
Modelling won't go away: as a tool for guiding caring and helpful remedial systems, models are amazing. As a tool for punishing and disenfranchising, they are a nightmare. The choice is ours to make. O'Neil's book is a vital crash-course in the specialized kind of statistical knowledge we all need to interrogate the systems around us and demand better.
 Weapons of Math Destruction [Cathy O'Neil/Crown]
https://boingboing.net/2016/09/06/weapons-of-math-destruction-i.html
28 notes · View notes
Text
#2yrsago Weapons of Math Destruction: invisible, ubiquitous algorithms are ruining millions of lives
Tumblr media
I've been writing about the work of Cathy "Mathbabe" O'Neil for years: she's a radical data-scientist with a Harvard PhD in mathematics, who coined the term "Weapons of Math Destruction" to describe the ways that sloppy statistical modeling is punishing millions of people every day, and in more and more cases, destroying lives. Today, O'Neil brings her argument to print, with a fantastic, plainspoken, call to arms called (what else?)  Weapons of Math Destruction.
Discussions about big data's role in our society tends to focus on algorithms, but the algorithms for handling giant data sets are all well understood and work well. The real issue isn't algorithms, it's models. Models are what you get when you feed data to an algorithm and ask it to make predictions. As O'Neil puts it, "Models are opinions embedded in mathematics."
Other critical data scientists, like Patrick Ball from the Human Rights Data Analysis Group have located their critique in the same place. As Patrick once explained to me, you can train an algorithm to predict someone's height from their weight, but if your whole training set comes from a grade three class, and anyone who's self-conscious about their weight is allowed to skip the exercise, your model will predict that most people are about four feet tall. The problem isn't the algorithm, it's the training data and the lack of correction when the model produces erroneous conclusions.
Like Ball, O'Neil is enthusiastic about the power of data-driven modelling to be a force for good in the world, and like Ball, she despairs at the way that sloppy statistical work can produce gigantic profits for a few companies at the expense of millions of people -- all with the veneer of mathematical objectivity.
O'Neil calls these harmful models "Weapons of Math Destruction," and not all fault models qualify. For a model to be a WMD, it must be opaque to its subjects, harmful to their interests, and grow exponentially to run at huge scale.
These WMDs are now everywhere. The sleazy for-profit educational system has figured out how to use models to identify desperate people and sucker them into signing up for expensive, useless "educations" that are paid for with punitive student loans, backed by the federal government. That's how the University of Phoenix can be so profitable, even after spending upwards of $1B/year on marketing. They've built a WMD that brings students in at a steady clip despite the fact that they spend $2,225/student in marketing and only $892/student on instruction. Meanwhile, the high-efficacy, low-cost community colleges are all but invisible in the glare and roar of the University of Phoenix's marketing blitzkreig.
One highly visible characteristic of WMDs is their lack of feedback and tuning. In sports, teams use detailed statistical models to predict which athletes they should bid on, and to deploy those athletes when squaring off against opposing teams. But after the predicted event has occurred, the teams update their models to account for their failings. If you pass on a basketball player who goes to glory for a rival team, you update your model to help you do better in the next draft.
Compare this with the WMDs used against us in everyday life. The largest employers in America use commercial services to run their incoming resumes against a model of a "successful" worker. These models hold your employment future in their hands. If one rejects you and you go on to do brilliant work somewhere else, that fact is never used to refine the model. Everyone loses: job-seekers are arbitrarily excluded from employment, and employers miss out on great hires. Only the WMD merchants in the middle make out like bandits.
It's worth asking how we got here. Many forms of WMD were deployed as an answer to institutional bias -- in criminal sentencing, in school grading, in university admissions, in hiring and lending. The models are supposed to be race- and gender-blind, blind to privilege and connections.
But all too often, the models are trained with the biased data. The picture of a future successful Ivy League student or loan repayer is painted using data-points from the admittedly biased history of the institutions. All the Harvard grads or dutiful mortgage payers are fed to the algorithm, which dutifully predicts that tomorrow's Harvard alums and prime loan recipients will look just like yesterday's -- but now the bias gets the credibility of seeming objectivity.
This training problem is well known in stats, but largely ignored by WMD dealers. Companies that run their own Big Data initiatives, by contrast, are much more careful about refining their models. Amazon carefully tracks those customers who abandon their shopping carts, or who stop shopping after a couple of purchases. Their interested in knowing everything they can about "recidivism" among shoppers, and they combine statistical modelling with anthropology -- seeking out and talking to their subjects -- to improve their system.
The contrast with automated sentencing software -- now widely used in the US judicial system, and spreading rapidly around the world -- could not be more stark. Like Amazon's data scientists, the companies that sell sentencing apps are trying to predict recidivism, and their predictions can send one person to prison for decades and let another go free.
These brokers are training their model on the corrupted data of the past. They look at the racialized sentencing outcomes of the past -- the outcomes that sent young black men to prison for years for minor crack possession, while letting rich white men walk away from cocaine possession charges -- and conclude that people from poor neighborhoods, whose family members and friends have had run-ins with the law, and "predict" that this person will reoffend, and recommend long sentences to keep them away from society.
Unlike Amazon, these companies aren't looking to see whether longer sentences themselves cause recidivism (through emotional damage and social isolation), or how prison beatings, solitary confinement and prison rape feed the phenomenon. If the prison system were run like Amazon -- that is, with a commitment to reducing reoffending, rather than enriching justice-system contractors and satisfying revenge-hungry bigots in the electorate -- it would probably look like a Nordic prison: humane, sparsely populated, and oriented toward rehabilitation, addiction treatment, job training, and psychological counselling.
WMDs have transformed education for teachers and students. In the 1980s, the Reagan administration seized on a report called A Nation at Risk, which claimed that the US was on the verge of collapse due to its falling SAT scores. This was the starter-pistol for an all-out assault on teachers and public education, which continues to this day.
The most visible expression of this is the "value added" assessment of teachers, which uses a battery of standardized tests to assess teachers' performance from year to year. The statistical basis for these assessments is laughable (statistics work on big numbers, not classes of 25 kids -- assessments can swing 90% from one year to the next, making them no better than random number generators). Teachers -- good teachers, committed teachers -- lose their jobs over these tests.
Students, meanwhile, are taken away from real learning in order to take more and more tests, and those tests -- which are supposed to measure "aptitude" and thus shouldn't be amenable to expensive preparatory services -- determine their whole futures.
The Nation at Risk report that started it all turned out to be bullshit, by the way -- grounded in another laughable statistical error. Sandia Labs later audited the findings from the report and found that the researchers had failed to account for the ballooning number of students who were taking the SATs, bringing down the average score.
In other words, SAT averages were falling because more American kids were confident enough to try for college. The educational system was working so well that young people who would never have taken the SAT before were now taking it, and the larger pool of test-takers pulled the average down.
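The arithmetic behind the Sandia finding is a classic composition effect, and it can be checked in a few lines. The scores and head-counts below are invented round numbers, chosen only to show the mechanism: no group's average falls, yet the overall average drops because the mix of test-takers changes.

```python
# Invented round numbers illustrating the composition effect:
# each group's average holds steady while the overall average falls,
# because the mix of test-takers shifts.

# Earlier pool: mostly traditional college-track students.
# Format: group -> (number of test-takers, group average score)
old_pool = {"college-track": (1000, 1050), "new takers": (100, 850)}
# Later pool: many more first-generation test-takers, same group averages.
new_pool = {"college-track": (1000, 1050), "new takers": (900, 850)}

def overall(pool):
    total = sum(n for n, _ in pool.values())
    return sum(n * avg for n, avg in pool.values()) / total

print(f"old overall average: {overall(old_pool):.0f}")
print(f"new overall average: {overall(new_pool):.0f}")
# No group got worse, but the overall average dropped -- the
# "decline" that A Nation at Risk blamed on failing schools.
```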
WMDs turn the whole of human life into a game of Search Engine Optimization. With SEO, merchants hire companies who claim to have reverse-engineered Google's opaque model and whose advice will move your URL further up in its ranking.
When you pay someone thousands of dollars to prep your kid for the SATs, or to improve your ranking with the "e-score" providers that determine your creditworthiness, jobworthiness, or mortgageworthiness, you're recreating SEO, but for everything. It's a grim picture of the future: WMD makers and SEO experts locked in an endless arms-race to tweak their models to game one another, and all the rest of us being subjected to automated caprice or paying ransom to escape it (for now). In that future, we're all the product, not the customer (much less the citizen).
O'Neil's work is so important because she believes in data science. Algorithms can and will be used to locate people in difficulty: teachers with hard challenges, people in financial distress, people who are struggling in their jobs, students who need educational attention. It's up to us whether we use that information to exclude and further victimize those people, or to help them with additional resources.
Credit bureaux, e-scorers, and other entities that model us create externalities in the form of false positives -- from no-fly lists to credit-score errors to job score errors that cost us our careers. These errors cost them nothing to make, and something to fix -- and they're incredibly expensive to us. Like all negative externalities, the cost of cleaning them up (rehabilitating your job, finding a new home, serving a longer prison sentence, etc) is much higher than the savings to the firms, but we bear the costs and they reap the savings.
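The externality argument reduces to an expected-cost comparison. Here is a back-of-envelope sketch in which every figure is invented, included only to show why the incentives point the wrong way:

```python
# Back-of-envelope externality math; every figure here is invented.
people_scored = 1_000_000
false_positive_rate = 0.02   # erroneous flags (no-fly, credit, job scores)
firm_fix_cost = 50           # firm's cost to investigate and fix one error
victim_cost = 20_000         # lost job, denied loan, legal fees...

errors = int(people_scored * false_positive_rate)
firm_savings = errors * firm_fix_cost   # what the firm saves by not fixing
social_cost = errors * victim_cost      # what the scored people pay instead

print(f"errors: {errors:,}")
print(f"firm saves by ignoring them: ${firm_savings:,}")
print(f"cost borne by the scored:    ${social_cost:,}")
# The firm's private savings are tiny next to the externalized cost,
# so absent regulation there is no incentive to clean up the errors.
```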
It's E Pluribus Unum reversed: models make many out of one, pigeonholing each of us as members of groups about whom generalizations -- often punitive ones, such as variable pricing -- can be made.
Modelling won't go away: as a tool for guiding caring and helpful remedial systems, models are amazing. As a tool for punishing and disenfranchising, they are a nightmare. The choice is ours to make. O'Neil's book is a vital crash-course in the specialized kind of statistical knowledge we all need to interrogate the systems around us and demand better.
Weapons of Math Destruction [Cathy O'Neil/Crown]
https://boingboing.net/2016/09/06/weapons-of-math-destruction-i.html