#the energy just gets ZUCKED out it’s wild
roylustang · 11 months
Text
It’s so wild to me how the sun and heat make running hard. It’s like damn what’s wrong with me.
3 notes
malk1ns · 2 years
Note
2, 4, or 20 w sidgeno? love your writing❤️
i love YOU <3 <3 <3
gesture prompts are here
2. Fidgeting with their wedding ring.
The team has a wager on how long it'll take for the first question to come.
Zhenya's been banned from handling the bet money—he'd protested, trying to assert his authority as finemaster, but Cartsy had snatched the ledger from him and tucked it into his own bag with a stern reprimand. "If the guys have to go to you to place their bets," he'd said, "you'll end up somehow rigging it so that you and Sid win all the money."
Well, fair enough.
Zhenya had pouted long enough to make his point, then handed over his cash with his selected date while Sid pretended to disapprove from where he was packing up.
"And no colluding!" Tanger had shouted at them as they left the locker room. "I'll be watching your media, Sid—I'll know if you put your thumb on the scale for his day."
Sid had rolled his eyes expressively, but Zhenya hadn't planned on help from that quarter—Sid's too much of a rule-follower, had even agreed to the terms that the start time wouldn't be until the beginning of the regular season. All through camp and pre-season Zhenya watched him fidget with his necklace, with the extra weight there, but he'd dutifully kept his left hand folded under his right whenever he was pulled for media.
But now, they're trooping into the locker room after the first game of the new season, a solid thrashing of the Coyotes on home ice, and Sid winks at him before he crams his fresh black hat on and settles into his stall, waiting for the reporters to file in.
The little flash of metal glints under the camera lights. Sid's hands are still on his thighs, and it's right there, but nobody asks, not that night, and not after the next game either.
October turns over to November, and Rusty curses them all out after the Halloween party comes and goes and still nobody's asked. "Kelsey's gonna be so mad, she's the one that picked the day," he mutters, kicking at Sid on his way out. "Thanks for nothing, man."
Sid just smiles.
They're not even home when it finally happens.
They'd just slugged it out with the Wild, and Jarry had stood tall all through three grueling periods, overtime, and four shootout rounds before, fittingly, Zucks managed to snipe the puck over Flower's shoulder to finally end the game.
Shootouts make Sid jittery; he hates them, always gets hyped up on the bench watching everyone take their turn, and he's filled with extra energy for hours after. Zhenya's generally more than happy to help him burn it off after, but first they have to get through media and then trudge through the rapidly-falling snow across the park to their hotel.
Sid's answering one of Taylor's questions with a lot more energy than he usually devotes to rote postgame inquiries when it finally happens. He's talking through an almost-goal on the power play late in the second, tracing out the passes with his hands, when Taylor interrupts him.
"Uhhh, sorry Sid, but—is that a wedding ring?"
The whole change room falls silent.
"Oh," Sid says, looking down at his hands. He curls his left thumb across and strokes at the plain gold band, spinning it around his finger. "Yeah, it is. I got married over the summer."
None of the reporters seem to know what to say, after that. Sid blinks up at them innocently, media-smile firmly fixed in place, until Jen finally ushers them out of the room.
"God damn it," Cartsy says once the room is just team again. "Zucker, you asshole, the game-winner and you picked the closest day? What the fuck, man."
Jason crows in delight as Jeff passes over the envelope of cash. "It's my lucky night, I guess," he says smugly, pulling out the bills and making a show of counting them. "Straight to the holiday party fund, fellas, don't worry."
Zhenya meets Sid's eyes across the room in the uproar that ensues. Sid's smile has melted from media-practiced to his smaller, private one, the one Zhenya sees in the mornings when he's made Sid breakfast and gotten their stuff ready to head to the rink.
Zhenya winks, then twists to reach up to the top shelf of his locker and grab his own ring, sliding it onto his right ring finger and twisting it around until the metal starts to warm.
91 notes
I’m about midway through Zucked: Waking Up to the Facebook Catastrophe by Roger McNamee. It’s going slowly because what I’m reading makes me so angry, sad, or alarmed about my #1 destination on the Net. Almost every page causes me to stop and think, not just about Facebook’s betrayal of so many billion trusting users, but also about how this book confirms a growing concern of mine: the deteriorating relationship between people and technology generally, and my frustration that both self-regulation by tech companies and our government’s ability to protect us in situations such as this have so far been just plain impotent.
I am one of those who originally came to Facebook to share thoughts, ideas, and pictures with friends. It grew into a source of insight and information for my five most recent books, and it was an ample source of new business leads. For over ten years, Facebook provided me with ample returns on my significant investments of time.
Less so now.
Zucked is not the first book to warn against Facebook, but it is made more powerful and credible because its source is Roger McNamee, whom I consider to be among the most credible voices in technology.
I’ve known Roger since we were both just starting careers related to the business of technology. We were never close, but back in the early ’80s we shared a passion for the promise of personal technology, best described by the late Steve Jobs as a “bicycle for the mind,” a phrase mentioned in this book. I’ve long followed his thought leadership on technology as a primal transformative force.
Stone Wall Ahead
Zucked is giving me the very disturbing picture of billions of people riding their mental bicycles at breathtaking speed down an extremely long and darkening tunnel, at the end of which is a stone wall.
Most of us feel this sense of tunnel. We ride along surrounded by people who see what we see, think what we think, and oppose those who are different from us, and we keep pedaling along despite mounting evidence that the ride may end badly.
We humans have become a divided lot. Civility between us has deteriorated, as has trust: we’re increasingly disinclined to find common ground with one another, and we debate political and social issues with stridency and mistrust. We feel that righteousness is on our side and that those who disagree are evil, deranged, dangerous, or all three.
Roger McNamee believes the culprit that has done the most to distort our perceptions is Facebook, and in the half of the book that I have completed, he makes an overwhelmingly compelling case.
Manipulating Minds
Facebook, as you may know, is the most important company in history. More than 2.2 billion people log in at least once monthly. That’s about one in three people on Earth, once you eliminate those without digital access, children under age five, and seniors who have lost the ability or desire to use computers.
But wait. Sadly, there’s more.
Facebook also owns Instagram, which has 1.5 billion users, and WhatsApp, with about a billion more. Of course there’s overlap, but a conservative estimate for these three social networks gives us at least three billion unique users, most of whom visit more than once daily; some of us much more often.
Facebook and its two largest subsidiaries are manipulating the hearts and minds of half the world’s people: more, by orders of magnitude, than any company in history; more than twice the number of people controlled by the Chinese government today; more than the number of people suppressed by Germany, Japan, and Russia during World War 2.
According to McNamee, this empire is under the control of just two people: Sheryl Sandberg and Mark Zuckerberg.
The Virtuous We
McNamee served as Zuckerberg’s mentor from 2006 to 2009, starting shortly after young Zuck dropped out of Harvard, where Facebook began by helping frat boys at elite universities find dates. It became much more than that extremely quickly once Zuck and Silicon Valley found each other.
McNamee says he has written this broadside to sound the alarm, to warn us that Facebook has created the kind of Filter Bubble that Eli Pariser wrote about a few years back. This bubble filters what we see so that we like almost all of it. We talk almost exclusively with people who share our views. This establishes the idea that each of us is part of a virtuous we (my words). This is done, of course, by carefully calibrated algorithms. This social insulation is bad enough, but it gets worse by orders of magnitude when algorithms pit the virtuous we against the evil them: people who simply think differently about political and social issues.
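The bubble mechanism is easy enough to caricature in a few lines of code. This is purely my illustration, not anything from the book and certainly not Facebook’s actual system; the function name, tags, and headlines are all invented:

```python
# A toy "filter bubble": show only items whose viewpoint matches
# what the user already engages with. Illustrative only.
def bubble_filter(user_views: set, items: list) -> list:
    # Anything outside the user's existing viewpoints is silently dropped,
    # so the feed quietly becomes a mirror.
    return [item for item in items if item["viewpoint"] in user_views]

items = [
    {"headline": "Our side is right", "viewpoint": "ours"},
    {"headline": "The other side has a point", "viewpoint": "theirs"},
]
feed = bubble_filter({"ours"}, items)
print([i["headline"] for i in feed])  # only the agreeable headline survives
```

Nothing in a filter like this is malicious on its face; it simply never shows you the item you might disagree with, which is exactly the insulation Pariser described.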
The tools that Facebook uses are not inherently evil; no tools are. You can use a hammer to build a house or bludgeon a spouse. It’s up to the user, and Facebook has long defended itself as not being responsible for the hate, bullying, swindling, and despicable conduct most people have witnessed on Facebook.
McNamee points to the work of a well-intentioned person whom I consulted a few years ago: Stanford Professor BJ Fogg, who fathered the concept of Persuasive Computing, the study of how computers can be used to change people’s attitudes and behavior. When I knew Prof. Fogg, he talked enthusiastically about Persuasive Computing benefiting humankind, making us tolerant of diversity.
McNamee says Facebook uses Persuasive Computing as a tool not to benefit humankind but to manipulate it. It is Facebook’s power tool, not for the users, who are the product, but for the advertisers, who are the customers.
Facebook discovered that when people are angry, they post more, they link more, they stay on the social network longer. The company is agnostic about how this affects people, so long as it allows them to make money by sticking more ads in front of our faces. From the company’s perspective, you and I and another three billion people are not there to be entertained or otherwise made happy; we’re there to become data points for ad mongers.
Without Ethics
Everything we see, everyone suggested for us to Follow or Like, every Group we’re invited to join, is calculated by algorithms based on the perpetual collection of our data. These algorithms, of course, have machine intelligence, but they are devoid of other human qualities, including ethics, compassion, empathy, humor, irony, nuance, or any desire to find common ground between people who once would respectfully disagree.
Filter Bubbles, Persuasive Computing, and ever-more-effective algorithms manipulate us and make us addicted. We trust newcomers into our personal bubbles because they know people we know. This sounds comforting in itself, but it reinforces what we already think and introduces few new thoughts to ponder, unless they piss us off or scare us. So, if you are like my wife, Paula Israel, who is passionate about protecting animals in the wild, you will be fed all sorts of stories and pictures about horrible things being done to wolves or whales. If you hate Donald Trump, you will be selected to get tons of reports on the obscenities he foments daily; and if you believe that America should no longer be the welcoming place it has been for the tired, poor, huddled masses of the world, you will be fed fake news about rapists, terrorists, and drug runners massed at our southern border plotting to destroy a neighborhood near you.
Facebook’s data team has figured out that when we are outraged, horrified, angry, or saddened, we stay on the social network longer. We share more, we like more, and we post more, and the company has designed and calibrated the network so that we do exactly that.
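That calibration amounts to an objective function. The sketch below is mine, with made-up field names and weights; it is not Facebook’s real model, only the shape of the incentive when a ranker optimizes purely for predicted engagement:

```python
# Toy feed ranker that optimizes purely for predicted engagement.
# Weights and fields are invented for illustration.
def predicted_engagement(post: dict) -> float:
    # Hypothetical model: outrage drives engagement far more than substance.
    return 0.8 * post["outrage"] + 0.2 * post["informative"]

def rank_feed(posts: list) -> list:
    # Nothing here asks "is this good for the reader?", only "will they stay?"
    return sorted(posts, key=predicted_engagement, reverse=True)

feed = rank_feed([
    {"text": "Calm policy explainer", "outrage": 0.1, "informative": 0.9},
    {"text": "THEY are coming for YOU", "outrage": 0.9, "informative": 0.1},
])
print(feed[0]["text"])  # the outrage bait takes the top slot
```

The point of the caricature: no line of that code is hateful, but as long as outrage carries the heaviest weight, outrage is what the system serves.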
Brexit as a Petri Dish
The result, of course, has a great deal to do with the mess we are in. Hackers and fake news mongers learned to perfect voter manipulation during the Brexit campaign. Then they took what they learned there and refined it to serve Donald Trump in 2016, and nothing has happened to stop it from happening again in the US or anywhere else where there are supposed to be free elections.
Facebook has hired people to deal with the issue, only to see them resign in frustration shortly after starting. Basecamp has stopped advertising on, or even being present on, Facebook, Instagram, and WhatsApp. In April, Procter & Gamble put Facebook, and Google, on notice to change their practices or lose the ad support of the world’s largest consumer products company. If they leave, you can assume others will follow.
But so far, all this noise and concern, all these congressional inquiries and media diatribes, haven’t prevented Facebook from reporting greater and greater riches quarter after quarter after quarter.
I think that is because we addicts keep coming back and letting the algorithms manipulate our eyeballs.
Some of what I just said is in Zucked, while some of it is my own conclusion after reading just half of this essential book. Like most of my readers, I’ve become increasingly concerned about Facebook’s preference for algorithms over ethics.
I have not yet finished the book, as I mentioned. I’ve reached the point where McNamee has formed a small group of highly capable and influential people who are talking to the media, advising influential elected officials, and, of course, writing articles and this book. They’re speaking to anyone who will listen, in the hope that if Facebook won’t change itself, then the government should do it for them.
In Silicon Valley’s most powerful circles, there is a very long history of libertarianism in business: the consensus is that the tech industry can regulate itself better than the government can. I have long been of that mind, but this book has already convinced me otherwise.
There’s little evidence that the tech industry will self-regulate with any greater integrity or effectiveness than the oil and gas industry of an earlier era, when the government had to break up Standard Oil in 1911.
Our industry has been built on the legend of the startup in the world’s economy. Entrepreneurialism is on the shortlist of hopes for the future. It’s a great dream, full of wonderful stories, but the fact is that the miracle of the startup has been eclipsed by seemingly indestructible giants like Facebook (and Google, which shares many of Facebook’s questionable algorithmic manipulations).
As for me, I’m not about to leave either Facebook or Google. My work still depends on these platforms in a great many ways. But I am cutting back, more and more each day. In fact, I see Roger McNamee on the platform as well.
I believe there is a vanishing point somewhere in my not-too-distant future. I would prefer the break-up of Facebook by government, but I worry that both Congress and the Supreme Court would protect the interests of shareholders and advertisers over those of us, the three billion addicts.
The post Zucked: When Algorithms Replace Ethics appeared first on Tech Amender.
0 notes
ariellblogus · 5 years
Text
I’m about midway by way of Zucked: Waking Up to the Fb Catastrophe by Roger McNamee. It’s going sluggish as a result of what I am learning makes me so indignant, sad or alarmed about my #1 vacation spot on the Net. Almost every web page causes me to stop and assume not nearly Fb’s betrayal of so many billion trusting customers, but in addition that this guide confirms a rising concern I have concerning the deteriorating relationship between individuals and know-how generally and my frustration that both self-regulation by tech corporations and the power of our authorities to protect us in conditions akin to this have been thus far just plain impotent.
I am those who initially came to Fb to share thoughts, concepts and footage with associates. It grew to be a supply of perception and knowledge for my most up-to-date five books and it was an ample supply of latest enterprise leads. For over ten years, Fb has offered me with ample returns on my vital investments of time.
Less so now.
Zucked is just not the primary e-book that warned towards Fb, but it’s made extra highly effective and credible because the source is Roger McNamee, who I contemplate to be among the most credible voices in know-how.
I’ve recognized Roger since we have been both just starting careers associated to the enterprise of know-how. We have been never shut, but we did share a ardour back within the early 80s for the promise of private know-how, greatest described by the late Steve Jobs as a “bicycle for the mind,”  talked about on this e-book. I’ve lengthy adopted his thought management in areas to know-how as a primal transformative pressure.
Stone Wall Forward
Zucked is giving me this very disturbing picture that billions of individuals are driving their mental bicycles at breathtaking velocity down a particularly long and darkening tunnel on the finish of which is a stone wall.
Most of us really feel this sense of tunnel. We experience along surrounded by individuals who see what we see, assume what we expect, oppose those who are totally different from us and hold peddling alongside despite mounting proof that the journey might finish badly.
We humans have grow to be a divided lot. Civility between us has deteriorated as has trust: We’re more and more disinclined to seek out widespread ground with one another and we debate political and social issues with stridency and mistrust: We really feel that righteousness is on our aspect and this who disagree are evil, deranged, dangerous or all three.
Roger McNamee believes the wrongdoer that has achieved probably the most to distort our perceptions is Facebook, and in the half of the ebook that I have accomplished, he makes an overwhelmingly compelling case.
Manipulating Minds
Fb, as chances are you’ll know, is the most important company in historical past. More than 2.2 billion individuals log in no less than as soon as month-to-month. That’s about one in three individuals on Earth if you get rid of these with out digital entry or youngsters underneath age five or seniors who’ve lost the power or want to make use of computer systems.
However wait. Sadly, there’s extra.
Fb additionally owns Instagram, which has 1.5 billion customers and WhatsApp with a few billion extra. In fact, there’s overlap, but a conservative estimate of these three social networks provides us a minimum of three billion distinctive users, most of whom go to more than as soon as every day; some of us much more.
Fb and its two largest subsidiaries are manipulating the hearts and minds of half the world’s individuals, extra by orders of magnitude, than any company in history, greater than twice the variety of individuals controlled by the Chinese language authorities as we speak; more than the variety of individuals suppressed by Germany, Japan and Russia throughout World Warfare 2.
In accordance with McNamee, the empire is beneath the management of just two individuals Sheryl Sandberg and Mark Zuckerberg.
The Virtuous We
McNamee served as Zuckerberg's mentor from 2006 to 2009, starting shortly after young Zuck dropped out of Harvard, where Facebook began by helping frat boys at elite universities find dates. It became much more than that very quickly, once Zuck and Silicon Valley found each other.
McNamee says he has written this broadside to sound the alarm, to warn us that Facebook has created the kind of Filter Bubble that Eli Pariser wrote about a few years back. The bubble filters what we see so that we like almost all of it. We talk almost exclusively with people who share our views. This establishes the idea that each of us is part of a virtuous we (my words). All of it is done, of course, by carefully calibrated algorithms. This social insulation is bad enough, but it worsens by orders of magnitude when the algorithms pit the virtuous we against the evil them: people who simply think differently about political and social issues.
The tools that Facebook uses are not inherently evil: no tools are. You can use a hammer to build a house or to bludgeon a spouse. It is up to the user, and Facebook has long defended itself by insisting that it is not responsible for the hate, bullying, swindling and despicable behavior most people have witnessed on Facebook.
McNamee points to the work of a well-intentioned person whom I consulted a few years ago: Stanford Professor BJ Fogg, who fathered the concept of Persuasive Computing, the study of how computers can be used to change people's attitudes and behavior. When I knew Prof. Fogg, he talked enthusiastically about Persuasive Computing benefiting humankind, making us more tolerant of diversity.
McNamee says Facebook uses Persuasive Computing not as a tool to benefit humankind, but to manipulate it. It is Facebook's power tool, wielded not for the users, who are the product, but for the advertisers, who are the customers.
Facebook discovered that when people are pissed off, they post more, they link more, and they stay on the social network longer. The company is agnostic about how this affects people, so long as it can profit by sticking more ads in front of our faces. From the company's perspective, you and I and another three billion people are not there to be entertained or otherwise made happy; we are there to become data points for ad mongers.
Without Ethics
Everything we see, everyone suggested for us to Follow or Like, every Group we are invited to join is calculated by algorithms and based on the perpetual collection of our data. These algorithms, of course, have machine intelligence, but they are devoid of other human qualities, including ethics, compassion, empathy, humor, irony, nuance or any desire to find common ground between people who once would respectfully disagree.
Filter Bubbles, Persuasive Computing and ever-more-effective algorithms manipulate us and make us addicted. We trust newcomers into our personal bubbles because they know people we know. That sounds comforting in itself, but it reinforces what we already think and introduces few new thoughts to ponder, unless they piss us off or scare us. So, if you are like my wife, Paula Israel, who is passionate about protecting animals in the wild, you will be fed all sorts of stories and pictures about horrible things being done to wolves or whales. If you hate Donald Trump, you will be selected to receive tons of reports on the obscenities he foments daily; and if you believe that America should no longer be the welcoming place it has been for the tired, poor, huddled masses of the world, you will be fed fake news about rapists, terrorists and drug runners massed at our southern border, plotting to destroy a neighborhood near you.
Facebook's data has figured out that when we are outraged, horrified, angry or saddened, we stay on the social network longer. We share more, we like more, and we post more, and the company has designed and calibrated its platform so that we do exactly that.
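The engagement-first logic described above can be sketched as a toy ranking function. To be clear, this is an illustration of the general idea, not Facebook's actual code: the fields, weights and function names below are all invented for the example.

```python
from dataclasses import dataclass

# Toy sketch of engagement-optimized feed ranking. The weights are
# hypothetical; the point is only that nothing in the score measures
# truthfulness, civility, or the reader's well-being.

@dataclass
class Post:
    text: str
    shares: int
    comments: int
    angry_reactions: int
    likes: int

def engagement_score(p: Post) -> float:
    # Reactions that keep people on the site longer (shares, comments,
    # outrage) are weighted more heavily than a quiet like.
    return 3.0 * p.shares + 2.0 * p.comments + 2.5 * p.angry_reactions + 1.0 * p.likes

def rank_feed(posts: list[Post]) -> list[Post]:
    # Sort descending by predicted engagement.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Calm policy explainer", shares=2, comments=5, angry_reactions=0, likes=40),
    Post("Outrage bait headline", shares=30, comments=60, angry_reactions=80, likes=10),
])
print(feed[0].text)  # the outrage bait surfaces first
```

Under any weighting of this shape, the post that provokes the strongest reactions rises to the top of the feed, which is exactly the calibration the author is describing.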
Brexit as a Petri Dish
The result, of course, has a great deal to do with the mess we are in. Hackers and fake news mongers learned to perfect voter manipulation during the Brexit campaign. Then they took what they learned there and refined it to serve Donald Trump in 2016, and nothing has happened to stop it from happening again in the US or anywhere else that is supposed to hold free elections.
Facebook has hired people to deal with the problem, only to watch them resign in frustration shortly after starting. Basecamp has stopped advertising on, or even being present on, Facebook, Instagram and WhatsApp. In April, Procter & Gamble put Facebook (and Google) on notice to change their practices or lose the ad support of the world's largest consumer products company. If they leave, you can assume others will follow.
But so far, all this noise and concern, all these congressional inquiries and media diatribes have not prevented Facebook from reporting greater and greater riches quarter after quarter after quarter.
I think that is because we addicts keep coming back and letting the algorithms control our eyeballs.
Some of what I just said is in Zucked, and some of it is my own conclusion after reading just half of this essential book. Like most of my readers, I have become increasingly concerned about Facebook's preference for algorithms over ethics.
As I mentioned, I have not yet finished the book. I have reached the point where McNamee has formed a small group of highly capable and influential people who are talking to the media, advising influential elected officials and, of course, writing articles and this book. They are speaking to anyone who will listen, in the hope that if Facebook won't change itself, then the government should do it for them.
In Silicon Valley's most powerful circles, there is a very long history of libertarianism in business: the consensus is that the tech industry can regulate itself better than the government can. I have long been of that mind, but this book has already convinced me otherwise.
There is little evidence that the tech industry will self-regulate with any greater integrity or effectiveness than the oil and gas industry of an earlier era, when the government had to break up Standard Oil in 1911.
Our industry has been built on the legend of what startups mean to the world's economy. Entrepreneurialism is on the shortlist of hopes for the future. It is a great dream, full of wonderful stories, but the fact is that the miracle of the startup has been eclipsed by seemingly indestructible giants like Facebook (and Google, which shares many of Facebook's questionable algorithmic manipulations).
As for me, I am not about to leave either Facebook or Google. My work still depends on these platforms in a great many ways. But I am cutting back, more and more each day. For that matter, I see Roger McNamee on the platform as well.
I believe there is a vanishing point somewhere in my not-too-distant future. I would prefer the break-up of Facebook by government, but I worry that both Congress and the Supreme Court would protect the interests of shareholders and advertisers over those of us, the three billion addicts.
The post Zucked: When Algorithms Replace Ethics appeared first on Tech Amender.
I’m about midway by way of Zucked: Waking Up to the Fb Catastrophe by Roger McNamee. It’s going sluggish as a result of what I am learning makes me so indignant, sad or alarmed about my #1 vacation spot on the Net. Almost every web page causes me to stop and assume not nearly Fb’s betrayal of so many billion trusting customers, but in addition that this guide confirms a rising concern I have concerning the deteriorating relationship between individuals and know-how generally and my frustration that both self-regulation by tech corporations and the power of our authorities to protect us in conditions akin to this have been thus far just plain impotent.
I am those who initially came to Fb to share thoughts, concepts and footage with associates. It grew to be a supply of perception and knowledge for my most up-to-date five books and it was an ample supply of latest enterprise leads. For over ten years, Fb has offered me with ample returns on my vital investments of time.
Less so now.
Zucked is just not the primary e-book that warned towards Fb, but it’s made extra highly effective and credible because the source is Roger McNamee, who I contemplate to be among the most credible voices in know-how.
I’ve recognized Roger since we have been both just starting careers associated to the enterprise of know-how. We have been never shut, but we did share a ardour back within the early 80s for the promise of private know-how, greatest described by the late Steve Jobs as a “bicycle for the mind,”  talked about on this e-book. I’ve lengthy adopted his thought management in areas to know-how as a primal transformative pressure.
Stone Wall Ahead
Zucked paints a deeply disturbing picture: billions of people riding their mental bicycles at breathtaking speed down an extremely long and darkening tunnel, at the end of which stands a stone wall.
Most of us feel this sense of tunnel. We ride along surrounded by people who see what we see and think what we think, oppose those who are different from us, and keep pedaling despite mounting evidence that the ride may end badly.
We humans have become a divided lot. Civility between us has deteriorated, as has trust. We are increasingly disinclined to find common ground with one another, and we debate political and social issues with stridency and mistrust. We feel that righteousness is on our side and that those who disagree are evil, deranged, dangerous, or all three.
Roger McNamee believes the culprit that has done the most to distort our perceptions is Facebook, and in the half of the book that I have completed, he makes an overwhelmingly compelling case.
Manipulating Minds
Facebook, as you may know, is the most important company in history. More than 2.2 billion people log in at least once monthly. That’s about one in three people on Earth, once you eliminate those without digital access, children under age five, and seniors who have lost the ability or desire to use computers.
But wait. Sadly, there’s more.
Facebook also owns Instagram, which has 1.5 billion users, and WhatsApp, with about a billion more. Of course there is overlap, but a conservative estimate across these three social networks gives us at least three billion unique users, most of whom visit more than once daily; some of us far more often.
Facebook and its two largest subsidiaries are manipulating the hearts and minds of half the world’s people: more, by orders of magnitude, than any company in history; more than twice the number of people controlled by the Chinese government today; more than the number of people suppressed by Germany, Japan, and Russia during World War II.
According to McNamee, this empire is under the control of just two people: Sheryl Sandberg and Mark Zuckerberg.
The Virtuous We
McNamee served as Zuckerberg’s mentor from 2006 to 2009, starting shortly after young Zuck dropped out of Harvard, where Facebook began by helping frat boys at elite universities find dates. It became much more than that extremely quickly, once Zuck and Silicon Valley discovered each other.
McNamee says he has written this broadside to sound the alarm, to warn us that Facebook has created the kind of filter bubble that Eli Pariser wrote about a few years back. This bubble filters what we see so that we like almost all of it. We talk almost exclusively with people who share our views. This establishes the idea that each of us is part of a virtuous we (my words). It is accomplished, of course, by carefully calibrated algorithms. This social insulation is bad enough, but it gets worse by orders of magnitude when algorithms pit the virtuous we against an evil them: people who simply think differently about political and social issues.
The tools that Facebook uses are not inherently evil; no tools are. You can use a hammer to build a house or bludgeon a spouse. It’s up to the user, and Facebook has long defended itself as not being responsible for the hate, bullying, swindling, and despicable conduct most people have witnessed on Facebook.
McNamee points to the work of a well-intentioned person, whom I consulted a few years ago: Stanford Professor BJ Fogg, who fathered the concept of persuasive computing, the study of how computers can be used to change people’s attitudes and behavior. When I knew Prof. Fogg, he talked enthusiastically about persuasive computing benefiting humankind, making us more tolerant of diversity.
McNamee says Facebook uses persuasive computing not as a tool to benefit humankind, but to manipulate it. It is Facebook’s power tool, wielded not for the users, who are the product, but for the advertisers, who are the customers.
Facebook discovered that when people are angry, they post more, they link more, and they stay on the social network longer. The company is agnostic about how this affects people, so long as it allows them to gain revenue by sticking more ads in front of our faces. From the company’s perspective, you and I and another three billion people are not there to be entertained or otherwise made happy; we are there to become data points for ad mongers.
With out Ethics
Everything we see, everyone we are told to Follow or Like, and every Group we are invited to join is calculated by algorithms and based on the perpetual collection of our data. These algorithms, of course, have machine intelligence, but they are devoid of other human qualities, including ethics, compassion, empathy, humor, irony, nuance, or any desire to find common ground between people who once would respectfully disagree.
Filter bubbles, persuasive computing, and ever more effective algorithms manipulate us and make us addicted. We trust newcomers into our personal bubbles because they know people we know. This sounds comforting in itself, but it reinforces what we already think and introduces few new thoughts to ponder, unless those thoughts anger or scare us. So, if you are like my wife, Paula Israel, who is passionate about protecting animals in the wild, you will be fed all sorts of stories and pictures about horrible things being done to wolves or whales. If you hate Donald Trump, you will be selected to get tons of reports on the obscenities he foments each day; and if you believe that America should no longer be the welcoming place it has been for the tired, poor, huddled masses of the world, you will be fed fake news about rapists, terrorists, and drug runners massed at our southern border, plotting to destroy a neighborhood near you.
Facebook’s data has shown that when we are outraged, horrified, angry, or saddened, we stay on the social network longer. We share more, we like more, and we post more, and the network has been designed and calibrated so that we do exactly that.
Brexit as a Petri Dish
The result, of course, has a great deal to do with the mess we are in. Hackers and fake news mongers learned to perfect voter manipulation during the Brexit campaign in 2015 and 2016. Then they took what they learned there and refined it to serve Donald Trump in 2016, and nothing has happened to stop it from happening again in the US or anywhere else where elections are supposed to be free.
Facebook has hired people to deal with the problem, only to see them resign in frustration shortly after starting. Basecamp has stopped advertising on, and being present on, Facebook, Instagram, and WhatsApp. In April, Procter & Gamble put Facebook, and Google, on notice to change their practices or lose the ad support of the world’s largest consumer products company. If they leave, you can assume others will follow.
But so far, all this noise and concern, all these congressional inquiries and media diatribes, have not prevented Facebook from reporting greater and greater riches quarter after quarter after quarter.
I think that is because we addicts keep coming back and allowing the algorithms to manipulate our eyeballs.
Some of what I have just said is in Zucked, while some of it is my own conclusion after reading just half of this essential book. Like most of my readers, I have become increasingly concerned about Facebook’s preference for algorithms over ethics.
As I mentioned, I have not yet finished the book. I have reached the point where McNamee has formed a small group of highly capable and influential people who are talking to the media, advising influential elected officials and, of course, writing articles and this book. They are speaking with anyone who will listen, in the hope that if Facebook will not change itself, then the government should do it for them.
In Silicon Valley’s most powerful circles, there is a very long history of libertarianism in business: the consensus is that the tech industry can regulate itself better than the government can. I have long been of that mind, but this book has already convinced me otherwise.
There is little evidence that the tech industry will self-regulate with any greater integrity or effectiveness than the oil and gas industry of an earlier era, when the government had to break up Standard Oil in 1911.
Our industry has been built on the legend of the startup in the world’s economy. Entrepreneurialism is on the short list of hopes for the future. It’s a great dream, full of wonderful stories, but the fact is that the miracle of the startup has been eclipsed by seemingly indestructible giants like Facebook (and Google, which shares many of Facebook’s questionable algorithmic manipulations).
As for me, I’m not about to leave either Facebook or Google. My work still depends on these platforms in a great many ways. But I am cutting back, more and more each day. Tellingly, I see Roger McNamee still on the platform as well.
I believe there is a vanishing point somewhere in my not-too-distant future. I would prefer the break-up of Facebook by government, but I worry that both Congress and the Supreme Court would protect the interests of shareholders and advertisers rather than those of us, the three billion addicts.
The post Zucked: When Algorithms Replace Ethics appeared first on Tech Amender.
0 notes
csemntwinl3x0a1 · 6 years
Text
From USENET to Facebook: The second time as farce
Demanding and building a social network that serves us and enables free speech, rather than serving a business metric that amplifies noise, is the way to end the farce.
Re-interpreting Hegel, Marx said that everything in history happens twice, the first time as tragedy, the second as farce. That’s a fitting summary of Facebook’s Very Bad Month. There’s nothing here we haven’t seen before, nothing about abuse, trolling, racism, spam, porn, and even bots that hasn’t already happened. This time as farce? Certainly Zuckerberg’s 14-year Apology Tour, as Zeynep Tufekci calls it, has the look and feel of a farce. He just can’t stop apologizing for Facebook’s messes.
Except that the farce isn’t over yet. We’re in the middle of it. As Tufekci points out, 2018 isn’t the first time Zuckerberg has said “we blew it, we’ll do better.” Apology has been a roughly biennial occurrence since Facebook’s earliest days. So, the question we face is simple: how do we bring this sad history to an endpoint that isn’t farce? The third time around, should there be one, it isn’t even farce; it’s just stupidity. We don’t have to accept future apologies, whether they come from Zuck or some other network magnate, as inevitable.
I want to think about what we can learn from the forerunners of modern social networks—specifically about USENET, the proto-internet of the 1980s and 90s. (The same observations probably apply to BBSs, though I’m less familiar with them.) USENET was a decentralized and unmanaged system that allowed Unix users to exchange “posts” by sending them to hundreds of newsgroups. It started in the early 80s, peaked sometime around 1995, and arguably ended as tragedy (though it went out with a whimper, not a bang).
Facebook repeats the pattern of USENET, this time as farce. As a no-holds-barred Wild West sort of social network, USENET was filled with everything we rightly complain about today. It was easy to troll and be abusive; all too many participants did it for fun. Most groups were eventually flooded by spam, long before spam became a problem for email. Much of that spam distributed pornography or pirated software (“warez”). You could certainly find newsgroups in which to express your inner neo-Nazi or white supremacist self. Fake news? We had that; we had malicious answers to technical questions that would get new users to trash their systems. And yes, there were bots; that technology isn’t as new as we’d like to think.
But there was a big divide on USENET between moderated and unmoderated newsgroups. Posts to moderated newsgroups had to be approved by a human moderator before they were pushed to the rest of the network. Moderated groups were much less prone to abuse. They weren’t immune, certainly, but moderated groups remained virtual places where discussion was mostly civilized, and where you could get questions answered. Unmoderated newsgroups were always spam-filled and frequently abusive, and the alt.* newsgroups, which could be created by anyone, for any reason, matched anything we have now for bad behavior.
So, the first thing we should learn from USENET is the importance of moderation. Fully human moderation at Facebook scale is impossible. With seven billion pieces of content shared per day, even a million moderators would have to scan seven thousand posts each: roughly 4 seconds per post. But we don’t need to rely on human moderation. After USENET’s decline, research showed that it was possible to classify users as newbies, helpers, leaders, trolls, or flamers, purely by their communications patterns—with only minimal help from the content. This could be the basis for automated moderation assistants that kick suspicious posts over to human moderators, who would then have the final word. Whether automated or human, moderators prevent many of the bad posts from being made in the first place. It’s no fun being a troll if you can’t get through to your victims.
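The arithmetic behind that claim is easy to verify. Here is a quick sketch; the eight-hour shift is my assumption, since the article only gives the per-post figure:

```python
# Back-of-envelope check of the moderation-at-scale figures above.
posts_per_day = 7_000_000_000   # pieces of content shared daily, per the article
moderators = 1_000_000          # a hypothetical million-strong workforce

posts_per_moderator = posts_per_day // moderators
print(posts_per_moderator)  # 7000 posts per moderator per day

# Assume each moderator works an 8-hour shift (28,800 seconds).
shift_seconds = 8 * 60 * 60
seconds_per_post = shift_seconds / posts_per_moderator
print(round(seconds_per_post, 1))  # roughly 4.1 seconds per post
```

Even with generous assumptions, each post gets only a few seconds of human attention, which is why triage has to be automated.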
Automated moderation can also do fact checking. The technology that won Jeopardy a decade ago is more than capable of checking basic facts. It might not be capable of checking complex logic, but most “fake news” centers around facts that can easily be evaluated. And automated systems are very capable of detecting bots: Google’s Gmail has successfully throttled spam.
What else can we learn from USENET? Trolls were everywhere, but the really obnoxious stuff stayed where it was supposed to be. I’m not naive enough to think that neo-Nazis and white supremacists will dry up and go away, on Facebook or elsewhere. And I’m even content to allow them to have their own Facebook pages: Facebook can let these people talk to each other all they want, because they’re going to do that anyway, whatever tools you put in place. The problem we have now is that Facebook’s engagement metric paves the road to their door. Once you give someone a hit of something titillating, they’ll come back for more. And the next hit has to be stronger. That’s how you keep people engaged, and that’s (as Tufekci has argued about YouTube) how you radicalize them.
USENET had no engagement metrics, no means of linking users to stronger content. Islands of hatred certainly existed. But in a network that didn’t optimize for engagement, hate groups didn’t spread. Neo-Nazis and their like were certainly there, but you had to search them out, you weren’t pushed to them. The platform didn’t lead you there, trying to maximize your “engagement.” I can’t claim that was some sort of brilliant design on USENET’s part; it just wasn’t something anyone thought about at the time. And as a free service, there was no need to maximize profit. Facebook’s obsession with engagement is ultimately more dangerous than their sloppy handling of personal data. “Engagement” allows—indeed, encourages—hate groups to metastasize.
Engagement metrics harm free speech, another ideal carried to the modern internet from the USENET world. But in an “attention economy,” where the limiting factor is attention, not speech, we have to rethink what those values mean. I’ve said that USENET ended in a “whimper”—but what drained the energy away? The participants who contributed real value just got tired of wading through the spam and fighting off the trolls. They went elsewhere. USENET’s history gives us a warning: good speech was crowded off the stage by bad speech.
Speech that exists to crowd out other speech isn’t the unfettered interchange of ideas. Free speech doesn’t mean the right to a platform. Indeed, the U.S. Constitution already makes that distinction: “freedom of the press” is about platforms, and you don’t get freedom of the press unless you have a press. Again, Zeynep Tufekci has it: in “It’s the (Democracy-Poisoning) Golden Age of Free Speech,” she writes “The most effective forms of censorship today involve meddling with trust and attention, not muzzling speech itself.” Censorship isn’t about arresting dissidents; it’s about generating so much noise that voices you don’t like can’t be heard.
If we’re to put an end to the farce, we need to understand what it means to enable speech, rather than to drown it out. Abandoning “engagement” is part of the solution. We will be better served by a network that, like USENET, doesn’t care how people engage, and that allows them to make their own connections. Automated moderation can be a tool that makes room for speech, particularly if we can take advantage of communication patterns to moderate those whose primary goal is to be the loudest voice.
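To make the pattern-based idea concrete, here is a toy sketch of classifying posters from activity statistics alone, in the spirit of the post-USENET research mentioned above. The feature names and thresholds are invented for illustration; they are not from that research:

```python
# Toy classifier that labels posters purely from communication patterns,
# ignoring message content entirely. All thresholds are illustrative.

def classify_poster(posts_per_day: float,
                    reply_fraction: float,
                    thread_starts_fraction: float) -> str:
    """Label a user from simple activity statistics alone."""
    if posts_per_day > 50 and reply_fraction > 0.9:
        return "flamer"   # floods replies into other people's threads
    if thread_starts_fraction > 0.8 and reply_fraction < 0.2:
        return "troll"    # starts provocations, rarely engages back
    if reply_fraction > 0.6:
        return "helper"   # mostly answers others' questions
    if posts_per_day < 1:
        return "newbie"
    return "regular"

# A prolific all-replies account gets flagged for human review:
print(classify_poster(60, 0.95, 0.05))  # flamer
```

A real assistant would learn such boundaries from data, but the point stands: suspicious patterns can be flagged cheaply and routed to the human moderators who keep the final word.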
Marx certainly would have laid blame at the feet of Zuckerberg, for naively and profitably commoditizing the social identities of his users. But blame is not a solution. As convenient a punching bag as Zuckerberg is, we have to recognize that Facebook’s problems extend to the entire social world. That includes Twitter and YouTube, many other social networks past and present, and many networks that are neither online nor social. Expecting Zuck to “fix Facebook” may be the best way to guarantee that the farce plays on.
History is only deterministic in hindsight, and it doesn’t have to end in farce (or worse). We all build our social networks, and Mark Zuckerberg isn’t the only player on history’s stage. We need to revisit, reassess, and learn from all of our past social networks. Demanding and building a social network that serves us and enables free speech, rather than serving a business metric that amplifies noise, is the way to end the farce.
Is that a revolution? We have nothing to lose but our chains.
Continue reading From USENET to Facebook: The second time as farce.
https://ift.tt/2vrOB5g
0 notes
doorrepcal33169 · 6 years
Text
From USENET to Facebook: The second time as farce
Demanding and building a social network that serves us and enables free speech, rather than serving a business metric that amplifies noise, is the way to end the farce.
Re-interpreting Hegel, Marx said that everything in history happens twice, the first time as tragedy, the second as farce. That’s a fitting summary of Facebook’s Very Bad Month. There’s nothing here we haven’t seen before, nothing about abuse, trolling, racism, spam, porn, and even bots that hasn’t already happened. This time as farce? Certainly Zuckerberg’s 14-year Apology Tour, as Zeynep Tufecki calls it, has the look and feel of a farce. He just can’t stop apologizing for Facebook’s messes.
Except that the farce isn’t over yet. We’re in the middle of it. As Tufekci points out, 2018 isn’t the first time Zuckerberg has said “we blew it, we’ll do better.” Apology has been a roughly biennial occurrence since Facebook’s earliest days. So, the question we face is simple: how do we bring this sad history to an endpoint that isn’t farce? The third time around, should there be one, it isn’t even farce; it’s just stupidity. We don’t have to accept future apologies, whether they come from Zuck or some other network magnate, as inevitable.
I want to think about what we can learn from the forerunners of modern social networks—specifically about USENET, the proto-internet of the 1980s and 90s. (The same observations probably apply to BBSs, though I’m less familiar with them.) USENET was a decentralized and unmanaged system that allowed Unix users to exchange “posts” by sending them to hundreds of newsgroups. It started in the early 80s, peaked sometime around 1995, and arguably ended as tragedy (though it went out with a whimper, not a bang).
Facebook repeats the pattern of USENET, this time as farce. As a no-holds-barred Wild West sort of social network, USENET was filled with everything we rightly complain about today. It was easy to troll and be abusive; all too many participants did it for fun. Most groups were eventually flooded by spam, long before spam became a problem for email. Much of that spam distributed pornography or pirated software (“warez”). You could certainly find newsgroups in which to express your inner neo-Nazi or white supremacist self. Fake news? We had that; we had malicious answers to technical questions that would get new users to trash their systems. And yes, there were bots; that technology isn’t as new as we’d like to think.
But there was a big divide on USENET between moderated and unmoderated newsgroups. Posts to moderated newsgroups had to be approved by a human moderator before they were pushed to the rest of the network. Moderated groups were much less prone to abuse. They weren’t immune, certainly, but moderated groups remained virtual places where discussion was mostly civilized, and where you could get questions answered. Unmoderated newsgroups were always spam-filled and frequently abusive, and the alt.* newsgroups, which could be created by anyone, for any reason, matched anything we have now for bad behavior.
So, the first thing we should learn from USENET is the importance of moderation. Fully human moderation at Facebook scale is impossible. With seven billion pieces of content shared per day, even a million moderators would have to scan seven thousand posts each: roughly 4 seconds per post. But we don’t need to rely on human moderation. After USENET’s decline, research showed that it was possible to classify users as newbies, helpers, leaders, trolls, or flamers, purely by their communications patterns—with only minimal help from the content. This could be the basis for automated moderation assistants that kick suspicious posts over to human moderators, who would then have the final word. Whether automated or human, moderators prevent many of the bad posts from being made in the first place. It’s no fun being a troll if you can’t get through to your victims.
Automated moderation can also do fact checking. The technology that won Jeopardy a decade ago is more than capable of checking basic facts. It might not be capable of checking complex logic, but most “fake news” centers around facts that can easily be evaluated. And automated systems are very capable of detecting bots: Google’s Gmail has successfully throttled spam.
What else can we learn from USENET? Trolls were everywhere, but the really obnoxious stuff stayed where it was supposed to be. I’m not naive enough to think that neo-Nazis and white supremacists will dry up and go away, on Facebook or elsewhere. And I’m even content to allow them to have their own Facebook pages: Facebook can let these people talk to each other all they want, because they’re going to do that anyway, whatever tools you put in place. The problem we have now is that Facebook’s engagement metric paves the road to their door. Once you give someone a hit of something titillating, they’ll come back for more. And the next hit has to be stronger. That’s how you keep people engaged, and that’s (as Tufekci has argued about YouTube) how you radicalize them.
USENET had no engagement metrics, no means of linking users to stronger content. Islands of hatred certainly existed. But in a network that didn’t optimize for engagement, hate groups didn’t spread. Neo-Nazis and their like were certainly there, but you had to search them out, you weren’t pushed to them. The platform didn’t lead you there, trying to maximize your “engagement.” I can’t claim that was some sort of brilliant design on USENET’s part; it just wasn’t something anyone thought about at the time. And as a free service, there was a need to maximize profit. Facebook’s obsession with engagement is ultimately more dangerous than their sloppy handling of personal data. “Engagement” allows—indeed, encourages—hate groups to metastasize.
Engagement metrics harm free speech, another ideal carried to the modern internet from the USENET world. But in an “attention economy,” where the limiting factor is attention, not speech, we have to rethink what those values mean. I’ve said that USENET ended in a “whimper”—but what drained the energy away? The participants who contributed real value just got tired of wading through the spam and fighting off the trolls. They went elsewhere. USENET’s history gives us a warning: good speech was crowded off the stage by bad speech.
Speech that exists to crowd out other speech isn’t the unfettered interchange of ideas. Free speech doesn’t mean the right to a platform. Indeed, the U.S. Constitution already makes that distinction: “freedom of the press” is about platforms, and you don’t get freedom of the press unless you have a press. Again, Zeynep Tufekci has it: in “It’s the (Democracy-Poisoning) Golden Age of Free Speech,” she writes “The most effective forms of censorship today involve meddling with trust and attention, not muzzling speech itself.” Censorship isn’t about arresting dissidents; it’s about generating so much noise that voices you don’t like can’t be heard.
If we’re to put an end to the farce, we need to understand what it means to enable speech, rather than to drown it out. Abandoning “engagement” is part of the solution. We will be better served by a network that, like USENET, doesn’t care how people engage, and that allows them to make their own connections. Automated moderation can be a tool that makes room for speech, particularly if we can take advantage of communication patterns to moderate those whose primary goal is to be the loudest voice.
Marx certainly would have laid blame at the feet of Zuckerberg, for naively and profitably commoditizing the social identities of his users. But blame is not a solution. As convenient a punching bag as Zuckerberg is, we have to recognize that Facebook’s problems extend to the entire social world. That includes Twitter and YouTube, many other social networks past and present, and many networks that are neither online nor social. Expecting Zuck to “fix Facebook” may be the best way to guarantee that the farce plays on.
History is only deterministic in hindsight, and it doesn’t have to end in farce (or worse). We all build our social networks, and Mark Zuckerberg isn’t the only player on history’s stage. We need to revisit, reassess, and learn from all of our past social networks. Demanding and building a social network that serves us and enables free speech, rather than serving a business metric that amplifies noise, is the way to end the farce.
Is that a revolution? We have nothing to lose but our chains.
Continue reading From USENET to Facebook: The second time as farce.
from FEED 10 TECHNOLOGY https://ift.tt/2vrOB5g
0 notes
megatechcrunch · 6 years
Link
Demanding and building a social network that serves us and enables free speech, rather than serving a business metric that amplifies noise, is the way to end the farce.
Re-interpreting Hegel, Marx said that everything in history happens twice, the first time as tragedy, the second as farce. That’s a fitting summary of Facebook’s Very Bad Month. There’s nothing here we haven’t seen before, nothing about abuse, trolling, racism, spam, porn, and even bots that hasn’t already happened. This time as farce? Certainly Zuckerberg’s 14-year Apology Tour, as Zeynep Tufecki calls it, has the look and feel of a farce. He just can’t stop apologizing for Facebook’s messes.
Except that the farce isn’t over yet. We’re in the middle of it. As Tufekci points out, 2018 isn’t the first time Zuckerberg has said “we blew it, we’ll do better.” Apology has been a roughly biennial occurrence since Facebook’s earliest days. So, the question we face is simple: how do we bring this sad history to an endpoint that isn’t farce? The third time around, should there be one, it isn’t even farce; it’s just stupidity. We don’t have to accept future apologies, whether they come from Zuck or some other network magnate, as inevitable.
I want to think about what we can learn from the forerunners of modern social networks—specifically about USENET, the proto-internet of the 1980s and 90s. (The same observations probably apply to BBSs, though I’m less familiar with them.) USENET was a decentralized and unmanaged system that allowed Unix users to exchange “posts” by sending them to hundreds of newsgroups. It started in the early 80s, peaked sometime around 1995, and arguably ended as tragedy (though it went out with a whimper, not a bang).
Facebook repeats the pattern of USENET, this time as farce. As a no-holds-barred Wild West sort of social network, USENET was filled with everything we rightly complain about today. It was easy to troll and be abusive; all too many participants did it for fun. Most groups were eventually flooded by spam, long before spam became a problem for email. Much of that spam distributed pornography or pirated software (“warez”). You could certainly find newsgroups in which to express your inner neo-Nazi or white supremacist self. Fake news? We had that; we had malicious answers to technical questions that would get new users to trash their systems. And yes, there were bots; that technology isn’t as new as we’d like to think.
But there was a big divide on USENET between moderated and unmoderated newsgroups. Posts to moderated newsgroups had to be approved by a human moderator before they were pushed to the rest of the network. Moderated groups were much less prone to abuse. They weren’t immune, certainly, but moderated groups remained virtual places where discussion was mostly civilized, and where you could get questions answered. Unmoderated newsgroups were always spam-filled and frequently abusive, and the alt.* newsgroups, which could be created by anyone, for any reason, matched anything we have now for bad behavior.
So, the first thing we should learn from USENET is the importance of moderation. Fully human moderation at Facebook scale is impossible. With seven billion pieces of content shared per day, even a million moderators would have to scan seven thousand posts each: roughly 4 seconds per post. But we don’t need to rely on human moderation. After USENET’s decline, research showed that it was possible to classify users as newbies, helpers, leaders, trolls, or flamers, purely by their communications patterns—with only minimal help from the content. This could be the basis for automated moderation assistants that kick suspicious posts over to human moderators, who would then have the final word. Whether automated or human, moderators prevent many of the bad posts from being made in the first place. It’s no fun being a troll if you can’t get through to your victims.
Automated moderation can also do fact checking. The technology that won Jeopardy a decade ago is more than capable of checking basic facts. It might not be capable of checking complex logic, but most “fake news” centers around facts that can easily be evaluated. And automated systems are very capable of detecting bots: Google’s Gmail has successfully throttled spam.
What else can we learn from USENET? Trolls were everywhere, but the really obnoxious stuff stayed where it was supposed to be. I’m not naive enough to think that neo-Nazis and white supremacists will dry up and go away, on Facebook or elsewhere. And I’m even content to allow them to have their own Facebook pages: Facebook can let these people talk to each other all they want, because they’re going to do that anyway, whatever tools you put in place. The problem we have now is that Facebook’s engagement metric paves the road to their door. Once you give someone a hit of something titillating, they’ll come back for more. And the next hit has to be stronger. That’s how you keep people engaged, and that’s (as Tufekci has argued about YouTube) how you radicalize them.
USENET had no engagement metrics, no means of linking users to stronger content. Islands of hatred certainly existed. But in a network that didn’t optimize for engagement, hate groups didn’t spread. Neo-Nazis and their like were certainly there, but you had to search them out, you weren’t pushed to them. The platform didn’t lead you there, trying to maximize your “engagement.” I can’t claim that was some sort of brilliant design on USENET’s part; it just wasn’t something anyone thought about at the time. And as a free service, there was a need to maximize profit. Facebook’s obsession with engagement is ultimately more dangerous than their sloppy handling of personal data. “Engagement” allows—indeed, encourages—hate groups to metastasize.
Engagement metrics harm free speech, another ideal carried to the modern internet from the USENET world. But in an “attention economy,” where the limiting factor is attention, not speech, we have to rethink what those values mean. I’ve said that USENET ended in a “whimper”—but what drained the energy away? The participants who contributed real value just got tired of wading through the spam and fighting off the trolls. They went elsewhere. USENET’s history gives us a warning: good speech was crowded off the stage by bad speech.
Speech that exists to crowd out other speech isn’t the unfettered interchange of ideas. Free speech doesn’t mean the right to a platform. Indeed, the U.S. Constitution already makes that distinction: “freedom of the press” is about platforms, and you don’t get freedom of the press unless you have a press. Again, Zeynep Tufekci has it right: in “It’s the (Democracy-Poisoning) Golden Age of Free Speech,” she writes, “The most effective forms of censorship today involve meddling with trust and attention, not muzzling speech itself.” Censorship isn’t about arresting dissidents; it’s about generating so much noise that voices you don’t like can’t be heard.
If we’re to put an end to the farce, we need to understand what it means to enable speech, rather than to drown it out. Abandoning “engagement” is part of the solution. We will be better served by a network that, like USENET, doesn’t care how people engage, and that allows them to make their own connections. Automated moderation can be a tool that makes room for speech, particularly if we can take advantage of communication patterns to moderate those whose primary goal is to be the loudest voice.
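What moderating on communication patterns rather than content might look like can be sketched as a triage rule. Everything here, the features and the thresholds alike, is a hypothetical illustration, not a description of any deployed system:

```python
# Hypothetical sketch of pattern-based moderation triage: score users on
# HOW they communicate, not on what they say, and queue outliers for a
# human moderator. Features and thresholds are illustrative guesses.
from dataclasses import dataclass

@dataclass
class UserActivity:
    posts: int             # messages in the observation window
    threads_started: int   # new threads the user opened
    replies_received: int  # responses from other users

def needs_review(u: UserActivity, window_days: int = 7) -> bool:
    rate = u.posts / window_days
    # Many posts, mostly self-started, drawing little response: the
    # signature of someone shouting rather than conversing.
    monologue = u.threads_started / u.posts if u.posts else 0.0
    engagement = u.replies_received / u.posts if u.posts else 0.0
    return rate > 50 or (monologue > 0.8 and engagement < 0.1)

print(needs_review(UserActivity(posts=400, threads_started=380,
                                replies_received=5)))  # True: flagged
```

A rule this crude would misfire constantly on its own, which is exactly why the final word belongs to a human moderator; the automation’s job is only to shrink seven billion posts to a reviewable queue.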
Marx certainly would have laid blame at the feet of Zuckerberg, for naively and profitably commoditizing the social identities of his users. But blame is not a solution. As convenient a punching bag as Zuckerberg is, we have to recognize that Facebook’s problems extend to the entire social world. That includes Twitter and YouTube, many other social networks past and present, and many networks that are neither online nor social. Expecting Zuck to “fix Facebook” may be the best way to guarantee that the farce plays on.
History is only deterministic in hindsight, and it doesn’t have to end in farce (or worse). We all build our social networks, and Mark Zuckerberg isn’t the only player on history’s stage. We need to revisit, reassess, and learn from all of our past social networks. Demanding and building a social network that serves us and enables free speech, rather than serving a business metric that amplifies noise, is the way to end the farce.
Is that a revolution? We have nothing to lose but our chains.
From “From USENET to Facebook: The second time as farce,” All - O’Reilly Media: https://ift.tt/2vrOB5g