
Eco Homework discussion

Facebook faces the tragedy of the commons
The openness of social networks enables creativity but invites exploitation
John Gapper, Financial Times Opinion, November 29 2017


It is hard to keep up with the stream of scandals, big and small, involving social
networks such as Facebook and Twitter. From unwittingly aiding Russian efforts to
subvert elections to finding themselves exploited by extremists and pornographers,
they are constantly in trouble.

The latest is YouTube failing to stop videos of children being commented on by
paedophiles, while letting advertisements appear alongside them. Only months after
Alphabet’s video platform faced an advertiser boycott over extremist videos and had
to apologise humbly, companies such as Diageo and Mars are again removing ads.


Each scandal produces fresh calls for networks to be treated like publishers of news, who are responsible for everything that appears under their names. Each one forces them to tighten their “community standards” further and to hire more content checkers. By next year, Facebook intends to employ 20,000 people in “community operations”, its censorship division.

Tempting as it is for publications that have lost much of their digital advertising to internet giants to believe the two should be treated as exact equivalents, the comparison is flawed: Facebook is not just a newspaper with 2.1bn readers. But being a platform does not absolve them of responsibility. The opposite, in fact — it makes their burden heavier.

A better way to think of Russian political ads, extremist videos, fake news and all the rest is as pollution of common resources, albeit ones that are privately owned. The term for this is the tragedy of the commons: ecosystems that are open and shared by entire communities tend to get despoiled.

Garrett Hardin, the US ecologist and philosopher who coined the phrase in 1968, warned that “the inherent logic of the commons remorselessly generates tragedy”, adding gloomily that “Ruin is the destination toward which all men rush, each pursuing his own best interest in a society that believes in the freedom of the commons.”

His prime example was the overgrazing of common land, when the number of farmers and shepherds seeking to use the resource of free feed for animals becomes too high. He also cited companies polluting the environment with sewage, chemical and other waste rather than cleaning up their own mess. Rational self-interest led to the commons becoming barren or dirty.
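To make Hardin’s arithmetic concrete, here is a minimal numerical sketch (the numbers and the simple pasture model are illustrative assumptions, not anything from the column): each herder captures the whole gain from grazing one more animal while the damage to the shared pasture is spread across everyone, so self-interested herding overshoots the level that would maximise the value of the commons.

```python
# Illustrative sketch of Hardin's overgrazing logic, with made-up numbers.
# The pasture yields (CAPACITY - total_animals) units of value per animal,
# so each extra animal makes every animal on the commons a little less valuable.

CAPACITY = 100
HERDERS = 10


def herder_payoff(own: float, total: float) -> float:
    """Value one herder gets from grazing `own` animals when `total` animals graze."""
    return own * (CAPACITY - total)


# Cooperative optimum: pick the total herd size that maximises combined value.
best_total = max(range(CAPACITY + 1), key=lambda t: t * (CAPACITY - t))
print("cooperative optimum:", best_total, "animals, total value",
      best_total * (CAPACITY - best_total))

# Self-interested play: each herder repeatedly chooses the herd size that
# maximises its own payoff, taking everyone else's animals as given.
herds = [0.0] * HERDERS
for _ in range(200):  # iterate best responses until they settle
    for i in range(HERDERS):
        others = sum(herds) - herds[i]
        herds[i] = max(0.0, (CAPACITY - others) / 2)  # own best response

total = sum(herds)
print("self-interested outcome:", round(total, 1), "animals, total value",
      round(sum(herder_payoff(h, total) for h in herds), 1))
```

Under these assumed numbers the shared pasture is worth roughly three times as much when the herders show restraint, yet no single herder gains by holding back; that is the “remorseless” logic Hardin describes.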

Here lies the threat to social networks. They set themselves up as commons, offering
open access to hundreds of millions to publish “user-generated content” and share
photos with others. That in turn produced a network effect: people needed to use
Facebook or others to communicate.


But they attract bad actors as well — people and organisations who exploit free
resources for money or perverted motives. These are polluters of the digital
commons and with them come over-grazers: people guilty of lesser sins such as
shouting loudly to gain attention or attacking others.

As Hardin noted, this is inevitable. The digital commons fosters great communal benefits that go beyond anything a publisher in the traditional sense can offer. The fact that YouTube is open and free allows all kinds of creativity to flourish in ways that the entertainment industry does not enable. The tragedy is that it also empowers pornographers and propagandists for terror.

So when Mark Zuckerberg, Facebook’s founder, denounced Russia’s fake news
factory — “What they did is wrong and we’re not going to stand for it” — he sounded
like the police chief in Casablanca who professes to be shocked that gambling is
going on in a casino. Mr Zuckerberg’s mission of “bringing us all together as a global
community” is laudable but it invites trouble.

Hardin was a pessimist about commons, arguing that there was no technical solution
and that the only remedy was “mutual coercion, mutually agreed upon by the
majority”. The equivalent for Facebook, Twitter and YouTube would be to become
much more like publishers, imposing tight rules about entry and behaviour rather
than their current openness.

They resist this partly because it would bring stricter legal liability and partly because
they want to remain as commons. But every time a scandal occurs, they have to
reinforce their editorial defences and come closer to the kind of content monitoring
that would change their nature.

It would cross the dividing line if they reviewed everything before allowing it to be
published, rather than removing offensive material when alerted. Defying Hardin,
they aspire to a technical solution: using artificial intelligence to identify copyright
infringements and worse before their users or other organisations flag them for
review.

More than 75 per cent of extremist videos taken down by YouTube are identified by algorithms, while Facebook now automatically finds 99 per cent of the Isis and al-Qaeda material it removes. It is like having an automated fence around a territory to sort exploiters from legitimate entrants.
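As a rough illustration of what that kind of automated screening might look like, here is a minimal sketch in Python. The fingerprint set, the classifier and the threshold are all hypothetical stand-ins; this is not a description of YouTube’s or Facebook’s actual systems, only of the general shape of an “automated fence” that checks an upload before anyone has to flag it.

```python
import hashlib

# Hypothetical fingerprints of material that has already been removed once.
KNOWN_BAD_FINGERPRINTS = {
    hashlib.sha256(b"previously-removed-extremist-clip").hexdigest(),
}

REVIEW_THRESHOLD = 0.8  # assumed cut-off for routing an upload to human review


def policy_risk_score(video_bytes: bytes) -> float:
    """Stand-in for a trained classifier scoring how likely an upload is to
    violate policy; a real system would run a machine-learning model here."""
    return 0.0  # toy placeholder: treat unrecognised content as low risk


def screen_upload(video_bytes: bytes) -> str:
    """Decide what happens to an upload before any user flags it."""
    fingerprint = hashlib.sha256(video_bytes).hexdigest()
    if fingerprint in KNOWN_BAD_FINGERPRINTS:
        return "block"                   # exact re-upload of removed material
    if policy_risk_score(video_bytes) >= REVIEW_THRESHOLD:
        return "hold for human review"   # the automated fence flags it first
    return "publish"                     # default: the commons stays open


print(screen_upload(b"previously-removed-extremist-clip"))  # -> block
print(screen_upload(b"a holiday video"))                    # -> publish
```

The point of the sketch is the default branch: most uploads are still published without prior review, which is why this kind of automation narrows, but does not cross, the dividing line the column describes.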



Machines cannot solve everything, though. If they could exclude all miscreants, the
commons would turn into something else. The vision of an unfettered community is
alluring but utopias are always vulnerable.

john.gapper@ft.com

Letter in response to this column:

Uber exemplifies the Tragedy of the Commons / From Tad Borek, San Francisco,
CA, US

