Regulating Big Tech: It's a human rights thing
- Mary Ariyo
- Mar 19
- 5 min read
In 1890 Senator John Sherman declared that “if we will not endure a king as a political power, we
should not endure a king over production, transportation, and the necessities of life. If we would not
submit to an emperor, we should not submit to an autocrat of trade, with the power to prevent
competition and to fix the price of any commodity”. At the time, this speech was a thinly veiled
attack on the power held by industry titans such as John D. Rockefeller’s Standard Oil, a company
which controlled most of the oil production, processing, marketing and distribution in the US. The
extent of Standard Oil’s power essentially meant that the company ruled over the US government
and all its citizens as though they were its subjects. It was a monopoly that could not be avoided.
Some might suggest that Standard Oil’s influence and power is not too dissimilar to that wielded by
Big Tech giants (like Google, Amazon and Facebook) today.
Following on from Senator Sherman's speech, the Sherman Act was introduced; it sought to curtail
the power of companies like Standard Oil and ensure that consumers would not be exploited. 130
years later, the rules set out by Senator Sherman - though effective over time (they led to the break-up
of Standard Oil) - fail to fully anticipate the majesty of today's industrial monarchs (FAANG), who
trade in commodities of a new kind: human experience. We (yes, you and I) are the new
commodities, sold as data points and as market intelligence to companies across the world as tools
to grow sales, engagement and more. Some argue that this change in product requires us to revisit
the rules, because the legalities of market structures are no longer clear cut. They are becoming
human rights issues. With each of us volunteering (through our subscriptions, cookies and accounts)
to be commodities, the litmus test gauging whether the practices of company X can hurt the
consumer has become redundant. We would never judge the behaviour of a drug dealer by whether
their customers are offered a fair price for a drug that is killing them, so why do we judge the
appropriateness of big tech’s decisions to sell and manipulate human experiences as data points by
the extent to which they increase efficiencies and improve the user experience?
Going a little deeper
The predictive products created by big tech turn our lives into data points, enabling businesses to
anticipate what we are likely to do in the future and equipping them with the ability to potentially
manipulate our decision making. In the words of Shoshana Zuboff, these big tech firms are now
‘trading in human futures’. This raises the question of whether these products in some ways deny
consumers their right to self-determination: the right of each individual to make their own decisions
without external influence. There are those who may believe consumers (humans) have never truly
had the right to self-determination to begin with, whilst others may see these new products as an
infringement on the personal liberties of consumers. If you hold either of these views, you might look at big tech firms and see them playing God, or perhaps a usurper. Both roles are incredibly problematic. Article 12 of the Universal Declaration of Human Rights states that “no one shall be subjected to arbitrary interference with his privacy, family, home or correspondence, nor to attacks upon his honour and reputation”. The very nature of the products sold by the likes of Facebook and co violates this right.
Despite this, some still discuss the products born from our data as though they are ‘helpmeets’,
gratifying our needs before we fully realise them for ourselves, suggesting there is a ‘yin and yang’
element: a symbiosis of thought between us and the algorithms. As idyllic as this might seem, there
is, in my opinion, something not quite right. As autonomous beings, we should (at our own
discretion) dictate the path we choose and the products we buy, at the pace that we choose.
Technological convenience is today gifted to us as a Trojan horse, cloaking conversion tools,
product suggestions and much more. As big tech becomes increasingly sophisticated in monetising
our data and influencing us - and as we become increasingly aware of this through documentaries,
podcasts and the like - global regulators ought to introduce regulations that are appropriate for the times.
How could we fix it?
In her paper ‘Amazon’s Antitrust Paradox’, Lina Khan, the newly appointed Chair of the FTC, suggests that
“the consumer welfare approach (to policing big tech) is unduly narrow… pegging anticompetitive
harm to high prices and/or lower output - while disregarding the market structure (its influence)
and competitive process that give rise to market power”, which leads to unchecked exploitation of
consumers. Instead of regulators asking whether the actions of big tech companies are ‘unfair to
consumers’, the new question should be: ‘are the actions of big tech companies disproportionately affecting the decision-making abilities of consumers by limiting their choices?’ In order to successfully consider this question, I think these two things ought always to be considered.
(1) Does the company wield monopoly like power?
Monopoly power is often identified by market share and the ability of the business to charge a
premium price for its service or product. Given that the products sold and the price-setting power
held by big tech are derived from its users and the level of engagement they have with the business,
it is extremely important to pay attention to market share. According to Senator Elizabeth Warren,
Facebook “controls 85% of social network traffic” and can “bulldoze competition, and undermine
democracy”; such influence indicates that the business’ monopoly-like power is disproportionately
affecting the lives of US consumers.
(2) In the face of existing regulation, are there loopholes that this company can easily exploit to the detriment of consumers’ ‘free will’?
To date, the EU has introduced the GDPR (May 2018) - giving consumers control over their
personal data - and Apple (in its iOS 14.5 update) has given consumers the ability to stop big tech
and other online companies from tracking them across the internet. Though great efforts, I think
they fail to entirely protect consumers. Apple’s ‘permission slip’ targets the ability to trace
consumers between different websites, however not much is done to address the tracking done
within websites/apps. In the case of Facebook Inc, the business remains able to track your interests
freely while you scroll through its websites. With the broadening e-commerce and Pinterest-like
focus of Instagram, the ability of Facebook Inc to unduly influence consumers remains strong.
During a hearing, Mark Zuckerberg was asked to explain the extent to which the business’
‘conversion path’ product affects consumer decisions; he shuffled around the question. On the
website it is illustrated as such: post (x1) - influencer z story (x2) - explore page (x5) - direct. The
new version of this limited ‘conversion path’, captured within the Facebook bubble, is likely to be just as effective.
Unregulated, Facebook will likely package the ability to more heavily influence consumers within its
ecosystem. Any regulation introduced would need to impose some sort of rule that attaches limits to
this type of influencing, e.g. said organisation cannot include more than two non-organic markers on
the conversion path.
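To make the proposed rule concrete, here is a minimal sketch of how such a limit might be checked. The representation of a conversion path as labelled touchpoints, the touchpoint names, and the threshold of two non-organic markers are all illustrative assumptions, not an existing regulatory standard.

```python
# Hypothetical sketch of the proposed rule: a conversion path may not
# contain more than two non-organic (paid or algorithmically pushed)
# touchpoints. The path model and labels below are assumptions for
# illustration only.

MAX_NON_ORGANIC = 2

def path_complies(path):
    """Return True if the conversion path stays within the proposed limit."""
    non_organic = sum(1 for _touchpoint, organic in path if not organic)
    return non_organic <= MAX_NON_ORGANIC

# Example path modelled loosely on the illustration above
# (post -> influencer story -> explore page -> direct):
example_path = [
    ("post", True),               # organic: a friend's post
    ("influencer_story", False),  # non-organic: promoted story
    ("explore_page", False),      # non-organic: algorithmic suggestion
    ("direct", True),             # organic: user navigates directly
]

print(path_complies(example_path))  # two non-organic markers -> compliant
```

A regulator would of course need a far richer model of what counts as ‘non-organic’, but the point is that a bright-line limit of this kind is auditable in a way that vague ‘consumer welfare’ standards are not.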
Once these things are considered, regulators become better placed to, at the very least, begin to judge whether, by today’s standards, big tech violates our freedoms and antitrust principles.
Sources
Amazon’s Antitrust Paradox
Why Frankenstein is still relevant 200 years after its publishing
Why Frankenstein matters
Why do we care so much about privacy?
Facebook built a Frankenstein monster. When will it admit it?
Big Tech compliance tracker
Facebook, Big Data and the Ethics of Behavioural Economics
The Curse of Bigness: Antitrust in a New Age.