Five Minuter

Link Wars: Facebook vs. Australia

In Australia, Facebook is once again hamstrung by its business model

Last month, the Australian government made headlines with a new law forcing Big Tech platforms, namely Google and Facebook, to pay publishers for news content. The move was ostensibly meant to provide a new revenue stream supporting journalism, but the legislation is also the latest development in a succession of moves by influential News Corp CEO Rupert Murdoch to strike back at the online platforms sapping his publications’ advertising revenue.

While Google brokered a deal with News Corp and other major Australian publishers, Facebook instead used a machine learning tool to delete all “news” links from Australian Facebook. The wave of takedowns also caught non-news pages: public health pages providing key coronavirus information, trade unions, and a number of other entities that share, but do not produce, news content. Facebook has since backtracked, and Australian news content has been allowed back on the platform.


The fiasco illustrates a broader issue facing both the company and Big Tech in general: the spectre of regulatory action. That spectre explains the influx of politically influential figures entering Facebook’s employ, like former UK Deputy Prime Minister Nick Clegg, now Facebook’s Vice-President of Global Affairs and Communications.

Facebook’s chronic problem isn’t the aggressive methods of its public affairs team, nor its CEO waxing poetic about free speech principles only to reverse course later. Facebook is hamstrung by its own business model, which incentivizes it to prioritize user engagement above all else.

The Australia case is reminiscent of another moment in the struggle between Big Tech and governments. In 2015, two attackers killed 14 people at a holiday party in San Bernardino, CA, with what were later understood to be jihadist motives. After the attack, Apple CEO Tim Cook seized the moment to solidify Apple’s brand image around privacy, publicly refusing the federal government’s requests to create a backdoor in iOS.

This principled stand was backed up by Apple’s business model, which involves selling hardware and software as a luxury brand, not selling data or behavioral insights. Cook’s move was both ethically defensible and strategically sound: he protected both users’ privacy and his brand’s image.


In the Australian case, different actors are involved. Google, like Facebook, relies on mining data and behavioral insights to generate advertising revenue. When it comes to news, however, Facebook and Google have different incentives around quality. On the podcast “Pivot”, NYU’s Scott Galloway pointed out that Google has a quality incentive when it comes to news: users trust Google to feed them quality results, so Google would naturally be willing to pay for access to professional journalists’ content.

More people use Google than any other search engine because they trust it to lead them not just to engaging information, but to correct information. Google therefore has a vested commercial interest in its algorithms delivering the highest quality response to users’ search queries. Like Apple in 2015, Google can both take an ethical stand — compensating journalists for their work — while also playing to the incentives of its business model.

On the other hand, Facebook’s business model is based on engagement. It doesn’t need you to trust the feed; it needs you to be addicted to the feed. The News Feed is most effective at attracting and holding attention when it delivers a dopamine hit, not when it surfaces higher-quality results. To Facebook, fake news and real news are largely indistinguishable fodder for keeping people on the platform.

In short, from Facebook’s perspective, it doesn’t matter if the site sends you a detailed article from the Wall Street Journal or a complete fabrication from a Macedonian fake news site. What matters is that the user stays on, interacting with content as much as possible to feed the ad targeting algorithms.

The immediate situation in Australia has been resolved, with USD 1bn pledged over three years to support publishers. But the fundamental weakness of Facebook’s reputation is becoming obvious. Regulators are clearly eager to take shots at the company in the wake of the Cambridge Analytica scandal, debates over political advertising, and the prominent role the site played in spreading conspiracies and coronavirus disinformation.

In short, shutting down news was a bad look. Zuckerberg may have been in the right on substance — free hyperlinking is a crucial component of an open internet. But considering the company has already attracted the ire of regulators around the world, this was likely not the ideal time to take such a stand.

In any case, Australia’s efforts, whether laudable or influenced by News Corp’s entrenched power, are largely for naught. As many observers have pointed out, the long-term problem facing journalism is the advertising duopoly of Google and Facebook. And the only way out of that problem is robust anti-trust action. Big Tech services may be used around the world, but only two legislatures have any direct regulatory power over the largest of these companies: the California State Assembly in Sacramento, and the United States Congress. Though the impact of these technologies is global, the regulatory solutions to tech issues will likely have to be American, as long as US-based companies continue to dominate the industry.


Satanic Panic 2: Facebook Boogaloo

[Image: McMartin Preschool in the early 1980s // Investigation Discovery]
[Image: 2004 study shows crime reporting dominates local TV coverage // Pew Research]
[Image: Not your typical fringe conspiracy aesthetic]
[Image: Notice the QAnon hashtags #greatawakening, #maga, and #painiscoming]

The Forsythia-Industrial Complex

In Steven Soderbergh’s newly rediscovered 2011 film Contagion, a hypothetical novel virus called MEV-1 causes a global pandemic, which has to be stopped by the film’s protagonists, epidemiologists working for the Centers for Disease Control.

While the film contains all sorts of exposition sequences that are legitimately educational about the nature and spread of airborne viruses, the real nugget of gold in the film is in its main subplot. Jude Law plays Alan Krumwiede, an Alex Jones-type conspiracy entrepreneur. He spends his days chasing down sensational stories and posting ranting videos on his website, “Truth Serum”.

Despite the contemporary irrelevance of the “blogosphere”, the Truth Serum subplot is as pertinent today as it was in 2011. In many ways, its implications are more frightening now than ever. In 2011, algorithm-driven social media sites did not have the same dominance over the information environment that they enjoy today.

Studies indicate that media consumption patterns have changed rapidly over the last decade. While internet-based news consumption was widespread by 2011, there were two key differences from the information environment of today. First, online news consumption skewed young; today, American adults of all ages consume much if not most of their news online. Second, news items spread via a number of media, including email chains and blogs. The dynamics of email and blogs are fundamentally different from those of algorithm-driven platforms like Facebook, Twitter, and YouTube. Blog followings and email chains are linear — a person recommends a blog or forwards an email to certain people at their own discretion. Information from these media therefore spreads less virally than content on algorithmic sites. In fact, the entire concept of “virality” essentially cannot exist without the engagement-based recommendation and news feed algorithms behind our now-dominant social media machines.
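The contrast can be sketched as a toy branching process. Everything here is my own illustration, not data from any study: the `expected_reach` function and its parameters are assumptions chosen to show the threshold effect, where discretionary forwarding keeps the effective branching factor below 1 and the chain fizzles, while an algorithmic boost pushes it above 1 and produces exponential takeoff.

```python
def expected_reach(base_forward_rate: float, amplification: float,
                   contacts: int, rounds: int) -> float:
    """Expected cumulative audience after `rounds` sharing generations.

    Each recipient shares with `contacts` people at `base_forward_rate`;
    the feed multiplies each share's exposure by `amplification`
    (1.0 = no algorithmic boost, i.e. email/blog-style sharing).
    """
    # Effective branching factor: new readers created per current reader.
    r = base_forward_rate * contacts * amplification
    total, frontier = 1.0, 1.0
    for _ in range(rounds):
        frontier *= r   # readers added this generation
        total += frontier
    return total

# With a 5% forward rate to 10 contacts, email-style sharing (r = 0.5)
# dies out, while a hypothetical 4x algorithmic boost (r = 2.0) takes off.
print(expected_reach(0.05, 1.0, 10, 10))  # ≈ 1.999: the chain fizzles
print(expected_reach(0.05, 4.0, 10, 10))  # = 2047.0: exponential takeoff
```

The same structure underlies epidemic models, which is fitting given the film: below the reproduction threshold an idea stays contained, above it the audience compounds every generation.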

In the second and third acts of the film (*spoilers*, sorry), Krumwiede begins pushing a homeopathic cure, Forsythia, on his website and eventually on TV. He posts a series of videos on Truth Serum in which he has apparently caught the MEV-1 virus, and nurses himself back to health using the substance. His endorsement of the drug to his loyal followers causes desperate Americans to loot pharmacies in search of Forsythia. Krumwiede is eventually arrested, and after an antibody test reveals he never actually had the virus, he is charged with fraud. By this time, he’s moved from Forsythia to anti-vaccination, claiming the CDC and WHO are “in bed” with big pharma, and that’s why they’re trying to vaccinate the entire population. He ends the film urging his millions of loyal online followers not to take the crucial MEV-1 vaccine being produced by American and French pharmaceutical companies.

[Image: Forsythia today]

While the Forsythia subplot serves as a slight diversion from our main protagonists’ investigations and research, it has turned out to be the part of the movie that holds up best in light of our current novel coronavirus pandemic, most obviously in its resemblance to the hydroxychloroquine craze in the US and France. More worryingly, it has become clear that our contemporary information environment is more vulnerable to the Krumwiedes of the world than it would have been in 2011.

American author and critic Kurt Andersen’s 2017 book Fantasyland argues that the viral, platform-based contemporary Internet is the crucial focal point of our conspiracy-laden politics. Common criticisms of modern social and online media focus on its tendency to create self-reinforcing echo chambers where individuals are only exposed to information that bolsters their existing views. Andersen takes this idea further and turns it on its head, arguing that social media causes the cross-pollination of information silos that would otherwise have remained separate.

This phenomenon is evident in the fact that believing in one conspiracy theory sharply increases the likelihood you’ll believe in others. In the days before mass access to the internet, an individual with an easily debunked belief would have been relatively isolated; now, they can connect with other like-minded individuals across the globe. Anti-vaxxers can virtually intermingle with chemtrails theorists, 9/11 truthers, and anti-semites. Without this cross-pollination, expansive crowd-sourced conspiracy narratives like QAnon would simply not be possible.

In Contagion’s 2011-based universe, Krumwiede is a lone crusader, harassing reporters and officials, pushing his homeopathic scams, and broadcasting to millions from a webcam as a one-man information army. In 2020, there is a whole parallel information ecosystem across several internet platforms where conspiracy theorists, activists, influence bots, grifters, and extremists can exchange and reinforce each others’ beliefs. Today, Krumwiede would be one of thousands of viral content creators “flooding the zone” with conspiracies, untruths, partial truths, and unverified and misleading claims.

Imagining better media bubbles

Could the Internet become a less toxic place? Maybe, but it’s a difficult problem. In a November 2019 interview with Vox, tech entrepreneur Anil Dash reminisces about “the Internet we lost” to the rise of the social media giants. He points out the weaknesses in “free speech” arguments made by Zuckerberg and other tech moguls, arguing that free discourse can exist without virality and engagement metrics. He says that a “trust network” model resembling the blogosphere of the 90s and 00s is perhaps more conducive to civil discourse than our platform-centric information environment today. Bloggers, like TV anchors or op-ed columnists, have to slowly gain the trust of their audience over time. Without virality’s constant interruptions, a stronger bond forms between content producer and content consumer, leaving less room for loud interlopers with wild claims to wedge their way into the discourse.

The issue with this concept is that, in a trust network model, trusted information sources are difficult to dislodge once their network has been established. A new video “owning” the old champion relies on virality to dethrone the incumbent, and requires a news feed or recommendations algorithm to find its way into the incumbent’s audience’s media diet. While the kind of decentralized trust network Dash and other ex-bloggers are nostalgic for would perhaps address the problems of influence bots, viral information blitzes, and other issues caused by engagement-based algorithmic media, it could also exacerbate the silo-ization of our information environment by entrenching a certain set of existing sources.


Turf Wars: The Birth of the COVID-19 Protests

Over the weekends of the 17th and 24th of April, thousands of Americans showed up at intersections and state houses across the country to protest against social distancing rules, the closure of businesses, and other measures taken by mayors and governors to combat the Covid-19 pandemic. Depending on the location, protestors ranged from the pedestrian to the extreme and bizarre. Some groups were calm, carrying signs calling on governors to reopen businesses. Other groups were toting semi-automatic rifles, combat gear, and QAnon paraphernalia.

Users on reddit, in particular /u/Dr_Midnight, noticed a strange pattern in certain sites purporting to support the anti-quarantine protests. Dozens of sites with the URL format reopen[state code/name].com had all been registered on 17 April within minutes of each other, many from a single round of GoDaddy domain purchases from the same IP address in Florida. The original Reddit posts were removed by moderators because they revealed private information about the individual who had registered the domains. Here are screenshots without sensitive information, as examples:

[Image: The Pennsylvania and Minnesota sites are on the same server, registered from the same IP address]
[Image: Date and time for domain purchases // creds to Krebs On Security]

Sites urging civil unrest in Pennsylvania, Wisconsin, Ohio, Minnesota, and Iowa all had the same “contact your legislator” widget installed, and the “blog” sections of these and other states’ sites cross-linked to one another.

Many of the sites purchased on 17 April are dormant and have no content at the time of publication. However, several of these domains forward users to a string of state gun rights advocacy websites with near-identical names. The Pennsylvania, Minnesota, Michigan, and other “gun rights” sites and associated Facebook groups belong to the Dorr brothers, gun rights extremists and conservative advocates whom Republican lawmakers in the Midwest have repeatedly labeled “grifters”. Multiple sites have “shop” sections selling the Dorrs’ anti-quarantine and pro-gun-rights merchandise.
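For flavor, here is a sketch of the kind of registration-timestamp check the reddit investigation relied on. The WHOIS snippets, their dates, and the `near_simultaneous` helper are my own illustrative assumptions (not the actual records): it parses the `Creation Date` field out of raw WHOIS output and flags domain pairs registered within minutes of each other.

```python
from datetime import datetime, timedelta
from itertools import combinations

def creation_date(whois_text: str) -> datetime:
    """Extract the Creation Date field from raw WHOIS output."""
    for line in whois_text.splitlines():
        key, _, value = line.partition(":")  # split on first colon only
        if key.strip().lower() == "creation date":
            return datetime.strptime(value.strip(), "%Y-%m-%dT%H:%M:%SZ")
    raise ValueError("no Creation Date field found")

def near_simultaneous(records: dict, window_minutes: int = 10) -> list:
    """Return domain pairs whose registrations fall within the window."""
    dates = {domain: creation_date(text) for domain, text in records.items()}
    return [(a, b) for a, b in combinations(sorted(dates), 2)
            if abs(dates[a] - dates[b]) <= timedelta(minutes=window_minutes)]

# Hypothetical records modeled on the reopen[state].com pattern:
sample = {
    "reopenmn.com": "Creation Date: 2020-04-17T19:55:13Z",
    "reopenpa.com": "Creation Date: 2020-04-17T19:57:02Z",
    "example.com":  "Creation Date: 1995-08-14T04:00:00Z",
}
print(near_simultaneous(sample))  # → [('reopenmn.com', 'reopenpa.com')]
```

On its own, a shared registration window proves little; it was the combination with a shared registrant IP, shared servers, and cross-linked content that made the pattern damning.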

Several URLs lead to Facebook groups calling themselves “Operation Gridlock [city name]”. Here are the identical descriptions for the LA and Tennessee Gridlock Facebook groups:


Security researcher Brian Krebs also identified domains that had eventually been sold on to In Pursuit Of LLC, a for-profit political communications agency reported to belong to the conservative billionaire industrialist Charles Koch. The non-profit journalism site ProPublica has identified several former In Pursuit Of employees now on the Trump White House communications staff. It is unclear who registered these and other sites purchased by for-profit political consultancies, as many were not bought during the 17 April afternoon buying spree in Florida.

A further twist in the story came on 23 April, when a man named Michael Murphy, whose IP address was identified in /u/Dr_Midnight’s original removed reddit investigation, was interviewed by reporter Brianna Sacks. It turns out that Murphy, a struggling day trader from Sebastian, Florida, spent $4,000 on dozens of domains in the hopes of selling them on to liberal activists looking to prevent conservatives from organizing protests. An attempt to out-grift the grifters.

It is unclear whether Murphy’s intentions were political, financial, or both. He describes his politics as “generally liberal”; however, his business has been suffering in recent years. He even tried to pivot to selling N95 mask cleaning solution when the coronavirus outbreak worsened in March, but was unsuccessful. Murphy also claims to have attempted to contact late night TV host John Oliver, hoping the comedian would pay him for domains to use in one of his show’s signature trolling stunts. Murphy came forward to reporters after anti-right-wing reddit users began doxxing him, revealing his name, address, and businesses. Any sites not registered to Murphy’s Florida IP address were likely bought by the Dorr brothers or Koch-backed organizations before Murphy could snatch them up.

What do we make of all this?

Relatively unsophisticated technical actors have shown themselves capable of mobilizing large numbers of citizens into the streets. A few well-named URLs and a decent Facebook following are all it takes to organize a series of protests across the country on little notice. Protesting citizens were largely unaware that any central coordination of their activities existed beyond their local social media groups. However, these groups were not genuine expressions of opinion by concerned private citizens. Most were created concurrently by individuals or organizations with the explicit intent of political or financial gain through advocating activities that contradict public health rules and guidelines in a time of national crisis.

It’s important to note that these protests represent the views of a very small minority of voters, regardless of party. A poll conducted by the Democracy Fund and UCLA in late March and again in early April shows broad approval of, and compliance with, local and state social distancing guidelines and business closures. Around 87% of respondents approved of the various measures imposed by mayors and governors, and 81% said they hadn’t left their homes over the previous two weeks except to buy necessities, up from 72% in late March. Majorities of Democrats, Republicans, and Independents all believed quarantine measures to be necessary. Despite this general consensus in support of emergency measures, astroturfing operations were able to mobilize a diverse set of online activists, spectators, social media buffs, conspiracy theorists, and gun rights absolutists, even reaching all the way to disgruntled mainstream conservatives and their families.

The Internet and social media catalyze kinetic action

Centrally coordinated puppeteering of otherwise spontaneous demonstrations is not new. What is novel is the ability to do so at a national scale with almost no investment of resources of any kind — financial or otherwise. All it took was an internet connection, a few web domains, and a cursory knowledge of the online right wing universe. Once the spaces for action were created, and the right actors assembled, the demonstrations themselves were almost an inevitability. With enough prodding from conservative media and political figures, right up to the top of the movement, people took to the streets.

Twitter, and to a lesser extent Facebook, have largely shied away from preventing this method of organizing on their platforms. Although both companies ostensibly changed their terms-of-use enforcement to take down content that encourages violating state quarantine orders, Facebook has not taken down Freedom Fund or Conservative Coalition groups or individual posts, and Twitter has officially decided that the President’s “LIBERATE” tweets do not violate its rules against inciting violence. On 22 April, Facebook did take down events pages for anti-quarantine protests in California, Nebraska, and New Jersey, but only after these states’ governors explicitly ordered the company to do so. Events pages in other states, most notably Michigan, Pennsylvania, and Ohio, remained active over the weekend of 24 April. Several groups of protesters in Lansing, Michigan entered the State Capitol Building carrying semi-automatic rifles and wearing kevlar vests and other combat gear. Michigan’s governor, Gretchen Whitmer, has been a target of particularly vitriolic rhetoric from protesters over the state’s emergency orders — some of the most stringent in the country — enacted after a major outbreak in the Detroit area in late March.

Inauthentic action catalyzed by social media is legitimized by traditional media

Traditional media, most notably TV, are often playing catch-up with more savvy information actors online. An Insider Exclusive special on coronavirus on 29 April — broadcast in primetime on multiple US cable networks — contained a segment on the protests. It first showed “hundreds” of people continuing to protest in front of various state houses, then immediately contrasted these images with footage of long lines at food banks in Houston, Texas. The narration insinuates that the “exasperation” felt by the protesters somehow derives from an inability to find basic necessities. That insinuation is false. Footage of the protests has revealed the discontents to be predominantly white and older, while those requiring assistance from food banks in major cities are often younger, economically precarious people of color, a demographic notably absent from images of anti-stay-at-home protesters.

The astroturfing operation has therefore worked its way through an entire information cycle. Political donor money is used to fund fringe actors’ online efforts, purchasing websites and organizing on social media. These sites are used to generate kinetic action in the form of protests. These protests are then covered by the traditional media, broadcasting and legitimizing the initial message of the organizers, inserting their narrative into the mainstream.


Botnets for Good