Highest Rated Comments


szhang_ds · 2456 karma

In some cases, like India or the United States, areas Facebook considered important or crucial, it seemed pretty clear that political considerations had impeded action. Facebook was reluctant to act because it wanted to keep good relations with the perpetrators, and so it let the activity slide. But most of the cases were in less attention-getting areas (I'm sorry to say it, but Azerbaijan and Honduras are not countries that draw the attention of the entire world), and there was no one outside the company to hold FB's feet to the fire. As a result, the company essentially decided it wasn't worth the effort.

I think it's ultimately important to remember that Facebook is a company. Its goal is to make money, not to save the world. To the extent it cares about these problems, it's because they negatively impact the company's ability to make money (e.g. through bad press), and because FB employees are people who need to be able to sleep at night.

We don't expect tobacco companies like Philip Morris to cover the cancer treatment costs of their customers. We don't expect financial institutions like Bank of America to keep the financial system from crashing. But people have high expectations of FB, partly because it portrays itself as a nice, well-intentioned company, and partly because the existing institutions have failed to control/regulate it.

An economist would call this an externality problem: the costs aren't borne by Facebook; they're borne by society, democracy, and the civic health of the world. In other industries, the government would step in to regulate, or consumers would apply pressure through boycotts.

But there's an additional facet of the issue here that will sound obvious as soon as I explain it, but it's a crucial point: the purpose of inauthentic activity is not to be seen. And the better you are at not being seen, the fewer people will see you. So when an ordinary person goes out and looks for inauthentic activity on FB, they find people who are terrible at being fake, real people who just look really weird, or real people doing their best to pretend to be fake because they think it's funny. The incentives are ultimately misaligned here: for areas like hate speech or misinformation, press attention tracks overall harm reasonably well, but for inauthentic activity there's very little correlation between what gets FB to act (press attention) and the actual overall harm.

szhang_ds · 1525 karma

My official job role was getting rid of fake engagement. The thing to understand is that the vast majority of fake engagement is not on political activity; it consists of everyday people who think they should be more popular in their personal lives. To use an analogy people here might understand, it's someone going "When I make a reddit post, I only get 50 upvotes... but everything I see on the front page has thousands of upvotes and my post is definitely better! Why don't they recognize how great I am? I'll get fake upvotes, that will show them."

Like many organizations, my team was very focused on metrics and numbers - to a counterproductive extent, I'd personally argue. This is known in academia as the McNamara Fallacy, named for the quantitative mindset that helped lose the U.S. the Vietnam War. Numbers are important, but if you focus only on what can be measured, you necessarily ignore everything that cannot be. Facebook wanted me to focus on the vast majority of inauthentic activity by volume - the kind motivated by personal vanity - while neglecting the far greater harm associated with inauthentic political activity.

szhang_ds · 1465 karma

At the end of the day, Facebook is a private company whose responsibility is to its shareholders; its goal is to make money. It's not that different from other companies in that regard. To the extent it cares about ideology, it's because of the personal beliefs of the individuals who work there, and because ideology affects its bottom line.

I think some realistic cynicism about companies is useful as a result. If a company agrees with you on political matters, it's likely not acting out of the goodness of its heart, but rather because that's what it believes its consumers and employees want.

Ultimately, most Bay Area tech companies are somewhat internationalist, pro-human-rights, and irreligious in their ethics/politics - not just because their employees want that, but also because taking a different stand [e.g. that genocide is allowed, or that XYZ is the one true religion] would obviously alienate many of their users.

szhang_ds · 1210 karma

I was always sure that if this happened, it would be after the election. Not because my work was in the United States, but because disclosures of this sort necessarily create uncertainty and doubt in existing institutions, and can be seized upon for misinformation.

For instance, many U.S. conspiracy theorists are of the opinion that Mark Zuckerberg's donations to election offices in the lead-up to 2020 were part of an insidious plan to rig the 2020 U.S. elections. Or consider the way QAnon conspiracy theories seized upon the Myanmar coup as a sort of message that the United States should stage its own coup; despite the coup happening half a world away, they apparently believe the world revolves around this nation.

What I was most fearful of was somehow ending up as the James Comey of 2020. Thankfully that never happened.

szhang_ds · 924 karma

I'm sorry - I did not work at Reddit, and hence have no special knowledge about influence operations there. That said, if you stuck a gun to my head and made me guess, I'd expect Reddit to be similar to FB with respect to troll farms, influence operations, and the like.