In Sheera Frenkel and Cecilia Kang's "An Ugly Truth," the authors recount Facebook's long history of insider threats, in which employees (mostly men) used the company's tools to stalk people (mostly women).
The stalking targets included both strangers and intimate partners - for example, an engineer used FB's tools to locate his partner after she fled their shared vacation hotel room in order to "confront her."
Another FB engineer stalked a woman who didn't return his messages after a date, accessing years of private messages and photos, including photos that his target believed she had permanently deleted, but which Facebook had secretly retained.
All told, Facebook fired 52 employees for data abuses between Jan 2014 and Aug 2015, after a policy change eliminated many access safeguards in the name of eliminating "the red tape that slowed down engineers."
In other words, Facebook was in a situation in which its users' interests were at odds with its shareholders'. By eliminating protections for its users, it allowed its engineers to work more efficiently, and increased its profits.
These kinds of conflicts - between shareholder and stakeholder interests - are the norm in business. Think of a busy retailer that cuts its cashiers: reducing payroll costs increases profits, at the expense of worker stress and longer waits for customers.
The question of how much value can be shifted from employees and customers to shareholders isn't an economic one - it's a *policy* question.
If we have strong labor laws - protecting cashiers from undue stress, extending unemployment benefits to workers who quit bad jobs, protecting workers from non-compete clauses, separating health-care from employment - then a business that screws its cashiers will lose them.
Or if the business holds a monopoly - a patent, a trademark or some other exclusive right that makes it the only game in town (say, the sole right to sell snacks in an airport) - it can shift more value from customers to shareholders before the customers walk away.
Facebook - and other tech monopolists - have engineered a world where they get to side with shareholders over users, again and again, to the users' great detriment, without losing those users.
Economic analysis of tech monopolies focuses on "network effects" - the way more users make Facebook more valuable (you join FB because your friends are there, more friends sign up because *you're* there).
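The network-effects story is often sketched with Metcalfe's law - a rough rule of thumb (my assumption, not something from the text) that a network's value grows with the number of possible connections between users, so each new user makes the service more valuable to everyone already on it:

```python
# Rough sketch of network effects via Metcalfe's law (value ~ n^2).
# Purely illustrative - not a model Facebook actually uses.

def network_value(users: int) -> int:
    """Number of possible pairwise connections: n * (n - 1) / 2."""
    return users * (users - 1) // 2

# Each new user adds more value than the last one did:
print(network_value(10))    # 45 possible connections
print(network_value(100))   # 4950
print(network_value(1000))  # 499500
```

The super-linear growth is why the biggest network looks unbeatable on paper - which is exactly the intuition the next paragraph pushes back on.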
Taken on their own, network effects are cause for despair, predicting that tech will produce "natural monopolies" - an inevitable winner-take-all market. But that's obviously not true - I'm not typing this on a Cray or using Altavista to look up facts while I do.
Far more important than network effects for antimonopoly analysis are *switching costs* - the things you give up when you quit a service. In FB's case, quitting means leaving behind your friends, communities and customers.
Now, this needn't be the case. You can switch phone companies or email providers without shattering your social connections. FB has engineered a high switching cost, blocking other services from connecting to it.
After all, the more you stand to lose by leaving FB, the worse FB can treat you before you're willing to leave. Zuck didn't abolish the safeguards that protected us from rogue FB employees because he's nosy - he did it because it's profitable.
He was betting (probably correctly) that no matter how unhappy the ensuing scandals made his users, too few of them would be unhappy enough to quit for the losses to outweigh the gains from exposing us to predatory Facebookers.
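The logic of the last two paragraphs can be put as a toy model (every number here is made up for illustration): a user quits only when their unhappiness with the service exceeds what they'd lose by leaving, so raising switching costs raises how badly a service can treat its users before losing them.

```python
# Toy model of switching costs (illustrative only; all values are invented).
# A user quits when their unhappiness outweighs what they'd lose by leaving.

def user_quits(unhappiness: float, switching_cost: float) -> bool:
    return unhappiness > switching_cost

# The same level of mistreatment, with different switching costs:
print(user_quits(unhappiness=5.0, switching_cost=1.0))
# True - an email provider you can leave without losing your contacts

print(user_quits(unhappiness=5.0, switching_cost=10.0))
# False - friends, communities and customers are held hostage
```

The policy upshot: you don't have to make users happier to keep them - you can also just make leaving more expensive.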
Which is why proposals like the ACCESS Act, currently working its way through Congress, are such a big deal. It's a bill that would force FB to let third parties plug into it, so you could leave FB but stay in touch with the people who stay behind.
@pluralistic A proper Facebook/Fediverse or Twitter/Fediverse bridge would be fantastic.
Honestly, Facebook and Twitter could just figure out how to implement ActivityPub. That would be great.
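For context on what "implement ActivityPub" would mean: the W3C protocol that Mastodon and the rest of the Fediverse speak represents each account as a JSON "actor" document that other servers can fetch and deliver messages to. A minimal sketch of one (the social.example.com URLs are placeholders, not a real server):

```python
import json

# Minimal ActivityPub-style actor document (sketch; URLs are placeholders).
# Per the W3C ActivityPub spec, every actor must expose an inbox (where
# other servers deliver activities) and an outbox (where it publishes them).
actor = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "type": "Person",
    "id": "https://social.example.com/users/alice",
    "preferredUsername": "alice",
    "inbox": "https://social.example.com/users/alice/inbox",
    "outbox": "https://social.example.com/users/alice/outbox",
}

print(json.dumps(actor, indent=2))
```

A Facebook or Twitter bridge would publish documents like this for each bridged account, letting any Fediverse server follow them without going through the platform's own apps.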
@danjones000 @pluralistic Before implementing something, tech people need to do a better job of understanding & estimating how those with other motives (e.g. profit) will twist it into something the originators never intended or wanted. I speak from experience. Just because you have good intentions doesn't mean others do. Don't underestimate the avarice and insensitivity of people who are nothing like you. Cost/benefit analysis needs to include Big Money, Big Govt, bad actors.
I think we're agreeing. Who knows how many instances fb would have? Who knows how well the instances would be moderated? Who knows if they'd sell advertising on them? What social nets would they bridge, in which directions? How much money would they throw at people to motivate them to do things to bring them even greater profits? FB has done a lot of research & grabbed a lot of people's data. They know how brains work, how cultures work, better than fedi volunteers, I think.