It's hard to pinpoint the precise moment that Facebook turned from a community network into an ethically dubious organisation in the public eye. The Cambridge Analytica scandal has, of course, been pivotal, prompting, at the time of going to press, shares to crash by 11%, high-profile advertisers such as Mozilla and Commerzbank to withdraw support, and Mark Zuckerberg to go to ground for four days before apologising on CNN - after the hashtag #WheresZuck went viral - to tell us that he's "really sorry" and feels "really bad".
But some say the tide began to turn as recently as August 2016. That's when Zuckerberg, under intense pressure from Right-wing news organisations, such as Breitbart, fired the supposedly biased humans who curated the site's Trending News section and replaced them with algorithms. Within days, the feed was dominated by fake news items about Hillary Clinton, pushed up the rankings by bots and trolls. We all know where that led.
Or perhaps Facebook turned in 2014, when it published the results of an "emotional contagion" study it had secretly conducted on 689,000 users. Researchers wanted to find out if showing people more negative posts made them more negative. Their conclusion? Yes: "Emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness."
"That's really important for advertisers," says Jonathan Taplin, author of Move Fast and Break Things: How Facebook, Google and Amazon Cornered Culture and Undermined Democracy. "You're feeling depressed. Some cool shoes pop up. Maybe they'll make you happy. And it's particularly powerful when it comes to politics."
Others say Facebook turned in 2012, around the time of its initial public offering. The IPO is that mythical moment in a start-up's life when Wall Street investors buy shares and founders become billionaires. But shareholders want those impressive user numbers - Facebook had about one billion users at the time - turned into dollar signs, and it isn't always clear how to do this.
However, Facebook figured out a way. It could take all of the information we share about our favourite Rihanna songs, and our postcode, and our feelings about Jeremy Corbyn, and our Ocado shop, and our ethnicity, and sell it to advertisers.
Couple that information with the ability to track your location via your mobile and Facebook became, as John Lanchester put it in the London Review of Books, "the biggest surveillance-based enterprise in the history of mankind". It's now up to 2.2 billion users, by the way - more than the combined populations of Europe and China.
Or you could point to any number of other decisions. The subtle changes to user terms and conditions in 2011 that potentially opened up their data to app developers. The decision in 2010 to allow the developers of games, such as FarmVille, to scrape your data and your friends' data. Or perhaps the introduction of the 'Like' button in 2009.
"Is it any coincidence that the race to the bottom in media - toward clickbait headlines, toward the vulgar and prurient and dumb, toward provocative but often exaggerated takes - has accelerated in lock-step with the development of new technologies for measuring engagement?" wondered James Somers in The Atlantic magazine recently.
But there was something ugly in that code from 2004, when 'TheFacebook' evolved out of FaceMash, a Harvard-based hot-or-not face comparison site, which turned social envy, comparison and judgment into a fun game. (An interesting fact about Zuckerberg is that he studied psychology alongside computer science).
When Zuckerberg asked 4,000 of his peers for information and photos to populate the new site, he bragged to his friends: "People just submitted it ... I don't know why ... They 'trust me' ... dumb f***s."
The #DeleteFacebook campaign is well under way, backed, notably, by Brian Acton, co-founder of WhatsApp, which is owned by Facebook. "It is time. #deletefacebook," he wrote on Twitter last week. But once you factor in the enormous reach of all social media into our intimate lives - Snapchat, Instagram, Twitter, the lot of them - you might wonder how far this goes.
And, for those who are shocked that social media platforms have been used to spy on our most intimate desires and reconfigure democracy, well, the signs have been there all along.
To anyone who has been paying attention, the only surprising thing about the Cambridge Analytica scandal is that anyone is surprised. "This is not an aberration," says Douglas Rushkoff, author of Throwing Rocks at the Google Bus and professor of media theory and digital economics at City University of New York.
"What we're seeing is these platforms performing exactly as they were designed to do. People with a lot of money invest money in a platform that encourages people to share as many private details about themselves as they can. The platform sells that information to other people who want to change our behaviour: marketers, political agents, compliance professionals. That's the way these platforms advertise themselves to their customers."
And you must remember that you, Average Facebook User, are not the customer. You are the product, as the saying goes. Facebook's customers are the people it sells advertising to: household brands, media organisations, political parties and so on.
"One of my graduate students asked me, 'What does it matter if they have my data?'," says Rushkoff. It's not such a bad question - after all, it doesn't hurt when your data is extracted. "But it's a question that only someone of great privilege can ask," he counters. "People might be denied loans, or apartments, because of their data. In China, they're being denied social benefits because of friends they have, or things they've liked on social media."
The Chinese Communist Party is currently trialling a "social credit" system where citizens' data will be used to "provide the trustworthy with benefits and discipline the untrustworthy (so that) integrity becomes a widespread social value". Remember the talk about Zuckerberg running for President?
Even in liberal countries like our own, CVs are no longer the first port of call when sizing up a prospective employee - it's a name search on Google, then a stealth stalk of the candidate's social media profiles.
"I think consumers and users of technology have more power than they think they have," says Kathryn Parsons, founder of Decoded, which aims to spread digital literacy. "I'm a believer in empowerment and education. Know how your data is being used. Understand the platform."
But some people understandably feel that it's not clear where to turn. You might think, "Oh well, at least Instagram's a bit more cheerful", but Facebook owns Instagram. "Hmm, I guess I can keep in touch with my family on WhatsApp." Facebook owns WhatsApp, too. It's not as if Twitter, or Snapchat, seem to be increasing the sum of human happiness, either.
"They're all guilty. They're all part of the problem," says Taplin. "All of them have consistently allowed bad actors to use their platforms. Within hours of the mass shooting in Florida, YouTube featured videos claiming (surviving) school kids weren't real. Those were the most popular videos. These are sources of propaganda and lies. And there's an entire generation who get all their news from these platforms."
One of the most persistent arguments against social media is that it makes people unhappy. A 2017 research paper entitled 'Association of Facebook Use with Compromised Well-Being: A Longitudinal Study' found that Facebook interactions made people unhappier over time, while real-life interactions made them happier. Beyond that, however, the evidence for the psychological evils of social media isn't conclusive.
Amy Orben, an experimental psychologist specialising in social media at Oxford University, believes fears are overplayed. "We've been researching this day-in, day-out for years and I think the striking thing is simply the lack of solid evidence we have at the moment," she says. "There seems to be a small negative effect on adolescent mental health, but it's tiny. We can only actually predict about 1% of someone's well-being from social media. We need more nuanced evidence."
Is there anything we should be worried about? The effect of filters on self-esteem? Orben points out that "passive consumption" - simply scrolling down the news feed with no particular aim - does seem to make people depressed.
"Even Facebook warns about that. But that's very small. If you're directly interacting with people, it's just another method of communicating."
Zuckerberg says he just wants to connect people. But as Zadie Smith (who was at Harvard when FaceMash emerged) once wrote of Facebook and Zuckerberg: "The quality of that connection, the quality of the information that passes through it, the quality of the relationship that connection permits - none of this is important. That a lot of social networking software explicitly encourages people to make weak, superficial connections with each other and that this might not be an entirely positive thing, seems to never have occurred to him."
Better connections are available, if you improve the code.
Maybe Zuckerberg could do it himself? Disentangle the controversial advertising model. He retains a controlling stake, after all. He can do what he likes. Like turn Facebook back into a social network where you can stay in touch with your friends.
The sort of platform he claims he's always wanted to make. "That would be brave and heroic," says Kathryn Parsons. "Ready Player One eat your heart out - he should secure the movie rights."
Perhaps people shouldn't hold out much hope for his apology - or for the response of his shareholders. "Our main fear was that Zuckerberg would propose fundamental changes to the business; we are relatively relieved," said financial analysts at Macquarie last week. And, of course, increased regulation is the thing those who profit from social media sites fear the most.
Maybe we ought to put a little more faith in ourselves. "The thing I always felt bad about was how little of our intuition we bring to these spaces," says Rushkoff. "In the mid-Nineties, as the net began to turn from a research tool into a human manipulation space, I'd always tell people: 'Keep a hold of how you feel on these platforms. Does it make you feel anxious? Stay in touch with your instincts.' When you walk into a bar you do this: 'Is this a place I'm likely to get beat up? Or scammed?' You can feel it."
Reality may be catching up, in any case. The EU's General Data Protection Regulation, arriving in May, will mean sites such as Facebook can't take your information without your explicit consent. And they can't throw you off if you say "no".
In addition, the Government's Data Protection Bill, currently making its way through parliament, gives the Information Commissioner, Elizabeth Denham, the power to fine social media platforms that fail to play by the rules with users' data.
Andrew Orlowski, executive editor of tech publication The Register, agrees that politicians are finally wising up. "They have been in awe of them. But there's a huge lobbying operation, as well. If you asked 100 people, 'Do you want less control over the photos you post on Facebook?', everyone would say no. People want more control.
"But Silicon Valley has pushed this amazing message that no one owns anything and copyright is evil and allowing people to have control would break the internet.
"Thankfully, that era seems to be slipping away now."