As of the third quarter of 2018, 2.27 billion people actively used Facebook,1 the world's largest social media site, up from 1 billion in 2012. On average, each user spends about 41 minutes on the site daily,2 down from an average of 50 minutes in 2016.
Some, of course, spend far more. Teens, for instance, may spend up to nine hours a day on the site, the consequences of which are only beginning to be understood.
As noted by The Motley Fool,3 Facebook is unique in its ability to monetize the time people spend on its platform. During the third quarter of 2018, the site generated more than $6 per user. For the fourth quarter of 2017, Facebook raked in a total of $12.97 billion, $4.3 billion of which was net profit.4
Most of this revenue — $11.4 billion for the fourth quarter alone — came from mobile ads,5 which are customized to users' preferences and habits. According to CNN Money,6 98 percent of Facebook's revenue comes from advertising, totaling $39.9 billion in 2017.
Facebook's Primary Business Is Collecting and Selling Your Personal Data
Facebook has repeatedly been caught mishandling users' data and/or lying about its collection practices. The fact is, its entire profit model is based on the selling of personal information that facilitates everything from targeted advertising to targeted fraud.
Like Google, Facebook records,7 tracks and stores every single thing you do on Facebook: every post, comment, "like," private message and file ever sent and received, contacts, friends lists, login locations, stickers and more. Even the recurrent use of certain words is noted and can become valuable currency for advertisers.
For individuals who start using Facebook at a young age, the lifetime data harvest could be inconceivably large, giving those who buy or otherwise access that information a very comprehensive picture of the individual in question.
Facebook also has the ability to access your computer or smartphone's microphone without your knowledge.8 If you suddenly find yourself on the receiving end of ads for products or services you just spoke about out loud, chances are one or more apps are linked into your microphone and are eavesdropping.
In the featured video, "The Facebook Dilemma," Frontline PBS correspondent James Jacoby investigates Facebook's influence over the democracy of nations, and the lax privacy parameters that allowed for tens of millions of users' data to be siphoned off and used in an effort to influence the U.S. elections.
The Early Days of Facebook
The Frontline report starts out showing early video footage of Zuckerberg in his first office, complete with a beer keg and graffiti on the walls, talking about the success of his social media platform. At the time, in 2005, Facebook had just hit 3 million users.
In an early Harvard lecture, Zuckerberg talks about how he believes it's "more useful to make things happen and apologize later than it is to make sure you dot all your i's now, and not get stuff done." As noted by Roger McNamee, an early Facebook investor, it was Zuckerberg's "renegade philosophy and disrespect for authority that led to the Facebook motto, 'Move fast and break things.'"
While that motto speaks volumes today, "It wasn't that they intended to do harm, as much as they were unconcerned about the possibility that harm would result," McNamee says. As for the sharing of information, Zuckerberg assured a journalist in an early interview that no user information would be sold or shared with anyone the user had not specifically given permission to.
In the end, Zuckerberg’s quest to “Give people the power to share and make the world more open and connected,” has had far-reaching consequences, affecting global politics and technology, and raising serious privacy issues that have yet to be resolved.
For years, however, employees firmly believed Facebook had the power to make the world a better place. As noted by Tim Sparapani, Facebook director of public policy from 2009 to 2011, Facebook "was the greatest experiment in free speech in human history," and a "digital nation state."
However, the company — with its largely homogenous workforce of 20-something tech geeks — has proven to be more than a little naïve about its mission to improve the world through information sharing. Naomi Gleit, vice president of social good and a member of the company's growth team, says they were slow to understand "the ways in which Facebook might be used for bad things."
The Facebook News Feed
One of the key features of Facebook that keeps users engaged is the news feed, described by Antonio Garcia Martinez, a former product manager on Facebook's advertising team, as "Your personalized newspaper; your 'The New York Times' of you, channel you. It is your customized, optimized vision of the world."
However, the information that appears in your news feed isn't random. From the very beginning, it was driven by a secret algorithm, a mathematical formula that ranked stories in terms of importance based on your individual preferences. This personalization is "the secret sauce," to quote Martinez, that keeps users scrolling and sharing.
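The idea of preference-based ranking can be sketched in a few lines of code. This is a toy illustration only, not Facebook's actual (and secret) algorithm; the topics, weights, and function names are all invented for the example:

```python
# Toy sketch of preference-based feed ranking: each story is scored by
# how strongly the user's inferred interests match its topic, and the
# feed is sorted so the highest-scoring stories appear first.

def rank_feed(stories, interests):
    """Sort stories by the user's affinity for their topics.

    stories   -- list of (story_id, topic) tuples
    interests -- dict mapping topic -> affinity weight (higher = more engaging)
    """
    return sorted(stories,
                  key=lambda story: interests.get(story[1], 0.0),
                  reverse=True)

# Interests would be inferred from past likes, clicks and comments.
feed = rank_feed(
    [("a", "politics"), ("b", "recipes"), ("c", "sports")],
    {"recipes": 0.9, "politics": 0.7},
)
print([story_id for story_id, _ in feed])  # ['b', 'a', 'c']
```

The point of the sketch is simply that whoever controls the weights controls what each user sees first, which is why the "secret sauce" is so valuable.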
The addition of the "Like" button in 2009 revolutionized the company's ability to gather personal data — information about your preferences that can then be sold for cold hard cash. It also "acted as a social lubricant" and a "flywheel of engagement," Soleio Cuervo, a former product manager for the company, says.
The ability to get feedback through "likes" made people feel like they were being heard, and this ultimately became "the driving force of the product," Cuervo says. However, the "Like" button also suddenly allowed Facebook to determine who you care about most among your friends and family, what kind of content makes you react or take action, and which businesses and interests are truly important to you — information that helps build your personality profile and can be sold.
The Legal Provision That Allowed Facebook to Exist and Flourish
The Facebook news feed was made possible by laws that do not hold internet companies liable for the content posted on their websites. As explained by Sparapani, "Section 230 of the Communications Decency Act is the provision which allows the internet economy to grow and thrive. And Facebook is one of the principal beneficiaries of this provision."
Section 230 of the Communications Decency Act basically says an internet provider cannot be held responsible if someone posts something violent, offensive or even unlawful on their site. According to Sparapani, Facebook “took a very libertarian perspective” with regard to what it would allow on its site.
Aside from a few basic common decency rules, the company was “reluctant to interpose our value system on this worldwide community,” Sparapani says. Were they concerned about truth becoming obfuscated amid a flood of lies? Jacoby wonders. “No,” Sparapani says. “We relied on what we thought were the public’s common sense and common decency to police the site."
Real-World Impacts of Social Media
The tremendous impact of social media, the ability to share information with like-minded individuals, became apparent during the so-called “Arab Spring” in 2011. A Facebook page created by Wael Ghonim, a Google employee in the Middle East, sparked a revolution: just 18 days after a Facebook call for protest brought hundreds of thousands of people into the streets, Egyptian President Hosni Mubarak resigned.
Around the world, it became clear that Facebook could be used to create democratic change; that it has the power to change society as we know it. Alas, with the good comes the bad. After the revolution, conflict in the Middle East spiraled out of control as the polarization between opposing sides grew — and the social media environment both bred and encouraged that polarization.
What's worse, Facebook's news feed algorithm was actually designed to reward polarizing material with greater distribution. The end result played out in the streets, where sectarian violence led to bloodshed.
"The hardest thing for me was seeing the tool that brought us together tearing us apart,” Ghonim says, adding, “These tools are just enablers for whomever; they don’t separate between what’s good and bad. They just look at engagement metrics.” Since the Arab Spring, the rise of fake news has been relentless.
"Everything that happened after the Arab Spring should have been a warning sign to Facebook,” says Zeynep Tufekci, a researcher and former computer programmer. One major problem, she believes, is that Facebook was unprepared to monitor all of the content coming from every corner of the globe.
She urged the company to hire more staff, and to hire people who know the language and understand the local culture in each region where Facebook is available. Still, it's unlikely that any company, of any size, would be able to police the content of a social network with more than 2 billion users.
Privacy — What Privacy?
In order for Facebook to go public, it had to be profitable, which is where the selling of user data comes in. As you move through content, and even web pages outside of Facebook, "liking" and commenting on posts along the way, the platform collects information about you, which it then sells so that marketers can target their chosen market.
While this seems innocuous enough at first glance, this data harvesting and selling has tremendous ramifications, opening people up to be purposely deceived and misled.
Zuckerberg, whose experience with advertising was limited, hired former Google vice president of global online sales and operations, Sheryl Sandberg, as chief operating officer. In one interview, Sandberg stresses that Facebook is "focused on privacy," and that their business model "is by far the most privacy-friendly to consumers."
"That's our mission," Zuckerberg chimes in, adding "We have to do that because if people feel like they don't have control over how they're sharing things, then we're failing them." "It really is the point that the only things Facebook knows about you are things you've done and told us," Sandberg says.
Internally, however, Sandberg demanded revenue growth, which meant selling more ads, which led to data harvesting that today exceeds people’s wildest imagination.
How to Build an Orwellian Surveillance Machine
By partnering with data brokering companies, Facebook has access to an incredible amount of data that has nothing to do with what you post online — information on your credit card transactions, where you live, where you shop, how your family is spending its time, where you work, what you eat, read, listen to and much more.
Information is also being collected about all other websites you’re perusing, outside of Facebook’s platform. All of this information, obtained by companies without your knowledge, is shared with Facebook, so that Facebook can sell ads that target specific groups of users. As noted by Tufekci, in order for Facebook’s business model to work, “it has to remain a surveillance machine."
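Why off-platform broker data makes targeting so much sharper can be illustrated with a small sketch. This is a hypothetical example, not Facebook's actual system; every user ID, field name, and record here is invented:

```python
# Hypothetical sketch: broker records (purchases, location) are joined to
# on-platform profiles by user ID, letting an advertiser select a far more
# specific audience than either data source would allow alone.

platform_profiles = {
    101: {"likes": ["running", "organic food"]},
    102: {"likes": ["gaming"]},
}
broker_records = {
    101: {"city": "Austin", "recent_purchase": "running shoes"},
    102: {"city": "Boston", "recent_purchase": "headphones"},
}

# Join the two sources on user ID.
merged = {uid: {**platform_profiles[uid], **broker_records.get(uid, {})}
          for uid in platform_profiles}

# An advertiser's audience query: users who like running AND recently
# bought running gear -- a detail only the broker data reveals.
audience = [uid for uid, profile in merged.items()
            if "running" in profile.get("likes", [])
            and "running" in profile.get("recent_purchase", "")]
print(audience)  # [101]
```

Nothing in the broker half of this join came from anything the user posted, which is exactly Tufekci's point about the surveillance machine.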
In short, it’s the most effective advertising tool ever created. The price? Your privacy. Sparapani was so uncomfortable with Facebook’s new direction that he resigned before the company’s partnership with data brokers took effect.
The extent of Facebook's data collection remained largely unknown until Max Schrems, an Austrian privacy advocate, filed 22 complaints with the Irish Data Protection Commission, where Facebook's international headquarters are located.
Schrems claimed that Facebook’s personal data collection violated European privacy law, as Facebook was not telling users how that data was being used. In the end, nothing happened. As noted by Schrems, it was obvious that “even if you violate the law, the reality is it’s very likely not going to be enforced.” In the U.S., the situation is even worse, as there are no laws governing emerging technologies which utilize9 the kinds of data collection done by Facebook.
Federal Trade Commission Investigates Privacy Concerns
A 2010 investigation of Facebook's data collection by the U.S. Federal Trade Commission (FTC) revealed the company was sharing user data with third party software developers without the users' consent — conduct the FTC deemed deceptive.
The FTC also grew concerned about the potential misuse of personal information, as Facebook was not tracking how third parties were using the information. They just handed over access, and these third parties could have been absolutely anyone capable of developing a third-party app for the site. Facebook settled the FTC's case against them without admitting guilt, but agreed by consent order to "identify risk to personal privacy" and eliminate those risks.
Internally, however, privacy issues were clearly not a priority, according to testimony by Sandy Parakilas, Facebook's platform operations manager between 2011 and 2012. During his time with the company, Parakilas ended up in charge of solving Facebook's privacy conundrum, a responsibility he felt significantly underqualified for, considering its scope.
The Cambridge Analytica Scandal
Facebook, with founder Mark Zuckerberg at its helm, faced a firestorm after The New York Times and British media outlets reported Cambridge Analytica used "improperly gleaned" data from 87 million Facebook users to influence American voters during the 2016 presidential election.10,11
Cambridge Analytica data scientist Christopher Wylie, who blew the whistle on his employer, revealed the company built "a system that could profile individual U.S. voters in order to target them with personalized political advertisements" during the presidential campaign.
Parakilas insisted Facebook could have prevented the whole thing had they actually paid attention to and beefed up their internal security practices.12 Indeed, Cambridge Analytica used the very weakness the FTC had identified years before — a third-party personality quiz app called "This Is Your Digital Life."13
The Dark Side of Social Media Rears Its Ugly Head Again
The U.S. Department of Defense has also expressed its concerns about Facebook, noting the ease with which it can spread disinformation. As noted by former Defense Advanced Research Projects Agency program manager Rand Waltzman, the significant danger in giving out personal data is that you open yourself up to manipulation — whether you’re being manipulated to buy something you don’t need or to believe something that isn’t true.
Between 2012 and 2015, Waltzman and colleagues published 200 scientific papers on the potential threats posed by social media, detailing how Facebook and other platforms could be used for nefarious purposes. According to Waltzman, disinformation can be turned "into a serious weapon" on Facebook, as you have the ability to mislead enormous numbers of people with very little effort.
Essentially, Facebook allows for the propagation of propaganda at an enormous scale. "It's the scale that makes it a weapon," Waltzman says. Jacoby interviews a young Russian who claims to have worked as a paid social media propagandist for the Russian government, using fake Facebook profiles to spread false information and sow distrust of the Ukrainian government.
The reach of this disinformation was made all the greater by the fact that you can pay to promote certain posts. In the end, all of the tools created by Facebook to benefit advertisers work equally well as government propaganda tools. The end result is tragic, as fake news has mushroomed to incomprehensible levels. Taking anything at face value these days is risky business, no matter how legitimate it may appear.
Understand the Risks of Social Media Use
Social media has many wonderful benefits. But there’s a dark side, and it’s important to be aware of it. Sen. Ron Wyden (D-OR) has drafted legislation to protect consumer information by enforcing strict punishments, including up to 20 years of jail time, for senior company executives who fail to follow the guidelines for protecting user data. As reported by Engadget:14
"The FTC would add 175 new members to its staff to carry out enforcement and would be given the ability to penalize a company up to four percent of its revenue for its first violation. Companies would also be required to submit regular reports to the FTC to disclose any privacy lapses that have occurred.
Companies making more than $1 billion in revenue and handling information from more than 1 million people and smaller companies handling the data of more than 50 million people would be subject to the regular check-ins. Failure to comply would carry a punishment of potential jail time for executives.
The legislation would also institute a Do Not Track list. When a consumer joins the list, companies would be barred from sharing their data with third parties or using it to serve up targeted advertisements … Even if consumers don't choose to join the list, they would be granted the ability to review information collected about them, see who it has been shared with or sold to and challenge any inaccuracies."
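The nested thresholds in the quoted summary are easier to see written as a simple predicate. This is a plain-logic reading of the news report above, not the actual legal text, and the function name is invented:

```python
# Sketch of the reported thresholds in Wyden's draft bill: regular FTC
# privacy reports are required of (a) companies with over $1 billion in
# revenue handling data on more than 1 million people, or (b) any company
# handling data on more than 50 million people.

def subject_to_checkins(annual_revenue_usd, people_with_data):
    """Return True if a company would owe regular privacy reports to the FTC."""
    big_company = annual_revenue_usd > 1_000_000_000 and people_with_data > 1_000_000
    big_data_holder = people_with_data > 50_000_000
    return big_company or big_data_holder

print(subject_to_checkins(2_000_000_000, 5_000_000))    # True: >$1B and >1M people
print(subject_to_checkins(100_000_000, 60_000_000))     # True: small firm, >50M people
print(subject_to_checkins(100_000_000, 5_000_000))      # False: below both thresholds
```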
Aside from privacy concerns and fake news, Facebook lurking has also been linked to decreased emotional well-being, and online bullying, social isolation and depression have all become serious problems among our youth.
The obvious answer to all of these issues is to minimize your use of Facebook, and be mindful of what you post, click on and comment on while there. Information is still being gathered on your personal life by other data brokers, but at least it won’t be as effectively “weaponized” against you if it’s not tied to your Facebook profile.
Source: mercola rss