A Privacy Blowback Is Coming For FAANG Stocks

Includes: AAPL, FB, GOOG, TWTR
by: Jeffrey Carr
Summary

GOOG is pushing for last-minute changes to California's CCPA (effective Jan 1, 2020) that would protect its ad revenue, just days before the State Legislature adjourns on Sep 13th.

GOOG and FB have both received record-setting fines from the FTC ($170M and $5B, respectively) in recent weeks.

15 states, including CA, NY, TX, WA, and IL, have pending data privacy legislation that would force companies to delete consumer data valuable to advertisers.

40 state attorneys general are launching antitrust probes into FB and GOOG starting September 9.

State Privacy Law Comparison

Bloomberg reports that lobbyists for Google and other FAANG companies are making a last-minute bid to submit legislation that would modify California's Consumer Privacy Act (CCPA) with wording that would allow Google and other companies to continue collecting consumer data for targeted advertising even after the consumer opts out.

“This is a jailbreak,” said California state Senator Hannah-Beth Jackson. “This blows up the entire purpose of the CCPA, which is for people to know when their information is being used and to give them the right to opt out.”

This comes right on the heels of the FTC fining Google $170M for alleged violations of the COPPA rule against harvesting children's data without parental consent, following a complaint filed with the FTC by New York State Attorney General Letitia James. According to the New York Times, this was the largest fine ever obtained by the commission in a children's privacy case.

“Google and YouTube knowingly and illegally monitored, tracked, and served targeted ads to young children just to keep advertising dollars rolling in,” said Attorney General Letitia James. “These companies put children at risk and abused their power, which is why we are imposing major reforms to their practices and making them pay one of the largest settlements for a privacy matter in U.S. history.”

Personal identifiers include the viewer's IP address, and collecting them from viewers under the age of 13 without parental consent is a violation of the Children's Online Privacy Protection Act (COPPA), a federal law. Ads served by YouTube, a Google company, are particularly successful (and profitable) for content creators who host their videos on the platform and allow Google and YouTube to serve targeted ads to the children viewing their content. Some critics of the FTC decision say the fine is too low, since YouTube's parent company Alphabet, Inc. made a net income of $30.7 billion last year.

FTC fines against Google

Google has ongoing GDPR challenges as well. It was fined €50M by the French data protection regulator CNIL on January 21, 2019. Then in May, the Irish Data Protection Commission, the lead GDPR regulator, opened a probe into Google's online advertising exchange. According to the Financial Times, the probe is looking into Google's handling of personal data at each stage of an advertising transaction.

The investigation follows a number of complaints, including from browser company Brave, claiming that Google targets advertising at internet users by applying categories that are banned under Europe’s General Data Protection Regulation. The rules, which came into force a year ago, sharply limit how companies can use information that touches on someone’s race, ethnicity, political opinions, religious beliefs, trade union membership or sexual orientation.

New evidence submitted to the Irish Data Protection Commission this month by Brave's Chief Policy Officer Johnny Ryan showed how Google had labeled him with an "identifying tracker" or cookie that it secretly provided to third-party advertisers via a blank webpage.

Mr Ryan found six separate pages pushing out his identifier after a single hour of looking at websites on Google’s Chrome browser. The identifier contained the phrase “google_push” and was sent to at least eight adtech companies. “This practice is hidden in two ways: the most basic way is that Google creates a page that the user never sees, it’s blank, has no content, but allows . . . third parties to snoop on the user and the user is none the wiser,” said Mr Ryan. “I had no idea this was happening. If I consulted my browser log, I wouldn’t have had an idea either.”
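For readers who want to look for this kind of identifier-sharing in their own browsing sessions, a rough check is possible. The sketch below is a hypothetical illustration only: it assumes you have saved a HAR export of one session from Chrome's DevTools (the file name "session.har" is made up, and the "google_push" marker is simply the phrase Mr Ryan reported), and it lists the hosts that received requests containing that marker.

```python
# Minimal sketch: scan a HAR export (Chrome DevTools > Network > "Save all as HAR")
# for requests whose URLs contain a "google_push" parameter, and list which hosts
# received them. The file name and the marker string are illustrative assumptions.
import json
from urllib.parse import urlparse

HAR_FILE = "session.har"   # assumed export of a single browsing session
MARKER = "google_push"     # identifier phrase reported by Brave's Johnny Ryan

with open(HAR_FILE, encoding="utf-8") as f:
    har = json.load(f)

recipients = set()
for entry in har.get("log", {}).get("entries", []):
    url = entry.get("request", {}).get("url", "")
    if MARKER in url:
        recipients.add(urlparse(url).netloc)

print(f"{len(recipients)} host(s) received requests containing '{MARKER}':")
for host in sorted(recipients):
    print(" -", host)
```

This does not prove anything about what a recipient does with the identifier; it only shows which third-party domains were sent requests carrying it during the captured session.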

Facebook's Record-Setting and Underwhelming $5B Fine

While Facebook was operating under an FTC consent decree from 2012 over privacy abuses, it suffered a data breach affecting 50 million users and permitted massive abuse of Facebook user data (over 80 million users) by Cambridge Analytica, the British defense contractor turned political fixer that used wartime psychological operations techniques to influence election outcomes in ten countries around the world, including in the Brexit and Trump campaigns.

Facebook collects detailed user information from 2.2 billion people on a daily basis and monetizes it by selling access to advertisers who are eager to pay. In fact, after it was announced that the FTC had levied a record $5B fine against the company, its stock went up! That's because FB's revenue was $55.8 billion in 2018, and even a record fine was nothing more than a slap on the wrist - "the cost of doing business," as Kara Swisher put it in a New York Times Op-Ed.

With $23 billion in cash on hand, Facebook will see a $5 billion fine as simply the cost of doing business. Needless to say, this is not how fines are supposed to work. Scott Galloway, a marketing professor at N.Y.U. and my co-host on the podcast Pivot, calls it the “algebra of deterrence,” by which he means a price and a punishment that makes certain you will not do a bad thing again.

Five billion dollars is not that price. “Put another zero on it and then we can start talking,” said Mr. Galloway this week.
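To see why critics treat the penalty as a rounding error, a quick back-of-the-envelope calculation helps. The sketch below simply restates the dollar figures cited in this article (the $5B fine, FB's $55.8B 2018 revenue, and the $23B cash on hand mentioned in the op-ed) as percentages; it is not a valuation model.

```python
# Back-of-the-envelope "algebra of deterrence" using the figures cited above.
fine = 5.0            # FTC fine, in billions of dollars
revenue_2018 = 55.8   # Facebook 2018 revenue, in billions of dollars
cash_on_hand = 23.0   # cash figure cited in the op-ed, in billions of dollars

print(f"Fine as share of 2018 revenue: {fine / revenue_2018:.1%}")   # ~9.0%
print(f"Fine as share of cash on hand: {fine / cash_on_hand:.1%}")   # ~21.7%
```

In other words, the record fine amounts to roughly a month of revenue and about a fifth of the cash Facebook had available.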

FB stock price after FTC fine announcement

According to MarketsInsider, "The stock closed up nearly 2% at $204.87 a share and continued to climb slightly in after-hours trading."

The FTC versus Facebook

The Federal Trade Commission complaint that accompanied the $5B penalty provides a short overview of Facebook's egregious conduct. The following are excerpts from the complaint (emphasis added below by the author):

4. To encourage users to share information, Facebook promises users that they can control the privacy of their information through Facebook’s privacy settings. However, through at least June 2018, Facebook subverted users’ privacy choices to serve its own business interests.

5. Beginning at least as early as 2010, every Facebook user who installed an app (“App User”) agreed to Facebook sharing with the third-party developer of the installed app both information about the App User and the App User’s Facebook Friends. Facebook’s default settings were set so that Facebook would share with the third-party developer of an App User’s app not only the App User’s data, but also data of the App User’s Facebook Friends (“Affected Friends”), even if those Affected Friends had not themselves installed the app. Affected Friends could only avoid this sharing by finding and opting out of it via settings on Facebook’s Applications page, which was located on Facebook’s website and mobile applications, separate and apart from Facebook’s Privacy Settings page. Third-party developers that received user and Affected Friend information could use that information to enhance the in-app experience or target advertising to App Users and their Affected Friends. In the wrong hands, user and Affected Friend data could be used for identity theft, phishing, fraud, and other harmful purposes.

7. In the wake of the FTC’s initial investigation, Facebook retained the separate opt-out sharing setting on its Applications page, but it added a disclaimer to its Privacy Settings page, warning users that information shared with Facebook Friends could also be shared with the apps those Friends used. However, four months after the 2012 Order was finalized, Facebook removed this disclaimer—even though it was still sharing Affected Friends data with third-party developers and still using the same separate opt-out setting that undermined users’ privacy choices before entry of the Commission Order.

8. At its F8 conference in April 2014—one theme of which was user trust— Facebook announced that it would stop allowing third-party developers to collect data about Affected Friends. Facebook also told third-party developers that existing apps could only continue to collect Affected Friend data for one year, or until April 2015. But, after April 2015, Facebook had private arrangements with dozens of developers, referred to as “Whitelisted Developers,” that allowed those developers to continue to collect the data of Affected Friends, with some of those arrangements lasting until June 2018.

9. At least tens of millions of American users relied on Facebook’s deceptive privacy settings and statements to restrict the sharing of their information to their Facebook Friends, when, in fact, third-party developers could access and collect their data through their Friends’ use of third-party developers’ apps. Facebook knew or should have known that its conduct violated the 2012 Order because it was engaging in the very same conduct that the Commission alleged was deceptive in Count One of the original Complaint that led to the 2012 Order. See Exhibit B, In re Facebook, Inc., C-4365, 2012 FTC LEXIS 136 (F.T.C. July 27, 2012) (“Original Complaint”).

10. ...Facebook had allowed millions of third-party developers to access and collect massive troves of consumer data about both App Users and their Facebook Friends, and Facebook failed to track that data in an organized, systematic way.

11. As a general practice, Facebook did not vet third-party developers before granting them access to consumer data; instead, developers simply had to check a box agreeing to comply with Facebook’s policies and terms and conditions, including those designed to protect consumer information. This made Facebook’s enforcement of its policies, terms, and conditions acutely important.

12. Facebook’s enforcement of its policies, terms, and conditions, however, was inadequate and was influenced by the financial benefit that violator third-party app developers provided to Facebook. This conduct was unreasonable. Facebook never disclosed this disparate enforcement practice to the third-party assessor charged by the 2012 Order with assessing the implementation and effectiveness of Facebook’s privacy program, nor did Facebook disclose its enforcement practices to the Commission in its biennial assessment reports mandated by the 2012 Order. See Commission Order, Part V.

13. In addition to its violations of the 2012 Order, Facebook also engaged in deceptive practices in violation of Section 5(A) of the FTC Act. Between November 2015 and March 2018, Facebook asked its users to provide personal information to take advantage of security measures on the Facebook website or mobile application, including a two-factor authentication measure that encouraged provision of users’ phone numbers. Facebook did not effectively disclose that such information would also be used for advertising.

14. Finally, in April 2018, Facebook updated its data policy to explain that Facebook would use an updated facial-recognition technology to identify people in user-uploaded pictures and videos “[i]f it is turned on,” implying that users must opt in to use facial recognition. Contrary to the implication of this updated data policy, however, tens of millions of users who still had an older version of Facebook’s facial-recognition technology had to opt out to disable facial recognition. This violated the 2012 Order by misrepresenting the extent to which consumers could control the privacy of their information used for facial recognition.

FAANG's Order of Battle Against Regulation

Bloomberg reported that Google, Facebook, Inc., and Amazon.com, Inc. all increased their lobbying expenditures in 2018 in the face of growing calls to break the companies up, as well as increased regulation over repeated privacy abuses.

Netflix has taken a different route, focusing its lobbying efforts on the international front instead of Washington, D.C. Of its 30 full-time staffers, 26 are based outside the U.S., in Australia, Belgium, Brazil, Canada, France, Germany, India, Italy, Korea, Mexico, the Netherlands, Singapore, and Spain.

What the Media Giants Spend on DC Influence

In addition to peddling influence in Washington, DC, Facebook is spending money at the state level to weaken new data privacy legislation coming on the heels of the GDPR and the California Consumer Privacy Act (CCPA). Sludge reported that Facebook has increased its state lobbying expenditures by 31% since the 2016 election, mainly with Republican lobbying firms.

“A lot of big companies are increasing lobbying in the states both because there’s sort of more happening in the states and there’s congressional gridlock,” said Ian Vandewalker, senior counsel at the Brennan Center’s Democracy Program. “Companies can get more bang for their buck on state levels.”

Facebook Lobbying Expenditures By State

Based on results in the states where Facebook has hired lobbyists, it costs relatively little to stall or kill a bill that could potentially impact FB's advertising model.

Facebook had no lobbying presence in Montana until 2017, the year a bill was introduced to ban companies from collecting biometric data (e.g., facial recognition data) without first obtaining consent. The bill died in the legislature. Facebook spent $7,000 on lobbyists that year.

Also in 2017, a bill introduced in the Connecticut General Assembly would have affected Facebook’s patented technology for delivering ads based on perceived emotions. The bill died on the floor. Facebook spent $69,000 on lobbyists that year.

In 2018, New York legislators passed regulations for online political ad disclosures, but failed to get any traction on efforts to regulate data collection from Facebook users. Facebook spent $46,000 on lobbyists that year.

The big dog in online lobbying on behalf of FAANG and other technology giants is the Internet Association, founded in 2012 by Google, Facebook, Amazon, and eBay, with member companies including Netflix, Dropbox, PayPal, and Microsoft, among others. In 2017, Alphabet, Google's parent company, spent more on lobbyists than any other corporation in America according to the New York Times.

The Battle to Weaken California's Consumer Privacy Act

The California Consumer Privacy Act began in 2017 when Bay Area real estate developer Alastair Mactaggart and finance executive Rick Arney decided to launch a ballot initiative to provide consumers with a codified list of privacy rights in the face of Silicon Valley's relentless accumulation and monetization of personal data. The three basic principles to be addressed by the initiative were Transparency (what is being collected), Control (the ability to say no), and Accountability (for data breaches).

“One of the reasons why it’s brought as a ballot initiative is that there is consensus that Silicon Valley owns Sacramento,” said Chris Hoofnagle, an adjunct professor at the University of California, Berkeley School of Law and an adviser for the initiative. “There’s no prospect of any consequential consumer privacy legislation.”

Mactaggart funded the effort with approximately $3M of his own money, and by May 2018 he and Arney had collected 629,000 signatures, almost twice the number required for inclusion on the November 2018 ballot. After the initiative was certified, they were contacted by two California legislators with an offer to turn the initiative into legislation and thereby avoid an expensive and protracted media fight with Silicon Valley and the California Chamber of Commerce, which were prepared to spend $100M fighting the initiative through a PAC they set up called "The Committee To Protect California Jobs." Mactaggart agreed, and the ballot initiative instead became AB 375, sponsored by State Senator Robert Hertzberg and State Assemblyman Ed Chau.

Silicon Valley's approach to fighting this bill was multi-faceted. The companies used their extensive influence in the State legislature to try to strip out any teeth the bill might have, while simultaneously backing privacy groups who claimed that the bill wasn't tough enough. "Sometimes politics makes strange bedfellows", said Samantha Corbin, a lobbyist in Sacramento for the EFF, Common Sense Kids Action, and Facebook. It didn't matter to them how the bill was defeated or by whom, as long as it didn't contain provisions that would cut into the industry's extremely profitable business model. Eventually, Mactaggart, Hertzberg, and Chau worked out a compromise that legislators in both houses were happy with, and Facebook announced that “while not perfect, we support AB 375 and look forward to working with policymakers on an approach that protects consumers and promotes responsible innovation.”

The Internet Association also issued the following statement: “The internet industry will not obstruct or block AB 375 from moving forward because it prevents the even-worse ballot initiative from becoming law in California.”

That's not to say that Google and Facebook have stopped trying to weaken the law now that it has passed (as Bloomberg reported last week); they continue to fight pending legislation in other states, at the federal level, and internationally.

Anti-Trust Probes Against GOOG and FB

On Sep 6th, the Wall Street Journal reported: "Top state law-enforcement officials from across the country are formally launching antitrust probes into Facebook Inc. and Alphabet Inc.'s Google starting next week, further pressuring tech giants already under federal scrutiny over whether their online dominance stifles competition."

This move by the states reflects the anger that the companies' users have expressed in a recent WSJ/NBC public opinion poll, which showed that "almost three quarters of respondents said they believe the trade-off that underpins the huge sector—consumers receiving free services but giving up detailed data about their online behavior—is unacceptable."

Summary

The Silicon Valley business model - corporations acquiring consumer data that they neither own nor pay for, cannot adequately protect, and do not give consumers sufficient information about how it is used or with whom it is shared, all for the convenience of building their social graphs - has returned massive profits over the years to "Big Data" companies. A recent study showed that consumer support for increased government regulation spans all demographics and political ideologies, and it is accompanied by a rising tide of enforcement actions in the E.U. and the U.S.

The poll findings reflect a sense that the public sees more downside risk in some online services (Facebook and Twitter). The results also suggest that Congress has a green light from voters to oversee the lightly regulated internet economy more closely, particularly with regard to the privacy legislation now being drafted.

Investors in technology stocks would do well to watch this trend and modify their investment strategies accordingly.

Disclosure: I/we have no positions in any stocks mentioned, and no plans to initiate any positions within the next 72 hours. I wrote this article myself, and it expresses my own opinions. I am not receiving compensation for it. I have no business relationship with any company whose stock is mentioned in this article.