FTC report finds that major social media and video streaming companies monitor their users on a large scale while relying on lax privacy controls and providing inadequate safeguards for children and teens

A new report from the Federal Trade Commission (FTC) examining the data collection and use practices of major social media and video streaming services shows that these services engage in extensive surveillance of consumers to monetize their personal information while failing to adequately protect online users, especially children and teens.

The staff report is based on responses to 6(b) orders issued in December 2020 to nine companies, including some of the largest social media and video streaming services: Amazon.com, Inc., which owns the gaming platform Twitch; Facebook, Inc. (now Meta Platforms, Inc.); YouTube LLC; Twitter, Inc. (now X Corp.); Snap Inc.; ByteDance Ltd., which owns the video-sharing platform TikTok; Discord Inc.; Reddit, Inc.; and WhatsApp Inc.

The orders asked for information about how the companies collect, track and use personal and demographic data, how they determine what ads and other content to show to consumers, whether and how they apply algorithms or data analytics to personal and demographic data, and the impact of their practices on children and young people.

“The report lays out how social media and video streaming companies collect enormous amounts of Americans’ personal data and monetize it to the tune of billions of dollars a year,” said FTC Chair Lina M. Khan. “While these surveillance practices are lucrative for the companies, they can compromise people’s privacy, threaten their freedom, and expose them to a range of dangers, from identity theft to stalking. Of particular concern is the failure of several companies to adequately protect children and teens online. The report’s findings are timely, especially as state and federal policymakers consider laws to protect people from abusive data practices.”

The report found that the companies collected and could indefinitely retain vast amounts of data, including information obtained from data brokers and data about both users and non-users of their platforms. The staff report also highlighted that many companies engaged in extensive data sharing, which raises serious concerns about the adequacy of the companies’ controls and oversight over how they handle data. In particular, the staff report found that the companies’ data collection, minimization and retention practices were “wholly inadequate.” In addition, the staff report found that some companies failed to delete all user data in response to user deletion requests.

The staff report also found that many companies’ business models encouraged the mass collection of user data for monetization, particularly through targeted advertising, which accounts for the majority of their revenue, and that these incentives were in tension with users’ privacy and posed risks to it. In particular, the report found that some companies deployed privacy-invasive tracking technologies, such as pixels, to facilitate serving ads to users based on their preferences and interests.

In addition, the staff report highlighted the many ways in which the companies fed users’ and non-users’ personal information into their automated systems, including for use by their algorithms, data analytics and AI. The report found that users and non-users had little to no opportunity to opt out of the use of their data by these automated systems, and that there were varying, inconsistent and inadequate approaches to monitoring and auditing the use of automated systems.

The staff report also concluded that social media and video streaming services do not adequately protect children and teens on their platforms. The report cited research suggesting that social media and digital technologies contribute to negative mental health outcomes among young users.

Based on the data collected, the staff report notes that many companies asserted that children are not on their platforms because their services are not directed at children or do not permit children to create accounts. The staff report describes this as an apparent attempt to avoid liability under the Children’s Online Privacy Protection Act (COPPA). It also found that the social media and video streaming services often treated teens the same as adult users, with most companies allowing teens onto their platforms without any account restrictions.

The report also discussed some of the potential competitive implications of the companies’ data practices. It found that companies that amass large amounts of user data can gain a dominant market position, which can lead to harmful practices in which companies prioritize acquiring data at the expense of user privacy. It also noted that with limited competition among social media and video streaming services, consumers have few alternative platforms to turn to.

The staff report provides recommendations to policymakers and businesses based on staff observations, insights and analysis, including:

  • Congress should pass comprehensive federal privacy laws to limit surveillance, establish basic protections, and provide privacy rights to consumers.
  • Companies should limit data collection, implement concrete and enforceable data minimization and retention policies, limit data sharing with third parties and affiliates, delete consumer data when it is no longer needed, and implement consumer-friendly privacy policies that are clear, simple, and easy to understand.
  • Companies should not collect sensitive data through privacy-invasive ad tracking technologies.
  • Companies should carefully review their policies and practices regarding ad targeting based on sensitive categories.
  • Companies should address the lack of control users have over how automated systems use their data, as well as the lack of transparency regarding the use of such systems, and implement more stringent testing and monitoring standards for them.
  • Companies should not ignore the fact that children use their platforms; they should treat COPPA as a minimum requirement and provide additional safety measures for children.
  • Companies should recognize that teens are not adults and provide them with greater privacy protections.
  • Congress should pass federal privacy legislation to fill the gap left by COPPA, which does not protect teens over the age of 13.

The Commission voted 5-0 to release the staff report. Chair Khan and Commissioners Alvaro Bedoya, Melissa Holyoak and Andrew N. Ferguson each issued a separate statement.

The lead attorneys on this report are Jacqueline Ford, Ronnie Solomon and Ryan Mehm of the FTC’s Bureau of Consumer Protection.
