FTC says social media and online video companies monitor their users ‘extensively’

A new Federal Trade Commission (FTC) staff report on nine technology companies confirms that social media and online video companies collect enormous amounts of users’ personal information, both on and off their websites and apps, and share it with a large number of third-party companies.

The FTC’s report, released Thursday, examined the data collection practices of Facebook, WhatsApp, YouTube, Discord, Reddit, Amazon, Snap, TikTok and Twitter/X from January 2019 through December 2020. Most of the companies’ business models were based on tracking how people interacted with their platforms, collecting their personal information and using it to determine what content and ads users see in their feeds, the report said.

The FTC’s findings confirm years of reports about the depth and breadth of these companies’ tracking practices and accuse the tech companies of “extensive surveillance of users.” The agency recommends that Congress adopt federal privacy regulations based on the data it documented. In particular, the agency urges lawmakers to recognize that the business models of many of these companies provide little incentive for effective self-regulation or protection of user data.

“Recognizing this fundamental fact is important for law enforcement officials and policymakers alike, because any effort to restrict or regulate the way these companies collect vast amounts of personal data runs counter to their primary business interests,” FTC Chair Lina Khan said in a statement. “To develop effective rules or remedies to limit this data collection, policymakers must ensure that violating the law is not more lucrative than complying with the law.”

The FTC also calls on the companies named in the report to invest in “limiting data storage and sharing, restricting targeted advertising, and strengthening child protections.”

The report highlights, in particular, that consumers have little control over how these companies use and share their personal data. Most companies collected or inferred demographic information about their users, such as age, gender and language. Some collected information about household income, education, and parental and marital status. But even when this type of personal information was not explicitly collected, some companies were able to analyze users’ behavior on the platform and infer details of their private lives without their knowledge. For example, some companies’ user interest categories included “baby, children, and maternity,” which revealed parental status, or “newlyweds” and “divorce maintenance,” which revealed marital status. This information was then used by some companies to tailor the content displayed to users, thereby increasing engagement on their platforms. In some cases, this demographic information was shared with third-party companies so that users could be shown more targeted advertisements.

Regardless of the product used, opting out of data collection was not easy, according to the FTC. Almost all of the companies said they shared personal information with automated systems, mostly to deliver content and advertising. Yet almost none of them offered “a comprehensive way to directly control or opt out of how their data was used by any algorithms, data analytics or AI,” according to the report.

Several companies said they could not even compile a complete list of whom they share user data with. When asked to list the advertisers, data brokers and other companies that received consumer data, none of the nine companies provided the FTC with a complete list.

The FTC also found that despite evidence that many of these platforms are used by children and teens, many of the companies said they did not apply different data sharing practices to children under 13 because, they argued, their platforms are not targeted at children. According to the report, none of the companies said they have data sharing practices that treat information collected through their websites and apps about and from 13- to 17-year-olds differently than data collected from adults, even though data about minors is more sensitive.

The FTC called the companies’ data minimization practices “woefully inadequate,” noting that some of the companies did not delete information when users requested it. “Even the companies that did delete data deleted only some of the data, not all of it,” the report said.

“That’s the most basic requirement,” said Mario Trujillo, an attorney with the Electronic Frontier Foundation. “The fact that some have not done so despite state privacy laws requiring it shows that stronger enforcement is needed, especially from consumers themselves.”

Some of the companies have questioned the report’s findings. In a statement, Discord said the FTC’s report was an important step but “lumped together very different models.”


“Discord’s business model is very different – we are a real-time communications platform with strong privacy controls for users and no endlessly scrolling feeds. At the time of the study, Discord did not operate a formal digital advertising service,” said Kate Sheerin, Discord’s head of public policy in the U.S. and Canada, in a statement.

Google said the company has the strictest privacy policies in the industry. “We never sell people’s personal information and we don’t use sensitive information to serve ads. We prohibit personalizing ads for users under 18, and we don’t personalize ads for people who watch ‘made-for-kids content’ on YouTube,” said Google spokesperson José Castañeda.

The other companies either did not provide an official comment or did not immediately respond to a request for comment.

However, if companies dispute the FTC’s findings, the burden of proof is on them, said the Electronic Privacy Information Center (Epic), a Washington DC-based nonprofit research organization focused on privacy and free speech.

“I used to work in corporate privacy, and let’s put it this way: I believe absolutely nothing without the documentation to back up the claims,” said Calli Schroeder, global privacy counsel at Epic. “And I agree with the FTC’s conclusion that self-regulation is a failure. Companies have repeatedly shown that their priority is profit, and they only take consumer protection and privacy seriously when it impacts their bottom line.”
