FTC finds social media and online video companies engaging in 'extensive surveillance' of users

Social media and online video companies are collecting vast amounts of your personal information on and off their websites or apps and sharing it with a wide range of third-party entities, a new Federal Trade Commission (FTC) staff report on nine technology companies confirms.

The FTC report released Thursday analyzed the data collection practices of Facebook, WhatsApp, YouTube, Discord, Reddit, Amazon, Snap, TikTok and Twitter/X between January 2019 and December 2020. Most of the companies’ business models incentivized tracking how people interacted with their platforms, collecting their personal data and using it to determine what content and ads users see in their feeds, the report said.

The FTC’s findings validate years of reporting on the depth and breadth of these companies’ tracking practices and call out the tech firms for “vast surveillance of users.” The agency recommends that Congress pass federal privacy regulations based on what it has documented. In particular, the agency urges lawmakers to recognize that the business models of many of these companies do little to incentivize effective self-regulation or protect user data.

“Recognizing this basic fact is important for law enforcement and policymakers alike, because any effort to limit or regulate how these companies collect vast amounts of people’s personal data will conflict with their core business incentives,” FTC Chairwoman Lina Khan said in a statement. “To craft effective rules or remedies that limit this data collection, policymakers will need to ensure that breaking the law is not more lucrative than complying with it.”

The FTC also calls on companies named in the report to invest in “limiting data retention and sharing, restricting targeted advertising, and strengthening protections for teens.”

In particular, the report highlights that consumers have little control over how these companies use and share their personal data. Most companies collected or inferred demographic information about users, such as age, gender, and language. Some collected information about household income, education, and marital and parental status. But even when this type of personal information was not explicitly collected, some companies were able to analyze users’ behavior on the platform to infer details of their personal lives without their knowledge. For example, some companies’ user interest categories included “baby, kids, and parenthood,” which would reveal parental status, or “newlyweds” and “divorce support,” which would reveal marital status. Some companies then used this information to tailor the content people saw in order to increase engagement on their platforms. In some cases, that demographic information was shared with third-party entities to help target users with more relevant ads.

According to the FTC, no matter what product you used, it was not easy to opt out of data collection. Nearly all of the companies said they provided personal information to automated systems, most often to serve content and ads. On the other hand, almost none of them offered “a comprehensive ability to directly control or opt out of the use of your data by all algorithms, data analytics, or artificial intelligence,” according to the report.

Several companies said it was impossible to even compile a complete list of who they share data with. When asked to list the advertisers, data brokers and other entities they share consumer data with, none of the nine companies provided the FTC with a complete inventory.

The FTC also found that despite evidence that children and teens use many of these platforms, many of the tech companies reported that because their platforms are not directed at children, they do not need different data-sharing practices for children under 13. According to the report, none of the companies reported having data-sharing practices that treated information collected about and from 13- to 17-year-olds through their sites and apps differently than data about adults, even though data about minors is more sensitive.

The FTC called the companies’ data minimization practices “woefully inadequate” and found that some of them failed to delete information when users requested it. “Even companies that actually deleted data only deleted some data, but not all,” the report said.

“That’s the most basic requirement,” said Mario Trujillo, a staff attorney at the Electronic Frontier Foundation. “The fact that some didn’t do it, even in the face of state privacy laws requiring it, shows that stronger enforcement is needed, especially by consumers themselves.”

Some companies have questioned the report's conclusions. In a statement, Discord said the FTC report was an important step but lumped “very different models together.”

“Discord’s business model is very different: we are a real-time communications platform with strong user privacy controls and no endless feeds to scroll through. At the time of the study, Discord did not operate a formal digital advertising service,” Kate Sheerin, Discord’s director of public policy in the US and Canada, said in a statement.

Google said it had the strictest privacy policies in the industry. “We never sell people’s personal information and we don’t use sensitive information to serve ads. We prohibit ad personalization for users under 18, and we don’t personalize ads for anyone viewing ‘content made for kids’ on YouTube,” said Google spokesman Jose Castaneda.

The other companies either did not provide an official comment or did not immediately respond to a request for comment.

However, if companies challenge the FTC's findings, the burden is on them to provide evidence, says the Electronic Privacy Information Center (EPIC), a Washington, DC-based public interest research organization focused on privacy and free speech.

“I worked in privacy enforcement for companies, and let’s just say I don’t believe anything without documentation to back up the claims,” said Calli Schroeder, EPIC’s global privacy counsel. “And I agree with the FTC’s conclusion that self-regulation is a failure. Companies have repeatedly shown that their priority is profits, and they will only take consumer protection and privacy issues seriously when not doing so would hurt those profits.”
