There’s a new bill circulating in the Senate that would require large internet companies to disclose when they use “opaque algorithms” to select results and to offer consumers the option to see non-personalized search results or content, the Wall Street Journal (WSJ) first reported. It’s called the “Filter Bubble Transparency Act.”
The term “filter bubble” was coined by Eli Pariser, activist and co-founder of Upworthy, to describe the socially destructive impact of showing fragmented, highly personalized content to internet users.
Bipartisan support. The bill’s main sponsor is Republican Senator John Thune, but it has bipartisan support. He said in an interview that the bill was designed to improve “transparency,” “choice” and “control” for consumers. Here’s the essence of what the bill (.pdf) requires platforms to do:
[The platform] provides notice to users of the platform that the platform uses an opaque algorithm that makes inferences based on user specific data to select the content the user sees. Such notice shall be presented in a clear, conspicuous manner on the platform whenever the user interacts with an opaque algorithm for the first time, and may be a one-time notice that can be dismissed by the user.
[The platform] makes available a version of the platform that uses an input-transparent algorithm and enables users to easily switch between the version of the platform that uses an opaque algorithm and the version of the platform that uses the input-transparent algorithm by selecting a prominently placed icon, which shall be displayed wherever the user interacts with an opaque algorithm.
Limited impact on Google results. At a time when “personalization” is top of mind for most marketers and technology companies, the proposed law seeks to let consumers turn off any personalization applied by algorithms. And while there is a widely held belief that Google heavily personalizes results, the company has previously said it does not, with the exception of location and some “immediate context from a prior search.”
As a practical matter, then, the bill would have a limited impact on Google. It could have a much greater impact on companies such as Facebook, YouTube (owned by Google) and potentially even Amazon. But it would also affect any app or website using an algorithm that takes personal data or context into account.
Broad application to content sites, social networks. The bill would apply to “any public-facing website, internet application, or mobile application, including a social network site, video sharing service, search engine, or content aggregation service.”
It represents a bipartisan expression of frustration with Google and Facebook in particular, and an attempt to assert some degree of control over how they present content. Republicans believe “conservative voices” are being “filtered out” by big internet platforms, which they see as biased. Democrats believe platforms like Facebook are manipulated by bad actors and are partly responsible for intensifying the polarization of the electorate.
The proposed law would not affect companies with fewer than 500 employees, less than $50 million in revenue or fewer than one million users.
Algorithms held to be ‘protected speech.’ The bill does not appear to require companies to disclose the specific inputs into their algorithms, just that they’re using algorithms. Previously, U.S. courts have ruled, in Search King, Inc. v. Google Technology, Inc. (2003) and Langdon v. Google, Inc. (2007), that “search results” are protected editorial speech. Presumably this would equally apply to Facebook’s News Feed or YouTube results and content recommendations. However, the U.S. Supreme Court has not ruled on the specific question of whether search results are protected speech under the First Amendment.
Assuming passage of some version of the bill, it’s not clear whether the First Amendment could be used to challenge its constitutionality. The WSJ also speculates that the bill’s provisions could ultimately be folded into broader Congressional digital privacy legislation. If some version of the bill is ultimately passed, the Federal Trade Commission would be in charge of enforcement.
Why we should care. Congress is determined to regulate big tech companies, which have generated considerable frustration and concern on Capitol Hill. There’s enough bipartisan ire to all but ensure that some legislation will pass. The question is the wisdom and sophistication of that effort.
Even if the Filter Bubble Transparency Act were to pass as stand-alone legislation, which is far from certain, it’s not clear that people would care or take advantage of it. Most people would likely use the “default” version of results or content and never click over to the unfiltered version. There’s already plenty of behavioral evidence (see GDPR consent banners and ad choices) that this would be the case.