Washington Can't Solve Silicon Valley's Problems

Washington’s scrutiny of Silicon Valley ramped up substantially last year, culminating in Google CEO Sundar Pichai’s 3.5-hour command performance before the House Judiciary Committee in December. Members expressed concern about a range of issues, including privacy, potential political bias, and whether the company was planning to introduce a censored search product in China.

Facebook has also drawn public ire, most recently for sharing its data with advertising customers and for researching the motivations of the company’s critics. Members of Congress, whose campaigns likely use Facebook’s various tracking and targeting tools, are, to paraphrase the famous line from Casablanca, shocked — shocked — to find that Facebook uses data to make money.

The principal takeaway from Silicon Valley's troubles is that the original idealistic vision of the internet as consisting of open, neutral platforms where anyone could post anything is not workable anymore, if it ever was. 

The real policy issues, however, are not those related to semantic debates over whether Facebook is correct in asserting that it doesn’t "sell" its data. Sharing data, whether by selling it or using it to serve better ads, is a way to create value. Targeted ads are benefits, not costs, and policymakers should view them that way.

Instead, the real issues stem from the negative externalities that can come with useful technologies. In particular, conventional privacy issues have given way to an array of increasingly difficult-to-manage problems, like concerns that internet platforms might facilitate ethnic cleansing, election meddling, and dissemination of hate speech.

Advocates and politicians across the political spectrum have offered two broad responses to the problems associated with big tech platforms. The first is more aggressive antitrust enforcement, possibly entailing the break-up of one or more of the platforms. The second is new consumer privacy regulation following models recently adopted in Europe and California. 

Neither of these approaches is well suited to address the internet issues of greatest concern — the use of platforms for nefarious purposes — and both would entail substantial compliance costs and hinder innovation. 

It is hard to envision how a regulatory regime designed to tackle these problems could work. Policing platform content is not easy, requiring a combination of sophisticated algorithms and human judgments. Public officials making judgment calls on acceptable speech would soon run afoul of the First Amendment, at least in the United States.

That leaves a third alternative: regulation by the market. Facebook, Google, and other technology platforms are realizing that failure to take responsibility for content on their platforms is harming their brands, their ability to attract the most talented employees, and their share prices. Facebook shareholders saw their equity value decline by 14 percent in the span of a week after the Cambridge Analytica episode became public. From July through the end of 2018, Facebook’s share value declined by about 40 percent — double the decline in other tech stocks such as Amazon and Alphabet, which in turn declined more than the broader averages. Repeated controversies are probably at least partly responsible for this dramatic decline.

The market is telling big technology firms that serving their shareholders' interests requires them to police their content and take responsibility for what is on their platforms. While the task is obviously difficult, tech companies are better equipped to craft solutions than a government agency would be.

Government can’t censor content, but the companies themselves can and should. These platforms are not public utilities that need to give equal access to everyone. Different platforms will appeal to different audiences. Some will appeal to niche audiences of various stripes and will undoubtedly include content that is objectionable to many people. The most successful platforms will likely appeal to broader audiences and curate their content accordingly. It's no different in principle than a mall operator refusing to lease to risky tenants, in part because doing so would harm business for the Nordstrom or Macy's next door.

It is true that the costs of actions like Russian election meddling are incurred by the rest of society — they are externalities and therefore not immediately reflected in stock prices. But once the activities become public, the costs are quickly internalized. Facebook has incurred reputational damage and appears to be spending substantial resources trying to prevent future events. Indeed, the reputational harm may be disproportionate to the true costs, as newly funded campaigns, such as Freedom from Facebook, direct efforts to influence public opinion.

Although tech platforms that didn’t even exist until recently now seem firmly entrenched, they can fall just as fast if they don’t pay close attention to how they are used. As former Intel CEO Andy Grove famously observed, "only the paranoid survive." Hopefully, this will provide Mr. Pichai, Mr. Zuckerberg, and their peers sufficient incentives to mitigate these harms for their shareholders and the American public, because Congress is not equipped to solve these issues.

Thomas M. Lenard is president emeritus and senior fellow with the Technology Policy Institute.
