I Own a Social Media Site & I Want Washington to Regulate Big Tech


As the co-owner of a social media site – Triller – you might think I would be the last person on earth who wants Washington to take action against Big Tech. But I do. Partly because I understand the inner workings of these platforms and know how powerful they are. But also because, over the past year, my company and I have been the targets of a coordinated online harassment campaign that Big Tech has largely turned a blind eye toward.

In April 2021, Triller Fight Club, the boxing league established by our company, put on a bout between YouTube star Jake Paul and mixed-martial artist Ben Askren. We sold more than 1 million pay-per-view subscriptions to the fight at $49.99 per stream.

Shortly after, a YouTube video blogger named Ethan Klein pirated a clip of our fight for his channel and bragged about it. Triller asked him to remove the clip; he refused, so we sued. He eventually removed it, but that wasn’t the end of it. Klein responded by producing a series of video podcasts attacking me, using ad hominem slurs and knowingly false information. Klein’s material has been racist, homophobic and vulgar. He had previously earned a YouTube “strike” for using harmful or dangerous language. Yet his attacks on Triller and me are still on YouTube.

But his videos were just the start.

Klein’s YouTube channel has more than 3 million followers, and they have become his digital foot soldiers. Here’s an example: Nearly 20 percent of Google searches for my name lead to a single website, bearing my name, that Klein set up to defame me. The relentless clicks of Klein’s followers have boosted that site to the top of my Google search results page. It is as if someone had erected a billboard filled with vicious lies about you right in front of your house.

You’d reasonably want that billboard – or, in this case, website – taken down, wouldn’t you? Apparently, Klein’s malicious website with my name in it does not violate the terms of service of either Google or the company hosting it. As a result, my only recourse was to sue Klein for defamation in California state court, where the case is pending.

The Big Tech companies have, since their inception, claimed they were mere platforms, largely not responsible for the content they host and distribute. Today, however, Facebook and Google have become the primary news and information source for many people, and the companies earn advertising money from this traffic. They are now publishers, and publishers must be held to account not only for what they publish but for whom they allow to remain on their platforms. They have let a thugocracy of bad actors maraud across the internet, destroying lives, businesses and reputations with harassment, bullying and libel.

When it comes to intellectual property – the revenue base of any creative business – Big Tech gets the advertising revenue associated with the content it hosts, but assumes little of the responsibility. That’s a pretty sweet deal, and it must change.                                     

Consider: If my company broadcasts a fight on pay-per-view, within moments after the bout begins, we will see multiple YouTube users pirating and broadcasting the fight for free on their channels. The onus is on us to find and report them to YouTube. If YouTube agrees it is pirated content, the users have 72 hours to take it down. But by then the damage has been done to our business – millions of viewers may have seen the fight for free. And at $49.99 per viewer, that’s a lot of lost revenue.

What should be done?

In practice, the biggest social media sites and many websites – YouTube, Facebook, Instagram, Twitter, TikTok, Wikipedia – have become so ubiquitous, so ingrained in most Americans’ everyday lives, and so influential that they have become de facto utilities, like electricity and water. And the government regulates utilities.

As with many advances in the private sector, the government’s regulatory regime has not kept up. The Federal Trade Commission, the Federal Communications Commission and the Department of Justice are all constrained when it comes to regulating the Wild West of Big Tech. For instance, the FTC can crack down on deceptive online content, but not on harmful content.

Therefore, Congress should establish a new federal agency with digital expertise to oversee and regulate Big Tech, focused on consumer protection and competition. This agency would require content-hosting platforms to radically step up their intellectual property protection and to assume at least some responsibility for the content they host. It’s fair to assume that if a company has the technical prowess to invent the Metaverse, it can do a better job of policing its content.

On pain of penalty, social media platforms must clarify and toughen their rules on what content is acceptable and what is not. Too often, vague or narrow definitions of harmful and dangerous content are exploited by savvy online actors who know how to style their harmful content to pass muster, yet still inflict damage.

No one would want to live in a physical world where criminals, predators, con artists, homophobes, xenophobes and the like visit their homes every day, vandalizing their property, robbing their businesses and bullying their children. But over the past 20 years, that is exactly the online world that Big Tech has enabled. It’s time to clean up the neighborhood.

Ryan Kavanaugh is the founder of Proxima Media, the co-owner of TrillerNet and the founder of Relativity Media.


