By: David Oxenford,
Wilkinson Barker Knauer LLP

In our summary of last week’s regulatory actions, I was struck by a common thread in comments made by several FCC Commissioners in different contexts – the thread being the FCC’s role in regulating Internet content companies.  As we noted in our summary, both Republican commissioners issued statements last week in response to a request by a public interest group that the FCC block Elon Musk’s acquisition of Twitter.  The Commissioners stated that the FCC had no role to play in reviewing that acquisition.  Twitter does not appear to own regulated communications assets, and thus the FCC would not be called upon to review any application for the acquisition of that company.  The Commissioners also noted concerns with the First Amendment implications of trying to block the acquisition because of Musk’s hands-off position on the regulation of content on the platform, but the Commissioners’ principal concern was with FCC jurisdiction (Carr Statement; Simington Comments).  In the same week, FCC Chairwoman Jessica Rosenworcel, in remarks to a disability rights organization, talked about plans for more FCC forums on the accessibility of Internet content to follow up on the sessions that we wrote about here.

The ability of the FCC to regulate internet content and platforms depends on statutory authority.  In holding the forums on captioning of online video content, the FCC could look to the 21st Century Communications and Video Accessibility Act, which directed the FCC to examine the accessibility of video content used on internet platforms.  In other areas, the FCC’s jurisdiction is not as clear, but calls arise regularly for the FCC to act to regulate content that, as we have written in other contexts, looks more and more like broadcast content and competes directly with that content.

Calls for the FCC to regulate internet content and the companies that provide that content are certain to multiply.  In another of our weekly summaries of regulatory actions of interest to broadcasters, we noted recent meetings with FCC Commissioners’ offices by representatives of the TV affiliates organizations, in which they asked that the FCC consider regulating linear programming services delivered through internet platforms in the same way that it regulates cable and satellite multichannel video providers, including the possibility of adopting a system of must-carry and retransmission consent.  This is not at all a new idea, having been raised in 2014 in an FCC proceeding that asked for public comment on whether to subject online video providers to MVPD regulation – a proceeding that never resulted in any action (see our articles here and here).

The FCC, of course, is already involved to some degree in internet content regulation.  It deals with transmission paths, both wired and wireless, and has wrestled with questions of “net neutrality” over the last decade.  Even in content areas, it imposes some obligations, but these are in areas ancillary to its broadcast regulation.  For instance, it has rules dealing with broadcast content exported to internet platforms – including obligations to export captions to those platforms when video programming is repurposed by a broadcaster for the internet.  See our articles on captioning such programming here and here.  As we noted in one of our weekly summaries of FCC actions, here, there was recently a multi-million-dollar consent decree between the FCC and a media conglomerate that exported broadcast network programming without captions to an online platform owned by an affiliated company.  In the area of children’s television, there are limits on commercial content on the landing pages of URLs displayed in television programming directed to children.  We also wrote about an apparent allusion by the FCC to penalties for the online use of fake EAS tones – or real tones where there was no emergency.

But these are the exception, not the rule.  For the most part, the FCC has been careful to stay out of internet content regulation where it does not have a clear statutory mandate to intervene.  In some areas, that can result in frustration over the lack of clear online standards.  For instance, in the political broadcasting arena, a broadcaster knows the rules for candidate rates, sponsorship identification and public disclosure of broadcast political content, because those issues are all governed by FCC rules.  But comparable rules for those issues for online political advertising are, for the most part, set by a patchwork of state laws that are obscure and sometimes impose different and even contradictory obligations (see our articles here and here).

Sponsorship identification for broadcasters is also governed by the FCC.  In the online world, the FTC enforces guidelines similar to (and in some cases more stringent than) those imposed by the FCC (see our posts here and here).  But there have been questions as to whether all payments for sponsored content are apparent to online consumers – and even about what practices should be disclosed or permitted (see, for instance, the recent letter from some congressional representatives to Spotify complaining about a program to offer artists more exposure for their music in return for lower royalties).

And bigger issues of moderation of online content have been at the forefront of recent political debate.  Questions were raised prior to the last election as to whether the FCC had jurisdiction to review the application of Section 230 of the Communications Decency Act, which gives online platforms immunity for content posted by third parties – and the degree to which that content can be moderated by those platforms (see our articles here and here).  These issues are sure to become even more common as Congress and others in the political realm consider the power of online platforms and whether there should be governmental limitations on that power (see, for instance, our articles here, here and here).  Will the FCC have a role in enforcing any laws that are ultimately adopted?  We can only wait and see.

Each of these areas demands much greater consideration, and we will, of course, be looking at them all from time to time.  But internet content regulation is an area of much controversy, and that controversy is sure to grow as online content plays an ever-larger role in society.

David Oxenford is MAB’s Washington Legal Counsel and provides members with answers to their legal questions through the MAB Legal Hotline.  Access information here (members-only access).  There is no additional cost for the call; the advice is free as part of your MAB membership.
