7 ways protections for online content are being eroded

Recent changes to Section 230 of the Communications Decency Act raise questions about how safe from liability publishers will continue to be for user-generated or third-party content. Contributor Wesley Young discusses threats on the horizon to those protections.

Section 230 of the Communications Decency Act (CDA) is critical to the foundation of online commerce as it’s exercised today. That’s why the recent debate about tweaking it to tackle online sex trafficking pitted some of the biggest online players against the interests of some of the most vulnerable victims in our society.

In the end, the law in question (the Fight Online Sex Trafficking Act, or FOSTA) was drafted narrowly enough to address the specific problem exemplified by Backpage.com, and the large publishers backed off.

But many are still concerned about the impact the change will have on the internet. Additionally, more direct threats have been ongoing for some time, mostly on the state and local level, which have the potential to significantly disrupt all kinds of online content including local advertising. Then there was Facebook CEO Mark Zuckerberg’s testimony before Congress, during which he seemed to acknowledge that the social network can and should take responsibility for the content published on it by others.

Below I take a look at the concerns surrounding publisher immunity and how they can affect the local search industry and more.

The issue

One fundamental principle that has shaped how the marketing industry has evolved, including local search content and advertising, is the protection for publishers against liability for third-party content.

It’s a critical protection since so much content is created by third parties but hosted by publishers, including social networks, search engines, review sites and more. Search results serve up third-party website content; reviews capture user-generated recommendations and critiques; and both print and digital media display advertisements created, and sometimes even served, by third parties.

Even operators of personal websites or owners of social media pages that exercise control over content might be considered publishers when they host ads or solicit engagement with their content. Thus, “publishers” is a broad term in this context that includes everyone who controls, hosts, operates or manages online content and has the ability to moderate user-generated content.

Without immunity for third-party content, a publisher might be held liable for misleading advertising, false reviews or slanderous comments. For example, if I clicked on a sponsored post that guaranteed a “double your money in one week” investment opportunity, I might sue the website owner for “promoting” the scam when I lose all my money. Publisher immunity laws mean the originator of the content is responsible for its own speech, and publishers don’t have to screen every user-generated statement for veracity.

The protections for online publishers come from Section 230 of the Communications Decency Act of 1996, which states: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

Many states have their own protections for publishers in the form of exemptions from consumer protection laws: a publisher isn’t liable for running an advertisement that violates those laws as long as it didn’t know the ad was deceptive. For example, California provides this exemption for publishers in its prohibition on false advertising:

[blockquote]This article does not apply to any visual or sound radio broadcasting station, to any internet service provider or commercial online service, or to any publisher of a newspaper, magazine, or other publication, who broadcasts or publishes, including over the Internet, an advertisement in good faith, without knowledge of its false, deceptive, or misleading character.[/blockquote]

These protections have been used for a lot of good but, unfortunately, some bad, too, as detailed in the decade-long battle courts have waged with Backpage.com, a classified ads site whose adult section was used widely by perpetrators of online sex trafficking. (That section was shut down in 2017, and the site was seized by the U.S. Department of Justice earlier this month.)

This shutdown of the website, along with Congress’s amendment to Section 230, has brought the debate about eroding publisher protections to the forefront.

FOSTA

FOSTA (the Fight Online Sex Trafficking Act) was signed into law by the President last week. (SESTA was the Senate version before some of its provisions were folded into FOSTA.) It amends several criminal laws that target those who commit trafficking crimes.

With regard to the Section 230 protections for publishers, FOSTA creates a narrow exception to the immunity granted. Publishers are not protected if their site is managed or operated “with the intent to promote or facilitate the prostitution of another person . . . .” Thus, the exception only affects those who operate with criminal intent, a standard that shouldn’t cause much concern in the internet industry.

FOSTA also amends Section 1591 of the US Code, adding language specifying that it is a crime to facilitate sex trafficking when you know the victim is forced into it or is a minor. While “knowing” is also a high standard, there is enough uncertainty to cause online personal classifieds sites, many of which are well-known for illegal postings, to shut those forums down.

Much of the media coverage of FOSTA criticizes it for weakening publisher immunity. But even to the extent the changes weaken that immunity, they were necessary to address severe and heartbreaking crimes against child victims.

Between 2010 and 2015, the Senate Committee on Homeland Security found an 846 percent increase in reports of suspected child sex trafficking, directly correlated to the increased use of the internet to sell children for sex. Backpage successfully wielded Section 230 for the better part of a decade to avoid prosecution or liability before being shut down just this month. As a result, Congress passed the bill in as close to a unanimous vote as we’ve seen in this contentious political environment.

The bigger threats to publisher immunity

The real concern regarding FOSTA for publishers is the precedent it sets. There have been numerous attempts to make publishers more responsible for content in the past, and the fear is that FOSTA may be used to justify a broader erosion of protections, which would have a much more direct impact on local search and other online businesses.

If publishers are made responsible for third-party content, a variety of online marketing products and services, including local search, will become much more expensive. Uncertainty regarding enforcement, both from regulators and private action, means higher risk for liability. With higher risk come higher prices to cover insurance or pay damages in a civil suit. Or, in the worst-case scenario, publishers will stop hosting third-party content in those areas where there is exposure.

Below are some of the bigger threats to publisher immunity, including examples of legislation that has been pushed and a look at the ways online businesses in general, and local search in particular, would be affected if those proposals or ideas move forward:

1. Public concessions in response to PR crises
There is a growing perception, among lawmakers and others, that publishers ought to have some responsibility for the content on their sites or platforms, contrary to the Section 230 protections. That mindset is being fed by very public statements from some of the largest publishers in response to PR crises.

It’s understandable and a common PR strategy to apologize and accept responsibility as a way to move the discussion forward from the bad act and on to next positive steps. However, that becomes problematic when the statements are so broad as to nearly invite additional regulation.

The most recent example of this is from Facebook’s Mark Zuckerberg during testimony at Congressional hearings involving Cambridge Analytica. He made statements that the company is “responsible for the content on its platform” and that Facebook needs to take a “broader view” of its responsibility in the world.

While the hearings were ostensibly about data security and privacy, Zuckerberg’s own words indicate he was not necessarily limiting them to the privacy issue, and lawmakers’ questions covered everything from content censorship to Facebook’s responsibility for illegal pharmaceutical ads. Statements like Zuckerberg’s will likely be cited in arguments for expanding publisher liability.

2. Local businesses are asked to screen ads they host
These questions about ads have also been addressed in a number of state bills that aimed to require website operators or administrators to screen ads before allowing them to appear on their sites.

For example, some call for websites to identify the products or services being advertised and to include mandatory disclosures for certain business categories. Other bills have mandated that website owners check that an advertiser has the required permits or licenses before allowing its ad to run. A bill previously introduced in California contained the following language addressed to the entertainment industry:

The operator of an Internet Web site that posts casting advertisements shall not post the advertisement of a person subject to paragraph (1) of subdivision (a) unless the person has provided information to the operator to establish that the person is the recipient of a valid Child Performer Services Permit, including a permit number and a form of identification to verify that the person is the recipient.

Most ads aren’t even placed in a manner that would allow them to be individually reviewed; they are instead populated automatically via programmatic advertising (more below). Even if an individual ad were sold, such a manual screening process is not only prohibitively inefficient but also burdens small businesses with legal risk and compliance obligations outside their expertise.

For small business owners, requirements like these would make the risk far outweigh the benefit of hosting ads on their sites.

3. Publishers are asked to verify the veracity of directory listings
Similarly, state bills have sought to impose requirements on traditional local search publishers of search results or directory listings. These bills often involve business categories, such as locksmiths and adoption agencies, that have plagued regulators seeking to catch or shut down abusive operators.

Legislative bills have sought to make publishers verify advertisers’ compliance with professional regulations before listings or ads can be displayed. For example, some bills have asked publishers to verify physical addresses or check license numbers against state agency records. Others, like one introduced in Maine, would have made publishers determine compliance with the proposed regulation as a whole, reading:

“Publication prohibited. A person may not publish by means of a public medium an advertisement that violates this section.”

Making publishers ad hoc regulators is not only ineffective but also a misplaced responsibility. It would place a significant restraint on the development of local search products and services, as publishers would be unwilling to bear legal risk in areas where these laws existed.

4. Programmatic advertising is threatened
Many of the attempts described above arise out of a lack of understanding about the way today’s online system works. We saw clear evidence of that shallow knowledge most recently in Congress’s questioning of Zuckerberg. One questioner asked how Facebook could offer the platform for free. Zuckerberg couldn’t suppress a smile after he answered, “Senator, we run ads.”

Many publisher liability bills are written assuming that individual pieces of content, such as ads, cross the publisher’s “desk” on their way to going online. Obviously, programmatic advertising does not work that way. But when laws are passed that are incompatible with an existing platform, they can bring significant components of the system to a screeching halt.
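
For readers less familiar with that flow, below is a minimal, hypothetical sketch of how a programmatic placement works on the publisher side, loosely modeled on an OpenRTB-style bid request. The field names, the simulated exchange and the placeholder creatives are illustrative assumptions, not any real exchange’s API.

```python
# Minimal, hypothetical sketch of a programmatic placement on the publisher
# side. The request fields, simulated exchange and placeholder creatives are
# illustrative only, not any real exchange's API.
import json
import random


def build_bid_request(page_url: str, slot_id: str) -> dict:
    """Assemble a bid request the instant a user loads the page."""
    return {
        "id": f"req-{random.randrange(10**6)}",
        "imp": [{"id": slot_id, "banner": {"w": 300, "h": 250}}],
        "site": {"page": page_url},
    }


def run_auction(bid_request: dict) -> dict:
    """Stand-in for an ad exchange auction.

    In the real system, many demand partners bid on the impression within a
    few hundred milliseconds; no one at the publisher reviews the winning
    creative before it renders.
    """
    slot_id = bid_request["imp"][0]["id"]
    bids = [
        {
            "impid": slot_id,
            "price": round(random.uniform(0.5, 4.0), 2),
            "adm": f"<div>Ad creative #{i}</div>",  # placeholder ad markup
        }
        for i in range(3)
    ]
    return max(bids, key=lambda b: b["price"])


if __name__ == "__main__":
    request = build_bid_request("https://example-publisher.com/article", "slot-1")
    winner = run_auction(request)
    # The winning markup is injected into the page automatically; there is no
    # step at which the publisher reviews the individual ad.
    print(json.dumps({"request": request, "winning_bid": winner}, indent=2))
```

The takeaway from the sketch is simply that the winning creative is selected and rendered in real time, which is why a requirement that each ad be reviewed before it runs doesn’t map onto how the system actually operates.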

There are also those who understand just enough to be dangerous. Bills have been introduced to regulate the “advertising network” of programmatic advertising, but they include definitions that would rope in ad agencies, software companies and platform developers, as well as publishers and website managers. The disruption to the programmatic ecosystem posed by bills like these has the potential to be costly.

5. Publishers are exposed to low legal standards for enforcement
Perhaps in an attempt to goad publishers into action, many bills that impose publisher liability are drafted to apply the same penalty to both the advertiser and the publisher for illegal content. Thus, even though it is the advertiser that makes the misleading statement or fails to get licensed, the publisher is held to be just as guilty for allowing the content to be displayed.

This low bar exposes the publisher not just to enforcement by state agencies, but also to private causes of action. For example, a competitor could sue for lost profits because the publisher allowed the unlicensed professional to steal away business.

Publishers are likely to be easier to find and have deeper pockets than the scammer or careless advertiser who placed the ad, making them much easier targets in an enforcement action or damages claim.

6. Penalties for violations are unreasonable
If publishers are held to the same liability as advertisers, they would also be subject to the same consumer protection remedies. Consumer protection laws often allow treble damages and attorneys’ fees. Civil fines often come with statutory minimums. Most serious of all, some violations also carry criminal penalties.

The worst example I’ve seen was legislation that imposed strict liability on publishers, meaning any violation, regardless of fault or care taken, is subject to penalties. And every violation of this proposed legislation was deemed punishable as a felony.

7. Large publishers used as the standard for reasonable care
One question that I’ve faced in legislative committee hearings on publisher liability bills is “why can’t they make an algorithm for that?” The perception is that large technology platforms like Google are so highly proficient in programming that they should be able to write code that will implement the legal standards being sought.

First, if software could analyze a factual scenario and exercise legal judgment to determine applicability and compliance, there would be no need for lawyers. Second, legislators often fail to see the forest of other businesses behind the huge Google and Facebook trees. Yet legislation being debated always affects a much broader set of publishers.

As discussed above, “publishers” refers to a broad group that includes local business websites, media and news sites, online directories, search engines, map platforms, blogs, retail websites, ecommerce sites, apps, video sites and social media pages. If large technology companies with huge financial and human resources set the standard of reasonable care that all of those publishers must adhere to, that will place undue expectations on smaller publishers.

Closing thoughts

The amendments to Section 230 of the CDA won’t affect the vast majority of us and are important in the fight to protect the most vulnerable victims in our society. Yet the threat to protections for online advertising and content is real: it comes from a strong undercurrent of changing perceptions about the responsibility we have in hosting user-generated or third-party content.

All of the above examples of bill language were dropped or amended before being enacted. But they are indicative of what could happen if we’re not careful. We take these protections for granted, but it’s important to be aware of the potential impact laws like these might have on our ability to do business, and to speak up in support of the protections that keep our online presence open and free.


About the author

Wesley Young
Contributor
Wesley Young is the Local Search Association’s vice president of public policy. He blogs about the industry on the Local Search Insider blog.
