The Department of Justice released today a set of reform proposals to update the outdated immunity for online platforms under Section 230 of the Communications Decency Act of 1996. Responding to bipartisan concerns about the scope of 230 immunity, the department identified a set of concrete reform proposals to provide stronger incentives for online platforms to address illicit material on their services while continuing to foster innovation and free speech. The department’s findings are available here.
“When it comes to issues of public safety, the government is the one who must act on behalf of society at large. Law enforcement cannot delegate our obligations to protect the safety of the American people purely to the judgment of profit-seeking private firms. We must shape the incentives for companies to create a safer environment, which is what Section 230 was originally intended to do,” said Attorney General William P. Barr. “Taken together, these reforms will ensure that Section 230 immunity incentivizes online platforms to be responsible actors. These reforms are targeted at platforms to make certain they are appropriately addressing illegal and exploitive content while continuing to preserve a vibrant, open, and competitive internet. These twin objectives of giving online platforms the freedom to grow and innovate while encouraging them to moderate content responsibly were the core objectives of Section 230 at the outset. The Department’s proposal aims to realize these objectives more fully and clearly in order for Section 230 to better serve the interests of the American people.”
The department’s review of Section 230 over the last ten months arose in the context of its broader review of market-leading online platforms and their practices, which was announced in July 2019. The department held a large public workshop and expert roundtable in February 2020, as well as dozens of listening sessions with industry, thought leaders, and policymakers, to gain a better understanding of the uses and problems surrounding Section 230.
Section 230 was originally enacted to protect developing technology by providing that online platforms were not liable for the third-party content on their services or for their removal of such content in certain circumstances. This immunity was meant to nurture emerging internet businesses and to overrule a judicial precedent that rendered online platforms liable for all third-party content on their services if they restricted some harmful content.
However, the combination of 25 years of drastic technological change and expansive statutory interpretation has left online platforms unaccountable for a variety of harms flowing from content on their services, while granting them virtually unfettered discretion to censor third-party content with little transparency or accountability. Following the completion of its review, the Department of Justice determined that Section 230 is ripe for reform and identified and developed four categories of wide-ranging recommendations.
Incentivizing Online Platforms to Address Illicit Content
The first category of recommendations is aimed at incentivizing platforms to address the growing amount of illicit content online, while preserving the core of Section 230’s immunity for defamation claims. These reforms include a carve-out for bad actors who purposefully facilitate or solicit content that violates federal criminal law, or who are willfully blind to criminal content on their own services. Additionally, the department recommends a case-specific carve-out where a platform has actual knowledge that content violates federal criminal law and does not act on it within a reasonable time, or where a platform has been provided with a court judgment that content is unlawful and does not take appropriate action.
Promoting Open Discourse and Greater Transparency
A second category of proposed reforms is intended to clarify the text and revive the original purpose of the statute in order to promote free and open discourse online and encourage greater transparency between platforms and users. One of these recommended reforms is to provide a statutory definition of “good faith” that clarifies the term’s original meaning. The new statutory definition would limit immunity for content moderation decisions to those made in accordance with plain and particular terms of service and consistent with the platform’s public representations. These measures would encourage platforms to be more transparent and accountable to their users.
Clarifying Federal Government Enforcement Capabilities
The third category of recommendations would increase the ability of the government to protect citizens from unlawful conduct by making clear that Section 230 does not apply to civil enforcement actions brought by the federal government.
Promoting Competition
A fourth category of reform is to make clear that federal antitrust claims are not, and were never intended to be, covered by Section 230 immunity. Over time, the avenues for engaging in both online commerce and speech have concentrated in the hands of a few key players. It makes little sense to enable large online platforms (particularly dominant ones) to invoke Section 230 immunity in antitrust cases, where liability is based on harm to competition, not on third-party speech.
For more information about the department’s recommendations, please visit https://www.justice.gov/ag/department-justice-s-review-section-230-communications-decency-act-1996.
Source: Department of Justice