86 result(s) for "Klonick, Kate"
The Facebook Oversight Board: Creating an Independent Institution to Adjudicate Online Free Expression
For a decade and a half, Facebook has dominated the landscape of digital social networks, becoming one of the most powerful arbiters of online speech. Twenty-four hours a day, seven days a week, over two billion users leverage the platform to post, share, discuss, react to, and access content from all over the globe. Through a system of semipublic rules called "Community Standards," Facebook has created a body of "laws" and a system of governance that dictate what users may say on the platform. In recent years, as this intricately built system to dispatch the company's immense private power over the public right of speech has become more visible, Facebook has experienced intense pressure to become more accountable, transparent, and democratic, not only in how it creates its fundamental policies for speech but also in how it enforces them. In November 2018, after years of entreaty from the press, advocacy groups, and users, CEO and founder Mark Zuckerberg announced that Facebook would construct an independent oversight body to be researched, created, and launched within the year. The express purpose of this body was to serve as an appellate review system for user content and to make content-moderation policy recommendations to Facebook. This Feature empirically documents the creation of this institution, now called the Facebook Oversight Board. The Board is a historic endeavor both in scope and scale. The Feature traces the events and influences that led to Facebook's decision to create the Oversight Board. It details the year-long process of creating the Board, relying on hundreds of hours of interviews and embedded research with the Governance Team charged with researching, planning, and building this new institution. The creation of the Oversight Board and its aims are a novel articulation of internet governance. This Feature illuminates the future implications of the new institution for global freedom of expression. Using the lens of adjudication, it analyzes what the Board is, what the Board means to users, and what the Board means for industry and governments. Ultimately, the Feature concludes that the Facebook Oversight Board has great potential to set new precedent for user participation in private platforms' governance and a user right to procedure in content moderation.
The New Governors: The People, Rules, and Processes Governing Online Speech
Private online platforms have an increasingly essential role in free speech and participation in democratic culture. But while it might appear that any internet user can publish freely and instantly online, many platforms actively curate the content posted by their users. How and why these platforms operate to moderate speech is largely opaque. This Article provides the first analysis of what these platforms are actually doing to moderate online speech under a regulatory and First Amendment framework. Drawing from original interviews, archived materials, and internal documents, this Article describes how three major online platforms - Facebook, Twitter, and YouTube - moderate content and situates their moderation systems into a broader discussion of online governance and the evolution of free expression values in the private sphere. It reveals that private content-moderation systems curate user content with an eye to American free speech norms, corporate responsibility, and the economic necessity of creating an environment that reflects the expectations of their users. In order to accomplish this, platforms have developed a detailed system rooted in the American legal system with regularly revised rules, trained human decision-making, and reliance on a system of external influence. This Article argues that to best understand online speech, we must abandon traditional doctrinal and regulatory analogies and understand these private content platforms as systems of governance. These platforms are now responsible for shaping and allowing participation in our new digital and democratic culture, yet they have little direct accountability to their users. Future intervention, if any, must take into account how and why these platforms regulate online speech in order to strike a balance between preserving the democratizing forces of the internet and protecting the generative power of our New Governors.
Networked Technologies' Transformation of Social Norms, Private Self-Regulation, and the Law
This collection of essays explores how networked technologies have transformed social norms, private self-regulation, and the law. This dissertation has three parts: (1) the first essay explores how the social-norm-enforcing mechanism of shaming has been changed by the internet; (2) the second essay looks at how platforms like Facebook, YouTube, and Twitter have developed private systems of self-regulation regarding online speech in the shadow of existing law; and (3) the final essay compares the development of the free speech theories enshrined in the public figure and newsworthiness exceptions in communications tort doctrine with the motivations and mechanisms of enforcement of public figure doctrine in content moderation policy at Facebook. The first essay, Re-Shaming the Debate: Social Norms, Shame, and Regulation in an Internet Age, explores how advances in communication technology have dramatically changed the ways in which social norm enforcement is used to constrain behavior. This is powerfully demonstrated by current events around online shaming and cyberharassment. Low-cost, anonymous, instant, and ubiquitous access to the internet has removed most, if not all, of the natural checks on shaming. This essay ties together the current conversation around online shaming, cyberbullying, and cyberharassment with the larger legal discussion of social norms and shaming sanctions. It argues that the introduction of the internet has altered the social conditions in which people speak and thus changed the way we perceive and enforce social norms. Accordingly, online shaming is (1) a punishment with indeterminate social meaning; (2) not a calibrated or measured form of punishment; and (3) of little or questionable accuracy in who and what it punishes. In thus reframing the problem, this essay examines the viability of legal, normative, private, and state solutions to controlling online shaming.
While it might appear that any internet user can publish freely and instantly online, the second essay, The New Governors: The People, Rules, and Processes Governing Online Speech, demonstrates how many platforms actively curate the content posted by their users. This essay provides an empirical account of what these platforms are doing to moderate online speech, both in terms of their substantive policy and through the procedural systems they have developed. It then situates their moderation systems within a broader discussion of online governance and the evolution of free expression values in the private sphere. It reveals that private content moderation systems create substantive policies that balance free speech norms, corporate responsibility, and the economic necessity of creating an environment reflective of the expectations of their users. To accomplish this, platforms have developed a detailed procedural system similar to common-law regulation, with recursively revised rules contingent on new and changing facts, trained human decision-making akin to that of judges, and reliance on a system of external influence. This essay argues that to best understand online speech, we must abandon traditional analogies and understand private content platforms as systems of governance operating under free speech norms and the changing expectations of their users. The third and final essay, Facebook v. Sullivan, looks comparatively at two systems that exist in the modern era to adjudicate disputes concerning speech about other people. The first is the oldest and most familiar: the tort system implemented through state defamation and privacy law, with the courts judging disputes that test the line between reputational and privacy protection and free speech.
The second is much newer: content moderation of user speech implemented through the rules and policies of online speech platforms like Facebook, which are not tethered to a free speech commitment like that found in the First Amendment. This essay analyzes the terms used by both systems to adjudicate the balancing of harmful speech against free speech: public figure and newsworthiness. It begins with the U.S. legal doctrine that developed this terminology, discussing the underlying First Amendment theory used by the Court to rationalize its development. It then looks to the moderation of online speech at Facebook, discussing how and why exceptions for public figure users and newsworthiness were carved out, before arguing that much of the reasoning by the courts in creating First Amendment limits to tort law for speech about public figures and matters of public concern can be seen in Facebook's motivations for creating exceptions to blanket rules against bullying or hate speech. Despite these noble motives, however, this essay elucidates two main problems with the way Facebook enforces this policy: (1) Facebook's reliance on news algorithms for determinations about public figures is overly descriptive, and thus threatens to keep up speech even for users who have sympathetic concerns for taking speech down; and (2) the use of global news algorithms cannot adequately account for localized newsworthiness or prominent figures in "small" communities, and thus might over-censor. Finally, this essay argues that these trends threaten not only the free speech concerns that motivated the creation of the doctrine but also Facebook's core mission to "give people the power to build community." As a result, they constitute an underestimated threat to the democratic potential of the internet.