2,386 result(s) for "Digital Services Act"
Without access to social media platform data, we risk being left in the dark
Social media data are essential for studying human behaviour and understanding potential systemic risks. Social media platforms have, however, begun to remove access to these data. In response, other countries and regions have implemented legislation that compels platforms to provide researchers with data access. In South Africa, we have lagged behind the Global North when it comes to using platform data in our research and, given the recent access restrictions, we risk being left behind. In this Commentary, I call attention to this critical issue and initiate a conversation about access to social media data in South Africa.
Unseen Influence: Computational Propaganda, Free Elections, and the Reluctance to Seek Judicial Remedies in Poland. Evidence from AI-Assisted Case Law Analysis
The Polish electoral system adheres to the principle of free and fair elections. This principle has a defined content, and its backbone remains access to truthful information and the free shaping of opinions about a candidate or an issue put to a referendum. However, the enormous increase in computational power and the associated development of artificial intelligence have caused electoral competition to become highly aggressive; it no longer avoids false information, messages appealing to negative emotions, or calls for violence. Very Large Online Platforms’ predictable abdication of their role as moderators of public debate leads to the question: How can or should public authorities protect integrity and freedom of participation from abuse in the era of digital constitutionalism? Should we rely on a litigation system where the initiative comes solely from the participant in the electoral process, or should we also include the regulatory power of the electoral administration? What picture of electoral campaigns is provided by Polish jurisprudence concerning electoral disputes?
Making European Union digital platform regulation a reality
Over the last twenty years, the digital economy – driven primarily through large digital platforms that have been mostly unregulated to date – has brought enormous economic and societal benefits. The COVID-19 pandemic has accelerated this trend by making digital platforms central to the global economy and society and by highlighting further opportunities, but, importantly too, risks and threats. Digital platforms, representing the increasingly important and maturing online platform economy, are now being described as critical infrastructure and even utilities. Digital platform policy, particularly the future regulation of the large, far-reaching dominant platforms, is a major focus of the European Union (EU) as part of its response to the COVID-19 crisis. The literature on platform regulation highlights two major themes that emerge concerning digital platform regulation, and that are consequently the focus of future regulation: competition and online content. This article presents research findings in these areas through a critical analysis of and reflection on emerging digital platform regulatory practices which are being progressed under the groundbreaking EU Digital Services Act and Digital Markets Act package. This includes assessing implications for national implementation, regulatory enforcement, and governance. A particular emphasis is placed on the Digital Services Act, where there is less literature, knowledge, and experience on how to best regulate online content. In this context, the paper provides insights into how Ireland, where many of the large platforms are established and which is therefore their de facto regulator, is dealing with regulatory implementation issues driven by the EU.
Online Platforms: Towards an Information Tsunami with New Requirements on Moderation, Ranking, and Traceability
Ideally, online contracts are concluded by informed consumers. For 25 years the European Union has been issuing an abundance of information requirements that providers of online services have to comply with. We briefly touch upon the previously existing information requirements and findings at the intersection of behavioral studies and consumer law, but primarily focus on new information requirements against the background of an online platform offering space for businesses to provide their services to consumers. We critically discuss new information requirements from the so-called Omnibus Directive 2019/2161, primarily regarding changes to Directive 2011/83/EU on consumer rights. We also analyze amendments to Directive 2000/31/EC on e-commerce as proposed in late 2020 in the Digital Services Act. We focus on what information should be communicated, how it should be communicated, and at what moment. In our analysis, our doubts about the value of all this information are apparent, and we question consumers' need for and interest in all of it. In the end, we suggest what we believe should be communicated directly to the consumer, and what could be made available only to those consumers, probably quite few, who are really interested. Keywords: online platforms, consumer protection, information requirements, Digital Services Act
The Interplay between the Digital Services Act and Sector Regulation: How Special Is Copyright?
On 15 December 2020, the European Commission published its proposal for the Digital Services Act, which is expected to be adopted before summer 2022. It carries out a regulatory overhaul of the twenty-one-year-old horizontal rules on intermediary liability in the e-Commerce Directive and introduces new due diligence obligations for intermediary services. Our analysis illuminates an important point that has so far received little attention: how would the Digital Services Act’s rules interact with existing sector-specific lex specialis rules? In this article, we look specifically at the intersection of the Digital Services Act with the regime for online content-sharing service providers (OCSSPs) set forth in Article 17 of Directive (EU) 2019/790 on Copyright in the Digital Single Market (CDSM Directive). At first glance, these regimes do not appear to overlap, as the rules on copyright are lex specialis to the Digital Services Act. A closer look shows a more complex and nuanced picture. Our analysis concludes that the Digital Services Act will apply to OCSSPs insofar as it contains rules that regulate matters not covered by Article 17 CDSM Directive, as well as specific rules on matters where Article 17 leaves a margin of discretion to Member States. This includes, to varying degrees, rules in the Digital Services Act relating to the liability of intermediary providers and to due diligence obligations for online platforms of different sizes. Importantly, we consider that such rules apply even where Article 17 CDSM Directive contains specific (but less precise) regulation on the matter. From a normative perspective, this might be a desirable outcome, to the extent that the Digital Services Act aims to establish “uniform rules for a safe, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected”. Based on our analysis, we suggest a number of clarifications that might help us to achieve that goal.
Personal Data Processing by Online Platforms and Search Engines: The Case of the EU Digital Services Act
The new EU Digital Services Act (DSA) is intended to regulate intermediary service providers, with particular attention to online platforms and search engines. The core activity of such platforms and engines is personal data processing, pursuant to the tasks of content moderation and recommendations. This means that regarding personal data, there is an interconnection between the GDPR and the DSA, and it is a matter of law to determine how they interact in the EU digital space. This paper endeavours to draw a comprehensive picture of how the GDPR and the DSA seek to provide better guidance on the adequacy and enforcement of personal data protection. It is argued that their relationship is best described as the DSA being the lex specialis vis-à-vis the GDPR, but this is somewhat blurred by instances where the latter is mostly complementing the former, such as (1) a specific legal basis for data processing in compliance with new legal obligations for platforms; (2) a new articulation between both regulations concerning dark patterns; (3) new prohibitions on personal data processing; (4) new duties for the protection of personal data; and (5) a new ancillary institutional framework to regulate data protection by online platforms in collaboration with national data protection authorities.
Copyright Content Moderation in the European Union: State of the Art, Ways Forward and Policy Recommendations
This Opinion describes and summarises the results of the interdisciplinary research carried out by the authors during the course of a three-year project on intermediaries' practices regarding copyright content moderation. This research includes the mapping of the EU legal framework and intermediaries' practices regarding copyright content moderation, the evaluation and measuring of the impact of moderation practices and technologies on access and diversity, and a set of policy recommendations. Our recommendations touch on the following topics: the definition of "online content-sharing service provider"; the recognition and operationalisation of user rights; the complementary nature of complaint and redress safeguards; the scope of permissible preventive filtering; the clarification of the relationship between Art. 17 of the new Copyright Directive and the Digital Services Act; monetisation and restrictive content moderation actions; recommender systems and copyright content moderation; transparency and data access for researchers; trade secret protection and transparency of content moderation systems; the relationship between the copyright acquis, the Digital Services Act and the upcoming Artificial Intelligence Act; and human competences in copyright content moderation.
A Case Study of Judicial-Legislative Interactions via the Lens of the DSA’s Host Liability Rules
This paper explores the relationship between the Court of Justice of the European Union and the EU Legislature, taking as a case study the intermediary liability rules for hosts under Article 14 of the e-Commerce Directive (ECD) and their subsequent revision under the Digital Services Act (DSA). The analysis focuses on the distinction between ‘active’ and ‘neutral’ roles for hosts developed by the Court of Justice in its case law. The paper shows how the Court’s rulings shaped the application of host liability rules under the ECD, how they subsequently impacted their revision during the DSA negotiations, and how the Legislature addressed the uncertainty created by the case law on voluntary investigations via the introduction of a ‘Good Samaritan’ clause. Methodologically, the paper draws on legal scholarship and political science literature, employing a mixed-methods approach that combines doctrinal analysis, process tracing, and a law attainment typology to explore these dynamics. The paper concludes that while the Court’s jurisprudence was important during the negotiations, it also posed challenges both as a form of legal regulation and, by extension, as a tool of integration, necessitating legislative intervention. The paper’s findings lend weight to the growing body of legal scholarship which argues that there has been a shift in European integration dynamics, with positive integration playing an increasingly important role.
Disinformation tackling in the metaverse and the Digital Services Act
The emergent metaverse presents novel challenges for online information integrity. Disinformation in immersive environments can be particularly pernicious, blurring the lines between reality and fiction. This study explores the multifaceted phenomenon of disinformation in the metaverse, examining its potential forms, methods of propagation and the unique risks it poses. We then delve into the regulatory landscape, specifically analysing the EU’s Digital Services Act (DSA) as a potential framework to combat disinformation in this new virtual frontier. In doing so, the study assesses the strengths and limitations of the DSA in addressing the complexities of metaverse disinformation. Through critical analysis, we explore how the DSA’s specific scope and provisions on content removal, platform accountability and transparency can be adapted to the metaverse context.