937 results for "706/689/112"
The psychological drivers of misinformation belief and its resistance to correction
Misinformation has been identified as a major contributor to various contentious contemporary events ranging from elections and referenda to the response to the COVID-19 pandemic. Not only can belief in misinformation lead to poor judgements and decision-making, it also exerts a lingering influence on people’s reasoning after it has been corrected — an effect known as the continued influence effect. In this Review, we describe the cognitive, social and affective factors that lead people to form or endorse misinformed views, and the psychological barriers to knowledge revision after misinformation has been corrected, including theories of continued influence. We discuss the effectiveness of both pre-emptive (‘prebunking’) and reactive (‘debunking’) interventions to reduce the effects of misinformation, as well as implications for information consumers and practitioners in various areas including journalism, public health, policymaking and education.
Misinformation is influential despite unprecedented access to high-quality, factual information. In this Review, Ecker et al. describe the cognitive, social and affective factors that drive sustained belief in misinformation, synthesize the evidence for interventions to reduce its effects and offer recommendations for information consumers and practitioners.
Revisiting COVID-19 vaccine hesitancy around the world using data from 23 countries in 2021
The COVID-19 pandemic continues to impact daily life, including health system operations, despite the availability of vaccines that are effective in greatly reducing the risks of death and severe disease. Misperceptions of COVID-19 vaccine safety, efficacy, risks, and mistrust in institutions responsible for vaccination campaigns have been reported as factors contributing to vaccine hesitancy. This study investigated COVID-19 vaccine hesitancy globally in June 2021. Nationally representative samples of 1,000 individuals from 23 countries were surveyed. Data were analyzed descriptively, and weighted multivariable logistic regressions were used to explore associations with vaccine hesitancy. Here, we show that more than three-fourths (75.2%) of the 23,000 respondents report vaccine acceptance, up from 71.5% one year earlier. Across all countries, vaccine hesitancy is associated with a lack of trust in COVID-19 vaccine safety and science, and skepticism about its efficacy. Vaccine-hesitant respondents are also highly resistant to required proof of vaccination; 31.7%, 20%, 15%, and 14.8% approve requiring it for access to international travel, indoor activities, employment, and public schools, respectively. For ongoing COVID-19 vaccination campaigns to succeed in improving coverage going forward, substantial challenges remain to be overcome. These include increasing vaccination among those reporting lower vaccine confidence in addition to expanding vaccine access in low- and middle-income countries.
Vaccine hesitancy is a public health challenge. Here the authors examine COVID-19 vaccine hesitancy in June 2021 using a survey including individuals from 23 countries, and report differences compared to a year earlier.
Like-minded sources on Facebook are prevalent but not polarizing
Many critics raise concerns about the prevalence of ‘echo chambers’ on social media and their potential role in increasing political polarization. However, the lack of available data and the challenges of conducting large-scale field experiments have made it difficult to assess the scope of the problem [1,2]. Here we present data from 2020 for the entire population of active adult Facebook users in the USA showing that content from ‘like-minded’ sources constitutes the majority of what people see on the platform, although political information and news represent only a small fraction of these exposures. To evaluate a potential response to concerns about the effects of echo chambers, we conducted a multi-wave field experiment on Facebook among 23,377 users for whom we reduced exposure to content from like-minded sources during the 2020 US presidential election by about one-third. We found that the intervention increased their exposure to content from cross-cutting sources and decreased exposure to uncivil language, but had no measurable effects on eight preregistered attitudinal measures such as affective polarization, ideological extremity, candidate evaluations and belief in false claims. These precisely estimated results suggest that although exposure to content from like-minded sources on social media is common, reducing its prevalence during the 2020 US presidential election did not correspondingly reduce polarization in beliefs or attitudes.
A large-scale field intervention experiment on 23,377 US Facebook users during the 2020 presidential election shows that reducing exposure to content from like-minded social media sources has no measurable effect on political polarization or other political attitudes and beliefs.
Shifting attention to accuracy can reduce misinformation online
In recent years, there has been a great deal of concern about the proliferation of false and misleading news on social media [1–4]. Academics and practitioners alike have asked why people share such misinformation, and sought solutions to reduce the sharing of misinformation [5–7]. Here, we attempt to address both of these questions. First, we find that the veracity of headlines has little effect on sharing intentions, despite having a large effect on judgments of accuracy. This dissociation suggests that sharing does not necessarily indicate belief. Nonetheless, most participants say it is important to share only accurate news. To shed light on this apparent contradiction, we carried out four survey experiments and a field experiment on Twitter; the results show that subtly shifting attention to accuracy increases the quality of news that people subsequently share. Together with additional computational analyses, these findings indicate that people often share misinformation because their attention is focused on factors other than accuracy—and therefore they fail to implement a strongly held preference for accurate sharing. Our results challenge the popular claim that people value partisanship over accuracy [8,9], and provide evidence for scalable attention-based interventions that social media platforms could easily implement to counter misinformation online.
Surveys and a field experiment with Twitter users show that prompting people to think about the accuracy of news sources increases the quality of the news that they share online.
Influence of fake news in Twitter during the 2016 US presidential election
The dynamics and influence of fake news on Twitter during the 2016 US presidential election remain to be clarified. Here, we use a dataset of 171 million tweets in the five months preceding the election day to identify 30 million tweets, from 2.2 million users, which contain a link to news outlets. Based on a classification of news outlets curated by www.opensources.co, we find that 25% of these tweets spread either fake or extremely biased news. We characterize the networks of information flow to find the most influential spreaders of fake and traditional news and use causal modeling to uncover how fake news influenced the presidential election. We find that, while top influencers spreading traditional center and left-leaning news largely influence the activity of Clinton supporters, this causality is reversed for the fake news: the activity of Trump supporters influences the dynamics of the top fake news spreaders.
The influence of ‘fake news’, spread via social media, has been much discussed in the context of the 2016 US presidential election. Here, the authors use data on 30 million tweets to show how content classified as fake news diffused on Twitter before the election.
Growing polarization around climate change on social media
Climate change and political polarization are two of the twenty-first century’s critical socio-political issues. Here we investigate their intersection by studying the discussion around the United Nations Conference of the Parties on Climate Change (COP) using Twitter data from 2014 to 2021. First, we reveal a large increase in ideological polarization during COP26, following low polarization between COP20 and COP25. Second, we show that this increase is driven by growing right-wing activity, a fourfold increase since COP21 relative to pro-climate groups. Finally, we identify a broad range of ‘climate contrarian’ views during COP26, emphasizing the theme of political hypocrisy as a topic of cross-ideological appeal; contrarian views and accusations of hypocrisy have become key themes in the Twitter climate discussion since 2019. With future climate action reliant on negotiations at COP27 and beyond, our results highlight the importance of monitoring polarization and its impacts in the public climate discourse.
Polarization and the resulting political deadlock have become key barriers to more ambitious climate action. Using Twitter data between Conferences of the Parties, this research identifies a trend of increasing polarization driven by growing right-wing activity alongside accusations of political hypocrisy.
Transforming machine translation: a deep learning system reaches news translation quality comparable to human professionals
The quality of human translation was long thought to be unattainable for computer translation systems. In this study, we present a deep-learning system, CUBBITT, which challenges this view. In a context-aware blind evaluation by human judges, CUBBITT significantly outperformed professional-agency English-to-Czech news translation in preserving text meaning (translation adequacy). While human translation is still rated as more fluent, CUBBITT is shown to be substantially more fluent than previous state-of-the-art systems. Moreover, most participants of a Translation Turing test struggle to distinguish CUBBITT translations from human translations. This work approaches the quality of human translation and even surpasses it in adequacy in certain circumstances. This suggests that deep learning may have the potential to replace humans in applications where conservation of meaning is the primary aim.
The quality of human language translation has been thought to be unattainable by computer translation systems. Here the authors present CUBBITT, a deep learning system that outperforms professional human translators in retaining text meaning in English-to-Czech news translation, and validate the system on English-French and English-Polish language pairs.
Misinformation: susceptibility, spread, and interventions to immunize the public
The spread of misinformation poses a considerable threat to public health and the successful management of a global pandemic. For example, studies find that exposure to misinformation can undermine vaccination uptake and compliance with public-health guidelines. As research on the science of misinformation is rapidly emerging, this conceptual Review summarizes what we know along three key dimensions of the infodemic: susceptibility, spread, and immunization. Extant research is evaluated on the questions of why (some) people are (more) susceptible to misinformation, how misinformation spreads in online social networks, and which interventions can help to boost psychological immunity to misinformation. Implications for managing the infodemic are discussed.
This Review provides an overview of the psychology of misinformation, from susceptibility to spread and interventions to help boost psychological immunity.