3,464 results for "Financing, Organized"
ChatGPT use shows that the grant-application system is broken
The fact that artificial intelligence can do much of the work makes a mockery of the process. It’s time to make it easier for scientists to ask for research funding.
NIH plans grant-review overhaul to reduce bias
Reviewers would no longer score researchers’ expertise and institutions during grant evaluations for the US biomedical agency.
Inequality quantified: Mind the gender gap
Despite improvements, female scientists continue to face discrimination, unequal pay and funding disparities.
Big Science vs. Little Science: How Scientific Impact Scales with Funding
Is it more effective to give large grants to a few elite researchers, or small grants to many researchers? Large grants would be more effective only if scientific impact increases as an accelerating function of grant size. Here, we examine the scientific impact of individual university-based researchers in three disciplines funded by the Natural Sciences and Engineering Research Council of Canada (NSERC). We considered four indices of scientific impact: numbers of articles published, numbers of citations to those articles, the most cited article, and the number of highly cited articles, each measured over a four-year period. We related these to the amount of NSERC funding received. Impact is positively, but only weakly, related to funding. Researchers who received additional funds from a second federal granting council, the Canadian Institutes of Health Research, were not more productive than those who received only NSERC funding. Impact was generally a decelerating function of funding. Impact per dollar was therefore lower for large grant-holders. This is inconsistent with the hypothesis that larger grants lead to larger discoveries. Further, the impact of researchers who received increases in funding did not predictably increase. We conclude that scientific impact (as reflected by publications) is only weakly limited by funding. We suggest that funding strategies that target diversity, rather than "excellence", are likely to prove more productive.
Radical open-access plan could spell end to journal subscriptions
Eleven research funders in Europe announce ‘Plan S’ to make all scientific works free to read as soon as they are published.
Interdisciplinary research has consistently lower funding success
The degree of interdisciplinarity in research proposals negatively correlates with funding success across a wide range of research fields. The cost of interdisciplinarity: in the recent past, governments and funding bodies have been keen to promote the merits of interdisciplinary research. But there is a widely held belief that such research faces higher barriers when it comes to applying for funding. Lindell Bromham et al. have mined data from the Australian Research Council to establish whether this gut feeling is based on reality. It is. They find that the degree of interdisciplinarity correlates negatively with funding success, irrespective of the research field. The authors go on to develop a metric, the interdisciplinary distance (IDD), that can be used to single out grant submissions that might be prone to this type of bias, so that their evaluation can be addressed proactively. Interdisciplinary research is widely considered a hothouse for innovation, and the only plausible approach to complex problems such as climate change. One barrier to interdisciplinary research is the widespread perception that interdisciplinary projects are less likely to be funded than those with a narrower focus. However, this commonly held belief has been difficult to evaluate objectively, partly because of the lack of a comparable, quantitative measure of degree of interdisciplinarity that can be applied to funding application data. Here we compare the degree to which research proposals span disparate fields by using a biodiversity metric that captures the relative representation of different fields (balance) and their degree of difference (disparity). The Australian Research Council’s Discovery Programme provides an ideal test case, because a single annual nationwide competitive grants scheme covers fundamental research in all disciplines, including arts, humanities and sciences. Using data on all 18,476 proposals submitted to the scheme over 5 consecutive years, including successful and unsuccessful applications, we show that the greater the degree of interdisciplinarity, the lower the probability of being funded. The negative impact of interdisciplinarity is significant even when number of collaborators, primary research field and type of institution are taken into account. This is the first broad-scale quantitative assessment of success rates of interdisciplinary research proposals. The interdisciplinary distance metric allows efficient evaluation of trends in research funding, and could be used to identify proposals that require assessment strategies appropriate to interdisciplinary research.
Who would you share your funding with?
I want to see whether the wisdom of crowds does a better job than conventional grant review at supporting research, says Johan Bollen.
The career cost of COVID-19 to female researchers, and how science should respond
Some funders and journals are trying to support female researchers and others whose publications and positions are at risk. The UK funder Wellcome wants applicants, such as researchers at the Mahidol Oxford Tropical Medicine Research Unit in Bangkok, to specifically mention the impacts from COVID-19 on their progress to date.
Accelerating scientific publication in biology
Scientific publications enable results and ideas to be transmitted throughout the scientific community. The number and type of journal publications have also become the primary criteria used in evaluating career advancement. Our analysis suggests that publication practices have changed considerably in the life sciences over the past 30 years. More experimental data are now required for publication, and the average time required for graduate students to publish their first paper has increased and is approaching the desirable duration of PhD training. Because publication is generally a requirement for career progression, schemes to reduce the time of graduate student and postdoctoral training may be difficult to implement without also considering new mechanisms for accelerating communication of their work. The increasing time to publication also delays potential catalytic effects that ensue when many scientists have access to new information. The time has come for life scientists, funding agencies, and publishers to discuss how to communicate new findings in a way that best serves the interests of the public and the scientific community.
Assessment of potential bias in research grant peer review in Canada
Peer review is used to determine what research is funded and published, yet little is known about its effectiveness, and it is suspected that there may be biases. We investigated the variability of peer review and factors influencing ratings of grant applications. We evaluated all grant applications submitted to the Canadian Institutes of Health Research between 2012 and 2014. The contribution of application, principal applicant and reviewer characteristics to overall application score was assessed after adjusting for the applicant’s scientific productivity. Among 11,624 applications, 66.2% of principal applicants were male and 64.1% were in a basic science domain. We found a significant nonlinear association between scientific productivity and final application score that differed by applicant gender and scientific domain, with higher scores associated with past funding success and h-index and lower scores associated with female applicants and those in the applied sciences. Significantly lower application scores were also associated with applicants who were older, those evaluated by female reviewers only (v. male reviewers only, −0.05 points, 95% confidence interval [CI] −0.08 to −0.02) or by reviewers in scientific domains different from the applicant’s (−0.07 points, 95% CI −0.11 to −0.03). Significantly higher application scores were also associated with reviewer agreement in application score (0.23 points, 95% CI 0.20 to 0.26), the existence of reviewer conflicts (0.09 points, 95% CI 0.07 to 0.11), larger budget requests (0.01 points per $100,000, 95% CI 0.007 to 0.02), and resubmissions (0.15 points, 95% CI 0.14 to 0.17). In addition, reviewers with high expertise were more likely than those with less expertise to provide higher scores to applicants with higher past success rates (0.18 points, 95% CI 0.08 to 0.28). There is evidence of bias in peer review of operating grants that is of sufficient magnitude to change application scores from fundable to nonfundable. This should be addressed by training and policy changes in research funding.