Catalogue Search | MBRL
Search Results
Explore the vast range of titles available.
57,712 result(s) for "Science journals"
What drives and inhibits researchers to share and use open research data? A systematic literature review to analyze factors influencing open research data adoption
by Shinde, Rhythima; Jeng, Wei; Zuiderwijk, Anneke
in Biology and Life Sciences; Categories; Computer and Information Sciences
2020
Both sharing and using open research data have revolutionary potential for advancing science. Although previous research gives insight into researchers' drivers and inhibitors for sharing and using open research data, these drivers and inhibitors have not yet been integrated via a thematic analysis, and a theoretical argument is lacking. This study's purpose is to systematically review the literature on individual researchers' drivers and inhibitors for sharing and using open research data. The study systematically analyzed 32 open data studies (published between 2004 and 2019 inclusive) and elicited drivers and inhibitors for both open research data sharing and use in eleven categories: 'the researcher's background', 'requirements and formal obligations', 'personal drivers and intrinsic motivations', 'facilitating conditions', 'trust', 'expected performance', 'social influence and affiliation', 'effort', 'the researcher's experience and skills', 'legislation and regulation', and 'data characteristics'. The study discusses these categories in depth and uses a thematic analysis to argue how the categories and factors are connected. It also discusses several opportunities for applying, extending, and testing theories in open research data studies. The resulting overview of categories and factors can be applied to examine researchers' drivers and inhibitors in different research disciplines, such as those with low versus high rates of data sharing and use. Finally, this study serves as a first step towards developing effective incentives for open data sharing and use.
Journal Article
Do altmetrics correlate with the quality of papers? A large-scale empirical study based on F1000Prime data
2018
In this study, we address the question of whether, and to what extent, altmetrics are related to the scientific quality of papers (as measured by peer assessments). Only a few studies have previously investigated the relationship between altmetrics and assessments by peers. In the first step, we analyse the underlying dimensions of measurement for traditional metrics (citation counts) and altmetrics, using principal component analysis (PCA) and factor analysis (FA). In the second step, we test the relationship between these dimensions and the quality of papers (as measured by the post-publication peer-review system of F1000Prime assessments), using regression analysis. The results of the PCA and FA show that altmetrics operate along different dimensions: Mendeley counts are related to citation counts, whereas tweets form a separate dimension. The results of the regression analysis indicate that citation-based metrics and readership counts are significantly more strongly related to quality than tweets are. This result questions the use of Twitter counts for research evaluation purposes, while pointing to the potential of Mendeley reader counts.
Journal Article
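The two-step design described in the abstract above (dimension reduction over the metrics, then regression of peer-assessed quality on the resulting components) can be sketched roughly as follows. This is a minimal illustration with simulated data; the variable names, effect sizes, and the use of scikit-learn and statsmodels are assumptions for the sketch, not the authors' actual pipeline or the F1000Prime data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n = 500

# Simulated stand-ins for the metrics named in the abstract: citation counts,
# Mendeley reader counts (correlated with citations), and tweet counts.
citations = rng.poisson(20, n)
mendeley = 2 * citations + rng.poisson(5, n)
tweets = rng.poisson(3, n)
# Synthetic proxy for the F1000Prime peer-assessed quality score.
quality = 0.05 * citations + 0.02 * mendeley + rng.normal(0, 1, n)

X = pd.DataFrame({"citations": citations, "mendeley": mendeley, "tweets": tweets})

# Step 1: PCA on the standardized metrics to inspect the underlying dimensions.
Z = StandardScaler().fit_transform(X)
pca = PCA()
components = pca.fit_transform(Z)
print("explained variance ratios:", pca.explained_variance_ratio_.round(2))
print(pd.DataFrame(pca.components_.T, index=X.columns,
                   columns=["dim1", "dim2", "dim3"]))

# Step 2: regress the quality proxy on the component scores.
ols = sm.OLS(quality, sm.add_constant(components)).fit()
print(ols.summary())
```

In the study itself the quality measure comes from F1000Prime recommendations; here it is a synthetic proxy so the script runs on its own.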
Analyzing AI use policy in LIS: association with journal metrics and publisher volume
2024
The objective of this study is to investigate the landscape of AI use policies in library and information science (LIS) journals and examine their association with key journal metrics. The study analyzed 232 LIS journals indexed in the 2023 Scimago Journal Rank (SJR) portal, focusing on AI use policies, guidelines for declaring AI use, and references to the Committee on Publication Ethics (COPE) for establishing guidelines. Data on journal metrics, including quartiles, SJR, h-index, total documents published in 2022 (TD2023), publisher volume, and citations per document over 2 years (CITES2YR), were collected from the SJR portal. Several key findings emerged: the majority of LIS journals did not have explicit AI use policies, although AI tools were generally permitted for manuscript editing. Logistic regression analysis revealed a significant association between higher journal metrics, particularly citations (CITES2YR), and the presence of AI use policies, while other metrics, such as SJR and h-index, were not consistently significant. Furthermore, larger publishers were more likely to have AI use policies but showed flexibility by not mandating AI use declarations. Significant differences were found across journal quartiles, with Quartile 1 journals being more likely to adopt AI use policies than Quartile 4. These findings highlight the influential role of large-volume publishers in shaping AI use policies and emphasize their importance in setting scholarly norms in the LIS community.
Journal Article
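As a rough illustration of the logistic-regression step described in the abstract above, the sketch below fits a model of AI-policy presence against simulated journal metrics. All data values, column names, and coefficients are invented for the example; only the general technique (a logit model relating policy adoption to a CITES2YR-style citation metric, SJR, h-index, and publisher size) follows the abstract.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 232  # same sample size as the study, but every value below is simulated

# Hypothetical journal-level records mimicking the metrics named in the abstract.
df = pd.DataFrame({
    "sjr": rng.lognormal(0.0, 0.5, n),
    "h_index": rng.integers(5, 150, n),
    "cites2yr": rng.lognormal(0.5, 0.6, n),
    "large_publisher": rng.integers(0, 2, n),
})

# Simulate policy adoption that depends mainly on citations and publisher size.
linpred = -2 + 0.8 * np.log(df["cites2yr"] + 1) + 0.7 * df["large_publisher"]
df["has_ai_policy"] = rng.binomial(1, 1 / (1 + np.exp(-linpred)))

# Logistic regression: which metrics are associated with having an AI use policy?
model = smf.logit(
    "has_ai_policy ~ np.log(cites2yr + 1) + np.log(sjr) + h_index + large_publisher",
    data=df,
).fit()
print(model.summary())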
A case study exploring associations between popular media attention of scientific research and scientific citations
by Alyssa Evans-Pickett; Matthew K. Seeley; Joseph G. Hadfield
in Appropriations; Author productivity; Bibliometrics
2020
The association between mention of scientific research in popular media (e.g., the mainstream media or social media platforms) and scientific impact (e.g., citations) has yet to be fully explored. The purpose of this study was to clarify this relationship while accounting for other factors that likely influence scientific impact (e.g., the reputations of the scientists conducting the research and of the academic journal in which the research was published). To accomplish this purpose, approximately 800 peer-reviewed articles describing original research were evaluated for scientific impact, popular media attention, and the reputations of the scientists/authors and publication venue. A structural equation model was produced describing the relationship between non-scientific impact (popular media) and scientific impact (citations), while accounting for author/scientist and journal reputation. The resulting model revealed a strong association between the amount of popular media attention given to a scientific research project and its corresponding publication and the number of times that publication is cited in the peer-reviewed scientific literature. These results indicate that (1) peer-reviewed scientific publications receiving more attention in non-scientific media are more likely to be cited than publications receiving less popular media attention, and (2) the non-scientific media is associated with the scientific agenda. These results may inform scientists who increasingly use popular media to communicate their work to the general public and to other scientists. They may also inform administrators of higher education and research funding mechanisms, who base decisions partly on scientific impact.
Journal Article
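The abstract above describes a structural equation model; fitting a full SEM needs a dedicated package, so the sketch below substitutes a plain count regression that controls for author and journal reputation, a deliberate simplification rather than the authors' model. The data are simulated and every variable name and coefficient is illustrative.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 800  # roughly the number of articles evaluated in the study; data are simulated

# Simulated article-level variables: popular-media attention plus two reputation
# controls (e.g. author h-index and journal impact factor, both standardized).
df = pd.DataFrame({
    "media_attention": rng.poisson(4, n),
    "author_reputation": rng.normal(0, 1, n),
    "journal_reputation": rng.normal(0, 1, n),
})
# Citation counts generated so that all three predictors matter.
df["citations"] = rng.poisson(np.exp(1.0 + 0.15 * df["media_attention"]
                                     + 0.3 * df["author_reputation"]
                                     + 0.4 * df["journal_reputation"]))

# Association between media attention and citations, holding reputation constant.
model = smf.poisson(
    "citations ~ media_attention + author_reputation + journal_reputation",
    data=df,
).fit()
print(model.summary())
```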
Image Data Resource: a bioimage data integration and publication platform
by Carazo Salas, Rafael E; Tarkowska, Aleksandra; Ferguson, Richard K
in 14/1; 14/19; 631/114/129/2044
2017
This Resource describes the Image Data Resource (IDR), a prototype online system for biological image data that links experimental and analytic data across multiple data sets and promotes image data sharing and reanalysis.
Access to primary research data is vital for the advancement of science. To extend the data types supported by community repositories, we built a prototype Image Data Resource (IDR). IDR links data from several imaging modalities, including high-content screening, multi-dimensional microscopy and digital pathology, with public genetic or chemical databases and cell and tissue phenotypes expressed using controlled ontologies. Using this integration, IDR facilitates the analysis of gene networks and reveals functional interactions that are inaccessible to individual studies. To enable reanalysis, we also established a computational resource based on Jupyter notebooks that allows remote access to the entire IDR. IDR is also an open-source platform for publishing imaging data. Thus IDR provides an online resource and a software infrastructure that promotes and extends publication and reanalysis of scientific image data.
Journal Article
Thousands of scientists publish a paper every five days
by Boyack, Kevin W.; Ioannidis, John P. A.; Klavans, Richard
in 706/648/479; 706/648/479/37; Authorship
2018
To highlight uncertain norms in authorship, John P. A. Ioannidis, Richard Klavans and Kevin W. Boyack identified the most prolific scientists of recent years.
Journal Article
Scientific productivity: An exploratory study of metrics and incentives
by Lindner, Mark D.; Torralba, Karina D.; Khan, Nasim A.
in Arthritis, Rheumatoid - therapy; Bibliometrics; Biology and Life Sciences
2018
Competitive pressure to maximize the current bibliometric measures of productivity is jeopardizing the integrity of the scientific literature. Efforts are underway to address the 'reproducibility crisis' by encouraging the use of more rigorous, confirmatory methods. However, as long as productivity continues to be defined by the number of discoveries scientists publish, the impact factor of the journals they publish in and the number of times their papers are cited, they will be reluctant to accept high quality methods and consistently conduct and publish confirmatory/replication studies. This exploratory study examined a sample of rigorous Phase II-IV clinical trials, including unpublished studies, to determine if more appropriate metrics and incentives can be developed. The results suggest that rigorous procedures will help reduce false positives, but to the extent that higher quality methods are accepted as the standard of practice, the current bibliometric incentives will discourage innovative studies and encourage scientists to shift their research to less informative studies of subjects that are already being more actively investigated. However, the results also suggest that it is possible to develop a more appropriate system of rewards. In contrast to the current bibliometric incentives, evaluations of the quality of the methods and reproducibility of the results, innovation and diversity of thought, and amount of information produced may serve as measures and incentives that maintain the integrity of the scientific literature and maximize scientific progress.
Journal Article
How AI technology can tame the scientific literature
2018
As artificially intelligent tools for literature and data exploration evolve, developers seek to automate how hypotheses are generated and validated.
Journal Article
Journals’ instructions to authors: A cross-sectional study across scientific disciplines
by Malički, Mario; Aalbersberg, IJsbrand Jan; Bouter, Lex
in Analysis; Bibliographic data bases; Biology
2019
In light of increasing calls for transparent reporting of research and prevention of detrimental research practices, we conducted a cross-sectional, machine-assisted analysis of a representative sample of scientific journals' instructions to authors (ItAs) across all disciplines. We investigated whether 19 topics related to transparency in reporting and research integrity were addressed. Only three topics were addressed in more than one third of ItAs: conflicts of interest, plagiarism, and the type of peer review the journal employs. Health and Life Sciences journals, journals published by medium or large publishers, and journals registered in the Directory of Open Access Journals (DOAJ) were more likely to address many of the analysed topics, while Arts & Humanities journals were least likely to do so. Despite the recent calls for transparency and integrity in research, our analysis shows that most scientific journals need to update their ItAs to align them with practices that prevent detrimental research practices and ensure transparent reporting of research.
Journal Article
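For the machine-assisted topic analysis described in the last record, a minimal keyword-matching sketch is shown below. The journal snippets, topic list, and regular expressions are all invented for illustration; the study's own coding scheme covered 19 topics and a much larger corpus of instructions to authors.

```python
import re

# Invented snippets standing in for journals' instructions to authors (ItAs).
itas = {
    "Journal A": "Authors must disclose any conflicts of interest. Double-blind peer review is used.",
    "Journal B": "All manuscripts are screened for plagiarism with similarity-detection software.",
    "Journal C": "Submissions undergo single-blind peer review and data sharing is encouraged.",
}

# A few of the 19 transparency/integrity topics, each defined by a simple pattern.
topics = {
    "conflicts of interest": r"conflicts? of interest",
    "plagiarism": r"plagiaris",
    "peer review type": r"(single|double)-blind peer review|open peer review",
    "data sharing": r"data shar",
}

# Machine-assisted pass: flag which topics each ItA addresses.
for journal, text in itas.items():
    addressed = [topic for topic, pattern in topics.items()
                 if re.search(pattern, text, flags=re.I)]
    print(journal, "->", addressed or "none of the sampled topics")
```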