Catalogue Search | MBRL
Search Results
Explore the vast range of titles available.
151,689 result(s) for "Retrieval"
Big Data, Little Data, No Data
by Borgman, Christine L
in Big data; Communication in learning and scholarship; Communication in learning and scholarship -- Technological innovations
2015, 2016, 2017
"Big Data" is on the covers of Science, Nature, the Economist, and Wired magazines, and on the front pages of the Wall Street Journal and the New York Times. But despite the media hyperbole, as Christine Borgman points out in this examination of data and scholarly research, having the right data is usually better than having more data; little data can be just as valuable as big data. In many cases, there are no data -- because relevant data don't exist, cannot be found, or are not available. Moreover, data sharing is difficult, incentives to do so are minimal, and data practices vary widely across disciplines. Borgman, an often-cited authority on scholarly communication, argues that data have no value or meaning in isolation; they exist within a knowledge infrastructure -- an ecology of people, practices, technologies, institutions, material objects, and relationships. After laying out the premises of her investigation -- six "provocations" meant to inspire discussion about the uses of data in scholarship -- Borgman offers case studies of data practices in the sciences, the social sciences, and the humanities, and then considers the implications of her findings for scholarly practice and research policy. To manage and exploit data over the long term, Borgman argues, requires massive investment in knowledge infrastructures; at stake is the future of scholarship.
Improving the translation of search strategies using the Polyglot Search Translator: a randomized controlled trial
by Clark, Justin Michael; Carter, Matthew; Jones, Mark
in Analysis; automation; Bibliographic databases
2020
Background: Searching for studies to include in a systematic review (SR) is a time- and labor-intensive process, and searches of multiple databases are recommended. To reduce the time spent translating search strings across databases, a tool called the Polyglot Search Translator (PST) was developed. The authors evaluated whether using the PST as a search translation aid reduces the time required to translate search strings without increasing errors.

Methods: In a randomized trial, twenty participants were randomly allocated ten database search strings and then randomly assigned to translate five with the assistance of the PST (PST-A method) and five without the assistance of the PST (manual method). We compared the time taken to translate search strings, the number of errors made, and how close the number of references retrieved by a translated search was to the number retrieved by a reference standard translation.

Results: Sixteen participants performed 174 translations using the PST-A method and 192 translations using the manual method. The mean time taken to translate a search string with the PST-A method was 31 minutes versus 45 minutes by the manual method (mean difference: 14 minutes). The mean number of errors made per translation by the PST-A method was 8.6 versus 14.6 by the manual method. Large variation in the number of references retrieved makes results for this outcome unreliable, although the number of references retrieved by the PST-A method was closer to the reference standard translation than the manual method.

Conclusion: When used to assist with translating search strings across databases, the PST can increase the speed of translation without increasing errors. Errors in search translations can still be a problem, and search specialists should be aware of this.
Journal Article
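To make concrete what "translating a search string across databases" involves, here is a deliberately simplified sketch of one part of the task: mapping one database's field tags onto another's. This is not the PST's actual implementation; the tag subset below is illustrative only, and a real translator must also handle proximity operators, truncation symbols, and controlled-vocabulary (MeSH/Emtree) differences.

```python
# Hypothetical, minimal illustration of field-tag translation between
# PubMed syntax and Ovid-style syntax. Only the tags listed in TAG_MAP
# are handled; everything else is passed through unchanged.

# PubMed field tag -> Ovid field suffix (illustrative subset)
TAG_MAP = {
    "[tiab]": ".ti,ab.",  # title/abstract
    "[ti]": ".ti.",       # title
    "[ab]": ".ab.",       # abstract
}

def pubmed_to_ovid(query: str) -> str:
    """Rewrite the PubMed field tags listed in TAG_MAP into Ovid syntax."""
    for pubmed_tag, ovid_tag in TAG_MAP.items():
        query = query.replace(pubmed_tag, ovid_tag)
    return query

print(pubmed_to_ovid('retrieval[tiab] AND "search strategy"[ti]'))
# retrieval.ti,ab. AND "search strategy".ti.
```

Even this toy version hints at why manual translation is error-prone: every database has its own tag syntax, and a missed or mistyped tag silently changes what the search retrieves.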
Search foundations : toward a science of technology-mediated experience
"This book contributes to discussions within Information Retrieval and Science (IR&S) by improving our conceptual understanding of the relationship between humans and technology" -- Provided by publisher.
Handbook of evaluation methods for health informatics
by Brender, Jytte
in Decision Support Techniques; Evaluation; Information storage and retrieval systems
2006
This Handbook provides a complete compendium of methods for evaluation of IT-based systems and solutions within healthcare. Emphasis is entirely on assessment of the IT system within its organizational environment. The author provides a coherent and complete assessment of methods addressing interactions with and effects of technology at the organizational, psychological, and social levels. It offers an explanation of the terminology and theoretical foundations underlying the methodological analysis presented here. The author carefully guides the reader through the process of identifying relevant methods corresponding to specific information needs and conditions for carrying out the evaluation study. The Handbook takes a critical view by focusing on assumptions for application, tacit built-in perspectives of the methods, as well as their perils and pitfalls.

- Collects a number of evaluation methods of medical informatics
- Addresses metrics and measures
- Includes an extensive list of annotated references, case studies, and a list of useful Web sites
Informatica
2023
Informatica, the updated edition of Alex Wright's previously published Glut, continues the journey through the history of the information age to show how information systems emerge. Today's "information explosion" may seem like a modern phenomenon, but we are not the first generation, or even the first species, to wrestle with the problem of information overload. Long before the advent of computers, human beings were collecting, storing, and organizing information: from Ice Age taxonomies to Sumerian archives, Greek libraries to Christian monasteries.

Wright weaves a narrative that connects such seemingly far-flung topics as insect colonies, Stone Age jewelry, medieval monasteries, Renaissance encyclopedias, early computer networks, and the World Wide Web. He suggests that the future of the information age may lie deep in our cultural past.

We stand at a precipice, struggling to cope with a tsunami of data. Wright provides some much-needed historical perspective. We can understand the predicament of information overload not just as the result of technological change but as the latest chapter in an ancient story that we are only beginning to understand.
When we are no more : how digital memory is shaping our future
Examines how humanity records and passes on its culture to future generations, from the libraries of antiquity to the excess of information available in the digital age, and how ephemeral digital storage methods present a challenge for passing on current cultural memory to the future.
Optical cryptosystems
2020
Advanced technologies such as artificial intelligence, big data, cloud computing, and the Internet of Things have changed the digital landscape, providing many new and exciting opportunities. However, they also provide ever-shifting gateways for information theft or misuse. Staying ahead requires the development of innovative and responsive security measures, and recent advances in optical technology have positioned it as a promising alternative to digital cryptography. Optical Cryptosystems introduces the subject of optical cryptography and provides up-to-date coverage of optical security schemes. Optical principles, approaches, and algorithms are discussed as well as applications, including image/data encryption-decryption, watermarking, image/data hiding, and authentication verification. This book also includes MATLAB® codes, enabling students and research professionals to carry out exercises and develop newer methods of image/data security and authentication.