Catalogue Search | MBRL
Explore the vast range of titles available.
55 result(s) for "Dataprocessing"
A programmable shared-memory system for an array of processing-in-memory devices
by Sim, Hyogi; Vazhkudai, Sudharshan S.; Lee, Sangkuen
in Arrays; Big data processing; Commodities
2019
Processing in memory (PIM), the concept of integrating processing directly with memory, has been attracting considerable attention, since PIM can help overcome the throughput limitation caused by data movement between the CPU and memory. The challenge, however, is that it requires programmers to have a deep understanding of the PIM architecture to maximize benefits such as data locality and parallel thread execution on multiple PIM devices. In this study, we present AnalyzeThat, a programmable shared-memory system for parallel data processing with PIM devices. Thematic to AnalyzeThat is a rich PIM-aware data structure (PADS), an encapsulation that integrally ties together the data, the analysis tasks, and the runtime needed to interface with the PIM device array. The PADS abstraction provides (i) a sophisticated key-value data container that allows programmers to easily store data on multiple PIMs, (ii) a suite of parallel operations with which users can easily implement data analysis applications, and (iii) a runtime, hidden from programmers, which provides the mechanisms needed to overlay both the data and the tasks on the PIM device array in an intelligent fashion, based on PIM-specific information collected from the hardware. We have developed a PIM emulation framework on which AnalyzeThat is implemented. Our experimental evaluation with representative data analytics applications suggests that the proposed system can significantly reduce the PIM programming effort without losing its technology benefits.
Journal Article
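The PADS abstraction summarized in the abstract above (a key-value container partitioned across PIM devices, plus parallel operations, with placement handled by a hidden runtime) can be pictured with a small sketch. All names below (`PadsLikeStore`, `put`, `parallel_map`) are hypothetical illustrations, not the actual AnalyzeThat API; the sketch assumes a simple hash-based shard-per-device layout and emulates devices with a thread pool.

```python
from concurrent.futures import ThreadPoolExecutor

class PadsLikeStore:
    """Illustrative key-value container sharded across N emulated PIM
    devices. Names and layout are hypothetical, not the real PADS API."""

    def __init__(self, n_devices=4):
        self.n_devices = n_devices
        # one shard per emulated device; hashing decides data placement
        self.shards = [dict() for _ in range(n_devices)]

    def put(self, key, value):
        self.shards[hash(key) % self.n_devices][key] = value

    def get(self, key):
        return self.shards[hash(key) % self.n_devices][key]

    def parallel_map(self, fn):
        # apply fn to every value of each shard "on its own device",
        # one worker per shard, returning one result dict per shard
        def work(shard):
            return {k: fn(v) for k, v in shard.items()}
        with ThreadPoolExecutor(max_workers=self.n_devices) as pool:
            return list(pool.map(work, self.shards))

store = PadsLikeStore()
for i in range(8):
    store.put(f"k{i}", i)
doubled = store.parallel_map(lambda v: 2 * v)
```

The point of the sketch is the division of labor the abstract describes: the programmer only calls `put` and `parallel_map`, while data placement and per-device execution stay inside the container.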
Cognitive Work Analysis: Coping with Complexity
by Jenkins, Daniel P.; Stanton, Neville A.; Salmon, Paul M.
in Command and control systems; Command and control systems -- Data processing; Human-computer interaction
2009, 2008, 2017
'Complex sociotechnical systems' are systems made up of numerous interacting parts, both human and non-human, operating in dynamic, ambiguous and safety-critical domains. Cognitive Work Analysis (CWA) is a structured framework developed specifically for the analysis and design of such complex sociotechnical systems. Unlike many human factors approaches, CWA does not focus on how human-system interaction should proceed (normative modelling) or on how human-system interaction currently works (descriptive modelling). Instead, through a focus on constraints, it develops a model of how work can be conducted within a given work domain, without explicitly identifying specific sequences of actions (formative modelling).
The Handbook of Information and Computer Ethics
by Tavani, Herman T.; Himma, Kenneth E.
in Aerospace; Communication, Networking and Broadcast Technologies; Components, Circuits, Devices and Systems
2009, 2008
Discover how developments in information technology are raising new ethical debates. Information and computer ethics has emerged as an important area of philosophical and social theorizing, combining conceptual, meta-ethical, normative, and applied elements. As a result, academic interest in this area has increased dramatically, particularly in computer science, philosophy, and communications departments; business schools; information and library schools; and law schools. The Handbook of Information and Computer Ethics responds to this growing interest with twenty-seven chapters that address both traditional and current issues in information and computer ethics research. It is organized into six parts:
Foundational Issues and Methodological Frameworks
Theoretical Issues Affecting Property, Privacy, Anonymity, and Security
Professional Issues and the Information-Related Professions
Responsibility Issues and Risk Assessment
Regulatory Issues and Challenges
Access and Equity Issues
Each chapter, written by one or more of the most influential ethicists in their fields of expertise, explains and evaluates the central positions and arguments on the respective issues. Chapters end with a bibliography that identifies the most important supplementary books and papers available on the topic. This handbook provides an accessible, yet sophisticated, overview of the most important issues we face in information and computer ethics today. It is an ideal supplemental text for advanced undergraduate- and graduate-level courses in information and computer ethics, and is also of interest to readers involved in library science, computer science, or philosophy.
Optimal design of experiments: a case study approach
by Goos, Peter; Jones, Bradley
in Case studies; Computer-aided design; Computer-aided engineering
2011
"This is an engaging and informative book on the modern practice of experimental design. The authors' writing style is entertaining, the consulting dialogs are extremely enjoyable, and the technical material is presented brilliantly but not overwhelmingly. The book is a joy to read. Everyone who practices or teaches DOE should read this book." - Douglas C. Montgomery, Regents Professor, Department of Industrial Engineering, Arizona State University
"It's been said: 'Design for the experiment, don't experiment for the design.' This book ably demonstrates this notion by showing how tailor-made, optimal designs can be effectively employed to meet a client's actual needs. It should be required reading for anyone interested in using the design of experiments in industrial settings." - Christopher J. Nachtsheim, Frank A Donaldson Chair in Operations Management, Carlson School of Management, University of Minnesota
This book demonstrates the utility of the computer-aided optimal design approach using real industrial examples. These examples address questions such as the following:
How can I do screening inexpensively if I have dozens of factors to investigate?
What can I do if I have day-to-day variability and I can only perform 3 runs a day?
How can I do RSM cost-effectively if I have categorical factors?
How can I design and analyze experiments when there is a factor that can only be changed a few times over the study?
How can I include both ingredients in a mixture and processing factors in the same study?
How can I design an experiment if there are many factor combinations that are impossible to run?
How can I make sure that a time trend due to warming up of equipment does not affect the conclusions from a study?
How can I take into account batch information when designing experiments involving multiple batches?
How can I add runs to a botched experiment to resolve ambiguities?
While answering these questions, the book also shows how to evaluate and compare designs. This allows researchers to make sensible trade-offs between the cost of experimentation and the amount of information they obtain.
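The idea of evaluating and comparing designs mentioned in this entry can be illustrated with the D-optimality criterion, which scores a design by the determinant of its information matrix X'X (larger means more precise parameter estimates). The two designs and the main-effects model below are a minimal hypothetical example, not taken from the book:

```python
from itertools import product

def xtx(rows):
    # information matrix X'X for model columns [intercept, x1, x2]
    p = len(rows[0])
    return [[sum(r[i] * r[j] for r in rows) for j in range(p)]
            for i in range(p)]

def det3(m):
    # determinant of a 3x3 matrix by cofactor expansion
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

# full 2^2 factorial: 4 runs, rows are (intercept, x1, x2)
full = [(1, x1, x2) for x1, x2 in product((-1, 1), repeat=2)]
# a smaller 3-run design (one corner of the square dropped)
small = [(1, -1, -1), (1, 1, -1), (1, -1, 1)]

d_full = det3(xtx(full))    # D-criterion of the full factorial
d_small = det3(xtx(small))  # D-criterion of the reduced design
```

Here `d_full` exceeds `d_small`, quantifying the information lost by dropping a run; this determinant comparison is the kind of trade-off between experimental cost and information that the book discusses.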
Computational drug design
by Young, D. C.
in Biochemical Phenomena; Chemistry, Pharmaceutical -- methods; Computational Biology -- methods
2009
A solid perspective on the entire breadth of the field of computational drug design, including every major design technique in use today. Explains how the drug design process varies for different types of targets, and compares and contrasts the accuracy of available computational methods.