Catalogue Search | MBRL
Explore the vast range of titles available.
36,209 result(s) for "Computer science -- Methodology"
MATLAB for neuroscientists : an introduction to scientific computing in MATLAB
by Dickey, Adam Seth; Benayoun, Marc D; Lusignan, Michael E
in Computer science -- Methodology; Data processing; MATLAB
2014, 2013, 2008
This is the first comprehensive teaching resource and textbook for teaching MATLAB in the neurosciences and in psychology. MATLAB is unique in that it can be used to learn the entire empirical and experimental process, including stimulus generation, experimental control, data collection, data analysis, and modeling. Thus a wide variety of computational problems can be addressed in a single programming environment. The idea is to empower advanced undergraduates and beginning graduate students by allowing them to design and implement their own analytical tools. As students advance in their research careers, they will have achieved the fluency required to understand and adapt more specialized tools rather than treating them as "black boxes".
Quantitative methods in archaeology using R
"Quantitative Methods in Archaeology Using R is the first hands-on guide to using the R statistical computing system written specifically for archaeologists. It shows how to use the system to analyze many types of archaeological data. Part I includes tutorials on R, with applications to real archaeological data showing how to compute descriptive statistics, create tables, and produce a wide variety of charts and graphs. Part II addresses the major multivariate approaches used by archaeologists, including multiple regression (and the generalized linear model); multiple analysis of variance and discriminant analysis; principal components analysis; correspondence analysis; distances and scaling; and cluster analysis. Part III covers specialized topics in archaeology, including intra-site spatial analysis, seriation, and assemblage diversity"--Provided by publisher.
International survey on skin patch test procedures, attitudes and interpretation
by Vereda, Andrea; Pawankar, Ruby; Tanno, Luciana K.
in [SDV.IMM.ALL]Life Sciences [q-bio]/Immunology/Allergology; [SDV.MHEP.DERM]Life Sciences [q-bio]/Human health and pathology/Dermatology; [SDV.SPEE]Life Sciences [q-bio]/Public health and epidemiology
2016
Background: The skin patch test is the gold-standard method for diagnosing contact allergy. Although it has been used for more than 100 years, the patch test procedure is performed with variability around the world. A number of factors can influence the test results, namely the quality of the reagents used, the timing of the application, the patch test series (allergens/haptens) used for testing, the interpretation of the skin reactions, and the evaluation of the patient's benefit.
Methods: We performed an Internet-based survey with 38 questions covering the educational background of respondents, patch test methods, and interpretation. The questionnaire was distributed among all representatives of national member societies of the World Allergy Organization (WAO) and the WAO Junior Members Group.
Results: One hundred sixty-nine completed surveys were received from 47 countries. The majority of participants had more than 5 years of clinical practice (61%) and routinely carried out patch tests (70%). Both allergists and dermatologists were responsible for carrying out the patch tests. We observed the use of many different guidelines regardless of geographical distribution. The use of home-made preparations was indicated by 47% of participants, and 73% of respondents performed 2 or 3 readings. Most respondents indicated having patients with adverse reactions, including erythroderma (12%); however, only 30% completed a consent form before conducting the patch test.
Discussion: The heterogeneity of patch test practices may be influenced by the level of awareness of clinical guidelines, different training backgrounds, accessibility to various types of devices, the patch test series (allergens/haptens) used for testing, the type of clinical practice (public or private, clinical or research-based institution), infrastructure availability, and financial/commercial implications and regulations, among others.
Conclusion: Patch test procedures lack worldwide homogeneity, which raises concerns and points to the need for standardization and harmonization of this important diagnostic procedure.
Journal Article
Analogicity in Computer Science. Methodological Analysis
2020
Analogicity in computer science is understood in two ways that are not mutually exclusive: 1) with regard to continuity (of data or computations), and 2) with regard to analogousness (i.e., similarity between certain natural processes and computations). Continuous computations are the subject of three methodological questions considered in the paper: 1a) to what extent do their theoretical models go beyond the model of the universal Turing machine (which defines digital computations)? 1b) is their computational power greater than that of the universal Turing machine? 1c) under what conditions are continuous computations realizable in practice? Analogue-analogical computations lead to two further issues: 2a) in what sense, and to what extent, does their accuracy depend on the adequacy of certain theories of the empirical sciences? 2b) are there analogue-analogical computations in nature that are also continuous? These issues are an important element of the philosophical discussion of the limitations of contemporary computer science.
Journal Article
Computing as Empirical Science – Evolution of a Concept
by Polak, Paweł
in empirical computing science; methodology of computer science; natural computing
2016
This article presents the evolution of philosophical and methodological considerations concerning empiricism in computer (computing) science. In this study, we trace the most important events in the history of reflection on computing. These considerations were begun by the forerunners of Artificial Intelligence, H.A. Simon and A. Newell, in their paper Computer Science as Empirical Inquiry (1975). The concept of empirical computer science was later developed by S.S. Shapiro, P. Wegner, A.H. Eden, and P.J. Denning, who showed various empirical aspects of computing. This led to a view of the science of computing (or science of information processing) as a science of general scope. Some interesting contemporary paths toward a generalized perspective on computation, such as natural computing, are also discussed.
Journal Article
The Bounded and Precise Word Problems for Presentations of Groups
by Ivanov, S. V.
in Geometric group theory [See also 05C25, 20E08, 57Mxx]; Group theory and generalizations; Presentations of groups (Mathematics)
2020
We introduce and study the bounded word problem and the precise word problem for groups given by means of generators and defining relations. For example, for every finitely presented group, the bounded word problem is in …
An Object-Oriented Methodology for Solving Assignment-Type Problems with Neighborhood Search Techniques
by Ferland, Jacques A; Lavoie, Alain; Hertz, Alain
in Applied sciences; Boolean data; Boolean functions
1996
Because of its specificity, it is usually difficult to reuse computer code developed for one combinatorial problem to deal with another. We use the object-oriented programming methodology to derive general-purpose software that includes four different neighborhood search techniques (descent method, tabu search, exchange procedure, simulated annealing) and can deal with any assignment-type problem with a bare minimum of coding effort. The structure of the software even allows a more advanced user to adjust several parameters of these techniques and to modify the techniques to create specific variants.
Journal Article
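The object-oriented design this abstract describes — a generic search driver decoupled from the problem through a small interface — can be sketched as follows. This is a minimal illustration, not the authors' software: all names (AssignmentProblem, DescentSearch, SimulatedAnnealing, ToyAssignment) are hypothetical, and only two of the four techniques are shown.

```python
import math
import random
from abc import ABC, abstractmethod


class AssignmentProblem(ABC):
    """Interface any assignment-type problem must implement."""

    @abstractmethod
    def initial_solution(self): ...

    @abstractmethod
    def cost(self, solution): ...

    @abstractmethod
    def neighbors(self, solution):
        """Yield neighboring solutions (e.g. reassign one item)."""
        ...


class DescentSearch:
    """Move to the best improving neighbor until none exists."""

    def solve(self, problem):
        current = problem.initial_solution()
        while True:
            best = min(problem.neighbors(current), key=problem.cost, default=None)
            if best is None or problem.cost(best) >= problem.cost(current):
                return current
            current = best


class SimulatedAnnealing:
    """Accept worsening moves with a temperature-dependent probability."""

    def __init__(self, t0=10.0, cooling=0.95, steps=200):
        self.t0, self.cooling, self.steps = t0, cooling, steps

    def solve(self, problem):
        current = problem.initial_solution()
        best, t = current, self.t0
        for _ in range(self.steps):
            candidates = list(problem.neighbors(current))
            if not candidates:
                break
            cand = random.choice(candidates)
            delta = problem.cost(cand) - problem.cost(current)
            if delta <= 0 or random.random() < math.exp(-delta / t):
                current = cand
            if problem.cost(current) < problem.cost(best):
                best = current
            t *= self.cooling  # cool down the temperature
        return best


class ToyAssignment(AssignmentProblem):
    """Assign each item to one agent, minimizing a cost table;
    neighbors differ in exactly one item's assignment."""

    def __init__(self, costs):
        self.costs = costs  # costs[item][agent]

    def initial_solution(self):
        return tuple(0 for _ in self.costs)

    def cost(self, solution):
        return sum(self.costs[i][a] for i, a in enumerate(solution))

    def neighbors(self, solution):
        for i in range(len(solution)):
            for a in range(len(self.costs[i])):
                if a != solution[i]:
                    yield solution[:i] + (a,) + solution[i + 1:]
```

Because both solvers depend only on the interface, swapping `DescentSearch` for `SimulatedAnnealing` requires no change to the problem class — which is the reuse-with-minimal-coding-effort point the abstract makes.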
Big Data, Little Data, No Data
by Borgman, Christine L
in Big data; Communication in learning and scholarship; Communication in learning and scholarship -- Technological innovations
2015, 2016, 2017
"Big Data" is on the covers of Science, Nature, the Economist, and Wired magazines, and on the front pages of the Wall Street Journal and the New York Times. But despite the media hyperbole, as Christine Borgman points out in this examination of data and scholarly research, having the right data is usually better than having more data; little data can be just as valuable as big data. In many cases, there are no data -- because relevant data don't exist, cannot be found, or are not available. Moreover, data sharing is difficult, incentives to do so are minimal, and data practices vary widely across disciplines. Borgman, an often-cited authority on scholarly communication, argues that data have no value or meaning in isolation; they exist within a knowledge infrastructure -- an ecology of people, practices, technologies, institutions, material objects, and relationships. After laying out the premises of her investigation -- six "provocations" meant to inspire discussion about the uses of data in scholarship -- Borgman offers case studies of data practices in the sciences, the social sciences, and the humanities, and then considers the implications of her findings for scholarly practice and research policy. To manage and exploit data over the long term, Borgman argues, requires massive investment in knowledge infrastructures; at stake is the future of scholarship.