Catalogue Search | MBRL
Explore the vast range of titles available.
174,970 result(s) for "Statistical services"
Statistical disclosure control
by Hundepool, Anco
in Confidential communications; Confidential communications -- Statistical services; MATHEMATICS
2012
A reference to answer all your statistical confidentiality questions.
This handbook provides technical guidance on statistical disclosure control and on how to balance the need to provide users with statistical outputs against the need to protect the confidentiality of respondents. Statistical disclosure control is combined with administrative, legal, and IT measures to define a sound data dissemination strategy based on a risk management approach.
The key concepts of statistical disclosure control are presented, along with the methodology and software that can be used to apply its various methods. Numerous examples and guidelines illustrate the topics covered.
Statistical Disclosure Control:
* Presents a combination of theoretical and practical solutions.
* Introduces all the key concepts and definitions involved with statistical disclosure control.
* Provides a high level overview of how to approach problems associated with confidentiality.
* Provides a broad-ranging review of the methods available to control disclosure.
* Explains the subtleties of group disclosure control.
* Features examples throughout the book along with case studies demonstrating how particular methods are used.
* Discusses microdata, magnitude and frequency tabular data, and remote access issues.
* Written by experts within leading National Statistical Institutes.
Official statisticians, academics and market researchers who need to be informed and make decisions on disclosure limitation will benefit from this book.
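The threshold rule and secondary suppression covered in handbooks like this one can be sketched in a few lines. The following Python sketch is an illustration only, not the book's own software: the sample records, the MIN_COUNT threshold, and the smallest-cell choice of secondary suppression are all assumptions. Cells of a frequency table with too few contributors are blanked; when a row is left with a single blank, a second cell is blanked so the first cannot be recovered from the row total.

```python
from collections import Counter

# Hypothetical microdata: (region, occupation) pairs for a frequency table.
records = [
    ("North", "nurse"), ("North", "nurse"), ("North", "miner"),
    ("South", "nurse"), ("South", "nurse"), ("South", "nurse"),
    ("South", "miner"), ("South", "miner"), ("North", "farmer"),
]

MIN_COUNT = 3  # threshold rule: cells with fewer than 3 contributors are unsafe

table = Counter(records)

# Primary suppression: blank every cell that falls below the threshold.
published = {cell: (n if n >= MIN_COUNT else None) for cell, n in table.items()}

# Secondary suppression: a row with exactly one blanked cell would let that
# value be recovered from the row total, so blank another cell in the row.
for region in {region for region, _ in table}:
    row = [c for c in published if c[0] == region]
    blanked = [c for c in row if published[c] is None]
    kept = [c for c in row if published[c] is not None]
    if len(blanked) == 1 and kept:
        # Simplification: real systems choose secondary suppressions by
        # optimization; here we just blank the smallest remaining cell.
        published[min(kept, key=lambda c: table[c])] = None

for cell in sorted(published):
    print(cell, published[cell] if published[cell] is not None else "suppressed")
```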
The sum of the people : how the census has shaped nations, from the ancient world to the modern age
Provides a 3,000-year history of the census, chronicling the practices of the ancient world through the Supreme Court rulings of today, examining how censuses have been used as tools of democracy, exclusion and mass surveillance.
Handbook of statistical data editing and imputation
by Scholtus, Sander; Waal, Ton de; Pannekoek, Jeroen
in Data editing; Data integrity; MATHEMATICS
2011
A practical, one-stop reference on the theory and applications of statistical data editing and imputation techniques
Collected survey data are vulnerable to error. In particular, the data collection stage is a potential source of errors and missing values. As a result, the important role of statistical data editing, and the amount of resources involved, has motivated considerable research efforts to enhance the efficiency and effectiveness of this process. Handbook of Statistical Data Editing and Imputation equips readers with the essential statistical procedures for detecting and correcting inconsistencies and filling in missing values with estimates. The authors supply an easily accessible treatment of the existing methodology in this field, featuring an overview of common errors encountered in practice and techniques for resolving these issues.
The book begins with an overview of methods and strategies for statistical data editing and imputation. Subsequent chapters provide detailed treatment of the central theoretical methods and modern applications, with topics of coverage including:
* Localization of errors in continuous data, with an outline of selective editing strategies, automatic editing for systematic and random errors, and other relevant state-of-the-art methods.
* Extensions of automatic editing to categorical data and integer data.
* The basic framework for imputation, with a breakdown of key methods and models and a comparison of imputation with the weighting approach to correct for missing values.
* More advanced imputation methods, including imputation under edit restraints.
Throughout the book, the treatment of each topic is presented in a uniform fashion. Following an introduction, each chapter presents the key theories and formulas underlying the topic and then illustrates common applications. The discussion concludes with a summary of the main concepts and a real-world example that incorporates realistic data, along with professional insight into common challenges and best practices.
Handbook of Statistical Data Editing and Imputation is an essential reference for survey researchers working in the fields of business, economics, government, and the social sciences who gather, analyze, and draw results from data. It is also a suitable supplement for courses on survey methods at the upper-undergraduate and graduate levels.
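The edit-and-impute cycle the handbook formalizes can be illustrated with a minimal sketch. This is an assumption-laden toy, not the authors' methods: the records, the edit rule (household size must be at least 1), and the choice of mean imputation are all illustrative. Invalid values are localized and set to missing, then missing values are filled with estimates.

```python
import statistics

# Hypothetical survey records with one missing income and one invalid size.
records = [
    {"id": 1, "size": 2, "income": 60000},
    {"id": 2, "size": 4, "income": 80000},
    {"id": 3, "size": 3, "income": None},   # item nonresponse
    {"id": 4, "size": 0, "income": 50000},  # violates the edit rule size >= 1
    {"id": 5, "size": 2, "income": 64000},
]

# Editing step (error localization in miniature): a household must have at
# least one member, so a non-positive size is treated as missing.
for r in records:
    if r["size"] is not None and r["size"] < 1:
        r["size"] = None

# Imputation step: fill each missing value with the mean of observed values.
def impute_mean(rows, field):
    fill = statistics.mean(r[field] for r in rows if r[field] is not None)
    for r in rows:
        if r[field] is None:
            r[field] = fill

impute_mean(records, "size")
impute_mean(records, "income")
print(records)
```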
Count the Dead
2022
The global doubling of human life expectancy between 1850 and 1950 is arguably one of the most consequential developments in human history, undergirding massive improvements in human life and lifestyles. In 1850, Americans died at an average age of 30. Today, the average is almost 80. This story is typically told as a series of medical breakthroughs (Jenner and vaccination, Lister and antisepsis, Snow and germ theory, Fleming and penicillin), but the lion's share of the credit belongs to the men and women who dedicated their lives to collecting good data. Examining the development of death registration systems in the United States, from the first mortality census in 1850 to the development of the death certificate at the turn of the century, Count the Dead argues that mortality data transformed life on Earth, proving critical to the systemization of public health, casualty reporting, and human rights. Stephen Berry shows how a network of coroners, court officials, and state and federal authorities developed methods to track and reveal patterns of dying. These officials harnessed these records to turn the collective dead into informants, and in so doing allowed the dead to shape life and death as we know it today.
Principles and Practices for a Federal Statistical Agency
by Committee on National Statistics; Citro, Constance F.; National Academies of Sciences, Engineering, and Medicine
2017
Publicly available statistics from government agencies that are credible, relevant, accurate, and timely are essential for policy makers, individuals, households, businesses, academic institutions, and other organizations to make informed decisions. Even more, the effective operation of a democratic system of government depends on the unhindered flow of statistical information to its citizens.
In the United States, federal statistical agencies in cabinet departments and independent agencies are the governmental units whose principal function is to compile, analyze, and disseminate information for such statistical purposes as describing population characteristics and trends, planning and monitoring programs, and conducting research and evaluation. The work of these agencies is coordinated by the U.S. Office of Management and Budget. Statistical agencies may acquire information not only from surveys or censuses of people and organizations, but also from such sources as government administrative records, private-sector datasets, and Internet sources that are judged of suitable quality and relevance for statistical use. They may conduct analyses, but they do not advocate policies or take partisan positions. Statistical purposes for which they provide information relate to descriptions of groups and exclude any interest in or identification of an individual person, institution, or economic unit.
Four principles are fundamental for a federal statistical agency: relevance to policy issues, credibility among data users, trust among data providers, and independence from political and other undue external influence. Principles and Practices for a Federal Statistical Agency: Sixth Edition presents and comments on these principles as they have been affected by changes in laws, regulations, and other aspects of the environment of federal statistical agencies over the past four years.
Federal Statistics, Multiple Data Sources, and Privacy Protection
by Committee on National Statistics; National Academies of Sciences, Engineering, and Medicine; Harris-Kojetin, Brian A.
in Information retrieval; Statistical services
2017, 2018
The environment for obtaining information and providing statistical data for policy makers and the public has changed significantly in the past decade, raising questions about the fundamental survey paradigm that underlies federal statistics. New data sources provide opportunities to develop a new paradigm that can improve timeliness, geographic or subpopulation detail, and statistical efficiency, and that has the potential to reduce the costs of producing federal statistics.
The panel's first report described federal statistical agencies' current paradigm, which relies heavily on sample surveys for producing national statistics, and the challenges agencies are facing; the legal frameworks and mechanisms for protecting the privacy and confidentiality of statistical data and for providing researchers access to data, along with challenges to those frameworks and mechanisms; and statistical agencies' access to alternative sources of data. The panel recommended a new approach for federal statistical programs that would combine diverse data sources from government and the private sector, as well as the creation of a new entity that would provide the foundational elements needed for this new approach, including legal authority to access data and protect privacy.
This second of the panel's two reports builds on the analysis, conclusions, and recommendations of the first. It assesses alternative methods for implementing a new approach that would combine diverse data sources from government and the private sector, including describing statistical models for combining data from multiple sources; examining statistical and computer science approaches that foster privacy protections; evaluating frameworks for assessing the quality and utility of alternative data sources; and reviewing various models for implementing the recommended new entity. Together, the two reports offer ideas and recommendations to help federal statistical agencies examine and evaluate data from alternative sources and then combine them as appropriate to provide the country with more timely, actionable, and useful information for policy makers, businesses, and individuals.
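Among the computer science approaches to privacy protection that the report examines, differential privacy is a prominent one. The following is a minimal sketch of its simplest building block, the Laplace mechanism for a count query; the epsilon value and the sample count are illustrative assumptions, and this is not the panel's recommended design.

```python
import random

def laplace_noise(scale: float) -> float:
    # The difference of two i.i.d. Exponential(1) draws, rescaled,
    # is Laplace(0, scale) distributed.
    return scale * (random.expovariate(1.0) - random.expovariate(1.0))

def private_count(true_count: int, epsilon: float) -> float:
    # A counting query has sensitivity 1 (adding or removing one person
    # changes the count by at most 1), so Laplace noise with scale
    # 1/epsilon gives epsilon-differential privacy for the released figure.
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical release of a small-area population count.
print(round(private_count(1528, epsilon=0.5)))
```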