Catalogue Search | MBRL
Search Results
Explore the vast range of titles available.
548 results for "Strategic planning Computer simulation."
Business case analysis with R : simulation tutorials to support complex business decisions
by
Brown, Robert D., author
in
Strategic planning -- Data processing.
,
Business planning -- Data processing.
,
Strategic planning -- Computer simulation.
2018
This tutorial teaches you how to use the statistical programming language R to develop a business case simulation and analysis. It presents a methodology for conducting business case analysis that minimizes decision delay by focusing stakeholders on what matters most and suggests pathways for minimizing the risk in strategic and capital allocation decisions. Business case analysis, often conducted in spreadsheets, exposes decision makers to additional risks that arise just from the use of the spreadsheet environment.
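The simulation-based approach the abstract describes can be illustrated with a minimal Monte Carlo sketch. The example below is in Python rather than the book's R, and every input (investment size, demand and margin distributions, discount rate) is a hypothetical assumption, not taken from the book.

```python
import random

def simulate_npv(n_trials=10_000, discount_rate=0.10, years=5, seed=42):
    """Monte Carlo NPV for a hypothetical business case:
    uncertain annual demand and unit margin, fixed upfront cost."""
    rng = random.Random(seed)
    upfront_cost = 1_000_000                   # assumed investment
    npvs = []
    for _ in range(n_trials):
        demand = rng.gauss(50_000, 10_000)     # units/year (assumed)
        margin = rng.uniform(4.0, 8.0)         # $/unit (assumed)
        cash_flow = demand * margin
        npv = -upfront_cost + sum(cash_flow / (1 + discount_rate) ** t
                                  for t in range(1, years + 1))
        npvs.append(npv)
    p_positive = sum(npv > 0 for npv in npvs) / n_trials
    return sum(npvs) / n_trials, p_positive

mean_npv, p_positive = simulate_npv()
print(f"mean NPV ${mean_npv:,.0f}, P(NPV > 0) = {p_positive:.2f}")
```

Unlike a spreadsheet model, the uncertain inputs and their distributions are explicit and the whole analysis is reproducible and testable, which is the kind of risk-reduction argument the abstract makes.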
Computational Analysis of Firms' Organization and Strategic Behaviour
2010
This book addresses possible applications of computer simulation to theory building in management and organizational theory. The key hypothesis is that modelling and computer simulation provide an environment in which to develop, test, and articulate theoretical propositions. In general, computer simulation provides an experimental environment where researchers can play with symbolic representations of phenomena by modifying a model’s structure and activating or deactivating its parameters. This environment allows researchers both to generate hypotheses that explain observed phenomena ex post and to generate distributions of unrealized events ex ante, thereby pointing to areas for further empirical investigation.
From a methodological perspective, the volume investigates the logics and techniques for designing a research strategy grounded in computer simulation. In particular, the articles in the book concentrate on two different techniques, and philosophies, for setting up a simulation study: System Dynamics, which is grounded in differential equations and feedback theory, and agent-based modelling. The book describes how computer simulation helps to investigate research issues typical of strategic management and organizational theory. In this respect, themes such as firms’ diversification strategies, competitive strategy, rivalry, and the impact of role dynamics on organizational performance are explored through the lens of computer simulation models.
Edoardo Mollona received a Ph.D. in Strategic Management from London Business School and is currently an associate professor in the Department of Computer Science at the University of Bologna. He has published articles and books on the application of modelling and simulation to strategic management and organizational theory.
Part 1: Why and How Using Computer Simulation for Theory Development in Social Sciences
1. The Use of Computer Simulation in Strategy and Organization Research (Mollona, E.)
2. Computational Modelling and Social Theory: The Dangers of Numerical Representation (Edmonds, B.)
3. Devices for Theory Development: Why Using Computer Simulation if Mathematical Analysis is Available (Fioresi, R. and E. Mollona)
4. Mix, Chain and Replicate: Methodologies for Agent-Based Modelling of Social Systems (Hales, D.)
Part 2: Computer Simulation for Theory Development in Strategy and Organization Theory
5. The Dynamics of Firm Growth and Resource Sharing in Corporate Diversification (Gary, S.)
6. Revisiting Porter’s Generic Strategies for Competitive Environments using System Dynamics (Kunc, M.)
7. Rivalry and Learning among Clustered and Isolated Firms (Boari, C., Fioretti, G. and Odorici, V.)
8. Organization and Strategy in Banks (Cappellini, A. and A. Raimondi)
9. Changing Roles in Organizations: An Agent-Based Approach (Lamieri, M. and D. Mangalagiu)
10. Rationality Meets the Tribe: Recent Models of Cultural Group Selection (Hales, D.)
Part 3: How to Build Agent-Based Computer Models of Firms
11. An Agent-Based Methodological Framework to Simulate Organizations or the Quest for the Enterprise: jES and jESOF, Java Enterprise Simulator and Java Enterprise Simulator Open Foundation (Terna, P.)
12. From Petri Nets to ABM: The Analysis of the Enterprise's Process to Model the Firm (Ferraris, G.)
Development and validation of the Michigan Chronic Disease Simulation Model (MICROSIM)
2024
Strategies to prevent or delay Alzheimer’s disease and related dementias (AD/ADRD) are urgently needed, and blood pressure (BP) management is a promising strategy. Yet the effects of different BP control strategies across the life course on AD/ADRD are unknown. Randomized trials may be infeasible due to prolonged follow-up and large sample sizes. Simulation analysis is a practical approach to estimating these effects using the best available existing data. However, existing simulation frameworks cannot estimate the effects of BP control on both dementia and cardiovascular disease. This manuscript describes the design principles, implementation details, and population-level validation of a novel population-health microsimulation framework, the MIchigan ChROnic Disease SIMulation (MICROSIM), for The Effect of Lower Blood Pressure over the Life Course on Late-life Cognition in Blacks, Hispanics, and Whites (BP-COG) study of the effect of BP levels over the life course on dementia and cardiovascular disease. MICROSIM is an agent-based Monte Carlo simulation designed using computer programming best practices. MICROSIM estimates annual vascular risk factor levels and transition probabilities for all-cause dementia, stroke, myocardial infarction, and mortality in a nationally representative sample of US adults aged 18+ using the National Health and Nutrition Examination Survey (NHANES). MICROSIM models changes over time in risk factors, cognition, and dementia using a pooled dataset of individual participant data from six US prospective cardiovascular cohort studies. Cardiovascular risks were estimated using a widely used risk model, and BP treatment effects were derived from meta-analyses of randomized trials.
MICROSIM is an extensible, open-source framework designed to estimate the population-level impact of different BP management strategies and reproduces US population-level estimates of BP and other vascular risk factors levels, their change over time, and incident all-cause dementia, stroke, myocardial infarction, and mortality.
Journal Article
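The agent-based Monte Carlo pattern described in the abstract can be caricatured in a few lines. The sketch below uses purely illustrative coefficients: the BP drift, treatment effect, and event-probability model are assumptions for demonstration, not MICROSIM's actual equations.

```python
import random

def run_microsim(n_agents=10_000, years=20, treat_bp=False, seed=1):
    """Toy agent-based Monte Carlo in the spirit of a population
    microsimulation: each agent carries a systolic BP that drifts
    annually, and the yearly event probability rises with BP.
    All coefficients are illustrative, not MICROSIM's."""
    rng = random.Random(seed)
    bp = [rng.gauss(125, 15) for _ in range(n_agents)]
    alive = [True] * n_agents
    events = 0
    for _ in range(years):
        for i in range(n_agents):
            if not alive[i]:
                continue
            bp[i] += rng.gauss(0.5, 2.0)             # annual drift (assumed)
            sbp = bp[i] - 10 if treat_bp else bp[i]  # treatment effect (assumed)
            p_event = max(0.0, 0.001 + 0.0002 * (sbp - 120))
            if rng.random() < p_event:
                events += 1
                alive[i] = False                     # absorbing event state
    return events

print("events, untreated:", run_microsim())
print("events, treated:  ", run_microsim(treat_bp=True))
```

Running the same simulated cohort under different BP management strategies and comparing event counts is the basic population-level comparison such a framework supports.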
Design and Development of Digital Twins: a Case Study in Supply Chains
2020
Digital twin technology consists of creating virtual replicas of objects or processes that simulate the behavior of their real counterparts, with the objective of analyzing that behavior in specific scenarios in order to improve the real system's effectiveness. Applied to products, machines, and even complete business ecosystems, the digital twin model can reveal information from the past, optimize the present, and even predict the future performance of the areas analyzed. In the context of supply chains, digital twins are changing the way companies do business, providing a range of options that facilitate collaborative environments and data-based decision making and make business processes more robust. This paper proposes the design and development of a digital twin for a case study of a pharmaceutical company. The technology used is based on simulators, solvers, and data analytics tools, connected through an integral interface for the company.
Journal Article
Mental representation and the discovery of new strategies
by
Csaszar, Felipe A.
,
Levinthal, Daniel A.
in
Computer simulation
,
Emotions
,
managerial cognition
2016
Research summary: Managers' mental representations affect the perceived payoffs and alternatives that managers consider. Thus, mental representations affect how managers search for profitable strategies as well as the quality of strategies they discover. To study how mental representation and search interact, we formally model the dual search over possible representations and over policy choices of a strategy "landscape." We analyze when it is preferable to emphasize searching for the best policies rather than the best mental representation, and vice versa. We show that, in the long run, a balance between the two search modes not only results in better expected performance, but also reduces the variation in performance. Additionally, the article describes conditions under which increased accuracy of mental representations can actually worsen firm performance. Managerial summary: Managers' mental representations affect the perceived payoffs and alternatives that managers consider. Thus, mental representations affect the quality of strategies managers can discover. We analyze a computer simulation of how managers use mental representations to search for strategies. This sheds light on how managers should deal with the trade-off between searching for policies and searching for representations; that is, whether managers should think creatively about how to represent a strategy problem or whether they should just stick to the current problem understanding, and try to find ways to improve performance as suggested by the current representation. We provide insight regarding the balance between the two search modes and describe conditions under which increasingly accurate mental representations can worsen firm performance.
Journal Article
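The dual search over representations and policies can be sketched as a toy simulation. Everything below (the landscape, the payoff function, the refinement rule, the parameters) is an illustrative assumption, not the authors' actual model: a manager hill-climbs over binary policy choices but only "sees" the dimensions currently in their mental representation.

```python
import random

def dual_search(n_dims=8, steps=200, p_refine=0.2, seed=7):
    """Toy dual search: with probability p_refine the manager refines
    the representation (attends to one more dimension); otherwise they
    do greedy local search over policies, restricted to seen dimensions."""
    rng = random.Random(seed)
    weights = [rng.uniform(0, 1) for _ in range(n_dims)]  # true payoff weights
    policy = [0] * n_dims
    seen = {0}                                   # dimensions in representation

    def payoff(p):
        return sum(w * x for w, x in zip(weights, p))

    for _ in range(steps):
        if rng.random() < p_refine and len(seen) < n_dims:
            seen.add(rng.randrange(n_dims))      # search over representations
        else:
            d = rng.choice(sorted(seen))         # search over policies
            trial = policy[:]
            trial[d] ^= 1                        # flip one visible choice
            if payoff(trial) > payoff(policy):   # greedy local move
                policy = trial
    return payoff(policy), len(seen)

perf, dims_seen = dual_search()
print(f"final payoff {perf:.2f} using {dims_seen}/8 dimensions")
```

Varying `p_refine` trades policy search against representation search, which is the balance the article analyzes.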
Simulating the council-specific impact of anti-malaria interventions: A tool to support malaria strategic planning in Tanzania
by
Runge, Manuela
,
Mohamed, Ally
,
Lengeler, Christian
in
Antimalarials - therapeutic use
,
Biology and Life Sciences
2020
The decision-making process for malaria control and elimination strategies has become more challenging. Interventions need to be targeted at council level to allow for changing malaria epidemiology and an increase in the number of possible interventions. Models of malaria dynamics can support this process by simulating potential impacts of multiple interventions in different settings and determining appropriate packages of interventions for meeting specific expected targets.
The OpenMalaria model of malaria dynamics was calibrated for all 184 councils in mainland Tanzania using data from malaria indicator surveys, school parasitaemia surveys, entomological surveillance, and vector control deployment data. The simulations were run for different transmission intensities per region and five interventions, currently or potentially included in the National Malaria Strategic Plan, individually and in combination. The simulated prevalences were fitted to council-specific prevalences derived from geostatistical models to obtain council-specific predictions of the prevalence and number of cases between 2017 and 2020. The predictions were used to evaluate in silico the feasibility of the national target of reaching a prevalence below 1% by 2020, and to suggest alternative intervention stratifications for the country.
The historical prevalence trend was fitted for each council with an agreement of 87% in 2016 (95% CI: 0.84-0.90) and an agreement of 90% for the historical trend (2003-2016) (95% CI: 0.87-0.93). The current national malaria strategy was expected to reduce the malaria prevalence between 2016 and 2020 on average by 23.8% (95% CI: 19.7%-27.9%) if current case management levels were maintained, and by 52.1% (95% CI: 48.8%-55.3%) if case management were improved. Insecticide-treated nets and case management were the most cost-effective interventions, expected to reduce the prevalence by 25.0% (95% CI: 19.7%-30.2%) and to avert 37 million cases between 2017 and 2020. Mass drug administration was included in most councils in the stratification selected for meeting the national target at minimal costs, expected to reduce the prevalence by 77.5% (95% CI: 70.5%-84.5%) and to avert 102 million cases, at almost twice the cost of the current national strategy. In summary, the model suggested that current interventions are not sufficient to reach the national aim of a prevalence of less than 1% by 2020, that a revised strategic plan needs to consider additional, more effective interventions, especially in high-transmission areas, and that the targets need to be revisited.
The methodology reported here is based on intensive interactions with the NMCP and provides a helpful tool for assessing the feasibility of country specific targets and for determining which intervention stratifications at sub-national level will have most impact. This country-led application could support strategic planning of malaria control in many other malaria endemic countries.
Journal Article
The Land Use Model Intercomparison Project (LUMIP) contribution to CMIP6: rationale and experimental design
by
Seneviratne, Sonia I
,
Jones, Chris D
,
Jones, Andrew D
in
20th century
,
Atmospheric models
,
Biogeochemical cycles
2016
Human land-use activities have resulted in large changes to the Earth's surface, with resulting implications for climate. In the future, land-use activities are likely to expand and intensify further to meet growing demands for food, fiber, and energy. The Land Use Model Intercomparison Project (LUMIP) aims to further advance understanding of the impacts of land-use and land-cover change (LULCC) on climate, specifically addressing the following questions. (1) What are the effects of LULCC on climate and biogeochemical cycling (past-future)? (2) What are the impacts of land management on surface fluxes of carbon, water, and energy, and are there regional land-management strategies with the promise to help mitigate climate change? In addressing these questions, LUMIP will also address a range of more detailed science questions to get at process-level attribution, uncertainty, data requirements, and other related issues in more depth and sophistication than possible in a multi-model context to date. There will be particular focus on the separation and quantification of the effects on climate from LULCC relative to all forcings, separation of biogeochemical from biogeophysical effects of land use, the unique impacts of land-cover change vs. land-management change, modulation of land-use impact on climate by land-atmosphere coupling strength, and the extent to which impacts of enhanced CO2 concentrations on plant photosynthesis are modulated by past and future land use.
LUMIP involves three major sets of science activities: (1) development of an updated and expanded historical and future land-use data set, (2) an experimental protocol for specific LUMIP experiments for CMIP6, and (3) definition of metrics and diagnostic protocols that quantify model performance, and related sensitivities, with respect to LULCC. In this paper, we describe LUMIP activity (2), i.e., the LUMIP simulations that will formally be part of CMIP6.
These experiments are explicitly designed to be complementary to simulations requested in the CMIP6 DECK and historical simulations and other CMIP6 MIPs including ScenarioMIP, C4MIP, LS3MIP, and DAMIP. LUMIP includes a two-phase experimental design. Phase one features idealized coupled and land-only model simulations designed to advance process-level understanding of LULCC impacts on climate, as well as to quantify model sensitivity to potential land-cover and land-use change. Phase two experiments focus on quantification of the historic impact of land use and the potential for future land management decisions to aid in mitigation of climate change. This paper documents these simulations in detail, explains their rationale, outlines plans for analysis, and describes a new subgrid land-use tile data request for selected variables (reporting model output data separately for primary and secondary land, crops, pasture, and urban land-use types). It is essential that modeling groups participating in LUMIP adhere to the experimental design as closely as possible and clearly report how the model experiments were executed.
Journal Article
Data-driven smart sustainable cities of the future: urban computing and intelligence for strategic, short-term, and joined-up planning
2021
Sustainable cities are quintessential complex systems: dynamically changing environments developed through a multitude of individual and collective decisions from the bottom up to the top down. As such, they are full of contestations, conflicts, and contingencies that are not easily captured, steered, and predicted, respectively. In short, they are characterized by wicked problems. Therefore, they are increasingly embracing and leveraging what smart cities have to offer in terms of big data technologies and their novel applications in a bid to effectively tackle the complexities they inherently embody and to monitor, evaluate, and improve their performance with respect to sustainability, under what has been termed "data-driven smart sustainable cities." This paper analyzes and discusses the enabling role and innovative potential of urban computing and intelligence in the strategic, short-term, and joined-up planning of data-driven smart sustainable cities of the future. Further, it devises an innovative framework for urban intelligence and planning functions as an advanced form of decision support. This study expands on prior work done to develop a novel model for data-driven smart sustainable cities of the future. I argue that the fast-flowing torrent of urban data, coupled with its analytical power, is of crucial importance to the effective planning and efficient design of this integrated model of urbanism. This is enabled by the kind of data-driven and model-driven decision support systems associated with urban computing and intelligence. The novelty of the proposed framework lies in its essential technological and scientific components and the way in which these are coordinated and integrated given their clear synergies to enable urban intelligence and planning functions.
These utilize, integrate, and harness complexity science, urban complexity theories, sustainability science, urban sustainability theories, urban science, data science, and data-intensive science in order to fashion powerful new forms of simulation models and optimization methods. These in turn generate optimal designs and solutions that improve sustainability, efficiency, resilience, equity, and life quality. This study contributes to understanding and highlighting the value of big data in regard to the planning and design of sustainable cities of the future.
Journal Article
Unraveling the causes of the Seoul Halloween crowd-crush disaster
2024
As the world steadily recovers from the COVID-19 pandemic, managing large gatherings has become a critical concern for ensuring crowd safety. The crowd-crush disaster in Seoul in 2022 highlights the need for effective predictive crowd management techniques. In this study, an empirical analysis of the incident is conducted using data from various sources, and model-based simulations are created to replicate hazardous crowd conditions in high-risk areas. In the empirical analysis, mobile device data indicate a significant increase in population above normal levels in the disaster area just hours before the incident occurred. In the simulations, a hydrodynamic model is employed to simulate a bidirectional collision, which quantitatively demonstrates that the average density during the crush reached 7.57 ped/m² (maximum 9.95 ped/m²). Additionally, the average crowd pressure peaked at 1,063 N/m (maximum 1,961 N/m), and the maximum velocity entropy was 10.99. Based on these findings, it can be concluded that the primary causes of the disaster were the substantial population, the bidirectional collision, and escalating panic. The results of controlled simulations under various management strategies are then presented, showing that effective crowd management techniques can enhance crowd safety, as assessed through quantitative comparisons of these key indicators.
Journal Article
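The density indicator reported in the abstract (pedestrians per square metre) can be sketched as a simple grid count over pedestrian positions. The positions below are made up for illustration, not data from the incident.

```python
from collections import Counter

def cell_densities(positions, cell=1.0):
    """Grid-based crowd density: bin (x, y) positions (in metres) into
    cell x cell squares; each occupied cell's count divided by its area
    gives a local density in ped/m^2. Returns (mean, max) over
    occupied cells."""
    counts = Counter((int(x // cell), int(y // cell)) for x, y in positions)
    area = cell * cell
    dens = [c / area for c in counts.values()]
    return sum(dens) / len(dens), max(dens)

# 30 hypothetical pedestrians packed into a 2 m x 2 m patch
pts = [(0.1 * i % 2.0, 0.07 * i % 2.0) for i in range(30)]
mean_d, max_d = cell_densities(pts)
print(f"mean {mean_d:.1f} ped/m^2, max {max_d:.1f} ped/m^2")
```

Hydrodynamic models and pressure estimates are far more involved, but monitoring a density indicator of this kind against thresholds is the basic idea behind the predictive management the paper advocates.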
GloFAS – global ensemble streamflow forecasting and flood early warning
2013
Anticipation and preparedness for large-scale flood events have a key role in mitigating their impact and optimizing the strategic planning of water resources. Although several developed countries have well-established systems for river monitoring and flood early warning, the figures for populations affected every year by floods in developing countries are unsettling. This paper presents the Global Flood Awareness System (GloFAS), which has been set up to provide an overview of upcoming floods in large world river basins. GloFAS is based on distributed hydrological simulation of numerical ensemble weather predictions with global coverage. Streamflow forecasts are compared statistically to climatological simulations to detect probabilistic exceedance of warning thresholds. In this article, the system setup is described, together with an evaluation of its performance over a two-year test period and a qualitative analysis of a case study of the Pakistan floods in summer 2010. It is shown that hazardous events in large river basins can be skilfully detected with a forecast horizon of up to one month. In addition, results suggest that an accurate simulation of initial model conditions and an improved parameterization of the hydrological model are key components for accurately reproducing the streamflow variability across the many different runoff regimes of the Earth.
Journal Article
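The probabilistic threshold-exceedance check described in the GloFAS abstract reduces, at its core, to counting ensemble members above a warning level. A minimal sketch, with hypothetical streamflow values and an assumed threshold:

```python
def exceedance_probability(ensemble, threshold):
    """Fraction of ensemble streamflow forecasts exceeding a warning
    threshold -- the basic probabilistic-exceedance check behind
    ensemble flood early warning. Values are in m^3/s."""
    return sum(q > threshold for q in ensemble) / len(ensemble)

# 10 hypothetical ensemble members for one river pixel and lead time
members = [850, 1210, 990, 1430, 1120, 760, 1580, 1040, 1310, 900]
warning_level = 1100.0   # assumed climatological warning threshold
p = exceedance_probability(members, warning_level)
print(f"P(exceed warning level) = {p:.1f}")   # → 0.5
```

In an operational system the threshold itself would be derived from long climatological simulations (e.g. a high return-period flow) rather than fixed by hand.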