47 results for "Gutfraind, Alexander"
Defining the analytical complexity of decision problems under uncertainty based on their pivotal properties
Uncertainty poses a pervasive challenge in decision analysis and risk management. When the problem is poorly understood, probabilistic estimation exhibits high variability and bias. Analysts then utilize various strategies to find satisficing solutions, and these strategies can sometimes adequately address even highly complex problems. Previous literature proposed a hierarchy of uncertainty, but did not develop a quantitative score of analytical complexity. In order to develop such a score, this study reviewed over 90 strategies to cope with uncertainty, including methods utilized by expert decision-makers such as engineers, military planners and others. It found that many decision problems have pivotal properties that enable their solution despite uncertainty, including a small action space, reversibility, and others. The analytical complexity score of a problem could then be defined based on the availability of these properties.
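
A minimal sketch of how such a property-based score might be computed. The property list and uniform weights below are illustrative assumptions; the paper's actual rubric is not given in the abstract.

```python
# Hypothetical sketch of a property-based complexity score. The
# property names and uniform weights are assumptions for illustration,
# not the paper's actual rubric.

PIVOTAL_PROPERTIES = {
    "small_action_space": 1.0,   # few alternatives to compare
    "reversibility": 1.0,        # decisions can be undone cheaply
    "observability": 1.0,        # outcomes can be monitored
    "decomposability": 1.0,      # problem splits into subproblems
}

def analytical_complexity(available):
    """Higher = harder: sum the weights of the pivotal properties
    that are *missing* from the problem at hand."""
    return sum(weight for name, weight in PIVOTAL_PROPERTIES.items()
               if name not in available)

# A reversible problem with a small action space scores as easier:
print(analytical_complexity({"small_action_space", "reversibility"}))  # 2.0
```
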
Optimizing Topological Cascade Resilience Based on the Structure of Terrorist Networks
Complex socioeconomic networks such as information, finance and even terrorist networks need resilience to cascades: the failure of a single node must not cause a far-reaching domino effect. We show that terrorist and guerrilla networks are uniquely cascade-resilient while maintaining high efficiency, but they become more vulnerable beyond a certain threshold. We also introduce an optimization method for constructing networks with high passive cascade resilience. The optimal networks are found to be based on cells, where each cell has a star topology. Counterintuitively, we find that there are conditions where networks should not be modified to stop cascades because doing so would come at a disproportionate loss of efficiency. Implementation of these findings can lead to more cascade-resilient networks in many diverse areas.
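
A hedged sketch of the cell-based, star-topology design the abstract describes. The fractional-threshold cascade rule (a node fails once at least half its neighbors have failed) is a generic model assumed for illustration; the paper's exact dynamics are not given here.

```python
# Build cells (stars) whose hubs form a ring, then run a generic
# fractional-threshold cascade. The threshold rule is an assumption,
# not taken from the paper.
import networkx as nx

def cell_network(n_cells=10, cell_size=5):
    g = nx.Graph()
    hubs = [f"hub{c}" for c in range(n_cells)]
    for c, hub in enumerate(hubs):
        for i in range(cell_size - 1):            # star: hub plus leaves
            g.add_edge(hub, f"leaf{c}_{i}")
    for a, b in zip(hubs, hubs[1:] + hubs[:1]):   # ring connecting the hubs
        g.add_edge(a, b)
    return g

def cascade_size(g, seed, threshold=0.5):
    failed, changed = {seed}, True
    while changed:
        changed = False
        for v in g:
            if v not in failed:
                nbrs = list(g[v])
                if sum(n in failed for n in nbrs) / len(nbrs) >= threshold:
                    failed.add(v)
                    changed = True
    return len(failed)

g = cell_network()
print(cascade_size(g, "leaf0_0"))   # 1: a leaf failure stays contained
print(cascade_size(g, "hub0"))      # 5: a hub failure takes only its own cell
```

The containment behavior illustrates why cellular designs resist cascades: each cell's hub insulates the rest of the network from failures inside the cell.
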
Operational resilience: concepts, design and analysis
Building resilience into today’s complex infrastructures is critical to the daily functioning of society and its ability to withstand and recover from natural disasters, epidemics and cyber-threats. This study proposes quantitative measures that capture and implement the definition of engineering resilience advanced by the National Academy of Sciences. The approach is applicable across physical, information and social domains. It evaluates the critical functionality, defined as a performance function of time set by the stakeholders. Critical functionality is a source of valuable information, such as the integrated system resilience over a time interval and its robustness. The paper demonstrates the formulation on two classes of models: 1) multi-level directed acyclic graphs and 2) interdependent coupled networks. For both models, synthetic case studies are used to explore trends. For the first class, the approach is also applied to the Linux operating system. Results indicate that desired resilience and robustness levels are achievable by trading off different design parameters, such as redundancy, node recovery time and available backup supply. The nonlinear relationship between network parameters and resilience levels confirms the utility of the proposed approach, which is of benefit to analysts and designers of complex systems and networks.
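
A worked sketch of the time-integral reading of resilience described above: critical functionality CF(t) is a stakeholder-defined performance curve, resilience over a control window is its normalized integral, and robustness is its worst-case value. The disruption-and-recovery curve below is synthetic, chosen only for illustration.

```python
# Synthetic critical-functionality curve: full service, a disruption
# that drops CF to 0.4, then a linear recovery back to nominal.
import numpy as np

t = np.linspace(0, 100, 1001)               # control window [0, T]
cf = np.ones_like(t)                        # nominal functionality = 1
cf[(t >= 20) & (t < 40)] = 0.4              # disruption
rec = (t >= 40) & (t < 60)
cf[rec] = 0.4 + 0.6 * (t[rec] - 40) / 20    # linear recovery

resilience = cf.mean()   # uniform grid: mean = (1/T) * integral of CF
robustness = cf.min()    # worst-case functionality over the window
print(f"resilience = {resilience:.3f}, robustness = {robustness:.2f}")  # ~0.82, 0.40
```
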
Modeling hepatitis C micro-elimination among people who inject drugs with direct-acting antivirals in metropolitan Chicago
Hepatitis C virus (HCV) infection is a leading cause of chronic liver disease and mortality worldwide. Direct-acting antiviral (DAA) therapy leads to high cure rates. However, persons who inject drugs (PWID) are at risk for reinfection after cure and may require multiple DAA treatments to reach the World Health Organization’s (WHO) goal of HCV elimination by 2030. Using an agent-based model (ABM) that accounts for the complex interplay of demographic factors, risk behaviors, social networks, and geographic location for HCV transmission among PWID, we examined the combination(s) of DAA enrollment (2.5%, 5%, 7.5%, 10%), adherence (60%, 70%, 80%, 90%) and frequency of DAA treatment courses needed to achieve the WHO’s goal of reducing incident chronic infections by 90% by 2030 among a large population of PWID from Chicago, IL and surrounding suburbs. We also estimated the economic DAA costs associated with each scenario. Our results indicate that a DAA treatment rate of >7.5% per year with 90% adherence results in 75% of enrolled PWID requiring only a single DAA course; however, 19% would require 2 courses, 5% would require 3, and <2% would require 4, with an overall DAA cost of $325 million to achieve the WHO goal in metropolitan Chicago. We estimate a 28% increase in the overall DAA cost under low adherence (70%) compared to high adherence (90%). Our modeling results have important public health implications for HCV elimination among U.S. PWID. Using a range of feasible treatment enrollment and adherence rates, we report robust findings supporting the need to address re-exposure and reinfection among PWID to reduce HCV incidence.
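
A back-of-envelope sketch of the course arithmetic reported above. The course-share distribution comes from the abstract (taking "<2%" as 1% so the shares sum to 1); the per-course price and treated-population size are assumptions for illustration only, since the abstract reports just the $325 million total.

```python
# Expected DAA courses per treated PWID, from the shares quoted in the
# abstract; "<2%" is taken as 1% so the shares sum to 1 (an assumption).
course_shares = {1: 0.75, 2: 0.19, 3: 0.05, 4: 0.01}
mean_courses = sum(k * s for k, s in course_shares.items())
print(f"mean courses per treated PWID: {mean_courses:.2f}")   # ~1.32

# Hypothetical totals: both numbers below are assumed, not from the paper.
price_per_course = 25_000      # assumed DAA price per course (USD)
treated = 10_000               # assumed number of PWID ever treated
print(f"total DAA cost: ${treated * mean_courses * price_per_course / 1e6:.0f}M")
```
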
Mathematical Modeling of Hepatitis C Prevalence Reduction with Antiviral Treatment Scale-Up in Persons Who Inject Drugs in Metropolitan Chicago
New direct-acting antivirals (DAAs) provide an opportunity to combat hepatitis C virus (HCV) infection in persons who inject drugs (PWID). Here we use a mathematical model to predict the impact of a DAA-treatment scale-up on HCV prevalence among PWID and the estimated cost in metropolitan Chicago. To estimate the HCV antibody and HCV-RNA (chronic infection) prevalence among the metropolitan Chicago PWID population, we used empirical data from three large epidemiological studies. The cost of DAAs is assumed to be $50,000 per person. Approximately 32,000 PWID reside in metropolitan Chicago with an estimated HCV-RNA prevalence of 47%, or 15,040 cases. Approximately 22,000 PWID (69% of the total PWID population) attend harm reduction (HR) programs, such as syringe exchange programs, and have an estimated HCV-RNA prevalence of 30%. There are about 11,000 young PWID (<30 years old) with an estimated HCV-RNA prevalence of 10% (PWID in these two subpopulations overlap). The model suggests that the following treatment scale-up is needed to reduce the baseline HCV-RNA prevalence by one-half over 10 years of treatment [cost per year, min-max in millions]: 35 per 1,000 [$50-$77] in the overall PWID population, 19 per 1,000 [$20-$26] for persons in HR programs, and 5 per 1,000 [$3-$4] for young PWID. Treatment scale-up could dramatically reduce the prevalence of chronic HCV infection among PWID in Chicago, who are the main reservoir for ongoing HCV transmission. Focusing treatment on PWID attending HR programs and/or young PWID could have a significant impact on HCV prevalence in these subpopulations at an attainable cost.
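
A quick consistency check of the headline scale-up figure, using only numbers quoted in the abstract (32,000 PWID, a rate of 35 per 1,000 per year, $50,000 per course):

```python
# Numbers below are taken directly from the abstract.
pwid = 32_000                   # estimated PWID in metropolitan Chicago
rate_per_1000 = 35              # annual treatment rate, overall population
cost_per_course = 50_000        # assumed DAA cost per person (USD)

treated_per_year = pwid * rate_per_1000 / 1000     # 1,120 treatments/year
annual_cost = treated_per_year * cost_per_course   # $56M/year
print(f"{treated_per_year:.0f} treatments/yr, ${annual_cost / 1e6:.0f}M/yr")
# $56M/yr falls inside the reported $50-$77M per-year range; the spread
# reflects the model's uncertainty, not this simple product.
```
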
Effectiveness of isolation policies in schools: evidence from a mathematical model of influenza and COVID-19
Non-pharmaceutical interventions such as social distancing, school closures and travel restrictions are often implemented to control outbreaks of infectious diseases. For influenza in schools, the Centers for Disease Control and Prevention (CDC) recommends that febrile students remain isolated at home until they have been fever-free for at least one day, and a related policy is recommended for SARS-CoV-2 (COVID-19). Other authors proposed using a school week of four or fewer days of in-person instruction for all students to reduce transmission. However, there is limited evidence supporting the effectiveness of these interventions. We introduced a mathematical model of school outbreaks that considers both intervention methods. Our model accounts for the school structure and schedule, as well as the time-progression of fever symptoms and viral shedding. The model was validated on outbreaks of seasonal and pandemic influenza and COVID-19 in schools. It was then used to estimate the outbreak curves and the proportion of the population infected (attack rate) under the proposed interventions. For influenza, the CDC-recommended one day of post-fever isolation can reduce the attack rate by a median (interquartile range) of 29 (13-59)%. With 2 days of post-fever isolation, the attack rate could be reduced by 70 (55-85)%. Alternatively, shortening the school week to 4 and 3 days reduces the attack rate by 73 (64-88)% and 93 (91-97)%, respectively. For COVID-19, the post-fever isolation policy was found to be less effective, reducing the attack rate by 10 (5-17)% for a 2-day isolation policy and by 14 (5-26)% for a 14-day policy. A 4-day school week would reduce the median attack rate in a COVID-19 outbreak by 57 (52-64)%, while a 3-day school week would reduce it by 81 (79-83)%. In both infections, shortening the school week significantly reduced the duration of outbreaks. Shortening the school week could be an important tool for controlling influenza and COVID-19 in schools and similar settings. Additionally, the CDC-recommended post-fever isolation policy for influenza could be enhanced by requiring two days of isolation instead of one.
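
A minimal agent-based sketch of the post-fever isolation mechanism: infected students shed virus for a few days, run a fever, and stay home until `isolation_days` after the fever ends. The transmission probability and all durations below are illustrative placeholders, not the paper's fitted values.

```python
# A toy stochastic school outbreak with post-fever isolation. Students
# shed through day 5 after infection, run a fever on days 1-2, and stay
# home until `isolation_days` after the fever ends. beta and all
# durations are placeholders, not fitted values.
import random

def simulate(n=100, days=60, beta=0.01, shed_end=5,
             fever_start=1, fever_end=3, isolation_days=1, rng=None):
    rng = rng or random.Random(0)
    infected_on = {0: 0}                      # student index -> infection day
    for day in range(1, days):
        def at_school(i):
            if i not in infected_on:
                return True
            a = day - infected_on[i]          # days since infection
            return a < fever_start or a >= fever_end + isolation_days
        shedders = sum(1 for i in infected_on
                       if day - infected_on[i] <= shed_end and at_school(i))
        p = 1 - (1 - beta) ** shedders        # per-susceptible daily risk
        for i in range(n):
            if i not in infected_on and rng.random() < p:
                infected_on[i] = day
    return len(infected_on) / n               # attack rate

rng = random.Random(42)
for iso in (0, 1, 2):
    rates = [simulate(isolation_days=iso, rng=rng) for _ in range(50)]
    print(f"{iso} day(s) of post-fever isolation: "
          f"mean attack rate {sum(rates) / len(rates):.2f}")
```

Longer isolation shortens each student's infectious time at school, so the mean attack rate falls as `isolation_days` grows, mirroring the qualitative result reported for influenza.
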
Agent-Based Model Forecasts Aging of the Population of People Who Inject Drugs in Metropolitan Chicago and Changing Prevalence of Hepatitis C Infections
People who inject drugs (PWID) are at high risk for blood-borne pathogens transmitted during the sharing of contaminated injection equipment, particularly hepatitis C virus (HCV). HCV prevalence is influenced by a complex interplay of drug-use behaviors, social networks, and geography, as well as the availability of interventions, such as needle exchange programs. To adequately address this complexity in HCV epidemic forecasting, we have developed a computational model, the Agent-based Pathogen Kinetics model (APK). APK simulates the PWID population in metropolitan Chicago, including the social interactions that result in HCV infection. We used multiple empirical data sources on Chicago PWID to build a spatial distribution of an in silico PWID population and modeled networks among the PWID by considering the geography of the city and its suburbs. APK was validated against 2012 empirical data (the latest available) and shown to agree with network and epidemiological surveys to within 1%. For the period 2010-2020, APK forecasts a decline in HCV prevalence of 0.8% per year from 44(±2)% to 36(±5)%, although some sub-populations would continue to have relatively high prevalence, including non-Hispanic Blacks, 48(±5)%. The rate of decline will be lowest in non-Hispanic Whites and we find, in a reversal of historical trends, that incidence among non-Hispanic Whites would exceed incidence among non-Hispanic Blacks (0.66 per 100 person-years vs 0.17 per 100 person-years). APK also forecasts an increase in PWID mean age from 35(±1) to 40(±2), with a corresponding increase from 59(±2)% to 80(±6)% in the proportion of the population >30 years old. Our studies highlight the importance of analyzing subpopulations in disease predictions and the utility of computer simulation for analyzing demographic and health trends among PWID, and they serve as a tool for guiding intervention and prevention strategies in Chicago and other major cities.
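
A one-line consistency check of the forecast endpoints quoted above: a 0.8 percentage-point annual decline sustained over the 10-year window matches the move from 44% to 36%.

```python
# The quoted endpoints imply the quoted rate of decline exactly.
start, end, years = 44.0, 36.0, 10
print((start - end) / years)   # 0.8 percentage points per year
```
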
Generating realistic scaled complex networks
Research on generative models plays a central role in the emerging field of network science, studying how statistical patterns found in real networks could be generated by formal rules. Output from these generative models is then the basis for designing and evaluating computational methods on networks, including verification and simulation studies. During the last two decades, a variety of models has been proposed with the ultimate goal of achieving comprehensive realism for the generated networks. In this study, we (a) introduce a new generator, termed ReCoN; (b) explore how ReCoN and some existing models can be fitted to an original network to produce a structurally similar replica; (c) use ReCoN to produce networks much larger than the original exemplar; and finally (d) discuss open problems and promising research directions. In a comparative experimental study, we find that ReCoN is often superior to many other state-of-the-art network generation methods. We argue that ReCoN is a scalable and effective tool for modeling a given network while preserving important properties at both micro- and macroscopic scales, and for scaling the exemplar data by orders of magnitude in size.
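
ReCoN itself is not reproduced here; the sketch below only illustrates the fit-replicate-compare workflow the abstract describes, using a standard networkx generator as a stand-in and comparing a few summary statistics between the exemplar and a 4x-scaled replica.

```python
# A standard generator stands in for ReCoN: fit simple statistics of
# an exemplar graph, generate a 4x-scaled replica, and compare.
import networkx as nx

original = nx.karate_club_graph()      # stand-in exemplar network
n = original.number_of_nodes()
m = original.number_of_edges()

scale = 4
replica = nx.barabasi_albert_graph(n * scale, max(1, m // n))

for name, g in (("original", original), ("4x replica", replica)):
    mean_deg = 2 * g.number_of_edges() / g.number_of_nodes()
    print(f"{name}: n={g.number_of_nodes()}, "
          f"mean degree={mean_deg:.2f}, "
          f"clustering={nx.average_clustering(g):.3f}")
```

A generator like ReCoN aims to preserve many more properties than this crude degree match (community structure, clustering, distance distributions); the point here is only the evaluation loop, not the fitting method.
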
Multiscale planar graph generation
The study of network representations of physical, biological, and social phenomena can help us better understand their structure and functional dynamics as well as formulate predictive models of these phenomena. However, real-world network data are scarce, owing to the cost and effort of collecting them and their sensitivity to theft and misuse, so engineers and researchers often rely on synthetic data for simulations, hypothesis testing, decision making, and algorithm engineering. An important characteristic of infrastructure networks such as roads, water distribution and other utility systems is that they can be (almost fully) embedded in a plane; therefore, to simulate these systems we need realistic networks that are also planar. While currently available synthetic network generators can model networks that exhibit realism, they do not guarantee or achieve planarity. In this paper we present a flexible algorithm that can synthesize realistic networks that are planar. The method follows a multi-scale randomized editing approach, generating a hierarchy of coarsened networks of a given planar graph and introducing edits at various levels in the hierarchy. The method preserves the structural properties with minimal bias, including the planarity of the network, while introducing realistic variability at multiple scales. Reproducibility: all datasets and the algorithm implementation presented in this work are available at https://bit.ly/2CjOUAS
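
A sketch of the edit-acceptance step implied above: propose random edge rewires and keep only those that preserve planarity, here via networkx's planarity test. The multiscale coarsening hierarchy of the actual method is omitted; this shows only the planarity-preserving editing idea.

```python
# Randomly rewire edges of a planar graph, accepting an edit only if
# the graph stays planar. Keeps the edge count fixed.
import random
import networkx as nx

def planar_edits(g, n_edits=20, rng=None):
    rng = rng or random.Random(0)
    g = g.copy()
    nodes = list(g)
    done = 0
    while done < n_edits:
        u, v = rng.choice(list(g.edges()))
        a, b = rng.sample(nodes, 2)
        if g.has_edge(a, b):
            continue
        g.remove_edge(u, v)
        g.add_edge(a, b)
        if nx.check_planarity(g)[0]:
            done += 1                   # keep the planar rewire
        else:
            g.remove_edge(a, b)         # revert: edit broke planarity
            g.add_edge(u, v)
    return g

grid = nx.grid_2d_graph(6, 6)           # a planar, road-like exemplar
edited = planar_edits(grid)
print(nx.check_planarity(edited)[0],    # True: planarity preserved
      edited.number_of_edges() == grid.number_of_edges())
```
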
Integrating evidence, models and maps to enhance Chagas disease vector surveillance
Until recently, the Chagas disease vector, Triatoma infestans, was widespread in Arequipa, Perú, but as a result of a decades-long campaign in which over 70,000 houses were treated with insecticides, infestation prevalence is now greatly reduced. To monitor for T. infestans resurgence, the city is currently in a surveillance phase in which a sample of houses is selected for inspection each year. Despite extensive data from the control campaign that could be used to inform surveillance, the selection of houses to inspect is often carried out haphazardly or by convenience. Therefore, we asked, how can we enhance efforts toward preventing T. infestans resurgence by creating the opportunity for vector surveillance to be informed by data? To this end, we developed a mobile app that provides vector infestation risk maps generated with data from the control campaign run in a predictive model. The app is intended to enhance vector surveillance activities by giving inspectors the opportunity to incorporate the infestation risk information into their surveillance activities, but it does not dictate which houses to surveil. Therefore, a critical question becomes, will inspectors use the risk information? To answer this question, we ran a pilot study in which we compared surveillance using the app to the current practice (paper maps). We hypothesized that inspectors would use the risk information provided by the app, as measured by the frequency of higher risk houses visited, and qualitative analyses of inspector movement patterns in the field. We also compared the efficiency of both mediums to identify factors that might discourage risk information use. Over the course of ten days (five with each medium), 1,081 houses were visited using the paper maps, of which 366 (34%) were inspected, while 1,038 houses were visited using the app, with 401 (39%) inspected. Five out of eight inspectors (62.5%) visited more higher risk houses when using the app (Fisher's exact test, p < 0.001). Among all inspectors, there was an upward shift in proportional visits to higher risk houses when using the app (Mantel-Haenszel test, common odds ratio (OR) = 2.42, 95% CI 2.00-2.92), and in a second analysis using generalized linear mixed models, app use increased the odds of visiting a higher risk house 2.73-fold (95% CI 2.24-3.32), suggesting that the risk information provided by the app was used by most inspectors. Qualitative analyses of inspector movement revealed indications of risk information use in seven out of eight (87.5%) inspectors. There was no difference between the app and paper maps in the number of houses visited (paired t-test, p = 0.67) or inspected (p = 0.17), suggesting that app use did not reduce surveillance efficiency. Without staying vigilant to remaining and re-emerging vector foci following a vector control campaign, disease transmission eventually returns and progress achieved is reversed. Our results suggest that, when provided the opportunity, most inspectors will use risk information to direct their surveillance activities, at least over the short term. The study is an initial, but key, step toward evidence-based vector surveillance.