Catalogue Search | MBRL
Explore the vast range of titles available.
15,673 result(s) for "process monitoring"
Election watchdogs : transparency, accountability and integrity
"Recent years have seen resurgent interest in the potential capacity of transparency - the public availability of information - to improve democratic governance. Timely, accurate, granular and freely-available information is generally regarded as intrinsically valuable, as well as having many instrumental benefits. In development, transparency and accountability are generally thought to help plug the leaky pipes of corruption and inefficiency, channel public spending more efficiently, and produce better services. In the field of electoral governance, openness about the rules and procedures, outcomes, and decision processes used by electoral authorities is widely assumed to build public trust, improve policy-making, and facilitate accountability. In the age of WikiLeaks, Twitter and Google, open governance, expanding information and communication, often seems like an unqualified good. Nevertheless, beyond popular buzzword sloganeering, evidence suggests that the impact of transparency on the quality of governance and elections remains mixed. Transparency also has a dark side, threatening trust, privacy, and security. To understand these issues more fully, this book seeks to assess the contemporary drive towards open electoral governance and to identify several conditions predicted to determine the success of transparency policies in strengthening electoral integrity. Chapters look at transparency in electoral governance at the international and state levels, as well as within civil society" -- Provided by publisher.
Fire now, fire later: alarm-based systems for prescriptive process monitoring
by
Fahrenkrog-Petersen, Stephan A
,
Weidlich, Matthias
,
Maggi, Fabrizio Maria
in
Alarms
,
Customer relationship management
,
Enterprise resource planning
2022
Predictive process monitoring is a family of techniques to analyze events produced during the execution of a business process in order to predict the future state or the final outcome of running process instances. Existing techniques in this field are able to predict, at each step of a process instance, the likelihood that it will lead to an undesired outcome. These techniques, however, focus on generating predictions and do not prescribe when and how process workers should intervene to decrease the cost of undesired outcomes. This paper proposes a framework for prescriptive process monitoring, which extends predictive monitoring with the ability to generate alarms that trigger interventions to prevent an undesired outcome or mitigate its effect. The framework incorporates a parameterized cost model to assess the cost–benefit trade-off of generating alarms. We show how to optimize the generation of alarms given an event log of past process executions and a set of cost model parameters. The proposed approaches are empirically evaluated using a range of real-life event logs. The experimental results show that the net cost of undesired outcomes can be minimized by changing the threshold for generating alarms, as the process instance progresses. Moreover, introducing delays for triggering alarms, instead of triggering them as soon as the probability of an undesired outcome exceeds a threshold, leads to lower net costs.
Journal Article
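The cost-benefit trade-off described in the abstract above can be illustrated with a minimal sketch: an alarm fires once the predicted probability of an undesired outcome exceeds a threshold, and the threshold is tuned to minimize the total net cost over a log of past cases. All function names and cost parameters below are invented for illustration; this is not the authors' framework.

```python
# Minimal sketch of an alarm-based cost model for prescriptive
# process monitoring. Cost values are illustrative only.

def net_cost(undesired_prob, threshold,
             cost_intervention=10.0, cost_undesired=100.0,
             mitigation=0.8):
    """Expected net cost of a single running case.

    undesired_prob : predicted probability of an undesired outcome
    threshold      : alarm fires once the probability exceeds it
    mitigation     : fraction of the outcome cost avoided by intervening
    """
    if undesired_prob > threshold:
        # Alarm fires: pay for the intervention, mitigate the outcome cost.
        return cost_intervention + (1 - mitigation) * cost_undesired * undesired_prob
    # No alarm: bear the full expected cost of the undesired outcome.
    return cost_undesired * undesired_prob

def best_threshold(probs, candidates):
    """Pick the threshold minimizing total net cost over an event log."""
    return min(candidates, key=lambda t: sum(net_cost(p, t) for p in probs))

probs = [0.05, 0.2, 0.45, 0.7, 0.9]      # one prediction per past case
thresholds = [0.1, 0.3, 0.5, 0.7, 0.9]
print(best_threshold(probs, thresholds))  # prints 0.1
```

With these (invented) costs, intervening is cheap relative to the outcome, so a low threshold wins; raising the intervention cost pushes the optimum upward, which is the trade-off the paper's parameterized cost model captures.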
Monitoring online biomass with a capacitance sensor during scale-up of industrially relevant CHO cell culture fed-batch processes in single-use bioreactors
2020
In 2004, the FDA published a guideline to implement process analytical technologies (PAT) in biopharmaceutical processes for process monitoring, to gain process understanding and to control important process parameters. Viable cell concentration (VCC) is one of the most important key performance indicators (KPIs) during mammalian cell cultivation processes. Commonly, it is measured offline. In this work, we demonstrated the comparability and scalability of linear regression models derived from online capacitance measurements. The linear regressions were used to predict the VCC and other familiar offline biomass indicators, such as the viable cell volume (VCV) and the wet cell weight (WCW), in two different industrially relevant CHO cell culture processes (Process A and Process B). Different single-use bioreactor scales (50–2000 L) were used to prove the feasibility and scalability of the in-line sensor integration. Coefficients of determination of 0.79 for Process A and 0.99 for Process B were achieved for the WCW. The VCV was described with high coefficients of determination of 0.96 (Process A) and 0.98 (Process B), respectively. In agreement with other work from the literature, the VCC was only described within the exponential growth phase, but with excellent coefficients of determination of 0.99 (Process A) and 0.96 (Process B), respectively. Monitoring these KPIs online using linear regression models appeared to be scale-independent, enabled deeper process understanding (e.g. demonstrated here by monitoring the feeding profile) and showed the potential of this method for process control.
Journal Article
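The core of the approach above is an ordinary linear regression from an on-line signal (capacitance) to an offline biomass indicator (VCC), judged by its coefficient of determination. A minimal sketch with invented calibration values (not the article's data):

```python
import numpy as np

# Hypothetical calibration pairs: on-line capacitance readings vs.
# offline viable cell concentration (1e6 cells/mL). Values are invented.
capacitance = np.array([2.0, 4.1, 6.2, 8.0, 10.1])
vcc         = np.array([1.0, 2.0, 3.1, 4.0, 5.0])

# Fit the linear regression model VCC = a * capacitance + b.
a, b = np.polyfit(capacitance, vcc, deg=1)

# Coefficient of determination (R^2) of the fit.
pred = a * capacitance + b
ss_res = np.sum((vcc - pred) ** 2)
ss_tot = np.sum((vcc - vcc.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(round(r2, 3))
```

In practice the fitted model would then be applied to the capacitance signal in real time to report a predicted VCC, which is what makes the offline KPI available on-line.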
Strip-Type Embeddable Shape Sensor Based on Fiber Optics for In Situ Composite Consolidation Monitoring
2022
Carbon fibers and resin used in manufacturing carbon fiber-reinforced plastic composite structures flow before the resin solidifies, resulting in disrupted fiber orientation and non-uniform thickness. This process, known as consolidation, is critical for the quality of the composite structure, but no technology exists to measure the deformation in situ. This study proposes a strip-type embeddable shape sensor based on fiber optics for in situ monitoring of consolidation deformation. The sensor consists of a thin, flexible sheet with optical fibers embedded in the upper and lower surfaces of the sheet, and it can monitor out-of-plane bending deformation in composite materials during consolidation. Finite element analysis and experiments are used to evaluate the basic performance of the shape sensor before it is applied to composite gap/lap monitoring. For the first time, the relaxation of consolidation deformation due to the flow of fiber-resin suspension is measured. The proposed sensor will be a powerful tool for elucidating consolidation mechanisms and for validating composite manufacturing simulations.
Journal Article
Generic Chemometric Models for Metabolite Concentration Prediction Based on Raman Spectra
2022
Chemometric models for on-line process monitoring have become well established in pharmaceutical bioprocesses. Their main drawback is the required calibration effort and their inflexibility regarding system or process changes: a recalibration is necessary whenever the process or the setup changes even slightly. With a large and diverse Raman dataset, however, it was possible to generate generic partial least squares regression models to reliably predict the concentrations of important metabolic compounds, such as glucose, lactate, and glutamine, in different CHO cell cultivations. The data for calibration were collected from various cell cultures at different sites in different companies using different Raman spectrophotometers. In testing, the developed “generic” models were capable of predicting the concentrations of said compounds from a dilution series in FMX-8 mod medium, as well as from an independent CHO cell culture. These spectra were taken with a completely different setup and with different Raman spectrometers, demonstrating the model flexibility. The prediction errors for the tests were mostly in an acceptable range (<10% relative error). This demonstrates that, under the right circumstances and by choosing the calibration data carefully, it is possible to create generic and reliable chemometric models that are transferable from one process to another without recalibration.
Journal Article
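The workflow above (spectra in, concentrations out) can be sketched with a simplified linear calibration. The article uses partial least squares regression; the sketch below substitutes ordinary least squares on fully synthetic "spectra" so it stays self-contained, which is a deliberate simplification, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "Raman spectra": each row is a spectrum whose intensity
# follows a fixed spectral signature scaled by the glucose concentration,
# plus noise. Entirely invented data standing in for real calibration sets.
n_samples, n_channels = 100, 30
glucose = rng.uniform(0.5, 8.0, n_samples)            # g/L
signature = rng.normal(0, 1, n_channels)              # fixed signature
spectra = np.outer(glucose, signature) + rng.normal(0, 0.1, (n_samples, n_channels))

# Simplified chemometric calibration: ordinary least squares in place of
# the partial least squares regression used in the article.
coef, *_ = np.linalg.lstsq(spectra, glucose, rcond=None)

# Predict the concentration for a held-out synthetic spectrum.
true_c = 4.2
test_spectrum = true_c * signature + rng.normal(0, 0.1, n_channels)
pred_c = test_spectrum @ coef
rel_err = abs(pred_c - true_c) / true_c
print(round(rel_err, 3))
```

PLS would additionally compress the spectra onto a few latent variables before regressing, which is what makes real chemometric models robust when channels far outnumber samples.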
A review of univariate and multivariate process capability indices
2017
This paper offers a review of univariate and multivariate process capability indices (PCIs). PCIs are statistical indicators widely used in industry to quantify the capability of production processes by relating the variability of the measured product characteristics to the admissible variability. Univariate PCIs involve a single product characteristic, while multivariate PCIs deal with the multivariate case. When analyzing the capability of processes, industrial decision makers may choose one PCI among all those existing in the literature depending on different criteria. In this article, we describe, cluster, and discuss univariate and multivariate PCIs. To cluster the PCIs, we identify three classes of characteristics: the first class includes characteristics related to the information of the process data input; the second class includes characteristics related to the approach used to calculate the PCIs; and the third class includes characteristics related to the information that the PCIs give. We discuss the strengths and weaknesses of each PCI using four criteria: calculation complexity, globality of the index, relation to the proportion of nonconforming parts, and robustness of the index. Finally, we propose a framework that may help practitioners and industrial decision makers select PCIs.
Journal Article
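The two most common univariate PCIs relate the admissible spread (the specification limits) to the natural process spread. A small sketch with invented measurement data:

```python
import numpy as np

def cp_cpk(x, lsl, usl):
    """Univariate process capability indices.

    Cp  compares the admissible spread (USL - LSL) to the natural
        process spread (6 sigma); Cpk additionally penalizes
        off-center processes via the nearer specification limit.
    """
    mu, sigma = np.mean(x), np.std(x, ddof=1)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

# Hypothetical measurements of one product characteristic with
# specification limits LSL = 9.0 and USL = 11.0.
rng = np.random.default_rng(1)
x = rng.normal(10.0, 0.25, 500)
cp, cpk = cp_cpk(x, 9.0, 11.0)
print(round(cp, 2), round(cpk, 2))
```

Because Cpk uses the nearer limit, Cpk <= Cp always holds, with equality only for a perfectly centered process; multivariate PCIs generalize this ratio to regions in several characteristics at once.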
Specification-driven predictive business process monitoring
2020
Predictive analysis in business process monitoring aims at forecasting the future information of a running business process. The prediction is typically made based on a model extracted from historical process execution logs (event logs). In practice, different business domains might require different kinds of predictions. Hence, it is important to have a means for properly specifying the desired prediction tasks, and a mechanism to deal with these various tasks. Although there have been many studies in this area, they mostly focus on a specific prediction task. This work introduces a language for specifying the desired prediction tasks that is expressive enough to capture various kinds of tasks, together with a mechanism for automatically creating the corresponding prediction model from a given specification. Unlike previous studies, which focus on a particular prediction task, our approach handles whatever prediction tasks the specification expresses. We also provide an implementation of the approach, which is used to conduct experiments using real-life event logs.
Journal Article
Nonlinear process monitoring based on new reduced Rank-KPCA method
by
Elaissi, Ilyes
,
Taouali, Okba
,
Messaoud, Hassani
in
Air monitoring
,
Air quality
,
Computer applications
2018
Kernel Principal Component Analysis (KPCA) is an efficient multivariate statistical technique used for nonlinear process monitoring. Nevertheless, conventional KPCA suffers from high computational complexity when dealing with large samples. In this paper, a new kernel method based on a novel reduced Rank-KPCA is developed to make up for the drawbacks of KPCA. The basic idea of the proposed approach is first to construct a reduced Rank-KPCA model that properly describes the system behavior in normal operating conditions from a large amount of training data, and then to monitor the system on-line. The principle of the proposed reduced Rank-KPCA is to eliminate the dependencies of variables in the feature space and to retain a reduced dataset from the original one. The proposed monitoring method is applied to fault detection in a numerical example, a Continuous Stirred Tank Reactor, and the AIRLOR air quality-monitoring network, and is compared with conventional KPCA and Moving Window KPCA methods.
Journal Article
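Conventional KPCA-based monitoring, which the paper above takes as its baseline, builds an eigendecomposition of the centered kernel matrix of normal-operation data and flags observations whose feature-space residual is large. A self-contained numpy sketch on synthetic data (not the paper's reduced-rank variant; the circle dataset and the gamma value are invented):

```python
import numpy as np

rng = np.random.default_rng(2)

# Normal operating data on a nonlinear manifold (a noisy circle).
# Entirely synthetic; stands in for real process measurements.
theta = rng.uniform(0, 2 * np.pi, 200)
X = np.column_stack([np.cos(theta), np.sin(theta)]) + 0.05 * rng.standard_normal((200, 2))
n = len(X)

def rbf(A, B, gamma=2.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

# KPCA model: eigendecomposition of the centered kernel matrix.
K = rbf(X, X)
H = np.eye(n) - 1.0 / n            # centering matrix
Kc = H @ K @ H
lam, V = np.linalg.eigh(Kc)
lam, V = lam[::-1], V[:, ::-1]     # eigenvalues in descending order
k = 10                             # retained principal components

def spe(x):
    """Residual feature-space variance of x (a simple fault indicator)."""
    kx = rbf(x[None, :], X)[0]
    kx_c = kx - kx.mean() - K.mean(axis=0) + K.mean()   # centered kernel row
    scores = kx_c @ V[:, :k] / np.sqrt(lam[:k])
    phi_norm2 = 1.0 - 2.0 * kx.mean() + K.mean()  # ||centered phi(x)||^2, RBF k(x,x)=1
    return phi_norm2 - (scores ** 2).sum()

normal_point = np.array([1.0, 0.0])   # on the training manifold
fault_point = np.array([3.0, 3.0])    # far off the manifold
print(spe(fault_point) > spe(normal_point))
```

The O(n^2) kernel matrix is exactly the cost that motivates the paper's reduced-rank construction, which keeps only a small representative subset of the training data.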
Weighted-Likelihood-Ratio-Based EWMA Schemes for Monitoring Geometric Distributions
by
Cai, Hongxing
,
Zhang, Yizhen
,
Zhang, Jiujun
in
Design
,
exponentially weighted moving average
,
geometric distribution
2024
Monitoring the parameter of discrete distributions is common in industrial production. In particular, it is often crucial to monitor the parameter of the geometric distribution, which is typically interpreted as the nonconforming item rate. To enhance the detection of nonconforming items, we designed an exponentially weighted moving average (EWMA) scheme based on the weighted likelihood ratio test (WLRT) method; this scheme, denoted the EWLRT scheme, is specifically designed for monitoring an increase of the parameter of the geometric distribution. Moreover, the optimal statistical design of the EWLRT scheme is presented for the case when the shift is known. Results from numerical comparisons through Monte Carlo simulations indicate that the EWLRT scheme performs better than competing schemes in some scenarios. Additionally, the designed scheme is characterized by its simplicity and ease of use, making it well suited for scenarios involving single observations. An example demonstrates the effectiveness of the EWLRT scheme.
Journal Article
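The monitoring setting above can be sketched with a classical one-sided EWMA chart on geometric counts (items inspected until a nonconforming one is found). This is the textbook EWMA baseline, not the authors' likelihood-ratio-based EWLRT scheme; the rates, smoothing constant, and shift point are invented.

```python
import numpy as np

rng = np.random.default_rng(3)

# In-control nonconforming rate p0; after observation 100 the rate shifts
# upward to p1, so the geometric counts become systematically shorter.
p0, p1 = 0.05, 0.25
counts = np.concatenate([rng.geometric(p0, 100), rng.geometric(p1, 100)])

lam = 0.1                         # EWMA smoothing constant
mu0 = 1.0 / p0                    # in-control mean of the counts
sigma0 = np.sqrt(1 - p0) / p0     # in-control standard deviation
L = 3.0
# Asymptotic lower control limit: an increase in p drives the EWMA down.
lcl = mu0 - L * sigma0 * np.sqrt(lam / (2 - lam))

z, alarm_at = mu0, None
for t, x in enumerate(counts, start=1):
    z = lam * x + (1 - lam) * z   # EWMA recursion
    if z < lcl and alarm_at is None:
        alarm_at = t
print(alarm_at)
```

The WLRT-based statistic in the article replaces the raw counts with likelihood-ratio terms, which handles the strong skewness of the geometric distribution better than this normal-theory control limit.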
Advances in Continuous Active Pharmaceutical Ingredient (API) Manufacturing: Real-time Monitoring Using Multivariate Tools
by
Dumarey, Melanie
,
Shapland, Peter
,
Berry, Malcolm
in
Biochemical Engineering
,
Biomedical and Life Sciences
,
Biomedicine
2019
Purpose
The implementation of continuous processing technologies for pharmaceutical manufacturing has increased due to their potential to enhance supply chain flexibility, reduce the footprint of the manufacturing facility, and deliver more consistent quality. Additionally, they facilitate extensive, real-time monitoring by sensors and process analytical technology (PAT) tools without perturbing the process. In the presented case study, the use of multivariate tools for the real-time monitoring and retrospective review of a continuous active pharmaceutical ingredient (API) synthesis was evaluated from process development through to commercialization.
Method
A multivariate statistical process monitoring (MSPM) approach summarizing variability in both quality critical (controlled flow rates, temperatures) and non-quality critical parameters (pressures, pump speeds, conductivity) was used to monitor three telescoped chemistry stages of a continuous API synthesis. Four different modeling strategies were presented addressing specific monitoring and analysis requirements during the pharmaceutical development lifecycle.
Results
During development (R&D and commercial facility), the implemented multivariate monitoring resulted in the identification of potential failure modes, a deeper understanding of the natural process variability and accelerated root cause analysis for a recurrent reagent blockage. During manufacturing (commercial facility), the multivariate tool confirmed potential for predictive maintenance and early fault detection.
Conclusions
While the implemented control strategy based on parametric control and offline analytical testing provided the required quality assurance, the multivariate trends provided additional information on process performance. More specifically, they enabled more detailed process understanding during the development of the continuous API synthesis following quality by design (QbD) principles and demonstrated the potential for enhanced process performance during commercial manufacturing.
Journal Article
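A common core of MSPM approaches like the one described above is a PCA model of normal variability with a Hotelling T^2 statistic for new observations. A self-contained sketch on synthetic data (the variables, loadings, and fault are all invented; this is not the study's model):

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic normal-operation data: six correlated process parameters
# (think flow rates, temperatures, pressures). Invented for illustration.
n, m = 300, 6
latent = rng.standard_normal((n, 2))
W = rng.standard_normal((2, m))
X = latent @ W + 0.1 * rng.standard_normal((n, m))

# PCA model of normal variability via SVD of the standardized data.
mu, sd = X.mean(axis=0), X.std(axis=0)
Xs = (X - mu) / sd
U, S, Vt = np.linalg.svd(Xs, full_matrices=False)
k = 2
P = Vt[:k].T                       # loadings (m x k)
score_var = S[:k] ** 2 / (n - 1)   # variance of each retained score

def hotelling_t2(x):
    """Hotelling T^2 of one observation against the PCA model."""
    t = ((x - mu) / sd) @ P
    return float((t ** 2 / score_var).sum())

normal_obs = X[0]                        # a typical in-control observation
fault_obs = mu + 20 * sd * P[:, 0]       # gross shift along the first PC
print(hotelling_t2(fault_obs) > hotelling_t2(normal_obs))
```

In a deployment like the one in the case study, T^2 (and a companion residual statistic) would be trended in real time against control limits derived from the development-phase data, which is what enables the early fault detection and predictive maintenance mentioned in the results.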