Catalogue Search | MBRL
39,940 result(s) for "Simulation experiments"
Experimental Verification for the Graphitization of Inertinite
2023
To explore the graphitization of inertinite, this paper conducted high-temperature thermal simulation experiments (HTT) and high-temperature, high-pressure simulation experiments (HTHP) on isolated samples enriched in inertinite. X-ray diffraction (XRD), Raman spectroscopy, and transmission electron microscopy (TEM) were used to analyze the graphitization process. ① Results of HTT: the graphitization of inertinite has a “threshold condition”, with the temperature threshold lying between 2100 °C and 2400 °C. Below this threshold, the d002 value of the samples remains above 0.342 nm. ② Results of HTHP: (i) External forces have a significant positive effect on the graphitization of inertinite; compared with HTT, the addition of external forces markedly reduces the temperature required for graphitization. (ii) Proper combinations of temperature and pressure are crucial for efficiently promoting graphitization: moving the pressure away from its optimum, whether by increasing or decreasing it, suppresses the graphitization of inertinite. ③ The mechanism by which external forces act on the graphitization of inertinite was analyzed: shear stress promotes the rotation and orientation of the aromatic layers, while hydrostatic pressure contributes to the contraction and reduction of interlayer spacing in the carbon layers.
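The d002 value quoted above is conventionally obtained from the position of the XRD (002) peak via Bragg's law; a minimal sketch (the peak angle and Cu Kα wavelength below are illustrative values, not taken from the paper):

```python
import math

def d002_from_2theta(two_theta_deg, wavelength_nm=0.15406):
    """Interlayer spacing d002 (nm) from the (002) peak position via Bragg's
    law n*lambda = 2*d*sin(theta), with n = 1 and Cu K-alpha by default."""
    theta = math.radians(two_theta_deg / 2.0)
    return wavelength_nm / (2.0 * math.sin(theta))

# A (002) peak near 26.0 degrees 2-theta corresponds to d002 ~ 0.342 nm,
# i.e. the non-graphitized threshold value mentioned in the abstract.
d = d002_from_2theta(26.0)
```

As graphitization proceeds, the (002) peak shifts to higher angles and d002 falls toward the 0.3354 nm spacing of ideal graphite.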
Journal Article
Understanding the performance of construction business: A simulation-based experimental study
by
Arun Bajracharya
,
Goh Cheng Siew
,
Hai Chen Tan
in
Construction business performance, Project, Finance, Capacity, System dynamics, Simulation experiment
,
Construction industry
,
Data processing
2021
High failure rates have been observed as a recurring phenomenon in the construction industry. This research focuses on the causes behind a range of performance modes of construction businesses. The growth and capacity underinvestment archetype was used as the main systems archetype to develop a causal structure for understanding business performance, and a system dynamics model was developed to create a simulation platform for that structure. The simulation model uses the context of a typical small or medium construction company. The research considered and experimented with a set of selected managerial policies and practices that can lead a construction business to failure, sustenance, or growth. To achieve the expected growth or sustenance, it was found that a certain balance needs to be struck among how much emphasis to place on winning new projects, how much profit margin to work with, and how much capacity to arrange and deploy for project operations, management, and execution.
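The growth and capacity underinvestment archetype can be illustrated with a toy stock-and-flow simulation; the structure and parameters below are a generic sketch of the archetype, not the authors' model:

```python
def simulate(capacity_policy, growth=0.05, steps=60):
    """Toy underinvestment loop: demand grows at a fixed rate while capacity
    is adjusted toward demand, closing only `capacity_policy` of the gap per
    period. Returns final (demand, capacity, service_level)."""
    demand, capacity = 10.0, 12.0
    service_level = 1.0
    for _ in range(steps):
        demand *= (1.0 + growth)                          # workload keeps growing
        capacity += capacity_policy * (demand - capacity)  # delayed investment
        service_level = min(1.0, capacity / demand)        # performance, capped at 1
    return demand, capacity, service_level

slow = simulate(0.05)  # timid investment: capacity persistently lags demand
fast = simulate(0.30)  # aggressive investment: capacity tracks demand closely
```

With a small adjustment fraction the service level erodes as demand outruns capacity, which is the underinvestment pattern the abstract describes; a larger fraction sustains a higher service level.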
Journal Article
Estimating a Service-Life Distribution Based on Production Counts and a Failure Database
by
Hamada, Michael S.
,
Ryan, Kenneth J.
,
Vardeman, Stephen B.
in
Bayesian methods
,
Case studies
,
Case study
2017
A manufacturer wanted to compare the service-life distributions of two similar products. These concern product lifetimes after installation (not manufacture). For each product, there were available production counts and an imperfect database providing information on failing units. In the real case, these were expensive repairable units warrantied against repairs. Failure (of interest here) was relatively rare and driven by a different mode/mechanism than ordinary repair events (not of interest here). Data models for the service life based on a standard parametric lifetime distribution and a related limited failure population were developed. These models were used to develop expressions for the likelihood of the available data that properly account for information missing in the failure database. A Bayesian approach was employed to obtain estimates of model parameters (with associated uncertainty) in order to investigate characteristics of the service-life distribution. Custom software was developed and is included as Supplemental Material to this case study. One part of a responsible approach to the original case was a simulation experiment used to validate the correctness of the software and the behavior of the statistical methodology before using its results in the application; an example of such an experiment is included here. Because confidentiality issues prevent use of the original data, simulated data with characteristics like the manufacturer's proprietary data are used to illustrate some aspects of our real analyses. We note also that, although this case focuses on rare and complete product failure, the statistical methodology provided is directly applicable to more standard warranty data problems involving typically much larger warranty databases where entries are warranty claims (often for repairs) rather than reports of complete failures.
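As a flavor of the kind of lifetime estimation involved, here is a minimal sketch of censored-data estimation with a conjugate Bayesian update; it uses a plain exponential model rather than the authors' limited-failure-population model, and all numbers are illustrative:

```python
def exp_mle_rate(failure_times, censor_times):
    """MLE of an exponential failure rate from observed failure times plus
    right-censored survival times (units still in service):
    rate = (number of failures) / (total time at risk)."""
    exposure = sum(failure_times) + sum(censor_times)
    return len(failure_times) / exposure

def gamma_posterior_mean(n_fail, exposure, a=1.0, b=1.0):
    """Posterior mean of the rate under a conjugate Gamma(a, b) prior:
    posterior is Gamma(a + n_fail, b + exposure)."""
    return (n_fail + a) / (exposure + b)

# 3 observed failures plus three surviving units censored at age 10:
rate = exp_mle_rate([1.0, 2.0, 3.0], [10.0, 10.0, 10.0])   # 3 / 36
post = gamma_posterior_mean(3, 36.0)                        # 4 / 37
```

The key point the abstract makes is that censored (still-surviving) units contribute exposure to the likelihood even though they never appear in the failure database.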
Journal Article
Impact of Decision Rules and Non-cooperative Behaviors on Minimum Consensus Cost in Group Decision Making
2021
In group decision making, it is sensible to achieve minimum consensus cost (MCC) because the resources for the consensus reaching process are often limited. Two issues, however, still require attention: (1) the impact of decision rules, including decision weights and aggregation functions, on MCC; and (2) the impact of non-cooperative behaviors on MCC. Hence, this paper analytically reveals the decision rules that minimize or maximize MCC. Furthermore, detailed simulation experiments show the joint impact of non-cooperative behavior and decision rules on MCC, as well as the effect on consensus within the established MCC target.
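With linear adjustment costs, the minimum consensus cost computation reduces to a cost-weighted median; the sketch below is a generic illustration of that special case, not the paper's model:

```python
def min_consensus_cost(opinions, unit_costs):
    """Minimum total cost of moving all experts to a common opinion when
    expert i charges unit_costs[i] per unit of opinion change. With linear
    costs, the optimal consensus value is a cost-weighted median."""
    pairs = sorted(zip(opinions, unit_costs))
    total = sum(unit_costs)
    acc = 0.0
    for opinion, cost in pairs:
        acc += cost
        if acc >= total / 2:          # weighted median reached
            consensus = opinion
            break
    total_cost = sum(c * abs(o - consensus)
                     for o, c in zip(opinions, unit_costs))
    return consensus, total_cost

# Equal unit costs: the plain median wins. A stubborn (expensive) expert
# drags the consensus toward their own opinion, changing MCC.
equal = min_consensus_cost([0, 5, 10], [1, 1, 1])
stubborn = min_consensus_cost([0, 5, 10], [10, 1, 1])
```

This is one concrete way a decision rule (here, the cost weighting) changes both the consensus point and the MCC, which is the kind of effect the abstract's simulation experiments explore.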
Journal Article
Why different trust relationships matter for information systems users
by
Hoffmann, Axel
,
Leimeister, Jan Marco
,
Söllner, Matthias
in
Analysis
,
Automation
,
Business and Management
2016
Technology acceptance research has shown that trust is an important factor fostering use of information systems (IS). As a result, numerous IS researchers have studied factors that build trust in IS. However, IS research on trust has mainly focused on the trust relationship between the user and the IS itself, largely neglecting that other targets of trust might also drive IS use from a user's point of view. Accordingly, we investigate the importance of different targets of trust in IS use. Therefore, we use the concept of a network of trust and identify four different targets of trust that are prevalent from a user's point of view. Afterwards, we develop our research model and evaluate it using a free simulation experiment. The results show that multiple targets of trust are important in the context of IS use. In particular, we highlight the importance of a second target - trust in the provider - which is equally important as trust in the IS itself. Consequently, IS providers should focus not only on fostering users' trust in their IS but also on positioning themselves as trustworthy providers. In addition, we show that a third target - trust in the Internet - has significant indirect effects on multiple constructs that impact IS use.
Journal Article
Cation-containing lipid membranes - experiment and MD simulations
2017
Using small angle neutron diffraction and molecular dynamics simulations we studied the interactions between calcium (Ca[2+]) or zinc (Zn[2+]) cations, and oriented gel phase dipalmitoyl-phosphatidylcholine (DPPC) bilayers. For both cations studied at ~1:7 divalent metal ion to lipid molar ratio (Me[2+]:DPPC), bilayer thickness increased. Simulation results helped reveal subtle differences in the effects of the two cations on gel phase membranes.
Journal Article
The initiation mechanism of gravel-type debris flow based on laboratory simulation experiments
2025
Debris flow is a geological process primarily triggered by factors such as rainfall or earthquakes; it is characterized by sudden onset and great destructive force and poses a severe threat to human life and property. Exploring its formation, initiation mechanisms, and effective early warning measures is a reliable approach to disaster prevention and mitigation in mountainous regions. An indoor simulation experiment system was designed to study the initiation process of coarse-grained, gravel-type debris flow under different particle sizes and slope angles. The interplay among the mean particle size, water flow, the slope angle of the material source, and the onset of debris flow was elucidated to assess the dynamic changes in various characteristic parameters, and this analysis yielded initiation criteria specific to gravel-type debris flow. The combined particle size and slope angle of the material source were positively correlated with the water power required for debris flow initiation, and a fitting relationship between the unit-width flow rate, mean particle size, and slope angle was established. The humidity, seepage pressure, soil pressure, slope angle, and vibratory shock acceleration of the material source exhibited abrupt or asynchronous changes during the formation and initiation of debris flow; however, humidity and seepage pressure were not used as initiation criteria. The initiation criteria for debris flow included changes in characteristic parameters such as soil pressure, slope angle, and vibration acceleration. The findings can serve as a reference for the characteristic identification and initiation early warning of similar types of debris flows, enhancing the safety management of debris flow disasters.
Journal Article
A Model-Driven Approach for Conducting Simulation Experiments
by
van Rienen, Ursula
,
Budde, Kai
,
Heller, Jakob
in
Automation
,
design of experiments
,
Experiments
2022
With the increasing complexity of simulation studies, and thus increasing complexity of simulation experiments, there is a high demand for better support for them to be conducted. Recently, model-driven approaches have been explored for facilitating the specification, execution, and reproducibility of simulation experiments. However, a more general approach that is suited for a variety of modeling and simulation areas, experiment types, and tools, which also allows for further automation, is still missing. Therefore, we present a novel model-driven engineering (MDE) framework for simulation studies that extends the state-of-the-art of conducting simulation experiments in the following ways: (a) Providing a structured representation of the various ingredients of simulation experiments in the form of meta models and collecting them in a repository improves knowledge sharing across application domains and simulation approaches. (b) Specifying simulation experiments in the quasi-standardized form of the meta models (e.g., via a GUI) and, subsequently, performing the automatic generation of experiment specifications in a language of choice increases both the productivity and quality of complex simulation experiments. (c) Automatic code transformation between specification languages via the meta models enables the reusability of simulation experiments. (d) Integrating the framework using a command-line interface allows for further automation of subprocesses within a simulation study. We demonstrate the advantages and practicality of our approach using real simulation studies from three different fields of simulation (stochastic discrete-event simulation of a cell signaling pathway, virtual prototyping of a neurostimulator, and finite element analysis of electric fields) and various experiment types (global sensitivity analysis, time course analysis, and convergence testing). 
The proposed framework can be the starting point for further automation of simulation experiments and, therefore, can assist in conducting simulation studies in a more systematic and effective manner. For example, based on this MDE framework, approaches for automatically selecting and parametrizing experimentation methods, or for planning follow-up activities depending on the context of the simulation study, could be developed.
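The generation of experiment specifications from meta-model instances, as described above, can be sketched as a simple template expansion; the field names, the model file name, and the output format below are illustrative placeholders, not the paper's actual schema:

```python
def generate_spec(experiment):
    """Render a simulation-experiment description (a dict standing in for a
    meta-model instance) into a simple declarative specification string."""
    lines = [f"model: {experiment['model']}",
             f"type: {experiment['type']}"]
    # One scan line per experiment factor, in a stable order.
    for name, values in sorted(experiment["factors"].items()):
        lines.append(f"scan {name}: {', '.join(map(str, values))}")
    lines.append(f"observe: {', '.join(experiment['outputs'])}")
    return "\n".join(lines)

# Hypothetical time-course experiment over one rate constant:
exp = {"model": "cell_signaling.model", "type": "time_course",
       "factors": {"k_on": [0.1, 0.2]}, "outputs": ["A", "B"]}
spec = generate_spec(exp)
```

Keeping the experiment description declarative, as in this toy dict, is what allows the paper's framework to retarget one specification to different backend languages and tools.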
Journal Article
Connecting Tibetan Plateau Snow Change With Arctic Sea‐Ice
2025
Documenting changes in Arctic sea‐ice variability is essential for understanding the spring sea‐ice predictability barrier. While Tibetan Plateau snow cover (TPSC) has been linked to Arctic sea‐ice variability, the spatiotemporal stability of this relationship remains unclear. In this study, combining satellite observations and snow experiments, we identified a shift around 1990 in the connection between TPSC and Barents‐Kara Seas sea‐ice. Before 1990, a positive dipole TPSC pattern (enhanced snow cover in the east, reduced in the west) induces Arctic anticyclonic anomalies through a circumglobal wave train. These anomalies facilitate polar vortex splitting, enhancing moisture transport and solar radiation over the northern Kara Sea, which accelerates sea‐ice reduction. Conversely, after 1990, a positive monopole TPSC pattern (positive snow anomalies across the entire Tibetan Plateau) strengthens the polar vortex, suppressing moisture and solar radiation over the Barents Sea, thereby promoting sea‐ice growth. This regime shift underscores TPSC's capacity to modulate Arctic sea‐ice dynamics through the polar vortex system.
Journal Article
Simulation Experiment and Analysis of GNSS/INS/LEO/5G Integrated Navigation Based on Federated Filtering Algorithm
2022
This article examines the positioning performance of integrated navigation after adding an LEO constellation signal source and a 5G ranging signal source, in the context of China’s new infrastructure construction. Tightly coupled federated Kalman filters are used as the algorithmic framework. Each signal source required for integrated navigation is simulated, and by limiting the range of the azimuth and visible elevation angles, different experimental scenes are simulated to verify the contribution of the new signal sources to traditional satellite navigation; the positioning results are then analyzed. Finally, the article compares the distribution of different federated filtering information-sharing factors and reveals how these factors should be assigned when integrating sensors of different precision. The experimental results show that adding the LEO constellation and 5G ranging signals improves the positioning accuracy of the original INS/GNSS by an order of magnitude and ensures a high degree of positioning continuity. Moreover, the experiments show that the federated filtering algorithm can adapt to combined navigation in different scenarios by integrating sensors of different precision.
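The master-filter step of a federated Kalman filter fuses local estimates by information (inverse-covariance) weighting and then redistributes the fused covariance through information-sharing factors; a scalar-state sketch (the estimates, variances, and sharing factors are illustrative, not from the paper):

```python
def federated_fuse(estimates, variances):
    """Master-filter fusion for a scalar state: combine local estimates by
    inverse-variance weighting, returning (fused estimate, fused variance)."""
    infos = [1.0 / v for v in variances]
    total_info = sum(infos)
    fused = sum(i * x for i, x in zip(infos, estimates)) / total_info
    return fused, 1.0 / total_info

def reset_locals(fused_var, betas):
    """Information-sharing reset: local filter i restarts with covariance
    fused_var / beta_i, where the sharing factors beta_i sum to 1."""
    return [fused_var / b for b in betas]

# Two local filters (say, GNSS/INS and an LEO/5G ranging filter) with
# different accuracy: the fused estimate leans toward the better one.
x, p = federated_fuse([10.0, 12.0], [1.0, 4.0])
locals_ = reset_locals(p, [0.5, 0.5])
```

Assigning a larger beta to a more precise local filter gives it a tighter restart covariance, which is the allocation question the article investigates for sensors of different precision.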
Journal Article