Catalogue Search | MBRL
Explore the vast range of titles available.
44,838 result(s) for "Computation theory"
On implementing SWMR registers from SWSR registers in systems with Byzantine failures
2024
The implementation of registers from (potentially) weaker registers is a classical problem in the theory of distributed computing. Since Lamport’s pioneering work (Lamport in Distrib Comput 1(2):77–101, 1986), this problem has been extensively studied in the context of asynchronous processes with crash failures. In this paper, we investigate this problem in the context of Byzantine process failures, with and without process signatures. We first prove that, without signatures, there is no wait-free linearizable implementation of a 1-writer n-reader register from atomic 1-writer 1-reader registers. In fact, we show a stronger result, namely, even under the assumption that the writer can only crash and at most one reader can be malicious, there is no linearizable implementation of a 1-writer n-reader register from atomic 1-writer (n-1)-reader registers that ensures that every correct process eventually completes its operations. In light of this impossibility result, we give two implementations of a 1-writer n-reader register from atomic 1-writer 1-reader registers that work under different assumptions. The first implementation is linearizable (under any combination of Byzantine process failures), but it guarantees that every correct process eventually completes its operations only under the assumption that the writer is correct or no reader is Byzantine—thus matching the impossibility result. The second implementation assumes process signatures; it is wait-free and linearizable under any number and combination of Byzantine process failures.
Journal Article
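The classic construction hinted at in the abstract above, in which a single writer tags values with sequence numbers and fans them out to per-reader registers, can be sketched minimally in Python. This is a hypothetical, crash-free illustration only; the paper's Byzantine-tolerant implementations require substantially more machinery (such as signatures).

```python
class SWSRRegister:
    """A register written by one fixed writer and read by one fixed reader."""
    def __init__(self):
        self.value = (0, None)  # (sequence number, value)

    def write(self, pair):
        self.value = pair

    def read(self):
        return self.value

class SWMRRegister:
    """1-writer n-reader register built from n SWSR registers: the writer
    performs one SWSR write per reader, each reader reads only its own copy."""
    def __init__(self, n_readers):
        self.seq = 0
        self.copies = [SWSRRegister() for _ in range(n_readers)]

    def write(self, value):
        self.seq += 1
        for copy in self.copies:            # fan out to every reader's register
            copy.write((self.seq, value))

    def read(self, reader_id):
        _, value = self.copies[reader_id].read()
        return value

reg = SWMRRegister(n_readers=3)
reg.write("a")
reg.write("b")
print([reg.read(i) for i in range(3)])  # ['b', 'b', 'b']
```

Even in the crash-only setting, making such a construction linearizable when reads and writes overlap requires extra steps (e.g. readers writing back what they read), which this sketch omits.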
Calculations of elastic and thermal properties of the strengthening C14 Fe6Nb4Al2 Laves phase using the density functional theory
2025
Compounds with structures of the Laves phase type, which precipitate in the matrix of steels or superalloys during service at high temperatures, are usually considered useful strengthening phases, whose elastic and thermal properties are of considerable interest. In our work, the coefficients of the elastic tensor of the C14 Fe6Nb4Al2 Laves phase were calculated using the density functional theory. Elastic characteristics such as the bulk modulus, shear moduli, Young's modulus and Poisson's ratio, as well as thermal properties such as sound wave velocities and the Debye temperature, were obtained. The factors influencing the anisotropy of the elastic properties of Fe6Nb4Al2 were also calculated.
Journal Article
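The derived quantities listed in the abstract above (Young's modulus, Poisson's ratio, sound velocities) follow from the bulk and shear moduli via standard polycrystalline relations; a small sketch with illustrative, made-up inputs (not values from the paper):

```python
import math

def elastic_derived(B, G):
    """Standard polycrystalline relations: Young's modulus E and
    Poisson's ratio nu from bulk modulus B and shear modulus G (GPa)."""
    E = 9 * B * G / (3 * B + G)
    nu = (3 * B - 2 * G) / (2 * (3 * B + G))
    return E, nu

def sound_velocities(B, G, rho):
    """Longitudinal, transverse, and mean sound velocities (m/s) from
    moduli in GPa and density rho in kg/m^3."""
    B, G = B * 1e9, G * 1e9                  # GPa -> Pa
    v_t = math.sqrt(G / rho)
    v_l = math.sqrt((B + 4 * G / 3) / rho)
    v_m = (((2 / v_t**3) + (1 / v_l**3)) / 3) ** (-1 / 3)
    return v_l, v_t, v_m

# Illustrative inputs only:
E, nu = elastic_derived(B=180.0, G=80.0)
print(f"E = {E:.1f} GPa, nu = {nu:.3f}")  # E = 209.0 GPa, nu = 0.306
```

The mean velocity v_m is the usual ingredient for estimating the Debye temperature once the density and atomic volume are known.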
First-principles prediction of a novel superhard orthorhombic carbon allotrope with large bandgap
2024
Searching for novel carbon allotropes holds paramount importance for both condensed matter physics and materials science. Using the combination of structure searching and first-principles calculations, we identified a carbon allotrope with orthorhombic symmetry (Pmm2, 10 atoms/cell), termed O–C10. The energetic, dynamical, mechanical, and thermal stabilities of O–C10 have been validated through energy analysis, phonon dispersion calculations, evaluation of elastic constants, and ab initio molecular dynamics simulations. In addition, the bulk modulus and Vickers hardness of O–C10 are calculated to be 410 and 79 GPa, respectively, which are higher than those of c-BN. The predicted B/G ratio for O–C10 is 0.90 (< 1.75), indicating that O–C10 possesses a brittle nature. The simulated X-ray diffraction (XRD) patterns are also provided to facilitate potential future experimental endeavors. The electronic band structures calculated with the Heyd–Scuseria–Ernzerhof (HSE06) hybrid functional reveal that O–C10 is an insulator with a large indirect band gap of 5.29 eV. These results regarding this novel carbon allotrope may broaden the scope of the carbon family, advance the practical uses of carbon-based materials, and stimulate the development of distinctive crystalline carbon structures.
Journal Article
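The brittleness claim in the abstract rests on Pugh's B/G criterion; a tiny sketch (the shear modulus here is back-derived from the reported B = 410 GPa and B/G = 0.90 purely for illustration, it is not stated in the abstract):

```python
def pugh_ratio(B, G):
    """Pugh's criterion: B/G below ~1.75 indicates brittleness,
    above it ductility (B, G in the same units, e.g. GPa)."""
    return B / G

def classify(B, G, threshold=1.75):
    return "brittle" if pugh_ratio(B, G) < threshold else "ductile"

B = 410.0          # bulk modulus reported in the abstract (GPa)
G = B / 0.90       # shear modulus implied by the reported B/G = 0.90
print(f"B/G = {pugh_ratio(B, G):.2f} -> {classify(B, G)}")  # B/G = 0.90 -> brittle
```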
Homonym Population Protocols
by Cohen, Johanne; Bournez, Olivier; Rabie, Mikaël
in Communities, Computation, Computer simulation
2018
The population protocol model was introduced by Angluin et al. as a model of passively mobile anonymous finite-state agents. This model computes a predicate on the multiset of their inputs via interactions by pairs. The original population protocol model has been proved to compute only semilinear predicates and has been extended in various ways. In the community protocol model by Guerraoui and Ruppert, the n agents have unique identifiers but may only store a finite number of the identifiers they already heard about. The community protocol model is known to provide the power of a non-deterministic Turing machine with O(n log n) space. We consider variants of the two above-mentioned models and we obtain a whole landscape that covers and extends already known results. Namely, by considering the case of homonyms, that is to say the case when several agents may share the same identifier, we provide a hierarchy that goes from the case of no identifier (population protocol model) to the case of unique identifiers (community protocol model). In particular, we obtain that any Turing machine on space O(log^{O(1)} n) can be simulated with log^r n identifiers, for any r > 0. Our results also extend and revisit the hierarchy provided by Chatzigiannakis et al. on population protocols carrying Turing machines on limited space, reducing the gap left by this work between per-agent space o(log log n) (proved to be equivalent to population protocols) and Ω(log n) (proved to be equivalent to Turing machines): we prove that per-agent space Θ(log log n) corresponds to symmetric predicates computable in polylogarithmic non-deterministic space.
Journal Article
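The flavor of the population protocol model can be conveyed with a toy simulation: anonymous finite-state agents interacting in random pairs, here computing the trivial predicate "some input is 1". This is an illustrative sketch of the model itself, not one of the paper's constructions.

```python
import random

def simulate_or_protocol(inputs, steps=10000, seed=0):
    """Toy population protocol for the predicate 'some input is 1':
    when two anonymous agents meet, both adopt the max of their states,
    so a single 1 spreads epidemically through the population."""
    rng = random.Random(seed)
    states = list(inputs)
    n = len(states)
    for _ in range(steps):
        i, j = rng.sample(range(n), 2)   # random scheduler picks a pair
        states[i] = states[j] = max(states[i], states[j])
        if all(s == states[0] for s in states):
            break                        # population has converged
    return states

print(simulate_or_protocol([0, 0, 1, 0, 0]))  # expected to converge to all 1s
```

Agents here have no identifiers at all; the homonym setting studied in the paper sits between this fully anonymous model and the community protocol model with unique identifiers.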
Declarative Logic Programming
2018
Logic Programming (LP) is at the nexus of knowledge representation, AI, mathematical logic, databases, and programming languages. It allows programming to be more declarative, by specifying "what" to do instead of "how" to do it. This field is fascinating and intellectually stimulating due to the fundamental interplay among theory, systems, and applications brought about by logic. The goal of this book is to help fill in the void in the literature with state-of-the-art surveys on key aspects of LP. Much attention was paid to making these surveys accessible to researchers, practitioners, and graduate students alike.
A Configurable Hardware Architecture for Runtime Application of Network Calculus
2021
Network Calculus has been a foundational theory for analyzing and ensuring Quality-of-Service (QoS) in a variety of networks including Networks on Chip (NoCs). To fulfill dynamic QoS requirements of applications, runtime application of network calculus is essential. However, the primitive operations in network calculus such as arrival curve, min-plus convolution and min-plus deconvolution are very time consuming when calculated in software because of the large volume and long latency of computation. For the first time, we propose a configurable hardware architecture to enable runtime application of network calculus. It employs a unified pipeline that can be dynamically configured to efficiently calculate the arrival curve, min-plus convolution, and min-plus deconvolution at runtime. We have implemented and synthesized this hardware architecture on a Xilinx FPGA platform to quantify its performance and resource consumption. Furthermore, we have built a prototype NoC system incorporating this hardware for dynamic flow regulation to effectively achieve QoS at runtime.
Journal Article
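The min-plus primitives the abstract identifies as bottlenecks are simple to state; a small software reference implementation over sampled curves (illustrative curve parameters, not from the paper) shows why the quadratic inner loops get expensive at runtime:

```python
def min_plus_convolution(f, g):
    """(f (x) g)(t) = min over 0 <= s <= t of f(s) + g(t - s),
    for curves given as per-cycle samples f[0..T-1], g[0..T-1]."""
    T = min(len(f), len(g))
    return [min(f[s] + g[t - s] for s in range(t + 1)) for t in range(T)]

def min_plus_deconvolution(f, g):
    """(f (/) g)(t) = max over u >= 0 of f(t + u) - g(u),
    truncated to the available samples."""
    T = len(f)
    return [max(f[t + u] - g[u] for u in range(T - t)) for t in range(T)]

# Illustrative: a token-bucket arrival curve alpha(t) = b + r*t and a
# rate-latency service curve beta(t) = R * max(0, t - L).
alpha = [4 + 2 * t for t in range(8)]           # b = 4, r = 2
beta = [3 * max(0, t - 2) for t in range(8)]    # R = 3, L = 2
print(min_plus_convolution(alpha, beta)[:5])    # [4, 4, 4, 6, 8]
```

Each output point costs O(T) additions and comparisons, so a full curve costs O(T^2); that is exactly the kind of regular, data-parallel work the paper's configurable pipeline offloads to hardware.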
A Neuro-Symbolic ASP Pipeline for Visual Question Answering
by Higuera, Nelson; Pritz, Michael; Oetsch, Johannes
in Answer set programming, Computation theory, Datasets
2022
We present a neuro-symbolic visual question answering (VQA) pipeline for CLEVR, which is a well-known dataset that consists of pictures showing scenes with objects and questions related to them. Our pipeline covers (i) training neural networks for object classification and bounding-box prediction of the CLEVR scenes, (ii) statistical analysis on the distribution of prediction values of the neural networks to determine a threshold for high-confidence predictions, and (iii) a translation of CLEVR questions and network predictions that pass confidence thresholds into logic programmes so that we can compute the answers using an answer-set programming solver. By exploiting choice rules, we consider deterministic and non-deterministic scene encodings. Our experiments show that the non-deterministic scene encoding achieves good results even if the neural networks are trained rather poorly in comparison with the deterministic approach. This is important for building robust VQA systems if network predictions are less-than-perfect. Furthermore, we show that restricting non-determinism to reasonable choices allows for more efficient implementations in comparison with related neuro-symbolic approaches without losing much accuracy.
Journal Article
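Steps (ii) and (iii) of the pipeline, thresholding prediction confidences and translating confident detections into logic facts, can be sketched as follows. The function names and data here are hypothetical stand-ins; the paper works with CLEVR networks and an answer-set solver such as clingo.

```python
def confidence_threshold(scores, q=0.1):
    """Pick a threshold from the empirical distribution of prediction
    confidences: keep predictions above the q-quantile (a simple stand-in
    for the statistical analysis described in the abstract)."""
    s = sorted(scores)
    k = int(q * (len(s) - 1))
    return s[k]

def to_facts(detections, threshold):
    """Translate confident detections into ASP-style facts (strings),
    ready to be fed to an answer-set solver."""
    facts = []
    for obj_id, (label, conf) in enumerate(detections):
        if conf >= threshold:
            facts.append(f"obj({obj_id}). has_label({obj_id},{label}).")
    return facts

# Hypothetical detections: (predicted label, confidence).
dets = [("cube", 0.98), ("sphere", 0.40), ("cylinder", 0.91)]
thr = confidence_threshold([c for _, c in dets], q=0.5)
print(to_facts(dets, thr))
# ['obj(0). has_label(0,cube).', 'obj(2). has_label(2,cylinder).']
```

Detections that fall below the threshold are where the paper's non-deterministic encoding comes in: instead of being dropped, they can be left open via ASP choice rules and resolved by the solver.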
Viable supply chain model: integrating agility, resilience and sustainability perspectives—lessons from and thinking beyond the COVID-19 pandemic
2022
Viability is the ability of a supply chain (SC) to maintain itself and survive in a changing environment through a redesign of structures and replanning of performance with long-term impacts. In this paper, we theorize a new notion—the viable supply chain (VSC). In our approach, viability is considered as an underlying SC property spanning three perspectives, i.e., agility, resilience, and sustainability. The principal ideas of the VSC model are adaptable structural SC designs for supply–demand allocations and, most importantly, establishment and control of adaptive mechanisms for transitions between the structural designs. Further, we demonstrate how the VSC components can be categorized across organizational, informational, process-functional, technological, and financial structures. Moreover, our study offers a VSC framework within an SC ecosystem. We discuss the relations between resilience and viability. Through the lens and guidance of dynamic systems theory, we illustrate the VSC model at the technical level. The VSC model can be of value for decision-makers to design SCs that can react adaptively to both positive changes (i.e., the agility angle) and be able to absorb negative disturbances, recover and survive during short-term disruptions and long-term, global shocks with societal and economical transformations (i.e., the resilience and sustainability angles). The VSC model can help firms in guiding their decisions on recovery and re-building of their SCs after global, long-term crises such as the COVID-19 pandemic. We emphasize that resilience is the central perspective in the VSC guaranteeing viability of the SCs of the future. Emerging directions in VSC research are discussed.
Journal Article