Catalogue Search | MBRL
Explore the vast range of titles available.
3,500 result(s) for "Program verification (computers)"
Large language models for chemistry robotics
by Skreta, Marta; Shkurti, Florian; Li, Andrew Zou
in Chemical synthesis, Chemistry, Constraints
2023
This paper proposes an approach to automate chemistry experiments using robots by translating natural language instructions into robot-executable plans, using large language models together with task and motion planning. Adding natural language interfaces to autonomous chemistry experiment systems lowers the barrier to using complicated robotics systems and increases utility for non-expert users, but translating natural language experiment descriptions from users into low-level robotics languages is nontrivial. Furthermore, while recent advances have used large language models to generate task plans, reliably executing those plans in the real world by an embodied agent remains challenging. To enable autonomous chemistry experiments and alleviate the workload of chemists, robots must interpret natural language commands, perceive the workspace, autonomously plan multi-step actions and motions, consider safety precautions, and interact with various laboratory equipment. Our approach, CLAIRify, combines automatic iterative prompting with program verification to ensure syntactically valid programs in a data-scarce domain-specific language that incorporates environmental constraints. The generated plan is executed through solving a constrained task and motion planning problem using PDDLStream solvers to prevent spillages of liquids as well as collisions in chemistry labs. We demonstrate the effectiveness of our approach in planning chemistry experiments, with plans successfully executed on a real robot using a repertoire of robot skills and lab tools. Specifically, we showcase the utility of our framework in pouring skills for various materials and two fundamental chemical experiments for materials synthesis: solubility and recrystallization. Further details about CLAIRify can be found at https://ac-rad.github.io/clairify/.
Journal Article
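The abstract above describes combining automatic iterative prompting with program verification: the model's output is checked by a verifier, and any errors are fed back into the next prompt. A minimal sketch of that loop follows; `generate_plan` and `verify_syntax` are illustrative stand-ins, not CLAIRify's actual API.

```python
# Sketch of an iterative prompt-and-verify loop. The two helpers below
# are hypothetical placeholders for an LLM call and a DSL verifier.

def generate_plan(instruction, feedback=None):
    # Placeholder for an LLM call; a real system would send the
    # instruction (plus any verifier feedback) back to the model.
    return "pour(beaker_1, flask_2)"

def verify_syntax(plan):
    # Toy verifier: accept only plans ending in a closing parenthesis.
    if plan.strip().endswith(")"):
        return []
    return ["plan must end with a closing parenthesis"]

def iterative_prompting(instruction, max_iters=5):
    feedback = None
    for _ in range(max_iters):
        plan = generate_plan(instruction, feedback)
        errors = verify_syntax(plan)
        if not errors:
            return plan  # syntactically valid plan
        feedback = "; ".join(errors)  # feed errors into the next prompt
    raise RuntimeError("no valid plan found within iteration budget")

print(iterative_prompting("dissolve 5 g of salt in water"))
# → pour(beaker_1, flask_2)
```

The key design point is that the verifier's error messages, not just a pass/fail bit, are returned to the model, which is what makes iterative repair possible in a data-scarce domain-specific language.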
Verification of Quantum Computation: An Overview of Existing Approaches
by
Gheorghiu, Alexandru
,
Kapourniotis, Theodoros
,
Kashefi, Elham
in
Computers
,
Cryptography
,
Fault tolerance
2019
Quantum computers promise to efficiently solve not only problems believed to be intractable for classical computers, but also problems for which verifying the solution is also considered intractable. This raises the question of how one can check whether quantum computers are indeed producing correct results. This task, known as quantum verification, has been highlighted as a significant challenge on the road to scalable quantum computing technology. We review the most significant approaches to quantum verification and compare them in terms of structure, complexity and required resources. We also comment on the use of cryptographic techniques which, for many of the presented protocols, has proven extremely useful in performing verification. Finally, we discuss issues related to fault tolerance, experimental implementations and the outlook for future protocols.
Journal Article
Efficient tomography of a quantum many-body system
2017
Traditionally quantum state tomography is used to characterize a quantum state, but it becomes exponentially hard with the system size. An alternative technique, matrix product state tomography, is shown to work well in practical situations.
Quantum state tomography is the standard technique for estimating the quantum state of small systems [1]. But its application to larger systems soon becomes impractical as the required resources scale exponentially with the size. Therefore, considerable effort is dedicated to the development of new characterization tools for quantum many-body states [2-11]. Here we demonstrate matrix product state tomography [2], which is theoretically proven to allow for the efficient and accurate estimation of a broad class of quantum states. We use this technique to reconstruct the dynamical state of a trapped-ion quantum simulator comprising up to 14 entangled and individually controlled spins: a size far beyond the practical limits of quantum state tomography. Our results reveal the dynamical growth of entanglement and describe its complexity as correlations spread out during a quench: a necessary condition for future demonstrations of better-than-classical performance. Matrix product state tomography should therefore find widespread use in the study of large quantum many-body systems and the benchmarking and verification of quantum simulators and computers.
Journal Article
Development of an ABAQUS plugin tool for periodic RVE homogenisation
by
Omairey, Sadik L
,
Sriramula, Srinivas
,
Dunning, Peter D
in
Computer simulation
,
Deformation mechanisms
,
Elastic properties
2019
EasyPBC is an ABAQUS CAE plugin developed to estimate the homogenised effective elastic properties of user created periodic representative volume element (RVE), all within ABAQUS without the need to use third-party software. The plugin automatically applies the concepts of the periodic RVE homogenisation method in the software’s user interface by categorising, creating, and linking sets necessary for achieving deformable periodic boundary surfaces, which can distort and no longer remain plane. Additionally, it allows the user to benefit from finite element analysis data within ABAQUS CAE interface after calculating homogenised properties. In this article, the algorithm of the plugin based on periodic RVE homogenisation method is explained, which could be developed for other commercial FE software packages. Furthermore, examples of its implementation and verification are illustrated.
Journal Article
PGA: a software package for rapid, accurate, and flexible batch annotation of plastomes
2019
Background
Plastome (plastid genome) sequences provide valuable information for understanding the phylogenetic relationships and evolutionary history of plants. Although the rapid development of high-throughput sequencing technology has led to an explosion of plastome sequences, annotation remains a significant bottleneck for plastomes. User-friendly batch annotation of multiple plastomes is an urgent need.
Results
We introduce Plastid Genome Annotator (PGA), a standalone command line tool that can perform rapid, accurate, and flexible batch annotation of newly generated target plastomes based on well-annotated reference plastomes. In contrast to existing tools, PGA uses reference plastomes as the query and unannotated target plastomes as the subject to locate genes, which we refer to as the reverse query-subject BLAST search approach. PGA accurately identifies gene and intron boundaries as well as intron loss. The program outputs GenBank-formatted files as well as a log file to assist users in verifying annotations. Comparisons against other available plastome annotation tools demonstrated the high annotation accuracy of PGA, with little or no post-annotation verification necessary. Likewise, we demonstrated the flexibility of reference plastomes within PGA by annotating the plastome of Rosa roxburghii using that of Amborella trichopoda as a reference. The program, user manual and example data sets are freely available at https://github.com/quxiaojian/PGA.
Conclusions
PGA facilitates rapid, accurate, and flexible batch annotation of plastomes across plants. For projects in which multiple plastomes are generated, the time savings for high-quality plastome annotation are especially significant.
Journal Article
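The "reverse query-subject" idea in the PGA abstract is that the annotated reference is the BLAST query and the new, unannotated plastome is the subject. A minimal sketch of building such a `blastn` invocation follows; the file names are illustrative, and `blastn` from BLAST+ must be on PATH to actually run the command.

```python
# Sketch of a reverse query-subject BLAST command: note the swap,
# with reference genes as -query and the unannotated target as -subject.
# File names below are made-up examples.

def reverse_blast_cmd(reference_fasta, target_fasta, out_tsv):
    return [
        "blastn",
        "-query", reference_fasta,   # annotated reference gene sequences
        "-subject", target_fasta,    # unannotated target plastome
        "-outfmt", "6",              # tabular hits (gene coordinates)
        "-out", out_tsv,
    ]

cmd = reverse_blast_cmd("reference_genes.fasta",
                        "target_plastome.fasta",
                        "hits.tsv")
print(" ".join(cmd))
```

Running the command (e.g. via `subprocess.run(cmd, check=True)`) would yield tabular hits from which gene boundaries on the target plastome can be read off.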
Accrediting outputs of noisy intermediate-scale quantum computing devices
by
Datta, Animesh
,
Ferracin, Samuele
,
Kapourniotis, Theodoros
in
Accreditation
,
Cryptography
,
Gates (circuits)
2019
We present an accreditation protocol for the outputs of noisy intermediate-scale quantum devices. By testing entire circuits rather than individual gates, our accreditation protocol can provide an upper-bound on the variation distance between noisy and noiseless probability distribution of the outputs of the target circuit of interest. Our accreditation protocol requires implementing quantum circuits no larger than the target circuit, therefore it is practical in the near term and scalable in the long term. Inspired by trap-based protocols for the verification of quantum computations, our accreditation protocol assumes that single-qubit gates have bounded probability of error. We allow for arbitrary spatial and temporal correlations in the noise affecting state preparation, measurements, single-qubit and two-qubit gates. We describe how to implement our protocol on real-world devices, and we also present a novel cryptographic protocol (which we call 'mesothetic' protocol) inspired by our accreditation protocol.
Journal Article
Synthesis of High-Quality Visible Faces from Polarimetric Thermal Faces using Generative Adversarial Networks
2019
The large domain discrepancy between faces captured in polarimetric (or conventional) thermal and visible domains makes cross-domain face verification a highly challenging problem for human examiners as well as computer vision algorithms. Previous approaches utilize either a two-step procedure (visible feature estimation and visible image reconstruction) or an input-level fusion technique, where different Stokes images are concatenated and used as a multi-channel input to synthesize the visible image given the corresponding polarimetric signatures. Although these methods have yielded improvements, we argue that input-level fusion alone may not be sufficient to realize the full potential of the available Stokes images. We propose a generative adversarial networks based multi-stream feature-level fusion technique to synthesize high-quality visible images from polarimetric thermal images. The proposed network consists of a generator sub-network, constructed using an encoder–decoder network based on dense residual blocks, and a multi-scale discriminator sub-network. The generator network is trained by optimizing an adversarial loss in addition to a perceptual loss and an identity preserving loss to enable photo realistic generation of visible images while preserving discriminative characteristics. An extended dataset consisting of polarimetric thermal facial signatures of 111 subjects is also introduced. Multiple experiments evaluated on different experimental protocols demonstrate that the proposed method achieves state-of-the-art performance. Code will be made available at https://github.com/hezhangsprinter.
Journal Article
Journals unite for reproducibility
2014
Reproducibility, rigor, transparency, and independent verification are cornerstones of the scientific method. Of course, just because a result is reproducible does not necessarily make it right, and just because it is not reproducible does not necessarily make it wrong. A transparent and rigorous approach, however, can almost always shine a light on issues of reproducibility. This light ensures that science moves forward, through independent verifications as well as the course corrections that come from refutations and the objective examination of the resulting data.
Journal Article
Airborne Software Data Coupling and Control Coupling Analysis
As part of software structural coverage analysis, data coupling and control coupling analysis helps evaluate, to some extent, how requirements are implemented and the quality of the code. This paper examines the definitions of data coupling and control coupling, their analysis methods, and the differences in requirements between DO-178B and DO-178C, providing a reference for airborne software verification.
Journal Article
DIEDA: discriminative information based on exponential discriminant analysis combined with local features representation for face and kinship verification
by
Aliradi, Rachid
,
Elmaghraby, Adel S
,
Belkhir, Abdelkader
in
Computer vision
,
Discriminant analysis
,
Histograms
2025
Face and kinship verification using facial images is a novel and challenging problem in computer vision. In this paper, we propose a new system that uses discriminative information, which is based on the exponential discriminant analysis (DIEDA) combined with multiple scale descriptors. The histograms of different patches are concatenated to form a high dimensional feature vector, which represents a specific descriptor at a given scale. The projected histograms for each zone use the cosine similarity metric to reduce the feature vector dimensionality. Lastly, zone scores corresponding to various descriptors at different scales are fused and verified by using a classifier. This paper exploits discriminative side information for face and kinship verification in the wild (deciding whether image pairs are from the same person or not). To tackle this problem, we take examples of the face samples with unlabeled kin relations from the Labeled Faces in the Wild dataset as the reference set. We create an optimized function by minimizing the distance between intraclass samples (with a kin relation) and maximizing the distance between neighboring interclass samples (without a kinship relation) with the DIEDA approach. Experimental results on three publicly available face and kinship datasets show the superior performance of the proposed system over other state-of-the-art techniques.
Journal Article
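The scoring step in the DIEDA abstract, comparing projected patch histograms with cosine similarity and fusing the per-descriptor zone scores, can be illustrated with a toy example. The vectors and fusion weights below are made up for illustration; they are not taken from the paper.

```python
# Toy illustration of cosine-similarity scoring and score fusion.
# The histograms and weights here are invented example values.
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def fuse_scores(scores, weights=None):
    # Simple weighted-sum fusion of zone/descriptor scores;
    # defaults to equal weights.
    if weights is None:
        weights = [1.0 / len(scores)] * len(scores)
    return sum(w * s for w, s in zip(weights, scores))

h1 = [0.2, 0.5, 0.3]    # histogram of one face patch
h2 = [0.25, 0.45, 0.3]  # matching patch from the other image
zone_score = cosine_similarity(h1, h2)
final_score = fuse_scores([zone_score, 0.8])  # fuse with another descriptor's score
print(round(final_score, 3))
```

In the full system, the fused score would then be thresholded or fed to a classifier to decide whether the pair shares an identity or kin relation.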