Search Results

397 result(s) for "Truth tables"
Vector synthesis of fault testing map for logic
Vector synthesis of a fault testing (simulation) map for logic is proposed, which allows all faults detected on test sets to be determined without simulation, as well as determining test sets that detect specified faults. The synthesis uses a superposition of data structures: a deductive matrix D, derived from the logic vector L, a test truth table T, and a fault truth table F. The deductive matrix is treated as the gene of the functionality and the basis of the fault simulation mechanism for solving the problems of testing and verification. The matrix synthesis rests on one axiom: all the mentioned tables are identical in shape and always interact convolutionally, T⊕L⊕F=0. A universal deductive reversing converter, "test-faults" and "faults-test", for logical functionalities of any dimension is proposed. The converter performs fault simulation on test sets (T→F) and synthesizes test sets (F→T) that detect specified faults. It can be used as a test generation and fault simulation service for IP cores in a system-on-chip (SoC) under the IEEE 1500 SECT standard. Based on the deductive matrix, a fault testing map for logic is built in which each test set is matched with the detected faults of the input lines.
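As a reading aid, the convolution T⊕L⊕F=0 stated in this abstract rearranges bitwise to F = T⊕L and T = F⊕L, which is the two-way "test-faults"/"faults-test" conversion it describes. The Python sketch below only illustrates that bitwise identity over equal-length truth-table vectors; the example vectors are made up, and how the paper actually populates T, L and F follows its deductive-matrix construction.

    # Illustrative only: equal-length bit vectors standing in for the
    # test truth table T, logic vector L and fault truth table F.
    def xor_vec(a, b):
        return [x ^ y for x, y in zip(a, b)]

    L = [0, 0, 0, 1]          # e.g. truth-table output column of a 2-input AND
    T = [1, 0, 1, 1]          # hypothetical test truth table of the same shape
    F = xor_vec(T, L)         # "test -> faults" direction: F = T xor L
    T_back = xor_vec(F, L)    # "faults -> test" direction: T = F xor L

    assert T_back == T
    assert all(t ^ l ^ f == 0 for t, l, f in zip(T, L, F))   # T xor L xor F = 0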
Faults-as-address simulation
Faults-as-address simulation (FAAS) is a simulation mechanism for testing combinations of circuit line faults, represented by the bit addresses of element logic vectors. The XOR relationship between the test set T and the truth table L of an element forms a deductive vector for fault simulation, using truth table addresses, that is, logic vector bits. Addresses are used in the simulation matrix to mark the n-combinations of input faults detected at the element's output. The columns of the simulation matrix are treated as n-row addresses to generate an element's output row via the deductive vector. Input faults are not transported to the element output; only the 1-signals written in the non-input row coordinates of the circuit simulation matrix are. The simulation matrix is initially filled with 1-signals along the main diagonal. The line faults detected on the circuit's test set are determined by the inverses of the lines' good values that have 1-values in the matrix row corresponding to the output circuit element. The deductive vector is obtained by XOR relations between the test set and the logic vector in three table operations. The advantage of the proposed FAAS mechanism is the predictable complexity of the algorithm and of the memory consumption for storing data structures when simulating a test set.
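A rough illustration of the "faults as addresses" idea for a single gate (a sketch, not the authors' implementation): XOR the gate's fault-free output on a test pattern with its whole truth-table column, and the addresses (row indices) holding a 1 are the input combinations whose effect would be visible at the output. The OR gate and test pattern below are arbitrary choices.

    # Sketch under stated assumptions: a 2-input OR gate, logic vector indexed
    # by the integer address of the input combination (b1 b0 -> address).
    L = [0, 1, 1, 1]                    # truth-table column of OR
    test = (0, 1)                       # applied input pattern (b1, b0)
    addr = (test[0] << 1) | test[1]     # address of the test pattern in L

    good = L[addr]                      # fault-free output on this test
    deductive = [good ^ bit for bit in L]   # XOR with the whole logic vector

    # Addresses holding a 1 mark input combinations distinguishable from the
    # applied test at the gate output, i.e. fault combinations "as addresses".
    detected_addresses = [i for i, d in enumerate(deductive) if d == 1]
    print(detected_addresses)           # [0] for OR with test (0, 1)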
Single-Shot Correlations and Two-Qubit Gate of Solid-State Spins
Measurement of coupled quantum systems plays a central role in quantum information processing. We have realized independent single-shot read-out of two electron spins in a double quantum dot. The read-out method is all-electrical, cross-talk between the two measurements is negligible, and read-out fidelities are ∼86% on average. This allows us to directly probe the anticorrelations between two spins prepared in a singlet state and to demonstrate the operation of the two-qubit exchange gate on a complete set of basis states. The results provide a possible route to the realization and efficient characterization of multiqubit quantum circuits based on single quantum dot spins.
Implementing the Quantum von Neumann Architecture with Superconducting Circuits
The von Neumann architecture for a classical computer comprises a central processing unit and a memory holding instructions and data. We demonstrate a quantum central processing unit that exchanges data with a quantum random-access memory integrated on a chip, with instructions stored on a classical computer. We test our quantum machine by executing codes that involve seven quantum elements: Two superconducting qubits coupled through a quantum bus, two quantum memories, and two zeroing registers. Two vital algorithms for quantum computing are demonstrated, the quantum Fourier transform, with 66% process fidelity, and the three-qubit Toffoli-class OR phase gate, with 98% phase fidelity. Our results, in combination especially with longer qubit coherence, illustrate a potentially viable approach to factoring numbers and implementing simple quantum error correction codes.
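For reference, the quantum Fourier transform demonstrated here has a simple closed form: on n qubits it is the N×N unitary with entries ω^(jk)/√N, where ω = e^(2πi/N) and N = 2^n. A short numerical sketch of the three-qubit case (purely illustrative, independent of the superconducting hardware in the paper):

    import numpy as np

    def qft_matrix(n_qubits):
        """N x N quantum Fourier transform unitary, N = 2**n_qubits."""
        N = 2 ** n_qubits
        omega = np.exp(2j * np.pi / N)
        j, k = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
        return omega ** (j * k) / np.sqrt(N)

    U = qft_matrix(3)                              # the 8 x 8 three-qubit case
    assert np.allclose(U @ U.conj().T, np.eye(8))  # unitarity check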
Minimal Weak Truth Table Degrees and Computably Enumerable Turing Degrees
Two of the central concepts for the study of degree structures in computability theory are computably enumerable degrees and minimal degrees. For strong notions of reducibility, such as m-reducibility or truth table reducibility, it is possible for computably enumerable degrees to be minimal. For weaker notions of reducibility, such as weak truth table reducibility or Turing reducibility, it is not possible to combine these properties in a single degree. We consider how minimal weak truth table degrees interact with computably enumerable Turing degrees and obtain three main results. First, there are sets with minimal weak truth table degree which bound noncomputable computably enumerable sets under Turing reducibility. Second, no set with computably enumerable Turing degree can have minimal weak truth table degree. Third, no Δ⁰₂ set which Turing bounds a promptly simple set can have minimal weak truth table degree.
An Introduction to Crisp Set QCA, with a Comparison to Binary Logistic Regression
The authors focus on the dichotomous crisp set form of qualitative comparative analysis (QCA). They review basic set-theoretic QCA methodology, including truth tables, solution formulas, and coverage and consistency measures, and discuss how QCA (a) displays relations between variables, (b) highlights descriptive or complex causal accounts for specific (groups of) cases, and (c) expresses the degree of fit. To help readers determine when QCA's configurational approach might be appropriate, the authors compare and contrast QCA with mainstream statistical methodologies such as binary logistic regressions done on the same data set.
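As a pointer to what the consistency and coverage measures look like in the crisp-set case (a generic textbook formulation, not this article's own worked example): for a condition set X and outcome set Y, the consistency of "X is sufficient for Y" is |X∩Y|/|X| and its coverage is |X∩Y|/|Y|. A minimal sketch with made-up case memberships:

    # Hypothetical crisp-set data: 1 = case is in the set, 0 = it is not.
    cases = {
        "A": {"X": 1, "Y": 1},
        "B": {"X": 1, "Y": 1},
        "C": {"X": 1, "Y": 0},
        "D": {"X": 0, "Y": 1},
    }

    in_x       = sum(c["X"] for c in cases.values())
    in_y       = sum(c["Y"] for c in cases.values())
    in_x_and_y = sum(c["X"] and c["Y"] for c in cases.values())

    consistency = in_x_and_y / in_x   # 2/3: how sufficient X is for Y
    coverage    = in_x_and_y / in_y   # 2/3: how much of Y is covered by X
    print(consistency, coverage)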
Some notes on the Aristotelian doctrine of opposition and the propositional calculus
In this paper, we develop some of Williamson's ideas about the contribution of propositional calculus to a better understanding of Aristotelian logic. Specifically, the use he makes of truth tables in analyzing the structure of the traditional square of opposition is enhanced by a simple technique: The relations of dependence between different propositions allow us to construct “conditioned truth tables”. Taking advantage of this possibility, we propose a new interpretation of several passages of the Organon related to opposition.
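The "conditioned truth table" idea can be pictured as an ordinary truth table from which rows violating a stated dependence between propositions are removed. The toy sketch below only illustrates that filtering step; the condition used (that the universal affirmative A entails the particular affirmative I, i.e. subalternation) is a standard square-of-opposition relation, not a claim about the paper's specific reconstruction.

    from itertools import product

    # Toy propositions: a = "All S are P" (A form), i = "Some S are P" (I form).
    # Dependence imposed: a implies i, so rows with a = True and i = False
    # are excluded from the conditioned truth table.
    rows = [(a, i) for a, i in product([True, False], repeat=2)
            if not (a and not i)]

    for a, i in rows:
        print(f"A={a!s:5}  I={i!s:5}  A-and-I={a and i}")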
An All-Optical Quantum Gate in a Semiconductor Quantum Dot
We report coherent optical control of a biexciton (two electron-hole pairs), confined in a single quantum dot, that shows coherent oscillations similar to the excited-state Rabi flopping in an isolated atom. The pulse control of the biexciton dynamics, combined with previously demonstrated control of the single-exciton Rabi rotation, serves as the physical basis for a two-bit conditional quantum logic gate. The truth table of the gate shows the features of an all-optical quantum gate with interacting yet distinguishable excitons as qubits. Evaluation of the fidelity yields a value of 0.7 for the gate operation. Such experimental capability is essential to a scheme for scalable quantum computation by means of the optical control of spin qubits in dots.
Logic Gates and Computation from Assembled Nanowire Building Blocks
Miniaturization in electronics through improvements in established "top-down" fabrication techniques is approaching the point where fundamental issues are expected to limit the dramatic increases in computing seen over the past several decades. Here we report a "bottom-up" approach in which functional device elements and element arrays have been assembled from solution through the use of electronically well-defined semiconductor nanowire building blocks. We show that crossed nanowire p-n junctions and junction arrays can be assembled in over 95% yield with controllable electrical characteristics, and in addition, that these junctions can be used to create integrated nanoscale field-effect transistor arrays with nanowires as both the conducting channel and gate electrode. Nanowire junction arrays have been configured as key OR, AND, and NOR logic-gate structures with substantial gain and have been used to implement basic computation.
Standards of Good Practice and the Methodology of Necessary Conditions in Qualitative Comparative Analysis
The analysis of necessary conditions for some outcome of interest has long been one of the main preoccupations of scholars in all disciplines of the social sciences. In this connection, the introduction of Qualitative Comparative Analysis (QCA) in the late 1980s has revolutionized the way research on necessary conditions has been carried out. Standards of good practice for QCA have long demanded that the results of preceding tests for necessity constrain QCA's core process of Boolean minimization so as to enhance the quality of parsimonious and intermediate solutions. Schneider and Wagemann's Theory-Guided/Enhanced Standard Analysis (T/ESA) is currently being adopted by applied researchers as the new state-of-the-art procedure in this respect. In drawing on Schneider and Wagemann's own illustrative data example and a meta-analysis of thirty-six truth tables across twenty-one published studies that have adhered to current standards of good practice in QCA, I demonstrate that, once bias against compound conditions in necessity tests is accounted for, T/ESA will produce conservative solutions, and not enhanced parsimonious or intermediate ones.