Catalogue Search | MBRL
Explore the vast range of titles available.
563 result(s) for "minimal computing"
Dickens's Idiomatic Imagination
2023
Dickens's Idiomatic Imagination offers an original analysis of how Charles Dickens's use of "low" and "slangular" (his neologism) language allowed him to express and develop his most sophisticated ideas. Using a hybrid of digital (distant) and analogue (close) reading methodologies, Peter J. Capuano considers Dickens's use of bodily idioms ("right-hand man," "shoulder to the wheel," "nose to the grindstone") against the broader lexical backdrop of the nineteenth century.
Dickens was famously drawn to the vernacular language of London's streets, but this book is the first to call attention to how he employed phrases that embody actions, ideas, and social relations for specific narrative and thematic purposes. Focusing on the mid- to late-career novels Dombey and Son, David Copperfield, Bleak House, Great Expectations, and Our Mutual Friend, Capuano demonstrates how Dickens came to relish using common idioms in uncommon ways and the possibilities they opened up for artistic expression. Dickens's Idiomatic Imagination establishes a unique framework within the social history of language alteration in nineteenth-century Britain for rethinking Dickens's literary trajectory and its impact on the vocabularies of generations of novelists, critics, and speakers of English.
Itinerant Entrepreneurs: #bigger6, Digital Humanities Pedagogy & Minimal Computing
2024
This article offers insight into the creation of a project-based graduate seminar that relied on collaboration, play, student agency, and practice to understand the powerful Bigger 6 Global Romanticism movement. By deploying Digital Humanities pedagogy and minimal computing, these first-generation students at a minority-serving institution began to understand that the literary voices heralding the Industrial Revolution and mechanization of print culture were immigrant, non-white, or female. The culminating digital project, "The Bengal Annual: A Digital Exploration of Non-Canonical British Romantic Literature," explores what can be accomplished in a single semester as "itinerant entrepreneurs" of Digital Humanities.
Journal Article
Realizing repeated quantum error correction in a distance-three surface code
by Müller, Markus; Eichler, Christopher; Andersen, Christian Kraglund
2022
Quantum computers hold the promise of solving computational problems that are intractable using conventional methods [1]. For fault-tolerant operation, quantum computers must correct errors occurring owing to unavoidable decoherence and limited control accuracy [2]. Here we demonstrate quantum error correction using the surface code, which is known for its exceptionally high tolerance to errors [3-6]. Using 17 physical qubits in a superconducting circuit, we encode quantum information in a distance-three logical qubit, building on recent distance-two error-detection experiments [7-9]. In an error-correction cycle taking only 1.1 μs, we demonstrate the preservation of four cardinal states of the logical qubit. Repeatedly executing the cycle, we measure and decode both bit-flip and phase-flip error syndromes using a minimum-weight perfect-matching algorithm in an error-model-free approach and apply corrections in post-processing. We find a low logical error probability of 3% per cycle when rejecting experimental runs in which leakage is detected. The measured characteristics of our device agree well with a numerical model. Our demonstration of repeated, fast and high-performance quantum error-correction cycles, together with recent advances in ion traps [10], supports our understanding that fault-tolerant quantum computation will be practically realizable.
By using 17 physical qubits in a superconducting circuit to encode quantum information in a surface-code logical qubit, fast (1.1 μs) and high-performance (logical error probability of 3%) quantum error-correction cycles are demonstrated.
Journal Article
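The preceding record refers to decoding bit-flip and phase-flip syndromes with a minimum-weight perfect-matching (MWPM) algorithm. As a rough illustration only, not the experiment's decoder, the idea can be shown on a toy repetition code: syndrome defects become graph nodes, edge weights count the data-qubit flips needed to connect them, and a minimum-weight perfect matching pairs up the defects. The boundary-node construction and the use of networkx's min_weight_matching here are assumptions made for the example.

```python
# Illustrative sketch only, not the experiment's decoder: minimum-weight
# perfect matching (MWPM) on a toy repetition code, assuming the networkx
# min_weight_matching API.
import networkx as nx

def decode_repetition(syndrome):
    """Pair up syndrome defects of a repetition code by MWPM.

    `syndrome` lists the 0/1 outcomes of the parity checks; each defect is
    also given a virtual boundary partner so any number of defects can match.
    """
    defects = [i for i, s in enumerate(syndrome) if s == 1]
    g = nx.Graph()
    for a in range(len(defects)):
        for b in range(a + 1, len(defects)):
            # weight = number of data-qubit flips joining the two defects
            g.add_edge(("d", a), ("d", b), weight=abs(defects[a] - defects[b]))
            # boundary nodes may be matched with each other at no cost
            g.add_edge(("b", a), ("b", b), weight=0)
    for a, pos in enumerate(defects):
        # cost of closing a defect onto the nearest lattice boundary
        g.add_edge(("d", a), ("b", a), weight=min(pos + 1, len(syndrome) - pos))
    return nx.min_weight_matching(g)

# two defects at checks 1 and 4: MWPM pairs them through the bulk
print(decode_repetition([0, 1, 0, 0, 1, 0]))
```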
Topological and Subsystem Codes on Low-Degree Graphs with Flag Qubits
2020
In this work we introduce two code families, which we call the heavy-hexagon code and the heavy-square code. Both code families are implemented by assigning physical data and ancilla qubits to both vertices and edges of low-degree graphs. Such a layout is particularly suitable for superconducting qubit architectures to minimize frequency collisions and cross talk. In some cases, frequency collisions can be reduced by several orders of magnitude. The heavy-hexagon code is a hybrid surface and Bacon-Shor code mapped onto a (heavy-) hexagonal lattice, whereas the heavy-square code is the surface code mapped onto a (heavy-) square lattice. In both cases, the lattice includes all the ancilla qubits required for fault-tolerant error correction. Naively, the limited qubit connectivity might be thought to limit the error-correcting capability of the code to less than its full distance. Therefore, essential to our construction is the use of flag qubits. We modify minimum-weight perfect-matching decoding to efficiently and scalably incorporate information from measurements of the flag qubits and correct up to the full code distance while respecting the limited connectivity. Simulations show that high threshold values for both codes can be obtained using our decoding protocol. Further, our decoding scheme can be adapted to other topological code families.
Journal Article
Automatic detection of lung cancer from biomedical data set using discrete AdaBoost optimized ensemble learning generalized neural networks
by Al-Makhadmeh, Zafer; Mohamed, Shakeel P.; Tolba, Amr
in Biomedical data, Datasets, Ensemble learning
2020
Today, many people are affected by lung cancer, mainly because of genetic changes in lung tissue. Other factors such as smoking, alcohol, and exposure to dangerous gases are also contributory causes. Because of the serious consequences of lung cancer, medical associations have been striving to diagnose the disease at an early stage of growth by applying the computer-aided diagnosis (CAD) process. Although CAD systems at healthcare centers can diagnose lung cancer at an early stage, high detection accuracy is difficult to achieve, mainly because of overfitting of lung cancer features and the dimensionality of the feature set. This paper therefore introduces effective and optimized neural computing and soft computing techniques to minimize these difficulties. Initially, lung biomedical data were collected from the ELVIRA Biomedical Data Set Repository. Noise in the data was eliminated by applying a bin-smoothing normalization process. Minimum-repetition and Wolf heuristic features were then selected to reduce the dimensionality and complexity of the feature set. The selected lung features were analyzed using discrete-AdaBoost-optimized ensemble learning generalized neural networks, which analyzed the biomedical lung data and classified normal and abnormal features with great effectiveness. The efficiency of the system was then evaluated in a MATLAB experimental setup in terms of error rate, precision, recall, G-mean, F-measure, and prediction rate.
Journal Article
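The classifier named in the preceding abstract (a discrete-AdaBoost-optimized ensemble of generalized neural networks) is specific to that paper. As a generic, hedged illustration of discrete AdaBoost ensemble classification only, the scikit-learn sketch below trains a boosted ensemble on a synthetic binary-classification matrix and reports the same kind of precision, recall, and F-measure figures; the dataset and parameters are placeholders, not the paper's.

```python
# Generic illustration only, not the paper's pipeline: a discrete AdaBoost
# ensemble on a synthetic "biomedical" feature matrix, using scikit-learn.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.metrics import f1_score, precision_score, recall_score
from sklearn.model_selection import train_test_split

# placeholder data standing in for the preprocessed lung feature set
X, y = make_classification(n_samples=500, n_features=40, n_informative=10,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# algorithm="SAMME" selects the discrete AdaBoost variant
clf = AdaBoostClassifier(n_estimators=100, algorithm="SAMME", random_state=0)
clf.fit(X_tr, y_tr)
pred = clf.predict(X_te)

print("precision", precision_score(y_te, pred))
print("recall   ", recall_score(y_te, pred))
print("F-measure", f1_score(y_te, pred))
```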
Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward–backward splitting, and regularized Gauss–Seidel methods
2013
In view of the minimization of a nonsmooth nonconvex function f, we prove an abstract convergence result for descent methods satisfying a sufficient-decrease assumption and allowing a relative error tolerance. Our result guarantees the convergence of bounded sequences under the assumption that the function f satisfies the Kurdyka–Łojasiewicz inequality. This assumption allows us to cover a wide range of problems, including nonsmooth semi-algebraic (or, more generally, tame) minimization. The specialization of our result to different kinds of structured problems provides several new convergence results for inexact versions of the gradient method, the proximal method, the forward–backward splitting algorithm, the gradient-projection method, and some proximal regularization of the Gauss–Seidel method in a nonconvex setting. Our results are illustrated through feasibility problems and iterative thresholding procedures for compressive sensing.
Journal Article
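For readers unfamiliar with the framework referenced in the abstract above, abstract descent methods of this kind are usually analysed under a sufficient-decrease condition and a relative-error bound on subgradients. The formulation below is a hedged paraphrase of that standard setup, not a quotation from the article; the constants a, b > 0 are generic.

```latex
% Hedged paraphrase, not quoted from the article: the two abstract
% conditions under which such descent sequences are typically analysed.
% (H1) sufficient decrease, for some a > 0:
f(x^{k+1}) + a \,\lVert x^{k+1} - x^{k}\rVert^{2} \le f(x^{k})
% (H2) relative error, for some b > 0 and some subgradient w^{k+1}:
\exists\, w^{k+1} \in \partial f(x^{k+1}) \quad \text{such that} \quad
\lVert w^{k+1}\rVert \le b \,\lVert x^{k+1} - x^{k}\rVert
```

When f additionally satisfies the Kurdyka–Łojasiewicz inequality, bounded sequences obeying such conditions converge, which is the convergence result the abstract refers to.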
An intelligent big data collection technology based on micro mobile data centers for crowdsensing vehicular sensor network
2023
The rapid development of the Internet of Things (IoT) has greatly driven the development of mobile crowdsensing vehicular sensor networks (CVSNs). Many fascinating big data–based applications have been developed, such as traffic management, health monitoring, and smart cities. How to collect enough data without introducing too much redundancy remains a challenging problem in big data applications for CVSNs. In this paper, a data relay mule–based collection scheme (DRMCS) is proposed to improve the quality of service (QoS). Compared with previous research, the innovations of DRMCS are as follows. First, a data collection framework that considers the sensing-task completion rate, redundancy rate, and delay is proposed. Second, the micro mobile data center (MMDC) is proposed to solve the problem of connecting the huge number of intelligent sensing devices to the data center. Third, an MMDC selection strategy based on the simulated annealing algorithm is proposed in DRMCS to improve data collection performance. Compared with traditional vehicular-network opportunistic communication without a data relay mule (OCDRM), DRMCS improves the sensing-task completion rate by 78.6%.
Journal Article
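The DRMCS abstract above mentions selecting micro mobile data centers (MMDCs) with a simulated-annealing strategy. The sketch below is only a generic simulated-annealing loop on a toy placement cost (distance from vehicles to their nearest selected site); the cost model, neighbourhood move, and cooling schedule are assumptions for illustration and are not taken from the paper.

```python
# Generic simulated-annealing sketch, not DRMCS itself: choose k candidate
# "micro mobile data center" sites so that vehicles are close to a selected
# site. Cost model, neighbourhood move and cooling schedule are assumed.
import math
import random

random.seed(0)
sites = [(random.random(), random.random()) for _ in range(30)]     # candidate MMDC sites
vehicles = [(random.random(), random.random()) for _ in range(200)] # sensing vehicles

def cost(chosen):
    # total distance from each vehicle to its nearest selected site
    return sum(min(math.dist(v, sites[i]) for i in chosen) for v in vehicles)

def neighbour(chosen):
    # swap one selected site for an unselected one
    out = random.choice(sorted(chosen))
    inn = random.choice([i for i in range(len(sites)) if i not in chosen])
    return (chosen - {out}) | {inn}

def anneal(state, t=1.0, cooling=0.995, steps=3000):
    best = cur = state
    for _ in range(steps):
        cand = neighbour(cur)
        delta = cost(cand) - cost(cur)
        if delta < 0 or random.random() < math.exp(-delta / t):
            cur = cand
            if cost(cur) < cost(best):
                best = cur
        t *= cooling
    return best

start = set(random.sample(range(len(sites)), 5))
print(round(cost(start), 2), "->", round(cost(anneal(start)), 2))
```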
Multipartite reflected entropy
by Cheng, Newton; Bao, Ning
in AdS-CFT Correspondence, Black Holes, Classical and Quantum Gravitation
2019
We discuss two methods that, through a combination of cyclically gluing copies of a given n-party boundary state in AdS/CFT and a canonical purification, create a bulk geometry containing a boundary-homologous minimal surface whose area equals 2 or 4 times the n-party entanglement wedge cross-section, depending on the parity of the party number and the choice of method. The areas of the minimal surfaces are each dual to entanglement entropies that we define to be candidates for the n-party reflected entropy. In the context of AdS3/CFT2, we provide a boundary interpretation of our construction as a multiboundary wormhole, and conjecture that this interpretation generalizes to higher dimensions.
Journal Article
Energy efficient offloading mechanism using particle swarm optimization in 5G enabled edge nodes
by Venkatachalam, K.; Bacanin, Nebojsa; Malebary, Sharaf
in 5G mobile communication, Algorithms, Cloud computing
2023
Today's world depends on wireless devices for daily necessities such as communication, smart driving, smart medical check-ups, and smart home security. These applications create huge amounts of data to be processed across edge and cloud devices. Mobile and wireless devices can handle this data only within practical limits on computing capacity; these limits are otherwise difficult to manage and can be overcome by using mobile edge computing technology. When computing tasks depend on edge devices to store and process data, work tends to be offloaded to available edge nodes. Advanced smart applications use 5G networks to process data in edge nodes with central units or distributed cloud units. Our research problem focuses on energy-saving 5G data offloading; it mainly concerns selecting appropriate edge nodes with minimum cost and energy for the 5G data offloading process. Balancing the load at every edge node has become a crucial task in advanced 5G networks, since denser, high-capacity networks tend to consume correspondingly more energy. In the proposed work, energy-efficient offloading is performed with mobile edge computing (MEC), macro base stations, and small base stations to process data with less energy. Minimum-energy devices in the edge network are selected using the particle swarm optimization (PSO) algorithm. The proposed offloading scheme helps process data in 5G networks very effectively. The workload energy of the 5G network at the IoT and MEC layers is preserved by using a multi-level offloading mechanism, and system complexity is further reduced with the PSO energy-optimization algorithm to lower execution time and energy. Results show that for a set of 500 tasks, the mobile edge server consumes 11 J of energy per task execution, while the core cloud consumes 15 J. Mobile edge computing consumes less energy than cloud and mobile devices.
Journal Article
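The preceding abstract describes selecting minimum-energy edge nodes with particle swarm optimization (PSO). As a hedged, self-contained illustration of plain PSO only, not the paper's offloading scheme, the sketch below minimizes a toy per-task energy model over a continuous offloading vector; the inertia and acceleration coefficients and the energy function are assumed values.

```python
# Hedged illustration, not the paper's scheme: plain particle swarm
# optimization minimizing a toy "offloading energy" over a continuous
# decision vector (fraction of each task sent to the edge).
import random

def pso(cost, dim, n_particles=20, iters=200, w=0.7, c1=1.5, c2=1.5):
    pos = [[random.random() for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=cost)
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(1.0, max(0.0, pos[i][d] + vel[i][d]))
            if cost(pos[i]) < cost(pbest[i]):
                pbest[i] = pos[i][:]
        gbest = min(pbest + [gbest], key=cost)
    return gbest

def energy(x):
    # offloading a fraction x_j costs transmission energy; keeping it local
    # costs (convex) computation energy
    return sum(0.4 * xi + 1.0 * (1.0 - xi) ** 2 for xi in x)

random.seed(1)
best = pso(energy, dim=10)
print(round(energy(best), 3))
```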
Cooperative agents-based approach for workflow scheduling on fog-cloud computing
by Mokni, Marwa; Hajlaoui, Jalel Eddine; Omri, Mohamed Nazih
in Artificial Intelligence, Cloud computing, Communication
2022
Connected objects in the Internet of Things (IoT) domain are widespread everywhere. They interact with each other and cooperate with their neighbors to achieve a common goal. Most of these objects generate a huge amount of data, often requiring processing under strict time constraints. Motivated by the question of optimizing the execution time of these IoT tasks, we remain aware of the sensitivity to latency and the volume of data generated. In this article, we propose a hybrid Cloud-Fog multi-agent approach to schedule a set of dependent IoT tasks modeled as a workflow. The major advantage of our approach is that it models IoT workflow planning as a multi-objective optimization problem in order to produce a compromise planning solution in terms of response time, cost, and makespan. In addition to taking data communications between workflow tasks into account during the planning process, our approach has two other advantages: (1) maximizing the use of Fog computing in order to minimize response time, and (2) using elastic Cloud computing resources at minimum cost. The implementation of the MAS-GA (Multi-Agent System based Genetic Algorithm) proposed in this context, the series of experiments carried out on different corpora, and the analysis of the results confirm the feasibility of our approach and its performance in terms of cost (an average gain of 21.38% compared with Fog and 25.49% compared with Cloud), makespan (a gain of 14.13% compared with Fog, with a slight increase of 5.24% compared with Cloud), and response time (an average gain of 46.66% compared with Cloud, with a slight increase of 6.66% compared with Fog), all while strengthening the collaboration between Fog computing and Cloud computing.
Journal Article
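The MAS-GA described in the preceding abstract combines a multi-agent system with a genetic algorithm. The sketch below illustrates only the genetic-algorithm part on a toy problem, assigning hypothetical workflow tasks to fog or cloud nodes under an assumed cost/makespan trade-off; the speeds, prices, and weighting are placeholders, not values from the article.

```python
# Hedged sketch, not MAS-GA itself: a small genetic algorithm assigning
# hypothetical workflow tasks to fog or cloud nodes under an assumed
# cost/makespan trade-off. Speeds, prices and weights are placeholders.
import random

random.seed(2)
TASKS = 12
SPEED = {"fog": 1.0, "cloud": 3.0}   # assumed relative processing speeds
PRICE = {"fog": 0.2, "cloud": 1.0}   # assumed relative cost per work unit
WORK = [random.uniform(1, 5) for _ in range(TASKS)]

def fitness(plan):
    makespan = {"fog": 0.0, "cloud": 0.0}
    money = 0.0
    for w, node in zip(WORK, plan):
        makespan[node] += w / SPEED[node]
        money += w * PRICE[node]
    return max(makespan.values()) + 0.5 * money   # weighted compromise objective

def evolve(pop_size=40, gens=100, mut=0.1):
    pop = [[random.choice(["fog", "cloud"]) for _ in range(TASKS)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, TASKS)
            child = a[:cut] + b[cut:]                      # one-point crossover
            child = [random.choice(["fog", "cloud"]) if random.random() < mut
                     else g for g in child]                # mutation
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

print(round(fitness(evolve()), 2))
```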