Catalogue Search | MBRL
227,746 result(s) for "Data integrity"
Decentralized trust framework for smart cities: a blockchain-enabled cybersecurity and data integrity model
by AlZubi, Ahmad Ali; Islam, Rafiqul; Roy, Sandip
in 639/166; 639/4077; AI-Driven threat detection
2025
The rapid evolution of smart cities has led to transformative advancements through the integration of IoT devices, sensors, and data-driven systems, yet has simultaneously exposed critical vulnerabilities in cybersecurity, data integrity, and trust management. This research proposes a Decentralized Trust Framework that leverages blockchain technology, AI-driven threat detection, and a Lightweight Adaptive Proof-of-Stake (LA-PoS) consensus mechanism to address these challenges. The framework integrates three key layers: a Blockchain Layer for decentralized trust and immutability, a Cybersecurity Layer employing cryptographic standards and AI-based anomaly detection, and a Data Integrity Protocol Layer for real-time synchronization and tamper-proof data validation. Performance evaluations indicate the framework achieves a threefold increase in transaction throughput, a 30% reduction in latency, and enhanced energy efficiency compared to traditional blockchain systems. Security metrics highlight a 98.2% threat detection rate and a substantial reduction in false positives, while resource optimization nearly doubles IoT device battery life. The framework demonstrates applicability in critical smart city use cases, including smart traffic management, energy systems, and public safety, providing secure, scalable, and efficient solutions for urban infrastructures. Despite these advancements, challenges such as interoperability among heterogeneous systems, computational overhead for IoT devices, and policy adoption persist. Future research will focus on optimizing interoperability protocols, incorporating quantum-resistant cryptographic techniques, and extending the framework to emerging domains such as autonomous systems and smart healthcare. The proposed framework provides a robust foundation for building sustainable, resilient, and trustworthy urban ecosystems, bridging gaps in current smart city technologies.
Journal Article
Data integrity and quality
2021
Data integrity is the quality, reliability, trustworthiness, and completeness of a data set, providing accuracy, consistency, and context. Data quality refers to the state of qualitative or quantitative pieces of information. Over five sections, this book discusses data integrity and data quality as well as their applications in various fields.
A custom hash algorithm for hosting secure gray scale image repository in public cloud
by Chidambaram, Nithya; Murugesan, Veenasri; Amirtharajan, Rengarajan
in 639/166; 639/705; Algorithms
2025
Cloud computing has become an essential platform for securing resources and managing files effectively. In digital systems, data leaks and breaches frequently occur during storage and transmission, and researchers worldwide have developed several techniques for secure image transmission. Traditionally, data loss prevention (DLP) has been the primary defence for sensitive data, but existing storage systems cannot feasibly handle today's massive data volumes, and ensuring data security remains a serious challenge. Cloud infrastructure offers a more robust, reliable, and scalable alternative; its primary objective is to provide affordable, easy access to storage, with vast amounts of data spread across multiple cloud storage services. This paper proposes a custom block-based hash algorithm that generates a digital fingerprint from a grayscale image. Its pivotal contributions are data-integrity generation and validation, tamper detection, and accurate identification of the tampered region. The entire 256 × 256 image is tamper-proofed using the hash values the algorithm generates; during integrity validation, the freshly computed digest is compared with the original. The cloud environment provides scalable infrastructure for securely managing and storing the digital fingerprint, and user-level authentication is incorporated into the framework. Additionally, a Graphical User Interface (GUI) application generates the hash and verifies whether an image has been tampered with, marking any tampered region with a bounding box. Benchmark metrics, including quantitative and qualitative tests for integrity codes, the collision property, and the avalanche effect, show that the proposed algorithm validates integrity effectively.
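The paper's exact hash construction is not given in the abstract, but the general idea of block-wise digests with tamper localization can be sketched as follows. This is a minimal illustration, not the authors' algorithm: the 16-pixel block size, SHA-256 as the per-block hash, and all function names are assumptions for the sake of the example.

```python
import hashlib

BLOCK = 16  # hypothetical block size; the paper works on 256 x 256 grayscale images

def block_digests(image, block=BLOCK):
    """Hash each block x block tile of a 2D grayscale image (list of rows of ints)."""
    h, w = len(image), len(image[0])
    digests = {}
    for r in range(0, h, block):
        for c in range(0, w, block):
            tile = bytes(
                image[i][j]
                for i in range(r, min(r + block, h))
                for j in range(c, min(c + block, w))
            )
            digests[(r, c)] = hashlib.sha256(tile).hexdigest()
    return digests

def tampered_regions(original_digests, suspect_image, block=BLOCK):
    """Return coordinates of blocks whose digests no longer match the originals."""
    current = block_digests(suspect_image, block)
    return sorted(k for k in original_digests if current[k] != original_digests[k])

def bounding_box(regions, block=BLOCK):
    """Smallest rectangle (top, left, bottom, right) covering the mismatched blocks."""
    rows = [r for r, _ in regions]
    cols = [c for _, c in regions]
    return (min(rows), min(cols), max(rows) + block, max(cols) + block)
```

Because each digest covers only one tile, a mismatch pinpoints the tampered region directly, which is what lets the GUI described above draw a bounding box around it.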
Journal Article
Big data : does size matter?
"Timandra Harkness cuts through the hype to put data science into its real-life context using a wide range of stories, people, and places to reveal what is essentially a human science--demystifying big data, telling us where it comes from and what it can do. 'Big Data' then asks the awkward questions: What are the unspoken assumptions underlying its methods? Are we being bamboozled by mega data's size, its speed, and its shiny technology? Nobody needs a degree in computer science to follow Harkness's exploration of what mega data can do for us--and what it can't or shouldn't. 'Big Data' asks you to decide: Are you a data point, or a human being?"--Provided by publisher.
Certificateless data integrity auditing with sparse Merkle trees for the cloud-edge environment
2025
Ensuring data integrity in cloud-edge environments is critical for IoT ecosystems but is challenged by dynamic data and resource constraints. This paper proposes a certificateless auditing scheme harmonizing cloud security with edge efficiency. By integrating online/offline cryptography and sparse Merkle trees, the approach achieves (1) a significant reduction in user-side computation via offline or edge-side tag generation, (2) lower dynamic-update complexity than traditional approaches, and (3) 75% savings in communication overhead through a pre-download mechanism. The scheme eliminates certificate management and mitigates Key Generation Centre (KGC) risks via decentralized trust mechanisms. Security proofs demonstrate resilience against KGC collusion and tag forgery under the Inv-CDH assumption. Experiments show the scheme audits faster than prior schemes, supporting 500k+ operations at sub-second latency. This work bridges scalability and real-time demands for smart cities and Industry 4.0 while enabling future extensions in ML-optimized caching and blockchain trust models.
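The sparse Merkle tree underlying this kind of auditing can be sketched in a few lines. The trick is that empty subtrees all hash to precomputable defaults, so a tree over a huge key space needs storage only for populated leaves, and the same proof format covers both presence and absence. The depth of 8 and all names below are illustrative assumptions, not the paper's construction.

```python
import hashlib

DEPTH = 8              # hypothetical depth: 2**8 addressable leaves
EMPTY = b"\x00" * 32   # default value for unset leaves

def H(a, b):
    return hashlib.sha256(a + b).digest()

# Precompute the hash of an empty subtree at each level, so absent
# branches need no storage -- the defining trick of a *sparse* Merkle tree.
empty_hashes = [EMPTY]
for _ in range(DEPTH):
    empty_hashes.append(H(empty_hashes[-1], empty_hashes[-1]))

def root_and_proof(leaves, index):
    """Given {leaf_index: 32-byte value}, return the root and an inclusion
    proof (sibling hashes, leaf to root) for `index`."""
    level = dict(leaves)
    proof = []
    for d in range(DEPTH):
        proof.append(level.get(index ^ 1, empty_hashes[d]))
        nxt = {}
        for i in set(level) | {index}:
            p = i // 2
            if p not in nxt:
                left = level.get(2 * p, empty_hashes[d])
                right = level.get(2 * p + 1, empty_hashes[d])
                nxt[p] = H(left, right)
        level = nxt
        index //= 2
    return level[0], proof

def verify(root, index, value, proof):
    """Recompute the root from a leaf value and its sibling path."""
    acc = value
    for sib in proof:
        acc = H(acc, sib) if index % 2 == 0 else H(sib, acc)
        index //= 2
    return acc == root
```

An auditor holding only the root can then check any leaf, and proving that an index still holds `EMPTY` doubles as a non-membership proof, which is what makes dynamic updates cheap: changing one leaf touches only the `DEPTH` nodes on its path.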
Journal Article
Veracity of big data : machine learning and other approaches to verifying truthfulness
Examine the problem of maintaining the quality of big data and discover novel solutions. You will learn the four V's of big data, including veracity, and study the problem from various angles. The solutions discussed are drawn from diverse areas of engineering and math, including machine learning, statistics, formal methods, and the Blockchain technology. Veracity of Big Data serves as an introduction to machine learning algorithms and diverse techniques such as the Kalman filter, SPRT, CUSUM, fuzzy logic, and Blockchain, showing how they can be used to solve problems in the veracity domain. Using examples, the math behind the techniques is explained in easy-to-understand language. Determining the truth of big data in real-world applications involves using various tools to analyze the available information. This book delves into some of the techniques that can be used. Microblogging websites such as Twitter have played a major role in public life, including during presidential elections. The book uses examples of microblogs posted on a particular topic to demonstrate how veracity can be examined and established. Some of the techniques are described in the context of detecting veiled attacks on microblogging websites to influence public opinion. -- Back cover.
A Multi-Key with Partially Homomorphic Encryption Scheme for Low-End Devices Ensuring Data Integrity
by Bejaoui, Tarek; Hammoudeh, Mohammad; Al-Khalidi, Mohammed
in Algorithms; Asymmetry; Cloud computing
2023
In today’s hyperconnected world, the Internet of Things and Cloud Computing complement each other in several areas. Cloud Computing provides IoT systems with an efficient and flexible environment that supports application requirements such as real-time control/monitoring, scalability, fault tolerance, and numerous security services. Hardware and software limitations of IoT devices can be mitigated using the massive on-demand cloud resources. However, IoT cloud-based solutions pose some security and privacy concerns, specifically when an untrusted cloud is used. This calls for strong encryption schemes that allow operations on data in an encrypted format without compromising the encryption. This paper presents an asymmetric multi-key and partially homomorphic encryption scheme. The scheme provides the addition operation by encrypting each decimal digit of the given integer number separately using a special key. In addition, data integrity processes are performed when an untrusted third party performs homomorphic operations on encrypted data. The proposed work considers the most widely known issues like the encrypted data size, slow operations at the hardware level, and high computing costs at the provider level. The size of generated ciphertext is almost equal to the size of the plaintext, and order-preserving is ensured using an asymmetrical encryption version.
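The abstract's digit-wise idea — encrypt each decimal digit separately, then add under encryption — can be illustrated with a textbook additively homomorphic scheme. This sketch substitutes toy-parameter Paillier encryption for the paper's own (unspecified) multi-key scheme; the primes, helper names, and carry handling are all assumptions for illustration, and real deployments use 2048-bit keys.

```python
import math
import random

def L(x, n):
    return (x - 1) // n

class Paillier:
    """Textbook Paillier: additively homomorphic, g = n + 1."""
    def __init__(self, p, q):
        self.n = p * q
        self.n2 = self.n * self.n
        self.g = self.n + 1
        self.lam = math.lcm(p - 1, q - 1)
        self.mu = pow(L(pow(self.g, self.lam, self.n2), self.n), -1, self.n)

    def enc(self, m):
        r = random.randrange(1, self.n)
        return pow(self.g, m, self.n2) * pow(r, self.n, self.n2) % self.n2

    def dec(self, c):
        return L(pow(c, self.lam, self.n2), self.n) * self.mu % self.n

def enc_digits(pk, x):
    """Encrypt each decimal digit separately (least significant first)."""
    return [pk.enc(int(d)) for d in reversed(str(x))]

def add_cipher(pk, cs1, cs2):
    """Homomorphic digit-wise addition: multiplying ciphertexts adds plaintexts."""
    length = max(len(cs1), len(cs2))
    cs1 = cs1 + [pk.enc(0)] * (length - len(cs1))
    cs2 = cs2 + [pk.enc(0)] * (length - len(cs2))
    return [a * b % pk.n2 for a, b in zip(cs1, cs2)]

def dec_digits(sk, cs):
    """Decrypt per-digit sums and propagate carries to recover the integer."""
    total, carry = 0, 0
    for i, c in enumerate(cs):
        s = sk.dec(c) + carry
        total += (s % 10) * 10 ** i
        carry = s // 10
    return total + carry * 10 ** len(cs)
```

Note that digit sums can exceed 9 under encryption, so the carries are resolved only after decryption; keeping ciphertexts per digit is also what keeps each ciphertext small, echoing the abstract's point about ciphertext size staying close to plaintext size.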
Journal Article
Blockchain and clinical trial : securing patient data
"This book aims to highlight the gaps and the transparency issues in the clinical research and trials processes and how there is a lack of information flowing back to researchers and patients involved in those trials. Lack of data transparency is an underlying theme within the clinical research world and causes issues of corruption, fraud, errors and a problem of reproducibility. Blockchain can prove to be a method to ensure a much more joined up and integrated approach to data sharing and improving patient outcomes. Surveys undertaken by creditable organisations in the healthcare industry are analysed in this book that show strong support for using blockchain technology regarding strengthening data security, interoperability and a range of beneficial use cases where mostly all respondents of the surveys believe blockchain will be important for the future of the healthcare industry. Another aspect considered in the book is the coming surge of healthcare wearables using Internet of Things (IoT) and the prediction that the current capacity of centralised networks will not cope with the demands of data storage. The benefits are great for clinical research, but will add more pressure to the transparency of clinical trials and how this is managed unless a secure mechanism like blockchain is used"--Publisher's description.
Spatial Resolution and Data Integrity Enhancement of Microwave Radiometer Measurements Using Total Variation Deconvolution and Bilateral Fusion Technique
2022
Passive multi-frequency microwave sensors are indispensable instruments for worldwide environmental monitoring. However, they often suffer from the issues of poor spatial resolution and the original land–sea transition zone data are contaminated severely. Conventional analytical deconvolution methods enhance the spatial resolution at the expense of noise amplification and Gibbs fluctuations in the land–sea transition zone. In order to enhance the spatial resolution as well as simultaneously enhance the integrity of the Microwave Radiometer data, a method based on Total Variation deconvolution, Bilateral Filter, and data fusion (TVBF+) is proposed. Our method substantially improves data integrity and obtains similar enhanced resolution compared to existing methods. Experiments performed using both simulated and actual microwave radiation Imager (MWRI) data demonstrate the method’s robustness and effectiveness.
Journal Article