106 result(s) for "NIST analysis"
RGB Image Encryption through Cellular Automata, S-Box and the Lorenz System
The exponential growth in transmission of multimedia over the Internet and unsecured channels of communications is putting pressure on scientists and engineers to develop effective and efficient security schemes. In this paper, an image encryption scheme is proposed to help solve such a problem. The proposed scheme is implemented over three stages. The first stage makes use of a Rule 30 cellular automaton to generate the first encryption key. The second stage utilizes a well-tested S-box, whose design involves a transformation, modular inverses, and permutation. Finally, the third stage employs a solution of the Lorenz system to generate the second encryption key. The aggregate effect of this 3-stage process ensures the application of Shannon’s confusion and diffusion properties of a cryptographic system and enhances the security and robustness of the resulting encrypted images. Specifically, the use of the PRNG bitstreams from both the cellular automaton and the Lorenz system, as keys, combined with the S-box, results in the needed non-linearity and complexity inherent in well-encrypted images, which is sufficient to frustrate attackers. Performance evaluation is carried out with statistical and sensitivity analyses to check for and demonstrate the security and robustness of the proposed scheme. On testing the resulting encrypted Lena image, the proposed scheme results in an MSE value of 8923.03, a PSNR value of 8.625 dB, an information entropy of 7.999, an NPCR value of 99.627%, and a UACI value of 33.46%. The proposed scheme is shown to encrypt images at an average rate of 0.61 Mbps. A comparative study with counterpart image encryption schemes from the literature is also presented to showcase the superior performance of the proposed scheme.
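The first stage described above can be sketched in a few lines. This is a minimal, illustrative Rule 30 keystream generator; the paper's exact lattice width, seeding, and bit-sampling rule are not given in the abstract, so those choices here are assumptions.

```python
# Minimal sketch of a Rule 30 cellular-automaton bitstream generator.
# Lattice width, seed, and center-cell sampling are illustrative assumptions.

def rule30_bits(seed_bits, n_bits):
    """Yield n_bits by iterating Rule 30 and sampling the center cell."""
    state = list(seed_bits)
    width = len(state)
    out = []
    for _ in range(n_bits):
        out.append(state[width // 2])  # sample the center cell each step
        # Rule 30 update: new = left XOR (center OR right), wrap-around edges
        state = [state[i - 1] ^ (state[i] | state[(i + 1) % width])
                 for i in range(width)]
    return out

# Example: 101-cell lattice seeded with a single 1 in the center
seed = [0] * 101
seed[50] = 1
keystream = rule30_bits(seed, 64)
```

The center column of Rule 30 is the classic source of pseudo-randomness in cellular automata; a real scheme would derive the seed from the secret key rather than a fixed single-cell pattern.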
A New Chaotic-Based RGB Image Encryption Technique Using a Nonlinear Rotational 16 × 16 DNA Playfair Matrix
Due to great interest in the secure storage and transmission of color images, the necessity for an efficient and robust RGB image encryption technique has grown. RGB image encryption ensures the confidentiality of color images during storage and transmission. In the literature, a large number of chaotic-based image encryption techniques have been proposed, but there is still a need for a robust, efficient and secure technique against different kinds of attacks. In this paper, a novel RGB image encryption technique is proposed for encrypting individual pixels of RGB images using chaotic systems and 16 rounds of DNA encoding, transpositions and substitutions. First, round keys are generated randomly using a logistic chaotic function. Then, these keys are used across different rounds to alter individual pixels using a nonlinear, randomly generated 16 × 16 DNA Playfair matrix. Experimental results show the robustness of the proposed technique against most attacks while reducing the consumed time for encryption and decryption. The quantitative metrics show the ability of the proposed technique to maintain reference evaluation values while resisting statistical and differential attacks. The obtained horizontal, vertical and diagonal correlations are all less than 0.01, and the NPCR and UACI are larger than 0.99 and 0.33, respectively. Finally, a NIST analysis is presented to evaluate the randomness of the proposed technique.
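The NPCR and UACI figures quoted in these abstracts follow standard definitions: NPCR is the fraction of pixel positions that differ between two cipher images, and UACI is the mean absolute intensity difference normalized by the gray-level range. A small sketch (the test images here are stand-ins, not the authors' data):

```python
# Standard NPCR/UACI metrics for comparing two cipher images of equal size.
# Inputs are nested lists of 8-bit pixel values (rows of ints in 0..255).

def npcr_uaci(c1, c2):
    """NPCR: fraction of differing pixel positions.
       UACI: mean |difference| normalized by the 255 gray-level range."""
    pixels = [(a, b) for row1, row2 in zip(c1, c2) for a, b in zip(row1, row2)]
    n = len(pixels)
    npcr = sum(a != b for a, b in pixels) / n
    uaci = sum(abs(a - b) for a, b in pixels) / (255.0 * n)
    return npcr, uaci

# Toy 2x2 example: two of the four pixels differ
c1 = [[0, 255], [10, 10]]
c2 = [[0, 0], [10, 20]]
npcr, uaci = npcr_uaci(c1, c2)
```

For a well-designed cipher, flipping one bit of the plain image should drive NPCR above ~0.996 and UACI toward ~0.334, which is why both papers report values above 0.99 and 0.33.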
A new multiple image encryption algorithm using hyperchaotic systems, SVD, and modified RC5
Secure image encryption is critical for protecting sensitive data such as satellite imagery, which is pivotal for national security and environmental monitoring. However, existing encryption methods often face challenges such as vulnerability to traffic analysis, limited randomness, and insufficient resistance to attacks. To address these gaps, this article proposes a novel multiple image encryption (MIE) algorithm that integrates hyperchaotic systems, Singular Value Decomposition (SVD), counter mode RC5, a chaos-based Hill cipher, and a custom S-box generated via a modified Blum Blum Shub (BBS) algorithm. The proposed MIE algorithm begins by merging multiple satellite images into an augmented image, enhancing security against traffic analysis. The encryption process splits the colored image into RGB channels, with each channel undergoing four stages: additive confusion using a memristor hyperchaotic key transformed by SVD, RC5 encryption in counter mode with XOR operations, Hill cipher encryption using a 6D hyperchaotic key and invertible matrices mod 256, and substitution with a custom S-box generated by a modified BBS. Experimental results demonstrate the proposed algorithm’s superior encryption efficiency, enhanced randomness, and strong resistance to cryptanalytic, differential, and brute-force attacks. These findings highlight the MIE algorithm’s potential for securing satellite imagery in real-time applications, ensuring confidentiality and robustness against modern security threats.
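One detail worth unpacking from the pipeline above is the Hill cipher stage with "invertible matrices mod 256": a key matrix is invertible modulo 256 exactly when its determinant is odd. A toy 2x2 sketch of that modular algebra (the paper uses 6D hyperchaotic keys; the matrix and pixel values here are illustrative):

```python
# Hill-cipher arithmetic mod 256: invert a 2x2 key matrix (det must be odd)
# and show that encryption followed by decryption round-trips a pixel pair.

def hill_inverse_2x2(k, m=256):
    a, b, c, d = k[0][0], k[0][1], k[1][0], k[1][1]
    det = (a * d - b * c) % m
    det_inv = pow(det, -1, m)  # modular inverse exists iff det is odd for m=256
    return [[(d * det_inv) % m, (-b * det_inv) % m],
            [(-c * det_inv) % m, (a * det_inv) % m]]

def hill_apply(k, block, m=256):
    """Multiply a 2-element pixel block by the key matrix, mod m."""
    return [(k[0][0] * block[0] + k[0][1] * block[1]) % m,
            (k[1][0] * block[0] + k[1][1] * block[1]) % m]

key = [[3, 2], [5, 7]]            # det = 11 (odd), so invertible mod 256
inv = hill_inverse_2x2(key)
cipher = hill_apply(key, [42, 200])
plain = hill_apply(inv, cipher)   # round-trips to [42, 200]
```

The three-argument `pow` (Python 3.8+) computes the modular inverse directly; choosing chaotic key matrices with odd determinant is what keeps this stage decryptable.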
Compositional Analysis of Biomass Reference Materials: Results from an Interlaboratory Study
Biomass compositional methods are used to compare different lignocellulosic feedstocks, to measure component balances around unit operations and to determine process yields and therefore the economic viability of biomass-to-biofuel processes. Four biomass reference materials (RMs NIST 8491–8494) were prepared and characterized via an interlaboratory comparison exercise in the early 1990s to evaluate biomass summative compositional methods, analysts, and laboratories. Having common, uniform, and stable biomass reference materials gives the opportunity to assess compositional data compared to other analysts, to other labs, and to a known compositional value. The expiration date for the original characterization of these RMs was reached, and an effort was initiated to assess their stability and recharacterize the reference values for the remaining material using more current methods of analysis. We sent samples of the four biomass RMs to 11 academic, industrial, and government laboratories, familiar with sulfuric acid compositional methods, for recharacterization of the component reference values. In this work, we have used an expanded suite of analytical methods that are more appropriate for herbaceous feedstocks to recharacterize the RMs’ compositions. We report the median values and the expanded uncertainty values for the four RMs on a dry-mass, whole-biomass basis. The original characterization data have been recalculated using median statistics to facilitate comparisons with this data. We found improved total component closures for three out of the four RMs compared to the original characterization, and the total component closures were near 100 %, which suggests that most components were accurately measured and little double counting occurred.
The major components were not statistically different in the recharacterization, which suggests that the biomass materials are stable during storage, and additional components, not seen in the original characterization, were quantified here.
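A consensus value reported as a median with an expanded uncertainty, as above, can be sketched as follows. This is only an illustration of the reporting convention (coverage factor k = 2 for roughly 95 % confidence); the lab values below are hypothetical glucan medians in % dry mass, not the study's data, and the study's exact uncertainty budget may differ.

```python
import math
import statistics

# Hypothetical per-laboratory median glucan contents (% dry mass)
lab_medians = [34.8, 35.1, 34.6, 35.4, 34.9, 35.0,
               34.7, 35.2, 34.9, 35.3, 34.8]

# Consensus reference value: median across labs
ref_value = statistics.median(lab_medians)

# Standard uncertainty of the consensus (simple standard error here),
# then expanded uncertainty U = k*u with coverage factor k = 2
u = statistics.stdev(lab_medians) / math.sqrt(len(lab_medians))
U = 2 * u
```

Median statistics are preferred in interlaboratory studies because they are robust to a single outlying laboratory, which is why the recalculated original data use them as well.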
Integrating cost–benefit analysis into the NIST Cybersecurity Framework via the Gordon–Loeb Model
The National Institute of Standards and Technology (NIST) Cybersecurity Framework has rapidly become a widely accepted approach to facilitating cybersecurity risk management within organizations. An insightful aspect of the NIST Cybersecurity Framework is its explicit recognition that the activities associated with managing cybersecurity risk are organization specific. The NIST Framework also recognizes that organizations should evaluate their cybersecurity risk management on a cost–benefit basis. The NIST Framework, however, does not provide guidance on how to carry out such a cost–benefit analysis. This article provides an approach for integrating cost–benefit analysis into the NIST Cybersecurity Framework. The Gordon–Loeb (GL) Model for cybersecurity investments is proposed as a basis for deriving a cost-effective level of spending on cybersecurity activities and for selecting the appropriate NIST Implementation Tier level. The analysis shows that the GL Model provides a logical approach to use when considering the cost–benefit aspects of cybersecurity investments during an organization’s process of selecting the most appropriate NIST Implementation Tier level. In addition, the cost–benefit approach provided in this article helps to identify conditions under which there is an incentive to move to a higher NIST Implementation Tier.
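The GL Model's cost–benefit logic can be made concrete with a small numerical sketch. It uses one of the model's class-I breach-probability functions, S(z, v) = v / (alpha*z + 1)**beta; all parameter values below are illustrative, not taken from the article.

```python
import math

# Gordon–Loeb model: find the security investment z that maximizes the
# expected net benefit, for baseline breach probability v and loss L.

def expected_net_benefit(z, v, L, alpha=1e-5, beta=1.0):
    """ENBIS(z) = [v - S(z, v)] * L - z, with the class-I form
    S(z, v) = v / (alpha*z + 1)**beta for the breach probability."""
    s = v / (alpha * z + 1) ** beta
    return (v - s) * L - z

# Coarse grid search for the cost-effective investment level
v, L = 0.6, 1_000_000
best_z = max(range(0, 200_001, 100),
             key=lambda z: expected_net_benefit(z, v, L))

# The well-known GL bound: optimal investment never exceeds vL/e (~37% of vL)
gl_bound = v * L / math.e
```

In the Framework context, comparing ENBIS across the spending levels implied by successive Implementation Tiers is the kind of calculation the article proposes: move to a higher Tier only while the marginal benefit exceeds the marginal cost.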
An efficient block-level image encryption scheme based on multi-chaotic maps with DNA encoding
This paper presents an efficient image encryption scheme based on permutation followed by diffusion, where both phases use the 2D Sine Logistic Modulation Map (2D-SLMM) with different initial values. In addition, diffusion uses another map, a 1D logistic chaotic map (LCM). The initial values of these chaotic maps are obtained from an external key of 64 bytes along with a 32-byte hash value of the corresponding plain-image to incorporate plain-text sensitivity. Initially, confusion of the plain-image is implemented by applying row-level and column-level permutations. Then, this permuted image is used for subsequent diffusion, applied at block level with a block size of 64 bytes. This diffusion process is accomplished by overlaying with a chaotic matrix derived from the LCM, followed by substitution of those overlaid bytes by DNA encoding along with the 2D-SLMM, to attain an encrypted image with an entropy of nearly 8. Furthermore, all the chaotic values generated from the aforementioned maps are highly sensitive to both the key and the plain-image. This scheme is thoroughly verified on plain-images of different sizes with modern statistical analyses to prove its robustness. Eventually, comparison with other schemes reinforces its competence and suitability for implementation in real-time systems.
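The key-stream stage can be sketched assuming the commonly cited form of the 2D Sine Logistic Modulation Map; the paper's exact parameters, initial values, and byte-extraction rule are not given in the abstract, so those below are placeholders.

```python
import math

# 2D-SLMM in its common form (alpha in (0,1], beta = 3 for chaotic behavior):
#   x' = alpha * (sin(pi*y) + beta) * x * (1 - x)
#   y' = alpha * (sin(pi*x') + beta) * y * (1 - y)

def slmm_sequence(x, y, n, alpha=1.0, beta=3.0):
    """Iterate the 2D-SLMM and return n (x, y) pairs, each in [0, 1]."""
    out = []
    for _ in range(n):
        x = alpha * (math.sin(math.pi * y) + beta) * x * (1 - x)
        y = alpha * (math.sin(math.pi * x) + beta) * y * (1 - y)
        out.append((x, y))
    return out

# Derive chaotic key bytes, e.g. for the diffusion overlay (extraction
# rule is an illustrative choice, not the paper's)
pairs = slmm_sequence(0.3, 0.7, 1000)
key_bytes = [int(x * 1e10) % 256 for x, _ in pairs]
```

In the scheme described above, the initial x and y would be derived from the 64-byte external key and the 32-byte plain-image hash, which is what gives the key stream its plain-text sensitivity.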
Enhancing Secure Software Development with AZTRM-D: An AI-Integrated Approach Combining DevSecOps, Risk Management, and Zero Trust
This paper introduces the Automated Zero Trust Risk Management with DevSecOps Integration (AZTRM-D) framework, a novel approach that embeds security throughout the entire Secure Software and System Development Life Cycle (S-SDLC). AZTRM-D strategically unifies three established methodologies: DevSecOps practices, the NIST Risk Management Framework (RMF), and the Zero Trust (ZT) model. It then significantly augments their capabilities through the pervasive application of Artificial Intelligence (AI). This integration shifts traditional, often fragmented, security paradigms towards a proactive, automated, and continuously adaptive security posture. AI serves as the foundational enabler, providing real-time threat intelligence, automating critical security controls, facilitating continuous vulnerability detection, and enabling dynamic policy enforcement from initial code development through operational deployment. By automating key security functions and providing continuous oversight, AZTRM-D enhances risk mitigation, reduces vulnerabilities, streamlines compliance, and significantly strengthens the overall security posture of software systems, thereby addressing the complexities of modern cyber threats and accelerating the delivery of secure software.
Changes in the Chemical Composition of Polyethylene Terephthalate under UV Radiation in Various Environmental Conditions
Polyethylene terephthalate has been widely used in the packaging industry. Degraded PET micro(nano)plastics could pose public health concerns following release into various environments. This study focuses on PET degradation under ultraviolet radiation using the NIST SPHERE facility at the National Institute of Standards and Technology in saturated humidity (i.e., ≥95% relative humidity) and dry conditions (i.e., ≤5% relative humidity) with varying temperatures (30 °C, 40 °C, and 50 °C) for up to 20 days. ATR-FTIR was used to characterize the chemical composition change of degraded PET as a function of UV exposure time. The results showed that the cleavage of the ester bond at peak 1713 cm−1 and the formation of the carboxylic acid at peak 1685 cm−1 were significantly influenced by UV radiation. Furthermore, the formation of carboxylic acid was considerably higher at saturated humidity and 50 °C conditions compared with dry conditions. The ester bond cleavage was also more pronounced in saturated humidity conditions. The novelty of this study is to provide insights into the chemical degradation of PET under environmental conditions, including UV radiation, humidity, and temperature. The results can be used to develop strategies to reduce the environmental impact of plastic pollution.
FPGA based implementation of a perturbed Chen oscillator for secure embedded cryptosystems
This paper introduces an enhancement to the Chen chaotic system by incorporating a constant perturbation term d into one of the state variables, aiming to improve the performance of pseudo-random number generators (PRNGs). The perturbation significantly enhances the system's chaotic properties, resulting in superior randomness and increased security. An FPGA-based realization of a perturbed Chen oscillator (PCO)-derived PRNG is presented, tailored for embedded cryptosystems and implemented on a Nexys 4 FPGA card featuring the Xilinx Artix-7 XC7A100T-1CSG324C integrated chip. The Xilinx System Generator (XSG) tool is utilized to generate a digital version of the new oscillator, minimizing resource utilization. Experimental results demonstrate that the PCO-generated data successfully passes the NIST and TestU01 test suites. Additionally, statistical tests with key sensitivity are performed, validating the suitability of the designed PRNG for cryptographic applications. This establishes the PCO as a straightforward and efficient tool for multimedia security.
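A software sketch of the perturbed Chen system helps make the construction concrete. The classical Chen equations with typical parameters (a = 35, b = 3, c = 28) are used here; which state equation receives the constant perturbation d, and its value, are assumptions for illustration, since the abstract does not specify them.

```python
# Perturbed Chen system integrated with a fixed-step RK4 scheme.
# Classical Chen: x' = a(y - x); y' = (c - a)x - xz + cy; z' = xy - bz.

def chen_deriv(s, a=35.0, b=3.0, c=28.0, d=0.5):
    x, y, z = s
    return (a * (y - x) + d,              # assumed: d perturbs the x-equation
            (c - a) * x - x * z + c * y,
            x * y - b * z)

def rk4_step(s, h=0.001):
    """One classical Runge-Kutta 4 step of size h."""
    k1 = chen_deriv(s)
    k2 = chen_deriv([s[i] + 0.5 * h * k1[i] for i in range(3)])
    k3 = chen_deriv([s[i] + 0.5 * h * k2[i] for i in range(3)])
    k4 = chen_deriv([s[i] + h * k3[i] for i in range(3)])
    return [s[i] + h / 6 * (k1[i] + 2 * k2[i] + 2 * k3[i] + k4[i])
            for i in range(3)]

# Generate a trajectory; a PRNG would then quantize it to bits
state = [0.1, 0.2, 0.3]
traj = []
for _ in range(5000):
    state = rk4_step(state)
    traj.append(state)
```

On the FPGA, the same update is realized in fixed-point arithmetic via XSG blocks; the floating-point sketch above only shows the dynamics being digitized.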
Critical evaluation of the NIST retention index database reliability with specific examples
The NIST gas chromatographic retention index database is widely used in gas chromatography–mass spectrometry analysis. For many compounds, the NIST database contains many entries that are presumably obtained independently of each other. We showed with specific examples that there are cases in the NIST database where several entries exist for the same compound and all of them are equally erroneous (an error of more than 100 index units). In particular, we demonstrated that all retention index values for such an important compound as imidazole on non-polar stationary phases in the NIST database are erroneous. In addition to imidazole, a similar situation is observed for four more nitrogen-containing heterocyclic compounds. To be certain, measurements were performed under several conditions, using various temperature programs and two specimens of columns. The structures were confirmed using nuclear magnetic resonance and mass spectrometry. It was shown with specific examples that many values are not reliable: either the data were obtained using standard samples of undescribed origin without confirmation (without even using mass spectrometry) or, in some cases, standard samples were not used at all and the retention index was obtained for a mixture component identified by a mass spectral library search. Some “independent” values are not in fact independent: they are repeated publications of the same data (secondary sources), or simply several values taken from the same source. Finally, we analyze and offer hypotheses about how several equally incorrect retention index values could have appeared in the NIST database.
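For readers unfamiliar with the quantity being disputed: under temperature-programmed conditions, a retention index is computed by linear interpolation between the retention times of bracketing n-alkane standards (the van den Dool and Kratz equation). A sketch with made-up retention times:

```python
# Linear (temperature-programmed) retention index by linear interpolation
# between bracketing n-alkane standards, per van den Dool and Kratz.

def linear_retention_index(t_x, alkane_times):
    """alkane_times: {carbon_number: retention_time} for n-alkane standards.
    Returns RI = 100*n + 100*(t_x - t_n) / (t_{n+1} - t_n)."""
    carbons = sorted(alkane_times)
    for n, n_next in zip(carbons, carbons[1:]):
        t_n, t_next = alkane_times[n], alkane_times[n_next]
        if t_n <= t_x <= t_next:
            return 100 * n + 100 * (t_x - t_n) / (t_next - t_n)
    raise ValueError("t_x outside the bracketing alkane window")

# Hypothetical run: C10 elutes at 5.0 min, C11 at 6.5 min, analyte at 5.75 min
ri = linear_retention_index(5.75, {10: 5.0, 11: 6.5})  # -> 1050.0
```

An error of "more than 100 units", as reported for imidazole, thus corresponds to misplacing a compound by more than a full alkane carbon number, which is far outside normal interlaboratory scatter.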