80 result(s) for "Soft thresholding"
OPERA net: Otsu driven performance enhanced image restoration algorithm
Digital images have progressed significantly in many areas, but various types of noise still exist in real-world images, such as Gaussian noise, Poisson noise, and salt-and-pepper noise. Image denoising plays an important role in removing noise while preserving the important features of an image, yet many existing techniques suffer from computational complexity and over-smoothing. To address these limitations, this paper introduces a hybrid image denoising technique that combines the strengths of the wavelet transform and Non-Local Means (NLM) filtering with enhanced Otsu thresholding. In the proposed work, the noisy image is first denoised by wavelet-based Otsu thresholding, and NLM is then applied to improve edge preservation while eliminating the remaining noise. The proposed hybrid method is evaluated on the Kodak24 dataset and compared with existing techniques using performance metrics such as Peak Signal-to-Noise Ratio (PSNR), Structural Similarity Index Measure (SSIM), and Root Mean Squared Error (RMSE). Experimental results show that the proposed technique outperforms existing approaches in terms of PSNR (34.86), SSIM (0.93), and RMSE (4.61), demonstrating its effectiveness in denoising while preserving structural image quality. Additionally, the FOM and VIF scores, at 0.99 and 0.59 respectively, were the best among all assessed denoising methods, underscoring the strength of the hybrid strategy.
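For readers unfamiliar with the building blocks named in this abstract, the sketch below strings together wavelet soft thresholding with an Otsu-selected threshold and an NLM pass, using PyWavelets and scikit-image. It is an illustrative reconstruction, not the authors' code: the `db4` wavelet, the 3-level decomposition, applying Otsu's method to the pooled absolute detail coefficients, and the NLM parameters are all assumptions.

```python
import numpy as np
import pywt
from skimage import data, img_as_float
from skimage.filters import threshold_otsu
from skimage.restoration import denoise_nl_means, estimate_sigma
from skimage.util import random_noise

noisy = random_noise(img_as_float(data.camera()), var=0.01)  # stand-in noisy image

# 1) Wavelet decomposition and soft thresholding of the detail coefficients.
coeffs = pywt.wavedec2(noisy, wavelet="db4", level=3)
approx, details = coeffs[0], coeffs[1:]
pooled = np.concatenate([np.abs(band).ravel() for level in details for band in level])
t = threshold_otsu(pooled)  # Otsu-selected threshold over |detail coefficients| (assumed usage)
shrunk = [approx] + [
    tuple(pywt.threshold(band, t, mode="soft") for band in level) for level in details
]
wavelet_denoised = pywt.waverec2(shrunk, wavelet="db4")[: noisy.shape[0], : noisy.shape[1]]

# 2) Non-Local Means pass to clean residual noise while preserving edges.
sigma = float(np.mean(estimate_sigma(wavelet_denoised)))
denoised = denoise_nl_means(wavelet_denoised, h=1.15 * sigma, sigma=sigma,
                            patch_size=5, patch_distance=6, fast_mode=True)
```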
RETRACTED ARTICLE: Forest pest monitoring and early warning using UAV remote sensing and computer vision techniques
Unmanned aerial vehicle (UAV) remote sensing has revolutionized forest pest monitoring and early warning systems. However, the susceptibility of UAV-based object detection models to adversarial attacks raises concerns about their reliability and robustness in real-world deployments. To address this challenge, we propose SC-RTDETR, a novel framework for secure and robust object detection in forest pest monitoring using UAV imagery. SC-RTDETR integrates a soft-thresholding adaptive filtering module and a cascaded group attention mechanism into the Real-time Detection Transformer (RTDETR) architecture, significantly enhancing its resilience against adversarial perturbations. Extensive experiments on a real-world pine wilt disease dataset demonstrate the superior performance of SC-RTDETR, with an improvement of 7.1% in mean Average Precision (mAP) and 6.5% in F1-score under strong adversarial attack conditions compared to state-of-the-art methods. The ablation studies and visualizations provide insights into the effectiveness of the proposed components, validating their contributions to the overall robustness and performance of SC-RTDETR. Our framework offers a promising solution for accurate and reliable forest pest monitoring in non-secure environments.
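The soft-thresholding adaptive filtering module named here builds on the standard soft-thresholding (shrinkage) operator, which zeroes small-magnitude values and shrinks the rest toward zero. For reference, the operator itself (not the SC-RTDETR module, whose thresholds are set adaptively inside the network) can be written as:

```python
import numpy as np

def soft_threshold(x, tau):
    """Elementwise soft thresholding: values with |x| <= tau become 0,
    larger values are shrunk toward zero by tau."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

soft_threshold(np.array([-2.0, -0.3, 0.1, 1.5]), tau=0.5)  # -> [-1.5, 0., 0., 1.]
```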
Application of iterative soft thresholding for fast reconstruction of NMR data non-uniformly sampled with multidimensional Poisson Gap scheduling
The fast Fourier transform has been the gold standard for transforming data from the time to the frequency domain in many spectroscopic methods, including NMR. While reliable, its drawback is that it requires a grid of uniformly sampled data points. Sampling all indirect dimensions of multidimensional experiments uniformly demands very long measuring times and still does not allow reaching the optimal evolution times that would match the resolving power of modern high-field instruments. Thus, many alternative sampling and transformation schemes have been proposed. Their common challenges are the suppression of artifacts due to the non-uniformity of the sampling schedules, the preservation of the relative signal amplitudes, and the computing time needed for spectrum reconstruction. Here we present a fast implementation of the Iterative Soft Thresholding approach (istHMS) that can reconstruct high-resolution non-uniformly sampled NMR data up to four dimensions within a few hours and makes routine reconstruction of high-resolution NUS 3D and 4D spectra convenient. We include a graphical user interface for generating sampling schedules with the Poisson-Gap method and an estimation of optimal evolution times based on molecular properties. The performance of the approach is demonstrated with the reconstruction of non-uniformly sampled medium- and high-resolution 3D and 4D protein spectra acquired with sampling densities as low as 0.8%. The method presented here facilitates acquisition, reconstruction and use of multidimensional NMR spectra at otherwise unreachable spectral resolution in indirect dimensions.
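The iterative soft thresholding idea behind istHMS can be illustrated, in a much-reduced form, on a 1-D non-uniformly sampled signal: alternate between enforcing sparsity of the spectrum by soft thresholding and enforcing consistency with the measured time-domain points. This is a generic IST sketch under assumed parameters (random rather than Poisson-gap schedule, a linearly decreasing threshold, 200 iterations), not the istHMS implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 512
t = np.arange(n)
# Synthetic FID: two decaying sinusoids standing in for an indirect NMR dimension.
fid = (np.exp(2j * np.pi * 0.11 * t) + 0.6 * np.exp(2j * np.pi * 0.23 * t)) * np.exp(-t / 150)

mask = np.zeros(n, dtype=bool)
mask[rng.choice(n, size=n // 8, replace=False)] = True   # ~12% sampling density
measured = fid * mask

def soft(x, thr):
    """Complex soft thresholding: shrink magnitudes by thr, keep phases."""
    mag = np.abs(x)
    return x * np.maximum(1.0 - thr / np.maximum(mag, 1e-12), 0.0)

x = measured.copy()
for k in range(200):
    spec = np.fft.fft(x)
    thr = 0.9 * np.abs(spec).max() * (1 - k / 200)  # assumed decreasing threshold schedule
    x = np.fft.ifft(soft(spec, thr))
    x[mask] = measured[mask]                        # data-consistency step on sampled points

reconstructed_spectrum = np.fft.fft(x)
```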
A convex pseudolikelihood framework for high dimensional partial correlation estimation with convergence guarantees
Sparse high dimensional graphical model selection is a topic of much interest in modern-day statistics. A popular approach is to apply l₁‐penalties to either parametric likelihoods, or regularized regression/pseudolikelihoods, with the latter having the distinct advantage that they do not explicitly assume Gaussianity. As none of the popular methods proposed for solving pseudolikelihood‐based objective functions have provable convergence guarantees, it is not clear whether corresponding estimators exist or are even computable, or if they actually yield correct partial correlation graphs. We propose a new pseudolikelihood‐based graphical model selection method that aims to overcome some of the shortcomings of current methods, but at the same time retain all their respective strengths. In particular, we introduce a novel framework that leads to a convex formulation of the partial covariance regression graph problem, resulting in an objective function composed of quadratic forms. The objective is then optimized via a co‐ordinatewise approach. The specific functional form of the objective function facilitates rigorous convergence analysis leading to convergence guarantees; an important property that cannot be established by using standard results, when the dimension is larger than the sample size, as is often the case in high dimensional applications. These convergence guarantees ensure that estimators are well defined under very general conditions and are always computable. In addition, the approach yields estimators that have good large sample properties and also respect symmetry. Furthermore, applications to simulated and real data, timing comparisons, and numerical convergence are demonstrated. We also present a novel unifying framework that places all graphical pseudolikelihood methods as special cases of a more general formulation, leading to important insights.
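The coordinate-wise optimization described above relies, like most l₁-penalized regression solvers, on a scalar soft-thresholding update. As a generic illustration only (a lasso-style coordinate descent, not the paper's objective or its exact updates), each coordinate is refit against the current residual and then shrunk:

```python
import numpy as np

def lasso_coordinate_descent(X, y, lam, n_iter=100):
    """Generic l1-penalized regression via coordinate descent with
    soft-thresholding updates (illustrative, not the paper's algorithm)."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual with coordinate j removed.
            r_j = y - X @ beta + X[:, j] * beta[j]
            rho = X[:, j] @ r_j
            # Soft-thresholding update for coordinate j.
            beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return beta
```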
D-CCA: A Decomposition-Based Canonical Correlation Analysis for High-Dimensional Datasets
A typical approach to the joint analysis of two high-dimensional datasets is to decompose each data matrix into three parts: a low-rank common matrix that captures the shared information across datasets, a low-rank distinctive matrix that characterizes the individual information within a single dataset, and an additive noise matrix. Existing decomposition methods often focus on the orthogonality between the common and distinctive matrices, but inadequately consider the more necessary orthogonal relationship between the two distinctive matrices. The latter guarantees that no more shared information is extractable from the distinctive matrices. We propose decomposition-based canonical correlation analysis (D-CCA), a novel decomposition method that defines the common and distinctive matrices from the space of random variables rather than the conventionally used Euclidean space, with a careful construction of the orthogonal relationship between distinctive matrices. D-CCA represents a natural generalization of the traditional canonical correlation analysis. The proposed estimators of common and distinctive matrices are shown to be consistent and have reasonably better performance than some state-of-the-art methods in both simulated data and the real data analysis of breast cancer data obtained from The Cancer Genome Atlas. Supplementary materials for this article are available online.
New Fault Diagnosis Method for Rolling Bearings Based on Improved Residual Shrinkage Network Combined with Transfer Learning
Fault diagnosis of rolling bearings suffers from a lack of fault data, and under these conditions the diagnosis rate of traditional convolutional neural networks drops. In this paper, an adaptive residual shrinkage network model is combined with transfer learning to address these problems. The model is trained on the Case Western Reserve dataset, and the trained model is then transferred to a small-sample dataset with a scaled-down sample size and to the Jiangnan University bearing dataset for the experiments. The experimental results show that the proposed method can learn efficiently from small-sample datasets, improving the accuracy of bearing fault diagnosis under variable loads and variable speeds. An adaptive parametric rectified linear unit is used to adapt the nonlinear transformation. Because noise is inevitable when rolling bearings are in operation, soft thresholding and an attention mechanism are added to the model so that it can effectively process vibration signals with strong noise. Real noise is simulated by adding Gaussian white noise in the transfer-task experiments on the small-sample datasets, and the results show that the algorithm is noise resistant.
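The combination of soft thresholding with an attention mechanism described here follows the general pattern of residual shrinkage networks: a small attention sub-network estimates a per-channel threshold from the feature magnitudes, and the features are soft-thresholded before the residual sum. The sketch below is only in that spirit; the layer sizes, 1-D layout, and attention design are assumptions, not the paper's architecture.

```python
import torch
import torch.nn as nn

class ShrinkageBlock(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm1d(channels),
            nn.ReLU(),
            nn.Conv1d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm1d(channels),
        )
        # Attention sub-network: maps channel-wise mean |features| to a
        # coefficient in (0, 1) that scales the soft threshold per channel.
        self.attn = nn.Sequential(
            nn.Linear(channels, channels),
            nn.ReLU(),
            nn.Linear(channels, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):                         # x: (batch, channels, length)
        out = self.conv(x)
        abs_mean = out.abs().mean(dim=2)           # (batch, channels)
        tau = (abs_mean * self.attn(abs_mean)).unsqueeze(2)   # per-channel thresholds
        shrunk = torch.sign(out) * torch.clamp(out.abs() - tau, min=0.0)
        return torch.relu(shrunk + x)              # residual connection

block = ShrinkageBlock(16)
y = block(torch.randn(4, 16, 128))                 # e.g. 4 vibration-signal segments
```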
Positive-Definite ℓ1-Penalized Estimation of Large Covariance Matrices
The thresholding covariance estimator has nice asymptotic properties for estimating sparse large covariance matrices, but it often has negative eigenvalues when used in real data analysis. To fix this drawback of thresholding estimation, we develop a positive-definite ℓ 1 -penalized covariance estimator for estimating sparse large covariance matrices. We derive an efficient alternating direction method to solve the challenging optimization problem and establish its convergence properties. Under weak regularity conditions, nonasymptotic statistical theory is also established for the proposed estimator. The competitive finite-sample performance of our proposal is demonstrated by both simulation and real applications.
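The "thresholding covariance estimator" this abstract starts from applies (here, soft) thresholding entrywise to the off-diagonal of the sample covariance. A minimal sketch of that baseline is below; the paper's contribution, a positive-definite variant solved by an alternating direction method, is deliberately not attempted here.

```python
import numpy as np

def soft_threshold_covariance(X, lam):
    """Soft-threshold the off-diagonal entries of the sample covariance.
    Simple baseline estimator; the paper's positive-definite l1-penalized
    estimator additionally enforces an eigenvalue lower bound via ADMM."""
    S = np.cov(X, rowvar=False)
    T = np.sign(S) * np.maximum(np.abs(S) - lam, 0.0)
    np.fill_diagonal(T, np.diag(S))   # keep the variances (diagonal) untouched
    return T

X = np.random.default_rng(1).normal(size=(50, 200))   # n = 50 samples, p = 200 variables
S_hat = soft_threshold_covariance(X, lam=0.2)
eigenvalues = np.linalg.eigvalsh(S_hat)               # may include negatives, the drawback the paper fixes
```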
Negative edges and soft thresholding in complex network analysis of resting state functional connectivity data
Complex network analyses of functional connectivity have consistently revealed non-random (modular, small-world, scale-free-like) behavior of hard-thresholded networks constructed from the right tail of the similarity histogram. In the present study we determined network properties resulting from edges constrained to specific ranges across the full correlation histogram, in particular the left (negative-most) tail, and their dependence on the confound signal removal strategy employed. In the absence of global signal correction, left-tail networks comprised predominantly long-range connections associated with weak correlations and were characterized by substantially reduced modularity and clustering, negative assortativity and γ < 1. Deconvolution of specific confound signals (white matter, CSF and motion) resulted in the most robust within-subject reproducibility of global network parameters (ICCs ~0.5). Global signal removal altered the network topology in the left tail, with the clustering coefficient and assortativity converging to zero. Networks constructed from the absolute value of the correlation coefficient were thus compromised following global signal removal, since the different right-tail and left-tail topologies were mixed. These findings informed the construction of soft-thresholded networks, replacing the hard thresholding or binarization operation with a continuous mapping of all correlation values to edge weights, suppressing rather than removing weaker connections and avoiding issues related to network fragmentation. A power-law adjacency function with β = 12 yielded modular networks whose parameters agreed well with corresponding hard-thresholded values, were reproducible in repeated sessions across many months, and evidenced small-world-like and scale-free-like properties.
Highlights:
• Networks based on the left tail of the correlation histogram have distinct topology.
• Global signal removal substantially alters the properties of left-tail networks.
• Soft-thresholding (retaining all edges) retains the modular structure of networks.
• Hard- and soft-thresholded network parameters were reproducible over 5–16 months.
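The soft-thresholding step described here replaces binarization of the correlation matrix with a continuous mapping of correlations to edge weights; β = 12 is the power reported in the abstract. A minimal sketch follows, with the confound-removal pipeline and network metrics left out; applying the power to |r| (rather than to some signed transform of r) and the data dimensions are assumptions.

```python
import numpy as np

def soft_thresholded_adjacency(timeseries, beta=12):
    """Map region-by-time data to a weighted adjacency matrix by raising the
    absolute correlation to the power beta: weak edges are suppressed rather
    than removed, avoiding network fragmentation."""
    r = np.corrcoef(timeseries)       # regions x regions correlation matrix
    a = np.abs(r) ** beta             # power-law adjacency function
    np.fill_diagonal(a, 0.0)          # no self-connections
    return a

# Synthetic data standing in for regional BOLD time series.
rng = np.random.default_rng(0)
ts = rng.normal(size=(90, 200))       # 90 regions, 200 time points (assumed sizes)
A = soft_thresholded_adjacency(ts, beta=12)
strength = A.sum(axis=1)              # weighted degree (node strength) per region
```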