Catalogue Search | MBRL
53 result(s) for "Champod, Christophe"
Functionalised silicon oxide nanoparticles for fingermark detection
by Moret, Sébastien; Bécue, Andy; Champod, Christophe
in Cyanoacrylate, Cyanoacrylates, Dermatoglyphics
2016
• Synthesis of luminescent and functionalised silicon oxide nanoparticles.
• These nanoparticles regroup all desired properties for fingermark detection.
• The technique is effective on non-porous substrates.
• The technique performed similarly compared to one-step luminescent cyanoacrylate.
• Silicon oxide nanoparticles are less affected by donor inter-variability.
Over the past decade, the use of nanotechnology for fingermark detection has been attracting a lot of attention. A substantial number of nanoparticle types has thus been studied and applied with varying success. However, despite all efforts, few publications present clear supporting evidence of their superiority over standard and commonly used techniques. This paper focuses on a rarely studied type of nanoparticles that regroups all desired properties for effective fingermark detection: silicon oxide. These nanoparticles offer optical and surface properties that can be tuned to provide optimal detection. This study explores their potential as a new method for fingermark detection.
Detection conditions, outer functionalisations and optical properties were optimised, and a first evaluation of the technique is presented. Dye-doped silicon oxide nanoparticles were assessed against a one-step luminescent cyanoacrylate. Both techniques were compared on natural fingermarks from three donors collected on four different non-porous substrates. On average, the two techniques performed similarly, but silicon oxide detected marks with better homogeneity and was less affected by donor inter-variability. The technique remains to be further optimised, yet silicon oxide nanoparticles already show great promise for effective fingermark detection.
Journal Article
Use of quantum dots in aqueous solution to detect blood fingermarks on non-porous surfaces
2009
A new and original reagent based on the use of highly fluorescent cadmium telluride (CdTe) quantum dots (QDs) in aqueous solution is proposed to detect weak fingermarks in blood on non-porous surfaces. To assess the efficiency of this approach, comparisons were performed with one of the most efficient blood reagents on non-porous surfaces, Acid Yellow 7 (AY7). To this end, four non-porous surfaces were studied, i.e. glass, transparent polypropylene, black polyethylene, and aluminium foil. To evaluate the sensitivity of both reagents, sets of depleted fingermarks were prepared using the same finger, initially soaked with blood, which was then successively applied to the same surface without recharging it with blood or latent secretions. The successive marks were then cut in halves and the halves treated separately with each reagent. The results showed that QDs were as efficient as AY7 on glass, polyethylene and polypropylene surfaces, and superior to AY7 on aluminium. The use of QDs in new, sensitive and highly efficient latent and blood mark detection techniques appears highly promising. Health and safety issues related to the use of cadmium are also discussed. It is suggested that applying QDs in aqueous solution (and not as a dry dusting powder) considerably lowers the toxicity risks.
Journal Article
Fingerprint identification: advances since the 2009 National Research Council report
2015
This paper will discuss the major developments in the area of fingerprint identification that followed the publication of the National Research Council (NRC, of the US National Academies of Sciences) report in 2009 entitled: Strengthening Forensic Science in the United States: A Path Forward. The report portrayed an image of a field of expertise used for decades without the necessary scientific research-based underpinning. The advances since the report and the needs in selected areas of fingerprinting will be detailed. It includes the measurement of the accuracy, reliability, repeatability and reproducibility of the conclusions offered by fingerprint experts. The paper will also pay attention to the development of statistical models allowing assessment of fingerprint comparisons. As a corollary of these developments, the next challenge is to reconcile a traditional practice dominated by deterministic conclusions with the probabilistic logic of any statistical model. There is a call for greater candour and fingerprint experts will need to communicate differently on the strengths and limitations of their findings. Their testimony will have to go beyond the blunt assertion of the uniqueness of fingerprints or the opinion delivered ipse dixit.
Journal Article
Automated face recognition in forensic science: Review and perspectives
2020
• Forensic facial images are used for investigation, intelligence and evaluation.
• Automatic face recognition lacks standardization and validation to be used in court.
• Computing score-based likelihood-ratio allows interpreting face recognition evidence.
• Determination of relevant population and conditioning of propositions are crucial.
With recent technological innovations, the multiplication of captured images of criminal events has brought the comparison of faces to the forefront of the judicial scene. Forensic face recognition has become a ubiquitous tool to guide investigations, gather intelligence and provide evidence in court. However, its reliability in court still suffers from the lack of methodological standardization and empirical validation, notably when using automatic systems, which compare images and generate a matching score. Although the use of such systems is increasing drastically, they still require more empirical studies based on adequate forensic data (surveillance footage and identity documents) to become a reliable method for presenting evidence in court. In this paper, we propose a review of the literature leading to the establishment of a methodological workflow to develop a score-based likelihood-ratio computation model using a Bayesian framework. Different approaches are proposed in the literature regarding the modelling of the within-source and between-sources variability distributions. Depending on the data available, the modelling approach can be specific to the case or generic. Generic approaches allow interpreting the score without any available images of the suspect. Such a model is, however, harder to defend in court because the results are not anchored to the suspect. To make sure the computed score-based LR is robust, the performance of the model must be assessed on two main characteristics: the discriminating power and the calibration state of the model. We hence describe the main metrics (Equal Error Rate and Cost of log likelihood-ratio) and graphical representations (Tippett plots, Detection Error Trade-off plot and Empirical Cross-Entropy plot) used to quantify and visualize these performance characteristics.
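A score-based likelihood ratio of the kind reviewed above can be sketched in a few lines. The Gaussian within-source and between-sources score distributions below, and all their parameters, are illustrative assumptions for the sketch, not values or models from the paper; in practice both distributions are estimated from calibration data.

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of a normal distribution at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# Hypothetical score distributions (mean, sd): same-source comparisons
# tend to score high, different-source comparisons low.
WS = (0.80, 0.10)   # within-source (same person)
BS = (0.40, 0.12)   # between-sources (different people)

def score_based_lr(score):
    """LR = p(score | same source) / p(score | different sources)."""
    return normal_pdf(score, *WS) / normal_pdf(score, *BS)

# A high score supports the same-source proposition (LR > 1),
# a low score supports the different-sources proposition (LR < 1).
print(f"LR at score 0.75: {score_based_lr(0.75):.1f}")
print(f"LR at score 0.45: {score_based_lr(0.45):.3f}")
```

Discriminating power and calibration of such a model would then be checked on held-out scores, e.g. via Tippett plots or the cost of log likelihood-ratio.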
Journal Article
Use of gold nanoparticles as molecular intermediates for the detection of fingermarks
by Margot, Pierre; Becue, Andy; Champod, Christophe
in Aqueous solutions, Biological and medical sciences, Cyclodextrins - chemical synthesis
2007
Among the numerous methods dedicated to the detection of latent fingermarks, the MultiMetal Deposition (MMD) offers, as a main advantage, the ability to be applied to a great number of porous and non-porous surfaces, e.g., paper, plastic, glass, latex, and polystyrene, even if wetted. While considered a powerful and sensitive technique, MMD is often neglected, mainly because of operational limitations (siliconized vessels, restrictive pH domain, numerous immersion baths, …). In this contribution, we propose a modification of the standard MMD method so that the procedure is simplified, with the number of baths reduced to a minimum. To reach this goal, it was necessary to obtain a fully operable solution which could detect fingermarks in a single step. We chose to take advantage of molecular recognition mechanisms by functionalizing the gold nanoparticles with a molecular host able to bind itself to gold while keeping the ability to trap molecules in solution. Cyclodextrins were chosen as they can be easily chemically modified to offer gold-binding abilities. Moreover, they are widely used as hosts for various molecular guests (dyes, luminescent molecules, …). This new formulation has been tested on three different surfaces to attest to the feasibility of this strategy. Successful results were obtained, with detailed fingermarks offering good contrast that allowed their identification without the need to enhance the results (such as with a physical developer). While the new formulation behaves very similarly to the old one in terms of experimental conditions, it offers the additional advantage of developing fingermarks in a single immersion bath. The goal is thus reached.
Journal Article
Testing the accuracy and reliability of palmar friction ridge comparisons – A black box study
by Eldridge, Heidi; De Donno, Marco; Champod, Christophe
in Accuracy, Bayesian analysis, Bayesian theory
2021
• An error-rate estimation was established for palm comparisons.
• The false positive and false negative rates were 0.7% and 9.5%, respectively.
• Individual examiner performance and consensus on decisions ranged widely.
• Recommendations are made for reducing variability and erroneous exclusions.
• Full data can be viewed at https://cchampod.shinyapps.io/Results_BBStudy/.
Critics and commentators have been calling for some time for black box studies in the forensic science disciplines to establish the foundational validity of those fields—that is, to establish a discipline-wide, base-rate estimate of the error rates that may be expected in each field. While the well-known FBI/Noblis black box study has answered that call for fingerprints, no research to establish similar error rates for palmar impressions has been previously undertaken. We report the results of the first large-scale black box study to establish a discipline-wide error rate estimate for palmar comparisons. The 226 latent print examiner participants returned 12,279 decisions over a dataset of 526 known ground-truth pairings. There were 12 false identification decisions made, yielding a false positive error rate of 0.7%. There were also 552 false exclusion decisions made, yielding a false negative error rate of 9.5%. Given their larger number, false negative error rates were further stratified by size, comparison difficulty, and area of the palm from which the mark originated. The notion of “questionable conclusions,” in which the ground truth response may not be the most appropriate, is introduced and discussed in light of the data obtained in the study. Measures of examiner consistency in analysis and comparison decisions are presented along with statistical analysis of the ability of many variables, such as demographics or image quality, to predict outcomes. Two online apps are introduced that will allow the reader to fully explore the results on their own, or to explore the notions of frequentist confidence intervals and Bayesian credible intervals.
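The frequentist confidence intervals mentioned at the end of the abstract can be illustrated with a short sketch. The Wilson score interval below is a standard construction, not the one necessarily used in the study, and the trial counts are back-calculated from the reported rates (0.7% and 9.5%), so they are approximations.

```python
import math

def wilson_interval(errors, trials, z=1.96):
    """Approximate 95% Wilson score confidence interval for an error rate."""
    p = errors / trials
    denom = 1 + z**2 / trials
    centre = (p + z**2 / (2 * trials)) / denom
    half = z * math.sqrt(p * (1 - p) / trials + z**2 / (4 * trials**2)) / denom
    return centre - half, centre + half

# Reported counts: 12 false identifications, 552 false exclusions.
# Trial counts are estimated by inverting the reported rates.
fp_lo, fp_hi = wilson_interval(12, round(12 / 0.007))
fn_lo, fn_hi = wilson_interval(552, round(552 / 0.095))
print(f"false positive rate 0.7%, approx 95% CI ({fp_lo:.2%}, {fp_hi:.2%})")
print(f"false negative rate 9.5%, approx 95% CI ({fn_lo:.2%}, {fn_hi:.2%})")
```

A Bayesian credible interval, the other notion the study's apps explore, would instead be read off a Beta posterior over the error rate.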
Journal Article
A practical treatment of sensitivity analyses in activity level evaluations
by Kokshoorn, Bas; Taylor, Duncan; Champod, Christophe
in Activity level, Bayesian Network, Court reporting
2024
Evaluations of forensic observations considering activity level propositions are becoming more commonplace in forensic institutions. A measure that can be taken to interrogate the evaluation for robustness is called a sensitivity analysis. A sensitivity analysis explores the sensitivity of the evaluation to the data used when assigning probabilities, to the level of uncertainty surrounding a probability assignment, or to the choice of various assumptions within the model. There have been a number of publications that describe sensitivity analyses in technical terms and demonstrate their use, but there is limited literature on how that theory can be applied in practice. In this work we provide some simplified examples of how sensitivity analyses can be carried out, when they are likely to show that the evaluation is sensitive to underlying data, knowledge or assumptions, how to interpret the results of a sensitivity analysis, and how the outcome can be reported. We also provide access to an application to conduct sensitivity analysis.
• Forms of sensitivity analysis that apply to activity level evaluations are presented.
• Examples are given for when evaluations are likely to be most sensitive to data.
• Strategies for practically dealing with sensitivity evaluations are discussed.
• Examples of reporting the results of sensitivity evaluations are provided.
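The kind of sweep a sensitivity analysis performs can be sketched very simply. The single-transfer likelihood-ratio model below and every probability value in it are hypothetical illustrations chosen for the sketch, not the paper's models or data.

```python
# Hypothetical activity-level model: LR = t_p / b, where t_p is the
# probability of transfer and persistence given the alleged activity,
# and b is the probability of finding the material by chance.
def likelihood_ratio(t_p, b):
    return t_p / b

baseline = likelihood_ratio(t_p=0.6, b=0.01)

# Sweep one assigned probability over a plausible range and record how
# much the LR moves -- the essence of a sensitivity analysis. If the LR
# stays within the same order of magnitude, the evaluation is robust to
# that assignment.
lrs = [likelihood_ratio(t, 0.01) for t in (0.3, 0.6, 0.9)]
print(f"baseline LR = {baseline:.0f}, "
      f"range under t_p sweep: {min(lrs):.0f} to {max(lrs):.0f}")
```

The same sweep would be repeated for each probability assignment (and each structural assumption) to identify which ones the conclusion is actually sensitive to.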
Journal Article
A review of predictive modelling and drone remote sensing technologies as a tool for detecting clandestine burials
by Milliet, Quentin; Koopman, Marissa; Champod, Christophe
in Cellular telephones, Clandestine graves, Criminal investigations
2025
The search for missing people is a complex and intensive undertaking. Predictive models (such as RAG mapping and geographic profiling) in combination with drone-mounted technologies can improve these searches by driving down time and monetary costs, gathering new types of data and reducing the need for investigators to expose themselves to dangerous environments. Promising technologies to discover traces of clandestine burials in the landscape are LiDAR, RGB photography, multispectral and hyperspectral imaging, as well as infrared/thermal photography. This review covers the existing literature on these techniques and discusses future opportunities and directions.
• Predictive modelling and spatial analysis contribute to narrowing down the most probable locations of clandestine graves.
• Physical features of the environment are indicators of the presence of graves.
• Remote sensing technologies provide a variety of tools to locate grave-related features.
• Air-based imaging and laser techniques have advantages and disadvantages to support search teams.
Journal Article
Use of automated quality assessment algorithms in fingermark detection research – Application to IND/Zn vs DFO
2024
When developing detection techniques for fingermarks, the detected fingermarks must be evaluated for their quality to assess the effectiveness of the new method. It is common practice to compare the performance of the new (optimized) technique with traditional or well-established ones. In current practice, this evaluation step is carried out by a group of human assessors. A new approach is applied in this paper and consists of using algorithms to perform this task. To implement this approach, the comparison between IND/Zn and DFO was chosen because it has already been the subject of many articles published in recent years and a consensus exists on the superiority of IND/Zn over DFO. The quality of 3600 fingermarks developed using both detection techniques was assessed automatically using two algorithms: LQM (Latent Quality Metric) and ILFQM (Improved Latent Fingerprint Quality Metric). The distribution of quality scores was studied for both detection techniques. The results showed that fingermarks detected with IND/Zn received higher scores on average than fingermarks detected with DFO, which is in line with the consensus in the literature based on human assessment. The results of this research are promising and show that automated fingermark quality assessment is an efficient and viable way to comparatively assess fingermark detection techniques.
• First use of automatic quality assessment to compare two fingerprint reagents.
• Overall, IND/Zn led to more fingermarks, of better quality, compared to DFO.
• The conclusions reached by the algorithms match the scientific consensus.
• Automated algorithms allow a quick analysis of large sets of fingermarks.
• Automated algorithms represent a promising alternative to human assessment.
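The comparative assessment described above amounts to comparing two score distributions. A minimal sketch, using simulated quality scores in place of real LQM/ILFQM outputs (the means, spreads, and sample sizes below are invented for illustration):

```python
import random

random.seed(0)
# Simulated quality scores standing in for algorithmic assessments of
# marks developed with each reagent; all parameters are hypothetical.
ind_zn = [random.gauss(70, 10) for _ in range(300)]
dfo = [random.gauss(62, 10) for _ in range(300)]

# Probability of superiority: the chance that a randomly chosen IND/Zn
# mark scores higher than a randomly chosen DFO mark (0.5 = no difference).
wins = sum(a > b for a in ind_zn for b in dfo)
p_sup = wins / (len(ind_zn) * len(dfo))
print(f"mean IND/Zn = {sum(ind_zn)/len(ind_zn):.1f}, "
      f"mean DFO = {sum(dfo)/len(dfo):.1f}, P(superiority) = {p_sup:.2f}")
```

A rank-based statistic like this is one simple way to summarise "higher scores on average" without assuming a particular score distribution.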
Journal Article