132 result(s) for "Prothrombin Time - standards"
International Council for Standardization in Haematology Field Study Evaluating Optimal Interpretation Methods for Activated Partial Thromboplastin Time and Prothrombin Time Mixing Studies
The prothrombin time (PT) and activated partial thromboplastin time (APTT) are screening tests used to detect congenital or acquired bleeding disorders. An unexpected PT and/or APTT prolongation is often evaluated using a mixing test with normal plasma. Failure to correct ("noncorrection") the prolongation upon mixing is attributed to an inhibitor, whereas "correction" points to factor deficiency(ies). This study aimed to define an optimal method for determining correction or noncorrection of plasma mixing tests through an international, multisite study that used multiple PT and APTT reagents and well-characterized plasma samples. Each testing site was provided 22 abnormal and 25 normal donor plasma samples, and mixing studies were performed using local PT and APTT reagents. Mixing study results were evaluated using 11 different calculation methods to assess the optimal method based on the expected interpretation for factor deficiencies (correction) and inhibitors (noncorrection). Misprediction, which represents the failure of a mixing study interpretation method, was assessed. Percentage correction was the most suitable calculation method for interpreting PT mixing test results for nearly all reagents evaluated. Incubated PT mixing tests should not be performed. For APTT mixing tests, percentage correction should be performed first; if the result indicates a factor deficiency, this should be confirmed with the subtraction III calculation, in which the normal pooled plasma result (run concurrently) is subtracted from the mixing test result, with correction indicated by a result of 0 or less. In general, other calculation methods that performed well in identifying factor deficiency tended to have high misprediction rates for inhibitors, and vice versa. No single method of mixing test calculation was consistently successful in accurately distinguishing factor deficiencies from inhibitors, and between-reagent and between-site variability were also identified.
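The two calculations the study recommends can be sketched as follows. The subtraction III rule is taken directly from the abstract; the percentage-correction formula shown, (patient − mix) / (patient − normal) × 100, is one commonly used variant and is an assumption here, since the abstract does not spell out the exact form used.

```python
def percent_correction(pt_patient, pt_mix, pt_normal):
    """Percentage correction of a mixing study: how far the 1:1 mix
    moved the patient's prolonged clotting time back toward normal.
    High values suggest factor deficiency; low values, an inhibitor."""
    return (pt_patient - pt_mix) / (pt_patient - pt_normal) * 100.0

def subtraction_iii(mix_result, npp_result):
    """Subtraction III: mixing-test result minus the concurrently run
    normal pooled plasma (NPP) result; 0 or less indicates correction."""
    return mix_result - npp_result
```

For example, a patient APTT of 80 s that falls to 40 s on mixing (normal plasma 30 s) gives 80% correction, consistent with a factor deficiency.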
Harmonizing the International Normalized Ratio (INR)
The aim was to reduce interlaboratory variation and bias in international normalized ratio (INR) results, as used to monitor patients receiving vitamin K antagonist therapy (including warfarin), across a large pathology network (n = 27 laboratories) by procedural standardization and harmonization. Network consensus to standardize to common instrument and reagent platforms was established following development of hemostasis test specifications, with installations and implementation occurring after conclusion of a government tender process. A simple, novel process for verifying harmonization of the local international sensitivity index (ISI) and mean normal prothrombin time (MNPT) was applied network-wide for each new lot of INR reagent; it does not require ongoing use of reference thromboplastin or calibration/certified plasma sets. We achieved a reduction in instrument manufacturers (from four to one), instrument types (10 to three), reagent types (four to one), and instrument/reagent combinations (12 to three), plus a substantial reduction in INR variability and bias. The results imply significant improvement in local patient management, with positive implications for other laboratories. For the United States in particular, the lack of US Food and Drug Administration-cleared certified plasmas may compromise INR accuracy, and our novel approach may provide a workable alternative for laboratories and networks.
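The two quantities being harmonized here, the local ISI and the mean normal prothrombin time (MNPT), enter the standard INR definition directly. A minimal sketch of that well-established formula:

```python
def inr(pt_patient, mnpt, isi):
    """International normalized ratio:
    (patient PT / mean normal PT) raised to the reagent- and
    instrument-specific international sensitivity index (ISI).
    pt_patient and mnpt are clotting times in seconds."""
    return (pt_patient / mnpt) ** isi
```

With an ISI of 1.0 and an MNPT of 13 s, a patient PT of 26 s yields an INR of 2.0; a less sensitive reagent (higher ISI) pushes the same PT ratio to a higher INR, which is why lot-by-lot ISI/MNPT verification matters.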
The impact of frequency of patient self-testing of prothrombin time on time in target range within VA Cooperative Study #481: The Home INR Study (THINRS), a randomized, controlled trial
Anticoagulation (AC) is effective in reducing thromboembolic events for individuals with atrial fibrillation (AF) or a mechanical heart valve (MHV), but maintaining patients in the target range for international normalized ratio (INR) can be difficult. Evidence suggests that increasing INR testing frequency can improve time in target range (TTR), but this can be impractical with in-clinic testing. The objective of this study was to test the hypothesis that more frequent patient self-testing (PST) via home monitoring increases TTR. This planned substudy was conducted as part of The Home INR Study, a randomized controlled trial of in-clinic INR testing every 4 weeks versus PST at three different intervals, set in 6 VA centers across the United States. 1,029 candidates with AF or MHV were trained and tested for competency using ProTime INR meters; 787 patients were deemed competent and, after a second consent, randomized across four arms: high-quality AC management (HQACM) in a dedicated clinic with venous INR testing once every 4 weeks, and telephone-monitored PST once every 4 weeks, weekly, or twice weekly. The primary endpoint was TTR at 1-year follow-up; the secondary endpoints were major bleeding, stroke, death, and quality of life. TTR increased as testing frequency increased (59.9 ± 16.7%, 63.3 ± 14.3%, and 66.8 ± 13.2% [mean ± SD] for the groups that underwent PST every 4 weeks, weekly, and twice weekly, respectively). The proportion of poorly managed patients (i.e., TTR < 50%) was significantly lower for the PST groups than for HQACM, and this proportion decreased as testing frequency increased. Patients and their care providers were unblinded, given the nature of PST and HQACM. In conclusion, more frequent PST improved TTR and reduced the proportion of poorly managed patients.
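The abstract does not state how TTR was computed; the usual choice for studies like this is Rosendaal-style linear interpolation between successive INR measurements, sketched below under the simplifying assumptions that measurement times are whole days and the therapeutic range is 2.0–3.0.

```python
def ttr(days, inrs, low=2.0, high=3.0):
    """Time in therapeutic range as the fraction of days whose
    linearly interpolated INR lies in [low, high] (Rosendaal-style).
    `days` are integer measurement times; `inrs` the measured values."""
    in_range = 0.0
    total = float(days[-1] - days[0])
    for (d0, i0), (d1, i1) in zip(zip(days, inrs), zip(days[1:], inrs[1:])):
        span = d1 - d0
        for step in range(span):
            # interpolated INR at the midpoint of each day in the interval
            v = i0 + (i1 - i0) * (step + 0.5) / span
            if low <= v <= high:
                in_range += 1.0
    return in_range / total
```

For example, a patient drifting linearly from INR 1.5 to 3.5 over 4 days spends roughly half that interval in range, so `ttr([0, 4], [1.5, 3.5])` returns 0.5.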
Clinical usefulness of international normalized ratio calibration of prothrombin time in patients with chronic liver disease
The international normalized ratio (INR) may not be directly applicable to patients with liver disease. We aimed to establish an alternative INR calibration system for patients with liver disease and to evaluate its use in patients with chronic liver disease. Eighty-two patients with liver cirrhosis (LC) were included, and their prothrombin times (PTs) were measured using 5 commercial thromboplastins. Each thromboplastin was also assigned an international sensitivity index (ISI_liver) calibrated against plasmas from LC patients. INR_vka, INR_liver, model for end-stage liver disease (MELD)_vka, MELD_liver, Child-Pugh (Child)_vka, and Child_liver scores were calculated. The coefficient of variation of INR_vka was significantly larger than that of INR_liver (P < 0.01). The mean difference in INR_vka between the thromboplastins was also significantly larger than that in INR_liver (P < 0.01). The total mean MELD_liver score was higher than the total mean MELD_vka score. The mean difference between the MELD_vka and MELD_liver scores (MELD score ≥ 15) was 3.2%. We reconfirmed that use of the alternative calibration system described herein for patients with liver disease may resolve the variability of INR measurement. Our data suggest the need to reevaluate the correlation between Child-Pugh class, MELD score, and clinical prognosis using INR_liver for patients with LC.
Assessing Clinical Laboratory Quality: A College of American Pathologists Q-Probes Study of Prothrombin Time INR Structures, Processes, and Outcomes in 98 Laboratories
The anticoagulant warfarin has been identified as the second most frequent drug responsible for serious, disabling, and fatal adverse drug events in the United States, and its effect on blood coagulation is monitored by the laboratory test called the international normalized ratio (INR). The objective was to determine the presence of INR policies and procedures, INR practices, and the completeness and timeliness of reporting critical INR results in participants' clinical laboratories. Participants reviewed their INR policy and procedure requirements, identified their practices using a questionnaire, and studied the completeness of documentation and timeliness of reporting critical-value INR results for outpatients and emergency department patients. In 98 participating institutions, the 5 required policies and procedures were in place in 93% to 99% of clinical laboratories. Fifteen options for the allowable variation among duplicate results from different analyzers, 12 different timeliness goals for reporting critical values, and 18 unique critical-value limits were used by participants. All required documentation elements were present in 94.8% of 192 reviewed INR validation reports. Critical-value INR results were reported within the time frame established by the laboratory for 93.4% of 2,604 results, but 1.0% of results were not reported. Although the median laboratory successfully communicated all critical results within its established time frames and had all the required validation elements in its 2 most recent INR calculations, participants at the lowest 10th percentile met only 80.0% and 85.7% of these requirements, respectively. Significant opportunities exist for improved adherence to INR procedural requirements and for practice patterns and timeliness goals for reporting critical INR results.
International normalized ratio testing with point-of-care coagulometer in healthy term neonates
Background Neonates routinely receive vitamin K to prevent vitamin K deficiency bleeding, which is associated with a high mortality rate and a high frequency of neurological sequelae. A coagulation screening test might be necessary to detect prophylactic failure or incomplete prophylaxis. However, venous access and the volume of blood required for such testing can be problematic. CoaguChek XS is a portable device designed to monitor prothrombin time while drawing only a small volume of blood. Although the device is used in adults and children, studies have not been performed to evaluate its clinical utility in neonates, and the reference value is unknown in this population. The objectives of the present study were to determine the reference intervals (RIs) for international normalized ratio (INR) using the CoaguChek XS by capillary puncture in healthy term neonates, to evaluate factors that correlate with INR, and to evaluate the device by assessing its ease of use in clinical practice. Methods This study included 488 healthy term neonates born at a perinatal center between July 2012 and June 2013. The INRs determined by CoaguChek XS were measured in 4-day-old neonates. Results The enrolled neonates were orally administered vitamin K 6-12 h after birth. An RI for INRs in 4-day-old neonates was established using the CoaguChek XS, with a median value of 1.10 and a range of 0.90–1.30. A significant difference in the INR was noted between male (median value, 1.10; RI, 0.90–1.30) and female (median value, 1.10; RI, 0.90–1.24) neonates (p = 0.049). The INR was found to correlate with gestational age, birth weight, and hematocrit value. Conclusions The CoaguChek XS device is safe, fast, and convenient for performing INR assays in neonates. Our study is the first to establish an RI for INRs measured using the CoaguChek XS in healthy term neonates.
Preparation of Control Blood for External Quality Assessment of Point-of-Care International Normalized Ratio Testing in the Netherlands
The aim of this study was to prepare control blood for an external quality assessment scheme (EQAS) for international normalized ratio (INR) point-of-care testing (POCT) in the Netherlands and to assess the performance of the participants. Control blood was prepared from dialyzed pooled patient plasma and washed human erythrocytes. Samples of control blood were mailed to participants of the Netherlands EQAS from October 2006 through December 2012. Most participants used CoaguChek XS (Roche Diagnostics, Mannheim, Germany) devices for POCT. The median between-center coefficient of variation (CV) of the reported INR decreased from 4.5% in 2006 to 2.6% in 2012. A few participants used the ProTime Microcoagulation System (ITC, Edison, NJ) for POCT. The median CV (per year) of the INR with the latter system was 7.0% to 10.6%. The control blood samples were useful for external quality assessment in the Netherlands. The participants' performance with the CoaguChek XS system improved with time, demonstrating the value of external quality assessment.
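The between-center coefficient of variation used above to track scheme performance is simply the standard deviation of the INRs reported across centers divided by their mean, expressed as a percentage:

```python
import statistics

def cv_percent(values):
    """Between-center coefficient of variation (%) of reported INRs:
    sample standard deviation divided by the mean, times 100."""
    return statistics.stdev(values) / statistics.mean(values) * 100.0
```

For instance, centers reporting INRs of 2.4, 2.5, and 2.6 on the same control sample have a CV of 4.0%, in the range the CoaguChek XS participants achieved by 2012.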
Simplified Method for Local Correction of System International Normalized Ratio
Abstract Background International normalized ratio (INR) derivation depends on the international sensitivity index (ISI) of the thromboplastin, which varies with the instrument and reagents used. Objective To evaluate the role of a correction factor in the derivation of INR. Methods We studied prothrombin time (PT) and INR from patients using 3 thromboplastins of varying ISI values. The correction factor was applied to the observed INR to obtain a corrected INR. Results The difference between corrected and observed INR values ranged from −0.8 to 0.96. Conclusions The corrected INR is dependent on PT only and can be applied to all patients irrespective of the cause of an elevated PT.
The association of serum antiphospholipid antibodies and dilute Russell's viper venom times
Aims We hypothesised that there is a threshold value for the association of dilute Russell's viper venom times (dRVVT) with positive immunoglobulin G antiphospholipid antibody (IgG-APLA) test results. Methods We tested 120 controls and a cohort of 2412 outpatients who had concomitant test results for dRVVT and IgG-APLA (IgG antibodies to cardiolipins (aCL) and β2-glycoprotein I). We also selected a subgroup who had repeated IgG-APLA tests at least 12 weeks apart (1398 patients with multiple β2-glycoprotein I tests and 672 with multiple aCL tests). We cross-tabulated the proportion of IgG-APLA single-positive, double-positive and persistently positive antibodies against dRVVT values. Results The distribution of the dRVVT results from the reference population was consistent with an upper limit of the reference interval of 1.22 to >1.48. A consistent increase in the proportion of single-positive, double-positive and persistently positive IgG-APLA tests occurred in the group with a normalised dRVVT ratio of 1.40–1.49. IgG-APLA double positivity was found in 12.5% (4 of 32) of patients with a dRVVT ratio of 1.40–1.49 compared with 3.3% (6 of 181) of those with a dRVVT ratio of 1.20–1.39 (p=0.045). Conclusions We conclude that there is an association between dRVVT positivity and elevated proportions of single-, double- and persistently positive IgG-APLA test results, with an apparent threshold effect. These findings may provide a general guide to risk and suggest a way to choose from a wide range of possible upper limits of the reference interval.
Poor Agreement among Prothrombin Time International Normalized Ratio Methods: Comparison of Seven Commercial Reagents
Background: Prothrombin time (PT) has long been the most popular test for monitoring oral anticoagulation therapy. The International Normalized Ratio (INR) was introduced to overcome the problem of marked variation in PT results among laboratories and in the resulting recommendations for patient care. Under this principle, all reagents should be calibrated to give identical results, and hence the same patient care, globally. This is necessary both for monitoring individual patients and for applying the results of anticoagulation trials and guidelines to clinical practice. Methods: We took blood samples from 150 patients for whom oral anticoagulation had been prescribed. Plasmas were separated, and PTs were determined using seven commercial reagents and four calibrator sets. The differences in results were assessed by plotting, for each possible pair of methods, the differences in INR values for each sample against the mean INR value (Bland-Altman difference plots). Results: Mean results differed significantly (P < 0.001) for 17 of 21 possible paired comparisons of methods. Only two pairs of methods produced very similar results when assessed for three problems: substantial differences in INR values; a significant, systematic increase in the difference with increasing INR; and a significant, systematic increase in the variation of the difference with increasing INR. Conclusions: The agreement among several (and perhaps most) commercial INR methods is poor. The failure of current calibration strategies may severely compromise both the monitoring of individual patients and the application of oral anticoagulation guidelines and trial results to clinical practice.
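The Bland-Altman comparison described in the Methods reduces to computing, per sample, the difference between two methods' INRs and plotting it against their mean; the mean of those differences is the systematic bias between the methods. A minimal sketch:

```python
def bland_altman(inr_a, inr_b):
    """Per-sample (mean, difference) pairs for a Bland-Altman plot of
    two INR methods, plus the overall bias (mean difference)."""
    pairs = [((x + y) / 2.0, x - y) for x, y in zip(inr_a, inr_b)]
    bias = sum(d for _, d in pairs) / len(pairs)
    return pairs, bias
```

A nonzero bias, or differences that grow with the mean INR (the "systematic increase with INR" the study tested for), both indicate that the two reagents are not interchangeable despite INR calibration.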