3,031 results for "real-world analysis"
Persistence in allergen immunotherapy: A longitudinal, prescription data‐based real‐world analysis
Introduction Allergic rhinitis (AR) is a widespread disease with increasing prevalence in developed countries. The only treatment that tackles its underlying causes is allergen immunotherapy (AIT), which is administered via two routes: subcutaneous immunotherapy (SCIT) or sublingual immunotherapy (SLIT). Persistence over the long treatment course of 3 years is key to the efficacy of this treatment option, and impaired adherence significantly impacts public health resources. The aim of this study was to assess persistence of AIT for both application routes. Methods IQVIA™ LRx was used to identify patients starting AIT between 2009 and 2018 with grass pollen (GP), early flowering tree pollen (EFTP) and house dust mite (HDM) allergens. Patients were classified within each allergen category by AIT group (subcutaneous depigmented polymerised allergen AIT [dSCIT], other subcutaneous AIT [oSCIT] and SLIT) and age (5-11 years, 12-17 years, 18+ years). They were followed up for up to 3 years until cessation of treatment; patients still on treatment after 3 years were censored. Kaplan-Meier curves of persistence were generated and compared by log-rank tests. Results The number of patients included in the three allergen categories was 38,717 for GP, 23,183 for EFTP, and 41,728 for HDM AIT. In all allergen categories and for every product group, patient persistence decreased with increasing age class, with the difference between the 5-11 and 12-17 year groups greater than that between the latter and the 18+ group. The percentage of patients completing the first year of AIT was low, particularly for SLIT, where only 22.2%–27.1% of patients remained persistent after 12 months. The equivalent figures were 52.0%–64.1% for dSCIT and 38.3%–50.3% for oSCIT. Conclusion Persistence in AIT for AR was low in this retrospective prescription-based database and was clearly linked to patient age and application route.
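The persistence figures above come from Kaplan-Meier curves with censoring at 3 years. As a minimal sketch (not the study's actual IQVIA LRx pipeline), the product-limit estimator over hypothetical months-on-treatment data looks like this:

```python
# Minimal Kaplan-Meier estimator for treatment persistence. The durations and
# censoring flags below are hypothetical; the study used prescription records.

def kaplan_meier(durations, observed):
    """Return [(time, survival_probability)] at each event time.

    durations: months until treatment stop (or censoring)
    observed:  True if the patient stopped (event), False if censored
               (e.g. still on treatment at the 36-month cutoff)
    """
    pairs = sorted(zip(durations, observed))
    at_risk = len(pairs)
    surv = 1.0
    curve = []
    i = 0
    while i < len(pairs):
        t = pairs[i][0]
        # Count events and total patients (events + censored) at this time.
        events = sum(1 for d, e in pairs[i:] if d == t and e)
        ties = sum(1 for d, e in pairs[i:] if d == t)
        if events:
            surv *= 1 - events / at_risk
            curve.append((t, surv))
        at_risk -= ties
        i += ties
    return curve

# Hypothetical cohort of 6 patients; False = censored (still persistent).
durations = [3, 12, 12, 24, 36, 36]
observed = [True, True, True, True, False, False]
print(kaplan_meier(durations, observed))
```

The real analysis then compares such curves across the dSCIT, oSCIT, and SLIT groups with log-rank tests, which this sketch does not implement.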
Association of uncontrolled blood pressure in apparent treatment‐resistant hypertension with increased risk of major adverse cardiovascular events plus
Patients with apparent treatment‐resistant hypertension (aTRH) are at increased risk of end‐organ damage and cardiovascular events. Little is known about the effects of blood pressure (BP) control in this population. Using a national claims database integrated with electronic medical records, the authors evaluated the relationships between uncontrolled BP (UBP; ≥130/80 mmHg) versus controlled BP (CBP; <130/80 mmHg) and the risk of major adverse cardiovascular events plus (MACE+; stroke, myocardial infarction, heart failure requiring hospitalization) and end‐stage renal disease (ESRD) in adult patients with aTRH (taking ≥3 antihypertensive medication classes concurrently within 30 days between January 1, 2015 and June 30, 2021). MACE+ components were also evaluated separately. Multivariable regression models were used to adjust for baseline differences in demographic and clinical characteristics, and sensitivity analyses using CBP <140/90 mmHg were conducted. Patients with UBP (n = 22 333) were younger and had fewer comorbidities at baseline than those with CBP (n = 11 427). In the primary analysis, which adjusted for these baseline differences, UBP versus CBP patients were at an 8% increased risk of MACE+ (driven by a 31% increased risk of stroke) and a 53% increased risk of ESRD after 2.7 years of follow‐up. Greater MACE+ (22%) and ESRD (98%) risk increases with UBP versus CBP were seen in the sensitivity analysis. These real‐world data showed an association between suboptimal BP control in patients with aTRH and a higher incidence of MACE+ and ESRD despite the use of multidrug regimens. Thus, there remains a need for improved aTRH management.
Ixazomib, Lenalidomide and Dexamethasone in Relapsed and Refractory Multiple Myeloma in Routine Clinical Practice: Extended Follow-Up Analysis and the Results of Subsequent Therapy
Background: We confirmed the benefit of adding ixazomib to lenalidomide and dexamethasone in patients with relapsed and refractory multiple myeloma (RRMM) in an unselected real-world population. We report the final analysis for overall survival (OS), second progression-free survival (PFS-2), and the subanalysis of outcomes in lenalidomide (LEN)-pretreated and LEN-refractory patients. Methods: We assessed 344 patients with RRMM, treated with IRD (N = 127) or RD (N = 217). The data were acquired from the Czech Registry of Monoclonal Gammopathies (RMG). With prolonged follow-up (median 28.5 months), we determined the new primary endpoints OS, PFS and PFS-2. Secondary endpoints included the next therapeutic approach and survival measures in LEN-pretreated and LEN-refractory patients. Results: The final overall response rate (ORR) was 73.0% in the IRD cohort and 66.8% in the RD cohort. The difference in patients reaching ≥VGPR remained significant (38.1% vs. 26.3%, p = 0.028). Median PFS remained significantly improved in the IRD cohort (17.5 vs. 12.5 months, p = 0.013), with better outcomes in patients with 1–3 prior relapses (22.3 vs. 12.7 months, p = 0.003). In the whole cohort, median OS was 40.9 months for IRD versus 27.1 months for RD patients (p = 0.001), with further improvement within relapses 1–3 (51.7 vs. 27.8 months, p < 0.001). The median PFS of LEN-pretreated (N = 22) vs. LEN-naive (N = 105) patients treated with IRD was 8.7 vs. 23.1 months (p = 0.001), and median OS was 13.2 vs. 51.7 months (p = 0.030). Most patients in both arms progressed and received further myeloma-specific therapy (63.0% in the IRD group and 53.9% in the RD group), most often pomalidomide-based or bortezomib-based therapy. Significantly more patients with previous IRD vs. RD received subsequent monoclonal antibodies (daratumumab: 16.3% vs. 4.3%, p = 0.0054; isatuximab: 5.0% vs. 0.0%, p = 0.026) and carfilzomib (12.5% vs. 1.7%, p = 0.004).
The median PFS-2 (progression-free survival from the start of IRD/RD therapy until second disease progression or death) was significantly longer in the IRD cohort (29.8 vs. 21.6 months, p = 0.016). There were no additional safety concerns in the extended follow-up. Conclusions: The IRD regimen is well tolerated, easy to administer, and has very good therapeutic outcomes. The survival measures in this unselected real-world population are comparable to the outcomes of the clinical trial. As expected, LEN-pretreated patients have poorer outcomes than those who are LEN-naive. The PFS benefit of IRD vs. RD translated into significantly better PFS-2 and OS, but the outcomes must be interpreted in light of imbalances in pretreatment group characteristics (especially younger age and stem cell transplant pretreatment) and in subsequent therapies.
House dust mite immunotherapy: A real‐world, prescription data‐based analysis
Background House dust mite (HDM) sensitisation can contribute to the development of allergic rhinoconjunctivitis (AR) or allergic asthma (AA). As a treatment, allergen immunotherapy (AIT) is a promising approach, since it aims to build immunotolerance against allergens, thereby establishing long-term efficacy. AIT has been evaluated in many randomised controlled trials, whereas few real-world evidence studies are available. Methods We used data from the longitudinal prescription database IQVIA™ LRx. Data on initial AIT prescriptions against HDM from January 2009 to December 2013 were analysed by treatment (subcutaneous AIT with either depigmented polymerised allergen extract [dSCIT] or other allergens [oSCIT], or sublingual immunotherapy [SLIT]) and treatment duration. Treatment groups were compared with a control group of AR patients not receiving AIT. Data on symptomatic medication were collected until February 2017, and progression of AR and AA was compared. Results Data from 7260 patients with AIT prescriptions and 21,780 control patients were analysed. AIT was associated with a significant decrease in AR medication intake compared with control (dSCIT: −34.0%, p < 0.0001; oSCIT: −25.7%, p < 0.0001; SLIT: −37.7%, p = 0.0026). In asthmatics, SCIT was associated with a significant decrease in asthma medication compared with control (dSCIT: −45.2%, p < 0.0001; oSCIT: −32.9%, p < 0.0001). Further, a significantly reduced likelihood of starting asthma medication was demonstrated in patients treated with SCIT compared with controls (dSCIT OR: 0.759, p = 0.0476; oSCIT OR: 0.815, p = 0.0339). Conclusion Real-world data analyses indicate that AIT, particularly given via the subcutaneous route, reduces the need for medication against AR and AA and might delay the onset of asthma medication in patients with AR.
Feasibility of next‐generation sequencing test for patients with advanced NSCLC in clinical practice
Background The usefulness of the Oncomine Dx Target test (Oncomine Dx), a next-generation sequencing (NGS) test, has already been proven in clinical trials. However, NGS requires high-quality tumor samples and takes a long time to generate results. The feasibility of NGS for advanced non-small cell lung cancer (NSCLC) patients in clinical practice has not yet been determined. Methods Patients serially diagnosed with advanced NSCLC were evaluated in our hospital. The Oncomine Dx, Cobas EGFR mutation test (Cobas EGFR), and ALK-IHC were performed. The patients were divided into four sets: the full analysis set (FAS), comprising patients diagnosed with NSCLC; the intent-to-perform companion diagnostics (CDx) set (IPS), comprising patients for whom CDx had been ordered regardless of sample quality; the per-performed CDx set (PPS), comprising patients who could undergo CDx regardless of the results; and the per-completed CDx set (CCS), comprising patients for whom informative results were received from the CDx. Results The total number of patients analyzed in the study was 167. The IPS/FAS ratio of Oncomine Dx (80.2%) was lower than that of ALK-IHC (85.0%) and Cobas EGFR (92.8%). The CCS/FAS ratio of Oncomine Dx (65.9%) was lower than that of ALK-IHC (82.0%) and Cobas EGFR (92.2%). The PPS/IPS and CCS/PPS ratios of Oncomine Dx with nonsurgical biopsy ranged between 78.6% and 90.9%, lower than in patients who underwent surgical resection (95.0% and 100%). Conclusions The feasibility of Oncomine Dx in clinical practice was lower than that of the other CDx tests and could be increased by improving the biopsy procedure. Key points Significant study findings The usefulness of a next-generation sequencing (NGS) test has been proven in clinical trials. The feasibility of NGS in clinical practice is lower than that of other diagnostics, especially with regard to nonsurgical biopsy.
What this study adds It is necessary to improve the feasibility of NGS in clinical practice. To improve NGS feasibility, turnaround time must be shortened, and larger samples must be obtained during surgical procedures. The feasibility of Oncomine Dx in clinical practice is relatively low compared with that of the ALK‐IHC and Cobas EGFR.
Efficacy of bendamustine and rituximab in unfit patients with previously untreated chronic lymphocytic leukemia. Indirect comparison with ibrutinib in a real‐world setting. A GIMEMA‐ERIC and US study
Limited information is available on the efficacy of front-line bendamustine and rituximab (BR) in chronic lymphocytic leukemia (CLL) with reduced renal function or coexisting conditions. We therefore analyzed a cohort of real-world patients and performed a matched adjusted indirect comparison with a cohort of patients treated with ibrutinib. One hundred and fifty-seven patients with creatinine clearance (CrCl) <70 mL/min and/or a CIRS score >6 were treated with BR. The median age was 72 years, 69% of patients had ≥2 comorbidities, and the median CrCl was 59.8 mL/min. TP53 disruption was present in 17.6% of patients. The median progression-free survival (PFS) was 45 months; TP53 disruption was associated with a shorter PFS (P = 0.05). The overall survival (OS) at 12, 24, and 36 months was 96.2%, 90.1%, and 79.5%, respectively. TP53 disruption was associated with an increased risk of death (P = 0.01). Data on 162 patients ≥65 years treated with ibrutinib were analyzed and compared with 165 patients ≥65 years treated with BR. Factors predicting a longer PFS at multivariable analysis in the total patient population treated with BR and ibrutinib were age (HR 1.06, 95% CI 1.02-1.10, P < 0.01) and treatment with ibrutinib (HR 0.55, 95% CI 0.33-0.93, P = 0.03). In a post hoc analysis of patients in advanced stage, a significant PFS advantage was observed in patients who had received ibrutinib (P = 0.03), along with a trend toward an OS advantage (P = 0.08). We arrived at the following conclusions: (a) BR is a relatively effective first-line regimen in a real-world population of unfit patients without TP53 disruption, and (b) ibrutinib provided longer disease control than BR in patients with advanced disease stage.
Bendamustine and rituximab was a relatively effective first-line regimen in real-world untreated CLL patients with reduced renal function or coexisting conditions and without TP53 disruption. In a matched-adjusted indirect comparison with a cohort of CLL patients treated upfront, ibrutinib provided longer PFS than bendamustine and rituximab in those with advanced stage.
Refining Tumor Mutational Burden as a Predictive Biomarker for Pembrolizumab: A Real‐World Analysis in Japanese Patients
Tumor mutational burden (TMB) is a key biomarker for predicting the response to immune checkpoint inhibitors (ICIs). However, its predictive accuracy in real‐world clinical practice, particularly in Asian populations, remains inadequately evaluated. We addressed this issue by analyzing real‐world data from 63,952 patients registered in the Center for Cancer Genomics and Advanced Therapeutics (C‐CAT) database, which integrates genomic and clinical information from Japanese patients with various advanced solid tumors. We assessed the therapeutic efficacy of pembrolizumab in 1899 patients who underwent one of three comprehensive genomic profiling tests: FoundationOne CDx, the OncoGuide NCC Oncopanel System, or the GenMine TOP Cancer Genome Profiling System. Based on the reported TMB values, patients were classified as TMB‐high (≥ 10 mutations per megabase) or TMB‐low (< 10 mutations per megabase). The objective response rate (ORR) among 946 TMB‐high patients exceeded 30% and was significantly higher than that observed in 953 TMB‐low patients (16.8%, p < 0.001). Notably, patients with borderline TMB values (10 to less than 13 mutations per megabase) exhibited relatively modest responses (20.8%). The ORR improved when hotspot mutations were excluded from the TMB calculation, suggesting that this adjustment enhances the predictive accuracy of TMB. These findings support the clinical utility of TMB as a biomarker for predicting ICI response in routine oncology practice. In particular, excluding hotspot mutations from TMB calculations may improve response prediction in patients whose TMB values are near the threshold. In this study, we investigated the predictive value of tumor mutational burden (TMB) for assessing the efficacy of pembrolizumab in a Japanese cohort. 
We analyzed real‐world data from 63,952 patients registered in the C‐CAT database who underwent comprehensive genomic profiling, and evaluated the therapeutic efficacy of pembrolizumab in 1899 of these patients. Our findings support the clinical utility of TMB as a predictive biomarker in routine oncology practice and underscore the importance of accounting for hotspot mutations, particularly in patients with TMB values near the clinical cutoff.
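The TMB cutoffs described above (TMB-high at ≥ 10 mutations per megabase, with a borderline band from 10 to below 13 mut/Mb showing more modest responses) reduce to a simple threshold rule. A minimal sketch with illustrative names, not code from the C-CAT analysis:

```python
# Threshold rule from the study above: >= 10 mut/Mb is TMB-high, and the
# 10 to <13 mut/Mb band is flagged as borderline. Labels are illustrative.

def classify_tmb(tmb_mut_per_mb):
    """Classify a tumor mutational burden value (mutations per megabase)."""
    if tmb_mut_per_mb >= 13:
        return "TMB-high"
    if tmb_mut_per_mb >= 10:
        return "TMB-high (borderline)"
    return "TMB-low"

for tmb in (4.2, 10.0, 12.9, 25.0):
    print(tmb, "->", classify_tmb(tmb))
```

The study's further adjustment, excluding hotspot mutations before computing TMB, would change the input value fed to this rule rather than the thresholds themselves.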
High‐risk drug adverse events associated with Cetirizine and Loratadine for the treatment of allergic diseases: A retrospective pharmacovigilance study based on the FDA adverse event reporting system database
Background Cetirizine and Loratadine are the two best‐selling second‐generation antihistamines for allergic diseases. This study aims to provide a comparative analysis of the differences in adverse drug events (ADEs) between these two medications, which can assist clinicians in making appropriate treatment decisions. Methods ADE reports related to Cetirizine and Loratadine obtained from the FDA Adverse Event Reporting System (FAERS) database were analyzed using disproportionality analysis and Bayesian analysis to evaluate and compare the ADE signals of both drugs. Results A total of 28,051 and 28,073 ADE reports were retrieved from the FAERS database for Cetirizine and Loratadine, respectively, with reports for both drugs predominantly involving middle‐aged females. Specifically, Loratadine was associated with respiratory symptoms, mainly nasal symptoms such as rhinorrhea (n = 326, ROR 6.75), sneezing (n = 251, ROR 15.24), and nasal congestion (n = 185, ROR 4.25), while Cetirizine did not show this association. Notably, both drugs exhibited strong signals for somnolence in the nervous and psychiatric systems, especially Cetirizine (Cetirizine, n = 2556, ROR 10.52 vs. Loratadine, n = 1200, ROR 7.76). Additionally, Cetirizine showed strong signals for attention disturbance (n = 233, ROR 3.3), while Loratadine was associated with nervousness (n = 145, ROR 3.3). Further exploration revealed more severe adverse reactions closely associated with Cetirizine, including hallucinations, aggression, and abnormal behavior. Importantly, Cetirizine was significantly associated with the occurrence of pericarditis (n = 138, ROR 8.13), potentially leading to serious adverse consequences. Conclusion Compared to Loratadine, Cetirizine poses a greater potential risk to the nervous and psychiatric systems.
Additionally, this study reveals previously underestimated potential cardiac toxicity of Cetirizine; albeit at a relatively low incidence rate, the high signal intensity warrants further attention and exploration. These findings highlight the need for enhanced patient monitoring and therapy optimization when prescribing these medications, ensuring better management of allergic diseases while minimizing risks.
Adverse drug events associated with linezolid administration: a real-world pharmacovigilance study from 2004 to 2023 using the FAERS database
Introduction: Linezolid is an oxazolidinone antibiotic that is active against drug-resistant Gram-positive bacteria and multidrug-resistant Mycobacterium tuberculosis. Real-world studies on the safety of linezolid in large populations are lacking. This study aimed to determine the adverse events associated with linezolid in real-world settings by analyzing data from the US Food and Drug Administration (FDA) Adverse Event Reporting System (FAERS). Methods: We retrospectively extracted reports on adverse drug events (ADEs) from the FAERS database from the first quarter of 2004 to the first quarter of 2023. Using disproportionality analysis, including the reporting odds ratio (ROR), proportional reporting ratio (PRR), Bayesian Confidence Propagation Neural Network (BCPNN), and multi-item gamma Poisson shrinker (MGPS), we evaluated whether there was a significant association between linezolid and each ADE. The time to onset of ADEs was further analyzed in the overall population and within age, weight, and reporting-population subgroups. Results: A total of 11,176 reports with linezolid as the "primary suspected" drug and 263 significant adverse events of linezolid were identified, including common adverse events such as thrombocytopenia (n = 1,139, ROR 21.98) and anaemia (n = 704, ROR 7.39), as well as unexpected signals not listed on the drug label, such as rhabdomyolysis (n = 90, ROR 4.33) and electrocardiogram QT prolonged (n = 73, ROR 4.07). Linezolid-induced adverse reactions involved 27 System Organ Classes (SOCs). Gender differences existed in ADE signals related to linezolid. The median onset time of all ADEs was 6 days; most ADEs (n = 3,778) occurred within the first month of linezolid use, but some continued to occur even after a year of treatment (n = 46). Conclusion: This study reports the time to onset of adverse effects in detail at the levels of SOC and specific preferred term (PT).
The results of our study provide valuable insights for optimizing the use of linezolid and reducing potential side effects, and are expected to facilitate the safe use of linezolid in clinical settings.
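The FAERS analyses above rely on disproportionality statistics computed from a 2×2 contingency table (a: target drug with the event; b: target drug, other events; c: other drugs with the event; d: other drugs, other events). A minimal sketch of the standard ROR (with 95% CI) and PRR formulas, using hypothetical counts rather than the studies' actual tables:

```python
import math

def ror(a, b, c, d):
    """Reporting odds ratio with a 95% CI for a 2x2 pharmacovigilance table.

    a: reports with the drug and the event of interest
    b: reports with the drug and any other event
    c: reports with other drugs and the event of interest
    d: reports with other drugs and any other event
    """
    est = (a / b) / (c / d)
    # Standard error of log(ROR) via the usual 1/a + 1/b + 1/c + 1/d formula.
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(est) - 1.96 * se)
    hi = math.exp(math.log(est) + 1.96 * se)
    return est, (lo, hi)

def prr(a, b, c, d):
    """Proportional reporting ratio for the same 2x2 table."""
    return (a / (a + b)) / (c / (c + d))

# Hypothetical counts for one drug-event pair (not taken from FAERS).
est, (lo, hi) = ror(1139, 10037, 50000, 9500000)
print(f"ROR {est:.2f} (95% CI {lo:.2f}-{hi:.2f}), "
      f"PRR {prr(1139, 10037, 50000, 9500000):.2f}")
```

A signal is typically flagged when the lower bound of the ROR 95% CI exceeds 1; the Bayesian methods the studies also use (BCPNN, MGPS) shrink these estimates for rare counts and are not sketched here.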
Post-marketing safety concerns with pirfenidone and nintedanib: an analysis of individual case safety reports from the FDA adverse event reporting system database and the Japanese adverse drug event report databases
To date, only two drugs, pirfenidone and nintedanib, are approved for the treatment of patients with idiopathic pulmonary fibrosis (IPF), and very few studies have reported on the safety profile of either drug in large populations. This study aims to identify and compare adverse drug events (ADEs) associated with pirfenidone and nintedanib in real-world settings by analyzing data from the US Food and Drug Administration Adverse Event Reporting System (FAERS). In addition, we utilized data from the Japanese Adverse Drug Event Report (JADER) database for external validation. ADE reports on both drugs from 2014 Q3 to 2024 Q2 in FAERS and from 2008 Q1 to 2024 Q1 in JADER were collected. After deduplication, Bayesian and non-Bayesian methods for disproportionality analysis, including the Reporting Odds Ratio (ROR), Proportional Reporting Ratio (PRR), Bayesian Confidence Propagation Neural Network (BCPNN), and Multiple Gamma Poisson Shrinkers (MGPS), were used for signal detection. Additionally, time-to-onset (TTO) analysis was performed. In total, 35,804 and 20,486 ADE reports were identified from the FAERS database for pirfenidone and nintedanib, respectively. At the system organ class (SOC) level, both drugs had positive signals for "gastrointestinal disorders," "respiratory, thoracic, and mediastinal disorders," and "metabolism and nutrition disorders." Other positive signals for pirfenidone included "general disorders and administration site conditions" and "skin and subcutaneous tissue disorders," while for nintedanib they were "investigations," "infections and infestations," and "hepatobiliary disorders." Some positive signals were consistent with the drug labels, including nausea, decreased appetite, and weight decreased for pirfenidone, as well as diarrhea, decreased appetite, abdominal pain upper, and epistaxis for nintedanib.
We also identified unexpected signals not listed on the drug labels, such as decreased gastric pH and pneumothorax for pirfenidone, and constipation and flatulence for nintedanib. The median onset time of ADEs was 146 days for pirfenidone and 45 days for nintedanib. Although the two antifibrotics differed in the distribution of ADE onset times, ADEs for both were likely to continue even after a year of treatment. In the external validation with JADER, the number of reports was 265 for pirfenidone and 1,327 for nintedanib, and the disproportionality analysis at the SOC and preferred term (PT) levels supported the FAERS results. This study systematically investigates and compares the ADEs and their onset times at the SOC and specific PT levels for pirfenidone and nintedanib. Our results provide valuable pharmacological insights into the similarities and differences between the safety profiles of the two drugs and highlight the importance of monitoring and managing the toxicity profiles associated with antifibrotic drugs.