Catalogue Search | MBRL
Explore the vast range of titles available.
24 result(s) for "Petrie, Michael F"
Declining Risk of Sudden Death in Heart Failure
2017
Data were analyzed from 40,195 patients with heart failure with reduced ejection fraction enrolled in 12 clinical trials in the 1995–2014 period. Sudden-death rates declined substantially over time, a finding consistent with a cumulative effect of evidence-based medical therapy.
Journal Article
Semaglutide in Patients with Obesity-Related Heart Failure and Type 2 Diabetes
by Møller, Daniél V.; Perna, Eduardo; Melenovský, Vojtěch
in administration & dosage; adverse effects; Analysis of covariance
2024
Among patients with obesity-related heart failure with preserved ejection fraction and type 2 diabetes, semaglutide produced greater reductions in symptoms, physical limitations, and body weight than placebo at 1 year.
Journal Article
Cost-effectiveness of transcatheter edge-to-edge repair in secondary mitral regurgitation
by Petrie, Mark C; Lindenfeld, Joann; Cohen, David J
in Cardiac Catheterization - methods; Clinical Trials as Topic; Cost analysis
2022
Background: Transcatheter edge-to-edge mitral valve repair (TMVr) improves symptoms and survival for patients with heart failure with reduced left ventricular ejection fraction (HFrEF) and severe secondary mitral regurgitation despite guideline-recommended medical therapy (GRMT). Whether TMVr is cost-effective from a UK National Health Service (NHS) perspective is unknown.
Methods: We used patient-level data from the Cardiovascular Outcomes Assessment of the MitraClip Percutaneous Therapy for Heart Failure Patients with Functional Mitral Regurgitation (COAPT) trial to perform a cost-effectiveness analysis of TMVr + GRMT versus GRMT alone from an NHS perspective. Costs for the TMVr procedure were based on standard English tariffs and device costs. Subsequent costs were estimated based on data acquired during the trial. Health utilities were estimated using the Short-Form 6-Dimension Health Survey.
Results: Costs for the index procedural hospitalisation were £18 781, of which £16 218 were for the TMVr device. Over 2-year follow-up, TMVr reduced subsequent costs compared with GRMT (£10 944 vs £14 932, p=0.006), driven mainly by reductions in heart failure hospitalisations; nonetheless, total 2-year costs remained higher with TMVr (£29 165 vs £14 932, p<0.001). When survival, health utilities and costs were projected over a lifetime, TMVr was projected to increase life expectancy by 1.57 years and quality-adjusted life expectancy by 1.12 quality-adjusted life-years (QALYs) at an incremental cost of £21 980, resulting in an incremental cost-effectiveness ratio (ICER) of £23 270 per QALY gained (after discounting). If the benefits of TMVr observed in the first 2 years were maintained without attenuation, the ICER improved to £12 494 per QALY.
Conclusions: For patients with HFrEF and severe secondary mitral regurgitation similar to those enrolled in COAPT, TMVr increases life expectancy and quality-adjusted life expectancy compared with GRMT at an ICER that represents good value from an NHS perspective.
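The headline figure in this abstract is an ICER: extra lifetime cost divided by extra QALYs. A minimal Python sketch of that arithmetic, using the abstract's undiscounted figures; note the published £23 270/QALY additionally discounts future costs and QALYs (the NICE reference rate is 3.5% per year), so the raw ratio differs.

```python
def icer(delta_cost: float, delta_qaly: float) -> float:
    """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
    return delta_cost / delta_qaly

def present_value(yearly_amounts, rate=0.035):
    """Discount a stream of yearly amounts back to year 0."""
    return sum(x / (1 + rate) ** t for t, x in enumerate(yearly_amounts))

# Undiscounted figures from the abstract: 21,980 GBP extra cost, 1.12 QALYs gained.
print(round(icer(21_980, 1.12)))  # 19625 GBP/QALY before discounting
```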
Journal Article
Plasma proteomic signatures of a direct measure of insulin sensitivity in two population cohorts
2023
Aims/hypothesis: The euglycaemic–hyperinsulinaemic clamp (EIC) is the reference standard for the measurement of whole-body insulin sensitivity but is laborious and expensive to perform. We aimed to assess the incremental value of high-throughput plasma proteomic profiling in developing signatures correlating with the M value derived from the EIC.
Methods: We measured 828 proteins in the fasting plasma of 966 participants from the Relationship between Insulin Sensitivity and Cardiovascular disease (RISC) study and 745 participants from the Uppsala Longitudinal Study of Adult Men (ULSAM) using a high-throughput proximity extension assay. We used the least absolute shrinkage and selection operator (LASSO) approach, with clinical variables and protein measures as features. Models were tested within and across cohorts. Our primary model performance metric was the proportion of the M value variance explained (R²).
Results: A standard LASSO model incorporating 53 proteins in addition to routinely available clinical variables increased the M value R² from 0.237 (95% CI 0.178, 0.303) to 0.456 (0.372, 0.536) in RISC. A similar pattern was observed in ULSAM, in which the M value R² increased from 0.443 (0.360, 0.530) to 0.632 (0.569, 0.698) with the addition of 61 proteins. Models trained in one cohort and tested in the other also demonstrated significant improvements in R² despite differences in baseline cohort characteristics and clamp methodology (RISC to ULSAM: 0.491 [0.433, 0.539] for 51 proteins; ULSAM to RISC: 0.369 [0.331, 0.416] for 67 proteins). A randomised LASSO and stability selection algorithm selected only two proteins per cohort (three unique proteins), which improved R² but to a lesser degree than in standard LASSO models: 0.352 (0.266, 0.439) in RISC and 0.495 (0.404, 0.585) in ULSAM. Reductions in improvements of R² with randomised LASSO and stability selection were less marked in cross-cohort analyses (RISC to ULSAM R² 0.444 [0.391, 0.497]; ULSAM to RISC R² 0.348 [0.300, 0.396]). Models of proteins alone were as effective as models that included both clinical variables and proteins using either standard or randomised LASSO. The single most consistently selected protein across all analyses and models was IGF-binding protein 2.
Conclusions/interpretation: A plasma proteomic signature identified using a standard LASSO approach improves the cross-sectional estimation of the M value over routine clinical variables. However, a small subset of these proteins identified using a stability selection algorithm affords much of this improvement, especially when considering cross-cohort analyses. Our approach provides opportunities to improve the identification of insulin-resistant individuals at risk of insulin resistance-related adverse health consequences.
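The modelling approach in this abstract — LASSO regression of the clamp-derived M value on clinical variables plus proteins, scored by held-out R² — can be sketched with scikit-learn. All data, sizes, and effect sizes below are synthetic stand-ins, not the RISC/ULSAM values.

```python
import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n, n_clin, n_prot = 900, 6, 828               # participants, clinical vars, proteins
X = rng.normal(size=(n, n_clin + n_prot))
beta = np.zeros(n_clin + n_prot)
beta[:n_clin] = 0.5                           # clinical variables all informative
beta[n_clin:n_clin + 50] = 0.3                # only 50 of 828 proteins carry signal
y = X @ beta + rng.normal(scale=2.0, size=n)  # synthetic "M value"

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = LassoCV(cv=5, random_state=0).fit(X_tr, y_tr)

r2 = r2_score(y_te, model.predict(X_te))
n_proteins_kept = int((model.coef_[n_clin:] != 0).sum())
print(f"held-out R^2 = {r2:.2f}, proteins retained = {n_proteins_kept}")
```

The cross-validated penalty is what keeps the protein set sparse; stability selection (used in the paper for the two-protein models) would rerun this over resamples and keep only consistently selected features.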
Journal Article
Rationale and design of a randomised trial of intravenous iron in patients with heart failure
2022
Objectives: For patients with heart failure with reduced ejection fraction (HFrEF) and iron deficiency, administration of intravenous iron improves symptoms and exercise capacity and may, over the following 12 months, reduce hospitalisations for heart failure. The Effectiveness of Intravenous iron treatment versus standard care in patients with heart failure and iron deficiency (IRONMAN) trial evaluated whether the benefits of intravenous iron persist in the longer term and affect morbidity and mortality.
Methods: IRONMAN is a prospective, randomised, open-label, blinded-endpoint (PROBE), event-driven trial. Patients aged ≥18 years with HFrEF (left ventricular ejection fraction ≤45%) and evidence of iron deficiency (ferritin <100 µg/L and/or transferrin saturation (TSAT) <20%) were enrolled if they had either a current or recent hospitalisation for heart failure or elevated plasma concentrations of a natriuretic peptide. Participants were randomised to receive, or not to receive, intravenous ferric derisomaltose in addition to guideline-recommended therapy for HFrEF. Every 4 months, intravenous iron was administered if either ferritin was <100 µg/L or, provided ferritin was ≤400 µg/L, TSAT was <25%. The primary endpoint is a composite of total hospitalisations for heart failure and cardiovascular death. Hospitalisations and deaths due to infection are safety endpoints.
Results: Trial recruitment was completed across 70 UK hospital sites in October 2021. Participants were followed until the end of March 2022. We plan to report the results by November 2022.
Conclusions: IRONMAN will determine whether repeated doses of intravenous ferric derisomaltose are beneficial and safe for the long-term treatment of a broad range of patients with HFrEF and iron deficiency.
Trial registration number: NCT02642562.
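The 4-monthly redosing rule quoted in this abstract reduces to a simple predicate on ferritin and transferrin saturation (TSAT). A sketch; the function name and units convention are illustrative, not taken from the trial protocol.

```python
def redose_iv_iron(ferritin_ug_l: float, tsat_pct: float) -> bool:
    """Redose if ferritin < 100 ug/L, or if TSAT < 25% while
    ferritin is still <= 400 ug/L (the upper safety cap)."""
    if ferritin_ug_l < 100:
        return True
    return ferritin_ug_l <= 400 and tsat_pct < 25

print(redose_iv_iron(80, 30))    # True  (ferritin below 100)
print(redose_iv_iron(250, 20))   # True  (TSAT low, ferritin within cap)
print(redose_iv_iron(450, 20))   # False (ferritin above the 400 cap)
```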
Journal Article
Influence of resource selection on nonbreeding season mortality of mallards
by Palumbo, Matthew D.; Rubin, Benjamin D.; Benson, John F.
in adults; Agriculture; Anas platyrhynchos
2022
Relationships between individual resource selection strategies and fitness are difficult to quantify at large spatial scales. These links are important for understanding the potential effects of management on population-level processes. We modeled the degree to which selection of specific landscape features altered mortality risk of female mallards (Anas platyrhynchos) during the non-breeding season. We used individual resource selection estimates from adult female mallards equipped with Global Positioning System (GPS) backpack transmitters (n = 56) in the Lake St. Clair region of southwestern Ontario, Canada, in August of 2014 and 2015. We determined the fate of individuals between August and January and used time-to-event analyses to model survival over 158 days. Furthermore, we investigated how diurnal and nocturnal resource selection and year were related to mortality risk. The survival rate for the adult female mallards was 0.57 (95% CI = 0.42–0.77). Resource types were combinations of land class types (e.g., water, marsh, flooded agriculture, supplemental feeding areas, and dry agriculture) important to mallards and varying levels of risk from anthropogenic disturbance ranging from inviolate refuges to publicly accessed areas where we predicted mortality risk to be greatest. Our results suggest that water that the public can access (i.e., public water) influenced mortality risk during multiple seasons. Specifically, selection of public water by female mallards reduced mortality risk diurnally during the non-hunting period (hazard ratio = 0.68, 95% CI = 0.48–0.96) but increased mortality risk during the first half of the hunting period (hazard ratio = 1.54, 95% CI = 1.08–2.20). Our research highlights that individual selection strategies by ducks within this landscape can influence mortality risk.
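The seasonal survival estimate above (0.57 over 158 days) comes from a time-to-event analysis; a minimal hand-rolled product-limit (Kaplan–Meier) sketch follows. The times and fates are invented for illustration, and tied death times are processed one at a time rather than jointly.

```python
def kaplan_meier(times, events):
    """times: day of death or censoring; events: 1 = death, 0 = censored.
    Returns the survival probability after the last observation."""
    at_risk = len(times)
    surv = 1.0
    for _, death in sorted(zip(times, events)):
        if death:
            surv *= 1 - 1 / at_risk   # survival drops at each death
        at_risk -= 1                  # the bird leaves the risk set either way
    return surv

times  = [30, 45, 60, 90, 120, 158, 158, 158]
events = [1,  0,  1,  1,  0,   0,   0,   0]
print(round(kaplan_meier(times, events), 3))  # 0.583
```

The hazard ratios reported above would come from a Cox proportional-hazards model layered on the same risk-set bookkeeping, with resource-selection estimates as covariates.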
Journal Article
Allopurinol versus usual care in UK patients with ischaemic heart disease (ALL-HEART): a multicentre, prospective, randomised, open-label, blinded-endpoint trial
by Doshi, Sagar; Shepherd, Bridget; MacDonald, Thomas M
in Aged; Allopurinol; Allopurinol - therapeutic use
2022
Allopurinol is a urate-lowering therapy used to treat patients with gout. Previous studies have shown that allopurinol has positive effects on several cardiovascular parameters. The ALL-HEART study aimed to determine whether allopurinol therapy improves major cardiovascular outcomes in patients with ischaemic heart disease.
ALL-HEART was a multicentre, prospective, randomised, open-label, blinded-endpoint trial done in 18 regional centres in England and Scotland, with patients recruited from 424 primary care practices. Eligible patients were aged 60 years or older, with ischaemic heart disease but no history of gout. Participants were randomly assigned (1:1), using a central web-based randomisation system accessed via a web-based application or an interactive voice response system, to receive oral allopurinol up-titrated to a dose of 600 mg daily (300 mg daily in participants with moderate renal impairment at baseline) or to continue usual care. The primary outcome was the composite cardiovascular endpoint of non-fatal myocardial infarction, non-fatal stroke, or cardiovascular death. The hazard ratio (allopurinol vs usual care) in a Cox proportional hazards model was assessed for superiority in a modified intention-to-treat analysis (excluding randomly assigned patients later found to have met one of the exclusion criteria). The safety analysis population included all patients in the modified intention-to-treat usual care group and those who took at least one dose of randomised medication in the allopurinol group. This study is registered with the EU Clinical Trials Register, EudraCT 2013-003559-39, and ISRCTN, ISRCTN32017426.
Between Feb 7, 2014, and Oct 2, 2017, 5937 participants were enrolled and then randomly assigned to receive allopurinol or usual care. After exclusion of 216 patients after randomisation, 5721 participants (mean age 72·0 years [SD 6·8], 4321 [75·5%] males, and 5676 [99·2%] white) were included in the modified intention-to-treat population, with 2853 in the allopurinol group and 2868 in the usual care group. Mean follow-up time was 4·8 years (SD 1·5). There was no evidence of a difference between the randomised treatment groups in the rate of the primary endpoint: 314 (11·0%) participants in the allopurinol group (2·47 events per 100 patient-years) and 325 (11·3%) in the usual care group (2·37 events per 100 patient-years) had a primary endpoint (hazard ratio [HR] 1·04 [95% CI 0·89–1·21], p=0·65). 288 (10·1%) participants in the allopurinol group and 303 (10·6%) participants in the usual care group died from any cause (HR 1·02 [95% CI 0·87–1·20], p=0·77).
In this large, randomised clinical trial in patients aged 60 years or older with ischaemic heart disease but no history of gout, there was no difference in the primary outcome of non-fatal myocardial infarction, non-fatal stroke, or cardiovascular death between participants randomised to allopurinol therapy and those randomised to usual care.
UK National Institute for Health and Care Research.
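The primary-endpoint rates in the results above are expressed per 100 patient-years: events divided by accumulated person-time. Person-time must be summed per participant, so n × mean follow-up only approximates it; a minimal sketch of the arithmetic:

```python
def rate_per_100py(events: int, person_years: float) -> float:
    """Incidence rate per 100 patient-years of follow-up."""
    return 100 * events / person_years

# The quoted allopurinol-arm rate (2.47 per 100 patient-years for 314 events)
# implies roughly 314 / 2.47 * 100 patient-years of accumulated follow-up.
print(round(314 / 2.47 * 100))  # 12713
```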
Journal Article
Use of Influenza Antiviral Agents by Ambulatory Care Clinicians During the 2012–2013 Influenza Season
2014
Background. Early antiviral treatment (≤2 days since illness onset) of influenza reduces the probability of influenza-associated complications. Early empiric antiviral treatment is recommended for those with suspected influenza at higher risk for influenza complications, regardless of illness severity. We describe antiviral receipt among outpatients with acute respiratory illness (ARI) and antibiotic receipt among patients with influenza.
Methods. We analyzed data from 5 sites in the US Influenza Vaccine Effectiveness Network Study during the 2012–2013 influenza season. Subjects were outpatients aged ≥6 months with ARI defined by cough of ≤7 days' duration; all were tested for influenza by polymerase chain reaction (PCR). Medical history and prescription information were collected from medical and pharmacy records. Four sites collected prescribing data on 3 common antibiotics (amoxicillin-clavulanate, amoxicillin, and azithromycin).
Results. Of 6766 enrolled ARI patients, 509 (7.5%) received an antiviral prescription. Overall, 2366 (35%) had PCR-confirmed influenza; 355 (15%) of those received an antiviral prescription. Among 1021 ARI patients at high risk for influenza complications (eg, aged <2 years or ≥65 years, or with ≥1 chronic medical condition) presenting to care ≤2 days from symptom onset, 195 (19%) were prescribed an antiviral medication. Among participants with PCR-confirmed influenza and antibiotic data, 540 of 1825 (30%) were prescribed 1 of the 3 antibiotics; 297 of 1825 (16%) were prescribed antiviral medications.
Conclusions. Antiviral treatment was prescribed infrequently among outpatients with influenza for whom therapy would be most beneficial; in contrast, antibiotic prescribing was more frequent. Continued efforts to educate clinicians on appropriate antibiotic and antiviral use are essential to improve healthcare quality.
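The high-risk group described in this abstract reduces to a small predicate. A sketch; the abstract's "eg" suggests the study definition may include further criteria, and the function name is illustrative.

```python
def high_risk_for_flu_complications(age_years: float, n_chronic_conditions: int) -> bool:
    """High risk per the description above: aged <2 or >=65 years,
    or having >=1 chronic medical condition."""
    return age_years < 2 or age_years >= 65 or n_chronic_conditions >= 1

print(high_risk_for_flu_complications(1.5, 0))   # True  (aged <2)
print(high_risk_for_flu_complications(70, 0))    # True  (aged >=65)
print(high_risk_for_flu_complications(30, 0))    # False
```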
Journal Article
Comparison of Arthrofibrosis After ACL Reconstruction According to Graft Choice: Quadriceps Tendon Versus Bone-Patellar Tendon-Bone Autograft
by Petrie, Russell S.; Quilligan, Edward J.; Deshpande, Viraj A.
in Bones; Cohort analysis; Ligaments
2025
Background:
Arthrofibrosis is a complication of anterior cruciate ligament reconstruction (ACLR), and it is possible that graft choice such as the quadriceps tendon (QT) autograft may be a risk factor. With the increasing popularity of the QT autograft, it is important to compare it with other graft choices.
Purpose/Hypothesis:
The purpose of this study was to identify whether graft choice, QT versus bone–patellar tendon–bone (BTB) autograft, is a risk factor for early return to the operating room for arthrofibrosis after ACLR. It was hypothesized that the rate of arthrofibrosis surgery would be higher for the QT autograft recipients.
Study Design:
Cohort study; Level of evidence, 3.
Methods:
A single-center retrospective chart review was conducted between January 2010 and November 2022. Skeletally mature patients who underwent primary ACLR with either QT or BTB autograft were considered for inclusion. Patients who received an alternate graft or those undergoing revision ACLR were excluded. The primary outcome of interest was return to the operating room for arthrofibrosis release (either manipulation under anesthesia or lysis of adhesions).
Results:
Of 1726 included patients (1155 receiving a BTB autograft and 571 receiving a QT autograft), 5.2% (n = 60) of BTB recipients and 6.5% (n = 37) of QT recipients required subsequent arthrofibrosis release. There was no significant association between graft type and subsequent arthrofibrosis release (P = .275). There was a significant association between graft type and presence of a cyclops lesion (65.0% of BTB grafts vs 40.5% of QT grafts; P = .018). After removing patients with chronic tears who underwent ACLR at >1 year after injury, patients who required arthrofibrosis release were found to have a significantly shorter time between injury and ACLR (mean, 59.23 ± 48.46 days) than those who did not (mean, 81.7 ± 72.63 days) (P ≤ .01). Significantly more female patients (9.25%) than male patients (2.79%) required arthrofibrosis release (hazard ratio, 3.82; P < .001), and patients who required release were significantly younger (mean, 22.52 ± 9.35 years) than those who did not (mean, 25.74 ± 10.83 years) (P = .001).
Conclusion:
Study findings indicated no statistically significant difference in the rate of secondary arthrofibrosis surgery between patients who underwent ACLR with either QT or BTB autograft.
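As a check on the Results above, the graft-type comparison (arthrofibrosis release in 60 of 1155 BTB vs 37 of 571 QT recipients) can be run as a chi-squared test of independence on the 2×2 table. A sketch assuming SciPy is available; without a continuity correction this reproduces the reported P = .275.

```python
from scipy.stats import chi2_contingency

table = [[60, 1155 - 60],   # BTB: release vs no release
         [37, 571 - 37]]    # QT
chi2, p, dof, _ = chi2_contingency(table, correction=False)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")
```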
Journal Article