75 results for "Bissett, Ian"
Emotional predictors of bowel screening: the avoidance-promoting role of fear, embarrassment, and disgust
Background: Despite considerable efforts to address practical barriers, colorectal cancer screening uptake is often low. People do not always act rationally, and investigating emotions may offer insight into the avoidance of screening. The current work assessed whether fear, embarrassment, and disgust predicted colorectal cancer screening avoidance.

Methods: A community sample (N = 306) aged 45+ completed a questionnaire assessing colorectal cancer screening history and the extent to which perceptions of cancer risk, colorectal cancer knowledge, doctor discussions, and a purpose-developed scale, the Emotional Barriers to Bowel Screening (EBBS), were associated with previous screening behaviours and anticipated bowel health decision-making.

Results: Stepwise logistic regression models revealed that a decision to delay seeking healthcare in the hypothetical presence of bowel symptoms was less likely in people who had discussed risk with their doctor, whereas greater colorectal cancer knowledge and greater fear of a negative outcome predicted a greater likelihood of delay. Having previously provided a faecal sample was predicted by discussions about risk with a doctor, older age, and greater embarrassment, whereas perceptions of lower risk predicted a lower likelihood. Likewise, greater insertion disgust predicted a lower likelihood of having had an invasive bowel screening test in the previous 5 years.

Conclusions: Alongside medical and demographic factors, fear, embarrassment, and disgust are worthy of consideration in colorectal cancer screening. Understanding how specific emotions affect screening decisions and behaviour is an important direction for future work and has the potential to inform screening development and communications in bowel health.
Seasonal variations in acute diverticular disease hospitalisations in New Zealand
Purpose: Seasonal variation of acute diverticular disease is variably reported in observational studies. This study aimed to describe seasonal variation of acute diverticular disease hospital admissions in New Zealand.

Methods: A time series analysis of national diverticular disease hospitalisations from 2000 to 2015 was conducted among adults aged 30 years or over. Monthly counts of acute hospitalisations with a primary diagnosis of diverticular disease were decomposed using Census X-11 time series methods. A combined test for the presence of identifiable seasonality was used to determine whether overall seasonality was present; thereafter, annual seasonal amplitude was calculated. The mean seasonal amplitude of demographic groups was compared by analysis of variance.

Results: Over the 16-year period, 35,582 hospital admissions with acute diverticular disease were included. Seasonality in monthly acute diverticular disease admissions was identified. The mean monthly seasonal component of admissions peaked in early autumn (March) and troughed in early spring (September). The mean annual seasonal amplitude was 23%, indicating that acute diverticular disease hospitalisations were on average 23% higher in early autumn (March) than in early spring (September). The results were similar in sensitivity analyses that employed different definitions of diverticular disease. Seasonal variation was less pronounced in patients aged over 80 (p = 0.002). Seasonal variation was significantly greater among Māori than Europeans (p < 0.001) and in more southern regions (p < 0.001). However, seasonal variation did not differ significantly by gender.

Conclusions: Acute diverticular disease admissions in New Zealand exhibit seasonal variation, with a peak in autumn (March) and a trough in spring (September). Significant seasonal variation is associated with ethnicity, age, and region, but not with gender.
Risk factors for the development of prolonged post-operative ileus following elective colorectal surgery
Purpose: Prolonged post-operative ileus (PPOI) increases post-operative morbidity and prolongs hospital stay. An improved understanding of the elements that contribute to the genesis of PPOI is needed to facilitate accurate risk stratification and institute effective preventive measures. The aim of this retrospective cohort study was therefore to determine the perioperative risk factors associated with the development of PPOI.

Methods: All elective intra-abdominal operations undertaken by the Colorectal Unit at Auckland District Health Board from 1 January to 31 December 2011 were reviewed. Data were extracted for an assortment of patient characteristics and perioperative variables. Cases were stratified by the occurrence of clinician-diagnosed PPOI. Univariate and regression analyses were performed to identify correlates and independent risk factors, respectively.

Results: Two hundred and fifty-five patients were identified, of whom 50 (19.6%) developed PPOI. The median duration of PPOI was 4 days, with 98% resolving spontaneously with conservative measures. Univariate analysis identified increasing age; procedure type; increasing opiate consumption; elevated preoperative creatinine; post-operative haemoglobin drop, highest white cell count and lowest sodium; and increasing complication grade as significant correlates. Logistic regression found increasing age (OR 1.032, 95% CI 1.004–1.061; p = 0.026) and an increasing drop in pre- to post-operative haemoglobin (OR 1.043, 95% CI 1.002–1.085; p = 0.037) to be the only independent predictors of developing PPOI. An important limitation of this study was its retrospective nature.

Conclusions: Increasing age and an increasing drop in haemoglobin are independent predictors of developing PPOI. Prospective assessment is required to facilitate more accurate risk factor analysis.
Clinical tumour size and nodal status predict pathologic complete response following neoadjuvant chemoradiotherapy for rectal cancer
Purpose: Pathologic complete response (pCR) to neoadjuvant treatment for rectal cancer has been associated with improved local control, reduced distant disease, and a survival advantage when compared with non-complete responders. Approximately 10–25% of patients undergoing neoadjuvant chemoradiotherapy for rectal cancer achieve pCR; however, predictors of its occurrence are inadequately defined. This study aimed to identify clinical and tumour factors that predict pCR in patients receiving neoadjuvant chemoradiotherapy for rectal cancer.

Methods: Consecutive rectal cancer patients diagnosed and treated in the Auckland region between 1 January 2002 and 1 February 2013 were retrospectively identified. Cases were stratified by the occurrence of pCR or non-pCR. The predictive capacity of several patient, tumour and treatment-related variables was then assessed by univariate and regression analyses.

Results: Two hundred ninety-seven patients received neoadjuvant chemoradiotherapy, of whom 34 (11.4%) achieved pCR. There were no significant differences in age, gender, ethnicity, BMI, pretreatment clinical T or N stage, tumour distance from the anal verge, tumour differentiation, chemoradiotherapy regimen or time interval to surgery between the pCR and non-pCR groups. Univariate analysis identified pretreatment serum CEA levels, a reduction in pre- to post-treatment serum CEA, and smaller tumours as significant correlates of pCR. Logistic regression analysis found smaller tumour size and pretreatment clinical N stage to be independent clinical predictors of achieving pCR.

Conclusions: Smaller tumour size and pretreatment clinical N stage were independent clinical predictors of achieving pCR. Prospective analysis is recommended for more rigorous risk factor assessment.
Diverticular disease management in primary care: How do estimates from community-dispensed antibiotics inform provision of care?
The literature regarding diverticular disease of the intestines (DDI) almost entirely concerns hospital-based care; DDI managed in primary care settings is rarely addressed. This study aimed to estimate how often DDI is managed in primary care, using antibiotic dispensing data. Hospitalisation records of New Zealand residents aged 30+ years during 2007-2016 were individually linked to databases of community-dispensed oral antibiotics. Patients with an index hospital admission in 2007-2016 that included a DDI diagnosis (ICD-10-AM = K57) were grouped by acute/non-acute hospitalisation. We compared their use of guideline-recommended oral antibiotics over 2007-2016 with that of ten individually matched non-DDI residents per case, taking the case's index date. Multivariable negative binomial models were used to estimate rates of antibiotic use. From almost 3.5 million eligible residents, data were extracted for 51,059 index cases (20,880 acute, 30,179 non-acute) and 510,581 matched controls; mean follow-up was 8.9 years. Dispensing rates rose gradually over time among controls, from 47 per 100 person-years (/100py) before the index date to 60/100py after 3 months. In comparison, dispensing was significantly higher for those with DDI: for those with acute DDI, rates were 84/100py before the index date, 325/100py near the index date, and 141/100py after 3 months, while for those with non-acute DDI the corresponding rates were 75/100py, 108/100py and 99/100py. Following an acute DDI admission, community antibiotics were dispensed at more than twice the rate of non-DDI counterparts for years afterwards, and rates were elevated even before the index DDI hospitalisation. DDI patients experience high use of antibiotics. Evidence is needed that covers primary care and informs self-management of recurrent, chronic or persistent DDI.
Letter to the editor regarding “Incidence, severity and detection of blood pressure and heart rate perturbations in postoperative ward patients after noncardiac surgery”
For a complete evaluation of CVSM using the Visi Mobile device, it would be of value to analyse these metrics in a similar manner to the present haemodynamic analysis. [...]it is important to emphasise that although CVSM has the potential to improve the ‘capacity to detect’ postoperative complications and the ‘capacity to respond’ to deterioration, Khanna et al. were unable to quantify how often abnormalities detected by CVSM were confirmed and resulted in clinical review of the patient. CVSM has significant potential to improve surgical safety, but needs rigorous evaluation prior to widespread use, particularly given potential unintended consequences including measurement imprecision, increased workload, and alarm fatigue.
Continuous wireless postoperative monitoring using wearable devices: further device innovation is needed
Other clinical validation studies of wearable monitoring devices have also demonstrated limited device precision in the postoperative setting [3]. [...]any assessment of the ability of devices to detect early deterioration is heavily confounded by inherent limitations of the devices. Furthermore, although the accuracy and reliability of wearable devices are a barrier, most studies have been conducted in healthy volunteers and in comparison with a clinical and/or gold standard for continuous measurement [4], with even less evidence comparing these devices with standard intermittent manual measurements, the common practice in general wards. [...]well-powered and well-designed clinical trials, including careful implementation of these new systems, might still be of benefit to support the development and innovation of these technologies, by testing their impact in clinical care as complementary to, and not a substitute for, standard practice [6].
Validation of body surface colonic mapping (BSCM) against high resolution colonic manometry for evaluation of colonic motility
Abnormal cyclic motor pattern (CMP) activity is implicated in colonic dysfunction, but the only tool to evaluate CMP activity, high-resolution colonic manometry (HRCM), remains expensive and not widely accessible. This study aimed to validate body surface colonic mapping (BSCM) through direct correlation with HRCM. Synchronous meal-test recordings were performed in asymptomatic participants with intact colons. A signal processing method for BSCM was developed to detect CMPs. Quantitative temporal analysis was performed comparing the meal responses and motility indices (MI). Spatial heat maps were also compared. Post-study questionnaires evaluated participants’ preference and the comfort/distress experienced in either test. Eleven participants were recruited, and 7 had successful synchronous recordings (5 females/2 males; median age 50 years [range 38–63]). The best-correlating MI temporal analyses achieved a high degree of agreement (median Pearson correlation coefficient (Rp): 0.69; range 0.47–0.77). HRCM and BSCM meal response start and end times (Rp = 0.998 and 0.83; both p < 0.05) and durations (Rp = 0.85; p = 0.03) were similar. Heat maps demonstrated good spatial agreement. BSCM is the first non-invasive method to be validated by demonstrating a direct spatio-temporal correlation to manometry in the evaluation of colonic motility.
Anterior Resection Syndrome—A Risk Factor Analysis
Background: Evacuatory dysfunction after distal colorectal resection varies from incontinence to obstructed defaecation and is termed anterior resection syndrome. The aim of this study was to identify risk factors for the development of anterior resection syndrome.

Methods: All anterior resections undertaken at Auckland Hospital from 2002 to 2012 were retrospectively evaluated. An assortment of patient and peri-operative variables were recorded. Cases were stratified by the occurrence of anterior resection syndrome symptoms from 1 to 5 years post-operatively.

Results: A total of 277 patients were identified. Prevalence of anterior resection syndrome decreased progressively from 61% at 1 year to 43% at 5 years. Univariate analysis identified anastomotic height, surgeon, pT stage, procedure year and temporary diversion ileostomy as recurring significant correlates (p < 0.05). Logistic regression identified lower anastomotic height (odds ratio (OR) 2.11, 95% confidence interval (CI) 1.05–4.27; p = 0.04) and obstructive presenting symptoms (OR 6.71, 95% CI 1.00–44.80; p = 0.05) as independent predictors at 1 and 2 years, respectively. Post-operative chemotherapy was a predictor at 1 year (OR 1.93, 95% CI 1.04–3.57; p = 0.03). Temporary diverting ileostomy was an independent predictor at 2 (OR 2.49, 95% CI 1.04–5.95; p = 0.04), 3 (OR 4.17, 95% CI 1.04–16.78; p = 0.04), 4 (OR 8.05, 95% CI 1.21–53.6; p = 0.03), and 5 years (OR 49.60, 95% CI 2.17–1134.71; p = 0.02) after adjusting for anastomotic height.

Conclusions: Anastomotic height, post-operative chemotherapy and obstructive presenting symptoms were independent predictors at 1 and 2 years. Temporary diversion ileostomy was an independent predictor for the occurrence of anterior resection syndrome at 2, 3, 4 and 5 years even after correcting for anastomotic height. Prospective assessment is required to facilitate more accurate risk factor analysis.