Catalogue Search | MBRL
551 result(s) for "Wester, P"
South Asian agriculture increasingly dependent on meltwater and groundwater
2022
Irrigated agriculture in South Asia depends on meltwater, monsoon rains and groundwater. Climate change alters the hydrology and causes shifts in the timing, composition and magnitude of these sources of water supply. Simultaneously, socio-economic growth increases water demand. Here we use a high-resolution cryosphere–hydrology–crop model forced with an ensemble of climate and socio-economic projections to assess how the sources of irrigation water supply may shift during the twenty-first century. We find increases in the importance of meltwater and groundwater for irrigated agriculture. An earlier melt peak increases meltwater withdrawal at the onset of the cropping season in May and June in the Indus, whereas increasing peak irrigation water demand during July and August aggravates non-renewable groundwater pumping in the Indus and Ganges despite runoff increases. Increasing inter-annual variability in rainfall runoff increases the need for meltwater and groundwater to complement rainfall runoff during future dry years.
South Asian agriculture depends on water from rains, meltwater and groundwater, but climate change impacts the timing of these water sources’ availability. Projections indicate that meltwater and groundwater will become more important and will need to offset reduced rainfall during drier years.
Journal Article
Assessment of permafrost distribution maps in the Hindu Kush Himalayan region using rock glaciers mapped in Google Earth
2015
The extent and distribution of permafrost in the mountainous parts of the Hindu Kush Himalayan (HKH) region are largely unknown. A long tradition of permafrost research, predominantly on rather gentle relief, exists only on the Tibetan Plateau. Two permafrost maps are available digitally that cover the HKH and provide estimates of permafrost extent, i.e., the areal proportion of permafrost: the manually delineated Circum-Arctic Map of Permafrost and Ground Ice Conditions (Brown et al., 1998) and the Global Permafrost Zonation Index, based on a computer model (Gruber, 2012). This article provides a first-order assessment of these permafrost maps in the HKH region based on the mapping of rock glaciers. Rock glaciers were used as a proxy, because they are visual indicators of permafrost, can occur near the lowermost regional occurrence of permafrost in mountains, and can be delineated based on high-resolution remote sensing imagery freely available on Google Earth. For the mapping, 4000 square samples (~ 30 km2) were randomly distributed over the HKH region. Every sample was investigated and rock glaciers were mapped by two independent researchers following precise mapping instructions. Samples with insufficient image quality were recorded but not mapped. We use the mapping of rock glaciers in Google Earth as first-order evidence for permafrost in mountain areas with severely limited ground truth. The minimum elevation of rock glaciers varies between 3500 and 5500 m a.s.l. within the region. The Circum-Arctic Map of Permafrost and Ground Ice Conditions does not reproduce mapped conditions in the HKH region adequately, whereas the Global Permafrost Zonation Index does so with more success. Based on this study, the Permafrost Zonation Index is inferred to be a reasonable first-order prediction of permafrost in the HKH. In the central part of the region a considerable deviation exists that needs further investigations.
Journal Article
Randomised trial of old and new antihypertensive drugs in elderly patients: cardiovascular mortality and morbidity the Swedish Trial in Old Patients with Hypertension-2 study
by Wester, P-O, Scherstén, Bengt, Hedner, Thomas
in Aged; Aged, 80 and over; Antihypertensive agents
1999
The efficacy of new antihypertensive drugs has been questioned. We compared the effects of conventional and newer antihypertensive drugs on cardiovascular mortality and morbidity in elderly patients.
We did a prospective, randomised trial in 6614 patients aged 70–84 years with hypertension (blood pressure ≥180 mm Hg systolic, ≥105 mm Hg diastolic, or both). Patients were randomly assigned conventional antihypertensive drugs (atenolol 50 mg, metoprolol 100 mg, pindolol 5 mg, or hydrochlorothiazide 25 mg plus amiloride 2·5 mg daily) or newer drugs (enalapril 10 mg or lisinopril 10 mg, or felodipine 2·5 mg or isradipine 2·5 mg daily). We assessed fatal stroke, fatal myocardial infarction, and other fatal cardiovascular disease. Analysis was by intention to treat.
Blood pressure was decreased similarly in all treatment groups. The primary combined endpoint of fatal stroke, fatal myocardial infarction, and other fatal cardiovascular disease occurred in 221 of 2213 patients in the conventional drugs group (19·8 events per 1000 patient-years) and in 438 of 4401 in the newer drugs group (19·8 per 1000; relative risk 0·99 [95% CI 0·84–1·16], p=0·89). The combined endpoint of fatal and non-fatal stroke, fatal and non-fatal myocardial infarction, and other cardiovascular mortality occurred in 460 patients taking conventional drugs and in 887 taking newer drugs (0·96 [0·86–1·08], p=0·49).
Old and new antihypertensive drugs were similar in prevention of cardiovascular mortality or major events. Decrease in blood pressure was of major importance for the prevention of cardiovascular events.
Journal Article
The ecological effects of selective decontamination of the digestive tract (SDD) on antimicrobial resistance: a 21-year longitudinal single-centre study
by van der Meer, Nardo J. M., van der Voort, Peter H. J., Wester, Jos P. J.
in Adult; Aged; Aged, 80 and over
2019
Background
The long-term ecological effects on the emergence of antimicrobial resistance at the ICU level during selective decontamination of the digestive tract (SDD) are unknown. We determined the incidence of newly acquired antimicrobial resistance of aerobic gram-negative potentially pathogenic bacteria (AGNB) during SDD.
Methods
In a single-centre observational cohort study over a 21-year period, all consecutive patients admitted to the ICU, treated with or without SDD, were included. The antibiotic regime was unchanged over the study period. Incidence rates of ICU-acquired AGNB resistance to third-generation cephalosporins, colistin/polymyxin B, tobramycin/gentamicin or ciprofloxacin were calculated per year. Changes over time were tested by negative binomial regression in a generalized linear model.
Results
Eighty-six percent of 14,015 patients were treated with SDD. Most cultures were taken from the digestive tract (41.9%) and sputum (21.1%). A total of 20,593 isolates of AGNB were identified. The two most often found bacteria were Escherichia coli (N = 6409) and Pseudomonas (N = 5269). The incidence rate per 1000 patient-days for ICU-acquired resistance to cephalosporins was 2.03, for polymyxin B/colistin 0.51, for tobramycin 2.59 and for ciprofloxacin 2.2. The incidence rates for ICU-acquired resistant microbes per year ranged from 0 to 4.94 per 1000 patient-days, and no significant time trend in incidence rates was found for any of the antimicrobials. The background prevalence rates of resistant strains measured on admission for cephalosporins, polymyxin B/colistin and ciprofloxacin rose over time by 7.9%, 3.5% and 8.0%, respectively.
Conclusions
During more than 21 years of SDD, the incidence rates of resistant microbes at the ICU level did not significantly increase over time, but the background resistance rates increased. An overall ecological effect of prolonged application of SDD, as measured by counting resistant microorganisms in the ICU, was not shown in a country with relatively low rates of resistant microorganisms.
Journal Article
Oral neuromuscular training in patients with dysphagia after stroke: a prospective, randomized, open-label study with blinded evaluators
2020
Background
Oral and pharyngeal swallowing dysfunction is a common complication in acute stroke patients. The primary aim of this study was to determine whether oral neuromuscular training improves swallowing function in participants with swallowing dysfunction after stroke. A secondary aim was to assess how well results of the timed water-swallow test (TWST) correspond with swallowing dysfunction diagnosed by videofluoroscopy (VFS).
Methods
This was an intention-to-treat two-centre prospective randomized open-label study with blinded-evaluators (PROBE) design. At 4 weeks after stroke onset, participants with swallowing dysfunction were randomized to 5 weeks of continued orofacial sensory-vibration stimulation with an electric toothbrush or additional oral neuromuscular training with an oral device (Muppy®). Participants were examined with TWST, a lip-force test, and VFS before (baseline), after 5 weeks’ treatment (the end-of-treatment), and 12 months after treatment (follow-up). The baseline VFS results were compared with the TWST results. The primary endpoint was changes in swallowing rate assessed using TWST, from baseline to the end of training and from baseline to follow-up based on intention-to-treat analyses. The secondary endpoint was the corresponding changes in lip-force between baseline, the end of treatment, and follow-up.
Results
The participants were randomly assigned as controls (n = 20) or for intervention with oral neuromuscular training (n = 20). After treatment, both groups had improved significantly (intervention, P < 0.001; controls, P = 0.001) in TWST but there was no significant between-group difference in swallowing rate. At the 12-month follow-up, the intervention group had improved further whereas the controls had deteriorated, and there were significant between-group differences in swallowing rate (P = 0.032) and lip force (P = 0.001). A TWST < 10 mL/sec at baseline corresponded to VFS-verified swallowing dysfunction in all assessed participants.
Conclusion
The 5-week oral neuromuscular training improved swallowing function in participants with post-stroke dysphagia compared with the controls 12 months after intervention, but there was no between-group difference in improvement immediately after treatment. TWST results corresponded with VFS results, making TWST a feasible method for identifying persons with swallowing dysfunction after stroke. Larger randomized controlled trials are required to confirm our preliminary positive long-term results.
Trial registration
Retrospectively registered at ClinicalTrials.gov: NCT04164420. Registered on 15 November 2019.
Journal Article
Effect of angiotensin-converting-enzyme inhibition compared with conventional therapy on cardiovascular morbidity and mortality in hypertension: the Captopril Prevention Project (CAPPP) randomised trial
by Hedner, Thomas, Niklason, Anders, de Faire, Ulf
in Adrenergic beta-Antagonists - therapeutic use; Adult; Aged
1999
Angiotensin-converting-enzyme (ACE) inhibitors have been used for more than a decade to treat high blood pressure, despite the lack of data from randomised intervention trials to show that such treatment affects cardiovascular morbidity and mortality. The Captopril Prevention Project (CAPPP) is a randomised intervention trial to compare the effects of ACE inhibition and conventional therapy on cardiovascular morbidity and mortality in patients with hypertension.
CAPPP was a prospective, randomised, open trial with blinded endpoint evaluation. 10 985 patients were enrolled at 536 health centres in Sweden and Finland. Patients aged 25–66 years with a measured diastolic blood pressure of 100 mm Hg or more on two occasions were randomly assigned captopril or conventional antihypertensive treatment (diuretics, β-blockers). Analysis was by intention-to-treat. The primary endpoint was a composite of fatal and non-fatal myocardial infarction, stroke, and other cardiovascular deaths.
Of 5492 patients assigned captopril and 5493 assigned conventional therapy, 14 and 13, respectively, were lost to follow-up. Primary endpoint events occurred in 363 patients in the captopril group (11·1 per 1000 patient-years) and 335 in the conventional-treatment group (10·2 per 1000 patient-years; relative risk 1·05 [95% CI 0·90–1·22], p=0·52). Cardiovascular mortality was lower with captopril than with conventional treatment (76 vs 95 events; relative risk 0·77 [0·57–1·04], p=0·092), the rate of fatal and non-fatal myocardial infarction was similar (162 vs 161), but fatal and non-fatal stroke was more common with captopril (189 vs 148; 1·25 [1·01–1·55], p=0·044).
Captopril and conventional treatment did not differ in efficacy in preventing cardiovascular morbidity and mortality. The difference in stroke risk is probably due to the lower levels of blood pressure obtained initially in previously treated patients randomised to conventional therapy.
Journal Article
Identifying important barriers to recruitment of patients in randomised clinical studies using a questionnaire for study personnel
2019
Background
Many randomised controlled trials (RCT) fail to meet their recruitment goals. Study personnel play a key role in recruitment. The aim of this study was to identify successful strategies that study personnel consider to be important in patient recruitment to RCT.
Methods
We constructed a questionnaire based on the literature, discussions with colleagues and our own experience as trialists. The survey was named the “What is Important for Making a Study Successful questionnaire” (WIMSS-q). Our target group was the study personnel in the ongoing EFFECTS study. The questionnaire was sent out electronically to all physicians and nurses (n = 148). Success factors and barriers were divided according to patient, centre and study level, respectively.
Results
Responses were received from 94% of the study personnel (139/148). The five most important factors at centre level for enhancing recruitment were that the research question was important (97%), a simple procedure for providing information and gaining consent (92%), a highly engaged local principal investigator and research nurse (both 87%), and that study-related follow-ups are practically feasible and possible to coordinate with the clinical follow-up (87%). The most significant barrier at the local centre was lack of time and resources devoted to research (72%). Important patient-related barriers were fear of side effects (35%) and language problems (30%).
Conclusions
For recruitment in an RCT to be successful, the research question must be relevant, and the protocol must be simple and easy to implement in the daily routine.
Trial registration
The protocol for this study was registered at the Northern Ireland Hub for trials methodology research (SWAT ID 64). The EFFECTS study has EudraCT number 2011-006130-16 and was registered 17 February 2016 at ClinicalTrials.gov, number NCT02683213.
Journal Article
Enhancing Recruitment Using Teleconference and Commitment Contract (ERUTECC): a stepped wedge cluster randomised trial within the EFFECTS trial
2025
Two out of three randomised controlled trials (RCTs) fail to meet their recruitment goals. Recruitment to the Efficacy oF Fluoxetine - a randomisEd Controlled Trial in Stroke (EFFECTS), a trial of fluoxetine for stroke recovery, was slower than anticipated. We aimed to evaluate an intervention to improve recruitment to EFFECTS.
This stepped wedge, cluster randomised study investigated whether a teleconference with the study personnel and the head of department could enhance recruitment in the ongoing EFFECTS. We included 20 low- and medium-recruiting active centres and excluded high-recruiting centres. All centres started as controls and were followed by 60 days of observation. We used block randomisation. The primary outcome was a 20% increase in recruitment within 60 days post-intervention compared with the 60 days pre-intervention. Secondary outcomes compared recruitment between different types of centres, that is, small versus large centres, experienced versus non-experienced centres, and university versus non-university hospitals. In exploratory analyses, recruitment within 30 days post-intervention was compared with the 30 days pre-intervention.
Recruitment increased by 10% at 60 days. We noticed a short-lived increase of 23% in the first month. The increased recruitment was most pronounced in low-recruiting, small and non-university hospitals. Recruitment of patients increased after the first contact with the centres, at which we announced that there would be a conference.
A teleconference with the study personnel and the head of department increased recruitment by 23% within 30 days and by 10% at 60 days post-intervention in this embedded RCT. This implies that the effect of this structured intervention aimed at increasing recruitment was short-lived and that the intervention would need frequent repetition to remain effective.
Journal Article
The psychosocial work environment is associated with risk of stroke at working age
by Jood, Katarina, Karlsson, Nadine, Medin, Jennie
in Occupational and environmental medicine; case control study; Case-Control Studies
2017
Objective
The aim of this study was to explore the relation between the risk of first-ever stroke at working age and psychosocial work environment factors.
Methods
A consecutive multicenter matched 1:2 case–control study of acute stroke cases (N=198, age 30–65 years) who had been working full-time at the time of their stroke and 396 sex- and age-matched controls. Stroke cases and controls answered questionnaires on their psychosocial situation during the previous 12 months. The psychosocial work environment was assessed using three different measures: the job demand–control model, the effort–reward imbalance (ERI) score, and exposure to conflict at work.
Results
Among 198 stroke cases and 396 controls, job strain [odds ratio (OR) 1.30, 95% confidence interval (95% CI) 1.05–1.62], ERI (OR 1.28, 95% CI 1.01–1.62), and conflict at work (OR 1.75, 95% CI 1.07–2.88) were independent risk factors for stroke in multivariable regression models.
Conclusions
Adverse psychosocial working conditions during the past 12 months were more frequently observed among stroke cases. Since these factors are presumably modifiable, interventional studies targeting job strain and the emotional work environment are warranted.
Journal Article
Stroke unit care revisited: who benefits the most? A cohort study of 105 043 patients in Riks-Stroke, the Swedish Stroke Register
2009
Background
Treatment at stroke units is superior to treatment at other types of wards. The objective of the present study is to determine the effect size of stroke unit care in subgroups of patients with stroke. This information might be useful in a formal priority setting.
Methods
All acute strokes reported to the Swedish Stroke Register from 2001 through 2005 were followed until January 2007. The subgroups were age (18–64, 65–74, 75–84, and ≥85 years), sex (male, female), stroke subtype (intracerebral haemorrhage, cerebral infarction and unspecified stroke) and level of consciousness (conscious, reduced, unconscious). Cox proportional hazards and logistic regression analyses were used to estimate the risk of death, institutional living or dependency.
Results
105 043 patients were registered at 86 hospitals. 79 689 patients (76%) were treated in stroke units and 25 354 patients (24%) in other types of wards. Stroke unit care was associated with better long-term survival in all subgroups. The best relative effect was seen among the following subgroups: age 18–64 years (hazard ratio (HR) for death 0.53; 0.49 to 0.58), intracerebral haemorrhage (HR 0.61; 0.58 to 0.65) and unconsciousness (HR 0.70; 0.66 to 0.75). Stroke unit care was also associated with a reduced risk of death or institutional living after 3 months.
Conclusions
Stroke unit care was associated with better long-term survival in all subgroups, but younger patients, patients with intracerebral haemorrhage and patients who were unconscious had the best relative effect and may be given the highest priority for this form of care.
Journal Article