Catalogue Search | MBRL
8 result(s) for "interviewer-respondent interaction"
INTERVIEWING PRACTICES, CONVERSATIONAL PRACTICES, AND RAPPORT: RESPONSIVENESS AND ENGAGEMENT IN THE STANDARDIZED SURVEY INTERVIEW
2016
\"Rapport\" has been used to refer to a range of positive psychological features of an interaction, including a situated sense of connection or affiliation between interactional partners, comfort, willingness to disclose or share sensitive information, motivation to please, and empathy. Rapport could potentially benefit survey participation and response quality by increasing respondents' motivation to participate, disclose, or provide accurate information. Rapport could also harm data quality if motivation to ingratiate or affiliate causes respondents to suppress undesirable information. Some previous research suggests that motives elicited when rapport is high conflict with the goals of standardized interviewing. The authors examine rapport as an interactional phenomenon, attending to both the content and structure of talk. Using questions about end-of-life planning in the 2003-2005 wave of the Wisconsin Longitudinal Study, the authors observe that rapport consists of behaviors that can be characterized as dimensions of responsiveness by interviewers and engagement by respondents. The authors identify and describe types of responsiveness and engagement in selected questionanswer sequences and then devise a coding scheme to examine their analytic potential with respect to the criterion of future study participation. The analysis suggests that responsive and engaged behaviors vary with respect to the goals of standardization; some behaviors conflict with these goals, whereas others complement them.
Journal Article
ITEM LOCATION, THE INTERVIEWER-RESPONDENT INTERACTION, AND RESPONSES TO BATTERY QUESTIONS IN TELEPHONE SURVEYS
2018
Survey researchers often ask a series of attitudinal questions with a common question stem and response options, known as battery questions. Interviewers have substantial latitude in deciding how to administer these items, including whether to reread the common question stem on items after the first one or to probe respondents' answers. Despite the ubiquitous use of these items, there is virtually no research on whether respondent and interviewer behaviors differ across items in a battery or whether these behaviors are associated with answers to these questions. This article uses a nationally representative telephone survey with audio-recorded interviews and randomized placement of items within four different batteries to examine interviewer and respondent behaviors and respondent answers on battery questions. Using cross-classified random-effects models, the authors find strong evidence that there is more interviewer-respondent interaction on items asked earlier in the battery. In addition, interviewer and respondent behaviors are associated with both substantive and nonsubstantive answers provided to battery items, especially if the interviewer decided to reread or probe with the response options. These results suggest that survey designers should follow recommendations to randomize battery items and consider the importance of standardization of question administration when designing battery questions.
Journal Article
The Action Structure of Recruitment Calls and Its Analytic Implications: The Case of Disfluencies
by Schaeffer, Nora Cate; Min, Bo Hee; Garbarski, Dana
in Case studies; Correlation analysis; Declination
2020
We describe interviewers’ actions in phone calls recruiting sample members. We illustrate (1) analytic challenges of studying how interviewers affect participation and (2) actions that undergird the variables in our models. As a case study, we examine the impact of the interviewer’s disfluencies on whether a sample member accepts or declines the request for an interview. Disfluencies are potentially important if they communicate the competence or humanity of the interviewer to the sample member in a way that affects the decision to participate. Using the Wisconsin Longitudinal Study, we find that although calls that become declinations begin much like those that become acceptances, they soon take different paths. Considering all recruitment actions together, we find that the ratio of disfluencies to words does not predict acceptance of the request for an interview, although the disfluency ratio before the call’s turning point (the request to participate or a declination) does. However, after controlling for the number of actions, the disfluency ratio no longer predicts participation. Instead, when we examine actions before and after the first turning point separately, we find that the number of actions has a positive relationship with participation before the turning point and a negative relationship after it.
Journal Article
Flexible Pretesting on a Tight Budget: Using Multiple Dependent Methods to Maximize Effort‐Return Trade‐Offs
by Wilson, Bianca D.M.; Holtby, Sue; Jans, Matt
in behavior coding; gender identity; interviewer‐respondent interaction
2020
Questionnaire pretesting is an essential, yet time‐consuming component of high‐quality survey measurement. Question designers must choose judiciously from a wide range of pretesting techniques to create an optimal design for the specific situation. For example, behavior coding is useful for identifying interactional problems in complex questions, but may not be as useful for simpler questions with few opportunities for problematic interviewer or respondent behavior. Similarly, low‐effort randomized experiments may not provide enough qualitative texture to recommend specific question changes. Many pretests may benefit from using multiple methods. This chapter presents a case study that employed dependent pretesting, where pretesting techniques were modified mid‐test based on initial results. This approach prioritizes maximizing actionable insight for effort expended over strict adherence to a predetermined pretesting protocol. The pretesting approaches used and the decision‐making process involved in making mid‐test adjustments are discussed. Lessons learned are also shared to aid future pretests.
Book Chapter
How to Pop the Question? Interviewer and Respondent Behaviours When Measuring Change with Proactive Dependent Interviewing
by Al Baghal, Tarek; Eckman, Stephanie; Sala, Emanuela
in coding; dependent interviewing; digital audio‐recording
2021
Dependent interviewing (DI) is a technique used in longitudinal surveys, whereby answers given in an interview are used to determine question routing or wording in the following interview. In this chapter, the authors examine which question format is best by studying interviewer and respondent behaviours, and how the question wording affects these. They use audio‐recordings of experimental questions in the Innovation Panel of the UK Household Longitudinal Study to examine how different versions of proactive DI questions function, and why some wording versions are problematic. The next step in the analysis process after data collection was to code the interviewer‐respondent interactions captured in the digital audio‐recording. In coding the interviewer and respondent behaviours, the unit of coding was each turn taken by the interviewer or respondent.
Book Chapter
Interaction Characteristics in Some Question-Wording Experiments
1997
Results from analyses of interviewer-respondent interaction are rarely used in research on data quality. Through an interaction analysis of interviews tape-recorded during a question-wording experiment, it is possible to validate some of the explanations proposed for certain question-wording effects. Experiments on the length of the response scale, the explicit use of a "no opinion" filter, and the use of the verbs "forbid" or "allow" are examined.
Journal Article
Response 1 to Fowler's Chapter: Coding the Behavior of Interviewers and Respondents to Evaluate Survey Questions
by Dykema, Jennifer; Schaeffer, Nora Cate
in coding the interaction of interviewers and respondents ‐ understanding interactional expressions of cognitive processing and measurement error; observing the interviewer's behavior ‐ how standardized the respondent's behavior is and clues to their cognitive processing; response 1 to Fowler's chapter, coding the behavior of interviewers and respondents ‐ studying interaction in the survey interview and why it is done
2011
This chapter contains sections titled:
Why do We Study Interaction in the Survey Interview?
Questions, Behavior, and Quality of Measurement: A Conceptual Model
Coding the Behavior of Interviewers and Respondents
Conversation Analytic Studies of Interaction in the Interview
Excerpt 3.1. Wisconsin Longitudinal Study, Question 2
Excerpt 3.2. Wisconsin Longitudinal Study, Question 1
Excerpt 3.3. Wisconsin Longitudinal Study, Question 7
Characteristics of Survey Questions
Excerpt 3.4. Wisconsin Longitudinal Study, Question 35
Conclusion
Acknowledgments
Notes
References
Book Chapter
Why don’t they answer? Unit non-response in the IAB establishment panel
2012
This paper focuses on unit nonresponse in the IAB (Institute for Employment Research) Establishment Panel, a comprehensive data set describing the employer side of the labour market in Germany. Every year since 1993 in western Germany (since 1996 in eastern Germany), the IAB Establishment Panel has surveyed the same establishments from all branches and different size categories. Although great efforts are made to convince the owner or manager to take part in the survey, there are always firms that do not answer the questionnaire. In this paper the authors try to find out why some establishments are not willing or able to respond to the questionnaire. If the respondent has the authority to provide relevant information, is able to give reliable answers to the questions with a justifiable amount of effort, and is interested in the survey in business terms, participation is less frequently refused. The results also confirm the central significance of the interaction between the respondent and the interviewer: if one of the two individuals changes, the probability of further participation falls markedly.
Journal Article