For the past two years, we have given hundreds of students at our university the opportunity to get hands-on experience conducting and analyzing qualitative interviews and constructing, collecting, and analyzing quantitative surveys. We are also using the student-collected data for a research project of our own. Here’s how we did it:
We first worked together to come up with a broad research idea that would be applicable to both of our courses (Criminology for Mary Nell and Juvenile Justice for Ashley). We decided that both sets of students could collect data on juvenile offending and perceptions of juvenile offenders. We then obtained IRB approval for the project, including permission to consider all of our enrolled students (approximately 100 students per course per semester) as research assistants, provided they completed the CITI online human subjects training course. In each course, completion of the CITI course counted as a small component of the final grade (2.5-5%).
Qualitative Project
For the Criminology course, where students would conduct qualitative interviews, Ashley and Mary Nell worked together to develop an interview guide, leaving space for students to probe and add their own questions. Mary Nell devoted one full 90-minute class period to interviewer training and gave students time in class to practice interviewing and troubleshoot any concerns. Students had to conduct and record interviews with two people, transcribe their interviews verbatim, and then write a course paper that incorporated the interviews and applied criminological theory. They submitted the transcripts, the audio recordings, their course paper, and a summary of the interview experience, including their evaluation of whether the person they interviewed took the interview seriously, how well they thought they did as an interviewer, and how they established rapport. Because each of the roughly 100 students interviewed two people, by the end of each semester we had 200 fully transcribed interviews ready for analysis.
We tweaked this project over the course of three semesters. Some changes we made were related to IRB -- the first semester, the IRB insisted that students interview strangers, which did not work out very well. We discovered that the “sweet spot” was for students to interview someone they knew casually, or a friend of a friend, rather than a complete stranger or a very close friend. We also learned which kinds of questions worked better than others. Students do not start out as skilled interviewers: when the interview guide contained a string of questions meant as probes or follow-ups, students would read all of the questions at once rather than waiting for responses in between. We also added the requirement that students write their own interview questions in each section so that they could focus the interview on a theory or concept that especially interested them. Our first semester of data was not usable for our own research; there were simply too many problems with the questions, topics, and interview techniques. We would therefore recommend piloting the interviews for a semester before keeping the data to use for your own purposes. Even so, students still learned a good amount from the process. The day students hand in their papers, we spend part of class summarizing their experiences, what they learned, and what they would change or do differently next time. We also glean some of this information from the questionnaires they fill out after their interviews. Almost all students enjoy the project and feel that they learn more about the research process and the theories that guide the course. As two examples, undergraduate students Joe Buttino and Jeremy Carr write about their experience:
Joe’s Perspective: Through the Criminology project, I learned a great deal about proper interview techniques and how to conduct a quality interview. When analyzing my two interviews, I was excited to apply criminological theory. For example, Cohen and Felson’s “chemistry of crime” [routine activities theory] proved helpful in understanding my respondents’ rationale for their delinquent behavior, and labeling theory helped me understand other people’s responses to their behavior. My final paper was a culmination of the theories I learned in class and their real-world application to young people my age. The focus on my peers was refreshing because it overlapped with my own academic interests in understanding generational differences between Millennials and Gen Z.
Jeremy’s Perspective: The qualitative Criminology project was one of my favorites of my undergraduate career and was helpful in preparing me for a sociology graduate program. In the prep work for the project, I learned about the utility of different sampling strategies, and through the process of conducting the interviews, I learned to be reflexive about my own standpoint and biases. Further, when interviewing a friend, the strengths and weaknesses of having insider status quickly became apparent. In analyzing my interviews, my main takeaway was that one criminological theory cannot explain all aspects of delinquent and criminal behavior. Rather, a web of theories can offer more insight into explaining behavior and others’ responses to it.
Quantitative Project
In Ashley’s Juvenile Justice course, students completed the key components of a quantitative survey project, from literature review through data presentation. When paired with the main course text, an ethnography by Victor Rios, the survey project allowed students to compare the utility of different methods for studying juveniles and juvenile justice. The project had several checkpoints, some of which were completed in groups. Students began the project by identifying broad topics of interest and potential research questions in small groups. Based on these group topics, each student completed a short annotated bibliography and shared the key takeaways with their group. Each group then honed its research question, developed a simple hypothesis informed by the collective literature review, and wrote survey questions that would enable them to test their hypothesis. Ashley then added these survey questions to a pre-existing Google Forms survey developed in collaboration with Mary Nell. This ensured comparability of survey data across semesters while still allowing each class to add its own questions and test its unique hypotheses. Each student was responsible for distributing the survey and obtaining at least 10 respondents, yielding about 1,000 respondents per class each semester. Upon completion of data collection, students tested their hypotheses and prepared a policy brief or presentation to describe the findings and their implications for theory and practice.
Because the project was labor-intensive and student preparation for it varied widely, Ashley modified its structure slightly each semester to maximize student learning and minimize chaos. From this trial and error came several successful changes. First, any required group work was minimal and took place in class. Students also completed peer evaluations of their group members to hold one another accountable. Second, the number of hypotheses actually tested was reduced to 3-4 per class (rather than 1 per group) by holding a small competition in which each group presented its hypothesis and made a case for why it was important to test. The class then voted on which hypotheses were most interesting, and only the survey questions relevant to this narrowed set of hypotheses were added to the final survey. Consistent with the limits of cross-sectional survey research, hypotheses were also required to be very basic (e.g., “Young people who report more X will also report more Y”). Finally, it was important to gauge students’ quantitative skills and familiarity with Microsoft Excel or other analytic software before requiring them to analyze data. Most students, Ashley discovered, lacked any familiarity with Excel and had difficulty producing basic cross-tabulations or descriptive statistics. Walking through the data analysis in class for the 3-4 hypotheses proved a much better use of student and faculty time than having students attempt the analysis on their own. Students were still responsible for interpreting and presenting the results in their final write-ups.
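For instructors who want a scriptable alternative to walking through the analysis in Excel, the same descriptives-and-crosstab workflow can be sketched in a few lines of Python with pandas. This is a minimal sketch, assuming the Google Forms responses have been exported to a CSV file; the file name and column names below are hypothetical placeholders, not our actual survey items:

```python
# Minimal sketch of the in-class analysis: descriptive statistics plus a
# basic cross-tabulation. File and column names are hypothetical.
import pandas as pd

# Load the survey responses exported from Google Forms as a CSV file
df = pd.read_csv("survey_responses.csv")

# Descriptive statistics for the numeric survey items
print(df.describe())

# Test a basic hypothesis of the form "Young people who report more X
# will also report more Y" with a cross-tab, e.g., whether respondents
# from high-security high schools report feeling less safe on campus.
crosstab = pd.crosstab(
    df["high_school_security"],  # hypothetical item: "low" / "medium" / "high"
    df["feels_safe_on_campus"],  # hypothetical item: "yes" / "no"
    normalize="index",           # row percentages, as a class would read them
)
print(crosstab)
```

Row percentages (normalize="index") mirror how a cross-tab is usually read aloud in class: within each level of the first item, what share of respondents gave each answer to the second.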
As with any class project, some students were more invested in the survey project than others. Most students, though, seemed engaged and interested to learn whether the survey data supported their hypotheses (as demonstrated by class cheers when a hypothesis was supported). Students consistently developed interesting hypotheses that applied and extended course material. For example, some students in the most recent semester were interested in how high school security measures (school resource officers, active shooter drills, random searches, security cameras, and the like) were related to students’ feelings of safety on campus. Other students were interested in how juveniles’ interactions with police were related to university students’ perceptions of campus police. Because hypotheses were grounded in theory and existing literature, students connected with the material in new ways and learned applied skills -- like data management, sampling, and survey writing -- in the process.
Joe’s Perspective: The Juvenile Justice survey project piqued my interest in quantitative data analysis. I was quite curious before the course began, but I had not yet had the opportunity to collect or analyze quantitative data. Because I had already taken Dr. Trautner’s class and knew a little about the complementary survey project, I was able to formulate interesting questions and hypotheses for my class to test with this year’s survey. For example, I proposed, and my class agreed, that it would be interesting and relevant to test how the presence of metal detectors and other high-security interventions in elementary and high school was related to students’ self-reported levels of fear in college. I was excited to find patterns in the survey data that supported criminological theory and to have interview data that provided a more nuanced take on these patterns. The project allowed me to learn the basics of survey data analysis in Excel and Stata and, together with Dr. Trautner’s project, to see the benefit of a mixed methods approach.
Jeremy’s Perspective: Professor Barr’s approach to the Juvenile Justice survey project was unique, and I would encourage other instructors to implement a similar approach. Rather than letting us settle on an easy and predictable topic for a passing grade, the in-class competition to narrow our hypotheses pushed us to propose more interesting and relevant research questions and hypotheses. Further, because our class simply added to a pre-existing survey and we analyzed the results as a group in class, we were able to spend the majority of project time discussing methods and theory and understanding why we found the results we did. If we had had to start the survey from scratch or analyze the data on our own, my classmates and I likely would have spent most of the project simply figuring out how to create a survey or navigate Excel to show our results. Finally, many sociology students acknowledge that there are problems in society and spend a lot of class time understanding those problems, yet we are rarely asked to think about how to fix them. Professor Barr asked us to do the latter. During my semester, we were prompted to use theory and our survey results to complete a policy brief relevant to our own university. This allowed students to get a foot in the door of change and to recognize their own agency.
Outside the Classroom
Really engaged students might want to work more closely with the data than conducting two interviews or testing very basic hypotheses with survey data. In our case, we hired two students (Joe and Jeremy) to work as research assistants on the project. Joe also used a subset of the data for an Honors project.
Joe’s Perspective: During my first year in college, I had the unique opportunity to work closely with Dr. Trautner on a different qualitative project and with Dr. Barr in a small seminar class, so I was excited to be selected as an RA on their joint project. My work on this project has formalized my training in basic and intermediate research methods. In addition to the interview and survey skills acquired through coursework, I learned the basics of constructing codebooks, cleaning and analyzing data, and working with others to establish interrater reliability. The project has also allowed me to improve my critical thinking skills by understanding the sociological context of deviance and crime. I put all of these new skills to use in an Honors Contract paper entitled “Perceptions of Delinquency among College Students and Non-students.” This paper draws upon the blinded survey and interview data collected in prior semesters of Dr. Trautner’s and Dr. Barr’s classes to examine differences in how young people rationalize their own and others’ adolescent delinquency. The paper secured me a spot in the American Sociological Association’s Honors Program at its annual meeting this year in New York City, and I also won the Undergraduate Paper Prize from the ASA’s Sociology of Law section. Although my career intentions are a bit unclear at the moment, I am excited to explore sociology further and network with other students at this year’s conference.
Jeremy’s Perspective: Having the opportunity to be involved in the nitty-gritty of these two projects as an RA has given me experience collaborating with other research assistants at a professional level. I have become much more interested in conducting my own research after being exposed to the raw process of organizing and analyzing data. I intend to use the methods I have learned in my RA position to construct my Master’s thesis when I begin graduate studies in the fall. Expanding my interview analysis to 400 interviews (versus 2 for class) allowed for a better assessment of patterns and more consistent application of theory. I will be furthering my education at the University at Buffalo in the fall of 2019, pursuing an MA in sociology with a focus in criminology, a decision partly inspired by my involvement as an RA. Further, my research assistantship with Professors Trautner and Barr served as a focus for my statement of purpose when applying to graduate programs and, I believe, was a selling point on my CV.
We gave all students advice on how they could frame the experience they gained on their resumes to highlight it for future employers. We circulated our assignments to career counselors at our university’s career services office, who turned the assignments into bullet points that could be put on a resume. We then gave these to students as examples should they wish to use them.
For example, here is the language the career counselor suggested students could include on a resume for the interview project:
Interview-based social research, Spring 2018
• Collaborated with lead faculty to locate, recruit, and interview 2 people on their experiences with, and others’ reactions to, illegal juvenile behavior
• Conducted two 60-minute interviews and demonstrated strong attention to detail by accurately transcribing responses
• Completed 3-hour tutorial to learn effective and ethical practices for conducting research with human subjects
• Analyzed and summarized interviewee responses and applied knowledge of criminological theory in a 6-page written paper
We have both really enjoyed incorporating research components into our substantive courses. Students are able to see the practical value of course concepts while building their resumes and developing an appreciation for empirical work. Our approach allows all students to benefit and especially interested or advanced students to gain even further experience.