Empirical Research

A Mixed Method Approach to Understanding Researcher Identity

Authors:

Rachel Kajfez, The Ohio State University, US

Dennis Lee, Clemson University, US

Katherine Ehlert, Clemson University, US

Courtney Faber, University of Tennessee, US

Lisa Benson, Clemson University, US

Marian Kennedy, Clemson University, US

Abstract

Background: Identity research in engineering education has been expanding to include multiple forms of measurement. While a variety of approaches have successfully contributed to our understanding of identity, mixed method approaches have been utilized minimally in identity research. Therefore, additional insight on the implications and affordances of mixed method approaches for identity research is needed.

Purpose: Our work explored undergraduate engineering students’ researcher identity using a fully integrated mixed method approach by interweaving deductive (quantitative) and inductive (qualitative) strands of survey and interview data. We aimed to answer the research question: How can quantitative and qualitative data approaches be used in combination to explore how students conceptualize their researcher identity?

Method: The data included responses from 20 undergraduate mechanical and biomedical engineering students representing six different institutions. Students completed surveys, and their survey responses served as a foundation for subsequent interviews. Both the survey and interview contained an identical item (a single anchored scale measure of researcher identity), which directly connected the two data sets. While the survey data was used during the interviews, analysis of the strands was initially separate but concurrent.

Results: After initial analysis, mixing the data provided two key opportunities to attain a deep understanding of participants’ researcher identity. The anchored scale provided a platform to discuss researcher identity with participants and allowed us to look within and across participants’ experiences in unexpected ways. Our discoveries of how individual students’ conceptualizations of the identity being measured varied and how those conceptualizations changed over time were only possible through the analysis of both quantitative and qualitative data.

Conclusions: Our fully integrated mixed method approach resulted in a more complete understanding of students’ researcher identities, allowing us to extend our theoretical understanding beyond what would have been possible with either method alone. Researchers exploring complex topics that can be fluid and affected by time and experience, such as identity, may benefit from integrating a similar approach into their research protocols.

How to Cite: Kajfez, R., Lee, D., Ehlert, K., Faber, C., Benson, L., & Kennedy, M. (2021). A Mixed Method Approach to Understanding Researcher Identity. Studies in Engineering Education, 2(1), 1–15. DOI: http://doi.org/10.21061/see.24
Published on 29 Mar 2021
Accepted on 23 Feb 2021
Submitted on 15 Jan 2020

Introduction and Background

Identity has become a significant area of study in the engineering education community, with researchers using quantitative, qualitative, and, to a more limited extent, mixed method approaches to measure identity (Patrick & Borrego, 2016; Rodriguez et al., 2018). Multiple researchers have developed quantitative scales to measure engineering identity and to determine both what influences identity development and what identity, in turn, influences. While grounded in theory, this is a deductive approach to identity wherein the researcher limits what is and is not part of the engineering identity through the scale items. Godwin (2016) established a scale that measured identity using a model of performance/competence, interest, and recognition, while Borrego and her colleagues developed a quantitative instrument using the model of identifying with an organization (Borrego et al., 2018). Similarly, Pierrakos and colleagues developed an engineering student identity scale (E-SIS) based on social identity theory (Curtis et al., 2017; Pierrakos et al., 2016). While these quantitative approaches are valuable contributions to our understanding of identity, they scope and limit the identity being studied (i.e., they approach the construct to be measured deductively, from the top down, with concrete bounds).

Other researchers have also measured engineering identity by modifying existing identity scales to fit the context of engineering or by measuring engineering identity through a single item (Patrick & Borrego, 2016). Single item identity questions tend to seek a more holistic measure of engineering identity using prompts such as “Do you consider yourself to be an engineer?” (Meyers et al., 2012). While a single item is more open than other approaches because it allows the participant to self-define the construct, this technique does not allow for in-depth understanding of responses: the researcher cannot evaluate how the participant interpreted the question beyond any open-ended survey question that is paired with the single item. While all of these studies employed well-known and established practices for measuring identity that allow researchers to generalize results based on theory, they lack the individual view and participant-centered focus that the complex and multi-faceted nature of identity demands.

Because identity is multi-dimensional, many researchers have also explored engineering identity qualitatively through interviews, multiple open-ended survey questions, or other methods like ethnography that take a more inductive (i.e., the researcher allows the findings to emerge from the range of data provided by the participants) approach to understanding participants’ experiences. For example, Matusovich et al. (2010) used a multiple identities framework to better understand how undergraduates develop an engineering professional identity, and Tonso (2007) used an ethnographic approach and a community of practice framework to explore engineering identity and campus culture. In both of these cases, theory guided the work but new, unexpected results bubbled up from participants’ responses. Similarly, Stevens et al. (2008) used an ethnographic approach and developed a framework of becoming an engineer through longitudinal ethnographies of students matriculating through an engineering curriculum across multiple institutions. While these studies provide complex views of identity that expand and extend known theory, their transferability is limited given the number of participants (tens of participants in the qualitative work versus hundreds of participants in the quantitative work). Additionally, like many qualitative approaches, they are time intensive and require researchers to develop a deep understanding of each participant’s experience within their unique context to be able to tell a detailed story and glean meaning.

In an effort to leverage benefits from both quantitative and qualitative research methods, some engineering education research teams have explored identity using different mixed method approaches in limited ways. Li et al. (2009) developed a model to identify student characteristics that would affect enrollment and retention in engineering. They developed the model by qualitatively analyzing previous literature related to engineering and collegiate retention and then developed quantitative measures from that model. Similarly, Fleming et al. (2013) leveraged a sequential explanatory (quantitative then qualitative) mixed method approach to explore the engineering identity of students attending minority serving institutions. They distributed a survey to students at four minority serving institutions and then invited a select number of survey participants to participate in a follow-up semi-structured interview. The results indicated that minority serving institutions had a positive impact on the development of those students’ engineering identities. Conversely, Litchfield and Javernick-Will (2015) leveraged a sequential exploratory (qualitative then quantitative) mixed method approach to explore differences in engineering identity between students who participated in Engineers Without Borders (EWB) and those who did not, concluding that students who participated in EWB had broader motivations for becoming engineers. One of the largest mixed method studies in engineering education related to identity development, which included both sequential and concurrent strands, is the Academic Pathways Study (APS), which explored how students “pursue and persist in an engineering degree, relative to their learning and identity development” (Chachra et al., 2008, p. 2). APS researchers explored how gender, motivation, environment, and experiences impacted and influenced the development of an engineering identity (Atman et al., 2010). These mixed method approaches contribute to the small but growing body of literature in engineering education that has examined identity through both quantitative and qualitative methods, attempting to address the challenges posed by a single research approach.

Given the promise for using mixed methods to offset the limits of one approach alone and to more fully understand identity, we conducted a fully integrated mixed method analysis (Creamer, 2018a) of undergraduate engineering students’ role identities as researchers. We use the term fully integrated mixed methods (Creamer, 2018a) to indicate the integration of data from quantitative (deductive) and qualitative (inductive) methods throughout our entire research process (from study planning to data collection to data analysis). In this manuscript, we present an approach for integrating a single anchored scale identity item with in-depth interviews using a study of student researcher identity as an exemplar case. The unique contribution of this work towards understanding the researcher identity of engineering students led us to suggest that mixed methods research can be a strong complement to quantitative and qualitative identity research methods that are currently and commonly employed in engineering education.

Undergraduate Researcher Identity

Undergraduate engineering students’ roles within academia are complex and multifaceted. In addition to their roles as learners, some engineering students take on supplementary roles such as engineering intern, teaching assistant, office assistant, or undergraduate researcher. Each new role allows a student to broaden their professional skills and educational experience. Since the roles individuals fill impact their identity (Stryker & Burke, 2000) and identity impacts career choice and success (e.g., Carlone & Johnson, 2007; Cass et al., 2011; Godwin et al., 2016; Hazari et al., 2010), it is essential to understand the additional roles engineering students can undertake besides that of an engineer.

We are specifically interested in the researcher identity of undergraduate engineers. Conducting undergraduate research gives students the opportunity to understand what it is like to be a researcher while enhancing their metacognitive and problem-solving skills (Hunter et al., 2007). Exposure to undergraduate research can prepare students for a thesis-based graduate program and provide experience for launching a research career, which can help students clarify their career plans and goals. Undergraduate research experiences have been shown to increase students’ confidence in their abilities to conduct research and also improve their technical and communication skills by giving them the opportunity to present and/or publish their work (Seymour et al., 2004), skills that are beneficial to students regardless of their future careers. We posit that these skill-building experiences can also lead to the development or changing of a researcher identity.

Our Mixed Method Approach to Researcher Identity

Mixed methods expand the breadth of research questions that can be explored beyond what can be achieved through quantitative or qualitative methods alone, and the complexity of the questions has the potential to increase as well. For example, mixed methods provide a means for triangulation but are also valuable when the work is driven by complementarity between the quantitative and qualitative data (Creamer, 2018b). Triangulation alone, however, reveals only one aspect of that potential. The ability to examine complementarity is particularly important when the two data strands reveal contradictory or inconsistent data (Creamer, 2018b). A framing of complementarity allows for mismatch while aiming for a more holistic understanding. For our work, this deep exploration was accomplished through a single item presented in both the quantitative and qualitative data strands. This item allowed us to compare within and between participants’ responses in a unique way, moving beyond triangulation.

We explored students’ responses to the single researcher identity item across participants, as well as any differences between individuals’ responses to the survey and interview questions. This approach allowed us to capture individuals’ views and the broader definitions of what it means to be a researcher from the perspective of the undergraduates. In the language of mixed methods research, we qualitized (Nzabonimpa, 2018; Sandelowski, 2000) the quantitative measure to gain a deeper understanding of the student experience and additional knowledge related to the validity of such an approach to measuring identity. Specifically, we compared and contrasted the students’ responses across these two strands to answer our research question: How can quantitative and qualitative data approaches be used in combination to explore how students conceptualize their researcher identity?

Methods

This manuscript stems from a mixed method project aimed at understanding the epistemic thinking and researcher identity of engineering students involved in undergraduate research (Benson et al., 2016; Faber et al., 2019; Lee et al., 2019). For this special issue, we are focusing on new findings attributed to the use of mixed method approaches to studying identity. If you are interested in learning more about the larger project and our findings, please read our other works, which are cited throughout this section. Particularly for this special issue, we were interested in combining quantitative and qualitative data to explore students’ conceptions of themselves as researchers and how this allows for comparisons across participants’ quantitative scores. We initially looked for confirmation between data strands for the same item but found differences between participants’ survey and interview responses. Probing these differences during interviews provided insight into how participants believed their identities changed over time and with different research experiences. It also allowed us to better understand nuanced changes in their definition of what a researcher is. Our findings around these topics would not have emerged without the use of this mixed methods approach. It should be noted that the data used in this study is not publicly available but access to de-identified data may be granted by contacting the lead author.

Project Overview

The project began with a survey (open- and closed-ended items) first administered in 2016, and again in 2017 to increase the sample size, to engineering students who self-identified as participating in some sort of undergraduate research experience (Ehlert et al., 2017) at six institutions classified as different types by The Carnegie Classification of Institutions of Higher Education (Indiana University School of Education Center for Postsecondary Research, 2015). Follow-up interviews were conducted with 20 undergraduate students who completed the survey. At the larger institutions included in this study, we reviewed participants’ survey responses and sent invitations for follow-up interviews to ensure that we interviewed students with different types and years of research experience and different perceptions of research and themselves as researchers. At the smaller institutions, we emailed any student who indicated they were interested in completing an interview. Through IRB approved procedures (Clemson University: IRB2014-051, Ohio State University: 2015B0285, and University of Tennessee-Knoxville: 17-03655-XM), we linked the interview participants to their quantitative responses and assigned a gender-neutral pseudonym.

Fifty percent of the interview participants came from Institutions 1 (Very High Research Activity) and 2 (Very High Research Activity) (25% each), with the other participants from Institutions 3 (Very High Research Activity), 4 (Larger Program Master’s University), 5 (High Research Activity), and 6 (Special Focus Four-Year: Engineering Schools) making up 10%, 15%, 5%, and 20% of the participant cohort, respectively. The text in parentheses after each institution indicates the institution type as defined by The Carnegie Classification of Institutions of Higher Education (Indiana University School of Education Center for Postsecondary Research, 2015), which is based on an institution’s defining feature. At the time of the interviews, which happened approximately three to twelve months after the participants completed the survey, participants had been in college from two to six years, with the majority being in college four years (40%), followed by three years (25%), two years (20%), five years (10%), and six years (5%). The majority of the 20 participants were female (75%).

Quantitative and Qualitative Data Strands

There are various definitions of mixed methods. For this work, we operationalize quantitative and qualitative using Creamer’s (2018a) definitions: the quantitative strand is deductive in nature and the qualitative strand is inductive. The deductive approach allowed us to focus and bound based on what is already known while the inductive approach allowed for new insights to be generated by our participants and methods.

Based on this framing, our quantitative or deductive strand included participants’ responses to the question “Do you see yourself as a researcher?” Participants responded to this question using an anchored scale from 1 to 7, with 1 being “No, not at all” and 7 being “Yes, very much.” This single item was first incorporated into our work in 2014 (Faber & Benson, 2015), following Hazari et al.’s (2010) work in physics identity as a guide. As there was not an existing framework for researcher identity at the start of this research study, we decided to use this single deductive item to get an initial overall understanding of the extent to which students saw themselves as researchers. While this item provided a quantitative measure of identity that was simple to analyze and bounded, we recognized the item lacked complexity. Subsequently, the single deductive item was followed by inductive, open-ended questions in the survey to further explore how students saw themselves as researchers. The responses to these open-ended questions allowed us to more deeply understand each participant’s responses to the single item (Faber et al., 2020). In our early analysis of the surveys, we identified that students responded to this item using different ranges of scale points (i.e., we could not tell how they were considering the full scale to rate their researcher identity). This observation echoes the limitations of quantitative research identified earlier with respect to the deductive nature of quantitative items. In this case, the participants’ views were not being fully captured, and these views were influencing how the scale was used. This limitation again highlights the need for mixed methods work, which influenced our decision to continue using the item in our interviews as a starting point for asking participants additional questions about how they saw themselves as researchers.

Our qualitative, or inductive, strand included open-ended survey questions and interview questions designed to further explore specific aspects of students’ researcher identities (e.g., when they first felt like a researcher, what makes them feel like a researcher, etc.). These interviews were semi-structured in nature and were informed by participants’ responses to the earlier survey. Prior to each interview, we read the participant’s survey responses and pulled the participant’s responses to specific survey questions to use during the interview. Related to identity, each participant was asked, “In your survey, you indicated that you did [or didn’t] feel like a researcher. On a scale of 1 to 7, you rated a [number]. Where do you feel you are now? Why?” When asked this question, each participant was handed a printed scale with the following anchors: 1 – No, not at all and 7 – Yes, very much. We asked students for this explanation in our interviews because of the time (three to twelve months) between our initial surveys and the follow-up interviews. Originally, we intended to have the interviews occur within one to three months after the initial surveys; however, we decided to place more focus on collecting and analyzing survey results before moving to interviews, increasing the time between the two data collection points. This approach also allowed us to ask students to reflect on any difference between their initial survey response to the single researcher identity item and their response to the same item during the interviews. Participants thought deeply about their responses to this item and provided thoughtful answers in the interview. In the interviews, to more fully capture participants’ researcher identities beyond the single identity item, we asked a series of follow-up questions: 1) When did you start feeling like a researcher? 2) What part(s) of your research experience(s) make you feel like a researcher? and 3) What parts don’t make you feel like a researcher? However, as we progressed through our analysis of the first 13 interviews, we noticed differences in the ways that participants interpreted the researcher identity scale. To more fully understand participants’ responses to the question “Do you see yourself as a researcher?”, participants in the remaining seven interviews were asked to specifically define the scale anchors with the following prompt: “Could you define what a 1 and a 7 mean to you?” Again, this change allowed us to use a deductive quantitative item to begin our analysis while leveraging the power of an inductive qualitative approach to better understand the participants’ views and interpretations through mixing. By explicitly asking participants to define the ends of the scale, we were able to further integrate our deductive and inductive strands by re-scaling these participants’ quantitative responses, allowing us to compare across participants and further understand the bounds of participants’ views of their researcher identity.

Analysis

For us to compare quantitative researcher identity ratings within and between participants, we needed to normalize participants’ researcher identity ratings based on their descriptions of the scale anchors. To do this, two researchers analyzed all participants’ discussions of the researcher identity scale anchors and individually constructed definitions for the item scale anchors, using the participants’ inductive data to characterize participants’ conceptualizations of the deductive scale anchors. The researchers met and discussed their individual definitions of the scale anchors until they reached agreement. Below, we present composites of the researcher identity scale anchors. We constructed these descriptions from participants’ qualitative interview data; therefore, they represent participants’ perspectives of the researcher identity scale.

1 – No, not at all: No contribution to research, not leading research, not currently doing research, not planning on a research career, and negative self-assessment of research skills.

7 – Yes, very much: Have research experience, contributing data/ideas, disseminating results, confidence/high self-assessment of research skills, planning on research as a career.

Using these final definitions for the scale anchors, the researchers then individually re-scaled all twenty participants’ researcher identities from 1 to 7 based on their full interview transcripts, making sure to stay grounded in the participants’ own responses and descriptions of themselves as researchers. Once complete, the researchers met again to discuss participant identity scale values until agreement was reached. The researchers were able to come to agreement on re-scaling for all participants except one. Six participants were rated at the high end of the scale (7, 6), six at the low end of the scale (1, 2), and seven in the middle of the scale (3, 4, 5). The researchers could not reach consensus for the one participant (Aubrey) because some parts of her transcript indicated that she rated high, while other parts of her transcript indicated that she rated in the middle of the scale. However, the researchers were able to agree that Aubrey fell at the transition between high and medium researcher identity ratings. Explicitly asking participants to define the scale anchors during the interview was integral in allowing us to mix our inductive and deductive strands during data analysis. The inductive participant descriptions of the scale anchors added nuance to the anchors, which facilitated the re-scaling of participant identity ratings.
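To make the coarse groupings above concrete, the sketch below (in Python) maps a re-scaled 1-to-7 rating onto the high, middle, and low bands we report. This is purely illustrative: the actual group assignments in our study were reached through researcher discussion and consensus rather than an automated rule, and the participant labels and ratings in the example are hypothetical placeholders.

```python
# Illustrative sketch only. The bands mirror the groupings described above
# (6-7 = high end, 3-5 = middle, 1-2 = low end); actual assignments in the
# study were made through researcher consensus, and the example ratings
# below are hypothetical.

def identity_band(rating: float) -> str:
    """Map a re-scaled researcher identity rating (1-7) to a coarse band."""
    if rating >= 6:
        return "high"
    if rating <= 2:
        return "low"
    return "middle"

# Hypothetical re-scaled ratings for three placeholder participants.
example_ratings = {"Participant A": 6.5, "Participant B": 4.0, "Participant C": 1.5}
print({name: identity_band(r) for name, r in example_ratings.items()})
# -> {'Participant A': 'high', 'Participant B': 'middle', 'Participant C': 'low'}
```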

To obtain a fine-grained re-scaling of participant researcher identity, participants’ researcher identities were then ranked based on the data from their transcripts where they discussed their researcher identity self-rating and explanation (please note that we did not re-score them; we re-scaled or ranked them in comparison to each other). Following the interview transcription, the responses were analyzed using a systematic process that aligned with grounded theory methods. That detailed approach is described in our model development paper (Faber et al., n.d.); however, at a high level, our analysis included emergent coding and writing structured analytical memos for each participant (Lee et al., 2019). The explicit definitions of the researcher identity scale by participants, along with the identity rating re-scaling, facilitated the mixing of our inductive and deductive data strands and expanded the depth and nuance of the analysis and the strength of the results. Within the structured analytical memos, we included descriptions of participants’ researcher identities as well as comparisons across participants. These data allowed participant descriptions of their own researcher identities to be re-scaled according to their level of alignment with the low and high researcher identity definitions established through analysis of participants’ discussions of the researcher identity anchors. Furthermore, once ranked within the re-scaled structure, participant descriptions of their own identities revealed deeper insights into how the participant sample as a whole perceived a high, medium, or low researcher identity. The results of the re-scaling are summarized in Table 1 and are discussed in further detail in the Mixed Methods Opportunities and Related Findings section of this manuscript.

Table 1

Participants are Ranked High to Low Based on Researcher Identity Scores Reported.

Participants (Ranked High to Low) | Participant’s Reasoning | Change from Survey to Interview

Identity Grouping: I am a researcher and/or I want a research career
Pat | Confident in abilities as a researcher, mentor, and mentee | 0
Dana | Anyone who is asking and/or answering questions is a researcher | –1.5
Riley | Feels confident as a researcher, but feels that she cannot fully be a researcher without a Ph.D. | 0
Logan and Taylor | Cannot rate self higher because there is always more to learn in research | –1 (Logan), +0.5 (Taylor)
Aubrey | Not enough experience to rate self higher | +0.25
Clay | Had not finished writing a paper for publication | –1.5
Frances | Has not done enough independent research to rate self higher | –1
Peyton | Did not see that the research he was doing was helping people | –1

Identity Grouping: I am not a researcher and/or do not want a career in research
Ari | Likes research, but lacks research experience | 0
Kelly and Sage | Confident in their research abilities, but feel they cannot lead their own research | –0.5 (Kelly), –1.5 (Sage)
River | Interested in research, but not all fields of research | –2
Bay | Feels like she is performing the practices of research but does not identify as a researcher | 0
Kennedy | Lacks experience, only did research for a short time | +0.5
Hayden, Max, and Reed | Currently not participating in research experience | –4 (Hayden), –4 (Max), –5 (Reed)
Eli | Results were not verified by a university professor, so cannot be considered research | –3.5
Sam | No passion for research; the more research she does, the less she feels like a researcher | –1

Limitations

We acknowledge that there are limitations to our current approach, including limitations related both to understanding researcher identity and to this approach for integrating a single anchored scale identity item with in-depth interviews. While using one item to measure identity impacts the interpretation of our quantitative results (e.g., it lacks depth and complexity), several factors encouraged us to use a single measure instead of multiple measures. When we began this work, instruments to measure researcher identity (e.g., Choe et al., 2017) were just being developed in engineering education, focused on graduate students, and were not widely available. Additionally, we believed that a single, holistic measure of identity was an appropriate starting point for our work since a more robust scale was not available, and we could use the insights to gather additional data to further expand our findings. While this may be a potential limitation, using a single item, as opposed to multiple items across multiple dimensions, allowed the participants to define researcher identity for themselves. This approach aligned with our goal of exploring researcher identity from the undergraduate student perspective rather than applying a researcher-constructed conceptualization. We were further encouraged to explore participants’ perceptions of researcher identity using inductive approaches, allowing us to integrate the initial deductive results with additional inductive data. Our approach also aligned with other researchers’ approaches to measuring identity at the time (e.g., Meyers et al., 2012). Our findings, which are further explored in the next section, provide a valuable contribution to the literature related to identity measurement.

Another limitation to this work is our sample of participants. A majority of our participants self-identify as female (15 out of 20) and White (15 out of 20). As such, we are missing the perceptions of some members of the engineering community, such as males of color. In addition, most participants attend Very High Research Activity institutions (13 out of 20), in part because these institutions were larger and thus provided more potential participants. It is possible that our mixed methods approach contributed to the final composition of our sample, especially because we used the initial survey as a mechanism to select participants for interviews. If an individual did not complete the survey, they would not have been given the opportunity to complete an interview. Self-selection bias is a challenge in any study and is further exaggerated in a study that requires participants to complete multiple data collection measures, such as a survey and then an interview. Additionally, to limit disciplinary differences, we only recruited from mechanical and biomedical engineering departments, so 18 out of the 20 participants represent these majors. Another limitation to participant recruitment in this study was that we recruited participants through emails sent by departmental representatives (advisors, department chairs, undergraduate coordinators, etc.), and these recruitment emails may have been overlooked by many students. Again, our sample impacts the interpretation of the results and readers’ ability to transfer the findings broadly to all engineering students.

Additionally, the elapsed time between the survey and interview was not consistent across all of the participants, given challenges in participant recruitment and the grounded theory elements of the work, which require time to analyze data between rounds of recruitment. Our approach of using surveys then interviews did not allow us to capture the exact same data at each time point for the participants (e.g., we did not do pre- and post-surveys relative to each participant’s research experiences). Therefore, we cannot make detailed comparisons between the two time points across all participants, which limits our ability to make specific claims about how participants’ researcher identity might develop over time. This limitation of varying times between the survey and interview is a direct result of using a mixed methods and grounded theory approach, along with the complexities that these designs introduce to participant recruitment.

Finally, as mentioned above, aspects of our research approach were based on grounded theory methods and were therefore emergent in nature. One implication of this approach is that the interview protocol evolved over time as we adapted it to the findings and needs of the work. This led to different data being collected across the participants as we further developed the interview protocol to explore specific aspects of students’ experiences. On the other hand, this emergent approach afforded us the opportunity to explore various ways of mixing our deductive and inductive data as we analyzed them, allowing us to address specific limitations associated with collecting only quantitative or qualitative data. While this emergent approach was critical to our study, it does impact and limit the interpretation of the results and our ability to make conclusions across all participants. For example, we did not ask all participants to describe the ends of the researcher identity scale as described above, which was used to inform the re-scaling of participants’ responses.

Mixed Methods Opportunities and Related Findings

Asking participants both on the survey and in the interview “Do you see yourself as a researcher?” (1 – No, not at all to 7 – Yes, very much) provided a platform for participants to discuss their researcher identity in a unique way and resulted in two key opportunities that uncovered unique findings. First, we were able to ask participants to compare how they saw themselves as researchers at two time points: at the time of the survey and at the time of the interview. Second, we were able to discuss the full breadth of participants’ conceptualizations of who a researcher is. We structured our research results around these two opportunities.

Opportunity 1: Scale provided a platform to talk about researcher identity

Thirteen of the twenty participants in our study continued conducting research in some capacity in the time between taking the survey and completing an interview (approximately three to twelve months). We expected that these participants would come to see themselves as more of a researcher over this period of time, responding with a higher score to the question “Do you see yourself as a researcher?” during the interview than on the survey. However, this was not the case in our study. Only three of the twenty participants reported a higher score in the interview relative to their survey response (see Table 2).

Table 2

Participant, Identity Score Reported from Survey, Interview, and Selected Demographic Data.

Participant | Survey Identity Score | Interview Identity Score (a) | Institution & Type | Years in College | Gender | Race/Ethnicity
Ari | 5.0 | 5.0 | 3, R1 | 4 | F | Hispanic
Aubrey | 5.0 | 5.25 (a) | 6, ES | 3 | F | White
Bay | 4.0 | 4.0 | 6, ES | 2 | F | White
Clay | 6.0 | 4.5 (a) | 4, M1 | 4 | F | South Asian
Dana | 6.0 | 4.5 (a) | 1, R1 | 3 | M | White
Eli | 5.0 | 1.5 (a) | 6, ES | 2 | F | White
Frances | 7.0 | 6.0 | 2, R1 | 4 | M | White
Hayden | 7.0 | 3.0 | 6, ES | 4 | F | White
Kelly | 6.0 | 5.5 | 2, R1 | 3 | F | White
Kennedy | 3.0 | 3.5 (a) | 2, R1 | 4 | F | White
Logan | 6.0 | 5.0 | 5, R1 | 5 | M | White
Max | 6.0 | 2.0 | 1, R1 | 2 | F | White
Pat | 6.0 | 6.0 | 1, R1 | 5 | M | White
Peyton | 6.0 | 5.0 | 2, R1 | 2 | M | East Asian
Reed | 6.0 | 1.0 | 3, R1 | 6 | F | Hispanic
Riley | 6.0 | 6.0 | 1, R1 | 3 | F | White
River | 6.0 | 4.0 | 4, M1 | 4 | F | White
Sage | 7.0 | 5.5 (a) | 1, R1 | 3 | F | White
Sam | 4.0 | 3.0 | 2, R1 | 4 | F | White
Taylor | 5.0 | 5.5 (a) | 4, M1 | 4 | F | South Asian

Note: Carnegie classification reported at time of publication: R1 – Very High Research Activity, M1 – Larger Program Master’s University, and ES – Special Focus Four-Year: Engineering Schools. Scores marked with an (a) are an average based on the range reported by the participant; for example, Aubrey reported she was between a 5 and 5.5. Aubrey, Kennedy, and Taylor had an increase in scores from their survey to interview. On the identity scale, a one corresponds to not at all feeling like a researcher and a seven corresponds to feeling very much like a researcher.
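To make explicit how the change values in Table 1 follow from the ratings in Table 2, the short sketch below computes each participant’s change score as the interview rating minus the survey rating and identifies the participants whose ratings increased. Python with pandas is simply our choice for illustration here, not a tool named in the study; the scores are those reported in Table 2.

```python
# Sketch: deriving the "Change from Survey to Interview" values in Table 1
# from the survey and interview ratings reported in Table 2.
import pandas as pd

scores = pd.DataFrame(
    [("Ari", 5.0, 5.0), ("Aubrey", 5.0, 5.25), ("Bay", 4.0, 4.0),
     ("Clay", 6.0, 4.5), ("Dana", 6.0, 4.5), ("Eli", 5.0, 1.5),
     ("Frances", 7.0, 6.0), ("Hayden", 7.0, 3.0), ("Kelly", 6.0, 5.5),
     ("Kennedy", 3.0, 3.5), ("Logan", 6.0, 5.0), ("Max", 6.0, 2.0),
     ("Pat", 6.0, 6.0), ("Peyton", 6.0, 5.0), ("Reed", 6.0, 1.0),
     ("Riley", 6.0, 6.0), ("River", 6.0, 4.0), ("Sage", 7.0, 5.5),
     ("Sam", 4.0, 3.0), ("Taylor", 5.0, 5.5)],
    columns=["participant", "survey", "interview"],
)

# Change score: interview rating minus survey rating (the final column of Table 1).
scores["change"] = scores["interview"] - scores["survey"]

# The participants whose ratings rose between survey and interview.
print(scores.loc[scores["change"] > 0, ["participant", "change"]])
# Aubrey (+0.25), Kennedy (+0.5), and Taylor (+0.5): the three increases
# referenced above; all other change values match Table 1.
```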

In the interviews, we asked follow-up questions to explore why participants’ ratings of their researcher identities decreased over time. We found that, for those still doing research, self-ratings decreased due to changes in their conceptualizations of research and to decisions that they did not want to pursue a career in research. It is important to note that, in general, the participants who reported lower researcher identity ratings during the interview did not necessarily feel less like researchers compared to when they completed the survey. These participants attributed changes in their researcher identity rating to a re-calibration of their researcher identity scale, which would not have been captured without a mixed methods approach.

In addition to providing this comparison across two time points, incorporating the item “Do you see yourself as a researcher?” in the interviews provided the opportunity to discuss the bounds of participants’ conceptualizations of researchers. Even among the interview participants who described seeing themselves as researchers, none rated themselves as a 7 during the interview, although three had responded with a 7 on the survey. In the interviews, we found that participants felt they were missing something that prevented them from rating themselves highly as a researcher. What was lacking varied across participants and included a publication, enough experience, or a higher degree (such as a Ph.D.). For example, Taylor explained that one could never be a true researcher because the nature of research meant that one would always be learning and working towards becoming a researcher.

I would probably say a six. I think it’s really hard to say that you’re ever at a full 100% researcher, but I think that’s also because my interpretation of it is there’s always room to learn and stuff and be exposed to other things. (Taylor)

In Frances’ case, the interviewer prompted her to reflect on the fact that she rated herself as a 7 on the survey but in her interview later stated that, as an undergraduate researcher, she could not attain a researcher identity rating of 7. Frances’ reflection provided additional insight to her view of researchers:

I think I may have just been viewing it a little bit differently in terms of what is a researcher. There were definitely a lot of ideas that I contributed to the studies themselves, but when I think about the overall goals of the study and where were those original ideas coming from, those were not mine. I think until you really have to be that person that starts from square one, that for me was what defines a seven where I think I started from kind of beginning middle. (Frances)

Frances’ change in her researcher identity rating was due to a change in her definition of a researcher and not how strongly she identifies as a researcher. She explains that while she contributed ideas to the research, the original idea for the research project was not hers, so she could not rate her researcher identity a 7. Likewise, earlier in the interview, she says that an undergraduate researcher cannot be a 7 “…within the confines of structure undergraduate research with academic grants and the bureaucracy of it…very few people come in at 18 years old knowing exactly what to do, how to get there.” Additionally, based on her current conceptualization of researchers (during her interview), she wanted to amend her earlier survey response for researcher identity to “beginning middle.”

Opportunity 2: Look across participants in a different way

Our use of mixed methods to interrogate the quantitative researcher identity item through qualitative interviews revealed that there were both inter- and intra-participant differences in how participants interpreted the researcher identity scale (see Table 1). As already described, some participants reported lower numbers for researcher identity ratings during the interview as compared to the survey (Tables 1 and 2) but explained that their researcher identity was not lower. Rather, they had a better understanding of what a researcher was, so the lower researcher identity number was a result of a shift in how the participant interpreted the researcher identity scale. We also observed differences in interpretation of the scale between participants. For example, both Sam and Eli felt that they were not researchers, and their qualitative descriptions of themselves as researchers also aligned with the composite description of the anchor for “1” on the researcher identity scale. However, Sam rated herself a 3 during the interview, while Eli rated herself between a 1 and 2.

In light of the qualitative differences between participants’ interpretations of the researcher identity scale, and to facilitate comparison of researcher identities between participants, we re-scaled participant researcher identities from the interview question according to their qualitative descriptions as described in the analysis section of this manuscript. Further investigation of interview data during this re-scaling revealed that participants fell into two general groups: those who referred to themselves as researchers, and those who stated that they were not researchers or did not want to have a career in research.

Within the “I am a researcher” group, we found that Pat and Dana aligned best with the “yes, very much” anchor definition. Pat was confident as both a research mentor and a mentee, and Dana felt that research was done whenever he asked and answered a question. Neither expressed reservations about their researcher identities.

Now that I do have proper engineering mentors and really starting to work with those, that just only accelerates it and you really feel like a researcher being mentored by somebody else and mentoring others, all to work towards a common goal of nobody’s done prior to. (Pat)

We scaled Riley, Logan, and Taylor lower than Pat and Dana because while they aligned with the high researcher identity definition, they had some reservations about rating themselves at the high anchor. Riley felt she would reach the anchor by obtaining a Ph.D., while Logan and Taylor felt they could never reach the anchor because there is always more to learn in research.

Based on what I know about research, though, I don’t think I could ever be above a six or a six and a half. I think becoming a researcher and being a researcher is a never-ending path. I think that you can never actually truly be a researcher, but instead a developing researcher. (Logan)

We scaled Clay, Aubrey, Frances, and Peyton lower because in contrast to Riley, Logan, and Taylor, they felt that they needed more experience, independence, or publications to rate themselves higher on the researcher identity scale.

I think now I would probably rate myself at a six, just because I wasn’t principal investigator. You have a direction to go, it’s mostly at the undergraduate research point about making it work. (Frances)

For the “I am not a researcher/don’t want it as a career” group, participants who reported confidence in their research skills were scaled highest, followed by participants who only felt like researchers when they were actively participating in a research experience. Those who felt that they had not contributed to research and had a low self-assessment of their research skills aligned best with the “no, not at all” researcher identity scale anchor and were placed at the bottom of the scale. Sage and Kelly were the most confident in their research skills in this subgroup but explained that they did not feel like researchers because they did not feel they could lead their own research.

Otherwise, I don’t know, maybe after the question that you just asked about if I feel like I’m kind of like guiding my research or … Which I’m not. It’s always underneath somebody. Which is fine and I’m learning a lot and getting a lot out of it. That’s where somebody has to start. But, if I could lead my own research, that, I could not do. I think that’s why it wouldn’t be a seven anymore. (Sage)

Bay felt that even though she had performed research practices during her research experience, she still did not feel like a researcher, so she was placed below Sage and Kelly.

Unlike Sage, Kelly, and Bay, whose researcher identity ratings changed little between the survey and interview, researcher identity ratings for Reed, Hayden, and Max went from the top of the researcher identity scale to the bottom between the survey and interview. Reed, Hayden, and Max rated themselves as 6, 7, and 6 respectively on the survey, and 1, 3, and 2 respectively during the interview. When asked why their researcher identity ratings changed, they explained that it was because they were not currently doing research.

Interviewer: Okay, so what number would you be on that scale?

Reed: I guess one, because I’m not doing research.

These drastic changes in researcher identity ratings indicate that the researcher identities of Reed, Hayden, and Max are thin and contextual: strong only when they are participating in a research experience. For this reason, we scaled Reed, Hayden, and Max below Bay.

Finally, Eli and Sam represent the “no, not at all” anchor of the researcher identity scale. Eli constructed her own research question and led her own research, both of which were cited as reasons for feeling like a researcher by participants who reported strong researcher identities. However, she still felt that because her research was not verified by a college professor, her work could not be considered research. Sam, on the other hand, felt little passion for research and reported that the more research she did, the less she felt like a researcher.

The people that I work with, all really enjoy the whole process of researching, of the research process. They enjoy doing data analysis. They enjoy writing papers and presenting. That’s what gets them going, where I’d just rather run the procedure and that’s it. (Sam)

Eli’s feeling that what she was doing was not research and Sam’s lack of passion for research were the reasons we scaled them to the low anchor of the researcher identity scale.

Defining the anchors of the quantitative researcher identity scale through the qualitative interview data allowed us to identify participants that best fit with the upper and lower anchors of the scale. We were able to re-scale participants based on each participant’s interpretation of the researcher identity scale by considering a combination of participants’ quantitative researcher identity self-rating and the qualitative description of their own researcher identities. Mixing the quantitative and qualitative data analysis afforded us the opportunity to make a deeper comparison between participants.

Discussion of Implications

Mixed method approaches to studying identity can provide insights that are not achievable with quantitative or qualitative techniques alone. We have framed the discussion in terms of the implications of using mixed methods to study identity and the unique opportunities that this approach afforded our work. Specifically, we discuss the inter- and intra-participant insights we gained through our research.

First, when conducting identity research, researchers need to consider individual students’ conceptualization of the identity being measured. We operationalize this as inter-participant differences. These differences exist between the participants themselves (i.e., different participants view and define identities differently). For example, in our study we found that each participant had a slightly different definition of a researcher. These differences would not have come to light if we had only used a quantitative instrument, even a multi-item one, as many of these instruments still rely on participants’ self-reflections and do not help researchers understand students’ definitions of the items (e.g., Choe et al., 2017). We would not have known that a 7 to Eli meant being a professor and doing research as a career, and a 7 to Sage meant guiding her own research. Similar inter-participant differences have been found in large scale survey studies where students interpret the meaning of questions differently, which raises concerns about the validity of the measure (e.g., Bennett & Kane, 2014). The revised anchors based on composites of participants’ definitions of a researcher allowed us to create a more accurate quantitative scale with anchors that are more specific and appropriate for this population, despite it being only a single item. There are also differences between the participants’ views of researcher identity and our own views as the researchers conducting this work. For example, our research team would have defined a 7 as collecting data, analyzing data, and disseminating research results, which is different than both Eli’s and Sage’s definitions of a 7. A mixed method approach allowed us to uncover these definitions and better understand the range of students’ views in the data set. It also allowed us as researchers to be cognizant of our own views on researcher identity and the potential bias our views could bring to the work.

Second, researchers need to recognize that a participant’s conceptualization of an identity can change over time regardless of the roles they hold (Roth & Tobin, 2007; Shanahan, 2009). Therefore, even though an individual’s survey score may be the same on a pre- and post-test regardless of the number of items, a change could have occurred related to their overall understanding of the identity or reconstruction of their identity based on contexts or social interactions (Elmesky & Seiler, 2007). For example, Riley defined a researcher as “someone who is highly engaged in their academic discipline and stays up to date on current successes, and constantly challenges previous ideas to come up with better solutions” on her survey, but approximately one year later, Riley defined it as “you’re definitely a researcher by the time you’ve had your PhD.” We categorize these changes as intra-participant differences, and again, uncovering these changes was only possible with both quantitative and qualitative data. This time-based implication is of particular concern for researchers using longitudinal or cross-sectional quantitative study designs where development is anticipated. In those particular types of studies, we strongly recommend that researchers integrate qualitative methods to further explore perceptions of the identity items at each instance of exploration.

The quantitative anchored scale item gave us a concrete point to elicit discussion, while the qualitative data provided details and deeper understanding. After extensive discussion within our research team and with our project advisory board, we realized that intra-participant differences also relate to the concept of thickening and thinning identities, which has been used to characterize ethnic, racial, and national identities (Smith & Sparkes, 2008). Over time, based on the activities a person participates in (i.e., the roles they fulfill), the contexts in which they fulfill those roles, and the social interactions within those contexts, a person can change their views on and construction of their own identities. If an identity is central to who the person is and their experiences strengthen that view, the identity is considered thick. If an identity is tangential to who the person is and their experiences do not support it, the identity is considered thin. Similar results have been found related to teacher leaders, whose identities varied over time as a result of experiences (Wenner & Campbell, 2018). Additionally, this finding relates to the ideas of a core identity (i.e., the aspects of self that are relatively stable over time), context (i.e., the social dimensions of surroundings that influence the self), and meaning making capacity (i.e., the filter through which influences do or do not affect the self) (Abes et al., 2007). For our participants, their meaning making capacity as it relates to an undergraduate research experience impacted each student uniquely. Additionally, this finding could be reframed using role identity theory, where the intersection of roles held, behaviors, and social interactions within context contributes to development (Stets & Burke, 2000). Our results indicate that changes in how students rate their researcher identity on an anchored scale can reflect the fluidity of someone’s researcher identity, but a number alone does not provide the whole story, as is the case with Riley, whose definition changed but whose score did not.

These differences contribute to our main takeaway: there are limitations to the interpretations researchers can make of participant data, regardless of whether it is quantitative or qualitative, single- or multi-item. Incorporating mixed methods into a study can be a way to offset some of these drawbacks and provide a more robust understanding of participants’ experiences and perspectives. There is a need for more mixed method studies that help us understand the inter- and intra-participant differences that exist in identity research.

Future Work

While our fully integrated mixed method approach has led to a deeper understanding of undergraduate engineering students’ researcher identities, there is much potential in using a similar approach to study the many role-based identities that students hold and the intersections between those identities. For example, through our work we gained initial insights into the intersection between students’ engineering and researcher identities, and we started to investigate identity development over time during role transitions.

While our interviews focused on students’ researcher identities and their undergraduate research experiences, our participants were all engineering students. Consequently, for some participants, their engineering identities became a point of discussion. For example, Max identified more as an engineer than a researcher, saying “Engineering feels … it’s more problem solving and results focused. With research you do have problem solving and you do have results focus, but it’s less towards that end of it and more towards the how’s and why’s.” During our research, we did not systematically explore this identity or its connection to researcher identity for each participant. In future work, we recommend that other identities, and the intersections between them, be explored for a deeper developmental understanding of the student experience. Fully integrated mixed methods will be a useful way to examine multiple role identities in that participants can define the scales for multiple identities and the intersection points between them. Work such as this can also leave room for other identities beyond that of a researcher or engineer, which may have profound impacts on students’ experiences in engineering, particularly for diverse types of students.

Another area of exploration may be macro role transitions (Ashforth, 2000). Micro role transitions are the ones we experience day to day. For example, our participants were researchers in the lab and then students in the classroom. Macro role transitions are larger transitions such as graduating and transitioning from undergraduate to graduate student. Two participants were graduate students at the time of their interviews, so they were excluded from our dataset (our study focused on undergraduate students). However, they did give us some insight into the macro role transition from undergraduate to graduate student and the impacts of that transition on researcher identity. For example, Jesse felt less confident in his research when he compared himself to other graduate students in his graduate program because his undergraduate curriculum focused more on application than mathematical theory: “It was very difficult for me who had been educated more on application and physical understanding to try to compete with people who are incredibly mathematically minded.” As he continued in his graduate program, Jesse found his undergraduate education afforded him strengths that his colleagues did not have: “I could present the theory in ways that my colleagues simply could not. That’s what gave me confidence as a graduate student and maybe as a researcher…” These transitions merit further investigation into the impacts of identity on student trajectories.

Finally, in addition to potential future work related to identity, we see future work related to mixed methods design and implementation. We used a fully integrated approach with grounded theory, but future work may investigate the impacts of mixed method research on identity when different designs or frameworks are used. For example, an exploratory sequential approach (Creswell & Plano Clark, 2018), in which interviews precede a survey, could allow us to gain better insights into identity differences across institutions and institution types and to include broader, more diverse populations. The timing between participant responses to surveys and interviews was partially shaped by our grounded theory approach. Future studies may remove this limitation and gain valuable insight by pairing our anchored-item approach with a different methodological approach. Impacts such as these are beyond the scope of this study but would provide fruitful avenues for future work.

Conclusions

Identity is complex and merits a methodological approach that can uncover those complexities. Our fully integrated mixed method approach used a combination of closed-ended and open-ended items to explore students’ identity on a survey and during an interview, which were conducted between three and twelve months apart. This approach provided two key opportunities: (1) we were able to ask participants to compare how they saw themselves as researchers at two time points, and (2) we were able to capture the full breadth of participants’ conceptualizations of who a researcher is. As a result, we gained a more complete understanding of participants’ researcher identities, how those identities can change, and how participants’ quantitative responses compare between the survey and the interview. Our work highlights the importance of considering students’ conceptualization of the identity that is being measured, how students’ conceptualizations of that identity may alter their view of a rating scale over time, and how the method of data collection (e.g., interview, survey) may impact participants’ responses.

Acknowledgements

This material is based upon work supported by the National Science Foundation under Grant Nos. 1531607 and 1531641. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation. Additionally, the authors would like to thank Penelope Vargas, who helped conceptualize the initial project and start the research, along with the various student researchers who assisted with this project along the way: Anne McAlister, Teresa Porter, Karina Sobieraj, Cazembe Kennedy, Alessandra St. Germain, and Guoyong Wu. Without your hard work and dedication this research would not have been possible. Also, we would like to thank our external evaluator, Elizabeth Creamer, and advisory board members, Devlin Montfort, Denise Simmons, and Marie-Claire Shanahan, who guided our thinking and provided valuable insights and contributions as this research and manuscript evolved. Finally, thank you to our student research participants. We truly appreciate you sharing your experiences with us and allowing us to learn from your stories.

Competing Interests

Rachel Kajfez serves on the editorial board for Studies in Engineering Education on a voluntary basis. All other authors have no competing interests.

Author Contributions

There are six authors on this manuscript, all of whom have met the authorship criteria outlined by the journal. Rachel Kajfez led the writing of the manuscript, contributing to the introduction and discussion sections. She also assisted with all phases of the research. Dennis Lee assisted with data collection, led the analysis, and wrote the results section. Katherine Ehlert led data collection, assisted with analysis, and wrote the background sections. Courtney Faber assisted with all phases of the research and wrote the limitations and conclusions sections. Finally, Lisa Benson and Marian Kennedy assisted with all phases of the research and provided critical feedback on all sections of the document, contributing to the intellectual content.

References

  1. Abes, E. S., Jones, S. R., & McEwen, M. K. (2007). Reconceptualizing the model of multiple dimensions of identity: The role of meaning-making capacity in the construction of multiple identities. Journal of College Student Development, 48(1), 1–22. DOI: https://doi.org/10.1353/csd.2007.0000 

  2. Ashforth, B. (2000). Role transitions in organizational life: An identity-based perspective. Routledge. DOI: https://doi.org/10.4324/9781410600035 

  3. Atman, C. J., Sheppard, S. D., Turns, J., Adams, R. S., Fleming, L. N., Stevens, R., … Leifer, L. J. (2010). Enabling engineering student success: The final report for the Center for the Advancement of Engineering Education. CAEE-TR-10-02. Center for the Advancement of Engineering Education (NJ1). ERIC. https://files.eric.ed.gov/fulltext/ED540123.pdf 

  4. Bennett, R., & Kane, S. (2014). Students’ interpretations of the meanings of questionnaire items in the National Student Survey. Quality in Higher Education, 20(2), 129–164. DOI: https://doi.org/10.1080/13538322.2014.924786 

  5. Benson, L. C., Kennedy, M. S., Ehlert, K. M., Vargas, P. M. D., Faber, C. J., Kajfez, R. L., & McAlister, A. M. (2016). Understanding undergraduate engineering researchers and how they learn. 2016 IEEE Frontiers in Education. DOI: https://doi.org/10.1109/FIE.2016.7757630 

  6. Borrego, M., Patrick, A., Martins, L., & Kendall, M. (2018). A new scale for measuring engineering identity in undergraduates. 2018 ASEE Gulf-Southwest Section Annual Meeting. Austin, TX. https://peer.asee.org/31558 

  7. Carlone, H. B., & Johnson, A. (2007). Understanding the science experiences of successful women of color: Science identity as an analytic lens. Journal of Research in Science Teaching, 44(8), 1187–1218. DOI: https://doi.org/10.1002/tea.20237 

  8. Cass, C. A. P., Hazari, Z., Cribbs, J., Sadler, P. M., & Sonnert, G. (2011). Examining the impact of mathematics identity on the choice of engineering careers for male and female students. 2011 IEEE Frontiers in Education Conference. Rapid City, SD. DOI: https://doi.org/10.1109/FIE.2011.6142881 

  9. Chachra, D., Kilgore, D., Loshbaugh, H., McCain, J., & Chen, H. (2008). Being and becoming: Gender and identity formation of engineering students. 2008 ASEE Annual Conference & Exposition, Pittsburgh, PA. DOI: https://doi.org/10.18260/1-2--3597 

  10. Choe, N. H., Borrego, M. J., Martins, L. L., Patrick, A. D., & Seepersad, C. C. (2017). A quantitative pilot study of engineering graduate student identity. 2017 ASEE Annual Conference & Exposition, Columbus, OH. DOI: https://doi.org/10.18260/1-2--27502 

  11. Creamer, E. G. (2018a). An introduction to fully integrated mixed methods research. Thousand Oaks, CA: SAGE Publications, Inc. DOI: https://doi.org/10.4135/9781071802823 

  12. Creamer, E. G. (2018b). Paradigms in play: Using case studies to explore the value-added of divergent findings in mixed methods research. International Journal of Multiple Research Approaches, 10(1), 30–40. DOI: https://doi.org/10.29034/ijmra.v10n1a2 

  13. Creswell, J. W., & Plano Clark, V. L. (2018). Designing and conducting mixed methods research (3rd ed.). Thousand Oaks, CA: SAGE Publications, Inc. 

  14. Curtis, N. A., Pierrakos, O., & Anderson, R. D. (2017). The engineering student identity scale: A structural validity evidence study. 2017 ASEE Annual Conference & Exposition, Columbus, OH. DOI: https://doi.org/10.18260/1-2--28964 

  15. Ehlert, K. M., Lee, D. M., Faber, C. J., Benson, L. C., & Kennedy, M. S. (2017). Utilizing cluster analysis of close-ended survey responses to select participants for qualitative data collection. 2017 ASEE Annual Conference & Exposition, Columbus, OH. DOI: https://doi.org/10.18260/1-2--29099 

  16. Elmesky, R., & Seiler, G. (2007). Movement expressiveness, solidarity and the (re)shaping of African American students’ scientific identities. Cultural Studies of Science Education, 2(1), 73–103. DOI: https://doi.org/10.1007/s11422-007-9050-4 

  17. Faber, C., & Benson, L. (2015). Undergraduate students’ recognition and development as researchers. 2015 ASEE Annual Conference & Exposition, Seattle, WA. DOI: https://doi.org/10.18260/p.24953 

  18. Faber, C. J., Benson, L. C., Kajfez, R. L., Kennedy, M. S., Lee, D. M., McAlister, A. M., … Wu, G. (2019). Dynamics of researcher identity and epistemology: The development of a grounded-theory model. 2019 ASEE Annual Conference & Exposition, Tampa, FL. DOI: https://doi.org/10.18260/1-2--32358 

  19. Faber, C. J., Kajfez, R. L., Benson, L. C., Lee, D. M., Kennedy, M. S., & Creamer, E. G. (n.d.). The dynamics of researcher identity and epistemic thinking within undergraduate research experiences: A grounded theory model. 

  20. Faber, C. J., Kajfez, R. L., McAlister, A. M., Ehlert, K. M., Lee, D. M., Kennedy, M. S., & Benson, L. C. (2020). Undergraduate engineering students’ perceptions of research and researchers. Journal of Engineering Education, 109(4), 780–800. DOI: https://doi.org/10.1002/jee.20359 

  21. Fleming, L. N., & Smith, K. C. (2013). Engineering identity of Black and Hispanic undergraduates: The impact of minority serving institutions. 2013 ASEE Annual Conference & Exposition, Atlanta, GA. DOI: https://doi.org/10.18260/1-2--19524 

  22. Godwin, A. (2016). The development of a measure of engineering identity. 2016 ASEE Annual Conference & Exposition, New Orleans, LA. DOI: https://doi.org/10.18260/p.26122 

  23. Godwin, A., Potvin, G., Hazari, Z., & Lock, R. (2016). Identity, critical agency, and engineering: An affective model for predicting engineering as a career choice. Journal of Engineering Education, 105(2), 312–340. DOI: https://doi.org/10.1002/jee.20118 

  24. Hazari, Z., Sonnert, G., Sadler, P. M., & Shanahan, M.-C. (2010). Connecting high school physics experiences, outcome expectations, physics identity, and physics career choice: A gender study. Journal of Research in Science Teaching, 47(8), 978–1003. DOI: https://doi.org/10.1002/tea.20363 

  25. Hunter, A., Laursen, S. L., & Seymour, E. (2007). Becoming a scientist: The role of undergraduate research in students’ cognitive, personal, and professional development. Science Education, 91(1), 36–74. DOI: https://doi.org/10.1002/sce.20173 

  26. Indiana University School of Education Center for Postsecondary Research. (2015). The Carnegie classification of institutions of higher education. Retrieved from https://carnegieclassifications.iu.edu/index.php 

  27. Lee, D. M., McAlister, A. M., Ehlert, K. M., Faber, C. J., Kajfez, R. L., Creamer, E., & Kennedy, M. (2019). Enhancing research quality through analytical memo writing in a mixed methods grounded theory study implemented by a multi-institution research team. 2019 IEEE Frontiers in Education Conference. Covington, KY. DOI: https://doi.org/10.1109/FIE43999.2019.9028469 

  28. Li, Q., Swaminathan, H., & Tang, J. (2009). Development of a classification system for engineering student characteristics affecting college enrollment and retention. Journal of Engineering Education, 98(4), 361–376. DOI: https://doi.org/10.1002/j.2168-9830.2009.tb01033.x 

  29. Litchfield, K., & Javernick-Will, A. (2015). “I am an engineer AND”: A mixed methods study of socially engaged engineers. Journal of Engineering Education, 104(4), 393–416. DOI: https://doi.org/10.1002/jee.20102 

  30. Matusovich, H. M., Streveler, R. A., & Miller, R. L. (2010). Why do students choose engineering? A qualitative, longitudinal investigation of students’ motivational values. Journal of Engineering Education, 99(4), 289–303. DOI: https://doi.org/10.1002/j.2168-9830.2010.tb01064.x 

  31. Meyers, K. L., Ohland, M. W., Pawley, A. L., Silliman, S. E., & Smith, K. A. (2012). Factors relating to engineering identity. Global Journal of Engineering Education, 14(1), 119–131. http://www.wiete.com.au/journals/GJEE/Publish/vol14no1/16-Myers-K.pdf 

  32. Nzabonimpa, J. P. (2018). Quantitizing and qualitizing (im-)possibilities in mixed methods research. Methodological Innovations, 11(2), 1–16. DOI: https://doi.org/10.1177/2059799118789021 

  33. Patrick, A. D., & Borrego, M. (2016). A review of the literature relevant to engineering identity. 2016 ASEE Annual Conference & Exposition, New Orleans, LA. DOI: https://doi.org/10.18260/p.26428 

  34. Pierrakos, O., Curtis, N. A., & Anderson, R. D. (2016). How salient is the identity of engineering students? On the use of the Engineering Student Identity Survey. 2016 IEEE Frontiers in Education Conference. Erie, PA. DOI: https://doi.org/10.1109/FIE.2016.7757709 

  35. Rodriguez, S. L., Lu, C., & Bartlett, M. (2018). Engineering identity development: A review of the higher education literature. International Journal of Education in Mathematics, Science and Technology, 6(3), 254–265. DOI: https://doi.org/10.18404/ijemst.428182 

  36. Roth, W.-M., & Tobin, K. (2007). Aporias of identity in science. In Science, learning, identity (pp. 1–10). Brill | Sense. DOI: https://doi.org/10.1163/9789087901264_002 

  37. Sandelowski, M. (2000). Combining qualitative and quantitative sampling, data collection, and analysis techniques in mixed-method studies. Research in Nursing & Health, 23(3), 246–255. DOI: https://doi.org/10.1002/1098-240X(200006)23:3<246::AID-NUR9>3.0.CO;2-H 

  38. Seymour, E., Hunter, A., Laursen, S. L., & DeAntoni, T. (2004). Establishing the benefits of research experiences for undergraduates in the sciences: First findings from a three-year study. Science Education, 88(4), 493–534. DOI: https://doi.org/10.1002/sce.10131 

  39. Shanahan, M. (2009). Identity in science learning: Exploring the attention given to agency and structure in studies of identity. Studies in Science Education, 45(1), 43–64. DOI: https://doi.org/10.1080/03057260802681847 

  40. Smith, B., & Sparkes, A. C. (2008). Contrasting perspectives on narrating selves and identities: An invitation to dialogue. Qualitative Research, 8(1), 5–35. DOI: https://doi.org/10.1177/1468794107085221 

  41. Stets, J. E., & Burke, P. J. (2000). Identity theory and social identity theory. Social Psychology Quarterly, 63(3), 224. DOI: https://doi.org/10.2307/2695870 

  42. Stevens, R., O’Connor, K., Garrison, L., Jocuns, A., & Amos, D. M. (2008). Becoming an engineer: Toward a three-dimensional view of engineering learning. Journal of Engineering Education, 97(3), 355–368. DOI: https://doi.org/10.1002/j.2168-9830.2008.tb00984.x 

  43. Stryker, S., & Burke, P. J. (2000). The past, present, and future of an identity theory. Social Psychology Quarterly, 63(4), 284–297. DOI: https://doi.org/10.2307/2695840 

  44. Tonso, K. L. (2007). On the outskirts of engineering: Learning identity, gender, and power via engineering practice. Brill | Sense. DOI: https://doi.org/10.1163/9789087903534 

  45. Wenner, J. A., & Campbell, T. (2018). Thick and thin: Variations in teacher leader identity. International Journal of Teacher Leadership, 9(2), 5–21. https://files.eric.ed.gov/fulltext/EJ1202333.pdf 
