
Literature Reviews

A Narrative Review of Design-Based Research in Engineering Education: Opportunities and Challenges

Authors:

Angela Minichiello,

Department of Engineering Education, Utah State University, US

Lori Caldwell

Department of Engineering Education, Utah State University, US

Abstract

Background: Active engagement across a range of methodological frameworks is one hallmark of thriving scholarly disciplines. Design-based research is one newer approach to education research that holds promise for developing effective interventions that are iteratively theorized, designed, and tested within local engineering education contexts.

Purpose: To promote engagement with diverse research frameworks, the purpose of this narrative literature review was to identify, describe, and critically examine emerging use of design-based research in engineering education. We addressed research questions focused on characterizing the use of design-based research in engineering education in terms of the a) problems studied, b) interventions designed, c) participant populations and learning contexts, d) research methods employed, e) form(s) of the research findings, and f) limitations of the literature. Furthermore, this work identified current opportunities and challenges of design-based research for the field of engineering education through analysis of review findings in light of the authors’ experiences conducting design-based research in engineering education.

Scope/Method: Using established review procedures that included specified database search terms and inclusion criteria, we identified 24 empirical design-based research studies in engineering education. We used qualitative content analysis to code study characteristics including nationality, participant population, research methods, and learning context. We then synthesized and critiqued findings across studies.

Conclusions: In synthesizing key aspects of empirical design-based research studies in engineering education, this review provides insights into the ways design-based research is being implemented to advance engineering education imperatives and provides a foundation for expanding and strengthening use of design-based research in future work in engineering education. Opportunities of design-based research for engineering education include developing local improvements to the field’s most persistent and vexing issues (i.e., “wicked” problems) and realizing the full potential of technology for 21st century engineering education. Challenges include developing interdisciplinary teams, securing expertise across multiple research approaches and methods, funding emergent DBR projects, and disseminating DBR results across the project lifespan.

How to Cite: Minichiello, A., & Caldwell, L. (2021). A Narrative Review of Design-Based Research in Engineering Education: Opportunities and Challenges. Studies in Engineering Education, 1(2), 31–54. DOI: http://doi.org/10.21061/see.15
Submitted on 12 Jan 2020; Accepted on 17 Nov 2020; Published on 09 Feb 2021

Simply said, methodology matters. Broadly defined, research methodology comprises the theoretical rationales and frameworks that simultaneously link research methods—specific actions to be conducted, tools to be used, and procedures to be followed—to the ends (i.e., intended outcomes as framed by puzzles, questions, and/or hypotheses) and means (i.e., theoretical underpinnings including ontological and epistemological perspectives) of a research study (Crotty, 1998, p. 3). Clear and deliberate engagement across a range of methodological approaches is often recognized as a hallmark of thriving scholarly disciplines. Explicit exposition of methodological decisions within research studies grounds research in accepted practices for ensuring its quality and signals to the research community how to interpret its findings. Engagement with diverse methodological options within research fields grows capacity to identify, examine, realize, and deliver effective solutions to complex and nuanced issues that persist within fields themselves and/or linger at boundaries shared with other disciplines.

The increasing richness and variety of paradigms, perspectives, strategies, and methods represented within the engineering education research (EER) literature of the past decade (2009–2019) suggest that—despite engineering’s deep post/positivist and quantitative roots—EER is shifting toward the opportunities that diverse methodologies offer. This shift also signals a growing understanding within and across the field that engineering education’s most vexing issues, including declining interest across all levels of education, historically and systematically embedded processes of marginalization and underrepresentation, and a seemingly insurmountable education research-to-practice divide, require new ways to conduct and disseminate research so as to impact instructional practice and transform the engineering education system. We (the authors) contend that design-based research (DBR) is one such approach that holds promise for realizing effective solutions that—by design—traverse the research-to-practice divide; this contention provided us with the impetus for conducting a narrative literature review of DBR in engineering education.

Background

What is Design-Based Research?

Arthur Bakker (2019, p. 1, Chapter I, emphasis in original) differentiates DBR from other educational research approaches saying, “Most educational research describes or evaluates education as it currently is. Some educational research analyzes education as it was. Design [-based] research, however, is about education as it could be….” DBR is an approach to educational research wherein the creative and generative processes of design are intimately “intertwined” with the research process: “The design is research-based and the research is design-based” (Bakker, 2019, p. 2, Chapter I). In other words, DBR researchers develop practice-ready solutions to educational problems through the design, realization, implementation, evaluation, and re-design of interventions within authentic learning contexts and in collaboration with key stakeholders (e.g., students, instructors, administrators, and researchers). As a result, DBR researchers design and realize new learning environments (e.g., materials, processes, tools, technologies) as they develop theory and test and evaluate intervention outcomes (Bakker, 2019).

This interweaving of design with research strikes us as being potentially noteworthy for the field of engineering education, wherein researchers may often have prior experience and/or interest in the process of design. Kelly (2014, p. 497) writes that DBR “draws on engineering practices for some of its key values and approaches,” citing as an example that the “inspiration for one of the early stage models for design-based research in education (Bannan-Ritland, 2003) was proposed by Woodie Flowers, an MIT engineer….” Within the DBR community, design includes both concrete aspects of the design of objects (e.g., new learning environments, tools, and technologies) as well as more “abstract” and “process-oriented” design aims (e.g., developing sequences of learning activities or procedures for communication and interaction) (Bakker, 2019, pp. 2–3, Chapter I).

Origins of Design-Based Research

Historically, the growth of DBR has benefited from the efforts of multiple pioneers, including education researchers focused on curriculum development in the Netherlands (where DBR was known as developmental research) and cognitive psychologists attempting to mitigate the limitations of controlled experiments in education research in the United States (where DBR was known as design experiments) (see Bakker, 2019; Cobb et al., 2017). Because the origins of DBR are most frequently traced to two seminal articles (Brown, 1992; A. Collins, 1992) from the field of cognitive psychology, DBR is commonly referred to as design experiments. However, DBR is internationally recognized by a variety of labels (i.e., development/developmental research, design experiments/experimentation, education design research, and formative experiments) and has only recently grown into its new name (Bakker, 2019, p. 2, Chapter I). In 2003, a group of faculty and researchers committed to theorizing and practicing design-based research in education emerged (The Design-Based Research Collective, 2003) under the name design-based research proposed by Hoadley (2002); DBR has since become a widely used umbrella term to designate research approaches that blend the design of educational interventions with educational and learning research.

What Design-Based Research Is Not

In light of the many alternative labels for DBR, it is important to clarify what DBR is and what DBR is not. Across the literature, several characteristics are recognized as essential to DBR studies (The Design-Based Research Collective, 2003, p. 5): a) the goals of designing interventions and developing educational theories about those interventions are intertwined; b) development and research processes are enacted within continuous cycles of design, implementation, analysis, and redesign; c) research on designs leads to educational or learning theories (or proto-theories) that convey the implications of the research to educational practitioners and designers; d) research explains or describes how designs work within authentic contexts with actual learners; and e) research employs the methods that are appropriate and necessary for connecting learners’ design enactment in context to their educational or learning outcomes. Thus, at its core, DBR is the design of educational interventions and the development of theoretical knowledge about these interventions that, collectively, result in positive and sustainable change in authentic learning environments.

To ensure the change is both positive and effective, DBR is conducted with educational practitioners and stakeholders from within the context of interest, as well as in collaboration with researchers from other disciplines (Cobb et al., 2003). These intimate collaborations with and between practitioners and researchers are an important way that DBR differs from other types of intervention research (see Levy & Begeny, 2018; Rothman & Thomas, 1994) used to evaluate an intervention’s effects via statistical and logical inferences and a limited number of independent variables (Salkind, 2010). DBR also differs from other design-oriented research approaches, such as research through design (see Edelson, 2002; Gaver, 2012), wherein knowledge claims and findings are developed directly from designerly insights and thinking processes that occur during, or as a result of, the realization of prototypes (Stappers & Giaccardi, 2002). Last, due to its commitment to improving locally relevant educational problems in direct collaboration with local stakeholders, DBR is often linked to action research. However, while commitments to realizing local change are similar among these approaches, there are also distinct differences: action research requires researchers to be participants in the research endeavor, and DBR studies aim to develop theory by iteratively improving the design of a solution to an educational problem, rather than to develop collective action to improve an undesirable situation or problem as in action research (Bakker, 2019, p. 8, Chapter I).

Method, Methodology, Approach, or Paradigm?

Within the literature, DBR is variously described as a research method, methodology, approach, and paradigm. Bakker (2019, p. 13, Chapter I) suggests that part of the obscurity that surrounds DBR results from two causes: a) the lack of “clear-cut categories” in social science research wherein research methods (i.e., techniques) and approaches (i.e., strategies) can be neatly organized, and b) the fact that researchers often conflate the terms research methodology—the science (i.e., the why) behind research methods (i.e., the how)—and research approach. Others (see Kelly, 2004; Sandoval, 2014) suggest that, rather than being classifiable as a single or unique research methodology or approach, DBR may instead be a methodological framework—a “genre of flexibly [used] … existing research approaches for the purposes of gaining design based insights and research-based designs” (Bakker, 2019, p. 3, Chapter I).

DBR’s obscure nature further contributes to its larger critique within the field of education: how can DBR’s “dual commitment to improving educational practice and furthering our understanding of learning processes” practically be accomplished (Sandoval, 2014, p. 21)? In other words, what are the steps that DBR researchers use to accomplish context-based design innovation and theory building at the same time (Phillips & Dolle, 2006)? Kelly (2004) breaks this critique down into two “problems” for DBR: the “Problem of Demarcation”—is DBR able to present an “argumentative grammar” (p. 118) that can “differentiate scientific claims from those of pseudoscience” for a generalized population (p. 119)?—and the “Problem of Meaningfulness”—is DBR able to be “hypothesis and framework generating” and thus “contributing to model [i.e., theory] formulation,” if only at a local level (p. 122)? Kelly (2004, p. 122) argued that, while the direct contribution of DBR to the generalization of educational interventions may be limited to influencing the thinking of researchers within similar contexts, DBR’s substantial and important contributions to the building of educational theory precede and are foundational to generalizable contributions. Said another way, the contextualized theoretical insights that are uniquely provided by DBR studies can subsequently be examined through more scientific (i.e., quasi-experimental) studies in order to produce generalizable findings from DBR work.

Alternatively, Sandoval (2014, pp. 22–23) proposed a technique called conjecture mapping to provide the argumentative grammar necessary for conceptualizing and undertaking DBR studies in ways that address both problems. As Sandoval (2014) explains, conjecture mapping is a way of explicitly representing and describing the relationships, in a process map-like form, between the design and theory-based elements of DBR research. Sandoval (2014) suggests that conjecture mapping is one approach for documenting and describing DBR processes systematically and explicitly in order to produce effective interventions and communicate useful, if not generalizable, design principles and theories on learning to practitioners and other researchers.

Purpose

To promote use of diverse research approaches within the EER community, the purpose of this narrative literature review was to identify and critically examine emerging use of DBR within EER. Specifically, this review was guided by the following research question and sub-questions a.–f.:

  1. How has DBR been implemented within empirical studies in EER?
    a. What engineering education problems have been studied?
    b. What types of interventions (e.g., frameworks, strategies, environments, tools, policies) have been designed?
    c. Which populations (i.e., demographics, ages, grade levels) and learning environments (i.e., formal, informal, face-to-face, blended, online) have been studied?
    d. What research methods have been used?
    e. What form do the findings take (design, theoretical, both)?
    f. What are the limitations of this body of literature?

Researcher Positionality

As authors, we identify as white, cisgender female engineering education researchers (one has earned and one is working toward a doctoral degree in engineering education), engineering educators (we are experienced and involved in teaching engineering topics to undergraduate or K–12 students), and engineers (we have each earned master’s degrees and participated in engineering research in our respective disciplines; one of us is a registered professional engineer with over 15 years working as an engineer in industry). Based on our identities, we acknowledge that each of us brings distinct “insider status” (Hesse-Biber, 2014), as well as fundamental knowledge of engineering design practices, to this research.

Along with our engineering identities, we acknowledge that we are (both) currently involved in a multi-year, interdisciplinary DBR in EER project that served as the impetus to conduct this literature review. We decided to undertake this review not only to map the current state of DBR in EER and provide recommendations for the field, but also to look more deeply into the literature as a whole to (perhaps) find some “hidden meaning.” We began this study with the hope of developing new understandings of the nuances of DBR—those not necessarily congruent with our experiences in engineering design practice, and (perhaps) even uncovering bits of guidance to help us overcome some of the challenges we face in our own DBR study. To this end, we jointly decided not to include our own DBR articles in this review. Rather, we chose to set aside—as best we could—our personal DBR experiences as we conducted the review (i.e., research question one). Then, we revisited the findings using our own DBR experiences as a lens through which to interpret them from the perspective of engineering education researchers who approach DBR with engineering-related identities.

Methodology

This narrative literature review provides a comprehensive and critical examination of the emerging use of DBR as a research approach in EER. Although the advantages of conducting systematic style literature reviews are currently highlighted within the EER literature (Borrego et al., 2014, 2015), our review takes the form of a narrative overview style literature review. Narrative overviews, known within the field of medicine as unsystematic (Oxman et al., 1994) or historical (J. A. Collins & Fauser, 2005) reviews, are comprehensive, descriptive syntheses of available published information on a topic of (potentially wide) interest that are written in an inviting and readable narrative form (Green et al., 2006). As J. A. Collins and Fauser (2005) describe, nascent and/or interdisciplinary research topics are especially suited to the wide vantage provided by narrative style reviews. For such topics, the narrow focus and prescriptive methods that define systematic style reviews can become weaknesses, erasing understanding of historical development provided by narrative threads and limiting the types and perspectives of sources examined. Narrative style reviews, therefore, offer a unique set of advantages that make them better suited for certain purposes, such as educating readers on the origins and historical development of emerging ideas and concepts and for provoking dialog and scholarly debate about new ways of doing and thinking about research through philosophically-minded critique (Green et al., 2006).

For this study, we chose a narrative review approach for several reasons. DBR first emerged within the education and learning sciences literature nearly 30 years ago. More recently (i.e., circa 2005), DBR has started to appear within the engineering education literature. The current DBR literature in engineering education varies in terms of publication scope, types of disciplinary expertise involved, implementation of the DBR approach and associated methodologies, choice and use of methods, and reporting and presentation of study findings. Thus, the nascent and interdisciplinary nature of the DBR in EER literature, as well as a lack of established exemplars, guidelines, standards, or formats within the field that describe how findings from DBR work should be reported within the literature, led us to select the narrative overview as the appropriate methodology for this review.

Methods

Initially, we conducted preliminary searches of several online databases, including ERIC, SCOPUS, and Google Scholar, to locate the relevant literature. We used these initial searches to define the scope of the published DBR in EER literature, to refine the topic, and to develop the research questions to be addressed during the review. After completing preliminary database searches, we developed five inclusion criteria to guide our source selection and reduce self-selection bias:

  1. The work is a peer-reviewed and published journal article, not including conference papers or dissertations.
  2. The work is available in full text. The decision to include only full-text studies reflected our desire to read complete sources (and not just abstracts) in order to better ensure accuracy in the analysis and reporting.
  3. The work is published in English. The decision to include only works published in English reflected our language skills.
  4. The work is an empirical study. We included only empirical DBR studies, and not theoretical or practitioner-based articles, in keeping with the focus of our research questions. For a source to be considered empirical, it had to include a description of data sources, methods for data collection (i.e., quantitative, qualitative, or mixed methods) and analysis, and findings (design findings, theoretical findings, or both) that followed from the analysis.
  5. The work describes an application of DBR in EER. We considered ‘application[s] of DBR in EER’ as studies that were explicitly named or described as being design-based research. In addition, the work had to involve engineering learning content, stakeholders, and/or student participants in the design of an educational intervention (e.g., environment, framework, pedagogy, tool) appropriate for implementation in at least one engineering education learning context (e.g., K–12, undergraduate, or graduate; formal learning settings or informal learning settings such as engineering outreach events or camps).

After consulting our departmental academic librarian, we began formal database searching using the following search strings: ‘design based research’ + engineering, ‘design based research’ + STEM, ‘design research’ + engineering, and ‘design research’ + STEM. Along with online databases, we searched individual EER journals, including the Journal of Engineering Education and Computer Applications in Engineering Education, to locate primary sources. No date restriction was placed on the searches in order to preserve the developmental history and “narrative thread” of the topic (J. A. Collins & Fauser, 2005). We began database searching in February of 2020 and stopped searching in June of 2020.

We agreed that any source that did not meet any one of the inclusion criteria would be excluded from the review. To ensure accuracy of inclusion with respect to our criteria, both researchers read each source in its entirety and agreed on its inclusion. In some cases, we deliberated about whether a source should be included. For example, we had several discussions about the study by Langman et al. (2019), which describes development of a tissue engineering curricular module to increase student understanding of mathematical modeling and scientific concepts. The module was developed using a DBR approach, employed engineering principles to teach mathematical and scientific concepts, and was published in an EER journal (i.e., Journal of Pre-College Engineering Education Research). The student population, however, comprised K–12 students in a (non-engineering) summer enrichment program and business and pharmacy technician students in a general mathematics course. Additionally, the research questions used to guide the study were specific to math and science content and did not mention the word engineering. This combination of factors caused us to question whether the study truly represented an empirical application of DBR in EER.

Ultimately, we jointly agreed to include the study based on the knowledge that a) the study employed a DBR approach, b) the module employed learning concepts and principles from the field of tissue engineering, and c) the module would be appropriate for use within an informal engineering learning setting at the high school or early undergraduate introductory engineering levels. As we identified primary sources, we reviewed their reference lists in order to locate additional sources. Once primary sources had been identified, we conducted qualitative content analysis (Schreier, 2014) of the primary sources to answer the research questions.

Limitations

This narrative literature review is limited in at least two ways. The primary limitation is researcher bias during source selection. Researcher subjectivity in selecting sources for inclusion in the review analysis is a common limitation of narrative style literature reviews (Ferrari, 2015). To mitigate this limitation, we adopted several methods (i.e., forming research questions and using inclusion criteria) more characteristic of systematic reviews as recommended by several scholars (J. A. Collins & Fauser, 2005; Ferrari, 2015; Green et al., 2006). Use of these methods helped to reduce source selection bias by ensuring our source selection decisions were procedurally organized and explicit.

The second limitation, which is common to all literature reviews as forms of secondary research, is reliance on best available evidence for making claims and providing critique (Ferrari, 2015). In conducting this study, we found that the research approach known (at that time) to us as design-based research (DBR) was also referred to by other names, including design research, design experiments, and developmental research (Brown, 1992; Cobb et al., 2003; Kelly, 2014). The multiple ways that DBR is referred to presented two distinct difficulties during source selection: a) ensuring that studies identified in database searches were actually empirical applications of DBR and b) judging the extent to which all (or most) published applications of DBR in EER were identified. To ensure that studies included in the review were empirical applications of DBR, we developed database search strings using two of the most prevalent ways of referring to DBR (i.e., design-based research and design research) and then carefully checked the methods sections and reference lists of potential sources to ensure that a DBR approach was used before deciding to include a source. To increase the number of DBR articles located, we reviewed the reference lists of primary sources and searched individual research journals and other systematic reviews of DBR in education (i.e., Anderson & Shattuck, 2012; Zheng, 2015). Despite these actions, however, it is possible that some empirical applications of DBR in EER were not included in this review.

Findings

In the following section, we first present general trends identified within the selected literature, and then follow with a presentation of findings related to the research question and sub-questions as synthesized across all primary sources.

General Trends Within the Literature

In this section, we discuss general trends within the DBR in EER literature, including a description of primary source publication timeline and venues.

Publication Timeline

The publication timeline of identified sources (n = 24) is provided in Figure 1. (Note that primary source data are provided in the appendix.) Primary sources included in this review were published between 2005 and 2019. The majority of studies were published after 2010 (Figure 1).

Figure 1 

Engineering Education Contexts of Primary Sources (n = 24) Identified by Publication Year.

The first published DBR in EER scholarship appeared in 2005–2006 within the context of graduate engineering education. The earliest study (Newstetter, 2005) we identified was published in the Journal of Engineering Education and described a multi-year, federally funded DBR project to develop curriculum for a new graduate program in biomedical engineering.1

Curriculum development was guided by the cognitive apprenticeship model of learning in the form of problem-based learning, as is used extensively throughout medical school education. The purpose of this study was to help students become integrative thinkers by bringing disparate disciplinary content (i.e., biology, chemistry, engineering, computer science) and skills together in an authentic problem-based learning environment (Newstetter, 2005, p. 207). Design experiments were conducted directly with Ph.D. students (i.e., seven students in the first year) who were enrolled in the program over several years.

The following year, Huang et al. (2006) published a study in Innovate: Journal of Online Education that described development and implementation of a graduate level software engineering course based on an open source software development (OSSD) model. Like Newstetter (2005), Huang et al. (2006) implemented DBR directly with students enrolled in a graduate course (19 students). The purpose of the study was to develop a project-based learning curriculum to replicate the multidisciplinary, dynamic, and team-based characteristics of software engineering as it occurs in practice.

By 2010, DBR studies emerged within both undergraduate and K–12 engineering education contexts (Figure 1). Although DBR scholarship within graduate engineering education contexts waned after 2011, DBR scholarship grew steadily within undergraduate and K–12 contexts. Our data suggest that DBR in EER scholarship within undergraduate and K–12 contexts can be characterized as international; we identified ten studies originating outside the United States, from countries including Australia, Canada, Costa Rica, Hong Kong, Spain, Sweden, Taiwan, and the United Kingdom (see appendix). The international quality of this literature contrasts with a predominant U.S. presence in educational DBR research as reported in other reviews (i.e., Anderson & Shattuck, 2012) and may signal a unique characteristic of DBR in EER scholarship.

Primary sources also represent a wide array of engineering disciplines (i.e., agroindustry, biomedical, chemical, civil, computer, electrical, and mechanical engineering and computer science) and topics (i.e., biomechanics, design, mechanics, nano-biotechnology, sustainability, and transportation) and engage several types of participant groups (i.e., first-year undergraduates; second-year undergraduates; teaching assistants; faculty; engineers; engineering employers and supervisors; elementary, middle, and high school students; K–12 teachers; and instructional design and disciplinary content experts) (see appendix). Todd et al. (2011), for example, conducted DBR with participants comprising undergraduate engineering students, engineering and cooperative education faculty, and cooperative employers and supervisors to design an online community to support cooperative students in engineering. Others (Hardré et al., 2010) employed DBR with 17 K–12 teacher participants to understand the elements of teacher learning and transfer that are important for effective implementation of science and engineering applications in their elementary and middle school classrooms.

Publication Venues

The primary sources in this review comprise 24 articles published in a variety of peer-reviewed journals that were categorized into five topic areas: a) engineering education, b) science education, c) online education, d) technology in education, and e) general topics in education. Figure 2 presents the number of primary sources identified in each topic area.

Figure 2 

Journal Focus of Primary Sources (n = 24).

Approximately one-half (11/24) of primary sources were published in engineering education journals (i.e., Journal of Engineering Education, European Journal of Engineering Education, Journal of Pre-College Engineering Education Research, and Chemical Engineering Education). This finding suggests that DBR is gaining acceptance as a rigorous research approach within the EER community. A single article (1/24) was published in a journal devoted to science education (i.e., Physical Review Physics Education Research), which suggests that use of DBR may be emerging within engineering education research independent of its use in science education research. The remaining articles were published in journals focused on online education (2/24) (i.e., International Review of Research in Open and Distance Learning, Journal of Online Education), technology in education (7/24) (i.e., Journal of Educational Technology and Society, Journal of Computers and Education, Journal of Education Technology Research and Development, International Journal of Technology and Design Education, IEEE Transactions on Learning Technologies, and the Journal of Interactive Learning Environments), and topics that apply more generally within the field of education (3/24) (i.e., Journal of Higher Education, Teacher Education Quarterly, and Journal of Cooperative Education and Internships).

Interestingly, one-half of the primary sources were published in journals focused on the type of intervention or learning environment (e.g., online learning environment, educational technology, learning community framework for cooperative education) rather than the disciplinary focus of the research (e.g., engineering education). This finding suggests that there are multiple outlets for publishing DBR in EER research, including both the premier journals in EER as well as online and educational-technology-focused journals. This finding can provide researchers with a level of confidence that DBR in EER studies are viewed as publishable within several research communities and at several levels of scholarship.

Trends within the Literature Related to the Research Question

To answer the research question and sub-questions, we examined primary sources to understand how DBR has been implemented within empirical EER studies.

What Engineering Education Problems Have Been Studied?

We first looked across the data to understand the larger engineering education issues addressed by each primary source. As shown in Figure 3, we categorized issues addressed by the primary sources as a) engineering professional skill development (n = 12); b) teaching and learning assessment in engineering (n = 5); c) student learning of engineering content knowledge (n = 4); and d) improvement of interest, perceptions, and participation in engineering (n = 3).

Figure 3 

Engineering Education Issue Addressed by Primary Source (n = 24).

One-half (12/24) of the studies sought to develop an intervention to support engineering professional skill development. Engineering professional skills addressed by these studies were thematically grouped into three subcategories: interdisciplinary and open-ended problem solving skills (n = 3), experimental inquiry and laboratory skills (n = 3), and design and teaming skills (n = 6).

What Types of Interventions Have Been Designed?

We categorized DBR interventions into two types: interventions that require use of technology or Internet access and those that do not.

Technology and Web-Based Interventions. As shown in Table 1, approximately 40 percent (10/24) of the primary sources described development and/or use of technology or web-based interventions. These interventions include technology-based experiments, laboratory activities, and learning environments; web-based tools for classroom instruction; digital and online courseware; online courses; and online course and community environments.

Table 1

Technology and Web-Based Interventions Developed Using Design-Based Research.

Author (year) | Technology or Web-Based Intervention | Issue Addressed

Huang et al. (2006) | OSSD online course environment and curriculum for graduate students in software engineering | Engineering professional skill development: interdisciplinary and open-ended problem solving

Kong et al. (2009) | Remote-controlled experiments for electrical circuits for Primary Four students (Hong Kong)2 | Engineering professional skill development: experimental inquiry and laboratory skills
Bernhard (2010) | Technology-based conceptual labs in mechanics and electrical circuits for undergraduate engineering students | Engineering professional skill development: experimental inquiry and laboratory skills
Yueh et al. (2014) | Digital laboratory courseware in nanotechnology for undergraduate engineering and science students | Engineering professional skill development: experimental inquiry and laboratory skills

Bower (2011) | Web-conferencing course environment for introductory software design course | Engineering professional skill development: design and teaming
Charlton and Avramides (2016) | Internet of Things (IoT) as a learning environment for design and making STEM activities for 14–15 year olds | Engineering professional skill development: design and teaming

Friedrichsen et al. (2017) | AIChE Concept Warehouse – website to support use of concept-based pedagogies3 | Teaching or assessment in engineering
Liu and Yu (2019) | Online system to support active learning through questioning and formative evaluation in large undergraduate engineering classrooms | Teaching or assessment in engineering

Todd et al. (2011) | Online learning community for engineering cooperative students | Learning engineering content
Joo et al. (2014) | Online course in quality control for undergraduate students in an agroindustry engineering program | Learning engineering content

Overall, technology and web-based interventions supported three of the four categories of purpose: professional skill development (n = 6), teaching or assessment in engineering (n = 2), and learning engineering content (n = 2). Notably, we did not find any studies that used DBR to develop technology or online tools for the purpose of improving interest in, perceptions of, or participation in engineering.

In contrast, six of these technology and web-based interventions supported engineering professional skill development, spanning all three of its subcategories: interdisciplinary and open-ended problem solving, experimental inquiry and laboratory skills, and design and teaming.

From these data, we conclude that DBR is compatible with a focus on development of educational technology and online learning courses and environments. We further note that the compatibility between DBR and technology development may be largely attributed to similarities between DBR and the engineering design process; use of DBR for educational technology development is well documented within the educational literature (see Wang & Hannafin, 2005).

Other Interventions (Not Technology or Web-Based). As shown in Table 2, DBR was employed in over one-half (14/24) of the studies to develop interventions that were not based on technology or use of the Internet. We categorized these “other” interventions into five types: curricula (n = 4), pedagogy (n = 3), tools (n = 1), frameworks (n = 3), and in-person experiences (n = 3).

Table 2

Other Interventions Developed Using Design-Based Research.

Author (year) | Intervention | Issue Addressed

Curricula (4 studies)

Newstetter (2005) | Biomedical engineering curriculum based on cognitive apprenticeships and problem-based learning for graduate students | Engineering professional skill development: interdisciplinary and open-ended problem solving
Langman et al. (2019) | Mathematical modeling curricular module based on a tissue engineering context for high school and early college students | Engineering professional skill development: interdisciplinary and open-ended problem solving
Weber et al. (2014) | Life cycle assessment: environmental sustainability curricular module for introductory engineering students | Learning engineering content
Fan et al. (2018) | Engineering design curricular module for high school students | Engineering professional skill development: design and teaming

Pedagogy (3 studies)

Dasgupta (2019) | Improvable models, a new type of physical model, to engage K–12 students in engineering design | Engineering professional skill development: design and teaming
Gomez and Svihla (2019) | Parley sessions, decision matrices for supporting consensus building and decision making of chemical engineering students during design activities | Engineering professional skill development: design and teaming
Guisasola et al. (2017) | Teaching and learning sequences for introductory engineering and science students learning physics | Learning engineering content

Tools (1 study)

Diefes-Dux et al. (2010) | Learning assessment tools for open-ended problem solving in large engineering courses (rubrics, task-specific supports, scorer training) | Teaching or assessment in engineering

Frameworks (3 studies)

Tang (2013) | Similarities and differences between in-school and out-of-school media representations of engineering experienced by high school students | Interest, perceptions, and participation in engineering
Hira and Hynes (2019) | Interest-based engineering design challenges framework for interesting pre-college students in engineering | Interest, perceptions, and participation in engineering
Moore et al. (2014) | Quality assessment framework for K–12 engineering education | Teaching or assessment in engineering

In-Person Experiences (3 studies)

Hardré et al. (2010) | Six-week resident learning experience in science and engineering for K–12 teachers | Teaching or assessment in engineering
Blanchard et al. (2015) | Inquiry-centered after-school program for middle school students; provides design experiences focused on 21st century engineering challenges | Interest, perceptions, and participation in engineering
Guloy et al. (2017) | Learning support workshops paired with university courses for first-year engineering and science students | Learning engineering content

These “other” interventions supported all four major categories of study purpose: professional skill development (n = 5), teaching or assessment in engineering (n = 3), learning engineering content (n = 3), and improvement of interests in, perceptions of, and participation in engineering (n = 3). Professional skill development was mainly supported through development of curricular and pedagogical interventions; improving interest, perceptions, and participation in engineering was mainly supported through development of frameworks and in-person experiences. One of these studies (Hira & Hynes, 2019) explicitly named broadening participation in engineering as the purpose of the study.

Which Participant Populations and Learning Environments Have Been Studied?

Three studies (Bower, 2011; Huang et al., 2006; Newstetter, 2005) engaged with graduate engineering student participants. Seven studies engaged with undergraduate engineering student participants; one of those studies identified participants as being from groups historically underrepresented in engineering (Gomez & Svihla, 2019). Eight studies engaged with K–12 student participants; one of those studies (Blanchard et al., 2015) engaged with participants identified as historically underrepresented in engineering.

K–12 participants included elementary (U.S. grades 3–5, Hong Kong Primary Four students), middle (U.S. grades 6–8) and high school (U.S. grades 9–12, UK Year 10, Taiwan Grade 10) students. Other studies engaged with different types of student participants and/or with participants who were not students, including (non-engineering) undergraduate and high school students (Langman et al., 2019); undergraduate faculty (Friedrichsen et al., 2017; Todd et al., 2011); K–12 instructors (Hardré et al., 2010); teaching assistants (Diefes-Dux et al., 2010); practicing engineers and/or industry members (Diefes-Dux et al., 2010; Todd et al., 2011); and instructional designers and content experts (Yueh et al., 2014) (see appendix).

When examining learning environments across studies, we found that 75 percent (18/24) of studies were conducted within/for in-person learning contexts (e.g., face-to-face courses, workshops, and after-school programs). Three studies were conducted within blended course contexts (i.e., remote access or online tools used in face-to-face courses) (Huang et al., 2006; Kong et al., 2009; Yueh et al., 2014). Three studies were conducted within/for purely online course contexts or through the exclusive use of an online tool (Bower, 2011; Joo et al., 2014; Todd et al., 2011).

What Research Methods Have Been Used?

Use of multiple qualitative methods (i.e., multimethods) and/or qualitative and quantitative methods (i.e., mixed methods) is considered essential for developing detailed understandings of design implementation and the associated effects on learning in context. Moreover, it is common for DBR researchers in education to employ the term mixed methods to report use of both quantitative and qualitative methods in a single study (e.g., Anderson & Shattuck, 2012), regardless of the extent to which method mixing or integration occurs (cf. Creamer, 2017; Creswell, 2014). In our data, nine studies (38%) reported using both qualitative and quantitative methods without using the term mixed methods explicitly. Seven other studies (29%) explicitly stated use of mixed methods; three of these studies (Blanchard et al., 2015; Liu & Yu, 2019; Weber et al., 2014) named or described the specific mixed-methods research design employed (e.g., sequential explanatory mixed-methods design). Of the remaining eight studies, six employed qualitative methods and two used purely quantitative methods (see appendix).

Across studies, researchers employed varying combinations of quantitative and/or qualitative methods. Quantitative methods included in-person and online surveys, design evaluations by experts and instructors, and performance scores. Qualitative methods included open-ended surveys, one-on-one interviews, focus group interviews, participant journaling, observations, and artifact collection. Several authors emphasized the benefits of using quantitative and qualitative methods together in the same study, describing how it improved the trustworthiness of their findings by enabling them to interpret data using multiple analytical perspectives.

Along with use of multi and/or mixed methods, an iterative design cycle (i.e., data collection, analysis, and revision) is paramount to the DBR process (Anderson & Shattuck, 2012). Previously, scholars (Zheng, 2015) have critiqued the educational DBR literature for failing to provide details about design iterations and for reporting on only a single iteration. For example, Zheng (2015) reported that 50 percent of the 162 primary sources in their systematic review of DBR literature reported on a single design iteration. Anderson and Shattuck (2012) described challenges in deciphering information about iterations that arise because a “variety of terms and time measurements are used in DBR studies to discuss iterations (e.g., year, site, phase, iteration, cycle, phase, case study)” in the literature. In this study, we noted that more than one-half (14/24) of primary sources reported on a single design iteration. Ten studies reported on multiple (i.e., two or more) design iterations (see Table 3; appendix). Four of these studies reported on three or more iterations; the maximum number of iterations reported was five (Moore et al., 2014).

Table 3

Educational and Learning Outcomes Reported by Design-Based Research Studies.

Author (year) | Iterations | Outcome | Form of Evidence

Educational Outcomes (14 studies)

Blanchard et al. (2015) | 1 | Increase in student interest and awareness of engineering careers, enjoyment of design-based activities, and understanding of what engineers do | Analysis of questionnaire and focus group responses
Charlton and Avramides (2016) | 1 | Indicators of collaboration and problem-based learning (production) | Mapping of student activities to learning indicators
Diefes-Dux et al. (2010) | 2 | Fidelity to engineering expert-identified characteristics of high performance on MEAs | Comparison of TA scores with expert scores
Friedrichsen et al. (2017) | 1 | Propagation of a technology-based educational innovation | Diffusion network diagrams and survey responses
Guloy et al. (2017) | 1 | Identification of learning outcomes and design requirements needed for a paired learning support workshop | Analysis of questionnaire and interview responses
Hardré et al. (2010) | 1 | Identification of key features of teacher professional development that promote critical student and teacher outcomes | Documentation of expected/unexpected events in relation to process and products
Hira and Hynes (2019) | 3 | Increase in student personal interest in engineering and inclusivity of pre-college engineering education | Vignettes: descriptions of student activities; analysis of survey results
Huang et al. (2006) | 1 | Factors that influence the success of software design projects | Descriptive analysis of course features and their effects on student interactions
Liu and Yu (2019) | 1 | Learning potential of intervention in terms of perceived usefulness, ease of use, and relative advantage | Statistical analysis of survey responses
Moore et al. (2014) | 5 | Key indicators of quality K–12 education | Analysis of literature, STEM education standards, and expert consultations
Newstetter (2005) | 2 | Instructional scaffolds for development of model-based reasoning | Thick description of student activities, events, and outcomes
Tang (2013) | 1 | Differences between in- and out-of-school representations of engineering | Thick description; analysis of textual representations
Todd et al. (2011) | 2 | Online community design for cooperative education students | Analysis of focus group interviews; survey responses
Yueh et al. (2014) | 1 | Evaluation of a web-based courseware development approach | Analysis of e-Learning Courseware Quality Checklist version 3.0 results

Learning/Skill Development Outcomes (10 studies)

Bernhard (2010) | Several | Improvement in student conceptual understanding | Pre/post test results; thick description of student courses of action
Bower (2011) | 3 | Increased co-construction of knowledge and collaborative design thinking | Vignettes: descriptions of key observations and critical learning episodes
Dasgupta (2019) | 1 | Evidence of productive disciplinary engagement during design | Distribution of design moves across disciplinary practices
Fan et al. (2018) | 1 | Student design performance in relation to conceptual knowledge, engineering design practice, and STEM attitudes | Scores on Mechanical Conceptual Knowledge Test (MCKT), design rubric, and STEM attitude questionnaire
Gomez and Svihla (2019) | 2 | Evidence of student consensus building on design decisions | Vignettes: descriptions of conversational sequences
Guisasola et al. (2017) | 2 | Learning improvements achieved through use of the intervention | Changes in pre/post problem-based test results and questionnaire responses
Joo et al. (2014) | 2 | Improvement in students’ cognitive engagement and learning outcomes | Statistical analyses of assignment scores; analysis of self-checklist scores
Kong et al. (2009) | 1 | Learning achievement as a result of the remote experiments | Pretest/posttest evaluation; analysis of interviews
Langman et al. (2019) | 1 | Gain in maturity of mathematical models; disciplinary learning gains | Analysis of mathematical models and pre/post student responses to a science prompt
Weber et al. (2014) | 1 | Learning gains about environmental sustainability | Statistical analysis of Environmental Inventory survey responses

What Form Do the Findings Take?

Given DBR’s iterative and longitudinal nature, questions about how, when, and what to report as DBR research findings linger within the literature. Thus, it was not surprising to find that primary sources did not report findings in similar ways. To develop an approach for synthesizing findings across DBR studies, we followed Sandoval (2014, pp. 21–22), who identified six design and theory-related elements comprising DBR research: a) high level theoretical conjectures about how to improve a practical educational problem, b) embodiment of a design intervention to improve upon the problem, c) educational or learning outcomes resulting from implementation of the design intervention, d) identification of mediating processes generated by the design that produce the outcome, e) design findings in the form of conjectures, heuristics, or principles that describe how the design generated the mediating processes, and f) theoretical findings in the form of conjectures that describe how the mediating processes generated by the design produced the desired outcome. We reasoned that, in the absence of an exemplar paper or specific DBR reporting guidelines within the literature, these essential DBR characteristics could serve as a framework for organizing and reporting on the findings of the studies in this review. In the following sections, we present our review of the primary source findings in terms of these six DBR elements.

Theoretical Conjectures about Improving Educational Problems. The larger engineering education issue (i.e., professional skill development; teaching and learning assessment; content learning; and improvement of interest, perceptions, and participation) addressed by each primary source was previously presented in Tables 1 and 2. Our review found that all primary sources described a larger engineering education issue addressed by the DBR study. In addition, we found that more than one-half of the studies (14/24) explicitly named the high level theory used to support the type of innovation needed to improve the problem or issue.

Specifically, we noted a variety of theories that were used to conjecture about design innovations. These theories include Variation Learning Theory (Bernhard, 2010); Inquiry-Oriented Instruction, Socially Relevant Design, and Collaborative Learning (Blanchard et al., 2015); Multimedia Design Principles and Social Constructivism (Bower, 2011); Experiential Learning (Dewey) and Constructionism (Papert) (Charlton & Avramides, 2016; Hira & Hynes, 2019); Cognitive Constructivism (Piaget) (Charlton & Avramides, 2016); Diffusion of Innovation Theory (Friedrichsen et al., 2017); Ethics of Care (Gomez & Svihla, 2019); Organismic Integration Theory (Guloy et al., 2017); Community of Practice (Lave and Wenger) (Huang et al., 2006); Transactional Distance (Joo et al., 2014); Observational Learning Theory (Kong et al., 2009); Conformity and Expertise Theories (Liu & Yu, 2019); Cognitive Apprenticeship Model (Newstetter, 2005); and Work-Integrated Learning Models (Todd et al., 2011).

Additionally, we found that the 10 remaining studies relied on other forms of theoretically grounded support to justify their design innovations. For example, some studies (Dasgupta, 2019; Fan et al., 2018; Langman et al., 2019; Moore et al., 2014; Tang, 2013; Weber et al., 2014) used educational goals and requirements described in standards and high-level reports, including those published by organizations such as the National Academies, National Research Council (e.g., Next Generation Science Standards), Council of Chief State School Officers (e.g., Common Core Standards) and the accreditation body for engineering programs (i.e., ABET). Other studies (Diefes-Dux et al., 2010; Guisasola et al., 2017; Yueh et al., 2014) relied on well-researched pedagogical frameworks (i.e., model-eliciting activities, teaching and learning sequences, and virtual laboratories) to support their design ideation. In sum, we found that all studies in this review described use of some form of higher level, theoretically informed backing to support the ideation of the design intervention.

Embodiment of the Design Intervention. All primary sources described the embodiment of a design intervention; brief descriptions of the interventions were previously presented in Tables 1 and 2 (see also the appendix). Across studies, design embodiments were depicted using a variety of representations, including thick description (e.g., Blanchard et al., 2015), schematics or graphics (e.g., Bernhard, 2010), computer screen shots (e.g., Bower, 2011), digital images (e.g., Dasgupta, 2019), tables (e.g., Gomez & Svihla, 2019), and through combinations of these types of representations (e.g., Charlton & Avramides, 2016; Tang, 2013).

Educational or Learning Outcomes. All primary sources presented evidence of the educational or learning outcomes resulting from the implementation of the design intervention within an authentic learning context (Table 3).

More than one-half of studies (14/24) provided evidence of educational outcomes (e.g., grading fidelity, quality indicators, interest improvements, courseware evaluation); other studies (10/24) provided evidence of student learning/skill development outcomes (e.g., learning gains, cognitive and design engagement improvement) as DBR findings. Primary sources used varying forms of evidence to support and explain these outcomes; studies that described several design iterations often provided vignettes or thick descriptions of the outcomes and changes that occurred to the intervention.

Mediating Processes. We found that none of the primary sources employed conjecture mapping as proposed by Sandoval (2014), nor did they explicitly identify mediating processes connecting embodied use of the design intervention to the educational or learning outcomes observed. Thus, it was not feasible, within the scope of this review, to report on mediating processes. We note that some primary sources may have embedded descriptions of mediating processes within their thick descriptions of student learning events. For example, Bernhard (2010, p. 278) described students’ “lived object of learning”—how students “see, understand, and make sense of the object of learning” during implementation of the intervention. These descriptions are likely to include the identification of mediating processes. Furthermore, other studies (i.e., Dasgupta, 2019; Gomez & Svihla, 2019; Hira & Hynes, 2019; Newstetter, 2005; Tang, 2013) provided such rich and detailed tracings of participants’ actions with the intervention that mediating processes could potentially be interpreted from these data.

Design and Theoretical Findings. Following Sandoval (2014), we define design findings as conjectures, heuristics, or principles that describe how the design of the intervention generates mediating processes, and theoretical findings as conjectures that describe how mediating processes produce the outcomes. Because primary sources did not report design and theoretical findings as such, we carefully reviewed each study for evidence of these two types of findings. This evidence, presented in Table 4, indicated that an interlacing of design and theory was present in these studies.

Table 4

Evidence of Design and Theoretical Findings in Design-Based Research (n = 24).

Author (year) | Design Findings | Theoretical Findings

Bernhard (2010) | Principles for designing labs, or lab-like learning environments, that support conceptual learning | Variation theory supports the design of conceptual labs
Blanchard et al. (2015) | Program promoted adoption of concrete strategies and aspirational career goals simultaneously; concrete strategies supported and nurtured aspirational goals while still in school | Program is one approach to respond to calls to broaden access to and increase awareness of engineering, especially among underrepresented groups
Bower (2011) | Principles for scaffolding creative design learning in online environments | Multimedia and socio-constructivist learning principles support design learning in online environments
Charlton and Avramides (2016) | IoT environment enabled flexible “making” and encouraged experimentation by not helping to solve the problem directly | Flexible knowledge construction and production during IoT-based design challenges foster collaborative learning and collective engagement
Dasgupta (2019) | Improvable models engaged students in idea generation through processes of revision and redesign, manipulation of current design parameters, and design decomposition and optimization | K–12 students can productively engage in heuristics generation and engineering design practices using Improvable Models
Diefes-Dux et al. (2010) | Assessment tools to promote high fidelity to expert evaluation of MEA products | Design of evaluation tools for open-ended problems embedded within a larger educational system can be addressed through use of various educational research methods and a multi-tiered teaching experiment methodology
Fan et al. (2018) | Engineering design course used to create mechanical toys (automata) using various mechanisms (see Fan & Yu, 2017) | Interest and metacognitive skills are key motivational factors for students involved in engineering design
Friedrichsen et al. (2017) | DBR-based model for collecting and analyzing intervention propagation data for developers and researchers interested in research impact and education reform | Intervention propagation data can guide approaches for increasing use and be fed back into the design process to guide design of the technology itself
Gomez and Svihla (2019) | Parley sessions related to key design decisions improved communication by providing students opportunities to argue through evidence and negotiate ideas through uncertainty; scaffolding to key decisions resulted in a more manageable amount of core content | Negotiating ideas with peers through uncertainty shifts peer communication from transfer of knowledge to collaboration
Guisasola et al. (2017) | Guide for teachers implementing Teaching and Learning Sequences (TLS) | DBR can be used as a model for teacher-driven design and evaluation of TLS
Guloy et al. (2017) | By embodying aspects of more autonomous forms of extrinsic motivation (i.e., identification), the intervention is more likely to help students persist through challenges, engage in disciplinary craft, and seek help from peers while studying | Future research should focus on how students can be extrinsically motivated to participate in and value the adoption of desired learning strategies
Hardré et al. (2010) | Guidelines for designing effective professional development programs | Authentic transfer among teachers is key to bridging professional development into teaching practice
Hira and Hynes (2019) | Guiding principles for engineering design challenges (being human-centered, having broad themes, and involving the making of things) are realized by including authentic clients, letting students choose their own themes of interest, and providing access to tools and materials | Provides evidence that guiding principles for engineering design challenges can yield more engaging engineering activities
Huang et al. (2006) | Guidelines for designing a computer science course based on an Open Source Software Development framework | Provides evidence of the potential for open source software courses to address concerns generated by more traditional computing courses
Joo et al. (2014) | Course redesign impacted the distance learners’ cognitive engagement and learning outcomes through a heightened level of structure | Provides evidence to support the theory that the appropriate balance between dialogue and structure in online instruction must account for the educational sophistication of the learner and the content
Kong et al. (2009) | Guidelines for teachers implementing remote experiments | Remote experiments have potential to promote elementary students’ learning by observation
Langman et al. (2019) | Teacher guidelines for promoting development of mature mathematical models | An agenda for future research on module design and the relationship between disciplinary learning and authentic engineering problems
Liu and Yu (2019) | System feature of deferring display of other students’ responses stimulates independent thinking and supports meaningful formative evaluation in a large-group environment | Findings support tenets of conformity theory in that viewing peers’ responses too soon limited the positive effects of formative evaluation and active learning
Moore et al. (2014) | Quality framework can be used for curriculum development, both for units of instruction and for scope and sequencing throughout K–12 curricula | Quality framework can be used to inform the development and structure of future K–12 engineering and STEM education standards and initiatives
Newstetter (2005) | Forced use of cartoons, sketches, and assumptions on whiteboards makes the role of diagrammatic reasoning in engineering problem solving explicit | Argues the need to scaffold the development of model-based reasoning throughout the engineering curricula
Tang (2013) | Conjectured pedagogical strategies to address contrasting views of technology | Out-of-school representations, which present contrasting views of technology based on the diverging practices and rhetorical purposes of media professionals, pose affective challenges for beginning engineering students
Todd et al. (2011) | Factors to consider in the design of an online community for cooperative engineering education | Findings suggest enhancements to the Model of Community-based Online Learning
Weber et al. (2014) | Provides insights about which misconceptions about environmental sustainability are the most malleable and which are the most stable | Four-week module may not be enough to shift student attitudes about sustainability
Yueh et al. (2014) | Validated courseware in nano-biotechnology with areas for improvement noted | To accomplish creative learning design in content, navigation, and media design, it is necessary to provide substantial assistance within the quality assurance framework to inspire more creative design

In sum, our review found that the primary sources generally communicated design findings related to the functioning of the interventions and theoretical findings in the form of local proto-theories or connections to higher theory. However, despite these valuable communications, primary source findings were not explicitly labeled as design or theoretical findings, and it is possible that we misinterpreted them.

What Are the Limitations of This Body of Literature?

The primary sources included in this review served as rich examples of DBR. They clearly captured the essence of DBR through thoughtful and purposeful consideration of a practical problem of significance in engineering education, design ideation in conjunction with theory, and execution of research plans that successfully uncovered relationships between theory, designed intervention, and educational practice. We are excited by the robustness of these DBR in EER studies, especially given the early stage of DBR’s emergence within the field. However, as is true in areas of emerging scholarship, recommendations for improvements can be offered. In this section, we describe limitations of the DBR in EER literature as suggested by our findings.

Inconsistent Reporting of Key DBR Elements. Across primary sources, we noted inconsistent reporting of several fundamental elements of DBR, including a) research questions/hypotheses and the number of design iterations completed; b) information about the context of implementation; c) descriptions of changes made to the intervention between iterations; and d) DBR findings, including intervention outcomes, design findings, and theoretical findings. For example, we identified eight (33%) studies that did not clearly state the research question or hypothesis guiding the inquiry. In some studies, it was difficult to discern the number of iterations being reported on, whether because the research agenda spanned several years (e.g., Bernhard, 2010), the multi-year project took place in stages (e.g., Diefes-Dux et al., 2010), or several interwoven cycles of evaluation were conducted (e.g., Yueh et al., 2014). We considered studies in which aggregate findings were reported from data generated simultaneously across similar contexts, such as multiple classrooms in the same school (e.g., Fan et al., 2018), to be single-iteration studies, since design improvements could not have been accomplished between the simultaneous implementations.

Primary sources also varied in the amount of detail they provided about the implementation context and about the design changes made between iterations. We note that it is difficult for DBR study authors to provide “just the right amount” of detailed information, because the level of required detail can vary with the complexity and length of the study and with publishing requirements. Despite these challenges, we note several exemplar studies that provide substantial detail in accessible forms: Dasgupta (2019) is an example of a single-iteration DBR study embedded within the historical context of prior iterations; Gomez and Svihla (2019) and Hira and Hynes (2019) are well-detailed two- and three-iteration DBR studies, respectively; and Bernhard (2010) and Moore et al. (2014) are examples of five (or more) iteration DBR studies. While we found in our review that all primary sources reported on educational or learning outcomes (Table 3), few explicitly named, or made distinctions between, the design and theoretical findings that serve to connect the design embodiment to the reported outcomes (although we found evidence of these findings, as shown in Table 4).

In considering these critiques, it is important to assess how the unique nature of DBR—its relationship to history, its large and complex data sets, its longitudinal nature, and the multiple types of findings it produces—affects the reporting of DBR studies. Clearly, DBR researchers have much to report and, subsequently, consumers of DBR have much to read and understand. Single-iteration DBR studies are almost always situated within larger studies in progress—or are themselves the beginnings of new research studies—and should include detailed information about the fit of the single iteration within the larger DBR landscape. Moreover, the continuously unfolding and never quite finished nature of DBR calls for the kind of thick, rich description common in qualitative research to build transferability into its findings. Thus, we suggest it is easy for DBR researchers to become overwhelmed trying to satisfy both consumers and publishers of their work.

We suggest that one way for the field of engineering education to promote robust DBR research is to develop a set of guidelines for reporting DBR in scholarly publications. The purpose of these guidelines would be to provide a framework for describing and organizing DBR studies in order to support transferability among practitioners and publication readability among all DBR consumers. For example, the framework may take a form already provided within the education literature, such as the conjecture mapping (Sandoval, 2014) schema used in this review. However, whatever its precise form, the framework should help researchers ensure that key DBR elements are consistently reported through detailed guidance pertaining to the inclusion of thick, rich descriptions of the study history and situation, learning context, evolving intervention embodiment, and context of implementation.

Inconsistent Reporting of Participant Demographic Information. As is common in engineering education research (Pawley, 2017), the primary sources in this review generally did not report adequate participant demographic information (e.g., gender, race, ethnicity). For example, we found that over one-half (14/24) of the studies did not report any participant demographic information. We note that, in one of these studies (i.e., Yueh et al., 2014), the lack of participant demographic information did not seem to limit transferability, since the participants (practitioners and experts) were evaluating the intervention (digital courseware) and were not part of the target population for the intervention (students). In addition, we found that seven studies reported partial participant demographic information (e.g., gender only) and noted only three studies that reported detailed demographic information about participants (see appendix). Because of the criticality of demographic information in DBR, we suggest that the emerging engineering education DBR community has a unique opportunity to set the example for the field by establishing demographics reporting as a required element of DBR in EER scholarship. In doing so, DBR scholarship will serve to heighten the collective impact of diversity and inclusion scholarship throughout the field.

Limited Research with and for Underrepresented Groups in Engineering. As we actively work toward improving diversity and inclusion in engineering education, it is essential that more attention be paid to designing interventions to benefit diverse student groups. To do this, more DBR must be conducted with participants from groups underrepresented in engineering. Within our review, we noted only two studies (Blanchard et al., 2015; Gomez & Svihla, 2019)4 that reported purposefully working with underrepresented racial and ethnic groups in engineering; a single study (Hira & Hynes, 2019) stated that a purpose of their work was to broaden participation in engineering. We applaud these efforts and call for more DBR work to be accomplished with and for diverse, underrepresented groups in engineering.

Discussion

In the following sections, we discuss implications of our review findings in terms of the opportunities and challenges that DBR presents for the field of engineering education. We consider our own experiences conducting an ongoing DBR project in engineering education in light of these results to provide further insights for the discussion.

Opportunities for Design-Based Research in Engineering Education

Developing Local Improvements to Engineering Education’s Most Vexing, “Wicked” Problems

Wicked problems are the “problems worth solving” (Kolko, 2012). In contrast to merely difficult problems, wicked problems cannot be solved using traditional processes; use of conventional processes, in fact, often makes wicked problems worse. Wicked problems have multiple, interrelated causes, are difficult to fully and completely define, and don’t have single, correct answers. For these reasons, solutions to wicked problems are better or worse, not right or wrong. Therefore, approaches taken toward fixing wicked problems must focus on practical ways to improve, rather than to solve, these highly contextual situations (Rittel & Webber, 1973).

The results of this review provide evidence that DBR is a research approach suited to tackling engineering education’s wicked problems. Already, DBR has gained attention as a mechanism for instituting and perpetuating instructional change toward the development of collaborative competencies and student-centered learning, most notably through the use of project- and problem-based learning strategies (Kolmos, 2015). This important application of DBR—advancing active, collaborative, and student-centered learning in engineering—is further represented by several studies included in this review (e.g., Bernhard, 2010; Bower, 2011; Charlton & Avramides, 2016; Fan et al., 2018; Gomez & Svihla, 2019; Huang et al., 2006; Liu & Yu, 2019; Newstetter, 2005).

Along with enacting enduring instructional change, the studies in this review revealed other ways that DBR is being used to improve engineering education. Specifically, the works of Blanchard et al. (2015), Hira and Hynes (2019), and Tang (2013) highlighted use of DBR as a mechanism for developing contextual interventions to increase interest and broaden participation in engineering. For example, the work of Blanchard et al. (2015), developing an engineering-focused after-school program within a low-income, majority-minority community, acts to improve historically and systematically embedded processes of marginalization and underrepresentation present within that local community. As another example, we consider our own DBR work, which focuses on the development of an accessible, smartphone-enabled mobile tool to excite diverse student interest in the study of fluid mechanics—an important interdisciplinary area of engineering research practice—through active visualization and quantification of fluid flows (cf. Hertzberg, Leppek, & Gray, 2012; Rossman & Skvirsky, 2010). These examples, which highlight the potential of DBR to develop interventions focused on broadening interest and participation in engineering, inspire us to ask the following question: In what other ways can DBR be used to make the wicked problems we face in engineering education better?

Realizing the Potential of Technology in Engineering Education

Since its emergence, DBR has been increasingly employed within education research, particularly within K–12 contexts and for purposes of designing technology-based educational interventions (Anderson & Shattuck, 2012). Wang and Hannafin (2005) described the intense compatibility that exists between DBR and the design of technology-enhanced learning environments. Unlike other approaches to educational research that may seek to build and abstract scientifically credible theory from findings as a culminating result, technology-enhanced design research thrives when local theory can be generated and applied to inform the design (Wang & Hannafin, 2005).

In their literature review of the five most cited education DBR articles per year from 2000 to 2010, Anderson and Shattuck (2012) reported that 21/31 (68%) of the empirical articles focused on developing technology-based interventions, including educational software packages, multi-user virtual environments, wikis, social networking, games, mobile and positioning technologies, digital storytelling, and other technology-supported activities. (We note that none of these studies are situated within an engineering education context.) In our review, we found that 10/24 (42%) of DBR in EER studies focused on development of technology-based interventions, including online or technology-based courses and course environments (Bower, 2011; Charlton & Avramides, 2016; Huang et al., 2006; Joo et al., 2014), digital courseware (Yueh et al., 2014), technology-based laboratory activities (Bernhard, 2010; Kong et al., 2009), an online community support website (Todd et al., 2011), and online systems to support conceptual learning and formative assessment in large classrooms (Friedrichsen et al., 2017; Liu & Yu, 2019). In addition, 18/24 (75%) of the interventions described in this review were developed for in-person (face-to-face) learning environments. Comparing across fields, we wonder why the creative ideation of technology-based interventions in engineering education compares less favorably, in both the number and types of interventions, with technology-based DBR research in other education fields.

We are hopeful that the growing presence of DBR in EER, as evidenced by the results of this review, will catalyze and cultivate more technology-based interventions in engineering education. In light of the recent pandemic that has made remote and technology-based learning a global necessity, the need for effective and engaging educational technology is higher now than ever before. Moreover, we note that our own interdisciplinary DBR work would not have been possible without making a paradigmatic shift toward a DBR approach. Knowing how DBR has provided opportunities for our own technology-based educational research, we anticipate an increase in similar engineering education research as the larger community discovers DBR.

Challenges to Conducting Design-Based Research in Engineering Education

Managing the Development of an Interdisciplinary DBR Team

DBR project teams necessarily comprise people from different backgrounds and with a variety of skills. For example, DBR project teams often include education and learning science researchers, disciplinary researchers, instructors, teachers, instructional designers, software developers and programmers, administrators, and students. The resultant breadth of knowledge, experience, and know-how is the fuel that propels DBR innovation and discovery and establishes the link between research and practice. One challenge to the success of DBR projects, however, results from the team’s size and multi-disciplinarity: How to develop a group of diverse, multidisciplinary collaborators into a well-functioning and productive interdisciplinary DBR team.

The making of interdisciplinary teams is critical to the success of a DBR project. While interdisciplinary team members possess specific expertise and defined project roles, they also willingly contribute to shared technical objectives that span the boundaries of their areas of expertise. Interdisciplinary team members remain committed to project goals and objectives and contribute when, where, and how they can. In contrast, members of teams that remain multidisciplinary in nature often act and feel differently. First and foremost, multidisciplinary team members act as experts in their field; they may compartmentalize project objectives and remain focused on their particular tasks and contributions. Due to their concentration on specific aspects of the project, multidisciplinary team members may lose sight of shared technical objectives and the affordances (and constraints) of the intervention under development.

The studies in our review offer a few hints of this DBR challenge. While several studies were conducted by DBR teams comprising just a few people (e.g., Bernhard, 2010; Bower, 2011; Dasgupta, 2019; Gomez & Svihla, 2019; Huang et al., 2006; Joo et al., 2014; Tang, 2013), other studies involved teams of stakeholders from multiple disciplinary backgrounds (e.g., Blanchard et al., 2015; Charlton & Avramides, 2016; Diefes-Dux et al., 2010; Friedrichsen et al., 2017; Hardré et al., 2010; Weber et al., 2014). To highlight the range of DBR team sizes present in our review, we compare the studies by Dasgupta (2019) and Blanchard et al. (2015). On the one hand, Dasgupta (2019, p. 399) conducted a DBR study as PhD dissertation research, describing the need to find a teacher collaborator with “…interest … in providing an opportunity for … students to engage in an iterative engineering design activity, and the willingness …to work with researchers and implement the research curriculum during the regular science period.” On the other hand, Blanchard et al. (2015) conducted a multi-year, multi-institution DBR project in collaboration with teachers and administrators across three middle schools. From these data we conclude that a) the challenges of DBR team development may be prevalent among some EER studies and not others, and b) DBR in EER teams can be large and diverse enough to warrant efforts placed into team development.

We have felt the challenges of building an interdisciplinary team within our own DBR in EER project. In partnering with disciplinary researchers in engineering and computer science, we have come to consider how academic cultures of disparate disciplines may affect the development of an interdisciplinary DBR team. Borrego (2007, p. 98) reported on difficulties experienced by disciplinary engineering faculty in “valuing a collaborative approach to engineering education research and incorporating the perspectives and methodological expertise of multiple disciplines.” Generally, disciplinary researchers act (and are rewarded for acting) as expert researchers, charting their own course for discovery within their primary knowledge domain. Realistically, DBR does not provide the same opportunities for domain-specific discovery that disciplinary research does. This reality can result in disciplinary researchers losing interest, especially when DBR projects run over several years and place varying amounts of focus on any single disciplinary expertise.

We have found that the development of an interdisciplinary DBR team takes attention, patience, and generation of sustained interest in the actual intervention itself. We have found that DBR collaborators, even those from engineering disciplines, will not necessarily be familiar or comfortable with the iterative process of design when it is intertwined with research. We have learned the importance of establishing and periodically reinforcing team member connection to and excitement about the educational intervention. For example, we found that encouraging engineering team members to present early design iterations and preliminary technical findings at academic conferences and university-sponsored poster sessions helped them to feel personal ownership in the intervention and remain enthusiastic and committed to project goals. Conferences having a disciplinary focus (e.g., mechanical engineering), rather than an education focus, worked best for this; today, many disciplinary engineering conferences sponsor education-focused sessions.

Need for Expertise across Multiple Educational Research Approaches and Methods

Along with developing an interdisciplinary team, DBR requires engineering education researchers to engage with a variety of data collection methods and to synthesize and integrate multiple forms of data during analysis. As evidence of this challenge, we note that 16/24 studies in this review reported using both quantitative and qualitative methods. Even in studies that reported using purely quantitative or qualitative methods, researchers employed several data collection and analysis methods within a single study.

In our DBR work, we have found engagement across a range of education research methods to be both exhilarating and exhausting. Exhilaration has come from the freedom to consider (and learn) new approaches to data collection and from seeing how the specific information gained from each method provides needed insights into the intervention design. Exhaustion has, at times, crept in due to the need to prepare and submit multiple research protocols to our institutional review board (IRB) based on the specific combinations of research sites (e.g., undergraduate class, K–12 STEM outreach camp), methods, and procedures being employed. To help combat methods-induced exhaustion, we have developed collaborative relationships with protocol reviewers in our university’s IRB office. Protocol reviewers provide us feedback on early protocol drafts and guidance on composing and organizing protocols efficiently when working with multiple participant populations (e.g., K–12 students and undergraduates).

We also engage with methods-based experts within the college of education at our university, who assist with instrument development and analysis techniques as needed. Last, where feasible, we use technology, such as online transcription services, to reduce the time and cost of transcribing qualitative video and audio recordings.

Funding DBR Projects

In our review, 15/24 (62%) studies reported having funding sources to conduct DBR research; five of these studies reported having multiple funding sources. Three studies reported funding through five-year mechanisms (CAREER and IUSE/PFE-RED programs) from the National Science Foundation (NSF); another study was funded through the NSF RET program. Other U.S. funding sources included the Eunice Kennedy Shriver National Institute of Child Health and Human Development, the Institute of Education Sciences (IES), and the U.S. Department of Education. International funding support was received from the European Union under the Practice-based Experiential Learning Analytics Research and Support (PELARS) program, Simon Fraser University, the Swedish Research Council and Swedish National Agency for Higher Education, the Ministry of Science and Technology of Taiwan, the Taiwanese National Science Council, and National Taiwan University.

Based on these findings, we suggest that, currently, funding may act as an obstacle to DBR in EER. For example, a perceived need to fund DBR work (which can last a decade or more) through multiple, separate grants (e.g., Bernhard, 2010) may be an obvious deterrent. In our own DBR work, we encountered substantial challenges finding an initial funding mechanism able to support a multi-year, educational technology development project. At that time, solicitations that supported technology development did not offer sufficient funds for an interdisciplinary team endeavor occurring over several years. In contrast, solicitations that did fund multi-year, interdisciplinary education projects at sufficient funding levels did not support technology development as part of the project work plan. Therefore, we suggest that funding agencies may best support DBR research through an assessment of current funding opportunities, with the goal of developing new programs and/or funding models specifically designed to support the interdisciplinary and emergent work that is suited to DBR.

Disseminating DBR Results

After securing funding for a three-year DBR project, our challenges shifted toward the dissemination of intermediate and in-progress findings resulting from our ongoing work. In reviewing the studies (and reference lists) included in this review, we found it difficult to discern a single pattern of DBR in EER publishing. In some cases, a journal publication appeared as the first or sole publication of the DBR study. In some of these cases, the journal publication covered a piece of a larger DBR study, and there was evidence of other journal and/or conference papers published on pilot studies or other facets of the larger DBR study. In other cases, there was a record of conference papers and/or dissertations on the work published prior to the journal article. In still other cases, the research-focused journal publication was accompanied by a practitioner-focused journal publication.

Questions of what, where, and how to publish DBR studies are being addressed by today’s DBR scholars, who argue the need to disseminate and share intermediate outcomes and findings as they evolve during DBR work. Kelly (2014), for example, suggested that the field of engineering education could:

…benefit from synchronous sharing of intermediate results during the various stages… of [DBR] research. Such stage-linked cross-fertilization would work against “silo” research in each content area. It would dramatically shorten the time that one community could learn from the other. Under current dissemination models, an engineering education researcher is highly unlikely to chance on a paper of a chemistry educator—[especially] one that would be published many months or even years after the actual experiments (p. 508).

While the number and variety of publication outlets uncovered by this review (Figure 2) suggest the availability of many opportunities for publishing DBR in EER work, the murky and as yet uncharted course for publishing may discourage new engineering education researchers from pursuing DBR studies. To support and encourage DBR in EER efforts, we suggest that EER journals develop DBR-focused special issues and/or publish specific guidelines for reporting intermediate and final DBR findings in regular issues to ease uncertainty about publishing DBR studies.

Conclusion

In this narrative literature review, we identified 24 studies as empirical applications of design-based research within the field of engineering education. Our review provides insights into the ways design-based research is currently being implemented to advance engineering education imperatives and highlights practical opportunities and challenges for conducting design-based research in the context of engineering education. Across all studies, we found international use of design-based research at all levels of engineering education and in both formal and informal settings; findings and/or products from the studies are represented across a range of publication venues in engineering education, online learning, technology, and general education journals. Interestingly, less than one-half of the studies sought to develop technology-based interventions. While most studies focused efforts on developing interventions to improve professional skill development, teaching, assessment, or learning of engineering content, a small set of studies focused on developing interventions to address societal problems in engineering education, such as improving interest in, perceptions of, and participation in engineering careers. This work synthesizes existing empirical studies and provides a foundation for promoting and strengthening use of design-based research in future engineering education research.

Additional File

The additional file for this article can be found as follows:

Notes

1 Newstetter (2005) described their study as “follow[ing] the evolutionary trajectory of curricular design efforts over four years using Problem-based Learning (PBL) in the Department of Biomedical Engineering [BME].” The department included both graduate and undergraduate BME programs. While the PBL curricular efforts were implemented in both programs, this study was coded as graduate engineering education because the cognitive apprenticeship design experiments described within the study were conducted with Ph.D. students in the graduate BME program.

2 Kong et al. (2009) reported that the mean age of student participants was 9.91 years (SD = 0.36).

3 Friedrichsen et al. (2017) documented the diffusion (rather than the development) of the Concept Warehouse intervention in engineering education using a design-based research approach.

4 Blanchard et al. (2015) reported that the context of the study was a predominantly low-income, majority-minority community and provided detailed demographic information about the participants, including race, ethnicity, and gender data. Gomez and Svihla (2019) reported that the study was set at a Hispanic-serving institution in the southwestern United States.

Ethics and Consent

Ethics approval was not required for this study.

Acknowledgements

The authors thank engineering education librarian Pamela Martin for generous assistance with database search procedures.

This work is supported by the U.S. Office of Naval Research (ONR) through the Navy and Marine Corps Science, Technology, Engineering & Mathematics (STEM) Education, Outreach, and Workforce Program under Award N000141812770. All findings and opinions expressed herein are those of the authors and do not necessarily reflect the views of ONR.

Competing Interests

The authors have no competing interests to declare.

Author Contributions

AM conceived of the study and methodological approach; wrote the initial abstract submission; assisted with data collection; shared data analysis tasks; and led the writing and revision of the manuscript. LC led data collection; shared analysis tasks; developed appendix materials; and contributed to drafting and revising the manuscript content. Both agreed to be named on the author list and approved the final author list.

References

    References marked with an asterisk* indicate articles that were included in the review

  1. Anderson, T., & Shattuck, J. (2012). Design-Based Research: A decade of progress in education research? Educational Researcher, 41(1), 16–25. DOI: https://doi.org/10.3102/0013189X11428813 

  2. Bakker, A. (2019). Design Research in Education: A Practical Guide for Early Career Researchers. New York, NY: Routledge. DOI: https://doi.org/10.4324/9780203701010 

  3. *Bernhard, J. (2010). Insightful learning in the laboratory: Some experiences from 10 years of designing and using conceptual labs. European Journal of Engineering Education, 35(3), 271–287. DOI: https://doi.org/10.1080/03043791003739759 

  4. *Blanchard, S., Judy, J., Muller, C., Crawford, R. H., Petrosino, A. J., White, C. K., Lin, F.-A., & Wood, K. L. (2015). Beyond blackboards: Engaging underserved middle school students in engineering. Journal of Pre-College Engineering Education Research (J-PEER), 5(1). DOI: https://doi.org/10.7771/2157-9288.1084 

  5. Borrego, M. (2007). Conceptual difficulties experienced by trained engineers learning educational research methods. Journal of Engineering Education, 96, 91–102. DOI: https://doi.org/10.1002/j.2168-9830.2007.tb00920.x 

  6. Borrego, M., Foster, M. J., & Froyd, J. E. (2014). Systematic literature reviews in engineering education and other developing interdisciplinary fields. Journal of Engineering Education, 103(1), 45–76. DOI: https://doi.org/10.1002/jee.20038 

  7. Borrego, M., Foster, M. J., & Froyd, J. E. (2015). What is the state of the art of systematic review in engineering education? Journal of Engineering Education, 104(2), 212–242. DOI: https://doi.org/10.1002/jee.20069 

  8. *Bower, M. (2011). Redesigning a web-conferencing environment to scaffold computing students’ creative design processes. Journal of Educational Technology & Society, 14(1), 27–42. https://www.jstor.org/stable/jeductechsoci.14.1.27?seq=1#metadata_info_tab_contents 

  9. Brown, A. L. (1992). Design experiments: Theoretical and methodological challenges in creating complex interventions in classroom settings. The Journal of the Learning Sciences, 2(2), 141–178. DOI: https://doi.org/10.1207/s15327809jls0202_2 

  10. *Charlton, P., & Avramides, K. (2016). Knowledge construction in computer science and engineering when learning through making. IEEE Transactions on Learning Technologies, 9(4), 379–390. DOI: https://doi.org/10.1109/TLT.2016.2627567 

  11. Cobb, P., Confrey, J., diSessa, A., Lehrer, R., & Schauble, L. (2003). Design experiments in educational research. Educational Researcher, 32(1), 9–13. DOI: https://doi.org/10.3102/0013189X032001009 

  12. Cobb, P., Jackson, K., & Dunlap, C. (2017). Conducting design studies to investigate and support mathematics students’ and teachers’ learning. In J. Cai (Ed.), First compendium for research in mathematics education (pp. 208–233). Reston, VA: National Council of Teachers of Mathematics. 

  13. Collins, A. (1992). Toward a design science of education. In E. Scanlon & T. O’Shea (Eds.), New directions in educational technology, 96, 15–22. Berlin, Heidelberg: Springer. DOI: https://doi.org/10.1007/978-3-642-77750-9_2 

  14. Collins, J. A., & Fauser, C. J. M. (2005). Balancing the strengths of systematic and narrative reviews. Human Reproduction Update, 11(2), 103–104. DOI: https://doi.org/10.1093/humupd/dmh058 

  15. Creamer, E. G. (2017). An introduction to fully integrated mixed methods research. SAGE. DOI: https://doi.org/10.4135/9781071802823 

  16. Creswell, J. W. (2014). Research design: Quantitative, qualitative, and mixed methods approaches (4th ed.). Thousand Oaks, CA: Sage Publications Inc. 

  17. Crotty, M. (1998). The foundations of social research: Meanings and perspective in the research process. London: SAGE Publishers. DOI: https://doi.org/10.4324/9781003115700 

  18. *Dasgupta, C. (2019). Improvable models as scaffolds for promoting productive disciplinary engagement in an engineering design activity. Journal of Engineering Education, 108(3), 394–417. DOI: https://doi.org/10.1002/jee.20282 

  19. *Diefes-Dux, H. A., Zawojewski, J. S., & Hjalmarson, M. A. (2010). Using educational research in the design of evaluation tools for open-ended problems. International Journal of Engineering Education, 26(4). https://www.ijee.ie/contents/c260410.html 

  20. Edelson, D. C. (2002). Design research: What we learn when we engage in design. Journal of the Learning Sciences, 11(1), 105–121. DOI: https://doi.org/10.1207/S15327809JLS1101_4 

  21. Fan, S.-C., & Yu, K.-C. (2017). How an integrative STEM curriculum can benefit students in engineering design practices. International Journal of Technology and Design Education, 27(1), 107–129. DOI: https://doi.org/10.1007/s10798-015-9328-x 

  22. *Fan, S.-C., Yu, K.-C., & Lou, S.-J. (2018). Why do students present different design objectives in engineering design projects? International Journal of Technology and Design Education, 28(4), 1039–1060. DOI: https://doi.org/10.1007/s10798-017-9420-5 

  23. Ferrari, R. (2015). Writing narrative style literature reviews. Medical Writing, 24(4), 230–235. DOI: https://doi.org/10.1179/2047480615Z.000000000329 

  24. *Friedrichsen, D. M., Smith, C., & Koretsky, M. D. (2017). Propagation from the start: The spread of a concept-based instructional tool. Educational Technology Research and Development, 65(1), 177–202. DOI: https://doi.org/10.1007/s11423-016-9473-2 

  25. Gaver, W. (2012). What should we expect from research through design? Paper presented at the SIGCHI Conference on Human Factors in Computing Systems. DOI: https://doi.org/10.1145/2207676.2208538 

  26. *Gomez, J. R., & Svihla, V. (2019). Building individual accountability through consensus. Chemical Engineering Education, 53(2). DOI: https://doi.org/10.18260/2-1-370.660-108007 

  27. Green, B. N., Johnson, C. D., & Adams, A. (2006). Writing narrative literature reviews for peer-reviewed journals: Secrets of the trade. Journal of Chiropractic Medicine, 5(3), 101–117. DOI: https://doi.org/10.1016/S0899-3467(07)60142-6 

  28. *Guisasola, J., Zuza, K., Ametller, J., & Gutierrez-Berraondo, J. (2017). Evaluating and redesigning teaching learning sequences at the introductory physics level. Physical Review Physics Education Research, 13(2), 020139. DOI: https://doi.org/10.1103/PhysRevPhysEducRes.13.020139 

  29. *Guloy, S., Salimi, F., Cukierman, D., & McGee Thompson, D. (2017). Insights on supporting learning during computing science and engineering students’ transition to university: A design-oriented, mixed methods exploration of instructor and student perspectives. Higher Education, 73(3), 479–497. DOI: https://doi.org/10.1007/s10734-016-0097-6 

  30. *Hardré, P. L., Nanny, M., Refai, H., Ling, C., & Slater, J. (2010). Engineering a dynamic science learning environment for K–12 teachers. Teacher Education Quarterly (Spring), 37(2), 157–178. https://www.jstor.org/stable/23479594 

  31. Hertzberg, J., Leppek, B., & Gray, K. (2012). Art for the sake of improving attitudes towards engineering. 2012 ASEE Annual Conference & Exposition, San Antonio, TX. DOI: https://doi.org/10.18260/1-2--20966 

  32. Hesse-Biber, S. N. (2014). Feminist approaches to in-depth interviewing. In S. N. Hesse-Biber (Ed.), Feminist research practice: A primer (pp. 182–232). Los Angeles, CA: Sage Publications. 

  33. *Hira, A., & Hynes, M. M. (2019). Design-based research to broaden participation in pre-college engineering: Research and practice of an interest-based engineering challenges framework. European Journal of Engineering Education, 44(1–2), 103–122. DOI: https://doi.org/10.1080/03043797.2017.1405243 

  34. Hoadley, C. P. (2002). Creating context: Design based research in creating and understanding CSCL. Conference on Computer Support for Collaborative Learning: Foundation for a CSCL Community. http://www.designbasedresearch.org/reppubs/hoadley-cscl02.pdf. DOI: https://doi.org/10.3115/1658616.1658679 

  35. *Huang, K., Dong, Y., & Ge, X. (2006). From, by, and for the OSSD: Software engineering education using an open source software approach. Innovate: Journal of Online Education, 3(1). https://nsuworks.nova.edu/innovate/vol3/iss1/7 

  36. *Joo, K. P., Andrés, C., & Shearer, R. (2014). Promoting distance learners’ cognitive engagement and learning outcomes: Design-based research in the Costa Rican National University of Distance Education. The International Review of Research in Open and Distributed Learning, 15(6). DOI: https://doi.org/10.19173/irrodl.v15i6.1908 

  37. Kelly, A. E. (2004). Design research in education: Yes, but is it methodological? Journal of the Learning Sciences, 13(1), 115–128. DOI: https://doi.org/10.1207/s15327809jls1301_6 

  38. Kelly, A. E. (2014). Design-based research in engineering education: Current state and next steps. In A. Johri & B. M. Olds (Eds.), Cambridge Handbook of Engineering Education Research. New York: Cambridge University Press. DOI: https://doi.org/10.1017/CBO9781139013451.032 

  39. Kolko, J. (2012). Wicked problems: Problems worth solving. Licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License. Retrieved December 1, 2020, from http://www.wickedproblems.com/ 

  40. Kolmos, A. (2015). Design-based research: A strategy for change in engineering education. In S. H. Christensen, C. Didier, A. Jamison, M. Meganck, C. Mitcham, & B. Newberry (Eds.), International perspectives on engineering education (pp. 373–392). Switzerland: Springer International Publishing. DOI: https://doi.org/10.1007/978-3-319-16169-3_18 

  41. *Kong, S. C., Yeung, Y. Y., & Wu, X. Q. (2009). An experience of teaching for learning by observation: Remote-controlled experiments on electrical circuits. Computers & Education, 52(3), 702–717. DOI: https://doi.org/10.1016/j.compedu.2008.11.011 

  42. *Langman, C., Zawojewski, J., McNicholas, P., Cinar, A., Brey, E., Bilgic, M., & Mehdizadeh, H. (2019). Disciplinary learning from an authentic engineering context. Journal of Pre-College Engineering Education Research (J-PEER), 9(1). DOI: https://doi.org/10.7771/2157-9288.1178 

  43. Levy, R. A., & Begeny, J. C. (2018). Understanding the realities of conducting school-based intervention research. DOI: https://doi.org/10.4135/9781526431837 

  44. *Liu, Y.-H., & Yu, F.-Y. (2019). Supporting active learning and formative evaluation via teaching-by-questioning in classrooms: design, development, and preliminary evaluation of an online learning system. Interactive Learning Environments, 27(5–6), 841–855. DOI: https://doi.org/10.1080/10494820.2018.1489858 

  45. *Moore, T. J., Glancy, A. W., Tank, K. M., Kersten, J. A., Smith, K. A., & Stohlmann, M. S. (2014). A framework for quality K–12 engineering education: Research and development. Journal of Pre-College Engineering Education Research (J-PEER), 4(1). DOI: https://doi.org/10.7771/2157-9288.1069 

  46. *Newstetter, W. C. (2005). Designing cognitive apprenticeships for biomedical engineering. Journal of Engineering Education, 94(2), 207–213. DOI: https://doi.org/10.1002/j.2168-9830.2005.tb00841.x 

  47. Oxman, A. D., Cook, D. J., & Guyatt, G. H. (1994). Users’ guides to the medical literature, VI. How to use an overview. Journal of the American Medical Association, 272, 1367–1371. DOI: https://doi.org/10.1001/jama.272.17.1367 

  48. Pawley, A. L. (2017). Shifting the “default”: The case for making diversity the expected condition for engineering education and making whiteness and maleness visible. Journal of Engineering Education, 106(4), 531–533. DOI: https://doi.org/10.1002/jee.20181 

  49. Phillips, D. C., & Dolle, J. R. (2006). From Plato to Brown and beyond: Theory, practice, and the promise of design experiments. In L. Verschaffel, F. Dochy, M. Boekaerts, & S. Vosniadou (Eds.), Instructional psychology: Past, present and future trends: Sixteen essays in honour of Erik De Corte (pp. 277–293). Amsterdam, The Netherlands: Elsevier. 

  50. Rittel, H. W. J., & Webber, M. M. (1973). Dilemmas in a general theory of planning. Policy Sciences, 4, 155–169. DOI: https://doi.org/10.1007/BF01405730 

  51. Rossman, J. S., & Skvirsky, K. A. (2010). You don’t need a weatherman to know which way the wind blows: The art & science of flow visualization. Paper presented at the 40th Annual Frontiers in Education Conference Washington, DC. DOI: https://doi.org/10.1109/FIE.2010.5673176 

  52. Rothman, J., & Thomas, E. J. (Eds.). (1994). Intervention research: Design and development for human service. Binghamton, NY: The Haworth Press, Inc. DOI: https://doi.org/10.1093/sw/41.1.111-a 

  53. Salkind, N. J. (Ed.) (2010). Encyclopedia of research design. SAGE Publications, Inc. DOI: https://doi.org/10.4135/9781412961288 

  54. Sandoval, W. (2014). Conjecture mapping: An approach to systematic educational design research. Journal of the Learning Sciences, 23(1), 18–36. DOI: https://doi.org/10.1080/10508406.2013.778204 

  55. Schreier, M. (2014). Qualitative content analysis. In U. Flick (Ed.), The SAGE handbook of qualitative data analysis (pp. 170–183). Los Angeles, CA: SAGE. DOI: https://doi.org/10.4135/9781446282243.n12 

  56. Stappers, P. J., & Giaccardi, E. (2002). Chapter 43. Research through design. In The encyclopedia of human-computer interaction (2nd ed.): Interaction Design Foundation. https://www.interaction-design.org/literature/book/the-encyclopedia-of-human-computer-interaction-2nd-ed/research-through-design 

  57. *Tang, K.-S. (2013). Out-of-school media representations of science and technology and their relevance for engineering learning. Journal of Engineering Education, 102(1), 51–76. DOI: https://doi.org/10.1002/jee.20007 

  58. The Design-Based Research Collective. (2003). Design-based research: An emerging paradigm for educational inquiry. Educational Researcher, 32(1), 5–8. DOI: https://doi.org/10.3102/0013189X032001005 

  59. *Todd, A. M., Zydney, J. M., & Keller, J. M. (2011). Developing an online learning community for engineering, cooperative-education students: A design-based research study. Journal of Cooperative Education and Internships, 45(01), 67–79. 

  60. Wang, F., & Hannafin, M. J. (2005). Design-based research and technology-enhanced learning environments. Educ. Technol. Res. Dev., 53(4), 5–23. DOI: https://doi.org/10.1007/BF02504682 

  61. *Weber, N. R., Strobel, J., Dyehouse, M. A., Harris, C., David, R., Fang, J., & Hua, I. (2014). First-year students’ environmental awareness and understanding of environmental sustainability through a life cycle assessment module. Journal of Engineering Education, 103(1), 154–181. DOI: https://doi.org/10.1002/jee.20032 

  62. *Yueh, H. P., Chen, T. L., Lin, W., & Sheen, H. J. (2014). Developing digital courseware for a virtual nano-biotechnology laboratory: A design-based research approach. Educational Technology & Society, 17(2), 158–168. https://www.jstor.org/stable/jeductechsoci.17.2.158?seq=1#metadata_info_tab_contents 

  63. Zheng, L. (2015). A systematic literature review of design-based research from 2004 to 2013. Journal of Computers in Education, 2(3), 399–420. DOI: https://doi.org/10.1007/s40692-015-0036-z 
