Exploratory Research – Types, Methods and Examples


Definition:

Exploratory research is a type of research design that is used to investigate a research question when the researcher has limited knowledge or understanding of the topic or phenomenon under study.

The primary objective of exploratory research is to gain insights and gather preliminary information that can help the researcher better define the research problem and develop hypotheses or research questions for further investigation.

Exploratory Research Methods

There are several types of exploratory research, including:

Literature Review

This involves conducting a comprehensive review of existing published research, scholarly articles, and other relevant literature on the research topic or problem. It helps to identify the gaps in the existing knowledge and to develop new research questions or hypotheses.

Pilot Study

A pilot study is a small-scale preliminary study that helps the researcher to test research procedures, instruments, and data collection methods. This type of research can be useful in identifying any potential problems or issues with the research design and refining the research procedures for a larger-scale study.

Case Study

This involves an in-depth analysis of a particular case or situation to gain insights into the underlying causes, processes, and dynamics of the issue under investigation. It can be used to develop a more comprehensive understanding of a complex problem, and to identify potential research questions or hypotheses.

Focus Groups

Focus groups involve a group discussion that is conducted to gather opinions, attitudes, and perceptions from a small group of individuals about a particular topic. This type of research can be useful in exploring the range of opinions and attitudes towards a topic, identifying common themes or patterns, and generating ideas for further research.

Expert Opinion

This involves consulting with experts or professionals in the field to gain their insights, expertise, and opinions on the research topic. This type of research can be useful in identifying the key issues and concerns related to the topic, and in generating ideas for further research.

Observational Research

Observational research involves gathering data by observing people, events, or phenomena in their natural settings to gain insights into behavior and interactions. This type of research can be useful in identifying patterns of behavior and interactions, and in generating hypotheses or research questions for further investigation.

Open-ended Surveys

Open-ended surveys allow respondents to provide detailed and unrestricted responses to questions, providing valuable insights into their attitudes, opinions, and perceptions. This type of research can be useful in identifying common themes or patterns, and in generating ideas for further research.

Data Analysis Methods

Commonly used data analysis methods in exploratory research include the following:

Content Analysis

This method involves analyzing text or other forms of data to identify common themes, patterns, and trends. It can be useful in identifying patterns in the data and developing hypotheses or research questions. For example, if the researcher is analyzing social media posts related to a particular topic, content analysis can help identify the most frequently used words, hashtags, and topics.
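To make the social-media example concrete, the frequency-tallying step of content analysis can be sketched in a few lines of Python. The sample posts and the simple `#`-prefix rule for spotting hashtags are invented for illustration:

```python
from collections import Counter
import re

def count_terms(posts):
    """Tally word and hashtag frequencies across a list of posts."""
    words, hashtags = Counter(), Counter()
    for post in posts:
        for token in re.findall(r"#?\w+", post.lower()):
            # Route tokens to the hashtag or word tally
            (hashtags if token.startswith("#") else words)[token] += 1
    return words, hashtags

# Invented sample posts for demonstration
posts = [
    "Loving the new #remote work policy",
    "#remote work is here to stay",
]
words, hashtags = count_terms(posts)
top_hashtags = hashtags.most_common(1)  # most frequent hashtag first
```

Frequency counts like these are only a starting point; the researcher still has to interpret which recurring terms signal meaningful themes.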

Thematic Analysis

This method involves identifying and analyzing patterns or themes in qualitative data such as interviews or focus groups. The researcher identifies recurring themes or patterns in the data and then categorizes them into different themes. This can be helpful in identifying common patterns or themes in the data and developing hypotheses or research questions. For example, a thematic analysis of interviews with healthcare professionals about patient care may identify themes related to communication, patient satisfaction, and quality of care.
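Thematic analysis is an interpretive process, but the mechanical step of tagging excerpts against an emerging codebook can be sketched in Python. The codebook below, with its themes and indicator keywords, is hypothetical, standing in for codes a researcher might derive while reading healthcare interview transcripts:

```python
# Hypothetical codebook: themes mapped to indicator keywords
CODEBOOK = {
    "communication": {"explain", "listen", "told"},
    "patient satisfaction": {"happy", "satisfied", "complaint"},
}

def code_excerpt(excerpt, codebook=CODEBOOK):
    """Return the themes whose indicator keywords appear in an excerpt."""
    tokens = set(excerpt.lower().split())
    return {theme for theme, keywords in codebook.items() if tokens & keywords}

themes = code_excerpt("Patients are satisfied when nurses listen carefully")
```

In practice the codebook itself is revised iteratively as new excerpts fail to fit existing themes.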

Cluster Analysis

This method involves grouping data points into clusters based on their similarities or differences. It can be useful in identifying patterns in large datasets and grouping similar data points together. For example, if the researcher is analyzing customer data to identify different customer segments, cluster analysis can be used to group similar customers together based on their demographic, purchasing behavior, or preferences.
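The customer-segmentation example can be illustrated with a minimal k-means sketch using only the standard library; the (age, monthly spend) pairs are invented, and a production analysis would use a dedicated library rather than this toy implementation:

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Minimal k-means over 2-D points: assign each point to its nearest
    centroid, then move each centroid to the mean of its cluster."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)  # initial centroids: k random points
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for x, y in points:
            nearest = min(range(k), key=lambda i: (x - centroids[i][0]) ** 2
                                                  + (y - centroids[i][1]) ** 2)
            clusters[nearest].append((x, y))
        # Recompute each centroid as its cluster mean (keep old one if empty)
        centroids = [(sum(x for x, _ in c) / len(c), sum(y for _, y in c) / len(c))
                     if c else centroids[i] for i, c in enumerate(clusters)]
    return clusters

# Invented (age, monthly spend) pairs suggesting two customer segments
customers = [(22, 30), (25, 35), (24, 32), (55, 210), (60, 200), (58, 215)]
segments = kmeans(customers, k=2)
```

On this toy data the algorithm separates the younger low-spend customers from the older high-spend ones, which is exactly the kind of preliminary segmentation exploratory research aims to surface.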

Network Analysis

This method involves analyzing the relationships and connections between data points. It can be useful in identifying patterns in complex datasets with many interrelated variables. For example, if the researcher is analyzing social network data, network analysis can help identify the most influential users and their connections to other users.
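A first pass at the influential-user question is degree centrality, i.e., counting connections per node. The sketch below treats a list of invented follower relationships as an undirected edge list:

```python
from collections import Counter

def degree_centrality(edges):
    """Count how many connections each user has in an undirected edge list."""
    degree = Counter()
    for a, b in edges:
        degree[a] += 1
        degree[b] += 1
    return degree

# Invented follower relationships for illustration
edges = [("ana", "ben"), ("cal", "ben"), ("dia", "ben"), ("ana", "cal")]
degrees = degree_centrality(edges)
most_connected = degrees.most_common(1)[0][0]
```

Degree is only the simplest centrality measure; exploratory network analyses often move on to betweenness or eigenvector centrality once initial patterns emerge.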

Grounded Theory

This method involves developing a theory or explanation based on the data collected during the exploratory research process. The researcher develops a theory or explanation that is grounded in the data, rather than relying on pre-existing theories or assumptions. This can be helpful in developing new theories or explanations that are supported by the data.

Applications of Exploratory Research

Exploratory research has many practical applications across various fields. Here are a few examples:

  • Marketing Research: In marketing research, exploratory research can be used to identify consumer needs, preferences, and behavior. It can also help businesses understand market trends and identify new market opportunities.
  • Product Development: In product development, exploratory research can be used to identify customer needs and preferences, as well as potential design flaws or issues. This can help companies improve their product offerings and develop new products that better meet customer needs.
  • Social Science Research: In social science research, exploratory research can be used to identify new areas of study, as well as develop new theories and hypotheses. It can also be used to identify potential research methods and approaches.
  • Healthcare Research: In healthcare research, exploratory research can be used to identify new treatments, therapies, and interventions. It can also be used to identify potential risk factors or causes of health problems.
  • Education Research: In education research, exploratory research can be used to identify new teaching methods and approaches, as well as identify potential areas of study for further research. It can also be used to identify potential barriers to learning or achievement.

Examples of Exploratory Research

Here are some more examples of exploratory research from different fields:

  • Social Science: A researcher wants to study the experience of being a refugee, but there is limited existing research on this topic. The researcher conducts exploratory research by conducting in-depth interviews with refugees to better understand their experiences, challenges, and needs.
  • Healthcare: A medical researcher wants to identify potential risk factors for a rare disease, but there is limited information available. The researcher conducts exploratory research by reviewing medical records and interviewing patients and their families to identify potential risk factors.
  • Education: A teacher wants to develop a new teaching method to improve student engagement, but there is limited information on effective teaching methods. The teacher conducts exploratory research by reviewing existing literature and interviewing other teachers to identify potential approaches.
  • Technology: A software developer wants to develop a new app, but is unsure about the features that users would find most useful. The developer conducts exploratory research by conducting surveys and focus groups to identify user preferences and needs.
  • Environmental Science: An environmental scientist wants to study the impact of a new industrial plant on the surrounding environment, but there is limited existing research. The scientist conducts exploratory research by collecting and analyzing soil and water samples, and conducting interviews with residents to better understand the impact of the plant on the environment and the community.

How to Conduct Exploratory Research

Here are the general steps to conduct exploratory research:

  • Define the research problem: Identify the research problem or question that you want to explore. Be clear about the objective and scope of the research.
  • Review existing literature: Conduct a review of existing literature and research on the topic to identify what is already known and where gaps in knowledge exist.
  • Determine the research design: Decide on the appropriate research design, which will depend on the nature of the research problem and the available resources. Common exploratory research designs include case studies, focus groups, interviews, and surveys.
  • Collect data: Collect data using the chosen research design. This may involve conducting interviews, surveys, or observations, or collecting data from existing sources such as archives or databases.
  • Analyze data: Analyze the data collected using appropriate qualitative or quantitative techniques. This may include coding and categorizing qualitative data, or running descriptive statistics on quantitative data.
  • Interpret and report findings: Interpret the findings of the analysis and report them in a way that is clear and understandable. The report should summarize the findings, discuss their implications, and make recommendations for further research or action.
  • Iterate: If necessary, refine the research question and repeat the process of data collection and analysis to further explore the topic.
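As a minimal illustration of the "analyze data" step, descriptive statistics on a small batch of invented 1-5 pilot-survey ratings can be computed with Python's standard library:

```python
import statistics

# Invented 1-5 satisfaction ratings from a small pilot survey
ratings = [4, 5, 3, 4, 2, 5, 4]

summary = {
    "n": len(ratings),
    "mean": statistics.mean(ratings),
    "median": statistics.median(ratings),
    "stdev": statistics.stdev(ratings),  # sample standard deviation
}
```

In an exploratory study such summaries are descriptive only; they suggest where to look next rather than supporting inferential claims.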

When to use Exploratory Research

Exploratory research is appropriate in situations where there is limited existing knowledge or understanding of a topic, and where the goal is to generate insights and ideas that can guide further research. Here are some specific situations where exploratory research may be particularly useful:

  • New product development: When developing a new product, exploratory research can be used to identify consumer needs and preferences, as well as potential design flaws or issues.
  • Emerging technologies: When exploring emerging technologies, exploratory research can be used to identify potential uses and applications, as well as potential challenges or limitations.
  • Developing research hypotheses: When developing research hypotheses, exploratory research can be used to identify potential relationships or patterns that can be further explored through more rigorous research methods.
  • Understanding complex phenomena: When trying to understand complex phenomena, such as human behavior or societal trends, exploratory research can be used to identify underlying patterns or factors that may be influencing the phenomenon.
  • Developing research methods: When developing new research methods, exploratory research can be used to identify potential issues or limitations with existing methods, and to develop new methods that better capture the phenomena of interest.

Purpose of Exploratory Research

The purpose of exploratory research is to gain insights and understanding of a research problem or question where there is limited existing knowledge or understanding. The objective is to explore and generate ideas that can guide further research, rather than to test specific hypotheses or make definitive conclusions.

Exploratory research can be used to:

  • Identify new research questions: Exploratory research can help to identify new research questions and areas of inquiry, by providing initial insights and understanding of a topic.
  • Develop hypotheses: Exploratory research can help to develop hypotheses and testable propositions that can be further explored through more rigorous research methods.
  • Identify patterns and trends: Exploratory research can help to identify patterns and trends in data, which can be used to guide further research or decision-making.
  • Understand complex phenomena: Exploratory research can help to provide a deeper understanding of complex phenomena, such as human behavior or societal trends, by identifying underlying patterns or factors that may be influencing the phenomena.
  • Generate ideas: Exploratory research can help to generate new ideas and insights that can be used to guide further research, innovation, or decision-making.

Characteristics of Exploratory Research

The following are the main characteristics of exploratory research:

  • Flexible and open-ended: Exploratory research is characterized by its flexible and open-ended nature, which allows researchers to explore a wide range of ideas and perspectives without being constrained by specific research questions or hypotheses.
  • Qualitative in nature: Exploratory research typically relies on qualitative methods, such as in-depth interviews, focus groups, or observation, to gather rich and detailed data on the research problem.
  • Limited scope: Exploratory research is generally limited in scope, focusing on a specific research problem or question, rather than attempting to provide a comprehensive analysis of a broader phenomenon.
  • Preliminary in nature: Exploratory research is preliminary in nature, providing initial insights and understanding of a research problem, rather than testing specific hypotheses or making definitive conclusions.
  • Iterative process: Exploratory research is often an iterative process, where the research design and methods may be refined and adjusted as new insights and understanding are gained.
  • Inductive approach: Exploratory research typically takes an inductive approach to data analysis, seeking to identify patterns and relationships in the data that can guide further research or hypothesis development.

Advantages of Exploratory Research

The following are some advantages of exploratory research:

  • Provides initial insights: Exploratory research is useful for providing initial insights and understanding of a research problem or question where there is limited existing knowledge or understanding. It can help to identify patterns, relationships, and potential hypotheses that can guide further research.
  • Flexible and adaptable: Exploratory research is flexible and adaptable, allowing researchers to adjust their methods and approach as they gain new insights and understanding of the research problem.
  • Qualitative methods: Exploratory research typically relies on qualitative methods, such as in-depth interviews, focus groups, and observation, which can provide rich and detailed data that is useful for gaining insights into complex phenomena.
  • Cost-effective: Exploratory research is often less costly than other research methods, such as large-scale surveys or experiments. It is typically conducted on a smaller scale, using fewer resources and participants.
  • Useful for hypothesis generation: Exploratory research can be useful for generating hypotheses and testable propositions that can be further explored through more rigorous research methods.
  • Provides a foundation for further research: Exploratory research can provide a foundation for further research by identifying potential research questions and areas of inquiry, as well as providing initial insights and understanding of the research problem.

Limitations of Exploratory Research

The following are some limitations of exploratory research:

  • Limited generalizability: Exploratory research is typically conducted on a small scale and uses non-random sampling techniques, which limits the generalizability of the findings to a broader population.
  • Subjective nature: Exploratory research relies on qualitative methods and is therefore subject to researcher bias and interpretation. The findings may be influenced by the researcher’s own perceptions, beliefs, and assumptions.
  • Lack of rigor: Exploratory research is often less rigorous than other research methods, such as experimental research, which can limit the validity and reliability of the findings.
  • Limited ability to test hypotheses: Exploratory research is not designed to test specific hypotheses, but rather to generate initial insights and understanding of a research problem. It may not be suitable for testing well-defined research questions or hypotheses.
  • Time-consuming: Exploratory research can be time-consuming and resource-intensive, particularly if the researcher needs to gather data from multiple sources or conduct multiple rounds of data collection.
  • Difficulty in interpretation: The open-ended nature of exploratory research can make it difficult to interpret the findings, particularly if the researcher is unable to identify clear patterns or relationships in the data.

About the author


Muhammad Hassan

Researcher, Academic Writer, Web developer


Research-Methodology

Exploratory Research

Exploratory research, as the name implies, intends merely to explore the research questions and does not intend to offer final and conclusive solutions to existing problems. This type of research is usually conducted to study a problem that has not been clearly defined yet. Conducted in order to determine the nature of the problem, exploratory research is not intended to provide conclusive evidence, but helps us to have a better understanding of the problem.

When conducting exploratory research, the researcher ought to be willing to change direction as new data and new insights are revealed. [1] Accordingly, exploratory studies are often conducted using interpretive research methods, and they answer questions such as what, why, and how.

Exploratory research design does not aim to provide the final and conclusive answers to the research questions, but merely explores the research topic with varying levels of depth. It has been noted that “exploratory research is the initial research, which forms the basis of more conclusive research. It can even help in determining the research design, sampling methodology and data collection method” [2] . Exploratory research “tends to tackle new problems on which little or no previous research has been done” [3] .

Unstructured interviews are the most popular primary data collection method in exploratory studies. Additionally, surveys, focus groups and observation methods can be used to collect primary data for this type of study.

Examples of Exploratory Research Design

The following are some examples of studies with an exploratory research design in business studies:

  • A study into the role of social networking sites as an effective marketing communication channel
  • An investigation into the ways of improvement of quality of customer services within hospitality sector in London
  • An assessment of the role of corporate social responsibility on consumer behaviour in pharmaceutical industry in the USA

Differences between Exploratory and Conclusive Research

The difference between exploratory and conclusive research is drawn by Sandhusen (2000) [4] as follows: exploratory studies result in a range of causes and alternative options for the solution of a specific problem, whereas conclusive studies identify the final information that is the only solution to the existing research problem.

In other words, exploratory research design simply explores the research questions, leaving room for further research, whereas conclusive research design aims to provide final findings for the research.

Moreover, it has been stated that “an exploratory study may not have as rigorous a methodology as is used in conclusive studies, and sample sizes may be smaller. But it helps to do the exploratory study as methodically as possible, if it is going to be used for major decisions about the way we are going to conduct our next study” [5] (Nargundkar, 2003, p.41).

Exploratory studies usually create scope for future research, and that future research may have a conclusive design. For example, ‘a study into the implications of the COVID-19 pandemic for the global economy’ is exploratory research: the COVID-19 pandemic is a recent phenomenon, and such a study can generate initial knowledge about its economic implications.

A follow-up study building on the findings of this research, ‘a study into the effects of the COVID-19 pandemic on tourism revenues in Morocco’, is causal, conclusive research. The second study can produce findings that are of practical use for decision-making.

Advantages of Exploratory Research

  • Lower costs of conducting the study
  • Flexibility and adaptability to change
  • Exploratory research is effective in laying the groundwork that will lead to future studies.
  • Exploratory studies can potentially save time by determining at the earlier stages the types of research that are worth pursuing

Disadvantages of Exploratory Research

  • Inconclusive nature of research findings
  • Exploratory studies generate qualitative information, and the interpretation of such information is subject to bias
  • These types of studies usually use small samples that may not adequately represent the target population. Accordingly, findings of exploratory research cannot be generalized to a wider population.
  • Findings of such studies are not usually useful for decision-making at a practical level.


John Dudovskiy


[1] Saunders, M., Lewis, P. & Thornhill, A. (2012) “Research Methods for Business Students”, 6th edition, Pearson Education Limited

[2] Singh, K. (2007) “Quantitative Social Research Methods” SAGE Publications, p.64

[3] Brown, R.B. (2006) “Doing Your Dissertation in Business and Management: The Reality of Research and Writing” Sage Publications, p.43

[4] Sandhusen, R.L. (2000) “Marketing” Barrons

[5] Nargundkar, R. (2008) “Marketing Research: Text and Cases”, 3rd edition, p.38


Qualitative Design Research Methods

  • Michael Domínguez, San Diego State University
  • https://doi.org/10.1093/acrefore/9780190264093.013.170
  • Published online: 19 December 2017

Emerging in the learning sciences field in the early 1990s, qualitative design-based research (DBR) is a relatively new methodological approach to social science and education research. As its name implies, DBR is focused on the design of educational innovations, and the testing of these innovations in the complex and interconnected venue of naturalistic settings. As such, DBR is an explicitly interventionist approach to conducting research, situating the researcher as a part of the complex ecology in which learning and educational innovation takes place.

With this in mind, DBR is distinct from more traditional methodologies, including laboratory experiments, ethnographic research, and large-scale implementation. Rather, the goal of DBR is not to prove the merits of any particular intervention, or to reflect passively on a context in which learning occurs, but to examine the practical application of theories of learning themselves in specific, situated contexts. By designing purposeful, naturalistic, and sustainable educational ecologies, researchers can test, extend, or modify their theories and innovations based on their pragmatic viability. This process offers the prospect of generating theory-developing, contextualized knowledge claims that can complement the claims produced by other forms of research.

Because of this interventionist, naturalistic stance, DBR has also been the subject of ongoing debate concerning the rigor of its methodology. In many ways, these debates obscure the varied ways DBR has been practiced, the varied types of questions being asked, and the theoretical breadth of researchers who practice DBR. With this in mind, DBR research may involve a diverse range of methods as researchers from a variety of intellectual traditions within the learning sciences and education research design pragmatic innovations based on their theories of learning, and document these complex ecologies using the methodologies and tools most applicable to their questions, focuses, and academic communities.

DBR has gained increasing interest in recent years. While it remains a popular methodology for developmental and cognitive learning scientists seeking to explore theory in naturalistic settings, it has also grown in importance to cultural psychology and cultural studies researchers as a methodological approach that aligns in important ways with the participatory commitments of liberatory research. As such, internal tension within the DBR field has also emerged. Yet, though approaches vary, and have distinct genealogies and commitments, DBR might be seen as the broad methodological genre in which Change Laboratory, design-based implementation research (DBIR), social design-based experiments (SDBE), participatory design research (PDR), and research-practice partnerships might be categorized. These critically oriented iterations of DBR have important implications for educational research and educational innovation in historically marginalized settings and the Global South.

  • design-based research
  • learning sciences
  • social-design experiment
  • qualitative research
  • research methods

Educational research, perhaps more than many other disciplines, is a situated field of study. Learning happens around us every day, at all times, in both formal and informal settings. Our worlds are replete with complex, dynamic, diverse communities, contexts, and institutions, many of which are actively seeking guidance and support in the endless quest for educational innovation. Educational researchers—as a source of potential expertise—are necessarily implicated in this complexity, linked to the communities and institutions through their very presence in spaces of learning, poised to contribute with possible solutions, yet often positioned as separate from the activities they observe, creating dilemmas of responsibility and engagement.

So what are educational scholars and researchers to do? These tensions invite a unique methodological challenge for the contextually invested researcher, begging them to not just produce knowledge about learning, but to participate in the ecology, collaborating on innovations in the complex contexts in which learning is taking place. In short, for many educational researchers, our backgrounds as educators, our connections to community partners, and our sociopolitical commitments to the process of educational innovation push us to ensure that our work is generative, and that our theories and ideas—our expertise—about learning and education are made pragmatic, actionable, and sustainable. We want to test what we know outside of laboratories, designing, supporting, and guiding educational innovation to see if our theories of learning are accurate, and useful to the challenges faced in schools and communities where learning is messy, collaborative, and contested. Through such a process, we learn, and can modify our theories to better serve the real needs of communities. It is from this impulse that qualitative design-based research (DBR) emerged as a new methodological paradigm for education research.

Qualitative design-based research will be examined, documenting its origins, the major tenets of the genre, implementation considerations, and methodological issues, as well as variance within the paradigm. As a relatively new methodology, much tension remains in what constitutes DBR, and what design should mean, and for whom. These tensions and questions, as well as broad perspectives and emergent iterations of the methodology, will be discussed, and considerations for researchers looking toward the future of this paradigm will be considered.

The Origins of Design-Based Research

Qualitative design-based research (DBR) first emerged in the learning sciences field among a group of scholars in the early 1990s, with the first articulation of DBR as a distinct methodological construct appearing in the work of Ann Brown (1992) and Allan Collins (1992). For learning scientists in the 1970s and 1980s, the traditional methodologies of laboratory experiments, ethnographies, and large-scale educational interventions were the only methods available. During these decades, a growing community of learning science and educational researchers (e.g., Bereiter & Scardamalia, 1989; Brown, Campione, Webber, & McGilley, 1992; Cobb & Steffe, 1983; Cole, 1995; Scardamalia & Bereiter, 1991; Schoenfeld, 1982, 1985; Scribner & Cole, 1978) interested in educational innovation and classroom interventions in situated contexts began to find the prevailing methodologies insufficient for the types of learning they wished to document, the roles they wished to play in research, and the kinds of knowledge claims they wished to explore. The laboratory, or laboratory-like settings, where research on learning was at the time happening, was divorced from the complexity of real life, and necessarily limiting. Alternatively, most ethnographic research, while more attuned to capturing these complexities and dynamics, regularly assumed a passive stance 1 and avoided interceding in the learning process, or allowing researchers to see what possibility for innovation existed from enacting nascent learning theories. Finally, large-scale interventions could test innovations in practice but lost sight of the nuance of development and implementation in local contexts (Brown, 1992; Collins, Joseph, & Bielaczyc, 2004).

Dissatisfied with these options, and recognizing that in order to study and understand learning in the messiness of socially, culturally, and historically situated settings, new methods were required, Brown (1992) proposed an alternative: Why not involve ourselves in the messiness of the process, taking an active, grounded role in disseminating our theories and expertise by becoming designers and implementers of educational innovations? Rather than observing from afar, DBR researchers could trace their own iterative processes of design, implementation, tinkering, redesign, and evaluation, as it unfolded in shared work with teachers, students, learners, and other partners in lived contexts. This premise, initially articulated as “design experiments” (Brown, 1992), would be variously discussed over the next decade as “design research” (Edelson, 2002), “developmental research” (Gravemeijer, 1994), and “design-based research” (Design-Based Research Collective, 2003), all of which reflect the original, interventionist, design-oriented concept. The latter term, “design-based research” (DBR), is used here, recognizing this as the prevailing terminology used to refer to this research approach at present. 2

Regardless of the evolving moniker, the prospects of such a methodology were extremely attractive to researchers. Learning scientists acutely aware of various aspects of situated context, and interested in studying the applied outcomes of learning theories—a task of inquiry into situated learning for which canonical methods were rather insufficient—found DBR a welcome development (Bell, 2004 ). As Barab and Squire ( 2004 ) explain: “learning scientists . . . found that they must develop technological tools, curriculum, and especially theories that help them systematically understand and predict how learning occurs” (p. 2), and DBR methodologies allowed them to do this in proactive, hands-on ways. Thus, rather than emerging as a strict alternative to more traditional methodologies, DBR was proposed to fill a niche that other methodologies were ill-equipped to cover.

Effectively, while its development is indeed linked to an inherent critique of previous research paradigms, neither Brown nor Collins saw DBR in opposition to other forms of research. Rather, by providing a bridge from the laboratory to the real world, where learning theories and proposed innovations could interact and be implemented in the complexity of lived socio-ecological contexts (Hoadley, 2004 ), new possibilities emerged. Learning researchers might “trace the evolution of learning in complex, messy classrooms and schools, test and build theories of teaching and learning, and produce instructional tools that survive the challenges of everyday practice” (Shavelson, Phillips, Towne, & Feuer, 2003 , p. 25). Thus, DBR could complement the findings of laboratory, ethnographic, and large-scale studies, answering important questions about the implementation, sustainability, limitations, and usefulness of theories, interventions, and learning when introduced as innovative designs into situated contexts of learning. Moreover, while studies involving these traditional methodologies often concluded by pointing toward implications—insights subsequent studies would need to take up—DBR allowed researchers to address implications iteratively and directly. No subsequent research was necessary, as emerging implications could be reflexively explored in the context of the initial design, offering considerable insight into how research is translated into theory and practice.

Since its emergence in 1992, DBR as a methodological approach to educational and learning research has quickly grown and evolved, used by researchers from a variety of intellectual traditions in the learning sciences, including developmental and cognitive psychology (e.g., Brown & Campione, 1996, 1998; diSessa & Minstrell, 1998), cultural psychology (e.g., Cole, 1996, 2007; Newman, Griffin, & Cole, 1989; Gutiérrez, Bien, Selland, & Pierce, 2011), cultural anthropology (e.g., Barab, Kinster, Moore, Cunningham, & the ILF Design Team, 2001; Polman, 2000; Stevens, 2000; Suchman, 1995), and cultural-historical activity theory (e.g., Engeström, 2011; Espinoza, 2009; Espinoza & Vossoughi, 2014; Gutiérrez, 2008; Sannino, 2011). Given this plurality of epistemological and theoretical fields that employ DBR, it might best be understood as a broad methodology of educational research, realized in many different, contested, heterogeneous, and distinct iterations, and engaging a variety of qualitative tools and methods (Bell, 2004). Despite tensions among these iterations, and substantial and important variances in the ways they employ design-as-research in community settings, there are several common methodological threads that unite the broad array of research that might be classified as DBR under a shared, though pluralistic, paradigmatic umbrella.

The Tenets of Design-Based Research

Why Design-Based Research?

As we turn to the core tenets of the design-based research (DBR) paradigm, it is worth considering an obvious question: Why use DBR as a methodology for educational research? To answer this, it is helpful to reflect on the original intentions for DBR, particularly, that it is not simply the study of a particular, isolated intervention. Rather, DBR methodologies were conceived of as the complete, iterative process of designing, modifying, and assessing the impact of an educational innovation in a contextual, situated learning environment (Barab & Kirshner, 2001 ; Brown, 1992 ; Cole & Engeström, 2007 ). The design process itself—inclusive of the theory of learning employed, the relationships among participants, contextual factors and constraints, the pedagogical approach, any particular intervention, as well as any changes made to various aspects of this broad design as it proceeds—is what is under study.

Considering this, DBR offers a compelling framework for the researcher interested in having an active and collaborative hand in designing for educational innovation, and interested in creating knowledge about how particular theories of learning, pedagogical or learning practices, or social arrangements function in a context of learning. It is a methodology that can put the researcher in the position of engineer, actively experimenting with aspects of learning and sociopolitical ecologies to arrive at new knowledge and productive outcomes, as Cobb, Confrey, diSessa, Lehrer, and Schauble (2003) explain:

Prototypically, design experiments entail both “engineering” particular forms of learning and systematically studying those forms of learning within the context defined by the means of supporting them. This designed context is subject to test and revision, and the successive iterations that result play a role similar to that of systematic variation in experiment. (p. 9)

This being said, how directive an engineering role the researcher takes on varies considerably among iterations of DBR. Indeed, recent approaches have argued strongly for researchers to take on more egalitarian positionalities with respect to the community partners with whom they work (e.g., Zavala, 2016), acting as collaborative designers rather than authoritative engineers.

Method and Methodology in Design-Based Research

Having established why we might use DBR, a recurring question facing the DBR paradigm is whether DBR is a methodology at all. Given the variety of intellectual and ontological traditions that employ it, and thus the pluralism of methods used in DBR to enact the "engineering" role (whatever shape that may take) that the researcher assumes, it has been argued that DBR is not, in actuality, a methodology at all (Kelly, 2004). The proliferation and diversity of approaches, methods, and types of analysis purporting to be DBR have been described as a lack of coherence that shows there is no "argumentative grammar," or methodology, present in DBR (Kelly, 2004).

The conclusions one draws in this debate will depend on one's orientations and commitments, but it is useful to note that these demands for "coherence" emerge from previous paradigms in which methodology was largely marked by a shared, coherent toolkit for data collection and data analysis. These earlier paradigmatic rules make for an odd fit with DBR. Yet if we proceed, within the qualitative tradition from which DBR emerges, by defining methodology as an approach to research shaped by the ontological and epistemological commitments of the particular researcher, and methods as the tools for research, data collection, and analysis chosen by the researcher with respect to those commitments (Gutiérrez, Engeström, & Sannino, 2016), then a compelling case for DBR as a methodology can be made (Bell, 2004).

Effectively, despite the considerable variation in how DBR has been and is employed, and despite tensions within the DBR field, we might point to considerable shared epistemic common ground among DBR researchers, all of whom are invested in an approach to research that involves engaging actively and iteratively in the design and exploration of learning theory in situated, natural contexts. This common epistemic ground, even in the face of pluralistic ideologies and choices of methods, invites a new type of methodological coherence, marked by "intersubjectivity without agreement" (Matusov, 1996), that links traditional developmental and cognitive psychology models of DBR (e.g., Brown, 1992; Brown & Campione, 1998; Collins, 1992) to more recent critical and sociocultural manifestations (e.g., Bang & Vossoughi, 2016; Engeström, 2011; Gutiérrez, 2016), and everything in between.

Put in other terms, even as DBR researchers may choose heterogeneous methods for data collection, data analysis, and reporting results complementary to the ideological and sociopolitical commitments of the particular researcher and the types of research questions under examination (Bell, 2004), a shared epistemic commitment gives the methodology shape. Indeed, the common commitment toward design innovation emerges clearly across examples of DBR methodological studies ranging in method from ethnographic analyses (Salvador, Bell, & Anderson, 1999), to studies of critical discourse within a design (Kärkkäinen, 1999), to focused examinations of metacognition of individual learners (White & Frederiksen, 1998), and beyond. Rather than indicating a lack of methodology, or methodological weakness, the use of varying qualitative methods for framing data collection and retrospective analyses within DBR, and the tensions within the epistemic common ground itself, simply reflects the scope of its utility. Learning in context is complex, contested, and messy, and the plurality of methods present across DBR allows researchers to respond dynamically to context as needed, employing the tools that best fit the questions that are present or may arise.

All this being the case, it is useful to look toward the coherent elements, the "argumentative grammar" of DBR, if you will, that can be identified across its varied iterations. Understanding these shared features, in the context and terms of the methodology itself, helps us to appreciate what is involved in developing robust and thorough DBR research, and how DBR seeks to make strong, meaningful claims around the types of research questions it takes up.

Coherent Features of Design-Based Research

Several scholars have provided comprehensive overviews and listings of what they see as the cross-cutting features of DBR, both in the context of more traditional models (e.g., Cobb et al., 2003; Design-Based Research Collective, 2003) and in regard to newer iterations (e.g., Bang & Vossoughi, 2016; Gutiérrez & Jurow, 2016). Rather than offering an overview of each of these increasingly pluralistic classifications, the intent here is to attend to three broad elements that are shared across articulations of DBR and that reflect the essential features of the methodological approach DBR offers to educational researchers.

Design research is concerned with the development, testing, and evolution of learning theory in situated contexts

This first element is perhaps most central to what DBR of all types is, anchored in what Brown ( 1992 ) was initially most interested in: testing the pragmatic validity of theories of learning by designing interventions that engaged with, or proposed, entire, naturalistic, ecologies of learning. Put another way, while DBR studies may have various units of analysis, focuses, and variables, and may organize learning in many different ways, it is the theoretically informed design for educational innovation that is most centrally under evaluation. DBR actively and centrally exists as a paradigm that is engaged in the development of theory, not just the evaluation of aspects of its usage (Bell, 2004 ; Design-Based Research Collective, 2003 ; Lesh & Kelly, 2000 ; van den Akker, 1999 ).

Effectively, where DBR is taking place, theory as a lived possibility is under examination. Specifically, in most DBR, this means a focus on “intermediate-level” theories of learning, rather than “grand” ones. In essence, DBR does not contend directly with “grand” learning theories (such as developmental or sociocultural theory writ large) (diSessa, 1991 ). Rather, DBR seeks to offer constructive insights by directly engaging with particular learning processes that flow from these theories on a “grounded,” “intermediate” level. This is not, however, to say DBR is limited in what knowledge it can produce; rather, tinkering in this “intermediate” realm can produce knowledge that informs the “grand” theory (Gravemeijer, 1994 ). For example, while cognitive and motivational psychology provide “grand” theoretical frames, interest-driven learning (IDL) is an “intermediate” theory that flows from these and can be explored in DBR to both inform the development of IDL designs in practice and inform cognitive and motivational psychology more broadly (Joseph, 2004 ).

Crucially, however, DBR entails putting the theory in question under intense scrutiny, or "into harm's way" (Cobb et al., 2003). This element is especially central to DBR, and it distinguishes DBR from the proliferation of educational-reform or educational-entrepreneurship efforts that similarly take up the discourse of "design" and "innovation." Not only is the reflexive, often participatory element of DBR absent from such efforts—that is, questioning and modifying the design to suit the learning needs of the context and partners—but the theory driving these efforts is never in question, and in many cases may be actively obscured. Indeed, it is more common to see educational-entrepreneur design innovations seek to modify a context—such as the way charter schools engage in selective pupil recruitment and intensive disciplinary practices (e.g., Carnoy et al., 2005; Ravitch, 2010; Saltman, 2007)—rather than modify their design itself, and thus allow for humility in their theory. Such "innovations" and "design" efforts are distinct from DBR, which must, in the spirit of scientific inquiry, be willing to see the learning theory flail and struggle, be modified, and evolve.

This growth and evolution of theory and knowledge is of course central to DBR as a rigorous research paradigm, moving it beyond simply the design of local educational programs, interventions, or innovations. As Barab and Squire (2004) explain:

Design-based research requires more than simply showing a particular design works but demands that the researcher (move beyond a particular design exemplar to) generate evidence-based claims about learning that address contemporary theoretical issues and further the theoretical knowledge of the field. (pp. 5–6)

DBR as a research paradigm offers a design process through which theories of learning can be tested and modified; by allowing them to operate with humility in situated conditions, new insights, new knowledge, and even new theories may emerge that can inform the field, as well as the efforts and directions of other types of research inquiry. These productive, theory-developing outcomes, or "ontological innovations" (diSessa & Cobb, 2004), represent the culmination of an effective program of DBR: the production of new ways to understand, conceptualize, and enact learning as a lived, contextual process.

Design research works to understand learning processes and the designs that support them in situated contexts

As a research methodology that operates by tinkering with “grounded” learning theories, DBR is itself grounded, and seeks to develop its knowledge claims and designs in naturalistic, situated contexts (Brown, 1992 ). This is, again, a distinguishing element of DBR—setting it apart from laboratory research efforts involving design and interventions in closed, controlled environments. Rather than attempting to focus on singular variables, and isolate these from others, DBR is concerned with the multitude of variables that naturally occur across entire learning ecologies, and present themselves in distinct ways across multiple planes of possible examination (Rogoff, 1995 ; Collins, Joseph, & Bielaczyc, 2004 ). Certainly, specific variables may be identified as dependent, focal units of analysis, but identifying (while not controlling for) the variables beyond these, and analyzing their impact on the design and learning outcomes, is an equally important process in DBR (Collins et al., 2004 ; Barab & Kirshner, 2001 ). In practice, this of course varies across iterations in its depth and breadth. Traditional models of developmental or cognitive DBR may look to account for the complexity and nuance of a setting’s social, developmental, institutional, and intellectual characteristics (e.g., Brown, 1992 ; Cobb et al., 2003 ), while more recent, critical iterations will give increased attention to how historicity, power, intersubjectivity, and culture, among other things, influence and shape a setting, and the learning that occurs within it (e.g., Gutiérrez, 2016 ; Vakil, de Royston, Nasir, & Kirshner, 2016 ).

Beyond these variations, what counts as "design" in DBR varies widely, and so too does what counts as a naturalistic setting. It has been well documented that learning occurs all the time, every day, and in every space imaginable, both formal and informal (Leander, Phillips, & Taylor, 2010), and in ways that span strictly defined setting boundaries (Engeström, Engeström, & Kärkkäinen, 1995). DBR may take place in any number of contexts, based on the types of questions asked and the learning theories and processes a researcher is interested in exploring. DBR may involve one-to-one tutoring and learning settings, single classrooms, community spaces, entire institutions, or even holistically designed ecologies (Design-Based Research Collective, 2003; Engeström, 2008; Virkkunen & Newnham, 2013). In all these cases, even the most completely designed experimental ecology, the setting remains naturalistic and situated because DBR actively embraces the uncontrollable variables that participants bring to the learning process from their situated worlds, lives, and experiences; no effort is made to control for these complicated influences of life, only to understand how they operate in a given ecology as innovation is attempted. Thus, the extent of the design reflects a broader range of qualitative and theoretical study, rather than an attempt to control or isolate some particular learning process from outside influence.

While there is much variety in what design may entail, where DBR takes place, what types of learning ecologies are under examination, and what methods are used, situated ecologies are always the setting of this work. In this way, conscious of naturalistic variables, and the influences that culture, historicity, participation, and context have on learning, researchers can use DBR to build on prior research, and extend knowledge around the learning that occurs in the complexity of situated contexts and lived practices (Collins et al., 2004 ).

Design-based research is iterative; it changes, grows, and evolves to meet the needs and emergent questions of the context, and this tinkering process is part of the research

The final shared element undergirding models of DBR is that it is an iterative, active, and interventionist process, interested in and focused on producing educational innovation by actually and actively putting design innovations into practice (Brown, 1992; Collins, 1992; Gutiérrez, 2008). Given this interventionist, active stance, tinkering with the design, and with the theory of learning informing the design, is as much a part of the research process as the outcome of the intervention or innovation itself; we learn what impacts learning as much as, if not more than, what was learned. In this sense, DBR involves a focus on analyzing the theory-driven design itself, and its implementation, as an object of study (Edelson, 2002; Penuel, Fishman, Cheng, & Sabelli, 2011), and is ultimately interested in the improvement of the design: how it unfolds, how it shifts, how it is modified and made to function productively for participants in their contexts and given their needs (Kirshner & Polman, 2013).

While DBR is iterative and contextual as a foundational methodological principle, what this means varies across conceptions of DBR. For instance, in more traditional models, Brown and Campione ( 1996 ) pointed out the dangers of “lethal mutation” in which a design, introduced into a context, may become so warped by the influence, pressures, incomplete implementation, or misunderstanding of participants in the local context, that it no longer reflects or tests the theory under study. In short, a theory-driven intervention may be put in place, and then subsumed to such a degree by participants based on their understanding and needs, that it remains the original innovative design in name alone. The assertion here is that in these cases, the research ceases to be DBR in the sense that the design is no longer central, actively shaping learning. We cannot, they argue, analyze a design—and the theory it was meant to reflect—as an object of study when it has been “mutated,” and it is merely a banner under which participants are enacting their idiosyncratic, pragmatic needs.

While the ways in which settings and individuals might disrupt designs intended to produce robust learning are certainly a tension to be cautious of in DBR, it is also worth noting that in many critical approaches to DBR such mutations, whether "lethal" to the original design or not, are seen as compelling and important moments. Here, where collaboration and community input are more central to the design process, the iterative is understood differently. A "mutation" becomes a point where reflexivity, tension, and contradiction might open the door for change, for new designs, for reconsiderations of researcher and collaborative-partner positionalities, or for ethnographic exploration into how a context takes up, shapes, and ultimately engages innovations in a particular sociocultural setting. In short, accounting for and documenting changes in design is a vital part of the DBR process, allowing researchers to respond to context in a variety of ways, always striving for their theories and designs to act with humility, and in the interest of usefulness.

With this in mind, the iterative nature of DBR means that the relationships researchers have with other design partners (educators and learners) in the ecology are incredibly important, and vital to consider (Bang et al., 2016 ; Engeström, 2007 ; Engeström, Sannino, & Virkkunen, 2014 ). Different iterations of DBR might occur in ways in which the researcher is more or less intimately involved in the design and implementation process, both in terms of actual presence and intellectual ownership of the design. Regarding the former, in some cases, a researcher may hand a design off to others to implement, periodically studying and modifying it, while in other contexts or designs, the researcher may be actively involved, tinkering in every detail of the implementation and enactment of the design. With regard to the latter, DBR might similarly range from a somewhat prescribed model, in which the researcher is responsible for the original design, and any modifications that may occur based on their analyses, without significant input from participants (e.g., Collins et al., 2004 ), to incredibly participatory models, in which all parties (researchers, educators, learners) are part of each step of the design-creation, modification, and research process (e.g., Bang, Faber, Gurneau, Marin, & Soto, 2016 ; Kirshner, 2015 ).

Considering the wide range of ideological approaches and models for DBR, we might acknowledge that DBR can be gainfully conducted through many iterations of “openness” to the design process. However, the strength of the research—focused on analyzing the design itself as a unit of study reflective of learning theory—will be bolstered by thoughtfully accounting for how involved the researcher will be, and how open to participation the modification process is. These answers should match the types of questions, and conceptual or ideological framing, with which researchers approach DBR, allowing them to tinker with the process of learning as they build on prior research to extend knowledge and test theory (Barab & Kirshner, 2001 ), while thoughtfully documenting these changes in the design as they go.

Implementation and Research Design

As with the overarching principles of design-based research (DBR), even amid the pluralism of conceptual frameworks of DBR researchers, it is possible, and useful, to trace the shared contours in how DBR research design is implemented. Though texts provide particular road maps for undertaking various iterations of DBR consistent with the specific goals, types of questions, and ideological orientations of these scholarly communities (e.g., Cole & Engeström, 2007 ; Collins, Joseph, & Bielaczyc, 2004 ; Fishman, Penuel, Allen, Cheng, & Sabelli, 2013 ; Gutiérrez & Jurow, 2016 ; Virkkunen & Newnham, 2013 ), certain elements, realized differently, can be found across all of these models, and may be encapsulated in five broad methodological phases.

Considering the Design Focus

DBR begins by considering what the focus of the design, the situated context, and the units of analysis for the research will be. Prospective DBR researchers will need to consider broader research on the "grand" theory of learning with which they work, determine what theoretical questions they have or identify "intermediate" aspects of the theory that might be studied and strengthened through a design process in situated contexts, and decide what planes of analysis (Rogoff, 1995) will be most suitable for examination. This process allows for the identification of the critical theoretical elements of a design, and for the articulation of initial research questions.

Given the conceptual framework, theoretical and research questions, and sociopolitical interests at play, researchers may undertake this, and subsequent steps in the process, on their own, or in close collaboration with the communities and individuals in the situated contexts in which the design will unfold. As such, across iterations of DBR, and with respect to the ways DBR researchers choose to engage with communities, the origin of the design will vary, and might begin in some cases with theoretical questions, or arise in others as a problem of practice (Coburn & Penuel, 2016 ), though as has been noted, in either case, theory and practice are necessarily linked in the research.

Creating and Implementing a Designed Innovation

From the consideration and identification of the critical elements, planned units of analysis, and research questions that will drive a design, researchers can then actively create (either on their own or in conjunction with potential design partners) a designed intervention reflecting these critical elements, and the overarching theory.

Here, the DBR researcher should consider what partners exist in the process and what ownership exists around these partnerships, determine exactly what the pragmatic features of the intervention/design will be and who will be responsible for them, and consider when checkpoints for modification and evaluation will be undertaken, and by whom. Additionally, researchers should at this stage consider questions of timeline and of recruiting participants, as well as what research materials will be needed to adequately document the design, its implementation, and its outcomes, and how and where collected data will be stored.

Once a design (the planned, theory-informed innovative intervention) has been produced, the DBR researcher and partners can begin the implementation process, putting the design into place and beginning data collection and documentation.

Assessing the Impact of the Design on the Learning Ecology

Chronologically, the next two methodological steps happen recursively in the iterative process of DBR. The researcher must assess the impact of the design, and then, make modifications as necessary, before continuing to assess the impact of these modifications. In short, these next two steps are a cycle that continues across the life and length of the research design.

Once a design has been created and implemented, the researcher begins to observe and document the learning, the ecology, and the design itself. Guided by and in conversation with the theory and critical elements, the researcher should periodically engage in ongoing data analysis, assessing the success of the design, and of learning, paying equal attention to the design itself, and how its implementation is working in the situated ecology.

Within the realm of qualitative research, measuring or assessing variables of learning and assessing the design may look vastly different, require vastly different data-collection and data-analysis tools, and involve vastly different research methods among different researchers.

Modifying the Design

Modification, based on ongoing assessment of the design, is what makes DBR iterative, helping the researcher extend the field’s knowledge about the theory, design, learning, and the context under examination.

Modification of the design can take many forms, from complete changes in approach or curriculum, to introducing an additional tool or mediating artifact into a learning ecology. Moreover, how modification unfolds involves careful reflection from the researcher and any co-designing participants, deciding whether modification will be an ongoing, reflexive, tinkering process, or if it will occur only at predefined checkpoints, after formal evaluation and assessment. Questions of ownership, issues of resource availability, technical support, feasibility, and communication are all central to the work of design modification, and answers will vary given the research questions, design parameters, and researchers’ epistemic commitments.

Each moment of modification marks a new phase in a DBR project, and a new round of assessing, through data analysis, the impact of the design on the learning ecology, whether to guide further modification, to report the results of the design, or, in some cases, both.

Reporting the Results of the Design

The final step in DBR methodology is to report the results of the designed intervention: how it contributed to understandings of theory, and how it impacted the local learning ecology or context. The format, genre, and final data-analysis methods used in reporting research results will vary across iterations of DBR. However, it is largely understood that, to avoid methodological confusion, DBR researchers should clearly situate themselves in the DBR paradigm by describing and detailing the design itself; articulating the theory, central elements, and units of analysis under scrutiny, what modifications occurred and what precipitated these changes, and what local effects were observed; and exploring any potential contributions to learning theory, while accounting for the context and for their interventionist role and positionality in the design. To support this, careful documentation of pragmatic and design decisions, as well as of research findings, should occur at each stage of the implementation process to enable retrospective data analysis.

Methodological Issues in the Design-Based Research Paradigm

Because of its pluralistic nature, its interventionist, nontraditional stance, and the fact that it remains in its conceptual infancy, design-based research (DBR) is replete with ongoing methodological questions and challenges, both external and internal. While many more exist, addressed here are several of the most pressing issues a prospective DBR researcher may encounter, or may want to consider in understanding the paradigm and beginning a research design.

Challenges to Rigor and Validity

Perhaps the place to begin this reflection on tensions in the DBR paradigm is the recurrent and ongoing challenge to the rigor and validity of DBR, which has asked: Is DBR research at all? Given the interventionist and activist way in which DBR invites the researcher to participate, and the shift in orientation from long-accepted research paradigms, such critiques are hardly surprising, and fall in line with broader challenges to the rigor and objectivity of qualitative social science research in general. Historically, such complaints about DBR are linked to decades of critique of any research that does not adhere to the post-positivist approach, critiques that intensified as the U.S. Department of Education began to prioritize laboratory and large-scale randomized control-trial experimentation as the “gold standard” of research design (e.g., Mosteller & Boruch, 2002).

From the outset, DBR, as an interventionist, local, situated, non-laboratory methodology, was bound to run afoul of such conservative trends. While some researchers involved in (particularly traditional developmental and cognitive) DBR have found broader acceptance within these constraints, the rigor of DBR remains contested. It has been suggested that DBR is under-theorized and over-methodologized, a haphazard way for researchers to do activist work without engaging in the development of robust knowledge claims about learning (Dede, 2004), and an approach lacking in coherence that sheltered interventionist projects of little impact on the development of learning theory and allowed researchers to make subjective, pet claims through selective analysis of large bodies of collected data (Kelly, 2003, 2004).

These critiques, however, impose an external set of criteria on DBR, desiring it to fit into the molds of rigor and coherence as defined by canonical methodologies. Bell (2004) and Bang and Vossoughi (2016) have made compelling cases for the wide variety of methods and approaches present in DBR not as a fracturing, but as a generative proliferation of different iterations that can offer powerful insights around the different types of questions that exist about learning in the infinitely diverse settings in which it occurs. Essentially, researchers have argued that within the DBR paradigm, and indeed within educational research more generally, the practical impact of research on learning, context, and practices should be a necessary component of rigor (Gutiérrez & Penuel, 2014), and the pluralism of methods and approaches available in DBR ensures that the practical impacts and needs of the varied contexts in which the research takes place will always drive the design and research tools.

These moves are emblematic of the way in which DBR is innovating and pushing on paradigms of rigor in educational research altogether, reflecting how DBR fills a complementary niche with respect to other methodologies and attends to elements and challenges of learning in lived, real environments that other types of research have consistently and historically missed. Beyond this, Brown (1992) was conscious of the concerns around data collection, validity, rigor, and objectivity from the outset, identifying this dilemma—the likelihood of having an incredible amount of data collected in a design, only a small fraction of which can be reported and shared, thus leading potentially to selective data analysis and use—as the Bartlett Effect (Brown, 1992). Since that time, DBR researchers have been aware of this challenge, actively seeking ways to mitigate this threat to validity by making data sets broadly available, documenting their design, tinkering, and modification processes, clearly situating and describing disconfirming evidence and their own position in the research, and otherwise presenting the broad scope of human and learning activity that occurs within designs in large learning ecologies as comprehensively as possible.

Ultimately, however, these responses are likely to always be insufficient as evidence of rigor to some, for the root dilemma is around what “counts” as education science. While researchers interested and engaged in DBR ought rightly to continue to push themselves to ensure the methodological rigor of their work and chosen methods, it is also worth noting that DBR should seek to hold itself to its own criteria of assessment. This reflects broader trends in qualitative educational research that push back on narrow constructions of what “counts” as science, recognizing the ways in which new methodologies and approaches to research can help us examine aspects of learning, culture, and equity that have continued to be blind spots for traditional education research; invite new voices and perspectives into the process of achieving rigor and validity (Erickson & Gutiérrez, 2002); bolster objectivity by bringing it into conversation with the positionality of the researcher (Harding, 1993); and perhaps most important, engage in axiological innovation (Bang, Faber, Gurneau, Marin, & Soto, 2016), or the exploration of and design for what is “good, right, true, and beautiful . . . in cultural ecologies” (p. 2).

Questions of Generalizability and Usefulness

The generalizability of research results in DBR has been an ongoing and contentious issue in the development of the paradigm. Indeed, by the standards of canonical methods (e.g., laboratory experimentation, ethnography), these local, situated interventions should lack generalizability. While there is reason to discuss and question the merit of generalizability as a goal of qualitative research at all, researchers in the DBR paradigm have long been conscious of this issue. Understanding the question of generalizability around DBR, and how the paradigm has responded to it, can be done in two ways.

First, questions specific to a particular design can be distinguished from the generalizability of the theory. Cole’s 5th Dimension work (Cole & Underwood, 2013), and its network of linked, theoretically similar sites operating nationwide with vastly different designs, is a powerful example of this approach to generalizability. Rather than focus on a single, unitary, potentially generalizable design, the project is more interested in the variability and sustainability of designs across local contexts (e.g., Cole, 1995; Gutiérrez, Bien, Selland, & Pierce, 2011; Jurow, Tracy, Hotchkiss, & Kirshner, 2012). Through attention to sustainable, locally effective innovations, conscious of the wide variation in culture and context that accompanies any and all learning processes, 5th Dimension sites each derive their idiosyncratic structures from sociocultural theory, sharing some elements, but varying others, while seeking their own “ontological innovations” based on the affordances of their contexts. This pattern reflects a key element of much of the DBR paradigm: that questions of generalizability in DBR may be about the generalizability of the theory of learning, and the variability of learning and design in distinct contexts, rather than about the particular design itself.

A second means of addressing generalizability in DBR has been to embrace the pragmatic impacts of designing innovations. This response stems from Messick’s (1992) and Schoenfeld’s (1992) arguments, early in the development of DBR, that the consequentialness and validity of DBR efforts as potentially generalizable research depend on the “usefulness” of the theories and designs that emerge. Effectively, because DBR is the examination of situated theory, a design must be able to show pragmatic impact—it must succeed at showing the theory to be useful. If there is evidence of usefulness to both the context in which it takes place and the field of educational research more broadly, then the DBR researcher can stake some broader knowledge claims that might be generalizable. As a result, the DBR paradigm tends to “treat changes in [local] contexts as necessary evidence for the viability of a theory” (Barab & Squire, 2004, p. 6). This of course does not mean that DBR is only interested in successful efforts. A design that fails or struggles can provide important information and knowledge to the field. Ultimately, though, DBR tends to privilege work that proves the usefulness of designs, whose pragmatic or theoretical findings can then be generalized within the learning science and education research fields.

With this said, the question of usefulness is not always straightforward, and is hardly unitary. While many DBR efforts—particularly those situated in developmental and cognitive learning science traditions—are interested in the generalizability of their useful educational designs (Barab & Squire, 2004; Cobb, Confrey, diSessa, Lehrer, & Schauble, 2003; Joseph, 2004; Steffe & Thompson, 2000), not all are. Critical DBR researchers have noted that if usefulness remains situated in the extant sociopolitical and sociocultural power structures—dominant conceptual and popular definitions of what useful educational outcomes are—the result will be a bar for research merit that inexorably bends toward the positivist spectrum (Booker & Goldman, 2016; Dominguez, 2015; Zavala, 2016). This could, and likely would, result in excluding the non-normative interventions and innovations that are vital for historically marginalized communities, but which might have vastly different-looking outcomes that are nonetheless useful in the sociopolitical context in which they occur. Alternative framings push on and extend this idea of usefulness, seeking to involve the perspectives and agency of situated community partners and their practices in what “counts” as generative and rigorous research outcomes (Gutiérrez & Penuel, 2014). An example in this regard is the idea of consequential knowledge (Hall & Jurow, 2015; Jurow & Shea, 2015), which suggests that consequential outcomes will be taken up by participants in and across their networks, and over time—thus a goal of consequential knowledge certainly meets the standard of being useful, but it also implicates the needs and agency of communities in determining the success and merit of a design or research endeavor in important ways that strict usefulness may miss.

Thus, the bar of usefulness that characterizes the DBR paradigm should not be approached without critical reflection. Certainly, designs that accomplish little for local contexts should be subject to intense questioning and critique; but the sociopolitical and systemic factors that might influence what “counts” as useful, in local contexts and in education science more generally, should be kept firmly in mind when designing, choosing methods, and evaluating impacts (Zavala, 2016). Researchers should think deeply about their goals, whether they are reaching for generalizability at all, and in what ways they are constructing contextual definitions of success, and should be clear about these ideologically influenced answers in their work, such that generalizability and the usefulness of designs can be adjudicated based on, and in conversation with, the intentions and conceptual framework of the research and researcher.

Ethical Concerns of Sustainability, Participation, and Telos

While there are many external challenges to the rigor and validity of DBR, another set of tensions comes from within the DBR paradigm itself. These internal critiques are not unrelated to the earlier question of the contested definition of usefulness, but they more accurately reflect questions of research ethics, growing from ideological concerns with how an intentional, interventionist stance is taken up in research as it interacts with situated communities.

Given that the nature of DBR is to design and implement some form of educational innovation, the DBR researcher will in some way be engaging with an individual or community, becoming part of a situated learning ecology, complete with a sociopolitical and cultural history. As with any research that involves providing an intervention or support, the question of what happens when the research ends is as much an ethical as a methodological one. Concerns then arise given how traditional models of DBR seem intensely focused on creating and implementing a “complete” cycle of design, while giving little attention to what happens to the community and context afterward (Engeström, 2011). In contrast to this privileging of “completeness,” sociocultural and critical approaches to DBR have suggested that if research is actually happening in naturalistic, situated contexts that authentically recognize and allow social and cultural dimensions to function (i.e., avoid laboratory-type controls to mitigate independent variables), there can never be such a thing as “complete,” for the design will, and should, live on as part of the ecology of the space (Cole, 2007; Engeström, 2000). Essentially, these internal critiques push DBR to consider sustainability, and sustainable scale, as concerns equally important to the completeness of an innovation. Not only are ethical questions involved, but accounting for the unbounded and ongoing nature of learning as a social and cultural activity can help strengthen the viability of knowledge claims made, and what degree of generalizability is reasonably justified.

Related to this question of sustainability are internal concerns regarding the nature and ethics of participation in DBR: whether partners in a design are being adequately invited to engage in the design and modification processes that will unfold in their situated contexts and lived communities (Bang et al., 2016; Engeström, 2011). DBR has actively sought to examine the multiple planes of analysis at work in a learning ecology, but has rarely attended to the subject-subject dynamics (Bang et al., 2016), or “relational equity” (DiGiacomo & Gutiérrez, 2015), that exist between researchers and participants as a point of focus. Participatory design research (PDR) models (Bang & Vossoughi, 2016) have recently emerged as a way to better attend to these important dimensions of collective participation (Engeström, 2007), power (Vakil et al., 2016), positionality (Kirshner, 2015), and relational agency (Edwards, 2007, 2009; Sannino & Engeström, 2016) as they unfold in DBR.

Both of these ethical questions—around sustainability and participation—reflect challenges to what we might call the telos, or direction, that DBR takes to innovation and research. These are questions related to whose voices are privileged, in what ways, for what purposes, and toward what ends. While DBR, like many other forms of educational research, has involved work with historically marginalized communities, it has, like many other forms of educational research, not always done so in humanizing ways. Put another way, there are ethical and political questions surrounding whether the designs, goals, and standards of usefulness we apply to DBR efforts should be purposefully activist, and have explicitly liberatory ends. To this point, critical and decolonial perspectives have pushed on the DBR paradigm, suggesting that DBR should situate itself as a space of liberatory innovation and potential, in which communities and participants can become designers and innovators of their own futures (Gutiérrez, 2005). This perspective is reflected in the social design experiment (SDE) approach to DBR (Gutiérrez, 2005, 2008, 2016; Gutiérrez & Vossoughi, 2010; Gutiérrez & Jurow, 2016), which begins in participatory fashion, engaging a community in identifying its own challenges and desires, and reflecting on the historicity of learning practices, before proleptic design efforts are undertaken that ensure that research is done with, not on, communities of color (Arzubiaga, Artiles, King, & Harris-Murri, 2008), and is intentionally focused on liberatory goals.

Global Perspectives and Unique Iterations

While design-based research (DBR) has been a methodology principally associated with educational research in the United States, its development is hardly limited to the U.S. context. Rather, while DBR emerged in U.S. settings, similar methods of situated, interventionist research focused on design and innovation were emerging in parallel in European contexts (e.g., Gravemeijer, 1994), most significantly in the work of Vygotskian scholars both in Europe and the United States (Cole, 1995; Cole & Engeström, 1993, 2007; Engeström, 1987).

Particularly, where DBR began in the epistemic and ontological terrain of developmental and cognitive psychology, this vein of design-based research began deeply grounded in cultural-historical activity theory (CHAT). This ontological and epistemic grounding meant that the approach to design was more intensively conscious of context, historicity, hybridity, and relational factors, and was framed around understanding learning as a complex, collective activity system that, through design, could be modified and transformed (Cole & Engeström, 2007). The models of DBR that emerged in this context abroad were the formative intervention (Engeström, 2011; Engeström, Sannino, & Virkkunen, 2014), which relies heavily on Vygotskian double stimulation to approach learning in nonlinear, unbounded ways, accounting for the roles of learner, educator, and researcher in a collective process, shifting, evolving, and tinkering with the design as the context needs and demands; and the Change Laboratory (Engeström, 2008; Virkkunen & Newnham, 2013), which similarly relies on the principle of double stimulation, while presenting a holistic way to approach transforming—or changing—entire learning activity systems in fundamental ways, through designs that encourage collective “expansive learning” (Engeström, 2001), by which participants can produce wholly new activity systems as the object of learning itself.

Elsewhere in the United States, still parallel to the developmental- or cognitive-oriented DBR work that was occurring, American researchers employing CHAT began to leverage the tools and aims of expansive learning in conversation with the tensions and complexity of the U.S. context (Cole, 1995; Gutiérrez, 2005; Gutiérrez & Rogoff, 2003). Like the CHAT design research of the European context, there was a focus on activity systems, historicity, nonlinear and unbounded learning, and collective learning processes and outcomes. Rather than a simple replication, however, these researchers put further attention on questions of equity, diversity, and justice in this work, as Gutiérrez, Engeström, and Sannino (2016) note:

The American contribution to a cultural historical activity theoretic perspective has been its attention to diversity, including how we theorize, examine, and represent individuals and their communities. (p. 276)

Effectively, CHAT scholars in parts of the United States brought critical and decolonial perspectives to bear on their design-focused research, focusing explicitly on the complex cultural, racial, and ethnic terrain in which they worked, and ensuring that diversity, equity, justice, and non-dominant perspectives became central principles of the types of design research conducted. The result was the emergence of the aforementioned social design experiment (e.g., Gutiérrez, 2005, 2016) and participatory design research (Bang & Vossoughi, 2016) models, which attend intentionally to historicity and relational equity, tailor their methods to the liberation of historically marginalized communities, aim intentionally for liberatory outcomes as key elements of their design processes, and seek to produce outcomes in which communities of learners become designers of new community futures (Gutiérrez, 2016). While these approaches emerged in the United States, their origins reflect ontological and ideological perspectives quite distinct from more traditional learning science models of DBR, and from dominant U.S. ontologies in general. Indeed, these iterations of DBR are linked genealogically to the ontologies, ideologies, and concerns of peoples in the Global South, offering some promise for the method in those regions, though DBR has yet to broadly take hold among researchers beyond the United States and Europe.

There is, of course, much more nuance to these models, and each (formative interventions, Change Laboratories, social design experiments, and participatory design research) might merit independent exploration and review well beyond the scope here. Indeed, there is some question as to whether all adherents of these CHAT design-based methodologies, with their unique genealogies and histories, would even consider themselves under the umbrella of DBR. Yet, despite significant ontological divergences, these iterations share many of the same foundational tenets of the traditional models (though realized differently), and it is reasonable to argue that they do indeed share the same, broad methodological paradigm (DBR), or at the very least, are so intimately related that any discussion of DBR, particularly one with a global view, should consider the contributions CHAT iterations have made to the DBR methodology in the course of their somewhat distinct, but parallel, development.

Possibilities and Potentials for Design-Based Research

Since its emergence in 1992, the DBR methodology for educational research has continued to grow in popularity, ubiquity, and significance. Its use has begun to expand beyond the confines of the learning sciences, taken up by researchers in a variety of disciplines, and across a breadth of theoretical and intellectual traditions. While still not as widely recognized as more traditional and well-established research methodologies, DBR as a methodology for rigorous research is unquestionably here to stay.

With this in mind, the field ought to still be cautious of the ways in which the discourse of design is used. Not all design is DBR, and preserving the integrity, rigor, and research ethics of the paradigm (on its own terms) will continue to require thoughtful reflection as its pluralistic parameters come into clearer focus. Yet the proliferation of methods in the DBR paradigm should be seen as a positive. There are far too many theories of learning and ideological perspectives that have meaningful contributions to make to our knowledge of the world, communities, and learning to limit ourselves to a unitary approach to DBR, or set of methods. The paradigm has shown itself to have some core methodological principles, but there is no reason not to expect these to grow, expand, and evolve over time.

In an increasingly globalized, culturally diverse, and dynamic world, there is tremendous potential for innovation couched in this proliferation of DBR. Particularly in historically marginalized communities and across the Global South, we will need to know how learning theories can be lived out in productive ways in communities that have been understudied, and under-engaged. The DBR paradigm generally, and critical and CHAT iterations particularly, can fill an important need for participatory, theory-developing research in these contexts that simultaneously creates lived impacts. Participatory design research (PDR), social design experiments (SDE), and Change Laboratory models of DBR should be of particular interest and attention moving forward, as current trends toward culturally sustaining pedagogies and learning will need to be explored in depth and in close collaboration with communities, as participatory design partners, in the press toward liberatory educational innovations.

Bibliography

The following special issues and collections are encouraged starting points for engaging more deeply with current and past trends in design-based research.

  • Bang, M. , & Vossoughi, S. (Eds.). (2016). Participatory design research and educational justice: Studying learning and relations within social change making [Special issue]. Cognition and Instruction , 34 (3).
  • Barab, S. (Ed.). (2004). Design-based research [Special issue]. Journal of the Learning Sciences , 13 (1).
  • Cole, M. , & The Distributed Literacy Consortium. (2006). The Fifth Dimension: An after-school program built on diversity . New York, NY: Russell Sage Foundation.
  • Kelly, A. E. (Ed.). (2003). Special issue on the role of design in educational research [Special issue]. Educational Researcher , 32 (1).
  • Arzubiaga, A. , Artiles, A. , King, K. , & Harris-Murri, N. (2008). Beyond research on cultural minorities: Challenges and implications of research as situated cultural practice. Exceptional Children , 74 (3), 309–327.
  • Bang, M. , Faber, L. , Gurneau, J. , Marin, A. , & Soto, C. (2016). Community-based design research: Learning across generations and strategic transformations of institutional relations toward axiological innovations. Mind, Culture, and Activity , 23 (1), 28–41.
  • Bang, M. , & Vossoughi, S. (2016). Participatory design research and educational justice: Studying learning and relations within social change making. Cognition and Instruction , 34 (3), 173–193.
  • Barab, S. , Kinster, J. G. , Moore, J. , Cunningham, D. , & The ILF Design Team. (2001). Designing and building an online community: The struggle to support sociability in the Inquiry Learning Forum. Educational Technology Research and Development , 49 (4), 71–96.
  • Barab, S. , & Squire, K. (2004). Design-based research: Putting a stake in the ground. Journal of the Learning Sciences , 13 (1), 1–14.
  • Barab, S. A. , & Kirshner, D. (2001). Methodologies for capturing learner practices occurring as part of dynamic learning environments. Journal of the Learning Sciences , 10 (1–2), 5–15.
  • Bell, P. (2004). On the theoretical breadth of design-based research in education. Educational Psychologist , 39 (4), 243–253.
  • Bereiter, C. , & Scardamalia, M. (1989). Intentional learning as a goal of instruction. In L. B. Resnick (Ed.), Knowing, learning, and instruction: Essays in honor of Robert Glaser (pp. 361–392). Hillsdale, NJ: Lawrence Erlbaum.
  • Booker, A. , & Goldman, S. (2016). Participatory design research as a practice for systemic repair: Doing hand-in-hand math research with families. Cognition and Instruction , 34 (3), 222–235.
  • Brown, A. L. (1992). Design experiments: Theoretical and methodological challenges in creating complex interventions in classroom settings. Journal of the Learning Sciences , 2 (2), 141–178.
  • Brown, A. , & Campione, J. C. (1996). Psychological theory and the design of innovative learning environments: On procedures, principles, and systems. In L. Schauble & R. Glaser (Eds.), Innovations in learning: New environments for education (pp. 289–325). Mahwah, NJ: Lawrence Erlbaum.
  • Brown, A. L. , & Campione, J. C. (1998). Designing a community of young learners: Theoretical and practical lessons. In N. M. Lambert & B. L. McCombs (Eds.), How students learn: Reforming schools through learner-centered education (pp. 153–186). Washington, DC: American Psychological Association.
  • Brown, A. , Campione, J. , Webber, L. , & McGilley, K. (1992). Interactive learning environments—A new look at learning and assessment. In B. R. Gifford & M. C. O’Connor (Eds.), Future assessment: Changing views of aptitude, achievement, and instruction (pp. 121–211). Boston, MA: Academic Press.
  • Carnoy, M. , Jacobsen, R. , Mishel, L. , & Rothstein, R. (2005). The charter school dust-up: Examining the evidence on enrollment and achievement . Washington, DC: Economic Policy Institute.
  • Carspecken, P. (1996). Critical ethnography in educational research . New York, NY: Routledge.
  • Cobb, P. , Confrey, J. , diSessa, A. , Lehrer, R. , & Schauble, L. (2003). Design experiments in educational research. Educational Researcher , 32 (1), 9–13.
  • Cobb, P. , & Steffe, L. P. (1983). The constructivist researcher as teacher and model builder. Journal for Research in Mathematics Education , 14 , 83–94.
  • Coburn, C. , & Penuel, W. (2016). Research-practice partnerships in education: Outcomes, dynamics, and open questions. Educational Researcher , 45 (1), 48–54.
  • Cole, M. (1995). From Moscow to the Fifth Dimension: An exploration in romantic science. In M. Cole & J. Wertsch (Eds.), Contemporary implications of Vygotsky and Luria (pp. 1–38). Worcester, MA: Clark University Press.
  • Cole, M. (1996). Cultural psychology: A once and future discipline . Cambridge, MA: Harvard University Press.
  • Cole, M. (2007). Sustaining model systems of educational activity: Designing for the long haul. In J. Campione , K. Metz , & A. S. Palinscar (Eds.), Children’s learning in and out of school: Essays in honor of Ann Brown (pp. 71–89). New York, NY: Routledge.
  • Cole, M. , & Engeström, Y. (1993). A cultural historical approach to distributed cognition. In G. Saloman (Ed.), Distributed cognitions: Psychological and educational considerations (pp. 1–46). Cambridge, U.K.: Cambridge University Press.
  • Cole, M. , & Engeström, Y. (2007). Cultural-historical approaches to designing for development. In J. Valsiner & A. Rosa (Eds.), The Cambridge handbook of sociocultural psychology , Cambridge, U.K.: Cambridge University Press.
  • Cole, M. , & Underwood, C. (2013). The evolution of the 5th Dimension. In The Story of the Laboratory of Comparative Human Cognition: A polyphonic autobiography . https://lchcautobio.ucsd.edu/polyphonic-autobiography/section-5/chapter-12-the-later-life-of-the-5th-dimension-and-its-direct-progeny/ .
  • Collins, A. (1992). Toward a design science of education. In E. Scanlon & T. O’Shea (Eds.), New directions in educational technology (pp. 15–22). New York, NY: Springer-Verlag.
  • Collins, A. , Joseph, D. , & Bielaczyc, K. (2004). Design research: Theoretical and methodological issues. Journal of the Learning Sciences , 13 (1), 15–42.
  • Dede, C. (2004). If design-based research is the answer, what is the question? A commentary on Collins, Joseph, and Bielaczyc; DiSessa and Cobb; and Fishman, Marx, Blumenthal, Krajcik, and Soloway in the JLS special issue on design-based research. Journal of the Learning Sciences , 13 (1), 105–114.
  • Design-Based Research Collective . (2003). Design-based research: An emerging paradigm for educational inquiry. Educational Researcher , 32 (1), 5–8.
  • DiGiacomo, D. , & Gutiérrez, K. D. (2015). Relational equity as a design tool within making and tinkering activities. Mind, Culture, and Activity , 22 (3), 1–15.
  • diSessa, A. A. (1991). Local sciences: Viewing the design of human-computer systems as cognitive science. In J. M. Carroll (Ed.), Designing interaction: Psychology at the human-computer interface (pp. 162–202). Cambridge, U.K.: Cambridge University Press.
  • diSessa, A. A. , & Cobb, P. (2004). Ontological innovation and the role of theory in design experiments. Journal of the Learning Sciences , 13 (1), 77–103.
  • diSessa, A. A. , & Minstrell, J. (1998). Cultivating conceptual change with benchmark lessons. In J. G. Greeno & S. Goldman (Eds.), Thinking practices (pp. 155–187). Mahwah, NJ: Lawrence Erlbaum.
  • Dominguez, M. (2015). Decolonizing teacher education: Explorations of expansive learning and culturally sustaining pedagogy in a social design experiment (Doctoral dissertation). University of Colorado, Boulder.
  • Edelson, D. (2002). Design research: What we learn when we engage in design. Journal of the Learning Sciences , 11 (1), 105–121.
  • Edwards, A. (2007). Relational agency in professional practice: A CHAT analysis. Actio: An International Journal of Human Activity Theory , 1 , 1–17.
  • Edwards, A. (2009). Agency and activity theory: From the systemic to the relational. In A. Sannino , H. Daniels , & K. Gutiérrez (Eds.), Learning and expanding with activity theory (pp. 197–211). Cambridge, U.K.: Cambridge University Press.
  • Engeström, Y. (1987). Learning by expanding . Helsinki, Finland: University of Helsinki, Department of Education.
  • Engeström, Y. (2000). Can people learn to master their future? Journal of the Learning Sciences , 9 , 525–534.
  • Engeström, Y. (2001). Expansive learning at work: Toward an activity theoretical reconceptualization. Journal of Education and Work , 14 (1), 133–156.
  • Engeström, Y. (2007). Enriching the theory of expansive learning: Lessons from journeys toward co-configuration. Mind, Culture, and Activity , 14 (1–2), 23–39.
  • Engeström, Y. (2008). Putting Vygotsky to work: The Change Laboratory as an application of double stimulation. In H. Daniels, M. Cole, & J. Wertsch (Eds.), Cambridge companion to Vygotsky (pp. 363–382). New York, NY: Cambridge University Press.
  • Engeström, Y. (2011). From design experiments to formative interventions. Theory & Psychology , 21 (5), 598–628.
  • Engeström, Y. , Engeström, R. , & Kärkkäinen, M. (1995). Polycontextuality and boundary crossing in expert cognition: Learning and problem solving in complex work activities. Learning and Instruction , 5 (4), 319–336.
  • Engeström, Y. , & Sannino, A. (2010). Studies of expansive learning: Foundations, findings and future challenges. Educational Research Review , 5 (1), 1–24.
  • Engeström, Y. , & Sannino, A. (2011). Discursive manifestations of contradictions in organizational change efforts: A methodological framework. Journal of Organizational Change Management , 24 (3), 368–387.
  • Engeström, Y., Sannino, A., & Virkkunen, J. (2014). On the methodological demands of formative interventions. Mind, Culture, and Activity, 21(2), 118–128.
  • Erickson, F. , & Gutiérrez, K. (2002). Culture, rigor, and science in educational research. Educational Researcher , 31 (8), 21–24.
  • Espinoza, M. (2009). A case study of the production of educational sanctuary in one migrant classroom. Pedagogies: An International Journal , 4 (1), 44–62.
  • Espinoza, M. L. , & Vossoughi, S. (2014). Perceiving learning anew: Social interaction, dignity, and educational rights. Harvard Educational Review , 84 (3), 285–313.
  • Fine, M. (1994). Dis-tance and other stances: Negotiations of power inside feminist research. In A. Gitlin (Ed.), Power and method (pp. 13–25). New York, NY: Routledge.
  • Fishman, B. , Penuel, W. , Allen, A. , Cheng, B. , & Sabelli, N. (2013). Design-based implementation research: An emerging model for transforming the relationship of research and practice. National Society for the Study of Education , 112 (2), 136–156.
  • Gravemeijer, K. (1994). Educational development and developmental research in mathematics education. Journal for Research in Mathematics Education , 25 (5), 443–471.
  • Gutiérrez, K. (2005). Intersubjectivity and grammar in the third space . Scribner Award Lecture.
  • Gutiérrez, K. (2008). Developing a sociocritical literacy in the third space. Reading Research Quarterly , 43 (2), 148–164.
  • Gutiérrez, K. (2016). Designing resilient ecologies: Social design experiments and a new social imagination. Educational Researcher , 45 (3), 187–196.
  • Gutiérrez, K. , Bien, A. , Selland, M. , & Pierce, D. M. (2011). Polylingual and polycultural learning ecologies: Mediating emergent academic literacies for dual language learners. Journal of Early Childhood Literacy , 11 (2), 232–261.
  • Gutiérrez, K. , Engeström, Y. , & Sannino, A. (2016). Expanding educational research and interventionist methodologies. Cognition and Instruction , 34 (2), 275–284.
  • Gutiérrez, K., & Jurow, A. S. (2016). Social design experiments: Toward equity by design. Journal of the Learning Sciences, 25(4), 565–598.
  • Gutiérrez, K. , & Penuel, W. R. (2014). Relevance to practice as a criterion for rigor. Educational Researcher , 43 (1), 19–23.
  • Gutiérrez, K. , & Rogoff, B. (2003). Cultural ways of learning: Individual traits or repertoires of practice. Educational Researcher , 32 (5), 19–25.
  • Gutiérrez, K., & Vossoughi, S. (2010). Lifting off the ground to return anew: Mediated praxis, transformative learning, and social design experiments. Journal of Teacher Education, 61(1–2), 100–117.
  • Hall, R. , & Jurow, A. S. (2015). Changing concepts in activity: Descriptive and design studies of consequential learning in conceptual practices. Educational Psychologist , 50 (3), 173–189.
  • Harding, S. (1993). Rethinking standpoint epistemology: What is “strong objectivity”? In L. Alcoff & E. Potter (Eds.), Feminist epistemologies (pp. 49–82). New York, NY: Routledge.
  • Hoadley, C. (2002). Creating context: Design-based research in creating and understanding CSCL. In G. Stahl (Ed.), Computer support for collaborative learning 2002 (pp. 453–462). Mahwah, NJ: Lawrence Erlbaum.
  • Hoadley, C. (2004). Methodological alignment in design-based research. Educational Psychologist , 39 (4), 203–212.
  • Joseph, D. (2004). The practice of design-based research: Uncovering the interplay between design, research, and the real-world context. Educational Psychologist , 39 (4), 235–242.
  • Jurow, A. S. , & Shea, M. V. (2015). Learning in equity-oriented scale-making projects. Journal of the Learning Sciences , 24 (2), 286–307.
  • Jurow, S., Tracy, R., Hotchkiss, J., & Kirshner, B. (2012). Designing for the future: How the learning sciences can inform the trajectories of preservice teachers. Journal of Teacher Education, 63(2), 147–160.
  • Kärkkäinen, M. (1999). Teams as breakers of traditional work practices: A longitudinal study of planning and implementing curriculum units in elementary school teacher teams . Helsinki, Finland: University of Helsinki, Department of Education.
  • Kelly, A. (2004). Design research in education: Yes, but is it methodological? Journal of the Learning Sciences , 13 (1), 115–128.
  • Kelly, A. E. , & Sloane, F. C. (2003). Educational research and the problems of practice. Irish Educational Studies , 22 , 29–40.
  • Kirshner, B. (2015). Youth activism in an era of education inequality . New York: New York University Press.
  • Kirshner, B. , & Polman, J. L. (2013). Adaptation by design: A context-sensitive, dialogic approach to interventions. National Society for the Study of Education Yearbook , 112 (2), 215–236.
  • Leander, K. M. , Phillips, N. C. , & Taylor, K. H. (2010). The changing social spaces of learning: Mapping new mobilities. Review of Research in Education , 34 , 329–394.
  • Lesh, R. A. , & Kelly, A. E. (2000). Multi-tiered teaching experiments. In A. E. Kelly & R. A. Lesh (Eds.), Handbook of research design in mathematics and science education (pp. 197–230). Mahwah, NJ: Lawrence Erlbaum.
  • Matusov, E. (1996). Intersubjectivity without agreement. Mind, Culture, and Activity, 3(1), 29–45.
  • Messick, S. (1992). The interplay of evidence and consequences in the validation of performance assessments. Educational Researcher , 23 (2), 13–23.
  • Mosteller, F. , & Boruch, R. F. (Eds.). (2002). Evidence matters: Randomized trials in education research . Washington, DC: Brookings Institution Press.
  • Newman, D. , Griffin, P. , & Cole, M. (1989). The construction zone: Working for cognitive change in school . London, U.K.: Cambridge University Press.
  • Penuel, W. R. , Fishman, B. J. , Cheng, B. H. , & Sabelli, N. (2011). Organizing research and development at the intersection of learning, implementation, and design. Educational Researcher , 40 (7), 331–337.
  • Polman, J. L. (2000). Designing project-based science: Connecting learners through guided inquiry . New York, NY: Teachers College Press.
  • Ravitch, D. (2010). The death and life of the great American school system: How testing and choice are undermining education . New York, NY: Basic Books.
  • Rogoff, B. (1990). Apprenticeship in thinking: Cognitive development in social context . New York, NY: Oxford University Press.
  • Rogoff, B. (1995). Observing sociocultural activity on three planes: Participatory appropriation, guided participation, and apprenticeship. In J. V. Wertsch , P. D. Rio , & A. Alvarez (Eds.), Sociocultural studies of mind (pp. 139–164). Cambridge U.K.: Cambridge University Press.
  • Saltman, K. J. (2007). Capitalizing on disaster: Taking and breaking public schools . Boulder, CO: Paradigm.
  • Salvador, T. , Bell, G. , & Anderson, K. (1999). Design ethnography. Design Management Journal , 10 (4), 35–41.
  • Sannino, A. (2011). Activity theory as an activist and interventionist theory. Theory & Psychology , 21 (5), 571–597.
  • Sannino, A. , & Engeström, Y. (2016). Relational agency, double stimulation and the object of activity: An intervention study in a primary school. In A. Edwards (Ed.), Working relationally in and across practices: Cultural-historical approaches to collaboration (pp. 58–77). Cambridge, U.K.: Cambridge University Press.
  • Scardamalia, M. , & Bereiter, C. (1991). Higher levels of agency for children in knowledge building: A challenge for the design of new knowledge media. Journal of the Learning Sciences , 1 , 37–68.
  • Schoenfeld, A. H. (1982). Measures of problem solving performance and of problem solving instruction. Journal for Research in Mathematics Education , 13 , 31–49.
  • Schoenfeld, A. H. (1985). Mathematical problem solving . Orlando, FL: Academic Press.
  • Schoenfeld, A. H. (1992). On paradigms and methods: What do you do when the ones you know don’t do what you want them to? Issues in the analysis of data in the form of videotapes. Journal of the Learning Sciences , 2 (2), 179–214.
  • Scribner, S. , & Cole, M. (1978). Literacy without schooling: Testing for intellectual effects. Harvard Educational Review , 48 (4), 448–461.
  • Shavelson, R. J. , Phillips, D. C. , Towne, L. , & Feuer, M. J. (2003). On the science of education design studies. Educational Researcher , 32 (1), 25–28.
  • Steffe, L. P. , & Thompson, P. W. (2000). Teaching experiment methodology: Underlying principles and essential elements. In A. Kelly & R. Lesh (Eds.), Handbook of research design in mathematics and science education (pp. 267–307). Mahwah, NJ: Erlbaum.
  • Stevens, R. (2000). Divisions of labor in school and in the workplace: Comparing computer and paper-supported activities across settings. Journal of the Learning Sciences , 9 (4), 373–401.
  • Suchman, L. (1995). Making work visible. Communications of the ACM , 38 (9), 57–64.
  • Vakil, S. , de Royston, M. M. , Nasir, N. , & Kirshner, B. (2016). Rethinking race and power in design-based research: Reflections from the field. Cognition and Instruction , 34 (3), 194–209.
  • van den Akker, J. (1999). Principles and methods of development research. In J. van den Akker , R. M. Branch , K. Gustafson , N. Nieveen , & T. Plomp (Eds.), Design approaches and tools in education and training (pp. 1–14). Boston, MA: Kluwer Academic.
  • Virkkunen, J. , & Newnham, D. (2013). The Change Laboratory: A tool for collaborative development of work and education . Rotterdam, The Netherlands: Sense.
  • White, B. Y. , & Frederiksen, J. R. (1998). Inquiry, modeling, and metacognition: Making science accessible to all students. Cognition and Instruction , 16 , 3–118.
  • Zavala, M. (2016). Design, participation, and social change: What design in grassroots spaces can teach learning scientists. Cognition and Instruction , 34 (3), 236–249.

1. The reader should note the emergence of critical ethnography (e.g., Carspecken, 1996; Fine, 1994), and other more participatory models of ethnography that deviated from this traditional paradigm during this same time period. These new forms of ethnography comprised part of the genealogy of the more critical approaches to DBR, described later in this article.

2. The reader will also note that the adjective “qualitative” largely drops away from the acronym “DBR.” This is largely because, as described, DBR’s exploration of naturalistic ecologies, with their multitudes of variables and social and learning dynamics, necessarily demands a move beyond what quantitative measurement alone can capture. The qualitative nature of the research is thus implied and embedded as part of what makes DBR a unique and distinct methodology.


Printed from Oxford Research Encyclopedias, Education. Under the terms of the licence agreement, an individual user may print out a single article for personal use (for details see Privacy Policy and Legal Notice).

date: 23 April 2024



Exploratory Research | Definition, Guide, & Examples

Published on 6 May 2022 by Tegan George . Revised on 20 January 2023.

Exploratory research is a methodological approach that investigates topics and research questions that have not previously been studied in depth.

Exploratory research is often qualitative in nature. However, a study with a large sample conducted in an exploratory manner can be quantitative as well. It is also often referred to as interpretive research or a grounded theory approach due to its flexible and open-ended nature.

Table of contents

  • When to use exploratory research
  • Exploratory research questions
  • Exploratory research data collection
  • Step-by-step example of exploratory research
  • Exploratory vs explanatory research
  • Advantages and disadvantages of exploratory research
  • Frequently asked questions about exploratory research

Exploratory research is often used when the issue you’re studying is new or when the data collection process is challenging for some reason.

You can use this type of research if you have a general idea or a specific question that you want to study but there is no preexisting knowledge or paradigm with which to study it.


Exploratory research questions are designed to help you understand more about a particular topic of interest. They can help you connect ideas to understand the groundwork of your analysis without adding any preconceived notions or assumptions yet.

Here are some examples:

  • What effect does using a digital notebook have on the attention span of primary schoolers?
  • What factors influence mental health in undergraduates?
  • What outcomes are associated with an authoritative parenting style?
  • In what ways does the presence of a non-native accent affect intelligibility?
  • How can the use of a grocery delivery service reduce food waste in single-person households?

Collecting information on a previously unexplored topic can be challenging. Exploratory research can help you narrow down your topic and formulate a clear hypothesis , as well as giving you the ‘lay of the land’ on your topic.

Data collection using exploratory research is often divided into primary and secondary research methods, with data analysis following the same model.

Primary research

In primary research, your data is collected directly from primary sources : your participants. There is a variety of ways to collect primary data.

Some examples include:

  • Survey methodology: Sending a survey out to the student body asking them if they would eat vegan meals
  • Focus groups: Compiling groups of 8–10 students and discussing what they think of vegan options for dining hall food
  • Interviews: Interviewing students entering and exiting the dining hall, asking if they would eat vegan meals
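
The tallying step for a simple yes/no survey like the examples above can be sketched in a few lines of Python. The responses and the vegan-meals framing are invented for illustration; this is not part of the original article's method:

```python
# Hypothetical sketch: tallying simple yes/no survey responses,
# loosely following the vegan-meals example above. Data are invented.
from collections import Counter

responses = ["yes", "no", "yes", "yes", "undecided", "no", "yes"]

tally = Counter(responses)               # counts per answer category
share_yes = tally["yes"] / len(responses)

print(tally)
print(f"{share_yes:.0%} of respondents say they would eat vegan meals")
```

Even at this exploratory stage, a simple tally like this can suggest whether a larger, more structured survey is worth running.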

Secondary research

In secondary research, your data is collected from preexisting primary research, such as experiments or surveys.

Some other examples include:

  • Case studies: Health of an all-vegan diet
  • Literature reviews: Preexisting research about students’ eating habits and how they have changed over time
  • Online polls, surveys, blog posts, or interviews; social media: Have other universities done something similar?

For some subjects, it’s possible to use large-n government data, such as the decennial census or yearly American Community Survey (ACS) open-source data.

How you proceed with your exploratory research design depends on the research method you choose to collect your data. In most cases, you will follow five steps.

We’ll walk you through the steps using a running example: a study of how the presence of a non-native accent affects intelligibility, in which you would like to focus on improving intelligibility rather than reducing the learner’s accent.

Step 1: Identify your problem

The first step in conducting exploratory research is identifying what the problem is and whether this type of research is the right avenue for you to pursue. Remember that exploratory research is most advantageous when you are investigating a previously unexplored problem.

Step 2: Hypothesise a solution

The next step is to come up with a solution to the problem you’re investigating. Formulate a hypothetical statement to guide your research.

Step 3: Design your methodology

Next, conceptualise your data collection and data analysis methods and write them up in a research design.

Step 4: Collect and analyse data

Next, you proceed with collecting and analysing your data so you can determine whether your preliminary results are in line with your hypothesis.

In most types of research, you should formulate your hypotheses a priori and refrain from changing them due to the increased risk of Type I errors and data integrity issues. However, in exploratory research, you are allowed to change your hypothesis based on your findings, since you are exploring a previously unexplained phenomenon that could have many explanations.

Step 5: Avenues for future research

Decide if you would like to continue studying your topic. If so, it is likely that you will need to change to another type of research. As exploratory research is often qualitative in nature, you may need to conduct quantitative research with a larger sample size to achieve more generalisable results.
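
Step 4 above can be illustrated with a minimal sketch. Assuming a pilot survey that scored comprehension for learners who did and did not receive a hypothetical intelligibility-focused training (all variable names and numbers are invented, not from the article), a quick comparison of group means shows whether the preliminary results point in the direction of the hypothesis:

```python
# Hypothetical sketch of Step 4: checking whether preliminary results
# are in line with a working hypothesis. All data are invented.
from statistics import mean

# Each record: (received_intelligibility_training, comprehension_score_0_to_10)
responses = [
    (True, 7.5), (True, 8.0), (True, 6.5), (True, 9.0),
    (False, 5.0), (False, 6.0), (False, 4.5), (False, 7.0),
]

treated = [score for trained, score in responses if trained]
control = [score for trained, score in responses if not trained]

# Working hypothesis: training improves comprehension ratings.
difference = mean(treated) - mean(control)
print(f"mean difference: {difference:.2f}")  # positive => consistent with hypothesis
```

A positive difference here would only be preliminary evidence; as the text notes, a confirmatory study with a larger sample would be needed before drawing conclusions.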

Exploratory vs explanatory research

It can be easy to confuse exploratory research with explanatory research. To understand the relationship, it can help to remember that exploratory research lays the groundwork for later explanatory research.

Exploratory research investigates research questions that have not been studied in depth. The preliminary results often lay the groundwork for future analysis.

Explanatory research questions tend to start with ‘why’ or ‘how’, and the goal is to explain why or how a previously studied phenomenon takes place.
Like any other research design , exploratory research has its trade-offs: it provides a unique set of benefits but also comes with downsides.

Advantages

  • It can be very helpful in narrowing down a challenging or nebulous problem that has not been previously studied.
  • It can serve as a great guide for future research, whether your own or another researcher’s. With new and challenging research problems, adding to the body of research in the early stages can be very fulfilling.
  • It is very flexible, cost-effective, and open-ended. You are free to proceed however you think is best.

Disadvantages

  • It usually lacks conclusive results, and results can be biased or subjective due to a lack of preexisting knowledge on your topic.
  • It’s typically not externally valid and generalisable, and it suffers from many of the challenges of qualitative research .
  • Since you are not operating within an existing research paradigm, this type of research can be very labour-intensive.

Exploratory research is a methodological approach that explores research questions that have not previously been studied in depth. It is often used when the issue you’re studying is new, or the data collection process is challenging in some way.

You can use exploratory research if you have a general idea or a specific question that you want to study but there is no preexisting knowledge or paradigm with which to study it.

Exploratory research explores the main aspects of a new or barely researched question.

Explanatory research explains the causes and effects of an already widely researched question.

Quantitative research deals with numbers and statistics, while qualitative research deals with words and meanings.

Quantitative methods allow you to test a hypothesis by systematically collecting and analysing data, while qualitative methods allow you to explore ideas and experiences in depth.

Cite this Scribbr article

If you want to cite this source, you can copy and paste the citation or click the ‘Cite this Scribbr article’ button to automatically add the citation to our free Reference Generator.

George, T. (2023, January 20). Exploratory Research | Definition, Guide, & Examples. Scribbr. Retrieved 22 April 2024, from https://www.scribbr.co.uk/research-methods/exploratory-research-design/




Exploratory Research Design: A Guide for Students


Exploratory Research refers to the research methodology used when researchers have a limited understanding of the topic at hand and want to gain a broader perspective or more insights into the subject.

This type of research is conducted in the initial stages of a study when there is a lack of clarity about the problem. It may lead to subsequent studies that attempt to generate greater clarity on the findings generated in the initial, exploratory, phase, or to test hypotheses that the exploratory phase generated.

Definition of Exploratory Research

If you’re writing a research proposal, methodology, or essay in which you need to discuss exploratory research design, I recommend providing a scholarly definition of the topic.

Here are three solid scholarly definitions you could use:

  • “… research used to investigate an issue that is new, a problem which is not clearly defined, a topic on which there is little data” (Gozdziak & Chantavanich, 2022)
  • “… initial research conducted to clarify and define the nature of the problem. It is done to help diagnose the situation, allow for screening of alternatives, and discover new ideas.” (Hutchison, Allen & Macy, 2012)
  • “… the initial research into a hypothetical or theoretical idea. This is where the researcher has an idea and wishes to research a topic seeking to understand it more deeply. An exploratory research study is an attempt to lay the groundwork that could or will lead to future studies.” (Blecher, 2018)

Objectives of Exploratory Research

Exploratory research is generally conducted in order to generate initial data on a relatively unknown topic (Hammond & Wellington, 2013).

We could divide this up into a range of more specific objectives that you could state for your research project, including:

  • Descriptive Data : Exploratory research often aims to provide a better understanding of a relatively unknown problem, phenomenon, or behavior by generating descriptive data (Cargan, 2007). This helps in establishing a clearer understanding of a topic and, consequently, gives subsequent projects the chance to dig deeper now that they have some baseline data.
  • Generation of New Ideas : Through exploratory research, new ideas and perspectives can emerge that were not initially considered (Swedberg, 2020). An exploratory study often begins with an open mind, ready to reveal surprising, remarkable, and unexpected new ideas and insights. For example, researchers may conclude the study with a hypothesis for future research to examine.
  • Determining the Most Appropriate Research Method : After getting a clearer idea of the topic, researchers can determine which research methods (e.g., qualitative, quantitative) would be most suitable for a subsequent, more detailed, conclusive research study.
  • Clarifying Concepts : In generating descriptive datasets, an exploratory study can help in defining and refining ambiguous or unclear concepts, which can help to start to build foundational definitions and conceptual frameworks for the emerging body of literature on a novel topic (Tan, 2022).
  • Establishing Importance: An exploratory study could also serve the purpose of demonstrating that a topic is worthy of larger-scale studies. So, researchers might conduct their exploratory study, and use its results to approach funding bodies. With the exploratory study complete, they will be armed with more data about the topic and informed evidence about the best way to approach it.
  • Establishing Priorities for Future Research : By identifying key issues and questions, researchers can prioritize areas that need deeper investigation. Conclusions of exploratory studies usually provide recommendations for future studies, including by proposing hypotheses and prioritizing future projects.

Common Methods for Exploratory Research Designs

There is no one clear set of methods that must be used in exploratory design. Researchers should select the methodologies that will be most effective in answering their research questions.

However, commonly, exploratory studies are shaped as small-scale qualitative designs. Qualitative research allows researchers to delve deep into a topic to generate high-quality, contextualized, and nuanced descriptive data. Qualitative data generally cannot create generalizable results, but it does help to create hypotheses that can be looked at with larger-scale quantitative studies.

Examples of exploratory research designs can include:

  • Expert Interviews : When exploring a relatively new phenomenon, researchers can speak to professionals or experts in the field to gain their perspectives. This perspective-taking, often taking the form of a Delphi-method focus group, can help researchers to better understand the key factors within a topic that will inform subsequent study designs (Raley et al., 2016).
  • Case Studies : Oftentimes, exploratory research looks at novel and rare phenomena. To begin to understand them, researchers need to look at them in depth. To do this, they may employ a case study design, which allows researchers to take a very in-depth examination of one instance (Lee & Saunders, 2017). This will, in turn, inform future studies that may look at a wider corpus that can create statistically relevant results.
  • Observational Studies : This involves observing and noting the behavior or situation of interest. There are multiple types of observational research , ranging from non-intrusive ‘fly on the wall’ observations to participant observations such as ethnographic studies .
  • Cross-Sectional Research : This is an example of a quantitative exploratory research design. Imagine researchers want to study an emerging health condition among a population. As it’s emergent, there’s no data yet, but researchers need to know how big a problem this is. To do so, they may conduct an initial cross-sectional study, which gathers statistically relevant descriptive data about how prevalent the condition is. This exploratory study won’t be able to determine cause-and-effect between variables, but it could form the basis and justification for subsequent studies.
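
The prevalence estimate in a cross-sectional study like the one described above can be sketched as follows. This is a hedged illustration with invented numbers; the Wilson score interval used here is one standard way to put a 95% confidence bound on a proportion, not something prescribed by the article:

```python
# Hedged sketch: estimating prevalence from a hypothetical cross-sectional
# sample, with a 95% Wilson score confidence interval. Numbers are invented.
from math import sqrt

def wilson_interval(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Wilson score interval for a binomial proportion (z=1.96 => ~95%)."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

cases, sample_size = 37, 500          # 37 of 500 respondents report the condition
prevalence = cases / sample_size
low, high = wilson_interval(cases, sample_size)
print(f"prevalence ≈ {prevalence:.1%}, 95% CI [{low:.1%}, {high:.1%}]")
```

The interval, not the point estimate alone, is what tells funders or follow-up researchers how big the problem might plausibly be.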

A Hypothetical Example

Study Title: The relationship between urban green spaces and mental well-being

Explanation: In an exploratory study examining the relationship between urban green spaces and mental well-being, researchers could survey residents from various neighborhoods about their frequency of visiting local parks and their perceived stress levels.

Design: Since, hypothetically, little prior research exists on this specific topic, the team could employ open-ended interviews, allowing participants to share detailed experiences and insights. Preliminary data might indicate a potential positive correlation between regular park visits and lower stress levels. To gain a deeper understanding, focus groups could be conducted next, where participants discuss the therapeutic effects of nature, setting the stage for more structured, quantitative studies in the future.

Recommendations: Exploratory research usually provides recommendations for more focused subsequent studies. The researchers in this study might recommend testing the suggested relationship between park visits and lower stress through formal statistical analysis.


Benefits and Limitations of Exploratory Research Design

Exploratory research design is very useful for providing initial insights into a topic, describing phenomena in detail, and exploring a topic without the predetermined constraints of mere hypothesis testing.

But it’s not the best research design in all situations. For example, it might not be ideal if you were seeking to achieve clarity on well-worn topics, generate generalizable results that add to existing literature, or contribute to a specific set of existing scholarly discourse on a topic.

Below are some strengths and weaknesses of this research design:

Strengths:

  • Provides initial insights into new or under-studied topics
  • Generates rich, detailed, contextualized descriptions of phenomena
  • Flexible and open-ended, free of the predetermined constraints of hypothesis testing

Weaknesses:

  • Results are rarely generalizable or statistically conclusive
  • Less suitable for well-studied topics with established bodies of literature
  • Findings usually require follow-up confirmatory studies

References

Blecher, M. (2018). Israeli Settlements: Land Politics Beyond the Geneva Convention . Hamilton Books.

Cargan, L. (2007). Doing Social Research . Rowman & Littlefield Publishers.

Gozdziak, E. M., & Chantavanich, S. (2022). Africans in Thailand. In Gozdziak, E. M., & Chantavanich, S. (Eds.). African Migration to Thailand: Race, Mobility, and Integration. Taylor & Francis.

Hammond, M., & Wellington, J. J. (2013).  Research Methods: The Key Concepts . Routledge.

Hutchison, T., Allen, P., & Macy, A. (2012).  Record Label Marketing . Taylor & Francis.

Lee, B., & Saunders, M. N. K. (2017).  Conducting Case Study Research for Business and Management Students.  SAGE Publications.

Morrison, A. M. (2022).  Tourism Marketing: In the Age of the Consumer . Taylor & Francis.

Raley, M. E., Ragona, M., Sijtsema, S. J., Fischer, A. R., & Frewer, L. J. (2016). Barriers to using consumer science information in food technology innovations: An exploratory study using Delphi methodology.  International Journal of Food Studies ,  5 (1). ( Source )

Swedberg, R. (2020). Exploratory Research. In Elman C, Gerring J, & Mahoney J. (Eds.)  The Production of Knowledge: Enhancing Progress in Social Science.  Cambridge: Cambridge University Press.

Tan, W. C. K. (2022).  Research Methods: A Practical Guide For Students And Researchers  (Second Edition). World Scientific Publishing Company.


Chris Drew (PhD)

Dr. Chris Drew is the founder of the Helpful Professor. He holds a PhD in education and has published over 20 articles in scholarly journals. He is the former editor of the Journal of Learning Development in Higher Education.

  • Chris Drew (PhD) https://helpfulprofessor.com/author/chris-drew-phd/ 5 Top Tips for Succeeding at University
  • Chris Drew (PhD) https://helpfulprofessor.com/author/chris-drew-phd/ 50 Durable Goods Examples
  • Chris Drew (PhD) https://helpfulprofessor.com/author/chris-drew-phd/ 100 Consumer Goods Examples
  • Chris Drew (PhD) https://helpfulprofessor.com/author/chris-drew-phd/ 30 Globalization Pros and Cons


3.2 The qualitative research approach

3.2.1 The research design

The research design is the overall plan and procedures employed for addressing the research question, including the specifications for enhancing the study’s integrity. It is therefore a full description of how the research will be conducted (Polit & Beck 2012:743).

In this study, the researcher used an exploratory research design to gain some insight into the adolescent participants’ perceptions. The following section explains how the researcher utilised this research design to achieve the goals of the study.

3.2.1.1 Exploratory design

Exploratory research explores the dimensions of a phenomenon, or develops or refines hypotheses about the relationships between phenomena. It seeks to learn how people get along in the setting in question, what meanings they give to their actions, and what issues concern them. Exploratory research is used to investigate a phenomenon on which little knowledge exists (Polit & Beck 2012:728). In this study, little was known about the perceptions of adolescents on the use of the youth-friendly HIV centre. The researcher therefore selected an exploratory research design in order to explore these perceptions.

3.2.1.2 Descriptive research

Descriptive research aims at a precise representation of people’s characteristics or circumstances, and the frequency with which certain phenomena occur (Polit & Beck 2012:726). Descriptive research aims to describe and document a process, event or outcome in detail (Houser 2015:138). In addition, qualitative descriptive research is less theoretical than other qualitative approaches, making it easier to design a framework for conducting a study (Kim, Sefcik & Bradway 2017:2). A descriptive study was deemed appropriate because this study aimed to describe the perceptions of adolescents on the use of youth-friendly services.

3.2.2 Research Setting

The location for conducting research can be either a natural or a controlled setting (Grove et al. 2013:709). The choice of setting for data generation depends on the nature of the research question and the type of data required to address it (Brink et al. 2012:59).

Lesotho consists of ten districts, which are equipped with primary, secondary and tertiary health care facilities. The study took place in the Maseru district, at a public hospital. The hospital operates from 7 am to 5 pm, Monday to Friday, and from 8 am to 1 pm on Saturdays.

The hospital offers adolescent-friendly health services. Since the hospital is in town, it is visited by people from the surrounding urban and semi-urban areas, as well as by those visiting the town.

3.2.3 Population and sampling

A population is the entire set of individuals or objects that have common characteristics, or which conform to specific criteria, and from which it is intended to generalise the results of the research (Polit & Beck 2012:739). Brink et al. (2012:131) described a population as the entire group of persons or objects that meet the criteria for the research. In this regard, the population of a study comprises the aggregation of elements from which the sample is actually selected. The population of this study comprised the total number of adolescents accessing health services at the adolescent centre of the Public Hospital in Maseru.

In order to determine the group of people who participate in a study, sampling needs to be conducted. Sampling is the process of selecting some of the participants within the population to represent the entire population. Qualitative sampling requires the selection of appropriate participants, in order to best achieve the objectives of the study. Adequate sampling helps to address the research question, and to develop a full description of the phenomenon being studied.

There are two general kinds of sampling: probability and non-probability sampling.

Probability sampling is a sampling process in which every person in the population has an equal chance of being selected, while in non-probability sampling one does not know the probability of a person being selected (Brink et al. 2012:134). The researcher decided to use non-probability sampling, as it is suitable for studies in which the researcher does not know much about the population, such as its size or its demographic characteristics.

The participants in this study were purposively sampled for inclusion in the study based on their knowledge and their ability to describe the phenomenon (Polit & Beck 2012:743).

Purposive sampling is a non-probability sampling method in terms of which the researcher selects the participants based on his or her personal judgement about which participants would be the most informative (Joubert 2014:741). Purposive sampling is often used to select atypical cases rather than typical ones. This approach is commonly used in qualitative studies that seek to compare the opposite extremes of a phenomenon, in order to generate hypotheses about it. The participants thus selected help the researcher to understand the problem and to answer the research question. In a qualitative study, sampling does not depend on a large number of participants, but rather on how knowledgeable and willing the participants are. Furthermore, it also depends on the completeness of the overall sense of the meaning of the concept, theme, or process.

It also depends on information saturation, which helps to determine when the sampling is complete. Not all participants had an equal chance of being selected. They needed to conform to specific criteria, and were selected based on their presence and availability on the day when the data was collected. The researcher used the following criteria for the inclusion of participants in the sample:

• Adolescents aged 18-19 years old;

• Adolescents who accessed services at Public adolescent centre;

• Adolescents who were tested for HIV at the Public hospital adolescent centre;

• Adolescents who were on antiretroviral treatment administered from the Public hospital adolescent centre;

• Adolescents who were HIV positive and were accessing psychosocial support at the Public hospital adolescent centre.

3.2.3.1 Sample size

The sample size is the number of people who participate in a study (Polit & Beck 2012:743). The sample size was determined by data saturation, that is, the point at which no new information surfaces. Polit and Beck (2012: 521–522) suggest continuing to select participants until the data can pass two tests, namely:

 Completeness: when the participants have shared sufficient information for the researcher to gain an overall sense of the meaning of a concept, theme, or process.

 Saturation: when the researcher no longer elicits new information from subsequent interviews.

There was no specific sample size established at the outset. Instead, the sample size was determined by the researcher’s judgement of the point of data saturation. The researcher and the study supervisor decided when data saturation had been reached after both reading through the transcribed interviews. After nine participants had been interviewed, new interviews seemed to yield little additional information.
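The stopping rule described above (continue interviewing until subsequent interviews yield nothing new) can be sketched as a toy calculation. This is purely illustrative: in the study, saturation was judged by the researcher and supervisor reading the transcripts, not by software, and the codes below are invented for the example.

```python
# Illustrative sketch only (not part of the study): a toy version of the
# "no new information" stopping rule. Each interview is reduced to the set
# of codes assigned to it; saturation is declared after `window` consecutive
# interviews contribute no previously unseen code.

def saturation_reached(coded_interviews, window=2):
    """Return the 1-based index of the interview at which saturation was
    reached, or None if it never was."""
    seen = set()
    consecutive_without_new = 0
    for i, codes in enumerate(coded_interviews, start=1):
        new_codes = set(codes) - seen
        seen |= new_codes
        if new_codes:
            consecutive_without_new = 0
        else:
            consecutive_without_new += 1
            if consecutive_without_new >= window:
                return i
    return None

# Hypothetical codes for five interviews, for brevity.
interviews = [
    {"access", "privacy"},   # interview 1: two new codes
    {"privacy", "staff"},    # interview 2: one new code
    {"staff", "stigma"},     # interview 3: one new code
    {"access", "stigma"},    # interview 4: nothing new
    {"privacy", "access"},   # interview 5: nothing new -> saturation
]
print(saturation_reached(interviews))  # -> 5
```

In this toy version, the `window` parameter plays the role of the researcher’s judgement that "new interviews seemed to yield little additional information"; a real analysis rests on reading whole transcripts, not on counting code labels.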

3.2.4 Recruitment of the participants

In recruiting participants, the researcher assessed all the possible ways in which a potential research participant might feel unduly pressured to participate. These included a personal appeal, the status of being part of a special group of adolescents, and the participant’s willingness. Based on this assessment, the researcher applied the ethical principle of consent. Potential participants were clearly informed about the nature and purpose of the study, and were made to understand that there would be no rewards for participation, and no penalties for refusing to participate or for withdrawing during the study. These aspects were communicated verbally and in writing, on the informed consent form created by the researcher in conjunction with his supervisor.

The researcher started the process by providing health education to all the adolescents who used the services at the Public Hospital adolescent clinic. He then went through the patients’ records to identify potential participants who met the criteria for inclusion in the study. The researcher spoke with the potential participants face-to-face and reintroduced himself, explained the purpose of the study and the criteria for inclusion, and explained to them what their participation in the study would entail. The participants were fully informed that their participation in the study was voluntary, that they had the right to refuse to participate in the study, and that if they were not willing to participate, they would not be discriminated against on the grounds of their refusal to participate.

The potential participants were then informed that the data collected would be discussed with the researcher’s supervisor, would be presented to examiners and the research committee members, and would possibly be published as a journal article. The researcher assured the participants that they would remain anonymous, as their names would not be used. Instead, each participant would be identified by a number, so that they could not be linked to the contents of the data collected and documented. The interview guide that was to be used during the interview was fully explained to the participants, and they were told that the interview would be digitally recorded, but only once they had given their consent. The potential participants signed the informed consent document (see Annexure 1), thereby indicating their willingness to participate in the study.

Once the participants had agreed to participate in the study, based on all the information provided to them, the researcher arranged an interview time with them. The participants who were immediately available signed the consent forms and started the interview.

Appointments were made with those who were not immediately available, and during these appointments the researcher recapped the necessary information again, after which the participants signed the consent form and the interviews began.

3.2.5 Data collection

Data collection is the precise, systematic gathering of information relevant to the research purpose, to specific objectives, or to the hypothesis of the study (Brink et al. 2012:132; Grove et al. 2013:52). The researcher in this study used a semi-structured interview as the data collection instrument, since it was suitable for achieving the objectives of the study: to explore and describe the perceptions of adolescents on the use of youth-friendly services; and to identify the best practices that could be recommended to the health care authorities, in order to scale up youth-friendly services. According to Brink et al. (2012:157), the interview method of data collection is used to obtain responses from the participants. A semi-structured interview outlines the general form of the questions that the interviewer should cover in the interview, but allows the interviewer to adapt the sequencing and the wording of the questions to each particular interview. The interview guide ensures that different interviewers cover the same predetermined questions, while at the same time remaining relaxed and conversational, and being free to probe into unanticipated circumstances and responses. The interviews were conducted in either English or Sesotho; those conducted in Sesotho were translated into English. The researcher is fluent in Sesotho and English, and thus had a good understanding of the information. The interviews lasted approximately 30 minutes each, were conducted over a period of one month, and took place in a private room where privacy was ensured.

Before the commencement of each interview, the researcher introduced himself and asked some general questions to put the participants at ease, as suggested by Polit and Beck (2012:543). In addition, the researcher explained to the participants what the study was about, determined who was eligible to participate, and explained how the interviews would be conducted. The researcher obtained written consent from all the participants before they were interviewed.

It is important for the researcher to understand qualitative research methods very well, in order to adhere to the principles and strategies of the methods used. In relation to the semi-structured interview, the researcher used the following principles.

A non-judgmental attitude: For the participants to feel free to participate in the study fully, they needed a safe atmosphere to openly disclose all kinds of information, including private thoughts and feelings. The researcher ensured that he was non-judgmental with all the participants, in order to encourage their openness. All the participants were reassured that the information they shared was accepted, even if such information was very personal, and the researcher demonstrated this by using verbal responses and physical cues, such as nodding, to indicate his empathy and engagement.

Free expression of participants: Allowing the participants to express themselves freely while the researcher listened put the participants at ease and allowed for a free flow of conversation, which allowed the participants to provide all the necessary information.

Asking probing questions also helped to ensure that the participants gave all the relevant information for the study.

Paying attention: Maintaining eye contact, nodding as the participant talked, and rephrasing the statements showed that the researcher was carefully paying attention to the participants. This is known as active listening, which is a process of observation and listening, and is a prerequisite for all other communication skills, since it precedes all other communication. This helped the researcher to get all the relevant information for the study from the participants.

Being sensitive: The researcher was friendly with all the participants, was sympathetic, and asked probing questions. This helped the participants to feel that they were not being patronised, judged, or humiliated.

3.2.6 Data analysis

Grove et al. (2013:535) describe data analysis as a technique used to reduce, organise, and give meaning to the information obtained. The process of data analysis involves making sense of textual and image-based data (Botma et al. 2014:220).

Data analysis was performed concurrently with the data collection. After each interview the researcher listened to a voice recording of the interview, transcribed the interview, and checked on the field notes taken during the data collection. This study used thematic analysis to give meaning to the data. The transcribed data was categorised into themes for the purpose of presentation by using an adapted form of Colaizzi’s (1978) seven steps of analysis, as cited in Bazeley (2013:65):

Step 1: Acquiring a sense of each transcript: In this step, the researcher read and re-read each transcript, in order to obtain a general sense of the overall content. The researcher put aside any preconceived ideas about the phenomenon under study, in order to prevent contamination of the findings.

Step 2: Extracting significant statements: In this step, the researcher extracted the significant statements that pertained to the phenomenon under study from each transcript. The researcher recorded these statements on a separate sheet, noting their pages and the line numbers.

Step 3: Formulating meanings: The researcher formulated meanings from these significant statements. Each underlying meaning was coded into one category, which reflected an exhaustive description. Then, the researcher compared the formulated meanings with the original meanings, maintaining the consistency of description.

Step 4: Theme clusters: The researcher grouped all the formulated meanings into categories that reflected a unique structure of clusters or themes. Each cluster or theme was coded to include all the formulated meanings relating to that group of meanings.

Thereafter, the groups of clusters of themes that reflected a particular issue were incorporated together, in order to form a distinctive construct of a theme.

Step 5: Exhaustive description: All the emergent themes were integrated into an exhaustive description of the phenomenon. After merging all the themes, the whole structure of the perceptions of adolescents on the use of the HIV youth centre could be extracted.

Step 6: Statement of identification: The fundamental structure of the phenomenon was described. The researcher checked the findings for redundant, misused, or overestimated descriptions, and eliminated these from the overall structure.

Step 7: Participant verification: Validation of the findings in the study was sought from the research participants, through member checking, where the analysed data was given to the participants for review, validation, and commentary.
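For researchers who organise their coding digitally, steps 2 to 4 above (extracting significant statements, formulating meanings, and clustering them into themes) can be loosely sketched as a grouping exercise. This is an illustrative toy only: every statement, formulated meaning, and theme label below is invented, and this is not the procedure or software used in the study.

```python
from collections import defaultdict

# Illustrative sketch only: Colaizzi-style steps 2-4 represented as data.
# Step 2: each significant statement is recorded with its transcript and
# line number. Step 3: each statement carries a formulated meaning.
significant_statements = [
    # (participant, line, significant statement, formulated meaning)
    ("P1", 12, "The nurses never tell anyone my status", "confidentiality"),
    ("P2", 8,  "I can talk to the staff without fear",   "staff attitude"),
    ("P3", 21, "Nobody outside the clinic knows",        "confidentiality"),
    ("P4", 5,  "The counsellors listen to us",           "staff attitude"),
]

# Step 4: cluster the formulated meanings into broader themes.
theme_of = {
    "confidentiality": "Trust in the service",
    "staff attitude":  "Trust in the service",
}

clusters = defaultdict(list)
for participant, line, statement, meaning in significant_statements:
    clusters[theme_of[meaning]].append((participant, line, statement))

for theme, items in clusters.items():
    print(f"{theme}: {len(items)} supporting statements")
```

Keeping the participant identifier and line number with every statement mirrors the study’s practice of noting pages and line numbers in step 2, so that each theme in the final description can be traced back to its supporting quotations.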

3.3 TRUSTWORTHINESS

Researchers should try to achieve trustworthiness in qualitative research, also known as rigour, in order to confirm that the findings are of use to other researchers (Noble & Smith 2015:34). The following measures of trustworthiness, as suggested by Polit and Beck (2012), were applied to ensure the rigour of this research: credibility, dependability, confirmability, transferability, and authenticity. These measures ensured the truth value, applicability, consistency, and neutrality of the study.

• Credibility is a measure of the integrity and quality of a qualitative study, and it shows confidence that the data is true (Polit & Beck 2012:724). Credibility was ensured by engaging deeply with the participants during the interviews, and with the transcripts of the interviews. In the process, the researcher tried to remain aware of his personal views and biases, and tried to capture the participants’ perceptions as truthfully as possible. The researcher’s supervisor reviewed the data analysis to ensure that it was thorough, and the researcher validated the data through a process of member checking with the participants, where they checked whether their perceptions had been truthfully reflected.

• Dependability refers to the stability of the data over time and in similar conditions (Polit & Beck 2012:559). Dependability was ensured by utilising an external audit as a strategy. The researcher’s supervisor acted as an external auditor of the data analysis process, and the researcher kept detailed records of the research process to facilitate the audit.

• Transferability refers to whether the findings can be made applicable in other settings or other groups (Polit & Beck 2012:747). The researcher described the research process and data analysis in detail, so that any other researcher could follow a similar research process in a similar context.

• Confirmability refers to whether the data that is presented and interpreted is a true reflection of the information presented by the participants (Polit & Beck 2012:560). The researcher recorded the interviews with a voice recorder and took detailed field notes to ensure that the participants’ perceptions were accurately recorded and presented. The researcher’s supervisor acted as an external auditor of the data analysis process, which ensured a measure of confirmability in the interpretation of the data.

• Authenticity refers to the extent to which the researcher presents the different perspectives of the voices of the participants in the collected data (Polit & Beck 2012:720). This was achieved by including direct quotations from the participants, in order to include their authentic voices and to add richness to the discussions.

The research results were communicated by using detailed descriptions of specific themes.

3.4 ETHICAL CONSIDERATIONS

Ethics is a set of moral principles that are suggested by an individual or a group and are widely accepted. Systems of research ethics produce rules and behavioural expectations about the most correct (and appropriate) conduct towards experimental subjects and respondents (participants), employers, sponsors, other researchers, assistants and students.

In addition, the researcher obtained permission to conduct the research from the University of South Africa’s Department of Health Studies Research Ethics Committee (REC-012714-039), from the Lesotho Ministry of Health Research Ethics Committee, and from the manager of the Hospital Nursing Services at the public hospital. After approval had been granted by the various stakeholders, the nurse in charge of the adolescent clinic, who acted as a gatekeeper of the participants, was approached for permission. The researcher assured the nurse in charge that the participants would remain anonymous, and that the information provided by the participants would remain confidential and would not be disclosed.

Qualitative Descriptive Design
A descriptive design is a flexible, exploratory approach to qualitative research. Descriptive design is referred to in the literature by other labels, including generic, general, basic, traditional, interpretive, and pragmatic. Descriptive design as an acceptable research design for dissertations and other robust scholarly research has received varying degrees of acceptance within the academic community. However, descriptive design has been gaining momentum since the early 2000s as a suitable design for studies that do not fall into the more mainstream genres of qualitative research (i.e. case study, phenomenology, ethnography, narrative inquiry and grounded theory). In contrast to other qualitative designs, descriptive design is not aligned to specific methods (for example, bracketing in phenomenology, bounded systems in case study, or constant comparative analysis in grounded theory). Rather, descriptive design “borrows” methods appropriate to the proposed study from other designs.

Arguments supporting the flexible nature of descriptive designs describe it as being preferable to forcing a research approach into a design that is not quite appropriate for the nature of the intended study. However, descriptive design has also been criticized for this mixing of methods as well as for the limited literature describing it. The descriptive design can be the foundation for a rigorous study within the ADE program. Because of the flexibility of the methods used, a descriptive design provides the researcher with the opportunity to choose methods best suited to a practice-based research purpose.   


Sources of Data in Descriptive Design

Because of the exploratory nature of descriptive design, the triangulation of multiple sources of data is often used for additional insight into the phenomenon. Sources of data that can be used in descriptive studies are similar to those that may be used in other qualitative designs and include interviews, focus groups, documents, artifacts, and observations.



Journal of Research in Nursing, 25(5), August 2020

An overview of the qualitative descriptive design within nursing research

Louise Doyle

Associate Professor in Mental Health Nursing, School of Nursing and Midwifery, Trinity College Dublin, Ireland

Catherine McCabe

Associate Professor in General Nursing, School of Nursing and Midwifery, Trinity College Dublin, Ireland

Brian Keogh

Assistant Professor in Mental Health Nursing, School of Nursing and Midwifery, Trinity College Dublin, Ireland

Annemarie Brady

Chair of Nursing and Chronic Illness, School of Nursing and Midwifery, Trinity College Dublin, Ireland

Qualitative descriptive designs are common in nursing and healthcare research due to their inherent simplicity, flexibility and utility in diverse healthcare contexts. However, the application of descriptive research is sometimes critiqued in terms of scientific rigor. Inconsistency in decision making within the research process coupled with a lack of transparency has created issues of credibility for this type of approach. It can be difficult to clearly differentiate what constitutes a descriptive research design from the range of other methodologies at the disposal of qualitative researchers.

This paper provides an overview of qualitative descriptive research, orientates to the underlying philosophical perspectives and key characteristics that define this approach and identifies the implications for healthcare practice and policy.

Methods and results

Using real-world examples from healthcare research, the paper provides insight to the practical application of descriptive research at all stages of the design process and identifies the critical elements that should be explicit when applying this approach.

Conclusions

By adding to the existing knowledge base, this paper enhances the information available to researchers who wish to use the qualitative descriptive approach, influencing the standard of how this approach is employed in healthcare research.

Introduction

Qualitative descriptive approaches to nursing and healthcare research provide a broad insight into particular phenomena and can be used in a variety of ways including as a standalone research design, as a precursor to larger qualitative studies and commonly as the qualitative component in mixed-methods studies. Despite the widespread use of descriptive approaches within nursing research, there is limited methodological guidance about this type of design in research texts or papers. The lack of adequate representation in research texts has at times resulted in novice researchers using other more complex qualitative designs including grounded theory or phenomenology without meeting the requirements of these approaches ( Lambert and Lambert, 2012 ), or having an appropriate rationale for use of these approaches. This suggests there is a need to have more discussion about how and why descriptive approaches to qualitative research are used. This serves to not only provide information and guidance for researchers, but to ensure acceptable standards in how this approach is applied in healthcare research.

Rationale for qualitative descriptive research

The selection of an appropriate approach to answer research questions is one of the most important stages of the research process; consequently, there is a requirement that researchers can clearly articulate and defend their selection. Those who wish to undertake qualitative research have a range of approaches available to them, including grounded theory, phenomenology and ethnography. However, these designs may not be the most suitable for studies that do not require a deeply theoretical context and aim to stay close to and describe participants’ experiences. The most frequently proposed rationale for the use of a descriptive approach is to provide straightforward descriptions of experiences and perceptions (Sandelowski, 2010), particularly in areas where little is known about the topic under investigation. A qualitative descriptive design may be deemed most appropriate as it recognises the subjective nature of the problem and the different experiences participants have, and will present the findings in a way that directly reflects or closely resembles the terminology used in the initial research question (Bradshaw et al., 2017). This is particularly relevant in nursing and healthcare research, which is commonly concerned with how patients experience illness and associated healthcare interventions. The utilisation of a qualitative descriptive approach is often encouraged in Master’s level nurse education programmes, as it enables novice clinical nurse researchers to explore important healthcare questions that have direct implications and impact for their specific healthcare setting (Colorafi and Evans, 2016). As a Master’s level project is often the first piece of primary research undertaken by nurses, the use of a qualitative descriptive design provides an excellent method to address important clinical issues where the focus is not on increasing theoretical or conceptual understanding, but rather on contributing to change and quality improvement in the practice setting (Chafe, 2017).

This design is also frequently used within mixed-methods studies where qualitative data can explain quantitative findings in explanatory studies, be used for questionnaire development in exploratory studies and validate and corroborate findings in convergent studies ( Doyle et al., 2016 ). There has also been an increase in the use of qualitative descriptive research embedded in large-scale healthcare intervention studies, which can serve a number of purposes including identifying participants’ perceptions of why an intervention worked or, just as importantly, did not work and how the intervention might be improved ( Doyle et al., 2016 ). Using qualitative descriptive research in this manner can help to make the findings of intervention studies more clinically meaningful.

Philosophical and theoretical influences

Qualitative descriptive research generates data that describe the ‘who, what, and where of events or experiences’ from a subjective perspective (Kim et al., 2017, p. 23). From a philosophical perspective, this approach to research is best aligned with constructionism and critical theories that use interpretative and naturalistic methods (Lincoln et al., 2017). These philosophical perspectives represent the view that reality exists within various contexts that are dynamic and perceived differently depending on the subject; therefore, reality is multiple and subjective (Lincoln et al., 2017). In qualitative descriptive research, this translates into researchers being concerned with understanding the individual human experience in its unique context. This type of inquiry requires flexible research processes that are inductive and dynamic but do not transform the data beyond recognition from the phenomenon being studied (Ormston et al., 2014; Sandelowski, 2010). Descriptive qualitative research has also been aligned with pragmatism (Neergaard et al., 2009), where decisions are made about how the research should be conducted based on the aims or objectives and context of the study (Ormston et al., 2014). The pragmatist researcher is not aligned to one particular view of knowledge generation or one particular methodology. Instead, they look to the concepts or phenomena being studied to guide decision making in the research process, facilitating the selection of the most appropriate methods to answer the research question (Bishop, 2015).

Perhaps linked to the practical application of pragmatism to research, that is, applying the best methods to answer the research question, is the classification of qualitative descriptive research by Sandelowski ( 2010 , p. 82) into a ‘distributed residual category’. This recognises and incorporates uncertainty about the phenomena being studied and the research methods used to study them. For researchers, it permits the use of one or more different types of inquiry, which is essential when acknowledging and exploring different realities and subjective experiences in relation to phenomena ( Long et al., 2018 ). Clarity, in terms of the rationale for the phenomenon being studied and the methods used by the researcher, emerges from the qualitative descriptive approach because the data gathered remain close to the phenomenon throughout the study ( Sandelowski, 2010 ). For this to happen, a flexible approach is required, and this is evident in the practice of ‘borrowing’ elements of other qualitative methodologies such as grounded theory, phenomenology and ethnography ( Vaismoradi et al., 2013 ).

While many researchers interested in studying human nature and phenomena regard this flexibility as a positive aspect, others believe it leads to inconsistency across studies and, in some cases, complacency by researchers. This can result in vague or unexplained decision making around the research process and a subsequent lack of credibility. Accordingly, nurse researchers need to be reflexive, that is, clear about their role and position in terms of the phenomena being studied, the context, the theoretical framework and all decision-making processes used in a qualitative descriptive study. This adds credibility to both the study and qualitative descriptive research.

Methods in qualitative descriptive research

As with any research study, the application of descriptive methods will emerge in response to the aims and objectives, which will influence the sampling, data collection and analysis phases of the study.

Most qualitative research aligns itself with non-probability sampling and descriptive research is no different. Descriptive research generally uses purposive sampling and a range of purposive sampling techniques have been described ( Palinkas et al., 2015 ). Many researchers use a combination of approaches such as convenience, opportunistic or snowball sampling as part of the sampling framework, which is determined by the desired sample and the phenomena being studied.

Purposive sampling refers to selecting research participants who can speak to the research aims and who have knowledge and experience of the phenomenon under scrutiny ( Ritchie et al., 2014 ). When purposive sampling is used in a study it delimits and narrows the study population; however, researchers need to remember that other characteristics will also affect the sample, such as the location of the researcher and their flexibility to recruit participants from beyond their base. In addition, the heterogeneity of the population will need to be considered, along with how this might influence sampling and subsequent data collection and analysis ( Palinkas et al., 2015 ). Take, for example, conducting research on the experience of caring for people with Alzheimer’s disease (AD). For the most part, AD is a condition that affects older people, and the experiences of participants caring for older people will ultimately dominate the sample. However, AD also affects younger people, and how this will impact on sampling needs to be considered before recruitment, as the two groups will have very different, albeit overlapping, experiences. Teddlie and Fu (2007) suggest that although some purposive sampling techniques generate representative cases, most result in describing contrasting cases, which they argue are at the heart of qualitative analysis. To achieve this, Sandelowski (2010) suggests that maximum variation sampling is particularly useful in qualitative descriptive research, as it acknowledges the range of experiences that exist, especially in healthcare research. Palinkas et al. (2015) describe maximum variation sampling as identifying shared patterns that emerge from heterogeneity. In other words, researchers attempt to include a wide range of participants and experiences when collecting data. This may be more difficult to achieve in areas where little is known about the substantive area and may depend on the researcher’s knowledge of and immersion within the subject area.

Sample size will also need to be considered; although small sample sizes are common in qualitative descriptive research, researchers need to ensure they collect enough data to meet the study aims ( Ritchie et al., 2014 ). Pre-determining the sample size prior to data collection may stifle the analytic process, resulting in too much or too little data. Traditionally, the gold standard for sample size in qualitative research is data saturation, which differs depending on the research design and the size of the population ( Fusch and Ness, 2015 ). Data saturation is reached ‘when there is enough information to replicate the study, when the ability to obtain additional new information has been attained, and when further coding is no longer feasible’ ( Fusch and Ness, 2015 , p. 1408). However, some argue that although saturation is often reported, it is rarely demonstrated in qualitative descriptive research reports ( Caelli et al., 2003 ; Malterud et al., 2016 ). If data saturation is used to determine sample size, it is suggested that greater emphasis be placed on demonstrating how saturation was reached and at what level, to provide more credibility to sample sizes ( Caelli et al., 2003 ). Sample size calculation should be an estimate until saturation has been achieved through the concurrent processes of data collection and analysis. Where saturation has not been achieved, or where sample size has been predetermined for resource reasons, this should be clearly acknowledged. However, there is also a movement away from the reliance on data saturation as a measure of sample size in qualitative research ( Malterud et al., 2016 ). O’Reilly and Parker (2012) question the appropriateness of the rigid application of saturation as a sample size measure, arguing that outside of grounded theory its use is inconsistent and at times questionable. Malterud et al. (2016) focus instead on the concept of ‘information power’ to determine sample size. Here, they suggest sample size is determined by the amount of information the sample holds relevant to the actual study rather than by the number of participants ( Malterud et al., 2016 ). Some guidance on specific sample sizes depending on research design has been provided in the literature; however, these recommendations are sometimes conflicting and in some cases lack evidence to support their claims ( Guest et al., 2006 ). This is further complicated by the range of qualitative designs and data collection approaches available.

Data collection

Data collection methods in qualitative descriptive research are diverse and aim to discover the who, what and where of phenomena ( Sandelowski, 2000 ). Although semi-structured individual face-to-face interviews are the most commonly used data collection approach ( Kim et al., 2017 ), focus groups, telephone interviews and online approaches are also used.

Focus groups involve people with similar characteristics coming together in a relaxed and permissive environment to share their thoughts, experiences and insights ( Krueger and Casey, 2009 ). Participants share their own views and experiences, but also listen to and reflect on the experiences of other group members. It is this synergistic process of interacting with other group members that refines individuals’ viewpoints to a deeper and more considered level and produces data and insights that would not be accessible without the interaction found in a group ( Finch et al., 2014 ). Telephone interviews and online approaches are gaining more traction as they offer greater flexibility and reduced costs for researchers and ease of access for participants. In addition, they may help to achieve maximum variation sampling or examine experiences from a national or international perspective. Face-to-face interviews are often perceived as more appropriate than telephone interviews; however, this assumption has been challenged as evidence to support the use of telephone interviews emerges ( Ward et al., 2015 ). Online data collection also offers the opportunity to collect synchronous and asynchronous data using instant messaging and other online media ( Hooley et al., 2011 ). Online interviews or focus groups conducted via Skype or other media may overcome some of the limitations of telephone interviews, although observation of non-verbal communication may be more difficult to achieve ( Janghorban et al., 2014 ). Open-ended free-text responses in surveys have also been identified as useful data sources in qualitative descriptive studies ( Kim et al., 2017 ), in particular online open-ended questions, which can have a large geographical reach ( Seixas et al., 2018 ). Observation is also cited as an approach to data collection in qualitative descriptive research ( Sandelowski, 2000 ; Lambert and Lambert, 2012 ); however, in a systematic review examining the characteristics of qualitative descriptive studies, observation was cited only as an additional source of data and was not used as a primary method of data collection ( Kim et al., 2017 ).

Data analysis and interpretation

According to Lambert and Lambert (2012) , data analysis in qualitative descriptive research is data driven and does not use an approach that has emerged from a pre-existing philosophical or epistemological perspective. Within qualitative descriptive research, it is important that analysis is kept at a level that those to whom the research pertains can easily understand, so they can use the findings in healthcare practice ( Chafe, 2017 ). The approach to analysis is dictated by the aims of the research and, as qualitative descriptive research is generally explorative, inductive approaches will commonly be applied, although deductive approaches can also be used ( Kim et al., 2017 ).

Content and thematic analyses are the most commonly used data analysis techniques in qualitative descriptive research. Vaismoradi et al. (2013) argue that content and thematic analysis, although poorly understood and unevenly applied, offer legitimate ways of providing the lower level of interpretation that is often required in qualitative descriptive research. Sandelowski (2000) indicated that qualitative content analysis is the approach of choice in descriptive research; however, confusion exists between content and thematic analysis, which sometimes means researchers use a combination of the two. Vaismoradi et al. (2013) argue there are differences between the two: content analysis allows researchers to analyse the data qualitatively as well as being able to quantify the data, whereas thematic analysis provides a purely qualitative account of the data that is richer and more detailed. Decisions to use one over the other will depend on the aims of the study, which will dictate the depth of analysis required. Although there is a range of analysis guidelines available, they share some characteristics and an overview of these, derived from some key texts ( Sandelowski, 2010 ; Braun and Clarke, 2006 ; Newell and Burnard, 2006 ), is presented in Table 1 . Central to these guidelines is an attempt by the researcher to immerse themselves in the data and the ability to demonstrate a consistent and systematic approach to the analysis.

Common characteristics of descriptive qualitative analysis.

Coding in qualitative descriptive research can be inductive, where codes emerge from the data, or a priori, where codes are based on a pre-determined template as in template analysis. Inductive codes can be ‘in vivo’, where the researcher uses the words or concepts as stated by the participants ( Howitt, 2019 ), or can be named by the researcher and grouped together to form emerging themes or categories through an iterative, systematic process until the final themes emerge. Template analysis involves developing a coding template inductively from a subset of the data, then applying it to all the data and refining it as appropriate ( King, 2012 ). It offers a standardised approach that may be useful when several researchers are involved in the analysis process.

Within qualitative research studies generally, the analysis of data and subsequent presentation of research findings can range from studies with a relatively minimal amount of interpretation to those with high levels of interpretation ( Sandelowski and Barroso, 2003 ). The degree of interpretation required in qualitative descriptive research is contentious. Sandelowski (2010) argues that although descriptive research produces findings that are ‘data-near’, they are nevertheless interpretative. Sandelowski (2010) reports that a common misconception in qualitative descriptive designs is that researchers do not need to include any level of analysis and interpretation and can rely solely on indiscriminately selecting direct quotations from participants to answer the research question(s). Although it is important to ensure those familiar with the topic under investigation can recognise their experiences in the description of it ( Kim et al., 2017 ), this is not to say that there should be no transformation of data. Researchers using a qualitative descriptive design need to, through data analysis, move from un-interpreted participant quotations to interpreted research findings, which can still remain ‘data-near’ ( Sandelowski, 2010 ). Willis et al. (2016) suggest that researchers using the qualitative descriptive method might report a comprehensive thematic summary as findings, which moves beyond individual participant reports by developing an interpretation of a common theme. The extent of description and/or interpretation in a qualitative descriptive study is ultimately determined by the focus of the study ( Neergaard et al., 2009 ).

As with any research design, ensuring the rigor or trustworthiness of findings from a qualitative descriptive study is crucial. For a more detailed consideration of the quality criteria in qualitative studies, readers are referred to the seminal work of Lincoln and Guba (1985) in which the four key criteria of credibility, dependability, confirmability and transferability are discussed. At the very least, researchers need to be clear about the methodological decisions taken during the study so readers can judge the trustworthiness of the study and ultimately the findings ( Hallberg, 2013 ). Being aware of personal assumptions and the role they play in the research process is also an important quality criterion (Colorafi and Evans, 2016) and these assumptions can be made explicit through the use of researcher reflexivity in the study ( Bradshaw et al., 2017 ).

Challenges in using a qualitative descriptive design

One of the challenges of utilising a qualitative descriptive design is responding to the charge that many qualitative designs have historically encountered, which is that qualitative designs lack the scientific rigor associated with quantitative approaches ( Vaismoradi et al., 2013 ). The descriptive design faces further critique in this regard as, unlike other qualitative approaches such as phenomenology or grounded theory, it is not theory driven or oriented ( Neergaard et al., 2009 ). However, it is suggested that this perceived limitation of qualitative descriptive research only holds true if it is used for the wrong purposes and not primarily for describing the phenomenon ( Neergaard et al., 2009 ). Kahlke (2014) argues that rather than being atheoretical, qualitative descriptive approaches require researchers to consider to what extent theory will inform the study and are sufficiently flexible to leave space for researchers to utilise theoretical frameworks that are relevant and inform individual research studies. Kim et al. (2017) reported that most descriptive studies reviewed did not identify a theoretical or philosophical framework, but those that did used it to inform the development of either the interview guide or the data analysis framework, thereby identifying the potential use of theory in descriptive designs.

Another challenge around the use of qualitative descriptive research is that it can erroneously be seen as a ‘quick fix’ for researchers who want to employ qualitative methods, but perhaps lack the expertise or familiarity with qualitative research ( Sandelowski, 2010 ). Kim et al. (2017) report how in their review fewer than half of qualitative descriptive papers explicitly identified a rationale for choosing this design, suggesting that in some cases the rationale behind its use was ill considered. Providing a justification for choosing a particular research design is an important part of the research process and, in the case of qualitative descriptive research, a clear justification can offset concerns that a descriptive design was an expedient rather than a measured choice. For studies exploring participants’ experiences, which could be addressed using other qualitative designs, it also helps to clearly distinguish why a descriptive design was the best choice for the research study ( Kim et al., 2017 ). Similarly, there is a perception that the data analysis techniques most commonly associated with descriptive research – thematic and content analysis – are the ‘easiest’ approaches to qualitative analysis; however, as Vaismoradi et al. (2013) suggest, this does not mean they produce low-quality research findings.

As previously identified, a further challenge with the use of qualitative descriptive methods is that as a research design it has limited visibility in research texts and methodological papers ( Kim et al., 2017 ). This means that novice qualitative researchers have little guidance on how to design and implement a descriptive study, as there is no ‘methodological rulebook’ to guide them ( Kahlke, 2014 ). It is also suggested that this lack of strict boundaries and rules around qualitative descriptive research offers researchers flexibility to design a study using a variety of data collection and analysis approaches that best answer the research question ( Kahlke, 2014 ; Kim et al., 2017 ). However, should researchers choose to integrate methods ‘borrowed’ from other qualitative designs such as phenomenology or grounded theory, they should do so with the caveat that they do not claim to be using designs they are not actually using ( Neergaard et al., 2009 ).

Examples of the use of qualitative descriptive research in healthcare

Findings from qualitative descriptive studies within healthcare have the potential to describe the experiences of patients, families and health providers, inform the development of health interventions and policy and promote health and quality of life ( Neergaard et al., 2009 ; Willis et al., 2016 ). The examples provided here demonstrate different ways qualitative descriptive methods can be used in a range of healthcare settings.

Simon et al. (2015) used a qualitative descriptive design to identify the perspectives of seriously ill, older patients and their families on the barriers and facilitators to advance care planning. The authors provided a rationale for using a descriptive design, which was to gain a deeper understanding of the phenomenon under investigation. Data were gathered through nine open-ended questions on a researcher-administered questionnaire. Responses to all questions were recorded verbatim and transcribed. Using descriptive, interpretative and explanatory coding that transformed raw data recorded from 278 patients and 225 family members into more abstract ideas and concepts ( Simon et al., 2015 ), the authors developed a deeper understanding of the barriers and facilitators to advance care planning. Three categories were developed that identified personal beliefs, access to doctors and interaction with doctors as the central barriers and facilitators to advance care planning. The use of a qualitative descriptive design facilitated the development of a schematic based on these three themes, which provides a framework for use by clinicians to guide improvement in advance care planning.

Focus group interviews are a common data collection method in qualitative descriptive studies and were the method of choice in a study by Pelentsov et al. (2015), which sought to identify the supportive care needs of parents whose child has a rare disease. The rationale provided for using a qualitative descriptive design was to obtain a ‘straight description of the phenomena’ and to provide analysis and interpretation of the findings that remained data-near and representative of the responses of participants. In this study, four semi-structured focus group interviews were conducted with 23 parents. The data from these focus groups were then subjected to a form of thematic analysis during which emerging theories and inferences were identified and organised into a series of thematic networks and ultimately into three global themes. These themes identified that a number of factors including social isolation and lack of knowledge on behalf of healthcare professionals significantly affected how supported parents felt. Identifying key areas of the supportive needs of parents using qualitative description provides direction to health professionals on how best to respond to and support parents of children with a rare disease.

The potential for findings from a qualitative descriptive study to impact on policy was identified in a study by Syme et al. (2016) , who noted a lack of guidance and policies around the management of residents’ sexual expression in long-term care settings. In this study, 20 directors of nursing from long-term care settings were interviewed with a view to identifying challenges in addressing sexual expression in these settings and eliciting their recommendations for addressing these challenges in practice and policy. Following thematic analysis, findings relating to what directors of nursing believed to be important components of policy to address sexual expression were identified. These included providing educational resources, having a person-centred care delivery model when responding to sexual expression and providing guidance when working with families. Findings from this qualitative descriptive study provide recommendations that can then feed into a broader policy on sexual expression in long-term care settings.

The final example of the use of a qualitative descriptive study comes from a mixed-methods study comprising a randomised controlled trial and a qualitative process evaluation. He et al. (2015) sought to determine the effects of a play intervention for children on parental perioperative anxiety and to explore parents’ perceptions of the intervention. Parents whose children were undergoing surgery were assigned to a control group or an intervention group. The intervention group took part in a 1-hour play therapy session with their child, whereas the control group received usual care. Quantitative findings identified no difference in parents’ anxiety levels between the intervention and control groups. However, qualitative findings identified that parents found the intervention helpful in preparing both themselves and their child for surgery and perceived a reduction in their anxiety about the procedure, thereby capturing insights the quantitative measures did not. In addition, in the qualitative interviews, parents made suggestions about how the play intervention could be improved, which provides important data for its further development.

These examples across a range of healthcare settings provide evidence of the way findings from qualitative descriptive research can be directly used to more fully understand the experiences and perspectives of patients, their families and healthcare providers in addition to guiding future healthcare practice and informing further research.

Conclusion

Qualitative research designs have made significant contributions to the development of nursing and healthcare practices and policy. The utilisation of qualitative descriptive research is common within nursing research and is gaining popularity with other healthcare professions. This paper has identified that this design can be particularly relevant to nursing and healthcare professionals undertaking a primary piece of research, and that it provides an excellent method to address issues of real clinical significance to them and their practice setting. However, the conundrum facing researchers who wish to use this approach is its lack of visibility and transparency within methodological papers and texts, resulting in a deficit of information available to researchers when designing such studies.

By adding to the existing knowledge base, this paper enhances the information available to researchers who wish to use the qualitative descriptive approach, thus influencing the standard of how this approach is employed in healthcare research. We highlight the need for researchers using this approach to clearly outline the context, theoretical framework and concepts underpinning it, and the decision-making process that informed the design of their qualitative descriptive study, including the chosen research methods and how these contribute to the achievement of the study’s aims and objectives. Failure to describe these issues may have a negative impact on study credibility. As seen in our paper, qualitative descriptive studies have a role in healthcare research, providing insight into service users’ and providers’ perceptions and experiences of a particular phenomenon, which can inform healthcare service provision.

Key points for policy, practice and/or research

  • Despite its widespread use, there is little methodological guidance to orientate novice nurse researchers when using the qualitative descriptive design. This paper provides this guidance and champions the qualitative descriptive design as appropriate to explore research questions that require accessible and understandable findings directly relevant to healthcare practice and policy.
  • This paper identifies how the use of a qualitative descriptive design gives direct voice to participants including patients and healthcare staff, allowing exploration of issues of real and immediate importance in the practice area.
  • This paper reports how within qualitative descriptive research, the analysis of data and presentation of findings in a way that is easily understood and recognised is important to contribute to the utilisation of research findings in nursing practice.
  • As this design is often overlooked in research texts despite its suitability to exploring many healthcare questions, this paper adds to the limited methodological guidance and has utility for researchers who wish to defend their rationale for the use of the qualitative descriptive design in nursing and healthcare research.

Louise Doyle (PhD, MSc, BNS, RNT, RPN) is an Associate Professor in Mental Health Nursing at the School of Nursing and Midwifery, Trinity College Dublin. Her research interests are in the area of self-harm and suicide and she has a particular interest and expertise in mixed-methods and qualitative research designs.

Catherine McCabe (PhD, MSc, BNS, RNT, RGN) is an Associate Professor in General Nursing at the School of Nursing and Midwifery, Trinity College Dublin. Her research interests and expertise are in the areas of digital health (chronic disease self-management and social/cultural wellbeing), cancer, dementia, arts and health and systematic reviews.

Brian Keogh (PhD, MSc, BNS, RNT, RPN) is an Assistant Professor in Mental Health Nursing at the School of Nursing and Midwifery, Trinity College Dublin. His main area of research interest is mental health recovery and he specialises in qualitative research approaches with a particular emphasis on grounded theory.

Annemarie Brady (PhD, MSc, BNS, RNT, RPN) is Chair of Nursing and Chronic Illness and Head of School of Nursing and Midwifery at Trinity College Dublin. Her research work has focused on the development of healthcare systems and workforce solutions to respond to increased chronic illness demands within healthcare. She has conducted a range of mixed-method research studies in collaboration with health service providers to examine issues around patient-related outcomes measures, workload measurement, work conditions, practice development, patient safety and competency among healthcare workers.

Margaret McCann (PhD, MSc, BNS, RNT, RGN) is an Assistant Professor in General Nursing at the School of Nursing and Midwifery, Trinity College Dublin. Research interests are focused on chronic illness management, the use of digital health and smart technology in supporting patient/client education, self-management and independence. Other research interests include conducting systematic reviews, infection prevention and control and exploring patient outcomes linked to chronic kidney disease.

Declaration of conflicting interests

The author(s) declared no potential conflicts of interest with respect to the research, authorship and/or publication of this article.

Ethical approval was not required for this paper as it is a methodological paper and does not report on participant data.

The author(s) received no financial support for the research, authorship and/or publication of this article.

Louise Doyle https://orcid.org/0000-0002-0153-8326

Margaret McCann https://orcid.org/0000-0002-7925-6396

  • Bishop FL. (2015) Using mixed methods in health research: Benefits and challenges. British Journal of Health Psychology 20: 1–4.
  • Bradshaw C, Atkinson S, Doody O. (2017) Employing a qualitative description approach in health care research. Global Qualitative Nursing Research 4: 1–8.
  • Braun V, Clarke V. (2006) Using thematic analysis in psychology. Qualitative Research in Psychology 3: 77–101.
  • Caelli K, Ray L, Mill J. (2003) ‘Clear as mud’: Toward greater clarity in generic qualitative research. International Journal of Qualitative Methods 2: 1–13.
  • Chafe R. (2017) The value of qualitative description in health services and policy research. Healthcare Policy 12: 12–18.
  • Doyle L, Brady AM, Byrne G. (2016) An overview of mixed methods research – revisited. Journal of Research in Nursing 21: 623–635.
  • Finch H, Lewis J, Turley C. (2014) Focus groups. In: Ritchie J, Lewis J, McNaughton Nicholls C, Ormston R (eds) Qualitative Research Practice: A Guide for Social Science Students and Researchers. London: Sage, pp. 211–242.
  • Fusch PI, Ness LR. (2015) Are we there yet? Data saturation in qualitative research. The Qualitative Report 20: 1408–1416.
  • Guest G, Bunce A, Johnson L. (2006) How many interviews are enough? An experiment with data saturation and variability. Field Methods 18: 59–82.
  • Hallberg L. (2013) Quality criteria and generalization of results from qualitative studies. International Journal of Qualitative Studies in Health and Well-being 8: 1.
  • He HG, Zhu LX, Chan WCS, et al. (2015) A mixed-method study of effects of a therapeutic play intervention for children on parental anxiety and parents’ perceptions of the intervention. Journal of Advanced Nursing 71(7): 1539–1551.
  • Hooley T, Wellens J, Marriott J. (2011) What is Online Research? Using the Internet for Social Science Research. London: Bloomsbury Academic.
  • Howitt D. (2019) Introduction to Qualitative Methods in Psychology: Putting Theory into Practice, 4th edition. Harlow: Pearson Education Limited.
  • Janghorban R, Roudsari RL, Taghipour A. (2014) Skype interviewing: The new generation of online synchronous interview in qualitative research. International Journal of Qualitative Studies on Health and Wellbeing 9.
  • Kahlke RM. (2014) Generic qualitative approaches: Pitfalls and benefits of methodological mixology. International Journal of Qualitative Methods 13: 37–52.
  • Kim H, Sefcik JS, Bradway C. (2017) Characteristics of qualitative descriptive studies: A systematic review. Research in Nursing & Health 40: 23–42.
  • King N. (2012) Doing template analysis. In: Symon G, Cassell C (eds) Qualitative Organizational Research: Core Methods and Current Challenges. Los Angeles, CA: Sage.
  • Krueger RA, Casey MA. (2009) Focus Groups: A Practical Guide for Applied Research, 4th edition. Thousand Oaks, CA: Sage.
  • Lambert VA, Lambert CE. (2012) Qualitative descriptive research: An acceptable design. Pacific Rim International Journal of Nursing Research 16: 255–256.
  • Lincoln YS, Guba EG. (1985) Naturalistic Inquiry. Newbury Park, CA: Sage.
  • Lincoln YS, Lynham SA, Guba EG. (2017) Paradigmatic controversies, contradictions and emerging confluences. In: Denzin NK, Lincoln YS (eds) The Sage Handbook of Qualitative Research, 5th edition. Thousand Oaks, CA: Sage.
  • Long KM, McDermott F, Meadows GN. (2018) Being pragmatic about healthcare complexity: Our experiences applying complexity theory and pragmatism to health services research. BMC Medicine 16: 94.
  • Malterud K, Siersma VD, Guassora AD. (2016) Sample size in qualitative interview studies: Guided by information power . Qualitative Health Research 26 ( 13 ): 1753–1760. [ PubMed ] [ Google Scholar ]
  • Neergaard MA, Olesen F, Andersen RS, et al. (2009) Qualitative description – the poor cousin of health research? BMC Medical Research Methodology 9 . [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Newell R, Burnard P. (2011) Research for Evidence Based Practice , Oxford: Blackwell Publishing. [ Google Scholar ]
  • O’Reilly M, Parker N. (2012) ‘Unsatisfactory Saturation’: A critical exploration of the notion of saturated sample sizes in qualitative research . Qualitative Research 13 ( 2 ): 190–197. [ Google Scholar ]
  • Ormston R, Spencer L, Barnard M, et al. (2014) The foundations of qualitative research . In: Ritchie J, Lewis J, McNaughton Nicholls C, Ormston R. (eds) Qualitative Research Practice. A Guide for Social Science Students and Researchers , London: Sage, pp. 1–25. [ Google Scholar ]
  • Palinkas LA, Horwitz SM, Green CA, et al. (2015) Purposeful sampling for qualitative data collection and analysis in mixed method implementation research . Administration and Policy in Mental Health and Mental Health Services Research 42 : 533–544. [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Pelentsov LL, Fielder AL, Esterman AJ. (2016) The supportive care needs of parents with a child with a rare disease: A qualitative descriptive study . Journal of Pediatric Nursing 31 ( 3 ): e207–e218. [ PubMed ] [ Google Scholar ]
  • Ritchie J, Lewis J, Elam G, et al. (2014) Designing and selecting samples . In: Ritchie J, Lewis J, McNaughton Nicholls C, Ormston R. (eds) Qualitative Research Practice. A Guide for Social Science Students and Researchers , London: Sage, pp. 111–145. [ Google Scholar ]
  • Sandelowski M. (2000) Whatever happened to qualitative description? Research in Nursing & Health 23 : 334–340. [ PubMed ] [ Google Scholar ]
  • Sandelowski M. (2010) What’s in a name? Qualitative description revisited . Research in Nursing & Health 33 : 77–84. [ PubMed ] [ Google Scholar ]
  • Sandelowski M, Barroso J. (2003) Classifying the findings in qualitative studies . Qualitative Health Research 13 : 905–923. [ PubMed ] [ Google Scholar ]
  • Seixas BV, Smith N, Mitton C. (2018) The qualitative descriptive approach in international comparative studies: Using online qualitative surveys . International Journal of Health Policy Management 7 ( 9 ): 778–781. [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Simon J, Porterfield P, Bouchal SR, et al. (2015) ‘Not yet’ and ‘just ask’: Barriers and facilitators to advance care planning – a qualitative descriptive study of the perspectives of seriously ill, older patients and their families . BMJ Supportive & Palliative Care 5 : 54–62. [ PubMed ] [ Google Scholar ]
  • Syme ML, Lichtenberg P, Moye J. (2016) Recommendations for sexual expression management in long-term care: A qualitative needs assessment . Journal of Advanced Nursing 72 ( 10 ): 2457–2467. [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Teddlie C, Yu F. (2007) Mixed methods sampling: A typology with examples . Journal of Mixed Methods Research 1 : 77–100. [ Google Scholar ]
  • Vaismoradi M, Turunen H, Bondas T. (2013) Content analysis and thematic analysis: Implications for conducting a qualitative descriptive study . Nursing & Health Sciences 15 : 398–405. [ PubMed ] [ Google Scholar ]
  • Ward K, Gott M, Hoare K. (2015) Participants’ views of telephone interviews within a grounded theory study . Journal of Advanced Nursing 71 : 2775–2785. [ PubMed ] [ Google Scholar ]
  • Willis DG, Sullivan-Bolyai S, Knafl K, et al. (2016) Distinguishing features and similarities between descriptive phenomenological and qualitative descriptive research . Western Journal of Nursing Research 38 : 1185–1204. [ PubMed ] [ Google Scholar ]
