
Qualitative and Quantitative Research

What is "empirical research".


Empirical research  is based on observed and measured phenomena and derives knowledge from actual experience rather than from theory or belief. 

How do you know if a study is empirical? Read the subheadings within the article, book, or report and look for a description of the research "methodology."  Ask yourself: Could I recreate this study and test these results?

Key characteristics to look for:

  • Specific research questions  to be answered
  • Definition of the  population, behavior, or   phenomena  being studied
  • Description of the  process  used to study this population or phenomena, including selection criteria, controls, and testing instruments (such as surveys)

Another hint: some scholarly journals use a specific layout, called the "IMRaD" format, to communicate empirical research findings. Such articles typically have 4 components:

  • Introduction : sometimes called "literature review" -- what is currently known about the topic -- usually includes a theoretical framework and/or discussion of previous studies
  • Methodology:  sometimes called "research design" --  how to recreate the study -- usually describes the population, research process, and analytical tools
  • Results : sometimes called "findings"  --  what was learned through the study -- usually appears as statistical data or as substantial quotations from research participants
  • Discussion : sometimes called "conclusion" or "implications" -- why the study is important -- usually describes how the research results influence professional practices or future studies


Empirical Research: Definition, Methods, Types and Examples

What is Empirical Research


Empirical research is defined as any research in which the conclusions of the study are drawn strictly from concrete, verifiable empirical evidence.

This empirical evidence can be gathered using quantitative market research and  qualitative market research  methods.

For example: a study is conducted to find out whether listening to happy music in the workplace promotes creativity. An experiment is set up using a music-website survey in which one group of participants is exposed to happy music while another group listens to no music at all, and the subjects are then observed. The results of such a study provide empirical evidence on whether happy music does or does not promote creativity.


You must have heard the quote “I will not believe it unless I see it”. This came from the ancient empiricists, a fundamental understanding that powered the emergence of medieval science during the Renaissance and laid the foundation of modern science as we know it today. The word itself has its roots in Greek: it is derived from the Greek word empeirikos, which means “experienced”.

In today’s world, the word empirical refers to data collected through observation, experience, or calibrated scientific instruments. All of the above origins have one thing in common: a dependence on observation and experiment to collect data and test it in order to reach conclusions.


Types and methodologies of empirical research

Empirical research can be conducted and analysed using qualitative or quantitative methods.

  • Quantitative research: Quantitative research methods are used to gather information through numerical data. They are used to quantify opinions, behaviors, or other defined variables. These methods are predetermined and more structured. Some of the commonly used methods are surveys, longitudinal studies, polls, etc.
  • Qualitative research: Qualitative research methods are used to gather non-numerical data. They are used to find meanings, opinions, or the underlying reasons from their subjects. These methods are unstructured or semi-structured. The sample size for such research is usually small, and the methods are conversational in nature, providing more insight or in-depth information about the problem. Some of the most popular methods are focus groups, experiments, interviews, etc.

Data collected from these methods needs to be analysed. Empirical evidence can be analysed either quantitatively or qualitatively. Using the analysis, the researcher can answer empirical questions, which have to be clearly defined and answerable with the findings obtained. The type of research design used will vary depending on the field in which it is going to be used. Many researchers choose a combined approach involving quantitative and qualitative methods to better answer questions which cannot be studied in a laboratory setting.


Quantitative research methods aid in analyzing the empirical evidence gathered. By using these, a researcher can find out whether the hypothesis is supported or not.

  • Survey research: Survey research generally involves a large audience to collect a large amount of data. This is a quantitative method with a predetermined set of closed questions which are easy to answer. Because of the simplicity of such a method, high response rates are achieved. It is one of the most commonly used methods for all kinds of research in today’s world.

Previously, surveys were conducted face to face, perhaps with a recorder. However, with the advancement of technology and for ease of use, new mediums such as email and social media have emerged.

For example: Depletion of energy resources is a growing concern and hence there is a need for awareness about renewable energy. According to recent studies, fossil fuels still account for around 80% of energy consumption in the United States. Even though there is a rise in the use of green energy every year, there are certain parameters because of which the general population is still not opting for green energy. In order to understand why, a survey can be conducted to gather opinions of the general population about green energy and the factors that influence their choice of switching to renewable energy. Such a survey can help institutions or governing bodies to promote appropriate awareness and incentive schemes to push the use of greener energy.
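To make this concrete, here is a minimal, hypothetical sketch in Python of how the closed-ended responses from such a survey might be summarized as proportions; the column names and answer options are invented for illustration.

```python
# A minimal sketch (not from the original article) of summarizing
# closed-ended survey responses quantitatively. Column names and
# answer options are hypothetical.
import pandas as pd

# Each row is one respondent's answers to two closed-ended questions.
responses = pd.DataFrame({
    "would_switch_to_green_energy": [
        "Yes", "No", "Yes", "Undecided", "Yes", "No", "Yes", "Undecided",
    ],
    "main_barrier": [
        "Cost", "Cost", "Availability", "Cost", "Awareness", "Availability",
        "Cost", "Awareness",
    ],
})

# The proportion of each answer is the empirical evidence the survey produces.
print(responses["would_switch_to_green_energy"].value_counts(normalize=True))
print(responses["main_barrier"].value_counts(normalize=True))
```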


  • Experimental research: In experimental research, an experiment is set up and a hypothesis is tested by creating a situation in which one of the variables is manipulated. This is also used to check cause and effect: the researcher observes what happens to the dependent variable when the independent variable is altered or removed. The process for such a method usually involves proposing a hypothesis, experimenting on it, analyzing the findings, and reporting the findings to understand whether they support the theory or not.

For example: a product company is trying to find out why it cannot capture the market. So the organisation makes changes in each of its processes, such as manufacturing, marketing, sales, and operations. Through the experiment it learns that sales training directly impacts the market coverage for its product: if the sales staff are trained well, the product achieves better coverage.
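As an illustration of how the outcome of such a two-group experiment might be analysed, here is a minimal, hypothetical sketch in Python using invented scores; a two-sample (Welch's) t-test is one common choice, not the only one.

```python
# Hypothetical sketch: one group receives the manipulated condition
# (e.g. extra sales training), the other does not, and an outcome is
# measured for each participant. All numbers are invented.
from scipy import stats

trained   = [72, 68, 75, 80, 71, 77, 74, 69]   # outcome scores, treatment group
untrained = [65, 70, 62, 68, 64, 66, 63, 67]   # outcome scores, control group

# Welch's t-test: does the mean outcome differ between the two groups?
t_stat, p_value = stats.ttest_ind(trained, untrained, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
# A small p-value (e.g. < 0.05) would count as empirical support for the
# hypothesis that the manipulation affects the outcome; it is not proof.
```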

  • Correlational research: Correlational research is used to find the relationship between two sets of variables. Regression analysis is generally used to predict outcomes from such a method. The correlation found can be positive, negative, or neutral.


For example: a hypothesis that individuals with higher education obtain higher-paying jobs. A correlational study would measure education level and salary to see whether the two variables move together; note that correlation alone does not establish that more education causes the higher pay.
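A minimal, hypothetical sketch in Python of such a correlational analysis, with invented figures:

```python
# Hypothetical sketch: does years of education move together with salary?
# All figures are invented for illustration.
from scipy import stats

years_of_education = [10, 12, 12, 14, 16, 16, 18, 20]
salary_thousands   = [32, 35, 38, 45, 55, 52, 68, 75]

r, p_value = stats.pearsonr(years_of_education, salary_thousands)
print(f"Pearson r = {r:.2f}, p = {p_value:.4f}")
# r near +1 indicates a strong positive correlation, near -1 a strong negative
# one, and near 0 little linear relationship. Correlation alone does not show
# that education causes the higher salary.
```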

  • Longitudinal study: A longitudinal study is used to understand the traits or behavior of a subject under observation by testing the subject repeatedly over a period of time. Data collected from such a method can be qualitative or quantitative in nature.

For example: a study of the benefits of exercise. The subjects are asked to exercise every day for a particular period of time, and the results show higher endurance, stamina, and muscle growth. This supports the hypothesis that exercise benefits an individual’s body.

  • Cross-sectional: A cross-sectional study is an observational method in which a set of subjects is observed at a single point in time. The subjects are chosen so that they are similar in all variables except the one being researched. This type does not enable the researcher to establish a cause-and-effect relationship, as the subjects are not observed over a continuous time period. It is used mainly in the healthcare sector and the retail industry.

For example: a medical study to find the prevalence of under-nutrition disorders in children of a given population. This will involve looking at a wide range of parameters such as age, ethnicity, location, income, and social background. If a significant number of children from poor families show under-nutrition disorders, the researcher can investigate further. Usually a cross-sectional study is followed by a longitudinal study to find out the exact reason.
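A minimal, hypothetical sketch in Python of how prevalence might be compared across groups in such a cross-sectional data set; the data are invented.

```python
# Hypothetical sketch: prevalence of under-nutrition by family income group,
# all measured at a single point in time. Data are invented.
import pandas as pd

children = pd.DataFrame({
    "income_group":    ["low", "low", "low", "middle", "middle", "high", "high", "high"],
    "under_nourished": [True,  True,  False, True,     False,    False,  False,  False],
})

# Prevalence = proportion under-nourished within each income group.
prevalence = children.groupby("income_group")["under_nourished"].mean()
print(prevalence)
```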

  • Causal-comparative research: This method is based on comparison. It is mainly used to find cause-and-effect relationships between two or more variables.

For example: a researcher compares the productivity of employees at a company that gives its employees breaks during work with that of employees at a company that gives no breaks at all.


Some research questions need to be analysed qualitatively, as quantitative methods are not applicable there. In many cases, in-depth information is needed or a researcher may need to observe a target audience behavior, hence the results needed are in a descriptive analysis form. Qualitative research results will be descriptive rather than predictive. It enables the researcher to build or support theories for future potential quantitative research. In such a situation qualitative research methods are used to derive a conclusion to support the theory or hypothesis being studied.


  • Case study: The case study method is used to find more information by carefully analyzing existing cases. It is very often used for business research or to gather empirical evidence for investigation purposes. It is a method of investigating a problem within its real-life context through existing cases. The researcher has to analyse carefully, making sure the parameters and variables in the existing case are the same as in the case being investigated. Using the findings from the case study, conclusions can be drawn regarding the topic being studied.

For example: a report describing the solution provided by a company to its client, the challenges faced during initiation and deployment, the findings of the case, and the solutions offered for the problems. Such case studies are used by most companies as they form empirical evidence the company can promote in order to get more business.

  • Observational method:   Observational method is a process to observe and gather data from its target. Since it is a qualitative method it is time consuming and very personal. It can be said that observational research method is a part of ethnographic research which is also used to gather empirical evidence. This is usually a qualitative form of research, however in some cases it can be quantitative as well depending on what is being studied.

For example: setting up a study to observe a particular animal in the rainforests of the Amazon. Such research usually takes a lot of time, as observation has to be done for a set amount of time to study the patterns or behavior of the subject. Another example used widely nowadays is observing people shopping in a mall to figure out the buying behavior of consumers.

  • One-on-one interview: Such a method is purely qualitative and one of the most widely used, because it enables a researcher to get precise, meaningful data if the right questions are asked. It is a conversational method in which in-depth data can be gathered depending on where the conversation leads.

For example: A one-on-one interview with the finance minister to gather data on financial policies of the country and its implications on the public.

  • Focus groups: Focus groups are used when a researcher wants to find answers to why, what and how questions. A small group is generally chosen for such a method and it is not necessary to interact with the group in person. A moderator is generally needed in case the group is being addressed in person. This is widely used by product companies to collect data about their brands and the product.

For example: a mobile phone manufacturer wanting feedback on the dimensions of one of its models which is yet to be launched. Such studies help the company meet the demands of customers and position the model appropriately in the market.

  • Text analysis: The text analysis method is relatively new compared to the other types. It is used to analyse social life by examining the images or words used by individuals. In today’s world, with social media playing a major part in everyone’s life, such a method enables the researcher to follow patterns that relate to the study.

For example: many companies ask customers for detailed feedback on how satisfied they are with the customer support team. Such data enables the researcher to make appropriate decisions to improve the support team.
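A minimal, hypothetical sketch in Python of a very simple text analysis, counting the words that appear most often in invented feedback comments:

```python
# Hypothetical sketch: count the words customers use most often in
# free-text feedback. The comments and stopword list are invented.
import re
from collections import Counter

feedback = [
    "Support team was quick and friendly",
    "Very slow response, support never followed up",
    "Friendly agent, resolved my issue quickly",
]

stopwords = {"was", "and", "my", "very", "the", "a", "up", "never"}
words = []
for comment in feedback:
    words += [w for w in re.findall(r"[a-z]+", comment.lower()) if w not in stopwords]

# The most frequent words hint at recurring themes (e.g. speed, friendliness).
print(Counter(words).most_common(5))
```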

Sometimes a combination of the methods is also needed for some questions that cannot be answered using only one type of method especially when a researcher needs to gain a complete understanding of complex subject matter.


Since empirical research is based on observation and capturing experiences, it is important to plan the steps to conduct the experiment and how to analyse it. This will enable the researcher to resolve problems or obstacles which can occur during the experiment.

Step #1: Define the purpose of the research

This is the step where the researcher has to answer questions like: What exactly do I want to find out? What is the problem statement? Are there any issues in terms of the availability of knowledge, data, time, or resources? Will this research be more beneficial than what it will cost?

Before going ahead, a researcher has to clearly define his purpose for the research and set up a plan to carry out further tasks.

Step #2: Supporting theories and relevant literature

The researcher needs to find out whether there are theories which can be linked to the research problem. He has to figure out whether any theory can help support his findings. All kinds of relevant literature will help the researcher find out if others have researched this before and what problems they faced during that research. The researcher will also have to set up assumptions and find out whether there is any history regarding the research problem.

Step #3: Creation of Hypothesis and measurement

Before beginning the actual research, the researcher needs a working hypothesis, or a guess at what the probable result will be. He has to set up the variables, decide the environment for the research, and work out how the variables relate to each other.

The researcher will also need to define the units of measurement, the tolerable degree of error, and whether the chosen measurements will be acceptable to others.

Step #4: Methodology, research design and data collection

In this step, the researcher has to define a strategy for conducting the research. He has to set up experiments to collect data which will enable him to test the hypothesis. The researcher will decide whether an experimental or non-experimental method is needed for conducting the research. The type of research design will vary depending on the field in which the research is being conducted. Last but not least, the researcher will have to identify the parameters that will affect the validity of the research design. Data collection will need to be done by choosing appropriate samples depending on the research question. To carry out the research, the researcher can use one of the many sampling techniques. Once data collection is complete, the researcher will have empirical data which needs to be analysed.


Step #5: Data Analysis and result

Data analysis can be done in two ways: qualitatively and quantitatively. The researcher will need to decide whether a qualitative method, a quantitative method, or a combination of both is needed. Depending on the analysis of the data, the researcher will know whether the hypothesis is supported or rejected. Analyzing this data is the most important part of supporting the hypothesis.

Step #6: Conclusion

A report will need to be made with the findings of the research. The researcher can give the theories and literature that support his research. He can make suggestions or recommendations for further research on his topic.

Empirical research methodology cycle

A.D. de Groot, a famous Dutch psychologist and chess expert, conducted some of the most notable experiments using chess in the 1940s. During his study, he came up with a cycle which is consistent and is now widely used to conduct empirical research. It consists of 5 phases, with each phase being as important as the next. The empirical cycle captures the process of coming up with hypotheses about how certain subjects work or behave and then testing these hypotheses against empirical data in a systematic and rigorous way. It can be said that it characterizes the deductive approach to science. Following is the empirical cycle.

  • Observation: At this phase an idea is sparked for proposing a hypothesis. During this phase empirical data is gathered using observation. For example: a particular species of flower blooms in a different color only during a specific season.
  • Induction: Inductive reasoning is then carried out to form a general conclusion from the data gathered through observation. For example: as stated above, it is observed that the species of flower blooms in a different color during a specific season. A researcher may ask the question, “Does the temperature in the season cause the color change in the flower?” He can assume that is the case; however, it is a mere conjecture, and hence an experiment needs to be set up to support this hypothesis. So he tags a set of flowers kept at a different temperature and observes whether they still change color.
  • Deduction: This phase helps the researcher deduce a conclusion from the experiment. This has to be based on logic and rationality to come up with specific, unbiased results. For example: in the experiment, if the tagged flowers kept in a different temperature environment do not change color, then it can be concluded that temperature plays a role in changing the color of the bloom.
  • Testing: In this phase the researcher returns to empirical methods to put the hypothesis to the test. The researcher now needs to make sense of the data and hence needs to use statistical analysis plans to determine the relationship between temperature and bloom color (a minimal statistical sketch follows this list). If the researcher finds that most flowers bloom in a different color when exposed to a certain temperature and the others do not when the temperature is different, he has found support for his hypothesis. Note that this is not proof, but only support for the hypothesis.
  • Evaluation: This phase is often forgotten but is an important one for continuing to gain knowledge. During this phase the researcher puts forth the data he has collected, the supporting argument, and his conclusion. The researcher also states the limitations of the experiment and the hypothesis, and suggests ways for others to pick it up and continue more in-depth research in the future.
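To illustrate the testing phase of the flower example above, here is a minimal, hypothetical sketch in Python; the counts are invented, and a chi-square test of independence is just one analysis a researcher might choose.

```python
# Hypothetical sketch: is bloom colour independent of the temperature the
# tagged flowers were kept at? The counts are invented.
from scipy.stats import chi2_contingency

#            colour changed   colour unchanged
observed = [
    [18, 4],    # flowers kept at the seasonal temperature
    [5, 17],    # flowers kept at a different temperature
]

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi-square = {chi2:.2f}, p = {p_value:.4f}")
# A small p-value supports (but does not prove) the hypothesis that
# temperature and bloom colour are related.
```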


There is a reason why empirical research is one of the most widely used methods: it has several advantages. Following are a few of them.

  • It is used to authenticate traditional research through various experiments and observations.
  • This research methodology makes the research being conducted more competent and authentic.
  • It enables a researcher to understand the dynamic changes that can happen and adjust the strategy accordingly.
  • The level of control in such research is high, so the researcher can control multiple variables.
  • It plays a vital role in increasing internal validity.

Even though empirical research makes the research more competent and authentic, it does have a few disadvantages. Following are a few of them.

  • Such research needs patience, as it can be very time consuming. The researcher has to collect data from multiple sources, and the parameters involved are numerous, which makes the research time consuming.
  • Most of the time, a researcher will need to conduct research in different locations or environments, which can make it expensive.
  • There are rules governing how experiments can be performed, and hence permissions are needed. Many times it is very difficult to get the permissions required to carry out different methods of this research.
  • Collection of data can be a problem sometimes, as it has to be collected from a variety of sources through different methods.


Empirical research is important in today’s world because most people believe only in what they can see, hear, or experience. It is used to validate multiple hypotheses, increase human knowledge, and keep advancing various fields.

For example: pharmaceutical companies use empirical research to try out a specific drug on controlled or random groups to study cause and effect. In this way, they gather support for the theories they proposed for the specific drug. Such research is very important, as it can sometimes lead to finding a cure for a disease that has existed for many years. It is useful in science and many other fields such as history, the social sciences, and business.


In today’s world, empirical research has become critical and a norm in many fields for supporting hypotheses and gaining knowledge. The methods mentioned above are very useful for carrying out such research. However, a number of new methods will keep emerging as the nature of investigative questions changes and becomes more specialized.


Empirical evidence: A definition

Empirical evidence is information that is acquired by observation or experimentation.


The scientific method


Empirical evidence is information acquired by observation or experimentation. Scientists record and analyze this data. The process is a central part of the scientific method , leading to the proving or disproving of a hypothesis and our better understanding of the world as a result.

Empirical evidence might be obtained through experiments that seek to provide a measurable or observable reaction, trials that repeat an experiment to test its efficacy (such as a drug trial, for instance) or other forms of data gathering against which a hypothesis can be tested and reliably measured. 

"If a statement is about something that is itself observable, then the empirical testing can be direct. We just have a look to see if it is true. For example, the statement, 'The litmus paper is pink', is subject to direct empirical testing," wrote Peter Kosso in " A Summary of Scientific Method " (Springer, 2011).

"Science is most interesting and most useful to us when it is describing the unobservable things like atoms , germs , black holes , gravity , the process of evolution as it happened in the past, and so on," wrote Kosso. Scientific theories , meaning theories about nature that are unobservable, cannot be proven by direct empirical testing, but they can be tested indirectly, according to Kosso. "The nature of this indirect evidence, and the logical relation between evidence and theory, are the crux of scientific method," wrote Kosso.

The scientific method begins with scientists forming questions, or hypotheses , and then acquiring the knowledge through observations and experiments to either support or disprove a specific theory. "Empirical" means "based on observation or experience," according to the Merriam-Webster Dictionary . Empirical research is the process of finding empirical evidence. Empirical data is the information that comes from the research.

Before any pieces of empirical data are collected, scientists carefully design their research methods to ensure the accuracy, quality and integrity of the data. If there are flaws in the way that empirical data is collected, the research will not be considered valid.

The scientific method often involves lab experiments that are repeated over and over, and these experiments result in quantitative data in the form of numbers and statistics. However, that is not the only process used for gathering information to support or refute a theory. 

This methodology mostly applies to the natural sciences. "The role of empirical experimentation and observation is negligible in mathematics compared to natural sciences such as psychology, biology or physics," wrote Mark Chang, an adjunct professor at Boston University, in " Principles of Scientific Methods " (Chapman and Hall, 2017).

"Empirical evidence includes measurements or data collected through direct observation or experimentation," said Jaime Tanner, a professor of biology at Marlboro College in Vermont. There are two research methods used to gather empirical measurements and data: qualitative and quantitative.

Qualitative research, often used in the social sciences, examines the reasons behind human behavior, according to the National Center for Biotechnology Information (NCBI) . It involves data that can be found using the human senses . This type of research is often done in the beginning of an experiment. "When combined with quantitative measures, qualitative study can give a better understanding of health related issues," wrote Dr. Sanjay Kalra for NCBI.

Quantitative research involves methods that are used to collect numerical data and analyze it using statistical methods. "Quantitative research methods emphasize objective measurements and the statistical, mathematical, or numerical analysis of data collected through polls, questionnaires, and surveys, or by manipulating pre-existing statistical data using computational techniques," according to LeTourneau University. This type of research is often used at the end of an experiment to refine and test the previous research.


Identifying empirical evidence in another researcher's experiments can sometimes be difficult. According to the Pennsylvania State University Libraries , there are some things one can look for when determining if evidence is empirical:

  • Can the experiment be recreated and tested?
  • Does the experiment have a statement about the methodology, tools and controls used?
  • Is there a definition of the group or phenomena being studied?

The objective of science is that all empirical data that has been gathered through observation, experience and experimentation is without bias. The strength of any scientific research depends on the ability to gather and analyze empirical data in the most unbiased and controlled fashion possible. 

However, in the 1960s, scientific historian and philosopher Thomas Kuhn promoted the idea that scientists can be influenced by prior beliefs and experiences, according to the Center for the Study of Language and Information . 


"Missing observations or incomplete data can also cause bias in data analysis, especially when the missing mechanism is not random," wrote Chang.

Because scientists are human and prone to error, empirical data is often gathered by multiple scientists who independently replicate experiments. This also guards against scientists who unconsciously, or in rare cases consciously, veer from the prescribed research parameters, which could skew the results.

The recording of empirical data is also crucial to the scientific method, as science can only be advanced if data is shared and analyzed. Peer review of empirical data is essential to protect against bad science, according to the University of California .

Empirical laws and scientific laws are often the same thing. "Laws are descriptions — often mathematical descriptions — of natural phenomenon," Peter Coppinger, associate professor of biology and biomedical engineering at the Rose-Hulman Institute of Technology, told Live Science. 

Empirical laws are scientific laws that can be proven or disproved using observations or experiments, according to the Merriam-Webster Dictionary . So, as long as a scientific law can be tested using experiments or observations, it is considered an empirical law.

Empirical, anecdotal and logical evidence should not be confused. They are separate types of evidence that can be used to try to prove or disprove an idea or claim.

Logical evidence is used to prove or disprove an idea using logic. Deductive reasoning may be used to come to a conclusion to provide logical evidence. For example, "All men are mortal. Harold is a man. Therefore, Harold is mortal."

Anecdotal evidence consists of stories that have been experienced by a person that are told to prove or disprove a point. For example, many people have told stories about their alien abductions to prove that aliens exist. Often, a person's anecdotal evidence cannot be proven or disproven. 

There are some things in nature that science is still working to build evidence for, such as the hunt to explain consciousness .

Meanwhile, in other scientific fields, efforts are still being made to improve research methods, such as the plan by some psychologists to fix the science of psychology .

" A Summary of Scientific Method " by Peter Kosso (Springer, 2011)

"Empirical" Merriam-Webster Dictionary

" Principles of Scientific Methods " by Mark Chang (Chapman and Hall, 2017)

"Qualitative research" by Dr. Sanjay Kalra National Center for Biotechnology Information (NCBI)

"Quantitative Research and Analysis: Quantitative Methods Overview" LeTourneau University

"Empirical Research in the Social Sciences and Education" Pennsylvania State University Libraries

"Thomas Kuhn" Center for the Study of Language and Information

"Misconceptions about science" University of California


Empirical Research: Defining, Identifying, & Finding


Calfee & Chambliss (2005)  (UofM login required) describe empirical research as a "systematic approach for answering certain types of questions."  Those questions are answered "[t]hrough the collection of evidence under carefully defined and replicable conditions" (p. 43). 

The evidence collected during empirical research is often referred to as "data." 

Characteristics of Empirical Research

Emerald Publishing's guide to conducting empirical research identifies a number of common elements of empirical research:

  • A  research question , which will determine research objectives.
  • A particular and planned  design  for the research, which will depend on the question and which will find ways of answering it with appropriate use of resources.
  • The gathering of  primary data , which is then analysed.
  • A particular  methodology  for collecting and analysing the data, such as an experiment or survey.
  • The limitation of the data to a particular group, area or time scale, known as a sample [emphasis added]: for example, a specific number of employees of a particular company type, or all users of a library over a given time scale. The sample should be somehow representative of a wider population.
  • The ability to  recreate  the study and test the results. This is known as  reliability .
  • The ability to  generalize  from the findings to a larger sample and to other situations.

If you see these elements in a research article, you can feel confident that you have found empirical research. Emerald's guide goes into more detail on each element. 

Empirical research methodologies can be described as quantitative, qualitative, or a mix of both (usually called mixed-methods).

Ruane (2016)  (UofM login required) gets at the basic differences in approach between quantitative and qualitative research:

  • Quantitative research  -- an approach to documenting reality that relies heavily on numbers both for the measurement of variables and for data analysis (p. 33).
  • Qualitative research  -- an approach to documenting reality that relies on words and images as the primary data source (p. 33).

Both quantitative and qualitative methods are empirical. If you can recognize that a research study is a quantitative or qualitative study, then you have also recognized that it is an empirical study.

Below is information on the characteristics of quantitative and qualitative research. This video from Scribbr also offers a good overall introduction to the two approaches to research methodology:

Characteristics of Quantitative Research 

Researchers test hypotheses, or theories, based on assumptions about causality, i.e. we expect variable X to cause variable Y. Variables have to be controlled as much as possible to ensure validity. The results explain the relationship between the variables. Measures are based on pre-defined instruments.

Examples: experimental or quasi-experimental design, pretest and post-test, survey or questionnaire with closed-ended questions. Studies that identify factors that influence an outcome, the utility of an intervention, or predictors of outcomes.
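As a minimal, hypothetical illustration of one of the quantitative designs listed above, a pretest and post-test comparison on the same participants can be analysed with a paired t-test; the scores below are invented.

```python
# Hypothetical sketch: the same participants are measured before and after an
# intervention, and a paired t-test checks whether the mean change is
# statistically significant. Scores are invented.
from scipy import stats

pretest  = [54, 61, 48, 70, 65, 59, 52, 67]
posttest = [60, 66, 55, 74, 68, 64, 58, 70]

t_stat, p_value = stats.ttest_rel(posttest, pretest)
print(f"paired t = {t_stat:.2f}, p = {p_value:.4f}")
```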

Characteristics of Qualitative Research

Researchers explore the “meaning individuals or groups ascribe to social or human problems” (Creswell & Creswell, 2018, p. 3). Questions and procedures emerge rather than being prescribed. Complexity, nuance, and individual meaning are valued. Research is both inductive and deductive. Data sources are multiple and varied, e.g. interviews, observations, documents, photographs, etc. The researcher is a key instrument and must be reflective about how their background, culture, and experiences influence the research.

Examples: open question interviews and surveys, focus groups, case studies, grounded theory, ethnography, discourse analysis, narrative, phenomenology, participatory action research.

Calfee, R. C. & Chambliss, M. (2005). The design of empirical research. In J. Flood, D. Lapp, J. R. Squire, & J. Jensen (Eds.),  Methods of research on teaching the English language arts: The methodology chapters from the handbook of research on teaching the English language arts (pp. 43-78). Routledge.  http://ezproxy.memphis.edu/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=nlebk&AN=125955&site=eds-live&scope=site .

Creswell, J. W., & Creswell, J. D. (2018).  Research design: Qualitative, quantitative, and mixed methods approaches  (5th ed.). Thousand Oaks: Sage.

How to... conduct empirical research . (n.d.). Emerald Publishing.  https://www.emeraldgrouppublishing.com/how-to/research-methods/conduct-empirical-research .

Scribbr. (2019). Quantitative vs. qualitative: The differences explained  [video]. YouTube.  https://www.youtube.com/watch?v=a-XtVF7Bofg .

Ruane, J. M. (2016).  Introducing social research methods : Essentials for getting the edge . Wiley-Blackwell.  http://ezproxy.memphis.edu/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=nlebk&AN=1107215&site=eds-live&scope=site .  



Chapter 21: Qualitative evidence

Jane Noyes, Andrew Booth, Margaret Cargo, Kate Flemming, Angela Harden, Janet Harris, Ruth Garside, Karin Hannes, Tomás Pantoja, James Thomas

Key Points:

  • A qualitative evidence synthesis (commonly referred to as QES) can add value by providing decision makers with additional evidence to improve understanding of intervention complexity, contextual variations, implementation, and stakeholder preferences and experiences.
  • A qualitative evidence synthesis can be undertaken and integrated with a corresponding intervention review, or undertaken using a mixed-method design that integrates a qualitative evidence synthesis with an intervention review in a single protocol.
  • Methods for qualitative evidence synthesis are complex and continue to develop. Authors should always consult current methods guidance at methods.cochrane.org/qi .

This chapter should be cited as: Noyes J, Booth A, Cargo M, Flemming K, Harden A, Harris J, Garside R, Hannes K, Pantoja T, Thomas J. Chapter 21: Qualitative evidence. In: Higgins JPT, Thomas J, Chandler J, Cumpston M, Li T, Page MJ, Welch VA (editors). Cochrane Handbook for Systematic Reviews of Interventions version 6.4 (updated August 2023). Cochrane, 2023. Available from www.training.cochrane.org/handbook .

21.1 Introduction

The potential contribution of qualitative evidence to decision making is well-established (Glenton et al 2016, Booth 2017, Carroll 2017). A synthesis of qualitative evidence can inform understanding of how interventions work by:

  • increasing understanding of a phenomenon of interest (e.g. women’s conceptualization of what good antenatal care looks like);
  • identifying associations between the broader environment within which people live and the interventions that are implemented;
  • increasing understanding of the values and attitudes toward, and experiences of, health conditions and interventions by those who implement or receive them; and
  • providing a detailed understanding of the complexity of interventions and implementation, and their impacts and effects on different subgroups of people and the influence of individual and contextual characteristics within different contexts.

The aim of this chapter is to provide authors (who already have experience of undertaking qualitative research and qualitative evidence synthesis) with additional guidance on undertaking a qualitative evidence synthesis that is subsequently integrated with an intervention review. This chapter draws upon guidance presented in a series of six papers published in the Journal of Clinical Epidemiology (Cargo et al 2018, Flemming et al 2018, Harden et al 2018, Harris et al 2018, Noyes et al 2018a, Noyes et al 2018b) and from a further World Health Organization series of papers published in BMJ Global Health, which extend guidance to qualitative evidence syntheses conducted within a complex intervention and health systems and decision making context (Booth et al 2019a, Booth et al 2019b, Flemming et al 2019, Noyes et al 2019, Petticrew et al 2019). The qualitative evidence synthesis and integration methods described in this chapter supplement Chapter 17 on methods for addressing intervention complexity. Authors undertaking qualitative evidence syntheses should consult these papers and chapters for more detailed guidance.

21.2 Designs for synthesizing and integrating qualitative evidence with intervention reviews

There are two main designs for synthesizing qualitative evidence with evidence of the effects of interventions:

  • Sequential reviews: where one or more existing intervention review(s) has been published on a similar topic, it is possible to do a sequential qualitative evidence synthesis and then integrate its findings with those of the intervention review to create a mixed-method review. For example, Lewin and colleagues (Lewin et al 2010) and Glenton and colleagues (Glenton et al 2013) undertook sequential reviews of lay health worker programmes using separate protocols and then integrated the findings.
  • Convergent mixed-methods review: where no pre-existing intervention review exists, it is possible to do a full convergent ‘mixed-methods’ review where the trials and qualitative evidence are synthesized separately, creating opportunities for them to ‘speak’ to each other during development, and then integrated within a third synthesis. For example, Hurley and colleagues (Hurley et al 2018) undertook an intervention review and a qualitative evidence synthesis following a single protocol.

It is increasingly common for sequential and convergent reviews to be conducted by some or all of the same authors; if not, it is critical that authors working on the qualitative evidence synthesis and intervention review work closely together to identify and create sufficient points of integration to enable a third synthesis that integrates the two reviews, or the conduct of a mixed-method review (Noyes et al 2018a) (see Figure 21.2.a ). This consideration also applies where an intervention review has already been published and there is no prior relationship with the qualitative evidence synthesis authors. We recommend that at least one joint author works across both reviews to facilitate development of the qualitative evidence synthesis protocol, conduct of the synthesis, and subsequent integration of the qualitative evidence synthesis with the intervention review within a mixed-methods review.

Figure 21.2.a Considering context and points of contextual integration with the intervention review or within a mixed-method review


21.3 Defining qualitative evidence and studies

We use the term ‘qualitative evidence synthesis’ to acknowledge that other types of qualitative evidence (or data) can potentially enrich a synthesis, such as narrative data derived from qualitative components of mixed-method studies or free text from questionnaire surveys. We would not, however, consider a questionnaire survey to be a qualitative study, and qualitative data from questionnaires should not usually be privileged over relevant evidence from qualitative studies. When thinking about qualitative evidence, specific terminology is used to describe the level of conceptual and contextual detail. Qualitative evidence that includes higher or lower levels of conceptual detail is described as ‘rich’ or ‘poor’. The associated terms ‘thick’ or ‘thin’ are best used to refer to higher or lower levels of contextual detail. Review authors can potentially develop a stronger synthesis using rich and thick qualitative evidence but, in reality, they will identify diverse conceptually rich and poor and contextually thick and thin studies. Developing a clear picture of the type and conceptual richness of available qualitative evidence strongly influences the choice of methodology and subsequent methods. We recommend that authors undertake scoping searches to determine the type and richness of available qualitative evidence before selecting their methodology and methods.

A qualitative study is a research study that uses a qualitative method of data collection and analysis. Review authors should include the studies that enable them to answer their review question. When selecting qualitative studies in a review about intervention effects, two types of qualitative study are available: those that collect data from the same participants as the included trials, known as ‘trial siblings’; and those that address relevant issues about the intervention, but as separate items of research – not connected to any included trials. Both can provide useful information, with trial sibling studies obviously closer in terms of their precise contexts to the included trials (Moore et al 2015), and non-sibling studies possibly contributing perspectives not present in the trials (Noyes et al 2016b).

21.4 Planning a qualitative evidence synthesis linked to an intervention review

The Cochrane Qualitative and Implementation Methods Group (QIMG) website provides links to practical guidance and key steps for authors who are considering a qualitative evidence synthesis ( methods.cochrane.org/qi ). The RETREAT framework outlines seven key considerations that review authors should systematically work through when planning a review (Booth et al 2016, Booth et al 2018) ( Box 21.4.a ). Flemming and colleagues (Flemming et al 2019) further explain how to factor in such considerations when undertaking a qualitative evidence synthesis within a complex intervention and decision making context when complexity is an important consideration.

Box 21.4.a RETREAT considerations when selecting an appropriate method for qualitative synthesis

  • Review question: first, consider the complexity of the review question. Which elements contribute most to complexity (e.g. the condition, the intervention or the context)? Which elements should be prioritized as the focal point for attention? (Squires et al 2013, Kelly et al 2017).
  • Epistemology: consider the philosophical foundations of the primary studies. Would it be appropriate to favour a method such as thematic synthesis that is less reliant on epistemological considerations? (Barnett-Page and Thomas 2009).
  • Time frame: consider what type of qualitative evidence synthesis will be feasible and manageable within the time frame available (Booth et al 2016).
  • Resources: consider whether the ambition of the review matches the available resources. Will the extent of the scope and the sampling approach of the review need to be limited? (Benoot et al 2016, Booth et al 2016).
  • Expertise: consider access to expertise, both within the review team and among a wider group of advisors. Does the available expertise match the qualitative evidence synthesis approach chosen? (Booth et al 2016).
  • Audience and purpose: consider the intended audience and purpose of the review. Does the approach to question formulation, the scope of the review and the intended outputs meet their needs? (Booth et al 2016).
  • Type of data: consider the type of data present in typical studies for inclusion. To what extent are candidate studies conceptually rich and contextually thick in their detail?

21.5 Question development

The review question is critical to development of the qualitative evidence synthesis (Harris et al 2018). Question development affords a key point for integration with the intervention review. Complementary guidance supports novel thinking about question development, application of question development frameworks and the types of questions to be addressed by a synthesis of qualitative evidence (Cargo et al 2018, Harris et al 2018, Noyes et al 2018a, Booth et al 2019b, Flemming et al 2019).

Research questions for quantitative reviews are often mapped using structures such as PICO. Some qualitative reviews adopt this structure, or use an adapted variation of such a structure (e.g. SPICE (Setting, Perspective, Intervention or Phenomenon of Interest, Comparison, Evaluation) or SPIDER (Sample, Phenomenon of Interest, Design, Evaluation, Research type)) (Cooke et al 2012). Booth and colleagues (Booth et al 2019b) propose an extended question framework (PerSPecTIF) to describe both wider context and immediate setting that is particularly suited to qualitative evidence synthesis and complex intervention reviews (see Table 21.5.a ).

Detailed attention to the question and specification of context at an early stage is critical to many aspects of qualitative synthesis (see Petticrew et al (2019) and Booth et al (2019a) for a more detailed discussion). By specifying the context a review team is able to identify opportunities for integration with the intervention review, or opportunities for maximizing use and interpretation of evidence as a mixed-method review progresses (see Figure 21.2.a ), and informs both the interpretation of the observed effects and assessment of the strength of the evidence available in addressing the review question (Noyes et al 2019). Subsequent application of GRADE CERQual (Lewin et al 2015, Lewin et al 2018), an approach to assess the confidence in synthesized qualitative findings, requires further specification of context in the review question.

Table 21.5.a PerSPecTIF question formulation framework for qualitative evidence syntheses (Booth et al 2019b). Reproduced with permission of BMJ Publishing Group

  • Perspective: From the perspective of a pregnant woman
  • Setting: In the setting of rural communities
  • Phenomenon of interest/Problem: How does facility-based care
  • Environment: Within an environment of poor transport infrastructure and distantly located facilities
  • Comparison (optional): Compare with traditional birth attendants at home
  • Time/Timing: Up to and including delivery
  • Findings: In relation to the woman’s perceptions and experiences?

21.6 Questions exploring intervention implementation

Additional guidance is available on formulation of questions to understand and assess intervention implementation (Cargo et al 2018). A strong understanding of how an intervention is thought to work, and how it should be implemented in practice, will enable a critical consideration of whether any observed lack of effect might be due to a poorly conceptualized intervention (i.e. theory failure) or a poor intervention implementation (i.e. implementation failure). Heterogeneity needs to be considered for both the underlying theory and the ways in which the intervention was implemented. An a priori scoping review (Levac et al 2010), concept analysis (Walker and Avant 2005), critical review (Grant and Booth 2009) or textual narrative synthesis (Barnett-Page and Thomas 2009) can be undertaken to classify interventions and/or to identify the programme theory, logic model or implementation measures and processes. The intervention Complexity Assessment Tool for Systematic Reviews iCAT_SR (Lewin et al 2017) may be helpful in classifying complexity in interventions and developing associated questions.

An existing intervention model or framework may be used within a new topic or context. The ‘best-fit framework’ approach to synthesis (Carroll et al 2013) can be used to establish the degree to which the source context (from where the framework was derived) resembles the new target context (see Figure 21.2.a ). In the absence of an explicit programme theory and detail of how implementation relates to outcomes, an a priori realist review, meta-ethnography or meta-interpretive review can be undertaken (Booth et al 2016). For example, Downe and colleagues (Downe et al 2016) undertook an initial meta-ethnography review to develop an understanding of the outcomes of importance to women receiving antenatal care.

However, these additional activities are very resource-intensive and are only recommended when the review team has sufficient resources to supplement the planned qualitative evidence syntheses with an additional explanatory review. Where resources are less plentiful a review team could engage with key stakeholders to articulate and develop programme theory (Kelly et al 2017, De Buck et al 2018).

21.6.1 Using logic models and theories to support question development

Review authors can develop a more comprehensive representation of question features through use of logic models, programme theories, theories of change, templates and pathways (Anderson et al 2011, Kneale et al 2015, Noyes et al 2016a) (see also Chapter 17, Section 17.2.1  and Chapter 2, Section 2.5.1 ). These different forms of social theory can be used to visualize and map the research question, its context, components, influential factors and possible outcomes (Noyes et al 2016a, Rehfuess et al 2018).

21.6.2 Stakeholder engagement

Finally, review authors need to engage stakeholders, including consumers affected by the health issue and interventions, or likely users of the review from clinical or policy contexts. From the preparatory stage, this consultation can ensure that the review scope and question is appropriate and resulting products address implementation concerns of decision makers (Kelly et al 2017, Harris et al 2018).

21.7 Searching for qualitative evidence

In comparison with identification of quantitative studies (see also Chapter 4), procedures for retrieval of qualitative research remain relatively under-developed. Particular challenges in retrieval are associated with non-informative titles and abstracts, diffuse terminology, poor indexing and the overwhelming prevalence of quantitative studies within data sources (Booth et al 2016).

Principal considerations when planning a search for qualitative studies, and the evidence that underpins them, have been characterized using a 7S framework from Sampling and Sources through Structured questions, Search procedures, Strategies and filters and Supplementary strategies to Standards for Reporting (Booth et al 2016).

A key decision, aligned to the purpose of the qualitative evidence synthesis is whether to use the comprehensive, exhaustive approaches that characterize quantitative searches or whether to use purposive sampling that is more sensitive to the qualitative paradigm (Suri 2011). The latter, which is used when the intent is to generate an interpretative understanding, for example, when generating theory, draws upon a versatile toolkit that includes theoretical sampling, maximum variation sampling and intensity sampling. Sources of qualitative evidence are more likely to include book chapters, theses and grey literature reports than standard quantitative study reports, and so a search strategy should place extra emphasis on these sources. Local databases may be particularly valuable given the criticality of context (Stansfield et al 2012).

Another key decision is whether to use study filters or simply to conduct a topic-based search where qualitative studies are identified at the study selection stage. Search filters for qualitative studies lack the specificity of their quantitative counterparts. Nevertheless, filters may facilitate efficient retrieval by study type (e.g. qualitative (Rogers et al 2018) or mixed methods (El Sherif et al 2016) or by perspective (e.g. patient preferences (Selva et al 2017)) particularly where the quantitative literature is overwhelmingly large and thus increases the number needed to retrieve. Poor indexing of qualitative studies makes citation searching (forward and backward) and the Related Articles features of electronic databases particularly useful (Cooper et al 2017). Further guidance on searching for qualitative evidence is available (Booth et al 2016, Noyes et al 2018a). The CLUSTER method has been proposed as a specific named method for tracking down associated or sibling reports (Booth et al 2013). The BeHEMoTh approach has been developed for identifying explicit use of theory (Booth and Carroll 2015).

21.7.1 Searching for process evaluations and implementation evidence

Four potential approaches are available to identify process evaluations.

  • Identify studies at the point of study selection rather than through tailored search strategies. This involves conducting a sensitive topic search without any study design filter (Harden et al 1999), and identifying all study designs of interest during the screening process. This approach can be feasible when a review question involves multiple publication types (e.g. randomized trial, qualitative research and economic evaluations), which then do not require separate searches.  
  • Restrict included process evaluations to those conducted within randomized trials, which can be identified using standard search filters (see Chapter 4, Section 4.4.7 ). This method relies on reports of process evaluations also describing the surrounding randomized trial in enough detail to be identified by the search filter.  
  • Use unevaluated filter terms (such as ‘process evaluation’, ‘program(me) evaluation’, ‘feasibility study’, ‘implementation’ or ‘proof of concept’, etc.) to retrieve process evaluations or implementation data. Approaches using strings of terms associated with the study type or purpose are considered experimental, and there is a need to develop and test such filters. It is likely that such filters may be derived from the study type (process evaluation), the data type (process data) or the application (implementation) (Robbins et al 2011). A minimal keyword-screening sketch follows this list.
  • Minimize reliance on topic-based searching and rely on citations-based approaches to identify linked reports, published or unpublished, of a particular study (Booth et al 2013) which may provide implementation or process data (Bonell et al 2013).
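The third approach above can also be operationalized at the screening stage rather than in the database search itself. As a minimal sketch, and only as an illustration, the code below flags candidate process evaluations by matching a list of such terms against titles and abstracts; the term list, records and matching logic are assumptions made for the example and do not constitute a validated search filter.

```python
# Illustrative only: flag records whose title or abstract mentions common
# process-evaluation terms. The term list and the records are hypothetical
# examples, not a validated study-design filter.
FILTER_TERMS = [
    "process evaluation",
    "programme evaluation",
    "program evaluation",
    "feasibility study",
    "implementation",
    "proof of concept",
]

records = [
    {"id": 1, "title": "A cluster RCT of a school feeding intervention",
     "abstract": "We report trial outcomes and an embedded process evaluation."},
    {"id": 2, "title": "Effect of text reminders on vaccination uptake",
     "abstract": "A pragmatic trial analysed by intention to treat."},
]

def mentions_filter_term(record: dict) -> bool:
    """Return True if any filter term appears in the title or abstract."""
    text = (record["title"] + " " + record["abstract"]).lower()
    return any(term in text for term in FILTER_TERMS)

candidates = [r["id"] for r in records if mentions_filter_term(r)]
print(candidates)  # [1] - only record 1 mentions a process-evaluation term
```

Because process evaluations are often poorly indexed and inconsistently labelled, such term matching would supplement, not replace, human screening.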

More detailed guidance is provided by Cargo and colleagues (Cargo et al 2018).

21.8 Assessing methodological strengths and limitations of qualitative studies

Assessment of the methodological strengths and limitations of qualitative research remains contested within the primary qualitative research community (Garside 2014). However, within systematic reviews and evidence syntheses it is considered essential, even when studies are not to be excluded on the basis of quality (Carroll et al 2013). One review found almost 100 appraisal tools for assessing primary qualitative studies (Munthe-Kaas et al 2019). Limitations included a focus on reporting rather than conduct and the presence of items that are separate from, or tangential to, consideration of study quality (e.g. ethical approval).

Authors should distinguish between assessment of study quality and assessment of risk of bias by focusing on assessment of methodological strengths and limitations as a marker of study rigour (what we term a ‘risk to rigour’ approach (Noyes et al 2019)). In the absence of a definitive risk to rigour tool, we recommend that review authors select from published, commonly used and validated tools that focus on the assessment of the methodological strengths and limitations of qualitative studies (see Box 21.8.a). Pragmatically, we consider a ‘validated’ tool as one that has been subjected to evaluation. Issues such as inter-rater reliability are afforded less importance given that identification of complementary or conflicting perspectives on risk to rigour is considered more useful than achievement of consensus per se (Noyes et al 2019).

The CASP tool for qualitative research (as one example) maps onto the domains in Box 21.8.a (CASP 2013). Tools not meeting the criterion of focusing on assessment of methodological strengths and limitations include those that integrate assessment of the quality of reporting (such as scoring of the title and abstract, etc) into an overall assessment of methodological strengths and limitations. As with other risk of bias assessment tools, we strongly recommend against the application of scores to domains or calculation of total quality scores. We encourage review authors to discuss the studies and their assessments of ‘risk to rigour’ for each paper and how the study’s methodological limitations may affect review findings (Noyes et al 2019). We further advise that qualitative ‘sensitivity analysis’, exploring the robustness of the synthesis and its vulnerability to methodologically limited studies, be routinely applied regardless of the review authors’ overall confidence in synthesized findings (Carroll et al 2013). Evidence suggests that qualitative sensitivity analysis is equally advisable for mixed methods studies from which the qualitative component is extracted (Verhage and Boels 2017).

Box 21.8.a Example domains that provide an assessment of methodological strengths and limitations to determine study rigour

  • Clear aims and research question
  • Congruence between the research aims/question and research design/method(s)
  • Rigour of case and/or participant identification, sampling and data collection to address the question
  • Appropriate application of the method
  • Richness/conceptual depth of findings
  • Exploration of deviant cases and alternative explanations
  • Reflexivity of the researchers*

*Reflexivity encourages qualitative researchers and reviewers to consider the actual and potential impacts of the researcher on the context, research participants and the interpretation and reporting of data and findings (Newton et al 2012). Being reflexive entails making conflicts of interest transparent, discussing the impact of the reviewers and their decisions on the review process and findings and making transparent any issues discussed and subsequent decisions.

Adapted from Noyes et al (2019) and Alvesson and Sköldberg (2009)

21.8.1 Additional assessment of methodological strengths and limitations of process evaluation and intervention implementation evidence

Few assessment tools explicitly address rigour in process evaluation or implementation evidence. For qualitative primary studies, the 8-item process evaluation tool developed by the EPPI-Centre (Rees et al 2009, Shepherd et al 2010) can be used to supplement tools selected to assess methodological strengths and limitations and risks to rigour in primary qualitative studies. One of these items, a question on usefulness (framed as ‘how well the intervention processes were described and whether or not the process data could illuminate why or how the interventions worked or did not work’), offers a mechanism for exploring process mechanisms (Cargo et al 2018).

21.9 Selecting studies to synthesize

Decisions about inclusion or exclusion of studies can be more complex in qualitative evidence syntheses compared to reviews of trials that aim to include all relevant studies. Decisions on whether to include all studies or to select a sample of studies depend on a range of general and review-specific criteria that Noyes and colleagues (Noyes et al 2019) outline in detail. The number of qualitative studies selected needs to be consistent with a manageable synthesis, and the contexts of the included studies should enable integration with the trials in the effectiveness analysis (see Figure 21.2.a). The guiding principle is transparency in the reporting of all decisions and their rationale.

21.10 Selecting a qualitative evidence synthesis and data extraction method

Authors will typically find that they cannot select an appropriate synthesis method until the pool of available qualitative evidence has been thoroughly scoped. Flexible options concerning choice of method may need to be articulated in the protocol.

The INTEGRATE-HTA guidance on selecting methodology and methods for qualitative evidence synthesis and health technology assessment offers a useful starting point when selecting a method of synthesis (Booth et al 2016, Booth et al 2018). Some methods are designed primarily to develop findings at a descriptive level and thus directly feed into lines of action for policy and practice. Others hold the capacity to develop new theory (e.g. meta-ethnography and theory-building approaches to thematic synthesis). Noyes and colleagues (Noyes et al 2019) and Flemming and colleagues (Flemming et al 2019) elaborate on key issues for consideration when selecting a method that is particularly suited to a Cochrane Review and decision-making context (see Table 21.10.a). Three qualitative evidence synthesis methods (thematic synthesis, framework synthesis and meta-ethnography) are recommended to produce syntheses that can subsequently be integrated with an intervention review or analysis.

Table 21.10.a Recommended methods for undertaking a qualitative evidence synthesis for subsequent integration with an intervention review, or as part of a mixed-method review (adapted from an original source developed by the convenors (Flemming et al 2019, Noyes et al 2019))

Thematic synthesis (Thomas and Harden 2008)
Strengths: Most accessible form of synthesis. Clear approach; can be used with ‘thin’ data to produce descriptive themes and with ‘thicker’ data to develop descriptive themes into more in-depth analytic themes. Themes are then integrated within the quantitative synthesis.
Limitations: May be limited in interpretive ‘power’ and risks over-simplistic use, and thus not truly informing decision making such as guidelines. Complex synthesis process that requires an experienced team. Theoretical findings may combine empirical evidence, expert opinion and conjecture to form hypotheses. More work is needed on how GRADE-CERQual (used to assess confidence in synthesized qualitative findings; see Section 21.12) can be applied to theoretical findings. May lack clarity on how higher-level findings translate into actionable points.

Framework synthesis (Oliver et al 2008, Dixon-Woods 2011) and best-fit framework synthesis (Carroll et al 2011)
Strengths: Works well within reviews of complex interventions by accommodating complexity within the framework, including representation of theory. The framework allows a clear mechanism for integration of qualitative and quantitative evidence in an aggregative way (see Noyes et al 2018a). Works well where there is broad agreement about the nature of interventions and their desired impacts.
Limitations: Requires identification, selection and justification of a framework. A framework may be revealed as inappropriate only once extraction/synthesis is underway. Risk of simplistically forcing data into a framework for expedience.

Meta-ethnography (Noblit and Hare 1988)
Strengths: Primarily interpretive synthesis method leading to creation of descriptive as well as new higher-order constructs. Descriptive and theoretical findings can help inform decision making such as guidelines. Explicit reporting standards have been developed.
Limitations: Complex methodology and synthesis process that requires a highly experienced team. Can take more time and resources than other methodologies. Theoretical findings may combine empirical evidence, expert opinion and conjecture to form hypotheses. May not satisfy requirements for an audit trail (although new reporting guidelines will help overcome this (France et al 2019)). More work is needed to determine how CERQual can be applied to theoretical findings. May be unclear how higher-level findings translate into actionable points.

21.11 Data extraction

Qualitative findings may take the form of quotations from participants, subthemes and themes identified by the study’s authors, explanations, hypotheses or new theory, or observational excerpts and author interpretations of these data (Sandelowski and Barroso 2002). Findings may be presented as a narrative, or summarized and displayed as tables, infographics or logic models and potentially located in any part of the paper (Noyes et al 2019).

Methods for qualitative data extraction vary according to the synthesis method selected. Data extraction is not sequential and linear; often, it involves moving backwards and forwards between review stages. Review teams will need regular meetings to discuss and further interrogate the evidence and thereby achieve a shared understanding. It may be helpful to draw on a key stakeholder group to help in interpreting the evidence and in formulating key findings. Additional approaches (such as subgroup analysis) can be used to explore evidence from specific contexts further.

Irrespective of the review type and choice of synthesis method, we consider it best practice to extract detailed contextual and methodological information on each study and to report this information in a table of ‘Characteristics of included studies’ (see Table 21.11.a). The Template for Intervention Description and Replication (TIDieR) checklist (Hoffmann et al 2014) and the iCAT_SR tool may help with specifying key information for extraction (Lewin et al 2017). Review authors must ensure that they preserve the context of the primary study data during the extraction and synthesis process to prevent misinterpretation of primary studies (Noyes et al 2019).

Table 21.11.a Contextual and methodological information for inclusion within a table of ‘Characteristics of included studies’. From Noyes et al (2019). Reproduced with permission of BMJ Publishing Group

Context and participants: Important elements of study context relevant to addressing the review question and locating the context of the primary study; for example, the study setting, population characteristics, participants and participant characteristics, the intervention delivered (if appropriate), etc.

Study design and methods used: Methodological design and approach taken by the study; methods for identifying and recruiting the sample; the specific data collection and analysis methods utilized; and any theoretical models used to interpret or contextualize the findings.

Noyes and colleagues (Noyes et al 2019) provide additional guidance and examples of the various methods of data extraction. It is usual for review authors to select one method. In summary, extraction methods can be grouped as follows.

  • Using a bespoke universal, standardized or adapted data extraction template. Review authors can develop their own review-specific data extraction template, or select a generic data extraction template by study type (e.g. templates developed by the National Institute for Health and Clinical Excellence (National Institute for Health Care Excellence 2012)). A minimal illustrative sketch of one possible review-specific extraction record follows this list.
  • Using an a priori theory or predetermined framework to extract data. Framework synthesis, and its subvariant ‘Best Fit’ Framework approach, involve extracting data from primary studies against an a priori framework in order to better understand a phenomenon of interest (Carroll et al 2011, Carroll et al 2013). For example, Glenton and colleagues (Glenton et al 2013) extracted data against a modified SURE Framework (2011) to synthesize factors affecting the implementation of lay health worker interventions. The SURE framework enumerates possible factors that may influence the implementation of health system interventions (SURE (Supporting the Use of Research Evidence) Collaboration 2011, Glenton et al 2013). Use of the ‘PROGRESS’ (place of residence, race/ethnicity/culture/language, occupation, gender/sex, religion, education, socioeconomic status, and social capital) framework also helps to ensure that data extraction maintains an explicit equity focus (O'Neill et al 2014). A logic model can also be used as a framework for data extraction.
  • Using a software program to code original studies inductively. A wide range of software products have been developed by systematic review organizations (such as EPPI-Reviewer (Thomas et al 2010)). Most software for the analysis of primary qualitative data – such as NVivo ( www.qsrinternational.com/nvivo/home ) and others – can be used to code studies in a systematic review (Houghton et al 2017). For example, one method of data extraction and thematic synthesis involves coding the original studies using a software program to build inductive descriptive themes and a theoretical explanation of phenomena of interest (Thomas and Harden 2008). Thomas and Harden (2008) provide a worked example to demonstrate coding and developing a new understanding of children’s choices and motivations for eating fruit and vegetables from included primary studies.
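As a minimal sketch of the first option in the list above, a review-specific extraction record can be represented as a simple structured object. The field names below are illustrative assumptions loosely based on the ‘Characteristics of included studies’ information in Table 21.11.a, not a prescribed Cochrane template, and the study details are invented.

```python
from dataclasses import dataclass, field

# Illustrative sketch of a bespoke extraction record for a qualitative
# evidence synthesis. Field names are assumptions loosely based on the
# 'Characteristics of included studies' fields in Table 21.11.a.
@dataclass
class ExtractionRecord:
    study_id: str
    setting: str                  # study context, e.g. country and care setting
    participants: str             # population and participant characteristics
    intervention: str             # intervention delivered, if applicable
    design_and_methods: str       # design, sampling, data collection and analysis
    theoretical_model: str        # any theory used to interpret the findings
    findings: list = field(default_factory=list)  # extracted findings or themes

# Invented example entry.
record = ExtractionRecord(
    study_id="Example 2016",
    setting="Rural district hospitals (hypothetical)",
    participants="Postnatal women and midwives",
    intervention="Facility-based delivery supported by lay health workers",
    design_and_methods="Semi-structured interviews; thematic analysis",
    theoretical_model="None stated",
    findings=["Transport costs shaped where women chose to give birth"],
)
print(record.study_id, "-", len(record.findings), "finding(s) extracted")
```

Whatever form the template takes, the guidance above still applies: enough contextual detail must be captured to preserve the context of the primary study data and prevent misinterpretation during synthesis.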

21.12 Assessing the confidence in qualitative synthesized findings

The GRADE system has long featured in assessing the certainty of quantitative findings, and application of its qualitative counterpart, GRADE-CERQual, is recommended for Cochrane qualitative evidence syntheses (Lewin et al 2015). CERQual has four components (relevance, methodological limitations, adequacy and coherence) which are used to formulate an overall assessment of confidence in the synthesized qualitative finding. Guidance on its components and reporting requirements has been published in a series in Implementation Science (Lewin et al 2018).

21.13 Methods for integrating the qualitative evidence synthesis with an intervention review

A range of methods and tools is available for data integration or mixed-method synthesis (Harden et al 2018, Noyes et al 2019). As noted at the beginning of this chapter, review authors can integrate a qualitative evidence synthesis with an existing intervention review published on a similar topic (sequential approach), or conduct a new intervention review and qualitative evidence syntheses in parallel before integration (convergent approach). Irrespective of whether the qualitative synthesis is sequential or convergent to the intervention review, we recommend that qualitative and quantitative evidence be synthesized separately using appropriate methods before integration (Harden et al 2018). The scope for integration can be more limited with a pre-existing intervention review unless review authors have access to the data underlying the intervention review report.

Harden and colleagues and Noyes and colleagues outline the following methods and tools for integration with an intervention review (Harden et al 2018, Noyes et al 2019):

  • Juxtaposing findings in a matrix. Juxtaposition is driven by the findings from the qualitative evidence synthesis (e.g. intervention components related to the acceptability or feasibility of the interventions) and these findings form one side of the matrix. Findings on intervention effects (e.g. improves outcome, no difference in outcome, uncertain effects) form the other side of the matrix. Quantitative studies are grouped according to findings on intervention effects and the presence or absence of features specified by the hypotheses generated from the qualitative synthesis (Candy et al 2011). Observed patterns in the matrix are used to explain differences in the findings of the quantitative studies and to identify gaps in research (van Grootel et al 2017). (See, for example, Ames et al 2017, Munabi-Babigumira et al 2017, Hurley et al 2018.) A minimal hypothetical sketch of such a matrix follows this list.
  • Analysing programme theory. Theories articulating how interventions are expected to work are analysed. Findings from quantitative studies, testing the effects of interventions, and from qualitative and process evaluation evidence are used together to examine how the theories work in practice (Greenhalgh et al 2007). The value of different theories is assessed or new/revised theory developed. Factors that enhance or reduce intervention effectiveness are also identified.
  • Using logic models or other types of conceptual framework. A logic model (Glenton et al 2013) or other type of conceptual framework, which represents the processes by which an intervention produces change, provides a common scaffold for integrating findings across different types of evidence (Booth and Carroll 2015). Frameworks can be specified a priori from the literature or through stakeholder engagement, or newly developed during the review. Findings from quantitative studies testing the effects of interventions and those from qualitative evidence are used to develop and/or further refine the model.
  • Testing hypotheses derived from syntheses of qualitative evidence. Quantitative studies are grouped according to the presence or absence of the proposition specified by the hypotheses to be tested, and subgroup analysis is used to explore differential findings on the effects of interventions (Thomas et al 2004).
  • Qualitative comparative analysis (QCA). Findings from a qualitative synthesis are used to identify the range of features that are important for successful interventions, and the mechanisms through which these features operate. A QCA then tests whether or not the features are associated with effective interventions (Kahwati et al 2016). The analysis unpicks multiple potential pathways to effectiveness, accommodating scenarios where the same intervention feature is associated both with effective and less effective interventions, depending on context. QCA offers potential for use in integration; however, unlike the other methods and tools presented here, it does not yet have sufficient methodological guidance available. Exemplar reviews using QCA are available (Thomas et al 2014, Harris et al 2015, Kahwati et al 2016).
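To make the matrix approach in the first bullet above concrete, the sketch below cross-tabulates intervention features suggested by an imagined qualitative synthesis against effect categories from imagined trials. All study names, features and effect findings are invented purely for illustration.

```python
# Hypothetical illustration of juxtaposing qualitative findings against
# quantitative effect findings in a matrix. Studies, features and effect
# categories are invented for the example.
qualitative_hypotheses = ["involves peers", "tailored to context"]

trials = [
    {"study": "Trial A", "effect": "improves outcome",
     "features": {"involves peers", "tailored to context"}},
    {"study": "Trial B", "effect": "no difference",
     "features": {"tailored to context"}},
    {"study": "Trial C", "effect": "uncertain effects", "features": set()},
]

# Rows of the matrix are hypothesized features; columns are effect categories.
effects = ["improves outcome", "no difference", "uncertain effects"]
matrix = {h: {e: [] for e in effects} for h in qualitative_hypotheses}
for trial in trials:
    for hypothesis in qualitative_hypotheses:
        if hypothesis in trial["features"]:
            matrix[hypothesis][trial["effect"]].append(trial["study"])

for hypothesis, row in matrix.items():
    print(hypothesis, "->", row)
# If trials with a given feature cluster under 'improves outcome', that
# pattern becomes a hypothesis to explore further, not a proven mechanism.
```

Patterns observed in such a matrix can then be tested more formally, for example through the subgroup analysis or QCA approaches described above.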

Review authors can use the above methods in combination (e.g. patterns observed through juxtaposing findings within a matrix can be tested using subgroup analysis or QCA). Analysing programme theory, using logic models and QCA would require members of the review team with specific skills in these methods. Subgroup analysis and QCA are not suitable when only limited evidence is available (Harden et al 2018, Noyes et al 2019). (See also Chapter 17 on intervention complexity.)

21.14 Reporting the protocol and qualitative evidence synthesis

Reporting standards and tools designed for intervention reviews, such as Cochrane’s MECIR standards ( http://methods.cochrane.org/mecir ) or the PRISMA Statement (Liberati et al 2009), may not be appropriate for qualitative evidence syntheses or an integrated mixed-method review. Additional guidance on how to choose, adapt or create a hybrid reporting tool is provided as a 5-point ‘decision flowchart’ (Figure 21.14.a) (Flemming et al 2018). Review authors should consider whether a specific set of reporting guidance is available (e.g. eMERGe for meta-ethnographies (France et al 2015)); whether generic guidance (e.g. ENTREQ (Tong et al 2012)) is suitable; or whether additional checklists or tools are appropriate for reporting a specific aspect of the review.

Figure 21.14.a Decision flowchart for choice of reporting approach for syntheses of qualitative, implementation or process evaluation evidence (Flemming et al 2018). Reproduced with permission of Elsevier


21.15 Chapter information

Authors: Jane Noyes, Andrew Booth, Margaret Cargo, Kate Flemming, Angela Harden, Janet Harris, Ruth Garside, Karin Hannes, Tomás Pantoja, James Thomas

Acknowledgements: This chapter replaces Chapter 20 in the first edition of this Handbook (2008) and subsequent Version 5.2. We would like to thank the previous Chapter 20 authors Jennie Popay and Alan Pearson. Elements of this chapter draw on previous supplemental guidance produced by the Cochrane Qualitative and Implementation Methods Group Convenors, to which Simon Lewin contributed.

Funding: JT is supported by the National Institute for Health Research (NIHR) Collaboration for Leadership in Applied Health Research and Care North Thames at Barts Health NHS Trust. The views expressed are those of the author(s) and not necessarily those of the NHS, the NIHR or the Department of Health.

21.16 References

Ames HM, Glenton C, Lewin S. Parents' and informal caregivers' views and experiences of communication about routine childhood vaccination: a synthesis of qualitative evidence. Cochrane Database of Systematic Reviews 2017; 2 : CD011787.

Anderson LM, Petticrew M, Rehfuess E, Armstrong R, Ueffing E, Baker P, Francis D, Tugwell P. Using logic models to capture complexity in systematic reviews. Research Synthesis Methods 2011; 2 : 33-42.

Barnett-Page E, Thomas J. Methods for the synthesis of qualitative research: a critical review. BMC Medical Research Methodology 2009; 9 : 59.

Benoot C, Hannes K, Bilsen J. The use of purposeful sampling in a qualitative evidence synthesis: a worked example on sexual adjustment to a cancer trajectory. BMC Medical Research Methodology 2016; 16 : 21.

Bonell C, Jamal F, Harden A, Wells H, Parry W, Fletcher A, Petticrew M, Thomas J, Whitehead M, Campbell R, Murphy S, Moore L. Public Health Research. Systematic review of the effects of schools and school environment interventions on health: evidence mapping and synthesis . Southampton (UK): NIHR Journals Library; 2013.

Booth A, Harris J, Croot E, Springett J, Campbell F, Wilkins E. Towards a methodology for cluster searching to provide conceptual and contextual "richness" for systematic reviews of complex interventions: case study (CLUSTER). BMC Medical Research Methodology 2013; 13 : 118.

Booth A, Carroll C. How to build up the actionable knowledge base: the role of 'best fit' framework synthesis for studies of improvement in healthcare. BMJ Quality and Safety 2015; 24 : 700-708.

Booth A, Noyes J, Flemming K, Gerhardus A, Wahlster P, van der Wilt GJ, Mozygemba K, Refolo P, Sacchini D, Tummers M, Rehfuess E. Guidance on choosing qualitative evidence synthesis methods for use in health technology assessment for complex interventions 2016. https://www.integrate-hta.eu/wp-content/uploads/2016/02/Guidance-on-choosing-qualitative-evidence-synthesis-methods-for-use-in-HTA-of-complex-interventions.pdf

Booth A. Qualitative evidence synthesis. In: Facey K, editor. Patient involvement in Health Technology Assessment . Singapore: Springer; 2017. p. 187-199.

Booth A, Noyes J, Flemming K, Gehardus A, Wahlster P, Jan van der Wilt G, Mozygemba K, Refolo P, Sacchini D, Tummers M, Rehfuess E. Structured methodology review identified seven (RETREAT) criteria for selecting qualitative evidence synthesis approaches. Journal of Clinical Epidemiology 2018; 99 : 41-52.

Booth A, Moore G, Flemming K, Garside R, Rollins N, Tuncalp Ö, Noyes J. Taking account of context in systematic reviews and guidelines considering a complexity perspective. BMJ Global Health 2019a; 4 : e000840.

Booth A, Noyes J, Flemming K, Moore G, Tuncalp Ö, Shakibazadeh E. Formulating questions to address the acceptability and feasibility of complex interventions in qualitative evidence synthesis. BMJ Global Health 2019b; 4 : e001107.

Candy B, King M, Jones L, Oliver S. Using qualitative synthesis to explore heterogeneity of complex interventions. BMC Medical Research Methodology 2011; 11 : 124.

Cargo M, Harris J, Pantoja T, Booth A, Harden A, Hannes K, Thomas J, Flemming K, Garside R, Noyes J. Cochrane Qualitative and Implementation Methods Group guidance series-paper 4: methods for assessing evidence on intervention implementation. Journal of Clinical Epidemiology 2018; 97 : 59-69.

Carroll C, Booth A, Cooper K. A worked example of "best fit" framework synthesis: a systematic review of views concerning the taking of some potential chemopreventive agents. BMC Medical Research Methodology 2011; 11 : 29.

Carroll C, Booth A, Leaviss J, Rick J. "Best fit" framework synthesis: refining the method. BMC Medical Research Methodology 2013; 13 : 37.

Carroll C. Qualitative evidence synthesis to improve implementation of clinical guidelines. BMJ 2017; 356 : j80.

CASP. Making sense of evidence: 10 questions to help you make sense of qualitative research: Public Health Resource Unit, England; 2013. http://media.wix.com/ugd/dded87_29c5b002d99342f788c6ac670e49f274.pdf .

Cooke A, Smith D, Booth A. Beyond PICO: the SPIDER tool for qualitative evidence synthesis. Qualitative Health Research 2012; 22 : 1435-1443.

Cooper C, Booth A, Britten N, Garside R. A comparison of results of empirical studies of supplementary search techniques and recommendations in review methodology handbooks: a methodological review. Systematic Reviews 2017; 6 : 234.

De Buck E, Hannes K, Cargo M, Van Remoortel H, Vande Veegaete A, Mosler HJ, Govender T, Vandekerckhove P, Young T. Engagement of stakeholders in the development of a Theory of Change for handwashing and sanitation behaviour change. International Journal of Environmental Research and Public Health 2018; 28 : 8-22.

Dixon-Woods M. Using framework-based synthesis for conducting reviews of qualitative studies. BMC Medicine 2011; 9 : 39.

Downe S, Finlayson K, Tuncalp, Metin Gulmezoglu A. What matters to women: a systematic scoping review to identify the processes and outcomes of antenatal care provision that are important to healthy pregnant women. BJOG: An International Journal of Obstetrics and Gynaecology 2016; 123 : 529-539.

El Sherif R, Pluye P, Gore G, Granikov V, Hong QN. Performance of a mixed filter to identify relevant studies for mixed studies reviews. Journal of the Medical Library Association 2016; 104 : 47-51.

Flemming K, Booth A, Hannes K, Cargo M, Noyes J. Cochrane Qualitative and Implementation Methods Group guidance series-paper 6: reporting guidelines for qualitative, implementation, and process evaluation evidence syntheses. Journal of Clinical Epidemiology 2018; 97 : 79-85.

Flemming K, Booth A, Garside R, Tuncalp O, Noyes J. Qualitative evidence synthesis for complex interventions and guideline development: clarification of the purpose, designs and relevant methods. BMJ Global Health 2019; 4 : e000882.

France EF, Ring N, Noyes J, Maxwell M, Jepson R, Duncan E, Turley R, Jones D, Uny I. Protocol-developing meta-ethnography reporting guidelines (eMERGe). BMC Medical Research Methodology 2015; 15 : 103.

France EF, Cunningham M, Ring N, Uny I, Duncan EAS, Jepson RG, Maxwell M, Roberts RJ, Turley RL, Booth A, Britten N, Flemming K, Gallagher I, Garside R, Hannes K, Lewin S, Noblit G, Pope C, Thomas J, Vanstone M, Higginbottom GMA, Noyes J. Improving reporting of meta-ethnography: the eMERGe reporting guidance. BMC Medical Research Methodology 2019; 19 : 25.

Garside R. Should we appraise the quality of qualitative research reports for systematic reviews, and if so, how? Innovation: The European Journal of Social Science Research 2014; 27 : 67-79.

Glenton C, Colvin CJ, Carlsen B, Swartz A, Lewin S, Noyes J, Rashidian A. Barriers and facilitators to the implementation of lay health worker programmes to improve access to maternal and child health: qualitative evidence synthesis. Cochrane Database of Systematic Reviews 2013; 10 : CD010414.

Glenton C, Lewin S, Norris S. Chapter 15: Using evidence from qualitative research to develop WHO guidelines. In: Norris S, editor. World Health Organization Handbook for Guideline Development . 2nd. ed. Geneva: WHO; 2016.

Grant MJ, Booth A. A typology of reviews: an analysis of 14 review types and associated methodologies. Health Information and Libraries Journal 2009; 26 : 91-108.

Greenhalgh T, Kristjansson E, Robinson V. Realist review to understand the efficacy of school feeding programmes. BMJ 2007; 335 : 858.

Harden A, Oakley A, Weston R. A review of the effectiveness and appropriateness of peer-delivered health promotion for young people. London: Institute of Education, University of London; 1999.

Harden A, Thomas J, Cargo M, Harris J, Pantoja T, Flemming K, Booth A, Garside R, Hannes K, Noyes J. Cochrane Qualitative and Implementation Methods Group guidance series-paper 5: methods for integrating qualitative and implementation evidence within intervention effectiveness reviews. Journal of Clinical Epidemiology 2018; 97 : 70-78.

Harris JL, Booth A, Cargo M, Hannes K, Harden A, Flemming K, Garside R, Pantoja T, Thomas J, Noyes J. Cochrane Qualitative and Implementation Methods Group guidance series-paper 2: methods for question formulation, searching, and protocol development for qualitative evidence synthesis. Journal of Clinical Epidemiology 2018; 97 : 39-48.

Harris KM, Kneale D, Lasserson TJ, McDonald VM, Grigg J, Thomas J. School-based self management interventions for asthma in children and adolescents: a mixed methods systematic review (Protocol). Cochrane Database of Systematic Reviews 2015; 4 : CD011651.

Hoffmann TC, Glasziou PP, Boutron I, Milne R, Perera R, Moher D, Altman DG, Barbour V, Macdonald H, Johnston M, Lamb SE, Dixon-Woods M, McCulloch P, Wyatt JC, Chan AW, Michie S. Better reporting of interventions: template for intervention description and replication (TIDieR) checklist and guide. BMJ 2014; 348 : g1687.

Houghton C, Murphy K, Meehan B, Thomas J, Brooker D, Casey D. From screening to synthesis: using nvivo to enhance transparency in qualitative evidence synthesis. Journal of Clinical Nursing 2017; 26 : 873-881.

Hurley M, Dickson K, Hallett R, Grant R, Hauari H, Walsh N, Stansfield C, Oliver S. Exercise interventions and patient beliefs for people with hip, knee or hip and knee osteoarthritis: a mixed methods review. Cochrane Database of Systematic Reviews 2018; 4 : CD010842.

Kahwati L, Jacobs S, Kane H, Lewis M, Viswanathan M, Golin CE. Using qualitative comparative analysis in a systematic review of a complex intervention. Systematic Reviews 2016; 5 : 82.

Kelly MP, Noyes J, Kane RL, Chang C, Uhl S, Robinson KA, Springs S, Butler ME, Guise JM. AHRQ series on complex intervention systematic reviews-paper 2: defining complexity, formulating scope, and questions. Journal of Clinical Epidemiology 2017; 90 : 11-18.

Kneale D, Thomas J, Harris K. Developing and Optimising the Use of Logic Models in Systematic Reviews: Exploring Practice and Good Practice in the Use of Programme Theory in Reviews. PloS One 2015; 10 : e0142187.

Levac D, Colquhoun H, O'Brien KK. Scoping studies: advancing the methodology. Implementation Science 2010; 5 : 69.

Lewin S, Munabi-Babigumira S, Glenton C, Daniels K, Bosch-Capblanch X, van Wyk BE, Odgaard-Jensen J, Johansen M, Aja GN, Zwarenstein M, Scheel IB. Lay health workers in primary and community health care for maternal and child health and the management of infectious diseases. Cochrane Database of Systematic Reviews 2010; 3 : CD004015.

Lewin S, Glenton C, Munthe-Kaas H, Carlsen B, Colvin CJ, Gulmezoglu M, Noyes J, Booth A, Garside R, Rashidian A. Using qualitative evidence in decision making for health and social interventions: an approach to assess confidence in findings from qualitative evidence syntheses (GRADE-CERQual). PLoS Medicine 2015; 12 : e1001895.

Lewin S, Hendry M, Chandler J, Oxman AD, Michie S, Shepperd S, Reeves BC, Tugwell P, Hannes K, Rehfuess EA, Welch V, McKenzie JE, Burford B, Petkovic J, Anderson LM, Harris J, Noyes J. Assessing the complexity of interventions within systematic reviews: development, content and use of a new tool (iCAT_SR). BMC Medical Research Methodology 2017; 17 : 76.

Lewin S, Booth A, Glenton C, Munthe-Kaas H, Rashidian A, Wainwright M, Bohren MA, Tuncalp O, Colvin CJ, Garside R, Carlsen B, Langlois EV, Noyes J. Applying GRADE-CERQual to qualitative evidence synthesis findings: introduction to the series. Implementation Science 2018; 13 : 2.

Liberati A, Altman DG, Tetzlaff J, Mulrow C, Gøtzsche PC, Ioannidis JPA, Clarke M, Devereaux PJ, Kleijnen J, Moher D. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate healthcare interventions: explanation and elaboration. BMJ 2009; 339 : b2700.

Moore G, Audrey S, Barker M, Bond L, Bonell C, Harderman W, et al. Process evaluation of complex interventions: Medical Research Council guidance. BMJ 2015; 350 : h1258.

Munabi-Babigumira S, Glenton C, Lewin S, Fretheim A, Nabudere H. Factors that influence the provision of intrapartum and postnatal care by skilled birth attendants in low- and middle-income countries: a qualitative evidence synthesis. Cochrane Database of Systematic Reviews 2017; 11 : CD011558.

Munthe-Kaas H, Glenton C, Booth A, Noyes J, Lewin S. Systematic mapping of existing tools to appraise methodological strengths and limitations of qualitative research: first stage in the development of the CAMELOT tool. BMC Medical Research Methodology 2019; 19 : 113.

National Institute for Health Care Excellence. NICE Process and Methods Guides. Methods for the Development of NICE Public Health Guidance . London: National Institute for Health and Care Excellence (NICE); 2012.

Newton BJ, Rothlingova Z, Gutteridge R, LeMarchand K, Raphael JH. No room for reflexivity? Critical reflections following a systematic review of qualitative research. Journal of Health Psychology 2012; 17 : 866-885.

Noblit GW, Hare RD. Meta-ethnography: synthesizing qualitative studies . Newbury Park: Sage Publications, Inc; 1988.

Noyes J, Hendry M, Booth A, Chandler J, Lewin S, Glenton C, Garside R. Current use was established and Cochrane guidance on selection of social theories for systematic reviews of complex interventions was developed. Journal of Clinical Epidemiology 2016a; 75 : 78-92.

Noyes J, Hendry M, Lewin S, Glenton C, Chandler J, Rashidian A. Qualitative "trial-sibling" studies and "unrelated" qualitative studies contributed to complex intervention reviews. Journal of Clinical Epidemiology 2016b; 74 : 133-143.

Noyes J, Booth A, Flemming K, Garside R, Harden A, Lewin S, Pantoja T, Hannes K, Cargo M, Thomas J. Cochrane Qualitative and Implementation Methods Group guidance series-paper 3: methods for assessing methodological limitations, data extraction and synthesis, and confidence in synthesized qualitative findings. Journal of Clinical Epidemiology 2018a; 97 : 49-58.

Noyes J, Booth A, Cargo M, Flemming K, Garside R, Hannes K, Harden A, Harris J, Lewin S, Pantoja T, Thomas J. Cochrane Qualitative and Implementation Methods Group guidance series-paper 1: introduction. Journal of Clinical Epidemiology 2018b; 97 : 35-38.

Noyes J, Booth A, Moore G, Flemming K, Tuncalp O, Shakibazadeh E. Synthesising quantitative and qualitative evidence to inform guidelines on complex interventions: clarifying the purposes, designs and outlining some methods. BMJ Global Health 2019; 4 (Suppl 1) : e000893.

O'Neill J, Tabish H, Welch V, Petticrew M, Pottie K, Clarke M, Evans T, Pardo Pardo J, Waters E, White H, Tugwell P. Applying an equity lens to interventions: using PROGRESS ensures consideration of socially stratifying factors to illuminate inequities in health. Journal of Clinical Epidemiology 2014; 67 : 56-64.

Oliver S, Rees R, Clarke-Jones L, Milne R, Oakley A, Gabbay J, Stein K, Buchanan P, Gyte G. A multidimensional conceptual framework for analysing public involvement in health services research. Health Expectations 2008; 11 : 72-84.

Petticrew M, Knai C, Thomas J, Rehfuess E, Noyes J, Gerhardus A, Grimshaw J, Rutter H. Implications of a complexity perspective for systematic reviews and guideline development in health decision making. BMJ Global Health 2019; 4 (Suppl 1) : e000899.

Rees R, Oliver K, Woodman J, Thomas J. Children's views about obesity, body size, shape and weight. A systematic review. London: EPPI-Centre, Social Science Research Unit, Institute of Education, University of London; 2009.

Rehfuess EA, Booth A, Brereton L, Burns J, Gerhardus A, Mozygemba K, Oortwijn W, Pfadenhauer LM, Tummers M, van der Wilt GJ, Rohwer A. Towards a taxonomy of logic models in systematic reviews and health technology assessments: A priori, staged, and iterative approaches. Research Synthesis Methods 2018; 9 : 13-24.

Robbins SCC, Ward K, Skinner SR. School-based vaccination: a systematic review of process evaluations. Vaccine 2011; 29 : 9588-9599.

Rogers M, Bethel A, Abbott R. Locating qualitative studies in dementia on MEDLINE, EMBASE, CINAHL, and PsycINFO: a comparison of search strategies. Research Synthesis Methods 2018; 9 : 579-586.

Sandelowski M, Barroso J. Finding the findings in qualitative studies. Journal of Nursing Scholarship 2002; 34 : 213-219.

Selva A, Sola I, Zhang Y, Pardo-Hernandez H, Haynes RB, Martinez Garcia L, Navarro T, Schünemann H, Alonso-Coello P. Development and use of a content search strategy for retrieving studies on patients' views and preferences. Health and Quality of Life Outcomes 2017; 15 : 126.

Shepherd J, Kavanagh J, Picot J, Cooper K, Harden A, Barnett-Page E, Jones J, Clegg A, Hartwell D, Frampton GK, Price A. The effectiveness and cost-effectiveness of behavioural interventions for the prevention of sexually transmitted infections in young people aged 13-19: a systematic review and economic evaluation. Health Technology Assessment 2010; 14 : 1-206, iii-iv.

Squires JE, Valentine JC, Grimshaw JM. Systematic reviews of complex interventions: framing the review question. Journal of Clinical Epidemiology 2013; 66 : 1215-1222.

Stansfield C, Kavanagh J, Rees R, Gomersall A, Thomas J. The selection of search sources influences the findings of a systematic review of people's views: a case study in public health. BMC Medical Research Methodology 2012; 12 : 55.

SURE (Supporting the Use of Research Evidence) Collaboration. SURE Guides for Preparing and Using Evidence-based Policy Briefs: 5 Identifying and Addressing Barriers to Implementing the Policy Options. Version 2.1, updated November 2011.  https://epoc.cochrane.org/sites/epoc.cochrane.org/files/public/uploads/SURE-Guides-v2.1/Collectedfiles/sure_guides.html

Suri H. Purposeful sampling in qualitative research synthesis. Qualitative Research Journal 2011; 11 : 63-75.

Thomas J, Harden A, Oakley A, Oliver S, Sutcliffe K, Rees R, Brunton G, Kavanagh J. Integrating qualitative research with trials in systematic reviews. BMJ 2004; 328 : 1010-1012.

Thomas J, Harden A. Methods for the thematic synthesis of qualitative research in systematic reviews. BMC Medical Research Methodology 2008; 8 : 45.

Thomas J, Brunton J, Graziosi S. EPPI-Reviewer 4.0: software for research synthesis [Software]. EPPI-Centre Software. Social Science Research Unit, Institute of Education, University of London UK; 2010. https://eppi.ioe.ac.uk/CMS/Default.aspx?alias=eppi.ioe.ac.uk/cms/er4& .

Thomas J, O'Mara-Eves A, Brunton G. Using qualitative comparative analysis (QCA) in systematic reviews of complex interventions: a worked example. Systematic Reviews 2014; 3 : 67.

Tong A, Flemming K, McInnes E, Oliver S, Craig J. Enhancing transparency in reporting the synthesis of qualitative research: ENTREQ. BMC Medical Research Methodology 2012; 12 : 181.

van Grootel L, van Wesel F, O'Mara-Eves A, Thomas J, Hox J, Boeije H. Using the realist perspective to link theory from qualitative evidence synthesis to quantitative studies: broadening the matrix approach. Research Synthesis Methods 2017; 8 : 303-311.

Verhage A, Boels D. Critical appraisal of mixed methods research studies in a systematic scoping review on plural policing: assessing the impact of excluding inadequately reported studies by means of a sensitivity analysis. Quality & Quantity 2017; 51 : 1449-1468.

Walker LO, Avant KC. Strategies for theory construction in nursing . Upper Saddle River (NJ): Pearson Prentice Hall; 2005.

For permission to re-use material from the Handbook (either academic or commercial), please see here for full details.

Encyclopedia Britannica



empirical evidence



empirical evidence, information gathered directly or indirectly through observation or experimentation that may be used to confirm or disconfirm a scientific theory or to help justify, or establish as reasonable, a person’s belief in a given proposition. A belief may be said to be justified if there is sufficient evidence to make holding the belief reasonable.

The concept of evidence is the basis of philosophical evidentialism, an epistemological thesis according to which a person is justified in believing a given proposition p if and only if the person’s evidence for p is proper or sufficient. In this context, the Scottish Enlightenment philosopher David Hume (1711–76) famously asserted that the “wise man…proportions his belief to the evidence.” In a similar vein, the American astronomer Carl Sagan popularized the statement, “Extraordinary claims require extraordinary evidence.”

Foundationalists , however, defend the view that certain basic, or foundational, beliefs are either inherently justified or justified by something other than another belief (e.g., a sensation or perception) and that all other beliefs may be justified only if they are directly or indirectly supported by at least one foundational belief (that is, only if they are either supported by at least one foundational belief or supported by other beliefs that are themselves supported by at least one foundational belief). The most influential foundationalist of the modern period was the French philosopher and mathematician René Descartes (1596–1650), who attempted to establish a foundation for justified beliefs regarding an external world in his intuition that, for as long as he is thinking, he exists (“I think, therefore I am”; see cogito, ergo sum ). A traditional argument in favour of foundationalism asserts that no other account of inferential justification—the act of justifying a given belief by inferring it from another belief that itself is justified—is possible. Thus, assume that one belief, Belief 1, is justified by another belief, Belief 2. How is Belief 2 justified? It cannot be justified by Belief 1, because the inference from Belief 2 to Belief 1 would then be circular and invalid. It cannot be justified by a third nonfoundational Belief 3, because the same question would then apply to that belief, leading to an infinite regress. And one cannot simply assume that Belief 2 is not justified, for then Belief 1 would not be justified through the inference from Belief 2. Accordingly, there must be some beliefs whose justification does not depend on other beliefs, and those justified beliefs must function as a foundation for the inferential justification of other beliefs.

Empirical evidence can be quantitative or qualitative. Typically, numerical quantitative evidence can be represented visually by means of diagrams, graphs, or charts, reflecting the use of statistical or mathematical data and the researcher’s neutral noninteractive role. It can be obtained by methods such as experiments, surveys, correlational research (to study the relationship between variables), cross-sectional research (to compare different groups), causal-comparative research (to explore cause-effect relationships), and longitudinal studies (to test a subject during a given time period).
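As a small, invented illustration of the correlational designs mentioned above, the sketch below computes Pearson’s correlation coefficient between two made-up variables. The data and variable names are assumptions for the example only, and statistics.correlation requires Python 3.10 or later.

```python
from statistics import correlation, mean

# Invented example data: nightly hours of sleep and a test score for eight people.
sleep_hours = [5, 6, 6, 7, 7, 8, 8, 9]
test_scores = [62, 65, 70, 71, 74, 78, 82, 85]

# Pearson's r summarizes the strength and direction of the linear relationship
# between the two variables (here close to +1, a strong positive association).
r = correlation(sleep_hours, test_scores)  # requires Python 3.10+
print(f"mean sleep = {mean(sleep_hours):.1f} h, r = {r:.2f}")
```

A correlation of this kind describes an association only; establishing cause and effect would require an experimental or causal-comparative design such as those listed above.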

Qualitative evidence, on the other hand, can foster a deeper understanding of behaviour and related factors and is not typically expressed by using numbers. Often subjective and resulting from interaction between the researcher and participants, it can stem from the use of methods such as interviews (based on verbal interaction), observation (informing ethnographic research design), textual analysis (involving the description and interpretation of texts), focus groups (planned group discussions), and case studies (in-depth analyses of individuals or groups).

Empirical evidence is subject to assessments of its validity. Validity can be internal, involving the soundness of an experiment’s design and execution and the accuracy of subsequent data analysis, or external, involving generalizability to other research contexts (see ecological validity).


What Is Qualitative Research? | Methods & Examples

Published on June 19, 2020 by Pritha Bhandari . Revised on June 22, 2023.

Qualitative research involves collecting and analyzing non-numerical data (e.g., text, video, or audio) to understand concepts, opinions, or experiences. It can be used to gather in-depth insights into a problem or generate new ideas for research.

Qualitative research is the opposite of quantitative research, which involves collecting and analyzing numerical data for statistical analysis.

Qualitative research is commonly used in the humanities and social sciences, in subjects such as anthropology, sociology, education, health sciences, history, etc. Examples of qualitative research questions include:

  • How does social media shape body image in teenagers?
  • How do children and adults interpret healthy eating in the UK?
  • What factors influence employee retention in a large organization?
  • How is anxiety experienced around the world?
  • How can teachers integrate social issues into science curriculums?

Table of contents

  • Approaches to qualitative research
  • Qualitative research methods
  • Qualitative data analysis
  • Advantages of qualitative research
  • Disadvantages of qualitative research
  • Other interesting articles
  • Frequently asked questions about qualitative research

Qualitative research is used to understand how people experience the world. While there are many approaches to qualitative research, they tend to be flexible and focus on retaining rich meaning when interpreting data.

Common approaches include grounded theory, ethnography, action research, phenomenological research, and narrative research. They share some similarities, but emphasize different aims and perspectives.

Qualitative research approaches

  • Grounded theory: Researchers collect rich data on a topic of interest and develop theories.
  • Ethnography: Researchers immerse themselves in groups or organizations to understand their cultures.
  • Action research: Researchers and participants collaboratively link theory to practice to drive social change.
  • Phenomenological research: Researchers investigate a phenomenon or event by describing and interpreting participants’ lived experiences.
  • Narrative research: Researchers examine how stories are told to understand how participants perceive and make sense of their experiences.

Note that qualitative research is at risk for certain research biases including the Hawthorne effect, observer bias, recall bias, and social desirability bias. While not always totally avoidable, awareness of potential biases as you collect and analyze your data can prevent them from impacting your work too much.


Each of the research approaches involves using one or more data collection methods. These are some of the most common qualitative methods:

  • Observations: recording what you have seen, heard, or encountered in detailed field notes.
  • Interviews: personally asking people questions in one-on-one conversations.
  • Focus groups: asking questions and generating discussion among a group of people.
  • Surveys: distributing questionnaires with open-ended questions.
  • Secondary research: collecting existing data in the form of texts, images, audio or video recordings, etc.

For example, if you were researching the culture of a company, you might combine several of these methods:

  • You take field notes with observations and reflect on your own experiences of the company culture.
  • You distribute open-ended surveys to employees across all the company’s offices by email to find out if the culture varies across locations.
  • You conduct in-depth interviews with employees in your office to learn about their experiences and perspectives in greater detail.

Qualitative researchers often consider themselves “instruments” in research because all observations, interpretations and analyses are filtered through their own personal lens.

For this reason, when writing up your methodology for qualitative research, it’s important to reflect on your approach and to thoroughly explain the choices you made in collecting and analyzing the data.

Qualitative data can take the form of texts, photos, videos and audio. For example, you might be working with interview transcripts, survey responses, fieldnotes, or recordings from natural settings.

Most types of qualitative data analysis share the same five steps:

  • Prepare and organize your data. This may mean transcribing interviews or typing up fieldnotes.
  • Review and explore your data. Examine the data for patterns or repeated ideas that emerge.
  • Develop a data coding system. Based on your initial ideas, establish a set of codes that you can apply to categorize your data.
  • Assign codes to the data. For example, in qualitative survey analysis, this may mean going through each participant’s responses and tagging them with codes in a spreadsheet. As you go through your data, you can create new codes to add to your system if necessary. (A minimal coding sketch follows this list.)
  • Identify recurring themes. Link codes together into cohesive, overarching themes.
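A minimal, invented sketch of steps 3 to 5 might look like the following. The responses, codes, keywords and themes are all hypothetical, and real qualitative coding is an interpretive judgement rather than simple keyword matching.

```python
# Invented illustration of steps 3-5: apply a small code system to open-ended
# survey responses, then group the codes into broader themes.
responses = [
    "My manager never has time for questions",
    "Flexible hours make it easier to manage childcare",
    "Training budget was cut, so I feel stuck in my role",
]

code_keywords = {            # step 3: a simple keyword-based coding system
    "supervision": ["manager", "supervisor"],
    "flexibility": ["flexible", "hours"],
    "development": ["training", "career", "stuck"],
}

themes = {                   # step 5: codes linked into overarching themes
    "organizational support": ["supervision", "development"],
    "work-life balance": ["flexibility"],
}

def assign_codes(text):
    """Step 4: tag a response with every code whose keywords appear in it."""
    lowered = text.lower()
    return {code for code, words in code_keywords.items()
            if any(word in lowered for word in words)}

coded = {response: assign_codes(response) for response in responses}
for theme, codes in themes.items():
    matching = [r for r, c in coded.items() if c & set(codes)]
    print(theme, "->", len(matching), "response(s)")
```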

There are several specific approaches to analyzing qualitative data. Although these methods share similar processes, they emphasize different concepts.

Qualitative data analysis approaches:

  • Content analysis: used to describe and categorize common words, phrases, and ideas in qualitative data. For example, a market researcher could perform content analysis to find out what kind of language is used in descriptions of therapeutic apps.
  • Thematic analysis: used to identify and interpret patterns and themes in qualitative data. For example, a psychologist could apply thematic analysis to travel blogs to explore how tourism shapes self-identity.
  • Textual analysis: used to examine the content, structure, and design of texts. For example, a media researcher could use textual analysis to understand how news coverage of celebrities has changed in the past decade.
  • Discourse analysis: used to study communication and how language is used to achieve effects in specific contexts. For example, a political scientist could use discourse analysis to study how politicians generate trust in election campaigns.
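
As a rough illustration of the mechanical side of content analysis described above, the following sketch (Python, standard library) counts word frequencies across a handful of invented app descriptions. The texts and the stopword list are placeholders; a real content analysis would use a curated coding frame rather than raw word counts.

```python
import re
from collections import Counter

# Invented app-store style descriptions standing in for the real texts
descriptions = [
    "Track your mood daily and build calming habits.",
    "Guided meditations to reduce stress and improve sleep.",
    "Daily journaling prompts for a calmer, healthier mind.",
]

# A tiny stopword list; real studies would use a curated one
stopwords = {"your", "and", "to", "for", "a", "the"}

words = []
for text in descriptions:
    words += [w for w in re.findall(r"[a-z]+", text.lower()) if w not in stopwords]

# Word frequencies are one simple starting point for categorizing the language used
print(Counter(words).most_common(10))
```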

Qualitative research often tries to preserve the voice and perspective of participants and can be adjusted as new research questions arise. Qualitative research is good for:

  • Flexibility

The data collection and analysis process can be adapted as new ideas or patterns emerge. They are not rigidly decided beforehand.

  • Natural settings

Data collection occurs in real-world contexts or in naturalistic ways.

  • Meaningful insights

Detailed descriptions of people’s experiences, feelings and perceptions can be used in designing, testing or improving systems or products.

  • Generation of new ideas

Open-ended responses mean that researchers can uncover novel problems or opportunities that they wouldn’t have thought of otherwise.


Researchers must consider practical and theoretical limitations in analyzing and interpreting their data. Qualitative research suffers from:

  • Unreliability

The real-world setting often makes qualitative research unreliable because of uncontrolled factors that affect the data.

  • Subjectivity

Due to the researcher’s primary role in analyzing and interpreting data, qualitative research cannot be replicated . The researcher decides what is important and what is irrelevant in data analysis, so interpretations of the same data can vary greatly.

  • Limited generalizability

Small samples are often used to gather detailed data about specific contexts. Despite rigorous analysis procedures, it is difficult to draw generalizable conclusions because the data may be biased and unrepresentative of the wider population .

  • Labor-intensive

Although software can be used to manage and record large amounts of text, data analysis often has to be checked or performed manually.


Quantitative research deals with numbers and statistics, while qualitative research deals with words and meanings.

Quantitative methods allow you to systematically measure variables and test hypotheses . Qualitative methods allow you to explore concepts and experiences in more detail.

There are five common approaches to qualitative research :

  • Grounded theory involves collecting data in order to develop new theories.
  • Ethnography involves immersing yourself in a group or organization to understand its culture.
  • Narrative research involves interpreting stories to understand how people make sense of their experiences and perceptions.
  • Phenomenological research involves investigating phenomena through people’s lived experiences.
  • Action research links theory and practice in several cycles to drive innovative changes.

Data collection is the systematic process by which observations or measurements are gathered in research. It is used in many different contexts by academics, governments, businesses, and other organizations.

There are various approaches to qualitative data analysis , but they all share five steps in common:

  • Prepare and organize your data.
  • Review and explore your data.
  • Develop a data coding system.
  • Assign codes to the data.
  • Identify recurring themes.

The specifics of each step depend on the focus of the analysis. Some common approaches include textual analysis , thematic analysis , and discourse analysis .


Bhandari, P. (2023, June 22). What Is Qualitative Research? | Methods & Examples. Scribbr. Retrieved June 19, 2024, from https://www.scribbr.com/methodology/qualitative-research/


Purdue University


Research: Overview & Approaches


Introduction to Empirical Research




  • Introductory Video. This video covers what empirical research is, what kinds of questions and methods empirical researchers use, and some tips for finding empirical research articles in your discipline.

Video Tutorial

  • Guided Search: Finding Empirical Research Articles. This is a hands-on tutorial that will allow you to use your own search terms to find resources.

Examples of Empirical Research

  • Study on radiation transfer in human skin for cosmetics
  • Long-Term Mobile Phone Use and the Risk of Vestibular Schwannoma: A Danish Nationwide Cohort Study
  • Emissions Impacts and Benefits of Plug-In Hybrid Electric Vehicles and Vehicle-to-Grid Services
  • Review of design considerations and technological challenges for successful development and deployment of plug-in hybrid electric vehicles
  • Endocrine disrupters and human health: could oestrogenic chemicals in body care cosmetics adversely affect breast cancer incidence in women?



Arrendale Library, Piedmont University

Empirical & Non-Empirical Research


Introduction: What is Empirical Research?




Empirical research is based on phenomena that can be observed and measured; it derives knowledge from actual experience rather than from theory or belief.

Key characteristics of empirical research include:

  • Specific research questions to be answered;
  • Definitions of the population, behavior, or phenomena being studied;
  • Description of the methodology or research design used to study this population or phenomena, including selection criteria, controls, and testing instruments (such as surveys);
  • Two basic research processes or methods in empirical research: quantitative methods and qualitative methods (see the rest of the guide for more about these methods).

(based on the original from the Connelly Library of La Salle University)


Empirical Research: Qualitative vs. Quantitative


Quantitative Research

A quantitative research project is characterized by having a population about which the researcher wants to draw conclusions, but it is not possible to collect data on the entire population.

  • For an observational study, it is necessary to select a proper, statistical random sample and to use methods of statistical inference to draw conclusions about the population. 
  • For an experimental study, it is necessary to have a random assignment of subjects to experimental and control groups in order to use methods of statistical inference.

Statistical methods are used in all three stages of a quantitative research project.

For observational studies, the data are collected using statistical sampling theory. Then, the sample data are analyzed using descriptive statistical analysis. Finally, generalizations are made from the sample data to the entire population using statistical inference.

For experimental studies, the subjects are allocated to experimental and control groups using randomizing methods. Then, the experimental data are analyzed using descriptive statistical analysis. Finally, just as for observational data, generalizations are made to a larger population.

Iversen, G. (2004). Quantitative research . In M. Lewis-Beck, A. Bryman, & T. Liao (Eds.), Encyclopedia of social science research methods . (pp. 897-898). Thousand Oaks, CA: SAGE Publications, Inc.
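
A minimal sketch of these three stages for an observational study, in Python with only the standard library: draw a simple random sample, summarize it descriptively, then generalize to the population with an approximate 95% confidence interval for the mean. The "population" here is simulated purely for illustration; in practice it is never fully observed.

```python
import random
import statistics

random.seed(42)

# A simulated "population" -- in real research this is unknown and never fully observed
population = [random.gauss(50, 10) for _ in range(100_000)]

# Stage 1: data collection via a simple random sample
sample = random.sample(population, 200)

# Stage 2: descriptive statistical analysis of the sample
mean = statistics.mean(sample)
sd = statistics.stdev(sample)

# Stage 3: statistical inference -- approximate 95% confidence interval for the population mean
margin = 1.96 * sd / (len(sample) ** 0.5)
print(f"sample mean = {mean:.2f}, 95% CI = ({mean - margin:.2f}, {mean + margin:.2f})")
```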

Qualitative Research

What makes a work deserving of the label qualitative research is the demonstrable effort to produce richly and relevantly detailed descriptions and particularized interpretations of people and the social, linguistic, material, and other practices and events that shape and are shaped by them.

Qualitative research typically includes, but is not limited to, discerning the perspectives of these people, or what is often referred to as the actor’s point of view. Although both philosophically and methodologically a highly diverse entity, qualitative research is marked by certain defining imperatives that include its case (as opposed to its variable) orientation, sensitivity to cultural and historical context, and reflexivity. 

In its many guises, qualitative research is a form of empirical inquiry that typically entails some form of purposive sampling for information-rich cases; in-depth interviews and open-ended interviews, lengthy participant/field observations, and/or document or artifact study; and techniques for analysis and interpretation of data that move beyond the data generated and their surface appearances. 

Sandelowski, M. (2004).  Qualitative research . In M. Lewis-Beck, A. Bryman, & T. Liao (Eds.),  Encyclopedia of social science research methods . (pp. 893-894). Thousand Oaks, CA: SAGE Publications, Inc.



Avidnote

Demystifying Empirical Research: Techniques and Examples

Welcome to the world of empirical research! In this exploration, we will unravel the complexities surrounding empirical research techniques and provide illuminating examples to enhance your understanding. Empirical research forms the backbone of scientific inquiry, offering a systematic approach to gathering and analyzing data to answer research questions. By delving into this topic, we aim to demystify the process, making it accessible and comprehensible to all. Throughout this journey, we will showcase various methodologies, such as experiments, surveys, and observations, shedding light on how researchers employ these techniques to uncover valuable insights. Through real-world examples, we will illustrate the practical application of empirical research in diverse fields, demonstrating its significance in advancing knowledge and driving evidence-based decision-making. Join us as we embark on a quest to decode the intricacies of empirical research, empowering you to navigate this domain with confidence and proficiency.

Methods in Empirical Research

Empirical research is a fundamental aspect of scientific inquiry, providing a systematic approach to understanding the world around us. In this blog section, we will delve into the various methods used in empirical research, highlighting the differences between quantitative and qualitative approaches, exploring different data collection techniques, and discussing various data analysis methods.

Quantitative vs. Qualitative Approaches

  • Understanding the Distinctions: Exploring the fundamental differences between quantitative and qualitative research approaches, including their respective strengths and weaknesses.
  • When to Use Each Approach: Discussing scenarios where quantitative methods are more appropriate and when qualitative methods offer more insights.
  • Mixed-Methods Approach: Highlighting the benefits of combining quantitative and qualitative methods to provide a comprehensive understanding of research questions.

Data Collection Techniques

  • Surveys and Questionnaires: Exploring the use of surveys and questionnaires as common data collection tools in empirical research, highlighting best practices and potential pitfalls.
  • Observational Studies: Discussing the importance of observational studies in capturing real-world behaviors and phenomena, along with strategies for effective data collection.
  • Interviews and Focus Groups: Detailing the use of interviews and focus groups as qualitative data collection methods, emphasizing the value of direct interactions with participants.

Data Analysis Methods

  • Descriptive Statistics: Exploring the use of descriptive statistics to summarize and interpret data, including measures of central tendency and dispersion.
  • Inferential Statistics: Discussing how inferential statistics are used to make inferences and predictions about a population based on sample data, along with common techniques such as hypothesis testing and regression analysis.
  • Qualitative Data Analysis: Introducing qualitative data analysis methods like thematic analysis and content analysis, highlighting their role in interpreting non-numerical data.
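
The contrast between descriptive and inferential statistics mentioned above can be shown in a few lines. The sketch below assumes SciPy is installed and uses invented scores for a hypothetical experimental and control group; it is an illustration, not a recommended analysis plan.

```python
import statistics
from scipy import stats  # assumes SciPy is installed

# Invented scores for a hypothetical experimental and control group
treatment = [78, 85, 90, 72, 88, 95, 81, 79]
control = [70, 75, 80, 68, 74, 77, 72, 69]

# Descriptive statistics: central tendency and dispersion per group
for name, data in [("treatment", treatment), ("control", control)]:
    print(name, round(statistics.mean(data), 1), round(statistics.stdev(data), 1))

# Inferential statistics: two-sample (Welch) t-test on the difference in means
t_stat, p_value = stats.ttest_ind(treatment, control, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```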

By understanding the various methods in empirical research, researchers can enhance the rigor and validity of their studies, ultimately contributing to the advancement of knowledge in their respective fields. The choice of research methods should align with the research questions and objectives, ensuring that the data collected and analyzed provide meaningful insights. Researchers should also consider ethical considerations in data collection and analysis, maintaining the integrity of the research process and upholding participant confidentiality and consent. Continuous learning and adaptation of research methods based on emerging trends and technologies are essential for staying at the forefront of empirical research practices.

Examples of Empirical Research

Medical Studies

Empirical research in the medical field involves conducting experiments and studies to gather data on the effectiveness of treatments, medications, and medical procedures. This type of research is crucial in advancing medical knowledge and improving patient outcomes. Medical studies often include clinical trials, cohort studies, and case-control studies to assess the impact of various interventions on health outcomes. Researchers in this field work tirelessly to ensure that medical practices are evidence-based and safe for patients.

Social Sciences Research

Empirical research in social sciences focuses on studying human behavior, society, and relationships through systematic observation and experimentation. It helps in understanding social phenomena, developing theories, and making evidence-based decisions. Social sciences encompass disciplines such as psychology, sociology, anthropology, and economics, each contributing valuable insights into how individuals and societies function. Researchers in social sciences often use surveys, interviews, and experiments to collect data and draw meaningful conclusions.

Educational Research

Educational research uses empirical methods to study teaching, learning, and educational systems. Researchers gather data to evaluate educational interventions, assess student performance, and improve teaching practices. This research is essential for enhancing educational outcomes and informing policy decisions. Educational researchers explore topics such as curriculum development, classroom management, and educational technology to enhance the learning experience for students of all ages. By conducting empirical studies, educators can continuously improve teaching methods and contribute to the overall advancement of the education sector.

The Significance of Empirical Research

Empirical research plays a vital role in various fields by providing concrete evidence to support or refute hypotheses. In the medical field, empirical studies help in determining the efficacy of new treatments, identifying potential risks, and improving patient care. Social sciences research relies on empirical data to understand complex human behaviors, societal trends, and cultural dynamics. Educational research uses empirical findings to enhance teaching strategies, develop innovative educational tools, and address challenges in the education system.

Challenges and Opportunities

While empirical research offers valuable insights, researchers often face challenges such as obtaining reliable data, ensuring ethical practices, and interpreting results accurately. However, these challenges also present opportunities for further exploration and refinement of research methodologies. Collaborative efforts among researchers from different disciplines can lead to interdisciplinary studies that offer comprehensive solutions to complex problems.

Future Directions

The future of empirical research is promising, with advancements in technology enabling researchers to collect and analyze data more efficiently. Interdisciplinary collaborations are likely to increase, leading to innovative research projects that address multifaceted issues. As the importance of evidence-based practices continues to grow across various sectors, empirical research will remain a cornerstone in driving progress and innovation.

Empirical research serves as a cornerstone in advancing knowledge, informing decision-making, and driving positive change across diverse fields. By embracing empirical methods and fostering a culture of evidence-based inquiry, researchers can contribute significantly to the growth and development of their respective disciplines.

Challenges and Best Practices

Common Challenges Faced

Data Collection: Ensuring accurate and relevant data is collected can be a major challenge in empirical research. This involves not only gathering data but also ensuring its quality and relevance to the research questions being addressed.

Sample Size: Determining the appropriate sample size to ensure statistical significance and generalizability of results. A small sample size may lead to unreliable findings, while an excessively large sample size may be impractical or wasteful.
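
One worked example of the sample-size question, under simple assumptions: the standard formula n = (z * sigma / E)^2 for estimating a population mean to within a margin of error E at a given confidence level. The inputs below (a standard deviation taken from a hypothetical pilot study, a 2-point margin, 95% confidence) are illustrative only, not recommendations.

```python
import math

# Illustrative inputs only: sigma assumed from a hypothetical pilot study
z = 1.96            # z-score for 95% confidence
sigma = 12.0        # assumed population standard deviation
margin_error = 2.0  # acceptable margin of error, in the same units as the measure

# Standard formula for estimating a mean: n = (z * sigma / E)^2, rounded up
n = math.ceil((z * sigma / margin_error) ** 2)
print(f"required sample size: {n}")  # 139 under these assumptions
```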

Research Design: Developing a robust research design that minimizes bias and confounding variables. It is crucial to carefully plan the research methodology to control for potential sources of error and ensure the validity of the study findings.

Data Analysis: Properly analyzing the data to draw valid conclusions and avoid misinterpretation. Utilizing appropriate statistical techniques and ensuring the accuracy of data interpretation are essential for deriving meaningful insights from empirical research.

Tips for Successful Empirical Research

Clear Objectives: Define clear research objectives and hypotheses to guide the study. Clearly outlining the purpose of the research helps maintain focus and ensures that the study addresses relevant research questions.

Rigorous Methodology: Follow a systematic and rigorous methodology to ensure the validity and reliability of results. Adhering to established research protocols and best practices enhances the credibility of the study outcomes.

Peer Review: Seek feedback from peers and experts in the field to improve the quality of research. Peer review provides valuable insights, helps identify potential flaws, and enhances the overall robustness of the research.

Transparency: Maintain transparency in reporting methods, data collection, and analysis procedures. Transparent reporting promotes reproducibility and allows other researchers to assess the validity of the study findings.

Continuous Learning: Stay updated with the latest research methods and tools to enhance the quality of empirical research. Embracing a culture of continuous learning and professional development enables researchers to adopt innovative approaches and improve the rigor of their research practices.

In the realm of empirical research, challenges often arise that can impede the progress and quality of a study. Data collection stands out as a fundamental challenge, where the accuracy and relevance of collected data significantly impact the research outcomes. Moreover, determining the optimal sample size is crucial to ensure the reliability and generalizability of findings. Researchers must carefully consider these aspects to avoid pitfalls that could compromise the integrity of their research.

Research design plays a pivotal role in empirical studies, influencing the validity of results. By developing a robust research design that accounts for biases and confounding variables, researchers can enhance the credibility of their findings. Additionally, proper data analysis techniques are essential to draw accurate conclusions and derive meaningful insights from the collected data.

To conduct successful empirical research, it is imperative to establish clear objectives and hypotheses, guiding the study in a focused direction. A rigorous methodology, adherence to best practices, and seeking peer feedback contribute to the overall quality of research. Transparency in reporting methods and a commitment to continuous learning further elevate the standards of empirical research, fostering a culture of excellence and innovation in the scientific community.

Demystifying empirical research involves understanding the various techniques and examples used in the process. By exploring the methodologies, data collection methods, and analysis techniques discussed in this blog, researchers can gain a clearer insight into how empirical research is conducted. Embracing empirical research not only enhances the quality of academic work but also contributes to the advancement of knowledge in various fields. By applying the techniques and examples shared here, researchers can navigate the complexities of empirical research with confidence and precision.


What is Qualitative in Qualitative Research

  • Open access
  • Published: 27 February 2019
  • Volume 42, pages 139–160 (2019)


Patrik Aspers & Ugo Corte


What is qualitative research? If we look for a precise definition of qualitative research, and specifically for one that addresses its distinctive feature of being “qualitative,” the literature is meager. In this article we systematically search, identify and analyze a sample of 89 sources using or attempting to define the term “qualitative.” Then, drawing on ideas we find scattered across existing work, and based on Becker’s classic study of marijuana consumption, we formulate and illustrate a definition that tries to capture its core elements. We define qualitative research as an iterative process in which improved understanding to the scientific community is achieved by making new significant distinctions resulting from getting closer to the phenomenon studied. This formulation is developed as a tool to help improve research designs while stressing that a qualitative dimension is present in quantitative work as well. Additionally, it can facilitate teaching, communication between researchers, diminish the gap between qualitative and quantitative researchers, help to address critiques of qualitative methods, and be used as a standard of evaluation of qualitative research.


If we assume that there is something called qualitative research, what exactly is this qualitative feature? And how could we evaluate qualitative research as good or not? Is it fundamentally different from quantitative research? In practice, most active qualitative researchers working with empirical material intuitively know what is involved in doing qualitative research, yet perhaps surprisingly, a clear definition addressing its key feature is still missing.

To address the question of what is qualitative we turn to the accounts of “qualitative research” in textbooks and also in empirical work. In his classic, explorative, interview study of deviance Howard Becker ( 1963 ) asks ‘How does one become a marijuana user?’ In contrast to pre-dispositional and psychological-individualistic theories of deviant behavior, Becker’s inherently social explanation contends that becoming a user of this substance is the result of a three-phase sequential learning process. First, potential users need to learn how to smoke it properly to produce the “correct” effects. If not, they are likely to stop experimenting with it. Second, they need to discover the effects associated with it; in other words, to get “high,” individuals not only have to experience what the drug does, but also to become aware that those sensations are related to using it. Third, they require learning to savor the feelings related to its consumption – to develop an acquired taste. Becker, who played music himself, gets close to the phenomenon by observing, taking part, and by talking to people consuming the drug: “half of the fifty interviews were conducted with musicians, the other half covered a wide range of people, including laborers, machinists, and people in the professions” (Becker 1963 :56).

Another central aspect derived through the common-to-all-research interplay between induction and deduction (Becker 2017 ), is that during the course of his research Becker adds scientifically meaningful new distinctions in the form of three phases—distinctions, or findings if you will, that strongly affect the course of his research: its focus, the material that he collects, and which eventually impact his findings. Each phase typically unfolds through social interaction, and often with input from experienced users in “a sequence of social experiences during which the person acquires a conception of the meaning of the behavior, and perceptions and judgments of objects and situations, all of which make the activity possible and desirable” (Becker 1963 :235). In this study the increased understanding of smoking dope is a result of a combination of the meaning of the actors, and the conceptual distinctions that Becker introduces based on the views expressed by his respondents. Understanding is the result of research and is due to an iterative process in which data, concepts and evidence are connected with one another (Becker 2017 ).

Indeed, there are many definitions of qualitative research, but if we look for a definition that addresses its distinctive feature of being “qualitative,” the literature across the broad field of social science is meager. The main reason behind this article lies in the paradox, which, to put it bluntly, is that researchers act as if they know what it is, but they cannot formulate a coherent definition. Sociologists and others will of course continue to conduct good studies that show the relevance and value of qualitative research addressing scientific and practical problems in society. However, our paper is grounded in the idea that providing a clear definition will help us improve the work that we do. Among researchers who practice qualitative research there is clearly much knowledge. We suggest that a definition makes this knowledge more explicit. If the first rationale for writing this paper refers to the “internal” aim of improving qualitative research, the second refers to the increased “external” pressure that especially many qualitative researchers feel; pressure that comes both from society as well as from other scientific approaches. There is a strong core in qualitative research, and leading researchers tend to agree on what it is and how it is done. Our critique is not directed at the practice of qualitative research, but we do claim that the type of systematic work we do has not yet been done, and that it is useful to improve the field and its status in relation to quantitative research.

The literature on the “internal” aim of improving, or at least clarifying qualitative research is large, and we do not claim to be the first to notice the vagueness of the term “qualitative” (Strauss and Corbin 1998 ). Also, others have noted that there is no single definition of it (Long and Godfrey 2004 :182), that there are many different views on qualitative research (Denzin and Lincoln 2003 :11; Jovanović 2011 :3), and that more generally, we need to define its meaning (Best 2004 :54). Strauss and Corbin ( 1998 ), for example, as well as Nelson et al. (1992:2 cited in Denzin and Lincoln 2003 :11), and Flick ( 2007 :ix–x), have recognized that the term is problematic: “Actually, the term ‘qualitative research’ is confusing because it can mean different things to different people” (Strauss and Corbin 1998 :10–11). Hammersley has discussed the possibility of addressing the problem, but states that “the task of providing an account of the distinctive features of qualitative research is far from straightforward” ( 2013 :2). This confusion, as he has recently further argued (Hammersley 2018 ), is also salient in relation to ethnography where different philosophical and methodological approaches lead to a lack of agreement about what it means.

Others (e.g. Hammersley 2018; Fine and Hancock 2017) have also identified the threat to qualitative research that comes from external forces, seen from the point of view of “qualitative research.” This threat can be further divided into that which comes from inside academia, such as the critique voiced by “quantitative research,” and that which comes from outside of academia, including, for example, New Public Management. Hammersley (2018), zooming in on one type of qualitative research, ethnography, has argued that it is under threat. Similarly to Fine (2003), and before him Gans (1999), he writes that ethnography has acquired a range of meanings and comes in many different versions, these often reflecting sharply divergent epistemological orientations. And already more than twenty years ago, while reviewing Denzin and Lincoln’s Handbook of Qualitative Methods, Fine argued:

While this increasing centrality [of qualitative research] might lead one to believe that consensual standards have developed, this belief would be misleading. As the methodology becomes more widely accepted, querulous challengers have raised fundamental questions that collectively have undercut the traditional models of how qualitative research is to be fashioned and presented (1995:417).

According to Hammersley, there are today “serious threats to the practice of ethnographic work, on almost any definition” (2018:1). He lists five external threats: (1) that social research must be accountable and able to show its impact on society; (2) the current emphasis on “big data” and the emphasis on quantitative data and evidence; (3) the labor market pressure in academia that leaves less time for fieldwork (see also Fine and Hancock 2017); (4) problems of access to fields; and (5) the increased ethical scrutiny of projects, to which ethnography is particularly exposed. Hammersley discusses some more or less insufficient existing definitions of ethnography.

The current situation, as Hammersley and others note—and in relation not only to ethnography but also qualitative research in general, and as our empirical study shows—is not just unsatisfactory, it may even be harmful for the entire field of qualitative research, and does not help social science at large. We suggest that the lack of clarity of qualitative research is a real problem that must be addressed.

Towards a Definition of Qualitative Research

Seen in an historical light, what is today called qualitative, or sometimes ethnographic, interpretative research – or a number of other terms – has more or less always existed. At the time the founders of sociology – Simmel, Weber, Durkheim and, before them, Marx – were writing, and during the era of the Methodenstreit (“dispute about methods”) in which the German historical school emphasized scientific methods (cf. Swedberg 1990 ), we can at least speak of qualitative forerunners.

Perhaps the most extended discussion of what later became known as qualitative methods in a classic work is Bronisław Malinowski’s (1922) Argonauts of the Western Pacific, although even this study does not explicitly address the meaning of “qualitative.” In Weber’s ([1921–22] 1978) work we find a tension between scientific explanations that are based on observation and quantification and interpretative research (see also Lazarsfeld and Barton 1982).

If we look through major sociology journals like the American Sociological Review, American Journal of Sociology, or Social Forces we will not find the term qualitative sociology before the 1970s. And certainly before then much of what we consider qualitative classics in sociology, like Becker’s study (1963), had already been produced. Indeed, the Chicago School often combined qualitative and quantitative data within the same study (Fine 1995). Our point is that, before any disciplinary self-awareness, the term quantitative preceded qualitative, and the articulation of the former was a political move to claim scientific status (Denzin and Lincoln 2005). In the US, World War II seems to have sparked a critique of sociological work, including “qualitative work,” that did not follow the scientific canon (Rawls 2018), which was underpinned by a scientifically oriented and value-free philosophy of science. As a result the attempts and practice of integrating qualitative and quantitative sociology at Chicago lost ground to sociology that was more oriented to surveys and quantitative work at Columbia under Merton-Lazarsfeld. The quantitative tradition was also able to present textbooks (Lundberg 1951) that facilitated the use of this approach and its “methods.” The practices of the qualitative tradition, by and large, remained tacit or were part of the mentoring transferred from the renowned masters to their students.

This glimpse into history leads us back to the lack of a coherent account condensed in a definition of qualitative research. Many of the attempts to define the term do not meet the requirements of a proper definition: A definition should be clear, avoid tautology, demarcate its domain in relation to the environment, and ideally only use words in its definiens that themselves are not in need of definition (Hempel 1966 ). A definition can enhance precision and thus clarity by identifying the core of the phenomenon. Preferably, a definition should be short. The typical definition we have found, however, is an ostensive definition, which indicates what qualitative research is about without informing us about what it actually is :

Qualitative research is multimethod in focus, involving an interpretative, naturalistic approach to its subject matter. This means that qualitative researchers study things in their natural settings, attempting to make sense of, or interpret, phenomena in terms of the meanings people bring to them. Qualitative research involves the studied use and collection of a variety of empirical materials – case study, personal experience, introspective, life story, interview, observational, historical, interactional, and visual texts – that describe routine and problematic moments and meanings in individuals’ lives. (Denzin and Lincoln 2005 :2)

Flick claims that the label “qualitative research” is indeed used as an umbrella for a number of approaches ( 2007 :2–4; 2002 :6), and it is not difficult to identify research fitting this designation. Moreover, whatever it is, it has grown dramatically over the past five decades. In addition, courses have been developed, methods have flourished, arguments about its future have been advanced (for example, Denzin and Lincoln 1994) and criticized (for example, Snow and Morrill 1995 ), and dedicated journals and books have mushroomed. Most social scientists have a clear idea of research and how it differs from journalism, politics and other activities. But the question of what is qualitative in qualitative research is either eluded or eschewed.

We maintain that this lacuna hinders systematic knowledge production based on qualitative research. Paul Lazarsfeld noted the lack of “codification” as early as 1955 when he reviewed 100 qualitative studies in order to offer a codification of the practices (Lazarsfeld and Barton 1982:239). Since then many texts on “qualitative research” and its methods have been published, including recent attempts (Goertz and Mahoney 2012) similar to Lazarsfeld’s. These studies have tried to extract what is qualitative by looking at the large number of empirical “qualitative” studies. Our novel strategy complements these endeavors by taking another approach: looking at the attempts to codify these practices in the form of a definition, as well as, to a minor extent, taking Becker’s study as an exemplar of what qualitative researchers actually do and what the characteristic of being ‘qualitative’ denotes and implies. We claim that qualitative researchers, if there is such a thing as “qualitative research,” should be able to codify their practices in a condensed, yet general way expressed in language.

Lingering problems of “generalizability” and “how many cases do I need” (Small 2009 ) are blocking advancement – in this line of work qualitative approaches are said to differ considerably from quantitative ones, while some of the former unsuccessfully mimic principles related to the latter (Small 2009 ). Additionally, quantitative researchers sometimes unfairly criticize the first based on their own quality criteria. Scholars like Goertz and Mahoney ( 2012 ) have successfully focused on the different norms and practices beyond what they argue are essentially two different cultures: those working with either qualitative or quantitative methods. Instead, similarly to Becker ( 2017 ) who has recently questioned the usefulness of the distinction between qualitative and quantitative research, we focus on similarities.

The current situation also impedes both students and researchers in focusing their studies and understanding each other’s work (Lazarsfeld and Barton 1982 :239). A third consequence is providing an opening for critiques by scholars operating within different traditions (Valsiner 2000 :101). A fourth issue is that the “implicit use of methods in qualitative research makes the field far less standardized than the quantitative paradigm” (Goertz and Mahoney 2012 :9). Relatedly, the National Science Foundation in the US organized two workshops in 2004 and 2005 to address the scientific foundations of qualitative research involving strategies to improve it and to develop standards of evaluation in qualitative research. However, a specific focus on its distinguishing feature of being “qualitative” while being implicitly acknowledged, was discussed only briefly (for example, Best 2004 ).

In 2014 a theme issue was published in this journal on “Methods, Materials, and Meanings: Designing Cultural Analysis,” discussing central issues in (cultural) qualitative research (Berezin 2014; Biernacki 2014; Glaeser 2014; Lamont and Swidler 2014; Spillman 2014). We agree with many of the arguments put forward, such as the risk of methodological tribalism, and that we should not waste energy on debating methods separated from research questions. Nonetheless, a clarification of the relation to what is called “quantitative research” is of utmost importance to avoid misunderstandings and misguided debates between “qualitative” and “quantitative” researchers. Our strategy means that researchers, whether “qualitative” or “quantitative,” may in their actual practice combine qualitative and quantitative work.

In this article we accomplish three tasks. First, we systematically survey the literature for meanings of qualitative research by looking at how researchers have defined it. Drawing upon existing knowledge we find that the different meanings and ideas of qualitative research are not yet coherently integrated into one satisfactory definition. Next, we advance our contribution by offering a definition of qualitative research and illustrate its meaning and use partially by expanding on the brief example introduced earlier related to Becker’s work ( 1963 ). We offer a systematic analysis of central themes of what researchers consider to be the core of “qualitative,” regardless of style of work. These themes – which we summarize in terms of four keywords: distinction, process, closeness, improved understanding – constitute part of our literature review, in which each one appears, sometimes with others, but never all in the same definition. They serve as the foundation of our contribution. Our categories are overlapping. Their use is primarily to organize the large amount of definitions we have identified and analyzed, and not necessarily to draw a clear distinction between them. Finally, we continue the elaboration discussed above on the advantages of a clear definition of qualitative research.

In a hermeneutic fashion we propose that there is something meaningful that deserves to be labelled “qualitative research” (Gadamer 1990 ). To approach the question “What is qualitative in qualitative research?” we have surveyed the literature. In conducting our survey we first traced the word’s etymology in dictionaries, encyclopedias, handbooks of the social sciences and of methods and textbooks, mainly in English, which is common to methodology courses. It should be noted that we have zoomed in on sociology and its literature. This discipline has been the site of the largest debate and development of methods that can be called “qualitative,” which suggests that this field should be examined in great detail.

In an ideal situation we should expect that one good definition, or at least some common ideas, would have emerged over the years. This common core of qualitative research should be so accepted that it would appear in at least some textbooks. Since this is not what we found, we decided to pursue an inductive approach to capture maximal variation in the field of qualitative research; we searched in a selection of handbooks, textbooks, book chapters, and books, to which we added the analysis of journal articles. Our sample comprises a total of 89 references.

In practice we focused on the discipline that has had a clear discussion of methods, namely sociology. We also conducted a broad search in the JSTOR database to identify scholarly sociology articles published between 1998 and 2017 in English with a focus on defining or explaining qualitative research. We specifically zoom in on this time frame because we would expect this more mature period to have produced clear discussions on the meaning of qualitative research. To find these articles we combined a number of keywords to search the content and/or the title: qualitative (which was always included), definition, empirical, research, methodology, studies, fieldwork, interview and observation.

As a second phase of our research we searched within nine major sociological journals ( American Journal of Sociology , Sociological Theory , American Sociological Review , Contemporary Sociology , Sociological Forum , Sociological Theory , Qualitative Research , Qualitative Sociology and Qualitative Sociology Review ) for articles also published during the past 19 years (1998–2017) that had the term “qualitative” in the title and attempted to define qualitative research.

Lastly we picked two additional journals, Qualitative Research and Qualitative Sociology, in which we could expect to find texts addressing the notion of “qualitative.” From Qualitative Research we chose Volume 14, Issue 6, December 2014, and from Qualitative Sociology we chose Volume 36, Issue 2, June 2017. Within each of these we selected the first article; then we picked the second article of three prior issues. Again we went back another three issues and investigated article number three. Finally we went back another three issues and perused article number four. These selection criteria were used to get a manageable sample for the analysis.

The coding process of the 89 references we gathered in our selected review began soon after the first round of material was gathered, and we reduced the complexity created by our maximum variation sampling (Snow and Anderson 1993 :22) to four different categories within which questions on the nature and properties of qualitative research were discussed. We call them: Qualitative and Quantitative Research, Qualitative Research, Fieldwork, and Grounded Theory. This – which may appear as an illogical grouping – merely reflects the “context” in which the matter of “qualitative” is discussed. If the selection process of the material – books and articles – was informed by pre-knowledge, we used an inductive strategy to code the material. When studying our material, we identified four central notions related to “qualitative” that appear in various combinations in the literature which indicate what is the core of qualitative research. We have labeled them: “distinctions”, “process,” “closeness,” and “improved understanding.” During the research process the categories and notions were improved, refined, changed, and reordered. The coding ended when a sense of saturation in the material arose. In the presentation below all quotations and references come from our empirical material of texts on qualitative research.

Analysis – What is Qualitative Research?

In this section we describe the four categories we identified in the coding, how they differently discuss qualitative research, as well as their overall content. Some salient quotations are selected to represent the type of text sorted under each of the four categories. What we present are examples from the literature.

Qualitative and Quantitative

This analytic category comprises quotations comparing qualitative and quantitative research, a distinction that is frequently used (Brown 2010 :231); in effect this is a conceptual pair that structures the discussion and that may be associated with opposing interests. While the general goal of quantitative and qualitative research is the same – to understand the world better – their methodologies and focus in certain respects differ substantially (Becker 1966 :55). Quantity refers to that property of something that can be determined by measurement. In a dictionary of Statistics and Methodology we find that “(a) When referring to *variables, ‘qualitative’ is another term for *categorical or *nominal. (b) When speaking of kinds of research, ‘qualitative’ refers to studies of subjects that are hard to quantify, such as art history. Qualitative research tends to be a residual category for almost any kind of non-quantitative research” (Stiles 1998:183). But it should be obvious that one could employ a quantitative approach when studying, for example, art history.

The same dictionary states that quantitative is “said of variables or research that can be handled numerically, usually (too sharply) contrasted with *qualitative variables and research” (Stiles 1998:184). From a qualitative perspective “quantitative research” is about numbers and counting, and from a quantitative perspective qualitative research is everything that is not about numbers. But this does not say much about what is “qualitative.” If we turn to encyclopedias we find that in the 1932 edition of the Encyclopedia of the Social Sciences there is no mention of “qualitative.” In the Encyclopedia from 1968 we can read:

Qualitative Analysis. For methods of obtaining, analyzing, and describing data, see [the various entries:] CONTENT ANALYSIS; COUNTED DATA; EVALUATION RESEARCH, FIELD WORK; GRAPHIC PRESENTATION; HISTORIOGRAPHY, especially the article on THE RHETORIC OF HISTORY; INTERVIEWING; OBSERVATION; PERSONALITY MEASUREMENT; PROJECTIVE METHODS; PSYCHOANALYSIS, article on EXPERIMENTAL METHODS; SURVEY ANALYSIS, TABULAR PRESENTATION; TYPOLOGIES. (Vol. 13:225)

Some, like Alford, divide researchers into methodologists or, in his words, “quantitative and qualitative specialists” (Alford 1998 :12). Qualitative research uses a variety of methods, such as intensive interviews or in-depth analysis of historical materials, and it is concerned with a comprehensive account of some event or unit (King et al. 1994 :4). Like quantitative research it can be utilized to study a variety of issues, but it tends to focus on meanings and motivations that underlie cultural symbols, personal experiences, phenomena and detailed understanding of processes in the social world. In short, qualitative research centers on understanding processes, experiences, and the meanings people assign to things (Kalof et al. 2008 :79).

Others simply say that qualitative methods are inherently unscientific (Jovanović 2011 :19). Hood, for instance, argues that words are intrinsically less precise than numbers, and that they are therefore more prone to subjective analysis, leading to biased results (Hood 2006 :219). Qualitative methodologies have raised concerns over the limitations of quantitative templates (Brady et al. 2004 :4). Scholars such as King et al. ( 1994 ), for instance, argue that non-statistical research can produce more reliable results if researchers pay attention to the rules of scientific inference commonly stated in quantitative research. Also, researchers such as Becker ( 1966 :59; 1970 :42–43) have asserted that, if conducted properly, qualitative research and in particular ethnographic field methods, can lead to more accurate results than quantitative studies, in particular, survey research and laboratory experiments.

Some researchers, such as Kalof, Dan, and Dietz ( 2008 :79) claim that the boundaries between the two approaches are becoming blurred, and Small ( 2009 ) argues that currently much qualitative research (especially in North America) tries unsuccessfully and unnecessarily to emulate quantitative standards. For others, qualitative research tends to be more humanistic and discursive (King et al. 1994 :4). Ragin ( 1994 ), and similarly also Becker, ( 1996 :53), Marchel and Owens ( 2007 :303) think that the main distinction between the two styles is overstated and does not rest on the simple dichotomy of “numbers versus words” (Ragin 1994 :xii). Some claim that quantitative data can be utilized to discover associations, but in order to unveil cause and effect a complex research design involving the use of qualitative approaches needs to be devised (Gilbert 2009 :35). Consequently, qualitative data are useful for understanding the nuances lying beyond those processes as they unfold (Gilbert 2009 :35). Others contend that qualitative research is particularly well suited both to identify causality and to uncover fine descriptive distinctions (Fine and Hallett 2014 ; Lichterman and Isaac Reed 2014 ; Katz 2015 ).

There are other ways to separate these two traditions, including normative statements about what qualitative research should be (that is, better or worse than quantitative approaches, or concerned with scientific approaches to societal change or vice versa; Snow and Morrill 1995; Denzin and Lincoln 2005), or whether it should develop falsifiable statements (Best 2004).

We propose that quantitative research is largely concerned with pre-determined variables (Small 2008 ); the analysis concerns the relations between variables. These categories are primarily not questioned in the study, only their frequency or degree, or the correlations between them (cf. Franzosi 2016 ). If a researcher studies wage differences between women and men, he or she works with given categories: x number of men are compared with y number of women, with a certain wage attributed to each person. The idea is not to move beyond the given categories of wage, men and women; they are the starting point as well as the end point, and undergo no “qualitative change.” Qualitative research, in contrast, investigates relations between categories that are themselves subject to change in the research process. Returning to Becker’s study ( 1963 ), we see that he questioned pre-dispositional theories of deviant behavior working with pre-determined variables such as an individual’s combination of personal qualities or emotional problems. His take, in contrast, was to understand marijuana consumption by developing “variables” as part of the investigation. Thereby he presented new variables, or as we would say today, theoretical concepts, but which are grounded in the empirical material.

Qualitative Research

This category contains quotations that refer to descriptions of qualitative research without making comparisons with quantitative research. Researchers such as Denzin and Lincoln, who have written a series of influential handbooks on qualitative methods (1994; Denzin and Lincoln 2003 ; 2005 ), citing Nelson et al. (1992:4), argue that because qualitative research is “interdisciplinary, transdisciplinary, and sometimes counterdisciplinary” it is difficult to derive one single definition of it (Jovanović 2011 :3). According to them, in fact, “the field” is “many things at the same time,” involving contradictions, tensions over its focus, methods, and how to derive interpretations and findings ( 2003 : 11). Similarly, others, such as Flick ( 2007 :ix–x) contend that agreeing on an accepted definition has increasingly become problematic, and that qualitative research has possibly matured different identities. However, Best holds that “the proliferation of many sorts of activities under the label of qualitative sociology threatens to confuse our discussions” ( 2004 :54). Atkinson’s position is more definite: “the current state of qualitative research and research methods is confused” ( 2005 :3–4).

Qualitative research is about interpretation (Blumer 1969 ; Strauss and Corbin 1998 ; Denzin and Lincoln 2003 ), or Verstehen [understanding] (Frankfort-Nachmias and Nachmias 1996 ). It is “multi-method,” involving the collection and use of a variety of empirical materials (Denzin and Lincoln 1998; Silverman 2013 ) and approaches (Silverman 2005 ; Flick 2007 ). It focuses not only on the objective nature of behavior but also on its subjective meanings: individuals’ own accounts of their attitudes, motivations, behavior (McIntyre 2005 :127; Creswell 2009 ), events and situations (Bryman 1989) – what people say and do in specific places and institutions (Goodwin and Horowitz 2002 :35–36) in social and temporal contexts (Morrill and Fine 1997). For this reason, following Weber ([1921-22] 1978), it can be described as an interpretative science (McIntyre 2005 :127). But could quantitative research also be concerned with these questions? Also, as pointed out below, does all qualitative research focus on subjective meaning, as some scholars suggest?

Others also distinguish qualitative research by claiming that it collects data using a naturalistic approach (Denzin and Lincoln 2005 :2; Creswell 2009 ), focusing on the meaning actors ascribe to their actions. But again, does all qualitative research need to be collected in situ? And does qualitative research have to be inherently concerned with meaning? Flick ( 2007 ), referring to Denzin and Lincoln ( 2005 ), mentions conversation analysis as an example of qualitative research that is not concerned with the meanings people bring to a situation, but rather with the formal organization of talk. Still others, such as Ragin ( 1994 :85), note that qualitative research is often (especially early on in the project, we would add) less structured than other kinds of social research – a characteristic connected to its flexibility that can lead to both better and worse results. But is this not a feature of this type of research, rather than a defining description of its essence? Wouldn’t this comment also apply, albeit to varying degrees, to quantitative research?

In addition, Strauss ( 2003 ), along with others, such as Alvesson and Kärreman ( 2011 :10–76), argue that qualitative researchers struggle to capture and represent complex phenomena partially because they tend to collect a large amount of data. While his analysis is correct at some points – “It is necessary to do detailed, intensive, microscopic examination of the data in order to bring out the amazing complexity of what lies in, behind, and beyond those data” (Strauss 2003 :10) – much of his analysis concerns the supposed focus of qualitative research and its challenges, rather than exactly what it is about. But even in this instance it would be a weak case to argue that these are strictly the defining features of qualitative research. Some researchers seem to focus on the approach or the methods used, or even on the way material is analyzed. Several researchers stress the naturalistic assumption of investigating the world, suggesting that meaning and interpretation appear to be a core matter of qualitative research.

We can also see that in this category there is no consensus about specific qualitative methods nor about qualitative data. Many emphasize interpretation, but quantitative research, too, involves interpretation; the results of a regression analysis, for example, certainly have to be interpreted, and the form of meta-analysis that factor analysis provides indeed requires interpretation. However, there is no interpretation of quantitative raw data, i.e., numbers in tables. One common thread is that qualitative researchers have to get to grips with their data in order to understand what is being studied in great detail, irrespective of the type of empirical material that is being analyzed. This observation is connected to the fact that qualitative researchers routinely make several adjustments of focus and research design as their studies progress, in many cases until the very end of the project (Kalof et al. 2008 ). If you, like Becker, do not start out with a detailed theory, adjustments such as the emergence and refinement of research questions will occur during the research process. We have thus found a number of useful reflections about qualitative research scattered across different sources, but none of them effectively describe the defining characteristics of this approach.

Although qualitative research does not appear to be defined in terms of a specific method, it is certainly common that fieldwork, i.e., research in which the researcher spends considerable time in the field that is studied and uses the knowledge gained as data, is seen as emblematic of or even identical to qualitative research. But because we understand that fieldwork tends to focus primarily on the collection and analysis of qualitative data, we expected to find within it discussions on the meaning of “qualitative.” But, again, this was not the case.

Instead, we found material on the history of this approach (for example, Frankfort-Nachmias and Nachmias 1996 ; Atkinson et al. 2001), including how it has changed; for example, by adopting a more self-reflexive practice (Heyl 2001), as well as the different nomenclature that has been adopted, such as fieldwork, ethnography, qualitative research, naturalistic research, participant observation and so on (for example, Lofland et al. 2006 ; Gans 1999 ).

We retrieved definitions of ethnography, such as “the study of people acting in the natural courses of their daily lives,” involving a “resocialization of the researcher” (Emerson 1988 :1) through intense immersion in others’ social worlds (see also examples in Hammersley 2018 ). This may be accomplished by direct observation and also participation (Neuman 2007 :276), although others, such as Denzin ( 1970 :185), have long recognized other types of observation, including non-participant (“fly on the wall”). In this category we have also isolated claims and opposing views, arguing that this type of research is distinguished primarily by where it is conducted (natural settings) (Hughes 1971:496), and how it is carried out (a variety of methods are applied) or, for some, most importantly, by involving an active, empathetic immersion in those being studied (Emerson 1988 :2). We also retrieved descriptions of the goals it attends to in relation to how it is taught (understanding the subjective meanings of the people studied, primarily developing theory, or contributing to social change) (see for example, Corte and Irwin 2017 ; Frankfort-Nachmias and Nachmias 1996 :281; Trier-Bieniek 2012 :639) by collecting the richest possible data (Lofland et al. 2006 ) to derive “thick descriptions” (Geertz 1973 ), and/or to aim at theoretical statements of general scope and applicability (for example, Emerson 1988 ; Fine 2003 ). We have identified guidelines on how to evaluate it (for example Becker 1996 ; Lamont 2004 ) and have retrieved instructions on how it should be conducted (for example, Lofland et al. 2006 ). For instance, analysis should take place while the data gathering unfolds (Emerson 1988 ; Hammersley and Atkinson 2007 ; Lofland et al. 2006 ), observations should be of long duration (Becker 1970 :54; Goffman 1989 ), and data should be of high quantity (Becker 1970 :52–53), as well as other questionable distinctions between fieldwork and other methods:

Field studies differ from other methods of research in that the researcher performs the task of selecting topics, decides what questions to ask, and forges interest in the course of the research itself . This is in sharp contrast to many ‘theory-driven’ and ‘hypothesis-testing’ methods. (Lofland and Lofland 1995 :5)

But could not, for example, a strictly interview-based study be carried out with the same amount of flexibility, such as sequential interviewing (for example, Small 2009 )? Once again, are quantitative approaches really as inflexible as some qualitative researchers think? Moreover, this category stresses the role of the actors’ meaning, which requires knowledge and close interaction with people, their practices and their lifeworld.

It is clear that field studies – which are seen by some as the “gold standard” of qualitative research – are nonetheless only one way of doing qualitative research. There are other methods, but it is not clear why some are more qualitative than others, or why they are better or worse. Fieldwork is characterized by interaction with the field (the material) and understanding of the phenomenon that is being studied. In Becker’s case, he had general experience from fields in which marihuana was used, based on which he did interviews with actual users in several fields.

Grounded Theory

Another major category we identified in our sample is Grounded Theory. We found descriptions of it most clearly in Glaser and Strauss’ ([1967] 2010 ) original articulation, Strauss and Corbin ( 1998 ) and Charmaz ( 2006 ), as well as many other accounts of what it is for: generating and testing theory (Strauss 2003 :xi). We identified explanations of how this task can be accomplished – such as through two main procedures: constant comparison and theoretical sampling (Emerson 1998:96), and how using it has helped researchers to “think differently” (for example, Strauss and Corbin 1998 :1). We also read descriptions of its main traits, what it entails and fosters – for instance, an exceptional flexibility, an inductive approach (Strauss and Corbin 1998 :31–33; 1990; Esterberg 2002 :7), an ability to step back and critically analyze situations, recognize tendencies towards bias, think abstractly and be open to criticism, enhance sensitivity towards the words and actions of respondents, and develop a sense of absorption and devotion to the research process (Strauss and Corbin 1998 :5–6). Accordingly, we identified discussions of the value of triangulating different methods (both using and not using grounded theory), including quantitative ones, and theories to achieve theoretical development (most comprehensively in Denzin 1970 ; Strauss and Corbin 1998 ; Timmermans and Tavory 2012 ). We have also located arguments about how its practice helps to systematize data collection, analysis and presentation of results (Glaser and Strauss [1967] 2010 :16).

Grounded theory offers a systematic approach which requires researchers to get close to the field; closeness is a requirement of identifying questions and developing new concepts or making further distinctions with regard to old concepts. In contrast to other qualitative approaches, grounded theory emphasizes the detailed coding process, and the numerous fine-tuned distinctions that the researcher makes during the process. Within this category, too, we could not find a satisfying discussion of the meaning of qualitative research.

Defining Qualitative Research

In sum, our analysis shows that some notions reappear in the discussion of qualitative research, such as understanding, interpretation, “getting close” and making distinctions. These notions capture aspects of what we think is “qualitative.” However, a comprehensive definition that is useful and that can further develop the field is lacking, and not even a clear picture of its essential elements appears. In other words, no definition emerges from our data, and in our research process we have moved back and forth between our empirical data and the attempt to present a definition. Our concrete strategy, as stated above, is to relate qualitative and quantitative research, or more specifically, qualitative and quantitative work. We use an ideal-typical notion of quantitative research which relies on taken-for-granted, numbered variables. This means that the data consist of variables on different scales, such as ordinal but more frequently ratio and absolute scales, and that the assignment of numbers to the variables, i.e., the justification for attaching numbers to an object or phenomenon, is not questioned, though its validity may be. In this section we return to the notion of quality and try to clarify it while presenting our contribution.

Broadly, research refers to the activity performed by people trained to obtain knowledge through systematic procedures. Notions such as “objectivity” and “reflexivity,” “systematic,” “theory,” “evidence” and “openness” are here taken for granted in any type of research. Next, building on our empirical analysis we explain the four notions that we have identified as central to qualitative work: distinctions, process, closeness, and improved understanding. In discussing them, ultimately in relation to one another, we make their meaning even more precise. Our idea, in short, is that only when these ideas that we present separately for analytic purposes are brought together can we speak of qualitative research.

Distinctions

We believe that the possibility of making new distinctions is one of the defining characteristics of qualitative research. This clearly sets it apart from quantitative analysis, which works with taken-for-granted variables, although, as mentioned, meta-analyses such as factor analysis may result in new variables. “Quality” refers essentially to distinctions, as already pointed out by Aristotle. He discusses the term “qualitative” commenting: “By a quality I mean that in virtue of which things are said to be qualified somehow” (Aristotle 1984:14). Quality is about what something is or has, which means that the distinction from its environment is crucial. We see qualitative research as a process in which significant new distinctions are made to the scholarly community; to make distinctions is a key aspect of obtaining new knowledge; a point, as we will see, that also has implications for “quantitative research.” The notion of being “significant” is paramount. New distinctions by themselves are not enough; just adding concepts only increases complexity without furthering our knowledge. The significance of new distinctions is judged against the communal knowledge of the research community. To enable this discussion and these judgements, central elements of rational discussion are required (cf. Habermas [1981] 1987 ; Davidsson [1988] 2001) to identify what is new and relevant scientific knowledge. Relatedly, Ragin alludes to the idea of new and useful knowledge at a more concrete level: “Qualitative methods are appropriate for in-depth examination of cases because they aid the identification of key features of cases. Most qualitative methods enhance data” (1994:79). When Becker ( 1963 ) studied deviant behavior and investigated how people became marihuana smokers, he made distinctions between the ways in which people learned how to smoke. This is a classic example of how the strategy of “getting close” to the material, for example the text, people or pictures that are subject to analysis, may enable researchers to obtain deeper insight and new knowledge by making distinctions – in this instance on the initial notion of learning how to smoke. Others have stressed the making of distinctions in relation to coding or theorizing. Emerson et al. ( 1995 ), for example, hold that “qualitative coding is a way of opening up avenues of inquiry,” meaning that the researcher identifies and develops concepts and analytic insights through close examination of and reflection on data (Emerson et al. 1995 :151). Goodwin and Horowitz highlight making distinctions in relation to theory-building, writing: “Close engagement with their cases typically requires qualitative researchers to adapt existing theories or to make new conceptual distinctions or theoretical arguments to accommodate new data” ( 2002 : 37). In ideal-typical quantitative research only existing and, so to speak, given variables would be used. If this is the case, no new distinctions are made. But would not many “quantitative” researchers also make new distinctions?

Process

Process does not merely suggest that research takes time. It mainly implies that new qualitative knowledge results from a process that involves several phases, and above all iteration. Qualitative research is about oscillation between theory and evidence, between analysis and generating material, between first- and second-order constructs (Schütz 1962 :59), between getting in contact with something, finding sources, becoming deeply familiar with a topic, and then distilling and communicating some of its essential features. The main point is that the categories that the researcher uses, and perhaps takes for granted at the beginning of the research process, usually undergo qualitative changes resulting from what is found. Becker describes how he tested hypotheses and let the jargon of the users develop into theoretical concepts. This happens over time while the study is being conducted, exemplifying what we mean by process.

In the research process, a pilot-study may be used to get a first glance of, for example, the field, how to approach it, and what methods can be used, after which the method and theory are chosen or refined before the main study begins. Thus, the empirical material is often central from the start of the project and frequently leads to adjustments by the researcher. Likewise, during the main study categories are not fixed; the empirical material is seen in light of the theory used, but it is also given the opportunity to kick back, thereby resisting attempts to apply theoretical straightjackets (Becker 1970 :43). In this process, coding and analysis are interwoven, and thus are often important steps for getting closer to the phenomenon and deciding what to focus on next. Becker began his research by interviewing musicians close to him, then asking them to refer him to other musicians, and later on doubling his original sample of about 25 to include individuals in other professions (Becker 1973:46). Additionally, he made use of some participant observation, documents, and interviews with opiate users made available to him by colleagues. As his inductive theory of deviance evolved, Becker expanded his sample in order to fine tune it, and test the accuracy and generality of his hypotheses. In addition, he introduced a negative case and discussed the null hypothesis ( 1963 :44). His phasic career model is thus based on a research design that embraces processual work. Typically, process means to move between “theory” and “material” but also to deal with negative cases, and Becker ( 1998 ) describes how discovering these negative cases impacted his research design and ultimately its findings.

Obviously, all research is process-oriented to some degree. The point is that the ideal-typical quantitative process does not imply change of the data, nor iteration between data, evidence, hypotheses, empirical work, and theory. The data, quantified variables, are in most cases fixed. Merging of data, which of course can be done in a quantitative research process, does not mean new data. New hypotheses are frequently tested, but the “raw data” is often “the same.” Obviously, over time new datasets are made available and put into use.

Closeness

Another characteristic that is emphasized in our sample is that qualitative researchers – and in particular ethnographers – can, or as Goffman ( 1989 ) put it, ought to, get closer to the phenomenon being studied and to their data than quantitative researchers (for example, Silverman 2009 :85). Put differently, essentially because of their methods qualitative researchers get into direct, close contact with those being investigated and/or the material, such as texts, being analyzed. Becker started out his interview study, as we noted, by talking to those he knew in the field of music to get closer to the phenomenon he was studying. By conducting interviews he got even closer. Had he done more observations, he would undoubtedly have got even closer to the field.

Additionally, ethnographers’ design enables researchers to follow the field over time, and the research they do is almost by definition longitudinal, though the time spent in the field obviously differs between studies. The general characteristic of closeness over time maximizes the chances of unexpected events, new data (related, for example, to archival research as additional sources, and, for ethnography, to situations not necessarily previously thought of as instrumental – what Mannay and Morgan ( 2015 ) term the “waiting field”), serendipity (Merton and Barber 2004 ; Åkerström 2013 ), and possibly reactivity, as well as the opportunity to observe disrupted patterns that translate into exemplars of negative cases. Two classic examples of this are Becker’s finding of what medical students call “crocks” (Becker et al. 1961 :317), and Geertz’s ( 1973 ) study of “deep play” in Balinese society.

By getting and staying so close to their data – be it pictures, text or humans interacting (Becker was himself a musician) – for a long time, as the research progressively focuses, qualitative researchers are prompted to continually test their hunches, presuppositions and hypotheses. They test them against a reality that often (but certainly not always), and practically, as well as metaphorically, talks back, whether by validating them, or disqualifying their premises – correctly, as well as incorrectly (Fine 2003 ; Becker 1970 ). This testing nonetheless often leads to new directions for the research. Becker, for example, says that he was initially reading psychological theories, but when facing the data he develops a theory that looks at, you may say, everything but psychological dispositions to explain the use of marihuana. Especially researchers involved with ethnographic methods have a fairly unique opportunity to dig up and then test (in a circular, continuous and temporal way) new research questions and findings as the research progresses, and thereby to derive previously unimagined and uncharted distinctions by getting closer to the phenomenon under study.

Let us stress that getting close is by no means restricted to ethnography. The notion of the hermeneutic circle, and hermeneutics as a general way of understanding, implies that we must get close to the details in order to get the big picture. This also means that qualitative researchers can quite literally make use of details of pictures as evidence (cf. Harper 2002). Thus, researchers may get closer both when generating the material and when analyzing it.

Quantitative research, we maintain, in the ideal-typical representation cannot get closer to the data. The data is essentially numbers in tables making up the variables (Franzosi 2016 :138). The data may originally have been “qualitative,” but once reduced to numbers there can only be a type of “hermeneutics” about what the number may stand for. The numbers themselves, however, are non-ambiguous. Thus, in quantitative research, interpretation, if done, is not about the data itself—the numbers—but what the numbers stand for. It follows that the interpretation is essentially done in a more “speculative” mode without direct empirical evidence (cf. Becker 2017 ).

Improved Understanding

While distinction, process and getting closer refer to the qualitative work of the researcher, improved understanding refers to the conditions and outcome of this work. Understanding cuts deeper than explanation, which to some may mean a causally verified correlation between variables. The notion of explanation presupposes the notion of understanding since explanation does not include an idea of how knowledge is gained (Manicas 2006 : 15). Understanding, we argue, is the core concept of what we call the outcome of the process when research has made use of all the other elements that were integrated in the research. Understanding, then, has a special status in qualitative research since it refers both to the conditions of knowledge and the outcome of the process. Understanding can to some extent be seen as the condition of explanation and occurs in a process of interpretation, which naturally refers to meaning (Gadamer 1990 ). It is fundamentally connected to knowing, and to the knowing of how to do things (Heidegger [1927] 2001 ). Conceptually the term hermeneutics is used to account for this process. Heidegger ( 1988 ) ties hermeneutics to human being, and holds that it cannot be separated from the understanding of being. Here we use it in a broader sense, and more connected to method in general (cf. Seiffert 1992 ). The abovementioned aspects – for example, “objectivity” and “reflexivity” – of the approach are conditions of scientific understanding. Understanding is the result of a circular process and means that the parts are understood in light of the whole, and vice versa. Understanding presupposes pre-understanding, or in other words, some knowledge of the phenomenon studied. The pre-understanding, even in the form of prejudices, is questioned in the qualitative research process, which we see as iterative, and it gradually or suddenly changes due to the iteration of data, evidence and concepts. Thus, qualitative research generates understanding in the iterative process when the researcher gets closer to the data, e.g., by going back and forth between field and analysis in a process that generates new data that changes the evidence, and, ultimately, the findings. Questioning, that is, asking questions and putting what one assumes – prejudices and presumptions – in question, is central to understanding something (Heidegger [1927] 2001 ; Gadamer 1990 :368–384). We propose that this iterative process in which the process of understanding occurs is characteristic of qualitative research.

Improved understanding means that we obtain scientific knowledge of something that we as a scholarly community did not know before, or that we get to know something better. It means that we understand more about how parts are related to one another, and to other things we already understand (see also Fine and Hallett 2014 ). Understanding is an important condition for qualitative research. It is not enough to identify correlations, make distinctions, and work in a process in which one gets close to the field or phenomena. Understanding is accomplished when the elements are integrated in an iterative process.

It is, moreover, possible to understand many things, and researchers, just like children, may come to understand new things every day as they engage with the world. This subjective condition of understanding – namely, that a person gains a better understanding of something –is easily met. To be qualified as “scientific,” the understanding must be general and useful to many; it must be public. But even this generally accessible understanding is not enough in order to speak of “scientific understanding.” Though we as a collective can increase understanding of everything in virtually all potential directions as a result also of qualitative work, we refrain from this “objective” way of understanding, which has no means of discriminating between what we gain in understanding. Scientific understanding means that it is deemed relevant from the scientific horizon (compare Schütz 1962 : 35–38, 46, 63), and that it rests on the pre-understanding that the scientists have and must have in order to understand. In other words, the understanding gained must be deemed useful by other researchers, so that they can build on it. We thus see understanding from a pragmatic, rather than a subjective or objective perspective. Improved understanding is related to the question(s) at hand. Understanding, in order to represent an improvement, must be an improvement in relation to the existing body of knowledge of the scientific community (James [ 1907 ] 1955). Scientific understanding is, by definition, collective, as expressed in Weber’s famous note on objectivity, namely that scientific work aims at truths “which … can claim, even for a Chinese, the validity appropriate to an empirical analysis” ([1904] 1949 :59). By qualifying “improved understanding” we argue that it is a general defining characteristic of qualitative research. Becker‘s ( 1966 ) study and other research of deviant behavior increased our understanding of the social learning processes of how individuals start a behavior. And it also added new knowledge about the labeling of deviant behavior as a social process. Few studies, of course, make the same large contribution as Becker’s, but are nonetheless qualitative research.

Understanding in the phenomenological sense, which is a hallmark of qualitative research, we argue, requires meaning, and this meaning is derived from the context, and above all from the data being analyzed. The ideal-typical quantitative research operates with given variables with different numbers. This type of material is not enough to establish meaning at the level that truly justifies understanding. In other words, many social science explanations offer ideas about correlations or even causal relations, but this does not mean that the meaning at the level of the data analyzed is understood. This leads us to say that there are indeed many explanations that meet the criteria of understanding, for example the explanation of how one becomes a marihuana smoker presented by Becker. However, we may also understand a phenomenon without explaining it, and we may have potential explanations, or, better, correlations, that are not really understood.

We may speak more generally of quantitative research and its data to clarify what we see as an important distinction. The “raw data” that quantitative research, as an ideal-typical activity, refers to is not available for further analysis; the numbers, once created, are not to be questioned (Franzosi 2016 : 138). If the researcher is to do “more” or “change” something, this will be done by conjectures based on theoretical knowledge or based on the researcher’s lifeworld. Both qualitative and quantitative research are based on the lifeworld, and all researchers use prejudices and pre-understanding in the research process. This idea is present in the works of Heidegger ( 2001 ) and Heisenberg (cited in Franzosi 2010 :619). Qualitative research, as we argued, involves the interaction and questioning of concepts (theory), data, and evidence.

Ragin ( 2004 :22) points out that “a good definition of qualitative research should be inclusive and should emphasize its key strengths and features, not what it lacks (for example, the use of sophisticated quantitative techniques).” We define qualitative research as an iterative process in which improved understanding to the scientific community is achieved by making new significant distinctions resulting from getting closer to the phenomenon studied. Qualitative research, as defined here, is consequently a combination of two criteria: (i) how to do things – namely, generating and analyzing empirical material, in an iterative process in which one gets closer by making distinctions, and (ii) the outcome – improved understanding novel to the scholarly community. Is our definition applicable to our own study? In this study we have closely read the empirical material that we generated, and the novel distinction of the notion “qualitative research” is the outcome of an iterative process in which both deduction and induction were involved, and in which we identified the categories that we analyzed. We thus claim to meet the first criterion, “how to do things.” The second criterion can only be judged by us in a partial way, namely whether the “outcome” – in concrete form, the definition – improves the understanding of others in the scientific community.

We have defined qualitative research, or qualitative scientific work, in relation to quantitative scientific work. Given this definition, qualitative research is about questioning the pre-given (taken for granted) variables, but it is thus also about making new distinctions of any type of phenomenon, for example, by coining new concepts, including the identification of new variables. This process, as we have discussed, is carried out in relation to empirical material, previous research, and thus in relation to theory. Theory and previous research cannot be escaped or bracketed. According to hermeneutic principles all scientific work is grounded in the lifeworld, and as social scientists we can thus never fully bracket our pre-understanding.

We have proposed that quantitative research, as an ideal type, is concerned with pre-determined variables (Small 2008 ). Variables are epistemically fixed, but can vary in terms of dimensions, such as frequency or number. Age is an example; as a variable it can take on different numbers. In relation to quantitative research, qualitative research does not reduce its material to numbers and variables. If this is done, the process comes to a halt, the researcher gets more distanced from her data, and it is no longer possible to make new distinctions that increase our understanding. We have above discussed the components of our definition in relation to quantitative research. Our conclusion is that in the research that is called quantitative there are frequent and necessary qualitative elements.

Further, comparative empirical research on researchers primarily working with “quantitative” approaches and those working with “qualitative” approaches, we propose, would perhaps show that there are many similarities in the practices of these two approaches. This is not to deny dissimilarities, or the different epistemic and ontic presuppositions that may be more or less strongly associated with the two different strands (see Goertz and Mahoney 2012 ). Our point is nonetheless that prejudices and preconceptions about researchers are unproductive, and that, as other researchers have argued, differences may be exaggerated (e.g., Becker 1996 : 53, 2017 ; Marchel and Owens 2007 :303; Ragin 1994 ), and that a qualitative dimension is present in both kinds of work.

Several things follow from our findings. The most important result is the relation to quantitative research. In our analysis we have separated qualitative research from quantitative research. The point is not to label individual researchers, methods, projects, or works as either “quantitative” or “qualitative.” By analyzing, i.e., taking apart, the notions of quantitative and qualitative, we hope to have shown the elements of qualitative research. Our definition captures the elements, and how they, when combined in practice, generate understanding. As many of the quotations we have used suggest, one conclusion of our study holds that qualitative approaches are not inherently connected with a specific method. Put differently, none of the methods that are frequently labelled “qualitative,” such as interviews or participant observation, are inherently “qualitative.” What matters, given our definition, is whether one works qualitatively or quantitatively in the research process, until the results are produced. Consequently, our analysis also suggests that those researchers working with what in the literature and in jargon is often called “quantitative research” are almost bound to make use of what we have identified as qualitative elements in any research project. Our findings also suggest that many “quantitative” researchers, at least to some extent, are engaged with qualitative work, such as when research questions are developed, variables are constructed and combined, and hypotheses are formulated. Furthermore, a research project may hover between “qualitative” and “quantitative,” or start out as “qualitative” and later move into a “quantitative” one (a distinct strategy that is not similar to “mixed methods” or just simply combining induction and deduction). More generally speaking, the categories of “qualitative” and “quantitative,” unfortunately, often cover up practices, and this may lead to “camps” of researchers opposing one another. For example, regardless of whether the researcher is primarily oriented to “quantitative” or “qualitative” research, the role of theory is neglected (cf. Swedberg 2017 ). Our results open the way for an interaction characterized not by differences, but by different emphases, and by similarities.

Let us take two examples to briefly indicate how qualitative elements can fruitfully be combined with quantitative. Franzosi ( 2010 ) has discussed the relations between quantitative and qualitative approaches, and more specifically the relation between words and numbers. He analyzes texts and argues that scientific meaning cannot be reduced to numbers. Put differently, the meaning of the numbers is to be understood by what is taken for granted, and what is part of the lifeworld (Schütz 1962 ). Franzosi shows how one can go about using qualitative and quantitative methods and data to address scientific questions analyzing violence in Italy at the time when fascism was rising (1919–1922). Aspers ( 2006 ) studied the meaning of fashion photographers. He uses an empirical phenomenological approach, and establishes meaning at the level of actors. In a second step this meaning, and the different ideal-typical photographers constructed as a result of participant observation and interviews, are tested using quantitative data from a database; in the first phase to verify the different ideal-types, in the second phase to use these types to establish new knowledge about the types. In both of these cases—and more examples can be found—authors move from qualitative data and try to keep the meaning established when using the quantitative data.

A second main result of our study is that a definition, and we provided one, offers a way for research to clarify, and even evaluate, what is done. Hence, our definition can guide researchers and students, informing them on how to think about concrete research problems they face, and to show what it means to get closer in a process in which new distinctions are made. The definition can also be used to evaluate the results, given that it is a standard of evaluation (cf. Hammersley 2007 ), to see whether new distinctions are made and whether this improves our understanding of what is researched, in addition to the evaluation of how the research was conducted. By making what is qualitative research explicit it becomes easier to communicate findings, and it is thereby much harder to fly under the radar with substandard research since there are standards of evaluation which make it easier to separate “good” from “not so good” qualitative research.

To conclude, our analysis, which ends with a definition of qualitative research, can thus address both the “internal” issue of what qualitative research is, and the “external” critiques that make it harder to do qualitative research, to which both pressure from quantitative methods and general changes in society contribute.

Åkerström, Malin. 2013. Curiosity and serendipity in qualitative research. Qualitative Sociology Review 9 (2): 10–18.


Alford, Robert R. 1998. The craft of inquiry. Theories, methods, evidence . Oxford: Oxford University Press.

Alvesson, Mats, and Dan Kärreman. 2011. Qualitative research and theory development. Mystery as method . London: SAGE Publications.


Aspers, Patrik. 2006. Markets in fashion: A phenomenological approach. London: Routledge.

Atkinson, Paul. 2005. Qualitative research. Unity and diversity. Forum: Qualitative Social Research 6 (3): 1–15.

Becker, Howard S. 1963. Outsiders. Studies in the sociology of deviance . New York: The Free Press.

Becker, Howard S. 1966. Whose side are we on? Social Problems 14 (3): 239–247.


Becker, Howard S. 1970. Sociological work. Method and substance . New Brunswick: Transaction Books.

Becker, Howard S. 1996. The epistemology of qualitative research. In Ethnography and human development. Context and meaning in social inquiry , ed. Jessor Richard, Colby Anne, and Richard A. Shweder, 53–71. Chicago: University of Chicago Press.

Becker, Howard S. 1998. Tricks of the trade. How to think about your research while you're doing it . Chicago: University of Chicago Press.

Becker, Howard S. 2017. Evidence. Chicago: University of Chicago Press.

Becker, Howard, Blanche Geer, Everett Hughes, and Anselm Strauss. 1961. Boys in White, student culture in medical school . New Brunswick: Transaction Publishers.

Berezin, Mabel. 2014. How do we know what we mean? Epistemological dilemmas in cultural sociology. Qualitative Sociology 37 (2): 141–151.

Best, Joel. 2004. Defining qualitative research. In Workshop on Scientific Foundations of Qualitative Research, eds. Charles Ragin, Joanne Nagel, and Patricia White, 53–54. http://www.nsf.gov/pubs/2004/nsf04219/nsf04219.pdf .

Biernacki, Richard. 2014. Humanist interpretation versus coding text samples. Qualitative Sociology 37 (2): 173–188.

Blumer, Herbert. 1969. Symbolic interactionism: Perspective and method . Berkeley: University of California Press.

Brady, Henry, David Collier, and Jason Seawright. 2004. Refocusing the discussion of methodology. In Rethinking social inquiry. Diverse tools, shared standards , ed. Brady Henry and Collier David, 3–22. Lanham: Rowman and Littlefield.

Brown, Allison P. 2010. Qualitative method and compromise in applied social research. Qualitative Research 10 (2): 229–248.

Charmaz, Kathy. 2006. Constructing grounded theory . London: Sage.

Corte, Ugo, and Katherine Irwin. 2017. “The Form and Flow of Teaching Ethnographic Knowledge: Hands-on Approaches for Learning Epistemology” Teaching Sociology 45(3): 209-219.

Creswell, John W. 2009. Research design. Qualitative, quantitative, and mixed method approaches . 3rd ed. Thousand Oaks: SAGE Publications.

Davidsson, David. [1988] 2001. The myth of the subjective. In Subjective, intersubjective, objective , ed. David Davidsson, 39–52. Oxford: Oxford University Press.

Denzin, Norman K. 1970. The research act: A theoretical introduction to sociological methods. Chicago: Aldine Publishing Company.

Denzin, Norman K., and Yvonna S. Lincoln. 2003. Introduction. The discipline and practice of qualitative research. In Collecting and interpreting qualitative materials , ed. Norman K. Denzin and Yvonna S. Lincoln, 1–45. Thousand Oaks: SAGE Publications.

Denzin, Norman K., and Yvonna S. Lincoln. 2005. Introduction. The discipline and practice of qualitative research. In The Sage handbook of qualitative research , ed. Norman K. Denzin and Yvonna S. Lincoln, 1–32. Thousand Oaks: SAGE Publications.

Emerson, Robert M., ed. 1988. Contemporary field research. A collection of readings . Prospect Heights: Waveland Press.

Emerson, Robert M., Rachel I. Fretz, and Linda L. Shaw. 1995. Writing ethnographic fieldnotes . Chicago: University of Chicago Press.

Esterberg, Kristin G. 2002. Qualitative methods in social research . Boston: McGraw-Hill.

Fine, Gary Alan. 1995. Review of “handbook of qualitative research.” Contemporary Sociology 24 (3): 416–418.

Fine, Gary Alan. 2003. “ Toward a Peopled Ethnography: Developing Theory from Group Life.” Ethnography . 4(1):41-60.

Fine, Gary Alan, and Black Hawk Hancock. 2017. The new ethnographer at work. Qualitative Research 17 (2): 260–268.

Fine, Gary Alan, and Timothy Hallett. 2014. Stranger and stranger: Creating theory through ethnographic distance and authority. Journal of Organizational Ethnography 3 (2): 188–203.

Flick, Uwe. 2002. Qualitative research. State of the art. Social Science Information 41 (1): 5–24.

Flick, Uwe. 2007. Designing qualitative research . London: SAGE Publications.

Frankfort-Nachmias, Chava, and David Nachmias. 1996. Research methods in the social sciences . 5th ed. London: Edward Arnold.

Franzosi, Roberto. 2010. Sociology, narrative, and the quality versus quantity debate (Goethe versus Newton): Can computer-assisted story grammars help us understand the rise of Italian fascism (1919- 1922)? Theory and Society 39 (6): 593–629.

Franzosi, Roberto. 2016. From method and measurement to narrative and number. International journal of social research methodology 19 (1): 137–141.

Gadamer, Hans-Georg. 1990. Wahrheit und Methode, Grundzüge einer philosophischen Hermeneutik . Band 1, Hermeneutik. Tübingen: J.C.B. Mohr.

Gans, Herbert. 1999. Participant Observation in an Age of “Ethnography”. Journal of Contemporary Ethnography 28 (5): 540–548.

Geertz, Clifford. 1973. The interpretation of cultures . New York: Basic Books.

Gilbert, Nigel. 2009. Researching social life . 3rd ed. London: SAGE Publications.

Glaeser, Andreas. 2014. Hermeneutic institutionalism: Towards a new synthesis. Qualitative Sociology 37: 207–241.

Glaser, Barney G., and Anselm L. Strauss. [1967] 2010. The discovery of grounded theory. Strategies for qualitative research. Hawthorne: Aldine.

Goertz, Gary, and James Mahoney. 2012. A tale of two cultures: Qualitative and quantitative research in the social sciences . Princeton: Princeton University Press.

Goffman, Erving. 1989. On fieldwork. Journal of Contemporary Ethnography 18 (2): 123–132.

Goodwin, Jeff, and Ruth Horowitz. 2002. Introduction. The methodological strengths and dilemmas of qualitative sociology. Qualitative Sociology 25 (1): 33–47.

Habermas, Jürgen. [1981] 1987. The theory of communicative action . Oxford: Polity Press.

Hammersley, Martyn. 2007. The issue of quality in qualitative research. International Journal of Research & Method in Education 30 (3): 287–305.

Hammersley, Martyn. 2013. What is qualitative research? Bloomsbury Publishing.

Hammersley, Martyn. 2018. What is ethnography? Can it survive should it? Ethnography and Education 13 (1): 1–17.

Hammersley, Martyn, and Paul Atkinson. 2007. Ethnography. Principles in practice . London: Tavistock Publications.

Heidegger, Martin. [1927] 2001. Sein und Zeit . Tübingen: Max Niemeyer Verlag.

Heidegger, Martin. [1923] 1988. Ontologie. Hermeneutik der Faktizität. Gesamtausgabe II. Abteilung: Vorlesungen 1919-1944, Band 63. Frankfurt am Main: Vittorio Klostermann.

Hempel, Carl G. 1966. Philosophy of the natural sciences . Upper Saddle River: Prentice Hall.

Hood, Jane C. 2006. Teaching against the text. The case of qualitative methods. Teaching Sociology 34 (3): 207–223.

James, William. [1907] 1955. Pragmatism. New York: Meridian Books.

Jovanović, Gordana. 2011. Toward a social history of qualitative research. History of the Human Sciences 24 (2): 1–27.

Kalof, Linda, Amy Dan, and Thomas Dietz. 2008. Essentials of social research . London: Open University Press.

Katz, Jack. 2015. Situational evidence: Strategies for causal reasoning from observational field notes. Sociological Methods & Research 44 (1): 108–144.

King, Gary, Robert O. Keohane, and Sidney Verba. 1994. Designing social inquiry: Scientific inference in qualitative research. Princeton: Princeton University Press.


Lamont, Michelle. 2004. Evaluating qualitative research: Some empirical findings and an agenda. In Report from workshop on interdisciplinary standards for systematic qualitative research , ed. M. Lamont and P. White, 91–95. Washington, DC: National Science Foundation.

Lamont, Michèle, and Ann Swidler. 2014. Methodological pluralism and the possibilities and limits of interviewing. Qualitative Sociology 37 (2): 153–171.

Lazarsfeld, Paul, and Alan Barton. 1982. Some functions of qualitative analysis in social research. In The varied sociology of Paul Lazarsfeld , ed. Patricia Kendall, 239–285. New York: Columbia University Press.

Lichterman, Paul, and Isaac Reed. 2014. Theory and contrastive explanation in ethnography. Sociological Methods & Research. Prepublished 27 October 2014. https://doi.org/10.1177/0049124114554458 .

Lofland, John, and Lyn Lofland. 1995. Analyzing social settings. A guide to qualitative observation and analysis . 3rd ed. Belmont: Wadsworth.

Lofland, John, David A. Snow, Leon Anderson, and Lyn H. Lofland. 2006. Analyzing social settings. A guide to qualitative observation and analysis . 4th ed. Belmont: Wadsworth/Thomson Learning.

Long, Andrew F., and Mary Godfrey. 2004. An evaluation tool to assess the quality of qualitative research studies. International Journal of Social Research Methodology 7 (2): 181–196.

Lundberg, George. 1951. Social research: A study in methods of gathering data . New York: Longmans, Green and Co..

Malinowski, Bronislaw. 1922. Argonauts of the Western Pacific: An account of native Enterprise and adventure in the archipelagoes of Melanesian New Guinea . London: Routledge.

Manicas, Peter. 2006. A realist philosophy of science: Explanation and understanding . Cambridge: Cambridge University Press.

Mannay, Dawn, and Melanie Morgan. 2015. Doing ethnography or applying a qualitative technique? Reflections from the ‘waiting field’. Qualitative Research 15 (2): 166–182.

Marchel, Carol, and Stephanie Owens. 2007. Qualitative research in psychology. Could William James get a job? History of Psychology 10 (4): 301–324.

McIntyre, Lisa J. 2005. Need to know. Social science research methods. Boston: McGraw-Hill.

Merton, Robert K., and Elinor Barber. 2004. The travels and adventures of serendipity. A study in sociological semantics and the sociology of science. Princeton: Princeton University Press.

Neuman, Lawrence W. 2007. Basics of social research. Qualitative and quantitative approaches . 2nd ed. Boston: Pearson Education.

Ragin, Charles C. 1994. Constructing social research. The unity and diversity of method . Thousand Oaks: Pine Forge Press.

Ragin, Charles C. 2004. Introduction to session 1: Defining qualitative research. In Workshop on Scientific Foundations of Qualitative Research, 22, ed. Charles C. Ragin, Joanne Nagel, and Patricia White. http://www.nsf.gov/pubs/2004/nsf04219/nsf04219.pdf

Rawls, Anne. 2018. The Wartime narrative in US sociology, 1940–7: Stigmatizing qualitative sociology in the name of ‘science,’ European Journal of Social Theory (Online first).

Schütz, Alfred. 1962. Collected papers I: The problem of social reality . The Hague: Nijhoff.

Seiffert, Helmut. 1992. Einführung in die Hermeneutik . Tübingen: Franke.

Silverman, David. 2005. Doing qualitative research. A practical handbook . 2nd ed. London: SAGE Publications.

Silverman, David. 2009. A very short, fairly interesting and reasonably cheap book about qualitative research . London: SAGE Publications.

Silverman, David. 2013. What counts as qualitative research? Some cautionary comments. Qualitative Sociology Review 9 (2): 48–55.

Small, Mario L. 2009. “How many cases do I need?” on science and the logic of case selection in field-based research. Ethnography 10 (1): 5–38.

Small, Mario L. 2008. Lost in translation: How not to make qualitative research more scientific. In Workshop on interdisciplinary standards for systematic qualitative research, ed. Michelle Lamont and Patricia White, 165–171. Washington, DC: National Science Foundation.

Snow, David A., and Leon Anderson. 1993. Down on their luck: A study of homeless street people . Berkeley: University of California Press.

Snow, David A., and Calvin Morrill. 1995. New ethnographies: Review symposium: A revolutionary handbook or a handbook for revolution? Journal of Contemporary Ethnography 24 (3): 341–349.

Strauss, Anselm L. 2003. Qualitative analysis for social scientists . 14th ed. Chicago: Cambridge University Press.

Strauss, Anselm L., and Juliette M. Corbin. 1998. Basics of qualitative research. Techniques and procedures for developing grounded theory . 2nd ed. Thousand Oaks: Sage Publications.

Swedberg, Richard. 2017. Theorizing in sociological research: A new perspective, a new departure? Annual Review of Sociology 43: 189–206.

Swedberg, Richard. 1990. The new 'Battle of Methods'. Challenge January–February 3 (1): 33–38.

Timmermans, Stefan, and Iddo Tavory. 2012. Theory construction in qualitative research: From grounded theory to abductive analysis. Sociological Theory 30 (3): 167–186.

Trier-Bieniek, Adrienne. 2012. Framing the telephone interview as a participant-centred tool for qualitative research. A methodological discussion. Qualitative Research 12 (6): 630–644.

Valsiner, Jaan. 2000. Data as representations. Contextualizing qualitative and quantitative research strategies. Social Science Information 39 (1): 99–113.

Weber, Max. [1904] 1949. ‘Objectivity’ in social science and social policy. Ed. Edward A. Shils and Henry A. Finch, 49–112. New York: The Free Press.


Acknowledgements

Financial Support for this research is given by the European Research Council, CEV (263699). The authors are grateful to Susann Krieglsteiner for assistance in collecting the data. The paper has benefitted from the many useful comments by the three reviewers and the editor, comments by members of the Uppsala Laboratory of Economic Sociology, as well as Jukka Gronow, Sebastian Kohl, Marcin Serafin, Richard Swedberg, Anders Vassenden and Turid Rødne.

Author information

Authors and affiliations.

Department of Sociology, Uppsala University, Uppsala, Sweden

Patrik Aspers

Seminar for Sociology, Universität St. Gallen, St. Gallen, Switzerland

Department of Media and Social Sciences, University of Stavanger, Stavanger, Norway


Corresponding author

Correspondence to Patrik Aspers .

Additional information

Publisher’s note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.


About this article

Aspers, P., Corte, U. What is Qualitative in Qualitative Research. Qual Sociol 42, 139–160 (2019). https://doi.org/10.1007/s11133-019-9413-7


Published : 27 February 2019

Issue Date : 01 June 2019

DOI : https://doi.org/10.1007/s11133-019-9413-7


Keywords

  • Qualitative research
  • Epistemology
  • Philosophy of science
  • Phenomenology


Research Basics - All Subjects

  • Getting Started
  • Evaluating Resources
  • Generating Search Terms
  • Citations, Copyright, Plagiarism
  • What is AI Plagiarism?
  • Physical Library Resources
  • Digital Library Resources
  • Videos / Tutorials on Search Skills
  • Primary Sources
  • Subject Guides: Find your Major This link opens in a new window
  • Open Educational Resources (OER) This link opens in a new window
  • Qualitative, Quantitative & Empirical Research

Quantitative Research

  • Purpose: Supports a hypothesis through a review of the literature
  • Aim: Provides a statistical model of what the literature presents
  • Previous Knowledge: Researcher already knows what has been discovered
  • Phase in Process: Generally occurs later in the research process
  • Research Design: Designed before research begins
  • Data-Gathering: Data is gathered using tools like surveys or computer programs
  • Form of Data: Data is numerical
  • Objectivity of Research: More objective; researcher measures and analyzes data
  • Keywords: Quantitative, survey, literature review

Qualitative Research

  • Purpose: Used for exploration, generates a hypothesis
  • Aim: Provides an in-depth description of the research methods to be used
  • Previous Knowledge: Researcher has a general idea of what will be discovered
  • Phase in Process: Usually occurs early in the research process
  • Research Design: Design is developed during research
  • Data-Gathering: Researcher gathers data from interviews, etc.
  • Form of Data: Data takes the form of interviews, videos, artifacts
  • Objectivity of Research: More subjective; researcher interprets events
  • Keywords: Qualitative, methods, results, interviews

Empirical Studies

  • An empirical study is research derived from actual observation or experimentation.
  • The written articles resulting from empirical studies undergo a rigorous review by experts in the field of study prior to being published in journals.
  • After passing this review the articles are published in a scholarly, peer-reviewed, or academic journal.
  • Empirical study articles will generally contain the following features:
  • Abstract - This is a summary of the article.
  • Introduction - This is often identified as the hypothesis of the study and describes the researcher's intent.
  • Method - A description of how the research was conducted.
  • Results - A description of the findings obtained as a result of the research. Most often answers the hypothesis.
  • Conclusion - A description of how/if the findings were successful and the impact made as a result.
  • References - A detailed listing of all resources cited in the article that support the written work.
  • Keywords: empirical, experiment, methodology, observation, outcomes, sample size, statistical analysis, study
         

Mixed Methods Research

Mixed Methods Research uses strategies from both qualitative and quantitative research processes to provide a greater understanding of the subject matter.

  • << Previous: Open Educational Resources (OER)
  • Next: Help >>
  • Last Updated: Jun 21, 2024 12:59 PM
  • URL: https://campbellsville.libguides.com/researchbasics


NCBI Bookshelf. A service of the National Library of Medicine, National Institutes of Health.

Scarbrough H, D’Andreta D, Evans S, et al. Networked innovation in the health sector: comparative qualitative study of the role of Collaborations for Leadership in Applied Health Research and Care in translating research into practice. Southampton (UK): NIHR Journals Library; 2014 May. (Health Services and Delivery Research, No. 2.13.)


Chapter 4 Empirical analysis and findings: qualitative investigation

  • Introduction

As noted previously, to be able to address the different dimensions of CLAHRC activity relevant to our study, we adopted a ‘multilevel’ approach in our fieldwork and analysis 97 that sought to integrate evidence from both our CLAHRC-level and our project-level data collection to provide a coherent, narratively structured account of the CLAHRCs’ development.

The overall approach that we adopted to data analysis incorporated a hybrid process combining both inductive and deductive thematic analysis of interview data. 98 At a basic level, thematic analysis of interview data is simply where coding ‘is used to break up and segment the data into simpler, general categories and expand and tease out the data in order to formulate new questions and levels of interpretation’ (p. 30). 99 It was important to recognise that, in building our study on a theoretical concern with networked innovation, 31 , 79 we had already made assumptions and developed ideas about the focus for the analysis. However, it was also important to allow our analysis to be data driven to allow new ideas to emerge during the process of coding. Therefore, we needed to develop an approach that allowed us to make use of our preconceived ideas and theoretical underpinning, while still maintaining the inductive flexibility of an approach that supports the generation and development of new ideas.

As interpretive research still needs to demonstrate credibility and trustworthiness by being founded on systematic evidence of the research process, our data analysis was supported by a structured method. This combined ‘data-driven’ steps, in which we inductively developed codes from interesting ideas and themes that emerged from our study of transcripts, with phases of review in which we reflected on how these ideas fitted with the overall objectives of our study. Therefore, although our research analysis was based on a linear ‘step-by-step’ procedure, this still facilitated an iterative and reflexive process. 98 However, in following a structured approach, we were able to continually reframe our analysis, drawing both on ideas from inductive study and on our theoretical grounding as an integral part of the generation of codes.

We used NVivo to support our data analysis. While NVivo can be used to support a more objective and logical categorisation of codes, we should recognise that this is only an aid to the organisation of the material and is not in itself an interpretive device.
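
To make the hybrid coding process described above more concrete, here is a rough, purely illustrative sketch; it is not drawn from the study's actual NVivo project, and all code labels, sources and excerpts are hypothetical. It shows how a codebook seeded with theory-driven (deductive) codes can be extended with codes that emerge inductively while reading transcripts, and how simple frequency counts can help organise, though never interpret, the material.

    # Illustrative sketch only: a hybrid deductive/inductive codebook.
    # All codes, sources and excerpts below are hypothetical examples.
    from collections import defaultdict

    # Deductive codes seeded from the study's theoretical framing
    codebook = {"governance", "collaboration", "knowledge_translation"}

    # Transcript excerpts a reader has already tagged (hypothetical data)
    tagged_excerpts = [
        ("INTERVIEW_A", "publishing in high-quality journals is key to credibility",
         {"knowledge_translation"}),
        ("INTERVIEW_B", "academics have to understand the democratisation of research",
         {"collaboration", "role_change"}),  # 'role_change' emerged inductively
    ]

    code_counts = defaultdict(int)
    for source, text, codes in tagged_excerpts:
        for code in codes:
            codebook.add(code)        # inductive codes extend the seeded codebook
            code_counts[code] += 1    # counting organises material; it is not interpretation

    for code in sorted(codebook):
        print(f"{code}: {code_counts[code]} excerpt(s)")

In practice this kind of bookkeeping sits inside software such as NVivo and, as the authors note, it only organises the material; the interpretive work remains with the researchers.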

To structure the individual case narratives outlined below, we have adopted three major headings which reflect our conceptual framework and support critical concerns around the development of the CLAHRCs. These headings are as follows: governance, management and organisation; collaboration and networks; and KT. To begin our account, however, we focus on the way in which the goals of the CLAHRC initiative were appropriated by individual CLAHRCs in terms of the vision which they defined for themselves.

The vision of the Collaborations for Leadership in Applied Health Research and Care

As the three case sites of translational initiatives within the UK were all created through the same UK NIHR funding programme, they were all designed to meet the same aim and generic mission. However, there was significant flexibility in the way in which this mission was interpreted by the leaders of different CLAHRCs. We term these interpretive acts of leadership as different ‘visions’ of collaborative translational research. Within our study we have explored how the vision of each CLAHRC has emerged from and interacted with the structuring of the initiative, particularly in terms of management and governance. By studying the CLAHRCs’ development over time, then, the qualitative fieldwork has been able to explore how these distinctive features of each CLAHRC influence their approach to KT.

  • Bluetown Collaboration for Leadership in Applied Health Research and Care

The Bluetown CLAHRC is based on a partnership with organisations from a large urban area. It is led by a university hospital with an established strong reputation in conducting research. The health-care partners are representative of a range of organisation types, including acute hospital, primary care and mental health trusts, which includes both organisations with extensive research experience and those that have been previously less involved. The CLAHRC was originally established around a simple ‘hub-and-spoke’ model of a small central management team and nine clinical project teams. The core of the CLAHRC, including its management team and several of the clinical project teams, is centred on a traditional medical school public health department with high-profile academic expertise in clinical sciences research, and historic links with the lead NHS site. Each project team is largely composed of members based in the same geographical base, with a number of teams based at the university, and other clinical project teams are located within one of the health-care partnership organisations. Specialist support services were included as a CLAHRC-wide resource, providing each clinical project team with access to people who could contribute medical sociology, health economics, methods such as systematic reviewing, and statistical expertise.

The director was integral to developing the vision for this CLAHRC and for embedding this within the different clinical projects. Throughout the development of this CLAHRC, his vision has been strongly influential on the form that the work programmes within the CLAHRC have taken as they have progressed. In particular, a clinical scientific tradition was incorporated into the design of study protocols at the start of the programme, with particular attention being placed on scientific methodological rigour, especially the production of outputs suitable for top-quality, peer-reviewed academic publication. In particular, all of the clinical projects were designed as prospective evaluation clinical-academic research studies and, therefore, constituted a set of work programmes all linked by a common scientific approach.

It’s to prospectively evaluate service delivery as it happens. And where possible to interact, you know, with managers and how the service delivery takes place. So that the product will be examples where this has happened prospectively and good examples that have been published in good places. In the process of doing that to export the idea or develop the idea in the local area. BLUETOWN001

This illustrates a cornerstone of the Bluetown CLAHRC model: the vision emphasises that the quality of the evidence being produced is crucial to its ultimate impact. As a result, the vision of this CLAHRC was founded on the view that any programme of work should first be grounded in a rigorous scientific approach, as only high-quality evidence should be taken up within health-care policy and practice.

Management, governance and organisation

The CLAHRC was originally formed around a small central leadership team, with the vision of the director strongly influencing the focus and direction of the CLAHRC model. As the director had a historically strong reputation in the local area, this helped to legitimise the CLAHRC as something that was perceived as of value by those in senior positions within the partner organisations.

The vision of the core management team has been strongly influential on the approach that each programme of work uses. Each programme of work is expected to use a rigorous scientific design and methodology in order to produce robust evidence that is suitable for publication in high-quality academic journals. Therefore, the model builds on the approach to scientific work that was historically conducted by the lead organisations, with the CLAHRC emphasising that through these work programmes the teams should foster collaborative relationships with relevant service areas. This vision is emphasised through the role of the leaders within each of the project teams, who provide scientific and methodological direction to the programme of work. However, although there is recognition of the overarching objectives expected from each team by the central management, there is no CLAHRC-wide strategy for how each team should be structured or how collaborative relationships should be formed and developed. As a result, the lead of each project team has been provided with extensive flexibility on how their individual programme is organised. As a consequence, each team tended to foster relationships with particular groups and communities as relevant to the local services on which they were gathering evaluative evidence. The influence of the CLAHRC was important here in formalising and legitimising this collaboration between clinical academics and targeted groups in the NHS.

Without CLAHRC, we would have some of those connections but I think the momentum, thrust and energy that’s going into current programme really wouldn’t be there . . . associating with individuals from other fields, groups that we wouldn’t normally be part of. This has really allowed us to reflect more objectively on work, and the direction we’re going. BLUETOWN022

Structural features of the CLAHRC were used to communicate the overarching objectives of central management to the clinical project teams. This involved regular interactions between the centre and projects, management representation at project team meetings, and programmed meetings for project leaders and project managers. The positions which project members held in other environments (i.e. outside of their own team environment) were typically construed as ‘honorary’ – that is, not part of the main role which project members perform within their clinical team or central management group ( Box 1 ).

Case example: organisational processes by which the CLAHRC vision is distributed Observation of the interactions within a project meeting illustrated how the vision of the overall initiative is emphasised through the presence of a member of the core management (more...)

Although the majority of clinical project team members share similar types of disciplinary expertise, with most having clinical–academic experience, the structural organisation of the initiative provides access to other types of expertise. The extent of the CLAHRC-wide resources means that individuals with expertise such as health economics, statistics, systematic reviewing, sociology and communication are easily available for project teams to access. With the sociology theme, for example, each project team allocated a small proportion of their own resources to support the employment of a select number of people with this type of expertise. Although these team members come from different working cultures from the majority of the CLAHRC members, it is clear to the clinical project teams that the director values and respects the expertise that these individuals can provide. This helps to legitimise their contribution within the teams, even in sociological territory, which such teams would not normally view as part of their remit. At the start of CLAHRC, the cross-cutting activity for sociology was an undefined programme of work, but this provided an opportunity for these members to liaise with the clinical project teams to identify how they could support their programme of work. As relationships were built up, they quickly identified certain project teams where they could add value to the other work that was already planned by the team lead.

We are officially termed as a cross-cutting theme but we’re also embedded in the individual research, as in our jobs are paid out of individual projects. BLUETOWN003

This cross-cutting work has become an embedded part of a proportion of the clinical project teams. Although they provide a different type of expertise to the clinical projects, the members of this cross-cutting theme enact their role in such a way as to fit in with the overall work programme. While, overall, the cross-cutting theme constitutes only a small part of the CLAHRC, resources were deliberately allocated so that the members of this group would be highly skilled and experienced, and therefore able to achieve this. They have also been able to contribute guidance to more junior members of the project teams who are involved with areas that overlap with their area of expertise, such as qualitative components. Observation from in-depth studies of the four clinical themes indicates that this approach has facilitated the ‘embedding’ of the cross-cutting theme members within the project teams.

Networks and collaboration

The qualitative interviews demonstrated that from the early stages of the CLAHRC’s development, a clear objective was understood to focus on working with stakeholder groups, such as collaborating with NHS practitioners and managers within the clinical project work. There was also acknowledgement that this required some compromise with established academic work practices, with some effort being required to produce work that is suitable for practitioners.

Getting researchers to understand practitioners is a covert aim of CLAHRC. So that you don’t go away for five years and then tell them what they should have done in the first place because practitioners don’t want to hear that. CMBLUETOWN007

The activity of each of the clinical project teams means that they create links to defined health-care organisations involved in the CLAHRC partnership. The interaction between members of the project team (e.g. the project lead) is integral to fostering the relationships among the official partner organisations of the CLAHRC. As many senior CLAHRC members had pre-existing collaborative relationships with NHS trusts, they were able to enact ‘senior’ boundary-spanning roles. Many of the theme leads were in clinical–academic dual activity roles and held honorary contract positions with NHS organisations. However, their leadership typically reflected the wider ‘epistemic’ community of a university setting, emphasising academic values rather than the practical concerns seen in the health service environment. For university-based teams, the collaborative interaction was framed by the values of the academic community, with high-level academic publication considered as important for demonstrating value to these groups. This was seen as consistent with the vision of the CLAHRC, in that the collaboration is seen as creating a culture within health-care settings which is more receptive to high-quality scientific evidence.

It’s nonsense to say . . . the PCT health, local authority or the voluntary sector don’t consider evidence. They do. They just consider evidence perhaps in a different way than you or I perhaps might consider evidence . . . The CLAHRC process is about the better, the optimal decision making that we can bring, the greater rigour, to set different parameters for making the decision. That is the value. BLUETOWN014

Thus, the CLAHRC emphasises clinical–academic evaluation of service delivery in terms of producing high-quality evidence. As a result, the CLAHRC’s applied health research is conducted on the terms set by the clinical–academic community, to which other groups must be able to fit in. At the same time, the CLAHRC’s strong scientific reputation within certain fields helps it to build links with communities that value this type of evidence, such as national policy groups and certain local clinical groups. Certain groups of health-care managers and policy makers were perceived as valuing exactly this type of evidence.

Publishing all this information in top quality journals as absolutely, absolutely key. Not just to the academic credibility of CLAHRC but to the managerial credibility of CLAHRC . . . dealing with managers, when they want to know how good somebody is the first thing they do is see whether other academics respect that person. And one method of that is you publish in the top journals. They don’t want to deal with somebody who’s not up there at the cutting edge. They want to be with the front people. BLUETOWN001

The work taking place within these themes tends to be dominated by the traditions of the theme lead and the culture where the theme is physically located. Non-health care-based teams do use ‘in-built boundary spanners’, pre-existing contacts and mechanisms such as advisory boards to draw on insights about the local health-care context ( Box 2 ). This approach enables the teams to focus on their own academic areas of expertise but perhaps supports depth rather than breadth of approach.

Case example: clinical boundary spanner within a university-based team Although this CLAHRC did not have a formalised structure within its functional design to facilitate connections between clinical project teams, the flexibility provided for each subgroup (more...)

Bluetown CLAHRC also includes a number of themes based directly in health-care organisations. As the teams working on these themes are closer to the issues of practice, their work has the potential to more easily impact on health-care practice. In ‘speaking the same language’ as the practice and community groups they intend to impact, they are in a better position to integrate the perspectives of these groups. This should facilitate the implementation of findings and local impact that these groups will have. However, as these teams are further away from the core CLAHRC, greater attention has been required to ensure that these themes feel part of the Bluetown CLAHRC community.

Knowledge translation

Bluetown CLAHRC was not built on an explicit CLAHRC-wide strategy for KT. However, as each project team was provided with extensive flexibility in how they organised their clinical study, each individual theme has developed their own strategy for dissemination of outputs, with a flexibility to interpret the aims of the CLAHRC to fit their own clinical context. As a result, the themes have developed their own tailored approaches to relating their findings to local health-care practice and policy. These are not, however, related to an overall CLAHRC plan or strategy for this approach to the work. Additionally, these approaches do not appear to be shared between the themes, and their context-specific design may potentially limit wider application.

Overall, the Bluetown CLAHRC is formed from members who represent both academic disciplines and health-care professional groups. This means that the cumulative work of Bluetown CLAHRC has the potential to innovatively incorporate many different perspectives and draw from different types of expertise. However, currently, these tend to focus on project–team connections, which limit the potential of a co-ordinated Bluetown CLAHRC effect across the region. The lack of a co-ordinated strategy to link with the different types and levels of the Bluetown partners may have limited the capacity-building of the CLAHRC as a whole, notwithstanding the benefit of the strong relationships fostered by each individual team.

As the CLAHRC developed, a new theme of knowledge management was developed by the central CLAHRC management team. This was designed as a high-level cross-cutting theme. The activity was separate from the clinical project work.

But having said all that I am very keen to in addition add a specific knowledge management function to the CLAHRC but I am very clear this is not based on discoveries made in the CLAHRC, although I would not exclude those, but it’s not based on that, it’s based on the knowledge out there in the world, what can we make of that. So knowledge exchange if you like. BLUETOWN001

However, it drew on the infrastructure that the teams had used and fostered during the early stages of the CLAHRC. In particular, it was felt that, as the clinical project teams had demonstrated activity within the NHS organisations at an early stage, NHS managers and executives valued the work of the CLAHRC, and that this facilitated buy-in for this new venture.

Due to the financial model of the CLAHRC, no specific funding was allocated to this area. However, NIHR Flexibility and Sustainability Funding money was allocated to this activity. As part of this, the core management team was expanded to include support for the clinical teams and CLAHRC as a whole with external engagement and communication and dissemination activities. In addition, a knowledge-exchange forum was developed. This was targeted at high-level managers within NHS and local authority organisations. The meetings acted as a place where NHS organisations could discuss issues that they considered important areas for further work, and the CLAHRC team developed these ideas into discrete programmes of work in which they provided the resource for tailored evidence to be produced.

In this sense, as led by the director, this CLAHRC has evolved to incorporate new objectives as it has developed over time. However, the original vision of the CLAHRC model has been central to supporting the development of these new components. In particular, the knowledge-exchange forum has drawn on the connections fostered by the clinical teams, and the reputation established within partner organisations and beyond. This has allowed them to generate a membership of executive-level personnel who are in decision-making and management positions within their own organisations. Overall, the conceptual vision of the CLAHRC has helped to expand the CLAHRC programme of work beyond the initial set of clinical projects ( Box 3 ).

Case example: evolving collaborative relationships to provide additional services Theme 1 was built on what they describe as ‘a unique opportunity offered by these changes to carry out longitudinal studies into the interaction between changing (more...)

Reflections on the Bluetown Collaboration for Leadership in Applied Health Research and Care model and its development over time

The Bluetown CLAHRC’s vision and structure enables it to collaborate readily with those communities that are more aligned with the working practices of its members. The partner organisations and the members involved in the initiative, therefore, tend to support the principle that only rigorous evidence should be used to inform service developments. However, within this constraint, the Bluetown initiative does support new operational approaches, and in particular emphasises that project teams need to develop collaborative relationships with external communities, especially management and decision-makers.

With no overarching CLAHRC-wide strategy to support KT activity, each theme has developed approaches tailored to their own local communities and clinical context to support the mobilisation of the knowledge produced through their programme of work. However, as the CLAHRC has progressed over time, central management have augmented the original structure to develop new aspects to support overall Bluetown CLAHRC work. For example, they identified that CLAHRC-wide support was required to support the dissemination activities of the individual project teams. As a result, new central support was established to facilitate engagement of the project teams with end-users of their research, and to support the translation of research findings through a CLAHRC-wide communication strategy within the region.

  • Greentown Collaboration for Leadership in Applied Health Research and Care

The Greentown CLAHRC is led by a mental health trust, and the core of the initiative builds on established academic-research links between this health-care trust, a university hospital acute trust, and a university institution. However, an aim of this CLAHRC is to spread beyond the organisations that have traditionally been involved with research in order to build research capacity in localities further away from this core. As such, partner organisations are spread over a large geographical area, comprising a mix of urban and rural areas. The overall aim of the CLAHRC is to use an organisational learning model to facilitate a change in how research is conducted and utilised within the region. One key component of the model of this CLAHRC is about using the research experience of members from the ‘core’ organisations as a foundation from which this expertise can be dispersed and built on in other areas that have not traditionally worked in this way.

One of the things that CLAHRC was trying to do is move outside traditional university clinical centres, that would be located in this region . . . So it was not just about getting research into practice per se but broadening research into practice . . . In [another area] it’s red neck territory . . . best practice isn’t as evident over there as it is around the university, the trusts closely located to the university. GREENTOWN001

The CLAHRC model is built on an organisational learning approach which aims to close the gap between academia and practice. Within this approach, the vision of this CLAHRC is to develop the initiative in such a way that it can bring about a ‘step-change’ in how research is delivered and services are designed by facilitating a change in how the different communities involved conceptualise and undertake these types of activities. From the outset, the Greentown CLAHRC model created a number of structural features which were intended to embed this vision into its operational organisation. Key features include the clustering of work programmes within a small number of defined clinical themes, which support the building of communities around these clinical areas. There was also resourcing of dedicated ‘knowledge broker’ roles, through which a selected group of practitioners would support KT from project teams to the wider NHS. Cross-cutting themes were formed with the aim of providing clinical project team members with specialist forms of expertise in areas such as KT, synthesis of evidence, external engagement and communication, and statistical support. A CLAHRC-wide approach guided a similar constitution for all project teams, which included explicit mechanisms to support boundary spanning between different communities, through incorporating links within the structure of the CLAHRC to support the contact academics have with practitioners and managers in health services. This illustrates a key attribute of the Greentown vision for the CLAHRC in which the model that is being developed is designed to close the gap between research and practice by changing the working culture of the various stakeholder groups involved. However, as the CLAHRC has progressed, issues with the original approach were recognised, and amendments to this vision have been undertaken.

Greentown CLAHRC has had several changes in leadership since the decision to apply for a CLAHRC was made, with each one bringing different types of professional expertise to the role, ranging from clinical–academic, through business school academic, to health-services executive management. These changes in leadership are reflected in the ways in which the vision and approach of the CLAHRC has evolved from its conception and funding application through to its latter stages. Each new director has contributed a particular characteristic to the CLAHRC through their leadership. As the first director of the CLAHRC observed, the novelty of the role itself created a need for it to be ‘interpreted’ and ‘enacted’:

Even if they’d had a visible leadership role before the structured things were set up the brokering, engagement, it can be quite intangible. These sort of things have come to the fore . . . I am doing a leadership role that’s much more strategic and autonomous and I think there will be a degree of variability and interpretation of what the role is and how it’s enacted. My first port of call was to get practice on board. Those chief execs are quite frightening people sometimes. So get practice on board. So I went round every trust board that was a partner in CLAHRC and presented the CLAHRC. GREENTOWN001

As this suggests, the agency demonstrated by the leadership of the CLAHRC has been important in shaping its approach to KT. The original head of the CLAHRC bid was a clinical academic with a strong reputation in mental health research. A change in leadership was made to a specialist in organisational studies, with the aim of emphasising that the CLAHRC model was about a different way of working compared with traditional, standalone clinical–academic projects.

Overall, however, Greentown CLAHRC is structured around a set of clinical research programmes of work designed by clinical academics (i.e. typically a professor from one of the medical school clinical subdisciplines). These focus on applied health research issues to do with the delivery of services for chronic and mental health conditions. Although the basic form of the clinical themes persisted over time, they were eventually complemented by the creation of explicit structures for cross-cutting work. A considerable proportion of the finance was reallocated away from clinical research into implementation science to emphasise the integral role of this approach to their proposal. This process of evolving the planned model for the CLAHRC has generally been viewed as a positive process that enabled the CLAHRC strategy to be well defined by the time it started, supporting earlier progress.

And I mean, that letter and the feedback, you know, it was fairly explicit and it provided a platform for the reworking in the bid. You know, along implementation lines and gave me legitimacy to lead it . . . And I think that referral process was really useful for us. I think we were able to hit the ground running to a much greater extent than the other CLAHRCs. GREENTOWN001

Initiative-wide structures connected each clinical project team to members who provided specialist academic expertise, such as KT or statistical support. These fitted in with CLAHRC-wide activities, and were incorporated into the Greentown CLAHRC organisational model as a way of facilitating the sharing of different types of expertise with members from varied disciplinary and professional backgrounds.

This greater emphasis on implementation work was incorporated after the majority of the core individuals had already ‘signed up’ to be part of the bid, and at a point when much of the focus for the clinical research projects had already been decided. In effect, the new KT strategy was grafted onto a CLAHRC structure that had evolved around a more traditional clinical–academic research model. Therefore, this new emphasis on an explicit implementation strategy had to be adopted by CLAHRC members with established interpretations of their roles, and where much of the content of the CLAHRC’s work programme had already been specified.

The majority of CLAHRC members are employed by the university, and many are co-located. However, as the CLAHRC spans multiple university departments, bringing together academics from a clinical–academic background within community health sciences, and social sciences from business and sociology departments, it created significant challenges for members with different types of expertise trying to work together.

For the clinical scientists this is a completely new way for them to do any work . . . They’ve never thought about the wider implementation aspects and actually how do you physically get research into practice. GREENTOWN002

As Greentown CLAHRC is built on a cross-disciplinary academic collaboration, this CLAHRC has faced additional challenges of working across the academic norms and practices of several different academic disciplines, including clinical sciences, nursing and allied health, and management and social sciences. In practice, it has sometimes been difficult to align these different spheres, especially in relating the conceptual remit of the cross-cutting themes. A further change to the leadership was made in the middle phase of the CLAHRC’s time frame, bringing in a director with a background in NHS executive management. This move helped to bring about a shift towards greater engagement with different types and levels of NHS organisations, in order to make the CLAHRC more NHS facing. At the same time, a key part of the restructuring from the mid-term review of Greentown was designed to combat uncertainty about the role of different programme components, and particularly the cross-cutting themes. In particular, as the CLAHRC’s vision emphasised new ways of working, it had been difficult for individuals to comprehend what their role should involve, or to fit this into traditional career trajectories.

As part of this restructuring, roles were divided into academic and non-academic parts, with the aim of allowing each member to focus on their own form of expertise, and to provide clear access to other types of skills. As a result, the CLAHRC was able to become more NHS facing, with a clearly defined team to assist with the delivery of research to practitioner and policy communities.

A typical configuration for the arrangement of positions within project teams had been designed by central management and was used across the initiative. Each core team was established around the team leader and generally involved other colleagues from within the same academic subdivisions, and the employment of other team members who had trained within similar academic areas to fill designated roles for research and management of the programme of work. Additionally, management created a team structure that aimed to create connections between the core members of each project team, and other team members who could contribute different forms of expertise. These included ‘knowledge broker’ roles aimed at facilitating externally directed ties within health-care or community organisations. These acted as an explicit mechanism to support KT between different communities. An account provided by one team member who had taken on one such externally focused ‘knowledge brokering’ position describes some of her responsibilities within the team, and highlights how working across boundaries was achieved. She describes how an important part of this aspiration is for the team members in the designated brokering positions to spend time with external groups who are relevant stakeholders for the intervention they are studying.

I did a lot of meetings etc., doing a presentation, explaining about the project. You need to tell them what’s happening with the project and how it’s going to be implemented locally, what the practice can have on them. And they’re a key part of making it work. Whether you . . . take their ideas up or we’re bringing ideas from the [initiative] to them, I just think it works both ways. So that won’t work or this won’t work and what about these participants, how is that going to work? Is there any money for that and then have we got stuff for that. Stuff that people in the university don’t understand.

Members in defined boundary-spanning roles within academic-centred project teams describe how they use team meetings to, for example, bring in insights about public health issues to the academic core team members. We observed in one discussion how project meetings were used as a place where the implications of the information provided by external groups were incorporated into a discussion on developing a sustainable implementation of the intervention which the project team were researching. Senior academic members of the team integrated this information by aligning it with the scientific and methodological approach for the project work, such as considering how this related to the study design, and the implications that changes to the protocol would have on demonstrating the academic rigour required for high-quality journal publications.

I think a CLAHRC is about doing things differently to what’s been done before. I think the only way to do that is to bring in people who have different backgrounds and different experiences, who understand the world in a different way. Because I think one of the problems that the CLAHRCs are set up to address is that academic research is done for academics and clinical research is done for clinicians and the twain don’t meet. And I think this CLAHRC has been sensible in bringing in people from different backgrounds. GREENTOWN002

The structure of Greentown is centred on themes defined by related clinical areas. This supports the achievement of collaboration between CLAHRC members and relevant stakeholders in the local area. The commonality of many activities covering mental health, and related projects within the stroke rehabilitation theme, has provided an opportunity for CLAHRC members to develop communities around these clinical areas. In building on the formal roles to link project teams with external groups, efforts were made from the project team leadership to develop relationships with communities relevant to their research topic. With the senior leaders already having a reputation within their field, this provided a platform for connecting with stakeholders from within the local region, and in presenting a body of ‘CLAHRC-type’ work to national clinical groups.

As part of the CLAHRC-wide strategy, the building of ‘communities of practice’ is actively promoted as an approach which can ultimately support the implementation of evidence into local policy and practice. Four groups are perceived as particularly important: academics, clinical practitioners, health services decision-makers (managers and commissioners) and service user representatives. In particular, creating connections and building on the links of existing networks of practitioners is viewed as a strategy to support this aim. In addition, developing relationships with opinion leaders within the local area has been described as an important part of the implementation process. The CLAHRC has developed an inclusive register of associates, where anyone who is interested in the CLAHRC’s work can sign up to receive updates and be invited to events. Many of the activities and events are formed around the clinical topics that are of mutual interest, which forms an emergent community for people within the local area to connect to certain groups within the CLAHRC.

Greentown adopted an explicitly translational approach for informing external communities, such as commissioners, decision-makers and clinical practitioners, about the results of the clinical teams’ programme of work. This was based on a standard template and style of writing which emphasises the implications for policy and practice. These ‘bite-sized’ outputs operate as a tool to support KT of the work of the CLAHRC to the local practitioner and commissioning community. They require the academic teams (with assistance from specialist support members of the CLAHRC, and those with insight into practice) to tailor the way in which their work is presented for different types of audiences ( Box 4 ).

Case example: using a workshop event to discuss interim findings and facilitate KT A clinical project team organised a workshop to share interim findings from their programme of work. This both acted as a strategic tool for the dissemination of the study (more...)

In addition to organisational-level mechanisms, roles performed by individuals acted as bridges to support KT between different communities. Members in defined boundary-spanning roles within academic-centred project teams describe how they use team meetings to bring insight to the academic core team members relating to issues about using the public health intervention in a community setting:

I add an ability to look at things from an outsider’s perspective and say, ‘why are you doing it that way?’ And I think there’s as much challenge to academia as there is to the NHS. I think this CLAHRC along with the others were set up with ‘we’re the academics, we’ve got the knowledge, we’re going to tell you what you need to know’. And actually one of the biggest shifts that’s happening here is that academics have to get a sort of understanding of a new process around the democratisation of research. GREENTOWN009

Ultimately, these individuals were able to work with the academic team by enacting roles, such as ‘translating’ how information should be presented for different audiences. They also acted as ‘interpreters’ when providing insights into the practical implications of particular implementations of interventions, and helped to negotiate or legitimise the CLAHRC’s work in non-academic settings:

One of the goals was to try and encourage evidence based practice, and it’s using people like [the clinical practitioner who is a team member] to try and overcome some of the barriers. He’s very useful, as being a clinician he would command quite a lot of respect. GREENTOWN018

However, this KT occurred very much at the periphery of the project team’s work, with little impact on academic work practices. The creation of explicit knowledge-broker roles provided a formalised link for the team to work with those with academic expertise, and we observed how the project team meetings were used by its members as a focused time and space in which the insight and knowledge of those from beyond the community of the core team, such as academic advisors, and those representing the vision of the central management of the initiative, could be considered.

Reflections on the Greentown Collaboration for Leadership in Applied Health Research and Care model and its development over time

As noted, this CLAHRC has explicitly aimed at developing a model in which its members work in new and different ways. However, the qualitative data suggest that, in practice, there have been issues with the effectiveness of the CLAHRC’s original model. Some members (including leads) have failed to fully engage with the work of the CLAHRC (a key objective of the CLAHRC model), as they have struggled to understand the purpose and remit of certain elements of this CLAHRC’s structure (e.g. what the purpose of the implementation theme is; what the role of knowledge brokers is), and how they can integrate the CLAHRC model into achieving clinical sciences programmes of work.

The changes that took place in the mid-term of the CLAHRC’s life cycle were designed to take these challenges into account. Overall, the vision of the CLAHRC was reaffirmed, including the organisational learning model and its structural features and roles. However, it was recognised that the original CLAHRC model had led to some confusion about the contribution of particular roles and themes. As a result, the restructuring involved redefining roles to emphasise the specialisms of particular members, while providing greater resources for individuals with the expertise to make the CLAHRC more ‘NHS facing’.

  • Browntown Collaboration for Leadership in Applied Health Research and Care

Browntown CLAHRC is based on a partnership between the universities and health-care organisations within a region that comprises several metropolitan boroughs. The core activity of the initiative is centred on organisations based within the same city that have historically been engaged with research activity. Many of the CLAHRC members are employed either by the universities or by the acute health-care organisation within this city, but the CLAHRC was also designed to build extensive numbers of new collaborative relationships between different communities based across these organisations. In particular, the initiative brings together academics from across different departments and universities who have not previously worked together. In addition, a range of health-care organisations are involved as partners, including acute hospital trusts, primary trusts and mental health services.

The vision of this CLAHRC is described as bringing about a step-change in the way research evidence is used, so as to influence the design of health-care services within the region and to improve their quality and effectiveness.

I initially thought the initiative would have been that the content and the research questions that were there, although soon I began to realise that although they’re important, it was about a much bigger transformation in the way that things are done. To get a paradigm shift really. BROWNTOWN014

The metropolitan area has a high level of health inequalities, and the CLAHRC programme of work is structured around groupings of clinical research projects for various long-term conditions which are undertaken using an applied health approach. In addition, there are a number of separate activities that use KT approaches to undertake later-stage implementation programmes of work designed to directly interact with and impact current health-care policy and services. There is a strong emphasis on capacity building within the various partner organisations across the region in relation to developing expertise on how evidence is handled. Overall, both the clinical research projects and the KT programmes of work are based on a model of integrating members from both research and practice by embedding a focus on practice-based issues.

It’s about addressing the second gap in translation. So it’s about getting research very close to practice or as part of practice. So it’s about undertaking applied research or making sure that research is implemented into practice . . . I don’t think it’s on practice, it’s research with practice . . . really integrating research as practice almost to improve services as you go along through reflection and action. The implementation arm is exactly the same, it’s really trying to get research so close to practice that you can’t really see a difference. So I think it’s about the full bridge really. So it’s about not saying research is one place and practice is another, it’s about trying to make things a lot more connected and integrated. BROWNTOWN021

The CLAHRC’s leadership is largely from an allied health and nursing background, with the CLAHRC membership including a high percentage of academic researchers and health-care practitioners with an allied health background. The vision of this CLAHRC is consequently informed by the practice-based approaches traditionally used within allied health professions, including an operating model centred on applied activities that aim to seamlessly integrate research and practice. However, while the approach of this CLAHRC fits well with the working practices of the allied health professions (both research and practice), it faces greater challenges to involve other groups, such as academics from other disciplines.

Browntown CLAHRC is built around a novel collaboration between the two main academic centres within the region: one which focuses on traditional forms of clinical science work, and another with expertise in allied health. This allied health ethos is reflected in the types of partnerships that are being built with the NHS, with strong links with senior and middle-level management in nursing and allied health within the partner trusts. These have been influential in informing the Browntown programmes of work.

This is not centric about the university. This is about an NHS collaboration that is distributed across a health environment, a health economy . . . It is about long-term conditions, and about knowledge mobilisation. BROWNTOWN002

In practice, this CLAHRC model faced challenges in creating a coherent overall programme of work that integrated members from a wide range of professional and disciplinary backgrounds, including different types of academics, and different groups of health-care practitioners. Nevertheless, especially at the core management level, the overall emphasis of the CLAHRC is to develop integrated work programmes involving both academic and NHS groups, and for these connections to be across different levels (i.e. to create links at high- and medium-level health-care management, and also NHS practice-level participation), and also to build the work of the CLAHRC across the whole Browntown region. In particular, the capacity-building objective of the CLAHRC is about building up the skills and expertise in all partner organisations, and ultimately bringing about an evidence-based applied health research culture across the Browntown region.

Overall, Browntown CLAHRC is formed from members of a wide mixture of academic disciplines and different groups of health-care professionals. This heterogeneous composition means that the work of Browntown CLAHRC has the potential to innovatively incorporate many different perspectives and draw from different types of expertise. This provides this CLAHRC with a strong foundation to develop a novel approach for an inclusive and collaborative model of applied research – one better able to span the boundaries between the ‘producers’ and ‘users’ of research. The clinical project work is structured into a small number of clusters which each constitute a programme of work within a common clinical topic. This deployment of members into relatively large subcommunities helps to foster collaboration between members who come from different departments and organisations. On being involved in a project that was one of a set within a larger programme of work, one respondent comments about the benefits of belonging to the translational initiative, as opposed to doing standalone project work:

It was a group of like-minded people where we could work together, because often we would all be doing independent things and not working together. So it was an opportunity for people to work together in some sort of funded activity which would have more power and influence than an individual academic working on their own. BROWNTOWN010

It was felt that having an interlinked set of projects could potentially have a greater impact. The overall programmes of work were often designed to inform different levels or aspects of health services for the same clinical areas, and this more ‘holistic’ package of activity was perceived as offering greater potential for impact. In belonging to a larger team, there could be a co-ordinated presentation of activities to policy groups, with team members able to draw on the established connections with other groups to legitimise their own work through the high reputation of other colleagues. However, as members remain part of their ‘home’ organisation, this presents some operational challenges, as limited day-to-day interaction can mean that some members cannot easily access tacit information, or do not have a day-to-day reinforcement of the CLAHRC ‘approach’.

Many of the team members have several roles across different parts of the initiative, such as being members of clinical project teams while holding positions within the core management of the initiative. The central management team comprises a large group of CLAHRC members, meaning that internally the CLAHRC adheres to a distributed leadership style. This helps to support the diffusion of the values and aspirations of the central management team within the project work, and helps to reinforce the vision of the CLAHRC across the whole work programme. Many senior CLAHRC members, both from within the core management and the theme leads, are implicit boundary spanners – sometimes by dint of having ‘dual contracts’ with both NHS and university bodies – who contribute hybrid expertise and ‘belong’ to more than one community. They help to support the aim that the various programmes of work should incorporate different perspectives. The overlap between a large core management group and those in positions of leadership within the project teams helps to spread this vision. A smaller core group co-ordinates the overall CLAHRC-wide organisation, and presents the external-facing view of this CLAHRC model, and is influential in driving the vision throughout the CLAHRC. Overall, the onus is on project leadership to co-ordinate the different types of knowledge into one coherent programme. The leaders of the CLAHRC itself view their role as one of facilitating a new form of collaboration, rather than providing specific scientific or methodological expertise ( Box 5 ).

Case example: the role of a project team leader The role of the project leader is to co-ordinate different areas of work that are producing knowledge aligned to different disciplines, with the aim of producing one coherent programme of work. In working (more...)

The work taking place within these themes or project work is not dominated by the vision or traditions of one individual (e.g. the theme lead), but a culture has been created where individuals can contribute from their own perspective. As there is no one dominant culture influencing the approach of the work programmes, members with ‘specialist’ types of expertise are integrated members of project teams, allowing different types of knowledge to routinely inform the programme of work.

In [this translational initiative] you’re going into situations all the time where everyone in the room has got lots of different roles. That can be a bit of a challenge at times with people having to approach things from lots of different perspectives. It’s very much going in and out of roles sometimes. BROWNTOWN013

Members within the Browntown CLAHRC typically evolved more flexible and overlapping roles, reflecting the need for expertise to support different aspects of the work programme across the initiative.

Although the overall theme of each programme of work remained largely unchanged during the study period, the CLAHRC model and leadership in Browntown has allowed different groups to shape the focus and direction of the work packages. By collaboratively working with other stakeholder groups, the work packages were shaped by the values and insight of different communities, in what one respondent describes as an ‘organic process’.

It’s a tool for facilitating research, applied healthcare research. To enable patients and clinicians and commissioners to make sense of decisions about what to, about what types of treatment to provide . . . The overall structure of the research design didn’t change but it was such an organic process really, what we set out to do is what we’re doing, but their support and interest and feedback was important. CMBROWNTOWN002

In describing the work programme within the clinical theme, the participant highlights the flexibility of the plans for the project work, and how this allowed the integration of insights from different groups to inform and shape the direction of the work ( Box 6 ).

Case example: collaboration with different types of communities In this example, we observed a situation where a project team responded to external groups’ requests for outputs to inform their service development by refocusing their research from (more...)

The project teams across the CLAHRC were composed of a mix of academics from different disciplines and clinical practitioners. The senior management group actively encouraged teams to continue developing their original proposals based on discussion and dialogue with stakeholder groups. This more emergent approach was enabled by certain features of the CLAHRC model and membership. Within the project team, for example, several of the team members were what we will term ‘hybrid’ individuals, that is, they were affiliated to both academic and practitioner communities, and they helped to foster an environment where no one group dominated the direction of work programmes. Instead, all team members were encouraged to actively interact with other groups, and to be flexible in doing their project work. Thus, the vision of the CLAHRC fostered an environment where members were encouraged to develop new work practices and build relationships, rather than to conform to particular disciplinary approaches.

Across the Browntown CLAHRC, many of the members were able to act as boundary spanners precisely by virtue of their ‘hybrid’ academic and health practitioner background. This supported a more fluid integration between research and practice and the building of sustainable relationships, as the overlapping roles conferred membership of both the CLAHRC community and of external groups of managers and commissioners. At the same time, those in leadership positions helped to create an environment where knowledge from different perspectives was routinely shared across teams. Boundary-spanning mechanisms, such as project meetings, were used to support the fluid integration of different perspectives into the various components of work involved.

You see everyone has got a different perspective. Whether you’re a commissioner, you know, perhaps coming from a public health or social services background, but you’re commissioning. Or a manager in the NHS, perhaps social services seconded to NHS. Or a doctor or a nurse or a psychologist or a GP or a service user. You’ve all got a different understanding of what the care pathway is and what needs to be done to improve it. And so very much we deliberately wanted to incorporate a collaborative project between all those different groups. BROWNTOWN008

This CLAHRC’s KT approach drew from an established implementation model, the Canadian ‘knowledge into action’ cycle. In addition, many of the members of the Browntown CLAHRC come from the NHS partner organisations, and many of the academics involved with this CLAHRC also hold NHS contracts and have practical roles within the local NHS organisations. In this sense, the CLAHRC organisation itself acts as a key mechanism to support the translation of knowledge between different communities.

I think CLAHRC is a boundary spanner. That’s its job, that’s what it is. Because the organisations, the NHS organisations, I mean, they do talk to each other because of, you know, Department of Health policy and stuff but they all have different ways of implementing policy. And so they don’t necessarily talk to each other but through us there’s work happening that can be, you know, translated across the different organisations. So we do act as kind of a, it’s almost like a phone exchange. BROWNTOWN017

The Browntown CLAHRC model included a number of work programmes which were designed from the outset to focus on explicit late-stage implementation activity that would produce tangible impact at an early stage in the CLAHRC time frame. These themes were not designed to produce new research evidence, but were intended to align with the NIHR aim of learning more about KT by conducting implementation activity.

We need to demonstrate progress with implementation from the start of CLAHRC. They are implementation projects using research methods, as well as making a difference to practice, and we’re very much committed to adding to the body of knowledge about knowledge translation . . . Implementation work is really a cross between action research and participatory research, and you need to shape the projects as you work with the key stakeholders. BROWNTOWN001

The implementation programmes of work were designed to produce results and impact on practice at an early stage. Thus, although they drew on established research expertise, they also emphasised collaboration with the local partners, including at all levels (executives, middle-management and ground-level staff) with whom the activities were taking place. These strands of work thus helped to reinforce and promote the expectation that all programmes of work, including the research themes, would generate outputs that could be applied to inform health-care practice.

The KT themes, in particular, focus on developing evidence that will help to make a practical difference to local health-care services. The clinical priorities for implementation are identified collaboratively, and the project work is seen as socially embedded in the organisations where change is happening. This facilitates the effective translation of knowledge into action by individuals and teams.

There’s a sense in which often researchers come to the NHS with their research ideas and then you try and get sign-up from them to take a project forward. This is turning the coin over completely and it’s saying, ‘we’re a resource; we want to work with you. What are your priorities?’ It’s been getting that ownership that I think has meant that we work in a different way, but also we get a different response back from NHS managers and clinicians. But also seeing, I guess it’s the role of brokers in all of this. BROWNTOWN001

The main focus of the work of these themes is on piloting and evaluating innovative strategies for implementation, which are then planned to be rolled out across the NIHR CLAHRC for Browntown partnership (with further evaluation of their impact). One key mechanism in this effort involves focusing resources on individuals who are employed by the health-care trusts. A role was designed for these members to act as ‘facilitators’ to build up research activity within their organisations based on the priorities and general ethos of Browntown CLAHRC-type work.

We have what are called ‘research and development facilitators’. The model which was developed was to actively engage each partner healthcare organisation by having a person working within them. So we would bring CLAHRC to these organisations. My role is to bring research evidence to my organisation, and encourage people to use the research evidence much more determining the way that work is completed. And also to help people here to articulate some of their, some of the issues which they have which could have a research solution. They could find a solution through research or at least some preliminary studies. BROWNTOWN022

It is clear from our analysis that members from throughout the CLAHRC recognise that there is an underlying strategy for this initiative to facilitate KT, and that this is conceptualised as involving an integrated, collaborative approach between academics, health-care practitioners and managers in order to facilitate implementation work. Thus, although the CLAHRC does designate certain work packages as ‘implementation’ projects, the vision enacted throughout the CLAHRC emphasises the need to bridge the second translational gap, and not limit KT activity to discrete programmes of work.

It’s not that simple as just having an implementation arm in CLAHRC . . . I think that a lot of people have been practitioners and have done research, there are very few pure academics who have never really linked in and have been in practice or not in the NHS . . . if you look at nearly every lead they’ve either been a practitioner or had a role in the NHS before as well as doing research. They’re all boundary spanners. BROWNTOWN021

However, while this priority given to KT is very apparent at the core management level of the CLAHRC, it is perhaps more difficult for this type of approach to radically influence all areas of the CLAHRC work, and in particular for innovation approaches to inform the research-focused clinical theme work. In addition, the challenges of conducting implementation research are also highlighted, as the inductive co-production approach here differs from the design of conventional academic projects. Thus, while achieving KT through the seamless integration of research and practice is an important part of the Browntown CLAHRC vision, it depends heavily on the contribution of those occupying ‘hybrid roles’. Crucially, the individuals within these roles are not deemed peripheral to the project teams, but are well positioned to incorporate insights from different community perspectives into the work of the teams.

I work between a number of different organisations, so principally the NHS and academia. It was useful that I am actually from an academic background myself. It makes it a little bit easier in terms of understanding what academic opportunities there might be which the NHS might be able to tap into . . . I work with the initiative really because I work for the NHS and ensure that the CLAHRC work is embedded within this NHS organisation. So it’s very much that boundary-spanning role, I have two identities. BROWNTOWN022

Reflections on the Browntown Collaboration for Leadership in Applied Health Research and Care model and its development over time

The vision of the Browntown CLAHRC collaboration aims to foster a change in the culture of members across the partner organisations by building the capacity of its members to engage with applied health activities. The fact that the core of the CLAHRC is not university-centric, but is instead based on the traditions of allied health academics and practitioners, has helped to privilege health services’ concerns in shaping the programme of work. Although discrete KT activities were established at its inception, the vision of the Browntown CLAHRC is for all programmes of work to engage with diverse perspectives and traditions. In this way, the CLAHRC builds capacity for innovation through a work environment where work practices are able to draw on a diverse range of perspectives.

The Browntown CLAHRC has evolved incrementally over time as new activities have been developed in response to partner need. From its inception, the CLAHRC has emphasised the scope for such incremental growth both through grafting on new programmes of work and through the expansion of the CLAHRC community to include new types of partners. Although not emphasised in the original bid, collaboration with industry has also emerged as an important strand of this evolution. One result of this approach has been the ability of the CLAHRC Browntown model to adapt to a changing policy landscape, as the ‘organisational memory’ of relationships with the primary care trusts has been used to forge relationships with new commissioning organisations.

  • Comparative analysis of Collaboration for Leadership in Applied Health Research and Care models and their enactments

The CLAHRCs were given extensive flexibility in interpreting the NIHR remit. They thus represent a ‘natural experiment’ in how to focus, organise and manage applied health research that will have an impact on a local health-care environment. Our study of three CLAHRCs has highlighted how it has been necessary for each model to be tailored to its own local context.

The senior management of Bluetown CLAHRC strongly emphasises a common vision throughout its work, and contributes technical scientific support to work programmes. The ambition here is to produce high-quality scientific evidence through a rigorous methodological approach. Within this broad remit, each project team has extensive flexibility in the operational management of their work programmes. Instead of drawing on a CLAHRC-wide approach for KT, each project team here developed their own approach to translating the findings from their work programmes into practice. In this sense, the Bluetown model for KT is about supporting the operational autonomy of each project team to develop its own locally tailored approach.

Greentown CLAHRC draws on an explicit CLAHRC-wide organisational structure to facilitate KT activity. This structure emphasises a common operational management championed by the core leadership. The overall approach draws on a cross-disciplinary conceptual model which integrates different types of knowledge, including both clinical and social science academic traditions. However, each project team develops its own approach to accessing and developing requisite scientific and technical expertise, which allows team members to retain their pre-existing working practices.

The Browntown CLAHRC model was based on adapting the explicit Canadian framework for KT. Operationally, the CLAHRC work was situated within heterogeneous teams embedded within the local partner organisations. This, together with features such as overlapping community memberships and hybrid roles, helped to support the fluid, and often tacit, integration of different types of knowledge across all work programmes.

Notably, all three of the CLAHRCs we studied were able to develop and adapt their model over the 5-year funding time frame. The focus of these developments, and the extent to which modifications occurred, varied depending on each CLAHRC context. Bluetown CLAHRC started with a conventional model of the relationship between research and practice, which allowed work activity to commence straight away, and did not require radical changes to the working practices of its members. However, as the CLAHRC developed over time, new activities were introduced to support KT and knowledge management and engagement activities. Importantly, the established relationships of the early CLAHRC model are credited with helping to gain support for these new types of activities.

Greentown CLAHRC has experienced several changes in leadership since the initial formation of its model. Its development over time has particularly focused on refining the original organisational learning model in order to improve effectiveness. As such, this CLAHRC has engaged in changes to its operational management by restructuring the groups within the CLAHRC, and, in particular, making adaptations to the work of the specialist support services to support the clinical project teams more effectively.

The model of Browntown CLAHRC has evolved incrementally and has maintained consistent support for capacity building across local health-care communities. In particular, this CLAHRC has sought to develop and integrate new work programmes reflective of its overall vision into its CLAHRC community. It has continually expanded its engagement with its original health-care partners, and also with new types of stakeholder groups, such as industry. In keeping with its distributed leadership style, the central management team itself grew over time, so as to assimilate representatives of different work components and incorporate their views into the organisation of the CLAHRC’s work.

Our study of the three different CLAHRCs helps to explain the way in which the broad CLAHRC remit has been appropriated in distinctive ways, according to the social networks and local contexts which have shaped the CLAHRCs’ development. In the process, we have also observed the generative effect of leadership and vision. These variations in the interpretation and enactment of the CLAHRC mission underline the importance of their differing network structures and sense-making cognitions, as addressed through the other research strands in our study (and succeeding sections of this report). In respect of KT, it is clear that each CLAHRC has developed its own distinctive approach, some aspects of which are managed and articulated at senior management level, while others (e.g. the implications of hybrid roles) make a more implicit contribution.

  • USA and Canada qualitative analysis

The next part of the report will describe, analyse and discuss the three North American cases. The first case is the Canada-Coordination, an initiative involving a number of health-care players in the Ottawa district, Ontario (Canada), which aims to improve the co-ordination and quality of health-care delivery. The initiative specifically addresses a small number of children with complex care needs (children who are each followed by at least five different specialists). At the time of our fieldwork, the Canada-Coordination initiative included 23 such children. The second case is the Canada-Translation, an initiative involving a community hospital (Com-Hospital) and a large university (Uni-Canada) in the same city in Quebec, which aims to promote KT processes from academics (Uni-Canada) to practitioners (Com-Hospital) and, more generally, to increase collaboration and cross-fertilisation activities between the hospital and the university. The third case is US-Health, an initiative involving a number of universities, hospitals and consultancy companies specialising in health-care management and applied research, which aims to promote implementation research across the USA.

All three initiatives are different to the CLAHRCs in terms of size, being either somewhat smaller (e.g. the Canada-Translation) or larger (US-Health). In addition, their thematic focus and governance arrangements are necessarily different to the CLAHRCs’ (e.g. the Canada-Coordination is more focused on KT and collaboration across existing health-care organisations). However, these three cases were chosen because they are all organised to promote implementation research and KT processes on a networked basis, by exploiting existing networks and/or facilitating the creation of new collaborative networks.

The qualitative analysis of the North American cases, in line with the CLAHRCs analysis, is structured as follows (for each case): firstly, we introduce the case; secondly, we break down the analysis into the three main theoretical elements (or dimensions) of the framework supporting this study (governance and management aspects, networks and collaborations, and KT); and, thirdly, we discuss the implications of the case. As with the CLAHRCs qualitative analysis, each case involves three case examples, one for each dimension of the framework. We conclude this section with a discussion of the three initiatives and a comparison between these (North American) cases and the UK CLAHRCs.

  • Canada-Coordination pilot project

Canada-Coordination is a pilot project housed at the White Hospital, located in Ottawa, ON, Canada, and involves the hospital itself (a world-class tertiary paediatric centre), and several paediatric organisations and agencies in the Ottawa community.

The pilot project involves four main players (organisations):

  • the Regional Community Care Access Centre (RCAC), a community health provider that organises home, school and hospital care, develops customised ‘care plans’ and provides support from health-care professionals, including nurses, physiotherapists, social workers, registered dieticians, occupational therapists, speech therapists and personal support workers, to deliver a range of care and support services
  • the Ottawa Association to Support Children (OASC), another community health provider, which provides specialised care for children and youth in Ontario with multiple physical, developmental and associated behavioural needs
  • the social services (SS), an agency that develops case resolution mechanisms to provide recommendations and referrals for families with children with complex care needs who are experiencing difficulties accessing support and services in the community
  • the White Hospital, the ‘hub’ of the project, in that the children with complex care needs are patients of the hospital and, therefore, their main treatments are provided there.

The pilot project funds three key people to manage the project: (1) a project manager, (2) the most responsible physician and (3) the nurse co-ordinator. The project manager supervises the pilot project: she ensures that processes, communication pathways and flow maps have been developed; conducts staff training sessions; co-ordinates the meetings and presentations for the steering and advisory committees; and prepares reports. The most responsible physician reviews the overall complex medical needs of each child and co-ordinates communication with all the specialists at the White Hospital, other tertiary paediatric centres specialists, and the community physicians. The nurse co-ordinator works very closely with the most responsible physician, interfacing between the doctors, nurses, and managers at the White Hospital, and the other agencies; the nurse co-ordinator is the link person for all of the families of the children in the project. Interestingly, along with a solid governance structure, informal relationships played a central role in the pilot project, as is outlined in the example below ( Box 7 ).

Box 7 Case example: governance structure and informal relationships
The governance structure of the pilot project is relatively formal including steering and advisory committees that meet monthly in addition to an external entity – the White Hospital (more...)

From the case example in Box 7 , it is clear that informal relationships facilitated tight collaborations and promoted trust among the players involved in the pilot project. The development of trust, as we will show below, was also a relevant element in supporting the health-care network of the Ottawa community.

The pilot project exploits existing networks between the players involved including RCAC, OASC, SS, independent paediatricians, and the White Hospital personnel (doctors, nurses and staff). In fact, while the project involved hiring some key people who could facilitate collaboration among the players, many of the relationships between, for example, social services (RCAC) and the nurses and doctors in the hospital (the White Hospital) were already tight. This aspect (prior networks) has positively affected the overall project because while formal networks are relatively easy to establish, it takes time for people to begin working together productively. Interestingly, the pilot project network developed collaborations aimed at improving the quality of health-care delivery that were not limited to the players that were involved in the initial project. In fact, the managers (of the four agencies) soon realised that in order to clearly identify the needs of the children involved in the pilot project, it was important to involve the patients directly. Therefore, the project manager, in accordance with the directors of RCAC, OASC, and SS, decided to involve the parents of two children involved in the project. This involvement included having the parents sitting in the steering committee of the project that meets monthly and makes decisions regarding how the co-ordination of care at the White Hospital and across its network can be improved.

As the project manager highlighted regarding the evaluation process of the project: ‘So the two parents from family forum that also sit on our steering committee for this Pilot Project reviewed our questionnaires and helped us get to the questions we wanted so that it’s more a participatory evaluation approach’. This quote underlines a collaborative climate in which actors belonging to different networks are willing to contribute and are supported by a common aim: to identify ways to improve the quality of health-care delivery services for children with complex care needs. One of the most relevant issues that emerged from the feedback from the two families who sit on the steering committee was the difficulty of keeping all of the different community services up to date with the most recent changes in a child’s condition; this problem was compounded by the fact that each child is frequently seen by a number of specialists.

Knowledge translation within Canada-Coordination was focused on relations between clinicians and with family members of the children involved. It was facilitated not only by direct interactions among these groups, but also by the development of artefacts that could help span the boundaries between them. In this context, the introduction of the Single Point of Care (SPOC) document is an important example because it radically improved the relationship between the White Hospital and the health-care networks. The SPOC is a paper-based medical sheet including all basic information about a child’s health, such as current health status, current treatments (medications), and any other detail that can be helpful at school (e.g. current allergies) or in emergency situations (particular drugs that the child needs to take if his/her condition suddenly becomes severe or life-threatening). The SPOC is issued by the White Hospital and is carried by the families, who no longer have the difficult and at times confusing task of themselves collating all the medical information about their children produced by different specialists. The SPOC is also shared with a number of organisations in the Ottawa community, such as police and schools. It is particularly important that, for example, if a child develops a new allergy or changes a medication, the school nurse is made aware of the change. While the creation of the SPOC, as a KT tool, will be discussed more broadly in the next section, the following case is meaningful in highlighting how the SPOC contributed to promoting networks and collaboration across different players in the Ottawa community ( Box 8 ).

Box 8 Case example: SPOC as a tool for KT
The SPOC originated from discussions between the families of the children involved in the pilot project and the doctors and managers during steering committee sessions. Its value is highlighted here through the example (more...)

The SPOC also helped to reduce redundant examinations, because the parents show the medical sheet to each specialist, who edits it as appropriate and gathers information about forthcoming tests. In sum, the introduction of the SPOC improved efficiency (acting as a co-ordination mechanism for the different specialists) and gave the families greater awareness of the conditions of their children. On the one hand, the SPOC is acknowledged by any doctor as an official document because it is issued by a hospital; on the other hand, the simple way in which the SPOC is structured allows the family to interpret, in general terms, the health status of their child.

In sum, according to the project founders (the leaders of RCAC, OASC and SS), the involvement of the parents had produced significant benefits. This point was also confirmed by a number of interviews that we conducted with the families of the children involved in the pilot project, in which several co-ordination issues affecting the families emerged that could then be brought up for discussion in the project’s steering committee meetings.

Reflections on the pilot project model and its development over time

The pilot project is an example of how informal networks are effective in promoting KT and collaboration across community partners, including health-care partners such as a hospital and the SS, but also non-health-care partners such as the police department and the schools. The management of the project includes periodic checks undertaken by an independent entity, the White Hospital Research Institute, which surveys doctors and families of the children involved in the project and makes sure that co-ordination and health-care delivery services are continuously improved. This is evidenced by feedback from both clinicians and patient families.

  • Canada-Translation Centre

The Canada-Translation Centre is an initiative that originates at Com-Hospital, a community and university affiliated health centre in Quebec, Canada, and that serves a multicultural population in southern Quebec.

The Canada-Translation Centre started in 2010 with the aim to promote and co-ordinate clinical research carried out by clinicians who work at Com-Hospital and are also appointed by Uni-Canada, a university in Quebec. According to the regulatory framework of the Canada-Translation Centre, it carries out clinical, epidemiological, and health services research studies, provides consultation to other researchers in the hospital, and provides educational services, including seminars and workshops, related to research.

The Canada-Translation Centre is led by Johanna, Associate Professor at Uni-Canada and the vice president (VP) of Academic Affairs at Com-Hospital. The Canada-Translation Centre organises monthly meetings (last Thursday of every month) that are held in the hospital. As at the time of our research the Canada-Translation Centre was in its initial stage of development, the main objective of the meetings that we observed involved the establishment of guidelines and systems.

From a governance perspective, the three main bodies that collaborate with the Canada-Translation Centre are (a) the REC, which is responsible for reviewing the scientific and ethical aspects of all research projects involving human subjects; (b) the research review office, which is co-ordinated by the research administrative secretary and provides administrative support to both the EC and the REC; and (c) the Quality Assessment (QA) Unit, which provides assistance to hospital staff on various aspects of QA analysis and evaluation, that is, projects that assess current performance or practice.

The governance aspects of the Canada-Translation Centre are extremely formalised, reflecting the interest of the academics at Uni-Canada to secure control of the development of the project, especially in its early stages. The board of directors supervises all research activities, while the VP of Academic Affairs (who is also the Canada-Translation Centre’s Director) supervises all main research committees. The Quality and Risk Management Committee is an independent body chaired by the VP of Professional Services. According to an interview with Johanna, the rationale of this design lies in the desire to have objective control of research activities by an independent body (the Quality Committee).

The network of the Canada-Translation Centre is formed by people who work in the hospital and who also have at least a teaching appointment or, more often, a professorial appointment, at Uni-Canada. The design of the three key bodies of the centre sought to promote collaboration and cross-fertilisation between the hospital and Uni-Canada. In particular, within Family Medicine at Uni-Canada there are four departments that have always been very independent and loosely coupled; however, with the start of the Canada-Translation Centre, members of these four departments started meeting twice a month to try and pursue common research objectives. In sum, the main objectives of the Canada-Translation Centre are to (1) tighten the collaboration between Uni-Canada and Com-Hospital by promoting networks and (2) bring in expertise from external collaborations.

In terms of improving the network within the Canada-Translation Centre (Com-Hospital and Uni-Canada) one of the initial challenges was to try to build relationships between key actors who had no experience of working together. For instance, quality and risk management staff rarely interacted with academics prior to the start of the project. This group quickly became committed to collaborating with the academic groups, as highlighted by the director of the quality and risk management at Com-Hospital:

And I convinced Johanna that the uniqueness of the relationship between Quality and Research here has to live somehow and has to be formalised in whatever plan she does. And I think she saw that as an opportunity. If it were someone else here than me I don’t know if it would work the same. It’s just because I started out working in Research and I, you know, I did that for ten, twelve years and I know the language and I know their challenges and I know, you know, how they think and how they . . . operate.

While it is important that people within Com-Hospital are able to work together (e.g. the research team and the quality and risk management department, as highlighted above), it is also important that the (more practice-oriented) researchers at Com-Hospital collaborate and develop networks with the professors at Uni-Canada. The associate dean of Inter-Hospital Affairs plays a role here by seeking to promote fruitful collaborations between the hospital and the university. As he highlighted to us:

It was a natural thing to develop research along these lines. Again my role has been to facilitate communication between the hospital and the university, both with the Dean of Medicine and with other research leaders within the university. I’m not a researcher myself. My role is mainly to put people together. And to use the links that we have with the Ministries to help support what the hospitals do. The other way in which I interplay with the hospital is I’m on the board of directors. So being on the board of directors of course some of these initiatives to fund this research infrastructure comes to the board and at times I speak to it in a supportive fashion in order to help promote this venture for the hospital.

The Canada-Translation Centre’s mission for the period 2010–15 includes five main domains of research to be pursued:

  • to support and conduct high quality clinical and health services research relevant to the patients and services provided by Com-Hospital
  • to promote KT and exchange activities to support evidence-informed decision-making in practice, management and policy
  • to provide decision support for clinical and management leaders through high-quality rapid systematic reviews of scientific evidence
  • to support the application of evidence through quality improvement and other implementation initiatives
  • to provide a high-quality training programme and environment for students interested in research careers.

From the above it is clear that the Canada-Translation Centre has a specific focus on KT. In fact, each research member (i.e. a professor) must also be a clinician (i.e. he/she needs to spend a minimum number of hours per week in the hospital), a rule which is aimed at promoting applied research. As outlined by the case example below ( Box 9 ), one of the strategies that the Canada-Translation Centre adopts for promoting implementation research is to work on small projects whose short-term tangible results are directly testable in the hospital environment.

Box 9 Case example: short-term implementation research
One example of how the Canada-Translation Centre promotes implementation research is to try to develop very small projects where implementation requires months if not just weeks. Through the exploitation (more...)

Reflections on the Canada-Translation Centre model and its development over time

In contrast with the pilot project in Ottawa, this initiative is much more complex and involves more than 50 people, including Com-Hospital physicians, Uni-Canada’s academics, and administrative personnel. Staff involved are fully aware of the barriers between academics and practitioners and between people who work in different departments (in the case of Com-Hospital) and in different institutions (at the hospital and at Uni-Canada). Therefore, specific boundary spanners – such as the associate dean of Inter-Hospital Affairs – were identified to connect people with different backgrounds and different professional and personal interests.

  • US-Health

US-Health is a model of field-based research designed to promote innovation in health-care delivery through the diffusion of research into practice. The initiative aims to speed up the development, implementation, diffusion, and uptake of evidence-based tools, strategies, and findings. In particular, US-Health develops, and aims to disseminate, scientific evidence to improve health-care delivery systems.

The US-Health network includes a number of large partnerships (the ‘contractors’) and collaborating organisations that together provide health care to more than 100 million Americans. It is a 5-year, field-based implementation research model that fosters public–private collaboration and aims to provide concrete results in the short term. The US-Health partnerships involve most US states and provide access to large numbers of providers, major health plans, hospitals, long-term care facilities, ambulatory care settings, and other health-care structures. Each partnership includes health-care systems with large, robust databases, clinical and research expertise, and the authority to implement health-care innovations.

US-Health focuses on a wide variety of demand-driven, practical, applied topics of interest to the partnerships’ own operational leaders as well as the project funders. The programme emphasises projects that are addressed to user needs and operational interests and which, ideally, are expected to be generalisable across a number of settings.

US-Health partnerships operate under multi-year contracts. Proposals are bid on a rolling basis throughout each 5-year cycle. Projects need to be undertaken in the short term; they are awarded under separate task orders and are completed within 12–24 months. Also, the US-Health network is promoted by a national health agency called Federal-Health. Federal-Health’s mission is to improve the quality, safety, efficiency, and effectiveness of US health care. Federal-Health supports research that helps people make more informed decisions and improves the quality of health-care services.

US-Health’s research has two main characteristics: it is practice based and implementation oriented. Thus, it supports field-based research to explore practical, applied topics that are responsive to diverse user needs and operational interests. By testing innovations directly in the practical settings in which they are intended to be adopted, US-Health increases the likelihood of their eventual successful uptake. US-Health research is also designed to increase knowledge about the process of implementing innovations and the contextual factors influencing implementation. It aims to promote understanding of how and why specific strategies work or fail.

The governance structure of US-Health is relatively flat. Federal-Health is the government organisation that manages the tendering process for contracts, and US-Health (the network) is one of the recipients of these contracts. Informal relationships between Federal-Health and its collaborations over time, however, help to ensure that contracts are tailored to the strengths of members of the US-Health network. For example, Federal-Health frequently discusses potential research projects with the contractors of previous bids to elicit their interest.

Once a contract is signed, a project officer is assigned to each contractor; a small number of project officers manage multiple contracts nationwide. The main task of the project officer is to make sure that all milestones are met during the very short period of the contract. The project officer can also review ongoing documents, visit a contractor to see how the research project is progressing, and provide suggestions and guidance. Project officers write periodic reports for Federal-Health, which is therefore able to closely monitor each contractor (and each research project). The way a contractor (e.g. a member of the US-Health network) manages its research project is largely left to its own discretion: while a proposal on how the objectives will be reached needs to be provided to Federal-Health, no specific guidelines on how to undertake the research are given. However, the contracts (and the research projects) are generally very specific in their objectives and expected concrete results. Very often, the research output is a ‘toolkit’, that is, a document with specific recommendations on how to manage a health-care problem in practice.

Many of the people whom we interviewed and who were involved in one or more US-Health research projects (i.e. contracts) highlighted that the success of the project depended to a large extent on the project officer. The effects of the project officer role can be both positive and negative, as shown by the example below ( Box 10 ).

Box 10 Case example: the role of project officers in US-Health
This interview was undertaken with a professor in a department of family medicine who has been involved in a Federal-Health project for several years. She describes with examples points of strength (more...)

US-Health has a very complex network including, as we previously noted, a number of large partnerships. In this section we provide some insights focused on the two main partnerships where we were able to conduct interviews: Health-partnership and Health-Consulting. We chose these two networks because they are very different: one is a partnership whose participants are academics, while the other is a consultancy company with expertise in health care.

Health-partnership (primary contractor) is a partnership led by a US university (name not disclosed) which managed some 10 contracts with Federal-Health within US-Health. The contracts of Health-partnership involve themes such as improving hepatitis C virus-screening practices and testing uptake in select primary care providers, preventing pressure ulcers in hospitals, co-ordinating care across primary care practices, using innovative communication technology to improve the health of minorities, avoiding readmissions in hospitals using technology, and reducing infections caused by particular bacteria.

Health-Consulting is a much smaller entity, being a private consultancy company with some 15 senior consultants. As a result, it is not in a position to develop clinical research involving patients. Health-Consulting’s strategy within US-Health was, therefore, to find collaborators (subcontractors) who could undertake implementation research in health-care structures, and to do this it drew on a wide network of subcontractors. The contracts managed by Health-Consulting focused on the relationship between patient information management in hospitals (investigating both electronic and paper-based medical records) and the risks of complications and mortality while patients are hospitalised, and on the development of performance measures for injurious falls in nursing homes and for rehospitalisation of patients discharged from hospitals to home care. Although collaboration on bid writing between partners was limited by a lack of funding, some collaboration did take place across US-Health contractors.

With the overall goal of translating research into practice, US-Health links many of the largest health-care systems in the USA with top health services researchers who are identified through a tendering process and are managed through short- to medium-term contracts.

It provides a network of delivery-affiliated researchers and sites with a means of testing the application and uptake of research knowledge. US-Health is the successor to another large-scale initiative which was completed in 2005. All of the large partnerships (i.e. the prime contractors who work with a specified range of other organisations) have a demonstrated capacity to turn research into practice for proven interventions, targeting those who manage, deliver or receive health-care services. As per the above, both Health-partnership and Health-Consulting develop research that is focused on a wide variety of demand-driven, practical, applied topics which are of interest to the partnerships’ own operational leaders as well as to the project funders. The overall programme (US-Health) emphasises projects that are broadly responsive to user needs and operational interests and which are expected to be generalisable across a number of settings. An example of implementation research conducted by Health-Consulting is outlined in the case example below ( Box 11 ).

Box 11 Case example: implementation research at Health-Consulting
Health-Consulting developed a tool to improve pharmacists’ communications with patients. This was informed by evidence that only 12% of US adults understand and use health information (more...)

Reflections on the US-Health model and its development over time

The US-Health case, compared with the two other North American cases, involves a number of partnerships, each of which involves a number of organisations and players. Moreover, the funding model is distinctive, depending on bids for contracts (very often agreed between the parties, e.g. Federal-Health and, in our case, US-Health). Even though there was little interaction across the US-Health network as a whole, collaborations proliferated within partnerships. Most projects have been successful, and some were adopted nationwide.

Included under terms of UK Non-commercial Government License.



Evidence-Based Green Human Resource Management: A Systematic Literature Review


1. Introduction
1.1. State of the Art of GHRM
1.2. Previous Reviews of GHRM
1.3. Aims of the Research
2.1. Identification: Information Sources and Search Strategies
2.1.1. Information Sources
2.1.2. Search Strategies
2.2. Screening: Selection Process and Criteria
2.3. Included: Data Processing and Analyzing
3. Results and Discussion
3.1. General Overview of GHRM Literature
3.1.1. Publications by Years
3.1.2. Publications by Sources
3.1.3. Publications by Contexts
3.1.4. Publications by Methodologies
3.2. GHRM Research Foci and Trends
3.2.1. Bibliographic Coupling Analysis
3.2.2. Keyword Evolution Analysis
3.3. GHRM Conceptualizations and Research Framework
3.3.1. Dimensions and Measurements of GHRM
3.3.2. Theoretical Bases of GHRM
3.3.3. Framework of GHRM
3.4. Recommendation for Practitioners and Researchers
3.4.1. Context and Trend of GHRM
3.4.2. Access and Approach to GHRM
3.4.3. Mechanism and Innovation in GHRM
4. Conclusions
4.1. Summary of the Research
4.2. Contribution of the Research
4.3. Limitations and Proposed Solutions
Author Contributions
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest

  • Page, M.J.; McKenzie, J.E.; Bossuyt, P.M.; Boutron, I.; Hoffmann, T.C.; Mulrow, C.D.; Shamseer, L.; Tetzlaff, J.M.; Akl, E.A.; Brennan, S.E. The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. Int. J. Surg. 2021 , 88 , 105906. [ Google Scholar ] [ CrossRef ]
  • Tranfield, D.; Denyer, D.; Smart, P. Towards a methodology for developing evidence-informed management knowledge by means of systematic review. Br. J. Manag. 2003 , 14 , 207–222. [ Google Scholar ] [ CrossRef ]
  • Page, M.J.; Moher, D.; Bossuyt, P.M.; Boutron, I.; Hoffmann, T.C.; Mulrow, C.D.; Shamseer, L.; Tetzlaff, J.M.; Akl, E.A.; Brennan, S.E. PRISMA 2020 explanation and elaboration: Updated guidance and exemplars for reporting systematic reviews. BMJ 2021 , 372 , n160. [ Google Scholar ] [ CrossRef ]
  • Pahlevan-Sharif, S.; Mura, P.; Wijesinghe, S.N. A systematic review of systematic reviews in tourism. J. Hosp. Tour. Manag. 2019 , 39 , 158–165. [ Google Scholar ] [ CrossRef ]
  • Perianes-Rodriguez, A.; Waltman, L.; Van Eck, N.J. Constructing bibliometric networks: A comparison between full and fractional counting. J. Informetr. 2016 , 10 , 1178–1195. [ Google Scholar ] [ CrossRef ] [ Green Version ]
  • Bahuguna, P.C.; Srivastava, R.; Tiwari, S. Two-decade journey of green human resource management research: A bibliometric analysis. Benchmarking Int. J. 2023 , 30 , 585–602. [ Google Scholar ] [ CrossRef ]
  • Yu, D.; Wang, W.; Zhang, W.; Zhang, S. A bibliometric analysis of research on multiple criteria decision making. Curr. Sci. 2018 , 114 , 747–758. [ Google Scholar ] [ CrossRef ]
  • Zhu, J.; Liu, W. A tale of two databases: The use of Web of Science and Scopus in academic papers. Scientometrics 2020 , 123 , 321–335. [ Google Scholar ] [ CrossRef ] [ Green Version ]
  • Singh, V.K.; Singh, P.; Karmakar, M.; Leta, J.; Mayr, P. The journal coverage of Web of Science, Scopus and Dimensions: A comparative analysis. Scientometrics 2021 , 126 , 5113–5142. [ Google Scholar ] [ CrossRef ]
  • López-Concepción, A.; Gil-Lacruz, A.I.; Saz-Gil, I. Stakeholder engagement, Csr development and Sdgs compliance: A systematic review from 2015 to 2021. Corp. Soc. Responsib. Environ. Manag. 2022 , 29 , 19–31. [ Google Scholar ] [ CrossRef ]
  • Wagner, M. ‘Green’human resource benefits: Do they matter as determinants of environmental management system implementation? J. Bus. Ethics 2013 , 114 , 443–456. [ Google Scholar ] [ CrossRef ]
  • Leal Filho, W.; Azul, A.M.; Wall, T.; Vasconcelos, C.R.; Salvia, A.L.; do Paço, A.; Shulla, K.; Levesque, V.; Doni, F.; Alvarez-Castañón, L. COVID-19: The impact of a global crisis on sustainable development research. Sustain. Sci. 2021 , 16 , 85–99. [ Google Scholar ] [ CrossRef ]
  • Chen, C.; Feng, Y.; Shen, B. Managing Labor Sustainability in Digitalized Supply Chains: A Systematic Literature Review. Sustainability 2022 , 14 , 3895. [ Google Scholar ] [ CrossRef ]
  • Tuan, L.T. Promoting employee green behavior in the Chinese and Vietnamese hospitality contexts: The roles of green human resource management practices and responsible leadership. Int. J. Hosp. Manag. 2022 , 105 , 103253. [ Google Scholar ] [ CrossRef ]
  • Haddock-Millar, J.; Sanyal, C.; Müller-Camen, M. Green human resource management: A comparative qualitative case study of a United States multinational corporation. Int. J. Hum. Resour. Manag. 2016 , 27 , 192–211. [ Google Scholar ] [ CrossRef ] [ Green Version ]
  • Ureña Espaillat, H.J.; Briones Penalver, A.J.; Bernal Conesa, J.A. Influencing responsible green innovation in Dominican agribusiness performance. Corp. Soc. Responsib. Environ. Manag. 2022 , 29 , 675–685. [ Google Scholar ] [ CrossRef ]
  • Ojo, A.O. Motivational factors of pro-environmental behaviors among information technology professionals. Rev. Manag. Sci. 2022 , 16 , 1853–1876. [ Google Scholar ] [ CrossRef ]
  • Yong, J.Y.; Yusliza, M.-Y.; Ramayah, T.; Fawehinmi, O. Nexus between green intellectual capital and green human resource management. J. Clean. Prod. 2019 , 215 , 364–374. [ Google Scholar ] [ CrossRef ]
  • Van Eck, N.J.; Waltman, L. Visualizing bibliometric networks. In Measuring Scholarly Impact: Methods and Practice ; Springer: Berlin/Heidelberg, Germany, 2014; pp. 285–320. [ Google Scholar ] [ CrossRef ]
  • Bellucci, M.; Marzi, G.; Orlando, B.; Ciampi, F. Journal of Intellectual Capital: A review of emerging themes and future trends. J. Intellect. Cap. 2021 , 22 , 744–767. [ Google Scholar ] [ CrossRef ]
  • Masri, H.A.; Jaaron, A.A. Assessing green human resources management practices in Palestinian manufacturing context: An empirical study. J. Clean. Prod. 2017 , 143 , 474–489. [ Google Scholar ] [ CrossRef ]
  • Kim, Y.J.; Kim, W.G.; Choi, H.-M.; Phetvaroon, K. The effect of green human resource management on hotel employees’ eco-friendly behavior and environmental performance. Int. J. Hosp. Manag. 2019 , 76 , 83–93. [ Google Scholar ] [ CrossRef ]
  • Guerci, M.; Longoni, A.; Luzzini, D. Translating stakeholder pressures into environmental performance—The mediating role of green HRM practices. Int. J. Hum. Resour. Manag. 2016 , 27 , 262–289. [ Google Scholar ] [ CrossRef ]
  • Nejati, M.; Rabiei, S.; Jabbour, C.J.C. Envisioning the invisible: Understanding the synergy between green human resource management and green supply chain management in manufacturing firms in Iran in light of the moderating effect of employees’ resistance to change. J. Clean. Prod. 2017 , 168 , 163–172. [ Google Scholar ] [ CrossRef ]
  • Ogbeibu, S.; Chiappetta Jabbour, C.J.; Burgess, J.; Gaskin, J.; Renwick, D.W. Green talent management and turnover intention: The roles of leader STARA competence and digital task interdependence. J. Intellect. Cap. 2022 , 23 , 27–55. [ Google Scholar ] [ CrossRef ]
  • Bazrkar, A.; Moshiripour, A. Corporate practices of green human resources management. Foresight STI Gov. 2021 , 15 , 97–105. [ Google Scholar ] [ CrossRef ]
  • Ghouri, A.M.; Mani, V.; Khan, M.R.; Khan, N.R.; Srivastava, A.P. Enhancing business performance through green human resource management practices: An empirical evidence from Malaysian manufacturing industry. Int. J. Product. Perform. Manag. 2020 , 69 , 1585–1607. [ Google Scholar ] [ CrossRef ]
  • Khatoon, A.; Khan, N.A.; Parvin, F.; Wahid, M.S.; Jamal, M.T.; Azhar, S. Green HRM: Pathway towards environmental sustainability using AHP and FAHP in a nascent parsimony. Int. J. Manpow. 2022 , 43 , 805–826. [ Google Scholar ] [ CrossRef ]
  • Marrucci, L.; Daddi, T.; Iraldo, F. The contribution of green human resource management to the circular economy and performance of environmental certified organisations. J. Clean. Prod. 2021 , 319 , 128859. [ Google Scholar ] [ CrossRef ]
  • Muisyo, P.K.; Qin, S. Enhancing the FIRM’S green performance through green HRM: The moderating role of green innovation culture. J. Clean. Prod. 2021 , 289 , 125720. [ Google Scholar ] [ CrossRef ]
  • Muisyo, P.K.; Qin, S.; Ho, T.H.; Julius, M.M. The effect of green HRM practices on green competitive advantage of manufacturing firms. J. Manuf. Technol. Manag. 2022 , 33 , 22–40. [ Google Scholar ] [ CrossRef ]
  • Muisyo, P.; Su, Q.; Ho, T.H.; Julius, M.M.; Usmani, M.S. Implications of green HRM on the firm’s green competitive advantage: The mediating role of enablers of green culture. J. Manuf. Technol. Manag. 2022 , 33 , 308–333. [ Google Scholar ] [ CrossRef ]
  • Ojo, A.O.; Tan, C.N.-L.; Alias, M. Linking green HRM practices to environmental performance through pro-environment behaviour in the information technology sector. Soc. Responsib. J. 2022 , 18 , 1–18. [ Google Scholar ] [ CrossRef ]
  • Paillé, P.; Valéau, P.; Renwick, D.W. Leveraging green human resource practices to achieve environmental sustainability. J. Clean. Prod. 2020 , 260 , 121137. [ Google Scholar ] [ CrossRef ]
  • Yong, J.Y.; Yusliza, M.Y.; Ramayah, T.; Seles, B.M.R.P. Testing the stakeholder pressure, relative advantage, top management commitment and green human resource management linkage. Corp. Soc. Responsib. Environ. Manag. 2022 , 29 , 1283–1299. [ Google Scholar ] [ CrossRef ]
  • Teixeira, A.A.; Jabbour, C.J.C.; de Sousa Jabbour, A.B.L. Relationship between green management and environmental training in companies located in Brazil: A theoretical framework and case studies. Int. J. Prod. Econ. 2012 , 140 , 318–329. [ Google Scholar ] [ CrossRef ]
  • Shields, J.; Brown, M.; Kaine, S.; Dolle-Samuel, C.; North-Samardzic, A.; McLean, P.; Johns, R.; O’Leary, P.; Plimmer, G.; Robinson, J. Managing Employee Performance & Reward: Concepts, Practices, Strategies ; Cambridge University Press: Cambridge, UK, 2015. [ Google Scholar ]
  • Pinzone, M.; Guerci, M.; Lettieri, E.; Huisingh, D. Effects of ‘green’training on pro-environmental behaviors and job satisfaction: Evidence from the Italian healthcare sector. J. Clean. Prod. 2019 , 226 , 221–232. [ Google Scholar ] [ CrossRef ]
  • Ogbeibu, S.; Emelifeonwu, J.; Senadjki, A.; Gaskin, J.; Kaivo-oja, J. Technological turbulence and greening of team creativity, product innovation, and human resource management: Implications for sustainability. J. Clean. Prod. 2020 , 244 , 118703. [ Google Scholar ] [ CrossRef ]
  • Amrutha, V.; Geetha, S. Linking organizational green training and voluntary workplace green behavior: Mediating role of green supporting climate and employees’ green satisfaction. J. Clean. Prod. 2021 , 290 , 125876. [ Google Scholar ] [ CrossRef ]
  • Paillé, P.; Valéau, P. “I don’t owe you, but I am committed”: Does felt obligation matter on the effect of green training on employee environmental commitment? Organ. Environ. 2021 , 34 , 123–144. [ Google Scholar ] [ CrossRef ]
  • Jose Chiappetta Jabbour, C. How green are HRM practices, organizational culture, learning and teamwork? A Brazilian study. Ind. Commer. Train. 2011 , 43 , 98–105. [ Google Scholar ] [ CrossRef ]
  • Daily, B.F.; Bishop, J.W.; Massoud, J.A. The role of training and empowerment in environmental performance: A study of the Mexican maquiladora industry. Int. J. Oper. Prod. Manag. 2012 , 32 , 631–647. [ Google Scholar ] [ CrossRef ]
  • Bos-Nehles, A.C.; Van Riemsdijk, M.J.; Kees Looise, J. Employee perceptions of line management performance: Applying the AMO theory to explain the effectiveness of line managers’ HRM implementation. Hum. Resour. Manag. 2013 , 52 , 861–877. [ Google Scholar ] [ CrossRef ]
  • Gerhart, B. Human resources and business performance: Findings, unanswered questions, and an alternative approach. Manag. Rev. 2005 , 16 , 174–185. Available online: http://hdl.handle.net/10419/78948 (accessed on 1 June 2023). [ CrossRef ]
  • Hart, S.L. A natural-resource-based view of the firm. Acad. Manag. Rev. 1995 , 20 , 986–1014. [ Google Scholar ] [ CrossRef ] [ Green Version ]
  • Emerson, R.M. Social Exchange Theory. Annu. Rev. Sociol. 1976 , 2 , 335–362. Available online: http://www.jstor.org/stable/2946096 (accessed on 1 June 2023). [ CrossRef ]
  • Ashforth, B.E.; Mael, F. Social identity theory and the organization. Acad. Manag. Rev. 1989 , 14 , 20–39. [ Google Scholar ] [ CrossRef ]
  • Adubor, N.V.; Adeniji, A.A.; Salau, O.P.; Olajugba, O.J.; Onibudo, G.O. Exploring Green Human Resource Adoption and Corporate Sustainability in Nigerian Manufacturing Industry. Sustainability 2022 , 14 , 12635. [ Google Scholar ] [ CrossRef ]
  • Moin, M.F.; Omar, M.K.; Wei, F.; Rasheed, M.I.; Hameed, Z. Green HRM and psychological safety: How transformational leadership drives follower’s job satisfaction. Curr. Issues Tour. 2021 , 24 , 2269–2277. [ Google Scholar ] [ CrossRef ]
  • Farooq, R.; Zhang, Z.; Talwar, S.; Dhir, A. Do green human resource management and self-efficacy facilitate green creativity? A study of luxury hotels and resorts. J. Sustain. Tour. 2022 , 30 , 824–845. [ Google Scholar ] [ CrossRef ]
  • Jabbour, C.J.C.; Jugend, D.; de Sousa Jabbour, A.B.L.; Gunasekaran, A.; Latan, H. Green product development and performance of Brazilian firms: Measuring the role of human and technical aspects. J. Clean. Prod. 2015 , 87 , 442–451. [ Google Scholar ] [ CrossRef ]
  • Song, W.; Yu, H.; Xu, H. Effects of green human resource management and managerial environmental concern on green innovation. Eur. J. Innov. Manag. 2021 , 24 , 951–967. [ Google Scholar ] [ CrossRef ]
  • Gim, G.C.; Ooi, S.K.; Teoh, S.T.; Lim, H.L.; Yeap, J.A. Green human resource management, leader–member exchange, core self-evaluations and work engagement: The mediating role of human resource management performance attributions. Int. J. Manpow. 2022 , 43 , 682–700. [ Google Scholar ] [ CrossRef ]
  • Yong, J.Y.; Yusliza, M.Y.; Ramayah, T.; Chiappetta Jabbour, C.J.; Sehnem, S.; Mani, V. Pathways towards sustainability in manufacturing organizations: Empirical evidence on the role of green human resource management. Bus. Strategy Environ. 2020 , 29 , 212–228. [ Google Scholar ] [ CrossRef ]
  • Mousa, S.K.; Othman, M. The impact of green human resource management practices on sustainable performance in healthcare organisations: A conceptual framework. J. Clean. Prod. 2020 , 243 , 118595. [ Google Scholar ] [ CrossRef ]
  • Muisyo, P.K.; Qin, S.; Julius, M.M.; Ho, T.H.; Ho, T.H. Green HRM and employer branding: The role of collective affective commitment to environmental management change and environmental reputation. J. Sustain. Tour. 2022 , 30 , 1897–1914. [ Google Scholar ] [ CrossRef ]
  • Rehman, S.U.; Kraus, S.; Shah, S.A.; Khanin, D.; Mahto, R.V. Analyzing the relationship between green innovation and environmental performance in large manufacturing firms. Technol. Forecast. Soc. Chang. 2021 , 163 , 120481. [ Google Scholar ] [ CrossRef ]
  • Karatepe, O.M.; Hsieh, H.; Aboramadan, M. The effects of green human resource management and perceived organizational support for the environment on green and non-green hotel employee outcomes. Int. J. Hosp. Manag. 2022 , 103 , 103202. [ Google Scholar ] [ CrossRef ]
  • Saeed, B.B.; Afsar, B.; Hafeez, S.; Khan, I.; Tahir, M.; Afridi, M.A. Promoting employee’s proenvironmental behavior through green human resource management practices. Corp. Soc. Responsib. Environ. Manag. 2019 , 26 , 424–438. [ Google Scholar ] [ CrossRef ]
  • Fawehinmi, O.; Yusliza, M.Y.; Mohamad, Z.; Noor Faezah, J.; Muhammad, Z. Assessing the green behaviour of academics: The role of green human resource management and environmental knowledge. Int. J. Manpow. 2020 , 41 , 879–900. [ Google Scholar ] [ CrossRef ]
  • Rubel, M.R.B.; Kee, D.M.H.; Rimi, N.N. The influence of green HRM practices on green service behaviors: The mediating effect of green knowledge sharing. Empl. Relat. Int. J. 2021 , 43 , 996–1015. [ Google Scholar ] [ CrossRef ]
  • Chaudhary, R. Green human resource management and employee green behavior: An empirical analysis. Corp. Soc. Responsib. Environ. Manag. 2020 , 27 , 630–641. [ Google Scholar ] [ CrossRef ]
  • Hameed, Z.; Khan, I.U.; Islam, T.; Sheikh, Z.; Naeem, R.M. Do green HRM practices influence employees’ environmental performance? Int. J. Manpow. 2020 , 41 , 1061–1079. [ Google Scholar ] [ CrossRef ]
  • Rizvi, Y.S.; Garg, R. The simultaneous effect of green ability-motivation-opportunity and transformational leadership in environment management: The mediating role of green culture. Benchmarking Int. J. 2021 , 28 , 830–856. [ Google Scholar ] [ CrossRef ]
  • Umrani, W.A.; Channa, N.A.; Ahmed, U.; Syed, J.; Pahi, M.H.; Ramayah, T. The laws of attraction: Role of green human resources, culture and environmental performance in the hospitality sector. Int. J. Hosp. Manag. 2022 , 103 , 103222. [ Google Scholar ] [ CrossRef ]
  • He, J.; Morrison, A.M.; Zhang, H. Being sustainable: The three-way interactive effects of CSR, green human resource management, and responsible leadership on employee green behavior and task performance. Corp. Soc. Responsib. Environ. Manag. 2021 , 28 , 1043–1054. [ Google Scholar ] [ CrossRef ]
  • Ren, S.; Tang, G.; Jackson, S.E. Effects of Green HRM and CEO ethical leadership on organizations’ environmental performance. Int. J. Manpow. 2021 , 42 , 961–983. [ Google Scholar ] [ CrossRef ]
  • Roscoe, S.; Subramanian, N.; Jabbour, C.J.; Chong, T. Green human resource management and the enablers of green organisational culture: Enhancing a firm’s environmental performance for sustainable development. Bus. Strategy Environ. 2019 , 28 , 737–749. [ Google Scholar ] [ CrossRef ]
  • Ahmed, M.; Guo, Q.; Qureshi, M.A.; Raza, S.A.; Khan, K.A.; Salam, J. Do green HR practices enhance green motivation and proactive environmental management maturity in hotel industry? Int. J. Hosp. Manag. 2021 , 94 , 102852. [ Google Scholar ] [ CrossRef ]
  • Cabral, C.; Jabbour, C.J.C. Understanding the human side of green hospitality management. Int. J. Hosp. Manag. 2020 , 88 , 102389. [ Google Scholar ] [ CrossRef ]
  • Islam, M.A.; Jantan, A.H.; Yusoff, Y.M.; Chong, C.W.; Hossain, M.S. Green Human Resource Management (GHRM) practices and millennial employees’ turnover intentions in tourism industry in malaysia: Moderating role of work environment. Glob. Bus. Rev. 2020 . [ Google Scholar ] [ CrossRef ]
  • O’Donohue, W.; Torugsa, N. The moderating effect of ‘Green’HRM on the association between proactive environmental management and financial performance in small firms. Int. J. Hum. Resour. Manag. 2016 , 27 , 239–261. [ Google Scholar ] [ CrossRef ] [ Green Version ]
  • Yusliza, M.-Y.; Norazmi, N.A.; Jabbour, C.J.C.; Fernando, Y.; Fawehinmi, O.; Seles, B.M.R.P. Top management commitment, corporate social responsibility and green human resource management: A Malaysian study. Benchmarking: Int. J. 2019 , 26 , 2051–2078. [ Google Scholar ] [ CrossRef ]
  • Yu, J.; Zhu, L. Corporate ambidexterity: Uncovering the antecedents of enduring sustainable performance. J. Clean. Prod. 2022 , 365 , 132740. [ Google Scholar ] [ CrossRef ]
  • Pinzone, M.; Guerci, M.; Lettieri, E.; Redman, T. Progressing in the change journey towards sustainability in healthcare: The role of ‘Green’HRM. J. Clean. Prod. 2016 , 122 , 201–211. [ Google Scholar ] [ CrossRef ]

Click here to enlarge figure

Database search strategy and automatic filters (searches run on 19 April 2023):

Database: Scopus
Search terms: TITLE-ABS-KEY = ("green hr*" OR "green human resource*" OR "green attract*" OR "green recruit*" OR "green select*" OR "green train*" OR "green performance manag*" OR "green performance apprais*" OR "green pay*" OR "green compensat*" OR "green reward" OR "green employee*")
Initial records (N): 963
Eligible criteria set in automatic filters: Publication years: 2022; Document type: Article; Source type: Journal; Publication stage: Final; Subject area: Business, Management, and Accounting; Language: English
Records marked as ineligible and excluded (n): 636

Database: Web of Science Core Collection
Search terms: TS = ("green hr*" OR "green human resource*" OR "green attract*" OR "green recruit*" OR "green select*" OR "green train*" OR "green performance manag*" OR "green performance apprais*" OR "green pay*" OR "green compensat*" OR "green reward" OR "green employee*")
Initial records (N): 796
Eligible criteria set in automatic filters: Publication years: 2022; Document type: Article; Publication stage: Final; Language: English; Citation Topics Meso: Management; Research area: Business Economics, Environmental Sciences Ecology, Social Sciences Other Topics
Records marked as ineligible and excluded (n): 406
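To make the arithmetic behind these filter counts explicit, the short sketch below (plain Python, written for this summary rather than taken from the article) shows how the initial records and automatic exclusions translate into the pool of records that would enter manual screening; the deduplication step and its overlap value are assumptions for illustration only.

```python
# Illustrative bookkeeping for the two database searches summarized above.
# The initial and excluded counts come from the search-strategy table; the
# deduplication step and its overlap figure are hypothetical placeholders,
# since the exact duplicate count is not reproduced in this table.

searches = {
    "Scopus": {"initial": 963, "excluded_by_filters": 636},
    "Web of Science Core Collection": {"initial": 796, "excluded_by_filters": 406},
}

# Records left in each database after the automatic filters were applied.
remaining = {
    db: c["initial"] - c["excluded_by_filters"] for db, c in searches.items()
}
print(remaining)  # {'Scopus': 327, 'Web of Science Core Collection': 390}

# Pool of records before duplicates shared by the two databases are removed.
total_before_dedup = sum(remaining.values())
print(total_before_dedup)  # 717

# Hypothetical deduplication step (e.g., matching on DOI); the overlap value
# below is a placeholder, not a figure reported by the review.
assumed_overlap = 0
records_for_manual_screening = total_before_dedup - assumed_overlap
print(records_for_manual_screening)
```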
Research questions (RQs), evidence from the GHRM literature, and new findings of GHRM:

RQ1. What is the status quo of empirical GHRM research?
Evidence from the GHRM literature: (1) significant growth that varies across research disciplines; (2) research context: mainly Asia, developing economies, and the service and manufacturing sectors; (3) research clusters: theory implementation and framework extension, linking with employees, and linking with the organization; (4) keyword trends: pro-environmental behavior, green creativity, and competitive advantage appear in recent studies.
New findings of GHRM: (1) attention not only from HRM but also from across disciplines; (2) a lack of evidence in the primary sector and a lack of industry description for the manufacturing sector; (3) three main research streams; (4) recent interest centers on employee-related issues.

RQ2. How has GHRM been conceptualized, and how have theories been applied in empirical research?
Evidence from the GHRM literature: (1) GHRM dimensions: most studies adopted bundled GHRMPs, and GT is among the most examined GHRMP aspects; bundled GHRMPs and GT were found to relate positively to desired outcomes in most cases; (2) theory application: around half of the studies were based on a single theory; AMO theory was used most, followed by RBV, SET, and SIT; some recent studies used multiple theories.
New findings of GHRM: (1) a preference for bundled GHRMPs and GT, which show a high likelihood of positive effects; comparisons between individual GHRMPs were rarely made; (2) a tendency to use multiple theories and to conduct research from different perspectives.

RQ3. What suggestions can be offered for future GHRM development based on the evidence?
Evidence from the GHRM literature: (1) research that adopted the same measurement scales for GHRM constructs shows similar choices in the design of GHRM dimensions; (2) research that applied the same theories shows similar inclinations in construct level (organizational vs. employee level).
New findings of GHRM: (1) the selection of GHRM dimensions influences the choice of measurement scales; (2) theory application is interrelated with construct choice and framework design.

Share and Cite

Xie, H.; Lau, T.C. Evidence-Based Green Human Resource Management: A Systematic Literature Review. Sustainability 2023, 15, 10941. https://doi.org/10.3390/su151410941

IMAGES

(Image gallery thumbnails for "empirical evidence in qualitative research")

  1. 15 Empirical Evidence Examples (2024)
  2. Empirical Evidence
  3. Empirical Research: Definition, Methods, Types and Examples
  4. What Is Empirical Research? Definition, Types & Samples in 2024
  5. What Is Empirical Research? Definition, Types & Samples
  6. Empirical Research: Definition, Methods, Types and Examples

VIDEO

  1. Quantitative vs qualitative evidence in policymaking

  2. Sociology Scientific Method

  3. Using Qualitative Methods to Synthesise Evidence (Dr Melissa Bond)

  4. PIA6204 Week1 Section1: FOUNDATIONS OF RESEARCH METHODS: UNDERSTANDING TRUTH

  5. Session 04: Data Analysis techniques in Qualitative Research

  6. What is Qualitative Research

COMMENTS

  1. What is "Empirical Research"?

    Another hint: some scholarly journals use a specific layout, called the "IMRaD" format, to communicate empirical research findings. Such articles typically have 4 components: Introduction : sometimes called "literature review" -- what is currently known about the topic -- usually includes a theoretical framework and/or discussion of previous ...

  2. What is Qualitative in Qualitative Research

    The point is that the ideal-typical quantitative process does not imply change of the data, and iteration between data, evidence, hypotheses, empirical work, and theory. The data, quantified variables, are, in most cases fixed. ... Lamont M. Evaluating qualitative research: Some empirical findings and an agenda. In: Lamont M, White P, editors.

  3. Empirical Research: Definition, Methods, Types and Examples

    Empirical research is defined as any research where conclusions of the study is strictly drawn from concretely empirical evidence, and therefore "verifiable" evidence. This empirical evidence can be gathered using quantitative market research and qualitative market research methods. For example: A research is being conducted to find out if ...

  4. Empirical evidence: A definition

    Empirical evidence is information that is acquired by observation or experimentation and is used to prove or disprove a hypothesis. ... Qualitative research, often used in the social sciences, ...

  5. Empirical Research: Defining, Identifying, & Finding

    The evidence collected during empirical research is often referred to as "data." ... Qualitative research -- an approach to documenting reality that relies on words and images as the primary data source (p. 33). Both quantitative and qualitative methods are empirical. If you can recognize that a research study is quantitative or qualitative ...

  6. Chapter 21: Qualitative evidence

    The aim of this chapter is to provide authors (who already have experience of undertaking qualitative research and qualitative evidence synthesis) with additional guidance on undertaking a qualitative evidence synthesis that is subsequently integrated with an intervention review. ... Theoretical findings may combine empirical evidence, expert ...

  7. Empirical evidence

    Empirical evidence can be quantitative or qualitative. Typically, numerical quantitative evidence can be represented visually by means of diagrams, graphs, or charts, reflecting the use of statistical or mathematical data and the researcher's neutral noninteractive role.

  8. Empirical research

    Empirical research is research using empirical evidence. ... Quantifying the evidence or making sense of it in qualitative form, a researcher can answer empirical questions, which should be clearly defined and answerable with the evidence collected (usually called data). Research design varies by field and by the question being investigated.

  9. Planning Qualitative Research: Design and Decision Making for New

    While many books and articles guide various qualitative research methods and analyses, there is currently no concise resource that explains and differentiates among the most common qualitative approaches. We believe novice qualitative researchers, students planning the design of a qualitative study or taking an introductory qualitative research course, and faculty teaching such courses can ...

  10. Qualitative Research: Data Collection, Analysis, and Management

    Qualitative research is used to gain insights into people's feelings and thoughts, which may provide the basis for a future stand-alone qualitative study or may help researchers to map out survey instruments for use in a quantitative study. ... the researchers must have felt that there was anecdotal evidence of differences in access to ...

  11. The Central Role of Theory in Qualitative Research

    Bott ends the article by saying the project has shown "how the 'othering' of researcher subjectivity by research subjects and how we as researchers feel about those often 'othering' appropriations, as well as our own intellectual reactions to them, can affect the empirical, epistemological and ethical concerns of the research" (p. 172).

  12. What Is Qualitative Research?

    Qualitative research involves collecting and analyzing non-numerical data (e.g., text, video, or audio) to understand concepts, opinions, or experiences. It can be used to gather in-depth insights into a problem or generate new ideas for research. Qualitative research is the opposite of quantitative research, which involves collecting and ...

  13. A Review of the Quality Indicators of Rigor in Qualitative Research

    Abstract. Attributes of rigor and quality and suggested best practices for qualitative research design as they relate to the steps of designing, conducting, and reporting qualitative research in health professions educational scholarship are presented. A research question must be clear and focused and supported by a strong conceptual framework ...

  14. PDF 1 Introduction

    the logic behind the empirical research process, whether qualitative, quantitative or mixed methods. (1) The pre-empirical stage of research: A sharp distinction is made in the early part of this book between the pre-empirical and empirical stages of research. It is stressed in Chapters 4 and 5 that ...

  15. Empirical Research

    Strategies for Empirical Research in Writing is a particularly accessible approach to both qualitative and quantitative empirical research methods, helping novices appreciate the value of empirical research in writing while easing their fears about the research process. This comprehensive book covers research methods ranging from traditional ...

  16. Toward Developing a Framework for Conducting Case Study Research

    Stake mentions four defining characteristics of qualitative research which are valid for qualitative case studies as well: they are "holistic," "empirical," "interpretive," and "emphatic." Whether the study is experimental or quasi-experimental, the data collection and analysis methods are known to hide some details (Yazan, 2015).

  17. Empirical Research

    In its many guises, qualitative research is a form of empirical inquiry that typically entails some form of purposive sampling for information-rich cases; in-depth interviews and open-ended interviews, lengthy participant/field observations, and/or document or artifact study; and techniques for analysis and interpretation of data that move ...

  18. Demystifying Empirical Research: Techniques and Examples

    Conclusion. Demystifying empirical research involves understanding the various techniques and examples used in the process. By exploring the methodologies, data collection methods, and analysis techniques discussed in this blog, researchers can gain a clearer insight into how empirical research is conducted.

  19. What is Qualitative in Qualitative Research

    Qualitative research involves the studied use and collection of a variety of empirical materials - case study, personal experience, introspective, life story, interview, observational, historical, interactional, and visual texts - that describe routine and problematic moments and meanings in individuals' lives.

  20. Case Study Methodology of Qualitative Research: Key Attributes and

    A case study is one of the most commonly used methodologies of social research. This article attempts to look into the various dimensions of a case study research strategy, the different epistemological strands which determine the particular case study type and approach adopted in the field, discusses the factors which can enhance the effectiveness of a case study research, and the debate ...

  21. Qualitative, Quantitative & Empirical Research

    An empirical study is research derived from actual observation or experimentation. The written articles resulting from empirical studies undergo a rigorous review by experts in the field of study prior to being published in journals.

  22. Synthesising quantitative and qualitative evidence to inform guidelines

    Pluye and Hong define mixed-methods research as "a research approach in which a researcher integrates (a) qualitative and quantitative research questions, (b) qualitative research methods and quantitative research designs, (c) techniques for collecting and analyzing qualitative and quantitative evidence, and (d) qualitative findings and quantitative results". A mixed-method synthesis ...

  23. Evidence‐based multimodal learning analytics for feedback and

    Empirical evidence supporting the potential of MMLA in assisting teachers to facilitate students' reflective practices during intricate collaborative learning scenarios. ... Additionally, there is a need for more research on the qualitative aspects of communication and the development of measures to capture these aspects in the MMLA solution ...

  24. Empirical analysis and findings: qualitative investigation

    These themes were not designed to produce new research evidence, but were intended to align with the NIHR aim of learning more about KT by conducting implementation activity. ... Empirical analysis and findings: qualitative investigation - Networked innovation in the health sector: comparative qualitative study of the role of Collaborations for ...

  25. Sustainability

    To better understand the empirical development of green human resource management (GHRM) research and theories and to provide evidence-based suggestions, the article conducts a systematic review of evidence-based studies within the academic field of GHRM. The review follows the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) Protocol 2020 to select GHRM-focused and ...

  26. School counsellors' reporting child maltreatment: A rapid review of

    The empirical evidence shows that the experiences of school counsellors in reporting child maltreatment are seldom investigated despite school counsellors being considered frontline members of the school-based child protection workforce for decades (World Health Organisation, 2019). The existing evidence is limited to having been generated in ...