The qualitative research process, end-to-end

A step-by-step guide to the qualitative research process

Although research processes vary by methodology and project team, some fundamentals hold across research projects. The steps below outline the process qualitative researchers typically follow to conduct research.

Step 1: Determine what to research

Once a researcher has determined a list of potential projects to tackle, they will prioritize those projects based on business impact, available resourcing, timelines, and dependencies to create a research roadmap. For each project, they will also identify the key questions the research needs to answer.

The researcher should identify the participants they plan to research, along with any key attributes that are 'must-have' or 'nice-to-have', as these can influence the research approach (e.g. a niche group may require a longer recruiting timeline).

Researchers will generally aim for a mix of project types. Some may be more tactical requests from stakeholders, while others will be projects the researcher has proactively identified as opportunities for strategic research.

It is easier to shortlist potential methodologies based on where a research project falls within the product life cycle. Image from the Nielsen Norman Group.

Step 2: Identify how to research it

Once the researcher has finalized the research project, they will need to figure out how they will do the work.

First, the researcher will look through secondary data and research (e.g. analytics, previous research reports). Secondary analysis helps determine whether any of the open questions already have answers, ensuring that a net-new study doesn't duplicate existing work (unless previous research is out of date).

A quadrant showing where different types of research fall.

After scoping the research, researchers will determine whether the research input needs to be attitudinal (i.e. what someone says) or behavioral (i.e. what someone does), and whether they need to explore a problem space or evaluate a product – these two dimensions help determine which methodology to use. There are many methodologies out there, but from a qualitative perspective the main ones you will generally encounter are:

Interviews [Attitudinal / Exploratory] – a semi-structured conversation with a participant, focused on a small set of topics. Runs for 30–60 minutes.

Contextual Inquiry [Behavioral / Exploratory] – observation of a participant in their environment. Probing questions may be asked during the observation. Runs for 2–3 hours.

Survey [Attitudinal / Evaluative] – gathering structured information from a sample of people, traditionally to generalize the results to a larger population. Surveys should generally take participants no more than 10 minutes to complete.

Usability Test [Behavioral / Evaluative] – evaluating whether representative users are able to complete critical tasks within an experience (see the sketch below for one way to tally task success rates).
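
As a concrete illustration of the evaluative side, task completion is the headline usability-test metric. Below is a minimal sketch in Python, with made-up participants and task names, of tallying per-task success rates:

```python
# Hypothetical usability-test results: one record per participant per task.
results = [
    {"participant": "P1", "task": "create_playlist", "complete": True},
    {"participant": "P2", "task": "create_playlist", "complete": False},
    {"participant": "P1", "task": "set_sleep_timer", "complete": True},
    {"participant": "P2", "task": "set_sleep_timer", "complete": True},
]

def success_rates(records):
    """Return the share of attempts in which each task was completed."""
    totals, passes = {}, {}
    for r in records:
        totals[r["task"]] = totals.get(r["task"], 0) + 1
        passes[r["task"]] = passes.get(r["task"], 0) + int(r["complete"])
    return {task: passes[task] / totals[task] for task in totals}

print(success_rates(results))
# {'create_playlist': 0.5, 'set_sleep_timer': 1.0}
```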

Check out these articles for more information about different methodologies:

When to Use Which User-Experience Research Methods

UX Research Cheat Sheet

Usability.gov

Design Research Kit

Step 3: Get buy-in and alignment from others

Once a researcher has determined what they will be researching and how they will research it, they will generally write up a research plan that includes additional information about the research goals, participant scope, timelines, and dependencies. The plan is typically either a document or presentation shared with stakeholders depending on the company and how they work.
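
There is no single required format, but if it helps to keep plans consistent across projects, the core fields of a plan can be captured in a lightweight structure. A minimal sketch in Python with hypothetical field names and values; real plans are usually documents or decks with far more narrative context:

```python
# Illustrative research-plan skeleton (field names are assumptions, not a standard).
research_plan = {
    "research_goals": ["Understand why trial users churn in their first week"],
    "key_questions": ["What do users expect to accomplish on day one?"],
    "methodology": "semi-structured interviews",
    "participant_scope": {"segment": "trial users", "must_have": ["signed up in the last 30 days"]},
    "timeline": {"recruiting": "2 weeks", "sessions": "1-2 weeks", "synthesis": "1 week"},
    "dependencies": ["prototype from design", "incentive budget approval"],
    "stakeholders": ["product manager", "design lead"],
}

for field, value in research_plan.items():
    print(f"{field}: {value}")
```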

After the research plan is complete, researchers will share it for feedback and input from their stakeholders to ensure that stakeholders have the right expectations going into the research. Stakeholders may ask for additional question topics, confirm that the research will be executed against specific timelines, or provide recommendations on how the study can inform product decisions.

At organizations with a research team, researchers may also share their plan with other researchers informally or through a 'crit' process. Generally, other researchers will provide feedback on the research craft, such as methodologies, participant mixes, and the research goals or questions.

Once the researcher feels confident in their plan, they will either begin to plan the research, or in the case of more junior researchers, get approval from their manager to begin the study.

Step 4: Prepare research

This step is where the researcher will get all of their ducks in a row to execute the research. Preparation activities include:

Equipment: Booking venues, labs, observation rooms, and procuring any appropriate equipment needed to run the study (e.g. cameras, mobile devices).

Participants: Sourcing participants from internal / external databases, reaching out/scheduling participants, managing schedule changes.

Incentives: Finding budget, identifying the incentive type (e.g. Amazon gift cards, customer credits, gift baskets), and purchasing.

Assets: Building relevant designs / prototypes (with design or design technologists), creating interview / observation guides and other research tools needed for sessions (e.g. physical cards for in-person card sorts).

Legal & Procurement: Preparing participant waivers or NDAs so they are sent to participants in advance of the research session, plus vendor procurement and management.

If Research Operations exists within an organization, they will generally take on most of the load in this area. The researcher will focus on assets required for executing research, such as interview guides.

In some cases, vendors may be engaged for some of these requirements (e.g. labs, participants, and incentive management) if resourcing is not available internally or if a researcher wants a blinded study (i.e. the participant doesn't know what company is running the research). In this case, additional time is incorporated to brief, onboard, and get approvals to work with the vendor.

Step 5: Execute research

Now the researcher gets to research!

Researchers will generally aim to execute research activities over 1–2 weeks, depending on the methodology, to keep execution efficient. More longitudinal methods (e.g. diary studies), or participant types that are harder to recruit, may take longer.

In consumer research, there will usually be backup participants available in case of no-shows. In business or enterprise research, however, researchers will engage with all recruited participants, as participants will generally have relationships with other parts of the company (e.g. sales), and it is essential to maintain those relationships post-research.

During sessions, in a perfect world, there is one facilitator (the principal researcher) and, in some cases, a secondary attendee who takes notes – this can be a stakeholder or a more junior researcher who can learn facilitation skills from the primary researcher. By delegating note-taking, the principal researcher can focus on driving and managing the conversation with the participant.

However, in most cases (especially if there is a "research team of one"), researchers will have to do both facilitation and documentation – this can lead to a clunkier conversation as the facilitator juggles writing notes with thinking of the next question. If a researcher records the session instead, they will have to spend additional time afterward listening to the full recordings and writing notes.

In qualitative research, researchers may begin to see patterns in the findings after about five sessions. They may then tailor the research questions to be more specific to gaps in their understanding.
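
The 'patterns after about five sessions' heuristic traces back to Nielsen and Landauer's problem-discovery model, in which the share of issues uncovered by n sessions is 1 - (1 - p)^n, where p is the probability that a single session surfaces a given issue (their commonly cited estimate is roughly 0.31). A small sketch of that curve; treat p as an assumption to sanity-check for your own product and participants:

```python
# Expected share of issues found after n sessions, per the Nielsen & Landauer
# model: 1 - (1 - p) ** n. p = 0.31 is a commonly cited estimate, not a law.
P_PER_SESSION = 0.31

def share_of_issues_found(n_sessions, p=P_PER_SESSION):
    return 1 - (1 - p) ** n_sessions

for n in (1, 3, 5, 10, 15):
    print(f"{n:>2} sessions -> ~{share_of_issues_found(n):.0%} of issues")
# Five sessions lands in the mid-80% range, which is why patterns start to feel stable.
```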

Researchers may also set up an observation room for stakeholders (or share links to remote sessions) so they can attend live. Generally, researchers will have a backchannel (e.g. Slack, chat, or SMS), so if a stakeholder has a follow-up question to an answer, the researcher can dig deeper. In some cases, researchers will give stakeholders an input form for their notes that can be shared with the researcher afterward – this can help the researcher understand how stakeholders view the research and what they perceive as most important among the insights.

Step 6: Synthesize and find insights

Once research capture is complete, the researcher will aggregate the findings and begin to look for common themes (in exploratory research) or success rates (in evaluative research). Both feed into insight generation, and the researcher will then tie the insights back to the project's original research goals.
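
For the exploratory case, a common first pass is to code each session's notes against a set of themes and then tally how often each theme appears and how many participants mentioned it. A minimal sketch in Python, assuming the tags have already been applied by hand (the theme names are invented):

```python
from collections import Counter

# Hypothetical hand-applied theme codes, one list of mentions per session.
session_tags = {
    "P1": ["pricing_confusion", "onboarding_friction", "pricing_confusion"],
    "P2": ["onboarding_friction"],
    "P3": ["pricing_confusion", "feature_discovery"],
    "P4": ["onboarding_friction", "feature_discovery"],
    "P5": ["onboarding_friction"],
}

mentions = Counter(tag for tags in session_tags.values() for tag in tags)
reach = Counter(tag for tags in session_tags.values() for tag in set(tags))

for theme, count in mentions.most_common():
    print(f"{theme}: {count} mentions across {reach[theme]} of {len(session_tags)} participants")
```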

As analysis can be one of the highest-effort tasks in research, researchers will lean toward efficiency, generally using digital tools, hacks, or workarounds. Researchers usually develop an analysis process that they refine throughout their careers to become more efficient.

In cases where researchers are looking to get buy-in for research or capture stakeholder input, they may turn to more visual approaches (e.g. post-it affinity analysis) in war rooms. This can take longer (especially with a high volume of data), but analyzing research this way can have a higher impact – especially if the researcher is looking to get buy-in for future projects.

Step 7: Create research outputs

After a researcher identifies the key themes and insights, they will reframe these findings into a relevant research output to ensure that stakeholders understand and buy in to the outcomes. Outputs may include:

Report: Outlines key findings from the research in a document or presentation format. Will most likely include an executive summary, insight themes, and supporting evidence.

Videos: A highlight reel of supporting evidence for key findings. Generally seen as more engaging than a report alone; in most cases, the video complements the research report.

Personas : A written representation of a product's intended users to understand different types of user goals, needs, and behaviors. Also used to help stakeholders build empathy for the end-user of the product.

Journey Map : A visualization of the process that a person goes through to accomplish a goal. Generally created in conjunction with a persona.

Concepts / Wireframes / Designs: If research is evaluative, designs can visualize recommendations.

Storyboarding

Before creating the output, researchers will spend time planning its structure and storyboarding. Storyboarding helps researchers define information requirements and ensure they present their findings to stakeholders in the most impactful way.

Having a point of view in outputs

Historically, researchers have tried to stay neutral to the data and avoid strong opinions or perspectives, letting the data speak. However, as researchers have become more embedded in the industry, this has shifted: stakeholders now want a strong point of view or recommendations from researchers to help others (especially product managers and designers) make decisions based on the knowledge captured in the research.

Having a strong perspective helps researchers have a seat at the table and appear as a trusted advisor/partner in cross-functional settings.

Step 8: Share and follow up on findings

After the research outputs are complete, some researchers will do a "pre-share" or walkthrough with key stakeholders or potential detractors of the research. The purpose of these meetings is to align on expectations with stakeholders and surface potential 'watch-outs' (things that may derail a presentation).

Researchers will generally have to share their findings multiple times with different stakeholder groups, tailoring them for each audience. For example, executive readouts will be higher level than a meeting with a product manager.

After sharing, researchers will follow up with key stakeholders (especially those who provided input to the research) to confirm they understand the findings and to identify next steps. Next steps may include incorporating results into product strategy documents, proposals / PRDs, or user stories to ensure that the recommendations or findings are reflected and sourced.


What Is a Research Design | Types, Guide & Examples

Published on June 7, 2021 by Shona McCombes. Revised on November 20, 2023 by Pritha Bhandari.

A research design is a strategy for answering your research question using empirical data. Creating a research design means making decisions about:

  • Your overall research objectives and approach
  • Whether you’ll rely on primary research or secondary research
  • Your sampling methods or criteria for selecting subjects
  • Your data collection methods
  • The procedures you’ll follow to collect data
  • Your data analysis methods

A well-planned research design helps ensure that your methods match your research objectives and that you use the right kind of analysis for your data.

Table of contents

  • Step 1: Consider your aims and approach
  • Step 2: Choose a type of research design
  • Step 3: Identify your population and sampling method
  • Step 4: Choose your data collection methods
  • Step 5: Plan your data collection procedures
  • Step 6: Decide on your data analysis strategies
  • Other interesting articles
  • Frequently asked questions about research design

Step 1: Consider your aims and approach

Before you can start designing your research, you should already have a clear idea of the research question you want to investigate.

There are many different ways you could go about answering this question. Your research design choices should be driven by your aims and priorities—start by thinking carefully about what you want to achieve.

The first choice you need to make is whether you’ll take a qualitative or quantitative approach.

A qualitative approach focuses on gaining a rich, detailed understanding of concepts and experiences, while a quantitative approach focuses on measuring variables and describing frequencies, averages, and correlations about relationships between variables.

Qualitative research designs tend to be more flexible and inductive , allowing you to adjust your approach based on what you find throughout the research process.

Quantitative research designs tend to be more fixed and deductive , with variables and hypotheses clearly defined in advance of data collection.

It’s also possible to use a mixed-methods design that integrates aspects of both approaches. By combining qualitative and quantitative insights, you can gain a more complete picture of the problem you’re studying and strengthen the credibility of your conclusions.

Practical and ethical considerations when designing research

As well as scientific considerations, you need to think practically when designing your research. If your research involves people or animals, you also need to consider research ethics .

  • How much time do you have to collect data and write up the research?
  • Will you be able to gain access to the data you need (e.g., by travelling to a specific location or contacting specific people)?
  • Do you have the necessary research skills (e.g., statistical analysis or interview techniques)?
  • Will you need ethical approval ?

At each stage of the research design process, make sure that your choices are practically feasible.


Step 2: Choose a type of research design

Within both qualitative and quantitative approaches, there are several types of research design to choose from. Each type provides a framework for the overall shape of your research.

Types of quantitative research designs

Quantitative designs can be split into four main types.

  • Experimental and quasi-experimental designs allow you to test cause-and-effect relationships.
  • Descriptive and correlational designs allow you to measure variables and describe relationships between them.

Purpose and characteristics of each type of design:

  • Experimental: manipulates an independent variable under controlled conditions, with random assignment to groups, to measure its effect on a dependent variable.
  • Quasi-experimental: similar to an experimental design, but without true random assignment to groups.
  • Correlational: measures two or more variables to describe the relationships between them, without manipulating any of them.
  • Descriptive: describes the characteristics, frequencies, or trends of a population or phenomenon.

With descriptive and correlational designs, you can get a clear picture of characteristics, trends and relationships as they exist in the real world. However, you can’t draw conclusions about cause and effect (because correlation doesn’t imply causation ).

Experiments are the strongest way to test cause-and-effect relationships without the risk of other variables influencing the results. However, their controlled conditions may not always reflect how things work in the real world. They’re often also more difficult and expensive to implement.

Types of qualitative research designs

Qualitative designs are less strictly defined. This approach is about gaining a rich, detailed understanding of a specific context or phenomenon, and you can often be more creative and flexible in designing your research.

The table below shows some common types of qualitative design. They often have similar approaches in terms of data collection, but focus on different aspects when analyzing the data.

Purpose and characteristics of each type of design:

  • Grounded theory: aims to build a theory inductively from the data, collecting and analyzing in iterative cycles until concepts and themes emerge.
  • Phenomenology: aims to understand and describe the lived experience of a phenomenon from the perspective of the people who experience it.

Step 3: Identify your population and sampling method

Your research design should clearly define who or what your research will focus on, and how you'll go about choosing your participants or subjects.

In research, a population is the entire group that you want to draw conclusions about, while a sample is the smaller group of individuals you’ll actually collect data from.

Defining the population

A population can be made up of anything you want to study—plants, animals, organizations, texts, countries, etc. In the social sciences, it most often refers to a group of people.

For example, will you focus on people from a specific demographic, region or background? Are you interested in people with a certain job or medical condition, or users of a particular product?

The more precisely you define your population, the easier it will be to gather a representative sample.

Sampling methods

Even with a narrowly defined population, it’s rarely possible to collect data from every individual. Instead, you’ll collect data from a sample.

To select a sample, there are two main approaches: probability sampling and non-probability sampling . The sampling method you use affects how confidently you can generalize your results to the population as a whole.

  • Probability sampling: every member of the population has a known chance of being selected, using random selection methods.
  • Non-probability sampling: individuals are selected based on non-random criteria, such as convenience or voluntary response.
Probability sampling is the most statistically valid option, but it’s often difficult to achieve unless you’re dealing with a very small and accessible population.

For practical reasons, many studies use non-probability sampling, but it’s important to be aware of the limitations and carefully consider potential biases. You should always make an effort to gather a sample that’s as representative as possible of the population.
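
To make the distinction concrete, here is a minimal sketch in Python contrasting a simple random sample (a probability method) with a convenience sample (a non-probability method). It assumes you have a full list of the population to sample from, which is exactly the condition that is often hard to meet in practice:

```python
import random

# Hypothetical sampling frame: the full list of people you could recruit.
population = [f"user_{i:03d}" for i in range(500)]

# Probability sampling: a simple random sample, so every member has an equal chance.
random.seed(42)  # seeded only to make the illustration reproducible
probability_sample = random.sample(population, k=20)

# Non-probability (convenience) sampling: e.g. the first 20 people who responded.
convenience_sample = population[:20]

print(probability_sample[:5])
print(convenience_sample[:5])
```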

Case selection in qualitative research

In some types of qualitative designs, sampling may not be relevant.

For example, in an ethnography or a case study , your aim is to deeply understand a specific context, not to generalize to a population. Instead of sampling, you may simply aim to collect as much data as possible about the context you are studying.

In these types of design, you still have to carefully consider your choice of case or community. You should have a clear rationale for why this particular case is suitable for answering your research question .

For example, you might choose a case study that reveals an unusual or neglected aspect of your research problem, or you might choose several very similar or very different cases in order to compare them.

Step 4: Choose your data collection methods

Data collection methods are ways of directly measuring variables and gathering information. They allow you to gain first-hand knowledge and original insights into your research problem.

You can choose just one data collection method, or use several methods in the same study.

Survey methods

Surveys allow you to collect data about opinions, behaviors, experiences, and characteristics by asking people directly. There are two main survey methods to choose from: questionnaires and interviews .

  • Questionnaires: written lists of questions that respondents answer themselves, on paper or online.
  • Interviews: questions asked and answered orally, one-on-one or in a group, ranging from structured to unstructured formats.

Observation methods

Observational studies allow you to collect data unobtrusively, observing characteristics, behaviors or social interactions without relying on self-reporting.

Observations may be conducted in real time, taking notes as you observe, or you might make audiovisual recordings for later analysis. They can be qualitative or quantitative.


Other methods of data collection

There are many other ways you might collect data depending on your field and topic.

Examples of data collection methods by field:

  • Media & communication: collecting a sample of texts (e.g., speeches, articles, or social media posts) for data on cultural norms and narratives
  • Psychology: using technologies like neuroimaging, eye-tracking, or computer-based tasks to collect data on things like attention, emotional response, or reaction time
  • Education: using tests or assignments to collect data on knowledge and skills
  • Physical sciences: using scientific instruments to collect data on things like weight, blood pressure, or chemical composition

If you’re not sure which methods will work best for your research design, try reading some papers in your field to see what kinds of data collection methods they used.

Secondary data

If you don’t have the time or resources to collect data from the population you’re interested in, you can also choose to use secondary data that other researchers already collected—for example, datasets from government surveys or previous studies on your topic.

With this raw data, you can do your own analysis to answer new research questions that weren’t addressed by the original study.

Using secondary data can expand the scope of your research, as you may be able to access much larger and more varied samples than you could collect yourself.

However, it also means you don’t have any control over which variables to measure or how to measure them, so the conclusions you can draw may be limited.


Step 5: Plan your data collection procedures

As well as deciding on your methods, you need to plan exactly how you'll use these methods to collect data that's consistent, accurate, and unbiased.

Planning systematic procedures is especially important in quantitative research, where you need to precisely define your variables and ensure your measurements are high in reliability and validity.

Operationalization

Some variables, like height or age, are easily measured. But often you’ll be dealing with more abstract concepts, like satisfaction, anxiety, or competence. Operationalization means turning these fuzzy ideas into measurable indicators.

If you’re using observations , which events or actions will you count?

If you’re using surveys , which questions will you ask and what range of responses will be offered?

You may also choose to use or adapt existing materials designed to measure the concept you’re interested in—for example, questionnaires or inventories whose reliability and validity has already been established.
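
As a concrete sketch, 'satisfaction' might be operationalized as the mean of a few 1–5 Likert items taken from an established questionnaire. The item names below are invented for illustration:

```python
from statistics import mean

# Hypothetical operationalization: satisfaction = mean of three 1-5 Likert items.
LIKERT_ITEMS = ["ease_of_use", "would_recommend", "met_expectations"]

def satisfaction_score(responses):
    """responses: dict mapping item name to a 1-5 rating."""
    ratings = [responses[item] for item in LIKERT_ITEMS]
    if not all(1 <= r <= 5 for r in ratings):
        raise ValueError("Likert ratings must be between 1 and 5")
    return mean(ratings)

print(satisfaction_score({"ease_of_use": 4, "would_recommend": 5, "met_expectations": 3}))  # 4.0
```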

Reliability and validity

Reliability means your results can be consistently reproduced, while validity means that you’re actually measuring the concept you’re interested in.


For valid and reliable results, your measurement materials should be thoroughly researched and carefully designed. Plan your procedures to make sure you carry out the same steps in the same way for each participant.

If you’re developing a new questionnaire or other instrument to measure a specific concept, running a pilot study allows you to check its validity and reliability in advance.

Sampling procedures

As well as choosing an appropriate sampling method , you need a concrete plan for how you’ll actually contact and recruit your selected sample.

That means making decisions about things like:

  • How many participants do you need for an adequate sample size?
  • What inclusion and exclusion criteria will you use to identify eligible participants?
  • How will you contact your sample—by mail, online, by phone, or in person?

If you’re using a probability sampling method , it’s important that everyone who is randomly selected actually participates in the study. How will you ensure a high response rate?

If you’re using a non-probability method , how will you avoid research bias and ensure a representative sample?

Data management

It’s also important to create a data management plan for organizing and storing your data.

Will you need to transcribe interviews or perform data entry for observations? You should anonymize and safeguard any sensitive data, and make sure it’s backed up regularly.

Keeping your data well-organized will save time when it comes to analyzing it. It can also help other researchers validate and add to your findings (high replicability ).
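
As one example of what a data management plan might call for, here is a small sketch of an anonymization pass that replaces direct identifiers with stable pseudonymous IDs before notes are shared. The salt and field names are illustrative assumptions, and this is not a substitute for your organization's privacy and consent requirements:

```python
import hashlib

SALT = "replace-with-a-project-specific-secret"  # illustrative only

def pseudonym(identifier: str) -> str:
    """Derive a stable, one-way participant ID from a direct identifier."""
    digest = hashlib.sha256((SALT + identifier.lower()).encode("utf-8")).hexdigest()
    return "P-" + digest[:8]

records = [
    {"name": "Jane Doe", "email": "jane@example.com", "note": "Struggled to find settings."},
    {"name": "Ali Khan", "email": "ali@example.com", "note": "Wants an offline mode."},
]

anonymized = [{"participant_id": pseudonym(r["email"]), "note": r["note"]} for r in records]
print(anonymized)
```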

Step 6: Decide on your data analysis strategies

On its own, raw data can't answer your research question. The last step of designing your research is planning how you'll analyze the data.

Quantitative data analysis

In quantitative research, you’ll most likely use some form of statistical analysis . With statistics, you can summarize your sample data, make estimates, and test hypotheses.

Using descriptive statistics , you can summarize your sample data in terms of:

  • The distribution of the data (e.g., the frequency of each score on a test)
  • The central tendency of the data (e.g., the mean to describe the average score)
  • The variability of the data (e.g., the standard deviation to describe how spread out the scores are)

The specific calculations you can do depend on the level of measurement of your variables.
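
A minimal sketch of those three kinds of summary on a toy set of test scores, using only Python's standard library:

```python
from collections import Counter
from statistics import mean, stdev

scores = [72, 85, 85, 90, 64, 78, 85, 91, 70, 88]  # hypothetical test scores

print("distribution:", Counter(scores))                   # frequency of each score
print("central tendency (mean):", mean(scores))           # the average score
print("variability (std dev):", round(stdev(scores), 1))  # how spread out the scores are
```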

Using inferential statistics , you can:

  • Make estimates about the population based on your sample data.
  • Test hypotheses about a relationship between variables.

Regression and correlation tests look for associations between two or more variables, while comparison tests (such as t tests and ANOVAs ) look for differences in the outcomes of different groups.

Your choice of statistical test depends on various aspects of your research design, including the types of variables you’re dealing with and the distribution of your data.
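
A minimal sketch of one test from each family using SciPy (assuming it is installed); the groups and scores are invented for illustration:

```python
from scipy import stats

# Comparison test: do the mean outcomes of two groups differ?
group_a = [68, 74, 71, 77, 69, 73, 75, 70]
group_b = [75, 81, 78, 84, 79, 80, 83, 77]
t_stat, t_p = stats.ttest_ind(group_a, group_b)
print(f"t test: t = {t_stat:.2f}, p = {t_p:.4f}")

# Association test: are two variables related?
hours_practiced = [1, 2, 2, 3, 4, 5, 6, 6]
test_score = [55, 60, 58, 66, 70, 74, 80, 78]
r, r_p = stats.pearsonr(hours_practiced, test_score)
print(f"correlation: r = {r:.2f}, p = {r_p:.4f}")
```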

Qualitative data analysis

In qualitative research, your data will usually be very dense with information and ideas. Instead of summing it up in numbers, you’ll need to comb through the data in detail, interpret its meanings, identify patterns, and extract the parts that are most relevant to your research question.

Two of the most common approaches to doing this are thematic analysis and discourse analysis .

Approach and characteristics:

  • Thematic analysis: identifying and interpreting patterns of meaning (themes) across the data.
  • Discourse analysis: examining how language is used in context, focusing on how communication constructs meaning within its social setting.

There are many other ways of analyzing qualitative data depending on the aims of your research. To get a sense of potential approaches, try reading some qualitative research papers in your field.
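
The interpretation itself can't be automated, but simple tooling can help with the mechanical first pass, for example flagging which responses mention candidate themes via keyword lists before you read them closely. A minimal sketch; the codebook below is an invented example, and keyword matching is a starting point for coding, not a substitute for it:

```python
# Illustrative first-pass tagging: candidate themes and keywords that hint at them.
codebook = {
    "cost": ["price", "expensive", "subscription"],
    "usability": ["confusing", "hard to find", "navigate"],
    "trust": ["privacy", "data", "secure"],
}

responses = [
    "The subscription feels expensive for what I get.",
    "It was hard to find the playlist I wanted.",
    "I worry about how my data is used.",
]

for response in responses:
    text = response.lower()
    themes = [theme for theme, keywords in codebook.items() if any(kw in text for kw in keywords)]
    print(themes, "-", response)
```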

Other interesting articles

If you want to know more about the research process, methodology, research bias, or statistics, make sure to check out some of our other articles with explanations and examples.

Methodology

  • Simple random sampling
  • Stratified sampling
  • Cluster sampling
  • Likert scales
  • Reproducibility

 Statistics

  • Null hypothesis
  • Statistical power
  • Probability distribution
  • Effect size
  • Poisson distribution

Research bias

  • Optimism bias
  • Cognitive bias
  • Implicit bias
  • Hawthorne effect
  • Anchoring bias
  • Explicit bias

Frequently asked questions about research design

A research design is a strategy for answering your research question. It defines your overall approach and determines how you will collect and analyze data.

A well-planned research design helps ensure that your methods match your research aims, that you collect high-quality data, and that you use the right kind of analysis to answer your questions, utilizing credible sources. This allows you to draw valid, trustworthy conclusions.

Quantitative research designs can be divided into two main categories:

  • Correlational and descriptive designs are used to investigate characteristics, averages, trends, and associations between variables.
  • Experimental and quasi-experimental designs are used to test causal relationships .

Qualitative research designs tend to be more flexible. Common types of qualitative design include case study , ethnography , and grounded theory designs.

The priorities of a research design can vary depending on the field, but you usually have to specify:

  • Your research questions and/or hypotheses
  • Your overall approach (e.g., qualitative or quantitative )
  • The type of design you’re using (e.g., a survey , experiment , or case study )
  • Your data collection methods (e.g., questionnaires , observations)
  • Your data collection procedures (e.g., operationalization , timing and data management)
  • Your data analysis methods (e.g., statistical tests  or thematic analysis )

A sample is a subset of individuals from a larger population . Sampling means selecting the group that you will actually collect data from in your research. For example, if you are researching the opinions of students in your university, you could survey a sample of 100 students.

In statistics, sampling allows you to test a hypothesis about the characteristics of a population.

Operationalization means turning abstract conceptual ideas into measurable observations.

For example, the concept of social anxiety isn’t directly observable, but it can be operationally defined in terms of self-rating scores, behavioral avoidance of crowded places, or physical anxiety symptoms in social situations.

Before collecting data , it’s important to consider how you will operationalize the variables that you want to measure.

A research project is an academic, scientific, or professional undertaking to answer a research question . Research projects can take many forms, such as qualitative or quantitative , descriptive , longitudinal , experimental , or correlational . What kind of research approach you choose will depend on your topic.



The Oxford Handbook of Qualitative Research (2nd edn)

Patricia Leavy, Independent Scholar, Kennebunk, ME, USA

The Oxford Handbook of Qualitative Research, second edition, presents a comprehensive retrospective and prospective review of the field of qualitative research. Original, accessible chapters written by interdisciplinary leaders in the field make this a critical reference work. Filled with robust examples from real-world research; ample discussion of the historical, theoretical, and methodological foundations of the field; and coverage of key issues including data collection, interpretation, representation, assessment, and teaching, this handbook aims to be a valuable text for students, professors, and researchers. This newly revised and expanded edition features up-to-date examples and topics, including seven new chapters on duoethnography, team research, writing ethnographically, creative approaches to writing, writing for performance, writing for the public, and teaching qualitative research.

20 - Qualitative Research Design

From Part III (Data Collection) of The Cambridge Handbook of Research Methods and Statistics for the Social and Behavioral Sciences. Published online by Cambridge University Press: 25 May 2023.

The social world is fascinating – full of complexities, tensions, and contradictions. Social scientists have long been interested in better understanding the social world around us. Unlike quantitative research, which focuses on collecting and analyzing numerical data to make statistical inferences about the social world, qualitative research contributes to empirical and theoretical understandings of society by examining and explaining how and why people think and act as they do through the use of non-numerical data. In other words, qualitative research uncovers social processes and mechanisms undergirding human behavior. In this chapter, we will discuss how to design a qualitative research project using two of the most common qualitative research methods: in-depth interviewing and ethnographic observations (also known as ethnography or participant observation). We will begin the chapter by discussing the what, how, and why of interviewing and ethnography. We will then discuss the importance of interrogating one’s underlying ontological and epistemological assumptions regarding research (and the research process) and the steps to follow in designing a qualitative study. We conclude the chapter by reviewing the different elements to consider when developing a qualitative research project.

  • By Sinikka Elliott, Kayonne Christy, and Siqi Xiao
  • Edited by Austin Lee Nichols (Central European University, Vienna) and John Edlund (Rochester Institute of Technology, New York)
  • Chapter DOI: https://doi.org/10.1017/9781009010054.021

Qualitative Research Design: An Interactive Approach

Joseph A. Maxwell, George Mason University, VA

ISBN: 9781412981194 (paperback); ISBN: 9781452285832 (electronic)

Reviews

  • “This book uses everyday language that will captivate students’ attention and embed practical knowledge to supplement the technical.”
  • “The key strengths of the text are the passion and the enthusiasm that Dr. Maxwell has for qualitative research after all these years. I feel I can also utilize these concepts on my own research team and take them out of the classroom and into research team meetings with colleagues.”
  • “I really liked this book. I found myself taking notes and saying “yes” so many times because Maxwell captures the research process so well and provides many points worth quoting. As a faculty mentor, I particularly see the value of this book for my students who are conducting qualitative dissertations.”
  • "Maxwell provides a clear explanation regarding the nuances involved in the circular process of qualitative research design."
  • A useful book for students undertaking qualitative research.
  • A good book for qualitative research design; used as a secondary text.
  • Comprehensive and well written.
  • This really is an excellent text which covers a vast range of qualitative research approaches - and is very readable.
  • Maxwell's book has been very helpful to my students in making a conceptual design for their research. They were asked to read the first three chapters carefully and complete the Chapter 2 assignment. Some students had never thought about their qualitative research in this way. I adopted the book as a reference; I would have made it essential to the course if the price were more reasonable (I think 26 pounds would be acceptable).
  • Maxwell's book gives a common-sense approach to designing social research. I used it to provide a framework for integrating discourse analysis in social research.
  • Although a very good book, we cover more material than the focus of this book and would have to adopt too many texts for our students.

New to the Third Edition

  • Provides new and expanded coverage of key topics such as paradigms in qualitative research, conceptual frameworks and using theory, doing literature reviews, and writing research proposals

Key Features

  • Offers an original, innovative model of design based on a systemic rather than a linear or typological structure, well suited for designing studies and writing research proposals
  • Includes many exercises that help readers to design their study
  • Provides guidance in a clear, direct writing style, offering practical advice on research design

A major impetus for a new edition of this book was the opportunity to expand it somewhat beyond the page limits of the earlier series on Applied Social Research Methods, for which it was originally written. However, many readers of the previous editions have said that they appreciated the conciseness of the book, so I didn't want to lose this virtue. Consequently, much of the new material in this edition consists of additional examples of my students' work, including a second example of a dissertation proposal (Appendix B).

Another impetus has been the ongoing development of qualitative research,1 with a flourishing of new approaches, including arts-based approaches, to how it is conducted and presented. I haven't attempted to deal comprehensively with these, which would have ballooned the book well past what I felt was an appropriate length, as well as taking it beyond an introductory level. If you want to investigate these developments, the SAGE Encyclopedia of Qualitative Research (Given, 2008), the SAGE Handbook of Qualitative Research, 4th edition (Denzin & Lincoln, 2011), and the journal Qualitative Inquiry are good places to start. I've tried to indicate, in Chapters 1 and 3, how I see my approach to design as compatible with some of these developments, in particular with aspects of postmodernism and with the approach known as bricolage, and I have substantially rewritten and expanded my discussion of research paradigms, in Chapter 2.

However, I am also sceptical of some of these developments, particularly those that adopt a radical constructivist and relativist stance that denies the existence of any "reality" that our research attempts to understand, and that rejects any conception of "validity" (or related terms) that addresses the relationship between our research conclusions and the phenomena that we study. While I am enough of a postmodernist to believe that every theory and conclusion is our own construction, with no claim to "objective" or absolute truth, and argue in Chapter 2 that no theory can capture the full complexity of the things we study, I refuse to abandon the goal of gaining a better understanding of the physical, social, and cultural world in which we live, or the possibility of developing credible explanations for these phenomena.

This position is grounded in my third impetus for revising this book: my increasing awareness of how my own perspective on qualitative research has been informed by a philosophical realism about the things we study. I have developed this perspective at length in my book A Realist Approach for Qualitative Research (2011), arguing that the "critical realist" position I have taken is not only compatible with most qualitative researchers' actual practices, but can be valuable in helping researchers with some difficult theoretical, methodological, and political issues that they face. However, I offer this as a useful perspective among other perspectives, not as the single correct paradigm for qualitative research. As the writing teacher Peter Elbow (1973, 2006) argued, it is important to play both the "believing game" and the "doubting game" with any theory or position you encounter, trying to see both its advantages and its distortions or blind spots. For this reason, I want the present book to be of practical value to students and researchers who hold a variety of positions on these issues. The model of qualitative research design that I develop here is compatible with a range of philosophical perspectives, and I believe it is broadly applicable to most qualitative research.

My greater awareness of the implications of a critical realist stance has led me to revise or expand other parts of the book—in particular, the discussion of theory in Chapter 3; developing (and revising) research questions in Chapter 4; research relationships and ethics, developing interview questions, and data analysis, in Chapter 5; the concept of validity in Chapter 6; and the appropriate functions and content of a literature review in a research proposal, in Chapter 7. I've also continued to compulsively tinker with the language of the book, striving to make what I say clearer. I would be grateful for any feedback you can give me on how the book could be made more useful to you.

Finally, I realized in revising this work that I had said almost nothing explicitly about how I define qualitative research—what I see as most essential about a qualitative approach. I say more about this in Chapter 2. However, a brief definition would be that qualitative research is research that is intended to help you better understand 1) the meanings and perspectives of the people you study—seeing the world from their point of view, rather than simply from your own; 2) how these perspectives are shaped by, and shape, their physical, social, and cultural contexts; and 3) the specific processes that are involved in maintaining or altering these phenomena and relationships. All three of these aspects of qualitative research, but particularly the last one, contrast with most quantitative approaches to research, which are based on seeing the phenomena studied in terms of variables—properties of things that can vary, and can thus be measured and compared across contexts. I see most of the more obvious aspects of qualitative research—its inductive, open-ended approach, its reliance on textual or visual rather than numerical data, its primary goal of particular understanding rather than generalization across persons and settings—as due to these three main features of qualitative inquiry. (For a more detailed discussion of these issues, see Maxwell, 2011b.)

1. Some qualitative practitioners prefer the term "inquiry" to "research," seeing the latter as too closely associated with a quantitative or positivist approach. I agree with their concerns (see Maxwell, 2004a, b), and I understand that some types of qualitative inquiry are more humanistic than scientific, but I prefer to argue for a broader definition of "research" that includes a range of qualitative approaches.


Chapter 2. Research Design

Getting Started

When I teach undergraduates qualitative research methods, the final product of the course is a “research proposal” that incorporates all they have learned and enlists the knowledge they have learned about qualitative research methods in an original design that addresses a particular research question. I highly recommend you think about designing your own research study as you progress through this textbook. Even if you don’t have a study in mind yet, it can be a helpful exercise as you progress through the course. But how to start? How can one design a research study before they even know what research looks like? This chapter will serve as a brief overview of the research design process to orient you to what will be coming in later chapters. Think of it as a “skeleton” of what you will read in more detail in later chapters. Ideally, you will read this chapter both now (in sequence) and later during your reading of the remainder of the text. Do not worry if you have questions the first time you read this chapter. Many things will become clearer as the text advances and as you gain a deeper understanding of all the components of good qualitative research. This is just a preliminary map to get you on the right road.

Research Design Steps

Before you even get started, you will need to have a broad topic of interest in mind.[1] In my experience, students can confuse this broad topic with the actual research question, so it is important to clearly distinguish the two. And the place to start is the broad topic. It might be, as was the case with me, working-class college students. But what about working-class college students? What’s it like to be one? Why are there so few compared to others? How do colleges assist (or fail to assist) them? What interested me was something I could barely articulate at first and went something like this: “Why was it so difficult and lonely to be me?” And by extension, “Did others share this experience?”

Once you have a general topic, reflect on why this is important to you. Sometimes we connect with a topic and we don’t really know why. Even if you are not willing to share the real underlying reason you are interested in a topic, it is important that you know the deeper reasons that motivate you. Otherwise, it is quite possible that at some point during the research, you will find yourself turned around facing the wrong direction. I have seen it happen many times. The reason is that the research question is not the same thing as the general topic of interest, and if you don’t know the reasons for your interest, you are likely to design a study answering a research question that is beside the point—to you, at least. And this means you will be much less motivated to carry your research to completion.

Researcher Note

Why do you employ qualitative research methods in your area of study? What are the advantages of qualitative research methods for studying mentorship?

Qualitative research methods are a huge opportunity to increase access, equity, inclusion, and social justice. Qualitative research allows us to engage and examine the uniquenesses/nuances within minoritized and dominant identities and our experiences with these identities. Qualitative research allows us to explore a specific topic, and through that exploration, we can link history to experiences and look for patterns or offer up a unique phenomenon. There’s such beauty in being able to tell a particular story, and qualitative research is a great mode for that! For our work, we examined the relationships we typically use the term mentorship for but didn’t feel that was quite the right word. Qualitative research allowed us to pick apart what we did and how we engaged in our relationships, which then allowed us to more accurately describe what was unique about our mentorship relationships, which we ultimately named liberationships (McAloney and Long 2021). Qualitative research gave us the means to explore, process, and name our experiences; what a powerful tool!

How do you come up with ideas for what to study (and how to study it)? Where did you get the idea for studying mentorship?

Coming up with ideas for research, for me, is kind of like Googling a question I have, not finding enough information, and then deciding to dig a little deeper to get the answer. The idea to study mentorship actually came up in conversation with my mentorship triad. We were talking in one of our meetings about our relationship—kind of meta, huh? We discussed how we felt that mentorship was not quite the right term for the relationships we had built. One of us asked what was different about our relationships and mentorship. This all happened when I was taking an ethnography course. During the next session of class, we were discussing auto- and duoethnography, and it hit me—let’s explore our version of mentorship, which we later went on to name liberationships (McAloney and Long 2021). The idea and questions came out of being curious and wanting to find an answer. As I continue to research, I see opportunities in questions I have about my work or during conversations that, in our search for answers, end up exposing gaps in the literature. If I can’t find the answer already out there, I can study it.

—Kim McAloney, PhD, College Student Services Administration Ecampus coordinator and instructor

When you have a better idea of why you are interested in what it is that interests you, you may be surprised to learn that the obvious approaches to the topic are not the only ones. For example, let’s say you think you are interested in preserving coastal wildlife. And as a social scientist, you are interested in policies and practices that affect the long-term viability of coastal wildlife, especially around fishing communities. It would be natural then to consider designing a research study around fishing communities and how they manage their ecosystems. But when you really think about it, you realize that what interests you the most is how people whose livelihoods depend on a particular resource act in ways that deplete that resource. Or, even deeper, you contemplate the puzzle, “How do people justify actions that damage their surroundings?” Now, there are many ways to design a study that gets at that broader question, and not all of them are about fishing communities, although that is certainly one way to go. Maybe you could design an interview-based study that includes and compares loggers, fishers, and desert golfers (those who golf in arid lands that require a great deal of wasteful irrigation). Or design a case study around one particular example where resources were completely used up by a community. Without knowing what it is you are really interested in, what motivates your interest in a surface phenomenon, you are unlikely to come up with the appropriate research design.

These first stages of research design are often the most difficult, but have patience. Taking the time to consider why you are going to go through a lot of trouble to get answers will prevent a lot of wasted energy in the future.

There are distinct reasons for pursuing particular research questions, and it is helpful to distinguish between them. First, you may be personally motivated. This is probably the most important and the most often overlooked. What is it about the social world that sparks your curiosity? What bothers you? What answers do you need in order to keep living? For me, I knew I needed to get a handle on what higher education was for before I kept going at it. I needed to understand why I felt so different from my peers and whether this whole “higher education” thing was “for the likes of me” before I could complete my degree. That is the personal motivation question. Your personal motivation might also be political in nature, in that you want to change the world in a particular way. It’s all right to acknowledge this. In fact, it is better to acknowledge it than to hide it.

There are also academic and professional motivations for a particular study.  If you are an absolute beginner, these may be difficult to find. We’ll talk more about this when we discuss reviewing the literature. Simply put, you are probably not the only person in the world to have thought about this question or issue and those related to it. So how does your interest area fit into what others have studied? Perhaps there is a good study out there of fishing communities, but no one has quite asked the “justification” question. You are motivated to address this to “fill the gap” in our collective knowledge. And maybe you are really not at all sure of what interests you, but you do know that [insert your topic] interests a lot of people, so you would like to work in this area too. You want to be involved in the academic conversation. That is a professional motivation and a very important one to articulate.

Practical and strategic motivations are a third kind. Perhaps you want to encourage people to take better care of the natural resources around them. If this is also part of your motivation, you will want to design your research project in a way that might have an impact on how people behave in the future. There are many ways to do this, one of which is using qualitative research methods rather than quantitative research methods, as the findings of qualitative research are often easier to communicate to a broader audience than the results of quantitative research. You might even be able to engage the community you are studying in the collecting and analyzing of data, something taboo in quantitative research but actively embraced and encouraged by qualitative researchers. But there are other practical reasons, such as getting “done” with your research in a certain amount of time or having access (or no access) to certain information. There is nothing wrong with considering constraints and opportunities when designing your study. Or maybe one of the practical or strategic goals is about learning competence in this area so that you can demonstrate the ability to conduct interviews and focus groups with future employers. Keeping that in mind will help shape your study and prevent you from getting sidetracked using a technique that you are less invested in learning about.

STOP HERE for a moment

I recommend you write a paragraph (at least) explaining your aims and goals. Include a sentence about each of the following: personal/political goals, practical or professional/academic goals, and practical/strategic goals. Think through how all of the goals are related and can be achieved by this particular research study. If they can’t, have a rethink. Perhaps this is not the best way to go about it.

You will also want to be clear about the purpose of your study. “Wait, didn’t we just do this?” you might ask. No! Your goals are not the same as the purpose of the study, although they are related. You can think about purpose lying on a continuum from “theory” to “action” (figure 2.1). Sometimes you are doing research to discover new knowledge about the world, while other times you are doing a study because you want to measure an impact or make a difference in the world.

Figure 2.1. Purpose types, from theory to action: Basic Research, Applied Research, Summative Evaluation, Formative Evaluation, Action Research.

Basic research involves research that is done for the sake of “pure” knowledge—that is, knowledge that, at least at this moment in time, may not have any apparent use or application. Often, and this is very important, knowledge of this kind is later found to be extremely helpful in solving problems. So one way of thinking about basic research is that it is knowledge for which no use is yet known but will probably one day prove to be extremely useful. If you are doing basic research, you do not need to argue its usefulness, as the whole point is that we just don’t know yet what this might be.

Researchers engaged in basic research want to understand how the world operates. They are interested in investigating a phenomenon to get at the nature of reality with regard to that phenomenon. The basic researcher’s purpose is to understand and explain (Patton 2002:215).

Basic research is interested in generating and testing hypotheses about how the world works. Grounded Theory is one approach to qualitative research methods that exemplifies basic research (see chapter 4). Most academic journal articles publish basic research findings. If you are working in academia (e.g., writing your dissertation), the default expectation is that you are conducting basic research.

Applied research in the social sciences is research that addresses human and social problems. Unlike basic research, the researcher has expectations that the research will help contribute to resolving a problem, if only by identifying its contours, history, or context. From my experience, most students have this as their baseline assumption about research. Why do a study if not to make things better? But this is a common mistake. Students and their committee members are often working with default assumptions here—the former thinking about applied research as their purpose, the latter thinking about basic research: “The purpose of applied research is to contribute knowledge that will help people to understand the nature of a problem in order to intervene, thereby allowing human beings to more effectively control their environment. While in basic research the source of questions is the tradition within a scholarly discipline, in applied research the source of questions is in the problems and concerns experienced by people and by policymakers” (Patton 2002:217).

Applied research is less geared toward theory in two ways. First, its questions do not derive from previous literature. For this reason, applied research studies have much more limited literature reviews than those found in basic research (although they make up for this by having much more “background” about the problem). Second, it does not generate theory in the same way as basic research does. The findings of an applied research project may not be generalizable beyond the boundaries of this particular problem or context. The findings are more limited. They are useful now but may be less useful later. This is why basic research remains the default “gold standard” of academic research.

Evaluation research is research that is designed to evaluate or test the effectiveness of specific solutions and programs addressing specific social problems. We already know the problems, and someone has already come up with solutions. There might be a program, say, for first-generation college students on your campus. Does this program work? Are first-generation students who participate in the program more likely to graduate than those who do not? These are the types of questions addressed by evaluation research. There are two types of research within this broader frame, one more action-oriented than the other. In summative evaluation, an overall judgment about the effectiveness of a program or policy is made. Should we continue our first-gen program? Is it a good model for other campuses? Because the purpose of such summative evaluation is to measure success and to determine whether this success is scalable (capable of being generalized beyond the specific case), quantitative data is more often used than qualitative data. In our example, we might have “outcomes” data for thousands of students, and we might run various tests to determine if the better outcomes of those in the program are statistically significant so that we can generalize the findings and recommend similar programs elsewhere. Qualitative data in the form of focus groups or interviews can then be used for illustrative purposes, providing more depth to the quantitative analyses. In contrast, formative evaluation attempts to improve a program or policy (to help “form” or shape its effectiveness). Formative evaluations rely more heavily on qualitative data—case studies, interviews, focus groups. The findings are meant not to generalize beyond the particular but to improve this program. If you are a student seeking to improve your qualitative research skills and you do not care about generating basic research, formative evaluation studies might be an attractive option for you to pursue, as there are always local programs that need evaluation and suggestions for improvement. Again, be very clear about your purpose when talking through your research proposal with your committee.
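
To make the “various tests” above concrete, here is a minimal sketch, using a two-proportion z-test and entirely invented graduation counts, of how an evaluator might check whether program participants graduate at a higher rate than comparable non-participants. The counts and variable names are hypothetical illustrations, not data from any real program.

```python
import math

# Hypothetical outcome counts, invented purely for illustration.
program_grads, program_total = 412, 500          # first-gen program participants
comparison_grads, comparison_total = 1650, 2500  # similar students not in the program

p1 = program_grads / program_total
p2 = comparison_grads / comparison_total

# Pooled two-proportion z-test: is the gap bigger than chance alone would suggest?
pooled = (program_grads + comparison_grads) / (program_total + comparison_total)
se = math.sqrt(pooled * (1 - pooled) * (1 / program_total + 1 / comparison_total))
z = (p1 - p2) / se

# Two-sided p-value from the standard normal CDF.
p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

print(f"program graduation rate: {p1:.1%}, comparison: {p2:.1%}")
print(f"z = {z:.2f}, p = {p_value:.4f}")
```

A significant difference only says the gap is unlikely to be chance; the qualitative follow-up described above (interviews, focus groups) is what explains how and why the program works.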

Action research takes a further step beyond evaluation, even formative evaluation, to being part of the solution itself. This is about as far from basic research as one could get and definitely falls beyond the scope of “science,” as conventionally defined. The distinction between action and research is blurry, the research methods are often in constant flux, and the only “findings” are specific to the problem or case at hand and often are findings about the process of intervention itself. Rather than evaluate a program as a whole, action research often seeks to change and improve some particular aspect that may not be working—maybe there is not enough diversity in an organization or maybe women’s voices are muted during meetings and the organization wonders why and would like to change this. In a further step, participatory action research, those women would become part of the research team, attempting to amplify their voices in the organization through participation in the action research. As action research employs methods that involve people in the process, focus groups are quite common.

If you are working on a thesis or dissertation, chances are your committee will expect you to be contributing to fundamental knowledge and theory (basic research). If your interests lie more toward the action end of the continuum, however, it is helpful to talk to your committee about this before you get started. Knowing your purpose in advance will help avoid misunderstandings during the later stages of the research process!

The Research Question

Once you have written your paragraph and clarified your purpose and truly know that this study is the best study for you to be doing right now, you are ready to write and refine your actual research question. Know that research questions are often moving targets in qualitative research; they can be refined up to the very end of data collection and analysis. But you do have to have a working research question at all stages. This is your “anchor” when you get lost in the data. What are you addressing? What are you looking at and why? Your research question guides you through the thicket. It is common to have a whole host of questions about a phenomenon or case, both at the outset and throughout the study, but you should be able to pare it down to no more than two or three sentences when asked. These sentences should both clarify the intent of the research and explain why this is an important question to answer. More on refining your research question can be found in chapter 4.

Chances are, you will have already done some prior reading before coming up with your interest and your questions, but you may not have conducted a systematic literature review. This is the next crucial stage to be completed before venturing further. You don’t want to start collecting data and then realize that someone has already beaten you to the punch. A review of the literature that is already out there will let you know (1) if others have already done the study you are envisioning; (2) if others have done similar studies, which can help you out; and (3) what ideas or concepts are out there that can help you frame your study and make sense of your findings. More on literature reviews can be found in chapter 9.

In addition to reviewing the literature for similar studies to what you are proposing, it can be extremely helpful to find a study that inspires you. This may have absolutely nothing to do with the topic you are interested in but is written so beautifully or organized so interestingly or otherwise speaks to you in such a way that you want to post it somewhere to remind you of what you want to be doing. You might not understand this in the early stages—why would you find a study that has nothing to do with the one you are doing helpful? But trust me, when you are deep into analysis and writing, having an inspirational model in view can help you push through. If you are motivated to do something that might change the world, you probably have read something somewhere that inspired you. Go back to that original inspiration and read it carefully and see how they managed to convey the passion that you so appreciate.

At this stage, you are still just getting started. There are a lot of things to do before setting forth to collect data! You’ll want to consider and choose a research tradition and a set of data-collection techniques that both help you answer your research question and match all your aims and goals. For example, if you really want to help migrant workers speak for themselves, you might draw on feminist theory and participatory action research models. Chapters 3 and 4 will provide you with more information on epistemologies and approaches.

Next, you have to clarify your “units of analysis.” What is the level at which you are focusing your study? Often, the unit in qualitative research methods is individual people, or “human subjects.” But your units of analysis could just as well be organizations (colleges, hospitals) or programs or even whole nations. Think about what it is you want to be saying at the end of your study—are the insights you are hoping to make about people or about organizations or about something else entirely? A unit of analysis can even be a historical period! Every unit of analysis will call for a different kind of data collection and analysis and will produce different kinds of “findings” at the conclusion of your study.[2]

Regardless of what unit of analysis you select, you will probably have to consider the “human subjects” involved in your research.[3] Who are they? What interactions will you have with them—that is, what kind of data will you be collecting? Before answering these questions, define your population of interest and your research setting. Use your research question to help guide you.

Let’s use an example from a real study. In Geographies of Campus Inequality, Benson and Lee (2020) list three related research questions: “(1) What are the different ways that first-generation students organize their social, extracurricular, and academic activities at selective and highly selective colleges? (2) how do first-generation students sort themselves and get sorted into these different types of campus lives; and (3) how do these different patterns of campus engagement prepare first-generation students for their post-college lives?” (3).

Note that we are jumping into this a bit late, after Benson and Lee have described previous studies (the literature review) and what is known about first-generation college students and what is not known. They want to know about differences within this group, and they are interested in ones attending certain kinds of colleges because those colleges will be sites where academic and extracurricular pressures compete. That is the context for their three related research questions. What is the population of interest here? First-generation college students. What is the research setting? Selective and highly selective colleges. But a host of questions remain. Which students in the real world, which colleges? What about gender, race, and other identity markers? Will the students be asked questions? Are the students still in college, or will they be asked about what college was like for them? Will they be observed? Will they be shadowed? Will they be surveyed? Will they be asked to keep diaries of their time in college? How many students? How many colleges? For how long will they be observed?

Recommendation

Take a moment and write down suggestions for Benson and Lee before continuing on to what they actually did.

Have you written down your own suggestions? Good. Now let’s compare those with what they actually did. Benson and Lee drew on two sources of data: in-depth interviews with sixty-four first-generation students and survey data from a preexisting national survey of students at twenty-eight selective colleges. Let’s ignore the survey for our purposes here and focus on those interviews. The interviews were conducted between 2014 and 2016 at a single selective college, “Hilltop” (a pseudonym). They employed a “purposive” sampling strategy to ensure an equal number of male-identifying and female-identifying students as well as equal numbers of White, Black, and Latinx students. Each student was interviewed once. Hilltop is a selective liberal arts college in the northeast that enrolls about three thousand students.
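
As a rough sketch of how a purposive quota like the one described above can be tracked during recruitment, the snippet below fills a gender-by-race grid deliberately rather than taking whoever volunteers first. The per-cell target of 11 and the should_invite helper are my own hypothetical stand-ins, not anything Benson and Lee report.

```python
from itertools import product

# Hypothetical quota grid in the spirit of the study's purposive sampling:
# balance interviewees across gender and race categories.
GENDERS = ["male-identifying", "female-identifying"]
RACES = ["White", "Black", "Latinx"]
TARGET_PER_CELL = 11  # invented target: 2 x 3 cells at ~11 each is roughly 64 interviews

quota = {cell: 0 for cell in product(GENDERS, RACES)}

def should_invite(gender: str, race: str) -> bool:
    """Invite a volunteer only if their gender-by-race cell still has room."""
    cell = (gender, race)
    if cell not in quota or quota[cell] >= TARGET_PER_CELL:
        return False
    quota[cell] += 1
    return True

# Example: screening one volunteer from the recruitment pool.
print(should_invite("female-identifying", "Latinx"))  # True while that cell has room
```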

How did your suggestions match up to those actually used by the researchers in this study? Is it possible your suggestions were too ambitious? Beginning qualitative researchers often make that mistake. You want a research design that is both effective (it matches your question and goals) and doable. You will never be able to collect data from your entire population of interest (unless your research question is really so narrow as to be relevant to very few people!), so you will need to come up with a good sample. Define the criteria for this sample, as Benson and Lee did when deciding to interview an equal number of students by gender and race categories. Define the criteria for your sample setting too. Hilltop is typical for selective colleges. That was a research choice made by Benson and Lee. For more on sampling and sampling choices, see chapter 5.

Benson and Lee chose to employ interviews. If you also would like to include interviews, you have to think about what will be asked in them. Most interview-based research involves an interview guide, a set of questions or question areas that will be asked of each participant. The research question helps you create a relevant interview guide. You want to ask questions whose answers will provide insight into your research question. Again, your research question is the anchor you will continually come back to as you plan for and conduct your study. It may be that once you begin interviewing, you find that people are telling you something totally unexpected, and this makes you rethink your research question. That is fine. Then you have a new anchor. But you always have an anchor. More on interviewing can be found in chapter 11.
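
If it helps to picture what an interview guide looks like, the sketch below treats one as a small data structure: a few topic areas, each with a lead question and optional probes. This is a hypothetical Python illustration with invented wording (loosely inspired by the research questions above), not the guide Benson and Lee used; a real guide would be drafted, piloted, and revised.

```python
# A minimal interview guide: topic areas, lead questions, and follow-up probes.
# All wording here is invented for illustration.
interview_guide = [
    {
        "topic": "Campus involvement",
        "lead": "Walk me through a typical week outside of class.",
        "probes": ["How did you get involved in that?", "Who do you usually do that with?"],
    },
    {
        "topic": "Academic life",
        "lead": "Tell me about a course that mattered to you this year.",
        "probes": ["What made it matter?", "How does it connect to your plans after college?"],
    },
]

def print_guide(guide):
    """Print the guide in the order an interviewer would work through it."""
    for i, section in enumerate(guide, start=1):
        print(f"{i}. {section['topic']}: {section['lead']}")
        for probe in section["probes"]:
            print(f"   - probe: {probe}")

print_guide(interview_guide)
```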

Let’s imagine Benson and Lee also observed college students as they went about doing the things college students do, both in the classroom and in the clubs and social activities in which they participate. They would have needed a plan for this. Would they sit in on classes? Which ones and how many? Would they attend club meetings and sports events? Which ones and how many? Would they participate themselves? How would they record their observations? More on observation techniques can be found in both chapters 13 and 14.

At this point, the design is almost complete. You know why you are doing this study, you have a clear research question to guide you, you have identified your population of interest and research setting, and you have a reasonable sample of each. You also have put together a plan for data collection, which might include drafting an interview guide or making plans for observations. And so you know exactly what you will be doing for the next several months (or years!). To put the project into action, there are a few more things necessary before actually going into the field.

First, you will need to make sure you have any necessary supplies, including recording technology. These days, many researchers use their phones to record interviews. Second, you will need to draft a few documents for your participants. These include informed consent forms and recruiting materials, such as posters or email texts, that explain in clear language what the study is about. Third, you will draft a research protocol to submit to your institutional review board (IRB); this research protocol will include the interview guide (if you are using one), the consent form template, and all examples of recruiting material. Depending on your institution and the details of your study design, it may take weeks or even, in some unfortunate cases, months before you secure IRB approval. Make sure you plan for this time in your project timeline. While you wait, you can continue to review the literature and possibly begin drafting the literature review section for your eventual presentation or publication. More on IRB procedures can be found in chapter 8 and more general ethical considerations in chapter 7.

Once you have approval, you can begin!

Research Design Checklist

Before data collection begins, do the following:

  • Write a paragraph explaining your aims and goals (personal/political, practical/strategic, professional/academic).
  • Define your research question; write two to three sentences that clarify the intent of the research and why this is an important question to answer.
  • Review the literature for similar studies that address your research question or similar research questions; think laterally about some literature that might be helpful or illuminating but is not exactly about the same topic.
  • Find a written study that inspires you—it may or may not be on the research question you have chosen.
  • Consider and choose a research tradition and set of data-collection techniques that (1) help answer your research question and (2) match your aims and goals.
  • Define your population of interest and your research setting.
  • Define the criteria for your sample (How many? Why these? How will you find them, gain access, and acquire consent?).
  • If you are conducting interviews, draft an interview guide.
  •  If you are making observations, create a plan for observations (sites, times, recording, access).
  • Acquire any necessary technology (recording devices/software).
  • Draft consent forms that clearly identify the research focus and selection process.
  • Create recruiting materials (posters, email, texts).
  • Apply for IRB approval (proposal plus consent form plus recruiting materials).
  • Block out time for collecting data.

Notes

1. At the end of the chapter, you will find a "Research Design Checklist" that summarizes the main recommendations made here.
2. For example, if your focus is society and culture, you might collect data through observation or a case study. If your focus is individual lived experience, you are probably going to be interviewing some people. And if your focus is language and communication, you will probably be analyzing text (written or visual) (Marshall and Rossman 2016:16).
3. You may not have any "live" human subjects. There are qualitative research methods that do not require interactions with live human beings; see chapter 16, "Archival and Historical Sources." But for the most part, you are probably reading this textbook because you are interested in doing research with people. The rest of the chapter will assume this is the case.

Ethnography: One of the primary methodological traditions of inquiry in qualitative research, ethnography is the study of a group or group culture, largely through observational fieldwork supplemented by interviews. It is a form of fieldwork that may include participant-observation data collection. See chapter 14 for a discussion of deep ethnography.

Case study: A methodological tradition of inquiry and research design that focuses on an individual case (e.g., a setting, institution, or sometimes an individual) in order to explore its complexity, history, and interactive parts. As an approach, it is particularly useful for obtaining a deep appreciation of an issue, event, or phenomenon of interest in its particular context.

The controlling force in research; can be understood as lying on a continuum from basic research (knowledge production) to action research (effecting change).

Theory: In its most basic sense, a theory is a story we tell about how the world works that can be tested with empirical evidence. In qualitative research, we use the term in a variety of ways, many of which differ from how it is used by quantitative researchers. Although some qualitative research can be described as “testing theory,” it is more common to “build theory” from the data using inductive reasoning, as done in Grounded Theory. There are so-called “grand theories” that seek to integrate a whole series of findings and stories into an overarching paradigm about how the world works, and much smaller theories or concepts about particular processes and relationships. Theory can even be used to explain particular methodological perspectives or approaches, as in Institutional Ethnography, which is both a way of doing research and a theory about how the world works.

Basic research: Research that is interested in generating and testing hypotheses about how the world works.

Grounded Theory: A methodological tradition of inquiry and approach to analyzing qualitative data in which theories emerge from a rigorous and systematic process of induction. This approach was pioneered by the sociologists Glaser and Strauss (1967). The elements of theory generated from comparative analysis of data are, first, conceptual categories and their properties and, second, hypotheses or generalized relations among the categories and their properties – “The constant comparing of many groups draws the [researcher’s] attention to their many similarities and differences. Considering these leads [the researcher] to generate abstract categories and their properties, which, since they emerge from the data, will clearly be important to a theory explaining the kind of behavior under observation” (36).

Qualitative research: An approach to research that is “multimethod in focus, involving an interpretative, naturalistic approach to its subject matter. This means that qualitative researchers study things in their natural settings, attempting to make sense of, or interpret, phenomena in terms of the meanings people bring to them. Qualitative research involves the studied use and collection of a variety of empirical materials – case study, personal experience, introspective, life story, interview, observational, historical, interactional, and visual texts – that describe routine and problematic moments and meanings in individuals’ lives” (Denzin and Lincoln 2005:2). Contrast with quantitative research.

Applied research: Research that contributes knowledge that will help people to understand the nature of a problem in order to intervene, thereby allowing human beings to more effectively control their environment.

Evaluation research: Research that is designed to evaluate or test the effectiveness of specific solutions and programs addressing specific social problems. There are two kinds: summative and formative.

Summative evaluation research: Research in which an overall judgment about the effectiveness of a program or policy is made, often for the purpose of generalizing to other cases or programs. Generally uses qualitative research as a supplement to primary quantitative data analyses. Contrast formative evaluation research.

Formative evaluation research: Research designed to improve a program or policy (to help “form” or shape its effectiveness); relies heavily on qualitative research methods. Contrast summative evaluation research.

Action research: Research carried out at a particular organizational or community site with the intention of effecting change; often involves research subjects as participants of the study. See also participatory action research.

Participatory action research: Research in which both researchers and participants work together to understand a problematic situation and change it for the better.

Unit of analysis: The level of the focus of analysis (e.g., individual people, organizations, programs, neighborhoods).

Population of interest: The large group of interest to the researcher. Although it will likely be impossible to design a study that incorporates or reaches all members of the population of interest, this should be clearly defined at the outset of a study so that a reasonable sample of the population can be taken. For example, if one is studying working-class college students, the sample may include twenty such students attending a particular college, while the population is “working-class college students.” In quantitative research, clearly defining the general population of interest is a necessary step in generalizing results from a sample. In qualitative research, defining the population is conceptually important for clarity.

Pseudonym: A fictional name assigned to give anonymity to a person, group, or place. Pseudonyms are important ways of protecting the identity of research participants while still providing a “human element” in the presentation of qualitative data. There are ethical considerations to be made in selecting pseudonyms; some researchers allow research participants to choose their own.

Informed consent form: A requirement for research involving human participants; the documentation of informed consent. In some cases, oral consent or assent may be sufficient, but the default standard is a single-page, easy-to-understand form that both the researcher and the participant sign and date. Under federal guidelines, all researchers "shall seek such consent only under circumstances that provide the prospective subject or the representative sufficient opportunity to consider whether or not to participate and that minimize the possibility of coercion or undue influence. The information that is given to the subject or the representative shall be in language understandable to the subject or the representative. No informed consent, whether oral or written, may include any exculpatory language through which the subject or the representative is made to waive or appear to waive any of the subject's rights or releases or appears to release the investigator, the sponsor, the institution, or its agents from liability for negligence" (21 CFR 50.20). Your IRB office will be able to provide a template for use in your study.

Institutional review board (IRB): An administrative body established to protect the rights and welfare of human research subjects recruited to participate in research activities conducted under the auspices of the institution with which it is affiliated. The IRB is charged with the responsibility of reviewing all research involving human participants. The IRB is concerned with protecting the welfare, rights, and privacy of human subjects. The IRB has the authority to approve, disapprove, monitor, and require modifications in all research activities that fall within its jurisdiction as specified by both the federal regulations and institutional policy.

Introduction to Qualitative Research Methods Copyright © 2023 by Allison Hurst is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License , except where otherwise noted.

Source: Journal of Graduate Medical Education, 7(4), December 2015.

Choosing a Qualitative Research Approach


Editor's Note: The online version of this article contains a list of further reading resources and the authors' professional information .

The Challenge

Educators often pose questions about qualitative research. For example, a program director might say: “I collect data from my residents about their learning experiences in a new longitudinal clinical rotation. If I want to know about their learning experiences, should I use qualitative methods? I have been told that there are many approaches from which to choose. Someone suggested that I use grounded theory, but how do I know this is the best approach? Are there others?”

What Is Known

Qualitative research is the systematic inquiry into social phenomena in natural settings. These phenomena can include, but are not limited to, how people experience aspects of their lives, how individuals and/or groups behave, how organizations function, and how interactions shape relationships. In qualitative research, the researcher is the main data collection instrument. The researcher examines why events occur, what happens, and what those events mean to the participants studied.1,2

Qualitative research starts from a fundamentally different set of beliefs—or paradigms—than those that underpin quantitative research. Quantitative research is based on positivist beliefs that there is a singular reality that can be discovered with the appropriate experimental methods. Post-positivist researchers agree with the positivist paradigm, but believe that environmental and individual differences, such as the learning culture or the learners' capacity to learn, influence this reality, and that these differences are important. Constructivist researchers believe that there is no single reality, but that the researcher elicits participants' views of reality.3 Qualitative research generally draws on post-positivist or constructivist beliefs.

Qualitative scholars develop their work from these beliefs—usually post-positivist or constructivist—using different approaches to conduct their research. In this Rip Out, we describe 3 different qualitative research approaches commonly used in medical education: grounded theory, ethnography, and phenomenology. Each acts as a pivotal frame that shapes the research question(s), the method(s) of data collection, and how data are analyzed.4,5

Choosing a Qualitative Approach

Before engaging in any qualitative study, consider how your views about what is possible to study will affect your approach. Then select an appropriate approach within which to work. Alignment between the belief system underpinning the research approach, the research question, and the research approach itself is a prerequisite for rigorous qualitative research. To enhance the understanding of how different approaches frame qualitative research, we use this introductory challenge as an illustrative example.

The clinic rotation in a program director's training program was recently redesigned as a longitudinal clinical experience. Resident satisfaction with this rotation improved significantly following implementation of the new longitudinal experience. The program director wants to understand how the changes made in the clinic rotation translated into changes in learning experiences for the residents.

Qualitative research can support this program director's efforts. Qualitative research focuses on the events that transpire and on outcomes of those events from the perspectives of those involved. In this case, the program director can use qualitative research to understand the impact of the new clinic rotation on the learning experiences of residents. The next step is to decide which approach to use as a frame for the study.

The table lists the purpose of 3 commonly used approaches to frame qualitative research. For each frame, we provide an example of a research question that could direct the study and delineate what outcomes might be gained by using that particular approach.

Methodology Overview

[Table 1 from the original article is not reproduced here. For each of the three approaches (grounded theory, ethnography, phenomenology), it tabulates the purpose of the approach, an example research question that could direct the study, and the outcomes that might be gained by using that approach.]

How You Can Start TODAY

  1. Examine the foundations of the existing literature: As part of the literature review, make note of what is known about the topic and which approaches have been used in prior studies. A decision should be made to determine the extent to which the new study is exploratory and the extent to which findings will advance what is already known about the topic.
  2. Find a qualitatively skilled collaborator: If you are interested in doing qualitative research, you should consult with a qualitative expert. Be prepared to talk to the qualitative scholar about what you would like to study and why. Furthermore, be ready to describe the literature to date on the topic (remember, you are asking for this person's expertise regarding qualitative approaches; he or she won't necessarily have content expertise). Qualitative research must be designed and conducted with rigor (rigor will be discussed in Rip Out No. 8 of this series). Input from a qualitative expert will ensure that rigor is employed from the study's inception.
  3. Consider the approach: With a literature review completed and a qualitatively skilled collaborator secured, it is time to decide which approach would be best suited to answering the research question. Questions to consider when weighing approaches might include the following (a toy illustration of this mapping appears after the list):
     • Will my findings contribute to the creation of a theoretical model to better understand the area of study? (grounded theory)
     • Will I need to spend an extended amount of time trying to understand the culture and process of a particular group of learners in their natural context? (ethnography)
     • Is there a particular phenomenon I want to better understand/describe? (phenomenology)
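
The sketch below is a toy Python rendering of that mapping between guiding questions and approaches, nothing more than a mnemonic for the three bullets above; the question wording is paraphrased and the function name is invented. Choosing an approach still requires judgment and, ideally, input from a qualitative collaborator.

```python
# Toy mapping from the three guiding questions above to the approach each
# points toward (grounded theory, ethnography, phenomenology). Illustrative only.
APPROACH_BY_QUESTION = {
    "theoretical model of the area of study": "grounded theory",
    "extended time understanding a group in its natural context": "ethnography",
    "a particular phenomenon to understand or describe": "phenomenology",
}

def suggest_approaches(answers):
    """Given {question focus: True/False} answers, list approaches worth considering."""
    return [approach for focus, approach in APPROACH_BY_QUESTION.items() if answers.get(focus)]

print(suggest_approaches({
    "theoretical model of the area of study": False,
    "extended time understanding a group in its natural context": False,
    "a particular phenomenon to understand or describe": True,
}))  # -> ['phenomenology']
```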

What You Can Do LONG TERM

  1. Develop your qualitative research knowledge and skills: A basic qualitative research textbook is a valuable investment for learning about qualitative research (further reading is provided as online supplemental material). A novice qualitative researcher will also benefit from participating in a massive open online course or a mini-course (often offered by professional organizations or conferences) that provides an introduction to qualitative research. Most of all, collaborating with a qualitative researcher can provide the support necessary to design, execute, and report on the study.
  2. Undertake a pilot study: After learning about qualitative methodology, the next best way to gain expertise in qualitative research is to try it in a small-scale pilot study with the support of a qualitative expert. Such application provides an appreciation for the thought processes that go into designing a study, analyzing the data, and reporting on the findings. Alternatively, if you have the opportunity to work on a study led by a qualitative expert, take it! The experience will provide invaluable opportunities for learning how to engage in qualitative research.


The views expressed in this article are those of the authors and do not necessarily reflect the official policy or position of the Uniformed Services University of the Health Sciences, the Department of the Navy, the Department of Defense, or the US government.



Qualitative Research – Methods, Analysis Types and Guide


Qualitative Research

Qualitative research is a type of research methodology that focuses on exploring and understanding people’s beliefs, attitudes, behaviors, and experiences through the collection and analysis of non-numerical data. It seeks to answer research questions through the examination of subjective data, such as interviews, focus groups, observations, and textual analysis.

Qualitative research aims to uncover the meaning and significance of social phenomena, and it typically involves a more flexible and iterative approach to data collection and analysis compared to quantitative research. Qualitative research is often used in fields such as sociology, anthropology, psychology, and education.

Qualitative Research Methods


Qualitative Research Methods are as follows:

One-to-One Interview

This method involves conducting an interview with a single participant to gain a detailed understanding of their experiences, attitudes, and beliefs. One-to-one interviews can be conducted in-person, over the phone, or through video conferencing. The interviewer typically uses open-ended questions to encourage the participant to share their thoughts and feelings. One-to-one interviews are useful for gaining detailed insights into individual experiences.

Focus Groups

This method involves bringing together a group of people to discuss a specific topic in a structured setting. The focus group is led by a moderator who guides the discussion and encourages participants to share their thoughts and opinions. Focus groups are useful for generating ideas and insights, exploring social norms and attitudes, and understanding group dynamics.

Ethnographic Studies

This method involves immersing oneself in a culture or community to gain a deep understanding of its norms, beliefs, and practices. Ethnographic studies typically involve long-term fieldwork and observation, as well as interviews and document analysis. Ethnographic studies are useful for understanding the cultural context of social phenomena and for gaining a holistic understanding of complex social processes.

Text Analysis

This method involves analyzing written or spoken language to identify patterns and themes. Text analysis can be quantitative or qualitative. Qualitative text analysis involves close reading and interpretation of texts to identify recurring themes, concepts, and patterns. Text analysis is useful for understanding media messages, public discourse, and cultural trends.

Case Study

This method involves an in-depth examination of a single person, group, or event to gain an understanding of complex phenomena. Case studies typically involve a combination of data collection methods, such as interviews, observations, and document analysis, to provide a comprehensive understanding of the case. Case studies are useful for exploring unique or rare cases, and for generating hypotheses for further research.

Process of Observation

This method involves systematically observing and recording behaviors and interactions in natural settings. The observer may take notes, use audio or video recordings, or use other methods to document what they see. Process of observation is useful for understanding social interactions, cultural practices, and the context in which behaviors occur.

Record Keeping

This method involves keeping detailed records of observations, interviews, and other data collected during the research process. Record keeping is essential for ensuring the accuracy and reliability of the data, and for providing a basis for analysis and interpretation.

Surveys

This method involves collecting data from a large sample of participants through a structured questionnaire. Surveys can be conducted in person, over the phone, through mail, or online. Surveys are useful for collecting data on attitudes, beliefs, and behaviors, and for identifying patterns and trends in a population.

Qualitative data analysis is the process of turning unstructured data into meaningful insights. It involves extracting and organizing information from sources like interviews, focus groups, and surveys. The goal is to understand people’s attitudes, behaviors, and motivations.

Qualitative Research Analysis Methods

Qualitative Research analysis methods involve a systematic approach to interpreting and making sense of the data collected in qualitative research. Here are some common qualitative data analysis methods:

Thematic Analysis

This method involves identifying patterns or themes in the data that are relevant to the research question. The researcher reviews the data, identifies keywords or phrases, and groups them into categories or themes. Thematic analysis is useful for identifying patterns across multiple data sources and for generating new insights into the research topic.
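
As a rough illustration of the mechanics only (not of the interpretive work itself), the Python sketch below groups invented keywords into themes and counts how often each theme appears across a few made-up interview excerpts. The codebook, themes, and excerpts are all hypothetical; in practice the researcher's close reading drives the analysis, and code or software merely helps with the bookkeeping.

```python
# Hypothetical codebook: keywords/phrases mapped to broader themes (invented).
codebook = {
    "belonging": ["fit in", "felt welcome", "out of place"],
    "workload": ["too much reading", "deadlines", "all-nighter"],
    "support": ["advisor helped", "tutoring", "office hours"],
}

# Invented excerpts standing in for transcribed interview data.
excerpts = [
    "I felt welcome at the club fair but out of place in seminars.",
    "The deadlines piled up and I pulled an all-nighter before finals.",
    "My advisor helped me plan my schedule and pointed me to tutoring.",
]

def tag_excerpt(text, codebook):
    """Return the set of themes whose keywords appear in the excerpt."""
    text = text.lower()
    return {theme for theme, keywords in codebook.items()
            if any(kw in text for kw in keywords)}

theme_counts = {theme: 0 for theme in codebook}
for excerpt in excerpts:
    for theme in tag_excerpt(excerpt, codebook):
        theme_counts[theme] += 1

print(theme_counts)  # e.g. {'belonging': 1, 'workload': 1, 'support': 1}
```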

Content Analysis

This method involves analyzing the content of written or spoken language to identify key themes or concepts. Content analysis can be quantitative or qualitative. Qualitative content analysis involves close reading and interpretation of texts to identify recurring themes, concepts, and patterns. Content analysis is useful for identifying patterns in media messages, public discourse, and cultural trends.

Discourse Analysis

This method involves analyzing language to understand how it constructs meaning and shapes social interactions. Discourse analysis can involve a variety of methods, such as conversation analysis, critical discourse analysis, and narrative analysis. Discourse analysis is useful for understanding how language shapes social interactions, cultural norms, and power relationships.

Grounded Theory Analysis

This method involves developing a theory or explanation based on the data collected. Grounded theory analysis starts with the data and uses an iterative process of coding and analysis to identify patterns and themes in the data. The theory or explanation that emerges is grounded in the data, rather than preconceived hypotheses. Grounded theory analysis is useful for understanding complex social phenomena and for generating new theoretical insights.

Narrative Analysis

This method involves analyzing the stories or narratives that participants share to gain insights into their experiences, attitudes, and beliefs. Narrative analysis can involve a variety of methods, such as structural analysis, thematic analysis, and discourse analysis. Narrative analysis is useful for understanding how individuals construct their identities, make sense of their experiences, and communicate their values and beliefs.

Phenomenological Analysis

This method involves analyzing how individuals make sense of their experiences and the meanings they attach to them. Phenomenological analysis typically involves in-depth interviews with participants to explore their experiences in detail. Phenomenological analysis is useful for understanding subjective experiences and for developing a rich understanding of human consciousness.

Comparative Analysis

This method involves comparing and contrasting data across different cases or groups to identify similarities and differences. Comparative analysis can be used to identify patterns or themes that are common across multiple cases, as well as to identify unique or distinctive features of individual cases. Comparative analysis is useful for understanding how social phenomena vary across different contexts and groups.

Applications of Qualitative Research

Qualitative research has many applications across different fields and industries. Here are some examples of how qualitative research is used:

  • Market Research: Qualitative research is often used in market research to understand consumer attitudes, behaviors, and preferences. Researchers conduct focus groups and one-on-one interviews with consumers to gather insights into their experiences and perceptions of products and services.
  • Health Care: Qualitative research is used in health care to explore patient experiences and perspectives on health and illness. Researchers conduct in-depth interviews with patients and their families to gather information on their experiences with different health care providers and treatments.
  • Education: Qualitative research is used in education to understand student experiences and to develop effective teaching strategies. Researchers conduct classroom observations and interviews with students and teachers to gather insights into classroom dynamics and instructional practices.
  • Social Work : Qualitative research is used in social work to explore social problems and to develop interventions to address them. Researchers conduct in-depth interviews with individuals and families to understand their experiences with poverty, discrimination, and other social problems.
  • Anthropology : Qualitative research is used in anthropology to understand different cultures and societies. Researchers conduct ethnographic studies and observe and interview members of different cultural groups to gain insights into their beliefs, practices, and social structures.
  • Psychology : Qualitative research is used in psychology to understand human behavior and mental processes. Researchers conduct in-depth interviews with individuals to explore their thoughts, feelings, and experiences.
  • Public Policy : Qualitative research is used in public policy to explore public attitudes and to inform policy decisions. Researchers conduct focus groups and one-on-one interviews with members of the public to gather insights into their perspectives on different policy issues.

How to Conduct Qualitative Research

Here are some general steps for conducting qualitative research:

  • Identify your research question: Qualitative research starts with a research question or set of questions that you want to explore. This question should be focused and specific, but also broad enough to allow for exploration and discovery.
  • Select your research design: There are different types of qualitative research designs, including ethnography, case study, grounded theory, and phenomenology. You should select a design that aligns with your research question and that will allow you to gather the data you need to answer your research question.
  • Recruit participants: Once you have your research question and design, you need to recruit participants. The number of participants you need will depend on your research design and the scope of your research. You can recruit participants through advertisements, social media, or through personal networks.
  • Collect data: There are different methods for collecting qualitative data, including interviews, focus groups, observation, and document analysis. You should select the method or methods that align with your research design and that will allow you to gather the data you need to answer your research question.
  • Analyze data: Once you have collected your data, you need to analyze it. This involves reviewing your data, identifying patterns and themes, and developing codes to organize your data. You can use different software programs to help you analyze your data, or you can do it manually (a small illustrative sketch of this coding step follows this list).
  • Interpret data: Once you have analyzed your data, you need to interpret it. This involves making sense of the patterns and themes you have identified, and developing insights and conclusions that answer your research question. You should be guided by your research question and use your data to support your conclusions.
  • Communicate results: Once you have interpreted your data, you need to communicate your results. This can be done through academic papers, presentations, or reports. You should be clear and concise in your communication, and use examples and quotes from your data to support your findings.
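
As referenced in the "Analyze data" step above, much of qualitative analysis is careful bookkeeping: attaching codes to segments of text so they can be reviewed, compared, and refined. The Python sketch below, which uses only the standard library and entirely invented participants, segments, codes, and file name, shows one simple way to record coded segments and export them to a CSV file for review. It is an illustration of the record keeping involved, not a replacement for dedicated qualitative data analysis software.

```python
import csv

# Invented coded segments: (participant ID, transcript excerpt, assigned codes).
coded_segments = [
    ("P01", "I only ask questions after class, never during.", ["help-seeking", "classroom norms"]),
    ("P02", "My manager checks in every Friday, which keeps me on track.", ["feedback", "routine"]),
    ("P03", "I stopped using the portal because it kept logging me out.", ["usability", "barriers"]),
]

# Write the segments to a CSV so codes can be reviewed, merged, or renamed later.
# The file name "coded_segments.csv" is arbitrary.
with open("coded_segments.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["participant", "segment", "codes"])
    for participant, segment, codes in coded_segments:
        writer.writerow([participant, segment, "; ".join(codes)])

print("Wrote", len(coded_segments), "coded segments to coded_segments.csv")
```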

Examples of Qualitative Research

Here are some real-time examples of qualitative research:

  • Customer Feedback: A company may conduct qualitative research to understand the feedback and experiences of its customers. This may involve conducting focus groups or one-on-one interviews with customers to gather insights into their attitudes, behaviors, and preferences.
  • Healthcare : A healthcare provider may conduct qualitative research to explore patient experiences and perspectives on health and illness. This may involve conducting in-depth interviews with patients and their families to gather information on their experiences with different health care providers and treatments.
  • Education : An educational institution may conduct qualitative research to understand student experiences and to develop effective teaching strategies. This may involve conducting classroom observations and interviews with students and teachers to gather insights into classroom dynamics and instructional practices.
  • Social Work: A social worker may conduct qualitative research to explore social problems and to develop interventions to address them. This may involve conducting in-depth interviews with individuals and families to understand their experiences with poverty, discrimination, and other social problems.
  • Anthropology : An anthropologist may conduct qualitative research to understand different cultures and societies. This may involve conducting ethnographic studies and observing and interviewing members of different cultural groups to gain insights into their beliefs, practices, and social structures.
  • Psychology : A psychologist may conduct qualitative research to understand human behavior and mental processes. This may involve conducting in-depth interviews with individuals to explore their thoughts, feelings, and experiences.
  • Public Policy: A government agency or non-profit organization may conduct qualitative research to explore public attitudes and to inform policy decisions. This may involve conducting focus groups and one-on-one interviews with members of the public to gather insights into their perspectives on different policy issues.

Purpose of Qualitative Research

The purpose of qualitative research is to explore and understand the subjective experiences, behaviors, and perspectives of individuals or groups in a particular context. Unlike quantitative research, which focuses on numerical data and statistical analysis, qualitative research aims to provide in-depth, descriptive information that can help researchers develop insights and theories about complex social phenomena.

Qualitative research can serve multiple purposes, including:

  • Exploring new or emerging phenomena : Qualitative research can be useful for exploring new or emerging phenomena, such as new technologies or social trends. This type of research can help researchers develop a deeper understanding of these phenomena and identify potential areas for further study.
  • Understanding complex social phenomena : Qualitative research can be useful for exploring complex social phenomena, such as cultural beliefs, social norms, or political processes. This type of research can help researchers develop a more nuanced understanding of these phenomena and identify factors that may influence them.
  • Generating new theories or hypotheses: Qualitative research can be useful for generating new theories or hypotheses about social phenomena. By gathering rich, detailed data about individuals’ experiences and perspectives, researchers can develop insights that may challenge existing theories or lead to new lines of inquiry.
  • Providing context for quantitative data: Qualitative research can be useful for providing context for quantitative data. By gathering qualitative data alongside quantitative data, researchers can develop a more complete understanding of complex social phenomena and identify potential explanations for quantitative findings.

When to use Qualitative Research

Here are some situations where qualitative research may be appropriate:

  • Exploring a new area: If little is known about a particular topic, qualitative research can help to identify key issues, generate hypotheses, and develop new theories.
  • Understanding complex phenomena: Qualitative research can be used to investigate complex social, cultural, or organizational phenomena that are difficult to measure quantitatively.
  • Investigating subjective experiences: Qualitative research is particularly useful for investigating the subjective experiences of individuals or groups, such as their attitudes, beliefs, values, or emotions.
  • Conducting formative research: Qualitative research can be used in the early stages of a research project to develop research questions, identify potential research participants, and refine research methods.
  • Evaluating interventions or programs: Qualitative research can be used to evaluate the effectiveness of interventions or programs by collecting data on participants’ experiences, attitudes, and behaviors.

Characteristics of Qualitative Research

Qualitative research is characterized by several key features, including:

  • Focus on subjective experience: Qualitative research is concerned with understanding the subjective experiences, beliefs, and perspectives of individuals or groups in a particular context. Researchers aim to explore the meanings that people attach to their experiences and to understand the social and cultural factors that shape these meanings.
  • Use of open-ended questions: Qualitative research relies on open-ended questions that allow participants to provide detailed, in-depth responses. Researchers seek to elicit rich, descriptive data that can provide insights into participants’ experiences and perspectives.
  • Sampling based on purpose and diversity: Qualitative research often involves purposive sampling, in which participants are selected based on specific criteria related to the research question. Researchers may also seek to include participants with diverse experiences and perspectives to capture a range of viewpoints.
  • Data collection through multiple methods: Qualitative research typically involves the use of multiple data collection methods, such as in-depth interviews, focus groups, and observation. This allows researchers to gather rich, detailed data from multiple sources, which can provide a more complete picture of participants’ experiences and perspectives.
  • Inductive data analysis: Qualitative research relies on inductive data analysis, in which researchers develop theories and insights based on the data rather than testing pre-existing hypotheses. Researchers use coding and thematic analysis to identify patterns and themes in the data and to develop theories and explanations based on these patterns.
  • Emphasis on researcher reflexivity: Qualitative research recognizes the importance of the researcher’s role in shaping the research process and outcomes. Researchers are encouraged to reflect on their own biases and assumptions and to be transparent about their role in the research process.

Advantages of Qualitative Research

Qualitative research offers several advantages over other research methods, including:

  • Depth and detail: Qualitative research allows researchers to gather rich, detailed data that provides a deeper understanding of complex social phenomena. Through in-depth interviews, focus groups, and observation, researchers can gather detailed information about participants’ experiences and perspectives that may be missed by other research methods.
  • Flexibility : Qualitative research is a flexible approach that allows researchers to adapt their methods to the research question and context. Researchers can adjust their research methods in real-time to gather more information or explore unexpected findings.
  • Contextual understanding: Qualitative research is well-suited to exploring the social and cultural context in which individuals or groups are situated. Researchers can gather information about cultural norms, social structures, and historical events that may influence participants’ experiences and perspectives.
  • Participant perspective : Qualitative research prioritizes the perspective of participants, allowing researchers to explore subjective experiences and understand the meanings that participants attach to their experiences.
  • Theory development: Qualitative research can contribute to the development of new theories and insights about complex social phenomena. By gathering rich, detailed data and using inductive data analysis, researchers can develop new theories and explanations that may challenge existing understandings.
  • Validity : Qualitative research can offer high validity by using multiple data collection methods, purposive and diverse sampling, and researcher reflexivity. This can help ensure that findings are credible and trustworthy.

Limitations of Qualitative Research

Qualitative research also has some limitations, including:

  • Subjectivity : Qualitative research relies on the subjective interpretation of researchers, which can introduce bias into the research process. The researcher’s perspective, beliefs, and experiences can influence the way data is collected, analyzed, and interpreted.
  • Limited generalizability: Qualitative research typically involves small, purposive samples that may not be representative of larger populations. This limits the generalizability of findings to other contexts or populations.
  • Time-consuming: Qualitative research can be a time-consuming process, requiring significant resources for data collection, analysis, and interpretation.
  • Resource-intensive: Qualitative research may require more resources than other research methods, including specialized training for researchers, specialized software for data analysis, and transcription services.
  • Limited reliability: Qualitative research may be less reliable than quantitative research, as it relies on the subjective interpretation of researchers. This can make it difficult to replicate findings or compare results across different studies.
  • Ethics and confidentiality: Qualitative research involves collecting sensitive information from participants, which raises ethical concerns about confidentiality and informed consent. Researchers must take care to protect the privacy and confidentiality of participants and obtain informed consent.

About the author: Muhammad Hassan, Researcher, Academic Writer, Web developer


Qualitative Research Design

What is Qualitative research design?

Qualitative research is a type of research that explores and provides deeper insights into real-world problems. Instead of collecting numerical data points or intervening or introducing treatments just like in quantitative research, qualitative research helps generate hypotheses as well as further investigate and understand quantitative data. Qualitative research gathers participants' experiences, perceptions, and behavior. It answers the hows and whys instead of how many or how much . It could be structured as a stand-alone study, purely relying on qualitative data or it could be part of mixed-methods research that combines qualitative and quantitative data.

Qualitative research involves collecting and analyzing non-numerical data (e.g., text, video, or audio) to understand concepts, opinions, or experiences. It can be used to gather in-depth insights into a problem or generate new ideas for research. Qualitative research is the opposite of quantitative research, which involves collecting and analyzing numerical data for statistical analysis. Qualitative research is commonly used in the humanities and social sciences, in subjects such as anthropology, sociology, education, health sciences, history, etc.

While qualitative and quantitative approaches are different, they are not necessarily opposites, and they are certainly not mutually exclusive. For instance, qualitative research can help expand and deepen understanding of data or results obtained from quantitative analysis. Say a quantitative analysis has determined that there is a correlation between length of stay and level of patient satisfaction, but why does this correlation exist? Qualitative follow-up with patients could explore the reasons behind the numbers; this is one way in which qualitative and quantitative research can be integrated.

Research Paradigms 

  • Positivist versus Post-Positivist
  • Social Constructivist (most qualitative studies arise from this paradigm/ideology)


Qualitative Research Methods

What are Qualitative Research methods?

Qualitative research draws on numerous methods or techniques, including interviews, focus groups, and observation. Interviews may be unstructured, with open-ended questions on a topic to which the interviewer adapts as responses unfold, or structured, with a predetermined set of questions that every participant is asked. Interviewing is usually one-on-one and is appropriate for sensitive topics or topics needing in-depth exploration. Focus groups are often held with 8-12 target participants and are used when group dynamics and collective views on a topic are desired. In observation, researchers can act as participant observers, sharing the experiences of those they study, or as non-participant (detached) observers.

What constitutes a good research question? Does the question drive research design choices?

According to Doody and Bailey (2014):

  • A good research question can only be developed by consulting the relevant literature and by talking with colleagues and supervisors experienced in the area of research.
  • It helps to have a clearly directed research aim and objective.
  • Researchers should avoid being "research trendy": the question should rest on sufficient evidence and be guided by explicit research objectives, which also help keep time and resources in consideration.

Research questions can be developed from theoretical knowledge, previous research or experience, or a practical need at work (Parahoo 2014). They have numerous roles, such as identifying the importance of the research and providing clarity of purpose for the research, in terms of what the research intends to achieve in the end.

Qualitative Research Questions

What constitutes a good Qualitative research question?

A good qualitative research question asks about the hows and whys of a phenomenon rather than how many or how much, and it centers on participants' experiences, perceptions, and behavior.

Examples of good Qualitative Research Questions:

What are people's thoughts on the new library? 

How does it feel to be a first-generation student attending college?

Difference example (between Qualitative and Quantitative research questions):

How many college students signed up for the new semester? (Quan) 

How do college students feel about the new semester? What are their experiences so far? (Qual)




Qualitative Research: Characteristics, Design, Methods & Examples

By Lauren McCall, MSc Health Psychology (University of Nottingham); edited by Saul Mcleod, PhD (Editor-in-Chief, Simply Psychology) and Olivia Guy-Evans, MSc (Associate Editor, Simply Psychology).

Qualitative research is a type of research methodology that focuses on gathering and analyzing non-numerical data to gain a deeper understanding of human behavior, experiences, and perspectives.

It aims to explore the “why” and “how” of a phenomenon rather than the “what,” “where,” and “when” typically addressed by quantitative research.

Unlike quantitative research, which focuses on gathering and analyzing numerical data for statistical analysis, qualitative research involves researchers interpreting data to identify themes, patterns, and meanings.

Qualitative research can be used to:

  • Gain deep contextual understandings of the subjective social reality of individuals
  • Answer questions about experience and meaning from the participant’s perspective
  • Generate hypotheses and theory by determining what is important to study before larger-scale research begins

Examples of qualitative research questions include: 

  • How does stress influence young adults’ behavior?
  • What factors influence students’ school attendance rates in developed countries?
  • How do adults interpret binge drinking in the UK?
  • What are the psychological impacts of cervical cancer screening in women?
  • How can mental health lessons be integrated into the school curriculum? 

Characteristics 

Naturalistic setting

Individuals are studied in their natural setting to gain a deeper understanding of how people experience the world. This enables the researcher to understand a phenomenon close to how participants experience it. 

Naturalistic settings provide valuable contextual information to help researchers better understand and interpret the data they collect.

The environment, social interactions, and cultural factors can all influence behavior and experiences, and these elements are more easily observed in real-world settings.

Reality is socially constructed

Qualitative research aims to understand how participants make meaning of their experiences – individually or in social contexts. It assumes there is no objective reality and that the social world is interpreted (Yilmaz, 2013). 

The primacy of subject matter 

The primary aim of qualitative research is to understand the perspectives, experiences, and beliefs of individuals who have experienced the phenomenon selected for research rather than the average experiences of groups of people (Minichiello, 1990).

An in-depth understanding is attained since qualitative techniques allow participants to freely disclose their experiences, thoughts, and feelings without constraint (Tenny et al., 2022). 

Variables are complex, interwoven, and difficult to measure

Factors such as experiences, behaviors, and attitudes are complex and interwoven, so they cannot be reduced to isolated variables, making them difficult to measure quantitatively.

However, a qualitative approach enables participants to describe what, why, or how they were thinking and feeling during the phenomenon being studied (Yilmaz, 2013).

Emic (insider’s point of view)

The phenomenon being studied is centered on the participants’ point of view (Minichiello, 1990).

Emic is used to describe how participants interact, communicate, and behave in the research setting (Scarduzio, 2017).

Interpretive analysis

In qualitative research, interpretive analysis is crucial in making sense of the collected data.

This process involves examining the raw data, such as interview transcripts, field notes, or documents, and identifying the underlying themes, patterns, and meanings that emerge from the participants’ experiences and perspectives.

Collecting Qualitative Data

There are four main research design methods used to collect qualitative data: observations, interviews,  focus groups, and ethnography.

Observations

This method involves watching and recording phenomena as they occur in nature. Observation can be divided into two types: participant and non-participant observation.

In participant observation, the researcher actively participates in the situation/events being observed.

In non-participant observation, the researcher is not an active part of the observation and tries not to influence the behaviors they are observing (Busetto et al., 2020). 

Observations can be covert (participants are unaware that a researcher is observing them) or overt (participants are aware of the researcher’s presence and know they are being observed).

However, awareness of an observer’s presence may influence participants’ behavior. 

Interviews

Interviews give researchers a window into the world of a participant by seeking their account of an event, situation, or phenomenon. They are usually conducted on a one-to-one basis and can be distinguished according to the level at which they are structured (Punch, 2013).

Structured interviews involve predetermined questions and sequences to ensure replicability and comparability. However, they are unable to explore emerging issues.

Informal interviews consist of spontaneous, casual conversations which are closer to the truth of a phenomenon. However, information is gathered using quick notes made by the researcher and is therefore subject to recall bias. 

Semi-structured interviews follow a flexible structure, in which question phrasing and order can be adapted so that emerging issues can be explored (Denny & Weckesser, 2022).

The use of probing questions and clarification can lead to a detailed understanding, but semi-structured interviews can be time-consuming and subject to interviewer bias. 

Focus groups 

Similar to interviews, focus groups elicit a rich and detailed account of an experience. However, focus groups are more dynamic since participants with shared characteristics construct this account together (Denny & Weckesser, 2022).

A shared narrative is built between participants to capture a group experience shaped by a shared context. 

The researcher takes on the role of a moderator, who will establish ground rules and guide the discussion by following a topic guide to focus the group discussions.

Typically, focus groups have 4-10 participants as a discussion can be difficult to facilitate with more than this, and this number allows everyone the time to speak.

Ethnography

Ethnography is a methodology used to study a group of people’s behaviors and social interactions in their environment (Reeves et al., 2008).

Data are collected using methods such as observations, field notes, and structured or unstructured interviews.

The aim of ethnography is to provide detailed, holistic insights into people’s behavior and perspectives within their natural setting. In order to achieve this, researchers immerse themselves in a community or organization. 

Due to the flexibility and real-world focus of ethnography, researchers are able to gather an in-depth, nuanced understanding of people’s experiences, knowledge and perspectives that are influenced by culture and society.

In order to develop a representative picture of a particular culture or context, researchers must conduct extensive fieldwork.

This can be time-consuming, as researchers may need to immerse themselves in a community or culture for anywhere from a few days to a few years.

Qualitative Data Analysis Methods

Different methods can be used for analyzing qualitative data. The researcher chooses based on the objectives of their study. 

The researcher plays a key role in the interpretation of data, making decisions about the coding, theming, decontextualizing, and recontextualizing of data (Starks & Trinidad, 2007). 

Grounded theory

Grounded theory is a qualitative method specifically designed to inductively generate theory from data. It was developed by Glaser and Strauss in 1967 (Glaser & Strauss, 2017).

This methodology aims to develop theories (rather than test hypotheses) that explain a social process, action, or interaction (Petty et al., 2012). To inform the developing theory, data collection and analysis run simultaneously. 

There are three key types of coding used in grounded theory: initial (open), intermediate (axial), and advanced (selective) coding. 

Throughout the analysis, memos should be created to document methodological and theoretical ideas about the data. Data should be collected and analyzed until data saturation is reached and a theory is developed. 

Content analysis

Content analysis was first used in the early twentieth century to analyze textual materials such as newspapers and political speeches.

Content analysis is a research method used to identify and analyze the presence and patterns of themes, concepts, or words in data (Vaismoradi et al., 2013). 

This research method can be used to analyze data in different formats, which can be written, oral, or visual. 

The goal of content analysis is to develop themes that capture the underlying meanings of data (Schreier, 2012). 

Qualitative content analysis can be used to validate existing theories, support the development of new models and theories, and provide in-depth descriptions of particular settings or experiences.

The following six steps provide a guideline for how to conduct qualitative content analysis.
  • Define a Research Question : To start content analysis, a clear research question should be developed.
  • Identify and Collect Data : Establish the inclusion criteria for your data. Find the relevant sources to analyze.
  • Define the Unit or Theme of Analysis : Categorize the content into themes. Themes can be a word, phrase, or sentence.
  • Develop Rules for Coding your Data : Define a set of coding rules to ensure that all data are coded consistently.
  • Code the Data : Follow the coding rules to categorize data into themes.
  • Analyze the Results and Draw Conclusions : Examine the data to identify patterns and draw conclusions in relation to your research question.
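To make the coding and tallying steps concrete, the sketch below (Python) applies a small, hypothetical coding rule book to two invented responses and counts how often each theme occurs. It is purely illustrative: the themes, keyword rules, and responses are made up, and real content analysis involves far richer coding rules and human judgment, often supported by software such as NVivo.

```python
# Illustrative sketch only: hypothetical themes, keyword rules, and responses.
from collections import Counter

# Coding rules (step 4): each theme is defined by indicator words or phrases.
coding_rules = {
    "access_barriers": ["waiting list", "appointment", "cost"],
    "emotional_impact": ["anxious", "worried", "relieved"],
}

responses = [
    "I felt anxious about the cost of the screening.",
    "The waiting list for an appointment was far too long.",
]

# Code the data (step 5): tag every theme whose indicators appear in a response.
theme_counts = Counter()
coded_units = []
for text in responses:
    matched = [theme for theme, keywords in coding_rules.items()
               if any(keyword in text.lower() for keyword in keywords)]
    coded_units.append({"text": text, "themes": matched})
    theme_counts.update(matched)

# Analyze the results (step 6): inspect how often each theme occurs.
print(theme_counts)   # Counter({'access_barriers': 2, 'emotional_impact': 1})
print(coded_units)
```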

Discourse analysis

Discourse analysis is a research method used to study written or spoken language in relation to its social context (Wood & Kroger, 2000).

In discourse analysis, the researcher interprets the details of language materials and the context in which they are situated.

Discourse analysis aims to understand the functions of language (how language is used in real life) and how meaning is conveyed by language in different contexts. Researchers use discourse analysis to investigate social groups and how language is used to achieve specific communication goals.

Different methods of discourse analysis can be used depending on the aims and objectives of a study. However, the following steps provide a guideline on how to conduct discourse analysis.
  • Define the Research Question : Develop a relevant research question to frame the analysis.
  • Gather Data and Establish the Context : Collect research materials (e.g., interview transcripts, documents). Gather factual details and review the literature to construct a theory about the social and historical context of your study.
  • Analyze the Content : Closely examine various components of the text, such as the vocabulary, sentences, paragraphs, and structure of the text. Identify patterns relevant to the research question to create codes, then group these into themes.
  • Review the Results : Reflect on the findings to examine the function of the language, and the meaning and context of the discourse. 

Thematic analysis

Thematic analysis is a method used to identify, interpret, and report patterns in data, such as commonalities or contrasts. 

Although the origins of thematic analysis can be traced back to the early twentieth century, its current, systematic formulation is attributed to Braun and Clarke (2006).

Thematic analysis aims to develop themes (patterns of meaning) across a dataset to address a research question. 

In thematic analysis, qualitative data is gathered using techniques such as interviews, focus groups, and questionnaires. Audio recordings are transcribed. The dataset is then explored and interpreted by a researcher to identify patterns. 

This occurs through the rigorous process of data familiarisation, coding, theme development, and revision. These identified patterns provide a summary of the dataset and can be used to address a research question.

Themes are developed by exploring the implicit and explicit meanings within the data. Two different approaches are used to generate themes: inductive and deductive. 

An inductive approach allows themes to emerge from the data. In contrast, a deductive approach uses existing theories or knowledge to apply preconceived ideas to the data.

Phases of Thematic Analysis

Braun and Clarke (2006) provide a guide to the six phases of thematic analysis, outlined below. These phases can be applied flexibly to fit the research question and data.
  • Phase 1. Gather and transcribe data: Gather raw data, for example interviews or focus groups, and transcribe audio recordings fully.
  • Phase 2. Familiarization with data: Read and reread all your data from beginning to end; note down initial ideas.
  • Phase 3. Create initial codes: Start identifying preliminary codes which highlight important features of the data and may be relevant to the research question.
  • Phase 4. Create new codes which encapsulate potential themes: Review initial codes and explore any similarities, differences, or contradictions to uncover underlying themes; create a map to visualize identified themes.
  • Phase 5. Take a break, then return to the data: Take a break and then return later to review the themes.
  • Phase 6. Evaluate themes for good fit: This is the last opportunity for analysis; check that themes are supported and saturated with data.

Template analysis

Template analysis refers to a specific method of thematic analysis which uses hierarchical coding (Brooks et al., 2014).

Template analysis is used to analyze textual data, for example, interview transcripts or open-ended responses on a written questionnaire.

To conduct template analysis, a coding template must be developed (usually from a subset of the data) and subsequently revised and refined. This template represents the themes identified by researchers as important in the dataset. 

Codes are ordered hierarchically within the template, with the highest-level codes demonstrating overarching themes in the data and lower-level codes representing constituent themes with a narrower focus.

A guideline for the main procedural steps for conducting template analysis is outlined below.
  • Familiarization with the Data : Read (and reread) the dataset in full. Engage, reflect, and take notes on data that may be relevant to the research question.
  • Preliminary Coding : Identify initial codes, guided by any a priori codes defined before the analysis as likely to be relevant.
  • Organize Themes : Organize themes into meaningful clusters. Consider the relationships between the themes both within and between clusters.
  • Produce an Initial Template : Develop an initial template. This may be based on a subset of the data.
  • Apply and Develop the Template : Apply the initial template to further data and make any necessary modifications. Refinements of the template may include adding themes, removing themes, or changing the scope/title of themes. 
  • Finalize Template : Finalize the template, then apply it to the entire dataset. 
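Because codes in template analysis are ordered hierarchically, a nested data structure is a convenient way to picture a coding template. The sketch below (Python) shows a small, entirely hypothetical template with overarching themes, narrower constituent codes, one coded segment, and a revision that adds a code; it illustrates the structure only, not any particular study or software.

```python
# Illustrative sketch only: hypothetical hierarchical coding template.
# Top-level keys are overarching themes; nested keys are narrower constituent codes,
# each collecting the text segments coded to it.
template = {
    "barriers_to_quitting": {
        "social_pressure": [],
        "withdrawal_symptoms": [],
    },
    "motivations_to_quit": {
        "health_concerns": [],
        "family_influence": [],
    },
}

# Applying the template: attach a data segment to the code it illustrates.
template["motivations_to_quit"]["health_concerns"].append(
    "I want to be able to keep up with my grandchildren."
)

# Revising the template: add a code that emerged while coding further transcripts.
template["barriers_to_quitting"]["cost_of_support"] = []

for theme, codes in template.items():
    print(theme, "->", list(codes))
```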

Framework analysis

Framework analysis (also known as the framework method) is a comparative form of thematic analysis which systematically analyzes data using a matrix output.

Ritchie and Spencer (1994) developed this set of techniques to analyze qualitative data in applied policy research. Framework analysis aims to generate theory from data.

Framework analysis encourages researchers to organize and manage their data using summarization.

This results in a flexible and unique matrix output, in which individual participants (or cases) are represented by rows and themes are represented by columns. 

Each intersecting cell is used to summarize findings relating to the corresponding participant and theme.

Framework analysis has five distinct, interrelated phases, which together form a methodical and rigorous approach.
  • Familiarization with the Data : Familiarize yourself with all the transcripts. Immerse yourself in the details of each transcript and start to note recurring themes.
  • Develop a Theoretical Framework : Identify recurrent and important themes and add them to a chart. This provides a framework and structure for the analysis.
  • Indexing : Apply the framework systematically to the entire study data.
  • Summarize Data in Analytical Framework : Reduce the data into brief summaries of participants’ accounts.
  • Mapping and Interpretation : Compare themes and subthemes and check against the original transcripts. Group the data into categories and provide an explanation for them.
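The matrix output described above can be pictured as a simple participants-by-themes table. The sketch below (Python) builds such a matrix with hypothetical participants, themes, and cell summaries, then reads down each theme "column" to compare cases; in practice this charting is usually done in a spreadsheet or qualitative analysis software rather than in code.

```python
# Illustrative sketch only: a hypothetical framework matrix in which rows are
# participants (cases), columns are themes, and each cell summarizes that
# participant's account for that theme.
themes = ["experience_of_screening", "sources_of_support"]
matrix = {
    "P01": {"experience_of_screening": "Found the procedure uncomfortable but quick.",
            "sources_of_support": "Relied mainly on her sister."},
    "P02": {"experience_of_screening": "Delayed attending because of work commitments.",
            "sources_of_support": "Mentioned an online forum."},
}

# Mapping and interpretation: read down each theme 'column' to compare cases.
for theme in themes:
    print(f"--- {theme} ---")
    for participant, row in matrix.items():
        print(participant, "|", row[theme])
```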

Preventing Bias in Qualitative Research

To evaluate qualitative studies, the CASP (Critical Appraisal Skills Programme) checklist for qualitative studies can be used to ensure all aspects of a study have been considered (CASP, 2018).

The quality of research can be enhanced and assessed using criteria such as checklists, reflexivity, co-coding, and member-checking. 

Co-coding 

Relying on a single researcher to interpret rich and complex data risks missing key insights and alternative viewpoints. Therefore, coding is often performed by multiple researchers.

A common strategy must be defined at the beginning of the coding process  (Busetto et al., 2020). This includes establishing a useful coding list and finding a common definition of individual codes.

Transcripts are initially coded independently by researchers and then compared and consolidated to minimize error or bias and to bring confirmation of findings. 
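As a rough illustration of that comparison step, the sketch below (Python) lines up two coders' independent, hypothetical code assignments for the same transcript segments and flags the disagreements to be discussed and consolidated. It is a toy example of the bookkeeping only; it does not replace an agreed coding strategy or formal inter-rater reliability measures.

```python
# Illustrative sketch only: hypothetical codes assigned independently by two researchers.
coder_a = {"segment_1": "stigma", "segment_2": "cost", "segment_3": "family_support"}
coder_b = {"segment_1": "stigma", "segment_2": "access", "segment_3": "family_support"}

agreements, disagreements = [], []
for segment in coder_a:
    if coder_a[segment] == coder_b[segment]:
        agreements.append(segment)
    else:
        disagreements.append((segment, coder_a[segment], coder_b[segment]))

print(f"Simple agreement: {len(agreements)}/{len(coder_a)} segments")
print("To discuss and consolidate:", disagreements)
```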

Member checking

Member checking (or respondent validation) involves checking back with participants to see if the research resonates with their experiences (Russell & Gregory, 2003).

Data can be returned to participants after data collection or when results are first available. For example, participants may be provided with their interview transcript and asked to verify whether this is a complete and accurate representation of their views.

Participants may then clarify or elaborate on their responses to ensure they align with their views (Shenton, 2004).

This feedback becomes part of data collection and ensures accurate descriptions and interpretations of phenomena (Mays & Pope, 2000).

Reflexivity in qualitative research

Reflexivity typically involves examining your own judgments, practices, and belief systems during data collection and analysis. It aims to identify any personal beliefs which may affect the research. 

Reflexivity is essential in qualitative research to ensure methodological transparency and complete reporting. This enables readers to understand how the interaction between the researcher and participant shapes the data.

Depending on the research question and the population being researched, relevant factors include the researcher's experience, age, gender, and ethnicity, as well as how contact with participants was established and maintained.

These details are important because, in qualitative research, the researcher is a dynamic part of the research process and actively influences the outcome of the research (Boeije, 2014). 

Reflexivity Example

Who you are and your characteristics influence how you collect and analyze data. Here is an example of a reflexivity statement for research on smoking: “I am a 30-year-old white female from a middle-class background. I live in the southwest of England and have been educated to master’s level. I have been involved in two research projects on oral health. I have never smoked, but I have witnessed how smoking can cause ill health through my volunteering in a smoking cessation clinic. My research aspiration is to help develop interventions that support smokers in quitting.”

Establishing Trustworthiness in Qualitative Research

Trustworthiness is a concept used to assess the quality and rigor of qualitative research. Four criteria are used to assess a study’s trustworthiness: credibility, transferability, dependability, and confirmability.

1. Credibility in Qualitative Research

Credibility refers to how accurately the results represent the reality and viewpoints of the participants.

To establish credibility in research, participants’ views and the researcher’s representation of their views need to align (Tobin & Begley, 2004).

To increase the credibility of findings, researchers may use data source triangulation, investigator triangulation, peer debriefing, or member checking (Lincoln & Guba, 1985). 

2. Transferability in Qualitative Research

Transferability refers to how generalizable the findings are: whether the findings may be applied to another context, setting, or group (Tobin & Begley, 2004).

Transferability can be enhanced by giving thorough and in-depth descriptions of the research setting, sample, and methods (Nowell et al., 2017). 

3. Dependability in Qualitative Research

Dependability is the extent to which the study could be replicated under similar conditions and the findings would be consistent.

Researchers can establish dependability using methods such as audit trails so readers can see the research process is logical and traceable (Koch, 1994).

4. Confirmability in Qualitative Research

Confirmability is concerned with establishing that there is a clear link between the researcher’s interpretations/ findings and the data.

Researchers can achieve confirmability by demonstrating how conclusions and interpretations were arrived at (Nowell et al., 2017).

This enables readers to understand the reasoning behind the decisions made. 

Audit Trails in Qualitative Research

An audit trail provides evidence of the decisions made by the researcher regarding theory, research design, and data collection, as well as the steps they have chosen to manage, analyze, and report data. 

The researcher must provide a clear rationale to demonstrate how conclusions were reached in their study.

A clear description of the research path must be provided to enable readers to trace through the researcher’s logic (Halpern, 1983).

Researchers should maintain records of the raw data, field notes, transcripts, and a reflective journal in order to provide a clear audit trail. 

Advantages

Discovery of unexpected data

Open-ended questions in qualitative research mean the researcher can probe an interview topic and enable the participant to elaborate on responses in an unrestricted manner.

This allows unexpected data to emerge, which can lead to further research into that topic. 

The exploratory nature of qualitative research helps generate hypotheses that can be tested quantitatively (Busetto et al., 2020).

Flexibility

Data collection and analysis can be modified and adapted to take the research in a different direction if new ideas or patterns emerge in the data.

This enables researchers to investigate new opportunities while firmly maintaining their research goals. 

Naturalistic settings

The behaviors of participants are recorded in real-world settings. Studies that use real-world settings have high ecological validity since participants behave more authentically. 

Limitations

Time-consuming

Qualitative research results in large amounts of data which often need to be transcribed and analyzed manually.

Even when software is used, transcription can be inaccurate, and using software for analysis can result in many codes which need to be condensed into themes. 

Subjectivity 

The researcher has an integral role in collecting and interpreting qualitative data. Therefore, the conclusions reached are from their perspective and experience.

Consequently, interpretations of data from another researcher may vary greatly. 

Limited generalizability

The aim of qualitative research is to provide a detailed, contextualized understanding of an aspect of the human experience from a relatively small sample size.

Despite rigorous analysis procedures, conclusions drawn cannot be generalized to the wider population since data may be biased or unrepresentative.

Therefore, results are only applicable to a small group of the population. 

While individual qualitative studies are often limited in their generalizability due to factors such as sample size and context, metasynthesis enables researchers to synthesize findings from multiple studies, potentially leading to more generalizable conclusions.

By integrating findings from studies conducted in diverse settings and with different populations, metasynthesis can provide broader insights into the phenomenon of interest.

Extraneous variables

Qualitative research is often conducted in real-world settings. This may cause results to be unreliable since extraneous variables may affect the data, for example:

  • Situational variables : different environmental conditions may influence participants’ behavior in a study. The random variation in factors (such as noise or lighting) may be difficult to control in real-world settings.
  • Participant characteristics : this includes any characteristics that may influence how a participant answers/ behaves in a study. This may include a participant’s mood, gender, age, ethnicity, sexual identity, IQ, etc.
  • Experimenter effect : the experimenter effect refers to how a researcher’s unintentional influence can change the outcome of a study. This can occur when the researcher’s interactions with participants unintentionally change their behavior, or through errors in observation, interpretation, or analysis.

What sample size should qualitative research be?

The sample size for qualitative studies has been recommended to include a minimum of 12 participants to reach data saturation (Braun & Clarke, 2013).

Are surveys qualitative or quantitative?

Surveys can be used to gather information from a sample qualitatively or quantitatively. Qualitative surveys use open-ended questions to gather detailed information from a large sample using free text responses.

The use of open-ended questions allows for unrestricted responses where participants use their own words, enabling the collection of more in-depth information than closed-ended questions.

In contrast, quantitative surveys consist of closed-ended questions with multiple-choice answer options. Quantitative surveys are ideal to gather a statistical representation of a population.

What are the ethical considerations of qualitative research?

Before conducting a study, you must think about any risks that could occur and take steps to prevent them.
  • Participant Protection : Researchers must protect participants from physical and mental harm. This means you must not embarrass, frighten, offend, or harm participants.
  • Transparency : Researchers are obligated to clearly communicate how they will collect, store, analyze, use, and share the data.
  • Confidentiality : You need to consider how to maintain the confidentiality and anonymity of participants’ data.

What is triangulation in qualitative research?

Triangulation refers to the use of several approaches in a study to comprehensively understand phenomena. This method helps to increase the validity and credibility of research findings. 

Types of triangulation include method triangulation (using multiple methods to gather data); investigator triangulation (multiple researchers for collecting/ analyzing data), theory triangulation (comparing several theoretical perspectives to explain a phenomenon), and data source triangulation (using data from various times, locations, and people; Carter et al., 2014).

Why is qualitative research important?

Qualitative research allows researchers to describe and explain the social world. The exploratory nature of qualitative research helps to generate hypotheses that can then be tested quantitatively.

In qualitative research, participants are able to express their thoughts, experiences, and feelings without constraint.

Additionally, researchers are able to follow up on participants’ answers in real-time, generating valuable discussion around a topic. This enables researchers to gain a nuanced understanding of phenomena which is difficult to attain using quantitative methods.

What is coding data in qualitative research?

Coding data is a qualitative data analysis strategy in which a section of text is assigned with a label that describes its content.

These labels may be words or phrases which represent important (and recurring) patterns in the data.

This process enables researchers to identify related content across the dataset. Codes can then be used to group similar types of data to generate themes.
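A minimal sketch of that idea follows (Python, with hypothetical segments, codes, and themes): each text segment carries a code label, and codes are then grouped under candidate themes. In practice, this labeling and grouping is an interpretive act performed by the researcher, typically in qualitative analysis software or by hand rather than programmatically.

```python
# Illustrative sketch only: hypothetical coded segments and a code-to-theme grouping.
from collections import defaultdict

coded_segments = [
    ("I couldn't get an appointment for weeks.", "long_waits"),
    ("My mum reminded me to book the test.", "family_prompting"),
    ("The clinic is miles away from where I live.", "travel_distance"),
]

code_to_theme = {
    "long_waits": "access_barriers",
    "travel_distance": "access_barriers",
    "family_prompting": "social_influences",
}

# Group similarly coded data to suggest candidate themes.
themes = defaultdict(list)
for text, code in coded_segments:
    themes[code_to_theme[code]].append((code, text))

for theme, items in themes.items():
    print(theme, "->", [code for code, _ in items])
```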

What is the difference between qualitative and quantitative research?

Qualitative research involves the collection and analysis of non-numerical data in order to understand experiences and meanings from the participant’s perspective.

This can provide rich, in-depth insights on complicated phenomena. Qualitative data may be collected using interviews, focus groups, or observations.

In contrast, quantitative research involves the collection and analysis of numerical data to measure the frequency, magnitude, or relationships of variables. This can provide objective and reliable evidence that can be generalized to the wider population.

Quantitative data may be collected using closed-ended questionnaires or experiments.

What is trustworthiness in qualitative research?

Trustworthiness is a concept used to assess the quality and rigor of qualitative research. Four criteria are used to assess a study’s trustworthiness: credibility, transferability, dependability, and confirmability. 

Credibility refers to how accurately the results represent the reality and viewpoints of the participants. Transferability refers to whether the findings may be applied to another context, setting, or group.

Dependability is the extent to which the findings are consistent and reliable. Confirmability refers to the objectivity of findings (not influenced by the bias or assumptions of researchers).

What is data saturation in qualitative research?

Data saturation is a methodological principle used to guide the sample size of a qualitative research study.

Data saturation is proposed as a necessary methodological component in qualitative research (Saunders et al., 2018) as it is a vital criterion for discontinuing data collection and/or analysis. 

The intention of data saturation is to find “no new data, no new themes, no new coding, and ability to replicate the study” (Guest et al., 2006). At that point, enough data have been gathered to draw conclusions.
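One informal way to picture this criterion is to track how many new codes each additional interview contributes and to consider stopping once successive interviews add nothing new. The sketch below (Python, with hypothetical code sets) illustrates that bookkeeping only; judging saturation in a real study also depends on the richness and depth of the data, not just code counts.

```python
# Illustrative sketch only: hypothetical sets of codes identified in each interview.
interviews = [
    {"stigma", "cost", "family_support"},   # interview 1
    {"cost", "travel_distance"},            # interview 2
    {"stigma", "travel_distance"},          # interview 3: no new codes
    {"family_support", "cost"},             # interview 4: no new codes
]

seen = set()
for i, codes in enumerate(interviews, start=1):
    new_codes = codes - seen
    seen |= codes
    print(f"Interview {i}: {len(new_codes)} new code(s): {sorted(new_codes)}")
# Several consecutive interviews yielding no new codes suggests saturation may be near.
```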

Why is sampling in qualitative research important?

In quantitative research, large sample sizes are used to provide statistically precise estimates of population characteristics.

This is because quantitative research aims to provide generalizable conclusions that represent populations.

However, the aim of sampling in qualitative research is to gather data that will help the researcher understand the depth, complexity, variation, or context of a phenomenon. The small sample sizes in qualitative studies support the depth of case-oriented analysis.

What is narrative analysis?

Narrative analysis is a qualitative research method used to understand how individuals create stories from their personal experiences.

There is an emphasis on understanding the context in which a narrative is constructed, recognizing the influence of historical, cultural, and social factors on storytelling.

Researchers can use different methods together to explore a research question.

Some narrative researchers focus on the content of what is said, using thematic narrative analysis, while others focus on the structure, such as holistic-form or categorical-form structural narrative analysis. Others focus on how the narrative is produced and performed.

Boeije, H. (2014). Analysis in qualitative research. Sage.

Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative research in psychology , 3 (2), 77-101. https://doi.org/10.1191/1478088706qp063oa

Brooks, J., McCluskey, S., Turley, E., & King, N. (2014). The utility of template analysis in qualitative psychology research. Qualitative Research in Psychology , 12 (2), 202–222. https://doi.org/10.1080/14780887.2014.955224

Busetto, L., Wick, W., & Gumbinger, C. (2020). How to use and assess qualitative research methods. Neurological research and practice , 2 (1), 14-14. https://doi.org/10.1186/s42466-020-00059-z 

Carter, N., Bryant-Lukosius, D., DiCenso, A., Blythe, J., & Neville, A. J. (2014). The use of triangulation in qualitative research. Oncology nursing forum , 41 (5), 545–547. https://doi.org/10.1188/14.ONF.545-547

Critical Appraisal Skills Programme. (2018). CASP Checklist: 10 questions to help you make sense of a Qualitative research. https://casp-uk.net/images/checklist/documents/CASP-Qualitative-Studies-Checklist/CASP-Qualitative-Checklist-2018_fillable_form.pdf Accessed: March 15 2023

Braun, V., & Clarke, V. (2013). Successful qualitative research: A practical guide for beginners. Sage.

Denny, E., & Weckesser, A. (2022). How to do qualitative research?: Qualitative research methods. BJOG : an international journal of obstetrics and gynaecology , 129 (7), 1166-1167. https://doi.org/10.1111/1471-0528.17150 

Glaser, B. G., & Strauss, A. L. (2017). The discovery of grounded theory. The Discovery of Grounded Theory , 1–18. https://doi.org/10.4324/9780203793206-1

Guest, G., Bunce, A., & Johnson, L. (2006). How many interviews are enough? An experiment with data saturation and variability. Field Methods, 18 (1), 59-82. doi:10.1177/1525822X05279903

Halpern, E. S. (1983). Auditing naturalistic inquiries: The development and application of a model (Unpublished doctoral dissertation). Indiana University, Bloomington.

Hammarberg, K., Kirkman, M., & de Lacey, S. (2016). Qualitative research methods: When to use them and how to judge them. Human Reproduction , 31 (3), 498–501. https://doi.org/10.1093/humrep/dev334

Koch, T. (1994). Establishing rigour in qualitative research: The decision trail. Journal of Advanced Nursing, 19, 976–986. doi:10.1111/ j.1365-2648.1994.tb01177.x

Lincoln, Y., & Guba, E. G. (1985). Naturalistic inquiry. Newbury Park, CA: Sage.

Mays, N., & Pope, C. (2000). Assessing quality in qualitative research. BMJ, 320(7226), 50–52.

Minichiello, V. (1990). In-Depth Interviewing: Researching People. Longman Cheshire.

Nowell, L. S., Norris, J. M., White, D. E., & Moules, N. J. (2017). Thematic Analysis: Striving to Meet the Trustworthiness Criteria. International Journal of Qualitative Methods, 16 (1). https://doi.org/10.1177/1609406917733847

Petty, N. J., Thomson, O. P., & Stew, G. (2012). Ready for a paradigm shift? part 2: Introducing qualitative research methodologies and methods. Manual Therapy , 17 (5), 378–384. https://doi.org/10.1016/j.math.2012.03.004

Punch, K. F. (2013). Introduction to social research: Quantitative and qualitative approaches. London: Sage

Reeves, S., Kuper, A., & Hodges, B. D. (2008). Qualitative research methodologies: Ethnography. BMJ , 337 (aug07 3). https://doi.org/10.1136/bmj.a1020

Russell, C. K., & Gregory, D. M. (2003). Evaluation of qualitative research studies. Evidence Based Nursing, 6 (2), 36–40.

Saunders, B., Sim, J., Kingstone, T., Baker, S., Waterfield, J., Bartlam, B., Burroughs, H., & Jinks, C. (2018). Saturation in qualitative research: exploring its conceptualization and operationalization. Quality & quantity , 52 (4), 1893–1907. https://doi.org/10.1007/s11135-017-0574-8

Scarduzio, J. A. (2017). Emic approach to qualitative research. The International Encyclopedia of Communication Research Methods, 1–2 . https://doi.org/10.1002/9781118901731.iecrm0082

Schreier, M. (2012). Qualitative content analysis in practice. Sage.

Shenton, A. K. (2004). Strategies for ensuring trustworthiness in qualitative research projects. Education for Information, 22 , 63–75.

Starks, H., & Trinidad, S. B. (2007). Choose your method: a comparison of phenomenology, discourse analysis, and grounded theory. Qualitative health research , 17 (10), 1372–1380. https://doi.org/10.1177/1049732307307031

Tenny, S., Brannan, J. M., & Brannan, G. D. (2022). Qualitative Study. In StatPearls. StatPearls Publishing.

Tobin, G. A., & Begley, C. M. (2004). Methodological rigour within a qualitative framework. Journal of Advanced Nursing, 48, 388–396. doi:10.1111/j.1365-2648.2004.03207.x

Vaismoradi, M., Turunen, H., & Bondas, T. (2013). Content analysis and thematic analysis: Implications for conducting a qualitative descriptive study. Nursing & health sciences , 15 (3), 398-405. https://doi.org/10.1111/nhs.12048

Wood L. A., Kroger R. O. (2000). Doing discourse analysis: Methods for studying action in talk and text. Sage.

Yilmaz, K. (2013). Comparison of Quantitative and Qualitative Research Traditions: epistemological, theoretical, and methodological differences. European journal of education , 48 (2), 311-325. https://doi.org/10.1111/ejed.12014


Qualitative Research: Definition

Qualitative research is the naturalistic study of social meanings and processes, using interviews, observations, and the analysis of texts and images.  In contrast to quantitative researchers, whose statistical methods enable broad generalizations about populations (for example, comparisons of the percentages of U.S. demographic groups who vote in particular ways), qualitative researchers use in-depth studies of the social world to analyze how and why groups think and act in particular ways (for instance, case studies of the experiences that shape political views).   


Qualitative Design Research Methods

  • Michael Domínguez, San Diego State University
  • https://doi.org/10.1093/acrefore/9780190264093.013.170
  • Published online: 19 December 2017

Emerging in the learning sciences field in the early 1990s, qualitative design-based research (DBR) is a relatively new methodological approach to social science and education research. As its name implies, DBR is focused on the design of educational innovations, and the testing of these innovations in the complex and interconnected venue of naturalistic settings. As such, DBR is an explicitly interventionist approach to conducting research, situating the researcher as a part of the complex ecology in which learning and educational innovation takes place.

With this in mind, DBR is distinct from more traditional methodologies, including laboratory experiments, ethnographic research, and large-scale implementation. Rather, the goal of DBR is not to prove the merits of any particular intervention, or to reflect passively on a context in which learning occurs, but to examine the practical application of theories of learning themselves in specific, situated contexts. By designing purposeful, naturalistic, and sustainable educational ecologies, researchers can test, extend, or modify their theories and innovations based on their pragmatic viability. This process offers the prospect of generating theory-developing, contextualized knowledge claims that can complement the claims produced by other forms of research.

Because of this interventionist, naturalistic stance, DBR has also been the subject of ongoing debate concerning the rigor of its methodology. In many ways, these debates obscure the varied ways DBR has been practiced, the varied types of questions being asked, and the theoretical breadth of researchers who practice DBR. With this in mind, DBR research may involve a diverse range of methods as researchers from a variety of intellectual traditions within the learning sciences and education research design pragmatic innovations based on their theories of learning, and document these complex ecologies using the methodologies and tools most applicable to their questions, focuses, and academic communities.

DBR has gained increasing interest in recent years. While it remains a popular methodology for developmental and cognitive learning scientists seeking to explore theory in naturalistic settings, it has also grown in importance to cultural psychology and cultural studies researchers as a methodological approach that aligns in important ways with the participatory commitments of liberatory research. As such, internal tension within the DBR field has also emerged. Yet, though approaches vary, and have distinct genealogies and commitments, DBR might be seen as the broad methodological genre in which Change Laboratory, design-based implementation research (DBIR), social design-based experiments (SDBE), participatory design research (PDR), and research-practice partnerships might be categorized. These critically oriented iterations of DBR have important implications for educational research and educational innovation in historically marginalized settings and the Global South.

  • design-based research
  • learning sciences
  • social-design experiment
  • qualitative research
  • research methods

Educational research, perhaps more than many other disciplines, is a situated field of study. Learning happens around us every day, at all times, in both formal and informal settings. Our worlds are replete with complex, dynamic, diverse communities, contexts, and institutions, many of which are actively seeking guidance and support in the endless quest for educational innovation. Educational researchers—as a source of potential expertise—are necessarily implicated in this complexity, linked to the communities and institutions through their very presence in spaces of learning, poised to contribute with possible solutions, yet often positioned as separate from the activities they observe, creating dilemmas of responsibility and engagement.

So what are educational scholars and researchers to do? These tensions invite a unique methodological challenge for the contextually invested researcher, begging them to not just produce knowledge about learning, but to participate in the ecology, collaborating on innovations in the complex contexts in which learning is taking place. In short, for many educational researchers, our backgrounds as educators, our connections to community partners, and our sociopolitical commitments to the process of educational innovation push us to ensure that our work is generative, and that our theories and ideas—our expertise—about learning and education are made pragmatic, actionable, and sustainable. We want to test what we know outside of laboratories, designing, supporting, and guiding educational innovation to see if our theories of learning are accurate, and useful to the challenges faced in schools and communities where learning is messy, collaborative, and contested. Through such a process, we learn, and can modify our theories to better serve the real needs of communities. It is from this impulse that qualitative design-based research (DBR) emerged as a new methodological paradigm for education research.

Qualitative design-based research will be examined, documenting its origins, the major tenets of the genre, implementation considerations, and methodological issues, as well as variance within the paradigm. As a relatively new methodology, much tension remains in what constitutes DBR, and what design should mean, and for whom. These tensions and questions, as well as broad perspectives and emergent iterations of the methodology, will be discussed, and considerations for researchers looking toward the future of this paradigm will be considered.

The Origins of Design-Based Research

Qualitative design-based research (DBR) first emerged in the learning sciences field among a group of scholars in the early 1990s, with the first articulation of DBR as a distinct methodological construct appearing in the work of Ann Brown ( 1992 ) and Allan Collins ( 1992 ). For learning scientists in the 1970s and 1980s, the traditional methodologies of laboratory experiments, ethnographies, and large-scale educational interventions were the only methods available. During these decades, a growing community of learning science and educational researchers (e.g., Bereiter & Scardamalia, 1989 ; Brown, Campione, Webber, & McGilley, 1992 ; Cobb & Steffe, 1983 ; Cole, 1995 ; Scardamalia & Bereiter, 1991 ; Schoenfeld, 1982 , 1985 ; Scribner & Cole, 1978 ) interested in educational innovation and classroom interventions in situated contexts began to find the prevailing methodologies insufficient for the types of learning they wished to document, the roles they wished to play in research, and the kinds of knowledge claims they wished to explore. The laboratory, or laboratory-like settings, where research on learning was at the time happening, was divorced from the complexity of real life, and necessarily limiting. Alternatively, most ethnographic research, while more attuned to capturing these complexities and dynamics, regularly assumed a passive stance 1 and avoided interceding in the learning process, or allowing researchers to see what possibility for innovation existed from enacting nascent learning theories. Finally, large-scale interventions could test innovations in practice but lost sight of the nuance of development and implementation in local contexts (Brown, 1992 ; Collins, Joseph, & Bielaczyc, 2004 ).

Dissatisfied with these options, and recognizing that in order to study and understand learning in the messiness of socially, culturally, and historically situated settings, new methods were required, Brown ( 1992 ) proposed an alternative: Why not involve ourselves in the messiness of the process, taking an active, grounded role in disseminating our theories and expertise by becoming designers and implementers of educational innovations? Rather than observing from afar, DBR researchers could trace their own iterative processes of design, implementation, tinkering, redesign, and evaluation, as it unfolded in shared work with teachers, students, learners, and other partners in lived contexts. This premise, initially articulated as “design experiments” (Brown, 1992 ), would be variously discussed over the next decade as “design research,” (Edelson, 2002 ) “developmental research,” (Gravemeijer, 1994 ), and “design-based research,” (Design-Based Research Collective, 2003 ), all of which reflect the original, interventionist, design-oriented concept. The latter term, “design-based research” (DBR), is used here, recognizing this as the prevailing terminology used to refer to this research approach at present. 2

Regardless of the evolving moniker, the prospects of such a methodology were extremely attractive to researchers. Learning scientists acutely aware of various aspects of situated context, and interested in studying the applied outcomes of learning theories—a task of inquiry into situated learning for which canonical methods were rather insufficient—found DBR a welcome development (Bell, 2004 ). As Barab and Squire ( 2004 ) explain: “learning scientists . . . found that they must develop technological tools, curriculum, and especially theories that help them systematically understand and predict how learning occurs” (p. 2), and DBR methodologies allowed them to do this in proactive, hands-on ways. Thus, rather than emerging as a strict alternative to more traditional methodologies, DBR was proposed to fill a niche that other methodologies were ill-equipped to cover.

Effectively, while its development is indeed linked to an inherent critique of previous research paradigms, neither Brown nor Collins saw DBR in opposition to other forms of research. Rather, by providing a bridge from the laboratory to the real world, where learning theories and proposed innovations could interact and be implemented in the complexity of lived socio-ecological contexts (Hoadley, 2004 ), new possibilities emerged. Learning researchers might “trace the evolution of learning in complex, messy classrooms and schools, test and build theories of teaching and learning, and produce instructional tools that survive the challenges of everyday practice” (Shavelson, Phillips, Towne, & Feuer, 2003 , p. 25). Thus, DBR could complement the findings of laboratory, ethnographic, and large-scale studies, answering important questions about the implementation, sustainability, limitations, and usefulness of theories, interventions, and learning when introduced as innovative designs into situated contexts of learning. Moreover, while studies involving these traditional methodologies often concluded by pointing toward implications—insights subsequent studies would need to take up—DBR allowed researchers to address implications iteratively and directly. No subsequent research was necessary, as emerging implications could be reflexively explored in the context of the initial design, offering considerable insight into how research is translated into theory and practice.

Since its emergence in 1992 , DBR as a methodological approach to educational and learning research has quickly grown and evolved, used by researchers from a variety of intellectual traditions in the learning sciences, including developmental and cognitive psychology (e.g., Brown & Campione, 1996 , 1998 ; diSessa & Minstrell, 1998 ), cultural psychology (e.g., Cole, 1996 , 2007 ; Newman, Griffin, & Cole, 1989 ; Gutiérrez, Bien, Selland, & Pierce, 2011 ), cultural anthropology (e.g., Barab, Kinster, Moore, Cunningham, & the ILF Design Team, 2001 ; Polman, 2000 ; Stevens, 2000 ; Suchman, 1995 ), and cultural-historical activity theory (e.g., Engeström, 2011 ; Espinoza, 2009 ; Espinoza & Vossoughi, 2014 ; Gutiérrez, 2008 ; Sannino, 2011 ). Given this plurality of epistemological and theoretical fields that employ DBR, it might best be understood as a broad methodology of educational research, realized in many different, contested, heterogeneous, and distinct iterations, and engaging a variety of qualitative tools and methods (Bell, 2004 ). Despite tensions among these iterations, and substantial and important variances in the ways they employ design-as-research in community settings, there are several common, methodological threads that unite the broad array of research that might be classified as DBR under a shared, though pluralistic, paradigmatic umbrella.

The Tenets of Design-Based Research

Why Design-Based Research?

As we turn to the core tenets of the design-based research (DBR) paradigm, it is worth considering an obvious question: Why use DBR as a methodology for educational research? To answer this, it is helpful to reflect on the original intentions for DBR, particularly, that it is not simply the study of a particular, isolated intervention. Rather, DBR methodologies were conceived of as the complete, iterative process of designing, modifying, and assessing the impact of an educational innovation in a contextual, situated learning environment (Barab & Kirshner, 2001 ; Brown, 1992 ; Cole & Engeström, 2007 ). The design process itself—inclusive of the theory of learning employed, the relationships among participants, contextual factors and constraints, the pedagogical approach, any particular intervention, as well as any changes made to various aspects of this broad design as it proceeds—is what is under study.

Considering this, DBR offers a compelling framework for the researcher interested in having an active and collaborative hand in designing for educational innovation, and interested in creating knowledge about how particular theories of learning, pedagogical or learning practices, or social arrangements function in a context of learning. It is a methodology that can put the researcher in the position of engineer, actively experimenting with aspects of learning and sociopolitical ecologies to arrive at new knowledge and productive outcomes, as Cobb, Confrey, diSessa, Lehrer, and Schauble (2003) explain:

Prototypically, design experiments entail both “engineering” particular forms of learning and systematically studying those forms of learning within the context defined by the means of supporting them. This designed context is subject to test and revision, and the successive iterations that result play a role similar to that of systematic variation in experiment. (p. 9)

This being said, how directive an engineering role the researcher takes on varies considerably among iterations of DBR. Indeed, recent approaches have argued strongly for researchers to take on more egalitarian positionalities with respect to the community partners with whom they work (e.g., Zavala, 2016), acting as collaborative designers rather than authoritative engineers.

Method and Methodology in Design-Based Research

Now, having established why we might use DBR, a recurring question that has faced the DBR paradigm is whether DBR is a methodology at all. Given the variety of intellectual and ontological traditions that employ it, and thus the pluralism of methods used in DBR to enact the “engineering” role (whatever shape that may take) that the researcher assumes, it has been argued that DBR is not, in actuality, a methodology at all (Kelly, 2004). The proliferation and diversity of approaches, methods, and types of analysis purporting to be DBR have been described as a lack of coherence that shows there is no “argumentative grammar” or methodology present in DBR (Kelly, 2004).

Now, the conclusions one will eventually draw in this debate will depend on one’s orientations and commitments, but it is useful to note that these demands for “coherence” emerge from previous paradigms in which methodology was largely marked by a shared, coherent toolkit for data collection and data analysis. These previous paradigmatic rules make for an odd fit when considering DBR. Yet, even if we proceed—within the qualitative tradition from which DBR emerges—defining methodology as an approach to research that is shaped by the ontological and epistemological commitments of the particular researcher, and methods as the tools for research, data collection, and analysis that are chosen by the researcher with respect to said commitments (Gutiérrez, Engeström, & Sannino, 2016 ), then a compelling case for DBR as a methodology can be made (Bell, 2004 ).

Effectively, despite the considerable variation in how DBR has been and is employed, and tensions within the DBR field, we might point to considerable, shared epistemic common ground among DBR researchers, all of whom are invested in an approach to research that involves engaging actively and iteratively in the design and exploration of learning theory in situated, natural contexts. This common epistemic ground, even in the face of pluralistic ideologies and choices of methods, invites in a new type of methodological coherence, marked by “intersubjectivity without agreement” (Matusov, 1996 ), that links DBR from traditional developmental and cognitive psychology models of DBR (e.g., Brown, 1992 ; Brown & Campione, 1998 ; Collins, 1992 ), to more recent critical and sociocultural manifestations (e.g., Bang & Vossoughi, 2016 ; Engeström, 2011 ; Gutiérrez, 2016 ), and everything in between.

Put in other terms, even as DBR researchers may choose heterogeneous methods for data collection, data analysis, and reporting results complementary to the ideological and sociopolitical commitments of the particular researcher and the types of research questions that are under examination (Bell, 2004 ), a shared epistemic commitment gives the methodology shape. Indeed, the common commitment toward design innovation emerges clearly across examples of DBR methodological studies ranging in method from ethnographic analyses (Salvador, Bell, & Anderson, 1999 ) to studies of critical discourse within a design (Kärkkäinen, 1999 ), to focused examinations of metacognition of individual learners (White & Frederiksen, 1998 ), and beyond. Rather than indicating a lack of methodology, or methodological weakness, the use of varying qualitative methods for framing data collection and retrospective analyses within DBR, and the tensions within the epistemic common ground itself, simply reflects the scope of its utility. Learning in context is complex, contested, and messy, and the plurality of methods present across DBR allow researchers to dynamically respond to context as needed, employing the tools that fit best to consider the questions that are present, or may arise.

All this being the case, it is useful to look toward the coherent elements—the “argumentative grammar” of DBR, if you will—that can be identified across the varied iterations of DBR. Understanding these shared features, in the context and terms of the methodology itself, help us to appreciate what is involved in developing robust and thorough DBR research, and how DBR seeks to make strong, meaningful claims around the types of research questions it takes up.

Coherent Features of Design-Based Research

Several scholars have provided comprehensive overviews and listings of what they see as the cross-cutting features of DBR, both in the context of more traditional models of DBR (e.g., Cobb et al., 2003 ; Design-Based Research Collective, 2003 ), and in regards to newer iterations (e.g., Gutiérrez & Jurow, 2016 ; Bang & Vossoughi, 2016 ). Rather than try to offer an overview of each of these increasingly pluralistic classifications, the intent here is to attend to three broad elements that are shared across articulations of DBR and reflect the essential elements that constitute the methodological approach DBR offers to educational researchers.

Design research is concerned with the development, testing, and evolution of learning theory in situated contexts

This first element is perhaps most central to what DBR of all types is, anchored in what Brown (1992) was initially most interested in: testing the pragmatic validity of theories of learning by designing interventions that engaged with, or proposed, entire, naturalistic ecologies of learning. Put another way, while DBR studies may have various units of analysis, focuses, and variables, and may organize learning in many different ways, it is the theoretically informed design for educational innovation that is most centrally under evaluation. DBR actively and centrally exists as a paradigm that is engaged in the development of theory, not just the evaluation of aspects of its usage (Bell, 2004; Design-Based Research Collective, 2003; Lesh & Kelly, 2000; van den Akker, 1999).

Effectively, where DBR is taking place, theory as a lived possibility is under examination. Specifically, in most DBR, this means a focus on “intermediate-level” theories of learning, rather than “grand” ones. In essence, DBR does not contend directly with “grand” learning theories (such as developmental or sociocultural theory writ large) (diSessa, 1991 ). Rather, DBR seeks to offer constructive insights by directly engaging with particular learning processes that flow from these theories on a “grounded,” “intermediate” level. This is not, however, to say DBR is limited in what knowledge it can produce; rather, tinkering in this “intermediate” realm can produce knowledge that informs the “grand” theory (Gravemeijer, 1994 ). For example, while cognitive and motivational psychology provide “grand” theoretical frames, interest-driven learning (IDL) is an “intermediate” theory that flows from these and can be explored in DBR to both inform the development of IDL designs in practice and inform cognitive and motivational psychology more broadly (Joseph, 2004 ).

Crucially, however, DBR entails putting the theory in question under intense scrutiny, or “into harm’s way” (Cobb et al., 2003). This is an especially central element of DBR, and one that distinguishes it from the proliferation of educational-reform or educational-entrepreneurship efforts that similarly take up the discourse of “design” and “innovation.” Not only is the reflexive, often participatory element of DBR absent from such efforts—that is, questioning and modifying the design to suit the learning needs of the context and partners—but the theory driving these efforts is never in question, and in many cases may be actively obscured. Indeed, it is more common to see educational-entrepreneur design innovations seek to modify a context—such as the way charter schools engage in selective pupil recruitment and intensive disciplinary practices (e.g., Carnoy et al., 2005; Ravitch, 2010; Saltman, 2007)—rather than modify the design itself, and thus allow for humility in their theory. Such “innovations” and “design” efforts are distinct from DBR, which must, in the spirit of scientific inquiry, be willing to see the learning theory flail and struggle, be modified, and evolve.

This growth and evolution of theory and knowledge is of course central to DBR as a rigorous research paradigm, moving it beyond simply the design of local educational programs, interventions, or innovations. As Barab and Squire ( 2004 ) explain:

Design-based research requires more than simply showing a particular design works but demands that the researcher (move beyond a particular design exemplar to) generate evidence-based claims about learning that address contemporary theoretical issues and further the theoretical knowledge of the field. (pp. 5–6)

DBR as a research paradigm offers a design process through which theories of learning can be tested and modified; by allowing them to operate with humility in situated conditions, new insights, new knowledge, and even new theories may emerge that can inform the field, as well as the efforts and directions of other types of research inquiry. These productive, theory-developing outcomes, or “ontological innovations” (diSessa & Cobb, 2004), represent the culmination of an effective program of DBR—the production of new ways to understand, conceptualize, and enact learning as a lived, contextual process.

Design research works to understand learning processes, and the design that supports them in situated contexts

As a research methodology that operates by tinkering with “grounded” learning theories, DBR is itself grounded, and seeks to develop its knowledge claims and designs in naturalistic, situated contexts (Brown, 1992 ). This is, again, a distinguishing element of DBR—setting it apart from laboratory research efforts involving design and interventions in closed, controlled environments. Rather than attempting to focus on singular variables, and isolate these from others, DBR is concerned with the multitude of variables that naturally occur across entire learning ecologies, and present themselves in distinct ways across multiple planes of possible examination (Rogoff, 1995 ; Collins, Joseph, & Bielaczyc, 2004 ). Certainly, specific variables may be identified as dependent, focal units of analysis, but identifying (while not controlling for) the variables beyond these, and analyzing their impact on the design and learning outcomes, is an equally important process in DBR (Collins et al., 2004 ; Barab & Kirshner, 2001 ). In practice, this of course varies across iterations in its depth and breadth. Traditional models of developmental or cognitive DBR may look to account for the complexity and nuance of a setting’s social, developmental, institutional, and intellectual characteristics (e.g., Brown, 1992 ; Cobb et al., 2003 ), while more recent, critical iterations will give increased attention to how historicity, power, intersubjectivity, and culture, among other things, influence and shape a setting, and the learning that occurs within it (e.g., Gutiérrez, 2016 ; Vakil, de Royston, Nasir, & Kirshner, 2016 ).

Beyond these variations, what counts as “design” in DBR varies widely, and so too does what counts as a naturalistic setting. It has been well documented that learning occurs all the time, every day, and in every space imaginable, both formal and informal (Leander, Phillips, & Taylor, 2010), and in ways that span strictly defined setting boundaries (Engeström, Engeström, & Kärkkäinen, 1995). DBR may take place in any number of contexts, based on the types of questions asked, and the learning theories and processes that a researcher may be interested in exploring. DBR may involve one-to-one tutoring and learning settings, single classrooms, community spaces, entire institutions, or even holistically designed ecologies (Design-Based Research Collective, 2003; Engeström, 2008; Virkkunen & Newnham, 2013). In all these cases, even in the most completely designed experimental ecology, the setting remains naturalistic and situated because DBR actively embraces the uncontrollable variables that participants bring with them to the learning process for and from their situated worlds, lives, and experiences; no effort is made to control for these complicated influences of life, only to understand how they operate in a given ecology as innovation is attempted. Thus, the extent of the design reflects a broader range of qualitative and theoretical study, rather than an attempt to control or isolate some particular learning process from outside influence.

While there is much variety in what design may entail, where DBR takes place, what types of learning ecologies are under examination, and what methods are used, situated ecologies are always the setting of this work. In this way, conscious of naturalistic variables, and the influences that culture, historicity, participation, and context have on learning, researchers can use DBR to build on prior research, and extend knowledge around the learning that occurs in the complexity of situated contexts and lived practices (Collins et al., 2004 ).

Design-based research is iterative; it changes, grows, and evolves to meet the needs and emergent questions of the context, and this tinkering process is part of the research

The final shared element undergirding models of DBR is that it is an iterative, active, and interventionist process, interested in and focused on producing educational innovation by actually and actively putting design innovations into practice (Brown, 1992; Collins, 1992; Gutiérrez, 2008). Given this interventionist, active stance, tinkering with the design and the theory of learning informing the design is as much a part of the research process as the outcome of the intervention or innovation itself—we learn what impacts learning as much as, if not more than, we learn what was learned. In this sense, DBR involves a focus on analyzing the theory-driven design itself, and its implementation, as an object of study (Edelson, 2002; Penuel, Fishman, Cheng, & Sabelli, 2011), and is ultimately interested in the improvement of the design—of how it unfolds, how it shifts, how it is modified, and how it is made to function productively for participants in their contexts and given their needs (Kirshner & Polman, 2013).

While DBR is iterative and contextual as a foundational methodological principle, what this means varies across conceptions of DBR. For instance, in more traditional models, Brown and Campione ( 1996 ) pointed out the dangers of “lethal mutation” in which a design, introduced into a context, may become so warped by the influence, pressures, incomplete implementation, or misunderstanding of participants in the local context, that it no longer reflects or tests the theory under study. In short, a theory-driven intervention may be put in place, and then subsumed to such a degree by participants based on their understanding and needs, that it remains the original innovative design in name alone. The assertion here is that in these cases, the research ceases to be DBR in the sense that the design is no longer central, actively shaping learning. We cannot, they argue, analyze a design—and the theory it was meant to reflect—as an object of study when it has been “mutated,” and it is merely a banner under which participants are enacting their idiosyncratic, pragmatic needs.

While the ways in which settings and individuals might disrupt designs intended to produce robust learning are certainly a tension to be cautious of in DBR, it is also worth noting that in many critical approaches to DBR, such mutations—whether “lethal” to the original design or not—are seen as compelling and important moments. Here, where collaboration and community input are more central to the design process, what it means to be iterative is understood differently. Thus, a “mutation” becomes a point where reflexivity, tension, and contradiction might open the door for change, for new designs, for reconsiderations of researcher and collaborative partner positionalities, or for ethnographic exploration into how a context takes up, shapes, and ultimately engages innovations in a particular sociocultural setting. In short, accounting for and documenting changes in design is a vital part of the DBR process, allowing researchers to respond to context in a variety of ways, always striving for their theories and designs to act with humility, and in the interest of usefulness.

With this in mind, the iterative nature of DBR means that the relationships researchers have with other design partners (educators and learners) in the ecology are incredibly important, and vital to consider (Bang et al., 2016 ; Engeström, 2007 ; Engeström, Sannino, & Virkkunen, 2014 ). Different iterations of DBR might occur in ways in which the researcher is more or less intimately involved in the design and implementation process, both in terms of actual presence and intellectual ownership of the design. Regarding the former, in some cases, a researcher may hand a design off to others to implement, periodically studying and modifying it, while in other contexts or designs, the researcher may be actively involved, tinkering in every detail of the implementation and enactment of the design. With regard to the latter, DBR might similarly range from a somewhat prescribed model, in which the researcher is responsible for the original design, and any modifications that may occur based on their analyses, without significant input from participants (e.g., Collins et al., 2004 ), to incredibly participatory models, in which all parties (researchers, educators, learners) are part of each step of the design-creation, modification, and research process (e.g., Bang, Faber, Gurneau, Marin, & Soto, 2016 ; Kirshner, 2015 ).

Considering the wide range of ideological approaches and models for DBR, we might acknowledge that DBR can be gainfully conducted through many iterations of “openness” to the design process. However, the strength of the research—focused on analyzing the design itself as a unit of study reflective of learning theory—will be bolstered by thoughtfully accounting for how involved the researcher will be, and how open to participation the modification process is. These answers should match the types of questions, and conceptual or ideological framing, with which researchers approach DBR, allowing them to tinker with the process of learning as they build on prior research to extend knowledge and test theory (Barab & Kirshner, 2001 ), while thoughtfully documenting these changes in the design as they go.

Implementation and Research Design

As with the overarching principles of design-based research (DBR), even amid the pluralism of conceptual frameworks of DBR researchers, it is possible, and useful, to trace the shared contours in how DBR research design is implemented. Though texts provide particular road maps for undertaking various iterations of DBR consistent with the specific goals, types of questions, and ideological orientations of these scholarly communities (e.g., Cole & Engeström, 2007 ; Collins, Joseph, & Bielaczyc, 2004 ; Fishman, Penuel, Allen, Cheng, & Sabelli, 2013 ; Gutiérrez & Jurow, 2016 ; Virkkunen & Newnham, 2013 ), certain elements, realized differently, can be found across all of these models, and may be encapsulated in five broad methodological phases.

Considering the Design Focus

DBR begins by considering what the focus of the design, the situated context, and the units of analysis for research will be. Prospective DBR researchers will need to consider the broader research on the “grand” theory of learning with which they work, determine what theoretical questions they have or which “intermediate” aspects of the theory might be studied and strengthened by a design process in situated contexts, and decide what planes of analysis (Rogoff, 1995) will be most suitable for examination. This process allows for the identification of the critical theoretical elements of a design, and articulation of initial research questions.

Given the conceptual framework, theoretical and research questions, and sociopolitical interests at play, researchers may undertake this, and subsequent steps in the process, on their own, or in close collaboration with the communities and individuals in the situated contexts in which the design will unfold. As such, across iterations of DBR, and with respect to the ways DBR researchers choose to engage with communities, the origin of the design will vary, and might begin in some cases with theoretical questions, or arise in others as a problem of practice (Coburn & Penuel, 2016 ), though as has been noted, in either case, theory and practice are necessarily linked in the research.

Creating and Implementing a Designed Innovation

From the consideration and identification of the critical elements, planned units of analysis, and research questions that will drive a design, researchers can then actively create (either on their own or in conjunction with potential design partners) a designed intervention reflecting these critical elements, and the overarching theory.

Here, the DBR researcher should consider what partners exist in the process and what ownership exists around these partnerships, determine exactly what the pragmatic features of the intervention/design will be and who will be responsible for them, and consider when checkpoints for modification and evaluation will be undertaken, and by whom. Additionally, researchers should at this stage consider questions of timeline and of recruiting participants, as well as what research materials will be needed to adequately document the design, its implementation, and its outcomes, and how and where collected data will be stored.

Once a design (the planned, theory-informed innovative intervention) has been produced, the DBR researcher and partners can begin the implementation process, putting the design into place and beginning data collection and documentation.

Assessing the Impact of the Design on the Learning Ecology

Chronologically, the next two methodological steps happen recursively in the iterative process of DBR. The researcher must assess the impact of the design, and then make modifications as necessary, before continuing to assess the impact of these modifications. In short, these next two steps form a cycle that continues across the life and length of the research design.

Once a design has been created and implemented, the researcher begins to observe and document the learning, the ecology, and the design itself. Guided by and in conversation with the theory and critical elements, the researcher should periodically engage in ongoing data analysis, assessing the success of the design, and of learning, paying equal attention to the design itself, and how its implementation is working in the situated ecology.

Within the realm of qualitative research, measuring or assessing the variables of learning and assessing the design may look very different from researcher to researcher, requiring different data-collection and data-analysis tools and involving different research methods.

Modifying the Design

Modification, based on ongoing assessment of the design, is what makes DBR iterative, helping the researcher extend the field’s knowledge about the theory, design, learning, and the context under examination.

Modification of the design can take many forms, from complete changes in approach or curriculum, to introducing an additional tool or mediating artifact into a learning ecology. Moreover, how modification unfolds involves careful reflection from the researcher and any co-designing participants, deciding whether modification will be an ongoing, reflexive, tinkering process, or if it will occur only at predefined checkpoints, after formal evaluation and assessment. Questions of ownership, issues of resource availability, technical support, feasibility, and communication are all central to the work of design modification, and answers will vary given the research questions, design parameters, and researchers’ epistemic commitments.

Each moment of modification indicates a new phase in a DBR project, and a new round of assessing—through data analysis—the impact of the design on the learning ecology, whether to guide further modification, to report the results of the design, or, in some cases, both.

Reporting the Results of the Design

The final step in DBR methodology is to report on the results of the designed intervention, how it contributed to understandings of theory, and how it impacted the local learning ecology or context. The format, genre, and final data-analysis methods used in reporting data and research results will vary across iterations of DBR. However, it is largely understood that to avoid methodological confusion, DBR researchers should situate themselves clearly in the DBR paradigm by describing and detailing the design itself; articulating the theory, central elements, and units of analysis under scrutiny, what modifications occurred and what precipitated these changes, and what local effects were observed; and exploring any potential contributions to learning theory, while accounting for the context and their interventionist role and positionality in the design. As such, careful documentation of pragmatic and design decisions for retrospective data analysis, as well as research findings, should be done at each stage of this implementation process.
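Taken together, these five phases describe a recursive workflow: consider the design focus, create and implement the design, assess its impact, modify it, and report the results. For readers who find a schematic helpful, the short Python sketch below is offered purely as an illustration of the assess-and-modify cycle and of the documentation it demands; it is not drawn from the DBR literature, and every name in it (DesignRecord, run_dbr_cycle, and the passed-in functions) is a hypothetical placeholder.

```python
# A minimal, purely illustrative sketch of the recursive DBR workflow described
# above. All names are hypothetical placeholders, not part of any DBR literature.

from dataclasses import dataclass, field
from typing import Callable, List


@dataclass
class DesignRecord:
    """Documentation of a design: its critical elements, changes, and observations."""
    critical_elements: List[str]
    modifications: List[str] = field(default_factory=list)
    observations: List[str] = field(default_factory=list)


def run_dbr_cycle(
    design: DesignRecord,
    collect_data: Callable[[DesignRecord], str],
    needs_modification: Callable[[DesignRecord], bool],
    modify: Callable[[DesignRecord], str],
    max_iterations: int = 5,
) -> DesignRecord:
    """Iteratively assess and modify a design, documenting every observation and change."""
    for _ in range(max_iterations):
        # Assess: observe and document the design in its situated ecology.
        design.observations.append(collect_data(design))
        # Modify only if ongoing analysis suggests it, and record what changed.
        if not needs_modification(design):
            break
        design.modifications.append(modify(design))
    # Reporting draws on the full record of observations and modifications.
    return design


# Example usage with trivial stand-in functions (again, purely illustrative).
record = run_dbr_cycle(
    DesignRecord(critical_elements=["intermediate theory under study"]),
    collect_data=lambda d: "field notes and interview data",
    needs_modification=lambda d: len(d.modifications) < 2,
    modify=lambda d: "introduced an additional mediating artifact",
)
print(record.modifications, record.observations)
```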

Methodological Issues in the Design-Based Research Paradigm

Because of its pluralistic nature, its interventionist, nontraditional stance, and the fact that it remains in its conceptual infancy, design-based research (DBR) is replete with ongoing methodological questions and challenges, from both external and internal sources. While many more may exist, this section addresses several of the most pressing issues that the prospective DBR researcher may encounter, or may want to consider in understanding the paradigm and beginning a research design.

Challenges to Rigor and Validity

Perhaps the place to begin this reflection on tensions in the DBR paradigm is the recurrent and ongoing challenge to the rigor and validity of DBR, which has asked: Is DBR research at all? Given the interventionist and activist way in which DBR invites the researcher to participate, and the shift in orientation from long-accepted research paradigms, such critiques are hardly surprising, and fall in line with broader challenges to the rigor and objectivity of qualitative social science research in general. Historically, such complaints about DBR are linked to decades of critique of any research that does not adhere to the post-positivist approach set out as the U.S. Department of Education began to prioritize laboratory and large-scale randomized controlled-trial experimentation as the “gold standard” of research design (e.g., Mosteller & Boruch, 2002).

From the outset, DBR, as an interventionist, local, situated, non-laboratory methodology, was bound to run afoul of such conservative trends. While some researchers involved in (particularly traditional developmental and cognitive) DBR have found broader acceptance within these constraints, the rigor of DBR remains contested. It has been suggested that DBR is under-theorized and over-methodologized, a haphazard way for researchers to do activist work without engaging in the development of robust knowledge claims about learning (Dede, 2004), and an approach lacking in coherence, one that shelters interventionist projects of little consequence for developing learning theory and allows researchers to make subjective, pet claims through selective analysis of large bodies of collected data (Kelly, 2003, 2004).

These critiques, however, impose an external set of criteria on DBR, desiring it to fit into the molds of rigor and coherence as defined by canonical methodologies. Bell ( 2004 ) and Bang and Vossoughi ( 2016 ) have made compelling cases for the wide variety of methods and approaches present in DBR not as a fracturing, but as a generative proliferation of different iterations that can offer powerful insights around the different types of questions that exist about learning in the infinitely diverse settings in which it occurs. Essentially, researchers have argued that within the DBR paradigm, and indeed within educational research more generally, the practical impact of research on learning, context, and practices should be a necessary component of rigor (Gutiérrez & Penuel, 2014 ), and the pluralism of methods and approaches available in DBR ensures that the practical impacts and needs of the varied contexts in which the research takes place will always drive the design and research tools.

These moves are emblematic of the way in which DBR is innovating and pushing on paradigms of rigor in educational research altogether, reflecting how DBR fills a complementary niche with respect to other methodologies and attends to elements and challenges of learning in lived, real environments that other types of research have consistently and historically missed. Beyond this, Brown (1992) was conscious of the concerns around data collection, validity, rigor, and objectivity from the outset, identifying this dilemma—the likelihood of having an incredible amount of data collected in a design, only a small fraction of which can be reported and shared, thus leading potentially to selective data analysis and use—as the Bartlett Effect. Since that time, DBR researchers have been aware of this challenge, actively seeking ways to mitigate this threat to validity by making data sets broadly available, documenting their design, tinkering, and modification processes, clearly situating and describing disconfirming evidence and their own position in the research, and otherwise presenting the broad scope of human and learning activity that occurs within designs in large learning ecologies as comprehensively as possible.

Ultimately, however, these responses are likely always to be insufficient as evidence of rigor to some, for the root dilemma is around what “counts” as education science. While researchers interested and engaged in DBR ought rightly to continue to push themselves to ensure the methodological rigor of their work and chosen methods, it is also worth noting that DBR should seek to hold itself to its own criteria of assessment. This reflects broader trends in qualitative educational research that push back on narrow constructions of what “counts” as science, recognizing the ways in which new methodologies and approaches to research can help us examine aspects of learning, culture, and equity that have continued to be blind spots for traditional education research; invite new voices and perspectives into the process of achieving rigor and validity (Erickson & Gutiérrez, 2002); bolster objectivity by bringing it into conversation with the positionality of the researcher (Harding, 1993); and perhaps most important, engage in axiological innovation (Bang, Faber, Gurneau, Marin, & Soto, 2016), or the exploration of and design for what is “good, right, true, and beautiful . . . in cultural ecologies” (p. 2).

Questions of Generalizability and Usefulness

The generalizability of research results in DBR has been an ongoing and contentious issue in the development of the paradigm. Indeed, by the standards of canonical methods (e.g., laboratory experimentation, ethnography), these local, situated interventions should lack generalizability. While there is reason to discuss and question the merit of generalizability as a goal of qualitative research at all, researchers in the DBR paradigm have long been conscious of this issue. Understanding the question of generalizability around DBR, and how the paradigm has responded to it, can be done in two ways.

The first is to distinguish questions specific to a particular design from questions about the generalizability of the theory. Cole’s (Cole & Underwood, 2013) 5th Dimension work, and the nationwide network of linked, theoretically similar sites operating with vastly different designs, is a powerful example of this approach to generalizability. Rather than focus on a single, unitary, potentially generalizable design, the project is more interested in the variability and sustainability of designs across local contexts (e.g., Cole, 1995; Gutiérrez, Bien, Selland, & Pierce, 2011; Jurow, Tracy, Hotchkiss, & Kirshner, 2012). Through attention to sustainable, locally effective innovations, conscious of the wide variation in culture and context that accompanies any and all learning processes, 5th Dimension sites each derive their idiosyncratic structures from sociocultural theory, sharing some elements, but varying others, while seeking their own “ontological innovations” based on the affordances of their contexts. This pattern reflects a key element of much of the DBR paradigm: questions of generalizability in DBR may be about the generalizability of the theory of learning, and the variability of learning and design in distinct contexts, rather than about the particular design itself.

A second means of addressing generalizability in DBR has been to embrace the pragmatic impacts of designing innovations. This response stems from Messick’s (1992) and Schoenfeld’s (1992) arguments, early in the development of DBR, that the consequentialness and validity of DBR efforts as potentially generalizable research depend on the “usefulness” of the theories and designs that emerge. Effectively, because DBR is the examination of situated theory, a design must be able to show pragmatic impact—it must succeed at showing the theory to be useful. If there is evidence of usefulness to both the context in which it takes place and the field of educational research more broadly, then the DBR researcher can stake some broader knowledge claims that might be generalizable. As a result, the DBR paradigm tends to “treat changes in [local] contexts as necessary evidence for the viability of a theory” (Barab & Squire, 2004, p. 6). This of course does not mean that DBR is only interested in successful efforts. A design that fails or struggles can provide important information and knowledge to the field. Ultimately, though, DBR tends to privilege work that proves the usefulness of designs, whose pragmatic or theoretical findings can then be generalized within the learning sciences and education research fields.

With this said, the question of usefulness is not always straightforward, and is hardly unitary. While many DBR efforts—particularly those situated in developmental and cognitive learning science traditions—are interested in the generalizability of their useful educational designs (Barab & Squire, 2004; Cobb, Confrey, diSessa, Lehrer, & Schauble, 2003; Joseph, 2004; Steffe & Thompson, 2000), not all are. Critical DBR researchers have noted that if usefulness remains situated in the extant sociopolitical and sociocultural power structures—dominant conceptual and popular definitions of what useful educational outcomes are—the result will be a bar for research merit that inexorably bends toward the positivist spectrum (Booker & Goldman, 2016; Dominguez, 2015; Zavala, 2016). This could, and likely would, exclude the non-normative interventions and innovations that are vital for historically marginalized communities, but whose outcomes might look vastly different while nonetheless being useful in the sociopolitical contexts in which they occur. Alternative framings to this idea of usefulness push on and extend the intention, and seek to involve the perspectives and agency of situated community partners and their practices in what “counts” as generative and rigorous research outcomes (Gutiérrez & Penuel, 2014). An example in this regard is the idea of consequential knowledge (Hall & Jurow, 2015; Jurow & Shea, 2015), which suggests that consequential outcomes will be taken up by participants in and across their networks, and over time. A goal of consequential knowledge thus certainly meets the standard of being useful, but it also implicates the needs and agency of communities in determining the success and merit of a design or research endeavor in important ways that strict usefulness may miss.

Thus, the bar of usefulness that characterizes the DBR paradigm should not be approached without critical reflection. Certainly, designs that accomplish little for local contexts should be subject to intense questioning and critique, but the sociopolitical and systemic factors that might influence what “counts” as useful, in local contexts and in education science more generally, should be kept firmly in mind when designing, choosing methods, and evaluating impacts (Zavala, 2016). Researchers should think deeply about their goals, whether they are reaching for generalizability at all, and in what ways they are constructing contextual definitions of success, and they should be clear about these ideologically influenced answers in their work, such that generalizability and the usefulness of designs can be adjudicated based on, and in conversation with, the intentions and conceptual framework of the research and researcher.

Ethical Concerns of Sustainability, Participation, and Telos

While there are many external challenges to the rigor and validity of DBR, another set of tensions comes from within the DBR paradigm itself. These internal critiques are less about rigor or validity than about research ethics; they are not unrelated to the earlier question of the contested definition of usefulness, and they grow from ideological concerns with how an intentional, interventionist stance is taken up in research as it interacts with situated communities.

Given that the nature of DBR is to design and implement some form of educational innovation, the DBR researcher will in some way be engaging with an individual or community, becoming part of a situated learning ecology, complete with a sociopolitical and cultural history. As with any research that involves providing an intervention or support, the question of what happens when the research ends is as much an ethical as a methodological one. Concerns then arise given how traditional models of DBR seem intensely focused on creating and implementing a “complete” cycle of design, while giving little attention to what happens to the community and context afterward (Engeström, 2011). In contrast to this privileging of “completeness,” sociocultural and critical approaches to DBR have suggested that if research is actually happening in naturalistic, situated contexts that authentically recognize and allow social and cultural dimensions to function (i.e., avoid laboratory-type controls to mitigate independent variables), there can never be such a thing as “complete,” for the design will, and should, live on as part of the ecology of the space (Cole, 2007; Engeström, 2000). Essentially, these internal critiques push DBR to consider sustainability, and sustainable scale, as concerns equally important to the completeness of an innovation. Not only are ethical questions involved, but accounting for the unbounded and ongoing nature of learning as a social and cultural activity can help strengthen the viability of knowledge claims made, and clarify what degree of generalizability is reasonably justified.

Related to this question of sustainability are internal concerns regarding the nature and ethics of participation in DBR: whether partners in a design are being adequately invited to engage in the design and modification processes that will unfold in their situated contexts and lived communities (Bang et al., 2016; Engeström, 2011). DBR has actively sought to examine the multiple planes of analysis of learning that might be occurring in a learning ecology, but it has rarely attended to the subject-subject dynamics (Bang et al., 2016), or “relational equity” (DiGiacomo & Gutiérrez, 2015), that exist between researchers and participants as a point of focus. Participatory design research (PDR) (Bang & Vossoughi, 2016) models have recently emerged as a way to better attend to these important dimensions of collective participation (Engeström, 2007), power (Vakil et al., 2016), positionality (Kirshner, 2015), and relational agency (Edwards, 2007, 2009; Sannino & Engeström, 2016) as they unfold in DBR.

Both of these ethical questions—around sustainability and participation—reflect challenges to what we might call the telos, or direction, that DBR takes toward innovation and research. These are questions related to whose voices are privileged, in what ways, for what purposes, and toward what ends. While DBR, like many other forms of educational research, has involved work with historically marginalized communities, it has, like many other forms of educational research, not always done so in humanizing ways. Put another way, there are ethical and political questions surrounding whether the designs, goals, and standards of usefulness we apply to DBR efforts should be purposefully activist, and have explicitly liberatory ends. To this point, critical and decolonial perspectives have pushed on the DBR paradigm, suggesting that DBR should situate itself as a space of liberatory innovation and potential, in which communities and participants can become designers and innovators of their own futures (Gutiérrez, 2005). This perspective is reflected in the social design experiment (SDE) approach to DBR (Gutiérrez, 2005, 2008; Gutiérrez & Vossoughi, 2010; Gutiérrez, 2016; Gutiérrez & Jurow, 2016), which begins in participatory fashion, engaging a community in identifying its own challenges and desires, and reflecting on the historicity of learning practices, before proleptic design efforts are undertaken that ensure that research is done with, not on, communities of color (Arzubiaga, Artiles, King, & Harris-Murri, 2008), and that is intentionally focused on liberatory goals.

Global Perspectives and Unique Iterations

While design-based research (DBR) has been a methodology principally associated with educational research in the United States, its development is hardly limited to the U.S. context. Although DBR emerged in U.S. settings, similar methods of situated, interventionist research focused on design and innovation were emerging in parallel in European contexts (e.g., Gravemeijer, 1994), most significantly in the work of Vygotskian scholars both in Europe and the United States (Cole, 1995; Cole & Engeström, 1993, 2007; Engeström, 1987).

In particular, whereas DBR began in the epistemic and ontological terrain of developmental and cognitive psychology, this vein of design-based research was grounded from the outset in cultural-historical activity theory (CHAT). This ontological and epistemic grounding meant that the approach to design was more intensively conscious of context, historicity, hybridity, and relational factors, and was framed around understanding learning as a complex, collective activity system that, through design, could be modified and transformed (Cole & Engeström, 2007). The models of DBR that emerged in this context abroad were the formative intervention (Engeström, 2011; Engeström, Sannino, & Virkkunen, 2014), which relies heavily on Vygotskian double stimulation to approach learning in nonlinear, unbounded ways, accounting for the role of learner, educator, and researcher in a collective process, shifting, evolving, and tinkering with the design as the context needs and demands; and the Change Laboratory (Engeström, 2008; Virkkunen & Newnham, 2013), which similarly relies on the principle of double stimulation while presenting a holistic way to transform—or change—entire learning activity systems in fundamental ways, through designs that encourage collective “expansive learning” (Engeström, 2001), through which participants can produce wholly new activity systems as the object of learning itself.

Elsewhere in the United States, still parallel to the developmental- and cognitive-oriented DBR work that was occurring, American researchers employing CHAT began to leverage the tools and aims of expansive learning in conversation with the tensions and complexity of the U.S. context (Cole, 1995; Gutiérrez, 2005; Gutiérrez & Rogoff, 2003). Like the CHAT design research of the European context, this work focused on activity systems, historicity, nonlinear and unbounded learning, and collective learning processes and outcomes. Rather than simply replicating that work, however, these researchers put further attention on questions of equity, diversity, and justice, as Gutiérrez, Engeström, and Sannino (2016) note:

The American contribution to a cultural historical activity theoretic perspective has been its attention to diversity, including how we theorize, examine, and represent individuals and their communities. (p. 276)

Effectively, CHAT scholars in parts of the United States brought critical and decolonial perspectives to bear on their design-focused research, focusing explicitly on the complex cultural, racial, and ethnic terrain in which they worked, and ensuring that diversity, equity, justice, and non-dominant perspectives would become central principles to the types of design research conducted. The result was the emergence of the aforementioned social design experiments (e.g., Gutiérrez, 2005 , 2016 ), and participatory design research (Bang & Vossoughi, 2016 ) models, which attend intentionally to historicity and relational equity, tailor their methods to the liberation of historically marginalized communities, aim intentionally for liberatory outcomes as key elements of their design processes, and seek to produce outcomes in which communities of learners become designers of new community futures (Gutiérrez, 2016 ). While these approaches emerged in the United States, their origins reflect ontological and ideological perspectives quite distinct from more traditional learning science models of DBR, and dominant U.S. ontologies in general. Indeed, these iterations of DBR are linked genealogically to the ontologies, ideologies, and concerns of peoples in the Global South, offering some promise for the method in those regions, though DBR has yet to broadly take hold among researchers beyond the United States and Europe.

There is, of course, much more nuance to these models, and each (formative interventions, Change Laboratories, social design experiments, and participatory design research) might itself merit independent exploration and review well beyond the scope here. Indeed, there is some question as to whether all adherents of these CHAT design-based methodologies, with their unique genealogies and histories, would even consider themselves under the umbrella of DBR. Yet, despite significant ontological divergences, these iterations share many of the same foundational tenets of the traditional models (though realized differently), and it is reasonable to argue that they do indeed share the same, broad methodological paradigm (DBR), or at the very least are so intimately related that any discussion of DBR, particularly one with a global view, should consider the contributions CHAT iterations have made to the DBR methodology in the course of their somewhat distinct, but parallel, development.

Possibilities and Potentials for Design-Based Research

Since its emergence in 1992 , the DBR methodology for educational research has continued to grow in popularity, ubiquity, and significance. Its use has begun to expand beyond the confines of the learning sciences, taken up by researchers in a variety of disciplines, and across a breadth of theoretical and intellectual traditions. While still not as widely recognized as more traditional and well-established research methodologies, DBR as a methodology for rigorous research is unquestionably here to stay.

With this in mind, the field ought to still be cautious of the ways in which the discourse of design is used. Not all design is DBR, and preserving the integrity, rigor, and research ethics of the paradigm (on its own terms) will continue to require thoughtful reflection as its pluralistic parameters come into clearer focus. Yet the proliferation of methods in the DBR paradigm should be seen as a positive. There are far too many theories of learning and ideological perspectives that have meaningful contributions to make to our knowledge of the world, communities, and learning to limit ourselves to a unitary approach to DBR, or set of methods. The paradigm has shown itself to have some core methodological principles, but there is no reason not to expect these to grow, expand, and evolve over time.

In an increasingly globalized, culturally diverse, and dynamic world, there is tremendous potential for innovation couched in this proliferation of DBR. Particularly in historically marginalized communities and across the Global South, we will need to know how learning theories can be lived out in productive ways in communities that have been understudied, and under-engaged. The DBR paradigm generally, and critical and CHAT iterations particularly, can fill an important need for participatory, theory-developing research in these contexts that simultaneously creates lived impacts. Participatory design research (PDR), social design experiments (SDE), and Change Laboratory models of DBR should be of particular interest and attention moving forward, as current trends toward culturally sustaining pedagogies and learning will need to be explored in depth and in close collaboration with communities, as participatory design partners, in the press toward liberatory educational innovations.

Bibliography

The following special issues of journals and texts are recommended starting points for engaging more deeply with current and past trends in design-based research; a fuller list of works cited follows.

  • Bang, M. , & Vossoughi, S. (Eds.). (2016). Participatory design research and educational justice: Studying learning and relations within social change making [Special issue]. Cognition and Instruction , 34 (3).
  • Barab, S. (Ed.). (2004). Design-based research [Special issue]. Journal of the Learning Sciences , 13 (1).
  • Cole, M. , & The Distributed Literacy Consortium. (2006). The Fifth Dimension: An after-school program built on diversity . New York, NY: Russell Sage Foundation.
  • Kelly, A. E. (Ed.). (2003). Special issue on the role of design in educational research [Special issue]. Educational Researcher , 32 (1).
  • Arzubiaga, A. , Artiles, A. , King, K. , & Harris-Murri, N. (2008). Beyond research on cultural minorities: Challenges and implications of research as situated cultural practice. Exceptional Children , 74 (3), 309–327.
  • Bang, M. , Faber, L. , Gurneau, J. , Marin, A. , & Soto, C. (2016). Community-based design research: Learning across generations and strategic transformations of institutional relations toward axiological innovations. Mind, Culture, and Activity , 23 (1), 28–41.
  • Bang, M. , & Vossoughi, S. (2016). Participatory design research and educational justice: Studying learning and relations within social change making. Cognition and Instruction , 34 (3), 173–193.
  • Barab, S. , Kinster, J. G. , Moore, J. , Cunningham, D. , & The ILF Design Team. (2001). Designing and building an online community: The struggle to support sociability in the Inquiry Learning Forum. Educational Technology Research and Development , 49 (4), 71–96.
  • Barab, S. , & Squire, K. (2004). Design-based research: Putting a stake in the ground. Journal of the Learning Sciences , 13 (1), 1–14.
  • Barab, S. A. , & Kirshner, D. (2001). Methodologies for capturing learner practices occurring as part of dynamic learning environments. Journal of the Learning Sciences , 10 (1–2), 5–15.
  • Bell, P. (2004). On the theoretical breadth of design-based research in education. Educational Psychologist , 39 (4), 243–253.
  • Bereiter, C. , & Scardamalia, M. (1989). Intentional learning as a goal of instruction. In L. B. Resnick (Ed.), Knowing, learning, and instruction: Essays in honor of Robert Glaser (pp. 361–392). Hillsdale, NJ: Lawrence Erlbaum.
  • Booker, A. , & Goldman, S. (2016). Participatory design research as a practice for systemic repair: Doing hand-in-hand math research with families. Cognition and Instruction , 34 (3), 222–235.
  • Brown, A. L. (1992). Design experiments: Theoretical and methodological challenges in creating complex interventions in classroom settings. Journal of the Learning Sciences , 2 (2), 141–178.
  • Brown, A. , & Campione, J. C. (1996). Psychological theory and the design of innovative learning environments: On procedures, principles, and systems. In L. Schauble & R. Glaser (Eds.), Innovations in learning: New environments for education (pp. 289–325). Mahwah, NJ: Lawrence Erlbaum.
  • Brown, A. L. , & Campione, J. C. (1998). Designing a community of young learners: Theoretical and practical lessons. In N. M. Lambert & B. L. McCombs (Eds.), How students learn: Reforming schools through learner-centered education (pp. 153–186). Washington, DC: American Psychological Association.
  • Brown, A. , Campione, J. , Webber, L. , & McGilley, K. (1992). Interactive learning environments—A new look at learning and assessment. In B. R. Gifford & M. C. O’Connor (Eds.), Future assessment: Changing views of aptitude, achievement, and instruction (pp. 121–211). Boston, MA: Academic Press.
  • Carnoy, M. , Jacobsen, R. , Mishel, L. , & Rothstein, R. (2005). The charter school dust-up: Examining the evidence on enrollment and achievement . Washington, DC: Economic Policy Institute.
  • Carspecken, P. (1996). Critical ethnography in educational research . New York, NY: Routledge.
  • Cobb, P. , Confrey, J. , diSessa, A. , Lehrer, R. , & Schauble, L. (2003). Design experiments in educational research. Educational Researcher , 32 (1), 9–13.
  • Cobb, P. , & Steffe, L. P. (1983). The constructivist researcher as teacher and model builder. Journal for Research in Mathematics Education , 14 , 83–94.
  • Coburn, C. , & Penuel, W. (2016). Research-practice partnerships in education: Outcomes, dynamics, and open questions. Educational Researcher , 45 (1), 48–54.
  • Cole, M. (1995). From Moscow to the Fifth Dimension: An exploration in romantic science. In M. Cole & J. Wertsch (Eds.), Contemporary implications of Vygotsky and Luria (pp. 1–38). Worcester, MA: Clark University Press.
  • Cole, M. (1996). Cultural psychology: A once and future discipline . Cambridge, MA: Harvard University Press.
  • Cole, M. (2007). Sustaining model systems of educational activity: Designing for the long haul. In J. Campione , K. Metz , & A. S. Palinscar (Eds.), Children’s learning in and out of school: Essays in honor of Ann Brown (pp. 71–89). New York, NY: Routledge.
  • Cole, M. , & Engeström, Y. (1993). A cultural historical approach to distributed cognition. In G. Saloman (Ed.), Distributed cognitions: Psychological and educational considerations (pp. 1–46). Cambridge, U.K.: Cambridge University Press.
  • Cole, M. , & Engeström, Y. (2007). Cultural-historical approaches to designing for development. In J. Valsiner & A. Rosa (Eds.), The Cambridge handbook of sociocultural psychology , Cambridge, U.K.: Cambridge University Press.
  • Cole, M. , & Underwood, C. (2013). The evolution of the 5th Dimension. In The Story of the Laboratory of Comparative Human Cognition: A polyphonic autobiography . https://lchcautobio.ucsd.edu/polyphonic-autobiography/section-5/chapter-12-the-later-life-of-the-5th-dimension-and-its-direct-progeny/ .
  • Collins, A. (1992). Toward a design science of education. In E. Scanlon & T. O’Shea (Eds.), New directions in educational technology (pp. 15–22). New York, NY: Springer-Verlag.
  • Collins, A. , Joseph, D. , & Bielaczyc, K. (2004). Design research: Theoretical and methodological issues. Journal of the Learning Sciences , 13 (1), 15–42.
  • Dede, C. (2004). If design-based research is the answer, what is the question? A commentary on Collins, Joseph, and Bielaczyc; DiSessa and Cobb; and Fishman, Marx, Blumenthal, Krajcik, and Soloway in the JLS special issue on design-based research. Journal of the Learning Sciences , 13 (1), 105–114.
  • Design-Based Research Collective . (2003). Design-based research: An emerging paradigm for educational inquiry. Educational Researcher , 32 (1), 5–8.
  • DiGiacomo, D. , & Gutiérrez, K. D. (2015). Relational equity as a design tool within making and tinkering activities. Mind, Culture, and Activity , 22 (3), 1–15.
  • diSessa, A. A. (1991). Local sciences: Viewing the design of human-computer systems as cognitive science. In J. M. Carroll (Ed.), Designing interaction: Psychology at the human-computer interface (pp. 162–202). Cambridge, U.K.: Cambridge University Press.
  • diSessa, A. A. , & Cobb, P. (2004). Ontological innovation and the role of theory in design experiments. Journal of the Learning Sciences , 13 (1), 77–103.
  • diSessa, A. A. , & Minstrell, J. (1998). Cultivating conceptual change with benchmark lessons. In J. G. Greeno & S. Goldman (Eds.), Thinking practices (pp. 155–187). Mahwah, NJ: Lawrence Erlbaum.
  • Dominguez, M. (2015). Decolonizing teacher education: Explorations of expansive learning and culturally sustaining pedagogy in a social design experiment (Doctoral dissertation). University of Colorado, Boulder.
  • Edelson, D. (2002). Design research: What we learn when we engage in design. Journal of the Learning Sciences , 11 (1), 105–121.
  • Edwards, A. (2007). Relational agency in professional practice: A CHAT analysis. Actio: An International Journal of Human Activity Theory , 1 , 1–17.
  • Edwards, A. (2009). Agency and activity theory: From the systemic to the relational. In A. Sannino , H. Daniels , & K. Gutiérrez (Eds.), Learning and expanding with activity theory (pp. 197–211). Cambridge, U.K.: Cambridge University Press.
  • Engeström, Y. (1987). Learning by expanding . Helsinki, Finland: University of Helsinki, Department of Education.
  • Engeström, Y. (2000). Can people learn to master their future? Journal of the Learning Sciences , 9 , 525–534.
  • Engeström, Y. (2001). Expansive learning at work: Toward an activity theoretical reconceptualization. Journal of Education and Work , 14 (1), 133–156.
  • Engeström, Y. (2007). Enriching the theory of expansive learning: Lessons from journeys toward co-configuration. Mind, Culture, and Activity , 14 (1–2), 23–39.
  • Engeström, Y. (2008). Putting Vygotsky to work: The Change Laboratory as an application of double stimulation. In H. Daniels , M. Cole , & J. Wertsch (Eds.), Cambridge companion to Vygotsky (pp. 363–382). New York, NY: Cambridge University Press.
  • Engeström, Y. (2011). From design experiments to formative interventions. Theory & Psychology , 21 (5), 598–628.
  • Engeström, Y. , Engeström, R. , & Kärkkäinen, M. (1995). Polycontextuality and boundary crossing in expert cognition: Learning and problem solving in complex work activities. Learning and Instruction , 5 (4), 319–336.
  • Engeström, Y. , & Sannino, A. (2010). Studies of expansive learning: Foundations, findings and future challenges. Educational Research Review , 5 (1), 1–24.
  • Engeström, Y. , & Sannino, A. (2011). Discursive manifestations of contradictions in organizational change efforts: A methodological framework. Journal of Organizational Change Management , 24 (3), 368–387.
  • Engeström, Y. , Sannino, A. , & Virkkunen, J. (2014). On the methodological demands of formative interventions. Mind, Culture, and Activity , 21 (2), 118–128.
  • Erickson, F. , & Gutiérrez, K. (2002). Culture, rigor, and science in educational research. Educational Researcher , 31 (8), 21–24.
  • Espinoza, M. (2009). A case study of the production of educational sanctuary in one migrant classroom. Pedagogies: An International Journal , 4 (1), 44–62.
  • Espinoza, M. L. , & Vossoughi, S. (2014). Perceiving learning anew: Social interaction, dignity, and educational rights. Harvard Educational Review , 84 (3), 285–313.
  • Fine, M. (1994). Dis-tance and other stances: Negotiations of power inside feminist research. In A. Gitlin (Ed.), Power and method (pp. 13–25). New York, NY: Routledge.
  • Fishman, B. , Penuel, W. , Allen, A. , Cheng, B. , & Sabelli, N. (2013). Design-based implementation research: An emerging model for transforming the relationship of research and practice. National Society for the Study of Education , 112 (2), 136–156.
  • Gravemeijer, K. (1994). Educational development and developmental research in mathematics education. Journal for Research in Mathematics Education , 25 (5), 443–471.
  • Gutiérrez, K. (2005). Intersubjectivity and grammar in the third space . Scribner Award Lecture.
  • Gutiérrez, K. (2008). Developing a sociocritical literacy in the third space. Reading Research Quarterly , 43 (2), 148–164.
  • Gutiérrez, K. (2016). Designing resilient ecologies: Social design experiments and a new social imagination. Educational Researcher , 45 (3), 187–196.
  • Gutiérrez, K. , Bien, A. , Selland, M. , & Pierce, D. M. (2011). Polylingual and polycultural learning ecologies: Mediating emergent academic literacies for dual language learners. Journal of Early Childhood Literacy , 11 (2), 232–261.
  • Gutiérrez, K. , Engeström, Y. , & Sannino, A. (2016). Expanding educational research and interventionist methodologies. Cognition and Instruction , 34 (2), 275–284.
  • Gutiérrez, K. , & Jurow, A. S. (2016). Social design experiments: Toward equity by design. Journal of Learning Sciences , 25 (4), 565–598.
  • Gutiérrez, K. , & Penuel, W. R. (2014). Relevance to practice as a criterion for rigor. Educational Researcher , 43 (1), 19–23.
  • Gutiérrez, K. , & Rogoff, B. (2003). Cultural ways of learning: Individual traits or repertoires of practice. Educational Researcher , 32 (5), 19–25.
  • Gutiérrez, K. , & Vossoughi, S. (2010). Lifting off the ground to return anew: Mediated praxis, transformative learning, and social design experiments. Journal of Teacher Education , 61 (1–2), 100–117.
  • Hall, R. , & Jurow, A. S. (2015). Changing concepts in activity: Descriptive and design studies of consequential learning in conceptual practices. Educational Psychologist , 50 (3), 173–189.
  • Harding, S. (1993). Rethinking standpoint epistemology: What is “strong objectivity”? In L. Alcoff & E. Potter (Eds.), Feminist epistemologies (pp. 49–82). New York, NY: Routledge.
  • Hoadley, C. (2002). Creating context: Design-based research in creating and understanding CSCL. In G. Stahl (Ed.), Computer support for collaborative learning 2002 (pp. 453–462). Mahwah, NJ: Lawrence Erlbaum.
  • Hoadley, C. (2004). Methodological alignment in design-based research. Educational Psychologist , 39 (4), 203–212.
  • Joseph, D. (2004). The practice of design-based research: Uncovering the interplay between design, research, and the real-world context. Educational Psychologist , 39 (4), 235–242.
  • Jurow, A. S. , & Shea, M. V. (2015). Learning in equity-oriented scale-making projects. Journal of the Learning Sciences , 24 (2), 286–307.
  • Jurow, S. , Tracy, R. , Hotchkiss, J. , & Kirshner, B. (2012). Designing for the future: How the learning sciences can inform the trajectories of preservice teachers. Journal of Teacher Education , 63 (2), 147–60.
  • Kärkkäinen, M. (1999). Teams as breakers of traditional work practices: A longitudinal study of planning and implementing curriculum units in elementary school teacher teams . Helsinki, Finland: University of Helsinki, Department of Education.
  • Kelly, A. (2004). Design research in education: Yes, but is it methodological? Journal of the Learning Sciences , 13 (1), 115–128.
  • Kelly, A. E. , & Sloane, F. C. (2003). Educational research and the problems of practice. Irish Educational Studies , 22 , 29–40.
  • Kirshner, B. (2015). Youth activism in an era of education inequality . New York: New York University Press.
  • Kirshner, B. , & Polman, J. L. (2013). Adaptation by design: A context-sensitive, dialogic approach to interventions. National Society for the Study of Education Yearbook , 112 (2), 215–236.
  • Leander, K. M. , Phillips, N. C. , & Taylor, K. H. (2010). The changing social spaces of learning: Mapping new mobilities. Review of Research in Education , 34 , 329–394.
  • Lesh, R. A. , & Kelly, A. E. (2000). Multi-tiered teaching experiments. In A. E. Kelly & R. A. Lesh (Eds.), Handbook of research design in mathematics and science education (pp. 197–230). Mahwah, NJ: Lawrence Erlbaum.
  • Matusov, E. (1996). Intersubjectivity without agreement. Mind, Culture, and Activity , 3 (1), 29–45.
  • Messick, S. (1992). The interplay of evidence and consequences in the validation of performance assessments. Educational Researcher , 23 (2), 13–23.
  • Mosteller, F. , & Boruch, R. F. (Eds.). (2002). Evidence matters: Randomized trials in education research . Washington, DC: Brookings Institution Press.
  • Newman, D. , Griffin, P. , & Cole, M. (1989). The construction zone: Working for cognitive change in school . London, U.K.: Cambridge University Press.
  • Penuel, W. R. , Fishman, B. J. , Cheng, B. H. , & Sabelli, N. (2011). Organizing research and development at the intersection of learning, implementation, and design. Educational Researcher , 40 (7), 331–337.
  • Polman, J. L. (2000). Designing project-based science: Connecting learners through guided inquiry . New York, NY: Teachers College Press.
  • Ravitch, D. (2010). The death and life of the great American school system: How testing and choice are undermining education . New York, NY: Basic Books.
  • Rogoff, B. (1990). Apprenticeship in thinking: Cognitive development in social context . New York, NY: Oxford University Press.
  • Rogoff, B. (1995). Observing sociocultural activity on three planes: Participatory appropriation, guided participation, and apprenticeship. In J. V. Wertsch , P. D. Rio , & A. Alvarez (Eds.), Sociocultural studies of mind (pp. 139–164). Cambridge U.K.: Cambridge University Press.
  • Saltman, K. J. (2007). Capitalizing on disaster: Taking and breaking public schools . Boulder, CO: Paradigm.
  • Salvador, T. , Bell, G. , & Anderson, K. (1999). Design ethnography. Design Management Journal , 10 (4), 35–41.
  • Sannino, A. (2011). Activity theory as an activist and interventionist theory. Theory & Psychology , 21 (5), 571–597.
  • Sannino, A. , & Engeström, Y. (2016). Relational agency, double stimulation and the object of activity: An intervention study in a primary school. In A. Edwards (Ed.), Working relationally in and across practices: Cultural-historical approaches to collaboration (pp. 58–77). Cambridge, U.K.: Cambridge University Press.
  • Scardamalia, M. , & Bereiter, C. (1991). Higher levels of agency for children in knowledge building: A challenge for the design of new knowledge media. Journal of the Learning Sciences , 1 , 37–68.
  • Schoenfeld, A. H. (1982). Measures of problem solving performance and of problem solving instruction. Journal for Research in Mathematics Education , 13 , 31–49.
  • Schoenfeld, A. H. (1985). Mathematical problem solving . Orlando, FL: Academic Press.
  • Schoenfeld, A. H. (1992). On paradigms and methods: What do you do when the ones you know don’t do what you want them to? Issues in the analysis of data in the form of videotapes. Journal of the Learning Sciences , 2 (2), 179–214.
  • Scribner, S. , & Cole, M. (1978). Literacy without schooling: Testing for intellectual effects. Harvard Educational Review , 48 (4), 448–461.
  • Shavelson, R. J. , Phillips, D. C. , Towne, L. , & Feuer, M. J. (2003). On the science of education design studies. Educational Researcher , 32 (1), 25–28.
  • Steffe, L. P. , & Thompson, P. W. (2000). Teaching experiment methodology: Underlying principles and essential elements. In A. Kelly & R. Lesh (Eds.), Handbook of research design in mathematics and science education (pp. 267–307). Mahwah, NJ: Erlbaum.
  • Stevens, R. (2000). Divisions of labor in school and in the workplace: Comparing computer and paper-supported activities across settings. Journal of the Learning Sciences , 9 (4), 373–401.
  • Suchman, L. (1995). Making work visible. Communications of the ACM , 38 (9), 57–64.
  • Vakil, S. , de Royston, M. M. , Nasir, N. , & Kirshner, B. (2016). Rethinking race and power in design-based research: Reflections from the field. Cognition and Instruction , 34 (3), 194–209.
  • van den Akker, J. (1999). Principles and methods of development research. In J. van den Akker , R. M. Branch , K. Gustafson , N. Nieveen , & T. Plomp (Eds.), Design approaches and tools in education and training (pp. 1–14). Boston, MA: Kluwer Academic.
  • Virkkunen, J. , & Newnham, D. (2013). The Change Laboratory: A tool for collaborative development of work and education . Rotterdam, The Netherlands: Sense.
  • White, B. Y. , & Frederiksen, J. R. (1998). Inquiry, modeling, and metacognition: Making science accessible to all students. Cognition and Instruction , 16 , 3–118.
  • Zavala, M. (2016). Design, participation, and social change: What design in grassroots spaces can teach learning scientists. Cognition and Instruction , 34 (3), 236–249.

1. The reader should note the emergence of critical ethnography (e.g., Carspecken, 1996 ; Fine, 1994 ), and other more participatory models of ethnography that deviated from this traditional paradigm during this same time period. These new forms of ethnography comprised part of the genealogy of the more critical approaches to DBR, described later in this article.

2. The reader will also note that the adjective “qualitative” largely drops away from the acronym “DBR.” This is largely because, as described, DBR, as an exploration of naturalistic ecologies with multitudes of variables, and social and learning dynamics, necessarily demands a move beyond what can be captured by quantitative measurement alone. The qualitative nature of the research is thus implied and embedded as part of what makes DBR a unique and distinct methodology.




What is Qualitative Research? Methods, Types, Approaches and Examples

Qualitative research is one of the two broad families of research methods, and researchers choose it based on their study’s requirements. Research can be conducted using several methods, so before starting, researchers should understand the options available and decide which best fits their study. The choice depends on a few important criteria, such as the research question, study type, time, costs, data availability, and availability of respondents. The two main types of methods are qualitative research and quantitative research, and it can be difficult to decide which is most suitable for a given study. A simple rule of thumb helps: use quantitative research to validate or test a theory or hypothesis, and use qualitative research to understand a subject or event or to identify the reasons behind observed patterns.  

Qualitative research methods are based on principles drawn from several social science disciplines, such as psychology, sociology, and anthropology. In this approach, researchers try to understand the feelings and motivations that prompted respondents to select or give a particular response to a question. Here are two qualitative research examples:  

  • Two brands (A & B) of the same medicine are available at a pharmacy. However, Brand A is more popular and has higher sales. In qualitative research , the interviewers would ideally visit a few stores in different areas and ask customers their reason for selecting either brand. Respondents may have different reasons that motivate them to select one brand over the other, such as brand loyalty, cost, feedback from friends, doctor’s suggestion, etc. Once the reasons are known, companies could then address challenges in that specific area to increase their product’s sales.  
  • A company organizes a focus group meeting with a random sample of its product’s consumers to understand their opinion on a new product being launched.  


What is qualitative research? 1

Qualitative research is the process of collecting, analyzing, and interpreting non-numerical data. The findings of qualitative research are expressed in words and help in understanding individuals’ subjective perceptions about an event, condition, or subject. This type of research is exploratory and is used to generate hypotheses or theories from data. Qualitative data are usually in the form of text, videos, photographs, and audio recordings. There are multiple qualitative research types , which will be discussed later.  

Qualitative research methods 2

Researchers can choose from several qualitative research methods depending on the study type, research question, the researcher’s role, data to be collected, etc.  

The following table lists the common qualitative research approaches with their purpose and examples, although there may be some overlap between them.  

Approach  Purpose  Example 
Narrative  Explore the experiences of individuals and tell a story to give insight into human lives and behaviors. Narratives can be obtained from journals, letters, conversations, autobiographies, interviews, etc.  A researcher collecting information to create a biography using old documents, interviews, etc. 
Phenomenology  Explain life experiences or phenomena, focusing on people’s subjective experiences and interpretations of the world.  Researchers exploring the experiences of family members of an individual undergoing major surgery.  
Grounded theory  Investigate processes, actions, and interactions, and develop a theory grounded in the empirical data. Unlike experimental research, this method doesn’t require a hypothesis to begin with.  A company with a high attrition rate and no prior data may use this method to understand the reasons employees leave. 
Ethnography  Describe an ethnic, cultural, or social group by observing its members in their naturally occurring environment.  A researcher studying medical personnel in the immediate care division of a hospital to understand the culture and staff behaviors during periods of high capacity. 
Case study  In-depth analysis of complex issues in real-life settings, mostly used in business, law, and policymaking. Learnings from case studies can be implemented in other similar contexts.  A case study about how a particular company turned around its product sales and the marketing strategies it used could help implement similar methods in other companies. 

Types of qualitative research 3,4

The data collection methods in qualitative research are designed to assess and understand the perceptions, motivations, and feelings of the respondents about the subject being studied. The different qualitative research types include the following:  

  • In-depth or one-on-one interviews : This is one of the most common qualitative research methods and helps the interviewers understand a respondent’s subjective opinion and experience pertaining to a specific topic or event. These interviews are usually conversational and encourage the respondents to express their opinions freely. Semi-structured interviews, which have open-ended questions (where the respondents can answer more than just “yes” or “no”), are commonly used. Such interviews can be either face-to-face or telephonic, and the duration can vary depending on the subject or the interviewer. Asking the right questions is essential in this method so that the interview can be led in a suitable direction. Face-to-face interviews also help interviewers observe the respondents’ body language, which could help confirm whether their answers match their non-verbal cues.  
  • Document study/Literature review/Record keeping : Researchers’ review of already existing written materials such as archives, annual reports, research articles, guidelines, policy documents, etc.  
  • Focus groups : These usually include a small sample of about 6-10 people and a moderator, and are used to understand the participants’ opinions on a given topic. Focus groups encourage constructive discussion to understand the why, what, and how of the topic. These group meetings need not always be in person; online meetings are increasingly common, and online surveys with the option to “write” subjective answers can also be administered. However, this method is expensive and is mostly used for new products and ideas.  
  • Qualitative observation : In this method, researchers collect data using their five senses—sight, smell, touch, taste, and hearing. This method doesn’t involve any measurements, only subjective observation. For example, “The dessert served at the bakery was creamy with sweet buttercream frosting”; this observation is based on taste perception.  


Qualitative research: Data collection and analysis

1. Qualitative data collection 

  • Qualitative data collection is the process by which observations or measurements are gathered in research.  
  • The data collected are usually non-numeric and subjective and can be recorded in various ways; for instance, in one-on-one interviews, responses may be captured through handwritten notes or audio and video recordings, depending on the interviewer, the setting, and the duration.  
  • Once the data are collected, they should be transcribed so they can be analyzed and interpreted. An experienced researcher may take about 8-10 hours to transcribe an interview’s recordings. All such notes and recordings should be maintained properly for later reference.  
  • Some interviewers make use of “field notes.” These are not exactly the respondents’ answers but rather some observations the interviewer may have made while asking questions and may include non-verbal cues or any information about the setting or the environment. These notes are usually informal and help verify respondents’ answers.  

2. Qualitative data analysis 

  • This process involves analyzing all the data obtained from the qualitative research methods in the form of text (notes), audio-video recordings, and pictures.  
  • Text analysis is a common form of qualitative data analysis in which researchers examine the social lives of the participants and analyze their words, actions, etc. in specific contexts. Social media platforms are now playing an important role in this method with researchers analyzing all information shared online.   

There are usually five steps in the qualitative data analysis process 5 (a short illustrative sketch follows the list):

1. Prepare and organize the data  
  • Transcribe interviews  
  • Collect and document field notes and other material  
2. Review and explore the data  
  • Examine the data for patterns or important observations  
3. Develop a data coding system  
  • Create codes to categorize and connect the data  
4. Assign these codes to the data or responses  
  • Review the codes  
  • Identify recurring themes, opinions, patterns, etc.  
5. Present the findings  
  • Use the best possible method to present your observations  
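To make the coding and theming steps more concrete, here is a minimal, hypothetical sketch in Python. The excerpts, codes, and themes are invented for illustration, and in practice the coding itself is done by a researcher (often in dedicated QDA software or a spreadsheet) rather than generated automatically.

```python
# Minimal sketch of the later steps above: codes assigned to excerpts,
# codes grouped into themes, and a simple summary of how often each
# theme appears. All excerpts, codes, and themes are hypothetical.
from collections import Counter

excerpts = [
    "I stayed with Brand A because my doctor recommended it.",
    "Brand A is cheaper at my local pharmacy.",
    "My friends all use Brand A, so I trust it.",
    "The pharmacist suggested Brand A when Brand B was out of stock.",
]

# Codes assigned to each excerpt (normally done manually by the researcher).
coded_data = {
    excerpts[0]: ["doctor recommendation"],
    excerpts[1]: ["price"],
    excerpts[2]: ["peer influence"],
    excerpts[3]: ["pharmacist recommendation", "availability"],
}

# Related codes grouped into broader themes.
themes = {
    "professional advice": {"doctor recommendation", "pharmacist recommendation"},
    "cost and access": {"price", "availability"},
    "social proof": {"peer influence"},
}

# Count how many excerpts touch on each theme.
theme_counts = Counter()
for codes in coded_data.values():
    for theme, members in themes.items():
        if any(code in members for code in codes):
            theme_counts[theme] += 1

for theme, count in theme_counts.most_common():
    print(f"{theme}: mentioned in {count} excerpt(s)")
```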

The following table 6 lists some common qualitative data analysis methods used by companies to make important decisions, with examples and guidance on when to use each. The methods may be similar and can overlap. A toy content-analysis sketch follows the table.  

Analysis method  Purpose  Example 
Content analysis  To identify patterns in text, by grouping content into words, concepts, and themes; that is, determine presence of certain words or themes in some text  Researchers examining the language used in a journal article to search for bias 
Narrative analysis  To understand people’s perspectives on specific issues. Focuses on people’s stories and the language used to tell these stories  A researcher conducting one or several in-depth interviews with an individual over a long period 
Discourse analysis  To understand political, cultural, and power dynamics in specific contexts; that is, how people express themselves in different social contexts  A researcher studying a politician’s speeches across multiple contexts, such as audience, region, political history, etc. 
Thematic analysis  To interpret the meaning behind the words used by people. This is done by identifying repetitive patterns or themes by reading through a dataset  Researcher analyzing raw data to explore the impact of high-stakes examinations on students and parents 
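As a rough illustration of the content analysis row above, the hypothetical Python snippet below checks how often a few predefined concept keywords appear in a short passage. The passage and keyword lists are made up, and real content analysis normally relies on a codebook and human judgment rather than simple keyword matching.

```python
# Toy content analysis: count mentions of predefined concepts in a text.
# The passage and the concept keywords are invented examples.
import re
from collections import Counter

text = (
    "Participants said the new checkout felt confusing, and several "
    "mentioned that the confusing layout made them abandon their cart. "
    "A few described the process as fast but stressful."
)

concepts = {
    "confusion": ["confusing", "unclear", "lost"],
    "speed": ["fast", "quick", "slow"],
    "emotion": ["stressful", "frustrated", "happy"],
}

# Tokenize the text into lowercase words and count them.
word_counts = Counter(re.findall(r"[a-z']+", text.lower()))

for concept, keywords in concepts.items():
    total = sum(word_counts[k] for k in keywords)
    print(f"{concept}: {total} mention(s)")
```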

Characteristics of qualitative research methods 4

  • Unstructured raw data : Qualitative research methods use unstructured, non-numerical data , which are analyzed to generate subjective conclusions about specific subjects, usually presented descriptively, instead of using statistical data.  
  • Site-specific data collection : In qualitative research methods , data are collected at specific areas where the respondents or researchers are either facing a challenge or have a need to explore. The process is conducted in a real-world setting and participants do not need to leave their original geographical setting to be able to participate.  
  • Researchers’ importance : Researchers play an instrumental role because, in qualitative research , communication with respondents is an essential part of data collection and analysis. In addition, researchers need to rely on their own observation and listening skills during an interaction and use and interpret that data appropriately.  
  • Multiple methods : Researchers collect data through various methods, as listed earlier, instead of relying on a single source. Although there may be some overlap between the qualitative research methods , each method has its own significance.  
  • Solving complex issues : These methods help in breaking down complex problems into more useful and interpretable inferences, which can be easily understood by everyone.  
  • Unbiased responses : Qualitative research methods rely on open communication in which participants can freely express their views. When participants trust the interviewer, their responses are more likely to be candid and truthful.  
  • Flexible : The qualitative research method can be changed at any stage of the research. The data analysis is not confined to being done at the end of the research but can be done in tandem with data collection. Consequently, based on preliminary analysis and new ideas, researchers have the liberty to change the method to suit their objective.  


When to use qualitative research   4

The following points will give you an idea about when to use qualitative research .  

  • When the objective of a research study is to understand behaviors and patterns of respondents, then qualitative research is the most suitable method because it gives a clear insight into the reasons for the occurrence of an event.  
  • A few use cases for qualitative research methods include:  
  • New product development or idea generation  
  • Strengthening a product’s marketing strategy  
  • Conducting a SWOT analysis of product or services portfolios to help take important strategic decisions  
  • Understanding purchasing behavior of consumers  
  • Understanding reactions of target market to ad campaigns  
  • Understanding market demographics and conducting competitor analysis  
  • Understanding the effectiveness of a new treatment method in a particular section of society  

A qualitative research method case study to understand when to use qualitative research 7

Context : A high school in the US underwent a turnaround or conservatorship process and consequently experienced a below average teacher retention rate. Researchers conducted qualitative research to understand teachers’ experiences and perceptions of how the turnaround may have influenced the teachers’ morale and how this, in turn, would have affected teachers’ retention.  

Method : Purposive sampling was used to select eight teachers who were employed with the school before the conservatorship process and who were subsequently retained. One-on-one semi-structured interviews were conducted with these teachers. The questions addressed teachers’ perspectives of morale and their views on the conservatorship process.  

Results : The study generated six factors that may have been influencing teachers’ perspectives: powerlessness, excessive visitations, loss of confidence, ineffective instructional practices, stress and burnout, and ineffective professional development opportunities. Based on these factors, four recommendations were made to increase teacher retention by boosting their morale.  


Advantages of qualitative research 1

  • Reflects real-world settings , and therefore allows for ambiguities in data, as well as the flexibility to change the method based on new developments.  
  • Helps in understanding the feelings or beliefs of the respondents rather than relying only on quantitative data.  
  • Uses a descriptive and narrative style of presentation, which may be easier to understand for people from all backgrounds.  
  • Some topics involving sensitive or controversial content could be difficult to quantify and so qualitative research helps in analyzing such content.  
  • The availability of multiple data sources and research methods helps give a holistic picture.  
  • There’s more involvement of participants, which gives them an assurance that their opinion matters, possibly leading to unbiased responses.   

Disadvantages of qualitative research 1

  • Large-scale data sets cannot be included because of time and cost constraints.  
  • Ensuring validity and reliability may be a challenge because of the subjective nature of the data, so drawing definite conclusions could be difficult.  
  • Replication by other researchers may be difficult for the same contexts or situations.  
  • Generalization to a wider context or to other populations or settings is not possible.  
  • Data collection and analysis may be time consuming.  
  • The researcher’s interpretation may alter the results, causing unintended bias.  

Differences between qualitative research and quantitative research 1

Aspect  Qualitative research  Quantitative research 
Purpose and design  Explore ideas, formulate hypotheses; more subjective  Test theories and hypotheses, discover causal relationships; measurable and more structured 
Data collection method  Semi-structured interviews/surveys with open-ended questions, document study/literature reviews, focus groups, case study research, ethnography  Experiments, controlled observations, questionnaires and surveys with a rating scale or closed-ended questions. The methods can be experimental, quasi-experimental, descriptive, or correlational. 
Data analysis  Content analysis (determine presence of certain words/concepts in texts), grounded theory (hypothesis creation by data collection and analysis), thematic analysis (identify important themes/patterns in data and use these to address an issue)  Statistical analysis using applications such as Excel, SPSS, R 
Sample size  Small  Large 
Example  A company organizing focus groups or one-to-one interviews to understand customers’ (subjective) opinions about a specific product, based on which the company can modify their marketing strategy  Customer satisfaction surveys sent out by companies. Customers are asked to rate their experience on a rating scale of 1 to 5  

Frequently asked questions on qualitative research  

Q: How do I know if qualitative research is appropriate for my study?

A: Here’s a simple checklist you could use:  

  • Not much is known about the subject being studied.  
  • There is a need to understand or simplify a complex problem or situation.  
  • Participants’ experiences/beliefs/feelings are required for analysis.  
  • There’s no existing hypothesis to begin with, rather a theory would need to be created after analysis.  
  • You need to gather in-depth understanding of an event or subject, which may not need to be supported by numeric data.  

Q: How do I ensure the reliability and validity of my qualitative research findings?  

A: To ensure the validity of your qualitative research findings you should explicitly state your objective and describe clearly why you have interpreted the data in a particular way. Another method could be to connect your data in different ways or from different perspectives to see if you reach a similar, unbiased conclusion.   

To ensure reliability, always create an audit trail of your qualitative research by describing your steps and reasons for every interpretation, so that if required, another researcher could trace your steps to corroborate your (or their own) findings. In addition, always look for patterns or consistencies in the data collected through different methods.  
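One lightweight way to keep such an audit trail is to log every analysis decision with a timestamp and a rationale as you go. The sketch below is only an illustration; the field names and the logged decision are hypothetical, and a notebook or spreadsheet works just as well.

```python
# A minimal audit-trail log: each analysis decision is recorded with a
# timestamp, the step it belongs to, and the reasoning behind it.
from datetime import datetime, timezone

audit_trail = []

def log_decision(step, decision, rationale):
    """Append one analysis decision to the audit trail."""
    audit_trail.append({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "step": step,
        "decision": decision,
        "rationale": rationale,
    })

log_decision(
    step="coding",
    decision="Merged 'cost' and 'affordability' into a single code",
    rationale="Participants used the two terms interchangeably in interviews 3-7.",
)

for entry in audit_trail:
    print(entry)
```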

Q: Are there any sampling strategies or techniques for qualitative research ?   

A: Yes, the following are a few common sampling strategies used in qualitative research (a small illustrative sketch follows the list):  

1. Convenience sampling  

Selects participants who are most easily accessible to researchers due to geographical proximity, availability at a particular time, etc.  

2. Purposive sampling  

Participants are grouped according to predefined criteria based on a specific research question. Sample sizes are often determined based on theoretical saturation (when new data no longer provide additional insights).  

3. Snowball sampling  

Already selected participants use their social networks to refer the researcher to other potential participants.  

4. Quota sampling  

While designing the study, the researchers decide how many people with which characteristics to include as participants. The characteristics help in choosing people most likely to provide insights into the subject.  
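As a small illustration of two of these strategies, the hypothetical Python sketch below draws a purposive sample and a quota sample from an invented participant pool. The pool, the selection criteria, and the quotas are all assumptions made purely for the example.

```python
# Illustrative purposive and quota sampling from a made-up participant pool.
import random

random.seed(0)  # reproducible example

pool = [
    {"name": f"P{i}",
     "role": random.choice(["teacher", "parent", "student"]),
     "years_experience": random.randint(0, 20)}
    for i in range(1, 101)
]

# Purposive sampling: keep only participants who meet predefined criteria,
# e.g., teachers with at least five years of experience.
purposive = [p for p in pool
             if p["role"] == "teacher" and p["years_experience"] >= 5]

# Quota sampling: decide in advance how many participants of each type to
# include, then fill each quota from the pool.
quotas = {"teacher": 4, "parent": 3, "student": 3}
quota_sample = []
for role, n in quotas.items():
    candidates = [p for p in pool if p["role"] == role]
    quota_sample.extend(random.sample(candidates, min(n, len(candidates))))

print("Purposive sample size:", len(purposive))
print("Quota sample roles:", [p["role"] for p in quota_sample])
```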


Q: What ethical standards need to be followed with qualitative research ?  

A: The following ethical standards should be considered in qualitative research:  

  • Anonymity : The participants should never be identified in the study and researchers should ensure that no identifying information is mentioned even indirectly.  
  • Confidentiality : To protect participants’ confidentiality, ensure that all related documents, transcripts, notes are stored safely.  
  • Informed consent : Researchers should clearly communicate the objective of the study and how the participants’ responses will be used prior to engaging with the participants.  

Q: How do I address bias in my qualitative research ?  

A: You could use the following points to ensure an unbiased approach to your qualitative research (a brief triangulation sketch follows the list):  

  • Check your interpretations of the findings with others’ interpretations to identify consistencies.  
  • If possible, you could ask your participants if your interpretations convey their beliefs to a significant extent.  
  • Data triangulation is a way of using multiple data sources to see if all methods consistently support your interpretations.  
  • Contemplate other possible explanations for your findings or interpretations and try ruling them out if possible.  
  • Conduct a peer review of your findings to identify any gaps that may not have been visible to you.  
  • Frame context-appropriate questions to ensure there is no researcher or participant bias.
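For example, a very small sketch of the data triangulation point above might compare the themes surfaced by two different methods to see where they agree; the theme sets below are hypothetical.

```python
# Toy triangulation check: which themes are supported by more than one
# data source? The theme sets are invented for illustration.
interview_themes = {"price sensitivity", "brand loyalty", "doctor advice"}
focus_group_themes = {"price sensitivity", "doctor advice", "packaging"}

supported_by_both = interview_themes & focus_group_themes
seen_in_only_one = interview_themes ^ focus_group_themes  # symmetric difference

print("Supported by both sources:", sorted(supported_by_both))
print("Seen in only one source (needs more evidence):", sorted(seen_in_only_one))
```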

We hope this article has given you answers to the question “ what is qualitative research ” and given you an in-depth understanding of the various aspects of qualitative research , including the definition, types, and approaches, when to use this method, and advantages and disadvantages, so that the next time you undertake a study you would know which type of research design to adopt.  

References:  

  • McLeod, S. A. Qualitative vs. quantitative research. Simply Psychology [Accessed January 17, 2023]. www.simplypsychology.org/qualitative-quantitative.html    
  • Omniconvert website [Accessed January 18, 2023]. https://www.omniconvert.com/blog/qualitative-research-definition-methodology-limitation-examples/  
  • Busetto L., Wick W., Gumbinger C. How to use and assess qualitative research methods. Neurological Research and Practice [Accessed January 19, 2023] https://neurolrespract.biomedcentral.com/articles/10.1186/s42466-020-00059  
  • QuestionPro website. Qualitative research methods: Types & examples [Accessed January 16, 2023]. https://www.questionpro.com/blog/qualitative-research-methods/  
  • Campuslabs website. How to analyze qualitative data [Accessed January 18, 2023]. https://baselinesupport.campuslabs.com/hc/en-us/articles/204305675-How-to-analyze-qualitative-data  
  • Thematic website. Qualitative data analysis: Step-by-step guide [Accessed January 20, 2023]. https://getthematic.com/insights/qualitative-data-analysis/  
  • Lane L. J., Jones D., Penny G. R. Qualitative case study of teachers’ morale in a turnaround school. Research in Higher Education Journal . https://files.eric.ed.gov/fulltext/EJ1233111.pdf  
  • Meetingsnet website. 7 FAQs about qualitative research and CME [Accessed January 21, 2023]. https://www.meetingsnet.com/cme-design/7-faqs-about-qualitative-research-and-cme     
  • Qualitative research methods: A data collector’s field guide. Khoury College of Computer Sciences. Northeastern University. https://course.ccs.neu.edu/is4800sp12/resources/qualmethods.pdf  



Qualitative vs. quantitative data in research: what's the difference?

If you're reading this, you likely already know the importance of data analysis. And you already know it can be incredibly complex.

At its simplest, research and its data can be broken down into two different categories: quantitative and qualitative. But what's the difference between them? When should you use each? And how can you use them together?

Understanding the differences between qualitative and quantitative data is key to any research project. Knowing both approaches can help you understand your data better—and ultimately understand your customers better. Quick takeaways:

Quantitative research uses objective, numerical data to answer questions like "what" and "how often." Conversely, qualitative research seeks to answer questions like "why" and "how," focusing on subjective experiences to understand motivations and reasons.

Quantitative data is collected through methods like surveys and experiments and analyzed statistically to identify patterns. Qualitative data is gathered through interviews or observations and analyzed by categorizing information to understand themes and insights.

Effective data analysis combines quantitative data for measurable insights with qualitative data for contextual depth.

What is quantitative data?

Qualitative and quantitative research differ in their approach and the type of data they collect.

Quantitative data refers to any information that can be quantified — that is, numbers. If it can be counted or measured, and given a numerical value, it's quantitative in nature. Think of it as a measuring stick.

Quantitative variables can tell you "how many," "how much," or "how often."

Some examples of quantitative data :  

How many people attended last week's webinar? 

How much revenue did our company make last year? 

How often does a customer rage click on this app?

To analyze these research questions and make sense of this quantitative data, you’d normally use a form of statistical analysis —collecting, evaluating, and presenting large amounts of data to discover patterns and trends. Quantitative data is conducive to this type of analysis because it’s numeric and easier to analyze mathematically.
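As a toy example of the kind of descriptive statistics this usually starts with, the snippet below summarizes an invented list of webinar attendance counts; the numbers are made up for illustration.

```python
# Simple descriptive statistics over a hypothetical quantitative dataset.
import statistics

webinar_attendance = [112, 98, 150, 121, 134, 90, 160, 125]

print("count:", len(webinar_attendance))
print("mean:", statistics.mean(webinar_attendance))
print("median:", statistics.median(webinar_attendance))
print("stdev:", round(statistics.stdev(webinar_attendance), 1))
```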

Computers now rule statistical analytics, even though traditional methods have been used for years. But today’s data volumes make statistics more valuable and useful than ever. When you think of statistical analysis now, you think of powerful computers and algorithms that fuel many of the software tools you use today.

Popular quantitative data collection methods are surveys, experiments, polls, and more.


What is qualitative data?

Unlike quantitative data, qualitative data is descriptive, expressed in terms of language rather than numerical values.

Qualitative data is descriptive information that cannot be measured or counted. It refers to the words or labels used to describe certain characteristics or traits.

You would turn to qualitative data to answer the "why?" or "how?" questions. It is often used to investigate open-ended studies, allowing participants (or customers) to show their true feelings and actions without guidance.

Some examples of qualitative data:

Why do people prefer using one product over another?

How do customers feel about their customer service experience?

What do people think about a new feature in the app?

Think of qualitative data as the type of data you'd get if you were to ask someone why they did something. Popular data collection methods are in-depth interviews, focus groups, or observation.


What are the differences between qualitative vs. quantitative data?

When it comes to conducting data research, you’ll need different collection, hypotheses and analysis methods, so it’s important to understand the key differences between quantitative and qualitative data:

Quantitative data is numbers-based, countable, or measurable. Qualitative data is interpretation-based, descriptive, and relating to language.

Quantitative data tells us how many, how much, or how often in calculations. Qualitative data can help us to understand why, how, or what happened behind certain behaviors .

Quantitative data is fixed and universal. Qualitative data is subjective and unique.

Quantitative research methods are measuring and counting. Qualitative research methods are interviewing and observing.

Quantitative data is analyzed using statistical analysis. Qualitative data is analyzed by grouping the data into categories and themes.

Qualitative vs. quantitative examples

As you can see, both provide immense value for any data collection and are key to truly finding answers and patterns. 

More examples of quantitative and qualitative data

You’ve most likely run into quantitative and qualitative data today alone. For the visual learner, here are some examples of both quantitative and qualitative data: 

Quantitative data example

The customer has clicked on the button 13 times. 

The engineer has resolved 34 support tickets today. 

The team has completed 7 upgrades this month. 

14 cartons of eggs were purchased this month.

Qualitative data example

My manager has curly brown hair and blue eyes.

My coworker is funny, loud, and a good listener. 

The customer has a very friendly face and a contagious laugh.

The eggs were delicious.

The fundamental difference is that one type of data answers in counts and measurements, while the other answers descriptively. 

What does this mean for data quality and analysis? If you just analyzed quantitative data, you’d be missing core reasons behind what makes a data collection meaningful. You need both in order to truly learn from data—and truly learn from your customers. 

What are the advantages and disadvantages of each?

Both types of data have their own pros and cons. 

Advantages of quantitative data

It’s relatively quick and easy to collect and it’s easier to draw conclusions from. 

When you collect quantitative data, the type of results will tell you which statistical tests are appropriate to use. 

As a result, interpreting your data and presenting those findings is straightforward and less open to error and subjectivity.

Another advantage is that you can replicate it. Replicating a study is possible because your data collection is measurable and tangible for further applications.

Disadvantages of quantitative data

Quantitative data doesn’t always tell you the full story (no matter what the perspective). 

If the underlying information is incomplete or inconsistent, it can be inconclusive.

Quantitative research can be limited, which can lead to overlooking broader themes and relationships.

By focusing solely on numbers, there is a risk of missing broader context and information that could be beneficial.

Advantages of qualitative data

Qualitative data offers rich, in-depth insights and allows you to explore context.

It’s great for exploratory purposes.

Qualitative research delivers a predictive element for continuous data.

Disadvantages of qualitative data

It’s not a statistically representative form of data collection, and it relies heavily on the skill of the interviewer or moderator, who may miss or lose data.

It can also require multiple data sessions, which can lead to misleading conclusions.

The takeaway is that it’s tough to conduct a successful data analysis without both. They both have their advantages and disadvantages and, in a way, they complement each other. 

Now, of course, in order to analyze both types of data, information has to be collected first.

Let's get into the research.

Quantitative and qualitative research

The core difference between qualitative and quantitative research lies in their focus and methods of data collection and analysis. This distinction guides researchers in choosing an appropriate approach based on their specific research needs.

Using mixed methods of both can also help provide insights from combined qualitative and quantitative data.

Best practices of each help to look at the information under a broader lens to get a unique perspective. Using both methods is helpful because they collect rich and reliable data, which can be further tested and replicated.

What is quantitative research?

Quantitative research is based on the collection and interpretation of numeric data. It's all about the numbers and focuses on measuring (using inferential statistics ) and generalizing results. Quantitative research seeks to collect numerical data that can be transformed into usable statistics.

It relies on measurable data to formulate facts and uncover patterns in research. By employing statistical methods to analyze the data, it provides a broad overview that can be generalized to larger populations.

In terms of digital experience data, it puts everything in terms of numbers (or discrete data )—like the number of users clicking a button, bounce rates , time on site, and more. 

Some examples of quantitative research: 

What is the amount of money invested into this service?

What is the average number of times a button was dead clicked ?

How many customers are actually clicking this button?

Essentially, quantitative research is an easy way to see what’s going on at a 20,000-foot view. 

Each data set (or customer action, if we’re still talking digital experience) has a numerical value associated with it and is quantifiable information that can be used for calculating statistical analysis so that decisions can be made. 

You can use statistical operations to discover feedback patterns (with any representative sample size) in the data under examination. The results can be used to make predictions , find averages, test causes and effects, and generalize results to larger measurable data pools. 

Unlike qualitative methodology, quantitative research offers more objective findings as they are based on more reliable numeric data.

Quantitative data collection methods

Surveys

A survey is one of the most common quantitative data collection methods and involves questioning a large group of people. Questions are usually closed-ended and are the same for all participants. An unclear questionnaire can lead to distorted research outcomes.

Polls

Similar to surveys, polls yield quantitative data: you poll a number of people and apply a numeric value to how many people responded with each answer.

Experiments

An experiment is another common method that usually involves a control group and an experimental group . The experiment is controlled and the conditions can be manipulated accordingly. You can examine any type of records involved if they pertain to the experiment, so the data is extensive. 

What is qualitative research?

Qualitative research does not simply help to collect data. It gives a chance to understand the trends and meanings of natural actions. It’s flexible and iterative.

Qualitative research focuses on the qualities of users—the actions that drive the numbers. It's descriptive research. The qualitative approach is subjective, too. 

It focuses on describing an action, rather than measuring it.

Some examples of qualitative research: 

The sunflowers had a fresh smell that filled the office.

All the bagels with bites taken out of them had cream cheese.

The man had blonde hair with a blue hat.

Qualitative research utilizes interviews, focus groups, and observations to gather in-depth insights.

This approach shines when the research objective calls for exploring ideas or uncovering deep insights rather than quantifying elements.

Qualitative data collection methods

Interviews

An interview is the most common qualitative research method. This method involves personal interaction (either in real life or virtually) with a participant. It’s mostly used for exploring attitudes and opinions regarding certain issues.

Interviews are very popular methods for collecting data in product design .

Focus groups

Data analysis by focus group is another method where participants are guided by a host to collect data. Within a group (either in person or online), each member shares their opinion and experiences on a specific topic, allowing researchers to gather perspectives and deepen their understanding of the subject matter.


So which type of data is better for data analysis?

So how do you determine which type is better for data analysis ?

Quantitative data is structured and accountable. This type of data is formatted in a way so it can be organized, arranged, and searchable. Think about this data as numbers and values found in spreadsheets—after all, you would trust an Excel formula.

Qualitative data is considered unstructured. This type of data is formatted (and known for) being subjective, individualized, and personalized. Anything goes. Because of this, qualitative data is inferior if it’s the only data in the study. However, it’s still valuable. 

Because quantitative data is more concrete, it’s generally preferred for data analysis. Numbers don’t lie. But for complete statistical analysis, using both qualitative and quantitative yields the best results. 

At Fullstory, we understand the importance of data, which is why we created a behavioral data platform that analyzes customer data for better insights. Our platform delivers a complete, retroactive view of how people interact with your site or app—and analyzes every point of user interaction so you can scale.

Unlock business-critical data with Fullstory

A perfect digital customer experience is often the difference between company growth and failure. And the first step toward building that experience is quantifying who your customers are, what they want, and how to provide them what they need.

Access to product analytics is the most efficient and reliable way to collect valuable quantitative data about funnel analysis, customer journey maps , user segments, and more.

But creating a perfect digital experience means you need organized and digestible quantitative data—but also access to qualitative data. Understanding the why is just as important as the what itself.

Fullstory's DXI platform combines the quantitative insights of product analytics with picture-perfect session replay for complete context that helps you answer questions, understand issues, and uncover customer opportunities.



  • Open access
  • Published: 20 June 2024

A mixed methods approach identifying facilitators and barriers to guide adaptations to InterCARE strategies: an integrated HIV and hypertension care model in Botswana

  • Pooja Gala   ORCID: orcid.org/0000-0002-7505-9352 1   na1 ,
  • Ponego Ponatshego 2 , 3   na1 ,
  • Laura M. Bogart 4 ,
  • Nabila Youssouf 5 ,
  • Mareko Ramotsababa 6 ,
  • Amelia E. Van Pelt 7 ,
  • Thato Moshomo 2 , 3 ,
  • Evelyn Dintwa 6 ,
  • Khumo Seipone 6 ,
  • Maliha Ilias 8 ,
  • Veronica Tonwe 8 ,
  • Tendani Gaolathe 2 , 3 , 6   na2 ,
  • Lisa R. Hirschhorn 7   na2 &
  • Mosepele Mosepele 2 , 3 , 6   na2  

Implementation Science Communications, volume 5, Article number: 67 (2024)

Background

Botswana serves as a model of success for HIV with 95% of people living with HIV (PLWH) virally suppressed. Yet, only 19% of PLWH and hypertension have controlled blood pressure. To address this gap, InterCARE, a care model that integrates HIV and hypertension care through a) provider training; b) an adapted electronic health record; and c) treatment partners (peer support), was designed. This study presents results from our baseline assessment of the determinants and factors used to guide adaptations to InterCARE implementation strategies prior to a hybrid type 2 effectiveness-implementation study.

Methods

This study employed a convergent mixed methods design across two clinics (one rural, one urban) to collect quantitative and qualitative data through facility assessments, 100 stakeholder surveys (20 each from PLWH and hypertension, existing HIV treatment partners, and clinical healthcare providers (HCPs), plus 40 community leaders), and ten stakeholder key informant interviews (KIIs). Data were analyzed using descriptive statistics and deductive qualitative analysis organized by the Consolidated Framework for Implementation Research (CFIR) and compared to identify areas of convergence and divergence.

Results

Although 90.3% of the 290 PLWH and hypertension at the clinics were taking antihypertensive medications, 52.8% had uncontrolled blood pressure. Results from facility assessments, surveys, and KIIs identified key determinants in the CFIR innovation and inner setting domains. Most stakeholders (> 85%) agreed that InterCARE was adaptable and compatible and would be successful at improving blood pressure control in PLWH and hypertension. HCPs agreed that there were insufficient resources (40%), consistent with facility assessments and KIIs, which identified limited staffing, inconsistent electricity, and a lack of supplies as key barriers. Adaptations to InterCARE included a task-sharing strategy and expanded treatment partner training and support.

Conclusions

Integrating hypertension services into HIV clinics was perceived as more advantageous for PLWH than the current model of hypertension care delivered outside of HIV clinics. Identified barriers were used to adapt InterCARE implementation strategies for more effective intervention delivery.

Trial registration

ClinicalTrials.gov, ClinicalTrials.gov Identifier: NCT05414526 . Registered 18 May 2022 – Retrospectively registered.


Contributions to the literature

Using existing HIV infrastructure through the integration of services is an intervention strategy that is gaining traction worldwide to address co-morbid hypertension and non-communicable diseases.

The Consolidated Framework for Implementation Research (CFIR) can be utilized to identify the determinants of implementation and guide adaptations of implementation strategies for integrated care.

We found that available resources and levels of staffing were significant barriers to intervention implementation, addressed by stakeholder engagement and a nurse task shifting strategy.

When developing a model of integrated care, including the perspectives of multiple key stakeholder groups is crucial to identifying determinants of implementation and adapting integration strategies.

The global availability of effective treatment for HIV has transformed HIV into a chronic condition. In sub-Saharan Africa, home to a third of global HIV infections, effective healthcare delivery models have resulted in multiple countries, including Botswana [ 1 , 2 ], approaching or achieving key global targets for the HIV care cascade: 95% aware of their diagnosis, 95% on antiretroviral therapy, and 95% with viral suppression [ 1 , 3 ]. In Botswana, where in 2017 an estimated 20.3% of the adult population had HIV [ 4 ], factors contributing to achieving these targets include having specialized HIV clinics [ 5 ], trained healthcare providers (HCPs) in HIV care, task shifting (e.g., licensing antiretroviral therapy nurse prescribers) [ 6 , 7 ], using electronic health records [ 8 ], making HIV medications free and accessible, and using treatment partners (i.e., participant-chosen peers who counsel and support people living with HIV (PLWH) in attending clinic appointments and taking their medications) [ 9 , 10 ].

As the survival of PLWH has improved, PLWH are developing other chronic conditions, such as cardiovascular disease (CVD). PLWH are twice as likely to develop CVD than people without HIV infection [ 11 , 12 ]. Hypertension is a leading risk factor for CVD in the general population and among PLWH globally [ 4 , 13 , 14 , 15 , 16 ]. A recent hypertension study nested within the Botswana HIV Combination Prevention Project found that nearly one-third of PLWH had hypertension [ 4 ]. Amongst these individuals, only 46.0% were aware of their hypertension diagnosis, and 42% of those aware were on hypertension treatment; 44% of those on treatment had controlled blood pressure, resulting in only 19% of all people living with HIV and with hypertension (PLWH and hypertension) in the study having attained blood pressure control [ 4 ].

Recently, the country’s focus has expanded from HIV alone to co-morbid disease management (e.g., hypertension) amongst PLWH. Building on the successes of HIV care, one evidence-based approach to improving hypertension management is the integration of hypertension prevention and treatment into existing longitudinal models of HIV care [ 17 , 18 , 19 ]. To address the gap in hypertension care for PLWH in Botswana, a bundle of strategies to achieve hypertension and HIV integration was designed: Integrating hypertension and Cardiovascular Care into Existing HIV Services Package in Botswana (InterCARE). This bundle involves hypertension care integrated into the HIV clinic supported by three evidence-based strategies: a) adapted Electronic Health Record to capture hypertension risk and treatment, b) provider training on hypertension diagnosis and management, and c) treatment partners to support adherence to hypertension care and treatment. While there is evidence of the effectiveness of these implementation strategies in HIV clinics in increasing uptake of antiretroviral therapy [ 6 , 7 ], there is less evidence on the acceptability of, appropriateness of, and factors affecting strategies for integrated hypertension and HIV care in Botswana. This study aimed to identify gaps in current care, summarize the factors affecting the implementation of InterCARE, and highlight the adaptations made to InterCARE implementation strategies prior to the pilot study. The results provided valuable insight into the determinants of implementation that informed tailoring of the InterCARE intervention to the local context prior to a two-stage type 2 hybrid effectiveness-implementation trial.

Overview of study design and methods

The InterCARE intervention trial is a two-stage type 2 hybrid effectiveness-implementation cluster randomized controlled trial of a multi-component, multi-level implementation intervention aimed at reducing the risk of CVD among adults with a dual diagnosis of HIV and hypertension followed in HIV clinics in Botswana. The first stage is a pilot study assessing the feasibility of implementing InterCARE. Prior to the pilot study, formative work was conducted to measure gaps in care, facility readiness, and factors affecting the implementation of InterCARE. These results informed adaptations to the InterCARE implementation strategies prior to the pilot study. We present a mixed methods convergent analysis of this formative work and describe the resulting adaptations.

Study setting

The HIV/Infectious Diseases Care Clinic (IDCC) model of care in Botswana, established in 2001 in response to the HIV pandemic, is an HIV clinic for PLWH that provides services limited to HIV management including antiretroviral therapy initiation, follow-up, and antiretroviral therapy failure management [ 5 ]. PLWH and hypertension separately attend the general clinic for the management of their hypertension and the remainder of their other healthcare conditions. Two public HIV clinics were chosen for pre-implementation data collection and the pilot study: a small clinic (S clinic) in the south of Botswana (n = 500 patients, rural, staffed primarily by nurses) and a large community clinic (L clinic) in the northeast of Botswana (n ≥ 3,000 patients, urban, staffed by nurses, doctors and a family nurse practitioner (FNP)). The clinics were chosen to optimize variability in the level of care and size across clinics in the primary health HIV care model in Botswana.

Study population

Eligibility and recruitment, baseline population.

PLWH and hypertension (prior diagnosis or newly diagnosed hypertension, defined as blood pressure ≥ 140/90 at the baseline study visit) aged 20–75 receiving HIV care at one of the two study sites were consecutively enrolled between October 2021 and November 2021 until a sample size of 290 individuals was achieved (n = 50 in S clinic and n = 240 in L clinic).

Pre-implementation stakeholders

Individuals were eligible to complete pre-implementation surveys based on the following criteria for each of the four groups: 1) all HCPs at both clinics, 2) PLWH and hypertension aged 20–75 receiving HIV care at one of the two study sites, 3) HIV treatment partners (e.g., participant-chosen peers who counsel and support PLWH in attending clinic appointments and taking their medications) aged 20–75 who were actively supporting PLWH at one of the two study clinics, and 4) community members (e.g., local leaders, local council members and leaders in the commercial sector) over the age of 20 who attended one of the two study clinics. PLWH and hypertension (n = 20, 50% female), HIV treatment partners (n = 20, 95% female) and HCPs (n = 20, 55% female) were consecutively recruited in person at each clinic site for the pre-implementation surveys, and community members (n = 40, 57.5% female) were purposively selected based on their role in the community until the targeted number was reached (Table 1, Supplemental Materials).

For key informant interviews (KIIs), a diverse sample of stakeholders with a wide range of perspectives was purposively selected from the stakeholders surveyed, based on their clinic location and availability to be interviewed (n = 2 HCPs, n = 3 community members, n = 2 treatment partners, n = 3 participants) (Table 2, Supplemental Materials). There were no refusals or dropouts for qualitative interviews.

Data collection

Implementation science framework.

The Consolidated Framework for Implementation Research (CFIR) was used to guide quantitative and qualitative data collection. The updated CFIR, published after data collection materials were developed, was used for analysis [ 20 ]. The study team selected the constructs of ‘Tailoring Strategies’ and ‘Adapting’ from the updated CFIR, as these were deemed relevant to understanding how the determinants of the implementation of the intervention were used to tailor and adapt the intervention prior to the pilot study (Table 3, Supplemental Materials) [ 21 ].

Data tools and collection

Facility readiness assessments were adapted from the World Health Organization package of essential noncommunicable disease interventions (WHO PEN) tool [ 22 , 23 ] and included data on the number of PLWH and hypertension seen at each clinic, staff responsibilities in hypertension management, and availability of clinic resources (e.g. trained staff, guidelines, equipment) (Table  1 ). Four trained research assistants (described in detail below) completed the facility assessments at each study site using observation, clinic logbooks, and pharmacy stock review.

Stakeholder surveys included socio-demographics, experiences with hypertension, and attitudes towards InterCARE (1 = strongly agree, 5 = strongly disagree). For HCPs and treatment partners, surveys also included items adapted from an existing questionnaire previously used in Botswana [ 24 ] regarding confidence in managing hypertension for HCPs and HIV for treatment partners (1 = very confident, 5 = not very confident).

Semi-structured KIIs were developed guided by CFIR and other factors (e.g., HIV stigma) known to affect HIV care from prior research. KIIs were intended to explore stakeholders’ understanding of and experiences with hypertension, challenges managing hypertension in the current system, and perceptions of the InterCARE intervention (acceptability, feasibility, and relative advantage) (Table 4, Supplemental Materials).

Survey and KII administration

Four university-educated research assistants (3 females, 1 male) fluent in both the local language, Setswana, and in English underwent survey administration and qualitative research training. These trained research assistants pilot tested surveys and KIIs on all key stakeholder groups for feasibility and readability. Informed consent of eligible stakeholders was collected prior to survey and KII administration. Surveys and KIIs were conducted anonymously by the same research assistants who had no prior relationships with the participants. Surveys and KIIs were conducted in English and/or Setswana, based on participant preference, for approximately 30 min in a private room at the clinic. Aside from the interviewee and interviewer, no other individuals were present in the room. KIIs were audio-recorded, transcribed, and translated to English where necessary. No field notes were made during the interviews. A native Setswana speaker (KS) from the study team independently verified the translations. No repeat interviews were carried out, and no transcripts were returned to the participants for comment or correction. A set number of interviews were planned based on time and resources, with no protocol to collect data until data saturation.

All PLWH and hypertension eligible and enrolled in the pilot phase of InterCARE underwent a baseline study visit with trained research assistants that included measurement of anthropometric data and collection of self-reported socio-demographic, economic, and clinical data. Three left arm blood pressure readings using an automated blood pressure cuff were taken during the initial study visit [ 25 ]. An average of the three blood pressure readings, weights, and height measurements were used in analysis. Participant health records were accessed by research assistants to collect co-morbidity, prescription, and laboratory data for enrolled participants.
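
To make the measurement handling concrete, the short sketch below shows one way the averaged readings and a body mass index value could be derived. This is an illustration only, with invented values; it is not the study's analysis code (which was written in Stata).

```python
# Illustrative only: averaging three automated left-arm readings and deriving BMI.
# All values are hypothetical and not taken from the study data.
readings_mmHg = [(132, 84), (128, 80), (130, 82)]   # (systolic, diastolic) triplets

mean_sbp = sum(s for s, _ in readings_mmHg) / len(readings_mmHg)
mean_dbp = sum(d for _, d in readings_mmHg) / len(readings_mmHg)

weight_kg, height_m = 70.0, 1.65
bmi = weight_kg / height_m ** 2                     # kg/m^2; obesity defined as BMI >= 30

print(f"mean BP {mean_sbp:.0f}/{mean_dbp:.0f} mmHg, BMI {bmi:.1f} kg/m^2")
```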

Data analysis

Quantitative.

Anthropometric, socio-demographic, and clinical (co-morbidities, laboratory data) data were summarized using descriptive statistics. Clinical data were used to calculate the WHO CVD Risk Score. As lipids were not readily available for most participants, the non-laboratory based risk charts were primarily used [ 26 ]. Uncontrolled blood pressure cut-offs were selected based on 2016 Botswana National Primary Care guidelines, defined as a systolic blood pressure of ≥ 140 or diastolic blood pressure ≥ 90 mm Hg in non-diabetic participants and a systolic blood pressure of ≥ 130 or diastolic blood pressure ≥ 80 mm Hg in participants with diabetes [ 27 ]. Chi-square, t test, and Wilcoxon’s rank sum statistics were calculated to compare participant characteristics between those with uncontrolled blood pressure and those with controlled blood pressure.
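
As an illustration of the blood pressure classification and group comparisons described above, the sketch below applies the stated guideline cut-offs and runs a chi-square test and a t test with SciPy. It is a hedged example with invented column names and toy data, not the authors' Stata code.

```python
# Illustrative sketch (not the authors' Stata code): classify blood pressure control
# using the 2016 Botswana Primary Care cut-offs described above, then compare a
# categorical and a continuous characteristic between groups. Data are invented.
import pandas as pd
from scipy import stats

def uncontrolled_bp(sbp: float, dbp: float, diabetic: bool) -> bool:
    """True if the averaged readings exceed the guideline threshold."""
    if diabetic:
        return sbp >= 130 or dbp >= 80
    return sbp >= 140 or dbp >= 90

df = pd.DataFrame({
    "sbp": [128, 152, 141, 118],      # mean of three readings (mmHg)
    "dbp": [78, 96, 88, 72],
    "diabetic": [False, False, True, False],
    "male": [1, 0, 1, 0],
    "cd4": [620, 410, 530, 700],
})
df["uncontrolled"] = df.apply(lambda r: uncontrolled_bp(r.sbp, r.dbp, r.diabetic), axis=1)

# Chi-square for a categorical characteristic (e.g., gender)...
_, p_gender, _, _ = stats.chi2_contingency(pd.crosstab(df["male"], df["uncontrolled"]))
# ...and a t test (a Wilcoxon rank-sum test would be used for skewed variables).
_, p_cd4 = stats.ttest_ind(df.loc[df.uncontrolled, "cd4"], df.loc[~df.uncontrolled, "cd4"])
print(p_gender, p_cd4)
```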

Descriptive statistics were used to summarize survey and facility assessment data. Based on the distribution of data, survey responses were re-categorized from Likert scales into a binary variable. Strongly agree or very confident (1) and agree or confident (2) were categorized as “agree” and “confident”, respectively. All other responses (3–5) were re-coded as “does not agree” or “not confident”, respectively. Data were organized by CFIR constructs, and each construct was coded as a facilitator (+), barrier (-), or both facilitator and barrier (±). All analyses were completed in Stata Statistical Software (version 17.0; StataCorp, College Station, TX, 2021).
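
The re-coding just described can be expressed as a small sketch: Likert responses of 1 or 2 collapse to "agree"/"confident", responses of 3 to 5 to "does not agree"/"not confident", and each construct is then tagged as +, -, or ±. The construct names follow the results section below; the numeric cut-offs used for tagging here are hypothetical, since the study team assigned the +/-/± judgments by reviewing the data rather than by a fixed rule, and Python is used only for illustration (the study's analyses were run in Stata).

```python
# Minimal sketch of the dichotomization and facilitator/barrier tagging described
# above. The 70%/30% thresholds are hypothetical and for illustration only.
import pandas as pd

def dichotomize(likert: int) -> str:
    return "agree" if likert in (1, 2) else "does not agree"

responses = pd.DataFrame({
    "construct": ["relative advantage"] * 4 + ["available resources"] * 4,
    "likert": [1, 2, 2, 4, 4, 5, 2, 3],
})
responses["binary"] = responses["likert"].map(dichotomize)

def tag(share_agree: float) -> str:
    if share_agree >= 0.7:
        return "+"        # facilitator
    if share_agree <= 0.3:
        return "-"        # barrier
    return "+/-"          # both facilitator and barrier

summary = (
    responses.groupby("construct")["binary"]
    .apply(lambda s: (s == "agree").mean())
    .map(tag)
)
print(summary)   # e.g., relative advantage -> "+", available resources -> "-"
```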

Qualitative

KIIs were transcribed, and directed deductive content analysis guided by CFIR was completed [ 28 ]. Two investigators (PG, NY) read all of the transcripts and independently manually coded the same two full transcripts in Microsoft Word to identify preliminary codes. After discussion of these codes and use of consensus strategies to resolve disagreements, an initial codebook was created and applied to transcripts. In subsequent meetings, a final codebook was agreed upon, and subthemes were expanded and mapped onto the updated CFIR domains and constructs [ 29 ]. Coding and subthemes were reviewed by an additional investigator, and both investigators met with senior investigators to reach consensus. Participants did not provide feedback on the findings.
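
As a toy illustration of how deductively assigned codes can be organized under the updated CFIR domains, consider the snippet below. The actual coding in this study was done manually in Microsoft Word; the construct names follow the results section, and the excerpt text here is invented.

```python
# Toy illustration (not the authors' workflow): grouping coded interview excerpts
# under updated CFIR domains. Excerpt text is invented for the example.
from collections import defaultdict

codebook = {
    "Innovation": ["relative advantage", "adaptability", "complexity", "design"],
    "Inner setting": ["work infrastructure", "compatibility",
                      "available resources", "access to knowledge and information"],
    "Individuals": ["innovation deliverers", "opinion leaders", "implementation facilitators"],
    "Outer setting": ["local attitudes"],
}

# Each coded excerpt is a (construct, quote) pair agreed on by the two coders.
coded_excerpts = [
    ("available resources", "The blood pressure machines rely on electricity..."),
    ("relative advantage", "There is a combined screening and consultation..."),
]

by_domain = defaultdict(list)
for construct, quote in coded_excerpts:
    domain = next(d for d, constructs in codebook.items() if construct in constructs)
    by_domain[domain].append((construct, quote))

for domain, items in by_domain.items():
    print(domain, items)
```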

Mixed method analysis

A convergent parallel design was used [ 30 , 31 ]. The study was designed, and data were collected and analyzed, according to best-practice recommendations for mixed methods research commissioned by the NIH Office of Behavioral and Social Sciences Research and authored by Creswell et al. Quantitative and qualitative data were collected simultaneously and analysed independently (Table 4, Supplemental Materials). The results were compared to identify areas of convergence and divergence. Findings were discussed with the research team to validate the results [ 32 ].

Adherence to reporting guidelines

The COREQ (Consolidated criteria for Reporting Qualitative research) checklist was used to report the qualitative methods and results in this manuscript.

Patient and public involvement

Patients and key stakeholders were engaged when designing survey tools and KIIs. Study findings were disseminated to local and Ministry of Health (MOH) clinical staff and key stakeholders.

Gaps in current care

Baseline data were collected on a total of 290 PLWH and hypertension (22.8% male, mean age 54 years (SD 11)) at the study clinics. HIV viral load was suppressed (< 400 copies/mL) in 97.5% of the population, and 72.5% had a last CD4 count above 500 cells/µL. Obesity (BMI ≥ 30 kg/m²) was present in 25.5% of the population. Most (90.3%) participants reported taking medications for hypertension, but only 47.2% had controlled blood pressure. The mean systolic blood pressure was 135 mmHg (SD 18 mmHg), and the mean diastolic blood pressure was 88 mmHg (SD 13 mmHg). Male gender (p < 0.01) and decreased CD4 count (p = 0.04) were significantly associated with uncontrolled blood pressure. There were no significant associations of blood pressure control with age, education, employment status, household income, viral load, or BMI (Table 2).

Determinants of implementation by CFIR domains

CFIR innovation domain: innovation relative advantage (+).

HCPs (85%) agreed that InterCARE would be more effective than current models of care at controlling hypertension in PLWH. In KIIs, HCPs, PLWH and hypertension, and treatment partners also believed that InterCARE would be advantageous in reducing transportation costs for PLWH and hypertension, improving medication adherence, and improving patient education compared to the current model of care (Table  3 ; Table 5, Supplemental Materials).

"Because there is a combined screening and consultation for HIV and cardiovascular diseases… [there is a reduced] wait time, [reduced] number of visits to the clinic…and patients are [equipped] with knowledge [to manage their hypertension] in regards to diet and exercise.” -Nurse

Innovation adaptability (+/-)

In surveys, over a third of HCPs agreed that it would be difficult to adapt InterCARE to meet the needs of PLWH and hypertension. In contrast, HCPs interviewed in KIIs felt it would not be difficult to adapt InterCARE to meet the needs of PLWH with hypertension since many services for both conditions were already available.

“You already have [the services] in place [independently] and now [they would be integrated], so I don’t think [it would be difficult] to maneuver around [services] and achieve integration.”

Innovation complexity (-)

In survey data, 25% of HCPs and 30% of community members believed that implementing InterCARE in the clinic would be too complex. Regarding the use of treatment partners, 38% of community members and 40% of treatment partners reported that it would be too complicated to have treatment partners manage both HIV and hypertension. There were no associated qualitative data collected.

Innovation design (+)

In surveys, treatment partners (80%), community members (98%), and HCPs (90%) thought that InterCARE would be successful at improving the treatment of hypertension for PLWH and hypertension. HCPs also agreed that the intervention would be easy to understand (85%) and would benefit PLWH and hypertension (95%). The KIIs also reflected these results, with a positive perception of the integration of HIV and hypertension services by all groups interviewed, including PLWH and hypertension.

“…all services must be done in same room (place) without a person moving from one place to another. Also, by giving medications at same place [for HIV and hypertension] without [telling us to] take AR[T] from one side and take high blood pressure medications from the other side, you have come with a good program by combining all of these services.”
-PLWH and hypertension

CFIR inner setting domain

Structural characteristics—work infrastructure (-/+).

Perspectives on staffing needs prior to implementing InterCARE were variable. In surveys, two-thirds of HCPs (65%) thought InterCARE would require too many staff and other resources.

In KIIs, diverging from the survey data, HCPs noted that, in the long term, consolidating clinic visits would decrease the clinical care load.

“We [as HCPs] deal with patients having to come for a certain service, and tomorrow they are coming for a different service…but knowing that [a patient] might come here once and still be able to get help for 2 or 3 ailments [could prevent staff from overworking and allow participants to utilize their clinic time more efficiently].”
-Public health officer

The facility readiness assessment identified the different models between clinics, with doctors predominantly initiating and managing hypertension in L clinic compared to more task sharing amongst nurses and doctors in the smaller, more remote S clinic (Table  1 ).

In KIIs, respondents noted that nurses typically lead new initiatives, a significant factor for successful implementation of InterCARE. A doctor was not always readily available in person, particularly at smaller clinics.

"We don’t have a medical doctor in the cluster so I would say nurses are the ones who take the lead for the new initiatives of improving the care"

HCPs discussed that the leadership structure within the clinic could serve as a facilitator to support implementation of InterCARE.

"We have leaders like district leaders, matrons, chief doctors, also [there] are cluster matrons and facility matrons…They will be able to take the rightful administrative steps…in order for the program to run smoothly."

Compatibility (+)

In surveys, 90% of HCPs agreed that InterCARE was compatible with the needs of PLWH and hypertension at the clinic. Nearly all treatment partners and community members agreed that treatment partners could successfully help participants manage their hypertension.

In KIIs, it was noted that the use of treatment partners for hypertension is also consistent with the current HIV treatment partner program. In addition, outside of HIV, family and friends informally fill the role of treatment partners for acute and chronic conditions. PLWH and hypertension had a positive view of treatment partners, describing their multi-faceted role in medication and clinic appointment reminders, disease counselling and emotional support.

“For instance, if somebody has been involved in a road traffic accident and has a fracture… [relatives or close partners take care of that patient so a treatment partner is used for many conditions].”
-Nurse

“I do not have any challenges since my treatment partner gives me full support towards my diet, what to/what not to do.”

Available resources (-)

Resource limitations in equipment were documented in the facility readiness assessment, including dysfunctional blood pressure machines and no hypertension medications at L clinic. In surveys, 40% of HCPs agreed that InterCARE would be problematic due to not having enough medical and supportive care resources to care for both HIV and hypertension. KIIs confirmed the lack of essential equipment at baseline (electricity, blood pressure cuffs, working scales) and re-iterated the importance of having available resources to implement InterCARE.

“The[re] is a regular cut of electricity [and] since our blood pressure machines rely on electricity it means that the checking of blood pressure and pulse rate may be compromised.”

In KIIs, stakeholders discussed medication stock-outs and long distances to the clinic, both contributing to defaulting on medications.

“…sometimes you will go to a clinic where the medication is out of stock, and you end up going to X where the medication will also be out of stock…Other challenges could be…travelling a long distance to come for [a] medication refill.”

Access to knowledge and information (-)

In the facility readiness assessment, prior HCP training on hypertension care and the availability of national hypertension guidelines and patient education materials were variable between the two clinics. In surveys, only three (15%) HCPs reported receiving hypertension training in the past two years. In KIIs, HCPs requested more dedicated hypertension training.

"I believe there can be a large-scale training for dispensers and nurses in the clinic, maybe you can lobby for it or recommend for it."

CFIR individuals domain

Innovation deliverers (-).

HCPs responsible for direct patient care (nurses, FNPs, doctors) had differing levels of confidence with different components of hypertension management. HCPs felt confident counseling patients on diet for hypertension and identifying uncontrolled hypertension. Few felt confident prescribing medications for hypertension (36.4%) or adjusting medications when hypertension is not controlled (27.3%).

Most treatment partners (70%) agreed that they had the confidence and knowledge to help patients manage HIV. Over half (60%) felt they were being expected to do too many things as a treatment partner for HIV, and most (90%) reported needing additional support to complete their job as a treatment partner for HIV (Table 6, Supplemental Materials). There were no associated qualitative data collected.

Opinion leaders (+)

In KIIs, there was an emphasis on the importance of communicating the intervention in public forums (e.g., kgotla – tribal council) through trusted community members (e.g., nurses and Kgosi – chief) to ensure public acceptance.

"The Kgosi [chiefs] are the gate keepers to the village so if they are receptive of [the] initiative, chances are people are going to be [accepting].”

Implementation facilitators (+)

In KIIs, champions or model patients (with HIV or hypertension) were identified as natural advocates who could promote the intervention, medication adherence, and healthy lifestyle behaviors in the community.

"They [model patients] are in a position to share their personal experiences [and positively influence their peers]. [Their peers] might be paranoid [about this new thing, but will join if they see that other members in the community are participating].

CFIR outer setting domain

Local attitudes (+/-).

HIV and hypertension related stigma were discussed in KIIs. For some, HIV stigma was seen as a major barrier to care. Certain clinic structures did not allow participants with HIV to remain anonymous, and some PLWH and treatment partners were unwilling to disclose their HIV status to peers. Others reported more acceptance of HIV. No stakeholder reported any hypertension related stigma when prompted.

“Yes, stigma is very common... Even at my household there is so much stigma, I always hear [my family] criticising people living with HIV…”
-Treatment partner
“No sir there is no stigma [against high blood pressure]…this thing is now common [and is the] same as taking ARV or taking any pills…You see when AIDS started [there was stigma]. These days a person can go in public saying “I’m going to charge meaning AR[T]’’

Additional qualitative themes and associated quotations are summarized in Table 7, Supplemental Materials.

CFIR implementation process domain

Prior to implementation and at the start of the pilot study, the mixed methods results were reviewed and discussed by the study team. These results supported existing InterCARE intervention strategies and guided tailoring and adapting prior to the pilot study (Table  4 ). The updated CFIR implementation process domain was used to characterize these tailored strategies and adaptations.

Tailoring strategies

To improve the treatment partner strategy and bridge gaps in access to knowledge and information, efforts were made to expand training provided to treatment partners. In addition to the existing one-on-one instruction and support from HCPs and research staff, training videos were created by the study staff specifically for treatment partners.

To strengthen the provider training strategy and address staffing shortages reported, the intervention was adapted and a task sharing strategy was added by the study team. Task sharing procedures to split hypertension management tasks amongst nurses (e.g., vitals, participant education, follow-up care, CVD screening) and FNPs and doctors (e.g., medication initiation, complex hypertension care) were included in training. HCPs were also supported and coached by the study nurse and study physician to improve confidence in managing hypertension, particularly prescribing and adjusting medications.

Outside of the three core components of InterCARE, other strategies that were identified as crucial for successful implementation included engaging and partnering with community leaders, clinic leadership, and key stakeholders. Senior members of the study staff met with the chiefs and local leadership to discuss the study to gain support and suggestions to ensure effective implementation prior to commencing the pilot study. To address available resources, meetings between senior study staff and clinic leadership and engagement of the MOH were necessary to ensure that electricity, working equipment, and medications were available.

Not all facilitators were used to tailor and adapt strategies, and not all barriers were addressed within the CFIR implementation process domain. For example, HCPs identified model patients as facilitators to mobilize community participation. However, due to possible stigma towards having HIV and hypertension expressed by PLWH and hypertension, the team chose not to include any adaptations that incorporated the use of model patients.

In this study population in Botswana, we found that while nearly all of the PLWH and hypertension enrolled were already on antihypertensive treatment, over half had uncontrolled blood pressure. Integrated HIV and hypertension care as implemented through InterCARE was perceived to be an advantageous, compatible intervention design by participants, HCPs, treatment partners, and community members to address the gap in effective delivery of hypertension treatment and blood pressure control amongst PLWH. Significant barriers to implementation of InterCARE arose in the CFIR inner setting constructs of available resources, structural infrastructure (e.g., levels of staffing), and access to knowledge and information which informed the adaptations for InterCARE implementation strategies prior to the pilot study.

The intervention design was viewed favorably, particularly the treatment partner component. Stakeholders expressed their familiarity with treatment partners and recognized its compatibility with the existing HIV clinic infrastructure in Botswana. Recognizing the complex task of integrating hypertension and HIV care, most studies including InterCARE use multiple implementation strategies to achieve integration [ 17 , 33 , 34 ]. The three components of the InterCARE intervention (treatment partners, modified EHR, provider training), particularly the treatment partner component, were chosen carefully and specifically for Botswana based on the strategies that had been successful in controlling the HIV epidemic in Botswana, and to optimize existing clinic structures [ 5 , 7 , 8 , 9 ].

Identified barriers to intervention implementation included available resources and insufficient staffing for the workload required. These findings are consistent with other studies conducted in Africa [ 29 , 35 ]. In a study conducted at three HIV clinics of varying hypertension care cascade performance in Uganda, major barriers to hypertension and HIV integrated care included lack of available resources (e.g., functional blood pressure machines), an inadequate supply of antihypertensive medications, and concern regarding extra workload to HCPs [ 29 ]. For our pilot study, we incorporated task shifting strategies in our provider training to more efficiently use existing staff and better adapt to existing clinic work infrastructure without increasing workload. This evidence-based approach leveraging existing human resources has also been used in other African settings to address staff shortages [ 35 , 36 , 37 ].

Another significant barrier to intervention implementation was limited access to knowledge and information for HCPs. The importance of provider training, and the limited opportunities for continuing professional development, are a common thread across many studies integrating HIV and hypertension services [ 35 ]. In a mixed-methods study of the implementation of a task-strengthening strategy for hypertension and HIV control, HCPs from 29 HIV clinics in Lagos, Nigeria noted that a sub-optimal proportion of clinics (52%) reported access to hypertension training materials. Training was desired as long as it did not overburden HCPs [ 36 ]. This study had consistent findings, highlighting the need for dedicated provider training in HIV/hypertension integration interventions [ 18 , 38 , 39 , 40 ], which is a key component of the InterCARE intervention. In addition to providing continuing professional development (CPD) opportunities through provider training, one-on-one coaching and supportive supervision were added as evidence-based strategies [ 41 , 42 , 43 ] aimed at addressing low confidence amongst HCPs.

A strength of this study design was the use of the updated CFIR to identify determinants of implementation and guide adaptations of implementation strategies. This is one of the few studies to use the updated CFIR to assess factors that influence hypertension and HIV care integration [ 29 , 36 ]. This study demonstrates the utility of the updated CFIR as a guiding framework for systematic implementation of integrated care in settings outside of Botswana [ 20 , 44 ]. CFIR has been used less frequently in low- and middle-income countries (LMICs) compared to high-income countries (HICs), and there is a growing body of literature on CFIR expansions and modifications that better fit the global context. Means et al. add CFIR domains and constructs that address the scalability and sustainability of an intervention and acknowledge relationships between CFIR domains and constructs [ 45 ]. This is particularly valuable in settings like Botswana, where resource limitations (e.g., internet connectivity, electricity, medication availability) that are barriers to implementation of the intervention will likely also be barriers to scaling and sustaining it. Addressing these limitations will likely require complementary strategies across multiple CFIR domains.

Another strength of this study was the addition of the community stakeholder perspective. Community acceptance plays a central role in health care provision in Botswana [ 46 , 47 ]. Local leaders and clinic staff are important facilitators in sensitizing the population to, and promoting, integrated care. The central role that chiefs and the kgotla, the community council, play in the lives of the citizens of Botswana has long been recognized in National MOH Guidelines in Botswana as a key facilitator in healthcare delivery [ 46 ]. The addition of a strategy to engage and partner with key stakeholders, including the MOH, clinic leadership, and community leadership can serve to strengthen the delivery of InterCARE.

Limitations of this study design include the small sample size and limited perspectives in KIIs from HCPs who were prescribers of antiretroviral therapy and antihypertensives (doctors and FNPs). Time and resource limitations prevented the collection of more KIIs for all groups, which would have helped achieve data saturation and a more comprehensive assessment of barriers and facilitators to each of the three major strategies. Another limitation was potential selection bias, as all stakeholders, including community members, were recruited from the HIV clinic setting. In future studies, community recruitment may help gather a more representative sample of opinions. Our qualitative sample size may also have contributed to divergent quantitative and qualitative data for a few CFIR constructs. Because this was formative work ahead of a pilot study, saturation of themes may not have been achieved for some CFIR constructs, and the full diversity of convergent and divergent opinions may not be reflected. For example, for the CFIR innovation domain construct of adaptability, one third of participants agreed that InterCARE would be difficult to adapt, but the remainder disagreed with this statement. It is possible that the sampling of participants for KIIs was biased and reflected only the opinions of those who disagreed with this statement.

For other constructs, divergent data may have arisen due to differences in how participants understood the survey questions versus the KII questions. However, because we used a convergent parallel mixed methods study design, in which qualitative and quantitative data were collected during the same study phase, we were not able to use the qualitative data to help explain the quantitative data (as would have been possible with an explanatory sequential mixed methods design) [ 30 ].

Finally, this study only includes perceived determinants of implementation based on a description of the planned implementation of InterCARE. Further insights will be obtained during the two-stage type 2 hybrid effectiveness-implementation trial.

While this study focused on key facilitators and barriers that were addressed directly, further research is needed on the potential indirect effects of integrated care. In KIIs, stakeholders discussed the role of HIV stigma in care-seeking behaviors. Receiving hypertension care in HIV-specific clinics may exacerbate HIV stigma, affecting both HIV and hypertension care. Adding hypertension services to HIV clinics may not adequately address HIV-related stigma, which will be an important factor to explore further during the pilot and randomized controlled study [ 48 ].

Botswana, like many other LMICs, is at a crossroads moving forward from the devastation of rampant, poorly controlled HIV decades earlier. Now, faced with a growing epidemic of hypertension, the country needs immediate action to better control hypertension in this setting. Utilizing existing HIV infrastructure through integration of services is an intervention strategy that is gaining traction worldwide [ 18 , 49 ]. Our study found that the integration of hypertension and HIV services through the InterCARE intervention was viewed positively. Implementation strategies of key stakeholder engagement and partnership, provider supervised support and coaching, and task sharing can be used to address major barriers, utilize facilitators, and strengthen the existing components of InterCARE. These strategies along with the core InterCARE strategies are now being tested in a nation-wide two-stage type 2 hybrid effectiveness-implementation cluster randomized controlled trial.

Integrating hypertension services in HIV clinics is a feasible and acceptable intervention that is envisioned to be effective in better controlling blood pressure, with potential advantages over the current standard of care. Barriers exist, but strategies to address these can be successfully adapted and will be tested in the planned pilot study.

Availability of data and materials

This study is in compliance with the NIH Public Access Policy, which ensures that the public has access to the published results of NIH funded research. All results have been (and will be made) available from final peer-reviewed journal manuscripts (including this one) via the digital archive PubMed Central upon acceptance for publication.

Abbreviations

PLWH: People living with HIV

CVD: Cardiovascular diseases

IDCC: Infectious Diseases Care Clinic

BCPP: Botswana HIV Combination Prevention Project

REDCap: Research Electronic Data Capture

EHR: Electronic health records

CFIR: Consolidated Framework for Implementation Research

Bachanas P, Alwano MG, Lebelonyane R, et al. Finding, treating and retaining persons with HIV in a high HIV prevalence and high treatment coverage country: results from the Botswana Combination Prevention Project. PLoS ONE. 2021;16(4):e0250211. https://doi.org/10.1371/journal.pone.0250211 .


Statistics Botswana. Fifth Botswana AIDS Impact Survey (BAIS V) Preliminary Report. Gaborone, Botswana: Republic of Botswana; 6 September 2022. ( https://www.statsbots.org.bw/sites/default/files/BAIS%20V%20Preliminary%20Report.pdf ).

Levi J, Raymond A, Pozniak A, Vernazza P, Kohler P, Hill A. Can the UNAIDS 90–90-90 target be achieved? A systematic analysis of national HIV treatment cascades. BMJ Glob Health. 2016;1(2):e000010. https://doi.org/10.1136/bmjgh-2015-000010 .


Mosepele M. High prevalence of HTN in HIV-infected and HIV-uninfected adults in Botswana. Conference on Retroviruses and Opportunistic Infections (CROI) Boston 2018.

Wester CW, Bussmann H, Avalos A, et al. Establishment of a public antiretroviral treatment clinic for adults in urban Botswana: lessons learned. Clin Infect Dis. 2005;40(7):1041–4. https://doi.org/10.1086/428352 .


Ledikwe JH, Kejelepula M, Maupo K, et al. Evaluation of a well-established task-shifting initiative: the lay counselor cadre in Botswana. PLoS ONE. 2013;8(4):e61601. https://doi.org/10.1371/journal.pone.0061601 .

Bussmann C, Rotz P, Ndwapi N, et al. Strengthening healthcare capacity through a responsive, country-specific, training standard: the KITSO AIDS training program’s support of Botswana’s national antiretroviral therapy rollout. Open AIDS J. 2008;2:10–6. https://doi.org/10.2174/1874613600802010010 .

Galani M HD, Tibben W, Letsholo KJ. Improving continuity of HIV/AIDS care through electronic health records in resource-limited settings: a Botswana perspective. Health Policy Technol. 2021;10(2). https://doi.org/10.1016/j.hlpt.2021.03.001 .

Bogart LM, Mosepele M, Phaladze N, et al. A social network analysis of HIV treatment partners and patient viral suppression in Botswana. J Acquir Immune Defic Syndr. 2018;78(2):183–92. https://doi.org/10.1097/QAI.0000000000001661 .

Ramiah I, Reich MR. Public-private partnerships and antiretroviral drugs for HIV/AIDS: lessons from Botswana. Health Aff. 2005;24(2). https://doi.org/10.1377/hlthaff.24.2.545 .

Hsue PY, Waters DD. Time to recognize HIV infection as a major cardiovascular risk factor. Circulation. 2018;138(11):1113–5. https://doi.org/10.1161/CIRCULATIONAHA.118.036211 .

Delabays B, Cavassini M, Damas J, et al. Cardiovascular risk assessment in people living with HIV compared to the general population. Eur J Prev Cardiol. 2022;29(4):689–99. https://doi.org/10.1093/eurjpc/zwab201 .

Sarfo FS, Nichols M, Singh A, et al. Characteristics of hypertension among people living with HIV in Ghana: Impact of new hypertension guideline. J Clin Hypertens (Greenwich). 2019;21(6):838–50. https://doi.org/10.1111/jch.13561 .


Xu Y, Chen X, Wang K. Global prevalence of hypertension among people living with HIV: a systematic review and meta-analysis. J Am Soc Hypertens. 2017;11(8):530–40. https://doi.org/10.1016/j.jash.2017.06.004 .

Bigna JJ, Ndoadoumgue AL, Nansseu JR, et al. Global burden of hypertension among people living with HIV in the era of increased life expectancy: a systematic review and meta-analysis. J Hypertens. 2020;38(9):1659–68. https://doi.org/10.1097/HJH.0000000000002446 .

Dzudie A, Hoover D, Kim HY, et al. Hypertension among people living with HIV/AIDS in Cameroon: a cross-sectional analysis from Central Africa International Epidemiology Databases to Evaluate AIDS. PLoS ONE. 2021;16(7):e0253742. https://doi.org/10.1371/journal.pone.0253742 .

Birungi J, Kivuyo S, Garrib A, et al. Integrating health services for HIV infection, diabetes and hypertension in sub-Saharan Africa: a cohort study. BMJ Open. 2021;11(11):e053412. https://doi.org/10.1136/bmjopen-2021-053412 .

McCombe G, Lim J, Hout MCV, et al. Integrating care for diabetes and hypertension with HIV care in sub-Saharan Africa: a scoping review. Int J Integr Care. 2022;22(1):6. https://doi.org/10.5334/ijic.5839 .

McCombe G, Murtagh S, Lazarus JV, et al. Integrating diabetes, hypertension and HIV care in sub-Saharan Africa: a Delphi consensus study on international best practice. BMC Health Serv Res. 2021;21(1):1235. https://doi.org/10.1186/s12913-021-07073-0 .

Damschroder LJ, Reardon CM, Widerquist MAO, Lowery J. The updated consolidated framework for implementation research based on user feedback. Implement Sci. 2022;17(1):75. https://doi.org/10.1186/s13012-022-01245-0 .

Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4:50. https://doi.org/10.1186/1748-5908-4-50 .

Mutale W, Bosomprah S, Shankalala P, et al. Assessing capacity and readiness to manage NCDs in primary care setting: gaps and opportunities based on adapted WHO PEN tool in Zambia. PLoS ONE. 2018;13(8):e0200994. https://doi.org/10.1371/journal.pone.0200994 .

World Health Organization. Implementation tools: Package of Essential Noncommunicable (PEN) disease interventions for primary health care in low-resource settings. Geneva, Switzerland: World Health Organization; 2013.


Gala P, Seth B, Moshokgo V, et al. Confidence and performance of health workers in cardiovascular risk factor management in rural Botswana. Lancet Glob Health. 2019;7(S13) (abstract). https://doi.org/10.1016/S2214-109X(19)30098-1 .

Whelton PK, Carey RM, Mancia G, Kreutz R, Bundy JD, Williams B. Harmonization of the American College of Cardiology/American Heart Association and European Society of Cardiology/European Society of Hypertension blood pressure/hypertension guidelines. Eur Heart J. 2022;43(35):3302–11. https://doi.org/10.1093/eurheartj/ehac432 .

WHO CVD Risk Chart Working Group. World Health Organization cardiovascular disease risk charts: revised models to estimate risk in 21 global regions. Lancet Glob Health. 2019;7(10):e1332–45. https://doi.org/10.1016/S2214-109X(19)30318-3 .


Tsima BM, Setlhare V, Nkomazana O. Developing the Botswana Primary Care Guideline: an integrated, symptom-based primary care guideline for the adult patient in a resource-limited setting. J Multidiscip Healthc. 2016;9:347–54. https://doi.org/10.2147/JMDH.S112466 .

Assarroudi A, HeshmatiNabavi F, Armat MR, Ebadi A, Vaismoradi M. Directed qualitative content analysis: the description and elaboration of its underpinning methods and data analysis process. J Res Nurs. 2018;23(1):42–55. https://doi.org/10.1177/1744987117741667 .

Muddu M, Tusubira AK, Nakirya B, et al. Exploring barriers and facilitators to integrated hypertension-HIV management in Ugandan HIV clinics using the Consolidated Framework for Implementation Research (CFIR). Implement Sci Commun. 2020;1:45. https://doi.org/10.1186/s43058-020-00033-5 .

Creswell JW, Plano Clark VL. Designing and Conducting Mixed Methods Research. 3rd ed. Thousand Oaks: SAGE; 2018.

Gaglio B, Henton M, Barbeau A, et al. Methodological standards for qualitative and mixed methods patient centered outcomes research. BMJ. 2020;371:m4435. https://doi.org/10.1136/bmj.m4435 .

Palinkas LA, Aarons GA, Horwitz S, Chamberlain P, Hurlburt M, Landsverk J. Mixed method designs in implementation research. Adm Policy Ment Health. 2011;38(1):44–53. https://doi.org/10.1007/s10488-010-0314-z .

Muddu M, Semitala FC, Kimera I, et al. Improved hypertension control at six months using an adapted WHO HEARTS-based implementation strategy at a large urban HIV clinic in Uganda. BMC Health Serv Res. 2022;22(1):699. https://doi.org/10.1186/s12913-022-08045-8 .

Kwarisiima D, Atukunda M, Owaraganise A, et al. Hypertension control in integrated HIV and chronic disease clinics in Uganda in the SEARCH study. BMC Public Health. 2019;19(1):511. https://doi.org/10.1186/s12889-019-6838-6 .

Njuguna B, Vorkoper S, Patel P, et al. Models of integration of HIV and noncommunicable disease care in sub-Saharan Africa: lessons learned and evidence gaps. AIDS. 2018;32(Suppl 1):S33–42. https://doi.org/10.1097/QAD.0000000000001887 .

Iwelunmor J, Ezechi O, Obiezu-Umeh C, et al. Factors influencing the integration of evidence-based task-strengthening strategies for hypertension control within HIV clinics in Nigeria. Implement Sci Commun. 2022;3(1):43. https://doi.org/10.1186/s43058-022-00289-z .

Aifah A, Onakomaiya D, Iwelunmor J, et al. Nurses’ perceptions on implementing a task-shifting/sharing strategy for hypertension management in patients with HIV in Nigeria: a group concept mapping study. Implement Sci Commun. 2020;1:58. https://doi.org/10.1186/s43058-020-00048-y .

Muddu M, Ssinabulya I, Kigozi SP, et al. Hypertension care cascade at a large urban HIV clinic in Uganda: a mixed methods study using the Capability, Opportunity, Motivation for Behavior change (COM-B) model. Implement Sci Commun. 2021;2(1):121. https://doi.org/10.1186/s43058-021-00223-9 .

Matanje Mwagomba BL, Ameh S, Bongomin P, et al. Opportunities and challenges for evidence-informed HIV-noncommunicable disease integrated care policies and programs: lessons from Malawi, South Africa, Swaziland and Kenya. AIDS. 2018;32(Suppl 1):S21–32. https://doi.org/10.1097/QAD.0000000000001885 .

Patel P, Speight C, Maida A, et al. Integrating HIV and hypertension management in low-resource settings: lessons from Malawi. PLoS Med. 2018;15(3):e1002523. https://doi.org/10.1371/journal.pmed.1002523 .

Schwerdtle P, Morphet J, Hall H. A scoping review of mentorship of health personnel to improve the quality of health care in low and middle-income countries. Global Health. 2017;13(1):77. https://doi.org/10.1186/s12992-017-0301-1 .

Arsenault C, Rowe SY, Ross-Degnan D, et al. How does the effectiveness of strategies to improve healthcare provider practices in low-income and middle-income countries change after implementation? Secondary analysis of a systematic review. BMJ Qual Saf. 2022;31(2):123–33. https://doi.org/10.1136/bmjqs-2020-011717 .

Iyasere CA, Baggett M, Romano J, Jena A, Mills G, Hunt DP. Beyond continuing medical education: clinical coaching as a tool for ongoing professional development. Acad Med. 2016;91(12):1647–50. https://doi.org/10.1097/ACM.0000000000001131 .

King DK, Shoup JA, Raebel MA, et al. Planning for implementation success using RE-AIM and CFIR frameworks: a qualitative study. Front Public Health. 2020;8:59. https://doi.org/10.3389/fpubh.2020.00059 .

Means AR, Kemp CG, Gwayi-Chore MC, et al. Evaluating and optimizing the consolidated framework for implementation research (CFIR) for use in low- and middle-income countries: a systematic review. Implement Sci. 2020;15(1):17. https://doi.org/10.1186/s13012-020-0977-0 .

Ministry of Health and Wellness. National guideline for implementation of integrated community-based health services in Botswana. Gaborone: Government of Botswana; 2020.

Allen T, Heald S. HIV/AIDS policy in Africa: what has worked in Uganda and what has failed in Botswana? J Int Dev. 2004;16:1141–54. https://doi.org/10.1002/jid.1168 .

Ameh S, D’Ambruoso L, Gomez-Olive FX, Kahn K, Tollman SM, Klipstein-Grobusch K. Paradox of HIV stigma in an integrated chronic disease care in rural South Africa: Viewpoints of service users and providers. PLoS ONE. 2020;15(7):e0236270. https://doi.org/10.1371/journal.pone.0236270 .

Osetinsky B, Hontelez JAC, Lurie MN, et al. Epidemiological and health systems implications of evolving HIV and hypertension in South Africa and Kenya. Health Aff (Millwood). 2019;38(7):1173–81. https://doi.org/10.1377/hlthaff.2018.05287 .


Acknowledgements

We would like to acknowledge Dr. Karen Steger-May, Research Coordinating Center of the HLB-SIMPLe Alliance, for her continued support and guidance with this study and the InterCARE hybrid type II randomized controlled study. We would also like to thank Dr. Kara Bennett, Dr. Shabbar Jaffar, and Dr. Kathleen Wirtz Hurwitz for their statistical support and guidance on this project. Finally, we would also like to thank the HLB-SIMPLe Alliance.

The content of this manuscript is solely the responsibility of the authors and does not necessarily reflect the views of the National Heart, Lung, and Blood Institute, Fogarty International Center, or the United States Department of Health and Human Services.

This study was conducted as a part of the HLB-SIMPLe Alliance. The HLB-SIMPLe Alliance was sponsored by the National Heart, Lung and Blood Institute and funded under grant numbers U24HL154426 and UG3HL154499 with the U.S. Department of Health and Human Services, National Institutes of Health, National Heart, Lung and Blood Institute (NIH/NHLBI). NIH/NHLBI’s role in the study design, collection, or analysis of data was limited to the consultative interactions provided by the Project Scientist and Clinical Trial Specialist.

Author information

Pooja Gala and Ponego Ponatshego are co-first authors.

Tendani Gaolathe, Lisa R. Hirschhorn and Mosepele Mosepele are co-last authors.

Authors and Affiliations

Department of Medicine, NYU Langone Grossman School of Medicine, New York, NY, USA

Department of Internal Medicine, Faculty of Medicine, University of Botswana, Gaborone, Botswana

Ponego Ponatshego, Thato Moshomo, Tendani Gaolathe & Mosepele Mosepele

Botswana Harvard AIDS Institute Partnership, Gaborone, Botswana

RAND Corporation, Santa Monica, CA, USA

Laura M. Bogart

Department of Clinical Research, Faculty of Infectious and Tropical Diseases, London School of Hygiene and Tropical Medicine, London, UK

Nabila Youssouf

Government of Botswana, Ministry of Health and Wellness, Gaborone, Botswana

Mareko Ramotsababa, Evelyn Dintwa, Khumo Seipone, Tendani Gaolathe & Mosepele Mosepele

Department of Medical Social Sciences, Northwestern University, Chicago, IL, USA

Amelia E. Van Pelt & Lisa R. Hirschhorn

Center for Translation Research and Implementation Science, Department of Health and Human Services, National Heart, Lung and Blood Institute, National Institutes of Health, Bethesda, MD, USA

Maliha Ilias & Veronica Tonwe


Contributions

PG, PP, NY, LB, KH, TG, LH, MM, and KB designed the study. NY, ED, KS, KB, and KH acquired and maintained the data. PG, PP, NY, LB, KH, LH, MM, KB analyzed and interpreted the data. PG, PP, LH, MM wrote the manuscript. PG, PP, NY, MR, AVP, TM, MI, VT, TG, LH, MM revised the manuscript. All authors approved the final manuscript.

Corresponding author

Correspondence to Pooja Gala .

Ethics declarations

Ethics approval and consent to participate.

All procedures were approved by the Institutional Review Board at the University of Botswana and the Botswana Ministry of Health (MoH) Research and Development Committee. Patients received materials for informed consent and were consented by trained research assistants to participate in the study.

Consent for publication

Not required.

Competing interests

The authors have declared that no competing interests exist.

Additional information

Publisher’s note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Supplementary Material 1.

Supplementary Material 2.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article

Cite this article.

Gala, P., Ponatshego, P., Bogart, L.M. et al. A mixed methods approach identifying facilitators and barriers to guide adaptations to InterCARE strategies: an integrated HIV and hypertension care model in Botswana. Implement Sci Commun 5 , 67 (2024). https://doi.org/10.1186/s43058-024-00603-x


Received: 18 October 2023

Accepted: 09 June 2024

Published: 20 June 2024

DOI: https://doi.org/10.1186/s43058-024-00603-x


  • Hypertension
  • Low- and middle-income countries
  • Integrated care
  • Implementation science


Cultural Competence in Ophthalmic Dispensing Education: A Qualitative Study


Authors: Buthelezi S, Gerber B

Received 25 September 2023

Accepted for publication 13 December 2023

Published 20 June 2024 Volume 2024:15 Pages 585—594

DOI https://doi.org/10.2147/AMEP.S438707


Editor who approved publication: Dr Md Anwarul Azim Majumder

Sanele Buthelezi,1,2 Berna Gerber3

1 Department of Optometry, Faculty of Health Sciences, University of Johannesburg, Johannesburg, South Africa; 2 Department of Health Professions Education, Faculty of Medicine and Health Sciences, Stellenbosch University, Cape Town, South Africa; 3 Division of Speech-Language and Hearing Therapy, Faculty of Medicine and Health Sciences, Cape Town, South Africa

Correspondence: Sanele Buthelezi, Department of Optometry, Faculty of Health Sciences, University of Johannesburg, P.O Box 17077, Doornfontein, Johannesburg, 2028, South Africa, Tel +27 11 559 6387, Email [email protected]

Purpose: Understanding and acknowledging cultural diversity in healthcare is essential in providing culturally competent care. Higher education institutions are critical to providing students with the necessary knowledge, attitudes, and skills to respond to cultural diversity in various contexts. Cultural competence teaching in ophthalmic dispensing education has emerged as an essential concept that needs to be included in the curriculum. This study explored ophthalmic dispensing lecturers’ understandings, experiences, and attitudes in teaching cultural competence.

Methods: This study used a qualitative approach within an interpretivist paradigm by conducting semi-structured interviews with lecturers (n = 7) in the ophthalmic dispensing program. Braun and Clarke’s framework for thematic analysis was utilized. The research was conducted at an ophthalmic dispensing department at a South African university.

Results: The analysis of the semi-structured interviews indicated three main themes of importance regarding factors influencing cultural competence education in the ophthalmic dispensing curriculum: the interplay between experiences and understandings of cultural competence, cross-cultural exposure and teaching practices, and inclusion of cultural competence into the curriculum. The participants recognized that cultural competence was not explicitly included in the curriculum. Including culture in education was rather unsystematic and, in most cases, unplanned.

Conclusion: Further training of lecturers on cultural competence skills and evidence-based teaching and assessment strategies are required to assist in developing curricula that include cultural competence.

Keywords: ophthalmic education, cross-cultural exposure, cultural competence, ophthalmic dispensing teachers, South Africa

Introduction

Ophthalmic dispensing professionals and their patients are often from different cultural backgrounds. Recognizing cultural diversity in a healthcare setting is paramount in providing quality healthcare, including eyecare. 1 Healthcare professionals, including ophthalmic dispensing professionals, must provide services that consider their patients’ cultural values, beliefs, and practices. 1 , 2 These practitioners are encouraged to be critically aware of their own cultural backgrounds, values, and beliefs, and not to impose these on their patients and thereby create toxic cross-cultural differences. 3

Cultural competence has been defined as:

the process by which individuals and systems respond respectfully and effectively to people of all cultures, languages, classes, races and ethnic backgrounds in a way that recognizes, affirms, and values the worth of the individual and protects and preserves the dignity of each.

The lack of cultural competence within an individual or a health organization may negatively affect patients’ health outcomes. 1 , 6 Due to cross-cultural differences, misunderstandings and poor communication may arise, resulting in poor treatment adherence, lack of trust, poor patient rapport, and pronounced health disparities. 7 , 8 These may be further exacerbated by social, financial, political, and geographical factors. 9

Depending on their location, healthcare facilities in South Africa may still show remnants of the apartheid era. 10 During the apartheid era, the healthcare system was fragmented and discriminatory towards different racial groups. 11 Because of the unbalanced distribution of the workforce, healthcare professionals working in specific communities may not be representative of their patient base; this is especially true of eye care professionals such as optometrists and ophthalmic dispensing professionals, who are concentrated in urban areas. 12–15 Uncorrected refractive errors and other avoidable ocular anomalies are consequently common in rural areas, where the eye care workforce is limited. This unequal distribution of health professionals is further exacerbated by the lack of compulsory community service for newly qualified optometry and ophthalmic dispensing graduates. 16

Patients continue to report discrimination, preferential treatment, and prejudice, and health professionals themselves report implicit bias. 17 , 18 These reports occur not only in South Africa but globally. The COVID-19 pandemic has amplified health inequalities, with disadvantaged groups disproportionately affected. 19 , 20 Poor communities continue to suffer due to limited access to healthcare services. Because of these issues of cultural difference and health disparity, the need to address diversity and cultural awareness in health professions education has been gaining significance. 21

Culturally competent healthcare professionals and health systems enable better quality healthcare for patients with diverse beliefs, values, and practices. Cultural competence education has also been recognized as a strategy for reducing health disparities. Higher education institutions must ensure that health professionals develop the clinical and cultural competence necessary for clinical practice. 22 , 23 Professional bodies have recognized the importance of this, and some have developed guidelines to assist program developers and educators with cultural competence education. 24 , 25 However, anecdotal evidence suggests that health professions educators struggle to include cultural competence concepts in the curriculum. 26 , 27 There are several reasons for this, such as an already crowded curriculum, lack of time, lack of institutional support for cultural competence, and academic staff feeling underprepared to provide cultural competence education. 28

To address some of these challenges, several changes are likely required to include cultural competence in the ophthalmic dispensing curriculum. 29 In ophthalmic dispensing education, lecturers who provide theoretical and clinical education play an essential role in developing cultural competence among their students. However, minimal literature exists that seeks to understand lecturers’ perceptions and attitudes toward including cultural competence in a curriculum, particularly in ophthalmic dispensing education and in South Africa. This study therefore aimed to explore ophthalmic dispensing lecturers’ understandings, experiences, and attitudes toward including cultural competence in ophthalmic dispensing education, and potentially to develop recommendations for the curriculum.

Research Design

This qualitative study used an interpretive research paradigm to explore ophthalmic dispensing lecturers’ understandings, experiences, and attitudes regarding the teaching of cultural competence.

Study Participants and Study Context

In the South African context, dispensing opticians are described as:

autonomous, regulated (licensed/registered) health professionals who dispense and fit spectacles and other optical aids, working from prescriptions written by optometrists. 30

The ophthalmic dispensing program is offered at only one university in Africa. The program is a 3-year diploma, and graduates are referred to as dispensing opticians, who then register with the professional body and practice independently.

Data Collection Methods

Data were collected through individual semi-structured interviews with the participating lecturers, using an interview guide developed by the researchers.

Data Analysis

The researchers recorded and transcribed the interviews verbatim to ensure data familiarization. Thematic analysis following Braun and Clarke’s approach was applied and began concurrently with data collection. 32 This allowed codes to be created and modified through an iterative process, which ultimately led to well-defined categories that were then organized into themes.
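
To make the coding-to-theme progression described above more concrete, the sketch below shows one minimal way an iterative codebook could be organized programmatically. It is purely illustrative: the code labels, theme assignments, and the script itself are hypothetical and are not the authors’ actual codebook or analysis, although the excerpt fragments and theme names echo material reported later in the paper.

```python
from collections import defaultdict

# Illustrative sketch of an iterative codebook (hypothetical labels, not the study's data).
# Braun and Clarke's phases: familiarization -> coding -> generating candidate themes ->
# reviewing themes -> defining and naming themes -> reporting.

# Hypothetical coded excerpts: (participant, excerpt fragment, assigned codes).
coded_excerpts = [
    ("P1", "it's not seen as a measurable entity", ["assessment", "curriculum_priority"]),
    ("P4", "I always encouraged group work of mixed and diverse groups", ["classroom_diversity"]),
    ("P7", "we usually do not assess cultural competence concepts", ["assessment"]),
]

# Candidate mapping of codes to themes, revised on each pass through the data.
code_to_theme = {
    "assessment": "Inclusion of cultural competence into the curriculum",
    "curriculum_priority": "Inclusion of cultural competence into the curriculum",
    "classroom_diversity": "Cross-cultural exposure and teaching practices",
}

# Collate excerpts under each candidate theme to review coherence and coverage.
themes = defaultdict(list)
for participant, excerpt, codes in coded_excerpts:
    for code in codes:
        themes[code_to_theme[code]].append((participant, code, excerpt))

for theme, items in themes.items():
    print(f"{theme}: {len(items)} coded excerpt(s)")
    for participant, code, excerpt in items:
        print(f"  [{participant}] ({code}) {excerpt}")
```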

Quality Assurance

To enhance the study’s rigor, the researchers analyzed the anonymized transcripts with the assistance of an expert coder. An audit trail of the researchers’ decision-making process and reflexivity was kept, along with field notes and observations made during the interviews. The authors also communicated frequently to discuss potential emerging themes and possible biases. Member checking was conducted to further strengthen the quality of the data.
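
The study’s quality checks relied on an expert coder, an audit trail, and member checking rather than on a reported inter-coder agreement statistic. For readers who wish to supplement such procedures with a quantitative check, the sketch below computes Cohen’s kappa between two coders’ categorical code assignments; the coder labels and assignments are hypothetical and do not come from this study.

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa for two coders' categorical labels over the same excerpts."""
    assert len(labels_a) == len(labels_b) and labels_a, "coders must label the same excerpts"
    n = len(labels_a)
    # Observed agreement: proportion of excerpts both coders labelled identically.
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Expected agreement under chance, from each coder's marginal label frequencies.
    freq_a, freq_b = Counter(labels_a), Counter(labels_b)
    expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in set(labels_a) | set(labels_b))
    if expected == 1:  # both coders used a single identical category throughout
        return 1.0
    return (observed - expected) / (1 - expected)

# Hypothetical code assignments by two coders for the same ten transcript excerpts.
coder_1 = ["assessment", "exposure", "assessment", "curriculum", "exposure",
           "assessment", "curriculum", "exposure", "assessment", "curriculum"]
coder_2 = ["assessment", "exposure", "curriculum", "curriculum", "exposure",
           "assessment", "curriculum", "assessment", "assessment", "curriculum"]

print(f"Cohen's kappa: {cohens_kappa(coder_1, coder_2):.2f}")  # ~0.70 for this example
```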

Ethical Considerations

Permission to conduct the study was obtained from the Stellenbosch University Health Research Ethics Committee (S22/02/020). Institutional approval for the study was also received before commencing. Informed consent was obtained from all the participants, and confidentiality was maintained throughout the study. The informed consent agreement also included the publication of anonymized responses.

Results

A total of seven ophthalmic dispensing lecturers participated in this study. The majority of the participants were female (n = 4), and the participants had between 2 and 30 years of experience as lecturers in ophthalmic dispensing, with a mean of 14 years. The average age of the participants was 46 years. The participants were of diverse races: two were Black African, two were mixed race, two were of Indian descent, and one was Caucasian. Most participants originated from various parts of South Africa, and only one had a migrant background. Most participants held a master’s degree, and one held a PhD.

Three main themes were identified through the thematic analysis: interplay between experiences and understandings of cultural competence, cross-cultural exposure and teaching practices, and the inclusion of cultural competence into the curriculum.

Theme 1: The Interplay Between Experiences and Understandings of Cultural Competence

…But when I was examining patients and seeing patients of different cultures, they would see me in a particular light, and I would go so far as to say, and I’m generalizing, that some members of certain cultures or races rather, particularly the more disadvantaged ones, would hold me up in the higher regard than my other colleagues who are of different culture or race as me being a white male. (P4)

…Being okay and comfortable with the different patients I see. And making them feel comfortable with me as their eyecare professional. (P2)

…Understanding and recognizing your belief system, where you come from, and your thoughts. The second part of cultural competence is being sensitive enough to understand someone else’s point of view, whether it is a patient or a student or whether it is someone you are chatting to or someone you meet in the street. (P3)

… it is essential in our ophthalmic science degree because at the end of the day, we are producing clinicians or, should I say, health providers. We are producing members of society. (P7)

…cultural competence also means understanding it from the other person’s point of view, understanding that someone comes from a different set of beliefs, cultural backgrounds, education, religion, socio-cultural, economic conditions, etcetera. Understanding their point of view, their way of seeing the world or their beliefs as well being sensitive to it, you know, not necessarily jumping down their throat because they think about the world differently than you. (P4)

…I mean, there is a relationship between health and culture again. There are people of certain ages, certain racial groups, and certain lifestyles that are more prone to certain conditions than others. So, there is a link between cultural lifestyle practices and health. (P7)

It was evident that the participants acknowledged the importance of cultural competence and the factors that influence how they perceive the concept of cultural competence. The findings also suggested that the participants perceived that their own culture influenced their teaching practice. The findings further indicated that each participant had exposure to cultural diversity before entering their teaching and academic careers. These experiences influenced how the participants understood cultural competence.

Theme 2: Cross-Cultural Exposure and Teaching Practices

…I always encouraged, you know, group work of mixed and diverse groups, as students would typically choose to work only with students of the same language, of similar cultural background, or other students who look like them, I guess. (P4)

…the most important thing is getting students to ask questions for things they do not understand and what they feel uncomfortable with regarding the issues of race, ethnicity or any ethical dilemmas they may face in practice. (P5)

… (students performing visual screenings at a nursing home) Some patients were Afrikaans speaking, and our students were not. The interaction between students who did not speak Afrikaans but needed to engage with these patients…the students showed graciousness and sensitivity towards the patients and were able to adapt. Then, the students had to reflect on that process later in class. Furthermore, I think that was a useful, valuable, transformative experience for the students and patients. (P6)

It was evident that lecturers’ exposure to cultural diversity inside and outside the classroom affected how they conducted cultural competence education. The cultural diversity the students were exposed to was used as a teaching tool to enhance students’ awareness and knowledge of the relationship between culture and health.

Theme 3: Inclusion of Cultural Competence into the Curriculum

…you know, the opticianry program is kind of technical. It makes it a bit difficult to teach [issues of culture] in some instances as we focus on lens manufacturing and design, etc. (P7)

…It is inherent in the way I teach; I try to teach them to be professional and patient with people, hear and listen to people, and give respect. I think those are not things you teach as part of a course guide or in the lecture notes per se but actions… respect for people and listening to others, I think, come through with how you teach, and I hope my students mimic such behaviors. (P3)

…things like lifelong learning, appreciating diversity, cultural sensitivity, all those kinds of what they used to call critical outcomes are what we are looking at here… I do not see graduate attributes in the documentation I have come up with so far for the accreditation report or curriculum guidelines; it is not clear how we teach these skills – it is just mentioned. (P4)

… It is not necessarily a fixed training, but rather you know it is a thread that should run through everything in the program. (P6)

…it’s not seen as a measurable entity if you know, if it’s not seen as an actual subject or having some credit-bearing value, you know? It is sometimes pushed to the side and seen as an additional load. (P1)

…I do not think students take it seriously as we usually do not assess cultural competence concepts. Assessment drives learning, and if students are not assessed on a particular topic – they would not pay attention to that concept. (P7)

… there is always the pressure of time. There is always the pressure of assessments. There is always the pressure of assignments, and there is always the pressure of time to get through the curriculum. (P2)

…I think we need cultural competence and sensitivity training for us faculty members. Maybe it can be included in the induction program we have for new academics or even a workshop once a semester facilitated by the university or faculty (P5)

Discussion

The study aimed to explore ophthalmic dispensing lecturers’ understandings, experiences, and attitudes toward cultural competence. Most of the study’s participants acknowledged the importance of cultural competence in ophthalmic dispensing education. However, in line with the literature on cultural competence, the findings also illustrated the complexity of this concept. 33 There are several perspectives on the nature of cultural competence and how it should be implemented within a health professions curriculum or healthcare setting. 34 The participants agreed that cultural competence was one of the strategies that could be utilized to ensure equitable access to healthcare by persons of different races and ethnicities and ensure individualized care, and this is also expressed in the literature. 35–37

The participants related cultural competence to aspects of culture, such as ethnicity and race, including their own upbringing. The participants also focused on the word culture and its role in healthcare provision. Understanding a patient’s culture can provide the foundation for adequate eye care. Ophthalmic dispensing professionals should not only consider factors such as ethnicity and race. Instead, they should have a holistic view when providing eye care services to culturally diverse communities, considering the patient’s age, gender, sexual orientation and identity, occupation, social and economic factors, and religion. 1 The participants in this study shared views similar to those of Truong and Selig 1 that failure to recognize and acknowledge cultural differences may result in miscommunication and misunderstanding, leading to poor health and visual outcomes.

Cultural sensitivity, in turn, has been described as:

employing one’s knowledge, consideration, understanding, respect, and tailoring after realizing awareness of self and others and encountering a diverse group of individuals. 43

Both cultural awareness and cultural sensitivity are necessary to develop and strive towards cultural competence.

Health science students’ awareness of their cultural background directly impacts their clinical exchanges and, to some extent, their clinical decision-making. 44 Health professionals and their patients are often from different cultural backgrounds and may have opposing views regarding healthcare practices. Therefore, the lack of awareness of their own cultural experiences and those of their patients may pose several issues that affect health outcomes.

It was clear that the lecturers who participated in the study perceived exposure to cultural diversity as integral in developing their cultural competence and, ultimately, that of their students. Creating opportunities for students to interact with persons from different cultural backgrounds, in and outside the classroom, can thus assist in developing their cultural competence. Being exposed to cultural diversity helps students become aware of the challenges related to cultural differences, such as feeling under-prepared to deal with diversity in clinical settings. 45 The same authors further argue that exposure to cultural diversity could highlight a lack of knowledge about the various cultures, thus creating the need to improve one’s knowledge, attitudes, and skills. Using cultural diversity among the students in the teaching space can improve the teaching environment as the students would have opportunities to engage and reflect on their cultural differences. This can lead to facilitated discussions in the teaching space, an excellent tool for promoting cultural awareness and sensitivity. Students can be provided with cultural experiences they may not gain otherwise, such as engaging with students with different cultural backgrounds. 46

Some participants perceived the ophthalmic dispensing program as technical, with little emphasis on patient interaction. Thus, topics related to culture seemed more distant as greater emphasis was placed on discipline-related technical aspects such as ophthalmic lens manufacturing, measurements, and dispensing of optical appliances such as spectacles, contact lenses, and low vision aids. This may indicate that cultural competence education runs the danger of having lower priority within the ophthalmic dispensing curriculum. There is a need for cultural competence training among ophthalmic dispensing educators and students, 1 which the participants also acknowledged.

The hidden curriculum has been described as the:

set of implicit messages about values, norms, and attitudes that learners infer from behaviour of individual role models as well as from group dynamics, processes, rituals, and structures. 47

The participants highlighted the importance of being “good” role models for the students. The hidden curriculum can negatively impact cultural competence education if students observe behaviours from their lecturers that are biased and discriminatory. 48 Students may be unsure whether the behaviour is genuinely biased or part of the clinical training.

Clinical teachers are crucial in implementing explicit and hidden curricula. 49 The participants expressed that issues of culture were not explicitly taught and assessed within the ophthalmic dispensing program at their institution. Still, cultural competence was considered part of the learning objectives of the discipline-specific major subjects within the program. This indicates that cultural competency teaching is “unstructured” in its integration within the ophthalmic dispensing curriculum. There is no verification that students are adequately taught the necessary competencies related to cultural competence.

The study findings indicate a need for clear guidelines for lecturers on the inclusion of cultural competence into the ophthalmic dispensing curriculum. The participants perceived the concept of cultural competence as necessary in ophthalmic dispensing education, but a systematic approach to teaching it is lacking. Previous research has illustrated the importance of cultural competence training in medical education 35 and ophthalmic dispensing education. 1 Implementing cultural competence education is complex. 21 The participants reported that they are largely unprepared when opportunities to include cultural competence in classroom activities arise. Beyond its consistent encouragement, the Professional Board for Optometry and Dispensing Opticians still needs to produce explicit guidelines about cultural competence training. Clear guidelines and standards are required for a systematic approach to implementing cultural competence in the ophthalmic dispensing curriculum. 50 Such guidelines will assist with curriculum development toward including cultural competence and with developing and assessing learning objectives. 50

This study showed that commitment from the lecturers and the university administration is required to enhance the teaching and learning environment for developing students’ cultural competence. The development of ophthalmic dispensing professionals who strive for cultural competence begins at the undergraduate level, and lecturers therefore play an essential role in developing students’ related knowledge, attitudes, and skills at the early stages of their careers. For this to occur, the lecturers also require training and development in cultural competence and education. The lack of teaching and assessment standards and guidelines that lecturers can use hampers the inclusion of cultural competence into the curriculum. Teamwork and coordination among the lecturers are also necessary to systematically incorporate cultural competence into the curriculum.

Limitations

One major limitation of the study is the relatively small sample size, as data were only collected at one university; it should be noted, however, that only one university offers the ophthalmic dispensing program in South Africa. Some elements of, and shortcomings in, implementing cultural competence in ophthalmic dispensing education may therefore have been overlooked. Despite these limitations, the perceptions of the ophthalmic dispensing lecturers in this study were similar to those reported in the wider health professions education literature.

Conclusion

The study revealed that providing ophthalmic dispensing students with the necessary skills to provide culturally competent eye care is essential in improving health outcomes. The study further suggested that these skills must be embedded throughout the program. For this to occur, support from the institution and the professional bodies is required to create a conducive learning space using the necessary resources. The lecturers require further training on cultural competence and its importance in healthcare. This is particularly important as current literature promotes a paradigm shift towards cultural humility.

Discussions amongst the lecturers, regulatory bodies, and qualified ophthalmic dispensing and optometric professionals are recommended to ascertain the required learning objectives, considering the nature and needs of the country. This includes discussions regarding assessment methods to measure these outcomes. Furthermore, a dedicated module or an in-depth workshop within the ophthalmic dispensing curriculum is necessary for cultural competence education. Preferably, the subject should be taught in the earlier levels of the program, as this will provide students with the required conceptual foundation before cross-cultural clinical exposure. Students should be urged to continuously reflect on their experiences in the workplace and receive feedback on their reflections from their culturally competent lecturers. The lecturers should receive training on cultural competence and on evidence-based teaching and assessment strategies related to this concept. Future research should focus on developing frameworks and guidelines relevant to ophthalmic dispensing education in the African context. A thorough review of the concepts of cultural competence, humility, and safety is also required in further research.

Acknowledgments

The authors thank Prof. Cecilia Jacobs (Department of Health Professions Education, Stellenbosch University) for her support and guidance in manuscript preparation.

Disclosure

The authors declare that they have no competing interests in this work.

References

1. Truong M, Selig S. Advancing cultural competence in optometry. Clin Exp Optom. 2017;100(4):385–387. doi:10.1111/cxo.12508

2. Truong M, Fuscaldo G. Optometrists’ perspectives on cross-cultural encounters in clinical practice: a pilot study. Clin Exp Optom. 2012;95(1):37–42. doi:10.1111/j.1444-0938.2011.00671.x

3. Campinha-Bacote J. The Process of Cultural Competence in the Delivery of Healthcare Services: a Model of Care. J Transcult Nurs. 2002;13(3):181–184. doi:10.1177/10459602013003003

4. Gulati S, Weir C. Cultural Competence in Healthcare Leadership Education and Development. Societies. 2022;12(39). doi:10.3390/soc12020039

5. Govender P, Mpanza DM, Carey T, Jiyane K, Andrews B, Mashele S. Exploring cultural competence amongst OT students. Occup Ther Int. 2017;1–8. doi:10.1155/2017/2179781

6. Campinha-Bacote J. Many faces: addressing diversity in health care. Online J Issues Nurs. 2003;8(1):18–29. doi:10.3912/OJIN.VOL8NO01MAN02

7. Beach MC, Price EG, Gary TL, et al. Cultural competence: a systematic review of health care provider educational interventions. Med Care. 2005;43(4):356–373. doi:10.1097/01.mlr.0000156861.58905.96

8. Govere L, Govere EM. How Effective is Cultural Competence Training of Healthcare Providers on Improving Patient Satisfaction of Minority Groups? A Systematic Review of Literature. Worldviews Evid Based Nurs. 2016;13(6):402–410. doi:10.1111/wvn.12176

9. Kleinman A, Benson P. Anthropology in the Clinic: the Problem of Cultural Competency and How to Fix It. PLoS Med. 2006;3(10):1673–1676. doi:10.1371/journal.pmed.0030294

10. Matthews M, van Wyk J. Towards a culturally competent health professional: a South African case study. BMC Med Educ. 2018;18(1):1–11. doi:10.1186/s12909-018-1187-1

11. Baker P. From apartheid to neoliberalism: health equity in post-apartheid South Africa. Int J Health Serv. 2010;40(1):79–95. doi:10.2190/HS.40.1.e

12. Xulu-Kasaba ZN, Mashige KP, Naidoo KS. An Assessment of Human Resource Distribution for Public Eye Health Services in KwaZulu-Natal, South Africa. Afr Vision Eye Health. 2021;80(1):1–8. doi:10.4102/AVEH.V80I1.583

13. Hatcher AM, Onah M, Kornik S, Peacocke J, Reid S. Placement, support, and retention of health professionals: national, cross-sectional findings from medical and dental community service officers in South Africa. Hum Resour Health. 2014;12(1). doi:10.1186/1478-4491-12-14

14. Mashige KP, Oduntan OA, Rampersad N. Perceptions and opinions of graduating South African optometry students on the proposed community service. Afr Vision Eye Health. 2013;72(1). doi:10.4102/aveh.v72i1.43

15. Dussault G, Franceschini MC. Not enough there, too many here: understanding geographical imbalances in the distribution of the health workforce. Hum Resour Health. 2006;4. doi:10.1186/1478-4491-4-12

16. Ramson P, Govender P, Naidoo K. Recruitment and retention strategies for public sector optometrists in KwaZulu-Natal Province, South Africa. Afr Vision Eye Health. 2016;75(1). doi:10.4102/aveh.v75i1.349

17. Williams DR, Gonzalez HM, Williams S, Mohammed SA, Moomal H, Stein DJ. Perceived discrimination, race and health in South Africa. Soc Sci Med. 2008;67(3):441–452. doi:10.1016/j.socscimed.2008.03.021

18. Fitzgerald C, Hurst S. Implicit bias in healthcare professionals: a systematic review. BMC Med Ethics. 2017;18(19):1–18. doi:10.1186/s12910-017-0179-8

19. Wheatle M. COVID-19 highlights health inequalities in individuals from black and minority ethnic backgrounds within the United Kingdom. Health Promot Perspect. 2021;11(2):115–116. doi:10.34172/hpp.2021.15

20. Nwosu CO, Oyenubi A. Income-related health inequalities associated with the coronavirus pandemic in South Africa: a decomposition analysis. Int J Equity Health. 2021;20(21). doi:10.1186/s12939-020-01361-7

21. Sorensen J, Norredam M, Suurmond J, Carter-Pokras O, Garcia-Ramirez M, Krasnik A. Need for ensuring cultural competence in medical programmes of European universities. BMC Med Educ. 2019;19(1):4–11. doi:10.1186/s12909-018-1449-y

22. Chauhan A, Walton M, Manias E, et al. The safety of health care for ethnic minority patients: a systematic review. Int J Equity Health. 2020;19(118):1–25. doi:10.1186/s12939-020-01223-2

23. Johnstone MJ, Kanitsaki O. Culture, language, and patient safety: making the link. Int J Qual Health Care. 2006;18(5):383–388. doi:10.1093/intqhc/mzl039

24. HPCSA. Guidelines for Practice in a Culturally and Linguistically Diverse South Africa. Professional Board for Speech, Language and Hearing Professions; 2019.

25. ASCO. ASCO Guidelines for Culturally Competent Eye and Vision Care; 2020. Available from: https://optometriceducation.org/files/Guidelines-for-Cult-Com-v2-7-24-2020.pdf. Accessed January 18, 2023.

26. Montenery SM, Jones AD, Perry N, Ross D, Zoucha R. Cultural Competence in Nursing Faculty: a Journey, Not a Destination. J Prof Nurs. 2013;29(6):e51–e57. doi:10.1016/j.profnurs.2013.09.003

27. Morton-Miller AR. Cultural competence in nursing education: practicing what we preach. Teaching Learn Nursing. 2013;8(3):91–95. doi:10.1016/j.teln.2013.04.007

28. Kripalani S, Bussey-Jones J, Katz MG, Genao I. A Prescription for Cultural Competence in Medical Education. J Gen Intern Med. 2006;21(10):1116–1120. doi:10.1111/j.1525-1497.2006.00557.x

29. Green AR, Chun MBJ, Cervantes MC, et al. Measuring Medical Students’ Preparedness and Skills to Provide Cross-Cultural Care. Health Equity. 2017;1(1):15–22. doi:10.1089/heq.2016.0011

30. Health Professions Council of South Africa. Optometry and Dispensing Opticians. Available from: https://www.hpcsa.co.za/?contentId=0&menuSubId=50&actionName=Professional%20Boards. Accessed October 20, 2023.

31. McGrath C, Palmgren PJ, Liljedahl M. Twelve tips for conducting qualitative research interviews. Med Teach. 2019;41(9):1002–1006. doi:10.1080/0142159X.2018.1497149

32. Braun V, Clarke V. Thematic Analysis. In: APA Handbook of Research Methods in Psychology: Research Designs: Quantitative, Qualitative, Neuropsychological, and Biological. Vol 2. American Psychological Association; 2021:57–71.

33. Hall JN. Cultivating Cultural Competence. Canadian J Program Evaluation. 2021;36(2):191–209. doi:10.3138/CJPE.70053

34. Rukadikar C, Mali S, Bajpai R, Rukadikar A, Singh A. A review on cultural competency in medical education. J Family Med Prim Care. 2022;11(8):4319–4329. doi:10.4103/jfmpc.jfmpc_2503_21

35. Seeleman C, Suurmond J, Stronks K. Cultural competence: a conceptual framework for teaching and learning. Med Educ. 2009;43(3):229–237. doi:10.1111/j.1365-2923.2008.03269.x

36. Smedley A, Smedley BD. Race as biology is fiction, racism as a social problem is real: anthropological and historical perspectives on the social construction of race. Am Psychologist. 2005;60(1):16–26. doi:10.1037/0003-066X.60.1.16

37. Betancourt JR, Green AR, Carrillo JE. Cultural Competence in Health Care: Emerging Frameworks and Practical Approaches; 2002. Available from: www.cmwf.org. Accessed November 12, 2022.

38. Papadopoulos I, Tilki M, Taylor G. Transcultural Care: a Guide for Health Care Professionals. Quay Books; 1998. Available from: https://www.semanticscholar.org/paper/Transcultural-care%3A-A-guide-for-health-care-Papadopoulos-Tilki/c24ee832f7b22f95310c032b2a9018e9491ecf24. Accessed July 12, 2022.

39. Dudas K. Cultural Competence: an Evolutionary Concept Analysis. Nurs Educ Perspect. 2012;33(5):317–321.

40. Shen Z. Cultural Competence Models in Nursing: a Selected Annotated Bibliography. J Transcult Nurs. 2004;15(4):317–322. doi:10.1177/1043659604268964

41. Campinha-Bacote J. Coming to Know Cultural Competence: an Evolutionary Process. Int J Hum Caring. 2011;15(3):42–48.

42. Brock MJ, Fowler LB, Freeman JG, Richardson DC, Barnes LJ. Cultural Immersion in the Education of Healthcare Professionals: a Systematic Review. J Educ Eval Health Prof. 2019;16(4). doi:10.3352/jeehp.2019.16.4

43. Foronda CL. A Concept Analysis of Cultural Sensitivity. J Transcult Nurs. 2008;19(3):207–212. doi:10.1177/1043659608317093

44. Matthews MG, Diab PN. An exploration into the awareness and perceptions of medical students of the psychosociocultural factors which influence the consultation: implications for teaching and learning of health professionals. Afr J Health Prof Educ. 2016;8(1):65–68. doi:10.7196/ajhpe.2016.v8i1.562

45. Choi JS, Kim JS. Effects of cultural education and cultural experiences on the cultural competence among undergraduate nursing students. Nurse Educ Pract. 2018;29:159–162. doi:10.1016/j.nepr.2018.01.007

46. Mulder H, ter Braak E, Chen HC, ten Cate O. Addressing the hidden curriculum in the clinical workplace: a practical tool for trainees and faculty. Med Teach. 2019;41(1):36–43. doi:10.1080/0142159X.2018.1436760

47. Gonzalez CM, Deno ML, Kintzer E, Marantz PR, Lypson ML, McKee MD. A Qualitative Study of New York Medical Student Views on Implicit Bias Instruction: implications for Curriculum Development. J Gen Intern Med. 2019;34(5):692–698. doi:10.1007/s11606-019-04891-1

48. Lu PY, Tsai JC, Tseng SYH. Clinical teachers’ perspectives on cultural competence in medical education. Med Educ. 2014;48(2):204–214. doi:10.1111/medu.12305

49. Truong M, Bentley SA, Napper GA, Guest DJ, Anjou MD. How Australian and New Zealand schools of optometry prepare students for culturally competent practice. Clin Exp Optom. 2014;97(6). doi:10.1111/cxo.12196

50. Han R, Koskinen M, Mikkonen K, et al. Social and Health Care Educators’ Cultural Competence. Int J Caring Sci. 2020;13(3):1555–1562.

