Statistical Methods in Qualitative Research
Respond using one or more of the following approaches:
Ask a probing question, substantiated with additional background information and evidence.
Share an insight from having read your colleagues’ postings, synthesizing the information to provide new perspectives.
Statistical Methods in Qualitative Research
Each method below is summarized by what it measures, the circumstances for its use, and examples of its use in research studies.
Qualitative Content Analysis
What is measured: Analyzes narrative data and in-depth interviews. Can evaluate large volumes of data with the intent of identifying recurring themes and patterns. Attempts to break down elements of the data into clusters. May be concurrent or sequential (Polit & Beck, 2017).
Circumstances for use: A good method for evaluating personal histories, perspectives, and experiences; the best method for studying personal, sensitive situations (Sauro, 2015).
Examples: Evaluation of the experience of a rape victim, what it feels like to have an abortion, or how it feels to have lived through a disaster.
Ethnographic Analysis
What is measured: Evaluates cultural phenomena, patterns, and perspectives. Requires the “participant observer” technique. No preconceived hypothesis. May take months or years to complete. Maps and flowcharts are tools that help illustrate findings (Polit & Beck, 2017).
Circumstances for use: A method to “acquire a deep understanding of the culture being studied” (Polit & Beck, 2017, p. 538).
Example: A research study in which ethnographers integrate with Native Americans living on a reservation, observing everyday life and seeking to identify overarching cultural issues.
Phenomenologic Analysis
What is measured: A descriptive analysis that attempts to understand the essence of experiencing a particular phenomenon through observation, interviews, and outside research.
Circumstances for use: A method for understanding individual perspectives on experiencing a certain phenomenon; seeks to extrapolate commonalities and themes among subjects (Sauro, 2015).
Example: Conducting interviews with persons who have experienced hallucinations, with the intent of understanding their perspective and experience of the phenomenon.
Grounded Theory Analysis
What is measured: Aims to provide theories and explanations for phenomena based on previously coded information. Uses interviews and previously accepted research. Unlike qualitative content analysis, which seeks to break down information, grounded theory strives to put information back together (Polit & Beck, 2017).
Circumstances for use: A method for the development of theories; can also be used in meta-analyses or systematic reviews.
Example: Beck’s (2002) model of mothering twins, as cited in Polit & Beck (2017).
Focus Group Analysis
What is measured: Analyzes group data in relation to a specific topic. Group interviews, recordings, and field notes are the instruments for conducting this type of research.
Circumstances for use: May be used to evaluate a potential survey tool or to gauge consensus on a new product; researchers seek to extrapolate recurring themes.
Example: Evaluating perceptions of a new product being marketed to test for general consensus on its desirability.
Quasi-statistics: a tabulation of the frequency with which certain themes or insights are supported by the data
Qualitative content analysis: analysis of the content of narrative data to identify prominent themes and patterns among the themes
Domain analysis: the first of four levels of data analysis. Domains are units of cultural knowledge; they are broad categories that encompass smaller ones. Ethnographers identify relational patterns among the terms in the domains that are used by members of the culture, focusing on the cultural meaning of terms and symbols used in the culture
Taxonomic analysis: the second level of data analysis, in which the ethnographer decides how many domains the analysis will encompass. A taxonomy is then developed to illustrate the internal organization of a domain and the relationships among the subcategories of the domain
Taxonomy: a system of classifying and organizing terms
Componential analysis: relationships among terms in the domains are examined; ethnographer analyzes data for similarities and differences among cultural terms in a domain.
Theme analysis: cultural themes are uncovered; domains are connected in cultural themes, which help to provide a holistic view of the culture being studied. The discovery of cultural meaning is the outcome.
Holistic approach: researchers view the text as a whole and try to capture its meanings
Selective approach: researchers highlight or pull out statements or phrases that seem essential to the experience under study
Detailed approach: researchers analyze every sentence
Hermeneutic circle: signifies a methodological process in which to reach understanding, there is continual movement between the parts and the whole of the text being analyzed
Exemplars: illuminate aspects of a paradigm case or theme
Substantive codes: the substance of the topic under study is conceptualized through substantive codes; substantive codes are either open or selective
Open coding: used in the first stage of the constant comparative analysis; captures what is going on in the data and may use the actual words stated by participants. In open coding, data are broken down into incidents, and their similarities and differences are examined; the raw data are interpreted
Three Levels of Open Coding: Levels I, II, III
Level I codes: in vivo codes, derived directly from the language of the substantive area; they have vivid imagery
Level II codes: researchers constantly compare new Level I codes to previously identified ones and then condense them into broader Level II codes
Level III codes: theoretical constructs; the most abstract, adding scope beyond local meanings
Core category: pattern of behavior that is relevant and/or problematic for participants
Selective coding: can have three levels of abstraction; researchers code only those data that are related to the core variable
Basic social process (BSP): evolves over time in two or more phases; all BSPs are core variables, but not all core variables have to be BSPs
Emergent fit: prevents individual substantive theories from being “respected little islands of knowledge”
Axial coding: analyst codes for context
Paradigm: used as an analytical strategy to help integrate structure and process
Central category: core category, which is the main theme of the research
Initial coding: pieces of data (words, lines, segments, incidents) are studied so the researcher begins to learn what the participants view as problematic
Focused coding: the analysis is directed toward using the most significant codes from the initial coding
Congruent methodological approach: analyzes interaction data in the same manner as group or individual data
Sociograms: can be used to understand the flow of conversation as it goes around the members of the focus group
Incubation: process of living the data, a process in which researchers must try to understand their meanings, find their essential patterns, and draw legitimate, insightful conclusions
Conceptual files: physical files in which coded excerpts of data relevant to specific categories are placed
Themes: involves the discovery not only of commonalities across participants but also of natural variation and patterns in the data
Metaphors: figurative comparisons used to evoke a visual or symbolic analogy
Quasi-statistics: involves a tabulation of the frequency with which certain themes or relations are supported by the data (a minimal counting sketch appears just after this list)
Qualitative content analysis: can vary in terms of an emphasis on manifest content or latent content and in the role of induction
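To make quasi-statistics concrete, here is a minimal sketch in Python; it is not a procedure from Polit & Beck (2017), and the participants, codes, and excerpts are invented purely for illustration. It simply tabulates how often each theme code is supported across coded excerpts.

```python
from collections import Counter

# Hypothetical coded excerpts: each excerpt has been assigned one or more
# theme codes by the researcher during content analysis.
coded_excerpts = [
    {"participant": "P01", "codes": ["loss of control", "fear"]},
    {"participant": "P02", "codes": ["fear", "social support"]},
    {"participant": "P03", "codes": ["loss of control"]},
    {"participant": "P04", "codes": ["social support", "fear"]},
]

# Quasi-statistics: tabulate how frequently each theme appears in the data.
theme_counts = Counter(
    code for excerpt in coded_excerpts for code in excerpt["codes"]
)

for theme, count in theme_counts.most_common():
    print(f"{theme}: supported by {count} excerpt(s)")
```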
Managing Qualitative Data
Computer-assisted qualitative data analysis software (CAQDAS): a program that can take uploaded data files, code the narratives, retrieve information, and display text for analysis
• Text retrievers locate text and terms in a database.
• Code-and-retrieve packages allow researchers to code text.
• Theory-building software functions to examine relationships between concepts, develop hierarchies of codes, diagram, and create hyperlinks to create nonhierarchical networks.
• Concept-mapping software constructs sophisticated diagrams.
• Data conversion/collection software converts audio into text.
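As a rough illustration of what a code-and-retrieve package does, the following is a minimal sketch in Python; it is not the interface of any actual CAQDAS product, and the code labels and excerpts are hypothetical. Segments of narrative data are tagged with codes and can later be retrieved by code.

```python
from collections import defaultdict

# Hypothetical in-memory stand-in for a code-and-retrieve package:
# narrative segments are tagged with codes, then retrieved by code.
codebook = defaultdict(list)

def code_segment(code: str, participant: str, text: str) -> None:
    """Attach a code label to a segment of narrative data."""
    codebook[code].append({"participant": participant, "text": text})

def retrieve(code: str) -> list:
    """Pull back every segment previously tagged with the given code."""
    return codebook[code]

code_segment("coping", "P01", "I kept telling myself it would pass.")
code_segment("coping", "P03", "Writing in my journal helped me get through it.")
code_segment("isolation", "P02", "No one around me understood what had happened.")

for segment in retrieve("coping"):
    print(segment["participant"], "-", segment["text"])
```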
Within qualitative data analysis there are no statistical tests, because qualitative research is based on thoughts, open-ended questions, interpretations, and interviews rather than numerical values. Data within qualitative research are understood and analyzed throughout the entirety of the process. “Researchers interpret the data as they read and reread them, categorize and code them, inductively develop a thematic analysis, and integrate the themes into a unified whole” (Polit & Beck, 2017, p. 549). There is no step-by-step account of how the interpretation of the data occurs; researchers “live” within the data by understanding their meanings, looking for patterns, and drawing valid, discerning conclusions. An additional part of understanding the findings is having the inventiveness to find the “aha” meaning of the information and to discover the meanings of the facts gained (Polit & Beck, 2017).
The interpretation of the data is just as important as its validity. Thorough and sensible researchers hold their data interpretation to a high standard by subjecting it to scrutiny from themselves, their peers, and outside reviewers. It is vital that qualitative researchers consider possible explanations or meanings other than their own (Polit & Beck, 2017).
It is important for nurses to understand statistical data because a large part of nursing work is evidence based, which means understanding the research behind the reasons for practice. According to Hayat (2010), it is important to understand the difference between statistical significance and clinical importance, because researchers tend to use statistics to claim proof and scientific breakthrough. Significance testing can be used to decide which data may be considered evidence to support a practice change. “Judgment and subjectivity are necessary and part of the decision-making process. Statistical significance is not a measure of importance; it is a subjective and qualitative construct. Researchers conducting quantitative analyses should quantify the magnitude of an effect. The value of the data collected should be assessed by examining study design, bias, and confounding variables, as well as meaningfulness of the results to the topic under study” (Hayat, 2010, p. 222). Nurses must consider this and have this understanding when utilizing statistical methods as a basis for practice changes.
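To illustrate Hayat’s (2010) point that statistical significance is not the same as the magnitude of an effect, here is a minimal sketch in Python with invented scores; it computes a standardized effect size (Cohen’s d), which must be judged for clinical importance separately from any significance test.

```python
import math

# Hypothetical scores from two groups (e.g., a pain scale under current
# practice versus a proposed change); the numbers are invented for illustration.
group_a = [6.1, 5.8, 6.4, 6.0, 5.9, 6.2, 6.1, 6.3]
group_b = [5.9, 5.7, 6.2, 5.8, 5.8, 6.0, 5.9, 6.1]

def mean(xs):
    return sum(xs) / len(xs)

def sd(xs):
    m = mean(xs)
    return math.sqrt(sum((x - m) ** 2 for x in xs) / (len(xs) - 1))

# Cohen's d: the size of the mean difference in pooled-standard-deviation units
# (equal group sizes, so the pooled variance is the average of the two variances).
pooled_sd = math.sqrt((sd(group_a) ** 2 + sd(group_b) ** 2) / 2)
cohens_d = (mean(group_a) - mean(group_b)) / pooled_sd

print(f"Mean difference: {mean(group_a) - mean(group_b):.2f}")
print(f"Cohen's d (effect size): {cohens_d:.2f}")
# With a large enough sample, even a tiny mean difference can reach statistical
# significance, so the effect size must be weighed separately for clinical importance.
```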
References
Hayat, M. J. (2010). Understanding statistical significance. Nursing Research, 59(3), 219–223.
Polit, D. F., & Beck, C. T. (2017). Nursing research: Generating and assessing evidence for nursing practice (10th ed.). Philadelphia, PA: Wolters Kluwer.
Sauro, J. (2015, October 13). Five types of qualitative methods. Retrieved from https://flic.kr/p/4PXXCYp
By: Casey Hoffman, Tami Frazier, Sarah Pudenz, and Elizabeth Wilson