ORCID

0009-0008-0585-0123

Keywords

coding, moderator variables, meta-analysis, IRR

Abstract

Meta-analyses are critical tools for evidence-based policymaking, but their validity depends heavily on accurate data extraction from primary studies. While previous research has focused on procedural aspects of coding and the reliability of effect size extraction, a significant gap remains in understanding how to improve inter-rater reliability (IRR) for the moderator variables that provide crucial context for meta-analytic findings. This study compared two methodological approaches to coding moderator variables: traditional low-inference coding and a novel drill-down questioning technique adapted from database development principles. Using a comparative analysis design, two independent coders extracted data from 18 primary studies included in a published meta-analysis examining bullying and social status outcomes. Data from each study were independently extracted using both methods. Three analyses were conducted: (1) comparing the frequency of extracted unique moderators aligned to outcomes, (2) examining accuracy in longitudinal outcome alignment, and (3) assessing IRR for nominal and ordinal variables requiring coder deduction. Results showed that the drill-down questioning method achieved marginally better accuracy in moderator extraction tied to specific outcomes (99.1% vs. 96.1%) and perfect alignment for longitudinal outcomes, compared with 95.5% under traditional coding. For variables requiring coder deduction, the drill-down method consistently yielded higher agreement rates, though the differences were statistically significant for only one variable (i.e., assessment of bullying; z = 3.414, p < .001). However, an unexpected finding emerged with ordinal variables, where traditional coding showed stronger reliability for one variable (i.e., developmental phase). While the differences between methods were modest, this study provides preliminary evidence that structured drill-down questioning may improve coding accuracy and reliability, particularly for complex nominal variables. These findings contribute to the development of more systematic approaches for extracting moderator variables in meta-analyses, though further research with larger samples is needed to validate these results across different disciplines and variable types.
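
As context for the reliability comparison described above, the following is a minimal illustrative sketch (in Python) of how two independent agreement rates could be compared with a two-proportion z-test. It is not the dissertation's actual analysis code, and the counts shown are hypothetical placeholders, not data from the study.

import math

def two_proportion_z_test(agree1, n1, agree2, n2):
    # Observed agreement rates for the two coding methods
    p1, p2 = agree1 / n1, agree2 / n2
    # Pooled proportion under the null hypothesis of equal agreement
    pooled = (agree1 + agree2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal distribution
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical counts: agreements out of coded items per method
z, p = two_proportion_z_test(agree1=112, n1=118, agree2=98, n2=118)
print(f"z = {z:.3f}, p = {p:.4f}")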

Completion Date

2025

Semester

Spring

Committee Chair

Hahs-Vaughn, Debbie and Bai, Haiyan

Degree

Doctor of Philosophy (Ph.D.)

College

College of Community Innovation and Education

Department

Learning Sciences and Educational Research

Identifier

DP0029288

Document Type

Dissertation/Thesis

Campus Location

Orlando (Main) Campus
