Knowledge Structures, Knowledge Structure Evaluations, Concept Mapping, Conceptual Knowledge, Rater Error
The likelihood of conducting safe operations increases when operators have effectively integrated their knowledge of the operation into meaningful relationships, referred to as knowledge structures (KSs). Unlike knowledge of isolated facts about an operation, well-integrated KSs reflect a deeper understanding; yet it is often only the isolated facts that are evaluated in training environments. To know whether an operator has formed well-integrated KSs, KS evaluation methods must be employed. Many of these methods, however, require subjective, human-rated evaluations. Such ratings are prone to the negative influence of a rater's limitations, such as rater biases and cognitive constraints; therefore, the extent to which KS evaluations are beneficial depends on the degree to which these limitations can be mitigated. The main objective of this study was to identify factors that mitigate rater limitations and to test their influence on the reliability and validity of KS evaluations. These factors were identified by delineating a framework that represents how a rater's limitations influence the cognitive processes that occur during the evaluation process. From this framework, one factor (i.e., operation knowledge) and three mitigation techniques (i.e., frame-of-reference training, reducing the complexity of the KSs, and providing referent material) were identified. Ninety-two participants rated the accuracy of eight KSs over a period of two days. Results indicated that reliability was higher after training. Furthermore, several interactions indicated that the benefits of domain knowledge, referent material, and reduced complexity existed only within subsets of the participants. For example, reduced complexity increased reliability only among evaluators with less knowledge of the operation, and referent material increased reliability only for those who evaluated less complex KSs.
Both the practical and theoretical implications of these results are provided.
Doctor of Philosophy (Ph.D.)
College of Sciences
Doctoral Dissertation (Open Access)
Harper-Sciarini, Michelle, "Investigating The Reliability And Validity Of Knowledge Structure Evaluations: The Influence Of Rater Error And Rater Limitation" (2010). Electronic Theses and Dissertations. 4212.