Title

Building Large Learning Models With Herbal

Abstract

In this paper, we describe a high-level behavior representation language (Herbal) and report new work on Herbal's ACT-R compiler. This work suggests that Herbal reduces model development time by a factor of 10 compared with working directly in Soar, ACT-R, or Jess. We then introduce a large ACT-R model (541 rules) that we generated in approximately 8 hours. We fit the model to learning data. The comparison indicates that humans performing spreadsheet tasks appeared to start with some expertise. The comparison also suggests that ACT-R, when processing tasks consisting of hundreds of unique memory elements over time spans of twenty to forty minutes, may have problems accurately representing the learning rates of humans. In addition, our study indicates that the spacing between learning sessions has significant effects that may impact the modeling of memory decay in ACT-R.

Publication Date

12-1-2010

Publication Title

Proceedings of the 10th International Conference on Cognitive Modeling, ICCM 2010

Number of Pages

187-192

Document Type

Article; Proceedings Paper

Personal Identifier

scopus

Scopus ID

78149394173 (Scopus)

Source API URL

https://api.elsevier.com/content/abstract/scopus_id/78149394173
