KIDTAP: A Strengths-Based Approach to Curriculum-Based Assessment

by Denyse Doerries, Ph.D.

The requirements of the No Child Left Behind legislation (NCLB) and the Individuals with Disabilities Education Act (IDEA) both support the use of assessment data to improve instruction. The special education literature suggests several curriculum-based assessment (CBA) models that directly link assessment data to the general education curriculum as alternatives to traditional norm-referenced tests (Burns, 2002). In Virginia, the Department of Education (VDOE) Instructional Support Team (IST) priority project involves training teachers in the use of the Gickling model of CBA as part of a problem-solving process designed to help identify students' instructional level and improve the quality of academic interventions.

The Gickling CBA model is unique for a number of reasons. First, it examines any mismatch between an individual student's instructional level and the curriculum and provides the teacher with enough specific information about the student to determine where to begin instruction (Gravois & Gickling, 2002). Second, this CBA model views the gap between what a student knows and is able to do and what the learning environment demands as the problem to be addressed rather than the student's absence of skills (Rosenfield & Gravois, 1996). In other words, the assessment is viewed not as a process of gathering scores for grading or comparing students, but as a way to determine how a student is thinking about and learning what is being taught (Gravois & Gickling, 2002). Finally, the Gickling CBA model looks at what the student can do rather than what he or she cannot do. Assessing a student on instructional rather than grade level allows the student to display known skills and takes into consideration the student's prior knowledge (Gravois & Gickling, 2002). For example, if a student is able to read second-grade material but is assessed on third-grade material, we will only see what she cannot do rather than her actual skills at decoding and understanding the text.

Instructional Level

A pivotal piece of the instructional support model is maintaining students on their instructional level or in their "comfort zone," where the material is neither too hard nor too easy, throughout the assessment as well as during instruction. Instructional level should not be confused with grade level. Instructional level does not refer to a particular book or material a student is using. Frequently, even within a literature textbook, the instructional level changes from story to story depending on the student's entry-level skills and the prior knowledge required of the student (Gravois & Gickling, 2002).

Instructional level is based on research in the area of working memory (Gravois & Gickling, 2002). Working memory refers to what we are attending to at any given point. Working memory has a limited capacity and requires total concentration in order to effectively process new information. When too much new information is presented, working memory becomes overloaded and clears itself. The amount of information that the working memory can handle is influenced by the number of connections made to prior knowledge and the age of the student. For example, a 7-year-old typically can manage three new chunks of information, whereas a 15-year-old might manage seven. Instructional level honors the student's working memory by providing an amount of new information that is in keeping with the student's developmental needs (Gravois & Gickling, 2002).

Further, instructional level varies not only by the quantity of new material presented but also by the demands of the task. In order for a student to be on his instructional level on comprehension tasks, the student needs to know 93-97% of the material. Thus, only 3-7% of the material should be new or unknown to the student. However, if the task requires drill and practice, 70-85% of the material should be known, with only 15-30% new material (Gravois & Gickling, 2002).

Keeping a balance between unknown and known material during assessment is challenging. To maintain the student at an instructional level during the assessment process, the materials must be selected carefully. Prior knowledge must be assessed, and trial teaching may be employed to bring the material into the student's comfort zone. A summary of the steps in this CBA process for reading and writing may be found in the Considerations Packet Instructional Assessment: An Essential Tool for Designing Effective Instruction (Doerries, 2002). Remember, the purpose of the assessment is not to create fear or discomfort for the student but to discover what the student knows and can do.

Five Relevant Questions

The Gickling CBA model includes five questions to guide teachers in determining a student's instructional level. The mnemonic KIDTAP may be used to remember the five questions that the CBA needs to answer and the kind of information necessary to design an appropriate instructional intervention.

Know--What does the student know?
Prior knowledge is the foundation of learning. Students retain new information more easily when they can connect it to prior learning. The question of what skills and information the student already possesses is a pivotal one to answer.

Do--What can the student do?
The task on which the student is being assessed must provide an appropriate challenge and be tied to prior knowledge. If the task is too difficult, for example, or if the student does not understand the directions, the student will not be able to show what he can do. During assessment the student's prior knowledge must be matched with the demand of the task. When the student is asked to perform on his instructional level, the assessment should reveal the kind of skills he already has.

Think--How does the student think?
This can only be assessed by having the student "think aloud" and explain how she came to her answer. This question assumes that, by knowing how a student thinks, a teacher is better able to address what the student needs to know.

Approach--How does a student approach what he is unsure of?
This can be assessed by observing the student's behavior during the assessment, particularly on challenging tasks. How persistent is the student? How much frustration can the student tolerate before giving up? Does the student ask for assistance even when he knows the answer?

Plan--As a teacher, how can I use this information to plan the student's instruction?
At this point the assessment data must be analyzed by specifically looking at what skills the student can perform, the demands of the tasks she is expected to perform, and any gap between the two. The instruction must be focused on filling the gap between what the student can do and the demands of the task. In other words, the instructional methods must connect with the student's strengths.

Typically, special education has employed assessment techniques based on a deficit model to design IEP interventions. The Gickling model of CBA provides a tool to identify student strengths to inform intervention design. The purpose of this model is to find out what students know and can do in order to design instructional interventions that accelerate learning. Only by increasing the quality of instruction at the intersection where new learning and prior knowledge connect can learning be accelerated.

References
Burns, M.D. (2002). Comprehensive system of assessment to intervention using curriculum-based assessments. Intervention in School and Clinic, 38, 8-13.

Gravois, T., & Gickling, E. (2002). Best practices in curriculum-based assessment. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology IV (pp. 1-13). Washington, DC: National Association of School Psychologists.

Rosenfield, S., & Gravois, T. (1996). Instructional consultation teams. New York: Guilford Press.

Other resources: The Considerations Packet Instructional Assessment: An Essential Tool for Designing Effective Instruction may be downloaded from the T/TAC website at . For additional CBA suggestions, send requests to .

Date: Feb/Mar 2004