
Designing the IEP: Measuring and Reporting
Progress Toward Mastery of Annual Goals

By Dale P. Pennell, C.A.S.
February/March 2013

 

After IEP teams write annual goals and objectives, they must determine who will assess student progress toward achieving the goals and objectives, how that progress will be measured, and how often the local education agency will report it (IDEA §300.320). IDEA does not require that all IEP teams write corresponding short-term objectives for annual goals. Nevertheless, IEP teams must still determine how to measure progress toward mastery of annual goals. IEPs that include observable, relevant, and assessable objectives that describe the frequency and means of determining their mastery provide, in effect, a monitoring and evaluation plan for the annual goals to which they correspond.

 Purpose for Measuring and Reporting Progress and Mastery 

Systematic, ongoing assessment and reporting of student progress enables educators to “substantiate what the student is learning, the effectiveness of materials and methods being used during instruction, and the efficacy of the IEP” (Gleckel & Koretz, 2008, p. 211).  If periodic assessments reveal that students are not making adequate progress, educators have ample opportunity to try alternate interventions. Eventually, data collected on a regular basis may indicate the need for IEP teams to reconsider the appropriateness of the annual goal(s) in question. 

Measuring Progress and Mastery

Gleckel and Koretz (2008) suggest several questions IEP teams may use to guide the development of monitoring and evaluation plans. Three of these questions follow.

  1. What evidence will document student progress toward and mastery of objectives and stated goals?
  2. What tools will generate this evidence?
  3. How often will educators collect data and report progress?

Appropriately written annual goals, as well as the short-term objectives that support goal achievement, describe how to determine mastery of goals and objectives. (See the Link Lines article entitled Designing Meaningful IEPs – Selecting and Writing Annual Goals and Short-Term Objectives (November/December 2012) for further information about the content of annual goals and objectives.) Additionally, IEP teams must specify the ways in which progress will be monitored. These methods generally fall into the following three categories, each of which includes various assessment strategies:

  • Direct measures, which “provide valid and reliable indications of student progress” (Etscheidt, 2006, p. 58), including:
    • Frequency recording, a count of how often a target behavior occurs within a given period
    • Curriculum-based assessment, direct observation and repeated recording of student responses to specific academic tasks
  • Indirect measures that supplement direct measures, including:
    • Rubrics, scales that describe performance in quantitative or qualitative terms
    • Attainment scaling, a process in which teachers rate student responses on a five-point scale of best-to-worst performance 
    • Student self-monitoring, whereby students are taught to monitor and record their own performance
      (Etscheidt, 2006)
  • Authentic measures, including:
    • Anecdotal notes of informal interviews with students
    • Portfolios of student work samples
    • Video recordings of student performance
      (Etscheidt, 2006) 
Following are illustrations of several devices and formats that are useful for monitoring and recording academic and functional skill attainment.
  

Frequency Recording

Frequency recording involves measuring the number of times a behavior occurs within a given period. 

Example: Annual Goal (Functional Area – Communication): By March 2013, when given an augmentative communication device, David will communicate his needs in 3 or more environments (e.g., cafeteria, school store, and community-based site) at least 4 times per school day for at least 3 out of 5 consecutive school days.

For each date that David is observed:

  • Record each observation location.
  • Record with a hash mark each time the behavior occurs while in the location.

[Illustration: frequency recording form for David's communication goal]
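To see how the tallies recorded on such a form translate into a mastery decision, here is a minimal sketch in Python. The counts, location names, and function names are invented for illustration, and the sketch assumes one reading of the criterion: at least 4 communications per day, across 3 or more environments, on at least 3 of the last 5 school days.

# Hypothetical frequency-recording data: for each school day, the number of
# times David used the device to communicate a need, tallied by location.
daily_tallies = [
    {"cafeteria": 2, "school store": 1, "classroom": 1},   # 4 total, 3 locations
    {"cafeteria": 1, "classroom": 2},                        # 3 total, 2 locations
    {"cafeteria": 2, "school store": 1, "community site": 2},
    {"cafeteria": 3, "classroom": 1, "school store": 1},
    {"cafeteria": 2, "community site": 1, "classroom": 1},
]

def day_meets_criterion(tally, min_total=4, min_locations=3):
    """A day counts if the behavior occurred at least min_total times
    across at least min_locations different environments."""
    return sum(tally.values()) >= min_total and len(tally) >= min_locations

def goal_met(tallies, days_required=3, window=5):
    """Enough days within the most recent 5-day window must meet the criterion."""
    qualifying_days = sum(day_meets_criterion(t) for t in tallies[-window:])
    return qualifying_days >= days_required

print(goal_met(daily_tallies))  # True when 3 or more of the 5 days qualify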

 

Curriculum-Based Assessment

Curriculum-based assessment involves direct observation of student performance and repeated recording of student responses in relation to specific academic tasks. 

Example: Annual Goal (Academic Area – Literacy): By October 2013, when assigned literary works of fiction, Anita will correctly answer inferential comprehension questions that address the elements of main idea, setting, characters, and plot with at least 80% accuracy for each story element in 2 out of 3 consecutive trials.

Before the assessment trial begins, the assessor records the open-ended (uncued) and targeted (cued) inferential comprehension questions he or she will ask Anita for each story element. Each time Anita answers a question correctly, the assessor indicates this by placing a check mark in the appropriate column.
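A similar sketch, again in Python with invented scores, shows one way the assessor's check marks could be tabulated against the criterion of at least 80% accuracy for each story element in 2 out of 3 consecutive trials.

# Hypothetical curriculum-based assessment record: for each trial, the number
# of inferential questions answered correctly out of the number asked, per element.
trials = [
    {"main idea": (4, 5), "setting": (5, 5), "characters": (3, 5), "plot": (4, 5)},
    {"main idea": (5, 5), "setting": (4, 5), "characters": (4, 5), "plot": (4, 5)},
    {"main idea": (4, 5), "setting": (4, 5), "characters": (5, 5), "plot": (5, 5)},
]

def trial_meets_criterion(trial, threshold=0.80):
    """A trial qualifies only if every story element is at or above the threshold."""
    return all(correct / asked >= threshold for correct, asked in trial.values())

def mastery_reached(trials, trials_required=2, window=3):
    """At least 2 of the 3 most recent consecutive trials must qualify."""
    return sum(trial_meets_criterion(t) for t in trials[-window:]) >= trials_required

print(mastery_reached(trials))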

[Illustration: curriculum-based assessment recording form]

Rubrics

A rubric is a scale that describes performance in quantitative or qualitative terms. Rubrics vary widely and can be used for any subject or functional performance task.

Example: Annual Goal (Functional Area – Social Competence): By May 2013, when working in cooperative learning groups, Grady will accept group rules for making decisions and complete his roles agreeably (without incident or argument) in at least 3 of 4 consecutive opportunities. 

Before the assessment trial begins, the assessor records in the “Expectation” column desired behaviors targeted in the annual goal. Next, the assessor records in the “Consistently Demonstrates” column descriptors that illustrate how these desired behaviors look when Grady exhibits them. The three columns between the “Expectation” column and the “Consistently Demonstrates” column describe the continuum of behaviors Grady might exhibit as he progresses toward the performance criteria described in the “Consistently Demonstrates” column. The assessor observes Grady during small-group activities and records the number that reflects Grady’s performance for listening, cooperating, and respecting others.  
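As a rough illustration, such a rubric and Grady's ratings might be represented as follows (Python). The level descriptors and scores are invented, the sketch assumes a 4-point scale matching the four performance columns described above, and it treats the goal's “agreeably, without incident or argument” expectation as the “cooperating” criterion.

# Hypothetical 4-point rubric for the "cooperating" expectation
# (level 4 corresponds to the "Consistently Demonstrates" column).
COOPERATING_LEVELS = {
    1: "Argues about group rules; does not complete assigned role",
    2: "Accepts rules and completes role only with repeated prompting",
    3: "Usually accepts rules and completes role with minor reminders",
    4: "Accepts group rules and completes role agreeably, without incident or argument",
}

# Ratings recorded across four consecutive cooperative-group opportunities.
observations = [
    {"listening": 3, "cooperating": 4, "respecting others": 3},
    {"listening": 4, "cooperating": 4, "respecting others": 4},
    {"listening": 3, "cooperating": 3, "respecting others": 4},
    {"listening": 4, "cooperating": 4, "respecting others": 4},
]

# Grady's goal: the top "cooperating" level in at least 3 of 4 consecutive opportunities.
top_level = max(COOPERATING_LEVELS)
hits = sum(obs["cooperating"] == top_level for obs in observations)
print(f"Cooperating at level {top_level} in {hits} of {len(observations)} opportunities")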

[Illustration: rubric for Grady's cooperative learning group behaviors]

 Student Self-Monitoring

Student self-monitoring involves students evaluating and recording their own behaviors or performance.  

Example: Annual Goal (Functional Area – Adaptive Behavior): By November 2013, when Kirk does not understand instruction during his mathematics class, he will express frustration/confusion by raising his hand, waiting for the teacher to acknowledge him, and either letting the teacher know he does not understand or asking clarifying questions in 4 out of 5 consecutive class periods for two weeks.

Each time Kirk monitors his progress on this goal, the assessor gives him this self-monitoring template and asks him to record how he responds each time he does not understand what his teacher is teaching or asking him to do during the class period.
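A short sketch, in Python with invented log entries, of one way Kirk's self-recorded responses could be rolled up against the goal's criterion of success in 4 out of 5 consecutive class periods:

# Hypothetical self-monitoring log: for each class period, Kirk checks off the
# steps he followed each time he did not understand the instruction.
STEPS = ("raised hand", "waited to be acknowledged", "told teacher or asked question")

# One entry per class period: each incident is the set of steps Kirk used.
week = [
    [set(STEPS), set(STEPS)],                        # followed all steps both times
    [{"raised hand", "waited to be acknowledged"}],  # skipped the last step
    [set(STEPS)],
    [set(STEPS), set(STEPS), set(STEPS)],
    [set(STEPS)],
]

def period_successful(incidents):
    """A class period counts when every logged incident includes all three steps."""
    return all(incident == set(STEPS) for incident in incidents)

successes = sum(period_successful(p) for p in week)
print(f"{successes} of {len(week)} class periods met the expectation")  # 4 of 5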

[Illustration: student self-monitoring template]

Student Interviews

Student interviews refer to the process of recording information provided by students during informal interviews. 

Example: Annual Goal (Academic Area – Literacy): By February 2013, when asked to identify strategies that she finds most helpful for learning content-specific vocabulary, Marketa will describe 2 strategies she has used most effectively to learn new vocabulary (correlated to weekly vocabulary quiz results in science and history/social science classes) for 2 consecutive units of study in each course.

  1. The interviewer meets with Marketa once her unit test has been scored and asks her to determine how often she has used each of the listed (taught) strategies to learn the vocabulary assessed on the test.  
  2. Then the interviewer and Marketa review her performance on the vocabulary portion of the unit test. The interviewer asks Marketa to identify the strategies she used to learn the words she defined correctly on the test.

[Illustration: student interview recording form]

Reporting Progress and Mastery

While annual goals and objectives provide target dates for mastery, they do not include schedules for periodic progress monitoring, nor do they describe how student performance will be reported to educators, families, and students. IDEA requires that educators report the results of progress monitoring activities at least as often as assessment data are reported for students without disabilities. However, this may not be often enough to inform the revision of instructional strategies or the need to adjust annual goals and objectives.  Etscheidt (2006) offers the following guidelines to help IEP teams determine how often progress should be measured and recorded:

[Illustration: Etscheidt's (2006) guidelines for determining how often to measure and record progress]

Progress-monitoring data may be shared with the student, other staff members, and the student’s family in a variety of ways. These include:

  • Face-to-face conferences,
  • Phone calls,
  • Copies of data summaries, such as charts, graphs, data-collection templates, and
  • Computer technologies, such as e-mail and video-conferencing.

Parents who are members of the IEP team should be invited to identify their preferred means of receiving these data.

Once the IEP team determines how it will assess progress toward mastery of annual goals, it may begin to identify structures the local education agency will employ to move the student forward. These structures – specially designed instruction, related services, supplementary aids and services, modifications, and supports for school personnel – will be addressed in the March-April 2013 edition of Link Lines.

References

Etscheidt, S. K. (2006).  Progress monitoring: Legal issues and recommendations for IEP teams. TEACHING Exceptional Children, 56-60.

Gleckel, E. K., & Koretz, E. S. (2008). Collaborative individualized education process. Upper Saddle River, NJ: Pearson/Merrill/Prentice Hall.

Hargrove, L. J., Church, K. L., Yssel, N., & Koch, K. (2002). Curriculum-based assessment: Reading and state academic standards. Preventing School Failure, 46(4), 48-51.

Individuals with Disabilities Education Improvement Act (IDEA) Public Law 108-446. (2004). Retrieved October 10, 2010, from http://idea.ed.gov/download/statute.html.

Maag, J. W. (2004). Behavior management: From theoretical implications to practical applications (2nd ed.). Belmont, CA: Wadsworth. 

Tieghi-Benet, M. C., Miller, K., Reiners, J., Robinett, B. E., Freeman, R. L., Smith, C. L., Baer, D., & Palmer, A. (2003). Encouraging student progress (ESP), student/team book. Lawrence: University of Kansas.