
Special Education Handbooks

posted Dec 12, 2012, 8:56 AM by Patti Larriva
The following letter was sent to institutions by Stanford University on June 5, 2012.

We want to thank you for your participation in the TPAC field test. It was designed to try out TPA instructions, prompts, and rubrics and to examine variations in candidate performance. In this respect, the field test was very successful. Unfortunately, we also learned that the handbook and rubrics require major revision, along with stronger communication with faculty supporting candidates. Of the submissions reviewed in the first and second scoring windows, fewer than 20% were scorable. Some of this was attributable to ambiguities in the handbook instructions, some to assumptions about common understandings of key terms (such as instructional objectives and performance records) that did not appear to be in place, and some to candidate error. The low percentage of scorable TPAs has led to a joint decision by Stanford and Pearson NOT to score the Special Education (Other Settings) TPAs that were submitted during the field test.

In place of scores, we are providing three sets of observations, based on a review of a small number of complete TPAs, for your consideration: 1) common errors affecting multiple rubrics that prevented the assignment of a significant number of scores; 2) observed strengths; and 3) practices that concerned the reviewers.

1. Common errors that prevented scoring of several rubrics included:

  • A significant percentage of TPAs were missing required evidence, e.g., video clips or commentaries.
  • The intention was to see how candidates could teach toward two different instructional targets. An option was given to describe instruction of two students instead of one. Unfortunately, this option was not read in tandem with the instruction about the two different instructional targets, so several candidates described two students, each with an academic instructional target.
  • The instruction/intervention did not address all of the instructional objectives.
  • Many candidates were missing performance records. Some submitted a student work sample instead of a record of performance over time, and some did not submit performance records that addressed all instructional objectives.
  • No communication demand was identified.
2. Reviewers observed the following strengths in the sample of TPAs:
  • A learning environment with strong social-emotional support
  • Well-designed integration of instruction and support for two learning targets from different areas
  • Documentation of performance across time that showed the extent of progress toward each instructional objective
  • Documentation of performance over time that showed either how varied contexts (e.g., the extent of support, individual vs. small group setting) affected performance or the extent to which the learning objective was generalized
  • Capitalizing on a teachable moment that enabled student progress
3. The reviewers noted the following concerns:
  • Descriptions of students that were not clear enough to judge the appropriateness of instruction
  • Instructional objectives that were not appropriate for the student as described
  • Use of materials that were not developmentally appropriate for older students
  • Over-support of students, not allowing them to show the full extent of what they could do
  • A learning environment with strong social-emotional support but with weak support for the content to be learned, perhaps owing to unfamiliarity with the academic content
  • Good teaching that clearly affected student performance in the video but was not relevant to the instructional objective named by the candidate
These observations will be used to inform revisions to the instructions and prompts, additions to the glossary, revisions to the rubric level descriptions, and communication with faculty supporting candidates.
