Division of Education Educator Preparation Programs


CAEP Annual Reporting Measures

Display of candidate performance data

2020 CAEP Impact and Outcome Measures 

  1. Impact on P-12 learning and development (Component 4.1)
  2. Indicators of teaching effectiveness (Component 4.2)

 

MSU faculty are currently engaged in case study research as a strategy to assess the effectiveness of completers in the classroom and their impact on P-12 student learning. Findings and results are presented in the completed manuscripts available on this webpage as attached files; these include detailed accounts of recruitment, methodology, research questions, and analysis.

The three cycles of data were phased in, beginning with protocol development in 2016, followed by purposeful sampling of participants, implementation, refinement, and further data collection. A pilot with two elementary completers occurred in spring 2017. Two completers from elementary education and two from secondary science education participated in spring 2018. In spring 2019, participants were recruited from the remaining areas of early childhood (EC), special education (SPED), and the Master of Arts in Teaching (MAT). Recruitment scripts were sent to 11 eligible completers (early childhood n = 6; SPED n = 3; MAT n = 2); none agreed to participate. Of the three SPED graduates, only one was currently teaching in a special education position; the others were employed in regular education settings, and the one eligible completer did not consent to participate. The MAT and EC completers noted their reasons for not participating: limited time due to other school-related obligations (e.g., coaching, assessment), personal responsibilities, and the need for supervisor participation/consent. In this way, the EPP attempted to represent all program areas and has maintained a continuing cycle of case studies examining completer performance across different grades and subjects. During the three cycles, the only program with more than 10 completers in a cohort year was elementary education. Validity and reliability of measures within the case study are described in the instrumentation sections of the manuscripts. Data from all six participants were analyzed together and results presented collectively.
Multiple sources of data were examined to assess the growth and achievement of, and impact on, students. These included results from student engagement surveys, standardized assessment growth percentile scores, teacher-selected pre-post assessments, and phone interviews with graduates and their supervisors.

A cross-case review was conducted to look for patterns across findings related to P-12 student learning. Overall, according to teacher-linked student achievement tests, a majority of students taught by EPP completers met expected norms and demonstrated growth over time after receiving instruction, with the exception of the students of one completer.

As a measure of student learning growth, completers also submitted de-identified data from pre- and post-assessments. A majority of students in completers' classrooms demonstrated growth; this was true for 100% of students in the classrooms of four completers. Only one student in the fifth completer's classroom did not demonstrate growth on the math pre-post assessment; the student's score decreased by one question answered incorrectly. The sixth completer did not submit any comparative pre/post data for analysis.

Completers were also interviewed to collect data identifying their perceptions of factors related to their ability to impact student learning. A cross-case analysis was conducted to identify commonalities across cases and to gather evidence of influences on student learning. Analysis revealed three main themes: utilizing collaboration as a tool for improvement, the impact of reflection on teaching practices, and the role self-efficacy plays in student learning. The full analysis and more detailed discussion of results are available in the completed manuscripts.

It was evident that EPP completers recognize their role in the classroom and how the effectiveness of their instruction impacts students. According to multiple measures, students in these completers' classrooms did indeed demonstrate expected levels of learning. As a descriptive case study, the EPP did not attempt to infer direct causality between teacher preparation or teaching effectiveness and student outcomes; however, it appears completers contributed to positive student outcomes amid multiple influencing factors.
Foremost, results from learner outcomes reaffirmed the EPP's commitment to ensuring every learner has a competent, effective, engaging, and reflective educator.

The EPP received the Mayville State First Year Teacher Effect on Student Outcomes report from the SLDS on Thursday, January 9, 2020; the report is available as Addendum File #A21. Students of first-year EPP teachers demonstrated a growth percentile distribution in math at the 48th percentile, with a range of 22.5-73 percentile points. This distribution was slightly below that of all ND non-first-year teachers (50th) and other ND first-year teachers (49th). Math proficiency levels of the student cohort of EPP first-year teachers increased from 35.4% of students proficient or advanced to 41.7%. In reading, students of first-year EPP teachers demonstrated a growth percentile distribution at the 48th percentile, with a range of 21.5-80 percentile points. This distribution was slightly lower than that of all ND non-first-year teachers (50th) and the same as other ND first-year teachers (48th). Language arts proficiency levels of the student cohort of EPP first-year teachers increased from 39.1% of students proficient or advanced to 44.5%.

The EPP continues to find the state-provided data inadequate for evaluating program impact due to a limited sample, the inability of the data to be disaggregated by program or directly linked to completers, and test scores limited to reading, language arts, and math, which account for only a portion of EPP completers' teaching areas. In addition, student data are available only for completers who remained in ND to teach. Overall, the EPP continues to determine that state-provided, teacher-linked student achievement data do little to inform continuous improvement efforts, and thus the EPP continues to pursue other sources of evidence (e.g., teaching effectiveness survey questions, supervisor evaluations, and case study research).

Anderson, S.K., Hagen, B., Whitsel, C., Dulski-Bucholz, A., Smith, K., & Willeson, A. (2019). Leveraging case study research: A mechanism for measuring teaching effectiveness. Mid-Western Educational Researcher, 31(1), 1-43.

Anderson, S.K., Hagen, B., Whitsel, C., Smith, K.D., & Duffield, S. (2020). Graduate impact on student learning: A descriptive case study. (Under peer review)

Madler, A., Anderson, S. K., & Smith, K. (2020). Perceptions of teacher preparation for classroom diversity: A secondary data analysis. (Under peer review)

3. Satisfaction of employers and employment milestones (Component 4.3)

Findings on employer satisfaction are collected from surveys administered to supervisors of first-year MSU teachers each spring. The survey asks supervisors to assess the quality of graduates' instructional practices, their ability to work with diverse learners, their ability to establish a positive classroom environment, and their level of professionalism. The survey is administered to direct supervisors of teacher preparation graduates employed in schools approximately one year after the teachers completed their preparation programs.

The Division of Education has set an acceptable target of a mean of 2 or higher on all indicators (all responses at "tends to agree" or "agree"), indicating employers are satisfied with the completers' preparation for their assigned responsibilities in working with P-12 students. An ideal target has been set at a mean of 3 or higher on all indicators (all "agree"). Results of the survey are reviewed annually by the Division of Education faculty.

Supervisor Survey Scores - employers of teachers completing the TTS (4-point scale)

                            Year 1   Year 2   Year 3   Year 4
                            (2016)   (2017)   (2018)   (2019)
Instructional Practice       3.39     3.19     3.66     3.52
Diverse Learners             3.43     3.15     3.64     3.62
Learning Environment         3.46     3.28     3.66     3.59
Professionalism              3.49     3.40     3.75     3.73
Overall Supervisor Ratings   3.42     3.23     3.67     3.62

4. Satisfaction of completers (Component 4.4)

Findings on completer satisfaction are collected from surveys administered to completers the academic year following their graduation. All graduates are invited to complete the survey, but those who are teaching complete an additional section to rate the quality of their preparation. The survey is administered approximately one year after the graduates completed their preparation programs. The Division of Education has set an acceptable target of a mean of 2 or higher on all indicators (all responses at "tends to agree" or "agree") and approximately commensurate means with the ND state aggregate, indicating completers perceive their preparation as relevant to the responsibilities they confront on the job and as effective. An ideal target has been set at a mean of 3 or higher on all indicators (all "agree") with results above the ND aggregate. Results of the survey are reviewed annually by the Division of Education faculty.

Transition to Teaching Survey (TTS) Scores - 1 year after graduation (4-point scale)

                            Year 1   Year 2   Year 3   Year 4
                            (2016)   (2017)   (2018)   (2019)
Instructional Practice       3.39     3.28     3.40     3.55
Diverse Learners             3.16     2.96     3.06     3.28
Learning Environment         3.46     3.49     3.37     3.54
Professionalism              3.48     3.45     3.33     3.55
Overall Graduate Ratings     3.37     3.30     3.29     3.48

 

 

 
5. Program enrollment and graduates for 2018-2019:

                    2013-14  2014-15  2015-16  2016-17  2017-18  2018-19
Total Enrollment       95       90       89      101      131      157
Total Completers       44       47       42       41       39       61
 

6. Ability of completers to meet licensing requirements: Institutional pass rate on Praxis exams: 95%. Average undergraduate GPA: 3.50. Average graduate GPA: 3.70.

 

7. Ability of completers to be hired in education positions for which they have prepared (placement patterns of completers, 2018-2019): Graduate survey data indicate that 96.6% of EPP graduates are employed and 5.2% are continuing their education; 72.6% are employed in ND, helping to meet state teacher shortage needs.

8. Student loan default rates & other consumer information:

 

To ensure appropriate stakeholders are involved in the continuous improvement process, annual findings are brought for feedback and discussion to the Teacher Education Advisory Board, which includes alumni, employers, practitioners, and school and community partners. Action plans are shared and revised based on stakeholder feedback. Additionally, information on annual reporting measures and program outcomes is included in all MOUs and shared via the MSU Foundations Office and the Division of Education Facebook page.

Annual Review Important Finding #1: Sustaining Growth

Finding: Over the last six years, as institutional enrollment has continued to climb, the number of candidates enrolled in and admitted to BSED programs has doubled. The number of enrolled candidates increased from 95 in 2013-2014 to 186 in the fall of 2019. Furthermore, 394 students had declared education as a major in 2019; as the results show, declared majors have also doubled during this time frame. In addition, more students are being retained from the time of declaring a major to being admitted to the program (from 33% to 47%), which is increasing course enrollment, the number of course sections, and the formats in which courses are offered. It is positive and exciting to see the quality of the educator preparation program being noted at the state level and in professional organizations, prompting individuals who want to be highly effective teachers to apply to the MSU EPP. As an example of enrollment growth, the course enrollment numbers for required professional education courses at the beginning, middle, and end of teacher training programs are indicated below for the most recent academic year, with five years prior for comparison.

The increased enrollment has also impacted field experiences and collaborations with P-12 school partners. In 2018-2019, the Student Placement Coordinator arranged 310 separate experiences; for the fall 2019 semester alone, 211 placements have been made. In 2016-2017 the Division maintained Memoranda of Understanding (MOUs) with 92 school districts; by September 2019, MOUs had increased to 195. As distance and online programs continue to grow, outreach to new districts for partnership in clinical experiences has increased accordingly. This has resulted in additional onboarding of clinical partners covering program expectations, learning management systems, and training in the use of EPP-wide evaluations such as the disposition evaluation and the Skills of Teaching Observation Tool (STOT).

In addition, the Division has submitted Stage I proposals, and is preparing Stage II proposals, for two new programs and one new certificate: a graduate-level Master of Education (M.Ed.), a Special Education/Registered Behavior Technician (RBT) certificate, and a non-teaching degree in Applied Behavior Analysis (ABA) for Board Certified Assistant Behavior Analysts (BCaBA). Growth in multiple areas has resulted in ongoing discussions regarding course enrollment caps, creating new course sections, faculty overload, hiring new and adjunct faculty, the number of field placements, academic advising, selectivity of program admissions, overall capacity, and maintaining rigor and a commitment to personal service.

Action: Identify and plan changes to processes, practices, programs, systems, and resources needed to respond to increased enrollment and program growth.

 

Annual Review Important Finding #2: Selectivity Measures at Program Completion

Finding: The measures of candidates at completion of the program indicate that the acceptable targets, and most often the ideal targets, established by the EPP faculty are met on multiple measures across all levels of preparation. The requirements for continuance and the student success plan process are supporting candidates as well as serving to maintain selectivity of future educators. A number of candidates have been retained through additional supports, and others have been counseled out of the program to complete non-teaching degrees. Multiple measures are used to evaluate teacher candidates at completion of their program in the areas of knowledge, skills, and dispositions. In the recently completed CAEP self-study report, these measures for CAEP components 3.5 and 3.6, "Selectivity at Completion," included:

  • EPP major coursework passed with a final grade of 'C' or better (SLO 2)
  • Student engagement survey results during student teaching (SLO 1)
  • Pre-post assessment results during student teaching (SLO 3)
  • Service learning lesson during student teaching (SLO 4)
  • 1.1.1.3/4 Skills of Teaching Observation Tool (STOT) evaluation during student teaching (all SLOs)
  • 1.1.3.8 disposition during student teaching (SLO 4)
  • 1.1.2.6 Lesson Plans EDUC 400 Student Teaching (1.1.3.9, 1.2.1, 1.4.2) during student teaching (SLO 3)
  • 1.1.1.6, 1.1.2.8, 1.1.3.5, 1.1.4.5 Portfolio (all SLOs)
  • 1.1.1.7 PRAXIS II Subject Area Assessment Pass Rates (1.1.2.1, 3.5.4) (SLO 2)
  • 1.1.1.8 PRAXIS II Principles of Learning and Teaching (PLT) exam (SLO 3)
  • 3.5.1 minimum grade point average of 2.75 (SLO 2)
  • 3.6.1 MCEE pre and post assessment (EDUC 401s); ETS ProEthica (for MAT) (SLO 4)

The items with coded evidence numbers are a part of the quality assurance system annual review process across the EPP. Coursework information is maintained in the admission and continuance database and is re-confirmed for each student during the graduation audit. The other three items used as measures of completer preparedness are assignments during student teaching that are currently captured only in the learning management system and are being added to the quality assurance system as measures. The list of indicated measures, along with program growth relative to the number of faculty available to review capstone portfolios at Checkpoints 1, 2, and 3, suggests the Division should consider revising selectivity measures at the end of the program.

Action: Research and revise the Capstone Portfolio, integrating AAC&U High-Impact Practices.

 

Author: Brittany Hagen (CAEP)
Last modified: 5/6/2020 7:12 PM (EDT)