Richard Therrien

Sample School-Wide Improvement

 

ACTION PLAN

 

 

1) Data Analysis

 

2) Action Plan

 

3) Professional Development

 

4) Communication Plan

 


 

DATA ANALYSIS

Data analysis of the grade distributions among courses and teachers, as compared to CAPT score levels, shows that there is a need for improvement in:

            Curriculum coordination

            Assessment coordination

            Alignment of assessment to CAPT goals

To analyze this data, I was handed a list of grades by teacher (final grades, midterms, and exams) from the past two years. I also had CAPT score distributions for the last several years. My initial reaction was to see whether there was a relationship between the subjects taught and the grade distribution. One thing that has concerned us greatly in the last several years is the disparity in achievement between subjects, as well as between teachers. There is also a great concern that grades in specific courses do not necessarily correspond to the levels assigned to those courses. In our school, we have leveled courses: a “B” in an Honors course is supposed to represent the same level of work and skill as an “A” in an Academic course.

At first I intended simply to compare grade distributions, but I realized that this would not include much student data. The only method we have of determining a student’s real school-wide level, independent of teacher, subject, and course level, is the annual CAPT scores. It just so happens that the five levels of CAPT scores correspond to the five letter grades (F to A).

I took the list of final grades by course and teacher from June 2004 and tallied how many F’s, D’s, C’s, B’s, and A’s were given in each course. I did not separate these by teacher as I had done with the fall data; I do not feel comfortable analyzing teachers and their grading other than those in my own department. I did count the Honors and Academic levels separately. After doing this, I realized that I would not be able to use any courses other than those in Math, Science, and English, since these are the major categories of the CAPT test. I also broke down the grades from last year by student age and grade level: students taking courses as seniors now took the CAPT in 2003, and students who are juniors now took the CAPT in 2004.

The ideal, of course, would have been to have individual students’ CAPT scores and grades in each course. I only have this for the science classes, and I wanted to examine data that was a bit more school-wide.

After some time working with the SPSS program, part of my frustration came from the realization that I was comparing raw counts for grades against percentages for CAPT levels. I had to go back to Excel and convert the grade counts into percentages.
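Were I to repeat this step, the count-to-percentage conversion could be scripted instead of done by hand. Here is a minimal Python sketch; the course names and counts are invented placeholders, not my actual data:

```python
# Convert raw grade counts per course into percentage distributions,
# so they can be compared directly with CAPT level percentages.

# Hypothetical tallies: counts of F, D, C, B, A for each course group.
grade_counts = {
    "Science Honors":   [2, 5, 10, 20, 15],
    "Science Academic": [6, 12, 25, 18, 8],
}

def to_percentages(counts):
    """Turn a list of counts into percentages of the total."""
    total = sum(counts)
    return [100.0 * c / total for c in counts]

for course, counts in grade_counts.items():
    print(course, [round(p, 1) for p in to_percentages(counts)])
```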

This was probably the most successful part of my experience: by this point I had taken an enormous amount of data and compiled it down into a useful set (12 data sets: 2 course levels and 2 grade years for each of 3 subjects).

 

After importing into SPSS (for perhaps the third time), I was faced with the dilemma of how to analyze my data. I used paired-samples correlations at first. A two-tailed t-test showed that the distributions for each course level and grade level had little correlation; the closest pairing, the percent of D’s against the percent of Level 1’s, had a significance of .959.
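For reference, the paired-samples output SPSS produces can be reproduced with scipy. In this sketch the twelve values per list are placeholders standing in for my real data sets:

```python
# Paired comparison of two distributions: percent of students receiving D's
# versus percent scoring at CAPT Level 1, across 12 hypothetical data sets.
from scipy.stats import pearsonr, ttest_rel

pct_d      = [12.0, 8.5, 15.0, 10.2, 9.8, 14.1, 7.5, 11.3, 13.0, 9.0, 10.5, 12.2]
pct_level1 = [11.5, 9.0, 14.2, 10.8, 10.1, 13.5, 8.0, 10.9, 12.6, 9.4, 11.0, 11.8]

r, r_sig = pearsonr(pct_d, pct_level1)   # paired-samples correlation
t, t_sig = ttest_rel(pct_d, pct_level1)  # two-tailed paired t-test
print(f"r = {r:.3f} (sig. {r_sig:.3f}); t = {t:.3f} (sig. {t_sig:.3f})")
```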

I also tried using the means for each group, assigning grades the traditional 0-4 score, which aligns with the 0-4 numbering of the CAPT levels. The means did not show anything out of the ordinary, except that the math classes show a somewhat wider difference between the Honors and Academic courses. It is also clear that the Honors grades have a higher mean, which is not necessarily to be expected if the courses are supposed to be harder at that level. The CAPT means could not be broken down by level, but they show, as expected, the improvement between 2003 and 2004.
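Computing those group means is just a weighted average over the 0-4 scale. A quick sketch, again with invented distributions:

```python
# Mean grade on the traditional 0-4 scale (F=0 ... A=4), computed from a
# percentage distribution rather than from individual student records.

def mean_from_distribution(percents, scores=(0, 1, 2, 3, 4)):
    """Weighted mean: each scale point weighted by its percent of students."""
    return sum(s * p for s, p in zip(scores, percents)) / sum(percents)

# Hypothetical percent of F, D, C, B, A in an Honors vs. an Academic course.
honors   = [2.0, 6.0, 20.0, 40.0, 32.0]
academic = [8.0, 15.0, 35.0, 30.0, 12.0]
print(mean_from_distribution(honors))    # 2.94
print(mean_from_distribution(academic))  # 2.23
```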

Correlations of the means did not yield any useful information, since there is so little data to work with (the same 12 data sets described above).

Another test that allows comparison between paired sets of data is the sign test. It examines the sign of the difference within each pair: some differences will be positive and some negative, but if the two distributions really agree, the positive and negative signs should roughly balance, and any significant imbalance would show up.

Unfortunately, this did not yield useful data either.
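For completeness, the sign test can also be run outside SPSS by counting the positive pair differences and applying a binomial test (the arrays repeat the placeholders above so the snippet stands alone):

```python
# Sign test: if the two distributions really agree, positive and negative
# pair differences should be about equally likely.
from scipy.stats import binomtest

pct_d      = [12.0, 8.5, 15.0, 10.2, 9.8, 14.1, 7.5, 11.3, 13.0, 9.0, 10.5, 12.2]
pct_level1 = [11.5, 9.0, 14.2, 10.8, 10.1, 13.5, 8.0, 10.9, 12.6, 9.4, 11.0, 11.8]

diffs = [a - b for a, b in zip(pct_d, pct_level1) if a != b]  # drop ties
n_pos = sum(1 for d in diffs if d > 0)
result = binomtest(n_pos, n=len(diffs), p=0.5)  # two-sided by default
print(f"{n_pos} positive of {len(diffs)} differences, p = {result.pvalue:.3f}")
```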

At this point, I was unable to use the SPSS program for a while. I continued my examination of the Excel data set and consulted some of my AP Statistics students.

 

After looking closely, it became apparent that no single test would show what I wanted. It is not appropriate to use t-test methods, since the population scores are not expected to follow a normal distribution and are not treated as a random sample. What I really want to do is compare the distribution of proportions in a single population. In this case, the chi-square test for goodness of fit is more appropriate. Does the distribution of grades that students receive match the percentages of those students at each level on the CAPT scores? There are 5 levels, thus 4 degrees of freedom in comparing these two distributions. Using a significance level of .05 with 4 degrees of freedom, the chi-square statistic would have to stay below the critical value of about 9.49 for the two distributions to count as a good fit. My data is obviously not going to meet this test. I calculated chi-square in SPSS as a computed variable, taking the percent at each grade level as my expected values and the percent at each CAPT level as my observed values: χ² = Σ (O − E)² / E.
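The same computation is straightforward in scipy, including the critical-value check for 4 degrees of freedom; the observed and expected percentages here are invented for illustration:

```python
# Chi-square goodness of fit: does the grade distribution (expected) match
# the CAPT level distribution (observed)? Five categories give 4 df.
from scipy.stats import chi2, chisquare

observed = [10.0, 20.0, 30.0, 25.0, 15.0]  # hypothetical percent at each CAPT level
expected = [8.0, 18.0, 34.0, 27.0, 13.0]   # hypothetical percent at each grade, F..A

stat, p = chisquare(f_obs=observed, f_exp=expected)
critical = chi2.ppf(0.95, df=4)  # about 9.49
print(f"chi-square = {stat:.2f}, p = {p:.3f}, critical value = {critical:.2f}")
# A statistic below the critical value means the two distributions do not
# differ significantly, i.e., the grades "fit" the CAPT levels.
```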

 

               As when I started, my most useful information came from comparing visuals. The bar graphs showing the grade and CAPT score distributions side by side for each year, subject, and level seemed to be the most useful.
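Those side-by-side bar graphs are also easy to regenerate programmatically. A minimal matplotlib sketch with placeholder percentages:

```python
# Side-by-side bars: grade distribution vs. CAPT level distribution for one
# subject/level/year, the visual comparison that proved most useful.
import matplotlib.pyplot as plt
import numpy as np

labels = ["F / L1", "D / L2", "C / L3", "B / L4", "A / L5"]
grades = [8.0, 18.0, 34.0, 27.0, 13.0]   # hypothetical grade percentages
capt   = [10.0, 20.0, 30.0, 25.0, 15.0]  # hypothetical CAPT percentages

x = np.arange(len(labels))
width = 0.35
plt.bar(x - width / 2, grades, width, label="Grades")
plt.bar(x + width / 2, capt, width, label="CAPT")
plt.xticks(x, labels)
plt.ylabel("Percent of students")
plt.title("Grades vs. CAPT levels (hypothetical data)")
plt.legend()
plt.show()
```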

               Here are the key findings:

Our CAPT scores went up in all areas from 2003 to 2004, but our grade distributions remained fairly constant. If we have “smarter” students, shouldn’t they be getting better grades? Or do we grade based on comparisons to the current year’s students rather than to some curriculum standard?

 

Our Honors classes consistently receive more A’s and B’s than the Academic classes. Shouldn’t the distributions be equal if the Honors courses are harder? What is it about our Honors curriculum that is harder, and why doesn’t this show up in CAPT scores?

 

The largest percentage of A’s and B’s by subject was given in science. This subject also had the highest overall CAPT scores, with the most students in the highest two levels.

Math seems to have the highest proportion of students receiving C’s, as well as the most students scoring at the proficient level on the CAPT test.

 

Should the grade distributions by subject be similar to CAPT scores by subject? It seems that they are, at least by rough eyeball estimate, given the limitations of the statistical analysis.

 

Further needs are:

            Look at the correlation between CAPT testing and our methods of assessing students.

            Look at the curriculum of our Honors and Academic courses and identify the differences.

            Look at our overall grading policy. While compiling the data, I saw many instances of wide differences between teachers teaching the same curriculum in a course.

            Look at what we can do to help students on the bubble: between D’s and C’s, and between C’s and B’s.

 

Overall, I learned a great deal from this. I am definitely of the mindset that a visual analysis is more useful for this type of data, but I was happy to have the opportunity to “play” with some different techniques.


ACTION PLAN

Data analysis of the grade distributions among courses and teachers, as compared to CAPT score levels, shows that there is a need for improvement in:

            Curriculum coordination

            Assessment coordination

            Alignment of assessment to CAPT goals

 

The majority of this school improvement plan involves teachers meeting around data: their own curriculum, grading, and assessment data as well as student data. The NEASC committees on Curriculum and Assessment are used as a tool to compile department and school-wide data, and the Department Chair meetings serve as the shared decision-making body.

CURRICULUM COORDINATION:

 

            Examine current curriculum goals and objectives for:

                        Number of goals

                        Specificity/concept level (1 to 5 scale)

                        Thinking skill levels (concrete to synthesis)

            The Leader will create a spreadsheet for recording these.

 

            Each teacher will download their current course(s), and fill in the spreadsheet based on goals as currently written. (2 months)

 

            Department Chairs will compile results and forward them to the curriculum head. (Note: we are doing this now as part of our NEASC preparation.) (1 month)

 

            Department Chairs share results at a meeting and come to consensus on target levels for number of goals, specificity, and thinking skill levels. (2 months)

 

            Department Chairs hold sessions with teachers to revise overall curriculum goals to align with the targets. (2 full-day sessions)

 

            Feedback goes back to the Department Chairs, and then to the Leader as needed, to monitor and adjust.

 

            Check: In one year, objective observation shows similar goals and alignment.

            Note: The process is then repeated to ensure appropriate differences between the Honors and Academic levels.

 

ASSESSMENT COORDINATION

 

            Begin:

                        Overall meeting, brainstorming: what is a grade?

                        Professional Development on assessment techniques and ties to curriculum. (1st day of school)

 

                        Department meetings: (First two months)

                                    Grading Policies discussion:

                                                Percent in grade of homework, class, projects, activities, labs, assessments, etc...

 

                                    Feedback to Assessment Committee.

                                    Assessment Committee designs template of assessment philosophy. (by December).

 

                                    Teachers give feedback and model midterm exams after the statement. (January) Common midterm exams are a requirement.

 

                        Teachers meet to go over the midterm data analysis and compare it with practice CAPT results (for freshmen and sophomores) or real CAPT data. Find anomalies and commonalities. (One or two full days in late January or February)

 

                        Teachers devise plan for instructional modifications as needed, and/or assessment modifications in second semester. (February)

 

                        Teachers meet to construct final exams: common exams, based on the common types of assessments used throughout the second semester. (May)

                        Data is analyzed from final exams. (late June or August)

 

            YEAR TWO:

                        Process repeated for new courses, adjusted as needed for levels.

 

ALIGNMENT OF ASSESSMENT TO CAPT GOALS:

            December:

                        Department Chairs identify “on the bubble” students based on real and practice CAPT scores. (Students very close to making the next level)

 

                        From this list, teachers analyze real data on course assessments and exams, and examine practice CAPTs to find weaknesses.

 

                        Teachers identify two areas per subject to focus on in February and March to prepare those bubble students.

 

 

 


PROFESSIONAL DEVELOPMENT PLAN

Data analysis of the grade distributions among courses and teachers, as compared to CAPT score levels, shows that there is a need for improvement in:

            Curriculum coordination

            Assessment coordination

            Alignment of assessment to CAPT goals

The majority of this school improvement action plan involves teachers meeting around data: their own curriculum, grading, and assessment data as well as student data. The NEASC committees on Curriculum and Assessment are used as a tool to compile department and school-wide data, and the Department Chair meetings serve as the shared decision-making body.

            Large group faculty meetings, small breakout groups, extensive use of subject department meetings, study groups, external conferences, readings, action research, and invited speakers are all part of the professional development plan to support these goals.

Curriculum:

            Higher Order Thinking Skills:

                        First Semester Professional Development Day

                        1 hour, led by the school leader, large group: the entire staff reads articles on current research into higher-order thinking skills.

(Select articles from annotated bibliographies such as http://www.enoroe.on.ca/schoolnet/grassroots/pdf/ho_thinking_skills.pdf and http://www.mr.koczij.com/resources/dyc/MrKoczij_learning_higherorder_annotated.pdf)

                        1 hour, led by the school leader, small groups (5-8), followed by a group summary.

                        4 hours: breakout groups examine the current curriculum (5-8 staff per group, mixed subjects). Results are forwarded to the NEASC curriculum head.

                        Revision and adjustment: the NEASC curriculum head and school leader research the best resources, monitored by NEASC curriculum committee members. Adjustments include time for discussion and work as needed.

 

            Subject Specific Standards:

                        Selected members from each subject area (chairs plus 2-3 others) will attend fall/spring regional subject-area conferences. Example: the National Science Teachers Association regional conference in Hartford, Fall 2005, or the ATOMIC (math) fall conference. They will attend workshop sessions on current state and national frameworks.

                        Winter of the first year: these members then lead the other teachers in their department in examining the new frameworks and curriculum. (Can be adjusted to later in the year if needed.)

                        Action research: individual teachers design action research projects around some new aspect of the revised curriculum, to be implemented in the second year. Department meetings will be used for continuous feedback on the results of the action research and to share ideas. Supervisors will be responsible for making sure this is implemented as part of teachers’ professional goals.


Assessment:

            Alternative Assessment Study Groups:

                        The school leader or another designee (an interested teacher, an educational administrative intern, the NEASC assessment chair) will lead monthly study groups focused on alternative assessments. The teachers will use the resources and videos found at http://www.learner.org/resources/series93.html, an eight-session workshop entitled “Assessment: What’s the Point?”. This Annenberg/CPB workshop uses video segments of real classrooms with discussion guides and questions. The teachers will be in small groups of 5-12, and attendance is strictly voluntary. These teachers will be asked to share their models and significant learnings with their department members at the end of the year (or at the beginning of the next if needed).

            In the second year, teachers will be asked to implement at least one new alternative assessment, based on research.

            Writing Workshops:

                        Full-day professional development day in the spring of the first year. The district hires a professional development consultant to give a workshop on writing good open-ended assessment questions. Example: http://www.makingstandardswork.com/professional_development/writing_excellence_cross_curriculum.htm

                        Teachers will meet in groups for a full day and practice writing good open-ended questions. After the full-day workshop, they will exchange their work products with each other and “peer-evaluate” the writing prompts.

                        This model will continue in the second year: twice a year, a teacher from the school will lead an afternoon professional development session revolving around the exchange and evaluation of writing assessment prompts.

                        Adjustments: if enough teachers are interested, a second study group can be formed to focus on high school writing. This group will meet 8 times during the year to watch the 8 videos from the Annenberg/CPB online professional development workshop “Developing Writers: A Workshop for High School Teachers,” with accompanying study guides and outlines, found at http://www.learner.org/resources/series194.html.

 

Alignment with CAPT

            Each state department coordinator will be invited to attend a subject department meeting in the second year to review the CAPT frameworks and assessment types. Alternatively, the department chairs will attend professional development sessions run by the state coordinators and report back to their departments.

 

LEADER PROFESSIONAL DEVELOPMENT:

            Because many of these activities require the school leader, department chairs, and committee heads to lead, it is anticipated that these leaders will need additional professional development of their own. Besides online research and journal readings, they will be invited to attend workshops and conferences as needed to fulfill these goals.


 

TOPIC | TYPE | TIME | LEADER | CONTENT | RESULT
HOTS | Full Staff → Small Groups | Full Day, First Year Fall | NEASC Curr. Chair | Articles; examine HOTS in curriculum | Curriculum HOTS analysis
SUBJECT Standards | Conference → Dept Meetings | First Year | Dept. Chair plus others | Examine new state frameworks and current curriculum | Curriculum analysis as related to frameworks
SUBJECT Standards | Action Research: Indiv. Teachers → Dept Meetings | Second Year | Supervisors | Action research on changes to curriculum | Gradual implementation with data analysis of new standards
Alternative Assessment | Study Groups → Full Staff | 8 times a year → Second Year | NEASC Assessment Chair? | Guided video/online assessment discussions | Examples of alternative assessment use
Writing in Assessments | Full Staff Day → half-day follow-ups | Full Day, Spring First Year | Consultant → Peer Evaluators | Writing open-ended assessments | Peer-evaluated open-ended writing questions
Writing in Assessments | Study Groups | Second Year | Volunteers | Guided video/online writing discussions | Examples of writing in high school
CAPT Assessments | Dept Meetings | Second Year | State Consultants | Review of CAPT | Aligned assessments


COMMUNICATION PLAN

            The Action Plan and Professional Development Plan focus on:

            Revision of Curriculum around HOTS Skills

            Alignment of Curriculum and Assessments to Standards/CAPT

            Alignment of Assessments/Grading

            Open Ended Writing in Assessments

WHAT | WHO | HOW | TYPE
Entire Action Plan | All Staff | Memo explaining (1 pg) | Emailed/distributed
Entire Action Plan | All People (staff, parents, community) | Flow chart | Website, downloadable, with comments
Entire Action Plan | All School | Flow chart with check-off points | On wall of office
Curriculum Revision | NEASC Curr. Committee (at meeting) | Spreadsheet of HOTS skills | Handed-out memo
Curriculum Revision | All Staff | Spreadsheet of HOTS skills | Website, downloadable template
Curriculum Revision | All Staff (at meeting) | Articles on HOTS skills | Handouts
Curriculum Revision | All Staff | Share ideas on HOTS skills in curriculum | Online bulletin board/forum
Curriculum Revision | Dept Heads | Discussion of HOTS in their courses | Weekly meetings
Curriculum Revision | All People | Newsletter on links between HOTS, curriculum, and standards (bimonthly) | Newsletter mailed home / posted on website
Curriculum Revision | All People incl. media | “Commercials” on HOTS activities in classes | Videos done by students, shown on cable access/news/school TV
Curriculum Revision | Parents | Outline of revision process | Principal’s Parent Council meeting
Curriculum Revision | All Staff | Dept. Head consensus on HOTS goals | Chart emailed/posted/sent to all staff
Curriculum Revision | All People | Curriculum aligned with HOTS goals and standards | Notebook in central office; interactive website linking to state standards/goals and links between courses
Assessment Revision | All Staff (at meeting) | “What is a grade?” brainstorming | Group whiteboard, copied and shared
Assessment Revision | NEASC Assessment Committee (at meeting) | Checklists of assessment skills | Handed-out memo
Assessment Revision | All Staff | Spreadsheet of HOTS skills in assessment | Website, downloadable template
Assessment Revision | All Staff | Share ideas on HOTS skills in assessment | Online bulletin board/forum
Assessment Revision | Dept Heads | Discussion of assessment revision | Weekly meetings
Assessment Revision | All People | Newsletter on links between new assessments and HOTS skills (bimonthly) | Newsletter mailed home / posted on website
Assessment Revision | Parents | Outline of revision process | Principal’s Parent Council meeting
Assessment Revision | All People | Curriculum aligned with HOTS goals and standards | Notebook in central office; interactive website linking to state standards/goals and links between courses
Assessment Revision | All Staff | Exams | Secure shared filing cabinet / online folder to share exam revisions
Assessment Study Groups | All Staff | Invitation to study groups / newsletter on progress | Newsletter emailed/posted describing goals and progress
Assessment/Writing | All Staff/People | Examples of open-ended assessments | Examples shared/linked online to curriculum
Assessment | All Staff | Grade breakdown/analysis by course/level | Chart distributed each semester
CAPT | All Staff in Depts | Review of CAPT skills and goals | Charts/graphs, examples