At the New Mexico School for the Arts, we experimented with many ways of using data to boost our students' academic performance, but at first we were unsuccessful. Here is what we learned:
What Didn’t Work For Us
The faculty met to discuss and analyze the results of the state's reading and math exams. We used a commercial short-cycle assessment to generate student-by-student reports, which we then dissected for information we could use to inform our instruction.
Unfortunately, the most specific prescription we could usually glean from the data was something like "This student needs more practice with informational texts" or "This student needs more work on statistics." The data showed us which students were proficient in a given topic and which were not, but it lacked the specifics we needed to make meaningful adjustments to our teaching. This type of data gave us the what (in what general skill is this group of students not proficient?), but not the why (why didn't they get it?) or the how (how might we reteach the skill in a way that sticks?).
The book Driven by Data by Paul Bambrick-Santoyo gave us the data strategy we were looking for. We finally answered the why and the how by developing our own interim assessments, which allow data analysis by student, standard, and question. Our teachers could identify the specific areas where students lacked knowledge better than ever before. This was a powerful discovery because it gave teachers immediate access to information they could use to plan instruction and reteaching for every student.
What Worked For Us
According to Bambrick-Santoyo's argument, this model does not require prior buy-in because the results themselves create it. It worked at our school. Overall proficiency of NMSA students on the PARCC math tests rose from 29 to 40 percent over one academic year. On the PARCC English language arts test, the percentage of proficient students increased from 80 to 87.
But the advantages of data-driven teaching go beyond improved test scores. Our teachers realized that our students had difficulty acquiring in-depth knowledge across subject areas. Now our teachers are working to improve students' abilities to infer, recognize causal patterns, and universalize themes, skills, and principles. These are the critical thinking skills we want all of our students to carry into college and their education beyond high school.
The seven steps outlined below illustrate how our school evolved into a data-driven institution:
1. Roll Out Professional Development
Give your teachers the freedom to take charge of their own professional development, particularly in creating their own interim assessments (see steps two and three). Our professional development had two components. The first was learning the process itself. The second was examining case studies of schools that had implemented data-driven instruction and seen significant improvements in student achievement. Our team used the professional development materials included with Driven by Data.
2. Determine Essential Standards
To implement interim assessments successfully, start with high-leverage classes; there simply isn't time in the first year for all of them. We concentrated on classes that culminated in an exam required for graduation (end-of-course exams), college entrance (ACT/SAT), or state accountability (PARCC): math, English, and specific science and social studies classes.
To focus our assessments on what matters most, we identified our essential standards and refined them against state and Common Core standards. We asked:
Which standards must students meet to advance to the next level of the subject and/or be ready for college and education beyond high school?
Which standards will be assessed, and which will require additional instruction if not yet mastered?
This helped us avoid the trap of covering too many standards without ever going deep enough for mastery.
3. Ensure High-Quality Interim Assessments
High-quality data requires high-quality assessments. We gave teachers a week during end-of-year in-service days (after students had left for summer) to develop their assessments, plus roughly 15 hours during hour-long faculty meetings. A team of school administrators and teacher leaders delivered initial two-hour trainings on writing useful interim assessments. Once the assessments were written, the team spent 30 minutes with each teacher reviewing them with the assessment review tool from Driven by Data, ensuring the assessments met the following criteria:
They aligned with the appropriate standards.
They had the degree of rigor needed to prepare students for the end-goal assessments. (To calibrate rigor, teachers looked at sample questions from ACT test preparation books, PARCC practice tests, and New York Regents examinations.)
They used a format identical to what students would see on the end-goal assessments.
Our teachers used high-quality assessments such as the ACT as references when calibrating the difficulty of their own tests. They built their expertise in writing questions that pinpointed what students did and didn't know by using resources like the Vanderbilt Center for Teaching, through trial and error, and by working with colleagues.
4. Establish a Process for the Generation of Data Reports
Data-driven instruction cannot succeed without detailed data reports. We put all of our tests on a learning management system, Blackboard, so we could export the data. One of our teacher leaders, following the structure modeled in Driven by Data, devised a method to convert the exported assessment results into an Excel spreadsheet that broke the information down by student, question, and standard. This spreadsheet was invaluable for analyzing the data. Because the process is complex and time-consuming, careful preparation and sufficient resources are essential.
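For readers who want to automate a similar report without Excel, here is a minimal sketch of the same idea: grouping exported results by student and standard. The field names ("student", "question", "standard", "correct") are assumptions for illustration, not our actual export format.

```python
# Hypothetical sketch: turn exported assessment rows into a report
# nested by student and standard, as the spreadsheet described above does.
from collections import defaultdict

def build_report(rows):
    """Nest results as report[student][standard] -> list of (question, correct)."""
    report = defaultdict(lambda: defaultdict(list))
    for r in rows:
        report[r["student"]][r["standard"]].append((r["question"], r["correct"]))
    return report

# Illustrative data; real exports would come from the LMS.
rows = [
    {"student": "Ana", "question": "Q1", "standard": "7.SP.1", "correct": True},
    {"student": "Ana", "question": "Q2", "standard": "7.SP.1", "correct": False},
    {"student": "Ben", "question": "Q1", "standard": "7.SP.1", "correct": True},
]
report = build_report(rows)
for student, standards in report.items():
    for standard, answers in standards.items():
        pct = 100 * sum(ok for _, ok in answers) / len(answers)
        print(f"{student} | {standard}: {pct:.0f}%")
```

Any spreadsheet tool can produce the same breakdown with a pivot table; the point is simply that the data must end up indexed by student, question, and standard.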
5. Establish a Method for the Evaluation of Data
Analyze the results by student, question, and standard; Driven by Data breaks this method down in detail. It's essential to make time both for the analysis and for meetings to discuss the observations. We cancel all faculty meetings for the week following each assessment (at the end of the first, second, and third quarters), giving teachers ample time to prepare their analysis. We then pair everyone with a colleague or administrator for a 30-minute data-analysis meeting, held during the day or after classes depending on when the teacher's prep time falls. This is an excellent opportunity for in-depth conversation about what the data reveals and the next steps to take. We build professional development into the process by having colleagues collaborate on instructional strategies for reteaching standards that haven't yet been mastered.
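The standard-level part of that analysis can be sketched as a simple aggregation: compute class-wide mastery per standard and flag anything below a cutoff for reteaching. The 80 percent threshold here is illustrative, not a figure from Driven by Data.

```python
# Hypothetical sketch of the per-standard analysis step: flag standards
# whose class-wide mastery falls below a chosen reteach threshold.
from collections import defaultdict

def standards_to_reteach(rows, threshold=0.8):
    """rows: dicts with 'standard' and 'correct'; return standards below threshold."""
    totals = defaultdict(lambda: [0, 0])  # standard -> [correct count, attempts]
    for r in rows:
        totals[r["standard"]][0] += int(r["correct"])
        totals[r["standard"]][1] += 1
    return sorted(s for s, (c, n) in totals.items() if c / n < threshold)

# Illustrative data: one standard mastered, one not.
rows = [
    {"standard": "RI.9-10.1", "correct": True},
    {"standard": "RI.9-10.1", "correct": True},
    {"standard": "RI.9-10.4", "correct": False},
    {"standard": "RI.9-10.4", "correct": True},
]
print(standards_to_reteach(rows))  # only RI.9-10.4 falls below 80 percent
```

The flagged list is a starting point for the data-analysis meeting, not a substitute for looking at the individual questions behind each weak standard.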
6. Follow Up on Reteaching
Keep talking with your teachers about their reteaching. During collaborative teacher meetings and meetings with administrators, ask one another:
How did it work?
Would you start with that strategy next time?
What do some of your students still not understand?
7. Take Time to Reflect and Build Buy-In
What Driven by Data said about buy-in gave us faith: buy-in was unnecessary at the start because the results would ultimately build it. After one year, reflecting on how students' test scores had improved provided further evidence that the model works. And when teachers and students can see their efforts leading to success, the experience is intrinsically motivating for everyone involved.