How I found a new way to analyse my classroom data.
As teachers, we develop an incredible gut feeling for what our students can do. Off the tops of our heads, we can state which of our students can describe a topic but can't yet evaluate claims within it, or which topics a student excels in and which they need further assistance with. As a scientist, I like lots of evidence, and whilst I felt like I knew my students and had some evidence, I wanted more detailed evidence to validate my teacher judgements and assessments.
My husband, a dedicated primary school teacher, suggested reading 'Assessment for Teaching' edited by Patrick Griffin. This book is now my favourite teaching resource; it changed the way I teach. I've also had the privilege of undertaking some of his workshops and working with some of the staff from the Assessment Research Centre to develop assessments that align with the principles in the book (a post about this, as well as the rubrics, is coming soon!). This style of assessment taught me to isolate exactly what I was trying to teach, and then find specific ways to measure my students' achievement.
Once I started measuring and collecting this data, I wanted a way to track and analyse it, but I struggled to find one. The programs I had access to allowed me to record a mark or fill in a rubric, but they didn't allow me to investigate the detail of the results. So I decided to put the years I spent in my engineering course learning how to code and analyse data to use.
I started with my Year 12 Chemistry class and a school-assessed coursework (SAC) test I had just given them. I made a spreadsheet of the questions where I tagged every question with information about its topic, sub-topic, specific learning intention and type of question. I then entered the student results, did some calculations and made some graphs. I was then able to give each student a personalised graph with a full breakdown of their results by everything I had tagged - sub-topic, question type, specific learning intention...
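The tagging-and-aggregation step described above can be sketched in a few lines of code. Everything below - the tag names, the marks, and the `percent_by_tag` function - is a hypothetical illustration of the idea, not my actual spreadsheet, which had more tags and one row per student:

```python
# A minimal sketch: tag each question, then aggregate one student's
# marks by any tag to get a percentage per category.
from collections import defaultdict

# Each question carries its tags and maximum marks (hypothetical data).
questions = [
    {"q": 1, "sub_topic": "redox", "q_type": "describe", "max_marks": 2},
    {"q": 2, "sub_topic": "redox", "q_type": "explain", "max_marks": 3},
    {"q": 3, "sub_topic": "equilibrium", "q_type": "describe", "max_marks": 2},
    {"q": 4, "sub_topic": "equilibrium", "q_type": "explain", "max_marks": 3},
]

# One student's marks, keyed by question number (hypothetical data).
student_marks = {1: 1, 2: 3, 3: 1, 4: 2}

def percent_by_tag(questions, marks, tag):
    """Aggregate a student's marks by the given tag, as percentages."""
    earned = defaultdict(int)
    available = defaultdict(int)
    for item in questions:
        key = item[tag]
        earned[key] += marks[item["q"]]
        available[key] += item["max_marks"]
    return {k: round(100 * earned[k] / available[k], 1) for k in available}

# The same data can be sliced by any tag: question type, sub-topic, ...
print(percent_by_tag(questions, student_marks, "q_type"))
# → {'describe': 50.0, 'explain': 83.3}
```

From a dictionary like this, a bar chart per student is one plotting call away, which is essentially what the personalised graphs showed.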
My students were blown away: not only were they getting their mark (and of course my written feedback), but they could visualise what they had achieved and which areas it would be most helpful to focus on. I was also able to draw conclusions I hadn't realised before, including that my students could answer 'explain' type questions but were struggling with 'describe' type questions (they were adding too much detail, and some of this extra detail was incorrect, resulting in marks being deducted). I was able to address this insight immediately in my teaching with a mini-lesson specifically covering the command verbs that may appear in their final exam and what each verb requires. In their next assessment, I could see this growth - and then I could show the students their individual growth specifically in this area. It was powerful! My students' learning progressed faster than it had previously in my class, and I had detailed evidence to assist me in planning and implementing my teaching.
Stay tuned for my future posts, where I discuss how I then started to apply these new learnings about data analysis to my other classes and then to my faculties, as well as a post dedicated to my love of Patrick Griffin's rubrics!