Selecting & Evaluating Digital Tools & Resources:
Artifact
Technology Workshop Reflection

I collaborated with a colleague on the creation of a technology workshop, an idea that emerged naturally from my instructional coaching experience with him. Through our partnership approach to instructional technology coaching (Knight, 2007), we arrived at a solution to the problem of timely, useful data collection: Google Forms as an assessment tool, combined with Flubaroo, an add-on that grades Google Forms assessments. This combination became the focus of our technology workshop after being implemented successfully in his math class.

Before selecting the tool, my collaborating partner and I evaluated its potential by testing it in his classroom. Students were able to log in, take a quick five-question assessment, and receive immediate feedback on their understanding. Accessing the data through the Flubaroo add-on, my partner and I were able to determine exactly which skills needed re-teaching, and to whom, within moments. The accuracy of the data, we realized, depends largely on the thoughtfulness of the assessment writer, but the tool itself provided excellent data immediately, and we were able to use it to identify the needs of individual learners as well as the needs of the group as a whole.

The Google Forms and Flubaroo combination, we realized, is suitable for and compatible with any classroom environment at our school, where a Bring-Your-Own-Technology philosophy has been implemented and augmented by at least three computers in every classroom. Students can log into the computers or their mobile devices via the school's wireless Internet connection and participate in the assessments whenever teachers ask, for free. Teachers can access the data immediately on their own desktop computers or mobile devices, also for free, provided the Internet connection does not drop during the data transfer.

My partner and I were disheartened, however, when few teachers showed up to the workshop.
Of those who did, all recognized the tool combination's potential and reported plans to implement the strategy immediately. Still, I learned that many teachers feel too bogged down to consider new ideas mid-semester. Pre-planning is coming around again, though, and I have a chance to try again. This time, I will be more deliberate about directly inviting key innovators from each department to the workshop, then follow up with them throughout the beginning of the semester to see how the tool combination is working for them and to share how it is working for me.

The impact on student learning can be measured immediately via post-test after the tool identifies remediation needs and areas in which students need no further instruction. The impact on staff development can be assessed via the implementation survey I created as part of the workshop. The impact on school improvement can also be assessed via survey, but more important information can be gleaned from teacher evaluations. Can this segue into technology integration spur more technology integration? It is a question worth exploring.

References

Knight, J. (2007). Instructional coaching: A partnership approach to improving instruction. Thousand Oaks, CA: Corwin Press.