One area where I need a lot of growth is responding to the progress of my students. Specifically, what are the next steps after I’ve identified each student’s level of understanding of certain concepts? I recently decided to tackle this challenge during our review days for the state assessment. The goal was to review the most tested concepts, but I also wanted the kids to work on their biggest struggle areas.
I began the process by consolidating all of the data from concept quizzes throughout the year. From there, I created a “Personal Growth Report” for each student using autoCrat.
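autoCrat itself is a Google Sheets add-on that merges spreadsheet rows into document templates, so there is no code to show for the merge step. As a rough illustration of the consolidation step that feeds it, here is a minimal Python sketch; the students, concepts, and score cutoffs are all made up, and the progress labels are just the ones I use in class.

```python
# Hypothetical quiz data: concept -> score (0-100) for each student.
# In practice this lives in a Google Sheet that autoCrat reads.
quiz_data = {
    "Alex":   {"Proportions": 55, "Slope": 90, "Volume": 72},
    "Bailey": {"Proportions": 88, "Slope": 60, "Volume": 95},
}

def progress_label(score):
    """Map a score to a growth-mindset label (cutoffs are assumptions)."""
    if score >= 85:
        return "Got It"
    if score >= 70:
        return "Getting There"
    return "Starting Out"

def build_report_rows(data):
    """Produce one row per student: labeled progress plus a growth focus."""
    rows = []
    for student, concepts in data.items():
        labeled = {c: progress_label(s) for c, s in concepts.items()}
        # Flag the lowest-scoring concept as the suggested growth area.
        focus = min(concepts, key=concepts.get)
        rows.append({"student": student, "progress": labeled, "focus": focus})
    return rows

for row in build_report_rows(quiz_data):
    print(row["student"], "->", row["focus"], row["progress"])
```

Each resulting row maps cleanly onto one merged report, which is essentially what the autoCrat template consumes.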
Here’s a tutorial video showing the process of creating these reports.
This is a similar idea to the Growth Mindset Reports I blogged about a few months ago. Basically, I was looking for a student progress report that promoted growth mindset with its language and format. I wanted the students to come away with a positive, motivated attitude about where they needed growth and their steps toward improvement.
For the original reports, I had the students self-assess their understanding of each concept because I was concerned about classroom status issues. I’ve found in the past that students compare the grades on their papers and status issues ensue. However, I went ahead and placed the current progress on the reports because I knew that some students would not be able to identify the areas where they needed the most growth. In order to hopefully avoid negative takeaways, I made sure to add the statement, “I’m giving you this feedback because I believe in you” (thanks Jo Boaler). No doubt, words can ring hollow with the kids, but I was hopeful that my actions throughout the year would back up this encouraging phrase.
I created several silent solution videos as the first step toward responding to student progress (the idea came from Kyle Pearce and Cathy Yenca). The videos are short, silent and meant to be a quick how-to for students to learn from. I really like these because they’re straight to the point, and students can easily play them over and over without spending a ton of time.
In addition, Jennie Magiera has talked about some interesting findings related to tutorial videos for teachers in Chicago Public Schools. Her team found (I’m paraphrasing) that videos 40 seconds or shorter were watched all the way through 100% of the time. However, longer videos had a much lower watch percentage. This makes sense because we’ve all had moments where we see a video is 5 minutes long, and we either stop watching or try to skip to the “good part.”
Finally, I felt like the silent solutions would be important and useful because I had students requesting tutorial videos. That interest was confirmed during the review process: many kids showed improvement and mentioned how helpful the videos were.
Here’s a sample…
…and a link to all of the silent solutions I’ve uploaded to YouTube so far. I’d really appreciate your feedback on errors and possible edits that can be made in order to improve these. Also, please share if you’ve created your own.
For the in-class review, I decided to create an assignment for each topic with different levels of difficulty. I wanted to hit as many Depth of Knowledge (DOK) levels as possible, so I tried to emulate Robert Kaplinsky’s excellent tool.
For level 1, I simply had the students watch the silent solution videos and work some practice problems afterwards. Plain and simple, but I wanted the assignment to have a low entry point.
For most of the level 4 challenges, I had the students create something cool with Desmos. In this case, I had them complete a putt-putt golf hole. These holes always seem to challenge even the most confident students.
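For readers who haven’t built one of these, here is a hypothetical sketch of the kind of Desmos expressions a simple hole involves; the coordinates are invented, not from my actual activity, and the annotations after each line are for the reader, not Desmos syntax. Desmos uses curly-brace restrictions to limit a graph’s domain or range, which is exactly the skill the holes exercise.

```
y = 0 {0 <= x <= 10}             (bottom wall)
y = 4 {0 <= x <= 10}             (top wall)
x = 0 {0 <= y <= 4}              (left wall)
x = 10 {0 <= y <= 4}             (right wall)
(x - 8)^2 + (y - 2)^2 <= 0.25    (the cup: a shaded circle)
```

Students completing a hole have to reason about slopes, intercepts, and restrictions all at once, which is what pushes the task toward the higher DOK levels.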
Overall, I really liked the level approach to each assignment because the students who are “starting out” with a concept can find challenges in the lower levels, and the “got it and then some” kids are challenged in the higher levels. This allows for every student to improve his or her depth of understanding in each topic.
Here are the other assignment links:
I made a couple of mistakes on the original versions of these tasks (the links are for the updated versions). First, I put too many practice problems in the level 1 challenges. I found that four to six problems was sufficient; any more than that halted momentum and bred boredom. Plus, a lot of the kids were able to pick up on the concept after four problems.
Another mistake I made was putting released state assessment problems in each assignment. The problems heavily impeded student progress and took the fun out of the experience. I’ll definitely leave them out from now on and instead weave them into other parts of the class. Better yet, I’ll use Geoff Krall’s guide to better test prep.
Again, it would be great to get your feedback about how to make these tasks better. DOK levels are another area where I need growth so I know each assignment has room for improvement.
Extra Practice Website:
Finally, I used a site I created a while back so that students could get extra practice. I don’t give homework, so this is my way to allow the kids who want extra work to get it. It includes Desmos explorations, reviews of in-class activities, practice problems, videos, and more. I made sure to include links to this site on the Personal Growth Report in case students needed extra practice or other options in order to prepare.
Once again, I’d like your feedback on how to make the site better (am I asking this too much?). Let me know if you find mistakes, things that can be improved, and anything else you think would benefit the kids. The site is definitely a work in progress.
For the actual launch and implementation of the tasks, I grouped students based on their areas of need. However, I didn’t put all of the “starting out” kids in the same group (a lot of research seems to show that’s not a good idea). Instead, I tried to form heterogeneous groups. This worked out because students’ areas of need varied; one student’s weakest concept was often another student’s strength, so I was able to put mixed abilities in each group. For example, there were students whose lowest progress level on a concept was “getting there” or “got it.” I placed these students with the “starting outs,” and the groups became mixed. At the same time, each student was able to focus on a topic where more pursuit of learning was necessary.
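To make the grouping idea concrete, here is a minimal Python sketch of one way to deal students into mixed-ability groups. The names, levels, and group size are invented, and this sort-then-round-robin approach is just one possible method, not a record of exactly what I did by hand.

```python
# Each student is tagged with their lowest progress level on any concept.
# Names and levels are hypothetical.
LEVEL_ORDER = {"Starting Out": 0, "Getting There": 1, "Got It": 2}

students = [
    ("Avery", "Starting Out"),
    ("Blake", "Got It"),
    ("Casey", "Getting There"),
    ("Drew", "Starting Out"),
    ("Emery", "Got It"),
    ("Finley", "Getting There"),
]

def heterogeneous_groups(roster, group_size=3):
    """Sort by progress level, then deal students round-robin into groups
    so each group mixes lower and higher levels."""
    ordered = sorted(roster, key=lambda s: LEVEL_ORDER[s[1]])
    n_groups = -(-len(ordered) // group_size)  # ceiling division
    groups = [[] for _ in range(n_groups)]
    for i, student in enumerate(ordered):
        groups[i % n_groups].append(student)
    return groups

for group in heterogeneous_groups(students):
    print([name for name, _ in group])
```

Because the roster is sorted before the deal, consecutive students land in different groups, which spreads the “starting out” kids across the room rather than clustering them.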
Overall, I was happy with the way the activities went. The mistakes mentioned above did lead to undesired results at times, but I think the improved assignments will help counter this. Heading into next year, I plan to have a day or two set aside each grading period to give students a personal growth report and allow them to work on similar assignments to those linked above.