One area where I need a lot of growth is responding to the progress of my students. Specifically, what are the next steps after I’ve identified the level of understanding in certain concepts? I recently decided to tackle this challenge during our review days for the state assessment. The goal was to review the most tested concepts, but I also wanted the kids to work on their biggest struggle areas.
I began the process by consolidating all of the data from concept quizzes throughout the year. From there, I created a “Personal Growth Report” for each student using autoCrat.
Here’s a tutorial video showing the process of creating these reports.
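If you're curious what the merge step looks like under the hood, here's a minimal sketch of the idea autoCrat automates: read one row of quiz data per student and write out a report that uses progress language instead of grades. The file name, column layout, and score cutoffs below are all made up for illustration; autoCrat does this same kind of merge from a Google Sheet into a Doc template, which is what makes it quick to produce reports for a whole class.

```python
# Rough sketch of the merge step autoCrat automates (the real reports are
# built in Google Sheets/Docs, not with a script). File name, column
# layout, and score cutoffs here are all hypothetical.
import csv

def progress_label(score):
    # Placeholder cutoffs, not my actual rubric: translate a quiz score
    # into growth-mindset language instead of a letter grade.
    if score >= 85:
        return "Got it"
    if score >= 65:
        return "Getting there"
    return "Starting out"

with open("quiz_scores.csv", newline="") as f:
    # Expected columns: Student, then one column per concept.
    for row in csv.DictReader(f):
        student = row.pop("Student")
        lines = [f"Personal Growth Report for {student}",
                 "I'm giving you this feedback because I believe in you."]
        for concept, score in row.items():
            lines.append(f"  {concept}: {progress_label(float(score))}")
        with open(f"{student}_growth_report.txt", "w") as out:
            out.write("\n".join(lines) + "\n")
```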
This is a similar idea to the Growth Mindset Reports I blogged about a few months ago. Basically, I was looking for a student progress report that promoted growth mindset with its language and format. I wanted the students to come away with a positive, motivated attitude about where they needed growth and their steps toward improvement.
For the original reports, I had the students self-assess their understanding of each concept because I was concerned about classroom status issues. I’ve found in the past that students compare the grades on their papers and status issues ensue. However, I went ahead and placed the current progress on the reports because I knew that some students would not be able to identify the areas where they needed the most growth. In order to hopefully avoid negative takeaways, I made sure to add the statement, “I’m giving you this feedback because I believe in you” (thanks Jo Boaler). No doubt, words can ring empty with the kids, but I was hopeful that my actions throughout the year would back up this encouraging phrase.
Silent Solutions:
I created several silent solution videos as the first step toward responding to student progress (the idea came from Kyle Pearce and Cathy Yenca). The videos are short, silent and meant to be a quick how-to for students to learn from. I really like these because they’re straight to the point, and students can easily play them over and over without spending a ton of time.
In addition, Jennie Magiera has talked about some interesting findings related to tutorial videos for teachers in Chicago Public Schools. Her team found (I’m paraphrasing) that videos 40 seconds or shorter were watched all the way through 100% of the time. However, longer videos had a much lower watch percentage. This makes sense because we’ve all had moments where we see a video is 5 minutes long, and we either stop watching or try to skip to the “good part.”
Finally, I felt like the silent solutions would be important and useful because I had students requesting tutorial videos. This interest was confirmed during the review process because many kids demonstrated improvement and mentioned the helpfulness of the videos.
Here’s a sample…
…and a link to all of the silent solutions I’ve uploaded to YouTube so far. I’d really appreciate your feedback on errors and possible edits that can be made in order to improve these. Also, please share if you’ve created your own.
DOK Levels:
For the in-class review, I decided to create an assignment for each topic with different levels of difficulty. I wanted to hit as many depth of knowledge (DOK) levels as possible, so I tried to emulate Robert Kaplinsky’s excellent tool.
Level 1:
For level 1, I simply had the students watch the silent solution videos and work some practice problems afterwards. Plain and simple, but I wanted the assignment to have a low entry point.
Level 2:
I used a lot of Desmos to ramp up the difficulty for each concept. For the topic pictured above, Michael Fenton’s Match My Line activity was perfect (with an assist from Cathy Yenca on Thinglink).
Level 3:
For most of the concepts, I used an Open Middle problem to reach higher DOK levels. In this case, I used the following from Jon Orr.
Level 4:
For most of the level 4 challenges, I had the students create something cool with Desmos. In this case, I had them complete a putt putt golf hole. These holes always seem to challenge even the most confident students.
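If you haven’t seen these, completing the hole mostly comes down to graphing line segments with domain and range restrictions. For example (numbers made up), a floor might be y = 0 {0 ≤ x ≤ 10} and a right wall x = 10 {0 ≤ y ≤ 4}; the challenge is closing off the whole hole with restricted equations like these.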
Overall, I really liked the level approach to each assignment because the students who are “starting out” with a concept can find challenges in the lower levels, and the “got it and then some” kids are challenged in the higher levels. This allows for every student to improve his or her depth of understanding in each topic.
Here are the other assignment links:
I made a couple of mistakes on the original versions of these tasks (the links are for the updated versions). First, I put too many practice problems in the level 1 challenges. I found that 4 to 6 problems were sufficient; more than that halted momentum and led to boredom. Plus, a lot of the kids were able to pick up on the concept after 4 problems.
Another mistake I made was putting released state assessment problems in each assignment. The problems heavily impeded student progress and took the fun out of the experience. I’ll definitely leave them out from now on and instead weave them into other parts of the class. Better yet, I’ll use Geoff Krall’s guide to better test prep.
Again, it would be great to get your feedback about how to make these tasks better. DOK levels are another area where I need growth, so I know each assignment has room for improvement.
Extra Practice Website:
Finally, I used a site I created a while back so students can get extra practice. I don’t give homework, so this is my way to let the kids who want extra work get it. It includes Desmos explorations, reviews of in-class activities, practice problems, videos, and more. I made sure to include links to this site on the Personal Growth Report in case students needed extra practice or other options in order to prepare.
Once again, I’d like your feedback on how to make the site better (am I asking this too much?). Let me know if you find mistakes, things that can be improved, and anything else you think would benefit the kids. The site is definitely a work in progress.
Implementation:
For the actual launch and implementation of the tasks, I grouped students based on their areas of need. However, I didn’t put all of the “starting out” kids in the same group (a lot of research seems to show that’s not a good idea). Instead, I tried to form heterogeneous groups. This worked out because one student’s area of need was often another student’s strength, so I was able to put mixed abilities in each group. For example, there were students whose lowest progress level on a concept was “getting there” or “got it.” I placed these students with the “starting outs”, and the groups became mixed. At the same time, each student was able to focus on a topic where he or she needed more learning.
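For anyone who wants to automate this kind of grouping from the growth report data, here’s a rough sketch of one way to do it. The names and levels are made up, and this isn’t exactly how I built my groups; it just shows the “sort by level, then deal into groups” idea.

```python
# Rough sketch of mixed-ability grouping (hypothetical names and levels).
# Idea: sort students from strongest to weakest by their lowest progress
# level, then deal them into groups round-robin so each group mixes
# "starting out" kids with "getting there" and "got it" kids.

LEVELS = {"Starting out": 0, "Getting there": 1, "Got it": 2}

# Each student's lowest progress level across all concepts (made up).
students = {
    "Avery": "Starting out",
    "Blake": "Got it",
    "Casey": "Getting there",
    "Drew": "Starting out",
    "Emery": "Got it",
    "Finley": "Getting there",
}

def make_groups(levels_by_student, group_count):
    groups = [[] for _ in range(group_count)]
    ordered = sorted(levels_by_student,
                     key=lambda name: LEVELS[levels_by_student[name]],
                     reverse=True)
    for i, name in enumerate(ordered):
        groups[i % group_count].append(name)
    return groups

for group in make_groups(students, 2):
    print(group)
```

Dealing round-robin after the sort is what keeps any one group from ending up all “starting out.”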
Overall, I was happy with the way the activities went. The mistakes mentioned above did lead to undesired results at times, but I think the improved assignments will help counter this. Heading into next year, I plan to have a day or two set aside each grading period to give students a personal growth report and allow them to work on similar assignments to those linked above.
I’m very interested in SBG and love your breakdown and explanation. I’m a little concerned that SBG will encourage students to think about math as a series of discrete skills vs. seeing connections and thinking more conceptually. I’m wondering if you have encountered that and found ways to push back on it, if necessary. I assume careful choices about the kinds of questions you ask can help, but then can’t that make it more difficult to determine what standard you’re assessing? Just thinking things through and curious about your thoughts.
Thanks for the kind words and your thoughts, Rachel! That’s a valid and caring concern. I’m still learning as I go, but after using SBG for three years, I think my teaching practices and classroom environment have more of an effect on this area than SBG does. As I’ve improved my everyday teaching and setup, the class has taken on more of a “seeing connections” feel. It’s still a work in progress though. One setup with SBG that has helped is making the majority of the quizzes multi-concept. I think this helps promote the message that math is more than just a skill-by-skill subject. So, my quizzes rarely have just one skill now.
Also, I’ve tried to emphasize my goal for the grading system with the kids more. I think my main goal now with the system is to reduce the amount of testing, and mainly use quizzes to see what they know and how we can improve. I actually put this image in front of them before every quiz now. In addition to the message, I try to make sure that everything I do backs it up. So, when the kids can start to see that it’s true that I’m not just trying to get grades in the grade book, it helps the overall feel, and I’m able to move away from the discrete skills zone.
Basically, I try to de-emphasize the importance of quizzes and grades, and I try to focus on learning about student thought processes with methods other than quizzes. SBG has helped focus my analysis of their thinking because I can see how big ideas connect when they’re listed out. It’s also nice to have grades that show where their progress is. But I’m learning that the most important part of using SBG is actually the vision behind it. When the vision is good and my actions back it up, the environment improves. Not sure if that answer helps!
Wow. I went searching for info on standards-based grading in math and I feel like I’ve struck a gold mine! Can’t wait to implement these strategies next year and improve the feedback process for my students! Thank you for sharing your progress!
Thanks so much for the kind words! I really appreciate it! Let me know how it goes for you and how these materials can be improved. Looking forward to learning from you!
Hi Dane, how do you distribute the progress reports to students? My district doesn’t allow email access for middle/elementary students, so I can’t use email to distribute the reports. Clicking on each report to share it with each individual student sounds like a lot of work, to be honest. Does Doctopus do this for you? Or do you print them out and pass them out in class? Email to parents?
Thanks 🙂
Hey Ivy, thanks for your comment. You’re right that it would take a lot of work to share each individual report. I don’t think that’s worth it, so I don’t do it. I really like the idea of Doctopus, though. I haven’t gone down that route and really don’t know anything about the program, so it’s worth investigating. Let me know if you find that it does the work for you!
So, with all that said, I print out the reports and pass them out in class. Also, sometimes a parent may request one (very rarely), and I’ll just go in and share the individual report with them. Since it’s not many, it’s quick to do.
Wow – thank you for sharing so many great resources! Can you go teach 7th math for a couple years and develop all of this for me? Just kidding! My math team is going to do SBG next year, and I am glad to have found your blog to help us get there. I love the silent videos – what program are you using to create those?
Haha thank you for the kind words! I really appreciate it. I’m excited to hear that y’all will be moving to SBG next year. Good luck and let me know if you have questions at any time.
As for the videos, I use Adobe After Effects. Here is a page with some tutorial videos to get you started on it if you use it in the future.