From golf to tennis to baseball to diving, basketball, and countless other sports, athletes and coaches have used videotape analysis for years to pinpoint errors, identify areas for improvement, and maximize performance.
And while I’ve heard some musicians speak about the value of videotape analysis (for instance, check out these articles – here and here – of violinist Burton Kaplan advocating for its use way back in the 60’s and 70’s!), I don’t think the use of video is nearly as prevalent as it is in sports.
Which is totally understandable. I mean, it’s only pretty recently that we’ve had easy access to video recording equipment like we have now in our phones and tablets. And even if you have the equipment, it still takes a good bit of time to not only videotape yourself, but review and analyze the recording afterwards. Plus there’s all that groaning and weeping and teeth-gnashing and drowning of sorrows in a pint of Cherry Garcia to do when we hear what we actually sound like (although, technically, this part is optional). =)
All this to say, when you have a ton of things to do, and not a lot of time, it’s easy to question the value of videotaping. I mean, you do have ears after all, and probably don’t need to play more than a few bars of anything before you hear something that needs work.
But…do athletes know something we don’t? Like, is there any actual research evidence to suggest that videotaping is worth the time and effort?
As I dug into this topic, I actually found a number of studies that were pretty interesting. It was too much to fit into a single post, so I figured I’d make this a two-part series. So this week, we’ll look at what kind of impact videotaping might have on the evaluation of our performances. And next week, we’ll look at how videotaping might affect the process of practicing itself. I know it sounds a little dry and boring when I say it like that, but bear with me for a moment – I think you’ll find that there are more actionable takeaways than you’d imagine.
“Concert Practice” class
A 2001 study (Daniel) at James Cook University (Australia) looked at 35 music majors (including pianists, singers, wind/brass players, and guitarists), who participated in a “Concert Practice” class. A class in which students gave a graded performance for faculty (kind of like a jury), not once per year, but twice per semester.
The other interesting wrinkle is that the class experience wasn’t just about performing – it was also about self-evaluating and critiquing one’s own performance as well. Because within a week of their performance, students had to watch the videotape of their performance “as many times as was necessary or appropriate” and write a 300-word “self-critical reflection” of the performance covering six areas.
The guidelines were:
- Personal presentation: entrance and exit, bowing, physical presence, characteristic mannerisms, etc.;
- Musical issues: accuracy, stylistic appropriateness, choice of repertoire;
- Overall impression: personal response, audience response;
- Reflections on actual vs. perceived performance: how the performance looked and sounded on video vs. how they initially thought it went;
- Reflections on progress: improvements and developments since the previous performance; and
- Directions: plans to improve and enhance performance.
This self-reflection report was then graded by the committee, and students were required to discuss things that came up in their performance and in their report with their teacher as well.
Two parts to their grade
The cool part of all of this, to me, is that students’ performances only counted for half of their grade. The other half came from the faculty’s evaluation of the students’ self-reflection report.
In other words, students were not only given opportunities to practice getting better at performing, but opportunities to practice getting better at critiquing themselves accurately and effectively as well. Where the implicit message is that the ability to self-evaluate and critique yourself effectively is as integral to being a musician as being able to play well.
So what happened? Were there any interesting insights that came from having students evaluate their own performances with video?
What did students get from the video?
Students were given a questionnaire, which asked them to compare their impressions of the performance video with their impressions of their performance after walking off stage.
Though a few students said that what they saw on the video was pretty close to what they remembered feeling as they walked off stage, most felt differently after watching the video.
For instance, 37% reported that the performance on the video sounded better than what they remembered feeling in the moment. And 49% reported that the video helped them identify issues and problem spots that they otherwise might not have noticed or remembered.
And while there were (inevitably) some comments like “Horrified” and “I thought I looked funny,” there were also comments like “I could see things I didn’t know I was doing” and “I was surprised at my tone quality and excited to hear where I could improve.”
Our performance memory is not reliable
So yes, videotaping might be uncomfortable at first, but this study suggests that some musicians may have a negatively skewed memory of their performances, either perceiving or remembering their performances as being worse than they actually were.
The results also suggest that musicians may not be great at performing and evaluating their performances at the same time. And that if we rely only on our memory of our playing, we might be practicing with an awareness of only part of the puzzle – working only on the things we know we’re doing, and neglecting the things we don’t know we’re doing. Or the things we don’t know we’re not doing.
Ok, but what about the accuracy of students’ self-critiques? Like, how did the students’ self-evaluations compare to the faculty’s evaluations of their playing?
Well, that wasn’t part of this study, but a more recent study did ask that question.
A piano study
In a 2011 Canadian study (Masaki, Hechler, Gadbois, & Waddell), 21 undergraduate and graduate pianists recorded both an official school performance and the dress rehearsal for that performance.
Immediate self-evaluation
Immediately after performing, they completed a quick evaluation of their performance, comparing it to their dress rehearsal run-through in 8 areas – like note accuracy, rhythmic accuracy, expressiveness, etc.
Evaluation with video
A while later, students watched the video of their performance, and evaluated their performance once again based on what they saw and heard in the video.
Evaluation by a pro
Meanwhile, a professional pianist also evaluated the students’ performances using the same 8-question assessment.
How do they compare?
Then, the researchers compared the students’ self-evaluations of their performances from before and after watching the video with that of the professional pianist.
And what did they find?
Well, when the students evaluated their own performances after watching the performance videos, the scores they gave themselves were pretty darn close to the scores that the professional pianist gave them.
But when evaluating their performances based only on their memory of the performance and gut impressions, their self-assessment was not as well-aligned with that of the expert.
In other words, when students were able to review their concert on video, their performance evaluation was much closer to what an impartial, objective expert would say. Without the video, their evaluation was less accurate.
Takeaways
The main takeaway from these two studies, at least for me, is that even though we may think we know how a performance went, the reality is that it’s pretty tough to do a truly accurate evaluation of our own playing while we’re performing.
And that we seem to be much more capable of evaluating our performances when we’re watching video of it and can focus 100% of our attention on just listening and observing. Indeed, the artist Sister Corita Kent may have been onto something when she wrote – “Don’t try to create and analyse at the same time. They’re different processes.”
Ok, so videotape analysis might be able to help us more accurately evaluate our playing. But does that translate into meaningful changes in learning or performance?
Well, that’s a great question. And next week, we’ll take a closer look at a couple studies which provide some answers!
One more thing…
I briefly mentioned the 8-part assessment that students used in the 2011 study above, but I think it’s worth taking a moment to go into a bit more detail about it, because it’s an interesting way to evaluate performances.
The authors call it the Quality Assessment in Music Performance Inventory (QAMPI), and it looks like this:

I think it’s pretty cool in a couple respects. For one, it splits the performance up into multiple categories, like a rubric, to prompt you to listen for specific aspects of your performance.
But more importantly, I like that it’s not the 1-10 scale we might otherwise gravitate towards, where 1=mortifying levels of incompetence, humiliation, and shame and 10=idyllic perfection and bliss.
Because as important as it is to have an idealized standard of perfection to aim for in the practice room, that standard is always changing and evolving as our playing improves, and like the end of a rainbow, always stays slightly out of reach. Which means that if you only ever use this 1-10 scale to evaluate your performances, you’re kind of setting yourself up to fail, no matter how well you play.
This QAMPI scale on the other hand, simply measures the gap between what you know you’re capable of at this moment (your dress rehearsal) and what you were able to demonstrate on stage (the performance). Which encourages you to compare yourself with yourself. Which at the end of the day, is really all we can do anyway. As one of my mentors in grad school was fond of saying – you can’t play better than you can play!
References
Masaki, M., Hechler, P., Gadbois, S., & Waddell, G. (2011). Piano performance assessment: Video feedback and the Quality Assessment in Music Performance Inventory (QAMPI). In A. Williamon, D. Edwards, & L. Bartel (Eds.), Proceedings of the International Symposium on Performance Science 2011 (pp. 503–508). Utrecht, Netherlands: European Association of Conservatoires (AEC).
Daniel, R. (2001). Self-assessment in performance. British Journal of Music Education, 18(3), 215–226.