Is Videotaping Yourself Worth the Time and Trouble? And If Yes, How So? (Part 1 of 2)


From golf to tennis to baseball to diving, basketball, and countless other sports, athletes and coaches have used videotape analysis for years to pinpoint errors, identify areas for improvement, and maximize performance.

And while I’ve heard some musicians speak about the value of videotape analysis (for instance, check out these articles – here and here – of violinist Burton Kaplan advocating for its use way back in the 60’s and 70’s!), I don’t think the use of video is nearly as prevalent as it is in sports.

Which is totally understandable. I mean, it’s only pretty recently that we’ve had easy access to video recording equipment like we have now in our phones and tablets. And even if you have the equipment, it still takes a good bit of time to not only videotape yourself, but review and analyze the recording afterwards. Plus there’s all that groaning and weeping and teeth-gnashing and drowning of sorrows in a pint of Cherry Garcia to do when we hear what we actually sound like (although, technically, this part is optional). =)

All this to say, when you have a ton of things to do, and not a lot of time, it’s easy to question the value of videotaping. I mean, you do have ears after all, and probably don’t need to play more than a few bars of anything before you hear something that needs work.

But…do athletes know something we don’t? Like, is there any actual research evidence to suggest that videotaping is worth the time and effort?

As I dug into this topic, I actually found a number of studies that were pretty interesting. It was too much to fit into a single post, so I figured I’d make this a two-part series. So this week, we’ll look at what kind of impact videotaping might have on the evaluation of our performances. And next week, we’ll look at how videotaping might affect the process of practicing itself. I know it sounds a little dry and boring when I say it like that, but bear with me for a moment – I think you’ll find that there are more actionable takeaways than you’d imagine.

“Concert Practice” class

A 2001 study (Daniel) at James Cook University (Australia) looked at 35 music majors (including pianists, singers, wind/brass players, and guitarists) who participated in a “Concert Practice” class – a class in which students gave a graded performance for faculty (kind of like a jury), not once per year, but twice per semester.

The other interesting wrinkle is that the class experience wasn’t just about performing – it was also about self-evaluating and critiquing one’s own performance as well. Because within a week of their performance, students had to watch the videotape of their performance “as many times as was necessary or appropriate” and write a 300-word “self-critical reflection” of the performance covering six areas.

The guidelines were:

  1. Personal presentation: entrance and exit, bowing, physical presence, characteristic mannerisms, etc.;
  2. Musical issues: accuracy, stylistic appropriateness, choice of repertoire;
  3. Overall impression: personal response, audience response;
  4. Reflections on actual performance: how the performance looked on video vs. how they may initially have thought it went;
  5. Reflections on progress: improvements and developments since the previous performance; and
  6. Directions: plans to improve and enhance performance.

This self-reflection report was then graded by the committee, and students were required to discuss things that came up in their performance and in their report with their teacher as well.

Two parts to their grade

The cool part of all of this, to me, is that students’ performances counted for only half of their grade. The other half came from the faculty’s evaluation of the students’ self-reflection report.

In other words, students were not only given opportunities to practice getting better at performing, but opportunities to practice getting better at critiquing themselves accurately and effectively as well. Where the implicit message is that the ability to self-evaluate and critique yourself effectively is as integral to being a musician as being able to play well.

So what happened? Were there any interesting insights that came from having students evaluate their own performances with video?

What did students get from the video?

Students were given a questionnaire, which asked them to compare their impressions of the performance video with their impressions of their performance after walking off stage.

Though a few students said that what they saw on the video was pretty close to what they remembered feeling as they walked off stage, most felt differently after watching the video.

For instance, 37% reported that the performance on the video sounded better than what they remembered feeling in the moment. And 49% reported that the video helped them identify issues and problem spots that they otherwise might not have noticed or remembered.

So while there were (inevitably) some comments like “Horrified” and “I thought I looked funny,” there were also comments like “I could see things I didn’t know I was doing” and “I was surprised at my tone quality and excited to hear where I could improve.”

Our performance memory is not reliable

So yes, videotaping might be uncomfortable at first, but this study suggests that some musicians may have a negatively skewed memory of their performances, either perceiving or remembering their performances as being worse than they actually were.

The results also suggest that musicians may not be great at performing and evaluating their performances at the same time. And that if we rely only on our memory of our playing, we might be practicing with an awareness of only part of the puzzle – working only on the things we know we’re doing, and neglecting the things we don’t know we’re doing. Or the things we don’t know we’re not doing.

Ok, but what about the accuracy of students’ self-critiques? Like, how did the students’ self-evaluations compare to the faculty’s evaluations of their playing?

Well, that wasn’t part of this study, but a more recent study did ask that question.

A piano study

In a 2011 Canadian study (Masaki, Hechler, Gadbois, & Waddell), 21 undergraduate and graduate pianists recorded both an official school performance and the dress rehearsal for that performance.

Immediate self-evaluation

Immediately after performing, they completed a quick evaluation of their performance, comparing it to their dress rehearsal run-through in 8 areas – like note accuracy, rhythmic accuracy, expressiveness, etc.

Evaluation with video

A while later, students watched the video of their performance, and evaluated their performance once again based on what they saw and heard in the video.

Evaluation by a pro

Meanwhile, a professional pianist also evaluated the students’ performances using the same 8-question assessment.

How do they compare?

Then, the researchers compared the students’ self-evaluations of their performances from before and after watching the video with those of the professional pianist.

And what did they find?

Well, when the students evaluated their own performances after watching the performance videos, the scores they gave themselves were pretty darn close to the scores that the professional pianist gave them.

But when evaluating their performances based only on their memory of the performance and gut impressions, their self-assessment was not as well-aligned with that of the expert.

In other words, when students were able to review their concert on video, their performance evaluation was much closer to what an impartial, objective expert would say. Without the video, their evaluation was less accurate.
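If you’re curious what “closer to the expert’s evaluation” could mean in practice, here’s a toy sketch. The numbers below are made up for illustration – they are not data from the study – and the study’s actual statistical analysis isn’t reproduced here. The idea is simply that you can measure agreement as the average per-category gap between two sets of ratings:

```python
# Hypothetical ratings on the 8 QAMPI categories (illustrative only;
# not actual data from Masaki et al., 2011).
expert      = [4, 3, 5, 4, 3, 4, 5, 4]
memory_only = [2, 2, 4, 3, 2, 3, 4, 2]  # student's gut impression right after walking offstage
after_video = [4, 3, 4, 4, 3, 4, 5, 3]  # student's ratings after reviewing the video

def mean_abs_diff(a, b):
    """Average per-category gap between two sets of ratings."""
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

# A smaller number means closer agreement with the expert.
print(mean_abs_diff(memory_only, expert))  # 1.25 - memory alone drifts from the expert
print(mean_abs_diff(after_video, expert))  # 0.25 - video review brings ratings much closer
```

With these (invented) numbers, the memory-only ratings sit a full point or more off the expert’s on several categories, while the after-video ratings land almost on top of them – which mirrors the pattern the researchers reported.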

Takeaways

The main takeaway from these two studies, at least for me, is that even though we may think we know how a performance went, the reality is that it’s pretty tough for us to do a truly accurate evaluation of our own playing while we’re performing.

And that we seem to be much more capable of evaluating our performances when we’re watching video of it and can focus 100% of our attention on just listening and observing. Indeed, the artist Sister Corita Kent may have been onto something when she wrote – “Don’t try to create and analyse at the same time. They’re different processes.”

Ok, so videotape analysis might be able to help us more accurately evaluate our playing. But does that translate into meaningful changes in learning or performance?

Well, that’s a great question. And next week, we’ll take a closer look at a couple of studies that provide some answers!

One more thing…

I briefly mentioned the 8-part assessment that students used in the 2011 study above, but I think it’s worth taking a moment to go into a bit more detail about it, because it’s an interesting way to evaluate performances.

The authors call it the Quality Assessment in Music Performance Inventory (QAMPI), and it looks like this:

From: Masaki, M., Hechler, P., Gadbois, S., & Waddell, G. (2011). Piano performance assessment: Video feedback and the Quality Assessment in Music Performance Inventory (QAMPI). In A. Williamon, D. Edwards, & L. Bartel (Eds.), Proceedings of the International Symposium on Performance Science 2011 (pp. 503–508). Utrecht, Netherlands: European Association of Conservatoires (AEC).

I think it’s pretty cool in a couple respects. For one, it splits the performance up into multiple categories, like a rubric, to prompt you to listen for specific aspects of your performance.

But more importantly, I like that it’s not the 1-10 scale we might otherwise gravitate towards, where 1=mortifying levels of incompetence, humiliation, and shame and 10=idyllic perfection and bliss.

Because as important as it is to have an idealized standard of perfection to aim for in the practice room, that standard is always changing and evolving as our playing improves, and like the end of a rainbow, always stays slightly out of reach. Which means that if you only ever use this 1-10 scale to evaluate your performances, you’re kind of setting yourself up to fail, no matter how well you play.

This QAMPI scale on the other hand, simply measures the gap between what you know you’re capable of at this moment (your dress rehearsal) and what you were able to demonstrate on stage (the performance). Which encourages you to compare yourself with yourself. Which at the end of the day, is really all we can do anyway. As one of my mentors in grad school was fond of saying – you can’t play better than you can play!


References

Masaki, M., Hechler, P., Gadbois, S., & Waddell, G. (2011). Piano performance assessment: Video feedback and the Quality Assessment in Music Performance Inventory (QAMPI). In A. Williamon, D. Edwards, & L. Bartel (Eds.), Proceedings of the International Symposium on Performance Science 2011 (pp. 503–508). Utrecht, Netherlands: European Association of Conservatoires (AEC).

Daniel, R. (2001). Self-assessment in performance. British Journal of Music Education, 18(3), 215–226.

Comments

9 Responses

  1. Many musicians already know to record themselves. But I’ve mostly done (and seen others do) audio recording without visual. I would be interested to read about what the research says about video recording vs. audio-only recording for musicians. Have you found anything about that?

    1. Hi Eric,

      I have to admit that I haven’t come across anything comparing audio and video. And while it’s not a direct comparison, next week’s studies might provide some clues on the benefits of video over audio (hopefully, anyway!).

  2. Hello! Very happy to find your blog. I had trouble accessing the Burton Kaplan articles. Can you fix this?

    Thank you,
    Noralee Walker

    Chair, String Department
    Winchester Community Music School
    Violist, Aryaloka String Quartet

    1. Hi Noralee,

      Yeah, sorry about that. It kind of depends on which browser you’re using, but when you get that warning, click on “advanced” or “details,” and in the description, there will be a link that lets you go on to view the page despite the warning. It’s a good idea to be vigilant about these sorts of things, but in this case the scary warning is just letting you know that the link to the PDFs uses http rather than https.

  3. In regards to the following:
    “A while later, students watched the video of their performance, and evaluated their performance once again based on what they saw and heard in the video.”
    I’m curious just how much time “a while” is. I think it’s a key point.

    1. Hi Paula,

      Very observant. =) Yeah, I struggled with the wording on that for a while, because unless I missed it, this isn’t specified in the paper. Because these performances were actual performances that students had to give throughout the school year, they ranged in length from 10-min to 50-min and took place at different times. So while the paper seems to suggest that the students completed the QAMPI right after their performance, it’s not clear when they reviewed the video. This is as much detail as I could find: “Each participant completed the QAMPI form immediately after their concert performances before watching and then again while reviewing the video recordings.”

      1. I would think that the more time between the performance and evaluating the video (giving the student ample time to “shake off” his/her feelings about the performance), the more accurate the analysis would be. But I guess that is a slightly different question.
        Thanks for the response!
        Thanks for the response!
