While waiting for the Mac’s new OS, Yosemite, to download today, I watched a few videos to familiarize myself with some of its new features and functionality. One of the most useful things I learned about is the ability to use QuickTime to display (and record) real-time video from the screen of an iOS device (e.g., an iPhone or iPad) on a Mac computer screen.
But wait — what’s happening on-screen is only part of the video data you need to collect for effective usability testing. Arguably, seeing what the user is doing, including their facial expressions and body language, is just as important as what’s happening on screen. That’s where screen-capture software like ScreenFlow or Camtasia comes into the mix: it can record what’s happening on the iOS device (as projected onto the computer screen via QuickTime) alongside a secondary video stream focused on the user’s facial expressions and body language.
I did a test run with ScreenFlow and the app PBS Parents Play and Learn (which we helped to evaluate a few years ago). I haven’t tried Camtasia yet, but I assume it would work just as well.
Here’s what my quick test produced: http://vimeo.com/111035206
And here’s a tutorial that explains how to get your iOS device to project onto your computer screen: