Sorry for the ramble, but this is a long one:
I'm not happy with anecdotal approaches to measuring camera stability. Cameras are real things operating in real environments, so we should be able to apply real measurements to them. Since I've been trying to design a more stable camera suspension, this has been bugging me for a while. I think I came up with a combination of tools that works well enough to share. Even better, every one of us can use these tools. No IMU, no data logger. Just a kite, some line, a camera, a rig, and a computer: the same stuff we use to do KAP.
One of the most useful tools for anecdotally analyzing camera motion is a video made with the camera in question. Watching that video will tell you more about what the rig is doing than anything else we've come up with so far. For a while now I've wanted a way to take a kite aerial video and get a motion profile out of it. It turns out it's been possible for years; I just didn't know it. Video Deshaker is a filter for Virtual Dub, a free video editor. Timonoko first introduced me to both Virtual Dub and Video Deshaker. Deshaker does a good job of taking out extraneous camera motion in pan and tilt. It also does a good job of taking out extraneous camera motion about the optic axis: roll. (As a quick aside, a number of video editors offer stabilization. Few of them stabilize about the roll axis, even among fairly expensive commercial packages. Score one for free software!)
In addition to removing extraneous motion from video files, Video Deshaker also logs what those motions are. On a frame-by-frame basis, it logs translation in X and Y in pixels, and roll rotation in degrees. With a little knowledge about the camera's field of view and resolution, that can be translated into X, Y, and Z rotation in degrees. Multiply by the frame rate, and you get a motion profile of the camera in degrees per second about all three axes.
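The conversion described above is simple enough to sketch in a few lines of Python. The log format here is an assumption (whitespace-separated columns of frame number, X shift in pixels, Y shift in pixels, and rotation in degrees); check it against what your version of Deshaker actually writes, and plug in your own camera's field of view, resolution, and frame rate:

```python
# Sketch: turn a Deshaker-style motion log into angular rates in deg/s.
# ASSUMED log layout (verify against your Deshaker output):
#   frame  x_shift_px  y_shift_px  rotation_deg  [zoom ...]

FPS = 30.0           # video frame rate (yours may differ)
H_FOV_DEG = 60.0     # horizontal field of view of the lens (assumed)
WIDTH_PX = 1280      # horizontal resolution of the video frames
DEG_PER_PX = H_FOV_DEG / WIDTH_PX   # small-angle image scale

def angular_velocities(log_lines):
    """Return per-frame (pan, tilt, roll) rates in degrees/second."""
    rates = []
    for line in log_lines:
        parts = line.split()
        if len(parts) < 4:
            continue  # skip headers or malformed lines
        dx, dy, rot = float(parts[1]), float(parts[2]), float(parts[3])
        pan = dx * DEG_PER_PX * FPS    # X translation -> pan rate
        tilt = dy * DEG_PER_PX * FPS   # Y translation -> tilt rate
        roll = rot * FPS               # rotation is already in degrees
        rates.append((pan, tilt, roll))
    return rates
```

The same arithmetic drops straight into a spreadsheet: one column per axis, multiply the pixel shifts by the degrees-per-pixel scale and the frame rate.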
At this point it becomes possible to do statistics with the numbers. Just for grins, I ran this on some handheld video first. I tested a couple of cases. The first was with image stabilization turned off and the camera held away from my body. The next, with my elbows braced on a table. I repeated these with image stabilization turned on. The results were interesting! With IS off and my arms extended, I had an angular velocity in X of about 1.5 degrees/sec and in Y about 1.0 degrees/sec. In Z I got closer to 2.2 degrees/sec. With arms braced those numbers changed to about 0.6, 0.5, and 1.8, respectively. With IS turned on and arms extended it was 0.5, 0.4, and 1.7 respectively. With IS on and arms braced it was 0.2, 0.2, 0.9 respectively. Neat! IS works!
Also of interest was the total excursion. I calculated this by integrating the velocities to get an angular offset from the first frame as a function of time. For the handheld videos they were all at or around 1 degree over 1024 frames. I had the benefit of live view, so I wasn't surprised by this.
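The integration step is just a cumulative sum at the frame interval. A minimal sketch, assuming you already have one axis's per-frame rates in degrees per second:

```python
# Sketch: integrate per-frame angular rates (deg/s) into an angular
# offset from the first frame. A cumulative sum at the frame interval
# is plenty for this kind of profiling.

def excursion(rates_deg_per_s, fps=30.0):
    """Cumulative angular offset (degrees) relative to the first frame."""
    offsets, total = [], 0.0
    for rate in rates_deg_per_s:
        total += rate / fps   # degrees moved during this one frame
        offsets.append(total)
    return offsets
```

In a spreadsheet this is the classic running-total column: each row adds (rate / frame rate) to the row above.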
I spent the morning pawing around on disk to try to find some kite aerial video I hadn't already smoothed with Deshaker. I couldn't find any from my A650 or T2i rigs, but I found some Nokia N8 video files I'd made with the camera mounted in a rig built from Brooxes components. I picked a sequence where things weren't bouncing around TOO much (I think the N8 rig is light enough that it suffers from wind buffeting more than my other rigs), and ran the same analysis on it. The results were a little startling:
Average angular velocities about the three axes were 5.6, 6.5, and 10.3 degrees per second, with RMS excursions of 4.5, 1.7, and 5.6 degrees about the same axes. This confirmed something I'd wondered about: it's possible for a rig to have fairly small motion in terms of total swing, but still have that motion occur at high frequency and high velocity. A high total excursion is what produces things like tilted horizons and wobbly composition, but it's the high angular velocity that kills image sharpness. So even if a rig seems like it's barely wobbling, if those wobbles are fast they still ruin image quality.
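For anyone wanting to reproduce these summary statistics, they're just the mean of the absolute per-frame rates and the root-mean-square of the excursion series, per axis. A minimal sketch:

```python
import math

# Sketch: the two summary statistics used above, for one axis at a time.

def mean_abs(rates):
    """Mean absolute angular velocity (deg/s) over a sequence."""
    return sum(abs(r) for r in rates) / len(rates)

def rms(values):
    """Root-mean-square of a sequence, e.g. an excursion series."""
    return math.sqrt(sum(v * v for v in values) / len(values))
```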
My next question was what all this meant in terms of shutter speed. In the past I've arbitrarily said that if the image on the focal plane moves by more than one pixel in any axis, it'll show up as blur. I've since come to believe this is extremely optimistic. In Photoshop I can typically tell when an unsharp mask with a FWHM of 0.33 pixels makes a difference; smaller than that and it's harder to tell. So for the purpose of finding out what this means in terms of shutter speed, I decided to try to keep the motion under 0.33 pixels in each axis.
I realize this video came off an N8, but what I'm really interested in is my T2i rig. The T2i has a detector that's 5184 pixels wide. With the lens at 18mm the image scale is 0.0331 degrees per pixel. I took the angular velocities of the N8 rig and applied them to the T2i. To meet my spec I calculated that I'd have to use a shutter speed of 1/2000 second or faster. This was a little discouraging, to say the least.
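The shutter-speed estimate reduces to one formula: the longest exposure is the blur budget in pixels, times the image scale in degrees per pixel, divided by the angular velocity in degrees per second. A sketch using the figures from the post (the exact answer depends on which axis's velocity and what margin you plug in, so don't expect it to land exactly on 1/2000):

```python
# Sketch: longest exposure (in seconds) that keeps motion blur under a
# per-axis pixel budget, given an angular velocity in deg/s.
# Image scale and blur budget are taken from the post above.

DEG_PER_PX = 0.0331     # T2i at 18mm, degrees per pixel (from the post)
BLUR_BUDGET_PX = 0.33   # max acceptable image motion during exposure

def max_exposure(angular_velocity_deg_s):
    """Longest exposure (s) keeping blur under BLUR_BUDGET_PX pixels."""
    max_angle_deg = BLUR_BUDGET_PX * DEG_PER_PX
    return max_angle_deg / angular_velocity_deg_s
```

The useful property is the scaling: double the angular velocity and the allowable exposure halves, which is why a fast-wobbling rig forces such short shutter speeds.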
But... that was using the N8 rig, which I really do think is bouncier than my A650 or T2i rigs. So the next step is to get real video footage with those cameras in their respective rigs and repeat the processing with Video Deshaker and Excel. Then the real fun begins: modifying the rigs to try to provide more stability. The nice thing is, this time I have tools to get concrete metrics before and after, so I not only know I made a difference, I can say just how big a difference each change makes.
I'm continuing to play with this. One other tool I'm trying out is to do a Fast Fourier Transform on the time sequences that this technique generates. It should be possible to pick out the dominant oscillation modes for each axis and determine how strong each one is. (We can already see this by eye when watching kite aerial video. I just want to put numbers to things.) From a design standpoint, it's often easier to attack particular oscillation modes than it is to attack random motion. So far I'm not confident enough in my results to put anything in print. But the idea should be sound.
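The FFT idea can be sketched with a plain discrete Fourier transform over one axis's per-frame rates; the strongest nonzero bin gives the dominant oscillation frequency. This uses only the standard library for portability, though numpy.fft.rfft does the same job far more efficiently on long sequences:

```python
import cmath
import math

# Sketch: find the dominant oscillation frequency in a per-frame
# angular-rate sequence via a brute-force DFT (fine for short clips;
# use numpy.fft.rfft for anything long).

def dominant_frequency(samples, fps):
    """Return (frequency_hz, magnitude) of the strongest nonzero bin."""
    n = len(samples)
    best_k, best_mag = 0, 0.0
    for k in range(1, n // 2 + 1):   # skip the DC (k = 0) bin
        acc = sum(s * cmath.exp(-2j * math.pi * k * i / n)
                  for i, s in enumerate(samples))
        mag = abs(acc)
        if mag > best_mag:
            best_k, best_mag = k, mag
    return best_k * fps / n, best_mag
```

Run separately on the pan, tilt, and roll series, this puts a number on the oscillation modes you can already see by eye in the raw video.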
I encourage anyone else who's going after rig stability to use these tools in their own work. Video Deshaker and Virtual Dub are free, and the numbers can be run through Open Office, Excel, Matlab, MathCAD, or any number of other analysis packages, many of which are also free.