IR, NDVI, BG3 Superblue, Oh My!

edited December 2013 in General
In another thread, we got way off topic and touched on stuff you can do with an IR-converted camera. John posted a cool link to a discussion of NDVI on the PLOTS web page. Among other things, the PLOTS page mentioned the "Superblue" filter, which passes blue and IR wavelengths. The graph on the PLOTS web page seemed to show that the Superblue filter is actually a Schott BG3 filter. I have one of those at work, so I stuck it in our spectrophotometer to take a look. While I was at it I stuck in a Schott UG1 UV filter and my Hoya R72 IR filter:

BG3 UG1 R72 Transmission Curves

Both the UG1 and the BG3 have peaks at both the blue and red ends of the spectrum. The R72 turns on around 720nm and just keeps going.

UG1's blue peak turns off right around 400nm, and its red peak turns on right around 700-720nm, same as the Hoya R72. So for the purpose of an unconverted camera, it's essentially black with a little red leak.

BG3's blue peak turns off a little later, closer to 450nm, and its red peak also turns on right around 700-720nm. So for the purpose of an unconverted camera, it's got marginal blue and red leaks. Just for fun I stuck it on my Canon T2i and took a picture:

Unmodified Canon T2i with Schott Glass BG3 Filter

That's our courtyard at work in more or less direct sunlight. I uploaded the picture full size.

Looking at the curves for the image, there's a fair bit of light in the R and B channels, and practically nothing in G. Which makes sense given the nature of the filter. I'm not sure what I can do with it on my camera, but on a modified camera you should get a fair bit of IR at the red end, and a bit of UV at the blue end.

Given what's on the PLOTS page, it may be possible to get something similar to an NDVI image off of a single frame by doing:

Pixel = (R-B)/(R+B)
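
If anyone wants to play along in software, here's a rough sketch of that per-pixel math in Python (numpy + Pillow). The filename is a placeholder, the channels come straight out of an uncalibrated JPEG, and the small epsilon is only there to avoid dividing by zero, so treat it as a starting point rather than real NDVI:

```python
# Rough single-frame NDVI-style index from a blue+NIR (Superblue-type) image:
# (R - B) / (R + B).  Filename is a placeholder and the JPEG channels are
# uncalibrated, so this is only for playing around.
import numpy as np
from PIL import Image

img = np.asarray(Image.open("bg3_frame.jpg"), dtype=float)
r, b = img[..., 0], img[..., 2]      # red and blue channels
ndvi = (r - b) / (r + b + 1e-6)      # epsilon avoids divide-by-zero

# Stretch -1..1 to 0..255 for a quick grayscale preview
Image.fromarray(((ndvi + 1.0) / 2.0 * 255).astype(np.uint8)).save("ndvi_preview.png")
```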

I'm going to ask around at work tomorrow to see if we have any modified cameras sitting around. But if anyone can think of anything to do with this using an unmodified camera, please let me know. I can go out and get whatever pictures you'd find useful and upload them for some sandbox play time.

Tom

Comments

  • Hi Tom,

    Thanks for posting the graph of the BG3. It seems to be slightly different from the transmission curve LifePixel posts on their website for their Super Blue filter (http://www.lifepixel.com/infrared-filters-choices).

    Your NDVI formula is what is commonly used for a camera modified with a blue filter. I'm curious to see if someone can think of a reason to use the BG3 filter with an unmodified camera. My sense is that you're effectively removing the green channel, so instead of a three-band RGB image you just have red and blue, which seems more like a loss of information. With an unmodified camera you can sort of simulate NDVI using (G - R) / (G + R).

    It's nice to see these discussions on this forum.

    Ned
  • edited December 2013
    If anyone wants to look at the data sheets for a range of filters, some are listed here:
    http://www.uqgoptics.com/catalogue/Filters/FILTERS.aspx
    but often, results can vary depending on the batch/supplier.

    Working in the near IR on archaeological features, we have found that the best differential results are almost always on overcast days, with sunny shots looking more like what you would expect from using a red filter, but there are exceptions:
    [image] http://www.armadale.org.uk/dart.htm
    [image] http://www.armadale.org.uk/bathandbristolyac.htm
    which I have posted before.

    At some point, we will have to try some NDVI for those sites that prove a little elusive in the near IR alone.
    We will be interested to see what approach is finally recommended on PLOTS as the simplest effective technique to employ.

    As we are entering the near IR phase of our SNAPS Scheme, I am a little reluctant to mention NDVI.
    Although in practice working in the near IR is no different from working in the visible part of the spectrum, it is not always easy to convince others of this. Confusing the recipients within the Scheme with NDVI could be going a little too far at this stage.

    Experimentally, Jim Knowles and I have a lot more to do in the thermal infra-red.
  • I'm still monkeying around with this. We use full-spectrum modified cameras at work, so I need to ask if we have any on the shelf at the moment. If we do I want to try this on a modified camera.

    I got some replies to that test image I posted to Flickr that make me think I'm barking up the wrong tree. I'm still trying to figure out how to test some of the stuff he mentioned. I'll post here regardless.

    John, I wouldn't worry about introducing NDVI into your SNAPS Scheme. One of the things I love about the KAP community is that people will go off on their own, test the bejeebers out of an idea, and post the results. I don't know how much of my KAP gear and technique was built off of other people's ideas and experiences. I'd say most of it. I'm happy to be one of the ones who goes off on this tangent and comes back with something that may or may not be usable down the road. If it works, that can be a future step for you guys. If it doesn't, it's time you can spend doing other stuff instead.

    And yeah, please share what you and Jim are doing in the thermal IR! I keep begging the folks at work to let me fly our 640x480 FLIR camera on a kite. They keep looking at me like I'm a dangerous and slightly crazy animal.

    Tom
  • Tom,
    I hope you can find a full spectrum camera to try the BG3. It will produce a really good approximation of NDVI. But even better results are possible with a red filter like a Wratten 25A. As Paul mentioned on Flickr, the blue channel will have a very pure NIR signal and the red channel will have mostly red with some NIR. This not only produces clearer and more meaningful NDVI images than the BG3, but the results are also similar to the NDVI the remote sensing community has been using for 40 years. I have been comparing cameras modified by replacing the IR block filter with either a BG3 or a Wratten 25A. Here are some notes: http://publiclab.org/tag/Wratten25A.

    In order to get good approximations of NDVI with a modified camera, it is important to do a custom white balance. With the BG3 filter, the best results follow custom white balance while flooding the sensor with blue light. So fill the frame with blue paper under blue sky in the shade, or point at a blue sky, or use a blue LED. With the red Wratten 25A filter, the best results follow custom white balance while flooding the sensor with red light. So use red paper in the sun, or a red LED. Here are some notes about white balancing these cameras: http://publiclab.org/tag/white-balance.

    My objectives with these cameras are plant health analysis and vegetation classification. Most of my aerial images for these purposes have been with two-camera NIR rigs, but it will be nice to have a single camera system, too. Here is an application for agriculture (http://publiclab.org/notes/cfastie/10-29-2012/agricultural-trial-mapping) and here is one for invasive aquatic plants (http://publiclab.org/notes/cfastie/07-29-2013/waterchestnut-nrg).

    Chris
  • edited December 2013
    Thanks Tom, I look forward to anything that you may produce. A single camera approach appeals.

    For safety reasons, we do not fly the thermal imager solo, so it is a question of having the appropriate environmental conditions and being able to team up with Jim at a moment's notice.
    [image] Me and a dog back in 2011 (http://www.armadale.org.uk/phototech06.htm)

    We want to thermally image this site next year before normal crop marks appear:
    [image] http://www.armadale.org.uk/lochlands.htm

    Larry Purcell (http://cses.uark.edu/1817.php) is the only other person I know who is doing kite aerial thermography:
    [image] Kite aerial image of a soybean crop (33°C, RH ~33%, wind 10-12 mph)

    But I am sure that there are others out there.

    Thermal imagers are being flown on hexacopters more often than on kites (or they just publish more!) but at greater risk. http://diydrones.com/profiles/blogs/flir-fpv-with-hexacopter
    Kites have the clear advantage of being able to lift any weight.
    I don't have experience with filters, but would it give similar results to take RAW pictures and then post-process the photos?
  • edited December 2013
    Ned Horning posted this in another thread, but it bears re-posting here since it's the topic for the thread:

    http://publiclab.org/notes/nedhorning/11-01-2013/why-a-red-filter-should-work-well-for-ndvi

    It's an article Ned wrote that points out why a Wratten #25 is better for single-frame NDVI than the Schott BG3.

    I've got two Wratten #25 filters - one glass and one gel - both of which I can fly from a kite. All I need now is a full-spectrum camera to stick them on. Unfortunately the full-spectrum cameras at work are all apart on the bench. So it's time I got one of my own. With the holidays and all I can't sink any money into a new camera. So I'm toying around with the idea of converting my A650 KAP camera to full-spectrum. I'd rather not since I still use it from time to time, but right now it's my best bet.

    Anyway, I'm going to pursue this with Wratten filters once I've got a camera to play with. In an ideal world I'd pick up a new DSLR for KAP and convert my T2i to full-spectrum since that would give me access to 14-bit RAW files, which would lend themselves to post-processing better than JPG. (To answer your question, Yvon, the answer is yes.) But that's a little out of my reach at the moment.

    Tom

    P.S. Egads! The filter numbers are showing up as hash tags. This is why I don't like hash tags. Some things still need that symbol to be a number sign... :(
  • Chris, I'm reading through some of the posts and comments on Public Lab, and may have an answer to a question that was raised in the post "Focus on filters":

    Any time you introduce a plano/plano optic into a converging beam, you introduce spherical aberration to the system. This is why even if you use the absolute best quality glass filter, adding ANY filter to the end of a lens will degrade the image quality to some degree.

    In the case of IR blocking filters, these are fixed characteristics of the optical system at the time the lenses are designed. So most lens designers will design in a little negative spherical to compensate for the IR blocking filter. When you remove the IR blocking filter, you essentially introduce negative spherical into the optical prescription. It helps if you can put that back in. That's why a number of places that do full-spectrum conversions will stick an AR coated BK7 window in front of the detector.

    The devil's in the details, as with most things. How much spherical the plano/plano optic introduces will depend on both the thickness and the index of refraction of the material it's made from. This is why you see a difference in image quality going from a gel filter to a glass filter. Even though the gel filter seems like it should be the poorer choice, the glass filter will typically introduce more spherical.

    So when you pull out an IR blocking filter while converting a camera, be sure to measure how thick the filter is and try to replace it with a chunk of glass of about the same thickness. It's almost impossible to guess what the index of refraction of the IR blocking filter is, but BK7 is a good general purpose optical glass that's the basis for a lot of interference filters. Replacing it with an equivalent thickness BK7 window is usually good enough to make the camera focus exactly the way it did prior to conversion.
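
    For anyone who wants to put a number on it: the longitudinal focus shift from a plane-parallel window is roughly t x (n - 1) / n, where t is the thickness and n the index. Here's a quick sanity check in Python. The 1.1mm thickness and the 1.55 guess for the original filter's index are made-up examples, not measurements from any particular camera; 1.5168 is the usual catalog value for BK7:

```python
# Approximate focus shift from a plane-parallel window in a converging beam:
# shift = t * (n - 1) / n.  Values below are illustrative, not measured.
def focus_shift_mm(thickness_mm, index):
    return thickness_mm * (index - 1.0) / index

original_filter = focus_shift_mm(1.1, 1.55)    # hypothetical hot filter
bk7_replacement = focus_shift_mm(1.1, 1.5168)  # BK7 window, same thickness

print(f"shift from original filter:  {original_filter:.3f} mm")
print(f"shift from BK7 replacement:  {bk7_replacement:.3f} mm")
print(f"residual focus error:        {original_filter - bk7_replacement:.3f} mm")
```

    Even with the index off by a few percent, the residual error comes out in the tens of microns, which is why matching the thickness is the part that really matters.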

    Tom
  • Hey, cool!

    I've been reading the Public Lab site for the past couple of days. Ned Horning wrote a nice article about converting the Canon Powershot A2200 to infrared. The Canon Refurb site didn't have any in stock (thanks for the link!), but I picked one up off eBay for 29 USD. (Cripes! These things can be had for less than I paid for my R72 filter.) As soon as it shows up, I'm going to convert it to full spectrum. I think we've still got a stock of BK7 optical window from all the DSLR conversions we've done at work, so I should be able to use some of the smaller scraps for a window on the A2200.

    YAY! Now I get to start playing in the sandbox with everyone else!

    Chris, I saw in one of your posts that there's a CHDK beta build for the A2200. HOT DIGGITY! This makes experimentation soooo much easier. With regards to white balancing, have you tried making DNG RAW files and doing the processing after the fact? I like my workflow for Canon RAW files, but the software I use won't touch a CR2 or DNG file so I can't use it. Does anyone have any suggestions for RAW workflow using DNG files?

    For anyone else interested in trying this, the A2200 runs CHDK, as far as I know it can be triggered through the USB cable (same as my A650 using the same script!), and it's got analog video output on a 3.5mm jack so it'll talk to video downlink hardware without a converter. Basically I should be able to stick this in my KAP rig using the cables I built for my A650 and get shutter and video right out of the box. I'm stoked.

    I'm off to keep reading on the Public Lab site...

    Tom
  • Tom,

    That's a great deal on an A2200.

    That's really good information about spherical aberration. I'm really disappointed that the glass BG3 filter I installed may have been a different thickness from the IR blocking filter it replaced. Or maybe it was just a different density. So I guess the ideal procedure for converting to a plant stress camera is to replace the IR blocking filter with exactly the same thickness of BK7 glass, and then put a Wratten 25A (or whatever) in front of the lens. Attaching filters in front of Powershots is messy though. And finding the proper piece of BK7 glass won't be easy for most.

    I tried CHDK RAW files once, and they were huge, slow to write to SD, and had no EXIF data. They were pretty easy to convert to jpeg with RawTherapee, but I have not tried it with an infrared converted camera. I have not tried DNG. Ned Horning has a Powershot G11 that was converted to BG3 by Lifepixel, and that camera saves native CR2 RAW. I used that G11 to make CR2 RAW files and also simultaneous jpegs with custom white balance that gave good NDVI results. Lightroom and Photoshop read the CR2 files, but I never figured out how to make Lightroom or Photoshop start with the CR2 and reproduce the color balance of the camera jpegs. It must be possible, but I gave up and just make sure the camera is white balanced properly before taking jpegs.

    Chris
    I was wondering why the near infrared reflected light is used as a measure of the sunlight intensity. Why not use the green light, since it's being reflected as well? Is it because the green light seen by the camera overlaps the red and blue, giving a less accurate measure of the sunlight intensity?

    If the green light is used instead of the NIR, it might give a less accurate NDVI reading but the camera wouldn't have to be modified. It might give some interesting results in many cases.

    Does this make sense?
  • edited December 2013
    Chris - while I never used it to hold a filter, I did use a filter adapter to protect the lens of my Canon S95 (and now my S100). I suspect there are filter adapters available which could be used in this way for many of the Powershot (and Ixus) cameras. See below (and other images in the same set).

    S95 with protector
  • edited December 2013
    YvonH,

    Plant reflection in the IR is much, much larger than in green wavelengths (you can think of it as having a higher albedo than snow in these wavelengths). It's also better at differentiating the health status of plants, as reflectivity in those wavelengths is related directly to cell wall structure/margins/air spaces and changes dramatically with wilting and turgor pressure. Reflectivity in green is related to pigments, so it doesn't change that rapidly in response to water stress.

    Optimally you use a multispectral or hyperspectral imager to classify vegetation and health status. With 200-plus channels you can identify oak species, map canopy leaf density, estimate productivity/carbon fixation, or map minerals in desert soils associated with gold mineralization. Even better for plant ID is to compare remote sensing images through the seasons.

    Plus false color images are cool and IR does neat things with water depth ;).
  • Dave,
    That's a bit of a revelation for me. I didn't know that some Powershots have a removable ring on the front designed to allow twist-on adapter tubes (or lenses). Two of my early KAP cameras (A590 and A630) have this. I always wondered what that little button by the lens was for! So I could buy an adapter like this: http://www.aliexpress.com/item/Black-52mm-52-mm-Camera-Lenses-Adapter-Tube-Ring-for-Canon-A570-IS-B52-Lens-Hood/1033170003.html and use a 52 mm filter. Or I can use your mounting scheme to put this on any other camera. Now all I need is the correct piece of BK7 glass to replace the IR blocking filter.

    YvonH,
    Remote sensing folks have used green in the "Normalized Difference Greenness Index" ((G-R)/(G+R)) and the "Redness Index" ((R-G)/(R+G)). But as tgran points out, the reflectance of NIR from healthy plants is huge compared to green, so NDVI ((NIR-R)/(NIR+R)) has greater power to discriminate between healthy vegetation and most other surfaces in aerial images.
  • edited December 2013
    With black epoxy and fiberglass you can put a filter adapter on any camera ;)

    But make sure to replace the hot filter with optical glass as Benedict says above.
  • edited December 2013
    I have been using a commercially IR converted A2200 with CHDK for a while, but the image quality is not that brilliant and I still prefer the images from my old Fuji F30. I think that this may relate more to the conversion than the intrinsic camera quality.
    With external filters, I have used blu tack in the past:
    [image] (in this case, a near ultraviolet set-up, with Hoya U-360 UV-pass + Schott BG39 IR-block, 25mm filters)
    with aluminium tape to fasten the filters together.
    Now, I have just glued the filters to the lens housing on the Fuji.

    For larger cameras, it is often cheaper to buy a stack of adapter rings than several individual ones:
    http://www.ebay.co.uk/itm/16-PCS-49-52-55-58-62-67-72-77-82mm-Step-Up-Down-Ring-Filter-UV-CPL-Set-DC147-/291001923924?pt=UK_Photography_Adapter_Rings&hash=item43c110f954

    To protect my compact camera lenses, I cut lengths off a telescopic pole, which provides a wide range of diameters.
  • Thanks for the explanation on green vs IR.

    I found these instructions for modifying a Canon T4i, if anyone is interested. The video is almost one hour long.

    http://www.youtube.com/watch?v=7huA4R9rXrQ
  • edited December 2013
    Nice instructions. The rate-limiting step for me is usually finding the right optical glass replacement for the hot filter. I've tried cutting the right glass to size myself and cleaning it, and it was pretty much a failure with the equipment I had available. I either need some new skills on this front or more money to buy filters at the exorbitant prices they're sold for online. [ Scratch that, those Astronomik hot filter replacements are much more affordable than other ones I'm used to seeing . . . Thanks for the link ]

    One big plus of a full spectrum camera is that it collects a lot more light: 2x or more, depending on the scene. That allows faster shutter speeds and opens up night KAP opportunities you might not otherwise have. I'm willing to bet you wouldn't even notice weird colors much in a night-time cityscape.
  • I asked how the full spectrum filters were cut here. This is what I was told:

    The filter blank was bedded in fixturing clay. It was then cut using a water cooled diamond saw. The clay did a good job of holding the glass in place, and kept the edges from fracturing. The diamond saw had a high speed and low feed so it wasn't pushing into the glass much at all. Water cooling was necessary.

    That was all a verbal regurgitation from one of the guys who watched it done. The guy who did the cutting has since left the company, so I can't ask him directly.

    Tom
    Thanks for asking! For one cheap camera I used a microscope cover slip that was the exact same thickness, trimmed it crudely with a sapphire cutter, and put it in place with black silicone. In my next life I want to work in a micro-machining lab.
  • I know I've posted about IR stuff in a couple of threads, so I wasn't sure where to put this. I hope this one is appropriate. This also isn't news since others have already done it. But I figured I'd share:

    I've had an IR-converted A2200 for a while, and picked up a second A2200 with the intent of flying them both on a KAP rig at the same time so that I can do four-color photography. I have the bracket made, and got the thumbs-up from James for the modification I had planned for my GentLED-CHDK (basically to wire in two USB pigtails off the same cable). Despite all that I still haven't flown both of those cameras at the same time on the same rig. Shame on me.

    But I did finally fly them both! My wife and I made a recent trip to Maui on Mokulele Airlines. The significance here is that they use single-engine turboprop planes rather than jets, so the whole trip is spent below 9000' ASL. Not quite KAP range, but a lot better than 35,000'! I hand-held both cameras, pointed them out the window, and got busy.

    Keep in mind this was all hand-aiming without a bracket. And the shutters were triggered by manually half-pressing two shutter buttons, re-aiming both cameras, and then fully depressing both buttons. So "close" is all I could hope for. I made a bunch of photo pairs in the hopes I'd get lucky. I wasn't able to check the photos until we got back. All I can say is I got lucky.

    Step one was to bring both photos into Photoshop CC for alignment. In the past I've used Microsoft ICE, Autopano Pro, and Photomatix. They work, but I was seriously impressed by the job Photoshop did, especially given that one of the photos was color and the other was IR. (I'm not all that impressed with Photoshop's panorama stitching, though.) With the two aligned photos in hand, I got busy.

    South Kohala Color Infrared

    This first one is a synthetic color IR photo meant to reproduce how color IR film works: IR light goes to the red channel, red light goes to the green channel, and green light goes to the blue channel. Blue light is ignored. I did all the channel monkeying in Photoshop, but in retrospect it would've been loads easier in ImageMagick.
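
    For the record, here's roughly what that channel shuffle looks like scripted in Python with numpy and Pillow instead of Photoshop. The filenames are placeholders, it assumes the two frames are already aligned and the same size, and it uses the IR photo's green channel as the NIR source, which is a choice rather than gospel:

```python
# Synthetic color-IR: NIR -> red channel, red -> green, green -> blue.
# Assumes the two frames are already aligned and identically sized;
# filenames are placeholders.
import numpy as np
from PIL import Image

rgb = np.asarray(Image.open("aligned_rgb.jpg"))
nir = np.asarray(Image.open("aligned_ir.jpg"))

cir = np.dstack([
    nir[..., 1],   # IR camera's green channel stands in for NIR
    rgb[..., 0],   # visible red -> green channel
    rgb[..., 1],   # visible green -> blue channel; visible blue is dropped
])
Image.fromarray(cir).save("synthetic_cir.png")
```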

    South Kohala NDVI

    The second is an NDVI taken from the same two photos. The IR information came from the IR photo's green channel (most info, least noise, though a straight grayscale probably would've worked better). The R information came from the RGB photo's R channel. The processing was done in ImageMagick using the same script I wrote to do NDVI on ground-based photos.
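
    The NDVI step itself is just the normalized difference fed from the two aligned frames. Same caveats as above: placeholder filenames, no calibration, aligned inputs assumed:

```python
# Two-camera NDVI: NIR from the IR photo's green channel, red from the
# RGB photo's red channel.  Frames must already be aligned; no calibration,
# so the result is only qualitative.
import numpy as np
from PIL import Image

nir = np.asarray(Image.open("aligned_ir.jpg"), dtype=float)[..., 1]
red = np.asarray(Image.open("aligned_rgb.jpg"), dtype=float)[..., 0]

ndvi = (nir - red) / (nir + red + 1e-6)
Image.fromarray(((ndvi + 1.0) / 2.0 * 255).astype(np.uint8)).save("ndvi.png")
```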

    I made no real effort to calibrate the cameras since I didn't even think of doing this until I was on the plane. So the only real information you can get from the NDVI is that white areas show photosynthesis at work, and dark areas don't. There are a couple of hints up in the cloud forest on top of the mountain that you MIGHT be able to differentiate between different kinds of vegetation, but that could be a fluke.

    One thing I was happy about: NDVI is supposed to take out variations in illumination, say, from cloud shadows on the ground. The noise will go up in the shaded areas, but the NDVI response to the ground cover shouldn't change. Except for one really noxious clouded area, that's basically what I got. Yaaay! It worked!

    This is encouraging enough for me to finish out the modifications to my GentLED-CHDK cable and put the tray in my KAP rig. I'm still not sure what I'll use it for at this point since I haven't been doing any archaeological work recently, but like everything else I've done with KAP so far it IS a lot of fun.

    Cheers,

    Tom
  • YAAAAAY! Done at long last!

    IR / RGB KAP Rig

    Can't wait to get this thing up in the air.

    Tom
    Looks good! :) And the results above also! I'd be interested in this sort of thing too, but it became obvious that I can't convert the cameras by myself; I'm just horrible with the tiny bits and pieces. :) I had to open an A810 because the lens mechanism was full of sand, and at the same time I looked at what it would take to IR convert it. Maybe I just didn't have the patience or the motor skills for it, but now the whole thing is totally demolished. Oh well...
    I understand. This A2200 took me two tries. I got dirt in mine, too, and figured if I was already taking it apart I might as well finish the job. I ran into issues with one of the connectors, but after messing with it for a couple of hours with my heart in my throat, I finally figured it out. Taking apart cameras isn't simple, no matter how easy the tutorials make it look. I hope never to have to take this one apart again.

    Tom
  • The weather held! I got it airborne!

    I was a little concerned about the two shutters going off at the same time, so I re-wrote my CHDK script to half-press the shutter on the rising edge of the +5V pulse the GentLED-CHDK puts out, and full-press the shutter on the falling edge. This requires the blue wire on the GentLED-CHDK to be plugged into a ground pin on the RC receiver so you can flip the shutter switch to focus, and flip it back to trigger. (And for what it's worth it's identical to the behavior of the GentLED cable I use on my Canon T2i. Now all my cameras are consistent!)

    That combo worked GREAT. Going through the photos later, there was no question which ones were paired. I aligned in Photoshop CC, but did all the channel math using ImageMagick. Here's a post-aligned pair:

    Mala`Ai Test - RGB

    Mala`Ai Test - NIR

    And here's a synthetic color-IR generated from three of those four channels:

    Mala`Ai Test - CIR

    I tried doing some other analysis, but the low angle of the light really didn't work with anything I attempted. (That's what I get for flying after work!) I'm just stoked the whole thing worked so well.

    A couple of other fun things happened on that flight. The wind was a little high, so I used my Flowform 16. I realized after I launched it that I was standing in the same exact spot I'd stood the first time I pulled that kite out of its bag and let it catch wind. That put a smile on my face.

    I put the kite up to test the wind while I put my gear together. Before I could attach the rig to the line, a bunch of kids came bounding up to ask what I was doing. They LOVED the idea of hanging a camera from a kite line. I gave the kids my transmitter and monitor while I put the rig on the line and gave it some altitude. They were getting such a kick out of running out to wave at the camera, then running back so someone else could have a turn, that I almost lost all my light! But their parents came by about twenty minutes before the sun set behind the clouds, so I did finally get my tests in. But they had a blast. And so did I! It was a great reminder of all the things that make KAP so much fun.

    Tom

    Whoa, I'm amazed by your results. Very, very nice!

    Curious, what were the shutter speeds here?

    If that wasn't a KAP success, I don't know what is. :) You got to fly a kite, got aerial images from two cameras, normal band and IR and got to delight kids, all at the same time, awesome! :)
  • Shutter speeds were 1/160s on the NIR and 1/500s on the RGB.

    I should probably point out that I'm not really breaking ground with this. Scott Armitage built a two-camera rig several years ago, and the folks at Public Lab have been building and selling turnkey two-camera systems for balloons and kites. I'm standing on the shoulders of a number of giants. (But the view really is nice!)

    Tom
  • Tom, congratulations on getting the rig working. The results are really great.

    What do you think about not being able to see the LCD of one of the A2200s? Is that loss tolerable? I don't know much about the GentLED-CHDK, so I was wondering why you needed a script since CHDK has a remote synchronization feature that works really well to fire the triggers simultaneously with no script running.

    If you get more VIS/NIR pairs you might want to try Ned Horning's plugin for Fiji. Give it two directories for VIS and NIR photos and it does the alignment of each pair (using the SIFT algorithm), crops to the overlapping area, and produces NDVI and false color IR for each pair. It's a very powerful tool.
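
    If you ever want to script that alignment step outside of Fiji, the same idea looks roughly like this in Python with OpenCV. I'm using ORB features here instead of SIFT simply because ORB ships in the stock OpenCV build, and the filenames are placeholders; it's a sketch of the approach, not what Ned's plugin actually does internally:

```python
# Feature-based alignment of a NIR frame onto a visible frame, then warp.
# ORB stands in for SIFT here; filenames are placeholders and the two images
# are assumed to overlap substantially.
import cv2
import numpy as np

vis = cv2.imread("rgb.jpg")
nir = cv2.imread("ir.jpg")

orb = cv2.ORB_create(4000)
kp_vis, des_vis = orb.detectAndCompute(cv2.cvtColor(vis, cv2.COLOR_BGR2GRAY), None)
kp_nir, des_nir = orb.detectAndCompute(cv2.cvtColor(nir, cv2.COLOR_BGR2GRAY), None)

matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des_vis, des_nir), key=lambda m: m.distance)[:500]

# Homography maps NIR coordinates into the visible frame
src = np.float32([kp_nir[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
dst = np.float32([kp_vis[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

aligned = cv2.warpPerspective(nir, H, (vis.shape[1], vis.shape[0]))
cv2.imwrite("aligned_ir.jpg", aligned)
```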

    I'm looking forward to more infrared shots.

    Chris
  • Thanks!

    I don't mind not being able to see one of the LCDs. I'm only really using the video link to aim, so as long as I can recognize ground features on my screen I'm good to go.

    I tried playing with CHDK's own remote stuff back when I was using it on my A650. It had some limitations for what I was trying to do, so I didn't use it. At this point I think I'm using it exactly the way they designed it. But I'm so used to running scripts at this point that it seemed the logical way for me to do it. I really should just set both cameras side by side and play with CHDK.

    Thanks for the link to the Fiji plug-in. I'll give that a go. The more tools, the merrier!

    Tom
  • edited September 2014
    Using Ned Horning's stuff is going to involve a bit of a learning curve. First, I tried it with my current installation of ImageJ. (Fiji is just ImageJ, after all, right?) Oooooh boy. Lesson #1: My version of ImageJ is old. So I installed Fiji and went from there.

    I'm having problems using the first script - the one that pairs RGB and IR images. I need to check the clocks on the two cameras. It kept pairing up the wrong images. But when I cranked out my own matchedImages.txt file, the rest worked fine. I compared it against Photoshop and Microsoft ICE for alignment, and you can't tell the difference between them. (Hmmm! Everyone's using SIFT!) I think the transformation Ned's using beats ICE hands-down.
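
    In case it saves anyone some typing, my stopgap was basically to pair each RGB frame with the IR frame closest in time. Something like this, where file modification times are a crude stand-in for the EXIF capture time and the two-column output is only a guess at what the plugin wants, so check Ned's notes before trusting it:

```python
# Pair each RGB frame with the IR frame closest in time.  File mtime is a
# crude stand-in for the EXIF capture time, and the tab-separated output is
# only a guess at the matchedImages.txt format (check Ned's notes).
import glob
import os

rgb_files = sorted(glob.glob("rgb/*.JPG"))
ir_files = sorted(glob.glob("ir/*.JPG"))

for rgb in rgb_files:
    t = os.path.getmtime(rgb)
    closest = min(ir_files, key=lambda f: abs(os.path.getmtime(f) - t))
    print(f"{os.path.basename(rgb)}\t{os.path.basename(closest)}")
```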

    I ran into one other issue, though. I wouldn't have seen it if I hadn't been using Ned's software, because I wouldn't have seen it if I hadn't seen every pair from that session combined that way. I have a lag issue on one of the cameras. I had big stands of bamboo in a number of the frames, so I could see what was camera motion and what was subject motion. There's a slight lag between the RGB and IR frames. The two cameras I'm using are running different firmware revisions, so it may be innate behavior of the camera, or it could be something with my script. I won't know until I try CHDK's native cable release stuff. So the jury's still out.

    Thank goodness it's Friday! I'm planning to drag all my gear out and photograph a couple of anchialine ponds near the coast. It should be a good test.

    Tom

    EDIT: Ok, I tried CHDK's shutter stuff. OoOOOOOooh! Cool! Already I can tell the two cameras are much more closely synced. If I'm still seeing relative lag, I'll dial in their delays to minimize it. Thanks again, Chris!