Microsoft Photosynth: Possible KAP Use?

edited August 2008 in Technique
Check out Microsoft's collaboration with the University of Washington.

It seems that there could be a use for it in KAP. I've had bad luck with photostitching in the past, mostly because it just takes way too long to play with all the settings so that pictures at different angles and heights all work together. I hope to have some time to play with the demo this weekend; I'll post my results.

Comments

  • I saw Photosynth demonstrated last year, and it's pretty amazing.
  • Seems interesting, but only on PCs :o(
  • Would be very cool to drop all your photos from an AutoKAP shoot into it; a neat way to explore them.

    We were talking about it for when we do site surveys for theatre remodels; we'd have to take a lot more photos than we currently do, but that's probably okay.
  • Emmanuel, they promised a Mac version soon after sales began for the Windoze version. But they work for Microsoft, so I don't know how accurate that is.
  • Still PC only, as far as I know, but a collection of KAP shots thrown into photosynth here: http://photosynth.net/view.aspx?cid=1E573A33-F39E-48EF-8621-2FBC1920F14D.

    Still need to dig up a hunk of AutoKAP shots and see how it does -- it didn't do too badly considering the fisheye lens...
  • Other than the easier stitching, how does this differ from 360-degree QuickTime scenes? Having looked through a few of the synths, I find the image matching doesn't line up the images very smoothly as you pan around.
  • The idea is not that they are "stitched"; it's that they provide the 3D relationship between the photos. This synth is a test I did just outside my office that sort of demonstrates what it's for: http://photosynth.net/view.aspx?cid=8b061978-1b16-47c0-9363-1b62457135f1.

    So it's not so much stitching the photos together (which it does rather coarsely) as displaying the photos in a 3D environment. It's sort of like you taped photos of everything next to the objects in the room.
  • It's maddening to click on your link, Ben, and be told that it won't run on my Mac. Hopefully, sometime soon.
  • @broox: Yes, me too. Luckily, I have Windows access at work.
  • edited March 2009
    Someone over on the rock-climbing forum suggested I try Photosynth today.

    I put this together from shots I got Friday:

    http://photosynth.net/view.aspx?cid=AD2AD49F-05B4-4A46-B9FF-7078A6656C53

    A fine use for some of the cock-eyed shots I bring home in bundles.

    I read the list of recommended tips on the Photosynth site, and all the tips lined up perfectly with what I do every time I fly the kite.

    -Have lots of overlap. Check
    -Don't crop. Check
    -Shoot texture. Check
    -No zoom. Check
    -Move around. Check

    They may as well have suggested "attach camera to a kite"!

    Very simple to use, and it's FREE!
  • edited March 2009
    There's actually a fair bit of interest in KAP from the software developers working on Photosynth. The more the merrier!

    Here are some I've posted:

    Waipio Valley

    Truck on Mana Road

    Lava Flow (cold, not hot... sorry)

    Pololu Valley

    If you get on the Photosynth forums, you'll probably run across at least one post of mine where I'm asking if we can extract the 3D model data from a synth. (Hey, I had to try!) Looks like the answer isn't all that favorable.

    Tom

    P.S. Brooks, the Silverlight viewer runs on Mac, PC, and even Linux (or so I'm told). It'll let you view synths. Netflix now uses Silverlight for viewing movies online, so some folks may already have it installed.
  • edited March 2009
    Don't tell anyone, but last week I bought Parallels, which lets me load Windoze XP on my Mac just like another application. I thought I needed it to download my genealogy research and tree from the web. Turns out I didn't, but as long as it's there, I decided late last night to download Photosynth.

    After a lot of fumbling around, I wasn't able to get it to work. I did get a helpful window that suggested that I file an error report with Microsoft though.

    }:-( (that's a dark look for the Dark Side)

    I will check out Silverlight.
  • Broox - I run VMWare Fusion on my iMac so I can use XP (I've got old hardware such as a scanner and tablet for which there are no OSX drivers). Anyway, I just installed Photosynth successfully without problems. I'm running OSX 10.5.6.

    Dave
    PS I spent 32+ years working for IBM and have been using a PC since they first appeared in 1981, but switching to an iMac last year was the most sensible computer-related decision I've made in years. I'm still in love.
  • edited April 2009
    Ok, one more fun bit from Photosynth:

    A while ago I read someone's blog where they described how to extract a point cloud from a synth:

    http://binarymillenium.com/2008/08/photosynth-export-process-tutorial.html

    Hey, I had to try it! So I took a synth I'd made from a set of ortho photos over a lava field, and gave it a try:

    http://photosynth.net/view.aspx?cid=f363450f-3e05-479b-aebf-64d31c4be0b6

    Making the synth was the easy part. Getting the point cloud out was not straightforward, even with the detailed instructions on binarymillenium. But once it was on disk, I was able to convert it to a set of CSV files and import it into a 3D modeler (Rhino3D in my case). There were a number of stray points that had to be dealt with, but eventually I got it clean enough to generate a mesh surface through the points:

    Photosynth Surface Extraction
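
    For anyone who wants to script the cleanup, here's a rough Python sketch of the stray-point filtering step. It assumes the cloud has already been dumped to a CSV with x,y,z in the first three columns, and the file names and 3-sigma cutoff are just placeholders, not exactly what I did:

    import csv
    import math

    def load_rows(path):
        with open(path, newline="") as f:
            return [row for row in csv.reader(f) if row]

    def drop_strays(rows, max_sigma=3.0):
        # Keep points within max_sigma standard deviations of the centroid.
        xyz = [tuple(float(v) for v in row[:3]) for row in rows]
        n = len(xyz)
        centroid = tuple(sum(c) / n for c in zip(*xyz))
        dists = [math.dist(p, centroid) for p in xyz]
        mean = sum(dists) / n
        sigma = math.sqrt(sum((d - mean) ** 2 for d in dists) / n)
        cutoff = mean + max_sigma * sigma
        return [row for row, d in zip(rows, dists) if d <= cutoff]

    rows = drop_strays(load_rows("lava_points.csv"))        # placeholder file name
    with open("lava_points_clean.csv", "w", newline="") as f:
        csv.writer(f).writerows(rows)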

    It's not perfect. Not by a long shot. There are gaps where there wasn't enough surface texture to find matching points. There are holes where there were shadows that were too dark to resolve. And the thing is just chunky on the whole. But it also worked.

    It's not GIS, and it's not as good as what PhotoModeler can do. But it worked!

    Tom
  • edited April 2009
    Way too cool. Since it's automated it's better (in some ways) than a lot of commercial software. How many points in your cloud?

    This of course means Microsoft will disable the functionality and kill the project. You weren't supposed to be allowed this functionality till 2019.
  • Hahahaha!! Well... the functionality isn't entirely there. The way to get the point cloud is to install a packet sniffer, watch the HTTP traffic to locate the files, use a mirroring tool to grab the entire directory of files off the server, run them through a TCL script to convert the point cloud to a CSV file, and finally (FINALLY!) import it into a modeler.

    Heck, I want to see a clicky button that just drops the file on my disk. I don't know if Microsoft could do anything to make it any harder than it already is.
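
    If someone wanted to script the grab step, it might look something like this Python sketch. The base URL is a placeholder (it's whatever the packet sniffer turns up), and the points_<group>_<index>.bin naming is my assumption based on the binarymillenium write-up:

    import urllib.error
    import urllib.request

    base_url = "http://example.com/path-found-with-the-sniffer"   # placeholder

    group = 0
    while True:
        fetched = 0
        index = 0
        while True:
            name = f"points_{group}_{index}.bin"
            try:
                urllib.request.urlretrieve(f"{base_url}/{name}", name)
            except urllib.error.HTTPError:
                break            # no more files in this group
            fetched += 1
            index += 1
        if fetched == 0:
            break                # no more groups
        group += 1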

    Now all I need is some better weather so I can try this again on a beach.

    Tom
  • edited April 2009
    Found this interesting article:

    http://www.technologyreview.com/blog/editors/23257/

    Cheers
    Nick
  • Thanks for that link, Nick! That's getting closer to what I'm after.

    Tom
  • edited April 2009
    I remember looking a few years back, when automated, real-time, 3D-model-building photogrammetric software sold for 50-300K per license. Granted, the military was probably the largest client. This stuff was easily taken into an absolute reference frame (GIS) and could integrate data from existing maps, DEMs, and lidar with the photogrammetry. This stuff still exists, along with slightly less expensive commercial software (LPS, Autodesk, etc.).

    So... are they really sharing their algorithm openly? I'm guessing it's being licensed to a few select folks. To me it looks like they are intentionally hiding their algorithm by conducting the calculations for Photosynth on their own server. They are also making it fairly inconvenient to get the simplest of data files out of Photosynth. It doesn't look like they want competing viewers or third-party modeling software to take off. Some day they are going to come up with a business model, and I doubt they'll give out free professional software for the price of click-through advertising.

    Maybe I'm wrong. I'm a bit confused by the academic involvement. Certainly some of the work will be published in dissertations and journals.
  • I admit, it's an odd mix. The algorithm came out of a university, I think, so it may have already been open before Microsoft got involved. I'm not clear on the details of that. But Google's developing a tool based off of that algorithm, and now it looks like there are at least two others.

    The calculations actually happen on the client machine. The files all get uploaded to their server after the fact, but the numbers get crunched locally. They're not exactly sending it out as source code, so it doesn't mean I can look at how they're doing it, but it's not so restrictive that they crunch all the numbers themselves. There is at least one other viewer that can view the things, but again it's a Microsoft product. So I don't know where that's going.

    As far as the point cloud, yeah, it's inconvenient at best. But what's surprising is that no one has made any effort to block the one access that's been found for getting to the point cloud. The instructions were published in a blog quite some time ago, but it's still possible to go grab point clouds out of the thing, albeit inconveniently.

    One point that the developers have made in the forum is that 3D modeling software is a lot more picky about matching images. First and foremost, their goal is to render a three dimensional model from photographs. Photosynth needs to come up with a three dimensional reconstruction so it knows where to put each picture, but that's not the goal of the thing. So if the point cloud has stray points, noise, or just plain wrong information, they really don't mind so long as they come up with a visually pleasing result for where to place each photo. In order to export an honest to goodness textured model, they would need to throw a lot more effort toward filtering and cleaning up the point cloud. It doesn't sound like that's effort they're all that interested in.

    I'm just glad there's work being done along these lines, and that new people are getting involved and starting to see the possibilities. It's like back in the bad old days of Windows 3.1 and CAD systems. Autodesk was just about the only show in town. Sure, there were some free packages out there, along with some other commercial ones, but they were all clunky at best. Autodesk had everyone over a barrel, and knew it. Now there's all sorts of options out there for CAD and 3D modeling. It doesn't mean Autodesk has changed their pricing scheme, but it does mean you can buy someone else's software and still play the game.

    There's one player in particular I'm eager to see in all this: Google. They have a product out there already, Google Earth. They buy licenses for data sets and use them heavily. But for 3D reconstruction of cityscapes, they're mostly out of luck. Ah! But then they come up with Google SketchUp and show people how they can draw up renderings of buildings and make the data available for Google Earth. All of a sudden people are providing the content for free. Same with Panoramio, providing photographs for Google Earth. Now combine the two: What if it was possible for someone to walk around a city landscape taking pictures like a fiend, dump all the pictures into this "free" software, and get a very accurate, highly textured model of the city landscape that could be placed into Google Earth immediately? Heck yes they'd provide that for free! Because it would more than pay for itself with unique content that their main competitor wouldn't be able to provide.

    It's not the academic involvement that's got me interested. It's the commercial involvement from companies that are offering more than one product. If product A can generate content for product B and make it more valuable, it's worth developing product A at a loss so long as the increase in B makes up for it.

    Tom
  • edited April 2009
    Interesting stuff, and I think I see your large-scale view. Before, people would have been focused on milking individuals with professional goals for all the money in their wallets. Maybe they will for some sort of Photosynth Pro.

    I agree that the next step is the precision of the point cloud. Basically, I think the ultimate goal of a modeling application is to throw up a crude point cloud for the user to see initially and then add the functionality of 1) calculating/sorting/deleting by RMSE, 2) viewing/editing/dragging individual points with a convenient interface, 3) adding new points (with the assistance of the software giving you approximate locations based on old points), etc. -- like existing modeling software. The crude point cloud is a great starting point that may work very well for some photosets and not so well for others. I do think it assists in getting people over the hump. Getting an initial relationship for a new photo so the software can start helping you approximate the location of new points would be awesome.
  • Tom, really neat stuff. I've played with PhotoSynth but hadn't seen that 3D point clouds could be extracted from it.

    Looks like something I need to play with too ;)

    -Mark
  • edited April 2009
    3D point clouds... I need to read this thread completely.
    Couldn't stop myself:
    http://photosynth.net/view.aspx?cid=4EE2D08B-49B3-45E6-A5D4-FC5F474867C9

    I've been trying to make a decent one of this for ages now; this at least serves as a fun stand-in until I make that happen...

    Edit:
    By now I've read it...
    All of the above makes me think of an old application made by one of the photostitching software makers (Panorama Tools, by H. Dersch, if I'm not mistaken).
    That app was able to stitch pictures of objects so the result could be viewed as a 3D object. It could very well be that this early little tool serves as the base for this Microsoft thingy...
  • edited May 2009
    I thought some of you might enjoy seeing some of the data I've generated with Photosynth:

    Hawaiian KAP/Photosynth test

    Credits for the KAP images go to Tom Benedict and Mike Desilets. I think the results are impressive for my first attempt.


    Youtube video of Photosynth being used from the ground
    This one isn't KAP but may still be of interest.

    -Mark
  • Photosynth definitely has potential.

    I followed the link in one of Tom's entries above with a view to extracting the point cloud from the synthed data. Wireshark I get to work some of the time - for me it is very much "monkey see, monkey do" without knowing all the details that go on behind the scenes.

    The Python script I could not get to work - OK, I am using a horror story called Vista.
    So I looked at the Python script. I do not do these modern languages, but then I had a
  • I've made a few photosynths now. Go to the Photosynth site and search for my user name, "doc_glenn". Mine are not too exciting, but I think it is a terrific way to view KAP. With an Aurico or equivalent, you can easily get a set of photos with comprehensive coverage. I did not input all the photos from each session, so the synth probably had difficulty connecting some areas. I'll keep on experimenting to get better results.

    I also tagged my synths with kite, aerial, kap. Let's all do the same; I want to see more KAP photosynths! And please leave comments on this topic if you want others to look at your results.
  • edited July 2009
    Great work, Quakeguy. Is that mound recent or prehistoric? It looks to me like there may be a few structures apparent in the 3D data...

    I have great news for others who want to use Photosynth with KAP. I've been working with a friend of mine (he's the coder, I'm the mapping/KAP geek) to create a Java script that makes all the steps a little easier. You can download the ALPHA version here:

    Java Script

    "It will sniff out the packet information like wireshark, and then import the .bin's into the program, and even give you a 2D view of the point cloud, from any of the axis you choose. Just please remember this is my alpha version of it. But all the basics are there, and a little more." -- From my amigo's description.

    Please send me any comments or suggestions you have about the app.

    -Mark

    p.s. I THINK I've worked out a way to georeference these data. Your comments on the coordinate system have helped a lot with that. Thanks.
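
    One simple way to do it (not necessarily what I'll end up using) is a 2D similarity (Helmert) transform fitted from two ground control points that appear both in the synth and on a map. Here's a quick Python sketch, with made-up placeholder coordinates:

    def fit_similarity(src, dst):
        # Treat (x, y) pairs as complex numbers: dst = a * src + b
        s0, s1 = complex(*src[0]), complex(*src[1])
        d0, d1 = complex(*dst[0]), complex(*dst[1])
        a = (d1 - d0) / (s1 - s0)   # combined scale + rotation
        b = d0 - a * s0             # translation
        return a, b

    def to_map(a, b, x, y):
        w = a * complex(x, y) + b
        return w.real, w.imag

    model_pts = [(0.12, -0.40), (1.85, 0.95)]                   # synth coordinates (placeholders)
    map_pts = [(215834.0, 2186240.0), (215901.0, 2186312.0)]    # e.g. UTM metres (placeholders)

    a, b = fit_similarity(model_pts, map_pts)
    print(to_map(a, b, 0.75, 0.10))   # any cloud point -> map coordinates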
  • Holy COW! Ok, I gotta play with this.

    Hey, here's a much related question:

    I've got a really nice synth of a garden I did for a friend. She's designing new irrigation for the garden and needs a reasonably decent ortho of it. The accuracy requirements aren't GIS-strict, just as little distortion as I can manage. I'm about to download your amigo's Java script and give it a go, with the aim of pulling out a 2D view of the point cloud looking down. (I take it it keeps the color information?)

    Is there a way to go from the 2D point cloud, which with this photo set is actually quite good and low-distortion, and use it to hang the ortho photos on, so I basically wind up with a 2D composite that is reasonably distortion-free? As you can guess, when I tried this with a panorama stitcher it got all sorts of bendies in it because of the assumptions in the stitching software.
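
    My fallback plan is to just splat the colored points straight down into a raster myself and then warp the photos to match. Something like this Python sketch, using the Pillow imaging library; the file names, the x,y,z,r,g,b CSV layout, and treating z as "up" are all assumptions on my part:

    import csv
    from PIL import Image

    width, height = 1024, 1024

    with open("garden_points.csv", newline="") as f:
        points = [[float(v) for v in row] for row in csv.reader(f) if row]

    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    min_x, max_x = min(xs), max(xs)
    min_y, max_y = min(ys), max(ys)

    img = Image.new("RGB", (width, height))
    px = img.load()
    for x, y, z, r, g, b in points:
        col = int((x - min_x) / (max_x - min_x) * (width - 1))
        row = int((max_y - y) / (max_y - min_y) * (height - 1))   # flip y for image coords
        px[col, row] = (int(r), int(g), int(b))

    img.save("garden_topdown.png")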

    MAN, I'm STOKED about being able to georeference a Photosynth point cloud! SERIOUSLY looking forward to hearing more about that. Please please do share!

    Tom
  • Ok, I gave it a try. I had a hard time getting it to packet sniff the URL. So I used Wireshark and pulled out the path and tried to use that with Open from URL, but still no dice. Finally I did a wget to grab the .bin files and fed them to the script just opening them as files. That worked.

    The renderer was a nice change from my very klutzy, manual conversion method. I got to zoom around and get a nice top-down view of the garden. W00t!

    When I tried to export as a .PLY and load it into Rhino3D, though, it didn't work. Rhino opened the file, but nothing appeared on screen. I'm not sure if this is a non-standard PLY read on Rhino's part, or something funky with Rhino dealing with point primitives in a PLY file. No clue.
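
    One thing I may try next is writing a stripped-down ASCII PLY by hand and seeing whether Rhino (or another PLY viewer) will take a vertex-only file. A quick Python sketch; the x,y,z,r,g,b CSV layout is assumed:

    import csv

    def write_ply(points, path):
        with open(path, "w") as f:
            f.write("ply\nformat ascii 1.0\n")
            f.write(f"element vertex {len(points)}\n")
            f.write("property float x\nproperty float y\nproperty float z\n")
            f.write("property uchar red\nproperty uchar green\nproperty uchar blue\n")
            f.write("end_header\n")
            for x, y, z, r, g, b in points:
                f.write(f"{x} {y} {z} {int(r)} {int(g)} {int(b)}\n")

    with open("garden_points.csv", newline="") as f:
        rows = [[float(v) for v in row] for row in csv.reader(f) if row]
    write_ply(rows, "garden_points.ply")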

    So a qualified success, and doing pretty darned well for an alpha release. I'd love to cut out the Wireshark part of the process, though, because I can't get cut and paste to work with Wireshark unless I save the packet stream, open it in a text editor, and THEN copy and paste. So I'm hoping that's just a case of my not doing something correctly.

    Tom
  • edited July 2009
    Tom, did you install jpcap? Also, do you have Meshlab? Does your output PLY open in that? It shouldn't matter which app you open it in but that's all I've been testing it with.

    Like you, I've been super busy with work and haven't had a chance to play with the Java script too much yet. Stay tuned.

    -Mark