DLab 1.0

Friday, September 24, 2010

Experimenting with AR and projectors

Finally got hold of two projectors (thanks to Belle and Ball)
and have been playing with them extensively, and it seems
to me that AR and projectors do not mix well. Here
are the details of the more relevant experiments I've done:

1. Mixing AR with the projector's image:
(single projector and computer setup)

An AR tag (printed on paper) is pasted onto the makeshift screen. The AR program is then run, and the tag is detected fine (left image).

The projector is then turned on and its image projected onto the screen. Immediately, the program can no longer detect the tag (right image).

Conclusion:
A physically printed AR tag cannot be mixed with the projector's projection. The light intensity of the projector's image is far too high for the webcam to see through (notice in the image on the right that the picture on the computer's screen is pure white).
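The washout above can be framed as a simple luminance check. Here is a rough Python sketch of the idea (the threshold numbers are made up for illustration; I don't know the AR tool's actual internals): once the projector pushes even the marker's black border out of the camera's "dark" range, there is nothing left to detect.

```python
# Hypothetical sketch (not the tool used in the experiment): check whether a
# webcam frame is washed out by projector light before attempting detection.
# A frame whose dark pixels are all pushed above a threshold has lost the
# black marker border entirely.

def is_washed_out(pixels, black_level=60, min_dark_fraction=0.02):
    """Return True if too few pixels are dark enough to form a marker border.

    pixels: iterable of 8-bit grayscale values (0-255).
    black_level: luminance below which a pixel still reads as "black".
    min_dark_fraction: fraction of dark pixels a detectable marker needs.
    """
    pixels = list(pixels)
    dark = sum(1 for p in pixels if p < black_level)
    return dark / len(pixels) < min_dark_fraction

# A printed tag under room light keeps its dark border...
normal_frame = [30] * 100 + [200] * 900    # 10% dark pixels
# ...but under the projector beam even "black" ink reflects brightly.
flooded_frame = [180] * 100 + [255] * 900  # no dark pixels left

print(is_washed_out(normal_frame))   # → False
print(is_washed_out(flooded_frame))  # → True
```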

2. Using a projected image to display the AR tag
(single computer, single projector)

The tag pattern is displayed by the projector onto the screen, and the AR program is then turned on to see whether the tag can be detected in this case.

Conclusion:
It works, but the camera distance has to be adjusted somewhat.




Note: projecting the AR program itself onto the screen, with the camera pointing at that same screen, is a bad idea, as it produces an infinite video-feedback loop.

3. Double projector
(2 computers, 2 projectors)
Similar to experiment #2, but this time with two projectors and two computers: one projecting the AR program, the other the AR tag.
Conclusion: The light intensity of the combined projected images is too high; the webcam cannot see anything but pure white.

4. Using a computer to display the AR tag #2
(1 projector, 2 computers)

Similar to experiment #2, but this one uses a single projector and a computer dedicated to displaying the AR tag on the screen. The AR program runs on the second computer (no projector hooked up).

Conclusion: It works well, as expected, and the best part is that there is no annoying infinite-loop effect.



5. Animated Tag
(same setup as previous)


Same setup as the previous experiment, but this time the tags are put under some simple motion-tweening animation (flips, movement, scaling, and 3D rotation).

Conclusion: Though it seems to work somewhat, the AR program is only able to detect the tag while it is not under transformation (scaling, etc.), which would indicate that detection is fairly slow and cannot keep up with sudden movements. For the record, 3D rotation does not work at all. In the end, I think it would work if the animation is slow and simple enough (keep it to translation and rotation)!
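Since detection only seems to succeed while the tag sits still, one workaround for the animated tags would be to build hold phases into the tween so the tag rests at each keyframe long enough for the detector to lock on. A Python sketch of the scheduling idea (all frame counts here are illustrative guesses, not measurements; the real animation would live in Flash):

```python
# Hedged sketch: pause the tween at each keyframe for at least the detector's
# (assumed) lock-on time, and count how many frames end up detectable.

DETECT_LOCK_FRAMES = 10  # assumed frames a tag must sit still before detection

def schedule(keyframes, move_frames=5, hold_frames=12):
    """Expand keyframe positions into a per-frame timeline of
    (position, moving?) pairs, pausing at each keyframe."""
    timeline = []
    for a, b in zip(keyframes, keyframes[1:]):
        for f in range(move_frames):  # linear move from a to b
            t = (f + 1) / move_frames
            timeline.append((a + (b - a) * t, True))
        timeline.extend((b, False) for _ in range(hold_frames))
    return timeline

def detectable_frames(timeline):
    """Count frames where the tag has been still long enough to detect."""
    still, hits = 0, 0
    for _, moving in timeline:
        still = 0 if moving else still + 1
        if still >= DETECT_LOCK_FRAMES:
            hits += 1
    return hits

# With 12-frame holds, each keyframe yields a detection window;
# with holds shorter than the lock-on time, the tag is never detected.
print(detectable_frames(schedule([0, 100, 200])))
print(detectable_frames(schedule([0, 100, 200], hold_frames=9)))
```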

6. Integrating a real-world object
(1 projector, 2 computers, an object)


Same one-projector setup, but this time the whole screen is flooded with AR tags; then a white paper cube is integrated into the setup by placing it in front of the projected image.

Conclusion: The AR program will detect the pattern as long as it is on a FLAT surface. This setup works; however, the camera has to be readjusted to take the protruding surface into account. It seems that the program starts having difficulty detecting the tag if it is smaller than 2-3 cm square (likewise, the camera shouldn't be too far from the tag; about half a meter away at most).
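The 2-3 cm / half-meter limit is consistent with the tag simply covering too few camera pixels. A back-of-envelope pinhole-camera estimate in Python (the field of view and resolution below are my assumptions, not measured values for this webcam):

```python
# Rough pinhole-camera estimate of how many image pixels a square tag spans.
# hfov_deg and image_width_px are assumed values, not measured.
import math

def tag_pixels(tag_cm, dist_m, hfov_deg=60.0, image_width_px=640):
    """Approximate width in pixels that a square tag occupies in the image."""
    # Width of the scene visible at the tag's distance:
    scene_width_m = 2 * dist_m * math.tan(math.radians(hfov_deg) / 2)
    return image_width_px * (tag_cm / 100) / scene_width_m

print(round(tag_pixels(2, 0.5)))   # a 2 cm tag at 0.5 m: ~22 px wide
print(round(tag_pixels(10, 0.5)))  # a 10 cm tag at 0.5 m: ~111 px wide
```

At roughly 22 pixels across, each cell of the tag's inner pattern covers only a few pixels, which would plausibly explain why detection falls apart at that size.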



Well, from my experiments it would appear that layering projector images is not a good idea in any case; AR's tolerance for lighting conditions is much more sensitive than I originally perceived. Second, it would be really difficult to project a pattern precisely onto a surface, as in experiment #6. I have skimmed over Max/MSP and projection-mapping technology; however, it might take me a long time to learn and get it working with AR (currently, the free projection-mapping tools I can find do not support Flash files, only QuickTime clips: http://hcgilje.wordpress.com/resources/video-projection-tools/). A projector's image is merely a 2D display of what is on the computer screen, so it would be extremely difficult to achieve 3D, or even an illusion of 3D, with a projector (though the image inside the AR program appears to be 3D, the desktop image, which is what the projector shows, is decidedly 2D). Right now experiments #5 and #6, imperfect and imprecise as they are, hold the most potential and are the most plausible within the time frame.

I still have the second projector for the weekend, so please tell me what else you want me to try with two projectors. I sincerely apologize for not showing up today, and for reporting my progress so late (I went out to see the doctor).

3 comments:

  1. Krist, we have a meetup with AJ Moe on Sunday.
    What time are you free?

  2. Krist, thanks!
    hope you feel better now.
    Prior to meeting with you and Palm tomorrow,
    try:
    - back projection
    - a way to use 2 projectors, i.e. 2 computers, 2 webcams, reading the same tag but projecting different information over each other? Really push the 2-projectors idea with overlapping projection; the AR tag can be read elsewhere if there's a problem with lighting
    - if 2 projections, maybe a way to create a 3D image... like stereoscopy or an anaglyph image? (search Wikipedia for it)

    - work with Palm on how to incorporate this into a design proposal for midterm and show it to me tomorrow

    thanks again + see you on Sunday,
    11am at BACC

  3. Experiment #5: Apart from just an animated tag, what about an interactive tag, where you change/manipulate certain parts of the tag, or merge two or more tags to create a new tag... set up the interaction in Flash? Maybe one projector is about manipulating the "tag pattern" while the other projector shows the outcome of that...

    Experiment #2: Maybe find a way to utilize the optical/video feedback, if you two can and have time
