9 responses

  1. Simon Safar
    September 10, 2011

    I can easily imagine people trying such apps for fun, but not yet as a usable UI. Like, in what way is it better to look “through” your phone instead of looking “at” it?

    (And this wasn’t just a rhetorical question. There are actual answers, but still not enough.)

    Most of the examples shown are already available without AR. Figuring out where to go is much easier if you can estimate which places are nearest (i.e., on a map). And after scanning a barcode, it’s simpler to read the description if it doesn’t “hover” over the thing itself but is fixed to your phone’s screen…

    That said, I still think this is the future. (Rainbows End style.) But for that to happen, overlaying shaky 2D overlays and trembling, blocky 3D buildings onto the camera input with seconds of latency is not enough. We would need rock-solid tracking, even recognition, with lower latency, and we have neither the hardware nor the software (computer vision, etc.) for that… but it’s coming. As for the graphics, some TRON-style display would be OK, but it is a must for the phone to know exactly where it is in the world and what it is looking at.

    Still, as long as we have only smartphones, there aren’t _that_ many possibilities for AR (well, interactive IKEA manuals that show you where to put _that_ screw over there would still be cool). At least not many compared to the same thing with HUDs, contact lenses with displays, etc… where the main “how do I know when to pick up my phone and press the scan button” problem would be eliminated.

    I wonder what the world will look like in 2025 :)
