Tonchidot just revealed its Sekai Camera, a system for using online data to navigate the real world, at the TechCrunch50 conference in San Francisco. I thought this was incredibly cool, and the audience agreed — it kept applauding and cheering throughout the presentation.

The idea is to use location-based data to tag real-world objects, and to present that information as a graphical layer over images in your iPhone camera — much more exciting than a simple map. For example, you can look at a street through your camera, and the Sekai Camera can display arrows pointing to all the nearby restaurants and stores, or to messages from your friends.

This really felt like a device out of science fiction — panelist Tim O’Reilly cited Neuromancer and Snow Crash, but I think it most resembles the augmented reality described in Vernor Vinge’s novel Rainbows End.

The details of the service are a little hazy right now, due to the language barrier — when the panelists started asking questions, the Tokyo-based company’s chief executive Takahito Iguchi responded tersely but enthusiastically. For example, O’Reilly asked, “Can you actually build it, and are you building it the right way?” but Iguchi just shouted that O’Reilly should use his imagination.

“Join us!” Iguchi cried, and the audience started cheering once again.

Update: Several of the panelists were a little skeptical because of the lack of details. Perhaps the biggest lingering question is where the data for the tags will come from — will most of it be user-generated, will it be drawn from existing online sources, or will it come from somewhere else entirely?

Still, the panelists also acknowledged that it’s an exciting idea. Rafe Needleman of Webware began his first question with the preface, “So, before Google buys you …” to which Iguchi responded, “Never!”
