I recently discovered a neat project that promises to connect the world’s most popular online information source, Wikipedia, with the physical world to deliver an awesome new user experience. The Semapedia project leverages Semacode technology to connect physical objects with information available in Wikipedia. For those of you who have not heard of Semacode technology before, here is a short snippet from their website:
Semacode’s Software Development Kit is a system for ubiquitous computing. Using the Semacode SDK you can create visual tags for objects and contexts, and read them using a mobile camera phone. Our software running on your phone will then deliver you to the appropriate mobile content.
Semacode works by embedding a URL (web address) into a two-dimensional barcode that looks like a dense crossword puzzle (pictured), called the tag. The SDK software can detect and decode the tag very rapidly with the camera on your phone. It extracts the URL and sends you to that address using the phone’s built-in browser.
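Semacode’s actual SDK runs on the phone itself, so I can’t show its real API here. But the decode-and-dispatch flow described above can be sketched in a few lines of Python, with a stubbed decoder standing in for the camera-based barcode reader (all function names below are hypothetical, not part of the Semacode SDK):

```python
from urllib.parse import urlparse

def decode_tag(tag_image):
    # Stand-in for the SDK's camera-based decoder. In the real system this
    # step reads the 2D barcode from a camera frame; here we just pretend
    # the "image" is the embedded payload itself.
    return tag_image.strip()

def dispatch(payload):
    # Validate that the decoded payload is a usable web address before
    # handing it to the phone's built-in browser.
    parts = urlparse(payload)
    if parts.scheme not in ("http", "https") or not parts.netloc:
        raise ValueError(f"tag payload is not a URL: {payload!r}")
    return payload  # a real client would open this URL in the browser

url = dispatch(decode_tag(" http://en.wikipedia.org/wiki/Hofburg "))
print(url)  # http://en.wikipedia.org/wiki/Hofburg
```

The interesting design point is that the tag carries the full URL, so the phone needs no server-side lookup step: decode, validate, browse.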
So here is the mile-high view of how Semapedia works. A Wikipedia entry is associated with a Semapedia tag, and that tag, in turn, is associated with a physical location. This lets mobile users point their camera phone at a tag posted at a location, like the Hofburg in Vienna, and instantly view the associated Wikipedia entry on their phone (as long as they have the software installed).
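The three-way association described above (Wikipedia entry, tag, physical location) can be modeled as a tiny record; this is just an illustrative sketch of the data, not anything from the Semapedia project itself:

```python
from dataclasses import dataclass

@dataclass
class SemapediaTag:
    # Illustrative model of the association chain: a Wikipedia entry,
    # encoded in the printed 2D barcode, posted at a physical place.
    article_url: str   # the Wikipedia URL embedded in the barcode
    location: str      # where the printed tag is physically attached

hofburg = SemapediaTag(
    article_url="http://en.wikipedia.org/wiki/Hofburg",
    location="Hofburg, Vienna",
)

# Scanning the tag at its location yields the article URL directly;
# the association "database" travels inside the tags themselves.
print(hofburg.article_url)  # http://en.wikipedia.org/wiki/Hofburg
```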
Pretty neat, right? This simple idea promises to help usher in the next generation of pervasive computing services that leverage existing networked services on the web. Hopefully this type of service will inspire more novel Web/Mobile mashups. For instance, it would be pretty cool to hook up Riya with a set of online information services in a similar manner. I will have to run that by Peter Rip and crew at some point soon.
I love being a geek (sigh). I have to go now, my spidey-sense is tingling. Later Fanboys.