Here is some of the latest out of Supertouch Labs. It utilizes a Kinect, spatial recognition and activation, remote iPad control, and gesture-based media control.
Once in our Coverflow application, the user can control media with simple right and left gestures. Additionally, the system adjusts the volume based on how close the user is to the screen: if she is farther away, we play the content louder so she can hear it; if she is closer, it plays softer.
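The distance-to-volume behavior described above could be sketched like this. This is a hypothetical illustration, not SuperTouch's actual implementation: the depth range, volume bounds, and the linear mapping are all assumptions.

```python
# Hypothetical sketch: map the user's distance from the screen (as reported
# by a depth sensor such as the Kinect) to a playback volume level.
# The range bounds and volume scale below are assumptions for illustration.

def volume_for_distance(distance_m, near=0.8, far=4.0,
                        min_vol=0.2, max_vol=1.0):
    """Map distance from the screen (metres) to a volume in [min_vol, max_vol].

    Farther away -> louder, closer -> softer, per the behavior described.
    """
    # Clamp the distance to the tracked range.
    d = max(near, min(far, distance_m))
    # Linear interpolation: near edge -> min_vol, far edge -> max_vol.
    t = (d - near) / (far - near)
    return min_vol + t * (max_vol - min_vol)

print(volume_for_distance(0.8))  # close to the screen -> 0.2 (softer)
print(volume_for_distance(4.0))  # far from the screen -> 1.0 (louder)
```

A real system would also smooth the depth readings over a few frames so the volume doesn't jitter as the user shifts their weight.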
The core programming works for a variety of media. One can control music, video, movies, presentation slides and images to create amazing presentation control in virtual space. Interested in this? Drop me a line.
In another example of how to use new engagement technology for creative branded executions, I present the collaboration between Nike and the YesYesNo collective to launch the Nike Free Run+2 City Pack series.
Using custom software, YesYesNo was able to track and modify the datastream of a specific runner's route. Factoring in route, speed, distance and a few other variables, they created a live artistic visualization of a particular runner's trek. These end up as compelling visuals that are individually personal to the runner. The visuals were then provided to the runner, with the route and runner info laser-etched onto a custom box containing a pair of shoes from the Nike City Pack series for their city.
In a great hybrid of fashion and technology, Adidas commissioned Didier Brun of Sid Lee to create a system where steps taken in the sneakers could control the sounds in a musical performance. One of the main rules was no faking: the sounds had to come from the dancers' steps. The results are pretty outstanding.
If you want to see how it was done, go to SparkFun.
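The "no faking" rule above boils down to only triggering a sound when a sensor in the shoe registers a real step. Here is a minimal, hypothetical sketch of that idea; the sensor values, threshold, and rising-edge logic are my assumptions, not the actual Adidas rig.

```python
# Hypothetical sketch: a pressure sensor in the sneaker reports samples, and
# a sound fires only on a rising edge (the foot just landed), so every sound
# comes from a real step. Threshold and values are invented for illustration.

def detect_steps(samples, threshold=512):
    """Return the indices where a step (rising edge past threshold) occurs."""
    steps = []
    was_down = False
    for i, value in enumerate(samples):
        pressed = value >= threshold
        if pressed and not was_down:
            steps.append(i)   # foot just landed -> trigger a sound here
        was_down = pressed
    return steps

# Simulated pressure stream containing two distinct stomps.
stream = [0, 10, 600, 800, 300, 5, 0, 700, 650, 20]
print(detect_steps(stream))  # -> [2, 7]
```

Tracking the rising edge (rather than every sample over the threshold) is what keeps one footfall from firing the sound repeatedly while the foot stays planted.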
A nice little marriage of store-window projection and Kinect control coming out of Paris. We did not do this particular one, but it is a nice illustration of the direction that Consumer Engagement Technology and HCI are going in the ad business, and the reason we started our Supertouch division.
Drop me a line if you or your brand are interested in creating these types of experiences.
Here is a good example of where we believe user interaction and technology are heading when it comes to brand experiences. For this experience, Ford UK combined markerless augmented reality and some basic gesture tracking to create a pretty fun experience. They virtually put a Ford car into your hands and let you customize the color, see features, even see how it assists in parallel parking (something people here in NYC should make use of). We have been talking for about two years about how tech and user experience are quickly coming together, and it is good to see people going in this direction.
Here is a short video from our client GE HEALTHCARE IT that showcases two of the technologies our Supertouch group created for the customer engagement area of the GE presence at RSNA this year. Our Question of the Day and Interactive Data Visualization applications and hardware are featured toward the end. We utilized distributed touchable content and custom multi-touch tables to bring some interactive goodies to their presence.
Last week I spent most of my time in the Hotel Del Coronado in San Diego exhibiting for GE at the TEDMED 2010 conference. While I have been a fan of the TED and TEDMED talks for years, being in the environment is a very different experience. Meeting and talking with Richard Saul Wurman (who created both TED and TEDMED) and Mark Hodosh (President of TEDMED) was a treat and both were extremely accessible and enthusiastic to interact with.
About 3.5 weeks prior to the show, we were given a challenge: come up with an idea we could execute for GE at the show, in support of its sponsorship, that would engage and create discussion among the attendees at TEDMED. It should also support GE's involvement in the new site Visualizing.org, which has the goal of making sense of complex issues through data and design.
We came up with the idea of using a variety of datasets, all related to health in the United States, and allowing people to touch, create, comment on and share interactive Data Visualizations at the show site utilizing our Supertouch technology products. We thought that creating this interactive experience, where you could see the correlations between 12 different health factors and outcomes, would lead people to form their own hypotheses about what is going on in our country as it relates to how we look at and treat health. How does Obesity match up against High School Graduation? Is there a connection between Life Expectancy and Median Income? The opportunity to look at the data from a macro USA perspective or state vs. state made exploring this world of data easily digestible. We created a slightly stripped-down version of the app that is hosted on the GE.com/Visualization site here.
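The kind of factor-vs-factor comparison described above comes down to correlating two state-level series. Here is a hedged sketch of that calculation; the per-state numbers below are fabricated for illustration and are not the actual datasets shown at TEDMED.

```python
# Hedged sketch: correlate two state-level health factors, the kind of
# comparison the app let attendees explore. All numbers are invented.

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Illustrative (fabricated) per-state values: obesity rate (%) vs.
# high-school graduation rate (%).
obesity =    [35, 30, 28, 25, 33]
graduation = [72, 78, 82, 88, 75]
r = pearson_r(obesity, graduation)
print(round(r, 2))  # a strong negative correlation in this made-up sample
```

A strong negative r in a view like this is exactly the kind of result that prompts a viewer's own hypothesis, which is what the experience was designed to provoke.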
The experience looked like this:
We brought two interactive SuperTouch 50" multi-touch tables, both of which were projected onto a backdrop so other attendees could see what the user was creating. In addition, we built a pedestal touch unit with a connected 65" monitor for the other application we wrote: a daily poll question where dynamic infographics were generated based on the responses of TEDMED attendees to challenging questions about the future of health. The results were then posted at the end of each day to GEReports.com.
It was amazing to see how people responded to the idea of looking at data in this interactive way and proved our premise that simpler access to complicated data creates dynamic thinking.
The real inspiration came not from the experience itself, which I am very proud of (pulling off such a detailed set of hardware and software in the time we were given, when our usual dev time is 8-12 weeks), but from the conversations with so many amazing and creative thinkers. Both the speakers and attendees at TEDMED are passionate, driven and engaging. Whether it was Martha Stewart, Steve Wozniak (one of my personal mancrush moments) or a Stanford med student, everyone seemed very open to talking, sharing, laughing, thinking and, in our case, interacting with information and each other.
The presentations themselves were outstanding (I will tweet or post sessions that were particularly impressive once they go live on TEDMED), but the hanging out before, between and after sessions was an incredible experience. Speaking in depth with some of these people made my year and created a fire to push forward with creative thinking like very few things in my life. Some of the highlights for me were:
Meeting Mark Koska, creator of the SafePoint syringe, which is literally saving millions of lives.
Having a long discussion with Peter Daszak from EcoHealth Alliance about how disease spreads.
Getting a preview of the beautiful Medica iPad app (a collaboration between Jay Walker, TEDMED and @Radical's Jon Kamen) from Kamen himself, and hearing the thought process behind creating it.
Seeing Jay Walker show the actual 1665 medical record book that contained the first known recorded medical data (related to deaths from the plague that swept Europe at the time) and kicked off modern-day statistics.
And finally, seeing, saying hi to, or just generally basking in the awesomeness of people like MIT's Hugh Herr, Woz, Dean Kamen, spoken-word master Sekou Andrews, Steve Case, Quincy Jones, Frank Gehry, medical pioneer Alex Berenstein, soprano Charity Tillemann-Dick and so many more outstanding people.
Thanks to team GE for giving us the opportunity, it was well worth the lack of sleep.
Visualizing in the Crown Room; the chandeliers were designed by L. Frank Baum, who wrote The Wizard of Oz.
Steve Case checking out SuperTouch
Registration day @ TEDMED
Visualizations could be commented on and shared with your social networks.
Infographics were dynamically created on the fly based on answers from attendees at TEDMED. The larger the box, the more popular the answer.
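The sizing rule in the caption above can be sketched as a simple proportional scaling: each answer's box gets an area matching its share of responses. The answers and counts below are invented for illustration, not the actual TEDMED poll data.

```python
# Hypothetical sketch of the infographic sizing rule: box area proportional
# to the number of responses for each answer. All answers/counts are invented.

def box_areas(counts, total_area=100.0):
    """Scale each answer's response count to a box area summing to total_area."""
    total = sum(counts.values())
    return {answer: total_area * n / total for answer, n in counts.items()}

responses = {"Genomics": 40, "Telemedicine": 30, "Robotics": 20, "Other": 10}
print(box_areas(responses))
# The most popular answer ("Genomics") gets the largest box.
```

A production layout would then pack these areas into rectangles (a treemap-style algorithm is the usual choice), but the popularity-to-size mapping is the core of the effect.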