Moonshot - Barkley Labs

Glass to glass, a web-enabled storefront window you control with your smartphone


Here’s one piece from our Reinventing Retail exhibition that’s worth checking out this Friday.

Bridging the Physical-Digital Gap

This retail window proof of concept is just one way we imagine the digital and physical worlds becoming more connected. As shoppers pass the window, they’re prompted to visit a web URL on their smartphones, and upon visiting, they’re given real-time control of the model’s outfits and poses.

For Geeks and Buzzword-Lovers: How it Works

Under the hood, this demonstration uses a couple of technologies we’re really excited about.

Both the window display and the mobile website are written in plain old HTML, CSS, and JavaScript, the languages of the web we’ve all grown to love. However, both pages act as clients of a WebSocket server that allows real-time communication between the two screens. Change an outfit on the mobile page and a packet of information is beamed instantly to the window display, syncing the model’s actions with your selection.
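As a rough sketch of that round trip, the mobile page only needs to serialize the shopper’s choice into a small message and the window display only needs to decode it. The message shape, field names, and URL below are illustrative assumptions, not the actual project’s protocol:

```javascript
// Build the packet the mobile page would send when a shopper taps an
// outfit. The "outfit-change" type and field names are hypothetical.
function encodeOutfitChange(outfitId, pose) {
  return JSON.stringify({ type: "outfit-change", outfitId: outfitId, pose: pose });
}

// On the window display, decode the packet and reject anything unexpected.
function decodeMessage(raw) {
  const msg = JSON.parse(raw);
  if (msg.type !== "outfit-change") {
    throw new Error("unknown message type: " + msg.type);
  }
  return msg;
}

// In the browser, these helpers would be wired to a socket, roughly:
//   const ws = new WebSocket("wss://example.com/window"); // hypothetical URL
//   ws.onmessage = (event) => render(decodeMessage(event.data));
//   ws.send(encodeOutfitChange("summer-dress", "wave"));
```

Because both ends speak plain JSON over one open connection, the update arrives without any page reloads or polling.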

In addition to the handiness of WebSockets, we got to play with something just added to the latest version of Chrome Canary: alpha transparency in HTML5 video. Canary is the developer release of the popular Chrome browser and contains features that haven’t yet shipped in the stable browser. So, while we were excited to play with transparency in videos, don’t expect to roll it out on your clients’ websites just yet.

In our demonstration, the model is actually superimposed on top of a CSS background image. She was shot on green screen, and her surroundings were keyed out to the video’s alpha channel, which means we can swap the background for any color, image, video, or scene we want with just HTML and CSS. Not too shabby! Before WebM gained alpha support, you had to resort to some pretty hacky and cumbersome tricks to get a similar effect.
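The layering itself is about as simple as it sounds: a transparent `<video>` stacked on an element whose background CSS controls. The markup, ids, and file names below are hypothetical placeholders for illustration:

```javascript
// Hypothetical markup: the alpha-channel WebM sits over a styled container.
const sceneMarkup = `
  <div id="scene" style="background: url(storefront.jpg) center / cover">
    <video src="model-alpha.webm" autoplay loop muted></video>
  </div>`;

// Because the video carries its own transparency, swapping the backdrop is
// just a style change. This helper returns the CSS background value we
// would assign to the scene element's style.background.
function backgroundFor(asset) {
  // Image files become a covering background; anything else (a color
  // keyword, hex value, or gradient) passes through untouched.
  return /\.(png|jpe?g|gif|webp)$/.test(asset)
    ? `url(${asset}) center / cover no-repeat`
    : asset;
}

// In the browser:
//   document.getElementById("scene").style.background = backgroundFor("beach.jpg");
```

The point is that the scene behind the model is ordinary CSS, so anything CSS can paint, the window can show.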

For the curious, the transitions between video clips were handled by jQuery: one video simply fades out as the next fades in.
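Conceptually, a crossfade is just two opacity ramps running in opposite directions over the same duration. Here is a minimal sketch of that math; the function name and 400ms duration are illustrative, not from the project:

```javascript
// Given elapsed time into the transition, return the opacity each clip
// should have: the outgoing clip ramps 1 -> 0 while the incoming ramps
// 0 -> 1, clamped at the endpoints.
function crossfadeOpacities(elapsedMs, durationMs) {
  const t = Math.min(Math.max(elapsedMs / durationMs, 0), 1);
  return { outgoing: 1 - t, incoming: t };
}

// jQuery wraps this bookkeeping up for you:
//   $("#currentClip").fadeOut(400);
//   $("#nextClip").fadeIn(400);
```

In practice jQuery’s `fadeOut`/`fadeIn` handle the timing loop and easing, which is why the transition code in the demo stays a couple of lines long.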

This HTML5 Rocks tutorial does a pretty solid job of explaining how to create alpha-channel videos. In our case, though, we were able to take some MOVs from our video team with the alpha channel already present and convert them to WebM with a single ffmpeg command. Be forewarned: if you want to do this on a Mac, you’ll need to compile ffmpeg yourself to get WebM support.

Once again, feel free to stop by and check it out this Friday, from 5-8pm, during our Moonshot Open House. This and numerous other emerging tech demos will be on display for our Reinventing Retail exhibition.