The Exploratorium’s team of Second Life staff and volunteers put on another mixed-reality webcast viewing event in SL on June 5, 2012, extending the Exploratorium’s live webcast of the 2012 Transit of Venus into the virtual world. Avatar-scientist Patio Plasma hosted the in-world event, which featured the 6.5-hour live webcast of telescopic images of the sun and the rare planetary transit. Volunteers in SL helped avatars see the stream. For this event, we tried a different method of putting the stream into SL: using the HTML-on-a-prim feature of the SL viewer. This method proved workable, though not as easy for SL residents to use as the traditional approach of replacing a prim texture with a QuickTime stream. With each event, we learn more about the best ways to use the multimedia features of SL and how to engage avatars with live webcasts.
Ever wondered what your goldfish are thinking about? A playful exhibit called MIND THE FISH at the Cinekid Festival in Amsterdam this month allows visitors to peer into a fishbowl via AR and read the discussions the fish are having with themselves, with other fish, and with visitors. The amusing texts, like “I feel like everyone is looking at me … or am I crazy?”, appear as thought bubbles projecting from the fish as they swim around their aquatic habitat, seen on a movable screen with an augmented view of the bowl. The interactive work was made by Arthur van Beek, Sander Veenhof, Edith Kuyvenhoven, and Tijmen Woudenberg, collaborators from the Netherlands who each contributed different skills to bring the inner thoughts of the fish to light.
I asked AR artist Sander Veenhof some questions about creating the work and the reactions of visitors.
What are the challenges of designing an AR exhibit that incorporates a physical interface beyond a mobile device?
The subject matter we were trying to augment posed the foremost challenge. Since no off-the-shelf, open-source system for tracking and tracing goldfish exists, we created one ourselves. Fish are quite challenging to track because of their shininess and their constant moving and turning in front of the camera. To our distress, when the festival lighting was installed, it turned out that orange was a very popular light color, bathing the whole surroundings in orange, so our system constantly detected numerous invisible fish.
The physical interface is actually not very physical. There’s no motion sensor or angle detection involved. Just by turning the device and the webcam, new fish enter the visible screen, and the software handles the placement of the text call-outs.
What are the fish saying?
The fish stories were written by an expert in the field of children’s stories, and she even had experience writing stories in which animals play the main role. A couple of stories are about fish wondering what to do: playing computer games, seeing a movie, and of course playing hide and seek. Luckily for them, there’s one plant in the fishbowl. Once in a while, a virtual fish appears. Its remarks indicate that it has no clue how things in the real world work; it doesn’t even know what a dictionary is when told to look something up in one. Furthermore, one of the goldfish acts as a #twittervis, relaying recent tweets that include both #twittervis and #cinekid.
What were the roles of the different collaborators on the project?
This was collaboration to the extreme. Every part was done by someone else, and all the components came together in the last few days before the opening. The system had an optimally modular setup. The hardware was designed by Arthur van Beek, based on an original concept he and I developed. He reserved space for a laptop on which Tijmen Woudenberg ran his ‘orange tracking’ software. The tracking software kept track of the fish, numbering them uniquely, and made requests to an online dialogue server I created, which determined which fish were available, whether a monologue or dialogue was in progress, and whether the fish involved were still present. If not, a new one-liner or story was started. All monologues, dialogues, and even trialogues were written by Edith Kuyvenhoven, who kept contributing new texts until half an hour before the opening, fine-tuning them based on the experience of seeing the installation working as a whole for the first time.
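The modular flow described above (the tracker reports fish IDs, the server keeps running stories alive or starts new ones) can be sketched in a few lines of Python. This is an illustrative toy with hypothetical names, not the project’s actual code:

```python
class DialogueServer:
    """Toy sketch of a server that assigns stories to tracked fish.

    Each story needs a fixed number of fish (1 = monologue,
    2 = dialogue, 3 = trialogue). On every tracker update we drop
    stories whose fish left the view, then try to start the next one.
    """

    def __init__(self, stories):
        self.stories = stories        # list of (n_fish, list_of_lines)
        self._next = 0                # index of the next story to run
        self.active = []              # list of (fish_id_set, lines)

    def update(self, visible_ids):
        visible = set(visible_ids)
        # Keep only stories whose fish are all still in view.
        self.active = [(ids, lines) for ids, lines in self.active
                       if ids <= visible]
        busy = set()
        for ids, _ in self.active:
            busy |= ids
        free = sorted(visible - busy)
        n_fish, lines = self.stories[self._next]
        if len(free) >= n_fish:       # enough idle fish: start the story
            self.active.append((set(free[:n_fish]), lines))
            self._next = (self._next + 1) % len(self.stories)
        return self.active
```

The key property this models is the one Sander describes: a story survives only as long as all of its fish remain trackable, and fish that disappear simply free up their slot for the next one-liner.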
There must have been a great sense of wonderment and whimsy for people who engaged with the exhibit. Were any observed reactions surprising?
I found it fascinating to keep getting the question: “How do you know what they are saying? Is this real? Or not?” The fact that it is a question means it looked convincing to the children. Many wanted to install such a system at home, pointed at their own fishbowl.
Besides the AR aspect, the installation functioned in an almost analog way. Kids, of course, check how they look when viewed from inside a fishbowl. Actually, it was something I was curious about too: what does the outside world look like from inside? We have a lot of plans for variations and new versions of the installation. One of them is to test the effect of installing a fish-eye webcam. I’m very curious what the world in the bowl will look like then.
MIND THE FISH at the Cinekid Festival, Amsterdam, Oct 2011
Recomposing Man Ray and Magritte (while being followed around by a giant blue ant): mobile AR at the Exploratorium’s After Dark
Over the past month, I’ve been developing mobile augmented reality exhibit prototypes for the Exploratorium’s After Dark: Get Surreal event, which took place on Feb 3, 2011. The interactive installations were designed to stimulate play and social interaction as After Dark visitors used their mobile phones to explore the program theme: surrealism in art, science, and society.
Mobile AR overlays virtual images onto the real world using smartphones, including iPhones and phones running the Android operating system. I used an AR software platform called Junaio to publish an Exploratorium “channel” for our event that contained four “locative” elements/sculptures made from 3D computer graphics models, photographic images, and scripting. Each of the 3D objects, which people could see when they scanned the 2D marker image marking an exhibit location, had either an audio composition or a video attached to it, which streamed through the browser. The markers, called LLA markers (for latitude, longitude, and altitude), are a type of 2D barcode and contain encoded geocoordinates that tell the software on the smartphone the precise location of the visitor. This type of marker is useful for specifying a location inside a building, where GPS signals often aren’t detected.
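To illustrate the idea behind LLA markers, here is a small Python sketch that packs latitude, longitude, and altitude (plus a channel ID) into a compact payload that a 2D barcode could carry. The fixed-point layout and function names are invented for illustration, not Junaio’s actual encoding:

```python
import base64
import struct

def encode_lla(lat, lon, alt_m, channel_id):
    """Pack a location plus a channel ID into a short barcode-friendly
    string. Fixed-point degrees * 1e5 gives roughly 1 m precision."""
    payload = struct.pack(
        ">iiih",
        int(round(lat * 1e5)),    # latitude,  degrees * 1e5
        int(round(lon * 1e5)),    # longitude, degrees * 1e5
        int(round(alt_m * 10)),   # altitude in decimeters
        channel_id,
    )
    return base64.b32encode(payload).decode("ascii")

def decode_lla(text):
    """Inverse of encode_lla: recover (lat, lon, alt_m, channel_id)."""
    lat_i, lon_i, alt_i, channel_id = struct.unpack(
        ">iiih", base64.b32decode(text))
    return lat_i / 1e5, lon_i / 1e5, alt_i / 10, channel_id
```

The point is simply that a marker can carry the visitor’s precise indoor position in a handful of bytes, so the phone never needs a GPS fix.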
An exhibit called Magritte Me allowed visitors to stage their own version of surrealist René Magritte’s “The Son of Man” with a virtual floating bowler hat and green apple in front of a real cloud and sky background. When the virtual elements were touched on the phone, visitors heard a surreal streaming audio composition by Exploratorium collaborator Wayne Grim called “dflux theme”.
Another exhibit, titled Odalips, allowed visitors to stage their own version of surrealist Man Ray’s “Observatory Time – The Lovers,” in which the figure of a woman reclines on a couch, pointing up to a painting on the wall behind her that contains a mysterious set of disembodied lips. The exhibit featured an audio composition by Wayne Grim called “Perch.” Odalips was set up inside another Exploratorium exhibit, informally called the bridgelight room, which contains two bright sodium lights. When acclimated to the lights, visitors see everything in a yellow-grey monochromatic palette. We use the room to let visitors explore the effects of different wavelengths of light on color perception. The virtual objects in the AR exhibit were the lips (in red), floating above a fainting couch visitors could recline on.
In an exhibit called André le GéANT, visitors played at posing themselves with a giant blue ant that appeared to be crawling off of our mezzanine level and that spoke in French in the voice of André Breton (from a 1950 interview about surrealism). Ants also feature prominently in Salvador Dalí’s work.
In another exhibit, Redisintegration No 1, visitors found themselves surrounded by a large-scale virtual sculpture that paid homage to Salvador Dalí’s “The Disintegration of the Persistence of Memory” and played a surreal video called “Psychometry.”
Introducing the Exploratorium’s After Dark audience to augmented reality through the Meta Cookie exhibit and the mobile AR exhibits seemed to be a success. Both experiences attracted many people at the event and stimulated play and interest in AR technology. Look for more about what we learned from experimenting with AR on mobile devices at After Dark in a session at the Museums and the Web conference this spring.
Visitors to the Exploratorium’s monthly evening program series for adults, After Dark, experienced bizarre interactions with their sense of reality at After Dark: Get Surreal on Feb 3, 2011, an event infused with surrealist themes in art, music, and science.
Takuji Narumi, a Ph.D. candidate in the Graduate School of Engineering at the University of Tokyo, and Takashi Kajinami, a master’s student in the Graduate School of Information Science and Technology at the University of Tokyo, tantalized visitors with their interactive experience, “Meta Cookie.” The installation lets participants don a head-mounted display while holding a cookie that has a 2D marker image burned onto it. A webcam attached to the headgear detects the marker on the cookie while the AR magicians change settings that cause the image of the cookie the person sees through the display to change to a differently flavored cookie. Before your eyes, the cookie morphs visually from a butter cookie to a chocolate cookie to a strawberry cookie to a maple cookie to a lemon cookie. Did I mention that tubes attached to the headgear, aimed at the adventurous person’s nose, deliver scents of the different cookie flavors as the display changes? This remarkable exhibit really tugs at your sense of reality: you nibble on the cookie while it appears to be one flavor, and then again as the image and smells change, and the flavor you sense is entirely different!
Meta Cookie at the Exploratorium’s After Dark: Get Surreal event on Feb 3, 2011
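The Meta Cookie loop described above (when the marker is detected, the currently selected flavor drives both the overlay texture and the scent channel) can be modeled in a tiny sketch. The names here are hypothetical stand-ins, not the researchers’ code:

```python
FLAVORS = ["butter", "chocolate", "strawberry", "maple", "lemon"]

class MetaCookieSketch:
    """Toy model of the exhibit loop: if the webcam reports the cookie's
    marker, render the selected flavor's texture over the cookie and
    fire the matching scent tube; otherwise pass the video through."""

    def __init__(self):
        self.flavor = 0  # index into FLAVORS, changed by the operators

    def next_flavor(self):
        self.flavor = (self.flavor + 1) % len(FLAVORS)

    def frame(self, marker_visible):
        if not marker_visible:
            return {"overlay": None, "scent": None}
        name = FLAVORS[self.flavor]
        return {"overlay": f"{name}_cookie_texture", "scent": name}
```

The interesting design point is that vision and smell are driven from the same state, which is what makes the taste illusion so convincing.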
Since the beginning of the Exploratorium’s explorations of virtual worlds (circa 2006; I know, that seems like a long time ago!), we’ve combined live webcasting with exhibits to create social events. To bring the outside world in through programs like webcasts, you need a capable audio/video encoding tool that can communicate directly with a streaming media server. Virtual worlds like Second Life can then deliver that stream to the participant by accessing the stream location and rendering it in the viewer application. For video, we can make a surface act like a big projection screen and show the live stream.

Two programs we’ve found useful for encoding digital video from a camera or other video signal are QuickTime Broadcaster from Apple and Wirecast from Telestream. Both can connect to QuickTime Streaming Server (QTSS), the server application you need to stream video into SL. I haven’t yet been able to use a Helix Universal Server, another popular streaming media server, to get video into SL, but I keep thinking it’s possible. I also haven’t had an opportunity to experiment with other server apps like Wowza for this, but I hope to at some point. QuickTime Broadcaster is Mac-only but free from Apple, and it makes setting up an encoder system fast and easy. You may need to enter a username and password in the application’s Network configuration to authenticate with QTSS from QuickTime Broadcaster. Wirecast is a commercial application, though you can obtain a discounted educational license, which makes it fairly accessible cost-wise. It’s cross-platform, and I’ve used it on both Mac and Windows to encode and connect to QTSS. Wirecast has many other features, like video switching, that are very useful and worth checking out.
Both programs allow you to select a video input source on your computer (a webcam, or an external camera via a capture interface or FireWire), to name the stream (you’ll need the name to know the stream URL), and to simultaneously save an encoded file to your computer while the encoded video is being sent to the streaming server (useful for posting the video for on-demand viewing after the live event). On October 9, 2009, we’ll be doing a live webcast from a remote location (Lick Observatory on Mt. Hamilton), using Wirecast to encode real-time telescopic images and video of Exploratorium scientists Paul Doherty and Ron Hipschman, who will be hosting a program about NASA’s LCROSS mission. The program will be streamed on the Exploratorium’s website and into our amphitheater on Exploratorium Island in Second Life.
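With QTSS, the stream URL you hand to Second Life’s media settings typically looks like rtsp://yourserver/streamname.sdp. Before an event, it can help to confirm the server is answering at all. This Python sketch sends a standard RTSP OPTIONS request over a socket; the helper names are ours, and the server/stream in the test are placeholders:

```python
import socket
from urllib.parse import urlparse

def build_options_request(url):
    """Build a minimal RTSP 1.0 OPTIONS request for an rtsp:// URL."""
    return (f"OPTIONS {url} RTSP/1.0\r\n"
            "CSeq: 1\r\n"
            "User-Agent: stream-check\r\n"
            "\r\n").encode("ascii")

def stream_server_reachable(url, timeout=3.0):
    """Return True if the RTSP server replies 200 OK to OPTIONS."""
    parsed = urlparse(url)
    host = parsed.hostname
    port = parsed.port or 554  # 554 is the default RTSP port
    try:
        with socket.create_connection((host, port), timeout=timeout) as s:
            s.sendall(build_options_request(url))
            reply = s.recv(4096).decode("ascii", "replace")
        return reply.startswith("RTSP/1.0 200")
    except OSError:
        return False
```

This only checks that the server process is up and speaking RTSP; it doesn’t verify that the encoder is actually feeding the named stream.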
Creating objects and experiences that tell the multifaceted story of the number Pi is nothing less than serious fun. Now in its third year of being celebrated by the Exploratorium community in Second Life, and its twenty-first year of being commemorated worldwide, Pi Day is a unique opportunity to be amazed by the relevance of the infinite, never-repeating number yielded by dividing the circumference of a circle by its diameter. Exploratorium staff and SL community members have created unique exhibits that let avatars experience, learn about, and contemplate Pi. Exhibits are on display all month, with a special event on Pi Day, 3/14/2009, from 1:00 to 3:00 PM PDT on Exploratorium Island and at Sploland.
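As a playful aside, the ratio celebrated here can be approximated in a few lines of Python. This Leibniz-series sketch is just an illustration of the number’s charm, not one of the in-world exhibits:

```python
import math

def leibniz_pi(terms):
    """Approximate pi with the Leibniz series:
    pi = 4 * (1 - 1/3 + 1/5 - 1/7 + ...)."""
    return 4.0 * sum((-1) ** k / (2 * k + 1) for k in range(terms))

# The series converges slowly: after n terms the error is
# roughly 1/n, so even 100,000 terms only pin down a few digits.
approx = leibniz_pi(100_000)
```

That slow convergence is part of what makes Pi such good exhibit material: getting more digits genuinely costs effort.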