Museum Virtual Worlds

Bringing Real and Virtual Together

June 8, 2012
by Rob Rothfarb
0 comments

LIVE Webcast of the Transit of Venus Streamed Into Second Life

The Exploratorium’s team of Second Life staff and volunteers put on another mixed-reality webcast viewing event in SL on June 5th, 2012, extending the Exploratorium’s live webcast of the 2012 Transit of Venus into the virtual world. Avatar-scientist Patio Plasma hosted the in-world event, which featured the 6.5-hour live webcast of telescopic images of the sun and the rare planetary transit, and volunteers in SL helped avatars see the stream. For this event, we tried a different method of putting the stream into SL: the HTML-on-a-prim feature of the SL viewer. This method proved workable, though not as easy for SL residents to use as the traditional approach of replacing a prim texture with a QuickTime stream. With each event, we learn more about the best ways to use the multimedia features of SL and how to engage avatars with live webcasts.

October 25, 2011
by Rob Rothfarb
0 comments

MIND THE FISH: Augmented reality exhibit reveals aquatic creatures’ inner thoughts

MIND THE FISH Augmented Reality exhibit at CineKid Festival, Amsterdam, Oct 2011. Image by Sander Veenhof

Ever wondered what your goldfish are thinking about? A playful exhibit called MIND THE FISH at the Cinekid festival in Amsterdam this month allows visitors to peer into a fishbowl via AR and read conversations the fish are having with themselves, with other fish, and with visitors. The amusing texts, like “I feel like everyone is looking at me … or am I crazy?”, appear as thought bubbles projecting from the fish as they swim around their aquatic habitat, seen through a movable screen with an augmented view of the bowl. The interactive work was made by Arthur van Beek, Sander Veenhof, Edith Kuyvenhoven, and Tijmen Woudenberg, collaborators from the Netherlands who each contributed different skills to bring the inner thoughts of the fish to light.

I asked AR artist Sander Veenhof some questions about creating the work and the reactions of visitors.

What are the challenges of designing an AR exhibit that incorporates a physical interface beyond a mobile device?

The subject matter we were trying to augment posed the foremost challenge. Since no off-the-shelf, open-source system for tracking and tracing goldfish exists, we created one ourselves. Fish are quite challenging to track because of their shininess and their constant moving and turning in front of the camera. To our distress, when the festival lighting was installed, it turned out that orange was a very popular light color; the whole surrounding turned orange, so our system detected numerous invisible fish all the time.
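
To make the tracking challenge concrete, here is a minimal sketch of the kind of color-based blob detection Sander describes, written in Python with OpenCV. This is not the team’s actual software: the HSV thresholds, blob-size cutoff, and camera index are illustrative guesses, and a hue filter like this one is exactly why orange festival lighting would produce phantom fish.

```python
# Minimal color-based fish-tracking sketch (OpenCV 4.x).
# All thresholds are illustrative, not the installation's real values.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)  # webcam pointed at the fishbowl

# Rough HSV band for "goldfish orange" -- ambient orange stage lighting
# falls into the same band, which is what caused the phantom detections.
LOWER_ORANGE = np.array([5, 120, 120])
UPPER_ORANGE = np.array([25, 255, 255])

while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_ORANGE, UPPER_ORANGE)
    # Remove specular glints and small noise before finding blobs.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        if cv2.contourArea(c) > 500:  # ignore tiny blobs
            x, y, w, h = cv2.boundingRect(c)
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("fish", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```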

The physical interface is actually not very physical. There’s no motion sensor or angle detection involved. Just by turning the device and its webcam, new fish enter the visible screen, and the software handles the placement of the text call-outs.

What are the fish saying?

The fish stories are written by an expert in the field of children’s stories, who even had experience writing stories in which animals play the main role. A couple of stories are about fish wondering what to do: playing computer games, seeing a movie, and of course playing hide and seek. Lucky for them, there’s one plant in the fishbowl. Once in a while, a virtual fish appears; its remarks indicate that it has no clue how things in the real world work, not even knowing what a dictionary is when urged to look into such a book. Furthermore, one of the goldfish acts as a #twittervis, relaying recent tweets that include both #twittervis and #cinekid.

What were the roles of the different collaborators on the project?

This was collaboration to the extreme. Every part was done by someone else, and all the components came together in the last few days before the opening. The system had an optimally modular setup. The hardware was designed by Arthur van Beek, based on an original concept he and I developed. He reserved space for a laptop on which Tijmen Woudenberg put his ‘orange tracking’ software, which kept track of the fish, numbering them uniquely. His software made requests to an online dialogue server I created, which analyzed which fish were available, whether a monologue or dialogue was going on, and whether the involved fish were still present. If not, a new one-liner or story was started. All monologues, dialogues, and even trialogues were written by Edith Kuyvenhoven, who could keep contributing new texts until half an hour before the opening, fine-tuning them based on the experience of seeing the installation working as a whole for the first time.
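
To illustrate that modular split, here is a toy sketch of the dialogue-server side: the tracker reports which fish IDs are currently visible, and the server decides whether a running story can continue or a new monologue or dialogue should start. Only the overall logic comes from Sander’s description; the names, sample texts, and data structures are invented for illustration.

```python
import random

# Sample texts standing in for Edith Kuyvenhoven's stories.
MONOLOGUES = ["I feel like everyone is looking at me ... or am I crazy?"]
DIALOGUES = [("Shall we play hide and seek?",
              "Fine, but the plant is MY hiding spot.")]

class DialogueServer:
    """Assigns stories to whichever fish the tracker currently sees."""

    def __init__(self):
        self.story = None  # (participant ids, remaining (fish, line) pairs)

    def update(self, visible):
        # Abandon a running story if a participant swam out of view.
        if self.story and not self.story[0] <= visible:
            self.story = None
        # Start a new one-liner or dialogue if nothing is running.
        if self.story is None and visible:
            if len(visible) >= 2 and random.random() < 0.5:
                a, b = random.sample(sorted(visible), 2)
                first, second = random.choice(DIALOGUES)
                self.story = ({a, b}, [(a, first), (b, second)])
            else:
                f = random.choice(sorted(visible))
                self.story = ({f}, [(f, random.choice(MONOLOGUES))])
        # Emit the next call-out for the client to draw as a thought bubble.
        if self.story:
            participants, lines = self.story
            fish, text = lines.pop(0)
            if not lines:
                self.story = None
            return fish, text
        return None

server = DialogueServer()
print(server.update({3, 7}))  # e.g. (7, "Shall we play hide and seek?")
```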

There must have been a great sense of wonderment and whimsy for people who engaged with the exhibit. Were any observed reactions surprising?

I found it fascinating to keep getting the question: “How do you know what they are saying? Is this real? Or not?” The fact that it is a question means it looked convincing to the children. Many wanted to install such a system at home, pointed at their own fishbowl.

Besides the AR aspect, the installation functioned in an almost analog way. Kids are of course checking how they look when viewed from within a fishbowl. Actually, it was something I was curious about too: how does it look when looking out at the world from inside? We have a lot of plans for variations and new versions of the installation. One of them is to check the effect of installing a fish-eye webcam. I’m very curious what the world will look like from inside the bowl then.

Video: MIND THE FISH at CineKid Festival, Amsterdam, Oct 2011

May 27, 2011
by Rob Rothfarb
0 comments

Artist Lynette Wallworth Using AR to Show Coral Adaptation to Climate Change

Yesterday I went to a presentation by Australian artist Lynette Wallworth, an artist-in-residence at the Exploratorium, who showed us her recent works in which she uses a rare astronomical event, the Transit of Venus, as a metaphor for raising awareness of rising sea temperatures in biodiverse ocean environments, an indicator of global climate change.

A transit of Venus occurs when the planet Venus passes directly between the Earth and the Sun. These alignments occur in pairs that are eight years apart, and the time between transit pairs is over a hundred years. In our lifetime, the first transit of the current pair occurred in 2004, and the second will occur in June of 2012.
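
The rhythm is easy to verify with a little arithmetic: counting from the June 1761 transit, the gaps between successive transits repeat as 8, 105.5, 8, and 121.5 years, a 243-year cycle (the half-years reflect the alternation between June and December events).

```python
# Transit-of-Venus years from the repeating gap pattern.
# 8 + 105.5 + 8 + 121.5 = 243 years per full cycle.
year = 1761.4  # the June 1761 transit
for gap in [0, 8, 105.5, 8, 121.5, 8, 105.5, 8]:
    year += gap
    print(int(year))  # 1761, 1769, 1874, 1882, 2004, 2012, 2117, 2125
```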

In Rekindling Venus, a multimedia work commissioned by the Adelaide Film Festival, Lynette examines the history of scientists and explorers around the globe exchanging observations of the transits of Venus in 1761 and 1769. She connects that global exchange of information and ideas about the transit to the current exchange among scientists and the general public about the climate change issues facing us now. The first part of the work, In Plain Sight, is an augmented reality piece that uses the mobile AR platform Junaio to let people peer into a 3D virtual world populated with endangered corals. She’s collaborating with scientists who study corals in ocean areas where bleaching occurs due to elevated sea temperatures, and who examine coral species that fluoresce under different wavelengths of light. She discussed how scientists are observing that some coral species not previously known to fluoresce are now exhibiting this phenomenon, possibly as an adaptation to changing climate conditions. The Junaio AR channel for the work recognizes posters of endangered coral species and uses this image-based 2D marker approach to show 3D models of the corals in their fluorescent state. An element of note is the hotspots on the 3D models, which, when activated by touch, display a window with recent elevated sea temperature alerts from NOAA, gathered from locations across the planet. This points to great potential for linking mobile AR experiences to live data sources, and it helps connect the viewer to something happening right now.
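
As a sketch of that last idea, here is roughly what wiring a hotspot to a live feed looks like: a callback that fetches current alerts when the hotspot is touched. The feed URL and JSON field names below are hypothetical placeholders, not NOAA’s actual API, and the AR-side plumbing is assumed.

```python
# Hypothetical sketch of an AR hotspot backed by a live alert feed.
import json
import urllib.request

ALERT_FEED = "https://example.org/sea-temperature-alerts.json"  # placeholder

def on_hotspot_touched(coral_id):
    """Callback the AR layer would invoke when a 3D hotspot is tapped."""
    with urllib.request.urlopen(ALERT_FEED, timeout=5) as resp:
        alerts = json.load(resp)
    # Field names ("site", "sst_c", "level", "coral") are invented here.
    lines = [f"{a['site']}: {a['sst_c']} C ({a['level']})"
             for a in alerts if a.get("coral") == coral_id]
    return "\n".join(lines) or "No current alerts for this site."
```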

The Rekindling Venus website counts down to the 2012 transit of Venus. Along with the installation and the mobile exploration of the AR elements, it offers viewers a call to action, engaging them in awareness, investigation, and discussion of critical climate change issues as we witness the upcoming astronomical alignment. It frames a global context for shared wonder in natural phenomena, and for how that wonder can inspire us to confront the climate change issues we face.

April 14, 2011
by Rob Rothfarb
0 comments

Museums and the Emerging AR Web

Over 700 museum professionals convening last week in Philadelphia for the Museums and the Web 2011 conference saw applications and content expanding from the desktops of museum visitors onto networked exhibits inside museums and onto a growing collection of mobile devices. The annual gathering, which brings together museum exhibit developers, producers, user-experience designers, artists, curators, software developers, digital media directors, and vendors of content creation tools and services, showcases the latest digital media work in the international museums community. It provides a great opportunity for people to share work, ideas, and strategies for using digital media in its many evolving forms to engage visitors, both in the museum through networked exhibits and online through websites and mobile applications.

As part of a conference session on augmented reality, I presented a paper that discusses the initial investigations of AR that I’ve been doing here at the Exploratorium. This includes a science inquiry activity about weather in the San Francisco Bay, which will be part of our Science in the City video program series, and the playful Step Into a Virtual View art installations at February’s After Dark: Get Surreal event. The paper presents details about how these exhibit prototypes were developed, considerations for preparing 3D content for use in mobile AR applications, interface design challenges, and some initial observations of how museum visitors interact with mobile AR exhibits. Read the full paper, “Mixing Realities to Connect People, Places, and Exhibits Using Mobile Augmented-Reality Applications,” here.

Planning an Exploratorium AR exhibit prototype: Golden Gate Bridge Fog Altimeter

Also presenting about their work with AR in this session were two museums from the Netherlands. Margriet Schavemaker, from the Stedelijk Museum in Amsterdam, discussed that museum’s work with mobile AR using the Layar platform. Their ARTours project, currently one year into a two-year effort, investigates how their museum visitors interact with AR art and architecture points of interest and exhibits in Amsterdam. Read the paper, “Augmented Reality and the Museum Experience,” co-written with her colleague Hein Wils from the Stedelijk and with Paul Stork and Ebelien Pondaag from the Amsterdam-based design studio Fabrique.

Cutting-edge work was also shown by Ingeborg Veldman and Tanja van der Woude from Science LinX, the science center of the Faculty of Mathematics and Natural Sciences of the University of Groningen in the Netherlands. Science LinX is focused on engaging teenagers in STEM disciplines, and the group experiments with exhibit development methods and tools that can be used to communicate hard-to-explain phenomena. I was captivated by Ingeborg’s presentation about their project, MIGHT-y, an exhibit and game that uses 2D markers on the faces of cubes to let visitors explore concepts about scale from Charles and Ray Eames’s film Powers of Ten. It’s a mixed-reality exhibit in which visitors don AR glasses to see “into” the cubes and manipulate 3D animated objects. The use of AR glasses seems new for museum exhibits and is still at an early stage of application. It’s also expensive. The glasses used in MIGHT-y, Wrap920 video eyewear from Vuzix, are a consumer product geared toward gamers that can be adapted for use in exhibits to provide an immersive virtual reality experience. I tried a portable version of MIGHT-y with the eyewear, and although the head-tracking lag was a bit jarring, it does provide a compelling augmented display. It’s encouraging to see experimentation with different forms of AR in museums. Read their paper, “Science LinX: the neXt level in augmenting science center Xperiences,” which was co-written with Bart van de Laar, also of the University of Groningen.

MW2011 was a great conference and it was a special honor for the Exploratorium’s website to be recognized with the Best Long-Lived Site Award by the Museums and the Web community!

February 15, 2011
by Rob Rothfarb
1 Comment

Recomposing Man Ray and Magritte (while being followed around by a giant blue ant): mobile AR at the Exploratorium’s After Dark

Over the past month, I’ve been developing some mobile augmented reality exhibit prototypes for the Exploratorium’s After Dark: Get Surreal event, which took place on Feb 3, 2011. The interactive installations were designed to stimulate play and social interaction, mixing realities as After Dark visitors used their mobile phones to explore the program theme: surrealism in art, science, and society.

Mobile AR overlays virtual images on the real world using smartphones, including iPhones and devices running the Android operating system. I used an AR software platform called Junaio to publish an Exploratorium “channel” for our event that contained four “locative” elements/sculptures made using 3D computer graphics models, photographic images, and scripting. Each of the 3D objects that people could see when they scanned a 2D marker image at an exhibit location had either an audio composition or a video attached to it, which streamed through the browser. The markers, called LLA markers (for latitude, longitude, and altitude), are a type of 2D barcode and contain encoded geocoordinates that tell the software on the smartphone the precise location of the visitor. This type of marker is useful for specifying a location inside a building, where GPS signals are often not detected.
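
To illustrate what those decoded geocoordinates buy you indoors, here is a sketch of a calculation an app can make once an LLA marker has pinned down the visitor’s position: a great-circle (haversine) distance to each point of interest. The marker decoding itself was handled by Junaio; the coordinates below are illustrative, not the exhibits’ real positions.

```python
# Finding the nearest exhibit from an LLA-marker-derived position.
from math import asin, cos, radians, sin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = (sin(dlat / 2) ** 2
         + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2)
    return 2 * 6371000 * asin(sqrt(a))

# Illustrative coordinates, as if decoded from a scanned LLA marker.
visitor = (37.8030, -122.4486)
exhibits = {
    "Magritte Me": (37.80311, -122.44853),
    "Odalips": (37.80295, -122.44871),
    "Andre le GeANT": (37.80322, -122.44880),
}
nearest = min(exhibits, key=lambda n: haversine_m(*visitor, *exhibits[n]))
print(nearest)  # the exhibit closest to the scanned marker
```
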
Magritte Me augmented reality exhibit at the Exploratorium

An exhibit called Magritte Me allowed visitors to stage their own version of surrealist René Magritte’s “The Son of Man,” with a virtual floating bowler hat and green apple in front of a real cloud-and-sky background. When the virtual elements were touched on the phone, visitors heard a surreal streaming audio composition by Exploratorium collaborator Wayne Grim called “dflux theme.”

Another exhibit, titled Odalips, allowed visitors to stage their own version of surrealist Man Ray’s “Observatory Time – The Lovers,” in which the figure of a woman reclines on a couch, pointing up to a painting on the wall behind her that has a mysterious set of disembodied lips. The exhibit featured an audio composition by Wayne Grim called “Perch.” Odalips was set up inside another Exploratorium exhibit, informally called the bridgelight room, which contains two bright sodium lights. When acclimated to the lights, visitors see everything in a yellow-grey monochromatic palette; we use the room to let visitors explore the effects of different wavelengths of light on color perception. The virtual objects in the AR exhibit were the lips (in red), floating above a fainting couch visitors could recline on.

In an exhibit called André le GéANT, visitors played with posing themselves alongside a giant blue ant that appeared to be crawling off of our Mezzanine level and that spoke in French in the voice of André Breton (from a 1950 interview about surrealism). Ants also feature prominently in Salvador Dalí’s work.

In another exhibit, Redisintegration No. 1, visitors found themselves surrounded by a large-scale virtual sculpture that paid homage to Salvador Dalí’s “The Disintegration of the Persistence of Memory” and played a surreal video called “Psychometry.”

Introducing the Exploratorium’s After Dark audience to augmented reality through the Meta Cookie exhibit and the mobile AR exhibits seemed to be successful. Both experiences attracted many people at the event and stimulated play and interest in the AR technology. Look for more information about what we learned from experimenting with AR on mobile devices at After Dark at a session at the Museums and the Web conference this spring.

February 10, 2011
by Rob Rothfarb
0 comments

Meta Cookie: Olfactory and Gustatory Augmented Reality

Visitors to the Exploratorium’s monthly evening program series for adults, After Dark, experienced bizarre interactions with their sense of reality at After Dark: Get Surreal on Feb 3, 2011, an event infused with surrealist themes in art, music, and science.

Takuji Narumi, a Ph.D. candidate in the Graduate School of Engineering at the University of Tokyo, and Takashi Kajinami, a master’s student in the Graduate School of Information Science and Technology at the University of Tokyo, tantalized visitors with their interactive experience, “Meta Cookie.” The installation allows participants to don a head-mounted display while holding a cookie that has a 2D marker image burned onto it. A webcam attached to the headgear detects the marker on the cookie while the AR magicians change settings that cause the image of the cookie the person sees through the display to change to a different-flavored cookie. Before your eyes, the cookie morphs visually from a butter cookie to a chocolate cookie to a strawberry cookie to a maple cookie to a lemon cookie. Did I mention that tubes attached to the headgear, aimed at the adventurous person’s nose, deliver scents of the different cookie flavors as the display changes? This remarkable exhibit really pulls at your sense of reality: you nibble on the cookie when it appears to be one flavor, and then again as the image and smells change, sensing the flavor as entirely different!
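
As a rough sketch of the core trick: one flavor selection drives two outputs, swapping the texture drawn over the tracked marker and opening the matching scent tube. All of the device interfaces below are invented stand-ins for illustration, not the researchers’ actual code.

```python
# Toy pairing of visual overlay and scent channel for Meta Cookie-style AR.
FLAVORS = {
    # flavor: (overlay texture, scent-tube channel) -- all values invented
    "butter": ("butter.png", 0),
    "chocolate": ("chocolate.png", 1),
    "strawberry": ("strawberry.png", 2),
    "maple": ("maple.png", 3),
    "lemon": ("lemon.png", 4),
}

class Renderer:
    def set_marker_overlay(self, texture):
        print(f"drawing {texture} over the tracked cookie marker")

class ScentRig:
    def open_channel(self, channel):
        print(f"releasing aroma from tube {channel}")

def set_flavor(flavor, renderer, scent_rig):
    """Swap the cookie's appearance and smell together."""
    texture, channel = FLAVORS[flavor]
    renderer.set_marker_overlay(texture)
    scent_rig.open_channel(channel)

set_flavor("maple", Renderer(), ScentRig())
```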

Video: Meta Cookie at the Exploratorium’s After Dark: Get Surreal event on Feb 3, 2011