Since the beginning of the Exploratorium’s explorations with virtual worlds (circa 2006–I know, seems like a long time ago!), we’ve combined live webcasting with exhibits to create social events. To bring the outside world in through programs like webcasts, you need a capable audio/video encoding tool that can communicate directly with a streaming media server. Virtual worlds like Second Life can then deliver that stream to the participant by accessing the stream location and rendering it in the viewer application. For video, we can make a surface act like a big projection screen and show the live stream.

Two programs we’ve found useful for encoding digital video from a camera or other video signal are QuickTime Broadcaster from Apple and Wirecast from Telestream. Both are capable of connecting to a QuickTime Streaming Server (QTSS), the server application you need in order to stream video into SL. I haven’t been able to successfully use a Helix Universal Server, another popular streaming media server, to get video into SL yet, but I keep thinking it’s possible. I also haven’t had an opportunity to experiment with other server apps like Wowza for this, but hope to at some point.

QuickTime Broadcaster is only available for the Mac, but it’s free from Apple and makes setting up an encoder system fast and easy. You may need to enter a username and password in the application’s Network configuration to authenticate with QTSS. Wirecast is a commercial application, though you can obtain a discounted educational license, which makes it fairly accessible cost-wise. It’s cross-platform, and I’ve used it on both Mac and Windows to encode and connect to QTSS. Wirecast has many other features, like video switching, that are very useful and worth checking out.
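To sketch how the pieces connect: the stream name you set in the encoder becomes part of an RTSP URL served by the QTSS machine, and that URL is what you point the parcel’s media settings at in SL. Here’s a minimal helper illustrating the URL pattern (the host and stream names below are hypothetical examples, not our actual servers):

```python
def qtss_stream_url(host, stream_name, port=554):
    """Build the RTSP URL for a live QTSS broadcast.

    QTSS exposes live broadcasts as .sdp mount points, so the stream
    name chosen in QuickTime Broadcaster or Wirecast (e.g. "eclipse")
    is reached at "rtsp://host/eclipse.sdp".
    """
    if not stream_name.endswith(".sdp"):
        stream_name += ".sdp"
    # 554 is the standard RTSP port; leave it out of the URL when default
    authority = host if port == 554 else f"{host}:{port}"
    return f"rtsp://{authority}/{stream_name}"

# Hypothetical server and stream name, for illustration:
print(qtss_stream_url("qtss.example.org", "eclipse"))
# rtsp://qtss.example.org/eclipse.sdp
```

The resulting URL is the “stream location” mentioned above: paste it into the parcel media settings and the viewer renders the live video on the designated surface.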
Both programs allow you to select a video input source on your computer (a webcam, or an external camera via a capture interface or FireWire), to name the stream (you’ll need the name to know the stream URL), and to simultaneously save an encoded file to your computer while sending the encoded video to the streaming server (useful for posting the video for on-demand viewing after the live event).

On October 9, 2009, we’ll be doing a live webcast from a remote location (the Lick Observatory on Mt. Hamilton) and using Wirecast to encode real-time telescopic images and video of Exploratorium scientists Paul Doherty and Ron Hipschman, who will be hosting a program about NASA’s LCROSS mission. The program will be streamed on the Exploratorium’s website and into our amphitheater on Exploratorium Island in Second Life.
Our upcoming “mixed reality” video screening event, Fabricated Realities, which will occur on January 24th at our museum in San Francisco and on Exploratorium Island in Second Life, poses some unique technical challenges. Like other public programs for which we’ve created a virtual counterpart, we’re taking advantage of things we’ve learned before and techniques and processes we’ve developed. We’ve scaled back part of the initial plan for which video signals will be digitally encoded for streaming into SL, but will keep those elements in mind for future cinema arts programs.
In this event, we’ll combine two audiences, one real, one virtual, to hopefully create an integrated experience where a filmmaker can interact with the people in front of him and with avatars projected alongside. Both audiences will view the artist’s documentary on a screen in front of them at nearly the same time; only a slight delay of a few seconds is introduced when we encode the video and stream it into the virtual world. Wayne Grim, one of my colleagues at the Exploratorium, created a theater configuration diagram and an audio/video/networking signal-path diagram that shows how we’re setting up those signals in the McBean Theater at the Exploratorium.
On August 1, 2008, the Exploratorium will webcast a total eclipse of the sun as seen from the remote Xinjiang Uygur Autonomous Region in northwestern China, near the Mongolian border. Our scientists and media development crew will capture dramatic telescopic images of the eclipse, which will be webcast via the Exploratorium’s website and in Second Life. The program will be hosted by Exploratorium scientists Dr. Robert Semper and Dr. Paul Doherty and will feature NASA heliospheric physicist Dr. Eric Christian, who will show some of the latest imagery of the sun from NASA’s SOHO and STEREO missions and explain how the solar wind can impact us here on Earth.
On Exploratorium Island in Second Life, we’ll host an eclipse viewing event featuring the live webcast, interactive exhibits, and music. You can view the eclipse webcast in the amphitheater on Exploratorium Island as well as on other sims, including Sploland, Spindrift, Nanotechnology, UK Future Focus, Science School, and SciLands.
Putting on this event in SL presents some different challenges than the first time we brought a total solar eclipse webcast there, in March 2006. It’s a great opportunity to continue learning about putting on museum events in a virtual world. I’ll be sharing more details of the event, as well as details of those challenges, in the coming days.