Our upcoming “mixed reality” video screening event, Fabricated Realities, which will take place on January 24th at our museum in San Francisco and on Exploratorium Island in Second Life, poses some unique technical challenges. As with other public programs for which we’ve created a virtual counterpart, we’re building on lessons learned and on techniques and processes we’ve developed along the way. We’ve scaled back part of the initial plan for which video signals will be digitally encoded for streaming into SL, but we’ll keep those elements in mind for future cinema-arts-related programs.
In this event, we’ll combine two audiences, one real, one virtual, in hopes of creating an integrated experience where a filmmaker can interact with the people in front of him and with avatars projected alongside them. Both audiences will view the artist’s documentary on a screen in front of them at nearly the same time; encoding the video and streaming it into the virtual world introduces only a slight delay of a few seconds. Wayne Grim, one of my colleagues at the Exploratorium, created a theater configuration diagram and an audio/video/networking signal-path diagram that show how we’re setting up those signals in the McBean Theater at the Exploratorium.