Since the beginning of the Exploratorium’s explorations with virtual worlds (circa 2006; I know, it seems like a long time ago!), we’ve combined live webcasting with exhibits to create social events. To bring the outside world in through programs like webcasts, you need a capable audio/video encoding tool that can communicate directly with a streaming media server. Virtual worlds like Second Life can then deliver that stream to the participant by accessing the stream location and rendering it in the viewer application. For video, we can make a surface act like a big projection screen and show the live stream.

Two programs we’ve found useful for encoding digital video from a camera or other video signal are QuickTime Broadcaster from Apple and Wirecast from Telestream. Both can connect to a QuickTime Streaming Server (QTSS), the server application you need to stream video into SL. I haven’t yet managed to get video into SL using Helix Universal Server, another popular streaming media server, but I keep thinking it’s possible. I also haven’t had a chance to experiment with other server apps like Wowza for this, but hope to at some point.

QuickTime Broadcaster is available only for the Mac, but it’s free from Apple and makes setting up an encoder system fast and easy. You may need to enter a username and password to authenticate with QTSS from QuickTime Broadcaster through the application’s Network configuration. Wirecast is a commercial application, though a discounted educational license makes it fairly accessible, cost-wise. It’s cross-platform, and I’ve used it on both Mac and Windows to encode and connect to QTSS. Wirecast has many other features, like video switching, that are very useful and worth checking out.
Both programs let you select a video input source on your computer (a webcam, or an external camera via a capture interface or FireWire), name the stream (you’ll need the name to know the stream URL), and simultaneously save an encoded file to your computer while sending the encoded video to the streaming server (useful for posting the video for on-demand viewing after the live event). On October 9, 2009, we’ll be doing a live webcast from a remote location (the Lick Observatory on Mt. Hamilton), using Wirecast to encode real-time telescopic images and video of Exploratorium scientists Paul Doherty and Ron Hipschman, who will be hosting a program about NASA’s LCROSS mission. The program will be streamed on the Exploratorium’s website and into our amphitheater on Exploratorium Island in Second Life.
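To make the “stream location” idea concrete: QTSS conventionally exposes a live broadcast at an RTSP URL built from the server’s hostname and the stream name (with an .sdp suffix), and that URL is what you point the Second Life parcel media settings at. The little helper below just assembles such a URL; the hostname and stream name are made-up placeholders, and your own QTSS setup may use a different port or path.

```python
def qtss_stream_url(server: str, stream_name: str, port: int = 554) -> str:
    """Build the RTSP URL for a live stream published to QTSS.

    QTSS typically serves live broadcasts at
    rtsp://<server>:<port>/<stream_name>.sdp; the .sdp suffix is
    appended here if the stream name doesn't already include it.
    """
    if not stream_name.endswith(".sdp"):
        stream_name += ".sdp"
    return f"rtsp://{server}:{port}/{stream_name}"

# Hypothetical server and stream name, for illustration only:
print(qtss_stream_url("media.example.org", "lcross_live"))
# rtsp://media.example.org:554/lcross_live.sdp
```

The stream name here is whatever you entered in QuickTime Broadcaster or Wirecast when you named the broadcast; 554 is the standard RTSP port.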
NASA has published a report from the workshop on virtual worlds and immersive environments held in 2008 at Ames Research Center. The report, created by workshop organizers and participants, summarizes the presentations and discussions at the workshop, which was attended by a diverse group of people from the research, commercial, education, and gaming sectors. Key themes include remote exploration, global participation paradigms, and the narrowing boundaries between physical and virtual experience. NASA continues to use virtual worlds for training simulations and outreach; check out the STS-125 Hubble servicing spacewalk simulation, and stay tuned for more interactive virtual worlds about future moon missions. I don’t have a current update on where NASA is with its RFP for an MMOG platform, but I’ll be keeping my eye out for the latest on that.
Educational media developers and researchers at the Cornell University SciCenter put on a workshop, with sponsorship from the National Science Foundation and the University of Pennsylvania, called the Taxonomy of Virtual Worlds for Education. The workshop brought together virtual worlds technology platform developers, educators, educational media developers, assessment and evaluation researchers, youth facilitators, and teens from Philadelphia area schools for two days to discuss our practices in making and working with virtual worlds. This was an intensive meeting focused on creating a basic taxonomy of virtual worlds and virtual worlds features that could be used by each of the meeting participants, as well as by media designers, businesses, attorneys, school districts, legislators, researchers, funding organizations, and others, to better understand virtual worlds and how they can be used by K-12 educators for STEM learning initiatives.
There were several technology platforms/projects represented there, including Second Life, Active Worlds (a strong supporter of educational virtual world developers and a company that has been developing and supporting its platform for over twelve years), Cobalt (open source P2P, object-oriented platform developed at Duke University, built on Open Croquet), Project Wonderland (Sun Microsystems open source platform with strong audio-conferencing elements), Blue Mars (new platform from Avatar Reality, based on a popular 3D game engine), Digital Spaces (open source platform from Digital Space with strong physics support, used by DS in their development of 3D simulations for NASA), and Medulla (an open source toolkit from the Federation of American Scientists for building learning object extensions in various virtual world platforms).
The beginnings of the taxonomy will be posted online and will form the basis for further development of best practices and organizational strategies for creating, delivering, and assessing immersive education initiatives in virtual worlds. Hopefully, the taxonomy that’s developed will also help the NSF develop some of its own guidelines for evaluating proposals for work in this area.
NASA has a long-standing interest in applying virtual worlds technologies to space exploration research and mission planning. Over the past ten years, NASA and its partners have created many wonderful simulations and 3D virtual worlds and made them publicly accessible via the Internet.
Recently, the NASA Learning Technologies Project office issued an RFI for the creation of a Massively Multiplayer Online Learning Game (MMOG). This has stimulated a lot of interest in the commercial and education worlds. It will be interesting to watch this MMOG exploration and development process and see what kind of game engine and content platform NASA gets behind. However it works out, a viable and scalable virtual world / game engine platform may emerge, one that museums and educators can benefit from.
In late January 2008, a group of scientists at NASA Ames Research Center, along with 3D simulation industry visionaries, organized a weekend workshop called Virtual Worlds and Immersive Environments. Paul Doherty and I participated, and I was invited to give a presentation about the Exploratorium’s recent work in Second Life. It was a really interesting workshop, with great presentations by NASA, innovative technology companies, game content developers, educators, and researchers. The major themes of the workshop and discussions were “We all get to go,” “Remote Exploration,” and “Become the data.” We saw several new 3D game engines, got a good update on the state of open source platforms and applications, and saw Will Wright demo the latest developments for Spore, his much-anticipated game about the evolution of species.
One of the take-away points of the event was for the virtual worlds community to develop greater awareness of and partnerships with NASA CoLAB, a NASA program which supports online and offline communities collaborating with NASA. NASA CoLAB has a great presence in Second Life.
Presentations from the workshop are being archived on the NASA Virtual Worlds Workshop wiki site that NASA Ames has developed.