iGrid Part III: High-Def, Interactive, and Multicast Applications
9.03.2005 -- This issue of the weekly iGrid preview series features the event's high-definition, interactive, and multicast applications.
High-Definition Applications
A group from CESNET (the Czech Republic's academic network provider) and Louisiana State University will present a virtual, interactive environment for high-quality lecturing and discussion, linking Brno, Czech Republic; LSU; and San Diego, and creating the illusion of a single meeting place for all participants. A lecture will be given at one of the sites, followed by a live discussion and interaction among the audiences at all three sites. Minimizing delay, supporting real-time speech interruptions, and creating a “nearly immersive” look-and-feel requires uncompressed high-definition video; one raw stream takes about 1.5 Gbps, so the three-site configuration needs around 5 Gbps, and a multi-point configuration with more sites would likely overload a single 10-Gbps link. Low-latency communication based on raw high-definition video has previously been demonstrated between two sites, but this project will demonstrate the strengths of multi-point communication (using simulated multicast). (See http://sitola.fi.muni.cz/sitola/igrid; demo CZ101.)
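As a rough illustration of the arithmetic behind these figures, the sketch below estimates the serial rate of one uncompressed HD stream. The framing parameters (2200 x 1125 total samples per frame including blanking, 10-bit 4:2:2 sampling, roughly 30 frames per second) are assumptions based on typical HD-SDI-style signaling, not figures published by the demo team.

```python
# Back-of-the-envelope estimate of why one raw HD stream "takes about
# 1.5 Gbps" and why three sites need several Gbps. The framing parameters
# below are assumptions (typical HD-SDI-style 1080i signaling), not values
# confirmed by the CESNET/LSU team.

def hd_stream_gbps(samples_per_line=2200, lines=1125,
                   bits_per_sample=20, frames_per_sec=30):
    """Serial bit rate of one uncompressed HD stream, in Gbps."""
    bits_per_frame = samples_per_line * lines * bits_per_sample
    return bits_per_frame * frames_per_sec / 1e9

one_stream = hd_stream_gbps()        # ~1.49 Gbps per raw stream
three_sites = 3 * one_stream         # ~4.5 Gbps when all three sites send
print(f"one stream: {one_stream:.2f} Gbps, three sites: {three_sites:.2f} Gbps")
```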
A group led by the Korea Institute of Science and Technology Information (KISTI) will demonstrate an interactive, 3-D, high-definition video application in support of e-science. KISTI will perform the demonstration in partnership with Kyungpook National University (KNU), Gwangju Institute of Science and Technology (GIST), the Korea Advanced Institute of Science and Technology (KAIST), and the Advanced Network Forum (ANF) in Korea. This joint effort will run over UCLP. (UCLP stands for User Controlled LightPath software, co-funded by Cisco Canada and CANARIE under CANARIE's Directed Research program. The software is freely available and is being used in many of the iGrid demos.) The demonstration will show the results of data analysis from a high-energy physics collaboration. (See www.canarie.ca/canet4/uclp/igrid2005/demo.html; demo KO101.)
A project led by the i2CAT Foundation, Spain, will demonstrate transcoding of five minutes of raw video data in SDI format to MPEG-2. The project will send SDI data to a grid distribution center, where the film will be segmented into fragments; the fragments will then be sent to remote computers in Ottawa and Barcelona over lightpaths provisioned by UCLP, transcoded into MPEG-2 format, and sent back to Barcelona for reassembly before being delivered to San Diego for display as MPEG-2 video. All the international connections/lightpaths will be set up by the grid application using UCLP. (See www.i2cat.net/i2cat/servlet/I2CAT.MainServlet?seccio=2 and www.canarie.ca/canet4/uclp/igrid2005/demo.html; demo SP101.)
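The workflow above is essentially a segment-distribute-transcode-reassemble pipeline. The sketch below illustrates that shape; the fragment size, worker names, and stand-in transcode step are hypothetical and are not the i2CAT or UCLP software interfaces.

```python
# Hypothetical sketch of the segment -> distribute -> transcode -> reassemble
# workflow. Fragment size, worker names, and the "transcode" stand-in are
# illustrative assumptions, not the actual i2CAT/UCLP implementation.
from concurrent.futures import ThreadPoolExecutor

FRAGMENT_FRAMES = 750                             # assumed ~30 s at 25 fps
WORKERS = ["ottawa-node", "barcelona-node"]       # hypothetical remote hosts

def split_into_fragments(raw_frames, size=FRAGMENT_FRAMES):
    """Cut the raw frame sequence into fixed-length fragments."""
    return [raw_frames[i:i + size] for i in range(0, len(raw_frames), size)]

def transcode(fragment, worker):
    """Stand-in for shipping a fragment to a remote node and encoding it to MPEG-2."""
    return {"worker": worker, "encoded_frames": len(fragment)}

def run_pipeline(raw_frames):
    fragments = split_into_fragments(raw_frames)
    with ThreadPoolExecutor(max_workers=len(WORKERS)) as pool:
        jobs = [pool.submit(transcode, frag, WORKERS[i % len(WORKERS)])
                for i, frag in enumerate(fragments)]
        # collecting results in submission order models reassembly in Barcelona
        return [job.result() for job in jobs]

if __name__ == "__main__":
    five_minutes = list(range(5 * 60 * 25))       # 5 minutes of 25-fps frame indices
    print(run_pipeline(five_minutes)[:3])
```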
Videoconferencing has been a poor substitute for meeting in person because low-quality video coupled with slow response times makes it inferior to real-life communication. The ResearchChannel demonstration brings the feeling of real-life, in-person interaction to videoconferencing, a major step toward raising videoconferencing to the same level as conversing in person with colleagues. In this demonstration, participants from around the world will communicate in real time by way of high-resolution, high-definition video with very low latency. In some cases, communication will be two-way, with completely uncompressed HDTV.
This is the first multicast of 1.5-Gbps high-definition video and the first demonstration of a multipoint videoconference using uncompressed HD. Each remote site will transmit uncompressed HD video and audio over IP to receivers at iGrid. The received streams will then be tiled into a single HD stream that will be multicast to all remote sites. In addition, an image of the current speaker will be multicast to the other sites as a single uncompressed HD stream. iGrid participants will see a full-resolution image of the current speaker in real time as well as a tiled display of all videoconference participants; the same will be seen at all other participating sites. The University of Washington, in conjunction with the Pacific Northwest Gigapop, ResearchChannel, and AARNet, led the development of this system. Other partners include Calit2, UCSD, SURFnet, University of Wisconsin-Madison, WIDE, and APAC. (See www.apac.edu.au/apac05, www.pnw-gigapop.net/news/jgn2_2005.html, www.researchchannel.org/inside/news/press/apan04.asp; demo US118.)
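The delivery mechanism underneath this demo is IP multicast: a sender transmits a stream once, and the network replicates it to every subscribed site. The sketch below shows only the bare mechanics of a multicast UDP sender; the group address, port, TTL, and packet size are assumptions for illustration and bear no relation to the actual ResearchChannel/UW software, which adds RTP-style framing, pacing, and much more. At roughly 1.5 Gbps, payloads of this size correspond to well over 100,000 packets per second, which is why the end hosts and networks must be carefully tuned.

```python
# Bare-bones multicast UDP sender, illustrating the delivery mechanism the
# demo relies on. Group address, port, TTL, and packet size are hypothetical;
# this is not the ResearchChannel/UW uncompressed-HD-over-IP software.
import socket

GROUP = "239.1.2.3"          # assumed administratively scoped multicast group
PORT = 5004
PACKET_BYTES = 1316          # assumed payload that fits a 1500-byte Ethernet MTU

def multicast_sender(ttl=16):
    """Create a UDP socket configured to send to a multicast group."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, ttl)
    return sock

def send_frame(sock, frame_bytes):
    """Chop one video frame into UDP packets and send them to the group."""
    for offset in range(0, len(frame_bytes), PACKET_BYTES):
        sock.sendto(frame_bytes[offset:offset + PACKET_BYTES], (GROUP, PORT))

if __name__ == "__main__":
    sock = multicast_sender()
    send_frame(sock, bytes(PACKET_BYTES * 4))    # one tiny dummy "frame"
```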
The University of Washington will demonstrate real-time, live, compressed, high-definition video from a seafloor vent research site in the Pacific Ocean. Challenges involved in deep-sea, science-related networking include ship-to-shore satellite connections and communications, IP network redistribution from the satellite downlink stations, on-site optimization of applications, and the inevitable “surprises.” (See www.researchchannel.org/projects and www.neptune.washington.edu/index.html; demo US119.)
The NASA Goddard Space Flight Center and Physical Optics Corporation will demonstrate a 35" x 35" holographic 3-D HDTV video display system that does not require goggles or other special headgear. Live stereoscopic high-definition video from the NASA Goddard Space Flight Center in Greenbelt, Maryland, will be transmitted over the 10-Gbps National LambdaRail for viewing by iGrid attendees in the Calit2 building. (See www.poc.com/emerging_products/3d_display/default.asp; demo US130.)
Interactive Applications
A project led by the Poznań Supercomputing and Networking Center will demonstrate how to provide interactive TV services to a large number of users over broadband IP networks, ensuring a high-quality end-user experience through easy, scalable access to rich live and archived digital content. The prototype system connects centers in Poznań, Cracow, and Warsaw (planned for next year) over PIONIER, the Polish National Research and Education Optical Network, and serves up to 15,000 users. While the system is designed to serve Poland, it could easily be extended to other countries. The iGrid demonstration will present multimedia content distribution from Poznań to the Calit2 building at UCSD, triggered by many simultaneous user requests. (See http://itvp.psnc.pl/en/; demo PL101.)
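One way such a service can stay responsive under load is to answer each viewer from a regional cache and fall back to another cache only when needed. The sketch below illustrates that general idea; the cache names, capacities, and selection rule are hypothetical illustrations, not the actual PSNC/iTVP design.

```python
# Hypothetical sketch of routing viewer requests to regional caches.
# Cache names, capacities, and the selection rule are illustrative only,
# not the actual PSNC/iTVP architecture.
REGIONAL_CACHES = {
    "poznan": {"load": 0, "capacity": 5000},
    "cracow": {"load": 0, "capacity": 5000},
    "warsaw": {"load": 0, "capacity": 5000},
}

def pick_cache(preferred_region):
    """Prefer the viewer's own regional cache; otherwise use the least-loaded one."""
    cache = REGIONAL_CACHES.get(preferred_region)
    if cache and cache["load"] < cache["capacity"]:
        return preferred_region
    name, _ = min(REGIONAL_CACHES.items(), key=lambda kv: kv[1]["load"])
    return name

def serve_request(viewer_region, content_id):
    region = pick_cache(viewer_region)
    REGIONAL_CACHES[region]["load"] += 1
    return f"stream {content_id} to viewer from the {region} cache"

if __name__ == "__main__":
    print(serve_request("poznan", "live-channel-1"))
```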
See also demo SP101, described above and in the previous article on remote-controlled applications.
Multicast Applications
See demo CZ101 (above).
Many emerging applications require reliable, high-quality, point-to-point transport of extremely large data files among multiple nodes. A demonstration by the International Center for Advanced Internet Research (iCAIR) at Northwestern University will show how this goal can be achieved by exploiting the capabilities of dynamic lightpath switching based on advanced optical technologies. More specifically, this group will demonstrate state-of-the-art techniques for large-scale, wavelength-based transport among multiple nodes, using new methods for transparent mapping, lightpath control, resource allocation, error control, traffic-stream quality, and performance monitoring. (See www.icair.org/igrid2005; demo US110.)
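One common pattern for moving a very large file quickly is to stripe it across several dedicated channels in parallel and checksum every fragment so errors can be detected and the affected fragment re-sent. The sketch below illustrates that general pattern only; the fragment size, channel count, and helper names are assumptions and do not describe the iCAIR demo software.

```python
# Illustrative striping of one large file across parallel channels, with a
# per-fragment checksum for error control. Fragment size, channel count, and
# helper names are hypothetical; this is not the iCAIR demo software.
import hashlib
from concurrent.futures import ThreadPoolExecutor

FRAGMENT_BYTES = 64 * 1024 * 1024      # assumed 64 MB fragments
CHANNELS = 4                           # assumed number of parallel lightpaths

def fragments(path):
    """Yield (index, bytes) fragments of the file at `path`."""
    with open(path, "rb") as f:
        index = 0
        while chunk := f.read(FRAGMENT_BYTES):
            yield index, chunk
            index += 1

def send_fragment(item):
    """Stand-in for shipping one fragment over its assigned channel."""
    index, chunk = item
    digest = hashlib.sha256(chunk).hexdigest()   # receiver re-verifies this
    return index, len(chunk), digest

def transfer(path):
    with ThreadPoolExecutor(max_workers=CHANNELS) as pool:
        # the receiver reassembles fragments in index order
        return list(pool.map(send_fragment, fragments(path)))
```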
The Electronic Visualization Laboratory (EVL) at the University of Illinois at Chicago will lead a series of demos streaming visualizations from Chicago and Urbana-Champaign, Illinois; Burnaby, British Columbia; Daejeon, South Korea; and Amsterdam to a 100-million-pixel “LambdaVision” display using the Scalable Adaptive Graphics Environment (SAGE) and custom software developed by KISTI in Korea and SARA in the Netherlands. SAGE, which is being used in several iGrid demos, is a graphics streaming architecture that supports collaborative scientific visualization environments with potentially hundreds of megapixels of contiguous display resolution. In collaborative scientific visualization, it is crucial to share high-resolution imagery as well as high-definition video among groups of collaborators at various sites. The network-centered architecture of SAGE allows collaborators to simultaneously run various applications (such as 3-D rendering, remote desktop, video streams, and 2-D maps) on local or remote clusters and share them by streaming the pixels of each application over ultra-high-speed networks to large tiled displays.
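The core idea of pixel streaming to a tiled wall is straightforward to sketch: the wall is a grid of tiles, each driven by its own node, and whatever part of an application's window overlaps a tile is streamed to that tile's node. The layout and per-tile resolution below are assumptions for illustration; this is not the actual SAGE code.

```python
# Hypothetical sketch of mapping an application window onto a tiled display.
# Tile counts and per-tile resolution are assumptions; not actual SAGE code.
TILE_COLS, TILE_ROWS = 11, 5           # assumed wall layout (55 tiles)
TILE_W, TILE_H = 1600, 1200            # assumed per-tile resolution

def owning_tiles(x, y, w, h):
    """Return the (col, row) tiles that an application window overlaps."""
    first_col, last_col = x // TILE_W, (x + w - 1) // TILE_W
    first_row, last_row = y // TILE_H, (y + h - 1) // TILE_H
    return [(c, r)
            for r in range(first_row, last_row + 1)
            for c in range(first_col, last_col + 1)]

def stream_window(x, y, w, h):
    """Report which tile node would receive pixels for each overlapped region."""
    for col, row in owning_tiles(x, y, w, h):
        origin = (max(x, col * TILE_W), max(y, row * TILE_H))
        # a real system would crop the frame to this region and stream its pixels
        print(f"tile ({col},{row}) <- window region starting at {origin}")

if __name__ == "__main__":
    # a 4000 x 2000 visualization window placed at (1000, 600) on the wall
    stream_window(1000, 600, 4000, 2000)
```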
One part of this demo will use Cyto-Vis, an artistic, real-time visualization system driven by statistical information gathered during gigabit network data transfers to SAGE. It artistically presents network characteristics, such as origin of the data, bandwidth utilization, number of applications running, and frame rate. It will also detect and display the presence of Bluetooth devices carried by attendees. The idea is to generate an event-specific, real-time visualization that creates 3-D patterns based on actual local and wide-area networking data. (See www.evl.uic.edu/cavern/glvf, www.evl.uic.edu/cavern/sage, and www.evl.uic.edu/luc/cytoviz; demo US117.)