NTT, Academic Partners Share Computing and Network Innovations at ON*VECTOR

March 17, 2011

Research scientists, engineers and entrepreneurs from seven countries gathered at the University of California, San Diego, from March 2-4 for the tenth annual ON*VECTOR International Photonics Workshop, a three-day “meeting of the minds” devoted to exploring the hottest and most relevant topics in computing and networking. This year’s meeting was also preceded by the DMASM 2011 conference on metadata and digital media analysis, search and management.

ON*VECTOR stands for Optical Networked Virtual Environments Collaborative Trans-Oceanic Research. The annual ON*VECTOR workshop, sponsored by NTT Network Innovation Laboratories of Japan, has contributed to the increased international use and ongoing development of photonic (or optical) networks. Optical networks are communication networks that transmit information in the form of light (photons); they provide greatly increased bandwidth because multiple signals can be carried through a single optical fiber on individual wavelengths of light, called lambdas.
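The capacity multiplication that lambdas provide is easy to see with simple arithmetic. The channel count and per-wavelength line rate below are illustrative assumptions for a dense wavelength-division multiplexing (DWDM) system, not figures cited at the workshop:

```python
# Back-of-envelope: aggregate capacity of one fiber carrying many lambdas.
# Channel count and per-lambda rate are illustrative assumptions only.

CHANNELS = 80               # lambdas multiplexed onto a single fiber
RATE_PER_LAMBDA_GBPS = 40   # line rate per wavelength, in Gbps

aggregate_gbps = CHANNELS * RATE_PER_LAMBDA_GBPS
print(f"{CHANNELS} lambdas x {RATE_PER_LAMBDA_GBPS} Gbps = "
      f"{aggregate_gbps} Gbps ({aggregate_gbps / 1000:.1f} Tbps) per fiber")
```

Under these assumptions a single fiber carries 3.2 Tbps, which is why adding wavelengths, rather than fibers, is the economical way to scale an optical network.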

“The annual ON*VECTOR workshop at Calit2 has become an important opportunity for NTT Labs to hear about new directions in academic research related to the development, deployment and use of photonic networking on a global basis,” said Kazuo Hagimoto, executive director of NTT Science and Core Research Laboratories Group. “We appreciate Calit2’s contribution every year.”

The ON*VECTOR project is an ongoing research collaboration among NTT Network Innovation Laboratories, Keio University’s Institute for Digital Media and Content, the University of Tokyo's Morikawa Laboratory, the Electronic Visualization Laboratory (EVL) at the University of Illinois at Chicago (UIC) and UCSD/Calit2. The ON*VECTOR project is coordinated by Pacific Interface Inc., a California-based consulting company.

“The 25 or so talks at ON*VECTOR 2011 were so strong, it’s as if the entire program was a succession of keynote presentations,” remarked Calit2 Director of Visualization Tom DeFanti, who is also an ON*VECTOR program committee member.  “The information exchanged and personal connections are really valuable.  What really makes ON*VECTOR special is the caliber of people who attend.”

“Every year we come back to UCSD/Calit2 for ON*VECTOR because of the great support we get from everyone, and the great facilities available,” said Laurin Herr, president of Pacific Interface, Inc. “Attendees appreciate the opportunity to see demonstrations of Calit2’s own research labs. And UCSD faculty and graduate student speakers have contributed meaningfully to the program for many years, including 2011.”

This year’s workshop focused on a number of contemporary topics in networking and computing, including:
• Green IT
• 40-100 Gigabit-per-second (Gbps) technology research
• Designing and securing cloud computing systems
• Advanced network infrastructure
• Next-generation processing units
• Managing future networks
• Computer-supported cooperative work on future networks
• Future network challenges

Independent consultant and longtime Canadian Internet guru Bill St. Arnaud, who served as chair of the workshop’s Green IT panel, underscored the importance of building smart, energy-efficient datacenters to help enable advances in computing and networking.

“There has been phenomenal growth in the numbers and size of datacenters, and with that comes an increase in energy consumption,” he remarked. “Like the iron, steel and cement factories before them, datacenters are the heavy industry of the information age.”

St. Arnaud added that as global temperatures rise, computer scientists and engineers will need to rethink where and how they build datacenters.

“Some climate scientists predict that San Diego, for example, will be hotter and drier 20 years from now than the Sahara desert is today,” he added. “So what do you do when you find your transformers are blowing up from the heat? Does it make sense to do your computing somewhere else?”

Joe Mambretti, director of the International Center for Advanced Internet Research (iCAIR) at Northwestern University, led a series of discussions on research into 40-100 Gbps technology. Mambretti says that for computing and photonic networks to evolve, the industry’s underlying business model must first be overhauled.

“Current communication services are designed, implemented and run based on 19th-century models, where designs have very precisely defined services and technologies,” he explained. “This actually limits the power of IT and makes it very static.

“Gaining revenue from static services is a dead model,” Mambretti continued. “What we need is a strategy where we customize, rather than a strategy where we rely on a ‘take it or leave it’ mentality. One-network-fits-all is a model that will not scale or provide the resources we will need in the future.”

Calit2’s DeFanti led a panel on “Designing and Securing Cloud Computing Systems” which, he noted, is closely related to green computing: cloud systems allow computing and storage to take place anywhere, and with fewer computers serving more users, electricity consumption per unit of work (or per person) is reduced and power sources with lower carbon emissions can be given preference.

“It’s clear that cloud computing and storage is the future,” said DeFanti, who leads the Calit2-based, NSF-funded GreenLight project. “But how it will actually evolve comes about as a result of discussions like this one, with a diverse group of individuals like we had at ON*VECTOR. For instance, pressure needs to be put on the telecommunications industry to make greener switches that reduce power consumption, say, at night, when traffic is lower. That’s not the case now.” 
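DeFanti's consolidation argument can be made concrete with a simple per-user energy comparison. All of the numbers below are hypothetical assumptions chosen only to illustrate the arithmetic:

```python
# Illustrative energy-per-user comparison: dedicated servers vs. a
# consolidated cloud. Every number here is a hypothetical assumption.

SERVER_WATTS = 300               # average power draw of one server

dedicated_users_per_server = 10   # lightly loaded, one app per machine
cloud_users_per_server = 100      # virtualization packs users together

watts_per_user_dedicated = SERVER_WATTS / dedicated_users_per_server
watts_per_user_cloud = SERVER_WATTS / cloud_users_per_server

print(f"dedicated: {watts_per_user_dedicated:.0f} W/user, "
      f"cloud: {watts_per_user_cloud:.0f} W/user")
```

With these assumptions, consolidation cuts per-user draw from 30 W to 3 W, and because the cloud datacenter can be sited anywhere, it can also be placed near low-carbon power.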

This year’s ON*VECTOR workshop was preceded for the first time by a new conference on Digital Media Analysis, Search and Management (DMASM). Also held at Calit2 and organized by Pacific Interface’s Laurin Herr and Natalie Van Osdol, DMASM was sponsored by NTT Communication Science Laboratories and featured presentations on NTT Robust Media Search Technology; Protecting Online Media Assets; and Content-Based Media Analysis, Search and Management, as well as several other media- and metadata-related topics.

Metadata is typically referred to as ‘data about data.’ As one DMASM attendee put it, “In terms of Internet search results, metadata is the difference between ‘Arnold Schwarzenegger’ turning up as ‘The Terminator’ or ‘Kindergarten Cop’ or the Governor of California.”

“The use, production, and distribution of digital media has historically been limited by slow and costly storage, processors and networks,” Herr explained. “But as more and more of these old bottlenecks are eliminated, new bottlenecks are emerging. One of the most critical, and most difficult to resolve, is the metadata bottleneck. Without metadata you can’t search, manage, control, preserve, profit from or protect digital media.

“Humans typing in metadata doesn’t scale,” he added. “Labor costs don’t scale; they can’t keep up. YouTube is currently adding 35 hours of content per minute, and growing! We need automated tools to solve this problem. So, at DMASM 2011 we brought together people from the commercial community, who have needs but no satisfactory tools, with people from the research community, who are capable of developing new tools but rarely hear first-hand about commercial requirements. Our hope is to promote the exchange of information and opinions, cultivate collaboration and help build a community interested in tackling the challenges of implementing content-based metadata.”
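Herr's point that manual metadata entry cannot keep up is easy to quantify. Starting from the 35-hours-per-minute upload figure he cited, and assuming (our assumption, purely for illustration) that hand-tagging one hour of video takes one hour of labor:

```python
# Why manual metadata entry doesn't scale: estimate the full-time
# annotators needed to hand-tag YouTube's 2011 ingest rate.
# The tagging-speed and work-year figures are illustrative assumptions.

UPLOAD_HOURS_PER_MINUTE = 35     # figure cited at DMASM 2011
TAGGING_HOURS_PER_VIDEO_HOUR = 1.0   # assumed labor per hour of video
WORK_HOURS_PER_YEAR = 2000       # one full-time annotator

video_hours_per_year = UPLOAD_HOURS_PER_MINUTE * 60 * 24 * 365
annotators_needed = (video_hours_per_year * TAGGING_HOURS_PER_VIDEO_HOUR
                     / WORK_HOURS_PER_YEAR)
print(f"{video_hours_per_year:,} hours of video per year -> "
      f"{annotators_needed:,.0f} full-time annotators")
```

Even under these conservative assumptions the backlog demands thousands of full-time annotators for a single site, which is the scaling wall that automated, content-based metadata tools aim to remove.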

ON*VECTOR will return to Calit2 at UC San Diego in early 2012 for its eleventh meeting, and DMASM will also come back – undoubtedly even bigger in its second outing, as researchers grapple with ways for metadata to make sense of a world where the volume of data carried over photonic networks continues its explosive growth. 

by Tiffany Fox, (858) 246-0353

Related Links

NTT Network Innovation Laboratories