 Press Release

UC Berkeley computer scientists receive nearly $9.5 million to tackle some of today's major computer and Internet problems
13 Sep 2000

By Robert Sanders, Media Relations


Berkeley - With nearly $9.5 million in support from President Clinton's Information Technology Research program, University of California, Berkeley, scientists and engineers are embarking on studies to answer some of today's vexing questions surrounding computers and the Internet.

Why do software applications crash so frequently? Why do the computer clusters known as servers go down so often, even as maintenance costs skyrocket? How do we ensure that the Internet evolves to remain as successful as it is today?

UC Berkeley computer scientists are among the recipients of eight separate grants announced today (Wed., Sept. 13) by the National Science Foundation, part of a new $90 million federal program to support research on information technology. Five of these grants, totaling nearly $7.4 million over the next five years, will fund projects led by UC Berkeley faculty. UC Berkeley computer scientists are collaborators on three other projects that will bring about $2.07 million to the campus over the next five years.

"These UC Berkeley research groups have a significant track record of success in changing the industry, and they're now tackling a new challenge," said Christos H. Papadimitriou, acting chair of the computer science division in the UC Berkeley College of Engineering and the C. Lester Hogan Professor of Computer Science.

In all, the National Science Foundation (NSF) announced grants for 62 large projects averaging $1 million per year for three to five years, plus 148 smaller projects of $500,000 or less for up to three years.

The NSF expects continued funding to support these grants and has asked for $190 million of additional funding in the next fiscal year.

"This initiative will help strengthen America's leadership in a sector that has accounted for one-third of U.S. economic growth in recent years," said President Clinton.

Projects that have come out of the UC Berkeley Department of Electrical Engineering & Computer Sciences and into general use include the Berkeley UNIX operating system, widely used today on computer workstations and for Web services; RISC processor designs, at the heart of the PowerPC; tools for the computer-aided design of chips; RAID, used in many hard disk systems today; and ideas for linking computers together to form large networks of workstations.

One of the new projects, with a budget of $2.4 million and conducted in collaboration with Mills College in Oakland, Calif., will look at the reliability of the clusters of thousands of computers commonly used today by Internet companies like Yahoo and Inktomi to index the Internet. Another, budgeted at $2.5 million over three years, will focus on a problem that has been intractable for more than 20 years: buggy software. A third, funded with $1.5 million over five years, will explore ways of relieving congestion on the Internet, including financial incentives.

Last year, Clinton's Information Technology Advisory Committee (PITAC) recommended an "increased federal investment (in information technology) to maintain the U.S. lead in this important sector of the global economy," noted NSF director Rita Colwell in announcing the Information Technology Research awards. The $90 million in awards responds to these recommendations, she said, funding projects that "represent major innovations in information technology, rather than routine applications of existing technology."

For a complete list of ITR awards and project abstracts, see http://www.itr.nsf.gov/.

For the PITAC report, see http://www.ccic.gov/.

Projects led by UC Berkeley include:

Preventing server crashes
Taming the Data Flood: Systems that Evolve, are Available and Maintainable
$2,399,802 over three years
http://iram.CS.Berkeley.EDU/seam/

Networks of computer workstations, numbering in the thousands, have become common in the Internet age. Yahoo alone uses thousands of computers to index the Internet.

But keeping these server systems running and reliable has become an expensive problem. As computer hardware has become more reliable, system and operator errors have become the major causes of server crashes; operator error alone may account for 40 percent of system crashes. Meanwhile, human maintenance costs have grown to nearly 95 percent of the cost of operating a system.

"These computer systems are designed to scale up easily, but two of the big problems are that human costs become extravagantly high as they grow bigger, and they don't work all that well - they go down a lot," said David Patterson, UC Berkeley professor of computer science and principal investigator for the grant along with UC Berkeley computer scientists Katherine Yelick and John Kubiatowicz. "The problem of attaining peak performance, which has dominated the research agenda for the past 20 years, will be secondary to concerns of availability, maintainability and evolutionary growth in the PostPC era, where computers must cope with the flood of new data and yet be much more dependable and maintainable."

In collaboration with computer science professor Matthew Merzbacher at Mills College in Oakland, Calif., Patterson and his UC Berkeley colleagues plan to test ways of making computer servers cheaper to maintain and more available, even in the presence of hardware or programming bugs. One possibility is to purposely introduce flaws into computer systems to see how they recover and to design them to be rugged in the presence of these flaws. This might involve software that senses and automatically repairs data.
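One can picture the sense-and-repair idea in miniature. The Python sketch below stores each block with a checksum on two toy disks, injects occasional corruption on writes, and repairs a bad copy from its mirror on read; the class names and fault model are invented for illustration and are not the project's actual design.

```python
import hashlib
import random

FAILURE_RATE = 0.05  # hypothetical injected-fault probability

class Disk:
    """Toy storage device whose writes occasionally fail silently,
    standing in for the purposely introduced flaws described above."""
    def __init__(self):
        self.blocks = {}

    def write(self, key, block):
        if random.random() < FAILURE_RATE:
            i = random.randrange(len(block))  # injected fault: flip one byte
            block = block[:i] + bytes([block[i] ^ 0xFF]) + block[i + 1:]
        self.blocks[key] = block

class MirroredStore:
    """Keeps a checksummed copy of every block on two disks; a read
    that detects corruption repairs the bad copy from its mirror."""
    def __init__(self):
        self.disks = (Disk(), Disk())

    def put(self, key, data):
        block = hashlib.sha256(data).digest() + data  # 32-byte checksum + data
        for disk in self.disks:
            disk.write(key, block)

    def get(self, key):
        copies = [disk.blocks[key] for disk in self.disks]
        good = next((c[32:] for c in copies
                     if hashlib.sha256(c[32:]).digest() == c[:32]), None)
        if good is not None:
            fixed = hashlib.sha256(good).digest() + good
            for disk in self.disks:  # sense corruption, automatically repair
                disk.blocks[key] = fixed
        return good
```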

Patterson is already exploring these options in a current project called I-STORE.

"I've become aware of the I-STORE project and find it very intriguing," said Dr. Winfried W. Wilcke, program director for Silicon Valley of the IBM Research Division. "It squarely addresses the real issues in highly scalable systems of today and the foreseeable future." Patterson hopes the ideas they develop will trickle down to home computers and make them more reliable, too.

The collaboration with Mills, an all-women's college, is designed to increase interactions between faculty and both undergraduate and graduate students at the two institutions.

"We're hoping the cross pollination will be good for both sides, and that this will create opportunities for women in computer science at Cal that aren't there now," Merzbacher said.

Getting the bugs out before software release
The Open Source Quality Project
$2,499,923 over three years

As computer programs have become bigger and more complex, bugs have proliferated. It is estimated that the Windows NT operating system has about a million bugs in its 10 million lines of code, each an opportunity for a computer crash. Even programs like the Linux operating system, which are open source - the computer code has been published for all to see and improve - have numerous bugs, perhaps even more than proprietary code.

UC Berkeley computer science professor Alex Aiken is leading an effort with his UC Berkeley computer science colleagues, professor Thomas A. Henzinger and assistant professor George Necula, along with David Schmidt of Kansas State University, to study open source programs like Linux and find ways to improve their reliability.

"We want to prevent the release of buggy code, which requires new upgrades all the time," said Aiken. "We believe we can design tools to find bugs in existing programs more cheaply than it can be done now, and help people change the way they write code so it's less buggy in the first place."

Linux is the first open source code they plan to look at. They will concentrate initially on the parts of the code that are the most difficult to program and to test, those dealing with device drivers - the code that tells the computer how to deal with peripheral devices.
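A deliberately naive sketch of the kind of check such tools automate, in Python: real analyses reason about every control-flow path through the C source, while this toy scanner only tracks whether a lock is held when a function returns. The driver fragment and the check itself are invented, not the project's actual tools.

```python
def check_lock_returns(func_body: str):
    """Toy static check for a classic driver bug: returning while a
    spin lock is still held. Real tools analyze every control-flow
    path; this sketch just scans the function line by line."""
    held, bugs = 0, []
    for lineno, line in enumerate(func_body.splitlines(), 1):
        if "spin_lock(" in line:
            held += 1
        if "spin_unlock(" in line:
            held -= 1
        if "return" in line and held > 0:
            bugs.append(lineno)
    return bugs

driver_body = """\
spin_lock(&dev->lock);
if (dev->broken)
    return -EIO;    /* bug: lock still held */
spin_unlock(&dev->lock);
return 0;"""

for lineno in check_lock_returns(driver_body):
    print(f"possible bug: return with lock held at line {lineno}")
```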

The problem of buggy device drivers has implications for security as well, Necula said. With device drivers, you often don't know who wrote specific sections, or whether they inserted a Trojan horse that could allow unauthorized entry into your system.

"We hope to act as quality assurance for the open source movement," Necula said. "This could greatly improve the software development process."

Scaling the Internet without bringing it to a halt
Collaborative Research: Scalable Services for the Global Network
$1,493,661 over five years

As Internet usage ramps upward, the World Wide Web sometimes seems like the World Wide Wait. New and resource-hungry applications, ranging from videoconferencing and multimedia Webcasting to distributed gaming, threaten to bring the Internet to a standstill in the not-too-distant future.

John Chuang, an assistant professor in UC Berkeley's School of Information Management & Systems, and three colleagues will develop new network services to support these and other types of networking applications. The grant also involves computer scientists Edward Knightly of Rice University, Jörg Liebeherr of the University of Virginia, and Hui Zhang of Carnegie Mellon University.

Bringing data closer to the users is a recurring theme in this project. For example, local caching of frequently requested Web pages and audio and video files can eliminate duplicate traffic and improve the response times users perceive. Intelligent management of network resources allows the distribution of live, high-quality video and audio that don't break up.
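The caching idea can be sketched in a few lines. Below is a minimal local Web cache with least-recently-used eviction, in Python; the fetch_from_origin callback and the capacity are placeholders, not part of the project's design.

```python
from collections import OrderedDict

class WebCache:
    """Tiny LRU cache: frequently requested pages are served locally,
    so repeat requests never cross the wide-area network."""
    def __init__(self, capacity=1000):
        self.capacity = capacity
        self.pages = OrderedDict()  # url -> body, least recently used first

    def get(self, url, fetch_from_origin):
        if url in self.pages:
            self.pages.move_to_end(url)     # mark as recently used
            return self.pages[url]          # cache hit: no WAN traffic
        body = fetch_from_origin(url)       # cache miss: fetch once
        self.pages[url] = body
        if len(self.pages) > self.capacity:
            self.pages.popitem(last=False)  # evict least recently used
        return body
```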

The primary design criterion for any solution, however, is scalability - that is, that the solution still works as Internet use increases.

"We want to develop solutions that can scale up to accommodate future growth of the Internet," Chuang said, "but our solutions also have to be incrementally deployable over the current Internet."

Solutions might involve economic incentives, as well, including Internet pricing.

"We are taking a novel, multidisciplinary approach to this problem, and we believe that well designed pricing schemes for network services can be effective in encouraging efficient use of the Internet," he said.

If the Internet is the answer, what is the question?
Analysis of Internet algorithms: Optimization, game theory and competitive analysis
$499,774 over three years

Four theoretical computer scientists are teaming up to find out why the Internet works so well, and to develop principles that could guide its future evolution.

One particular question involves traffic control on the Internet. TCP's congestion control scheme, developed by Van Jacobson, then at Lawrence Berkeley National Laboratory, works beautifully today, slowing traffic as it builds so as to guarantee as much access as possible. UC Berkeley computer scientists Richard Karp and Christos H. Papadimitriou, in collaboration with Scott Shenker of the International Computer Science Institute in Berkeley and Elias Koutsoupias of UCLA, will explore the mathematical details of the protocol and attempt to place it on a firmer foundation.

"We believe TCP/IP is great, but why is it great? There is no mathematics to explain this," Papadimitriou said. "It's a very difficult problem, but if we knew the mathematical foundations, then when the protocol needs to be updated, we could know what to do next."

Debugging the compiler
Translation Validation for Advanced Compiler Optimizations
$499,554 over three years

Even if programmers write a perfect piece of software, bugs in the compiler - which translates program instructions into commands the computer can understand - can screw it up. George Necula, assistant professor of computer science at UC Berkeley, thinks he has a way to find and flag these compiler bugs so they can be fixed.

Compilers are used primarily by software developers and programmers, though many Web browsers incorporate a Java compiler. Nevertheless, improvements in compilers would affect everyone, including the end user.

"It's very, very hard to find these kinds of errors in compilers," Necula said. "We want to explore and perfect our tool so we can deploy it broadly to help developers find bugs, and allow the safe use of more aggressive optimizations in compilers."

Necula's goal by the end of the project is to ship his compiler checker alongside GNU C, a free, open source compiler used by tens of thousands of developers around the world.

Projects involving UC Berkeley collaborators:

Creating dynamic images on the Internet
Interacting with the visual world: Capturing, understanding and predicting appearance
$3.5 million over five years, $621,000 of that total for UC Berkeley
Principal investigator: Shree Nayar of Columbia University
UC Berkeley collaborator: Jitendra Malik, professor of computer science

The project goal is to make pictures and video on the Web more dynamic, so users can freely explore, interact with and create variations of the physical world being presented. Malik, whose expertise is computer vision, is developing ways to enhance the two-dimensional information in pictures and video to create three-dimensional images with information on appearance, too. Then the user can view the scene from a different perspective or under different lighting conditions. This could help in architectural design and distance learning, and provide greater realism in virtual environments and multimedia entertainment.
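As a greatly simplified illustration (not the project's actual methods): once per-pixel surface orientation and paint color have been recovered from images, re-rendering the scene under a new light can be as simple as a dot product per pixel, as in this Lambertian shading sketch.

```python
import numpy as np

def relight(normals, albedo, light):
    """Lambertian re-lighting: given per-pixel surface normals and
    albedo (paint color), appearance under a new light direction is
    albedo * max(0, n . l). A deliberately simplified model."""
    light = light / np.linalg.norm(light)
    shading = np.clip(normals @ light, 0.0, None)  # n . l per pixel
    return albedo * shading

# a 2x2 'image': three pixels facing the camera, one tilted pixel
normals = np.array([[[0, 0, 1], [0, 0, 1]],
                    [[0, 0, 1], [0, 0.6, 0.8]]], dtype=float)
albedo = np.ones((2, 2))
print(relight(normals, albedo, light=np.array([0.0, 0.0, 1.0])))
print(relight(normals, albedo, light=np.array([0.0, 1.0, 1.0])))
```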

Profiling the user for better service
IM Data Centers - Managing Data with Profiles
$3 million over five years, $800,000 of which goes to UC Berkeley
Principal investigator: Stan Zdonik, Brown University
UC Berkeley collaborator: Michael Franklin, associate professor of computer science

If computer networks could keep track of your on-line needs - when, where and how you use the Internet - they could be faster and more responsive. Michael Franklin and his colleagues want to explore the added services that user profiles can provide, such as speeding data recharging - a term that refers to connecting to a computer network to update e-mail, calendar, news, and more. "We want to make recharging data on a portable device as simple as recharging the power," he said. User information also could help system managers better allocate resources.
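A minimal sketch of how a profile might drive recharging, in Python; the topics, weights and update records below are all invented for illustration.

```python
def recharge(updates, profile, budget_kb):
    """Toy data recharging: given a user's profile of interests and a
    limited connection, send the most valuable updates first."""
    ranked = sorted(updates, key=lambda u: profile.get(u["topic"], 0),
                    reverse=True)
    plan, used = [], 0
    for u in ranked:
        if used + u["kb"] <= budget_kb:  # fits in the remaining budget
            plan.append(u["id"])
            used += u["kb"]
    return plan

profile = {"email": 1.0, "calendar": 0.8, "news": 0.3}
updates = [
    {"id": "inbox-delta", "topic": "email", "kb": 120},
    {"id": "headlines", "topic": "news", "kb": 300},
    {"id": "meetings", "topic": "calendar", "kb": 40},
]
print(recharge(updates, profile, budget_kb=200))  # ['inbox-delta', 'meetings']
```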

Creating a global scientific network
GriPhyN - Grid Physics Network
$12 million over five years, $650,000 of it for UC Berkeley
Principal investigator: Paul Avery, University of Florida
UC Berkeley collaborator: Michael Franklin, associate professor of computer science

In the largest project funded by NSF, physicists and computer scientists from around the country will collaborate to build a global infrastructure that will let scientists share data and results of computer simulations. UC Berkeley's Michael Franklin will direct work on a coherent architecture to allow data delivery on the scale of petabytes of information.

###
