Lessons Learned from Deep Blue Lead to New Computing Approaches

Washington, D.C., USA & Yorktown Heights, NY, USA - 24 Oct 2002: Five years ago, IBM's chess-playing supercomputer, Deep Blue, defeated then-reigning World Chess Champion Garry Kasparov in a six-game match. While Deep Blue was a computer system designed to play chess at the grandmaster level, its underlying technology now helps solve many real-world challenges, including weather forecasting, financial market modeling, automotive design, and medical research and development.

IBM recently donated part of Deep Blue to the Smithsonian's National Museum of American History where it will be on display beginning tomorrow, October 25, as part of the permanent exhibition, "Information Age: People, Information and Technology."

Today, we are faced with new challenges in information technology whose solutions will require comparable innovations. Computer systems are becoming too complex to manage in traditional ways, namely, by having human administrators monitor their behavior and respond to hardware and software glitches, security alerts, outdated software, resource limitations and a host of related problems. The environments that a systems administrator must confront are vastly more complex and ill-defined than a game of chess. Deep Blue has led directly to the notion of deep computing: taking vast amounts of data and computational power and using them to make better decisions.

"Deep Blue's victory in 1997 was a milestone. It caused a lot of us to think that there are some things that are possible now with the kinds of computers we have that we didn't really believe we could do before," says Bill Pulleyblank, director of Exploratory Server Systems at IBM Research and director of the IBM Deep Computing Institute. "It re-energized a lot of people and created a belief in deep computing capabilities that might not have been there without it."

Yet, over the next 10 to 20 years, there will not be enough skilled personnel to manage the world's IT infrastructure; nor is it likely that even the smartest humans will have the speed, endurance and attentiveness required to manage that infrastructure. As computers get faster, more widely connected and involved in more and more aspects of business, healthcare, transportation, entertainment -- in short, almost every area of our lives -- it is essential that they take some responsibility for their own management.

Since the match five years ago, IBM has proposed a grand challenge and is currently working with academia, governments and other corporations to address the looming problem posed by the complexity of IT infrastructure. Called "autonomic computing," the initiative calls for computers to manage themselves, with capabilities beyond those of human administrators, across a wide range of business and commercial applications, from e-sourcing to data mining to resource allocation.

IBM has applied lessons learned from Deep Blue to projects like autonomic computing, natural language processing, and data mining. For example, the current workhorse of IBM computational power, the Blue Gene computer, is being applied to modeling protein folding, but its chips are in fact general-purpose. This was one of the lessons of the Deep Blue project: only in exceptional cases does it make sense to design special-purpose hardware for a particular application (e.g., to play chess). Usually it is better to rely on general-purpose processors.

Deep Blue also proved that one could tackle tough problems with smart algorithms and sufficient computational power. It encouraged many programmers to view real-life challenges as solvable, instead of as daunting, intractable problems. Recent initiatives such as the Superhuman Speech project - a system being designed to recognize human speech better than humans can - evolved from this kind of thinking.
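The "smart algorithms plus computational power" approach described above is exemplified by game-tree search with alpha-beta pruning, the family of techniques on which chess programs like Deep Blue were built. The following Python sketch is illustrative only: Deep Blue's actual search ran largely on special-purpose chess chips and evaluated real board positions, whereas this toy version searches a hand-made tree of numeric scores.

```python
def alpha_beta(node, depth, alpha, beta, maximizing):
    """Return the minimax value of `node`, pruning branches that
    cannot affect the final decision (alpha-beta pruning)."""
    if depth == 0 or not isinstance(node, list):
        return node  # leaf: a numeric evaluation score
    if maximizing:
        value = float("-inf")
        for child in node:
            value = max(value, alpha_beta(child, depth - 1, alpha, beta, False))
            alpha = max(alpha, value)
            if alpha >= beta:  # opponent would never allow this line: prune
                break
        return value
    else:
        value = float("inf")
        for child in node:
            value = min(value, alpha_beta(child, depth - 1, alpha, beta, True))
            beta = min(beta, value)
            if beta <= alpha:
                break
        return value

# Toy game tree: inner lists are positions, numbers are leaf evaluations.
tree = [[3, 5], [6, 9], [1, 2]]
print(alpha_beta(tree, 2, float("-inf"), float("inf"), True))  # → 6
```

Pruning lets a program skip lines of play that provably cannot change the outcome, which is why raw speed (Deep Blue's hundreds of millions of positions per second) translates into deeper, stronger search.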

Facts about Deep Blue

Deep Blue ran on a massively parallel, RS/6000 SP-based computer system.

Deep Blue searched between 100 million and 200 million chess positions per second.

During the match with Kasparov, it averaged 126 million positions per second.

IBM researchers Murray Campbell, Joseph Hoane, and Feng-hsiung Hsu created Deep Blue.

IBM's Deep Blue project began in 1989. Because IBM continually added new hardware and software across successive iterations, there is no single year in which Deep Blue was built.
