Posted on 2010-8-5 14:03:43
Original text
Your computer at the service of science: an exclusive interview with the director of SETI@home and BOINC, David P. Anderson
June 13, 2010
Exclusive interview for seti.cl
David P. Anderson is a research scientist at the Space Sciences Laboratory at the University of California, Berkeley. He leads the SETI@home, BOINC, Bossa and Bolt projects.
Thanks to the development of SETI@home, more than 5 million users worldwide joined to form the largest supercomputer in existence, creating the concept of volunteer computing in the process. From that effort came BOINC, a platform that supports many other projects, which you can see in detail here.
Spanish version here.

David Anderson,Matt Lebofsky and Jeff Cobb in the SETI@home machine room, a converted wiring closet that contains about 50 computers in 3 full-size racks.
How long have you been working on SETI@home and BOINC? How did the idea come up?
David Gedye – a friend of mine, and a former student – had the idea for SETI@home in 1994, and he organized the project.
David and I envisioned the use of volunteer computing not only in radio SETI but in all areas of science.
By 2002, there were a few big volunteer computing projects – SETI@home, Folding@Home, GIMPS.
We had succeeded in harnessing a huge amount of computing power, but it wasn’t being allocated to scientists in a good way.
So I decided to develop software that would let any scientist create a volunteer computing project, and would let volunteers donate computing power to any set of these projects, not just one. The goal of this software – BOINC – is to create a dynamic “scientific computing ecosystem” in which projects compete for volunteers, the projects doing the best science get the most computing power, and volunteers have an incentive to learn about and evaluate current scientific research.
The National Science Foundation (a U.S. funding agency) supported this vision, and has funded the development of BOINC from 2002 until the present day.
What’s your inspiration to keep working on distributed computing?
I think volunteer computing is the future of scientific computing. The amount of computing power in consumer products – especially GPUs – vastly exceeds that of clusters, grids, and supercomputers. In addition, I hope that volunteer computing will encourage people to think about science, the scientific method, the value of skepticism, time scales beyond the current moment, and spatial scales beyond their immediate surroundings. These types of thought are under constant attack.
For me personally, it’s fun and satisfying to work on BOINC. There are many challenging technical problems in converting a huge set of diverse, untrusted computers into a reliable computing resource. And of course, every computer programmer dreams of having their creations used by lots of people.

Jeff Cobb, David Anderson and Matt Lebofsky looking at Matt's broken computer. Behind Matt is David's desk, with a picture of Half Dome (Yosemite) and the SETI@home Certificate of Computation, signed by himself.
Is there any significant result on any BOINC projects?
Most BOINC projects produce a constant stream of scientific results and publications: Climateprediction.net has done the best predictions of century-scale climate change, IBM World Community Grid has discovered drug candidates, PrimeGrid has found new prime numbers, and so on. SETI@home hasn’t found ET, but SETI@home is not representative of volunteer computing in general.
What are people's favourite areas of research?
What’s more important: to find new drugs to treat cancer?
To understand the molecular basis of cancer?
To mitigate the environmental factors that cause cancer?
Or something more universal, like finding the Higgs boson, gravitational waves, or ET?
There’s no right answer, of course. Everyone has their own values and their own opinions. The beauty of volunteer computing is that it gives people a reason to think about these questions, and the power to act on their conclusions.
What is human-based computation? Are people collaborating and understanding this concept?
Most science research projects involve a web of interdependent activities: forming theories, designing experiments, building instruments, analyzing data, and so on. Computers can help with some of these things, but science remains an essentially human activity.
Some research involves tasks that the public – perhaps with a little training – can do via the Internet. Pioneering projects like Stardust@home and Galaxy Zoo used this approach to analyze large numbers of images.
People – including me – are becoming interested in exploring other types of research in which the approach might be useful.
This is a bit like volunteer computing, but in some ways harder because people are far more diverse than computers.
What are your expectations on distributed computing and citizen science?
We’ve made only limited progress toward the grand goal of a scientific computing ecosystem. There are more volunteer computing projects now – about 50 – but not many new ones. Volunteer computing hasn’t entered the mainstream of scientific computing, and it hasn’t achieved public awareness on a large scale.
The main difficulty I see is that, even with BOINC, few scientists have the computer resources or skills to operate a volunteer computing project. The solution, I think, is to locate projects at an institutional level: instead of SETI@home, for example, we should have UCBerkeley@home, which provides computing power to all UC Berkeley researchers, and is operated and promoted by the University.
A small group of people is working towards the grand goal, and progress is being made on several fronts.
I feel that we’re approaching a “tipping point” after which things will really take off.
Any greetings for the Spanish-speaking community?
I lived in Ecuador as a teenager and have tremendous love and respect for Latin American culture and people.
I hope to form collaborations with Latin American scientists, partly so that I can travel there often and practice my Spanish, which has eroded considerably.