
Why Computer Science?

 

• Teaches good problem-solving techniques for any discipline.

• Prepares students for life beyond college, in a world increasingly influenced, if not driven, by computers and computer science.

• Provides the knowledge and computational skills required by the majority of college math and science courses.

 

Article from The New York Times, March 25, 2001, by George Johnson

 

Except for the fact that everything, including DNA and proteins, is made from quarks, particle physics and biology don't seem to have a lot in common. One science uses mammoth particle accelerators to explore the subatomic world; the other uses petri dishes, centrifuges and other laboratory paraphernalia to study the chemistry of life. But there is one tool both have come to find indispensable: supercomputers powerful enough to sift through piles of data that would crush the unaided mind.

 

Last month both physicists and biologists made announcements that challenged the tenets of their fields. Though different in every other way, both discoveries relied on the kind of intense computer power that would have been impossible to marshal just a few years ago. In fact, as research on so many fronts is becoming increasingly dependent on computation, all science, it seems, is becoming computer science.

 

"Physics is almost entirely computational now," said Thomas B. Kepler, vice president for academic affairs at the Santa Fe Institute, a multidisciplinary research center in New Mexico. "Nobody would dream of doing these big accelerator experiments without a tremendous amount of computer power to analyze the data." But the biggest change, he said, was in biology. "Ten years ago biologists were very dismissive of the need for computation," Dr. Kepler said. "Now they are aware that you can't really do biology without it."  Researchers have long distinguished between experiments done in vivo (with a living creature) and in vitro (inside a glass test tube or dish.) Now they commonly speak of doing them in silica — as simulations run on the silicon chips of a computer. 

 

There are computational chemistry, computational neuroscience, computational genetics, computational immunology, and computational molecular biology. Even fields like sociology and anthropology are slowly succumbing to the change. At the Santa Fe Institute, computer models are used to study the factors that might have led to the rise and fall of complex cultures like the Anasazi of Chaco Canyon and Mesa Verde — a kind of artificial archaeology.

 

Scientists still devise hypotheses to be tested in the laboratory or in the field. But a new step has been added to the scientific process: More and more often, the experimental data that emerge are used to generate computer simulations.

 

A network of nerve cells or a complex molecule comes to life as an animation on a phosphorescent screen — to be electronically prodded and poked, manipulated with a fluidity not possible in the real world.

 

In the course of this augmentation of the scientific mind, the volume of data that needs to be analyzed has increased from a trickle to a torrent, with physicists and biologists making the heaviest demands. Early last month, Brookhaven National Laboratory in Upton, N.Y., unveiled precise new measurements of something called the anomalous magnetic moment of the muon. For months scientists gathered information about how streams of these particles, ejected from an accelerator, wobbled as they coursed around inside the world's largest superconducting magnet — a donut-shaped ring more than 40 feet in diameter. Details aside, the take-home message of the experiment was that the revered Standard Model, a longstanding theory describing the particles and forces of the universe, may be tantalizingly wrong.

 

But reaching that conclusion required a month-long computational marathon in which more than a trillion bytes of data were processed by a dozen computers. Then, just to be safe, the information was processed again by another bank of computers using different software.

 

A trillion bytes is the equivalent of a thousand one-gigabyte hard drives — hundreds of thousands of Napster downloads. But that was just a fraction of the information needed to produce the competing computer models of the human genome revealed the following week by Celera Genomics and the publicly financed International Human Genome Sequencing Consortium.
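
A quick sanity check on those figures, as a few lines of illustrative Python (the four megabytes per download is an assumed size for a typical MP3 of the era, not a figure from the article):

# Back-of-envelope arithmetic for the data volumes mentioned above.
TRILLION_BYTES = 10**12

one_gb_drives = TRILLION_BYTES / 10**9        # 1,000 one-gigabyte hard drives
downloads = TRILLION_BYTES / (4 * 10**6)      # assumed 4 MB per download: ~250,000 songs

print(f"{one_gb_drives:,.0f} drives, roughly {downloads:,.0f} downloads")
# prints: 1,000 drives, roughly 250,000 downloads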

Generating Celera's computerized genomic map required scrutinizing some 80 trillion bytes of data using what the company describes as "some of the most complex computations in the history of supercomputing." For this and other biological projects, Celera has assembled what is believed to be the largest civilian supercomputing operation in the world. The rival genome consortium, which relied on less computationally intensive techniques, had to yoke together 100 Pentium-powered PC's at the last moment to assemble 400,000 snippets of DNA into its own picture of the genome.
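
To give a rough feel for what "assembling snippets of DNA" means computationally, here is a minimal greedy overlap-assembly sketch in Python. It is not Celera's or the consortium's actual pipeline (real assemblers must cope with sequencing errors, repeated regions, and hundreds of thousands of fragments); it only illustrates the core idea of repeatedly merging the pair of fragments with the longest suffix-prefix overlap.

def overlap(a, b):
    """Length of the longest suffix of a that matches a prefix of b."""
    for k in range(min(len(a), len(b)), 0, -1):
        if a.endswith(b[:k]):
            return k
    return 0

def greedy_assemble(fragments):
    """Toy assembler: merge the best-overlapping pair until one sequence remains."""
    frags = list(fragments)
    while len(frags) > 1:
        best_len, best_i, best_j = 0, 0, 1
        for i in range(len(frags)):
            for j in range(len(frags)):
                if i != j:
                    k = overlap(frags[i], frags[j])
                    if k > best_len:
                        best_len, best_i, best_j = k, i, j
        merged = frags[best_i] + frags[best_j][best_len:]
        frags = [f for idx, f in enumerate(frags) if idx not in (best_i, best_j)]
        frags.append(merged)
    return frags[0]

# Toy reads from a made-up sequence; a real genome project juggles vastly more.
reads = ["ATGCGT", "CGTACC", "ACCTTG"]
print(greedy_assemble(reads))   # prints: ATGCGTACCTTG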

 

When the number-crunching was done, both teams were surprised to find that there may be far fewer genes than had long been believed — 30,000 instead of 100,000. The realization may lead to a rethinking of how the complexity of life unfolds from the genetic code.

 

Physicists, more than biologists, have been accustomed to working this way. Extreme computing has been an important part of their field since the days of the Manhattan Project. Supercomputers at government research centers, processing data at unprecedented speeds, simulate some of the complexities of a nuclear explosion or the impact of a meteor striking Earth.

 

In more abstract realms, a whole field called lattice quantum chromodynamics has sprung up, studying the strong nuclear force, which holds together the nuclei of atoms, by modeling how quarks and gluons cavort on a four-dimensional grid of artificial space and time. In the grandest simulations of them all, cosmologists play with computer models of the universe, tweaking the parameters of creation and running the big bang again and again.

 

With the genome project, biologists are now upstaging everyone, including physicists, in their sheer demand for computing power. And reconstructing the genome is just the beginning. Figuring out how the 30,000 genes, played like piano keys, give rise to the rhythms and melodies of life is going to take even more calculating power. Earlier this year Celera joined with Sandia National Laboratories in Albuquerque, N.M., and Compaq Computer to begin developing the hardware and software needed to move into biology's next phase.

 

For years physicists have worried that many of the bright young students who would once have joined the quest to discover the laws of nature were being diverted instead into computer science. Last month a leader in the software industry, Larry Ellison, the chief executive of Oracle, predicted that the focus of intellectual excitement would shift again.

 

"If I were 21 years old," he said at a company conference in New Orleans, "I probably wouldn't go into computing. The computing industry is about to become boring. I'd go into genetic engineering."

 

Maybe it wouldn't matter. Whatever field he chose, he would eventually end up doing computer science.