Archive for the ‘Scientific Applications’ Category

One for the high energy physics crowd

Tuesday, April 26th, 2005

Grids take center stage during the Global Grid Challenge, according to PhysOrg.com and as reported by Slashdot. Eight major computing centers in the US and Europe joined together to form a global computing infrastructure, successfully transmitting 500 terabytes of data over a 10-day period. The purpose of the Challenge tests is to prepare for the massive amounts of data to be generated and shared via the Large Hadron Collider (LHC) at CERN, due to come online in 2007. Once in operation, the LHC is expected to produce 1500 MB of data every second for a decade.

The collaborative success demonstrated by this particular service challenge marks a milestone for the broader impact of global cyberinfrastructure. The ability to perform massive calculations quickly is only one part of the big picture; moving large volumes of data around the globe in a timely manner opens new doors of opportunity.
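For a sense of scale, it helps to check the quoted figures against each other. The quick back-of-the-envelope calculation below (assuming decimal units, i.e., 1 TB = 10^12 bytes and 1 MB = 10^6 bytes) suggests the challenge sustained roughly 40% of the data rate the LHC is expected to produce:

    # Sanity check of the rates quoted above (decimal units assumed:
    # 1 TB = 10**12 bytes, 1 MB = 10**6 bytes).
    SECONDS_PER_DAY = 86_400

    # Service challenge: 500 TB moved over 10 days.
    challenge_rate_mb_s = 500 * 10**12 / (10 * SECONDS_PER_DAY) / 10**6

    # LHC projection once online: 1500 MB every second.
    lhc_rate_mb_s = 1500

    print(f"sustained challenge rate: {challenge_rate_mb_s:.0f} MB/s")  # ~579 MB/s
    print(f"fraction of projected LHC output: "
          f"{challenge_rate_mb_s / lhc_rate_mb_s:.0%}")                 # ~39%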

High performance cluster benchmarking

Tuesday, April 12th, 2005

Supercomputing Online reports that the Texas Advanced Computing Center (TACC) at the University of Texas at Austin has teamed up with Dell to research performance issues associated with high performance computing clusters. Citing the increased use of such clusters by the HPC community, the manager of TACC’s HPC group says in the article:

With Dell’s support, we will continue to investigate and improve the performance of applications when run in these clustered computing environments and explore new techniques and algorithms for improving performance.
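As a concrete (and purely illustrative) example of the kind of measurement such research typically starts from, the sketch below is a minimal MPI “ping-pong” microbenchmark that estimates one-way message latency and bandwidth between two cluster nodes. It assumes the mpi4py and numpy libraries and says nothing about the tools TACC and Dell actually use; run it across two nodes with, e.g., mpiexec -n 2 python pingpong.py.

    # Minimal MPI ping-pong microbenchmark (illustrative sketch; mpi4py
    # and numpy are assumptions, not tools named in the article).
    import numpy as np
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()
    REPS = 100

    for size in (1, 1_024, 1_048_576):            # 1 B, 1 KiB, 1 MiB messages
        buf = np.zeros(size, dtype=np.uint8)
        comm.Barrier()                            # start both ranks together
        start = MPI.Wtime()
        for _ in range(REPS):
            if rank == 0:
                comm.Send(buf, dest=1, tag=0)     # ping
                comm.Recv(buf, source=1, tag=0)   # pong
            elif rank == 1:
                comm.Recv(buf, source=0, tag=0)
                comm.Send(buf, dest=0, tag=0)
        elapsed = MPI.Wtime() - start
        if rank == 0:
            one_way_s = elapsed / REPS / 2        # each rep is one round trip
            print(f"{size:>9} B: {one_way_s * 1e6:9.1f} us one-way, "
                  f"{size / one_way_s / 1e6:9.1f} MB/s")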

U.S. losing ground or just propaganda?

Monday, April 11th, 2005

An article posted by the San Francisco Chronicle claims that the US turned in its worst performance this year in the 29-year history of the annual Association for Computing Machinery International Collegiate Programming Contest. Is this further proof of a decline in science and math education in the US? Or is the poor US showing in this contest just that, a poor performance, rather than any real indicator of overall educational deficiency?

In the article, Georgia Institute of Technology Professor Jim Foley says it’s the former, proclaiming

the educational system has done a demonstrably poor job of (teaching) technical, scientific and computing.

Code optimization revisited

Tuesday, April 5th, 2005

A team at Carnegie Mellon University has developed SPIRAL, a code-generation tool for signal processing applications. In collaboration with IBM, the Carnegie Mellon team has used it to create a highly optimized FFT library for IBM’s Blue Gene/L. Touted as a tool that can bridge the gap between increasing computing power and slow, labor-intensive code optimization, the technology

…provides a broad range of different solutions to identify the best signal-processing and math functions for difficult computer implementations.

See LinuxElectrons.com for the article.
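The idea behind SPIRAL-style automation, in caricature: enumerate candidate implementations of a transform, time each one on the target machine, and keep the fastest. The toy Python sketch below shows only that search-and-select loop, with two hand-written candidates standing in for SPIRAL’s generated code (SPIRAL itself explores far richer algorithm spaces, and nothing here reflects its actual interfaces):

    # Toy autotuning loop: time candidate implementations of the same
    # transform and keep the fastest. Purely illustrative.
    import timeit
    import numpy as np

    def dft_naive(x):
        """Direct O(n^2) DFT from the definition."""
        n = len(x)
        k = np.arange(n)
        twiddle = np.exp(-2j * np.pi * np.outer(k, k) / n)
        return twiddle @ x

    def dft_library(x):
        """Library FFT, O(n log n)."""
        return np.fft.fft(x)

    candidates = {"naive matrix DFT": dft_naive, "numpy FFT": dft_library}
    x = np.random.rand(1024).astype(complex)

    # Best of 3 repeats of 10 calls each, per candidate.
    timings = {name: min(timeit.repeat(lambda f=f: f(x), number=10, repeat=3))
               for name, f in candidates.items()}
    for name, t in timings.items():
        print(f"{name:>16}: {t * 1e3:7.2f} ms for 10 runs")
    print(f"selected: {min(timings, key=timings.get)}")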

Times sees reduction in DARPA dollars

Tuesday, April 5th, 2005

“I’m worried and depressed,” said David Patterson, a computer scientist at the University of California, Berkeley, who is president of the Association for Computing Machinery, an industry and academic trade group. “I think there will be great technologies that won’t be there down the road when we need them.”

This little bit of sunshine comes from a Saturday New York Times article on DARPA’s shift away from open-ended basic computer science research and toward “more classified work and narrowly defined projects that promise a more immediate payoff.” NSF’s Peter Freeman and PITAC’s cybersecurity report put in appearances.

The money (literally) quote:

[DARPA officials] revealed that within a relatively steady budget for computer science research that rose slightly from $546 million in 2001 to $583 million last year, the portion going to university researchers has fallen from $214 million to $123 million.
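Put in percentage terms (quick arithmetic on the figures quoted, reading “last year” as 2004 given the article’s date), the university share of that budget fell from roughly 39% to roughly 21%:

    # Arithmetic on the budget figures quoted above (millions of dollars).
    total_2001, univ_2001 = 546, 214
    total_2004, univ_2004 = 583, 123   # "last year" in the April 2005 article

    print(f"university share, 2001: {univ_2001 / total_2001:.0%}")           # ~39%
    print(f"university share, 2004: {univ_2004 / total_2004:.0%}")           # ~21%
    print(f"decline in university dollars: {1 - univ_2004 / univ_2001:.0%}") # ~43%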

Spotlight on Dan Reed

Friday, February 25th, 2005

A very complimentary article on Dan Reed and his efforts to launch and grow the Renaissance Computing Institute appeared in today’s Chronicle of Higher Education. I especially like his vision for providing high-performance computing resources beyond the science and engineering communities, to artists and humanists, for example. Combine the unique, and somewhat different, creative abilities of a high-end technologist and an artist, and who knows what you’ll get? By the way, Dan is a contributor to the first issue of CTWatch Quarterly.

This article, “High-Tech Renaissance,” is available online at
http://chronicle.com/temp/email.php?id=169g4qip74htta5qlfpmq8a9hj2bxlbr

This article will be available to non-subscribers of The Chronicle for up to five days after it is e-mailed.

The article is always available to Chronicle subscribers at
http://chronicle.com/weekly/v51/i25/25a03301.htm
________________________________________________

The moderators and/or administrators of this weblog reserve the right to edit or delete ANY content that appears on the site. In other words, the moderators and administrators have complete discretion over the removal of any content deemed by them to be inappropriate, in full or in part.

Any opinions expressed on this site belong to their respective authors and are not necessarily shared by the sponsoring institutions or the National Science Foundation.

Any trademarks or trade names, registered or otherwise, that appear on this site are the property of their respective owners and, unless noted, do not represent endorsement by the editors, publishers, sponsoring institutions, the National Science Foundation, or any other member of the CTWatch team.

No guarantee is granted by CTWatch that information appearing in the Blog is complete or accurate. Information on this site is not intended for commercial purposes.