CTWatch
November 2006
High Productivity Computing Systems and the Path Towards Usable Petascale Computing
Susan Squires, Sun Microsystems Inc.
Michael L. Van De Vanter, Sun Microsystems Inc.
Lawrence G. Votta, Sun Microsystems Inc.

6. Conclusions

The three research stages and associated methods described in this paper have immense potential to increase understanding of software development, both in the HPC community and beyond. Research results to date include fundamental discoveries about productivity.30,31,32,33 These findings are grounded in empirically validated models that reflect the experience of practicing HPC professionals.

We are just beginning to understand how central the essential and intimate relationships among people, tools, and code are to the efforts of HPCS research: independent changes in each are unlikely to produce the dramatic 10x increase in software productivity that was envisioned by the founders of the DARPA HPCS program and that is desperately needed by the HPC community. Meeting that goal demands aligning those changes around a deep understanding of what makes software development productive: for machines, for individuals, for organizations, and for communities. As the technology historian Kingery noted, “... No one denies the importance of things, but learning from them requires rather more attention than reading texts.”34

Acknowledgments

We would like to thank our HPCS colleagues at Sun Microsystems and elsewhere in the HPC community for their helpful discussions and comments. This material is based upon work supported by DARPA under Contract No. NBCH3039002.
References
1Post, D.E., Votta, L.G. “Computational Science Demands a New Paradigm,” Physics Today, 58(1):35-41. 2005.
2Defense Advanced Research Project Agency (DARPA) Information Processing Technology Office, High Productivity Computing Systems (HPCS) Program - www.darpa.mil/ipto/programs/hpcs/ .
3Kitchenham, B., Pfleeger, S., Pickard, L., Jones, P., Hoaglin, D., El Emam, K., Rosenberg, J. “Preliminary Guidelines for Empirical Research in Software Engineering,” IEEE Transactions on Software Engineering, 28(8): 721-734. August 2002.
4Bernard, H. Handbook of Methods in Cultural Anthropology. Walnut Creek: AltaMira Press. 1999.
5Yin, R.K. Case Study Research: Design and Methods. Sage Publications, second edition, 1994.
6Card, D.N., Church, V.E., Agresti, W.W., "An Empirical Study of Software Design Practices," IEEE Transactions on Software Engineering, 1986. 12(2): 264-271.
7Müller, M.M. and Tichy, W.F. "Case Study: Extreme Programming in a University Environment," In Proceedings of 23rd International Conference on Software Engineering. May 12-19, 2001. pp. 537-544.
8Perry, D.E., Sim, S.E., and Easterbrook, S.M. "Case Studies for Software Engineers," In Proceedings of the 26th International Conference on Software Engineering (ICSE 2004). 2004. pp. 736-738.
9Seaman, C.B. and Basili, V.R. "An Empirical Study of Communication in Code Inspections," In Proceedings of 19th International Conference on Software Engineering. Boston, MA. May 17-23, 1997. p. 96-106.
10Carver, J., Hochstein, L., Kendall, R., Nakamura, T., Zelkowitz, M., Basili, V., Post, D. “Observations about Software Development for High End Computing,” CTWatch Quarterly, Volume 2, Number 4, November 2006 – www.ctwatch.org/quarterly/.
11Kendall, R., Carver, J., Mark, A., Post, D., Squires, S., Shaffer, D. “Case Study of the Hawk Code Project,” Los Alamos National Laboratory, Report LA-UR-05-9011, 2005.
12Kendall, R., Post, D., Squires, S., Halverson, C. “Case Study of the Condor Code Project,” Los Alamos National Laboratory Report LA-UR-05-9291, 2005.
13Kendall, R., Post, D., Squires, S., Carver, J. “Case Study of the Eagle Code Project,” Los Alamos National Laboratory Report LA-UR-06-1092.
14Post, D., Kendall, R., Whitney, “Case Study of the Falcon Code Project,” Proceedings Second International Workshop on Software Engineering for High Performance Computing System Applications, St. Louis, 15 May 2005.
15Beebe, J. “Basic Concepts and Techniques of Rapid Appraisal,” Human Organization, 54(1): 42-51. 1995.
16Trotter, R., Schensul, J. “Methods in Applied Anthropology,” in Handbook of Methods in Cultural Anthropology, H. Russell Bernard (ed.), Walnut Creek: AltaMira Press. 1999.
17Lahsen, M. “Seductive Simulations: Uncertainty Distributions around Climate Modeling,” Social Studies of Science 36(6): 895-992. December 2005.
18Gusterson, H. Nuclear Rites: A Weapons Laboratory at the End of the Cold War, University of California Press: Berkeley. 1996.
19Gusterson, H. People of the Bomb: Portraits of America's Nuclear Complex, University of Minnesota Press: Minneapolis. 2004.
20McNamara, L., Trucano, T. “So Why Do You Trust That Model? Some Thoughts on Modeling, Simulation, Social Science and Decision Making,” Advanced Concepts Group News and Views 8:2. Albuquerque, NM: Sandia National Laboratories. 2006.
21Shalin, V., Wales, R., “Shift Handovers in ISS Mission Control,” in Human Organizational Risk Management Workshop NASA-Ames April 25-27 2001.
22MacKenzie, D. “Missile Accuracy: A Case Study in the Social Processes of Technological Change,” in The Construction of Technological Systems, Wiebe Bijker, Thomas Hughes and Trevor Pinch (Eds) MIT Press: Cambridge MA. 1987.
23Sarkar, V., Williams, C., Ebcioglu, K. “Application Development Productivity Challenges for High-End Computing,” First Workshop on Productivity and Performance in High-End Computing, Madrid Spain, pp 14-18. February 2004.
24Barabasi, A.L. “Network Theory – the Emergence of the Creative Enterprise,” Science, 308(29): 639-650. April 2005.
25Watts, D., Strogatz, S. Interview in Discover. 1998.
26Borgatti, S., Everett, M.G. “Network Analysis of 2-mode Data,” Social Networks, 19(3): 243-269. 1997.
27Handwerker, W.P., Wozniak, D. “Sampling Strategies for the Collection of Anthropological Data: An Extension of Boaz’s Answer to Galton’s Problem,” Current Anthropology, 38(5): 869-875. 1997.
28Johnson, P., Paulding, M. “Understanding HPC Development through Automated Process and Product Measurement with HackyStat,” Second Workshop on Productivity and Performance in High-End Computing (P-PHEC), San Francisco, Feb. 13, 2005.
29Scriven, M. “Beyond Formative and Summative Evaluation,” In G. W. McLaughlin & D. C. Phillips (Eds.) Evaluation and Education: At Quarter Century. Chicago, IL: University of Chicago Press, pp. 19-64. 1991.
30Loh, E., Van De Vanter, M. L, Votta, L.G. “Can Software Engineering Solve the HPCS Problem?” Proceedings Second International Workshop on Software Engineering for High Performance Computing System Applications, St. Louis, 15 May 2005.
31Loh, E., Van De Vanter, M. L, Votta, L.G. “Can Software Engineering Solve the HPCS Problem?” Proceedings Second International Workshop on Software Engineering for High Performance Computing System Applications, St. Louis, 15 May 2005.
32Squires, S., Van De Vanter, M. L, Votta, L.G. “Yes, There Is an ‘Expertise Gap’ in HPC Application Development,” Third Workshop on Productivity and Performance in High-End Computing (P-PHEC), Austin, Feb. 12, 2006.
33Van De Vanter, M. L, Post, D.E, Zosel, M. “HPC Needs a Tool Strategy,” Proceedings Second International Workshop on Software Engineering for High Performance Computing System Applications, St. Louis, 15 May 2005.
34Kingery, W. (Ed), Editor’s Preface, Learning from Things: Method and Theory of Material Culture Studies. Washington, D.C.: Smithsonian Institution Press. 1996.


Reference this article
Squires, S., Van De Vanter, M. L., Votta, L. G. "Software Productivity Research In High Performance Computing," CTWatch Quarterly, Volume 2, Number 4, November 2006. http://www.ctwatch.org/quarterly/articles/2006/11/software-productivity-research-in-high-performance-computing/
