CTWatch
March 2008
Urgent Computing: Exploring Supercomputing's New Role
Suresh Marru, School of Informatics, Indiana University
Dennis Gannon, School of Informatics, Indiana University
Suman Nadella, Computation Institute, The University of Chicago
Pete Beckman, Mathematics and Computer Science Division, Argonne National Laboratory
Daniel B. Weber, Tinker Air Force Base
Keith A. Brewster, Center for Analysis and Prediction of Storms, University of Oklahoma
Kelvin K. Droegemeier, Center for Analysis and Prediction of Storms, University of Oklahoma


When isolated supercells were detected in the upper Midwest on June 7, LEAD developers helped scientists obtain quick turnaround using SPRUCE critical-priority queues on UC/ANL resources, preempting currently running jobs. The scientists subsequently analyzed the forecasts and compared the 20 UTC radar images for the HWT 2-km and 4-km forecasts (Figure 7). The LEAD on-demand forecasts show distinct differences from the other HWT numerical predictions (Figure 8), which used the previous day's 21Z SREF data for the ARW2 and ARW4, the corresponding resolution and initial condition for the ARW3, and the 15 UTC data and resolution for the LEAD-ADAS urgent computing workflow execution.
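The article does not show the SPRUCE interfaces themselves, but the preemption behavior described above can be sketched abstractly. In this hypothetical model (the function and urgency names are assumptions, not the actual SPRUCE API), an urgent-computing site decides how to treat an incoming job based on its urgency level and available resources:

```python
# Hypothetical sketch of preemption-capable urgent scheduling.
# The names and levels here are illustrative assumptions; SPRUCE's
# real interfaces differ.  The key idea matches the text: at the
# highest urgency, a site such as UC/ANL may preempt running jobs.

from dataclasses import dataclass

@dataclass
class Job:
    name: str
    urgency: str  # "critical", "high", or "normal" (assumed levels)

def scheduling_action(job: Job, free_nodes: int, needed_nodes: int) -> str:
    """Return the action a preemption-capable site might take."""
    if free_nodes >= needed_nodes:
        return "run-immediately"          # enough idle nodes, no conflict
    if job.urgency == "critical":
        return "preempt-running-jobs"     # reclaim nodes from lower-priority work
    if job.urgency == "high":
        return "next-to-run"              # jump the queue without preemption
    return "enqueue-normally"

forecast = Job("lead-wrf-2km", "critical")
print(scheduling_action(forecast, free_nodes=0, needed_nodes=256))
```

On a busy machine with no free nodes, only the critical urgency level triggers preemption; lower levels fall back to queue-priority boosts, which may not meet a severe-weather deadline.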

Based on a comparison of just the two LEAD forecasts, the ADAS-initialized forecast does a better job of handling the main line of convection during the period; in contrast, the NAM-initialized forecast is a little slow to initiate convection on that line in Iowa and produces less intense convection. However, the ADAS-initialized forecast produces some spurious convection early in the run that started in northeast Iowa and quickly moved northeast; the remnants of that convection can be seen in the Upper Peninsula of Michigan at 20 UTC. It is possible that the ADAS analysis left the net convective inhibition too weak in those areas for this case. At 00 UTC, both LEAD forecasts had a weak secondary boundary southeast of the main line, running from near Chicago across northern Illinois into northern Missouri. In the ADAS run this appears to be convection on an outflow boundary from the main line, whereas in the NAM-initialized run it seems to have developed on its own as a weak line. This one example shows that each method of initializing the model has its own unique characteristics; it is expected that, in time, the best qualities of each can be discerned and an intelligently constructed consensus will produce a forecast superior to what is currently available.
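The "intelligently constructed consensus" idea can be illustrated with a minimal sketch. This example is an assumption, not the project's method: it simply blends gridded forecast fields with weights that would, in practice, reflect each initialization's known strengths:

```python
# Illustrative sketch (assumed, not from the article): a weighted
# consensus of equally shaped 2-D forecast grids, e.g. simulated
# reflectivity from an ADAS-initialized and a NAM-initialized run.

def consensus(forecasts, weights):
    """Weighted mean of equally shaped 2-D grids (lists of rows)."""
    total = float(sum(weights))
    w = [wt / total for wt in weights]        # normalize the weights
    rows, cols = len(forecasts[0]), len(forecasts[0][0])
    return [[sum(w[m] * forecasts[m][r][c] for m in range(len(forecasts)))
             for c in range(cols)]
            for r in range(rows)]

# Toy 2x2 grids of reflectivity (dBZ); values are made up.
adas_run = [[40.0, 55.0], [10.0, 0.0]]
nam_run = [[30.0, 45.0], [20.0, 5.0]]
blend = consensus([adas_run, nam_run], weights=[0.6, 0.4])
print(blend)
```

Here a 60/40 weighting favors the ADAS-initialized member; once the characteristic biases of each initialization are quantified, such weights could vary by region or forecast hour.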

Figure 7

Figure 7. Comparison of four forecasts of radar composite valid at 20 UTC 07 June 2007: (top left) 3-km WRF ARW initialized at 00 UTC; (top right) 4-km WRF NMM initialized at 00 UTC; (bottom left) CAPS 2-km WRF initialized at 21 UTC 06 June 2007; (bottom right) 4-km WRF ARW SREF control forecast initialized at 00 UTC.

Figure 8

Figure 8. Comparison of four forecasts of radar composite valid at 20 UTC 07 June 2007: (top left) LEAD 2-km WRF initialized from the 3-h forecast of the 1200 UTC NAM; (top right) LEAD 2-km WRF initialized from the 15 UTC ADAS analysis; (bottom left) NSSL 4-km WRF initialized at 00 UTC; and (bottom right) observed radar composite at 2002 UTC.
Conclusion and Future Work

During spring 2007, the LEAD cyberinfrastructure, integrated with the SPRUCE urgent computing tools, demonstrated on-demand, dynamically adaptive forecasts: forecasts launched at the discretion of forecasters over regions of expected hazardous weather, as determined by severe weather watches and mesoscale discussions at the NOAA Storm Prediction Center. This collaboration was successful and used preemption capabilities on UC/ANL TeraGrid resources to meet the deadlines for critical runs.

For the 2008 Hazardous Weather Testbed, we plan to repeat the 2007 experiment, adding 3-6 hours to the length of each on-demand forecast to cover the evening active-thunderstorm period as well as the afternoon. Additionally, we will study the processes by which forecasters determine when and where to (manually) launch on-demand forecasts. We will also continue to evaluate the tradeoffs between varying versus persistent model configurations. We strongly believe that through urgent computing, the community can test and explore new ways to use applications and resources in critical situations.
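The deadline constraint that motivates urgent computing can be stated as a simple inequality: a forecast is useful only if queue wait plus runtime fits before the weather it predicts. The following sketch is an assumption about how a workflow might decide to request elevated priority; it is not the project's actual decision logic:

```python
# Hedged sketch (assumed logic): decide whether an on-demand forecast
# needs elevated urgent-computing priority.  If the expected wait in a
# normal queue plus the model runtime would overshoot the deadline,
# the workflow would request urgent (possibly preemptive) priority.

def needs_urgent_priority(expected_wait_min, runtime_min, minutes_to_deadline):
    """True when a normal-queue submission would miss the deadline."""
    return expected_wait_min + runtime_min > minutes_to_deadline

# A 2-hour WRF run facing a 90-minute queue wait, with the storm period
# starting in 3 hours: the normal queue would deliver the forecast too late.
print(needs_urgent_priority(90, 120, 180))
```

Lengthening each forecast by 3-6 hours, as planned for 2008, tightens this inequality further, since longer runs leave less slack between submission and the deadline.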

References
1. Droegemeier, K. K. et al. (20 other authors), "Linked Environments for Atmospheric Discovery (LEAD): A cyberinfrastructure for mesoscale meteorology research and education," in Proc. 20th Conf. Interactive Information Processing Systems for Meteorology, Oceanography, and Hydrology, Am. Meteorological Soc., 2004.
2. LEAD - leadproject.org/
3. Michalakes, J., Chen, S., Dudhia, J., Hart, L., Klemp, J., Middlecoff, J., Skamarock, W. "Development of a next-generation regional weather research and forecast model," in Ninth ECMWF Workshop on the Use of Parallel Processors in Meteorology, Reading, U.K., November 2000. Argonne National Laboratory preprint ANL/MCS-P868-0101, 2001.
4. TeraGrid - www.teragrid.org/
5. Kandaswamy, G., Fang, L., Huang, Y., Shirasuna, S., Marru, S., Gannon, D. "Building Web services for scientific Grid applications," IBM Journal of Research and Development, 50(2/3):249-260, 2006.
6. Foster, I., Kesselman, C. "Globus: A metacomputing infrastructure toolkit," IJSA, 11(2):115-128, 1997.
7. Slominski, A. "Adapting BPEL to scientific workflows," Chapter 14 in Workflows for e-Science, I. J. Taylor, E. Deelman, D. Gannon, and M. Shields, eds. Springer, 2007.
8. Andrews, T. et al. "Business Process Execution Language for Web Services version 1.1," online, 5 May 2003. www6.software.ibm.com/software/developer/library/ws-bpel.pdf.
9. CASA - www.casa.umass.edu/
10. Zink, M., Westbrook, D., Abdallah, S., Horling, B., Lakamraju, V., Lyons, E., Manfredi, V., Kurose, J., Hondl, K. "Meteorological command and control: An end-to-end architecture for a hazardous weather detection sensor network," pp. 37-42 in Proc. 2005 Workshop on End-To-End, Sense-and-Respond Systems, Applications and Services, International Conference on Mobile Systems, Applications and Services. USENIX Association, Berkeley, 2005.
11. Plale, B., Gannon, D., Brotzge, J., Droegemeier, K., Kurose, J., McLaughlin, D., Wilhelmson, R., Graves, S., Ramamurthy, M., Clark, R. D., Yalda, S., Reed, D. A., Joseph, E., Chandrasekar, V. "CASA and LEAD: Adaptive cyberinfrastructure for real-time multiscale weather forecasting," special issue on system-level science, Computer, 39(11):56-63, 2006.
12. SPRUCE Science Gateway - spruce.teragrid.org/
13. Telecommunications Service Priority (TSP) program - tsp.ncs.gov/
14. Beckman, P., Beschastnikh, I., Nadella, S., Trebon, N. "Building an infrastructure for urgent computing," in High Performance Computing and Grids in Action. IOS Press, Amsterdam, 2007.
15. PBS 'qsub' job submission tool - www.clusterresources.com/torquedocs21/commands/qsub.shtml
16. NOAA HWT - www.nssl.noaa.gov/hwt/
17. Weiss, S. J., Kain, J. S., Bright, D. R., Levit, J. J., Carbin, G. W., Pyle, M. E., Janjic, Z. I., Ferrier, B. S., Du, J., Weisman, M. L., Xue, M. "The NOAA Hazardous Weather Testbed: Collaborative testing of ensemble and convection-allowing WRF models and subsequent transfer to operations at the Storm Prediction Center," in Proc. 22nd Conf. Wea. Anal. Forecasting/18th Conf. Num. Wea. Pred., Salt Lake City, Utah, Amer. Meteor. Soc., CDROM 6B.4, 2007.
18. Brewster, K. A., Weber, D. B., Thomas, K. W., Droegemeier, K. K., Wang, Y., Xue, M., Marru, S., Gannon, D., Alameda, J., Jewett, B. F., Kain, J. S., Weiss, S. J., Christie, M. "Use of the LEAD portal for on-demand severe weather prediction," Sixth Conference on Artificial Intelligence Applications to Environmental Science, 88th Annual Meeting of the American Meteorological Society, New Orleans, 2008.
19. Xue, M., Wang, D., Gao, J., Brewster, K., Droegemeier, K. K. "The Advanced Regional Prediction System (ARPS) – storm-scale numerical weather prediction and data assimilation," Meteor. Atmos. Physics, 82:139-170, 2003.


Reference this article
Marru, S., Gannon, D., Nadella, S., Beckman, P., Weber, D. B., Brewster, K. A., Droegemeier, K. K. "LEAD Cyberinfrastructure to Track Real-Time Storms Using SPRUCE Urgent Computing," CTWatch Quarterly, Volume 4, Number 1, March 2008. http://www.ctwatch.org/quarterly/articles/2008/03/lead-cyberinfrastructure-to-track-real-time-storms-using-spruce-urgent-computing/
