Questions about run time, nesting


Postby mnotaro » Mon Dec 27, 2010 5:10 pm

I am interested in running WRF to examine modern and future climate over Oklahoma and north Texas,
both as 20-year simulations forced by an IPCC AR4 global model. I have experience with other RCMs but wish to expand into WRF, so any advice on an experiment design would be appreciated.

My understanding is that the ratio of parent to child nested domain should be 3:1, right?
So if I wish to perform a non-hydrostatic simulation at about 6 km grid spacing in the innermost domain,
I could use a roughly 160-km GCM to force WRF through three nested domains of 53 km, 17 km, and 6 km, right?
The 6-km domain would cover Oklahoma and N Texas (the outermost domain would probably cover
the middle half of the country). Does this sound practical?
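Just to show my arithmetic on those nest sizes, here is the quick back-of-envelope check I did (my own sketch, not output from any WRF tool; WRF itself takes an integer parent_grid_ratio per nest in the namelist):

```python
# Successive 3:1 nests starting from a ~160 km GCM forcing grid.
gcm_dx = 160.0  # km, approximate GCM grid spacing
ratio = 3       # recommended parent-to-child ratio

nest_dx = [gcm_dx / ratio ** n for n in range(1, 4)]
print([round(dx, 1) for dx in nest_dx])  # → [53.3, 17.8, 5.9]
```

So three 3:1 nests land almost exactly on the 53 / 17 / 6 km spacings I mentioned.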

Any idea how long it would take to run one simulation year on such a grid (even a rough estimate)?
I want to be sure I am not proposing a simulation that would take forever and be impractical.
I am interested in weather extremes so I wish to run at a high spatial resolution.

Thanks,
Michael

Re: Questions about run time, nesting

Postby jimmyc » Wed Dec 29, 2010 2:55 am

Hi Michael-

It's hard to estimate run time for a nested configuration, since it depends on exactly how big those inner domains are (and, of course, on how many vertical levels you want to use).

A CONUS 50 km WRF run for NARCCAP on 64 processors used about 130 x 155 grid points with 35 vertical levels. I think it cost about 6 hours of wall-clock time per month of simulation. For those simulations we used the 2.5-degree reanalysis. However, I don't think you have to use the outer 50 km grid. Trapp and colleagues at Purdue are using the global data and going straight to 4 km grid spacing (with re-initialization every 24 hours to avoid drift).
As far as size goes, extending the outermost domain has some benefits for severe-storm environments, since you can't really add detail via the LBCs over the mountains, and the mountains are a major player in the base-state climate of the Plains. In addition, because of the domain decomposition, you may want a larger outer domain to increase the processor count and memory available for the inner domains.

The bottom line is to benchmark a simulation like this for cost before committing to the full 20 years. Of course, with all the new physics options, costs may vary substantially.
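To make that benchmarking point concrete, here is a very crude scaling sketch off the 50 km NARCCAP numbers above. Everything in it is an assumption: it supposes cost scales with horizontal grid points times the number of time steps, that the time step shrinks linearly with grid spacing (CFL), and it uses a purely hypothetical 200 x 200 inner domain. Physics, nesting overhead, and I/O are all ignored, so treat it as a floor, not an estimate.

```python
# Rough cost scaling from the 50 km benchmark (130 x 155 points,
# ~6 wall-clock hours per simulated month on 64 processors).
def relative_cost(nx, ny, dx_km, ref_nx=130, ref_ny=155, ref_dx=50.0):
    """Cost of one simulated month relative to the 50 km benchmark,
    assuming cost ~ (grid points) x (time steps), dt ~ dx."""
    points = (nx * ny) / (ref_nx * ref_ny)
    steps = ref_dx / dx_km  # finer grid -> proportionally more steps
    return points * steps

# Hypothetical 200 x 200 inner domain at 6 km:
print(round(relative_cost(200, 200, 6.0), 1))  # → 16.5
```

So even this toy scaling puts a modest 6 km domain at well over an order of magnitude above the 50 km run per simulated month, before physics costs, which is why a short benchmark run is worth doing first.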

WRF is now very well set up for climate output (accumulated fluxes for the radiation components, heat fluxes, precipitation, etc.), and I believe the to-be-released version will have climate physics schemes from CAM and CCSM.
The views expressed in this message do not necessarily reflect those of NOAA or the National Weather Service or the University of Oklahoma.
James Correia, Jr

