Learning the WRF Model

The WRF (Weather Research and Forecasting) model is a mesoscale numerical weather prediction system designed for both atmospheric research and operational forecasting applications. It features two dynamical cores, ARW (Advanced Research WRF) and NMM (Nonhydrostatic Mesoscale Model), a data assimilation system, and a software architecture supporting parallel computation and system extensibility. The model serves a wide range of meteorological applications across scales from tens of meters to thousands of kilometers. WRF can produce simulations based on actual atmospheric conditions (i.e., from observations and analyses) or on idealized conditions, and it offers a flexible and computationally efficient platform for operational forecasting.

Official website: https://www.mmm.ucar.edu/models/wrf

WRF source code and graphics software downloads:

https://www2.mmm.ucar.edu/wrf/users/download/get_source.html

Online tutorial:

ARW OnLine Tutorial Introduction (ucar.edu)

Enter the WRF directory:

cd WRF

Overview of the WRF directory contents:

• The README file contains WRF version information and instructions for installing and running the model.

Source code directories:

dyn_em/:  Directory for ARW dynamics and numerics
dyn_nmm/:  Directory for NMM dynamics and numerics, which is no longer developed or supported
dyn_exp/:  Directory for a 'toy' dynamic core
external/:  Directory containing external packages, such as those for IO, time-keeping, and MPI
frame/:  Directory containing modules for the WRF framework
inc/:  Directory containing 'include' files
main/:  Directory for main routines, such as wrf.F, and all executables after compilation
phys/:  Directory for all physics modules
share/:  Directory containing mostly modules for WRF mediation layer and WRF I/O
tools/:  Directory containing tools for developers

Scripts:

clean: Script to clean created files and executables
compile: Script for compiling the WRF code
configure: Script to create the configure.wrf file, which prepares for compilation

• Makefile: Top-level makefile
• Registry/: Directory for WRF Registry files
• arch/: Directory where compile options are gathered
• run/: Directory where one may run WRF
• test/: Directory containing several test case directories; may also be used to run WRF

Compiling WRF

You will first need to create a configuration file for your computer. This file will determine the type of compilation (i.e., parallel or serial), and what compiler to use.

 


Before you continue, ensure that your computer environment is set up correctly.
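As a quick sanity check (the commands below assume a csh-family shell and GNU/MPI compilers; substitute your own compiler names, e.g., ifort/mpiifort), you can verify that your compilers are on your path and that the netCDF environment variable points to a real installation:

which gfortran
which mpif90
echo $NETCDF
ls $NETCDF/lib $NETCDF/include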

Now that you have set up your environment, type:

./configure

You will be given a list of choices for your computer. You must choose the compiler options best suited for your needs, and then you need to choose whether you plan to build in parallel (distributed memory [dmpar], OpenMP shared memory [smpar], or a combination of shared memory and distributed memory [dm+sm]), or serially. Below is an example from a Linux system (*Note: dmpar is the most highly tested and recommended mode for compiling in parallel):

 

 

Never compiled WRF before?

Pick a serial option, as this will reduce the number of possible problems you run into. Once you are proficient in compiling and running WRF, you can re-configure WRF for a more complex environment that requires a parallel build.

NCAR – Cheyenne Users

Please see  Notes for Using Cheyenne.

For 2D Idealized Cases

Always pick a single-threaded, no-nesting option.

serial: single processor
smpar: shared memory option (OpenMP)
dmpar: distributed memory option (MPI) **Recommended for parallel**
dm+sm: distributed memory with shared memory (for example, MPI across nodes with OpenMP within a node); usually better performance is achieved with dmpar alone



 

Once you have made your choice, you will be asked the type of nesting run you are interested in:

Compile for nesting? (0=no nesting, 1=basic, 2=preset moves, 3=vortex-following) [default 0]:



 

The most common option is "basic" (1).
(2) is used for a moving nest in which the user specifies the nest movement using namelist variables - Note: This was originally developed for testing purposes and is possible to use, but is very tedious, as you must specify every move.
(3) is primarily used for moving nests following a hurricane vortex, and the model automatically moves the nest
To read more about moving nests, click here.




 

The above will create a 'configure.wrf' file. If necessary, you can edit this file, and then save it before compiling. Click here to see an example of a configure.wrf file that was created on NCAR's Cheyenne supercomputer. This file will differ from platform to platform, and from version to version.
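The settings most often adjusted in configure.wrf are the MPI compiler wrappers and the optimization/debug flags. The lines below are only illustrative, taken from a typical GNU (gfortran) dmpar build; the exact variable values in your configure.wrf will differ by platform and version:

DM_FC   = mpif90 -f90=$(SFC)
DM_CC   = mpicc -cc=$(SCC)
FCOPTIM = -O2 -ftree-vectorize -funroll-loops
FCDEBUG = # -g $(FCNOOPT)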

Compiling WRF for Real Data Cases


 

On some computers, it is necessary to set the following environment variable before compiling (if you are unsure, go ahead and set it, as it does not hurt to do so):
setenv WRF_EM_CORE 1
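(setenv is csh/tcsh syntax; if you use a bash-family shell, the equivalent is:)

export WRF_EM_CORE=1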


 

Type:

./compile

You will be given the following choices:

Usage:

compile [-d] [-j n] wrf      compile wrf in run dir (NOTE: no real.exe, ndown.exe, or ideal.exe generated)

or choose a test case (see README_test_cases for details) :
compile em_b_wave
compile em_convrad
compile em_esmf_exp
compile em_fire
compile em_grav2d_x
compile em_heldsuarez
compile em_hill2d_x
compile em_les
compile em_quarter_ss
compile em_real
compile em_scm_xy
compile em_seabreeze2d_x
compile em_squall2d_x
compile em_squall2d_y
compile em_tropical_cyclone
compile exp_real
compile nmm_real
compile nmm_tropical_cyclone

compile -d : to compile without optimization and with debugging
compile -j n : parallel make using n tasks if supported (default 2)
compile -h : help message

Since we are compiling a WRF ARW real data case, we are going to choose the "em_real" option (make sure to send the standard error and output to a log file as indicated below with the use of ">&". If anything goes wrong with the compilation, this file will be necessary for troubleshooting):

./compile em_real >& log.compile

If your compilation was successful, you should see the following executables in the WRF/main/ directory:

ndown.exe: Used for one-way nesting
tc.exe: TC Bogusing for adding or removing a tropical cyclone
real.exe: WRF initialization for real data cases
wrf.exe: WRF model integration

Make sure the executables are not of zero size!
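A quick way to confirm this, from the top-level WRF directory, is:

ls -ls main/*.exe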

These executables will be linked from the main/ directory to the directories run/ and test/em_real/. You can choose to run the code from either of these directories.




 

 

The executables were not created

Check the log.compile file for any errors. The file is large, so you will want to search for the word 'Error' (with a capital 'E'; we are not concerned about errors with a lower-case 'e'). Click here to see an example compile log that was created on the NCAR Cheyenne supercomputer for a build with ifort. This file will vary among platforms, compilers, and versions.
Most common errors are:
  • Incorrect netCDF version (e.g., netCDF is compiled with PGI, but you are compiling the code with Intel compilers).
  • Required libraries are not installed, or not installed correctly.
  • Paths to libraries not found.
Typically when you are unable to compile WRF, it is related to a problem with your environment, compiler(s), and/or libraries. We have a webpage that will walk you through the steps of verifying that your environment is set up properly and testing that each of the libraries is built correctly and is compatible with your compilers. You can find that web page here. If you are working on a large cluster, you may need to seek help from a systems administrator at your institution to get the environment set up correctly.

If you need to correct a problem in the configure.wrf file, be sure to:
./clean -a
./configure
then make the change to configure.wrf, and save it, before attempting to compile the code again.




 

If you need to obtain help or support regarding a compiling problem

Repeat what you have done one more time:
./clean -a
./configure
./compile em_real >& log.compile
Then post a topic to the  WRF/MPAS Support Forum, and be sure to attach the log.compile file, along with your configure.wrf file, and computer/compiler information.




 

If WRF was successfully compiled, you are now ready to compile WPS.

Compiling WRF for Idealized Cases


 

On some computers (e.g., some Intel machines), it is necessary to set the following environment variable before compiling:
setenv WRF_EM_CORE 1


 

Type:

./compile

You will be given the following choices:

Usage:

compile [-d] [-j n] wrf compile wrf in run dir (NOTE: no real.exe, ndown.exe, or ideal.exe generated)

or choose a test case (see README_test_cases for details) :
compile em_b_wave
compile em_convrad
compile em_esmf_exp
compile em_fire
compile em_grav2d_x
compile em_heldsuarez
compile em_hill2d_x
compile em_les
compile em_quarter_ss
compile em_real
compile em_scm_xy
compile em_seabreeze2d_x
compile em_squall2d_x
compile em_squall2d_y
compile em_tropical_cyclone
compile exp_real
compile nmm_real
compile nmm_tropical_cyclone

compile -d compile without optimization and with debugging
compile -j n parallel make using n tasks if supported (default 2)
compile -h help message

Pick the idealized case you want to run (e.g., the baroclinic wave case), and compile the code for this case:

./compile em_b_wave >& log.compile

If your compilation was successful, you should see these executables in the WRF/main/ directory:

WRF/main/ideal.exe: WRF initialization for idealized data cases
WRF/main/wrf.exe: WRF model integration

These executables will be linked from the WRF/main/ directory, to the directories run/ and test/em_your_case (e.g., for the baroclinic wave case, the executables will be linked to the directory test/em_b_wave, which is where you will be running the code).


 

If you change cases, remember that recompiling a new case will overwrite the 'ideal.exe' executable that you are now using. If you would like to keep the old 'ideal.exe,' move or rename it. Since 'ideal.exe' is linked from the WRF/main/ directory into the directory you are using (e.g., test/em_b_wave), you can simply remove the link and copy 'ideal.exe' into that test case directory, so that it does not get overwritten when compiling a new case (see the example below).
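For example, assuming the baroclinic wave case (adjust the directory name for your own case):

cd test/em_b_wave
rm ideal.exe                # remove the symbolic link to main/ideal.exe
cp ../../main/ideal.exe .   # keep a real copy that a later compile will not overwrite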





 

 

The executables were not created

Check the log.compile file for any errors. The file is large, so you may want to search for the word 'Error' (with a capital 'E'; we are not concerned about errors with a lower-case 'e').
Most common errors are:
  • Incorrect netCDF version (e.g., netCDF is compiled with PGI, but you are compiling the code with Intel compilers).
  • Required libraries are not installed, or not installed correctly.
  • Paths to libraries not found.
Typically when you are unable to compile WRF, it is related to a problem with your environment, compiler(s), and/or libraries. We have a webpage that will walk you through the steps of verifying that your environment is set up properly and testing that each of the libraries is built correctly and is compatible with your compilers. You can find that web page here.

If you need to correct a problem in the configure.wrf file, be sure to:
./clean -a
./configure
then make the change to configure.wrf, and save it, before attempting to compile the code again.




 

If you need to report a problem to wrfhelp (wrfhelp at ucar dot edu)

Repeat what you have done one more time:
./clean -a
./configure
./compile em_your_case >& log.compile  (e.g., ./compile em_b_wave >& log.compile)
Then send the log.compile file, together with your computer/compiler information to wrfhelp.


 

You are now ready to run the WRF ARW model for your chosen idealized case.

Basics - WRF for Idealized Cases

The WRF model has 2 steps:

ideal.exe
Sets up initial conditions needed to run wrf.
Note: Not all idealized cases require a lateral boundary file because of the boundary condition choices they use, such as the periodic boundary condition option.
 
wrf.exe
Generates the model forecast.

STEPS to Run WRF

1. After you have compiled the case of interest (remember to recompile the code if you change cases),
    move to the directory in which you plan to run the code (either test/em_xxxxxx/ or run/).
    Both the directories run/ and test/em_xxxxxx/ will have all the files you need linked in,
    so it does not matter which one you choose to run the code in.

 

2. If there is a run_me_first.csh script in the directory - RUN IT (this will link in extra data files needed during the run).

./run_me_first.csh
 

3. Edit the namelist.input file.
    Use the default option if you have never run an idealized case before.
    For detailed explanations of the namelist parameters, as well as some recommendations for best practices, see the Best Practice WRF Namelist page

 

4. Run ideal.exe (example invocations for this step and for step 5 are shown after these steps)

- ideal.exe generally cannot be run in parallel. For parallel compiles, run this on a single processor.
- The exception is the quarter_ss case, which can now be run with MPI.

Verify that the program ran correctly
- Check that the file wrfinput_d01 was generated.
- Idealized cases do not require a lateral boundary file because of the boundary condition choices they use, such as the periodic boundary condition option.


5. Run wrf.exe

- Two-dimensional ideal cases cannot be run in MPI parallel. OpenMP is fine.
- Three-dimensional cases can be run with MPI

Verify that the program ran correctly
- After successful completion, you should see wrfout_d01_0001-01-01* files


6. Use a post-processing tool to view the output.
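Example invocations for steps 4 and 5 (the launcher name, e.g., mpirun or mpiexec, and the task counts are illustrative and depend on your system):

For a serial build:

./ideal.exe >& ideal.log
./wrf.exe >& wrf.log

For an MPI build (ideal.exe must still run on a single task, except for the quarter_ss case; 2D cases must not use MPI for wrf.exe):

mpirun -np 1 ./ideal.exe
mpirun -np 4 ./wrf.exe >& wrf.log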
 
 

Available Cases

compile em_b_wave
compile em_convrad
compile em_fire
compile em_grav2d_x
compile em_heldsuarez
compile em_hill2d_x
compile em_les
compile em_quarter_ss
compile em_scm_xy
compile em_seabreeze2d_x
compile em_squall2d_x
compile em_squall2d_y
compile em_tropical_cyclone

WRF Preprocessing System (WPS)

The next step for real data cases is to compile WPS.
You should not be here if you have not already SUCCESSFULLY compiled WRF

 

Examine the WPS Source Code

Move into the WPS directory that you created:

cd WPS
Note: If you are still in the WRF/ directory, this should be one directory up from where you are (i.e., cd ../WPS).

Inside this directory, you will find a number of files and directories. Below are descriptions of some of the files:

– The README files contain useful information about the code and how to set up and run the model.

– Source code directories:

geogrid/: Directory containing code to create the static data
metgrid/: Directory containing code to create input to WRF
ungrib/: Directory containing code to unpack GRIB data
util/: Directory containing utilities

– Scripts:

clean: Script to clean created files and executables
compile: Script for compiling the WPS code
configure: Script to create the 'configure.wps' file, which configures the environment and prepares for compiling
link_grib.csh: Script to link GRIB files to the WPS directory

– Others:

arch/: Directory where compile options are gathered
namelist.wps: WPS namelist that will be used for running 'geogrid.exe,' 'ungrib.exe,' and 'metgrid.exe'
namelist.wps-all_options: A reference that contains all additional options you can use in your namelist.wps file

Environment Variable - netCDF

This has likely already been set prior to compiling WRF; however, you can check by issuing (e.g., with csh):

echo $NETCDF
 

If it is not set, issue the following command (typically the netCDF libraries are located in /usr/local/netcdf, but this may vary between systems - check to make sure):

setenv NETCDF /usr/local/netcdf
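(For bash-family shells, the equivalent, assuming the same installation path, is:)

export NETCDF=/usr/local/netcdf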



 

Add this environment variable to your .cshrc, .login, or .profile file. This will ensure that it is always set correctly, and that you do not have to reset it each time that you log in.


 

LINUX-environment Users

Ensure that your netCDF libraries were compiled with the same compiler that you are going to use to compile WRF (e.g., if you are compiling WRF with a PGI compiler, your netCDF libraries must also be compiled with PGI).

NCAR – Cheyenne Users

On NCAR's Cheyenne system, netCDF is installed in:

/glade/u/apps/ch/opt/netcdf/version/compiler/version_of_compiler

where 'version' is the version of netCDF you currently have loaded, 'compiler' is the compiler you currently have loaded, and 'version_of_compiler' is the version of the currently loaded compiler (e.g., /4.4.1.1/intel/17.0.1).

Configuring WPS

This step will create a configure file for your particular computer. The code has been ported to a wide variety of popular platforms.

Type

./configure

You will be given a list of choices for your computer. These choices range from compiling for a single processor job, to using distributed-memory parallelization options for multiple processors. For example, the choices for a Linux computer may look similar to this (from NCAR's Cheyenne machine):

NCAR – Cheyenne Users

Please see  NCAR Notes for Cheyenne.
 

In General

If you plan to use GRIB2 data (or think you may use it in the future), always pick a configure option that allows for GRIB2.

It is only necessary to compile WPS with parallelization (dmpar, smpar, or dm+sm) if you are planning to use a very large domain (thousands of grid cells in the east and west directions). Otherwise, a serial option should be fine,  and is recommended, regardless of whether you compiled WRF in parallel.


 

Have GRIB2 Data?

You will need the following libraries: JasPer, libPNG, and Zlib.
Download a tar file containing all 3 libraries  here, or you can obtain them from our  compiling tutorial page, with detailed instructions on how to install the libraries.

Zlib may already be on your computer, so check before installing this library. JasPer and PNG are compression libraries needed to ungrib GRIB version 2 data.
 

You will see a configure.wps file created after configuring. If necessary, you may edit compile options/paths in this file.



 

WRF_DIR Path

If your 'WRF' path is not ../WRF, you will need to edit the  configure.wps file and set the correct path to your 'WRF' directory.


 

After typing './configure,' no options are listed, or the options are not for your platform

This will happen if the code has not been ported to your platform. You will need to  add compilation options for your computer.
**NOTE: This should only be attempted by advanced users.**

Compile WPS

To compile WPS, issue the following command (make sure to send your standard error and output to a log file by utilizing the ">&" as shown below. If your compilation fails, you will need this file to troubleshoot):

./compile >& compile.log
If your compilation was successful, you should see these executables created (make sure they are not of size zero):

geogrid.exe -> geogrid/src/geogrid.exe: Generates static data
ungrib.exe -> ungrib/src/ungrib.exe: Unpacks GRIB data and changes to intermediate format
metgrid.exe -> metgrid/src/metgrid.exe: Generates input data for WRF

If you do not see these files, check your compile.log file for any errors (search for the word 'Error,' with a capital 'E').

A few utilities will also be linked under the util/ directory:
 

avg_tsfc.exe: Computes daily mean surface temperature from intermediate files. Recommended for use with the 5-layer soil model (sf_surface_physics = 1) in WRF
g1print.exe: Lists the contents of a GRIB1 file
g2print.exe: Lists the contents of a GRIB2 file
mod_levs.exe: Removes superfluous levels from the 3D fields in intermediate files
rd_intermediate.exe: Reads intermediate files
calc_ecmwf_p.exe: Creates fields that have 3D pressure and geopotential height on the same levels as the other atmospheric fields, for use with ECMWF sigma-level data sets
height_ukmo.exe: Computes a geopotential height field for data sets from the UKMO Unified Model
plotgrids.ncl: Creates an image of your domain
int2nc.ncl: Creates a netCDF format file from an intermediate format file

 

*Note:
Programs plotgrids.exe and plotfmt.exe are no longer automatically compiled with WPS. The reason is that, instead of plotgrids.exe, there is an NCL script, plotgrids.ncl, which serves the exact same purpose. plotfmt.exe is only useful for new data types (that we do not support). If you still wish to compile these programs, you can type (from the WPS directory):

./compile utility_of_your_choice >& log.utility_of_your_choice
If you have NCL libraries installed, before compiling, it may be useful to type 'ncargf90'. This will tell you the correct libraries to use if you would like to compile these 2 programs. Then you will need to edit the following line in your configure.wps file to reflect those libraries (and then save the configure.wps file before compiling).
NCARG_LIBS =
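For example, with NCAR Graphics installed under $NCARG_ROOT, the setting might look similar to the following; this is only an illustrative guess, and the correct library list should be taken from the output of 'ncargf90' on your own system:

NCARG_LIBS = -L$(NCARG_ROOT)/lib -lncarg -lncarg_gks -lncarg_c -L/usr/lib64 -lX11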



 

Utilities

Detailed explanations of the WPS utility programs are available from Chapter 3 of the  WRF Users' Guide.


 

NCAR Graphics

If you don't have  NCAR Graphics on your system, it is not a problem, but since it is a very handy, free plotting program, we recommend installing it at some point.


 

The executables were not created

Check the compile.log file carefully for any errors (do a search for 'Error' with a capital 'E').

Most common errors are:
– Incorrect netCDF version (e.g., netCDF compiled with PGI, but you are compiling the code with Intel).
– Paths to libraries not found.
– Required libraries not installed, or not installed correctly.
– You configured for GRIB2 data, but do not have the extra libraries (JasPer, PNG, and Zlib) on your system, or they have been installed incorrectly. See  this page for help getting those libraries installed correctly. (Zlib may already be on your computer, so check before installing this library. JasPer and PNG are compression libraries needed to ungrib GRIB version 2 data).

Typically when you are unable to compile WPS, it is related to a problem with your environment, compiler(s), and/or libraries. We have a webpage that will walk you through the steps of verifying that your environment is set up properly and testing that each of the libraries is built correctly and is compatible with your compilers. You can find that web page here.

If you correct any problems in the configure.wps file, be sure to
./clean -a
and reconfigure before you attempt to compile the code again.



 

If your executables were created correctly, you are now ready to run the WRF ARW model.

Basics for Running the Model

Below is a description of the program flow during a typical model run:

 



 

WPS

geogrid.exe creates terrestrial data from static geographic data that is obtained from an external data source (this is available to download, and will be discussed in the case studies).
ungrib.exe unpacks GRIB meteorological data (that is obtained from an external source) and packs it into an intermediate file format.
metgrid.exe interpolates the meteorological data horizontally onto your model domain. Output from metgrid.exe is used as input to WRF (through the real.exe program).

 

WRF

real.exe vertically interpolates the data onto the model coordinates.
wrf.exe generates the model forecast.



 

Detailed information on all components is available in the WRF Users' Guide. Peruse Chapter 3 for details on WPS, and Chapter 5 for details on WRF.


 

Should I run geogrid or ungrib first?

It does not matter in which order these 2 programs are run, as they are independent of each other. Additionally, re-running one does not require that you must re-run the other.



 

Before we run a specific case, let's first look at some details regarding each of the individual components.

Basics: Geogrid

The purpose of GEOGRID is to define the simulation domain(s), and interpolate various terrestrial data sets to the model grids.

The simulation domain(s) are defined using information specified by the user in the "share" and "geogrid" sections of the WPS namelist. Please see our Best Practices WPS Namelist page for detailed explanations of these and other namelist.wps parameters, in addition to recommended practices.

In namelist.wps, multiple columns are used for multiple domains. If you use 3 columns, but set max_dom to 2, the last column will be ignored. NOTE: Not all parameters have multiple columns, e.g., dx and dy.
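A minimal two-domain sketch of these sections (all values are illustrative; replace the dates, grid dimensions, projection parameters, and geog_data_path with your own):

&share
 wrf_core = 'ARW',
 max_dom = 2,
 start_date = '2019-09-04_12:00:00','2019-09-04_12:00:00',
 end_date   = '2019-09-06_00:00:00','2019-09-06_00:00:00',
 interval_seconds = 21600,
/

&geogrid
 parent_id         = 1,1,
 parent_grid_ratio = 1,3,
 i_parent_start    = 1,53,
 j_parent_start    = 1,25,
 e_we = 150,220,
 e_sn = 130,214,
 geog_data_res = 'default','default',
 dx = 15000,
 dy = 15000,
 map_proj  = 'lambert',
 ref_lat   = 33.0,
 ref_lon   = -79.0,
 truelat1  = 30.0,
 truelat2  = 60.0,
 stand_lon = -79.0,
 geog_data_path = '/path/to/WPS_GEOG'
/

Note that dx, dy, and the map projection parameters take a single value (for the outermost domain), while most other parameters take one column per domain.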

By default, in addition to computing latitude and longitude for every grid point, geogrid will interpolate soil categories, land use category, terrain height, annual mean deep-soil temperature, monthly vegetation fraction, monthly albedo, maximum snow albedo, and slope category to the model grids.

STEPS to Run GEOGRID

**Note: This is just a brief description of the basic steps - meant to help you understand the process. If you currently have no data, or simulation you would like to run, you can wait until you get to the 'case studies' section, later, to try to follow along.

1. Download the terrestrial input data
2. Edit the &share and &geogrid sections of the namelist.wps file for your particular domain set-up.

 

Run  plotgrids.ncl to ensure your domain is in the right location before running  geogrid.exe
ncl util/plotgrids.ncl

 

3. Run geogrid.exe (output is in the format of 'geo_em.dxx.nc' - one file for each domain)

./geogrid.exe

Basics: Ungrib

The purpose of UNGRIB is to unpack GRIB (GRIB1 and GRIB2) meteorological data and pack it into an intermediate file format.

Unpacking the data is controlled via the "share" and "ungrib" sections of the WPS namelist. For complete descriptions of these namelist variables (as well as others) in addition to recommendations for best practices, see our Best Practices WPS Namelist page.

UNGRIB
   - is NOT dependent on any WRF model domain.
   - is NOT dependent on GEOGRID.
   - does NOT cut down the data according to your model domain specification. It simply unpacks the required fields and writes them out into a format that the METGRID program can read.
   - makes use of Vtables (see sample Vtables in the WPS/ungrib/Variable_Tables/ directory) to specify which fields to unpack from the GRIB files. The Vtables list the fields and their GRIB codes that must be unpacked from the GRIB files.



 

Although Vtables are provided for many common data sets, it would be impossible for ungrib to anticipate every possible source of meteorological data in GRIB format. When a new source of data is to be processed by UNGRIB, the user may create a new Vtable, either from scratch, or by using an existing Vtable as an example. In either case, a basic knowledge of the meaning and use of the various fields in the Vtable will be helpful.

Required Fields

When creating your own Vtables, note that  a number of input fields are required.

STEPS to Run UNGRIB

**Note: This is just a brief description of the basic steps - meant to help you understand the process. If you currently have no data, or simulation you would like to run, you can wait until you get to the 'case studies' section, later, to try to follow along.

1. Download data and place in a unique directory (it is not essential to place the data in a unique directory, but is recommended to maintain organization).
     Obtaining input data is generally the user's responsibility, however, some details concerning data types and availability can be found by clicking the 'Data' tab on the top navigational bar.

2. Familiarize yourself with the data

3. Link (with the UNIX command ln) the correct Vtable
   -For example, if you are using GFS data, type:
    ln -sf ungrib/Variable_Tables/Vtable.GFS Vtable

4. Link (with supplied script link_grib.csh) the input GRIB data
    ./link_grib.csh path_to_data

5. Edit the &share and &ungrib sections of the namelist.wps file (a minimal example is shown after these steps). You only need to pay attention to the following parameters:
    start_date ; end_date ; interval_seconds ; prefix
    Note: Normally one will leave "prefix" set to "FILE", except in cases where this may overwrite data.

6. Run ungrib.exe (output will be intermediate files in the format FILE:YYYY-MM-DD_hh - one file for each time)
     ./ungrib.exe

7. Familiarize yourself with the intermediate files
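A sketch of the namelist.wps entries relevant to step 5 (dates and interval are illustrative; interval_seconds must match the time interval of your GRIB data):

&share
 start_date = '2019-09-04_12:00:00',
 end_date   = '2019-09-06_00:00:00',
 interval_seconds = 21600,
/

&ungrib
 out_format = 'WPS',
 prefix = 'FILE',
/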
 



 

WRF will need boundary conditions for the ENTIRE time you plan on running the model. Make sure to UNGRIB enough input data. For example, a 48-hour forecast driven by 6-hourly input data requires nine analysis times (hours 0 through 48).



 

Input data NOT in GRIB format

Write a stand-alone program to read your data and output it in the intermediate file format. (This "intermediate" link has detailed file format information and sample programs. Extra information is also available in Chapter 3 of the WRF Users' Guide).

Creating your intermediate files manually eliminates the need for the UNGRIB step. If you create your intermediate files manually, it is recommended to check the correctness of your intermediate files with the utility program  rd_intermediate.exe (available in the WPS/util directory).
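Usage of rd_intermediate.exe is simply the program name followed by the name of one intermediate file, for example (the file name is illustrative):

./util/rd_intermediate.exe FILE:2019-09-04_12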
 



 

Frequently Asked Questions

Do I have to re-run ungrib if I use the same data, but change my domain location?
No. UNGRIB is not tied to any domain, but if you are using a regional model as input data make sure that your new domain will still fit inside the input data domain.

Do I have to re-run ungrib if I add a nest to my model run?
No. UNGRIB has no WRF model domain information.

For which length of time do I need to run ungrib?
As WRF is a regional model, you need input data for the entire time you plan on running the WRF model.

Basics: Metgrid

The purpose of METGRID is to horizontally interpolate the meteorological data onto your model domain.
Output from metgrid.exe is used as input to WRF.

The simulation domain(s) are defined using information specified by the user in the "share" and "metgrid" sections of the WPS namelist. For detailed descriptions of the namelist.wps variables, in addition to suggestions for best practices, see our Best Practice WPS Namelist page.

STEPS to Run METGRID

**Note: This is just a brief description of the basic steps - meant to help you understand the process. If you currently have no data, or simulation you would like to run, you can wait until you get to the 'case studies' section, later, to try to follow along.

1. Edit the &share and &metgrid sections of the namelist.wps file (a minimal &metgrid sketch is shown after these steps).
2. Input to METGRID is the geo_em.dxx.nc output files from GEOGRID, and
    the intermediate output files from UNGRIB (e.g., FILE:YYYY-MM-DD_hh).
3. Run metgrid.exe
      ./metgrid.exe

   Output from this program will be:
    - met_em.d0X.YYYY-MM-DD_hh:00:00.nc - one file per time, for each domain ("d0X" represents the domain).
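A sketch of the &metgrid section (these are the default values; 'FILE' must match the prefix used when running ungrib):

&metgrid
 fg_name = 'FILE',
 io_form_metgrid = 2,
/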

Basics: WRF

 

The WRF model has 2 steps:

real.exe
This program vertically interpolates the met_em* files (generated by metgrid.exe), creates boundary and initial condition files, and does some consistency checks.

wrf.exe
Generates the model forecast.

STEPS to Run WRF

1. Move to the directory in which you plan to run the code (either WRF/test/em_real or WRF/run).
       -Both the run/ and test/em_real/ directories will have all the files you need linked in,
       so it does not matter which one you choose to run the code in.
 
2. Link or copy (with the UNIX command ln or cp ) the met_em files to this directory.
       ln -sf path_to_met_em_files/met_em.d0* .
 
3. Edit the namelist.input file for your particular run (a minimal sketch is shown after these steps). For descriptions of the namelist parameters, as well as suggestions for best practices, see our Best Practices WRF Namelist page.
 
4. Run real.exe (verify that the program runs correctly)
       -You should have the following output files (default setup): wrfinput_d01 & wrfbdy_d01
       -This is true for single domain and default nested runs.
       -If you plan to use a nested domain, you will have a wrfinput_dxx file for each domain.
       (more on this in 'nested case studies').
 
5. Run wrf.exe (verify that the program runs correctly)
       -You should have the following output files (default setup): wrfout_dxx_[initial_date] (one for each domain)
       -Each file (by default) will contain all the forecast output times.
 
6. Use a post-processing tool to view the output.
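A minimal sketch of the namelist.input sections most often edited for a real-data run (all values are illustrative and must be consistent with your WPS setup; e.g., e_we, e_sn, dx, dy, and num_metgrid_levels must match your met_em files):

&time_control
 run_days   = 0,
 run_hours  = 36,
 start_year = 2019, start_month = 09, start_day = 04, start_hour = 12,
 end_year   = 2019, end_month   = 09, end_day   = 06, end_hour   = 00,
 interval_seconds   = 21600,
 input_from_file    = .true.,
 history_interval   = 180,
 frames_per_outfile = 1000,
/

&domains
 time_step = 90,
 max_dom   = 1,
 e_we      = 150,
 e_sn      = 130,
 e_vert    = 35,
 num_metgrid_levels = 34,
 dx = 15000,
 dy = 15000,
/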
 



 

Do I need to re-run real.exe if a change to the namelist.input has been made?

-Generally, you only need to run  wrf.exe again; however, if you have changed the LSM packages you are running with, or have changed anything about the date/times or input data, then you must re-run  real.exe again.

real.exe runs very fast - so if in doubt, just go ahead and run  real.exe again.

 

Let's run a real case study!!
