This page documents the steps to run WRF and WRF-Chem on Harvard cluster.

We invite Archana Dayalu, Packard Chan, Lee Miller and Jiahua Guo to co-edit this page.

Glossary

Super quick start (Harvard cluster users only, real case, no compilation)

One-time setup

Register at http://www2.mmm.ucar.edu/wrf/users/download/get_source.html

ssh datamover01  # for faster copying

Choose a wrf-root directory: we recommend a location on an lfs disk, but not on scratchlfs. If available, LAB storage on holylfs or kuanglfs is a good choice. The commands below use /n/holylfs/LABS/`id -gn`/$USER/new-wrf-root/.
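For example, assuming you go with the recommended location, create the directory first (the WRFROOT variable below is just a convenience for this page, not something the WRF scripts require):

export WRFROOT=/n/holylfs/LABS/`id -gn`/$USER/new-wrf-root
mkdir -p $WRFROOT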

rsync -a /n/holylfs/INTERNAL_REPOS/CLIMATE_MODELS/wrf/410-WPS /n/holylfs/LABS/`id -gn`/$USER/new-wrf-root/ --exclude='.git*'  # 15s on datamover01

rsync -a /n/holylfs/INTERNAL_REPOS/CLIMATE_MODELS/wrf/412-WRF /n/holylfs/LABS/`id -gn`/$USER/new-wrf-root/ --exclude='.git*'  # 2m22s on datamover01

Run WPS

Exit the datamover01 node. You can use a login node for this default case.
cd /n/holylfs/LABS/`id -gn`/$USER/new-wrf-root/410-WPS/
source source_wrf  # 25s on holylogin03
vim namelist.wps  # no edits
./geogrid.exe  # 8s on holylogin03, v410, Jan00
ln -sf ungrib/Variable_Tables/Vtable.GFS-PRMSL Vtable  # for ungrib

./link_grib.csh /n/holylfs/INTERNAL_REPOS/CLIMATE_MODELS/wrf/data-tutorial-case/JAN00/fnl_2000012*  # for ungrib

./ungrib.exe  # 2s on holylogin03, v410, Jan00
./metgrid.exe  # 1s on holylogin03, v410, Jan00

Run WRF

cd /n/holylfs/LABS/`id -gn`/$USER/new-wrf-root/412-WRF/
cp -ai run 00run  # 5s on holylogin03
cd 00run/
# make sure you have sourced source_wrf
ln -s ../../410-WPS/met_em.d01.2000-01-2* ./
vim namelist.input  # no edits
./real.exe  # 3s on holylogin03
tail rsl.error.0000  # expect to see "SUCCESS COMPLETE REAL_EM INIT"
vim job_wrf.sbatch  # no required edits
sbatch job_wrf.sbatch  # 2m36s with 4 huce_intel cpus
tail rsl.error.0000*  # expect to see "SUCCESS COMPLETE WRF"

REAL cases

Choose versions of WRF & WPS:

Latest releases on official GitHub:

WRF v4.1.2 (precompiled without chem) is downloaded in /n/holylfs/INTERNAL_REPOS/CLIMATE_MODELS/wrf/412-WRF
WPS v4.1 (precompiled) is downloaded in /n/holylfs/INTERNAL_REPOS/CLIMATE_MODELS/wrf/410-WPS

WRF v3.6.1 (precompiled with chem) is downloaded in /n/holylfs/INTERNAL_REPOS/CLIMATE_MODELS/WRF_CHEM_3-6-1/WRFV3
WPS v3.6.1 (precompiled) is downloaded in /n/holylfs/INTERNAL_REPOS/CLIMATE_MODELS/WRF_CHEM_3-6-1/WPS

WRF v3.0beta with RAVE (pls recompile) is in ~kuang/Model/WRFV3
WPS v3.2.1 (pls recompile) is in ~dingma/Model2/WPSv3

Read the user's guide:
WRF (ARW) User's Guides: v3, v4
WRF-Chem: https://ruc.noaa.gov/wrf/wrf-chem/Users_guide.pdf #This is for a different WRF-Chem version (3.9), but it's still a relevant guide.
https://ruc.noaa.gov/wrf/wrf-chem/Emission_guide.pdf #This is a separate supplementary WRF-Chem guide to chemical input data processing.
https://ruc.noaa.gov/wrf/wrf-chem/wrf_tutorial_nepal/talks/Setup.pdf  #Some helpful WRF-Chem slides from NOAA

PART I. Setting up/Configuration/Compilation.

#(3) Configure WRF

 ./configure

#(4) Choose option 15 (dmpar) to compile the MPI version with Intel compilers

              **Note! Do not use dm+sm or smpar options with WRF-Chem!! Choose either serial or dmpar**

#(5) Modify the file “configure.wrf” (around lines 149-150) to read the following. Note that you have to do this each time you run ./configure, because the configure.wrf script is overwritten each time.

 DM_FC = mpiifort -f90=$(SFC)
 DM_CC = mpiicc -cc=$(SCC) -DMPI2_SUPPORT
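If you prefer not to re-edit the file by hand after every ./configure, sed commands along these lines should apply the same change (a sketch only; the spacing of these lines varies between WRF versions, so verify the result with the grep at the end):

sed -i 's|^DM_FC .*|DM_FC = mpiifort -f90=$(SFC)|' configure.wrf
sed -i 's|^DM_CC .*|DM_CC = mpiicc -cc=$(SCC) -DMPI2_SUPPORT|' configure.wrf
grep -n 'DM_FC\|DM_CC' configure.wrf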

#(6) Compile WRF before WPS!! Compilation will take a while. If you're on an interactive shell, remove the "&" to avoid timing out.

      # For real cases:  
./compile em_real &> compile.log
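When the compile finishes, check the end of the log and confirm the em_real executables were built:

tail -20 compile.log
ls -l main/*.exe  # expect wrf.exe, real.exe, ndown.exe and tc.exe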

  #(7) Configure WPS

      #Likely you will need GRIB2 support...so choose option #19:
      #Linux x86_64, Intel compiler    (dmpar)
./configure

#(8) Compile WPS

./compile &> compile.output
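As a quick check, the three main WPS executables should now exist as links in the top-level WPS directory:

ls -l geogrid.exe ungrib.exe metgrid.exe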

#(1)-(8) are adapted from p.6 of Plamen's advice for v3.9.1 (6/8/2018)


      #(9) Compile convert_emiss.exe, a program that converts binary intermediate chemical fields into netcdf WRF-compatible format. Navigate to your WRFV3 folder and type:

./compile emi_conv

     If compilation was successful, you should see convert_emiss.exe in the WRFV3/chem folder.

      #(10) Compile PREP-CHEM-SOURCES (available HERE), a mapping utility for converting raw anthropogenic chemical fields to binary intermediates that are then fed into convert_emiss.exe. Unzip the tar.gz file into your main WRF folder. There are some typos and missing details in the PDF guide above; a modified version of the instructions (together with Paul Edmon's help rebuilding HDF5 to fix an error message) enabled successful compilation of the utility. The modified instructions are located here:

/n/holylfs/INTERNAL_REPOS/CLIMATE_MODELS/WRF_CHEM_3-6-1/WRF/README_prepchem_modifications

In any case, a compiled version of the prep_chem_sources utility, built using the instructions above, is located here:

/n/holylfs/INTERNAL_REPOS/CLIMATE_MODELS/WRF_CHEM_3-6-1/WRF/PREP-CHEM-SRC-1.5/bin/prep_chem_sources_RADM_WRF_FIM_SIMPLE.exe

#(11) The anthro_emiss utility (ANTHRO). Like prep-chem, this is another (possibly less versatile) option for converting your anthropogenic chemical fields into the wrf-chem format. It appears to be useful for very specific data sets (like EDGAR-HTAP), but it also appears to be more straightforward to use if you aren't rigid about your emissions inventory choice (i.e., if EDGAR-HTAP is sufficient for your purposes). If you want to set it up on your own, go here (https://www.acom.ucar.edu/wrf-chem/download.shtml) and click "anthro_emiss" at the bottom. Or you could use what is already downloaded and compiled (following the instructions in the README):

/n/holylfs/INTERNAL_REPOS/CLIMATE_MODELS/WRF_CHEM_3-6-1/WRF/ANTHRO

#(12)  The MOZBC utility for mapping chemical boundary conditions to your WRF-Chem domain has already been compiled and saved in the $CLIMATE_MODELS WRF-Chem folder, following the instructions in the README_mozbc file. You can use that, or, if you prefer, download and compile MOZBC on your own. The initial files are based on MOZART-4/GEOS-5 output (6-hourly, 1.9x2.5 deg, 56-level) datasets (https://www.acom.ucar.edu/wrf-chem/mozart.shtml). Read the README file for compilation instructions if you're doing this on your own; on the Harvard cluster you may have to run export NETCDF_DIR=$NETCDF_HOME before compilation, and likewise for the MEGAN instructions (#13 below). Otherwise you can use the compiled version located at:

/n/holylfs/INTERNAL_REPOS/CLIMATE_MODELS/WRF_CHEM_3-6-1/MOZBC
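If you do compile MOZBC yourself, the build is roughly the following (a sketch based on the MOZBC README; it assumes your netCDF module sets NETCDF_HOME and that you run this inside the MOZBC directory):

export NETCDF_DIR=$NETCDF_HOME
./make_mozbc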

#(13) MEGAN bio emissions. A compiled version is located at the path below. (Read the README file for details if you want to compile your own version. Make sure your NETCDF_DIR variable is set if so, and make sure your make_util file is executable. Then run ./make_util megan_bio_emiss). Make sure you also have the input data you need from https://www.acom.ucar.edu/wrf-chem/download.shtml . Scroll to the bottom "For bio_emiss input files only" and fill out the requested spatiotemporal information. Then select "bio_emiss input files".

/n/holylfs/INTERNAL_REPOS/CLIMATE_MODELS/WRF_CHEM_3-6-1/MEGAN
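If you compile your own copy, the steps described above look roughly like this (a sketch; run it inside the MEGAN utility directory):

export NETCDF_DIR=$NETCDF_HOME   # point the build at your netCDF installation
chmod +x make_util               # make sure the build script is executable
./make_util megan_bio_emiss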

#(14) EDGAR-HTAP anthropogenic chemical fields, for use with anthro_emiss utility. Download from here: https://www2.acom.ucar.edu/wrf-chem/wrf-chem-tools-community. Otherwise you can link to the already downloaded version:

/n/holylfs/INTERNAL_REPOS/CLIMATE_MODELS/WRF_CHEM_3-6-1/anthro_data

 

PART II. Running WPS and WRF: Overview/General Steps


Now that you have a compiled version of WRF and WPS, you are ready to set up your model runs. Since these are for real cases, you need access to initialization data sets. You will need to figure out the initialization data set best suited to your domain and purposes. For the examples for China provided here, we are using GRIB2 NCEP FNL Operational Model Global Tropospheric Analyses, continuing from July 1999 (http://dx.doi.org/10.5065/D6M043C6). For these examples, the files are already downloaded. Instructions to link to them are noted where necessary.

Regardless of whether you are running WRF or WRF-Chem, it is important that you do the following first and in this order (detailed instructions follow, including in the examples in Part III and IV):

(1) Run WPS (geogrid.exe, ungrib.exe, metgrid.exe) to create real data-based initialization files of the form met_em.d0X.YYYY-MM-DD_hh:mm:ss.nc

(2) Run real.exe to generate input and boundary files of the form wrfinput_d0*, wrfbdy_d01 (and optionally wrffdda_d0*) to initialize the WRF model

If you are running WRF without chemistry, you can go ahead and run the main WRF model at this point. If you are running WRF-Chem, this is the point at which you run your chemistry data prep programs (i.e., prep-chem-src, anthro_emis, and/or convert_emiss), which require the wrfinput_d0* files in order to work. Once you are done with this and have all your requisite chemical data in the right format (they should be in the form wrfchemi_00z_d01 and wrfchemi_12z_d01) stored or linked in the WRFV3/test/em_real folder, you can run the wrf.exe model.
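In rough outline (not literal commands to paste; the individual steps, paths and namelist settings are covered below and in the detailed examples):

# 1. WPS: geogrid.exe, ungrib.exe, metgrid.exe     -> met_em.d0X.YYYY-MM-DD_hh:mm:ss.nc
# 2. real.exe with chem_opt = 0                    -> wrfinput_d0*, wrfbdy_d01 (, wrffdda_d0*)
# 3. chemistry prep, which reads wrfinput_d0*: megan_bio_emiss, anthro_emis or
#    prep_chem_sources + convert_emiss, and mozbc  -> wrfbiochemi_*, wrfchemi_*, chem BCs added to wrfbdy_d01
# 4. link/copy the chem files into WRFV3/test/em_real and set chem_opt > 0 in namelist.input
# 5. run wrf.exe (as a batch job)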


 

Step 1: Running WPS (WRF Pre-processing System)

This is the pre-processing step that creates real data-based initialization files. Read the README file in the parent WPS folder for a quick guide on what the programs do.

  1. Define model domains with geogrid. Edit the namelist.wps file, specifically:
 &share (e.g., wrf_core='ARW', max_dom, io_form_geogrid=2, opt_output_from_geogrid_path='/n/your_scratch_dir_path/GEOGRID_DUMP/')
 &geogrid

      2. View domain configuration, confirm it's correct.

 cp util/plotgrids_new.ncl . ; ncl plotgrids_new.ncl

      3. Run geogrid.

cd geogrid; ln -s GEOGRID.TBL.ARW_CHEM GEOGRID.TBL 
mpirun -np 1 ./geogrid.exe

      1. Examine the GRIB data (most likely GRIB2) you will feed to ungrib.

./util/g1print.exe /your_GRIB1_data_path/filename | more
./util/g2print.exe /your_GRIB2_data_path/filename | more

      2. Edit namelist.wps as needed.

&share (start_date, end_date, interval_seconds=21600)
&ungrib (outformat='WPS',prefix='/n/your_scratch_dir_path/METEM_FILEDUMP/FILE')

               #note: be aware of 90-day retention policy of scratch data. Make sure you've emptied any files from previous runs in the dump dirs.

      3. Link the Vtable (it's like a Rosetta Stone) appropriate to your GRIB data type to prepare for ungribbing (e.g., for GFS data). WPS ships with a number of Vtables in the Variable_Tables subfolder of ungrib, but sometimes you will need to do some sleuthing and download the one you actually need into the Variable_Tables folder. For the examples that follow, this is exactly what we'll need to do.

ln -sf ungrib/Variable_Tables/Vtable.theVtableYouWantToUse Vtable

      4. link the GRIB data that you are going to use

./link_grib.csh /your_GRIB_data_path/XXX*

             #This should update GRIBFILE.AAA, .AAB, etc links in pwd.

      5. Run ungrib

mpirun -np 1 ./ungrib.exe >& ungrib.output

            #You should see FILE:YYYY-MM-DD_hh in your folder prescribed by “prefix”

  1. Edit namelist.wps as needed for metgrid.

&share (this should be exactly as you need it already)
&metgrid (fg_name = the prefix you set in &ungrib; io_form_metgrid=2, opt_output_from_metgrid_path)

      2. Run metgrid

mpirun -np 1 ./metgrid.exe

      3. You should see met_em.d01(or d02 or d03....).YYYY-MM-DD_hh:mm:ss.nc files created. These are your WPS final output files that real.exe ingests.


Step 2. Running WRF

  1. First link your met files from metgrid.exe
cd run
ln -s /n/your_scratch_dir_path/METEM_FILEDUMP/met_em* .

      2. Edit namelist.input file.

&time_control #obviously should match namelist.wps.
history_outname = 'the/path/where/you/want/final/output/files/and/filename_d<domain>_<date>'
&domains #obviously should match namelist.wps
#Customize other sections as needed.
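For orientation, a minimal &time_control block might look like the following (illustrative values taken from the January 2013 Beijing example later on this page; adjust the dates, intervals and output path to your own run):

&time_control
 start_year          = 2013, 2013, 2013,
 start_month         = 01,   01,   01,
 start_day           = 06,   06,   06,
 start_hour          = 00,   00,   00,
 end_year            = 2013, 2013, 2013,
 end_month           = 01,   01,   01,
 end_day             = 16,   16,   16,
 end_hour            = 00,   00,   00,
 interval_seconds    = 21600,
 history_interval    = 60,   60,   60,
 frames_per_outfile  = 24,   24,   24,
 history_outname     = '/n/your_scratch_dir_path/WRFOUT/wrfout_d<domain>_<date>',
/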

#Particularly examine num_metgrid_levels and num_metgrid soil levels so they match with the input specified by the met_em* files. At command line:

ncdump -h met_em.d0X.YYYY-MM-DD_hh:mm:ss.nc | more
#num_metgrid_levels is typically near the top of the dumped header
#num_metgrid_soil_levels is close to the end of the dumped header
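A quick way to pull out just those two values (the filename is an example; point it at one of your actual met_em files):

ncdump -h met_em.d01.2013-01-06_00:00:00.nc | grep -i num_metgrid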

#Execute real.exe ; no real benefit to running as a parallel job so…

mpirun -np 1 ./real.exe >& run_real.log #make sure chem_opt = 0 in namelist.input for this step for now.

      3. Examine the run_real.log file. You should see SUCCESS COMPLETE REAL_EM INIT printed at the end.

      4. Make sure you have the following files output from this step:

wrfinput_d0* (one for each domain)
wrfbdy_d01 (just for d01)
wrffdda_d01 (just for d01 since nudging is happening only in this domain)

 

       5. Process gridded chemical data (biogenic, boundary conditions, anthropogenic) using specified utilities (e.g., MEGAN bio, prep-chem-src, mozbc). Note: there are other options, like fire emissions ... you need to figure out what is relevant to your question.

       6. Make sure your binary emissions files are saved in WRFV3/test/em_real either as hard or soft links. They MUST be in this directory in some form.
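For example (a sketch; the source paths depend on where your utilities wrote their output):

cd WRFV3/test/em_real
ln -sf /path/to/wrfchemi_00z_d0* .
ln -sf /path/to/wrfchemi_12z_d0* .
ln -sf /path/to/wrfbiochemi_d0* .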

       7. Turn your chemistry back on (chem_opt=XXX) in the namelist.input file.

       8. Run convert_emiss.exe to convert the chemistry data from binary intermediate to WRF input form
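convert_emiss.exe was compiled into WRFV3/chem in Part I step (9); a minimal sketch of running it, assuming you run from the directory holding your binary emissions and wrfinput files (e.g., WRFV3/test/em_real):

ln -sf ../../chem/convert_emiss.exe .
mpirun -np 1 ./convert_emiss.exe >& convert_emiss.log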

       9. You should have your relevant chem data in WRFV3/test/em_real at this point, ready for ingestion by the WRF model. The key input files that WRF-Chem expects in order to run successfully may now include, but are not limited to, the following:

           wrfbiochemi_<domain> #if you planned to include biogenic chem

           wrfchemi_00z_<domain> #if you planned to use anthropogenic chem, for two sets of time 00z and 12z

           wrfchemi_12z_<domain> 

           wrfbdy_d01 #Boundary file, should include chemical boundary conditions from mozbc for example if you chose to go that route.

           wrfinput_<domain> #your standard initialization file from real.exe

           wrffdda_<domain> #for the domain that you're nudging in (if you are nudging).

 

     10. If everything looks in order, run the WRF model. This is the only step that benefits significantly from running in parallel!

sbatch run_wrf.sh #Edit this script as needed; a template is provided.
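If you need to write the submission script from scratch, a minimal SLURM sketch looks something like the following (the job name, partition, task count, time and memory are placeholders; the provided template takes precedence, and you must load the same modules used at compile time, e.g., via source_wrf or your own module list):

#!/bin/bash
#SBATCH -J wrf_run            # job name (placeholder)
#SBATCH -p huce_intel         # partition; use one you have access to
#SBATCH -n 4                  # number of MPI tasks
#SBATCH -t 0-12:00            # wall-time limit
#SBATCH --mem-per-cpu=4000    # MB per task

mpirun -np $SLURM_NTASKS ./wrf.exe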

     11. You should now have netCDF files of the format shown below, stored in /n/your_scratch_dir_path/WRFOUT/. If you are planning to drive an LPDM like STILT with these met files, they will eventually need to be converted to .arl format using the WRF2ARL directory contents; that is treated on a separate page. It is highly recommended that you get familiar with NCL (and have access to ncview) for post-processing and visualization of the wrfout files.

wrfout_d0#_YYYY-MM-DD_hh:mm:ss

Detailed examples

WRF-Chem: A PM2.5 example

The purpose of this example is to take the general steps listed above and actually run a three nested domain WRF-Chem PM2.5 simulation over Beijing during the January 2013 severe haze event and compare with observations. We are going to run WRF-Chem for a total of 10 days from Jan 6 2013 00:00UTC to Jan 16 2013 00:00UTC. We establish 5 days for model spin-up such that the usable simulation time period is 5 days. Make sure you have a local copy of the /n/holylfs/INTERNAL_REPOS/CLIMATE_MODELS/WRF_CHEM_3-6-1/WRF directory and contents. You don't need to copy the geography data set.

Thanks to Meng Gao at Harvard China Project and Thomas Lauvaux at PSU for help with this and for providing their WRF-Chem namelist templates for this example!

NOTE: Running "plain WRF" without chemistry is everything below except steps 3 through 6. If you're not interested in WRF-Chem and want a plain WRF tutorial, there are better ones online, or you can follow along here and simply skip steps 3 to 6.

At the end of this example you will have learned how to:


Step 0.1.

At any point where you want to check the contents of a netcdf file as a sanity-check use ncview! This is an excellent habit to develop. Just type 

ncview your_file.nc

and navigate through the variables and panes and make sure things look realistic.

Step 0.2

For WRF-Chem, it's good practice to create a folder that you link your intermediate WRF files into for use with the various external utilities. This will become clearer below; for now, make sure you have a directory named "UTILINP" in your local copy of the $CLIMATE_MODELS WRF folder.
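A minimal sketch of what that looks like once real.exe (Step 2 below) has produced its output (paths are placeholders; use your local copy):

mkdir -p /path/to/your/WRF/UTILINP
cd /path/to/your/WRF/UTILINP
ln -sf /path/to/your/WRFV3/test/em_real/wrfinput_d0* .
ln -sf /path/to/your/WRFV3/test/em_real/wrfbdy_d01 .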

Step 1. Run WPS

Edit your namelist.wps as follows (the dump paths shown are the author's; substitute your own):

&share
wrf_core = 'ARW',
max_dom = 3,
start_date = '2013-01-06_00:00:00','2013-01-06_00:00:00','2013-01-06_00:00:00'
end_date = '2013-01-16_00:00:00','2013-01-16_00:00:00','2013-01-16_00:00:00'
interval_seconds = 21600,
opt_output_from_geogrid_path = '/n/regal/wofsy_lab/adayalu/GEOGRID_DUMP/WRF01/'
io_form_geogrid = 2,
/
&geogrid
parent_id = 1, 1, 2,
parent_grid_ratio = 1, 3, 3,
i_parent_start = 1, 40, 17,
j_parent_start = 1, 26, 21,
e_we = 81, 49, 55,
e_sn = 57, 49, 55,
geog_data_res = 'usgs_30s+5m','usgs_30s+2m','usgs_30s+30s',
dx = 81000,
dy = 81000,
map_proj = 'lambert',
ref_lat = 35,
ref_lon = 110,
truelat1 = 30,
truelat2 = 60,
stand_lon = 116.397,
geog_data_path = '/n/holylfs/INTERNAL_REPOS/CLIMATE_MODELS/WRF_CHEM_3-6-1/WRF_GEOG_COMPLETE/geog'
/
&ungrib
out_format = 'WPS',
prefix = '/n/regal/wofsy_lab/adayalu/METEM_FILEDUMP/WRF01/FILE',
/
&metgrid
fg_name = '/n/regal/wofsy_lab/adayalu/METEM_FILEDUMP/WRF01/FILE'
io_form_metgrid = 2,
opt_output_from_metgrid_path = '/n/regal/wofsy_lab/adayalu/METEM_FILEDUMP/WRF01/'
/
ncl plotgrids_new.ncl
rm -f /n/regal/wofsy_lab/adayalu/GEOGRID_DUMP/WRF01/*
rm -f /n/regal/wofsy_lab/adayalu/METEM_FILEDUMP/WRF01/*
 
srun -n 1 --mem=10000 --pty --x11=first -p test -t 200 bash
cd geogrid; ln -s GEOGRID.TBL.ARW_CHEM GEOGRID.TBL
mpirun -np 1 ./geogrid.exe 

You should see something like the output below. Once geogrid is done, navigate to the geogrid dump folder you specified in namelist.wps and visualize the three geo_em.<domain>.nc files using ncview. Make sure things look reasonable.

Parsed 22 entries in GEOGRID.TBL
Processing domain 1 of 3
Processing XLAT and XLONG
Processing MAPFAC

[...etc for other variables, domains ...] 

!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
! Successful completion of geogrid. !
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
ln -sf ungrib/Variable_Tables/Vtable.GFS_new Vtable
./link_grib.csh /n/holylfs/LABS/kuang_lab/adayalu/WRF_GRIB2_NCEP_FNL/fnl_20130106* \
/n/holylfs/LABS/kuang_lab/adayalu/WRF_GRIB2_NCEP_FNL/fnl_20130107* \
/n/holylfs/LABS/kuang_lab/adayalu/WRF_GRIB2_NCEP_FNL/fnl_20130108* \
/n/holylfs/LABS/kuang_lab/adayalu/WRF_GRIB2_NCEP_FNL/fnl_20130109* \
/n/holylfs/LABS/kuang_lab/adayalu/WRF_GRIB2_NCEP_FNL/fnl_20130110* \
/n/holylfs/LABS/kuang_lab/adayalu/WRF_GRIB2_NCEP_FNL/fnl_20130111* \
/n/holylfs/LABS/kuang_lab/adayalu/WRF_GRIB2_NCEP_FNL/fnl_20130112* \
/n/holylfs/LABS/kuang_lab/adayalu/WRF_GRIB2_NCEP_FNL/fnl_20130113* \
/n/holylfs/LABS/kuang_lab/adayalu/WRF_GRIB2_NCEP_FNL/fnl_20130114* \
/n/holylfs/LABS/kuang_lab/adayalu/WRF_GRIB2_NCEP_FNL/fnl_20130115* \
/n/holylfs/LABS/kuang_lab/adayalu/WRF_GRIB2_NCEP_FNL/fnl_20130116* 
./util/g2print.exe /your_GRIB2_data_path/filename | more
mpirun -np 1 ./ungrib.exe > ungrib.out
tail -3 ungrib.out 
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
! Successful completion of ungrib. !
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

 

mpirun -np 1 ./metgrid.exe

 

Step 2. Run real.exe to get necessary intermediate files for WRF-Chem utilities (chem_opt off).

ln -s /n/your_scratch_dir_path/METEM_FILEDUMP/met_em* .
mpirun -np 1 ./real.exe >& run_real.log
You should now have:

wrfbdy_d01 #the outermost domain parameter boundary condition file
wrffdda_d01 #nudging file, we requested nudging in the outer domain
wrfinput_d<domain> #initial condition files for each of your study domains.

 

Step 3. Generate your bio emissions using MEGAN

Edit megan_bio_emiss.inp (paths shown are the author's; substitute your own):

&control
domains = 3,
start_lai_mnth = 1,
end_lai_mnth = 12,
wrf_dir = '/n/holylfs/LABS/kuang_lab/adayalu/WRF/UTILINP',
megan_dir = '/n/holylfs/LABS/kuang_lab/adayalu/WRF/MEGAN'
/
./megan_bio_emiss < megan_bio_emiss.inp > megan_bio_emiss.out

You should see three new files:

wrfbiochemi_d01
wrfbiochemi_d02
wrfbiochemi_d03

 

Step 4. OPTION 1 (USE THIS FOR NOW): Prep your anthropogenic emissions using anthro_emis

 
cd src
ln -sf yourpath_toWRF/anthro_data/MOZCART/EDGAR_HTAP_emi_PM2.5_2010.0.1x0.1.nc .

Edit anthro_emis.inp (paths shown are the author's; substitute your own):

&CONTROL
 anthro_dir = '/n/holylfs/LABS/kuang_lab/adayalu/WRF/ANTHRO/src'
 domains = 3
 wrf_dir    = '/n/holylfs/LABS/kuang_lab/adayalu/WRF/UTILINP'
 src_file_prefix = 'EDGAR_HTAP_emi_'
 src_file_suffix = '_2010.0.1x0.1.nc'
 src_names = 'PM2.5(1)'
 sub_categories  = 'emis_tot'
 serial_output   = .false.
 start_output_time = '2010-12-01_00:00:00'
 emissions_zdim_stag = 1
 emis_map = 'PM_25(a) -> PM2.5',
/

./anthro_emis < anthro_emis.inp > anthro_emis.out

and you should see six new files in the ANTHRO/src directory: a 00z and a 12z file for each of the three domains. Check out the contents with ncview.

wrfchemi_00z_<domain> 
wrfchemi_12z_<domain> 

 

Step 4. OPTION 2 (FYI for now): Prep your anthropogenic emissions using prep-chem-sources

cd bin


Step 5. Run real.exe again, with chem_opt turned on

mpirun -np 1 ./real.exe > run_real.log

Step 6. Prep the chemical data initial and boundary conditions using MOZBC

cd MOZBC
cp CBMZ-MOSAIC_8bins.inp mozbc.inp
Edit mozbc.inp, setting at least:

do_bc = .true.
do_ic = .true.
domain = 3

#FYI, I've found mozbc can be unhappy when the set directory path is too long.

dir_wrf = '/n/holylfs/LABS/kuang_lab/adayalu/WRF/UTILINP/' #obviously this should be your specific path.
dir_moz = '/n/holylfs/LABS/kuang_lab/adayalu/WRF/MOZBC/' #obviously this should be your specific path.
#fn_moz should look something like 'ha0004.nc'; you will have to rename (or link) your mozart4geos5-ZZZZ.nc file to match it.

#In the species map (spc_map) section, delete the entry 'op2 -> C2H5OOH'. This isn't in the mozart4geos5 file, and leads to an error if it remains in there.

#(This knowledge is from trial and error.)

ln -s mozart4geos5-ZZZZ.nc ha0004.nc #copy as link to "rename"
./mozbc < mozbc.inp > mozbc.out
#tail mozbc.out should have a final line that reads:
bc_wrfchem completed successfully

Step 7. You're (FINALLY) ready to run WRF-Chem

sbatch run_wrf.sh

squeue -u username #monitor your job status

cd /your/history_outname/specified/preferably/some/regal/directory/WRFOUT/

and you should see for each domain ...

wrfchem_d<domain>_2013-01-DD_HH:00:00
You can check resource usage for the finished job with sacct; for this 10-day, three-domain run it looked like:

sacct -j 49862718 --format=JobID,JobName,MaxRSS,Elapsed
      JobID    JobName     MaxRSS    Elapsed 
------------ ---------- ---------- ---------- 
49862718     wrfchem_t+            1-15:32:54 
49862718.ba+      batch     11588K 1-15:32:54 

Step 8. Post-Processing and data visualization

See the NCL addfile documentation for reading the wrfout files: https://www.ncl.ucar.edu/Document/Functions/Built-in/addfile.shtml

The simulated PM2.5 values from this exercise will likely look unrealistic. Possible reasons include:
    1. Accounting only for primary PM2.5. There is obviously all the secondary PM2.5 that needs the appropriate precursor species mapped as well. (25% to nearly 40% of PM2.5 in many cities in the region is secondary inorganic.)
    2. Failure to process the emissions files correctly. While the wrfchemi, wrfbiochemi, and mozbc utilities seem to have gone through, they may not have done so correctly owing to an inappropriate namelist parameter selection; for some reason, the surface emissions are not being read in correctly. We used EDGAR-HTAP from 2010 processed using the anthro_emis utility. In the past I ran a test using a more specialized inventory from 2010 pre-processed with NCL, which led to a far more realistic PM2.5 simulation (i.e., the surface emissions data were being read in).
    3. Inappropriate choices in the WRF-Chem namelist.input file.

In any case, this exercise should at least get you familiar with the process of running WRF-Chem and set you up to do second-order troubleshooting (like the more important question of why these values are unrealistic!).

Running WRF-Chem for real cases in Large Eddy Simulation (LES) Mode: A Beijing PM2.5 Case Study

IDEALIZED cases

WRF (ARW) User's Guides: v3, v4

Miscellaneous links

WRF (ARW) User's Guides: v3, v4

Google Docs: https://docs.google.com/document/d/1Jls4FlWIOIhMlCzMPWm6_aBZqx_Axxe8RMcKjdILDFg/

Ding's notes: global_WRF_on_Odyssey.pdf

ARW Technical Note: http://www2.mmm.ucar.edu/wrf/users/docs/technote/

Optimizing performance: https://www2.cisl.ucar.edu/software/community-models/optimizing-wrf-performance

Running on Cheyenne: http://www2.mmm.ucar.edu/wrf/users/cheyenne-note.html