Running ADAS
==ADAS run==

===Build model===
First, choose a model tag to build. For this example, we choose GEOSadas-5_9_1_p7.


 $ cvs co -P -r GEOSadas-5_9_1_p7 -d g591p8 GEOSadas-5_9

Note the module name <tt>GEOSadas-5_9</tt>. Unlike AGCM, all ADAS tags have a corresponding module.

 $ cd g591p8/src
 $ ./parallel_build.csh


===Setup experiment (Run <tt>fvsetup</tt>)===
Run <tt>fvsetup</tt> from the bin directory (<tt>g591p8/Linux/bin</tt>) with the following options (the default value is chosen for the ones not mentioned). For now we choose the default value for the initial conditions (FVICS) as well.
  AGCM Resolution? [b72]
  > C180
   
  OGCM Resolution? [c34]
  > f34
   
  EXPID? [u000_C180]
  > g591p8
   
   
  OBSERVING SYSTEM CLASSES?
  > "ncep_prep_bufr"
<!--
> "ncep_1bamua_bufr,ncep_1bamub_bufr,ncep_1bhrs2_bufr,ncep_1bhrs3_bufr,
    ncep_1bmsu_bufr,ncep_osbuv_bufr,ncep_prep_bufr,ncep_sptrmm_bufr,
    disc_airs_bufr,disc_amsua_bufr,ncep_mhs_bufr,ncep_1bhrs4_bufr,
    ncep_amsre_bufr,ncep_goesfv_bufr,ncep_mtiasi_bufr,ncep_gpsro_bufr,
    aura_omi_bufr,ncep_satwnd_bufr,ncep_atms_bufr,ncep_sevcsr_bufr"
-->
   
   
  Do Aerosol Analysis (y/n)? [y]
  > n
 
One can, of course, turn on Aerosol Analysis and AOD assimilation here.


A PDF of an example <tt>fvsetup</tt> session is provided here: [[Media:Fvsetup_session.pdf|fvsetup_session]].


====FVHOME====
FVHOME is the home directory for the fvDAS experiment. Resource files, restarts and system output are stored under this directory. It usually contains these subdirectories:

 ana/        first guess/analysis output
 anasa/      job script to run stand-alone analysis
 daotovs/    daotovs field output
 diag/       diagnostic field output
 etc/        listings and other odds & ends
 fcst/       forecast run directory
 fvInput/    process required inputs
 obs/        post-analysis ODS files
 prog/       prognostic field output
 recycle/    latest restart files
 rs/         restart files
 run/        resource files
 
<tt>$FVHOME/.FVROOT</tt> gives the location of the installation <tt>bin</tt> directory.
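A minimal sketch of how a script might use this file, assuming <tt>.FVROOT</tt> is a one-line file containing the installation root (the <tt>/tmp/mock_…</tt> paths are illustrative stand-ins, not real experiment paths):

```shell
# Sketch only: mock FVHOME and install root stand in for the real ones.
FVHOME=/tmp/mock_fvhome
mkdir -p "$FVHOME"
echo "/tmp/mock_install/GEOSadas/Linux" > "$FVHOME/.FVROOT"

# Read the install root back and derive the bin directory from it.
FVROOT=$(cat "$FVHOME/.FVROOT")
echo "bin directory: $FVROOT/bin"
```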
 
Some of these directories (<tt>ana/</tt>, <tt>chem/</tt>, <tt>diag/</tt>, <tt>etc/</tt>, <tt>obs/</tt>, <tt>rs/</tt>) are stored in the archive. After completing the experiment, look in <tt>$ARCHIVE/g591p8</tt> (where <tt>g591p8</tt> is the experiment ID).
 
====Run directory====
$ cd $FVHOME/run


<tt>g5das.j</tt> is the job submission script. The number of CPUs etc. can be changed here (if they haven't already been set during setup). Also, if you want to run a single cycle of <tt>GSIsa.x</tt> and <tt>GEOSgcm.x</tt>, add <tt>exit</tt> right before 'PART IV - Next PBS Job Segment'.
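The one-cycle edit can be sketched with <tt>sed</tt>. The mock script below is a stand-in for the real <tt>g5das.j</tt> (only the marker text is taken from the article; everything else is illustrative):

```shell
# Mock stand-in for g5das.j, reduced to the two parts relevant here.
cat > /tmp/g5das_demo.j <<'EOF'
# PART III - single DAS cycle
echo "one GSIsa.x/GEOSgcm.x cycle"
# PART IV - Next PBS Job Segment
echo "resubmit next segment"
EOF

# Insert 'exit' immediately before the PART IV marker so the script stops
# after a single cycle instead of resubmitting the next segment.
sed -i '/PART IV - Next PBS Job Segment/i exit' /tmp/g5das_demo.j

sh /tmp/g5das_demo.j   # prints only "one GSIsa.x/GEOSgcm.x cycle"
```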
<!--
To simulate the OPS run d591_fpit, we make a few changes:
# NX, NY in AGCM.rc.tmpl and AGCM.BOOTSTRAP.rc.tmpl have the same values (8, 48) as in the OPS run
# Copy HISTORY.rc.tmpl over from the OPS run
# Copy obsys.rc over from the OPS run
# NX, NY in GSI_GridComp.rc.tmpl should match those from the OPS run (6, 36)
# Set NUM_SGMT to 1 in CAP.rc.tmpl
# <tt>g5das.j</tt>
## PBS -l select=32
## <tt>setenv NCPUS    384</tt>
## <tt>setenv NCPUS_GSI 216</tt>
## '''One cycle''': To run one segment each of GSIsa.x and GEOSgcm.x add '<tt>exit</tt>' before 'PART IV'
-->


===Data Assimilation===


  $ cd $FVHOME/run
  $ qsub g5das.j


<tt>g5das</tt> creates a scratch directory <tt>$NOBACKUP/fvwork.xxxx</tt> and runs the experiment there. After the initial setup, <tt>GSIsa.x</tt> runs first, writing its output to <tt>ana.log</tt>; <tt>GEOSgcm.x</tt> then runs, writing its output to <tt>stdout</tt> and <tt>fvpsas.log</tt>.
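To follow a running experiment, one way is to locate the newest scratch directory and tail <tt>ana.log</tt>. The sketch below uses a mock <tt>$NOBACKUP</tt> tree so it is self-contained; on discover the real <tt>$NOBACKUP</tt> would be used instead:

```shell
# Sketch only: a mock NOBACKUP tree stands in for the real one on discover.
NOBACKUP=/tmp/mock_nobackup
mkdir -p "$NOBACKUP/fvwork.0001"
echo "GSIsa.x started" > "$NOBACKUP/fvwork.0001/ana.log"

# Pick the most recently modified fvwork.* directory and show the analysis log.
FVWORK=$(ls -dt "$NOBACKUP"/fvwork.* | head -1)
tail "$FVWORK/ana.log"
```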


===Forecast===
Later.
 
==Simulating OPS run==
<!--EVERYTHING BELOW IS COMMENTED OUT

==Simulating DAS forecast run==
   
  $ qsub gcm_run.j
-->


[[Category:SI Team]]
[[Category:Running the Model]]