This page describes the steps and modifications necessary to build and run the Single Column Model (SCM) under Ganymed 1.0 on discover. It assumes that you have successfully run the model as described in [[Ganymed 1.0 Quick Start]].

''Due to recent changes in the surface grid used by GEOS-5, the Single Column Model (SCM) is not currently functional in Ganymed 1.0. This will be rectified at some point. Trust us!''

'''Back to [[GEOS-5 Documentation for Ganymed 1.0]]'''

== Checking Out and Updating GEOS-5 for SCM ==

The current Ganymed model tag has all code necessary to run the single column model with any of the cases below. '''The executable of the Single Column Model is identical to that of the global model'''; the only differences are the environment and settings with which it runs.

== Setting Up and Running Existing SCM Experiments ==

Starting with the tag Fortuna-2_5_p1 (that is, before Ganymed), the setup script for the SCM experiments is <code>src/Applications/GEOSgcm_App/scm_setup</code>. You do ''not'' have to run the <code>gcm_setup</code> script as you do to set up a global run. '''The setup script is updated along with the model tags, so for Fortuna 2.5 and later use only the <code>scm_setup</code> that comes with your tag -- do not use any other.'''

At the time of this writing there are twenty experiments to choose from:

* ARM Southern Great Plains site (http://www.arm.gov/sites/sgp), July 1997
* KWAJEX (http://www.atmos.washington.edu/kwajex/)
* ARM-SCSMEX (http://www.cawcr.gov.au/bmrc/wefor/research/scsmex.htm)
* ARM-TWP (http://acrf-campaign.arm.gov/twpice/)
* TRMM-LBA (http://radarmet.atmos.colostate.edu/lba_trmm/)
* Five experiments with about the same times and locations as the five above, but forced with MERRA data
* TOGA COARE (http://www.atmos.washington.edu/togacoare/summaries.html)
* Six CGILS cases (http://atmgcm.msrc.sunysb.edu/cfmip_figs/Case_specification.html)
* NAMECORE (forced with MERRA data) (http://www.eol.ucar.edu/projects/name/)
* NAMEAZNM (forced with MERRA data) (http://www.eol.ucar.edu/projects/name/)
* NAMMA (forced with MERRA data) (http://namma.msfc.nasa.gov/)

Create your own run directory, then modify and uncomment the first executable line of <code>scm_setup</code>, which assigns <code>ESMADIR</code> to the local Ganymed 1.0 build that you are using for the SCM (you may already have this set as an environment variable). Uncomment one of the lines that assign the variable <code>CASEDIR</code> to choose the experiment to run. Then run the script from the run directory you have created. It will copy all of the necessary resource, forcing, and data files to that directory. Each experiment requires its own directory. If you modify the resource files (e.g., <code>HISTORY.rc</code>), you may want to copy the setup directory to your own area and modify it and the setup script accordingly, so that a later run of the setup script does not clobber your modifications.

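The uncommenting step can be scripted. The sketch below works on a mock-up of <code>scm_setup</code> -- the <code>setenv</code> lines, path, and case names here are illustrative, not the script's real contents -- but the same <code>sed</code> edits would apply to a private copy of the real script:

```shell
# Mock-up of the commented lines in scm_setup; paths and case names
# are illustrative only, not the real script contents.
cat > scm_setup <<'EOF'
#setenv ESMADIR /discover/nobackup/myuser/Ganymed-1_0
#setenv CASEDIR arm_97jul
#setenv CASEDIR kwajex
EOF

# Uncomment ESMADIR and exactly one CASEDIR line to choose the experiment
sed -i -e 's|^#setenv ESMADIR|setenv ESMADIR|' \
       -e 's|^#setenv CASEDIR arm_97jul|setenv CASEDIR arm_97jul|' scm_setup

grep '^setenv' scm_setup
```

After an edit like this, running the script from your run directory copies in the files for the chosen case.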
You can then run the model executable from the command line in the directory you created, after loading the proper modules by sourcing <code>src/g5_modules</code>. Although the SCM runs on a single processor, on '''discover''' you should run it from an interactive job on a compute node (as opposed to the '''discover''' front end). This can be done by running <code>qsub -I ijob</code>, where <code>ijob</code> is a job script that sets up the environment (examples are in <code>~aeichman</code> or <code>~amolod</code>). Once the job starts, it opens an interactive shell on the compute node, from which you can run the GEOS-5 executable. Since all of the necessary configuration files are copied to the experiment directory, the SCM needs none of the extra run-time infrastructure that the global-model run script <code>gcm_run.j</code> creates.

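A minimal pre-flight sketch of the launch sequence is below. The mock <code>touch</code> lines stand in for what <code>scm_setup</code> copies over, and the commented launch commands at the end are assumptions (in particular the executable name), shown only to indicate the order of operations on discover:

```shell
# Stand-ins for the configuration files scm_setup copies into the
# experiment directory (mock files for illustration only)
touch AGCM.rc CAP.rc cap_restart HISTORY.rc

# Check that the core configuration files are present before launching
for f in AGCM.rc CAP.rc cap_restart HISTORY.rc; do
    [ -f "$f" ] || { echo "missing $f" >&2; exit 1; }
done
echo "configuration files present"

# On discover you would then do (not run here; executable name assumed):
#   source $ESMADIR/src/g5_modules
#   qsub -I ijob        # interactive shell on a compute node
#   ./GEOSgcm.x         # run the SCM from the experiment directory
```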
== Creating Driving Datasets from MERRA ==

Given the resource and other files that come with a complete SCM configuration (either from an existing case or created with the procedure below), a driving data file for the same location and time span can be generated from MERRA output. Note that the current scheme for MERRA data does not include analysis increments in the advection terms, although it probably should.

=== Obtaining MERRA Data ===

MERRA output files are located under <code>/archive</code> on NCCS discover and can be time-consuming to obtain. A set of scripts has been created to make the task easier. To use them, create a subdirectory and copy the contents of <code>/discover/nobackup/aeichman/scm/util/get-merra</code> to it. You should have the following:

 getter.j
 MERRAobsys.rc
 README
 watcher.j

To use the scripts, modify the line in <code>getter.j</code> starting <code>setenv PATH ${PATH}:</code> ... to point to the directory <code>src/GMAO_Shared/GMAO_etc/</code> in your local Fortuna 2.0 build, which contains the necessary utilities. These use Perl libraries, which might require additions to your environment (assume at first that they don't). To specify the range of MERRA data to obtain, modify the variables <code>BEGIN_date</code> and <code>END_date</code> (both in the format YYYYMMDD). You may need to modify your group name in the PBS environment variables as well.

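Setting the date range can also be scripted. The sketch below edits a mock <code>getter.j</code> -- the exact form of the <code>setenv</code> lines is an assumption, so check the real script first -- for a hypothetical window covering the ARM July 1997 case:

```shell
# Mock of the date lines in getter.j; the real script's lines may differ
cat > getter.j <<'EOF'
setenv BEGIN_date 19790101
setenv END_date   19790201
EOF

# Hypothetical date window for the ARM July 1997 case
sed -i -e 's/^setenv BEGIN_date .*/setenv BEGIN_date 19970618/' \
       -e 's/^setenv END_date .*/setenv END_date   19970718/' getter.j

cat getter.j
```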
Then run <code>qsub watcher.j</code> (''not'' <code>getter.j</code>). It submits the <code>getter.j</code> script while submitting a version of itself to monitor the "getter" job. <code>getter.j</code> uses the <code>acquire</code> utility to smoothly transfer files from <code>/archive</code> to the current directory. If the getter job ends without finishing -- most likely because the allotted walltime ran out -- the watcher job will repeat the process until all the data in the specified range are copied to the current directory. For data sets of a month or so this may take a few hours, but the scripts should run without intervention. If something interrupts the process, the same scripts may be started again; <code>acquire</code> is intelligent enough to figure out where it needs to pick up. Keep in mind that the response time of the <code>/archive</code> filesystem can vary considerably (on the scale of hours to days, depending on downtime).

For more details, see <code>README</code>.

=== Generating the Driving Data ===

Now the ASCII .dat file used for the driving data can be created.

Under the directory <code>src/GMAO_Shared/GEOS_Util/post</code> in your Fortuna build, apply the following CVS command:

 cvs upd -r b_afe_Fortuna_merra2scm merra2scm.F GNUmakefile

This will check out the source file <code>merra2scm.F</code>, which must be modified for the time and location of the data set to be created. Change the parameters <code>begdate</code> and <code>enddate</code> to the dates you want to cover, but leave <code>begtime</code> and <code>endtime</code> alone. If you are replicating an existing experiment, <code>begdate</code> and <code>enddate</code> can be obtained from that experiment's <code>cap_restart</code> and <code>CAP.rc</code>, respectively. The parameters <code>lonbegin</code>, <code>lonend</code>, <code>latbegin</code>, and <code>latend</code> specify the location; appropriate values can be gleaned from the filenames in the corresponding experiment under <code>/discover/nobackup/aeichman/scm/scminfiles/</code> -- for example, the filename <code>tile.data_simple1_XY1x1-C_34N_100W_38N_95W</code>. (Note that these file names are truncated when copied by the SCM setup script.) Finally, change the variable <code>dirname</code> to the directory where you copied the MERRA data.

Then <code>cd</code> up to the <code>src</code> directory, run <code>make install</code>, and run the executable <code>merra2scm.x</code>. It will generate the driving data file <code>merra_scm.dat</code>, which can be used to replace the one supplied with the experiment data.

=== Required Modifications to the Model ===

At the time of this writing, loading a new case requires code modifications that must be compiled in. We are planning to remove this inconvenience. The source files to modify are in <code>src/GEOSgcs_GridComp/GEOSgcm_GridComp/GEOSagcm_GridComp/GEOSsuperdyn_GridComp/GEOSdatmodyn_GridComp</code>.

First, in <code>GEOS_DatmoDynGridComp.F90</code>, a case must be added in two <code>select</code> statements at (or near) line 1039 and line 1941:

 select CASE(trim(DATA$))

A sample case is shown below:

<pre>
case("merra_arm97jul")
   NT = 240
   NLEVEL = 42
   DATA_DRIVER=.true.
</pre>

For the case to be added, the <code>case</code> statement must have the name of the driver data file with the trailing <code>.dat</code> truncated (i.e., the file <code>merra_scm.dat</code> requires the <code>case</code> statement <code>case("merra_scm")</code>). The variable <code>NT</code> must be assigned the length of the time series of the driving data, and <code>NLEVEL</code> the number of pressure levels. These values may be obtained from the header of the driver data file.

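The two values can be read straight from the driver file header. The sketch below assumes a header whose first line carries NT and then NLEVEL -- that layout is a guess for illustration, so verify the field positions against your own .dat file:

```shell
# Mock driver-data header: first line assumed to hold NT and NLEVEL.
# This layout is an assumption, not a documented format.
cat > merra_scm.dat <<'EOF'
240 42
0.0 1000.0 mock data lines follow
EOF

# Pull the two header values from the first line
read NT NLEVEL < merra_scm.dat
echo "NT=$NT NLEVEL=$NLEVEL"
```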
A similar, though simpler, modification must be made in the other <code>case</code> statement that has MERRA SCM experiment names as cases, and in the <code>if</code> statement at about line 1074.

Similarly, any experiment using MERRA data requires a modification to <code>reader.F90</code> at about lines 179, 302 and 325. The <code>if then</code> statement there:

<pre>
if(filename.eq."arm_97jul.dat".or. &
   filename.eq."merra_arm97jul.dat".or. &
   filename.eq."merra_arm_scmx.dat")then
</pre>

requires the addition of the full driver data file name.

With these modifications in place, the model may be recompiled.

Finally, the parameter <code>DRIVER_DATA</code> in <code>AGCM.rc</code> needs to be changed to the full filename of the driver data. Note that you will probably have to change the begin time in <code>cap_restart</code> and the end time in <code>CAP.rc</code> to the appropriate times from <code>begtime</code> (probably <code>000000</code>) and <code>endtime</code>.

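The <code>AGCM.rc</code> change is a one-line substitution; the sketch below performs it on a mock file (the original value shown is illustrative):

```shell
# Mock AGCM.rc containing the parameter to change; the old value is
# illustrative only
cat > AGCM.rc <<'EOF'
DRIVER_DATA: arm_97jul.dat
EOF

# Point the model at the newly generated MERRA driving data file
sed -i 's/^DRIVER_DATA:.*/DRIVER_DATA: merra_scm.dat/' AGCM.rc
cat AGCM.rc
```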
== Creating New SCM Case Boundary Conditions ==

To create the boundary condition files for a new SCM case with a location and time span different from the existing ones, there is a set of IDL scripts in

 src/GEOSgcs_GridComp/GEOSgcm_GridComp/GEOSagcm_GridComp/GEOSsuperdyn_GridComp/GEOSdatmodyn_GridComp/idl

IDL can be run from the '''dali''' machine, which you should be able to <code>ssh</code> to from '''discover'''. See the NCCS documentation for IDL for help: http://www.nccs.nasa.gov/dali_qna.html#step5. Follow the steps and start IDL (<code>idl</code>). To run the script <code>make_bcs_ics.pro</code> enter:

 IDL> .run make_bcs_ics

The procedure requires the restart files <code>fvcore_internal_rst</code>, <code>moist_internal_rst</code> and <code>catch_internal_rst</code> (with or without dates appended) seasonally appropriate for the experiment's start date, chosen in the priority month-day-year. Most available restart files have the time 2100z; the effects of a mismatch between the restart time-of-day and the beginning time of the model apparently diminish after a few days of spinup.

In the file <code>make_bcs_ics.pro</code> make the following changes:

* <code>odir</code> to an appropriate directory in your area for input files
* <code>gdirbase</code> to a directory in your area for output
* <code>cr</code> to the geographic range in coordinate degrees (<code>[''S W N E'']</code>)
* <code>casename</code> to a directory-appropriate name (this will be a subdirectory created for output in <code>gdirbase</code>)
* <code>sst_impose</code> to the desired SST boundary condition (if necessary, in K)
* <code>ntype</code> (tile surface type) to 100 if over land, 0 if over sea
* <code>'moist_internal_rst.b19830214_21z'</code> to the name of your moist internal restart file
* the instances of <code>'fvcore_internal_rst.b19830214_21z'</code> to the name of your fvcore internal restart file

In <code>odir</code>, place the following files:

 catch_internal_rst*
 FV_144x91_DC_360x180_DE.til
 fvcore_internal_rst*
 lai_green_clim_144x91_DC.data
 moist_internal_rst*
 nirdf_144x91_DC.dat
 topo_DYN_ave_144x91_DC.data
 topo_GWD_var_144x91_DC.data
 topo_TRB_var_144x91_DC.data
 vegdyn_144x91_DC.dat
 visdf_144x91_DC.dat

The files other than the restarts can currently be obtained from <code>/discover/nobackup/amolod/bcs/144x91/</code>.

Then run <code>make_bcs_ics.pro</code> from the IDL command line. This will create a set of files in <code>gdirbase/casename</code>.

Now you have to select a tile from the file <code>FV_144x91_DC_360x180_DE.til</code>. After the header, each line contains the specifications for one tile. Find a tile close to the location of your experiment: the third column is longitude, the fourth latitude. The first column should be the same as the <code>ntype</code> in <code>make_bcs_ics.pro</code>. The last column is the tile number, which you should record.

Matlab (running on '''dali''') can make this task easier. First edit a copy of <code>FV_144x91_DC_360x180_DE.til</code>, deleting the file header (the first eight lines). Then from Matlab you can <code>load</code> it. The following is an example of finding tiles near 40N 77W:

<pre>
>> format shortG
>> load FV_144x91_DC_360x180_DE.til
>> lat=40;lon=-77;
>> tilelines=find(int8(FV_144x91_DC_360x180_DE(:,4))==lat & int8(FV_144x91_DC_360x180_DE(:,3))==lon );FV_144x91_DC_360x180_DE(tilelines,[1 4 3 12])

ans =

          100       39.963      -76.626         6894
          100        39.96      -76.975         6895
          100       40.182      -76.625         6896
          100       40.187      -76.899         6899
          100       40.134      -77.357         6900
          100       40.353      -77.128         6901
          100       39.595      -77.256         6935
           19       39.566      -76.569        67187

>>
</pre>

The last command displays candidate tiles with their land/sea value, latitude, longitude, and tile number. It might make sense to examine adjacent whole lat/lon values.

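The same search can be done with <code>awk</code> from the shell, without deleting the header by hand. The sketch below runs on three made-up tile records laid out as described above (type in column 1, longitude in column 3, latitude in column 4, tile number in column 12); the coordinates and tile numbers are illustrative, not real entries:

```shell
# Three made-up tile records in the .til column layout described above
cat > tiles.til <<'EOF'
100 1 -76.626 39.963 0 0 0 0 0 0 0 6894
100 1 -77.357 40.134 0 0 0 0 0 0 0 6900
19 1 -76.569 39.566 0 0 0 0 0 0 0 67187
EOF

# Land tiles (type 100) within half a degree of 40N 77W; print tile numbers
awk '$1==100 && $4>=39.5 && $4<=40.5 && $3>=-77.5 && $3<=-76.5 {print $12}' tiles.til
```

On the real file you would first strip the eight header lines (e.g. with <code>tail -n +9</code>) before filtering.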
Edit <code>make_land_files.pro</code> so that <code>bcsodir</code>, <code>xdir</code> and <code>casename</code> are the same as <code>odir</code>, <code>gdirbase</code> and <code>casename</code>, respectively, in <code>make_bcs_ics.pro</code>. Also change <code>itile</code> to the tile number you recorded from the tile file, and <code>catchname</code> to the name of your catchment restart. Then run the script. It will create a subdirectory <code>Landfiles</code> in the output directory and generate the land BC files there.

To make an appropriate <code>AGCM.rc</code>, copy one from an existing SCM case and change the following:

* <code>AGCM_GRIDNAME</code> and <code>OGCM_GRIDNAME</code> to reflect the coordinates in the filenames of the files that you just generated
* <code>DRIVER_DATA</code> to the name of your driving data file (for example, one created from MERRA data as in the section above)

Likewise, copy a <code>CAP.rc</code> and change the <code>END_DATE</code> as appropriate. Do the same for the start date in <code>cap_restart</code>. A <code>HISTORY.rc</code> can be copied without modification. Keep <code>AGCM.rc</code>, <code>CAP.rc</code>, <code>cap_restart</code>, and <code>HISTORY.rc</code> with the output from the IDL scripts. The generated files will have to be either renamed or linked to names that the model will recognize -- see <code>/discover/nobackup/aeichman/scm/scminfiles/arm_97jul</code> for an example. You should have the following:

 AGCM.rc
 CAP.rc
 cap_restart
 catch_internal_rst
 datmodyn_internal_rst
 fraci.data
 fvcore_internal_rst
 HISTORY.rc
 laigrn.data
 moist_internal_rst
 nirdf.dat
 SEAWIFS_KPAR_mon_clim.data
 sst.data
 sstsi.data
 tile.data
 topo_dynave.data
 topo_gwdvar.data
 topo_trbvar.data
 vegdyn.data
 visdf.dat

These files, plus a driving data file, comprise the case-specific files for an SCM case, similar to the cases in <code>/discover/nobackup/aeichman/scm/scminfiles/</code>; <code>getSCMdata.sh</code> can then be used to set up the model environment to run.

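The renaming/linking step above can be scripted. The sketch below links two stand-in IDL output files to model-recognized names; the generated-name patterns are assumptions, so compare against the <code>arm_97jul</code> example directory for the real correspondence:

```shell
# Stand-ins for two files produced by the IDL scripts; the generated
# names here are assumed, not the scripts' documented output
touch vegdyn_144x91_DC.dat visdf_144x91_DC.dat

# Link them to the names the model looks for
ln -sf vegdyn_144x91_DC.dat vegdyn.data
ln -sf visdf_144x91_DC.dat visdf.dat

ls vegdyn.data visdf.dat
```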
== Some Discussion About How to Use and Configure SCM ==

The following section contains excerpts from user emails, along with replies that might answer questions that come up.

=== The VERTICAL_ADVECTION Flag ===

''From ARM_97JUL, I do see there is a flag "VERTICAL_ADVECTION" in AGCM.rc. I am trying to turn it off by setting it to "0". Is it the right way to use observed vertical advection?''

Yes, that is the way to turn it off, but turning it off is what gives 50-degree temperature biases. There is a code change, which Greg Walker tried and said made a big difference, that I am testing now. We suspect that the T vertical advection term in the obs dataset is missing the adiabatic expansion term (i.e., that it is dT/dp and not the total vertical tendency). So we will try to use the vertical advection of s term (it is already divided by Cp, I think). In that case the idea would be to go into <code>reader.F90</code> and (as Greg Walker did) do something like the following.

Add the line:

 TMP(13,:,:) = TMP(13,:,:)/3600.0

near where we do other conversions like this. Then, instead of:

 T_V_ADV(i,k) = -dv(7,K,I)

put in something like:

 if (filename.eq."arm_97jul.dat") then
    T_V_ADV(i,k) = -dv(13,K,I) ! Vertical_s_Advec/cp (K/s)
 else
    T_V_ADV(i,k) = -dv(7,K,I)  ! Vertical_T_Advec (K/s) [is omega*alpha included?]
 endif

The idea is to use the vertical advection of s term for the ARM 97 July case for now (it's possible that there are other cases where we have to do this as well, but until we know that, we want to try it for ARM 97 July only).

So: if you want to turn off the calculation of vertical advection, set <code>VERTICAL_ADVECTION</code> to 0 in <code>AGCM.rc</code> (or leave it out -- 0 is the default), and then I would highly recommend doing what I suggest here.

=== CGILS Experiments ===

''It will be great if we can have some cases to study low clouds, such as off the coasts of California and Peru. Joao and I are interested in using the SCM for low-cloud study.''

Just to let you both know: there is a new set of cases that we can now do with the SCM, but they may not make their way into the 'official' set of cases. They are the CGILS cases -- the set of CFMIP-GCSS simulations at three points on the transect from the California coast to the mid-Pacific (stratus, stratocumulus, cumulus). The CGILS project includes a set of 3 or 4 LES simulations of the same 3 spots. The SCM forcing is idealized and the simulation is perpetual July 15 with no diurnal cycle (the code changes needed for this will not be in 'official' code releases -- that's why these experiments won't be on the 'list'), but the set-up is great for testing PBL schemes and their interaction with shallow and deep convection in marine boundary layers. I have done the simulations with our Fortuna-2_0 code, have also done a suite of sensitivity experiments, and am continuing to do more. If you'd like the code mods for these runs, I can provide them.

== Parameters for progno_cloud ==

The following is a list of parameters for <code>progno_cloud</code> that can be set in the <code>AGCM.rc</code> configuration file.

<pre>
Slot  Name (AGCM.rc)      default  var. name      description
----  --------------      -------  ---------      ------------
  1   'CNV_BETA:',        10.0     CNV_BETA       Divide convective rain by cnv_beta for Marsh-Palm
                                                  drop size, number, velocity - used for evap of rain
  2   'ANV_BETA:',        4.0      ANV_BETA       Divide anvil rain rate by anv_beta
  3   'LS_BETA:',         4.0      LS_BETA        Divide Large Scale rain by ls_beta
  4   'RH_CRIT:',         1.0      RH00           Upper limit on critical relative humidity for evap/condense
  5   'AUTOC_LS:',        2.0e-3   C_00           Multiplication factor (+unit conversion) for autoconversion
                                                  rate (autoconvert exp(-rate * dt) )
  6   'QC_CRIT_LS:',      8.0e-4   LWCRIT         Scale autoconversion (impact ~ 1 - exp(-1/lwcrit)**2 )
  7   'ACCRETION:',       2.0      C_ACC          Scale factor for accretion of cloud water by rain/snow
  8   'BASE_REVAP_FAC:',  1.0      C_EV           Scale factor for rain/snow re-evap (re-evap ~ 1 - exp(- c_ev) )
  9   'VOL_TO_FRAC:',     -1.0     CLDVOL2FRC     Not used
 10   'SUPERSAT:',        0.0      RHSUP_ICE      Not used
 11   'SHEAR_EVAP_FAC:',  1.3      SHR_EVAP_FAC   Not used
 12   'MIN_ALLOW_CCW:',   1.0e-9   MIN_CLD_WATER  Not used
 13   'CCW_EVAP_EFF:',    3.3e-4   CLD_EVP_EFF    Scale for evap of cloud water/(subl of ice) (+unit conv)
 14   'NSUB_AUTOCONV:',   20.      NSMAX          Not used
 15   'LS_SUND_INTER:',   4.8      LS_SDQV2       Factor to control how fast LS ice autoconv drops at cold temps
 16   'LS_SUND_COLD:',    4.8      LS_SDQV3       Factor to control how fast LS ice autoconv drops at coldest temps
 17   'LS_SUND_TEMP1:',   230.     LS_SDQVT1      Temp at which to start decrease in LS ice autoconv ramping
 18   'ANV_SUND_INTER:',  1.0      ANV_SDQV2      Factor to control how fast anvil ice autoconv drops at cold temps
 19   'ANV_SUND_COLD:',   1.0      ANV_SDQV3      Factor to control how fast anvil ice autoconv drops at coldest temps
 20   'ANV_SUND_TEMP1:',  230.     ANV_SDQVT1     Temp at which to start decrease in anvil ice autoconv ramping
 21   'ANV_TO_LS_TIME:',  14400.   ANV_TO_LS      Not used
 22   'NCCN_WARM:',       50.      N_WARM         Not used
 23   'NCCN_ICE:',        0.01     N_ICE          Not used
 24   'NCCN_ANVIL:',      0.1      N_ANVIL        Not used
 25   'NCCN_PBL:',        200.     N_PBL          Not used
 26   'DISABLE_RAD:',     0.       DISABLE_RAD    Flag (=1) to disable radiative interaction with cloud/rain
 27   'ICE_SETTLE:',      0.                      Not used
 28   'ANV_ICEFALL:',     0.5      ANV_ICEFALL_C  Scale for fall rate of anvil ice (used to scale ice autoconv)
 29   'LS_ICEFALL:',      0.5      LS_ICEFALL_C   Scale for fall rate of LS ice (used to scale ice autoconv)
 30   'REVAP_OFF_P:',     2000.    REVAP_OFF_P    Max pressure at which to do precip re-evap (mb)
 31   'CNV_ENVF:',        0.8      CNVENVFC       Scale factor for convective rain/snow re-evap
                                                  (re-evap ~ 1 - exp(- envfrac) ) - fraction of re-evap in
                                                  environment as opposed to in the cloud
 32   'WRHODEP:',         0.5      WRHODEP        Control rate of dec/incr of ice fall speed with high/low press
 33   'ICE_RAMP:',        -40.0    T_ICE_ALL      = ICE_RAMP + MAPL_TICE - Temp at which all cloud/precip is ice
                                                  (fraction=1, use L of ice)
 34   'CNV_ICEPARAM:',    1.0      CNVICEPARAM    Control on how much new conv precip is ice
                                                  (1=> use ice fraction, 0=> all liquid)
 35   'CNV_ICEFRPWR:',    4.0      ICEFRPWR       = CNV_ICEFRPWR + .001 -- Scale ice fraction (liquid/ice
                                                  partition for condensation/evap/melting&freezing)
                                                  Fraction = Fraction ** icefrpwr
 36   'CNV_DDRF:',        0.0      CNVDDRFC       Fraction of re-evap of conv precip to reserve for re-evap
                                                  lower in atm (in a "downdraft")
 37   'ANV_DDRF:',        0.0      ANVDDRFC       Fraction of re-evap of conv precip to reserve for re-evap
                                                  lower in atm (in a "downdraft")
 38   'LS_DDRF:',         0.0      LSDDRFC        Fraction of re-evap of conv precip to reserve for re-evap
                                                  lower in atm (in a "downdraft")
 39   'AUTOC_ANV:',       1.0e-3                  Not used
 40   'QC_CRIT_ANV:',     8.0e-4                  Not used
 41   'TANHRHCRIT:',      1.       tanhrhcrit     Flag to use tanh vertical profile for RH crit for condens/evap
 42   'MINRHCRIT:',       0.8      minrhcrit      Min RH in tanh profile
 43   'MAXRHCRIT:',       1.0      maxrhcrit      Max RH in tanh profile
</pre>

|
'''Back to [[GEOS-5 Documentation for Ganymed 1.0]]'''