Wednesday, August 17, 2011

Submitting a ROMS JOB on the Oceanography Department's High Performance Computing Facilities - hosted by the Climate Systems Analysis Group

Hi

I have run a ROMS test case on the Oceanography Department's new High Performance Computing Facilities - core.csag.uct.ac.za

For useful information and guidelines on using the cluster I suggest you read the “Guidelines for the use and administration of the Oceanography Department's High Performance Computing Facilities - hosted by the Climate Systems Analysis Group” which should be given to you once an account has been created for you on the cluster. Information on the Queue management system (Sun Grid Engine) and shared libraries can be found in this document.

I have made a testcase directory in my scratch directory that a new user can copy into their scratch directory and use as a starting point - /scratch/ocean/nburls/ROMS_CORE_testcase

The files of interest in it are:

1) jobcomp_mpich2_intel64

Note! As mentioned at the top of this script, you have to source the Intel compiler in your .bashrc script or in the jobcomp script.

# source /share/apps/intel/Compiler/11.1/073/bin/ifortvars.sh intel64

# source /share/apps/intel/Compiler/11.1/073/bin/iccvars.sh intel64
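
That is, either uncomment these two lines where they appear in the jobcomp script, or add them (without the leading #) to your .bashrc:

source /share/apps/intel/Compiler/11.1/073/bin/ifortvars.sh intel64
source /share/apps/intel/Compiler/11.1/073/bin/iccvars.sh intel64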

Also note that, as currently configured, this script makes use of source code in my home directory,

set SOURCE=/home/nburls/Roms_tools/Roms_Agrif

and the MPICH2 library,

mva_base=/share/apps/mpich2-1.3.2p1-intel64

and netcdf library,

set NETCDFLIB=-L/share/apps/netcdf-3.6.3-intel_intel64/lib

set NETCDFINC=-I/share/apps/netcdf-3.6.3-intel_intel64/include
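
With the paths set, compiling is just a matter of running the script (assuming it is executable, i.e. chmod +x); redirecting the output to a log file is my own habit, not a requirement, but it makes compile errors easier to find:

$ ./jobcomp_mpich2_intel64 > jobcomp.log 2>&1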

2) run_roms_CORE_joinfiles.csh OR run_roms_CORE_nojoin.csh

Currently set to run on 8 processors.

Using

set NCJOIN=/home/nburls/bin/ncjoin

set PARTIT=/home/nburls/bin/partit

As per Nicolette's previous post, the run_roms_CORE_nojoin.csh run script does not rejoin the average and diagnostic output files, while run_roms_CORE_joinfiles.csh does.

See Nicolette's SUN Runscripts - http://uctoceanmodelling.blogspot.com/2011/02/update-of-sun-cluster-running-scripts.html
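
For completeness: on an SGE-managed cluster the run script is submitted with qsub. A minimal sketch, assuming a parallel environment named mpich exists for MPI jobs (check the guidelines document for the actual name and any required flags):

$ qsub -pe mpich 8 run_roms_CORE_joinfiles.csh   # request 8 slots, matching the run script
$ qstat                                          # check that the job is queued/running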

Let me know if any issues pop up. Hopefully not.

The next step will be to get the Matlab licence server up and running on the CORE cluster so that one can do ROMS pre- and post-processing in batch mode on the cluster.

Natalie


Friday, February 25, 2011

Update of SUN cluster running scripts

I have updated/written a few scripts for running on the SUN cluster at the CHPC. They are available in my work directory under "runscripts".

run_roms_l.moab
    • The previous run script waited for the tiles to be joined before time-stepping of the next month could proceed. Since joining is a serial (1 CPU) job, this held up all the processors.
    • Changes: the tiles are now renamed with their dates, to be joined later by the next script.
    • Restart and history files are still joined, as these files are small and quick to join, and so you can monitor your model run.
join_files.moab
    • This takes the renamed tiles created above and joins them with ncjoin.
    • This takes place in parallel, so it can be submitted to the SUN; alternatively it can run in serial on your desktop by changing the number of parallel processes.
    • Be careful with diagnostic files: I created a monster 28 GB average file!!!!!
rename_files.sh or rename_files.moab
    • When the model crashes (due to an MPI error or something), the time-stepping and the file naming get out of sync. This renames all the files with the correct dates after the run.
    • This is a serial procedure. The .moab version is so you can submit it to the cluster.
I've tested writing, joining and renaming monthly average files. The scripts produce log and error files. I haven't tested for nesting. Read the description in the file headers.
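
For anyone new to ncjoin, the join step boils down to something like the following (the file names here are hypothetical; the real logic, including picking up the renamed tiles, lives in join_files.moab):

$ ncjoin roms_avg_Y5M1.*.nc   # merge the per-processor tiles into a single netCDF file
$ ncjoin roms_his_Y5M1.*.nc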

Let me know if I missed something.
Nicolette

Thursday, January 13, 2011

Ncview: A quick and easy visual browser for netcdf files.

Hi All,

I'm sure this is probably boring old news to most of you... but at the risk of being redundant I thought I would share, just in case.

Ncview is a quick and easy way of viewing netCDF-formatted files. No code required, just a click of a button. Ncview typically needs to be run in a Linux-based environment (Mac OS X works nicely too). It can open up several data files at once and will run a little movie of all your data if you so fancy. It also has a range of different colour bars you can choose from.
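
Typical usage from the command line is just (file names here are hypothetical):

$ ncview roms_his.nc       # browse one file
$ ncview roms_avg_Y*.nc    # open several files at once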

For more information and the download:
http://meteora.ucsd.edu/~pierce/ncview_home_page.html

Sarah

Monday, December 13, 2010

Making loaddap work on Ubuntu at UCT


  1. Download a libdap.rpm and loaddap.rpm appropriate to your version of Matlab. For Matlab 6, loaddap 3.5.2 and libdap 3.6.2 are the packages to use: http://www.opendap.org/download/ml-structs.html
  2. Convert the rpm packages to Debian (.deb) using alien, which is in the Ubuntu repository (a condensed sketch of steps 1 to 3 is given after this list)
  3. Install the packages, libdap first and then loaddap
  4. Install a pthreads library (libpthread), which is in the repository
  5. You will need to create links for a number of libraries that may exist as newer versions in your version of Ubuntu (in my case, 10.04/Lucid Lynx); the versions may differ according to the versions of loaddap and libdap that you are using:
  • In /usr/lib: sudo ln -s libssl.so.0.9.8 libssl.so.4
  • In /usr/lib: sudo ln -s libcrypto.so.0.9.8 libcrypto.so.4
  • In /usr/lib: sudo ln -s libcurl.so.3 libcurl.so.2
  • In /lib: sudo ln -s libcom_err.so.2.1 libcom_err.so.3
  6. This is the dodgy bit: the /usr/bin/loaddap.mexglx seems to be incompatible with the version of Matlab and/or mexcdf that I have, so I copied the one from /Roms_tools/Opendap_tools/FEDORA/ and pasted it into /usr/bin (renaming the old one loaddap.mexglx_bckp)
  7. Edit your start.m script in order to point to the loaddap and libdap libraries:
  • addpath(['/usr/bin'])
  • addpath(['/usr/lib'])
  • addpath(['/usr/sbin'])
  • addpath(['/usr/share'])
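
For steps 1 to 3, the terminal session looks roughly like this (the package file names are illustrative; use the versions you actually downloaded, and note that alien bumps the package revision number):

$ sudo apt-get install alien              # rpm-to-deb converter, from the Ubuntu repository
$ sudo alien libdap-3.6.2-1.i386.rpm      # produces something like libdap_3.6.2-2_i386.deb
$ sudo alien loaddap-3.5.2-1.i386.rpm
$ sudo dpkg -i libdap_3.6.2-2_i386.deb    # install libdap first...
$ sudo dpkg -i loaddap_3.5.2-2_i386.deb   # ...and then loaddap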

If you are using a machine that is not at UCT, you should be good to go. If you are at UCT, you need to do the following (thanks Neil!) in order to get around some proxy issues.

  1. Install cntlm (available in the ubuntu repository)
  2. As root, edit /etc/cntlm.conf following the example below (these should be the only uncommented lines in the file):

Username        <your UCT username (i.e. staff or student number)>
Domain          wf
Password        <your UCT password>   # Use hashes instead (-H)
Proxy           campusnet.uct.ac.za:8080
Listen          3128

ISAScannerSize  1024
ISAScannerAgent Wget/
ISAScannerAgent APT-HTTP/
ISAScannerAgent Apt/
ISAScannerAgent Svn/
ISAScannerAgent Loaddap/


  3. To activate the changes made in cntlm.conf, so that the proxy issues are bypassed and your username and password are provided automatically for command-line OPeNDAP downloads, you need to type:

sudo /etc/init.d/cntlm start

You won't want to have to type this command every time you do a download, so to automate it on start-up, go to System – Preferences – Startup Applications. Click on Add, type a name for the command (e.g. Run cntlm), the command itself (use gksudo instead of sudo: gksudo /etc/init.d/cntlm start) and a comment if you wish. You will be asked for your root password in a GUI on startup (the reason you need to use gksudo) so that this is able to run.

  4. In .bashrc, add the following lines (the port needs to be consistent with what is in cntlm.conf):

export http_proxy=http://localhost:3128

export https_proxy=$http_proxy

export ftp_proxy=$http_proxy

  5. In System – Preferences – Network Proxy, I changed mine to Manual Proxy Configuration (and checked the box to use the same proxy for all protocols), with localhost as the HTTP proxy and the port 3128.

Note: for my Mozilla Firefox (version 3.6.12) and for my Thunderbird email client I kept the network connection preference at 'Auto-detect proxy settings'.

Note: this is one solution. There may also be a cleaner work-around in which you specify your proxy, username and password in .dodsrc (which should be in your home directory).
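
If you want to try that route, the option to look at in .dodsrc is PROXY_SERVER, along these lines (the exact syntax differs between libdap versions, so treat this purely as a sketch and check the libdap documentation):

PROXY_SERVER=http,http://<username>:<password>@campusnet.uct.ac.za:8080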


Friday, December 3, 2010

Bulk renaming files and animations

It is sometimes useful to bulk rename files and create animations.

I had thousands of images in sequential order of the form ghrsst_YYMMDD_reg.tdf.reg.jpg that I needed to rename to img0000001.jpg, img0000002.jpg, etc.

To do so I did the following:
$ rename 's/_reg.tdf.reg.jpg/.jpg/' * # replace all the "_reg.tdf.reg.jpg" with just ".jpg"
$ rename 's/ghrsst_/0/' *.jpg # replace all the "ghrsst_" with "0"
$ x=1; for i in *jpg; do counter=$(printf %07d $x); ln -s "$i" img"$counter".jpg; x=$(($x+1)); done # symlink each image to a zero-padded sequential name

Then to make the animation from the linked files (-r 8 sets the frame rate to 8 frames per second, -b 8000k the bitrate, and -s 1920x1080 the output resolution):
$ ffmpeg -f image2 -r 8 -i img%07d.jpg -b 8000k -s 1920x1080 Ghrsst.mp4


This makes a nice high resolution animation of the jpg images you created.

See this link for the animation. WARNING: This animation is ~180 MB!!!

Cheers
Bjorn

Tuesday, November 30, 2010

Submitting a ROMS JOB on CHPC-IQudu

Hi All

Finally, I have ROMS running again on CHPC's iQudu cluster (e1350 IBM Linux cluster)

iQudu used to be set up with IBM's LoadLeveler, but recently CHPC made the change to Moab. This meant starting from scratch!

I have made a testcase directory in my work directory on iQudu that a new user can copy into their work directory and use as a starting point.

/CHPC/work/nburls/ROMS_iQudu_testcase

The files of interest in it are:

1) jobcomp_iqudu_intel

Note! As mentioned at the top of this script, you have to source the Intel compiler in your .bashrc script.

Also note that, as currently configured, this script makes use of source code in my home directory,

set SOURCE=/CHPC/home/nburls/Roms_Agrif_v2.1_20_07_2010/Roms_Agrif

and an MVAPICH library (/CHPC/home/nburls/mvapich_intel) and a netCDF library (/CHPC/home/nburls/netcdf_intel), both in my home directory.

2) run_roms_iqudu.moab

Note! Make sure you change the following lines to point to your directory.
#MSUB -o /CHPC/work/nburls/ROMS_iQudu_testcase/roms_log.out
#MSUB -e /CHPC/work/nburls/ROMS_iQudu_testcase/roms_log.err
#MSUB -d /CHPC/work/nburls/ROMS_iQudu_testcase/
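
Submitting and monitoring the job then uses the standard Moab commands:

$ msub run_roms_iqudu.moab   # prints a job ID on success
$ showq -u $USER             # check your jobs in the queue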

Let me know if any issues pop up. Hopefully not.

Natalie

Friday, November 19, 2010

Experiments in HYCOM


The Hybrid Coordinate Ocean Model (HYCOM) combines the optimal features of isopycnic-coordinate and fixed-grid ocean circulation models in one framework [1]. The name “hybrid” derives from the model's ability to dynamically change its vertical layer distribution between isopycnic (ρ), z-level and bathymetry-following σ-coordinates, regularly adjusting to the vertical structure most suitable for a specific region of the ocean. The adaptive vertical grid conveniently resolves regions of strong vertical density gradients, such as the thermocline and surface fronts.

Over the past few years I have been developing a nested regional 1/10° HYCOM of the greater Agulhas region (for more info, see http://www.nersc.no/~bjornb/PhdThesisBjornBackeberg.pdf). Model validation is an important aspect in developing realistic ocean simulations.

Marjolaine Rouault has been working on deriving ocean surface current velocities from the Advanced Synthetic Aperture Radar (ASAR; [2]) mounted on the Envisat satellite. The ASAR velocity data product has a resolution ranging from 4 to 8 km. ASAR is a microwave sensor, and its ability to “see” through clouds makes it a very powerful data set for mapping the Agulhas Current at high resolution, and hence also for validating models.


The figure above on the left shows the 2 year mean surface radial velocity component derived from ASAR. It is the radial component of the velocities because Envisat's ground track is -15° from North, and one can only derive current velocities from ASAR when the current is perpendicular to the satellite ground track, which is luckily the case for the Agulhas. The black contour lines represent the 200, 500, 1000, 2000, and 4000 m isobaths.

One can see that in the southern part of the Agulhas, the current is strongly steered by the bathymetry. At the eastern edge of the Agulhas Bank, the core of the current closely follows the 1000 m isobath.

The above figure to the right shows the 2 year mean radial velocity component derived from HYCOM version 2.1. Two noticeable differences are evident:

  1. The current velocities of the southern Agulhas Current simulated in HYCOM are markedly weaker (almost a factor of 3 difference) than those derived from ASAR.
  2. At the eastern edge of the Agulhas Bank, the current follows the 2000 m isobath instead of the 1000 m isobath, as suggested by the ASAR observations.

In HYCOM, each vertical layer is assigned a reference density, which is the density it reverts to when changing from fixed vertical coordinates to isopycnic coordinates. The reference densities in version 2.1 of HYCOM were chosen to mimic those of the parent model supplying the lateral boundary conditions, which range from 21.0 to 28.3 kg/m3. Plotting the vertical distribution of the layers in HYCOM 2.1 (not shown), it is evident that too many layers lie within the mixed layer and the upper 200 m, leaving only 7 layers to simulate the remainder of the water column, which suggests that the chosen reference densities are inadequate for the Agulhas region.

Version 2.1 of HYCOM applies a relatively crude vertical interpolation scheme to the mixed layer, which causes enhanced, and artificial, diapycnal mixing within the mixed layer that may diffuse the core of the current.

Recently, we upgraded HYCOM to version 2.2, and some new and improved features include:
  • A new GISS mixed layer scheme
  • WENO-like PPM interpolation of the mixed layer
    (WENO = Weighted Essentially Non-Oscillatory; PPM = Piecewise Parabolic Method)
  • Bottom layer KPP
  • Slow evolution of the barotropic mode, which allows us to double the barotropic time-step.

Then, to test the importance of the vertical layer distribution, I ran two experiments using version 2.2 of HYCOM:

expt01.0: Using the same reference densities as in HYCOM 2.1
expt01.1: Adjusting the reference densities to a range more suitable for the Agulhas region. Based on potential density observations from the WOCE transect I6, a new density range from 23.6 to 27.6 kg/m3 was chosen. The vertical resolution was increased between 23.5 and 26.8 kg/m3 to capture the salinity maximum in the Mozambique Channel at approximately 150 – 300 m, and between 27.1 and 27.7 kg/m3 to capture the salinity minimum in the South Atlantic at approximately 600 – 1200 m.

The figures below show the 2 year mean surface radial velocity component derived from expt01.0 and expt01.1 (left and right respectively). The model was run for 2 years only, so it is not spun up; these are preliminary tests only.

However, these preliminary results indicate that implementing the new version of HYCOM increases the velocities of the simulated current. There is also some improvement in HYCOM's ability to simulate an Agulhas Current following the 1000 m isobath along the eastern edge of the Agulhas Bank, despite using the same reference densities as in version 2.1.

Qualitatively comparing HYCOM 2.2 expt01.0 and expt01.1 indicates that adjusting the reference densities has an effect on the position of the mean flow of the Agulhas Current. Expt01.1 seems to follow the 1000 m isobath more closely, although in general the current remains too wide compared to the ASAR-derived observations. These results highlight the importance of the vertical discretisation of the model grid when simulating the ocean.


 


These experiments with HYCOM 2.2 have been carried out using a 2nd-order momentum advection scheme, and it has been shown, using HYCOM 2.1, that a 4th-order momentum advection scheme has a significant impact on the solution [3].
 
The 4th order momentum advection scheme uses a super-slip condition at the coast. Presently I am testing a simulation using the free-slip condition at the coast, to see whether this has an impact on the simulation of the Agulhas.

References

  1. Bleck, R.: An oceanic general circulation model framed in hybrid isopycnic-Cartesian coordinates, Ocean Modell., 4, 55–88, 2002
  2. Rouault, M. J., A. Mouche, F. Collard, J. A. Johannessen and B. Chapron: Mapping the Agulhas Current from space: an assessment of ASAR surface current velocities, J. Geophys. Res., 115, C10026, doi:10.1029/2009JC006050, 2010
  3. Backeberg, B. C., Bertino, L., and Johannessen, J. A.: Evaluating two numerical advection schemes in HYCOM for eddy-resolving modelling of the Agulhas Current, Ocean Sci. 5(2), 173–190, 2009