Monday, December 13, 2010

Making loaddap work on ubuntu at UCT


  1. Download a libdap.rpm and loaddap.rpm appropriate to your version of matlab. For matlab 6, loaddap 3.5.2 and libdap 3.6.2 are the packages to use: http://www.opendap.org/download/ml-structs.html
  2. Convert the rpm packages to debian (.deb) packages using alien (which is in the ubuntu repository).
  3. Install the packages, libdap first and then loaddap (see the sketch after this list for steps 2 and 3).
  4. Install a pthreads library (libpthread), which is also in the repository.
  5. Create links for a number of libraries that exist under newer versions in your version of ubuntu (in my case, 10.04/Lucid Lynx); the exact versions may differ according to the versions of loaddap and libdap that you are using:
  • In /usr/lib: sudo ln -s libssl.so.0.9.8 libssl.so.4
  • In /usr/lib: sudo ln -s libcrypto.so.0.9.8 libcrypto.so.4
  • In /usr/lib: sudo ln -s libcurl.so.3 libcurl.so.2
  • In /lib: sudo ln -s libcom_err.so.2.1 libcom_err.so.3
  6. This is the dodgy bit: the /usr/bin/loaddap.mexglx seems to be incompatible with the version of matlab and/or mexcdf that I have, so I copied the one from /Roms_tools/Opendap_tools/FEDORA/ into /usr/bin (renaming the old one loaddap.mexglx_bckp).
  7. Edit your start.m script so that it points to the loaddap and libdap libraries:
  • addpath(['/usr/bin'])
  • addpath(['/usr/lib'])
  • addpath(['/usr/sbin'])
  • addpath(['/usr/share'])
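
A minimal sketch of steps 2 and 3, assuming hypothetical rpm file names (yours will differ according to the exact packages you downloaded):

$ sudo apt-get install alien
$ sudo alien --to-deb libdap-3.6.2-1.i386.rpm loaddap-3.5.2-1.i386.rpm
$ sudo dpkg -i libdap_*.deb
$ sudo dpkg -i loaddap_*.deb

Once everything is in place you can test the install from within matlab with something like loaddap('http://test.opendap.org/dap/data/nc/fnoc1.nc'), which should load the variables from OPeNDAP's test dataset into your workspace.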

If you are using a machine that is not at UCT, you should be good to go. If you are at UCT, you need to do the following (thanks Neil!) in order to get around some proxy issues.

  1. Install cntlm (available in the ubuntu repository)
  2. As root, edit /etc/cntlm.conf along the lines of the following example (these are the only lines you need to set; everything else in the file stays commented out):

Username <your UCT username (i.e. staff or student number)>
Domain wf
Password <your UCT password> # Use hashes instead (-H)
Proxy campusnet.uct.ac.za:8080
Listen 3128
ISAScannerSize 1024
ISAScannerAgent Wget/
ISAScannerAgent APT-HTTP/
ISAScannerAgent Apt/
ISAScannerAgent Svn/
ISAScannerAgent Loaddap/


  3. To activate the changes made in cntlm.conf, so that your username and password are automatically provided for command-line opendap downloads (getting you around the proxy issues), type:

sudo /etc/init.d/cntlm start

You won't want to have to type this command every time you do a download, so to automate it on start-up, go to System – Preferences – Startup Applications. Click Add, type a name for the command (e.g. Run cntlm), the command itself (use gksudo instead of sudo: gksudo /etc/init.d/cntlm start) and, if you wish, a comment. On startup you will be asked for your root password in a gui prompt (which is why you need gksudo rather than sudo) so that the command can run.

  4. In .bashrc, add the following lines (the port needs to be consistent with what is in cntlm.conf):

export http_proxy=http://localhost:3128

export https_proxy=$http_proxy

export ftp_proxy=$http_proxy

  5. In System – Preferences – Network Proxy, I changed mine to Manual Proxy Configuration (and checked the box to use the same proxy for all protocols), with localhost as the HTTP proxy and 3128 as the port.

Note: for my Mozilla Firefox (version 3.6.12) and for my Thunderbird email client I kept the preferences for network connections to 'Auto-detect proxy settings'.

Note: this is one solution. There may also be a cleaner work-around in which you specify your proxy, username and password in .dodsrc (which should be in your home directory).
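
For example, something along these lines in ~/.dodsrc might do it (untested, and the exact PROXY_SERVER syntax differs between libdap versions, so check the comments in the .dodsrc file that libdap generates on first use). With cntlm running you can point it at localhost rather than embedding your UCT password:

PROXY_SERVER=http,http://localhost:3128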


Friday, December 3, 2010

Bulk renaming files and animations

It is sometimes useful to bulk rename files and create animations.

I had 1000s of images in sequential order of the form ghrsst_YYMMDD_reg.tdf.reg.jpg that I needed to rename to img0000001.jpg, img0000002.jpg, etc. (seven digits, to match the %07d format used below).

To do so I did the following:
$ rename 's/_reg.tdf.reg.jpg/.jpg/' * # replace the "_reg.tdf.reg.jpg" suffix with just ".jpg"
$ rename 's/ghrsst_/0/' *.jpg # replace all the "ghrsst_" with "0"
$ x=1; for i in *jpg; do counter=$(printf %07d $x); ln -s "$i" img"$counter".jpg; x=$(($x+1)); done

Then to make the animation from the linked files:
$ ffmpeg -f image2 -r 8 -i img%07d.jpg -b 8000k -s 1920x1080 Ghrsst.mp4


This makes a nice high resolution animation of the jpg images you created.

See this link for the animation. WARNING: This animation is ~180 MB!!!

Cheers
Bjorn

Tuesday, November 30, 2010

Submitting a ROMS JOB on CHPC-IQudu

Hi All

Finally, I have ROMS running again on CHPC's iQudu cluster (e1350 IBM Linux cluster)

iQudu used to be set up with IBM's LoadLeveler, but recently CHPC made the change to Moab. This meant starting from scratch!

I have made a testcase directory in my work directory on iQudu that a new user can copy into their work directories and use as a starting point.

/CHPC/work/nburls/ROMS_iQudu_testcase

The files of interest in it are

1) jobcomp_iqudu_intel

Note! As mentioned at the top of this script, you have to source the Intel compiler in your .bashrc script
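
For example, a line like the following in .bashrc (the path is a guess based on a typical Intel 11.x install - adjust it to wherever the compilers live on iQudu):

source /opt/intel/Compiler/11.1/064/bin/ifortvars.sh intel64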

Also note that as it is currently configured this script makes use of source code in my home directory.

set SOURCE=/CHPC/home/nburls/Roms_Agrif_v2.1_20_07_2010/Roms_Agrif

It also uses an mvapich library (/CHPC/home/nburls/mvapich_intel) and a netcdf library (/CHPC/home/nburls/netcdf_intel) in my home directory.

2) run_roms_iqudu.moab

Note! Make sure you change the following lines to point to your directory.
#MSUB -o /CHPC/work/nburls/ROMS_iQudu_testcase/roms_log.out
#MSUB -e /CHPC/work/nburls/ROMS_iQudu_testcase/roms_log.err
#MSUB -d /CHPC/work/nburls/ROMS_iQudu_testcase/
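
For anyone writing their own script from scratch, a skeleton Moab script might look something like the following (only the -o, -e and -d lines come from my testcase; the job name, resource request and mpirun line are illustrative guesses that you will need to adapt):

#!/bin/bash
#MSUB -N roms_testcase
#MSUB -l nodes=4:ppn=4
#MSUB -l walltime=12:00:00
#MSUB -o /CHPC/work/<username>/ROMS_iQudu_testcase/roms_log.out
#MSUB -e /CHPC/work/<username>/ROMS_iQudu_testcase/roms_log.err
#MSUB -d /CHPC/work/<username>/ROMS_iQudu_testcase/
mpirun -np 16 ./roms roms.in

The job is then submitted with:

$ msub run_roms_iqudu.moab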

Let me know if any issues pop up. Hopefully not.

Natalie

Friday, November 19, 2010

Experiments in HYCOM


The Hybrid Coordinate Ocean Model (HYCOM) combines the optimal features of isopycnic-coordinate and fixed-grid ocean circulation models in one framework [1]. The name “hybrid” is derived from the model's ability to dynamically change its vertical layer distribution between isopycnic (ρ), z-level and bathymetry-following σ-coordinates, regularly adjusting to the vertical structure most suitable for a specific region of the ocean. The adaptive vertical grid conveniently resolves regions of strong vertical density gradients, such as the thermocline and surface fronts.

Over the past few years I have been developing a nested regional 1/10° HYCOM of the greater Agulhas region (for more info, see http://www.nersc.no/~bjornb/PhdThesisBjornBackeberg.pdf). Model validation is an important aspect of developing realistic ocean simulations.

Marjolaine Rouault has been working on deriving ocean surface current velocities from the Advanced Synthetic Aperture Radar (ASAR; [2]) mounted on the Envisat satellite. The ASAR velocity data product has a resolution ranging from 4 – 8 km. Because it is a microwave sensor, its ability to “see” through clouds makes it a very powerful data set for mapping the Agulhas Current at high resolution, and hence also for validating models.


The figure above on the left shows the 2 year mean surface radial velocity component derived from ASAR. It is the radial component of the velocities because Envisat's ground track is -15° from North, and one can only derive current velocities from ASAR when the current is perpendicular to the satellite ground track, which is luckily the case for the Agulhas. The black contour lines represent the 200, 500, 1000, 2000, and 4000 m isobaths.

One can see that in the southern part of the Agulhas, the current is strongly steered by the bathymetry. At the eastern edge of the Agulhas Bank, the core of the current closely follows the 1000 m isobath.

The above figure to the right shows the 2 year mean radial velocity component derived from HYCOM version 2.1. Two noticeable differences are evident:

  1. The current velocities of the southern Agulhas Current simulated in HYCOM are markedly weaker (almost a factor of 3 difference) than those derived from ASAR.
  2. At the eastern edge of the Agulhas Bank, the current follows the 2000 m isobath instead of the 1000 m isobath, as suggested by the ASAR observations.

In HYCOM, each vertical layer is assigned a reference density, which is the density it reverts to when changing from fixed vertical coordinates to isopycnic coordinates. The reference densities in version 2.1 of HYCOM were chosen to mimic those of the parent model supplying the lateral boundary conditions, which range from 21.0 to 28.3 kg/m3. Plotting the vertical distribution of the layers in HYCOM 2.1 (not shown), it is evident that too many layers are found within the mixed layer and the upper 200 m, leaving only 7 layers to simulate the remainder of the water column; this suggests that the selected reference densities are inadequate for the Agulhas region.

Version 2.1 of HYCOM applies a relatively crude vertical interpolation scheme to the mixed layer, which causes enhanced, and artificial, diapycnal mixing within the mixed layer that may diffuse the core of the current.

Recently, we upgraded HYCOM to version 2.2, and some new and improved features include:
  • A new GISS mixed layer scheme
  • WENO-like PPM interpolation of the mixed layer
    (WENO = Weighted Essentially Non-Oscillatory; PPM = Piecewise Parabolic Method)
  • Bottom layer KPP
  • Slow evolution of the barotropic mode, which allows us to double the barotropic time-step.

Then, to test the importance of the vertical layer distribution, I ran two experiments using version 2.2 of HYCOM:

expt01.0: Uses the same reference densities as in HYCOM 2.1.
expt01.1: Adjusts the reference densities to a range more suitable for the Agulhas region. Based on potential density observations from the WOCE transect I6, a new density range from 23.6 to 27.6 kg/m3 was chosen. The vertical resolution was increased between 23.5 and 26.8 kg/m3 to capture the salinity maximum in the Mozambique Channel at approximately 150 – 300 m, and between 27.1 and 27.7 kg/m3 to capture the salinity minimum in the South Atlantic at approximately 600 – 1200 m.

The figures below show the 2 year mean surface radial velocity component derived from expt01.0 and expt01.1 (left and right respectively). The model was run for 2 years only and is therefore not spun up; these are preliminary tests.

However, these preliminary results indicate that implementing the new version of HYCOM increases the velocities of the simulated current. There is also some improvement in HYCOM's ability to simulate an Agulhas Current following the 1000 m isobath along the eastern edge of the Agulhas Bank, despite using the same reference densities as in version 2.1.

Qualitatively comparing HYCOM 2.2 expt01.0 and expt01.1 indicates that adjusting the reference densities has an effect on the position of the mean flow of the Agulhas Current. Expt01.1 seems to follow the 1000 m isobath more closely, although in general the current remains too wide compared to the ASAR-derived observations. These results highlight the importance of the vertical discretisation of the model grid when simulating the ocean.

These experiments with HYCOM 2.2 have been carried out using a 2nd order momentum advection scheme, and it has been shown, using HYCOM 2.1, that a 4th order momentum advection scheme has a significant impact on the solution [3].

The 4th order momentum advection scheme uses a super-slip condition at the coast. Presently I am testing a simulation using the free-slip condition at the coast, to see whether this has an impact on the simulation of the Agulhas.

References

  1. Bleck, R.: An oceanic general circulation model framed in hybrid isopycnic-Cartesian coordinates, Ocean Modell., 4, 55–88, 2002.
  2. Rouault, M. J., Mouche, A., Collard, F., Johannessen, J. A., and Chapron, B.: Mapping the Agulhas Current from space: an assessment of ASAR surface current velocities, J. Geophys. Res., 115, C10026, doi:10.1029/2009JC006050, 2010.
  3. Backeberg, B. C., Bertino, L., and Johannessen, J. A.: Evaluating two numerical advection schemes in HYCOM for eddy-resolving modelling of the Agulhas Current, Ocean Sci., 5(2), 173–190, 2009.

Wednesday, November 10, 2010

ROMS bulk forcing update

In case anyone is interested I now have a version of ROMS_TOOLS that will accommodate the CORE 2 forcing data set. The data set needs to be pre-prepared to put all variables on a consistent time-step and to separate it into monthly blocks (one possible way to do the split is sketched below). Roms_blk files now contain no wind stress variables (redundant space) and specific rather than relative humidity. Only downward long-wave radiation is included - black body radiation is corrected for in the bulk routine. Shortwave radiation is corrected for with a flat albedo for sea-water (0.065). The ROMS algorithms have been adapted to cope with specific humidity.
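
For the monthly blocks, CDO can do the splitting (a sketch only, assuming CDO is installed; the file name is hypothetical):

$ cdo splitmon core2_forcing.nc core2_forcing_  # writes one file per calendar month: core2_forcing_01.nc ... core2_forcing_12.nc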

I have also engineered a way to allow ROMS to accept an anomaly on the fly (labelled 'anom' in roms_anm.nc); this way anomalies can be applied to data sets without rebuilding the forcing fields. Let me know if this is of use - the anomaly I have made is currently only applicable to wind stress.

Lastly, we now have the entire CORE2 data-set, normal year and inter-annual series, both in corrected forms. I am currently working on adapting the boundary routines to take data from a few ORCA configurations, which we now have the data for in house (contact me if you want more details on this: configs are ORCA05 and ORCA2 from the Kiel Climate Model).

If anyone spots that I have done something a bit stupid above, please let me know!

Cheers

Ben

Monday, November 8, 2010

Getting email notifications for this blog

To get email notifications of new posts / comments in this blog you need to join the Google group:

http://groups.google.com/group/uct-ocean-modelling

Members of this group will receive email notifications when the UCT ocean modelling blog is updated.

Unfortunately this will only send you an update when a new POST is created, not when people comment on posts.

It is also possible to subscribe to the blog using the "Subscribe to: Posts (Atom)" link at the bottom of the page.

Monday, October 25, 2010

Bulk forcing of ROMS

ROMS-ites,

I have a question with regard to applying surface forcing to ROMS, which I hope one of you can help me with. I want to run an Indian Ocean domain forced with the GFDL-CORE 2 data set. I would rather not explicitly prescribe the SST and SSS in my model, so I intend to use this data set to apply bulk forcing across the domain (which CORE in its native state appears to accommodate well). However, I would like to weakly relax/nudge the derived SST and SSS values back to Levitus so I can restrict my tracer evolution somewhat. As yet, I don't seem to be able to find a way to do this. It seems that if I explicitly define the surface forcing through the make_forcing algorithm then I can activate the QCORRECTION and SFLX_CORR keys. However, defining bulk forcing negates this option.
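
For reference, the relevant cppdefs.h switches as I understand them (the key names are from the discussion above; the layout and comments are my own sketch, and your cppdefs.h may differ):

#define BULK_FLUX    /* bulk formulation, here fed by CORE 2 */
#undef  QCORRECTION  /* heat flux correction (relaxation towards SST) - seemingly ignored under bulk forcing */
#undef  SFLX_CORR    /* salt flux correction (relaxation towards SSS) - likewise */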

Can anyone help me with this? Are there other, possibly better, alternate approaches?

Thanks

Ben

Friday, October 22, 2010

Parallel Computing with MATLAB - Academic trial at CHPC

Through collaboration with CHPC, OPTI-NUM solutions has made a 128-worker MATLAB Distributed Computing Service available for a trial period. This trial is available to any academic user for testing purposes. Successful trials will influence any decisions made by CHPC to make Matlab available to academic users on an ongoing basis. For information on how to use MATLAB on the CHPC cluster: http://www.optinum.co.za/microsites/chpc/

Thursday, October 21, 2010

Where to get high res bathymetry data for SA

Hi Schmodellers,

For those of you who might be interested in obtaining high resolution bathymetric data, I highly recommend you approach the Council for Geoscience; they are extremely helpful and resourceful. From what I have learned they are very keen on getting more involved with the universities and students.

The bathymetric data I received for False Bay was primarily fair-chart derived, so the density of the data varied throughout the domain: high data density in the bay and low density further offshore. The data needed to be gridded. In my case, as a result of the coarser resolution of the bathymetry further offshore, I had to grid the data into 100 m cell sizes. See below for a ‘zoomed in’ section of the end product compared with the GEBCO 1’ dataset.

The GEBCO 1’ dataset (left) and the Council for Geoscience bathymetric data (right) gridded into 100 m cells, both overlaid with the 5 m isobath contours from 0-200 m.

In particular you will notice that the GEBCO dataset excludes the features located at the mouth of the bay, namely Rocky Bank and Rough Bottom. From the ROMS simulations that I have run, first with a flat bottom and then with the high resolution bathymetry, the results have shown that these features significantly influence the circulation and thermal structure in the bay. They reduce the inflow of remotely forced circulation and furthermore act as a barrier reducing the amount of cold bottom water that enters the Bay. The results highlight the importance of using high resolution bathymetric data in this study.

If you are interested in getting bathymetric data or want to know more, contact Michael Machutchon (michael@geoscience.org.za) from the Council for Geoscience. If you would like to see my code for gridding the data, please don't hesitate to ask.
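
If you want to try the gridding yourself in the meantime, one possible route is GMT's blockmean and surface (GMT 4 syntax; the file names, region and spacing below are placeholders rather than the settings I actually used):

$ blockmean soundings.xyz -R18.3/18.9/-34.5/-34.0 -I0.001 > block.xyz    # pre-average the scattered soundings onto ~100 m cells
$ surface block.xyz -R18.3/18.9/-34.5/-34.0 -I0.001 -Gfalsebay_100m.grd # interpolate the averaged points onto a regular grid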

Monday, October 11, 2010

Welcome to the UCT ocean modelling blog!

This space has been created for modellers not only to share their successes, so that they can help and inspire others, but also to vent their frustrations, so that they can let off steam (!) and, with a bit of luck, be helped by someone else. It is also a place where modellers can keep abreast of what the local community is up to...without leaving the comfort of their own office chair! I'm not trying to encourage social delinquency, but an online forum provides a more accessible way of sharing modelling activities with colleagues who may not be based in your immediate vicinity (like, in your office).

In the next few months a UCT ocean server protocol will be developed. This is becoming important as the number of people using Bart, Lisa, Marge and, in the near future, Maggie, increases. We are trying to develop a system that will help to make the administration and the upkeep of these machines more efficient. At the next Monday modellers meet (to be announced soon), we will talk about how we plan to use google applications to make our lives easier...and we look forward to your input on that.

As the first post, I thought some eye-candy would be nice...it's a ROMS model simulation of depth-averaged currents between 0-100 m depth for the Benguela system. The colourbar shows the speed and the arrows show the direction: black for northward, white for southward. Sorry, the arrows are small and very difficult to see, and if you maximize the animation it is very blurry.