

Ocean Forcing

The main source of open boundary condition information for subtidal ocean fields is the Global NCOM model. I obtained extractions of the fields surrounding our domain from Lucy Smedstad at NRL, who is a co-author on Sutherland et al. (2011).

UPDATE 1/23/2013 Lucy also gave us the files for 2010-2012. For 2013 onward gNCOM will be replaced by gHYCOM, which will presumably be processed differently. The processing of the 2010-2012 files is identical to that described below, except that (i) it did not require ncom_extract.m; instead I gunzipped and untarred the file she sent, and then moved all the resulting files by hand into year folders (e.g. using mv *gncom_2010* ../2010), and (ii) I edited preprocess.m so that the longitude in the processed files runs from -132 to -122 instead of using positive degrees E (see the sketch below). Looking at the files with ncom_look.m, the only other difference is that the new files extend to -132 longitude instead of -131, consistent with their slightly larger file size. The preprocessing took 11 minutes per year on skua.
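
For reference, here is a minimal sketch of that longitude change, assuming the raw gNCOM files store longitude as positive degrees east; the variable and file names are illustrative and this is not the actual preprocess.m code.

    % Illustrative only: shift longitude from positive degrees E
    % (e.g. 228 to 238) to the -132 to -122 convention.
    lon = ncread(fn, 'lon');                 % fn = name of one raw gNCOM file
    lon(lon > 180) = lon(lon > 180) - 360;   % now runs from -132 to -122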

  • The files are kept on waddle (accessible from skua) in:
    (*) = /pmraid3/parker/tools_data/mossea_forcing_data/ocn/NCOM_global/
  • Time period [ys] = 1998 to 2009 (now extended through 2012)
  • Spatial domain: [-131 -122 41 53] like the plots below
  • Daily fields, assumed to be at the start of each day, so 365 or 366 per year
  • Fields: [var] = s3d, ssh, t3d, u3d, v3d (we make ubar and vbar ourselves; see the sketch after this list)
  • Lat-lon grid is plaid, with 1/8 degree resolution (~ 14 km)
  • Depth levels are 0 to 5500 m, with 5 m spacing near the surface, 50 m spacing around 200 m depth, and 500 m spacing at the bottom.
  • The same spatial structure holds for input and output of the preprocessing.
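
Since ubar and vbar are not in the extractions, we form them ourselves by depth-averaging the 3-D velocity. Below is a minimal sketch of that calculation, assuming u3d is NaN below the local bottom, the dimension order is [lon, lat, depth], and the depth variable is named 'depth'; the names are illustrative and this is not the actual code in our processing scripts.

    % Illustrative sketch: depth-average u3d on the NCOM z-levels to get ubar.
    u3d = ncread(fn, 'u3d');               % assumed size [nlon nlat nz], NaN below the bottom
    z   = double(ncread(fn, 'depth'));     % 0 to 5500 m, positive down
    [nlon, nlat, ~] = size(u3d);
    ubar = nan(nlon, nlat);
    for ii = 1:nlon
        for jj = 1:nlat
            u = squeeze(u3d(ii,jj,:));
            k = find(isfinite(u));         % levels above the local bottom
            if numel(k) > 1
                ubar(ii,jj) = trapz(z(k), u(k)) / (z(k(end)) - z(k(1)));
            end
        end
    end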

Processing steps, done in MATLAB on skua in the directory (*):

  1. The original compressed files are in (*)/compressed_files/[var]_[ys]_gncom.tar.gz
  2. Use ncom_extract.m to untar these into (a lot of) single day files, such as (*)/2006/s3d_gncom_2006010100.nc (~ 1 MB each).
  3. Then run preprocess.m, which runs through all of these and packs them into full-year files by variable, such as (*)/pro2006/[var].nc (~725 MB each). This calls the function preprocess_make_netcdf.m. It extrapolates with nearest neighbor for salinity, temperature, and SSH, and pads with zeros for u and v (sketched after this list), so that the entire array has data before it is fed to make_clim.m (called by run_maker.m). The preprocessing takes about 40 minutes per year on skua. There were 9 missing files in 1998.
  4. Files (*)/pro[1998-2012]/[var].nc were created on skua 6/4/2012 and 1/23/2013 by PM, using the improved code from Sarah Giddings that fixed a masking bug.
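
The fill step is the part most likely to trip people up, so here is a minimal sketch of the two rules, applied one horizontal slice at a time. The variable names (fld, uslice, lon, lat) are illustrative, and griddata is just one way to do a nearest-neighbor fill; this is not necessarily how preprocess_make_netcdf.m does it.

    % Illustrative sketch of the two fill rules, one horizontal slice at a time.
    [LON, LAT] = meshgrid(lon, lat);       % plaid 1/8 degree grid ([nlat x nlon])
    good = isfinite(fld);                  % fld = one slice of s3d, t3d, or ssh
    if any(good(:))
        % salinity, temperature, SSH: nearest-neighbor extrapolation into masked areas
        fld = griddata(LON(good), LAT(good), fld(good), LON, LAT, 'nearest');
    end
    % u and v: masked points are simply padded with zeros
    uslice(~isfinite(uslice)) = 0;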


Fig. 1. Example fields plotted by ncom_look.m. The top row shows the original surface fields, and the bottom row shows the same fields after preprocessing.


Parker MacCready 01/23/2013