Create an R script (preferred) or a Visual Basic script to achieve the following.
Given two data sets that include time and depth information (see attached files).
These files represent time series of depth observations (a profile of the seabed) from two ships as they move along (more or less) the same track line but at different times.
I want to know the time offset (a single value) that would best align the two depth profiles.
I would like to view a graph of the measure of alignment against time offset (or lag).
I imagine this would be done by calculating a cross-correlation between the depth profiles for different time lags and plotting the cross-correlation value against the time lag.
The script should return the time offset that gives the best alignment, expressed in decimal minutes (e.g. 124.5 minutes).
If a single best value for the time offset cannot be determined (e.g. multiple minima, or other issues with the data), it is acceptable instead to calculate the correlation (or other alignment measure) and create the plot, so that the optimal lag can be chosen manually.
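The lag search described above could be sketched as follows. This is a minimal sketch, not a final implementation: the file names, the whitespace delimiter assumed by `read.table()`, and the lag search range of ±4 hours in 10-second steps are all assumptions to be adjusted once the actual files are inspected.

```r
# Read one profile: assumes whitespace-delimited columns in the order
# described below (date, time, milliseconds, three unused columns, depth).
read_profile <- function(path) {
  d <- read.table(path, header = FALSE,
                  col.names = c("date", "time", "ms", "x4", "x5", "x6", "depth"))
  t0 <- as.POSIXct(paste(d$date, d$time),
                   format = "%d/%m/%Y %H:%M:%S", tz = "UTC")
  data.frame(t = as.numeric(t0) + d$ms / 1000, depth = d$depth)
}

# Correlation between the two depth profiles after shifting profile `b`
# earlier by `lag_s` seconds, evaluated on a common 1-second grid over the
# overlapping time window (NA if the overlap is shorter than min_overlap).
align_score <- function(a, b, lag_s, min_overlap = 600) {
  tb <- b$t - lag_s
  lo <- max(min(a$t), min(tb))
  hi <- min(max(a$t), max(tb))
  if (hi - lo < min_overlap) return(NA_real_)
  g <- seq(lo, hi, by = 1)
  cor(approx(a$t, a$depth, xout = g)$y,
      approx(tb, b$depth, xout = g)$y,
      use = "complete.obs")
}

# Scan candidate lags, plot correlation against lag (in decimal minutes),
# and return the best-aligning lag in decimal minutes.
find_offset <- function(a, b, lags = seq(-4 * 3600, 4 * 3600, by = 10)) {
  scores <- sapply(lags, function(L) align_score(a, b, L))
  plot(lags / 60, scores, type = "l",
       xlab = "Lag (decimal minutes)", ylab = "Cross-correlation")
  best <- lags[which.max(scores)]  # which.max skips the NA entries
  cat(sprintf("Best-aligning offset: %.1f minutes\n", best / 60))
  best / 60
}

# Usage (hypothetical file names):
# a <- read_profile("ship_a.txt"); b <- read_profile("ship_b.txt")
# offset_min <- find_offset(a, b)
```

Shifting one profile and correlating over the overlapping window, rather than calling `ccf()` on a shared grid, avoids any requirement that the two recordings overlap in absolute time; the plot of score against lag also makes multiple minima visible for a manual choice if needed.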
File descriptions. Each file has the following structure:
1 Date of "ping" in dd/mm/yyyy format
2 Time of ping in hh:mm:ss format as time of day
3 Milliseconds (mmm) to be added to the time to give the exact time with a precision of 1 msec
4 Not needed
5 Not needed
6 Not needed
7 Depth in meters
The date, time (hh:mm:ss) and milliseconds columns can be combined to give a single time measure.
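One way to combine the three columns in R, sketched with made-up example values, is to parse the date and time into a POSIXct value and add the milliseconds as a fractional second; the result is a single numeric time (seconds since the epoch) with 1 ms precision.

```r
date <- "27/03/2024"; time <- "14:05:07"; ms <- 412  # example values only

# Parse date + time-of-day, then add the millisecond column as a fraction.
t0 <- as.POSIXct(paste(date, time), format = "%d/%m/%Y %H:%M:%S", tz = "UTC")
t_exact <- as.numeric(t0) + ms / 1000  # single numeric time, 1 ms precision
```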