Hi all,

so, as explained:

default['analysis_duration'] = ["Duration of the Analysis (total in seconds : 3600, [86400])", '86400']
default['corr_duration'] = ["Data windows to correlate (in seconds) [1800.]", '1800.']
default['overlap'] = ["Amount of overlap between data windows [0:1[ [0.]", '0.0']

analysis_duration should not be changed. It's there because I originally planned to allow support for other job bases, not only the "1-day job" (e.g. for acoustic kHz data), but to date that's not implemented. corr_duration is the length of the windows, within the loaded day, that are cross-correlated, and those windows step through the day according to the overlap.

The "chunking" is not for efficiency, it's for the quality of the rebuilt CCF. In the simplest case of a big event in the middle of your two daily seismic traces, the CCF of those full traces will look like an autocorrelation; but if you slice the day into chunks, only one of the, for example, 48 slices will look like an autocorrelation, and thus the daily stack of those windows will be less affected by this event. So, MSNoise computes N CCFs per day and stacks them into a daily CCF (the default is a linear stack, i.e. the average).

Thomas
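
As a rough sketch of the windowing and linear stacking described above (illustrative only, not MSNoise code; the sampling rate, the maxlag and the plain FFT cross-correlation are assumptions of mine):

import numpy as np
from scipy.signal import correlate

fs = 10.0                 # sampling rate in Hz (assumed)
corr_duration = 1800.0    # window length in seconds
overlap = 0.0             # fraction of overlap between windows, in [0:1[
maxlag = 120.0            # lag range of the CCF to keep, in seconds (assumed)

day_a = np.random.randn(int(86400 * fs))   # stand-ins for two daily traces
day_b = np.random.randn(int(86400 * fs))

win = int(corr_duration * fs)
step = int(win * (1.0 - overlap))
nlag = int(maxlag * fs)

ccfs = []
for start in range(0, len(day_a) - win + 1, step):
    a = day_a[start:start + win]
    b = day_b[start:start + win]
    cc = correlate(a, b, mode="full", method="fft")  # CCF of this window pair
    mid = len(cc) // 2
    ccfs.append(cc[mid - nlag:mid + nlag + 1])       # trim to +/- maxlag

daily_ccf = np.mean(ccfs, axis=0)  # linear stack = average over the N windows
print(len(ccfs), "windows stacked into one daily CCF of", daily_ccf.size, "samples")

With the default settings this yields 48 windows per day, matching the "48 slices" mentioned above.
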
On 13/10/2016 18:14, Flinders, Ashton wrote:
Ok, thanks Lukas!

I brought up a similar question last week about how the cross-correlation segmenting was done, and after talking to Esteban/Thomas my impression was that corr_duration simply split the time segments into chunks, did the cross-correlation in chunks PURELY for efficiency reasons, and then rebuilt the total daily cross-correlation afterwards, and did not stack them. But this seems not to be true then?


-ashton

On Thu, Oct 13, 2016 at 3:09 AM, Lukas Preiswerk <preisw...@vaw.baug.ethz.ch> wrote:
Ashton,

I think the part that is throwing me off a bit is the "Daily NCFs were then obtained by stacking 30-min NCFs".
This sounds to me like they took their individual 30-min NCFs and stacked them, so that each day is represented by the stacked summation of 48 independent NCFs.
Agreed. If you set corr_duration to 1800 and leave analysis_duration at the standard 86400, this is exactly what you get in the 1-day stacks.

I assumed that meant that both corr_duration and analysis_duration were set to 3600, but maybe not.
As I mentioned, if you set analysis_duration to 3600, then you only use 1 hour of data each day and leave the rest untouched… Try it for yourself with the keep_all option enabled and by looking at the data in the h5 files.
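
For anyone wanting to try that, here is a minimal way to peek inside those files (assuming only that they are plain HDF5; the file name below is a placeholder and the internal layout will depend on your MSNoise output):

import h5py

def show(name, obj):
    # print every dataset found in the file, with its shape and dtype
    if isinstance(obj, h5py.Dataset):
        print(name, obj.shape, obj.dtype)

with h5py.File("example_ccf.h5", "r") as f:  # placeholder file name
    f.visititems(show)
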

Lukas




2016-10-12 18:08 GMT+02:00 Flinders, Ashton <aflind...@usgs.gov>:
Thanks Lukas!

I think the part that is throwing me off a bit is the "Daily NCFs were then obtained by stacking 30-min NCFs".

This sounds to me like they took their individual 30-min NCFs and stacked them, so that each day is represented by the stacked summation of 48 independent NCFs.

I assumed that meant that both corr_duration and analysis_duration were set to 3600, but maybe not.

I'll probably just send an email out to them to see what parameters they used.


On Tue, Oct 11, 2016 at 11:35 PM, Lukas Preiswerk <preisw...@vaw.baug.ethz.ch> wrote:

Hi Ashton,

I can partly answer 1) and 3). First, corr_duration would be 30*60 in their paper (corr_duration is in seconds). As far as I understand, analysis_duration should almost always be 86400. Setting analysis_duration smaller could be used to avoid loading a full day of data in specific cases, like very high frequency data (8 kHz or more). The remaining processing still works on days and not on multiples of analysis_duration. For example, if you set analysis_duration to 3600, then you only use 1 hour of data each day…
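
As a back-of-the-envelope check of those numbers (my own arithmetic, not MSNoise code; I assume the windows step by corr_duration * (1 - overlap)):

def n_windows(analysis_duration, corr_duration, overlap=0.0):
    # how many corr_duration windows fit in one analysis_duration span
    step = corr_duration * (1.0 - overlap)
    return int((analysis_duration - corr_duration) // step) + 1

print(n_windows(86400, 1800))  # defaults -> 48 windows per day
print(n_windows(3600, 1800))   # analysis_duration = 3600 -> only 2 windows,
                               # i.e. just the first hour of each day
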

Hope that helps!

Lukas


2016-10-11 18:57 GMT+02:00 Flinders, Ashton <aflind...@usgs.gov>:
Hi all, I was just reading through Taka'aki and Florent's new paper using MSNoise, and was hoping just for a wee bit more clarification on the MSNoise processing scheme (it wasn't quite clear in the docs).


The paper says:
"We first removed the instrument response from 1-day-long waveform to obtain ground motion in displacement. Daily displacement data were bandpassed between 0.08 and 2.0 Hz, down-sampled into 10 Hz, and split into 30-min-long data. Those 30-min-long data were spectral whitened in a frequency range of 0.1–0.9 Hz and then one-bit normalized. With those one-bit normalized data, the NCFs were computed for all possible combinations of components. Daily NCFs were then obtained by stacking 30-min NCFs."


Q1) So just in terms of implementation in msnoise admin, the 30-min-long duration would be controlled by "analysis_duration", correct?

Q2) If you remove the instrument response, is it always removed from a 1-day chunk, or is it removed from a chunk equal in size to "analysis_duration"? (The docs say 1 day, but I wasn't sure if this was just referencing the default "analysis_duration" time.)

Q3) This probably isn't the intended usage, but if you used an "analysis_duration" longer than a day, would you expect things to behave?

Thanks as always!

-ashton


P.S. paper:
http://earth-planets-space.springeropen.com/articles/10.1186/s40623-016-0538-6
--
Ashton F. Flinders, Ph.D
U.S. Geological Survey
345 Middlefield Road
Menlo Park, CA 94025
(650) 329-5050






_______________________________________________
MSNoise mailing list
MSNoise@mailman-as.oma.be
http://mailman-as.oma.be/mailman/listinfo/msnoise
