Applying random time delay offsets looks like AWGN. Why? #268
pfeatherstone started this conversation in General (1 comment, 1 reply)
So I'm experimenting with some synthetic noisers. I've played with random sample rate offsets (exactly the same as sro_model in GNU Radio), which apply a Gaussian random-walk offset to the sample rate. That behaves well.
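For context, the rough idea (this is only a sketch of a random-walk sample-rate offset, not the actual GNU Radio sro_model code; the function name, the walk_std parameter and the linear-interpolation resampling are illustrative) looks something like:

```python
import numpy as np

def random_walk_sro(x, walk_std=1e-6, seed=0):
    """Re-read x at positions whose sample-rate error follows a Gaussian random walk."""
    rng = np.random.default_rng(seed)
    # The per-sample rate error drifts as a cumulative sum of tiny Gaussian steps.
    rate_err = np.cumsum(rng.normal(0.0, walk_std, size=len(x)))
    pos = np.cumsum(1.0 + rate_err)   # drifting read positions (in input samples)
    pos -= pos[0]                     # start reading at sample 0
    pos = pos[pos < len(x) - 1]       # drop positions that fall outside the input
    i = pos.astype(int)
    frac = pos - i
    # Linear interpolation between neighbouring input samples.
    return (1.0 - frac) * x[i] + frac * x[i + 1]
```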
So I thought I would try something simpler and add a Gaussian random offset to the time delay instead. This is slightly different.
Here is the code and some figures:
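The original snippet is not included in this excerpt. A minimal sketch of one way to do it, assuming the delay offset is drawn independently for every output sample and applied with linear interpolation (numpy; the name gaussian_delay_jitter and the jitter_std parameter are illustrative):

```python
import numpy as np

def gaussian_delay_jitter(x, jitter_std=0.01, seed=0):
    """Read output sample n from position n + delta[n], with delta[n] ~ N(0, jitter_std^2) samples."""
    rng = np.random.default_rng(seed)
    n = np.arange(len(x), dtype=float)
    delta = rng.normal(0.0, jitter_std, size=len(x))  # independent tiny time offsets
    pos = np.clip(n + delta, 0.0, len(x) - 1.0)       # keep read positions inside the input
    i = np.minimum(pos.astype(int), len(x) - 2)
    frac = pos - i
    # Linear interpolation between neighbouring input samples.
    return (1.0 - frac) * x[i] + frac * x[i + 1]
```

Unlike the random-walk sketch above, each offset here is independent of the previous one rather than slowly drifting.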
Here are the plots:
[plots not reproduced in this excerpt: the first shows the spectra, the second the applied time offsets]
Now you can see in the second plot (I) that tiny time offsets are applied. That's great; that was the intention.
Looking at the first plot, the spectra, it looks like AWGN!
But why?
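For reference, a hypothetical way to reproduce a spectra comparison like the first plot, reusing the gaussian_delay_jitter sketch above on a made-up complex test signal (none of this is the original code):

```python
import numpy as np
import matplotlib.pyplot as plt

# Made-up test signal: rectangular-pulse QPSK-style symbols at 8 samples/symbol.
rng = np.random.default_rng(1)
bits = rng.integers(0, 2, size=(4096, 2)) * 2 - 1
x = np.repeat(bits[:, 0] + 1j * bits[:, 1], 8).astype(complex)
y = gaussian_delay_jitter(x, jitter_std=0.01)   # sketch defined above

# Compare the magnitude spectra before and after the delay jitter.
for sig, label in [(x, "original"), (y, "with time-delay jitter")]:
    f = np.fft.fftshift(np.fft.fftfreq(len(sig)))
    spec_db = 20 * np.log10(np.abs(np.fft.fftshift(np.fft.fft(sig))) + 1e-12)
    plt.plot(f, spec_db, label=label)
plt.xlabel("normalized frequency")
plt.ylabel("magnitude (dB)")
plt.legend()
plt.show()
```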
-
Comment: You might need to plot the full signal to see what's happening, not just the first 140 samples or so.