--- In LTspice@..., Jim Wagner <wagnejam99@...> wrote:
Greetings folks -
I have a question about modeling noise in a real system.
Let's say I start with a uniformly distributed sequence of random values spanning -1 to +1. I know that I can average over N of these values and get an approximately Gaussian sequence with a mean of zero and a standard deviation of sqrt(1/(3N)), since each uniform value has a variance of 1/3. This much is pretty easy.
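A quick numeric check of that, sketched in Python rather than LTspice (N = 12 and the number of draws are arbitrary choices):

import numpy as np

rng = np.random.default_rng(0)
N = 12                     # uniform values averaged per output sample
n_draws = 100_000          # number of averaged samples to generate

u = rng.uniform(-1.0, 1.0, size=(n_draws, N))
g = u.mean(axis=1)         # each row averages to one roughly Gaussian value

# A uniform value on [-1, 1] has variance 1/3, so the average of N of them
# has standard deviation sqrt(1/(3N)).
print(g.mean())                          # ~0
print(g.std(), np.sqrt(1 / (3 * N)))     # both ~0.167 for N = 12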
I don't think that will give a very good approximation to Gaussian noise; averaging a modest number of uniform values converges to a Gaussian only slowly, especially out in the tails. You might want to try the Box-Muller method, which produces exactly Gaussian samples from pairs of uniform ones.
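For reference, a minimal Python sketch of the Box-Muller transform (the function name and sample counts are just for illustration); each pair of uniform samples maps to two independent, exactly Gaussian samples:

import numpy as np

def box_muller(n, rng):
    """Return n standard-normal samples built from pairs of uniform samples."""
    m = (n + 1) // 2
    u1 = rng.uniform(size=m)               # uniform on [0, 1)
    u2 = rng.uniform(size=m)
    r = np.sqrt(-2.0 * np.log1p(-u1))      # log1p(-u1) = log(1 - u1), safe at u1 = 0
    z0 = r * np.cos(2.0 * np.pi * u2)
    z1 = r * np.sin(2.0 * np.pi * u2)
    return np.concatenate([z0, z1])[:n]

z = box_muller(100_000, np.random.default_rng(1))
print(z.mean(), z.std())                   # ~0 and ~1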
Now, these values represent discrete time samples. How do I turn them into a real signal that has a defined bandwidth? Yes, I can put the sequence through a low-pass filter, but I have to define a time interval between samples. Do I need to make this interval 1/10 of the filter's corner period (that is, sample at 10X the bandwidth)? 1/100? And does it matter whether I take the values as the output of a zero-order sample/hold (a staircase), or do linear interpolation between them? Or something else? And what sort of filtering is normally assumed? Single pole? Two or three poles?
For the Box-Muller source, just take the output signal directly; you don't need a sample/hold, interpolation, or any other tricks. You may need to adjust tripdv and tripdt. There are several Box-Muller files available. Try
"
and
"
In a hardware noise generator, you normally want the noise spectrum to remain flat well past the frequencies of interest. But if you need to define the bandwidth, how about doing an FFT on the resulting noise signal? That should give answers to your questions.
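A minimal sketch of that check, assuming SciPy's welch for the PSD estimate (the filter is the same illustrative one-pole as above):

import numpy as np
from scipy.signal import lfilter, welch

rng = np.random.default_rng(3)
fs, fc = 100e3, 1e3                       # sample rate and filter corner (illustrative)
x = rng.standard_normal(1_000_000)
alpha = 1.0 - np.exp(-2.0 * np.pi * fc / fs)
y = lfilter([alpha], [1.0, alpha - 1.0], x)

f, pxx = welch(y, fs=fs, nperseg=4096)       # one-sided PSD estimate
flat = pxx[(f > 0) & (f < fc / 10)].mean()   # density in the flat region
f3db = f[np.argmax(pxx < flat / 2)]          # first bin 3 dB below the flat level
print(f"-3 dB corner ~ {f3db:.0f} Hz (filter fc = {fc:.0f} Hz)")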
Thanks
Jim Wagner
Oregon Research Electronics
Mike