Greetings folks -
I have a question about modeling noise in a real system.
Let's say I start with a uniformly distributed sequence of random values spanning -1 to +1. I know that if I average N of these values, the central limit theorem gives me an approximately Gaussian sequence with a mean of zero and a standard deviation of 1/sqrt(3N) (each uniform sample has variance 1/3). This much is pretty easy.
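For concreteness, here is the kind of thing I mean (Python/numpy; N = 12 and the sample count are arbitrary picks on my part):

    import numpy as np

    N = 12                    # uniform samples averaged per output point (arbitrary)
    M = 100_000               # number of output samples
    rng = np.random.default_rng(0)

    u = rng.uniform(-1.0, 1.0, size=(M, N))  # uniform on [-1, +1], variance 1/3
    g = u.mean(axis=1)                       # CLT: approximately Gaussian

    print(g.mean())   # ~ 0
    print(g.std())    # ~ 1/sqrt(3*N), about 0.167 for N = 12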
Now, these values represent discrete time samples. How do I turn them into a real signal with a defined bandwidth? Yes, I can put the sequence through a low-pass filter, but first I have to define a time interval between samples. Do I need to make the sample rate 10x the filter bandwidth? 100x? And does it matter whether I treat the values as the output of a zero-order hold (a staircase), or do linear interpolation between them (a first-order hold)? Or something else? And what sort of filtering is normally assumed? Single-pole? Two or three poles?
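Here is a sketch of what I have in mind, with the numbers I am unsure about marked as placeholders (a 1 MHz single-pole bandwidth and a 10x sample-rate ratio, both just guesses):

    import numpy as np

    f_c = 1e6          # target noise bandwidth, single-pole -3 dB point (placeholder)
    fs = 10 * f_c      # sample rate -- is 10x the bandwidth enough? (placeholder)
    dt = 1.0 / fs

    rng = np.random.default_rng(1)
    x = rng.uniform(-1.0, 1.0, 100_000)  # raw noise, each value held for dt (zero-order hold)

    # Single-pole RC low-pass: y[n] = y[n-1] + a*(x[n] - y[n-1]),
    # with tau = 1/(2*pi*f_c) and a = dt/(tau + dt)
    tau = 1.0 / (2.0 * np.pi * f_c)
    a = dt / (tau + dt)
    y = np.empty_like(x)
    acc = 0.0
    for n, xn in enumerate(x):
        acc += a * (xn - acc)
        y[n] = acc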
Yes, this is for an LTspice project. And the reason I am being a bit coy about specific values is that I would like to apply this in a variety of communication systems with varying operating frequencies and bandwidths. In some cases the system bandwidth will be very broad (maybe 100 MHz), encompassing HF frequencies with lots of ambient noise as well as VHF, which is much quieter.
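In case it helps, here is how I am currently getting a sequence into LTspice: I write it out as a PWL file and play it back with a voltage source (the file and node names below are placeholders):

    import numpy as np

    dt = 100e-9        # 100 ns step, i.e. 10x a 1 MHz target bandwidth (placeholder)
    rng = np.random.default_rng(2)
    x = rng.uniform(-1.0, 1.0, 100_000)
    t = np.arange(x.size) * dt
    np.savetxt("noise_pwl.txt", np.column_stack([t, x]), fmt="%.9e")

Then something like "V1 noise 0 PWL file=noise_pwl.txt" in the schematic. LTspice interpolates linearly between PWL points, which is part of why I am asking whether sample/hold vs. linear interpolation matters.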
Thanks
Jim Wagner
Oregon Research Electronics