I'm looking for help with the timing parameters of a TRAN simulation
used to evaluate the distortion of an amplified sine wave.
I've been reading through lots of messages in this group about "distortion" and "tran timing"
recently, but could not find an answer to what I was looking for.
[I came across the audio distortion analyser contributed by Tony Casey,
for which I am thankful; it is indeed a fine tool.]
Hitherto I've used 1 or 2 per mille of the sine wave period as the maximum time step,
i.e. 1 usec or 2 usec when testing with 1 kHz (period 1 msec). From my tests I decided
for myself that values exceeding 1% of the period are best avoided.
I have also tested time steps set to the interval divided by a power of 2, e.g. 2**14 (interval / 16384).
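
For reference, my current setup at 1 kHz looks like this (the two .options settings
are ones often recommended in this group for FFT work, so I include them here as
an assumption rather than as anything definitive):

    .options plotwinsize=0 numdgt=7  ; no waveform compression, double-precision data
    .tran 0 16m 0 1u                 ; Tstop = 16 msec, Tstart = 0, max time step = 1 usec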
For the interval (t_stop - t_start) I usually take 16 msec when testing with 1 kHz,
but I have also used values from 10 msec to 48 msec.
This is the parameter I personally find most difficult to decide upon.
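
For instance, something along the following lines is what I have in mind: Tstart
discards the startup transient, (t_stop - t_start) is an exact integer number of
periods, and .four then reports the harmonics (the node name "out" is just a
placeholder for my amplifier output):

    .tran 0 26m 10m 1u    ; keep only 10..26 msec, i.e. exactly 16 cycles of 1 kHz
    .four 1k 9 -1 V(out)  ; 9 harmonics at 1 kHz; Nperiods = -1 uses all recorded cycles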
So, to sum it all up: I wonder if there is a guideline for which values to use for the time step
and the interval at a given test frequency when testing audio-related circuits with sine waves.
(I'm still using LTspice XVII.)
Ryu