
Re: [Sc-devel] scsynth -L option


Unfortunately this is deprecated and replaced by the suggestedLatency property of PaStreams (the one I was fiddling with before...). I tried setting it to a meaningful value such as

numBufs * numSamples / sampleRate

but the underlying implementations are really so different that this results in a far too small latency most of the time. The -L switch would have allowed people to push their systems and go as low as possible, but that's just trial and error really.

I suspect we just have to leave it as it is and go with the lowest latency suggested by the API. If people have problems with that, they need to find different drivers for their hardware (ASIO, DSound).


On 4 Mar 2008, at 20:31, ronald kuivila wrote:

Hi Chris,

I just took a look at


which is what you need to consult. It looks like what you actually want to do is compute the latency in milliseconds (1000 * bufferSize/samplingRate) and set the environment variable PA_MIN_LATENCY_MSEC to that value. You don't need a new flag, you just need to use the existing information to tell PortAudio how to do its stuff.

The page suggests a number of system tweaks for XP to improve performance.

Here are some relevant quotes from that page:

"The only delay that PortAudio can control is the total length of its buffers. The Pa_OpenStream() call takes two parameters: numBuffers and framesPerBuffer. The latency is also affected by the sample rate which we will call framesPerSecond....The latency in milliseconds due to this buffering is:
latency_msec = 1000 * numBuffers * framesPerBuffer / framesPerSecond
This is not the total latency, as we have seen, but it is the part we can control."

"On some systems you can override the PortAudio minimum if you know your system can handle a lower value. You do this by setting an environment variable called PA_MIN_LATENCY_MSEC which is read by PortAudio when it starts up. This is supported on the PortAudio implementations for Windows MME, Windows DirectSound, and Unix OSS."



On Mar 4, 2008, at 12:08 PM, Chris Frauenberger wrote:


quite frankly, I don't know exactly what the parameter does. As PortAudio is a platform-independent API, it is up to the various implementations to
put this value into context. It is, however, not the hardware buffer
size (which can be assigned separately).

The proposed -L switch would provide access to the PaTime
PaStreamParameters::suggestedLatency (from the docs):

The desired latency in seconds. Where practical, implementations should
configure their latency based on these parameters, otherwise they may
choose the closest viable latency instead. ... Actual latency values for an open stream may be retrieved using the inputLatency and outputLatency
fields of the PaStreamInfo structure returned by Pa_GetStreamInfo().

For Windows MME drivers the default suggested latency is up to 0.2 sec
while the buffer size is still 64 samples. Maybe Ross can shed some
light on this.


ronald kuivila wrote:
Hi all,

Maybe a little explanation is in order. There are several different
parameters involved in system latency:

	blockSize			-z option
	hardware buffer size		-Z option
	Server object latency		language-side parameter that can be tuned

   Right now, if the hardware buffer size is bigger than blockSize,
untimestamped messages are processed on the next hardware
buffer rather than the next sample block. (This is the issue Alberto
mentioned on the list a while ago.)

   The Server object latency imposes a delay that should be larger
than the largest delay associated with the hardware buffer size and UDP.
This will guarantee accurate timing. It can be altered without
rebooting the server.


On Mar 4, 2008, at 5:01 AM, Stefan Kersten wrote:

On 29.02.2008, at 14:59, Christopher Frauenberger wrote:
now that version 3.2 is launched, I would commit the proposed change. It essentially introduces a -L switch to scsynth when compiled with PortAudio to specify a preferred latency. This is important as the
suggested latency (especially on Windows) is sometimes really high
(0.2 sec) and could be reduced by the user for time-critical things
through the server options.
shouldn't the hardware buffer size -Z option be used for specifying
the latency? scsynth seems to be competing already with csound in
terms of command line options ;)


Sc-devel mailing list