
Re: [Sc-devel] scsynth -L option



Hi Chris,

I just took a look at  

http://www.portaudio.com/docs/latency.html

which is what you need to consult.  It looks like what you actually want to do is compute the latency in milliseconds
(1000 * bufferSize/samplingRate) and set the environment variable PA_MIN_LATENCY_MSEC to that value.
You don't need a new flag, you just need to use the existing information to tell PortAudio how to do its stuff.
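In code, that could look something like the sketch below (illustrative only, not scsynth's actual code; the function name is mine, and the variable has to be set before PortAudio starts up or it won't be seen):

    #include <cstdio>
    #include <cstdlib>

    // Sketch: derive PA_MIN_LATENCY_MSEC from the existing buffer
    // settings. Must run before PortAudio reads its environment.
    void setPortAudioMinLatency(int bufferSize, double samplingRate)
    {
        int latencyMsec = (int)(1000.0 * bufferSize / samplingRate + 0.5);
        char value[32];
        std::snprintf(value, sizeof(value), "%d", latencyMsec);
    #ifdef _WIN32
        _putenv_s("PA_MIN_LATENCY_MSEC", value);  // Windows CRT variant
    #else
        setenv("PA_MIN_LATENCY_MSEC", value, 1);  // POSIX
    #endif
    }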

The page suggests a number of system tweaks for XP to improve performance.

Here are some relevant quotes from that page:

"The only delay that PortAudio can control is the total length of its buffers. The Pa_OpenStream() call takes two parameters: numBuffers and framesPerBuffer. The latency is also affected by the sample rate which we will call framesPerSecond....The latency in milliseconds due to this buffering  is:
latency_msec = 1000 * numBuffers * framesPerBuffer / framesPerSecond
This is not the total latency, as we have seen, but it is the part we can control."
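For example, with numBuffers = 2 and framesPerBuffer = 64 at 44100 Hz, that comes to 1000 * 2 * 64 / 44100, i.e. about 2.9 ms of buffering latency.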

"On some systems you can override the PortAudio minimum if you know your system can handle a lower value. You do this by setting an environment variable called PA_MIN_LATENCY_MSEC which is read by PortAudio when it starts up. This is supported on the PortAudio implementations for Windows MME, Windows DirectSound, and Unix OSS."


Cheers,

RJK


On Mar 4, 2008, at 12:08 PM, Chris Frauenberger wrote:

Hello,

quite frankly, I don't know exactly what the parameter does. As PortAudio 
is a platform-independent API, it is up to the various implementations to 
put this value into context. It is, however, not the hardware buffer 
size (which can be assigned separately).

The proposed -L switch would provide access to the PaTime 
PaStreamParameters::suggestedLatency (from the docs):

The desired latency in seconds. Where practical, implementations should 
configure their latency based on these parameters, otherwise they may 
choose the closest viable latency instead. ... Actual latency values for 
an open stream may be retrieved using the inputLatency and outputLatency 
fields of the PaStreamInfo structure returned by Pa_GetStreamInfo().
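To make that concrete, driving this looks roughly like the sketch below (my own assumptions: default device, stereo float output, a placeholder callback, and Pa_Initialize() having already succeeded):

    #include <portaudio.h>
    #include <cstdio>

    // Placeholder callback; the real one would do the DSP.
    static int sketchCallback(const void*, void*, unsigned long,
                              const PaStreamCallbackTimeInfo*,
                              PaStreamCallbackFlags, void*)
    {
        return paContinue;
    }

    // Open a stream with a preferred latency (what -L would carry),
    // then read back the latency the implementation actually chose.
    PaStream* openWithSuggestedLatency(double preferredLatencySecs)
    {
        PaStreamParameters out = {};
        out.device = Pa_GetDefaultOutputDevice();
        out.channelCount = 2;
        out.sampleFormat = paFloat32;
        out.suggestedLatency = preferredLatencySecs;

        PaStream* stream = nullptr;
        if (Pa_OpenStream(&stream, nullptr, &out, 44100.0, 64,
                          paNoFlag, sketchCallback, nullptr) != paNoError)
            return nullptr;

        const PaStreamInfo* info = Pa_GetStreamInfo(stream);
        std::printf("actual output latency: %f s\n", info->outputLatency);
        return stream;
    }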

For Windows MME drivers the default suggested latency is up to 0.2 sec 
while the buffer size is still 64 samples. Maybe Ross can shed some 
light on this.

Thanks
Chris

ronald kuivila wrote:
Hi all,

Maybe a little explanation is in order. There are several different 
parameters involved in system latency:

- blockSize (the -z option)
- hardware buffer size (the -Z option)
- Server object latency (a language-side parameter that can be tuned)
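For instance, an invocation along the lines of

    scsynth -u 57110 -z 64 -Z 512

pairs a 64-sample block size with a 512-sample hardware buffer.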

   Right now, if the hardware buffer size is bigger than blockSize,  
untimestamped messages are processed  on the next hardware
buffer rather than the next sample block.  (This is the issue Alberto  
mentioned on the list a while ago.)

   The Server object latency imposes a delay that should be larger  
than the largest delay associated with hardware buffer size and UDP  
transmission.
This will guarantee accurate timing. It can be altered without  
rebooting the server.
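For example, a 512-sample hardware buffer at 44.1 kHz accounts for about 11.6 ms, so a Server latency of, say, 0.05 s leaves comfortable headroom for UDP delivery on the same machine.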

RJK


On Mar 4, 2008, at 5:01 AM, Stefan Kersten wrote:

On 29.02.2008, at 14:59, Christopher Frauenberger wrote:
now that version 3.2 is released, I would like to commit the proposed
change. It essentially introduces a -L switch to scsynth when compiled
with PortAudio to specify a preferred latency. This is important because
the suggested latency (especially on Windows) is sometimes really high
(0.2 sec) and could be reduced by the user for time-critical work
through the server options.
shouldn't the hardware buffer size -Z option be used for specifying
the latency? scsynth already seems to be competing with csound in
terms of command line options ;)

<sk>

_______________________________________________
Sc-devel mailing list

