
Re: [Sc-devel] Servers with nil latency?



Hi Julian,

Yes, Alberto mentioned unsynchronized clocks as well. It is certainly the most convincing example, since in that case latency is not a meaningful concept. But I wonder whether it wouldn't be cleaner to turn off timestamps altogether in the server, either by using a different class of server object or by adding some logic to the current Server class.
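
For illustration, the subclass route might amount to little more than this (a sketch only; ImmediateServer is a hypothetical name, not an existing class):

// a Server that never timestamps, whatever time callers pass in
ImmediateServer : Server {
	sendBundle { arg time ... msgs;
		super.sendBundle(nil, *msgs)  // drop the timestamp unconditionally
	}
}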


RJK

On Dec 18, 2007, at 9:57 PM, Julian Rohrhuber wrote:

In some cases it may be practical to do otherwise, but I think latency should not be used for scheduling, only to allow headroom for network latency. So s.latency is a property of the connection, not of the scheduling processes.
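
A minimal sketch of that separation (assuming the default clock and the default synth def):

// the clock decides when things happen; latency merely pads the
// outgoing timestamp so the bundle reaches the server in time
s.latency = 0.2;  // headroom chosen for this particular connection
TempoClock.default.sched(1, {
	s.sendBundle(s.latency, ["s_new", "default", -1, 0, 0]);
});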

Now here comes the very common case: you play in a networked group and use each other's servers. Normally, the NTP clocks of these computers are not in sync; they may be slightly off, or minutes off. What do you do then? You set the latency to nil, because there is no meaningful value for it.
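
For example (address and port hypothetical):

// a bandmate's server; its system clock may be minutes off from ours
r = Server(\remote, NetAddr("192.168.1.20", 57110));
r.latency = nil;  // no meaningful value, so bundles go out untimestamped
(degree: 4, server: r).play;  // sounds as soon as the bundle arrives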



Hi all,

(The following is intended as a design discussion; I have no intention of trying to make any changes to this for 3.2.)

I fully understand the need for OSC commands without timestamps, but setting a Server's latency to nil still seems like a hack:

1. It affects all processes using the server, rather than only the processes that need immediate scheduling. So Patterns, which could schedule their events in advance and stay out of the way, end up competing for the same time points as user interactions (see the sketch after this list).

2. It is not fully supported. For example, in Event the check is only found in the \note eventType.

3. It seems to conflate an attribute of the Server with a scheduling choice.
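
To make point 1 concrete, a minimal sketch:

s.latency = nil;  // one setting for every process talking to s
// this pattern could schedule its events ahead of time, but now cannot:
Pbind(\degree, Pseq([0, 2, 4, 7], inf), \dur, 0.25).play;
// while only interactive code like this actually needs immediacy:
{ SinOsc.ar(440, 0, 0.1) }.play(s);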

Wouldn't it be better to have a separate Server object with the same NetAddr that sends all bundles without timestamps? (OK, that contradicts point 3 above, but still...)
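
Something close to that already works today, give or take node ID allocation (a sketch; names arbitrary):

// a second client object for the same scsynth process
i = Server(\immediate, s.addr);
i.latency = nil;               // this one always sends untimestamped
(degree: 0, server: i).play;   // immediate
(degree: 7, server: s).play;   // scheduled ahead by s.latency as usual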

RJK




On Dec 18, 2007, at 4:58 PM, Alberto de Campo wrote:

Hi James and Josh,

Let me rephrase my question: what is the difference between setting a server's latency to nil and setting it to 0?

Lots of places look up server.latency and use it in sendBundle:

// timestamped "now": by the time it arrives the server considers it late, and complains
s.sendBundle(0, ["s_new", "default", -1, 0, 0]);

// untimestamped: the server silently executes it as soon as the bundle comes in
s.sendBundle(nil, ["s_new", "default", -1, 0, 0]);

I use latency nil whenever I need things to respond really quickly, e.g. with gestural input, or when sending to servers on different machines without synced clocks.
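
For instance, a callback of this shape (~onGesture is hypothetical, standing in for whatever delivers the input):

~onGesture = { arg freq;
	// untimestamped: the synth starts the moment the bundle arrives
	s.sendBundle(nil, ["s_new", "default", -1, 0, 0, "freq", freq]);
};
~onGesture.value(440);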

IMO 'as soon as possible' should be legal, so please leave the check in!


best, adc





On Dec 18, 2007, at 10:26 AM, James Harkins wrote:

On Dec 18, 2007 10:03 AM, ronald kuivila <rkuivila@xxxxxxxxxxxx> wrote:
Hi all,

In Event there is a switch for a Server with latency = nil. When and why does that ever happen?

If you take out the switch, events will die if somebody sets latency to nil. Do we want to make it an explicit requirement that you must have a numeric value in the server's latency variable to use event patterns?

Obviously it's better for timing if the latency value is there, but I'm not sure it needs to be an absolute rule - latency or death. (Cake or death?)
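
For reference, the switch amounts to something like this (a paraphrase, not the actual Event source):

// inside the \note event type, roughly:
if(server.latency.isNil) {
	server.sendBundle(nil, *bundle)             // immediate, untimestamped
} {
	server.sendBundle(server.latency, *bundle)  // scheduled ahead as usual
};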

hjh


--
Alberto de Campo
Bergstrasse 59/33
A-8020 Graz, Austria
e-mail : decampo@xxxxxx
--


_______________________________________________
Sc-devel mailing list
Sc-devel@xxxxxxxxxxxxxxx
http://www.create.ucsb.edu/mailman/listinfo/sc-devel