Hey-
I've been running into a problem. Every 500 ms my server sends out a static update. In addition, every 50 ms, both the server and the client run the game code that moves the players, etc. On the server this works as expected: the code runs every 50 ms, so it executes 10 times between static updates. In the Flash client, though, setInterval() is not consistent and varies by about 30 ms. I've tried using setTimeout() and subtracting the difference, so that, for example, if one timeout took 60 ms to trigger, the next one would be scheduled for 40 ms, but that method still hasn't worked.
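Here's roughly what I mean by the setTimeout() compensation, as a minimal AS3 sketch (runGameStep() is just a stand-in for my actual per-tick movement code):

```actionscript
// What I tried: schedule each tick against the ideal timeline instead of
// "now", so a timeout that fires late gets a shorter delay next time.
import flash.utils.getTimer;
import flash.utils.setTimeout;

const STEP:int = 50;                    // target tick interval in ms
var nextTick:int = getTimer() + STEP;   // when the next tick *should* fire

function runGameStep():void {
    // stand-in for the actual movement / game code
}

function tick():void {
    runGameStep();

    nextTick += STEP;
    var delay:int = nextTick - getTimer();  // e.g. last tick took 60 ms -> next delay is ~40 ms
    if (delay < 1) delay = 1;               // we're behind; fire again as soon as possible
    setTimeout(tick, delay);
}

setTimeout(tick, STEP);
```

Even with this, the actual tick times on the client still drift noticeably compared to the server.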
Any ideas would be greatly appreciated.