Code like this is actually 50% faster on the live server than on my home computer!
- Code: Select all
long startTime = _stopWatch.ElapsedMilliseconds;
double myRandom = 0;
for (int i = 0; i < 1000000; i++)
{
    myRandom += Math.Sqrt(_builtInRandom.NextDouble() * 10000);
}
Console.WriteLine("Square root : " + (_stopWatch.ElapsedMilliseconds - startTime));
// Square root : 60ms (debug server) / 30ms (live server)
This is true for most calculations (Math functions, generating randoms, etc.); they are all consistently faster on the live server.
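For what it's worth, one thing that can skew a timed loop like the one above is JIT compilation happening inside the timed region on the first pass. A minimal sketch of the same sqrt benchmark with a warm-up pass (names here are illustrative, not from my actual server code):

```csharp
using System;
using System.Diagnostics;

class SqrtBench
{
    static double Run(Random rng, int iterations)
    {
        double sum = 0;
        for (int i = 0; i < iterations; i++)
        {
            sum += Math.Sqrt(rng.NextDouble() * 10000);
        }
        return sum;
    }

    static void Main()
    {
        var rng = new Random();
        Run(rng, 1000); // warm-up: JIT-compiles Run before timing starts

        var sw = Stopwatch.StartNew();
        double result = Run(rng, 1000000);
        sw.Stop();
        Console.WriteLine("Square root : " + sw.ElapsedMilliseconds + "ms");
    }
}
```

Warming up didn't change the overall picture for me, but it makes the numbers less noisy run-to-run.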
However, there is one area where the live server seems inexplicably slower than my own computer (debug server):
- Code: Select all
long startTime = _stopWatch.ElapsedMilliseconds;
for (int c = 0; c < 12000; ++c)
{
    List<Object> tilesArray2 = new List<Object>();
    for (int k = 0; k <= 40; ++k)
    {
        tilesArray2.Add(new Object());
    }
}
Console.WriteLine("List manipulation : " + (_stopWatch.ElapsedMilliseconds - startTime));
// List manipulation : 50ms (debug server) / 120ms (live server)
- Also, the range on the live server varies quite wildly, from 30 to 300ms(!), but averages around 120ms, which is about 2.4 times the debug server's 50ms.
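One possible explanation I considered: unlike the math loop, this benchmark is almost pure allocation (12000 lists, each holding 41 objects, is roughly half a million allocations), so it mostly measures the allocator and garbage collector rather than the CPU, and GC pauses would also explain the wild variance. A sketch of a variant that reuses one preallocated list to reduce GC pressure (illustrative names, not my actual server code), which could help confirm or rule this out:

```csharp
using System;
using System.Collections.Generic;
using System.Diagnostics;

class ListBench
{
    static void Main()
    {
        var sw = Stopwatch.StartNew();

        long startTime = sw.ElapsedMilliseconds;
        var tilesArray2 = new List<Object>(41); // preallocate once, reuse
        for (int c = 0; c < 12000; ++c)
        {
            tilesArray2.Clear(); // keeps the backing array, no new list per pass
            for (int k = 0; k <= 40; ++k)
            {
                tilesArray2.Add(new Object()); // still 41 small objects per pass
            }
        }
        Console.WriteLine("List manipulation (reused) : "
            + (sw.ElapsedMilliseconds - startTime));
    }
}
```

If the reused-list version runs at comparable speed on both machines, the difference in the original benchmark is likely down to GC behaviour (e.g. workstation vs. server GC configuration) rather than raw CPU speed.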
What is really odd is the inconsistency: as I said, in all other areas the live server outperforms the debug server / home CPU, so if everything were working properly, it would logically follow that this should be faster too. Which is why I think this is a bug?