[Mono-osx] Packet loss with sockets
Jason Bell
gharen1234 at hotmail.com
Tue Aug 8 15:37:04 EDT 2006
I have a simple server and client that communicate over TCP sockets. It
works great with .NET 2.0.
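Roughly, the setup looks like this (a stripped-down sketch, not my actual
code; the port and message are just placeholders):

// Hypothetical minimal version of the setup described above: one TCP
// server and one client using System.Net.Sockets, run in one process
// over loopback for illustration.
using System;
using System.Net;
using System.Net.Sockets;
using System.Text;

class EchoDemo
{
    static void Main()
    {
        TcpListener server = new TcpListener(IPAddress.Loopback, 9000);
        server.Start();

        // Client side: connect and send one message.
        TcpClient client = new TcpClient();
        client.Connect(IPAddress.Loopback, 9000);
        byte[] payload = Encoding.ASCII.GetBytes("hello");
        client.GetStream().Write(payload, 0, payload.Length);

        // Server side: accept the connection and read what arrived.
        TcpClient conn = server.AcceptTcpClient();
        byte[] buffer = new byte[1024];
        int read = conn.GetStream().Read(buffer, 0, buffer.Length);
        Console.WriteLine("Received: " + Encoding.ASCII.GetString(buffer, 0, read));

        conn.Close();
        client.Close();
        server.Stop();
    }
}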
Under Mac OS X the same code results in packet loss. The greater the
distance between server and client, the greater the loss. I'm pretty sure
this is due to Mono, for two reasons:
1) I've tried pinging various servers using native utilities, and no packet
loss is apparent.
2) If it were a problem with the Mac's network card, I wouldn't expect
distance to be such a factor. But on a local network the packet loss is
minimal (though it still occurs), while over larger distances there's as
much as 90% packet loss.
My understanding is that TCP is guaranteed not to lose data: any lost
packets are detected and retransmitted. So if there were a problem with the
network card, I would expect transmission to just be very slow. But packets
are lost and never resent.
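To make that concrete, here is roughly the receive pattern I have in mind
(a simplified sketch, assuming the expected byte count is known up front).
With a loop like this, a slow link should just mean slow reads, not missing
data:

using System.IO;

// Read exactly 'count' bytes. A single Read may return fewer bytes
// than were sent in one call, which can be mistaken for packet loss
// if the caller treats each Read as a whole message.
static byte[] ReadExactly(Stream stream, int count)
{
    byte[] buffer = new byte[count];
    int offset = 0;
    while (offset < count)
    {
        int n = stream.Read(buffer, offset, count - offset);
        if (n == 0)
            throw new IOException("connection closed before full message arrived");
        offset += n;
    }
    return buffer;
}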
Has anyone else seen this? I'm no expert in these matters, so is it
possible that it actually is a hardware issue?