04-05-2009 10:03 PM
I have an application that receives datagrams in client mode; the server sends datagrams at a fast rate.
The problem is that the datagrams the client receives are not the ones the server just sent; there is a delay of several seconds.
I think this may be related to a large RCVBUF on the DatagramConnection. For SocketConnection there is a setSocketOption() method that can configure the connection's RCVBUF. Is there an equivalent method for DatagramConnection?
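For comparison, on platforms with the full java.net stack the receive buffer is a per-socket hint. Here is a minimal Java SE sketch (this is not the MIDP DatagramConnection API; as far as I know that interface exposes no equivalent setter, so this only illustrates the kind of option being asked about):

```java
import java.net.DatagramSocket;
import java.net.SocketException;

public class RcvBufDemo {
    // Asks the OS for a receive buffer of `requested` bytes and reports what was granted.
    static int configuredRcvBuf(int requested) throws SocketException {
        DatagramSocket sock = new DatagramSocket(); // binds an ephemeral local UDP port
        try {
            sock.setReceiveBufferSize(requested); // only a hint: the OS may clamp the value
            return sock.getReceiveBufferSize();   // read back the effective size
        } finally {
            sock.close();
        }
    }

    public static void main(String[] args) throws SocketException {
        System.out.println("effective receive buffer: " + configuredRcvBuf(64 * 1024) + " bytes");
    }
}
```

Note that a large receive buffer by itself does not create delay; it only lets unread datagrams pile up if the application reads them more slowly than they arrive.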
Solved!
04-17-2009 10:56 AM
04-17-2009 11:17 AM
Even on a PC with an Ethernet connection, it's possible to have a few seconds of delay, though that is unusual; it's usually not more than about 200 ms, and if it's local, it'll typically be under 10 ms.
Over the air, a few seconds is not unreasonable at all. When streaming audio, there is usually a buffer involved, which is why it's best to use the API provided for sending things like audio, since it takes care of all that for you. Remember also that you are using Java, which isn't the fastest platform in the world, but the time to access the data should be insignificant compared to the speed of the connection.
Just make sure that you are keeping the connection open rather than reconnecting for every chunk of data.
You should just have a loop that reads whatever is available in your input buffer and returns the number of bytes read. On the transmit side, the server should have a similar loop. In typical socket connections, if one loop gets ahead of the other, the nature of the algorithm works itself out into a synchronized mode.
i.e. if you send one chunk and wait for it to be received, you might see a delay of a few seconds, but if you are transmitting continuously, you can be reading one chunk while the next is being transmitted, so the second chunk may arrive with no delay.
Are you still seeing a really slow speed when you read in a loop as described?
I guess what I'm saying is that while I'd expect a longer delay for the first read, I'd expect subsequent reads to be much faster, so the overall bandwidth is what you should be measuring.
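A minimal sketch of that continuous read loop, in standard Java with plain java.net sockets rather than the BlackBerry SocketConnection API (the loopback sender and all names here are just for illustration):

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.ServerSocket;
import java.net.Socket;

public class ReadLoopDemo {
    // Keep reading whatever is available until `expected` bytes have arrived.
    static int drain(InputStream in, int expected) throws IOException {
        byte[] buf = new byte[4096];
        int total = 0;
        while (total < expected) {
            int n = in.read(buf, 0, buf.length); // blocks only until *some* data is ready
            if (n < 0) break;                    // peer closed the stream
            total += n;                          // later reads overlap in-flight transmissions
        }
        return total;
    }

    public static void main(String[] args) throws Exception {
        try (ServerSocket server = new ServerSocket(0)) { // loopback listener, ephemeral port
            Thread sender = new Thread(() -> {
                try (Socket s = new Socket("127.0.0.1", server.getLocalPort());
                     OutputStream out = s.getOutputStream()) {
                    byte[] chunk = new byte[512];
                    for (int i = 0; i < 100; i++) {
                        out.write(chunk); // one connection, continuous sends, no per-chunk handshake
                    }
                } catch (IOException ignored) {}
            });
            sender.start();
            try (Socket conn = server.accept()) {
                System.out.println("read " + drain(conn.getInputStream(), 100 * 512) + " bytes");
            }
            sender.join();
        }
    }
}
```

Only the first read pays the full round-trip latency; after that, reception and transmission overlap, which is why overall bandwidth is the right thing to measure.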
04-19-2009 08:30 PM
Thanks Mark and Donald, your suggestions are really appreciated.
I now have one thread running the receive loop and a second thread processing the data. With the receiving thread's priority set to Thread.MAX_PRIORITY and the processing thread's priority set to Thread.MIN_PRIORITY, the delay disappeared.
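To make the structure concrete, here is roughly what that looks like as a standard-Java sketch. The hand-off queue uses java.util.concurrent, which isn't available on CLDC; on the device you'd guard a Vector with wait/notify instead, and the receiver loop would block on the connection's receive() call. All names here are mine:

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class PriorityPipeline {
    static final int PACKETS = 1000;
    // Hand-off queue between the high-priority receiver and the low-priority worker.
    static final BlockingQueue<byte[]> queue = new LinkedBlockingQueue<>();
    static volatile int processed = 0;

    public static void main(String[] args) throws InterruptedException {
        Thread receiver = new Thread(() -> {
            // In the real app this loop would block on the datagram receive() call.
            for (int i = 0; i < PACKETS; i++) {
                queue.add(new byte[64]); // enqueue each datagram as fast as it arrives
            }
        });
        Thread worker = new Thread(() -> {
            try {
                for (int i = 0; i < PACKETS; i++) {
                    queue.take(); // process at whatever CPU time is left over
                    processed++;
                }
            } catch (InterruptedException ignored) {}
        });
        receiver.setPriority(Thread.MAX_PRIORITY); // drain the network buffer first
        worker.setPriority(Thread.MIN_PRIORITY);   // processing yields to reception
        receiver.start();
        worker.start();
        receiver.join();
        worker.join();
        System.out.println("processed " + processed + " packets, backlog " + queue.size());
    }
}
```

The point of the split is that the receiver empties the network's receive buffer as soon as data arrives, so delay cannot build up there; the backlog moves into the application's own queue, where the worker catches up whenever the radio is idle.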
Thanks again and Best Regards!
04-19-2009 09:24 PM
Ah, I'm glad you got it to work. While using the max and min priorities is fine to get it working, I'd recommend experimenting a little to see whether you can avoid the extremes; if you add timers and so on, you could end up with a race condition. Actually, I'm not sure what priorities everything in the OS runs at (I wish RIM would tell us), as that would be useful to know when deciding what to set our various threads to.
Of course, I'm also a believer in the philosophy "If it works, don't fix it".... :-)
The only reason for "fixing it" might be if you plan on expanding it, or if you expect it to run in parallel with other apps.
Just from experience I try to always avoid using extremes of priority.
Whatever you think is best.
04-19-2009 11:03 PM
Hi Donald, I agree with your "avoid using extremes of priority"!
I tried setting the two priorities closer to Thread.NORM_PRIORITY, but the closer they got to normal, the more delayed data appeared.
I also tried using a TimerTask instead of a Thread, but it performed like a thread at normal priority.
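That matches how java.util.Timer behaves: all tasks run on a single background thread, which (like any Thread) starts at the priority of the thread that created it, normally Thread.NORM_PRIORITY. If a timer-driven receiver were still wanted, one option is to raise that thread's priority from inside the first scheduled task. A small standard-Java sketch (the class and field names are mine):

```java
import java.util.Timer;
import java.util.TimerTask;
import java.util.concurrent.CountDownLatch;

public class TimerPriorityDemo {
    static volatile int observed = -1;

    public static void main(String[] args) throws InterruptedException {
        CountDownLatch done = new CountDownLatch(1);
        Timer timer = new Timer();
        timer.schedule(new TimerTask() {
            public void run() {
                // The timer's worker thread starts at the creator's (normal) priority;
                // it can be bumped once from inside the first task it runs.
                Thread.currentThread().setPriority(Thread.MAX_PRIORITY);
                observed = Thread.currentThread().getPriority();
                done.countDown();
            }
        }, 0);
        done.await(); // wait until the task has run
        timer.cancel();
        System.out.println("timer thread priority now " + observed);
    }
}
```

Even so, a dedicated receive thread is usually the better fit here, since the timer thread is shared by every task scheduled on that Timer.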
For my situation, maybe it's best to use max and min.