07-31-2012 03:49 PM
07-31-2012 06:23 PM
After GPS, the network is the principal killer of battery. So the answer to your question:
"Will keeping a socket connection open be noticeably bad for the user's battery?"
is yes. But will it be unacceptably bad? I have no direct experience with keeping a socket open permanently. However, I have some experience with an application that uses a socket connection to send and receive data, and opens a new socket whenever there is data to be sent or received. That does not kill the battery - the device still lasts a day.
However, I am not convinced that you need to do this. The socket connection approach documented above will get data to and from the phone in under 10 seconds, even on a low-speed network. But the biggest impact on performance appears to be network latency: even though the server turns the message around immediately, it still takes between 3 and 5 seconds to get back to the phone. So latency is still going to affect you.
And I suspect your biggest issue is the time taken to establish a connection. I would analyze the various parts of your network communication and work out where the time is actually being spent. It could be in obtaining the connection from the ConnectionFactory, and that is something you can do something about.
So before you give up on the current implementation, see if you can do some analysis of where you are taking the performance hit.
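One way to do that analysis (a hedged sketch - the PhaseTimer class and its method names are my own invention, not a BlackBerry API; on the device you would write the entries to the event log instead of System.out) is to record elapsed milliseconds at each phase of the request:

```java
import java.util.ArrayList;
import java.util.List;

// Minimal phase timer: call mark("phase") after each step and the
// elapsed time since the previous mark is recorded against that name.
public class PhaseTimer {
    private final List<String> log = new ArrayList<String>();
    private long last = System.currentTimeMillis();

    public void mark(String phase) {
        long now = System.currentTimeMillis();
        log.add(phase + ": " + (now - last) + " ms");
        last = now;
    }

    public List<String> getLog() {
        return log;
    }

    public static void main(String[] args) throws InterruptedException {
        PhaseTimer t = new PhaseTimer();
        // In the real application these marks would surround
        // getConnection(), opening the input stream, the read loop, etc.
        Thread.sleep(50);   // stand-in for "establish connection"
        t.mark("connect");
        Thread.sleep(20);   // stand-in for "first byte arrives"
        t.mark("firstByte");
        for (String line : t.getLog()) {
            System.out.println(line);
        }
    }
}
```

Comparing the logged numbers across runs and transports should show whether the cost is in connection setup or in the round trip itself.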
08-02-2012 02:23 PM
08-02-2012 02:27 PM
Peter, first of all, thanks for your reply. I have done some analysis by writing to the event log at various stages of the connection. My approach is as follows:
1. new ConnectionFactory
2. set preferredTransportTypes (WIFI, WAP2, BIS, MDS, TCP)
3. connectionDescriptor = getConnection(url)
4. if not null
5. httpConnection = (HttpConnection) connectionDescriptor.getConnection();
6. open the InputStream
7. open a ByteArrayOutputStream
8. read the InputStream until -1
9. return or process the ByteArrayOutputStream as a byte array
I notice that most of the time, the delay is at step 5. This supports the latency argument. So you are saying the socket server alternative would not really make a major difference for this problem?
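For reference, steps 6 to 9 boil down to a read loop like the following (a generic Java sketch using only java.io; on the device the InputStream would come from the HttpConnection obtained in step 5):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class StreamUtil {
    // Steps 6-9: read the InputStream until -1 into a
    // ByteArrayOutputStream and return the accumulated bytes.
    public static byte[] readFully(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buffer = new byte[1024];
        int n;
        while ((n = in.read(buffer)) != -1) {
            out.write(buffer, 0, n);
        }
        return out.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] data = "hello".getBytes("UTF-8");
        byte[] result = readFully(new ByteArrayInputStream(data));
        System.out.println(result.length); // prints 5
    }
}
```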
08-02-2012 05:46 PM
Good analysis approach.
Unfortunately what you are seeing does not really support the latency argument. Unless you use HTTPS, the connection does not get established until you request something from it. Typically this is the response code. In your case it is getting the input stream, but the result is the same. The time from 5 to 6 includes the connection time and the latency time.
I was thinking you might notice a delay between 3 and 4 while the ConnectionFactory checked the available connections based on your preferences. But this is not what you see.
Based on what you do see, I'd like to know roughly how long the delay is between 5 and 6. If you use my figures, the time between 5 and 6 will include at most 5 seconds of latency; the rest is connection time. I would be interested to know what the difference is.
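In other words, the split works out like this (a rough sketch; the 5-second figure is just the latency upper bound quoted above, and the 12-second observation is a hypothetical example):

```java
public class ConnectionBreakdown {
    // Split the measured step-5-to-6 delay into an assumed latency
    // portion (capped at maxLatencyMs) and the remainder, which is
    // an estimate of pure connection-establishment time.
    public static long estimateConnectTimeMs(long totalDelayMs, long maxLatencyMs) {
        long connect = totalDelayMs - maxLatencyMs;
        return connect > 0 ? connect : 0;
    }

    public static void main(String[] args) {
        // e.g. a 12-second observed delay with at most 5 s of latency
        // leaves roughly 7 s attributable to establishing the connection.
        System.out.println(estimateConnectTimeMs(12000, 5000)); // prints 7000
    }
}
```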
Note I am assuming that you do not have WiFi enabled here.