01-15-2009 12:49 AM
OK, I made a simple program to check the coverage differences between Verizon and AT&T using getSignalLevel(). I have a BlackBerry Storm on Verizon; it is unlocked and I have an AT&T SIM, so I can switch back and forth. The program dumps getSignalLevel() every second, and I have a few questions about this API:
1) When I'm on Verizon, the best reading I ever get is -80, no matter how close I am to a tower. When I'm on AT&T I see it go as high as the -60s.
2) I'm baffled by the logic that maps getSignalLevel() to the bars display. There seems to be a different bars-to-signal-strength mapping on Verizon than on AT&T (again, with the same phone). Any ideas on this?
3) I used the getNetworkService() API. When I'm on Verizon I never seem to get back the EVDO service even though my phone is displaying 1XEV. When I'm on AT&T my phone only shows "EDGE", yet the API returns both EVDO and EDGE.
I'd appreciate help on these questions.
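For what it's worth, here is a sketch of what a bars mapping might look like. RIM's actual per-technology thresholds are not documented, so the cutoffs below are pure assumptions; they only illustrate how the same dBm reading could produce different bar counts on CDMA vs GSM:

```java
// Sketch of a dBm-to-bars mapping with ASSUMED thresholds. RIM's real
// per-technology mapping is not public; these cutoffs are guesses meant
// only to show why one phone could display different bars for the same
// dBm value depending on the network technology.
public class SignalBars {
    // Hypothetical cutoffs: signal >= cutoffs[i] means at least (i + 1) bars.
    private static final int[] CDMA_CUTOFFS = { -100, -95, -90, -85, -80 };
    private static final int[] GSM_CUTOFFS  = { -105, -98, -90, -80, -70 };

    static int bars(int dBm, int[] cutoffs) {
        int bars = 0;
        for (int i = 0; i < cutoffs.length; i++) {
            if (dBm >= cutoffs[i]) {
                bars++;
            }
        }
        return bars;
    }

    public static void main(String[] args) {
        // The same -80 dBm reading lands on different bar counts per technology.
        System.out.println("CDMA -80 dBm: " + bars(-80, CDMA_CUTOFFS) + " bars");
        System.out.println("GSM  -80 dBm: " + bars(-80, GSM_CUTOFFS) + " bars");
    }
}
```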
01-21-2009 03:28 PM
Please refer to this knowledge base article.
How To - Interpret wireless network signal levels
Article Number: DB-00708
10-23-2009 12:56 PM
I have seen the exact same behavior on my Verizon BlackBerry Storm. The function getSignalLevel() never returns a value greater than -80 and getNetworkService() never reports EVDO. Did you find a solution?
10-23-2009 01:25 PM
No I never did find a solution.
I think I came across a post on these forums saying that the EVDO enums are deprecated; you are probably seeing the warning when you compile. The new enums are still being worked on by RIM. And yes, on the Storm the best I ever see is -80. However, if you switch over to UMTS mode you can see the signal level reach the -60s or so.
10-23-2009 01:33 PM
Hmm, that's lame. I really need to differentiate 1xRTT vs. EVDO. It works fine on a Pearl 8130 and a Curve 8330. Is the Storm bugged?
Every CDMA phone I have experience with, Windows Mobile as well as BlackBerry, has capped the signal level at some arbitrary value, and not the same value on each phone. I don't know why. My theory is that the underlying CDMA technology does not need measurements above a certain level in order to work (e.g. to manage cell-site handoffs), so the hardware does not support such measurements. I don't know why GSM should be any different from CDMA, although CDMA can tolerate lower signal levels than GSM. Perhaps it is just a convention used by the hardware manufacturers.
10-23-2009 01:43 PM
The EVDO enums changed in OS 4.7 so if you're compiling with 4.6 or earlier but running on 4.7 or greater, you won't see EVDO come up.
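One likely reason the OS version matters: Java inlines static final int constants into the calling class at compile time, so a check compiled against the old constant value keeps that value forever, even when run on a newer OS. The numeric values below are hypothetical, not the real RadioInfo values, but the failure mode is the same:

```java
// Why compiling against an older OS can break the EVDO check: the Java
// compiler inlines static final int constants, so a test baked in with the
// old value never matches the bit the newer OS actually sets. The two
// values here are HYPOTHETICAL stand-ins for the real RadioInfo constants.
public class ConstantInlining {
    static final int OLD_NETWORK_SERVICE_EVDO = 1 << 10; // value inlined by the old compiler
    static final int NEW_NETWORK_SERVICE_EVDO = 1 << 16; // value the new device reports

    static boolean hasService(int serviceBits, int mask) {
        return (serviceBits & mask) != 0;
    }

    public static void main(String[] args) {
        int serviceBitsFromDevice = NEW_NETWORK_SERVICE_EVDO; // what a newer device reports
        // Code compiled against the old constant misses the service entirely:
        System.out.println(hasService(serviceBitsFromDevice, OLD_NETWORK_SERVICE_EVDO)); // false
        System.out.println(hasService(serviceBitsFromDevice, NEW_NETWORK_SERVICE_EVDO)); // true
    }
}
```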
The signal level behavior is expected and based on the network technology. CDMA networks can use lower power for an equally reliable signal. The CDMA chipset simply maxes out at -80. GSM/EDGE need more power to be reliable. I've seen -40 as the max when standing next to a base station. UMTS is actually Wideband-CDMA and maxes out at -70.
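If you want readings that are comparable across technologies, one option is to normalize the raw dBm against a per-technology floor and ceiling. The ranges below are assumptions taken from this thread (CDMA capping around -80, GSM reaching roughly -40), not documented values:

```java
// Sketch: normalize a raw dBm reading to a 0-100 scale using a
// per-technology floor and ceiling, so CDMA and GSM readings become
// comparable. The ranges used in main() are ASSUMPTIONS based on the
// caps discussed in this thread, not documented chipset limits.
public class SignalPercent {
    static int percent(int dBm, int floorDbm, int ceilingDbm) {
        if (dBm <= floorDbm) return 0;
        if (dBm >= ceilingDbm) return 100;
        return (dBm - floorDbm) * 100 / (ceilingDbm - floorDbm);
    }

    public static void main(String[] args) {
        // -80 dBm is the top of the assumed CDMA range but only the midpoint
        // of the assumed GSM range.
        System.out.println("CDMA: " + percent(-80, -120, -80) + "%");
        System.out.println("GSM:  " + percent(-80, -120, -40) + "%");
    }
}
```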
10-23-2009 01:48 PM
One thing I have noticed is that on the Storm, the value returned by getNetworkService() has bit #17 set. This bit is undocumented. Perhaps it is our mysterious NETWORK_SERVICE_EVDO_REV0 bit.
The exact value I see is 10000000000001110 (binary).
To know for sure, I need to travel to a location where EVDO is not available.
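For anyone who wants to test for that bit, here is a minimal check. The sample value is the one reported above (10000000000001110 binary = 0x1000E); whether bit #17 really means EVDO is still the open question:

```java
// Check whether bit #17 (counting from bit #1, so value 1 << 16) is set
// in the service mask returned by getNetworkService(). The sample value
// 0x1000E is the one observed on the Storm in this thread. Whether this
// bit actually indicates EVDO service is unconfirmed.
public class ServiceBit17 {
    static final int BIT_17 = 1 << 16; // bit #17, 1-based numbering

    static boolean isBit17Set(int serviceBits) {
        return (serviceBits & BIT_17) != 0;
    }

    public static void main(String[] args) {
        int observed = 0x1000E; // 10000000000001110 binary, as seen on the Storm
        System.out.println(isBit17Set(observed)); // true
        System.out.println(Integer.toBinaryString(observed)); // 10000000000001110
    }
}
```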
10-23-2009 02:00 PM
With respect to the value returned by getNetworkService(), the behavior is the same when compiling with 4.2.1 and 4.7.
It's true that CDMA can tolerate lower signal levels than GSM, but that doesn't explain a difference of 40 dB. I think the cap on CDMA devices is chosen arbitrarily rather than reflecting an actual physical limit.
I have seen signal level values of -37 dBm on UMTS/HSDPA Windows Mobile devices (e.g. the HTC Touch Diamond on AT&T). I have not seen a cap of -70 dBm on any UMTS/HSDPA device, but I have not tried running my Storm on UMTS/HSDPA.