I’d like to learn more about how OpenCellID determines the approximate (or “updatable”) cell tower locations, and how I can optimize my reports to contribute to this.
I’m working on an open-source “probe” project using a low-cost dev board that has a cellular module (available in both CAT-M and CAT-1 versions) and GPS. I’m using it with multi-carrier SIM cards (although it could use any SIM), so in my configuration I can test across ~400 carriers in ~190 countries.
My thought is that when this is powered, it will scan for networks, connect to each visible network that it can, and report the data to OpenCellID via the API.
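Roughly, the main loop would look something like the sketch below (just a sketch to show the flow; the helper functions are stand-ins for modem and API glue I haven’t written yet):

```python
# Sketch of the probe's main loop. The helpers passed in (scan_networks,
# attach_to, read_cell_info, read_gps_fix, post_to_opencellid) are stand-ins
# for the real modem/API glue, not actual library calls.
import time


def run_probe(scan_networks, attach_to, read_cell_info, read_gps_fix, post_to_opencellid):
    while True:
        for network in scan_networks():        # carriers visible to the module
            if not attach_to(network):         # skip networks we can't register on
                continue
            cell = read_cell_info()            # MCC/MNC, TAC/LAC, cell ID, signal
            fix = read_gps_fix()               # lat/lon, HDOP, timestamp
            if fix is not None:
                post_to_opencellid(cell, fix)  # one measurement report per cell
        time.sleep(60)                         # pacing between full scan cycles
```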
My question is about the “optimal” reporting “interval”, not only based on time, but also on location or physical separation of reports.
Obviously, I could simply run tests and report non-stop, but it seems there would be some separation distance that would provide better data for triangulation, if that is what you are doing to approximate and update the tower locations.
What is that distance? If I generate a report for a carrier, how far should I measure in distance before reporting on that same tower and connection? 1 mile? 2 miles? 5 miles? More?
I don’t want to send excessive similar reports if the data will not be valuable for OpenCellID to leverage.
Nice setup with the multi-carrier approach. To optimize contributions: send as much data as your mobile data budget allows. We dedupe internally at ~50m intervals and handle archival, so no need to worry about measurement density.
More data points = better tower location confidence, especially when captured from different angles/distances. Our backend handles the optimization, so the main constraint should just be your data plan economics.
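If you do want to gate reports client-side anyway, a simple per-cell distance check works; here’s a minimal Python sketch (the 50 m threshold just mirrors the dedupe interval mentioned above, and the key/field names are only illustrative):

```python
# Optional client-side gating: only re-report a cell once you've moved far
# enough from where you last reported it. Purely illustrative.
import math

MIN_SEPARATION_M = 50.0  # mirrors the ~50 m server-side dedupe interval

_last_reported = {}  # (mcc, mnc, tac_or_lac, cellid) -> (lat, lon)


def _haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in metres."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))


def should_report(cell_key, lat, lon):
    """True if this cell has not yet been reported within MIN_SEPARATION_M."""
    prev = _last_reported.get(cell_key)
    if prev is not None and _haversine_m(prev[0], prev[1], lat, lon) < MIN_SEPARATION_M:
        return False
    _last_reported[cell_key] = (lat, lon)
    return True
```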
Mind sharing a link to your probe project? Sounds interesting.
I have not yet pushed the code to a new repo, but I’ve finally made time to work on this again. I’m currently testing with the CAT-M board, which uses the SIMCom SIM7000G modem.
I’m getting the following test data back from the modem:
I have a few questions on mapping this data into a report.
The API seems to have a LAC parameter whose description says it accepts either LAC or TAC, but it also has a separate TAC parameter. It seems I ought to use the TAC parameter instead of LAC. Is this correct?
I’m not quite clear which “signal” I ought to be providing you. From the documentation, the various values provided mean:
<RSRQ> - (-8 above) Current reference signal receive quality as measured by L1
<RSRP> - (-96 above) Current reference signal received power. Available for CAT-M or NB-IOT
<RSSI> - (-72 above) Current Received signal strength indicator
<RSSNR> - (16 above) Average reference signal signal-to-noise ratio of the serving cell
The value of SINR can be calculated according to <RSSNR>, the formula is as below:
SINR = 2 * <RSSNR> - 20. (Calculated as 12 from above)
The range of SINR is from -20 to 30
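In code terms I read that as the following (my own interpretation of the SIMCom note; the clamp to the documented range is my addition):

```python
def sinr_from_rssnr(rssnr):
    """Convert the modem's <RSSNR> value to SINR using the SIMCom formula."""
    sinr = 2 * rssnr - 20
    # The documented SINR range is -20..30; clamping is my own addition.
    return max(-20, min(30, sinr))


print(sinr_from_rssnr(16))  # -> 12, matching the value calculated above
```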
I have another question related to the required values. You ask for:
<rating> double GPS quality/accuracy information (metres)
I’m curious whether you consider a calculated value based on the GPS HDOP to be adequate. From my understanding, the estimate error constant for HDOP is 5 meters, and so I can use the formula: accuracy (meters) = HDOP × 5 meters.
I’ll try to address your questions point by point.
Q1: TAC vs. LAC parameter
You’re correct that the API mentions both LAC and TAC. Our system is designed to accommodate different data reporting formats. If you’re submitting data via CSV upload, use the LAC column and leave the TAC column blank. However, if you’re submitting through the API and have TAC data available, you should use the TAC parameter for better alignment.
Q2: Which signal to provide?
For LTE and CAT-M1, you should use the RSRP (Reference Signal Received Power) value expressed in dBm. This should fall within the range of -45 dBm to -137 dBm. Alternatively, if your device reports ASU (Arbitrary Strength Unit), the corresponding range is 0 to 95 ASU. More details about signal filtering and data types can be found here under the “Filtering of data” section.
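As a quick illustration of that check (the dBm-to-ASU conversion below uses the common LTE convention ASU = RSRP + 140, which is not an OpenCellID-specific rule, so verify it against your modem’s documentation):

```python
def signal_from_rsrp(rsrp_dbm):
    """Validate an LTE/CAT-M1 RSRP reading and derive the equivalent ASU value.

    The accepted -137..-45 dBm range is the one quoted above; the ASU
    conversion (RSRP + 140) is the common LTE convention, shown only for
    comparison with the 0..95 ASU range.
    """
    if not -137 <= rsrp_dbm <= -45:
        raise ValueError(f"RSRP {rsrp_dbm} dBm is outside the accepted range")
    return {"signal_dbm": rsrp_dbm, "asu": rsrp_dbm + 140}


print(signal_from_rsrp(-96))  # the RSRP value from your modem output
```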
Q3: GPS accuracy
If your device reports GPS accuracy directly in meters, you can use that value as is. For devices reporting HDOP, your proposed formula seems fine: Accuracy (meters) = HDOP × Error Constant (5 meters)
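In code that is simply (using the 5 m error constant you proposed):

```python
HDOP_ERROR_CONSTANT_M = 5.0  # estimated error per unit of HDOP, as proposed above


def rating_from_hdop(hdop):
    """Approximate GPS accuracy in metres from HDOP, for the <rating> field."""
    return hdop * HDOP_ERROR_CONSTANT_M


print(rating_from_hdop(1.2))  # e.g. HDOP of 1.2 -> 6.0 metres
```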
I hope you are doing well. I wanted to see if you could comment on my two posts above. My device is now working and under test, but it is only posting to my own cloud for now. I want to ensure that the data I am about to post to your API is correct and accurate.
I look forward to your response, and hope to be posting to your platform by this coming weekend.