


Maybe there's an error in 36.214, because the RSRP definition looks the same as the RSSI definition, which would make RSRQ always zero (it says "measurements in the numerator and denominator shall be made over the same set of resource blocks"), and it says RSSI only considers ref symbols.

So assuming RSSI measures something sensible in LTE, and since 36.211 shows 4 ref symbols (all cell-specific) per RB on antenna port zero, the maximum RSRQ when only ref signals are transmitted across the considered measurement bandwidth would be zero dB (4 ref symbols, remaining 80 resource elements empty): 4/4 = 1.

For cases where the remaining 80 resource elements in each RB carry energy, we would have 4/84 = -13.22 dB RSRQ.

A reasonable average of 50% RE population would then give an approximate RSRQ average of 4/40 = -10 dB.

An absolute minimum, almost guaranteed to cause a radio link failure, would be where 3 of the ref symbol resource elements suffer interference and the UE can only detect energy in 1 ref symbol, giving a ratio and RSRQ of 1/84 = -19 dB. Only 2 detectable demodulation ref symbols in each RB of 84 REs would give 2/84 = -16 dB RSRQ, matching the points where I start seeing radio link failures in the logs.

At least, these calculations match what I see in UE logs.

Please let me know if these calculations appear logical, or have I got this wrong?
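For reference, here is a small Python sketch I put together purely to check the arithmetic of the per-RB ratio model above (the ratio_db helper and case labels are mine, not from the specs; it assumes 84 REs per RB per slot and 4 cell-specific ref symbols on port 0, as discussed above):

import math

def ratio_db(ref_res, total_res):
    # Per-RB ratio of detectable ref-symbol REs to REs carrying energy, in dB
    return 10 * math.log10(ref_res / total_res)

# (ref REs detected, REs carrying energy) for each case above
cases = {
    "only ref signals transmitted (4/4)":       (4, 4),    # expect   0 dB
    "fully loaded RB (4/84)":                   (4, 84),   # expect -13.22 dB
    "~50% RE population (approx. 4/40)":        (4, 40),   # expect -10 dB
    "only 2 ref symbols detectable (2/84)":     (2, 84),   # expect -16 dB
    "only 1 ref symbol detectable (1/84)":      (1, 84),   # expect -19 dB
}

for label, (num, den) in cases.items():
    print(f"{label}: {ratio_db(num, den):.2f} dB")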
