Outage Probability and Cell Coverage
If the target minimum received power is -80 dBm, and the received power at a distance d is -75 dBm, what is the outage probability at this distance?
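A minimal sketch of the underlying check, assuming the quoted -75 dBm is the deterministic (or mean) received power at distance d; if log-normal shadowing is assumed instead, the outage probability is the Q-function of the fade margin, and the sigma value below is illustrative only, not given in the question:

```python
from math import erf, sqrt

def q_func(x: float) -> float:
    """Gaussian tail probability Q(x) = P(Z > x) for a standard normal Z."""
    return 0.5 * (1.0 - erf(x / sqrt(2.0)))

p_min_dbm = -80.0          # target minimum received power
p_rx_dbm = -75.0           # received power at distance d
fade_margin_db = p_rx_dbm - p_min_dbm   # 5 dB margin above the threshold

# Deterministic reading of the question: received power exceeds the minimum,
# so no outage occurs (outage probability 0).
print("outage (deterministic):", p_rx_dbm < p_min_dbm)

# With log-normal shadowing, P_out = Q(margin / sigma); sigma_db is an
# assumed illustrative value, not stated in the question.
sigma_db = 8.0
print("P_out with shadowing:", q_func(fade_margin_db / sigma_db))
```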
A receiver at distance d = 1 km experiences a path loss of 80 dB. If the transmitter power is 35 dBm and the target minimum power is -45 dBm, does this receiver experience an outage?
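A short link-budget check, assuming received power is simply transmit power minus path loss (no antenna gains or other losses are stated in the question):

```python
tx_power_dbm = 35.0    # transmitter power
path_loss_db = 80.0    # path loss at d = 1 km
p_min_dbm = -45.0      # target minimum received power

# Received power = transmit power - path loss (other gains/losses assumed zero).
p_rx_dbm = tx_power_dbm - path_loss_db   # 35 dBm - 80 dB = -45 dBm

print("received power:", p_rx_dbm, "dBm")
print("outage:", p_rx_dbm < p_min_dbm)   # the link sits exactly at the threshold
```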
If the coverage radius of a base station is increased while keeping transmit power constant, what is likely to happen to the outage probability?
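One way to see the trend: under a simple log-distance path-loss model (the exponent and reference values below are assumptions, not part of the question), the cell-edge received power drops as the radius grows, so the fade margin over any fixed receiver sensitivity shrinks and edge outage becomes more likely:

```python
from math import log10

def edge_rx_power_dbm(tx_dbm: float, radius_m: float,
                      pl_ref_db: float = 70.0, ref_m: float = 100.0,
                      exponent: float = 3.5) -> float:
    """Cell-edge received power under a log-distance path-loss model."""
    path_loss_db = pl_ref_db + 10 * exponent * log10(radius_m / ref_m)
    return tx_dbm - path_loss_db

# Fixed transmit power, growing radius: edge received power keeps falling.
for radius_m in (500, 1000, 2000):
    print(radius_m, "m ->", round(edge_rx_power_dbm(43.0, radius_m), 1), "dBm")
```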
A cell has an outage probability of 0.1. Which statement is correct?
Which technique can help improve coverage without increasing transmit power?
If the received SNR is below the target SNR threshold, this results in:
In a Rayleigh fading channel, outage probability is typically:
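For reference, under Rayleigh fading the instantaneous SNR is exponentially distributed, so the outage probability against a threshold SNR gamma_th with mean SNR gamma_bar is 1 - exp(-gamma_th/gamma_bar); a quick numerical check with illustrative values (not taken from the question):

```python
from math import exp

def rayleigh_outage(gamma_th_db: float, gamma_mean_db: float) -> float:
    """P(SNR < gamma_th) when the instantaneous SNR is exponentially
    distributed, as it is for a Rayleigh-faded envelope."""
    gamma_th = 10 ** (gamma_th_db / 10.0)
    gamma_mean = 10 ** (gamma_mean_db / 10.0)
    return 1.0 - exp(-gamma_th / gamma_mean)

# Illustrative numbers only: 10 dB threshold, 20 dB average SNR -> ~0.095.
print(rayleigh_outage(10.0, 20.0))
```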
Which environment would likely require the smallest cell radius to maintain low outage probability?