Improving 5G Millimeter-Wave Signal Analysis Accuracy
Millimeter-wave (mmWave) frequencies unlock the true potential of 5G. Their ultra-wide bandwidths enable faster wireless connection speeds and high capacity with low latency, meeting growing industry demands. Many mobile network operators began deploying commercial 5G mmWave networks in 2020, and all have large-scale mmWave deployments on their roadmaps. In response, 5G chipset, device, and base station makers are ramping up their design and manufacturing capabilities to bring more 5G mmWave products and services to market.
Millimeter-wave technology is a crucial element that sets 5G apart from 4G Long Term Evolution (LTE), but it creates many challenges across the design, manufacturing, and deployment workflow. At mmWave frequencies, significant path loss makes radio-frequency (RF) power limited and costly. These frequencies also force a disruptive change in test methods: measuring performance metrics over the air (OTA) makes achieving accurate and repeatable results more difficult, and wide bandwidths introduce more noise and larger frequency-response variations. Together, these challenges increase measurement complexity and uncertainty for chipset and device makers, network equipment manufacturers, and operators.
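To illustrate why path loss is so severe at mmWave frequencies, the sketch below applies the standard free-space path loss (Friis) formula to compare a 28 GHz mmWave link with a 2.4 GHz link over the same distance. The 28 GHz carrier and 100 m distance are illustrative values chosen here, not figures from this paper.

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def fspl_db(freq_hz: float, dist_m: float) -> float:
    """Free-space path loss (Friis formula) in dB:
    FSPL = 20*log10(4 * pi * f * d / c)."""
    return 20 * math.log10(4 * math.pi * freq_hz * dist_m / C)

# Compare a 28 GHz mmWave link with a 2.4 GHz link over 100 m
loss_mmwave = fspl_db(28e9, 100)   # ~101.4 dB
loss_sub6 = fspl_db(2.4e9, 100)    # ~80.1 dB
print(f"28 GHz: {loss_mmwave:.1f} dB")
print(f"2.4 GHz: {loss_sub6:.1f} dB")
print(f"extra loss at mmWave: {loss_mmwave - loss_sub6:.1f} dB")
```

The roughly 21 dB gap (a factor of over 100 in power) at identical distance is why every fraction of a dB of signal path loss matters in mmWave test setups.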
This paper discusses how to improve 5G mmWave signal analysis accuracy by:
- reducing signal path loss
- improving signal conditioning
- calibrating at the reference plane