
Optimum Averaging of Superimposed Training Schemes in OFDM under Realistic Time-Variant Channels

IEEE Access (2021)

Abstract
The current global bandwidth shortage in orthogonal frequency division multiplexing (OFDM)-based systems motivates the use of more spectrally efficient techniques. Superimposed training (ST) is a candidate in this regard because it incurs no information rate loss; it is also flexible to deploy and has a low computational cost. However, the data symbols sent together with the training sequence cause intrinsic interference. Previous studies, based on an oversimplified quasi-static channel model, have dealt with this interference by averaging the received signal over the coherence time. In this paper, the mean square error (MSE) of the channel estimate is minimized in a realistic time-variant scenario. The optimization problem is stated and theoretical derivations are presented to obtain the optimal number of OFDM symbols to average. The derived optimal averaging length depends on the signal-to-noise ratio (SNR) and yields an MSE up to two orders of magnitude lower than that obtained by averaging over the coherence time. Moreover, in most cases the optimal number of OFDM symbols is much smaller, roughly a 90% reduction relative to the coherence time, which also reduces the system delay. These results therefore improve performance in terms of channel estimation error while achieving better energy efficiency and lower delays.
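The trade-off the paper optimizes can be illustrated with a minimal simulation sketch: averaging more received OFDM symbols suppresses the zero-mean data interference and the noise, but a longer window lags behind a time-varying channel, so the estimation MSE is minimized at some intermediate window length that depends on the SNR. The sketch below is not the paper's derivation; it assumes an illustrative first-order autoregressive (AR(1)) model for the fading taps and a simple least-squares ST estimator, with all parameter values (K, L, rho, snr_db) chosen arbitrarily for demonstration.

# Minimal sketch (illustrative, not the paper's exact method): channel
# estimation MSE vs. number of averaged OFDM symbols N under superimposed
# training (ST) in a time-variant channel.
import numpy as np

rng = np.random.default_rng(0)

K = 64             # subcarriers per OFDM symbol (assumed)
L = 4              # channel taps (assumed)
snr_db = 10.0      # signal-to-noise ratio (assumed)
rho = 0.999        # AR(1) tap correlation modeling time variation (assumed)
num_symbols = 200  # OFDM symbols available for averaging

# Known superimposed training sequence (unit magnitude, frequency domain).
C = np.exp(1j * 2 * np.pi * rng.random(K))

# Time-variant channel: each tap follows a first-order AR process, a
# common stand-in for correlated Rayleigh fading. Total tap power is 1.
h = np.zeros((num_symbols, L), dtype=complex)
h[0] = (rng.standard_normal(L) + 1j * rng.standard_normal(L)) / np.sqrt(2 * L)
for n in range(1, num_symbols):
    w = (rng.standard_normal(L) + 1j * rng.standard_normal(L)) / np.sqrt(2 * L)
    h[n] = rho * h[n - 1] + np.sqrt(1 - rho**2) * w
H = np.fft.fft(h, K, axis=1)  # frequency response per OFDM symbol

# Received frequency-domain symbols: (data + training) through the channel.
noise_var = 10 ** (-snr_db / 10)
S = (rng.standard_normal((num_symbols, K)) +
     1j * rng.standard_normal((num_symbols, K))) / np.sqrt(2)  # zero-mean data
V = np.sqrt(noise_var / 2) * (rng.standard_normal((num_symbols, K)) +
                              1j * rng.standard_normal((num_symbols, K)))
Y = H * (S + C) + V

# Average the last N symbols: data interference and noise shrink roughly
# as 1/N, but the window increasingly lags the channel at the latest
# symbol, so the MSE typically bottoms out at an intermediate N.
for N in (1, 5, 20, 50, 100, 200):
    H_hat = Y[num_symbols - N:].mean(axis=0) / C   # LS estimate from ST
    mse = np.mean(np.abs(H_hat - H[-1]) ** 2)      # error at latest symbol
    print(f"N = {N:3d}  MSE = {mse:.4e}")

Running this prints the MSE for each window length; under these toy assumptions the minimum falls well short of the full observation window, mirroring the paper's finding that the optimal averaging length is much shorter than the coherence time.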
Key words
Channel estimation, OFDM, superimposed training, time-variant channel, coherence time, channel models, least squares, optimization, averaging, computational modeling