Goal-Oriented CSI Feedback for MRT-Precoded Massive MIMO Communication Systems

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Downlink channel state information (CSI) feedback typically incurs an unacceptable overhead in frequency-division duplex (FDD) massive multiple-input multiple-output (MIMO) systems. To deal with this challenge, several deep learning (DL) based CSI compression and recovery approaches have been developed, which follow an auto-encoder architecture and aim at minimizing the CSI reconstruction error. Different from the mainstream methodology mentioned above, in this letter we follow a goal-oriented design philosophy. That is, instead of minimizing the reconstruction error, we train a deep neural network (NN) to compress the CSI such that the precoder using the compressed CSI as input optimizes the downlink transmission performance, i.e., minimizes the bit error rate (BER) at the user equipments (UEs). A two-stage training method is developed to train the NN.
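The paper trains a neural encoder end-to-end against the downlink performance; as a rough illustration of the surrounding pipeline only, the sketch below replaces the learned compressor with a simple angular-domain (DFT) truncation (an assumption for illustration, not the paper's method) and applies the standard maximum ratio transmission (MRT) precoder, w = conj(h)/||h||, to the compressed CSI. It compares the effective channel gain achieved with full versus compressed CSI.

```python
import numpy as np

rng = np.random.default_rng(0)
Nt = 64  # number of BS antennas (massive MIMO downlink, single-antenna UE)

# Rayleigh-fading downlink channel vector seen by one UE.
h = (rng.standard_normal(Nt) + 1j * rng.standard_normal(Nt)) / np.sqrt(2)

def mrt_precoder(h_est):
    """MRT: conjugate of the (estimated) channel, normalized to unit power."""
    w = np.conj(h_est)
    return w / np.linalg.norm(w)

def compress_csi(h, K=8):
    """Stand-in for the learned encoder/decoder (illustrative assumption):
    keep only the K strongest angular-domain (DFT) coefficients of the CSI."""
    a = np.fft.fft(h) / np.sqrt(len(h))
    weakest = np.argsort(np.abs(a))[:-K]  # indices of all but the K largest
    a[weakest] = 0
    return np.fft.ifft(a) * np.sqrt(len(h))

w_full = mrt_precoder(h)                   # precoder from perfect CSI
w_comp = mrt_precoder(compress_csi(h, 8))  # precoder from compressed CSI

# Effective channel gain |h^T w|; with perfect CSI, MRT achieves ||h||.
gain_full = np.abs(h @ w_full)
gain_comp = np.abs(h @ w_comp)
print(f"gain with full CSI: {gain_full:.3f}, with compressed CSI: {gain_comp:.3f}")
```

By Cauchy-Schwarz, the full-CSI gain upper-bounds the compressed-CSI gain; the goal-oriented idea in the letter is to pick the compression so that this downstream gap (and hence the BER), rather than the CSI reconstruction error, is minimized.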

Original language: English
Title of host publication: 2024 IEEE 35th International Symposium on Personal, Indoor and Mobile Radio Communications, PIMRC 2024
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9798350362244
DOIs
State: Published - 2024
Event: 35th IEEE International Symposium on Personal, Indoor and Mobile Radio Communications, PIMRC 2024 - Valencia, Spain
Duration: 2 Sep 2024 – 5 Sep 2024

Publication series

Name: IEEE International Symposium on Personal, Indoor and Mobile Radio Communications, PIMRC
ISSN (Print): 2166-9570
ISSN (Electronic): 2166-9589

Conference

Conference: 35th IEEE International Symposium on Personal, Indoor and Mobile Radio Communications, PIMRC 2024
Country/Territory: Spain
City: Valencia
Period: 2/09/24 – 5/09/24

Keywords

  • CSI feedback
  • Massive MIMO
  • deep learning
  • precoding
