On Model Coding for Distributed Inference and Transmission in Mobile Edge Computing Systems

Jingjing Zhang, Osvaldo Simeone

Research output: Contribution to journal › Article › peer-review

30 Citations (Scopus)
135 Downloads (Pure)

Abstract

Consider a mobile edge computing system in which users wish to obtain the result of a linear inference operation on locally measured input data. While the input data is offloaded by the users, the model weight matrix is stored in a distributed fashion across the wireless Edge Nodes (ENs). The ENs have non-deterministic computing times, and they can cooperatively transmit any commonly computed outputs back to the users. This letter investigates the potential advantages of coding the model information prior to storage at the ENs. Through an information-theoretic analysis, it is concluded that, while coding generally limits cooperative transmission opportunities, it is instrumental in reducing the overall computation-plus-communication latency.
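
To make the idea of "model coding" concrete, the sketch below simulates generic MDS-coded distributed linear inference: the weight matrix is encoded before being stored at the ENs, so the result y = Wx can be decoded from the k fastest of n nodes, mitigating stragglers. The Vandermonde-based code, the parameter choices (n, k, m, d), and the exponential compute-time model are illustrative assumptions, not the specific construction or latency analysis of the letter.

```python
# Minimal sketch of MDS-coded distributed linear inference (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

n, k = 5, 3          # n edge nodes; any k of them suffice to decode
m, d = 6, 4          # W is (m x d); m assumed divisible by k

W = rng.standard_normal((m, d))   # model weights, known offline
x = rng.standard_normal(d)        # user input, offloaded at run time

# Offline encoding before storage at the ENs: split W into k row blocks and
# mix them with an n x k Vandermonde matrix, whose every k x k submatrix is
# invertible (the MDS property).
blocks = np.split(W, k)                                  # k blocks, (m//k, d)
G = np.vander(np.arange(1, n + 1), k, increasing=True).astype(float)
coded = [sum(G[i, j] * blocks[j] for j in range(k)) for i in range(n)]

# Run time: EN i computes coded[i] @ x after a random (straggling) delay.
delays = rng.exponential(1.0, size=n)
fastest = np.argsort(delays)[:k]          # wait only for the k fastest ENs

# Decoding at the user: invert the k x k submatrix of G for the responding
# ENs to recover the partial products W_j @ x, then stack them.
G_sub = G[fastest, :]
received = np.stack([coded[i] @ x for i in fastest])     # shape (k, m//k)
partials = np.linalg.solve(G_sub, received)              # row j = blocks[j] @ x
y_hat = partials.reshape(-1)

assert np.allclose(y_hat, W @ x)

# Splitting W uncoded over all n ENs would require waiting for the slowest
# node; the coded scheme only waits for the k-th fastest.
print("coded latency (k-th fastest EN):", np.sort(delays)[k - 1])
print("uncoded latency (slowest of n) :", delays.max())
```

This captures only the computation side of the trade-off studied in the letter; the cost of coding noted in the abstract, reduced opportunities for cooperative transmission of commonly computed outputs, is not modeled here.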

Original language: English
Article number: 8691770
Pages (from-to): 1065-1068
Number of pages: 4
Journal: IEEE Communications Letters
Volume: 23
Issue number: 6
Early online date: 15 Apr 2019
DOIs
Publication status: Published - Jun 2019

Keywords

  • Coding
  • Computing/Communication Latency
  • Edge Computing
  • Stragglers
