A proactive mobile edge cache policy based on the prediction by partial matching

Lincan Li, Chiew Foong Kwong, Qianyu Liu

Research output: Journal Publication › Article › peer-review


Abstract

Proactive caching is an emerging approach to cost-effectively boosting network capacity and reducing access latency, but its performance relies heavily on accurate content prediction. In this paper, a proactive cache policy is therefore proposed in a distributed manner that predicts both content popularity and user location in order to minimise latency and maximise the cache hit rate. A backpropagation neural network is applied to predict content popularity, and prediction by partial matching is chosen to predict user location. Simulation results show that, compared with two conventional reactive policies, LFU and LRU, the proposed policy improves the cache hit ratio by around 27%-60% and reduces the average latency by 14%-60%.
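The abstract's location-prediction component, prediction by partial matching (PPM), can be illustrated with a minimal sketch. The class below (a hypothetical illustration, not the authors' implementation) learns next-location counts for every context of length 0 to `max_order` over a sequence of visited cells, and predicts by falling back from the longest previously seen context to shorter ones.

```python
from collections import defaultdict


class PPMPredictor:
    """Minimal order-k Prediction by Partial Matching (PPM) sketch.

    Counts next-symbol occurrences for every context of length
    0..max_order; prediction falls back from the longest matching
    context to shorter ones (a simple escape-to-lower-order rule).
    """

    def __init__(self, max_order=2):
        self.max_order = max_order
        # counts[context][symbol] -> number of times `symbol` followed `context`
        self.counts = defaultdict(lambda: defaultdict(int))

    def train(self, sequence):
        """Update context counts from a sequence of visited locations."""
        for i, symbol in enumerate(sequence):
            for order in range(min(self.max_order, i) + 1):
                context = tuple(sequence[i - order:i])
                self.counts[context][symbol] += 1

    def predict(self, history):
        """Return the most likely next location given recent history."""
        for order in range(min(self.max_order, len(history)), -1, -1):
            context = tuple(history[len(history) - order:])
            if context in self.counts:
                successors = self.counts[context]
                return max(successors, key=successors.get)
        return None  # nothing learned yet
```

For example, after training on the visit sequence `A B C A B C A B`, the longest matching context `(A, B)` has always been followed by `C`, so `predict(["A", "B"])` returns `"C"`. In a proactive edge cache, such a prediction would indicate which base station should prefetch the user's content.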

Original language: English
Pages (from-to): 1154-1161
Number of pages: 8
Journal: Advances in Science, Technology and Engineering Systems
Volume: 5
Issue number: 5
DOIs
Publication status: Published - 2020

Keywords

  • Cache
  • Latency
  • Cache Hit Rate
  • Content Prediction
  • Location Prediction

ASJC Scopus subject areas

  • Engineering (miscellaneous)
  • Physics and Astronomy (miscellaneous)
  • Management of Technology and Innovation
