Abstract
Proactive caching has emerged as a cost-effective approach to boosting network capacity and reducing access latency, but its performance relies heavily on accurate content prediction. In this paper, a distributed proactive cache policy is proposed that uses predictions of both content popularity and user location to minimise latency and maximise the cache hit rate. A backpropagation neural network is applied to predict content popularity, and prediction by partial matching (PPM) is used to predict user location. Simulation results show that, compared with two conventional reactive policies, LFU and LRU, the proposed policy improves the cache hit ratio by around 27%-60% and reduces the average latency by 14%-60%.
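To illustrate the two prediction steps named in the abstract, the sketch below combines a simplified PPM-style next-cell predictor (falling back from order-2 to order-1 to order-0 contexts) with a proactive prefill of the predicted cell's cache using popularity scores. This is not the authors' implementation: the class and function names are illustrative, and the popularity scores are assumed to be precomputed (in the paper they would come from the backpropagation neural network).

```python
# Minimal sketch, assuming cells and contents are identified by strings and
# popularity scores are already available from a separate predictor.
from collections import defaultdict, Counter

class PPMLocationPredictor:
    """Predict a user's next cell from their movement history, escaping from
    order-2 to order-1 to order-0 contexts (simplified PPM)."""
    def __init__(self):
        self.counts = {2: defaultdict(Counter), 1: defaultdict(Counter), 0: Counter()}

    def update(self, history, next_cell):
        # Record the observed transition under every context order we keep.
        for k in (2, 1):
            if len(history) >= k:
                self.counts[k][tuple(history[-k:])][next_cell] += 1
        self.counts[0][next_cell] += 1

    def predict(self, history):
        # Use the longest context that has been seen before; otherwise escape
        # to a shorter one, ending with the unconditional (order-0) counts.
        for k in (2, 1):
            if len(history) >= k and self.counts[k][tuple(history[-k:])]:
                return self.counts[k][tuple(history[-k:])].most_common(1)[0][0]
        if self.counts[0]:
            return self.counts[0].most_common(1)[0][0]
        return None

def proactive_prefill(cache_size, predicted_popularity, predicted_users_per_cell, cell):
    """Prefill a cell's cache with the contents predicted to be most popular,
    but only if users are expected to arrive at that cell."""
    if predicted_users_per_cell.get(cell, 0) == 0:
        return set()
    ranked = sorted(predicted_popularity, key=predicted_popularity.get, reverse=True)
    return set(ranked[:cache_size])

# Example usage (hypothetical data):
predictor = PPMLocationPredictor()
predictor.update(["cellA", "cellB"], "cellC")
next_cell = predictor.predict(["cellA", "cellB"])          # -> "cellC"
cache = proactive_prefill(2, {"video1": 0.5, "video2": 0.3, "news1": 0.2},
                          {next_cell: 1}, next_cell)       # -> {"video1", "video2"}
```

Under a reactive policy such as LFU or LRU, the cache at the next cell would only be filled after misses occur; the point of the proactive step sketched here is that the content is already in place when the user arrives.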
Original language | English |
---|---|
Pages (from-to) | 1154-1161 |
Number of pages | 8 |
Journal | Advances in Science, Technology and Engineering Systems |
Volume | 5 |
Issue number | 5 |
DOIs | |
Publication status | Published - 2020 |
Keywords
- Cache
- Latency
- Cache Hit Rate
- Content Prediction
- Location Prediction
ASJC Scopus subject areas
- Engineering (miscellaneous)
- Physics and Astronomy (miscellaneous)
- Management of Technology and Innovation