Mobile edge caching is an emerging approach to managing high mobile data traffic in fifth-generation wireless networks: it reduces content access latency and offloads data traffic from backhaul links. This paper proposes a novel cooperative caching policy based on long short-term memory (LSTM) neural networks that accounts for the characteristics of the heterogeneous network layers and for user moving speed. Specifically, an LSTM network is applied to predict content popularity, and size-weighted content popularity is utilised to balance the influence of the predicted popularity against content size. We also consider the moving speeds of mobile users and introduce a two-level caching architecture consisting of several small base stations (SBSs) and macro base stations (MBSs). Because fast-moving users hand over frequently among SBSs, their requests would distort the content popularity distribution observed at each SBS; such users are therefore served by MBSs regardless of which SBS's coverage area they occupy. SBSs serve low-speed users, and SBSs in the same cluster can communicate with one another. Simulation results show that, compared with common caching methods such as least frequently used (LFU) and least recently used (LRU), the proposed policy achieves at least 8.9% lower average content access latency and at least a 6.8% higher offloading ratio.
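The abstract does not give the exact size-weighting formula, so the following is only an illustrative sketch: it assumes a popularity-per-unit-size score (predicted popularity divided by content size, where the popularity values would come from the LSTM predictor) and a greedy fill of a base station's cache. The function name and tuple layout are hypothetical, not taken from the paper.

```python
def select_cache(contents, capacity):
    """Greedily fill a cache by size-weighted content popularity.

    contents: list of (content_id, predicted_popularity, size) tuples;
              predicted_popularity stands in for the LSTM's output.
    capacity: total cache capacity of the base station (same unit as size).
    Returns the set of cached content ids.
    """
    # Rank contents by popularity per unit size, highest first,
    # so that small but popular items are favoured over large ones.
    ranked = sorted(contents, key=lambda c: c[1] / c[2], reverse=True)
    cached, used = set(), 0
    for cid, _pop, size in ranked:
        if used + size <= capacity:
            cached.add(cid)
            used += size
    return cached
```

For example, with items of popularity 0.9 (size 5), 0.8 (size 2), and 0.1 (size 1) and a capacity of 7, the greedy fill caches the first two: the size-2 item scores highest per unit size and the size-5 item still fits, while the least popular item is evicted by the capacity constraint.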