Mobile caching at the edge of the wireless network is regarded as an effective approach to reducing user access latency. A problem arises, however, when a user terminal (UT) moves through a serving cache area so quickly that it does not have enough time to acquire the required data from the cache. One solution is to predict the UT's future location and pre-place the requested content at the cache devices along the UT's future path. When the UT arrives at a serving cache area, it can then acquire the data directly, since the content is already in place, rather than sending a request to update the cache. The key to achieving this reliably is the accuracy of the location prediction. This paper presents a location prediction model based on the prediction-by-partial-matching (PPM) algorithm for mobile cache design. The performance of this model is compared using first-order, second-order, and third-order contexts, respectively. All models are evaluated on a real-world dataset.
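To make the idea of context-order prediction concrete, the following is a minimal sketch (not the paper's implementation) of a PPM-style next-location predictor: it counts next-location frequencies for contexts of length 1 up to `max_order`, and at prediction time backs off from the longest matching context to shorter ones, in the spirit of PPM's escape mechanism. The class name, location IDs, and interface are hypothetical.

```python
from collections import defaultdict

class PPMLocationPredictor:
    """Illustrative PPM-style next-location predictor (hypothetical sketch).

    Counts next-location frequencies for contexts of length 1..max_order
    and, at prediction time, backs off from the longest matching context
    to shorter ones, mimicking PPM's escape to lower-order models.
    """

    def __init__(self, max_order=3):
        self.max_order = max_order
        # counts[context_tuple][next_location] -> frequency
        self.counts = defaultdict(lambda: defaultdict(int))

    def train(self, trajectory):
        # trajectory: sequence of cell / cache-area IDs visited by a UT
        for i in range(len(trajectory) - 1):
            nxt = trajectory[i + 1]
            for k in range(1, self.max_order + 1):
                if i + 1 - k < 0:
                    break
                ctx = tuple(trajectory[i + 1 - k : i + 1])
                self.counts[ctx][nxt] += 1

    def predict(self, recent):
        # Back off from the longest available context down to order 1.
        for k in range(min(self.max_order, len(recent)), 0, -1):
            ctx = tuple(recent[-k:])
            if ctx in self.counts:
                return max(self.counts[ctx], key=self.counts[ctx].get)
        return None  # no matching context observed during training
```

For example, after training on the path `A B C A B C A B`, querying with the recent context `["A", "B"]` matches the order-2 context `(A, B)` and predicts `C`; higher-order contexts can disambiguate paths that share a single preceding location.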