Title
A Caching Strategy For Transparent Computing Server Side Based On Data Block Relevance
Abstract
The performance bottleneck of transparent computing (TC) lies on the server side, and caching is one of the key factors in server-side performance. The number of disk input/output (I/O) operations can be reduced if data blocks that are correlated with the currently accessed block are prefetched in advance, which improves the service performance and quality of user experience in TC. In this study, we propose a caching strategy for the TC server side based on data block relevance, called the correlation pattern-based caching strategy (CPCS). In this method, we adapt FP-stream, a model based on the frequent pattern tree (FP-tree) for mining frequent patterns from data streams, to the characteristics of data access in TC, and we devise a cache structure consistent with the storage model of TC. Finally, the access process in TC is simulated with real access traces under different caching strategies. Simulation results show that, with properly tuned parameters, the cache hit rate under the CPCS is higher than that of other algorithms.
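The abstract describes the CPCS idea at a high level: mine blocks that are frequently accessed together and prefetch them alongside the requested block to cut disk I/O. The sketch below illustrates that idea only; it is not the authors' FP-stream-based CPCS implementation, and the class name, sliding-window size, support threshold, and cache capacity are illustrative assumptions.

```python
# Minimal sketch of correlation-based prefetching for a block cache.
# Not the CPCS/FP-stream algorithm from the paper; it only illustrates the idea
# that blocks frequently co-accessed within a recent window are prefetched
# together, reducing cache misses (and hence disk I/O) on the server side.
from collections import OrderedDict, defaultdict, deque


class CorrelationPrefetchCache:
    def __init__(self, capacity=64, window=8, min_support=2):
        self.capacity = capacity             # blocks the cache can hold (assumed)
        self.window = deque(maxlen=window)   # sliding window of recent block IDs
        self.min_support = min_support       # co-access count needed to prefetch
        self.cooccur = defaultdict(int)      # (block_a, block_b) -> co-access count
        self.cache = OrderedDict()           # block_id -> data, in LRU order
        self.hits = self.misses = 0

    def _record(self, block_id):
        # Count co-occurrences between the new block and recently seen blocks.
        for other in self.window:
            if other != block_id:
                self.cooccur[(other, block_id)] += 1
                self.cooccur[(block_id, other)] += 1
        self.window.append(block_id)

    def _insert(self, block_id, data):
        self.cache[block_id] = data
        self.cache.move_to_end(block_id)
        while len(self.cache) > self.capacity:   # LRU eviction
            self.cache.popitem(last=False)

    def access(self, block_id, read_block):
        """Return block data, prefetching correlated blocks as a side effect."""
        self._record(block_id)
        if block_id in self.cache:
            self.hits += 1
            self.cache.move_to_end(block_id)
            data = self.cache[block_id]
        else:
            self.misses += 1
            data = read_block(block_id)          # one disk I/O
            self._insert(block_id, data)
        # Prefetch blocks strongly correlated with the requested block.
        correlated = [b for (a, b), n in self.cooccur.items()
                      if a == block_id and n >= self.min_support and b not in self.cache]
        for b in correlated:
            self._insert(b, read_block(b))
        return data


if __name__ == "__main__":
    disk = {i: f"data-{i}" for i in range(256)}          # toy "disk"
    cache = CorrelationPrefetchCache(capacity=16, window=4, min_support=2)
    trace = [1, 2, 3, 1, 2, 3, 1, 2, 3, 4, 1, 2, 3]      # repeated co-access pattern
    for blk in trace:
        cache.access(blk, disk.__getitem__)
    print(f"hits={cache.hits} misses={cache.misses}")
```

A frequency-based design like the paper's FP-stream approach would replace the pairwise counters above with tilted-time frequent-pattern mining over the access stream; the cache interface and prefetch-on-access flow would stay the same.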
Year
2018
DOI
10.3390/info9020042
Venue
INFORMATION
Keywords
transparent computing, caching strategy, FP-stream, frequent patterns
Field
Server-side, Bottleneck, Data mining, User experience design, Data stream mining, Cache, Computer science, Computer network, Block (data storage), Storage model, Data access
DocType
Journal
Volume
9
Issue
2
ISSN
2078-2489
Citations
0
PageRank
0.34
References
8
Authors
4
Name
Order
Citations
PageRank
Bin Wang190181.18
Lin Chen200.68
Weimin Li36325.40
Jinfang Sheng42310.02