Title
Small Files Storing And Computing Optimization In Hadoop Parallel Rendering
Abstract
The Hadoop framework has been widely used in the animation industry to build large-scale, high-performance parallel rendering systems. However, the Hadoop Distributed File System (HDFS) and the MapReduce programming model are designed to manage large files and suffer a performance penalty when storing and rendering the small RIB files produced in a rendering system. Therefore, a method that merges small RIB files based on two intelligent algorithms is proposed to solve this problem. The method uses Particle Swarm Optimization (PSO) and a Support Vector Machine (SVM) to choose the optimal merge value for a given scene file, mainly considering rendering time, memory limitations, and other indicators. It then exploits frame-to-frame coherence to merge RIB files at intervals determined by the optimal merge value. Finally, the proposed method is compared with the naive method on three different render scenes. Experimental results show that the proposed method significantly reduces the number of RIB files and render tasks and improves both the storage efficiency and the computing efficiency of RIB files.
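To make the merging idea in the abstract concrete, here is a minimal sketch (not the authors' implementation) of how per-frame RIB files could be grouped into merged render tasks once a merge value has been selected by the PSO/SVM stage. The class RibMerger, its methods, and the example merge value of 4 are hypothetical; the sketch simply groups consecutive frame files into bundles of at most mergeValue frames, which is one reading of merging "at intervals" while preserving frame-to-frame coherence.

```java
import java.util.ArrayList;
import java.util.List;

/**
 * Hypothetical sketch (not the paper's code): group per-frame RIB files
 * into merged bundles of size mergeValue, where mergeValue stands in for
 * the optimal merge value chosen by the PSO/SVM stage in the abstract.
 */
public class RibMerger {

    /** Split frame file names into bundles of at most mergeValue frames each. */
    public static List<List<String>> mergeByValue(List<String> ribFiles, int mergeValue) {
        List<List<String>> bundles = new ArrayList<>();
        for (int start = 0; start < ribFiles.size(); start += mergeValue) {
            int end = Math.min(start + mergeValue, ribFiles.size());
            bundles.add(new ArrayList<>(ribFiles.subList(start, end)));
        }
        return bundles;
    }

    public static void main(String[] args) {
        List<String> frames = new ArrayList<>();
        for (int i = 1; i <= 10; i++) {
            frames.add(String.format("frame_%03d.rib", i));
        }
        // With a (hypothetical) merge value of 4, ten frames become three
        // merged render tasks instead of ten separate small-file tasks.
        for (List<String> bundle : mergeByValue(frames, 4)) {
            System.out.println(bundle);
        }
    }
}
```

With ten frames and a merge value of 4, the sketch produces three bundles, so three render tasks replace ten; this mirrors the reduction in RIB files and render tasks that the abstract reports.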
Year
2015
DOI
10.1002/cpe.3847
Venue
2015 11TH INTERNATIONAL CONFERENCE ON NATURAL COMPUTATION (ICNC)
Keywords
Hadoop, rendering system, small files, PSO, SVM
Field
Computer science, Artificial intelligence, Computer hardware, Distributed File System, Particle swarm optimization, Parallel rendering, Programming paradigm, Support vector machine, Parallel computing, Storage efficiency, Animation, Rendering (computer graphics), Machine learning
DocType
Conference
Volume
29
Issue
20
Citations
2
PageRank
0.41
References
9
Authors
5
Name           Order  Citations  PageRank
Yizhi Zhang    1      2          0.41
Heng Chen      2      15         1.66
Zhu Zhengdong  3      7          2.90
Xiaoshe Dong   4      172        51.44
Honglin Cui    5      2          0.41