Abstract |
---|
The number of applications based on Apache Hadoop is increasing dramatically due to the robustness and dynamic features of this system. At the heart of Apache Hadoop, the Hadoop Distributed File System (HDFS) provides reliability and high availability for computation by applying static replication by default. However, because of the characteristics of parallel operations on the application l... |
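The abstract refers to HDFS's default static replication. As context, a minimal sketch of how that default is configured in a standard Hadoop deployment: the `dfs.replication` property in `hdfs-site.xml` fixes the per-block replication factor cluster-wide, and 3 is Hadoop's documented default.

```xml
<!-- hdfs-site.xml: HDFS replicates every block a fixed number of times.
     dfs.replication sets that static factor; 3 is the Hadoop default. -->
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>3</value>
  </property>
</configuration>
```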
Year | DOI | Venue |
---|---|---|
2016 | 10.1109/TKDE.2016.2523510 | IEEE Transactions on Knowledge and Data Engineering |
Keywords | Field | DocType
---|---|---|
Encoding,Big data,Fault tolerance,Fault tolerant systems,Measurement,Computers | Data mining,Computer science,Robustness (computer science),Artificial intelligence,Data file,Erasure code,Distributed computing,Distributed File System,Supervised learning,Fault tolerance,High availability,Big data,Machine learning | Journal |
Volume | Issue | ISSN
---|---|---|
28 | 6 | 1041-4347 |
Citations | PageRank | References
---|---|---|
4 | 0.47 | 26 |
Authors |
---|
4 |
Name | Order | Citations | PageRank |
---|---|---|---|
Dinh-Mao Bui | 1 | 32 | 3.35 |
Shujaat Hussain | 2 | 84 | 8.87 |
Eui-Nam Huh | 3 | 1036 | 113.46 |
Sungyoung Lee | 4 | 2932 | 279.41 |