| Distributed File System (DFS) | Hadoop Distributed File System (HDFS) |
| --- | --- |
| Primarily designed to hold large amounts of data while providing access to multiple clients over a network. | Designed to hold vast amounts of data (terabytes to petabytes) and to support very large individual files. |
| A whole file typically resides on a single machine. | Files are split into blocks that are stored across multiple machines. |
| Does not provide built-in data reliability. | Provides data reliability by replicating each block across several machines. |
| Many clients accessing the same data at the same time can overload the server. | HDFS handles concurrent access smoothly: load is spread across many DataNodes, so multiple clients do not overload a single server. |
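The reliability and large-file behaviour described above are governed by cluster configuration. As a minimal sketch, the two standard HDFS properties involved are `dfs.replication` (how many DataNodes hold a copy of each block) and `dfs.blocksize` (how large files are split); the values shown here are just the common defaults, and a real cluster may tune them differently:

```xml
<!-- hdfs-site.xml: illustrative values only -->
<configuration>
  <property>
    <name>dfs.replication</name>
    <!-- each block is stored on 3 DataNodes (the HDFS default) -->
    <value>3</value>
  </property>
  <property>
    <name>dfs.blocksize</name>
    <!-- 128 MB blocks: large files are split into chunks of this size -->
    <value>134217728</value>
  </property>
</configuration>
```

With replication of 3, the cluster can lose up to two machines holding a given block without losing data, which is what gives HDFS its reliability edge over a single-server DFS.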
By Best Interview Question on 02 Jun 2020