Hadoop is an open-source, Java-based programming framework designed to process big data across clusters of computers. The Hadoop architecture manages the transfer of data among nodes and allows the system to keep functioning even when individual nodes fail.
There are two main components in the Hadoop architecture: a distributed file system and the MapReduce engine. The distributed file system is the Hadoop Distributed File System (HDFS), which is where data is stored. The MapReduce engine is the framework used to process that data.
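To make the division of labor concrete, the sketch below simulates the three phases of the MapReduce model (map, shuffle, reduce) on a classic word-count task. This is a minimal illustration in Python, not Hadoop's actual API — real Hadoop jobs are written against the Java MapReduce classes, and the function names here are purely illustrative.

```python
from collections import defaultdict

def map_phase(documents):
    """Map step: emit a (word, 1) pair for every word in every document."""
    pairs = []
    for doc in documents:
        for word in doc.split():
            pairs.append((word.lower(), 1))
    return pairs

def shuffle_phase(pairs):
    """Shuffle step: group values by key, as the framework does
    between the map and reduce phases."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    """Reduce step: aggregate the list of values for each key."""
    return {word: sum(counts) for word, counts in grouped.items()}

if __name__ == "__main__":
    docs = ["Hadoop stores data in HDFS",
            "MapReduce processes data in Hadoop"]
    counts = reduce_phase(shuffle_phase(map_phase(docs)))
    print(counts)
```

In a real cluster, HDFS splits the input across nodes, each node runs the map function on its local block, and the framework handles the shuffle over the network — which is why node failure can be tolerated: a failed task is simply rerun elsewhere from the replicated data.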
It's no coincidence that saying "Hadoop" can conjure a smile on even the most serious developer's face: creator Doug Cutting named the framework after his son's toy elephant.