Main Features of Apache Hadoop
Hadoop is an open source framework and a popular system for storing and processing large sets of structured, semi-structured, and unstructured data. Here we discuss the main features of Apache Hadoop.
1. Hadoop is Open Source:
The Hadoop framework is open source, so its code can be modified and extended according to business requirements.
2. Fault Tolerant:
In Hadoop, data is stored in HDFS, where each file is split into blocks and every block is replicated (three copies by default) across the nodes of the cluster. If a node holding a block fails or goes out of service, the system automatically serves the data from another replica, so processing continues without interruption.
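As a small illustration, the sketch below uses the standard Hadoop FileSystem API to request a replication factor for a file and read it back; the path /data/example.txt and the class name are hypothetical, and the connection details come from whatever core-site.xml/hdfs-site.xml is on the classpath.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ReplicationCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();       // loads core-site.xml / hdfs-site.xml
        FileSystem fs = FileSystem.get(conf);
        Path file = new Path("/data/example.txt");      // hypothetical example file

        // Ask HDFS to keep three copies of this file's blocks across the cluster
        fs.setReplication(file, (short) 3);

        // Read back the replication factor recorded by the NameNode
        short actual = fs.getFileStatus(file).getReplication();
        System.out.println("Replication factor: " + actual);

        fs.close();
    }
}
```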
3. Scalable:
Hadoop runs on commodity hardware, and new nodes can be added to the cluster without any downtime. It scales horizontally: nodes join the system on the fly, and Hadoop applications can run on clusters with thousands of nodes.
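As a rough sketch of how the cluster view grows when nodes are added, the snippet below asks the NameNode for its current DataNode report through the DistributedFileSystem API; the class name is illustrative, and the output depends entirely on the cluster it connects to.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.hdfs.DistributedFileSystem;
import org.apache.hadoop.hdfs.protocol.DatanodeInfo;

public class ListDataNodes {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        if (fs instanceof DistributedFileSystem) {
            DistributedFileSystem dfs = (DistributedFileSystem) fs;
            // Every DataNode that joins the cluster appears in this report,
            // without restarting the NameNode or the running jobs
            for (DatanodeInfo node : dfs.getDataNodeStats()) {
                System.out.println(node.getHostName()
                        + " capacity=" + node.getCapacity());
            }
        }
        fs.close();
    }
}
```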
4. Cost Effective:
Hadoop offers a low-cost storage system for businesses and is widely used by internet-based companies. Estimates put the cost of running Hadoop, including hardware, software, and other expenses, at roughly $1,000 per terabyte of data, with combined computing and storage capacity available for a few hundred pounds per terabyte.
5. Economic:
Hadoop is inexpensive to run because it works on commodity hardware; no specialized machines are required. Since adding a new node is simple, capacity can be increased as requirements grow, without downtime or extensive advance planning.
6. Easy to use:
The client does not have to manage the distributed processing itself: the Hadoop framework takes care of splitting the input, scheduling tasks across the cluster, and handling failures, which makes Hadoop easy to use.
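For example, a developer typically writes only the per-record logic, such as the minimal word-count mapper sketched below; input splitting, task scheduling, shuffling, and retries are all handled by the framework. The class name is illustrative.

```java
import java.io.IOException;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// The developer writes only this per-record logic; Hadoop distributes it
// across the cluster, feeds it input splits, and retries failed tasks.
public class WordCountMapper
        extends Mapper<LongWritable, Text, Text, IntWritable> {

    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        for (String token : value.toString().split("\\s+")) {
            if (!token.isEmpty()) {
                word.set(token);
                context.write(word, ONE);   // emit (word, 1) for the reducer to sum
            }
        }
    }
}
```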
7. High Availability:
Data is stored in multiple copies across the cluster, so if one path to the data fails, it can still be accessed from another copy on a different node.
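A minimal read sketch, assuming a file at the hypothetical path /data/example.txt: the HDFS client locates the DataNodes holding the block's replicas, and if one node is unreachable it transparently reads the same block from another replica, so the application code below does not change.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ReadWithFailover {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        Path file = new Path("/data/example.txt");   // hypothetical example file

        // The HDFS client chooses a DataNode holding a replica of each block;
        // if that node is down, it retries the read against another replica.
        try (BufferedReader reader =
                 new BufferedReader(new InputStreamReader(fs.open(file)))) {
            System.out.println(reader.readLine());
        }
        fs.close();
    }
}
```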