What is Hadoop Technology?
Hadoop is an open-source software framework for storing data and running applications on clusters of commodity hardware. It provides massive storage for any kind of data, enormous processing power, and the ability to handle a virtually limitless number of concurrent tasks or jobs. Hadoop is a Java-based programming framework that supports the processing and storage of extremely large data sets in a distributed computing environment.
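To make the storage side of that definition concrete, here is a minimal sketch that writes a small file to HDFS through Hadoop's Java FileSystem API. The NameNode address (hdfs://localhost:9000) and the file path are assumptions for a single-node test setup, not details from this article.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsWriteExample {
  public static void main(String[] args) throws Exception {
    // Point the client at the cluster; this address is an assumed
    // single-node setup, normally it comes from core-site.xml.
    Configuration conf = new Configuration();
    conf.set("fs.defaultFS", "hdfs://localhost:9000");

    try (FileSystem fs = FileSystem.get(conf)) {
      // Write a small file; behind the scenes HDFS splits large files
      // into blocks and replicates them across the cluster.
      Path file = new Path("/user/demo/hello.txt");
      try (FSDataOutputStream out = fs.create(file, true)) {
        out.writeUTF("Hello, Hadoop!");
      }
      System.out.println("File exists: " + fs.exists(file));
    }
  }
}
```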
What is Big data?
Big data is a collection of data that is huge in volume and keeps growing over time. It is so large and complex that traditional data-management tools cannot store or process it efficiently. Big data is not a different kind of data; it is simply data at a very large scale.
Why Hadoop?
- Hadoop provides rapid data transfer rates
- Hadoop can run on a cluster and handle thousands of terabytes of data
- It was developed by Doug Cutting and Mike Cafarella in 2006
- It was inspired by Google's MapReduce programming model (see the sketch after this list)
- Ability to store large amounts of data
- Computing power
- Fault tolerance
- Low cost
- Scalability
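Because the MapReduce model mentioned above is central to how Hadoop processes data at this scale, here is the classic word-count job, following the standard example in the Hadoop MapReduce documentation. Input and output paths are taken from the command line; the class names are only illustrative.

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Map phase: emit (word, 1) for every word in each input line.
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {
    private final static IntWritable one = new IntWritable(1);
    private final Text word = new Text();

    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, one);
      }
    }
  }

  // Reduce phase: sum the counts for each word.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

A job like this is normally packaged as a jar and launched with something like `hadoop jar wordcount.jar WordCount /input /output`, where the two paths are HDFS directories (illustrative names here).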
Hadoop Components
- NameNode: the master node that keeps the HDFS metadata (file system namespace and block locations)
- DataNode: worker nodes that store the actual data blocks and serve read/write requests
- YARN (Yet Another Resource Negotiator): manages cluster resources and schedules jobs
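As a small illustration of how these components fit together, the sketch below asks the NameNode for the cluster's overall capacity, which it aggregates from the reports sent by each DataNode. The NameNode address is again an assumed single-node value.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.FsStatus;

public class ClusterStatusExample {
  public static void main(String[] args) throws Exception {
    // Assumes fs.defaultFS points at the NameNode; the address below is
    // a hypothetical single-node value, usually set in core-site.xml.
    Configuration conf = new Configuration();
    conf.set("fs.defaultFS", "hdfs://localhost:9000");

    try (FileSystem fs = FileSystem.get(conf)) {
      // The NameNode aggregates the capacity reports from every DataNode.
      FsStatus status = fs.getStatus();
      System.out.println("Capacity : " + status.getCapacity() + " bytes");
      System.out.println("Used     : " + status.getUsed() + " bytes");
      System.out.println("Remaining: " + status.getRemaining() + " bytes");
    }
  }
}
```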
Where is Hadoop used now?
- Low-cost storage and data archive
- Sandbox for discovery and analysis
- Data Lake
- Complement your data warehouse
- IoT and Hadoop
Want To Learn Hadoop?
Information source: internet
Media source: google.in