Apache Hadoop
Apache Hadoop is a collection of open-source software for big data processing, distributed storage, and distributed computation, created by the Apache Software Foundation. It is designed to support high-volume data applications, automatically handling hardware failures without a loss of service.
Hadoop stores data in the Hadoop Distributed File System (HDFS) and processes big data sets with a programming model called MapReduce. Extremely large files are split into blocks, usually 64 or 128 MB each, which are replicated across machines in the cluster. The software is mostly written in Java, with some lower-level code written in C.
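To show how the MapReduce model works in practice, below is a minimal word-count job written against Hadoop's Java MapReduce API, essentially the classic tutorial example. The map phase emits a (word, 1) pair for each word in its portion of the input, and the reduce phase sums the counts for each word. The class name WordCount and the input and output paths are illustrative placeholders, not part of Hadoop itself.

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Map phase: for each line of input, emit (word, 1) for every word.
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {

    private final static IntWritable one = new IntWritable(1);
    private final Text word = new Text();

    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, one);
      }
    }
  }

  // Reduce phase: sum all the counts emitted for each word.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {

    private final IntWritable result = new IntWritable();

    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class); // pre-sum locally before the shuffle
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));   // input directory in HDFS
    FileOutputFormat.setOutputPath(job, new Path(args[1])); // output directory in HDFS
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

Once compiled into a JAR, a job like this would typically be submitted with a command such as "hadoop jar wordcount.jar WordCount /input /output", where the two paths are HDFS directories chosen by the user. Hadoop runs the map tasks near the HDFS blocks that hold the data, which is what lets it process files far larger than any single machine could handle.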
Related terms: Apache server, Big data, Data analysis, Data lake, Service, Software terms