Comments on How to Install and Configure Apache Hadoop on Ubuntu 20.04

Apache Hadoop is an open-source framework used to store, manage, and process data for various big data applications running on clustered systems. In this tutorial, we will explain how to set up a single-node Hadoop cluster on Ubuntu 20.04.

Comments

By: seli

How do I design a fully distributed Hadoop cluster?

By: ghassen salhi

hadoop@ubuntu:~$ hdfs dfs -mkdir /test
mkdir: Your endpoint configuration is wrong; For more details see: http://wiki.apache.org/hadoop/UnsetHostnameOrPort
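This error usually appears when the HDFS client cannot reach the NameNode at the address given by fs.defaultFS, either because the property is missing from core-site.xml or because the NameNode is not running. A rough check, assuming Hadoop is installed under /usr/local/hadoop and the NameNode address hdfs://localhost:9000 from this kind of single-node setup (both are assumptions):

# Check that fs.defaultFS is present in core-site.xml
grep -A1 'fs.defaultFS' /usr/local/hadoop/etc/hadoop/core-site.xml
# The value should look like hdfs://localhost:9000

# Check that the NameNode daemon is running and listening on that port
jps
ss -tlnp | grep 9000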

By: ydh2244

Thanks so much :) Because of this post, I installed Hadoop.

By: Biswash Koirala

Hi, I followed the steps up to hdfs namenode -format and then started the services one by one, but 'jps' shows only '5428 Jps'. The number keeps increasing each time I run it, and the NameNode, DataNodes, and other daemons are not running. Can you please suggest how to troubleshoot this?
hdfs version > 'Hadoop 3.2.1'
whereis hadoop > 'hadoop: /usr/local/hadoop /usr/local/hadoop/bin/hadoop /usr/local/hadoop/bin/hadoop.cmd'
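When jps shows only the Jps process itself, the HDFS daemons died during startup, and the reason is almost always recorded in their log files. A troubleshooting sketch, assuming Hadoop lives in /usr/local/hadoop as reported by whereis above:

# Start the HDFS daemons and then confirm which ones are up
/usr/local/hadoop/sbin/start-dfs.sh
jps

# If NameNode or DataNode is missing, check the end of its log for the real error
tail -n 50 /usr/local/hadoop/logs/hadoop-*-namenode-*.log
tail -n 50 /usr/local/hadoop/logs/hadoop-*-datanode-*.log

# A frequent cause is an unset JAVA_HOME in hadoop-env.sh
grep JAVA_HOME /usr/local/hadoop/etc/hadoop/hadoop-env.sh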

By: mts

I have the same problem too.

By: shahad

What is the solution to this problem? I am unable to connect to:

1- the Hadoop NameNode using the URL http://your-server-ip:9870, where the tutorial says the NameNode screen should appear;

2- the individual DataNodes using the URL http://your-server-ip:9864.
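For the "unable to connect" question above, a couple of checks usually narrow it down. The sketch below assumes the default web UI ports 9870 (NameNode) and 9864 (DataNode) and a ufw firewall, which may not match every setup:

# Confirm the daemons are running and their web UIs are listening
jps
ss -tlnp | grep -E '9870|9864'

# If the ports are listening but unreachable from another machine,
# open them in the firewall (ufw assumed here)
sudo ufw allow 9870
sudo ufw allow 9864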

By: Hung

Sorry, but when I try to edit the hadoop-env.sh file to define the Java path, the terminal tells me there is no such file or directory.
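That message normally means the path used for hadoop-env.sh is wrong; the file sits under etc/hadoop inside the Hadoop installation directory. A minimal sketch, assuming Hadoop was unpacked to /usr/local/hadoop and Ubuntu's OpenJDK 11 package is installed (the Java path below is an assumption):

# Locate the file instead of guessing its path
find /usr/local/hadoop -name hadoop-env.sh

# Edit it and set JAVA_HOME, for example:
nano /usr/local/hadoop/etc/hadoop/hadoop-env.sh
# export JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64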