Comments on How to Install and Configure Apache Hadoop on Ubuntu 20.04
Apache Hadoop is an open-source framework used to manage, store, and process data for big data applications running on clustered systems. In this tutorial, we will explain how to set up a single-node Hadoop cluster on Ubuntu 20.04.
Comments
How do you design a fully distributed Hadoop cluster?
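A rough sketch of the minimal differences from the single-node setup described in this tutorial, assuming placeholder hostnames hadoop-master, hadoop-worker1 and hadoop-worker2 (not part of the tutorial itself):

# On every node, point fs.defaultFS in core-site.xml at the NameNode host
# (hadoop-master is only a placeholder):
#   <property>
#     <name>fs.defaultFS</name>
#     <value>hdfs://hadoop-master:9000</value>
#   </property>

# On the NameNode host, list the worker hostnames in the workers file:
echo "hadoop-worker1" >> $HADOOP_HOME/etc/hadoop/workers
echo "hadoop-worker2" >> $HADOOP_HOME/etc/hadoop/workers

# Format the NameNode once, then start HDFS and YARN from the master:
hdfs namenode -format
start-dfs.sh
start-yarn.sh

Passwordless SSH from the master to each worker is also needed so that start-dfs.sh can launch the remote daemons.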
hadoop@ubuntu:~$ hdfs dfs -mkdir /test
mkdir: Your endpoint configuration is wrong; For more details see: http://wiki.apache.org/hadoop/UnsetHostnameOrPort
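That error usually means the HDFS client has no real NameNode endpoint to talk to, either because fs.defaultFS is missing from core-site.xml or because it points at 0.0.0.0. A minimal check, assuming Hadoop is installed under /usr/local/hadoop (adjust to your install path; localhost:9000 is just an example value for a single-node setup):

# See what endpoint the client is configured with:
grep -A 2 "fs.defaultFS" /usr/local/hadoop/etc/hadoop/core-site.xml

# For a single-node cluster it should point at a resolvable host, e.g.:
#   <property>
#     <name>fs.defaultFS</name>
#     <value>hdfs://localhost:9000</value>
#   </property>

# Restart HDFS and retry the command:
stop-dfs.sh && start-dfs.sh
hdfs dfs -mkdir /test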
so thanks :) because of this post, i installed hadoop
Hi, I did the steps up to hdfs namenode -format and then started the services one by one. After that, 'jps' shows only '5428 Jps', and the number keeps increasing each time I run it. The NameNode, DataNodes and the other daemons are not running. Can you please suggest how to troubleshoot this?
hdfs version > 'Hadoop 3.2.1'
whereis hadoop > 'hadoop: /usr/local/hadoop /usr/local/hadoop/bin/hadoop /usr/local/hadoop/bin/hadoop.cmd'
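The daemons usually exit right away when JAVA_HOME is not set inside hadoop-env.sh or when the NameNode storage directory is in a bad state after formatting. A hedged way to narrow it down, assuming the /usr/local/hadoop path shown above:

# Read the end of the daemon logs; the exception there names the real cause:
tail -n 50 /usr/local/hadoop/logs/hadoop-*-namenode-*.log
tail -n 50 /usr/local/hadoop/logs/hadoop-*-datanode-*.log

# Confirm JAVA_HOME is set in hadoop-env.sh, not only in the shell:
grep JAVA_HOME /usr/local/hadoop/etc/hadoop/hadoop-env.sh

# On a fresh single-node install with no data to keep, wiping the directories
# that dfs.namenode.name.dir and dfs.datanode.data.dir point to in hdfs-site.xml
# and re-running 'hdfs namenode -format' often clears a clusterID mismatch
# between the NameNode and DataNode.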
I have the same problem too.
What is the solution to this problem?
Unable to connect to:
1. the Hadoop NameNode using the URL http://your-server-ip:9870
2. the individual DataNodes using the URL http://your-server-ip:9864
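Two things worth checking, assuming the daemons were started with start-dfs.sh: whether the NameNode web UI is actually listening, and whether a firewall sits in front of the ports.

# Confirm the HDFS daemons are running:
jps

# Confirm the web UI ports are bound:
ss -tlnp | grep -E '9870|9864'

# If ufw is active, open the ports (only on a trusted network):
sudo ufw allow 9870/tcp
sudo ufw allow 9864/tcp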
Sorry, but when I try to edit the hadoop-env.sh file and define the Java path, the terminal tells me there is no such file or directory.
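This usually just means the file is being opened from the wrong directory. A quick way to locate it, assuming Hadoop was unpacked under /usr/local or /opt (adjust the search path to your install):

# Find where hadoop-env.sh actually lives:
find /usr/local /opt -name hadoop-env.sh 2>/dev/null

# It normally sits in the Hadoop configuration directory:
nano $HADOOP_HOME/etc/hadoop/hadoop-env.sh

# Then set the Java path inside it, for example (adjust to your JDK):
# export JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64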