For 4 days I was not able to install Hadoop, but after your video I installed it successfully.
Glad to know that. Thanks!
Four days? I have been trying to install it for a month 😪
I suffered trying to install Hadoop. This video is my saviour.
The #1 video for Hadoop installation.
It's a pleasure.
Thank you for this thorough video. It helped.
You're welcome! Glad it was helpful.
Perfect tutorial
Thank you so much.
Great, thanks for the video
Lifesaver! Thank you so much, you helped me a lot!
You're welcome!
Thanks bro, awesome video
Glad you liked it
Thank you dude for this useful video, I wish you all the good this world can provide.
So nice of you
Awesome! I was able to install Hadoop on Ubuntu successfully. I'm from Vietnam with love. Thank you very much
You're welcome!
Nice, worked for me, thank you
You're welcome!
Thank you for this great video. However, only the last step did not work for me: "Unable to connect". What could be the problem?
First uninstall SSH, then follow the "Configure SSH" section of the video, and after that install SSH again.
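On Ubuntu, that sequence would look roughly like this (assuming the openssh-server package and the default ~/.ssh paths; adjust if your setup differs):
    sudo apt-get remove openssh-server
    sudo apt-get install openssh-server
    ssh-keygen -t rsa -P "" -f ~/.ssh/id_rsa
    cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
    chmod 600 ~/.ssh/authorized_keys
    ssh localhost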
THANKS I LOVE YOU 😭😭😭
Kindly also explain how to run a simple MapReduce job on it?
This is a MapReduce example that calculates the value of pi: "hadoop jar /home/rakib/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-3.2.4.jar pi 10 100". You can watch the Hive installation video for more MapReduce use cases.
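The general form, assuming your install path and example-jar version may differ from mine, is:
    hadoop jar <your_hadoop_home>/share/hadoop/mapreduce/hadoop-mapreduce-examples-<version>.jar pi <num_maps> <samples_per_map>
Here 10 is the number of map tasks and 100 is the number of samples each map uses to estimate pi.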
Great tutorial, thank you very much
Glad it was helpful!
Is it possible to install Hadoop on a different disk? I tried to do the same on the path /media/ubuntu/D/hadoop/etc/hadoop and it didn't work.
Yes, it's possible. You have to grant access permissions on that path using chmod.
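For example, using the path from your comment (the chown line is only needed if your user does not already own that mount, which I'm assuming here):
    sudo chown -R $USER:$USER /media/ubuntu/D/hadoop
    chmod -R 755 /media/ubuntu/D/hadoop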
Thaaaank you a looooot ❤
thanks a lot
You are most welcome
🙏👏
The lib folder is not present inside the Hadoop folder.
Go inside the Hadoop folder using the cd command, then run ls. You will see the lib folder is present.
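For example, assuming Hadoop was extracted to /home/user/hadoop as in the video:
    cd /home/user/hadoop
    ls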
How do I create the hadoop_install_config.txt file? I don't have this file in my downloads.
You can find that file in the video description.
Hello, I did the whole procedure, but when I try to connect from the browser it doesn't let me.
Run jps on the command line and check whether all the nodes are running correctly or not. If they are, use localhost:9870 (on Hadoop 3.x; older 2.x versions use localhost:50070) to open Hadoop from the browser.
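As a rough reference (daemon names can vary a little by version), a healthy single-node setup typically shows these processes:
    jps
    # NameNode, DataNode, SecondaryNameNode, ResourceManager, NodeManager, Jps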
What's the song or the instrument in the intro?
thanks bro
Welcome
My DataNode is not showing when I ran the jps command.
Delete the namenode and datanode folders under /home/user/hadoop/data using the rm -r command, create those folders again in the same location, then format the NameNode, and finally start the Hadoop cluster.
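Roughly, assuming the data directories from the video at /home/user/hadoop/data and that Hadoop's bin and sbin folders are on your PATH (note that formatting the NameNode erases any existing HDFS data):
    rm -r /home/user/hadoop/data/namenode /home/user/hadoop/data/datanode
    mkdir -p /home/user/hadoop/data/namenode /home/user/hadoop/data/datanode
    hdfs namenode -format
    start-dfs.sh
    start-yarn.sh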
Why is the NameNode not showing?
Check the web UI after starting the nodes using this link: localhost:9870/
Why can't it run the hdfs command?
Maybe some properties need to be added to the mapred-site.xml file. When you run an HDFS or MapReduce command it will show an error listing the properties that need to be set; copy those properties into the mapred-site.xml file with your Hadoop home path.
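For reference, on Hadoop 3.x the error typically asks for HADOOP_MAPRED_HOME; assuming Hadoop is installed at /home/user/hadoop (adjust to your own path), the properties to add inside the <configuration> section of mapred-site.xml would look something like:
    <property>
      <name>yarn.app.mapreduce.am.env</name>
      <value>HADOOP_MAPRED_HOME=/home/user/hadoop</value>
    </property>
    <property>
      <name>mapreduce.map.env</name>
      <value>HADOOP_MAPRED_HOME=/home/user/hadoop</value>
    </property>
    <property>
      <name>mapreduce.reduce.env</name>
      <value>HADOOP_MAPRED_HOME=/home/user/hadoop</value>
    </property>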
Make sure you are in the path /home/your_user/hadoop/bin/; the same thing happened to me. Then run: hdfs namenode -format
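For example (replacing the path with your own install location):
    cd /home/your_user/hadoop/bin
    ./hdfs namenode -format
The ./ prefix is only needed when hadoop/bin is not already on your PATH.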
localhost: permission denied (publickey,password). How can I fix this?
Watch the video from 14:01.
How to add root?
Watch the video from 1:16
My DataNode is not showing when I ran the jps command.
Delete the namenode and datanode folders under /home/user/hadoop/data using the rm -r command, create those folders again in the same location, then format the NameNode, and finally start the Hadoop cluster.
Sir, can you make a video on it?
It's not working.