# Hadoop installation on Windows without Cygwin in 10 minutes

By | August 10, 2015

Hadoop installation on Windows 7 or 8, without Cygwin, in about 10 minutes.

Before starting, make sure you have these two pieces of software installed: the Java JDK and the Hadoop 2.7.1 binary distribution.

Configuration
Step 1 – Windows path configuration

Set the HADOOP_HOME path in the environment variables for Windows:

Right-click on My Computer > Properties > Advanced system settings > Advanced tab > Environment Variables > click New.

Then find the Path variable under System variables, click Edit, insert a ‘;’ (semicolon) at the end, and paste the path up to the Hadoop bin directory. In my case it is:

F:/Hortanwork/1gbhadoopram/Software/hadoop-2.7/hadoop-2.7.1/bin
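If you prefer the command line, the same variables can be set from cmd. The path below is just this article's example location, so substitute your own; note that `setx` truncates values longer than 1024 characters and only affects new cmd windows, so the dialog method above is safer for a long PATH.

```bat
:: Example only - point HADOOP_HOME at your own extraction folder
setx HADOOP_HOME "F:\Hortanwork\1gbhadoopram\Software\hadoop-2.7\hadoop-2.7.1"
:: setx changes take effect in NEW cmd windows, not the current one
setx PATH "%PATH%;F:\Hortanwork\1gbhadoopram\Software\hadoop-2.7\hadoop-2.7.1\bin"
```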

Edit hadoop-2.7.1/etc/hadoop/core-site.xml, paste the following lines and save it:

<configuration>
<property>
<name>fs.defaultFS</name>
<value>hdfs://localhost:9000</value>
</property>
</configuration>


Edit hadoop-2.7.1/etc/hadoop/mapred-site.xml, paste the following lines and save it:

<configuration>
<property>
<name>mapreduce.framework.name</name>
<value>yarn</value>
</property>
</configuration>

Edit hadoop-2.7.1/etc/hadoop/hdfs-site.xml, paste the following lines and save it. Please create a data folder somewhere first; in my case I created it in my HADOOP_HOME directory.

<configuration>
<property>
<name>dfs.replication</name>
<value>1</value>
</property>
<property>
<name>dfs.namenode.name.dir</name>
<value>/hadoop/data/namenode</value>
</property>
<property>
<name>dfs.datanode.data.dir</name>
<value>/hadoop/data/datanode</value>
</property>
</configuration>

(Point the two value entries at the namenode and datanode sub-folders of the data folder you created; the paths above are only examples.)

Edit hadoop-2.7.1/etc/hadoop/yarn-site.xml, paste the following lines and save it:

<configuration>
<property>
<name>yarn.nodemanager.aux-services</name>
<value>mapreduce_shuffle</value>
</property>
<property>
<name>yarn.nodemanager.aux-services.mapreduce.shuffle.class</name>
<value>org.apache.hadoop.mapred.ShuffleHandler</value>
</property>
</configuration>

Edit hadoop-2.7.1/etc/hadoop/hadoop-env.cmd, comment out the existing %JAVA_HOME% line by putting @rem at the start, set the proper JDK path, and save it. (My JDK is in Program Files; to avoid the space in the path I used PROGRA~1.)
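After the edit, the relevant lines of hadoop-env.cmd look something like this (the JDK version shown is an example — use the folder name of your own installation):

```bat
@rem The java implementation to use. Required.
@rem set JAVA_HOME=%JAVA_HOME%
@rem PROGRA~1 is the 8.3 short name for "C:\Program Files" (no spaces)
set JAVA_HOME=C:\PROGRA~1\Java\jdk1.7.0_79
```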

Demo

Step 3 – Start everything

Very important step

bin folder – this contains the .dll and .exe files (winutils.exe for Hadoop 2.7.1).

Now delete your existing bin folder and replace it with the new one (downloaded from my repo).

(My GitHub) etc folder is given only for reference; you need to modify the configuration parameters according to your environment paths.
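A quick way to confirm the replacement bin folder is in place — both files should be listed (hadoop.dll is my assumption of the bundled DLL's name, based on the standard Windows build):

```bat
:: winutils.exe and hadoop.dll must exist in the bin folder
dir %HADOOP_HOME%\bin\winutils.exe
dir %HADOOP_HOME%\bin\hadoop.dll
```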

Open cmd and type ‘hdfs namenode -format’ – after execution you will see logs confirming the format (look for ‘Exiting with status 0’).

Open cmd, point it to the sbin directory, and type ‘start-all.cmd’:

F:\Hortanwork\1gbhadoopram\Software\hadoop-2.7\hadoop-2.7.1\sbin>start-all.cmd

It will start the following processes:

Namenode

Datanode

YARN resourcemanager

YARN nodemanager

JPS – to verify the services are running
Open cmd and type ‘jps’ (for jps make sure your Java path is set properly).
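With all four daemons up, a jps run looks something like the listing below (the process IDs are illustrative):

```
C:\>jps
10448 DataNode
10728 NodeManager
10044 ResourceManager
11120 NameNode
8508 Jps
```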

GUI
Step 4 – NameNode GUI, ResourceManager GUI

The NameNode web UI is at http://localhost:50070 and the YARN ResourceManager UI is at http://localhost:8088/cluster.

In the next tutorial we will see how to run MapReduce programs on Windows using Eclipse and this Hadoop setup.

## 192 thoughts on “Hadoop installation on Windows without Cygwin in 10 minutes”

1. house clearance islington

I’m not that much of an internet reader to be honest but your blog’s really nice, keep it up! I’ll go ahead and bookmark your site to come back down the road. All the best

2. Meera

Hi, When i run the hadoop-env.cmd, there is no response
wat can i do now

3. Vedant

I am getting this error after running hdfs namenode -format command:
Error:could not find or load main class..
how to solve this????

4. Parth

It’s working!! The GUI shows it’s working!!! However when I tried jps command it showed that command doesn’t exist!! I’ve set the JAVA path properly!

5. Mallikarjun Ratala

Great Article. I was able to set up Hadoop successfully, but not in 10 mins

6. Manish Kumar suthar

I will Run Work With this code But i want to rin word count program with using this so please help me for this
………………..

7. Thajudheen

sir, if all this steps are not applicable for windows 32 bit machine.then, all the nodes are starting while entering start-all.cmd but it automatically shutting down all nodes in the hadoop. what will i do for this error.

8. prashant

C:\WINDOWS\system32>hdfs namenode -format
‘Files’ is not recognized as an internal or external command,
operable program or batch file.
Error: JAVA_HOME is incorrectly set.
‘-Dhadoop.security.logger’ is not recognized as an internal or external command

operable program or batch file.

9. Chandra Sekhar

Its working. Prior to this I have tried many which were not helpful. Thankyou so much.

10. Naveen

17/01/23 04:41:29 INFO impl.MetricsSystemImpl: NodeManager metrics system started
17/01/23 04:41:29 FATAL nodemanager.NodeManager: Error starting NodeManager
java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:609)
at org.apache.hadoop.fs.FileUtil.canRead(FileUtil.java:977)
at org.apache.hadoop.util.DiskChecker.checkAccessByFileMethods(DiskChecker.java:187)
at org.apache.hadoop.util.DiskChecker.checkDirAccess(DiskChecker.java:174)
at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:108)
at org.apache.hadoop.yarn.server.nodemanager.DirectoryCollection.testDirs(DirectoryCollection.java:290)
at org.apache.hadoop.yarn.server.nodemanager.DirectoryCollection.checkDirs(DirectoryCollection.java:229)
at org.apache.hadoop.yarn.server.nodemanager.LocalDirsHandlerService.checkDirs(LocalDirsHandlerService.java:379)
at org.apache.hadoop.yarn.server.nodemanager.LocalDirsHandlerService.serviceInit(LocalDirsHandlerService.java:160)
at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
at org.apache.hadoop.service.CompositeService.serviceInit(CompositeService.java:107)
at org.apache.hadoop.yarn.server.nodemanager.NodeHealthCheckerService.serviceInit(NodeHealthCheckerService.java:48)
at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
at org.apache.hadoop.service.CompositeService.serviceInit(CompositeService.java:107)
at org.apache.hadoop.yarn.server.nodemanager.NodeManager.serviceInit(NodeManager.java:260)
at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
at org.apache.hadoop.yarn.server.nodemanager.NodeManager.initAndStartNodeManager(NodeManager.java:485)
at org.apache.hadoop.yarn.server.nodemanager.NodeManager.main(NodeManager.java:533)
17/01/23 04:41:29 INFO impl.MetricsSystemImpl: Stopping NodeManager metrics system…
17/01/23 04:41:29 INFO impl.MetricsSystemImpl: NodeManager metrics system stopped.
17/01/23 04:41:29 INFO impl.MetricsSystemImpl: NodeManager metrics system shutdown complete.
17/01/23 04:41:29 INFO nodemanager.NodeManager: SHUTDOWN_MSG:

11. Naveen

17/01/23 04:41:27 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform… using builtin-java classes where applicable
17/01/23 04:41:27 FATAL datanode.DataNode: Exception in secureMain
java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:609)
at org.apache.hadoop.hdfs.server.datanode.DataNode$DataNodeDiskChecker.checkDir(DataNode.java:2344)
at org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(DataNode.java:2386)
at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2368)
at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2260)
at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2307)
at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2484)
at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2508)
17/01/23 04:41:27 INFO util.ExitUtil: Exiting with status 1

12. Naveen

17/01/23 04:41:29 FATAL nodemanager.NodeManager: Error starting NodeManager
java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:609)
17/01/23 04:41:29 INFO impl.MetricsSystemImpl: Stopping NodeManager metrics system…
17/01/23 04:41:29 INFO impl.MetricsSystemImpl: NodeManager metrics system stopped.
17/01/23 04:41:29 INFO impl.MetricsSystemImpl: NodeManager metrics system shutdown complete.
17/01/23 04:41:29 INFO nodemanager.NodeManager: SHUTDOWN_MSG:

13. Qazi

Hi,

I have completed every step but stuck on Namenode GUI address – http://localhost:50070

It seems I cannot open port got this message
This site can’t be reached

localhost refused to connect.
ERR_CONNECTION_REFUSED

My all 4 windows are open and running fine. I can see the application page. I tried to open port in firewall settings but port still not listed in open ports. Can I open this page in any other port? And why in configuration file ‘core-site.xml’ says localhost:9000? and not 50070?

Thanks

QF

1. Vojtech

I have exactly the same problem. Would anybody have a hint? Thanks

14. Ganesh

Hi Tushar Sarde,

As I am new to hadoop and I tried to replicate the same setting in my windows and I am successfully able to up the following nodes,
10448 DataNode
800 Launcher
7556
10728 NodeManager
10044 ResourceManager
8508 Jps

But it failed to start the namenode and getting the below error,

16/12/27 15:52:01 ERROR namenode.NameNode: Failed to start namenode.
java.io.IOException: All specified directories are not accessible or do not exist.

Many Thanks,
Ganeshbabu R

15. Akanksha

Thank you so much Sir for insightful tutorial. I am able to create a namenode. But after doing start-dfs from sbin I am getting an error as the system can’t find the file hadoop. I am able to get the version by doing hadoop version. I also checked java_home set in a hadoop-env.cmd and it is JAVA_HOME=C://Java//jdk1.8.0_25. Could you please suggest.

16. H2P

Hi,
Thanks for this helpful post but while configuring i am facing following problem please suggest solution if possible!!!!!

While using “hdfs namenode -format” cmd it successfully formatted the namenode:———

16/11/07 08:22:08 INFO namenode.FSImage: Allocated new BlockPoolId: BP
16/11/07 08:22:09 INFO namenode.NNStorageRetentionManager: Going to retain 1 images with txid >= 0
16/11/07 08:22:09 INFO util.ExitUtil: Exiting with status 0
16/11/07 08:22:09 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at ImMortaL-PC/

But when i use “start-dfs” cmd from sbin it is giving following error message for namenode & datanode:

[ for datanode ]:————————–
16/11/07 08:29:39 FATAL datanode.DataNode: Exception in secureMain
java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:609)
at org.apache.hadoop.fs.FileUtil.canRead(FileUtil.java:977)
at org.apache.hadoop.util.DiskChecker.checkAccessByFileMethods(DiskChecker.java:187)
at org.apache.hadoop.util.DiskChecker.checkDirAccess(DiskChecker.java:174)
at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:157)
at org.apache.hadoop.hdfs.server.datanode.DataNode$DataNodeDiskChecker.checkDir(DataNode.java:2344)
16/11/07 08:29:39 INFO util.ExitUtil: Exiting with status 1
16/11/07 08:29:39 INFO datanode.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at ImMortaL-PC/

[ for namenode] :—————————-
16/11/07 08:29:48 ERROR namenode.NameNode: Failed to start namenode.
java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:609)
at org.apache.hadoop.fs.FileUtil.canWrite(FileUtil.java:996)
at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.analyzeStorage(Storage.java:490)
16/11/07 08:29:48 INFO util.ExitUtil: Exiting with status 1
16/11/07 08:29:48 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at ImMortaL-PC/************************************************************/

I have the following configuration for hdfs-site.xml:

dfs.replication
1

dfs.namenode.name.dir

dfs.datanode.data.dir

This is really a great instruction. It took me 10 minutes to set it up. Thanks for putting this together!

18. sekhar

Hi tushar

My name is sekhar.i am do the same installation what u mentioned above…but when I enter C:\hadoop-2.7.1\sbin>hdfs name node -format ….it says “error could not find or load main class my”.
And also i enter “start-all.cmd” it shows only yarn nodemanager and yarn resourcemanager….when enter jps command it shows only jps value and nodemanager and resourcemanager…..
So what is the problem in that……how to solve start hdfs name node manager …


20. Nitin

I Have Successfully installed Hadoop But When I integrate the hadoop With Eclipse DFS Server is not work
Eclipse shows the error :-
” An internal error occurred during: “Connecting to DFS localhost”.
org/apache/htrace/SamplerBuilder “.
how to solve it . & How to run Map Reduce “WordCount ” Program In eclipse passing the txt file path.
plz help me to solve this error

1. vairaprakash

is it possible to install pig and hive here? How fully distributed configuration is possible in windows? once it possible then please share the procedure sir.

21. Kirti P. B.

Hello Tushar. Its a useful article. I have installed as per the instructions, but while running start-all.cmd from sbin folder, initially all four manager windows are coming but after that I m getting following Exception. I am unable to interpret. Please suggest the solution. Thanks in anticipation.

STARTUP_MSG: build = https://git-wip-us.apache.org/repos/asf/hadoop.git -r 15ecc87ccf4a0228f35af08fc56de536e6ce657a; compiled by ‘jenkins’ on 2015-06-29T06:04Z
STARTUP_MSG: java = 1.7.0_79
************************************************************/
[Fatal Error] core-site.xml:2:6: The processing instruction target matching “[xX][mM][lL]” is not allowed.
16/09/23 16:42:45 FATAL conf.Configuration: error parsing conf core-site.xml
org.xml.sax.SAXParseException: The processing instruction target matching “[xX][mM][lL]” is not allowed.
at org.apache.xerces.parsers.DOMParser.parse(Unknown Source)
at org.apache.xerces.jaxp.DocumentBuilderImpl.parse(Unknown Source)
at javax.xml.parsers.DocumentBuilder.parse(DocumentBuilder.java:150)
(further frames truncated in the original comment)
16/09/23 16:42:45 FATAL resourcemanager.ResourceManager: Error starting ResourceManager
java.lang.RuntimeException: org.xml.sax.SAXParseException; systemId: file:/H:/RE…; columnNumber: 6; The processing instruction target matching “[xX][mM][lL]” is not allowed.
at org.apache.xerces.parsers.DOMParser.parse(Unknown Source)
at org.apache.xerces.jaxp.DocumentBuilderImpl.parse(Unknown Source)
at javax.xml.parsers.DocumentBuilder.parse(DocumentBuilder.java:150)
… 10 more
16/09/23 16:42:45 INFO resourcemanager.ResourceManager: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down ResourceManager at SOEX-PC-122-KirtiP/10.84.1.176
************************************************************/
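The “[xX][mM][lL] is not allowed” parse error above almost always means there are stray characters — a blank line, leading spaces, or a UTF-8 BOM — before the `<?xml ...?>` declaration in the config file. This short Python sketch, not part of the original post, checks a config file's text for that problem:

```python
# Detect the usual cause of "The processing instruction target matching
# [xX][mM][lL] is not allowed": stray content before the <?xml declaration.
import xml.dom.minidom

def check_config(text):
    """Return a list of problems found in a Hadoop XML config's text."""
    problems = []
    if text.startswith("\ufeff"):
        problems.append("UTF-8 BOM before the XML declaration")
    if "<?xml" in text and not text.lstrip("\ufeff").startswith("<?xml"):
        problems.append("whitespace or other content before <?xml ...?>")
    try:
        # BOM stripped so the parser reports only remaining structural errors
        xml.dom.minidom.parseString(text.lstrip("\ufeff"))
    except Exception as exc:  # xml.parsers.expat.ExpatError for malformed XML
        problems.append("parse error: %s" % exc)
    return problems

good = '<?xml version="1.0"?>\n<configuration></configuration>'
bad = '\n <?xml version="1.0"?>\n<configuration></configuration>'
print(check_config(good))  # prints []
print(check_config(bad))   # flags the content before the declaration
```

Re-saving the file as plain UTF-8 without BOM, with `<?xml` as the very first characters, clears the error.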

22. ravi

I have successfully been able to use the above tutorial to set up Hadoop. However I see this error in one Hadoop distribution window; the namenode, datanode and YARN resource manager are starting up.

16/09/17 14:37:46 INFO service.AbstractService: Service org.apache.hadoop.yarn.s
erver.nodemanager.containermanager.localizer.ResourceLocalizationService failed
java.net.BindException: Problem binding to [0.0.0.0:8040] java.net.BindException: bind; For more details see: http://wiki.apache.org/hadoop/BindException

23. praKASH

Good Tutotial dear…. i got install successful and i can open in browser also…
can u post an example how to submit a job and output

Thanks

24. Rahi Akela

Hi,

I want to install HBase on windows 10. Please share the step how should i proceed because I followed the above step for Hadoop and it is working perfectly on windows 10.

Thanks
Rahi

25. Balaji

I got this while running example, please suggest

Application application_1473316663235_0005 failed 2 times due to AM Container for appattempt_1473316663235_0005_000002 exited with exitCode: -1073741515
For more detailed output, check application tracking page:http://Admin-PC:8088/cluster/app/application_1473316663235_0005Then, click on links to logs of each attempt.
Diagnostics: Exception from container-launch.
Container id: container_1473316663235_0005_02_000001
Exit code: -1073741515
Stack trace: ExitCodeException exitCode=-1073741515:
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:722)
at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:211)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
Container exited with a non-zero exit code -1073741515
Failing this attempt. Failing the application.

26. student

Hi sir,
I am getting errors when I run the command start-all.cmd on all 4 cmd windows — like “unable to load native-hadoop library for your platform” and at the end a SHUTDOWN_MSG.
How do I resolve this error?
Errors:
16/09/06 18:22:05 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform… using builtin-java classes where applicable
16/09/06 18:22:05 FATAL datanode.DataNode: Exception in secureMain
java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:609)
at org.apache.hadoop.fs.FileUtil.canRead(FileUtil.java:977)
at org.apache.hadoop.util.DiskChecker.checkAccessByFileMethods(DiskChecker.java:187)
at org.apache.hadoop.util.DiskChecker.checkDirAccess(DiskChecker.java:174)
at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:157)
at org.apache.hadoop.hdfs.server.datanode.DataNode$DataNodeDiskChecker.checkDir(DataNode.java:2344)
at org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(DataNode.java:2386)
at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2368)
at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2260)
at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2307)
at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2484)
at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2508)
16/09/06 18:22:05 INFO util.ExitUtil: Exiting with status 1
16/09/06 18:22:05 INFO datanode.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at AGGARWAL-pc/192.168.1.35
************************************************************/

27. mark

jps command is throwing error as :
‘jps’ is not recognized as an internal or external command,
operable program or batch file.

28. Harish

Hi Tushar,

I have installed hadoop as per the instructions in this article. But i am facing few problems here.

1. jps command is throwing error as :
‘jps’ is not recognized as an internal or external command,
operable program or batch file.

2. i am not able access the url http://localhost:50070.

Great Tutorial , it saved me a lot of time.

Thank you so much

30. ramesh

I followed the tutorial, but I am having problems with the ‘start-all.cmd’ command from sbin. It gives the following error:
‘C:/ “C:\” org.apache.hadoop.util.PlatformName’ is not recognized as an internal or external command, operable program or batch file.
The system cannot execute the specified program.
The system cannot execute the specified program.
You replied : The first C:/ is C:/javapath and the second C:\ is C:/hadoop-share-common-lib-path
Could you please tell me where to make this change. I mean which file and in which directory.
Thanks

31. Tushar Sarde Post author

Is this in C drive ? try installing this setup into d drive!

16/08/13 23:14:02 ERROR namenode.NameNode: Failed to start namenode.
java.io.IOException: All specified directories are not accessible or do not exist.
at

32. Sumeet Gupta

this is hdfs-site.xml

dfs.replication

1

dfs.namenode.name.dir

dfs.datanode.data.dir

33. shank

The winutils.exe given in the github is not compatible with my system. I have a 32-bit system

1. Tushar Sarde Post author

Yes this works only on 64 bit! I need to build this solution for 32 bit also

i can’t successfully runs mapreduce default jobs such as wordcount on my dataset….

‘C:\Program’ is not recognized as an internal or external command,
operable program or batch file.
16/07/29 21:32:31 INFO client.RMProxy: Connecting to ResourceManager at /127.0.0.1:8032
16/07/29 21:32:35 INFO input.FileInputFormat: Total input paths to process : 1
16/07/29 21:32:36 INFO mapreduce.JobSubmitter: number of splits:1
16/07/29 21:32:38 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1469862632042_0003
16/07/29 21:32:39 INFO impl.YarnClientImpl: Submitted application application_1469862632042_0003
16/07/29 21:32:40 INFO mapreduce.Job: Running job: job_1469862632042_0003
16/07/29 21:32:52 INFO mapreduce.Job: Job job_1469862632042_0003 running in uber mode : false
16/07/29 21:32:52 INFO mapreduce.Job: map 0% reduce 0%
16/07/29 21:32:52 INFO mapreduce.Job: Job job_1469862632042_0003 failed with state FAILED due to: Application application_1469862632042_0003 failed 2 times due to AM Container for appattempt_1469862632042_0003_000002 exited with exitCode: 1
Diagnostics: Exception from container-launch.
Container id: container_1469862632042_0003_02_000001
Exit code: 1
Exception message: CreateSymbolicLink error (1314): A required privilege is not held by the client.

Stack trace: ExitCodeException exitCode=1: CreateSymbolicLink error (1314): A required privilege is not held by the client.

at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:722)
at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:211)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)

Shell output: 1 file(s) moved.

Container exited with a non-zero exit code 1
Failing this attempt. Failing the application.
16/07/29 21:32:53 INFO mapreduce.Job: Counters: 0″

steps taking to resolve the issue:

Full control permission was giving to the user and as well runs as administrator but the issue still persist….

1. Tushar Sarde Post author

‘C:\Program’ is not recognized as an internal or external command,
operable program or batch file.
………….. Try to install it in D drive

35. Ilsa

Hi. Thankyou for the helpful article.
I am a newbie to Hadoop and this helped me get started with things. However when i enter the command ‘hdfs namenode format’ it is giving me the error ‘Error: Could not find or load main class K’. I have configured all the files as you have explained and checked environment variables etc. Can you please help me resolve this?

1. Tushar Sarde Post author

what is K ? do you have K drive ? check your configuration files again (xml files) in /etc folder

36. shylendran

I am getting the below waning and error while running the “hdfs namenode -format” command. What could be the reason. I followed all the above mentioned steps. Am I missing any libraries?

16/07/19 15:50:27 WARN namenode.FSEditLog: No class configured for C, dfs.namenode.edits.journal-plugin.C is empty
16/07/19 15:50:27 ERROR namenode.NameNode: Failed to start namenode.
java.lang.IllegalArgumentException: No class configured for C
(stack trace truncated in the original comment)
16/07/19 15:50:27 INFO util.ExitUtil: Exiting with status 1
16/07/19 15:50:27 INFO namenode.NameNode: SHUTDOWN_MSG:

1. Tushar Sarde Post author

check your configuration files (xml files in etc folder), something is wrong with the files

37. Somasundaram

Thanks TooDey Team
It worked perfectly for the installation on the first run itself. I ran it on a desktop machine. Maybe I have to learn the problem solving and deploying.
Thanks again TOODEY

38. Atul S

Thanks for the tutorial. I was able to setup Hadoop 2.7.2 with JDK 1.8 on machine. I was able to access the hadoop sites

I closed all console programs. Now when I am trying to run again the start-all.cmd command I get following error

————————-
This script is Deprecated. Instead use start-dfs.cmd and start-yarn.cmd
Error: JAVA_HOME is incorrectly set.
——————————

If I run start-dfs.cmd commad then it gives following error.

——————————-
Error: JAVA_HOME is incorrectly set.
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
Error: JAVA_HOME is incorrectly set.
‘-server’ is not recognized as an internal or external command,
operable program or batch file.
——————————-

Everything worked initially now it stopped working completely. I tried checking the JAVA_HOME path in environment variables, it is properly set. Not sure why it stopped working all of sudden.

Atul

1. Tushar Sarde Post author

have you set JAVA_HOME in your hadoop-env.cmd file ? show me the code here, ur JAVA_HOME

2. Tushar Sarde Post author

Atul, This winutils.exe file is only applicable for hadoop 2.7.1 version 64 bit machine

1. Atul S

I tried with 2.7.1, I get the same error.

Error: JAVA_HOME is incorrectly set.
‘-Dhadoop.security.logger’ is not recognized as an internal or external command,
operable program or batch file.

———————-

Java_Home code

@rem The java implementation to use. Required.
set JAVA_HOME=D:\Softwares\Java\JDK1.8.0_91

1. Geeta

Nice tutorial..

In the beginning i got :
Error: JAVA_HOME is incorrectly set.

Need to set JAVA_HOME path as “C:\PROGRA~1\Java\jdk1.7.0_51” instead of “C:\Program Files\Java\jdk1.7.0_51”

39. nag

I followed the setup process as you explained, I am able to start all the processes except the namenode, could u pls suggest how to make it running? Thanks for the video.
getting the below exception:

java.io.IOException: All specified directories are not accessible or do not exist.
16/07/04 21:39:41 INFO mortbay.log: Stopped HttpServer2$SelectChannelConnectorWithSafeStartup@0.0.0.0:50070 16/07/04 21:39:41 INFO impl.MetricsSystemImpl: Stopping NameNode metrics system… 16/07/04 21:39:41 INFO impl.MetricsSystemImpl: NameNode metrics system stopped. 16/07/04 21:39:41 INFO impl.MetricsSystemImpl: NameNode metrics system shutdown complete. 1. nag I am able to resolve the issue , changed the hdfs-site.xml as below and it worked in windows 10, thx. dfs.replication 1 dfs.namenode.name.dir /Hadoop/data/namenode dfs.datanode.data.dir C:/Hadoop/data/datanode 2. Tushar Sarde Post author is this under your c drive ? if yes change installation to other driver like D or E 40. PJ Hi, I was stuck at running JSP where it did not return any values after executed it. Also, the http://localhost:50070/ is not running. I’ve follow exactly all the steps and the comments suggestion, but no luck. Could you please assist? 1. Tushar Sarde Post author could you please show me some errors or logs ? 41. Saurav Sircar Doesn’t work for me! Here’s the error I’m getting: C:\Users\Saurav2512>hdfs namenode -format Error: JAVA_HOME is incorrectly set. Please update C:\Hadoop\hadoop-2.7.2\conf\hadoop-env.cmd ‘-Dhadoop.security.logger’ is not recognized as an internal or external command, operable program or batch file. I don’t even have a “conf” folder present. Am I supposed to create one? Also, I’m using Java 8 and Hadoop 2.7.2. Is that the issue? 1. Sangeetha Hi Tushar, same problem with 2.7.1 Can you please suggest? 1. Tushar Sarde Post author Hi Sangeetha, have u resolved your problem ? could you please describe me once again here ? 1. Ashrunil Roy Hi, All your problems will go away once you install java (jdk1.7.0_79) in a path where there is no space in the entire path to java.exe. e.g C:\jdk1.7.0_79\bin\java.exe (no spaces here). Now the “conf” thing…change directory to “hadoop-2.7.2\libexec” and look for the file hadoop-config.cmd. 
Open the file and search for the string – if not exist %JAVA_HOME%\bin\java and replace with if not exist “%JAVA_HOME%\bin\java.exe” save the file and try and execute the command > hadoop version If the command goes through without errors, you are safe to move to the next stage. 42. nathan Hi.. thanks for the tutorial, i success with localhost:8088/cluster, but i have 2 problem 1. when i try to open namenode localhost:50070 its failed -> the error is: this site can’t be reached 2. c:\hadoop-2.7.1\sbin\jps ‘jps’ is not recognized as an internal or external command, operable program or batch file. 1. Tushar Sarde Post author Your JAVA_HOME is not set properly! 1. Ajith Is there a way to install apache hive for the above mentionedsteps for hadoop installation on windows? 43. Narasimha Raju Thank you so much its working fine. 44. Kamta Mishra i want to install Hbase in window and want to connect with Map Reduce program, so could you help me regarding that please 45. brian Thank you for this article and very helpful. Where is the next one for eclipse and mapreduce? 46. Tahir Imtiaz Thanks alot Tushar… nice tutorial .. initially there was many errors but then i replaced my “etc” folder with your and changed the java home path in hadoop-env and paths in hdfs-site files and it worked perfectly… I installed java in c drive not in program files .. using window 10 64 bit … 1. ayesha A.o.a how r u Tahir Imtiaz? i have installed but how to run and integerate other s/w just like fllume ,hive.indly give me suggestion . 47. Adeoye Adewale Good job author. Please help me. it only the resource manager task was shown when i run “jps” while namenode, datanode and node manager never runs… your swift response will be appreciated as it help me in my undergraduate project defense. or contact me: adeoyeadewale@outlook.com Many Thanks 48. 
Raju Sabbani I solved that previous error, when ran “start-all.cmd” the four terminals were opend properly But i have one new problem when i ran “jps” command through “sbin” i am not able to see below services Namenode Datanode YARN resourcemanager YARN nodemanager i am using windows-10(64_bit) ##### when ran jps it is showing like below #### C:\Hadoop\hadoop-2.7.1\sbin>jps C:\Hadoop\hadoop-2.7.1\sbin> 1. Adeoye Adewale ensure you include your java path “C:\Program Files (x86)\Java\jdk1.7.0_03\bin” this to the PATH on system variable and run “JPS” best of luck 49. Raju Sabbani Hi…. I am getting this error can you explain in detail pls….. i set my java path like below. ( ////// @rem The java implementation to use. Required. @rem set JAVA_HOME=%JAVA_HOME% set JAVA_HOME=C:\Program Files\Java\jdk1.8.0 /////) ####The error is#### C:\Hadoop\hadoop-2.7.1\etc\hadoop>hdfs namenode -format Error: JAVA_HOME is incorrectly set. Please update C:\Hadoop\hadoop-2.7.1\conf\hadoop-env.cmd ‘-Dhadoop.security.logger’ is not recognized as an internal or external command, operable program or batch file. 50. Thomas Hello Tushar, I’m getting this error when starting the data node: any ideas? I’m running Hadoop 2.7.2 on Windows 7 STARTUP_MSG: build = https://git-wip-us.apache.org/repos/asf/hadoop.git -r b165c4fe8a74265c792ce23f546c64604acf0e4 STARTUP_MSG: java = 1.8.0_65 ************************************************************/ 16/05/25 17:05:57 FATAL datanode.DataNode: Exception in secureMain java.lang.RuntimeException: Error while running command to get file permissions : ExitCodeException exitCode=-107374 at org.apache.hadoop.util.Shell.runCommand(Shell.java:545) at org.apache.hadoop.util.Shell.run(Shell.java:456) at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:722)
at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadPermissionInfo(RawLocalFileSyste
at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.getPermission(RawLocalFileSystem.jav
at org.apache.hadoop.hdfs.server.datanode.DataNode$DataNodeDiskChecker.checkDir(DataNode.java:2341)
at org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(DataNode.java:2383)
at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2365)
at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2257)
at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2304)
at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2481)
at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2505)
at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadPermissionInfo(RawLocalFileSyste
at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.getPermission(RawLocalFileSystem.jav
at org.apache.hadoop.util.DiskChecker.mkdirsWithExistsAndPermissionCheck(DiskChecker.java:139)
at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:156)
at org.apache.hadoop.hdfs.server.datanode.DataNode$DataNodeDiskChecker.checkDir(DataNode.java:2341)
16/05/25 17:05:57 INFO util.ExitUtil: Exiting with status 1
16/05/25 17:05:57 INFO datanode.DataNode: SHUTDOWN_MSG:

Thomas

1. Tushar Sarde Post author

It's trying to fetch something from the git repo, but because of security it's not able to access the URL.
Are you running this from an office (secure) network?

2. Tushar Sarde Post author

51. Parani

Hi Thushar,

Thank you so much for your post. It helped me set up Hadoop on a Windows machine. I have everything working except jps: when I issue the jps command I see no results, no errors, nothing. But I can see all 4 services are up and running, and the localhost URLs are also fine.

Could you please tell me how to resolve this?

1. Tushar Sarde Post author

It may be a problem with your Java. Try adding Java to the PATH, and after setting up everything try opening a new cmd prompt session!
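As a sketch of that advice, assuming Java is installed under C:\Program Files\Java\jdk1.8.0 (adjust to your actual install; the short PROGRA~1 form is the trick used elsewhere in this tutorial to avoid problems with spaces in the path):

```bat
@rem In hadoop-env.cmd: point JAVA_HOME at a path without spaces
set JAVA_HOME=C:\PROGRA~1\Java\jdk1.8.0

@rem For the current cmd session, put the JDK bin on the PATH so jps is found
set PATH=%PATH%;%JAVA_HOME%\bin
```

For a permanent change, set the same values under Environment Variables in the system settings rather than per session.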

52. Rahul Rjk

I am getting this error, when trying to install in win10 machine.

16/05/21 21:07:08 ERROR namenode.NameNode: Failed to start namenode.
java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:609)
at org.apache.hadoop.fs.FileUtil.canWrite(FileUtil.java:996)
at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.analyzeStorage(Storage.java:490)
16/05/21 21:07:08 INFO util.ExitUtil: Exiting with status 1
16/05/21 21:07:08 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at Raul/192.168.1.16
************************************************************

1. Rahul Rjk

Getting this error as well

STARTUP_MSG: java = 1.8.0_92
************************************************************/
16/05/21 21:12:15 FATAL datanode.DataNode: Exception in secureMain
java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:609)
at org.apache.hadoop.fs.FileUtil.canRead(FileUtil.java:977)
at org.apache.hadoop.util.DiskChecker.checkAccessByFileMethods(DiskChecker.java:187)
at org.apache.hadoop.util.DiskChecker.checkDirAccess(DiskChecker.java:174)
at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:157)
at org.apache.hadoop.hdfs.server.datanode.DataNode$DataNodeDiskChecker.checkDir(DataNode.java:2344)
16/05/21 21:12:15 INFO util.ExitUtil: Exiting with status 1
16/05/21 21:12:15 INFO datanode.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at Raul/192.168.1.16
************************************************************/

1. Rahul

STARTUP_MSG: build = https://git-wip-us.apache.org/repos/asf/hadoop.git -r 15ecc87ccf4a0228f35af08fc56de536e6ce657a; compiled by ‘jenkins’ on 2015-06-29T06:04Z
STARTUP_MSG: java = 1.8.0_92
************************************************************/
16/05/22 21:59:12 INFO namenode.NameNode: createNameNode []
16/05/22 21:59:13 INFO impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
16/05/22 21:59:13 INFO impl.MetricsSystemImpl: NameNode metrics system started
16/05/22 21:59:14 INFO namenode.NameNode: fs.defaultFS is hdfs://localhost:9000
16/05/22 21:59:14 INFO namenode.NameNode: Clients are to use localhost:9000 to access this namenode/service.
16/05/22 21:59:14 INFO hdfs.DFSUtil: Starting Web-server for hdfs at: http://0.0.0.0:50070
16/05/22 21:59:15 INFO mortbay.log: Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
16/05/22 21:59:15 INFO server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets.
16/05/22 21:59:15 INFO http.HttpRequestLog: Http request log for http.requests.namenode is not defined
16/05/22 21:59:15 INFO http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
16/05/22 21:59:15 INFO http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context hdfs
16/05/22 21:59:15 INFO http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context static
16/05/22 21:59:15 INFO http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context logs
16/05/22 21:59:15 INFO http.HttpServer2: Jetty bound to port 50070
16/05/22 21:59:15 INFO mortbay.log: jetty-6.1.26
16/05/22 21:59:15 INFO mortbay.log: Started HttpServer2$SelectChannelConnectorWithSafeStartup@0.0.0.0:50070
16/05/22 21:59:15 ERROR common.Util: Syntax error in URI C:\Hadoop\data\namenode. Please check hdfs configuration.
java.net.URISyntaxException: Illegal character in opaque part at index 2: C:\Hadoop\data\namenode
at java.net.URI$Parser.fail(URI.java:2848)
at java.net.URI$Parser.checkChars(URI.java:3021)
at java.net.URI$Parser.parse(URI.java:3058)
at java.net.URI.(URI.java:588)
16/05/22 21:59:15 WARN common.Util: Path C:\Hadoop\data\namenode should be specified as a URI in configuration files. Please update hdfs configuration.
16/05/22 21:59:15 ERROR common.Util: Syntax error in URI C:\Hadoop\data\namenode. Please check hdfs configuration.
java.net.URISyntaxException: Illegal character in opaque part at index 2: C:\Hadoop\data\namenode
at java.net.URI$Parser.fail(URI.java:2848)
at java.net.URI$Parser.checkChars(URI.java:3021)
at java.net.URI$Parser.parse(URI.java:3058)
at java.net.URI.(URI.java:588)
at org.apache.hadoop.hdfs.server.common.Util.stringAsURI(Util.java:48)
at org.apache.hadoop.hdfs.server.common.Util.stringCollectionAsURIs(Util.java:98)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getStorageDirs(FSNamesystem.java:1400)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getNamespaceEditsDirs(FSNamesystem.java:1445)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getNamespaceEditsDirs(FSNamesystem.java:1414)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkConfiguration(FSNamesystem.java:617)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFromDisk(FSNamesystem.java:669)
at org.apache.hadoop.hdfs.server.namenode.NameNode.loadNamesystem(NameNode.java:584)
at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:644)
at org.apache.hadoop.hdfs.server.namenode.NameNode.(NameNode.java:811)
at org.apache.hadoop.hdfs.server.namenode.NameNode.(NameNode.java:795)
at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1488)
at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1554)
16/05/22 21:59:15 WARN common.Util: Path C:\Hadoop\data\namenode should be specified as a URI in configuration files. Please update hdfs configuration.
16/05/22 21:59:15 WARN namenode.FSNamesystem: Only one image storage directory (dfs.namenode.name.dir) configured. Beware of data loss due to lack of redundant storage directories!
16/05/22 21:59:15 WARN namenode.FSNamesystem: Only one namespace edits storage directory (dfs.namenode.edits.dir) configured. Beware of data loss due to lack of redundant storage directories!
16/05/22 21:59:15 ERROR common.Util: Syntax error in URI C:\Hadoop\data\namenode. Please check hdfs configuration.
java.net.URISyntaxException: Illegal character in opaque part at index 2: C:\Hadoop\data\namenode
at java.net.URI$Parser.fail(URI.java:2848)
at java.net.URI$Parser.checkChars(URI.java:3021)
at java.net.URI$Parser.parse(URI.java:3058)
at java.net.URI.(URI.java:588)
16/05/22 21:59:15 WARN common.Util: Path C:\Hadoop\data\namenode should be specified as a URI in configuration files. Please update hdfs configuration.
16/05/22 21:59:15 ERROR common.Util: Syntax error in URI C:\Hadoop\data\namenode. Please check hdfs configuration.
java.net.URISyntaxException: Illegal character in opaque part at index 2: C:\Hadoop\data\namenode
at java.net.URI$Parser.fail(URI.java:2848)
at java.net.URI$Parser.checkChars(URI.java:3021)
at java.net.URI$Parser.parse(URI.java:3058)
at java.net.URI.(URI.java:588)
at org.apache.hadoop.hdfs.server.common.Util.stringAsURI(Util.java:48)
at org.apache.hadoop.hdfs.server.common.Util.stringCollectionAsURIs(Util.java:98)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getStorageDirs(FSNamesystem.java:1400)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getNamespaceEditsDirs(FSNamesystem.java:1445)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getNamespaceEditsDirs(FSNamesystem.java:1414)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFromDisk(FSNamesystem.java:670)
at org.apache.hadoop.hdfs.server.namenode.NameNode.loadNamesystem(NameNode.java:584)
at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:644)
at org.apache.hadoop.hdfs.server.namenode.NameNode.(NameNode.java:811)
at org.apache.hadoop.hdfs.server.namenode.NameNode.(NameNode.java:795)
at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1488)
at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1554)
16/05/22 21:59:15 WARN common.Util: Path C:\Hadoop\data\namenode should be specified as a URI in configuration files. Please update hdfs configuration.
16/05/22 21:59:16 INFO namenode.FSNamesystem: No KeyProvider found.
16/05/22 21:59:16 INFO namenode.FSNamesystem: fsLock is fair:true
16/05/22 21:59:16 INFO blockmanagement.DatanodeManager: dfs.block.invalidate.limit=1000
16/05/22 21:59:16 INFO blockmanagement.DatanodeManager: dfs.namenode.datanode.registration.ip-hostname-check=true
16/05/22 21:59:16 INFO blockmanagement.BlockManager: dfs.namenode.startup.delay.block.deletion.sec is set to 000:00:00:00.000
16/05/22 21:59:16 INFO blockmanagement.BlockManager: The block deletion will start around 2016 May 22 21:59:16
16/05/22 21:59:16 INFO util.GSet: Computing capacity for map BlocksMap
16/05/22 21:59:16 INFO util.GSet: VM type = 32-bit
16/05/22 21:59:16 INFO util.GSet: 2.0% max memory 966.7 MB = 19.3 MB
16/05/22 21:59:16 INFO util.GSet: capacity = 2^22 = 4194304 entries
16/05/22 21:59:16 INFO blockmanagement.BlockManager: dfs.block.access.token.enable=false
16/05/22 21:59:16 INFO blockmanagement.BlockManager: defaultReplication = 1
16/05/22 21:59:16 INFO blockmanagement.BlockManager: maxReplication = 512
16/05/22 21:59:16 INFO blockmanagement.BlockManager: minReplication = 1
16/05/22 21:59:16 INFO blockmanagement.BlockManager: maxReplicationStreams = 2
16/05/22 21:59:16 INFO blockmanagement.BlockManager: shouldCheckForEnoughRacks = false
16/05/22 21:59:16 INFO blockmanagement.BlockManager: replicationRecheckInterval = 3000
16/05/22 21:59:16 INFO blockmanagement.BlockManager: encryptDataTransfer = false
16/05/22 21:59:16 INFO blockmanagement.BlockManager: maxNumBlocksToLog = 1000
16/05/22 21:59:16 INFO namenode.FSNamesystem: fsOwner = Rahul (auth:SIMPLE)
16/05/22 21:59:16 INFO namenode.FSNamesystem: supergroup = supergroup
16/05/22 21:59:16 INFO namenode.FSNamesystem: isPermissionEnabled = true
16/05/22 21:59:16 INFO namenode.FSNamesystem: HA Enabled: false
16/05/22 21:59:16 INFO namenode.FSNamesystem: Append Enabled: true
16/05/22 21:59:16 INFO util.GSet: Computing capacity for map INodeMap
16/05/22 21:59:16 INFO util.GSet: VM type = 32-bit
16/05/22 21:59:16 INFO util.GSet: 1.0% max memory 966.7 MB = 9.7 MB
16/05/22 21:59:16 INFO util.GSet: capacity = 2^21 = 2097152 entries
16/05/22 21:59:16 INFO namenode.FSDirectory: ACLs enabled? false
16/05/22 21:59:16 INFO namenode.FSDirectory: XAttrs enabled? true
16/05/22 21:59:16 INFO namenode.FSDirectory: Maximum size of an xattr: 16384
16/05/22 21:59:16 INFO namenode.NameNode: Caching file names occuring more than 10 times
16/05/22 21:59:16 INFO util.GSet: Computing capacity for map cachedBlocks
16/05/22 21:59:16 INFO util.GSet: VM type = 32-bit
16/05/22 21:59:16 INFO util.GSet: 0.25% max memory 966.7 MB = 2.4 MB
16/05/22 21:59:16 INFO util.GSet: capacity = 2^19 = 524288 entries
16/05/22 21:59:16 INFO namenode.FSNamesystem: dfs.namenode.safemode.threshold-pct = 0.9990000128746033
16/05/22 21:59:16 INFO namenode.FSNamesystem: dfs.namenode.safemode.min.datanodes = 0
16/05/22 21:59:16 INFO namenode.FSNamesystem: dfs.namenode.safemode.extension = 30000
16/05/22 21:59:16 INFO metrics.TopMetrics: NNTop conf: dfs.namenode.top.window.num.buckets = 10
16/05/22 21:59:16 INFO metrics.TopMetrics: NNTop conf: dfs.namenode.top.num.users = 10
16/05/22 21:59:16 INFO metrics.TopMetrics: NNTop conf: dfs.namenode.top.windows.minutes = 1,5,25
16/05/22 21:59:16 INFO namenode.FSNamesystem: Retry cache on namenode is enabled
16/05/22 21:59:16 INFO namenode.FSNamesystem: Retry cache will use 0.03 of total heap and retry cache entry expiry time is 600000 millis
16/05/22 21:59:16 INFO util.GSet: Computing capacity for map NameNodeRetryCache
16/05/22 21:59:16 INFO util.GSet: VM type = 32-bit
16/05/22 21:59:16 INFO util.GSet: 0.029999999329447746% max memory 966.7 MB = 297.0 KB
16/05/22 21:59:16 INFO util.GSet: capacity = 2^16 = 65536 entries
16/05/22 21:59:16 ERROR namenode.NameNode: Failed to start namenode.
java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:609)
at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.analyzeStorage(Storage.java:490)
at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverStorageDirs(FSImage.java:322)
at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverTransitionRead(FSImage.java:215)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFSImage(FSNamesystem.java:975)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFromDisk(FSNamesystem.java:681)
at org.apache.hadoop.hdfs.server.namenode.NameNode.loadNamesystem(NameNode.java:584)
at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:644)
at org.apache.hadoop.hdfs.server.namenode.NameNode.(NameNode.java:811)
at org.apache.hadoop.hdfs.server.namenode.NameNode.(NameNode.java:795)
at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1488)
at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1554)
16/05/22 21:59:16 INFO util.ExitUtil: Exiting with status 1
16/05/22 21:59:16 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at Raul/192.168.1.16
************************************************************/
C:\Hadoop\hadoop-2.7.1\sbin>

1. Tushar Sarde Post author

I see a configuration error; please check the configuration again:

16/05/22 21:59:15 ERROR common.Util: Syntax error in URI C:\Hadoop\data\namenode. Please check hdfs configuration

2. Tushar Sarde Post author

16/05/21 21:12:14 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform… using builtin-java classes where applicable
16/05/21 21:12:15 FATAL datanode.DataNode: Exception in secureMain
java.lang.UnsatisfiedLinkError:

This error means it's not able to find the HADOOP_HOME path, that is, the winutils.exe file. Try to configure HADOOP_HOME and also add the path up to "$HADOOP_HOME/bin/".
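As an illustrative sketch of that last reply (the C:\Hadoop\hadoop-2.7.1 location is just an example; point HADOOP_HOME at wherever you extracted Hadoop, i.e. the folder whose bin subfolder contains winutils.exe):

```bat
@rem Set HADOOP_HOME to the Hadoop install folder (the one containing bin\winutils.exe)
set HADOOP_HOME=C:\Hadoop\hadoop-2.7.1

@rem Add the bin folder to PATH so hadoop.cmd, hdfs.cmd and winutils.exe are found
set PATH=%PATH%;%HADOOP_HOME%\bin
```

Setting the same two values under Environment Variables in the system settings makes them survive new cmd sessions.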

2. Tushar Sarde Post author

I'm not sure about Windows 10, but try to configure Hadoop on another drive like D or E instead of C.

53. Shikha

After running the 'hdfs namenode -format' command, I am getting the error below. I googled to understand the issue; many people suggested editing the bashrc file, but I am not able to find this file within my Hadoop home. Could you please suggest where/what to look for?

———————————————————————————————————————————————————-
Picked up JAVA_TOOL_OPTIONS: -Duser.home=C:\Users\Shikha
Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/hadoop/util/PlatformName : Unsupported major.minor version 51.0
at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
at java.security.AccessController.doPrivileged(Native Method)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Could not find the main class: org.apache.hadoop.util.PlatformName. Program will exit.
Picked up JAVA_TOOL_OPTIONS: -Duser.home=C:\Users\43279591
Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/hadoop/hdfs/server/namenode/NameNode : Unsupported major.minor version 51.0
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
Could not find the main class: org.apache.hadoop.hdfs.server.namenode.NameNode.
Program will exit.

1. Tushar Sarde Post author

Are you using a 32-bit Windows machine? This setup works on 64-bit Windows machines.

1. Shikha

Thanks for responding. It's 64 bit only. I am using JDK 1.6. Could that be the root cause?

1. Shikha

Tried with jdk1.7 too, getting same error. 🙃

1. Tushar Sarde Post author

Are you using a 32-bit Windows machine?

54. Sameen S.

Hi Tushar,

Hope you are doing well. Thanks a lot for this fantastic tutorial. I have three out of four processes/services running; the only problem I am facing is with the datanode, I guess. Following is my error log, please guide!
————————————————————————————————————————————————————–
0: Unknown error
at org.apache.hadoop.io.nativeio.NativeIO$POSIX.chmod(NativeIO.java:241)
at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:724)
at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:502)
at org.apache.hadoop.util.DiskChecker.mkdirsWithExistsAndPermissionCheck(DiskChecker.java:140)
at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:156)
at org.apache.hadoop.hdfs.server.datanode.DataNode$DataNodeDiskChecker.checkDir(DataNode.java:2344)
16/05/07 13:55:20 FATAL datanode.DataNode: Exception in secureMain
16/05/07 13:55:20 INFO util.ExitUtil: Exiting with status 1
16/05/07 13:55:20 INFO datanode.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at Sameen/192.168.192.1
—————————————————————————————————————————————————————-

P.S: I am still newbie to Hadoop environment!
Thanks,

1. Sameen S.

OK, I have got it. Issue resolved: there was an error in the path in hdfs-site.xml. I was giving the namenode path two times and skipping the datanode; that's why it wasn't appearing.
Now I am waiting for the next tutorial: "In next tutorial we will see how to run mapreduce programs in windows using eclipse and this hadoop setup".

Any tentative plan when it is coming?

Thanks,

55. MIQTA

Everything went perfectly; I thought the installation was done, but I am unable to use the GUI. http://localhost:8088/ returns the error that localhost refused to connect. What could be the reason?

56. kamakshaiah

Everything is fine, but I am not able to format the namenode (hdfs namenode -format). Will you please let me know how to do it?

Thanks

1. Tushar Sarde Post author

Please give me more details. What is the error when you run the "hdfs namenode -format" command?

57. Hiren

Hi Tushar, I could do this. Thanks for the tutorial. The only problem I found in the tutorial was that the use of / should be replaced by \ in hdfs-site.xml.

58. Javed

After that I removed the in_use lock file, then used hdfs namenode -format again and started again, but the datanode stopped again.

I am using the same Java as suggested: C:\Program Files\Java\jdk1.7.0_79\bin

Previously the namenode was not working either; after making the change ('/' before the C drive) like below, the namenode started but the datanode started failing.

dfs.replication
1

dfs.namenode.name.dir

dfs.datanode.data.dir

59. Sanjay T Patil

Hi Tushar,

Thanks and Regards,
Sanjay

60. Manasa

Hi,

Can this be used for a Windows 10 installation?
Thanks

1. Tushar Sarde Post author

Yes, this tutorial will work on 64 bit.

61. nikunj mange

Hi, thanks for the above tutorial. I followed the steps and could start Hadoop. But when I issue the jps command, the NameNode is not listed. Hence I am not able to access the http://localhost:50070 link. Could you please help me resolve this?

I am new to this can you please also help me how to use in eclipse or some link..

1. ubosm

the use of / should be replaced by \ in hdfs-site.xml

62. Ga

Hi, where is the next tutorial?
“In next tutorial we will see how to run mapreduce programs in windows using eclipse and this hadoop setup”

63. Aniket singh

Hi Tushar,

Thanks a lot for this post. I am finally able to install Hadoop, which I could not do earlier using Cygwin.

Do you have any other post for running the mapreduce job in hadoop? if yes then please send me the link.

Thanks,
Aniket

64. judson gabriel

65. midhun

Sir, I am getting an "org.apache.hadoop.mapred.InvalidInputException: Input Pattern hdfs://localhost:9001/user/user54/dblp/raw-000/* matches 0 files" error while running a code. Can you please help me solve the issue?

66. Ahmed Attia

First of all, this tutorial is perfect (A+), but I need a bit of help.
I'm trying to use Hadoop (2.7.1) for the first time, using Cygwin on Windows 8. I followed the configuration and I think everything is well. However, when I issue sbin/start-all.sh, I get the following error.

Also when i try to use the browser and issue http://localhost:50030/ or http://localhost:50070/ i get the following message

This site can’t be reached
localhost refused to connect.
How can I get my computer to stop refusing the connection? I don’t know much about networking or servers. Thanks in advance

67. Naveen

The system cannot find the path specified.
Error: JAVA_HOME is incorrectly set.
The system cannot find the path specified.
Error: JAVA_HOME is incorrectly set.

Hi

Thank you for giving procedure for Hadoop Installation.

I tried installing Hadoop 2.7.1 by following your procedure, but it was not successful.

My Java version is 1.8.0

Yarn resource manager and Yarn node manager getting started.

But the Data Node and Name Node are giving errors. It's an error from Java. The error is: "UnsatisfiedLinkError".

Thank you 🙂

1. Kunal Relia

Hi. I am facing the same Problem. Help on this matter will be highly appreciated.

1. Prabakaran

You might be using a JDK older than 1.7; use a newer JDK like 1.8 and it should start fine.

69. anu

Hi Tushar,

Thanks very much for the tutorial. It's very crisp and easy to learn from.

I followed the tutorial, but I am having problems with the ‘start-all.cmd’ command from sbin. It gives the following error:

‘C:/ “C:\” org.apache.hadoop.util.PlatformName’ is not recognized as an internal or external command, operable program or batch file.
The system cannot execute the specified program.
The system cannot execute the specified program.

1. anu

The first C:/ is C:/javapath and the second C:\ is C:/hadoop-share-common-lib-path.

70. Zem Wilbury

Great tutorial. The only one among many I could use. Congrats!

71. Ricardo Merino

Great tutorial!! thank you so much.

As a new user of Hadoop I have a question:
Do I need to format the namenode every time I want to start the daemons?

Thanks alot again!

1. Tushar Sarde Post author

No, you don't need to. If you format the namenode, all metadata-related information will be gone (metadata like file size, replication, etc.).
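As a sketch of the usual sequence on this setup (a one-time format on first install, then only the start script on later runs; this assumes HADOOP_HOME is set as described in the tutorial):

```bat
@rem First setup only: format the namenode once (this wipes HDFS metadata)
hdfs namenode -format

@rem Every subsequent start: just launch the daemons
cd %HADOOP_HOME%\sbin
start-all.cmd
```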

72. Praveen Sahu

Hi Tushar,

Thanks a lot for summing up the process.
I am using Hadoop 2.7.1, JDK 1.7 and all the config provided by you. While running hadoop namenode -format I am getting this error:
WARN namenode.FSEditLog: No class configured for F, dfs.namenode.edits.journal-plugin.F is empty
16/03/12 13:40:36 ERROR namenode.NameNode: Failed to start namenode.
java.lang.IllegalArgumentException: No class configured for F
16/03/12 13:40:36 INFO util.ExitUtil: Exiting with status 1
16/03/12 13:40:36 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************

Kindly help

1. Tushar Sarde Post author

Looks like you don't have an F: drive. What path are you setting in your hdfs-site.xml file?

1. Praveen Sahu

It is resolved. Actually, there was a space in my Java installation path.
Now, if I want to run a Hadoop program, please let me know if there is any tutorial.

Installation tutorial is great man .Thanks!!

2. Vipin Singh

Change the properties in hdfs-site.xml. Basically you will have to replace the drive name with a file: prefix, e.g. F:/somefolder/otherfolder becomes file:/somefolder/otherfolder. This applies to the dfs.namenode.name.dir and dfs.datanode.data.dir values (dfs.replication stays at 1).
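As a sketch of that fix in hdfs-site.xml (the /hadoop/data/... folders are placeholders; point the values at your own data directories, created beforehand as in Step 1 of the tutorial):

```xml
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <!-- file: URI instead of a bare Windows drive path avoids the URISyntaxException -->
    <name>dfs.namenode.name.dir</name>
    <value>file:/hadoop/data/namenode</value>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>file:/hadoop/data/datanode</value>
  </property>
</configuration>
```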

1. Praveen Sahu

Thanks. Any info on running a Hadoop program?

73. Sameen S.

Hi Tushar,
Thank you for such a plain and vivid tutorial on this, It is really helpful even for beginners.
Why I am getting following message
“hdfs is not recognized as internal or external command”

1. Tushar Sarde Post author

Your Windows CMD is not able to find hadoop.cmd; that's why you are getting "hdfs is not recognized as an internal or external command".
It's just like in Java: we get "java is not recognized as an internal or external command" if we have not set the path.

Check your system path again and let me know

Thx

74. Mayank

Hi Tushar,

I am using your setup, but on the 32-bit version of Windows. Can you please provide winutils.exe and the DLLs compiled for 32-bit Windows, or guide us on how you build them?

Thanks for the tutorial, its really nice.

1. Tushar Sarde Post author

Hi, I will build that for you. Actually it's a very long process; I will upload the tutorial, but it will take time 🙁

1. yash

Hi Tushar,

Any luck with win 7 32 bit Utils please ? thanks in advance.

1. Tushar Sarde Post author

Not tried yet, Yash, sorry for this, but I will upload it in November 2016.

2. Ritika

Hi Tushar,
Can you please build for 32 bit and share. I am trying to setup hadoop on 32 bit Windows. Thanks in advance.

Regards
Ritika

1. Tushar Sarde Post author

Hi Ritika,

32 bit tutorial will be available from Oct. 2016, sorry for the delay

Thanks
Tushar

75. MR

Thanks for this tutorial!!
Though I have a problem in the last part

In step 3, I replaced my bin folder with yours, but when I try executing hdfs namenode -format, I get an error:

C:\>hdfs namenode -format
‘hdfs’ is not recognized as an internal or external command,
operable program or batch file.

(I have hadoop 2.7.2 and jdk 1.8, if it would make things any different. And I only had a template for the mapreduce: mapred-site.xml.template)

1. MR

I redid the tutorial from scratch with hadoop 2.7.1, and I’m still getting the same error message.
And I don’t have mapred-site.xml, just mapred-site.xml.template (yes, even in 2.7.1).
So I added the (your code here) at the end of the template document.

1. Jay

I'm getting the same issue, MR. If you found the solution, please tell me too. Thanks in advance.

1. Tushar Sarde Post author

Hi Jay,

1. Zem Wilbury

Go to the bin directory and try

./hdfs namenode -format
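For reference, the ./ prefix is a Unix-shell convention (e.g. Cygwin or Git Bash); on a plain Windows command prompt an equivalent, assuming HADOOP_HOME is set as in the tutorial, would be:

```bat
@rem Run the format command from inside the bin folder on cmd
cd %HADOOP_HOME%\bin
hdfs namenode -format
```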

1. Sameen S.

Hi Zem,

I tried this too, but none of the above solutions by Tushar is working for me. And tried this one, but same error appears. Though I am using Hadoop 2.7.1. Please guide!

1. Tushar Sarde Post author

Kindly explain your question? jps is just a command.

2. Malla

Go to your Java 7/8 installation directory (for ex: C:\Program Files\Java\jdk1.8.0_91\bin) and try below command:

C:\Program Files\Java\jdk1.8.0_91\bin>jps
8752
7092 DataNode
7556 Jps
2328 NameNode
9800 ResourceManager
9864 NodeManager

C:\Program Files\Java\jdk1.8.0_91\bin>

76. karhtick

When I use 'hdfs namenode -format' in the command prompt, it says hdfs is not recognized as an internal or external command.

1. Tushar Sarde Post author

Add this in hdfs-site.xml:

name: fs.checkpoint.dir
value: F:\secondarynamenode

and start the services again:
sbin/start-all.cmd
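As an illustrative sketch of that property in hdfs-site.xml (F:\secondarynamenode is the example path from the reply; use a folder that exists on your machine, and note that elsewhere in this thread a file: URI form was needed to avoid URI syntax errors with bare drive paths):

```xml
<property>
  <name>fs.checkpoint.dir</name>
  <value>F:\secondarynamenode</value>
</property>
```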

77. karhtick

I can't understand what you mean in the following step:

Step 3 – Start everything

Very Important step

bin folder – this contains .dll and .exe file (winutils.exe for hadoop 2.7.1)

Now delete you existing bin folder and replace with new one (downloaded from my repo )

1. Tushar Sarde Post author

I am saying: delete your existing bin folder and download the new bin folder from my repository, which contains the .dll and .exe files.

78. raj

Hi,

I tried to follow many links to install but yours is very simple.

I tried to install Hadoop on Windows 7 on a 32-bit machine. I followed your steps. My Java version is 1.7.0_95.
I have a few doubts; please clear them up if possible.

1. When I execute hdfs version, I get an error.
When I execute hadoop version, it shows version details. What is the difference, and what is the issue?

Error: Could not find or load main class version

Compiled by jenkins on 2015-06-29T06:04Z
Compiled with protoc 2.5.0
From source with checksum fc0a1a23fc1868e4d5ee7fa2b28a58a

2. I got status 0 when executing hdfs namenode -format. It created 4 files (fsimage_0000000000000000000, fsimage_0000000000000000000.md5, seen_txid and VERSION),
but I got the errors and warnings below while executing the command.

Log
—————

.
.
.

STARTUP_MSG: java = 1.7.0_95
************************************************************/
16/02/19 14:42:52 INFO namenode.NameNode: createNameNode [-format]
at java.net.URI$Parser.fail(URI.java:2829)
at java.net.URI$Parser.checkChars(URI.java:3002)
at java.net.URI$Parser.parse(URI.java:3039)
at java.net.URI.(URI.java:595)
at org.apache.hadoop.hdfs.server.common.Util.stringAsURI(Util.java:48)
at org.apache.hadoop.hdfs.server.common.Util.stringCollectionAsURIs(Util.java:98)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getStorageDirs(FSNamesystem.java:1400)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getNamespaceDirs(FSNamesystem.java:1355)
at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:966)
at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1429)
at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1554)
16/02/19 14:42:53 WARN common.Util: Path D:\Hadoop_TEST\Hadoop\Data should be specified as a URI in configuration files. Please update hdfs configuration.
16/02/19 14:42:53 ERROR common.Util: Syntax error in URI D:\Hadoop_TEST\Hadoop\Data. Please check hdfs configuration.
java.net.URISyntaxException: Illegal character in opaque part at index 2: D:\Hadoop_TEST\Hadoop\Data
at java.net.URI$Parser.fail(URI.java:2829)
at java.net.URI$Parser.checkChars(URI.java:3002)
at java.net.URI$Parser.parse(URI.java:3039)
at java.net.URI.(URI.java:595)
.
.
.
.
16/02/19 14:42:56 INFO namenode.FSImage: Allocated new BlockPoolId: BP-507165786-10.219.149.100-1455873176446
16/02/19 14:42:56 INFO namenode.NNStorageRetentionManager: Going to retain 1 images with txid >= 0
16/02/19 14:42:56 INFO util.ExitUtil: Exiting with status 0
16/02/19 14:42:56 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at PC195/10.219.149.100
************************************************************/

Could you please tell me whether I am on the right track?

Thanks ,
Raj

Note: It was somewhat difficult to align the text.

1. Tushar Sarde Post author

Hi Raj,

I apologise, but this setup is for 64-bit Windows machines; I built the DLLs using a 64-bit machine.
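For the URI warnings and the URISyntaxException in the log above, the usual fix is to write the Windows data path as a file: URI with forward slashes in hdfs-site.xml; otherwise “D:” is parsed as a URI scheme and “\” is rejected as an illegal character. A sketch using the D:\Hadoop_TEST\Hadoop\Data directory from the log (the namenode/datanode subfolder names here are only an example):

```xml
<!-- hdfs-site.xml: Windows paths written as file: URIs (forward slashes),
     which avoids "Illegal character in opaque part at index 2". -->
<property>
<name>dfs.namenode.name.dir</name>
<value>file:///D:/Hadoop_TEST/Hadoop/Data/namenode</value>
</property>
<property>
<name>dfs.datanode.data.dir</name>
<value>file:///D:/Hadoop_TEST/Hadoop/Data/datanode</value>
</property>
```

After changing these values, the namenode has to be formatted again with hdfs namenode -format.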

79. raj

Hi,

I tried to follow many links to install but yours is very simple.


80. Nithya

I used the above steps to install Hadoop, but I got the following error. Please suggest how to proceed further.
16/02/18 09:00:02 ERROR namenode.NameNode: Failed to start namenode.
java.lang.IllegalArgumentException: No class configured for D
.
.
.
16/02/18 09:00:02 INFO util.ExitUtil: Exiting with status 1
16/02/18 09:00:02 INFO namenode.NameNode: SHUTDOWN_MSG:

81. marco

Wonderful tutorial.
Now, could you point us to the tutorial “how to run mapreduce programs in windows using eclipse and this hadoop setup”?
thanks
Marco

82. Bhushan

“Below is the msg which is shown before it shuts it down”

[Fatal Error] core-site.xml:28:22: The processing instruction target matching “[xX][mM][lL]” is not allowed.
16/02/15 16:24:56 FATAL conf.Configuration: error parsing conf core-site.xml
org.xml.sax.SAXParseException; systemId: file:/C:/hadoop-2.7.1/etc/hadoop/core-site.xml; lineNumber: 28; columnNumber: 22; The processing instruction target matching “[xX][mM][lL]” is not allowed.

1. Tushar Sarde Post author

Could you please paste your lines 25–30? I need to see those lines.

1. Bhushan

I am not sure whether you can view all the lines in the comments above, so I have pasted the lines again below.

<!--
<property>
<name>fs.defaultFS</name>
<value>hdfs://localhost:9000</value>
</property>
-->

1. Tushar Sarde Post author

Delete your existing core-site.xml file and use the core-site.xml file from my GitHub account.
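For reference, the “[xX][mM][lL] is not allowed” parser error at line 28 usually means a second <?xml …?> declaration was pasted into the middle of the file; the declaration may appear only once, on the very first line. A minimal well-formed core-site.xml, matching the values used earlier in this post, looks like:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Exactly one XML declaration, on the first line of the file.
     A duplicate declaration lower down triggers:
     The processing instruction target matching "[xX][mM][lL]" is not allowed. -->
<configuration>
<property>
<name>fs.defaultFS</name>
<value>hdfs://localhost:9000</value>
</property>
</configuration>
```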

83. Bhushan

Hi Tushar,

I could run all of the steps up until the ‘jps’ command. The 4 windows with the processes did open but the final message was

SHUTDOWN_MSG: Shutting down ResourceManager at Bhushan-PC/”some IP address”

And similarly for Namenode, Datanode, NodeManager…..

How exactly do I get to step 4? When I try opening the links mentioned in step 4, it says
——————–
This webpage is not available

ERR_CONNECTION_REFUSED
Hide details
Google Chrome’s connection attempt to localhost was rejected. The website may be down, or your network may not be properly configured.
———————

Could you kindly help me with this. My JAVA_HOME variable is properly set. I am using Windows 10.

Thanks,
Bhushan.

1. Zomel

Make sure to keep the 4 windows open when you run the “jps” command and when you open the localhost URLs; otherwise this won’t work.

84. Anantharaman

Also, can you tell me how you made those exe, dll & lib files?

1. Tushar Sarde Post author

It’s a very long process; I made that exe using Visual Studio 2010 and the Hadoop source code.
Since you need it, I will write a tutorial on this for you; please wait for some time.

1. Sumeet Gupta

Hello Tushar,

Hope you are doing good. Thanks for the tutorial. It is very descriptive and your efforts are highly appreciated.

However, when I try to start the DFS through start-dfs.cmd, both the NameNode and the DataNode fail to start, with different errors. If possible, could you please share your thoughts on what I can do to fix this?

Please note that I am trying to install it on a Windows 10 64-bit PC. The paths which I am trying to allocate to the namenode and datanode exist. I am appending logs from both the (datanode and namenode) windows below for your reference.

Kind Regards,
Sumeet

—————————————————————————-DATANODE—————————————————————————-
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
16/08/13 23:29:05 INFO datanode.DataNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting DataNode
STARTUP_MSG: host = Sumeet-BOT/192.168.56.1
STARTUP_MSG: args = []
STARTUP_MSG: version = 2.7.1
—————————————————————————-NAMENODE—————————————————————————-
16/08/13 23:14:02 ERROR namenode.NameNode: Failed to start namenode.
java.io.IOException: All specified directories are not accessible or do not exist.
16/08/13 23:14:02 INFO util.ExitUtil: Exiting with status 1
16/08/13 23:14:02 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at Sumeet-BOT/192.168.56.1
************************************************************/
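For what it’s worth, the “All specified directories are not accessible or do not exist” message usually means the folders named in dfs.namenode.name.dir / dfs.datanode.data.dir are missing or were never formatted. A sketch of the usual remedy; the C:\hadoop-2.7.1\data\… paths below are only placeholders, so substitute whatever your own hdfs-site.xml points at:

```bat
:: Create the storage folders named in hdfs-site.xml, then re-format.
:: (Placeholder paths; use the directories from your own configuration.)
mkdir C:\hadoop-2.7.1\data\namenode
mkdir C:\hadoop-2.7.1\data\datanode
hdfs namenode -format
```

After a clean format, start-dfs.cmd should bring up both daemons without this error.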

85. Imelda

Thank you for this tutorial, but I have hit an obstacle at the hdfs namenode -format step.

After I execute “hdfs namenode -format”, cmd gives feedback like this:

The system cannot find the path specified.
Error : JAVA_HOME is incorrectly set.
‘-Dhadoop.security.logger’ is not recognized as an internal or external command, operable program or batch file.

I am just wondering why, after I execute “hdfs namenode -format”, the program looks for C:\hadoop-2.7.1\conf\hadoop-env.cmd, because after checking and rechecking there is no conf directory, and the hadoop-env.cmd file is in the etc directory.

Did I miss something? I think I have followed your instructions above.
Thanks

1. Tushar Sarde Post author

Hi Imelda,

Have you configured JAVA_HOME in your Windows environment variables as well as in hadoop-env.cmd?

set JAVA_HOME=C:\PROGRA~1\Java\jdk1.7.0_67

Can you please confirm that Java is installed successfully?

Thanks

1. Anonymous

Hi Tushar Sarde,

Of course I have configured JAVA_HOME on my Windows machine, just like in hadoop-env.cmd.

And I can’t solve the problem yet…

1. Tushar Sarde Post author

You need to recheck everything; you are missing something!

2. Noel

I set JAVA_HOME=’C:/Program%20%Files/Java/jdk1.8.0_74′ but am still getting the error:

Error: JAVA_HOME is incorrectly set.
‘-Dhadoop.security.logger’ is not recognized as an internal or external command,
operable program or batch file.

1. Tushar Sarde Post author

Set the path like this, with no quotes, spaces, or URL encoding:

C:\PROGRA~1\Java\jdk1.8.0_74
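Spelled out in hadoop-env.cmd, that would look something like the following sketch (the jdk1.8.0_74 folder name comes from Noel’s setup; adjust it to your installed JDK):

```bat
@rem hadoop-env.cmd: JAVA_HOME must contain no spaces and no quotes.
@rem PROGRA~1 is the 8.3 short name for "C:\Program Files".
set JAVA_HOME=C:\PROGRA~1\Java\jdk1.8.0_74
```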

2. judson gabriel

2) I finished all the steps you listed in your blog. In the final step, the terminal shows:
C:\Users\judson>C:\Users\judson>hdfs namenode -format
‘C:\Users\judson’ is not recognized as an internal or external command,
operable program or batch file.

1. Tushar Sarde Post author

C:\Users\judson>C:\Users\judson>hdfs namenode -format

It seems you have copy-pasted the path twice; check the line above and make it like

C:\Users\judson>hdfs namenode -format

86. Sudhish

Hi, thanks for the above tutorial. I followed the steps and I could start Hadoop, but when I issue the jps command, the NameNode is not listed. Hence I am not able to access the http://localhost:50070 link. Could you please help me resolve this?

1. Tushar Sarde Post author

Can you please copy-paste your hdfs-site.xml? Maybe you missed something.

2. Bhushan

It might have happened because you replaced your etc folder as well. The JAVA_HOME for you and for the person who wrote the post might be different, so I would suggest running step 2 after step 3. That might solve the problem.

@Tushar : Thanks for the post! I have been looking for this solution for months. I hope it works without any trouble.