Tuesday, October 29, 2013

Installing Hadoop 2 on a Mac

I've had a lot of trouble getting Hadoop 2 and YARN running on my Mac. There are some tutorials out there, but they are often for beta and alpha versions of the Hadoop 2.0 family. These are the steps I used to get Hadoop 2.2.0 working on my Mac running OS X 10.9.

Note: watch for version differences in this post. It was written for Hadoop 2.2.0; we are currently on 2.6.2, so the version number will need to be changed throughout.

Get hadoop from http://www.apache.org/dyn/closer.cgi/hadoop/common/

make sure JAVA_HOME is set (if you have Java 6 on your machine):
export JAVA_HOME=`/usr/libexec/java_home -v1.6`
(note: your Java version should now be 1.7 or 1.8, so adjust the -v flag to match, e.g. -v1.7)

point HADOOP_INSTALL to the hadoop installation directory
export HADOOP_INSTALL=/Applications/hadoop-2.2.0

And set the path
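Something like this works, assuming the standard layout of the 2.2.0 tarball (the hadoop command lives in bin, the start-up scripts in sbin):

export PATH=$PATH:$HADOOP_INSTALL/bin:$HADOOP_INSTALL/sbin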

You can test that hadoop is found with
hadoop version

make sure ssh is set up on your machine:
System Preferences -> Sharing -> Remote Login is ticked

ssh <username>@localhost

where <username> is the name you used to log on.
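If ssh prompts for a password, the Hadoop start-up scripts will stall on it too; the usual fix is passwordless ssh to your own account (standard OpenSSH, nothing Hadoop-specific):

ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 0600 ~/.ssh/authorized_keys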

in $HADOOP_INSTALL/etc/hadoop these are the conf files I changed: core-site.xml, hdfs-site.xml, mapred-site.xml and yarn-site.xml.
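A minimal pseudo-distributed setup looks something like this. The property names are the Hadoop 2.2.0 ones; the port (9000) and the storage paths are conventional choices rather than requirements, so adjust them for your machine.

core-site.xml:

<configuration>
  <property>
    <!-- where clients find HDFS; 9000 is a common choice, not mandatory -->
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

hdfs-site.xml:

<configuration>
  <property>
    <!-- single node, so keep one copy of each block -->
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <!-- these two paths must match the directories created below -->
    <name>dfs.namenode.name.dir</name>
    <value>file:/Users/Administrator/hadoop/namenode</value>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>file:/Users/Administrator/hadoop/datanode</value>
  </property>
</configuration>

mapred-site.xml (copy mapred-site.xml.template if the file doesn't exist yet):

<configuration>
  <property>
    <!-- run MapReduce jobs on YARN rather than the old JobTracker -->
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
</configuration>

yarn-site.xml:

<configuration>
  <!-- Site specific YARN configuration properties -->
  <property>
    <!-- the shuffle service MapReduce needs; note the underscore in 2.2.0 -->
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>
</configuration>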
Make the directories for the namenode and datanode data (note: the paths in hdfs-site.xml above and the mkdir commands below need to reflect where you want to store the files; I've stored mine in the home directory of the Administrator user on my Mac).

mkdir -p /Users/Administrator/hadoop/namenode
mkdir -p /Users/Administrator/hadoop/datanode

hadoop namenode -format

(in Hadoop 2 this spelling is deprecated; hdfs namenode -format does the same thing)

Now start HDFS and YARN (the start-up scripts live in $HADOOP_INSTALL/sbin):

start-dfs.sh
start-yarn.sh

jps should then give something like:
9430 ResourceManager
9325 SecondaryNameNode
9513 NodeManager
9225 DataNode
9916 Jps
9140 NameNode

if not, check the log files. If the DataNode has not started and you get an "incompatible IDs" error, stop everything, then delete and recreate the datanode directory.
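You can also sanity-check things in a browser: the NameNode web UI is at http://localhost:50070 and the ResourceManager at http://localhost:8088 (the default ports, assuming you haven't changed them).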

try an ls:
hadoop fs -ls

if you get

ls: `.': No such file or directory

then there is no home directory in the hadoop file system, so create one:

hadoop fs -mkdir /user
hadoop fs -mkdir /user/<username>
where <username> is the name you are logged onto the machine with.
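(On Hadoop 2 the two mkdir calls can be collapsed into one: hadoop fs -mkdir -p /user/<username>.)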

now change to the $HADOOP_INSTALL directory and upload a file:

hadoop fs -put LICENSE.txt
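With a single argument, -put copies the file into your HDFS home directory; hadoop fs -ls should now list LICENSE.txt.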

finally try a mapreduce job:

cd share/hadoop/mapreduce
hadoop jar ./hadoop-mapreduce-examples-2.2.0.jar wordcount LICENSE.txt out
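If the job succeeds the counts land in the out directory in HDFS; assuming the default output file naming, you can print them with:

hadoop fs -cat out/part-r-00000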


  1. Thanks for the great post. It really helped me get started. Since I ran into a few problems while following your directions, I thought I'd post the problem and solutions here in case they are useful to anyone else.

    Problem 1
    When executing 'hadoop version', I would get an error. I apologize that I didn't capture the exact error text, but the gist was that Hadoop was complaining about the location of JAVA_HOME.

    Solution 1
    Instead of using

    export JAVA_HOME=`/usr/libexec/java_home -v1.6`

    I added the following to my .bash_profile file:

    export JAVA_HOME="$(/usr/libexec/java_home)"

    Problem 2
    When I performed an operation on the file system, I'd get errors that read "Unable to load realm info from SCDynamicStore".

    Solution 2
    I added the following line to the bottom of the hadoop-env.sh file:

    export HADOOP_OPTS="-Djava.security.krb5.realm=OX.AC.UK -Djava.security.krb5.kdc=kdc0.ox.ac.uk:kdc1.ox.ac.uk"

    I also added the following to the yarn-env.sh file:

    YARN_OPTS="$YARN_OPTS -Djava.security.krb5.realm=OX.AC.UK -Djava.security.krb5.kdc=kdc0.ox.ac.uk:kdc1.ox.ac.uk"

    Hope this helps!

    1. Thanks for your solutions. I reckon problem 1 is because the syntax I used explicitly sets the Java version to 1.6, which might not have been installed on your system. Thanks for the answer to problem 2.

  2. Thanks for your post!

    I have been trying to configure it this whole Sunday afternoon, but whatever tutorial I try, I keep getting the following error:


    13/12/01 18:44:32 INFO mapreduce.Job: Job job_1385919832889_0001 failed with state FAILED due to: Application application_1385919832889_0001 failed 2 times due to AM Container for appattempt_1385919832889_0001_000002 exited with exitCode: 127 due to: Exception from container-launch:
    at org.apache.hadoop.util.Shell.runCommand(Shell.java:464)
    at org.apache.hadoop.util.Shell.run(Shell.java:379)
    at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:589)
    at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:195)
    at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:283)
    at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:79)
    at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
    at java.util.concurrent.FutureTask.run(FutureTask.java:166)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:722)

    .Failing this attempt.. Failing the application.
    13/12/01 18:44:32 INFO mapreduce.Job: Counters: 0

    Any ideas?

    1. So I get this error when trying to run the wordcount example.

    2. Solved it. I had some leftovers from other tutorials in my config files. Make sure to only make the changes in this tutorial, fellow Mac users! :)

  3. Awesome! Worked perfectly for me! Thanks!

  4. Thanks for the great article. All worked for me - and this was the first time I tried to get Hadoop up and running.
