Install Hadoop on Mac OS - hadoop3

I have set up Hadoop on Mac OS. I have these packages:
Hadoop version 3.1.1
Java version "1.8.0_191"
Maven version 3.6.0
But I cannot start Hadoop.
I tried to fix it like this:
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib/native"
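For reference, settings like this are usually persisted in the shell profile so they survive new terminal sessions; a minimal sketch, where the HADOOP_HOME path is an assumption (adjust it to your actual install location):

```shell
# ~/.bash_profile — config fragment; the HADOOP_HOME path is an assumption
export HADOOP_HOME=/usr/local/hadoop-3.1.1
export PATH="$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$PATH"
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib/native"
```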
Below are my references; I do not understand what "line 103" refers to.


hadoop version - fairscheduler-statedump.log (No such file or directory)

I tried to install hadoop-3.2.0 on Linux Mint and everything went fine. Java 11.0.2 is also installed:
$ java -version
java version "11.0.2" 2018-10-16 LTS
Java(TM) SE Runtime Environment 18.9 (build 11.0.2+7-LTS)
Java HotSpot(TM) 64-Bit Server VM 18.9 (build 11.0.2+7-LTS, mixed mode)
When I run hadoop version, I get this error:
$ hadoop version
log4j:ERROR setFile(null,true) call failed. /usr/local/hadoop-3.2.0/logs/fairscheduler-statedump.log (No such file or directory)
    at java.base/ Method)
    at java.base/
    at java.base/<init>(
    at java.base/<init>(
    at org.apache.log4j.FileAppender.setFile(
    at org.apache.log4j.RollingFileAppender.setFile(
    at org.apache.log4j.FileAppender.activateOptions(
    at org.apache.log4j.config.PropertySetter.activate(
    at org.apache.log4j.config.PropertySetter.setProperties(
    at org.apache.log4j.config.PropertySetter.setProperties(
    at org.apache.log4j.PropertyConfigurator.parseAppender(
    at org.apache.log4j.PropertyConfigurator.parseCategory(
    at org.apache.log4j.PropertyConfigurator.parseCatsAndRenderers(
    at org.apache.log4j.PropertyConfigurator.doConfigure(
    at org.apache.log4j.PropertyConfigurator.doConfigure(
    at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(
    at org.apache.log4j.LogManager.<clinit>(
    at org.slf4j.impl.Log4jLoggerFactory.<init>(
    at org.slf4j.impl.StaticLoggerBinder.<init>(
    at org.slf4j.impl.StaticLoggerBinder.<clinit>(
    at org.slf4j.LoggerFactory.bind(
    at org.slf4j.LoggerFactory.performInitialization(
    at org.slf4j.LoggerFactory.getILoggerFactory(
    at org.slf4j.LoggerFactory.getLogger(
    at org.slf4j.LoggerFactory.getLogger(
    at org.apache.hadoop.util.VersionInfo.<clinit>(
Hadoop 3.2.0
Source code repository -r e97acb3bd8f3befd27418996fa5d4b50bf2e17bf
Compiled by sunilg on 2019-01-08T06:08Z
Compiled with protoc 2.5.0
From source with checksum d3f0795ed0d9dc378e2c785d3668f39
This command was run using /usr/local/hadoop-3.2.0/share/hadoop/common/hadoop-common-3.2.0.jar
It seems Hadoop is properly installed, but something is wrong with log4j.
Could you help me solve this error?
I should mention that I also tried the release before hadoop-3.2.0, namely hadoop-3.1.2, and everything worked fine.
So I suspect the problem is related to the combination of Java 11.0.2 and hadoop-3.2.0.
I created the logs directory in /usr/local/hadoop and granted ownership to my hadoop account:
/usr/local/hadoop$ sudo mkdir logs
/usr/local/hadoop$ sudo chown -R hadoop logs
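Alternatively, Hadoop can be pointed at a log directory you already own via HADOOP_LOG_DIR in; a sketch, where the install path is an assumption based on the error message above:

```shell
# etc/hadoop/ — config fragment; the path is an assumption
export HADOOP_LOG_DIR=/usr/local/hadoop-3.2.0/logs
```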

jps command not found while installing hadoop

I am new to CentOS 7 and I am configuring a Hadoop 2.7.1 cluster, so I need to install OpenJDK as a requirement.
I installed it with:
yum install java-1.7.0-openjdk
and the java -version output is:
java version "1.7.0_131"
OpenJDK Runtime Environment (rhel- u131-b00)
OpenJDK 64-Bit Server VM (build 24.131-b00, mixed mode)
My problem is that I want to use the jps command, so I tried to install the RPM that provides it:
cd /usr/lib/jvm
rpm -ivh --nodeps
but because a newer version of the JDK is already installed, the install failed with this error:
package java-1.7.0-openjdk-1: (which is newer than java-1.7.0-openjdk-1: is already installed
I don't know if this is the right way to make the jps command work.
What should I do to get the jps command?
And is it right to install an older release of OpenJDK (u101) when a newer one (u131) already exists?
java-1.7.0-openjdk contains only the JRE; jps is part of the OpenJDK development package:
yum install java-1.7.0-openjdk-devel
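Once the devel package is installed, jps should be on the PATH; a quick check might look like this (a sketch — the exact path printed depends on the installed update):

```shell
sudo yum install java-1.7.0-openjdk-devel
command -v jps   # should print a path such as /usr/bin/jps
jps              # lists running JVMs, including the jps process itself
```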

Installing Apache Spark using yum

I am in the process of installing Spark on my organization's HDP box. I run yum install spark and it installs Spark 1.4.1. How do I install Spark 2.0? Please help!
Spark 2 is supported (as a technical preview) in HDP 2.5. Add the HDP 2.5 repo to your yum repo directory and then install it. Spark 1.6.2 is the default version in HDP 2.5.
sudo cp hdp.repo /etc/yum.repos.d/hdp.repo
sudo yum install spark2-master
sudo yum install spark2 (this also seemed to do the same when I tried it)
See "What's new in HDP 2.5" and the HDP documentation for the full list of repos.

Switch java version on CentOS 6

I have a CentOS 6 virtual box with Java 1.7 present.
$ java -version
java version "1.7.0_51"
OpenJDK Runtime Environment (rhel- u51-b02)
OpenJDK 64-Bit Server VM (build 24.45-b08, mixed mode)
$ javac -version
javac 1.7.0_51
I need to use Java 1.6 instead of Java 1.7 so I installed it.
$ sudo yum install java-1.6.0-openjdk.x86_64
$ sudo yum install java-1.6.0-openjdk-devel.x86_64
When I check the version again, I get the same output as before. I will need Java 1.7 for other projects, so I want to keep it installed.
Now I would like to configure my virtual machine so that both the default JDK and the default JRE are 1.6. How do I do that? Also, how can I switch back to Java 1.7 when I'm done with it?
I believe the commands you are looking for are alternatives --config java and alternatives --config javac; see the alternatives man page for additional documentation.
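The switch might look like this (a sketch; the numbered menu entries depend on which JDKs are installed on your box):

```shell
sudo alternatives --config java    # pick the java-1.6.0-openjdk entry from the menu
sudo alternatives --config javac   # repeat for the compiler
java -version                      # should now report 1.6.0
javac -version
```

Running the same two alternatives commands again and picking the 1.7.0 entries switches you back when you are done.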

tomcat7 UnsupportedClassVersionError when running with java7

Problem: when deploying my WAR to Tomcat 7, I get this error:
java.lang.UnsupportedClassVersionError: org.MyLibraryClass : Unsupported major.minor version 51.0
(This is the error you get when code was compiled with a newer Java version than the one used to run it.)
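For context, the major number maps directly to a compiler release: 50 is Java 6, 51 is Java 7, 52 is Java 8. You can read it straight out of a .class file with standard POSIX tools; a minimal sketch (the function name is made up for illustration):

```shell
# Print a class file's major version: the big-endian 16-bit value at
# bytes 6-7, right after the 0xCAFEBABE magic and the minor version.
classfile_major() {
  od -An -j6 -N2 -tu1 "$1" | awk '{ print $1 * 256 + $2 }'
}
```

A dependency compiled by javac 1.7 reports 51, and loading it on a Java 6 JVM throws exactly this UnsupportedClassVersionError.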
The situation, in order:
brand new Ubuntu 12.04.1 Server 64-bit minimal, in a VirtualBox
installed Tomcat 6
tried to deploy my WAR
realized the error, and that I need Java 7 because Ubuntu 12 still ships an outdated Java
installed Oracle Java 7 following a guide:
sudo add-apt-repository ppa:webupd8team/java
sudo apt-get update
sudo apt-get install oracle-java7-installer
removed Tomcat 6 and installed Tomcat 7:
sudo apt-get remove tomcat6-common
sudo apt-get install tomcat7
deployed my WAR to Tomcat 7
started Tomcat:
sudo service tomcat7 start
checked my app's log file: same error.
echo $JAVA_HOME is empty; java -version shows:
java version "1.7.0_07"
Java(TM) SE Runtime Environment (build 1.7.0_07-b10)
Java HotSpot(TM) 64-Bit Server VM (build 23.3-b01, mixed mode)
It's a default Tomcat 7 install with no modifications. Still, I checked the startup scripts and config to make sure no custom Java version is specified anywhere. I also checked by asking Catalina:
ubuntu#ubuntu:/home$ /usr/share/tomcat7/bin/ version
Using CATALINA_BASE: /usr/share/tomcat7
Using CATALINA_HOME: /usr/share/tomcat7
Using CATALINA_TMPDIR: /usr/share/tomcat7/temp
Using JRE_HOME: /usr
Using CLASSPATH: /usr/share/tomcat7/bin/bootstrap.jar:/usr/share/tomcat7/bin/tomcat-juli.jar
Server version: Apache Tomcat/7.0.26
Server built: Jul 19 2012 03:21:30
Server number:
OS Name: Linux
OS Version: 3.2.0-29-generic
Architecture: amd64
JVM Version: 1.7.0_07-b10
JVM Vendor: Oracle Corporation
Now I'm stuck. I don't see how any Java code could fail to run on Oracle's JRE 7.
My WAR is a brand new, very basic hello-world Grails 2.1 app built with Maven, which has a Maven dependency (org.MyLibraryClass) compiled with JDK 7; that dependency is the one producing the error.
In Grails I changed BuildConfig.groovy to use 1.7 instead of 1.6: = 1.7
grails.project.source.level = 1.7
Then I did a grails clean, rebuilt, re-created the WAR, and redeployed. No change.
Any idea what to try next?
This is the typical error you get when compiling code with Java 7 and running it under Java 6.
The critical line in your question, I think, is this one:
sudo service tomcat7 start
The service script may still be inheriting the OpenJDK that is on the system. Instead, try running Tomcat from your own environment.
Log in as your normal user and check:
java -version
You should also log in as a clean user, or as root, and check java -version there.
If all else fails, edit /etc/profile and make sure the path to Oracle's Java bin directory is the very first thing in the PATH variable for the environment.
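That /etc/profile change might look like this (a sketch; the java-7-oracle path is an assumption based on where the oracle-java7-installer package usually puts the JDK):

```shell
# /etc/profile — config fragment; the JAVA_HOME path is an assumption
export JAVA_HOME=/usr/lib/jvm/java-7-oracle
export PATH="$JAVA_HOME/bin:$PATH"
```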
I had the same problem just now, but it's solved.
Check this symbolic link: its default target is the OpenJDK; point it at the correct JDK directory.
Good luck!
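One way to see where the java on the PATH actually ends up is to follow its symlink chain; a small sketch (the helper name is made up for illustration):

```shell
# Resolve the real file behind a command on the PATH by following symlinks.
resolve_cmd() {
  readlink -f "$(command -v "$1")"
}
```

On the affected machine, resolve_cmd java shows whether the chain still ends in an OpenJDK directory instead of the intended JDK.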