Run shell script from local directory instead of HDFS via Oozie - hadoop

I want to run a shell script from a local path (on the edge node) instead of an HDFS directory via Oozie. My local shell script contains ssh steps, which I can't run from an HDFS directory.
XYZ is the user ID and xxxx is the server (edge node). I used the action below in the workflow, but it is not working. Please help.
<action name="abc">
    <ssh xmlns="uri:oozie:ssh-action:0.1">
        <host>XYZ@xxxx</host>
        <command>/local path/</command>
    </ssh>
    <ok to="success-mail"/>
    <error to="fail-Email"/>
</action>


How to get oozie jobId in oozie workflow?

I have an Oozie workflow that invokes a shell script, and the shell script in turn invokes the driver class of a MapReduce job. Now I want to map my Oozie job ID to the MapReduce job ID for later processing. Is there any way to get the Oozie job ID in the workflow file so that I can pass it as an argument to my driver class for the mapping?
Following is my sample workflow.xml file
<workflow-app xmlns="uri:oozie:workflow:0.4" name="test">
    <start to="start-test" />
    <action name='start-test'>
        <shell xmlns="uri:oozie:shell-action:0.2">
            <!-- job-tracker, name-node and exec elided -->
            <argument>${jobId}</argument> <!-- this is how I wanted to pass the Oozie job ID -->
        </shell>
        <ok to="end" />
        <error to="kill" />
    </action>
    <kill name="kill">
        <message>test job failed</message>
    </kill>
    <end name="end" />
</workflow-app>
Following is my shell script.
hadoop jar testProject.jar testProject.MrDriver $1 $2 $3
Try to use ${wf:id()}:
String wf:id()
It returns the workflow job ID for the current workflow job.
More info in the Oozie EL functions documentation.
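In the shell action from the question, that means replacing `${jobId}` with the EL function directly (a sketch):

```xml
<!-- the script then receives the Oozie workflow ID as $1 -->
<argument>${wf:id()}</argument>
```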
Oozie drops an XML file in the CWD of the YARN container running the shell (the "launcher" container), and also sets an env variable pointing to that XML (I cannot remember the name, though).
That XML contains a lot of stuff, like the name of the workflow, the name of the action, the IDs of both, the run attempt number, etc.
So you can sed that information back out in the shell script itself.
Of course, passing the ID explicitly (as suggested by Alexei) would be cleaner, but sometimes "clean" is not the best way, especially if you are concerned about whether it's the first run or not...
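The grep-it-back approach can be sketched like this. The env variable name `OOZIE_ACTION_CONF_XML` and the property name `oozie.job.id` are assumptions; dump `env` and the XML from a test action on your own cluster to confirm both:

```shell
# Sketch: pull the workflow ID out of the action conf XML that Oozie
# materializes in the launcher container's working directory.
# OOZIE_ACTION_CONF_XML and oozie.job.id are assumed names - verify them.
extract_wf_id() {
  # $1: path to the action conf XML
  grep -A 1 '<name>oozie.job.id</name>' "$1" \
    | sed -n 's:.*<value>\([^<]*\)</value>.*:\1:p'
}

# Inside the shell action you would then call something like:
#   wf_id=$(extract_wf_id "${OOZIE_ACTION_CONF_XML:-./action.xml}")
```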

How to get oozie workflow duration at the end

Is there any way to email the duration of the workflow with the completion email? Is there such a variable that I can use?
I don't think such a variable is available, but if needed you can do it with shell actions. At the start of the workflow, run a shell script that records the start time and capture it as action output. Just before the email action at the end, run another shell script that computes current time minus start time, and use the result in the email. This does make the workflow messy, though.
This is a remarkable shortcoming of Oozie. Each of our workflows starts with a shell action that calls a simple bash script to get a timestamp.
<action name="start-time">
    <shell xmlns="uri:oozie:shell-action:0.1">
        <!-- job-tracker, name-node and exec elided -->
        <capture-output/>
    </shell>
    <ok to="the-first-actual-action"/>
    <error to="fail"/>
</action>
And this is then available via EL in the email we send on completion or error, like so:
<action name="email">
    <email xmlns="uri:oozie:email-action:0.1">
        <!-- to/cc elided -->
        <subject>COMPLETED: ${wf:name()}</subject>
        <body>
Workflow ID: ${wf:id()}
Workflow Name: ${wf:name()}
Workflow app path: ${wf:appPath()}
Start Time: ${wf:actionData('start-time')['time']}
End Time: ${timestamp()}
        </body>
    </email>
    <ok to="end"/>
    <error to="fail"/>
</action>
Getting the duration is another hoop-jumping exercise that involves passing the start and end times to a bash script.
I was investigating the Oozie SLA functionality, but I haven't found a way to extract the data.
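A minimal sketch of the two scripts, assuming `<capture-output/>` on the start-time shell action and epoch-second timestamps (the function and key names are hypothetical):

```shell
# start-time action: Oozie's <capture-output/> picks up KEY=VALUE lines,
# making the value available as wf:actionData('start-time')['time'].
emit_start_time() {
  echo "time=$(date +%s)"
}

# pre-email action: the start time is passed in as $1, e.g. via
#   <argument>${wf:actionData('start-time')['time']}</argument>
compute_duration() {
  start="$1"
  end=$(date +%s)
  echo "duration=$(( end - start ))s"
}
```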

How to make a Hue Oozie workflow run a Java job which has a config file?

I have a buildModel.jar and a folder "conf" which contains a configuration file.
The command line to run it looks like this:
hadoop jar /home/user1/buildModel.jar -t fp-purchased-products -i hdfs://Hadoop238:8020/user/user2/recommend_data/bought_together
After doing some analysis, it uses the DB information in the "" file to store data in a MongoDB.
Now I need to run it with a Hue Oozie workflow, so I used Hue to upload the jar file and the "conf" folder to HDFS, then created a workflow. I also added the "" file in the workflow.
This is the workflow.xml
<workflow-app name="test_service" xmlns="uri:oozie:workflow:0.4">
    <start to="run_java_file"/>
    <action name="run_java_file">
        <java>
            <!-- java action body generated by Hue (main-class etc.) elided -->
        </java>
        <ok to="end"/>
        <error to="kill"/>
    </action>
    <kill name="kill">
        <message>Action failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name="end"/>
</workflow-app>
And this is the workflow-metadata.json
{"attributes": {"deployment_dir": "/user/hue/oozie/workspaces/_user2_-oozie-31-1416890719.12", "description": ""}, "nodes": {"run_java_file": {"attributes": {"jar_path": "/user/user2/service/build_model/buildModel.jar"}}}, "version": "0.0.1"}
After doing the analysis, it got an error when saving data to MongoDB. It seems the Java code can't see the config file.
Can anyone guide me on how to use a Hue Oozie workflow to run a Java job that has a config file?
Sorry for the late answer.
As Romain explained above, Hue copies the config file into the same directory as buildModel.jar (the task's working directory). So I changed the code to make buildModel.jar read the config file from the same directory. It worked!
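The fix boils down to opening the file by its bare name, because Oozie localizes every file added to the action into the task's working directory. A sketch in shell (the file name `config.properties` and the key `db.host` are hypothetical):

```shell
# Read a value from a properties file sitting in the current working
# directory - the same directory Hue/Oozie copies added files into.
read_prop() {
  # $1: properties file name (relative - no absolute path needed)
  # $2: property key
  sed -n "s/^$2=//p" "$1"
}

# Usage inside the job's working directory:
#   read_prop config.properties db.host
```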

FNF: Not able to execute

Trying to run an Oozie workflow, but I keep getting the following error message:
org.apache.oozie.action.ActionExecutorException: FNF: Not able to execute on username#servername | ErrorStream: *********************************************************************
This machine is the property of xyz....
(Note: I've set up passphrase-less access. If I run the steps manually it works, but when I run them through Oozie it doesn't. In other words, I can log in to the machine as user 'oozie', then ssh username#servername (without entering a password) and then run the command. This works, but the Oozie workflow doesn't.)
Here's my workflow.xml
<workflow-app name="my app" xmlns="uri:oozie:workflow:0.2">
    <start to="sshAction"/>
    <action name="sshAction">
        <ssh xmlns="uri:oozie:ssh-action:0.1">
            <host>username@servername</host>
            <command>cd /export/home/user/test/bin;./ --arg value</command>
            <capture-output/>
        </ssh>
        <ok to="sendEmail"/>
        <error to="sendEmail"/>
    </action>
    <action name="sendEmail">
        <email xmlns="uri:oozie:email-action:0.1">
            <!-- to elided -->
            <subject>Output of workflow ${wf:id()}</subject>
            <body>Status of the file move: ${wf:actionData('sshAction')['STATUS']}</body>
        </email>
        <ok to="end"/>
        <error to="end"/>
    </action>
    <end name="end"/>
</workflow-app>
Figured out what was wrong by looking at the code. FNF stands for 'File Not Found'. It appears the ssh action doesn't handle semicolon-separated commands such as this:
cd /export/home/user/test/bin;./ --arg value
Here's what I did:
1) Changed the command to:
./ --arg value
2) Copied the script to the root directory of the user.
3) Added cd /export/home/user/test/bin to the beginning of the script.
It's working now!
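The three steps above amount to a wrapper pattern: keep the `cd` inside the script so the ssh action only has to invoke a single command with plain arguments. A sketch (paths and names are illustrative):

```shell
# Generate a wrapper script that changes into the job directory itself,
# so the ssh action's <command> is just the wrapper name plus arguments.
make_wrapper() {
  # $1: wrapper path to create; $2: directory the job must run from
  cat > "$1" <<EOF
#!/bin/sh
cd "$2" || exit 1
echo "running from: \$(pwd)"
EOF
  chmod +x "$1"
}
```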

oozie ssh action takes long time to complete

I tried running an ssh action workflow job in Oozie with the following action code. Passwordless ssh was configured:
<action name="sshaction">
    <ssh xmlns="uri:oozie:ssh-action:0.1">
        <!-- host and command elided -->
    </ssh>
    <ok to="WordCount" />
    <error to="fail" />
</action>
<action name="WordCount">
    <map-reduce>
        <prepare>
            <delete path="${nameNode}/user/510600/output/" />
        </prepare>
        <!-- job-tracker, name-node and configuration elided -->
    </map-reduce>
    <ok to="end" />
    <error to="fail" />
</action>
The problem I encountered with the above code is that the Oozie ssh action takes a long time to complete, even with a two-line shell script, while the other action runs very fast.
Of the two actions above, sshaction took 12 minutes to complete while WordCount took only 15 seconds.
My shell script is /home/510600/HADOOP_ECO/CDH4/oozietest/ and contains:
rm -rf /home/510600/abc.log
Can anyone explain why the Oozie ssh action takes such a long time to run?
If everything works fine except sending the status from the shell script back to the Oozie web server, I think the issue is curl.
The Linux utility curl should be present on the remote machine, because the Oozie server internally uses two bash scripts to execute commands on the remote machine, and one of them uses curl to send the status back to the Oozie server by invoking the Oozie web service.
Sometimes this can also happen because of configuration or authentication issues.
Did you try executing the script without Oozie? How long does it take to complete?
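A quick check to run on the remote host. The claim that a missing curl delays the action comes from the explanation above; the exact timeout behaviour may differ between Oozie versions:

```shell
# Verify that curl - which Oozie's ssh helper script needs to report the
# action status back to the Oozie web service - exists on this machine.
check_curl() {
  if command -v curl >/dev/null 2>&1; then
    echo "curl: present"
  else
    echo "curl: missing - install it, or ssh actions may hang before failing"
  fi
}
```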