Q. Nodes in the config _____________ must be completed successfully.
a. oozie.wid.rerun.skip.nodes
b. oozie.wf.rerun.skip.nodes
c. oozie.wf.run.skip.nodes
d. all of the mentioned
Posted under Hadoop
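For context, this question concerns Oozie's workflow rerun mechanism, where a skip-nodes property lists actions from a previous run that should not be re-executed. A minimal sketch of a rerun, assuming a real Oozie installation; the job id, node names, and file name here are illustrative, not from the source:

```
# rerun.properties (node names are illustrative)
# Nodes listed here are skipped on rerun and must have completed
# successfully in the previous run of the workflow job.
oozie.wf.rerun.skip.nodes=first-mr-node,pig-node

# Rerun the workflow job with the skip list applied:
oozie job -rerun 0000001-170101000000000-oozie-W -config rerun.properties
```

Alternatively, Oozie also accepts `oozie.wf.rerun.failnodes=true` to rerun only the failed nodes instead of naming skipped ones explicitly.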
Similar Questions
Q. _____________ will skip the nodes given in the config with the same exit transition as before.
Q. ________ nodes control the start and end of the workflow and the workflow job execution path.
Q. Node names and transitions must conform to the pattern [a-zA-Z][\-_a-zA-Z0-9]*, and can be up to __________ characters long.
Q. A workflow definition must have one ________ node.
Q. If one or more actions started by the workflow job are executing when the ________ node is reached, the actions will be killed.
Q. A ___________ node enables a workflow to make a selection on the execution path to follow.
Q. Which of the following can be seen as a switch-case statement?
Q. All decision nodes must have a _____________ element to avoid bringing the workflow into an error state if none of the predicates evaluates to true.
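The decision-node questions above correspond to concrete workflow XML. A hedged sketch, with illustrative node names and an assumed workflow parameter `outputDir`, of a decision node acting like a switch-case statement, including the mandatory default element:

```xml
<!-- Illustrative Oozie decision node: cases are evaluated in order and
     the first predicate that evaluates to true wins; the default
     transition fires if no predicate evaluates to true, keeping the
     workflow out of an error state. -->
<decision name="check-output">
  <switch>
    <case to="process-data">${fs:exists(outputDir)}</case>
    <default to="end"/>
  </switch>
</decision>
```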
Q. The ___________ attribute in the join node is the name of the workflow join node.
Q. If a computation/processing task triggered by a workflow fails to complete successfully, it transitions to _____________
Q. If the failure is of ___________ nature, Oozie will suspend the workflow job.
Q. A _______________ action can be configured to perform file system cleanup and directory creation before starting the mapreduce job.
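Such cleanup is declared in the action's prepare block, which Oozie executes before launching the job. A sketch with illustrative paths and node names, assuming `jobTracker` and `nameNode` are defined in the job properties:

```xml
<!-- Illustrative map-reduce action: the prepare block deletes the old
     output directory and creates a temp directory before the job runs,
     making reruns idempotent. -->
<action name="mr-node">
  <map-reduce>
    <job-tracker>${jobTracker}</job-tracker>
    <name-node>${nameNode}</name-node>
    <prepare>
      <delete path="${nameNode}/user/${wf:user()}/output"/>
      <mkdir path="${nameNode}/user/${wf:user()}/tmp"/>
    </prepare>
  </map-reduce>
  <ok to="end"/>
  <error to="fail"/>
</action>
```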
Q. ___________ properties can be overridden by specifying them in the job-xml file or configuration element.
Q. A collection of various actions in a control dependency DAG is referred to as ________________
Q. The ability of Hadoop to efficiently process large volumes of data in parallel is called __________ processing.
Q. __________ is used for simplified Data Management in Hadoop.
Q. Falcon provides ___________ workflow for copying data from source to target.
Q. A recurring workflow is used for purging expired data on __________ cluster.
Q. Falcon provides the key services that data processing applications need, so sophisticated ________ can easily be added to Hadoop applications.
Q. Falcon promotes decoupling of data set location from ___________ definition.