Q. HDFS works in a __________ fashion.
a. master-worker
b. master-slave
c. worker/slave
d. all of the mentioned

Posted under Hadoop
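HDFS follows a master/worker (master/slave) design: a single NameNode holds the filesystem metadata while DataNodes store the actual blocks. The sketch below is a minimal, hypothetical illustration using the standard Hadoop FileSystem API; the file path /data/sample.txt and the cluster configuration are assumptions, not part of the question. It shows the division of roles: the client asks the NameNode for a file's block locations and learns which DataNodes hold each block.

// Minimal sketch (assumptions: a reachable HDFS cluster and an existing
// file at /data/sample.txt). The NameNode (master) answers the metadata
// query; the DataNodes (workers/slaves) listed in each BlockLocation are
// the nodes that actually hold the block data.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.BlockLocation;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ListBlockLocations {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();   // picks up core-site.xml / hdfs-site.xml
        FileSystem fs = FileSystem.get(conf);       // client handle backed by the NameNode
        Path file = new Path("/data/sample.txt");   // hypothetical file

        FileStatus status = fs.getFileStatus(file);
        BlockLocation[] blocks =
                fs.getFileBlockLocations(status, 0, status.getLen());

        for (BlockLocation block : blocks) {
            // Each block is replicated on one or more DataNodes (the workers).
            System.out.println("offset " + block.getOffset()
                    + " hosted on " + String.join(", ", block.getHosts()));
        }
        fs.close();
    }
}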
Similar Questions
Discover Related MCQs
Q. ________ NameNode is used when the Primary NameNode goes down.
Q. Which of the following scenarios may not be a good fit for HDFS?
Q. The need for data replication can arise in various scenarios like ____________
Q. ________ is the slave/worker node and holds the user data in the form of Data Blocks.
Q. HDFS provides a command line interface called __________ used to interact with HDFS.
Q. HDFS is implemented in _____________ programming language.
Q. For YARN, the ___________ Manager UI provides host and port information.
Q. For ________, the HBase Master UI provides information about the HBase Master uptime.
Q. During startup, the ___________ loads the file system state from the fsimage and the edits log file.
Q. In order to read any file in HDFS, an instance of __________ is required. (See the sketch after this list.)
Q. ______________ is a method to copy bytes from an input stream to any other stream in Hadoop.
Q. _____________ is used to read data from byte buffers.
Q. Interface ____________ reduces a set of intermediate values which share a key to a smaller set of values.
Q. The Reducer receives as input the grouped output of a ____________
Q. The output of the reduce task is typically written to the FileSystem via ____________
Q. Applications can use the _________ provided to report progress or just indicate that they are alive.
Q. Which of the following parameters is used to collect keys and combined values?
Q. ________ is a programming model designed for processing large volumes of data in parallel by dividing the work into a set of independent tasks.
Q. The daemons associated with the MapReduce phase are ________ and task-trackers.
Q. The JobTracker pushes work out to available _______ nodes in the cluster, striving to keep the work as close to the data as possible.
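Several of the questions above concern the HDFS client API: the class needed to read a file (FileSystem), the helper that copies bytes between streams (IOUtils.copyBytes), and the command-line interface. As a hedged illustration only, the sketch below shows the common read pattern; the path /data/sample.txt is a hypothetical placeholder and the 4096-byte buffer size is an arbitrary choice.

// Minimal sketch (assumption: /data/sample.txt exists in HDFS).
// FileSystem is the client entry point, and IOUtils.copyBytes copies
// bytes from one stream to another, as referenced in the questions above.
import java.io.InputStream;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class ReadHdfsFile {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);   // instance needed to read any HDFS file
        InputStream in = null;
        try {
            in = fs.open(new Path("/data/sample.txt"));   // hypothetical path
            // Copy from the HDFS input stream to stdout, 4 KB buffer, do not close streams here.
            IOUtils.copyBytes(in, System.out, 4096, false);
        } finally {
            IOUtils.closeStream(in);
            fs.close();
        }
    }
}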
Suggested Topics
Are you eager to expand your knowledge beyond Hadoop? We've curated a selection of related categories that you might find intriguing.
Click on the categories below to discover a wealth of MCQs and enrich your understanding of Computer Science. Happy exploring!