What happens in a MapReduce job when you set the number of reducers to one?

A.
A single reducer gathers and processes all the output from all the mappers. The output is
written in as many separate files as there are mappers.
B.
A single reducer gathers and processes all the output from all the mappers. The output is
written to a single file in HDFS.
C.
Setting the number of reducers to one creates a processing bottleneck, and since the number
of reducers as specified by the programmer is used as a reference value only, the MapReduce
runtime provides a default setting for the number of reducers.
D.
Setting the number of reducers to one is invalid, and an exception is thrown.
Answer: B
A single reducer gathers and processes all the output from all the mappers. The output is written to a single file in HDFS.
Explanation: setting the number of reducers to one (e.g. via job.setNumReduceTasks(1)) is perfectly valid. Every mapper's output is shuffled to that single reduce task, which writes one output file (part-r-00000) in the job's HDFS output directory. This does create a potential bottleneck for large datasets, but the runtime honors the setting rather than overriding it.
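To make the data flow concrete, here is a minimal Python simulation of the single-reducer case (an illustrative sketch, not Hadoop itself): two mappers emit key-value pairs, and one reducer receives every pair and produces a single combined output.

```python
from collections import defaultdict

def mapper(text):
    # Emit (word, 1) pairs, as in the classic word-count example.
    for word in text.split():
        yield word, 1

def single_reducer(all_pairs):
    # With one reducer, *every* key from *every* mapper lands here.
    counts = defaultdict(int)
    for key, value in all_pairs:
        counts[key] += value
    return dict(counts)

# Two "input splits", each processed by its own mapper.
splits = ["hello world", "hello hdfs"]
shuffled = [pair for split in splits for pair in mapper(split)]

# The lone reducer produces one combined result -- analogous to the
# single part-r-00000 file Hadoop writes in the job's HDFS output dir.
result = single_reducer(shuffled)
print(result)  # {'hello': 2, 'world': 1, 'hdfs': 1}
```

Note that the merge into a single result happens only because there is exactly one reducer; with N reducers, keys would be partitioned across N tasks and Hadoop would write N output files (part-r-00000 through part-r-0000(N-1)).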