Difficulty: Medium · Debug · Question 6 of 15
Hadoop - Performance Tuning
A Hadoop job fails with an OutOfMemoryError. The configuration is:
mapreduce.reduce.memory.mb=2048
mapreduce.reduce.java.opts=-Xmx3072m

What is the likely cause?
A. Container memory is too large
B. Java heap size is too small
C. Java heap size exceeds container memory size
D. Reduce memory property is ignored
Step-by-Step Solution
Solution:
  1. Step 1: Compare Java heap size and container memory

    Java heap size (-Xmx3072m) is larger than container memory (2048 MB).
  2. Step 2: Understand impact on job

The JVM heap (-Xmx) must fit inside the YARN container. When the heap limit exceeds the container size, the task runs out of memory (or YARN kills the container for exceeding its limit), and the job fails with an OutOfMemoryError.
  3. Final Answer:

    Java heap size exceeds container memory size -> Option C
  4. Quick Check:

    Heap size must be ≤ container memory [OK]
Quick Trick: Heap size must not exceed container memory [OK]
Common Mistakes:
  • Setting heap size larger than container memory
  • Assuming container memory is ignored
  • Confusing map and reduce properties
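A common rule of thumb is to set the JVM heap to roughly 80% of the container size, leaving headroom for off-heap memory. A minimal mapred-site.xml sketch of a consistent configuration (the 1638 MB heap value is illustrative, assuming ~80% of the 2048 MB container from the question):

```xml
<!-- Container size for reduce tasks, in MB -->
<property>
  <name>mapreduce.reduce.memory.mb</name>
  <value>2048</value>
</property>
<!-- JVM heap must fit inside the container; ~80% leaves room
     for off-heap usage (metaspace, thread stacks, I/O buffers) -->
<property>
  <name>mapreduce.reduce.java.opts</name>
  <value>-Xmx1638m</value>
</property>
```

The same relationship applies to the map side (mapreduce.map.memory.mb vs. mapreduce.map.java.opts), which is why confusing the two property families is listed as a common mistake.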
