
Medium · Debug · Q6 of 15
Hadoop - Performance Tuning
A Hadoop job is failing frequently with 'GC overhead limit exceeded' errors. What tuning action can fix this?
A. Increase the Java heap size for map and reduce tasks
B. Decrease the number of reducers to zero
C. Reduce the input split size drastically
D. Disable speculative execution
Step-by-Step Solution
  1. Step 1: Understand the 'GC overhead limit exceeded' error

    The JVM raises this error when it spends an excessive share of its time in garbage collection while reclaiming very little heap memory — a symptom of an undersized heap for the task's workload.
  2. Step 2: Identify the tuning that relieves memory pressure

    Increasing the Java heap size for map and reduce tasks gives each task more working memory, so garbage collection runs less often and the error disappears.
  3. Final Answer:

    Increase the Java heap size for map and reduce tasks -> Option A
  4. Quick Check:

    Larger heap = less frequent GC = no GC overhead error ✓
Quick Trick: Increase heap size to reduce GC overhead errors ✓
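In practice, the fix from the solution above is a configuration change. On Hadoop 2.x+ with YARN, task heap size is set through the `mapreduce.*.java.opts` properties, alongside the matching YARN container sizes. A minimal sketch for `mapred-site.xml` — the values below are illustrative, not recommendations; size them to your cluster:

```xml
<!-- mapred-site.xml: illustrative values only -->
<property>
  <name>mapreduce.map.memory.mb</name>
  <value>4096</value> <!-- YARN container size for map tasks -->
</property>
<property>
  <name>mapreduce.map.java.opts</name>
  <value>-Xmx3276m</value> <!-- JVM heap, roughly 80% of the container -->
</property>
<property>
  <name>mapreduce.reduce.memory.mb</name>
  <value>8192</value> <!-- YARN container size for reduce tasks -->
</property>
<property>
  <name>mapreduce.reduce.java.opts</name>
  <value>-Xmx6553m</value>
</property>
```

The same properties can be overridden per job on the command line, e.g. `hadoop jar job.jar -D mapreduce.map.java.opts=-Xmx3276m ...`, which is often preferable to a cluster-wide change when only one job is failing. Keep `-Xmx` comfortably below `memory.mb`, since the container must also hold JVM overhead beyond the heap.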
Common Mistakes:
  • Reducing the number of reducers does not address per-task memory errors
  • Changing the input split size is unrelated to GC overhead
  • Disabling speculative execution does not affect memory usage
