
Hard · Application · Q9 of 15
Hadoop - Cluster Administration
A Hadoop cluster has 20 nodes with 64 GB RAM each. You want to run a memory-intensive job requiring 1.5 TB RAM total. How many nodes should you allocate to this job?
A. 23 nodes
B. 24 nodes
C. 20 nodes
D. 25 nodes
Step-by-Step Solution
  1. Step 1: Convert the required RAM to GB

    1.5 TB = 1.5 × 1024 = 1536 GB of RAM required.
  2. Step 2: Calculate the nodes needed

    Each node has 64 GB of RAM, so nodes = 1536 / 64 = 24 nodes exactly.
  3. Step 3: Add a safety buffer

    Adding one buffer node for stability gives 24 + 1 = 25 nodes.
  4. Step 4: Check against cluster capacity

    The cluster has only 20 nodes (20 × 64 = 1280 GB total), so 25 nodes cannot be allocated; the maximum possible allocation is all 20 nodes.
  5. Final Answer:

    20 nodes -> Option C
  6. Quick Check:

    RAM needed / RAM per node = nodes needed, then cap at the cluster's node count. [OK]
Quick Trick: Divide total RAM by per-node RAM, add a buffer, then cap at the cluster size. [OK]
Common Mistakes:
  • Not converting TB to GB
  • Ignoring cluster node limit
  • Not adding buffer nodes
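The allocation arithmetic above can be sketched as a small function. This is an illustrative sketch only; the function name and `buffer_nodes` parameter are assumptions, not part of any Hadoop API.

```python
import math

# Hypothetical helper mirroring the quiz's allocation steps.
def nodes_to_allocate(total_ram_tb, ram_per_node_gb, cluster_nodes, buffer_nodes=1):
    required_gb = total_ram_tb * 1024                     # Step 1: 1.5 TB -> 1536 GB
    ideal = math.ceil(required_gb / ram_per_node_gb)      # Step 2: 1536 / 64 = 24
    with_buffer = ideal + buffer_nodes                    # Step 3: 24 + 1 = 25
    return min(with_buffer, cluster_nodes)                # Step 4: cap at cluster size

print(nodes_to_allocate(1.5, 64, 20))  # -> 20 (25 requested, capped at 20 nodes)
```

Note that `min(..., cluster_nodes)` is what forces the final answer to 20: the cluster physically cannot supply the 25 nodes the buffered estimate asks for.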
