Hard · Application · Question 15 of 15
Hadoop - Modern Data Architecture with Hadoop
You want to migrate a large Hadoop dataset to a cloud-native platform but want to minimize downtime. Which approach is best?
A. Use incremental data replication with tools like DistCp to sync data continuously
B. Copy all data once during a maintenance window and stop all jobs
C. Manually upload data files via FTP to cloud storage
D. Run Hadoop jobs locally without migration
Step-by-Step Solution:
  1. Understand downtime minimization in migration

    Minimizing downtime requires continuous syncing rather than a one-time full copy during a maintenance window.
  2. Identify the best tool for incremental replication

    DistCp supports incremental copying of Hadoop data to cloud storage: after an initial bulk copy, repeated update passes transfer only changed files, enabling near-zero-downtime cutover.
  3. Final Answer:

    Use incremental data replication with tools like DistCp to sync data continuously -> Option A
  4. Quick Check:

    Incremental sync = less downtime [OK]
Quick Trick: Incremental sync tools reduce downtime [OK]
Common Mistakes:
  • Doing a one-time full copy, which forces a long maintenance window
  • Using manual FTP uploads, which do not scale to big-data volumes
  • Avoiding migration entirely by continuing to run jobs locally
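The Option A workflow can be sketched with DistCp's update mode. This is a minimal sketch; the namenode address, paths, and bucket name are placeholders, and the `-update` flag (copy only files that differ between source and target) is standard DistCp behavior:

```shell
# Initial bulk copy while the cluster stays online (placeholder paths).
hadoop distcp \
  hdfs://namenode:8020/data/warehouse \
  s3a://target-bucket/warehouse

# Repeated incremental passes: -update copies only files whose size or
# checksum differs, so each sync window shrinks toward near-zero downtime.
hadoop distcp -update \
  hdfs://namenode:8020/data/warehouse \
  s3a://target-bucket/warehouse
```

Once an incremental pass completes in minutes rather than hours, jobs can be paused briefly for one final sync before cutting over to the cloud copy.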
