Question 9 of 15 · Application · Hard
DynamoDB - Backup and Recovery
You need to import a large dataset from S3 into DynamoDB and want to minimize downtime. Which approach helps achieve this?
A. Manually write a script to insert items one by one
B. Use the DynamoDB import from S3 feature, which loads data asynchronously
C. Export data from DynamoDB first, then import it back
D. Pause all writes to the table during the import
Step-by-Step Solution
  1. Understand import behavior: DynamoDB import from S3 runs asynchronously, so loading the data does not block table access or require downtime.

  2. Compare the alternatives: manual item-by-item scripts are slow for large datasets, pausing all writes causes exactly the downtime you want to avoid, and exporting then re-importing solves a different problem.

  3. Final Answer: use the DynamoDB import from S3 feature, which loads data asynchronously -> Option B.

  4. Quick Check: asynchronous import minimizes downtime. ✓
Quick Trick: DynamoDB import from S3 runs asynchronously, keeping your application available while the data loads. ✓
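The approach in Option B can be sketched with the boto3 `import_table` API. This is an illustrative sketch only: the bucket name, key prefix, table name, and single-partition-key schema below are placeholder assumptions, not part of the question.

```python
def build_import_request(bucket: str, prefix: str, table_name: str) -> dict:
    """Build the kwargs for DynamoDB's import_table call (S3 -> new table).

    Placeholder schema: a single string partition key named "pk".
    """
    return {
        "S3BucketSource": {"S3Bucket": bucket, "S3KeyPrefix": prefix},
        "InputFormat": "DYNAMODB_JSON",  # CSV and ION are also supported
        "TableCreationParameters": {
            "TableName": table_name,
            "AttributeDefinitions": [
                {"AttributeName": "pk", "AttributeType": "S"}
            ],
            "KeySchema": [{"AttributeName": "pk", "KeyType": "HASH"}],
            "BillingMode": "PAY_PER_REQUEST",
        },
    }


if __name__ == "__main__":
    import boto3  # requires AWS credentials; kept out of the pure helper above

    client = boto3.client("dynamodb")
    # import_table returns immediately; DynamoDB loads the data in the
    # background, so the application keeps running with no downtime.
    resp = client.import_table(
        **build_import_request("my-bucket", "data/", "ImportedTable")
    )
    print(resp["ImportTableDescription"]["ImportStatus"])
```

Note that import from S3 loads into a newly created table rather than an existing one, and the running job can be polled with `describe_import`.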
Common Mistakes:
  • Thinking import blocks table access
  • Using slow manual inserts for large data
