
Batch limits and retries in DynamoDB - Practice Problems & Coding Challenges

Challenge - 5 Problems
intermediate
BatchWriteItem request size limit

You want to write 150 items to a DynamoDB table using BatchWriteItem. What will happen if you send all 150 items in one batch request?

A. The request will succeed and write all 150 items at once.
B. The request will automatically split into multiple batches and write all items.
C. The request will partially succeed and write only the first 100 items.
D. The request will fail because the maximum batch size is 25 items.
💡 Hint

Remember the maximum number of items allowed in a single BatchWriteItem request.
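Background for this one: a single BatchWriteItem call accepts at most 25 put/delete requests, so larger workloads must be split client-side before sending. A minimal chunking sketch (function name illustrative):

```javascript
// Split an array of write requests into batches of at most 25 items,
// the per-request limit for BatchWriteItem.
function chunkRequests(requests, batchSize = 25) {
  const batches = [];
  for (let i = 0; i < requests.length; i += batchSize) {
    batches.push(requests.slice(i, i + batchSize));
  }
  return batches;
}
```

Each resulting batch can then be sent as the `RequestItems` of a separate BatchWriteItem call.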

🧠 Conceptual
intermediate
Handling unprocessed items in BatchWriteItem

When you use BatchWriteItem, some items may be returned as unprocessed. What is the best practice to handle these unprocessed items?

A. Manually retry sending the unprocessed items until they succeed.
B. Ignore unprocessed items because DynamoDB will retry them automatically.
C. Delete the unprocessed items from your source data to avoid errors.
D. Increase the batch size to include unprocessed items in the next request.
💡 Hint

Think about how DynamoDB handles unprocessed items and what your application should do.
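For reference once you've answered: a common pattern is for the application to resubmit the returned `UnprocessedItems`, backing off between attempts. A sketch assuming an AWS SDK for JavaScript v2-style client (the client, table name, and attempt limit below are illustrative):

```javascript
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// Retries UnprocessedItems until everything is written or attempts run out.
// Assumes client.batchWrite(params).promise() resolves to { UnprocessedItems }.
async function batchWriteWithRetry(client, requestItems, maxAttempts = 5) {
  let pending = requestItems;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const response = await client.batchWrite({ RequestItems: pending }).promise();
    pending = response.UnprocessedItems || {};
    if (Object.keys(pending).length === 0) return; // everything written
    await sleep(100 * 2 ** attempt); // exponential backoff before retrying
  }
  throw new Error('Unprocessed items remained after retries');
}
```

The key point is that the retry loop only resubmits what came back unprocessed, not the whole original batch.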

📝 Syntax
advanced
Correct syntax for BatchWriteItem with retries

Which of the following code snippets correctly implements retry logic for unprocessed items in a DynamoDB BatchWriteItem operation using the AWS SDK?

A. do { const response = await dynamoDB.batchWrite({ RequestItems: itemsToWrite }).promise(); itemsToWrite = response.UnprocessedItems; } while (itemsToWrite && Object.keys(itemsToWrite).length > 0);
B. const response = await dynamoDB.batchWrite({ RequestItems: itemsToWrite }).promise(); if (response.UnprocessedItems) { itemsToWrite = response.UnprocessedItems; }
C. for (let i = 0; i < 3; i++) { await dynamoDB.batchWrite({ RequestItems: itemsToWrite }).promise(); }
D. while (itemsToWrite.length > 0) { const response = await dynamoDB.batchWrite({ RequestItems: itemsToWrite }).promise(); itemsToWrite = response.UnprocessedItems; }
💡 Hint

Look for a loop that continues while unprocessed items exist.

⚙️ Optimization
advanced
Optimizing batch write throughput with retries

You want to optimize writing 1000 items to DynamoDB using BatchWriteItem. Which approach best balances throughput and retry handling?

A. Split items into batches of 100, send batches in parallel, and ignore unprocessed items to save time.
B. Split items into batches of 25, send batches sequentially, and retry unprocessed items with exponential backoff.
C. Send all 1000 items in one batch request and retry unprocessed items immediately without delay.
D. Send items one by one using PutItem to avoid unprocessed items and retries.
💡 Hint

Consider DynamoDB batch size limits and best retry practices.
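On the retry side, the backoff delay is typically computed as a capped exponential. A deterministic sketch (the base and cap values are illustrative, not SDK defaults, and production implementations usually add random jitter):

```javascript
// Capped exponential backoff: delay doubles with each attempt
// until it reaches the cap.
function backoffDelayMs(attempt, baseMs = 50, capMs = 20000) {
  return Math.min(capMs, baseMs * 2 ** attempt);
}
```

Sleeping for `backoffDelayMs(attempt)` between retries spreads the resubmitted writes out instead of hammering a throttled table.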

🔧 Debug
expert
Diagnosing persistent unprocessed items in BatchWriteItem

You notice that your BatchWriteItem requests keep returning unprocessed items even after multiple retries with exponential backoff. What is the most likely cause?

A. You are using the wrong AWS region for your DynamoDB table.
B. Your batch size exceeds the 25 item limit, causing automatic throttling.
C. Your provisioned write capacity is too low to handle the write volume.
D. Your retry logic is missing the required RequestItems parameter.
💡 Hint

Think about what causes throttling and unprocessed items despite retries.
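One illustrative way to surface this situation in application logs is to track how many items remain unprocessed on each retry: if the count barely shrinks despite exponential backoff, writes are likely exceeding the table's provisioned capacity. The threshold below is a made-up heuristic for the sketch, not an AWS recommendation:

```javascript
// Given the unprocessed-item count observed after each retry attempt,
// flag the pattern where the backlog is not draining despite backoff.
function looksCapacityLimited(unprocessedCounts) {
  if (unprocessedCounts.length < 2) return false;
  const first = unprocessedCounts[0];
  const last = unprocessedCounts[unprocessedCounts.length - 1];
  // Still unprocessed items, and the backlog shrank by less than half.
  return last > 0 && last >= first * 0.5;
}
```

A steadily draining backlog suggests transient bursts; a flat one points at sustained throttling, which CloudWatch throttling metrics for the table can confirm.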