Given a DynamoDB table Users with items having UserId as the primary key, consider this BatchGetItem request:
{
  "RequestItems": {
    "Users": {
      "Keys": [
        {"UserId": {"S": "user1"}},
        {"UserId": {"S": "user3"}}
      ]
    }
  }
}

If the table contains these items:
- UserId: user1, Name: Alice
- UserId: user2, Name: Bob
- UserId: user3, Name: Charlie
What will the response contain?
BatchGetItem returns all requested items that exist in the table.
The request asks for user1 and user3. Both exist, so both are returned; user2 was not requested, so it does not appear in the response.
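For this request, the response would have roughly the following shape (attribute values taken from the table above; ConsumedCapacity omitted, and note that DynamoDB does not guarantee the order of returned items):

```json
{
  "Responses": {
    "Users": [
      {"UserId": {"S": "user1"}, "Name": {"S": "Alice"}},
      {"UserId": {"S": "user3"}, "Name": {"S": "Charlie"}}
    ]
  },
  "UnprocessedKeys": {}
}
```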
Choose the correct statement about DynamoDB's BatchGetItem operation.
Think about how BatchGetItem handles multiple tables.
BatchGetItem can retrieve items from multiple tables in a single request: each table gets its own entry under RequestItems, with its own list of keys.
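A sketch of a multi-table request, assuming hypothetical Users and Orders tables keyed by UserId and OrderId:

```json
{
  "RequestItems": {
    "Users": {
      "Keys": [{"UserId": {"S": "user1"}}]
    },
    "Orders": {
      "Keys": [{"OrderId": {"S": "o1"}}]
    }
  }
}
```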
Identify the syntactically valid BatchGetItem JSON request.
Check the structure of the Keys attribute.
The Keys attribute must be an array of key objects. Option D correctly uses an array under Keys.
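Since the answer options are not reproduced here, a minimal valid shape (with a hypothetical key value) looks like this; the essential point is that Keys is an array of key maps:

```json
{
  "RequestItems": {
    "Users": {
      "Keys": [
        {"UserId": {"S": "user1"}}
      ]
    }
  }
}
```

A request that nests a single key object directly under Keys, rather than wrapping it in an array, is syntactically invalid and will be rejected.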
You need to retrieve 150 items from a DynamoDB table using BatchGetItem. What is the best approach to avoid request size limits?
Remember DynamoDB limits BatchGetItem to 100 items per request.
DynamoDB limits a single BatchGetItem request to 100 items (and 16 MB of retrieved data). Splitting the 150 keys into batches, e.g. 100 and 50, respects the limit and avoids validation errors.
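A minimal sketch of the batching step in Python. The key dicts are hypothetical; each resulting batch would then be sent as its own BatchGetItem request:

```python
def chunk_keys(keys, batch_size=100):
    """Split a list of key dicts into batches no larger than batch_size."""
    return [keys[i:i + batch_size] for i in range(0, len(keys), batch_size)]

# 150 hypothetical keys -> two batches of 100 and 50
keys = [{"UserId": {"S": f"user{i}"}} for i in range(150)]
batches = chunk_keys(keys)
print([len(b) for b in batches])  # -> [100, 50]
```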
Consider this BatchGetItem request to table Orders:
{
  "RequestItems": {
    "Orders": {
      "Keys": [
        {"OrderId": {"S": "o1"}},
        {"OrderId": {"S": "o2"}},
        {"OrderId": {"S": "o3"}}
      ],
      "ConsistentRead": true
    }
  }
}

The response contains UnprocessedKeys with some keys. What is the most likely reason?
Think about DynamoDB limits and throttling behavior.
When BatchGetItem exceeds the table's provisioned throughput, DynamoDB returns the keys it could not read in UnprocessedKeys so the caller can retry them, ideally with exponential backoff. This is normal behavior under throttling, not an error.
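The retry loop can be sketched as follows. This is a minimal illustration, not a production implementation: FakeClient is a stand-in for a real DynamoDB client (e.g. boto3's `batch_get_item`, which uses the same RequestItems/Responses/UnprocessedKeys shapes) so the example runs without AWS access:

```python
import time

def get_with_retries(client, request_items, max_retries=5, base_delay=0.05):
    """Call BatchGetItem until UnprocessedKeys is empty, with exponential backoff."""
    responses = {}
    for attempt in range(max_retries + 1):
        resp = client.batch_get_item(RequestItems=request_items)
        for table, items in resp.get("Responses", {}).items():
            responses.setdefault(table, []).extend(items)
        request_items = resp.get("UnprocessedKeys", {})
        if not request_items:
            return responses
        time.sleep(base_delay * (2 ** attempt))  # back off before retrying
    raise RuntimeError("keys still unprocessed after retries")

class FakeClient:
    """Simulated client: throttles the first call, leaving some keys unprocessed."""
    def __init__(self):
        self.calls = 0

    def batch_get_item(self, RequestItems):
        self.calls += 1
        keys = RequestItems["Orders"]["Keys"]
        if self.calls == 1 and len(keys) > 1:
            return {"Responses": {"Orders": [keys[0]]},
                    "UnprocessedKeys": {"Orders": {"Keys": keys[1:]}}}
        return {"Responses": {"Orders": keys}, "UnprocessedKeys": {}}
```

With the Orders request above, the first simulated call returns o1 plus two unprocessed keys; the retry fetches o2 and o3, and the loop exits once UnprocessedKeys is empty.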