
Batch limits and retries in DynamoDB

Introduction
Batch limits and retries help you send or receive many items efficiently without overloading the system or losing data.
When you want to read or write many items at once to save time.
When you need to handle limits set by DynamoDB on batch sizes.
When some requests fail and you want to try them again automatically.
When you want to avoid errors caused by sending too many items in one batch.
Syntax
DynamoDB
BatchWriteItemRequest {
  RequestItems: {
    TableName: [
      { PutRequest: { Item: {...} } },
      { DeleteRequest: { Key: {...} } }
    ]
  },
  ReturnConsumedCapacity: 'TOTAL' | 'NONE',
  ReturnItemCollectionMetrics: 'SIZE' | 'NONE'
}

BatchGetItemRequest {
  RequestItems: {
    TableName: {
      Keys: [ { KeyAttribute: value }, ... ],
      ProjectionExpression: 'attribute1, attribute2'
    }
  },
  ReturnConsumedCapacity: 'TOTAL' | 'NONE'
}
DynamoDB limits batch write requests to 25 items (and 16 MB of data) per batch.
Batch get requests can retrieve up to 100 items (or 16 MB of data) per batch.
Items that cannot be processed (for example, because of throttling) are returned as unprocessed and should be retried.
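Because of these limits, a common pattern is to split the item list client-side before calling the API. A minimal sketch (the helper name `chunk` is our own, not part of any SDK):

```python
def chunk(items, size):
    """Split a list into consecutive batches of at most `size` items."""
    return [items[i:i + size] for i in range(0, len(items), size)]

# BatchWriteItem accepts at most 25 items, BatchGetItem at most 100.
write_batches = chunk(list(range(60)), 25)  # 3 batches of sizes 25, 25, 10
```

The same helper works for batch gets by passing `size=100`.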
Examples
Batch write that adds 2 new books and deletes 1 existing book.
DynamoDB
BatchWriteItemRequest with 3 items:
{
  RequestItems: {
    'Books': [
      { PutRequest: { Item: { 'ISBN': '123', 'Title': 'Book A' } } },
      { PutRequest: { Item: { 'ISBN': '456', 'Title': 'Book B' } } },
      { DeleteRequest: { Key: { 'ISBN': '789' } } }
    ]
  }
}
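In Python, the same request can be built as a plain dictionary. Note that the low-level boto3 client requires DynamoDB's typed attribute format (`{'S': ...}` for strings); the `Books` table and ISBN values below are just the example's assumptions:

```python
def build_books_batch_write():
    """Build the RequestItems payload for the write/delete example above,
    using DynamoDB's typed attribute format."""
    return {
        'Books': [
            {'PutRequest': {'Item': {'ISBN': {'S': '123'}, 'Title': {'S': 'Book A'}}}},
            {'PutRequest': {'Item': {'ISBN': {'S': '456'}, 'Title': {'S': 'Book B'}}}},
            {'DeleteRequest': {'Key': {'ISBN': {'S': '789'}}}},
        ]
    }

# With boto3, this payload would be sent as:
# boto3.client('dynamodb').batch_write_item(RequestItems=build_books_batch_write())
```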
Batch get to retrieve title and author of 2 books.
DynamoDB
BatchGetItemRequest with 2 keys:
{
  RequestItems: {
    'Books': {
      Keys: [ { 'ISBN': '123' }, { 'ISBN': '456' } ],
      ProjectionExpression: 'Title, Author'
    }
  }
}
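Batch gets can also come back partially fulfilled: any keys DynamoDB could not process are returned under `UnprocessedKeys` and must be resubmitted. A sketch of that retry loop, exercised here with a fake fetcher (so it runs without AWS credentials; `fake_fetch` stands in for `client.batch_get_item`):

```python
def batch_get_with_retries(fetch, request_items):
    """Call `fetch` (standing in for client.batch_get_item) until no
    UnprocessedKeys remain, accumulating all Responses per table."""
    results = {}
    while request_items:
        resp = fetch(RequestItems=request_items)
        for table, items in resp.get('Responses', {}).items():
            results.setdefault(table, []).extend(items)
        request_items = resp.get('UnprocessedKeys', {})
    return results

# Fake fetcher that leaves one key unprocessed on the first call,
# the way a throttled BatchGetItem would.
calls = []
def fake_fetch(RequestItems):
    calls.append(RequestItems)
    if len(calls) == 1:
        return {'Responses': {'Books': [{'ISBN': {'S': '123'}, 'Title': {'S': 'Book A'}}]},
                'UnprocessedKeys': {'Books': {'Keys': [{'ISBN': {'S': '456'}}]}}}
    return {'Responses': {'Books': [{'ISBN': {'S': '456'}, 'Title': {'S': 'Book B'}}]},
            'UnprocessedKeys': {}}

results = batch_get_with_retries(
    fake_fetch,
    {'Books': {'Keys': [{'ISBN': {'S': '123'}}, {'ISBN': {'S': '456'}}]}})
```

With the real client, `fetch` would be `client.batch_get_item` and the loop shape stays the same.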
Batch write with an empty request list is rejected: DynamoDB returns a ValidationException, because each table entry in RequestItems must contain at least one request.
DynamoDB
Empty BatchWriteItemRequest (invalid):
{
  RequestItems: {
    'Books': []
  }
}
DynamoDB limits batch writes to 25 items per request.
DynamoDB
BatchWriteItemRequest with 26 items (exceeds limit):
// DynamoDB rejects this with a ValidationException; split it into two batches (25 + 1).
Sample Program
This program writes up to 25 items at a time to a DynamoDB table named 'Books'. If some items are not processed, it retries them until all are written.
Python
import time

import boto3

# Create a low-level DynamoDB client
client = boto3.client('dynamodb')

def batch_write_with_retries(table_name, items):
    max_batch_size = 25  # BatchWriteItem accepts at most 25 items per call

    # Split items into batches of 25
    for i in range(0, len(items), max_batch_size):
        batch = items[i:i + max_batch_size]
        request_items = {table_name: [{'PutRequest': {'Item': item}} for item in batch]}

        delay = 0.1
        while True:
            response = client.batch_write_item(RequestItems=request_items)
            unprocessed = response.get('UnprocessedItems', {})

            if not unprocessed.get(table_name):
                break  # All items in this batch were processed

            # Retry only the unprocessed items, backing off between attempts
            request_items = unprocessed
            time.sleep(delay)
            delay = min(delay * 2, 5.0)

    print(f"All items written to {table_name} with retries if needed.")

# Example items to write
items_to_write = [
    {'ISBN': {'S': '001'}, 'Title': {'S': 'Book One'}},
    {'ISBN': {'S': '002'}, 'Title': {'S': 'Book Two'}},
    {'ISBN': {'S': '003'}, 'Title': {'S': 'Book Three'}}
]

print("Before batch write")
batch_write_with_retries('Books', items_to_write)
print("After batch write")
Output
Before batch write
All items written to Books with retries if needed.
After batch write
Important Notes
Batch write operations are limited to 25 items and 16 MB per request; batch get operations are limited to 100 items and 16 MB per request.
Unprocessed items are common when throughput limits are exceeded; always check and retry them.
Retries should have a delay or backoff in real applications to avoid throttling.
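One way to add that delay is exponential backoff with full jitter, a pattern similar to what AWS SDKs apply internally (the helper name `backoff_delays` and its defaults are our own):

```python
import random

def backoff_delays(max_retries=5, base=0.1, cap=5.0):
    """Yield jittered, exponentially growing sleep times: each delay is
    drawn uniformly from [0, min(cap, base * 2**attempt)]."""
    for attempt in range(max_retries):
        yield random.uniform(0, min(cap, base * 2 ** attempt))

# In a retry loop, instead of retrying immediately:
# for delay in backoff_delays():
#     time.sleep(delay)
#     ... resubmit unprocessed items, break when none remain ...
```

Jitter spreads retries from concurrent clients over time so they do not all hit the table again in the same instant.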
Summary
Batch limits prevent sending too many items at once to DynamoDB.
Always check for unprocessed items and retry them to ensure data is saved.
Use batch operations to improve efficiency when handling many items.