
Batch Operations

Batch operations allow you to read or write multiple items in a single request.


POST /api/v1/databases/{databaseId}/items/batch/write

Performs batch put/delete operations. Maximum 25 items per request.

Request:

{
  "items": [
    {
      "operation": "Put",
      "pk": "user#123",
      "sk": "profile",
      "data": { "name": "John" }
    },
    {
      "operation": "Put",
      "pk": "user#456",
      "sk": "profile",
      "data": { "name": "Jane" }
    },
    {
      "operation": "Delete",
      "pk": "user#789",
      "sk": "profile"
    }
  ]
}
| Field     | Type   | Description                  |
| --------- | ------ | ---------------------------- |
| operation | string | Put or Delete                |
| pk        | string | Partition key                |
| sk        | string | Sort key (optional)          |
| data      | object | Item data (required for Put) |

Response (200 OK):

{
  "successCount": 3,
  "failedCount": 0,
  "errors": null
}

Batch writes are not atomic; some items may fail while others succeed:

{
  "successCount": 2,
  "failedCount": 1,
  "errors": [
    {
      "pk": "user#789",
      "sk": "profile",
      "error": "Condition check failed"
    }
  ]
}
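
The endpoint can also be called directly over HTTP. Below is a minimal sketch using HttpClient; the base URL, bearer-token auth scheme, and database id db-1 are placeholders, not part of this API's documented contract:

using System.Net.Http.Headers;
using System.Net.Http.Json;
using System.Text.Json;

var http = new HttpClient { BaseAddress = new Uri("https://api.example.com") };
// Auth scheme is an assumption; substitute whatever your deployment uses.
http.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", "<token>");

// Anonymous types serialize to the request shape documented above.
var body = new
{
    items = new object[]
    {
        new { operation = "Put", pk = "user#123", sk = "profile", data = new { name = "John" } },
        new { operation = "Delete", pk = "user#789", sk = "profile" }
    }
};

var response = await http.PostAsJsonAsync("/api/v1/databases/db-1/items/batch/write", body);
response.EnsureSuccessStatusCode();

var result = await response.Content.ReadFromJsonAsync<JsonElement>();
Console.WriteLine($"successCount: {result.GetProperty("successCount")}");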

POST /api/v1/databases/{databaseId}/items/batch/get

Retrieves multiple items by key. Maximum 100 keys per request.

Request:

{
  "keys": [
    { "pk": "user#123", "sk": "profile" },
    { "pk": "user#456", "sk": "profile" },
    { "pk": "user#789", "sk": "profile" }
  ]
}

Response (200 OK):

{
  "items": [
    {
      "pk": "user#123",
      "sk": "profile",
      "data": { "name": "John" },
      "createdAt": "2024-01-15T10:00:00Z",
      "updatedAt": "2024-01-15T10:30:00Z"
    },
    {
      "pk": "user#456",
      "sk": "profile",
      "data": { "name": "Jane" },
      "createdAt": "2024-01-15T11:00:00Z",
      "updatedAt": "2024-01-15T11:30:00Z"
    }
  ],
  "unprocessedKeys": null
}

If some keys couldn’t be processed (e.g., due to throughput limits), they are returned for retry:

{
  "items": [...],
  "unprocessedKeys": [
    { "pk": "user#789", "sk": "profile" }
  ]
}
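
To drain unprocessedKeys, retry in a loop with a bounded number of attempts and a short backoff. Here is a sketch using the C# SDK from the examples below; the UnprocessedKeys property on the result is an assumption mirroring the JSON field:

var pending = new List<ItemKey>
{
    new("user#123", "profile"),
    new("user#456", "profile"),
    new("user#789", "profile")
};
for (var attempt = 0; attempt < 5 && pending.Count > 0; attempt++)
{
    var result = await client.BatchGetAsync(pending);
    foreach (var item in result.Value.Items)
    {
        Console.WriteLine($"{item.PartitionKey}: {item.GetAttribute<string>("name")}");
    }
    // Assumed property: unprocessed keys surfaced just like the JSON response.
    pending = result.Value.UnprocessedKeys?.ToList() ?? new List<ItemKey>();
    if (pending.Count > 0)
        await Task.Delay(TimeSpan.FromMilliseconds(100 << attempt)); // 100, 200, 400, ... ms
}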

Limits

| Operation   | Limit                |
| ----------- | -------------------- |
| Batch Write | 25 items per request |
| Batch Get   | 100 keys per request |

Using the C# SDK, the same operations look like this:

// Batch write
var writeItems = new List<BatchWriteItem>
{
    new BatchWriteItem
    {
        Operation = BatchOperation.Put,
        PartitionKey = "user#123",
        SortKey = "profile",
        Data = new Dictionary<string, object?> { ["name"] = "John" }
    },
    new BatchWriteItem
    {
        Operation = BatchOperation.Put,
        PartitionKey = "user#456",
        SortKey = "profile",
        Data = new Dictionary<string, object?> { ["name"] = "Jane" }
    },
    new BatchWriteItem
    {
        Operation = BatchOperation.Delete,
        PartitionKey = "user#789",
        SortKey = "profile"
    }
};
var writeResult = await client.BatchWriteAsync(writeItems);
Console.WriteLine($"Success: {writeResult.Value.SuccessCount}");

// Batch get
var keys = new List<ItemKey>
{
    new("user#123", "profile"),
    new("user#456", "profile")
};
var getResult = await client.BatchGetAsync(keys);
foreach (var item in getResult.Value.Items)
{
    Console.WriteLine($"{item.PartitionKey}: {item.GetAttribute<string>("name")}");
}

If you have more than 25 items to write, split them into chunks:

var allItems = GetLargeListOfItems(); // 100 items
var chunks = allItems.Chunk(25);
foreach (var chunk in chunks)
{
    await client.BatchWriteAsync(chunk.ToList());
}
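
Each chunk is an independent request, so a failure in one chunk does not affect the others. To get a complete failure report, accumulate the per-chunk results; this sketch reuses only fields shown elsewhere on this page:

var totalFailed = 0;
foreach (var chunk in allItems.Chunk(25))
{
    var result = await client.BatchWriteAsync(chunk.ToList());
    totalFailed += result.Value.FailedCount;
    if (result.Value.FailedCount > 0)
    {
        foreach (var error in result.Value.Errors)
            Console.WriteLine($"Failed: {error.Pk}/{error.Sk}: {error.Error}");
    }
}
Console.WriteLine($"Total failed: {totalFailed}");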

Always check for and retry failed items:

var result = await client.BatchWriteAsync(items);
if (result.Value.FailedCount > 0)
{
    foreach (var error in result.Value.Errors)
    {
        Console.WriteLine($"Failed: {error.Pk}/{error.Sk}: {error.Error}");
    }
}
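
A retry should re-queue only the items whose keys appear in the error list, with backoff between attempts. Note that an error like "Condition check failed" is not transient and will fail again, so cap the attempts. A minimal sketch over the BatchWriteItem list from the earlier example:

var pending = items;
for (var attempt = 0; attempt < 3 && pending.Count > 0; attempt++)
{
    var result = await client.BatchWriteAsync(pending);
    if (result.Value.FailedCount == 0)
    {
        pending = new List<BatchWriteItem>();
        break;
    }
    // Keep only the items whose (pk, sk) showed up in the error list.
    var failedKeys = result.Value.Errors
        .Select(e => (e.Pk, e.Sk))
        .ToHashSet();
    pending = pending
        .Where(i => failedKeys.Contains((i.PartitionKey, i.SortKey)))
        .ToList();
    await Task.Delay(TimeSpan.FromMilliseconds(100 << attempt)); // 100, 200, 400 ms
}
// Anything still in pending after the loop needs manual handling.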

Instead of multiple individual gets, batch the keys into a single request:

// Inefficient: three separate requests
var item1 = await client.GetItemAsync("user#1", "profile");
var item2 = await client.GetItemAsync("user#2", "profile");
var item3 = await client.GetItemAsync("user#3", "profile");

// Efficient: a single request
var batchResult = await client.BatchGetAsync(new List<ItemKey>
{
    new("user#1", "profile"),
    new("user#2", "profile"),
    new("user#3", "profile")
});

Choosing between batch operations and transactions:

| Use case           | Batch               | Transaction         |
| ------------------ | ------------------- | ------------------- |
| Independent writes | Yes                 | No                  |
| All-or-nothing     | No                  | Yes                 |
| Maximum throughput | Yes                 | No                  |
| Conditional writes | No                  | Yes                 |
| Maximum items      | 25 write / 100 read | 25 write / 100 read |