Copying one table to another in DynamoDB

Solution 1:

Create a backup (the Backups option) and restore it with a new table name. That gets all the data into the new table. Note: this takes a considerable amount of time, depending on the table size.
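
For example, a minimal sketch with the AWS CLI (the table and backup names are placeholders; restore-table-from-backup needs the BackupArn returned by create-backup):

# create an on-demand backup of the source table
aws dynamodb create-backup \
  --table-name src_table \
  --backup-name src-table-backup

# restore it under a new table name, using the BackupArn
# returned by the command above
aws dynamodb restore-table-from-backup \
  --target-table-name dst_table \
  --backup-arn <backup-arn>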

Solution 2:

I just used the Python script dynamodb-copy-table, making sure my credentials were set in the environment variables AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY, and it worked flawlessly. It even created the destination table for me.

python dynamodb-copy-table.py src_table dst_table

The default region is us-west-2; change it with the AWS_DEFAULT_REGION environment variable.
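
A typical run might look like this (all values are placeholders):

export AWS_ACCESS_KEY_ID=<access-key-id>
export AWS_SECRET_ACCESS_KEY=<secret-access-key>
export AWS_DEFAULT_REGION=<region>
python dynamodb-copy-table.py src_table dst_table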

Solution 3:

AWS Data Pipeline provides a template that can be used for this purpose: "CrossRegion DynamoDB Copy".

See: http://docs.aws.amazon.com/datapipeline/latest/DeveloperGuide/dp-crossregion-ddb-create.html

The result is a simple pipeline.

Although it's called CrossRegion, you can just as easily use it within the same region, as long as the destination table name is different (remember that table names must be unique per account and region).

Solution 4:

You can use Scan to read the data and save it to the new table.
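
For example, a minimal sketch with the AWS CLI and jq (assuming dst_table already exists; slow, since it writes one item per call, but there is no batch size limit to worry about):

# scan the source table (the CLI paginates the scan automatically)
# and put each item into the destination, one call per item
aws dynamodb scan --table-name src_table --output json \
 | jq -c '.Items[]' \
 | while read -r item; do
     aws dynamodb put-item --table-name dst_table --item "$item"
   done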

On the AWS forums, a member of the AWS team posted another approach using EMR: How Do I Duplicate a Table?

Solution 5:

Here's one solution that copies all items from one table to another using only shell scripting, the AWS CLI and jq. Note that batch-write-item accepts at most 25 put requests per call, so as written this only works for very small tables; a chunked variant is sketched after the script.

# exit on error
set -eo pipefail

# tables
TABLE_FROM=<table>
TABLE_TO=<table>

# read
aws dynamodb scan \
  --table-name "$TABLE_FROM" \
  --output json \
 | jq "{ \"$TABLE_TO\": [ .Items[] | { PutRequest: { Item: . } } ] }" \
 > "$TABLE_TO-payload.json"

# write
aws dynamodb batch-write-item --request-items file://"$TABLE_TO-payload.json"

# clean up
rm "$TABLE_TO-payload.json"

If you want both tables to be identical, you'd want to delete all items in TABLE_TO first.
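
For larger tables, a sketch of a chunked variant under the same assumptions: split the put requests into batches of at most 25 (the batch-write-item limit) with jq, then write one batch per call. Note that batch-write-item can also return UnprocessedItems, which this sketch does not retry.

# read and split into batch-write payloads of at most 25 items,
# one JSON object per line
aws dynamodb scan \
  --table-name "$TABLE_FROM" \
  --output json \
 | jq -c --arg table "$TABLE_TO" '
     [ .Items[] | { PutRequest: { Item: . } } ] as $reqs
     | range(0; $reqs | length; 25) as $i
     | { ($table): $reqs[$i:$i+25] }' \
 > "$TABLE_TO-batches.ndjson"

# write each batch (UnprocessedItems are not retried here)
while read -r batch; do
  aws dynamodb batch-write-item --request-items "$batch"
done < "$TABLE_TO-batches.ndjson"

# clean up
rm "$TABLE_TO-batches.ndjson"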