Copying large DynamoDB data sets from one service's table to another's
Overview
How to efficiently move a potentially large set of data from a DynamoDB table in one service to a table in another service.
Considerations
- The copy needs to be efficient, since the data sets can be large
- Prefer to compartmentalize resources per service, meaning only the owning service has access to its own resources, such as its DynamoDB tables
Currently
We send the source and destination table details, including primary keys etc., to a single Lambda so that it can read from one table and write to the other. We do this because sending the data itself as messages between the two services is not efficient.
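The single-Lambda approach above could be sketched as follows. This is a minimal illustration, not our actual implementation: it assumes the Lambda's role has read access on the source table and write access on the destination, and that the tables are small enough to copy within one invocation (a real job would need to checkpoint and resume across invocations).

```python
def copy_table(src, dst, page_size=100):
    """Copy every item from src to dst.

    src and dst are DynamoDB Table objects, e.g.
    boto3.resource("dynamodb").Table(name) — but any objects exposing the
    same scan()/batch_writer() interface work, which keeps this testable.

    Scan is paginated via LastEvaluatedKey; batch_writer groups puts into
    BatchWriteItem calls of up to 25 items and retries unprocessed items.
    Returns the number of items copied.
    """
    scan_kwargs = {"Limit": page_size}
    copied = 0
    while True:
        page = src.scan(**scan_kwargs)
        with dst.batch_writer() as writer:
            for item in page["Items"]:
                writer.put_item(Item=item)
                copied += 1
        last_key = page.get("LastEvaluatedKey")
        if last_key is None:
            return copied
        # Resume the scan from where the previous page stopped.
        scan_kwargs["ExclusiveStartKey"] = last_key
```

Note that this is exactly what makes the compartmentalization awkward: one Lambda needs IAM permissions on both services' tables.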
Possible improvements
A stream (e.g. Kinesis) might work for moving the data from one service (which reads) to the other (which writes). It adds the cost of the messaging/stream service, and perhaps a small cost in firing up the additional Lambda at scale, but that may be worth it to keep each service's resources private to that service.