It looks simple, right? It is, but there is some setup to do before you can actually perform the copy:
- Create an IAM policy that gives RDS read/write/list permissions to the S3 bucket
- Create an IAM role that gives RDS access to the S3 bucket
- Associate the IAM role to the DB instance
- Create a new Option Group or associate the S3_INTEGRATION option to an existing one
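The policy in the first step could look something like the sketch below. This is a minimal example, not the exact policy from AWS: it assumes the bucket is the mybucket used in the command further down, and grants the read/write/list actions the integration needs.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "RdsS3Integration",
      "Effect": "Allow",
      "Action": [
        "s3:GetObject",
        "s3:PutObject",
        "s3:ListBucket"
      ],
      "Resource": [
        "arn:aws:s3:::mybucket",
        "arn:aws:s3:::mybucket/*"
      ]
    }
  ]
}
```

Note that `s3:ListBucket` applies to the bucket ARN itself, while the object-level actions apply to the `/*` resource.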
You can check the details on how to perform these steps here.
OK, now you can perform the data transfer by running, for example, the command below:
SELECT rdsadmin.rdsadmin_s3_tasks.upload_to_s3(
    p_bucket_name    => 'mybucket',
    p_prefix         => '',
    p_s3_prefix      => '',
    p_directory_name => 'DATA_PUMP_DIR'
) AS TASK_ID FROM DUAL;
It will copy all files in the DATA_PUMP_DIR directory to the S3 bucket mybucket.
The command returns a task ID that is useful for monitoring the transfer status.
You can monitor the transfer job through RDS Events in the AWS Console or with the query below.
SELECT text FROM table(rdsadmin.rds_file_util.read_text_file('BDUMP','dbtask-<task_id>.log'));
Or check the RDS Events at the AWS Console.