Export/Import Data on Amazon Oracle RDS using Data Pump utility

September 6, 2019 (updated May 18, 2020) · AWS, Blogs, Data

Written by Selvakumar K, Associate Tech Lead at Powerupcloud

Testing this Data Pump scenario between Oracle RDS instances had been on our plan for a long time.

Problem Statement 

On a daily basis, data needs to be restored between Oracle RDS instances, i.e., a copy of a particular production schema must be made on the dev Oracle RDS instance.

Steps to implement Backup / Restore 

1. Take a backup of the Oracle schema from the source RDS instance into DATA_PUMP_DIR as the master user

DECLARE

Bkp NUMBER;

BEGIN

Bkp := DBMS_DATAPUMP.OPEN( operation => 'EXPORT', job_mode => 'SCHEMA', job_name => null);

DBMS_DATAPUMP.ADD_FILE( handle => Bkp, filename => 'rdsdumpsrc1.dmp', directory => 'DATA_PUMP_DIR', filetype => dbms_datapump.ku$_file_type_dump_file, reusefile => 1);

DBMS_DATAPUMP.ADD_FILE( handle => Bkp, filename => 'rdsdumpsrc1.log', directory => 'DATA_PUMP_DIR', filetype => dbms_datapump.ku$_file_type_log_file);

DBMS_DATAPUMP.METADATA_FILTER(Bkp, 'SCHEMA_EXPR', 'IN (''RDSDUMPSRC'')');

DBMS_DATAPUMP.START_JOB(Bkp);

END;

/

Source Schema Name: rdsdumpsrc

Once the backup is completed, use the query below to verify that the dump file is present in the DATA_PUMP_DIR directory:

select * from table(RDSADMIN.RDS_FILE_UTIL.LISTDIR('DATA_PUMP_DIR')) order by mtime;

2. Create a Database Link to the Target Instance

create database link to_target_transfer connect to admin identified by admin123 using

'(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=selvadump.cv7jsqukf31b.us-east-1.rds.amazonaws.com)(PORT=1521))(CONNECT_DATA=(SID=ORCL)))';

Target Instance : selvadump.cv7jsqukf31b.us-east-1.rds.amazonaws.com

Target Instance Port : 1521

Target Instance SID : ORCL

3. Transfer the Exported Dump to the Target Instance's DATA_PUMP_DIR

Using the DBMS_FILE_TRANSFER utility, copy the file over the database link to the destination directory on the target instance:

BEGIN

DBMS_FILE_TRANSFER.PUT_FILE(

source_directory_object       => 'DATA_PUMP_DIR',

source_file_name              => 'rdsdumpsrc1.dmp',

destination_directory_object  => 'DATA_PUMP_DIR',

destination_file_name         => 'rdsdumpsrc_copied1.dmp',

destination_database          => 'to_target_transfer'

);

END;

/

Using the LISTDIR query from step 1 (run against the target instance), verify that the file has been copied.

4. Import the schema into the target instance 

Restore the source schema into a different schema using DBMS_DATAPUMP.METADATA_REMAP. If required, other Data Pump options can be used to restore into a different tablespace as well.

DECLARE

restre NUMBER;

BEGIN

restre := DBMS_DATAPUMP.OPEN( operation => 'IMPORT', job_mode => 'SCHEMA', job_name => null);

DBMS_DATAPUMP.ADD_FILE( handle => restre, filename => 'rdsdumpsrc_copied1.dmp', directory => 'DATA_PUMP_DIR', filetype => dbms_datapump.ku$_file_type_dump_file);

DBMS_DATAPUMP.METADATA_REMAP(restre, 'REMAP_SCHEMA', 'RDSDUMPSRC', 'rdsdumpsrc_copied3');

DBMS_DATAPUMP.START_JOB(restre);

END;

/

This whole process can also be automated with a Python script that runs the export, transfer, and import steps in sequence.
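A minimal sketch of such an automation script, using the python-oracledb driver. The DSNs, credentials, and helper function names here are illustrative assumptions, not values from a real setup; adjust them for your environment.

```python
# Hypothetical automation sketch of steps 1-4. All connection details,
# passwords, and file names below are placeholders.
# Requires the python-oracledb driver: pip install oracledb


def export_block(schema: str, dump_file: str) -> str:
    """PL/SQL that exports `schema` into DATA_PUMP_DIR on the source (step 1)."""
    return f"""DECLARE
  bkp NUMBER;
BEGIN
  bkp := DBMS_DATAPUMP.OPEN(operation => 'EXPORT', job_mode => 'SCHEMA', job_name => NULL);
  DBMS_DATAPUMP.ADD_FILE(handle => bkp, filename => '{dump_file}',
    directory => 'DATA_PUMP_DIR',
    filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE, reusefile => 1);
  DBMS_DATAPUMP.METADATA_FILTER(bkp, 'SCHEMA_EXPR', 'IN (''{schema}'')');
  DBMS_DATAPUMP.START_JOB(bkp);
END;"""


def transfer_block(src_file: str, dest_file: str, db_link: str) -> str:
    """PL/SQL that pushes the dump to the target over a database link (step 3)."""
    return f"""BEGIN
  DBMS_FILE_TRANSFER.PUT_FILE(
    source_directory_object      => 'DATA_PUMP_DIR',
    source_file_name             => '{src_file}',
    destination_directory_object => 'DATA_PUMP_DIR',
    destination_file_name        => '{dest_file}',
    destination_database         => '{db_link}');
END;"""


def import_block(dump_file: str, from_schema: str, to_schema: str) -> str:
    """PL/SQL that imports the dump on the target, remapping the schema (step 4)."""
    return f"""DECLARE
  rst NUMBER;
BEGIN
  rst := DBMS_DATAPUMP.OPEN(operation => 'IMPORT', job_mode => 'SCHEMA', job_name => NULL);
  DBMS_DATAPUMP.ADD_FILE(handle => rst, filename => '{dump_file}',
    directory => 'DATA_PUMP_DIR',
    filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
  DBMS_DATAPUMP.METADATA_REMAP(rst, 'REMAP_SCHEMA', '{from_schema}', '{to_schema}');
  DBMS_DATAPUMP.START_JOB(rst);
END;"""


def run_refresh():
    """Connect to the source and target RDS instances and run the refresh.

    Note: DBMS_DATAPUMP.START_JOB returns immediately; the export and import
    jobs run asynchronously, so in practice you would poll the job status or
    the log file before moving to the next step.
    """
    import oracledb  # imported lazily; only needed when actually connecting

    source_dsn = "source.xxxx.us-east-1.rds.amazonaws.com:1521/ORCL"  # placeholder
    target_dsn = "selvadump.cv7jsqukf31b.us-east-1.rds.amazonaws.com:1521/ORCL"

    with oracledb.connect(user="admin", password="***", dsn=source_dsn) as src:
        cur = src.cursor()
        cur.execute(export_block("RDSDUMPSRC", "rdsdumpsrc1.dmp"))
        cur.execute(transfer_block("rdsdumpsrc1.dmp", "rdsdumpsrc_copied1.dmp",
                                   "to_target_transfer"))

    with oracledb.connect(user="admin", password="***", dsn=target_dsn) as tgt:
        tgt.cursor().execute(
            import_block("rdsdumpsrc_copied1.dmp", "RDSDUMPSRC", "rdsdumpsrc_copied3"))
```

The PL/SQL builders mirror the blocks shown in steps 1, 3, and 4, so the script stays in step with the manual procedure; a scheduler such as cron or an AWS Lambda function could invoke `run_refresh()` for the daily refresh.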

Conclusion:

Data Pump is the most convenient way to perform frequent backups and restores of an Oracle schema between production and dev environments for data-refresh purposes.
