Data Pump API for PL/SQL (DBMS_DATAPUMP)
The DBMS_DATAPUMP package provides a PL/SQL API, allowing us to define Data Pump operations programmatically. This article provides a few examples to get you started.
There were examples of this functionality in the original 10g Data Pump article, but this article expands on those examples.
Related articles.
- Data Pump (expdp, impdp) : All Articles
- Data Pump Quick Links : 10g, 11g, 12cR1, 12cR2, 18c, 19c, 21c, Transportable Tablespaces
Setup
We create a new database user for our testing.
conn / as sysdba
alter session set container=pdb1;

drop user testuser1 cascade;

create user testuser1 identified by testuser1
  default tablespace users quota unlimited on users;

grant create session, create table, create type to testuser1;
We create and populate a copy of the EMP table in the test user.
conn testuser1/testuser1@pdb1

create table emp (
  empno    number(4,0),
  ename    varchar2(10 byte),
  job      varchar2(9 byte),
  mgr      number(4,0),
  hiredate date,
  sal      number(7,2),
  comm     number(7,2),
  deptno   number(2,0),
  constraint pk_emp primary key (empno)
);

insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7369,'SMITH','CLERK',7902,to_date('17-DEC-80','DD-MON-RR'),800,null,20);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7499,'ALLEN','SALESMAN',7698,to_date('20-FEB-81','DD-MON-RR'),1600,300,30);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7521,'WARD','SALESMAN',7698,to_date('22-FEB-81','DD-MON-RR'),1250,500,30);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7566,'JONES','MANAGER',7839,to_date('02-APR-81','DD-MON-RR'),2975,null,20);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7654,'MARTIN','SALESMAN',7698,to_date('28-SEP-81','DD-MON-RR'),1250,1400,30);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7698,'BLAKE','MANAGER',7839,to_date('01-MAY-81','DD-MON-RR'),2850,null,30);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7782,'CLARK','MANAGER',7839,to_date('09-JUN-81','DD-MON-RR'),2450,null,10);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7788,'SCOTT','ANALYST',7566,to_date('19-APR-87','DD-MON-RR'),3000,null,20);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7839,'KING','PRESIDENT',null,to_date('17-NOV-81','DD-MON-RR'),5000,null,10);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7844,'TURNER','SALESMAN',7698,to_date('08-SEP-81','DD-MON-RR'),1500,0,30);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7876,'ADAMS','CLERK',7788,to_date('23-MAY-87','DD-MON-RR'),1100,null,20);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7900,'JAMES','CLERK',7698,to_date('03-DEC-81','DD-MON-RR'),950,null,30);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7902,'FORD','ANALYST',7566,to_date('03-DEC-81','DD-MON-RR'),3000,null,20);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7934,'MILLER','CLERK',7782,to_date('23-JAN-82','DD-MON-RR'),1300,null,10);
commit;
We create a physical directory on the database server to use with the export/import operations.
mkdir -p /tmp/dp
We create an Oracle directory object pointing to the physical location.
conn / as sysdba
alter session set container=pdb1;

create or replace directory test_dir as '/tmp/dp';
grant read, write on directory test_dir to testuser1;
Once a job is running, the status of the job is visible in the DBA_DATAPUMP_JOBS view.
column owner_name format a20
column job_name format a30
column operation format a10
column job_mode format a10
column state format a12

select owner_name,
       job_name,
       trim(operation) as operation,
       trim(job_mode) as job_mode,
       state,
       degree,
       attached_sessions,
       datapump_sessions
from   dba_datapump_jobs
order by 1, 2;
You can see an example of this in the datapump_jobs.sql script.
Table Export
The following example shows how to perform a table export of the EMP table in the TESTUSER1 schema.
declare
  l_dp_handle number;
begin
  -- Open a table export job.
  l_dp_handle := dbms_datapump.open(
    operation   => 'EXPORT',
    job_mode    => 'TABLE',
    remote_link => NULL,
    job_name    => 'TESTUSER1_EMP_EXPORT',
    version     => 'LATEST');

  -- Specify the dump file name and directory object name.
  dbms_datapump.add_file(
    handle    => l_dp_handle,
    filename  => 'TESTUSER1_EMP.dmp',
    directory => 'TEST_DIR');

  -- Specify the log file name and directory object name.
  dbms_datapump.add_file(
    handle    => l_dp_handle,
    filename  => 'expdpTESTUSER1_EMP.log',
    directory => 'TEST_DIR',
    filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);

  -- Specify the table to be exported, filtering the schema and table.
  dbms_datapump.metadata_filter(
    handle => l_dp_handle,
    name   => 'SCHEMA_EXPR',
    value  => '= ''TESTUSER1''');

  dbms_datapump.metadata_filter(
    handle => l_dp_handle,
    name   => 'NAME_EXPR',
    value  => '= ''EMP''');

  dbms_datapump.start_job(l_dp_handle);

  dbms_datapump.detach(l_dp_handle);
end;
/
If we are quick enough, we can see the job executing in the DBA_DATAPUMP_JOBS view.
SQL> @datapump_jobs.sql

OWNER_NAME           JOB_NAME                       OPERATION  JOB_MODE   STATE            DEGREE ATTACHED_SESSIONS DATAPUMP_SESSIONS
-------------------- ------------------------------ ---------- ---------- ------------ ---------- ----------------- -----------------
SYS                  TESTUSER1_EMP_EXPORT           EXPORT     TABLE      EXECUTING             1                 0                 2

SQL>
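The examples in this article detach from the job as soon as it is started, leaving it to run in the background. If we want the PL/SQL block to wait for the job to complete, we can replace the DETACH call with a call to the WAIT_FOR_JOB procedure. The following is a sketch of the table export rewritten this way; the job and dump file names are made up for this example.

```sql
declare
  l_dp_handle number;
  l_job_state varchar2(30);
begin
  -- Open a table export job.
  l_dp_handle := dbms_datapump.open(
    operation   => 'EXPORT',
    job_mode    => 'TABLE',
    remote_link => NULL,
    job_name    => 'TESTUSER1_EMP_EXPORT_WAIT',
    version     => 'LATEST');

  dbms_datapump.add_file(
    handle    => l_dp_handle,
    filename  => 'TESTUSER1_EMP_WAIT.dmp',
    directory => 'TEST_DIR');

  dbms_datapump.metadata_filter(
    handle => l_dp_handle,
    name   => 'SCHEMA_EXPR',
    value  => '= ''TESTUSER1''');

  dbms_datapump.metadata_filter(
    handle => l_dp_handle,
    name   => 'NAME_EXPR',
    value  => '= ''EMP''');

  dbms_datapump.start_job(l_dp_handle);

  -- Block until the job finishes, rather than detaching.
  dbms_datapump.wait_for_job(
    handle    => l_dp_handle,
    job_state => l_job_state);

  dbms_output.put_line('Job finished with state: ' || l_job_state);
end;
/
```

This is useful when the export is one step in a larger process that must not continue until the dump file is complete.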
Table Import
The following example shows how to perform a table import. We import the dump file produced by the export of the EMP table, but remap it to EMP2.
declare
  l_dp_handle number;
begin
  -- Open a table import job.
  l_dp_handle := dbms_datapump.open(
    operation   => 'IMPORT',
    job_mode    => 'TABLE',
    remote_link => NULL,
    job_name    => 'TESTUSER1_EMP_IMPORT',
    version     => 'LATEST');

  -- Specify the dump file name and directory object name.
  dbms_datapump.add_file(
    handle    => l_dp_handle,
    filename  => 'TESTUSER1_EMP.dmp',
    directory => 'TEST_DIR');

  -- Specify the log file name and directory object name.
  dbms_datapump.add_file(
    handle    => l_dp_handle,
    filename  => 'impdpTESTUSER1_EMP_IMPORT.log',
    directory => 'TEST_DIR',
    filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);

  -- Perform a REMAP_TABLE from EMP to EMP2.
  dbms_datapump.metadata_remap(
    handle    => l_dp_handle,
    name      => 'REMAP_TABLE',
    old_value => 'EMP',
    value     => 'EMP2');

  dbms_datapump.start_job(l_dp_handle);

  dbms_datapump.detach(l_dp_handle);
end;
/
If we are quick enough, we can see the job executing in the DBA_DATAPUMP_JOBS view.
SQL> @datapump_jobs.sql

OWNER_NAME           JOB_NAME                       OPERATION  JOB_MODE   STATE            DEGREE ATTACHED_SESSIONS DATAPUMP_SESSIONS
-------------------- ------------------------------ ---------- ---------- ------------ ---------- ----------------- -----------------
SYS                  TESTUSER1_EMP_IMPORT           IMPORT     TABLE      EXECUTING             1                 0                 2

SQL>
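By default the import fails if the target table already exists. The SET_PARAMETER procedure lets us control this behaviour, in the same way the TABLE_EXISTS_ACTION parameter does for the impdp utility. The following sketch shows the call, which would be added before the START_JOB call in the import example above; TRUNCATE could equally be SKIP, APPEND or REPLACE.

```sql
-- Truncate and reload the table if it already exists.
dbms_datapump.set_parameter(
  handle => l_dp_handle,
  name   => 'TABLE_EXISTS_ACTION',
  value  => 'TRUNCATE');
```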
Schema Export
The following example shows how to perform a schema export of the TESTUSER1 schema.
declare
  l_dp_handle number;
begin
  -- Open a schema export job.
  l_dp_handle := dbms_datapump.open(
    operation   => 'EXPORT',
    job_mode    => 'SCHEMA',
    remote_link => NULL,
    job_name    => 'TESTUSER1_EXPORT',
    version     => 'LATEST');

  -- Specify the dump file name and directory object name.
  dbms_datapump.add_file(
    handle    => l_dp_handle,
    filename  => 'TESTUSER1.dmp',
    directory => 'TEST_DIR');

  -- Specify the log file name and directory object name.
  dbms_datapump.add_file(
    handle    => l_dp_handle,
    filename  => 'expdpTESTUSER1.log',
    directory => 'TEST_DIR',
    filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);

  -- Specify the schema to be exported.
  dbms_datapump.metadata_filter(
    handle => l_dp_handle,
    name   => 'SCHEMA_EXPR',
    value  => '= ''TESTUSER1''');

  dbms_datapump.start_job(l_dp_handle);

  dbms_datapump.detach(l_dp_handle);
end;
/
If we are quick enough, we can see the job executing in the DBA_DATAPUMP_JOBS view.
SQL> @datapump_jobs.sql

OWNER_NAME           JOB_NAME                       OPERATION  JOB_MODE   STATE            DEGREE ATTACHED_SESSIONS DATAPUMP_SESSIONS
-------------------- ------------------------------ ---------- ---------- ------------ ---------- ----------------- -----------------
SYS                  TESTUSER1_EXPORT               EXPORT     SCHEMA     EXECUTING             1                 0                 2

SQL>
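We are not limited to filtering whole objects. The DATA_FILTER procedure restricts the rows exported for a table, similar to the QUERY parameter of the expdp utility. As a sketch, the following call added before START_JOB in the schema export above would limit the exported EMP rows to department 20; the filter text is appended to the query Data Pump runs against the table.

```sql
-- Only export EMP rows for department 20.
dbms_datapump.data_filter(
  handle      => l_dp_handle,
  name        => 'SUBQUERY',
  value       => 'where deptno = 20',
  table_name  => 'EMP',
  schema_name => 'TESTUSER1');
```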
Schema Import
The following example shows how to perform a schema import. We import the dump file produced by the export of the TESTUSER1 schema, but remap it to TESTUSER2.
First we need the schema to import into.
conn / as sysdba
alter session set container=pdb1;

drop user testuser2 cascade;

create user testuser2 identified by testuser2
  default tablespace users quota unlimited on users;

grant create session, create table, create type to testuser2;
The following is an example of how this API can be used to perform a schema import with a schema remap operation.
declare
  l_dp_handle number;
begin
  -- Open a schema import job.
  l_dp_handle := dbms_datapump.open(
    operation   => 'IMPORT',
    job_mode    => 'SCHEMA',
    remote_link => NULL,
    job_name    => 'TESTUSER2_IMPORT',
    version     => 'LATEST');

  -- Specify the schema to be imported.
  dbms_datapump.metadata_filter(
    handle => l_dp_handle,
    name   => 'SCHEMA_EXPR',
    value  => '= ''TESTUSER1''');

  -- Specify the dump file name and directory object name.
  dbms_datapump.add_file(
    handle    => l_dp_handle,
    filename  => 'TESTUSER1.dmp',
    directory => 'TEST_DIR');

  -- Specify the log file name and directory object name.
  dbms_datapump.add_file(
    handle    => l_dp_handle,
    filename  => 'impdpTESTUSER2.log',
    directory => 'TEST_DIR',
    filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);

  -- Perform a REMAP_SCHEMA from TESTUSER1 to TESTUSER2.
  dbms_datapump.metadata_remap(
    handle    => l_dp_handle,
    name      => 'REMAP_SCHEMA',
    old_value => 'TESTUSER1',
    value     => 'TESTUSER2');

  dbms_datapump.start_job(l_dp_handle);

  dbms_datapump.detach(l_dp_handle);
end;
/
If we are quick enough, we can see the job executing in the DBA_DATAPUMP_JOBS view.
SQL> @datapump_jobs.sql

OWNER_NAME           JOB_NAME                       OPERATION  JOB_MODE   STATE            DEGREE ATTACHED_SESSIONS DATAPUMP_SESSIONS
-------------------- ------------------------------ ---------- ---------- ------------ ---------- ----------------- -----------------
SYS                  TESTUSER2_IMPORT               IMPORT     SCHEMA     EXECUTING             1                 0                 2

SQL>
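REMAP_SCHEMA is not the only remap available. The METADATA_REMAP procedure also supports names such as REMAP_TABLESPACE and REMAP_DATAFILE. As a sketch, the following call added before START_JOB would move the imported segments between two tablespaces; the USERS2 tablespace name is made up for this example.

```sql
-- Move imported segments from the USERS tablespace to USERS2.
dbms_datapump.metadata_remap(
  handle    => l_dp_handle,
  name      => 'REMAP_TABLESPACE',
  old_value => 'USERS',
  value     => 'USERS2');
```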
Database Export
The following example shows how to perform a full export of the PDB1 database.
declare
  l_dp_handle number;
begin
  -- Open a full export job.
  l_dp_handle := dbms_datapump.open(
    operation   => 'EXPORT',
    job_mode    => 'FULL',
    remote_link => NULL,
    job_name    => 'PDB1_EXPORT',
    version     => 'LATEST');

  -- Specify the dump file name and directory object name.
  dbms_datapump.add_file(
    handle    => l_dp_handle,
    filename  => 'PDB1.dmp',
    directory => 'TEST_DIR');

  -- Specify the log file name and directory object name.
  dbms_datapump.add_file(
    handle    => l_dp_handle,
    filename  => 'expdpPDB1.log',
    directory => 'TEST_DIR',
    filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);

  dbms_datapump.start_job(l_dp_handle);

  dbms_datapump.detach(l_dp_handle);
end;
/
If we are quick enough, we can see the job executing in the DBA_DATAPUMP_JOBS view.
SQL> @datapump_jobs.sql

OWNER_NAME           JOB_NAME                       OPERATION  JOB_MODE   STATE            DEGREE ATTACHED_SESSIONS DATAPUMP_SESSIONS
-------------------- ------------------------------ ---------- ---------- ------------ ---------- ----------------- -----------------
SYS                  PDB1_EXPORT                    EXPORT     FULL       EXECUTING             1                 0                 2

SQL>
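Full exports can be large, so we may want multiple worker processes. The SET_PARALLEL procedure sets the degree of parallelism, and a "%U" substitution variable in the dump file name allows each worker to write to its own file, just as with the PARALLEL and DUMPFILE parameters of the expdp utility. A sketch of the relevant calls, replacing the single ADD_FILE call for the dump file in the example above; the degree of 4 is an arbitrary choice.

```sql
-- Use a substitution variable so each worker gets its own dump file.
dbms_datapump.add_file(
  handle    => l_dp_handle,
  filename  => 'PDB1_%U.dmp',
  directory => 'TEST_DIR');

-- Run the job with four workers.
dbms_datapump.set_parallel(
  handle => l_dp_handle,
  degree => 4);
```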
Database Import
The following example shows how to perform a full import. We import the dump file produced by the export of the PDB1 database.
First we need to drop the TESTUSER1 schema, to see if it comes back after the import.
conn / as sysdba
alter session set container=pdb1;

drop user testuser1 cascade;
The following is an example of how this API can be used to perform a full database import.
declare
  l_dp_handle number;
begin
  -- Open a full import job.
  l_dp_handle := dbms_datapump.open(
    operation   => 'IMPORT',
    job_mode    => 'FULL',
    remote_link => NULL,
    job_name    => 'PDB1_IMPORT',
    version     => 'LATEST');

  -- Specify the dump file name and directory object name.
  dbms_datapump.add_file(
    handle    => l_dp_handle,
    filename  => 'PDB1.dmp',
    directory => 'TEST_DIR');

  -- Specify the log file name and directory object name.
  dbms_datapump.add_file(
    handle    => l_dp_handle,
    filename  => 'impdpPDB1.log',
    directory => 'TEST_DIR',
    filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);

  dbms_datapump.start_job(l_dp_handle);

  dbms_datapump.detach(l_dp_handle);
end;
/
If we are quick enough, we can see the job executing in the DBA_DATAPUMP_JOBS view.
SQL> @datapump_jobs.sql

OWNER_NAME           JOB_NAME                       OPERATION  JOB_MODE   STATE            DEGREE ATTACHED_SESSIONS DATAPUMP_SESSIONS
-------------------- ------------------------------ ---------- ---------- ------------ ---------- ----------------- -----------------
SYS                  PDB1_IMPORT                    IMPORT     FULL       EXECUTING             1                 0                 2

SQL>
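One last point on housekeeping. If an API call raises an error after the OPEN call, the job can be left behind, still visible in the DBA_DATAPUMP_JOBS view until it is cleaned up. It can be sensible to trap errors and stop the job explicitly. A minimal sketch of the pattern, using a made-up job and dump file name:

```sql
declare
  l_dp_handle number;
begin
  -- Open a schema export job.
  l_dp_handle := dbms_datapump.open(
    operation => 'EXPORT',
    job_mode  => 'SCHEMA',
    job_name  => 'TESTUSER1_EXPORT_SAFE');

  dbms_datapump.add_file(
    handle    => l_dp_handle,
    filename  => 'TESTUSER1_SAFE.dmp',
    directory => 'TEST_DIR');

  dbms_datapump.metadata_filter(
    handle => l_dp_handle,
    name   => 'SCHEMA_EXPR',
    value  => '= ''TESTUSER1''');

  dbms_datapump.start_job(l_dp_handle);

  dbms_datapump.detach(l_dp_handle);
exception
  when others then
    -- Stop the broken job so it doesn't linger, then re-raise the error.
    dbms_datapump.stop_job(l_dp_handle);
    raise;
end;
/
```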
For more information see:
- DBMS_DATAPUMP
- Data Pump (expdp, impdp) : All Articles
- Data Pump Quick Links : 10g, 11g, 12cR1, 12cR2, 18c, 19c, 21c, Transportable Tablespaces
Hope this helps. Regards Tim...