Data Pump Export (expdp) to and Import (impdp) From Cloud Object Stores in Oracle Database 21c
Data Pump supports cloud object stores as a dumpfile location for on-prem installations from Oracle 21c onward.
- Introduction
- Prerequisites
- Bug in Oracle 21.3 and 21.4
- Setup
- Object Store Credentials
- Object Store URIs
- Export To Cloud Object Store
- Import From Cloud Object Store
Related articles.
- Oracle Data Pump 21c and Cloud Object Stores
- Data Pump Enhancements in Oracle Database 21c
- Data Pump Quick Links : 10g, 11g, 12cR1, 12cR2, 18c, 19c, 21c, Transportable Tablespaces
- Cloud : All Articles
Introduction
Importing data from a cloud object store was first introduced in the Autonomous Database. The Oracle 18c impdp utility introduced the CREDENTIAL parameter to specify the object store credential to be used for an import. Although this used an on-prem installation of the impdp utility, the feature could only be used when connected to an Autonomous Database. From Oracle 21c (21.3) we can also use the CREDENTIAL parameter with the expdp utility to export from an Autonomous Database to an Oracle Cloud object store. You can read about this functionality here.
- Oracle Cloud : Autonomous Database (ADW or ATP) - Import Data from an Object Store (impdp)
- Oracle Cloud : Autonomous Database (ADW or ATP) - Export Data to an Object Store (expdp)
Oracle 21c (21.3) introduced the ability to export to an object store (Oracle Cloud only), and import from an object store (Oracle Cloud, S3, Azure Blob Storage) using an on-prem database, so this functionality is no longer restricted to the Autonomous Database.
When using a cloud object store as your dumpfile location, there will be some latency associated with the data transfer.
Prerequisites
The DBMS_CLOUD package is present by default on the Autonomous Database. It is not installed in on-prem Oracle 21c installations, so it has to be installed manually. This prerequisite is not mentioned in the Oracle documentation. The installation is described in this MOS note.
There is an example of this installation here.
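Once the installation is complete, a quick sanity check confirms the package is present and valid before going any further. This is just a sketch; the MOS note installs the package into the C##CLOUD$SERVICE schema, so adjust the expected owner if your installation differs.

```sql
-- Sanity check: confirm the DBMS_CLOUD package exists and is valid.
-- The MOS note installs it into the C##CLOUD$SERVICE schema.
select owner, object_type, status
from   dba_objects
where  object_name = 'DBMS_CLOUD'
order by owner, object_type;
```

If the package body shows as INVALID, revisit the installation steps in the MOS note before attempting an export or import.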
We need an object store bucket for the examples. This could be an Oracle Cloud Object Storage bucket, or an AWS S3 bucket. The following article describes how to create an Oracle Cloud Object Storage bucket.
Bug in Oracle 21.3 and 21.4
Whilst trying to use this feature in Oracle 21.3 I kept getting errors. I raised a service request (SR) with My Oracle Support (MOS), which resulted in the following bug. It's hidden, but it exists. The bug is still present in 21.4.
Bug 33323028 - DATA PUMP EXPORT TO OCI OBJECT STORAGE FAILS ORA-39001 ORA-39000 ORA-31641
Hopefully this will be fixed in a future release update, but for now the suggested workaround is to enable the Object Store ODM Library.
# Turn everything off.
dbshut $ORACLE_HOME

cd $ORACLE_HOME/rdbms/lib
make -f ins_rdbms.mk opc_on

# Turn everything on.
dbstart $ORACLE_HOME
Thanks to the support folks and development folks at Oracle for getting me past this bug.
The previously undocumented dependency on the DBMS_CLOUD package and the workaround for the bug have now been added to the following MOS note.
Setup
We create a test user to perform the export and import operations. We make sure the user can create credentials, and give it access to the DBMS_CLOUD package.
conn sys/SysPassword1@//localhost:1521/pdb1 as sysdba

--drop user testuser1 cascade;

create user testuser1 identified by testuser1 quota unlimited on users;
grant connect, resource to testuser1;
grant create credential to testuser1;
grant execute on dbms_cloud to testuser1;
We need a local directory object for use with the export and import operations. Dumpfiles are placed in the object store, but log files are written locally.
create or replace directory tmp_dir as '/tmp/';
grant read, write on directory tmp_dir to testuser1;
Connect to the test user and create the following table.
conn testuser1/testuser1@//localhost:1521/pdb1

create table emp (
  empno    number(4,0),
  ename    varchar2(10 byte),
  job      varchar2(9 byte),
  mgr      number(4,0),
  hiredate date,
  sal      number(7,2),
  comm     number(7,2),
  deptno   number(2,0),
  constraint pk_emp primary key (empno)
);

insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7369,'SMITH','CLERK',7902,to_date('17-DEC-80','DD-MON-RR'),800,null,20);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7499,'ALLEN','SALESMAN',7698,to_date('20-FEB-81','DD-MON-RR'),1600,300,30);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7521,'WARD','SALESMAN',7698,to_date('22-FEB-81','DD-MON-RR'),1250,500,30);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7566,'JONES','MANAGER',7839,to_date('02-APR-81','DD-MON-RR'),2975,null,20);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7654,'MARTIN','SALESMAN',7698,to_date('28-SEP-81','DD-MON-RR'),1250,1400,30);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7698,'BLAKE','MANAGER',7839,to_date('01-MAY-81','DD-MON-RR'),2850,null,30);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7782,'CLARK','MANAGER',7839,to_date('09-JUN-81','DD-MON-RR'),2450,null,10);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7788,'SCOTT','ANALYST',7566,to_date('19-APR-87','DD-MON-RR'),3000,null,20);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7839,'KING','PRESIDENT',null,to_date('17-NOV-81','DD-MON-RR'),5000,null,10);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7844,'TURNER','SALESMAN',7698,to_date('08-SEP-81','DD-MON-RR'),1500,0,30);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7876,'ADAMS','CLERK',7788,to_date('23-MAY-87','DD-MON-RR'),1100,null,20);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7900,'JAMES','CLERK',7698,to_date('03-DEC-81','DD-MON-RR'),950,null,30);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7902,'FORD','ANALYST',7566,to_date('03-DEC-81','DD-MON-RR'),3000,null,20);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7934,'MILLER','CLERK',7782,to_date('23-JAN-82','DD-MON-RR'),1300,null,10);
commit;
Object Store Credentials
The DBMS_CLOUD package contains a copy of the procedures from the DBMS_CREDENTIAL package. The two packages can be used interchangeably, as they do the same thing.
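To illustrate the interchangeability, here is a sketch of the same credential creation using DBMS_CREDENTIAL directly. The credential name and placeholder values match the examples used elsewhere in this article.

```sql
-- Equivalent credential creation using DBMS_CREDENTIAL instead of DBMS_CLOUD.
begin
  dbms_credential.create_credential (
    credential_name => 'obj_store_cred',
    username        => 'me@example.com',
    password        => '{my-Auth-Token}'
  );
end;
/
```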
Create a credential for your object store. For an Oracle object storage bucket we use our Oracle Cloud email and the Auth Token we generated.
conn testuser1/testuser1@//localhost:1521/pdb1

begin
  dbms_cloud.drop_credential(credential_name => 'obj_store_cred');
end;
/

begin
  dbms_cloud.create_credential (
    credential_name => 'obj_store_cred',
    username        => 'me@example.com',
    password        => '{my-Auth-Token}'
  );
end;
/
For AWS buckets we use our AWS access key and secret access key.
begin
  dbms_cloud.create_credential (
    credential_name => 'obj_store_cred',
    username        => 'my AWS access key',
    password        => 'my AWS secret access key'
  );
end;
/
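Either way, we can confirm the credential was created by querying the USER_CREDENTIALS view from the test user.

```sql
-- List the credentials owned by the current user.
select credential_name, username, enabled
from   user_credentials;
```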
Object Store URIs
We use an object store URI for the DUMPFILE location. For AWS S3, use the URI of your S3 bucket. For Oracle Cloud the URI can take either of these forms.
https://swiftobjectstorage.{region}.oraclecloud.com/v1/{namespace}/{bucket-name}/{file-name}.dmp

https://objectstorage.{region}.oraclecloud.com/n/{namespace}/b/{bucket-name}/o/{file-name}.dmp
The documentation typically uses the "swiftobjectstorage" URI, but both work. For the remainder of the article we will use the "swiftobjectstorage" URI.
Export To Cloud Object Store
We perform a normal export using the expdp utility. We use the CREDENTIAL parameter to provide our cloud object store credentials, and use a cloud object store URI in the DUMPFILE parameter.
$ expdp testuser1/testuser1@//localhost:1521/pdb1 \
    tables=emp \
    credential=obj_store_cred \
    dumpfile=https://swiftobjectstorage.uk-london-1.oraclecloud.com/v1/my-namespace/ob-bucket/emp.dmp \
    logfile=tmp_dir:expdp_emp.log \
    exclude=statistics

Export: Release 21.0.0.0.0 - Production on Fri Sep 10 08:34:40 2021
Version 21.3.0.0.0

Copyright (c) 1982, 2021, Oracle and/or its affiliates.  All rights reserved.

Connected to: Oracle Database 21c Enterprise Edition Release 21.0.0.0.0 - Production
Starting "TESTUSER1"."SYS_EXPORT_TABLE_01":  testuser1/********@//localhost:1521/pdb1 tables=emp credential=obj_store_cred dumpfile=https://swiftobjectstorage.uk-london-1.oraclecloud.com/v1/my-namespace/ob-bucket/emp.dmp logfile=tmp_dir:expdp_emp.log exclude=statistics
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
. . exported "TESTUSER1"."EMP"                           8.781 KB      14 rows
Master table "TESTUSER1"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
******************************************************************************
Dump file set for TESTUSER1.SYS_EXPORT_TABLE_01 is:
  https://swiftobjectstorage.uk-london-1.oraclecloud.com/v1/my-namespace/ob-bucket/emp.dmp
Job "TESTUSER1"."SYS_EXPORT_TABLE_01" successfully completed at Fri Sep 10 08:34:59 2021 elapsed 0 00:00:17

$
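As a quick check that the dump file actually landed in the bucket, we can list the bucket contents with DBMS_CLOUD.LIST_OBJECTS. This is a sketch using the example credential and bucket URI from above; substitute your own values.

```sql
-- List the contents of the bucket using the credential created earlier.
-- The URI points at the bucket itself, not an individual object.
select object_name, bytes
from   dbms_cloud.list_objects(
         'OBJ_STORE_CRED',
         'https://swiftobjectstorage.uk-london-1.oraclecloud.com/v1/my-namespace/ob-bucket/');
```

The emp.dmp file should appear in the output.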
Import From Cloud Object Store
We perform a normal import using the impdp utility. We use the CREDENTIAL parameter to provide our cloud object store credentials, and use a cloud object store URI in the DUMPFILE parameter.
$ impdp testuser1/testuser1@//localhost:1521/pdb1 \
    tables=emp \
    remap_table=testuser1.emp:emp_copy \
    credential=obj_store_cred \
    dumpfile=https://swiftobjectstorage.uk-london-1.oraclecloud.com/v1/my-namespace/ob-bucket/emp.dmp \
    logfile=tmp_dir:impdp_emp.log

Import: Release 21.0.0.0.0 - Production on Fri Sep 10 08:36:15 2021
Version 21.3.0.0.0

Copyright (c) 1982, 2021, Oracle and/or its affiliates.  All rights reserved.

Connected to: Oracle Database 21c Enterprise Edition Release 21.0.0.0.0 - Production
Master table "TESTUSER1"."SYS_IMPORT_TABLE_01" successfully loaded/unloaded
Starting "TESTUSER1"."SYS_IMPORT_TABLE_01":  testuser1/********@//localhost:1521/pdb1 tables=emp remap_table=testuser1.emp:emp_copy credential=obj_store_cred dumpfile=https://swiftobjectstorage.uk-london-1.oraclecloud.com/v1/my-namespace/ob-bucket/emp.dmp logfile=tmp_dir:impdp_emp.log
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
. . imported "TESTUSER1"."EMP_COPY"                      8.781 KB      14 rows
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
ORA-31684: Object type CONSTRAINT:"TESTUSER1"."PK_EMP" already exists
Job "TESTUSER1"."SYS_IMPORT_TABLE_01" completed with 1 error(s) at Fri Sep 10 08:36:28 2021 elapsed 0 00:00:09

$
For more information see:
- Oracle Database Utilities
- Oracle Data Pump 21c and Cloud Object Stores
- Oracle Cloud : Autonomous Database (ADW or ATP) - Import Data from an Object Store (impdp)
- Oracle Cloud : Autonomous Database (ADW or ATP) - Export Data to an Object Store (expdp)
- DBMS_CLOUD : Installation on 19c and 21c On-Prem Databases
- Oracle Cloud Infrastructure (OCI) : Create an Object Storage Bucket
- Data Pump Enhancements in Oracle Database 21c
- Data Pump Quick Links : 10g, 11g, 12cR1, 12cR2, 18c, 19c, 21c, Transportable Tablespaces
- Cloud : All Articles
Hope this helps. Regards Tim...