Oracle Cloud : Autonomous Database (ADW or ATP) - Export Data to an Object Store (expdp)
This article demonstrates how to export data from an Autonomous Data Warehouse (ADW) or Autonomous Transaction Processing (ATP) service on the Oracle Cloud using the expdp utility.
- Assumptions
- Create Something to Export
- Object Store Credentials
- Export to Object Store
- Get the Log File
Related articles.
- Oracle Cloud : Autonomous Transaction Processing (ATP) - Create Service
- Oracle Cloud : Autonomous Data Warehouse (ADW) - Create Service
- Oracle Cloud : Autonomous Database (ADW or ATP) - Import Data from an Object Store (impdp)
- Oracle Cloud Infrastructure (OCI) : Create an Object Storage Bucket
- Data Pump Enhancements in Oracle Database 21c
- Cloud : All Articles
Assumptions
For the export to work you will have to make a connection from an Oracle client to the ADW or ATP database. You can see the necessary setup to do this here.
The Oracle 18c impdp utility introduced the CREDENTIAL parameter to specify the object store credential to be used for an import. From Oracle 21c (21.3) we can also use the CREDENTIAL parameter with the expdp utility.
We need an object store bucket to export the data to. The export is only supported to an Oracle Cloud Object Storage bucket, but the import also works from an AWS S3 bucket and Azure Blob Storage. The following article describes how to create an Oracle Cloud Object Storage bucket.
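If you prefer the command line to the cloud console, the bucket can also be created using the OCI CLI. This is only a sketch, assuming the CLI is already installed and configured; the compartment OCID and bucket name below are placeholders, not values from this article's environment.

```shell
# Display the object storage namespace for the tenancy.
# You'll need this value when building object store URIs.
oci os ns get

# Create a bucket to hold the dump files.
# The compartment OCID is a placeholder. Replace it with your own.
oci os bucket create \
  --compartment-id ocid1.compartment.oc1..example \
  --name ob-bucket
```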
Create Something to Export
We connect to an autonomous database and create a new test user.
conn admin/MyPassword123@obatp_high

create user testuser1 identified by "MyPassword123";
alter user testuser1 quota unlimited on data;
grant create session to testuser1;
grant dwrole to testuser1;
We create a test table which we will export.
create table testuser1.t1 as
select level as id,
       'Description for ' || level as description
from   dual
connect by level <= 1000;
commit;
Object Store Credentials
Create a credential for your object store. For an Oracle object storage bucket we use our Oracle Cloud email and the Auth Token we generated.
conn admin/MyPassword123@obatp_high

begin
  dbms_cloud.drop_credential(credential_name => 'obj_store_cred');
end;
/

begin
  dbms_cloud.create_credential(
    credential_name => 'obj_store_cred',
    username        => 'me@example.com',
    password        => '{my-Auth-Token}'
  );
end;
/
For AWS buckets we use our AWS access key and secret access key. Remember, this is only supported for imports.
begin
  dbms_cloud.create_credential(
    credential_name => 'obj_store_cred',
    username        => 'my AWS access key',
    password        => 'my AWS secret access key'
  );
end;
/
Export to Object Store
We can use a local Oracle 21.3 installation to export data from the autonomous database to an object store.
We use the CREDENTIAL parameter to point to the database credential we created earlier, and an object store URI for the DUMPFILE location. For AWS S3, use the URI of your S3 bucket. For Oracle Cloud the URI can take either of these forms.
https://swiftobjectstorage.{region}.oraclecloud.com/v1/{namespace}/{bucket-name}/{file-name}.dmp
https://objectstorage.{region}.oraclecloud.com/n/{namespace}/b/{bucket-name}/o/{file-name}.dmp
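Both forms address the same object; only the path layout differs. As an illustration of the mapping between them, here is a small shell helper (hypothetical, not part of any Oracle tooling) that rewrites a native object storage URI into its Swift-style equivalent, using the placeholder values from this article.

```shell
# Hypothetical helper: convert a native Object Storage URI
#   https://objectstorage.{region}.oraclecloud.com/n/{ns}/b/{bucket}/o/{object}
# into the Swift-style form
#   https://swiftobjectstorage.{region}.oraclecloud.com/v1/{ns}/{bucket}/{object}
to_swift_uri() {
  echo "$1" | sed -E 's|https://objectstorage\.([^.]+)\.oraclecloud\.com/n/([^/]+)/b/([^/]+)/o/(.+)|https://swiftobjectstorage.\1.oraclecloud.com/v1/\2/\3/\4|'
}

to_swift_uri "https://objectstorage.uk-london-1.oraclecloud.com/n/my-namespace/b/ob-bucket/o/t1.dmp"
# https://swiftobjectstorage.uk-london-1.oraclecloud.com/v1/my-namespace/ob-bucket/t1.dmp
```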
The following example uses the "swiftobjectstorage" URI.
expdp admin/MyPassword123@obatp_high \
  tables=testuser1.t1 \
  directory=data_pump_dir \
  credential=obj_store_cred \
  dumpfile=https://swiftobjectstorage.uk-london-1.oraclecloud.com/v1/my-namespace/ob-bucket/t1.dmp \
  exclude=statistics

Export: Release 21.0.0.0.0 - Production on Tue Sep 7 18:36:39 2021
Version 21.3.0.0.0

Copyright (c) 1982, 2021, Oracle and/or its affiliates.  All rights reserved.

Connected to: Oracle Database 21c Enterprise Edition Release 21.0.0.0.0 - Production
Starting "ADMIN"."SYS_EXPORT_TABLE_01":  admin/********@obatp_high tables=testuser1.t1 directory=data_pump_dir credential=obj_store_cred dumpfile=https://swiftobjectstorage.uk-london-1.oraclecloud.com/v1/my-namespace/ob-bucket/t1.dmp exclude=statistics
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Processing object type TABLE_EXPORT/TABLE/TABLE
. . exported "TESTUSER1"."T1"                            32.60 KB    1000 rows
ORA-39173: Encrypted data has been stored unencrypted in dump file set.
Master table "ADMIN"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
******************************************************************************
Dump file set for ADMIN.SYS_EXPORT_TABLE_01 is:
  https://swiftobjectstorage.uk-london-1.oraclecloud.com/v1/my-namespace/ob-bucket/t1.dmp
Job "ADMIN"."SYS_EXPORT_TABLE_01" successfully completed at Tue Sep 7 18:37:14 2021 elapsed 0 00:00:26

$
The following example uses the "objectstorage" URI.
expdp admin/MyPassword123@obatp_high \
  tables=testuser1.t1 \
  directory=data_pump_dir \
  credential=obj_store_cred \
  dumpfile=https://objectstorage.uk-london-1.oraclecloud.com/n/my-namespace/b/ob-bucket/o/t1.dmp \
  exclude=statistics

Export: Release 21.0.0.0.0 - Production on Tue Sep 7 19:05:47 2021
Version 21.3.0.0.0

Copyright (c) 1982, 2021, Oracle and/or its affiliates.  All rights reserved.

Connected to: Oracle Database 21c Enterprise Edition Release 21.0.0.0.0 - Production
Starting "ADMIN"."SYS_EXPORT_TABLE_01":  admin/********@obatp_high tables=testuser1.t1 directory=data_pump_dir credential=obj_store_cred dumpfile=https://objectstorage.uk-london-1.oraclecloud.com/n/my-namespace/b/ob-bucket/o/t1.dmp exclude=statistics
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Processing object type TABLE_EXPORT/TABLE/TABLE
. . exported "TESTUSER1"."T1"                            32.60 KB    1000 rows
ORA-39173: Encrypted data has been stored unencrypted in dump file set.
Master table "ADMIN"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
******************************************************************************
Dump file set for ADMIN.SYS_EXPORT_TABLE_01 is:
  https://swiftobjectstorage.uk-london-1.oraclecloud.com/v1/my-namespace/ob-bucket/t1.dmp
Job "ADMIN"."SYS_EXPORT_TABLE_01" successfully completed at Tue Sep 7 19:06:06 2021 elapsed 0 00:00:15

$
Get the Log File
If we want to read the contents of the expdp log file we can push it across to the object store using the PUT_OBJECT procedure in the DBMS_CLOUD package.
conn admin/MyPassword123@obatp_high

begin
  dbms_cloud.put_object(
    credential_name => 'obj_store_cred',
    object_uri      => 'https://swiftobjectstorage.uk-london-1.oraclecloud.com/v1/my-namespace/ob-bucket/export.log',
    directory_name  => 'data_pump_dir',
    file_name       => 'export.log');
end;
/
It can then be downloaded from the object store.
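Any object store client will do for the download. As one option, here is a sketch using the OCI CLI, assuming it is configured and using the placeholder namespace and bucket names from this article.

```shell
# Download the export log pushed to the bucket by DBMS_CLOUD.PUT_OBJECT.
# Namespace and bucket names are this article's placeholders.
oci os object get \
  --namespace my-namespace \
  --bucket-name ob-bucket \
  --name export.log \
  --file export.log
```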
For more information see:
- Oracle Cloud : Autonomous Transaction Processing (ATP) - Create Service
- Oracle Cloud : Autonomous Data Warehouse (ADW) - Create Service
- Oracle Cloud : Autonomous Database (ADW or ATP) - Import Data from an Object Store (impdp)
- Oracle Cloud Infrastructure (OCI) : Create an Object Storage Bucket
- Data Pump Enhancements in Oracle Database 21c
- Cloud : All Articles
Hope this helps. Regards Tim...