Oracle Cloud : Autonomous Database (ADW or ATP) - Import Data from an Object Store (impdp)

This article demonstrates how to import data into an Autonomous Data Warehouse (ADW) or Autonomous Transaction Processing (ATP) service on the Oracle Cloud using the impdp utility.

The examples in this article are based on the Autonomous Data Warehouse (ADW), but the same method works fine for the Autonomous Transaction Processing (ATP) service too.

Export Your Existing Data

We have a schema called TEST in an Oracle 18c instance on Oracle Database Cloud Service (DBaaS). The schema has two tables (EMP and DEPT), which we want to transfer to the Autonomous Data Warehouse (ADW) or Autonomous Transaction Processing (ATP).

Create a directory object.

conn / as sysdba

ALTER SESSION SET CONTAINER = pdb1;

CREATE OR REPLACE DIRECTORY temp_dir AS '/tmp/';
GRANT READ, WRITE ON DIRECTORY temp_dir TO test;
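
If you want to check the directory object exists as expected, you can query the DBA_DIRECTORIES view. This is just an optional sanity check, not part of the original steps.

-- Optional check : confirm the directory object and its path.
COLUMN directory_path FORMAT A30

SELECT directory_name, directory_path
FROM   dba_directories
WHERE  directory_name = 'TEMP_DIR';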

Export the schema. The ADW documentation suggests the EXCLUDE and DATA_OPTIONS settings shown in the example below. These options are not necessary for the ATP service. For such a small export it is silly to use the PARALLEL parameter, but we want to produce multiple dump files.

expdp test/test@pdb1 \
      schemas=test \
      parallel=2 \
      version=12.2 \
      directory=temp_dir dumpfile=emp_dept%u.dmp logfile=expdp_emp_dept.log \
      exclude=index,cluster,indextype,materialized_view,materialized_view_log,materialized_zonemap,db_link \
      data_options=group_partition_table_data

This resulted in the following dump files.

/tmp/emp_dept01.dmp
/tmp/emp_dept02.dmp

These dump files were uploaded to an AWS S3 bucket.
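
There are a number of ways to upload the files. As an example, assuming the AWS CLI is installed and configured on the database server, something like the following commands would copy the dump files into the bucket used in the import examples below.

# Assumption : the AWS CLI is installed and configured with credentials for the bucket.
aws s3 cp /tmp/emp_dept01.dmp s3://my-sh-data-bucket/
aws s3 cp /tmp/emp_dept02.dmp s3://my-sh-data-bucket/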

I found I had to set the version=12.2 option during the export or I would receive the following error during the import into ADW and ATP.

Connected to: Oracle Database 18c Enterprise Edition Release 12.2.0.1.0 - 64bit Production
ORA-39002: invalid operation
ORA-39358: Export dump file version 18.0.0 not compatible with target version 12.2.0

It would appear there is something interesting about the version of 18c used for ADW.

Object Store Credentials

We need to create a credential containing a username and password for the connection to the object store. If you are using an AWS S3 bucket, the username is your AWS access key ID and the password is your AWS secret access key.

The credentials are dropped and created using the DROP_CREDENTIAL and CREATE_CREDENTIAL procedures of the DBMS_CLOUD package respectively.

BEGIN
  DBMS_CLOUD.drop_credential(credential_name => 'OBJ_STORE_CRED');
END;
/

BEGIN
  DBMS_CLOUD.create_credential (
    credential_name => 'OBJ_STORE_CRED',
    username => 'my AWS access key',
    password => 'my AWS secret access key'
  ) ;
END;
/
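
As an optional check that the credential and bucket URI are correct, you can list the contents of the bucket from the Autonomous Database using the LIST_OBJECTS table function of the DBMS_CLOUD package. The bucket URL below is the one used in the import examples that follow.

-- Optional check : list the contents of the object store bucket using the new credential.
SELECT object_name, bytes
FROM   DBMS_CLOUD.list_objects(
         credential_name => 'OBJ_STORE_CRED',
         location_uri    => 'https://s3-eu-west-1.amazonaws.com/my-sh-data-bucket/'
       );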

Import Data from S3

For the import to work you will have to make a connection from an Oracle client to the ADW database. The setup necessary to do this is covered in a separate article.

From an 18c client we can issue the following type of import. The CREDENTIAL option specifies the object store credential to be used for the import. The DUMPFILE option specifies the URIs of the dump files in the object store. The TRANSFORM and EXCLUDE options are the recommended settings in the ADW documentation, but are not necessary for the ATP service. In this case we are using REMAP_SCHEMA to place the objects into a schema called MY_USER on ADW.

impdp admin/MyPassword123@obadw_high \
      directory=data_pump_dir \
      credential=obj_store_cred \
      dumpfile=https://s3-eu-west-1.amazonaws.com/my-sh-data-bucket/emp_dept01.dmp,https://s3-eu-west-1.amazonaws.com/my-sh-data-bucket/emp_dept02.dmp \
      parallel=2 \
      remap_schema=test:my_user \
      partition_options=merge \
      transform=segment_attributes:n \
      transform=dwcs_cvt_iots:y transform=constraint_use_default_index:y \
      exclude=index,cluster,indextype,materialized_view,materialized_view_log,materialized_zonemap,db_link

From a 12.2 or earlier client we need to set the default credential for the database on ADW.

ALTER DATABASE PROPERTY SET DEFAULT_CREDENTIAL = 'ADMIN.OBJ_STORE_CRED';
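
If you want to confirm the property has been set correctly, it is visible in the DATABASE_PROPERTIES view.

-- Optional check : display the current default credential.
SELECT property_value
FROM   database_properties
WHERE  property_name = 'DEFAULT_CREDENTIAL';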

We can issue the following type of import. The CREDENTIAL option isn't used and instead the "default_credential:" prefix is used before each object store URI. The rest of the parameters are the same as the previous example.

impdp admin/MyPassword123@obadw_high \
      directory=data_pump_dir \
      dumpfile=default_credential:https://s3-eu-west-1.amazonaws.com/my-sh-data-bucket/emp_dept01.dmp,default_credential:https://s3-eu-west-1.amazonaws.com/my-sh-data-bucket/emp_dept02.dmp \
      parallel=2 \
      remap_schema=test:my_user \
      partition_options=merge \
      transform=segment_attributes:n \
      transform=dwcs_cvt_iots:y transform=constraint_use_default_index:y \
      exclude=index,cluster,indextype,materialized_view,materialized_view_log,materialized_zonemap,db_link

I ran both these imports from the 18c client on my DBaaS service, but remember, it is initiating the import process on the ADW database.
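
Once an import has completed, a simple query confirms the objects have arrived in the remapped schema. This is just an example check using the MY_USER schema from the imports above, run as the ADMIN user.

-- Optional check : confirm the tables are present in the remapped schema.
SELECT table_name
FROM   dba_tables
WHERE  owner = 'MY_USER'
ORDER BY table_name;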

From 19c onward we can use wildcards in URL-based dump file names, making imports from multiple files into Autonomous Databases easier. In the following example we replace the number in the file name with the "*" wildcard.

impdp admin/MyPassword123@obadw_high \
      directory=data_pump_dir \
      credential=obj_store_cred \
      dumpfile=https://s3-eu-west-1.amazonaws.com/my-sh-data-bucket/emp_dept0*.dmp \
      parallel=2 \
      remap_schema=test:my_user \
      partition_options=merge \
      transform=segment_attributes:n \
      transform=dwcs_cvt_iots:y transform=constraint_use_default_index:y \
      exclude=index,cluster,indextype,materialized_view,materialized_view_log,materialized_zonemap,db_link

Get the Log File

If we want to read the contents of the impdp log file we can push it across to the object store using the PUT_OBJECT procedure of the DBMS_CLOUD package.

BEGIN
  DBMS_CLOUD.put_object(
    credential_name => 'obj_store_cred',
    object_uri      => 'https://s3-eu-west-1.amazonaws.com/my-sh-data-bucket/import.log',
    directory_name  => 'DATA_PUMP_DIR',
    file_name       => 'import.log');
END;
/

It can then be downloaded from the object store.
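
If you are not sure of the exact name of the log file to push across, the files currently present in the DATA_PUMP_DIR directory can be listed using the LIST_FILES table function of the DBMS_CLOUD package.

-- Optional check : list the files in the DATA_PUMP_DIR directory.
SELECT object_name, bytes
FROM   DBMS_CLOUD.list_files('DATA_PUMP_DIR');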

Autonomous Transaction Processing (ATP)

The method for importing data from a dump file in an object store is the same for the Autonomous Transaction Processing (ATP) service as it is for the Autonomous Data Warehouse (ADW) service.

Remember you are able to create a variety of access structures in ATP that you can't in ADW.

Hope this helps. Regards Tim...
