DBMS_CLOUD Package
This article describes the basic usage of the DBMS_CLOUD
package. The DBMS_CLOUD
package was introduced in Autonomous Database to provide a simple way to interact with an object store from the database. It can be used in on-prem installations for versions 19c and 21c. It works equally well with AWS S3 buckets or Oracle Cloud Object Storage buckets.
- Prerequisites
- Setup
- Object Store URIs
- Object Store Credentials
- Objects and Files
- External Tables
- Copy Data
- Export Data
- SODA Collections
- Delete Operations
Related articles.
- DBMS_CLOUD : Objects and Files
- DBMS_CLOUD : External Tables
- DBMS_CLOUD : Installation on 19c and 21c On-Prem Databases
- Oracle Cloud Infrastructure (OCI) : Create an Object Storage Bucket
- Oracle Cloud : Autonomous Database (ADW or ATP) - Load Data from an Object Store (DBMS_CLOUD)
Prerequisites
Most of the examples in this article use an on-prem installation of the DBMS_CLOUD
package, but some functionality only seems to work properly on the Autonomous Database. We'll highlight when one of these restrictions is in place.
The DBMS_CLOUD
package is present by default on the Autonomous Database. It is not installed in Oracle 19c or 21c installations, so it has to be installed manually. The installation is described in this MOS note.
There is an example of this installation here.
We need an object store bucket for some of the examples. This could be an Oracle Cloud Object Store bucket, or an AWS S3 bucket. The following article describes how to create an Oracle Cloud Object Store bucket.
Setup
We create a test user. We make sure the user can create credentials, and give it access to the DBMS_CLOUD
package.
conn sys/SysPassword1@//localhost:1521/pdb1 as sysdba

--drop user testuser1 cascade;

create user testuser1 identified by testuser1 quota unlimited on users;

grant connect, resource to testuser1;
grant create credential to testuser1;
grant execute on dbms_cloud to testuser1;
We need a local directory object to interact with files on the database file server. We give both the test user and C##CLOUD$SERVICE
user access to this directory.
create or replace directory tmp_files_dir as '/tmp/files';
grant read, write on directory tmp_files_dir to testuser1, C##CLOUD$SERVICE;
The external table functionality requires access to a directory object called DATA_PUMP_DIR, so create it in the PDB and grant read/write access to the test user.
alter session set "_oracle_script"=TRUE;
create or replace directory data_pump_dir as '/u01/app/oracle/admin/cdb1/dpdump/';
alter session set "_oracle_script"=FALSE;

grant read, write on directory data_pump_dir to testuser1;
Connect to the test user and create the following table.
conn testuser1/testuser1@//localhost:1521/pdb1

create table emp (
  empno    number(4,0),
  ename    varchar2(10 byte),
  job      varchar2(9 byte),
  mgr      number(4,0),
  hiredate date,
  sal      number(7,2),
  comm     number(7,2),
  deptno   number(2,0),
  constraint pk_emp primary key (empno)
);

insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7369,'SMITH','CLERK',7902,to_date('17-DEC-80','DD-MON-RR'),800,null,20);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7499,'ALLEN','SALESMAN',7698,to_date('20-FEB-81','DD-MON-RR'),1600,300,30);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7521,'WARD','SALESMAN',7698,to_date('22-FEB-81','DD-MON-RR'),1250,500,30);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7566,'JONES','MANAGER',7839,to_date('02-APR-81','DD-MON-RR'),2975,null,20);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7654,'MARTIN','SALESMAN',7698,to_date('28-SEP-81','DD-MON-RR'),1250,1400,30);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7698,'BLAKE','MANAGER',7839,to_date('01-MAY-81','DD-MON-RR'),2850,null,30);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7782,'CLARK','MANAGER',7839,to_date('09-JUN-81','DD-MON-RR'),2450,null,10);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7788,'SCOTT','ANALYST',7566,to_date('19-APR-87','DD-MON-RR'),3000,null,20);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7839,'KING','PRESIDENT',null,to_date('17-NOV-81','DD-MON-RR'),5000,null,10);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7844,'TURNER','SALESMAN',7698,to_date('08-SEP-81','DD-MON-RR'),1500,0,30);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7876,'ADAMS','CLERK',7788,to_date('23-MAY-87','DD-MON-RR'),1100,null,20);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7900,'JAMES','CLERK',7698,to_date('03-DEC-81','DD-MON-RR'),950,null,30);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7902,'FORD','ANALYST',7566,to_date('03-DEC-81','DD-MON-RR'),3000,null,20);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7934,'MILLER','CLERK',7782,to_date('23-JAN-82','DD-MON-RR'),1300,null,10);
commit;
All SQL operations in this article will be performed from the test user, unless otherwise stated.
conn testuser1/testuser1@//localhost:1521/pdb1
Object Store URIs
We use an object store URI for many of the examples in this article. For Oracle Cloud the URI can take either of these forms.
https://swiftobjectstorage.{region}.oraclecloud.com/v1/{namespace}/{bucket}/{objectname}
https://objectstorage.{region}.oraclecloud.com/n/{namespace}/b/{bucket}/o/{objectname}
The documentation typically uses the "swiftobjectstorage" URI, but both work. For the remainder of the article we will use the "swiftobjectstorage" URI.
The AWS S3 and Azure Blob Storage URIs typically look like this.

AWS S3             : https://s3-{region}.amazonaws.com/{bucket}/{objectname}
Azure Blob Storage : https://{account}.blob.core.windows.net/{container}/{objectname}
A number of the procedures and functions support wildcards in the URI definitions.
- * - A wildcard for multiple characters.
- ? - A wildcard for a single character.
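As a sketch of how this is used, a single URI with a wildcard can reference multiple objects in the FILE_URI_LIST parameter of procedures such as COPY_DATA, described later in this article. The file names here are hypothetical; it assumes objects called "emp1.csv" and "emp2.csv" exist in the bucket, and uses the credential we create in the next section.

begin
  dbms_cloud.copy_data(
    table_name      => 'emp',
    credential_name => 'obj_store_cred',
    -- The "emp*.csv" pattern matches the hypothetical emp1.csv and emp2.csv objects.
    file_uri_list   => 'https://swiftobjectstorage.uk-london-1.oraclecloud.com/v1/my-namespace/ob-bucket/emp*.csv',
    format          => json_object('type' value 'csv', 'skipheaders' value '1')
  );
end;
/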
Object Store Credentials
The DBMS_CLOUD
package contains a copy of the procedures from the DBMS_CREDENTIAL
package. The two packages can be used interchangeably, as they do the same thing.
Create a credential for your object store using the CREATE_CREDENTIAL
procedure. For an Oracle object storage bucket we use our Oracle Cloud email and the Auth Token we generated.
begin
  dbms_cloud.create_credential (
    credential_name => 'obj_store_cred',
    username        => 'me@example.com',
    password        => '{my-Auth-Token}');
end;
/
For AWS buckets we use our AWS access key and secret access key.
begin
  dbms_cloud.create_credential (
    credential_name => 'obj_store_cred',
    username        => 'my AWS access key',
    password        => 'my AWS secret access key');
end;
/
Information about the credential can be displayed using the USER_CREDENTIALS
view.
column credential_name format a25
column username format a20

select credential_name, username, enabled
from   user_credentials
order by credential_name;

CREDENTIAL_NAME           USERNAME             ENABL
------------------------- -------------------- -----
OBJ_STORE_CRED            me@example.com       TRUE

SQL>
The DISABLE_CREDENTIAL
and ENABLE_CREDENTIAL
procedures disable and enable credentials respectively.
begin
  dbms_credential.disable_credential('obj_store_cred');

  dbms_credential.enable_credential('obj_store_cred');
end;
/
The UPDATE_CREDENTIAL
procedure allows us to edit attributes of a credential.
begin
  dbms_credential.update_credential(
    credential_name => 'obj_store_cred',
    attribute       => 'username',
    value           => 'me@example.com');

  dbms_credential.update_credential(
    credential_name => 'obj_store_cred',
    attribute       => 'password',
    value           => '{my-Auth-Token}');
end;
/
The DROP_CREDENTIAL
procedure drops the named credential.
begin
  dbms_cloud.drop_credential(credential_name => 'obj_store_cred');
end;
/
The following examples require a valid credential.
Objects and Files
There are several routines for manipulating files on the local database file system and objects in a cloud object store.
Create a file on the database server file system.
mkdir -p /tmp/files
echo "This is a test file" > /tmp/files/test1.txt
We use the PUT_OBJECT
procedure to transfer a file from the directory object location to the cloud object store.
begin
  dbms_cloud.put_object (
    credential_name => 'obj_store_cred',
    object_uri      => 'https://swiftobjectstorage.uk-london-1.oraclecloud.com/v1/my-namespace/ob-bucket/test1.txt',
    directory_name  => 'tmp_files_dir',
    file_name       => 'test1.txt');
end;
/
There is an overload of the PUT_OBJECT
procedure to transfer the contents of a BLOB to the object store.
declare
  l_file blob;
begin
  l_file := utl_raw.cast_to_raw('This is another test file');

  dbms_cloud.put_object (
    credential_name => 'obj_store_cred',
    object_uri      => 'https://swiftobjectstorage.uk-london-1.oraclecloud.com/v1/my-namespace/ob-bucket/test2.txt',
    contents        => l_file);
end;
/
The LIST_OBJECTS
table function lists objects in the location pointed to by the object store URI.
set linesize 150
column object_name format a12
column checksum format a35
column created format a35
column last_modified format a35

select *
from   dbms_cloud.list_objects(
         credential_name => 'obj_store_cred',
         location_uri    => 'https://swiftobjectstorage.uk-london-1.oraclecloud.com/v1/my-namespace/ob-bucket');

OBJECT_NAME       BYTES CHECKSUM                            CREATED                             LAST_MODIFIED
------------ ---------- ----------------------------------- ----------------------------------- -----------------------------------
test1.txt            20 5dd39cab1c53c2c77cd352983f9641e1                                        11-SEP-21 08.45.42.779000 AM +00:00
test2.txt            25 d0914057907f9d04dd9e68b1c1e180f0                                        11-SEP-21 08.45.54.148000 AM +00:00

SQL>
We use the GET_METADATA
function to return information about a specific object.
select dbms_cloud.get_metadata(
         credential_name => 'obj_store_cred',
         object_uri      => 'https://swiftobjectstorage.uk-london-1.oraclecloud.com/v1/my-namespace/ob-bucket/test2.txt') as metadata
from   dual;

METADATA
--------------------------------------------------------------------------------
{"Content-Length":25}

SQL>
We use the GET_OBJECT
procedure to transfer an object from the cloud object store to the directory object location.
begin
  dbms_cloud.get_object (
    credential_name => 'obj_store_cred',
    object_uri      => 'https://swiftobjectstorage.uk-london-1.oraclecloud.com/v1/my-namespace/ob-bucket/test2.txt',
    directory_name  => 'tmp_files_dir',
    file_name       => 'test2.txt');
end;
/
There is a GET_OBJECT
function to transfer an object from the cloud object store to a BLOB.
declare
  l_file blob;
begin
  l_file := dbms_cloud.get_object (
    credential_name => 'obj_store_cred',
    object_uri      => 'https://swiftobjectstorage.uk-london-1.oraclecloud.com/v1/my-namespace/ob-bucket/test2.txt');
end;
/
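If we want to inspect the contents of the returned BLOB, we can convert it to text. This is a minimal sketch that assumes the object contains plain text and is small enough to convert in a single call (under 32K).

set serveroutput on

declare
  l_file blob;
begin
  l_file := dbms_cloud.get_object (
    credential_name => 'obj_store_cred',
    object_uri      => 'https://swiftobjectstorage.uk-london-1.oraclecloud.com/v1/my-namespace/ob-bucket/test2.txt');

  -- Assumes plain text content no bigger than 32767 bytes.
  dbms_output.put_line(utl_raw.cast_to_varchar2(dbms_lob.substr(l_file, 32767, 1)));
end;
/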
The DELETE_OBJECT
procedure deletes objects from the cloud object store.
begin
  dbms_cloud.delete_object(
    credential_name => 'obj_store_cred',
    object_uri      => 'https://swiftobjectstorage.uk-london-1.oraclecloud.com/v1/my-namespace/ob-bucket/test1.txt');

  dbms_cloud.delete_object(
    credential_name => 'obj_store_cred',
    object_uri      => 'https://swiftobjectstorage.uk-london-1.oraclecloud.com/v1/my-namespace/ob-bucket/test2.txt');
end;
/
The DELETE_FILE
procedure deletes files from the directory object location.
begin
  dbms_cloud.delete_file(
    directory_name => 'tmp_files_dir',
    file_name      => 'test1.txt');

  dbms_cloud.delete_file(
    directory_name => 'tmp_files_dir',
    file_name      => 'test2.txt');
end;
/
The LIST_FILES
table function lists the files in the location pointed to by the specified Oracle directory object. The documentation says it's only supported for directory objects mapping to Oracle File System (OFS) or Database File System (DBFS) file systems, so we can't use it for a regular file system. It does work in the locations provided on the Autonomous Database.
select * from dbms_cloud.list_files(directory_name => 'data_pump_dir');
External Tables
This section describes the creation of external tables based on files in a cloud object store.
CREATE_EXTERNAL_TABLE
We create a file called "emp.dat" with the following contents and place it into our object store. It is a pipe-delimited file with no headers.
7369|"SMITH"|"CLERK"|7902|17-DEC-80|800||20 7499|"ALLEN"|"SALESMAN"|7698|20-FEB-81|1600|300|30 7521|"WARD"|"SALESMAN"|7698|22-FEB-81|1250|500|30 7566|"JONES"|"MANAGER"|7839|02-APR-81|2975||20 7654|"MARTIN"|"SALESMAN"|7698|28-SEP-81|1250|1400|30 7698|"BLAKE"|"MANAGER"|7839|01-MAY-81|2850||30 7782|"CLARK"|"MANAGER"|7839|09-JUN-81|2450||10 7788|"SCOTT"|"ANALYST"|7566|19-APR-87|3000||20 7839|"KING"|"PRESIDENT"||17-NOV-81|5000||10 7844|"TURNER"|"SALESMAN"|7698|08-SEP-81|1500|0|30 7876|"ADAMS"|"CLERK"|7788|23-MAY-87|1100||20 7900|"JAMES"|"CLERK"|7698|03-DEC-81|950||30 7902|"FORD"|"ANALYST"|7566|03-DEC-81|3000||20 7934|"MILLER"|"CLERK"|7782|23-JAN-82|1300||10
The CREATE_EXTERNAL_TABLE
procedure creates an external table called EMP_EXT
based on a file in a cloud object store.
--drop table emp_ext;

begin
  dbms_cloud.create_external_table(
    table_name      => 'emp_ext',
    credential_name => 'obj_store_cred',
    file_uri_list   => 'https://swiftobjectstorage.uk-london-1.oraclecloud.com/v1/my-namespace/ob-bucket/emp.dat',
    column_list     => 'empno    number(4),
                        ename    varchar2(10),
                        job      varchar2(9),
                        mgr      number(4),
                        hiredate date,
                        sal      number(7,2),
                        comm     number(7,2),
                        deptno   number(2)',
    format          => json_object('ignoremissingcolumns' value 'true', 'removequotes' value 'true')
  );
end;
/
We query the external table, which reads the data from the cloud object store.
select * from emp_ext;

     EMPNO ENAME      JOB              MGR HIREDATE         SAL       COMM     DEPTNO
---------- ---------- --------- ---------- --------- ---------- ---------- ----------
      7369 SMITH      CLERK           7902 17-DEC-80        800                    20
      7499 ALLEN      SALESMAN        7698 20-FEB-81       1600        300         30
      7521 WARD       SALESMAN        7698 22-FEB-81       1250        500         30
      7566 JONES      MANAGER         7839 02-APR-81       2975                    20
      7654 MARTIN     SALESMAN        7698 28-SEP-81       1250       1400         30
      7698 BLAKE      MANAGER         7839 01-MAY-81       2850                    30
      7782 CLARK      MANAGER         7839 09-JUN-81       2450                    10
      7788 SCOTT      ANALYST         7566 19-APR-87       3000                    20
      7839 KING       PRESIDENT            17-NOV-81       5000                    10
      7844 TURNER     SALESMAN        7698 08-SEP-81       1500          0         30
      7876 ADAMS      CLERK           7788 23-MAY-87       1100                    20
      7900 JAMES      CLERK           7698 03-DEC-81        950                    30
      7902 FORD       ANALYST         7566 03-DEC-81       3000                    20
      7934 MILLER     CLERK           7782 23-JAN-82       1300                    10

14 rows selected.

SQL>
The FORMAT
parameter allows us to tailor the load process to suit the data file contents. The full list of format options can be found here.
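As a rough illustration, several options can be combined in a single JSON document. This is just a sketch, with option values that are assumptions about a hypothetical pipe-delimited file, not requirements. We can preview the resulting document with a simple query.

select json_object(
         'delimiter'            value '|',
         'dateformat'           value 'DD-MON-RR',
         'rejectlimit'          value '10',
         'ignoremissingcolumns' value 'true') as format_doc
from   dual;

The following example works with a CSV file.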
We create a file called "emp.csv" with the following contents and place it into our object store. It is a CSV file with a header row.
"EMPNO","ENAME","JOB","MGR","HIREDATE","SAL","COMM","DEPTNO" 7369,"SMITH","CLERK",7902,17-DEC-80,800,,20 7499,"ALLEN","SALESMAN",7698,20-FEB-81,1600,300,30 7521,"WARD","SALESMAN",7698,22-FEB-81,1250,500,30 7566,"JONES","MANAGER",7839,02-APR-81,2975,,20 7654,"MARTIN","SALESMAN",7698,28-SEP-81,1250,1400,30 7698,"BLAKE","MANAGER",7839,01-MAY-81,2850,,30 7782,"CLARK","MANAGER",7839,09-JUN-81,2450,,10 7788,"SCOTT","ANALYST",7566,19-APR-87,3000,,20 7839,"KING","PRESIDENT",,17-NOV-81,5000,,10 7844,"TURNER","SALESMAN",7698,08-SEP-81,1500,0,30 7876,"ADAMS","CLERK",7788,23-MAY-87,1100,,20 7900,"JAMES","CLERK",7698,03-DEC-81,950,,30 7902,"FORD","ANALYST",7566,03-DEC-81,3000,,20 7934,"MILLER","CLERK",7782,23-JAN-82,1300,,10
The CREATE_EXTERNAL_TABLE
procedure creates an external table called EMP_CSV_EXT
based on a file in a cloud object store.
--drop table emp_csv_ext;

begin
  dbms_cloud.create_external_table(
    table_name      => 'emp_csv_ext',
    credential_name => 'obj_store_cred',
    file_uri_list   => 'https://swiftobjectstorage.uk-london-1.oraclecloud.com/v1/my-namespace/ob-bucket/emp.csv',
    column_list     => 'empno    number(4),
                        ename    varchar2(10),
                        job      varchar2(9),
                        mgr      number(4),
                        hiredate date,
                        sal      number(7,2),
                        comm     number(7,2),
                        deptno   number(2)',
    format          => json_object('type' value 'csv', 'skipheaders' value '1')
  );
end;
/
We query the external table, which reads the data from the cloud object store.
select * from emp_csv_ext;

     EMPNO ENAME      JOB              MGR HIREDATE         SAL       COMM     DEPTNO
---------- ---------- --------- ---------- --------- ---------- ---------- ----------
      7369 SMITH      CLERK           7902 17-DEC-80        800                    20
      7499 ALLEN      SALESMAN        7698 20-FEB-81       1600        300         30
      7521 WARD       SALESMAN        7698 22-FEB-81       1250        500         30
      7566 JONES      MANAGER         7839 02-APR-81       2975                    20
      7654 MARTIN     SALESMAN        7698 28-SEP-81       1250       1400         30
      7698 BLAKE      MANAGER         7839 01-MAY-81       2850                    30
      7782 CLARK      MANAGER         7839 09-JUN-81       2450                    10
      7788 SCOTT      ANALYST         7566 19-APR-87       3000                    20
      7839 KING       PRESIDENT            17-NOV-81       5000                    10
      7844 TURNER     SALESMAN        7698 08-SEP-81       1500          0         30
      7876 ADAMS      CLERK           7788 23-MAY-87       1100                    20
      7900 JAMES      CLERK           7698 03-DEC-81        950                    30
      7902 FORD       ANALYST         7566 03-DEC-81       3000                    20
      7934 MILLER     CLERK           7782 23-JAN-82       1300                    10

14 rows selected.

SQL>
The VALIDATE_EXTERNAL_TABLE
procedure allows us to check the validity of an external table.
begin
  dbms_cloud.validate_external_table('emp_csv_ext');
end;
/
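There is also an overload of VALIDATE_EXTERNAL_TABLE that returns an operation ID and lets us limit how many rows are validated. The following is a minimal sketch; the operation ID ties in with the USER_LOAD_OPERATIONS view described later in this article.

set serveroutput on

declare
  l_operation_id number;
begin
  dbms_cloud.validate_external_table(
    table_name   => 'emp_csv_ext',
    operation_id => l_operation_id,
    rowcount     => 10);

  dbms_output.put_line('Operation ID : ' || l_operation_id);
end;
/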
CREATE_EXTERNAL_PART_TABLE
Create four CSV files using the following queries.
set markup csv on quote on
set trimspool on linesize 1000 feedback off pagesize 0

spool /tmp/files/gbr1.txt
select 'GBR', object_id, owner, object_name from all_objects where object_id <= 2000 and rownum <= 1000;
spool off

spool /tmp/files/gbr2.txt
select 'GBR', object_id, owner, object_name from all_objects where object_id between 2000 and 3999 and rownum <= 1000;
spool off

spool /tmp/files/ire1.txt
select 'IRE', object_id, owner, object_name from all_objects where object_id <= 2000 and rownum <= 1000;
spool off

spool /tmp/files/ire2.txt
select 'IRE', object_id, owner, object_name from all_objects where object_id between 2000 and 3999 and rownum <= 1000;
spool off

set markup csv off
set trimspool on linesize 1000 feedback off pagesize 14
You may need to clean up the start and end of the files a little before uploading them. Copy the files to the object store.
begin
  dbms_cloud.put_object (
    credential_name => 'obj_store_cred',
    object_uri      => 'https://swiftobjectstorage.uk-london-1.oraclecloud.com/v1/my-namespace/ob-bucket/gbr1.txt',
    directory_name  => 'tmp_files_dir',
    file_name       => 'gbr1.txt');
end;
/

begin
  dbms_cloud.put_object (
    credential_name => 'obj_store_cred',
    object_uri      => 'https://swiftobjectstorage.uk-london-1.oraclecloud.com/v1/my-namespace/ob-bucket/gbr2.txt',
    directory_name  => 'tmp_files_dir',
    file_name       => 'gbr2.txt');
end;
/

begin
  dbms_cloud.put_object (
    credential_name => 'obj_store_cred',
    object_uri      => 'https://swiftobjectstorage.uk-london-1.oraclecloud.com/v1/my-namespace/ob-bucket/ire1.txt',
    directory_name  => 'tmp_files_dir',
    file_name       => 'ire1.txt');
end;
/

begin
  dbms_cloud.put_object (
    credential_name => 'obj_store_cred',
    object_uri      => 'https://swiftobjectstorage.uk-london-1.oraclecloud.com/v1/my-namespace/ob-bucket/ire2.txt',
    directory_name  => 'tmp_files_dir',
    file_name       => 'ire2.txt');
end;
/
The CREATE_EXTERNAL_PART_TABLE
procedure creates an external partitioned table called COUNTRY_PART_TAB_EXT
based on files in a cloud object store.
--drop table country_part_tab_ext;

begin
  dbms_cloud.create_external_part_table(
    table_name      => 'country_part_tab_ext',
    credential_name => 'obj_store_cred',
    format          => json_object('type' value 'csv', 'skipheaders' value '1'),
    column_list     => 'country_code varchar2(3),
                        object_id    number,
                        owner        varchar2(128),
                        object_name  varchar2(128)',
    partitioning_clause => 'partition by list (country_code) (
                              partition part_gbr values (''GBR'') location (
                                ''https://swiftobjectstorage.uk-london-1.oraclecloud.com/v1/my-namespace/ob-bucket/gbr1.txt'',
                                ''https://swiftobjectstorage.uk-london-1.oraclecloud.com/v1/my-namespace/ob-bucket/gbr2.txt''
                              ),
                              partition part_ire values (''IRE'') location (
                                ''https://swiftobjectstorage.uk-london-1.oraclecloud.com/v1/my-namespace/ob-bucket/ire1.txt'',
                                ''https://swiftobjectstorage.uk-london-1.oraclecloud.com/v1/my-namespace/ob-bucket/ire2.txt''
                              )
                            )'
  );
end;
/
We query the external table, which reads the data from the cloud object store.
select country_code, count(*) as amount
from   country_part_tab_ext
group by country_code
order by country_code;

COU     AMOUNT
--- ----------
GBR       2000
IRE       2000

SQL>
The VALIDATE_EXTERNAL_PART_TABLE
procedure allows us to check the validity of an external partitioned table.
begin
  dbms_cloud.validate_external_part_table('country_part_tab_ext');
end;
/
CREATE_HYBRID_PART_TABLE
The CREATE_HYBRID_PART_TABLE
procedure creates an external hybrid partitioned table called COUNTRY_HYBRID_PART_TAB_EXT
based on files in a cloud object store.
--drop table country_hybrid_part_tab_ext;

begin
  dbms_cloud.create_hybrid_part_table(
    table_name      => 'country_hybrid_part_tab_ext',
    credential_name => 'obj_store_cred',
    format          => json_object('type' value 'csv', 'skipheaders' value '1'),
    column_list     => 'country_code varchar2(3),
                        object_id    number,
                        owner        varchar2(128),
                        object_name  varchar2(128)',
    partitioning_clause => 'partition by list (country_code) (
                              partition part_gbr values (''GBR'') external location (
                                ''https://swiftobjectstorage.uk-london-1.oraclecloud.com/v1/my-namespace/ob-bucket/gbr1.txt'',
                                ''https://swiftobjectstorage.uk-london-1.oraclecloud.com/v1/my-namespace/ob-bucket/gbr2.txt''
                              ),
                              partition part_ire values (''IRE'') external location (
                                ''https://swiftobjectstorage.uk-london-1.oraclecloud.com/v1/my-namespace/ob-bucket/ire1.txt'',
                                ''https://swiftobjectstorage.uk-london-1.oraclecloud.com/v1/my-namespace/ob-bucket/ire2.txt''
                              ),
                              partition part_usa values (''USA'')
                            )'
  );
end;
/
We insert a row into the regular partition.
insert into country_hybrid_part_tab_ext values ('USA', 123, 'banana', 'banana');
commit;
We query the external table, which reads the data from the cloud object store.
select country_code, count(*) as amount
from   country_hybrid_part_tab_ext
group by country_code
order by country_code;

COU     AMOUNT
--- ----------
GBR       2000
IRE       2000
USA          1

SQL>
The VALIDATE_HYBRID_PART_TABLE
procedure allows us to check the validity of an external hybrid partitioned table.
begin
  dbms_cloud.validate_hybrid_part_table('country_hybrid_part_tab_ext');
end;
/
Copy Data
The COPY_DATA
procedure allows us to copy data from a cloud object store into an existing table.
We create a file called "emp.csv" with the following contents and place it into our object store. It is a CSV file with a header row.
"EMPNO","ENAME","JOB","MGR","HIREDATE","SAL","COMM","DEPTNO" 7369,"SMITH","CLERK",7902,17-DEC-80,800,,20 7499,"ALLEN","SALESMAN",7698,20-FEB-81,1600,300,30 7521,"WARD","SALESMAN",7698,22-FEB-81,1250,500,30 7566,"JONES","MANAGER",7839,02-APR-81,2975,,20 7654,"MARTIN","SALESMAN",7698,28-SEP-81,1250,1400,30 7698,"BLAKE","MANAGER",7839,01-MAY-81,2850,,30 7782,"CLARK","MANAGER",7839,09-JUN-81,2450,,10 7788,"SCOTT","ANALYST",7566,19-APR-87,3000,,20 7839,"KING","PRESIDENT",,17-NOV-81,5000,,10 7844,"TURNER","SALESMAN",7698,08-SEP-81,1500,0,30 7876,"ADAMS","CLERK",7788,23-MAY-87,1100,,20 7900,"JAMES","CLERK",7698,03-DEC-81,950,,30 7902,"FORD","ANALYST",7566,03-DEC-81,3000,,20 7934,"MILLER","CLERK",7782,23-JAN-82,1300,,10
We truncate the local EMP
table and reload the data from the cloud object store using the COPY_DATA
procedure.
truncate table emp;

begin
  dbms_cloud.copy_data(
    table_name      => 'emp',
    credential_name => 'obj_store_cred',
    file_uri_list   => 'https://swiftobjectstorage.uk-london-1.oraclecloud.com/v1/my-namespace/ob-bucket/emp.csv',
    format          => json_object('type' value 'csv', 'skipheaders' value '1')
  );
end;
/
We query the EMP
table, and we can see the data has been loaded.
select * from emp;

     EMPNO ENAME      JOB              MGR HIREDATE         SAL       COMM     DEPTNO
---------- ---------- --------- ---------- --------- ---------- ---------- ----------
      7369 SMITH      CLERK           7902 17-DEC-80        800                    20
      7499 ALLEN      SALESMAN        7698 20-FEB-81       1600        300         30
      7521 WARD       SALESMAN        7698 22-FEB-81       1250        500         30
      7566 JONES      MANAGER         7839 02-APR-81       2975                    20
      7654 MARTIN     SALESMAN        7698 28-SEP-81       1250       1400         30
      7698 BLAKE      MANAGER         7839 01-MAY-81       2850                    30
      7782 CLARK      MANAGER         7839 09-JUN-81       2450                    10
      7788 SCOTT      ANALYST         7566 19-APR-87       3000                    20
      7839 KING       PRESIDENT            17-NOV-81       5000                    10
      7844 TURNER     SALESMAN        7698 08-SEP-81       1500          0         30
      7876 ADAMS      CLERK           7788 23-MAY-87       1100                    20
      7900 JAMES      CLERK           7698 03-DEC-81        950                    30
      7902 FORD       ANALYST         7566 03-DEC-81       3000                    20
      7934 MILLER     CLERK           7782 23-JAN-82       1300                    10

14 rows selected.

SQL>
Similar to the external table examples, the FORMAT
parameter allows us to tailor the load process to suit the data file contents.
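Each load is also recorded in the USER_LOAD_OPERATIONS view, described at the end of this article, so we can check the outcome of a COPY_DATA call with a query like the following. This is just a sketch; the ID values will differ on your system.

column type format a10
column status format a10

select id, type, status, rows_loaded
from   user_load_operations
order by id;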
Export Data
The EXPORT_DATA
procedure takes the data generated by a query and exports it to a cloud object store in the requested format. This doesn't seem to work on the on-prem version of the DBMS_CLOUD
package, but does work on Autonomous Database.
begin
  dbms_cloud.export_data (
    credential_name => 'obj_store_cred',
    file_uri_list   => 'https://swiftobjectstorage.uk-london-1.oraclecloud.com/v1/my-namespace/ob-bucket/emp.json',
    query           => 'select * from emp',
    format          => '{"type" : "JSON"}'
  );
end;
/

begin
  dbms_cloud.export_data (
    credential_name => 'obj_store_cred',
    file_uri_list   => 'https://swiftobjectstorage.uk-london-1.oraclecloud.com/v1/my-namespace/ob-bucket/emp.csv',
    query           => 'select * from emp',
    format          => '{"type" : "CSV"}'
  );
end;
/
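Since EXPORT_DATA generates the output files for us, we can confirm what was actually written by listing the bucket, as we did earlier. Treat this as a sketch: the generated object names may include a suffix rather than matching the supplied URI exactly.

select object_name, bytes
from   dbms_cloud.list_objects(
         credential_name => 'obj_store_cred',
         location_uri    => 'https://swiftobjectstorage.uk-london-1.oraclecloud.com/v1/my-namespace/ob-bucket');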
SODA Collections
Oracle databases can be used as a document store using Simple Oracle Document Access (SODA). You can read about SODA here.
We create a new collection called "TestCollection1".
set serveroutput on

declare
  l_collection soda_collection_t;
begin
  l_collection := dbms_soda.create_collection('TestCollection1');

  if l_collection is not null then
    dbms_output.put_line('Collection name : ' || l_collection.get_name());
  else
    dbms_output.put_line('Collection does not exist.');
  end if;
end;
/

Collection name : TestCollection1

PL/SQL procedure successfully completed.

SQL>
Create a file called "fruit.json" with the following contents and upload it to your cloud object store.
{"fruit": "banana"}
The COPY_COLLECTION
procedure loads the data from our cloud object store into the collection.
begin
  dbms_cloud.copy_collection(
    collection_name => 'TestCollection1',
    credential_name => 'obj_store_cred',
    file_uri_list   => 'https://swiftobjectstorage.uk-london-1.oraclecloud.com/v1/my-namespace/ob-bucket/fruit.json',
    format          => json_object('unpackarrays' value 'true')
  );
end;
/
We can see the data in the collection using the following query.
select json_document
from   "TestCollection1";

JSON_DOCUMENT
--------------------------------------------------------------------------------
{"fruit":"banana"}

SQL>
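Since the documents are held in a regular column, the usual SQL/JSON functions work against the collection table. The following is a minimal sketch, assuming the default collection layout where JSON_DOCUMENT is a BLOB, hence the explicit FORMAT JSON.

select json_value(json_document format json, '$.fruit') as fruit
from   "TestCollection1";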
Delete Operations
Many of the DBMS_CLOUD
operations produce additional files (log files, bad files, temp files etc.). These need to be cleaned up once an operation is complete. Several of the procedures shown above have overloads that return an operation ID value, which can be used with the DELETE_OPERATION
procedure to clean up the extra files. Alternatively the operations for the current session can be displayed using the USER_LOAD_OPERATIONS
view.
SQL> desc user_load_operations
 Name               Null?    Type
 ------------------ -------- ---------------------------
 ID                 NOT NULL NUMBER
 TYPE               NOT NULL VARCHAR2(128)
 SID                NOT NULL NUMBER
 SERIAL#            NOT NULL NUMBER
 START_TIME                  TIMESTAMP(6) WITH TIME ZONE
 UPDATE_TIME                 TIMESTAMP(6) WITH TIME ZONE
 STATUS                      VARCHAR2(9)
 OWNER_NAME                  VARCHAR2(128)
 TABLE_NAME                  VARCHAR2(128)
 PARTITION_NAME              VARCHAR2(128)
 SUBPARTITION_NAME           VARCHAR2(128)
 FILE_URI_LIST               VARCHAR2(4000)
 ROWS_LOADED                 NUMBER
 LOGFILE_TABLE               VARCHAR2(128)
 BADFILE_TABLE               VARCHAR2(128)
 TEMPEXT_TABLE               VARCHAR2(128)

SQL>
We return the operations for the current session using the USER_LOAD_OPERATIONS
view.
column type format a10

select id, type
from   user_load_operations
order by 1;

        ID TYPE
---------- ----------
         1 COPY
        11 COPY

SQL>
The DELETE_OPERATION
procedure allows us to clear up the additional files associated with a specific operation.
begin
  dbms_cloud.delete_operation(1);
end;
/

select id, type
from   user_load_operations
order by 1;

        ID TYPE
---------- ----------
        11 COPY

SQL>
The DELETE_ALL_OPERATIONS
procedure allows us to clean up the additional files for all operations, or those for specific types of operations if we specify the type value.
-- Delete only COPY operations.
begin
  dbms_cloud.delete_all_operations('COPY');
end;
/

-- Delete all operations.
begin
  dbms_cloud.delete_all_operations;
end;
/
For more information see:
- DBMS_CLOUD Package
- DBMS_CLOUD : Objects and Files
- DBMS_CLOUD : External Tables
- DBMS_CLOUD : Installation on 19c and 21c On-Prem Databases
- Oracle Cloud Infrastructure (OCI) : Create an Object Storage Bucket
- Oracle Cloud : Autonomous Database (ADW or ATP) - Load Data from an Object Store (DBMS_CLOUD)
- DBMS_CREDENTIAL : Persist Database and OS Credentials in Oracle Database 12c Release 1 (12.1)
Hope this helps. Regards Tim...