Oracle Cloud : APEX Application Development Service (APX) - Import Data
This article describes how to import data into the APEX Application Development (APX) service on the Oracle Cloud.
At first glance this sounds like it should be the same as the process for the ADW and ATP services, described here. The difference with the APEX Application Development Service is there is no SQL*Net access, so we can't use the impdp utility. Instead we have to use the DBMS_DATAPUMP package to perform the import.
- Create Test Schema
- Export Data
- Create an Object Storage Bucket and Upload the Dump File
- Create Credential
- Transfer the Dump File
- Import Data
Related articles.
- Oracle Cloud : APEX Application Development Service (APX) - Create Service
- Oracle Cloud Infrastructure (OCI) : Create an Object Storage Bucket
- Data Pump API for PL/SQL (DBMS_DATAPUMP)
- Oracle Cloud : Autonomous Database (ADW or ATP) - Import Data from an Object Store (impdp)
- Cloud : All Articles
Create Test Schema
We need a dump file to test the import process. We create a schema in our local database, which we can export.
conn sys/SysPassword1@//localhost:1521/pdb1 as sysdba

-- Create test user.
create user testuser1 identified by testuser1 quota unlimited on users;
grant create session, create table to testuser1;

-- Create a directory object for use with the export.
create directory tmp_dir as '/tmp/';
grant read, write on directory tmp_dir to testuser1;
We connect to the TESTUSER1 user and create some objects.
conn testuser1/testuser1@//localhost:1521/pdb1

-- DROP TABLE EMP PURGE;
-- DROP TABLE DEPT PURGE;

CREATE TABLE DEPT (
  DEPTNO NUMBER(2) CONSTRAINT PK_DEPT PRIMARY KEY,
  DNAME VARCHAR2(14),
  LOC VARCHAR2(13)
);

CREATE TABLE EMP (
  EMPNO NUMBER(4) CONSTRAINT PK_EMP PRIMARY KEY,
  ENAME VARCHAR2(10),
  JOB VARCHAR2(9),
  MGR NUMBER(4),
  HIREDATE DATE,
  SAL NUMBER(7,2),
  COMM NUMBER(7,2),
  DEPTNO NUMBER(2) CONSTRAINT FK_DEPTNO REFERENCES DEPT
);

INSERT INTO DEPT VALUES (10,'ACCOUNTING','NEW YORK');
INSERT INTO DEPT VALUES (20,'RESEARCH','DALLAS');
INSERT INTO DEPT VALUES (30,'SALES','CHICAGO');
INSERT INTO DEPT VALUES (40,'OPERATIONS','BOSTON');

INSERT INTO EMP VALUES (7369,'SMITH','CLERK',7902,to_date('17-12-1980','dd-mm-yyyy'),800,NULL,20);
INSERT INTO EMP VALUES (7499,'ALLEN','SALESMAN',7698,to_date('20-2-1981','dd-mm-yyyy'),1600,300,30);
INSERT INTO EMP VALUES (7521,'WARD','SALESMAN',7698,to_date('22-2-1981','dd-mm-yyyy'),1250,500,30);
INSERT INTO EMP VALUES (7566,'JONES','MANAGER',7839,to_date('2-4-1981','dd-mm-yyyy'),2975,NULL,20);
INSERT INTO EMP VALUES (7654,'MARTIN','SALESMAN',7698,to_date('28-9-1981','dd-mm-yyyy'),1250,1400,30);
INSERT INTO EMP VALUES (7698,'BLAKE','MANAGER',7839,to_date('1-5-1981','dd-mm-yyyy'),2850,NULL,30);
INSERT INTO EMP VALUES (7782,'CLARK','MANAGER',7839,to_date('9-6-1981','dd-mm-yyyy'),2450,NULL,10);
INSERT INTO EMP VALUES (7788,'SCOTT','ANALYST',7566,to_date('13-JUL-87','dd-mm-rr')-85,3000,NULL,20);
INSERT INTO EMP VALUES (7839,'KING','PRESIDENT',NULL,to_date('17-11-1981','dd-mm-yyyy'),5000,NULL,10);
INSERT INTO EMP VALUES (7844,'TURNER','SALESMAN',7698,to_date('8-9-1981','dd-mm-yyyy'),1500,0,30);
INSERT INTO EMP VALUES (7876,'ADAMS','CLERK',7788,to_date('13-JUL-87', 'dd-mm-rr')-51,1100,NULL,20);
INSERT INTO EMP VALUES (7900,'JAMES','CLERK',7698,to_date('3-12-1981','dd-mm-yyyy'),950,NULL,30);
INSERT INTO EMP VALUES (7902,'FORD','ANALYST',7566,to_date('3-12-1981','dd-mm-yyyy'),3000,NULL,20);
INSERT INTO EMP VALUES (7934,'MILLER','CLERK',7782,to_date('23-1-1982','dd-mm-yyyy'),1300,NULL,10);
COMMIT;
Export Data
We perform a schema-level export using the expdp utility on our local server.
expdp testuser1/testuser1@pdb1 \
  schemas=testuser1 \
  directory=tmp_dir \
  dumpfile=emp_dept%u.dmp \
  logfile=expdp_emp_dept.log \
  data_options=group_partition_table_data
We now have a dump file called "emp_dept01.dmp" in the "/tmp/" directory of our local server.
Create an Object Storage Bucket and Upload the Dump File
We create an object storage bucket on Oracle Cloud, upload the dump file and make sure we have a valid authentication token to access the bucket. All that's described in this article.
Once we've completed that, we have the following information.
- URL Path: The URL path (URI) of the dump file we uploaded to the object storage.
- Username: The username (email address) of the user we generated the authentication token for.
- Authentication Token: The authentication token we copied during the token creation.
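For reference, the native URI for an object in OCI object storage follows this general pattern, where the pieces in curly braces are placeholders for your own region, namespace, bucket and object names.

```
https://objectstorage.{region}.oraclecloud.com/n/{namespace}/b/{bucket}/o/{object-name}
```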
Create Credential
We log into the APEX development instance on Oracle Cloud and launch SQL Developer Web. We create a credential using the CREATE_CREDENTIAL procedure in the DBMS_CLOUD package.
begin
  dbms_cloud.drop_credential(credential_name => 'OBJ_STORE_CRED');
end;
/

begin
  dbms_cloud.create_credential (
    credential_name => 'OBJ_STORE_CRED',
    username        => 'me@example.com',
    password        => 'User Auth Token Here'
  );
end;
/
We can display the credential information using the DBA_CREDENTIALS view.
select * from dba_credentials;
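If we only want to check the credential we just created, we can restrict the query to the credential name. The columns used here are a subset of those in the view, so check the view description on your version if the query complains.

```sql
select credential_name, username, enabled
from   dba_credentials
where  credential_name = 'OBJ_STORE_CRED';
```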
We can test the credential using the LIST_OBJECTS table function in the DBMS_CLOUD package. The following query uses the credential and the URL path of the bucket to list the files in the bucket.
select *
from   dbms_cloud.list_objects(
         'OBJ_STORE_CRED',
         'https://objectstorage.uk-london-1.oraclecloud.com/n/????????/b/ob-bucket/o/');
Transfer the Dump File
We log into the APEX development instance on Oracle Cloud and launch SQL Developer Web. We load the file into the DATA_PUMP_DIR directory using the GET_OBJECT procedure in the DBMS_CLOUD package. The DATA_PUMP_DIR directory is a default directory object available on ADW, ATP and the APEX development instances.
begin
  dbms_cloud.get_object(
    credential_name => 'OBJ_STORE_CRED',
    object_uri      => 'https://objectstorage.uk-london-1.oraclecloud.com/n/????????/b/ob-bucket/o/emp_dept01.dmp',
    directory_name  => 'DATA_PUMP_DIR');
end;
/
We can check the file is in the directory using the LIST_FILES table function in the DBMS_CLOUD package.
select * from dbms_cloud.list_files('DATA_PUMP_DIR');
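If we want to confirm the transfer was complete, we can filter on the file name and compare the reported size against the dump file on the local server. This assumes the OBJECT_NAME and BYTES columns of the table function output, which is what LIST_FILES returns on current versions.

```sql
select object_name, bytes
from   dbms_cloud.list_files('DATA_PUMP_DIR')
where  object_name = 'emp_dept01.dmp';
```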
Import Data
With the file loaded into the DATA_PUMP_DIR directory, we can now import the data using the Data Pump API for PL/SQL. You can read more about it in this article.
We log into the APEX development instance on Oracle Cloud and launch SQL Developer Web. We create a new user.
create user testuser1 identified by "TstUsr123####";
alter user testuser1 quota unlimited on data;
grant create session, create table to testuser1;
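As a sanity check before the import, we can confirm the quota was applied using the DBA_TS_QUOTAS view. An unlimited quota is reported as a MAX_BYTES value of -1.

```sql
select tablespace_name, bytes, max_bytes
from   dba_ts_quotas
where  username = 'TESTUSER1';
```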
We import the dump file. The API calls perform a schema-level import of the TESTUSER1 schema, remapping references from the USERS tablespace to the DATA tablespace. The DATA tablespace is the default tablespace on the service.
set serveroutput on
declare
  l_username  varchar2(30) := 'TESTUSER1';
  l_dump_file varchar2(50) := 'emp_dept%u.dmp';
  l_log_file  varchar2(50) := 'impdpTESTUSER1.log';
  l_source_ts varchar2(30) := 'USERS';
  l_dp_handle number;
  l_result    varchar2(32767);
begin
  -- Open a schema import job.
  l_dp_handle := dbms_datapump.open(
    operation   => 'IMPORT',
    job_mode    => 'SCHEMA',
    remote_link => NULL,
    job_name    => dbms_scheduler.generate_job_name);

  -- Specify the schema to be imported.
  dbms_datapump.metadata_filter(
    handle => l_dp_handle,
    name   => 'SCHEMA_EXPR',
    value  => '= ''' || l_username || '''');

  -- Specify the dump file name and directory object name.
  dbms_datapump.add_file(
    handle    => l_dp_handle,
    filename  => l_dump_file,
    directory => 'DATA_PUMP_DIR');

  -- Specify the log file name and directory object name.
  dbms_datapump.add_file(
    handle    => l_dp_handle,
    filename  => l_log_file,
    directory => 'DATA_PUMP_DIR',
    filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);

  -- Perform a REMAP_TABLESPACE from USERS to DATA.
  dbms_datapump.metadata_remap(
    handle    => l_dp_handle,
    name      => 'REMAP_TABLESPACE',
    old_value => l_source_ts,
    value     => 'DATA');

  dbms_datapump.start_job(l_dp_handle);

  dbms_datapump.wait_for_job(
    handle    => l_dp_handle,
    job_state => l_result);

  dbms_output.put_line('l_result=' || l_result);
end;
/
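The WAIT_FOR_JOB call blocks the session until the job completes, so for larger imports we can check progress from a second session using the DBA_DATAPUMP_JOBS view.

```sql
select owner_name, job_name, operation, job_mode, state
from   dba_datapump_jobs;
```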
We can check the log file by reading it from the directory.
set serveroutput on
declare
  l_log_file varchar2(32767) := 'impdpTESTUSER1.log';
  l_dir      varchar2(32767) := 'DATA_PUMP_DIR';
  l_file     sys.utl_file.file_type;
  l_text     varchar2(32767);
begin
  l_file := sys.utl_file.fopen(l_dir, l_log_file, 'r', 32767);
  begin
    loop
      sys.utl_file.get_line(l_file, l_text);
      dbms_output.put_line(l_text);
    end loop;
  exception
    when no_data_found then
      null;
  end;
  sys.utl_file.fclose(l_file);
end;
/
Alternatively we can copy the file to our bucket and download it from there.
begin
  dbms_cloud.put_object(
    credential_name => 'OBJ_STORE_CRED',
    object_uri      => 'https://objectstorage.uk-london-1.oraclecloud.com/n/????????/b/ob-bucket/o/import.log',
    directory_name  => 'DATA_PUMP_DIR',
    file_name       => 'impdpTESTUSER1.log');
end;
/
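Once we're happy with the import, we can tidy up by removing the dump and log files from the DATA_PUMP_DIR directory using the DELETE_FILE procedure in the DBMS_CLOUD package.

```sql
begin
  dbms_cloud.delete_file(
    directory_name => 'DATA_PUMP_DIR',
    file_name      => 'emp_dept01.dmp');

  dbms_cloud.delete_file(
    directory_name => 'DATA_PUMP_DIR',
    file_name      => 'impdpTESTUSER1.log');
end;
/
```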
For more information see:
- Oracle Cloud : APEX Application Development Service (APX) - Create Service
- Oracle Cloud Infrastructure (OCI) : Create an Object Storage Bucket
- Data Pump API for PL/SQL (DBMS_DATAPUMP)
- Oracle Cloud : Autonomous Database (ADW or ATP) - Import Data from an Object Store (impdp)
- Cloud : All Articles
Hope this helps. Regards Tim...