Oracle REST Data Services (ORDS) : Database API - Data Pump
The Oracle REST Data Services (ORDS) database API allows us to create Data Pump export and import jobs via REST web service calls.
- Assumptions
- Setup
- Get Data Pump Jobs
- Table Export
- Table Import
- Schema Export
- Schema Import
- Database Export
- Database Import
- Thoughts
Related articles.
- Oracle REST Data Services (ORDS) : Database API - Setup
- Oracle REST Data Services (ORDS) : All Articles
- Data Pump (expdp, impdp) : All Articles
- Data Pump API (DBMS_DATAPUMP)
- Data Pump Quick Links : 10g, 11g, 12cR1, 12cR2, 18c, 19c, 21c, Transportable Tablespaces
Assumptions
This article assumes the following.
- You already have a functioning installation of ORDS, using an application server or standalone mode.
- ORDS can be installed in the root container or a PDB. We're using a PDB installation.
- You must have performed the Database API setup described in the setup article listed above. This functionality requires the ORDS enabled schema alias approach in version 20.2, but it should work with the default administrator approach in a future version.
- The paths for the ORDS configuration match those from the ORDS installation article listed above.
- You have an Oracle database available. In this article I will be using a 19c database, but it works just the same for other versions.
- You have a way to call the web services. In these examples we use "curl", but you could use Postman or Insomnia. You can find an explanation of the curl parameters in the setup article.
We will use the following endpoint to perform the operations.
https://localhost:8443/ords/{schema-alias}/_/db-api/stable/database/datapump/jobs/

# Example
https://localhost:8443/ords/dbapi_user/_/db-api/stable/database/datapump/jobs/
If you are using a root container installation, you will have to adjust the URL to include the database service name. In this example that is the PDB1 pluggable database service.
https://localhost:8443/ords/{database-service}/{schema-alias}/_/db-api/stable/database/datapump/jobs/

# Example
https://localhost:8443/ords/pdb1/dbapi_user/_/db-api/stable/database/datapump/jobs/
There are also separate RPC-style export and import endpoints. I'm going to avoid them as they are not "REST", having their action determined by the URL, and all they do is allow us to omit the "operation" item from the payload.
# REST
.../_/db-api/stable/database/datapump/jobs/

# RPC
.../_/db-api/stable/database/datapump/export
.../_/db-api/stable/database/datapump/import
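For completeness, this is roughly what an RPC-style export call would look like. This is a sketch based on the endpoints above; the payload matches the schema export example later in this article, minus the "operation" item.

cat > /tmp/payload.json <<EOF
{
  "job_mode": "SCHEMA",
  "datapump_dir": "TEST_DIR",
  "filter": "TESTUSER1"
}
EOF

# RPC-style call. The action (export) comes from the URL, not the payload.
curl -ks -X POST \
  --user dbapi_user:DbApiUserPassword1 \
  --data-binary @/tmp/payload.json \
  --header "Content-Type:application/json" \
  https://localhost:8443/ords/dbapi_user/_/db-api/stable/database/datapump/export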
Setup
We create a new database user for our testing.
conn / as sysdba
alter session set container=pdb1;

--drop user testuser1 cascade;

create user testuser1 identified by testuser1
  default tablespace users quota unlimited on users;

grant create session, create table, create type to testuser1;
We create and populate a copy of the EMP table in the test user.
conn testuser1/testuser1@pdb1

create table emp (
  empno    number(4,0),
  ename    varchar2(10 byte),
  job      varchar2(9 byte),
  mgr      number(4,0),
  hiredate date,
  sal      number(7,2),
  comm     number(7,2),
  deptno   number(2,0),
  constraint pk_emp primary key (empno)
);

insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7369,'SMITH','CLERK',7902,to_date('17-DEC-80','DD-MON-RR'),800,null,20);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7499,'ALLEN','SALESMAN',7698,to_date('20-FEB-81','DD-MON-RR'),1600,300,30);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7521,'WARD','SALESMAN',7698,to_date('22-FEB-81','DD-MON-RR'),1250,500,30);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7566,'JONES','MANAGER',7839,to_date('02-APR-81','DD-MON-RR'),2975,null,20);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7654,'MARTIN','SALESMAN',7698,to_date('28-SEP-81','DD-MON-RR'),1250,1400,30);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7698,'BLAKE','MANAGER',7839,to_date('01-MAY-81','DD-MON-RR'),2850,null,30);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7782,'CLARK','MANAGER',7839,to_date('09-JUN-81','DD-MON-RR'),2450,null,10);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7788,'SCOTT','ANALYST',7566,to_date('19-APR-87','DD-MON-RR'),3000,null,20);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7839,'KING','PRESIDENT',null,to_date('17-NOV-81','DD-MON-RR'),5000,null,10);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7844,'TURNER','SALESMAN',7698,to_date('08-SEP-81','DD-MON-RR'),1500,0,30);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7876,'ADAMS','CLERK',7788,to_date('23-MAY-87','DD-MON-RR'),1100,null,20);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7900,'JAMES','CLERK',7698,to_date('03-DEC-81','DD-MON-RR'),950,null,30);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7902,'FORD','ANALYST',7566,to_date('03-DEC-81','DD-MON-RR'),3000,null,20);
insert into emp (empno,ename,job,mgr,hiredate,sal,comm,deptno) values (7934,'MILLER','CLERK',7782,to_date('23-JAN-82','DD-MON-RR'),1300,null,10);
commit;
We create a physical directory on the database server to use with the export/import operations.
mkdir -p /tmp/dp
We create an Oracle directory object pointing to the physical location.
conn / as sysdba
alter session set container=pdb1;

create or replace directory test_dir as '/tmp/dp';
grant read, write on directory test_dir to testuser1;
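As a quick sanity check, which is my addition rather than a required step, we can confirm the directory object exists and points to the expected location.

column directory_name format a20
column directory_path format a30

select directory_name,
       directory_path
from   dba_directories
where  directory_name = 'TEST_DIR';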
Once a job is running, its status is visible in the DBA_DATAPUMP_JOBS view, and via the GET service shown below.
column owner_name format a20
column job_name format a30
column operation format a10
column job_mode format a10
column state format a12

select owner_name,
       job_name,
       trim(operation) as operation,
       trim(job_mode) as job_mode,
       state,
       degree,
       attached_sessions,
       datapump_sessions
from   dba_datapump_jobs
order by 1, 2;
You can see an example of this in the datapump_jobs.sql script.
Get Data Pump Jobs
Making an HTTP GET call to the "/database/datapump/jobs/" endpoint returns a list of Data Pump jobs, or a count of zero if no jobs are present.
curl -ks -X GET \
  --user dbapi_user:DbApiUserPassword1 \
  https://localhost:8443/ords/dbapi_user/_/db-api/stable/database/datapump/jobs/ | python3 -mjson.tool

{
    "items": [
        {
            "owner_name": "DBAPI_USER",
            "job_name": "DATAPUMP_REST_EXPORT_20201001115524",
            "operation": "EXPORT ",
            "job_mode": "SCHEMA ",
            "state": "NOT RUNNING",
            "degree": 0,
            "attached_sessions": 0,
            "datapump_sessions": 0,
            "links": [
                {
                    "rel": "self",
                    "href": "https://localhost:8443/ords/dbapi_user/_/db-api/stable/database/datapump/jobs/DBAPI_USER,DATAPUMP_REST_EXPORT_20201001115524/"
                }
            ]
        }
    ],
    "hasMore": false,
    "limit": 25,
    "offset": 0,
    "count": 1,
    "links": [
        {
            "rel": "self",
            "href": "https://localhost:8443/ords/dbapi_user/_/db-api/stable/database/datapump/jobs/"
        },
        {
            "rel": "edit",
            "href": "https://localhost:8443/ords/dbapi_user/_/db-api/stable/database/datapump/jobs/"
        },
        {
            "rel": "describedby",
            "href": "https://localhost:8443/ords/dbapi_user/_/db-api/stable/metadata-catalog/"
        },
        {
            "rel": "first",
            "href": "https://localhost:8443/ords/dbapi_user/_/db-api/stable/database/datapump/jobs/"
        }
    ]
}
For each job in the "items" array there is a "self" link, allowing us to drill down and get more information about the job. In the following example we make a GET call to one of these links.
curl -ks -X GET \
  --user dbapi_user:DbApiUserPassword1 \
  https://localhost:8443/ords/dbapi_user/_/db-api/stable/database/datapump/jobs/DBAPI_USER,DATAPUMP_REST_EXPORT_20201001115524/ | python3 -mjson.tool

{
    "job_name": "DATAPUMP_REST_EXPORT_20201001115524",
    "owner_name": "DBAPI_USER",
    "operation": "EXPORT",
    "job_mode": "SCHEMA",
    "state": "NOT RUNNING",
    "degree": 0,
    "attached_sessions": 0,
    "datapump_sessions": 0,
    "job_state": "COMPLETED",
    "job_comment": "Job \"DBAPI_USER\".\"DATAPUMP_REST_EXPORT_20201001115524\" successfully completed at Thu Oct 1 11:56:07 2020 elapsed 0 00:00:41",
    "links": [
        {
            "rel": "collection",
            "href": "https://localhost:8443/ords/dbapi_user/_/db-api/stable/database/datapump/jobs/"
        },
        {
            "rel": "describedby",
            "href": "https://localhost:8443/ords/dbapi_user/_/db-api/stable/metadata-catalog/"
        },
        {
            "rel": "enclosure",
            "href": "https://localhost:8443/ords/dbapi_user/_/db-api/stable/database/datapump/jobs/DBAPI_USER,DATAPUMP_REST_EXPORT_20201001115524/EXPDAT01-20201001_11_55_26.DMP"
        },
        {
            "rel": "related",
            "href": "https://localhost:8443/ords/dbapi_user/_/db-api/stable/database/datapump/jobs/DBAPI_USER,DATAPUMP_REST_EXPORT_20201001115524/EXPDAT-2020-10-01-11_55_24.LOG"
        },
        {
            "rel": "self",
            "href": "https://localhost:8443/ords/dbapi_user/_/db-api/stable/database/datapump/jobs/DBAPI_USER,DATAPUMP_REST_EXPORT_20201001115524/"
        }
    ]
}
The "links" array includes an "enclosure" link to return the resulting dump file, and a "related" link to return the log file.
Table Export
The following example shows how to perform a table export of the EMP table in the TESTUSER1 schema.
We create a payload file containing the parameters for the export job.
cat > /tmp/payload.json <<EOF
{
  "operation": "EXPORT",
  "job_mode": "TABLE",
  "datapump_dir": "TEST_DIR",
  "name_expressions": [
    {"expression": "='EMP'"},
    {"expression": "!='DEPT'"}
  ],
  "schema_expressions": [
    {"expression": "= 'TESTUSER1'"}
  ]
}
EOF
Notice we don't get to control the name of the dump file or the log file. We have to use the output links to identify them. We can use the "name_expressions", "schema_expressions" and "tablespace_expressions" options to define multiple expressions (IN, NOT IN, =, !=, LIKE, NOT LIKE etc.) if required. In the example above the "name_expressions" option includes the EMP table and excludes the DEPT table.
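As a hypothetical variation, an expression such as LIKE can pick up several tables at once. The payload below is a sketch, and assumes the schema contains tables whose names start with "EMP".

cat > /tmp/payload.json <<EOF
{
  "operation": "EXPORT",
  "job_mode": "TABLE",
  "datapump_dir": "TEST_DIR",
  "name_expressions": [
    {"expression": "LIKE 'EMP%'"}
  ],
  "schema_expressions": [
    {"expression": "= 'TESTUSER1'"}
  ]
}
EOF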
We could replace the "name_expressions" option with a comma-separated list of the tables to include using the "filter" option, which would give us the same result.
cat > /tmp/payload.json <<EOF
{
  "operation": "EXPORT",
  "job_mode": "TABLE",
  "datapump_dir": "TEST_DIR",
  "filter": "EMP",
  "schema_expressions": [
    {"expression": "= 'TESTUSER1'"}
  ]
}
EOF
We make a POST call to the "/database/datapump/jobs/" endpoint, passing the raw payload and setting the "Content-Type" header to "application/json".
curl -ks -X POST \
  --user dbapi_user:DbApiUserPassword1 \
  --data-binary @/tmp/payload.json \
  --header "Content-Type:application/json" \
  https://localhost:8443/ords/dbapi_user/_/db-api/stable/database/datapump/jobs/ | python3 -mjson.tool

{
    "job_name": "DATAPUMP_REST_EXPORT_20201001122631",
    "owner_name": "DBAPI_USER",
    "operation": "EXPORT",
    "job_mode": "TABLE",
    "state": "EXECUTING",
    "degree": 1,
    "attached_sessions": 0,
    "datapump_sessions": 2,
    "job_state": "EXECUTING",
    "links": [
        {
            "rel": "collection",
            "href": "https://localhost:8443/ords/dbapi_user/_/db-api/stable/database/datapump/jobs/"
        },
        {
            "rel": "describedby",
            "href": "https://localhost:8443/ords/dbapi_user/_/db-api/stable/metadata-catalog/"
        },
        {
            "rel": "related",
            "href": "https://localhost:8443/ords/dbapi_user/_/db-api/stable/database/datapump/jobs/DBAPI_USER,DATAPUMP_REST_EXPORT_20201001122631/EXPDAT-2020-10-01-12_26_31.LOG"
        },
        {
            "rel": "self",
            "href": "https://localhost:8443/ords/dbapi_user/_/db-api/stable/database/datapump/jobs/DBAPI_USER,DATAPUMP_REST_EXPORT_20201001122631/"
        }
    ]
}
The "links" array includes a "related" link to the log file, and a "self" link to the job itself, where we can get the URL for the dump file, similar to what we saw in the GET operation above.
Table Import
The following example shows how to perform a table import. We import the dump file produced by the export of the EMP table. Before doing this we need to drop the existing EMP table.
drop table testuser1.emp purge;
We create a payload file containing the parameters for the import job.
cat > /tmp/payload.json <<EOF
{
  "operation": "IMPORT",
  "job_mode": "TABLE",
  "datapump_dir": "TEST_DIR",
  "file_name": "EXPDAT01-20201001_11_55_26.DMP",
  "name_expressions": [
    {"expression": "='EMP'"}
  ],
  "schema_expressions": [
    {"expression": "= 'TESTUSER1'"}
  ]
}
EOF
We could replace the "name_expressions" option with a comma-separated list of the tables to include using the "filter" option, which would give us the same result.
cat > /tmp/payload.json <<EOF
{
  "operation": "IMPORT",
  "job_mode": "TABLE",
  "datapump_dir": "TEST_DIR",
  "file_name": "EXPDAT01-20201001_11_55_26.DMP",
  "filter": "EMP",
  "schema_expressions": [
    {"expression": "= 'TESTUSER1'"}
  ]
}
EOF
We make a POST call to the "/database/datapump/jobs/" endpoint, passing the raw payload and setting the "Content-Type" header to "application/json".
curl -ks -X POST \
  --user dbapi_user:DbApiUserPassword1 \
  --data-binary @/tmp/payload.json \
  --header "Content-Type:application/json" \
  https://localhost:8443/ords/dbapi_user/_/db-api/stable/database/datapump/jobs/ | python3 -mjson.tool

{
    "job_name": "DATAPUMP_REST_IMPORT_20201001141930",
    "owner_name": "DBAPI_USER",
    "operation": "IMPORT",
    "job_mode": "TABLE",
    "state": "EXECUTING",
    "degree": 1,
    "attached_sessions": 0,
    "datapump_sessions": 2,
    "job_state": "EXECUTING",
    "links": [
        {
            "rel": "collection",
            "href": "https://localhost:8443/ords/dbapi_user/_/db-api/stable/database/datapump/jobs/"
        },
        {
            "rel": "describedby",
            "href": "https://localhost:8443/ords/dbapi_user/_/db-api/stable/metadata-catalog/"
        },
        {
            "rel": "related",
            "href": "https://localhost:8443/ords/dbapi_user/_/db-api/stable/database/datapump/jobs/DBAPI_USER,DATAPUMP_REST_IMPORT_20201001141930/EXPDAT-2020-10-01-11_55_24.LOG"
        },
        {
            "rel": "related",
            "href": "https://localhost:8443/ords/dbapi_user/_/db-api/stable/database/datapump/jobs/DBAPI_USER,DATAPUMP_REST_IMPORT_20201001141930/IMPDAT-2020-10-01-14_19_30.LOG"
        },
        {
            "rel": "self",
            "href": "https://localhost:8443/ords/dbapi_user/_/db-api/stable/database/datapump/jobs/DBAPI_USER,DATAPUMP_REST_IMPORT_20201001141930/"
        }
    ]
}
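Once the job completes, a quick check, which is my addition rather than part of the API workflow, confirms the table is back.

conn testuser1/testuser1@pdb1

-- The original table held 14 rows, so we expect 14 here.
select count(*) from emp;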
Schema Export
The following example shows how to perform a schema export of the TESTUSER1 schema.
We create a payload file containing the parameters for the export job.
cat > /tmp/payload.json <<EOF
{
  "operation": "EXPORT",
  "job_mode": "SCHEMA",
  "datapump_dir": "TEST_DIR",
  "schema_expressions": [
    {"expression": "= 'TESTUSER1'"}
  ]
}
EOF
If we didn't need a complex expression to identify the schemas to export, we could replace the "schema_expressions" option with a comma-separated list of the schemas to include using the "filter" option, which would give us the same result.
cat > /tmp/payload.json <<EOF
{
  "operation": "EXPORT",
  "job_mode": "SCHEMA",
  "datapump_dir": "TEST_DIR",
  "filter": "TESTUSER1"
}
EOF
We make a POST call to the "/database/datapump/jobs/" endpoint, passing the raw payload and setting the "Content-Type" header to "application/json".
curl -ks -X POST \
  --user dbapi_user:DbApiUserPassword1 \
  --data-binary @/tmp/payload.json \
  --header "Content-Type:application/json" \
  https://localhost:8443/ords/dbapi_user/_/db-api/stable/database/datapump/jobs/ | python3 -mjson.tool

{
    "job_name": "DATAPUMP_REST_EXPORT_20201001142402",
    "owner_name": "DBAPI_USER",
    "operation": "EXPORT",
    "job_mode": "SCHEMA",
    "state": "EXECUTING",
    "degree": 1,
    "attached_sessions": 0,
    "datapump_sessions": 2,
    "job_state": "EXECUTING",
    "links": [
        {
            "rel": "collection",
            "href": "https://localhost:8443/ords/dbapi_user/_/db-api/stable/database/datapump/jobs/"
        },
        {
            "rel": "describedby",
            "href": "https://localhost:8443/ords/dbapi_user/_/db-api/stable/metadata-catalog/"
        },
        {
            "rel": "related",
            "href": "https://localhost:8443/ords/dbapi_user/_/db-api/stable/database/datapump/jobs/DBAPI_USER,DATAPUMP_REST_EXPORT_20201001142402/EXPDAT-2020-10-01-14_24_02.LOG"
        },
        {
            "rel": "self",
            "href": "https://localhost:8443/ords/dbapi_user/_/db-api/stable/database/datapump/jobs/DBAPI_USER,DATAPUMP_REST_EXPORT_20201001142402/"
        }
    ]
}
Schema Import
The following example shows how to perform a schema import. We import the dump file produced by the export of the TESTUSER1 schema, but remap it to TESTUSER2.
We create a payload file containing the parameters for the import job.
cat > /tmp/payload.json <<EOF
{
  "operation": "IMPORT",
  "job_mode": "SCHEMA",
  "datapump_dir": "TEST_DIR",
  "file_name": "EXPDAT01-20201001_14_24_04.DMP",
  "schema_expressions": [
    {"expression": "= 'TESTUSER1'"}
  ],
  "remap_schemas": [
    {"source": "TESTUSER1", "target": "TESTUSER2"}
  ]
}
EOF
Alternatively we could use the "filter" option to identify the schemas to import.
cat > /tmp/payload.json <<EOF
{
  "operation": "IMPORT",
  "job_mode": "SCHEMA",
  "datapump_dir": "TEST_DIR",
  "file_name": "EXPDAT01-20201001_14_24_04.DMP",
  "filter": "TESTUSER1",
  "remap_schemas": [
    {"source": "TESTUSER1", "target": "TESTUSER2"}
  ]
}
EOF
We make a POST call to the "/database/datapump/jobs/" endpoint, passing the raw payload and setting the "Content-Type" header to "application/json".
curl -ks -X POST \
  --user dbapi_user:DbApiUserPassword1 \
  --data-binary @/tmp/payload.json \
  --header "Content-Type:application/json" \
  https://localhost:8443/ords/dbapi_user/_/db-api/stable/database/datapump/jobs/ | python3 -mjson.tool

{
    "job_name": "DATAPUMP_REST_IMPORT_20201001143655",
    "owner_name": "DBAPI_USER",
    "operation": "IMPORT",
    "job_mode": "SCHEMA",
    "state": "EXECUTING",
    "degree": 1,
    "attached_sessions": 0,
    "datapump_sessions": 2,
    "job_state": "EXECUTING",
    "links": [
        {
            "rel": "collection",
            "href": "https://localhost:8443/ords/dbapi_user/_/db-api/stable/database/datapump/jobs/"
        },
        {
            "rel": "describedby",
            "href": "https://localhost:8443/ords/dbapi_user/_/db-api/stable/metadata-catalog/"
        },
        {
            "rel": "related",
            "href": "https://localhost:8443/ords/dbapi_user/_/db-api/stable/database/datapump/jobs/DBAPI_USER,DATAPUMP_REST_IMPORT_20201001143655/EXPDAT-2020-10-01-14_24_02.LOG"
        },
        {
            "rel": "related",
            "href": "https://localhost:8443/ords/dbapi_user/_/db-api/stable/database/datapump/jobs/DBAPI_USER,DATAPUMP_REST_IMPORT_20201001143655/IMPDAT-2020-10-01-14_36_55.LOG"
        },
        {
            "rel": "self",
            "href": "https://localhost:8443/ords/dbapi_user/_/db-api/stable/database/datapump/jobs/DBAPI_USER,DATAPUMP_REST_IMPORT_20201001143655/"
        }
    ]
}
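Once the job completes we can confirm the remap worked. This check is my addition. We connect as a privileged user, since TESTUSER2 was created by the import and we don't know its password.

conn / as sysdba
alter session set container=pdb1;

-- Expect to see the EMP table owned by the remapped schema.
select owner, table_name
from   dba_tables
where  owner = 'TESTUSER2';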
Database Export
The following example shows how to perform a full export of the PDB1 database.
We create a payload file containing the parameters for the export job.
cat > /tmp/payload.json <<EOF
{
  "operation": "EXPORT",
  "job_mode": "FULL",
  "datapump_dir": "TEST_DIR"
}
EOF
The full export will run in parallel by default. We can limit the parallelism using the "threads" option.
cat > /tmp/payload.json <<EOF
{
  "operation": "EXPORT",
  "job_mode": "FULL",
  "datapump_dir": "TEST_DIR",
  "threads": 1
}
EOF
We make a POST call to the "/database/datapump/jobs/" endpoint, passing the raw payload and setting the "Content-Type" header to "application/json".
curl -ks -X POST \
  --user dbapi_user:DbApiUserPassword1 \
  --data-binary @/tmp/payload.json \
  --header "Content-Type:application/json" \
  https://localhost:8443/ords/dbapi_user/_/db-api/stable/database/datapump/jobs/ | python3 -mjson.tool

{
    "job_name": "DATAPUMP_REST_EXPORT_20201001145804",
    "owner_name": "DBAPI_USER",
    "operation": "EXPORT",
    "job_mode": "FULL",
    "state": "EXECUTING",
    "degree": 1,
    "attached_sessions": 0,
    "datapump_sessions": 2,
    "job_state": "EXECUTING",
    "links": [
        {
            "rel": "collection",
            "href": "https://localhost:8443/ords/dbapi_user/_/db-api/stable/database/datapump/jobs/"
        },
        {
            "rel": "describedby",
            "href": "https://localhost:8443/ords/dbapi_user/_/db-api/stable/metadata-catalog/"
        },
        {
            "rel": "related",
            "href": "https://localhost:8443/ords/dbapi_user/_/db-api/stable/database/datapump/jobs/DBAPI_USER,DATAPUMP_REST_EXPORT_20201001145804/EXPDAT-2020-10-01-14_58_04.LOG"
        },
        {
            "rel": "self",
            "href": "https://localhost:8443/ords/dbapi_user/_/db-api/stable/database/datapump/jobs/DBAPI_USER,DATAPUMP_REST_EXPORT_20201001145804/"
        }
    ]
}
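Since the export runs in parallel, we expect more than one dump file. A quick look at the directory on the database server, which is my addition, shows what was produced. Exact file names will vary.

# List the dump and log files produced so far.
ls -l /tmp/dp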
Database Import
The following example shows how to perform a full import. We import the dump files produced by the export of the PDB1 database.
We create a payload file containing the parameters for the import job. The database export produced multiple dump files, so we include the "%U" wildcard in the "file_name" option pattern. We also remap one of the schemas.
cat > /tmp/payload.json <<EOF
{
  "operation": "IMPORT",
  "job_mode": "FULL",
  "datapump_dir": "TEST_DIR",
  "file_name": "EXPDAT%U-20201001_14_58_04.DMP",
  "remap_schemas": [
    {"source": "TESTUSER1", "target": "TESTUSER3"}
  ]
}
EOF
We make a POST call to the "/database/datapump/jobs/" endpoint, passing the raw payload and setting the "Content-Type" header to "application/json".
curl -ks -X POST \
  --user dbapi_user:DbApiUserPassword1 \
  --data-binary @/tmp/payload.json \
  --header "Content-Type:application/json" \
  https://localhost:8443/ords/dbapi_user/_/db-api/stable/database/datapump/jobs/ | python3 -mjson.tool

{
    "job_name": "DATAPUMP_REST_IMPORT_20201001152105",
    "owner_name": "DBAPI_USER",
    "operation": "IMPORT",
    "job_mode": "FULL",
    "state": "EXECUTING",
    "degree": 1,
    "attached_sessions": 0,
    "datapump_sessions": 2,
    "job_state": "EXECUTING",
    "links": [
        {
            "rel": "collection",
            "href": "https://localhost:8443/ords/dbapi_user/_/db-api/stable/database/datapump/jobs/"
        },
        {
            "rel": "describedby",
            "href": "https://localhost:8443/ords/dbapi_user/_/db-api/stable/metadata-catalog/"
        },
        {
            "rel": "related",
            "href": "https://localhost:8443/ords/dbapi_user/_/db-api/stable/database/datapump/jobs/DBAPI_USER,DATAPUMP_REST_IMPORT_20201001152105/EXPDAT-2020-10-01-14_58_04.LOG"
        },
        {
            "rel": "related",
            "href": "https://localhost:8443/ords/dbapi_user/_/db-api/stable/database/datapump/jobs/DBAPI_USER,DATAPUMP_REST_IMPORT_20201001152105/IMPDAT-2020-10-01-15_21_05.LOG"
        },
        {
            "rel": "self",
            "href": "https://localhost:8443/ords/dbapi_user/_/db-api/stable/database/datapump/jobs/DBAPI_USER,DATAPUMP_REST_IMPORT_20201001152105/"
        }
    ]
}
Thoughts
Here are some thoughts on the database APIs for Data Pump.
- The APIs have very limited functionality. There are a lot of things missing compared to the command line utility and the DBMS_DATAPUMP package.
- I think the lack of functionality will probably annoy some DBAs. They may prefer to wait for a future version, or use the DBMS_DATAPUMP package to automate Data Pump operations.
- This functionality requires the ORDS enabled schema alias approach in version 20.2, but it should work with the default administrator approach in a future version.
- I don't see a way to clean up old jobs, or to clean up the contents of the file system that relate to those jobs. If this API is to allow us to be far removed from the server, some sort of clean-up seems to be necessary (see the sketch after this list).
- Only being able to use the ORDS enabled schema approach is problematic, as you need to give out DBA credentials to use the API. If a developer wants to use it, that means handing them DBA credentials, which is not allowed in many companies.
- On a more general level, I feel the ORDS Database APIs need a more fine-grained approach to which APIs can be accessed by a user. They should probably reflect the underlying privileges of the database user, rather than using ORDS-specific settings.
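On the clean-up point, the workaround I would expect to use today is server-side, not via the API. The following is a sketch, and the job name is just the one from the earlier example output. For a job in the NOT RUNNING state, dropping the master table, which shares its name with the job, removes the job definition.

conn / as sysdba
alter session set container=pdb1;

-- Identify jobs that are defined but no longer running.
select owner_name, job_name, state
from   dba_datapump_jobs;

-- Dropping the master table removes the job definition.
drop table dbapi_user.datapump_rest_export_20201001115524 purge;

-- The dump and log files still have to be deleted from the
-- directory on the database server manually.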
For more information see:
- General REST Endpoints
- Oracle REST Data Services (ORDS) : Database API - Setup
- Oracle REST Data Services (ORDS) : All Articles
- Data Pump (expdp, impdp) : All Articles
- Data Pump API (DBMS_DATAPUMP)
- Data Pump Quick Links : 10g, 11g, 12cR1, 12cR2, 18c, 19c, 21c, Transportable Tablespaces
Hope this helps. Regards Tim...