Data Pump (expdp, impdp) Interactive Command Mode
The expdp and impdp utilities are just wrappers over the underlying DBMS_DATAPUMP API. All data pump actions are performed by server-side jobs, controlled by a master control process which uses Advanced Queuing. At runtime an advanced queue table, named after the job, is created and used by the master control process, then dropped on completion of the data pump job. The job and the advanced queue can be named using the JOB_NAME parameter. Cancelling the client process does not stop the associated data pump job. Issuing "CTRL+C" on the client during a job stops the client output and puts you into interactive command mode.
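As an illustration, a job can be given a predictable name at creation time, which makes it easier to identify and reattach to later. The connection details, schema and directory object below are assumptions for the example.

```shell
# Hypothetical schema export with an explicit job name via the JOB_NAME parameter.
# The job (and its queue table) will be called SCOTT_EXPORT_JOB rather than a
# system-generated name like SYS_EXPORT_SCHEMA_01.
expdp user/password@service schemas=SCOTT directory=DATA_PUMP_DIR \
      dumpfile=scott.dmp logfile=scott_exp.log job_name=SCOTT_EXPORT_JOB
```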
Related articles.
- Data Pump (expdp, impdp) : All Articles
- Data Pump Quick Links : 10g, 11g, 12cR1, 12cR2, 18c, 19c, 21c, Transportable Tablespaces
Interactive Command Mode (CTRL+C)
When running the expdp or impdp utilities the log output is also echoed to the screen, but we can stop this and switch to interactive mode by issuing CTRL+C.
Once in interactive mode we have access to a number of commands described here (expdp, impdp). There are some differences between the commands available for each utility, but some of the commonly used commands are listed below.
- CONTINUE_CLIENT : Switch back to the normal client, with the log output echoed to the screen again.
- EXIT_CLIENT : The client is closed, but the database job continues, so the operation completes as normal.
- KILL_JOB : Detaches all clients and kills the database job.
- STOP_JOB : By default, the current actions are completed, then the job is stopped. It can be resumed later. If you use the STOP_JOB=IMMEDIATE option, all actions are stopped immediately. When the job is resumed, some of those actions will need to be rerun to make the job consistent.
- START_JOB : Restarts a stopped job.
- STATUS : Displays basic information about the job, including the status of the workers.
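To give a feel for how this looks in practice, here is an abridged, illustrative expdp session. The exact prompt text can vary by version, so treat this as a sketch rather than verbatim output.

```
$ expdp user/password@service schemas=SCOTT directory=DATA_PUMP_DIR dumpfile=scott.dmp
...
^C
Export> status
...
Export> stop_job
Are you sure you wish to stop this job ([yes]/no): yes
```

After the STOP_JOB the client exits, but the job remains in a stopped state and can be resumed later by attaching a new client and issuing START_JOB.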
Attach to Existing Job
You can start a new Data Pump client and attach it to an existing Data Pump job.
If you have used the JOB_NAME parameter when creating the export or import job you will already know what the job name is. If not, you can identify the running job by querying the DBA_DATAPUMP_JOBS view. The following query is available as the datapump_jobs.sql script.
SET LINESIZE 150

COLUMN owner_name FORMAT A20
COLUMN job_name FORMAT A30
COLUMN operation FORMAT A10
COLUMN job_mode FORMAT A10
COLUMN state FORMAT A12

SELECT owner_name,
       job_name,
       TRIM(operation) AS operation,
       TRIM(job_mode) AS job_mode,
       state,
       degree,
       attached_sessions,
       datapump_sessions
FROM   dba_datapump_jobs
ORDER BY 1, 2;

OWNER_NAME           JOB_NAME                       OPERATION  JOB_MODE   STATE            DEGREE ATTACHED_SESSIONS DATAPUMP_SESSIONS
-------------------- ------------------------------ ---------- ---------- ------------ ---------- ----------------- -----------------
DUMMY_DBA            SYS_EXPORT_SCHEMA_01           EXPORT     SCHEMA     EXECUTING             4                 1                 6

1 row selected.

SQL>
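Since the client utilities are just wrappers over the DBMS_DATAPUMP API, a job can also be attached to and controlled directly from PL/SQL. The sketch below, with an assumed job name, stops a running job gracefully while keeping the master table so it can be restarted later. The ATTACH and STOP_JOB calls are part of the documented DBMS_DATAPUMP package.

```sql
-- Sketch only: attach to a running data pump job owned by the current user
-- and stop it gracefully. The job name is an assumption from the query above.
DECLARE
  l_handle NUMBER;
BEGIN
  l_handle := DBMS_DATAPUMP.attach(job_name => 'SYS_EXPORT_SCHEMA_01');

  -- immediate => 0 lets current actions complete; keep_master => 1 retains
  -- the master table so the job can be resumed with START_JOB later.
  DBMS_DATAPUMP.stop_job(handle      => l_handle,
                         immediate   => 0,
                         keep_master => 1);
END;
/
```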
Once you know the job name, you can attach the client to the job using the ATTACH={JOB_NAME} parameter as follows.

expdp user/password@service attach=SYS_EXPORT_SCHEMA_01

impdp user/password@service attach=SYS_IMPORT_SCHEMA_01
Once attached, you have access to all the commands, including CONTINUE_CLIENT to echo the log to the screen.
For more information see:
- Commands Available in Data Pump Export Interactive-Command Mode
- Commands Available in Import's Interactive-Command Mode
- Data Pump (expdp, impdp) : All Articles
- Data Pump Quick Links : 10g, 11g, 12cR1, 12cR2, 18c, 19c, 21c, Transportable Tablespaces
Hope this helps. Regards Tim...