SQLcl : Unload and Load Table Data
Use SQLcl to unload table data to a formatted file, and load data from a formatted file into a table.
Setup
The examples in this article require the following tables.
--drop table emp purge;

create table emp (
  empno    number(4) constraint pk_emp primary key,
  ename    varchar2(10),
  job      varchar2(9),
  mgr      number(4),
  hiredate date,
  sal      number(7,2),
  comm     number(7,2),
  deptno   number(2)
);

insert into emp values (7369,'SMITH','CLERK',7902,to_date('17-12-1980','dd-mm-yyyy'),800,NULL,20);
insert into emp values (7499,'ALLEN','SALESMAN',7698,to_date('20-2-1981','dd-mm-yyyy'),1600,300,30);
insert into emp values (7521,'WARD','SALESMAN',7698,to_date('22-2-1981','dd-mm-yyyy'),1250,500,30);
insert into emp values (7566,'JONES','MANAGER',7839,to_date('2-4-1981','dd-mm-yyyy'),2975,NULL,20);
insert into emp values (7654,'MARTIN','SALESMAN',7698,to_date('28-9-1981','dd-mm-yyyy'),1250,1400,30);
insert into emp values (7698,'BLAKE','MANAGER',7839,to_date('1-5-1981','dd-mm-yyyy'),2850,NULL,30);
insert into emp values (7782,'CLARK','MANAGER',7839,to_date('9-6-1981','dd-mm-yyyy'),2450,NULL,10);
insert into emp values (7788,'SCOTT','ANALYST',7566,to_date('13-JUL-87','dd-mm-rr')-85,3000,NULL,20);
insert into emp values (7839,'KING','PRESIDENT',NULL,to_date('17-11-1981','dd-mm-yyyy'),5000,NULL,10);
insert into emp values (7844,'TURNER','SALESMAN',7698,to_date('8-9-1981','dd-mm-yyyy'),1500,0,30);
insert into emp values (7876,'ADAMS','CLERK',7788,to_date('13-JUL-87', 'dd-mm-rr')-51,1100,NULL,20);
insert into emp values (7900,'JAMES','CLERK',7698,to_date('3-12-1981','dd-mm-yyyy'),950,NULL,30);
insert into emp values (7902,'FORD','ANALYST',7566,to_date('3-12-1981','dd-mm-yyyy'),3000,NULL,20);
insert into emp values (7934,'MILLER','CLERK',7782,to_date('23-1-1982','dd-mm-yyyy'),1300,NULL,10);
commit;

create table emp2 (
  empno    number(4) constraint pk_emp2 primary key,
  ename    varchar2(10),
  job      varchar2(9),
  mgr      number(4),
  hiredate date,
  sal      number(7,2),
  comm     number(7,2),
  deptno   number(2)
);
LOADFORMAT
The LOADFORMAT setting determines how the UNLOAD and LOAD commands will work. The simplest way to get information about the LOADFORMAT setting is to use the SQLcl help command, as shown below.
SQL> help set loadformat
SET LOADFORMAT [ default|csv|delimited|html|insert|json|json-formatted|loader|t2|xml] [options...]
  default        : Load format properties return to default values
  csv            : comma separated values
  delimited      : (CSV synonym) delimited format, comma separated values by default
  html           : UNLOAD only, Hypertext Markup Language
  insert         : UNLOAD only, SQL insert statements
  json           : UNLOAD only, Java Script Object Notation
  json-formatted : UNLOAD only, "pretty" formatted JSON
  loader         : UNLOAD only, Oracle SQLLoader format
  t2             : UNLOAD only, T2 Metrics
  xml            : UNLOAD only, Extensible Markup Language

where options represents the following clauses:
  COLUMN_NAMES|COLUMNNAMES|NAMES {ON|OFF} : Header row with column names
  DELIMITER {separator} : Delimiter separating fields in the record
  ENCLOSURES {enclosures|OFF} : Optional left and right enclosure. OFF indicates no enclosures
      If 1 character is specified, sets left and right enclosure to this value.
      If 2 or more characters are specified, sets left to the first character,
      the right to the second character and ignores the remaining characters.
      To set multiple character enclosures, use Set ENCLOSURE_LEFT and ENCLOSURE_RIGHT
  ENCODING {encoding|OFF|""} : Encoding of load file. OFF and "" reset to default encoding for environment
  LEFT|ENCLOSURE_LEFT|ENCLOSURELEFT {enclosure|OFF} : Set a 1 or more character left enclosure.
      If no ENCLOSURE_RIGHT is specified, it is used for both left and right. OFF indicates no enclosures
  RIGHT|ENCLOSURE_RIGHT|ENCLOSURERIGHT {enclosure|OFF} : Set a 1 or more character right enclosure.
      OFF indicates no right enclosure
  ROW_LIMIT|ROWLIMIT|LIMIT {number_of_rows|OFF|""} : Max number of rows to read, including header.
      OFF and "" set to not limit.
  SKIP|SKIP_ROWS|SKIPROWS {number_of_rows|OFF|""} : Number of rows to skip
  [[SKIP_AFTER_NAMES|SKIPAFTERNAMES|AFTER]|[SKIP_BEFORE_NAMES|SKIPBEFORENAMES|BEFORE]] :
      Skip the rows before or after the (header) Column Names row
  TERM|ROW_TERMINATOR {terminator|""|DEFAULT|CR|CRLF|LF} : Character(s) indicating end of row.
      If the file contains standard line end characters, the line_end does not need to be specified.
      "" or DEFAULT specifies the default (any standard terminator) for LOAD
      "" or DEFAULT specifies the environment default for UNLOAD
      CRLF specifies WINDOWS terminator, generally for UNLOAD
      LF specifies UNIX terminator, generally for UNLOAD
      CR specifies MAC terminator, generally for UNLOAD

Examples:
  set loadformat delimited
      7369,"SMITH","CLERK",7902,17-DEC-80,800,,20,5555555555554444
  set loadformat delimited enclosures <> line_end {eol}
      7369,<SMITH>,<CLERK>,7902,17-DEC-80,800,,20,5555555555554444{eol}
  set loadformat delimited
      7369,"SMITH","CLERK",7902,17-DEC-80,800,,20,5555555555554444
  set loadformat default (restore default settings)
      7369,"SMITH","CLERK",7902,17-DEC-80,800,,20,5555555555554444

SQL>
As you can see, there are many more options for unloading than for loading. For this article we will stick with the CSV format, since its compact nature makes it a popular choice for dumping data for transfer.
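To get a feel for the options, here is a sketch of how some of them might be combined to produce a pipe-delimited file with no enclosures. This is not output from a real session, just an illustration based on the option descriptions above.

```sql
-- Switch to delimited format, using a pipe separator, no enclosures,
-- and a header row containing the column names.
set loadformat delimited delimiter | enclosures off column_names on

-- Unloading a table would now produce rows along these lines:
--   EMPNO|ENAME|JOB|MGR|HIREDATE|SAL|COMM|DEPTNO
--   7369|SMITH|CLERK|7902|17-DEC-80|800||20

-- Return to the default settings when finished.
set loadformat default
```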
UNLOAD
The UNLOAD command dumps the contents of a specified table into a formatted file. The format is determined by the LOADFORMAT setting, and the destination for the file can be a directory local to the SQLcl executable, or a cloud storage bucket.
In the following example we set the default location for file handling using the CD command. Remember, this must be accessible to the client. It's not a database server location, unless you are running SQLcl on the database server. We set the LOADFORMAT setting to CSV and we unload the EMP table.
cd c:\temp

set loadformat csv

unload emp
format csv
column_names on
delimiter ,
enclosures ""
encoding UTF8
row_terminator default
** UNLOAD Start ** at 2021.11.24-09.14.08
Export Separate Files to C:\temp
DATA TABLE EMP
File Name: C:\temp\EMP_DATA_TABLE_1.csv
Number of Rows Exported: 14
** UNLOAD End ** at 2021.11.24-09.14.08
SQL>
We can set the directory as part of the UNLOAD command.
unload emp dir c:\temp
The UNLOAD command produces a file with the following name format.
Format:  {TABLE-NAME}_DATA_TABLE.{extension}
Example: EMP_DATA_TABLE.csv
If a file of that name is already present, a new file is created with a number tacked on to the end of the file name. For three runs of the above unload command we would expect the following files to be created.
EMP_DATA_TABLE.csv
EMP_DATA_TABLE_1.csv
EMP_DATA_TABLE_2.csv
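As mentioned earlier, the destination can also be an OCI Object Storage bucket. Based on the cloud storage examples in the help text, a session might look something like the following sketch. The namespace, bucket and profile names are placeholders.

```sql
-- Point SQLcl at a cloud storage bucket and an OCI profile (placeholder values).
cs https://objectstorage.us-ashburn-1.oraclecloud.com/n/my_namespace/b/my_bucket
oci profile DEFAULT

-- Unload the EMP table to the default cloud storage location.
unload emp cs
```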
The full instructions for the UNLOAD command can be displayed using the SQLcl HELP command, as shown below.
SQL> help unload
UNLOAD
------
Unload table data into a locally accessible directory or cloud storage location.

UNLOAD [TABLE] [schema.]table_name [ <directory-specification> | <cloud-storage-specification> ]

where :
  [ TABLE ] [schema.]table_name
      Identifies table to unload. If schema is omitted, unloads the table for the connected user schema
  <directory-specification> : {DIRECTORY|DIR} <directory-name>
      <directory-name> : identifies the target directory.
  <cloud-storage-specification> : { CLOUDSTORAGE | CS } [ <uri> | <qualified-name> ]
      <uri> : complete uri for the cloud storage bucket if a default cloudstorage is not set
      <qualified-name> : name of the bucket, optionally qualified by the namespace. The qualified name
          concatenated to the uri specified by cloudstorage command must fully identify the bucket uri.
      If <uri> and <qualified-name> are omitted, the cloud storage default must be set to the bucket.
  If both <directory-specification> and <cloud-storage-specification> are omitted,
  unloads the data into the default directory.

By default, unload will create CSV files as:
  comma delimiter
  double-quote enclosures
  standard line terminator for windows, unix or mac
  encoding UTF8
  column names in the first row

Use SET LOADFORMAT options for specifying format options for output files (file format, delimiter, enclosures, etc).
Use SET TIMING ON to show elapsed time for each unload and total elapsed time for the unload.
Use CD command to set default directory for file access.
Use CLOUDSTORAGE (CS) command to set default cloud storage uri.

Examples:
--------
EXAMPLES UNLOAD TABLE TO LOCALLY ACCESSIBLE DIRECTORY
  UNLOAD TABLE employees DIR c:\TABLES
      Unload employees table from schema for current user into C:\TABLES directory as csv file
  UNLOAD TABLE hr.employees
      Unload employees table from hr schema as csv file into default directory
  CD C:\TABLES
      Sets the default directory to C:\TABLES
  SET LOADFORMAT DELIMITED ENCLOSURES <> LINE_END {eol}
  UNLOAD TABLE hr.employees
      7369,<SMITH>,<CLERK>,7902,17-DEC-80,800,,20,5555555555554444{eol}
  SET LOADFORMAT DEFAULT (restore default settings)
  UNLOAD TABLE hr.employees
      7369,"SMITH","CLERK",7902,17-DEC-80,800,,20,5555555555554444

EXAMPLE UNLOAD TABLE TO CLOUD STORAGE WITH BUCKET SPECIFIED ON SET CLOUDSTORAGE
  set storage uri to bucket
  CS https://objectstorage.us-ashburn-1.oraclecloud.com/n/namespace/b/bucket
  oci profile DEFAULT
  unload employees CS

EXAMPLE UNLOAD TABLE TO CLOUD STORAGE WITH NAMESPACE SPECIFIED ON CLOUD STORAGE
  set storage uri to namespace
  CS https://objectstorage.us-ashburn-1.oraclecloud.com/n/namespace
  unload employees CS /b/bucket/o/employees.csv

SQL>
LOAD
The LOAD command loads formatted data into a destination table.
In the following example we set the default location for file handling using the CD command. Remember, this must be accessible to the client. It's not a database server location, unless you are running SQLcl on the database server. We set the LOADFORMAT setting to CSV and we load the file we created previously into the EMP2 table.
cd c:\temp

set loadformat csv

load table testuser1.emp2 EMP_DATA_TABLE.csv
Load data into table TESTUSER1.EMP2
csv
column_names on
delimiter ,
enclosures ""
encoding UTF8
row_limit off
row_terminator default
skip_rows 0
skip_after_names
#INFO Number of rows processed: 14
#INFO Number of rows in error: 0
#INFO Last row processed in final committed batch: 14
SUCCESS: Processed without errors
SQL>
We could specify the full path as part of the LOAD command.
load table testuser1.emp2 c:\temp\EMP_DATA_TABLE.csv
We can load the data into a new table using the NEW keyword. The table structure is determined by looking at the data. In this example we load the data into a new table called EMP3.
load table testuser1.emp3 c:\temp\EMP_DATA_TABLE.csv new
Create new table and load data into table TESTUSER1.EMP3
csv
column_names on
delimiter ,
enclosures ""
encoding UTF8
row_limit off
row_terminator default
skip_rows 0
skip_after_names
#INFO DATE format detected: DD-MON-RR
CREATE TABLE TESTUSER1.EMP3
 (
  EMPNO NUMBER(6),
  ENAME VARCHAR2(26),
  JOB VARCHAR2(26),
  MGR NUMBER(6),
  HIREDATE DATE,
  SAL NUMBER(6),
  COMM NUMBER(6),
  DEPTNO NUMBER(4)
 )
;
#INFO Table created
#INFO Number of rows processed: 14
#INFO Number of rows in error: 0
#INFO Last row processed in final committed batch: 14
SUCCESS: Processed without errors
SQL>
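If we are not sure the generated structure will be what we want, the LOAD command also accepts SHOW and CREATE keywords, which run the DDL generation phase without loading any data. A sketch of how this might be used:

```sql
-- Preview the generated DDL without creating the table or loading the data.
load table testuser1.emp3 c:\temp\EMP_DATA_TABLE.csv show

-- Create the table from the scanned file, but don't load the data.
load table testuser1.emp3 c:\temp\EMP_DATA_TABLE.csv create
```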
We can influence the load by setting LOAD parameters. In this example we alter the column names in the "EMP_DATA_TABLE.csv" file, swapping EMPNO to EMPLOYEE_NO and ENAME to EMPLOYEE_NAME. We truncate the EMP table and set some LOAD parameters. We alter how the rows are batched and committed, set the expected date format, and use MAPNAMES to associate the column names in the data file with the column names in the table when they don't match. The column names in the file are case sensitive. Once this is done we can load the data.
truncate table emp;

set load batch_rows 100 batches_per_commit 100 date dd-mon-rr mapnames (EMPLOYEE_NO=empno, EMPLOYEE_NAME=ename)

load table testuser1.emp c:\temp\EMP_DATA_TABLE.csv
Load data into table TESTUSER1.EMP
csv
column_names on
delimiter ,
enclosures ""
encoding UTF8
row_limit off
row_terminator default
skip_rows 0
skip_after_names
batch_rows 100
batches_per_commit 100
clean_names transform
column_size rounded
commit on
date_format dd-mon-rr
errors 50
map_column_names (EMPLOYEE_NAME=ename, EMPLOYEE_NO=empno)
method insert
timestamp_format
timestamptz_format
locale English United Kingdom
scan_rows 100
truncate off
unknown_columns_fail on
#INFO Number of rows processed: 14
#INFO Number of rows in error: 0
#INFO Last row processed in final committed batch: 14
SUCCESS: Processed without errors
SQL>
The full instructions for the LOAD command can be displayed using the SQLcl HELP command, as shown below.
SQL> help load
LOAD
-----
Loads a comma separated value (csv) file into a table.

LOAD [TABLE] [schema.]table_name { <file-specification> | <cloud-storage-specification> }
     [NEW | SHOW | DDL_SHOW | CREATE | DDL_CREATE]

where :
  [schema.]table_name
      Identifies table to load. If schema is omitted, loads the table for the connected user schema
  <file-specification> : { <fully-qualified-file-name> | <file-name> }
      <fully-qualified-file-name> : Identifies the full path to the file to load.
      <file-name> : Identifies the file to load. The file must be located in the default path.
  <cloud-storage-specification> : { CLOUDSTORAGE | CS } [ <uri> | <qualified-name> ]
      <uri> : complete uri for the cloud storage file if a default cloudstorage is not set
      <qualified-name> : name of the object, optionally qualified by the namespace and the bucket.
          The qualified name concatenated to the uri specified by cloudstorage command must fully identify the object uri.
      If <uri> and <qualified-name> are omitted, the cloud storage default must be set to the object.
      An oci profile or a dbms_cloud credential must be set to do a cloud storage load.
      See help for OCI, DBC and CS commands.
      If profile and credential are set, oci profile will be used unless a default transfer is set with CS command.
  NEW
      Create table and Load data.
      Use SET LOAD and SET LOADFORMAT to specify properties for the DDL analysis and generation.
  [SHOW | SHOW_DDL]
      Execute the ddl generation phase and show the ddl.
      Use SET LOAD and SET LOADFORMAT to specify properties for the DDL analysis and generation.
  [CREATE | CREATE_DDL]
      Execute the ddl generation phase and create the table.
      Use SET LOAD and SET LOADFORMAT to specify properties for the DDL analysis and generation.

Defaults for file format:
  The first row of the file is a header row. The columns in the header row must match the columns defined on the table.
  The columns are delimited by a comma and may optionally be enclosed in double quotes.
  Lines are terminated with standard line terminators for windows, unix or mac.
  File is encoded UTF8.

The default load:
  Processes with 50 rows per batch.
  If AUTOCOMMIT is set in SQLCL, a commit is done every 10 batches.
  The load is terminated if more than 50 errors are found.

Use SET LOADFORMAT options for reading the file (delimiter, enclosures, etc).
Use SET LOAD options for loading the data (rows per batch, date formats, etc).
Use SET TIMING ON to show elapsed time for the load.
Use CD command to set default directory for file access.
Use CLOUDSTORAGE (CS) command to set default cloud storage uri and transfer options.
Use OCI command to set profile.
Use DBC command to manage and set credentials.

Create Table ddl generation pre-scans the data file to determine column properties.
Use SET LOAD SCAN <n> to specify number of rows to scan for ddl. 100 is the default.
Use SET LOAD COL_SIZE to change column sizes that are generated.
Use SET LOAD MAP_NAMES to map file column names to table column names.

Examples:
--------
Example "LOAD TABLE" local file
  LOAD EMPLOYEES C:\TABLES\EMPLOYEES_DATA_TABLE.csv
      Loads the data from EMPLOYEES_DATA_TABLE into the EMPLOYEES table of the schema for current user
      7369,"SMITH","CLERK",7902,17-DEC-80,800,,20,5555555555554444

Example "LOAD TABLE" local file with customized load processing
  SET LOAD BATCH_ROWS 100 BATCHES_PER_COMMIT 100
  LOAD HR.EMPLOYEES C:\TABLES\EMPLOYEES_DATA_TABLE.csv
      Loads the data from EMPLOYEES_DATA_TABLE into the EMPLOYEES table of the HR schema using batch and commit options
      7369,<SMITH>,<CLERK>,7902,17-DEC-80,800,,20,5555555555554444{eol}

Example "LOAD TABLE" file with language sensitive dates
  SET LOAD LOCALE FRENCH FRANCE FORMAT_DATE dd-MON-yy
  LOAD EMPLOYEES C:\TABLES\EMPLOYEES_DATA_TABLE.csv
      Loads the data from EMPLOYEES_DATA_TABLE into the EMPLOYEES table transforming localized dates
      103,Alexander,Hunold,AHUNOLD,590.423.4567,3-janv.-90,IT_PROG,9000,,102,60

Example "LOAD TABLE" file with date format mask
  SET LOAD FORMAT_DATE mmddyy
  LOAD EMPLOYEES C:\TABLES\EMPLOYEES_DATA_TABLE.csv
      Loads the data from EMPLOYEES_DATA_TABLE into the EMPLOYEES table transforming dates
      7369,"SMITH","CLERK",7902,121780,800,,20,5555555555554444

Example "LOAD TABLE" from CLOUD STORAGE with file specified on CLOUDSTORAGE
  CS https://objectstorage.us-ashburn-1.oraclecloud.com/n/namespace/b/bucket/o/employees.csv
  OCI PROFILE DEFAULT
  LOAD EMPLOYEES CS

Example "LOAD TABLE" from CLOUD STORAGE with NAMESPACE specified on CLOUDSTORAGE
  CS https://objectstorage.us-ashburn-1.oraclecloud.com/n/namespace
  LOAD EMPLOYEES CS /b/bucket/o/employees.csv

Example "CREATE TABLE" and "LOAD TABLE"
  LOAD EMPLOYEES C:\TABLES\EMPLOYEES_DATA_TABLE.csv NEW
      Creates EMPLOYEES table based on scanning file and SET LOADFORMAT and SET LOAD properties.
      Loads the data from the file into the new EMPLOYEES table.

Example "SHOW DDL" for TABLE but do not create the table and load data
  LOAD EMPLOYEES C:\TABLES\EMPLOYEES_DATA_TABLE.csv SHOW
      Show ddl for EMPLOYEES table based on scanning file, the SET LOADFORMAT and the SET LOAD properties.

Example "CREATE TABLE"
  LOAD EMPLOYEES C:\TABLES\EMPLOYEES_DATA_TABLE.csv CREATE
      Create EMPLOYEES table based on scanning file, the SET LOADFORMAT and the SET LOAD properties.

SQL>
The full instructions for the SET LOAD options can be displayed using the SQLcl HELP command, as shown below.
SQL> help set load
SET LOAD default | [options...]
  default : Load method properties return to default values

where options represents the following clauses:
  BATCH_ROWS|BATCHROWS <number_of_rows> :
      Data loading is done in batches. Specifies the number of rows to include in each batch.
  BATCHES_PER_COMMIT|BATCHESPERCOMMIT <batches_per_commit> :
      Commit after processing <number_of_batches>.
      If equal to 0, will commit at end of load
      If greater than or equal to 0, will set COMMIT ON
  CLEAN_NAMES [ TRANSFORM | TRANSFORM128 | QUOTE | QUOTE128 | UNIQUE ]
      Identifies rule for making table and column names compliant with database identifiers.
      Names are cleaned before they are mapped to provide consistency with previous releases.
      If both CLEAN_NAMES and MAP_COLUMN_NAMES are used, the clean names should be specified.
      Standard Identifiers:
          No longer than 30/128 characters.
          Not a reserved word.
          Starts with a letter and contains only letters, digits, or one of _$#.
          Upper-case.
      Names that do not comply, must be quoted. Length rules always apply.
      Note data that is enclosed in quotes will have quotes in the header row removed before names are cleaned.
      CLEANING OPTIONS
      TRANSFORM (default)
          Indicates that names will be transformed as follows:
          Names will be upper-cased.
          If the name starts and ends with the quote character, they will be removed.
          Names that are reserved words will be appended with dollar sign ($).
          Names that start with a number or special character will be prefixed with an X.
          Spaces and hyphens will be replaced with underscores (_).
          $ and # characters will be retained.
          Special characters other than $ and # will be replaced with number sign (#).
          Names will be truncated to 30|128 characters depending on database MAX_STRING_SIZE.
          After names are cleaned, non-unique names within the column set will be appended with a unique sequence number.
          If truncation is required, sequence number will be maintained.
      TRANSFORM128
          Applies all transform rules. Names may be 128 characters.
      QUOTE
          Quote non-compliant name and shorten to 30|128 characters depending on database MAX_STRING_SIZE.
      QUOTE128
          Quote non-compliant names. Names may be 128 characters.
      UNIQUE
          Compatibility option with previous releases of load service.
          Names that are not unique within the column set are appended with a unique sequential number.
          Truncation is not provided.
  COLUMN_SIZE|COLUMNSIZE|COLSIZE {ACTUAL|ROUND|ROUNDED|MAX|MAXIMUM} :
      Create table column size strategy.
      ACTUAL uses the largest size found during the scan.
      ROUND|ROUNDED uses a size a little larger than the largest size found during the scan.
      MAX|MAXIMUM uses the database maximum size for the data type that was detected.
  COMMIT {ON|OFF} : Enable/Disable data commits
  DATE|DATE_FORMAT|DATEFORMAT format_mask :
      The format of all DATE data type columns being loaded.
      Specify no format_mask or DEFAULT to use database default.
      For DATE columns, if format is not set and SCAN_ROWS = 0, the data is scanned for a valid mask.
  ERRORS {number_of_rows|UNLIMITED}|-1 :
      Indicates the number of error rows allowed. If this number is exceeded, the load will be terminated.
      -1 and UNLIMITED indicate no error limit.
      Note that all rows in a batch may be in error if any row fails.
  LOCALE { <language country> | DEFAULT | "" } :
      Specify locale language and optionally country.
      DEFAULT|"" : Set to default locale.
  MAP_COLUMN_NAMES|MAPCOLUMNNAMES|MAPNAMES { OFF| (<file-col-name>=<table-col-name>,...) } :
      Provide a mapping from the column names specified in the file to column names in the table
  METHOD INSERT : Method to use for data loads.
  SCAN_ROWS|SCANROWS|SCAN <1-5000> :
      Identify the number of rows to scan for create table generation. Default is 100 rows.
  TIMESTAMP|TIMESTAMP_FORMAT|TIMESTAMPFORMAT :
      The format of all TIMESTAMP data type columns being loaded.
      Specify no format_mask or DEFAULT to use database default.
      For TIMESTAMP columns, if format is not set and SCAN_ROWS = 0, the data is scanned for a valid mask.
  TIMESTAMPTZ|TIMESTAMPTZ_FORMAT|TIMESTAMPTZFORMAT :
      The format of all TIMESTAMPTZ data type columns being loaded.
      Specify no format_mask or DEFAULT to use database default.
      For TIMESTAMPTZ columns, if format is not set and SCAN_ROWS = 0, the data is scanned for a valid mask.
  TRUNCATE {OFF|ON} : Truncate ON will truncate the table before loading
  UNKNOWN_COLUMNS_FAIL|UNKNOWNCOLUMNSFAIL|UNKNOWNFAIL {ON|OFF} :
      ON: Terminate the load if any columns in the file do not map to a column in the table.
      OFF: Allow the load to proceed when columns in the file do not map to a column in the table.

Examples:
  SET LOAD BATCH_ROWS 100 BATCHES_PER_COMMIT 100
      7369,<SMITH>,<CLERK>,7902,17-DEC-80,800,,20,5555555555554444{eol}
  SET LOAD LOCALE FRENCH FRANCE DATE dd-MON-yy
      103,Alexander,Hunold,AHUNOLD,590.423.4567,3-janv.-90,IT_PROG,9000,,102,60
  SET LOAD DATE mmddyy
      7369,"SMITH","CLERK",7902,121780,800,,20,5555555555554444
  SET LOAD MAPNAMES (EMPLOYEE_ID=EMP_ID, FIRST_NAME=FNAME)
      Map the EMPLOYEE_ID column in the file to EMP_ID column name in the table
      Map the FIRST_NAME column in the file to FNAME column name in the table
  SET LOAD COLSIZE MAX SCAN 500
      When generating create table ddl, generate column definitions using maximum column sizes.
      Scan and analyze 500 rows for generating create table ddl
SQL>
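One option worth calling out is TRUNCATE, which removes the need for a separate truncate statement when repeating a load. The following is a sketch using the tables from this article, based on the option descriptions above.

```sql
-- Truncate EMP2 automatically before each load.
set load truncate on

load table testuser1.emp2 c:\temp\EMP_DATA_TABLE.csv

-- Return the load method properties to their default values.
set load default
```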
Hope this helps. Regards Tim...