Export table - first 10 million rows

All posts relating to Oracle database administration.

Moderator: Tim...

Export table - first 10 million rows

Postby jnrpeardba » Thu Nov 14, 2013 5:44 pm

Hi Tim,

I have a schema estimated to be 220GB in size, with two huge tables comprising around 100 million rows, and it's not feasible to perform the migration all in one go or via the transportable tablespace (TTS) method. I'd therefore like to split the export by row ranges, exporting, for example, 10 million rows at a time, which will leave me with 10 large export dump files to ship across to the new server and import.

Can I do something like this:

Code: Select all
expdp schemas=schema include=TABLE:"IN('table')" QUERY=\"schema.table:"where rownum <=10000000\" dumpfile=01.dmp =blah blah...." (first 10 million rows)
expdp schemas=schema include=TABLE:"IN('table')" QUERY=\"schema.table:"where rownum >=10000001 but less than 20000000\" dumpfile=02.dmp blah blah...." (from 10 mill +1 to 20 mill)
expdp schemas=schema include=TABLE:"IN('table')" QUERY=\"schema.table:"where rownum <=20000000 but less than 30000000\" dumpfile=03.dmp blah blah...." (from 20 million +1 to 30 million)


And so on, until I have taken 10 exports covering up to 100 million rows.

I have two questions:

1) I do not believe I have the correct syntax in the 2nd and 3rd of my exports - can you provide the correct syntax?
2) What happens if there are only 93 million rows? During my final export, will it complete error-free, or will it fail because I specified up to 100,000,000?

Hope I have made myself clear - basically I want to split the schema/table export into smaller chunks so I can easily migrate to the new environment.

Many thanks for any input you have Tim,

regards

Jnrpeardba
jnrpeardba
Advisor
 
Posts: 392
Joined: Wed May 04, 2011 3:14 pm

Re: Export table - first 10 million rows

Postby Tim... » Thu Nov 14, 2013 6:09 pm

Hi.

1) Your syntax is wrong.
2) I think you misunderstand what ROWNUM is and how it works. A test like that will never work: ROWNUM is assigned to each row as it is retrieved, so the first row returned always gets ROWNUM 1, and if the predicate rejects it, no row can ever qualify for a higher ROWNUM. The example shows why your logic is flawed, even if your syntax were correct.

Code: Select all
CREATE TABLE t1 (
  id NUMBER
);

INSERT INTO t1 VALUES (1);
INSERT INTO t1 VALUES (2);

SELECT * FROM t1 WHERE ROWNUM > 1;

no rows selected

SQL>


3) What makes you think a 220G database can't be transferred in one shot? I do transfers a lot bigger than that all the time.

Cheers

Tim...
Tim...
Oracle ACE Director
Oracle ACE of the Year 2006 - Oracle Magazine Editors Choice Awards
OakTable Member
OCP DBA 7.3, 8, 8i, 9i, 10g, 11g
OCP Advanced PL/SQL Developer
Oracle Database: SQL Certified Expert
My website: http://www.oracle-base.com
My blog: http://www.oracle-base.com/blog
Tim...
Site Admin
 
Posts: 17953
Joined: Mon Nov 01, 2004 5:56 pm
Location: England, UK

Re: Export table - first 10 million rows

Postby jnrpeardba » Thu Nov 14, 2013 6:35 pm

Hi Tim,

Having spoken to the Apps team today: a previous export involving a schema of 150GB took more than 10 hours, and we were worried it may take too long to produce the export and transfer the dump file to the new server within a suitable time frame, so I thought an easier option was to do this in chunks.

Jnrpeardba

Re: Export table - first 10 million rows

Postby Tim... » Thu Nov 14, 2013 9:28 pm

Hi.

Thoughts:

- If your storage is slow, your exports will be slow.
- Consider exporting in parallel, assuming your hardware can cope with it.
- If you want to separate things, do a full schema export excluding the two big tables, then export the two big tables as two separate jobs.
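
A hedged sketch of that split - the schema and table names are placeholders, and the directory object is assumed to already exist:

Code: Select all
# Full schema export, excluding the two big tables.
expdp system schemas=MYSCHEMA exclude=TABLE:\"IN('BIG_TAB1','BIG_TAB2')\" directory=DP_DIR dumpfile=schema.dmp logfile=schema.log

# The two big tables as separate jobs, each writing its own set of dump files.
expdp system tables=MYSCHEMA.BIG_TAB1 directory=DP_DIR dumpfile=big_tab1_%U.dmp parallel=4 logfile=big_tab1.log
expdp system tables=MYSCHEMA.BIG_TAB2 directory=DP_DIR dumpfile=big_tab2_%U.dmp parallel=4 logfile=big_tab2.log

The %U substitution variable lets a parallel job write to multiple dump files.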

Cheers

Tim...

Re: Export table - first 10 million rows

Postby jnrpeardba » Fri Nov 15, 2013 7:15 am

Hi Tim,

Thanks for the reply, and in answer to your suggestions:

The previous export was taken with parallel=4.

I cannot confirm the type of storage, other than that they are local disks on a Linux RHEL 5.8 server running Oracle 11.2.0.3.

One thing I'd like to test today is exporting up to a million rows, then another export of rows 1,000,001 up to 2 million. This way I can gather my own timings and get a better idea of the duration, rather than going on what has occurred in the past.

Could you send me the syntax for a Data Pump export to produce two separate exports, as my previous syntax was incorrect?

Many thanks

Jnrpeardba


Re: Export table - first 10 million rows

Postby Tim... » Fri Nov 15, 2013 8:53 am

Hi.

As I said before, this cannot be done using ROWNUM, so you have to use something with a fixed value, such as a primary key ID column. Assuming your table has an ID column populated by a sequence, you might do something like this.

Code: Select all
expdp schemas=schema include=TABLE:\"IN('table')\" query=schema.table:\"WHERE id BETWEEN 1 AND 1000000\" dumpfile=01.dmp ... (first 1 million rows)
expdp schemas=schema include=TABLE:\"IN('table')\" query=schema.table:\"WHERE id BETWEEN 1000001 AND 2000000\" dumpfile=02.dmp ... (second 1 million rows)
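
If more chunks are needed, a small shell loop can generate the per-chunk QUERY and DUMPFILE parameters consistently, rather than hand-editing each command - a sketch, with placeholder schema and table names:

```shell
# Print QUERY/DUMPFILE parameters for three 1-million-row chunks.
# myschema.mytable is a placeholder; paste the output into the expdp commands.
for i in 0 1 2; do
  low=$(( i * 1000000 + 1 ))
  high=$(( (i + 1) * 1000000 ))
  echo "query=myschema.mytable:\"WHERE id BETWEEN ${low} AND ${high}\" dumpfile=$(printf '%02d' $(( i + 1 ))).dmp"
done
```

Note that the double quotes would still need escaping when the generated text is used directly on the command line, as in the examples above.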


Notes.

1) BETWEEN is inclusive of the boundary values, so "BETWEEN 1 AND 1000000" includes both "1" and "1000000".
2) Sequences guarantee uniqueness, but they don't guarantee the absence of gaps, so you may export fewer than 1,000,000 rows per chunk if the database has been restarted at any point, or if you are running in a RAC environment.
3) The total amount of work on your disks will be the same whether you do one big export or several little ones, so the overall time to complete may actually be longer, and the process more error prone.
4) If you are using expdp to get the data, you either have to keep people off the system to get a consistent result, or use the flashback functionality (FLASHBACK_TIME or FLASHBACK_SCN) to make sure all the exports are consistent to the same point in time.
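
For that last point, each of the chunked exports would be given the same flashback time - a sketch, with a placeholder timestamp:

Code: Select all
expdp ... flashback_time=\"TO_TIMESTAMP('15-11-2013 09:00:00', 'DD-MM-YYYY HH24:MI:SS')\"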

Thoughts:

If you think the data is too big to export in one go, perhaps you should consider a database duplication to perform the data move. That will be much faster.

http://www.oracle-base.com/articles/11g ... -11gr2.php

Cheers

Tim...

Re: Export table - first 10 million rows

Postby jnrpeardba » Fri Nov 15, 2013 10:14 am

Thanks Tim,

I will be giving this a go later today and will then send you my feedback.

My initial thoughts when presented with this task were either transportable tablespaces or database duplication, but neither is an option in this situation.
The application will be shut down during the export, but the database will remain up.

Thanks again,

Jnrpeardba

Re: Export table - first 10 million rows

Postby Tim... » Fri Nov 15, 2013 11:33 am

OK. :)
Tim...

Re: Export table - first 10 million rows

Postby jnrpeardba » Tue Nov 26, 2013 2:49 pm

Hi Tim,

Apologies for the delay, but I only just got the go-ahead to perform the export yesterday. Here are my findings:

A schema with a table size of 260GB is producing dump files at a rate of 3GB every 10 minutes - this translates to around 16 hours to export the 260GB. And this is on a table with 5.5 million rows, so it must come down to storage. It's also on a shared server with a number of other databases, so I am CPU bound as well. I understand it defeats the object to use a parallel export on a single table, as this will not gain any benefit.

The source db is on a Solaris server with Oracle 10g, and the target db is on Linux with Oracle 11.2 - so I am not sure we have any options other than this long drawn-out Data Pump affair.

If you are aware of any options I can utilise, please drop me a line and let me know your thoughts.

All in all it was a good test to execute, because at least we as a group know where we stand.

Cheers Tim,

Jnrpeardba

Re: Export table - first 10 million rows

Postby Tim... » Tue Nov 26, 2013 3:43 pm

Hi.

Transportable tablespaces are going to be the quickest option, provided they are possible. They allow the endian conversion you need between big-endian Solaris and little-endian Linux.
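
For reference, the endian conversion step of a cross-platform transport looks something like this in RMAN on the source - a sketch, where the tablespace name is a placeholder and the exact platform name should be taken from V$TRANSPORTABLE_PLATFORM:

Code: Select all
RMAN> CONVERT TABLESPACE my_ts
      TO PLATFORM 'Linux x86 64-bit'
      FORMAT '/tmp/transport/%U';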

Apart from that, it is export/import...

Cheers

Tim...

Re: Export table - first 10 million rows

Postby jnrpeardba » Tue Nov 26, 2013 6:39 pm

Thanks for the feedback Tim,

Looks like I really only have two options:

a) transportable tablespace
b) export / import

I've noticed you have an article detailing cross-platform tablespaces, which I will be taking a read of tonight.
http://www.oracle-base.com/articles/10g/rman-enhancements-10g.php#cross_platform_tablespace_conversion

I think I need to use this option, as I am going from Solaris with Oracle 10g to Linux with Oracle 11.2.

I'll have a read and see if this will be feasible for what I want to achieve,

Jnrpeardba

Re: Export table - first 10 million rows

Postby Tim... » Tue Nov 26, 2013 6:45 pm

It's worth a try!

Cheers

Tim...

Re: Export table - first 10 million rows

Postby jnrpeardba » Thu Nov 28, 2013 9:36 am

Hi Tim,

Having read your article about "cross-platform transportable tablespaces", it does seem feasible to go from Solaris (big-endian) to Linux (little-endian), but I wanted to know whether it can also transfer from Oracle 10g to Oracle 11g.

The other question I had was about the option of performing an RMAN duplicate - again, would this work from Solaris and Oracle 10g to Linux and Oracle 11g?

Is this something you have done, and if so, is it a complex process to implement?

Many thanks,

Jnrpeardba

Re: Export table - first 10 million rows

Postby Tim... » Thu Nov 28, 2013 11:01 am

Hi.

I've not done those specific tasks, but I have done platform conversions and minor version upgrades using transportable tablespaces.

You can see the version upgrade is possible here:

http://oracleblogging.wordpress.com/201 ... 0g_to_11g/

What you really need to do is practice this on a test system. I recently did something similar for a 10g HPUX to an 11g Linux move. It was not possible. There were just too many issues with that specific combination. Eventually I gave up and used export/import. You will only know if your combination is more successful by trying.

Note. Depending on how your application is designed, you may have issues using transportable tablespaces. That could be a limiting factor before you start. Best check the transport is possible first, before deciding on a route. :)
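
The transportability check itself is quick to run with the DBMS_TTS package - a sketch, with a placeholder tablespace name; any rows returned by the violations view mean the set is not self-contained:

Code: Select all
EXEC SYS.DBMS_TTS.TRANSPORT_SET_CHECK('MY_TS', TRUE);

SELECT * FROM SYS.TRANSPORT_SET_VIOLATIONS;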

Cheers

Tim...
