Video : Data Pump Checksums in Oracle Database 21c

In today’s video we demonstrate the use of Data Pump checksums in Oracle Database 21c.

The video is based on part of this article.

The star of today’s video is Franck Pachot, who is checking out his escape routes. 🙂

Cheers

Tim…

Video : SQLcl : Data Pump (Part 2)

In today’s video we demonstrate table-level exports and imports using the integration between SQLcl and Data Pump.

This video builds on the first video, so it makes sense to watch that first.

Both these videos are based on this article.

I have a few articles on other SQLcl features, which you can find here.

The star of today’s video is Craig Shallahamer. Craig is the only person I know who defies the first law of thermodynamics, as he’s able to produce more energy than he consumes. If you’ve seen him teach or present, you’ll know what I mean. If you haven’t, you really should! 🙂

Cheers

Tim…

Video : Oracle Data Pump 21c and Cloud Object Stores

In today’s video we’ll use Oracle Data Pump 21c to export to, and import from, a cloud object store.

The video is based on this article.

You might find the following articles useful as part of the setup.

The star of today’s video is Markus Michalewicz, the Vice President of Product Management; Database High Availability, Scalability, MAA & Cloud Migration at Oracle. That’s a bit of a mouthful. 🙂

Cheers

Tim…

Data Pump Enhancements in Oracle 21c (and a little support story)

I’ve been having a play with some of the Oracle 21c Data Pump enhancements. This post links to the resulting articles, and includes a little story about one of the features.

The Articles

Here are the articles I wrote during this investigation.

As an aside, I also knocked up a quick overview of the DBMS_CLOUD package. I’ve used many of the routines in this package in my autonomous database articles over the last few years, but I got a bit sick of jumping around to get the syntax of different operations, so it seemed sensible to have a centralised description of everything, along with working examples.
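As a rough illustration of the kind of calls I mean (not taken from the article, and with made-up credential names and bucket URIs), creating a credential and listing the contents of a bucket looks something like this.

  -- Create a credential for the object store (names and values are placeholders).
  BEGIN
    DBMS_CLOUD.create_credential(
      credential_name => 'OBJ_STORE_CRED',
      username        => 'me@example.com',
      password        => 'my_auth_token');
  END;
  /

  -- Use the credential to list the objects in a bucket.
  SELECT object_name, bytes
  FROM   DBMS_CLOUD.list_objects(
           credential_name => 'OBJ_STORE_CRED',
           location_uri    => 'https://objectstorage.../b/my-bucket/o/');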

The Story

The article about using expdp and impdp with a cloud object store (Oracle Cloud Object Storage, AWS S3 or Azure Blob Storage) came with a little bit of drama.

Back in the 18c days it was possible to import a dump file located on a cloud object store into an Autonomous Database using the 18c impdp utility. I wrote about this at the time (here). At that time export to a cloud object store using the expdp utility was not supported, and the import wasn’t supported with an on-prem database.

Oracle 21c introduced the ability to export from an Autonomous Database to a cloud object store, which worked fine first time. The documentation also listed a new feature called “Oracle Data Pump Supports Export to and Import From Cloud Object Stores”. This sounded very much like it was meant for on-prem databases, and sure enough it was.

When I started trying to use this feature I pretty quickly hit a roadblock. The expdp utility couldn’t connect to the object store bucket. I raised a call with Oracle Support about it. While I was waiting for a response I figured this functionality might have a dependency on the DBMS_CLOUD package under the hood, so I installed it in my on-prem database. The on-prem installation of DBMS_CLOUD was working OK, but the expdp utility was still failing to contact the object store bucket.

Due in part to friends in high places, my SR got picked up and it was confirmed the DBMS_CLOUD installation was an undocumented prerequisite, but it was still not working for me. The support engineer confirmed they could replicate the issue too. A few interactions between support and development resulted in bug 33323028, which fortunately had a simple workaround. At that point the support engineer was up and running, but I still had a problem. A bit of tracing later and it turned out my remaining issue was PEBCAK (Problem Exists Between Chair And Keyboard)…

When I installed the DBMS_CLOUD package, the instructions said to put a wallet reference in the sqlnet.ora file. I did that and the package seemed to be working OK, so I thought everything was good. Unfortunately I put it under the ORACLE_HOME, and Oracle 21c uses a read-only Oracle home, so that’s the wrong place. It didn’t affect the package, as that picks up the wallet location from a database property, but it did affect the expdp and impdp utilities. I keep telling people read-only Oracle homes will trip you up if you are not used to them, and sure enough one tripped me up. Damn you muscle memory! Once the correct sqlnet.ora file was amended, everything was good.
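For reference, this is the type of entry I’m talking about, placed in the sqlnet.ora under the read-only home’s ORACLE_BASE_HOME location rather than under the ORACLE_HOME. The paths below are examples only.

  # $ORACLE_BASE_HOME/network/admin/sqlnet.ora (not $ORACLE_HOME/network/admin)
  # The wallet directory is just an example.
  WALLET_LOCATION =
    (SOURCE = (METHOD = FILE)
      (METHOD_DATA = (DIRECTORY = /u01/app/oracle/admin/cdb1/wallet)))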

So the journey to get this feature working involved:

  • An undocumented prerequisite, which I guessed.
  • A bug which Oracle Support and the dev folks gave me a workaround to.
  • An idiot (me) trying to learn not to be an idiot.

With a bit of luck the bug fix/workaround will be rolled into a future release update, so you may never see this. The MOS note about the DBMS_CLOUD package installation suggests this might also be part of the database by default in future. That would be great if it happens.

Anyway, after that little drama I was able to export data from my on-prem database to a dump file located on a cloud object store, and import data from a cloud object store into my on-prem database. Happy days!
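To give a feel for what that looks like, here’s a rough sketch of the commands. The credential, URI, table and directory names are placeholders, so check the article for the exact syntax. Note the log file still goes to a local directory object.

  # Export a table straight to the object store from the on-prem 21c database.
  expdp system/MyPassword@pdb1 \
      tables=emp \
      credential=obj_store_cred \
      dumpfile=https://objectstorage.../b/my-bucket/o/emp.dmp \
      directory=tmp_dir \
      logfile=expdp_emp.log

  # Import it back from the object store.
  impdp system/MyPassword@pdb1 \
      tables=emp \
      credential=obj_store_cred \
      dumpfile=https://objectstorage.../b/my-bucket/o/emp.dmp \
      directory=tmp_dir \
      logfile=impdp_emp.log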

Thanks to the support and dev folks who helped get me through this! ๐Ÿ™‚

By the way, all the other Oracle 21c Data Pump new features worked without any issues.

So there you have it. Some new articles and a little bit of drama… 🙂

Cheers

Tim…

Data Pump : API for PL/SQL (DBMS_DATAPUMP) and REST (ORDS)

I’m on holiday, so it’s a lot easier to tick things off my giant list at the moment. 🙂

Someone asked me about the DBMS_DATAPUMP package, which is a PL/SQL API for Data Pump. I knew I had written about it, but it turned out I only had one little example in my main Data Pump article here. I put it on my list to expand it a little with some other common examples. I finally got round to this here.

It’s by no means an exhaustive list of what the DBMS_DATAPUMP package can do, but if someone new is coming to it, it puts a bit more meat on the bones compared to what I had before.
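As a flavour of the API (not lifted from the article; the directory object and schema names are just placeholders), a minimal schema export looks something like this.

  -- Minimal schema export using DBMS_DATAPUMP.
  -- TMP_DIR is an existing directory object and SCOTT is the schema to export.
  DECLARE
    l_dp_handle NUMBER;
  BEGIN
    -- Open a schema-level export job.
    l_dp_handle := DBMS_DATAPUMP.open(
      operation => 'EXPORT',
      job_mode  => 'SCHEMA',
      job_name  => 'SCOTT_EXPORT');

    -- Dump file and log file in the directory object.
    DBMS_DATAPUMP.add_file(
      handle    => l_dp_handle,
      filename  => 'scott.dmp',
      directory => 'TMP_DIR');

    DBMS_DATAPUMP.add_file(
      handle    => l_dp_handle,
      filename  => 'scott.log',
      directory => 'TMP_DIR',
      filetype  => DBMS_DATAPUMP.ku$_file_type_log_file);

    -- Limit the export to the SCOTT schema.
    DBMS_DATAPUMP.metadata_filter(
      handle => l_dp_handle,
      name   => 'SCHEMA_EXPR',
      value  => '= ''SCOTT''');

    DBMS_DATAPUMP.start_job(l_dp_handle);
    DBMS_DATAPUMP.detach(l_dp_handle);
  END;
  /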

One of the things that moved it up the list was that I was thinking about looking at the ORDS Database API endpoints related to Data Pump, which resulted in this.

That was a rather odd experience for a few reasons.

  • There were some issues with authentication. I expected it to work with both the default administrator and an ORDS enabled schema, but it only worked with the latter. That was confusing the hell out of me for a time. I’m told it will work with both in the future…
  • There are two ways to use Data Pump. There is a REST endpoint “/database/datapump/jobs/” that can perform all the actions, and there are two RPC-style endpoints “/database/datapump/export” and “/database/datapump/import”. I decided that RPC is not for me and chose to ignore it. I’m a lot happier with the REST endpoint (a minimal call against it is sketched after this list).
  • The API has extremely limited functionality at the moment.
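To show the shape of a call, here’s a rough sketch of listing the Data Pump jobs via the ORDS Database API, authenticating as the ORDS enabled schema. The host, port, alias and credentials are placeholders, and the payloads for actually defining a job are in the linked article.

  # List the Data Pump jobs visible to the ORDS enabled schema (placeholder host and credentials).
  curl -s -u testuser1:testuser1 \
    http://localhost:8080/ords/testuser1/_/db-api/stable/database/datapump/jobs/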

I guess I was expecting this to feel like a wrapper over the DBMS_DATAPUMP package, but it is not like that at all. In this version it’s an API that covers a few basic tasks and completely locks you out of everything else. I can’t imagine many DBAs being able to use it, but it might be ideal for some developers if you are OK with giving them DBA credentials to the database. I wrote some of my thoughts/concerns at the bottom of the article here.

Despite my reservations, this is effectively version 1.0 of this functionality. I would suggest people try it out and feed back their requirements to the ORDS folks, so it can be improved.

On a more general note, it’s clear the ORDS Database API needs a rethink in terms of access management. It seems a bit all or nothing at the moment. I think it’s safe to say that as the APIs expand, there will be a range of people needing to use them, each with very different security constraints. I might want someone to be able to perform table or schema-level exports/imports, but I will want to control what they can do, and giving them that access doesn’t mean I want them to be able to run DBCA commands to create new instances as well. I think you get my point. It seems like access to the APIs needs to respect the underlying roles and privileges of the user, rather than relying on a separate ORDS parameter mechanism. Just my thoughts…

Cheers

Tim…

Data Pump Between Database Versions : It’s not just about the VERSION parameter! (Time Zone Files)

I was doing a small “quick” data transfer between two servers. The source was 19c and the destination was 18c, so I used the VERSION parameter during the export.

expdp … version=18 directory=…

The export went fine, but when I started the import I immediately got this error.

ORA-39002: invalid operation

A little Googling and I came across MOS Doc ID 2482971.1. In short, the time zone file was different between the two databases.

No problem. I know how to fix that, and both databases had the January quarterly patches applied, so the latest time zone files would be available, right? Wrong. The 18c database was already at the highest time zone file version installed in its Oracle home, and I needed one version higher to match the 19c database.
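For reference, this is roughly how I check the time zone file version the database is using, and the latest version available in the Oracle home.

  -- Time zone file version currently in use by the database.
  SELECT * FROM v$timezone_file;

  -- Latest time zone file version available in the Oracle home.
  SELECT DBMS_DST.get_latest_timezone_version FROM dual;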

After some Googling I re-found MOS Doc ID 412160.1. As soon as I opened it I remembered it. I find this note really messy and confusing, but the section labelled “C.1.d) DST patches list” had the list of patches, which is what I needed. I downloaded the patch matching the time zone file version of the source system and applied it with OPatch in the normal way. Always read the patch notes!!!

Once the new time zone file was in place, it was time to update it in the database. I’ve written about this before.
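The full process is covered in that post, but as a heavily cut-down reminder (skipping the prepare step and the error checking), the DBMS_DST upgrade looks something like this.

  -- Start the upgrade with the database in upgrade mode.
  SHUTDOWN IMMEDIATE;
  STARTUP UPGRADE;

  DECLARE
    l_tz_version PLS_INTEGER;
  BEGIN
    l_tz_version := DBMS_DST.get_latest_timezone_version;
    DBMS_DST.begin_upgrade(l_tz_version);
  END;
  /

  -- Restart normally, then upgrade any TSTZ data and end the upgrade.
  SHUTDOWN IMMEDIATE;
  STARTUP;

  DECLARE
    l_failures PLS_INTEGER;
  BEGIN
    DBMS_DST.upgrade_database(l_failures);
    DBMS_DST.end_upgrade(l_failures);
  END;
  /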

Once the time zone file versions matched, the import worked as expected, although the small data transfer I expected to be quick had turned into a much bigger job. 🙂

I can’t remember if I’ve hit this issue before, but I don’t think I have. I guess I’ve just been lucky with the time zone file versions matching. This post is to remind myself that it’s not just about the VERSION parameter! I’ve also updated a couple of articles with pointers about this.

Cheers

Tim…

PS. It seems the later releases are more sensitive to time zone file differences than previous releases.

OTN Appreciation Day : Data Pump (expdp, impdp)

Here’s my contribution to the OTN Appreciation Day.

Data Pump (expdp, impdp) was added in Oracle 10g as a replacement for the rather tired exp/imp utilities, which, although useful, were severely lacking in functionality. So why do I like Data Pump? Here are a few standouts for me, but I’m sure other people will prefer others. 🙂

  • Transportable Tablespaces, and from 12c onward Transportable Database, make moving large amounts of data, or even whole databases, easy. This can include platform/endian changes and version upgrades too. This was possible with exp/imp as well, but that doesn’t stop it being a useful feature of Data Pump. 🙂
  • The INCLUDE, EXCLUDE, CONTENT and QUERY parameters allow a lot of flexibility about what you include or exclude from your exports and imports (see the example after this list).
  • The FLASHBACK_TIME parameter allows you to do an export of the data based on a point in time, undo permitting, which allows you to make truly consistent exports.
  • The REMAP parameters allow you to rename tablespaces, datafiles, schemas, tables and even alter the data during operations.
  • The DBMS_DATAPUMP package provides a PL/SQL API, allowing you to perform Data Pump operations without having to shell out to the OS. That makes automation a lot simpler.
  • The NETWORK_LINK parameter can be used to perform export and import operations over a database link. This allows you to create a dump file on a remote database, or even import without an intermediate dump file.
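As a quick illustration of a few of these parameters (the schema, table and directory names are just examples), a consistent export of a couple of tables could use a parameter file (expdp_scott.par) like this, followed by an import into a different schema.

  schemas=scott
  include=TABLE:"IN ('EMP','DEPT')"
  flashback_time=systimestamp
  directory=tmp_dir
  dumpfile=scott.dmp
  logfile=expdp_scott.log

  $ expdp system/MyPassword@pdb1 parfile=expdp_scott.par
  $ impdp system/MyPassword@pdb1 directory=tmp_dir dumpfile=scott.dmp remap_schema=scott:scott_copy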

I’m sure beginners think Data Pump is just for doing exports and imports of small databases, but it’s got loads of features. If you’ve not kept up with the new releases, you might be surprised just how much it can do!

If you’re interested in reading something more technical about Data Pump, here are some articles from my website.

Cheers

Tim…

Data Pump Enhancements in Oracle Database 12c

Another one to file under “Not sexy but flippin’ awesome!”

If you are a DBA, you are going to spend a lot of time with Data Pump. All roads seem to lead back to it. 🙂 There are some more headline-worthy features, like transportable database, but the two that jumped out at me were actually pretty small, yet awesome.

  • “TRANSFORM=DISABLE_ARCHIVE_LOGGING:Y” – Switches table and/or index imports to NOLOGGING for the lifespan of the import operation.
  • “LOGTIME=ALL” – Adds a timestamp to the output messages, so you can see how long individual operations took (see the example after this list).
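As a quick sketch of both parameters together (the connection details, schema and file names are just examples), an import might look like this.

  # Timestamped output, with archive logging disabled for the imported objects.
  impdp system/MyPassword@pdb1 schemas=scott directory=tmp_dir dumpfile=scott.dmp \
      logfile=impdp_scott.log logtime=all transform=disable_archive_logging:y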

I wrote up an article about it here.

Cheers

Tim…