Update Oracle Database Time Zone Files (Poll Results Discussed)

In case you didn’t know, countries occasionally change their time zones, or alter the way they handle daylight saving time (DST). To let the database know about these changes we have to apply a new database time zone file. The updated files have been shipped with upgrades and patches since 11gR2, but applying them to the database has always been a manual operation.

With the recent switch over to daylight saving time in the UK, I decided to post this question on Twitter yesterday.

How often do you update your Oracle database time zone files?

Less than 6% of people update their time zone files on a regular schedule. Nearly 45% only do the updates after a database upgrade, and nearly 50% never do it at all.

I can’t say I’m surprised by the results. In terms of the reasoning for these responses, I’ll reference some of the comments on Twitter.

Regular Schedule

“Every ru patch, also thanks to 19.18 it is included now and with out of place upgrade and autoupgrade, i dont do it anymore πŸ™‚ all automatic.”

Mustafa KALAYCI

If you are using AutoUpgrade to patch to a new Oracle Home, then applying updated time zone files is really easy. Before 19.18 it’s just a single entry “timezone_upg=yes” in the AutoUpgrade config file. From 19.18 onward the update of the time zone file is the default action (see here).

So interestingly, there may be some people who are now applying time zone file updates without even knowing it…

After Upgrades

This feels like the natural time to do it for me, and it seems many other people feel the same.

As mentioned previously, AutoUpgrade makes it simple. From 21c onward AutoUpgrade is the main upgrade approach, even for those who have resisted using it for previous versions, so from an upgrade perspective this question goes away.

We can specifically tell it not to perform the action using “timezone_upg=no”, but I’m guessing most people will just go with the default action.
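For anyone who hasn't used it, here's a minimal sketch of how that entry sits in an AutoUpgrade config file. The paths and SID are just placeholders for illustration, not from a real system.

# Minimal AutoUpgrade config sketch (placeholder paths and SID).
global.autoupg_log_dir=/u01/app/oracle/autoupgrade
upg1.source_home=/u01/app/oracle/product/19.0.0/dbhome_1
upg1.target_home=/u01/app/oracle/product/19.0.0/dbhome_2
upg1.sid=cdb1
upg1.timezone_upg=yes

From 19.18 onward you can leave the timezone_upg entry out and the time zone update happens by default, or set it to "no" to skip it.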

Never

“NEVER. As an American-only company with very little need for time-specific data, quite unnecessary. Horrible design with no rollbacks and headaches w/data pump. Just not worth it if possible to avoid”

Taylor

I totally understand this response. Many of us work with systems that are limited to our own country. Assuming our country doesn’t alter its own daylight saving time rules, then using an old time zone file is unlikely to cause an issue.

When you consider the number of people that run *very old* versions of Oracle, you can see that using old versions of the time zone file doesn’t present a major issue in these circumstances.

With reference to the data pump issue, I’ve experienced this, and it was also picked up in the comments.

“My hypothesis: Most do it when datapump tells they need to do it to get the import file they just received to load”

Connor McDonald

Offline/Online Operation

The point about this being an offline operation was raised.

“Well it is an offline operation, so pretty exceptional thing to do. Only in a rare case where some feature requires the upgrade – like DataPump failing or query over dblink failing.”

Ilmar Kerm

Downtime is never welcome, but it was also pointed out that it can be an online operation in 21c.

“Offline will be a thing of the past…

https://docs.oracle.com/en/database/oracle/oracle-database/21/refrn/TIMEZONE_VERSION_UPGRADE_ONLINE.html”

Connor McDonald
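I haven’t tried this myself yet, but based on the documentation linked above, 21c introduces the TIMEZONE_VERSION_UPGRADE_ONLINE initialization parameter. A rough sketch of the idea (check the docs for your version before relying on it):

-- 21c: allow the DBMS_DST time zone data upgrade to run while the
-- database is open, instead of needing a restart into UPGRADE mode.
ALTER SYSTEM SET timezone_version_upgrade_online = TRUE;

The DBMS_DST steps themselves are the same as in the manual procedure, they just no longer need the offline window.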

Conclusion

It seems like the time zone file version is not high on the list of priorities for most people, provided it is not causing a data pump issue. I totally understand this, and I myself only consider it during database upgrades.

I always like reading these poll results. I know the sample size is small, but it gives you a good idea of how your beliefs compare to the wider audience.

If you are interested in knowing how to manually upgrade your time zone file, you can read about it here.

Cheers

Tim…

Data Pump Between Database Versions : It’s not just about the VERSION parameter! (Time Zone Files)

I was doing a small “quick” data transfer between two servers. The source was 19c and the destination was 18c, so I used the VERSION parameter during the export.

expdp … version=18 directory=…

The export went fine, but when I started the import I immediately got this error.

ORA-39002: invalid operation

A little Googling and I came across MOS Doc ID 2482971.1. In short, the time zone file was different between the two databases.

No problem. I know how to fix that, and both databases had the January quarterly patches applied, so the latest time zone files would be available, right? Wrong. The 18c database was already at the maximum time zone version installed, and I needed it to be one version higher to match the 19c database.
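For what it's worth, this is roughly how I compare the two versions that matter here: the one the database is currently using, and the latest one available in the Oracle Home.

-- Time zone file version currently in use by this database.
SELECT * FROM v$timezone_file;

-- Latest time zone file version available in the Oracle Home.
SELECT dbms_dst.get_latest_timezone_version FROM dual;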

After some Googling I re-found MOS Doc ID 412160.1. As soon as I opened it I remembered it. I find this note really messy and confusing, but the section labelled “C.1.d) DST patches list” had the list of patches, which is what I needed. I downloaded the patch to match the time zone file version of the source system and applied it with OPatch in the normal way. Always read the patch notes!!!

Once the new time zone file was in place, it was time to update it in the database. I’ve written about this before.
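If you haven't seen it, the conventional offline method uses DBMS_DST. This is a heavily condensed sketch of it; the full procedure, including the prepare and check steps, is in the article linked above.

-- Condensed sketch only. Do the prepare/check steps from the full article first.
SHUTDOWN IMMEDIATE
STARTUP UPGRADE

-- Start the upgrade to the latest time zone file version available.
EXEC DBMS_DST.begin_upgrade(DBMS_DST.get_latest_timezone_version);

SHUTDOWN IMMEDIATE
STARTUP

-- Convert the existing TIMESTAMP WITH TIME ZONE data, then end the upgrade.
DECLARE
  l_failures  PLS_INTEGER;
BEGIN
  DBMS_DST.upgrade_database(num_of_failures => l_failures);
  DBMS_DST.end_upgrade(num_of_failures => l_failures);
END;
/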

Once the time zone file versions matched, the import worked as expected. Of course, the small data transfer I expected to be quick had turned into a much bigger job. 🙂

I may have hit this issue before, but I don’t remember it. I guess I’ve just been lucky with the time zone file versions matching. This note is to remind myself: it’s not just about the VERSION parameter! I’ve also updated a couple of articles with pointers about this.

Cheers

Tim…

PS. It seems the later releases are more sensitive to time zone file differences than previous releases.