Data Pump Enhancements in Oracle 21c (and a little support story)

I’ve been having a play with some of the Oracle 21c data pump enhancements. This post links to the resulting articles, and includes a little story about one of the features.

The Articles

Here are the articles I wrote during this investigation.

As an aside, I also knocked up a quick overview of the DBMS_CLOUD package. I’ve used many of the routines in this package in my autonomous database articles over the last few years, but I got a bit sick of jumping around to get the syntax of different operations, so it seemed sensible to have a centralised description of everything, along with working examples.
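As a taster, most of the object store operations in DBMS_CLOUD hang off a credential, created like this. The names and values here are placeholders, and in reality the password is usually an auth token rather than your account password.

```sql
BEGIN
  DBMS_CLOUD.CREATE_CREDENTIAL(
    credential_name => 'OBJ_STORE_CRED',   -- placeholder credential name
    username        => 'me@example.com',   -- cloud username (placeholder)
    password        => 'my-auth-token'     -- auth token (placeholder)
  );
END;
/
```

Once the credential exists, it gets referenced by name in the other routines, which saves you scattering credentials through your code.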

The Story

The article about using expdp and impdp with a cloud object store (Oracle Cloud Object Storage, AWS S3 or Azure Blob Storage) came with a little bit of drama.

Back in the 18c days it was possible to import into an Autonomous Database from a dump file on a cloud object store using the 18c impdp utility. I wrote about this at the time (here). At that time export to a cloud object store using the expdp utility was not supported, and the import wasn't supported with an on-prem database.

Oracle 21c introduced the ability to export from an Autonomous Database to a cloud object store, which worked fine first time. The documentation also listed a new feature called "Oracle Data Pump Supports Export to and Import From Cloud Object Stores". This sounded very much like it was meant for on-prem databases, and sure enough it was.

When I started trying to use this feature I pretty quickly hit a roadblock. The expdp utility couldn't connect to the object store bucket, so I raised a call with Oracle Support about it. While I was waiting for a response I figured this functionality might have a dependency on the DBMS_CLOUD package under the hood, so I installed it in my on-prem database. The on-prem installation of DBMS_CLOUD was working OK, but the expdp utility was still failing to contact the object store bucket.

Due in part to friends in high places, my SR got picked up and it was confirmed the DBMS_CLOUD installation was an undocumented prerequisite, but it was still not working for me. The support engineer confirmed they could replicate the issue too. A few interactions between support and development resulted in bug 33323028, which fortunately had a simple workaround. At that point the support engineer was up and running, but I still had a problem. A bit of tracing later and it turned out my remaining issue was PEBCAK (Problem Exists Between Chair And Keyboard)…

When I installed the DBMS_CLOUD package, the instructions said to put a wallet reference in the sqlnet.ora file. I did that and the package seemed to be working OK, so I thought everything was good. Unfortunately I put it under the ORACLE_HOME, and Oracle 21c uses a read-only Oracle home, so that's the wrong place. It didn't affect the package, as that picks up the wallet location from a database property, but it did affect the expdp and impdp utilities. I keep telling people read-only Oracle homes will trip you up if you are not used to them, and sure enough one tripped me up. Damn you muscle memory! Once the correct sqlnet.ora file was amended everything was good.
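For anyone hitting the same thing, this is the sort of entry I mean. The wallet directory is just an example from my setup, and with a 21c read-only Oracle home the sqlnet.ora file lives under the ORACLE_BASE_HOME location (e.g. $ORACLE_BASE/homes/OraDB21Home1/network/admin), not under the ORACLE_HOME.

```
# Goes in the sqlnet.ora under ORACLE_BASE_HOME, not ORACLE_HOME.
# The wallet directory below is an example path, not a requirement.
WALLET_LOCATION=
  (SOURCE=(METHOD=FILE)(METHOD_DATA=
    (DIRECTORY=/u01/app/oracle/admin/cdb1/wallet_dbms_cloud)))
```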

So the journey to get this feature working involved:

  • An undocumented prerequisite, which I guessed.
  • A bug which Oracle Support and the dev folks gave me a workaround to.
  • An idiot (me) trying to learn not to be an idiot.

With a bit of luck the bug fix/workaround will be rolled into a future release update, so you may never see this. The MOS note about the DBMS_CLOUD package installation suggests this might also be part of the database by default in future. That would be great if it happens.

Anyway, after that little drama I was able to export data from my on-prem database to a dump file located on a cloud object store, and import data from a cloud object store into my on-prem database. Happy days!
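For context, the finished result looks something like this. It's just a sketch, assuming a credential called OBJ_STORE_CRED has already been created with DBMS_CLOUD.CREATE_CREDENTIAL, and the connection, namespace and bucket details are placeholders.

```
# Export a table from the on-prem database straight to the object store,
# then import it back. All names and the bucket URI are placeholders.

expdp system@pdb1 \
  tables=scott.emp \
  credential=obj_store_cred \
  dumpfile=https://objectstorage.uk-london-1.oraclecloud.com/n/my-namespace/b/my-bucket/o/emp.dmp

impdp system@pdb1 \
  tables=scott.emp \
  credential=obj_store_cred \
  dumpfile=https://objectstorage.uk-london-1.oraclecloud.com/n/my-namespace/b/my-bucket/o/emp.dmp \
  table_exists_action=replace
```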

Thanks to the support and dev folks who helped get me through this! 🙂

By the way, all the other Oracle 21c data pump new features worked without any issues.

So there you have it. Some new articles and a little bit of drama… 🙂

Cheers

Tim…

Oracle Cloud Infrastructure (OCI) and Terraform : First Steps

We’ve got some stuff going on at work using Terraform, or Terrahawks as I like to call it, so I figured it was about time I had a play with it. I probably won’t be doing much of the project work myself, but I like to understand a bit about all the things we do.

The biggest problem with going on one of these “learning missions” is finding something to do that makes it feel real to me. I have some test environments across two Oracle Cloud accounts. One is my free tier account and the other is a trial account I get through the Oracle ACE Program, that has quite a lot of credit. 🙂 I figured I would automate the build of my test environments, so I can trash and rebuild them at will. So with that as my mission, I’ve taken my first steps into Terraform.

I’m not finished yet, and I’m not saying this is production-ready “best practice” stuff. It’s just something I’ve been playing around with, and it works great. Fortunately the Terraform OCI provider and resources do all the heavy lifting, and if you are used to using Oracle Cloud, it’s pretty easy to navigate around the documentation, as a lot of it is organised in a similar way to the menu structure. You can find the top level of the docs here.
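To give a flavour of what the provider looks like, here's a minimal sketch based on my reading of the Terraform OCI provider docs. All the OCIDs, variables and names are placeholders for my own setup.

```hcl
# Minimal OCI provider setup. All values are placeholders.
provider "oci" {
  tenancy_ocid     = var.tenancy_ocid
  user_ocid        = var.user_ocid
  fingerprint      = var.fingerprint
  private_key_path = var.private_key_path
  region           = "uk-london-1"
}

# A compartment to hold the disposable test environments.
resource "oci_identity_compartment" "test_compartment" {
  compartment_id = var.tenancy_ocid
  name           = "test-compartment"
  description    = "Compartment for my disposable test environments."
}
```

The nice thing is the whole environment becomes a `terraform apply` to build and a `terraform destroy` to trash, which is exactly the trash-and-rebuild cycle I was after.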

As I always say in these situations, it’s early days for me. I’ve got a number of things I want to build, and I’m sure that process will teach me more, and make me look back at these articles and cringe. That’s more rewriting on the way. 🙂

I’m putting this stuff into a GitHub repo, but I’ve not published that yet. I’m still trying to figure out what I should and shouldn’t include. Update: Here’s the GitHub repo.

Cheers

Tim…

PS. If you don’t remember Terrahawks, this might remind you.

Oracle Autonomous Database Cloud 2019 Specialist : My Thoughts

You’ve probably heard that Oracle have made some training and certifications free in recent times (here). We are approaching the end of that period now. Only about 15 more days to go.

Initially I thought I might try and do all the certifications, but other factors got in the way, so I just decided to do one. You can probably guess which one by the title of this post. 🙂

I had seen a few people speaking about their experiences of the training videos, so I thought I would give my opinions. Remember, this is my opinion of the training materials and exam, not my opinion of the cloud services themselves. I am also aware that this was free, so my judgement is going to be different than if I had to pay for it.

The Voices in the Videos

A number of people have been critical about the voices on the training videos. I really didn’t see a problem with them.

When you record videos and do presentations you have to decide who your target audience is. A large number of people that use Oracle have English as a second language. Having spent years presenting around the world I’ve learned you have to slow down a bit, or you lose some of the audience. I do this on my YouTube videos, and it can make them sound a bit monotone at times. When I’ve recorded my videos at my normal talking speed, people have responded to say they were brutally fast. You can’t please everyone. You have to make a choice, and for some professional training materials that probably means speaking slower.

I listened to most of these training videos at 1.5 speed and it was fine. The fact I wanted to listen to it this way is not a criticism of the training. I’ve listened to a number of Pluralsight courses at 1.7 speed, and I tend to listen to non-fiction Audible books on a higher speed. You just have to find what works for you.

It’s just my opinion, but I thought the voice was fine.

Content Inconsistencies

There are inconsistencies between the training materials and the documentation. I originally listed some, but I don’t think it’s really helpful. As with any training material, I think it’s worth going through the training material and documentation at the same time and cross referencing them, as well as trying stuff out if you can. It helps you to learn and it makes sure you really know what you are talking about.

Why are there inconsistencies? I suspect it’s because the cloud services have changed since the training materials were recorded. Remember, there is a quarterly push to the cloud, so every three months things might look or act a little different.

What should you do? I would suggest you learn both the training material, and the reality where the two diverge, but assume the training material is correct for the purpose of the exam, even if you know it to be wrong in reality. This is what I’ve done for all previous certifications, so this is nothing new to me.

How did I prepare?

As mentioned above, I watched the videos at 1.5 speed. For any points that were new to me, or I had suspicions about the accuracy, I checked the docs and my own articles on the subject. I also logged into the ADW and ATP services I’m running on the Free Tier to check some things out.

I did the whole of this preparation on Sunday, but remember I’ve been using ADW and ATP on and off since they were released. If these are new to you, you may want to take a little longer. I attempted to book the exam for Monday morning, but the first date I could get was late Wednesday.

Content

The training content is OK, but it contains things that are not specific to Autonomous Database. Sure, they are features that can be used inside, or alongside ADB, but I would suggest they are not really relevant to this training.

Why? I think it’s padding. Cloud services should be easy to use and intuitive, so in many cases I don’t think they should need training and certification. They should lead you down the right path and warn of impending doom. If the docs are clear and accurate, you can always dig a little deeper there.

This certification is not about being a DBA or developer. It’s about using the ADB services. I don’t think there is that much to know about most cloud services, and what really matters goes far beyond the scope of online training and certifications IMHO. 🙂

Free

The training and certifications are free until the middle of May 2020, which is when the new 2020 syllabus and certifications for some of the content comes out. By passing this free certification you are passing the 2019 certification, and they will stay valid for 18 months, then you will have to re-certify or stop using the title. I guess it’s up to you whether you feel a pressing need to re-certify or not.

Update: Some of the other training and exams are already based on the 2020 syllabus. Thanks to Adrian Png for pointing this out. 🙂

I’m sure this would not be popular at Oracle, but I would suggest they keep the cloud training and certifications free forever. Let’s be honest. Oracle are a bit-player in the cloud market. They need all the help they can get to win hearts and minds. Making the cloud training and certification free forever may help to draw people in. I don’t see this type of material as a revenue stream, but I’m sure some folks at Oracle do.

From what I’ve seen, the training materials are entry level, and something I would encourage people to watch before using the services, so why not make them free? That’s rhetorical. I know the answer. 🙂

Would I pay for it?

No. I watched the material to get a feel for what they included. I’m not saying I already knew everything, because I didn’t, but I knew most of what I wanted to know before using this stuff. Of course, if I had come in clean, this would have been pretty helpful I guess, but I think it would have been just as easy for me to use some of the online docs, blog posts and tutorials to get to grips with things. That’s just my opinion though. Other people may feel differently.

Would I have sat the exam if I had to pay for it? No. I don’t think there is anything here that I wouldn’t expect someone to pick up during their first few hours of working with the service. It’s nice that it’s free, but I’m not sure it makes sense to pay for it.

What about the exam?

The exam just proves you have watched the videos and have paid attention. If someone came into my office and said, “Don’t worry, I’m an Oracle Autonomous Database Cloud 2019 Specialist. Everything is going to be OK!”, I would probably lead them to the door…

I don’t think the exam was so much hard, as confusing at times. There were some questions I think need revision, but maybe I’m wrong. 🙂

What about doing the exam online?

This freaked me out a bit. You have to take photos of yourself at your desk, and photos of the room. Somewhere at Pearson Vue they have photos of my washing hanging up. 🙂 You are told not to touch your face, so as soon as I heard that my whole head started to itch. I started to read the first question out loud, and was told I had to sit in silence. I understand all the precautions, and they are fine. It just felt a bit odd. 🙂

So there you have it. Having promised myself I would never certify again, it turns out I’m a liar… 🙂 If you get a chance, give one of the training courses and exams a go. You’ve got nothing to lose. You can read more here.

Cheers

Tim…

Oracle Cloud : Free Tier and Article Updates

Oracle Cloud Free Tier was announced a couple of months ago at Oracle OpenWorld 2019. It was mentioned in one of my posts at the time (here). So what do you get for your zero dollars?

  • 2 Autonomous Databases : Autonomous Transaction Processing (ATP) and/or Autonomous Data Warehouse (ADW). Each has 1 OCPU and 20 GB of user data storage.
  • 2 virtual machines with 1/8 OCPU and 1 GB memory each.
  • Storage : 2 Block Volumes, 100 GB total. 10 GB Object Storage. 10 GB Archive Storage.
  • Load Balancer : 1 instance, 10 Mbps bandwidth.
  • Some other stuff…

I’ve been using Oracle Cloud for a few years now. Looking back at my articles, the first was written over 4 years ago. Since then I’ve written more as new stuff has come out, including the release of OCI, and the Autonomous Database (ADW and ATP) services. As a result of my history, it was a little hard to get excited about the free tier. Don’t get me wrong, I think it’s a great idea. Part of the battle with any service is to get people using it. Once people get used to it, they can start to see opportunities and it sells itself. The issue for me was I already had access to the Oracle Cloud, so the free tier didn’t bring anything new to the table *for me*. Of course, it’s opened the door for a bunch of other people.

More recently I’ve received a few messages from people using the free tier who have followed my articles to set things up, and I’ve found myself cringing somewhat, as aspects of the articles were very out of date. They still gave you the general flow, but the screen shots were old. The interface has come a long way, which is great, but as a content creator it’s annoying that every three months things get tweaked and your posts are out of date. 🙂 I promised myself some time ago I would stop re-capturing the screen shots, and even put a note in most articles saying things might look a little different, but now seemed a good time to do some spring cleaning.

First things first, I signed up to the free tier with a new account. I didn’t need to, but I thought it would make sense to work within the constraints of the free tier service.

With that done I set about revamping some of my old articles. In most cases it was literally just capturing new screen shots, but there were a few little changes. Here are the articles I’ve revamped as part of this process.

There are some other things I’m probably going to revisit and new things I’m going to add, but for now I feel a little happier about this group of posts. They’ve been nagging at the back of my mind for a while now.

If you haven’t already signed up for a Free Tier account, there is literally nothing to lose here. If you get stuck, the chat support has been pretty good in my experience, and please send feedback to Oracle. The only way services get better is if there is constructive feedback.

Cheers

Tim…

Cloud : Who are the gatekeepers now?

There’s something you might consider sinister lurking in the cloud, and it might cause a big disruption in who are considered the gatekeepers of your company’s services. I’ve mentioned governance in passing before, but maybe it’s time for me to do some thinking out loud to get this straight in my own head.

In the on-prem world the IT departments tend to be the gatekeepers, because they are responsible for provisioning, developing and maintaining the systems. If you want some new infrastructure or a new application, you have to go and ask IT, so it’s pretty easy for them to keep a handle on what is going on and stay in control.

Infrastructure as a Service (IaaS)

The initial move to the cloud didn’t really change this. Most people who proudly proclaimed they had moved to the cloud were using Infrastructure as a Service (IaaS), and were really just using the cloud provider as a basic hosting company. I’ve never really considered this cloud. Yes, you get some flexibility in resource allocation, but it’s pretty much what we’ve always done with hosting companies. It’s just “other people’s servers”. As far as IaaS goes, the gatekeepers are still the same, because you need all/most of the same skills to plan, setup and maintain such systems.

Platform as a Service (PaaS)

When we start talking about Platform as a Service (PaaS), things start to get a little bit trickier. The early days of PaaS weren’t a great deal different to IaaS, as some of the PaaS services weren’t what I would call platforms. They were glorified IaaS, with pre-installed software you had to manage yourself. With the emergence of proper platforms, which automate much of the day-to-day drudgery, things started to shift. A developer could request a database without having to speak to the DBAs, sysadmins, virtualisation and network folks. You can of course question the logic of that, but it’s an option and there is the beginning of a power shift.

When we start talking about IoT and Serverless platforms things change big-time. The chances are the gatekeeper will be the budget holder, since you will be charged on a per request basis, and probably have to set a maximum spend per unit time to keep things under control. Depending on how your company manages departmental budgets, the gatekeeper could be whoever has some spare cash this quarter…

Software as a Service (SaaS)

Software as a Service (SaaS) possibly presents the biggest challenge for traditional on-prem IT departments, as the business can literally go out and pick the product they want, without so much as a thought for what IT think about it. Once they’ve spent the money, they will probably come to IT and expect them to magic up all the data integrations to make things work as expected. Also, once that money has been spent, good luck trying to persuade people they backed the wrong horse. SaaS puts the business users directly in the driving seat.

Conclusion

It would be naive to think any movement to the cloud (IaaS, PaaS or SaaS) could be done independently of an existing IT department, but the tide is turning.

The IT world has changed. The traditional power bases are eroding, and you’ve got to adapt to survive. Every time you say “No”, without offering an alternative solution, you’re helping to make yourself redundant. Every time you say, “We will need to investigate it”, as a delaying tactic, you’re helping to make yourself redundant. Every time you ignore new development and delivery pipelines and platforms, you are sending yourself to an early retirement. I’m not saying jump on every bandwagon, but you need to be aware of them, and why they may or may not be useful to you and your company.

Recently I heard someone utter the phrase, “you’re not the only hotel in town”. I love that, and it should be a wake-up call for any traditional IT departments and cloud deniers.

It’s natural selection baby! Adapt or die!

Cheers

Tim…

Why Automation Matters : Can’t the cloud do it for you?

One of the comments on my previous post in the series mentioned using the cloud may solve a lot of these issues, implying you don’t have to bother with your own automation. Cursed with the ability to see both sides to any argument, I both agree and disagree with this. 🙂

Cloud providers bring a lot to the table as far as automation is concerned. Firing up new VMs and containers is really simple, and of course platforms such as RDS and the Oracle Autonomous Database family take over many of the operational aspects. So I can forget about automation right? Not so fast…

We typically see demos of cloud services that involve clicking buttons on web pages and it all works and looks great, but it’s not the way we really want to work. We want our infrastructure as code, and you can’t check button presses into your version control. 🙂 Also, if we are promoting self-service in the company, the last thing we want to do is give everyone access to our cloud account.

The cloud providers have got our back here. They allow us to use CLIs, web services and tools like Terraform to define whole chunks of infrastructure based on their services. You can use these tools to create your own self-service portals within your company. But that’s a new bunch of stuff you have to learn to become effective using this platform. It hasn’t freed you up from having to think about automation completely. It’s just altered your focus.
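To make that concrete, here's the sort of thing I mean by driving the cloud from a CLI rather than the web console. These examples use the OCI CLI, and the compartment OCID is obviously a placeholder.

```
# List object storage buckets and database systems in a compartment.
# Scriptable, repeatable, and checkable into version control,
# unlike button presses on a web page.
oci os bucket list --compartment-id ocid1.compartment.oc1..example
oci db system list --compartment-id ocid1.compartment.oc1..example
```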

What’s more, a cloud provider will not be able to provide every solution you need, configured exactly the way you want it. They may provide many of the building blocks or platforms, but you are still going to have to do some of the work yourself, whether it’s application configuration or change management. All of this still needs to be automated if you want to live up to the infrastructure as code mantra.

We also have companies at various stages in the cloud journey to consider. Some companies are still not considering cloud. Some are part way through the journey. Some will almost certainly be running in mixed environments, made up of on-prem and multiple cloud providers, for a long time, or even forever. Automation allows you to abstract some of the working parts, giving you some consistency in these mixed environments.

I think this all comes down to levels. You may never have to install or patch a database again, but that isn’t the whole story as far as automation is concerned.

Check out the rest of the series here.

Cheers

Tim…

Oracle Cloud Infrastructure (OCI) : Create a Compartment, VCN and DB

Having spent time playing on the Autonomous Data Warehouse and Autonomous Transaction Processing services, I kind-of lost sight of the Platform as a Service (PaaS) and Infrastructure as a Service (IaaS) stuff. I had a question recently about running (non-autonomous) databases on Oracle Cloud and I didn’t really have anything of my own to point them at, since my only DBaaS article was on the old “Classic” bit of Oracle Public Cloud. I figured it was about time I did a quick run through of the OCI version of that. This resulted in the following three posts, which are just scratching the surface of course.

At first glance it seems a little more complicated, as there are some prerequisites to think about, but actually it makes a lot of sense. The sales pitch demo of any cloud service is to click a few buttons and everything magically appears, but there is some thought needed in the real world. Defining a reasonable network topology for security, and separation of duties and functional areas are pretty common in most companies. This does feel more sensible, and sets you off on the right foot.

If you need a certain amount of manual control and access to the server, the Database VM approach is fine, and there are also Bare Metal and Exadata services too, but I think my starting position would be the autonomous services, unless I had a specific reason not to go that route. I’m all about doing as little as possible… 🙂

Cheers

Tim…

The first rule of Oracle Cloud Apps is: You do not talk about Oracle Cloud Apps

The wife has written a couple of posts recently (here and here) about the inevitable confusion that results when speaking about Oracle applications and the cloud. It’s really hard to speak about this stuff and know everyone is hearing and understanding what is being said, rather than what they think is being said.

Think about it for a minute.

  • Oracle Cloud Apps – Version 12. You can run them On-Prem, but most people will only ever experience them on the cloud. Not surprisingly, when I say “Oracle Cloud Apps”, this is what I’m talking about. My company is currently moving to Oracle Cloud Apps and we have no EBS.
  • E-Business Suite on the Cloud. Version 12.x. They’re Oracle applications and they run on the cloud, so they are Oracle Cloud Apps right?
  • If you are writing extensions to SaaS using the PaaS features, you are writing Oracle apps in the cloud. These are Oracle Cloud Apps right?
  • E-Business Suite 12.x. They are Oracle Apps and they are at version 12, so they are Oracle Apps 12 right?
  • Fusion Middleware 12c Release 1 or 2. If I’m writing apps on this stack they are Oracle Apps at version 12 right?
  • I can put anything on Oracle Public Cloud. Those are then Oracle Cloud Apps right?
  • All the other applications products and NetSuite etc. They are Oracle Cloud Apps right?

In the above examples I’m being intentionally silly, but I think you get the picture. If you are a little loose with your terminology, description or phrasing it’s really easy to be misunderstood.

What’s more, as individuals we each have a different set of experiences, so we are entering the conversation with some specific context in mind, and kind-of assume everyone understands our context.

Today I had a chat on Twitter with a couple of guys (Andrejs Karpovs‏ and Andrejs Prokopjevs) about my “Oracle Cloud Apps DBA” comments in this post. Both those guys are infinitely more qualified to speak about apps than me, but for a time I think we were speaking at cross purposes. I agree with everything that was said in the context it was said, but we were coming at things from quite different angles, so we seemed to be disagreeing at times. 🙂

It just feeds back into what Debra has been saying about how you have to be super careful when you discuss this stuff, and why she’s started to use the “Oracle Fusion Apps” name again in some conversations. I find myself saying things like, “Oracle Cloud Apps, formerly known as Oracle Fusion Apps”, which is a complete pain in the ass and doesn’t work too well on Twitter. 🙂

Cheers

Tim…

Oracle’s Cloud Licensing Change : Be Warned!

Over the last couple of years I’ve been doing talks about running Oracle databases in the cloud. In both my talks and articles I refer to the Licensing Oracle Software in the Cloud Computing Environment document. This morning I was reading a post on a mailing list and someone mentioned the document had been updated on 23-Jan-2017 and contained two rather interesting changes.

The Good

The document now explicitly mentions the difference between how vCPUs are licensed on different cloud providers. On AWS a vCPU is one Intel hyper-thread, so you need 2 vCPUs to make a real core. Azure does not use hyper-threading on their boxes, so 1 vCPU equals a real core. The previous version of the document did not make this clear, so it read as if you were paying per thread on AWS, even though people who used cloud-savvy partners understood this issue and paid correctly (vCPUs/2 on AWS).

The Bad

The document now says,

“When counting Oracle Processor license requirements in Authorized Cloud Environments, the Oracle Processor Core Factor Table is not applicable.”

Just digest that for a moment. The Intel core factor is 0.5, so an 8 core physical box requires 4 cores of licensing. Now on the cloud, an 8 core VM (16 vCPUs on AWS or 8 vCPUs on Azure) requires 8 cores of licensing.
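The arithmetic is simple enough to sanity check. This little Python sketch is just the sums from the paragraph above, and obviously isn't licensing advice.

```python
def licenses_required(cores, core_factor=1.0):
    """Oracle Processor licenses needed for a given core count."""
    return cores * core_factor

# On-prem: 8 core Intel box, with the 0.5 core factor applied.
on_prem = licenses_required(8, core_factor=0.5)   # 4.0 licenses

# Cloud after 23-Jan-2017: the core factor no longer applies.
cloud = licenses_required(8, core_factor=1.0)     # 8.0 licenses

# AWS counts hyper-threads as vCPUs, so 16 vCPUs = 8 real cores.
# Azure doesn't use hyper-threading, so 8 vCPUs = 8 real cores.
aws_cores = 16 / 2
azure_cores = 8

print(on_prem, cloud, cloud / on_prem)  # 4.0 8.0 2.0
```

Same VM, same workload, twice the licensing cost overnight.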

On 23-Jan-2017 the Intel core factor was removed from the cloud licensing calculation, and overnight your cloud licensing costs doubled! WTF! 🙁

Update: For the new AWS bare-metal service, the core factor *should* still apply.

The same person also pointed out that in a MOS Note (Doc ID 2174134.1), last updated on 20-Aug-2016, Oracle pulled support for the Oracle Multitenant option from AWS EC2. WTF on a bike! 🙁 I assume they mean both non-CDB and Single Tenant (Lone-PDB) are still supported.

The Ugly

The reaction to this is going to be really bad! It’s getting really hard to remain an Oracle fanboy these days!

If you have been to one of my cloud talks over the last couple of years and you are basing your opinion on something you’ve heard me say, please remember things change. For those people I presented to at the UKOUG Database SIG on Tuesday, I’m sorry, but I was 2 days out of date on one slide. I’ve updated my slides and articles to reflect this change!

This is all so completely depressing!

Cheers

Tim…

PS. I’m not saying this policy document overrides your contracts. Just saying this is the most recent policy document produced by Oracle!

PPS. You might want to take a look at page 19 and the addendum on page 23 of this copy of the NoCOUG Journal.

Cloud First (again)

During OpenWorld I wrote about my thoughts on Cloud First, an approach Oracle is taking for some of its products now. A discussion on Oracle-L has sparked this post.

One of things I hoped Cloud First would accomplish was to allow Oracle to fix more of the bugs before they dropped the on-premise release. Let’s look at the current 12.2 timeline.

  • 20th September (approx): The first 12.2 product, Exadata Express, where you get a PDB in a fully managed cloud service, was released at OpenWorld. At least up until a few days ago this service was running 12.2.0.0.3. That doesn’t sound like an on-premise release number to me.
  • 4-5th November: At the end of last week the Database Cloud Service (DBaaS) on Oracle Public Cloud got an update to allow you to provision 12.2.0.1 instances. That sounds kind-of like the version number of a first on-premise release to me. Also, the DBaaS offering is not automatically patched, so Oracle must have a reasonable level of confidence with this release if they are happy to put production DBaaS customers on it. 🙂 There is no installation media on this service, but there is a tarball of the “app/oracle/product/12.2.0/dbhome_1” directory structure in the “/scratch/db/db12201_bits.tar.gz” file.
  • Currently the Database Cloud Service (Virtual Image), which builds a VM with installation media in the “/scratch” directory, does not allow 12.2.0.1 yet. Either they’ve not had time to finish this yet, or they don’t want to make getting the installation media so easy. 🙂
  • 8th November: There has been some limited 12.2 documentation around since the release of Exadata Express, but the “proper” 12.2 documentation was released yesterday. There are still some missing bits, like the install/upgrade manuals, which is not surprising as they are not necessary for Exadata Express or DBaaS.

So as far as I’m concerned, we have only just got a product that resembles an on-premise release now. The meaning of Cloud First will be judged by how long it takes from *now* for the on-premise release to drop. If it happens soon I will be in the “Cloud First has worked out OK” camp. If there is an extended period between now and the on-premise release, I will be switching my allegiance to the conspiracy theory camp. 🙂

Cheers

Tim…

PS. It’s possible there is still some work to do to put together conventional installation media. I have no knowledge of the internal processes at Oracle.