Shadow IT : Low-code solutions can help!

I recently had a bit of a rant on email about the current state of Shadow IT at work. Typically, we don’t know it is happening until something goes wrong, then we’re called in to help and can’t, mostly because we don’t have the resources to do it. My rant went something like this…

“This is shadow IT.

Shadow IT is happening because we are not able to cope with the requirements from the business, so they do it themselves.

We need to stop being so precious about tool-sets and use low-code solutions to give the business the solutions to their problems. This allows us to develop them quicker, and in some cases, let them develop their own safely.”

We are not a software house. We are not the sort of company that can take our existing staff and reasonably launch into microservices this, or functions that. In addition to all the big projects and 3rd party apps we deal with, we also need to provide solutions to small issues, and do it fast.

Like many other companies, we have massive amounts of shadow IT, where people have business processes relying on spreadsheets or Access databases that most of us in IT don’t know exist. As I mentioned in the quote above, this is happening because we are failing! We are not able to respond to their demands. Why?

For the most part we make the wrong decisions about technology stacks for this type of work. We just need simple solutions to simple problems that are quick and easy to produce, and more importantly, easy to maintain.

What tool are you suggesting? The *only* thing we have in our company that is truly up to date at this time, and has remained so since it was introduced into the company, is APEX. It also happens to be a low-code declarative development solution, that most of our staff could pick up in a few days. The *only* tool we have that allows us to quickly deliver solutions is APEX. So why are we not using it, or some other tool like it? IMHO because of bad decisions!

You’re an Oracle guy, and you are just trying to push the Oracle stack aren’t you? No. Give me something else that does a similar job of low-code declarative development and I will gladly suggest that goes on the list too. I’ve heard good things about Power Apps for this type of stuff. If that serves the purpose better, I’ll quite happily suggest we go in that direction. Whatever the tool is, it must be something very productive, which doesn’t require a massive learning curve, and that also gives us the possibility of allowing the business to develop for themselves, in a citizen developer type of way.

It should be noted, we are wedded to Oracle for the foreseeable future for other reasons, so the “Oracle lock-in” argument isn’t valid for us anyway.

So you’re saying all the other development stuff is a waste of time? No. In addition to the big and “sexy” stuff, there are loads of simple requirements that need simple solutions. We need to be able to get these out of the door quickly, and stop the business doing stuff that will cause problems down the line. If they are going to do something for themselves, I would rather it was done with a tool like APEX, that we can look after centrally. I don’t want to be worrying if Beryl and Bert are taking regular backups of their desktops…

Are you saying APEX is only good for this little stuff? No! I’m saying it does this stuff really well, so why are we using languages, frameworks and infrastructure that make our lives harder and slower for these quick-fire requirements? Like I said, it’s not about the specific tool. It’s what the tool allows us to achieve that’s important.

What would you do if you could call the shots? I would take a couple of people and task them with working through the backlog of these little requirements using a low-code tool. It might be APEX. It might be something else. The important thing is we could quickly make a positive impact on the way the company does things, and maybe reduce the need for some of the shadow IT. It would be really nice to feel like we are helping to win the war on this, but we won’t until we change our attitude in relation to this type of request.

So you think you can solve the problem of shadow IT? No. This will always happen. What I’m talking about is trying to minimise it, rather than being the major cause of it.

Cheers

Tim…

The Goal and The DevOps Handbook (again) : My Reviews

The Goal

In my recent review of The Unicorn Project I mentioned several times how much I loved The Phoenix Project. Some of the feedback was that I should take a look at The Goal by Eliyahu M. Goldratt. After all, The Phoenix Project is an adaptation of The Goal.

I had a credit on Audible, which I’ll explain later, so I gave it a whirl.

I don’t know if it was the writing, or the voice acting, but The Goal has so much more personality than The Phoenix Project. I can barely believe I’m saying this after the amount of praise I’ve given to The Phoenix Project over the years.

The Goal is centred around manufacturing. It’s about the productivity issues in a failing factory. Despite being part of the tech industry, I feel the focus on manufacturing actually makes it easier to follow. There’s something about picturing physical products that makes things seem clearer to me. This, and the fact many of these concepts were born out of manufacturing, are no doubt why The Phoenix Project makes repeated references to manufacturing.

I realise some people will prefer The Phoenix Project, because it more closely resembles what they see in their own failing technology organisations, but I’ve changed my opinion, and The Goal is now my favourite of the two.

The DevOps Handbook (Again)

Another thing I mentioned in my review of The Unicorn Project was how much I disliked The DevOps Handbook. That seemed to surprise some people. So much so, I started to doubt myself. I couldn’t bring myself to read it again, so I decided to sign up for Audible and get it as my free book. That way I could listen to it when driving to visit my family at weekends.

I was not wrong about this book. In the comments for The Unicorn Project review, I answered a question about my attitude to The DevOps Handbook as follows.

“I found it really boring. I guess I was hoping it would be more of a reference or teaching aid. I found it really dry and quite uninformative for the most part. It mostly felt like a bunch of people “bigging themselves up”. Like, “When I worked at X, things were terrible, and I turned it around by myself and now things are fuckin’ A!” Similar to this book, I think the important messages could be put across in a tiny fraction of the space.”

There are undoubtedly valuable messages in The DevOps Handbook, but my gosh they make you work hard to find them. If they removed all the dick-waving, there wouldn’t be much left.

Another thing I found annoying was that it didn’t feel like it really related to my circumstances. I work with a load of third party products that I can’t just scrap, much as I’d like to. I found myself thinking these people were probably just cherry-picking the good stuff to talk about, and forgetting the stuff that was harder to solve. I’ve written about this type of thing in this post.

The messages in the “good DevOps books” are universal. They help you understand your own problems and think your own way through to solving them. I don’t think The DevOps Handbook helps very much at all.

So that’s twice I’ve tried, and twice I’ve come to the same conclusion. Stick with The Goal and The Phoenix Project. There are better things to do with your time and money than wasting it on The DevOps Handbook and The Unicorn Project. That’s just my opinion though!

Cheers

Tim…

PS. By the time I had waded through The DevOps Handbook a second time I had already got a new credit for Audible, which is why I tried The Goal on Audible, rather than reading it. I’m glad I did.

PPS. There are a few cringeworthy gender stereotypes in The Goal, but remember when this was written…

Video : Secure External Password Store

Today’s video demonstrates how to use a Secure External Password Store to hold database credentials in a client wallet.

The video is based on this old article.
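As a rough sketch of what the article covers, the setup looks something like this. The wallet location and the TNS alias (my_db_alias) are placeholders, so substitute your own values. You’ll be prompted for the wallet and database passwords.

mkstore -wrl "/u01/app/oracle/wallet" -create
mkstore -wrl "/u01/app/oracle/wallet" -createCredential my_db_alias scott

The client’s sqlnet.ora then needs to point at the wallet and allow it to be used for authentication.

WALLET_LOCATION =
  (SOURCE = (METHOD = FILE)
    (METHOD_DATA = (DIRECTORY = /u01/app/oracle/wallet))
  )

SQLNET.WALLET_OVERRIDE = TRUE

After that, clients on the machine can connect without hard-coding credentials.

sqlplus /@my_db_alias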

The star of today’s video is John King. This is actually his second starring role on the channel. The other one was called Making Dreams Come True: Video for a Superfan. 🙂

Cheers

Tim…

VirtualBox 6.1.4

VirtualBox 6.1.4 has been released.

The downloads and changelog are in the usual places.

I’ve done the installation on my Windows 10 PC at work and all is good. I’ll probably do the installations on my Windows 10, macOS and Oracle Linux 7 hosts at home tonight and update this post.

Happy upgrading!

Cheers

Tim…

Update: I did the upgrades on my Windows 10, macOS and Oracle Linux 7 hosts at home. Everything went fine, and it all looks good for now.

Data Pump Between Database Versions : It’s not just about the VERSION parameter! (Time Zone Files)

I was doing a small “quick” data transfer between two servers. The source was 19c and the destination was 18c, so I used the VERSION parameter during the export.

expdp … version=18 directory=…

The export went fine, but when I started the import I immediately got this error.

ORA-39002: invalid operation

A little Googling and I came across MOS Doc ID 2482971.1. In short, the time zone file was different between the two databases.

No problem. I know how to fix that, and both databases had the January quarterly patches applied, so the latest time zone files would be available, right? Wrong. The 18c database was already at the maximum time zone file version that was installed, and I needed to be one version higher to match the 19c database.
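If you want to see where you stand before attempting the import, comparing the time zone file versions of the two databases is easy enough. Something like the following on each database will do it.

select * from v$timezone_file;

select property_name, property_value
from   database_properties
where  property_name like 'DST%';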

After some Googling I re-found MOS Doc ID 412160.1. As soon as I opened it I remembered it. I find this note really messy and confusing, but the section labelled “C.1.d) DST patches list” had the list of patches, which is what I needed. I downloaded the patch to match the time zone file version of the source system and applied it with OPatch in the normal way. Always read the patch notes!!!

Once the new time zone file was in place, it was time to update it in the database. I’ve written about this before.
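For completeness, the database update itself is the usual DBMS_DST routine, which is described properly in the article linked above. The outline below is just a rough sketch from memory, with the target version (34 here) being a placeholder for whatever version you’ve just installed, so check the article and the MOS notes before using it. I believe recent versions also ship the utltz_upg_check.sql and utltz_upg_apply.sql scripts, which wrap most of this up for you.

shutdown immediate;
startup upgrade;

exec dbms_dst.begin_upgrade(new_version => 34);

shutdown immediate;
startup;

set serveroutput on
declare
  l_failures  pls_integer;
begin
  dbms_dst.upgrade_database(l_failures);
  dbms_output.put_line('upgrade_database failures: ' || l_failures);
  dbms_dst.end_upgrade(l_failures);
  dbms_output.put_line('end_upgrade failures: ' || l_failures);
end;
/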

Once the time zone file versions matched, the import worked as expected. Although the small data transfer that I expected to be quick had turned into a much bigger job. 🙂

I can’t remember hitting this issue before, so I guess I’ve just been lucky with the time zone file versions matching. This note is to remind myself that it’s not just about the VERSION parameter! I’ve also updated a couple of articles with pointers about this.

Cheers

Tim…

PS. It seems the later releases are more sensitive to time zone file differences than previous releases.

Cloud Control 13.4 : Silent Installation and Silent Upgrade

A little over a week ago Enterprise Manager Cloud Control 13.4 was released. The following weekend I spent 3 days running builds constantly trying to get a clean install to work. Eventually I tweeted out in frustration and a friendly face at Oracle, who I’ve stalked on numerous occasions, put me in touch with the EM dev team.

Having had a quick look at my Vagrant build, they suggested I unset the CLASSPATH environment variable, and a working build was born. Thanks very much to the EM dev team! Without them I would have spent days looking at it and would probably still have failed.
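In case it saves anyone else some time, the fix boiled down to making sure CLASSPATH was empty in the shell session before launching the installer. The installer file name and response file path below are just examples.

unset CLASSPATH
./em13400_linux64.bin -silent -responseFile /u01/software/new_install.rsp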

Installation

The resulting Vagrant build and an article about the silent installation of Cloud Control 13.4 can be found here.

One thing that still irks me somewhat is the documentation about the adaptive optimizer parameters. The documentation says the following.

“If your Management Repository is using Oracle Database 12.2 or higher, none of these parameters need to be set.”

This is not true, and you always get this error message.

“ERROR:
The following prerequisite check failed because the Oracle Database, where the Management Repository will be configured, does not meet the configuration requirements. Fix the issue manually based on the recommendation offered for this prerequisite, and click OK. For more details, check the logs: /u01/app/oracle/middleware/cfgtoollogs/oui/emdbprereqs
Prereq Name Recommendation
Check if all adaptive features parameters are unset All adaptive features parameters should be unset for improved SQL performance”

I even tried a GUI installation, in case there was a difference between the GUI and silent installations. There wasn’t.

The workaround is to set a bunch of underscore parameters that are only meant to be necessary when running a patched version of Oracle Database 12.1 as the repository database.

alter system set "_optimizer_nlj_hj_adaptive_join" = FALSE scope=both sid='*';
alter system set "_optimizer_strans_adaptive_pruning" = FALSE scope=both sid='*';
alter system set "_px_adaptive_dist_method" = OFF scope=both sid='*';
alter system set "_sql_plan_directive_mgmt_control" = 0 scope=both sid='*';
alter system set "_optimizer_dsdir_usage_control" = 0 scope=both sid='*';
alter system set "_optimizer_use_feedback" = FALSE scope=both sid='*';
alter system set "_optimizer_gather_feedback" = FALSE scope=both sid='*';
alter system set "_optimizer_performance_feedback" = OFF scope=both sid='*';

It’s not a show stopper, so I can live with it, but it’s annoying, and the documentation should be altered to reflect the reality.

Upgrade

The next challenge was to work through an upgrade from a previous release. I worked through this using a starting point of 13.3. I already had a Vagrant build for 13.3, but I made a few changes to bring it up to date and add some more disk space. I also renamed the directory structure to make things a little neater.

The upgrade itself was very similar to that of the previous version. You can find the article about the silent upgrade to 13.4 and the Vagrant build I used to test the upgrade here.

Now remember, this is a simple upgrade of a totally clean 13.3 build to 13.4, so I’m not saying this is an exhaustive test, and I’m not saying this is proof it will work for you.

Next Steps

The next challenge will be to try a real upgrade at work. Work is crazy at the moment, so I’m not sure how long I will have to wait before doing this.

Most of our kit is VMware virtual machines running Oracle Linux, and the Cloud Control server is no exception, so I can get a backup of the whole VM before the upgrade, and just restore back to that in case of a disaster.

An ideal place to be is to have your build scripted, including the reconfiguration of all your targets. After a previous “issue”, I went through our existing config and built the EMCLI scripts to replace it all. I *think* I can rebuild everything from scratch if I need to. We do all new agent installations, target discovery and setup using EMCLI now, so I think all the retrofitted stuff will work too, but I have to admit I’m kind-of scared to try. 🙂
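To give a flavour of what that EMCLI scripting looks like, adding a database target boils down to commands along these lines. The names, credentials and paths are obviously placeholders.

emcli login -username=sysman
emcli sync
emcli add_target \
  -name="orcl" \
  -type="oracle_database" \
  -host="server1.example.com" \
  -credentials="UserName:dbsnmp;password:MyPassword1;Role:Normal" \
  -properties="SID:orcl;Port:1521;OracleHome:/u01/app/oracle/product/19.0.0/dbhome_1;MachineName:server1.example.com"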

Conclusion

I don’t like to do anything at work unless I’ve already done it at home first. It’s taken me pretty much 5 full days (Fri, Sat, Sun, Fri, Sat) to get through this, but it’s done now, and I feel I can have a try at work without looking like a total fool now! 🙂

Cheers

Tim…

MobaXterm 20.0 and KeePass 2.44

And in other news about things I’ve missed recently…

MobaXterm 20.0 was released a couple of days ago. It looks like they’ve switched across to the yearly naming like many other companies. 🙂

The downloads and changelog are in the usual places.

If you are working on Windows and spend a lot of time in shells for connections to Linux boxes, you need this in your life!

KeePass 2.44 was released nearly a month ago.

The downloads and changelog are in the usual places.

You can read about how I use KeePass and KeePassXC on my Windows, Mac and Android devices here.

Happy days!

Cheers

Tim…

Video : Multitenant : Online Move of Datafiles in CDBs and PDBs

Today’s video is a quick look at online datafile moves in container databases (CDBs) and pluggable databases (PDBs).

If you’ve used this functionality in a non-CDB database, it’s going to look familiar, but there is a PDB-specific gotcha.

These articles discuss moving and renaming files.
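The syntax itself is the same as the non-CDB case, and the thing that tends to catch people out is that a PDB’s datafiles are moved from inside the PDB itself. A minimal example, with made-up paths, looks like this.

alter session set container = pdb1;

alter database move datafile '/u01/oradata/CDB1/pdb1/my_ts01.dbf'
  to '/u02/oradata/CDB1/pdb1/my_ts01.dbf';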

I’ve added this to my Multitenant YouTube playlist.

The star of today’s video is Todd Trichler, but he’s having to share the limelight with the top of Roel Hartman’s head, and brief clips of John King and Debra Lilley on the video screen behind him.

Cheers

Tim…

Oracle Database 20c : Cloud Preview, Docs and Desupport

A little while ago Dominic Giles tweeted about the release of an Oracle Database 20c preview on Oracle Cloud and the Oracle Database 20c documentation. Some lucky people have already deployed the 20c preview. 🙂

Should we upgrade ASAP?

Dominic was quick to point out 19c is the long term support (LTS) release, and your focus should be to upgrade to that release. You should probably only upgrade to 20c if you really need some of the functionality it delivers, and are prepared to upgrade and patch regularly until you hit the next long term support release, which is likely to be 22c according to a slide from Sangham 2019, posted on Twitter by Patrick Jolliffe. 🙂

Most people will probably jump between LTS releases every 3-4 years.

I only care about LTS releases, so 20c is irrelevant right?

Wrong! {in flashing red lights}

It’s important to check out what is happening in the 20c release, because it may alter how you use the earlier releases now. There is no point launching into a new development using a feature that is about to disappear. Remember Oracle Streams anyone?

I’ve been banging on about Multitenant for over 6 years now, and I know a lot of people out there have stuck with the non-CDB architecture. If your intention is to jump between LTS releases, you need to get your CDB/PDB-foo up to scratch before the next LTS release, because as of 20c, the non-CDB architecture has gone.

What should I do?

Take a look at this section of the Upgrade Manual.

Just scan down it to see if anything stands out as problematic for you. There are sections for 12.2, 18c and 19c too, if you are starting from further back. Think about the impact of this stuff on new and existing database deployments.

My advice: stop using deprecated features ASAP. Start your migration away from them before you have to start worrying about upgrades.

Hopefully this will stop you making some bad decisions!

What stood out to you Tim?

I went on a Twitter frenzy as I was reading this section. Sorry about the spam if you follow me. 🙂

This is what jumped out at me. I’m not saying these all affect me, but they are interesting to me.

“Starting with Oracle Database 20c, Oracle Database is only supported using the multitenant architecture.”

I hope you knew this before this post. If this fills you with dread, don’t panic. I have articles here, and a YouTube playlist here. It’s going to be OK. You will get through this trauma! 🙂
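If you’re not sure where your existing databases stand, it’s a one-liner to check whether a database is already using the multitenant architecture.

select name, cdb, con_id from v$database;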

“Traditional auditing is deprecated in Oracle Database 20c. Oracle recommends that you use unified auditing, which enables selective and more effective auditing inside Oracle Database.”

I’m not sure how this affects me. Further investigation is needed.
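As a starting point for that investigation, it’s easy to check whether pure unified auditing is enabled, and which unified audit policies are currently active.

select value from v$option where parameter = 'Unified Auditing';

select * from audit_unified_enabled_policies;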

“Starting with Oracle Database 20c, older encryption and hashing algorithms contained within DBMS_CRYPTO are deprecated.”

This is giving me a bit of a panic attack. I don’t know how big an impact this is. It’s a deprecation notice, not a desupport notice, so there is still time… Maybe…

“Starting with Oracle Database 20c, Transport Layer Security protocol version 1.0 (TLS 1.0) is deprecated.”

When SSLv3 got pulled it killed us. Why? Because too many people were complacent and not willing to patch/upgrade their systems. As a result, on the day some of our external services turned off SSLv3, internal stuff broke and panic patching started! Patching without planning or testing.

Once again, this is a deprecation notice, so you’ve got time to start doing the right thing, but don’t leave it to the last minute. I always say it takes work to remain stationary in tech. You are swimming upstream and just keeping at the same spot takes effort. If you don’t make that effort, you are just floating downstream.

“Starting in Oracle Database 20c, the package DBMS_OBFUSCATION_TOOLKIT is desupported, and replaced with DBMS_CRYPTO.”

I’m hoping I’ve got everything moved across to DBMS_CRYPTO, but who knows?
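If, like me, you’re not sure, a quick look at the dependency information should flag up any stragglers that still reference the old package.

select owner, name, type
from   dba_dependencies
where  referenced_name = 'DBMS_OBFUSCATION_TOOLKIT';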

“Starting in Oracle Database 20c, the Large Object (LOB) features DBMS_LOB.LOADFROMFILE and LOB buffering are desupported.”

Some time ago someone pointed out the deprecation notice in a previous release and I revisited all my website stuff (I think). It’s pretty easy to move to loadblobfromfile and loadclobfromfile, but that’s a piece of work and some testing that needs doing!
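For anyone facing the same piece of work, the switch is fairly mechanical. Here’s a rough sketch using a made-up directory object, file and table, so adjust it to your own schema before trusting it.

declare
  l_bfile        bfile := bfilename('MY_DIR', 'image1.jpg');
  l_blob         blob;
  l_dest_offset  integer := 1;
  l_src_offset   integer := 1;
begin
  insert into my_documents (id, doc)
  values (1, empty_blob())
  returning doc into l_blob;

  dbms_lob.fileopen(l_bfile, dbms_lob.file_readonly);
  dbms_lob.loadblobfromfile (
    dest_lob    => l_blob,
    src_bfile   => l_bfile,
    amount      => dbms_lob.lobmaxsize,
    dest_offset => l_dest_offset,
    src_offset  => l_src_offset);
  dbms_lob.fileclose(l_bfile);
  commit;
end;
/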

“Starting with Oracle Database 20c, the Oracle Grid Infrastructure feature Automatic Storage Management Cluster File System (Oracle ACFS) is desupported with Microsoft Windows”

This doesn’t affect me, but when you look at this alongside all the other ACFS deprecation and desupport notices it makes rather grim reading. On the one hand I’m thinking ACFS is for the chopping block, but on the other hand there are new features. Who knows what’s going on here?

“Desupport of Vendor Clusterware Integration with Oracle Clusterware”

I only included this one because it took me back to the glory days of Oracle 9i RAC on Tru64 and TruCluster. For quite some time since then, combining Oracle Clusterware with other clustering solutions has resulted in a clusterf*ck!

“Starting in Oracle Database 20c, the IGNORECASE parameter for the orapwd file is desupported. All newly created password files are case-sensitive.”

If I’m honest, I kind-of forgot this was possible. I think I only have one project that still uses case-insensitive passwords generally, and as of December last year, that is no longer necessary. I can’t remember needing case-insensitive passwords in the password file.
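If you do still have password files created with IGNORECASE=Y lying around, recreating them without it is a one-liner (the file name here is just an example).

orapwd file=$ORACLE_HOME/dbs/orapwORCL force=y format=12.2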

Is that it?

No, but it’s what stood out to me. Check the documentation for yourself and see what stands out for you.

What if it makes me depressed?

Check out the New Features Guide and look at all the new stuff you get to play with. Anyone want a new JSON data type with better performance? 🙂

Cheers

Tim…

Video : Oracle : Silent Installation and Database Creation

In today’s video we’ll take a look at the two sections of a database build that people often use a GUI for: the software installation using the Oracle Universal Installer (OUI), and the database creation using the Database Configuration Assistant (DBCA).
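To give you an idea of the shape of these, stripped right back they look something like the following. The paths, names and passwords are placeholders, and the articles linked below cover the full parameter lists.

./runInstaller -ignorePrereq -waitforcompletion -silent \
  -responseFile /u01/software/db_install.rsp

dbca -silent -createDatabase \
  -templateName General_Purpose.dbc \
  -gdbname cdb1 -sid cdb1 \
  -createAsContainerDatabase true \
  -numberOfPDBs 1 -pdbName pdb1 \
  -sysPassword SysPassword1 -systemPassword SysPassword1 \
  -pdbAdminPassword PdbPassword1 \
  -datafileDestination /u02/oradata \
  -characterSet AL32UTF8 \
  -totalMemory 2048 \
  -storageType FS \
  -emConfiguration NONE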

I recently wrote a post called Why no GUI installations anymore? I was surprised at how much comeback I got from that. This video isn’t a tutorial, but more of a taster to let people who are new to the subject see what they could be doing, rather than clicking buttons. 🙂

The video is based on the following articles.

If you want to see a more complete example of an automated build that uses these, you can check out this video.

If you want to get your hands dirty with automated builds, and you really should, you can play around with my Vagrant and Docker builds here.

There are lots of other people with builds on GitHub, including Oracle, so just play around with as many as you like. 🙂

The star of today’s video is David Hollenberger. I think he was a bit surprised when some random guy asked him for a random video clip. 🙂

Cheers

Tim…