

Oracle Database 19c RAC On Oracle Linux 8 Using VirtualBox and Vagrant

This article describes the "hands-off" installation of Oracle Database 19c RAC on Oracle Linux 8 using VirtualBox and Vagrant with no additional shared disk devices.

If you are wondering why there isn't a GUI version of this installation, please read Why no GUI installations anymore?


TL;DR

If you are comfortable with VirtualBox, Vagrant and RAC you might want to jump straight to the GitHub repository and use the basic instructions here.

Introduction

One of the biggest obstacles preventing people from setting up test RAC environments is the requirement for shared storage. In a production environment, shared storage is often provided by a SAN or high-end NAS device, but both of these options are very expensive when all you want to do is get some experience installing and using RAC. A cheaper alternative is to use virtualization to fake the shared storage.

Using VirtualBox you can run multiple Virtual Machines (VMs) on a single server, allowing you to run both RAC nodes on a single machine. In addition, it allows you to set up shared virtual disks, overcoming the obstacle of expensive shared storage.

Virtual RAC

In previous releases I gave a walkthrough of a manual installation, but that doesn't really make sense anymore, as you shouldn't be installing anything this way, so instead I'm discussing a silent installation. I will be describing the contents of the Vagrant builds, with links to the actual scripts on GitHub, rather than repeating them here.

Before you launch into this installation, here are a few things to consider.

Required Software

Download and install VirtualBox and Vagrant on your PC. If you can't figure out this step, you probably shouldn't be considering a RAC installation.

You will also need to download the 19c grid and database software, along with the latest combo patch for grid. This means you must have an Oracle Support account to complete this installation.

Clone Repository

Pick an area on your PC file system to act as the base for this git repository and issue the following command.

git clone https://github.com/oraclebase/vagrant.git

Copy the Oracle software under the "..../software/" directory. From the "rac/ol8_19" subdirectory, the structure should look like this.

$ tree
.
+--- config
|   +--- install.env
|   +--- vagrant.yml
+--- dns
|   +--- scripts
|   |   +--- root_setup.sh
|   |   +--- setup.sh
|   +--- Vagrantfile
+--- node1
|   +--- scripts
|   |   +--- oracle_create_database.sh
|   |   +--- oracle_db_software_installation.sh
|   |   +--- oracle_grid_software_config.sh
|   |   +--- oracle_grid_software_installation.sh
|   |   +--- oracle_user_environment_setup.sh
|   |   +--- root_setup.sh
|   |   +--- setup.sh
|   +--- Vagrantfile
+--- node2
|   +--- scripts
|   |   +--- oracle_user_environment_setup.sh
|   |   +--- root_setup.sh
|   |   +--- setup.sh
|   +--- Vagrantfile
+--- README.md
+--- shared_scripts
|   +--- configure_chrony.sh
|   +--- configure_hosts_base.sh
|   +--- configure_hosts_scan.sh
|   +--- configure_shared_disks.sh
|   +--- install_os_packages.sh
|   +--- oracle_software_patch.sh
|   +--- prepare_u01_disk.sh
+--- software
|   +--- LINUX.X64_193000_db_home.zip
|   +--- LINUX.X64_193000_grid_home.zip
|   +--- p6880880_190000_Linux-x86-64.zip
|   +--- p30783556_190000_Linux-x86-64.zip
|   +--- put_software_here.txt

$ 

When you clone the repository on Windows it is important you maintain the line terminators. All ".sh" scripts are run inside the Linux VMs, so they need UNIX style line terminators. If your Git client is set to convert all files to Windows style line terminators on a clone/pull, you will run into problems when those scripts are called from Linux.
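One common way to guard against this, regardless of each client's "core.autocrlf" setting, is a ".gitattributes" file in the root of the clone. This is just a sketch of that approach; the alternative global setting is shown as a comment.

```shell
# Sketch: force LF line endings for shell scripts at the repository
# level, so a Windows Git client can't convert them on checkout.
# Place this .gitattributes file in the root of your clone.
cat > .gitattributes <<'EOF'
*.sh text eol=lf
EOF

# Alternatively, disable automatic CRLF conversion before cloning:
# git config --global core.autocrlf false
```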

Amend File Paths

The "config" directory contains an "install.env" file and a "vagrant.yml" file. Between them, these two files contain all the configuration used for this build. You can alter the configuration of the build here, but make sure the combination of the two stays consistent.

At minimum you will have to amend the following paths in the "vagrant.yml" file, providing suitable paths for the shared disks on your PC.

  asm_crs_disk_1: /u05/VirtualBox/shared/ol8_19_rac/asm_crs_disk_1.vdi
  asm_crs_disk_2: /u05/VirtualBox/shared/ol8_19_rac/asm_crs_disk_2.vdi
  asm_crs_disk_3: /u05/VirtualBox/shared/ol8_19_rac/asm_crs_disk_3.vdi
  asm_crs_disk_size: 2
  asm_data_disk_1: /u05/VirtualBox/shared/ol8_19_rac/asm_data_disk_1.vdi
  asm_data_disk_size: 40
  asm_reco_disk_1: /u05/VirtualBox/shared/ol8_19_rac/asm_reco_disk_1.vdi
  asm_reco_disk_size: 20

For example, if you were working on a Windows PC, you might create a path called "C:\VirtualBox\shared\ol8_19_rac" and use the following settings.

  asm_crs_disk_1: C:\VirtualBox\shared\ol8_19_rac\asm_crs_disk_1.vdi
  asm_crs_disk_2: C:\VirtualBox\shared\ol8_19_rac\asm_crs_disk_2.vdi
  asm_crs_disk_3: C:\VirtualBox\shared\ol8_19_rac\asm_crs_disk_3.vdi
  asm_crs_disk_size: 2
  asm_data_disk_1: C:\VirtualBox\shared\ol8_19_rac\asm_data_disk_1.vdi
  asm_data_disk_size: 40
  asm_reco_disk_1: C:\VirtualBox\shared\ol8_19_rac\asm_reco_disk_1.vdi
  asm_reco_disk_size: 20

If you don't alter them, they will get written to "C:\u05\VirtualBox\shared\ol8_19_rac".

Build the RAC

The following commands will leave you with a functioning RAC installation.

Start the DNS server.

cd dns
vagrant up

Start the second node of the cluster. This must be running before you start the first node.

cd ../node2
vagrant up

Start the first node of the cluster. This will perform all of the installation operations. Depending on the spec of the host system, this could take a long time. On one of my servers it took about 3.5 hours to complete.

cd ../node1
vagrant up

Turn Off RAC

Perform the following to turn off the RAC cleanly.

cd node2
vagrant halt

cd ../node1
vagrant halt

cd ../dns
vagrant halt

Remove Whole RAC

The following commands will destroy all VMs and the associated files, so you can run the process again.

cd node2
vagrant destroy -f

cd ../node1
vagrant destroy -f

cd ../dns
vagrant destroy -f
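The halt and destroy sequences above can be wrapped in a small convenience loop. This is just a sketch, assuming you run it from the "rac/ol8_19" directory; it defaults to "halt", and you can pass "destroy -f" to remove everything.

```shell
#!/bin/bash
# Convenience sketch: run the same vagrant action against all three VMs
# in the safe order (node2, node1, dns). Run from the "rac/ol8_19"
# directory. Defaults to "halt"; pass "destroy -f" to remove everything.
action="${1:-halt}"

for vm in node2 node1 dns; do
  if [ -d "$vm" ]; then
    ( cd "$vm" && vagrant $action )
  else
    echo "Skipping $vm: directory not found"
  fi
done
```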

Description of the Build

From here on we will describe the sections in the build process. Remember, all parameters come from the "install.env" and "vagrant.yml" files in the "config" directory.

DNS Server Build

The DNS server build is really simple. You can see an example of the vagrant output I received here. This VM took about 2 minutes to build.

The Vagrantfile contains the definition of the VirtualBox VM that will be built, using the parameters from the vagrant.yml file. The default values produce a VM with the following characteristics.

The last stage of the VM build is to run the setup.sh script, which simply runs the root_setup.sh script.

The root_setup.sh script does the following.

Once the vagrant up command completes you will be left with a functioning DNS server ready for use with your RAC.
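Before moving on, you can sanity check name resolution from inside the DNS VM. The SCAN name shown below is hypothetical; substitute whatever is defined in your "install.env" file.

```shell
# Sanity-check sketch: query the new DNS server from inside its own VM.
# Run from the "rac/ol8_19" directory. The SCAN name is hypothetical;
# substitute the value from your install.env.
scan_name="ol8-19-scan"

if [ -d dns ]; then
  ( cd dns && vagrant ssh -c "nslookup ${scan_name}" )
else
  echo "Run this from the rac/ol8_19 directory."
fi
```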

If you have any problems with the DNS build, don't continue with the RAC nodes until you fix them.

RAC Node 2 Build

The basic OS setup is very similar for both RAC nodes, but node 2 doesn't perform any installation actions, so we need it in place before starting node 1. You can see an example of the vagrant output I received here. This VM took about 5 minutes to build.

The Vagrantfile contains the definition of the VirtualBox VM that will be built, using the parameters from the vagrant.yml file. The default values produce a VM with the following characteristics.

The last stage of the VM build is to run the setup.sh script, which simply runs the root_setup.sh script.

The root_setup.sh script does the following.

Once the vagrant up command completes you will be left with a prepared RAC node 2.

If you have any problems with the node 2 build, don't continue with the node 1 build until you fix them.

RAC Node 1 Build

The basic OS setup is very similar for both RAC nodes, but unlike node 2, the node 1 setup also includes the software installation and configuration actions. Remember, the DNS and node 2 VMs should be running before you start this node. You can see an example of the vagrant output I received here. This VM took about 1.5 hours to build.

The Vagrantfile contains the definition of the VirtualBox VM that will be built, using the parameters from the vagrant.yml file. The default values produce a VM with the following characteristics.

The last stage of the VM build is to run the setup.sh script, which simply runs the root_setup.sh script.

The root_setup.sh script does all of the same actions as the node 2 build described above, but it also includes the installation and configuration steps. Rather than repeat the explanations of the common steps we will just focus on the differences here.

Once the vagrant up command completes you will be left with a fully configured and running 2 node RAC.
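As a quick sanity check, you can list the clusterware resources from node 1 using the standard crsctl tooling. The grid home path below is an assumption based on the usual "/u01" layout; confirm it against your "install.env" file.

```shell
# Sanity-check sketch: list all clusterware resources on node 1.
# Run from the "rac/ol8_19" directory. The grid home is an assumed
# default; confirm it against install.env.
grid_home="/u01/app/19.0.0/grid"

if [ -d node1 ]; then
  ( cd node1 && vagrant ssh -c "sudo ${grid_home}/bin/crsctl stat res -t" )
else
  echo "Run this from the rac/ol8_19 directory."
fi
```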

Extra Notes

I usually try to stick to base releases of Oracle software, so people without an Oracle Support contract can still test the build. That is difficult here as the 19.7 release update (RU) includes a number of fixes related to running Oracle 19c RAC on OL8.2. It's possible to get round some of these by downgrading OpenSSL to a previous release, but that's messy, so I stuck with making the 19.7 patch a requirement.

If you are interested in trying to complete the installation without the 19.7 RU, here are some things to consider. These are not necessary if you apply the patches as part of the installation using the "-applyRU" parameter. Thanks to Abdellatif AG in the comments for pointing this out.

There is an incompatibility with the way the 19.3 GI installer tests for passwordless SSH and the way SCP works on OL8.2. The workaround for this is to do the following on each RAC node. This information came from a private MOS note (Doc ID 2555697.1). Thanks to Simon Coter for helping me with this.

# Before the installation.
mv /usr/bin/scp /usr/bin/scp.orig
echo "/usr/bin/scp.orig -T \$*" > /usr/bin/scp
chmod 555 /usr/bin/scp

# After the installation.
#mv /usr/bin/scp.orig /usr/bin/scp

The 19.3 software was released before OL8 was certified, so it's not on the list of valid distributions. You can get round this by faking the distribution with the following environment variable.

export CV_ASSUME_DISTID=OEL7.6

Even with these workarounds in place the DBCA still fails to create a database because of the bug described in MOS Doc ID 29529394.8, which states there is no workaround.

For anyone determined to try and make the installation work without the 19.7 RU, focus on downgrading OpenSSL. The errors produced will lead you to the following MOS notes, but they are mostly a distraction from the real problem.


Hope this helps. Regards Tim...
