Backing up Oracle Databases to cloud storage

Peter Goldthorp, Dito. January 2022

Dito-developed scripts can be used to back up an Oracle database and transport the backup files to and from a Cloud Storage bucket.

GCP Setup

  1. Follow the instructions in the gsutil install doc to install gsutil

  2. Check that gsutil has a “compiled crcmod” entry

    gsutil rsync uses hashes when modification time is not available at both the source and destination. Verify your crcmod installation is using the module’s C extension.

    Use gsutil version -l to check the crcmod status:

     $ gsutil version -l
     compiled crcmod: True

    Install if necessary:

     sudo yum install gcc python3-devel python3-setuptools redhat-rpm-config
     sudo pip3 uninstall crcmod
     sudo pip3 install --no-cache-dir -U crcmod
  3. Set up account credentials in the oracle account

    • Authenticate as a user with admin privileges in your GCP project
        gcloud auth login
    • Optionally set environment variables to simplify service account creation. Substitute your own values for the items in square brackets:
        gcloud config set project [project-id]
        export GCP_PROJECT=$(gcloud config get-value project)
        export GCS_BUCKET=gs://[cloud-storage-bucket-name]
        export GCP_SERVICE_ACCOUNT=[new-service-account-name]
        ### Example:
        gcloud config set project my-gcp-project
        export GCP_PROJECT=$(gcloud config get-value project)
        export GCS_BUCKET=gs://my-bms-rman-backups
        export GCP_SERVICE_ACCOUNT=bms-rman-backups
    • Create a Cloud Storage bucket to store your backups:
        gsutil mb -c nearline -l us -p $GCP_PROJECT --pap enforced $GCS_BUCKET
    • Create a service account to write backup files to Cloud Storage:
        gcloud iam service-accounts create $GCP_SERVICE_ACCOUNT
    • Give the service account object admin access to Cloud Storage:
        gcloud projects add-iam-policy-binding $GCP_PROJECT \
        --member "serviceAccount:${GCP_SERVICE_ACCOUNT}@${GCP_PROJECT}.iam.gserviceaccount.com" \
        --role "roles/storage.objectAdmin"
    • Download a service account key:
        gcloud iam service-accounts keys create rman-backup-sa-key.json \
        --iam-account "${GCP_SERVICE_ACCOUNT}@${GCP_PROJECT}.iam.gserviceaccount.com"
    • Activate the downloaded service account key:
        gcloud auth activate-service-account --key-file ./rman-backup-sa-key.json
    • Locate the .boto file

        Activating the service account creates a .boto file in a subdirectory of the gcloud legacy_credentials directory named for the service account. Its gs_service_key_file entry records the key file location:

        ls /home/oracle/.config/gcloud/legacy_credentials/
        cat /home/oracle/.config/gcloud/legacy_credentials/
        gs_service_key_file =
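For reference, the IAM member string used in the policy binding above is built from the service account's email address. A minimal sketch, using the example project and service-account names from earlier:

```shell
# Build the fully qualified IAM member string for a service account.
# Example values from the setup steps above; substitute your own.
GCP_PROJECT=my-gcp-project
GCP_SERVICE_ACCOUNT=bms-rman-backups
SA_EMAIL="${GCP_SERVICE_ACCOUNT}@${GCP_PROJECT}.iam.gserviceaccount.com"
echo "serviceAccount:${SA_EMAIL}"
```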

RMAN Script Setup

  1. Copy the files to the source database host machine and review their contents. Script descriptions:

    The scripts perform the following functions:

      • Perform a full (level 0 or 1) database backup using RMAN
      • Perform an incremental (level 2) database backup using RMAN
      • Run gsutil rsync recursively between a source directory and a GCS bucket
      • Run gsutil rsync recursively between a source GCS bucket and a target directory
  2. Edit the script file to include the .boto file location and the gsutil path

  3. Create backup directories

     mkdir -p /mnt/disks/fast-recovery/backups/full
     mkdir -p /mnt/disks/fast-recovery/backups/inc1
     mkdir -p /mnt/disks/fast-recovery/backups/arch
     mkdir -p /mnt/disks/fast-recovery/backups/logs
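The same layout can be created with a loop. A sketch, using a temporary directory as a stand-in for the real mount point:

```shell
# Create the backup directory layout. BASE is a temporary stand-in here;
# on the database host it would be /mnt/disks/fast-recovery/backups.
BASE=$(mktemp -d)/backups
for d in full inc1 arch logs; do
  mkdir -p "$BASE/$d"
done
ls "$BASE"
```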

Configure a crontab to back up the database

Configure a crontab for the oracle account

  1. Make sure the backup and sync scripts include the Oracle binaries in their runtime PATH. The scripts assume a call to . /home/oracle/.bash_profile will do that; they may require modification, for example by adding the path to the dbhome executable: export PATH=$PATH:/usr/local/bin

  2. Create crontab entries. For example, the following entries schedule a level 0 backup every Sunday night, level 1 backups Monday through Saturday, archive log backups every 3 hours, and a sync to Cloud Storage every 15 minutes.

    Example: crontab -e

     00 01 * * 0 /home/oracle/cron/ 0 2 2 /mnt/disks/fast-recovery/backups
     00 01 * * 1-6 /home/oracle/cron/ 1 2 2 /mnt/disks/fast-recovery/backups
     0 */3 * * * /home/oracle/cron/ 2 3 /mnt/disks/fast-recovery/backups
     */15  * * * * /home/oracle/cron/  /mnt/disks/fast-recovery gs://dito-ora-backup/fast-recovery-area
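The PATH change described in step 1 can be sketched as follows; the ORACLE_HOME location is a hypothetical example:

```shell
# Extend PATH the way the cron scripts expect: OS utilities plus the
# Oracle binaries. ORACLE_HOME here is an example location only.
export ORACLE_HOME=/u01/app/oracle/product/19.0.0/dbhome_1
export PATH=$PATH:/usr/local/bin:$ORACLE_HOME/bin
echo "$PATH"
```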

Duplicate the database in another location

  1. Follow the GCP Setup steps above
  2. Use the script to copy files from Cloud Storage to the fast recovery area


     ./ gs://dito-ora-backup/fast-recovery-area /mnt/disks/fast-recovery
  3. Shut down the database and start it in NOMOUNT mode
     sqlplus / as sysdba
     SQL> shutdown immediate
     Database closed.
     Database dismounted.
     ORACLE instance shut down.
     SQL> startup nomount
     ORACLE instance started.
     Total System Global Area 1157625952 bytes
     Fixed Size                  9685088 bytes
     Variable Size             587202560 bytes
     Database Buffers          553648128 bytes
     Redo Buffers                7090176 bytes
     SQL> exit
  4. Use RMAN to duplicate the database
     rman auxiliary /
     RMAN> duplicate database to 'XE' backup location '/mnt/disks/fast-recovery/backups/' nofilenamecheck;
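The duplicate can also be scripted by writing the RMAN commands to a command file and passing it to rman (for example with cmdfile). A sketch, using a temporary file as a stand-in for a real command-file path:

```shell
# Write the DUPLICATE commands to a command file. A wrapper could then
# run something like: rman auxiliary / cmdfile="$CMDFILE"
CMDFILE=$(mktemp)
cat > "$CMDFILE" <<'RCV'
duplicate database to 'XE'
  backup location '/mnt/disks/fast-recovery/backups/'
  nofilenamecheck;
RCV
cat "$CMDFILE"
```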


Troubleshooting

Crontab not running

  1. Verify the cron daemon is running
  2. Check your crontab entries using crontab.guru

Cron jobs failing

Review the log file entries in /var/log/cron.

Run the following commands to create a wrapper script for testing other scripts. It runs a script in its own shell, in a similar way to a cron job.

cat <<EOF >>

. "\$1"
exec /usr/bin/env -i "\$SHELL" -c ". \$1; \$2"
EOF

chmod +x

Use the script to check individual scripts before adding them to the crontab. Example:

./ /home/oracle/cron/

Note: this only works for scripts that do not accept command line arguments.
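The env -i technique can be demonstrated end to end with temporary stand-ins for the profile and the script under test:

```shell
# Simulate the wrapper: source a profile, then run a script in a mostly
# empty environment, the way cron would. Temporary files stand in for
# /home/oracle/.bash_profile and a real cron script.
PROFILE=$(mktemp)
SCRIPT=$(mktemp)
echo 'DEMO_VAR=hello' > "$PROFILE"
echo 'echo "DEMO_VAR=$DEMO_VAR"' > "$SCRIPT"
OUT=$(/usr/bin/env -i /bin/sh -c ". $PROFILE; . $SCRIPT")
echo "$OUT"
```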

Copyright © Dito LLC, 2023