This post is part of a series of blog posts on the Oracle Autonomous Database.
The first thing to consider when migrating to the Oracle Autonomous Database is how to load and move your data into it. There are several options. The traditional tools that you are accustomed to using, such as Oracle Data Pump and SQL*Loader, can be used to load data into the Autonomous Database.
When it comes to moving data into the Autonomous Database, Oracle supports many different data formats, from SQL*Loader text files, to export/import dump files, to CSV and JSON files. When it comes to supporting object storage, ADB supports a few third-party cloud object storage platforms, such as AWS S3 and Azure Blob Storage, as well as, of course, Oracle’s own Object Storage. ADB also supports moving data from an OCI application layer, as well as from virtual machines that may be hosted in your cloud tenancy.
When you look at migrating an existing database to the Autonomous Database, it’s important to realize that a physical database cannot simply be migrated as-is. It is not possible to migrate by unplugging a PDB from an on-premises container database and plugging it into an Autonomous Database; this is not a supported option as of now.
Also, keep in mind that any legacy features that are not supported must be removed first, for example legacy-style LOBs. In addition, Data Pump export/import must be used, which requires a source Oracle database of version 10.1 or higher.
So when we look at approaches to moving data into the Autonomous Database, data can obviously be moved with Data Pump. The nice thing about this approach is that it eliminates legacy formats, it can upgrade across versions, and it’s platform-independent. It’s a very portable way to load data into a pre-created, new Autonomous Database.
Another thing to keep in mind is that GoldenGate can be used. GoldenGate replication can keep the on-premises or non-autonomous database up and running and online during the process, meaning that you can have a zero-downtime migration. Keep in mind that GoldenGate does have certain restrictions, for example around ROWID and nested tables. Migrating third-party applications that use the Oracle database is also supported, but it’s important to ensure that the vendor supports the Autonomous Database. So while this may be technically possible, it is definitely important to work with the vendor.
Talking about Oracle GoldenGate, it is possible to set up GoldenGate between your on-premises or other source system and the Autonomous Database as the target database. Currently, only non-integrated Replicats are supported with the Autonomous Database, and the GoldenGate software release required on the source system is Oracle GoldenGate for Oracle Database release 12.3.0.1.2 or later. For GoldenGate, the Oracle Autonomous Database cannot be a source database, only a target, so please keep this in mind for rollback purposes.
When we look at the different methods that are available to us, we support many, many approaches: the SQL Developer Data Import Wizard, the SQL Developer Migration Workbench, Oracle Data Pump, SQL*Loader, database links, the DBMS_CLOUD package, external tables, Data Sync, as well as other ETL tools. These are all approaches that can be used, and there are going to be certain situations where one will work better than another.
Often these approaches involve a combination of tools. For example, the SQL Developer Data Import Wizard is a very powerful tool for loading data or flat files into the Autonomous Database, but in order to work with large data sets, I would strongly advise that you work with Oracle’s Cloud Object Storage. The SQL Developer Migration Workbench is a very powerful tool for migrating third-party databases, and it can work with the Object Storage as well. Oracle does provide a plugin to allow you to migrate from other databases, such as SQL Server and Amazon Redshift.
Data Pump, as I mentioned previously, is definitely a powerful tool and a recommended one, especially when migrating from different platforms. There will be situations where it may be the only approach, depending on the platform you’re coming from and the data source. But obviously, it’s an Oracle-to-Oracle solution.
So let’s look at loading data into the Autonomous Database. The DBMS_CLOUD package is the preferred method, because with this tool Oracle gives you the ability to load various types of data, whether it be a CSV file, an Avro file, a JSON file, or just a regular text file. The DBMS_CLOUD package permits you to perform these inserts and data loads fairly efficiently.
Basically, what’s required is that you copy your files, in whatever format they are, to the Oracle Object Storage, and then run the DBMS_CLOUD package from within your Autonomous Database to load that data. And yes, the volume of data that this supports is unlimited. We can work with very large data volumes to perform that operation, and the load runs in the database, independently of the client you’re connected through.
Loading using SQL*Loader is definitely supported, and it’s a good approach for loading data that’s sitting on a local client machine, since it works over the network: you can transfer your data without having to set up the Object Storage. But do keep in mind that because you will be going over your network and running from a client machine, the volume of data you can practically transfer is limited.
And then when it comes to the Import Wizard in SQL Developer, this tool is very powerful. It can import many types of files, and it’s good for importing XLS files if you wish; keep in mind that this is limited by the number of rows supported in the Excel version you’re running. This approach is good for loading data from your local desktop. One of the most powerful features of this utility is that it gives you a very good understanding of what your data looks like and how it is going to be mapped when it gets loaded into the Autonomous Database. So it’s a very good tool for setting up a sandbox or a POC and determining how your upgrade or data migration is going to work; it helps with the mappings, etc.
When it comes to the Object Storage, there’s a package called DBMS_CLOUD, and that package is a wrapper on top of Oracle’s external table functionality. It simplifies defining how you’re going to load your data. We’re going to drill into some of the procedures it includes, COPY_DATA being the one for loading your data. So keep that in mind: when it comes time to load your data using the DBMS_CLOUD package, COPY_DATA is the procedure you will use for actually loading the data into the database.
ADB APIs for Object Store Access
- PL/SQL package, DBMS_CLOUD
- Wrapper on top of Oracle external table functionality
- Simplifies creating external tables on object store files
- No need to type complicated “create external table” commands
- Data loading API: DBMS_CLOUD.COPY_DATA
- External table API for queries: DBMS_CLOUD.CREATE_EXTERNAL_TABLE
DBMS_CLOUD is an interface to help you work with your Object Storage from your Autonomous Database. Because, remember, you do not have access to the local file system in the Autonomous Database, so you rely upon that package to perform these tasks.
In addition to DBMS_CLOUD, we have the DBMS_CLOUD_ADMIN package, which is only available on Autonomous Database Shared. The reason for this is that on Autonomous Dedicated these operations can be run natively through the database: you can type the commands CREATE DATABASE LINK, DROP DATABASE LINK, and so on. With Autonomous Shared, you do not have that ability; those capabilities are locked down from end users. So creating or dropping a database link, enabling or disabling application continuity, and granting tablespace quotas are all done through this package.
DBMS_CLOUD_ADMIN package is only available for the ADMIN user. To run it as a user other than ADMIN, the user must be granted EXECUTE privileges on the specific procedures, e.g. CREATE_DATABASE_LINK
Use GRANT_TABLESPACE_QUOTA to explicitly grant a quota to a user if they need to load data.
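As a sketch of both points, run as ADMIN (the user name HR_USER and the 10G quota are illustrative, not from Oracle's documentation):

```sql
-- Allow a non-ADMIN user to run DBMS_CLOUD_ADMIN procedures
GRANT EXECUTE ON DBMS_CLOUD_ADMIN TO hr_user;

-- Explicitly grant the user a tablespace quota so they can load data
BEGIN
  DBMS_CLOUD_ADMIN.GRANT_TABLESPACE_QUOTA(
    username         => 'HR_USER',
    tablespace_quota => '10G');
END;
/
```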
We also have a package available on Autonomous Shared called DBMS_CLOUD_MACADM, which is used for configuring Oracle Database Vault on ADB. Basically, you grant the DV_OWNER and DV_ACCTMGR roles to designated users; once those roles have been granted, those users can then enable, disable, and manage Database Vault.
Now, remember, DV_OWNER cannot be the ADMIN user; you cannot grant DV_OWNER to ADMIN, and this is one way of enforcing separation of duties. Also keep in mind that enabling or disabling Database Vault requires the Autonomous Database instance to be restarted. Database Vault cannot be controlled by the ADMIN user; however, the roles to perform these operations are granted through the ADMIN user to a specific set of users. So it’s a mechanism to control and separate duties.
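The flow looks roughly like this (the user names below are illustrative, and the instance must be restarted afterwards for the change to take effect):

```sql
-- As ADMIN: designate the Database Vault owner and account manager users
BEGIN
  DBMS_CLOUD_MACADM.CONFIGURE_DATABASE_VAULT(
    dvowner_uname   => 'adb_dbv_owner',
    dvacctmgr_uname => 'adb_dbv_acctmgr');
END;
/

-- Then enable Database Vault and restart the Autonomous Database instance
BEGIN
  DBMS_CLOUD_MACADM.ENABLE_DATABASE_VAULT;
END;
/
```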
When it comes to loading data with the Object Storage, the Object Storage provides a very fast and low-cost model for storing data, and it’s very good for staging data. It’s also possible to query the contents of the Object Storage directly from your Autonomous Database. The nice thing about using the Object Storage is that it simplifies performing ETL- and ELT-type operations. It allows you to query data in place, without physically moving it from the Object Storage into the Autonomous Database, and it’s possible to create hybrid partitioned tables or external tables that map to the Object Storage. It is extremely inexpensive, and your data is encrypted so that only you can see it. So a very, very powerful feature. In addition, it supports the Swift object URL as well as Oracle’s own Object Storage URL.
Object Storage Integration
- ADB is fully integrated with major object stores: Oracle, AWS, Azure
- Choice of access:
  - Load object store files into the database
  - Query object store files on the fly without loading
- Supports all text file formats: CSV, delimited, JSON, etc.
This procedure, DBMS_CLOUD.CREATE_CREDENTIAL, stores Cloud Object Storage credentials in the Autonomous Data Warehouse database. Use stored credentials for data loading or for querying external data residing in the Cloud.
BEGIN
  DBMS_CLOUD.CREATE_CREDENTIAL(
    credential_name => 'X',
    username        => 'firstname.lastname@example.org',
    password        => 'auth-token-string');
END;
/
- Loads source files into an existing target table
- Handles compression, parallelism, logging automatically
- Logs all loads in the table user_load_operations
BEGIN
  DBMS_CLOUD.COPY_DATA(
    table_name      => 'CHANNELS',
    credential_name => 'OBJ_STORE_CRED',
    file_uri_list   => 'https://objectstorage..oraclecloud.com/n///',
    format          => json_object('type' value 'CSV', 'skipheaders' value '1'));
END;
/
This procedure, DBMS_CLOUD.CREATE_EXTERNAL_TABLE, creates an external table on files in the Cloud. This allows you to run queries on external data from Autonomous Data Warehouse. It handles compression, parallelism, and logging automatically.
BEGIN
  DBMS_CLOUD.CREATE_EXTERNAL_TABLE(
    table_name      => 'CHANNELS_EXT',
    credential_name => 'OBJ_STORE_CRED',
    file_uri_list   => 'https://objectstorage..oraclecloud.com/n///',
    format          => json_object('type' value 'CSV', 'skipheaders' value '1'),
    column_list     => 'CHANNEL_ID NUMBER, ...');  -- remaining columns elided
END;
/
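Once the external table is created, it can be queried in place like any other table; for example, assuming the CHANNELS_EXT table above:

```sql
-- Query the object store files in place through the external table,
-- without loading the data into the Autonomous Database
SELECT COUNT(*) FROM channels_ext;
```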
This procedure, DBMS_CLOUD.DELETE_FILE, removes the specified file from the specified directory on Autonomous Data Warehouse.
This procedure, DBMS_CLOUD.PUT_OBJECT, copies a file from Autonomous Data Warehouse to the Cloud Object Storage. The maximum file size allowed in this procedure is 5 gigabytes (GB).
This procedure, DBMS_CLOUD.VALIDATE_EXTERNAL_TABLE, validates the source files for an external table, generates log information, and stores the rows that do not match the format options specified for the external table in a badfile table on Autonomous Data Warehouse.
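A minimal invocation, assuming the CHANNELS_EXT external table created earlier:

```sql
-- Validate the source files behind the external table; rows that do not
-- match the declared format options are written to a badfile table
BEGIN
  DBMS_CLOUD.VALIDATE_EXTERNAL_TABLE(table_name => 'CHANNELS_EXT');
END;
/
```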
To troubleshoot loads, query the table user_load_operations:
SELECT table_name, status, rows_loaded, logfile_table, badfile_table
  FROM user_load_operations;
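Each row of user_load_operations names the log table and badfile table generated for that load, and those can then be queried directly. The COPY$1_BAD name below is illustrative of the generated naming; use whatever the badfile_table column actually returns:

```sql
-- Inspect the rejected rows for a given load; the table name comes from
-- the badfile_table column of user_load_operations
SELECT * FROM copy$1_bad;
```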
Export an Existing Oracle Database to Import into ADB
Oracle recommends schema mode for migrating to ADB to import schemas and their data.
List the schemas you want to export by using the schemas parameter.
Set the parallel parameter to at least the number of CPUs you have in your Autonomous Database to use parallelism.
The exclude and data_options parameters ensure that object types not required in ADB will not be exported and table partitions are grouped together so that they can be imported faster during the import to ADB.
That is, Data Pump lets you exclude objects that are not required in ADB, such as indexes and materialized views.
expdp sh@orcl \
  exclude=index,cluster,indextype,materialized_view,materialized_view_log,materialized_zonemap,db_link \

impdp admin@ADW_high \
  exclude=index,cluster,indextype,materialized_view,materialized_view_log,materialized_zonemap,db_link
The log files for Data Pump operations are stored in the directory DATA_PUMP_DIR; this is the only directory you can specify for the Data Pump directory parameter.
To access the log file, you need to move it to your Cloud Object Storage using the PUT_OBJECT procedure.
BEGIN
  DBMS_CLOUD.PUT_OBJECT(
    credential_name => 'X',
    object_uri      => 'https://…./import.log',
    directory_name  => 'DATA_PUMP_DIR',
    file_name       => 'import.log');
END;
/
Create Database Links from Autonomous Database to Oracle Databases
Use DBMS_CLOUD_ADMIN.CREATE_DATABASE_LINK to create database links from an Autonomous Database to another Oracle database.
Autonomous Database supports creating database links only if the target database is accessible through a public IP or public hostname, or using an Oracle Database Gateway.
To use database links with Autonomous Database the target database must be configured to use TCP/IP with SSL (TCPS) authentication.
To ensure security, the database link port is restricted to the range 1521-1525.
- Copy your target database wallet, cwallet.sso, containing the certificates for the target database to Object Store.
- Create credentials to access your Object Store where you store the cwallet.sso
- Create a directory on Autonomous Database for the wallet file cwallet.sso
- Use DBMS_CLOUD.GET_OBJECT to upload the target database wallet to the directory you created in the previous step, DBLINK_WALLET_DIR
BEGIN
  DBMS_CLOUD.GET_OBJECT(
    credential_name => 'DEF_CRED_NAME',
    object_uri      => 'https://objectstorage.us-phoenix-1.oraclecloud.com/n/namespace-string/b/bucketname/o/cwallet.sso',
    directory_name  => 'DBLINK_WALLET_DIR');
END;
/
On ADB create credentials to access the target database. The username and password you specify with DBMS_CLOUD.CREATE_CREDENTIAL are the credentials for the target database that you use to create the database link.
Create the database link to the target database using DBMS_CLOUD_ADMIN.CREATE_DATABASE_LINK.
BEGIN
  DBMS_CLOUD_ADMIN.CREATE_DATABASE_LINK(
    db_link_name       => 'SALESLINK',
    hostname           => 'adb.eu-frankfurt-1.oraclecloud.com',
    port               => '1522',
    service_name       => 'example_medium.atpc.example.oraclecloud.com',
    ssl_server_cert_dn => 'CN=atpc.example.oraclecloud.com,OU=Oracle BMCS FRANKFURT,O=Oracle Corporation,L=Redwood City,ST=California,C=US',
    credential_name    => 'DB_LINK_CRED',
    directory_name     => 'DBLINK_WALLET_DIR');
END;
/
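With the link in place, tables on the target database can be queried through it. For example (SALES here is a hypothetical table on the target database):

```sql
-- Query a remote table over the SALESLINK database link
SELECT COUNT(*) FROM sales@SALESLINK;
```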
Users other than ADMIN require privileges to run DBMS_CLOUD_ADMIN.CREATE_DATABASE_LINK.
Only one wallet file is valid per directory for use with database links. You can only upload one cwallet.sso at a time to the directory you choose for wallet files (for example DBLINK_WALLET_DIR). This means with a cwallet.sso in DBLINK_WALLET_DIR you can only create database links to the databases for which the wallet in that directory is valid. To use multiple cwallet.sso files with database links you need to create additional directories and put each cwallet.sso in a different directory. When you create database links with DBMS_CLOUD_ADMIN.CREATE_DATABASE_LINK, specify the directory that contains the wallet with the directory_name parameter.
Autonomous Database sets the SEC_CASE_SENSITIVE_LOGON parameter to true and this value cannot be changed. If your target database is not an Autonomous Database, then you must set the SEC_CASE_SENSITIVE_LOGON parameter to true on the target database.
Loading JSON Documents
For loading data from collections in the cloud, Autonomous Transaction Processing provides the procedure DBMS_CLOUD.COPY_COLLECTION.
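A minimal sketch of loading JSON documents from Object Storage into a SODA collection; the collection name, credential, and object URI below are illustrative:

```sql
BEGIN
  DBMS_CLOUD.COPY_COLLECTION(
    collection_name => 'mycollection',   -- created if it does not already exist
    credential_name => 'OBJ_STORE_CRED',
    file_uri_list   => 'https://objectstorage.us-phoenix-1.oraclecloud.com/n/namespace-string/b/bucketname/o/myCollection.json',
    format          => json_object('recorddelimiter' value '0x''0A'''));
END;
/
```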