Data Preparation with OAC/DV

Data Preparation demo from OOW2018


ODI Studio: working with multiple environments in BI Apps

With ODI, a security configuration file (jps-config-jse.xml) and a wallet file (cwallet.sso) are needed to establish the connection to the WORK or MASTER Repository. These files can be retrieved from the ODI server machine and need to be copied to each client machine, e.g. the laptop that should connect to the Repository. Oracle BI Applications typically comes with a multi-environment setup like DEV, TEST and PRODUCTION, and each of these environments has its own security configuration and wallet file. These files need to be placed in the client directory of ODI, e.g.:


The path to the security configuration file then needs to be specified within the ODI configuration file (odi.conf) with the following syntax:
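The entry is a JVM option in odi.conf; a minimal sketch, assuming the files were copied to a hypothetical C:\odi_client\DEV folder (adjust the path to your environment):

```
AddVMOption -Doracle.security.jps.config=C:\odi_client\DEV\jps-config-jse.xml
```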



Since each environment comes with its own security configuration file and wallet file, you may need to replace these files to make the proper connection to the respective environment. Thus, the easiest approach is to create sub-folders for each environment and to copy and replace the files as needed:
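The copy-and-replace step can also be scripted. The following is a minimal sketch, assuming a hypothetical layout where the ODI client directory contains one sub-folder per environment (DEV, TEST, PRODUCTION) holding that environment's files:

```python
import shutil
from pathlib import Path

# The two environment-specific files needed for the repository connection
FILES = ("jps-config-jse.xml", "cwallet.sso")

def switch_environment(client_dir, env):
    """Copy the files from the environment sub-folder into the ODI client directory,
    overwriting whatever environment was active before."""
    src = Path(client_dir) / env
    for name in FILES:
        shutil.copy(src / name, Path(client_dir) / name)

# Usage (hypothetical path): switch_environment("C:/odi_client", "DEV")
```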


The wallet file contains the encrypted credentials for the database connection, while the security configuration file contains information on how to authenticate the user, e.g. against a WebLogic component.


Oracle Cloud Services with OCI and OCI-classic



Oracle SaaS and Oracle PaaS services are provided on top of the underlying Oracle Cloud Infrastructure. This infrastructure comes in one of two generations:

  • Oracle Cloud Infrastructure (OCI) → Gen. 2
  • Oracle Cloud Infrastructure – Classic (OCI-C) → Gen. 1

Newer PaaS services like ADW are provided on OCI (Gen. 2), while other cloud services like Fusion Applications or DIPC are built on top of OCI-C (Gen. 1). For end users or developers the difference may not be directly apparent, but the underlying capabilities for networking and connectivity vary greatly between the two generations. This article will not address the differences between Gen. 2 and Gen. 1, but from a cloud architecture point of view it should always be checked which underlying generation an Oracle Cloud service runs on, because connecting services will always be easier if both (or all) are on the same generation of underlying infrastructure.

Connections may still be created between Gen. 1 and Gen. 2 services, but network performance, network configuration and flexibility will be much better with Gen. 2.


OTBI direct database access disabled in Oracle Fusion 19A

With the most recent upgrade of Oracle Fusion Applications Cloud, 19A, direct database access with OTBI has been disabled by default by Oracle. The feature will be phased out completely after a grace period of 12 months:

This has also been logged as a bug, since many customers relied on the feature:

Community discussion: R13.19A OTBI Direct Database Query Privilege Change

Fusion Applications: Upgrade 19A What's New (see chapter: OTBI DIRECT DATABASE QUERY PRIVILEGE IS DISABLED BY DEFAULT FOR …)

The reasons for this are security concerns on Oracle's side. Currently the only remaining alternative for direct database access is BI Publisher (as confirmed by Oracle in the community discussion, this method will not be restricted).


Additionally, for OAC, the direct database access feature has also been removed from the default permissions of the logged-in user (Authenticated User). However, using the Manage Privileges page, the permission can still be assigned to users who require the feature:

How to create a direct database access is also described in a previous post.

Working with BICC to resolve extract issues

In order to extract data from Oracle Fusion Applications Cloud, BICC needs to be used. It is part of every Fusion Applications Cloud instance and is typically reachable by extending the URL with /biacm, e.g.:
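The host name below is purely hypothetical; the pattern is the tenant's Fusion Applications URL with /biacm appended:

```
https://mytenant.fa.us2.oraclecloud.com/biacm
```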

When logging into BICC with the above example URL, the following screen is shown as the starting page:


This is the Review Cloud Extract Configuration page from the top-right navigation menu. It allows clicking on a PVO (Public View Object, shown as Data Store) to further check its configuration in terms of associated offerings (OBIA modules) and the attributes (columns) selected for extraction.

When starting an OBIA implementation project, upon the first login the required OBIA modules need to be enabled via the Global Data Store List from the navigation menu:


The offering (OBIA module) can be selected from the drop-down menu and the respective PVOs can be enabled for extract:


This will enable the PVOs for use within BICC and the extraction process. In the background these PVOs are created and also associated with the respective OBIA modules; hence, for an OBIA implementation of Human Resources, only the PVOs required for HR are extracted from Fusion Cloud.

However, BICC extracts may fail with errors that require executing some tasks within BICC as a workaround. These tasks include:

  • remove a PVO from an associated offering (OBIA module)
  • deselect columns for a PVO

The following section shows a step-by-step approach to fix these errors within BICC:

remove PVO from offering

A typical reason to remove a PVO from an offering may be due to the following error:

ORA-00942: table or view does not exist 


This error can be resolved by navigating to the start page of BICC or via the Review Cloud Extract Configuration link from the navigation menu. The PVO name from the error log can be copied into the Data Store field to search for it. To configure the PVO and its association with the offering, click the icon to the right of the name, or the name itself:


On the page of the PVO, click Edit:


On the following page, the associated offerings are listed in the (right-hand) selected box. The associated offering can be removed from the PVO to exclude it from the next extract run of the OBIA module (offering):


After removing the associated offering from the PVO, the customization can be saved. On the following PVO summary screen, the PVO can be closed by clicking "Done" to get back to the Review PVO page.

This method of customization will permanently remove the PVO from the offering (OBIA module). When scheduling a subsequent extract run, the removed PVO will no longer be part of the offering. This can be validated by creating a new extract (from Manage Extract Schedules) and selecting the offering from the drop-down menu:


The removed PVO will no longer be part of an extract run:


This Data Store List screen can also be used to temporarily deselect a PVO from an offering (OBIA module) extract. Typically, the OBIA module is selected from the offering drop-down menu and the Enabled for Extract checkbox in the table header is clicked to enable all associated PVOs for extract, but an individual PVO can also be deselected from the list to affect only this specific run (e.g. for temporary testing purposes). Thus, a PVO can be removed from an offering either for all subsequent runs or just for a specific run (using the Data Store List of the Manage Extract Schedules screen).

deselect columns from a PVO

Another common error is missing columns in a PVO, e.g. because they became obsolete after an upgrade or are no longer used. The following error is shown within the BICC logs:

FscmTopModelAM.PjtProjectPlanAM.ProjectPlanIntegrationPVO – Missing columns – [DeliverableTypesBPEOLastUpdateDate1, ProjPlanLinePEOLastUpdateDate2]
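When a log contains many such entries, collecting them mechanically can save time. The following is a hypothetical helper (not part of BICC) that parses "Missing columns" lines of the format shown above:

```python
import re

# Matches lines like:
#   <PVO name> – Missing columns – [ColA, ColB]
# accepting either a hyphen or an en dash as separator
PATTERN = re.compile(r"(\S+PVO)\s*[-–]\s*Missing columns\s*[-–]\s*\[([^\]]*)\]")

def missing_columns(log_text):
    """Return {pvo_name: [column, ...]} for every 'Missing columns' entry found."""
    result = {}
    for pvo, cols in PATTERN.findall(log_text):
        result.setdefault(pvo, []).extend(c.strip() for c in cols.split(","))
    return result
```

The resulting dictionary lists, per PVO, the attributes that need to be deselected in BICC as described below.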





In the latter case, the physical column first needs to be mapped to an attribute (field) of the PVO in order to deselect the attribute from the select list. This is most reliably done by creating an SR with Oracle Support. After the attribute has been identified, it can be deselected from the PVO by going back to the Review Cloud Extract Configuration screen and searching for the respective PVO. Once the PVO name has been clicked (as in the section above), the PVO summary page is shown. From this page the PVO can be edited by clicking "Edit":


After clicking Next to reach the select column page, the column configuration is shown for the PVO:


First, it is recommended to make the highlighted columns visible in order to check whether an attribute is used in the select list, as a primary key, or within the incremental filter. The attribute name can then be used to search for the attribute that needs to be deselected:


After deselecting the checkbox, e.g. of Select List, the PVO needs to be saved. These changes will then apply to the next extract run with BICC, where the deselected attribute will not be part of the select statement and thus not be retrieved. Since this column was not available and caused the extract issue, deselecting the attribute from the PVO definition fixes the issue.

Extracting data from Oracle Fusion Cloud for Oracle BI Applications with BICC

The data extract from Oracle Fusion Cloud is managed by the Fusion BI Cloud Connector (BICC). This console can be accessed via the following typical URL:

For the login, a dedicated user is recommended, e.g. BICC_ADMIN, since this user will be used to create the extract. According to the documentation, this user must be assigned the following application roles in order to inherit the required permissions for running the extract:


This user is typically created by the Fusion team and handed over to the BI team. The BI team can validate the assigned roles by logging in to the following URL:

and clicking on the user name in the top-right corner. This will list all application roles assigned to the user:


After validating the user, the Oracle BI Applications modules need to be selected within BICC for extraction. This can be done by clicking on the calendar icon in the top-right menu and then on Global Data Store List.


This opens the page to enable the Oracle BI Applications product offerings/modules within BICC for Fusion Cloud extract. The offering must be selected from the drop-down list and then be enabled for extract.


After all required modules have been selected, they can be extracted to UCM (the bundled file storage of any Fusion Cloud instance, which may be used by multiple Fusion modules, e.g. for the data import into Fusion Cloud itself). The extract can be configured by clicking on Manage Extract Schedules in the top-right menu.


This opens the extract schedules pane. An extract job can be created from this pane:


A new extract can be created by clicking the plus icon in the top pane. This opens a guided workflow that allows selecting the modules to be extracted. In an initial project phase, for instance, you may choose to run the extract for just a single module even though multiple modules have been activated: the project may consist of Financials, Supply Chain and Procurement, but only the Procurement module is scheduled for extract.

Then, by selecting Immediate as the recurrence setting, the job will run right after saving it.

In case of errors with the extract, the log can be downloaded by clicking the help icon in the top-right area of the screen:


and by clicking download latest log:


The log file may then provide further insight into any issues with the extract:


For further reference, please check: