Showing posts with label OBIA. Show all posts

Friday, April 19, 2019

OBIA Notes


  1. To set up the source system and target system configurations, log in to BIACM with a user that has Admin privileges (in this case, BIAppsAdmin). Logging in as a normal user such as weblogic will not allow the system setup.
  2. If the source connection details change, the suggested option is to edit the connection details of the existing source system setup rather than create a new source connection setup. The source connection is edited from ODI Studio.
  3. The target data warehouse connection between BIACM and the warehouse is set up by updating the Data Source Number of the target data warehouse to 999 from ODI Studio. The target data warehouse connection details then show up automatically in the system setup tab in BIACM.
  4. Provide the correct path for source files, either from Configuration Manager or from ODI Studio (under the Topology tab >> File).
  5. The Data Source Number provided while creating the source system setup is crucial and is referenced throughout the load.
  6. Domain and domain member mappings are similar to the CSV setups done in earlier versions of BI Apps (such as 7.9.6.4). Do update the domain member mapping information. This is explained in detail in the later sections.
  7. To view runtime values of parameters or session values, pick the session ID from the load run. Log in to ODI Console -> Sessions -> search for the session ID and then look for the parameter values. Alternatively, open the session in ODI Studio and find the values in the General tab.
  8. To view the SQL involved in any mapping, go to the Designer view and identify the mapping -> package -> scenario. Right-click the package and click Execute. In the pop-up dialog, check the “Simulation” checkbox and click OK. This generates the SQL associated with the mapping.
  9. To reset the data warehouse and load fresh data altogether, there is an option called ‘Reset Data Warehouse Scenario’. It can be used to do a fresh data load by truncating all existing data in the data warehouse; it works by truncating the W_ETL_LOAD_DATES table and running the load in full mode.
 10. If a step that has an error is marked as complete, an entry for that step will not be put into the W_ETL_LOAD_DATES table. (Every successfully completed step has an entry in W_ETL_LOAD_DATES with the run date and other details.)
 11. Incremental run logic

The W_ETL_LOAD_DATES table has a record for each step executed in the load plan. On successful execution of a step, an entry for that step is made in this table.
     When the load is re-run, each step looks for its entry in the W_ETL_LOAD_DATES table. If an entry is present, the step runs in incremental mode; otherwise it runs in full mode.
     Example query that checks the incremental mode:
SELECT IF(COUNT(*)>0,'Y','N')
FROM   QUALIFY_DS(W_ETL_LOAD_DATES)
WHERE  PACKAGE_NAME = 'GET_CONTAINER_NAME()'
AND    (DATASOURCE_NUM_ID = #DATASOURCE_NUM_ID
        OR DATASOURCE_NUM_ID = #WH_DATASOURCE_NUM_ID)
AND    ETL_USAGE_CODE = '#ETL_USAGE_CODE'
AND    COMMITTED = '1'
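As a rough illustration of this check, the sketch below replays the same logic against an in-memory mock of W_ETL_LOAD_DATES. The package name and data source numbers here are invented examples, not values from a real repository.

```python
import sqlite3

# Mock of the W_ETL_LOAD_DATES check: a step runs incrementally only when a
# committed entry for its package already exists for the relevant data sources.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE W_ETL_LOAD_DATES (
    PACKAGE_NAME TEXT, DATASOURCE_NUM_ID INTEGER,
    ETL_USAGE_CODE TEXT, COMMITTED TEXT)""")
conn.execute(
    "INSERT INTO W_ETL_LOAD_DATES VALUES ('SDE_PSFT_GLACCOUNT', 29, 'REGULAR', '1')")

def is_incremental(package, src_dsn, wh_dsn, usage):
    # Mirrors the IF(COUNT(*)>0,'Y','N') logic of the ODI step.
    (count,) = conn.execute(
        """SELECT COUNT(*) FROM W_ETL_LOAD_DATES
           WHERE PACKAGE_NAME = ?
             AND (DATASOURCE_NUM_ID = ? OR DATASOURCE_NUM_ID = ?)
             AND ETL_USAGE_CODE = ?
             AND COMMITTED = '1'""",
        (package, src_dsn, wh_dsn, usage)).fetchone()
    return 'Y' if count > 0 else 'N'

print(is_incremental('SDE_PSFT_GLACCOUNT', 29, 999, 'REGULAR'))  # entry exists -> 'Y'
print(is_incremental('SDE_PSFT_LEDGER', 29, 999, 'REGULAR'))     # no entry -> 'N'
```

Deleting the row (or setting COMMITTED to '0') flips the answer back to 'N', which is exactly why the two override options below force a step into full mode.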

12. In an incremental load, to override and run a particular step in full mode, one of the following options can be used:
        Delete the entry for that step in the W_ETL_LOAD_DATES table.
        Update the COMMITTED column to 0 (zero) in the W_ETL_LOAD_DATES table for that particular step.




OBIA Financial Analytics on Peoplesoft configuration

In this post we will see some important configuration steps for setting up Oracle Financial Analytics on the PeopleSoft ERP.

Configuring BI Applications 11.1.1.10.1

Once the OBIA installation is complete, the next step is to enable the purchased functional areas and start extracting data into the data warehouse. For this setup we need to use the tools/applications below.
  1. The Oracle BI Applications Configuration Manager (BIACM)
  2. Oracle BI Applications Functional Setup Manager (FSM part of BIACM itself)
  3. Oracle Data Integrator (ODI)
BIACM and FSM are web-based applications that run on the BI managed server; they are used to enable functional areas and to configure the parameters each functional area requires for extracting data.
Briefly, BIACM is for administering the entire OBIA environment and configuring ODI.
FSM acts as a lightweight project-management tool for configuring a specific functional area and can be used to assign configuration tasks to specific users.

Steps Involved in Configuring a BI Applications 11.1.1.10.1 Data Load

1. Check ODI Agent status

The first step is to make sure the ODI agent is up and running, since it is responsible for executing data loads.
Log in to the WebLogic console and make sure odi_server1 is up and running; if not, start it from there.
Then check the "oraclediagent" deployment under Deployments and make sure it is in the Active state.

2. Configure OBIA environment through BIACM

Access the BI Apps Configuration Manager (BIACM) using

https://hostname:9704/biacm

Note: To set up the source system and target system configurations, log in to BIACM with a user that has Admin privileges (in this case, BIAppsAdmin). Logging in as a normal user such as weblogic will not allow the system setup.
In the BIACM Console,
Click on the System Setups -> Define Business Intelligence Applications Instance
Clicking on this task then presents three tabs for configuration: Source Systems, Target Warehouse, and Environment Configuration. If you look at the Target Warehouse and Environment Configuration tabs, you'll see that the BI Applications Configuration Utility has already configured these for you. However, there are no Source Systems defined, so that becomes our first task.

2.1 System Setups - Source System setup:
  • Click the Source Systems tab
  • Click the green plus symbol to add a new source system. 



Now you have to register the source in both BIACM and ODI.


Enter the following details, when prompted:

Sl#  Parameter Name        Example Values
1    Product Line          PeopleSoft Enterprise
2    Product Line Version  PeopleSoft Enterprise 9.2
3    Source Instance Name  PSFT_9_2_SRC (appropriate instance name)
4    Description           PSFT_9_2_SRC (any description)
5    Data Source Number    29 (appropriate number for the source system)

Click the Next button to register the source in the ODI Topology.

Provide all the necessary information and click the Test button to make sure everything is correct.




Now click Next to register the file connection.

In the Technology: File System tab back in Configuration Manager, enter the following information:
  1. ODI Data Server Name: PSFT_9_2_SRCFILES
  2. Host Name: <middleware_host>
  3. Path to File:  /<middleware_home>/Oracle_BI1/biapps/etl/data_files/src_files/PSFT_9_2
Finally, click the Save and Close button and then Done to finish this configuration process.
At this point, Configuration Manager has written the configuration from this one screen into the ODI Repository for four different objects in the Physical and Logical Topology.

Appendix 1:
For implementing Financial Analytics, we have to move the two CSV files below to the folder mentioned above.
  •  file_group_acct_codes_psft.csv
  •  file_glacct_segment_config_psft.csv
In the file_group_acct_codes_psft.csv file you have to supply values for
BUSINESS_UNIT, FROM_ACCT, TO_ACCT, GRP_ACCT_NUM

Here BUSINESS_UNIT is mandatory; technically, if it is null you won't see any of your account grouping names in the DWH tables, because the ETL has a NOT NULL condition on this column while loading into the W_PSFT_GRP_ACCT_TMP table.
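To make the effect concrete, the sketch below mimics that NOT NULL condition on a couple of hypothetical CSV rows — the business unit, account range, and group names are invented for illustration.

```python
import csv, io

# Hypothetical sample content for file_group_acct_codes_psft.csv; real values
# depend on your chart of accounts.
sample = """BUSINESS_UNIT,FROM_ACCT,TO_ACCT,GRP_ACCT_NUM
US001,101000,101999,CASH
,201000,201999,AP
"""

rows = list(csv.DictReader(io.StringIO(sample)))

# Mimic the ETL's NOT NULL condition on BUSINESS_UNIT: rows with a blank
# business unit never reach W_PSFT_GRP_ACCT_TMP.
loaded = [r for r in rows if r["BUSINESS_UNIT"]]
print([r["GRP_ACCT_NUM"] for r in loaded])  # only the CASH row survives
```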

In the file_glacct_segment_config_psft.csv file you have to supply active chartfield values under the SEG1 to SEGn columns.

For Account, Balancing Segment, Cost Center, and Project you do not need to mention them in the file because there are separate WID columns in all the financial tables.

3. Enable Licensed BI Offerings in FSM

Configure Offerings:

This step is done from the Functional Setup Manager, which is launched from the Configuration Manager.
  1. Click Perform Functional Configurations in Configuration Manager, and then login to Functional Setup Manager.
  2. Then, click Configure Offerings; here we will enable our licensed offering for implementation.
  3. Check the Enable for Implementation checkbox beside the required functional modules and press Yes to continue. Select the following for PeopleSoft Enterprise:

        Financial Analytics





Create Implementation Project:

An implementation project is nothing more than a collection of tasks required to configure the offerings you include in the project.
Much like the load plan, BI Applications 11.1.1.10.1 contains a master task list for every offering, with some tasks having a one-to-many relationship to the offerings. Creating an implementation project is very similar to generating a load plan in that you only get a subset of the master based on your selections.
1. Click the Manage Implementation Projects tab and click the 'Create' icon.

2. Enter and select the values as below and click the ‘Next’ button.




3. Expand Oracle Financial Analytics -> PeopleSoft Enterprise – Oracle Financial Analytics. Check the Include checkbox for all the applicable modules. You need to select them one by one; after selecting a module it takes close to 2-3 seconds to save automatically before you can proceed to the next.

4. After saving, you will see that the project is created and a task list for Financial Analytics is generated.

5. Click Save and Open Project.

Upon opening the project, you are presented with the many tasks needed to configure Financial Analytics. The tasks in Functional Setup Manager generally fall into three categories:

  1. Informational tasks that route you to the proper place in the documentation.
  2. Parameter or Domain tasks that route you to the proper place in Configuration Manager.
  3. Text file configuration tasks that tell you what file to edit.


The functional configuration can be achieved by completing the tasks in the Task list created in the Implementation Project.
4. Configure Licensed BI Offerings in BIACM

From the Configuration Manager,

  1. Click System Setups > Manage Business Intelligence Applications on the left-hand sidebar in Configuration Manager.
  2. Then, within the Manage Business Intelligence Applications page, select the offerings and the subject areas required.
  3. Then, press the OK button in the dialog that is then displayed, and once it closes, press Save.
  4. From the Business Intelligence Application Offerings and Associated Sources tab, expand the selected Offerings and map the offering to the correct source.







5. Create and Execute Domain Load

The Domain-Only Load Plan runs some targeted data extractions from the source system to assist with configurations for specific applications.


From Configuration Manager,
1. Click Load Plans Administration > Manage Load Plans
2. When the Manage Load Plans page opens, click the green plus symbol to add a new load plan, and enter the following details to define the new domain load plan
Name: PSFT_9_2_Domain_Load
Description: PSFT_9_2_Domain_Load
Load Plan Type:
Domain-only Extract and Load (SDE and SIL)
Source Instances: PSFT_9_2_SRC
(The above fields can be altered based on your source system except 'Load Plan Type')


3. Then, press the Next button to display the list of available fact table groups.

4. Select the modules as below:
   a. Financial Analytics

5. Click the Save button and select Save and Generate Load Plan when prompted.

Once the generation is complete, execute the load plan and make sure it runs successfully.

After the domain load execution, proceed to configure your BI system as described in the further steps.


6. Functional Configuration Tasks

The Functional Configuration consists of the following actions,
  1. Setting up Global and Module-Specific Data Load Parameters (required before executing the Full Load Plan)
  2. Setting up Reporting Parameters
  3. Domain Values and Domain Member Mapping

In BIACM navigate to Data Load Parameters Administration -> Manage Data Load Parameters:

a. Search for the required parameter by name or code.
b. Click on the parameter row to select the parameter.
c. If the parameter is a Global parameter, the Global Parameter Value field displays the actual value.

Parameters required for Full load:

The parameters that are required before a full load are marked with a yellow warning symbol.




Usually we need to set the parameters below.

Configure Global Currencies
Specify Gregorian Calendar Date Range
Specify the Enterprise Calendar
GBL_CALENDAR_ID:
SELECT DISTINCT SETID || '~' || CALENDAR_ID FROM PS_CAL_DEFN_TBL;

Configure Initial Extract Date

Configure Data Load Parameters for PeopleSoft Trees for Item Categories

1. TREE_SETID_NAME_LIST

Query to get values:

SELECT SETID||'~'||TREE_NAME FROM PSTREEDEFN WHERE TREE_STRCT_ID IN ('ITEMS');

2. TREE_STRUCT_ID_LIST

ITEMS

Configure Data Load Parameters for PeopleSoft Department Tree

1.TREE_SETID_NAME_LIST_DEPT

Query to get values:

SELECT SETID||'~'||TREE_NAME FROM PSTREEDEFN WHERE TREE_STRCT_ID IN ('DEPTID');

2. TREE_STRUCT_ID_LIST_DEPT

DEPTID

Configure Data Load Parameters for PeopleSoft Company (Business Unit) Tree

1.TREE_SETID_NAME_LIST_COMPANY

Query to get values:
SELECT SETID||'~'||TREE_NAME FROM PSTREEDEFN WHERE TREE_STRCT_ID IN ('CONSOLIDATION');

2. TREE_STRUCT_ID_LIST_COMPANY
CONSOLIDATION

Configure Data Load Parameter for PeopleSoft Product Hierarchy

TREE_NAME1


Balancing Segment Dimension
1.TREE_SETID_NAME_LIST
SELECT A.SETID||'~'||A.TREE_NAME FROM PSTREEDEFN A, PSTREESTRCT B WHERE A.TREE_STRCT_ID=B.TREE_STRCT_ID AND B.DTL_RECNAME='FUND_TBL';

2.TREE_STRUCT_ID_LIST
SELECT TREE_STRCT_ID FROM PSTREESTRCT WHERE DTL_RECNAME = 'FUND_TBL';

The FUND chartfield has to be mapped to the Balancing Segment dimension.

Cost Center Dimension
1.TREE_SETID_NAME_LIST
SELECT A.SETID||'~'||A.TREE_NAME FROM PSTREEDEFN A, PSTREESTRCT B WHERE A.TREE_STRCT_ID=B.TREE_STRCT_ID AND B.DTL_RECNAME='DEPT_TBL';

2.TREE_STRUCT_ID_LIST
SELECT TREE_STRCT_ID FROM PSTREESTRCT WHERE DTL_RECNAME = 'DEPT_TBL';

The Department chartfield has to be mapped to the Cost Center dimension.

Natural Account Dimension
1.TREE_SETID_NAME_LIST
SELECT A.SETID||'~'||A.TREE_NAME FROM PSTREEDEFN A, PSTREESTRCT B WHERE A.TREE_STRCT_ID=B.TREE_STRCT_ID AND B.DTL_RECNAME='GL_ACCOUNT_TBL';

2.TREE_STRUCT_ID_LIST
SELECT TREE_STRCT_ID FROM PSTREESTRCT WHERE DTL_RECNAME = 'GL_ACCOUNT_TBL';

The Account chartfield has to be mapped to the Natural Account dimension.

Domains and Domain member mappings


DOMAINS
Domains are pre-seeded dimensional values that help define business metrics. For example, in Financial Analytics, domains store information about the General Ledger accounts.
Domains are typically located in the source system.
If domains are not available in a source system, then they can be sourced from a flat file.
DOMAIN MAPPINGS
Domain Mappings specify how data in a source system is extracted and loaded into Oracle Business Analytics Warehouse.
TYPES OF DOMAINS
1.       Regular Domains
Regular Domains have members consisting of a single value.
For example, a Purchase Order Status domain might have the following members:
        Cancelled
        Closed
        Incomplete
These single values map to single member values in the target system. For example, Cancelled maps to Cancelled, Closed maps to Closed, and so on.
2.       Band Domains
Band Domains have members consisting of two values (Range Start, and Range End) that specify a range.
For example, an Account Employee Size domain might have the following members:
a. 1, 5000
b. 5001, 10,000
c. 10,001, 1,000,000
Each range maps to a single target Domain Member. For example, 1, 5000 maps to Small, 5001, 10,000 maps to Medium, and so on.
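The band lookup described above can be sketched as a simple range scan. The first two boundaries and target names come from the example in the text; "Large" and the unassigned placeholder are assumptions for illustration.

```python
# Illustrative band-domain lookup: each source value is matched against a
# (range_start, range_end) pair and mapped to a single target domain member.
BANDS = [
    (1, 5_000, "Small"),
    (5_001, 10_000, "Medium"),
    (10_001, 1_000_000, "Large"),   # "Large" is an assumed target member name
]

def band_member(value):
    for start, end, member in BANDS:
        if start <= value <= end:
            return member
    return "__UNASSIGNED__"  # placeholder for values outside every band

print(band_member(1200))   # falls in 1-5000 -> Small
print(band_member(7500))   # falls in 5001-10,000 -> Medium
```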
SOURCE DOMAINS
Data fields in a Source System application are referred to as Source Domains
WAREHOUSE DOMAINS
Data fields in Oracle Business Analytics Warehouse are referred to as Warehouse Domains.
DOMAIN MEMBERS
Domain Members are the permitted values for a Source or Warehouse Domain.
For example, the Domain Members for MARITAL_STATUS include D for Divorced, M for Married, S for Single, and so on.
WAREHOUSE DOMAIN HIERARCHIES
Warehouse Domain Hierarchies are Domains that have been organized into hierarchies to enable the data to be more effectively analyzed. For example, in Oracle HR Analytics, you might need to have a workforce event hierarchy: Event Group -> Event Sub-group -> Event Detail.
DOMAIN MEMBER MAPPINGS
Oracle Business Intelligence Applications ships default domain value mappings that map the seeded BI Application domain values to the seeded configuration data in Oracle Enterprise Resource Planning applications.
When you configure your Offerings, you need to review the default mappings for domain values, and if necessary update them to suit the categories that you want to use to report on your data.
For example,
In Oracle HR Analytics, the default domain values for Performance Range might be similar to the following:
0 - 50: PERF_RANGE_1
50 - 60: PERF_RANGE_2
60 - 70: PERF_RANGE_3.
If you want to use these default categories, you do not need to make any changes to these mappings before you start your ETL processes.
If you want to make changes to default mappings, then you use the Domain Mappings tab to implement these changes.
For example, you might want to change the range for PERF_RANGE_1 from 0 - 50 to 0 - 100, or you might want to add a new category named PERF_RANGE_4 and assign the range 100 - 500 to it.


Now edit the domain member mappings based on business requirements.

Create the full data extract load plan and execute it from BIACM; the load can be monitored from ODI Studio.

Friday, January 12, 2018

UOM conversion logic - Oracle EBS

Using the query below, we can get the conversion rate for an inventory item.



SELECT *
FROM  (SELECT MSIB.ORGANIZATION_ID,
              MSIB.INVENTORY_ITEM_ID,
              MUCC.TO_UOM_CODE FROM_UOM_CODE,
              MUCC.FROM_UOM_CODE TO_UOM_CODE,
              MUCC.CONVERSION_RATE
       FROM   MTL_UOM_CLASS_CONVERSIONS MUCC,
              MTL_SYSTEM_ITEMS_B MSIB
       WHERE  MUCC.INVENTORY_ITEM_ID = MSIB.INVENTORY_ITEM_ID
       UNION ALL
       SELECT MSIB.ORGANIZATION_ID,
              MSIB.INVENTORY_ITEM_ID,
              MUM.UOM_CODE FROM_UOM_CODE,
              MSIB.PRIMARY_UOM_CODE TO_UOM_CODE,
              1/MUC.CONVERSION_RATE CONVERSION_RATE
       FROM   MTL_SYSTEM_ITEMS_B MSIB,
              MTL_UOM_CONVERSIONS MUC,
              MTL_UNITS_OF_MEASURE MUM
       WHERE  MSIB.PRIMARY_UOM_CODE = MUC.UOM_CODE
       AND    MUC.INVENTORY_ITEM_ID = 0
       AND    MUC.UOM_CLASS = MUM.UOM_CLASS
       AND    MUM.BASE_UOM_FLAG = 'Y'
      ) SQ_UOM
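Once the query returns a rate per (item, from-UOM, to-UOM) combination, applying it is a single multiplication. A minimal sketch, with an invented item ID, UOM codes, and rate:

```python
# Hypothetical rates keyed by (inventory_item_id, from_uom, to_uom), as they
# might come back from the query above. The values here are made up.
rates = {
    (123, "CS", "EA"): 12.0,   # 1 case = 12 each (illustrative)
}

def convert(item_id, qty, from_uom, to_uom):
    # Converting within the same UOM is the identity; otherwise multiply by
    # the looked-up conversion rate.
    if from_uom == to_uom:
        return qty
    rate = rates[(item_id, from_uom, to_uom)]
    return qty * rate

print(convert(123, 3, "CS", "EA"))  # 3 cases -> 36.0 each
```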


Monday, August 21, 2017

ORA-20000: Error creating indexes in parallel, see W_ETL_EVENT_LOG for details

I saw the following error when running OBIA ETL loads in ODI.

ODI-1228: Task Custom_SIL_TestFact.WC_TEST_F (Integration) fails on the target ORACLE connection BIAPPS_DW.
Caused By: java.sql.SQLException: ORA-20000: Error creating indexes in parallel, see W_ETL_EVENT_LOG for details
ORA-06512: at line 119

It looks like duplicates in the WC_TEST_F table, but the real problem is an incorrect flex-field setting on the WC_TEST_F datastore.

Issue:
There is an index WC_TEST_F_U1 created as an alternate key, with the "OBI Bitmap Index" value set to Y.

Hence it issues the query below for creating the index during the load.

CREATE UNIQUE BITMAP INDEX DEV_DW.WC_TEST_F_U1 ON DEV_DW.WC_TEST_F (DATASOURCE_NUM_ID, INTEGRATION_ID) PARALLEL NOLOGGING

The syntax is incorrect because a bitmap index cannot be unique.

Resolution:

1. Update the W_ETL_ALL_INDEXES table and set the PRE_PARAMETERS column from 'UNIQUE BITMAP ' to 'UNIQUE ' for the index WC_TEST_F_U1:

update W_ETL_ALL_INDEXES set pre_parameters='UNIQUE ' where index_name='WC_TEST_F_U1';

2. Modify the WC_TEST_F datastore flex-fields to mark the ‘OBI Bitmap Index’ parameter as ‘N’.


Restart the load.

Wednesday, May 17, 2017

ODI-10163: This OPSS user biadmin has different GUID than the one found in Oracle Data Integrator repository

After cloning ODI, we got this issue while trying to log in to the OBIA ODI repository.

ODI-10163: This OPSS user biadmin has different GUID than the one found in Oracle Data Integrator repository
oracle.odi.core.security.BadCredentialsException: ODI-10163: This OPSS user biadmin has different GUID than the one found in Oracle Data Integrator repository.
at oracle.odi.core.security.SecurityManager.createAuthentication(SecurityManager.java:511)
at oracle.odi.core.security.SecurityManager.createAuthentication(SecurityManager.java:252)
at oracle.odi.ui.docking.panes.OdiCnxFactory$1.run(OdiCnxFactory.java:208)
at oracle.ide.dialogs.ProgressBar.run(ProgressBar.java:655)
at java.lang.Thread.run(Thread.java:662)
WARNING JPS-00167 Invalid xml content was found. SchemaLocation: schemaLocation value = 'http://xmlns.oracle.com/oracleas/schema/11/jps-config-11_1.xsd' must have even number of URI's. Location: line 2 column 272.
ERROR odi.core.security.SecurityManager.createAuthenthication(subject) detected OPSS user:biadmin has different guid ODI repos. The guid of the opss user is:A6EA4DE0AB5211E6BFDE3BA100F88679, the guid in repos is:926E89702AD011E5AF4471E40A0748AE


Cause: After cloning instance weblogic ldap users GUID were changed but those were not updated in ODI repository.

Resolution: Update the GUID in the SNP_USER table for the OPSS user, then regenerate cwallet.sso and jps-config-jse.xml and place them on your client machine.

Update GUID in SNP_USER table

UPDATE SNP_USER SET GUID_EXTERNAL='A6EA4DE0AB5211E6BFDE3BA100F88679' WHERE WUSER_NAME='biadmin'

Regenerate/Update ODI security files

Connect to the middleware server and execute the command below in PuTTY.

<Middleware>/Oracle_BI1/common/bin/wlst.sh <Middleware>/Oracle_BI1/bifoundation/install/createJPSArtifactsODI.py embedded --ADMIN_USER_NAME weblogic --DOMAIN_HOSTNAME <mention_host_name> --DOMAIN_PORT 7001 --DOMAIN_HOME_PATH <Middleware>/user_projects/domains/bifoundation_domain

On successful execution, you will see the regenerated/updated cwallet.sso and jps-config-jse.xml files at the Middleware server location below:

<Middleware>/user_projects/domains/bifoundation_domain/odi-client-config/embedded

Copy cwallet.sso and jps-config-jse.xml to your client ODI bin location. Usually it is

C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\odi\bin


Restart odi_server1 from the WebLogic console to make the changes effective.

Thursday, February 23, 2017

ODI Coding Standards for OBIA 11.1.1.x

Below are the areas in Oracle Data Integrator where developers need to strictly follow the Oracle BI Apps standards for better code readability and troubleshooting.

1. Knowledge Modules
2. Filters
3. Joins
4. Lookups
5. Column Mapping Properties
6. Interface Standards
7. Translation Mapping Standards
8. Domain Staging Standards
9. All BIAPPS Tables Standards
10. Standards to Shortcut the Task, LKP Temp Interface and Datastores

1.     Knowledge Modules Standards

ODI Interface -Flow -Source Set - LKM :

For MAIN Interfaces, choose the appropriate LKM for a given Source and Target combination from the list below:        

Source               Target               LKM
File                 Data Warehouse       LKM BIAPPS File to SQL
EBS, SBL, JDE, PSFT  Data Warehouse       LKM BIAPPS SQL to Oracle
Fusion               Data Warehouse       LKM BIAPPS OBIEE to Oracle
BIA Config Manager   Data Warehouse       LKM BIAPPS SQL to Oracle
Data Warehouse       Data Warehouse       No LKM (NULL)
SBL                  SBL                  No LKM (NULL)
Data Warehouse       BIA Config Manager   LKM BIAPPS SQL to Oracle


For TEMPORARY interfaces, the LKM should be NULL, since the temporary target will be created on the same source instance.

ODI Interface -Flow -Target Set - IKM :

For Temp Interface:
Always IKM BIAPPS SQL Control Append
For MAIN Interface:
All SDEs except Persistent Staging and Class Dimension Staging - IKM BIAPPS SQL Control Append
Persistent Staging and Class Dimension Staging - IKM BIAPPS Oracle Incremental Update


2.     Filters Standards


Try to use filters in the temp interfaces themselves rather than filtering on temp interface columns in any other interface.
Do not combine different filter logic in a single filter; have a separate filter for each piece of logic.
SDE  Mapping Filter  - Standards:

For Temp Interface:

Ensure you use an ODI function or UDF rather than a DB-specific function in the filter expression.
If a filter includes a subquery, always enclose each table name in the subquery in a QUALIFY() function.

Most instance-specific SDEs include the UDF RUN_INCREMENTAL_FULL() in a filter to identify changed records.

For Main Interface:

Avoid Filters in Main interfaces. If you need to filter OLTP data, try to push the filter into the SQ temp interface. Filters should only be on Lookup results.

SDE Mapping Filter Execute On - Main Interface:

As filters should only be on lookup results, and lookups should only be to data on a system different from the OLTP (i.e. a flat file or DW table), Execute On should be 'Staging'.

Note: This is an exception to the interface standard of always using 'Source'.

3.     Joins Standards


Joins standards:

Joins should generally only be used in temp interfaces. Try to avoid joins in the main interface if possible. If you need to join to a table in the same source system, push it to a SQ temp interface. If you need to join to a flat file or a table on another system, try to use a lookup in the main interface.

Join Expression Standards:

Avoid using functions or UDFs in the Join expressions.
Join expressions should not have subqueries - these should be moved to the Filter expression. If implemented in the Join clause, the subquery would be included in the ON clause of the join, and the ANSI standard does not allow subqueries in the ON clause of a JOIN. By moving the subquery to a Filter expression, it is implemented in the WHERE clause.

Join Type Standards:

Use an inner join with a join expression as appropriate. If you need to configure an outer join, use a left outer join with a join expression; do not use right outer joins. Use the 'left outer join' property; do not use the Oracle-specific '(+)' syntax for outer joins in the join expression.
Do not use cross or natural joins.
(Exception - cross joins are allowed in the Fusion adaptor, as VOs often have view links which are implemented as cross joins in ODI.)

Avoid using functions or UDFs in the Join expression.

4 .    Lookup Standards


Avoid lookups in the Temp interface. If you need to join to data from the same system in a temp interface, use a join to that table. If you need to join data from another system, do a lookup in the main interface.
Lookup Alias Standard:

LKP_<Name of Lookup Table>

If you have multiple lookups to the same table, include the usage type as a suffix to the lookup alias. For example, if you have multiple lookups to W_USER_D to retrieve a Created By WID and a Changed By WID, we could name each alias as follows:
LKP_W_USER_D_CREATED
LKP_W_USER_D_CHANGED
Lookup Execute On Standard:

If the lookup table is on the same system as the source table, use 'Source'; otherwise use 'Staging'.
For example, you would use 'Source' in a SIL when extracting from a Staging table and need to lookup a DW table. You would use 'Staging' when looking up a Flat File or in an SDE when extracting from an OLTP table and you need to lookup data in a DW table.
Lookup Type Standard:

Select 'SQL left-outer join in the from clause'. Only in some very special cases, for performance reasons, should you use 'SQL expression in the select clause'.


5.     Column Mapping Properties


The Insert/Update/Active properties always need to be checked.

Note: The Update option is unchecked for the ROW_WID and W_INSERT_DT columns.
SDE mapping column properties -UK check box:

Checked for INTEGRATION_ID and DATASOURCE_NUM_ID (and SRC_EFF_FROM_DT if column exists in target) only.
SDE mapping column properties -CN check box:

Checked for INTEGRATION_ID and DATASOURCE_NUM_ID (and SRC_EFF_FROM_DT if column exists in target) only.
SDE mapping column properties -Execute On:

Interface standards:
If column comes from SQ temp interface - Source
If column comes from LKP temp interface - Staging
Datasource Num Id when populated with variable - Staging
Hard coded columns and columns other than datasource_num_id that are populated using variables - Target

6.     Interface Standards


SDE mapping interface name standards:

For Main Interface:

<Package Name>.<Main Interface Target Name>
Name must be unique (within the Adaptor folder)
Ex.
SDE_ORA_EAMAssetDimension.W_EAM_ASSET_DS
SDE_ORA_EAMAssetFact.W_EAM_ASSET_FS

For Temp Interface:

<Package Name>.<Main Interface Target Name>_SQ_<Base Table Name> OR
<Package Name>.<Main Interface Target Name>_SQ_<Function Name>

Name must be unique (within the Adaptor folder), include SDE and SQ in the name.
Ex.
SDE_ORA_EAMAssetDimension.W_EAM_ASSET_DS_SQ_CSI_ITEM_INSTANCES
SDE_ORA_UserDimension.W_USER_DS_SQ_USERS_EXTRACT




7.     Translation Mapping Standards


ODI Dimension Translation Staging Mapping Columns CN - Main Interface:

Checked for INTEGRATION_ID, DATASOURCE_NUM_ID and LANGUAGE_CODE only

ODI Dimension Translation Staging Mapping Columns UK - Main Interface:

Checked for INTEGRATION_ID,DATASOURCE_NUM_ID and LANGUAGE_CODE only

ODI SDE Source to Warehouse Translation Dimension Staging Filter Expression - Temp Interface:

When extracting from a table that stores translation records, include a filter on the translation table's 'Language' column that matches the LANGUAGE_LIST variable. This variable is a comma-delimited list variable, so quotes are not required (quotes are already included around the list items). It lists the languages that have been deployed in the data warehouse; Configuration Manager manages this variable and creates it automatically using the language character set used by each OLTP.
<Language Column> IN (#LANGUAGE_LIST)
When the OLTP table does not store translations, there is no need to include this filter.

ODI SDE Source to Warehouse Translation Dimension Staging System Column LANGUAGE_CODE - Main Interface:

UK and CN always checked, and
Mapping Expression:
DOMAIN_MEMBER_MAP(
'LANGUAGE',
LANGUAGE_CODE,
#DATASOURCE_NUM_ID,
'W_LANGUAGE'
)

Execute on Staging.
 
ODI SDE Source to Warehouse Translation Dimension Staging System Column LANGUAGE_CODE - Temp Interface:

Only required when populating a translation enabled DW table.
When extracting from a Translation table, map to LANGUAGE_CODE column.
If extracting from a table that does not support translations, populate with variable '#LANGUAGE_BASE' - this will force the record we do have available to be used as the source for generating pseudo-translated records for all languages installed in the data warehouse.
Do not hardcode values for LANGUAGE_CODE (i.e. no 'US' or 'ENU').
Execute on Source.

ODI SDE Source to Warehouse Translation Dimension Staging System Column SRC_LANGUAGE_CODE - Main Interface:

Mapping Expression:
DOMAIN_MEMBER_MAP(
'LANGUAGE',
COALESCE(SRC_LANGUAGE_CODE,LANGUAGE_CODE),
#DATASOURCE_NUM_ID,
'W_LANGUAGE'
)

If Source Language Code column not mapped in Temp Interface, the following is acceptable:

DOMAIN_MEMBER_MAP(
'LANGUAGE',
LANGUAGE_CODE,
#DATASOURCE_NUM_ID,
'W_LANGUAGE'
)

Execute On: Staging

ODI SDE Source to Warehouse Translation Dimension Staging System Column SRC_LANGUAGE_CODE - Temp Interface:

For those OLTPs that support pseudo-translation and provide a Source Language column (i.e. EBS and Fusion but not PSFT), map to this column. If there is no such column, leave it blank.




8.     Domain Staging Standards



ODI SDE Source to Warehouse Domain Staging Interface Flow Target IKM - Main Interface:

The Domain Staging table W_DOMAIN_MEMBER_GS is a type of class staging table, so it should use the same IKM as all class SDEs: IKM BIAPPS Oracle Incremental Update.


ODI SDE Source to Warehouse Domain Staging System Column DOMAIN_CODE - Main Interface:

Mapping Expression = '#DOMAIN_CODE'. Execute on Target.
Domain Code is a single alphanumeric value, so the variable should be enclosed in single quotes. All Main Interface variables except Data Source Num Id should be executed on Target, so #DOMAIN_CODE is executed on Target.

ODI SDE Source to Warehouse Domain Staging System Column DOMAIN_MEMBER_CODE - Main Interface:

Use SUBSTR to take the first 80 characters:
SUBSTR(DOMAIN_MEMBER_CODE,1,80)
If the source column is a number data type, convert it to a string as follows:
SUBSTR(TO_CHAR(DOMAIN_MEMBER_CODE),1,80)
Execute on Source.
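For illustration, simplified Python analogues of the Oracle SUBSTR and TO_CHAR calls used above (SUBSTR's start position is 1-based; the helpers only cover this simple case):

```python
def substr(value, start, length):
    # Simplified analogue of Oracle SUBSTR with a 1-based start position.
    return value[start - 1:start - 1 + length]

def to_char(value):
    # Simplified analogue of Oracle TO_CHAR for plain numeric values.
    return str(value)

member_code = 'A' * 100
print(len(substr(member_code, 1, 80)))       # 80 - truncated to the limit

numeric_code = 12345
print(substr(to_char(numeric_code), 1, 80))  # 12345 - converted, not truncated
```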

ODI SDE Source to Warehouse Domain Staging System Column DOMAIN_MEMBER_CODE - Temp Interface:

Map to appropriate source Code column. Execute on 'Staging' or 'Source'.

ODI SDE Source to Warehouse Domain Staging System Column DOMAIN_MEMBER_DEFN_TYPE_CODE - Main Interface:

Mapping Expression = 'E'. Execute on Target

ODI SDE Source to Warehouse Domain Staging System Column DOMAIN_MEMBER_REF_CODE - Main Interface:

Mapping Expression = '__NOT_APPLICABLE__'. Execute on Target


ODI SDE Source to Warehouse Domain Staging System Column DOMAIN_TYPE_CODE - Main Interface:

Mapping Expression = 'S', Execute on Target


ODI SDE Source to Warehouse Domain Staging System Column INTEGRATION_ID - Main Interface:

Mapping Expression:
'#DOMAIN_CODE' || '~' || SUBSTR(DOMAIN_MEMBER_CODE,1,80)

If Domain Member Code maps to a numeric column in the source, then:
'#DOMAIN_CODE' || '~' || SUBSTR(TO_CHAR(DOMAIN_MEMBER_CODE),1,80)

Execute on Target

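The expression concatenates the domain code and the truncated member code with a '~' delimiter. A small Python sketch of the resulting key shape (the function name and sample values are ours, for illustration only):

```python
def integration_id(domain_code, member_code):
    # Analogue of '#DOMAIN_CODE' || '~' || SUBSTR(DOMAIN_MEMBER_CODE,1,80),
    # converting numeric member codes to strings first (TO_CHAR) and
    # truncating the member code to 80 characters.
    return domain_code + '~' + str(member_code)[:80]

print(integration_id('LANGUAGE', 'US'))  # LANGUAGE~US
print(integration_id('UOM', 42))         # UOM~42
```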

ODI SDE Source to Warehouse Domain Staging Target Datastore - Main Interface:

The target datastore is always W_DOMAIN_MEMBER_GS.

ODI Domain Staging Mapping Columns - Temp Interface:

Temp Interface should have the following columns:
•       DOMAIN_MEMBER_CODE - Datatype should match the source column's datatype (usually either Varchar or Number)
•       DOMAIN_MEMBER_NAME - VARCHAR2(255)
•       DOMAIN_MEMBER_DESCR - VARCHAR2(2000)
•       CREATED_BY_ID - VARCHAR2(80)
•       CHANGED_BY_ID - VARCHAR2(80)
•       CREATED_ON_DT - DATE
•       CHANGED_ON_DT - DATE
•       DELETE_FLG - VARCHAR2(1)
•       LANGUAGE_CODE - VARCHAR2(80)
•       SRC_LANGUAGE_CODE - VARCHAR2(80)



9.     All BIApps Table Standards




Category | Standards Description
Table | Resource Name must be unique
Table | Table name must be all UPPER CASE
Table | Flat File sources should be all lower case
Column | Column Name length cannot be over 30 characters
Column | Column Names cannot be duplicated for a given table
Column | Column name must be all UPPER CASE
Sequence | Tables with ROW_WID must have a sequence
Sequence | Different sequences cannot reuse the same Native Sequence Name
Sequence | Native Sequence Name cannot be over 30 characters
Sequence | Sequence name should match table name with _SEQ or S1W suffix
Sequence | Native Sequence name should match Sequence name unless it would exceed 30 characters
Sequence | Sequence type must be 'Native'
Index | Index cannot be defined as Unique and Bitmap
Index | Index must have 'Active' and 'Defined in Database' properties checked
Index | ROW_WID must have a unique index
Index | DATASOURCE_NUM_ID + INTEGRATION_ID must have a unique index
Index | Except for Fact %_WID columns, bitmap indexes should not be used unless approved by the PSR team
Index | Non-Teradata-specific indexes should be enabled for Oracle


Standards that apply to BIApps table types:

Category | Standards Description
Table | Fact table name length cannot be over 26 characters
Table | Non-Fact table name length cannot be over 30 characters
Index | W_%_TL Tables must have a unique index on INTEGRATION_ID+DATASOURCE_NUM_ID+LANGUAGE_CODE
Index | W_%_T1_D Tables must have a unique index on SCD1_WID
Index | Fact %_WID columns must have a bitmap index

Standards for table column lengths:

•       All WID columns: NUMBER(10)
•       All ID columns: VARCHAR2(80)
•       CODE/TYPE/UOM columns: VARCHAR2(30)
•       NAME columns: VARCHAR2(255)
•       DESCRIPTION columns: VARCHAR2(1000)
•       Metric columns: NUMBER(28,10)
•       INTEGRATION_ID column: VARCHAR2(80) for Dimensions, VARCHAR2(320) for Facts
•       DATASOURCE_NUM_ID column: NUMBER(10)
•       ETL_PROC_WID column: NUMBER(10)
•       TENANT_ID column: VARCHAR2(80)
•       X_CUSTOM column: VARCHAR2(10)
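Several of the naming and length standards above are mechanical enough to check automatically. A hedged sketch of such a validator (the function names are ours, and the rule set is deliberately abbreviated to the casing and length rules):

```python
def check_table(name, is_fact=False, is_flat_file_source=False):
    """Check a table name against a subset of the standards above."""
    problems = []
    if is_flat_file_source:
        if name != name.lower():
            problems.append('flat file sources should be all lower case')
    elif name != name.upper():
        problems.append('table name must be all UPPER CASE')
    # Fact table names are limited to 26 characters, others to 30.
    limit = 26 if is_fact else 30
    if len(name) > limit:
        problems.append(f'name length cannot be over {limit} characters')
    return problems

def check_column(name):
    """Check a column name against the casing and length standards."""
    problems = []
    if len(name) > 30:
        problems.append('column name length cannot be over 30 characters')
    if name != name.upper():
        problems.append('column name must be all UPPER CASE')
    return problems

print(check_table('W_LANGUAGE_D'))                           # []
print(check_table('W_VERY_LONG_FACT_TABLE_NAME', is_fact=True))
print(check_column('row_wid'))
```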



10.     Standards for Shortcutting Tasks, LKP Temp Interfaces and Datastores


How to Shortcut Tasks:

When you shortcut a task, you create a copy of the task folder with shortcuts of the package and interfaces within the task folder.
•       Right click the Task folder in the 'Base' adaptor folder and select 'Duplicate with Shortcuts'. This makes a copy in the same Adaptor folder with 'Copy' prepended to the name.
•       Right click the copy, select 'Cut'
•       Right click the next Adaptor folder and select 'Paste'
•       Rename the copy to remove 'Copy' from the task folder's name
•       Generate a scenario for the package as normal. You must select the 'Generate as if shortcuts are materialized' option.
Since the standard for shortcuts is to always 'forward' shortcut, we need to make a shortcut in the next adaptor based on this shortcut we just made.
•       Right click the newly copied Adaptor folder
•       Select 'Duplicate with Shortcuts',
•       Cut and paste into the next Adaptor and rename
•       Follow the steps below to shortcut the corresponding sources used by this task

How to shortcut LKP temp interfaces:

Reusable LKP temp interfaces should be in the 'Interfaces' folder of the Adaptor folder.  Lookups to Reusable LKP temp interfaces should always leverage a shortcut to that temp interface created in the task folder.
•       Right click base object and select 'Copy'
•       Right click on the Adaptor folder and select 'Paste as shortcut' - the shortcut will appear in that adaptor's 'Interfaces' folder.
Since the standard for shortcuts is to always 'forward' shortcut, we need to make a shortcut in the next adaptor based on this shortcut we just made.
•       Right click the shortcut you just made and select 'Copy'
•       Right click on the next Adaptor folder and select 'Paste as shortcut'
Shortcuts only work if they appear in the task folder where they need to be used.  You need to create another shortcut as follows
•       Right click the shortcut LKP interface in the 'Interfaces' folder of your adaptor, select 'Copy', then 'Paste as shortcut' into the task folder where it needs to be used
•       Rename the shortcut to follow standard naming convention of using the main interface name as a prefix followed by the _LKP or .LKP name.
•       Create a lookup, select the 'interface' tab and navigate to the task folder and select the LKP shortcut just created

Use Case1:
A task in EBS 11510 adaptor uses a Reusable LKP temp interface specific to EBS.
1.   Create a LKP temp interface in the EBS 11510 Adaptor's 'interface' folder
2.   Create a shortcut of that LKP temp interface in each task folder that needs to use it
3.   Create a lookup in each task folder using the shortcut in the same task folder
4.   Create a shortcut of the original LKP temp interface and paste a copy in the EBS R12 adaptor folder's 'interface' folder
5.   Duplicate the task folder in the EBS 11510 adaptor with shortcuts. Move the duplicated task folder to the EBS R12 adaptor folder.
6.   If the task needs to be materialized for some reason, do not use the option to materialize all objects in the task folder. Instead, materialize the individual objects: start with the lowest-level temp interface embedded in another temp interface and work your way up to the SQ temp interface, then materialize the SQ temp interface, the main interface and finally the package. Do not materialize the LKP shortcut.


Use Case 2:
A task in the EBS 11510 must use a reusable LKP temp interface from the SILOS folder
1.   Copy the LKP temp interface in the SILOS adaptor folder and paste as shortcut into the EBS 11510 adaptor folder
2.   Start with step 2 as in Use Case 1

Use Case 3:
A new task is introduced in EBS R1212 that is not based on a shortcut to an earlier adaptor. It needs to use a reusable LKP temp interface from the SILOS folder
If the LKP temp interface has not been used in any prior EBS adaptor, you can skip creating the shortcut in the base folder and instead create a shortcut in the first adaptor folder where it is used (EBS R1212 in this case). Be sure to forward shortcut this LKP temp interface to later adaptors. Continue with step 2 in Use Case 1.

How to shortcut Datastores:

•       Right click the base datastore and select 'Copy'
•       Right click on the next Model-Submodel and select 'Paste as shortcut'
Since the standard for shortcuts is to always 'forward' shortcut, we need to make a shortcut in the next model based on this shortcut we just made.
•       Right click the shortcut you just made and select 'Copy'
•       Right click on the next Model-Submodel and select 'Paste as shortcut'