Tuesday, 7 January 2014

Get ODI Load Plan step name dynamically using ODI Substitution methods

ODI 11g introduced Load Plan functionality, along with substitution methods that let you create generic, reusable objects in ODI.

Case: the requirement is to refresh multiple materialized views from ODI Load Plans.

1. Create an ODI procedure to refresh a materialized view.

begin
  -- DBMS_SNAPSHOT is a legacy synonym for DBMS_MVIEW
  dbms_snapshot.refresh('SCHEMA_NAME.MVIEW_NAME', ATOMIC_REFRESH=>FALSE);
end;

If we have hundreds of jobs to execute, we would have to create that many ODI procedures and add each of them to the load plans, which creates significant job-integration overhead in ODI.

Instead, we recommend creating one generic, reusable ODI procedure and using it across the load plans; name each load plan step after the object to refresh and retrieve that name with the <%=odiRef.getLoadPlanStepInstance("STEP_NAME")%> substitution method.

Add the code below to the ODI procedure.

begin
  -- The step name is substituted into the code at generation time
  dbms_snapshot.refresh('SCHEMA_NAME.<%=odiRef.getLoadPlanStepInstance("STEP_NAME")%>', ATOMIC_REFRESH=>FALSE);
end;



2. Add the ODI procedure to the load plan and rename the load plan steps after the materialized views to refresh. Each step name is captured by the procedure code at run time.
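To see what this buys us, here is a small Python sketch that mimics what the ODI agent does at generation time: the load plan step name is spliced into the generic procedure's code. The step names below are hypothetical examples; the real expansion is performed by the ODI agent, not by this code.

```python
# Mimic how the ODI agent expands
# <%=odiRef.getLoadPlanStepInstance("STEP_NAME")%> at generation time.
# Step names here are hypothetical examples.

TEMPLATE = (
    "begin\n"
    "  dbms_snapshot.refresh('SCHEMA_NAME.{step_name}', ATOMIC_REFRESH=>FALSE);\n"
    "end;"
)

def expand(step_name: str) -> str:
    """Splice the load plan step name into the generic procedure code."""
    return TEMPLATE.format(step_name=step_name)

# One generic procedure serves every step; only the step name differs.
for step in ["MV_SALES_SUMMARY", "MV_CUSTOMER_SUMMARY"]:
    print(expand(step))
```

This is why a single procedure scales to hundreds of jobs: the only per-job artifact is the step name in the load plan.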

3. Execute the load plan and review the logs for the executed jobs. In the code of the executed procedure you can see the step (job) name that was provided in the load plan.



Cheers..!! :)


Sunday, 5 January 2014

Configuration of ODI Java EE agent.

Create the physical agent in Topology with the following parameters:


Name = agent name
Host = the host on which the application server is installed
Port = the port on which the application server listens
Web application context = oraclediagent (the default)


Create the application server credential store:

a. Go to <ODI_HOME>/common/bin and run wlst.sh.
b. wls:/offline> connect('weblogic','weblogic123','t3://localhost:7001')
c. wls:/odi_11g/serverConfig> createCred(map="oracle.odi.credmap",key="SUPERVISOR",user="SUPERVISOR",password="odisuper",desc="Key for SUPERVISOR")
d. wls:/odi_11g/serverConfig> disconnect()
The credential store is now created for the application server, with the SUPERVISOR key defined; this key will be used while generating the WLS template.

Generate the server template for the Java EE agent.
Before generating the server template, define the data sources for the agent.

a. Drag the work repository data server into the data sources and define the JNDI names.
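For reference, the JNDI names used by the ODI Java EE agent data sources typically look like the following; these are the common defaults, so verify the exact names against your own template:

```text
jdbc/odiMasterRepository   ->  master repository data server
jdbc/odiWorkRepository     ->  work repository data server
```

The same JNDI names must appear both in the agent template's data sources and in the data sources you configure on the WebLogic server, otherwise the deployed agent cannot look up its repository connections.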


b. Generate the server template.





Go to the location where the generated jar file was placed, e.g. /oradata/ODI_SETUP/, and extract it:

jar -xvf WLST_AGENT.jar

Go to the /oradata/odi_home/agentapp location and verify that the oracledi.ear file has been created. Deploy this file on the application server (e.g. odi_server1).





Now check the deployments of odi_server1 and verify that the agent application's state is Active.





Test the Java EE agent connection from Topology.



Yehhhh!!! We have successfully configured the ODI Java EE agent... Njjoy! :)

Saturday, 19 October 2013

Oracle Data Integrator for beginners.

Introduction – Oracle Data Integrator (ODI)

Oracle purchased Sunopsis in October 2006 and re-branded it as Oracle Data Integrator (ODI).
Oracle Data Integrator is an application that uses the database for set-based data integration.
Oracle Data Integrator provides a fully unified solution for building, deploying, and managing complex data warehouses.
ODI is a widely used data integration product. It provides a declarative design approach to defining data transformation and integration processes, resulting in faster and simpler development and maintenance. Based on a unique "E-LT" architecture, Oracle Data Integrator not only delivers a very high level of performance for the execution of data transformation and validation processes but is also among the most cost-effective solutions available today.
ODI unifies silos of integration by transforming large volumes of data efficiently and processing events in real time through its advanced Changed Data Capture (CDC) capability, with time-based automation.
Oracle Data Integrator's E-LT (Extract-Load-Transform) architecture also eliminates the need for an ETL server sitting between the source and target servers.

A) Why Oracle Data Integrator?

  • E-LT architecture provides high performance.
  • Active integration enables real-time data warehousing and operational data hubs.
  • Declarative design improves developer productivity.
  • Knowledge Modules provide flexibility and extensibility.
  • ODI combines three styles of data integration: data-based, event-based and service-based.
  • ODI shortens implementation times with its declarative design approach.

B) Today’s Business Issues:

In today's increasingly fast-paced business environment, organizations need to use more specialized software applications; they also need to ensure these applications coexist on heterogeneous hardware platforms and systems, and to guarantee the ability to share data between applications and systems. Projects that implement these integration requirements need to be delivered on-spec, on-time and on-budget.

C) ODI provides Unique Solutions:

The key reasons why more than 500 companies have chosen Oracle Data Integrator for their ETL needs:

• Job dependency management: ODI provides the functionality to design complex functional architectures for the required business logic; object dependencies are defined once and then managed by ODI.
• Faster and simpler development and maintenance: the declarative-rules-driven approach to ETL greatly reduces the learning curve and increases developer productivity while facilitating ongoing maintenance. This approach separates the definition of the processes from their actual implementation, and separates the declarative rules (the "what") from the data flows (the "how").
• ETL automation: ODI can remove manual interventions from the ETL process that would otherwise disrupt day-to-day business activities; to further improve productivity and performance it can be combined with real-time data replication products.
• Better execution performance: traditional ETL software is based on proprietary engines that perform data transformations row by row, thus limiting performance. By implementing the E-LT architecture, based on your existing RDBMS engines and SQL, you can execute data transformations on the target server at a set-based level, giving much higher performance.
• Platform independence: Oracle Data Integrator supports all platforms, hardware and operating systems with the same software.
• Data connectivity: Oracle Data Integrator supports all RDBMSs, including leading data warehousing platforms such as Teradata, IBM DB2, Netezza, Oracle and Sybase IQ, and numerous other technologies such as flat files, ERPs, LDAP and XML.

• Cost savings: eliminating the ETL hub server and ETL engine reduces both the initial hardware and software acquisition costs and ongoing maintenance costs. The reduced learning curve and increased developer productivity significantly reduce the overall labor costs of the project, as well as the cost of ongoing enhancements.

D) ODI Architecture:

The ODI architecture is organized around a modular repository, which is accessed in client-server mode by components such as the ODI Studio and execution Agents that are written entirely in Java.
The architecture also includes a web-based application, ODI Console, which enables users to access information through a Web interface, as well as an extension for the Oracle Fusion Middleware Control Console.


Fig 1: Oracle Data Integrator Architecture

E) Traditional ETL vs. E-LT Process

1) Extract: extract the data from the various sources.
2) Load: load the data into the target server.
3) Transform: transform the data according to a set of business rules.

In response to the issues raised by ETL architectures, a new architecture has emerged, which in many ways incorporates the best aspects of manual coding and automated code-generation approaches.
Known as “E-LT”, this new approach changes where and how data transformation takes place, and leverages existing developer skills, RDBMS engines and server hardware to the greatest extent possible.
In essence, E-LT moves the data transformation step to the target RDBMS, changing the order of operations to Extract, Load, and then Transform.
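The contrast can be illustrated with a toy sketch: rather than transforming rows one by one in a middle-tier ETL engine, E-LT loads the raw rows into the target database and runs a single set-based SQL statement there. In this sketch sqlite3 stands in for the target RDBMS, and the table and column names are made up for the example:

```python
import sqlite3

# sqlite3 stands in for the target RDBMS; all names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_orders (order_id INTEGER, amount REAL)")
conn.execute("CREATE TABLE dw_orders  (order_id INTEGER, amount_usd REAL)")

# 1) Extract + 2) Load: raw rows land in a staging table on the target.
rows = [(1, 10.0), (2, 20.0), (3, 30.0)]
conn.executemany("INSERT INTO stg_orders VALUES (?, ?)", rows)

# 3) Transform: one set-based statement executed by the target engine,
# instead of row-by-row processing in a separate ETL server.
conn.execute(
    "INSERT INTO dw_orders (order_id, amount_usd) "
    "SELECT order_id, amount * 1.1 FROM stg_orders"
)
conn.commit()

loaded = conn.execute(
    "SELECT COUNT(*), SUM(amount_usd) FROM dw_orders"
).fetchone()
print(loaded)  # row count and transformed total
```

The set-based `INSERT ... SELECT` is the essence of the E-LT approach: the transformation runs where the data already lives, using the engine's native SQL.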