Sunday 13 December 2015

INFORMATICA SCD TYPE 2 DATE METHOD


Step  1 :

Create two tables like below

create table emp_daily
(eid integer, ename varchar(30), job varchar(30), mgr integer, jdate date, sal integer);

insert into emp_daily values (1, 'vinay', 'it', 2, '01-oct-2015', 30000);

 

create table emp_hist
(sno integer, eid integer, ename varchar(30), job varchar(30), mgr integer, sal integer, start_dt date, end_dt date);

Step 2:

1. Take EMP_DAILY and EMP_HIST into the mapping.

2. Take a Lookup transformation, use EMP_HIST as the lookup object, and compare EID with IN_EID (input port).

3. Take an Expression transformation: take all columns from the source qualifier, and from the lookup connect sno, eid, enm, mgr, job to the expression transformation.

Take new additional ports with the below data types and expressions

Opt_New_Record -> String -> IIF(ISNULL(lkp_EID), TRUE, FALSE)

--IF NULL,IT IS NEW RECORD,OTHERWISE OLD/UPDATE RECORD

 

Opt_Update_Record -> String -> IIF(EID = lkp_EID AND (JOB <> lkp_JOB OR MGR <> lkp_MGR OR SAL <> lkp_SAL), TRUE, FALSE)

--IF EID MATCHES AND THERE IS A CHANGE IN ANY OTHER ATTRIBUTE

 

Opt_Start_Date -> Date -> SYSDATE

4. Take a Filter transformation; connect eid, enm, job, mgr, sal, jdate and New_Record from the expression.

Filter condition: NEW_RECORD = TRUE

5. Take the target and connect eid, enm, job, mgr, sal, jdate and Start_Date. Do not connect END_DT.

So that END_DT will be NULL; for a new record it should be NULL.

6. Take a Sequence Generator transformation and connect NEXTVAL to the target SNO.

7. Take one more Filter transformation for the UPDATE operation (filter condition: UPDATE_RECORD = TRUE).

Connect eid, enm, job, sal, mgr, lkp_SNO, Update_Record and Start_Date from the expression.

8. Take one more copy of the target for the UPDATE-INSERT operation.

Pass EID, ENM, SAL, MGR, JOB and JDATE, connect Start_Date to START_DT, and leave END_DT unconnected (NULL).

This path inserts the updated version of the record.

9. Update END_DT with Start_Date.

Take an Update Strategy transformation; take lkp_SNO and START_DATE.

Take one more copy of the target table and connect lkp_SNO to SNO and START_DATE to END_DT.

Set the update strategy expression to DD_UPDATE.

10. Go to the lookup and use a SQL override with the where clause WHERE end_dt IS NULL, to get only active records.

11.   Create session and workflow
SEE THE BELOW MAPPING FOR MORE INFO
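The steps above can be sketched in plain Python. This is an illustration of the date-method logic only, not Informatica code: a dict of active history rows plays the lookup (with the END_DT IS NULL override), a counter plays the sequence generator, and the `today` argument plays SYSDATE.

```python
from datetime import date

def scd2_load(daily_rows, hist_rows, today):
    # lookup SQL override: only active rows (end_dt is null)
    active = {r["eid"]: r for r in hist_rows if r["end_dt"] is None}
    next_sno = max((r["sno"] for r in hist_rows), default=0) + 1  # sequence generator
    for src in daily_rows:
        lkp = active.get(src["eid"])
        changed = lkp is not None and (
            (src["job"], src["mgr"], src["sal"]) != (lkp["job"], lkp["mgr"], lkp["sal"]))
        if lkp is None or changed:
            if changed:
                lkp["end_dt"] = today  # DD_UPDATE path: close the old active row
            # insert path: the new version starts today with an open end date
            hist_rows.append({"sno": next_sno, "eid": src["eid"], "ename": src["ename"],
                              "job": src["job"], "mgr": src["mgr"], "sal": src["sal"],
                              "start_dt": today, "end_dt": None})
            next_sno += 1
    return hist_rows
```

Running it twice with a changed JOB for the same EID produces two history rows: the old one closed with today's date, the new one open, just as in the mapping.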
 
 

 

Saturday 12 December 2015

INFORMATICA MAPPING VARIABLES AND PARAMETERS

If you declare mapping parameters and variables in a mapping, you can reuse a mapping by altering the parameter and variable values of the mapping in the session. This can reduce the overhead of creating multiple mappings when only certain attributes of a mapping need to be changed.
At the end of a successful session run, the Integration Service saves the final value of a mapping variable to the repository. The next time you run the session, it evaluates references to the variable using the saved value. To override the saved value, define the start value of the variable in a parameter file or assign a value in the pre-session variable assignment in the session properties.

Mapping parameters and variables declared for a mapping cannot be used within a mapplet.
Similarly, you cannot use a mapping parameter or variable declared for a mapplet in a mapping.

When you want to use the same value for a mapping parameter each time you run the session, use the same parameter file for each session run. When you want to change the value of a mapping parameter between sessions you can perform one of the following tasks:

 a) Update the parameter file between sessions.

 b) Create a different parameter file and configure the session to use the new file.

 c) Remove the parameter file from the session properties. The Integration Service uses the parameter value in the pre-session variable assignment. If there is no pre-session variable assignment, the Integration Service uses the configured initial value of the parameter in the mapping.


For example, you can create a user-defined mapping variable $$LastUpdateDateTime that saves the timestamp of the last row the Integration Service read in the previous session. Use $$LastUpdateDateTime for the beginning timestamp and the built-in variable $$$SessStartTime for the end timestamp in the source filter. Use the following filter to incrementally extract data based on the SALES.sales_datetime column in the source:

SALES.sales_datetime > TO_DATE ('$$LastUpdateDateTime') AND SALES.sales_datetime < TO_DATE ('$$$SessStartTime')
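A rough Python simulation of this incremental window (an illustration, not Informatica code): the saved variable holds the newest timestamp read so far, and each run picks only rows strictly between the saved value and the session start time.

```python
from datetime import datetime

def incremental_extract(rows, last_update, sess_start):
    # the source filter: last_update < sales_datetime < session start time
    picked = [r for r in rows if last_update < r["sales_datetime"] < sess_start]
    # what a SETMAXVARIABLE-style expression would persist for the next run
    new_last = max((r["sales_datetime"] for r in picked), default=last_update)
    return picked, new_last
```

A second run with the persisted value picks up nothing until newer rows arrive, which is exactly the incremental behavior the filter is meant to give.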

INITIAL VALUES:
================
String --- Empty string.

Numeric --- 0

Datetime -- 1/1/1753 A.D. or 1/1/1 when the Integration Service is configured for compatibility with 4.0.
======================================================================================================
Variable Values
The Integration Service holds two different values for a mapping variable during a session run:

 Start value of a mapping variable

 Current value of a mapping variable
 

The current value of a mapping variable changes as the session progresses. To use the current value of a mapping variable within the mapping or in another transformation, create the following expression with the SETVARIABLE function:

SETVARIABLE($$MAPVAR,NULL)
At the end of a successful session, the Integration Service saves the final current value of a mapping variable to the repository.

EXAMPLE 1:

1. CREATE A MAPPING WITH TWO PARAMETERS AND ONE VARIABLE
 NAME:$$PAR_INC
 TYPE:PARAMETER
 DATA TYPE: INTEGER
 IS EXPRESSION: FALSE
 SPECIFY: INITIAL VALUE :20000

 NAME:$$PAR_EXP_RATING
 TYPE:PARAMETER
 DATA TYPE: STRING
 LENGTH: 200  --VERY IMPORTANT
 IS EXPRESSION: TRUE --VERY IMPORTANT

 NAME:$$VAR_INC
 TYPE:VARIABLE
 DATA TYPE: INTEGER
 IS EXPRESSION: FALSE
 INITIAL VALUE: 2000

2. CREATE THE MAPPING LIKE BELOW
 SQ->EXP->TARGET

 SQ--
  SELECT EMP.EID, EMP.ENM, EMP.ELOC, EMP.JDATE, EMP.SAL
  FROM
   EMP
  WHERE SAL>=$$PAR_INC 

   AND JDATE<=TO_DATE ('$$$SessStartTime','mm/dd/yyyy hh24:mi:ss') --USE THE DEFAULT SESSION PARAM
 EXP-->
  SAL_INC-->SAL + $$VAR_INC
  GEN-->STRING (200)-->$$PAR_EXP_RATING


3. GOTO SESSION ->SOURCE CONNECTION->USE VARIABLE->$DBConnection_SRC
    Target Connection->Use variable ->$DBConnection_TGT

4. CREATE A PARAMETER FILE LIKE BELOW


[SUMATHI.s_Map_Var]
$$PAR_INC=30000
$$PAR_EXP_RATING=IIF(SAL<=20000 OR ISNULL(SAL) ,'POOR',IIF(SAL<=30000,'AVG','GOOD'))
$DBConnection_SRC=SUMATHI_ORACLE_CONN
$DBConnection_TGT=SUMATHI_ORACLE_CONN


5. Goto session->properties->specify Param file location
6. Run session
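To make the parameter-file format concrete, here is a minimal sketch of how one [folder.session] section like the one above could be read. This is a simplified stand-in; the real Integration Service parser handles much more (workflow and worklet headings, service variables, escaping, and so on).

```python
def read_param_section(text, section):
    # collect name=value pairs from the requested [folder.session] section only
    params, in_section = {}, False
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("["):
            in_section = (line == "[" + section + "]")
        elif in_section and "=" in line:
            name, _, value = line.partition("=")  # split at the first '='
            params[name.strip()] = value.strip()
    return params
```

Note that the split happens at the first '=' only, so expression values such as the $$PAR_EXP_RATING IIF (which contains '<=') survive intact.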

INFORMATICA XML SOURCE TRANSFORMATION

For XML file processing, an XSD is required.
Collect the XML file and XSD files from the institute.

1. Import XML definition in source analyzer like below
      Source Analyzer->Sources->Import XML Definition->Specify XSD file->OK
2.  Create a mapping with the below flow.
   Drag and drop Party_XML from the sources list and connect it to a flat-file target to load.
3. Create session and specify the below

    For source 
                      Specify XML file location (not XSD)
                      Directory: c:\
                      Filepath: Data\party_xml.xml
   For Destination
                   Directory: c:\
                   Filepath : Data\XML_Opt.txt
            
 

Friday 11 December 2015

INFORMATICA UNCONNECTED STORED PROCEDURE TRANSFORMATION

CREATE THE BELOW IN ORACLE
=========================================================================

create table emp(eid integer, enm varchar(30), did integer);
insert into emp values (1, 'vinaytech', 10);
insert into emp values (2, 'vinay', 20);

create table dept(did integer, dnm varchar(30));
insert into dept values (10, 'IT');
insert into dept values (20, 'HR');





CREATE PROCEDURE1
=====================
create or replace procedure pr(id in integer, nm out varchar2)
as
begin
select dnm into nm from dept where did=id;
end;

CREATE PROCEDURE2
=====================

CREATE OR REPLACE PROCEDURE pr1(outParam1 OUT VARCHAR2)
IS
BEGIN
 
  outParam1 := 'Vinaytech OUT parameter';

END;


TARGET TABLE
=============


create table sp_result(eid integer, enm varchar(30), did integer, dnm varchar(30), msg varchar(30));
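To show what the mapping with these two unconnected procedures would compute, here are pure-Python stand-ins (assumptions for illustration, not actual Informatica or PL/SQL calls): PR resolves DNM for a DID, PR1 supplies a constant OUT message, and each EMP row lands in SP_RESULT with both values filled in.

```python
def pr(dept_rows, did):
    # mirrors PR(id IN, nm OUT): select dnm from dept where did = id
    return next(d["dnm"] for d in dept_rows if d["did"] == did)

def pr1():
    # mirrors PR1(outParam1 OUT): a constant OUT parameter
    return "Vinaytech OUT parameter"

def build_sp_result(emp_rows, dept_rows):
    # one SP_RESULT row per EMP row, enriched with dnm and msg
    return [{**e, "dnm": pr(dept_rows, e["did"]), "msg": pr1()} for e in emp_rows]
```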













 

TERADATA FASTLOAD ADDITIONAL FAQS




Saturday 5 December 2015

INFORMATICA PUSH DOWN OPTIMIZATION (PDO)

PUSH DOWN OPTIMIZATION
=======================

CREATE MAPPING LIKE BELOW
--------------------------

 SRC(PARTY) -> FILTER(PARTYINCOME > 1000) -> SORT(PARTYLOC DESC FIRST SORT COL, PARTYCODE ASC SECOND SORT COL, PARTYINCOME DESC THIRD SORT COL) ->
 EXP(THREE PORTS ADDED:
  CREATED_BY: 'VINAY'
  CREATED_DATE: SESSSTARTTIME
  TOTAL_SAL = SAL * 12) -> DEST




CREATE SESSION LIKE BELOW (WITHOUT PUSH DOWN OPTIMIZATION)
----------------------------------------------------------

CREATE SESSION WITH ADDITIONAL PROPERTIES

 WRITE BACKWARD COMPATIBLE SESSION LOG --TICK IT
 COLLECT PERFORMANCE DATA --CHECK IT

 (THIS WILL GIVE YOU COMPLETE TRANSFORMATION IN AND OUT RECORD COUNT)
 NOTE: DELETE OLD SESSION LOGS IN C:\Informatica\9.5.0\server\infa_shared\SessLogs

RUN THE WORKFLOW WITHOUT PUSHDOWN FEATURE
-------------------------------------------
 RUN THE WORKFLOW AND SEE THE BELOW
 A) SESSION LOG->GET RUN PROPERTIES
 B) GOTO INFA_SHARED\SESSLOGS AND SEE THE SESSION LOG SELECT STATEMENT
 C) GOTO INFA_SHARED\SESSLOGS AND SEE THE PERFORMANCE COUNTER FILE
  AND OBSERVE EACH TRANSFORMATION INPUT AND OUT PUT COUNT OF ROWS


PREVIEW PUSH DOWN OPTIONS IMPLEMENTATION IN VIEWER
---------------------------------------------------
 SESSIONS->MAPPINGS->PUSH DOWN OPTIMIZATION
    TO SEE /VIEW (VIEWER)HOW MUCH SOURCE AND TARGET CAN GENERATE LOGIC AND PASSES

 TAKE OPTIONS ONE BY ONE AND PREVIEW THE FLOW


DESC:
=====
TO SOURCE --> HOW MUCH LOGIC CAN BE PUSHED TO THE SOURCE DATABASE
TO TARGET --> HOW MUCH LOGIC CAN BE PUSHED TOWARDS THE TARGET DATABASE
FULL --> SELECTION, INSERTION, ETC. ARE ALL CONVERTED TO SQL. NOTHING PASSES THROUGH THE PIPELINE; THE DATA MOVES TO THE TARGET INSIDE THE DATABASE.
TO_SOURCE AND TO_TARGET HAVE MORE LIMITATIONS; FULL PDO HAS FEWER.

FOR FULL PDO, THE SOURCE AND TARGET SHOULD USE THE SAME RELATIONAL CONNECTION (IF YOU GET AN ERROR, DO THIS).

PUSHDOWN IMPLEMENTATION
-----------------------
SPECIFY THE BELOW FEATURE IN THE PREVIOUS SESSION LEVEL.

SESSION->PROPERTIES TAB->PUSHDOWN OPTIMIZATION->TICK MARK AND CHOOSE OPTION FULL
RUN THE WORK FLOW AND OBSERVE THE BELOW

 A) SESSION STATISTICS APPEAR ONLY ON THE PDO TABLE (NOT ON SOURCE AND TARGET)
 B) SEE THE QUERY IN THE PERFORMANCE LOG FILE (INSERT INTO ... SELECT STATEMENT)
 C) OBSERVE THE PERFORMANCE FILE: NO PER-TRANSFORMATION ROW COUNTS ANY MORE


 

Saturday 28 November 2015

INFORMATICA_LOOKUP_PRACTICALS

WORKING WITH LOOK-UP TRANSFORMATION
===================================
SC1: LOADING MATCHED DATA INTO ONE DESTINATION AND UNMATCHED INTO ANOTHER DESTINATION
--------------------------------------------------------------------------------------
CREATE EMP(EID,ENM,DID),DEPT(DID,DNM) TABLES.
REQUIRED OUTPUT: MATCHED DID VALUES GO INTO ONE TABLE AND UNMATCHED ROWS INTO THE OTHER TABLE.
DEST1: EID,ENM,DID,DNM
DEST2: EID,ENM,DID

EMPSOURCE-->SQ->LOOKUP(DEPT)-->EXPRESSION->ROUTER->TWO DESTINATIONS.
LOOKUP-->TABLE (DEPT),CONDITION--EMP.DID=DEPT.DID (FOR THIS DRAG ONE INPUT COLUMN DID FROM SQ TO LOOKUP), LOOKUP PORTS DID,DNM
EXPRESSION: DRAG THE LOOKUP COLUMNS (lkp_DID, lkp_DNM) FROM THE LOOKUP AND EID, ENM, DID FROM SQ; CREATE ONE ADDITIONAL PORT WITH THE BELOW OPERATION
 MATCH_UNMATCH --> IIF(DID = lkp_DID, 'MATCH', 'UNMATCH')  --THE LOOKUP RETURNS NULL ON NO MATCH, SO THE COMPARISON FALLS TO 'UNMATCH'
ROUTER:
 MATCH_UNMATCH = 'MATCH' --ONE GROUP
 MATCH_UNMATCH = 'UNMATCH' --ANOTHER GROUP
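SC1 can be sketched in plain Python (an illustration of the flow, not Informatica code): a dict plays the lookup cache on DEPT, and the two lists play the router groups feeding the two destinations.

```python
def route_by_lookup(emp_rows, dept_rows):
    dnm_by_did = {d["did"]: d["dnm"] for d in dept_rows}  # lookup on DEPT
    matched, unmatched = [], []
    for e in emp_rows:
        dnm = dnm_by_did.get(e["did"])  # None plays the lookup's NULL on no match
        if dnm is not None:
            matched.append({**e, "dnm": dnm})   # DEST1: eid, enm, did, dnm
        else:
            unmatched.append(dict(e))           # DEST2: eid, enm, did
    return matched, unmatched
```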






SC2:IMPLEMENTING INCREMENTAL LOADING
---------------------------------------
CREATE EMP1,EMP2 WITH COLUMNS (EID,ENM,ELOC),TAKE DATA INTO EMP1
EMP1->SQ->LOOKUP->EXP->FILTER->EMP2
LOOKUP-->TABLE(EMP2),FROM SOURCE DRAG AND DROP EID,ENM,ELOC,CONDITION(EID=EID,ENM=ENM,ELOC=ELOC),LOOKUP PORTS (ALL THESE THREE)
EXPRESSION --> TAKE lkp_EID, lkp_ENM, lkp_ELOC FROM THE LOOKUP AND EID, ENM, ELOC FROM SQ.
   ADD AN ADDITIONAL PORT LIKE BELOW
  NEW_OLD = IIF(EID = lkp_EID AND ENM = lkp_ENM AND ELOC = lkp_ELOC, 'OLD', 'NEW')
FILTER --> NEW_OLD = 'NEW'
FIRST RUN: ALL RECORDS ARE INSERTED AS NEW.
2ND RUN: ADD SOME OLD AND SOME NEW ROWS TO EMP1 AND RUN THE WORKFLOW; ONLY THE RECENTLY ADDED NEW RECORDS SHOULD BE INSERTED.
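SC2 as a plain-Python sketch (illustration only, not Informatica code): a row is OLD when an identical (eid, enm, eloc) row already exists in EMP2; only NEW rows pass the filter and get inserted.

```python
def incremental_load(emp1_rows, emp2_rows):
    existing = {(r["eid"], r["enm"], r["eloc"]) for r in emp2_rows}  # lookup cache on EMP2
    new_rows = [r for r in emp1_rows
                if (r["eid"], r["enm"], r["eloc"]) not in existing]  # filter NEW_OLD = 'NEW'
    emp2_rows.extend(new_rows)  # insert only the new rows into the target
    return new_rows
```

Calling it twice reproduces the two runs above: the first run inserts everything, the second inserts only the rows added to EMP1 in between.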

3. UNCONNECTED LOOK UP EXAMPLE PRACTICE
-------------------------------------------
GET DNM FROM DEPT TABLE (LOOKUP -UNCONNECTED) BASED ON DID MATCH BETWEEN EMP AND DEPT
1. SOURCE(EMP) --> EXPRESSION (CONNECT ALL EMP COLUMNS; CREATE ONE ADDITIONAL PORT TO CAPTURE DNAME WITH THE BELOW EXPRESSION)
     DNAME (STRING): :LKP.LKP(DID) --> DESTINATION TABLE (EID, ENM, DID, DNM)
2.IN THE MAPPING CREATE LOOK UP LIKE BELOW-> CHOOSE DEPT TABLE
     A) ADD NEW PORT WHICH RECEIVE INPUT DID
      I_DID INTEGER INPUT
     B) TICK MARK DNM RETURN PORT
     C) CONDITION (DID=I_DID)
3. RUN THE WORKFLOW TO GET THE DESIRED OUTPUT.
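The key idea of an unconnected lookup is that it behaves like a function call returning a single port. This closure is an analogy in Python, not Informatica code: `lkp` plays the role of the :LKP.LKP(DID) call in the expression above.

```python
def make_dept_lookup(dept_rows):
    cache = {d["did"]: d["dnm"] for d in dept_rows}  # built once, like a cached lookup
    def lkp(i_did):
        # condition DID = I_DID; DNM is the return port; None plays NULL
        return cache.get(i_did)
    return lkp
```

Any expression in the mapping can then call it with a DID and receive the matching DNM, or NULL when there is no match.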