
RH Donnelley Corporation, Raleigh NC May 04 – Mar 05
ETL Developer
RH Donnelley Corporation is an advertising company providing businesses with affordable, effective advertising. The project was to design, develop, implement and maintain ETL components for the sales data mart to provide analytical information. The data mart was built with Oracle as both the source and the target database.
Responsibilities:
• Interacted with Business Analysts to understand the business requirements.
• Created users/groups and folders using Repository Manager.
• Worked extensively with Source Analyzer, Warehouse Designer, Transformation Designer, Mapping Designer and Mapplet Designer.
• Extracted source data from the source systems using Informatica tools and stored procedures.
• Developed transformation logic and designed various complex mappings and mapplets using the Designer.
• Designed various mappings using transformations like Lookup, Router, Update Strategy, Filter, Sequence Generator, Joiner, Aggregator, and Expression.
• Used mapplets, parameters and variables to apply object-oriented techniques and facilitate code reuse.
• Developed and executed UNIX scripts as pre/post-session commands to schedule loads through the SQL*Loader utility (see the sketch below).
• Configured and ran the Debugger from within the Mapping Designer to troubleshoot predefined mappings.
• Extensively involved in fine-tuning the Informatica code (mappings and sessions), stored procedures and SQL to obtain optimal performance and throughput.
Environment: Informatica PowerCenter/PowerMart 7.1/6.2, Oracle 8i/9i, SQL, SQL*Plus, TOAD, IBM DB2 8.0, Erwin 4.0, Sun UNIX 8, Windows NT, PL/SQL Developer 5.1
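For illustration, a minimal sketch of the pre/post-session pattern described in the entry above: a UNIX wrapper that loads a flat file into Oracle with SQL*Loader and fails the calling session on error. The connection string, control file and paths are hypothetical placeholders, not taken from the project.

```sh
#!/bin/sh
# Post-session load wrapper: a minimal sketch, assuming sqlldr is on PATH.
# ORA_CONN, CTL_FILE and DAT_FILE are hypothetical placeholders.
ORA_CONN="etl_user/etl_pass@SALESDM"
CTL_FILE=/etl/ctl/sales_stage.ctl
DAT_FILE=/etl/data/sales_feed.dat
LOG_FILE=/etl/log/sales_stage_$(date +%Y%m%d).log

sqlldr userid="$ORA_CONN" control="$CTL_FILE" data="$DAT_FILE" log="$LOG_FILE"
RC=$?

# SQL*Loader exits 0 on success and 2 on warnings; treat anything else as
# failure so the Informatica session that invoked this script is marked failed.
if [ "$RC" -ne 0 ] && [ "$RC" -ne 2 ]; then
    echo "SQL*Loader failed with exit code $RC, see $LOG_FILE" >&2
    exit 1
fi
exit 0
```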

EquiFirst Corporation, Charlotte NC Aug 03 – Apr 04
ETL Developer
EquiFirst is one of the premier non-conforming wholesale mortgage lenders in the United States, providing innovative mortgage products and fast closings to independent mortgage brokers throughout the country. Assigned as ETL Developer at EquiFirst to design, develop, and implement the ETL process to load the Products Data Mart, which helped generate Business Intelligence and Performance Management reports.
Responsibilities:
• Involved in the installation of Informatica PowerCenter 6.2.
• Involved in gathering requirements from business users.
• Participated in the detailed requirement analysis for the design of data marts and star schemas.
• Created mappings using various transformations like Update Strategy, Lookup, Stored Procedure, Router, Joiner, Aggregator and Expression.
• Designed many multi-source single-target mappings and vice versa.
• Used shell scripting to schedule jobs on UNIX and to upload data files into tables of the Oracle database.


• Involved in performance tuning and testing at the mapping level.
• Involved in end-user training and support.
Environment: Informatica PowerCenter 6.x, Oracle 8i, Microsoft SQL Server 2000, flat files, Windows 2000, UNIX, Cognos 7.x.

OBJECTIVE
To build a career in a progressive organization that will provide an intense learning opportunity to grow and utilize my software skills and experience.

Work status in Canada: Work Permit

SUMMARY
• 3+ years of experience in data warehousing and ETL processes using Informatica PowerCenter 8.x & 7.x and Microsoft technologies
• Extensive experience with data extraction, transformation and loading from disparate data sources, including relational databases such as Microsoft SQL Server and Oracle, and integrating data from flat files into a common reporting and analytical data model using Informatica
• Strong in data warehousing concepts as well as methodologies utilizing Star Schema and Snowflake dimensional models
• Hands-on work experience in technologies such as ASP.NET, C#.NET, ADO.NET, HTML, JavaScript, Crystal Reports, Microsoft SQL Server, and Oracle
• Excellent object-oriented programming (OOP) and relational concepts
• Excellent team player with good communication skills

EDUCATION
• Master of Computer Application (MCA) from SRM Engineering College, Madras University, India.
• Bachelor of Commerce (B.Com), Kannur University, India.

CAREER PROFILE
• Worked for Techies Network, Bangalore as a Software Engineer from November 2006 to January 2010
• Worked for IRIS InfoTech, Thalassery, Kerala as a Faculty cum Programmer from December 2005 to October 2006.
TECHNICAL EXPERTISE
Operating Systems: Windows 2000/XP and Windows 2000/2003 Server
Languages: C, C++, VB.NET, C#.NET, VB and Java
ETL Tool: Informatica PowerCenter 8.x, 7.x (PowerCenter Designer, Workflow Manager, Repository Manager, Admin Console & Workflow Monitor)
Web Technologies: ASP.NET, ADO.NET, HTML, IIS and XML


Scripting Languages: JavaScript, VBScript
RDBMS: Microsoft SQL Server 2005/2008, Oracle 9i/10g
Reporting Tools: Crystal Reports 10

PROJECT EXPERIENCES
Techies Network

Drill Accounts and Cost Analysis (DACA)
The project involved the design and development of a data warehouse for a major bank in India. The main objectives were improving turnaround time for data access and reporting, improving data integrity and consistency across the organization, and lowering the cost of creating and distributing reports. The scope of the project also included maintaining present and historical data to support the clients' business processes and help senior management make informed decisions.
Duration: December 2008 to January 2010
Team Size: 8
Role: Mapping Developer
Technologies: Informatica PowerCenter 7.1, Oracle 9i
Responsibilities:
• Extensively used almost all transformation constructs, such as Aggregator, Router, Joiner, Expression, Lookup, Update Strategy and Sequence Generator, to develop various mappings
• Developed simple and complex mappings in Informatica to load dimension and fact tables per star schema techniques
• Designed and developed Informatica mappings from the source data to the target repository using the Informatica Designer tool
• Extensively worked on performance tuning of ETL mappings and sessions
• Scheduled and created sessions and executed them to load data from the source system
• Authored stored procedures and functions
• Performed unit testing at various levels of the ETL process
• Analyzed session log files to resolve errors in mappings and managed session configurations
• Extensively involved in extraction of data from Oracle and flat files
• Experienced in different types of data loading, including bulk load
• Used the Informatica Debugger for troubleshooting

Drill Across Dimension (DAD)
The project involved the design and development of a data warehouse for a major bank in Dubai. The requirement was to build a data warehouse maintaining historical data at a central location for analysis of the businesses at various locations. The system provides improved business visibility through reports, dashboards and key performance indicators.
Duration: February 2008 to November 2008
Team Size: 6
Role: Mapping Developer


Technologies: Informatica PowerCenter 7.1, Oracle 9i
Responsibilities:
• Developed mappings, workflows and worklets according to business rules using Informatica PowerCenter 7.1
• Developed ETL design documents
• Designed and developed Informatica mappings to load source data into target systems
• Extensively used almost all of the transformation features offered by Informatica, including expressions and lookups, authoring stored procedures and developing update strategies
• Configured sessions and set up workflows and tasks to schedule data loads at the desired frequency
• Maintained and monitored scheduled jobs
• Worked with memory cache for static and dynamic cache management for increased throughput
• Resolved issues by correcting and validating specification changes
• Created effective test cases and performed unit and integration testing

Budget Control System (BCS)
BCS is a web application developed to effectively manage and optimally utilize the funds allocated to various departments within the company. Major objectives of the application were to maintain information on budgets, report on funds usage, track funds and record the performance of each division in the company. The information is then analyzed during the annual budget approval process. The application is primarily developed using Microsoft technologies: it utilizes Internet Information Services (IIS), the .NET Framework, ASP.NET and MS SQL Server 2005, with Crystal Reports serving as the reporting engine. The system is deployed on Microsoft Windows Server 2003.
Duration: Jun 2007 to January 2008
Team Size: 6
Role: Programmer
Technologies: ASP.NET with C#, ADO.NET, SQL Server 2005, HTML, Crystal Reports, Windows Server 2003
Responsibilities
• Involved in analysis and development of the budget module
• Authored several database stored procedures
• Developed screens and functionality using ASP.NET
• Involved in functional and regression testing activities

Food Court Management System
A billing system that records all information on daily transactions, payroll and inventory, with automatic inventory updates. The system also maintains attendance for each employee, their demographic details, salary slip generation, etc. Since a "self-service" model is implemented, the operator can put a bill on hold if a customer delays an order and bill the next customer; the held bill is processed later by retrieving it from the database server. It also records


the flow of money for daily expenses such as groceries and salaries. Payment can be made either by cash or by credit card. The system also generates reports for daily, weekly, monthly and yearly transactions, salary reports, credits, total profit, and money deposited in banks.
Duration: November 2006 to May 2007
Team Size: 5
Role: Programmer
Technologies: C#.NET, SQL Server, Crystal Reports

Responsibilities
• Involved in the analysis and development of the purchase module.
• Performed unit testing and defect tracking

PERSONAL DETAILS
Gender: Male
Marital Status: Married
Nationality: Indian
Personal Strengths: Pragmatic, adaptable to changing situations and committed to meeting deadlines.

References: Available upon request. I hereby declare that the above furnished details are true to the best of my knowledge.

SUMMARY

• Around 7 years of experience in the IT/software industry, with strong knowledge of data warehousing technology and a skill set covering Informatica and Oracle
• Experience in all phases of the data warehouse life cycle, involving design, development, analysis and testing of data warehouses using ETL
• Capable of analyzing business requirements and creating functional design documents to design mappings, mapplets, workflows and sessions in Informatica 6.0/7.1
• Experience in creating mappings, reusable transformations and mapplets using Firstlogic, and in performing Firstlogic administration
• Developed scheduled jobs in Informatica and monitored them using the Workflow Monitor
• Practical understanding of database schemas like Star Schema and Snowflake Schema used in relational and dimensional modelling
• Strong knowledge of the entity-relationship concept, fact and dimension tables, and slowly changing dimensions
• Six Sigma certified (Green Belt)
• Experience in writing UNIX shell scripts
• Handled database management activities for new releases using Oracle
• Experience in writing database scripts such as SQL queries, PL/SQL stored procedures, indexes, functions, views, materialized views and triggers
• Strong knowledge of the Software Development Life Cycle, working in a CMM Level 5


company
• Efficiently handled data fixes and other support work on time
• Quick learner with sound communication and interpersonal skills and client interaction abilities that complement the technical abilities
• Excellent analytical and communication skills and good team spirit
• Strong commitment to quality; experienced in ensuring compliance with coding standards and the review process

SKILLS

Informatica Tools: Informatica 6.0/7.1 (Designer, Server Manager, Workflow Monitor, Repository Manager)
Other Tools: TOAD, Firstlogic, Informatica Match Consolidate, Informatica ACE, Informatica IACE, Guide Post
Languages: C, C++, Java, Visual Basic, SQL/PL-SQL
Database: Oracle
Operating Systems: Windows, UNIX

EDUCATION

Master of Science in Information Technology.

SUMMARY:
• Over 2 years of work experience in Informatica and Business Objects at Flextronics.
• Experience with the Informatica Designer components, Workflow Manager, Monitor tools, Repository Manager, Admin Console and UNIX shell scripting; hands-on knowledge of SAP PowerConnect.
• Experience in ETL migration projects such as ETL server migration and ETL version migration from Informatica 7.1 to 8.1.
• Experience in end-to-end Informatica 8.1.1 server & client installation, configuring Informatica 8.1, upgrading service packs, user management, security, and backup & restore processes in both UNIX and Windows environments.
• Experience in BO XI R3 (Designer, Webi, Rich Client) and knowledge of BO 6.5
• Quick learner and excellent team player with the ability to meet tight deadlines and work under pressure

PROFESSIONAL EXPERIENCE:
• Working as a System Analyst at Flextronics Technology (P) Ltd, Chennai from May 2008 to date. (1.10 years)
• Internship program at Flextronics Technology (P) Ltd, Chennai from Jan 2008 to May


2008. (6 months)

EDUCATIONAL PROFILE:
• M.C.A. from D.G. Vaishnav College, Chennai, with 75% (2008)
• B.C.A. from Loyola College (Autonomous), Chennai, with 83% (2005)

SOFTWARE SKILLS:

ETL Tools: Informatica 7.1, 8.1.1
Reporting Tools: BO 6.5, BO XI R3
Databases: Oracle 9i, 10g, SQL Server 2005
Packages: MS Office
Operating Systems: Windows 9x/2000/XP, UNIX (Sun Solaris)

Flextronics is a leading Electronics Manufacturing Services (EMS) provider focused on delivering complete design, engineering and manufacturing services to automotive, computing, consumer, industrial, infrastructure, medical and mobile OEMs. Flextronics helps customers design, build, ship, and service electronics products through a network of facilities in 30 countries on four continents. This global presence provides design and engineering solutions that are combined with core electronics manufacturing and logistics services, and vertically integrated with components technologies, to optimize customer operations by lowering costs and reducing time to market.
Technical Environment:
ETL: Informatica 5.1 / 7.1.2 / 8.1.1
ETL Application Server: UNIX Solaris / Windows 2000 Server
ETL & BO Backend DB: Oracle 10.2.0.3
Business Objects: 6.5 SP4 / XI R3 3.1
BO Application Server: UNIX Solaris / Windows 2003 Server
Source & Target Systems: Oracle 9i/10g/11i, MS SQL Server 2000/2005, DB2, Informix, SAP BI 7.0, XML files, flat files
Utilities: TOAD, TOAD for SQL Server, SQL Developer, PuTTY
ERP Systems: BaaN, Visual Manufact, INFIMACS, MAPICS, Fourth Shift, NEC Boss, SAP R/3

PROJECTS
# PROJECT 01:
Title: Flex 3G-Direct Materials (DM)
Team Size: 6
Environment: Informatica 8.1.1, UNIX
Duration: Jun 09 to date
Source & Target: Source is flat files; targets are SAP BI and flat files
Description
Establish a robust materials and procurement management technology and process infrastructure that will provide Flextronics a sustainable competitive advantage for the next decade. The Informatica ETL consolidates the site data and loads it into stage tables; from the stage tables the data is then loaded to SAP BW.
Role: Designer, developer and module lead for error processing.


Responsibilities:
• Installed Informatica 8.1.1 and configured the node in the QA environment.
• Created and maintained repositories and Integration Services in the Development and QA environments.
• Designed the data flow and ETL process flow for error records received from the sites. The design incorporated extremely detailed error messages which were then reported to sites on a daily basis.
• Connected to SAP BI tables to extract error records.
• Developed 60+ mappings to take data provided in the flat files through different stages of cleansing and data validation.
• Created complex UNIX scripts to validate file names and file contents (see the sketch below).
• Handled unit testing, integration testing and User Acceptance test cycles.
• Migrated code from DEV to QA.
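A hedged sketch of the kind of file-name and content validation described above; the naming pattern, directory and expected column count are invented placeholders, not the project's actual conventions.

```sh
#!/bin/sh
# File validation sketch: reject feeds with a bad name or wrong column count.
# The pattern SITEnnn_YYYYMMDD.txt and the column count are placeholders.
IN_DIR=/etl/inbox
EXPECTED_COLS=12

for f in "$IN_DIR"/*.txt; do
    [ -f "$f" ] || continue
    base=$(basename "$f")
    # 1. Validate the file name, e.g. SITE123_20090630.txt
    case "$base" in
        SITE[0-9][0-9][0-9]_[0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9].txt) ;;
        *) echo "REJECT $base: bad file name" >&2; continue ;;
    esac
    # 2. Validate the contents: every row must have the expected column count.
    bad=$(awk -F'|' -v n="$EXPECTED_COLS" 'NF != n { c++ } END { print c+0 }' "$f")
    if [ "$bad" -gt 0 ]; then
        echo "REJECT $base: $bad rows with wrong column count" >&2
        continue
    fi
    echo "ACCEPT $base"
done
```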

# PROJECT 02:
Title: Baan Online Reporting
Team Size: 5
Environment: BO XI 3.1, Oracle, BaaN V, Windows NT, UNIX
Duration: Jun 09 to date

Description
The purpose of the BO Baan reporting project is to shorten the waiting time for users requesting reports. Instead of requesting and waiting for Baan development to generate a report, users at each site are trained to build their own reports and access the standard reports through InfoView.
Role: Developer
Responsibilities:
• Created 'no-join' universes to enable a self-service model applicable to different BaaN sites with differing requirements.
• Developed 15 standard reports.
• Provided L3 support.
• Created documentation and conducted user training sessions for BObj.
• Handled user management and folder access management for the BObj platform.

# PROJECT 03:

Title: Billing Engine
Team Size: 4
Environment: Informatica, BO XI 3.1, Oracle 10g, UNIX, flat files, XML
Duration: Mar 09 - Jun 09
Source & Target: Sources are flat files and XML files from Baan and flat files from Red Prairie; the target is Oracle
Description
Create an offline data store / calculation ETL process to support the RPL solution for Flextronics Global Services. The Billing Engine ETL process fetches the master data and transaction data from Baan and Red Prairie and stores it in the offline data store staging


area. The Billing Engine from the SPL project calculates the billing records in addition to the financial transactions in Baan. This requires an offline data store and a billing engine process built with the Informatica ETL tool and the Business Objects reporting tool. Baan and Red Prairie generate the necessary master data and transaction data files onto an FTP server; Informatica fetches those files periodically, validates them, and stores them in staging tables. The staging table data is used by the billing engine merging and calculation processes to generate billing backing data.
Role: Designer and developer
Responsibilities:
• Coded complex shell scripts which dynamically decide which sessions to run and how many times each session should run; each session run gets a dynamically generated parameter file providing source file names, filter condition values, etc. (see the sketch below)
• Created triggers and signal output files to indicate the condition and quality of the data in the received source files
• Finalized requirements with the functional and business owners for both BO & ETL
• Created low-level and high-level technical design documents
• Created mappings and sessions and scheduled workflows; implemented SCD Type 1 for change capture in the data
• Created 'no-join' universes to enable a self-service model applicable to different manufacturing sites with differing requirements
• Handled user management and folder access management for the BObj platform
• Handled unit testing, integration testing and User Acceptance test cycles
• Migrated code from DEV to QA and QA to PROD
• Handled post-Go-Live production support
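A minimal sketch of the dynamic parameter file pattern described in the first bullet above, assuming PowerCenter 8.x pmcmd syntax; the service, domain, folder, workflow and parameter names are hypothetical placeholders.

```sh
#!/bin/sh
# Sketch: generate a parameter file per source file, then run the workflow
# once per file via pmcmd. All names here are hypothetical placeholders.
: "${INFA_USER:?set in environment}" "${INFA_PASS:?set in environment}"
INFA_SVC=INT_SVC_DEV
INFA_DOMAIN=Domain_Dev
FOLDER=BILLING_ENGINE
WORKFLOW=wf_load_billing_stage
PARAM_FILE=/etl/param/wf_load_billing_stage.par

for src in /etl/inbox/billing_*.dat; do
    [ -f "$src" ] || continue
    # Build the parameter file the session reads at start-up.
    cat > "$PARAM_FILE" <<EOF
[$FOLDER.WF:$WORKFLOW]
\$InputFile1=$src
\$\$FILTER_DATE=$(date +%m/%d/%Y)
EOF
    pmcmd startworkflow -sv "$INFA_SVC" -d "$INFA_DOMAIN" \
        -u "$INFA_USER" -p "$INFA_PASS" -f "$FOLDER" \
        -paramfile "$PARAM_FILE" -wait "$WORKFLOW" || {
        echo "Workflow $WORKFLOW failed for $src" >&2
        exit 1
    }
done
```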

# PROJECT 04:

Title: Services Sites Integration
Team Size: 3
Environment: Informatica 8.1.1, Baan ERP, UNIX
Duration: Feb 09 - Mar 09
Source & Target: Baan ERP and flat files
Description
To integrate the new services sites running on BaaN Vb backends into the MDSS data warehouse, ETL code had to be created to extract data from their ERP systems, put it into flat files in a predefined format, and send it over FTP to the MDSS servers. The ideal site integrated into MDSS would provide a set of 24 data feeds; however, not all feeds are applicable to every site.
Role: Designer and developer
Responsibilities:
• Finalized requirements with the functional and business owners for both BO & ETL
• Created low-level and high-level technical design documents
• Created mappings and sessions and scheduled workflows
• Handled unit testing, integration testing and User Acceptance test cycles
• Migrated code from DEV to QA and QA to PROD


• Handled post-Go-Live production support

# PROJECT 05:

Title: Flex HRDW
Team Size: 4
Environment: Informatica 8.1.1, BO XI 3.1, Oracle, UNIX
Duration: Nov 08 - Jan 09
Source & Target: Source is flat files; target is Oracle
Description
The purpose of this project is to collect data from all Flextronics sites globally, including legacy Solectron sites, to provide an easy way for HR users to run enterprise-wide reports through GDW. Data is collected in a new source input file format, and all data collected from the sites is loaded into the GDW database. The reports display all information passed from the source files, and data validation is done as needed. GDW refreshes data every month; to maximize performance, GDW does not maintain employee history data. Extensive security measures were taken to protect sensitive HR information, including encrypting data using custom Oracle functions.
Role: Developer
Responsibilities:
• Designed the data model; implemented indirect ETL data loading using dynamic list files generated by shell scripts (see the sketch below)
• Generated parameter files dynamically using UNIX scripts to provide values for the ETL sessions
• Developed complex shell scripts for filename validation and FTP of files from more than 80 different locations
• Performed extensive encryption on the data using custom Oracle packages to protect sensitive HR information
• Created mappings and sessions and scheduled workflows; implemented SCD Type 1 for change capture in the data
• Designed the BObj universe for the HRDW
• Created detailed and summary BObj reports, including drills and hyperlinks
• Created documentation and conducted user training sessions for BObj
• Handled unit testing, integration testing and User Acceptance test cycles
• Migrated code from DEV to QA and QA to PROD
• Handled post-Go-Live production support
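A hedged sketch of the indirect-loading pattern from the first bullet above: a shell script builds a list file naming every feed that arrived, and the session reads it with its source filetype set to Indirect. The directory and naming convention are invented placeholders.

```sh
#!/bin/sh
# Indirect-load sketch: build a list file of HR feeds for PowerCenter.
# Paths and the HR_*.csv naming convention are hypothetical placeholders.
IN_DIR=/etl/hr/inbox
LIST_FILE=/etl/hr/list/hr_feeds.lst

# One absolute path per line; the session reads each named file in turn.
: > "$LIST_FILE"
for f in "$IN_DIR"/HR_*.csv; do
    [ -f "$f" ] && echo "$f" >> "$LIST_FILE"
done

# Fail fast if no feeds arrived, so the workflow does not start on empty input.
if [ ! -s "$LIST_FILE" ]; then
    echo "No HR feeds found in $IN_DIR" >&2
    exit 1
fi
```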

# PROJECT 06:

Title: NPP
Team Size: 3
Environment: Informatica 8.1.1, Oracle, UNIX
Duration: Oct 08 - Nov 08
Source & Target: Source and target are flat files; staging is Oracle


Description
The SAP BI application requires consolidated data for all indirect-materials purchases. Input source data is received from different sites for master and transaction data. For each area, data is received as a flat file from different ERP systems; all these files contain non-delta data (full loads). Complex UNIX scripts were used to FTP files from over 100 locations and to alert sites if files were not available two hours before data loading starts. The shell scripts also archive the source files for 15 days and remove files older than 15 days (see the sketch below). Informatica was used for change data capture to reduce the load on SAP BI and to generate reports for analysis of indirect-materials purchases. The project also involved cleansing each record of non-English characters, which caused lookup errors and record failures in SAP BI.
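A minimal sketch of the alerting and 15-day archive housekeeping described above; the paths, site list and alert address are invented placeholders, and a mailx-style mailer is assumed.

```sh
#!/bin/sh
# Housekeeping sketch: alert on missing site feeds, archive processed files,
# and purge archives older than 15 days. All names are placeholders.
IN_DIR=/etl/npp/inbox
ARCHIVE_DIR=/etl/npp/archive
ALERT_TO=etl-support@example.com

# Alert for any expected site feed that has not arrived yet.
while read site; do
    if [ ! -f "$IN_DIR/${site}_purchases.dat" ]; then
        echo "Feed missing for site $site" |
            mailx -s "NPP feed missing: $site" "$ALERT_TO"
    fi
done < /etl/npp/conf/sites.txt

# Archive today's processed files, then purge archives older than 15 days.
mv "$IN_DIR"/*.done "$ARCHIVE_DIR"/ 2>/dev/null
find "$ARCHIVE_DIR" -type f -mtime +15 -exec rm -f {} \;
```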

Role: Developer
Responsibilities:
• Developed complex shell scripts for data validation, FTP of files from a large number of locations, email alerts if files were not delivered on time, and updating load status tables
• Designed the data-flow sequence in ETL to perform the required data integration and processing
• Implemented indirect data loading in ETL by creating dynamic list files through shell scripts
• Created mappings and sessions and scheduled workflows
• Handled unit testing, integration testing and User Acceptance test cycles
• Migrated code from DEV to QA and QA to PROD
• Handled post-Go-Live production support

# PROJECT 07:

Title: Data-Center Consolidation 2 (integration of UNIX and Windows Informatica jobs and ARDW migration)
Team Size: 3
Environment: Informatica 7.1.2, 8.1.1, Oracle, Windows NT, UNIX
Duration: Sep 08
Description
The Windows Informatica application server had to be decommissioned, so the jobs running on that server had to be migrated to an existing UNIX Informatica server. The project also involved migration and consolidation of the ARDW application, which ran on Informatica 5.1 connecting to the MAPICS ERP on DB2 servers. The ARDW servers, including the Informatica application server and the DB2 servers, had to be migrated from the Singapore datacenter to the new Hong Kong datacenter.
Role: Team member
Responsibilities:
• Compared results from parallel runs of ETL jobs on the Windows and UNIX servers
• Compared results from parallel runs of ETL jobs on the existing ARDW and the cloned ARDW server
• Handled integration testing and User Acceptance test cycles


• Handled post-Go-Live production support

# PROJECT 08:

Title: Data-Center Consolidation 1 (server migration and version migration)
Team Size: 3
Environment: Informatica 7.1.2, 8.1.1, Oracle, Windows NT, UNIX
Duration: Jun 08 - Aug 08
Description
The datacenter in Singapore was planned to be shut down and all servers migrated to Sacramento. While migrating the servers, it was also decided to upgrade Informatica from version 7.1 to version 8.1. The project involved the following:
• Migrating GDW (Global Data Warehouse) and dependent applications, including Informatica & Business Objects.
• Migrating and consolidating ERDW (Europe Regional Data Warehouse) into GDW at SAC.
• Combining the ERDW and GDW Informatica installations to run from the same instance of Informatica.
• Migrating and consolidating the Europe Business Objects repository domains into the GDW BObj repository at SAC.
Role: Team member
Responsibilities:
• Tested and fixed over 70 UNIX shell scripts to establish complete functionality in the new server environment (which ran a different version of UNIX)
• Tested and fixed connectivity to the various source database systems; edited the driver configurations in Informatica to establish successful connectivity
• Ensured data integrity of session outputs from Informatica 8.1 by comparing with the current Informatica 7.1 production; the behaviour of certain functionality, such as lookups, differed between 8.1 and 7.1
• Compared results from parallel runs of ETL jobs in v8.1 and v7.1 and fixed discrepancies so that the v8.1 output matched v7.1 exactly
• Tested all BObj reports on the new server and fixed BObj connections
• Corrected numeric and date formatting errors in the BObj reports on the new server
• Handled integration testing and User Acceptance test cycles
• Handled external UAT with external customers such as HP, Nortel, Cisco, Serus and Lexmark
• Coordinated activities with the infrastructure team to establish project milestones and meet deadlines.

# PROJECT 09:

Title: Flex Integration
Team Size: 3
Environment: Informatica 8.1.1, Oracle, Windows NT, UNIX
Duration: Apr 08 - May 08
Source & Target: Source is Oracle 10g; target is FTP (flat files)
Description


Solectron Centum was acquired by Flextronics in June 2007. GDW was Solectron Centum's decision support system, whereas Flextronics had MDSS as its data warehouse solution. As each DW had differing structures and conventions, the requirement was to integrate the materials data from GDW in a format usable in MDSS on a daily basis for global reporting and decision-making. Selected extracts determined to be critical were integrated from GDW to MDSS on a daily load basis.
Role: Developer
Responsibilities:
• Finalized requirements with the functional and business owners
• Drafted the low-level technical design document
• Created mappings and mapplets and organized them into sessions, worklets and workflows
• Handled unit testing, integration testing and User Acceptance test cycles
• Migrated code from DEV to QA and QA to PROD
• Handled post-Go-Live production support

# PROJECT 10:

Title: GDW Production Support
Team Size: 6
Environment: Informatica 5.1, 7.1.2, 8.1.1, BO 6.5 SP4, Oracle, SQL Server, Informix, DB2, UNIX, Windows NT
Duration: Feb 08 to date

Description
The Global Data Warehouse (GDW) is the materials decision-support data warehouse for all ex-Solectron manufacturing sites and Global Services sites (SGS). GDW is a consolidation of data from the sites in the Americas region with data from the Regional Data Warehouses (RDWs) in Asia and Europe. The reports published from GDW are used by corporate heads to make business decisions on a day-to-day basis. Data loading consists of over 700 workflows running in 3 different time zones to feed the RDWs and then load data into GDW.
Role: Team member

Responsibilities:
• Informatica and BO server administration and maintenance
• In charge of bug fixing and enhancements of ETL mappings, shell scripts and BO reports
• Monitored daily data loads and ensured timely completion of data loading from all sites globally into GDW; published daily data-loading status reports to the functional managers and business owners (see the monitoring sketch below); monitored scheduled reports in BObj
• User management in Business Objects; published periodic weekly and monthly reports and communications regarding planned maintenance shutdowns for BObj
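A hedged sketch of the daily load monitoring described above, polling workflow status with pmcmd getworkflowdetails. PowerCenter 8.x pmcmd syntax is assumed, the grep pattern assumes the usual status line in pmcmd output, and all names are placeholders.

```sh
#!/bin/sh
# Monitoring sketch: collect one status line per workflow for the daily
# load report. Service, domain, folder and workflow list are placeholders.
: "${INFA_USER:?set in environment}" "${INFA_PASS:?set in environment}"
INFA_SVC=INT_SVC_PROD
INFA_DOMAIN=Domain_Prod
FOLDER=GDW
REPORT=/etl/gdw/report/load_status_$(date +%Y%m%d).txt

: > "$REPORT"
while read wf; do
    # getworkflowdetails reports the status of the workflow's last run.
    if pmcmd getworkflowdetails -sv "$INFA_SVC" -d "$INFA_DOMAIN" \
        -u "$INFA_USER" -p "$INFA_PASS" -f "$FOLDER" "$wf" |
        grep -q 'Workflow run status: \[Succeeded\]'; then
        echo "$wf OK" >> "$REPORT"
    else
        echo "$wf CHECK" >> "$REPORT"
    fi
done < /etl/gdw/conf/workflows.txt
```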

SUMMARY

ETL Consultant with extensive experience in the implementation of


data warehousing and Informatica applications, and in source code customization, testing, debugging and maintenance of complex systems. I have total experience of over 6.2 years. I am well versed in the following application/functional areas:

• Thorough experience with data extraction, transformation and loading using the data warehousing ETL tool Informatica 8.6.
• Knowledge of the complete development life cycle of the data warehousing ETL process.
• Attended corporate training on warehouse design and development using the Erwin 3.5 data modeling tool.
• Attended training on the Business Objects 6.5 OLAP tool.

SKILLS
ETL Tools: Informatica 8.6
Modeling Tool: Erwin 3.5.2
OLAP Tools: BO 6.5
Databases: Oracle 10g, DB2
Operating Systems: Windows 98/NT/2000, UNIX
Languages: C, C++, SQL, PL/SQL
Monitoring Tools: Tivoli, Control-M
Testing: Mercury Quality Center, Toad, AQT, SQL Developer

PROFESSIONAL PROFILE

Employer: JBA Infotech
Title: ETL Consultant
Date of Employment: 14 April 2009 to 10 Sep 2009

Employer: Oracle India Pvt. Ltd. (payroll of ASM Technologies Ltd.)
Title: Software Engineer
Date of Employment: 18 Aug 2008 to 14 April 2009

Employer: Patni Computer Systems Ltd., Navi Mumbai
Title: Software Engineer
Date of Employment: 23 May 2006 to 5 July 2008

Employer: Sheltec Software India (P) Ltd, Bangalore
Title: Software Engineer
Date of Employment: 3 Feb 2003 to 19 May 2006


PROJECT INFORMATION

Project: New Auto Product (NAP)
Platform: Windows XP, UNIX
Technology/Software: Informatica 8.6, DB2
Duration: 14 April 2009 to 10 Sep 2009
Client: Farmers Insurance Group

Project Description:

NAP (New Auto Product) is a new auto product to be offered by Farmers Insurance Group. This new product will use a rating methodology that differs from the current auto product (a.k.a. the Legacy Auto Product) by applying rating factors (e.g. surcharges, discounts, points) across the household (i.e. quote), drivers, and vehicles. With the initial rollout of this project, NAP will be quoted for New Business New Households only.

The project detailed the business requirements for integrating NAP support into the FDR and the MLCDM Quotes Data Mart. At a high level, the scope of this project was limited to augmenting FDR with new tables and elements to support NAP quotes. Some of the new NAP quoting factors, such as symbols, credit scores, discounts, and surcharges from the source system, will also be loaded from FDR into the MLCDM Quotes Data Mart to support NAP quote analytics.

Responsibilities:

Involved in implementation of the entire testing cycle and deployment of the system, and tested the functionality of the software during all stages of the development life cycle. Involved in the creation and execution of test cases and test scenarios, monitored the creation and execution of test cases with SQL queries, and successfully completed "Security Profile Testing" in the NAP application.
Integration Testing: Integration testing covered end-to-end testing for the DWH. The coverage of the tests included the following:

• Record count verification of DWH backend/reporting queries against source and target as an initial check.
• Validation after isolating the driving sources.
• Data integrity between the various source tables and relationships.
• Validation of various calculations.


• Checks for missing data, negatives and consistency; field-by-field data verification can be done to check the consistency of source and target data.
• Development of ETL using Informatica 8.6.
• Used various transformations like Source Qualifier, Expression, Aggregator, Joiner, Filter, Lookup and Update Strategy; designed and optimized the mappings.
• Uploaded requirements, test cases and test execution results into Quality Center.

• Maintained the error handling method.
• Tested the jobs according to unit test cases.
• Prepared SQL queries to validate the data in both source and target databases (see the sketch below).
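As a hedged illustration of the source-to-target validation listed above, the sketch below compares row counts through SQL*Plus; the connect strings and table names are hypothetical placeholders.

```sh
#!/bin/sh
# Record-count verification sketch: compare a source table with its target.
# Connect strings and table names are hypothetical placeholders.
SRC_CONN="src_user/src_pass@SRCDB"
TGT_CONN="dwh_user/dwh_pass@DWHDB"

count() {
    # Return a single count from the given connection ($1) and table ($2).
    sqlplus -s "$1" <<EOF
set heading off feedback off pagesize 0
select count(*) from $2;
exit
EOF
}

SRC_CNT=$(count "$SRC_CONN" policy_quotes | tr -d ' ')
TGT_CNT=$(count "$TGT_CONN" fdr_nap_quotes | tr -d ' ')

if [ "$SRC_CNT" -eq "$TGT_CNT" ]; then
    echo "PASS: counts match ($SRC_CNT)"
else
    echo "FAIL: source=$SRC_CNT target=$TGT_CNT" >&2
    exit 1
fi
```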

PROJECT INFORMATION

Project: iZoom Store Phase-II Development
Platform: Windows XP, UNIX
Technology/Software: Informatica 8.6, Oracle 10g
Duration: 18 Aug 2008 to 14 April 2009
Client: Procter & Gamble Ltd.

Project Description:

The iZoom Store application is a customer POS data warehouse aimed at serving the DSS needs of retailer business groups and management. It is a functional application used by the Global Point of Sales Customer team to drive business growth and provide business insight to P&G. The iZoom Store project plan covers migrating the existing customers from the CDW data warehouse onto the ADW platform and adding new customers on an ongoing basis. The Phase 1 scope of the project involves developing a modular and scalable platform for ADW (Atomic Data Warehouse) and migrating Wal-Mart POS data over to the new platform.

Responsibilities:

• Development of ETL using Informatica.
• Used various transformations like Source Qualifier, Expression, Aggregator, Joiner,


Filter, Lookup and Update Strategy; designed and optimized the mappings.
• Configured sessions in Workflow Manager.
• Used Workflow Manager to create tasks and Workflow Monitor to monitor and analyze the loading statistics.
• Wrote unit test cases (UTC).

Project: MetLife IDW & eReporting
Employer: Patni Computer Systems Ltd
Platform: Client/Server, Windows, UNIX
Technology/Software: Business Intelligence ETL tool Informatica 7.1.3, UNIX
Duration: Nov 2007 to 5 July 2008
Client: Metropolitan Life Insurance Company, USA

Project Description:
MetLife has built an Institutional Data Warehouse (IDW) that is loaded on a daily basis from internal and external data sources. This enterprise data warehouse further feeds data at various intervals into various marts, viz. the Company Relationship Mart, Dental Mart, Disability Mart, Finance Mart, eMetrics Mart, Long Term Care Mart and Life Mart. The IDW/data marts provide critical information for business analytics and decision-making purposes. IDW receives data in the form of flat files, which are transformed and loaded into the warehouse using Informatica as the ETL tool. Business users use this information for their decision-making via COGNOS and ACTUATE reports. The IDW/data marts are available 24x7 and reside on distributed hardware across different sites. The project involves maintenance and production support activities for the data load process for IDW and the current marts, and any other data marts that might be sourced from IDW in the future. The scope of support covers the code developed for data load processes using Informatica PowerMart, UNIX scripts, UDB SQL scripts and Maestro schedules.

Responsibilities:

• End-to-end monitoring of the ETL process
• Checked data consistency, accuracy and integrity
• Abend fixing / call resolution
• Maintained the production log, analysis and abend history
• IDW production support activities such as monitoring, abend fixing, preparation of WSRs and mart ownership, while ensuring SLA fulfillment
• Creation & maintenance of run books for each mart
• Coordination of all activities with the onsite team
• Checked for missing data, negatives and consistency; field-by-field data verification of source and target data, txt and csv files, and reports
• Uploaded requirements, test cases and test execution results into Quality Center
• Maintained the error handling method
• Tested the jobs according to unit test cases


• Prepared SQL queries to validate the data in both source and target databases
• Interacted with the project manager on project issues and status.

Project: MetLife Bank Individual Business Data Mart
Platform: Windows XP
Technology/Software: Informatica 7.1, DB2
Duration: Jul 2007 - Nov 2007
Client: MetLife

Project Description:
Patni had developed the MetLife MetBank warehouse & data mart, which held information pertaining to Met Bank's accounts, customers and sales agents. During the initial phase, only the required data elements were pulled from staging and populated into the corresponding warehouse and mart tables. The main purpose of Phase II is to expand the data mart by adding new CIF (customer), DEP (account) and SCAU (security) data elements. Along with this, 2 new feeds for CIF (customer) and DEP (account) transaction data will also be added. Some of the tables in the warehouse hold huge amounts of data, so loading from the warehouse to the mart is very slow; in the present phase some of the tables will be archived on a monthly basis, leaving at least 3 months of data in the respective tables. The data elements included in the current phase are related to DEP and CIF.

Responsibilities:

• Development of ETL using Informatica.
• Used various transformations like Source Qualifier, Expression, Aggregator, Joiner, Filter, Lookup and Update Strategy; designed and optimized the mappings.
• Configured sessions in Workflow Manager.
• Used Workflow Manager to create tasks and Workflow Monitor to monitor and analyze the loading statistics.
• Wrote unit test cases (UTC).

Project: Fund Information Warehouse
Platform: Windows 2000
Technology/Software: Informatica 7.1, Oracle 9i
Duration: Sept 2006 - June 2007
Client: Fidelity Investments International


Project Description:
Client plan participant data, including personal profile information, plan details, transactions, balances, channel activity, customer satisfaction, etc., is managed and stored in several Fidelity proprietary systems. Currently, there is no central repository that compiles data from the numerous systems to measure, track or profile participants and their behavior on a global basis. Business users need this central system to satisfy analysis requirements that are currently served by ad-hoc reporting and manual comparisons. Such a system will give business users a better understanding of the client's customer base and participant preferences, reduce the systems resources needed to run ad-hoc reports, and provide analysts with the tools needed to make the most efficient use of the client's marketing and operational budgets.

Responsibilities:

• Development of ETL using Informatica.
• Used various transformations like Source Qualifier, Expression, Aggregator, Joiner, Filter, Lookup and Update Strategy; designed and optimized the mappings.
• Configured sessions in Workflow Manager.
• Used Workflow Manager to create tasks and Workflow Monitor to monitor and analyze the loading statistics.
• Wrote unit test cases (UTC).

Project: Sales and Marketing Data Warehouse
Platform: Windows 2000
Technology/Software: Oracle 9i, Informatica 6.2, Erwin 3.5, BO 6.5
Duration: July 2005 - May 2006
Client: eMark Electronics, Kuala Lumpur

Project Description:

The project was undertaken for an electronics-based industry. The company was searching for business enhancement and a competitive edge, using information technology to make better business decisions. The client wanted a system that could give intelligent information reports on the existing business situation and support specific analysis of the business. The data warehouse system was developed according to the client's requirements and other project requirements. This project describes the high-level requirements of the data warehouse system; it is meant for use by the designers and developers and is the basis for validating the final delivered system.

Responsibilities:


• Design and development of ETL using Informatica.
• Extensively used various transformations like Source Qualifier, Expression, Aggregator, Joiner, Filter, Lookup and Update Strategy to load data into slowly growing and slowly changing dimensions (SCD).
• Designed and optimized the mappings.
• Configured sessions in Workflow Manager.
• Used Workflow Manager to create tasks and Workflow Monitor to monitor and analyze the loading statistics.
• Wrote unit test cases (UTC).

Project: Account Analyzing System
Platform: Windows 2000
Technology/Software: Oracle 9i, DataStage, Erwin 3.5, BO 6.5
Duration: March 2004 - May 2005
Client: Bank of Hoya (Japan)

Project Description:
This project provides different reports and supports ad-hoc queries for making intelligent banking decisions based on data accumulated over time from various branches across the country. The bank wanted a tool to help analyze current business trends and make predictions about future business trends. In particular, the bank was focused on getting analytic information about target customer groups, loss-making schemes, season-wise financial analysis, etc.

Responsibilities:
• Designed and implemented various transformations through the Designer.
• Configured sessions in Workflow Manager.
• Detected bottlenecks in targets, sources, mappings and sessions.
• Performed tuning at the mapping and session levels.
• Tested the mappings using the Debugger.

Project: Financial Analyzer
Platform: Windows NT
Technology/Software: Oracle 8i, DataStage, Erwin 3.5
Duration: March 2003 - February 2004
Client: APEX Finance

Project Description:
The development and design of a data warehouse are inextricably linked to the business needs of an enterprise. A leading finance company, operating in the UK for over 25 years, required an analytical data warehouse. Since it is a finance facilitator company, it has to link many kinds of business enhancement and competitive-edge initiatives, using information technology to


make better decisions. They are dedicated to customer service and to intelligent analysis of their business, which has become a hallmark of the finance sector throughout the UK.

Responsibilities:

• Worked as an ETL Developer (Trainee)
• Involved in loading data into the warehouse during the ETL phase using Informatica
• Developed various kinds of mappings
• Optimized the mappings to load data into slowly changing dimensions
• Performed performance tuning of the mappings
• Tested the mappings

Achievement:

Appreciation for outstanding contribution as a member of the Dream Team for the year 2006 from Patni Computers Ltd. (Mumbai).

Certifications:
SP2I (financial domain)
SAD

Education
• Master of Computer Applications from IGNOU in Dec 2004.
• Bachelor of Computer Applications in Dec 2002.
• DISM from APTECH Ltd. in Dec 1998.

Experience Summary
I have been an associate with Tata Consultancy Services for the last 3 years and 5 months and have worked on several projects during this period. My primary skill set is Informatica PowerCenter. I have worked on support, enhancement and development projects, primarily in Informatica. My other skills include SQL, UNIX and data warehousing.

Other skills include:
Good communication skills, both verbal and written.
Experience dealing with clients and end users.
Maintaining good relationships with people.

Technology


Below is a list of important hardware, software products, tools and methods that I have worked with.

Hardware: IBM PC compatible, HP, Apple PC
Software Products: Windows XP/NT, Macintosh, DOS, UNIX, Oracle 9i (PL/SQL), Informatica PowerCenter 7.1.5, Informatica PowerCenter 8.1.1, Informatica PowerCenter 8.1.5, Hyperion
Tools: Rational ClearCase, ClearQuest, Master Reference Material/Hierarchy Manager, Quality Center

Qualifications

Degree and Date | Institute | Major and Specialization
Bachelor of Technology, July 2005 | Galgotia's College of Engineering and Technology, Greater Noida, U.P. (affiliated to Uttar Pradesh Technical University, Lucknow) | Information Technology
Senior Secondary, March 2000 | St. Aloysius College, Pilibhit, U.P. | Science stream
Higher Secondary, March 1998 | St. Aloysius College, Pilibhit, U.P. | All subjects

Assignments

The details of the various assignments that I have handled are listed here, in chronological order.

Project: Master Reference Material
Customer: Eli Lilly and Company
Organization: Tata Consultancy Services
Period: Sep 08 to date
Description: Eli Lilly & Company wants its data to be moved to a new Oracle database


from an MVS database to carry out its operations and be at par with the new applications. The existing database will be obsolete once the new system is in place. The overall responsibility is to create a new application that caters to the needs of the business.
Role:
• Working on the project as a Module Lead.
• Developed the mappings and workflows per the business logic using Informatica 8.1.5.
• Development of the new system for Eli Lilly & Company.
• Sorted out issues that arose during unit testing.

Solution Environment: Windows OS, HP Unix, Oracle 10g
Tools: Informatica 8.1.5, Toad, MRM/HM, Quality Center

Project: GODW EDW Core to Semantic
Customer: Apple Technologies, Inc.
Organization: Tata Consultancy Services
Period: April 2008 - July 2008
Description: Apple Technologies was looking to use an EDW (enterprise data warehouse) instead of the GDW (global data warehouse) to carry out its operations and to load data into this EDW (a Teradata database) with SAP BW as the source; from there the data has to be sent to the target (flat files). The overall responsibility was to have an EDW instead of a global operational data warehouse that would cater to the needs of the business.
Role:
• Worked on the project as a Module Lead.
• Worked on design documentation and the SIA (System Interface Agreement).
• Developed the mappings and workflows per the business logic using Informatica 8.5.
• Development of the new system for Apple Technologies Inc.
• Sorted out issues that arose during unit testing.
• Worked on creating views on the tables in the staging area.
• Worked on SAP BW, which acted as a source for the new application.
Solution Environment: Macintosh OS, HP Unix, Teradata
Tools: Informatica 8.1.5

Project: CM2
Customer: Agilent Technologies, Inc.
Organization: Tata Consultancy Services
Period: Feb 2008
Description: The project was taken up to test that the inventory data (in the data warehouse) flowed correctly after the modification of Agilent's operations. Agilent wanted to have a single inventory in Switzerland instead of having it in various places such as Malaysia and Singapore.
Role:
• Worked on the project as a team member.
• Functional and technical consultation on behalf of TCS to the Agilent project


• Repository administration
• Data loading from source to target in the test environment
• Fixed issues while loading the data

Solution Environment: Windows 2000/NT/XP, HP Unix, Oracle 9i
Tools: Informatica 8.1.1 SP4

Project: GDW Support
Customer: Agilent Technologies, Inc.
Organization: Tata Consultancy Services
Period: Apr 2006 - Jan 2008 and March 2008
Description: Agilent needs this global data warehouse for an enterprise view of functional data. The warehouse is a trusted source for key enterprise analytical and management reporting. The project involves maintaining this global warehouse. Data flows into the warehouse through OLTP & OLAP source systems (relational databases and flat files), and the ETL process is governed through Informatica PowerCenter. Monitoring these ETL processes and maintaining the integrity of data across the various source systems is the key task of this project.
Role:
• Resolving technical issues related to ETL through Informatica to maintain data integrity
• Studying the mappings and workflows thoroughly to learn the data relationships and data flows
• Requirement gathering and requirement analysis
• Preparing the detailed low-level design documents for bug fixes
• Code creation
• Test case preparation
• Performing independent unit and integrated testing
• Monitoring Informatica jobs

Solution Environment: Windows 2000/NT/XP, HP Unix, Oracle 9i
Tools: Informatica 7.1.5 and 8.1.1 SP4, Hyperion, Toad, MS VSS

Key Competencies & Skills
My competency profile includes the following:
OS/Environment: DOS, UNIX, Windows XP/NT and Macintosh
Business Intelligence: Data Warehousing, Informatica, Hyperion
Languages & Tools: Visual SourceSafe, Rational ClearCase, Rational ClearQuest
DBMS: Oracle 9i, Oracle 10g, PL/SQL

Career Profile

Dates | Organization | Role


Since 06-Feb-2006 | TATA Consultancy Services | Analyst Programmer

Training / Continuing Education Summary

Program or Course | Coverage | Dates
.NET Training | Complete | 04-04-2006 to 19-04-2006
AS/400 Training | Complete | 18-03-2006 to 31-03-2006
ILP Trivandrum | Complete | Feb-Mar 2006

SUMMARY
• Over 8 years of IT experience in data warehousing, Oracle, UNIX and Siebel. Involved in the implementation of warehousing solutions for the banking (Federal Reserve Bank, NY), insurance (Zenith), manufacturing and retail (P&G and AVON) industries.
• Six years (6+) of strong data warehousing experience using Informatica PowerCenter 8.5/7.1.3/6.1.2 (Workflow Manager, Workflow Monitor, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer) and Informatica PowerMart 5.1.2.
• Around two years of production support and maintenance experience with enterprise data warehouse and database applications at different levels.
• Actively involved in different stages of the SDLC, including requirement analysis, design specifications, development, testing, maintenance and ongoing support of processes and development procedures.
• Involved in designing, developing and maintaining data marts as well as enterprise data warehouses, using ETL processes based on different methodologies such as the classical system development life cycle and CLDS (the "spiral methodology").
• Expertise in developing Oracle PL/SQL stored procedures, well versed in exception handling; these stored procedures are used to apply complex business rules to validate data and to extract data from holding tables into EIM tables.
• Expert in writing SQL queries, used within stored procedures as well as for developing database objects like tables, indexes, sequence generators, etc.
• Sound knowledge of developing UNIX shell scripts, used for FTPing files, validating files, calling Informatica workflows, SQL*Loader, etc.
• Expert in using TOAD for Oracle to develop stored procedures and packages, as well as other features of the tool like data import and export.
• Exposure to another data warehousing ETL tool, BO Data Integrator (XI), and to various OLAP technologies like BO 6.0 and Cognos 8.5.


• Proficient in developing control files used when loading data from flat files into RDBMS tables as part of SQL*Loader.
• Good with the Siebel EIM process, including developing the .ifb files used when running EIM jobs.
• Exceptional knowledge of data warehousing, the related dimensional data modeling process and its schemas, such as the star schema, snowflake schema and network schema.
• Expert in the job scheduling process using third-party tools like Autosys, based on target dependencies.
• Excellent analytical, oral & written and problem-solving skills, and the ability to work with staff and clients at all levels.

TECHNICAL SKILLS
ETL Tools: Informatica 8.5/7.1/6.1/5.1.2, BO Data Integrator, HORIZON and Siebel EIM
Technologies: Cognos 8, Visual Basic 6.0, Erwin 3.5/4.1.4, TOAD 8.6.1 and RSQL 7.3.2
GUI Tools: MS Visio, Developer 2000, Visual SourceSafe, E-R diagrams
Languages: C, C++, Java, XML, SQL, PL/SQL, UNIX shell scripting
RDBMS: Oracle 10g/9i/8.1.2, MS SQL Server and MS Access

EDUCATION

Completed a Master's degree in Computer Applications (MCA).

PROFESSIONAL EXPERIENCE

Federal Reserve Bank, NYC, NY 06/08 to date

The Federal Reserve Bank of New York is one of 12 regional Reserve Banks which, together with the Board of Governors in Washington, D.C., make up the Federal Reserve System. It is responsible for formulating and executing monetary policy, supervising and regulating depository institutions, providing an elastic currency, assisting the federal government's financing operations, and serving as the banker for the U.S. government. The New York Fed's supervisory activities are designed to ensure a safe and sound banking system. The New York Fed conducts onsite and offsite examinations of banks in New York, New Jersey, and Fairfield County in Connecticut.

Involved in the design, development, testing and maintenance of three applications: Bloomberg, PDCF and BrokerTec.


As a Lead:
• Understood the business process and user requirements by attending meetings with business analysts and users.
• Developed the ETL specification documents with column-level transformation logic and validation rules.
• Distributed specs to developers and made sure the workload was balanced in order to meet the project timeline.
• Defined the Informatica naming standards document, which gives guidelines for maintaining consistency in naming transformations, mappings, mapplets, sessions, worklets and workflows. Responsible for Informatica client installations on developer machines.
• Involved in refinement of the ETL process by redesigning target table structures with source-type flag values and the corresponding mappings and workflows.
• Assisted team members whenever they required technical solutions to critical scenarios, such as handling complex structured source files and resolving conflicts between conformed dimensions and normal dimensions.
• Found anomalies in the explain plans of some heavy SQL queries; created function-based indexes to resolve the problem, as the execution plan was not using the indexes that were already there.
• Involved in preparing the PVCS document for code migration, with all instructions for the DBA, SA and Informatica teams. Prepared a UNIX TAR file for the SA with all files required for the deployment process.
• Responsible for code reviews, integration of objects into the project folder from developers' respective folders, making sure the integrated workflow worked perfectly, and code deployment through Repository Manager.
As a Developer:
• Explored advanced features in Informatica 8.5 to improve the reusability of base Informatica code, such as concurrent workflow execution using different instances with corresponding parameter files.
• Made use of new functions like MD5 in data validation rules to check hash key values of each record.
• Suggested and implemented reusable sequence number generation logic in Informatica using a mapplet and a Lookup transformation on the target table, and made extensive use of Informatica version control.
• Involved in tuning Informatica mappings; improved session load performance threefold by using the sorted input option in the Aggregator transformation and the number of sorted ports in the Source Qualifier.
• Developed critical and challenging Informatica jobs, such as preparing a parameter file based on the date in the source file, auditing flows from repository & warehouse tables using SQL overrides in the Source Qualifier transformation for the ODS & DWH, and reading data from multi-structured source files.
• Involved in developing a UNIX script which FTPs the encrypted files, decrypts them, validates them using UNIX utilities like sed, and finally makes them ready for Informatica to load into the warehouse.
• Developed Informatica mappings, sessions, worklets and workflows based on ETL specifications, with an error-processing mechanism using exception cross-reference tables and an exception log table.

Page 28: New Microsoft Word Document

? Developed back load UNIX shell scripts, which loads previous year business feeds into the ware house tables by using Informatica pmcmd command from script.? Worked different kinds of source feed file, such as delimited, fixed width and XML files and especially multi structured delimited source bulk files for different class codes like CMO, CMBS, ABS and etc..? Involved in data purge process of ODS tables to keep current data only and involved in design and development of workflow recover mechanism in case of failure.
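A minimal sketch of the function-based index fix mentioned above. The table, column, and index names are illustrative only, not taken from the actual project.

    -- A plain index on trade_dt is ignored when the predicate wraps the
    -- column in a function, e.g. WHERE TRUNC(trade_dt) = :run_date.
    -- Indexing the expression itself lets the optimizer use it.
    CREATE INDEX trades_trade_dt_fbi
        ON trades (TRUNC(trade_dt));

    -- The query below can now use the new index:
    SELECT COUNT(*)
      FROM trades
     WHERE TRUNC(trade_dt) = TO_DATE('2008-06-30', 'YYYY-MM-DD');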

Environment: Informatica PowerCenter 8.5, Oracle 10g, TOAD 9.0.1, SunOS 5.10.

AVON, Rye, NY 10/06 -- 05/08

Avon, the company for women, is a leading global beauty company. As the world's largest direct seller, Avon markets to women in over 100 countries through more than five million independent Avon Sales Representatives. Avon's product line includes beauty products, fashion jewelry, and apparel, and features such well-recognized brand names as Avon Color, Anew, Skin-So-Soft, Avon Solutions, Advance Techniques, Avon Naturals, Mark, and Avon Wellness. The Latin America Data Store is a new data integration hub to support regional data requirements. The first phase of Data Store construction primarily focuses on serving Southern Cone MAPS requirements, but it will be expanded across several business functions based on business need and priority. The main aim of this project is to integrate data from all the local markets belonging to the same cluster into MAPS, and to provide the locally generated data to other systems such as brochure printing, trend analysis, and MacPac cost systems.

As a Lead:
- Prepared the High Level Design document based on the design document and the business rules understood during the requirements phase; this document describes the flow of each Oracle package.
- Developed the Low Level Design document, which gives complete column-level information from source tables to staging, and from staging tables to the target dimension and fact tables, along with the transformation logic.
- Prepared ETL architecture diagrams covering the complete flow from sources to targets, including error flows and CDC control.
- Suggested and successfully implemented a new failure recovery mechanism that tracks the records loaded into the target table with the help of load-track tables based on primary key column combinations (a sketch follows below).
- Designed and documented the incremental update strategy, data refresh, transformation rules, slowly changing dimensions, and data cleansing.
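A rough sketch of the load-track recovery idea, assuming a hypothetical fact table and key columns; the actual tables and keys differed per target.

    -- One row per target row loaded, keyed on the same primary key
    -- combination as the target table. Track rows are inserted in the
    -- same transaction as the target rows.
    CREATE TABLE load_track_sales_fact (
        order_id  NUMBER,
        line_no   NUMBER,
        load_ts   DATE DEFAULT SYSDATE,
        CONSTRAINT load_track_sales_fact_pk PRIMARY KEY (order_id, line_no)
    );

    -- On restart after a failure, only rows not yet tracked are reloaded.
    INSERT INTO sales_fact (order_id, line_no, amount)
    SELECT s.order_id, s.line_no, s.amount
      FROM stg_sales s
     WHERE NOT EXISTS (SELECT 1
                         FROM load_track_sales_fact t
                        WHERE t.order_id = s.order_id
                          AND t.line_no  = s.line_no);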

As a Developer:
- Involved in performance tuning of Oracle SQL queries; resolved a mysteriously slow loading process by changing the structure of the underlying small tables to index-organized tables.
- Another query was performing slowly even though the explain plan showed it was using all the indexes; found waits on the sequence generator and resolved the problem by caching sequence values in advance, which eliminated the DML wait on getting keys from the sequence (see the sketch after this list).
- Implemented Slowly Changing Dimension Type II logic on dimensions with an effective-date methodology.
- Involved in developing a load failure recovery mechanism by tracking the records loaded into the target table with the help of temporary tables based on primary key column combinations.
- Involved in developing ETL flows/jobs in the HORIZON application, which FTP files, load them, and call Oracle packages to validate the business rules before loading data into warehouse tables.
- Developed complex PL/SQL packages and procedures based on low-level specifications to load data from stage tables into target dimension and fact tables.
- Involved in developing Oracle packages that pull data from the target fact table, which holds data at a low level of granularity, and roll it up to month level in a summary table for each fact table.
- Developed Horizon jobs to prepare and publish XML files to IBM Message Queue with a unique topic name for MAPS local reference data.
- Involved in developing subscribe jobs, which subscribe to XML files from the Message Queue; these files were published for Central Master Data.
- Tested the developed DB stored procedures and packages and related scripts, such as control files and UNIX shell scripts.
- Involved in the job scheduling process, which is handled by the third-party scheduler AutoSys.
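Two sketches of the tuning fixes described above, with illustrative object names rather than the project's own.

    -- Index-organized table: the whole row lives in the primary key index,
    -- removing the extra table-block lookup for a small, hot lookup table.
    CREATE TABLE market_lookup (
        market_cd    VARCHAR2(10) PRIMARY KEY,
        market_name  VARCHAR2(100)
    ) ORGANIZATION INDEX;

    -- Caching sequence values in advance eliminates the per-row wait
    -- on the sequence generator during heavy DML.
    ALTER SEQUENCE fact_key_seq CACHE 1000;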

Environment: Oracle 10g, Horizon, IBM MQ, TOAD 6.8.1, IBM AIX 5.0, MS Visio.

Zenith Insurance, Los Angeles, CA 01/05 -- 09/06
ETL Developer
Actuary Underwriter Data Mart
Headquartered in Los Angeles, Zenith is about 40 years old as a company and now employs about 1,400 people. Zenith specializes in workers' compensation for small to mid-size companies, with the bulk of its clientele employing 20 to 30 persons. Zenith conducts business through independent agencies and operates in over 40 states through 13 branches across the U.S. Apart from California, which is the largest contributor to its business, Zenith is a strong player in the state of Florida.

As a Developer:
- After analyzing the business requirements document, and based on the low-level ETL specification documentation, developed the mappings using Informatica Designer.
- Brought up a new idea for parameter file generation with an Informatica mapping, using a Normalizer transformation based on the parameter values available in an RDBMS table.
- Developed sessions, worklets, and workflows for daily and initial loads based on the dates available in a load control table, which is updated after the daily loads complete successfully.
- Debugged the Informatica mappings, tested them thoroughly for different scenarios, and prepared test results.
- Extracted incremental data from source systems periodically using the last-extracted timestamp, based on target system dependencies according to subject areas (see the sketch after this list).
- Wrote shell scripts to call the FTP utility to get flat files from remote systems, validate them before starting the load process, and run pre-/post-session tasks.
- Developed SQL scripts to cross-check the data in warehouse tables against the data in the source tables as part of unit testing.
- Involved in integration testing and in project deployment from one environment to another.
- Involved in Informatica client tools installation on my machine as well as on other team members' machines.
- Involved in level III production support and implemented new changes to existing project workflows.
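A minimal sketch of the last-extracted-timestamp approach described above; the control table, subject area, and source/staging tables are hypothetical.

    DECLARE
        v_last_ts  DATE;
    BEGIN
        -- Read the watermark from the last successful extract.
        SELECT last_extract_ts
          INTO v_last_ts
          FROM load_control
         WHERE subject_area = 'POLICY';

        -- Pull only rows changed since the watermark.
        INSERT INTO stg_policy (policy_id, status, updated_ts)
        SELECT policy_id, status, updated_ts
          FROM src_policy
         WHERE updated_ts > v_last_ts;

        -- Advance the watermark only after the load succeeds.
        UPDATE load_control
           SET last_extract_ts = SYSDATE
         WHERE subject_area = 'POLICY';

        COMMIT;
    END;
    /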

Environment: Informatica 7.1.3, Oracle 9i, TOAD 8.6.1, BO XI, IBM AIX 5.0, Erwin 4.1.4.

Procter & Gamble, Cincinnati, OH 06/02 - 11/04
ETL Developer
Transition CPG/RSS/Delphi 03/03 - 11/04

CPG, RSS, and Delphi are three projects currently populating the P&G data warehouse through Oracle PL/SQL packages to fulfill different P&G user needs. P&G is currently customizing the Siebel application to bring the existing Commercial Product Group (CPG) customers, products, and other associated entities into Siebel, giving CPG sales people and managers more information about their accounts. CPG takes care of the non-retailing part of the P&G business; its customers are mainly organizations that buy P&G products for non-retailing purposes. RSS is the division of P&G that takes care of the retailing part of the business.

As a Developer:
- Involved in studying the existing PL/SQL ETL packages and the business rules in the BRS documents for the enhancements needed to the existing data warehouse.
- Prepared a technical understanding document based on the knowledge gathered from the existing ETL loads.
- Prepared the mapping specification document with the related session and workflow names.
- Developed Informatica mappings for the ETL flow with different transformations such as Source Qualifier, Update Strategy, Router, Joiner, Sequence Generator, Lookup, and Expression.
- Involved in thorough unit testing of mappings and sessions before moving them into the testing environment.
- Developed a shell script to validate the source file before starting the actual ETL flow in Informatica.
- Created sessions and batches using Server Manager for the developed Informatica mappings.
- Developed two different kinds of flows, for initial/full loads and for incremental loads, and was involved in the Informatica job scheduling process based on target table dependencies (see the sketch after this list).
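A hedged sketch of how an initial/full versus incremental load decision can be driven from a load control table. The project implemented these flows as Informatica mappings; the PL/SQL below only illustrates the control logic, and the table, flag values, and target are hypothetical.

    DECLARE
        v_load_type  VARCHAR2(12);
    BEGIN
        -- 'FULL' until the first successful load of this target, 'INCR' afterwards.
        SELECT load_type
          INTO v_load_type
          FROM etl_load_control
         WHERE table_name = 'CUSTOMER_DIM';

        IF v_load_type = 'FULL' THEN
            -- Full refresh: rebuild the target from staging.
            DELETE FROM customer_dim;
            INSERT INTO customer_dim (cust_id, cust_name)
            SELECT cust_id, cust_name
              FROM stg_customer;
        ELSE
            -- Incremental: insert only rows not already present.
            INSERT INTO customer_dim (cust_id, cust_name)
            SELECT s.cust_id, s.cust_name
              FROM stg_customer s
             WHERE NOT EXISTS (SELECT 1
                                 FROM customer_dim d
                                WHERE d.cust_id = s.cust_id);
        END IF;

        COMMIT;
    END;
    /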

Environment: Informatica PowerMart 5.1.2, Oracle 8.1.3, Siebel Analytics, TOAD, UNIX shell scripting.

AIMS 06/02 - 02/03
EIM Process Developer

P&G decided to upgrade its Siebel 6 application to Siebel 7.5.2 to take advantage of various new features in this release. Various modules in the running Siebel 6 application were consolidated and their functionality re-implemented in Siebel 7.5.2.

The objective of the NA MDO (North American Marketing Development Organization) project is to maintain and enhance the existing Siebel eConsumer Goods application, originally implemented onsite in Siebel 2000. The application includes the IAMS modules (Call Center and Contact Management).

As a Developer:
- Involved in developing PL/SQL scripts to move data from the holding tables into the EIM tables (see the sketch after this list).
- Developed IFB files, which the Siebel server uses for control information in the EIM process.
- Prepared control files used to load data from files into the holding tables with SQL*Loader.
- Developed the Privacy Interface, which extracts data from Siebel 7 into a flat file generated on a weekly basis.
- Ran EIM tasks after analyzing and mapping legacy data into the Siebel interface tables.
- Created and modified configuration (IFB) files to run the EIM insert tasks so that the base tables were populated in an optimal manner.
- Performed data mapping and bulk imports of data into the Siebel database using EIM.
- Developed shell scripts for data extraction and built the flat file for the Privacy Interface.
- Involved in developing the job scheduling script and in the scheduling process using UNIX crontab.
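A hedged sketch of moving rows from a holding table into a Siebel EIM interface table; the holding table, column list, sequence, and batch number are hypothetical, and the mandatory EIM columns vary with the repository configuration.

    -- Stamp each row with the batch number that the EIM task (driven by
    -- the BATCH parameter in the .ifb file) will pick up; IF_ROW_STAT is
    -- set to FOR_IMPORT ahead of the run.
    INSERT INTO eim_contact (row_id, if_row_batch_num, if_row_stat,
                             fst_name, last_name)
    SELECT TO_CHAR(hld_contact_seq.NEXTVAL),
           100,              -- hypothetical batch number referenced in the IFB file
           'FOR_IMPORT',
           h.first_name,
           h.last_name
      FROM hld_contact h;

    COMMIT;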

Environment: Siebel 7.5.2, Oracle 8, Windows 2000, UNIX and Shell Scripting.

Soft Office (Future Tech Instruments), AP, India 01/01 - 05/02
Database Developer

Soft Office is an office management system consisting of 10 modules, including front office, purchase orders, marketing, and accounts. This part of the project dealt with the front office module.

The front office module has 12 sub-modules, including visitors register, fax register, phone register, e-mail register, Xerox register, employee master, servicing inward, servicing outward, material inward, and material outward. It provides information that helps the organization's customer-facing employees serve customers better.

As a Developer:
- Understood user requirements from the business requirements document and sometimes gathered requirements from users on an ad hoc basis.
- Involved in developing scripts to create database objects such as tables, indexes, views, and sequences.
- Developed stored procedures, packages, triggers, and cursors to process the business data used by the reporting team (see the sketch after this list).
- Involved extensively in the testing process and prepared test reports for different kinds of scenarios.
- Prepared the technical document with a flow diagram of each procedure for easy understanding.
- Involved in production support for daily loads, and took care of implementing new changes as well as enhancements to the existing project.
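A small illustrative stored procedure in the spirit of the front-office code described above; the visitors_register table, its columns, and the sequence are hypothetical.

    CREATE OR REPLACE PROCEDURE log_visitor (
        p_visitor_name  IN  VARCHAR2,
        p_to_meet       IN  VARCHAR2,
        p_visit_id      OUT NUMBER
    ) AS
    BEGIN
        -- Generate the key and record the front-office visit.
        SELECT visitor_seq.NEXTVAL INTO p_visit_id FROM dual;

        INSERT INTO visitors_register (visit_id, visitor_name, to_meet, visit_time)
        VALUES (p_visit_id, p_visitor_name, p_to_meet, SYSDATE);

        COMMIT;
    END log_visitor;
    /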

Environment: Oracle 8, SQL*Plus, VB 6.0.