About Me
Industry: Information Technology
Occupation: Computer Programmer
Education level: Master's
Will Relocate: Yes
Location: Pittsburgh, PA
Work Experiences
3/2007 - 3/2009
Federal Home Loan Bank
Individual Contributor
• Performed a study of the existing CDR (Customer Data Repository) data warehouse and made recommendations to improve its performance.
• Developed the Capital Stock & Member system to provide static and ad-hoc reporting on the Capital Stock position of the FHLB and to improve operational workflow for the Member Market Access group.
• Developed the Data Quality system to validate important input data, including flat files, before it entered the target system.
• Developed the Call Reporting System to give the Federal Home Loan Bank an automated process for preparing and submitting the monthly and quarterly Call Reports mandated by the Federal Housing Finance Board (FHFB).
• Developed systems to feed the QRM application and the OF (Office of Finance) through FRS (Financial Reporting System) and Micro Funding.
• Identified and created partition tables, with data retention, for the entire data warehouse in UAT and production.
• Involved in business analysis, data analysis, data modeling, and ETL development.
• Designed and created tables, views, synonyms, and indexes, and granted privileges.
• Developed packages, stored procedures, functions, triggers, and materialized views.
• Used the UTL_FILE package to read and write CSV files.
• Wrote joins to extract data from multiple source tables and views.
• Used analytic functions to compute aggregate values over groups of rows.
• Created different types of cursors to manipulate data within SQL statements.
• Created global temporary tables to store intermediate results and temporary data.
• Created range partitions on the fact table based on transaction date.
• Created shell scripts to run PL/SQL programs.
• Participated in meetings and acted as subject matter expert when required.
• Analyzed user needs and software requirements to determine feasibility of designs within time and cost constraints.
• Identified dimensions, measures, and detail attributes from requirements gathered in meetings with users.
• Created an ODS for ad-hoc and batch reporting by drawing data from different source systems.
• Provided technical analysis of ETL processes and assisted in integrating legacy data from multiple sources into an Oracle environment.
• Supported data cleansing and reporting activities for legacy system owners, identifying and correcting data inconsistencies (incorrect formats, invalid values, and duplicate records).
• Created CSV files for job, task, validation, and business rules for Data Quality.
• Loaded data from flat files into Oracle using the SQL*Loader utility.
• Participated in all phases of the data warehouse development cycle with minimal direction.
• Tuned queries to improve data-retrieval performance using EXPLAIN PLAN and SQL Trace.
• Migrated data from development to testing and production environments.
• Provided production support and maintenance, and worked with other team members to optimize system performance.
• Provided technical assistance by responding to inquiries regarding errors, problems, or questions with programs and interfaces.
• Prepared documentation and test plans that clearly address all technical processes.
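The kinds of data-quality checks listed above (incorrect formats, invalid values, duplicate records) can be sketched outside PL/SQL as well. This is an illustrative Python sketch only; the file layout and field names (member_id, txn_date, amount) are hypothetical, not taken from the projects listed.

```python
import csv
import io
import re

def validate_rows(csv_text):
    """Flag format errors, invalid values, and duplicates in a flat file.

    Hypothetical layout: member_id,txn_date,amount (names are illustrative).
    Returns (clean_rows, errors), where errors pairs a line number
    with the reason the row was rejected.
    """
    date_re = re.compile(r"^\d{4}-\d{2}-\d{2}$")  # expected ISO date format
    seen, clean, errors = set(), [], []
    # Data rows start at physical line 2 (line 1 is the header).
    for lineno, row in enumerate(csv.DictReader(io.StringIO(csv_text)), start=2):
        key = (row["member_id"], row["txn_date"])
        if key in seen:                           # duplicate record
            errors.append((lineno, "duplicate"))
            continue
        if not date_re.match(row["txn_date"]):    # incorrect format
            errors.append((lineno, "bad date format"))
            continue
        try:
            amount = float(row["amount"])
        except ValueError:                        # invalid value
            errors.append((lineno, "non-numeric amount"))
            continue
        seen.add(key)
        clean.append({"member_id": row["member_id"],
                      "txn_date": row["txn_date"],
                      "amount": amount})
    return clean, errors

sample = """member_id,txn_date,amount
M001,2009-01-31,1500.00
M001,2009-01-31,1500.00
M002,31/01/2009,900.00
M003,2009-02-28,abc
M004,2009-02-28,250.50
"""
clean, errors = validate_rows(sample)
print(len(clean), len(errors))  # 2 clean rows, 3 rejected
```

In the Oracle systems described, the same rejects would typically land in an error table rather than a Python list, but the validation logic is the same shape.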
12/2004 - 1/2007
TechMahindra
Individual Contributor
• Translated requirements into functional and technical specifications.
• Developed systems to migrate data for the Standard Operating Environment (SOE) and Big Friendly Giant (BFG) data warehousing projects.
• Developed a system for generating daily Key Performance Indicator (KPI) data extracts from the data warehouse.
• Created PL/SQL packages for the rules engine to calculate KPI values using metadata.
• Created PL/SQL packages to generate data extract files in CSV format.
• Partitioned tables with frequent inserts, deletes, and updates to reduce contention and improve performance.
• Extracted data from legacy systems and stored it in a staging schema.
• Created mappings between legacy system data files and the staging tables.
• Applied field-level validations such as data cleansing and data merging on various interfaces.
• Created and used DB links to access source data in different databases.
• Developed packages, stored procedures, functions, triggers, and materialized views.
• Created synonyms to access objects in different databases.
• Created indexes to tune the queries that generate the data extracts.
• Created PL/SQL packages to clean and derive data using rules stored in the metadata schema.
• Created PL/SQL packages to populate the cleansed data into the target system.
• Used error routines and exception-handling techniques per business rules.
• Used different types of cursors to manipulate data within SQL statements.
• Used analytic functions to compute aggregate values over groups of rows.
• Developed Unix shell scripts to automate the extraction process.
• Worked with other teams to integrate components into the core system.
• Provided technical assistance by responding to inquiries regarding errors, problems, or questions with programs and interfaces.
• Performed unit testing and tuned for better performance.
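A metadata-driven rules engine like the one described for the KPI extracts keeps the formulas in a rules table rather than in code, so new KPIs need no code change. A minimal Python sketch of the idea; the KPI names, formulas, and measure fields here are hypothetical placeholders, not the actual project metadata.

```python
# Illustrative sketch of a metadata-driven KPI rules engine.
# In the real system the rules lived in a metadata schema and were
# applied by PL/SQL; here a dict of callables stands in for that table.
KPI_RULES = {
    "fault_rate":   lambda m: m["faults"] / m["lines_in_service"],
    "repair_ratio": lambda m: m["repairs_closed"] / max(m["repairs_opened"], 1),
}

def compute_kpis(measures, rules=KPI_RULES):
    """Apply every rule in the metadata table to one day's measures."""
    return {name: rule(measures) for name, rule in rules.items()}

# One day's raw measures (hypothetical values).
day = {"faults": 12, "lines_in_service": 4800,
       "repairs_closed": 45, "repairs_opened": 50}
kpis = compute_kpis(day)
print(kpis["fault_rate"])    # 0.0025
print(kpis["repair_ratio"])  # 0.9
```

Adding a KPI then means adding a row to the rules table (here, an entry in the dict), which is what makes the daily extract generation maintainable.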
1/2003 - 11/2004
K C S Pvt Ltd
Entry Level
• Created a Depository Back Office System for tracking and managing security transactions between clients and depositories (banks).
• Created a Loan Against Demat Shares system to draw loans against Demat shares from the bank according to the bank value of the shares.
• Created a system to generate and process Annual Maintenance Charges for Demat billing, calculating charges such as account opening charges, prepaid credit adjustments, and promotional credit adjustments.
• Designed and created tables, indexes, views, procedures, packages, and functions.
• Created normalized tables with reference keys.
• Fine-tuned stored procedures and queries by changing join orders, trying different join combinations, and verifying the fixes with various utilities.
• Tuned SQL queries and PL/SQL code using Oracle utilities such as TKPROF, EXPLAIN PLAN, and Autotrace.
• Developed procedures, packages, functions, and triggers using PL/SQL and Oracle-supplied packages.
• Wrote complex joins, correlated subqueries, aggregate functions, and analytic functions.
• Dropped and recreated indexes on tables for performance improvements.
• Partitioned tables with frequent inserts, deletes, and updates to reduce contention and improve performance.
• Created and ran SQL scripts to create the database's logical structures.
• Migrated data into Oracle from other sources, such as Excel spreadsheets.
• Created roles and granted privileges.
• Created sequences based on data volume.
• Created comprehensive test plans, test cases, and scripts for unit and integration testing.
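Several bullets across these roles mention analytic functions, i.e., computing an aggregate over a group of rows without collapsing the rows themselves. A small Python analogue of a per-client running balance, comparable in spirit to SUM(amount) OVER (PARTITION BY client ORDER BY seq); the client IDs and amounts are made up for illustration.

```python
from itertools import groupby
from operator import itemgetter

def running_total(rows):
    """Per-client running balance over (client, seq, amount) tuples.

    Analogue of SUM(amount) OVER (PARTITION BY client ORDER BY seq):
    every input row is kept, with the running aggregate appended.
    """
    out = []
    # groupby needs its input sorted by the partition key first.
    for client, grp in groupby(sorted(rows, key=itemgetter(0, 1)),
                               key=itemgetter(0)):
        total = 0.0
        for _, seq, amount in grp:
            total += amount
            out.append((client, seq, amount, total))
    return out

rows = [("C1", 1, 100.0), ("C1", 2, -30.0), ("C2", 1, 50.0)]
print(running_total(rows))
# [('C1', 1, 100.0, 100.0), ('C1', 2, -30.0, 70.0), ('C2', 1, 50.0, 50.0)]
```

The key property, shared with the SQL analytic function, is that the output has one row per input row, unlike GROUP BY, which would collapse each client to a single row.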
Education
2003
Master's Degree
Indira Gandhi National University
- Computer Applications
1999
Bachelor's Degree
M G University
- Computer Applications