📔 Projects

 

Project Detail

 

Company Name

Atos / Eviden

Project Name

Data Technologies and Innovations

November 2022 – Present

Client Organization

Credit One Bank

Role as Data Engineer (Visualization and Data pipeline)

About Project: This project is built around a centralized repository that stores structured and unstructured banking data of any size, preserves data in its native/original format, and processes diverse data types through multiple layers. Data sources include Hive tables in Apache Parquet and Apache Iceberg formats, and we use a range of tools to access data in other formats, including JSON, EBCDIC, CSV, XML, and relational databases (DBMS). MinIO S3A hosts the staging (raw data) layer and an AWS S3 bucket hosts the consumption data layer for data migrated from the various sources; Airflow schedules the ELT, and a PostgreSQL database stores metadata and event logs. The user reporting layer is hosted on Snowflake using a Data Vault 2.0 architecture: the S3 data is copied into Snowflake, and the processed data is surfaced in Tableau dashboards through AtScale, the semantic layer in front of Tableau.
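
A minimal sketch of the pipeline described above, assuming Airflow 2.x, boto3 for the MinIO (S3A) and AWS S3 layers, and the Snowflake Python connector. The bucket, stage, warehouse, and table names (raw-zone, consumption-zone, @CONSUMPTION_S3_STAGE, CONSUMPTION.TRANSACTIONS, ELT_WH) are illustrative placeholders, not the project's real objects.

```python
from datetime import datetime

import boto3
import snowflake.connector
from airflow import DAG
from airflow.operators.python import PythonOperator


def stage_raw_to_consumption():
    """Copy raw-layer objects from MinIO (S3A staging) into the AWS S3 consumption bucket."""
    minio = boto3.client("s3", endpoint_url="http://minio:9000")  # staging / raw data layer
    aws_s3 = boto3.client("s3")                                   # consumption data layer
    for obj in minio.list_objects_v2(Bucket="raw-zone").get("Contents", []):
        body = minio.get_object(Bucket="raw-zone", Key=obj["Key"])["Body"].read()
        aws_s3.put_object(Bucket="consumption-zone", Key=obj["Key"], Body=body)


def copy_into_snowflake():
    """Load the consumption-layer Parquet files into Snowflake through an external stage."""
    conn = snowflake.connector.connect(account="my_account", user="elt_user", password="***",
                                       warehouse="ELT_WH", database="DWH")
    try:
        conn.cursor().execute(
            "COPY INTO CONSUMPTION.TRANSACTIONS FROM @CONSUMPTION_S3_STAGE "
            "FILE_FORMAT = (TYPE = PARQUET) MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE"
        )
    finally:
        conn.close()


with DAG(dag_id="raw_to_snowflake_elt", start_date=datetime(2022, 11, 1),
         schedule_interval="@daily", catchup=False) as dag:
    stage = PythonOperator(task_id="stage_raw_to_consumption",
                           python_callable=stage_raw_to_consumption)
    load = PythonOperator(task_id="copy_into_snowflake",
                          python_callable=copy_into_snowflake)
    stage >> load  # stage the raw files before the Snowflake COPY runs
```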

Responsibilities: As a data architect, I develop data pipelines that keep data flowing smoothly, accurately, and on time, including the datatype conversions required by the various consuming applications. I also manage the Data Vault 2.0 warehouse in Snowflake, oversee processing loads for both inbound data (S3A) and outbound data (Tableau), and ensure compatibility and consistency across databases. In addition, I build data visualizations that make data accessible and understandable for stakeholders and support informed decision-making.
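
To illustrate the Data Vault 2.0 housekeeping mentioned above, the sketch below derives a hub hash key and a satellite hash diff with pandas before a Snowflake load. The column names (customer_id, first_name, credit_limit) are hypothetical, and the hashing convention shown is one common variant rather than the project's exact standard.

```python
import hashlib

import pandas as pd


def md5_hash(*parts) -> str:
    """Trim, upper-case, and delimit the inputs, then MD5-hash them (a common DV2 convention)."""
    normalized = "||".join(str(p).strip().upper() for p in parts)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()


raw = pd.DataFrame({
    "customer_id": ["C001", "C002"],
    "first_name": ["Ada", "Grace"],
    "credit_limit": ["1500.00", "2300.00"],  # arrives as text; cast before loading
})

# Datatype conversion to match the downstream schema.
raw["credit_limit"] = raw["credit_limit"].astype("float64")

# Hub hash key from the business key; hash diff over the descriptive attributes.
raw["hub_customer_hk"] = raw["customer_id"].map(md5_hash)
raw["customer_hashdiff"] = raw.apply(lambda r: md5_hash(r["first_name"], r["credit_limit"]), axis=1)

print(raw[["customer_id", "hub_customer_hk", "customer_hashdiff"]])
```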

Environment: Python (Pandas & Spark), Apache Hive, PostgreSQL, Snowflake, Apache Superset, Elasticsearch & Kibana, Tableau & AtScale, Apache Kafka, Trino, Apache Airflow, MinIO, AWS S3.

 

 

Company Name

Cognizant

Project Name

TriZetto Facets Configuration

January 2020 – November 2022

Client Organization

Horizon

Role as Project Lead (Database / Report Solution Architect)

About Project: This project focuses on configuring the TriZetto Facets application. We developed several clustered script programs to upload data directly into the configuration database, thereby preventing superfluous entries in the Facets application. Once claims are loaded into the database, a Snowpipe pipeline transfers the data to Snowflake to generate the charts displayed on the Tableau dashboard.

Responsibilities: We developed an Automation Utility to streamline the configuration load process for Facets. This utility, implemented as a console application with integration into Windows services, facilitated the export of data to Excel sheets. Users could conveniently edit and add data within these exported spreadsheets. Once the necessary modifications were made, the application seamlessly imported the Excel data into the Facets Configuration database, ensuring that user changes were accurately reflected in the system.

Key features of the Automation Utility included:

·         A console application integrated with Windows services for ease of use and automation.

·         Export functionality that enabled the extraction of data to Excel sheets.

·         A user-friendly interface within Excel sheets for editing and adding data.

·         Seamless import of modified data back into the Facets Configuration database.

·         Robust data validation and integrity checks to ensure accuracy during the import process.

With this Automation Utility, managing and updating Facets configurations became more efficient and user-friendly, ultimately enhancing the overall workflow and productivity.
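
The production utility itself was a VB.NET console application integrated with Windows services; the Python sketch below only illustrates the Excel round-trip it performed, assuming pandas and pyodbc, and the server, table, and sheet names (facets-sql, dbo.benefit_config, dbo.stg_benefit_config, BenefitConfig) are hypothetical.

```python
import pandas as pd
import pyodbc

conn = pyodbc.connect("DRIVER={ODBC Driver 17 for SQL Server};SERVER=facets-sql;"
                      "DATABASE=FacetsConfig;Trusted_Connection=yes;")
cur = conn.cursor()

# 1. Export the current configuration to Excel for users to edit.
current = pd.read_sql("SELECT config_id, plan_code, benefit_value FROM dbo.benefit_config", conn)
current.to_excel("benefit_config.xlsx", sheet_name="BenefitConfig", index=False)

# 2. After users edit or add rows, stage the sheet back into the database.
edited = pd.read_excel("benefit_config.xlsx", sheet_name="BenefitConfig")
staged_rows = [tuple(v.item() if hasattr(v, "item") else v for v in row)  # numpy -> plain Python types
               for row in edited.itertuples(index=False, name=None)]
cur.executemany("INSERT INTO dbo.stg_benefit_config (config_id, plan_code, benefit_value) "
                "VALUES (?, ?, ?)", staged_rows)

# 3. Merge the staged rows into the configuration table with a basic validity check.
cur.execute("""
    MERGE dbo.benefit_config AS tgt
    USING (SELECT * FROM dbo.stg_benefit_config WHERE benefit_value IS NOT NULL) AS src
        ON tgt.config_id = src.config_id
    WHEN MATCHED THEN UPDATE SET tgt.plan_code = src.plan_code,
                                 tgt.benefit_value = src.benefit_value
    WHEN NOT MATCHED THEN INSERT (config_id, plan_code, benefit_value)
                          VALUES (src.config_id, src.plan_code, src.benefit_value);
""")
conn.commit()
conn.close()
```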

Environment: MS SQL Server / Azure SQL, T-SQL, VB.NET application, Windows Services, MS-Excel, Tableau Workbook + Snowflake

 

Company Name

Cognizant

Project Name

Encounter Data Management

April 2019 – January 2020

Client Organization

TMG, Horizon

Role as Project Lead (Database / Report Solution Architect)

About Project: This project focuses on loading EDM data into SQL Server using Informatica mappings. The mappings are designed to load various file types, including '835 Response,' '999 Imputed and Rejected,' 'Remittance,' and 'Claim Approve and Reject' files. Though small, these mappings run continuously to load incoming data files from the landing folders. Once the claims loading process into the database completes successfully, Snowpipe transfers the data to Snowflake for visual representation through charts on the Tableau dashboard.

 

Responsibilities: We created Automation Utilities for various tasks involved in EDM data loading, including:

·         Automation Utilities Package for loading 835 Response files into EDM.

·         Automation Utilities Package for loading 999 Imputed and rejected files into EDM.

·         Automation Utilities Package for loading remittance files and Provider Data into EDM.

·         Automation Utilities Package for loading Claim Approved and Rejected data into EDM.

·         Data migration to an AWS S3 bucket, with the data later aggregated into Snowflake for dashboard purposes.

·         FHIR (Fast Healthcare Interoperability Resources) API (JSON) parsing, loading the output to an Azure container in CSV format and copying it into MongoDB collections in JSON format (a sketch follows this list).

·         CCDA (Consolidated Clinical Document Architecture) XML parsing, loading the output to an Azure container in CSV format.

·         CCDA XML also parsed to JSON and loaded into MongoDB collections in JSON format.

·         Data migration to Azure containers, with the data later aggregated into Snowflake for dashboard purposes.
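
A hedged sketch of the FHIR bullet above: parse a FHIR Bundle (JSON), flatten its Patient resources to CSV, land the CSV in an Azure blob container, and keep the raw JSON documents in MongoDB. The connection strings, container name, and collection name are placeholders, and the bundle layout is simplified.

```python
import json

import pandas as pd
from azure.storage.blob import BlobServiceClient
from pymongo import MongoClient

with open("patient_bundle.json", encoding="utf-8") as f:
    bundle = json.load(f)

# Flatten the Patient resources in the bundle into tabular rows and write them as CSV.
patients = [entry["resource"] for entry in bundle.get("entry", [])
            if entry["resource"].get("resourceType") == "Patient"]
pd.json_normalize(patients, sep="_").to_csv("patients.csv", index=False)

# Land the CSV in the Azure storage container (CSV layer)...
blob_service = BlobServiceClient.from_connection_string("<azure-storage-connection-string>")
with open("patients.csv", "rb") as csv_file:
    blob_service.get_blob_client(container="edm-landing", blob="patients.csv") \
                .upload_blob(csv_file, overwrite=True)

# ...and keep the original JSON documents in MongoDB for downstream consumers.
MongoClient("mongodb://localhost:27017")["edm"]["fhir_patients"].insert_many(patients)
```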

Environment: MS SQL Server / Azure, SSIS, T-SQL, VB.NET application, Windows Services, MS-Excel, AWS S3, Tableau Workbook (using Snowflake as data source)

 

Company Name

Cognizant

Project Name

TruCare Population Health and Care Management Platform

March 2017 – March 2019

Client Organization

CaseNET LLC, Bedford, MA

 

Role as Project Lead (Database / Report Developer Lead)

About Project: TruCare is a care and disease management product owned by Centene Corporation and used by several healthcare companies to manage care, utilization, and disease management. The work involved designing and developing database objects for ETL, tuning the performance of database queries, and designing and developing an XML parser and XML commit process using both Informatica and SSIS + .NET. The product ships many reports that summarize data across its sections; these reports are built with Jasper and SQL Server.
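
The XML parser and XML commit were built in Informatica and SSIS + .NET; the Python sketch below only illustrates the shape of that work, flattening an XML extract into rows a commit step could insert, with hypothetical element and column names.

```python
import xml.etree.ElementTree as ET

SAMPLE = """
<CarePlans>
  <CarePlan id="CP-1001" memberId="M-42">
    <Goal status="Active">Lower HbA1c below 7%</Goal>
    <Goal status="Met">Schedule annual eye exam</Goal>
  </CarePlan>
</CarePlans>
"""

# Parse the extract and flatten each care-plan goal into one row.
rows = []
for plan in ET.fromstring(SAMPLE).findall("CarePlan"):
    for goal in plan.findall("Goal"):
        rows.append({
            "care_plan_id": plan.get("id"),
            "member_id": plan.get("memberId"),
            "goal_status": goal.get("status"),
            "goal_text": goal.text.strip(),
        })

# The commit step would batch-insert these rows into the reporting tables.
for row in rows:
    print(row)
```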

Responsibilities:

·         Designed highly optimized views over data mart tables for use in the Jasper report model (Domain).

·         Designed the Jasper reports and the stored procedures that build them, and implemented both against the database/report model (Domain).

·         Developed and designed reports in Jaspersoft Studio and scheduled snapshot exports to Excel.

Environment: MS SQL Server, Jasper Reports and Jasper data modeling (Domain), T-SQL, MS-Excel, Informatica, SSIS.

 

Company Name

Cognizant

 

Project Name

Collection, Foreclosure and Claims on Bank

April 2016 – March 2017

Client Organization

US Bank, Irving, TX

Role as Project Lead (Database / MSBI Solution Architect)

About Project: The "Collection, Foreclosure, and Claims Banking" project uses SSIS and SSRS to streamline data management and reporting in the banking industry. SSIS facilitates data extraction, transformation, and loading from multiple systems for customer collections, foreclosure activities, and claims processing. Meanwhile, SSRS creates detailed reports on performance indicators and compliance metrics for informed decision-making. The project emphasizes data accuracy through validation processes during migration and collaboration with business units to meet reporting needs effectively. Documentation of ETL workflows supports maintenance efforts while continuous monitoring identifies areas for optimization to boost operational efficiency in banking operations.
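
The validation itself ran inside SSIS packages; the sketch below is only a minimal Python illustration of the kind of source-to-target reconciliation involved, with hypothetical server, database, and table names.

```python
import pyodbc

# Each check pairs a source query with the equivalent query against the reporting target.
CHECKS = [
    ("collections row count",
     "SELECT COUNT(*) FROM Collections.dbo.Accounts",
     "SELECT COUNT(*) FROM Reporting.dbo.CollectionsAccounts"),
    ("claims total amount",
     "SELECT COALESCE(SUM(ClaimAmount), 0) FROM Claims.dbo.Claims",
     "SELECT COALESCE(SUM(ClaimAmount), 0) FROM Reporting.dbo.Claims"),
]

conn = pyodbc.connect("DRIVER={ODBC Driver 17 for SQL Server};SERVER=bank-sql;"
                      "Trusted_Connection=yes;")
cur = conn.cursor()

for name, source_sql, target_sql in CHECKS:
    source_value = cur.execute(source_sql).fetchval()
    target_value = cur.execute(target_sql).fetchval()
    status = "OK" if source_value == target_value else "MISMATCH"
    print(f"{name}: source={source_value} target={target_value} -> {status}")

conn.close()
```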

Responsibilities: As the project lead for SSIS and SSRS development, responsibilities include defining project scope, leading a cross-functional team, overseeing ETL processes, developing a reporting strategy, implementing quality assurance measures, engaging stakeholders, maintaining documentation, managing risks, monitoring performance metrics, and providing post-implementation support. The role involves ensuring alignment with business goals and stakeholder needs while delivering actionable insights through well-structured SSRS reports. Proactive risk management and continuous performance monitoring are essential to stay on schedule and within budget throughout the project lifecycle. Coordination of training sessions for end-users is crucial for smooth adoption of new systems.

Environment: MS SQL Server 2012, SSIS (Visual Studio 2010), SSAS Tabular Modeling with Excel Reporting (DAX), T-SQL, Team Foundation Server 2012, MS-Access, MS-Excel, Tableau Workbook

 

 

Company Name

Cognizant

 

Project Name

Underwriting Data Migration

January 2015 – March 2016

Client

CareFirst, Baltimore

Role as Project Lead (Database and SSIS Developer)

About Project: The Underwriting Data Migration project focuses on migrating vital underwriting data from legacy systems to a contemporary platform. This comprehensive process encompasses data mapping, guaranteeing data integrity, and validating the accurate transfer of all information. Collaboration among stakeholders plays a crucial role in identifying key data components and upholding compliance standards. Additionally, the project places significant importance on reducing downtime and facilitating a seamless transition for underwriting operations. Rigorous testing is carried out to ensure that the new system meets all functional requirements before its deployment.
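
The migration itself was implemented with SSIS; the sketch below is a minimal pandas illustration of the data mapping and integrity checks described above, with a hypothetical legacy extract and column map.

```python
import pandas as pd

# Legacy-to-target column mapping agreed with the underwriting stakeholders (hypothetical names).
COLUMN_MAP = {
    "GRP_NBR": "group_number",
    "EFF_DT": "effective_date",
    "PREM_AMT": "premium_amount",
}

legacy = pd.read_csv("legacy_underwriting_extract.csv")

# Apply the mapping and convert datatypes to the target schema.
migrated = legacy.rename(columns=COLUMN_MAP)[list(COLUMN_MAP.values())]
migrated["effective_date"] = pd.to_datetime(migrated["effective_date"])
migrated["premium_amount"] = migrated["premium_amount"].astype("float64")

# Basic integrity checks before the data is handed to the load step.
assert len(migrated) == len(legacy), "row count changed during mapping"
assert migrated["group_number"].notna().all(), "missing group numbers"

migrated.to_csv("underwriting_migrated.csv", index=False)
```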

Responsibilities: As a project lead for SSIS and SSRS development, responsibilities include project planning, team management, technical oversight of ETL processes, report strategy focusing on user requirements, quality assurance through testing protocols, stakeholder engagement for feedback, and expectations management. Additionally, tasks involve documentation for future reference, risk management by identifying potential risks and developing mitigation strategies. Performance monitoring is crucial to track progress and make necessary adjustments while post-implementation support includes coordinating training sessions and providing ongoing assistance to end-users.

Environment: Oracle, MS SQL Server, SSIS, SSRS, T-SQL, Visual SourceSafe 2005 (VSS), PL/SQL, MS-Access, MS-Excel

 

Company Name

Syntel Inc.

 

Project Name

BI Development

September 2013 – January 2015

Client

Ameriprise, Minneapolis

Role as Sr Developer (Database and MSBI Lead)

Responsibilities:

Ø  Worked with Integration Services to transfer and review data from heterogeneous sources such as Excel, CSV, flat files, and text-format data (a minimal sketch follows this list).

Ø  Imported files (Employee Data, Pictures) into SQL Server tables using SSIS.

Ø  Worked on various dashboard reporting tools, including Cognos, Tableau, and BOXI, for different projects.

Ø  Developed different kinds of reports, such as sub-reports and chart/matrix reports, using Crystal Reports on BOXI.

Ø  Developed project plans and led resources to meet planned deliverables.
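
The loads themselves ran as SSIS packages; the Python sketch below only illustrates consolidating heterogeneous files (Excel, CSV, pipe-delimited text) into one SQL Server staging table, and the file, server, and table names are hypothetical.

```python
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine(
    "mssql+pyodbc://@bi-sql/Staging?driver=ODBC+Driver+17+for+SQL+Server&trusted_connection=yes"
)

# Read each heterogeneous source into a DataFrame.
frames = [
    pd.read_excel("employee_data.xlsx"),        # Excel workbook
    pd.read_csv("employee_extra.csv"),          # comma-separated file
    pd.read_csv("employee_flat.txt", sep="|"),  # pipe-delimited flat file
]

# Stack the frames (pandas unions their columns) and append them to the staging table.
combined = pd.concat(frames, ignore_index=True)
combined.to_sql("stg_employee", engine, if_exists="append", index=False)
```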

Environment: Oracle, MS SQL Server, SSIS, SSRS, T-SQL, Visual SourceSafe 2005 (VSS), PL/SQL, MS-Access, MS-Excel, Cognos, Tableau Desktop 8.1, BOXI – Crystal 8.0

 
