Datawarehouse / DWH
Online Analytical Processing (OLAP)
J2EE (Java EE)
Data Vault 2.0
Project & Professional Experience
IT & Development
11/2019 – 2/2020
Activity description
Fast-paced, intensive development of multiple data integration modules in Python on Linux, running in Docker.
These developments enabled data integration and provisioning for a new web-based application:
Integration with NiFi via the NiFi REST API. This component parses the JSON returned by the REST API to traverse the NiFi flow, its process groups and processors.
Metadata handling component. Converts data types for the target MySQL database based on metadata expressed as Big Data Avro primitive types.
Component to download large CSV files over a REST API using streamed responses (requests' iter_content) in parallel threads.
Loading of CSV files into MySQL using the LOAD DATA INFILE command via SQLAlchemy (+pymysql).
Data integration job. The main Python job that ties the other components together.
Libraries: requests, SQLAlchemy, multiprocessing, pandas, dotenv, logging.
Technologies: Python 3.7, Linux, Docker, PyCharm, Liquibase, PuTTY, RealVNC, Citrix.
ETL, MySQL, Python
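The streamed CSV download described above can be sketched roughly as follows; the function names, chunk size and the split into a separately testable helper are illustrative, not taken from the original project:

```python
import io
import requests  # third-party HTTP client; provides the streamed iter_content API

def write_chunks(chunks, fh) -> int:
    """Write an iterable of byte chunks to a file-like object; return bytes written."""
    total = 0
    for chunk in chunks:
        if chunk:  # requests may yield empty keep-alive chunks
            fh.write(chunk)
            total += len(chunk)
    return total

def download_csv(url: str, dest_path: str, chunk_size: int = 1 << 20) -> int:
    """Stream a large CSV to disk without holding the whole file in memory."""
    with requests.get(url, stream=True, timeout=60) as resp:
        resp.raise_for_status()
        with open(dest_path, "wb") as fh:
            return write_chunks(resp.iter_content(chunk_size=chunk_size), fh)
```

Several such downloads can then be run in parallel threads, e.g. via `concurrent.futures.ThreadPoolExecutor`, matching the parallel-threads approach mentioned above.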
4/2019 – 7/2019
Activity description
Experience with Python
Data Vault 2.0 based data lake development and consulting with Hadoop Cloudera (CDH) and Amazon stacks.
Hadoop Cloudera (CDH):
- GDPR-compliant HDFS data lake using the Avro file format.
- Hive/Impala-based Data Vault entities & information mart.
Amazon S3 and Redshift:
- S3-based data lake and external Athena/Redshift tables.
- Redshift-based Data Vault and virtualised information mart.
Pre-computed hash keys materialised as Avro files in the lake.
Technologies: Python 3.7, AWS, S3, Redshift, DMS, SQS, Cloudera, Avro, Hive, Impala
Apache Hadoop, Big Data, Python, Amazon Web Services (AWS)
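The pre-computed hash keys mentioned above follow the Data Vault 2.0 pattern of hashing concatenated business keys; the MD5 choice, delimiter and normalisation rules below are common DV 2.0 defaults, assumed here rather than taken from the project's actual specification:

```python
import hashlib

def hash_key(*business_keys: str, delimiter: str = "||") -> str:
    """Deterministic Data Vault hash key from one or more business keys."""
    # Normalise before hashing so trivial formatting differences do not
    # produce different keys (an assumed rule, common in DV 2.0 setups).
    normalised = delimiter.join(k.strip().upper() for k in business_keys)
    return hashlib.md5(normalised.encode("utf-8")).hexdigest()
```

Keys computed this way ahead of load time let hubs, links and satellites join on a fixed-width surrogate instead of re-hashing at query time.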
Telecom solutions provider, Berlin
9/2018 – 8/2019
Activity description
● Data structure / model analysis between SAP, Salesforce and real-time microservices, with the respective data mappings.
● Development in MS SQL Server 2014 (SSMS).
● SQL and stored-procedure development with transaction management and exception handling.
● Test evidence and UAT documentation.
Technologies: MS SQL Server 2014, middleware, JSON/XML, CSV
8/2017 – 3/2018
Activity description
Part of the FE team, responsible for implementing MicroStrategy use cases for the retail business.
Business validation of requirements with the RE & architecture team.
Solution concept workshops with the architecture & business teams.
Implementation of MicroStrategy use cases (packages 2 & 3).
Liaising between backend and frontend teams.
Extensive development experience with MSTR Documents.
Use of panel stacks, selectors, grids and graph components.
Use of multiple datasets.
Extensive experience with Visual Insight and OLAP metrics.
Datasets with level and derived metrics.
Technical highlights include:
● Use of Transaction Services.
● Mapping of attributes (IDs, forms).
● Parent-child relationships & hierarchies.
● Use of multiple datasets, based on multiple data marts.
Advanced topics include:
● Setting up MicroStrategy job prioritisations.
● Intelligent Cube (iCube) optimisation & incremental-refresh reports.
Operational tasks included bi-weekly deployments.
MicroStrategy, Datawarehouse / DWH, Oracle (allg.)
Crosslend GmbH, Berlin
FinTech, Consumer Lending
9/2015 – 2/2017
Activity description
I was responsible for leading the BI and analytics function of the company as a member of the management team. Close cooperation with other heads, team leads and C-level executives. Vendor management (MicroStrategy). Streamlined many data acquisition, processing and KPI-calculation challenges (e.g. payment processing). Built visualizations and dashboards, together with maths-intensive calculations for returns and portfolio performance.
Responsible for leading the BI and analytics function of Crosslend.
Close collaboration with executives, Marketing, Operations, Finance, Product, Engineering and DevOps.
Enabled self-service BI; rolled out MicroStrategy.
Investor fact sheets and pitch decks.
Financial metrics: IRRs, annualized net returns (unadjusted) and default curves.
Marketing performance dashboards and reports (per channel).
Customer insights for the Operations and CC teams.
Payment-processing and overdue-related KPIs.
Visualizations, simulations and correlations.
Successful closing of audits (positive opinion).
MicroStrategy, Datawarehouse / DWH, Business Intelligence (BI), MySQL, ETL
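The return metrics mentioned above (IRRs, annualized net returns) reduce to root-finding on a net-present-value function; a minimal sketch, with illustrative cash-flow conventions and tolerances rather than the project's actual formulas:

```python
def npv(rate, cashflows):
    """Net present value of cashflows occurring at periods 0, 1, 2, ..."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=10.0, tol=1e-9):
    """Periodic IRR via bisection; assumes NPV changes sign on [lo, hi]."""
    for _ in range(200):
        mid = (lo + hi) / 2.0
        # Keep the half-interval where the NPV sign change (the root) lies.
        if npv(lo, cashflows) * npv(mid, cashflows) <= 0:
            hi = mid
        else:
            lo = mid
        if hi - lo < tol:
            break
    return (lo + hi) / 2.0
```

For example, an investment of 100 returning 110 one period later has a periodic IRR of 10%; annualizing then depends on the period length.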
Kreditech Holdings SSL, Hamburg
Fintech, Consumer Lending
7/2014 – 8/2015
Activity description
I was responsible for the data warehouse architecture and for managing the company's relationship with Exasol (service provider). Trained and hired DWH engineers, built the overall DWH architecture and infrastructure, integrated unstructured NoSQL data (MongoDB), modelled the company's core business tables, wrote a finite state machine (for IFRS-based classification) and successfully closed audits (a prerequisite for Series B funding).
Responsible for the data warehouse architecture and the data engineering team.
Managing data warehouse technology infrastructure and service providers.
Data modelling of the company's core revenue and accounting fact tables.
Marketing data mart: performance data at campaign and keyword level (hierarchy).
Finite state machine (for IFRS-based classification) and payment waterfall calculations.
Data historisation design concepts.
Integration of unstructured NoSQL data (MongoDB).
Successful closing of audits and Series B funding.
Tech stack: Exasol, MongoDB, Postgres, Pentaho Kettle, Python and Lua.
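A finite state machine for IFRS-style classification can be sketched as a transition table replayed over payment events; the stages, events and absorbing-default rule below are illustrative, not the original IFRS rules:

```python
# Illustrative loan stages and event-driven transitions (assumed, not the
# project's actual classification rules).
TRANSITIONS = {
    ("PERFORMING", "overdue"): "UNDERPERFORMING",
    ("UNDERPERFORMING", "paid"): "PERFORMING",
    ("UNDERPERFORMING", "overdue_90"): "DEFAULTED",
    # DEFAULTED is absorbing in this sketch: no transition leaves it.
}

def classify(events, start="PERFORMING"):
    """Replay payment events through the FSM; return the final stage."""
    state = start
    for event in events:
        # Unknown (state, event) pairs leave the state unchanged.
        state = TRANSITIONS.get((state, event), state)
    return state
```

Modelling the rules as a data-driven table keeps the classification auditable: each stage change is a lookup, not branching code.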
Online Analytical Processing (OLAP), Datawarehouse / DWH, Pentaho Open-Source-BI-Suite, PostgreSQL, MongoDB, ETL, Datenbankentwicklung, Lua, Python
Zalando SE, Berlin
8/2012 – 6/2014
Activity description
I was part of the ERP/MIS team, responsible for the customer analytics pipeline, carrying out a wide set of responsibilities and functions. Handled aggregation requirements using Hadoop (Java Map/Reduce) on an Oracle, Exasol and Pentaho Kettle technology stack. Led the Oracle DWH migration to new hardware, rewrote legacy ETLs, migrated IBM Unica CRM in-house and managed freelancers.
Responsible for the customer pipeline within Zalando BI.
Cohort trend analysis for customers (hyperlink removed).
Analysis of website click-log files using Hadoop (Java Map/Reduce).
Design and development of customer survey data processing (Oracle PL/SQL).
Interfacing an operational subset for forecast analysis (Exasol).
Migration of IBM Unica CRM in-house; redesigned the CRM data model and simplified the ETLs.
Led the migration of the Oracle DB to new hardware, improving backup and recovery options.
Tech stack: Hadoop, Oracle, Exasol, Pentaho Kettle, PostgreSQL and Business Objects.
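The click-log aggregation mentioned above was implemented in Java Map/Reduce; the following Python sketch shows the same map/reduce shape, with an assumed "timestamp page user" log-line format:

```python
from collections import Counter

def map_phase(lines):
    """Map step: emit (page, 1) pairs for each click-log line."""
    for line in lines:
        parts = line.split()
        if len(parts) >= 2:  # skip malformed lines
            yield parts[1], 1

def reduce_phase(pairs):
    """Reduce step: sum the emitted counts per page key."""
    counts = Counter()
    for key, value in pairs:
        counts[key] += value
    return dict(counts)
```

In Hadoop these two functions correspond to the Mapper and Reducer classes, with the shuffle phase grouping pairs by key between them.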
Apache Hadoop, Datawarehouse / DWH, SAP BusinessObjects (BO), PostgreSQL, Oracle Database, ETL, CRM Beratung (allg.), ERP Beratung (allg.)
Location: Lahore, Pakistan
● Data Vault 2.0 certified.
● Oracle certified (11g DBA).