
Big Data, Python, Data Vault, BI, DWH

  • on request
  • 12207 Berlin
  • National
  • ur  |  en  |  de
  • 19.01.2022

Brief Introduction

Self-employed consultant and developer in Business Intelligence, Data Warehousing, Big Data and Data Vault. I offer experience with Python, Oracle, Exasol and MicroStrategy, as well as with MSSQL, MySQL, Postgres, Redshift and middleware.

Qualifications

  • Big Data
  • Data Vault 2.0
  • Data Warehouse / DWH
  • ETL
  • J2EE (Java EE)
  • MicroStrategy
  • Online Analytical Processing (OLAP)
  • Oracle Database
  • Python

Project & Professional Experience

Python Data Engineer
Consulting firm, Heidelberg
11/2019 – 2/2020 (4 months)
IT & Development

Description

Fast-paced, intensive development of multiple data integration modules in Python on Linux in Docker.

These modules enabled data integration and provisioning for a new web-based application:

● Integration with NiFi over nifi-api: works with JSON retrieved from the NiFi REST API to traverse the NiFi flow, process groups and processors.
● Metadata handling: performs data type conversions for the target MySQL database, driven by metadata based on Big Data AVRO primitive data types.
● Download of large CSV files over a REST API with streams (using requests iter_content) in parallel threads; see the sketch after the technology list.
● Loading of CSV files into MySQL using the LOAD DATA INFILE command over SQLAlchemy (+pymysql).
● Data integration job: a main Python job that ties the other components together.

Python Libraries:
requests, SQLAlchemy, multiprocessing, pandas, dotenv, logging

Technologies: Python 3.7, Linux, Docker, PyCharm, Liquibase, PuTTY, RealVNC, Citrix.
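
As an illustration of the streaming download and bulk load described above, here is a minimal, hypothetical sketch; the endpoint, credentials, table name and CSV layout are assumptions, and the parallel-threads aspect is omitted for brevity:

```python
# Hypothetical sketch: stream a large CSV over HTTP and bulk-load it into MySQL.
# URL, credentials, table and CSV layout are illustrative, not project code.
import requests
from sqlalchemy import create_engine, text

def download_csv(url: str, path: str, chunk_size: int = 1 << 20) -> None:
    """Stream the response body to disk instead of holding it in memory."""
    with requests.get(url, stream=True, timeout=60) as resp:
        resp.raise_for_status()
        with open(path, "wb") as fh:
            for chunk in resp.iter_content(chunk_size=chunk_size):
                fh.write(chunk)

def load_csv(path: str) -> None:
    """Bulk-load the file via MySQL's LOAD DATA LOCAL INFILE."""
    engine = create_engine(
        "mysql+pymysql://user:password@localhost/db",
        connect_args={"local_infile": True},  # required for LOCAL INFILE
    )
    with engine.begin() as conn:  # one transaction around the load
        conn.execute(text(
            f"LOAD DATA LOCAL INFILE '{path}' INTO TABLE staging_table "
            "FIELDS TERMINATED BY ',' ENCLOSED BY '\"' IGNORE 1 LINES"
        ))

if __name__ == "__main__":
    download_csv("https://example.com/export/data.csv", "/tmp/data.csv")
    load_csv("/tmp/data.csv")
```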

Skills used

ETL, MySQL, Python

Data Vault and Big Data Consultant and Developer
Insurance company, Cologne
4/2019 – 7/2019 (4 months)
Insurance

Description

Data Vault 2.0-based Data Lake development and consulting with Python, on Hadoop Cloudera (CDH) and Amazon stacks.

Hadoop Cloudera (CDH):
- GDPR-compliant HDFS Data Lake using the AVRO file format.
- Hive/Impala-based Data Vault entities & Information Mart.

Amazon S3 and Redshift:
- S3-based Data Lake and external Athena/Redshift tables.
- Redshift-based Data Vault and virtualised Information Mart.

Pre-computed hash keys materialised as AVRO files in the lake; see the sketch below.

Technologies: Python 3.7, AWS, S3, Redshift, DMS, SQS, Cloudera, Avro, Hive, Impala
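
A minimal sketch of how such hash keys might be pre-computed and written as AVRO, assuming MD5-based Data Vault 2.0 hash keys; the fastavro library, the schema and the field names are illustrative assumptions, not the project's actual code:

```python
# Hypothetical sketch: pre-compute Data Vault 2.0 hash keys, materialise as AVRO.
# Schema, field names and hashing conventions are illustrative assumptions.
import hashlib
from fastavro import writer, parse_schema

def hash_key(*business_keys: str) -> str:
    """DV 2.0 style hash key: delimited, normalised business keys, MD5-hashed."""
    normalised = "||".join(k.strip().upper() for k in business_keys)
    return hashlib.md5(normalised.encode("utf-8")).hexdigest().upper()

schema = parse_schema({
    "name": "hub_customer",
    "type": "record",
    "fields": [
        {"name": "customer_hk", "type": "string"},  # pre-computed hash key
        {"name": "customer_id", "type": "string"},  # business key
    ],
})

records = [{"customer_hk": hash_key(cid), "customer_id": cid}
           for cid in ("C-1001", "C-1002")]

with open("hub_customer.avro", "wb") as out:
    writer(out, schema, records)
```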

Skills used

Apache Hadoop, Big Data, Python, Amazon Web Services (AWS)

Middleware Specialist
Telecom solutions provider, Berlin
9/2018 – 8/2019 (1 year)
Telecommunications

Description

● REST APIs and gateways with JSON, XML, IDocs and JavaScript.
● Data structure / model analysis between SAP, Salesforce and real-time microservices, and the respective data mappings.
● Development in MS SQL Server 2014 (SSMS).
● SQL and stored procedure development with transaction management and exception handling; see the sketch below.
● Test evidence and UAT documentation.

Technologies: MSSQL Server 2014, middleware, JSON/XML, CSV
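
The stored-procedure work itself was in T-SQL; purely as a hedged illustration of the transaction-management and exception-handling pattern, here is a minimal Python sketch calling a hypothetical procedure over pyodbc (procedure name, parameters and connection string are invented):

```python
# Hypothetical sketch: call a stored procedure with explicit transaction
# management and exception handling. Names and the DSN are invented.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;"
    "DATABASE=mw;UID=user;PWD=password",
    autocommit=False,  # manage the transaction explicitly
)
try:
    cursor = conn.cursor()
    cursor.execute("EXEC dbo.usp_upsert_order ?, ?", ("ORD-1", "NEW"))
    conn.commit()    # commit only if every statement succeeded
except pyodbc.Error:
    conn.rollback()  # otherwise none of the changes take effect
    raise
finally:
    conn.close()
```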

Skills used

Microsoft SQL-Server (MS SQL), SAP IDOC, JavaScript Object Notation (JSON), Representational State Transfer (REST)

MicroStrategy Developer
Retail company, Ruhr area
8/2017 – 3/2018 (8 months)
Wholesale

Description

Part of the FE team, responsible for implementing MicroStrategy use cases for the retail business.

Responsibilities included:

● Business validation of requirements, with the RE & architecture teams.
● Solution concept workshops, with the architecture & business teams.
● Implementation of MicroStrategy use cases (packages 2 & 3).
● Liaising between backend and frontend teams.

Extensive development experience with MSTR Documents: panel stacks, selectors, grids and graph components, and the use of multiple datasets.

Extensive experience with Visual Insight and OLAP metrics, including datasets with level and derived metrics.

Technical highlights included:
● Use of Transaction Services.
● Mapping of attributes (IDs, forms).
● Parent-child relationships & hierarchies.
● Use of multiple datasets, based on multiple data marts.

Advanced topics included:
● Setting up MicroStrategy job prioritisation.
● iCube optimisation & incremental refresh reports.

Operational tasks included bi-weekly deployments.

Skills used

MicroStrategy, Data Warehouse / DWH, Oracle (general)

Head of Data Technology
Crosslend GmbH, Berlin
9/2015 – 2/2017 (1 year, 6 months)
FinTech, Consumer Lending

Description

I was responsible for leading the BI and analytics function of the company, as a member of the management team: close cooperation with other heads, team leads and C-level executives; vendor management (MicroStrategy); streamlined many data acquisition, processing and KPI calculation challenges (e.g. payment processing); built visualisations and dashboards, together with maths-intensive calculations for returns and portfolio performance.

 Responsible for leading the BI and analytics function of Crosslend.
 Close collaboration with Executives, Marketing, Operations, Finance, Product, Engineering and DevOps.
 Enabled self-service BI; rolled out MicroStrategy.
 Investor fact sheets and pitch decks.
 Financial metrics: IRRs, annualised net returns (unadjusted) and default curves; see the sketch below.
 Marketing performance dashboards and reports (per channel).
 Customer insights for the Operations and CC teams.
 Payment processing and overdue-related KPIs.
 Visualisations, simulations and correlations.
 Successful closing of audits (positive opinion).
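
As a hedged illustration of the IRR calculations behind such metrics, here is a minimal sketch via bisection on the net present value; the cashflow series is invented, and monthly periods are an assumption:

```python
# Hypothetical sketch: internal rate of return via bisection on NPV.
# The cashflow series below is invented for illustration.

def npv(rate: float, cashflows: list[float]) -> float:
    """Net present value of period-indexed cashflows at a per-period rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows: list[float], lo: float = -0.99, hi: float = 10.0) -> float:
    """Per-period IRR: the rate where NPV crosses zero (assumes one sign change)."""
    for _ in range(100):  # bisection converges well past float precision
        mid = (lo + hi) / 2
        if npv(mid, cashflows) > 0:
            lo = mid  # rate too low, NPV still positive
        else:
            hi = mid
    return (lo + hi) / 2

# A loan paid out at t=0 and repaid in 12 monthly instalments:
flows = [-1000.0] + [90.0] * 12
monthly = irr(flows)
print(f"annualised IRR: {(1 + monthly) ** 12 - 1:.2%}")
```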

Skills used

MicroStrategy, Data Warehouse / DWH, Business Intelligence (BI), MySQL, ETL

Data Warehouse Architect
Kreditech Holdings SSL, Hamburg
7/2014 – 8/2015 (1 year, 2 months)
Fintech, Consumer Lending

Description

I was responsible for the Data Warehouse architecture and for managing the company's relationship with Exasol (service provider); trained and hired DWH engineers; built the overall DWH architecture and infrastructure; integrated unstructured NoSQL data (MongoDB); modelled the company's core business tables; wrote a finite state machine (for IFRS-based classification); and helped close audits successfully (a prerequisite for Series B funding).

 Responsible for the Data Warehouse architecture and the Data Engineering team.
 Managing the Data Warehouse technology infrastructure and service providers.
 Data modelling of the company's core revenue and accounting fact tables.
 Marketing data mart with performance data at campaign and keyword level (hierarchy).
 Finite state machine (for IFRS-based classification) and payment waterfall calculations; see the sketch below.
 Data historisation design concepts.
 Integration of unstructured NoSQL data (MongoDB).
 Successful closing of audits and Series B funding.
 Tech stack: Exasol, MongoDB, Postgres, Pentaho Kettle, Python and Lua.
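
A minimal sketch of a finite state machine for IFRS-style loan classification; the states, events and the 90-day threshold are invented assumptions, not the actual classification rules:

```python
# Hypothetical sketch: finite state machine for IFRS-style loan classification.
# States, transitions and the 90-day threshold are illustrative assumptions.
from enum import Enum

class LoanState(Enum):
    PERFORMING = "performing"
    OVERDUE = "overdue"
    DEFAULT = "default"

# (current state, event) -> next state; unknown pairs keep the current state.
TRANSITIONS = {
    (LoanState.PERFORMING, "missed_payment"): LoanState.OVERDUE,
    (LoanState.OVERDUE, "payment_received"): LoanState.PERFORMING,
    (LoanState.OVERDUE, "dpd_over_90"): LoanState.DEFAULT,
}

def step(state: LoanState, event: str) -> LoanState:
    return TRANSITIONS.get((state, event), state)

state = LoanState.PERFORMING
for event in ["missed_payment", "dpd_over_90"]:
    state = step(state, event)
print(state)  # LoanState.DEFAULT
```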

Skills used

Online Analytical Processing (OLAP), Data Warehouse / DWH, Pentaho Open Source BI Suite, PostgreSQL, MongoDB, ETL, Database development, Lua, Python

Senior Manager BI
Zalando SE, Berlin
8/2012 – 6/2014 (1 year, 11 months)
E-Commerce

Description

I was part of the ERP/MIS team, responsible for the customer analytics pipeline, and carried out a wide set of responsibilities: aggregation requirements using Hadoop (Java Map/Reduce) on an Oracle, Exasol and Pentaho Kettle technology stack; led the Oracle DWH migration to new hardware; rewrote legacy ETLs; migrated IBM Unica CRM in-house; managed freelancers.

 Responsible for the customer pipeline within Zalando BI.
 Cohort trend analysis for customers (hyperlink removed); see the sketch below.
 Analysis of website click log files using Hadoop (Java Map/Reduce).
 Design and development of customer survey data processing (Oracle PL/SQL).
 Interfacing an operational subset for forecast analysis (Exasol).
 Migration of IBM Unica CRM in-house; redesigned the CRM data model and simplified ETLs.
 Led the migration of the Oracle DB to new hardware, improving backup and recovery options.
 Tech stack: Hadoop, Oracle, Exasol, Pentaho Kettle, PostgreSQL and Business Objects
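
As a hedged illustration of cohort trend analysis, here is a minimal pandas sketch that buckets customers by first-order month and counts how many are still active in later months; the column names and toy data are invented:

```python
# Hypothetical sketch: cohort retention counts with pandas.
# Column names and the toy data are invented for illustration.
import pandas as pd

orders = pd.DataFrame({
    "customer_id": [1, 1, 2, 2, 3],
    "order_date": pd.to_datetime(
        ["2013-01-05", "2013-03-10", "2013-01-20", "2013-02-02", "2013-02-14"]
    ),
})

orders["order_month"] = orders["order_date"].dt.to_period("M")
# Cohort = month of the customer's first order.
orders["cohort"] = orders.groupby("customer_id")["order_month"].transform("min")
# Months elapsed since the first order (Period subtraction yields offsets).
orders["months_since"] = (orders["order_month"] - orders["cohort"]).apply(lambda d: d.n)

# Rows: cohort month; columns: months since first order; values: active customers.
cohorts = (
    orders.groupby(["cohort", "months_since"])["customer_id"]
    .nunique()
    .unstack(fill_value=0)
)
print(cohorts)
```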

Skills used

Apache Hadoop, Data Warehouse / DWH, SAP BusinessObjects (BO), PostgreSQL, Oracle Database, ETL, CRM consulting (general), ERP consulting (general)

Certificates

Certified Data Vault 2.0 Practitioner
2018
Oracle Certified Professional Database 11g Administrator
2010

Education

Computer Science
Bachelor's
4
Lahore, Pakistan

About me

Thank you for your visit. I work as a self-employed consultant and developer in Business Intelligence and Data Warehousing. I offer experience with Oracle, Exasol and MicroStrategy, as well as with MSSQL, MySQL, Postgres, Redshift, Python and middleware. In the Big Data area I recently delivered a GDPR-compliant Data Lake with Cloudera CDH (AVRO, Hive/Impala) and AWS (S3, Athena/Redshift). I am always looking for interesting contacts and projects, and I would be pleased to hear from you.

Other skills

● MicroStrategy, Oracle, Exasol and Python
● Data Vault 2.0 certified.
● Oracle certified (11g DBA).

Personal details

Languages
  • English (fluent)
  • German (good)
  • Urdu (native)
Willingness to travel
National
Work permit
  • European Union
Age
40
Professional experience
19 years and 1 month (since 02/2005)
Project management
2 years
