
Senior Data Engineer

  • 110 €/hour
  • 40549 Düsseldorf
  • on request
  • id  |  de  |  en
  • 12.04.2024

Brief introduction

Senior Data Engineer with more than 15 years of experience in Big Data Integration, Data Warehousing, Cloud Computing and Advanced Analytics

Reference excerpt (1)

"Working with B.-W. has always been a pleasure. He is super friendly, helpful and always professional. We will miss him in our team."
Senior Data Engineer
Client name anonymized
Project period

1/2022 – 12/2023

Description of activities

• Build and implement ETL pipelines using Python/Pandas
• Develop and deploy data-driven products & services used by customers
• Solve various data integration challenges
• Provide capabilities to deliver fast reporting and analytics solutions

Skills applied

Database development, Pandas DataFrame, Python

Qualifications

  • Apache Spark / Apache Kafka
  • Azure Databricks
  • Data Vault
  • Data Warehouse / DWH
  • Database development
  • Dimensional Modeling, Data Vault Modeling
  • ETL-Tools: DBT / ADF / Talend / Informatica
  • Microsoft Azure
  • Pandas DataFrame
  • Python
  • Python/Pandas, PySpark, Snowpark for Python
  • Snowflake
  • Snowflake Data Cloud
  • SQL, Scala, Java

Project & professional experience

Senior Data Engineer
Ista SE, Essen
4/2023 – present (1 year, 1 month)
Energy services provider
Project period

4/2023 – present

Description of activities

• Build a new data platform on the Snowflake Data Cloud (Azure cloud platform)
• Data integration with Azure Databricks, Azure Data Factory and Snowpark for Python (see the sketch below)
• Data modelling with the Data Vault 2.0 methodology and implementation of the data model with Data Build Tool (dbt)
• Develop a proof of concept for a generative AI use case in Snowflake
• Build CI/CD data pipelines on GitLab for full automation of testing and deployment
• Provision and manage infrastructure in the Azure cloud and in Snowflake with Terraform
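
Illustrative sketch of a Snowpark for Python load step like the one referenced above; the connection parameters, table names and column names are placeholders, not actual project objects.

    # Illustrative Snowpark for Python load step; all identifiers below are
    # placeholders, not actual project objects.
    from snowflake.snowpark import Session
    from snowflake.snowpark.functions import col, current_timestamp

    connection_parameters = {
        "account": "<account_identifier>",
        "user": "<user>",
        "authenticator": "externalbrowser",   # SSO via the identity provider
        "role": "DATA_ENGINEER",
        "warehouse": "LOAD_WH",
        "database": "RAW",
        "schema": "STAGING",
    }
    session = Session.builder.configs(connection_parameters).create()

    # Read source data landed by Azure Data Factory, add load metadata, and
    # append it to a staging table feeding the Data Vault raw layer.
    df = (
        session.table("SRC_METER_READINGS")
        .filter(col("READING_VALUE").is_not_null())
        .with_column("LOAD_TS", current_timestamp())
    )
    df.write.mode("append").save_as_table("STG_METER_READINGS")
    session.close()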

Skills applied

Data Vault, Data Warehouse / DWH, Microsoft Azure, Snowflake

Senior Data Engineer
DB Systel GmbH, Frankfurt am Main
1/2022 – 12/2023 (2 years)
Logistics services provider
Project period

1/2022 – 12/2023

Description of activities

• Build and implement ETL pipelines using Python/Pandas (see the sketch below)
• Develop and deploy data-driven products & services used by customers
• Solve various data integration challenges
• Provide capabilities to deliver fast reporting and analytics solutions
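
Illustrative sketch of a Pandas-based ETL step of the kind listed above; the file name, column names and target connection string are placeholders, not actual project details.

    # Illustrative Pandas ETL step; paths, columns, and the target database URL
    # are placeholders, not actual project details.
    import pandas as pd
    from sqlalchemy import create_engine

    def run_pipeline(source_csv: str, target_url: str) -> None:
        # Extract: read raw event data from a CSV export.
        raw = pd.read_csv(source_csv, parse_dates=["event_time"])

        # Transform: remove duplicates and derive a daily average delay per station.
        daily = (
            raw.drop_duplicates()
            .assign(event_date=lambda df: df["event_time"].dt.date)
            .groupby(["event_date", "station_id"], as_index=False)["delay_minutes"]
            .mean()
            .rename(columns={"delay_minutes": "avg_delay_minutes"})
        )

        # Load: write the aggregate to the reporting database.
        engine = create_engine(target_url)
        daily.to_sql("daily_station_delays", engine, if_exists="replace", index=False)

    if __name__ == "__main__":
        run_pipeline("events.csv", "postgresql+psycopg2://user:pass@host/db")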

Skills applied

Database development, Pandas DataFrame, Python

Snowflake Architect
Ratepay GmbH, Berlin
11/2021 – 12/2021 (2 months)
Financial services provider
Project period

11/2021 – 12/2021

Description of activities

• Design and implement a cloud data platform on Snowflake (AWS):
- Migration of an on-premises SQL Server database to Snowflake on AWS
- Integration of Okta as identity provider and Tableau as BI tool

• Design and implement the Snowflake layered security:
- Single sign-on and SCIM integration with Okta as identity provider
- Network security with AWS PrivateLink
- Design of role-based access control (RBAC)
- Design of security monitoring scripts (see the sketch below)
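
Illustrative sketch of a security monitoring script like those mentioned above, using the Snowflake Python connector to report recent failed logins; the account, user and alerting threshold are placeholders.

    # Illustrative Snowflake security monitoring check; account, user, and the
    # alerting threshold are placeholders.
    import snowflake.connector

    QUERY = """
        SELECT user_name, COUNT(*) AS failed_attempts
        FROM snowflake.account_usage.login_history
        WHERE is_success = 'NO'
          AND event_timestamp >= DATEADD(hour, -24, CURRENT_TIMESTAMP())
        GROUP BY user_name
        HAVING COUNT(*) >= 5
        ORDER BY failed_attempts DESC
    """

    def report_failed_logins() -> None:
        conn = snowflake.connector.connect(
            account="<account_identifier>",
            user="<monitoring_user>",
            authenticator="externalbrowser",  # single sign-on via Okta
            role="SECURITYADMIN",
            warehouse="MONITORING_WH",
        )
        try:
            with conn.cursor() as cur:
                for user_name, failed_attempts in cur.execute(QUERY):
                    print(f"{user_name}: {failed_attempts} failed logins in the last 24 h")
        finally:
            conn.close()

    if __name__ == "__main__":
        report_failed_logins()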

Skills applied

Cloud Computing, Snowflake

Senior Solutions Architect (permanent position)
Snowflake Computing GmbH, München
5/2021 – 10/2021 (6 months)
IT & development
Project period

5/2021 – 10/2021

Description of activities

• Guide customers through the process of migrating to Snowflake and develop methodologies to improve the migration process
• Deploy Snowflake following best practices, including ensuring knowledge transfer so that customers are properly enabled and are able to extend the capabilities of Snowflake on their own
• Work hands-on with customers to demonstrate and communicate implementation best practices on Snowflake technology
• Maintain deep understanding of competitive and complementary technologies and vendors and how to position Snowflake in relation to them
• Work with System Integrator consultants at a deep technical level to successfully position and deploy Snowflake in customer environments
• Provide guidance on how to resolve customer-specific technical challenges
• Support other members of the Professional Services team in developing their expertise
• Collaborate with Product Management, Engineering, and Marketing to continuously improve Snowflake’s products and marketing.

Skills applied

Snowflake, Amazon Web Services (AWS), Cloud Computing, Data Warehouse / DWH, ETL, Google Cloud, Microsoft Azure, Solution Architecture

Senior Data Engineer (permanent position)
ARAG Versicherung, Düsseldorf
1/2019 – 4/2021 (2 years, 4 months)
Insurance
Project period

1/2019 – 4/2021

Description of activities

Design and build a data lake on the Microsoft Azure cloud:

• Build ETL pipelines with Azure Databricks, Azure Data Factory, Azure Functions and Talend Data Management (see the sketch below)
• Data integration from a variety of data sources (e.g. relational data from databases such as IBM DB2, MS SQL Server and SAP BW, and semi-structured data from web services such as weather data and Check24 data)
• Deployment of Azure Database for PostgreSQL for relational data within the data lake
• Deployment of Azure Data Lake Storage Gen2 for unstructured / semi-structured data within the data lake
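
Illustrative sketch of a Databricks (PySpark) ingestion step of the kind described above, reading a relational source over JDBC and writing it to Azure Data Lake Storage Gen2; hostnames, credentials and paths are placeholders.

    # Illustrative PySpark ingestion job; JDBC URL, credentials, and storage
    # paths are placeholders, not actual project configuration.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("ingest_policies").getOrCreate()

    # Extract: read a source table from SQL Server over JDBC.
    policies = (
        spark.read.format("jdbc")
        .option("url", "jdbc:sqlserver://<host>:1433;databaseName=<db>")
        .option("dbtable", "dbo.policies")
        .option("user", "<user>")
        .option("password", "<password>")  # on Databricks, prefer a secret scope
        .load()
    )

    # Transform: add load metadata for downstream auditing and partitioning.
    policies = (
        policies
        .withColumn("load_ts", F.current_timestamp())
        .withColumn("load_date", F.to_date(F.col("load_ts")))
    )

    # Load: write partitioned Parquet files to Azure Data Lake Storage Gen2.
    (
        policies.write.mode("overwrite")
        .partitionBy("load_date")
        .parquet("abfss://raw@<storageaccount>.dfs.core.windows.net/policies/")
    )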

Skills applied

Apache Spark, Big Data, Cloud Computing, Data Warehouse / DWH, ETL, Java (general), Microsoft Azure, Python, Snowflake

Data Specialist (permanent position)
AXA, Köln
12/2017 – 12/2018 (1 year, 1 month)
Insurance
Project period

12/2017 – 12/2018

Description of activities

Build a scalable big data pipeline moving streaming and batch data to a data lake on the AWS cloud:

• Data integration from a variety of data sources (IBM mainframe, Oracle, MS SQL Server and SAP BW) with the Informatica data integration tool and AWS Glue
• Develop a Python application for the automatic generation of Informatica mappings, sessions and workflows
• Deployment of the Confluent Kafka streaming platform, one of the main building blocks of the data ingestion process (see the sketch below)
• Deployment of the Snowflake Data Cloud on the AWS cloud platform for structured and semi-structured data within the data lake
• Deployment of AWS S3 storage for unstructured and semi-structured data within the data lake
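
Illustrative sketch of a streaming landing step along the lines described above, consuming events from Kafka and writing batches to the S3 raw zone; the broker, topic, bucket and batch size are placeholders.

    # Illustrative Kafka-to-S3 landing job; broker, topic, bucket, and batch
    # size are placeholders, not actual project configuration.
    import json
    import uuid

    import boto3
    from confluent_kafka import Consumer

    BATCH_SIZE = 500

    consumer = Consumer({
        "bootstrap.servers": "<broker>:9092",
        "group.id": "datalake-landing",
        "auto.offset.reset": "earliest",
    })
    consumer.subscribe(["policy-events"])

    s3 = boto3.client("s3")
    batch = []

    try:
        while True:
            msg = consumer.poll(1.0)
            if msg is None or msg.error():
                continue
            batch.append(json.loads(msg.value()))
            if len(batch) >= BATCH_SIZE:
                # Land one newline-delimited JSON file per batch in the raw zone.
                s3.put_object(
                    Bucket="<datalake-raw-bucket>",
                    Key=f"policy-events/{uuid.uuid4()}.json",
                    Body="\n".join(json.dumps(rec) for rec in batch).encode("utf-8"),
                )
                consumer.commit()
                batch.clear()
    finally:
        consumer.close()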

Skills applied

Amazon Web Services (AWS), Apache Spark, Big Data, Cloud Computing, Data Warehouse / DWH, ETL, Python, Snowflake

Big Data Engineer
Fintech Firm, Düsseldorf
3/2016 – 11/2017 (1 year, 9 months)
Financial services provider
Project period

3/2016 – 11/2017

Description of activities

Build a big data platform on Azure HDInsight:
• Extract, transform and load data from a variety of sources (Oracle, MS SQL Server, SAP BW) to Azure Blob Storage
• Use the HDInsight tools Spark and Hive to transform data from Azure Blob Storage and store it in Hadoop-based storage (HDFS)
• Support the data science team in building a Hadoop-based analytical platform leveraging analytics tools such as Spark Machine Learning and Apache Mahout

Support a proof-of-concept activity designing a machine-learning-based predictive model for credit card fraud detection on the Azure HDInsight platform:
• Design a predictive model using HDInsight tools such as HDFS, HBase, Kafka and Spark
• Build a data pipeline that extracts master data from the HBase NoSQL database through the Kafka streaming platform to Spark for batch processing
• Build a data pipeline for transaction data, which is streamed through the Kafka streaming platform to Spark for real-time processing
• Use the Spark Machine Learning library to build a prediction model for credit card fraud detection (see the sketch below)
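
Illustrative sketch of a Spark MLlib training step for a fraud-detection model as described above; the input path, feature columns and model choice are placeholders, not actual project data.

    # Illustrative Spark MLlib training job; input path, features, and the label
    # column are placeholders.
    from pyspark.sql import SparkSession
    from pyspark.ml import Pipeline
    from pyspark.ml.feature import VectorAssembler
    from pyspark.ml.classification import LogisticRegression
    from pyspark.ml.evaluation import BinaryClassificationEvaluator

    spark = SparkSession.builder.appName("fraud_model").getOrCreate()

    # Transaction batches (e.g. landed from Kafka) are read from HDFS.
    transactions = spark.read.parquet("hdfs:///data/transactions/")

    features = ["amount", "merchant_risk_score", "hour_of_day", "tx_count_24h"]
    assembler = VectorAssembler(inputCols=features, outputCol="features")
    lr = LogisticRegression(featuresCol="features", labelCol="is_fraud")
    pipeline = Pipeline(stages=[assembler, lr])

    train, test = transactions.randomSplit([0.8, 0.2], seed=42)
    model = pipeline.fit(train)

    # Evaluate on the held-out split before persisting the model.
    auc = BinaryClassificationEvaluator(labelCol="is_fraud").evaluate(model.transform(test))
    print(f"Test ROC AUC: {auc:.3f}")

    model.write().overwrite().save("hdfs:///models/fraud_lr")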

Skills applied

Apache Hadoop, Apache Spark, Big Data, Cloud Computing, Data Warehouse / DWH, ETL, Python, SQL

Technical Sales Manager (permanent position)
ZTE Deutschland GmbH, Düsseldorf
6/2008 – 2/2016 (7 years, 9 months)
Telecommunications
Project period

6/2008 – 2/2016

Description of activities

• Provide technical support to customers (Vodafone Group, Deutsche Telekom, Telefonica O2) for the deployment and testing of ZTE products

• Organize proof-of-concept activities with customers to demonstrate ZTE product capabilities and to specify potential use cases

• Team management in ZTE data-based product development

• Planning and organization of pre-sales activities and meetings with suppliers (Qualcomm, Intel, Microsoft and Google)

Skills applied

Technical sales

Certificates

SnowPro Core Certification
2021
Certified International Investment Analyst Diploma
2015
Certified European Financial Analyst Diploma
2015

Education

Electrical Engineering
Diplom-Ingenieur
2000
Leibniz Universität Hannover

Additional skills

Data Platform:
- Snowflake Data Cloud
- Databricks Data Lakehouse
- Azure Synapse
- Google BigQuery
- PostgreSQL
- Azure SQL Server
- DB2
- Oracle

ETL / ELT Tool:
- Python (Jupyter Notebook)
- Snowpark for Python
- Azure Databricks (PySpark)
- Azure Data Factory
- DBT
- Apache Kafka
- Talend Data Management
- Informatica Data Integration

Data Warehouse Modeling:
- Dimensional Modeling, Data Vault Modeling

Cloud technology:
- Microsoft Azure
- Amazon Web Services
- Google Cloud

Programming skills:
- Python
- SQL
- Scala
- Java

Advanced Analytics / Machine Learning:
- Spark MLlib
- Scikit-learn

Operating System:
- Red Hat Linux
- Ubuntu Linux
- Windows
- macOS

Personal details

Languages
  • Indonesian (native speaker)
  • German (fluent)
  • English (good)
Willingness to travel
on request
Work permit
  • European Union
Home office
preferred
Profile views
1651
Age
54
Professional experience
23 years and 3 months (since 01/2001)
Project management
5 years
