Data Strategy Consultant
- 1 reference
- on request
- 61118 Bad Vilbel
- on request
- en | de
- 26.01.2026
Brief introduction
Business details
Qualifications
Project & professional experience
7/2024 – 5/2025
Role description
- Defined and designed the requirements for data services on Azure to provide data lake, data analytics, and reporting capabilities for a fintech in the payments space
- Defined and instrumented a mobile event tracking plan in Amplitude for a native mobile payments app built by the client, enabling product and customer analytics
- Defined the client's KPIs based on a North Star framework to empower stakeholders toward operational excellence, success, and growth
- Built and designed dashboards in Power BI for management reporting and monitoring of operational excellence
Toolkit: Amplitude, Power BI, Azure Data Lake, Azure Data Factory
Microsoft Azure, Python
7/2023 – present
Role description
• Built and evolved modern data platforms and data products (hands-on)
• Developed data pipelines, data models, and analytics layers with Python, SQL, and modern cloud stacks
• Delivered AI- and RAG-based data products with a focus on data quality, robustness, and production operation
• Automated data-driven workflows to reduce manual processes and improve efficiency
• Designed and built AI-enabled data products using Python, including RAG pipelines and agent-based systems with LangChain, Pydantic, Instructor, and AWS Bedrock
• Designed and built RAG applications and optimized end-to-end processes using Palantir Foundry and Palantir AIP
• Implemented structured output validation, retry logic, observability, and human-in-the-loop patterns to improve robustness and production readiness of AI systems
• Developed data products and analytical workflows using Python, DuckDB, and Jupyter, focusing on fast iteration and reproducible analysis
• Worked with vector search, hybrid retrieval, and prompt-driven workflows where data modeling and data quality are critical
• Designed system architectures balancing experimentation with production constraints (security, cost, latency)
Tech: Python, DuckDB, AWS Bedrock, LangChain, Pydantic, Pydantic AI, Palantir Foundry
Amazon Web Services (AWS), Google Cloud, SQL
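As a minimal illustration of the structured output validation and retry patterns mentioned above, the sketch below uses plain Python; `call_llm` and the field schema are hypothetical stand-ins for illustration only, not the actual client code (which used Pydantic, Instructor, and AWS Bedrock).

```python
import json

def call_llm(prompt):
    # Hypothetical stand-in for an LLM call (e.g. via AWS Bedrock);
    # returns a JSON string that may or may not match the schema.
    return '{"amount": 12.5, "currency": "EUR"}'

# Invented example schema: field name -> expected Python type.
REQUIRED_FIELDS = {"amount": float, "currency": str}

def validate(payload):
    """Parse the model's output and check it against the schema."""
    data = json.loads(payload)
    for field, ftype in REQUIRED_FIELDS.items():
        if not isinstance(data.get(field), ftype):
            raise ValueError(f"invalid or missing field: {field}")
    return data

def structured_call(prompt, max_retries=3):
    """Retry the LLM call until its output validates, then return it."""
    for _ in range(max_retries):
        try:
            return validate(call_llm(prompt))
        except (ValueError, json.JSONDecodeError):
            continue  # re-prompt; a real system would also log the failure
    raise RuntimeError("no valid structured output after retries")

result = structured_call("Extract the payment amount and currency.")
```

In practice a library such as Pydantic replaces the hand-rolled `validate`, but the retry-until-valid loop is the same idea.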
11/2019 – 6/2023
Role description
• Built and owned the company-wide data and analytics architecture (hands-on)
• Implemented a modern data stack with AWS, Snowflake, dbt, Airflow, and BI tools
• Developed complex, reusable data models for finance, product, and funnel analyses
• Established analytics engineering best practices (testing, versioning, data quality)
• Designed and built the company-wide data architecture and analytics platform, owning the technical implementation end-to-end
• Implemented a modern data stack using AWS, Terraform, Airflow, dbt, Fivetran, Snowflake, and Tableau to support management reporting and analytical data products
• Designed and developed complex, reusable data models supporting revenue analysis, margin analysis, and conversion funnel analytics across multiple domains
• Built standardized data transformation layers and data quality checks to improve consistency, trust, and reliability of analytics outputs
• Partnered with product teams during system and schema design to ensure data structures supported scalable analytics and downstream reporting
• Introduced analytics engineering best practices modeled on DevOps, including version control, code reviews, CI/CD pipelines, data testing, documentation, and dev/prod environments, in order to increase team velocity and reliability
Tech: AWS, Terraform, Airflow, dbt, Snowflake, Fivetran, SQL, Tableau
Amazon Web Services (AWS), Data Warehousing, Python, Snowflake
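The data quality checks mentioned above were implemented as dbt tests; as a rough sketch of what such checks verify, here is a hand-rolled plain-Python version (the function names and sample rows are invented for illustration).

```python
def check_not_null(rows, column):
    """Pass only if no row has a NULL (None) in the given column."""
    return all(row.get(column) is not None for row in rows)

def check_unique(rows, column):
    """Pass only if the column contains no duplicate values."""
    values = [row[column] for row in rows]
    return len(values) == len(set(values))

# Invented sample data standing in for a warehouse table.
orders = [
    {"order_id": 1, "revenue": 10.0},
    {"order_id": 2, "revenue": None},
]

check_not_null(orders, "order_id")  # True: every order has a key
check_unique(orders, "order_id")    # True: keys are distinct
check_not_null(orders, "revenue")   # False: one revenue value is missing
```

In dbt the same checks are declared as `not_null` and `unique` tests in a schema file and run against the warehouse rather than in-memory rows.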
11/2016 – 6/2019
Role description
• Built and operated a PostgreSQL-based data warehouse (scaled from 3 to 23+ data sources)
• Developed and automated ETL processes and reporting pipelines with Python, SQL, and Jenkins
• Implemented an end-to-end invoicing system, including REST APIs, data pipelines, and automation
• Led the technical design and delivery of AML transaction monitoring (data modeling, performance optimization, production operation)
• Reduced dependencies on the platform/IT department by redesigning data flows
Progressed from BI Analyst to BI Manager while remaining hands-on in data engineering, platform and data product development.
• Designed, built and maintained a PostgreSQL-based data warehouse, scaling data ingestion from three to over 20 sources
• Developed ETL pipelines, data processing workflows and data models using Python, SQL, and Jenkins
• Automated management and operational reporting, reducing manual effort by ~80% and improving reliability
• Designed and implemented an end-to-end invoicing system, including REST APIs, data pipelines, and automation (Python, Git, Jenkins), reducing manual effort by ~90%
• Led the technical design and implementation of an AML transaction monitoring system, including data modeling, query optimization and end-to-end delivery to production
• Improved SQL performance and data model efficiency to support large loan portfolio analytics and compliance use cases
• Integrated data exchange solutions with third-party providers to enable secure and reliable data sharing
• Reduced platform team dependencies by ~95% by redesigning data flows and ownership boundaries
• Introduced version control, CI/CD-style workflows, and reproducible data practices across analytics and reporting
Amazon Web Services (AWS), Jenkins, PostgreSQL, Python, SQL, Tableau
5/2016 – 8/2016
Role description
• Automated customer performance reports with Python, SQL, and Jenkins
• Developed Power BI dashboards to analyze customer and revenue metrics
• Performed exploratory data analyses and market basket analyses to identify purchasing patterns
• Automated customer performance reporting using Python, SQL, and Jenkins, reducing manual reporting effort
• Built analytical dashboards in Power BI to monitor customer performance and key business metrics
• Performed exploratory data analysis and market basket analysis using R to identify customer purchasing patterns
• Worked on data cleaning, transformation, and statistical analysis to support internal decision-making
Tech: Python, SQL, Jenkins, Power BI, R, Data Analysis
Business Intelligence (BI), PostgreSQL, Power BI, Python, R (programming language), SQL
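The market basket analysis described above was done in R; a minimal Python sketch of the underlying idea (pair support and rule confidence, with invented sample baskets) looks like this:

```python
from itertools import combinations
from collections import Counter

# Invented example transactions; each basket is a set of items bought together.
baskets = [
    {"bread", "butter", "milk"},
    {"bread", "butter"},
    {"milk", "eggs"},
    {"bread", "milk"},
]

def pair_support(baskets):
    """Count how often each item pair co-occurs across baskets."""
    counts = Counter()
    for basket in baskets:
        for pair in combinations(sorted(basket), 2):
            counts[pair] += 1
    return counts

def confidence(baskets, antecedent, consequent):
    """P(consequent in basket | antecedent in basket)."""
    with_antecedent = [b for b in baskets if antecedent in b]
    return sum(consequent in b for b in with_antecedent) / len(with_antecedent)

support = pair_support(baskets)
# ("bread", "butter") co-occurs in 2 of the 4 baskets
conf = confidence(baskets, "bread", "butter")  # 2/3
```

Production analyses use dedicated implementations (e.g. the `arules` package in R) rather than this brute-force counting, but the support/confidence definitions are the same.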
10/2015 – 3/2016
Role description
Researched whether and how machine learning could optimize the detection and classification of fraudulent documents through text mining. Developed a proof of concept using a variety of machine learning models (in Python with scikit-learn) to analyze e-mails with natural language processing (NLP) techniques and derive a classification for these documents.
Skills used
Machine learning
5/2013 – 8/2023
Role description
Led the development of the financial model for a media venture, including waterfall analysis, feasibility testing, market entry strategy, revenue projections with scenarios, and profitability analysis.
Skills: data processing, MS Excel, financial modeling, communication
Skills used
Financial models, Microsoft Excel
8/2011 – 9/2012
Role description
Developed and maintained financial models for infrastructure, energy and desalination plants, and transportation projects (BOT). Additionally, performed model audits of infrastructure projects for banks and financial institutions.
Skills: MS Excel, financial modeling, risk management
Financial analysis, financial models, communication (general), Microsoft Excel
6/2007 – 10/2008
Role description
Modelled regular expressions for clients in order to improve security. Analyzed HTTP request logs for malicious attacks and attack attempts. Knowledge of SQL injection, XSS, path traversal, and more.
Skills used
Data mining
5/2006 – 8/2007
Role description
IT security analyst covering fraudulent activity in the banking and credit card industries. Monitored suspicious activity, detecting and investigating phishing attacks against customers of financial institutions and credit card companies.
Skills used
Data mining
12/2002 – 12/2005
Role description
Information systems technical specialist and unit leader of an emergency outbreak response team. Consulted for a wide range of stakeholders on information systems, debugging and troubleshooting issues with critical operational systems and software.
Skills used
IT technician (general), IT Service Management (ITSM), Linux (kernel)
Certificates
Palantir Technologies
Palantir Technologies
Amazon Web Services
DataTalks.Club
Education
Hochschule Wirtschaft und Recht Berlin
Interdisciplinary School Herzliya
Israel
About me
I am proficient in instilling a strong data culture and data literacy program, increasing automation capabilities, and improving a firm's operational performance using data.
I combine strong technical skills with keen business sense, rooted in a love of numbers and of making sense of data.
I have designed, developed, and orchestrated the data infrastructure for several fintechs, pushing forward a data-driven culture, modernizing the data infrastructure and tech stack, and creating scalable, future-proof architectures.
I deliver results on time and, more importantly, results that matter and bring the highest value.
Further skills
Personal details
- English (native)
- German (fluent)
- European Union
- Switzerland
