
Cloud Data & AI Solution Architect | Senior BI Consultant
- 0 references
- on request
- 70771 Leinfelden-Echterdingen
- on request
- el | en | de
- 23.06.2025
Short Introduction
Qualifications
Project & Professional Experience
11/2024 – present
Activity description
• Defining the concept and architecture of the business data & AI cloud platform.
• Leading the creation and maintenance of the infrastructure components of their data & AI platform.
• Designing the CI/CD pipelines to streamline the software delivery process from outsourced dev teams.
• Automating the data pipelines to prevent errors from back-office operations and manual processes, to allow rapid product iterations and provide consistent feedback.
• Developing a scalable DB with multiple instances to dynamically scale workloads, data volumes and optimize queries.
• Implementing Dev(Sec)Ops practices and workflows to assure data quality, code integration and security.
• Building AI/ML models and integrating them into the data platform to provide predictions, rankings and customer analytics.
• Developing text analytics methods based on Natural Language Processing (NLP) to analyze data (incl. text messages, emails and other unstructured data), enhance communication between users, implement an AI-powered chatbot/virtual assistant and incorporate OpenAI/LLM-powered responses to users.
Technologies: MongoDB, MS Azure technologies, DevOps, Databricks, Python, various AI/ML/GenAI/LLM/NLP methods, GitHub, Asana.
Architecture (general), Continuous Delivery, Continuous Integration, Data Engineer, Data Science, Data Warehousing, Databricks, DevOps, Generative AI, Git, Machine Learning, MLOps, MongoDB, Natural Language Processing, Python
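As an illustrative sketch of the chatbot work described above: user messages can first be routed to a known intent, deferring unmatched input to an LLM. All names and keywords here are assumptions for illustration, not the project's actual implementation:

```python
import re

# Illustrative keyword-to-intent map; the actual system would use an
# NLP model or an LLM classifier rather than simple keyword matching.
INTENTS = {
    "billing": {"invoice", "payment", "charge"},
    "support": {"error", "broken", "help"},
}

def route_message(message: str) -> str:
    """Return the matched intent, or 'llm_fallback' to defer to an LLM."""
    tokens = set(re.findall(r"[a-z]+", message.lower()))
    for intent, keywords in INTENTS.items():
        if tokens & keywords:
            return intent
    return "llm_fallback"
```

Routing cheap, well-understood intents locally and reserving LLM calls for the fallback path keeps latency and cost down.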
2/2023 – 10/2024
Activity description
• Engaging with (internal) customers to understand their business challenges and requirements, and help them design & implement new methodologies, processes and cloud-based services to maximize the value chain.
• Contributing to the conceptualization and technical definitions of Agile SAFe features, as well as to the standardization of data definitions, data processes and operations.
• Maintaining and extending customer relations to increase awareness, build traction and drive onboardings to the company’s modernized Data Management ecosystem.
• Definition, development and delivery of APIs, PoCs, Use Cases, MVPs and E2E data-driven solutions.
• Design and implementation of an efficient enterprise-wide data integration and consolidation framework including:
o DevOps, DataOps, data migration and data quality assurance strategies.
o A multi-layered highly secure network infrastructure (incl. VLANs, VPNs, DMZ, Firewalls, Load Balancing, Gateways, etc.) for data onboardings from different locations and systems, or from public internet, into a Landing Zone.
o CDC-based and schema evolution aware data transfer pipelines for ingesting streaming and real-time data into a Landing Zone and then transferring them with data pipelines/flows to a Data Lake.
o Structured, template-based and automated data batch pipelines for ingesting large volumes of data from heterogeneous source systems and legacy on-prem or monolithic platforms across different geo-locations (EMEA, NAFTA, APAC).
o Data consumption pipelines from a multi-tenancy Kafka shared event streaming platform.
• Designing a multi-layered Data Lake and its data models, contributing also to the design of the company’s modernized Data Platform ecosystem (data application, analytics, refinement layer, etc.), as well as to the development of the corresponding APIs.
• Contributing to the implementation of the internal self-serve Data Product Marketplace using a data mesh architectural framework that considers various advanced data security challenges and promotes distributed, decentralized ownership.
• Transforming legacy applications and services using modern secure development methods with a set of proven DevOps and DataOps practices to rapidly launch new services to customers.
• Establishing metrics and KPIs to perform data validations and measure the data quality and the effectiveness of the data management initiatives.
• Analyzing and fixing potential security threats to comply with the internal auditing process.
• Contributing to the design and implementation of various dashboards (e.g. billing dashboard).
• Developing and implementing Data Governance policies and procedures for data compliance and adhering to regulatory requirements.
• Collaborating with cross-functional teams and supporting data-informed decision-making on the overarching Data Platform architecture.
• Contributing to the development of the technical foundation for CI/CD, DevOps, IaC, monitoring and logging.
Technologies: MS SQL Server, MS Azure ecosystem technologies (e.g. Fabric, Synapse, ADF, Power Apps, Event Hubs/Grid, AI, etc.), Power BI, DevOps, Terraform, Pulumi, Kafka Streaming, Databricks (incl. Spark Structured Streaming, Auto Loader), Delta Lake, GitHub, Confluence, JIRA.
Adobe Creative Cloud, Amazon Web Services (AWS), Apache Kafka, Big Data, Data Vault, Data Warehousing, Databricks, DevOps, Enterprise Architect (EA), Microsoft Azure, Python, System Migration
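The CDC-based transfer pipelines above apply inserts, updates and deletes from a change feed to a target table. A minimal in-memory sketch, with dicts standing in for Delta tables and all field names assumed:

```python
from typing import Any

def apply_cdc(target: dict[str, dict[str, Any]],
              changes: list[dict[str, Any]]) -> dict[str, dict[str, Any]]:
    """Apply a CDC change feed to a keyed target table.
    Each change carries an 'op' (insert/update/delete), a key and a payload."""
    for change in changes:
        key = change["key"]
        if change["op"] == "delete":
            target.pop(key, None)        # tolerate deletes for unknown keys
        else:                            # insert and update behave as an upsert
            target[key] = change["payload"]
    return target
```

In the actual platform this merge semantics would be expressed with Delta Lake's MERGE, with schema evolution handled by the ingestion layer (e.g. Auto Loader).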
1/2022 – 2/2023
Activity description
Realization of a Sales BI Reporting & Analytics data platform
- Translation of organizational strategies, business challenges and objectives into a feasible business transformation roadmap and data strategies, incl. requirements elicitation.
- Evaluation and analysis of Business Capabilities, Business Objects and Business Processes incl. Customer Journeys and Sales processes.
- Analysis and evaluation of current data interfaces via REST API, data streaming and other heterogeneous sources from all international retailer branches and the company’s Dealer Management system.
- Development of an overarching architecture and realization of an Enterprise Data Platform for onboarding Data Assets and creating Data Products following the BDD practices, Scrum/Kanban/SAFe Agile Frameworks, Design Thinking and Lean Methodologies.
- Realization of a Medallion Lakehouse Architecture, Delta Lake for streaming and batch data ingestions, and Data Vault (Raw, Business).
- Elaboration of a Data Virtualization layer across multiple other data platforms and data source systems.
- Implementation of strategies for data governance, data lineage, data cataloging, monitoring and data orchestration.
- Implementation of data & deployment pipelines (Infrastructure as Code (IaC), CI/CD, DevOps and DataOps).
- Development of data analytics methods (pattern analysis, text mining, clustering, ML/AI algorithms, etc.) for estimating KPIs and creating data reports and dashboards.
- Definition of best practices, guidelines and creation of an internal knowledge-base.
Technologies: MS SQL Server, MS Azure ecosystem services, Kafka Streaming, Databricks, Python, Databricks Spark Structured Streaming, Delta Lake, Terraform, VaultSpeed, Power BI, GitHub, Confluence, JIRA.
Databricks, Azure Synapse Analytics, Data Science, Data Vault, Data Warehousing, Data Analysis, DevOps, Apache Kafka, Microsoft SQL Server (MS SQL), Python, Transact-SQL
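The Medallion Lakehouse layers mentioned above (Bronze raw, Silver cleansed, Gold aggregated) can be sketched as plain-Python transforms; the record fields and cleansing rules are illustrative assumptions, not the project's actual logic:

```python
def to_silver(bronze: list[dict]) -> list[dict]:
    """Bronze -> Silver: drop malformed records and normalize fields."""
    return [
        {"dealer": r["dealer"].strip().upper(), "amount": float(r["amount"])}
        for r in bronze
        if r.get("dealer") and r.get("amount") is not None
    ]

def to_gold(silver: list[dict]) -> dict[str, float]:
    """Silver -> Gold: aggregate sales per dealer for reporting."""
    totals: dict[str, float] = {}
    for r in silver:
        totals[r["dealer"]] = totals.get(r["dealer"], 0.0) + r["amount"]
    return totals
```

In the real platform each layer is a Delta table and the transforms run as Spark jobs; the point of the layering is that each stage has one responsibility and can be validated independently.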
6/2021 – 1/2022
Activity description
Modernization of on-prem DWH and data migration to a cloud-based data platform.
- Assessment of current data quality and digital estate, incl. understanding data sovereignty requirements, business processes and workflow dependencies, and identifying technical debt in the legacy on-prem systems.
- Evaluation of migration and cloud adoption readiness, incl. optimization and sizing of the cloud platform, cost analysis, gap/blocker analysis, workload prioritization and migration testing.
- Realizing a (meta)data migration strategy for moving the data structures, ETL/ELT, DevOps and business processes to the cloud data platform, modernizing them at scale and resolving data gravity related issues.
- Determination of decommissioning processes for on-premises and legacy systems.
- Realization of a data security ring-fencing architecture based on a Zero-Trust approach for increased user, device and network security.
- Implementation of a modernized Data Lakehouse with automated data discovery, sensitive data classification and end-to-end data lineage capabilities, with a focus on Azure serverless capabilities.
Technologies: MS SQL Server (+SSIS, SSAS), MS Azure ecosystem services, DevOps (Git, Azure DevOps), Databricks, Python, Terraform, Bicep.
Databricks, Corporate Security, Data Warehousing, DevOps, ETL, Infrastructure Architecture, Microsoft Azure, Microsoft Business Intelligence, Microsoft SQL Server (MS SQL), System Migration, Project/Team Leadership (IT), Python, Solution Architecture
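One part of the (meta)data migration strategy above is translating legacy schema definitions into their cloud-platform equivalents. A hedged sketch with an assumed type map (the real mapping was driven by the migration assessment):

```python
# Assumed mapping from legacy SQL Server types to Spark/Databricks types;
# illustrative only, not the project's actual mapping table.
TYPE_MAP = {"varchar": "string", "datetime": "timestamp", "int": "int",
            "decimal": "decimal", "bit": "boolean"}

def migrate_schema(columns: list[tuple[str, str]]) -> list[tuple[str, str]]:
    """Map (name, legacy_type) pairs to target types, flagging unknowns
    so they surface in the gap/blocker analysis instead of failing silently."""
    return [(name, TYPE_MAP.get(legacy.lower(), f"UNMAPPED:{legacy}"))
            for name, legacy in columns]
```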
2/2021 – 6/2021
Activity description
Development of workflows and a web app for the digitalization of documents, text analytics and semantic search.
- Conceptual design and feasibility analysis (operational, technical, QA, etc.).
- Development of a sustainable, robust and automatic text recognition (OCR) method for multilingual (scanned) documents.
- Research and development of document processing and Deep Learning methods for the automatic document layout understanding.
- Realization of Computer Vision and AI-based methods for the identification and recognition of text-objects inside documents and the extraction of their content in a structured format.
- Intelligent and automatic classification and categorization of documents based on data/text mining and pattern recognition techniques.
Technologies: MS SQL Server, MS Azure technologies, Python, GitHub/GitLab, JIRA, Confluence.
Databricks, Data Mining, Data Science, Data Warehousing, Microsoft Azure, Project/Team Leadership (IT), Python, Software Architecture, Transact-SQL
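The automatic document classification step above can be illustrated with a rule-based pass over OCR output; the categories and patterns are assumptions for illustration (the project combined such rules with data/text mining and pattern recognition):

```python
import re

# Illustrative category patterns applied to OCR-extracted text.
PATTERNS = {
    "invoice": re.compile(r"\binvoice\s*(no|number|#)", re.IGNORECASE),
    "contract": re.compile(r"\b(agreement|contract)\b", re.IGNORECASE),
}

def classify_document(ocr_text: str) -> str:
    """Return the first matching category, or 'unclassified' for manual review."""
    for category, pattern in PATTERNS.items():
        if pattern.search(ocr_text):
            return category
    return "unclassified"
```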
5/2020 – 2/2021
Activity description
- Conceptualization, Design Thinking and implementation of a digital transformation roadmap incl. strategic data modelling and democratization.
- Legacy DWH architecture modernization and optimization, as well as application rationalization.
- Performance analysis and optimization of DWH, ETL processes, T-SQL queries and stored procedures.
- Analysis, design and implementation of a dynamic security role application concept.
- Data-driven analysis and identification of disruptive factors to improve application quality and Customer Experience in operative systems.
Technologies: MS SQL Server (SSIS, T-SQL), Excel, Power Pivot, Power Query, SQL Server Profiler, Power BI, GitHub, Confluence, JIRA.
Data Warehousing, Power Apps, Power BI, Microsoft SQL Server (MS SQL), Microsoft SQL Server Integration Services (SSIS), SQL, Transact-SQL
10/2018 – 4/2020
Activity description
Realization of a modern Enterprise Data Platform and data migration from a legacy DWH system
- Analysis of legacy monolithic DWH platform status, technical debts, platform bottlenecks, organizational boundaries and data life-cycle.
- Feasibility analysis and implementation of a new modular, decoupled and collaborative DPaaS (Data Products as a Service) enterprise architecture.
- Definition of transformation strategy incl. the security concept from the legacy monolithic architecture to a cloud-native architecture, ensuring cross-organizational compliance.
- Implementation of strategies for data migration, data governance, data quality, data modeling, data lineage, data literacy, Master Data Management, monitoring and data orchestration.
- Development of a lean and effective Target Operating Model (TOM).
- Building an enterprise-scale Data Management & Analytics Landing Zone architecture.
- Realization of an automated on-demand provisioning process of Data Engineering workspaces (Sandboxes) for data engineers and ML/AI/data scientists.
- Development and implementation of build/release/deployment pipelines incl. IaC (CI/CD, DevOps and DataOps).
- Development of various MVPs, Use Cases and Microservices.
Technologies: MS SQL Server, MS Azure ecosystem technologies, Terraform, Kubernetes, Docker, Databricks (PySpark, Python, pySQL), Power BI, Power Apps, GitHub, Confluence, JIRA.
Databricks, DevOps, Git, Microsoft Azure, Power Apps, Power BI, Microsoft SQL Server (MS SQL)
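The on-demand sandbox provisioning above can be sketched as an idempotent request handler; the names and in-memory registry are illustrative (the actual workspaces were provisioned via IaC pipelines against Azure):

```python
def provision_sandbox(registry: dict[str, dict], owner: str,
                      quota_gb: int = 100) -> dict:
    """Idempotently create a sandbox workspace entry for an engineer/scientist.
    Repeated requests for the same owner return the existing sandbox unchanged."""
    if owner in registry:
        return registry[owner]
    sandbox = {"owner": owner, "quota_gb": quota_gb, "status": "active"}
    registry[owner] = sandbox
    return sandbox
```

Idempotency matters here: automated self-service requests may be retried, and a retry must not create a duplicate workspace or reset an existing one.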
2/2018 – 2/2021
Activity description
Realization (design, development, testing and roll out) of a multi-layer DWH architecture for data analysis and reporting, incl. 3rd Level Support.
- Conceptual design and implementation of a complex ETL solution for data evaluation and consolidation from different underlying source systems.
- Design and development of multi-layer DWH architecture, incl. data source and extraction layer, staging area, data cleansing, business and presentation layers, as per Kimball DWH design methodologies.
- Responsible for maintaining the DWH database schemas, entity relationship diagrams, data modeling, tables, stored procedures, functions, constraints, indexes, views and complex SQL statements.
- Performance tuning/optimization of T-SQL queries and ETL processes.
- Creation of jobs, alerts and scheduled executions of SSIS Packages using SQL Server Agent.
- Designing, prototyping and developing a high-visibility interactive dashboard solution for operational and functional reporting, tracking different Key Performance Indicators (KPIs) and metrics from ODS and OLAP cubes.
- Monitoring full, incremental and daily loads and supporting all scheduled ETL jobs for batch processing.
- Responsible for deployments between different environments and for maintaining version control using MS Team Foundation Server (TFS).
- Creation of technical specification and ETL mapping documents with entity relationship and UML diagrams.
Technologies: MS SQL Server (SSIS, SSAS, SSRS, T-SQL, MDX), Oracle DB 11g, Excel (Power Pivot, Pivot Tables), SharePoint (Dashboards, ScoreCards), VBA, C#.
C#, Data Warehousing, Power BI, Microsoft SQL Server (MS SQL), Microsoft SQL Server Analysis Services (SSAS), Microsoft SQL Server Integration Services (SSIS), Microsoft SQL Server Reporting Services (SSRS), MultiDimensional EXpressions, Oracle Business Intelligence (BI), SQL, Transact-SQL, VBA (Visual Basic for Applications)
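The full, incremental and daily loads above typically hinge on a high-watermark pattern: each run extracts only the rows changed since the last recorded timestamp. A minimal sketch with assumed field names (the real implementation lived in SSIS packages):

```python
from datetime import datetime

def incremental_extract(rows: list[dict],
                        watermark: datetime) -> tuple[list[dict], datetime]:
    """Return rows modified after the watermark, plus the new watermark.
    An empty delta leaves the watermark unchanged."""
    new_rows = [r for r in rows if r["modified"] > watermark]
    new_wm = max((r["modified"] for r in new_rows), default=watermark)
    return new_rows, new_wm
```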
2/2017 – 10/2018
Activity description
Development of a cloud-based and AI-powered material quality control system for car seats in the automotive industry, in the field of interconnected devices and IoT.
- Conceptual design, feasibility analysis and data strategy based on customer needs.
- Carve-out of data science operations unit for industrialized delivery with specific service catalogue (A/B- and multivariate testing, use case automation, code standardization, scalability, etc.).
- Development of a ML/AI-powered cloud-based solution for the automatic material quality control of car seats based on high-resolution cameras and optical sensors.
- Automatic data-driven report of product condition based on real-time cognitive results.
- Automatic detection and classification of the detected wrinkles, damages and defects.
Technologies: MS Azure, Excel, Power BI, Python, Computer Vision, Data Analytics, GitHub, JIRA.
Data Science, Data Warehousing, Microsoft Azure, Power BI
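The wrinkle/defect detection above can be illustrated with a simple intensity-deviation heuristic over a grayscale frame; a list of lists stands in for a camera image, and the project used trained Computer Vision models rather than this threshold rule:

```python
def detect_defects(image: list[list[int]],
                   threshold: int = 40) -> list[tuple[int, int]]:
    """Flag (row, col) pixels deviating from the mean intensity by more
    than `threshold` — a crude stand-in for a learned anomaly detector."""
    pixels = [p for row in image for p in row]
    mean = sum(pixels) / len(pixels)
    return [(y, x)
            for y, row in enumerate(image)
            for x, p in enumerate(row)
            if abs(p - mean) > threshold]
```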
11/2015 – 1/2017
Activity description
Realization of a DWH, OLAP and Reporting solution for an international finance company to consolidate and generate its financial reports.
- Analysis and verification of the newly defined business and technical requirements.
- Development of cross-functional architecture and utilization of centralized consolidation with high level of standardization to ensure security, compliance, data validity and quality standards.
- Extension of legacy DWH architecture, dimensionality optimization and development of OLAP cubes.
- Realization (design, implementation and quality testing) of user interfaces and ETL processes to check data integrity and to perform data cleansing, data profiling and data transformation.
- Development of migration strategies to the cloud and resolution of internal data silo issues.
- Development of a data-layer structure for daily cash-flow analysis, evaluation and reporting.
- Optimization of SSIS (ETL) packages to improve the daily load performance using SQL performance tuning methodologies.
- Design and implementation of a business Operational Data Store (ODS) to integrate corporate data from different heterogeneous data sources, in order to facilitate operational reporting.
Technologies: Hyperion, MS Azure, MS SQL Server (SSIS, SSAS), Excel (Power Pivot, Pivot Tables), BIML, MDX, C#.
C#, Microsoft Azure, Microsoft SharePoint Server, Microsoft SQL Server (MS SQL), Microsoft SQL Server Analysis Services (SSAS), Microsoft SQL Server Integration Services (SSIS), MultiDimensional EXpressions, Transact-SQL, VBA (Visual Basic for Applications)
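The daily cash-flow data layer above consolidates transactions from heterogeneous sources into per-day balances; a minimal sketch with assumed field names (the real layer was built in the DWH/ODS):

```python
from collections import defaultdict
from datetime import date

def daily_cashflow(transactions: list[dict]) -> dict[date, float]:
    """Aggregate signed transaction amounts into a per-day cash-flow view
    (inflows positive, outflows negative)."""
    flows: dict[date, float] = defaultdict(float)
    for t in transactions:
        flows[t["booking_date"]] += t["amount"]
    return dict(flows)
```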
Personal Data
- Greek (native language)
- English (fluent)
- German (good)
- European Union
- Switzerland
Contact Details