Data Engineer, DevOps

Freelance Data Engineer, DevOps on freelance.de
Europe
en  |  ru  |  de
on request
10439 Berlin
04.06.2019

Short introduction

Hello,
I am a strong engineer with experience in Big Data and DevOps (call me AIOps).
I am a committer to many open-source projects.
Red Hat Certified Architect: https://www.redhat.com/rhtapps/certification/badge/verify/UE25ZMXHKE3QVMNM47OBNQONRIAEQU3CUPSQX2K

I offer

IT, Development
  • Cloud Computing
  • Apache Hadoop
  • Red Hat Enterprise Linux (RHEL)
  • Software Architecture / Modelling
  • Data Warehouse / DWH
  • Big Data
  • Cyber Security
  • PCI-DSS

Project & Professional Experience

Consultant
Younicos (Aggreko), Berlin
7/2018 – 3/2019 (9 months)
Machinery, equipment and component manufacturing
Project period

7/2018 – 3/2019

Project description

Self-managed power grid system.
Established the release process. Created a self-managed system based on OpenShift, GlusterFS, OpenStack Swift, ActiveMQ, Ansible, OSPF and Karaf.
Created an Ansible module for Moxa network devices (see the module sketch below). Set up VPN connections based on Moxa EDR and Cisco ASA devices.
Built a time server based on a GPS receiver.
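A custom Ansible module of that kind is a small Python program built on AnsibleModule. The sketch below shows only the general structure; the parameter names (host, command, timeout) and the device interaction are placeholders, not the actual Moxa module.

#!/usr/bin/python
# Minimal custom Ansible module sketch (hypothetical parameters, not the
# real Moxa module): apply one command to a network device and report back.
from ansible.module_utils.basic import AnsibleModule


def main():
    module = AnsibleModule(
        argument_spec=dict(
            host=dict(type='str', required=True),      # device address (placeholder name)
            command=dict(type='str', required=True),   # command to apply (placeholder name)
            timeout=dict(type='int', default=30),
        ),
        supports_check_mode=True,
    )

    if module.check_mode:
        # Report "no change" without touching the device in check mode.
        module.exit_json(changed=False)

    # Placeholder for the real device interaction (e.g. SSH/SNMP/HTTP call).
    output = "applied '%s' on %s" % (module.params['command'], module.params['host'])

    module.exit_json(changed=True, stdout=output)


if __name__ == '__main__':
    main()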

Skills applied

Jenkins, Cloud Computing, OpenShift


Consultant
EOS, Hamburg
7/2017 – 8/2018 (1 year, 2 months)
Banking
Project period

7/2017 – 8/2018

Project description

Automated infrastructure with VMware, The Foreman, FreeIPA and Ambari. Built a full recovery-from-scratch system for big data analytics teams (multi-tenant, encryption by default). Patched Ambari and Hortonworks services. Integrated with the company infrastructure (Active Directory bound to the FreeIPA PKI, Kerberos and LDAP; network services such as DNS, DHCP, VLANs). Automated installation and configuration with Ansible, Puppet and Foreman (Red Hat Satellite).
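As one concrete example of that kind of automation, the sketch below lists the services of an Ambari-managed cluster via the Ambari REST API. Host name, credentials and cluster name are assumptions for illustration, not values from the project.

# Minimal sketch: list the services registered in an Ambari-managed cluster.
# Host, credentials and cluster name are placeholders.
import requests

AMBARI = "https://ambari.example.com:8443/api/v1"
AUTH = ("admin", "changeme")            # in practice backed by AD/FreeIPA accounts
HEADERS = {"X-Requested-By": "ambari"}

def list_services(cluster):
    """Return the service names registered in the given cluster."""
    resp = requests.get(
        "%s/clusters/%s/services" % (AMBARI, cluster),
        auth=AUTH, headers=HEADERS, verify=False, timeout=30,
    )
    resp.raise_for_status()
    return [item["ServiceInfo"]["service_name"] for item in resp.json()["items"]]

if __name__ == "__main__":
    print(list_services("analytics"))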

Skills applied

Apache Hadoop, Red Hat Enterprise Linux (RHEL), Cloud Computing

Certificates

Red Hat Certified Specialist in Ansible Automation
January 2018

Red Hat Certified Engineer, Red Hat Enterprise Linux 7
December 2017

Red Hat Certified Specialist in Deployment and Systems Management, Red Hat Satellite 6.2
November 2017

Red Hat Certified Specialist in OpenShift Administration, Red Hat OpenShift Container Platform 3.5
October 2017

Red Hat Certified Specialist in Containerized Application Development, Red Hat Enterprise Linux Atomic Host 7
September 2017

Scrum Master PSM I
May 2017

Scrum Team Member Accredited Certification
January 2017

RHCSA, RHEL 7
July 2016

Certified Developer on Apache Spark
December 2015

Red Hat Certified Engineer
October 2008

Education

2002 – 2007
(Education)
Location: Saint Petersburg

Qualifications

Big data with Hadoop + Java + dc.js (d3.js + crossfilter or rCharts) + R (charts, statistics) or SAS
Cluster computing: YARN (Hadoop), Apache Spark.
Stream computing: Apache Storm, Apache Spark, MongoDB (collections).
Data science: prediction, statistics, process mining, data mining.
Web servers: Apache, nginx – configuration and patching to add new features.
Databases: MySQL, PostgreSQL – HA, horizontal and vertical scalability.
NoSQL databases: HBase, CouchDB, MongoDB, Cassandra (CQL3), Elasticsearch, Neo4j
Cluster systems and failover (Red Hat Cluster Suite, Rocks Cluster)
Queue systems: AMQP (RabbitMQ, Qpid), HornetQ, Kafka
Virtualization (libvirt, Xen, KVM, VMware, Vagrant, OpenStack)
Automated deployment using Chef and Puppet (know both very well, but prefer Chef).
VoIP (Asterisk, OpenSER)
Monitoring with SNMP, OpenNMS, Nagios, Zabbix, Cacti, collectd, RRD
ELK stack: contributing and customizing.
Standard network services (FTP, DNS, Samba, DHCP, proxy, firewalls, NFS)
IDS systems (Snort)
Expert in SVN and Git
Scripting (Ruby, Lua, Bash, AWK, Python)
Statistics: R, SAS, Python
Visualisation: JS (d3.js, crossfilter, dc.js, NVD3), Octave, R (rCharts, graphics), Blender (via the Python API)
Experience with C, Go, Erlang, Java

About me

10.2015 – present: Mentor for the “Data Engineer” course – training people to use big data tools on real-world cases.
Created cases for Spark, Storm, GraphX + Neo4j.
Graduates are successfully working in big data fields.

03–04.2017 Consulting.
1. Performed a big data infrastructure audit. Created a roadmap to improve security, stability and automation. Hortonworks Data Platform, FreeIPA (LDAP, Kerberos), Oracle Linux, NumPy, containers and virtual networking.
2. Networking. Kubernetes, IPsec, AWS KMS, Weave, k8s/kops.

06.2016 – 03.2017 Strato AG. Software engineer.
Designed, implemented and integrated a cloud infrastructure based on FreeIPA (LDAP, Kerberos, PKI), The Foreman + Katello, and OpenStack.
Created a CI pipeline for managing the cloud, with integration tests for Salt formulas, Puppet modules and Ansible playbooks. Built a build system for RPM packages and an infrastructure for RPM repositories and Docker images (Crane + Pulp + Foreman). Created a high-availability infrastructure with automated OpenStack deployment.
Distributed storage: GlusterFS and Ceph. Integrated a central authentication, authorization and certificate infrastructure in HA mode (FreeIPA: Dogtag, LDAP, Kerberos, SELinux, Vault, DNSSEC). Used the OpenSCAP security scanner to meet the PCI DSS standard.
Provided internal trainings for colleagues.
Wrote patches for Foreman, InSpec, Flask, Hammer, Katello, Pulp, Puppet and OpenStack.

08.2015 – 03.2016 ExacTag – Duisburg (contractor)
Cloudera Hadoop, Kafka, Spark (GraphX, Streaming)
Created a real-time application for advertisement metrics (a minimal streaming sketch follows below).
Monitored (Zabbix, JMX) and tuned the Spark application.
Patched Spark Streaming and RDD to simplify data aggregation.
Cassandra as raw data storage.
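A minimal sketch of that streaming pattern, using the Spark 1.x-era pyspark.streaming.kafka direct-stream API current at the time; the topic, broker list and JSON field name are assumptions, not project code.

# Minimal Spark Streaming sketch: count ad events per campaign in 60-second
# batches from a Kafka topic. Topic, brokers and field names are placeholders.
import json

from pyspark import SparkContext
from pyspark.streaming import StreamingContext
from pyspark.streaming.kafka import KafkaUtils

sc = SparkContext(appName="ad-metrics-sketch")
ssc = StreamingContext(sc, batchDuration=60)

stream = KafkaUtils.createDirectStream(
    ssc, ["ad-events"], {"metadata.broker.list": "kafka1:9092,kafka2:9092"}
)

counts = (
    stream.map(lambda kv: json.loads(kv[1]))            # message value is a JSON event
          .map(lambda event: (event["campaign_id"], 1))
          .reduceByKey(lambda a, b: a + b)
)
counts.pprint()                                          # the real app wrote to Cassandra

ssc.start()
ssc.awaitTermination()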

05.2015 – 08.2015 ING-DiBa Bank – Frankfurt am Main (contractor)
Hortonworks Hadoop, Ranger, FreeIPA, Ambari, Spark
HDFS, Hive, Spark, YARN, HBase, Kerberos
RStudio, R, local CRAN.
Docker and docker-registry for local distribution of Docker images.
Authorisation and authentication in Hadoop with Kerberos and LDAP.
Created a big data platform for data scientists (see the sketch below).
Implemented fraud protection algorithms.
Mortgage analysis.
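A minimal sketch of how data scientists would consume the platform from PySpark (Spark 1.x HiveContext API); the table and column names are invented for illustration.

# Minimal sketch: query a Hive table from PySpark on the kerberized cluster.
# Table and column names are illustrative only.
from pyspark import SparkContext
from pyspark.sql import HiveContext

sc = SparkContext(appName="platform-sketch")
hive = HiveContext(sc)

# Access to the metastore tables is governed by Ranger policies and
# Kerberos authentication on the cluster side.
df = hive.sql("SELECT customer_id, amount, ts FROM transactions WHERE ds = '2015-07-01'")
df.groupBy("customer_id").count().show(20)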

01.2015 – 05.2015 Fujitsu – Munich (contractor)
Development of FUJITSU Software ServerView Cloud Load Control.
OpenStack (Neutron, Nova, Cinder, Heat)
OpenShift, Project Atomic (multi-node cluster), Kubernetes (multi-node cluster inside Atomic), Docker
Web management tool based on AngularJS + d3.js + Java + Gradle, JAX-RS

05.2013 – 04.2015 Citozin – Berlin (Internet of Things)
Founder, backend engineer
Collected and analyzed data (Fluentd, Cassandra, PostgreSQL, Hadoop, R, Sqoop, d3.js).
OpenStack + Hadoop (Hortonworks) + CentOS
Multiple write points to Cassandra.
Kafka as message broker.
Car error prediction based on collected data and a logit model (a minimal sketch follows below).
DWH design and ETL process implementation.
Hardware device creation – Bluetooth Low Energy, OBD-II protocols (mainly CAN).
Created the product (docs, ads, team management).
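A minimal sketch of the logit-model idea with scikit-learn; the feature names and CSV layout are assumptions for illustration, not the production pipeline.

# Minimal sketch: predict car errors with a logit (logistic regression) model.
# Feature names and the CSV layout are placeholders.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

data = pd.read_csv("obd_features.csv")        # e.g. features aggregated per trip
features = ["engine_temp_mean", "rpm_mean", "dtc_count", "mileage"]

X_train, X_test, y_train, y_test = train_test_split(
    data[features], data["failure_within_30d"], test_size=0.2, random_state=42
)

model = LogisticRegression()
model.fit(X_train, y_train)

print("accuracy:", model.score(X_test, y_test))
print("failure probability (first rows):", model.predict_proba(X_test)[:5, 1])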

02.2014 – 12.2014 Nokia HERE – Frankfurt am Main (contractor)
Ruby, Python, Rails, PostgreSQL (JSON, hstore), AWS, d3.js, crossfilter.js, dc.js
Data visualization with charts and heatmaps; aggregations by geographic and role criteria (see the sketch below).
Log analysis with Splunk and Logstash.
Elasticsearch cluster (>10 TB): rebalancing, scale-up, failover.
Created custom Logstash filters.
Hadoop + Elasticsearch for aggregation.
Go for writing a REST API.
R + Shiny + Splunk API for drawing nice and fast graphics.
CloudFormation templates for an auto-deployed and scaled Elasticsearch cluster.
Developed in Java and JRuby – improved and sped up Logstash for S3 (open-sourced on GitHub).
Wrote Puppet modules.
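A minimal sketch of the heatmap-style geographic aggregation with the official Python Elasticsearch client; the index name, geo field and precision are assumptions.

# Minimal sketch: bucket documents into geohash cells for a heatmap.
# Index name, field name and precision are placeholders.
from elasticsearch import Elasticsearch

es = Elasticsearch(["http://localhost:9200"])

query = {
    "size": 0,
    "aggs": {
        "heatmap": {
            "geohash_grid": {"field": "location", "precision": 5}
        }
    },
}

result = es.search(index="events-*", body=query)
for bucket in result["aggregations"]["heatmap"]["buckets"]:
    print(bucket["key"], bucket["doc_count"])   # geohash cell -> document count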

01.2013 – 08.2013 DevOps at Zimory GmbH
1. Implemented product distribution via RPM (Maven for building RPMs).
2. Automated the installation process (Puppet + Foreman + RPM).
3. Implemented a central log system (Puppet + Logstash + rsyslog).
4. Testing and bug fixing (Vagrant + vagrant-libvirt + Python + Ruby + Bash).
5. LDAP modifications.

Personal data

Languages
  • English (fluent)
  • Russian (native)
  • German (good)
Willingness to travel
Europe
Work permit
  • European Union
Profile views
3396
Age
34
Professional experience
14 years and 10 months (since 08/2004)
Project management
11 years
