Ready to take ownership of the ingestion layer of a modern data platform and drive data flows across multiple markets?
We’re looking for a Data Ingestion Engineer to design and scale streaming and batch pipelines, integrate complex source systems, and ensure reliable, high-quality data delivery across Croatia, Slovenia, and Bulgaria.
What you’ll do:
- Design, build, and maintain integrations for structured, semi-structured, and event-based data
- Develop and operate Kafka-based streaming pipelines
- Build and maintain batch ingestion workflows (Logstash, object storage, Parquet, Python)
- Implement and support MySQL replication and other extraction patterns
- Ensure stable, scalable, and observable ingestion pipelines
- Define and maintain data contracts (schemas, freshness, SLAs)
- Implement data quality checks (schema validation, completeness, anomaly detection)
- Monitor pipeline performance (throughput, latency, failures)
- Work with cloud-based ingestion tooling and storage
- Document pipelines, integrations, and incident procedures
- Collaborate closely with platform, analytics, and engineering teams
What you bring:
- Hands-on experience with data ingestion pipelines (streaming & batch)
- Strong knowledge of Kafka (producers, consumers, partitions, offsets)
- Experience with Logstash, object storage, and file formats (Parquet, JSON, CSV)
- Strong Python and SQL skills
- Experience with orchestration tools like Airflow
- Understanding of relational databases (e.g. MySQL) and analytical stores (e.g. ClickHouse)
- Knowledge of ingestion patterns (CDC, incremental loads, schema evolution)
- Experience with monitoring, alerting, and SLA tracking
- Familiarity with Git and CI/CD workflows
Nice to have:
- Experience with GCP data ecosystem
- Exposure to Cloud Composer and managed ingestion tools
- Experience with ClickHouse or similar analytical databases
- Knowledge of data contracts and platform-oriented ingestion design
What we offer:
- Flexible working (hybrid/remote)
- Learning & development opportunities
- Supportive and collaborative team culture
- Competitive compensation & benefits
- Modern tech stack & impactful projects
If you’re experienced in building robust data pipelines and want to shape how data flows across markets, we’d love to hear from you.
Apply by sending your CV to human.resources@rtl.hr (subject: Data Ingestion Engineer).