Data Engineer with DevOps experience
PeakData
📍 Hybrid | Wrocław | Start date: January 2026
💼 B2B: 30–35 €/h | Long-term collaboration
About PeakData
At PeakData, we don’t just analyze numbers – we shape the future of healthcare.
We are a Swiss technology company with 7 years on the market. Half of the Top 20 global pharma companies are our clients, and with a new global player joining us, we are expanding our team.
PeakData is a stable, financially solid company, and we’re looking for someone who wants to join us for the long term. This is not a short project – we want you to grow with us in a supportive, international environment.
Role Overview
We’re looking for a Data Engineer with DevOps experience to help us build and scale our internal data platform supporting med-pharma projects.
This role bridges two worlds – data engineering (pipeline design, integration, optimization) and DevOps practices (CI/CD, monitoring, infrastructure automation).
You’ll work in an end-to-end full-stack data team responsible for development, testing, documentation, and user guides for our internal data systems.
Key Responsibilities
- Design, build, and maintain data pipelines (ETL/ELT) in AWS
- Develop and automate data integration and transformation processes in Python
- Apply DevOps principles – CI/CD, observability and infrastructure automation
- Manage and optimize cloud resources for performance, reliability and cost-efficiency
- Contribute to infrastructure as code (Terraform) and containerized deployments (Docker/Kubernetes)
- Collaborate with cross-functional teams on architecture, testing and documentation
We’re Looking For
We’re looking for a hands-on Data Engineer with DevOps experience who combines strong technical skills with a passion for automation and cloud technologies.
You’ll work with modern data tools, AWS infrastructure and DevOps practices in a production environment.
Someone who has:
- Curiosity and willingness to experiment with new tools and technologies
- Growth mindset, especially in areas like automation, CI/CD and cloud infrastructure
- Analytical thinking – you like to understand how systems behave and why certain solutions work better than others
- Ownership and independence – you can organize your work, make technical decisions and deliver results without constant supervision
- Team player attitude – you collaborate easily, share knowledge and support others in finding the best solutions
- Clarity in communication – you can explain technical concepts and architectural choices in a clear, structured, and simple way
Must-have
- Strong Python skills (FastAPI, Flask, asyncio)
- Minimum 3 years of experience with AWS, including S3, Lambda, IAM, monitoring and Bedrock
- Proven experience in building and maintaining ETL/ELT pipelines
- Practical knowledge of Docker and Kubernetes for deployment and scaling
- Experience with Terraform (IaC) and infrastructure automation
- Ability to write automated tests and maintain clear technical documentation
Nice to have
- Experience with CI/CD pipelines (GitHub Actions, GitLab)
- Experience using LLM-based tools (e.g. GitHub Copilot, Cursor) to support coding, automation or data workflows
- Familiarity with GCP (BigQuery, Vertex AI, Gemini – with a focus on data processing)
- Knowledge of cloud cost optimization, monitoring, and alerting tools (CloudWatch, Prometheus, Grafana)
- Experience in automating data analysis workflows or building ML-assisted processes
What You’ll Get
- Attractive pay model – B2B (30–35 €/h); you issue a net invoice, and VAT is handled on the Swiss side
- Start date: January 2026
- Stability meets innovation – yes, we have startup energy, but also the trust of the industry: half of the world’s Top 20 pharma companies already work with us
- A team that values YOU – inclusive, open-minded, and offering space for your ideas and creativity
- Balance of stability and energy – clear processes, global clients, and the freedom to experiment with new technologies
- Visible impact – your work will directly support decisions in the pharmaceutical sector and help improve patient outcomes worldwide
- Hybrid model – work mostly remotely, with one day per week in our Wrocław office for collaboration and team sessions
- Continuous learning – we actively support your growth in LLMs, AI and automation, with access to the latest tools and frameworks
- Modern tech environment – the newest generation of LLM tools at your disposal
- Ownership and autonomy – we trust your expertise and give you the space to design, build and improve
- Work-life balance that actually works – flexible core hours (9:00–15:00) and a results-driven culture
✨ Join PeakData and help us shape the data backbone of global healthcare.