Data Engineer
- Location: Berlin
- Salary: €80 - €100 per hour
- Contact: Sean Bowley
- Contact email: s.bowley@ioassociates.eu
- Job ref: BBBH148473_1728661051
- Consultant: Sean Bowley
We are seeking a Data Engineer who can help us on a long-term contract/B2B assignment working for a large energy company based in Germany.
You can be located anywhere in Europe and must have fluent English skills. You will focus on developing capabilities to enable Data Mesh and on offering expertise in leveraging contemporary technologies.
This involves advising on the use of services available on public clouds or developing in-house services on the client's internal Kubernetes-based platform.
- Remote, with a 3-day block onsite per month in Berlin
- Rate: open
- Availability: start in October, no later than November
- Duration: 3-month rolling contract, expected to run 12-18 months
Duties:
- You will be responsible for the development and management of data products, ensuring robust data governance, and establishing comprehensive data cataloging processes.
- This includes the creation, maintenance, and enhancement of cross-cloud managed data services, which are essential for enabling seamless data accessibility and interoperability across different cloud environments.
- The Data Engineer will also work closely with various teams to implement and enforce data governance policies, ensuring data quality, security, and compliance.
Experience:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- 5+ years of general IT experience
- 3+ years of Big Data experience
- Proven experience as a Data Engineer with a focus on designing and implementing scalable data architectures, using Python, Java, or Scala
- Extensive experience in developing and maintaining databases, data lakes, and data warehouses.
- Hands-on experience with ETL processes and data integration from various sources.
- Familiarity with modern data technologies and cloud services.
- Proficient in designing and implementing data models to meet business requirements.
- Experience with Data Mesh
- A strong DevOps focus
Knowledge and experience with at least some of the following data technologies/frameworks:
- RDBMS (PostgreSQL, MySQL, etc.)
- NoSQL storage (MongoDB, Cassandra, Neo4j, etc.)
- Time-series databases (InfluxDB, OpenTSDB, TimescaleDB, Prometheus, etc.)
- Workflow orchestration (Airflow, Oozie, etc.)
- Data integration/ingestion (Flume, etc.)
- Messaging/data streaming (Kafka, RabbitMQ, etc.)
- Data processing (Spark, Flink, etc.), and/or their cloud-provided counterparts, i.e., cloud data/analytics services on GCP, Azure, or AWS