Requirements
Bachelor’s degree in Computer Science, Computer Engineering, Mathematics, or equivalent professional experience.
Industry experience in a Data Engineering or Data Architecture role.
Experience working with Spark and other distributed data technologies (e.g., Hadoop, Presto, Druid) to build efficient, large-scale data pipelines.
Experience designing streaming data pipelines (e.g., Kafka, Flink, Spark Streaming).
Nice to Haves
Growth mindset and ability to learn new technologies
Experience in backend API development and management
Experience with a graph database (e.g., Neo4j, TigerGraph)
Familiarity with Data Lake and Data Warehouse technologies
Good understanding of the software development life cycle, version control, code reviews, testing, and data quality tools and frameworks
What You'll Be Doing
Design and develop data pipelines for the Knowledge Graph
Work with a Dublin-based team to design the data layer
Have significant influence over the direction of the service
Interact with global teams with unique skill sets
Perks and Benefits
Opportunity to learn and grow professionally
Engage with diverse teams operating in different time zones