Passion for computer science and for developing software and internet applications
Strong algorithmic and logical skills, excellent problem-solving ability, and the tenacity to investigate issues down to their root cause
Proven knowledge of Java technologies/frameworks
Experience in designing and developing scalable and distributed applications
Experience with observability, i.e. monitoring, logging, and tracing
Proficient in standard methodologies for writing code that is maintainable and secure
Ability to research and become proficient in new technologies
Committed to the highest levels of quality, demonstrating accuracy and thoroughness in both testing and code development
Nice to Have
Proficiency in large-scale data technologies such as Hadoop, HBase, MapReduce, Iceberg, and Kafka, and in the Typelevel ecosystem; operational experience with these would be advantageous
Scala and Spark knowledge
Experience working with infrastructure (cloud providers, containers, and orchestrators) is a plus
Experience with monitoring solutions such as New Relic, Datadog, Runscope, Prometheus, and Grafana
What You'll Be Doing
Help shape the architecture and bring new features to life through hands-on design and development. Collaborate with cross-functional teams to deliver high-quality solutions.
Architect, develop, and optimize data pipelines using Scala/Java, Spark, and Databricks.
Ensure high availability and reliability of data pipelines and manage multi-cloud infrastructure.
Improve your technical skills, research technologies useful to the project, and participate in the engineering on-call rotation.