Senior Data Engineer

On-site
  • Tehran, Tehrān, Iran, Islamic Republic of
Tech

Job description

Snapp is the pioneering provider of ride-hailing mobile solutions in Iran, connecting smartphone owners in need of a ride with Snapp drivers who use their private cars to offer transportation services. We are ambitious, passionate, engaged, and excited about pushing the boundaries of the transportation industry to new frontiers and being the first choice of every user in Iran.

About the Role
The role contributes to a variety of exciting projects, ranging from designing robust, automated data pipelines and storage processes to building tools that improve company-wide productivity with data. It centers on designing, implementing, and operating stable, scalable, and efficient solutions that flow data from different sources into the data lake and other databases. You will work with stakeholders to bring data into a standard, queryable format and empower the company to make data-driven decisions. Our team enables nearly all of Snapp Cab/Box to make data-driven decisions and create impact throughout the company.

Responsibilities
  • Design, implement, and maintain scalable real-time and batch data pipelines handling billions of records
  • Maintain real-time analytics systems and big data systems, ensuring their reliability and maintenance
  • Set up real-time analytics solutions depending on the service
  • Propose new data architectures for new requirements and fine-tune existing ones
  • Work with the Business Intelligence team, Ventures, the Data Science team, and other teams to meet their requirements
  • Monitor data services and resolve issues in case of incidents
  • Write highly efficient data pipelines and troubleshoot them

Job requirements

Requirements
  • BS/MS or more in computer engineering/science or related experience
  • Hands on experience in Linux, Virtualization, Docker, and Kubernetes
  • Specialized in Hadoop ecosystem (HDFS, Yarn, Hive, Spark)
  • Hands-on experience with Kafka/Zookeeper
  • Experienced in Agile / Scrum / DevOps projects
  • At least 2 years of programming experience in Python, Java, Scala, or Go
  • Experience working with one or more of these: Airflow, Debezium, Confluent Schema Registry
  • Familiar with monitoring systems (Grafana, Prometheus, Exporters)
  • Experience working with Logstash, ClickHouse, and MySQL
  • SQL knowledge and experience with database systems such as ClickHouse, CockroachDB, RocksDB, PostgreSQL, and MySQL
  • Experience with streaming technologies such as Spark, Apache Flink, and NiFi
  • Good communication and teamwork skills
  • Experience with data exploration and visualization tools such as Hue and Superset