Andela · Worldwide · over 2 years ago
software development

As the world’s leading marketplace for technologists, Andela is in the business of changing lives, matching the brightest talent to roles at innovative technology companies across the globe. By joining Andela, you can experience the joy of long-term work with vetted companies and competitive compensation, ensuring you grow your career while becoming part of a vibrant community. Andela’s Talent Network connects you to a diverse global network of like-minded technologists, so you can develop your expertise and access exciting employment opportunities. We are always looking for Senior Data Engineers to join Andela and gain access to high-quality, long-term remote roles.

< class="h3" dir="ltr" style="line-height:1.38;margin-top:16pt;margin-bottom:4pt;">Must-haves:
  • 5+ years hands-on experience using data technologies

  • Strong analytical skills for working with both structured and unstructured datasets

  • Experience building ETL/ELT data pipelines in Big Data deployments and the processes supporting them: data structures, metadata, dependency, and workload management (see the pipeline sketch after this list)

  • Experience writing complex queries and stored procedures in SQL

  • Strong modeling/data architecture experience along with relational database analysis/optimization

  • Coding experience with Python/Java/Scala/Golang

  • Experience deploying data pipelines in a cloud environment

  • Knowledge of and experience with modern Git workflows (pull requests, CI, code reviews)

  • Knowledge of and experience with software development methodologies such as Scrum

  • Excellent verbal and written communication skills and the ability to work with others at all levels

  • Toolset:

    • Analytics: Spark, Databricks

    • Data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.

    • Stream-processing systems: Kafka, Storm, Spark-Streaming, etc.

    • Distributed data storage file systems/solutions (HDFS, S3)

    • AWS (or equivalent GCP/Azure) cloud services related to data storage and processing: EC2, EMR, RDS, Redshift
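
To give a concrete flavor of the pipeline and workflow-management experience listed above, here is a minimal sketch of a daily ETL workflow in Apache Airflow, one of the tools named in the toolset. The DAG name, task names, and task bodies are hypothetical placeholders, not part of the role description.

```python
# Minimal sketch of a daily ETL workflow in Apache Airflow (2.x).
# All DAG/task names and task bodies are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Pull raw records from a source system (stubbed out here).
    pass


def transform():
    # Clean and reshape the extracted records (stubbed out here).
    pass


def load():
    # Write the transformed records to the warehouse (stubbed out here).
    pass


with DAG(
    dag_id="example_daily_etl",      # hypothetical DAG name
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Declare dependencies: extract, then transform, then load.
    extract_task >> transform_task >> load_task
```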

< class="h3" dir="ltr" style="line-height:1.716;margin-top:10pt;margin-bottom:10pt;">Nice to have:< class="h3" dir="ltr" style="line-height:1.716;margin-top:10pt;margin-bottom:10pt;">

  • Working experience in data analytics: data wrangling, integration, analysis, visualization, data modeling, and reporting using BI (Business Intelligence) tools

  • Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores (see the consumer sketch after this list)

  • Experience with architecture of data warehouses and data lakes

  • Design and deployment of Data Science and ML models is a big plus

  • Experience with replication, administration and performance tuning of databases
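
As a rough illustration of the message-queuing and stream-processing knowledge mentioned above, here is a minimal consumer sketch using the kafka-python client. The topic name, broker address, and message fields are hypothetical, and the per-message handling is only a stand-in for real processing logic.

```python
# Minimal sketch of consuming and processing a Kafka topic with the
# kafka-python client. Topic, broker, and field names are hypothetical.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "events",                            # hypothetical topic name
    bootstrap_servers="localhost:9092",  # hypothetical broker address
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="earliest",        # start from the oldest records
)

# Consume records indefinitely, applying a trivial per-message step.
for message in consumer:
    event = message.value
    print(event.get("user_id"), event.get("action"))
```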

< class="h3" dir="ltr" style="line-height:1.716;margin-top:10pt;margin-bottom:10pt;">