- Kathmandu, Nepal
- Full Time
- Asia Technology
Qualification & Experience:
- 1–3 years of overall industry experience; working experience in a relevant field is preferred.
- Bachelor’s degree in Computer Science or equivalent.
- Experience in a programming language such as Java, Python, or Scala.
- Experience building ETL pipelines, with familiarity in extraction, transformation, loading, filtering, cleaning, joining, scheduling, monitoring, and data streaming.
- Experience with data processing tools (e.g., Spark, Hadoop).
- Experience with AWS/GCP services (e.g., EMR, Redshift, Google Data Studio, BigQuery).
- Familiarity with data warehousing tools and processes (e.g., Snowflake, Redshift, S3, BigQuery).
- Experience setting up ingestion pipelines (e.g., Apache Kafka, Amazon Kinesis).
- Familiarity with analytics and visualization tools is preferred.
- Certifications in big data tools are a plus.
- Experience with relational SQL and NoSQL databases.
- Familiarity with project management processes (e.g., Scrum, Kanban) and tools (e.g., Jira, Asana).
- Ability to work independently or in a collaborative environment with a proactive attitude.
Apply for this position