Senior Data Engineer
Waseela

Waseela is a catalyst for change, empowering rural communities to drive Pakistan's economy forward. Through data-driven, research-backed projects, we address critical challenges in our rural economy and unlock opportunities for sustainable growth.
Our focus is on creating a lasting impact at the grassroots level, building resilient communities and fostering a prosperous Pakistan. We achieve this through our dynamic brands:
- Kisaan
- The Market
- Waseela Financial Services
Work with us to enable the rural communities of Pakistan!
Role Summary:
We are seeking a highly skilled Senior Data Engineer to design, build, and optimize scalable data pipelines using AWS. The ideal candidate will have extensive experience in batch and real-time data ingestion, orchestration using tools like Apache Airflow or Dagster, and implementing fault-tolerant alerting and logging mechanisms. This role is critical in ensuring the reliability and efficiency of our data infrastructure, enabling seamless data flow across our systems.
Responsibilities:
- Design, develop, and maintain scalable and robust data pipelines for batch and real-time data processing.
- Work with cloud-based platforms (AWS or GCP) to build efficient and cost-effective data infrastructure.
- Implement and manage orchestration workflows using Apache Airflow, Dagster, or similar tools.
- Ensure data quality, integrity, and security across all pipelines.
- Set up logging, monitoring, and alerting mechanisms to proactively identify and resolve issues.
- Optimize data storage solutions, ensuring performance and scalability.
- Collaborate with cross-functional teams including data scientists, analysts, and software engineers to support data-driven decision-making.
- Automate and streamline data engineering processes to improve efficiency and reliability.
- Lead the migration of existing data workloads to cloud platforms, ensuring minimal disruption and optimal performance.
- Mentor junior engineers and promote best practices in data engineering across the organization.
Must-Have Qualifications:
- Strong proficiency in AWS (Glue, Lambda, S3, Redshift, Kinesis) or GCP (BigQuery, Dataflow, Pub/Sub, Cloud Functions).
- Hands-on experience with batch and real-time data processing frameworks such as Apache Spark, Flink, or Kafka.
- Expertise in workflow orchestration tools like Apache Airflow, Dagster, or similar.
- Proficiency in programming languages such as Python, Scala, or Java.
- Strong understanding of observability principles, including setting up monitoring, logging, and alerting using tools like Prometheus, CloudWatch, Datadog, or Splunk.
- Excellent problem-solving skills and ability to work in a fast-paced, agile environment.
- Strong communication skills and ability to collaborate effectively with both technical and non-technical stakeholders.
Experience:
- 5+ years of experience in data engineering, specializing in scalable data pipeline development.
- Experience with SQL and NoSQL databases, data modeling, and ETL frameworks.
- Experience with containerization and orchestration (Docker, Kubernetes).
How to apply
To apply for this job, you need to sign in on our website. If you don't have an account yet, please register.