Title: Data Engineer
Company Name: SSL Wireless
Vacancy: --
Age: N/A
Job Location: Dhaka
Salary: Negotiable
Technical Expertise:
Data Warehousing: Concepts, architecture, OLAP, and operational knowledge
Data Modelling: Star & Snowflake schema design; dimensional modelling using dbt
Pipeline Development: End-to-end batch and streaming pipeline construction
Programming Languages: Python (primary), SQL (advanced)
ETL & Data Frameworks: Apache Spark, Kafka, Flink, Airflow, dbt
Performance Optimization: Large dataset handling, query tuning, and pipeline profiling
Preferred (Nice-to-Have) Skills:
Data Warehouse Engines: Experience with Doris, ClickHouse, and Druid
Data Lake Technologies: Familiarity with Apache Iceberg and Hudi for open table format management
Change Data Capture (CDC): Hands-on experience with Debezium and MySQL binlog-based CDC patterns
Data Quality & Validation: Knowledge of frameworks such as Great Expectations, Soda, or similar tools
Data Security & Governance: Understanding of tokenization, PII masking, and compliance standards (e.g., DPO 2025)
Containerization & Deployment: Experience with Docker and Kubernetes for pipeline deployment
Education:
Bachelor's degree — Computer Science, Engineering, or related field
Experience:
2–3 years in Data Engineering or closely related roles
The Data Engineer owns the end-to-end data pipeline from raw ingestion to the warehouse. This role is the backbone of SSL Wireless's data platform — ensuring that data flows reliably, is modelled correctly, and meets the quality standards required by analysts and ML teams.
Key Responsibilities:
• Design, build, and maintain end-to-end data pipelines for ingestion, processing, and storage
• Develop and optimize ETL/ELT workflows for structured and unstructured data sources
• Implement efficient data models and schema designs for data warehousing (Star / Snowflake)
• Work with large-scale datasets to ensure data quality, reliability, and performance
• Collaborate with data analysts, AI/ML teams, and business stakeholders across functions
• Monitor and troubleshoot data pipeline issues; optimize latency and throughput
• Enforce data governance, consistency, and security best practices across all pipelines
• Contribute to the design of scalable big data architectures on on-premises infrastructure
Compensation & Benefits:
• Attractive salary as per industry best practice
• Weekly medical consultancy
• Annual leave encashment
• Congenial & friendly working environment