Title: ETL Developer
Company Name: Enterprise System Management Solutions
Vacancy: 2
Job Location: Anywhere in Bangladesh
Employment Status: Full-time
Educational Requirements:
∎ Bachelor of Science (BSc) in CSE
Experience Requirements:
∎ 1 to 2 year(s)
∎ The applicants should have experience in the following area(s):
Data Warehousing
∎ The applicants should have experience in the following business area(s):
Telecommunication
Job Responsibilities:
∎ To work with the project team on project implementation to meet project timelines and objectives
∎ To understand customer requirements and develop ETL and other technical artifacts based on assigned tasks
∎ To practice standard project management using Jira, Confluence, and Git
∎ To provide documentation for developed artifacts in the form of in-line comments or as part of the overall technical documentation (e.g. a technical manual) as required
∎ To provide technical skills support to project team members as required
Additional Requirements:
∎ Both males and females are allowed to apply
∎ Minimum 2 years of hands-on experience with PL/SQL and Linux
∎ Minimum 2 years of experience working with ETL tools such as Talend, Informatica, or NiFi
∎ Strong preference for candidates with an understanding of data modelling and a sense of data quality and data transformation
∎ Databases (Oracle, Hive)
∎ Knowledge of Java or Python will be an added advantage
Salary: Negotiable
Compensation & Other Benefits:
∎ Mobile bill
∎ Festival Bonus: 2
Application Deadline: 7 Aug 2020
Company Information:
∎ 9 Jul 2020
∎ Enterprise System Management Solutions
∎ Address: Shah Ali Tower, 33 Kawran Bazar, Dhaka-1215
∎ Business: Big data, Data warehouse, Data Lake, Service integration, Real-time feed processing, Business Intelligence.
Category: IT/Telecommunication
Read Before Apply: We are looking for an ETL developer to work on a telecom data warehouse project. The ETL developer is responsible for the design, development, and implementation of data integration solutions, and for importing, cleaning, transforming, validating, and analyzing data in order to understand it and draw conclusions from it for data modeling, data integration, and decision-making purposes. We have a hybrid data warehouse and data lake built on Oracle, Hive, Kafka, and NiFi, so there is a great opportunity to learn big data.