Design, develop, and deploy data integration processes that extract, transform, and load data efficiently from various sources into the big data pipeline, driving complex projects to successful completion.
Customize and manage integration tools, databases, data warehouses, and analytical systems.
Ensure data quality by measuring accuracy, completeness, and consistency throughout the data preparation process.
Collaborate with the data team and business stakeholders to understand data requirements and deliver solutions that meet business objectives and timelines.
Advise and mentor junior data engineers in their day-to-day tasks and project assignments.
Provide documentation as needed for project requirements.
Qualifications
Bachelor's or Master's degree in Computer Science, Software Engineering, Information Technology, or a related field.
3+ years of experience in big data technologies and their ecosystems.
Proficiency in programming languages such as Python and SQL.
Proficiency in ETL processes, data modeling concepts, and database management systems.
In-depth knowledge of data warehouse concepts and experience with cloud platforms such as AWS, Azure, or GCP.
Enthusiasm for leveraging data engineering technologies to solve business problems.
Ability to explain complex concepts in simple terms.
Demonstrated leadership and project management skills.
Ability to work collaboratively in a team environment, sharing deep knowledge and actively contributing to projects.