Responsibilities
- Data Pipeline Development: Design, implement, and maintain data analytics pipelines and processing systems.
- Data Modeling: Apply data modeling techniques and integration patterns to ensure data consistency and reliability.
- Data Transformation: Write data transformation jobs in code to optimize data processing.
- Data Management: Perform data management through data quality tests, monitoring, cataloging, and governance.
- LLM Integration: Design and integrate LLMs into existing applications, ensuring smooth functionality and performance.
- Model Development and Fine-Tuning: Develop and fine-tune LLMs to meet specific business needs, optimizing for accuracy and efficiency.
- Performance Optimization: Continuously optimize LLM performance for speed, scalability, and reliability.
- Infrastructure Knowledge: Possess knowledge of the data and AI infrastructure ecosystem.
- Collaboration: Collaborate with cross-functional teams to identify opportunities to leverage data to drive business outcomes.
- Continuous Learning: Demonstrate a willingness to learn and find solutions to complex problems.
Qualifications
- Education: Bachelor's or Master's degree in Computer Science, AI, Engineering, or a related field.
- Experience: At least 3 years of experience in data engineering.
- Technical Skills: Proficiency in Python, SQL, and Java; experience with LLM frameworks (e.g., LangChain); and familiarity with visualization tools such as Power BI, Tableau, Looker, or Qlik.
- Cloud Computing: Familiarity with cloud computing platforms, such as GCP, AWS, or Databricks.
- Problem-Solving: Strong problem-solving skills with the ability to work independently and collaboratively.
Location: BTS Ekkamai
Working Days: Mon-Fri (WFA every Friday)