Responsibilities:
- Design and implement robust data analytics frameworks.
- Upgrade existing data infrastructure for scalability and agility.
- Ensure data integrity, security, and compliance with industry standards.
- Architect scalable service platforms to enhance user experience.
- Lead cloud solution integrations to improve service delivery and efficiency.
- Collaborate with cross-functional teams to define and implement service APIs.
- Align IT architecture with business goals for financial sustainability.
- Identify and mitigate risks in IT infrastructure and data management.
- Drive innovation by exploring emerging technologies and industry trends.
Requirements:
- Bachelor’s degree in Computer Science, Data Science, IT, Electronic Engineering, or a related field.
- 3 to 5 years of experience in data warehouse, data lake, or big data solutions.
- Hands-on experience designing enterprise data platforms with big data, Gen AI, Agentic AI, and cloud (Azure/AWS preferred).
- Strong knowledge of project management methodologies and tools (Scrum, Jira).
- Familiarity with big data technologies and open-source packages (Kafka, Confluence, Hadoop, Spark).
- Experience analyzing RPA requirements and translating them into technical solutions.
- Proficiency with relevant software (Kafka, Confluence, Hadoop, Spark) and operating environments (PaaS, VMs, Unix, Linux).
- Programming skills in Python and Java.
- Fluent in both spoken and written English and Chinese.
Salary: 1M annual package