Responsibilities:
- Design and implement robust data analytics frameworks.
- Assess and upgrade existing data infrastructure to support scalability and agility.
- Ensure data integrity, security, and compliance with industry standards.
- Architect scalable service platforms to improve user experience.
- Lead the integration of cloud solutions to enhance service delivery and operational efficiency.
- Collaborate with cross-functional teams to define and implement service APIs.
- Align IT architecture with business goals to support financial sustainability.
- Identify and mitigate risks associated with IT infrastructure and data management.
- Drive innovation by exploring emerging technologies and industry trends.
Requirements:
- Bachelor's Degree in Computer Science, Data Science, IT, Electronic Engineering, or a related discipline
- 3 to 5 years of experience implementing data warehouse, data lake, or big data solutions
- Hands-on experience designing enterprise data platforms and solutions incorporating big data, Gen AI / Agentic AI / RAG, and cloud (Microsoft Azure / AWS is a plus)
- Strong knowledge of project management methodologies and tools (Scrum / Jira)
- Experience with big data technologies and related open-source packages such as MS CI, Kafka, Confluence, Hadoop, Spark, etc.
- Experience understanding and translating RPA requirements
- Software knowledge of Kafka, Confluence, Hadoop, Spark, etc., and operating-environment knowledge of PaaS, VM, Unix, and Linux
- Programming knowledge of Python and Java
- Good command of both spoken and written English and Mandarin
Salary: Basic salary (negotiable) + double pay + monthly bonus