Responsibilities
• Create and maintain optimal data pipeline architecture on data platforms, whether data warehouse or data lake.
• Design optimal data models suited to structured, semi-structured, or unstructured datasets to improve the performance and efficiency of data pipeline processes.
• Build the infrastructure and framework required for data pipeline processes using Oracle and Hortonworks technologies in the most efficient way.
• Utilize data platforms to provide insights into business metrics: customer acquisition, operational efficiency, operational risk, regulation, etc.
• Prepare data and tools for the data analyst and data scientist teams.
• Design and build BI reports that let business users analyse data themselves.
• Design, implement, and maintain internal process improvements: automating manual processes, optimizing data delivery, expanding capabilities, etc.
• Work with stakeholders, including the Executive, Product, and Business units, to assist with data-related technical issues and support their data infrastructure needs.
• Work closely with data experts, IT engineers, and partners across all phases of data projects: identification, planning, procurement, contracting, implementation, maintenance, etc.

Qualifications
• Advanced working knowledge of SQL as well as data integration and ETL tools.
• Experience with a variety of databases: Oracle, MSSQL, HBase, Hive, Postgres, MySQL, etc.
• Experience with big data tools, especially Hortonworks or Cloudera: NiFi, Spark, Kafka, Tika, Ranger, etc.
• Experience with cloud platforms: GCP, AWS, Azure, etc.
• Working knowledge of software life cycle methodologies for both development and operations: DevOps, Agile, CI/CD, CMMI, ITIL, etc.
• Strong project management and organizational skills.
• Strong systematic thinking, problem-solving, leadership, interoperability, and orchestration skills.
• Experience in the banking and life assurance industries is a plus.
Additional
• Experience building data warehouses and big data platforms.
• Working knowledge of message queuing, stream processing, and distributed data processing.
• Experience building and maintaining RESTful APIs.
• Experience coding in various languages: Python, Java, C++, C#, Scala, Node.js, etc.