Responsibilities
- Crawl data
- Perform exploratory data analysis and data visualization
- Engineer features
- Build models
- Deploy models on cloud platforms
- Optimize product performance
- Optimize model training
- Design and build reusable code
- Write clear technical documentation
- Follow Agile methodology
Requirements
- Experience with programming, machine learning, AI, and databases
- Experience with data governance, data risk, auditing, information security, data protection, data management, or related areas
- Excellent communication and organizational skills
- Good knowledge of data lake and data warehouse principles and modeling
- Experience with analytics, batch and real-time processing
- Experience with cloud platforms, e.g. AWS
- Experience working with both technical and non-technical teams on data initiatives
- Practical experience working in an agile delivery environment
- Programming and SQL skills are a plus
- Bachelor's degree or above in Computer Science, Computer Engineering, Electrical Engineering, Mathematics, Physics, or a related major
Technical skills
- Python, JavaScript
- TensorFlow, PyTorch, Keras
- Seaborn, Plotly, D3, ggplot, Matplotlib, Tableau
- MongoDB, SQL databases, Hadoop / Apache Spark, Amazon S3, Flash
Details
- Work from home
- Attractive remuneration package
- Applications accepted until roles are filled
Apply now