Junior Data Engineer for an International Media Group | Big Data, Cloud Infrastructure | SQL, Python, AWS, Hadoop | Singapore-based
- Are you someone who loves learning and sharing new technical knowledge?
- Are you someone who has a strong sense of ownership and who is adaptive to change?
- Are you tech-savvy and someone who’s willing to take on new challenges?
- Are you looking to work with innovative, collaborative, dependable and passionate people?
Sound like something you've been waiting for? Then read on, this is for you!
Your new employer: A media group that aspires to inspire, innovate, and drive culture forward!
Your new employer humbly started as a passion project, a sneaker website, back in 2005 and evolved into a publicly listed media company in 2016. Driven to create a lifestyle universe that uncovers emerging trends in fashion and culture, the company expanded into three major divisions – Digital Media, Creative Agency, and E-Commerce retail. As an international company, your new employer has 300+ employees across major cities in APAC, the USA, and Europe, with physical offices in Hong Kong, Shanghai, Tokyo, London, New York, and LA.
Their flagship platform covers the latest trends in fashion, art, design, and culture. As the leading online destination for men's contemporary fashion and streetwear, the platform is available in six languages and enjoys a wide readership across 10+ countries.
The company's core mission of educating a global, influential audience on fashion, arts, design, and music is driven by a team of passionate, dedicated, tech-savvy people in constant pursuit of self-improvement and of building the company together.
The Role: Junior Data Engineer
Due to the company's continued hypergrowth, they need you right now! As the Junior Data Engineer, you will work together with the Senior Manager, Data Analyst, to build the company's cloud infrastructure. You will have the opportunity to learn how to scope and run a data project from end to end, how to use GCP for data pipelines, and how to design and build analytics applications on top of the pipeline infrastructure.
- Design, implement, and maintain data governance, dictionaries, and models
- Identify, define, and compile disparate data sources (local, cloud, 3rd party) for ingestion in data pipelines
- Develop and implement data pipelines & infrastructure for the company
- Build data lakes and warehouses using Google Cloud Platform (GCP)
- Develop views with MySQL/PostgreSQL
- Implement functional data marts using BigQuery
- Build visualizations using Tableau/Data Studio
- Collaborate with Data Analyst to generate actionable insights based on your soon-to-be intimate understanding of the company’s data
What’s needed from you?
- 2 years' experience managing databases or building data pipelines on the cloud
- Proficient in SQL and relational databases
- Proficient in at least one scripting language, preferably Python or R, to clean, transform, and denormalize datasets
- Knowledge of visualization tools such as Tableau, Data Studio, or Power BI
- Loves learning and sharing new technical knowledge
- Willing to take on new challenges
- Possesses a strong sense of ownership
- Adaptive to change
What would make you stand out?
- Experience with GCP infrastructure
- Knowledge in customer analytics