Afterpay has transformed the way people pay by allowing shoppers to receive products immediately and pay in four simple installments over a short period of time. The service is completely free for customers who pay on time, helping consumers spend money responsibly without incurring interest, fees or revolving and extended debt. Afterpay is offered by more than 48,400 of the world’s best retailers and has more than 9 million active customers globally. The service is currently available in Australia, New Zealand, the United States and the United Kingdom, where it is called Clearpay. Afterpay is on a mission to be the world’s most loved way to pay. We are building our tech teams in Melbourne, San Francisco and Shanghai. We’re team players committed to fast-paced, quality work, and we’re looking for people who are keen to be part of something new as it rapidly scales.
This position will be based in our newly established Shanghai Technology Center.
Afterpay has unique opportunities for Big Data Engineers to join our Data Science Engineering Team and make an immediate impact on our product by getting involved early at our Shanghai Tech Center. These engineers will influence many aspects of our business by architecting our data processing platform. This is an exciting opportunity to make a direct, tangible impact on our product and work on projects that are pivotal to our team’s success and our company’s growth.
• Design and build a self-service data platform to handle new products and business requirements, securely scaling to millions of users and their transactions
• Develop, integrate and optimize end-to-end data pipelines
• Partner with data scientists, data analysts and domain engineering teams to identify and execute on new opportunities.
• BA/MS/PhD in computer science or a related field
• Experience building and maintaining applications in a modern programming language, primarily Java or Python
• Experience with Flink, Kafka, Spark, Cassandra or similar technologies
• Experience with cross-platform big data file formats such as Apache Avro and Apache Parquet
• Experience with AWS services such as S3, EMR and Redshift
• Experience architecting large-scale distributed systems
• Enthusiastic about solving business problems with technology and can take ownership of an end-to-end solution
• Passionate about continuously learning new technologies, frameworks, and services
• Possess an execution mindset and the ability to deliver with cross-functional teams that are globally distributed
• Serious about testing and have experience with automated testing frameworks
• Exceptional written and verbal technical communication skills