LiveRamp is the leading data connectivity platform for the safe and effective use of data. Powered by core identity resolution capabilities and an unparalleled network, LiveRamp enables companies and their partners to better connect, control, and activate data to transform customer experiences and generate more valuable business outcomes. LiveRamp’s fully interoperable and neutral infrastructure delivers end-to-end addressability for the world’s top brands, agencies, and publishers.
For more information, visit [...]

Design and build core services, APIs, and infrastructure that support the growth of LiveRamp's products.
● Build distributed, scalable, and reliable data pipelines that ingest and process terabytes of data per day.
● Work on a massively scalable segmentation engine that performs real-time segmentation with accurate counts for people-based marketing. If you enjoy advanced query processing engines, indexing strategies, and database engines in general, this could be a fit.
● Enhance our core toolkit to enable easy implementation of discoverable services and APIs. You would love working on this if building toolkits and libraries that other developers love to use is your thing.
● Build data collaboration capabilities that allow different parties to connect their own data and create new data partnerships.
● Build tools that enable data analysts and data scientists to turn data into actionable insights.
● Apply cutting-edge privacy-enhancing technologies to strengthen our product offerings.
● Collaborate closely with product, data science, infrastructure, and other internal partners on a remote-first team of supportive and passionate software engineers.
● Excellent oral and written English skills; ability to communicate with international colleagues and to document requirements and specifications in English.
● 6+ years of experience writing and deploying high-quality production code to deliver business value.
● A passion for and track record of designing and building scalable services, APIs, and infrastructure.
● Strong experience programming in Java.
● Experience working with big data technologies (Hadoop, Spark, Presto, Airflow, etc.) preferred.
● Have a product-oriented mindset: you like offering technical solutions to business problems.
● Strong knowledge of and experience with Scrum project management.
● Experience with GCP or another cloud provider (AWS, Azure, ...).
● Excellent communication skills, a collaborative team spirit, and analytical and logical thinking.
● Type S(tartup) personality: smart, ethical, friendly, hard-working and proactive.
● Experience working on a remote-first team.
● BSc degree in Computer Science, Engineering or relevant field.