Work with the product and technology teams to understand the different entities and objects
Define a data model to store the source data that feeds into the transactional system as well as the analytics platform
Work with the content acquisition team to build partnerships with data providers and enable integration
Create and maintain multiple pipeline architectures in an optimal manner
What you’ll need:
10+ years of experience building scalable, modular data lakes/data warehouses, or 8+ years of experience building large-scale data warehouse systems, along with a degree in Computer Science with a specialization in Analytics, Information Systems, etc.
Strong data modeling skills
Strong analytical skills working with unstructured data
Advanced knowledge of SQL and scripting
Experience with big data tools: EMR, Hadoop, Spark, etc.
Experience with relational and NoSQL databases: Cassandra, Postgres, etc.
Experience with a workflow management tool: Airflow or similar
Experience with AWS cloud services: EMR, EC2, ECS, etc.
Experience with object-oriented programming: Java, Python, Scala, R
Experience with BI tools such as Tableau, Domo, or Looker
Experience working in a fast-paced startup environment, with a track record of building scalable infrastructure and analytics platforms