- Create and maintain optimal data pipeline architecture.
- Extract large, complex data sets that meet functional and non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability.
- Build the infrastructure required for optimal ETL pipelines from a wide variety of data sources using Python, SQL, and Hadoop-ecosystem technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
- Create data tools for analytics and data science team members that assist them in building and optimizing our product into an innovative industry leader.
- Work with stakeholders, including the Executive, Marketing, Accounting, and Finance teams, to assist with data-related technical issues and support their data infrastructure needs.
- Work with data scientists and analysts to strive for greater functionality in our data systems.