• At least 4 years’ experience in backend software development, including designing and developing data pipelines and APIs.
• Experience in data-driven design, database architecture, and implementing ETL/ELT pipelines with a focus on loose coupling and end-to-end data lifecycle management.
• Must have an understanding of high-level concepts and industry standards in data lake/warehouse/lakehouse management, including data cataloging, data governance, data quality, data observability, best practices in data persistence, and leveraging data to drive analytics for business use.
• Must have working knowledge of distributed systems and microservices architectures.
• Must have working knowledge of highly scalable distributed and cloud-native databases.
• Bachelor’s degree in Computer Science, or an equivalent combination of education and experience; professional certification preferred.
• Must have experience in technical leadership and ownership of software components and code bases.
• Must have experience leading and guiding other engineers.
• Must have excellent verbal, non-verbal, and written communication skills, including the ability to communicate effectively with internal and external customers.
• Strong knowledge of writing SQL queries, stored procedures, and user-defined functions.
• Strong experience in Python.
• Experience with standard Git workflows, and working knowledge of CI/CD pipelines for automating testing and deployment.
• Nice to have: working knowledge of Snowflake and Apache Kafka, or equivalents.