You'll be working on a small remote team, using Python, SQL, Airflow, Airbyte, dbt, Snowflake, and Kubernetes day to day. Requirements:

- 3+ years of experience leading data engineering projects
- Strong hands-on experience with cloud-based data solutions
- Expertise in Python, SQL, and Git
- Deep understanding of data structures
- Experience with containerization and Infrastructure as Code
- Proficiency in pipeline orchestration tools and data integration frameworks
- Strong knowledge of data warehousing and real-time data streaming
- A coaching mindset