About the Role
Ethos is a deeply data-driven company; every decision we make is backed by data. The data engineering team's mission is to make data and insights available to everyone in the company. You will work with some of the brightest people, helping drive the infrastructure and insights needed to disrupt one of the oldest and largest industries in the country.
Duties and Responsibilities:
- Your team owns data platforms such as Snowflake, Segment, Airflow, dbt, Mode, and SageMaker. Take responsibility for everything from configuration with Terraform to monitoring and observability with Datadog. Operational excellence is the name of the game.
- Integrate system upgrades and new software versions on a regular, timely cadence.
- Evaluate changes for performance or functional impact and manage deployments appropriately.
- Provide observability into our systems. Use out-of-the-box functionality where possible; build it out where necessary.
- Set up monitoring, alerting, and notifications to respond proactively to any issues.
- Constantly optimize usage across all of our platforms.
- Maintain change control for data infrastructure.
- Build real-time data pipelines to ingest structured and unstructured data into the data warehouse
- Build ETL pipelines using Airflow to drive analytics, reporting, and machine learning
- Build and maintain data governance, classification, and the data dictionary
- Continuously improve the A/B testing framework and its reporting
- Support leadership and teams across the company with research on key business initiatives and challenges
Qualifications and Skills:
- Minimum of 5 years of experience with Python, Java, or Scala
- Minimum of 5 years of solid software engineering experience building data or backend systems
- Comfortable with navigating complex topics and using data to make decisions
- Excellent communication, presentation, and interpersonal skills