Everlywell

Everlywell makes lab testing easy and convenient with at-home collection and digital results in days.

Everlywell is a consumer-initiated at-home laboratory testing company with easy-to-read and actionable results. We are at the forefront of personalized health, transforming the $25B lab testing industry. Everlywell is growing rapidly and we are looking for exceptional talent to join our team. We move at a fast pace to solve problems quickly so that our customers have a rewarding experience. If this sounds like your type of environment, we are eager to speak to you! 
Role Summary:
As a Data Engineer, you will report directly to the Director of Data Architecture. If you have a passion for data and cloud technologies and a drive to design effective data pipelines, look no further. You will ensure that our data infrastructure and architecture support the evolving requirements of the business. You will work closely with business stakeholders, the Data Analytics team, and application engineers to develop a strategy for our long-term Data Platform architecture. You will identify gaps in data processes and drive improvements while mentoring and coaching other team members.

What You’ll Do:

    • Be an essential part of designing and building our new data architecture and platform
    • Build and maintain ETL pipelines that are reliable and scalable
    • Explore and evaluate new technologies and make recommendations where necessary
    • Develop, test and maintain existing architecture
    • Identify and monitor gaps in current data processes and drive improvements
    • Recommend ways to improve the reliability, efficiency, and quality of the data platform, and optimize for performance, scalability, and cost
    • Work with ELT tools to sync data to/from 3rd party services
    • Collaborate with the Data Analytics team to build the correct datasets for further consumption by various visualization tools
    • Design data models that support business needs

Who You Are:

    • Programming experience and a demonstrated interest in statistical analysis and business intelligence
    • 5+ years of experience with SQL, data warehouse development, and ETL
    • Hands-on experience with at least one cloud-based data warehouse (e.g. Snowflake, Redshift, etc.)
    • Expert-level scripting skills using Python, Shell or similar
    • Expertise in PySpark and Pandas
    • Experience with standard warehousing concepts like Data Marts and Dimensional Modeling
    • Excellent communication skills, both verbal and written
    • Experience with at least one data modeling tool
    • Strong problem-solving abilities and critical thinking

Nice To Have:

    • Hands-on experience managing and performance-tuning PostgreSQL
    • Experience with ETL tools like Stitch, Fivetran, Pentaho, etc.
    • Experience with data warehouse schema design and architecture
    • Experience with Big Data solutions such as Snowflake or Redshift
    • Experience managing RDS
    • Experience with Data Science Notebooks
    • Experience with NoSQL databases

You’ll Love Working Here:

    • Venture backed by top-tier firms
    • The opportunity ahead knows no bounds
    • Open vacation policy
    • Employee discounts
    • Paid parental leave
    • Health benefits
    • 401(k)

To apply for this job, please visit jobs.lever.co.