Data Engineer at Carvana, LLC in Tempe, Arizona, United States
Job Description
Carvana, LLC seeks a Data Engineer in Tempe, AZ. Responsibilities include:
- Build and maintain SQL logic for reporting on critical business functions supported by the team (purchasing, pricing, inventory management, delivery fee optimization)
- Build and maintain SQL logic in Snowflake data warehouse to ingest user data on customer search and view behavior for detailed analysis and reporting
- Build and maintain SQL logic in Snowflake to ingest user experiment data produced by product engineering / analytics for a number of A/B tests supported by the Retail Data Science team
- Build and maintain SQL logic in Google BigQuery for monitoring of market data provided by third-party vendors to track inventory, sales, and pricing of key competitors in the used vehicle market
- Build and maintain SQL logic to monitor forecast vs. actual performance of predictive models owned and operated by the data science team in service of critical business functions (approx. 20 predictive models used in pricing and purchasing optimization)
- Build Tableau visualizations and dashboards in support of the above
- Build capabilities for statistical analysis and tracking of key performance metrics for predictive models owned by the data science team
- Build capabilities for statistical analysis of user experiment results by experiment, cohort, and treatment group over time and summarized for the duration of experiments
Telecommuting is permitted.
The position requires a Bachelor's degree in Computer Science, Business Analytics, or a related field plus three (3) years of experience. In the alternative, the company will accept a Master's degree in Computer Science, Business Analytics, or a related field and knowledge of these skills gained through any amount of previous work experience or university-level coursework.
Requires three (3) years of experience (with a Bachelor's degree), or any amount of knowledge (with a Master's degree), in the following:
- SQL
- Tableau dashboard design and deployment, including building dashboards and visualizations
Two (2) years of experience (with a Bachelor's degree), or any amount of knowledge (with a Master's degree), in the following:
- SQL performance optimization on traditional OLTP and columnar databases, e.g., Snowflake
- Tableau's extract scheduling to coordinate updates with data engineering pipelines for performant, up-to-date reporting refreshes
- Measurement and analysis of user experiments (control vs. treatment impact)
One (1) year of experience (with a Bachelor's degree), or any amount of knowledge (with a Master's degree), in the following:
- Working with data scientists to build and deploy measurement tools to track predictive accuracy and performance of predictive models
40 hours/week. Must also have authority to work permanently in the U.S. Applicants who are interested in this position may apply at www.jobpostingtoday.com (Ref #32115) for consideration.