Senior Data Engineer - Omni Channel
Here at Discount Tire, we celebrate the spirit of our people with extraordinary pride and enthusiasm. Our business has been growing for more than 60 years, and there has never been a better time in our history to join us. We recognize that to remain the industry leader, we must continue to grow and evolve in a rapidly changing industry. We are achieving this not only by opening new stores but also by transforming our technological landscape and making data a central component of our strategy. The Business Analytics team, one of the fastest-growing teams in the company, is leading this change. We are responsible for driving insights and recommendations and for developing the decision support tools that influence the strategic direction of the company.
Under minimal supervision, the Senior Data Engineer designs, builds, and operates scalable data solutions on Discount Tire's Databricks-based lakehouse platform. This role supports enterprise analytics, business intelligence, and data science initiatives by delivering high-quality, governed, and performant data products across omni-channel domains. The Senior Data Engineer is responsible for developing and maintaining ELT pipelines, data models, and lakehouse architecture using modern cloud and big data technologies, with a strong focus on reliability, performance, and production ownership.
Essential Duties and Responsibilities:
Lakehouse & Data Architecture
- Designs, develops, and maintains Databricks-based lakehouse solutions leveraging medallion architecture (Bronze, Silver, Gold) and Delta Lake.
- Builds scalable, high-performance analytical data models to support reporting, analytics, and downstream consumption.
- Ensures data solutions align with enterprise standards for security, governance, scalability, and cost efficiency.
Data Engineering & ELT Pipelines
- Develops, tests, and maintains batch and streaming ELT pipelines using Spark SQL.
- Implements incremental data processing, dependency management, and idempotent pipeline patterns across multiple data sources.
- Integrates structured and semi-structured data from transactional, digital, and third-party systems into the lakehouse.
Databricks Platform Development
- Develops and maintains Databricks notebooks, workflows, and jobs across development, QA, and production environments.
- Leverages Delta Lake features including ACID transactions, schema enforcement and evolution, and time travel.
- Partners with platform and DevOps teams to support environment promotion, scheduling, and deployment automation.
Data Quality & Reliability
- Implements automated data quality checks, validation frameworks, and monitoring to ensure accuracy, completeness, and timeliness of data.
- Proactively identifies and resolves data issues, ensuring reliable data availability for business partners.
Performance Optimization & Tuning
- Troubleshoots data pipeline and query performance issues and implements optimizations related to Spark execution, partitioning, file sizing, and data layout.
- Documents root-cause analyses and corrective actions, and provides recommendations to improve system performance and reliability.
Production Ownership & Operations
- Owns data pipelines end-to-end in production, including monitoring, alerting, incident response, and operational support.
- Contributes to runbooks, operational documentation, and continuous improvement of production-readiness practices.
Programming & Standards
- Writes clean, maintainable, and well-tested code using Python (PySpark) and SQL.
- Contributes to data engineering standards, reusable frameworks, and best practices across the Analytics Engineering team.
Documentation & Reviews
- Produces clear technical documentation, including design specifications and data flow diagrams.
- Participates in design and code reviews, providing constructive feedback to ensure quality and consistency.
Collaboration & Stakeholder Engagement
- Collaborates closely with data engineers, analytics engineers, data scientists, digital analysts, and business stakeholders to translate business needs into scalable data solutions.
- Works cross-functionally with project managers, platform teams, and external partners to deliver high-impact analytics capabilities.
Mentorship & Technical Leadership
- Mentors and supports less experienced data engineers through technical guidance and best-practice sharing.
- Leads technical design discussions for complex datasets and influences architectural decisions within the team.
Continuous Improvement
- Stays current with emerging data engineering technologies, trends, and industry best practices.
- Identifies opportunities to improve efficiency, reliability, and scalability across data engineering workflows.
- Performs other duties as assigned.
Qualifications:
- 5+ years of experience in data engineering, data integration, or analytics engineering roles.
- Experience with cloud-based analytics platforms and modern data architecture.
Technical Skills
- Advanced experience with SQL and Spark (PySpark and Spark SQL) in large-scale data environments.
- Strong experience designing and maintaining ELT pipelines in a cloud-based lakehouse architecture.
- Deep understanding of data modeling for analytics, including star schemas and slowly changing dimensions.
- Experience with streaming or near-real-time data processing is preferred.
- Strong knowledge of data governance, security, and compliance best practices.
Tools and Platforms
- Extensive experience with Databricks and Delta Lake for analytics and data engineering workloads.
- Experience with cloud object storage platforms such as AWS S3.
- Familiarity with orchestration, scheduling, and monitoring of data pipelines.
- Experience working with large, complex datasets in analytics and reporting contexts.
- Familiarity with BI and visualization tools such as Tableau, Looker, or Power BI.
- Experience with digital analytics data (e.g., Adobe Analytics or Google Analytics) is a plus.
Methodologies and Frameworks
- Experience working within SDLC methodologies such as Agile and Waterfall.
- Familiarity with CI/CD and version control practices for data engineering workflows.
Communication and Presentation Skills
- Ability to clearly communicate complex technical concepts to technical and non-technical audiences, including leadership.
- Strong written and verbal communication skills, including documentation and presentation of technical solutions.
Customer Service & Collaboration
- Demonstrated ability to work collaboratively across teams and with business stakeholders.
- Strong customer service mindset with a focus on delivering reliable, high-quality data solutions.
Adaptability & Problem Solving
- Strong troubleshooting and analytical skills with a proactive, solution-oriented approach.
- Ability to work effectively in a fast-paced, evolving environment and manage multiple priorities.
Curiosity
- Intellectual curiosity and a passion for data, analytics, and continuous learning.
Educational Requirements:
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field, or equivalent practical experience.
- Data or cloud-related certifications are a plus.
Work Days:
Normal work days are Monday through Friday. Occasional Saturdays and Sundays may be necessary.
Work Hours:
Normal work hours are 8:00 a.m. to 5:00 p.m. Additional hours may be necessary.
Discount Tire provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local law.
#LI-GW1
#LI-Hybrid
Location:
20225 N Scottsdale Rd, Scottsdale, Arizona
Job Type:
Full Time
Category:
Business Analytics/Insights