Posted on 7/30/2025
Junior Software Engineer – Data Mobility
Jobright.ai
Seattle, WA
Full Description
Job Summary:
DoorDash is a food delivery platform that connects customers with local and national businesses. The Software Engineer in Data Mobility will focus on managing the seamless movement of vast amounts of data to support various business lines and drive machine learning initiatives.
Responsibilities:
• Contribute to powering multiple business lines with high-quality, low-latency data directly integrated into online systems, driving billions in revenue.
• Work with advanced open-source technologies such as Apache Spark, Flink, Kafka, Airflow, Delta Lake, and Iceberg.
• Play a crucial role in evolving our systems to accommodate a 10x scale increase, supporting DoorDash’s expanding international footprint.
• Be part of a team that drives innovation and maintains high standards of reliability and flexibility in our data infrastructure.
• Collaborate closely with cross-functional teams in Analytics, Product, and Engineering to ensure stakeholder satisfaction with the data platform's roadmap.
Qualifications:
Required:
• B.S., M.S., or Ph.D. in Computer Science or equivalent
• 2+ years of experience with CS fundamentals and at least one of Scala, Java, or Python, OR 2+ years of production experience in at least one of those languages.
• Strong understanding of SQL.
• You are located in, or are willing to relocate to, the Bay Area or Seattle.
• Experience improving efficiency, scalability, and stability of data platforms.
Preferred:
• Prior technical experience in Big Data solutions - you've built meaningful pieces of data infrastructure.
• Bonus if that infrastructure was built on open-source big data processing frameworks such as Spark, Airflow, Kafka, Flink, Iceberg, or Delta Lake (see the illustrative sketch below).
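For context on the stack named above, here is a minimal, hypothetical sketch of the kind of pipeline this role describes: a PySpark Structured Streaming job that reads events from a Kafka topic and appends them to a Delta Lake table. The broker address, topic name, schema, and paths are illustrative assumptions, not DoorDash's actual configuration.

```python
# Hypothetical sketch: Kafka -> Spark Structured Streaming -> Delta Lake.
# Assumes the spark-sql-kafka and delta-spark packages are on the Spark classpath.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import (DoubleType, StringType, StructField,
                               StructType, TimestampType)

spark = SparkSession.builder.appName("orders-kafka-to-delta").getOrCreate()

# Assumed event schema; real topics and schemas would be internal.
schema = StructType([
    StructField("order_id", StringType()),
    StructField("store_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_time", TimestampType()),
])

# Read the raw event stream from Kafka (placeholder broker and topic).
raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")
       .option("subscribe", "orders")
       .load())

# Parse the JSON payload into typed columns.
events = (raw.selectExpr("CAST(value AS STRING) AS json")
          .select(from_json(col("json"), schema).alias("e"))
          .select("e.*"))

# Append the parsed events to a Delta Lake table (placeholder paths).
query = (events.writeStream
         .format("delta")
         .option("checkpointLocation", "/tmp/checkpoints/orders")
         .outputMode("append")
         .start("/tmp/delta/orders"))

query.awaitTermination()
```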
Company:
DoorDash is a food delivery platform that connects customers with local and national businesses. Founded in 2013, the company is headquartered in San Francisco, California, USA, and has more than 10,000 employees. DoorDash is publicly traded and has a track record of offering H-1B sponsorship.