Autodesk

Design and make software for architecture, engineering, construction, and entertainment industries.

11,600 employees · Building Design · Construction · Automotive · Building Product Manufacturing · 3D Animation · Architecture · Engineering · Construction Professionals · Mechanical Engineering · Mechanical CAD · Thermal Simulation · Electronic Design Automation · Printed Circuit Board Design · Mechanical, Electrical, and Plumbing (MEP) · HVAC · Fabrication · Estimation · Infrastructure · Civil Engineering · Genetic Engineering (Life Sciences)

Senior Data Engineer

Develop scalable ETL pipelines and data warehousing solutions using Snowflake, dbt, Spark, and Airflow.

Vancouver, British Columbia, Canada
87k - 127k USD
Full Time
Intermediate (4-7 years)

Job Highlights

Environment
Office Full-Time

About the Role

The engineer will apply a product‑focused mindset to understand business requirements and architect scalable, extensible data systems. Responsibilities include leading design reviews, developing ETL processes with Snowflake, dbt, and Apache Spark, and building data‑quality tracking mechanisms to detect anomalies. Code reviews and the establishment of best engineering practices are also key parts of the role.

  • Design and implement scalable ETL pipelines using Snowflake, dbt, Apache Spark, and Airflow.
  • Create and maintain data‑quality monitoring and anomaly‑detection solutions.
  • Conduct design and code reviews to enforce best practices and improve efficiency.
  • Translate business requirements into robust dimensional data models.
  • Automate deployment workflows with Git, Jenkins, and CI/CD pipelines.
  • Collaborate within Agile Scrum teams to deliver data solutions.
  • Leverage AWS, Hive, and Hadoop ecosystems for large‑scale data ingestion.
  • Communicate technical concepts clearly to cross‑functional stakeholders.
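
To give a concrete flavor of the pipeline work described above, here is a minimal sketch of an Airflow DAG that chains a Spark ingestion step, a simple data‑quality gate, and a dbt transformation. All names (the DAG id pdms_daily_etl, the script ingest_events.py, the dbt project path) are illustrative assumptions, not details taken from the posting.

```python
# Minimal sketch of the kind of Airflow DAG described in the role.
# All identifiers (dag_id, scripts, dbt project paths) are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def check_row_counts(**context):
    """Placeholder data-quality gate: fail the run if the staging load looks empty."""
    # A real implementation would query the warehouse; this only illustrates the hook.
    row_count = 42  # assume a count fetched from a staging table
    if row_count == 0:
        raise ValueError("Staging table is empty - aborting downstream tasks")


with DAG(
    dag_id="pdms_daily_etl",           # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    ingest = BashOperator(
        task_id="spark_ingest",
        bash_command="spark-submit ingest_events.py",      # hypothetical Spark job
    )
    quality_gate = PythonOperator(
        task_id="quality_gate",
        python_callable=check_row_counts,
    )
    transform = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/pdms",  # hypothetical dbt project
    )

    ingest >> quality_gate >> transform
```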

Key Responsibilities

  • ETL pipelines
  • Data quality
  • Design reviews
  • Dimensional modeling
  • CI/CD automation
  • AWS ingestion
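
The data‑quality and anomaly‑detection responsibility could, for example, take the form of a scheduled check that compares today's row count against a trailing average in Snowflake. The connection parameters, the table and column names, and the 50% threshold below are assumptions for illustration only.

```python
# Illustrative data-quality check: flag a table whose daily row count deviates
# sharply from its trailing 7-day average. Connection details, the table name
# (ANALYTICS.FACT_EVENTS), the load_date column, and the threshold are hypothetical.
import snowflake.connector


def row_count_anomaly(conn, table: str, threshold: float = 0.5) -> bool:
    cur = conn.cursor()
    cur.execute(
        f"""
        SELECT
            COUNT_IF(load_date = CURRENT_DATE)                        AS today_rows,
            COUNT_IF(load_date >= DATEADD(day, -7, CURRENT_DATE)
                     AND load_date <  CURRENT_DATE) / 7.0             AS avg_rows
        FROM {table}
        """
    )
    today_rows, avg_rows = cur.fetchone()
    if not avg_rows:
        return True  # no recent history is itself worth flagging
    return abs(today_rows - avg_rows) / avg_rows > threshold


if __name__ == "__main__":
    conn = snowflake.connector.connect(
        account="my_account",       # placeholder credentials
        user="etl_monitor",
        password="***",
        warehouse="MONITOR_WH",
    )
    if row_count_anomaly(conn, "ANALYTICS.FACT_EVENTS"):
        print("Row-count anomaly detected - investigate before publishing.")
```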

What You Bring

Autodesk is seeking a Senior Data Engineer to join its Data and Analytics team supporting the Product Development & Manufacturing Solutions (PDMS) organization. The role works within a robust ecosystem that includes dbt, PySpark, Python, Airflow, Snowflake, Hive, and AWS. The ideal candidate brings strong data‑warehousing experience, a detail‑oriented mindset, and enthusiasm for learning new technologies.

Minimum qualifications include a Bachelor’s degree in Computer Science, Engineering, or equivalent experience, plus at least five years of hands‑on work with ETL/ELT tools and data curation. Candidates must be proficient in Python and SQL, understand dimensional modeling, and have experience with Snowflake or similar data‑ingestion platforms. Strong communication, problem‑solving skills, and a curiosity for the “why” behind business processes are required.

Preferred credentials include experience with dbt, PySpark, Airflow, Spark, Hadoop 2.0, and related ecosystems. Familiarity with automation frameworks such as Git and Jenkins and experience working in Agile Scrum teams are also valued.
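
As a rough illustration of the dimensional‑modeling and PySpark experience mentioned above, the sketch below derives a simple product dimension from a raw extract. The input path, column names, output table, and surrogate‑key approach are assumptions, not details from the posting.

```python
# Sketch: build a small product dimension from a raw extract with PySpark.
# The input path, columns, and output table name are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dim_product_build").getOrCreate()

raw = spark.read.parquet("s3://example-bucket/raw/products/")  # hypothetical path

dim_product = (
    raw.select("product_id", "product_name", "category", "updated_at")
       .dropDuplicates(["product_id"])
       # Surrogate key for the dimension; a hash keeps it stable across loads.
       .withColumn("product_key", F.sha2(F.col("product_id").cast("string"), 256))
       .withColumn("load_ts", F.current_timestamp())
)

dim_product.write.mode("overwrite").saveAsTable("analytics.dim_product")
```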

Requirements

  • Python
  • SQL
  • Snowflake
  • dbt
  • PySpark
  • Airflow

Benefits

Autodesk offers a competitive compensation package, with a base salary in British Columbia ranging from $86,600 to $127,050, plus potential bonuses, stock grants, and a comprehensive benefits suite. The company emphasizes a culture of belonging, diversity, and meaningful work that helps build a better world.

Work Environment

Office Full-Time
