Equinix

Global leader in data center and interconnection services, enabling digital transformation.

10,000 employees

Senior Data Engineer

Lead design and delivery of enterprise-scale data solutions on GCP, mentoring team.

United States
136k - 204k USD
Full Time
Expert & Leadership (13+ years)

Job Highlights

Environment
Onsite

About the Role

The Senior Data Engineer will drive end-to-end delivery of enterprise-scale data solutions on Google Cloud Platform (GCP). This role blends hands-on technical design, cross-functional team leadership, and mentorship of staff and other engineers to create mission-critical data products.

Key responsibilities include architecting complex, distributed data systems capable of handling petabyte-scale workloads with fault tolerance, high availability, and cost-optimized performance. The engineer will also evaluate emerging technologies, create architectural documentation, and establish best-practice guidelines. The role requires ownership of the full delivery lifecycle for data engineering initiatives, from conception to production, using technologies such as BigQuery, Cloud Dataflow, Composer, Dataform/dbt, Pub/Sub, and Vertex AI. Real-time streaming pipelines will be built with Cloud Pub/Sub, Kafka, and Apache Beam, and data platforms will be containerized using Docker and Kubernetes.

Leadership duties involve guiding and mentoring data engineers, conducting technical design reviews, defining coding standards, and driving continuous improvement in processes and tooling. The engineer will also lead technical training, participate in hiring, and foster a culture of technical excellence.

Data governance responsibilities include establishing data classification, access control, encryption strategies, automated data quality frameworks, and lineage tracking, and ensuring compliance with regulations such as GDPR and SOX. The role also covers designing privacy-preserving techniques and secure data sharing mechanisms.

Finally, the Senior Data Engineer will partner with business stakeholders to translate strategic objectives into technical solutions, lead cross-functional initiatives, and present architectural decisions to executive leadership, ensuring alignment between technology and business outcomes.
  • Design and implement data pipelines using BigQuery, Cloud Dataflow, Composer, Pub/Sub, Kafka, and Apache Beam.
  • Build microservices-based data platforms with containerization (Docker/Kubernetes) and ensure observability.
  • Establish data governance policies, automated quality frameworks, and compliance with GDPR and SOX.
  • Lead and mentor data engineering teams, conduct design reviews, and define coding standards.
  • Translate business objectives into technical solutions and present architectures to executive leadership.

Key Responsibilities

  • data pipelines
  • microservices
  • data governance
  • team mentorship
  • architecture design
  • cloud platform

What You Bring

Required qualifications include 6+ years of hands-on GCP experience with deep expertise in BigQuery, Cloud Dataflow, Cloud Composer, Pub/Sub, Dataproc, and Vertex AI, as well as expert-level programming in Python/Java and proficiency in Python/Scala for Spark. The candidate should have 8+ years of data engineering experience, strong architectural thinking, and excellent communication skills.

Preferred qualifications include Google Cloud Professional Data Engineer or Cloud Architect certifications, experience with AI/ML and MLOps on GCP, knowledge of emerging AI technologies such as Agentic AI and Model Context Protocol, and familiarity with BI tools, data mesh, and multi-cloud architectures.

  • Architect petabyte-scale distributed data systems on GCP with fault tolerance and high availability.
  • Demonstrate 6+ years of GCP expertise, including BigQuery, Dataflow, Composer, and Vertex AI.
  • Proficient in Python/Java and Python/Scala for Spark, with strong knowledge of Apache Beam, Spark, Airflow, and Kafka.
  • Hold 8+ years of data engineering experience, with 6+ years focused on GCP and enterprise data platforms.
  • Possess Google Cloud Professional Data Engineer or Cloud Architect certifications and AI/ML pipeline experience.

Requirements

  • gcp
  • bigquery
  • dataflow
  • spark
  • python
  • data engineer

Benefits

The targeted base salary for this position in Dallas is $136,000‑$204,000 annually, with eligibility for bonus, equity, and a comprehensive benefits package that includes health insurance, retirement contributions, paid time off, and employee assistance programs.

Work Environment

Onsite