
AECOM
Trusted global infrastructure consulting firm delivering engineering, design, and construction management services.
Data & Integration Engineer - Phoenix
Design and build production-grade data integrations and AI workflows for construction projects.
Job Highlights
About the Role
The position sits at the intersection of data engineering, systems integration, automation, and business-facing application development, with an emphasis on end-to-end production ownership, secure authentication, incremental loads, monitoring, and documentation. The approach is tool-agnostic: sources such as ERP, project management, document management, and scheduling systems are integrated into a centralized repository that supports structured tables, metadata, and vector/semantic indexing for AI use cases.

- Design and maintain production-scale integrations (API, database, file-based, event-driven) that deliver data into a central repository.
- Build and support a centralized data platform that powers analytics, internal applications, and AI/LLM workflows.
- Implement data normalization and metadata standards for reliable reuse across teams and products.
- Translate business requirements into scalable technical solutions, including automation and AI-assisted workflows.
- Ensure reliability with monitoring, alerting, observability, data validation, and failure-recovery mechanisms (retries, idempotency, backfills).
- Contribute to architecture and engineering standards for integration patterns, data modeling, and AI-ready data access.
- Develop internal tools and services (web apps, APIs, utilities) that enable business teams to discover, search, and operationalize data.
- Enable AI/LLM use cases such as semantic search, summarization, classification, structured extraction, and agent workflows, with repeatable pipelines and human-in-the-loop controls.
- Design low-latency data access patterns (views, APIs, query endpoints) that support governed analytics.
- Evaluate and recommend tools and platforms pragmatically, prioritizing outcomes and maintainability over specific vendors.
- Debug issues across integrations, storage, and applications; produce clear documentation including data lineage, contracts, and runbooks.
- Build three or more production integrations into the central repository.
- Deliver at least one data-to-application outcome (internal tool, API, or agent workflow) adopted by the business.
- Implement monitoring/alerting and comprehensive documentation (runbooks, lineage, contracts) for supportability.
- Establish repeatable patterns for structured extraction and AI automation, including QA/evaluation.
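To make the reliability expectations above concrete, here is a minimal Python sketch of an incremental load with retries, a watermark, and idempotent upserts. The table name, schema, and stubbed `fetch_page` call are illustrative assumptions, not any actual AECOM system or API.

```python
import sqlite3
import time

def fetch_page(since, page, max_retries=3):
    """Stand-in for a source-system API call, with retry/backoff."""
    for attempt in range(max_retries):
        try:
            # A real integration would call the source API here; this stub
            # returns two fake records on page 0 and an empty page after.
            if page == 0:
                return [{"id": "a1", "updated_at": since + 1, "cost": 100.0},
                        {"id": "a2", "updated_at": since + 2, "cost": 250.0}]
            return []
        except OSError:
            time.sleep(2 ** attempt)  # exponential backoff between retries
    raise RuntimeError("source unavailable after retries")

def incremental_load(conn, watermark):
    """Pull records newer than `watermark` and upsert them idempotently."""
    conn.execute("""CREATE TABLE IF NOT EXISTS costs
                    (id TEXT PRIMARY KEY, updated_at INTEGER, cost REAL)""")
    page, latest = 0, watermark
    while True:
        rows = fetch_page(watermark, page)
        if not rows:
            break
        for r in rows:
            # Upsert keyed on id: re-running the load (or a backfill)
            # replaces rows instead of duplicating them.
            conn.execute("INSERT OR REPLACE INTO costs VALUES (?, ?, ?)",
                         (r["id"], r["updated_at"], r["cost"]))
            latest = max(latest, r["updated_at"])
        page += 1
    conn.commit()
    return latest  # new watermark to persist for the next run

conn = sqlite3.connect(":memory:")
new_mark = incremental_load(conn, watermark=0)
print(new_mark)  # 2
```

Because the upsert is keyed on the primary key, re-running the load after a failure leaves the table unchanged, which is what makes retries and backfills safe.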
Key Responsibilities
- Data integration
- Data platform
- Automation
- Monitoring
- AI enablement
- Internal tools
What You Bring
The role is based on-site in Phoenix, AZ, and works closely with construction professionals, analysts, and technologists to deliver reliable data systems, tools, and APIs that make data usable beyond reporting, including agent-driven retrieval, summarization, and decision support. Candidates with experience in construction, engineering, manufacturing, or other asset-heavy industries are especially well suited.

- Ability to own end-to-end integration pipelines: source ingestion, normalization, storage, and access via BI, APIs, apps, or agents.
- BA/BS in Computer Science or a related field and 4+ years of experience in data engineering, systems integration, backend engineering, or automation (or equivalent).
- Proficiency with SQL and relational databases; ability to design schemas for operational and analytical use.
- Experience with Python or a similar language for APIs, transformation logic, and automation tooling.
- Prior experience in the architecture, engineering, or construction (A/E/C) industry preferred.
- Expertise in APIs and ETL/ELT workflows, including pagination, rate limits, retries, incremental loads, and secure credential management.
- Experience building systems with monitoring/alerting and clear failure-recovery paths.
- Hands-on experience integrating LLM/AI into automation workflows, including structured outputs and handling hallucinations or low-confidence results.
- Familiarity with software engineering best practices: version control, code review, testing, and secure secret management.
- Ability to work independently in a dynamic environment.
- Experience designing data platforms that support reporting, APIs, applications, and AI agents.
- Knowledge of orchestration/integration frameworks and patterns such as scheduling, triggers, queues/events, and dependency management.
- Experience deploying internal services/tools (admin panels, lightweight apps, APIs) for non-technical users.
- Ability to operationalize AI/LLM workflows with human-in-the-loop review, confidence scoring, structured outputs, and drift monitoring.
- Understanding of data modeling, governance, privacy/permissions, and system reliability.
- Experience in construction, engineering, manufacturing, or similar project-based industries where data ties to cost, schedule, contracts, and field operations.
- Comfort collaborating with non-technical stakeholders.
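As an illustration of the structured-output and low-confidence handling described above, the Python sketch below stubs out the LLM extraction call and routes below-threshold results to a human review queue. The JSON schema, threshold value, and function names are assumptions made for this example only.

```python
import json

CONFIDENCE_THRESHOLD = 0.8  # assumed cutoff; tuned per workflow in practice

def extract_invoice_fields(raw_text):
    """Stand-in for an LLM structured-extraction call; a real pipeline
    would prompt a model to return JSON matching this schema."""
    return json.dumps({"vendor": "Acme Concrete", "amount": 12500.0,
                       "confidence": 0.65})

def route(raw_text, review_queue):
    """Parse structured output; low-confidence results go to humans."""
    result = json.loads(extract_invoice_fields(raw_text))
    if result["confidence"] < CONFIDENCE_THRESHOLD:
        review_queue.append(result)   # human-in-the-loop review
        return None                   # not auto-committed downstream
    return result                     # high confidence: auto-process

queue = []
auto = route("Invoice #441 ...", queue)
print(auto, len(queue))  # None 1
```

The key design choice is that the automated path only ever commits results above the threshold; everything else is queued, which keeps hallucinated or uncertain extractions out of downstream systems.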
Requirements
- SQL
- Python
- ETL
- APIs
- Data modeling
- 4+ years
Benefits
AECOM offers a comprehensive benefits package, flexible work options for eligible staff, and opportunities for professional growth on groundbreaking projects worldwide. The listed salary range is $74,499 to $137,824 annually; relocation assistance is not provided, and U.S. work authorization without sponsorship is required.
Work Environment
Office Full-Time