
Parsons Corporation
Multinational technology‑driven engineering and infrastructure firm serving government and private sectors.
Data Warehouse Engineer - Intern
Develop and optimize Azure-based data pipelines and warehousing solutions.
Job Highlights
About the Role
• Design, develop, and optimize cloud‑based data pipelines using Python, Apache Spark (Databricks), and Azure Data Factory (see the illustrative sketch after this list).
• Implement batch and streaming ETL/ELT processes for structured and semi‑structured data.
• Utilize Azure services (Data Factory, Synapse Analytics, Data Lake Storage, Databricks) and support multi‑cloud storage (AWS S3, GCP BigQuery) as needed.
• Build and maintain modern data warehouse/lakehouse solutions such as Snowflake, Redshift, or BigQuery.
• Orchestrate workflows with Azure Data Factory pipelines or Apache Airflow, handling scheduling and failure recovery.
• Manage relational (PostgreSQL, MySQL, Azure SQL) and NoSQL (MongoDB, Cassandra, DynamoDB) databases, writing efficient SQL queries.
• Apply version control (Git, Azure DevOps) and support data quality checks, performance tuning, and documentation.
• Collaborate in Agile teams, communicate data needs, and document pipelines and system designs.
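For context, the batch ETL work described in the first bullet often looks something like the minimal PySpark sketch below; the storage paths, dataset, and column names are hypothetical placeholders for illustration only, not Parsons systems.

    from pyspark.sql import SparkSession, functions as F

    # Illustrative lake paths (placeholders, not real storage accounts).
    RAW_PATH = "abfss://raw@examplelake.dfs.core.windows.net/orders/"
    CURATED_PATH = "abfss://curated@examplelake.dfs.core.windows.net/orders/"

    spark = SparkSession.builder.appName("orders_batch_etl").getOrCreate()

    # Extract: read semi-structured JSON from the raw zone of the data lake.
    raw = spark.read.json(RAW_PATH)

    # Transform: drop incomplete rows, normalize types, and stamp the load date.
    clean = (
        raw.dropna(subset=["order_id", "order_ts"])
           .withColumn("order_ts", F.to_timestamp("order_ts"))
           .withColumn("amount", F.col("amount").cast("decimal(12,2)"))
           .withColumn("load_date", F.current_date())
    )

    # Load: write partitioned Parquet to the curated zone feeding the warehouse layer.
    (clean.write
          .mode("overwrite")
          .partitionBy("load_date")
          .parquet(CURATED_PATH))

In a Databricks or Azure Data Factory setting, a job of this shape would typically be scheduled and monitored by the orchestration tooling mentioned above rather than run by hand.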
Key Responsibilities
- Data pipelines
- ETL/ELT
- Cloud services
- Data warehouse
- Workflow orchestration
- Database management
What You Bring
The internship lasts 3 to 6 months with the possibility of extension, offering hands‑on experience building data pipelines in an Azure enterprise environment. You will receive mentorship from senior data engineers and architects, work with lakehouse architectures, and develop practical skills in Databricks, Azure Data Factory, and Synapse. The role provides insight into how data engineering supports large‑scale infrastructure and program delivery while promoting professional growth.
• Required skills: Python programming, SQL, Apache Spark/Hadoop/Beam, Azure data services, and Git proficiency.
• Preferred: exposure to Azure DevOps/GitHub Actions, Agile/Scrum, data governance (GDPR/CCPA), and multi‑cloud experience.
Requirements
- Python
- SQL
- Spark
- Azure
- Git
- Databricks
Work Environment
Hybrid