
Data Engineer - ETL, Python

Information Technology | Remote in Washington, DC | Contract

Job Description

About Us:

Radiant Digital delivers technology consulting and business solutions for commercial and government clients.

Our flexible delivery model allows us to provide end-to-end solution delivery, single-project execution, and/or strategic resources.

CMMI Maturity Level III and ISO 9001:2015 certified.

Responsibilities:

  1. Design, develop, and maintain scalable and robust ETL pipelines to extract, transform, and load data from various sources into our data warehouse/data lake (see the illustrative sketch after this list).
  2. Collaborate with cross-functional teams to gather data requirements, define data models, and implement data integration solutions.
  3. Optimize ETL processes for performance, reliability, and scalability, ensuring efficient data processing and timely delivery of insights.
  4. Implement data quality checks and monitoring mechanisms to ensure data accuracy, completeness, and integrity.
  5. Troubleshoot and resolve data-related issues, investigate root causes, and implement preventive measures.
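
For illustration only, here is a minimal Python sketch of the kind of extract, transform, and load flow described in items 1 and 4 above. The source and warehouse connection strings, table names, and the 10% row-loss threshold are hypothetical assumptions, not details of the role.

import pandas as pd
from sqlalchemy import create_engine

# Hypothetical connection strings -- replace with real hosts and credentials.
SOURCE_DSN = "postgresql://user:pass@source-host/sales"
WAREHOUSE_DSN = "postgresql://user:pass@warehouse-host/analytics"

def extract(engine) -> pd.DataFrame:
    # Pull raw order rows from the operational (source) database.
    return pd.read_sql("SELECT order_id, customer_id, amount, created_at FROM orders", engine)

def transform(df: pd.DataFrame) -> pd.DataFrame:
    # Normalize types and apply simple data-quality checks.
    df["created_at"] = pd.to_datetime(df["created_at"], utc=True)
    df = df.dropna(subset=["order_id", "customer_id"])  # completeness check
    df = df[df["amount"] >= 0]                          # integrity check
    return df

def load(df: pd.DataFrame, engine) -> None:
    # Append the cleaned batch to a warehouse fact table.
    df.to_sql("fact_orders", engine, if_exists="append", index=False)

def run_pipeline() -> None:
    source = create_engine(SOURCE_DSN)
    warehouse = create_engine(WAREHOUSE_DSN)
    raw = extract(source)
    clean = transform(raw)
    # Monitoring: fail loudly if an unexpected share of rows is dropped.
    if len(raw) and len(clean) < 0.9 * len(raw):
        raise ValueError("More than 10% of rows failed quality checks")
    load(clean, warehouse)

if __name__ == "__main__":
    run_pipeline()

In practice a pipeline like this would typically run under an orchestrator (for example, Azure Data Factory or a Databricks job) rather than as a standalone script.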

Requirements:

  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
  • 8+ years of experience as a Data Engineer, with a focus on ETL development and data integration.
  • Proficiency in programming languages such as Python, SQL, and Scala.
  • Experience with cloud providers other than Azure (e.g., AWS, GCP) and their related services.
  • Experience working with relational databases (e.g., SQL Server, PostgreSQL, MySQL) and non-relational databases (e.g., MongoDB, CosmosDB).
  • Hands-on experience with Azure data services such as Azure Data Factory, Azure Databricks, Azure Synapse Analytics, or similar.
  • Experience working with Azure SQL Database, Azure Blob Storage, Azure Cosmos DB, and other Azure data services.
  • Familiarity with data visualization tools (e.g., Power BI) for creating dashboards and reports.
  • Experience with CI/CD (Continuous Integration/Continuous Deployment) pipelines for automated software delivery.
  • Strong understanding of data warehousing concepts, data modeling, and database design principles.
  • Excellent communication and problem-solving skills, an analytical mindset, and attention to detail.