Data Engineer
Location: Washington, District of Columbia - Remote
Category: Data
Employment Type: Contract
Job ID: 15287
Date Added: 03/16/2023
This position will be required to:
- Work with Business Owners and Data and Business Intelligence teams (on Looker, Tableau, etc.) to identify requirements for new and existing data sources, and implement ETL logic for the various interfaces we extract from: APIs, web services, and external and on-premises databases and warehouses.
- Work with business users and technical designers to assist in efficient data model designs that meet business unit requirements, involving integration of ACS technical data from various systems and platforms.
- Work with management, project managers, and other lead developers to design and develop pipelines and ensure data accuracy.
- Lead and participate in troubleshooting and fixing major system problems in core data systems and supplemental data pipelines.
- Understand the relationships between GCP products, primarily Data Fusion, BigQuery, and Looker, and demonstrate experience with Data Fusion (a comparable ETL tool is acceptable) and Google BigQuery (a comparable data warehouse is acceptable).
- Provide strong leadership and mentoring for less senior personnel in the areas of design, implementation, and professional development.
- Where required, effectively delegate tasks to Software Engineering development teams, providing guidance and proper knowledge transfer to ensure the work is completed successfully.
- Be flexible to work during non-business hours.
The ideal candidate will:
- Have experience with Data Fusion (or equivalent), BigQuery (or equivalent), SQL Server, and Java/Python scripting that works well with GCP products, along with their respective practices.
- Python experience is a plus.
- Develop data ETL pipelines that meet both functional and non-functional requirements, including performance, scalability, availability, reliability, and security.
- Have experience writing Java code for data extracts that require cleanup.
- Have a working knowledge of XML, JSON, and other data streaming formats and related technologies in a Java/Python environment.
- Have strong written and verbal communication skills.
- Be able to multitask across various streams of the entire data process.
Education, Experience, and Technical Requirements:
Bachelor’s degree or equivalent experience. 5+ years of proven results in system development, implementation, and operations are required, along with a strong understanding of design patterns with a focus on tiered, large-scale data systems.
#LI-AW
#DICE
#REMOTE