Data Backend Engineer
Location: Rockville, Maryland - Remote
Category: Data
Employment Type: Contract
Job ID: 15372
Date Added: 05/15/2023
This engineer will work in the Enterprise Data Platforms group, which includes products covering the full data lifecycle: ingestion, cataloging, management, sharing, and authorization. Teams in our group develop platform-level services central to FINRA's cloud-based data lake, which processes hundreds of billions of market events daily.
The position is on a newly formed team created to improve data discovery, streamline the path to data access for analytics, and centralize the management of data authorizations. This work involves augmenting our metadata catalog and integrating it with the distributed compute engines, such as Spark and Presto, that power our cloud analytics.
Job Responsibilities:
• Build the foundation of a complex new platform – the services we deliver must be secure, robust, and scalable from day one.
• Develop the back-end service layer.
• Design and develop APIs.
• Integrate open-source distributed compute engines such as Spark and Presto with in-house-developed and third-party data management products.
• Take ownership of code through the entire SDLC, supporting production operations along with developing, testing, and deploying software.
• Ensure our software is aligned with product standards for security compliance and automated test coverage. Every member of our team writes product and test code.
• Participate actively in executing and evolving the team's advanced CI/CD process.
• Keep up with the evolving technology ecosystem that drives our product development – innovative analytics use cases, emerging compute engines, and new cloud services.
Qualifications:
• Development experience with Java, C#, Python, or a comparable language; solid foundation in object-oriented principles, multi-tier architectures, and API development.
• Experience building complex release and deployment automation to create and manage distributed compute stacks in the cloud – Kubernetes (k8s), CloudFormation, Terraform.
• Experience building event-based architectures on AWS using services such as S3, SQS, Lambda, SNS, EKS, EMR, and Postgres.
• Deep knowledge of one or more big data compute engines, such as Spark or Presto, including both end-user and administrative experience.
• Experience working in an agile environment where the team defines stories collaboratively, commits to delivering working software every iteration, and improves through retrospection.
• Proven ability to maintain healthy team practices such as automated unit and functional testing, code reviews, and coding standards.
• Strong organizational skills.
• Excellent verbal communication skills.
• Good problem-solving skills.
• Attention to detail.