Sorry, the offer is not available,
but you can perform a new search or explore similar offers:

DevOps Engineer

In the role of Front-end Engineer, you would be responsible for the design and development of individual product features for IBM storage products, IBM Sof...


IBM Careers - Maharashtra

Published 17 days ago

Package Consultant: SAP HANA SCM SD

Assists clients in the selection, implementation, and support of SD for SAP. Leads projects of various sizes as a team member or lead to implement new functionali...


IBM Careers - Maharashtra

Published 17 days ago

Java Developer

Responsible for designing and developing Java components using the Spring framework to implement transaction management. Will manage Java objects and enterprise i...


IBM Careers - Maharashtra

Published 17 days ago

Software Developer

As a C++ Developer you will: Be responsible for component/feature development and its integration with the complete product. Work and collaborate with team me...


IBM Careers - Maharashtra

Published 17 days ago

Sr. Software Data Engineer

Details of the offer

The Role

Portfolio Data Integration is part of the broader Addepar Platform team. The overall Addepar Platform provides a single-source-of-truth "data fabric" used by the Addepar product set, including a centralized and self-describing repository (a.k.a. the Data Lake), a set of API-driven data services, an integration pipeline, analytics infrastructure, warehousing solutions, and operating tools. The team is responsible for all data acquisition, conversion, cleansing, disambiguation, modeling, tooling, and infrastructure related to the integration of client portfolio data.
Addepar's core business relies on the ability to quickly and accurately ingest data from a variety of sources, including third-party data providers, custodial banks, data APIs, and even direct user input. Portfolio data integrations and feeds are a highly critical cross-section of this set, allowing our users to get automatically updated and reconciled information about their latest holdings on the platform.
As a software data engineer on this team, you will develop new data integrations and maintain existing processes to expand and improve our data platform. You'll add automation and functionality to our distributed data pipelines by writing PySpark code and integrating it with our Databricks Data Lake. As you gain experience, you'll contribute to increasingly challenging engineering projects within our platform, with the ultimate goal of dramatically improving the efficiency of data ingestion for Addepar. This is a crucial, highly visible role within the company. Your team plays a major part in growing and serving Addepar's client base with minimal manual effort required from our clients or from our internal data operations team.
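
To make the day-to-day concrete, here is a minimal, hypothetical sketch of the kind of PySpark ingestion step the role describes: reading a custodial holdings feed, applying basic cleansing, and appending the result to a Delta table as a Databricks-style Data Lake might store it. The file path, column names, and table name are illustrative assumptions, not Addepar's actual schema or pipeline.

    # Illustrative sketch only: ingest a hypothetical custodial holdings feed,
    # apply basic cleansing, and write it to a Delta table. All paths, column
    # names, and table names are assumptions made for this example.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("portfolio-feed-ingest").getOrCreate()

    raw = (
        spark.read
        .option("header", "true")
        .csv("/mnt/feeds/custodian_x/holdings_2024-12-26.csv")  # hypothetical feed path
    )

    cleansed = (
        raw
        .withColumn("as_of_date", F.to_date("as_of_date", "yyyy-MM-dd"))
        .withColumn("quantity", F.col("quantity").cast("double"))
        .withColumn("market_value", F.col("market_value").cast("double"))
        .dropna(subset=["account_id", "security_id", "as_of_date"])   # drop unusable rows
        .dropDuplicates(["account_id", "security_id", "as_of_date"])  # de-duplicate the feed
    )

    # Append to a Delta table; the "delta" format is assumed to be available,
    # as it is on Databricks clusters.
    cleansed.write.format("delta").mode("append").saveAsTable("portfolio.holdings_raw")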
What You'll Do

Manage individual project priorities, deadlines, and solutions.
Build pipelines that support the ingestion, analysis, and enrichment of financial data in partnership with business data analysts
Improve the existing pipeline to increase the throughput and accuracy of data
Develop and maintain efficient process controls and accurate metrics to ensure quality standards and organizational expectations are met (a minimal sketch of such a check appears after this list)
Partner with members of Product and Engineering to design, test, and implement new processes and tooling features that improve data quality as well as increase operational efficiency
Identify opportunities for automation and implement improvements
Understand data models and schemas, and work with other engineering teams to recommend extensions and changes
Mentor junior engineers as required
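
As a companion to the process-controls item above, the following hypothetical PySpark sketch shows one way a per-batch quality check might look: computing simple metrics on an ingested table and raising when they fall outside agreed thresholds. The table name, columns, and thresholds are assumptions for illustration only, not Addepar's actual controls.

    # Illustrative sketch only: a simple per-batch quality check of the kind
    # the role describes. Table, columns, and thresholds are assumptions.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("holdings-quality-check").getOrCreate()

    batch = spark.table("portfolio.holdings_raw").where(F.col("as_of_date") == "2024-12-26")

    metrics = batch.agg(
        F.count("*").alias("row_count"),
        F.avg(F.col("market_value").isNull().cast("int")).alias("null_market_value_rate"),
    ).first()

    # Thresholds here are illustrative; in practice they would come from configuration.
    if metrics["row_count"] == 0:
        raise ValueError("No rows ingested for batch 2024-12-26")
    if metrics["null_market_value_rate"] > 0.01:
        raise ValueError(
            f"market_value null rate {metrics['null_market_value_rate']:.2%} exceeds 1% threshold"
        )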

Who You Are

A computer science degree or equivalent experience
7+ years of professional software engineering experience
Competency with relevant programming languages (Java, Python)
Familiarity with relational databases and data pipelines
Experience and interest in data modeling
Knowledge of financial concepts (e.g., stocks, bonds, etc.) is encouraged but not necessary
Passion for the finance and technology space and solving previously intractable problems at the heart of investment management
Experience with any public cloud is highly desired (AWS preferred).
Experience with data lakes or data platforms like Databricks is highly preferred.


Nominal Salary: To be agreed

Source: Greenhouse
