Data Engineer, I

Details of the offer

Remote Work: Hybrid
Overview:
At Zebra, we are a community of innovators who come together to create new ways of working to make everyday life better. United by curiosity and care, we develop dynamic solutions that anticipate our customers' and partners' needs and solve their challenges.

Being a part of Zebra Nation means being seen, heard, valued, and respected. Drawing from our diverse perspectives, we collaborate to deliver on our purpose. Here you are a part of a team pushing boundaries to redefine the work of tomorrow for organizations, their employees, and those they serve.

You have opportunities to learn and lead at a forward-thinking company, defining your path to a fulfilling career while channeling your skills toward causes that you care about – locally and globally. We've only begun reimagining the future – for our people, our customers, and the world.

Let's create tomorrow together.

The Data Engineer will be responsible for understanding the client's technical requirements and for designing and building data pipelines to support them. In this role, the Data Engineer will develop the solution and also oversee other Engineers' development. The role requires strong verbal and written communication skills and the ability to communicate effectively with the client and the internal team. A strong understanding of databases, SQL, cloud technologies, and modern data integration and orchestration tools such as Azure Data Factory (ADF), Informatica, and Airflow is required to succeed in this role.
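For illustration only, the sketch below shows the kind of minimal Airflow DAG this role would build and orchestrate. The DAG id, task names, and schedule are hypothetical and assume Airflow 2.4 or later; this is not a Zebra-specific pipeline.

# Illustrative only: a minimal Airflow DAG of the kind described above.
# The dag_id, task names, and schedule are hypothetical (assumes Airflow 2.4+).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders():
    # Placeholder: pull raw data from a source system.
    print("extracting raw orders")


def transform_orders():
    # Placeholder: apply cleansing and business rules.
    print("transforming orders")


with DAG(
    dag_id="orders_daily",          # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_orders)
    transform = PythonOperator(task_id="transform", python_callable=transform_orders)
    extract >> transform            # extract runs before transform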
Responsibilities:
• Play a critical role in the design and implementation of data platforms for the AI products.
• Develop productized and parameterized data pipelines that feed AI products leveraging GPUs and
CPUs.
• Develop efficient data transformation code in Spark (Python and Scala) and Dask (a minimal sketch follows this list).
• Build workflows to automate data pipelines using Python and Argo.
• Develop data validation tests to assess the quality of the input data.
• Conduct performance testing and profiling of the code using a variety of tools and techniques.
• Build data pipeline frameworks to automate high-volume and real-time data delivery for our data
hub.
• Operationalize scalable data pipelines to support data science and advanced analytics.
• Optimize customer data science workloads and manage cloud services costs/utilization.
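As referenced in the list above, the following is a minimal PySpark sketch of the kind of transformation and data validation work described. The input path, column names, and the 1% error threshold are hypothetical, chosen only to make the example self-contained.

# Illustrative only: a minimal Spark (PySpark) transformation with a simple
# data validation check. Paths, column names, and the threshold are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_transform").getOrCreate()

# Read raw input (placeholder path).
raw = spark.read.parquet("s3://example-bucket/raw/orders/")

# Transformation: drop malformed rows and derive a daily aggregate.
clean = raw.filter(F.col("order_id").isNotNull() & (F.col("amount") > 0))
daily = clean.groupBy("order_date").agg(F.sum("amount").alias("total_amount"))

# Data validation test: fail fast if too many input rows were malformed.
bad_ratio = 1 - clean.count() / max(raw.count(), 1)
assert bad_ratio < 0.01, f"too many malformed rows: {bad_ratio:.2%}"

daily.write.mode("overwrite").parquet("s3://example-bucket/curated/daily_orders/")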
Qualifications:
• Minimum Education:
o Bachelor's, Master's, or Ph.D. degree in Computer Science or Engineering.
• Minimum Work Experience (years):
o 1+ years of experience programming with at least one of the following languages: Python,
Scala, Go.
o 1+ years of experience in SQL and data transformation.
o 1+ years of experience in developing distributed systems using open source technologies
such as Spark and Dask.
o 1+ years of experience with relational databases or NoSQL databases running in Linux
environments (MySQL, MariaDB, PostgreSQL, MongoDB, Redis).
• Key Skills and Competencies:
o Experience working with AWS, Azure, or GCP environments is highly desired.
o Experience with data models in the Retail and Consumer Products industry is desired.
o Experience working on agile projects and an understanding of agile concepts are desired.
o Demonstrated ability to learn new technologies quickly and independently.
o Excellent verbal and written communication skills, especially in technical communications.
o Ability to work and achieve stretch goals in a very innovative and fast-paced environment.
o Ability to work collaboratively in a diverse team environment.
o Ability to telework.
o Expected travel: None.
To protect candidates from falling victim to online fraud involving fake job postings and employment offers, please be aware that our recruiters will always connect with you ****** accounts. Applications are accepted only through our applicant tracking system, and personal identifying information is collected only through that system. Our Talent Acquisition team will not ask you to provide personal identifying information via e-mail or outside of the system. If you are a victim of identity theft, contact your local police department.


Nominal Salary: To be agreed
