About the role: This role involves designing, building, and maintaining solutions that will be deployed on public and private clouds and on local workstations. You'll master distributed systems concepts such as observability, identity, and tracing, and work with Kubernetes alongside other technologies.
About Web Shop Manager & PartsLogic
Web Shop Manager is a company dedicated to providing innovative solutions for the automotive parts industry. We focus on e-commerce, search & discovery, and fitment and vehicle data to create powerful e-commerce experiences marked by efficiency, accuracy, and customer satisfaction, and we aim to enhance the overall experience for both buyers and sellers.
At Web Shop Manager & PartsLogic, we rely on a dynamic team of engineers to solve the many challenges of our rapidly evolving industry and technical stack. We're seeking a data engineer who is ready to work with new technologies and architectures in a forward-thinking organization that's always pushing boundaries. This person will be responsible for developing and maintaining high-performing SaaS solutions and will have complete, end-to-end ownership of projects.
As a data engineer on the Data Platform team, you will build, optimize, and grow a large data platform that directly supports the PartsLogic search and discovery engine as well as other PartsLogic products. You'll gain hands-on experience with a wide range of systems in the data platform ecosystem, and your work will have a direct, significant impact on the company's core products and on millions of users.
Responsibilities
- Participate in all aspects of agile software development, including design, implementation, and deployment.
- Design, build, and maintain data transformations efficiently and reliably for different purposes (e.g., indexing, reporting, growth analysis, multi-dimensional analysis).
- Design, implement, and maintain reliable, scalable, robust, and extensible big data systems that support core products and the business.
- Design, develop, and support data pipelines and workflows that integrate structured and unstructured data from various sources across our customers' landscape.
- Assess and implement any customizations and custom data wrangling needed to transform customer datasets into the form best suited for processing by our ML models.
- Monitor data processing and machine learning workflows to ensure customer data is successfully processed by our ML models, debugging and resolving any issues along the way.
- Generate the data artifacts and analytics needed to assess the quality of results produced by different parts of the AI workflow and ensure they meet the desired quality standards.
- Collaborate closely with the engineering team to assess model performance and improve results as needed.
- Collaborate closely with product engineering as needed to resolve issues encountered during customer deployment and onboarding.
- Collaborate closely with customer success managers and account managers to maintain close customer relationships and ensure customer satisfaction.
- Develop tooling and playbooks that make key technical delivery activities repeatable, scalable, and automated.
- Develop and maintain RAG pipelines and vector embeddings for AI and semantic search.
- Establish solid design and engineering best practices for engineers as well as non-technical staff.
- Develop and maintain scalable, high-performing software solutions using the specified tech stack.
- Architect and provide guidance on building data pipelines and data lakes.
- Collaborate with cross-functional teams to design and implement new features.
- Ensure code quality through functional testing.
- Troubleshoot and debug issues to ensure smooth functionality.
- Collaborate across time zones via Slack, GitHub comments, documents, and frequent videoconferences.
- Stay up to date with industry trends and best practices.
Required skills and qualifications
- Experience in the automotive industry and with automotive parts standards such as ACES and PIES.
- Experience performing data analysis, data ingestion, and ETL (extract, transform, load).
- Experience with big data technologies (Airflow, Hadoop, MapReduce, Hive, Spark, Hive Metastore, Presto, Flume, Kafka, ClickHouse, Flink, etc.).
- Experience with Elasticsearch and OpenSearch.
- Experience with vector databases (Qdrant, OpenSearch, Milvus).
- Experience with AWS Bedrock and Amazon Q.
- Excellent problem-solving and communication skills.
- Experience working in a small-team startup environment.
- Experience working on or implementing e-commerce platforms or solutions.
- Experience with AWS and cloud computing services.
- Ability to write and maintain clear, concise technical documentation.
- Experience with search technologies such as full-text lexical search and/or semantic vector-based search.
- Strong understanding of Git and experience with CI/CD pipelines.
- Experience building SaaS dashboards with graphs to display analytics and performance KPIs.
- Experience working with RESTful and GraphQL APIs.
- Knowledge of Python, R, and Rust.
- Excellent ability to write efficient SQL queries.
- Excellent debugging and optimization skills.
Preferred skills and qualifications
- Bachelor's degree in computer science.