Carrix / SSA Marine is committed to making our employees feel welcome and respected. Our team is unique and our approach successful because we have fostered an environment that values varying backgrounds, perspectives, and experiences, and takes pride in how the collective delivers value to our customers and partners.
Summary / Objective:
As a Data Engineer at Carrix / SSA Marine, you are charged with making the appropriate data available to various data consumers, including data and business analysts, integrators, and citizen data scientists. You build, manage, and optimize data pipelines and move them effectively into production for key data and analytics consumers.
Essential Responsibilities:
1. Embrace and implement DevOps principles and practices; identify and assess operations opportunities such as automation and provisioning.
2. Learn to use innovative, modern tools, techniques, and architectures to automate the most common, repeatable, and tedious data preparation and integration tasks.
3. Assist with renovating the data management infrastructure to drive automation.
4. Collaborate across teams and departments to refine data and data-consumption requirements for various initiatives.
5. Propose appropriate data ingestion, preparation, integration, and operationalization techniques to address data requirements.
6. Perform data conversions, imports, and exports within and between internal and external software systems.
7. Create data transformation processes (ETL, SQL stored procedures, etc.) to support moderately complex to complex business systems and operational data flows.
Requirements:
Key Knowledge, Skills & Abilities:
1. Knowledge of programming languages such as R, Perl, Python, Java, C++, C#, and Scala.
2. Knowledge of commercial ETL/ELT platforms such as Boomi, Informatica, Alteryx, AWS Glue, and Azure Data Factory (ADF).
3. Knowledge of modern data storage and access technologies, including caching and the application of NoSQL, key/value, and RDBMS datastores.
4. Deep knowledge of SQL, PL/SQL, and Oracle.
5. Knowledge of ETL processing and change data capture (CDC) technologies.
6. Knowledge of big data concepts, analysis, frameworks, API development, and visualization.
7. Knowledge of streaming data technologies such as Apache Spark, Flink, and Kafka.
8. Knowledge of business intelligence tools such as Tableau, Qlik, and Power BI.
9. Knowledge of data management architectures (data warehouse, data lake, data hub) and the supporting processes, such as data integration, governance, and metadata management.