At Softgic we work with the coolest: with those who build, those who love what they do, and those who bring 100% attitude, because that's our #Cooltura. Join our purpose of making life easier with technology and be part of our team as a Data Engineer.
Compensation:
USD 20 - 28/hour.
Location:
Remote (anywhere).
Mission of Softgic:
At Softgic S.A.S. we work for the digital and cognitive transformation of our clients. Aware that quality is essential to us, we incorporate the following principles into our policy:
* Deliver quality products and services.
* Achieve the satisfaction of our internal and external clients.
* Encourage our team to value training so they can grow professionally and personally through development plans.
* Comply with applicable legal and regulatory requirements.
* Promote continuous improvement of the quality management system.
What makes you a strong candidate:
* You are proficient in Azure Data Lake, Azure SQL, ELT (Extract, Load, Transform), and Python.
* English: native or fully fluent.
* Spanish: native or fully fluent.
Responsibilities:
* Design, develop, and maintain scalable data architectures using SQL Server, Azure SQL Database, and Snowflake on Azure.
* Implement and manage data pipelines using Azure Data Factory, supporting ETL and ELT processes.
* Work with SQL Change Data Capture (CDC) together with Debezium to enable real-time and incremental data processing.
* Work with streaming technologies such as Kafka and Azure Event Hubs to deliver near-real-time analytics and reporting.
* Manage Azure Data Lake to store and process structured and unstructured data efficiently.
* Design and optimize Data Vault and star schema models for data warehousing solutions.
* Develop and maintain ETL/ELT workflows using Python and SQL-based tools.
* Leverage Databricks for big data processing, machine learning, and advanced analytics.
* Ensure data quality, governance, and security across multiple data environments.
* Build and maintain analytical reports using Sigma.
* Collaborate with business stakeholders and data analysts to ensure data solutions align with business needs.
* Monitor and troubleshoot data pipelines to ensure reliability, accuracy, and efficiency.
* Support disaster recovery planning and high-availability data strategies.
* Stay up to date with emerging data engineering technologies and best practices.
Requirements:
* 5-7 years of experience as a data architect or senior-level data engineer.
* Expertise in SQL Server (SSMS, T-SQL, SSIS, SSRS, SSAS) and Azure SQL Database.
* Strong experience in data modeling, including Data Vault and star schema methodologies.
* Proficiency in ETL/ELT development and data pipeline management.
* Hands-on experience with Snowflake on Azure and Databricks for big data processing.
* Experience working with streaming technologies (e.g., Kafka, Flink, Event Hubs).
* Strong analytical and problem-solving skills with a focus on data integrity and scalability.
* Knowledge of Python for data transformation, automation, and analytics is a bonus.
Additional requirements:
* Ability to sit or stand for extended periods of time as required.
* Ability to work in a fast-paced, deadline-driven environment with minimal supervision.
Benefits:
* We're certified as a Great Place to Work.
* Opportunities for advancement and growth.
* Paid time off.
* Support for formal education and certifications.