AgileEngine is one of the Inc. 5000 fastest-growing companies in the US and a top-3 ranked dev shop according to Clutch.
We create award-winning custom software solutions that help companies across 15+ industries change the lives of millions. If you like a challenging environment where you're working with the best and are encouraged to learn and experiment every day, there's no better place - guaranteed!
What you will do:
- Design, develop, and maintain real-time and batch ETL pipelines tailored to blockchain and market data;
- Build and maintain scalable backend systems and APIs for internal and external data consumption;
- Ensure data observability, quality, and reliability through standardized monitoring and alerting systems;
- Collaborate cross-functionally with data scientists, engineers, and product teams to support decision-making through accessible data;
- Improve existing systems and identify opportunities for automation and performance optimization;
- Contribute to open-source tooling and share knowledge with the broader Ethereum and Web3 ecosystem;
- Engage in technical discussions, architectural reviews, and code quality assurance;
- Provide technical leadership, mentorship, and guidance to other engineers, promoting a culture of innovation and technical excellence.
Must haves:
- 5+ years of experience as a data engineer or backend engineer with a strong foundation in data-intensive systems;
- Experience building and maintaining highly available data pipelines;
- Proficiency with Apache Kafka or other data streaming/pub-sub systems;
- Experience with data modeling, API design, and distributed systems;
- Proficiency in at least one modern backend language (Python, Go, Node.js/TypeScript);
- 1+ years of experience working with the Ethereum or Solana ecosystems;
- Strong understanding of blockchain data structures and transaction flows;
- Comfort working in a fast-paced, remote, and collaborative environment;
- Upper-intermediate English level.
Nice to haves:
- Experience with AWS, GCP, or Azure;
- Experience with Apache Airflow and other workflow orchestration tools;
- Experience with PostgreSQL, ClickHouse, or similar analytical databases;
- Familiarity with Docker, Kubernetes, and CI/CD practices;
- Prior exposure to on-chain analytics, DeFi protocols, or smart contracts;
- Experience contributing to open-source data tooling.
The benefits of joining us:
- Professional growth: Accelerate your professional journey with mentorship, TechTalks, and personalized growth roadmaps.
- Competitive compensation: We match your ever-growing skills, talent, and contributions with competitive USD-based compensation and budgets for education, fitness, and team activities.
- A selection of exciting projects: Join projects with modern solutions development and top-tier clients, including Fortune 500 enterprises and leading product brands.
- Flextime: Tailor your schedule for an optimal work-life balance, with the option of working from home or going to the office - whatever makes you the happiest and most productive.
Next steps after you apply: The next steps of your journey will be shared via email within a few hours.
Please check your inbox regularly and watch for updates from our internal applicant site, LaunchPod, which will guide you through the process.