Company centralizes and simplifies all the metrics used by Shopify sellers, enabling them to save time and make more money. We help e-commerce businesses accurately track and forecast their key metrics, from changes in profit to operational performance. Our goal is to infuse AI into e-commerce, giving shops an accurate picture of their business so they can make the choices that boost their bottom line.
We are looking for talented Data Engineers to join our growing team. This is a hybrid role, with 3 days a week in-office required (Jerusalem or Bnei Brak offices).
Our Data Engineers build and operate the data systems that deliver value to end users and internal users alike: expanding and optimizing our data pipelines and data services, ensuring data integrity, and driving a data-driven culture. You will also support our software developers, database architects, data analysts, and data scientists on data initiatives, and ensure a consistent, optimal data delivery architecture across ongoing projects.
Our ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. You must be self-directed, comfortable supporting the data needs of multiple teams, systems, and products, and at ease in a fast-paced, frequently pivoting environment.
Responsibilities
* Build and maintain our data repositories with timely and quality data
* Build and maintain data pipelines from internal databases and SaaS applications
* Create and maintain architecture and systems documentation
* Write maintainable, performant code
* Apply the DevOps, DataOps, and FinOps philosophies in everything you do
* Collaborate with Data Analysts and Data Scientists to drive efficiencies for their work
* Collaborate with other functions to ensure data needs are addressed
* Constantly search for automation opportunities
* Constantly improve product quality, security, and performance
* Continually keep up with advancements in data engineering practices
Qualifications
* At least 3 years of professional experience building and maintaining production data systems in cloud environments like GCP
* Professional experience with JavaScript and/or other modern programming languages
* Deep, demonstrable understanding of SQL and analytical data warehouses
* Experience with NoSQL databases, e.g., Elasticsearch, MongoDB, Firestore, Bigtable
* Hands-on experience with data pipeline tools (e.g., Dataflow, Airflow, dbt)
* Strong data modeling skills
* Experience with MLOps - an advantage
* Familiarity with agile software development methodologies
* Ability to work 3 days a week in-office (Jerusalem or Bnei Brak)