OLSYS Ltd provides full-service solutions for mid-market and enterprise organizations.
As an enterprise software development company, we build long-term partnerships, helping our clients accelerate their digital experiences with reasonable IT investments.
Our tailored approach, e-commerce focus, and flexible solutions allow us to design, develop, and deliver scalable, integrated commerce platforms that drive profit and grow the business.
17+ years of experience, 100+ global clients, 200+ top-class engineers.
We are looking for a Senior Data Engineer with strong experience in Databricks to help establish the foundational architecture and initial delivery steps for new data products. You will work on canonical data model definition, data warehouse implementation, and the build-out of our modern data platform.
The Data Engineer will work closely with one of the customer's Data Architects to ensure alignment of design, architecture, and delivery.
Requirements:
- 6+ years of experience as a Data Engineer, including 2-3+ years of hands-on experience with Databricks.
- Strong experience with PySpark, Delta Lake, SQL, and distributed computing.
- Experience implementing the medallion architecture (Bronze/Silver/Gold); see the sketch after this list.
- Solid understanding of data warehousing, canonical modelling, ETL/ELT patterns.
- Experience with CI/CD, code versioning, and automated testing for data pipelines.
- Strong communication skills and ability to collaborate across architecture, engineering, and business teams.
- Ability to translate high-level architecture into actionable engineering tasks.
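For context, the medallion pattern referenced above typically moves data from raw (Bronze) to cleansed and conformed (Silver) Delta tables before modelling the Gold layer. The sketch below is illustrative only, not a description of any specific customer environment: the paths, table names, and columns (/mnt/raw/orders/, bronze.orders, silver.orders, order_id, order_ts) are hypothetical.

```python
# Illustrative only: a minimal Bronze -> Silver step in PySpark with Delta Lake.
# All paths, table names, and columns below are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically in Databricks notebooks

# Bronze: ingest raw files as-is, preserving source fidelity and capture time.
bronze = (
    spark.read.format("json")
    .load("/mnt/raw/orders/")                      # hypothetical landing path
    .withColumn("_ingested_at", F.current_timestamp())
)
bronze.write.format("delta").mode("append").saveAsTable("bronze.orders")

# Silver: deduplicate, filter out invalid records, and conform types for downstream modelling.
silver = (
    spark.read.table("bronze.orders")
    .dropDuplicates(["order_id"])
    .filter(F.col("order_id").isNotNull())
    .withColumn("order_ts", F.to_timestamp("order_ts"))
)
silver.write.format("delta").mode("overwrite").saveAsTable("silver.orders")
```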
Nice to have:
- Experience with cloud platforms (Azure preferred; AWS/GCP also welcome).
- Knowledge of data governance, data quality frameworks, metadata management.
- Experience participating in data transformation programs or platform modernizations.
- Experience building user-facing data products, dashboards, or APIs.
English level: Upper intermediate
Responsibilities:
- Clarify the purpose, scope, and functional requirements for new data products by working closely with product owners and business stakeholders.
- Collaborate with the Data Architect to confirm component-level (C3) designs and define end-to-end data transformation logic.
- Design and implement the first-phase technology stack for the Bronze and Silver layers (and future Gold), including ingestion, transformation, and modelling.
- Define detailed work items and delivery plans, ensuring proper cross-team collaboration to bring initial data products to customer-facing interfaces.
- Build scalable, reusable, and well-documented data pipelines using Databricks, PySpark, Delta Lake, and distributed processing frameworks.
- Ensure high engineering standards around testing, observability, data quality, versioning, and operational readiness; see the testing sketch after this list.
- Provide feedback to Data Leadership and the Transformation Steering Group regarding skill needs, organizational readiness, risks, and blockers.
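As one example of the engineering standards mentioned above, transformation logic can be unit-tested locally with pytest before it runs on Databricks. This is a minimal sketch under assumed names (to_silver_orders, order_id, amount); it is not the customer's actual test setup.

```python
# Illustrative only: unit-testing a hypothetical Silver-layer transformation with pytest and local PySpark.
import pytest
from pyspark.sql import SparkSession, functions as F

def to_silver_orders(df):
    """Deduplicate and drop records without an order_id (hypothetical cleansing rule)."""
    return df.dropDuplicates(["order_id"]).filter(F.col("order_id").isNotNull())

@pytest.fixture(scope="session")
def spark():
    # Small local Spark session for fast, isolated tests.
    return SparkSession.builder.master("local[1]").appName("tests").getOrCreate()

def test_to_silver_orders_removes_duplicates_and_nulls(spark):
    raw = spark.createDataFrame(
        [("o-1", 10.0), ("o-1", 10.0), (None, 5.0)],
        ["order_id", "amount"],
    )
    result = to_silver_orders(raw)
    assert result.count() == 1
    assert result.first()["order_id"] == "o-1"
```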