Principal Data Engineer

Wrocław
zł 28 000 – 33 000 gross / month / B2B
Data Science
Remote

We are looking for someone who thrives in an extremely fast-paced, collaborative environment and is passionate about sound engineering, getting things done, and building a great product. Your input will be valued, and you will play a significant role in building and transforming the client's wealth management platform using the technologies listed below.

As a Principal Data Engineer, you will be responsible for designing and implementing scalable data pipelines, ensuring data quality, and optimizing data processing for better performance. You will collaborate closely with cross-functional teams to understand business requirements and translate them into technical solutions. Your expertise in data engineering best practices, coupled with your ability to mentor and guide junior team members, will be crucial in fostering a culture of continuous learning and improvement within the organization.

Your ability to effectively communicate technical concepts, provide constructive feedback, and facilitate knowledge sharing among team members at all levels will be highly valued. By leveraging your experience and technical acumen, you will contribute to the professional growth and development of the entire data engineering team.

We are seeking a proactive individual who can take ownership of a large part of our data platform, drive innovation, and continuously evolve the data architecture to support the growing needs of the business. Your strong problem-solving skills and ability to navigate complex data challenges will be essential to the success of our data-driven initiatives.

Tech stack:

  • Google Cloud
  • BigQuery
  • GCS
  • Airbyte
  • SQL
  • dbt
  • Python
  • Flask
  • FastAPI
  • Pandas
  • JavaScript (React)
  • GraphQL
  • Terraform
  • GitHub Actions
  • Sigma Computing
  • OpenMetadata
  • Monte Carlo Data
  • OpenAI
  • LLMs

About the project:

The client is a well-known venture capital firm that runs its fund with Salesforce as the master data repository and Google Drive for document storage. Different departments use multiple siloed systems for data management, ranging from Excel spreadsheets to external commercial tools and custom-built user-facing apps.

The major challenge is the time it takes to reach final decisions: analysts cannot rely on the existing data because of possible human error and complicated data management. The solution is to build a unified data platform, integrated with Salesforce, fund accounting, and the other existing systems, that efficiently transforms financial, legal, and operations data into data marts ready for interaction and comprehensive visualization with BI tools on web and mobile.
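
For illustration only (the dataset, column names, and build_deals_mart function below are hypothetical, not taken from the client's systems), a single transformation step of the kind described above, normalizing raw Salesforce deal records into an analytics-ready data mart table with Pandas, might look like this:

    import pandas as pd

    # Raw export as it might arrive from Salesforce (columns are illustrative).
    raw_deals = pd.DataFrame({
        "Deal_Id": ["D-001", "D-002", "D-002"],  # note the duplicated record
        "Portfolio_Company": ["Acme", "Globex", "Globex"],
        "Amount_USD": ["1,200,000", "750,000", "750,000"],
        "Close_Date": ["2024-03-01", "2024-05-15", "2024-05-15"],
    })

    def build_deals_mart(raw: pd.DataFrame) -> pd.DataFrame:
        """Deduplicate, fix types, and rename columns for the BI layer."""
        return (
            raw.drop_duplicates(subset="Deal_Id")
               .assign(
                   amount_usd=lambda df: df["Amount_USD"]
                       .str.replace(",", "").astype(float),
                   close_date=lambda df: pd.to_datetime(df["Close_Date"]),
               )
               .rename(columns={"Deal_Id": "deal_id",
                                "Portfolio_Company": "portfolio_company"})
               [["deal_id", "portfolio_company", "amount_usd", "close_date"]]
        )

    print(build_deals_mart(raw_deals))

In the actual platform, a step like this would more likely live as a dbt model running on BigQuery; the Pandas version simply keeps the sketch self-contained.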

Skills & Experience

  • Experience with Python, SQL, data systems development life cycle, and preferably full-stack development.
  • Experience using different databases (for example, PostgreSQL, BigQuery, Redis), including query optimization techniques.
  • Practical experience with Google Cloud services.
  • Experience with unit and integration testing methodologies.
  • Experience working with Docker.
  • Proven background in collaborative efforts with product managers and fellow engineers, particularly within distributed multicultural teams.
  • Fluent English.

Nice to have

  • Great communication skills, a sense of humor, and an interest in personal and corporate finance.
  • Experience with GraphQL and its integration with data platforms.
  • Familiarity with data observability tools like Monte Carlo Data and data discovery tools like OpenMetadata.
  • Knowledge of the financial domain and an understanding of wealth management and investment concepts.
  • Contributions to open-source projects or personal projects showcasing data engineering skills.

Responsibilities

  • Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
  • Design, implement, and maintain scalable and reliable data pipelines, data warehouses, and data lakes.
  • Develop and enforce best practices for data governance, data quality, and data security.
  • Mentor and guide engineers, fostering a culture of continuous learning and improvement.
  • Help maintain code quality, organization, and automation.
  • Collaborate with other teams as needed to ensure interoperability, stability, and code reusability.
  • Optimize data processing and querying for better performance and cost-efficiency.
  • Implement monitoring, logging, and alerting mechanisms to ensure the health and reliability of the data platform.
  • Stay up-to-date with the latest trends and technologies in the data engineering field (Modern Data Stack) and propose improvements to the existing architecture.

About the company:

Proxet is an outsourcing company with 10–50 employees, founded in 2009.
