Data Engineer
Key Responsibilities:
Data Analysis and Interpretation:
- Utilize Microsoft Azure Stack tools and services to efficiently analyze large datasets.
- Implement Python and SQL-based techniques to extract actionable insights from structured and unstructured data.
API Ingestion and Integration:
- Design and implement efficient API ingestion processes to collect and integrate data from various sources.
- Collaborate with cross-functional teams to ensure seamless integration of external data into the analytics ecosystem.
Databricks and Data Processing:
- Leverage Databricks for scalable and distributed data processing tasks.
- Develop and optimize Python scripts for complex data transformations and manipulations.
- Maintain well-documented notebooks for data exploration, analysis, and visualization.
Power BI Dashboard Development:
- Utilize Power BI to build interactive dashboards for data visualization and reporting.
- Collaborate with stakeholders to gather requirements and design informative, visually appealing dashboards.
Data Quality Assurance:
- Implement data quality checks and validations to ensure accurate and reliable analysis.
- Troubleshoot and resolve data quality issues promptly, prioritizing them effectively.
Collaboration and Communication:
- Work closely with data engineers, business analysts, and key stakeholders to understand data requirements and deliver meaningful insights.
- Effectively communicate findings to both technical and non-technical audiences.
Solution Design and Delivery:
- Design and deliver regulatory reporting solutions for the finance team.
Requirements
Core Skills:
- Proficiency in Python for data processing and analysis.
- Experience with Databricks for large-scale data processing.
- CI/CD pipelines, ideally using Azure DevOps.
- Version control with Git.
- Cloud computing, with a focus on Microsoft Azure.
- SQL for database interactions and data transformations.
Desirable Skills:
- .NET (C#) development.
- Experience with Azure Data Lake.
- Kubernetes for container orchestration.
- Monitoring and alerting systems.
- Infrastructure as Code (e.g., Terraform).
- Knowledge of financial or FinTech domains, particularly in regulatory reporting.
Additional Requirements:
- Adaptability and agility in a fast-paced, evolving business environment.
- Strong knowledge and experience in identifying and resolving data quality issues.
- Familiarity with various data domains in the financial industry is a plus.
Apply for this job
Does this job fit your talents and seem right for you? Don't hesitate to apply online now.

Facts about the job
Field
IT / Telecom / Internet
Country
Romania
Location
Bucuresti
Contract type
permanent
Job-ID
428159
Company
Talentor Romania
Contact person

Filis Culamet
Technical Business Unit Manager