
Khujaev Utkir
Senior/Middle Data Engineer
- Python
- SQL
- English
- Snowflake
- Big Data
- Databricks
- Azure Data Factory
- English: B2 (Upper-Intermediate)
Responsibilities
- Design, develop, and maintain data pipelines using Databricks for big data processing and analytics.
- Implement and manage ETL/ELT workflows with Azure Data Factory and Alteryx.
- Build and optimize data models and queries in Snowflake to ensure scalability and performance.
- Ensure data quality, integrity, and governance across all data platforms.
- Collaborate with business stakeholders to understand data requirements and translate them into scalable technical solutions.
- Automate workflows and implement best practices for data engineering, monitoring, and performance tuning.
- Support advanced analytics and machine learning initiatives by preparing and delivering high-quality datasets.
Requirements
- Strong hands-on experience with Databricks (PySpark, Spark SQL, Delta Lake).
- Proficiency in Snowflake data warehousing, including schema design, query optimization, and performance tuning.
- Experience with Azure Data Factory (ADF) and Alteryx for workflow orchestration and ETL development.
- Strong SQL and Python skills.
- Familiarity with cloud platforms (preferably Azure; AWS/GCP is a plus).
- Understanding of data governance, security, and compliance standards.
- Ability to work independently and in cross-functional teams.
- Strong problem-solving and communication skills.
Nice-to-Have
- Experience with BI tools such as Power BI or Tableau.
- Knowledge of CI/CD pipelines, DevOps, or DataOps practices.
- Exposure to Agile/Scrum methodologies.