Job Summary:
We are looking for a highly experienced Senior Talend Developer to join our data integration team. The ideal candidate will have over 10 years of experience designing, developing, and implementing ETL processes using Talend, along with strong knowledge of data warehousing concepts, cloud platforms, and enterprise-grade architecture. This role involves working closely with business stakeholders, data analysts, and other developers to ensure seamless data operations and deliver reliable business insights.
Key Responsibilities:
- Design, develop, and maintain scalable ETL solutions using Talend Data Integration / Talend Big Data tools.
- Collaborate with business analysts, data scientists, and stakeholders to gather requirements and translate them into robust technical solutions.
- Develop and maintain data pipelines for batch and real-time processing.
- Perform data cleansing, transformation, validation, and loading across multiple data sources and targets.
- Optimize ETL workflows for performance and scalability.
- Lead data migration and integration projects across cloud and on-prem environments.
- Create and maintain technical documentation, including data flow diagrams, ETL specifications, and deployment processes.
- Troubleshoot and resolve data-related issues and ETL job failures.
- Guide and mentor junior developers and contribute to best practices and standards within the team.
- Work with DevOps to manage CI/CD pipelines for ETL deployments.
Required Skills & Qualifications:
- 10+ years of hands-on experience in ETL development, including at least 6 years in Talend (Talend Open Studio, Talend Data Fabric, or Talend Big Data).
- Strong SQL skills and experience working with relational databases such as Oracle, SQL Server, MySQL, or PostgreSQL.
- Experience integrating with cloud platforms like AWS, Azure, or Google Cloud.
- Proficiency in data modeling, data warehousing, and data lakes.
- Experience with API integration, REST/SOAP Web Services, and JSON/XML data structures.
- Familiarity with big data ecosystems such as Hadoop, Spark, Hive, or Kafka is a strong plus.
- Knowledge of data governance, data quality frameworks, and metadata management.
- Experience with job scheduling tools like Autosys, Control-M, or Talend Job Conductor.
- Strong analytical and problem-solving skills.
- Excellent communication, collaboration, and documentation skills.