Senior Data Engineer

Ad ID 319313 | Posting Date 12/17/2025 | Location: Bengaluru, India | Category: DIGITAL, TECHNOLOGY AND DATA (DT&D) | Job Status: Salaried Full Time

Job Purpose and Impact

We are seeking a highly experienced Senior Engineer with strong expertise across API engineering, platform development, and data engineering. The ideal candidate will design, build, and optimize scalable services, data pipelines, and digital platforms that power enterprise applications. This role requires deep technical proficiency, strong architectural thinking, and the ability to collaborate with cross-functional teams to deliver high-quality, reliable, and secure solutions.

Key Accountabilities

Core Responsibilities

API & Platform Engineering

  • Design and develop high‑performance, secure, and scalable APIs using Java, Spring, and Hibernate.
  • Build and maintain microservices-based architectures ensuring robustness, modularity, and efficiency.
  • Engineer and support digital platform components, infrastructure, and foundational services.
  • Implement and optimize CI/CD pipelines, automated deployments, and cloud-native release processes.
  • Integrate with API gateways and manage the API lifecycle, including versioning, logging, and monitoring.
  • Troubleshoot production issues, ensure service reliability, and provide ongoing technical support.
  • Write unit, integration, and performance tests; participate in peer code reviews to ensure code quality.
  • Maintain comprehensive technical documentation, architectural diagrams, and configuration details.

Data Engineering

  • Design, automate, and optimize scalable data pipelines for batch and real-time ingestion, transformation, and aggregation.
  • Develop and maintain ETL/ELT workflows using AWS Glue, Python, SQL, and other cloud-native tools.
  • Support migration and integration across multiple data platforms, such as Hadoop, Snowflake, AWS, and Oracle.
  • Implement data modeling, data warehousing, and performance optimization strategies.
  • Monitor, troubleshoot, and resolve issues across data workflows, ensuring reliability and data integrity.
  • Contribute to engineering best practices, code reviews, and CI/CD enhancements for data processes.
  • Partner with analysts, data scientists, and business stakeholders to understand requirements and deliver scalable solutions.

Qualifications

  • Bachelor’s degree in Computer Science, Engineering, or a related technical field, with a minimum of 9 years of work experience
  • Strong expertise in Java, the Spring Framework, and Hibernate
  • Hands-on experience with Python and scalable data processing
  • Proficiency with AWS services, including API Gateway, Lambda, EC2, S3, and IAM
  • Experience with CI/CD tooling (GitLab, Jenkins, CodePipeline, or similar)
  • Experience with cloud-native logging and monitoring tools (e.g., Datadog)
  • Strong SQL and advanced data transformation skills
  • Experience with Snowflake, the Hadoop ecosystem, and AWS Glue
  • Strong understanding of data modeling, data warehousing, and performance tuning
  • Familiarity with Oracle and BI tools such as Tableau

Architecture & Integration

  • Microservices architecture and API lifecycle management
  • Real-time and batch data pipeline integration
  • Strong understanding of distributed systems and scalable design
