At Cargill, we care about your safety and want your job search experience to be a positive one. Unfortunately, there are scams out there where individuals pretend to be Cargill recruiters to try and collect personal information or request payment. Please know that Cargill will never ask you for money during the hiring process, and in most cases, we only accept applications through our official careers site, with the exception of some roles in our production plants. If something doesn't feel right or you have questions, don't hesitate to contact us. To learn more, visit our Notice on Fraudulent Job Offers.

Associate Data Engineer

Apply Now
Job ID: 323610 · Date posted: 04/14/2026 · Location: Bengaluru, India · Category: DIGITAL TECHNOLOGY AND DATA (DT&D) · Job Status: Salaried Full Time

Job Purpose and Impact

The Associate Data Engineer assists with the design, building and maintenance of routine data systems that enable data analysis and reporting. Under close supervision, this role collaborates with the team to ensure that large sets of data are efficiently processed and made accessible for decision making.

Key Accountabilities

  • DATA & ANALYTICAL SOLUTIONS: Assists with the development of basic data products and solutions using big data and cloud-based technologies, supporting scalable, sustainable and robust designs.
  • DATA PIPELINES: Collaborates on the development of basic streaming and batch data pipelines that ingest data from various sources, transform it into usable information and move it to data stores such as data lakes and data warehouses.
  • DATA SYSTEMS: Assists with the implementation of existing data systems and architectures in support of improvement and optimization activities.
  • DATA INFRASTRUCTURE: Supports the preparation of data infrastructure for the efficient storage and retrieval of data.
  • DATA FORMATS: Helps implement appropriate data formats to improve data usability and accessibility across the organization.
  • STAKEHOLDER MANAGEMENT: Gathers requirements from cross-functional partners, assisting the team to ensure that data solutions meet the functional and non-functional needs of those partners.
  • DATA FRAMEWORKS: Conducts basic testing of new concepts and assists with the implementation of data engineering frameworks and architectures to support the improvement of data processing capabilities and analytics initiatives.
  • AUTOMATED DEPLOYMENT PIPELINES: Collaborates on the implementation of automated deployment pipelines to improve the efficiency of code deployments with fit-for-purpose governance.
  • DATA MODELING: Performs basic data modeling aligned with the datastore technology to ensure sustainable performance and accessibility.
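To make the pipeline accountabilities above concrete, here is a minimal, purely illustrative sketch of an ingest–transform–load flow. It is not part of the role description: the field names, file layout and miniature "data lake" directory are hypothetical, and in practice tools like Kafka, Glue or Spark would replace the plain-Python steps.

```python
import csv
import io
import json
import pathlib
import tempfile

def ingest(csv_text):
    """Ingest raw CSV rows from a (hypothetical) source system."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Transform raw records into typed, analysis-ready information."""
    return [
        {
            "order_id": int(r["order_id"]),
            "amount_usd": round(float(r["amount"]), 2),
            "region": r["region"].strip().upper(),
        }
        for r in rows
    ]

def load(records, lake_dir):
    """Move transformed records into a toy data-lake store as JSON lines."""
    out = pathlib.Path(lake_dir) / "orders.jsonl"
    out.write_text("\n".join(json.dumps(r) for r in records))
    return out

raw = "order_id,amount,region\n1,10.5,apac \n2,3.25,emea\n"
lake = tempfile.mkdtemp()
path = load(transform(ingest(raw)), lake)
print(path.read_text())
```

The same three stages apply whether the pipeline is batch (as here) or streaming; only the transport and orchestration layers change.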

Qualifications

  • Bachelor's degree and two or more years of relevant experience.
  • CLOUD ENVIRONMENTS: Basic familiarity with major cloud platforms (AWS, GCP, Azure) and interest in learning how cloud services support data pipelines and storage.
  • DATA ARCHITECTURE: Introductory understanding of modern data architectures such as data lakes and lakehouses, with exposure to concepts like ingestion, governance, and basic data modeling.
  • DATA INGESTION: Hands-on experience or coursework using data ingestion tools (e.g., Kafka, AWS Glue) and awareness of common data storage formats like Parquet or Iceberg.
  • DATA STREAMING: Foundational understanding of streaming concepts and exposure to tools such as Kafka or Flink.
  • DATA MODELING: Experience writing SQL and supporting data transformation tasks. Familiarity with modeling concepts (e.g., SCDs, schema evolution) and introductory experience with tools like dbt, Airflow, or AWS Glue.
  • DATA TRANSFORMATION: Basic experience using Spark or similar frameworks for data processing, with a willingness to learn more advanced topics like performance tuning and debugging.
  • PROGRAMMING: Proficiency in at least one programming language (typically Python) and ability to write clean, reusable code. Comfortable with SQL basics and working toward stronger query optimization skills.
  • DEVOPS: General awareness of DevOps practices such as version control (Git) and basic CI/CD concepts. Interest in learning deployment and automation workflows.
  • DATA GOVERNANCE: Foundational understanding of data quality, security, and privacy principles. Awareness of best practices for handling data responsibly.
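As a small illustration of the SQL and data-transformation skills listed above (not a task from the role itself; the schema and figures are made up, and at scale this kind of model would be built with dbt, Airflow or Spark rather than an in-memory database):

```python
import sqlite3

# In-memory database standing in for a warehouse; schema is hypothetical.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE raw_sales (sale_date TEXT, region TEXT, amount REAL);
INSERT INTO raw_sales VALUES
  ('2026-04-01', 'APAC', 100.0),
  ('2026-04-01', 'APAC', 50.0),
  ('2026-04-02', 'EMEA', 75.0);
""")

# A basic transformation: aggregate raw events into a reporting table.
con.execute("""
CREATE TABLE sales_by_region AS
SELECT region, COUNT(*) AS n_sales, SUM(amount) AS total_amount
FROM raw_sales
GROUP BY region;
""")

for row in con.execute("SELECT * FROM sales_by_region ORDER BY region"):
    print(row)
```

The `GROUP BY` aggregation here is the simplest case of the modeling work described above; concepts like slowly changing dimensions extend the same idea with effective-date columns on the target table.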

Apply Now
