Senior Data Engineer - Ag & Trading • Digital Technology & Data - Ag & Trade
Job Purpose and Impact
- The Senior Data Engineer designs, builds, and maintains complex data systems that enable data analysis and reporting. With minimal supervision, this role ensures that large data sets are processed efficiently and made accessible for decision making.
Key Accountabilities
- DATA INFRASTRUCTURE: Prepares data infrastructure to support the efficient storage and retrieval of data.
- DATA FORMATS: Evaluates and selects appropriate data formats to improve data usability and accessibility across the organization.
- DATA & ANALYTICAL SOLUTIONS: Develops complex data products and solutions using advanced engineering and cloud based technologies, ensuring they are designed and built to be scalable, sustainable and robust.
- DATA PIPELINES: Develops and maintains streaming and batch data pipelines that ingest data from various sources, transform it into usable information, and load it into data stores such as data lakes and data warehouses.
- DATA SYSTEMS: Reviews existing data systems and architectures to identify areas for improvement and optimization.
- STAKEHOLDER MANAGEMENT: Collaborates with multi-functional data and advanced analytics teams as well as business teams to gather requirements and ensure that data solutions meet the functional and non-functional needs of various partners.
- DATA FRAMEWORKS: Builds complex prototypes to test new concepts and implements data engineering frameworks and architectures that improve data processing capabilities and support advanced analytics initiatives.
- AUTOMATED DEPLOYMENT PIPELINES: Develops automated deployment pipelines that improve the efficiency of code deployments with fit-for-purpose governance.
- DATA MODELING: Performs complex data modeling appropriate to the datastore technology to ensure sustainable performance and accessibility.
Qualifications
- Minimum requirement of 4 years of relevant work experience. Typically reflects 5 years or more of relevant experience.
- TECHNICAL SKILLS REQUIRED:
- Data Platform Design - Designing scalable ELT data platforms on Snowflake supporting batch and real-time workloads
- Advanced Python Engineering - Building production-grade Python pipelines and reusable data frameworks, with working knowledge of .NET services and integrations
- Snowflake & Relational Database Expertise - Deep knowledge of Snowflake architecture, advanced SQL, and experience working with Oracle, SQL Server, and PostgreSQL
- Batch & Real-Time Processing - Designing and operating reliable batch and streaming / real-time data pipelines using Apache Kafka and Apache Pulsar
- Performance & Cost Optimization - Optimizing Snowflake queries, warehouse usage, and Python workloads for efficiency and scale
- Security & Governance - Implementing access controls, data protection, and secure data-sharing patterns across data platforms
- Reliability & Data Quality - Ensuring pipeline resilience, monitoring, and data quality across critical datasets
- GenAI Enablement - Enabling GenAI use cases through high-quality data pipelines, including preparation of structured and unstructured data, embeddings, and integration with OpenAI (e.g., RAG-style workflows)
- PREFERRED COMPETENCIES:
- Proven experience working in the Trading and/or Finance industry
- Proven experience with MS Power BI and Tableau