Senior Data Engineer - Ag & Trading • Digital Technology & Data - Ag & Trade
Job Purpose and Impact
- The Senior Data Engineer designs, builds, and maintains complex data systems that enable data analysis and reporting. With minimal supervision, this role ensures that large data sets are processed efficiently and made accessible for decision making.
Key Accountabilities
- DATA INFRASTRUCTURE: Prepares data infrastructure to support the efficient storage and retrieval of data.
- DATA FORMATS: Evaluates and selects appropriate data formats to improve data usability and accessibility across the organization.
- DATA & ANALYTICAL SOLUTIONS: Develops complex data products and solutions using advanced engineering and cloud based technologies, ensuring they are designed and built to be scalable, sustainable and robust.
- DATA PIPELINES: Develops and maintains streaming and batch data pipelines that facilitate the seamless ingestion of data from various sources, transform the data into information, and load it into data stores such as data lakes and data warehouses.
- DATA SYSTEMS: Reviews existing data systems and architectures to identify areas for improvement and optimization.
- STAKEHOLDER MANAGEMENT: Collaborates with cross-functional data and advanced analytics teams, as well as with business teams, to gather requirements and ensure that data solutions meet the functional and non-functional needs of stakeholders.
- DATA FRAMEWORKS: Builds complex prototypes to test new concepts and implements data engineering frameworks and architectures that improve data processing capabilities and support advanced analytics initiatives.
- AUTOMATED DEPLOYMENT PIPELINES: Develops automated deployment pipelines that improve the efficiency of code deployments with fit-for-purpose governance.
- DATA MODELING: Performs complex data modeling in accordance with the datastore technology to ensure sustainable performance and accessibility.
Qualifications
- Minimum requirement of 4 years of relevant work experience. Typically reflects 5 years or more of relevant experience.
- TECHNICAL SKILLS REQUIRED:
- Data Platform Design - Designing scalable ELT data platforms on Snowflake supporting batch and real-time workloads
- Advanced Python Engineering - Building production-grade Python pipelines and reusable data frameworks, with working knowledge of .NET services and integrations
- Snowflake & Relational Database Expertise - Deep knowledge of Snowflake architecture, advanced SQL, and experience working with Oracle, SQL Server, and PostgreSQL
- Batch & Real-Time Processing - Designing and operating reliable batch and streaming/real-time data pipelines using Apache Kafka and Apache Pulsar
- Performance & Cost Optimization - Optimizing Snowflake queries, warehouse usage, and Python workloads for efficiency and scale
- Security & Governance - Implementing access controls, data protection, and secure data-sharing patterns across data platforms
- Reliability & Data Quality - Ensuring pipeline resilience, monitoring, and data quality across critical datasets
- GenAI Enablement - Enabling GenAI use cases through high-quality data pipelines, including preparation of structured and unstructured data, embeddings, and integration with OpenAI (e.g., RAG-style workflows)
PREFERRED COMPETENCIES
- Proven experience working in the trading and/or finance industries
- Proven experience with MS Power BI and Tableau
Our Locations
We are proud to serve customers and communities in more than 70 countries around the world. Cargill employees everywhere contribute to enriching the world in a safe, responsible, and sustainable way. Join us and discover how a career at Cargill can help you achieve your higher purpose.
