THE BIG PICTURE

Sysco LABS is the captive innovation center for Sysco Corporation (NYSE: SYY), a Fortune 100 company and the world’s largest foodservice provider, with 71,000+ associates, 330+ distribution centers and over 725,000 customers in 90 countries. In fiscal year 2023, which ended July 1, 2023, Sysco generated over $76 billion in sales.

Sysco LABS powers Sysco’s farm-to-fork operations. Our technology is present in the sourcing of food products, merchandising, storage and warehouse operations, order placement and pricing algorithms, the delivery of food and supplies across Sysco’s global network, the in-restaurant dining experience of the end customer and much more.

Our technology ecosystem spans 600+ applications; monitoring and incident management across 10,000+ servers; multi-cloud, multi-platform event streaming and microservices architectures; and enterprise-grade systems that power a catalog of over 1.4 million products, 330+ distribution centers and a fleet of 14,000 IoT-enabled delivery trucks.

Everything we do at Sysco LABS supports Sysco’s Purpose of ‘Connecting the world to share food and care for one another’, and our technology directly impacts millions of food consumers in a trillion-dollar, global industry.

THE OPPORTUNITY

We are currently on the lookout for a Technical Lead – Data Engineering to join our team. 

RESPONSIBILITIES

  • Designing and developing large-scale data processing solutions for one of the world's largest corporations involved in the marketing and distribution of food products
  • Working collaboratively with Agile, cross-functional development teams and providing guidance on database design, query optimization and database tuning while adhering to DevOps principles
  • Designing and developing capacity and scalability plans for fast-growing data infrastructure
  • Practicing Continuous Integration and Continuous Delivery, and ensuring high code quality by following software engineering best practices
  • Being involved in projects throughout their full software lifecycle - from development, QA, and deployment, to post-production support

REQUIREMENTS

  • A Bachelor's degree in Computer Science or equivalent, and 5-6+ years of experience developing production enterprise applications and data integration solutions, including experience managing teams
  • Excellent communication and leadership skills
  • Hands-on experience working with large volumes of data and distributed processing frameworks (preferably Apache Spark and Kafka)
  • Strong Python programming skills for data processing and analysis
  • Proficiency in batch processing techniques and data pipeline development (a short PySpark sketch follows this list)
  • Hands-on experience designing and developing ETL pipelines that process large volumes of data, including experience with Informatica
  • Expertise in data quality management and in implementing data quality frameworks (an illustrative check is sketched after this list)
  • Familiarity with data lakehouse architectures and related technologies, including OLAP/OLTP database design techniques
  • Strong skills in query optimization and performance tuning, particularly for large-scale data warehouses and distributed systems
  • Experience with query plan analysis and execution plan optimization in various database systems, especially Amazon Redshift (see the Redshift tuning sketch after this list)
  • Knowledge of indexing strategies, partitioning schemes, and other performance-enhancing techniques
  • Extensive experience with AWS services, particularly:
    • Amazon S3 for data storage and management
    • Amazon Redshift for data warehousing and query optimization
    • AWS Lambda for serverless computing and data processing (a minimal handler sketch follows this list)
    • Amazon ECS (Elastic Container Service) for container orchestration
  • Proficiency in designing and implementing cloud-native data architectures on AWS
  • Experience with AWS data integration and ETL services (e.g., AWS Glue, AWS Data Pipeline)
  • DevOps practices for data platforms:
    • Extensive experience implementing DevOps practices for data platforms and workflows
    • Proficiency in automating data pipeline deployments, including CI/CD for ETL processes and database schema changes
    • Experience with infrastructure-as-code tools (e.g., Terraform, CloudFormation) for provisioning and managing data infrastructure
    • Familiarity with monitoring and observability tools for data platforms (e.g., CloudWatch, DataDog)
  • Experience working in a Scrum/Agile delivery environment and with DevOps practices, as well as prior experience with cloud IaaS or PaaS providers such as AWS, will be an added advantage
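
To give a flavor of the Spark-based batch processing this role involves, here is a minimal PySpark sketch. It is illustrative only, not Sysco code: the bucket paths, column names and business rules are hypothetical, and it assumes a Spark 3.x environment with S3 access already configured.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("daily-order-rollup").getOrCreate()

    # Read raw order lines (hypothetical path and schema).
    orders = spark.read.parquet("s3://example-bucket/raw/order_lines/")

    # Aggregate delivered orders into a daily revenue mart.
    daily = (
        orders
        .filter(F.col("status") == "DELIVERED")
        .groupBy("order_date", "distribution_center_id")
        .agg(
            F.sum("line_total").alias("daily_revenue"),
            F.countDistinct("customer_id").alias("unique_customers"),
        )
    )

    # Write partitioned output so downstream queries can prune by date.
    (daily.write
        .mode("overwrite")
        .partitionBy("order_date")
        .parquet("s3://example-bucket/marts/daily_order_rollup/"))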
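
Redshift query tuning usually starts with a table's distribution and sort keys and with reading query plans. A minimal sketch, assuming the psycopg2 driver and a hypothetical cluster, table and credentials:

    import os
    import psycopg2

    # Hypothetical connection details; Redshift listens on port 5439 by default.
    conn = psycopg2.connect(
        host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
        port=5439,
        dbname="analytics",
        user="etl_user",
        password=os.environ["REDSHIFT_PASSWORD"],
    )
    cur = conn.cursor()

    # DISTKEY co-locates rows that join on customer_id on the same slice;
    # SORTKEY lets range-restricted scans on order_date skip blocks.
    cur.execute("""
        CREATE TABLE IF NOT EXISTS fact_orders (
            order_id    BIGINT,
            customer_id BIGINT,
            order_date  DATE,
            line_total  DECIMAL(12, 2)
        )
        DISTSTYLE KEY
        DISTKEY (customer_id)
        SORTKEY (order_date);
    """)
    conn.commit()

    # Inspect the execution plan before running an expensive query.
    cur.execute("""
        EXPLAIN
        SELECT order_date, SUM(line_total)
        FROM fact_orders
        WHERE order_date >= '2023-07-01'
        GROUP BY order_date;
    """)
    for (line,) in cur.fetchall():
        print(line)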
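
For the AWS Lambda item, a common data engineering pattern is a function triggered by S3 object-created events that inspects and routes each new file. A minimal handler sketch, assuming the standard S3 event payload (the bucket layout and follow-on processing are hypothetical):

    import urllib.parse

    import boto3

    s3 = boto3.client("s3")

    def handler(event, context):
        """Triggered by S3 ObjectCreated events; logs basic file metadata."""
        for record in event["Records"]:
            bucket = record["s3"]["bucket"]["name"]
            # Object keys arrive URL-encoded in the event payload.
            key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
            head = s3.head_object(Bucket=bucket, Key=key)
            print(f"Received s3://{bucket}/{key} ({head['ContentLength']} bytes)")
            # A real pipeline would validate the file here and hand it off,
            # e.g. by triggering a downstream Glue or Spark job.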
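
For the data quality item, one lightweight approach is a reusable check that runs after each batch load and fails the pipeline fast on bad data. A PySpark sketch; the checks, column names and thresholds are illustrative, not an actual framework:

    from pyspark.sql import functions as F

    def run_quality_checks(df, required_cols, min_rows=1):
        """Raise ValueError if the batch is empty, missing columns, or has nulls."""
        missing = [c for c in required_cols if c not in df.columns]
        if missing:
            raise ValueError(f"Missing required columns: {missing}")

        if df.count() < min_rows:
            raise ValueError(f"Expected at least {min_rows} rows")

        # Count nulls per required column in a single pass.
        null_counts = df.agg(
            *[F.sum(F.col(c).isNull().cast("int")).alias(c) for c in required_cols]
        ).first().asDict()
        bad = {c: n for c, n in null_counts.items() if n and n > 0}
        if bad:
            raise ValueError(f"Null values found: {bad}")

    # Example: run_quality_checks(daily, ["order_date", "daily_revenue"])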

BENEFITS

  • US dollar-linked compensation
  • Performance-based annual bonus
  • Performance rewards and recognition
  • Agile Benefits - special allowances for Health, Wellness & Academic purposes
  • Paid birthday leave
  • Team engagement allowance
  • Comprehensive Health & Life Insurance Cover - extendable to parents and in-laws
  • Overseas travel opportunities and exposure to client environments 
  • Hybrid work arrangement

Sysco LABS is an Equal Opportunity Employer.

Life @ Sysco LABS
At Sysco LABS, we always go the extra mile, but we know when to have some fun too. We never pass up an opportunity to celebrate or let our hair down, and we understand the importance of play in helping us do our best work.