AI Field Day: Nature Fresh Farms Profits by Machine Learning, Not LLMs

Nature Fresh Farms uses machine learning (ML) to grow and deliver top-quality fresh produce. Presenting at AI Field Day, Keith Bradley, VP of IT for Nature Fresh Farms, outlined the multi-year journey to instrument and optimize the whole process using ML and automation.

Nature Fresh Farms delivers high-quality fresh produce year-round from massive greenhouses. Much of the AI in the business is driven by IoT sensors, machine vision, and neural networks, bringing consistency to usually subjective judgements. Inputs such as light, temperature, supplied water, and nutrients are measured and adjusted based on outputs such as water returned, growth, and leaf state. This setup is far from the billions of parameters required for even the simplest large language model (LLM), with a corresponding decrease in the compute resources needed to create and operate the models.
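The measure-and-adjust loop described above can be pictured as a simple feedback controller. The sketch below is purely illustrative, assuming a proportional adjustment of nutrient dosing against the electrical conductivity (EC) of returned water; the variable names, gain, and units are assumptions, not the farm's actual model.

```python
# Hypothetical sketch of the measure-and-adjust loop: nudge the nutrient
# dose toward a target EC measured in water returned from the plants.
# Gains and units are illustrative assumptions.

def adjust_nutrient_dose(current_dose_ml: float,
                         target_ec: float,
                         measured_ec: float,
                         gain: float = 10.0) -> float:
    """Proportional control: increase the dose when the returned water is
    weaker than the target, decrease it when the water is stronger."""
    error = target_ec - measured_ec
    return max(0.0, current_dose_ml + gain * error)

# Returned water is weaker than the target, so the dose goes up.
print(adjust_nutrient_dose(current_dose_ml=50.0, target_ec=2.0, measured_ec=1.8))
```

In practice, the models the presentation describes would learn these relationships from sensor history rather than use a fixed gain, but the input-to-output feedback shape is the same.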

Keith mentioned the need to process real-time data inside the greenhouse, while longer-term data is processed in a data center. This approach mirrors many ideas in edge computing and removes reliance on WAN connections for immediate decisions. For example, the decision to open the greenhouse vents based on relative temperatures is made immediately, as is the decision to turn on lights on a cloudy day. Deciding whether the cost of electric lighting is justified by the resulting increase in yield requires a more complex analysis in the data center. That longer analysis then feeds back into the onsite real-time decisions.
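The real-time side of this split can be as simple as threshold rules evaluated locally, with the thresholds themselves tuned by the slower data-center analysis. The sketch below assumes hypothetical sensor names and threshold values for illustration; it is not Nature Fresh Farms' actual logic.

```python
# Hypothetical edge-side rules: decided locally, no WAN round-trip needed.
# Thresholds are illustrative assumptions that the data-center analysis
# would periodically re-tune and push back to the greenhouse.

def vent_decision(inside_temp_c: float, outside_temp_c: float,
                  open_margin_c: float = 2.0) -> bool:
    """Open the vents when the greenhouse is warmer than outside
    by more than the configured margin."""
    return (inside_temp_c - outside_temp_c) > open_margin_c

def lighting_decision(ambient_lux: float, min_lux: float = 20000.0) -> bool:
    """Turn supplemental lights on when natural light drops below the
    target set by the longer-term yield-versus-cost analysis."""
    return ambient_lux < min_lux

print(vent_decision(26.0, 21.0))   # True: inside is 5 degrees warmer
print(lighting_decision(8000.0))   # True: cloudy day, below the lux target
```

The data-center analysis answers the slower question (does paying for electricity at this lux target actually improve profit?) and feeds an updated `min_lux` back to the edge.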

The use of machine vision and ML extends beyond growing to both sales and delivery. Forecasting how much product will be at optimal condition each day or week is essential for the sales team to sell it all at the best price. On the packing line, machine vision inspects and grades each item of produce. It even ensures that all the pieces of fruit in a package have an equal shelf life and uniform appearance. While LLMs are an area Nature Fresh Farms is investigating, today the company uses ML, not LLMs, to improve productivity.
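Once machine vision has graded each item, packing for uniform shelf life becomes a grouping problem: put items with similar predicted remaining shelf life in the same package. The sketch below is a minimal illustration of that grouping step, assuming hypothetical per-item shelf-life scores; the real system's vision models and packing rules are not described in the presentation.

```python
# Hypothetical packing step: sort graded items by predicted shelf life
# (days), then fill each package with neighbouring items so every
# package is as uniform as possible. Scores are illustrative.

def pack_by_shelf_life(items, package_size=4):
    """Group items into packages of similar predicted shelf life."""
    ordered = sorted(items, key=lambda item: item["shelf_life_days"])
    return [ordered[i:i + package_size]
            for i in range(0, len(ordered), package_size)]

fruit = [
    {"id": "a", "shelf_life_days": 9},
    {"id": "b", "shelf_life_days": 5},
    {"id": "c", "shelf_life_days": 8},
    {"id": "d", "shelf_life_days": 6},
    {"id": "e", "shelf_life_days": 9},
    {"id": "f", "shelf_life_days": 5},
    {"id": "g", "shelf_life_days": 8},
    {"id": "h", "shelf_life_days": 6},
]
for package in pack_by_shelf_life(fruit, package_size=4):
    print([item["id"] for item in package])
```

Shorter-lived items end up together and can be routed to nearby customers first, which is the shelf-life uniformity the packing line is after.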

Disclosure: The Futurum Group is a research and advisory firm that engages or has engaged in research, analysis, and advisory services with many technology companies, including those mentioned in this article. The author does not hold any equity positions with any company mentioned in this article.

Analysis and opinions expressed herein are specific to the analyst individually and data and other information that might have been provided for validation, not those of The Futurum Group as a whole.

Other Insights from The Futurum Group:

Amazon SageMaker HyperPod Claims 40% Reduction in AI Training Time

Amazon CodeWhisperer and MongoDB Collaborate

AWS, Microsoft, and Google Cloud: Tying Up LLMs

Author Information

Alastair has made a twenty-year career out of helping people understand complex IT infrastructure and how to build solutions that fulfil business needs. Much of his career has included teaching official training courses for vendors, including HPE, VMware, and AWS. Alastair has written hundreds of analyst articles and papers exploring products and topics around on-premises infrastructure and virtualization and getting the most out of public cloud and hybrid infrastructure. Alastair has also been involved in community-driven, practitioner-led education through the vBrownBag podcast and the vBrownBag TechTalks.
