Feeding a growing global population while facing climate change is one of agriculture’s defining challenges. Greenhouses are already part of the solution, yet managing countless variables – from plant health and irrigation to labor and market demand – remains highly complex. This is where Hexafarms comes in.
Founded by David Ahmed, the Berlin-based startup integrates Artificial Intelligence (AI), Computer Vision (CV), and proprietary Internet of Things (IoT) hardware directly into greenhouse operations. What started with sensors and cameras in his own room has grown into a platform used across hundreds of hectares in ten countries, enabling growers to forecast yields, detect diseases early, and make sense of more than 80 real-time parameters.
In the interview, David shares how Hexafarms designed AI models that work across diverse crops and environments, reflects on being recognized as a finalist for the Deep Tech Award 2025, and explains why Berlin provides fertile ground for scaling deep tech startups. He also discusses the urgent need for a “GPT moment” in agriculture, the opportunities for AI to make food production more resource-efficient and resilient worldwide, and how Hexafarms is building toward an intelligent “Operating System” for commercial farming.
Hi David, Hexafarms is putting AI and computer vision right into the heart of greenhouses – from spotting diseases early to forecasting yields. What was the moment or insight that made you say, “We can solve this with deep tech,” and how did you first start connecting agriculture with AI in a way that actually works in the field?
Complexity. It’s mind-boggling to see how many operations and data streams were still being managed manually in a typical greenhouse. For example, a single hectare can contain as much as 2 km of rows with plants on either side, which means you simply can’t inspect everything. A team can range from 10 to 200 people, and you need to understand and decide what everyone should do. On top of that, you have to consider countless factors – fertilizer values, moisture, temperature, solar radiation, light, and more. That’s when we realized we needed systems that could ingest all these values and help make sense of them. Little did we know that even more factors – like plant health, market demand, and seasonality – would also come into play. Luckily, we were prepared. We actually started with a very small “farm” in my own room. I had set up some basic sensors and a small camera to “understand” what was happening. One thing led to another, and our focus deepened – not just in AI, Machine Learning (ML), and CV, but also in IoT, sensing, and user engagement. Today, we’re working with hundreds of hectares of high-tech greenhouses across ten countries.
Your system tracks over 80 parameters in real time – which is a serious amount of data for any grower to handle. What were the biggest hurdles, whether technical or agricultural, in creating something that could work across so many crops and greenhouse setups?
If you look at our servers, we receive telemetry roughly every second – sensor values from production sites across our ecosystem. On top of that, every two minutes during the day, a 4K image from a production site runs through our CV models. This volume of data is only growing as we scale. We managed to handle such diverse use cases by starting with very generalized models that were data-hungry but had the depth to keep learning. We then fine-tuned them for specific scenarios. In fact, the diversity of data has been more of a blessing than a hurdle: the more data we gather, the better our systems and services become. The real challenge was getting the level and quality of data we needed. None of the existing solutions worked the way we wanted. That’s why we invested early in IoT and hardware as well, and we now have proprietary solutions that we use exclusively for our own clients.
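The cadences David mentions (telemetry roughly every second, one 4K image every two minutes during the day) already imply a substantial per-site data volume. The following back-of-the-envelope sketch makes that concrete; the per-message and per-image sizes and the daylight window are illustrative assumptions, not Hexafarms figures.

```python
# Rough daily data volume for one production site, using the cadences
# from the interview. Message/image sizes are assumed, not confirmed.

TELEMETRY_INTERVAL_S = 1        # sensor telemetry roughly every second
IMAGE_INTERVAL_S = 120          # one 4K image every two minutes
DAYLIGHT_HOURS = 12             # assumed imaging window ("during the day")

TELEMETRY_BYTES = 512           # assumed size of one telemetry message
IMAGE_BYTES = 8 * 1024**2       # assumed ~8 MiB per compressed 4K image

telemetry_per_day = 24 * 3600 // TELEMETRY_INTERVAL_S
images_per_day = DAYLIGHT_HOURS * 3600 // IMAGE_INTERVAL_S

daily_bytes = (telemetry_per_day * TELEMETRY_BYTES
               + images_per_day * IMAGE_BYTES)

print(f"{telemetry_per_day} telemetry messages/day")
print(f"{images_per_day} images/day")
print(f"~{daily_bytes / 1024**3:.1f} GiB/day per site")
```

Under these assumptions a single site produces on the order of a few gibibytes per day, dominated by imagery – which helps explain why the volume "is only growing" as more sites come online.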
You’ve built something that works just as well in a strawberry tunnel as it does in a basil-filled greenhouse. How do you manage the tension between tailoring the tech to each farm’s unique conditions and keeping it universal enough to roll out at scale?
Our AI and CV models are generalized and can be fine-tuned for special use cases. At this point, we often don’t even need to do that, but for some large accounts or performance gains, we add fine-tuning factors. In those cases, fine-tuning really just means some additional configuration – either mined from data or enforced by humans. We’re even working on automating that process. The biggest customization actually happens in our SaaS platform, which is what end-users interact with. We’ve worked closely with every single customer, and right out of the box our services cover 99 % of their priority use cases.
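The "fine-tuning as configuration" idea can be sketched as a simple overlay: a generalized model ships with defaults, and per-customer factors (mined from that site's data or set by humans) override only what differs. The keys and values below are illustrative, not Hexafarms' actual parameters.

```python
# Minimal sketch: per-customer factors overlaid on a generalized
# model's default configuration. All keys/values are hypothetical.

def overlay(base: dict, override: dict) -> dict:
    """Recursively merge override onto base without mutating either."""
    merged = dict(base)
    for key, value in override.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = overlay(merged[key], value)
        else:
            merged[key] = value
    return merged

base_config = {
    "crop": "generic",
    "disease_model": {"threshold": 0.5, "classes": ["mildew", "botrytis"]},
    "yield_forecast": {"horizon_days": 14},
}

# Fine-tuning factors for one site: mined from data or human-enforced.
customer_overrides = {
    "crop": "strawberry",
    "disease_model": {"threshold": 0.35},  # more sensitive for this site
}

site_config = overlay(base_config, customer_overrides)
print(site_config["disease_model"])
```

The appeal of this pattern is that the generalized model stays untouched: rolling out to a new farm means shipping a small override, which is also what makes automating the process tractable.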
Being a finalist for the Deep Tech Award 2025 put you alongside some of Berlin’s most forward-looking tech companies. Even without the trophy, what does that kind of recognition do for a young company like yours – and what’s your biggest takeaway from being in that spotlight?
To be honest, I wasn’t really familiar with the Deep Tech Award at first – it was our team who drew my attention to the application. That’s why it was all the more rewarding to be on stage alongside such innovative companies. It shows that our work is being recognized and motivates us to keep going.
Berlin has built a reputation as one of Europe’s deep tech hubs. In your own experience, how has the city helped you grow – whether that’s finding the right people, forming partnerships, or securing funding? What does Berlin give you that other cities might not?
The city has an ample supply of everything an early-stage startup might need. It might be a contrarian view, but I find Berlin to be perfect for deep tech. In my opinion, Berlin offers the optimal balance of freedom, capital, talent, government support, geo-location, and networks for a deep tech startup to succeed at an early stage.
One of Berlin’s strengths is how the AI scene cuts across disciplines – researchers, engineers, founders, and creatives often end up in the same room. Can you share a moment when a local connection or collaboration moved your work forward in a way you didn’t expect?
Our very first customer was from Berlin! The city isn’t exactly known for greenhouses, but the one that was here became our client and was instrumental in helping us better understand the problem.
With climate change reshaping agriculture worldwide, where do you see the most urgent opportunities for AI to make food production more resilient – both here in Europe and on a global scale?
There’s a bit of a misunderstanding and even a romantic belief that humans are inherently good at farming – that’s simply not true. We estimate that even the best growers could produce the same output with 30 % fewer resources. Given the intensity of production needs and the Return on Investment (ROI) expectations of the agriculture industry, there’s an urgent need for the right AI solutions. In fact, existing solutions have already peaked in terms of what they can do – whether that’s more fertilizer, labor, or Genetically Modified Organisms (GMOs). What we need is a GPT moment for agriculture: a multimodal AI system that can work across different setups, transfer learning between crops, and in real time either steer – or help humans and machines steer – every possible parameter to optimize dynamic output.
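The real-time steering David describes is, at its core, a closed control loop: read the current parameters, compare against a model-recommended optimum, and nudge setpoints toward it. The toy sketch below shows the shape of that loop with a simple proportional step; the numbers, gain, and single-parameter setup are illustrative assumptions, nothing like the multimodal system the answer envisions.

```python
# Toy closed-loop steering of one greenhouse parameter (a temperature
# setpoint). A real system would steer many parameters at once using
# learned models; values below are hypothetical.

def steer(current: float, target: float, gain: float = 0.3) -> float:
    """One proportional step of the setpoint toward the target."""
    return current + gain * (target - current)

setpoint = 18.0   # current temperature setpoint in C (assumed)
optimum = 22.0    # model-recommended optimum for this crop (assumed)

for _ in range(10):          # each iteration = one control interval
    setpoint = steer(setpoint, optimum)

print(round(setpoint, 2))    # converges toward the optimum
```

Whether the loop is fully automated or only suggests actions to humans and machines is exactly the "steer – or help steer" distinction in the answer above.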
You often say your platform lets farmers “watch their plants grow in front of their eyes.” Looking ahead, what’s next – are there new features, crops, or even markets you’re aiming for in the near future?
Based on our market knowledge, we offer the most precise yield forecasts for major greenhouse crops and consistently outperform human experts. And this keeps improving as we expand. But our customers are asking for something bigger: an intelligent “Operating System.” Our goal is to be the true Operating System for commercial food production. We already have several additional modules either live in beta or launching soon – such as scouting, automated CV-based plant Key Performance Indicator (KPI) tracking, and disease and pest prediction.
Thanks for talking to us.