The robot revolution and our food future, with Rajat Bhageria
When you’re hungry and you need something in a pinch, a meal kit or pre-made entree is tough to beat. But it turns out that getting them from the factory to your refrigerator is difficult when there aren’t enough people to do the prep. Rajat Bhageria, founder and CEO of Chef Robotics has an answer for that. Chef Robotics utilizes embodied AI, the intersection of AI and robotics, to make unlimited kinds of meals with a smart and flexible approach that’s revolutionizing the production line. Bhageria joins Pioneers of AI to talk about this industry-leading tech, its impact on labor shortages, production volume, and how it’s helping businesses scale.
About Rajat
- Founder/CEO of Chef Robotics, market leader in food robotics
- Scaled Chef Robotics to 30M servings produced in food manufacturing
- Pioneer in embodied AI for food assembly and factory automation
- Founder/Managing Partner at Prototype Capital, a pre-seed VC firm
- Built Third Eye, a computer vision startup for the visually impaired
Table of Contents:
- How AI and robotics come together in the physical world
- Why food assembly became the smartest place to start
- Solving labor shortages where factories lose the most capacity
- Why a robot staffing model beats traditional automation pricing
- What deployment looks like on a live food production line
- How food data can unlock expansion and quality control
- What the robots see and how they handle delicate ingredients
- Designing robots that people can actually work alongside
- What it takes to scale a flexible robotics platform
- Episode Takeaways
Transcript:
RANA EL KALIOUBY: Let me set the scene for you. You worked through lunch, again. Your next meeting is in 15 minutes. But you don’t want others to hear your grumbling stomach. You need something to eat, fast. You open up the freezer and there – thankfully – you find your emergency meals. Pre-portioned, pre-cooked entrees complete with a vegetable and protein.
Okay honestly, I stopped eating these freezer meals a few years ago – I’m trying to eat more fresh foods these days.
That being said, many of us do rely on these meals. But how many of you out there actually stop to think about them? How are those carrots such perfect cubes? How are the noodles perfectly cooked? Not too mushy or underdone?
RAJAT BHAGERIA: The thing with food is that it’s a lot more variable than might meet the eye. I’ll give you an example. Every batch of a very simple ingredient like rice is a little bit different. So if you cooked it a little bit longer, a little bit less long, if you put a little bit too much oil, less oil, it changes the stickiness, the density. And it’s not obvious for humans and they are often actually very inconsistent because of that.
EL KALIOUBY: Humans are inconsistent. But you know what’s a lot more consistent than us? Robots.
Rajat Bhageria runs Chef Robotics – a company making AI-powered robots for the food industry. Robots that can sort and assemble your microwave meal on the factory floor.
This kind of advanced robotics is called “embodied AI.” And Rajat is a pioneer in the field. Just like me, he is a computer scientist turned entrepreneur.
The intersection of AI-powered robots and food manufacturing is an area I see so much potential in. I’m actually an investor in Rajat’s company. And I’m so excited to share with you what Chef Robotics is about.
In this episode, we’ll get into the nitty gritty of how robotic AI works and can solve some of the biggest labor shortage problems we have. We’ll also dive into Rajat’s own background as an entrepreneur and how he’s building a business on the cutting edge of food tech and robotics.
I’m Rana el Kaliouby and this is Pioneers of AI – a podcast taking you behind-the-scenes of the AI revolution.
[THEME MUSIC]
Hi Rajat, welcome to Pioneers of AI. I am so excited to have this conversation today.
BHAGERIA: Thank you Rana, I’m super excited to have this conversation as well.
How AI and robotics come together in the physical world
EL KALIOUBY: So I want to set the stage here because this is our first episode on AI robotics. And not all robots are AI driven. In fact, most robots don’t have any AI in them at all. And a lot of AI today lives in the software world, right? It’s like ChatGPT. It doesn’t have any physical manifestation. So can you talk about the difference between AI and robotics and when they come together, what does that look like?
BHAGERIA: Yeah. It’s a good question. I think colloquially the framework that roboticists use is this idea that you sense the world, you make decisions, and then you actuate – that’s basically the robotics framework most people use. And now there’s this fourth idea of you learn, which is the AI-enabled part. So what you’re sensing is becoming much more rich. Instead of sensing with a very simple sensor, we sense with an RGB-D camera, or with a pressure sensor, or a force-torque sensor; we sense with LLMs, with VLMs, right? And all of that allows you to be a lot more rich in how you act in the physical world.
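As a rough illustration, the sense–decide–actuate–learn loop Bhageria describes could be sketched as a toy control loop in Python. Everything here is invented for illustration – the class names, the threshold, the readings – and is not Chef Robotics’ actual software:

```python
# A toy version of the robotics loop described above: sense -> decide ->
# actuate, plus the newer AI-enabled "learn" step that records experience.
# All names and numbers here are invented for illustration.

class FixedSensor:
    """Stub sensor that returns a constant reading."""
    def __init__(self, value):
        self.value = value

    def read(self):
        return self.value

class ThresholdPolicy:
    """Toy decision step: scoop only if enough food is detected in the pan."""
    def decide(self, observation):
        return "scoop" if observation["depth_mm"] > 50 else "wait"

class LoggingArm:
    """Stub actuator that just reports the command it executed."""
    def execute(self, action):
        return f"executed:{action}"

class Robot:
    def __init__(self, sensors, policy, actuator):
        self.sensors = sensors    # e.g. RGB-D camera, force-torque sensor
        self.policy = policy      # model mapping observations to actions
        self.actuator = actuator  # e.g. a robot arm
        self.experience = []      # data kept for the "learn" step

    def step(self):
        obs = {name: s.read() for name, s in self.sensors.items()}  # sense
        action = self.policy.decide(obs)                            # decide
        result = self.actuator.execute(action)                      # actuate
        self.experience.append((obs, action, result))               # learn
        return result

robot = Robot({"depth_mm": FixedSensor(80)}, ThresholdPolicy(), LoggingArm())
result = robot.step()  # the pan reads deep enough, so the arm scoops
```

The "learn" step here just appends to a list; in a real system that logged experience is what would feed model training.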
Like, if you go to Waymo, for example, it’ll show you, like, why it stopped, or things like that, right? So, there’s quite a lot of progress, as you’ve called out, happening in digital AI, or, like, software AI, with LLMs and VLMs, and things like that. And our thesis is that this is going to have a huge impact on robotics and the physical world.
EL KALIOUBY: A VLM is a visual language model. Instead of being trained on large amounts of text to generate and translate language like LLMs do, visual language models are multi-modal and are trained on images, videos, and text.
BHAGERIA: I think I’m very excited about the intersection of that idea to the physical world. So, I think perhaps some of the biggest impact will be there.
EL KALIOUBY: So Rajat, what does Chef Robotics actually do?
BHAGERIA: The way you can think about it is we essentially make two things. One is this AI-enabled software, we call it Chef OS. It’s really the software for how to manipulate food – which is to say, how do you manipulate thousands of different ingredients, no matter how they’re cut or cooked, at any portion size, into any tray, into any compartment within the tray, things like that. So that’s Chef OS. And of course, to make food, you need hardware. So on the hardware side, we mostly use off-the-shelf hardware that we integrate into a product, which we scale across our customers. And then we also have a set of utensils that we use across our customers.
And then we combine the two and we have this kind of system that we can deploy to flexibly automate their production, and help them basically increase production volume.
EL KALIOUBY: Chef Robotics works with companies like Sunbasket, the healthy meal delivery company and Amy’s Kitchen, the organic food pioneer.
Why food assembly became the smartest place to start
So we’re both computer scientists by background and we actually both spent a lot of time in the computer vision world. And then we also both spent some time in the business and the investing space. I am so curious how all of this culminated into the origin story of Chef Robotics. How did you land on the food industry in particular?
BHAGERIA: Yeah, it was definitely a roundabout journey, which didn’t really make sense at the time, but I think the puzzle pieces kind of connected in hindsight, I guess. When I was getting into engineering, I was actually planning to be an electrical engineer. I did a bunch of research in high school with electrical engineering, my plan was to be electrical.
But then basically the first week of college, there’s this hackathon. And I decided to enter with a few friends and we actually did really well. And I was introduced to this idea of computer science and software in a more substantial way, I would say. And it was so exciting because electrical felt slow, relatively speaking, computer science is very fast and exciting. But also that hackathon led to the creation of my first company, Third Eye, we were using computer vision to recognize what’s in front of you for the visually impaired. So that was my first real experience with AI and computer vision.
And I think both those experiences, of course the coursework, but also more importantly, Third Eye kind of got me excited about this idea that really software’s biggest impact will be on the physical world.
So at that point, it was like, okay, well, I wanted to do something in this idea of embodied AI, what you alluded to, but I didn’t know what product or company to start.
But I also had seen the graveyard of robotics companies. It’s like robotics, honestly, has been tough. Historically, it’s been tough. I think it’s gaining a lot of excitement in the 2020s right now, but, you know, there have been a lot of robotics companies that didn’t work out. So I was like, okay, I believe this idea that AI-enabled robotics is going to be huge, arguably the biggest industry on planet earth.
But I want to be very practical. So I also kind of did a bunch of coursework in economics and business. And I started a small VC fund and I think all of that kind of really allowed me to understand the economics of robotics and how economics of how these older industries work, I would say. So I think it was kind of the background from, you know, the software, computer science, Third Eye experience, and then kind of economics, business, venture capital.
I think those experiences kind of coalesced. And I came into this idea of starting some robotics with an appreciation that I think it’s just as much of an economics problem as it is a technology problem. Because if the ROI doesn’t make sense, then our customers are not going to buy these robots.
I think the reason the food industry happened is actually quite practical. I wanted to work on the biggest thing that I possibly could.
So I was like, okay, well, the best proxy for that is the number of humans who do that job today from a labor perspective. So I looked at the Bureau of Labor Statistics data, and the biggest job was actually nursing and personal care aides, which I didn’t feel was tractable by AI today. The next most common one was retail salespeople.
Same thought. Food service was actually number three, food preparation broadly. And then of course we immersed ourselves in the food industry, we literally lived more or less with our customers until we really understood how they think and what they care about, and that’s how I learned about it. It was a very practical decision, I would say.
Solving labor shortages where factories lose the most capacity
EL KALIOUBY: And then what did you find was the gap that you went on to solve?
BHAGERIA: So when I had this thought about the food industry, I wasn’t trying to force robotics on anyone. I think there have been a lot of robotics companies that kind of put this idea of "I want to do robotics" on the customer. They kind of ask leading questions. They’re like, where do we put the robots?
I think we had the opposite question, which is, hey, look, we’re excited broadly speaking about food. What kind of pain points do you face? Setting aside AI and technology and robotics – all of that. But what was interesting is that every single customer we talked to, without exception, essentially said their biggest pain point was a labor shortage, and flavors of that – which is that even if they could hire the laborers, it’s really hard to retain the labor.
I would hear stories about how people would turn over 300 to 400 percent per year. Now, by the way, food is a little bit more intense than other environments, even worse than perhaps construction and, you know, agriculture and warehouses, mainly because it’s extremely cold, like 34 degrees Fahrenheit or extremely hot, either extreme.
So anyways, that was kind of the big pain point.
EL KALIOUBY: So this isn’t the cooking problem. This is the packing and assembling – like, food assembly?
BHAGERIA: Yeah. So we hadn’t gotten there yet. When we were having these conversations, it was like, broadly speaking, what’s your pain point? It was like labor, right? And they were like, it’s really hard to hire because the hot room or the cold room, even if you’re doing assembly or packing, it’s still hot. Because you’re next to the stove and the grill, right?
So I think that was where we were at that point. And then the next step was like you said, what task? We learned that actually assembly is the task that all of our customers ask us to focus on, not cooking or prepping. And this is also kind of weird to us, because, you know, when you cook at home, or I cook at home, cooking takes the longest.
And number two is prepping, and then finally assembly is like seconds, right? What I learned is that at even a little bit of scale, that’s not true anymore. The assembly is actually 60 to 70 percent of the labor force. Basically our customers told us you should focus there.
Why a robot staffing model beats traditional automation pricing
EL KALIOUBY: I love that you talk about your company as a robot staffing agency. Say more about that.
BHAGERIA: The big pain point that this labor shortage results in for our customers is that they’re running under capacity. And one thing I should just say is that, you know, at the moment we’re selling to really food manufacturers. We do also want to sell down the line to, like, fast casual restaurants, prisons, hotels, stadiums.
Today our customers are food manufacturing customers, so big, think big food factories.
Now, their biggest pain point is that because they cannot hire these people, they are running way under capacity. What that means essentially is, let’s say that they have ten conveyor lines.
They can only staff, let’s say, seven or eight of them at a time to really, you know, make the production numbers they need. So they’re basically leaving revenue on the table. And hypothetically, let’s just say they can staff all 10. Even if they can staff all 10, they have so much turnover that like, sometimes it doesn’t make sense to staff all 10, because they’re just going to have to train people constantly.
So, basically, what we can say is, look, we’re going to be this kind of staffing agency, in the sense that we’re going to charge you a recurring fee. We’re not going to charge you a big CapEx up front.
EL KALIOUBY: CapEx, as in capital expenditure – the big upfront costs for things like equipment upgrades. Typically these kinds of robots are sold as a capital expense.
BHAGERIA: And I think that’s a notable point. A lot of automation is big CapEx purchases. And we don’t do that. You’re gonna pay us a yearly recurring fee, which is less than the cost of people. So you’re paying for us just like you pay your people. And basically you can run production now. You can go from running 8 lines at a time to now running 10 lines at a time. Which is a huge ROI for them.
EL KALIOUBY: You’re not necessarily replacing human jobs. There is a shortage of labor in this particular application, and you’re stepping in with robots to fill that gap. A lot of people’s fears and concerns around AI are about AI taking jobs.
BHAGERIA: I think that’s exactly right. And more generally, I would just say that automation is not a new idea. AI is a new idea, but automation is not. Like, the tractor is automation, right?
The dishwasher is automation. The steam engine – it’s all automation in some regards. And it’s always created a lot more jobs in aggregate than it’s taken away. Because what happens is that the cost of goods goes down, businesses burgeon, the economy swells, and a lot more jobs are created in aggregate than are taken.
EL KALIOUBY: So, we’ve talked about why Chef Robotics came to be, and how its services can help food businesses. But what does putting these robots to work actually look like?
That’s after the break.
[AD BREAK]
What deployment looks like on a live food production line
So, set the scene for us. Take us to one of these food manufacturers. What does it look like? And how does the robot come into play?
BHAGERIA: Let me describe the status quo, and then we can talk about how Chef fits in. So the status quo is basically these long, big assembly rooms. You might have two conveyor lines, you might have 40 conveyor lines, based on the volume that the customer might be doing. On each assembly line, basically, you have stations.
Each station has a person, and the person has a big tub of food. That might be rice, carrots, diced green beans, whatever ingredient it might be. And they’re basically scooping ingredients from the source container into individual customer containers as they move down the line. The customer containers might be frozen meals.
They might be fresh meals, yogurt parfaits, sandwiches, wraps, burritos, whatever kind of food the customer might be making that day. And so that’s the status quo, right? And just to be clear, these are high-mix customers. What I mean by that is that line one isn’t just doing the same meal over and over again – they’ll change over the line.
So they might do 10,000 trays of Cobb salad and then they’ll change over the line and they’ll do a Caesar salad, for example, right? Okay. So what we do is we say, okay, look, you know, line eight over there is not running because of lack of people or we’ll find opportunities to help, right.
And we make these AI-enabled robot modules. We call them modules in the sense that they are modular. Each module takes up the same footprint as a person, and they’re on wheels or casters, so you can just slide them onto the line – there’s no retrofitting required.
The only inputs we need are really compressed air and power, which every facility in the food world has. At that point, basically what you do is you say, here’s the meal I want to run. Let’s say I’m running the Caesar salad. Then you might say, here’s the ingredient I want to run.
Let’s say it’s the leafy green. And then you select the portion size. At that point, Chef will query the relevant policy – the AI policy for how to manipulate that leafy green. And of course, every leafy green is a little bit different, but it’ll get the relevant one. And it’ll ask you to load the pans of food, the pans of leafy green.
It’ll also ask you to load a relevant utensil. Each policy has an associated utensil, which is kind of similar to a human utensil. So you attach a utensil, and then you press play. That entire process, by the way, takes all of a couple minutes. It’s very quick.
And at that point, you know, Chef will detect the trays as they move down the line, and then it’ll place onto those trays as they move down the line. So we use computer vision to detect where the trays are, and that allows us to adjust for: oh, the tray’s rotated, or there’s no tray, or the conveyor’s accelerating or decelerating, or the conveyor’s slanted, or the conveyor’s skewed.
Like, there’s all these variations you see in the physical world, and we can be robust to all of that using that process.
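The kind of correction described here – compensating for a rotated tray and a moving belt – can be pictured with a small geometry sketch. This is a hypothetical Python illustration; the function name, the compartment offset, and all numbers are invented, not the actual Chef OS math:

```python
import math

def placement_point(tray_x, tray_y, tray_angle_deg,
                    conveyor_speed_m_s, arm_latency_s,
                    compartment_offset=(0.10, 0.05)):
    """Predict where a tray compartment will be when the arm arrives.

    The detected tray pose is corrected for the tray's rotation, and the
    target is advanced along the conveyor (the x axis here) by however far
    the belt carries the tray during the arm's motion latency.
    """
    dx, dy = compartment_offset
    a = math.radians(tray_angle_deg)
    # Rotate the compartment offset by the tray's detected rotation.
    rx = dx * math.cos(a) - dy * math.sin(a)
    ry = dx * math.sin(a) + dy * math.cos(a)
    # Advance along the conveyor by speed * latency.
    return (tray_x + rx + conveyor_speed_m_s * arm_latency_s, tray_y + ry)

# An upright tray at (1.0, 0.0) m on a 0.5 m/s belt, with 0.2 s arm latency:
x, y = placement_point(1.0, 0.0, 0.0, 0.5, 0.2)
```

A real system would get the tray pose and belt speed from the vision pipeline rather than as hand-typed arguments, but the correction step is the same idea.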
EL KALIOUBY: Hold on. I want to pause you for a second, because I think this point is really important. Automation in the food industry has been around forever. What is different about your approach is that you’re harnessing AI – and we’re going to go under the hood in a second to talk about it.
Like, what exactly are you leveraging in terms of data and models, but for now, you’re using AI and that’s giving you flexibility. You don’t have to have a different robot for lettuce and another one for carrots and another one for ice cream or whatever. So can you talk about that?
I think this flexibility and the use of AI is really key to explain.
BHAGERIA: Traditional automation you can think about as a single actuator with maybe a very simple sensor, like a phototransistor. It’s a very simple, mostly hardware system.
So it’s very good at doing the same thing over and over again. An analogy might be: you can make a dispenser work for, let’s say, diced tomatoes. Great. But now if you cut your tomatoes a little bit differently, it’s going to get clogged up. Right? And it can place into a single kind of tray – if you change the kind of tray, it won’t work. It’s fine-tuned and hard-coded for one kind of conveyor. If you change the conveyor, it doesn’t work. If the conveyor is a little bit farther away, it doesn’t work. You know, these ideas, right? It’s basically a very custom piece of automation that works fine for a certain subset of customers.
Like, if I’m Kraft Heinz making ketchup bottles, and I’m going to be making 100 million ketchup bottles, of course I get a dedicated custom line that just runs ketchup bottles all day long. And I get a custom machine just for that. And it works great. For our customers, they have 300 different SKUs. 500 different SKUs.
In other words, products – all these different kinds of meals that you and I might want. And because of that, they don’t have custom lines for each of those products. They have flexible lines that change over. So flexibility is key, and traditional automation is not very flexible. So I think the point you made is exactly right. You could solve the high-mix problem by having custom machines for each ingredient, each portion size, each tray, each placement within the tray, each conveyor – it just goes on and on. So there are all these permutations. Our vision – what we’re telling customers, and what we’ve productionized – is a single piece of hardware.
EL KALIOUBY: It’s an arm with like some utensils at the end?
BHAGERIA: Yeah, exactly. And what’s nice about that is you can pick really any ingredient, at any portion size, into any container, into any placement within the container, on any conveyor, using purely software and different utensil inserts. So it’s just a lot more scalable and flexible because of that.
How food data can unlock expansion and quality control
EL KALIOUBY: Yeah. Amazing. So you gave us the example of the large scale kind of food manufacturer. What are other areas where you’re applying Chef Robotics?
BHAGERIA: Right now our bread and butter is really these kind of high mix customers. And I think the areas that we’re excited about within that is really focusing on unfrozen, fresh meals, airline catering, school meals, hospital meals, things like that, right? Where we want to expand from there is kind of two ideas. Idea number one is horizontally to other sectors in the food industry. And idea number two is vertically. So let me talk about horizontal first. Horizontal would basically be, you know, right now we’re kind of working with these high volume customers, right?
They’re doing, let’s say, 10,000 meals a day. And so robots are really good in those kind of environments. Where we’d like to go to next is lower and lower volume environments. Imagine like a fast casual restaurant, like let’s say Cava or Sweetgreen or Chipotle. If you think about it, they basically have a lunch rush and a dinner rush and that’s like four hours, right? So to generate an ROI, right, you basically have to generate that ROI in four hours.
Which is very tough. So, your robot has to be very generalized. And not only does it have to generate an ROI in four hours, within those four hours, it’s got to do 90 to 100 ingredients. Like, if a robot just does leafy greens for Sweetgreen, that’s not useful. It’s got to do everything. So, essentially the thinking for Chef is that food manufacturing is high volume.
We’re running the robot 16 hours a day. Right, so there’s a very compelling ROI. And each robot, at any given moment, is doing a single ingredient. Now, throughout the day, I might do five to six ingredients, but it’s a set of ingredients. And so our thinking is that by shipping robots here, we’re collecting a lot of valuable training data on how to manipulate food. And over time, we can use that training data to make a more generalized food manipulation model, which then allows us to go to lower and lower volume environments with the ultimate goal of really being able to do things like fast casuals, ghost kitchens, things like that. So that’s kind of how we think about scaling horizontally, right?
EL KALIOUBY: So imagine robots behind the counter preparing your lunch at Chipotle and Sweetgreen. That’s a goal for Chef Robotics.
But Rajat doesn’t just want to grow into different markets. He sees a lot of potential value in all of the data they’re collecting about food production.
What about the vertical integration or the vertical roadmap?
BHAGERIA: Yeah, the vertical idea is also interesting, which is that right now what we found in the food industry is there’s actually relatively little software and data that these companies are capturing.
Some of them have more than others, but it’s quite a bit more manual than you might think. So I’ll give an example. From a QA perspective, they’ll randomly sample meals at the end of the line. And it’s like they’ll do it, you know, X times per hour, which is cool. It’s useful. But we are measuring the weight of every single meal, first of all.
And we’re doing that on the ingredient level. Not the full meal level, but every single ingredient we’re measuring. Okay, we’re doing the same thing for placement. We’re measuring every single metric of how the placement aesthetics look. We have a lot of metrics around throughput and, you know, basically food wastage.
There’s all this data we’re collecting. And I think over time, what we’d like to be able to do is leverage that data and help build tools for line monitoring, placement QA, pick weight QA, because at the center of food is really these metrics on quality, right?
Quality, throughput, things like that. So I think there’s a lot of opportunity where we can basically over time become more of the operating system of how these companies work from a software perspective. And, you know, you can imagine also software about scheduling and line planning, right?
Optimized for robots, optimized for robots and humans, for example.
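As a concrete picture of what the pick-weight QA tools mentioned here could look like, here is a minimal Python sketch. The spec table, tolerances, and function are all invented for illustration, not real Chef Robotics data:

```python
# Toy per-ingredient QA: every pick is weighed against spec limits, and a
# tray is flagged as soon as any single ingredient is out of spec, rather
# than random-sampling finished meals at the end of the line.
# Target weights and tolerances below are invented for illustration.

SPEC = {
    "rice":        {"target_g": 150.0, "tol_g": 10.0},
    "carrots":     {"target_g": 60.0,  "tol_g": 6.0},
    "green beans": {"target_g": 50.0,  "tol_g": 5.0},
}

def first_out_of_spec(picks):
    """Return the first out-of-spec ingredient, or None if all pass.

    `picks` is a list of (ingredient, measured_grams) in assembly order,
    so a bad tray can be pulled before later ingredients are wasted on it.
    """
    for ingredient, grams in picks:
        spec = SPEC[ingredient]
        if abs(grams - spec["target_g"]) > spec["tol_g"]:
            return ingredient  # pull this tray off the line now
    return None
```

With these made-up numbers, `first_out_of_spec([("rice", 152.0), ("carrots", 48.0)])` would flag `"carrots"` before green beans are ever added to that tray.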
EL KALIOUBY: I think that’s really cool because QA or quality assurance today, as you say, isn’t happening with every single meal being served, but you can do just in time QA, right? Because you’ve got all the data anyway. So that’s really awesome.
And you can interject, you know, if I don’t know, I’m going to make this example up, you tell me if it makes any sense. But say we’re putting, you know, some rice on and then some avocado, and then we’re going to put like chicken, whatever. And if it messes it up in the avocado step, you don’t have to wait until you’ve put the chicken and all the other stuff.
You can intervene just in time and say, okay, this one doesn’t work. Take it off the line. Does that make sense?
BHAGERIA: That totally makes sense. And actually we do more or less exactly that. And further, the thing with food is that it’s a lot more variable than might meet the eye.
I’ll give you an example. Every batch of a very simple ingredient like rice is a little bit different. So if you cooked it a little bit longer, a little bit less long, if you put a little bit too much oil, less oil, it changes the stickiness, the density.
And it’s not obvious for humans and they are often actually very inconsistent because of that. So I think one thing we have found is that we can actually, because we’re measuring in real time, adjust on the fly, right? If we find that today, the rice is overcooked, undercooked, too oily, less oily, whatever it might be, whatever dimension it might be, we can adjust to that, which allows for much better consistency, which ultimately allows for way less food wastage.
Which ultimately actually allows for much better ROI to our customers. You can also imagine in aggregate, if you have thousands and tens of thousands of robots, in aggregate that can actually have a pretty big impact on food wastage globally too.
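The on-the-fly adjustment Bhageria describes can be illustrated with a simple proportional update: if today’s batch picks heavy, nudge a scoop parameter down; if it picks light, nudge it up. This is a hypothetical sketch with an invented gain and weights, not Chef Robotics’ actual control algorithm:

```python
# If today's batch runs heavy (stickier, denser rice), nudge the scoop
# depth down; if it runs light, nudge it up. A bare-bones proportional
# controller to illustrate closing the loop on measured pick weight.

def adjust_scoop_depth(depth_mm, target_g, measured_g, gain=0.05):
    """Return a new scoop depth after one weighed pick."""
    error_g = target_g - measured_g   # positive -> the pick was too light
    return depth_mm + gain * error_g  # deepen the scoop if under target

depth = 30.0
for measured in [135.0, 142.0, 147.0]:  # picks trending toward a 150 g target
    depth = adjust_scoop_depth(depth, 150.0, measured)
```

Because every single pick is weighed, this kind of correction can happen continuously rather than waiting for an end-of-line sample to reveal a drift.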
EL KALIOUBY: But there are so many variables in the food industry that go beyond cook time and amount of oil. For example, produce isn’t consistent. Something like a blueberry can come different sizes, plus they’re quite delicate.
So how does a robot even pick up blueberries without squishing them? Stick with us to find out.
[AD BREAK]
So, these robots can grab ingredients, but no ingredient is the same. For example, a cup of blueberries and a handful of kale need to be picked up differently. You don’t want to squish the blueberries and you can’t exactly scoop kale.
So how do Chef Robotics’ robots navigate this problem?
What the robots see and how they handle delicate ingredients
Let’s break that down. So what is the robotic arm seeing? What’s the sensor? What are its senses? Start there. Yep.
BHAGERIA: Yep. So, great question. So we use two RGB-D cameras.
EL KALIOUBY: This is a kind of camera that can help make 3D images.
BHAGERIA: One is pointing at the ingredient pans, and one is pointing at the conveyor. So we mesh those together into a full point cloud, and now we have a good sense of the scene we’re seeing.
So those are the main inputs. We also have a scale – literally a weigh scale – underneath the hotel pans themselves. So, essentially you’ve got to pick and you’ve got to place. From a picking perspective, the job is like, okay, how do I figure out where to pick from? Remember there’s like hills and valleys. There’s like mountains, basically.
EL KALIOUBY: Lots of blueberries.
BHAGERIA: Yes, exactly. And they’re not a flat, uniform surface. So first of all, where do you pick, but also how do you pick to make sure you don’t crush the blueberries, to not have spillage. So that’s what’s encoded in the pick parameters of this policy. We do have different utensils, and these utensils are actually quite cool. We have some really great mechanical designers who design utensils that can pick without damaging the material, which is pretty cool, right? And on the placing side, same idea. So now you have to detect and track the containers – it’s very similar technology to detection and tracking in a self-driving car – and then place into the relevant compartment or quadrant of a tray.
And there are similar parameters. So one parameter that we have is this idea of dwell time, which is this very simple idea. It’s this idea that if you have a very sticky material, imagine like peanut butter, it takes peanut butter some time to physically fall down. So you have to basically track the container for a little bit longer to make sure that peanut butter fully makes it to the container.
For example, right? So there’s all these parameters basically we have that allow us to successfully pick and place an ingredient while maximizing for all these different dimensions I talked about.
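One way to picture such a policy is as a small bundle of pick and place parameters, including the dwell time just mentioned. This is a hypothetical Python sketch – the field names, utensils, and values are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class IngredientPolicy:
    """Toy bundle of the per-ingredient parameters discussed above."""
    ingredient: str
    utensil: str               # each policy has an associated utensil
    pick_force_limit_n: float  # cap the pick force so berries don't crush
    dwell_time_s: float        # sticky foods take longer to leave the utensil

# Invented example policies: delicate berries vs. a very sticky ingredient.
POLICIES = {
    "blueberries":   IngredientPolicy("blueberries", "soft scoop", 2.0, 0.1),
    "peanut butter": IngredientPolicy("peanut butter", "spoodle", 8.0, 1.2),
}

def release_tracking_time(policy, base_s=0.2):
    """How long to keep tracking the container during release."""
    return base_s + policy.dwell_time_s
```

Under these made-up numbers, the robot would track a container for about 1.4 seconds while releasing peanut butter but only about 0.3 seconds for blueberries.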
EL KALIOUBY: So what’s something that the robot is not ready to do yet, but you’re actively working on?
BHAGERIA: I think one that’s been tough is we call it big bulky items. So the way you can think about this is, think about, like, a very big broccoli floret, right? Let’s say it’s a broccoli floret, like, this big. And you have, like, you want to pick up, like, a bunch of them, right?
Yeah. Well, let’s say the target’s, like, you want to pick up, like, 150 grams of broccoli for whatever reason. Maybe it’s, like, a party tray or something. Well, if you just pick up, you’re probably going to get one piece too much or one piece too little. And what a human does is looks at that and they say, ah, let me remove the big piece and put a small piece in.
And so the robot, once it picks up, it knows that it got too much or too little, but then how do you edit? Right? So we’re trying to build in functionality to edit and then figure out how we can kind of do these big bulky things. Because the thing is, if you get one piece too much broccoli, then your weight is super high.
If you get one piece too little, then you’re super low. And you of course can’t just slow down, because then the whole line is gonna get slowed down. So that’s a more complex problem that we’re working on right now.
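The “edit” step Rajat describes, swapping a too-big piece for a smaller one to get closer to a weight target, can be sketched as a greedy search over single-piece swaps. Everything here (`adjust_portion`, the gram values) is hypothetical, not Chef Robotics’ actual algorithm:

```python
def adjust_portion(pieces, target_g, pan_pool):
    """Greedy single-swap edit: if the picked pieces over- or undershoot the
    target weight, try exchanging one piece for a better-sized one from the
    hotel pan. `pieces` and `pan_pool` are lists of piece weights in grams."""
    best = list(pieces)
    best_err = abs(sum(best) - target_g)
    for i in range(len(pieces)):
        for candidate_piece in pan_pool:
            candidate = pieces[:i] + [candidate_piece] + pieces[i + 1:]
            err = abs(sum(candidate) - target_g)
            if err < best_err:
                best, best_err = candidate, err
    return best, best_err

# Picked three florets totalling 175 g against a 150 g target:
picked = [70, 60, 45]   # one piece too heavy
pool = [20, 35, 55]     # remaining florets still in the pan
better, err = adjust_portion(picked, 150, pool)
```

A human does this visually in one glance; for the robot it becomes a search problem, and the hard part in practice is doing it without slowing the line down.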
EL KALIOUBY: Yeah. So cool. What’s something that you’re not even trying to have the robot do?
BHAGERIA: We find a lot of weird ingredients that you might not think about. So like lasagna sheets, right? There are literally big trays, and people are just taking sheets of lasagna to build the dish, and it’s quite complex, right?
And they’re kind of stacked up one on top of each other, sticking to each other. That is quite tough. No lasagna. We also see a lot of noodles, like long stringy noodles. That’s actually quite tough as well at the moment. We can pick noodles, but all these—
EL KALIOUBY: It slips, right? Oh, you can?
BHAGERIA: Yes. Well, this is what I mean. It’s like, it’s such a hard optimization. Like you look at Chef and it seems simple in some regards on the surface. Oh yeah, you’re just picking and placing ingredients. But the thing is that you can pick noodles, but can you pick very consistently?
Do you spill? I mean, when you place, do you get like some of the strings on top of the tray? Now it’s ugly. It doesn’t look aesthetically nice. So there’s all these dimensions you have to kind of optimize for, which is where the tough part is. So we can pick, but we’re not very consistent.
So now, you know, there’s work that we are thinking about to optimize consistency and get it to a production ready state, basically.
Designing robots that people can actually work alongside
EL KALIOUBY: So I spent my entire career thinking about human-machine interfaces and what that could look like in the future. And I’m especially interested in this idea of a cobot, a collaborative robot that’s meant to work alongside and interact with humans.
So for Chef Robotics, have you kind of thought about how these robots are going to coexist and work alongside other humans on the factory floor?
BHAGERIA: Sure. Yeah, absolutely. And actually, I think it’s a really important question that I would say a lot of folks don’t think about.
Broad strokes? Why is this important? Well, I think there’s a lot of robotics companies that historically have built really cool technologies. But I think the usability and the human machine interaction is really what takes the technology into a product. And then ultimately, I think a solution.
I’ll give you a good example of why this matters for us and why we spend so much time thinking about it. So, you know, oftentimes our users, not the customers, not the people writing the checks, but the people who are using the robots on a day-to-day basis, are oftentimes non-technical, right? Most of the time.
And, you know, I think because of that, it’s really forced us, honestly, to make an extraordinarily simple system that also feels nice to use. So now, for example, you can see a 3D view of exactly what the robot sees, right, like what trays are going down the line. Sometimes the tray might be too far away, and the robot might not attempt to place, and then you understand, oh, okay, that’s why: the tray was too far away. How do we really optimize the robot to be helping them do their job in a very simple way, as opposed to detracting or making it hard for them?
EL KALIOUBY: So, I want to flip it on its head a bit. Do you think there are any skills that we as humans need to learn as more and more robots become the norm? We’ll be surrounded by robots doing all sorts of stuff. So what kind of skill sets do you think we should learn?
BHAGERIA: It’s a great question. I think humans have a very, very, very high bar for robots.
What I mean by that is like, I think people assume that if it’s a robot, it’s gonna be perfect. Right. Like we’ve had customers where they’re like, hey, you’re using AI, you’re using robots. It should be perfect consistency. You should not have a single gram of standard deviation. It should be basically nothing. And you see this in self driving cars too, right? It’s like, people have all these accidents all the time, but as soon as one AV gets into an accident, the entire world kind of is upset, right? So the bar is just very high, and I think some empathy that machines and robots aren’t perfect, but they can still be better than the status quo, is an important one, I would say.
EL KALIOUBY: I love that. Bring empathy to how we deal with robots. How cool is that? Love that thought. Yeah. Yeah. That’s great.
BHAGERIA: I think another one that comes to mind is this idea that faster is always better. Let’s just go fast, as fast as we can. But oftentimes, what that leads to is you kind of accelerate, and then, you know, your line is down because your people can’t catch up, or your machines can’t keep up, and then you accelerate again, and you slow down, and then at the end of it, your average production is actually much lower.
Whereas, like, steady production is actually much more efficient. It’s better for people. It’s like line balancing, right? And load balancing is a lot better. And while that’s specific to manufacturing, there’s a more general concept that I think applies, which is this idea that if you try to optimize a process and you don’t alleviate the bottleneck, you won’t actually solve anything. Maybe a simple example I can give: let’s talk about the Chipotle line. The constraint, if you think about it, is oftentimes the assembly. It’s oftentimes not the cooking or prepping.
So if you automate the cooking or prepping, using robots or really any process, great, that might be useful, but you probably didn’t actually solve anything because you didn’t alleviate the bottleneck. So it’s really important in any process to alleviate the bottleneck, and you shouldn’t automate a process before you’ve optimized and simplified and potentially even removed the process.
I think that’s a manufacturing concept, but it generally applies to day-to-day life.
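The bottleneck idea can be made concrete with a toy model: a serial line’s throughput is the minimum of its stage rates, so speeding up a non-bottleneck stage changes nothing. All stage names and numbers below are illustrative, not figures from the episode:

```python
def line_throughput(stage_rates):
    """Throughput of a serial production line is capped by its slowest
    stage (the bottleneck), no matter how fast the other stages run.
    Rates are in trays per minute."""
    return min(stage_rates.values())

line = {"prep": 40, "cook": 35, "assembly": 12, "seal": 30}
before = line_throughput(line)  # assembly is the constraint

# Automating cooking doesn't help while assembly is still the bottleneck:
line["cook"] = 100
assert line_throughput(line) == before

# Alleviating the actual bottleneck does help:
line["assembly"] = 25
after = line_throughput(line)  # now limited by the next-slowest stage
```

This is the Chipotle-line point in miniature: doubling or tripling a non-constraint stage leaves overall output unchanged until the assembly step itself gets faster.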
What it takes to scale a flexible robotics platform
EL KALIOUBY: So earlier this year you raised a fresh round of funding. Congratulations. Thank you. Share a little bit. How are you planning to use this capital and what are some of the next exciting milestones for Chef?
BHAGERIA: Yeah, I think a lot of the funding is really going towards making our product even better. The reason I say this, by the way, is that the approach we’ve taken is very much kind of finding these fairly large customers and really trying to make them very happy so that we can scale within them. So it’s more of a land-and-expand business model, as opposed to let’s get dozens and hundreds of different customers that we have to make happy.
Most of our customers have multiple plants. Each plant has opportunity for dozens of robots. But they’re of course only gonna do that if the product really works. So a lot of the financing is going towards making even better models to do more ingredients, be more consistent, spill even less, have faster throughput.
Basically, like, there are these six or seven metrics that matter for Chef. It’s not too hard to think about, right? If you think about manufacturers, they care about throughput, they care about quality. They care about waste. There are these few metrics we have to optimize for, and we really maniacally optimize for them, while also maximizing our flexibility.
Because like you said at the very beginning, you can’t have a machine that just does only that thing. It’s got to do all these things, but also be very flexible. So a lot of the capital is really going towards building out the AI team to continue making the products even more flexible.
EL KALIOUBY: So exciting. Well, thank you so much for this conversation. Fascinating work and very exciting.
BHAGERIA: Yeah, thank you, Rana. This was awesome, thanks for having me on.
EL KALIOUBY: Rajat, like me, is a techno-optimist. His work adds a lot of nuance to the conversation around AI and jobs.
Like Rajat, I truly believe that AI will not only create future jobs, but also solve for the labor shortage crisis in some industries – like manufacturing.
But there is so much potential for embodied AI beyond the factory floor. We’re so excited to share with you conversations about AI-powered robots — in other industries and even in your home.
We want to hear from you. What’s an area of work that you would outsource to AI?
Episode Takeaways
- Rana el Kaliouby opens with a familiar frozen-meal moment, then reframes it as a hidden engineering marvel shaped by AI, robotics, and the messy variability of food.
- Chef Robotics founder Rajat Bhageria explains embodied AI as the fusion of sensing, decision-making, action, and learning, bringing software intelligence into the physical world.
- Rajat says Chef Robotics zeroed in on food assembly after hearing the same pain point again and again: severe labor shortages that leave factory lines understaffed and revenue untapped.
- Inside the factory, Chef’s modular robots slide onto existing lines, use computer vision and specialized utensils, and flex across ingredients, trays, and conveyors with software-driven precision.
- Looking ahead, Rajat sees Chef Robotics becoming both a robot staffing partner and a food data platform, improving quality, reducing waste, and expanding toward restaurants and other kitchens.