Designing a Food Delivery Robot
#tech
Sebastian Aarnio


Food delivery robots have been roaming the streets of several Finnish cities for a while now. In this blog post, our Software Developer Sebastian embarks on a fun thought experiment: what if you were to build a food delivery robot of your own? What are some of the considerations that such an undertaking would entail? And would the little bugger have a cute name like Bell-E?

So you want to create a commercial food delivery robot? That’s quite a task. In this article, we will break down some of the aspects such a project involves: from technical challenges and new design perspectives all the way to the social impact of your decisions.

First, we need a catchy name; otherwise, your product will face an uphill battle. Let’s go with Bell-E. Disclaimer: if you’re taking this one into production, we suggest checking with Disney’s lawyers if you don’t want Mickey knocking on your door.

Clearing the obstacles

Your robot needs not only to achieve its mechanical goal of getting from point A to point B, but also to take notice of the people around it – something that has historically had to happen the other way around. In a controlled setting such as a factory, it may be realistic to train employees to work around machines safely. As robots enter our day-to-day lives, that responsibility shifts onto us developers and designers. The safety considerations are broad, as humans and real-life traffic can be unpredictable.

The easiest way to get started is to create a wheeled robot. Wheels are ideal on roads, and walking is one of the most complex tasks to get right in robotics. Wheels are a tad boring though, aren’t they? So let’s combine the best of both worlds by installing wheels on some legs. This lets the robot deliver to places wheels alone can’t reach – up a kerb or a short flight of steps, for example.


With the recent hype around the rockstar known as AI, it is important to take a step back and assess the best choice of technology for each task, rather than blindly following the trends.

AI is most certainly the best tool for several tasks your new food delivery friend needs to perform, although it is still only one of several tools in our belt. Common advice is to start simple and incrementally add shiny tools like machine learning models where traditional techniques fail. Object detection, for instance, may be an area where a machine learning model outshines hand-written image-processing rules, while many sections of the robot's control flow can be implemented with traditional techniques. Tesla’s latest version of its in-development Full Self-Driving feature, however, uses an end-to-end neural network, which means it operates more like a black box, with far less manually written software.
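
To make that split concrete, here is a minimal sketch (in Python, purely illustrative) of traditional control flow wrapped around an ML perception step. The state names, the Detection shape and the two-metre threshold are all invented for the example; the idea is that the model only answers “what is around me?” while plain, auditable rules decide what to do about it.

```python
from dataclasses import dataclass
from enum import Enum, auto


class State(Enum):
    DRIVING = auto()
    WAITING = auto()
    DELIVERING = auto()


@dataclass
class Detection:
    label: str         # e.g. "person", "car", "stop_sign", as reported by the model
    distance_m: float


def next_state(state: State, detections: list[Detection], at_destination: bool) -> State:
    # The ML model only answers "what is around me?"; these plain rules
    # decide what the robot does about it, and they are easy to audit.
    if any(d.label == "person" and d.distance_m < 2.0 for d in detections):
        return State.WAITING              # someone is too close: stop and wait
    if at_destination:
        return State.DELIVERING
    return State.DRIVING
```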

Machine learning often takes inspiration from how life on Earth works. Let’s see how we can do the same for your robot.

Decision-making on the road

We have all seen the growth of AI, especially large language models (LLMs) such as the widely known ChatGPT. While you can train your own AI from scratch, there are ways to play existing models to their strengths, even though your robot will have next to nothing to do with text or language.

Think about the more creative tasks you might use ChatGPT for – such as coming up with ideas or explaining things. Sometimes a robot will find itself in unexpected situations, and you could use an LLM to boil down a complex situation into simpler instructions your robot can follow. The robot can send images of its surroundings – and get back instructions which it can then process. Examples include deciding how to respond after getting into an accident, or working out which door leads to the right customer.
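
As a rough sketch of that escalation path, the robot could send a snapshot plus a one-line situation report to a hosted LLM and ask for a few simple numbered steps in return. The helper below uses the OpenAI Python SDK as one concrete example; the model name, the prompt and the surrounding plumbing are illustrative assumptions rather than a recommendation of any particular provider.

```python
import base64

from openai import OpenAI  # assumes the official OpenAI Python SDK is installed

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def ask_for_instructions(image_path: str, situation: str) -> str:
    # Send a camera snapshot plus a one-line situation report and get back
    # a short numbered list of steps the robot's planner can parse.
    with open(image_path, "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode()

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder: any vision-capable model would do
        messages=[{
            "role": "user",
            "content": [
                {"type": "text",
                 "text": f"A delivery robot reports: {situation}. "
                         "Reply with at most three short numbered steps it should take."},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"}},
            ],
        }],
    )
    return response.choices[0].message.content


# e.g. ask_for_instructions("front_camera.jpg", "tipped over next to a bike rack")
```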

For more mundane situations, you can use classic object recognition to detect objects like cars and, importantly, street signs. You can then have your own AI running in your robot's little brain, working from a simple low-resolution camera feed, street sign and vehicle data, and the instructions you have prepared ahead of time or gathered from an LLM. This allows you to react to changes in your environment with near-instant response times. The local processing also doesn’t need an internet connection – essential whenever the robot wanders out of network coverage.
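
That local, fast lane could be as simple as running a small off-the-shelf detector on every low-resolution frame and reacting to whatever it reports. The YOLO model below is just one convenient stand-in for “a lightweight detector that fits on the robot’s own hardware”; the printed reactions are placeholders for your real control flow.

```python
import cv2                    # OpenCV, used here only to grab camera frames
from ultralytics import YOLO  # one example of a small, locally runnable detector

model = YOLO("yolov8n.pt")    # "nano" model: small enough for embedded hardware
camera = cv2.VideoCapture(0)  # the robot's low-resolution front camera

while True:
    ok, frame = camera.read()
    if not ok:
        break

    # Runs entirely on the robot: no network round-trip, millisecond-scale latency.
    result = model(frame, imgsz=320, verbose=False)[0]
    labels = {model.names[int(c)] for c in result.boxes.cls}

    if "stop sign" in labels:
        print("Stop sign ahead: braking")              # hand over to the control flow
    if "car" in labels or "person" in labels:
        print("Traffic nearby: slowing down and yielding")
```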

Now there’s a fast lane for the quick decision-making that is crucial on the road, and a precise slow lane for the decisions that need it. This slow but precise reasoning is an active area of research and is likely to improve in future LLMs. This fast and slow thinking, popularised by Daniel Kahneman’s book Thinking, Fast and Slow, just might be the key to solving more complex problems than we are currently capable of.
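
One way to picture the two lanes living side by side is a pair of loops sharing state: a fast loop handling reflexes from local sensor data, and a slow loop occasionally consulting a remote model about whatever the fast loop couldn’t handle. The sketch below is a bare-bones illustration of that split using Python’s asyncio; the shared dictionary and the canned “plan” stand in for real sensor data and a real LLM round-trip.

```python
import asyncio


async def fast_lane(state: dict) -> None:
    # Reflexes: runs every ~50 ms on local sensor data and never blocks.
    while True:
        if state.get("obstacle_close"):
            state["command"] = "stop"
        await asyncio.sleep(0.05)


async def slow_lane(state: dict) -> None:
    # Deliberation: asks for help with whatever the fast lane couldn't handle,
    # without ever holding the fast lane up.
    while True:
        if state.get("stuck"):
            # Placeholder for the LLM round-trip sketched earlier.
            state["plan"] = await asyncio.to_thread(
                lambda: "reverse half a metre, then re-route"
            )
            state["stuck"] = False
        await asyncio.sleep(1.0)


async def main() -> None:
    state = {"obstacle_close": False, "stuck": True}
    await asyncio.gather(fast_lane(state), slow_lane(state))

# asyncio.run(main()) runs both lanes side by side until interrupted
```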


If you have money burning a hole in your pocket, feel free to train your own neural network – it can give you better results, as you can take advantage of all kinds of inputs specific to your robot. Using existing solutions is often a good way to start the journey, though, so that you haven’t already spent millions before even having a prototype.

The human factor

When designing a robot that will move amongst pedestrians and cars, you need to think not only about how the robot will respond to other humans but also how the humans respond to the robot. The field of human-robot interaction focuses on how this two-way interaction flows. Adding human elements without entering the Uncanny Valley is a great way to spark a connection with humans. So, let’s add some large eyes to your robot for a hint of cuteness. Rounded features can also add a level of softness we humans tend to like.

To add to the two-way interaction, you can, for instance, make the eyes move to wherever your robot’s cameras are focusing, resulting in the robot visibly looking around, scanning the nearby humans and objects. Keep in mind that humanisation like this can also easily backfire if your robot meets someone who isn’t so eager to interact with our inevitable successors 🤖
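
The animation logic for that can stay tiny: map the cameras’ current point of interest to a small pupil offset on the eye display. The coordinate conventions and the 12-pixel travel limit below are arbitrary assumptions for illustration.

```python
def eye_offset(focus_x: float, focus_y: float,
               frame_width: int, frame_height: int,
               max_travel_px: int = 12) -> tuple[int, int]:
    """Map the cameras' point of interest (in image coordinates)
    to a small pupil offset on the robot's eye display."""
    # Normalise to -1..1 around the image centre (image y grows downwards).
    dx = (focus_x / frame_width) * 2 - 1
    dy = (focus_y / frame_height) * 2 - 1
    return round(dx * max_travel_px), round(dy * max_travel_px)


# A person detected at pixel (500, 200) in a 640x480 frame:
# eye_offset(500, 200, 640, 480) -> (7, -2), i.e. pupils glance right and slightly up
```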

Keep your design simple and likeable. If it looks cute and soft, and children want to pet it rather than start crying at the sight of it, you can pat yourself on the back.

Impact

It is also our responsibility as an industry to understand our work’s impact on the world around us. For example, food delivery robots are undoubtedly taking away courier jobs. One day, there will likely be heated discussions over possible government regulations on what robots are allowed to do. Perhaps we’ll see a robotics tax?

Current robotics trends seem to be focused on dangerous and ergonomically challenging tasks like handling pressurised or hot items, or lifting heavy objects. What the future holds will likely be both fascinating and a little bit eerie to witness. The best we can do as individuals working in tech is to do our small part in nudging the ship towards what we believe to be the most ethical direction.

In the meantime, our Bell-E will hopefully stay in its lane in traffic and bring smiles to hungry customers’ faces wherever it goes.

About the author

Sebastian Aarnio

A software developer who likes staying on top of the latest technologies, specialising in UX, software architecture and rendering.
