Future Tech 2024: An Interview with Dr Robert Fitch (Australian Centre for Field Robotics)

In this Future Tech interview, we’re continuing our discussion with Dr Robert Fitch, a leading research scientist in the area of autonomous field robotics and a senior research fellow with the Australian Centre for Field Robotics (ACFR) at The University of Sydney. He is interested in the science of how to make complex decisions under uncertainty, applied to key robotics problems in commercial aviation, agriculture, and environmental monitoring. (In Part 1 of the interview we talked about the design of autonomous robots, social robots, transformer robots, robotic sensors and the genesis of Google’s self-driving car.)

Infrastructure Planning for Robot Cars

Shara Evans (SE): One thing that I’ve been pondering in this whole area of robot cars, is: do we also need sensors laid along the roadways, or is it better to just have the sensing apparatus in the vehicle itself, or the third choice would be the combination — is that really the best of both worlds? Do you have a view on that, Robert?

Robert Fitch (RF): Well, the idea of changing the environment, or as we say, instrumenting the environment, with something that will help the robot to know where it is, is useful to the robot. It makes the problem easier in some senses, but it also limits the usefulness of that robot. If you want to take it and move it to an environment that hasn’t been modified in that way, then you’re in trouble. It’s really better for the robot itself to have all the technology required to allow it to cope with the environment that’s at hand, rather than changing the environment to suit the robot.

Self-driving cars are a good example. There are previous projects in California where they embedded technology into the road itself to help the cars stay on the road and navigate. Now it’s really a case of mapping and localising yourself within a map. The Google car relies on being able to use this sensor information, this laser and camera information, to match what the robot sees at any one point in time to a map that’s been collected previously.

SE: It’s basically marrying the sensor data that’s on the robot-driven car with, say, GPS coordinates and other things that help place you on the map that’s already been surveyed. Is that where that’s heading?

Example of GPS drift

RF: Well, GPS sounds like a useful idea for robots, but it turns out that it’s really dangerous to rely on something like GPS for robots because there are many elements in the environment that prevent those GPS signals from reaching the robots. You oftentimes find yourself in an urban canyon or next to a tall tree or a building where you completely lose that GPS signal.

SE: Or it’s wrong. I know with mapping and tracking mobile network performance, I’ll find that my GPS often has me placed two kilometres from where I actually am.

RF: That’s right. The commercially available GPS devices can vary in accuracy and in general are not accurate enough to keep you on the road. It could be 5 to 10 metres off. If you have a car on the road that really needs to be within a lane when you have half a metre on the side of you…

SE: Five to ten metres is too much.

RF: It’s too much. It’s really the local sensing that allows you to get the resolution that you need.

SE: What technology would you use to place the robots in that virtual mapped environment to be able to calculate what’s coming next? I’m imagining that in doing the programming, you’d also want to know that a turnoff is coming, let’s say, a kilometre ahead of time, and you now need to start shifting lanes because you don’t want to be crossing five lanes of traffic at the last instant because perhaps the algorithm hasn’t sensed that the turnoff is coming. What do you need to know to get that level of information? Is that where the road instrumentation comes in?

RF: Not necessarily. This turns out to be one of the fundamental challenges of robotics, which is localisation and mapping. Where am I in the world? Where’s the robot now? It comes down to a couple of components. First of all, you have to be able to sense some sort of what we call features. In the case of the road, you may be able to sense the road markings or the little painted lines on the side of the road. If you can sense those, then you can place yourself within a lane, but then in terms of deciding where you are in a big city, you have to match the features that you see to some map.

SE: And it would be a 3D topography, wouldn’t it?

Simultaneous Localisation and Mapping

RF: Well, it would be a map of, again, of these features that you’re observing. There could be various types of features that you can use for that. That’s really one of the fundamental problems of robotics. We have a name for it. It’s called SLAM: Simultaneous Localisation and Mapping.
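To make the localisation half of SLAM concrete, here is a toy sketch (not any of ACFR’s actual algorithms): if a robot can measure its range to three known, non-collinear landmarks on a map, its position follows from a small linear system. Real SLAM estimates the map and the pose together from noisy measurements; this sketch assumes the map and the ranges are given exactly.

```python
import math

def localise(landmarks, ranges):
    """Estimate a robot's (x, y) position from ranges to three known landmarks.

    Toy linearised trilateration: subtracting the first range equation
    (x - x1)^2 + (y - y1)^2 = r1^2 from the other two cancels the quadratic
    terms, leaving a 2x2 linear system in (x, y).
    Assumes exactly three non-collinear landmarks and noise-free ranges.
    """
    (x1, y1), (x2, y2), (x3, y3) = landmarks
    r1, r2, r3 = ranges
    # Rows of the linear system: 2*(xi - x1)*x + 2*(yi - y1)*y = bi
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = x2**2 - x1**2 + y2**2 - y1**2 + r1**2 - r2**2
    b2 = x3**2 - x1**2 + y3**2 - y1**2 + r1**2 - r3**2
    det = a11 * a22 - a12 * a21  # non-zero when landmarks are non-collinear
    x = (b1 * a22 - b2 * a12) / det
    y = (a11 * b2 - a21 * b1) / det
    return x, y

# A robot 3 m east and 4 m north of the first landmark:
landmarks = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
truth = (3.0, 4.0)
ranges = [math.dist(truth, lm) for lm in landmarks]
print(localise(landmarks, ranges))  # recovers (3.0, 4.0)
```

The full SLAM problem is much harder precisely because, unlike here, the landmark positions are themselves unknown and must be estimated jointly with the pose.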

SE: (Laughing) Hopefully, the robot won’t slam into something!

RF: The pioneering work in this problem of SLAM is actually from here at ACFR many years ago, and roboticists around the world have built on and extended this work in many different ways. Really, the robot navigation that we have today depends fundamentally on our ability to solve this SLAM problem.

ACFR: Multi Modal Adaptive SLAM

SE: Is it solvable? I mean, are driverless cars at a stage now where you could actually put them on the road and have them operate safely in the mix where there might be other robot-driven cars and human-driven cars, or do they need to be completely quarantined to perhaps sets of highways or two different lanes or multiple different lanes?

RF: We think that putting robot cars on the road is going to be the biggest boost to safety that we can think of in terms of reducing fatalities from traffic accidents. Even in the case of the DARPA Urban Challenge, you had a competition where you had robot cars driving right next to human-driven cars. In that case, however, the human-driven cars were driven by stunt drivers.

SE: Slightly better reflexes than your typical driver on a highway.

RF: Yes, but they didn’t hold back; they were really trying to push the capability of these things. You had robots having to negotiate four-way stop signs, merge into traffic, wait for their turn to move at the stop signs and to turn, and also to queue. You’ll have a number of cars stopped at a stop sign, and then you have to have this queuing behaviour that we do every day when we drive. These robots have to be able to do that as well. We think that the self-driving cars we’re developing now, and that we want in the future, are going to drastically reduce the number of fatalities on the road.


Meet Ethel the Yaris, GoGet’s first self-driving car. Credit: GoGet

SE: What kind of timeframe do you think it will be until you go to your car dealer to buy a new car and it will essentially be a robot car or maybe a hybrid manual-driven car and robot car?

RF: There are elements of this technology that we can buy today. It’s surprising, but self-parking is certainly available in affordable cars.

SE: I’ve been in cars that have done that. I don’t own one, but I’ve been in them.

RF: You also now have smart cruise control. Not only is the cruise control doing what it used to do, which is to control your velocity, but it also controls your steering, and that’s also commercially available.

SE: Now if only it would interface with the speed limit signs to make sure that you’re not accidentally too high or conversely too low, that would be useful, too.

RF: You’ll see this gradually, bit by bit, becoming incorporated into our cars until at some point in the future — I don’t know when, but I would like to think that by the time my children are of driving age — they will never drive a car, in a sense.

SE: We’re talking maybe 15 years?

RF: Yes.

SE: It’s pretty mindboggling. Will these robot-driven cars of the future be able to intermingle with human-driven cars seamlessly, or will we have perhaps digital haves and digital have-nots on the highway?

RF: That’s a good question. There are certainly good reasons to encourage all cars to be self-driving cars, because then you have the ability to have them talk to each other. Once they can talk to each other, they can have, for example, nice platooning behaviour on the freeways, where you can have a group of vehicles travelling at a higher speed than would normally be safe because they’re all moving in sync with each other.
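A rough back-of-envelope sketch shows why vehicle-to-vehicle communication enables those tighter platoons. The speed, human reaction time and link latency below are illustrative assumptions, not figures from the interview:

```python
# How much following distance does reaction time cost?
# A following car cannot begin to brake until it has reacted, so the gap
# must cover at least the distance travelled during that delay.

def headway_metres(speed_kmh: float, reaction_s: float) -> float:
    """Distance travelled during the follower's reaction delay."""
    return speed_kmh / 3.6 * reaction_s  # km/h -> m/s, times seconds

speed = 100.0  # km/h, an assumed freeway speed
human = headway_metres(speed, 1.5)  # ~1.5 s typical human perception-reaction
v2v = headway_metres(speed, 0.1)   # ~0.1 s assumed vehicle-to-vehicle latency

print(f"human reaction gap: {human:.1f} m")  # roughly 42 m
print(f"V2V platooning gap: {v2v:.1f} m")    # under 3 m
```

With cars braking in sync over a radio link rather than waiting on human reflexes, the safe gap shrinks by an order of magnitude, which is what makes close-spaced, high-speed platoons plausible.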

Platooning Robot Cars (Volvo Road Train) — Source: Volvo Road Train Animation of SARTRE Vehicle Platooning

SE: I’ve seen a video from Volvo that had platooning robot cars. It was just a simulation, not real robot cars. They were very closely spaced together, as well as travelling at, from memory, 150 kilometres per hour, which is much faster than you’d find on a typical roadway. They had separate robot lanes so that these high-velocity, closely spaced cars were not really intermingled with the general traffic, but then you’d move out of the robot lane when you wanted to go at a different velocity. Is that how you see it perhaps unfolding, or do you see something different?

RF: I’m not sure. I guess, time will tell if it comes to this. That’s certainly one scenario you can imagine. I think that you always have to cater to the fact that you’ll have people wanting to drive for some period of time, so you’ll have to have the capability for robots to deal with cars that are not in communication with them or not self-driving. I think gradually we’re going to see the adoption of this. At some point in the future, we won’t be driving our own cars.

SE: In the future we’ll wonder what we were thinking …

RF: The people of the future will look back to us and will think how crazy it was for us to drive our own cars.

SE: One of the robot cars that you had in the lab was a big truck, really, which is used in the mining area. Can you share with us a little bit about how these driverless cars are used in mining?

The Business Drivers for Automated Vehicles

Rugged terrain driverless vehicle at ACFR, Photo: Market Clarity

RF: Sure. At ACFR, we have a big project with Rio Tinto to fully automate the mine. As part of that, there’s many different types of vehicles that will be used: everything from the haul trucks to drill rigs to the type of self-driving vehicles that we saw today, which are really commercial, four-wheel drive cars that, in our case, are carrying sensors to gather important information about the mine site as it evolves. It’s sort of a real-time picture of what’s going on there. The idea is really to build and collect this kind of information autonomously, so you want these things to drive themselves and collect this sensory data that’s useful to the mining application.

SE: Is the business driver collecting the sensor data, or is it that getting humans out to these areas is too difficult, or is it a combination of both that’s the driver for Rio Tinto to do this kind of trial?

RF: Well, I guess they have various reasons to be interested in automation. When these machines are driving themselves, it’s not necessarily just an issue of eliminating human labour. That’s the first thing you think about. Really, what we found in our history with other types of projects is that there are actually other drivers of automation that may be more important than this idea of human labour. I mean, it’s obviously good to keep people out of harm’s way and dangerous situations, and in many cases we’ve got robots that are able to do that.

What we find really is that the benefit of having these kinds of machines is that their performance is more consistent. It’s really about consistency of operation at a sort of system level that I think is probably one of the main drivers of automation. The robots don’t have a bad day or take a smoke break or whatever.

SE: Yes.

Environmental Monitoring

SE: What sort of environmental monitoring projects are you working on at the lab? I recall that you had mentioned that you have a project that is designed to find things in the environment that are quite difficult because of their small size and the large scale of the geography. An example of that might be finding insects like locusts. How does that fall into this whole robotics field?

RF: We have a number of projects in the general area of environmental monitoring. That means collecting information about various things. It could be plants; it could be animals; things that are in the natural environment that we’d like to be able to map and understand better. In the projects that we’ve worked on, we’ve really been focusing on things like weeds. In Australia, there are various species of weeds, typically invasive species, that have big negative impacts on both the natural ecosystem and the economy. Industry is reliant on some of these ecosystems.

Alligator Weed Detection with Unmanned Aerial Vehicles: Australian Centre for Field Robotics

Fixed wing robotic aircraft at ACFR, Photo: Market Clarity

One of the problems is to be able to find where these invasive species are growing, and to be able to take some sort of management action. Because of the large scales involved and the difficulty of accessing some of the areas, we found that being able to use, for example, flying robots to automatically collect the sensory information and at the same time automatically classify the types of weeds that the robot has seen is very useful for management. So we can take a series of images from a flying robot, combine those into a single image, use machine-learning algorithms to automatically find the concentrations of the weeds we’re looking for based on the colour and the shape and texture, and then build a map of where these invasive species are. People who are doing something to manage these weeds can then use this information.
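As a toy stand-in for that pipeline (real systems use trained machine-learning classifiers over colour, shape and texture features, not a single threshold), the sketch below flags "weed" pixels in a stitched mosaic by colour dominance and aggregates them into a coarse concentration map. All colours and thresholds are illustrative assumptions:

```python
# Toy weed-mapping sketch: flag green-dominant pixels, then report the
# fraction of flagged pixels in each coarse grid cell of the mosaic.

def is_weed(pixel, green_margin=30):
    """Flag a pixel whose green channel clearly dominates red and blue."""
    r, g, b = pixel
    return g - max(r, b) >= green_margin

def weed_map(mosaic, cell=2):
    """Fraction of weed-flagged pixels per cell x cell block of the mosaic."""
    rows, cols = len(mosaic), len(mosaic[0])
    grid = []
    for i in range(0, rows, cell):
        row = []
        for j in range(0, cols, cell):
            block = [mosaic[y][x]
                     for y in range(i, min(i + cell, rows))
                     for x in range(j, min(j + cell, cols))]
            row.append(sum(is_weed(p) for p in block) / len(block))
        grid.append(row)
    return grid

# 4x4 mosaic: the top-left quadrant is lush green (weed), the rest soil-brown.
green, brown = (40, 180, 50), (120, 90, 60)
mosaic = [[green if (y < 2 and x < 2) else brown for x in range(4)]
          for y in range(4)]
print(weed_map(mosaic))  # [[1.0, 0.0], [0.0, 0.0]]
```

The output grid plays the role of the concentration map handed to weed managers: each cell says what fraction of its ground area looked like weed.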

SE: I’d see it as a two-step process. The first step would be doing the environmental sweep and collecting the data and putting together the map. Then the second step, either done by humans or perhaps even on an automated basis, would be deciding what to do about the results of the data. Let’s say you saw a cluster of alligator weed, would you, at some point in the future, be using real-time maps and then collecting the information and then doing another fly-around and spraying pesticides?

Helicopter style flying robot at ACFR, Photo: Market Clarity

RF: Yes, we thought about that. We have a project where we used a higher-altitude robot, a fixed-wing flying robot, to do this mapping. Then there was a partner robot, a hovering robot, a helicopter-style flying robot in this case, that was then able to go and visit the locations that were identified by that high-altitude robot. It had the capability to dispense some sort of herbicide. In the trials that we did, we didn’t actually use live chemicals, but we demonstrated that that kind of thing is possible.

SE: That would really cut down on the time to action, especially if it was automated in real time and didn’t require humans to interpret these massive amounts of data, because I’m imagining that the data that you’re collecting is absolutely massive.

The Future of Agriculture

RF: Yes, and it’s the scale of the environment and the accessibility. We did one trial in Victoria, where we were working in a type of waterway where the flying robot can easily fly over the rugged terrain. To actually walk through and verify that what we found was correct took a team of three people two days, compared to the few minutes it took us to collect the data.

Ladybird (Source: ACFR)

SE: Wow, that’s a big scale difference. You need a lot of people, and a lot of time. If you can imagine having a swarm of these flying robots going out to different parts of, say, a farm simultaneously, maybe have a hundred of them mapping everything and then taking the appropriate action, getting teams of people out there to do the same would be a very time-consuming job.

RF: That’s right. We’re still developing this type of technology. We still have research problems to solve, but we’re starting to see some really good progress that indicates that we’ll be able to do useful work with these kind of robots in the near future.

SE: Are you doing anything in the area of tree crop horticulture? We’ve talked about pests, but what about the more general area of crop surveillance and perhaps using robot sensors to build a real-time picture of an orchard or a field or a forest?

RF: Yes, absolutely. We have a big interest in agricultural robots. One way to understand that is to think about the kind of algorithms that we use to automatically classify weeds based again on the colour and texture. Well, we could equally apply those to agricultural applications, not necessarily just for weeds, but also to look at the individual fruit that’s being grown.

We have several projects funded by Horticulture Australia and AUSVEG where we’re doing this. We have robots operating in tree crop industries, such as olives, apples and others, where we’ve done trials where we take our ground robots in this case, and sometimes aerial robots, and collect laser data and visual camera data, and then apply these classification algorithms to do things like yield estimation: counting the individual fruit, or maybe earlier stages of development, like the blossoms. The idea is that by automatically collecting this data, you could have a robot system that’s building up a real-time picture of this crop. This is something that will work with different tree crops and also in veggies. We have a number of projects with AUSVEG, and we have a ground robot that’s looking at crop health and yield in vegetables.

SE: Is that the Ladybird?

RF: That’s the Ladybird.

SE: Yes, that was an interesting robot in the lab. I like that one. Didn’t it just win an award?

Ladybird in the lab

RF: Yes, so our group just recently, just last weekend, won the AUSVEG Researcher of the Year award.

SE: You must be very happy about that.

RF: Very happy about that. That was in part due to this Ladybird robot.

SE: Ladybird does not look like a typical robot that I would imagine from science fiction. It’s got solar panels all around it on the outside. Inside what I would call cut-off domes, you’ve got what look like tractor wheels and a picking arm. It’s able to pick weeds and do other things, as well as sense the health of the crops.

The Ladybird in action on Cowra beetroot farm (Source: ACFR)

RF: That’s right. The name Ladybird comes from the kind of shell that’s on top of this robot. It has flexible solar panels that happen to be red and black, and so it kind of has the appearance of a ladybird. It’s able to move around the crop, and it has a variable wheelbase, so it can accommodate different crop spacings. That shell not only helps it to gather solar energy for the purpose of recharging its batteries, but also helps to control the level of illumination underneath, where the sensors are. They’re actually gathering information about the crop health.

We’ve also got a robot arm, a manipulator, mounted underneath. This allows us to start to think about mechanical weeding, autonomously dealing with weeds that are in the crop using that manipulator. That might mean applying a herbicide in a very concentrated area. Instead of broadcast spraying everything, we could put that chemical just in the areas where it’s useful, and therefore drastically reduce the total amount of herbicide that’s used, or even mechanically remove the weed. This is really important for many reasons. Reducing the use of chemicals is an important thing for many people, including the growers.

SE: I don’t think anyone would argue with you about that one.

RF: But there’s also a big problem in Australia with the rise of herbicide-resistant weeds. There are some weeds now that have evolved such that no chemical can kill them.

SE: It’s just like viruses in humans.

RF: Exactly right. In the same way, the more herbicides that are used, the more this problem arises. We’d like to have some mechanism of tackling that. This is one of the really promising avenues, research-wise, to deal with that important problem.

When will Agricultural Robots be Commonplace?

SE: I know your lab is working with a number of commercial partners and with farms in Victoria, Queensland and some other places. What kind of timeframe do you see for these, can I call them field robots or agricultural robots, to be in common use on a typical farm? Is it something that’s going to gradually creep in? Do you think in 5 or 10 years’ time they’re going to be very commonplace, or will it take longer because of the equipment replacement cycle? Maybe you can share a little bit of your thinking on the timeline?

Mantis and Shrimp: General purpose perception research ground vehicles (Source: ACFR)

RF: We’ll see some kind of ground robots being used in agriculture in the next five years. We have a project in Queensland that’s a collaboration with QUT, Queensland University of Technology, and also a private farmer. The result of this research project is that the private farmer is going to commercialise a system for broad-acre agriculture, where you have a team of many small ground robots. This is initially targeting cereals, so weeds, etc. This team of robots is going to replace the big monolithic farm equipment that’s being used now. As the size of farms increases, bigger and bigger machines are being used. That’s really been the trend in dealing with the increase in farm size: make the machines bigger.

There’s a different way of thinking about that. Instead of making the machines bigger, we’re going to make the machines very small, and have many of them moving autonomously. Initially, they’ll do things like, again, spraying, but spot spraying. Spot spraying means that instead of having an even coverage over the entire paddock, you can sense where treatment is needed and just spray in that area.

SE: Wouldn’t there be another benefit in terms of soil compaction? When you have these really heavy machines, they’re compacting soil just because of weight; whereas if you have lots of smaller machines, they don’t have that same impact. I’m certainly not an expert in the area of agriculture, but I do understand that that’s a problem.

RF: That’s right. That was one of the issues that kind of led us to think about this kind of research. The effect of this trend towards larger and larger machines is that you have a big problem with soil compaction, where the area that the tractor tyre traverses really becomes unusable for a period of 5 to 10 years.

SE: That’s a long time of not getting any crop yield.

RF: Yes. The soil compaction problem is a big problem for the industry. If you think about soil compaction and why it would be so damaging, you need to think about how something like the Great Wall of China was built. A lot of those walls are actually built with compacted soil, so you can actually build walls and even buildings out of compacted soil.

SE: There’s not a whole lot growing out of that either.

RF: The same effect happens in the paddock, and that’s one of the things we can battle.

The Business Case for Robots in Agriculture

SE: What would be the business case for a farmer to replace this type of machine? Would they be looking at replacing existing machines, or using these new robots for new investments, with a gradual swap-out as machines reach the end of their life?

RF: The idea is that a team of these small robots would be roughly equivalent in cost to one of the big tractors.

SE: The decision for the farmer would be: if I need a new area of land covered, do I buy a new big machine, or do I buy a system of smaller ones?

RF: That’s right. Initially, this will probably be commercialised as a service, so you’ll hire a service: instead of buying the system of robots, you’ll have a service that provides them as needed.

SE: That makes sense.

RF: All the operational costs, and all the maintenance costs would be handled by the company that’s providing the service, and the individual farmer would just choose this as a contracted service, which may do spot spraying and many other things.

Sensor laden robots such as Shrimp and Mantis can generate data at 1 Gbps or more, Photo: Market Clarity

SE: Well, the other part of the service would be once you’ve collected the data, doesn’t that then also need to be transmitted to some sort of central repository that the farmer would have access to, to be able to look at historical patterns or to be able to compare the operations on their farm to other farms and make informed decisions about the kinds of crops that they’ll grow, for example, or where they’ll make investments in new land? Wouldn’t this require quite high data rates because of the amount of information being collected? What kind of service would be required to transmit all this data?

RF: That’s right. It’s a good question. You’re doing crop surveillance and crop health monitoring. These robots collect a large amount of data. You’ve got lasers. You’ve got cameras that are operating at high resolution. All this data has to go somewhere; it could live on the individual robot, but you really want to put it together in a common place where you can see the whole business as a system. There needs to be some method of communicating and storing that data. It could eventually be a cloud-based repository.

High Bandwidth Requirements

SE: I’d imagine it would be cloud-based in terms of the data repository, but there still needs to be some way of getting the data off of the robot and into a service cloud or a private cloud or even just a farmer’s server.

RF: That’s right. Communication is one of the big issues running through robotics because of these high data rates, not only in agricultural applications but in many applications of robotics. We need some way of communicating this, usually wirelessly, from the robot to other locations. Yes, there’s definitely a need for high-bandwidth communication.

SE: Yes. We’re talking about a gigabit per second, aren’t we?

RF: At least, yes, at least.

SE: The more sensors we put on the robots and the finer grained the instrumentation is, the bigger the bandwidth requirements will be.

RF: That’s right.

SE: One can only assume it will grow in the course of time.

RF: Absolutely, and that’s for one robot. We’re talking about 5 to 10 to 20 in a system — just on one farm. You can imagine these on many farms.
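A quick back-of-envelope calculation shows how even a single camera can approach that gigabit figure. The sensor specs below are illustrative assumptions, not the actual Shrimp or Mantis configuration:

```python
# Sanity-checking the "gigabit per second" figure with hypothetical
# sensor specs.

def camera_bps(width, height, bytes_per_pixel, fps):
    """Raw (uncompressed) data rate of one camera in bits per second."""
    return width * height * bytes_per_pixel * 8 * fps

# One uncompressed 1080p colour camera (3 bytes/pixel) at 20 frames/second:
rate = camera_bps(1920, 1080, 3, 20)
print(f"{rate / 1e9:.2f} Gbps")  # roughly 1 Gbps, before lasers and
                                 # any other sensors are added
```

Multiply that by several cameras, a laser scanner and 5 to 20 robots per farm, and the demand on rural communications infrastructure becomes clear.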

SE: It’s complicated by the fact that the farms tend to be in remote and regional areas, where the communications infrastructure isn’t as robust as it is in metropolitan areas. By definition, the farms are out where you’ve got lots of land.

RF: Exactly.

Commercial Drones

SE: One last closing topic that I’d like to talk about, Robert, and that is these commercially operated unmanned aerial vehicles. Frankly, I’ve seen quite a few of them that are, how shall I say, pesky at best. It would seem to me that organisations like CASA here in Australia and the FAA in America, and of course across the world, really aren’t 100 percent across the problem right now because the technology is out there already. People are doing things like almost crashing into buildings or actually crashing into buildings or crashing into people. Every week somebody sends me an article about a drone that has hit a person or almost hit a person or has crashed into a building or done something else that is not a good thing. Sometimes it’s operator error. Sometimes perhaps it’s malicious. I saw one video a few days ago where the video was taken from a video camera mounted on the drone itself. You could see that the person controlling it didn’t have control because it was shaky. It was all over the place. Then eventually it crashed. They recovered the video from the crash site. What do we do about it?

Example of an out-of-control drone in New York City

RF: That’s a great question. In our area of aerial robots, we work closely with CASA, if necessary. We appreciate the dangers.

SE: Yes, but you’re a professional robotics lab. The people buying these kits for as low as $100 don’t have your background. They don’t have any kind of pilot licensing, remote or otherwise.

RF: Yes, the laws today really are there to try to limit that kind of thing. Technically, there’s a minimum distance away from people or buildings that you have to fly, and there’s a maximum size.

SE: What’s that distance?

RF: It’s 30 metres away from people and buildings, and some number of kilometres away from all airports.

SE: Again, I keep hearing about near collisions with commercial aircraft as well.

RF: Yes, it’s tough to enforce these at the moment because it’s a situation where the technology is outstripping the pace of regulation. It’s all too easy for someone to build their own drone and operate it in an unsafe way. How to combat that from a policy perspective? I think we’ll have to see, but I don’t have any deep insights as to how to attack that.

SE: Yes, you’re not the policymaker.

RF: I can tell you that it actually is quite a serious problem. You mentioned crashing into buildings, but really it’s deeper than that. Even if it were to land safely, but happened to land in the middle of a freeway or even just a road…

SE: That’s a safety hazard.

RF: …then the effect of that can be quite dramatic because you’ll have cars that are trying to avoid it, and that can easily cause a collision between cars on the road, or one car and property. Really, there are many dangers of using these in congested public spaces. We’ll see how the regulations evolve to deal with that kind of thing, and how they’re policed.

SE: I have a feeling it’s going to get worse before it gets better, because they’re proliferating, and they’re cheap, which means that people buy them to play around. They just don’t understand even basic aerodynamic principles, like the fact that if you try to operate a drone in high wind, you may not have the same control as you would on a nice calm day.

RF: That’s right. If a drone gets into a fight with nature…

SE: Nature wins.

RF: Nature wins.

SE: We’ll leave it at that. Thank you so very much for your time. It’s been great speaking to you and a lot of fun touring through the lab.

RF: My pleasure.

About the author: Shara Evans is internationally acknowledged as a cutting edge technology futurist, commentator, strategy advisor, keynote speaker and thought leader, as well as the Founder and CEO of Market Clarity.
