This wonderful interview with Karl Iagnemma is all about autonomous vehicles. Karl is the CEO and Co-Founder of nuTonomy, a self-driving car company based in Boston. Many of you probably haven’t heard of Karl, but nuTonomy is pushing the boundaries of what self-driving cars can do. And we’re lucky enough to hear about it from Karl.
Karl has deep experience in robotic mobility, probably one of the deepest backgrounds in the world. He’s also the Director of the Robotic Mobility Group at MIT. He received his PhD from MIT in 2001.
nuTonomy has been testing their autonomous cars in Singapore and Boston.
-What type of mobility projects did you work on earlier in your career?
-How are your training and philosophy a little different from those of other self-driving car companies?
-When will we have level 5 cars, approximately?
-Is it difficult to train cars when it’s snowing?
Dave Kruse: Hey everyone. Welcome to another episode of Flyover Labs, and today we get to talk to Karl Iagnemma. Sorry about that, Karl. A lot of you probably haven’t heard of Karl, but his company is making waves in self-driving cars.
Karl is the CEO and Co-Founder of nuTonomy, which is a self-driving car company, and Karl has deep experience in robotic mobility; he probably has one of the deepest backgrounds in the world in that area. He is also the Director of the Robotic Mobility Group at MIT. So Karl stays pretty busy, and he received his PhD from MIT in 2001. nuTonomy has been all over the media with their self-driving cars in Singapore and now in Boston. They might not get quite as much attention as Google or Tesla, but they are definitely pushing the self-driving boundaries just like them. So I’m pretty pumped to have Karl on the show.
So Karl, thanks for coming on today.
Karl Iagnemma: Yeah, my pleasure.
Dave Kruse: So let’s – yeah before we talk about what you are doing now, can you just give us a brief overview on your background and how you got to where you are now?
Karl Iagnemma: Well, my academic background: you know, I came to MIT in 1995. I was a graduate student. I did a PhD in robotics. My PhD thesis work was in the area of mobile robotics, specifically planetary exploration robots, and when I finished my PhD I started directing a research lab at MIT. We did a number of projects over the years for automotive companies in the areas of what was then called active safety or driver assistance technology, and we also did a number of research programs for the Department of Defense, the National Science Foundation, and other government agencies in the area of robotics. And as it turns out, you know, all the technology we were investigating during that period, things like robotic motion planning, localization, mapping, and perception, became the building blocks of self-driving cars.
So about three or four years ago, my Co-Founder, one of my colleagues at MIT named Emilio Frazzoli, and I woke up one day and realized that the work we had been doing at MIT was really at the dead center of this emerging industry around self-driving cars, and that’s what really motivated us to launch the company.
Dave Kruse: Got you. And so what was one of the first projects you worked on with a car company?
Karl Iagnemma: Well, we did quite a bit of work with Ford, let’s say relatively early in my career, when we were looking at this problem of driver assistance and how you might develop what you could call a semi-autonomous system, which is to say one that would let the driver drive sometimes and other times the computer system would drive the car. And the question of when the system should take control from the driver and when it should hand it back to the driver, that was one of the fundamental issues we were trying to figure out.
It’s kind of an interesting side anecdote here. The student of mine who worked on that project for his PhD was a fellow named Sterling Anderson. Sterling later went off to be a consultant at McKinsey, but after McKinsey he went to Tesla, and at Tesla one of the things he ended up doing was leading their self-driving car program. If my description of our work sounded familiar, that’s because Tesla’s work followed a semi-autonomous control approach, and so I guess that’s the part of my academic family tree that went out to the West Coast.
Dave Kruse: I’m sure it’s a small world up there at the top. And so when you were working on these projects, did you ever envision kind of where things are right now with self-driving cars? I mean, it sounds like it kind of just hit you on the head at a certain time, like, wow, what we are doing here has a lot of overlap with what Google or other companies are doing?
Karl Iagnemma: Yeah, I guess we did. I mean, when you are in it from the beginning, it sometimes really does take a moment where you are able to step back and put things in perspective. You know, we had known for a long time that the technology had massive promise to revolutionize how people got around, mainly because it would lead to more efficient, safer, and lower-cost transportation worldwide.
But for quite a while the technology maturity just wasn’t quite there yet. I mean, we would struggle for days, weeks, and months just to be able to demonstrate some relatively simple tasks in a lab environment, and the technology over time got a little better and a little better, but again, it really was one of those days when you realized, hey, this stuff actually is almost working. And that’s the time when you start to think beyond the academic laboratory, you know, the four walls of your laboratory, and think about how you could start to deploy that technology in the wider world.
Dave Kruse: Got you, okay. So let’s talk about nuTonomy a little bit. Can you give us a little overview? Well, we know what you guys are doing, but a little bit of where you are at with Singapore and Boston, how many employees you have, and kind of your timelines, whether it’s for level-five driving or any other timelines you want to share?
Karl Iagnemma: Sure. So we are based in Boston and Singapore. Roughly one-third of the company today is in Boston and roughly two-thirds in Singapore. We are very much focused on technical development. Our business team is pretty lean; there are only about four or five people at the company who aren’t focused on core technical development.
What is somewhat unique about what we are doing compared to so many of the startups in this space is that we are developing a complete full-stack solution for autonomous driving. That includes all the software that would go on the vehicle to allow it to navigate safely down a road network autonomously. It includes software that goes on a handset to allow an end user to monitor the progress of a car that is coming to pick them up for an autonomous trip, and it includes software that would sit in the cloud and coordinate, in an optimal fashion, the activities of a very large fleet of autonomous vehicles.
So when we think about autonomous vehicles, we tend to just think about the software that sits on the car, but there are other dimensions to the problem where you can add a lot of value. What we find, at least today in the community, is that while there are a lot of people focused on sub-problems, you know, perception, mapping, we believe there is a big advantage to being able to address the entire problem, because that allows you to optimize the performance of the system through a very deep integration of the various subsystems and knowledge of how they are working internally.
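The full-stack view Karl describes spans three layers: on-vehicle driving software, a rider-facing handset app, and cloud-side fleet coordination. As a toy illustration of that last layer only, here is a minimal dispatch sketch; the vehicle ids, state layout, and nearest-idle policy are all illustrative assumptions, not nuTonomy’s design:

```python
import math

def assign_vehicle(vehicles, pickup):
    """Pick the nearest idle vehicle for a ride request.

    `vehicles` maps a hypothetical vehicle id to an (x, y, idle) state
    tuple; `pickup` is an (x, y) point. A real cloud dispatcher would
    also weigh traffic, charge state, and anticipated demand, but
    nearest-idle captures the core coordination idea.
    """
    best_id, best_dist = None, math.inf
    for vid, (x, y, idle) in vehicles.items():
        if not idle:
            continue  # skip cars already on a trip
        dist = math.hypot(x - pickup[0], y - pickup[1])
        if dist < best_dist:
            best_id, best_dist = vid, dist
    return best_id

# Three cars: av-3 is closest to the pickup but busy, so av-2 wins.
fleet = {"av-1": (0.0, 0.0, True),
         "av-2": (5.0, 5.0, True),
         "av-3": (4.5, 4.5, False)}
```

A deeper integration, as Karl notes, would let this layer share state with the on-vehicle planners rather than treat each car as a black box.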
Dave Kruse: Interesting. I can see where that is a big advantage. And what type of sensors and cameras do you have on your car? I mean, do a lot of the companies have kind of similar sensors and cameras, or does everyone have a little different package, that you know of at least?
Karl Iagnemma: Yeah, I would say that as a community we have generally converged to a place where we are using the same types of sensors, namely cameras, radars, and LIDARs in some combination, and the reason we are doing that is because of the redundancy you get from those complementary sensor modes. I think everybody has a slightly different configuration of those sensors, and understandably so; they are all coming at the problem independently, trying to optimize as best they can.
There are a few exceptions to that. You know, there are a few groups that are trying to solve the problem without relying on LIDAR, relying entirely on vision and radar. It makes the problem more difficult, but the potential payoff, if you can win that way, is that you avoid using quite an expensive sensor. In our fleet of R&D vehicles, though, I can tell you that we experiment with and explore a number of different sensor configurations. Today we do tweak things from time to time, but all of our cars have some combination of radar and LIDAR sensors.
Dave Kruse: Okay, and you know, you said one of your main advantages is kind of more of a systems approach, and I was curious: you are in Boston, so you do get snow. We are in Madison, Wisconsin, so we get snow. I was curious about the differences, and what type of training you have done in snow versus sun. There’s probably not a lot of snow in Singapore, so how much harder is it to train a car in snow, and how does your systems-level thinking help compared to not having that overall approach?
Karl Iagnemma: Yeah, that’s a good question. You know, this is a really hard technical problem, and it’s hard enough to get it working in good conditions. That’s why we do most of our testing in good weather conditions, and most of our competitors do their testing in good weather conditions too; we’re still trying to solve the ‘easy case’ first.
But with that said, we think about a future where we want to be deploying these cars in cities worldwide, in all kinds of conditions. We do a lot of driving in the rain in Singapore. It rains frequently in Singapore, anywhere from a drizzle to a really heavy, monsoon-like downpour, and we have been able to drive in quite a wide range of rainy conditions. We’ve driven in snow, fairly heavy snow, but not deep snow, so kind of dense flurries, and we’ve done this experimentally. I wouldn’t say we’ve done it extensively. We haven’t really validated our software across a range of conditions.
We’re trying to really build great competency in the good conditions and then push it to the boundaries of the off-nominal weather conditions. So we were very pleased to see that when we did our testing in snow, we saw good performance from the algorithms, and similarly in rain, but there is work to be done to really understand the limits of the system, you know, how much snow is too much, how many millimeters of rain per hour is too much.
When you get down to the details, it’s not really the falling snow that bothers the system. It’s the fact that when you accumulate a lot of snow, the world around you looks different than it did when there was no snow on the ground. So that can cause some complications for some of the subsystems, but so far our initial testing has been very promising.
Dave Kruse: Interesting. And I imagine the vision system is probably not going to work as well as LIDAR, but is that the case, do you think, in snow, or have the vision systems been working okay?
Karl Iagnemma: Well, this really gets to that complementarity point. I mean, this is the reason we use multiple sensor modes, exactly for scenarios like this where you’ve got decreased visibility in snowy conditions. With these sensors, there is no magic bullet. You know, if you are looking out your windshield and you can’t see a whole lot, your camera is not going to see a whole lot either.
But the good news is your radar sensors and your LIDAR sensors may be relatively unaffected, and so again, that’s exactly the reason why we have this kind of belt-and-suspenders approach of having multiple complementary sensor modes. Often they are overlapping, providing you very similar information, but in some conditions, like snowy days, one of them may not function well at all and you have to rely more heavily on the others.
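The belt-and-suspenders redundancy Karl describes is often realized with some form of weighted sensor fusion. A minimal sketch, assuming independent range estimates with per-sensor variances (the numbers are illustrative, not nuTonomy’s pipeline), shows how a snow-degraded camera gets automatically down-weighted:

```python
def fuse_estimates(estimates):
    """Inverse-variance weighted fusion of independent range estimates.

    `estimates` is a list of (value, variance) pairs, one per sensor
    mode. A degraded sensor reports a large variance, so it contributes
    little; the fused value leans on whichever modes stay reliable.
    """
    weights = [1.0 / var for _, var in estimates]
    fused = sum(v * w for (v, _), w in zip(estimates, weights))
    return fused / sum(weights)

# Clear day: camera, radar, and LIDAR all agree and are all trusted.
clear = [(20.0, 0.5), (20.4, 0.4), (19.8, 0.1)]

# Snowy day: the camera's variance balloons, so the fused estimate is
# driven almost entirely by radar and LIDAR.
snowy = [(35.0, 50.0), (20.4, 0.4), (19.8, 0.1)]
```

With the snowy inputs, the fused range stays near 20 m even though the blinded camera reports 35 m, which is the redundancy payoff in miniature.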
Dave Kruse: Makes sense, okay. So you probably get asked this question a lot, but I was curious: when, in your mind, will I be able to buy a fully self-driving car? Kind of like what they call level five, where I can just get picked up and taken to the grocery store and brought back home. Do you have any guess, any sense, any range of years?
Karl Iagnemma: Well, you know, being able to buy a self-driving car, that’s likely to be quite a bit later than when you would be able to experience a ride in a self-driving car. And what I mean by that is, there are really two different models at play there. One is selling a feature to a customer when they buy their next car. So when you go to the dealership and you buy your next Ford or GM car or Volvo, whatever the case may be, the question is when will that salesman say, would you like the autonomy option package for $62,000? That’s going to be several years in the future. I would be surprised if it was earlier than 2025. The reason for that is it implies that you are not using extremely expensive sensors to enable that feature to operate; you are only relying on cameras and radars.
Now with that said, the big caveat here is that often, when we think about buying a car with autonomous operation, we assume that this will be a feature we can turn on and off anywhere in the world, at any time of the day or night, in any weather condition. My strong sense is that what we are going to evolve to as a field is a world where, even when we are offering that self-driving feature to an end-user customer, it will only be available part of the time, under certain conditions. Ideally under most conditions, since I don’t think you could sell it if it only worked under a few conditions, but not under all conditions.
And so I think the wild card here is a company like Tesla, which is promising to actually sell these features we are talking about in a much, much nearer timeframe than 2025, probably in the coming few years. I think the likely caveat there is that the feature won’t be available to you as a customer all the time. It will be a few years down the road before we have that 24/7 autonomous feature.
Now, being able to experience a ride in a driverless car: the reason that’s going to be available to you as a customer sooner is because the economics of what we call mobility as a service are fundamentally different from the economics of, let’s say, private vehicle ownership. The difference is that on the one hand, when you are buying a car and you are going to elect to purchase that feature, you are very price sensitive, very price constrained. As the average buyer, you may pay three, four, five, six, seven thousand dollars; you are not going to pay $20,000 for a feature added on to a car.
On the other hand, if I wanted to pick you up in my robo-taxi and I wanted to make a business out of doing that, well, if you think about the economics of a taxi service, a significant percentage of the cost of your taxi ride is the cost of that driver, the salary of the driver who is actually driving the car. When I say significant, I mean anywhere from roughly one-third to two-thirds of the cost of that trip is the cost of the driver. So if you can take the driver out of the car, the economics of that taxi trip are radically upended. Let’s say, to first order, the cost of that taxi driver, the salary for the taxi driver per year, let’s say for the moment that’s $100,000. That means you could offset that $100,000 salary with equipment that you put on the car.
So you could put tens of thousands of dollars, to first order again, of equipment on the car, have a favorable return on your investment, and be able to operate a business of moving people around using autonomous vehicles, and the economics would make sense. So that’s really the reason why you are probably going to ride in a robo-taxi and pay by the kilometer several years before you will actually own that fully autonomous car.
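Karl’s back-of-envelope arithmetic can be sketched directly. The $100,000/year driver figure is his illustrative number from the conversation; the amortization period is my hypothetical assumption, not a nuTonomy figure:

```python
def equipment_budget(driver_cost_per_year, amortization_years):
    """First-order equipment budget freed up by removing the driver.

    If a driver costs `driver_cost_per_year` and the autonomy hardware
    is amortized over `amortization_years`, you can spend up to this
    much on sensors and compute per vehicle and still break even.
    """
    return driver_cost_per_year * amortization_years

# Karl's illustrative $100,000/year driver, amortized over a
# hypothetical 3-year hardware life, leaves a $300,000 budget, which
# comfortably covers "tens of thousands of dollars" of equipment.
budget = equipment_budget(100_000, 3)
```

This is deliberately first-order: it ignores maintenance, insurance, utilization, and remote-supervision costs, but it shows why the robo-taxi business case tolerates far more expensive hardware than a consumer option package does.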
Dave Kruse: Got you. So even in Madison we could possibly see fully autonomous taxis in three or four years, potentially?
Karl Iagnemma: I think so. The other point to make is that, you know, a mental model we sometimes fall into is assuming that when these cars arrive, they will be available at scale; they will be all over the place and available all the time. I think the likely reality is that your first experience in a self-driving shuttle or a robo-taxi will be in a fairly, let’s say, structured environment. It will be at a shopping mall, at an amusement park, on a closed campus somewhere, maybe at the University of Wisconsin or somewhere like that, where you’ve got a predictable route that these cars are following, or a predictable network of routes. It’s not necessarily the unconstrained open-road environment.
You know, these are technically easier use cases. They are simpler economic cases. They are ways for developers of this technology, and companies interested in the mobility-services space, to really test the waters, and for all those reasons together I think we’ll tend to see this technology deployed in these constrained environments first.
Dave Kruse: That makes sense. Yeah, and it seems like some companies are doing that; you see they are doing an autonomous shuttle service, which would be kind of like at an airport or something. People are starting to do that at least, or in Vegas I think I read somebody is doing that. That makes sense.
Karl Iagnemma: Exactly. I mean, that’s a good example of an environment that we might call semi-structured. Of course, anything can happen when you are out in the natural world, but by bounding the operational environment of that car, by saying you are going to stay in this, what we call a geo-fenced area, and by the way, if it’s on private grounds you might even be able to impose certain restrictions, like dedicating a travel lane for these types of cars or putting flashing lights on them or similar things like that, you can make (a) the technical problem easier and (b) the liability risk lower. You can operate at low speeds. There are a number of things you could do to make the problem more tractable and make the business case more attractive.
Dave Kruse: Got you, okay. And I know we’re out of time. Do you have time for a couple more questions?
Karl Iagnemma: Yes, sure.
Dave Kruse: Yeah, okay, because I was curious, you know, about partnering: which car companies have you partnered with, if any?
Karl Iagnemma: Well, we do. We have a couple of partnerships with automotive companies. It’s a little bit difficult to talk about the specifics of them, because they all have a distinct nature. You know, I can say generally that if the automotive companies had gone, and I’m talking three or four years ago, to talk about partnering with a start-up, it would have been a pretty difficult conversation, because there is a big, I’m going to say, mismatch between just the scale of a typical OEM and the scale of a start-up. It’s really hard to find ways to meaningfully work together, in my experience.
You know, these days the landscape has changed a little bit. There is really strong interest in autonomous vehicle technology, there is strong interest in mobility as a service at nearly every OEM worldwide these days, and there’s a recognition that some of the good ideas and some of the good technology are being developed outside the four walls of those companies.
This is one area where, of course, you can’t completely generalize, but in our experience talking with the biggest players in the auto industry, there is strong interest in what we’re doing in autonomy. There is often a willingness to find a way to partner, and it really just comes down to identifying a structure that the two sides are happy with and can get meaningful work done under. We have actually had very productive relationships with some of the leading players in the auto industry, and I expect we’ll continue to do that in the coming years.
Dave Kruse: Interesting. Yeah, I mean, even on this podcast I interview a lot of Chief Innovation Officers, and I think sometimes the innovation teams have really opened up some of these large, opaque corporations to smaller entities and let them filter through these large companies.
Karl Iagnemma: Yeah, and I think what large organizations have realized is that this is a very fast-moving space, and it’s sometimes the case that you can develop technology internally at a very fast pace and get it out the door and in that manner keep pace with your competitors. In other scenarios it’s just not that easy to do, and so as a way to accelerate your progress, you look outside the four walls of your own company.
You see if there are possibilities to partner, to license software, or in some cases, through M&A activity, to speed up your own internal development. And you know, I give the auto industry real credit. This is an industry that hasn’t historically, at least again very generally, been very big on outside partnerships, especially with small companies. But I think that’s changed pretty quickly over the last couple of years.
Dave Kruse: Got you, all right, so last question. You know, after you’ve achieved level-five autonomy and you are pretty comfortable. I guess there are probably always more environments you can get better at, so I was going to ask what else there is to work on. Maybe there are just more and more use cases, or tweaking to improve safety. I mean, you probably haven’t thought a lot about what you are going to do after you reach it, because you are not there yet, but I was just curious what else there is to work on once you reach it.
Karl Iagnemma: You know, there’s a ways to go on the technology development front. With the technology that we are building, and I can say this generally across the space, no one has a finished product yet. Nobody has a solution that they would feel comfortable, today, taking the driver out of the car and letting the system operate in a completely driverless manner on really dense, difficult urban streets. We are making fast progress toward that goal, and some of our competitors are as well. But there is a lot of work to be done just on core technical development, and so we really are focused on that.
Obviously, as we do that, we are continuously evaluating the business case and our go-to-market strategy, ensuring that when we have a product that’s mature enough to put on the road, we are able to go to market and start generating meaningful revenue in the important early markets around the world.
Dave Kruse: That makes sense. All right, well, Karl, I definitely really appreciate your time and your thoughts here, and what you are doing is very inspiring. So thanks for sharing with us and spending some time with us today.
Karl Iagnemma: Well, my pleasure. It was nice talking to you.
Dave Kruse: Definitely, and thanks everyone for listening to another episode of Flyover Labs. As always, I greatly appreciate it and we’ll see you next time. Thanks everyone. Thanks Karl. Bye.
Karl Iagnemma: All right, bye-bye.