This is a great, thoughtful interview with Dr. Cory Kidd. Cory is the founder and CEO of Catalia Health. Catalia Health provides chronic disease management programs for patients which are delivered through Mabu, a little robot that helps people through the many challenges of sticking with a treatment program.
Cory is the perfect person to build this type of robot. He has a deep background in robotics combined with psychology and artificial intelligence. He understands how to build robots that people want to interact with. That’s not easy. Cory shares how he makes it happen.
Here are some other things we talk about:
-How do people respond to Mabu, your robot?
-How did you think about designing Mabu? There is an infinite number of potential interactions. How did you come up with certain use cases?
-Why did you include the robot instead of just an app?
David Kruse: Hey everyone. Welcome to another episode of Flyover Labs. Thanks for joining us, and today we are lucky enough to talk to Cory Kidd. Cory is the Founder and CEO of Catalia Health, and Catalia sells a personal healthcare companion. So what is that exactly? Well, it’s a little robot that helps people remember to take their medications and provides other health suggestions. So it’s pretty cool. You have to see some videos, which we’ll post, and we’ll find out more of course about it. So Catalia has raised about $1.25 million from Khosla Ventures, and so they have done quite a bit with a small amount of money, which isn’t easy. And Cory himself has quite a background. He has a PhD from the MIT Media Lab and has worked with a number of interesting companies and projects. So I invited Cory to hear more about Catalia and their robot Mabu, and to learn more about Cory’s background and how he pulled together Mabu over many years of research and development. So Cory, thanks for coming on the show today.
Cory Kidd: Absolutely. Happy to be here. Thank you for the invitation.
David Kruse: Yeah, so as I kind of alluded to in the intro, you know you have an interesting background. Do you want to give us a little overview of your background?
Cory Kidd: Sure. I mean I have been in the healthcare technology space for a long time now, the better part of two decades. Back in the 1990s I was down in Atlanta at Georgia Tech, first as a student doing an undergrad degree in Computer Science, and by the time I finished up I’d been hired as a research faculty member and ended up helping to build and run a lab called the Aware Home, where we were looking at a lot of technology focused on an older population and how we help people live at home longer rather than move into a nursing home or an assisted living facility. Then in 2001 I went back to graduate school. As you mentioned, I went to MIT. So I spent 6.5 years there at the Media Lab, where I did my master’s and PhD in what I usually say is human-robot interaction, but it was really the intersection of artificial intelligence, psychology, medicine and of course robotics, which is a part of what I’m still doing now. The last half of that time I actually split between MIT and Boston University Medical Center, where I did my clinical work in endocrinology for three years. And since finishing at the end of 2007 I have been in the startup world, focused on commercializing this kind of technology to help people with a lot of different healthcare issues and applications.
David Kruse: Interesting. Wow! I mean you were definitely ahead of your time if you were working on helping senior citizens stay in their homes longer back before 2000. You know, that’s a hot topic now. I’m curious, what were you working on back then?
Cory Kidd: So I started on that project and joined that group around 1998, 1999, and we were looking at a lot of different home-based technology: sensors to try to figure out what was going on in the home environment, from things embedded in the floor to cameras in the ceiling, all different sorts of things to better understand the activity in the home; actuators, simple things like opening and closing doors or blinds or windows, things that we kind of think of as home automation today; and different interfaces, whether that be interfaces kind of like we have in many places today, you know, flat screens or voice-activated interfaces. Now things like Siri or the Amazon Echo are fairly common; 15 to 20 years ago that was pretty challenging technology to build and get to work well. So we were really looking at all sorts of different things and how we integrate them into the everyday environment of an individual.
David Kruse: Got you, okay. And how did you originally get interested in robotics and healthcare? Do you remember, was there a project? Because you have been working in it a long time. So what was kind of the initial spark?
Cory Kidd: You know, that means going back even more than 20 years. I’ve had a long-time interest in technology and programming since I was, I don’t know, probably eight years old, back in the 1980s with my first Commodore 64. And going through university I was considering whether to go into technology or go into medicine, at one point trying to decide whether to get a degree in Computer Science or in Pre-Med, and I chose to combine both. So the academic route that I took was slightly more on the technical side; as I mentioned, my undergrad degree is in Computer Science and my grad degrees are from MIT, but there was a bit of clinical work as well. So for me it’s really been a long time about bringing those two things together, applying technology in the real world in ways that can, you know, practically help people with their healthcare.
David Kruse: Got you. And at MIT, you mentioned that you worked on projects across multiple disciplines, so was that an established lab or did you kind of create your own thing when you went there?
Cory Kidd: Yeah, I did my graduate work in the Media Lab, which, if you’re familiar with it, is known for being an amalgamation of many different things. So I was in a research lab that was focused on robotics. The Media Lab is a fairly large place. Within it there are about 25 to 30 faculty at any given time, each running their own research lab on a particular topic. And so while my degree was within the lab focused on interactive robots, or social robots, with Professor Cynthia Breazeal, my work was kind of bringing that together with healthcare applications. So if you are a student at the Media Lab, your coursework is a combination of courses from a number of different places. Usually a small number of those are courses taught within the Media Lab, and you can choose from courses across the rest of MIT, as well as some neighboring universities. Fortunately, Boston has a few good ones in the area.
David Kruse: Yes, just a few. Yeah, I’m curious to know, why endocrinology? Did you practice in that?
Cory Kidd: So a couple of reasons. I ended up meeting Dr. Caroline Apovian, who runs a nutrition and weight management clinic at Boston University Medical Center, in the department of endocrinology, and within that, the big healthcare issues the profession spends a lot of time focused on are diabetes and weight management. And if we are looking for problems that we can potentially help a lot of people with, this is definitely a big one, right. If we think about weight loss in the United States, we’ve got about two-thirds of our population overweight or obese. So from an academic or medical study perspective it’s great, because there is a lot of literature, a lot of studies done on this challenge. And just from a practical perspective, if it’s something where we can make a difference, it has the potential to positively impact a lot of people.
David Kruse: Got you, that’s for sure. And okay, when you were at MIT, what type of projects were you working on? Were they related to what you are working on now or were they different projects?
Cory Kidd: Yeah, I did a variety of different things, but the core of my work has followed a similar arc for about 15 years now. My first couple of years there, during my master’s degree, I created and ran a whole series of studies that of course involved technology, building and programming robots, but it was really more about psychology. So in addition to my advisor, Cynthia Breazeal at the Media Lab, and Ross McCarge [ph], who was also on my master’s committee at the Media Lab, I worked with Professor Cliff Nass at Stanford University, who had written a book two years before that focused on the psychology of human-computer interaction and what’s happening when people are interacting with a lot of the technology that we were used to having around us in the late 1990s. And we designed a series of studies to look at what’s happening when people are interacting with robots. The short summary of two years of work is that we found that face-to-face interaction makes a difference, right. I’m in business now. I spend a lot of time meeting with people in my company, with investors, with customers, with partners; I spend a lot of time on the road, and there is a lot of effort involved in getting to all of those meetings, all over the U.S. or the world. Now I could do all of that over the phone, right. We could set up phone calls or maybe a video conference, and certainly I spend time on those, but I still make the effort to meet face to face with people, as I’m sure everyone listening here does, because we get intuitively that there is a difference when we’re face to face with someone. Psychologists have studied for decades what that difference is: face to face, we create a stronger relationship, that relationship is going to last longer, and we find the person we are talking with to be more credible, more informative, more trustworthy.
And what I discovered is that these basic psychological differences actually carry over very strongly into the world of technology. In other words, when you put that physical thing, the robot, in front of someone, they can look at it, make eye contact, and share physical space. You get a lot of the effects of face-to-face interaction that you will never get through a screen. So the fact that you have something physical there means that interaction is much more like being face to face with another person, versus something mediated through text or voice or video. So that was really the foundation of the work I was doing then, and that led into, okay, what are the real-world applications of that. That’s when I started working also at Boston University Medical Center, developing both software and hardware to really implement this. So if you want to see what the earlier robots that I was building looked like, you can type my name into Google, C-O-R-Y K-I-D-D, and A-U-T-O-M, and you will see stuff dating back to my time at MIT, a decade ago, as well as stuff from my first company, which I can talk about in a minute. What I ended up building in my PhD work was essentially a robotic weight loss coach, and we put these in patients’ homes, doing a real-world randomized controlled trial, and that really went from this psychological theory, hey, it looks like this stuff is a lot more effective in interaction than something on a screen, to real-world proof. We saw that this really works with people, helping them engage with something for much longer, and of course that carries forward to what we are doing today at Catalia Health. So while the work at MIT, particularly in the early days, was much more theoretical, it evolved very quickly into practical applications, and that’s carried forward into the stuff I’m building today more than 10 years later.
David Kruse: Yes, that was a good explanation. You already answered one of my later questions about why a robot instead of an app, but that makes a lot of sense. I mean everyone should see this robot. It’s a cute little robot; I want one in my house. It’s well designed, which we can get into. But before we get into talking more about Catalia, you mentioned that previous startup. Can you share a little bit more about what you were working on there?
Cory Kidd: Certainly. So my first startup was called Intuitive Automata. This was the company that I started straight out of graduate school with a couple of co-founders, and we were focused on applying exactly what I was working on in my PhD work in the real world, and this company was a big bet. Now remember, this was in 2007 when I formed this company. I think we actually legally established the company a few weeks before the first iPhone was announced, let alone, you know, smartphones being shipped. So we are talking about a long time ago in terms of technology time, and what that really meant in practical application is that building the kind of stuff that we were doing, the hardware, the software, the interactive robot, cost about 100 times what it does today. So if we think about the effect of smartphones, there is the obvious effect, right: we’ve all got this really powerful device in our pocket, we’ve got these cool apps at our fingertips. But there has also been a huge effect in terms of product design and manufacturing. All the components that go into these devices that we carry around are now commodity components. Things like tiny low-cost, low-power processors, memory, screens, touch interfaces, all this kind of stuff was pretty rare 10 years ago, and if you could get it, it was incredibly expensive. So building this stuff was expensive. And so we built the company around the concept that, okay, we know this works; let’s figure out the business model for that. And we spent about five years doing that, and we did have some limited success in making sales to big pharma companies, to some of the largest health insurers in the U.S., to some of the large healthcare systems, self-insured employers, and directly to individuals. But because of the limitations in the business model, because of the expense of building the platform, it never really took off.
So you know, while we developed a lot of great technology and had some great applications around it, it was just too early for that company to really be feasible. So we ended up shutting that down about three and a half years ago, kind of leaving the company open to hold the IP that was created, but ceasing work on it at that time.
David Kruse: Got you, okay. And yeah, how was that whole process? How did it help you with starting Catalia next? Did you do things differently? I guess the costs were way down, so regardless that probably helped you a lot. But yeah, any other lessons?
Cory Kidd: Yeah, so the costs have come down, definitely. You know, I think the biggest thing from that company was partly just time, right; the fact that we are almost a decade on now means of course the costs of the technology are much, much lower for building this kind of stuff. But it was also developing a much deeper understanding of the healthcare system, right, from selling into those five different channels I was talking about. You know, a quick aside: I based that company in Hong Kong largely for economic reasons, the inability to fundraise in the U.S. around healthcare or hardware back in 2008. But about three years ago I moved back to the U.S. and spent about a year trying to answer a couple of fundamental questions. So starting with this technology that we knew worked very well, question one is, okay, where could we apply this in healthcare? And the short answer is kind of everywhere, right. There are a lot of applications where changing our behavior, helping us to adhere to something, is really critical to healthcare outcomes, so obviously a huge opportunity. And the second question then is trying to understand, okay, who is the customer for this, who is willing to pay for this today? So leveraging the network that I built running that first company for five or six years, I had a lot of conversations with many of those companies and individuals, and that’s really what helped me focus in on the specific areas that we are working on at Catalia Health, which are chronic disease management and medication adherence. Still two enormous challenges, but focused on particular application areas where a lot of people in healthcare are actively looking for solutions.
David Kruse: Got you, okay. Well put. And so you started Catalia in 2014, right? Can you give everyone a brief overview? You kind of did, but maybe describe some use cases and a little bit more of what Mabu the robot does for people?
Cory Kidd: Certainly. So we are building a platform, and what I mean by that is the core software and hardware for Mabu is shared across a lot of different applications. The easy part to see is the robot; there is a short video on our website, cataliahealth.com. The robot was designed by IDEO, and we use it to interact with patients across a variety of applications, and our core software platform also gets shared across those applications. So there is the kind of infrastructure stuff, right, that makes it work, that makes the robot move, that communicates back with our backend platform, but there are also the core artificial intelligence algorithms that draw directly from psychology: understanding how we create these conversations, how we create a relationship with that individual that gets built up over time, and in general how we follow up with patients. And then on top of that, what we are doing is building particular applications. So our customers at Catalia Health are not the individuals that use the robot, but either the drug makers, the large pharmaceutical manufacturers, or the hospitals and integrated health systems. And for those customers we are building applications around particular diseases that patients are trying to manage, or particular treatments that they use for the disease. What actually happens for the patient, then, is they don’t need to know any of the complicated technology behind it. They get this device mailed to them at their home, at no charge to the patient, by the way; our customers, the pharma companies and integrated health systems, are the ones paying us for it. You know, a patient taking a certain drug might get this mailed to their home. They take it out of the box and plug it in, which is the entire setup process; everything else happens through conversation, and the robot will turn on, maybe stretch its neck, look around, and: Ah!
Thanks for taking me out of that box, good to see you. I’m sorry to hear about your cancer, but I’m going to help you make sure you take your medication, answer any questions that you have, and I’ll get some information back to your pharmacist so that he or she can help you as well. So the use case for the patient is these everyday conversations. Maybe it’s just going to check in: Ah! I want to see how you are feeling today, you know, is everything going okay? And you can have a short back-and-forth conversation, kind of like talking to Siri or your Amazon Echo, where maybe it’s checking in, seeing how things are going, giving a little advice at that point in time, and reporting that data back to either a pharmacist or a doctor, depending on who our partner is for that particular application.
David Kruse: Interesting. And so that dialog piece sounds fairly complex. It feels like there could be a large number of potential dialogs that could occur, whether you end up talking about diabetes or whatever it might be. You mentioned AI. Is that dialog continually being updated based on responses, or how are you designing that?
Cory Kidd: So there is an entire set of algorithms behind really crafting that dialog for that patient at that point in time. We are building up these models of an individual: understanding what they need medically, understanding psychologically what this particular patient needs at this point in time, and that is part of what goes into crafting that specific conversation for the individual. So there is a lot going on behind the scenes as a result of that conversation that’s helping us to better model that person so we can choose what we are going to say next. And there is a huge database of content that the robot can then draw from in those conversations. The result for the patient is that no two conversations are the same. It’s always learning about and adapting to the individual and trying to talk about what they need at that particular point in time.
David Kruse: Interesting. And so those algorithms, did you develop those starting in 2014, or did you have some of that already done by the time you started Catalia?
Cory Kidd: So we have done a lot of new work here at Catalia Health, but the concepts that we build on are stuff that I’ve been building for over a decade now.
David Kruse: Okay, wow! And what type of use cases or diseases are you currently working on? One of those is prescription adherence, but you mentioned maybe some other disease management ones as well?
Cory Kidd: Yeah, you know, largely things in oncology, dealing with different cancers, and congestive heart failure. So there is a variety of different applications that we are working on at Catalia Health.
David Kruse: Interesting. And what’s the difference, and I don’t know if you know this off the top of your head, but for somebody who has heart failure versus cancer, how is the dialog different, especially initially? Or is it pretty similar initially and then it changes and adapts over time?
Cory Kidd: So the latter, right. When we start off, it’s just like you are meeting someone new, right. The first thing you are going to say is probably, you know, my name is, nice to meet you, right. That’s how we always start off when interacting with someone. But from there it’s going to go in a very different direction, right. Why are you meeting this person? What’s the context? Where is this conversation supposed to go? Is this something casual, is this a business conversation? What’s the context and the direction of that conversation? And the same kind of thing happens here, right. So in the first conversation, she is going to start off with a greeting, and the robot will introduce herself to the patient, and then from there maybe talk a little bit about the purpose and how they are going to work together, but then it’s going to get into disease-specific topics. So part of what our team is doing when we are building a new application is focusing in on what the challenges are for that particular patient population. You know, if we are talking about, say, a particular cancer where our patients are 70, 80, 90 years old, there is a different set of things that they are dealing with than if we are talking about a disease that typically affects people in their 30s, 40s, 50s; the types of things that those two groups of patients are going to be dealing with are going to be a little bit different. And the other things that we look at, of course, are what are the challenges relevant to that specific disease, right? What are the questions I need to ask each day or each week about how that disease is affecting you? So it’s going to be very different for the heart failure patient versus the cancer patient. We care about different things medically that are going to be most relevant to that individual.
And so a lot of what’s going on as we are building these applications is understanding those particular needs, and that will then end up driving the conversation with the patient later on.
David Kruse: I can see why you have been working on it for 10 years. That does not seem like an easy problem to solve, and I’m sure you are always working on it some more. That’s interesting. So I’m curious, and I don’t know if you can share whatever stats you have on adherence or usability. Once it comes into somebody’s home, you know, a lot of times people stop using technology. How do you encourage that daily interaction, and how many people actually continue to use it on a daily basis?
Cory Kidd: So if we look at a lot of the technology solutions trying to solve this problem, they think first about the technology. And what I mean by that is there are, I don’t know how many, but in the hundreds if not thousands of apps that you can download for your iPhone or Android phone that will help you remember to take a pill. And this is because that kind of an app is easy to build on that device, right. We can set an alarm, we can send a reminder. Now, as I mentioned a minute ago, one of the things that we look at in each disease state is what the challenges are for that patient. One of the things that we research is the so-called barriers to adherence, in other words, the problems that those patients actually have with staying on therapy. Typically we come up with a list of four or five challenges that are most common for that patient population, and forgetting to take a drug is always on there, and it’s always the last thing on that list. In other words, there are a lot of other issues that we have as individuals besides forgetting. There are a lot of other reasons why I might not want to take that pill today or give myself that injection. So there are a lot of different aspects of this that are important, and we really start with two things when we think about building these applications. First, what I was talking about a few minutes ago: the relationship. The key thing that we are trying to do is build up the relationship between the patient and the robot, right. And we are not trying to replace people with this; we are not trying to make people think this is human. If you take one look at the robots that we are talking about, you will see very clearly that these are not human-like. They are plastic, currently bright yellow plastic, but they do have eyes, right. They can look at you; there is something to focus on.
But those conversations are about how you are going to work together and how to build up that relationship. And so that’s really the foundation of what we do, and then it’s talking about what the challenges are for that individual. What we find is that, yes, there are some medical challenges around side effects and side effect management that we can learn about from the patient, but a lot of the challenges are psychological. Right, if we are later in life and dealing with chronic disease, that affects us in many ways besides just medically, and we have conversations with a patient around all of these things. And as a result of this, what we see is an enormous difference in how patients use this kind of technology over time versus what patients will do with an app. We’ve seen a much greater rate of patients sticking with it over long periods of time versus what most technology is able to help patients with today.
David Kruse: Interesting. It makes so much sense. I mean, you are right: people bring that kind of technology-first perspective to healthcare and that’s what they lead with, but really there is so much more to it, as you have been talking about today and have been researching for many years, from the psychology perspective and, yeah, the emotional perspective. So really it involves kind of a holistic perspective to design high-quality healthcare technology, especially when it’s in somebody’s home. Okay.
Cory Kidd: Absolutely, starting with the healthcare and the patient, as opposed to the technology, and what makes it usable and useful to individuals.
David Kruse: Okay. And so we talked a little bit about the design of the robot, and I wish everybody could see it, but I guess you can; I’ll put a link on the website. You mentioned IDEO helped to design it, so can you talk about the design process for the robot and why you picked this particular design?
Cory Kidd: Sure. So you know, I mentioned a minute ago the robot has eyes, and that is actually one of the key things. The reason for that is not to make it look cool; this actually goes back to the psychology research that I was doing 15 years ago. One of the things that we found in those early studies is that the ability for the eyes to move around and make eye contact was actually really important for these kinds of applications. Now, we know that in general with people, eye contact is really important. Imagine meeting someone and having a five-minute conversation where the entire time they are staring past your shoulder and looking off into the distance; you are going to find that person a bit awkward, a bit strange, right. It’s just not quite the same as having a conversation like we are used to. What we learned in those early studies is that eye contact with the robot was also very important. Now, it’s not the thing that sustains a long-term relationship, but it is the thing that helps draw that individual in at first. You know, imagine you are at a crowded party in a big room and you are all the way across the room from someone who you see look in your direction. You can instantly tell if they are looking into your eyes or looking right past you. It’s just something we can innately do. Eye contact is something we are hard-wired for, and we are not even the only species that does this; dogs, for example, can do the same thing. So eye contact is really critical. What that meant, going in to work with IDEO on designing this robot, is that we had a very broad design brief that we gave them. We wanted something roughly the size of a kitchen appliance. We could build this thing to be the size of your smartphone, but then it’s a little too small. You don’t take it seriously, you can toss it behind some paperwork, drop it in a drawer and it’s gone.
But at the same time we don’t need something that’s human-sized, right; that just takes up too much space and no one is going to want that in their home. Something countertop, appliance-sized, the size of say a blender or a toaster, actually works pretty well. So we gave them that rough size criterion. We also needed a touch screen on it. Now, voice interaction in general is getting better and better with things like the Echo, but it’s not going to replace visual interaction completely, and in these kinds of applications we are always going to need the screen. Imagine a scenario where, say, I have to give myself an injection every week or two, or every day, or whatever it might be. The robot companion can say to me, oh! I know you are just kind of getting started on this and it can be tough at first. Do you remember how to do it, or do you want me to show you that video again that your doctor did? Right, so we use the screen for things like that; even though most of the conversation is literally conversation, spoken back and forth, having that kind of capability is important. And also, for reasons of the demographics that we are working with, many of our patients are older and have a harder time seeing or touching something on a screen, so we needed a decent-sized screen. It couldn’t be, you know, a little three-inch screen like on a smaller phone; we went with a considerably larger screen. And finally, we needed the eyes; the patient needed to be able to look at it and make eye contact, for the reasons I was talking about. So those were really the three things that we went into IDEO with: rough size, we needed a screen, and we needed eyes. And from there we had a very broad, wide-open design process.
We spent a few weeks with the industrial design people looking at all different sorts of directions, lots of different inspirations for what this might look like, and then from there it was really an iterative process of narrowing down until we came to something that looks close to the Mabu robot that you can see on our website now. The overall process was only about a month to a month and a half in terms of the industrial design, a fairly quick process. We had their whole team at the IDEO San Francisco studio working on this, so a lot of excellent designers put a lot of effort and thought into it, and they came up with the product that you see there. Then came another round of hard work turning that into something that’s manufacturable. There are of course challenges around that as well, but the initial design was done in about a month and a half.
David Kruse: And where do you manufacture it?
Cory Kidd: We have manufacturing partners in China that we are manufacturing with.
David Kruse: Got you, interesting, okay, which makes sense, and so you are over there, and I’m sure IDEO has a few contacts over there too. Interesting! Okay, and so what technology is in Mabu? You know there is a speaker and a microphone and there is a camera. Do you actually track people’s eyes, or how do you try to get that eye contact?
Cory Kidd: Yes, there is a camera there. We know where the patient is and who they are, so we can look at them. We are not using that camera for anything else. In other words, none of the video ever gets recorded or sent off of the device; it’s just used in real time to know where you are and to understand something about your emotions and the interaction. We use the microphone for the same reason, right. We are able to listen to what the patient is saying during conversations, and just like the video, that audio never goes anywhere. It’s not even stored locally on the device. We just use it to listen to what someone is saying so that we are able to have that conversation with them.
David Kruse: Got you, okay. And we are getting a little near the end of the interview, but I’ve got a couple more questions.
Cory Kidd: Sure.
David Kruse: Yeah, one of them is: what’s your vision for Mabu the robot, or other robot interactions, over the next five to 10 years? If you had this ideal robot in five years, or in 10 years if it takes that much time, what would it be doing?
Cory Kidd: Well, I can tell you where our robots are going in the next five years. I described the scenario where the robot comes out of the box and starts talking to the patient today, and you know I think that’s a great application. We start shipping these to patients in the fall. I hope it will be very helpful for many patients, and I think on the business side we’ve built a great model here. But the reality for those patients, particularly the older patients suffering from chronic conditions, is that they are probably dealing with two or three or four other things, and they might be taking another 10 or 15 pills a day. So while we are starting by helping patients with a particular disease, in five years when that robot comes out of the box, it will stretch its neck and say, thanks for taking me out of there, I was getting so cramped, and I’m sorry to hear about your cancer. I know you have been dealing with diabetes for years and you’ve got high blood pressure. Look, here is the list of the 11 medications you take every day, and I’m glad to be here. I’m going to help you with all of this, and we’ll get information back to your doctor or your nurses and your entire care team so that together we can provide you with the best healthcare we can. So that’s where we are going with this: really focused on being an interactive coach that can help any of us with all of the ongoing healthcare conditions that we face, and to again provide the best care that we can to a large number of patients.
David Kruse: And so I’m interested, and maybe it’s kind of built in, but it’s more around adaptive learning. If you are helping someone with diabetes, how do you know it’s working, and if it is, how do you put that back into your model to keep improving it? Because I can imagine over time you have a really smart robot, but some of this learning is offline, or maybe it’s not all embedded into the robot’s interaction with the person, or maybe it is?
Cory Kidd: Yeah, absolutely. Well, it’s a combination of things. A lot of it can happen in real time with a person, but we are also learning and improving the model across groups of patients: understanding how our behavioral model can work better with patients, and understanding medically how we adapt to giving the right information at the right time to an individual. And this goes even beyond what we are doing here at Catalia Health. Tuning the specific conversations we have with patients is of course something we are building, but we also watch medical best practices in certain areas and disease states. The medical portion of our product team is always looking at what the best practices are for a certain patient population, so that we can always deliver the best information we can to provide that patient with good care. There is a lot we can do as we learn from more and more patients over the next several years to get that kind of feedback and improve the system, but again, healthcare is a constantly changing field. We are always learning more, and we’ll be able to employ that in the conversations that we have with patients.
David Kruse: Got you, okay. And you said that you are officially shipping this fall, is that what you said?
Cory Kidd: Correct, yes.
David Kruse: Oh! Cool. Well, that’s soon.
Cory Kidd: Very.
David Kruse: Yes, yes it is. So, and this is the last question, has Mabu been in many homes so far helping patients, and if so, do you have an example of how it helped a person? Maybe not, since I know you haven’t officially shipped it yet, but…
Cory Kidd: Yeah, this version of the product will go into its first homes long term later this fall. We’ve done a lot of short-term testing with patients with this version of it, and previous versions of the technology that I built have been in patients’ homes for months. You know, I can go back to the very first time I put these in homes, which is almost 10 years ago, back in 2007. That was a trial that was supposed to last only six weeks. A few weeks after that, about two months in, I finally got most of the robots back. As it turns out, one of the challenges of running that particular study was that at the end of it patients didn’t want to give them back to me.
David Kruse: Oh! Wow.
Cory Kidd: You know, they tried to negotiate: just a few more weeks, maybe one more month. When I went back to get them, and at that point I had hand built all of the robots, so they were a bit bulky given 2006, 2007 technology, patients had dressed them up. They were wearing hats, scarves; one of them had a red feather boa around its neck. Every single person had named their robot, unprompted. So we’ve seen a lot in terms of how the relationship really develops with individuals and how people really like these things. They really find them helpful in sticking with what they already want to do with their healthcare, while running into the challenges that all of us do when we are trying to make real changes over time.
David Kruse: That’s good. Yeah, I remember seeing on one of your videos, they put some kind of clothing or a scarf around Mabu, and I’m like, oh, that’s clever. So that came from that experience…
Cory Kidd: Exactly.
David Kruse: Awesome. Well, that’s a good way to end this interview. So Cory, I definitely appreciate your time. What you are working on is fascinating from the technology perspective and also just good for humanity, so it’s exciting to see where you guys go with this stuff. And maybe someday, well, hopefully I don’t have too many health issues, but maybe someday I’ll have Mabu in my home too. That would be awesome.
Cory Kidd: Well, we will get them out there for a lot of different reasons.
David Kruse: Yeah, I mean, I’m sure you have thought about other potential ways to use Mabu outside of healthcare, or even just for wellness and healthy living, and yeah, that’s another whole podcast. So anyway, I definitely appreciate your time and your thoughts, and thanks for sharing your experiences with us today, Cory.
Cory Kidd: You’re welcome. Thanks again for having me on.
David Kruse: Yeah. And thanks to everyone for listening to another episode of Flyover Labs. We’ll see you next time. Thanks everyone.