
The Prosthetics and Orthotics Podcast
The Prosthetics and Orthotics Podcast is a deep dive into what 3D printing and Additive Manufacturing mean for prosthetics and orthotics. We're Brent and Joris, both passionate about 3D printing and Additive Manufacturing, and we're on a journey together to explore the digitization of prostheses and orthoses. Join us! Have a question, suggestion or guest for us? Reach out. Or have a listen to the podcast here. The Prosthetic and Orthotic field is experiencing a revolution where manufacturing is being digitized. 3D scanning, CAD software, machine learning, automation software, apps, the internet, new materials and Additive Manufacturing are all impactful in and of themselves. Now these developments are, in concert, reshaping orthotics and prosthetics. We want to be on the cutting edge of these developments and understand them as they happen, so we've decided to do a podcast to learn, understand and explore the revolution in prosthetics and orthotics.
Visionary Hardware for Enhanced Patient Outcomes with Ravi Shah
This episode reveals how Structure Sensors are reshaping healthcare technology, emphasizing their role in improving patient care through accurate 3D scanning. Ravi Shah shares insights into the importance of collaboration and data sharing in the medical field, discussing the intricacies of their hardware, software, and the journey toward providing better solutions for clinicians.
• Exploring Ravi's journey from tech to healthcare
• The critical role of 3D scanning in patient outcomes
• Importance of open platforms and data sharing
• How Structure Sensors ensure accuracy and consistency
• Practical advice for clinicians integrating 3D technology
• The promise of technology in narrowing healthcare gaps
• Insights into user-friendly developments for practitioners
• Future aspirations for quality care
Special thanks to Advanced 3D for sponsoring this episode.
Welcome to Season 10 of the Prosthetics and Orthotics Podcast. This is where we chat with experts in the field, patients who use these devices, physical therapists and the vendors who make it all happen. Our goal? To share stories, tips and insights that ultimately help our patients get the best possible outcomes. Tune in and join the conversation. We are thrilled you are here and hope it is the highlight of your day. Hello everyone, this is Brent Wright and Joris Peels with another episode of the Prosthetics and Orthotics Podcast, and I am super excited to have Ravi Shah on the show today to share a little bit about his journey with Structure Sensors. There's a lot of history there, a lot of stuff to get into, so I'm really excited to hear some of that journey and where you are headed as well. So welcome to the show.
Speaker 2:Thank you so much, Brent. I really appreciate the warm welcome. For me, it's an absolute pleasure to be invited to the show. I love the work that you do in the industry, and I really love listening to the different perspectives that you bring to what's happening and how we can make everything a little bit better. So thank you, it's really an honor to be invited.
Speaker 1:Yeah, well, I'm really excited to dive into some of this stuff. You know, what's interesting about doing this podcast is that my mind changes, so I guess you have to be a little open-minded. And I think this says a little bit about life as well: if you have an open mind and try to listen and learn and be okay with changing your mind, I think that's a good way to live.
Speaker 2:Yeah, a hundred percent, especially working with a lot of clinicians and in the healthcare context. I don't have that background. I come from tech, and so I find that I don't know very much, and every day I'm learning a little bit. For me, opinions are really fluid. Especially given the state of where healthcare is today, the use of any kind of 3D workflow is really nascent. In terms of technology, we're really at just the beginning of it, and I'm really excited to see where we're going to go, and whether we can make it a little bit better across the board and help all parts of the industry get a little bit better. You know, I like to say a rising tide lifts all boats. I've always held that to be true, and so I'm excited to see where we're going.
Speaker 1:Yeah, well, we always ask the question: how did your journey land you at Structure? You've already foreshadowed that you have a background in tech, and some pretty cool stuff there, and then you've brought your talents to Structure. So yeah, share a little bit about that journey.
Speaker 2:Yeah, thank you, Brent. I'm sort of a giant nerd. I guess it's not "sort of"; it's really who I am. I was actually at Google for about 12 years before I joined Occipital, the original company that created the Structure Sensor. At Google I spent my time doing a little bit of everything, from answering support phones to working with partnerships, to working with engineering and being a product manager, so I've sort of run the gamut there. I joined Occipital in 2017.
Speaker 2:I was really looking for a set of technologies that would make people's lives better. I had worked in mobile and Android for a really long time before that, and we thought we were making the world better because we were making mobile devices ubiquitous. When you can give somebody this beautiful device that allows them to ask a question and get an answer instantly, at least for me, that was sort of like Andrew Carnegie's vision of the library. It was the great equalizer. You could change the way that people thought and worked, and we saw it all over the world. We worked everywhere, from the streets of London to the slums of Delhi, to understand how people use these devices and how we could really get them into the hands of everyone. At least for me, I was looking for the next set of technologies that I thought would help humans and sort of give them superpowers. And when I looked ahead, it wasn't just what we had on mobile, where you could ask a question and get it answered. It was a set of technologies on the horizon, namely computer vision and machine learning and these little sensors that are in your devices. All of a sudden we could get to the point where, instead of having to ask a question, our devices could just answer those questions for us, and that was a whole other set of superpowers. I was really looking for a way to move in that direction with my career, and I knew I didn't want to do it at Google. I like to tell people Google grew up and I didn't. So I was introduced to this little startup called Occipital in Boulder, where we had moved, and it was just beautiful. They had a lot of beautiful technology, a lot of products and a lot of tech, and so I actually joined, originally, to lead the AR/VR team in 2017. I joined Structure in late 2019, along with Paolo, who is now my partner in crime.
Speaker 2:But we found this beautiful, beautiful team that was selling sensors. Nobody internally really knew who we were selling them to, and so we dug in and realized that healthcare had sort of adopted these sensors. There were 100 apps or more that were really designed to treat humans, to analyze the body and build custom therapies. And Paolo, bless his heart, really pushed for this to be our focus. I was a little skeptical at first because the company really had been focused on things like AR, VR and robotics, but over time I came to understand that it really was the thing that we did. So we spent our time really digging into it, and we found that there was this huge user base that we didn't serve at all. Honestly, I think people made these beautiful apps on our platform, and we sort of left them to their own devices and sold sensors, and that was it. So Paolo and I really spent the next couple of years trying to focus the team on those use cases. The Mark II that had launched wasn't a really good fit for the healthcare use case, and so we spent a long time trying to refocus the team back towards that. And so we launched the Structure Sensor Pro in 2021.
Speaker 2:And in 2022, we convinced the board: hey, having a really healthcare-focused startup in the middle of an interior scanning and spatial computing startup just wasn't good. Neither would thrive. We had two businesses joined at the hip, and we convinced the board to let us spin out. So we spun out in 2022, really with that mission in mind, the idea that we could build a beautiful healthcare platform. If we could make it really easy to capture information about the human body, analyze it, and give tools to clinicians, they would be able to diagnose people anywhere around the world. And really, we looked at who was using our products, and we were everywhere. We were ubiquitous. Whether it's MSF scanning children in austere environments who've stepped on landmines, to fit them with prosthetics, or the Mayo Clinic, you can find a Structure Sensor. This idea that we could take that platform and use it to improve the lives of humans, and make sure that no matter who you are or where you are, you get the same quality of diagnosis, really fueled our desire to go out on our own. That's really what we've been focused on since we spun out: how do we make this a big, beautiful, ubiquitous thing where it doesn't matter who you are around the world?
Speaker 2:I like to say that there's this huge, huge gap in terms of quality of care when you look at clinicians. I used to live in the Bay Area, and my daughter's pediatrician was actually the chair of the Department of Pediatrics at Stanford. She would know things about our children that I couldn't even fathom anybody deriving from just a quick look, and it was because she wrote, published and consumed peer-reviewed research 20 to 30 hours a week. We moved to Boulder, and our doctors are fantastic, but the clinicians don't have that luxury. They have to treat 30 or 40 kids a day, and when you're working 10 or 12 hours a day, you don't have time to be reading everything that's in the latest medical journals.
Speaker 2:The reason why I think the technology is so powerful, this confluence of computer vision and machine learning and those types of things, is that we can provide those insights to everyone. If we as a tool set can make that quality of care ubiquitous, so that everybody, anywhere, no matter what resources they have access to, can get that same diagnosis, we've done something really good. That's sort of our mission. I know everybody thinks we make sensors, but really we make a lot of software. We try to collaborate with others. If folks have a sensor of their own, we try to see if we can support it in our platform as well, and we're really just trying to see how we elevate the quality of care for humans.
Speaker 1:I think that's super cool. One of the things, though, that is interesting, and that I would love for you to dive into and give us the nerdy answer to, is: why build something separate? People are like, well, you've got your Androids and you've got your iPhones and your different versions. These things are good, the cameras are good, the hardware is good, but we know there's so much variety, right? You've got good stuff, bad stuff, stuff that gets deprecated, stuff that doesn't play together well. So why even do something that's separate?
Speaker 2:I think that's a really good question, and the real answer, very truthfully, is that we'll work with anything. This idea that we're trying to build something really ubiquitous implies that we need to support whatever device you have, and so we'll give you the best 3D reconstruction, landmark detection, measurements, things like that, that we can, regardless of what device you come to us with. We even work on every device via photogrammetry internally; we don't have something that we feel is good enough for clinical use yet, but we're still working on that. But there are a few big reasons why we make our own hardware. One is because we can ensure consistency and quality. The level of accuracy that you can consistently get from a dedicated device is high. And we work very closely with Apple. We have former team members there, I have former colleagues there, I have former bosses and employees who work at Apple, and my CTO and partner Paolo has a lot of colleagues there too, in the calibration teams and things like that.
Speaker 2:The problem is that, when you look at even the active depth sensing on these devices, we're not the consumers. They're not made for clinical purposes. The consumer of even the front-facing TrueDepth camera really is the Face ID team at Apple. It's designed for them. It's a black box. Whether it's hardware changes that come from year to year that change the baseline or the projector, or parts that are becoming more efficient, or minor iOS revisions changing the calibration parameters, it really is like the Wild West if you're trying to consume from many devices. And so, like I said, we actually do set up robot arms; we have a proprietary calibration system that we built, which we use to calibrate iOS devices and things like that in order to get the best possible results. But nothing beats the consistency and accuracy of having your own hardware. So that's one piece, and the other piece really is this.
Speaker 2:You know, I've worked at this intersection of hardware and software my whole career, and what I found is that the discipline required to make your own hardware is important, because it allows you to understand how it's used in the real world, what the physical interfaces are and how that works. When you make your own hardware, I really fundamentally believe that your software is better, because you have a full understanding of that entire ecosystem, of that full usage from top to bottom. The opposite is true as well: I think making our own SDK, and even making our own internal apps, and helping people build their apps and going out in the wild and tweaking them, makes us better at hardware. We can build the best thing, in my mind, if we touch all of those things. And it's more than just 3D sensing.
Speaker 2:At least for us, we do a lot of work as we look beyond that. We have a lot of work done in volumetric reconstruction, and we have a lot of technology that we've built in-house for how you marry different types of sensing modalities. It starts with 3D, but in a few years we're going to have other types of insights. We're going to layer temperature, microbial load and other types of sensing over this, and all of a sudden you start to see intersectionality between these things, like just by adding temperature to a Structure Sensor, being able to 3D reconstruct and then extracting landmarks from that using our own hardware.
Speaker 2:Now we can diagnose things like diabetic ulcers. We can find them before they're a problem. We can find pressure injuries. We can find things that are non-obvious. We can help with wounds and things like that, and by adding microbial load we can go even deeper into wounds. Looking at dielectric properties of the skin, we can start finding melanomas.
Speaker 2:And so not only can we do more by making our own hardware, we can add sensing that just doesn't exist in the world. We can start finding intersectionality between these different modalities, and if we can do that, we'll hopefully give clinicians the tools to find insights about the human body that we've not found before. Mapping the human phenotype doesn't sound as exciting as mapping the human genotype, but I think there hasn't been enough work done there to really find all those intersectionalities, and I think we can help make the world a better place if we do that. So yeah, I'm sorry, that was a very long answer, but we make our own hardware because it's important. It makes us better, it makes our SDK better, it makes the apps better, it makes everything better. But also, we can do a heck of a lot more. We can make a significantly more accurate, more consistent experience for clinicians, and we can start doing things that nobody's really done before.
Speaker 3:It's also interesting, one thing here: I went to two different museums, and there were three different art pieces that were all kind of interactive 3D scanning art pieces, and all of them used the Microsoft Kinect, right? Which was discontinued, I think, 15 years ago, if I'm not mistaken.
Speaker 2:It was discontinued the same day that they stopped making projectors for the original Structure Sensor, because PrimeSense was bought by Apple in 2013.
Speaker 3:Exactly, that was at one moment. But still, 15 years later, they're still using these Kinect things to do this, because there isn't an alternative, right? There isn't an alternative where somebody at a museum, or an artist, or some organizer can turn to a kind of hackable sensor thing. Every single time I've paid attention to this, and I've seen this more than a few dozen times now in museums, where you have some kind of thing where you wave your arms or something, it's always the Kinect sensor. And I just checked on Amazon: you can still buy them, the Xbox One Kinect sensor. You can just buy them, and that must be what's keeping this alive. So no, I think you're right.
Speaker 2:And, by the way, it's a pleasure to meet you. Sorry I was late.
Speaker 3:No, not at all, actually. Life is unpredictable.
Speaker 3:And I love what you guys are doing here. But, you know, you could have kind of presented yourself as the universal scanning platform for everything, right? Why choose this instead? I mean, a lot of people have tried doing that, and it hasn't really worked for anyone, right?
Speaker 2:Yeah, I mean, I guess, just very transparently, like I was telling Brent earlier, we're really nascent with regard to the use of 3D technology in healthcare. Like you said, the Kinect has been gone forever and it's still being used for crazy things out in the wild, and people are doing beautiful new things with it. And from a healthcare perspective, the industry moves really slowly. There's this concept in venture capital that drives me nuts, I hate it every time I hear it, but they talk about Eroom's Law, which is the inverse of Moore's Law. Instead of silicon getting increased transistor density and speed every couple of years, it's the opposite: medical devices and pharmaceuticals get more expensive and take longer to produce every few years.
Speaker 2:Our real mission is to sort of realign those innovation cycles, and so in healthcare, 3D is still relatively unheard of.
Speaker 2:In an industry like orthotics and prosthetics you see a lot of it, but it's still just the start, and so we're not going to get anywhere meaningful if we try to build an ecosystem or a platform that tries to suck all the air out of the room. That's never been something I've been interested in.
Speaker 2:Personally, I think if we're going to make the world a better place, it really is going to be by being as open as humanly possible, supporting as many devices as possible, and supporting the workflows that people actually have and the devices they have in hand. Whether it's something that has one of our sensors attached or not, we're going to do the best that we can with it. And whether it's our cloud system or the SDK, we're actually pretty ubiquitous. We'll give you measurements on meshes that came from anything. Somebody could take, I don't know, a block that they see and model it themselves by hand and send it to us, and we'll still give you landmark recognition if we can do it. I think that's the right way to approach things. The more open we are, and the more we work with other people's technology, the faster things will get better for clinicians, which just means better outcomes for actual humans at the end of it.
Speaker 3:And if you're looking at who's buying and actually using your sensors, who are they? Is orthopedics a big part of that? What kind of doctors are using it? What kind of patients are using it?
Speaker 2:It's actually pretty broad. I think our biggest single market is really the combination of orthotics and prosthetics, and I hate lumping them together, because the needs and uses are so nuanced that I could spend the rest of my career understanding just the workflow of one clinician and not really understand it in its entirety.
Speaker 2:But that really is a big part of it. But we do everything. We have several partners that we've helped through the FDA 510(k) premarket process. Some of them do things like wound care in the VA; I think we're approved for that use case. We actually do neurosurgery. We have a partner called Skia in Korea that we work with for surgical navigation, and they've actually outperformed Medtronic's StealthStation in 80 out of 80 neurosurgery trials, and we're just about done with their FDA premarket notification for that.
Speaker 2:We do, oh gosh, a thousand little things across the world, everything from actual healthcare use cases down to non-healthcare use cases, things downstream. If you go and want to buy a custom pair of goggles from Smith Optics, we power that, along with a lot of other consumer-facing applications. Bath Fitter uses us to measure bathrooms really quickly to figure out what will fit. So the use cases are pretty broad. In a healthcare context, though, I think that's about 99% of our use, and really what interests the team and gets us up in the morning is: how can we make healthcare better? That's cool.
Speaker 3:And then, so one thing is, you're kind of facing a choice with this healthcare thing, because you could do three things, essentially. You could make a super cheap one, you know, $50 for everything. Or you could do the wound thing, looking into the wound, looking at your DNA, I don't know, whatever, the super mega advanced one, right? Or you could make 17 different versions for everyone. What are you choosing to do, or what are you focusing on?
Speaker 2:Yeah, and I think that's a really good question. You know, I'm a product guy at heart. I've shipped a lot of really good products, but also a lot of clunkers, and one of the things I've learned is that having a strong opinion makes a better product. If it doesn't feel painful to bring it out, if it doesn't feel like you had to make some compromises in your vision to get the thing out, you end up with something that doesn't have an opinion. I like to liken it to The Simpsons. There's an episode where Homer's long-lost stepbrother, who owns a car company, finds him and hires Homer to design a car, and he creates the Homer car. It's terrible. It's an everything car; he threw every feature he could ever imagine into it, and it was the worst possible car that ever existed, because of that.
And we take that same approach internally.
Speaker 2:So one of the reasons we actually do try to support as many sensors as possible is because we know that there are parts of the world, and people, for whom buying a standalone sensor is an unattainable thing, and so for us it's really about building dedicated hardware for the people who need it and the use cases that need it.
Speaker 2:There are many, many use cases that need consistent, accurate, device-to-device performance, right? Especially a lot of these partners who go all the way through the 510(k) premarket process and things like that, and even use cases before that. We talk to people making head orthoses, or braces and stuff like that, scanning children and trying to custom fit those. You have to have not only a high level of accuracy, but you have to be robust to movement and things like that, and if you're off by a couple of millimeters in those use cases, it's a horrible experience for the child. So I think we make our own hardware because sometimes you need it, sometimes you need that level of accuracy and consistency, and then for the use cases where you don't, we're happy to support.
Speaker 3:Whatever device you have on here. And why is 3D scanning so hard? I mean, I used to do these reports about the 3D printing market, and it was like, Ultimaker now has the Ultimaker 3 Plus, or now they have a bigger system, you know, incremental improvement. And then I'd have the same report for scanning, 2015 in scanning, and half the players would be bankrupt, and it wouldn't work anymore. It's been a bloodbath, comparatively, even to 3D printing, which is very competitive in places. So why is it that difficult? Is it that intersection thing you were talking about before?
Speaker 2:Yeah. So I think part of it, actually, is that a lot of the core technology, at least on the geometric side of computer vision, hasn't changed. Just like everything else, there are papers written eight years ago that really guide you. One of our computer vision leads, our calibration lead, when we were interviewing her, we joked that Google's revolutionary 3D telepresence system, Project Starline, was basically the exact same concept she had developed in 2001. So a lot of these core technologies haven't changed. The problem is that a lot of companies approach this as: oh, you capture something, and now you go figure out what to do with it. And if that's all you think about when it comes to 3D capture, you're sort of doing a disservice to the customer, because it takes an entire workflow for it to be a thing. Just capturing something doesn't matter. That's actually why we invest so much time and effort into the software side of things: the power doesn't come from us and the things that we make, it comes from what people do with it, and that SDK, and that ability to build your own apps and find new ways of doing things, is really important. And then I think it's that intersection of geometric computer vision and machine learning where a lot of the innovation comes from, because a lot of companies now are actually eschewing these original geometric computer vision techniques, thinking they're just going to machine-learn their way to every answer, and that's a big mistake. I think geometric computer vision is really good at solving certainties, and machine learning, whether it's computer vision or otherwise, is really good at solving for uncertainties.
And if you can constrain the problem, understand the certainties, and then use machine learning to solve the things that are uncertain, you have this beautiful set of technologies that interplay. So I think we're going to start seeing a lot more innovation in this space.
Speaker 2:I think it just takes time for adoption, until you have these end-to-end workflows. And it's expensive to do end-to-end workflows. One of the things we find is that a lot of people make medical devices on our platform because they couldn't otherwise. There are companies that have to carry an R&D team for 10 years, and then they have to go through the FDA process, and they've burned $50 to $100 million by the end of that, and they go bankrupt. On our platform, we find that we shave almost a decade off of that time, and they save something like 98% of the cost of getting to market.
Speaker 2:And so I think that's where the power is: these end-to-end use cases. The better we make the platform, and the more capabilities we give people, the more we make the Structure Sensor like a tricorder, the more adoption we'll see, just because there are more possibilities and more workflows. I think you're seeing a lot of it in the 3D printing space because you generate this tactile object, right? You have a problem, and you're like, oh, my door broke and I need to fix something in it, or there's a unique part I need, or I'm prototyping something, or I'm creating a custom therapy. There's a direct tactility to the thing that you generate. On the capture side, I think it's a lot harder to wrap your head around, because without that other side of the workflow...
Speaker 1:It doesn't mean anything, it's just a mesh. Right. One thing that I'd love for you to expand on, too: I know you said that your technology goes a lot into the prosthetic and orthotic field, or industry, just from the outside looking in. And I always love to ask this question, because I've kind of grown up in the field. I started when I was 15, and I knew this is what I wanted to do since I was in second grade, so it's been a long time, right? And sometimes you get blinders on. So, you looking into the field, and don't worry about hurting my feelings, okay? What do you see? I think we've talked before: we have a passionate group of people trying to change people's lives, but we are also stubborn and don't like to learn. So, what do you see?
Speaker 2:And where do we go, right? I think it's challenging. I think the industry is really in this time of flux. There are a lot of providers, and, you know, I always look at things like I would at Google. I like to look at the trends and who we're selling to and those kinds of things, because I'm a robot, and that's the best way for me to try to understand how humans actually use products. But the demographic is aging, and so there's almost a generational shift in the industry, which is really interesting, because in the conversations that I've had with clinicians and the time I've spent, I see a lot of people who really are truly artisans. They've mastered their craft and their way of not just understanding their patients but treating them, and they all have their tricks and their things that go into making a beautiful orthotic or a beautiful prosthetic. And I think the biggest thing that I see, at least for the industry, is that there's almost a tacit admission that the more we can share, and the more we can understand the best way to treat people and make it consistent and make higher quality orthotics, the better. I talked to some clinicians who are like: I actually don't even take just one scan anymore, I need to take multiple so I can understand how the foot changes when loaded, so that I can make the right thing. Those types of things, I think, are starting to enter the common vernacular. From the outside looking in, I'm actually really excited, honestly. I don't have any shade to throw. People get set in their ways, myself included, but that doesn't mean that there's not a lot of innovation happening in the industry, and there aren't a lot of people pushing to make it better.
Speaker 2:One of the things that has been a challenge for us is that, because we don't make the apps themselves, we don't necessarily always have a direct tie to the clinicians, and so we've tried to get that. And I found that because the industry was probably a little more distributed, a lot of little satellite private practices for a very long time, it takes a long time to get people to share. I've heard all sorts of crazy rumors. People tell me, didn't you guys go bankrupt? And I'm like, what? I've heard all sorts of weird stuff.
Speaker 2:The best thing that I have started to see over the last few years is just a lot more open communication, both between clinicians and between people in the industry. Having this idea that everything's closed off and everybody has their own way of doing things is fine, but the more that we all can learn, the better the treatments are at the end of the day. So I don't know, I like the industry a lot. It's fascinating. It's like nothing I've ever seen in my life, and I think that's how it is for every group of clinicians that you talk to. They have their own discipline, and it's a very different thing. And as somebody who goes to the doctor, I'm like, oh, I go to the doctor that specializes in this. Fifteen years ago I had no idea that there's this much nuance. I was just like, I guess they're all doctors, they just sort of specialize. But no, the culture and the dialogue and everything is different, and I love it. So yeah, sorry, I didn't have anything really critical to say.
Speaker 1:No, I mean, that's good, I think. The other thing that I'd love, and for our listeners I think is important, is: how exactly does your sensor work? You know, you hear lasers, structured light, mapping, all this stuff. So for our listeners, let's get into it.
Speaker 2:It's really simple, yeah. So actually we have a stereo pair of IR cameras, and we have an IR projector that sends out a pattern, and one of the reasons we make our own hardware is we can do all sorts of weird stuff with it. But basically, from the stereo pair we get what's called disparity. There's a depth-matching engine on the silicon that we have, so it can match the depth really fast, and we get the depth map from that. And one of the reasons we build our own hardware, and our own firmware and SDK, and the calibration process that secretly sits in between, is that it's actually not just the sensor that produces this. We know that our computer vision SLAM is really good because we've been doing it forever. We inherited a 10-year-old stack, probably the largest body of research in computer vision outside of Meta and Google in the world. We have a massive amount of technology behind that. Our calibration team is unique; not a lot of people have taken that art and pushed it forward. Like I said, the only companies that really do it are Meta and Google and Apple these days. And so it's really all of those components: it's not just the depth that we're getting, it's not just the data that we get from inertial units, it's not just the way that we calibrate it by itself, it's not just the way that we reconstruct in real time. It's all of those things together, everything from our factory calibration, like I said, to the actual underlying technology of the SDK, that makes the sensor work. And so, like I said, that's one of the reasons we make our own hardware: we can do weird tricks, like taking a really high-resolution image.
We're about to launch a version of our firmware that will enable what we're going to call STL1 mode, just because everyone loves the STL1. Instead of a much higher-resolution, super-precise mode that you can tune to scan within a certain range, it'll scan from 27 centimeters to five meters, a little bit more like the original Structure Sensor, but with a little better accuracy than you'd get out of that device. And so, yeah, I hesitate to say one thing makes our sensor work.
Speaker 2:Yeah, there's a stereo pair of IR cameras, there's a really nice projector, there's a big old honking battery. There's a lot of logic that we put on chip; with the silicon that we use, we can load in our own machine learning models and stuff in there too. So there's a lot that goes into it. But, like I said, it's more than that: the further you get up the stack, the more you see that it really is the whole stack. Sorry, it's so dorky.
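Ravi's description of the stereo pair boils down to the classic disparity-to-depth relation. Here is a toy sketch of that geometry; the focal length, baseline, and disparity numbers are invented for illustration and are not Structure Sensor specs.

```python
# Toy illustration of stereo depth from disparity (not Structure SDK code).
# A feature seen by both IR cameras shifts horizontally by the "disparity";
# closer objects shift more, and depth falls out of similar triangles.

def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Classic pinhole stereo relation: z = f * b / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Hypothetical numbers: 500 px focal length, 8 cm baseline, 80 px disparity.
print(depth_from_disparity(500.0, 0.08, 80.0))  # 0.5 (meters)
```

The depth-matching engine Ravi mentions does this matching per pixel in silicon; the sketch only shows the underlying triangulation.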
Speaker 3:No problem, we do dorky quite well here, I know. So if I'm an individual practitioner, do you have any 3D scanning tips? How should I get started? Why should I get started? When should I get started? That kind of thing.
Speaker 2:Yeah, no, I think that a 3D workflow, just like anything else, takes time to adjust to, no matter what you have in front of you. And so one of the biggest challenges we've had in the past is people jump headfirst into 3D and then come back and go, whoa, whoa, whoa, it didn't do exactly the thing that I wanted.
Speaker 2:So I think it really is about finding the right setup. If you're a practitioner: do you have somebody fabricating these things for you already? Make sure you have the right applications, and then take time to make sure you have it set up right and get going. Whether you're using just your phone or a dedicated iPad with a Structure Sensor, if you're going to use these things in a clinical setting, it's really about wrapping your head around what you're trying to accomplish and making sure you have the right tools for it. At least for me, if you're using a Structure Sensor, I like to have an iPad with USB-C; Apple doesn't even make any new ones without it. I like it because the iPad charges the sensor as well, so you only have one thing to charge. And I like to make sure that I get a good registration (I hate that we call it "calibrator") right out of the gate.
Speaker 2:With the new sensor, one of the challenges is that no matter how many videos and tutorials we put out, people try to calibrate it just like the old sensor, but it actually works differently. We shifted to 940 nanometers so we could reject more sunlight and you could work in more environments, like dark environments or outside. But that means that if you want to run that calibrator process, you probably need a little bit of sunlight or a halogen lamp, which breaks people's brains a little bit. But we're getting people there. We're putting out more videos, we're doing more on-site trainings, and we're trying to get people over that hump. And when you're scanning a patient, just make sure you have a really good scan before it actually goes out.
Speaker 2:A lot of new practitioners sometimes take a scan and, when they're transitioning to technology, haven't yet gotten to the point of recognizing whether the 3D capture was good or not, because they're so used to a foam box or something else. So: making sure you have a really good scan. We make some tools for that, actually. We started with the AFO process, for the full foot and ankle: we have what are called scan quality indicators, so the SDK will now tell you if the mesh is good before you finish it and send it off. We're expanding that so the developers on our platform can use it for other things: planar surfaces, elbows, the head. And what we do is use our landmark detection, along with understanding the mesh, to make sure you've gotten a really good scan. So yeah, we're trying to automate that process. We're not all the way there yet for every body part, but we'll get there. In the interim, just making sure you have a good scan before it goes off is really important as well.
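Structure's actual scan quality indicators aren't public, but one common heuristic for "is this mesh good?" is hole detection: counting boundary edges, i.e. edges used by only one triangle. A watertight scan has none; a scan with gaps has many. A minimal, hypothetical sketch of that idea:

```python
# Hypothetical sketch of one mesh-quality heuristic (hole detection).
# An edge shared by two triangles is interior; an edge used by only one
# triangle lies on the boundary of a hole or an open rim.
from collections import Counter

def boundary_edge_count(faces):
    """Return the number of edges that belong to exactly one triangle."""
    edges = Counter()
    for a, b, c in faces:
        for u, v in ((a, b), (b, c), (c, a)):
            # Sort so (u, v) and (v, u) count as the same edge.
            edges[tuple(sorted((u, v)))] += 1
    return sum(1 for n in edges.values() if n == 1)

# A single triangle: all 3 edges are boundary edges (a wide-open mesh).
print(boundary_edge_count([(0, 1, 2)]))  # 3

# Two triangles sharing an edge: 4 boundary edges, 1 interior edge.
print(boundary_edge_count([(0, 1, 2), (1, 2, 3)]))  # 4
```

A real indicator like the one described in the episode would combine checks like this with landmark detection to confirm the right anatomy was captured.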
Speaker 2:You know, I hesitate to be really prescriptive. Everybody has their own workflow, everybody has their own process, and I'm sure some of our app partners are sitting there banging their heads against the wall saying, oh God, you didn't tell them to do X, Y or Z. I really do think it depends on your workflow, but we're always happy to help. That's another thing I think has changed since we spun out: we try to be really accessible to our customers. So if you ever need help or you need advice, we're available, whether it's our partner team or our support team, and if people are having challenges, we're always happy to hop on a video call and walk through things with them.
Speaker 3:Cool. And let's say I'm an app maker, or some kind of person who wants to develop a tool with you guys. I guess I call you guys or something like that? Is there a way to get to see some documentation, the SDK, stuff like that?
Speaker 2:You just email us at partners@structure.io. I know there's a phone number on our website; you can call there and leave a message and we'll get back to you. One of the things that we're really hoping to do is a little bit more handholding of people through the process.
Speaker 2:You know, I think traditionally we had a lot of people come to us and make apps, which meant that there was a lot of variety in those experiences because we didn't have a strong enough opinion on how you should scan or how to make those experiences.
Speaker 2:So one of the things we've been trying to do a little bit more is helping people through that, whether it's, hey, I have a really old app and I want to support the new sensor. Secretly, a lot of the time we just do it: we say, hey, can you give us your scanning code, and we'll just update it for you, and we sneak in some tweaks here and there to make it better. So if you just reach out to us, whether it's on our website via the developer portal, or by emailing us at partners@structure.io, or by calling the number and leaving a message, we can help you through that process: getting an app up, getting it up to speed, updating old apps, making it as good as we can, talking through your use case and figuring out how to get the best results. All of those things we're pretty open to, and our partner engineering team is growing every year, so we're hopefully staffing to help everybody out as those inquiries come in.
Speaker 3:And then the other thing is: okay, so I've used your app with an iPad, and I've also been scanned by it. Are you going to come up with, like, a handle to put in the middle of it? Because, I don't know, I'm a newbie, I've only done it two or three times, and I just thought it was a bit weird. Is it me? Is it me?
Speaker 2:Yeah. So, hilariously, we have partners who have always used phones, because they make their own cases. So actually we have two answers to that. One, we have started working, from a mechanical perspective, on the right cases for phones, because a handheld experience is actually a lot easier that way. We're working on that right now; we have some prototypes, and we want them to be good before we go out there. Alternatively, and I think a lot of people don't know this, if you make a custom bracket or case or something with a handle, you can use the Structure Calibrator app to enter your own dimensions, and we'll help you with that. If you send us the CAD, we have team members who'll say, oh, enter this for XYZ dimensions, so you can get started and use our Calibrator tools and not be stuck with ours.
Speaker 2:Some people use their own workflow, but we do recognize that some people like that more tactile, handheld experience. We've been stubborn about it in the past, but I think we're moving towards supporting it, and I think it's a little bit easier with iPhones to get that sort of grippy handheld experience. We're also working on more ruggedized cases, rather than brackets, for iPads. We probably won't be able to support every iPad, because it's just a lot of SKUs, but I think we'll support a few, and that'll give you a little bit more of a grippy experience because it'll be in a case.
Speaker 3:And the other thing I've always wondered about: I've been sitting there trying to scan my foot with the phone, using your stuff and also other things, and I'm like, couldn't we put the phone on a gimbal or some kind of thing that just moves around my foot? You know, the thing just says, hey, hold your foot still, press the button, and then it just goes around or something.
Speaker 3:We should do that. We should! Okay, but is there no one asking for that? Because that would just save them a lot of time.
Speaker 2:I agree, and actually I'm surprised. In a lot of other use cases we've had that. We've had partners that do fitness towers that rotate around you. We have a partner in Japan that does baggage scanning with our cross-platform, Linux-based sensor from back in the day, where the sensor rotates around the bag. I've always been curious why that hasn't happened here. And honestly, part of it is that mechanical devices like that are really expensive.
Speaker 2:Trying to generate something that would capture like that, I think, is a really good idea. It'd be really easy to stick someone's, whatever it is, in the case of the foot, in there, get it to subtalar joint neutral, and then have this thing rotate. I think that would be absolutely beautiful, especially because then you can maintain consistency. One of the things that sucks about these devices is that once you get outside of 30 centimeters, they suck: the depth goes to hell. We have thousands of trials on a robot arm against a variety of objects that show the further away you get, the worse it gets. So maintaining that consistency, whether it's this or with a sensor, would be fantastic. But no, we should collaborate on that. Let's make one.
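Ravi's observation that depth "goes to hell" with distance is what stereo geometry predicts: for a fixed disparity-matching error, depth uncertainty grows roughly with the square of the distance. A back-of-the-envelope sketch, with illustrative numbers only (not measured Structure Sensor specs):

```python
# Rough stereo depth-uncertainty model: dz ~ z^2 * dd / (f * b).
# focal_px, baseline_m, and disparity_err_px below are assumptions
# chosen for illustration, not real device parameters.

def depth_error_m(z_m: float, focal_px: float, baseline_m: float,
                  disparity_err_px: float = 0.25) -> float:
    """Approximate depth uncertainty at range z for a stereo pair."""
    return (z_m ** 2) * disparity_err_px / (focal_px * baseline_m)

# Error grows quadratically: 10x the range means ~100x the depth error.
for z in (0.3, 1.0, 3.0):
    print(f"{z:.1f} m -> about ±{depth_error_m(z, 500.0, 0.08):.4f} m")
```

This quadratic growth is why a rig that keeps the sensor at a fixed, close standoff, like the rotating gimbal idea above, would also keep accuracy consistent.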
Speaker 1:I love it.
Speaker 1:The PK5000. There we go. So, Ravi, I do have a question for you, though. A lot gets made, and you kind of alluded to it, of how fragmented orthotics and prosthetics is. Yoris says this all the time: you guys are big, small, all over the place, don't talk to each other, all this stuff. With the digital stuff coming, I think one of the things that our listeners are just now starting to understand is that there is data input, right.
Speaker 1:The scan is a data endpoint, it's zeros and ones, it's input. I think one of the powers of having data input is that you can collect a lot of data and then build off of that. One of the points of resistance that we see, and I hear this quite a bit, especially with the smaller companies that are very good at one specific thing: they really don't want that data out. They want that data to be their own, and I just feel like there's this tug of war on data. What is your feeling on some of that, and how can we make heads or tails of it?
Speaker 2:Yeah, I mean, I'll be honest, you're never going to get everybody to share everything, and nor should they. People should feel like they have control over their workflow, their patients, their own data and their process. That being said, at least for us, we've gone out of our way to take a HIPAA-first approach to the collection of data in our systems. Confidentiality and those things are important. We see a lot of people who are like, oh, we've made a Dropbox connector, and I'm like, how do I know that once it gets to this Dropbox that I paid for, it's HIPAA-compliant, that you've gone through a SOC 2 audit or anything? So I get the hesitancy. Very transparently, I'll be super honest: I'm terrified about the security practices of the industry more than anything, having lived in a really high-security environment and actually trying to exist in one today. You know, our head of infrastructure used to build HIPAA infrastructure and audit it. He did the same thing with banks: he built a lot of that infrastructure and audited it. And so when he sees what the benchmarking looks like, he's terrified.
Speaker 2:I'd say that, honestly, I get it. I get that there's a hesitancy to share things. But at least the way that I look at it is, the more data that we have and the more that we understand about the human body, the better information and the more insights we can give you out of that, and the better people can make the actual fabricated custom therapy. And whether it's industries like orthotics and prosthetics or wound care, the state of the art doesn't move forward if people don't share. That's really the crux of it: if you want the best possible experience for your patients and you want the best outcomes, the best way to do it is to collaborate and to be a little bit more open. And, like I said, I think it's a personal thing. That's my philosophy. I know not everybody shares it, but a lot of our team does have backgrounds in academia.
Speaker 2:You know, we have PhDs and folks like that on staff. We secretly contribute to open source and other things like that. That openness is really the thing that pushes the boundaries of what we can accomplish, and so I encourage people to share as much as they're comfortable with. Being married to little bits of data isn't a mindset I take, because when your data goes somewhere else, it's very rare that somebody is going through it to deconstruct your process. That level of attention doesn't exist. But the more that you can share, I think, the better the industry is going to get.
Speaker 1:You know it's interesting that you say that, but I've heard some stories from even back in the day, early digital stories. The story would be like, hey, if this person ever sends in a socket, we want to see it. You know what I'm saying.
Speaker 2:So there's a hesitancy.
Speaker 1:They're like I want to know what they're doing, just to kind of get a little bit of idea. Even though we're not supposed to look, we're looking.
Speaker 2:Well, I guess my answer to that is: be very cautious who you work with. Just make sure you're not working with bad actors. Yes, definitely, there is a lot of that that happens across the world, no matter where you are. So just be cautious about who your partners are. Pick people that you like. I do this all the time. My partner Paolo and I, we pick companies to work with that we like.
Speaker 2:You know, I think that's one of the beauties of spinning out on our own: we get to be picky, and we can choose those things. Work with people that you trust. That, I think, is really the crux of it.
Speaker 2:There you go. And then trust that they are going to keep your data safe, and also ask them: what happens to my data when it's with you? Just because it's in some Dropbox somewhere, do you have access to it? What do you do with it? Do you publish a privacy policy? Do we know if you're training your people on it, or training on your data sets? These are questions that I would ask.
Speaker 2:We have to take this stuff really seriously, right? I don't want to end up in a situation where we're audited at Structure and we're not compliant with things like face data in Illinois; it gets really strict in areas like that. So making sure people's privacy is really protected, making sure your data is secure, making sure nobody has inappropriate access to it internally, is a big deal. We audit our systems. We figure out who can access what, and it's all logged on purpose, because at the end of the day, we want a system that people can trust. And I encourage people to ask the same of whoever they work with across the board, whether it's people who make tools or people who provide the fabricated custom therapy or whatever it is. I would just encourage people to make sure you're working with people that you trust.
Speaker 3:That's awesome and a great sentiment, Ravi, and I love what you guys are doing. I think it's a really, really great tool, kind of like a Lego block for people to build software solutions and a big part of their practice on top of. So I think it's absolutely fantastic what you guys are up to. Thank you so much for being here today.
Speaker 2:I love that you figured that out, Yoris. Actually, I'll tell you, that's something I only sort of say with the people who are on our board: we try to make a Lego platform. You take the pieces you need. So I love that you came to that yourself.
Speaker 3:Yeah, but it's also, I think, the way so many people are doing it. We have this conversation every time. Some people want the one-click workflow, and if you do exactly the right thing for exactly the right people, it may work, but it's going to be super limiting for practitioners. And there's also the way you set up your workflow internally in your company, or the way you design, or where it goes, all that kind of stuff. So I think that kind of interchangeability is a key thing that could make it very, very different from just saying, let's capture all the value and make one solution for $19 a month, that kind of stuff.
Speaker 2:So I'm going to tell you, I hate quoting Bill Gates.
Speaker 1:Bill Gates! This is like a "please don't throw me off, don't just mute me and thank me for doing this" moment. Where's that button? Where's that button, Yoris?
Speaker 2:But in the 90s he said something that's always stuck with me, which is: if you're building an ecosystem or a platform and you're sucking all the air out of the room, it's not an ecosystem or a platform.
Speaker 2:If you build something and you take just a little bit, a couple of points, and everything else is value for everyone else, then you will build something healthy and beautiful. And I think that's really true. That's always been the case in my career: any team I've worked on that was like, oh, let's take over and make all the profits, it failed miserably, and I didn't want to be a part of it even when we were invested.
Speaker 2:Conversely, as we build what we're building, it really is about building value for the clinician, the patient, and whoever's building or analyzing the medical device itself. There needs to be room for all of those players. We also have to pay our engineers, so we'll charge money for it. But our philosophy is: if we can drive a lot of value for a lot of people, we'll be successful. If we drive a lot of value for a few people, we're not really that successful. And we're doubly unsuccessful if we hold their feet over the fire and try to charge huge amounts of money and really kill them. Yeah, I think that's really it.
Speaker 3:I think that's the only way to build a useful platform that's going to do something good for the world. Now, in additive, we saw so many people fall into probably the same trap simultaneously, where they're trying to build a closed garden, a walled garden, or actually one of those terrariums. You know, terrariums are all the rage now: they have a glass pot and it's closed, and they're like the frog in the middle of the terrarium, and everybody else is moss, and the frog is going to poo all over you and eat you, and that's it, right? And that's an ecosystem, right? But it's all about the frog, or it's all about the glass thing keeping you inside. It's more about being sticky, more about capturing value for themselves, and it just never works. Or they're just not big enough, right? That's the other thing: there's just not enough volume.
Speaker 2:Yeah, I mean, I hate to say it, but it takes a lot of engineers to build even what we have, and we know we're just a fraction of what's needed for a workflow. You know, we have 30 engineers or more staffed even for the little bit of it that we do, and so I can't even wrap my head around doing the whole thing. I know that a large player entered the ecosystem, and I won't name names, and they invested $20 million, and it just exploded on them because it was so hard to do. And that just goes to show you that when you go in and you try to own the entire thing end-to-end, one, nobody wants to work with you, and two, it requires a massive amount of resources, and it also requires an expertise that is really hard to build. There's no organizational focus. So, yeah, I don't know why people try to own things end-to-end or suck the air out of the room. Good luck to them. That's never been successful for me, but maybe somebody else can do it.
Speaker 3:All right. Well, thank you so much for everything, Ravi. It was wonderful having you on the Prosthetics and Orthotics podcast and yeah, Brent, you enjoyed this as well. I know you did.
Speaker 1:Oh, this is good and thanks for getting into the weeds and thank you for being an advocate for our field and bringing awareness to it. I think it's super important.
Speaker 2:Look, anybody who has to consistently deal with the way the world is shifting and treat patients in a world where you have to navigate insurance codes, where everybody makes everything harder for you, and where there's uncertainty around what will happen with people's Medicare reimbursement: if you contend with all of that and you still get up every day and treat patients, I have infinite respect for you, because you're doing something fundamentally good. And if we can do anything from a technology perspective to make your life better and the outcomes for your patients better, we're going to do it. So thank you so much for having me. I feel really blessed that you guys invited me. I'm really grateful, thank you.
Speaker 3:Thank you very much as well, and thank you very much for listening. Have a great day.