
    Estimating Direction of Arrival with MATLAB

    From the series: Perception

    Stephen Cronin, Robotics Association at Embry-Riddle

    Stephen Cronin from the Robotics Association at Embry-Riddle Aeronautical University (ERAU) joins Connell D'Souza from MathWorks to talk about using MATLAB® to calculate the direction of arrival of an acoustic signal propagating underwater. 

    RoboNation's competition tasks are designed to replicate real-world challenges. For example, a common task shared by the marine student competitions, namely RoboBoat, RoboSub, and RobotX, is to detect the direction of arrival of an acoustic signal. This task is designed to replicate search and rescue operations that recover flight data recorders lost in bodies of water.

    To calculate the direction of arrival of an acoustic signal, Stephen used Data Acquisition Toolbox™ to sample data from a supported data acquisition device and then filtered the signals to remove noise and reflections. Afterwards, a direction of arrival algorithm was used to obtain the bearing angle. Files used in this video can be found in this MATLAB File Exchange entry.

    A complete list of hardware supported by Data Acquisition Toolbox is available.

    Published: 30 Aug 2017

    Hello, everyone, and welcome to another episode of the MATLAB and Simulink Robotics Arena. In today's episode, we're going to talk about using MATLAB to run a direction of arrival estimation algorithm. And to join me today, I've got Stephen Cronin from the Robotics Association at Embry-Riddle Aeronautical University. Hello, Stephen, and welcome to the MATLAB and Simulink Robotics Arena.

    Hey, Connell. How you doing today?

    Doing pretty well. So, Stephen, do you want to go ahead and tell us why you are the right person to talk to us about direction of arrival estimation, and then introduce the Robotics Association at Embry-Riddle?

    Yeah. Yeah. Certainly. So I've worked on this kind of localization and directional determination problem for a few years now.

    We've applied it to various platforms within our own organization for various different AUVSI competitions, that being RoboBoat, RoboSub, and RobotX. All three of the maritime competitions. And we've seen a lot of success in it over the years in those competitions.

    About the organization overall, we participate in all of the AUVSI competitions. So beyond those three maritime ones, you'd add in SUAS, IARC, and IGVC, as well as one NASA competition. And what we primarily try to do as an organization is teach through these competitions, and as well as doing research on various aspects of robotic systems and unmanned systems.

    Excellent. So you guys seem to be involved with a lot of stuff with robotics. That's good to know. All right, so let's quickly jump into today's agenda. So Stephen's going to go over the problem and define what the scope of the problem is, following which he will talk about the hardware required to tackle this task.

    Finally, he talks about the localization algorithm that he's used, following which he will do a quick software demonstration. And then we talk about some key takeaways and some additional resources for you guys. So I'm going to hand it over to Stephen to take it away for now.

    All right. Thanks, Connell. So just starting with a brief overview about what the problem is, what we're trying to solve.

    In underwater applications, sound is really one of the best mediums to communicate any information. Any of your other typical means of communication that you would use break down underwater. So you don't have GPS. You don't have any wireless, any radio frequencies. So sound is really great.

    So for the AUVSI competitions, what you're tasked with is finding the location of a sound source underwater, and that would be between 25 kHz and 40 kHz for AUVSI. But this could really be anything. And what we're going to talk about today is just finding the direction that that sound source came from. So it's a 2D solution to the problem. You'll get an angle to the sound source relative to the array you're using to look for that sound source.

    Getting right into it, what do you need to actually address this problem? Well, what you need is a hydrophone. A hydrophone is like an underwater microphone, a device to hear your sound. You need a means to take this sound data you're hearing and actually turn it into something usable. There's a couple different ways you can go about this.

    You can use what's known as a data acquisition device, a DAQ, or an FPGA, to just sample very quickly. And then another means that's not necessarily for sampling overall, but to aid in sampling, would be a circuit to kind of help shift the amplitude of your input, of what you're hearing, into something more usable for the device you're actually listening with. And then finally, you're obviously going to need a sound source, something to produce that frequency.

    Because, as you're going to see, and as I'm going to iterate talking through this, practicing this task is really crucial. Practicing this problem and just testing and testing is going to give you the best success. So what you're seeing over there in Figure 1 is the array we used for our RobotX platform. As you can see there, there are obviously four sensors, not just the three I told you you needed.

    What we do with the fourth one is we provide a cross-check to help validate our answers. This is unnecessary, but useful in terms of the overall reliability of your solution. So you saw the array we just had. But how did we go about designing that? How did we go about sizing that?

    What you really need for your array design is knowing what your maximum frequency is. And what this is going to do is kind of define the smallest wavelength that you actually see in the data. And this wavelength is where you're going to target your array spacing.

    So, as you can see in Figure 2, this is just kind of a generic-looking picture of the array. But your spacing between array elements 1 and 3 and elements 1 and 2 is going to be based on the speed of sound divided by two times that maximum frequency, which is half of the smallest wavelength. And this will size the sides of your right triangle and allow you to design and build your array.
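    As a rough sketch of that sizing in MATLAB, assuming a 40 kHz maximum frequency and a nominal 1500 m/s speed of sound in water (both illustrative values, not the team's exact numbers):

        c    = 1500;                 % nominal speed of sound in water, m/s
        fMax = 40e3;                 % highest frequency you expect to hear, Hz
        lambdaMin = c / fMax;        % smallest wavelength present in the data
        spacing   = lambdaMin / 2;   % target element spacing: half that wavelength
        fprintf('Target element spacing: %.1f mm\n', spacing * 1e3);

    For these values the spacing comes out to roughly 19 mm; keeping the elements at or below half the smallest wavelength avoids ambiguity in the phase comparison later on.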

    OK, so to actually localize-- we have our array now. We have our sampling all set up. What are we going to actually do?

    Well, we're going to break the problem down into three main steps. And these three steps are going to be to filter the data, to find the front of the waveform, and then to compute the bearing that we're actually interested in. And there is-- we break these up for a very good reason, and I'll talk a bit more about that as we're going through the software demonstration.

    In the end, what we're actually looking for to compute that bearing is the phase of the signals. Now, the phase of the signals directly correlates to what's known as the time difference of arrival of the signals, as the speed of sound in water is relatively constant. And from this, because we have this time difference of arrival and we have a fixed geometry for the array, we can go through and calculate that bearing you see there at the bottom just from the difference of phases of the signals.
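    The exact formula on the slide isn't reproduced here, but a minimal sketch of that phase-to-bearing step, assuming an L-shaped array with two equal, orthogonal baselines and hydrophone 1 at the corner, could look like this (saved as bearingFromPhase.m):

        function bearingDeg = bearingFromPhase(dPhi12, dPhi13, f)
        % Convert two measured phase differences (radians) into a 2D bearing (degrees).
        % Assumes an L-shaped array with equal, orthogonal baselines: hydrophone 1 at
        % the corner, hydrophones 2 and 3 at the ends of the two legs.
        dt12 = dPhi12 / (2*pi*f);         % phase difference -> time difference of arrival
        dt13 = dPhi13 / (2*pi*f);         % (speed of sound in water is nearly constant)
        bearingDeg = atan2d(dt12, dt13);  % far-field bearing relative to the array
        end

    Because the two baselines are equal, the shared spacing-over-sound-speed factor cancels in the ratio, which is why the spacing itself does not appear in the final bearing.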

    So, Stephen, before we move on to the next slide, I'm looking at the two figures that you have up here. And when you look at the data that you have acquired before processing, you can see that the data is actually pretty noisy. How do you go about dealing with the noise? Do you filter out the noise programmatically, or is that something that you actually have a physical hardware filter for?

    All right. So what we're showing here, and in the demo we'll show, is a pure software solution to the problem. But something you're going to want to consider for your specific application moving forward is that you very well might need a hardware solution. Large sound sources in an environment, like motors on your vehicle, or anything else producing sound, might need to be filtered in hardware, so as to not saturate the sampling of any of those devices, of the FPGAs or of the DAQs. So today, we're going to show a pure software filter, as the noise in the environment wasn't too loud, and to show something that's a bit more approachable for anyone just starting to work on this problem.

    That's good to know. Because, at least at the competitions that I've been to, I've seen a lot of teams struggling with dealing with the noise that actually comes out of their motors. I'm sure that probably creates a big problem. And so it's good to know that you might actually need a hardware filter solution to deal with the noise.

    Yeah. It's definitely something that every team is going to need to consider. They're going to need to get the array onto their platform and figure it out.

    When we were out at RobotX last time, what we found we had to do to deal with the problem was, when we were doing that task, we just had to shut off our motors and drift in the water. And obviously, we don't want to do that moving forward. So a hardware filter would be the perfect solution for us.

    OK. That's good to know.

    So we're going to move over to MATLAB now and take a look at an actual implementation of the software to solve this problem and to find a bearing. So now we're over in MATLAB, and we're going to kind of go through it piece-by-piece and show you actual code to get these signals and to process the data and find that bearing. What we have here is the whole implementation, and if you were to run this, what you'd see, looking back to what we had discussed, is that kind of three-step process of filter, find the front of the waveform, and then determine the bearing.

    So what we start off with here first is kind of your initial data read in your top chart here. And what we see is very noisy data. But we can still see that hard front of the waveform where the sound wave initially hits, and that's where the data we're actually interested in is. That's because the data after that point is most likely the result of a reflection of something in the environment, whether it be the surface of the water, the vehicle, anything else. And we can consider that data to be corrupted.

    Whenever you have a reflection, you're going to induce a phase shift into your data, and then that entire equation we have based on phase shifts breaks down, and you're going to get inaccurate bearings. So finding the front of the waveform is the meat of this problem and what all the filtering really is done for. So to find that front of the waveform, first step obviously is filter. We've got to get rid of the noise in this.

    And what we're going to do is take it through a bandpass, and that's what is going to leave you with this middle plot here. And to do that, what we do in MATLAB is we design a filter. The sound sample we have here, which we've loaded in, is a test sample that we took in our university pool. It's a 40 kHz sound sample.

    And we're just going to start by designing a filter to kind of work around that and filter that data. So we pick a small band above and below that 40 kHz target. What we're working with right now is a fourth-order bandpass filter. This is a pretty low one. It's successful in our testing.

    You might discover that, for a more robust solution, you would want a higher-order bandpass filter. That's just going to be a result of your testing and what you find is best. And then we're just going to take that data and evaluate the filter on each of our channels.

    So in here, we have all four channels we're sampling on, all four of those hydrophones in the array. We're actually only going to use the first three for the math, though. So the fourth channel is just kind of along for the ride right now.
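    A minimal sketch of that filtering step might look like the following; the synthetic four-channel data, the 2 kHz guard band, and the use of designfilt from Signal Processing Toolbox are illustrative assumptions, not the exact code from the demo:

        fs = 500e3;                                 % sample rate, samples/s
        fc = 40e3;                                  % target ping frequency, Hz
        t  = (0:4999).'/fs;                         % 10 ms of data
        ping = sin(2*pi*fc*t) .* (t > 4e-3);        % tone that "arrives" 4 ms in
        samples = 0.05*randn(5000, 4) + ping;       % stand-in for the logged 4-channel data

        band = [fc - 2e3, fc + 2e3];                % small band above and below the target
        bpFilt = designfilt('bandpassiir', ...
            'FilterOrder', 4, ...                   % fourth-order bandpass, as in the demo
            'HalfPowerFrequency1', band(1), ...
            'HalfPowerFrequency2', band(2), ...
            'SampleRate', fs);

        filtered = zeros(size(samples));
        for ch = 1:size(samples, 2)                 % evaluate the filter on each channel
            filtered(:, ch) = filter(bpFilt, samples(:, ch));
        end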

    We're going to obviously graph this all. We're going to graph our original data, and then we're going to graph our new data. And then what do we need to do? Well, we need to find that front of the waveform where that data is uncorrupted and nice and clean.

    So, moving back over to the graphs, we can show you what that data looks like once you've found it in that bottom plot. And what we're doing to find this data is, right now, just a simple thresholding technique, looking as you move across the data in the plot above. So you have really zero noise leading up until that front of the waveform. So what you can do is simply look for a starting increase in slope above a certain value, and you can determine you've hit the front of the waveform there.

    There are more robust methods to kind of approach this problem. But as a first pass for anyone, any teams just looking to get in and get working on this task, get testing with it, this is a great place to start. It's going to give you pretty good results once you've kind of dialed-in what amplitude you're actually hearing these sound waves at.
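    Continuing from the filtering sketch above, a simple amplitude threshold can stand in for the slope check described here; the threshold and window length are values you would tune from your own testing:

        threshold = 0.5;                          % amplitude threshold; tune from pool tests
        winLen    = 200;                          % samples kept after the trigger
        ref       = filtered(:, 1);               % channel used to detect the front

        idx = find(abs(ref) > threshold, 1);      % first sample above the threshold
        if ~isempty(idx) && idx + winLen - 1 <= numel(ref)
            front = filtered(idx:idx + winLen - 1, 1:3);   % clean window, first 3 channels
        else
            front = [];                           % no ping detected in this capture
        end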

    So we've found our-- found the waveform. We've triggered it. We can tell that it's clean data.

    If you look across this bottom plot, what you can see is, looking at each of the channels, you don't see a phase shift among them. As you move through time along the bottom axis, you have no phase shifts. They're always in phase with each other, which means that when we go out and do our phase calculations, we're going to get pretty accurate results.

    So to do that, we're going to move back over to MATLAB, and we are going to open a sub-function we have here. So this is a pretty simple function. What we do here is we take in, as our inputs, the three shortened little pieces of signal at the front of the waveform. For this example, we use 200 samples, which corresponds to somewhere between 12 and 20 full cycles of the signal, depending on what frequency we're using in the AUVSI range. And our fourth input is the frequency that we're currently operating at.

    And to then find the phase, we're just going to do some simple math. This is kind of known math for finding the phase of a signal; it's based on some signal processing equations. And then, in the final step there, you have the phase. We've converted it to time, which is directly correlated to phase, so those ratios are equal.

    And we evaluate it. We evaluate that equation that you saw in the slides and we return that bearing out of the function. For the purposes of the demonstration, what we do here is convert it to degrees, so it's just nice to see, and format it for printing.
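    A hedged sketch of what such a sub-function might look like (not the team's exact code) is below; it reuses the bearingFromPhase helper sketched earlier and estimates each channel's phase by projecting the 200-sample window onto a reference tone at the ping frequency. The sample rate fs is an extra input here for clarity:

        function bearingDeg = estimateBearing(x1, x2, x3, f, fs)
        % x1..x3 are short windows (e.g. 200 samples) cut from the front of the
        % filtered waveform on three hydrophones; f is the ping frequency in Hz.
        n   = (0:numel(x1)-1).' / fs;            % time vector for the window
        ref = exp(-1j*2*pi*f*n);                 % reference tone at the ping frequency

        phi1 = angle(sum(x1(:) .* ref));         % phase of each channel measured
        phi2 = angle(sum(x2(:) .* ref));         % against the same reference tone
        phi3 = angle(sum(x3(:) .* ref));

        dPhi12 = angle(exp(1j*(phi2 - phi1)));   % wrap the differences to [-pi, pi]
        dPhi13 = angle(exp(1j*(phi3 - phi1)));

        % Convert phase differences to a bearing using the geometry sketched earlier.
        bearingDeg = bearingFromPhase(dPhi12, dPhi13, f);
        end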

    So, as you can see here, that angle you got out in degrees is about 18 degrees. We can run through this. 18 degrees was the correct answer. I can confirm that. We did a lot of testing to validate that.

    Yeah. Just over 18 degrees. And for you guys who are-- who might be wondering, well, how am I going to really accurately get these measurements underwater? How am I going to get these angle measurements just to validate with?

    We just did a very simple solution to that. The bottom of our pool had some blue tile lines. We aligned the array with one, the sound source with another, and we quickly took some measurements and were easily able to confirm the angle. So simple things like that are going to save you a lot of headaches, rather than trying to go out in the water and, with some tape measures underwater and whatnot, measure it all.

    Stephen, before we move on, I just want to go up to the data file that you've loaded in. I remember you telling me that you actually used the Data Acquisition Toolbox to sample this data. Do you want to talk a little bit about that?

    Yeah. No, definitely. The Data Acquisition Toolbox made sampling all that data incredibly easy for us. It had support for the DAQ we were using, so it was pretty much plug-and-play.

    They had functions that immediately found the DAQ for us and easily let us set up sampling at the full data rate. And it would log it to a file nicely. That file and this code will be included for you guys.

    Just something for you guys to know moving forward: you're going to want to sample at a pretty good data rate. We sample at 500,000 samples a second. You're going to have some big files, but MATLAB can definitely handle that.
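    For reference, a minimal logging sketch with the Data Acquisition Toolbox (the R2020a-or-later "daq" interface) might look like the following; the vendor string "ni", the device ID "Dev1", and the channel names are placeholders, not the team's actual hardware:

        d = daq("ni");                               % connect to a supported DAQ vendor
        addinput(d, "Dev1", "ai0", "Voltage");       % one analog input per hydrophone
        addinput(d, "Dev1", "ai1", "Voltage");
        addinput(d, "Dev1", "ai2", "Voltage");
        addinput(d, "Dev1", "ai3", "Voltage");
        d.Rate = 500e3;                              % 500,000 samples per second

        data = read(d, seconds(2), "OutputFormat", "Matrix");   % grab a 2 s capture
        save("hydrophoneCapture.mat", "data");       % log it to a file for offline work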

    Perfect. So if you guys are looking for more information on the Data Acquisition Toolbox and a list of supported hardware, you can look in the Resources section, where we've linked you to the documentation. And it's actually pretty simple to get a data acquisition device started up with MATLAB. I'm going to let you guys go and check that out. Now let's jump back to the presentation for a few key takeaways and some additional resources.

    Yeah, definitely. So some key takeaways for you guys, just some information we've learned over a period of time working on this problem. One of the biggest things you're going to want to remember is that minimization of noise is kind of the biggest thing that you need to do to be successful with this task. So this could be from mitigation of reflections. And depending on your competition, this is going to happen a bunch of different ways.

    For something like RobotX or RoboBoat, what you're going to want to do is get that array below the surface of the water, as deep as you can get it, because the surface of the water is going to produce reflections. For RoboSub, it might just be getting it out in front of your chassis, out in front of your vehicle, because your sound is going to reflect off that vehicle. And the other side of noise is what we talked about earlier: motor noise.

    The best way to kind of work this problem is to, one, figure out if it is even a problem for you. So I would go out and just get your vehicle in the water and get your array in the water and drive around. See if you can hear it. See if it's going to be too loud.

    If it's too loud, what you'll see on your DAQ returns is that your entire input voltage is saturated. And what that means is that you're starting to clip your data. And that's going to clip your higher-frequency data, which is typically the sound data you're actually interested in, versus your motor noise, and make it unusable.

    So if you start to see this, then you need to start looking at solutions. Hardware filters, shutting off your motors. Different things like that.
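    As a rough way to spot that saturation in software, you could check what fraction of a raw capture, like the "data" matrix from the logging sketch earlier, sits at the converter's full-scale limit. The ±10 V range below is an assumed value; use your own DAQ's input range:

        fullScale   = 10;                                   % volts, assumed input range
        clippedFrac = mean(abs(data) > 0.98 * fullScale);   % fraction of clipped samples per channel
        if any(clippedFrac > 0.01)                          % more than 1% clipped on any channel
            warning("Input looks saturated; consider a hardware filter or cutting the motors.");
        end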

    So beyond minimizing just noise, obviously, testing is crucial. You're going to want to go get out there, start putting your sound source in various configurations. As many as you can. Make it hard on yourself. Test the full range of frequencies.

    Your implementation might end up doing well at one frequency and not in another, or you might run into some interesting reflections that are unique to your vehicle at different frequencies. So you're going to want to characterize it all and figure it out. And then, finally, what we showed here today was not the only approach to the problem. It's by no means the best approach to the problem.

    What we were just showing here today is a good way to get started, a good way to get in the water and start finding the sound sources, and then to give you a place to iterate from. But, yeah, have fun with it. It's a great problem.

    All right. Perfect. Well, thank you so much, Stephen, for that insight into addressing the hydrophone task at the maritime RoboNation competitions.

    As we've mentioned before, all the files we've used to make this video will be up on File Exchange, and you can download them to get started. We're always interested in hearing what you guys think about the resources that we put out, so you can contact us either via email at roboticsarena@mathworks.com or on Facebook, where you can join the Robotics Arena Facebook group. The links for that are up on the screen.

    Please don't forget about our complimentary software offer. We do offer free software, free MATLAB and Simulink, with 65-odd products included. That's available to all teams taking part in RoboNation competitions. Thank you for tuning in, and we hope to see you again on the Robotics Arena.