(19) European Patent Office
(11) EP 3 514 478 A1

(12) EUROPEAN PATENT APPLICATION

(43) Date of publication:
24.07.2019 Bulletin 2019/30

(21) Application number: 18213746.3

(22) Date of filing: 18.12.2018
(51) International Patent Classification (IPC): 
F41J 5/06(2006.01)
G01S 5/22(2006.01)
G01S 3/808(2006.01)
(84) Designated Contracting States:
AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
Designated Extension States:
BA ME
Designated Validation States:
KH MA MD TN

(30) Priority: 26.12.2017 TR 201721987

(71) Applicant: Aselsan Elektronik Sanayi ve Ticaret Anonim Sirketi
06370 Ankara (TR)

(72) Inventors:
  • BEREKETLI, Alper
    Ankara (TR)
  • AVSAR, Sinan
    Ankara (TR)
  • DOGAN, Emir
    Ankara (TR)
  • DÖNÜS, Mehmet
    Ankara (TR)

(74) Representative: Yamankaradeniz, Kemal 
Destek Patent, Inc. Eclipse Business D Blok No. 5 Maslak
34398 Istanbul (TR)

   


(54) A METHOD FOR ACOUSTIC DETECTION OF SHOOTER LOCATION


(57) This invention is related to a method for acoustic detection of shooter location and it comprises the following steps; reception of acoustic signals by a microphone array; detection of muzzle blast (MB) and shock wave (SW) signals through matched filter and cross correlation processes; transforming the detected MB and SW signals from time domain into frequency domain; beamforming for the signals by means of the Delay and Sum method in frequency domain; estimation of the direction of arrival (DOA) for the MB and SW signals by finding the azimuth and elevation which give the maximum power of the beamforming response; range estimation using the difference between the arrival times of the MB and SW signals together with the DOA estimations.




Description

Technical Field



[0001] The present invention is particularly related to a beamforming-based method for acoustic detection of shooter location. In the method of the present invention, the direction and range of the shooter are estimated by means of signal processing methods based on beamforming using a microphone array.

Background of the Invention



[0002] It is important to determine the location of a shooter for ensuring the security of police stations, vehicles, convoys, borders, troops, society and the public, and/or for sniper detection. The detection of shooter location is generally realized by the detection of acoustic signals, such as the muzzle blast (MB) and the shock wave (SW) associated with the projectile, and by the estimation of the direction and range from the aforementioned signals using various methods. In order to detect the acoustic signals and perform the aforementioned estimations, a single microphone array or distributed microphone arrays are used. Acoustic signals incident on the microphone array are processed using different methods, and estimations are obtained.

[0003] In the publication No. US2434644, which is comprised in the state of the art, direction of arrival (DOA) estimation according to the time differences of arrival (TDOA) of the shock wave is described. In the publication No. US2966657, DOA is estimated by collecting voltage polarizations from directional and non-directional microphones by means of cathode ray tube technology. The publications No. US5241518 and No. US5258962 describe how the direction of a supersonic projectile is determined by means of a plurality of microphones. Similarly, in the publication No. US6669477, a solution method is presented for bullet trajectory estimation according to the TDOA of acoustic signals at six discretely located microphones. In the publication No. US6563763, a solution is proposed to improve the accuracy of bullet direction estimations. In the publications No. US7190633 and No. US8149649, DOA is estimated according to the TDOA of the SW signal. The effects of the microphone location information on DOA estimation are analyzed, and a solution is proposed to reduce estimation errors.

[0004] In the publication No. US5544129, DOA estimation is based on the TDOA of acoustic signals at three microphones. In the publication No. US5781505, DOA estimation and localization are achieved using more than one array of microphones and optical sensors. In the publications No. US5930202 and No. US6178141, the shooting location is estimated by using time of arrival (TOA) information with a plurality of microphone arrays. The publication No. US6965312 explains shooter localization in a wireless communication network of multiple microphone arrays located on helmets of soldiers. Another solution based on the communication network among wearable microphone arrays is described in the publication No. US7266045. In the publications No. US7126877 and No. US7408840, target localization is performed by using the TDOA of acoustic signals with at least five or seven microphones separated by a distance of at least 1 meter. The publications No. US6847587, US7139222, US7433266 and US7586812 can be cited as examples in which DOA estimation is accomplished by using distributed microphone array networks.

[0005] The publication No. US7292501 presents a system of two sensor units that are at least one meter apart and contain three accelerometers, where the TDOA of SW signals is used for target localization. In the publication No. US7495998, acoustic sensor pairs are located at predetermined intervals according to the application. Shooter positioning is achieved by using the frequency spectrum of the incoming acoustic signals, time delays, amplitude differences and periodicity information. The usage of multi-sensor networks is proposed for fixed or mobile platforms and for wearable system applications. The effect of increasing the number of sensors on decreasing the estimation uncertainty is explained.

[0006] In the publication No. US7583808, microphones distributed within the task field are first trained with acoustic patterns arriving from different directions. Subsequently, a decision mechanism for determining the most probable location of the target through cross correlation of acoustic signals is described.

[0007] The sniper detection system proposed in US9488442 uses camera displays and radar systems together with a microphone array in order to locate and track the shooter. Visual, acoustic and thermal information obtained from the shooting location are used for estimation.

[0008] In the publication No. US9651649, DOA estimation is performed by using the TDOA of signals within a distributed network of microphones. A system, where the network is partitioned into clusters of at least four microphones, is described.

[0009] The approach generally used in the state of the art systems and methods is based on the inference of the DOA of MB and SW signals from their TDOA at array microphones. DOA estimation with time domain signals brings a lower limit to microphone spacing. In this approach, microphone spacing can be less than one meter only at very high sampling frequencies. Even in this case, DOA estimation performance may not be satisfactory. Moreover, high sampling frequencies increase hardware and software costs.

[0010] In most of the state of the art systems and methods, sensor units distributed within the mission region are used. This situation, together with signal processing, leads to network communication and synchronization problems among sensor units. The required processing load for network communication and synchronization increases hardware and software costs. On the other hand, transfer of raw data from sensor units to a central processing unit causes delay, which reduces the validity and accuracy of estimations.

[0011] Due to the usage of optical and thermal systems together with acoustic sensors, most of the detection and localization systems are expensive, complicated, and vulnerable to environmental conditions.

[0012] The present invention introduces a novel approach for shooter localization. In the method of the present invention, acoustic signals that reach the microphones are transformed into frequency domain, and DOA estimation for the signals is realized in frequency domain. The susceptibility of frequency domain DOA estimation to the microphone aperture (the maximum distance between the two most distant microphones in the array) and to the sampling frequency is much lower than that of TDOA-based methods. This allows for lowering the array dimension to half of the wavelength at 1 kHz (approximately 17 cm). Therefore, the present invention enables portable and wearable usage scenarios by adjusting the microphone aperture according to application requirements.

[0013] The present invention avoids the necessity to use a distributed network, and the invention enables DOA estimation and localization of the shooter with a system using only acoustic data. Therefore, problems that affect the performance of the state of the art systems, such as delay, communication security, synchronization requirements and hardware costs, are prevented.

Brief Description of the Invention



[0014] The aim of this invention is to introduce a beamforming-based method for the detection of shooter location. The direction and range of the shooter location are detected using a beamforming-based signal processing method with a microphone array.

[0015] Another aim of the present invention is to ensure the security of police stations, vehicles, convoys, borders, troops, society and the public, and/or to enable sniper detection, through shooter localization.

Detailed Description of the Invention



[0016] A representative application of the method achieving the objectives of the present invention for shooter localization is illustrated in the attached figures in order to clarify the details of the invention. The details of the description shall be considered by taking the whole description into account. In these figures;
Figure 1 is a flowchart of a representative application of the method of the present invention.

[0017] In the method of the present invention, acoustic signals are received by an array of acoustic receivers. Microphones, especially condenser type microphones, as well as dynamic, capacitive, ribbon, crystal, electret and carbon microphones, can be used as acoustic receivers without limiting the invention. First, an MB or an SW is detected from the received acoustic signals. The received acoustic signals are cross correlated with model signals (the acoustic waveforms of the MB and SW signals) by using a matched filter. The received signal is matched with the MB or the SW model according to the result of the cross correlation. If the received acoustic signal is an MB or an SW signal, the detected MB or SW signal is converted from time domain into frequency domain by using the Fast Fourier Transform (FFT).

[0018] Using the MB and SW signals received by the acoustic microphone array, beams are created for every possible direction in three dimensional space by Delay and Sum beamforming in frequency domain [B.D. Van Veen, K.M. Buckley, "Beamforming: A Versatile Approach to Spatial Filtering", IEEE Acoustics, Speech and Signal Processing (ASSP) Magazine, 4-24 (1988)]. DOA estimations for the MB and the SW signals are the maximum power directions of the MB and the SW beams, respectively. The range (distance) that generates the arrival time difference of the MB and SW signals is calculated. The location of the shooter is determined as a result of the estimated DOA and range.

[0019] The processes in the preferred embodiment of the invention are as follows:

Acoustic Receiver Array



[0020] The method of the invention comprises an array that consists of at least two acoustic receivers that are appropriate for receiving acoustic signals such as sound waves. In a preferred embodiment of the invention, an acoustic receiver array that comprises condenser type microphones as acoustic receivers is used. The acoustic receiver aperture of the array is approximately equal to the wavelength at 1 kHz. The acoustic receiver aperture is the distance between the two furthest acoustic receivers in the array.
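By way of a hedged illustration only (the microphone count and circular layout below are assumptions, not features of the invention), the aperture corresponding to the wavelength at 1 kHz can be computed as follows in Python:

    import numpy as np

    c = 343.0           # assumed speed of sound in air [m/s]
    f = 1000.0          # reference frequency [Hz]
    wavelength = c / f  # approximately 0.343 m; half of it is approximately 0.17 m

    # Hypothetical example geometry: four microphones on a circle whose diameter
    # (i.e. the array aperture) equals the wavelength at 1 kHz.
    radius = wavelength / 2.0
    angles = np.deg2rad([0.0, 90.0, 180.0, 270.0])
    mic_xyz = np.stack([radius * np.cos(angles),
                        radius * np.sin(angles),
                        np.zeros_like(angles)], axis=1)  # (4, 3) positions in meters
    print(f"aperture = {wavelength:.3f} m")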

The Detection of the Muzzle Blast (MB) and Shock Wave (SW) Signals



[0021] The acoustic signals received by the acoustic receiver array are compared with the acoustic model waveforms of the MB and SW signals. A received acoustic signal which is matched with one of the signal models is determined as an MB signal or an SW signal, correspondingly. The aforementioned comparison is made by cross correlating the MB signal model and the SW signal model with the received acoustic signal by using a matched filter.

Generation of the MB signal model



[0022] The MB signal model can preferably be formed as follows;

[0023] The modified Friedlander wave model is used for the MB signal model [Beck, S. D., Nakasone, H., Marr, K. W. (2011). "Variations in recorded acoustic gunshot waveforms generated by small firearms, " J. Acoust. Soc. Am. 129(4), 1748-1759.].

[0024] The positive phase part of the MB is calculated by the following equation;


Terminology:



[0025] 

pp = the positive phase part of the MB

Psp = the maximum pressure of the positive phase (in Pascals)

tp = the time vector for the positive phase (in seconds)

T0p = the duration of the positive phase (in seconds)

b = an exponential coefficient (unitless)

* = multiplication

.* = element-wise multiplication



[0026] In addition to the positive phase part, the second part of the MB signal, which is named the negative phase, is calculated by the following equation;


Terminology:



[0027] 

pn = the negative phase part of the MB

Pnp = the maximum pressure of the negative phase (in Pascals)

tn = the time vector of the negative phase (in seconds)

T0n = the duration of the negative phase (in seconds)

* = multiplication

.* = element-wise multiplication



[0028] The MB signal model is obtained by concatenating the positive and negative phase parts.
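As a hedged illustration only, the sketch below shows one way such a two-phase muzzle blast model could be generated in Python; the exact expressions of the invention are those given by the equations of paragraphs [0024] and [0026], and the functional forms, parameter names and numerical values used here are assumptions.

    import numpy as np

    def mb_model(Psp=100.0, T0p=2e-3, b=1.0, Pnp=30.0, T0n=3e-3, fs=48000):
        """Illustrative two-phase (positive + negative) muzzle blast waveform.

        The positive phase is an assumed Friedlander-style decaying pulse; the
        negative phase is an assumed shallow under-pressure lobe. Units: Pa, s, Hz.
        """
        tp = np.arange(0.0, T0p, 1.0 / fs)                    # time vector, positive phase
        pp = Psp * (1.0 - tp / T0p) * np.exp(-b * tp / T0p)   # assumed positive phase shape
        tn = np.arange(0.0, T0n, 1.0 / fs)                    # time vector, negative phase
        pn = -Pnp * np.sin(np.pi * tn / T0n)                  # assumed negative phase shape
        return np.concatenate([pp, pn])                       # concatenation as in [0028]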

Generation of the SW signal model



[0029] The SW signal model can be preferably created as follows;

[0030] For the SW signal model, Whitham wave model [Stoughton, R. (1997). "Measurements of small-caliber ballistic shock waves in air, " J. Acoust. Soc. Am. 102(2), 781-787.] is used.


Terminology:



[0031] 

p_kernel_S = SW signal model

t = time in seconds

P0 = ambient pressure (in Pascals)

M = v / c Mach number (unitless)

v = speed of the bullet (in meters/second)

c = speed of sound (in meters/second)

d = caliber (in meters)

b = miss distance (in meters)

l = bullet length (in meters)



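For illustration only, the following Python sketch generates a simplified N-wave in the spirit of the Whitham model; the peak-overpressure and duration expressions are commonly cited far-field approximations and are assumptions here, the governing equation of the invention being the one referenced in paragraph [0030].

    import numpy as np

    def sw_model(P0=101325.0, v=800.0, c=343.0, d=7.62e-3, b=10.0, l=0.03, fs=96000):
        """Illustrative N-wave shock wave model (Whitham-type approximations, assumed)."""
        M = v / c                                                        # Mach number
        # Assumed far-field approximations for peak overpressure and N-wave duration.
        Pmax = 0.53 * P0 * d * (M ** 2 - 1.0) ** 0.125 / (b ** 0.75 * l ** 0.25)
        T = 1.82 * M * b ** 0.25 * d / (c * (M ** 2 - 1.0) ** 0.375 * l ** 0.25)
        t = np.arange(0.0, T, 1.0 / fs)
        return Pmax * (1.0 - 2.0 * t / T)                                # +Pmax down to -Pmax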


Matched Filter and Cross Correlation Process



[0032] Separate cross correlations (matched filtering) are carried out for the detection of the MB and SW signals. The cross correlation between the received signal (vector x of size Nx x 1) and the theoretical signal model (vector y of size Ny x 1) corresponding to the m-th delay is calculated as follows:

Cxy(m) = Σn x(n + m) * y(n)
Terminology:



[0033] 

Cxy(m) = Cross correlation

x(n) = Received signal

y = Theoretical signal model

n = Time index

m = Time delay index



[0034] The time delay index (m) stands for the time difference in matching the received signal with the MB or SW signal model. The summation is calculated over the time index n.

[0035] If a matching occurs as a result of the cross correlation, the received acoustic signal is detected either as an MB or as an SW signal.
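A minimal Python sketch of this detection step is given below, assuming NumPy; the normalization and the threshold value are assumptions used only to make the example self-contained.

    import numpy as np

    def matched_filter_detect(x, y, threshold=0.6):
        """Cross correlate received signal x with the MB or SW model y.

        Returns (detected, m0, cxy): a detection flag, the delay index m0 of the
        best match, and the full cross correlation sequence cxy.
        """
        # C_xy(m) = sum_n x(n + m) * y(n), computed for all delays m.
        cxy = np.correlate(x, y, mode="full")
        # Assumed normalization and threshold for declaring a match.
        score = np.abs(cxy) / (np.linalg.norm(x) * np.linalg.norm(y) + 1e-12)
        peak = int(np.argmax(score))
        m0 = peak - (len(y) - 1)              # convert array index to delay index
        return bool(score[peak] > threshold), m0, cxy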

Delay and Sum Beamforming



[0036] If an MB or an SW is detected, the received acoustic signal is transformed from time domain into frequency domain by means of the Fast Fourier transform (FFT). The aforementioned transformation process is preferably performed as follows;

[0037] The acoustic signal of size (Nt x M), which is received by the M microphones over Nt time samples, is transformed into frequency domain by the FFT:

X(l, m) = Σn x(n, m) * e^(-i*2π*l*(n-1)/Nfft), with the summation over n = 1, ..., Nt
Terminology:



[0038] 

X(l, m) = FFT output of the signal of length Nt received by M microphones

x(n, m) = The signal received by the microphones

i = (-1)^(1/2), the imaginary unit

l = [0, Nfft-1] frequency index

m = [1, M] acoustic receiver channel

n = [1, Nt] time index

Nt = Number of samples in the received signal



[0039] The maximum amplitude in any channel (for example, in the 2nd channel/column, which belongs to a reference acoustic receiver) of the Fourier coefficient matrix X with dimensions of ((Nfft/2 + 1) × M) is found. The frequency index (l0) and the frequency value (f0) corresponding to this maximum amplitude are determined.
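The following Python sketch (assuming NumPy) illustrates this step; the choice of the 2nd channel as reference follows the example in paragraph [0039].

    import numpy as np

    def dominant_bin(x, fs, nfft=None, ref_channel=1):
        """FFT of an (Nt x M) snapshot and selection of the strongest frequency bin.

        Returns (X, l0, f0): the one-sided Fourier coefficient matrix of shape
        (Nfft/2 + 1, M), the index l0 of the maximum-amplitude bin in the reference
        channel (index 1, i.e. the 2nd column), and the corresponding frequency f0.
        """
        nt, m = x.shape
        nfft = nfft or nt
        X = np.fft.rfft(x, n=nfft, axis=0)               # FFT of each microphone channel
        l0 = int(np.argmax(np.abs(X[:, ref_channel])))   # strongest bin of the reference channel
        f0 = l0 * fs / nfft                              # frequency value in Hz
        return X, l0, f0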

[0040] Subsequently, using the MB and SW signals received by the acoustic microphone array, beams are created for every possible direction in three dimensional space by Delay and Sum beamforming in frequency domain [B.D. Van Veen, K.M. Buckley, "Beamforming: A Versatile Approach to Spatial Filtering", IEEE Acoustics, Speech and Signal Processing (ASSP) Magazine, 4-24 (1988)].

[0041] An MB or an SW signal is incident on each of the array microphones with a different delay. This delay is due to the spatial location differences of the microphones in the acoustic receiver array. In the Delay and Sum method, for an acoustic signal incident on the microphone array, delays are calculated in frequency domain for every possible DOA. These delays are applied to the received acoustic signals, and then the signals are summed up to form a beam for every possible DOA in three dimensional space. The abovementioned procedure is carried out for MB and SW signals.

[0042] An azimuth (ψ) vector of size (360 x 1) and an elevation (θ) vector of size (181 x 1) are created with a resolution of 1 degree for [0, 359] degrees and [-90, 90] degrees, respectively.

[0043] The beamforming response corresponding to the (azimuth ψ, elevation θ) angle pair is calculated as:


Terminology:



[0044]

X(l0, m) = Fourier coefficient of channel m at the frequency index l0 corresponding to the maximum amplitude

Rs(1, m) = Distance of m-th microphone to the center of the receiver array

Rs(2, m) = Azimuth of m-th microphone with respect to the center of the receiver array

Rs(3, m) = Elevation of m-th microphone with respect to the center of the receiver array
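Since the steering expression of the invention is the one given by the equation of paragraph [0043], the Python sketch below should be read only as a hedged illustration of frequency-domain Delay and Sum beamforming; it assumes a conventional far-field steering vector built from the microphone coordinates Rs and the dominant frequency f0, and the coordinate conventions used here are assumptions.

    import numpy as np

    def delay_and_sum_response(X_l0, f0, Rs, c=343.0):
        """Assumed frequency-domain Delay and Sum beam over a 1-degree (azimuth, elevation) grid.

        X_l0: (M,) Fourier coefficients of the M channels at the dominant bin.
        Rs:   (3, M) microphone coordinates: range [m], azimuth [rad], elevation [rad]
              with respect to the array center. Returns r of shape (360, 181).
        """
        psi = np.deg2rad(np.arange(0, 360))       # azimuth grid [0, 359] degrees
        theta = np.deg2rad(np.arange(-90, 91))    # elevation grid [-90, 90] degrees

        # Microphone positions converted to Cartesian coordinates (assumed convention).
        mx = Rs[0] * np.cos(Rs[2]) * np.cos(Rs[1])
        my = Rs[0] * np.cos(Rs[2]) * np.sin(Rs[1])
        mz = Rs[0] * np.sin(Rs[2])

        # Unit look direction for every (azimuth, elevation) pair on the grid.
        ux = np.cos(theta)[None, :] * np.cos(psi)[:, None]
        uy = np.cos(theta)[None, :] * np.sin(psi)[:, None]
        uz = np.sin(theta)[None, :] * np.ones((psi.size, 1))

        # Far-field delay of microphone m for a given look direction: tau = (u . p_m) / c.
        tau = (ux[..., None] * mx + uy[..., None] * my + uz[..., None] * mz) / c  # (360, 181, M)
        # Compensate the delays at frequency f0 and sum the channels (Delay and Sum).
        r = np.sum(X_l0[None, None, :] * np.exp(1j * 2.0 * np.pi * f0 * tau), axis=-1)
        return r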


DOA Estimation for MB and SW Signals



[0045] DOA estimations for the MB and the SW signals are the directions with the maximum power for the MB and the SW beams, respectively.

[0046] The signal power in each direction of the beam is given by the following equation:

b(ψ, θ) = |r(ψ, θ)|²
Terminology:



[0047] 

r(ψ, θ) = The beamforming response corresponding to the (azimuth (ψ), elevation (θ)) angle pair

b(ψ, θ) = Signal power in each direction



[0048] After beamforming for all azimuth and elevation directions, the azimuth and the elevation of the incoming acoustic signal are given by the maximum of the beamforming response:

(ψ0, θ0) = arg max(ψ, θ) b(ψ, θ)
Terminology:



[0049] 

ψ0 = The azimuth DOA with the maximum response

θ0 = The elevation DOA with the maximum response
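A short Python sketch of this maximization, assuming the (360 x 181) response grid produced by the beamforming sketch above:

    import numpy as np

    def doa_from_response(r):
        """DOA as the (azimuth, elevation) pair maximizing the beam power b = |r|^2.

        r: (360, 181) complex beamforming response over a 1-degree grid.
        Returns (psi0, theta0) in degrees.
        """
        b = np.abs(r) ** 2                              # signal power in each direction
        i, j = np.unravel_index(np.argmax(b), b.shape)  # grid indices of the maximum
        return int(i), int(j) - 90                      # azimuth in [0, 359], elevation in [-90, 90]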


Range Estimation



[0050] The range (distance) that generates the arrival time difference of the MB and SW signals is calculated. The location of the shooter is determined as a result of the estimated DOA and range.

[0051] The range is calculated by using the difference of velocities of MB and SW signals in air. Their arrival times at the acoustic receiver array are different. In a preferred embodiment of the invention, the difference between the arrival times of MB and SW signals is taken as the difference between the starting points of these signals, which are determined according to their cross correlations with the corresponding signal models.
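As a hedged sketch, the arrival-time difference can be obtained from the matched-filter outputs as follows; taking the correlation maximum as the signal starting time follows the calculation described in claim 12, while the conversion to seconds and the sign convention are assumptions of this example.

    import numpy as np

    def arrival_time_difference(c_mb, c_sw, fs):
        """Delta t between the MB and SW starting points from their matched-filter outputs.

        c_mb, c_sw: full cross correlation sequences of the received signal with the
        MB and SW models (e.g. from the matched-filter sketch above); fs in Hz.
        """
        n_mb = int(np.argmax(np.abs(c_mb)))   # lag index of the best MB match
        n_sw = int(np.argmax(np.abs(c_sw)))   # lag index of the best SW match
        return (n_mb - n_sw) / fs             # seconds; typically positive, the SW arriving first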

[0052] Using the (azimuth, elevation) DOA estimations for the SW and MB signals, which are denoted by (ψSW, θSW) and (ψMB, θMB) respectively, unit vectors in these DOA directions are generated:

uSW = [cos(θSW) * cos(ψSW), cos(θSW) * sin(ψSW), sin(θSW)]

uMB = [cos(θMB) * cos(ψMB), cos(θMB) * sin(ψMB), sin(θMB)]
Terminology:



[0053] 

ψSW = The azimuth DOA estimation for the SW signal

θSW = The elevation DOA estimation for the SW signal

ψMB = The azimuth DOA estimation for the MB signal

θMB = The elevation DOA estimation for the MB signal

uSW = Unit vector for the SW signal that has a DOA of (ψSW, θSW)

uMB = Unit vector for the MB signal that has a DOA of (ψMB, θMB)



[0054] The cosine of the angle α between these two unit vectors is obtained by means of scalar multiplication:

cos(α) = uMB · uSW
[0055] Consequently, the firing range is calculated by using the Δt value, which is the difference between the arrival times of the MB and SW signals. The range value is calculated as range = (Δt * c) / (1 - cos(α)).
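The following Python sketch combines the unit vectors, the scalar product and the range formula above; the angle conventions are the same assumptions as in the earlier sketches.

    import numpy as np

    def estimate_range(delta_t, psi_mb, theta_mb, psi_sw, theta_sw, c=343.0):
        """Firing range from the MB/SW arrival-time difference and the two DOA estimates.

        Angles in degrees, delta_t in seconds. Implements range = (delta_t * c) / (1 - cos(alpha)).
        """
        def unit(psi_deg, theta_deg):
            psi, theta = np.deg2rad(psi_deg), np.deg2rad(theta_deg)
            return np.array([np.cos(theta) * np.cos(psi),
                             np.cos(theta) * np.sin(psi),
                             np.sin(theta)])

        u_mb = unit(psi_mb, theta_mb)
        u_sw = unit(psi_sw, theta_sw)
        cos_alpha = float(np.dot(u_mb, u_sw))   # cosine of the angle between the two DOAs
        # Note: cos_alpha approaching 1 (nearly identical DOAs) makes the estimate ill-conditioned.
        return (delta_t * c) / (1.0 - cos_alpha)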


Claims

1. A method for acoustic determination of shooter location, characterized in that it comprises the following procedural steps: Reception of acoustic signals by a microphone array; detection of muzzle blast (MB) and shock wave (SW) signals through matched filter and cross correlation processes; transforming the MB and SW signals from time domain into frequency domain; beamforming for the signals by means of the Delay and Sum method in frequency domain; estimation of the direction of arrival (DOA) for the MB and SW signals by finding the azimuth and elevation which give the maximum power of the beamforming response; range estimation using the difference between the arrival times of the MB and SW signals together with the DOA estimations.
 
2. A method according to claim 1, characterized in that it comprises the following steps; comparing the received acoustic signals with the MB signal model and the SW signal model; in case a matching occurs, then detecting the received acoustic signal as an MB or an SW signal accordingly.
 
3. A method according to claim 1 or 2, characterized in that it comprises the following steps; cross correlating the received acoustic signals with the MB signal model and the SW signal model by using a matched filter; in case a matching occurs, then detecting the received acoustic signal as an MB or an SW signal accordingly.
 
4. A method according to claim 2 or 3, characterized in that it comprises the MB signal model which is a modified Friedlander wave model.
 
5. A method according to any one of the claims 2 to 4, characterized in that it comprises the SW signal model which is a Whitham wave model.
 
6. A method according to any one of the preceding claims, characterized in that it comprises the Fast Fourier Transformation of the detected MB and SW signals from time domain into frequency domain.
 
7. A method according to any one of the preceding claims, characterized in that it comprises the following steps; calculation of delays in frequency domain for every possible DOA of acoustic MB signals incident on the microphones; application of the calculated delays to the MB signals; summing the delayed MB signals; Delay and Sum beamforming in three dimensional space.
 
8. A method according to any one of the preceding claims, characterized in that it comprises the following steps; calculation of delays in frequency domain for every possible DOA of acoustic SW signals incident on the microphones; application of the calculated delays to the SW signals; summing the delayed SW signals; Delay and Sum beamforming in three dimensional space.
 
9. A method according to any one of the preceding claims, characterized in that DOA estimation for the MB signal is the direction with the maximum power for the MB beam.
 
10. A method according to any one of the preceding claims, characterized in that DOA estimation for the SW signal is the direction with the maximum power for the SW beam.
 
11. A method according to any one of the preceding claims, characterized in that it comprises the following steps for range estimation that is calculated by using the unit vectors in the DOA directions of the MB and SW signals; multiplying the difference between the arrival times of the MB and SW signals by the speed of sound; dividing the calculated product by (1 - the cosine of the angle between MB and SW unit vectors).
 
12. A method according to any of the preceding claims dependent on claim 3, characterized in that it comprises the following calculation to find the difference between the arrival times of the MB and SW signals; the starting time of an acoustic signal is given by the index of the maximum of cross correlation between the signal and its corresponding theoretical signal model.
 




Drawing

Figure 1: Flowchart of a representative application of the method of the present invention.