FIELD: radar.
SUBSTANCE: the invention relates to radar and can be used in a pulse-Doppler radar station to recognize, by means of a neural network, various types of air target from the class "Aircraft with turbojet engine (TJE)" under stationary flight conditions. In the disclosed method, the radar signal S(t) reflected from the aircraft with the turbojet engine is subjected to narrow-band Doppler filtering in the fast Fourier transform (FFT) unit and converted into an amplitude-frequency spectrum (AFS) S(f)={F, A}, where F is the set of frequencies of the spectral components and A is the set of corresponding amplitudes. In the Spectrum truncation and normalization device, the sub-range containing the informative features is calculated, and the AFS is truncated to this sub-range and normalized by amplitude. The truncated and normalized array of amplitudes A is supplied to the corresponding inputs of the Estimating device, which is a set of W neural networks pre-trained in the Neural network model unit. Each neural network recognizes only one of the W types of the a priori class alphabet. The output of the w-th neural network in the Estimating device is the probability P1 that the received image of normalized amplitudes A coincides with one of the W types of the a priori class alphabet, and the probability P2 that the supplied image of normalized amplitudes A does not match the generalized image formed by the neural network. From the output of the Estimating device, the vector of estimates of the probabilities of assigning the image of normalized amplitudes to type w of the air target, out of W possible types, arrives at the input of the Resolving device. In the Resolving device, the probabilities P1 and P2 are compared with the given threshold probabilities Ptr1 and Ptr2, respectively.
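The signal-processing chain up to the Estimating device can be sketched as follows. This is a minimal illustration assuming NumPy; the function names, the sampling rate, and the sub-range bounds are hypothetical, and the informative sub-range is passed in explicitly rather than calculated as in the patented method.

```python
import numpy as np


def amplitude_frequency_spectrum(s, fs):
    """Narrow-band Doppler filtering via FFT of the reflected signal s(t)
    sampled at rate fs; returns the AFS as (frequencies F, amplitudes A)."""
    spec = np.fft.rfft(s)
    F = np.fft.rfftfreq(len(s), d=1.0 / fs)
    A = np.abs(spec)
    return F, A


def truncate_and_normalize(F, A, f_lo, f_hi):
    """Truncate the AFS to the sub-range [f_lo, f_hi] (assumed given here,
    not computed) and normalize the amplitudes to a maximum of 1."""
    mask = (F >= f_lo) & (F <= f_hi)
    A_cut = A[mask]
    return F[mask], A_cut / A_cut.max()
```

The truncated, normalized amplitude array is what would be fed to the W per-type neural networks of the Estimating device.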
If the condition P1w>Ptr1∪P2w>Ptr2 is satisfied, then, based on the criterion of maximum probability P1w, the image is unambiguously assigned to a specific type w of air target from the a priori class alphabet, and this value is transmitted to the output of the Resolving device as the recognition result. The neural networks are trained in the Neural network model unit on the W types of the a priori class alphabet using a model of the reflected radar signal.
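The threshold comparison in the Resolving device can be sketched as follows. This is an illustrative assumption, not the patented implementation: the union sign in the condition is read as logical OR, and the function name and the form of the probability vectors are hypothetical.

```python
def resolve(P1, P2, Ptr1, Ptr2):
    """Decision rule of the Resolving device: P1[w] and P2[w] are the two
    outputs of the w-th neural network; returns the recognized type index w,
    or None if no type passes the thresholds."""
    # Candidate types satisfying the condition P1w > Ptr1 OR P2w > Ptr2
    # (the union sign in the abstract is read here as logical OR).
    candidates = [w for w in range(len(P1)) if P1[w] > Ptr1 or P2[w] > Ptr2]
    if not candidates:
        return None  # no type passes: recognition is withheld
    # Criterion of maximum probability P1w: pick the best-matching type.
    return max(candidates, key=lambda w: P1[w])
```

For example, with thresholds of 0.5, a score vector where only one type exceeds Ptr1 yields that type's index.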
EFFECT: the method enables a pulse-Doppler radar station to recognize, with a probability not lower than a given value, the type of an air target from the class "Aircraft with turbojet engine" under stationary flight conditions.
1 cl, 1 dwg, 1 tbl
| Title | Year | Author | Number |
|---|---|---|---|
| METHOD FOR RECOGNIZING TYPICAL COMPOSITION OF A CLUSTERED AIR TARGET OF VARIOUS CLASSES UNDER VARIOUS CONDITIONS OF THEIR FLIGHT BASED ON KALMAN FILTERING AND A NEURAL NETWORK | 2022 | | RU2802653C1 |
| METHOD FOR ALL-ANGLE RECOGNITION IN RADAR STATION OF TYPICAL COMPOSITION OF GROUP AIR TARGET UNDER VARIOUS FLIGHT CONDITIONS AND INFLUENCE OF SPEED-CONTUSION INTERFERENCE BASED ON KALMAN FILTERING AND NEURAL NETWORK | 2023 | | RU2816189C1 |
| METHOD OF AIRCRAFT TYPE IDENTIFICATION WITH TURBOJET ENGINE IN PULSE-DOPPLER RADAR STATION UNDER ACTION OF SIMULATING NOISE | 2020 | | RU2735314C1 |
| METHOD OF AIRCRAFT WITH TURBOJET ENGINE TYPE IDENTIFICATION IN PULSE-DOPPLER RADAR STATION | 2019 | | RU2705070C1 |
| METHOD OF AIRCRAFT WITH TURBOJET ENGINE TYPE IDENTIFICATION IN PULSE-DOPPLER RADAR STATION | 2020 | | RU2731878C1 |
| A METHOD FOR RECOGNIZING THE TYPICAL COMPOSITION OF A GROUP AIR TARGET FROM THE CLASS "TURBOJET ENGINE AIRCRAFTS" BASED ON KALMAN FILTERING AND A NEURAL NETWORK | 2022 | | RU2786518C1 |
| METHOD FOR RECOGNIZING TYPE OF SINGLE AIR TARGET FROM AIRCRAFT WITH TJE CLASS | 2023 | | RU2807510C1 |
| METHOD OF AIRCRAFT WITH TURBOJET ENGINE TYPE IDENTIFICATION IN PULSE-DOPPLER RADAR STATION UNDER ACTION OF SPEED-ESCAPING INTERFERENCE | 2019 | | RU2732281C1 |
| METHOD OF TRACKING AERIAL TARGET FROM "TURBOJET AIRCRAFT" CLASS UNDER EFFECT OF VELOCITY DEFLECTING NOISE | 2015 | | RU2579353C1 |
| METHOD OF TRACKING AN AERIAL TARGET UNDER THE ACTION OF A SIGNAL-LIKE WITH DOPPLER FREQUENCY MODULATION OF DRFM TYPE NOISE | 2020 | | RU2727963C1 |
Dates
Filed: 2023-12-29
Published: 2024-09-06