FIELD: military equipment.
SUBSTANCE: the invention relates to the armament of armored fighting vehicles (AFV) and can be used to automate external target designation and the indication of targets and important objects directly in the fields of view (on the screens of the video viewing devices (VVD)) of sighting and surveillance devices with digital optoelectronic channels of AFV samples. During external target designation with indication, information is received from outside, from other objects (another AFV of the subunit, an echelon, e.g. command, or a control or reconnaissance system, etc.), and the locations of targets and important objects are displayed in real time on the screen of a tablet showing navigation and tactical information. The received target designation data contain at least the three-dimensional coordinates (xWg, yWg and zWg) of the g targets and important objects (in the external coordinate system W) and their types (for example, armored object, tank-dangerous manpower, low-altitude low-mobility target, etc.). Data on the directions and speeds of their movement in the external coordinate system (CS) W, as well as the time of the last update of the information on each target or important object, are obtained as well. For all received g targets and important objects, the vectors PWg = (xWg yWg zWg 1) are recorded. According to the target designation data, the corresponding graphic markers (tactical signs) are placed on the digital map of the navigation and tactical information display. The angles αBg of rotation of the AFV turret toward said g targets and important objects are calculated. For each j sight for which target designation can be carried out, the matrices CjMB of the position of the CS of its base relative to the origin of the CS of the AFV are calculated. The matrix CWAFV of the position of the AFV CS in the CS W OWXWYWZW is calculated.
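The turret rotation angle αBg toward a target can be sketched as follows. This is a minimal illustration, not the patented method itself: it assumes homogeneous 4-vectors, 4×4 pose matrices, and an AFV CS with X forward and Y to the left; the function name and axis conventions are hypothetical.

```python
import numpy as np

def turret_azimuth(P_W, C_W_AFV):
    """Angle alpha_Bg (rad) to turn the turret toward a target.

    P_W     : homogeneous world-CS target vector (x, y, z, 1), as PWg above.
    C_W_AFV : 4x4 pose matrix of the AFV CS in the world CS W (CWAFV above).
    """
    # Transform the target from the world CS W into the AFV CS.
    P_AFV = np.linalg.inv(C_W_AFV) @ P_W
    # Azimuth in the assumed AFV horizontal plane (X forward, Y left).
    return np.arctan2(P_AFV[1], P_AFV[0])
```

With the AFV at the world origin (CWAFV = identity), a target at (1, 1, 0) lies 45° off the longitudinal axis, so the function returns π/4.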
For each camera (optoelectronic part) of the j sight, a position matrix CKj is calculated, which determines the current (real-time) position and orientation of the CS OKjXKjYKjZKj of the camera of the j sight relative to the CS OjXjYjZj of its base. The position matrix is calculated as the product of matrices CKj = Cj1Cj2Cj3...Cjq, where q is the number of intermediate matrices defined by the structure of the sight. Each intermediate matrix includes a rotation matrix Rjq and/or a translation vector Tjq, which determine the positions and orientations of the intermediate (dependent) CS elements of the j sight and together set the position and orientation of the CS OKjXKjYKjZKj of the camera of the j sight relative to the CS OjXjYjZj of its base. The coefficients of the matrices Cjq (depending on the structure of the sight) are calculated using data from the orientation sensors of the head module or head mirror (HM) of the sight, as well as the coordinates and orientation of the CS OKjXKjYKjZKj of the camera relative to the CS OjXjYjZj of the sight base or its HM. The vector PKj of the coordinates of the g target or important object is calculated in the CS OKjXKjYKjZKj of the camera of the j sight. The coordinates of the g target or important object in the camera CS are then scaled onto the Imgj image plane, for which the projection coefficient sj is calculated, the projection matrix Sj is formed, and the values of the coordinates of the vector PKj are recalculated. The matrices Kj of the internal parameters of the cameras (optoelectronic parts) of the j sights are calculated, and for each g target or important object and each j sight the vector Pgj = (ngj mgj) is formed, containing the pixel coordinates (column number ngj and row number mgj) of the point Pgj, the position of the center of the graphic marker Qq on the Imgj image of the video viewing device (VVD) of the j sight.
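The projection chain above (world CS → AFV CS → sight base CS → camera CS → image plane → pixels) can be sketched as a minimal pinhole-camera pipeline. All names, the diagonal form of Sj, and the 3×3 intrinsic matrix Kj are illustrative assumptions, not the patent's exact formulation:

```python
import numpy as np

def project_to_pixels(P_W, C_W_AFV, C_jMB, C_chain, K_j):
    """Project a world-CS target into pixel coordinates (n, m) of sight j.

    P_W     : homogeneous world-CS target vector PWg = (x, y, z, 1).
    C_W_AFV : 4x4 pose of the AFV CS in the world CS (CWAFV).
    C_jMB   : 4x4 pose of the sight base CS in the AFV CS (CjMB).
    C_chain : list of 4x4 intermediate matrices Cj1..Cjq (CKj = Cj1...Cjq).
    K_j     : 3x3 intrinsic (internal camera parameters) matrix Kj.
    """
    # CKj: camera pose relative to the sight base, product of Cj1..Cjq.
    C_Kj = np.eye(4)
    for C in C_chain:
        C_Kj = C_Kj @ C
    # World -> AFV -> sight base -> camera CS: vector PKj.
    P_Kj = np.linalg.inv(C_Kj) @ np.linalg.inv(C_jMB) @ np.linalg.inv(C_W_AFV) @ P_W
    # Projection coefficient s_j = 1/z scales PKj onto the image plane (matrix Sj).
    s_j = 1.0 / P_Kj[2]
    S_j = np.diag([s_j, s_j, s_j, 1.0])
    P_img = S_j @ P_Kj
    # Intrinsics Kj map image-plane coordinates to pixel column n and row m.
    n, m, _ = K_j @ P_img[:3]
    return n, m
```

For identity poses and Kj with focal length 100 px and principal point (320, 240), a target on the optical axis at range 5 projects exactly to the principal point.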
For each g target or important object, a graphic marker Qq corresponding to the type of the target, for example a frame outlining the location of the target image in the field of view of the j sight, is displayed on the Imgj images of the VVD of the j sights. If the pixel coordinates of the g target or important object go beyond the edges of the Imgj image, i.e. ngj < 0 or ngj ≥ Nj, and/or mgj < 0 or mgj ≥ Mj, the graphic marker Qq is displayed on the VVD in a reduced size, for example as a tactical sign at the edge of the Imgj image, in the row or column whose value did not go beyond the image edges. When a decision is made to aim the weapon at the g target, guidance commands are sent to the horizontal weapon stabilizer drive until the turret turns through the angle αBg toward the target, after which guidance signals for the horizontal and vertical guidance drives are generated until the pixel coordinates of the g target become equal to the pixel coordinates of the central sighting mark of the j sight. When transmitting target designation, detected targets or important objects are indicated on the VVD screen of the j sight, for example by pointing the central sighting mark at the target or object and issuing a target transmission command, or, if the VVD has a touch screen, by pressing the area of the VVD screen where the target or object is located; in either case, the column number ngj and row number mgj corresponding to the image of the target or important object on the Imgj image are determined, the vector Pgj = (ngj mgj) is recorded, and the corresponding graphic marker Qq is displayed on the VVD screen of the j sight. The distance zgKj to the g target or important object is measured by any available method, for example using the standard laser rangefinder of the j sight. The vector Pgj is transformed from the pixel CS of the Imgj image into the three-dimensional CS of the camera.
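The off-screen test and the edge placement of the reduced marker Qq can be sketched as follows (a minimal illustration with hypothetical names; the patent does not prescribe this exact clamping rule):

```python
def place_marker(n, m, N, M):
    """Decide where to draw marker Qq on an N x M (cols x rows) image Img.

    Returns (n_draw, m_draw, inside): if the target pixel (n, m) is inside
    0 <= n < N and 0 <= m < M, the marker is drawn at the target itself;
    otherwise the coordinate that overflowed is clamped to the nearest image
    edge and a reduced tactical sign is drawn there.
    """
    inside = 0 <= n < N and 0 <= m < M
    # Clamp each coordinate independently: the in-range one keeps its value.
    n_draw = min(max(n, 0), N - 1)
    m_draw = min(max(m, 0), M - 1)
    return n_draw, m_draw, inside
```

For a 640×480 image, a target projected to column -5, row 100 yields a reduced sign on the left edge at (0, 100), while an in-frame target is drawn unchanged.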
The vector P'Kj of coordinates is scaled back from the Imgj image plane, for which the value of the projection coefficient sj is calculated, the projection matrix Sj is formed, and the vector PKj of coordinates is computed. The coordinates of the target are converted from the CS OKjXKjYKjZKj of the camera of the j sight into the CS W. Simultaneously, the corresponding graphic markers Qq (tactical signs) are displayed on the digital map of the graphic tablet (the navigation and tactical information display), and the vector PWg with the target coordinates is passed to the receiving-transmitting equipment for further transmission, for example to other AFVs of the subunit, the tactical echelon control system, a reconnaissance system, etc.
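The reverse path (pixel coordinates plus the measured range zgKj back to world coordinates PWg) can be sketched as the inverse of the projection pipeline. Again a minimal illustration under the same assumed conventions (4×4 pose matrices, 3×3 intrinsics); names are hypothetical:

```python
import numpy as np

def pixels_to_world(n, m, z_Kj, K_j, C_Kj, C_jMB, C_W_AFV):
    """Recover world-CS coordinates PWg from pixel coords and measured range.

    n, m    : pixel column and row of the indicated target (vector Pgj).
    z_Kj    : measured distance zgKj along the camera optical axis.
    K_j     : 3x3 intrinsic matrix Kj of the camera of sight j.
    C_Kj, C_jMB, C_W_AFV : 4x4 poses camera->base, base->AFV, AFV->world.
    """
    # Invert the intrinsics Kj: image-plane coordinates of the pixel.
    x_n, y_n, _ = np.linalg.inv(K_j) @ np.array([n, m, 1.0])
    # Undo the projection scaling s_j = 1/z using the measured range:
    # this reconstructs the camera-CS vector PKj from P'Kj.
    P_Kj = np.array([x_n * z_Kj, y_n * z_Kj, z_Kj, 1.0])
    # Camera -> sight base -> AFV -> world CS: vector PWg.
    return C_W_AFV @ C_jMB @ C_Kj @ P_Kj
```

This inverts the forward projection: the principal-point pixel with a 5 m range and identity poses recovers a target on the optical axis at (0, 0, 5).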
EFFECT: multi-purpose and accurate external target designation in real time, both within the line of sight and beyond it or behind cover, with minimal dependence of the target designation result on the complexity of the target situation, the intensity of combat, and the skill and training of the AFV crew.
1 cl, 5 dwg
Filed: 2019-01-22
Published: 2019-08-08