BACKGROUND
[0001] Snipers are utilized in various types of battle situations and environments, and
can be a force multiplier on the battlefield if the sniper is well trained
for the specific engagement. Training for a specific engagement, however, is sometimes
difficult. Current training methods can effectively train a sniper on aspects of marksmanship
such as breath control, stance, support, and trigger control, but are often lacking
in addressing the critical skill of determining the point-of-aim on a target.
[0002] Determination of the point-of-aim on a target requires the consideration of several
factors, and the combinations of those factors, including distance, wind and other
weather conditions, target movements, and parallax. This is a skill that requires
substantial practice not only to learn, but to maintain as well.
[0003] Point-of-aim practice is currently limited to firing range facilities, resulting
in a limited variety of distances and weather conditions. Sniper trainees would also
need to relocate among firing range facilities to receive more comprehensive training,
costing time and money. In addition, firing range facilities that are suitable for
sniper training are not always available at all bases or when soldiers are deployed.
[0004] Another component of point-of-aim training is the ability to observe trace. Trace,
which is the vapor trail formed by the bullet due to air turbulence, allows the trainer
and the trainee to observe the path of the bullet to determine the point of impact
on the target. The vapor trail, however, lasts only a matter of seconds, making this
skill difficult for new trainees to learn.
SUMMARY
[0005] For improved sniper training, a sniper skills training system is described. The sniper
training system comprises a computer trainer station having a 3D rendering engine,
a computer display, a trainer user interface module, at least one database, and a
data storage; and a simulated rifle unit having a near-to-eye display and a hardware
interface module, wherein the trainer station is interfaced with the simulated rifle
unit to provide a simulated training scenario.
BRIEF DESCRIPTION OF FIGURES
[0006]
Figure 1 is a pictorial diagram showing the sniper training system, including a trainer
computer station and a simulated trainee rifle unit, according to one embodiment of
the invention.
Figure 2 is a block diagram showing the software architecture of the sniper training
system, according to one embodiment of the invention.
Figure 3A is a flow diagram showing the interaction between the trainee and the sniper
training system during a training session, according to one embodiment of the invention.
Figure 3B is a flow diagram showing the interaction between the trainee and the sniper
training system during a training session with the help of a trainer, according to
one embodiment of the invention.
Figure 4 is a pictorial diagram showing the setup of a sniper training session using
a sniper training system, according to one embodiment of the invention.
DETAILED DESCRIPTION
[0007] Figure 1 is a pictorial diagram showing the sniper training system
100, including a trainer computer station
102 and a simulated trainee rifle unit
104, according to one embodiment of the invention. The trainer computer station
102 acts as the host to the simulations for training. Scenarios are created and loaded
into the computer to provide the appropriate conditions that the trainee is preparing
for. The trainer computer station
102 is interfaced with the simulated rifle unit
104, which the trainee uses to simulate aiming and firing at targets during training.
While the trainer computer station
102 is preferably a laptop or other portable computer for increased portability, other
processing devices alternatively may be used.
[0008] The simulated rifle unit
104 includes a trigger sensor
110, a scope orientation tracker
106, and a near-to-eye display system
112 with adjustment knobs
108. In one embodiment of this invention, the main structure of the simulated rifle unit
104 is constructed of materials similar to those of an actual sniper rifle, such that
the training simulations are as realistic as possible. Alternatively, the simulated
rifle unit
104 does not need to be in the shape or form of an actual sniper rifle. For instance,
a smaller unit may be appropriate for portability.
[0009] Determining the point-of-aim for a target, which is a core aspect of sniper skills
training, is performed through the use of a scope. The near-to-eye display system
112, which is in the shape of a scope, is mounted on the simulated rifle structure, similar
to how scopes are mounted on the top of actual sniper rifles. Looking into the near-to-eye
display system
112, the trainee will see a high-resolution, realistic training environment. Incorporated
in the near-to-eye system
112 are adjustment knobs
108 for simulated windage control, elevation control, and parallax compensation. The
scope orientation tracker
106 senses the orientation of the simulated rifle unit. Information from this sensor, along
with input via the adjustment knobs
108, is processed by the trainer computer station
102 and the image in the near-to-eye display system
112 is updated in real-time. The trigger sensor
110 is excited when the trainee has determined the point-of-aim for the target in the
scenario and takes the shot.
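By way of illustration only, the following Python sketch shows one way the inputs described above (orientation, windage, elevation, parallax, and trigger) could be collected into a single state snapshot for the trainer computer station. The class and method names, and the sensors object, are assumptions made for this sketch and are not part of the described hardware.

```python
# Illustrative only: a hypothetical snapshot of the simulated rifle unit's inputs.
# The `sensors` object and its methods (read_orientation, read_knob, trigger_active)
# are assumed placeholders for the actual hardware interface module.
from dataclasses import dataclass

@dataclass
class RifleUnitState:
    yaw_deg: float         # pointing direction from the scope orientation tracker
    pitch_deg: float
    windage_clicks: int    # adjustment knob positions
    elevation_clicks: int
    parallax_setting: float
    trigger_pressed: bool

def poll_rifle_unit(sensors) -> RifleUnitState:
    """Collect one snapshot of all simulated rifle unit inputs."""
    yaw, pitch = sensors.read_orientation()
    return RifleUnitState(
        yaw_deg=yaw,
        pitch_deg=pitch,
        windage_clicks=sensors.read_knob("windage"),
        elevation_clicks=sensors.read_knob("elevation"),
        parallax_setting=sensors.read_knob("parallax"),
        trigger_pressed=sensors.trigger_active(),
    )
```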
[0010] Figure 2 is a block diagram showing the software architecture
200 of the sniper training system, according to one embodiment of the invention. The
entire architecture can be divided into two sections - a simulated rifle unit
244 and a simulation hosting trainer computer
246.
[0011] The simulated rifle unit
244 includes a near-to-eye display
236 and a hardware interface module
238. The hardware interface module
238 includes a scope orientation tracker
204, a windage input
206, an elevation input
208, a parallax input
210, and a trigger input
212, all connected to the simulation hosting trainer computer
246 via a hardware interface
248.
[0012] The simulation hosting trainer computer system
246 includes a computer display
202, a trainer user interface module
240, a 3D rendering engine
242, a maps database
232, a scenarios database
234, and a data storage
226.
[0013] The trainer user interface module
240 and 3D rendering engine
242 execute method steps, as shown in Figure 2. The trainer user interface module
240 includes a map, scenario and conditions selection step
214, an initiate step
216, a monitor hardware step
218, a record and log data step
220, a calculate ballistic physics step
222, and an after action review (AAR) step
224. The 3D rendering engine
242 includes a load map and scenario step
228 and a simulation rendering step
230.
[0014] The data flow within the software architecture
200 begins with the maps, scenario and conditions selection
214. Here a trainer or trainee can select from the geographical maps database
232 and scenarios database
234. Geographical maps can be representations of actual locations, or virtually created
environments. Scenarios will include different target situations, such as riding in
a moving vehicle or standing at a podium, and different target behaviors. Entry-level
scenarios may include static targets whereas high-level scenarios may include targets
moving in less predictable ways. Conditions that can be selected will include weather
elements that can affect the trajectory of a bullet, such as wind, rainfall, and
elevation. Once these elements are selected, the training session will begin.
An alternative to trainer or trainee selection of scenarios or conditions is random
scenario generation. The sniper training system can generate random or semi-random
scenarios that are increasingly difficult, based on the trainee's prior performance.
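As an illustrative sketch of the random and semi-random scenario generation mentioned above, the following Python fragment scales scenario difficulty with the trainee's prior hit ratio. The Scenario fields, the difficulty scaling, and the numeric ranges are assumptions chosen for the example, not requirements of the system.

```python
# Illustrative only: semi-random scenario generation scaled by prior performance.
import random
from dataclasses import dataclass

@dataclass
class Scenario:
    map_name: str
    target_behavior: str       # e.g. "static", "walking", "moving vehicle"
    wind_mps: float
    rainfall_mm_per_h: float
    range_m: float

def generate_scenario(prior_hit_ratio: float, maps: list[str]) -> Scenario:
    """Better prior performance yields longer ranges, stronger wind, and
    less predictable target behavior."""
    difficulty = min(max(prior_hit_ratio, 0.0), 1.0)
    behaviors = ["static", "walking", "moving vehicle"]
    behavior = behaviors[min(int(difficulty * len(behaviors)), len(behaviors) - 1)]
    return Scenario(
        map_name=random.choice(maps),
        target_behavior=behavior,
        wind_mps=random.uniform(0.0, 4.0 + 8.0 * difficulty),
        rainfall_mm_per_h=random.uniform(0.0, 10.0 * difficulty),
        range_m=random.uniform(200.0, 300.0 + 900.0 * difficulty),
    )
```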
[0015] The initiate step
216 activates the hardware interface module
238 in the simulated rifle unit
244 and loads the map and scenario
228 that was previously selected into the 3D rendering engine
242. The 3D rendering engine can be a commercially available graphics engine such as the
Unreal Engine 3 by Epic Games. In addition, foliage and groundcover can be rendered
using programs such as the SpeedTree Application Programming Interface. Target models
can be created using applications such as 3D Studio Max or Maya and imported into
the user-set virtual training environment. Most graphics engines, including the aforementioned
Unreal Engine 3, offer robust artificial intelligence that closely approximates human
target behaviors. With the provided map and scenario information and selected conditions,
simulation rendering
230 occurs and the resulting image is presented to the trainee via the near-to-eye display
236. The image presented through the near-to-eye display
236 includes a reticle, which is standard in optical scope eyepieces. For versatility,
various types of reticles, such as the army reticle, the marine reticle, or the Horus
reticle, can be made available for selection. The goal is to produce a realistic environment
for the trainee when looking into the near-to-eye display
236. The rendered image will also be presented on the computer display
202.
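As a minimal sketch of the reticle selection described above, the following Python fragment overlays one of several reticle textures onto the rendered scope image. The texture paths and the overlay_fn compositing call are assumptions standing in for whatever the chosen graphics engine actually provides.

```python
# Illustrative only: selecting a reticle overlay for the near-to-eye display image.
RETICLE_TEXTURES = {
    "army": "reticles/army.png",     # hypothetical asset paths
    "marine": "reticles/marine.png",
    "horus": "reticles/horus.png",
}

def compose_scope_view(rendered_frame, reticle_name: str, overlay_fn):
    """Blend the selected reticle texture over the rendered simulation frame.
    `overlay_fn` stands in for the graphics engine's compositing call."""
    return overlay_fn(rendered_frame, RETICLE_TEXTURES[reticle_name])
```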
[0016] Based on what is seen in the near-to-eye display
236, the trainee will make adjustments to elements in the hardware interface module
238. Moving the simulated rifle unit
244 will prompt input from the scope orientation tracker
204. The scope orientation tracker
204, which provides input to the hardware interface
248, calculates the direction in which the simulated rifle unit
244 is pointed. The hardware interface
248 sends the acquired information to the monitor hardware
218, which relays the information for simulation rendering 230 to provide an updated image
on the near-to-eye display 236 and the computer display
202 in real-time. Similar steps are executed in response to windage input
206, elevation input
208, and parallax input
210. Windage adjustment is the side-to-side adjustment of the scope on a rifle to account
for the side force on a projectile from a crosswind. Elevation adjustment is the
up-and-down adjustment of the scope on a rifle to account for bullet drop as a result
of gravity, which varies depending on the distance to the target. Parallax compensation
accounts for aiming errors caused by positioning one's eye at different angles against
the rifle scope. This error is often magnified when the target is moving.
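The real-time update loop described in this paragraph could, purely as a sketch, look like the following. The read_state, engine, and display objects are hypothetical stand-ins for the hardware interface, the 3D rendering engine, and the near-to-eye and computer displays.

```python
# Illustrative only: the monitor-hardware loop that re-renders the scope view
# whenever the rifle unit's orientation or knob settings change.
def run_monitor_loop(read_state, engine, display, stop_event):
    while not stop_event.is_set():
        state = read_state()                    # orientation, knobs, trigger
        frame = engine.render_scope_view(
            yaw=state.yaw_deg,
            pitch=state.pitch_deg,
            windage_clicks=state.windage_clicks,
            elevation_clicks=state.elevation_clicks,
            parallax=state.parallax_setting,
        )
        display.show(frame)                     # near-to-eye and computer display
        if state.trigger_pressed:
            break                               # hand off to the ballistics step
```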
[0017] This process continues in a loop until the trainee has determined the point-of-aim
and is ready to take the shot. Each adjustment can be recorded and logged
220, then stored
226 for later review and used to improve the trainee's process for determining the point-of-aim.
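One simple way to record and log each adjustment for later review, shown only as a sketch, is to append timestamped entries to a log file. The JSON-lines format is an assumption made for the example; the system only requires that adjustments be recorded, logged, and stored.

```python
# Illustrative only: timestamped logging of each scope adjustment for later review.
import json
import time

def log_adjustment(log_file, kind: str, value: float) -> None:
    entry = {"t": time.time(), "adjustment": kind, "value": value}
    log_file.write(json.dumps(entry) + "\n")

# Usage sketch:
#   with open("session.log", "a") as f:
#       log_adjustment(f, "windage_clicks", 3)
```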
[0018] When the point-of-aim has been determined, the trainee will activate the trigger sensor,
and a trigger input
212 will be sent to the monitor hardware
218 via the hardware interface
248. This trigger input
212 will initiate ballistic physics calculation
222 to simulate the bullet path and result of the shot. The simulated firing event will
be rendered
230 and be shown in real-time on the computer display
202 and the near-to-eye display
236. The rendered real-time image will be similar to the result of an actual firing of
a rifle, including the quickly disappearing bullet trace.
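The ballistic physics calculation can be arbitrarily detailed; as a deliberately simplified sketch, the following point-mass model estimates bullet drop and wind drift at the target distance, ignoring drag and other effects the actual calculation would include.

```python
# Illustrative only: a drag-free point-mass estimate of drop and wind drift.
G = 9.81  # gravitational acceleration, m/s^2

def point_of_impact(range_m: float, muzzle_velocity_mps: float,
                    crosswind_mps: float) -> tuple[float, float]:
    """Return (vertical_drop_m, lateral_drift_m) at the target distance."""
    time_of_flight = range_m / muzzle_velocity_mps
    drop = 0.5 * G * time_of_flight ** 2      # drop due to gravity
    drift = crosswind_mps * time_of_flight    # crude full-value crosswind drift
    return drop, drift

# Example: an 800 m shot at 850 m/s in a 4 m/s crosswind gives roughly
# 4.3 m of drop and 3.8 m of drift before any scope corrections.
```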
[0019] The resulting simulated firing event will be recorded and logged
220, stored
226, and immediately available for AAR
224. Through graphics rendering
230 and display on the computer display
202, the trainee and/or trainer can have a closer and more detailed look at the firing
event that has just occurred, providing immediate feedback.
[0020] In addition to providing the trainee and/or trainer with feedback in the form of
a rendering of the firing event, the training system can also generate recommended
scope adjustments and/or points of aim for the scenario. Recommended scope adjustments
and points of aim allow the trainee to compare his/her process to an optimized process.
The recommended scope adjustments are an optimized series of adjustments that allows
a sniper to quickly determine a point of aim. The recommended point of aim can be
generated either from the trainee's resulting scope adjustments or from the recommended
scope adjustments, and can be shown on the display in the form of an indicator such
as a red dot or a red crosshair.
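As a sketch of how recommended scope adjustments could be derived, the drop and drift from the simplified ballistic example above can be converted into turret clicks. The 0.1 milliradian click value is an assumption about the simulated scope, not a requirement of the system.

```python
# Illustrative only: converting simulated drop and drift into recommended
# elevation and windage clicks on an assumed 0.1 mil-per-click scope.
def recommended_clicks(drop_m: float, drift_m: float, range_m: float,
                       click_mil: float = 0.1) -> tuple[int, int]:
    """Return (elevation_clicks_up, windage_clicks_into_the_wind)."""
    drop_mil = drop_m / range_m * 1000.0    # 1 mil subtends 1 m at 1000 m
    drift_mil = drift_m / range_m * 1000.0
    return round(drop_mil / click_mil), round(drift_mil / click_mil)

# For the 800 m example above, about 5.4 mil of elevation and 4.7 mil of
# windage, i.e. roughly 54 and 47 clicks.
```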
[0021] Figure 3A is a flow diagram
300 showing the interaction between a trainee
302 and a sniper training system
304 during a training session, according to one embodiment of the invention. The different
components include the trainee
302; the sniper skills training system
304, which further includes a near-to-eye display
306, scope adjustments
308, a trigger sensor
314, and training session results
310; and a training scenario
312 input.
[0022] First, the training scenario
312 is selected and loaded into the sniper training system
304. The trainee
302 will now be able to look into the near-to-eye display
306 and begin to determine the best point-of-aim for the target in the loaded scenario
by making simulated scope adjustments
308. The display
306 is updated in accordance with the adjustments
308. The trainee
302 will continue this process to make adjustments to determine the best point-of-aim.
When the point-of-aim to take the shot is determined, the trainee
302 will excite the trigger sensor
314. The result of the shot is then simulated and stored in the training session results
310. After the shot is taken, the trainee can continue to make adjustments and take additional
shots within the loaded scenario. The trainee
302 can look at the output of the training session results
310 either immediately after the shot, or after all the training scenarios have been
completed.
[0023] Figure 3B is a flow diagram
350 showing the interaction between a trainee
352 and a sniper training system
354 during a training session with the help of a trainer
362, according to one embodiment of the invention. The sniper training system in this
embodiment includes a near-to-eye display
356, a computer display
366, scope adjustments
358, a trigger sensor
364, and training session results
360.
[0024] The trainer
362 can select the scenario for the session. While the trainee
352 goes through the process of determining a point-of-aim for the target, the trainer
362 can monitor the trainee's selections through the computer display
366 and has the option to provide real-time feedback to the trainee
352. At the end of the training session, the trainer
362 can review the training session results
360 and provide feedback to the trainee
352 based on those results.
[0025] Figure 4 is a pictorial diagram showing the setup of a sniper training session
400 using a sniper training system
414, in one embodiment of the invention. The sniper training session
400 shown involves a trainer
402, a trainee
404 and a sniper skills training system
414. The sniper training system
414 includes a simulated rifle unit
406, the trainer computer station
408, a connection
412 from the computer station
408 to the rifle unit
406, and a connection
410 from the rifle unit
406 to the computer station
408. While wired communications
410 and
412 are shown, the connections
410 and
412 may alternatively be wireless.
[0026] The trainer
402 operates the computer station
408 portion of the sniper training system
414, while the trainee
404 operates the simulated rifle unit
406 portion of the sniper training system
414. One alternative training session setup may involve only a trainee operating both
the computer station and the simulated rifle unit.
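Purely as a sketch, the wired or wireless connection between the rifle unit and the computer station could carry fixed-format state packets such as the following. The field layout and the use of UDP are assumptions made for the example.

```python
# Illustrative only: one possible packet format for rifle unit state updates.
import socket
import struct

PACKET_FMT = "<ffiif?"  # yaw, pitch, windage, elevation, parallax, trigger

def send_state(sock: socket.socket, addr, state) -> None:
    payload = struct.pack(
        PACKET_FMT,
        state.yaw_deg, state.pitch_deg,
        state.windage_clicks, state.elevation_clicks,
        state.parallax_setting, state.trigger_pressed,
    )
    sock.sendto(payload, addr)

# Usage sketch:
#   sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
#   send_state(sock, ("192.168.0.10", 5005), state)
```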
1. A sniper skills training system, comprising:
a computer trainer station having a 3D rendering engine, a computer display, a trainer
user interface module, at least one database, and a data storage; and
a simulated rifle unit having a near-to-eye display and a hardware interface module,
wherein the trainer station is interfaced with the simulated rifle unit to provide
a simulated training scenario.
2. The sniper skills training system in Claim 1, wherein the at least one database comprises
a maps database.
3. The sniper skills training system in Claim 1, wherein the at least one database comprises
a scenarios database.
4. The sniper skills training system in Claim 1, wherein the at least one database comprises
a maps database and a scenarios database.
5. The sniper skills training system in Claim 1, wherein the data storage is a portable
storage device.
6. The sniper skills training system in Claim 1, wherein the hardware interface module
is a wireless interface module.
7. The sniper skills training system in Claim 1, wherein the simulated rifle unit is
constructed to physical specifications of an actual rifle.
8. The sniper skills training system in Claim 1, wherein the computer trainer station
is a laptop computer.
9. The sniper skills training system in Claim 1, wherein the near-to-eye display contains
a reticle.
10. The sniper skills training system in Claim 9, wherein the reticle is chosen from the
group consisting of army reticle, marine reticle, Horus reticle, German reticle, SVD
type reticle, mil-dot reticle, and range-finding reticle.