Links to Individual Optic Test Results:
Thoughts on the Drill Itself
I’ve come to refer to this drill as the X-Box. As I described before, this test uses directional transitions to test the friendliness of the optic to target acquisition. Two target stands are placed approximately 7 yards apart. Each target stand is 8′ tall and has a target at the bottom and a target near the top. The shooter stands equidistant from each stand so that they are both approximately 10 yards from the shooter.
Although the distance of this drill is not all that different from test #1, which is at 7 yards, the drill itself is qualitatively quite different. While test #1 occurs over a very short duration, is a very simple shooting task, and puts 100% of the score into the outcome of 1 shot, the X-Box drill is more complex (I would call it ‘moderate’). Test #2 occurs over a longer duration with more shots, which lets the shooter move past the mental game of waiting on the timer, pre-planning, and so on. Since the drill has more rounds, it is less sensitive to individual mistakes and provides a bigger picture of performance.
I would have liked to have the targets presented in a perfect square, but 21′ target stands are not realistic, and it was important to me to have a wide area to traverse between the lateral targets. I wanted the transition to be significant, and not simply a matter of picking up something that was already in my scope. Unsurprisingly, the up and down transitions were faster due to the much shorter distance between targets, approximately 6′ versus approximately 21′. Shot #4, the first of the up/down transitions, was always the point in the drill when it felt like ‘it’ took over shooting for me and I just tried to stay out of the way. The same phenomenon is probably what caused me to forget where I was going next in about 1 out of 4 runs.
Average Split Times:
What Makes an Optic do Better in This Drill?
I came into this drill expecting field of view to make a big difference. Here are the published numbers for field of view at the lowest power setting of each optic from the manufacturer’s websites:
I could not find a number for the Aimpoint.
I came out of the tests with a different theory, which I have already discussed a bit in the overview of the SWFA scope. With an unmagnified optic, and shooting with both eyes open, the field of view inside the scope is not exactly what is important. The total field of view is what matters. Essentially, total field of view can be defined as everything in front of the shooter minus whatever the rifle and the non-see-through parts of the scope obstruct. I initially expressed that as: vision = field of view inside the scope + field of view outside the scope - the field of view obstructed by the rifle and scope. Note that the “-” symbol represents a subtraction sign and not a dash. Since then I’ve learned something about reticles and have to conclude that the reticle, although necessary, is essentially obstructive in nature. That makes it necessary to amend the formula to something like this: vision = (field of view inside the scope - that which is obstructed by the reticle) + field of view outside the scope - the field of view obstructed by the rifle and scope.
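The amended formula is easy to sanity-check as a quick calculation. The numbers below are made-up placeholder values for illustration, not measurements from any of the test optics:

```python
# Sketch of the amended total-field-of-view formula. All values are
# made-up placeholders in arbitrary "view area" units, not degrees.
fov_inside_scope = 100        # image area visible through the scope
reticle_obstruction = 5       # part of that image blocked by the reticle
fov_outside_scope = 400       # everything visible around the scope, both eyes open
rifle_scope_obstruction = 60  # view blocked by the rifle and the scope body

vision = ((fov_inside_scope - reticle_obstruction)
          + fov_outside_scope
          - rifle_scope_obstruction)
print(vision)  # 435
```

The point of the sketch is that shrinking any obstruction term (a finer reticle, a slimmer scope body) buys back total vision just as surely as a wider in-scope image does.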
In light of my theory, and speaking of field of view, I should note that I did not shoot this course of fire with any of the variable power optics at the lowest power setting. I have already noted several times that at the lowest power setting the image size appeared to be smaller than what I see with the naked eye. Therefore the optics were “turned up” a bit, in each case somewhere between 1x and 2x. The SR-8c had to be turned up to about 1.6x, the SR-4c barely at all, and the two 1-6x scopes in between those extremes.
In terms of total field of view, I would rank them as follows, as viewed with a perfect sight picture, from best to worst: Z6i, SR-8c, SR-4c, T1, and SWFA. I actually referred to through-the-scope photos to come up with that, although it is still based on my impression. I was surprised to see that the SR-8c was actually better than the SR-4c in that respect, but the SR-4c had a thicker ‘flange’ of material between the in-scope image and the view outside the scope. What likely led to that confusion was performance in another key area: ease of eyebox acquisition.
The eyebox is the area the eye needs to be in to see a perfect edge-to-edge image through the scope. This varies quite a bit between optics, and usually between power settings of any single variable power optic. It is probably an equally important quality in a scope when it comes to transitioning quickly and accurately.
Ideally, the eye discovers a target and acquires it, and the rifle is presented so that the scope image (via the eyebox) arrives at the eye, hopefully presenting an acceptable sight picture. The tighter the eyebox, the more exacting that procedure must be, which can make it slower and more difficult. The better trained the shooter is to acquire a consistent cheekweld, the less of an issue this will likely be, but no one is perfect all the time, and not all positions are as easy as others. I cannot rate the eyeboxes of the test scopes objectively at this time, only from memory. I would rate them as follows, from best to worst: T1, Z6i, SR-4c, SWFA, and SR-8c, with the last two being close to a tie. None of them were bad at all, but the T1, being unmagnified, really didn’t have an ‘eyebox’ per se, and the Z6i is incredible. The SR-4c is quite good as well.
Accuracy and Reliability of the Test
So I went shooting with different optics and kept good track of how I did. How does that make for a meaningful evaluation of an optic? By itself, it doesn’t.
Like rifle shooting, a good test should be accurate, meaning that it should shed light on a particular aspect of performance of the tester’s choosing. Since this drill involved transitions, it should be a good indicator of how the optic affected the shooter’s ability to perform transitions with that optic.
Another important quality in a test is reliability of the data. How much confidence can be placed in the findings? I have already pointed out that it would have been a lot better with a large number of shooters doing the same drill, as this would have been more likely to provide results useful for the average shooter. In lieu of having a large number of shooters, I had to rely on my own consistency in shooting and ability to analyze what I experience. In this test there was a consistency issue, but I was very aware of it as it occurred.
The Beginning and the End of My Time on the Plateau
This test involved hitting a plateau in the middle of the testing and moving out of it just at the end. The first optic I tested in this drill coincided with the first time I shot the drill. That was not a great plan in hindsight, but I figured at the time that the only times I would shoot the drill would be in testing. I felt like the infrequency of shooting this drill would preclude an increase in skill. I was wrong.
You may recall that in the Z6i test I remarked, “On runs 1 and 2 I turned in good times, but nothing out of the ordinary. Run 3 felt normal but was significantly faster for me. On run 4 I could feel that I was moving at a comparatively smoking hot pace…” This was an instance of learning taking place. That individual test affected the rest of the optics in this test through the T1, which was the second to last.
I left the plateau the day that I re-tested the SR-8c. I believe this can be explained by the fact that it was the last optic in the test that I had to do ‘work’ on and collect numbers for. As this process was new to me, and at that point I had a lot of numbers I wasn’t sure how I was going to best interpret, it was no small relief to be done with the actual tests. I was beginning to relax. You can actually see that in the results from Test #1, where my hit rate came in a little lower than normal. Since the tests are qualitatively different, they demanded different levels of focus to shoot well (intensity vs. open focus), and I think that is what allowed me to do so much better in the SR-8c retest.
Average total time
In the previous drill I think that the time was a better indicator than the hits. In this drill I don’t think that I can assign more value to either, but will still present each as an average total time for a single run with each optic.
Taking the “plateau graph” into consideration when looking at this graph, what can be clearly stated is that the Aimpoint was fast and the SWFA was slow in comparison to the others. I would also speculate that since the Z6i was basically on the verge of the plateau as I entered it, it would be running closer in time to the SR-4c, but I don’t know if it would match or surpass it.
The SR-8c is an interesting case, as it came in last in the initial test and a close second in the retest. I would say that it would probably run a close third behind the Z6i and SR-4c.
Aimpoint T1 Micro
U.S. Optics SR-8c Retest
The graphs are interesting to me, because they make it so much easier to see things that I was not all that aware of before, even though I had the results on a spreadsheet. The adaptation that I referenced earlier with respect to my ability to shoot this drill well pertained mostly to speed. It would also seem clear that this graph somewhat coincides with the “plateau graph” above, the notable exceptions being the T1 and SR-4c, which were among the fastest. With the T1 especially, I believe that the relative lack of ability to see hampered my efforts in this drill.
Remember that I was puzzled by the overall lackluster performance of the Swarovski Z6i in Test #1. After a lot of guessing, I made a comment, “The other thing I have wondered is if I could just see better and it caused me to be more discriminating in my decision to fire.” I think that is close, but not quite accurate. I think it would be more accurate to say that each shooting problem demands a certain amount of visual information. It might seem like more is always better in terms of the speed of observation to action (Boyd’s cycle would be a better way to understand that process), but I don’t think that is the case. As with many other things, I think that precisely enough information makes for the most streamlined action cycle arriving at an appropriate solution. Even in test #1, while the T1 made a very respectable showing as far as hit rates, the hits were not all of as high a quality as with the other optics in terms of points, which took the quality of the hits into account. As the distance is increased and the complexity of the problem made “deeper,” the requirement for information intensifies. I believe this is why the T1 began to show its deficiencies here in comparison to the other optics.
Remember that the SWFA scope was given an accuracy boost due to my taking part in some training in the days preceding these tests with that scope. The training addressed shooting problems similar to these. It could not be helped, and I mention it only to aid your interpretation of the results.
I should point out the obvious in saying that the ~4.2” targets are smaller than necessary for most people’s requirements for hitting the heart/lung vital zone of a large animal. If that is the case for you, it would probably be safe to double the distances I worked at.
Average Total Points
The points measure is similar to the hit ratio, except it places more weight on a center hit than an edge hit. It also awards 1 point for a shot in the larger circle outside the primary target. Note that it would take several hits nearer to the center to make up for one single-point shot. The red dashed line in the graph represents the score if all nine shots scored the minimum for a hit, which is 5 points per shot.
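As a sketch of how such a points tally works: the 5-point minimum hit and the 1-point outer circle come from the description above, while the 10-point center value and the run itself are made-up placeholders, not the actual scoring table or results:

```python
# Hypothetical scoring table: "edge" (5) and "outer_circle" (1) match the
# text; "center" (10) is an assumed value for illustration only.
POINTS = {"center": 10, "edge": 5, "outer_circle": 1, "miss": 0}

def run_score(shots):
    """Total points for one run, given a list of shot results."""
    return sum(POINTS[s] for s in shots)

# A made-up 9-shot run:
shots = ["center", "edge", "edge", "outer_circle", "center",
         "edge", "miss", "edge", "center"]
print(run_score(shots))    # 51
print(9 * POINTS["edge"])  # 45, the red dashed "all minimum hits" line
```

Swapping in the real ring values would change the exact trade-offs; the structure of the tally is the point here.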
Hits Per Second
Finally it becomes apparent how good my last runs with the SR-8c were in comparison to the others. I think the graph illustrates how the U.S. Optics scopes were efficient performers. Taking my “plateau graph” into account, I would say that the Z6i was right up there as well. I already remarked in the SWFA individual test results how the reticle was just overwhelming my ability to receive the information I needed. With the Aimpoint I just couldn’t see as well.
Points Per Second
In this measure the SR-4c again shows its dominance. It just had some better hits, although its hit ratio was not as high as that of the SR-8c retest.
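Both efficiency measures are simple ratios of a run total to the run time. As a sketch, with made-up run numbers rather than actual test results:

```python
# Sketch of the two efficiency measures. The run data below are
# made-up placeholders, not results from the tests.
def hits_per_second(hits, total_time):
    return hits / total_time

def points_per_second(points, total_time):
    return points / total_time

# A hypothetical 9-shot run: 8 hits and 58 points in 7.25 seconds.
print(round(hits_per_second(8, 7.25), 2))    # 1.1
print(round(points_per_second(58, 7.25), 2)) # 8.0
```

Both measures fold speed and accuracy into one number, which is why they complement the raw time and point averages.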
In looking at the totality of the tests, if I had to actually put money down on the optimum optic for this application, multiple targets at relatively close range, I would say that the Swarovski Z6i probably has the best balance of attributes to allow the shooter to work. I think that the two U.S. Optics scopes come very close to the Z6i in performance, with the SR-4c having a slight edge over the SR-8c.