Researcher Connects Playing Shooters With Better Aim

BENZOOKA

This is the most wittiest title
Oct 26, 2009
3,920
0
0
If accuracy is the issue now, then shouldn't they constrain all competitive shooters immediately?
 

Warped_Ghost

New member
Sep 26, 2009
573
0
0
http://files.sharenator.com/And_not_one_single_fuck_was_given_that_day_RE_Picture_Challenge_5_A_3_way_tie-s720x449-162317.jpg
 

Treblaine

New member
Jul 25, 2008
8,682
0
0
Good sample size, but a few things are missing for this:
-Group is large, but what about the sample size per individual: how many shots did each person fire?
-Peer review (to make sure it's all above board)
-Double blinding (yes, even in an objective test, the score counters cannot know who played the games)
-Control group (it may be that people who are better at aiming are more likely to play games)
-Repeatability (can someone else repeat the outcome?)

Also, some controls: how about comparing those who "train" on a light gun game against another group that trains with an actual gun? How do they compare then?

If it is a light gun controller (the pistol-shaped pointing device used in arcade shooters), then they will have had SOME training with a "gun", versus the control group, who have likely never aimed to hit anything in their lives.

And what if, for the "testing", they used a very light-recoiling weapon like the most common self-loading rifle you can find, the Ruger 10/22? It has minuscule recoil, about equivalent to tapping the end of the barrel with a ball-peen hammer.

PROPOSAL FOR FURTHER INVESTIGATION:

Repeat this experiment with another control group:

Group 1: Spends 20 minutes sitting reading a magazine before the test
Group 2: Spends 20 minutes playing a light gun game before the test
Group 3: Spends 20 minutes shooting at cans with the same rifle they are about to use in the test

If Group 2 performs best, that might be cause for concern. But if Group 3 performs best, that just indicates the banal fact that the more you repeat a task, the better you get at it.

After all, who gets a gun and doesn't spend any time shooting it? Breivik (the Norway killer) stated at his trial that he became a good shot with his rifle by training with it on the range, and that the games he played were just time-wasting hobbies.
 

Daaaah Whoosh

New member
Jun 23, 2010
1,041
0
0
I played some laser tag a few weeks ago, and I was surprised at just how accurate I was, since I'd never shot a gun before. Of course, I was ironsighting the whole time, and laser guns have no recoil, but still. Those kids were dead before they could even cry about it.
 

Treblaine

New member
Jul 25, 2008
8,682
0
0
Baresark said:
Blablahb said:
Well put. Though, you did forget one vital part of the information. At no point would a sample of 151 people be large enough to indicate a single thing. The anecdotal "rule of small numbers" dictates that you are far more likely to get an extreme sampling of data from such a small group. This study should have been conducted with thousands of participants in much more controlled settings (as you so eloquently outlined for everyone).
Uh, the sample size is large enough; how could you get much more than 150 people for a test like this? Even divided into two groups, 75 data points is enough to demonstrate a significant trend.

The real issue with sample size is the number of samples per individual. If they took only 3-4 shots each, that doesn't say much, especially if, having played Resident Evil 4, they thought they'd get more points for aiming for the head; that is not a matter of ability, simply a matter of suggestion. They thought they SHOULD go for the head, not that they COULD.

Were they ALL told to aim for the head for more points in the testing stage? If not - and Resident Evil 4 did implicitly - then it's nothing but an indication of the power of suggestion.
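
As a rough illustration of why both of those points can be true at once (a back-of-envelope sketch with purely hypothetical numbers, not the study's data): the uncertainty of an observed hit rate scales as sqrt(p*(1-p)/n), so shots pooled across a 75-person group give a fairly precise group rate even when any one person's 3-4 shots are nearly pure noise.

# Back-of-envelope sketch only: hypothetical numbers, not the study's data.
# The standard error of a hit rate estimated from n independent shots is sqrt(p*(1-p)/n).
import math

def std_error(p, n_shots):
    """One standard error of a hit rate estimated from n_shots Bernoulli trials."""
    return math.sqrt(p * (1 - p) / n_shots)

p = 0.5                    # assumed true headshot rate, purely illustrative
per_person_shots = 4       # roughly the "3-4 shots each" scenario
per_group_shots = 75 * 4   # the same shots pooled over a 75-person group

print(f"one person,  {per_person_shots} shots: +/- {std_error(p, per_person_shots):.2f}")
print(f"whole group, {per_group_shots} shots: +/- {std_error(p, per_group_shots):.3f}")
# Roughly +/-0.25 for one person versus +/-0.03 for the pooled group: a group-level
# trend can be real even though 3-4 shots tell you almost nothing about an individual.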
 

Harker067

New member
Sep 21, 2010
236
0
0
Blablahb said:
That study is very likely to be worthless for a number of reasons. His sample among students isn't random, and that sinks the entire experiment.

For a start, psychology as a whole is a 'soft science'. Even its rock-solid conclusions are prone to change and revision because they turn out to be inaccurate.

And to sink the study:
What about students who have experience with actual guns? It was done in the US, after all. He asked about firearms training, but what about other sources of knowledge, like military service, knowledge transferred from relatives, or self-practice? Because I'd 'speculate' that shooting guns makes you better at shooting guns.
And he didn't control for other things that train hand-eye coordination. Archery, for instance: different weapon, basically the same activity.
And what about people being accustomed to other forms of violence? Did he, for instance, check for ring experience in fighting sports? No.

So he used an extremely unreliable sample consisting of one socio-economic group, same education level, only one country, likely extremely biased in ethnic background even. No matter what else you work on after that, from a sample group that biased, no conclusions can possibly be drawn.

His method is also flawed. He used a gun-shaped controller. Wait, hold it a moment there: so he didn't use the normal input device for a game, but instead a gun analogue? That means the increase in accuracy could be caused by using something shaped like a firearm, and not by it being a game.

Also, it says he let students fire actual guns. Wouldn't the increase in willingness to aim at human-shaped objects be caused by handling actual firearms? It's pretty common knowledge that weapons in such a context incite violence by themselves, so again he's got himself an interfering variable that sends his research down the drain.

Unless he's done some serious maths to rule out those interfering variables (which he pretty much can't, given the several other variables and the weakness of his test), all his conclusions are already sunk.

To make it even worse, he put a life-sized human target at 6 metres' distance. Let me tell you, even if you had Parkinson's disease, you could make a headshot on a stationary target at only 6 metres. It's basically point-blank range, at which nobody can miss, and accuracy results count for nothing at all.

Then he made yet another mistake in the number of shots. Six shots. But wait, he's counting hit or missed. That means he's conducting a new test, a binomial chance experiment. Hit or miss. With only 6 shots, it's impossible to draw any conclusions. The absolute lowest limit for drawing conclusions in binomial chance experiments is 30, so the study's conclusions are invalidated because the outcome can be explained by randomness.


Basically, this 'professor' wrote a setup so crappy that if you used it for a bachelor's thesis, your tutor would come down on you like a ton of bricks and give it a failing mark.

So okay, professor does test, proves that randomness exists. Good for him. When is his university going to sack him for disgracing them?
Lancer873 said:
The statistics sound made up (really? 99 percent and 33 percent? That's way too rounded off and way too extreme a difference).
He only let them take 6 shots, meaning a two-shot difference is already a 33% gap. If one non-gamer hits twice and a gamer doesn't miss (and missing is quite bloody hard at such a tiny range), he can already write, in sensationalist style, that 'gamers make three times as many headshots', and his conclusions reek of such unsound assumptions.

But you're right. If he let them take 3 shots, he must have been making up the data, because a hit percentage of 99% is an impossible fraction of those numbers: people can only shoot a whole bullet, not 0.05 of one.

Unless there's a different explanation, that professor has committed fraud.
Go reread the link I posted, or read it in the first place. He let them take 16 shots, not 6, with a sample size of about 30 people in each of 5 groups. That's 480 shots per group, so it's not too hard to imagine 99 and 33 percent. And it sounds like at least some of the factors were controlled for in the study. I don't like it either, but we have to at least try to represent the study properly.
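
For what it's worth, here's a quick back-of-envelope check of that arithmetic (a sketch only: the 30-people-per-group and 16-shot figures are as quoted, the 99% and 33% are the reported rates, and the simple two-proportion z-test on pooled shots is my own rough assumption):

# Sketch only: plugs the quoted figures (about 30 people per group, 16 shots each)
# into a simple two-proportion z-test. Treating every shot as independent is crude,
# but it is enough to show the scale of the difference.
import math

shots_per_group = 30 * 16                       # roughly 480 shots per group
hits_gamers = round(0.99 * shots_per_group)     # the reported 99% headshot rate
hits_others = round(0.33 * shots_per_group)     # the reported 33% headshot rate

p1 = hits_gamers / shots_per_group
p2 = hits_others / shots_per_group
pooled = (hits_gamers + hits_others) / (2 * shots_per_group)
se = math.sqrt(pooled * (1 - pooled) * (2 / shots_per_group))
z = (p1 - p2) / se

print(f"{hits_gamers}/{shots_per_group} vs {hits_others}/{shots_per_group}, z = {z:.1f}")
# z comes out around 20, so with ~480 shots per group a 99%-vs-33% split is nowhere
# near something randomness alone would produce -- whatever else is wrong with the
# design, "only 6 shots" isn't the flaw.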
 

Harker067

New member
Sep 21, 2010
236
0
0
Treblaine said:
Good sample size, but a few things are missing for this:
-Group is large, but what about the sample size per individual: how many shots did each person fire?
-Peer review (to make sure it's all above board)
-Double blinding (yes, even in an objective test, the score counters cannot know who played the games)
-Control group (it may be that people who are better at aiming are more likely to play games)
-Repeatability (can someone else repeat the outcome?)

Also, some controls: how about comparing those who "train" on a light gun game against another group that trains with an actual gun? How do they compare then?

If it is a light gun controller (the pistol-shaped pointing device used in arcade shooters), then they will have had SOME training with a "gun", versus the control group, who have likely never aimed to hit anything in their lives.

And what if, for the "testing", they used a very light-recoiling weapon like the most common self-loading rifle you can find, the Ruger 10/22? It has minuscule recoil, about equivalent to tapping the end of the barrel with a ball-peen hammer.

PROPOSAL FOR FURTHER INVESTIGATION:

Repeat this experiment with another control group:

Group 1: Spends 20 minutes sitting reading a magazine before the test
Group 2: Spends 20 minutes playing a light gun game before the test
Group 3: Spends 20 minutes shooting at cans with the same rifle they are about to use in the test

If Group 2 performs best, that might be cause for concern. But if Group 3 performs best, that just indicates the banal fact that the more you repeat a task, the better you get at it.

After all, who gets a gun and doesn't spend any time shooting it? Breivik (the Norway killer) stated at his trial that he became a good shot with his rifle by training with it on the range, and that the games he played were just time-wasting hobbies.
If you read the link I posted earlier, they talk about the gun they used: an airsoft pistol that they say had recoil similar to a normal handgun.
 

Treblaine

New member
Jul 25, 2008
8,682
0
0
Harker067 said:
If you read the link I posted earlier, they talk about the gun they used: an airsoft pistol that they say had recoil similar to a normal handgun.
Right. An AIRSOFT pistol!?!??

I can see why he hasn't touted peer review of his study: he has clearly weighted this test to denigrate video games by contriving a comparison between a game controller and a "weapon" that are unrealistically similar. Then all he has done is try to paint all games as training you for ALL weapons.

This is not (good) science, given his misrepresentation of the experiment to reach an unsafe conclusion.

Ah, this study looked so promising: a large sample size and practical testing. But clearly this is not science; he has gone out to prove his prejudice. Science is about investigating a topic you are interested in and following where the evidence leads; he is clearly trying to contrive evidence to support a preconceived conclusion.
 

Torrasque

New member
Aug 6, 2010
3,441
0
0
Science, I love you and all, but please stay the hell away from video games.
Or at the very least, get a scientist that plays a lot of video games to do these studies involving video games.

There are just way too many factors in this study that Professor Bushman hasn't accounted for, and would not know to factor in, for me to even try to take it seriously. Now if you'll excuse me, I am going to play some Search and Destroy to make my IRL bomb-defusing skills go up.
 

Baresark

New member
Dec 19, 2010
3,908
0
0
Treblaine said:
Baresark said:
Blablahb said:
Well put. Though, you did forget one vital part of the information. At no point would a sample of 151 people be large enough to indicate a single thing. The anecdotal "rule of small numbers" dictates that you are far more likely to get an extreme sampling of data from such a small group. This study should have been conducted with thousands of participants in much more controlled settings (as you so eloquently outlined for everyone).
Uh, the sample size is large enough; how could you get much more than 150 people for a test like this? Even divided into two groups, 75 data points is enough to demonstrate a significant trend.

The real issue with sample size is the number of samples per individual. If they took only 3-4 shots each, that doesn't say much, especially if, having played Resident Evil 4, they thought they'd get more points for aiming for the head; that is not a matter of ability, simply a matter of suggestion. They thought they SHOULD go for the head, not that they COULD.

Were they ALL told to aim for the head for more points in the testing stage? If not - and Resident Evil 4 did implicitly - then it's nothing but an indication of the power of suggestion.
That is arguable. But you can't really have both. If they were to let people take 100 shots, for example, the data from that sample would be much better. But then you are also giving people more time to become proficient with a firearm, which could still skew the data. And you still aren't testing enough individuals. People are different. You cannot sample such a small portion of the population and make a broad-spectrum analysis. There are all kinds of things wrong with this study, and the chief one (in my opinion) is that the sample should be larger, especially if people only get a few shots. More shots would be nice, but there is an issue with that as well.

When push comes to shove, a day's testing would never be enough. The exact same experiment would have to be repeated with more people to get enough data to even start to draw a conclusion. 75 data points is not enough when making a broad-spectrum statement like "playing shooters makes people better at shooting firearms". Any scientific study that holds any merit at all is usually done over the course of months or even years, with every possible sample group they can include. This is just bad science, for many reasons.

Rant time:

That is the one flaw of the information age, it seems: everyone does studies and publishes them like they mean anything. Daniel Kahneman, Michael S. Gazzaniga, Leonard Mlodinow... these people spent years coming up with their conclusions. David Eagleman did a study on a single subject, but for two long years before he even considered coming up with a conclusion. The study only ended because the student graduated, and they did extensive background testing on the subject before it even began. This guy spent a day, maybe a week at most, and tells us he has conclusive data from a sampling that is far too small, from an experiment so sloppy it shouldn't warrant page time anywhere. I once heard a guy tell me he read a study saying that too much vitamin C can increase your chances of heart disease, and the reason he trusted it was the source. Now we get college professors who say one thing that is counterintuitive to many, many studies, and people actually believe it because of the source. It literally makes me rage.

End rant, sorry.
 

Treblaine

New member
Jul 25, 2008
8,682
0
0
Baresark said:
That is arguable. But you can't really have both. If they were to let people take 100 shots, for example, the data from that sample would be much better. But then you are also giving people more time to become proficient with a firearm, which could still skew the data. And you still aren't testing enough individuals. People are different. You cannot sample such a small portion of the population and make a broad-spectrum analysis. There are all kinds of things wrong with this study, and the chief one (in my opinion) is that the sample should be larger, especially if people only get a few shots. More shots would be nice, but there is an issue with that as well.

When push comes to shove, a day's testing would never be enough. The exact same experiment would have to be repeated with more people to get enough data to even start to draw a conclusion. 75 data points is not enough when making a broad-spectrum statement like "playing shooters makes people better at shooting firearms". Any scientific study that holds any merit at all is usually done over the course of months or even years, with every possible sample group they can include. This is just bad science, for many reasons.

Rant time:

That is the one flaw of the information age, it seems: everyone does studies and publishes them like they mean anything. Daniel Kahneman, Michael S. Gazzaniga, Leonard Mlodinow... these people spent years coming up with their conclusions. David Eagleman did a study on a single subject, but for two long years before he even considered coming up with a conclusion. The study only ended because the student graduated, and they did extensive background testing on the subject before it even began. This guy spent a day, maybe a week at most, and tells us he has conclusive data from a sampling that is far too small, from an experiment so sloppy it shouldn't warrant page time anywhere. I once heard a guy tell me he read a study saying that too much vitamin C can increase your chances of heart disease, and the reason he trusted it was the source. Now we get college professors who say one thing that is counterintuitive to many, many studies, and people actually believe it because of the source. It literally makes me rage.

End rant, sorry.
Agreed that his conclusion is wrong (that seems to be the main problem with this study), but you can conclude SOMETHING from only 150 data points (75 per group, or whatever), even if that conclusion is only indicative and suggests more studies should be done. You can't dismiss a 150-individual study out of hand.

150 individuals is certainly a good place to start. But my problem with this is comparing a "light gun" type controller, which is rarely used in games, against a test with a very similar airsoft pistol... and I don't think anyone has ever been killed by an airsoft pistol, except possibly by being beaten to death with one. So the study is meaningless just from that. It doesn't compare general computer game use (aiming with a mouse or thumbstick) against typical firearms use, such as a 9mm pistol or at the very least a .22LR calibre rifle.
 

Baresark

New member
Dec 19, 2010
3,908
0
0
Treblaine said:
Baresark said:
Agreed that his conclusion is wrong (that seems to be the main problem with this study), but you can conclude SOMETHING from only 150 data points (75 per group, or whatever), even if that conclusion is only indicative and suggests more studies should be done. You can't dismiss a 150-individual study out of hand.

150 individuals is certainly a good place to start. But my problem with this is comparing a "light gun" type controller, which is rarely used in games, against a test with a very similar airsoft pistol... and I don't think anyone has ever been killed by an airsoft pistol, except possibly by being beaten to death with one. So the study is meaningless just from that. It doesn't compare general computer game use (aiming with a mouse or thumbstick) against typical firearms use, such as a 9mm pistol or at the very least a .22LR calibre rifle.
I can agree with that. Sometimes I'm outspoken and use the wrong language; that's my bad. I agree completely that it certainly does lend itself to more study. I said that you can draw no conclusion from it, and that was stupid of me to say.
 

Treblaine

New member
Jul 25, 2008
8,682
0
0
Torrasque said:
Science, I love you and all, but please stay the hell away from video games.
Or at the very least, get a scientist that plays a lot of video games to do these studies involving video games.

There are just way too many factors in this study that Professor Bushman hasn't accounted for, and would not know to factor in, for me to even try to take it seriously. Now if you'll excuse me, I am going to play some Search and Destroy to make my IRL bomb-defusing skills go up.
Don't fear science, and this IS NOT science by the way. Science is FAR more rigorous than this.

He doesn't need to play video games, but he does need to study them. The great issue here is his conceit about guns; his worst deception is performing this test with a low/zero-recoil airsoft pistol as the "firearm", which is ridiculous, as I don't think anyone has ever been killed with an airsoft pistol. Though it IS the thing most similar to a light gun.
 

Torrasque

New member
Aug 6, 2010
3,441
0
0
Treblaine said:
Torrasque said:
Science, I love you and all, but please stay the hell away from video games.
Or at the very least, get a scientist that plays a lot of video games to do these studies involving video games.

There are just way too many factors in this study that Professor Bushman hasn't accounted for, and would not know to factor in, for me to even try to take it seriously. Now if you'll excuse me, I am going to play some Search and Destroy to make my IRL bomb-defusing skills go up.
Don't fear science, and this IS NOT science by the way. Science is FAR more rigorous than this.

He doesn't need to play video games, but he does need to study them. The great issue here is his conceit about guns; his worst deception is performing this test with a low/zero-recoil airsoft pistol as the "firearm", which is ridiculous, as I don't think anyone has ever been killed with an airsoft pistol. Though it IS the thing most similar to a light gun.
It's science light?!
Half the facts, and twice the "what the fuck is this shit"!
 

Micalas

New member
Mar 5, 2011
793
0
0
BrotherRool said:
[/mild ranty thing]
Really? I thought the things about shooting guns were adapting for range, dealing with recoil, holding it properly, sighting properly. I wouldn't have expected it to help much at all.



99% is huge. I'm really interested in this and in its legitimacy, because if it holds up it's got some big implications. I mean, he's suggesting, I don't know, that if someone were to do 20 minutes on a shooter before going out they'd become twice as accurate. If you went out hunting, you'd hit twice as many birds.

It doesn't take too long for someone who's played a ton of FPSes to get used to those things, in my opinion. At least it didn't in my case: the first time I ever fired a real gun, I was getting bullseyes at 300 feet by the end of the first box of shells...

It takes a bit more setup and patience to do it for real, but it's still point and click.
 

BrotherRool

New member
Oct 31, 2008
3,834
0
0
This is a more detailed (and slightly more impartial) article.

Micalas said:
BrotherRool said:
[/mild ranty thing]
Really? I thought the things about shooting guns were adapting for range, dealing with recoil, holding it properly, sighting properly. I wouldn't have expected it to help much at all.



99% is huge. I'm really interested in this and in its legitimacy, because if it holds up it's got some big implications. I mean, he's suggesting, I don't know, that if someone were to do 20 minutes on a shooter before going out they'd become twice as accurate. If you went out hunting, you'd hit twice as many birds.

It doesn't take too long for someone who's played a ton of FPSes to get used to those things, in my opinion. At least it didn't in my case: the first time I ever fired a real gun, I was getting bullseyes at 300 feet by the end of the first box of shells...

It takes a bit more setup and patience to do it for real, but it's still point and click.

Well, the article above cleared some things up: they were shooting with airsoft guns, so recoil wasn't a factor, and what's more, their accuracy was still awful compared to what you'd need.
 

BoogieManFL

New member
Apr 14, 2008
1,284
0
0
This is easy.

The gamers or people who played the more realistic shooters were *more concerned* about getting good impressive hits like a headshot. The others were probably more interested in simply hitting the target.

Not because they were inherently better or "trained" in a virtual environment. Anything else can easily be attributed to basic learning and skill transfer, which would occur the same for a docile person as it would for someone aggressive.