re: Multi-victim searches – best beacon is?

GRS and DVRPS in training, Grmada, Jan.19th

Rather than replying in the comments to the original post on PisteHors, I realized the length of my reply justified a post of its own.

So what is this about?

As reported by David on PisteHors (and published originally in Avalanche Review), SLF & ANENA decided to run comparison tests of avalanche beacons in multi-victim search scenarios. They tested two groups (beginners/intermediates and advanced) using a range of beacons. Group 1: Arva Axis, Mammut Element, Ortovox 3+, Pieps DSP Tour, Tracker 2. Group 2: Arva Link, Mammut Pulse, Ortovox S1+, Pieps DSP.

To quote:

All the searchers received training from representatives of each transceiver company. This focused on a basic introduction to each transceiver for the beginners and an in-depth look at multi-victim strategies for advanced users…
…The tests showed that there were considerable differences between the various transceivers in multi-victim scenarios. The fastest transceiver to the first target was the BCA Tracker 2, confirming BCA’s marketing that for beginners, at least, simple is better.
Overall, and including 2nd and 3rd victims, the Mammut Element came out best with fast search times and only 1 victim missed on the 3rd target. The Pieps DSP Tour and Arva Axis had problems with the marking function which caused particular difficulties with 3 victim searches. The Tracker 2 outperformed the Pieps and Arva and even came in just ahead of the Ortovox 3+.

                      Arva Axis   Mammut Element   Ortovox 3+   Pieps DSP Tour   BCA Tracker 2
time to 1st           2.00        1.45             2.00         2.00             1.30
time to 2nd           5.45        3.45             4.30         6.00             4.00
time to 3rd           10.00       6.00             6.15         10.00            7.00
fails (1st/2nd/3rd)   0/5/18      0/0/1            0/1/12       0/5/23           2/2/11

I have two slight objections to this test. The second is probably a direct result of the first.

All official beacon tests I’ve seen, at least those that involve beginners, are flawed to some degree. As quoted above, the participants were given instructions by the manufacturers prior to searching.

This somewhat invalidates the results from the start: the participants may not know much, but they know enough to rationally work around the design flaws of the beacons, especially if they got several attempts with various devices (I can’t tell, since I don’t have the original paper). Instead, they should pull random people off the street, people with zero interest in avalanches or even winter sports, give each of them a single attempt with a single beacon (in several control groups), and offer no explanation of how to use the device or how many victims there are.

Something tells me beacon manufacturers would be shitting themselves after seeing the results of their poor user experience design. You might argue this would be totally unrealistic, but it is a much better way to properly test and compare UX, and it would ultimately also benefit more experienced users who might not perform all that well in an emergency.

Planneralm, Jan.6th, ©Jonna

To be more specific, I have both the BCA Tracker 2 and the Ortovox 3+ available, and I find it very hard to believe the T2 outperformed the 3+ in a (beginner) multi-victim scenario, purely because it has no marking function (flawed implementation or not). No beginner can possibly resolve that situation with a T2 without instructions.

Further, prior to purchasing my 3+ I did my own testing during an avalanche seminar where many different models were available. I tried them all (though no ARVAs or BCAs) on a three-victim scenario, and the 3+ was by far the easiest to use without any explanation. I went in knowing nothing about the beacons I was trying, simply to see which one was easiest out of the box: no instructions, nothing.

With every other beacon I had a few “WTF?!” moments where I had to stop and figure out what it was trying to tell me. I was especially pissed off by the Mammut beacons, which, in my mind, are best avoided unless you enjoy interacting with the beacon rather than focusing on the search. They might be fine, even excellent as the test shows, after reading or receiving instructions, but they are horrible for a beginner. When I was using them they kept flashing messages that were totally unrelated to the task at hand. Unless they have significantly updated the firmware, nothing can convince me of their suitability for use in real situations.

Vogelsangberg & Sauereggnock (2240m), Dec.30th

Just this weekend I did a similar test with my wife (she knows absolutely nothing about beacons, coarse/fine search, or anything else related): I hid one beacon in the garden, then gave her the other one to search with. She wasn’t very pleased with my methods (she’s the kind of person who wants instructions first) but quite quickly proceeded to follow the signal. I didn’t time it, but overall it took her noticeably less time with the 3+, possibly because it told her more clearly when she was moving away. In both cases, though, she found the beacon within the few minutes one might reasonably expect a buried victim to still be conscious.

My ultimate point here is this – UX design is an afterthought for most beacon manufacturers, and I can’t figure out why they make it so hard on everyone. I do, however, have a suspicion that they try to hide it in many of these tests.

PS In the case of BCA and ARVA, the same could be argued for product design. Seriously, BCA, was this the ugliest you could make it?
