Saturday 18 August 2012

I wish to draw the attention of the followers of to what is a flawed polling method, freely available and being used to determine voting results for online video contests. I will be using the video contests run at to demonstrate the problems associated with online polls.

Last year I wrote to Steve (AKA Surfimp) from to raise my concerns at the commencement of the 2011 contest. We discussed how the online poll could be manipulated and how this kind of judging system could not necessarily be trusted to provide a fair and proper outcome. Following our correspondence, Steve brought the matter up for discussion, outlining the potential manipulation of the poll. I was rather surprised, on reading the responses, that no one took this seriously enough to ask for the poll to be scrapped. In the wash-up, Steve believed that the overwhelming majority of participants were of good heart and, in effect, hoped that the voters would do the correct thing. However, the final results of the 2011 and 2009 contests were questionable.

I would like to make it absolutely clear that I am not accusing anyone of cheating. But in any contest the participants are expected to play within the spirit of the rules, and online polls of this nature tend to pressure people into going against this principle or risking the consequences: polling poorly or, as we have seen in past contests, not polling at all.

Games not contests
In effect, I liken these particular contests to games, and how well you fare can come down to how well you play the game. One could enter a paper dart in these contests and win. In fact, my friends suggested I should have done this in 2011 just to prove a point. However, my aim is to improve these contests, not make a farce of them. I am offering a possible alternative.

There are national television shows which use online polls to determine a contestant's theoretical popularity. In one program a contestant was caught out spending about $2000 on mobile (cell) phone calls to vote for himself in order to stay in the show. This shows that fan bases can be fabricated. There have been several incidents in our national talent shows here in Australia where it has been questioned whether the most deserving performer or performers won.

Another point of interest: in the 2008 contest, Thepasty and Hexosex posted identical videos. The organiser picked this up in the early stages and decided to let the poll run its course, counting both videos as one when the tally was finalised, but it left me pondering. If random visitors/voters came into the forum to vote, wouldn't you think these two videos would score similarly? Thepasty came in second, polling 18 votes, while Hexosex could only manage 2 votes, coming in equal second last. How could this be if both videos were identical? Could it suggest that voters voted for the pilot and not the video? I hasten to add that I am not suggesting either of these entrants' votes were knowingly manipulated.

It is also possible to vote multiple times by creating fictitious IP addresses and email addresses, etc. 

Random visitors/voters versus unknown persons
The current voting system is supposedly based on random visitors who can cast one vote only. The ideal random visitor/voter would have some experience in slope aerobatics and be objective. This voter would browse through all the videos, taking notes along the way, and then, after many hours of assessing each video, cast his or her vote. So, do you think this describes every voter?

I sincerely believe that the entrants involved are genuinely good people.  However, if sound rules are not devised from the beginning, contests of this nature can expose a number of loopholes. The downside is the possible loss of participants along with their creative videos. 

My Credentials and Experience
In speaking out, I think you should know about my flying background. I have been flying for 32 years, including planes, helicopters and gliders. I have been competing in contests off and on for three decades and have won the last 10 state slope aerobatics championships I have entered. I am a qualified flight instructor and hold my Australian Bronze Wings for Glider and Bronze and Gold Wings for powered aircraft. I have judged aerobatics for internal combustion Pattern and Scale as well as Slope Soaring. I have also compiled a book on slope aerobatics.

In-house judging the only way
In-house judging is the only way to judge these contests. If a set of judging criteria is designed and applied, the outcome should by rights be fairer than guesswork judging and, in particular, non-secure polls. By utilising the suggestions posted on past and present contest threads, I am sure we could come up with an agreed set of rules and guidelines. For instance, in the 2011 contest DawsonH put forward four categories: Aerobatic Skills, Artistry (site, glider and music), Video Quality (shooting and editing) and Entertainment Value (creativity plus intangibles). He suggested a plus-to-minus grading system.

In March of this year, Steve wrote to me and asked if I would like to be part of a panel of judges for an additional judging section in the contest. I am yet to reply as I have been weighing up my options and it will be dependent on the outcome of this discussion. I am hoping my thoughts will be taken seriously and that the poll will be dropped.  If so, I would be glad to be a judge.

Finally, may this be an alert to those people who intend using online voting polls. If you cannot guarantee a fair and proper outcome for the poll's intended purpose, then I suggest you find another method. If it's not fair, it's not fun.

Ian Cole - AKA Ian Downunder. 

In the meantime, the following are my suggestions should a judging panel be formed. 

Suggestions for judging
Choose a panel of judges who between them have experience in all facets of the judging criteria - no one person need be familiar with every facet.

Set a mark out of a possible 100 for each video.  Marking as follows:

* Aerobatics Content (30) - the video should predominantly contain aerobatics, not just gliders flying around doing passes or the occasional loop or roll. Dynamic soaring is not aerobatics unless aerobatic manoeuvres are included. Repeat footage, or footage used merely to fill in time, would be looked at and marked accordingly.

* Aerobatics Skill (40) - footage to show flying skills with the ‘wow factor’ and should display unique features different to all the other videos. 

* Artistry (15) - footage to show a connection with the music that clearly bonds with the flying and which adds an extra ‘wow factor’ to the overall presentation. (While spectacular flying sites can seem to enhance a video, I do not believe this should be taken into consideration. The type of flying site, along with the conditions on the day will in effect determine the quality of the flying performance - and the flying is what should take precedence over all else.) 

* Video Production (15) - editing/presentation, camera handling, perspective of camera to glider, connection of glider to pilot, clarity of video, i.e. visibility of glider throughout the performance (weather conditions could play a part in this). 
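To make the arithmetic of the rubric concrete, here is a minimal sketch of how a panel's marks could be tallied. The category names and maximums (30/40/15/15, totalling 100) come from the list above; the function names and the example judges' marks are purely illustrative.

```python
# Maximum marks per rubric category (totals 100), per the list above.
MAX_MARKS = {
    "aerobatics_content": 30,
    "aerobatics_skill": 40,
    "artistry": 15,
    "video_production": 15,
}

def score_video(marks: dict) -> int:
    """Sum one judge's marks for a video, checking each category stays in range."""
    total = 0
    for category, maximum in MAX_MARKS.items():
        awarded = marks.get(category, 0)
        if not 0 <= awarded <= maximum:
            raise ValueError(f"{category}: {awarded} is outside 0-{maximum}")
        total += awarded
    return total

def panel_average(all_marks: list) -> float:
    """Average the totals from a panel of judges for a single video."""
    return sum(score_video(m) for m in all_marks) / len(all_marks)

# Hypothetical example: two judges scoring one entry.
judge_a = {"aerobatics_content": 24, "aerobatics_skill": 33,
           "artistry": 11, "video_production": 12}
judge_b = {"aerobatics_content": 26, "aerobatics_skill": 30,
           "artistry": 12, "video_production": 13}
print(panel_average([judge_a, judge_b]))  # 80.5
```

Averaging across judges, rather than taking a single judge's mark, helps smooth out individual bias - one reason a panel is preferable to a lone scorer.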

To learn about judging aerobatics and see how subjective matter can indeed be converted into tangible text, visit: 

Judging the previous contests
Attached, in PDF and plain-text formats, are my judging appraisals of the three aerobatics contests held at in 2007, 2008 and 2011. I did this to give a different perspective on the possible outcomes had an in-house judging system been used instead of the online poll. You may be surprised at some of the findings.

2011 Slope Aerobatics Contest in PDF format - ENGLISH
2011 Slope Aerobatics Contest - PLAIN TEXT format

2008 Slope Aerobatics Contest in PDF format - ENGLISH
2008 Slope Aerobatics Contest - PLAIN TEXT format

2007 Slope Aerobatics Contest in PDF format - ENGLISH
2007 Slope Aerobatics Contest - PLAIN TEXT format