My Method in Voting for North American Ultrarunners of the Year

AJW's Taproom

For the past 10 years, I have been honored to participate as one of the voters for UltraRunning Magazine’s year-end (North American) Ultrarunner of the Year balloting. Recently, I have been asked by several people if there is any method I use in filling out my ballot. While the magazine intentionally provides no template for completing the ballots, over the years I have formulated my own method which seems to work pretty well for me.

The way the process works is that in mid-December Tropical John Medinger (former UltraRunning Magazine publisher and all-around ultra guru) sends each voter a detailed spreadsheet of every notable North American ultramarathon performance of the year. Typically, a small group of us receives an advance copy of the spreadsheet to review and make sure nothing is omitted or inaccurate, and then the document (typically about 50 pages long) is distributed to the full panel. Each panelist then has about 10 days to complete the ballot and return it to John. Having spoken to several members of the group over the years, I have come to see that there is no single accepted process for balloting. What I am confident of is that each voter takes the responsibility very seriously and typically spends 20 to 30 hours completing the process.

My approach is both science and art, and one which I have refined over the years. I begin with a thorough reading of the entire spreadsheet, highlighting with different-colored highlighter pens (yes, I print the spreadsheet out and work on actual paper) such things as North American and world records, course records, significant results in large/prestigious events, and any other results that catch my eye.

Following that general review of the spreadsheet, I then begin looking at each individual runner’s full body of work for the year. In that process, I consider how many events were completed, how many (if any) DNFs the athlete had, the competitive level of the chosen events, and how the runner did overall relative to the field in those events. Once that review is complete, I create a list of approximately 15 to 20 runners of each gender who I think could likely find a place in my top 10. This list is not ordered at all; rather, it is just a brain dump based on the spreadsheet data.

With the initial lists created, I step away from the process for two to three days to let the basic data sink in. After that break, I return to each list and begin ordering the runners into a ranking. In creating this initial ranking, I give scores on a 1-to-5 continuum for each race result. A score of 1 might be a first-place finish at an off-season fat-ass run, while a score of 5 might be a top-three finish in a competitive race like The North Face Endurance Challenge 50 Mile Championships or UTMB. For DNFs, I assign a score of zero. This scoring system is, of course, somewhat subjective, but now that I have 10 years’ worth of my own balloting data, I can compare scores with results from other years and assign them similarly. After adding up each runner’s points and dividing the total by the number of events, I order the full list into a ranking.
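For illustration only, here is a minimal sketch of this kind of per-race scoring and averaging. The runner names and scores are hypothetical stand-ins for the subjective 1-to-5 judgments described above, not actual ballot data.

```python
# Minimal sketch of a 1-to-5 per-race scoring scheme with DNFs counted as 0.
# Runners and scores below are hypothetical, not actual ballot data.

def season_average(race_scores):
    """Sum the per-race scores and divide by the number of events."""
    return sum(race_scores) / len(race_scores) if race_scores else 0.0

runners = {
    "Runner A": [5, 4, 4],      # three strong results at competitive races
    "Runner B": [5, 5, 0, 2],   # two big wins, one DNF, one low-key race
}

# Order the list by average score, highest first.
for name in sorted(runners, key=lambda r: season_average(runners[r]), reverse=True):
    print(f"{name}: {season_average(runners[name]):.2f}")
```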

With that ordered ranking complete, I then compare head-to-head performances. Obviously, not every athlete has competed with every other athlete in the ranking, but where there are head-to-head comparisons, I take them into account and adjust the rankings accordingly. Typically, head-to-head comparisons tend to move runners one or two spots in either direction.

Some observers over the years have asked if I have any inherent biases in my process, such as weighing trail runs more heavily than road or track runs, or 100-mile races over shorter-distance races. In truth, I don’t have any explicit biases. However, in my rankings of certain events, I may use some subjective judgment that could be seen as a bias, and I suspect that other voters do so as well. That subjective judgment, or ‘gut test,’ is typically the last stage in my ranking, and in those cases I may move an athlete to a different place solely on gut feeling. These occasions are very rare and typically only involve moving a runner up or down one spot in the rankings.

With less than a week to go until we receive our ballots from John, I know many of us are beginning to think about our votes. From my perspective, it seems to get more challenging to do these rankings every year but it is also a wonderful year-end ritual which provides a great opportunity for reflection on the sport that means so much to me.

Bottoms up!

AJW’s Beer of the Week

This week’s Beer of the Week comes from Hardywood Park Craft Brewery in Richmond, Virginia. Hardywood’s Bourbon Barrel Cru is a classic Belgian-style Quadrupel ale with a delicious smoky and nutty flavor and a smooth texture which is simply scrumptious on a frosty winter’s night.

Call for Comments (from Meghan)

  • If you were to be a voter on this panel, what methods do you think you might use to create a women’s and men’s top-10 list?
  • And if you are a voting member on the panel, might you share how you go about casting your votes for (North American) Ultrarunner of the Year?

Comments

  1. Kent Green

    It is difficult to compare Zach Bitter’s Tunnel Hill and Jim Walmsley’s WS100 performances. Both are superhuman. And I am glad I don’t have the job of judging haha.

  2. Johnny

    Andy
    Can you please release your personal ranking, just like you did last year? Some might not agree with your comments, but I always like to read what a former elite like you thinks.

  3. SageCanaday

    This sounds like an interesting process. 20-30 hours? Over a 10-day period you are losing a whole day then. How big is the voting panel?
    Why not have a vote for The People online (maybe on here at iRunFar) and have a much, much larger number of people vote?

    Knowing the extreme diversity of MUT (mountain-ultra-trail running) events, it seems like a major issue with this whole process is that people have trouble comparing, say, a top-level 50km performance to a multi-day record… not to mention a road/track vs mountain ultra and the relative times/places. Then of course you eliminate all the amazing mountain-running events that happen to be 26.2 miles and less, including some of the most iconic and competitive mountain-trail races in the world, like Sierre-Zinal. That is fair because it is “ultra running only… and ‘North American only’”, but the true realm of MUT is much more diverse.

    1. shitinthewoods

      If you include races like Sierre-Zinal, then it’s no longer the “ultrarunner” of the year. They’d have to change it to the MUT runner of the year. But then they’d have to include XC races and VKs, etc., I suppose? That would bring a whole group of runners into consideration who otherwise wouldn’t be in the conversation (i.e., people with success at only sub-ultra distance trail/mountain runs), and comparing them to people with success at only ultra-distance races would add a whole new layer of complication to the process. It seems like deciding who had the overall best ultramarathon results of the year is already difficult enough. I think it’s probably a good thing that, for the sake of this annual tradition in our community, “ultrarunning” is a different category than “MUT”.

      Are there similar annual rankings in other divisions of the running community? I assume in just about any sport, an MVP-type ranking like this is unavoidably subjective, at least to some extent… World Series and Super Bowl MVPs, etc., how are they decided? That being said, it’s probably best to just enjoy the annual UROY rankings as a fun and exciting conversation about the year, and not treat them too seriously or scientifically. Hell, I’m sure plenty of people who have been in the top-10 UROY rankings over the years don’t agree with the ranking they were given.

      Excited for another fun conversation this year with lots of amazing runs to look back on!

      1. SageCanaday

        Yeah, that is exactly why I said in my last sentence that it is fair because it is “ultra running only” (“Ultra Runner of the Year,” not “SUB ultra runner” or “MUT Runner”). I was simply speculating. Much like I wonder why it is “North America” instead of just “America”? My main point is: why not open up the voting to the people? The masses? And do it online. I was actually talking to someone the other day about why our whole voting process (for the US) isn’t all online, and it prompted this thought.

        And for the record I have been a top 10 UROY before and of course I didn’t agree with the ranking I was given! There is obviously a lot of bias on all sides. People have favorite events/distances and people see relative “competitive depth” and what is an “impressive performance” in different ways.

        1. Mike

          Opening the vote up to the people, the masses, will only lead to the voting being a popularity contest. I don’t know about others on the panel, but at least AJW takes a “scientific” approach to his voting.

          1. AJW

            Hey Mike, thanks for the comment. I would say that most of the voters I know realize that there is a lot of “chatter” out there that lends itself to a popularity contest and most of them see through that in large part by simply comparing performances and results rather than personalities.

    2. speedgoat

      It becomes a popularity vote if there are not certain voters on the panel, simply because all of us voters get a full spreadsheet of top performers of the year. None are ever left out, and some shouldn’t even be on the sheet. But again, if you put a public vote out there, it’ll lean towards what happened at the end of the year (“who’s hot now” is always the case). Camille would win hands down because of her record, but she’s got some stiff comp from a few others.

      1. SageCanaday

        Karl, there’s a simple solution to that… post “The Spreadsheet” on here for all to see. If it is just quantitative data about people’s performances for the whole year, then all the voters can “be informed” of all the results. I believe a lot of fans of the sport are quite knowledgeable already about results from the past 12 months anyway. Of course there will be bias, but shouldn’t the popular vote by the people win?

        Again, I’m not saying the system is necessarily flawed or has to change…I’m just questioning and speculating because it is entertaining.

  4. Aaron Sorensen

    The only thing I didn’t see on your comparison list is a higher number multiplier for longer distances, or harder 100+ mile races for that matter.

    Being fast and winning competitive 50k’s and 100k’s is minimally important to most, even though it gets the most attention.

    Sure it makes for good spectating, but as an ultrarunner, winning a 200 miler is vastly more impressive than winning a competitive 50k.

    1. shitinthewoods

      @Aaron Is this trolling? Those are opinions stated in the form of facts, which I’d suggest is always dangerous in a public forum.

      Do you think winning a 200-mile race against a couple hundred people (?) is more impressive than a course record at JFK 50, or Pikes Peak Marathon, for example? Or a marathon world record? Or are you trolling? I feel like I’ve been trolled..

    2. SageCanaday

      Aaron, you seem to have a very different perception of the sport compared to some in regards to “what gets the most attention.”

      This is my perception:
      In the US a win at a big 100-miler “gets a lot of attention.” Western States, Leadville, Hardrock. A win at UTMB as well.

      Now if a loaded field shows up to a 50-miler like the North Face SF Champs or a 50km like Speedgoat, then that gets some attention as well… but there seems to be a general bias for the big 100s. People also like extreme and difficult distances/courses like Hardrock, Barkley, etc. But do those races actually have more competitive depth?

      In South Africa though it is all about winning Comrades. Comrades has over 20,000 runners in it. It is a 55 mile road ultra. We could say (at least on the men’s side) it has the most competitive depth of any ultra marathon in the world by far (side note: It also has the most prize money by far).

      In Europe many obviously care a lot about UTMB… but also all sorts of “Skyrunning races” and the new “Golden Trail Series” races, epic lines like Sierre-Zinal or Zegama. Those are “sub ultra” though, so they must not be very hard or competitive, right? They actually don’t usually care if a race is exactly 100 miles in distance (UTMB is usually around 106). Is Matt Carpenter’s record at Leadville more impressive than his record at the Pikes Peak Marathon? I’d go with Pikes Peak. Nobody is touching that!

      But logically, here is the question: Is a 10km twice as hard as a 5km? Is a slow 200-miler really vastly more impressive than a fast and competitive 100-miler? Both are quite hard and challenging when one pushes oneself 100%! The pain is a constant. What is the relative competitive depth of a race, given field size, UltraSignup score, ITRA score, and who shows up and finishes ahead of whom head-to-head? Some of that is actually easy to quantify.

  5. Ellie G

    I too am one of the panel of voters for (North American) Ultrarunner of the Year, male and female divisions. I can tell you that it is a head spinner for sure trying to determine who I class as top 10 for each gender. For me it’s really important to balance quality with quantity: if someone nails the win at one super competitive 50 miler but all their other races are very low key or lacking in competition, then it’s weighing that up against someone who might not have won any races at all but who has numerous 2nd or 3rd place finishes. For me, I like to see at least three solid results to really strongly consider voting for someone, as of course one standout result would put them in the running for Performance of the Year but not UROY (where I want to see some quantity). As it’s Ultrarunner of the Year, I really try not to show sway toward longer distances (be that 100 miles, 200 miles, or fixed-time events), as ultimately a 50k is an ultra and so should be treated equally. As such, I try to focus on North American records, world records, and course records (where depth of field and history of the race is strong). Yes, this does mean that sometimes you are confronted with trying to rank the quality of a 50k road race against a 100-mile mountain race and a 24-hour track race, but that is where some subjectivity is key. My brain is exploding and Tropical John has not even sent us the data yet! Most importantly of all, this is NOT a popularity contest/social media following contest; ultrarunning performance is all I consider, and that often requires some digging for those who choose to have a lower profile on social media but still should very much be considered.

  6. Greg Crowther

    With all due respect, the numerical approach outlined here seems flawed in that it actually penalizes runners for doing small low-key races even if they also did highly competitive ones. Example: runner wins Western States (5) and Comrades (5) and UTMB (5) for an average score of 5. Now say instead that the runner wins all these but also wins a local Fat Ass, worth 1 point; her average score is now down to 4. As the sport continues to become more professionalized and more global, I don’t think we should be penalizing runners who also participate in their local community races.

  7. Aaron Sorensen

    @shitinthewoods
    Trolling?
    So it will only be impressive when some of the voters actually start voting a higher score for these races and more elites show up?
    It’s not a chicken-or-the-egg thing. These races are already here, and yes, winning these races is more impressive than JFK.
    My opinion of course, but what will it take for elites to start doing these races?
    Someone needs to take a stand and be impressed and start talking about it.
    I don’t understand why this is trolling.

  8. Jakub

    I agree with Greg Crowther, low profile races should only penalise one’s score if they are not alongside high profile races. For instance, taking top five performances in a year for every athlete would be fairer. If all of them are high profile, then it doesn’t matter how many local races the athlete ran. If not, then the athlete possibly didn’t run enough competitive races to be a UROY.

    Also, not being from the US, I’m a little sad that this nice effort doesn’t go beyond North America. It’s not “NA Ultra Runner of the Year,” so I was a bit puzzled last year that François D’Haene didn’t get the prize for the men. I’d say that the name of the award is slightly misleading if the award can only go to North American runners.

  9. Stephen Patterson

    For what it’s worth, in his film Forever Running, Yiannis Kouros says that running 50 or even 100 kilometers is not ultrarunning, and that it doesn’t really start until 100 miles and over. I’m not qualified to have any opinions about this myself.

  10. Karl Meltzer

    I have an idea. Us voters pick the top 10, then publish that in no particular order with results attached. Then we let the public decide. Each race comes with a small notation on quality of field. Are most people going to study that? Probably not, then it’s back to the popular vote. I dunno, it’s a nightmare to go thru the spreadsheet, trust me.

  11. M. G. Dorion

    I echo Dr. Meltzer’s sentiment exactly:
    “It IS a nightmare to go through the spreadsheet.”

    (El Paso M.D., ultra racer since 1977, voter since ’97, still LITERALLY having nightmares and trouble sleeping during the final calculations/rankings)

  12. Buzz Burrell

    Shitinthewoods asked, “Are there similar annual rankings in other divisions of the running community?”

    Yup! The Fastest Known Time of the Year Award is back for the 3rd year, and will be published in the same issue of Ultrarunning Magazine as UROY (presuming I manage to write the article by the deadline), and on fastestknowntime.com. Ballots went out today, the same day as UROY. And I’m firmly with Karl on the size of the UROY spreadsheet – I’m so intimidated I haven’t had the nerve to open it yet – but the FKTOY spreadsheet is only one page (I’m much lazier than AJW or Tropical John apparently).

    FKTOY Voters don’t use a numerical rating system. They learn about the routes, they think about what it took to do them, they likely are inspired themselves, and they decide based on what’s meaningful to them. At the end of the Ballot I wrote:

    “The purpose of the Fastest Known Time Of the Year Award is for all of us to learn of new places, people, and techniques, to be inspired by the same, and to create a richer community by sharing what we do. So please enjoy this process – traveling quickly and efficiently through our beautiful world is a wonderful gift we have been given, so let’s take delight in honoring and appreciating our opportunities.”

  13. Jason Schlarb

    Sage, a popular vote is fine. Powder magazine does that for skiing. For UROY, no way. General public voting will not point to performance, instead it will be massively about hype, social followership, media/branding support, recency, etc… It is difficult for TJ to find voters who can see past said… “fog”, general public, no chance.

    1. AJW

      As much as I wish I didn’t, I have to agree with Schlarbie on this one. I remember two years ago, yes two years ago, someone wrote into UR Mag outraged that Anton didn’t win UROY. Just one example of many…

      1. SageCanaday

        Schlarb, I won’t compare ultrarunning to skiing unless it is only about race results and direct head-to-head competition in narrowly defined events. I also don’t know anything about Powder Magazine or ski culture. Of course we want to make this as quantitative as possible to be fair.

        AJW, well that was only one example of one person writing about how Anton didn’t win UROY. Do you think Anton would really be in consideration by a mass public vote considering he hasn’t raced much these past few years? The people who read these articles are generally at least somewhat aware of most of the results iRunFar publishes and covers (which is quite extensive!). There is a certain power in the number of the masses….especially if they are informed and educated. We could also say that those who bother to vote are (hopefully at least somewhat) informed and have looked at the ballots and weighed information/results as objectively as possible. There can be bias at all levels of course. We’ve already seen huge bias for the longer ultra events like 100-milers and multi-day. And of course all events in ultra running are quite diverse when you are talking about 50km to 6-days and track vs road vs trail vs mountain surfaces.

        Like Karl said above, maybe the panel can narrow down the votes to a “top 10” and then the masses vote on the final ranking of those top 10? Make the spreadsheets public. All this information and results can be found on the internet anyway.
