Imagine receiving 10,000 application entries that you had to review in less than seven days. What would you do? I learned the hard way during the latest Nord 2 and Buds Pro Edition. How do you review that many entries efficiently? Read the steps below if you would like to know how I do it; I will walk you through the process. If there is one thing I want you to take away from this tutorial, it is transparency in the selection process.
Round 0 - Prepare the list
- Step 1 - EXPORT NOW
I export the entries as an Excel (.xlsx) file.
- Step 2 - Select objectively (AKA no favoritism/nepotism)
In the Excel file, the "Username" and "Email address" columns are hidden to ensure entries are reviewed objectively, based on content alone.
- Step 3 - Start with the selection rounds
Step 3 is broken into four selection rounds. Depending on the number of entries, we may ask our closest core user groups (e.g. Moderators and Experts) to help out in the intermediary rounds. I create three columns that represent the first three rounds of review.
Round 1 - Content Word Count
I create a word-count column that records the number of words in each review. If your review is only four words, such as "OnePlus 9 Pro awesome", it will not get a Y in the Round 1 column and will not make it to Round 2. That means, no matter how strong your visuals are, they won't even be reviewed. Essentially, in this first round I do not read word by word; I simply filter out the entries that are closer to a tweet than to a review. Don't worry, links are excluded from the word count.
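The Round 1 check described above can be sketched in a few lines of Python. The word-count threshold, the link-stripping pattern, and the function name here are my assumptions; the post does not state an exact cutoff:

```python
import re

# Hypothetical threshold; the post only says entries "closer to a tweet
# than to a review" are filtered out, not the exact number.
MIN_WORDS = 50

def round1_mark(review: str) -> str:
    """Return 'Y' if the review passes the word-count check, '' otherwise.

    Links are stripped before counting, since the post says links are
    excluded from this rule.
    """
    text = re.sub(r"https?://\S+", "", review)
    return "Y" if len(text.split()) >= MIN_WORDS else ""
```

A four-word entry like the example in the post would get no Y mark, and a long URL cannot be used to pad the count.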
Round 2 - Visuals
You know the saying: "An image is worth a thousand words." Going through the photos you sent us gives us a quick way to weed out low-quality entries. Much like Round 1, this step is more about validating your entry than selecting winners. In this round, I check whether the product shots are valid: whether your pictures are entirely unrelated to your entry, or considerably worse than the average.
By the time we get to Round 3, eighty to ninety-five percent of the entries will have been weeded out; only those with a Y mark in both the Round 1 and Round 2 columns remain on the main shortlist.
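Building that shortlist from the Y marks amounts to a simple filter over the spreadsheet rows. A minimal sketch, where the rows-as-dicts shape and the field names are illustrative rather than the actual spreadsheet headers:

```python
# Each row of the exported sheet, with the Y marks from the first two rounds.
entries = [
    {"id": 1, "Round 1": "Y", "Round 2": "Y"},   # passes both rounds
    {"id": 2, "Round 1": "Y", "Round 2": ""},    # weak visuals
    {"id": 3, "Round 1": "", "Round 2": ""},     # tweet-length review
]

# Only entries with a Y in both columns advance to Round 3.
shortlist = [e for e in entries if e["Round 1"] == "Y" and e["Round 2"] == "Y"]
```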
Round 3 - Attention to Detail
This is when I start to review the entries word by word, looking at how well the narrative is put together. As mentioned in the Do's and Don'ts above, I am looking for insights into the devices, not just cold specifications.
We are now at the most time-consuming part of the selection process: actually looking into the quality of your application, weighing factors like the quality of your written entry, how engaging it is, whether your review is detailed enough, and so on.
Round 4 - Sudden Death
Once we have la crème de la crème, it's time for the hardest part of the selection: deciding between several great entries. If the Round 3 shortlist exceeds the number of devices prepared for that edition of The Lab, I look into the following factors:
- "(Optional) Is there anything else that you would like us to know?" besides the standard review question, we will also try to find differentiation in the finalist's entries from this question.
- Refer back to Step 2 in Round 0: we hid the Community username to shortlist entries objectively. However, at this point we "unhide" the username to get a better perspective on your accomplishments in the Community.
When choosing between entries of similar quality, we believe we should prioritize those that have contributed the most to the growth of the Community. We do so by accessing your profile page and checking stats such as Best Answers, the number of posts, and warning points.
This will be the ranking of preference:
- Active users, with a regular presence on the forums and a helpful attitude
- Users that have limited participation on the forums, are inactive, or only show up for contests
- Users with no participation or no account
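The ranking of preference above can be expressed as a sort key. A hedged sketch only: the profile fields (`posts`, `active`) and the tier boundaries are hypothetical stand-ins for the actual Community stats:

```python
def activity_tier(user: dict) -> int:
    """Lower tier = higher preference, mirroring the three-level ranking.

    Field names and thresholds are assumptions for illustration.
    """
    if user.get("posts", 0) > 0 and user.get("active", False):
        return 0  # active users with a regular, helpful presence
    if user.get("posts", 0) > 0:
        return 1  # limited participation, inactive, or contest-only
    return 2      # no participation or no account

finalists = [
    {"name": "a", "posts": 120, "active": True},
    {"name": "b", "posts": 3, "active": False},
    {"name": "c", "posts": 0, "active": False},
]

# Sort finalists so the most preferred tier comes first.
ranked = sorted(finalists, key=activity_tier)
```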
Bonus Round - Crosscheck
With the final list of users, it's now time for the detective work to begin. We'll look for
- Plagiarised entries. As mentioned in the Do's and Don'ts, plagiarizing would just put everyone in an awkward position.
- "Youtubers" and "Tech Influencers." I do not have anything against "Youtubers" and "Tech Influencers," but given that The Lab is a product review program that is geared towards Community, we will prioritize "real users." However, we sometimes open exceptions to this rule when there are exceptional good entries or feel a particular entry might benefit the Community.
- Recent Lab Reviewers. Even though there are no concrete rules stating that Lab Reviewers cannot be on The Lab Reviewer Squad for two consecutive editions, we try to stay away from this situation as much as possible as there are limited spots. We believe everyone deserves a chance to participate in The Lab.