Screeners are used to recruit participants for your dscout Diary, Live, or Express mission; you can screen scouts from the dscout panel or external participants from a panel of your own.

Most often, your first step will be building your question script in the platform. Please see “Setting up your screener” for a quick walkthrough of the screener builder experience. If you haven’t yet drafted your screener question script, check out this example screener for inspiration!

Once your screener is built in the dscout platform, you’ll submit it for review, and a dscout Research Advisor will provide feedback within 2 business days. Read more about our standard screener approval workflow.

But what do Research Advisors review for? And how can you expedite the review process by submitting a stellar screener that is already in launch-worthy shape? Use the guide below as your reference.

Please note that this guide is neither exhaustive nor a replacement for the Research Advisor review; rather, it is a straightforward checklist covering the essentials of that review.

For each category below, here are the things to verify.

Is the screener technically sound? Are there any programming errors or omissions?

- The screener is free from grammatical errors.

- All fields on the Overview and Screener tabs have been filled.

- Knockout and Must Select logic is present on the correct response options and/or questions in the Screener tab.

- Skip logic is present on the correct response options and/or questions and routes scouts to the correct prompts.

- Multiple-choice questions are correctly programmed as either single- or multiple-select.

- Open-ended questions are correctly programmed as either No Limit or 140 Characters.

Does the screener adhere to our scout experience standards and inclusive research guidelines?

- All language is actively inclusive and anti-racist.

- The screener has no more than 20 questions.

- Each scout is asked no more than 1 video prompt and 2 open ends.

- The screener does not expose scouts' Personally Identifiable Information.*

- An appropriate incentive amount is set for the mission(s) scouts will complete.

- Options for scouts to self-identify, select multiple, and select "I prefer not to respond" are present on all questions that collect sensitive personal information.

- Only essential demographic questions are included and, if possible/appropriate, Targeting Attributes are used.

Will the screener find the scouts you are looking for and achieve your recruitment goals?

- Screener questions address all relevant research criteria.

- The screener teaser provides a high-level description of the project without disclosing your knockout criteria.

- The screener teaser communicates what the follow-up missions will consist of (e.g., "Selected scouts will be invited to a multi-activity Diary mission, with the potential of a follow-up Live mission for an additional incentive.")

- The recruit is feasible within the dscout platform.*

- The screener title is scout-facing and fun.

Does the screener adhere to dscout design best practices?

- Multiple-choice response options are mutually exclusive and collectively exhaustive.

- The screener contains an open-ended question and a video prompt to fully "audition" scouts' ability to speak and write on the topic.

- Questions are not binary and allow scouts to self-select into the desired response. For example, rather than asking "Do you have a blue car?" the screener asks "Which of the following colors is your car?"

- Questions begin broad, then become more narrow. For example, before asking "What brand of sedan do you own?", the screener should confirm that the scout owns a car and that the car is a sedan.

- Screener questions only collect information from scouts that is needed to determine their fit for the mission. 

*These points may necessitate a conversation with your Research Advisor. 
