Goal: Evaluate submissions for quality and veracity.
MetricsDAO provides a 6-step process for Organized, On-Demand Analytics Delivery, or as we like to call it, OODAD.
QA for Solutions is the 5th step, which we piloted with OlympusDAO. Here’s how we did it, what we learned, and how we plan to improve!
Setup
Timeline: We gave ourselves a week to review and grade the 11 OlympusDAO submissions based on a predefined rubric.
Reviewers: After putting the word out on Discord, we found 2 community members to review the 11 OlympusDAO submissions. Reviewers had to know basic SQL and have experience with crypto analytics. Our third reviewer was a member of the core team.
Incentive: Reviewers will earn a reviewer-specific POAP.
Peer-Review Process: The core team member reviewed the other reviewers’ work.
Outcome
QA for OlympusDAO solutions went very smoothly. The reviewers broadly agreed with each other, and it was easy to identify winning submissions. We used average rubric scores to validate submissions and finished grading within the week.
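As a concrete illustration, here is a minimal sketch of the averaging approach. The submission names, the 0–10 scale, and the scores below are made up for this example; they are not the actual OlympusDAO results.

```python
# Minimal sketch: average each submission's rubric scores across reviewers
# and rank them. Names, scale, and scores are illustrative only.
from statistics import mean

scores = {
    "submission_a": [8, 9, 8],   # one score per reviewer
    "submission_b": [5, 6, 4],
    "submission_c": [9, 9, 10],
}

averages = {name: mean(vals) for name, vals in scores.items()}
ranked = sorted(averages.items(), key=lambda kv: kv[1], reverse=True)

for name, avg in ranked:
    print(f"{name}: {avg:.2f}")
```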
Submission quality varied widely. The weakest submissions came from users doing the bare minimum, while the strongest came from experienced analysts. Experience matters, but it is not the only factor determining quality; better incentives should be put in place.
Future Experiments
Incentivizing higher-quality submissions
Payment:
- We want to experiment with awarding higher payouts to higher-quality submissions.
- We will also pay more for harder questions.
Setting Expectations:
- The level of difficulty for each challenge will be stated from the get-go, so analysts know which challenges they can tackle effectively.
- The rubric should be made public and shared broadly ahead of time.
- Payouts should also be announced ahead of time.
Knowledge-Sharing & Mentorship:
- As more bounty rounds are completed, we will have grand-prize submissions to share as examples to follow.
- The #support channel will be the place for users to ask questions about the rubric and get help easily. We want to promote education and knowledge sharing as much as possible!
- Our Mentor-Circle is planning to host more workshops where community members can learn from each other.
Dispute & resolution mechanism:
We want to implement a dispute & resolution mechanism for cases where one reviewer disagrees with everyone else. Experienced graders should carry somewhat more weight in the final score.
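To make the idea concrete, here is a rough sketch of one possible mechanism: a weighted average of rubric scores in which experienced graders count a little more, plus a simple flag when reviewer scores diverge sharply. The reviewer names, weights, scores, and threshold are all hypothetical, not a finalized design.

```python
# Rough sketch: weighted average of reviewer scores, where experienced
# graders carry a bit more weight, plus a simple "dispute" flag when
# scores diverge too much. All values below are hypothetical.

reviews = [
    {"reviewer": "experienced_grader", "score": 9, "weight": 1.5},
    {"reviewer": "community_reviewer_1", "score": 8, "weight": 1.0},
    {"reviewer": "community_reviewer_2", "score": 3, "weight": 1.0},  # the outlier
]

weighted_avg = sum(r["score"] * r["weight"] for r in reviews) / sum(r["weight"] for r in reviews)

scores = [r["score"] for r in reviews]
needs_second_look = max(scores) - min(scores) > 4  # arbitrary dispute threshold

print(f"Weighted average: {weighted_avg:.2f}")      # 7.00 here, vs. a plain mean of 6.67
print(f"Flag for dispute resolution: {needs_second_look}")
```

Weighting dampens a single dissenting score without ignoring it, and the flag gives the dispute process a clear trigger for a second look.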
Attracting more reviewers:
As the number of submissions increases, we will need more reviewers. Being a Reviewer is an excellent way for talented analysts with SQL and crypto skills to get involved at a low time commitment (2-3 hours a week, maximum).
For upcoming programs, like Convex, we plan to offer a Reviewer POAP to track reviewer contributions. Reviewers have also told us that seeing others’ submissions is an excellent way to improve their own work, since it exposes them to a range of different techniques.
If you have any interest in being a Reviewer for upcoming programs, please contact Snowslinger.ust!
And feel free to share questions, thoughts, or feedback below or on Discord!