Designing a Fair and Transparent Review System
A review feature designed to promote trust and accountability between artists, labels, and promoters through transparent, user-friendly feedback.
Problem and Context
LabelRadar is a web platform that streamlines the demo submission process in the music industry. It helps artists get their tracks heard while allowing labels and promoters to efficiently discover new talent.
On the platform, artists can submit their music directly to their favourite labels in hopes of securing a release. Although every label is vetted for legitimacy during onboarding, some artists began reporting negative experiences when communicating with label teams.
Because LabelRadar did not initially provide a space for users to share feedback or discuss their interactions, many turned to external platforms like X (formerly Twitter) and Reddit to express their frustrations. This public conversation fuelled a perception that LabelRadar itself might not be trustworthy.
In addition to reports of being ignored, some users described suspicious behaviour from bad actors, including offers to promote tracks for a fee, sell playlist placements, or advertise paid courses.
Recognising the impact on community trust, we identified the need to create a safe and transparent space where artists could openly share their experiences with labels on the platform. The goal was to foster accountability, strengthen trust, and build camaraderie among artists, enhancing both the brand's reputation and the platform's integrity.
Process
Outcome and Impact
The launch of the review and rating system marked an important step toward improving transparency and accountability on LabelRadar. For a feature that had never existed before, adoption was stronger than expected: within just a few weeks, artists submitted over 2,000 ratings, even though many past interactions were unlikely ever to be reviewed.
The new system quickly began influencing behaviour across the platform. Some labels and promoters with consistently poor reviews chose to leave, while others improved their communication and engagement with artists. This shift led to a noticeable increase in positive sentiment among users and helped rebuild trust in the platform’s community.
By giving artists a clear and safe way to share their experiences, the feature not only amplified their voices but also reinforced the sense of fairness and transparency that had been missing.
Two years after launch, the feature remains an integral part of the platform’s trust ecosystem:
Over 10,000 total reviews collected, with participation from around half of active artists.
Artist retention up by roughly 15%, supported by increased engagement and satisfaction.
Reports of suspicious behaviour reduced by about 40%, as public visibility encouraged better conduct.
User satisfaction scores improved by 20%, with many artists highlighting the review system as a key reason for renewed confidence in LabelRadar.
What started as a response to community frustration evolved into a sustainable feedback loop that continues to strengthen transparency, accountability, and connection between artists and labels.