Rating the quality of labels
In house — LabelRadar
Design system
UX Design
Competitive research
The Challenge
LabelRadar is a web platform that streamlines the demo submission process across the music industry, empowering artists to get their tracks heard while enabling labels and promoters to efficiently discover new talent.

On LabelRadar, users can submit their music to their favorite labels with the hope of securing a deal. While all labels are vetted for legitimacy upon joining, some users have encountered negative experiences when communicating with label teams.

Since we did not initially provide a space for users to share or discuss their interactions with specific labels or promoters, many turned to platforms like X (formerly Twitter) and Reddit to voice their frustrations. This led to a perception that LabelRadar might not be trustworthy. Beyond simply being ignored, some users reported encountering suspicious activity from bad actors on the platform, such as offers to promote tracks for a fee or selling spots on playlists and courses.

To foster transparency and hold labels and promoters accountable, we are introducing a system that allows users to review their interactions, promoting a more open and trustworthy environment for all.
The Process
After conducting competitive research into platforms with public rating systems (such as Glassdoor), and aligning with the engineering team on project scope, we identified several key touch points to address for this initiative.
These touch points were essential to ensuring the system would be both comprehensive and user-friendly. We aimed to make the review experience as seamless as possible, as this would be crucial in gathering fair and accurate feedback on labels. This, in turn, would help hold labels accountable and discourage bad actors, fostering a more trustworthy environment for all users.


Label and Promoter Profiles

We introduced the new feature under a dedicated tab, ensuring there was enough space to display the information clearly while minimising cognitive load for users.

On the left side, we placed colour-coded ratings that remain neutral by default but light up on hover. This component uses a semantic colour palette to visually represent the sentiment chosen by the user, making the experience both intuitive and informative.
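As a rough sketch of this hover behaviour, the mapping from sentiment to a semantic colour token might look like the following (the token names are illustrative, not LabelRadar's actual design-system tokens):

```typescript
// Hypothetical sentiment-to-token mapping; real token names will differ.
type Sentiment = "negative" | "neutral" | "positive";

const SENTIMENT_COLOUR: Record<Sentiment, string> = {
  negative: "var(--colour-danger)",
  neutral: "var(--colour-neutral)",
  positive: "var(--colour-success)",
};

// Ratings render in a muted state by default and only take on
// their semantic colour while hovered.
function ratingColour(sentiment: Sentiment, hovered: boolean): string {
  return hovered ? SENTIMENT_COLOUR[sentiment] : "var(--colour-muted)";
}
```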

On the right side, reviews with comments are displayed, allowing users to vote on whether they found the reviews helpful. Labels can also reply to these reviews, with their responses appearing underneath in a different background shade to clearly differentiate them from the original review.
Community Feedback on Label Profiles — Prototype (LabelRadar)
Rating a Label or Promoter

Feedback prompt — Prototype (LabelRadar)
To ensure reviews are as fair as possible, we implemented a system that requires an actual interaction before prompting users to submit a review. This prevents users from spamming reviews for labels they haven’t engaged with.

After 72 hours of messaging with a label or promoter about a track, users are prompted within their message section to leave a review.
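The gating rule above can be sketched in a few lines. This is a hypothetical illustration of the logic, not LabelRadar's actual code; all names and shapes are assumptions:

```typescript
// A user is only prompted to review a label once 72 hours have passed
// since messages were exchanged about a track, and only if they have
// not already reviewed that conversation.
const REVIEW_DELAY_MS = 72 * 60 * 60 * 1000;

interface Conversation {
  labelId: string;
  firstExchangeAt: Date; // when messages about the track were exchanged
  reviewed: boolean;
}

function shouldPromptReview(convo: Conversation, now: Date = new Date()): boolean {
  if (convo.reviewed) return false;
  return now.getTime() - convo.firstExchangeAt.getTime() >= REVIEW_DELAY_MS;
}
```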

For users with retroactive reviews to submit, as mentioned earlier, we decided to send a prompt within each of their past conversations. However, to avoid overwhelming them, we opted not to send notifications for every single one. This way, users can provide feedback at their own pace without feeling bombarded by alerts.

The review component asks users to rate three aspects: Communication, Value, and Professionalism. To do so, users simply click on the smiley face that best represents their sentiment for each category. Once all three ratings are selected, an optional input field is revealed where users can leave a written review, if they choose.
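The progressive reveal described above could be modelled roughly as follows; the field names are illustrative assumptions, not the production schema:

```typescript
// Hypothetical shape of a review draft: three required sentiment
// ratings plus an optional free-text comment.
type Sentiment = "negative" | "neutral" | "positive";

interface ReviewDraft {
  communication?: Sentiment;
  value?: Sentiment;
  professionalism?: Sentiment;
  comment?: string; // optional written review
}

// The comment input is revealed only once all three ratings are chosen.
function canRevealCommentField(draft: ReviewDraft): boolean {
  return Boolean(draft.communication && draft.value && draft.professionalism);
}
```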
The Outcome
For a feature that didn't exist before, and considering the backlog of past interactions that might never be rated, we were surprised to gather over 2,000 ratings within just a few weeks.

Some labels and promoters with the poorest reviews chose to leave the platform, while others improved their communication with users, leading to an overall boost in how artists feel about the platform. We're still collecting data to guide future iterations of the feature, but it has already made a huge impact by giving artists a voice and fostering a sense of transparency.