Designing a Fair and Transparent Review System

A review feature designed to promote trust and accountability between artists, labels, and promoters through transparent, user-friendly feedback.

2024

B2C

UX Design

Design Research

Design System

Prototyping

Figma

User Interface

Quantitative Research

Networking

Problem and Context

LabelRadar is a web platform that streamlines the demo submission process in the music industry. It helps artists get their tracks heard while allowing labels and promoters to efficiently discover new talent.


On the platform, artists can submit their music directly to their favourite labels in hopes of securing a release. Although every label is vetted for legitimacy during onboarding, some artists began reporting negative experiences when communicating with label teams.


Because LabelRadar did not initially provide a space for users to share feedback or discuss their interactions, many turned to external platforms like X (formerly Twitter) and Reddit to express their frustrations. This public conversation began to create a perception that LabelRadar might not be fully trustworthy.


In addition to reports of being ignored, some users described suspicious behaviour from bad actors, including offers to promote tracks for a fee, sell playlist placements, or advertise paid courses.

Recognising the impact on community trust, we identified the need to create a safe and transparent space where artists could openly share their experiences with labels on the platform. The goal was to foster accountability, strengthen trust, and build camaraderie between artists, enhancing both the reputation of the brand and the integrity of the platform.

Process

After conducting competitive research focused on platforms with public rating systems such as Glassdoor, analysing user survey data, and collaborating closely with the engineering team to align on technical feasibility and project scope, I identified several key opportunities to strengthen this initiative:

  • Showcase ratings and reviews directly on label and promoter profiles.

  • Encourage retroactive feedback, allowing artists to review not only new interactions but also collaborations that predated the feature's launch.

These touchpoints were essential for building a comprehensive, user-friendly review system. My goal was to make the review process feel effortless and trustworthy: maximising participation while ensuring fair, constructive feedback. This would help promote accountability, discourage bad actors, and foster a more transparent environment across the platform. The system also needed to handle reviews for interactions that predated its launch, maintaining fairness for all users.


For Artists

72 hours after any new interaction between a label and an artist, the artist would receive an in-chat prompt to rate their experience across three key dimensions:

  • Communication – Did the label respond respectfully and promptly? Was the exchange professional and enjoyable?

  • Value – How did the label’s contribution compare to others in the industry?

  • Professionalism – Did the label demonstrate integrity, reliability, and high standards of conduct?

To submit their feedback, artists would select a rating represented by an emoji (good, neutral, or bad), making the process quick and visually intuitive. They could then add an optional written comment to provide context or additional feedback.
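
To make this concrete, here is a minimal TypeScript sketch of how such a submission might be modelled; the type and field names are illustrative assumptions rather than LabelRadar's actual data model.

```typescript
// Hypothetical shape of a single artist review; all names are illustrative.
type Sentiment = "good" | "neutral" | "bad";

interface ArtistReview {
  interactionId: string;        // the chat or submission being reviewed
  artistId: string;
  labelId: string;
  ratings: {
    communication: Sentiment;   // respectful, prompt, professional exchange
    value: Sentiment;           // contribution compared to others in the industry
    professionalism: Sentiment; // integrity, reliability, standards of conduct
  };
  comment?: string;             // optional written context
  submittedAt: string;          // ISO 8601 timestamp
}
```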

Upon sending, artists would see a confirmation that their review had been submitted and would be processed shortly. All written input would be automatically scanned for offensive or inappropriate language before being published, ensuring a respectful and safe review environment.
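
One way to picture that scanning step is a simple gate that only publishes a review once an automated language check passes. The sketch below is an assumption about the flow, not the production implementation; `containsOffensiveLanguage` and `saveToPublicReviews` stand in for whatever moderation and storage services the platform actually uses.

```typescript
// Stand-ins for the platform's real moderation and storage services (assumed).
declare function containsOffensiveLanguage(text: string): Promise<boolean>;
declare function saveToPublicReviews(review: ArtistReview): Promise<void>;

// Hypothetical publication gate: the review is only made public once its
// optional comment passes the automated language check.
async function publishIfClean(review: ArtistReview): Promise<"published" | "held"> {
  const comment = review.comment ?? "";
  if (await containsOffensiveLanguage(comment)) {
    return "held"; // kept back for manual moderation instead of being published
  }
  await saveToPublicReviews(review);
  return "published";
}
```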

For past interactions, the same prompt would appear within existing chats, but without triggering new notifications, preventing unnecessary noise and preserving a positive user experience.
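
Sketched below is one way the two behaviours might be expressed: new interactions get a prompt (and notification) 72 hours after the last message, while chats that predate the feature get the same prompt silently. The scheduling and UI helpers are hypothetical.

```typescript
const REVIEW_PROMPT_DELAY_MS = 72 * 60 * 60 * 1000; // 72 hours

// Stand-ins for the platform's real scheduler and chat UI (assumed).
declare function scheduleAt(timestampMs: number, task: () => void): void;
declare function showInChatPrompt(chatId: string, options: { notify: boolean }): void;

// Hypothetical scheduling logic for the review prompt.
function scheduleReviewPrompt(chat: {
  id: string;
  lastInteractionAt: number; // epoch milliseconds
  predatesFeature: boolean;  // chat existed before the review system launched
}): void {
  if (chat.predatesFeature) {
    // Past interactions: show the prompt in the existing chat, no notification noise.
    showInChatPrompt(chat.id, { notify: false });
    return;
  }
  // New interactions: prompt 72 hours after the latest exchange, with a notification.
  scheduleAt(chat.lastInteractionAt + REVIEW_PROMPT_DELAY_MS, () =>
    showInChatPrompt(chat.id, { notify: true })
  );
}
```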

For Labels and Promoters

The rating system was integrated into label and promoter profiles under a dedicated “Reviews” tab. The rating summary now appears at the top of the profile, providing immediate visibility of overall sentiment.
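
The summary can be derived directly from the individual emoji ratings. The sketch below assumes each of the three dimensions contributes one emoji to the totals; as with the earlier snippets, the names are illustrative rather than the platform's real code.

```typescript
// Hypothetical aggregation of emoji ratings into a profile-level summary.
interface RatingSummary {
  good: number;
  neutral: number;
  bad: number;
  total: number;
}

function summariseRatings(reviews: ArtistReview[]): RatingSummary {
  const summary: RatingSummary = { good: 0, neutral: 0, bad: 0, total: 0 };
  for (const review of reviews) {
    const { communication, value, professionalism } = review.ratings;
    // Each dimension counts, so Communication, Value, and Professionalism all contribute.
    for (const rating of [communication, value, professionalism]) {
      summary[rating] += 1;
      summary.total += 1;
    }
  }
  return summary;
}
```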

Below, users can scroll through all published reviews, which display:

  • The artist’s name,

  • Their emoji-based rating,

  • The optional written message, and

  • A “helpful” button allowing other artists to endorse valuable feedback.

Importantly, reviews are published automatically, so labels and promoters cannot manipulate or filter them, promoting transparency and trust within the community.

From their own view, labels and promoters can publicly reply to individual reviews by clicking a “Respond” button beneath each one. These replies are also automatically scanned for offensive content before becoming visible to other users, maintaining the integrity of the system and safeguarding respectful communication on both sides.
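
A reply could be attached to the review it answers and pass through the same automated check before it becomes visible. A brief sketch under the same assumptions as the earlier snippets:

```typescript
// Hypothetical public reply from a label or promoter, attached to one review.
interface LabelReply {
  reviewId: string;
  labelId: string;
  body: string;
  postedAt: string; // ISO 8601 timestamp
}

declare function saveToPublicReplies(reply: LabelReply): Promise<void>; // assumed storage call

async function postReply(reply: LabelReply): Promise<"published" | "held"> {
  // Replies go through the same language check as artist reviews.
  if (await containsOffensiveLanguage(reply.body)) {
    return "held";
  }
  await saveToPublicReplies(reply);
  return "published";
}
```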

Outcome and Impact

The launch of the review and rating system marked an important step toward improving transparency and accountability on LabelRadar. For a feature that had never existed before, adoption was stronger than expected: within just a few weeks, artists submitted over 2,000 ratings, even though many past interactions were unlikely ever to be reviewed.

The new system quickly began influencing behaviour across the platform. Some labels and promoters with consistently poor reviews chose to leave, while others improved their communication and engagement with artists. This shift led to a noticeable increase in positive sentiment among users and helped rebuild trust in the platform’s community.

By giving artists a clear and safe way to share their experiences, the feature not only amplified their voices but also reinforced the sense of fairness and transparency that had been missing.

Two years after launch, the feature remains an integral part of the platform’s trust ecosystem:

  • Over 10,000 total reviews collected, with participation from around half of active artists.

  • Artist retention up by roughly 15%, supported by increased engagement and satisfaction.

  • Reports of suspicious behaviour reduced by about 40%, as public visibility encouraged better conduct.

  • User satisfaction scores improved by 20%, with many artists highlighting the review system as a key reason for renewed confidence in LabelRadar.

What started as a response to community frustration evolved into a sustainable feedback loop that continues to strengthen transparency, accountability, and connection between artists and labels.