Let's Talk About Music Rating
Whether it's Film, TV or Mainstream Music, they're all built on popular opinion. They're also built on critical opinion. Sometimes the lines are blurred between the two, but at the end of the day, every writer, director or musician is creating their work for somebody to critique. Hence why one of the branches of this here site is aptly named "Opinions & Critiques".
But this isn't about the critiques you might expect. Of course, I am talking about rating music.
I like to air out my opinion, that is no secret. But if you have ever looked around this site and wondered why I don't rate music beyond my Top 10's, it's because I'm not intellectually equipped to really dive into rating albums like that. Films & TV shows, yes. Music, no.
There's a subtle art to music criticism. The ability to break down somebody's art really is something unique. The Film & TV criticism you see on websites & in papers is very simplified. The only time you will see Film & TV criticism broken down the way albums are broken down is in academic essays.
I say all of this to say that the rating systems music critics use suck.
5 stars, 10 stars, out of 10, out of 5, letter grades: it doesn't matter how you slice it, it's a naturally imperfect science. Add the fact that every credible newspaper & website has several critics, and it's not even a science at that point. It's just one human perspective, and whether that particular person agrees with you or not.
Because that is usually what it comes down to, right? Whether you agree with the critic or not. Which is part of the reason why I wanted to talk about this.
When Cardi B's "Invasion of Privacy" dropped, I saw people on social media sharing the review by US music publication Pitchfork, which gave her album an 8.7. The reasons people called out this particular review I'll leave for you to interpret, but look at the albums they consider worse than Cardi's:
See what I'm on about? When I initially saw this, I was like the majority in thinking, "How the hell did they manage that?!". I still think that honestly, but surely things like this are bound to happen if you sift through other publications? Especially since two of the albums above are over a decade old.
Look at the reviews that The Source did for Hip-Hop albums back in the 90's. They get props for correctly calling Nas' "Illmatic" a future classic but past that? They were pretty sloppy.
You see this a lot if you keep up with this sort of thing. The main reason Pitchfork gets the heat is that they're very specific with their ratings, as you can see. So it seems the less specific you are, the less chance people will call you out. It is probably why "out of 5" is the favourite amongst Film & TV critics.
But I was thinking: how about refreshing the system a little? Making it a little more regimented? Critics love to get into the nitty-gritty of it all. Some go track by track. So I propose an "out of 10" system where you, as a reader, can take in the score more simply and have a better chance at understanding the critic's rationale.
So here is my proposal. Since this is a Hip-Hop-centric site, I am only thinking about this through the prism of Hip-Hop and what a fan of Hip-Hop looks for in an album. I don't know what critics look for in an EDM album, for example. So just Hip-Hop here.
I call it:
"10 into 2's"
I want to make the score more meaningful. So let's split the overall rating into five mini-ratings. Here are the criteria, each worth 2 points:
Lyrics - How good are the lyrics? Are they complex? Do they have meaning? Are the punchlines clean?
Production - Do the beats "slap" or "go hard"? Do they complement the vibe of the work?
Message - What is the artist telling us? Are they trying to make us think or simply get us jumpin'?
Hype/Delivery - Did they have a two-month hype train, or just drop it out of nowhere? Did it live up to the hype?
Longevity/Influence/"Classic" status - Is it remembered? Was it a cornerstone in the new era of Hip-Hop?
With those categories set, the critic would then go through them, with the highest rating for each being 2. Decimals like "1.2" count, and with this, the overall score would be tighter in its rationale. This is similar to the system performance-shoe reviewers use, as they go into detail on everything a shoe is: materials, cushion, support, traction, etc. We do the same with an album.
The last one is the special one. While the others are immediate reactions, the last criterion cannot be allocated to the final score until the body of work is 10 years old. This gives the critic an incentive to really look back at their own review and at the album again. We may love an album for a month or a year, but how many albums can you successfully blast 10 years after you initially listened to them?
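The arithmetic of "10 into 2's" can be sketched in a few lines of code. This is just a hypothetical illustration: the five categories and the 10-year hold-back on Longevity come straight from the proposal above, but the function name, the idea of a "provisional" score out of 8, and all the example numbers are my own.

```python
# Sketch of the "10 into 2's" scoring: five categories, each worth
# up to 2 points, summing to an overall score out of 10.
CATEGORIES = ["Lyrics", "Production", "Message", "Hype/Delivery", "Longevity"]

def overall_score(ratings, album_age_years):
    """Sum the five mini-ratings (0.0 to 2.0 each, decimals allowed).

    The Longevity score is withheld until the album turns 10 years old,
    so younger albums get a provisional score out of 8 instead.
    """
    for name, score in ratings.items():
        if not 0.0 <= score <= 2.0:
            raise ValueError(f"{name} must be between 0 and 2, got {score}")
    if album_age_years >= 10:
        return sum(ratings.values()), "final (out of 10)"
    provisional = sum(s for n, s in ratings.items() if n != "Longevity")
    return provisional, "provisional (out of 8)"

# Example: a fresh release scored 1.8 + 1.5 + 1.2 + 1.7 on the first
# four categories; Longevity is pencilled in but not yet counted.
score, status = overall_score(
    {"Lyrics": 1.8, "Production": 1.5, "Message": 1.2,
     "Hype/Delivery": 1.7, "Longevity": 2.0},
    album_age_years=0,
)
print(round(score, 1), status)
```

Once the album hits the 10-year mark, the same ratings (with Longevity revisited) would roll up into the final score out of 10.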
Now, this is only an idea I wanted to throw out there. I say again, I'm not a music critic, but I would like professional music critics to make an effort to make their thoughts friendlier to a music fan like me. I think that's why I personally watch more YouTube reviewers: they can express their opinions better than a Pitchfork review, in my mind. But like I said at the beginning, everybody has their opinion. It's just about the level of detail people can go into when talking about somebody else's body of work, and sometimes the explanation is unsatisfactory.
If I rated albums, that is how I'd do it.