by Jacob Phillips
For many young people, arts criticism has taken a back seat to social media. We’re less likely to read the opinion of a magazine staffer than to see what our friend tweeted about a new movie. This could be related to a decrease in our literacy and attention spans, as well as a rise in anti-intellectualism. Critics embody a perceived pretentiousness about art. They’re the people who read into everything, who can’t just enjoy something without thinking about it, who get passionate about stuff that’s “not that deep.” The one aspect of criticism people continue to interact with, however, is the numerical rating.
It’s become common practice to express our general feeling about an artwork or product using a five- or ten-point scale, typically symbolized as “stars.” You’d give your favorite movie 5/5 stars, while you’d give the worst 1/5. It’s a similar concept to an academic grade, where the success of a given work is wholly abstracted into data. Both ratings and grades are arbitrary, meaning their significance is determined by the person giving them. We can set our own criteria to help us determine the rating we give, but no matter how deliberately or systematically we select the number, it’s an objectively meaningless method of quantifying our experience– which is to say, it turns our emotions into numbers. Yet many of us assign a great deal of significance to these ratings, both our own and others’.
Using film as our primary medium, we can find multiple websites that foreground the rating. For the ratings of professional critics, there’s Rotten Tomatoes and Metacritic. These sites are both built on the idea of simplifying critical consensus. Metacritic calculates the average rating of multiple reviewers, accounting for differences in rating scales by converting everything to a 0 to 100 scale, so a 3/5 rating would be entered as a 60. Rotten Tomatoes is less about the specific rating and more about generalizing the sentiment of reviews as either positive or negative, its “Tomatometer” showing us what percentage of critic reviews are positive. Films that receive enough positive reviews from enough critics get a “Certified Fresh” rating, and films below the benchmark 60% are deemed Rotten.
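For a rough sense of the arithmetic involved, here is a minimal sketch (in Python, with invented review data) of how a Metacritic-style normalized average and a Rotten Tomatoes-style positive percentage could be computed. The function names, the sample reviews, and the exact positive cutoff are assumptions for illustration, not either site’s actual method.

```python
# A minimal sketch of the two aggregation styles described above, using
# made-up review data. Neither site publishes its exact methodology;
# the function names and data here are purely illustrative.

def to_hundred_scale(rating, scale_max):
    """Convert a rating on an arbitrary scale to 0-100 (e.g. 3/5 -> 60)."""
    return rating / scale_max * 100

def metacritic_style_average(reviews):
    """Average all reviews after normalizing each to the 0-100 scale."""
    scores = [to_hundred_scale(rating, scale_max) for rating, scale_max in reviews]
    return sum(scores) / len(scores)

def tomatometer_style_percentage(reviews, positive_cutoff=60):
    """Share of reviews counted as positive (the cutoff here is assumed)."""
    scores = [to_hundred_scale(rating, scale_max) for rating, scale_max in reviews]
    positive = sum(1 for score in scores if score >= positive_cutoff)
    return positive / len(scores) * 100

# Hypothetical reviews as (rating, scale maximum) pairs.
reviews = [(3, 5), (4, 5), (7, 10), (2, 4)]

print(metacritic_style_average(reviews))      # 65.0
print(tomatometer_style_percentage(reviews))  # 75.0
```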
The exact method Rotten Tomatoes uses to determine whether a review is positive is opaque, but this doesn’t much affect the average user of the site, who doesn’t care what any individual critic thinks but wants a sense of whether the movie is worth their time. Metacritic uses a color code to indicate this vague sense of quality– films scoring above 60 are green for good, 40 to 60 are yellow for mixed, and below 40 are red for bad. Though these websites are constructed around critics and their reviews, very little is done to actually emphasize their voices. Neither home page spotlights any writing; instead, you get a roster of recent releases with ratings tacked on below them. Someone looking for a weekend watch might not even see what a movie is about, just look for whichever has the biggest number attached.
This relationship between review and rating– the review treated as the more superficial expression, the rating as the clearer one– seems inverted from how a film critic would present their opinion. The critic might resent the reader who looks at their rating without reading the review, but these sites encourage us to do just that.
Both of those websites feature an aggregate user score, but if you’re after the layman’s take, you’d sooner search IMDb or Letterboxd. The Internet Movie Database is a user-edited informational site that mostly functions as a repository of film and TV facts and cast and crew credits, while also allowing users to rate and review films. Letterboxd is similar, but instead makes those user engagement tools its main feature, becoming more of a social media platform with the ability to like and comment on reviews and follow other users.
Both sites prominently feature the average user score, as well as aggregated lists of “The Best Films of All Time,” based on these ratings. They differ greatly in their presentation of the user review, however. IMDb’s user reviews are tucked down near the bottom of a given film’s reference page, below cast info, plot summaries, “similar film” recommendations, and other assorted trivia. Letterboxd, by comparison, centers the review much more, showing popular and recent reviews on the landing page, as well as the reviews of your friends if you have an account.
There are differences in the style of reviews as well. IMDb users tend to write multiple paragraphs and speak directly about their opinion of the film. A review on Letterboxd, by contrast, can take many different shapes. It could be a multi-paragraph critical analysis, or a single emoji. The most popular reviews on Letterboxd tend to be short, humorous blurbs, more like tweets than anything else. It is often unclear just from the writing how a reviewer feels about a film. While this may make it a more successful social media platform– a Twitter away from X– it again deemphasizes the value of a substantive critical opinion. Ultimately, users of both IMDb and Letterboxd end up resorting to the ratings to get a sense of a film’s quality before anything else.
These trends don’t just exist in online film spaces, either. For books, the websites Goodreads and StoryGraph function similarly to Letterboxd, the latter turning your reading habits and pages read into data graphics for your statistical navel-gazing. Music forum site RateYourMusic puts it right in the name, though its administrators do more than any of the other sites listed to feature articulate user reviews, which are the main feature of its homepage. Despite RYM’s relatively small user base, online music discourse tends to be some of the most toxically computative of all the arts. One needn’t look far to find people complaining about ratings from Pitchfork or YouTube critic Anthony Fantano, arguing over what defines a 10/10 record, or employing the similarly vapid tier list.
I’ve grown concerned about the function of the rating in the past couple of months, but I’ve been engaging with the system for years. I’ve been a fan of film and a Letterboxd user since 2018. Its rating system allowed me to systematize my watching habits. I made lists of the best films of each decade based on the ratings I gave them; anything I rated 4/5 or higher would go on the list. I even went so far as to make a spreadsheet of these movies. The rating wasn’t the only important part of my engagement, however. For every new film I watched, I tried to write at least a paragraph explaining my opinion of it. Over time, that’s proven the more beneficial practice. As a writer, it’s been a tool for honing my craft and has helped me write better critical essays for my classes. As a film fan, it’s helped me understand my tastes and opinions and bring ideas to conversations that people have appreciated.
What hasn’t helped me are the ratings. Every year, I got a little more obsessive about the nuanced differences between ratings. I ranked directors by my average ratings of their films. I’ve had real-life conversations about movies where all we said was “I give it x rating.”
Here at Pratt, I’ve had film-major friends who put their work on Letterboxd, and I’ve written reviews of their films. I’ve abstained from rating them, however. It seems to me the best thing I can do is write as well as I can about their work and leave out the vapid numeral, and my friends have expressed their appreciation for the reviews. This made me ask myself: Why do I still bother with this number game when I can actually say what I want with words? For the lists? The spreadsheets? I don’t think there’s a good answer.
These days, I don’t rate the movies I watch; I just write the reviews. In fact, I’ve been trying to remove all of my ratings from Letterboxd, but there’s no way to do so en masse, which means going through all 2,000-some-odd ratings on my profile and unrating them one by one, and deleting the ratings attached to previous reviews (of which I have about 1,500) would be an additional step.
I want to get away from ratings, but they’re all over the websites I use. I want to enjoy movies by talking about them, but many people let the number speak for them. Ratings are just another way of avoiding nuance, along with memes, infographics, and like buttons: the tools of our rhetorical degeneracy. These days, I want opinions straight from the speaker’s mouth. But given the prevalence of these tools, and the general lack of interest in discussing art seriously, I don’t have high hopes of the sentiment spreading.