A statistical analysis of Rotten Tomatoes (statsignificant.com)
186 points by m463 14 hours ago | 98 comments
jackero 8 hours ago [-]
I find RT scores very accurate, but not as a raw number.

What I mean is that a 70% score is meaningless to me on its own. I need to know the movie's genre, the audience score, and the age of the movie, and then I basically do a "lookup table" in my head. I have that lookup table because I've looked up every movie I've watched on RT for 15 years, so I know how the scores correlate with my own personal opinions.

As an example: the author said that critic scores should align with audience scores but no that’s not true at all. Critics tend to care more about plot continuity, plot depth and details while the audience tends to care about enjoyability. Both are important to me so I always look at both scores. That’s why a lot of very funny comedies have a 60-69% critic score but a 90%-100% audience score — because it’s hilarious but the plot makes no fucking sense and has a million holes. And if you see a comedy with 95% critic but 70% audience, it will be thought-provoking and well done but don’t expect more than occasional chuckles.

kelseydh 14 minutes ago [-]
I generally find IMDB user scores far more reliable and granular. There is a distinct difference in quality between a movie that gets a 6.x rating (okay), a 7.x (great), and an 8.x (a Top 500 of all time).

Metacritic is the next most useful, while Rotten Tomatoes is easily the least useful in the information it provides. High critic and user scores often do not provide a good barometer of just how good the film actually is.

waerhert 5 hours ago [-]
My immediate thought after seeing the first chart was that it is inversely related to my own experience with movies in the last 20 years. Maybe there's an idea in there for a 'score normalizing' browser extension.
SwtCyber 6 hours ago [-]
Rotten Tomatoes becomes way more useful once you treat it as a tool rather than a verdict
theshrike79 3 hours ago [-]
In Rotten Tomatoes there's always That Guy who's being contrary for the sake of being different.

Like Paddington and Paddington 2 had 100% review scores for a long time, until some "reviewers" disliked them on purpose, bringing Paddington 2 down to 99%.

Using multiple sites as an aggregate works. In IMDB you need to check the vote distribution graph and in your mind take out all the 1's and 10's and see where the average/median lies after that.

And it's important to find actual reviewers whose taste aligns with yours and use them as more directed guidance.
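That mental trim is easy to approximate; here's a quick sketch with made-up vote counts (not real IMDB data):

```python
from statistics import mean, median

def trimmed_stats(ratings):
    """Drop the 1s and 10s, then look at the centre of what's left."""
    core = [r for r in ratings if 2 <= r <= 9]
    return mean(core), median(core)

# Made-up distribution: 1-star review bombs and 10-star astroturf
# around a core of honest 6-8 votes.
votes = [1] * 40 + [10] * 60 + [6] * 100 + [7] * 300 + [8] * 200

avg, med = trimmed_stats(votes)   # avg ≈ 7.17, med == 7
```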

taskforcegemini 6 hours ago [-]
same is true for product reviews in online shops
Jarmsy 5 hours ago [-]
I often enjoy movies that are unexpected and don't fit neatly into one established genre, but I think these tend to get lower audience ratings, while films that deliver to expectations do better, even if most of a randomly selected audience would dislike them. If a movie is a comedy, with a poster with big red letters and a white background, people know it's a certain kind of movie, and mostly those who enjoy those movies will go see it. Likewise with documentaries about some niche interest - those who watch it mostly sought it out because they're into that.
Finbel 8 hours ago [-]
Always wondered why Rings of Power has an 84% critics score but just 49% from the audience.
prof-dr-ir 3 hours ago [-]
One important factor is that the critics score is binary in a sense: if all critics agree that the movie was "passable but not great" then Rotten Tomatoes still gives it a 100% critics score.

The website explains it clearly enough I would say.
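A toy illustration of that binarisation; the 3/5 "fresh" cut-off used here is the commonly cited one, and RT's exact rules are more involved:

```python
def tomatometer(star_ratings, fresh_threshold=3.0):
    """Percent of reviews at or above the 'fresh' cut-off (5-star scale)."""
    fresh = sum(1 for r in star_ratings if r >= fresh_threshold)
    return 100 * fresh / len(star_ratings)

passable = [3.0] * 20               # every critic: "fine, nothing special"
divisive = [5.0] * 14 + [2.0] * 6   # loved by most, hated by a few

print(tomatometer(passable))  # 100.0 - unanimous mediocrity looks perfect
print(tomatometer(divisive))  # 70.0 - the better-loved film scores lower
```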

moi2388 8 hours ago [-]
Because critics get paid, whilst audience have to pay.
echelon_musk 5 hours ago [-]
> Report says PR firm has been paying Rotten Tomatoes critics for positive reviews (screengeek.net) 254 points by mc32 on Sept 7, 2023

https://news.ycombinator.com/item?id=37419427

dandellion 6 hours ago [-]
I didn't like the idea that my money had paid for such a disservice of my favourite book, so it pushed me to cancel my Prime subscription that had been ongoing for years. I don't buy nearly as much on Amazon these days as a consequence.
Gareth321 2 hours ago [-]
I rarely get angry about bad content but RoP felt like a personal affront. I love Tolkien's world and the people who put RoP together did so with not just ignorance and incompetence, but some kind of malice. They intentionally butchered Tolkien's writing and world. This stands in such stark contrast with Peter Jackson's position that it is not his right to inject his personal values and narcissistic hubris into the movies. He chose to honour the material as best he could while adapting it. It is, without any shadow of a doubt, the better approach.
nottorp 52 minutes ago [-]
> but some kind of malice

Never attribute to malice what can be adequately explained by incompetence :)

I bet the RoP team are great content creation professionals. They obey all the rules of their craft.

They also do not care about the material at all, otherwise they'd be script writers and directors, not content creators.

pauke 2 hours ago [-]
Critics often score a series based on the first few episodes released to them, and never revisit the score. And if it's shiny/expensive (and RoP was both) and seems like it might lead somewhere, they risk ridicule by being too critical.
jv22222 2 hours ago [-]
Please list a few more insights like this for picking good movies, thanks!

Additionally, I think someone could build an interesting RT browser based on these kinds of insights.

rubzah 3 hours ago [-]
High critic score / high audience score = Good

High critic score / low audience score = Paid-for hype, or politically motivated reviews

Low critic score / high audience score = Possibly a good movie

Low critic score / low audience score = Bad

i_love_limes 2 hours ago [-]
I have to disagree with your take on "high critic score / low audience score". There is a swathe of more challenging, experimental, or art-house movies that fall into this category. Critic reviews fill the void where an audience-only site like IMDb falls short.
stared 2 hours ago [-]
Low critic score / high audience score = Bad, but enjoyable for the masses
Gareth321 2 hours ago [-]
Maybe we need to define "bad," because I would argue an enjoyable movie is good. Movies don't need to be avant garde to be good. They just need to be entertaining.
liveoneggs 3 hours ago [-]
this feels like an interview question
rdedev 3 hours ago [-]
Here is a better heuristic:

High critic score / low audience score = Avant garde type films. Might go over your head

Low critic score / high audience score = Maybe fun but forgettable movie

testdelacc1 2 hours ago [-]
The last comedy that I saw that matches your description is American Fiction. It didn’t feature too many laugh out loud moments, but it was thought provoking and well done. And yet, 93% from critics and 95% from audiences.

I wonder if audiences can appreciate these movies more than you give them credit for?

Let’s try a few more

- Death of Stalin (94%, 79%) has the pattern you’ve predicted.

- O Brother Where Art Thou? (78%, 89%) has the opposite of the pattern.

- Grand Budapest Hotel (92%, 87%) was appreciated by both, like American Fiction.

I’m just not seeing a pattern here. Looking at comedies that fit your description the critics and audience scores don’t follow a predictable 95%, 70% pattern.

MrMember 1 hours ago [-]
Death of Stalin is one of the funniest movies I've ever seen. It's been a long time since I laughed so hard at a movie or TV show.
testdelacc1 30 minutes ago [-]
And I laughed my ass off watching American Fiction. These are funny movies! Just a different vibe from say Talladega Nights.
atoav 31 minutes ago [-]
Additionally, there are movies that just have something unique to them that a niche audience may love, but both critics and the general audience treat them more harshly.

The truth is that other people's opinions may or may not be a good proxy for your own taste in movies, even if they were uncorrupted and independent.

baxtr 6 hours ago [-]
What I find weird is that no one has solved the "people like you also liked this" problem for ratings/reviews.

All ratings on these platforms are average values through the entire cross-section of people.

Yet I am sure there are people who have a taste very similar to mine. I want to read their reviews and see their ratings and recommendations.

Social media platforms do that pretty well these days.
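A minimal sketch of that matching idea: score other users by rating similarity over co-rated titles, then read the nearest neighbours' reviews first. Names and ratings here are invented, and real systems usually mean-centre ratings and use far more data:

```python
import math

def similarity(a, b):
    """Cosine similarity over the movies both users rated."""
    common = set(a) & set(b)
    if not common:
        return 0.0
    dot = sum(a[m] * b[m] for m in common)
    norm = math.sqrt(sum(a[m] ** 2 for m in common)) * \
           math.sqrt(sum(b[m] ** 2 for m in common))
    return dot / norm

me = {"Paddington 2": 9, "The Death of Stalin": 9, "Red One": 3}
others = {
    "user_a": {"Paddington 2": 8, "The Death of Stalin": 9, "Red One": 2},
    "user_b": {"Paddington 2": 3, "Red One": 9},
}

# Read reviews from the most similar raters first.
ranked = sorted(others, key=lambda u: similarity(me, others[u]), reverse=True)
```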

mlinhares 48 minutes ago [-]
A Netflix recommendations engineer explained this quite well: people lie in their reviews, so reviews mostly don't matter. What matters is watching habits; those clearly show what you really like, instead of the imaginary person who rates movies they'd never watch.

So the actual market for something that recommends like that is quite small.

conception 2 hours ago [-]
https://www.criticker.com is the best I’ve seen for this. You rate movies on a relative scale, and then they match your ratings to people with similar tastes and recommend based on that. So if you rate period westerns highly, they’ll see what other movies were rated highly by people who also rate period westerns highly. It’s actually pretty genius.
xandrius 6 hours ago [-]
I guess because such a tool starts with you having to input a ton of data before it becomes useful. Either people don't do that, or, if they are willing, the platform ends up with lots of valuable data and wants to keep you feeding it to grow its trove before selling off to Amazon or Google.
Hendrikto 4 hours ago [-]
I already rated hundreds of movies and TV shows on IMDB, for example. This could be used as a basis.
fzeindl 3 hours ago [-]
http://www.gnovies.com has existed since 2002.

(Other media: http://www.gnod.com)

ZoomZoomZoom 3 hours ago [-]
Rateyourmusic.com / Sonemic added movie scores/reviews a long time ago. You can follow people, and their scores will be visible to you.
adammarples 1 hours ago [-]
They kind of have; that's how Netflix's and Spotify's "recommended for you" features work.
Hendrikto 4 hours ago [-]
There is Tastedive, which has given both great suggestions as well as recommended utter garbage to me in the past. Very hit or miss, but when it hits it hits.
uoaei 7 hours ago [-]
Plex shows you both critic and audience scores from RT (IMDB also), and they indeed diverge consistently along the lines you suggest. In general I trust the audience scores a lot more, because I'm trying to have fun watching movies rather than analyze their plot/pacing/cinematography/etc.

The audience can be trusted to know how to have fun. The discrepancy between critic and audience scores is also a valuable signal for judging how fun campy/schlocky B-movie horror films are, particularly those from the 80s and 90s.

eastbound 8 hours ago [-]
Critics have a political agenda; they overrate movies with “a message”, the message always leaning Californian. The movie industry is a massive sector with lobbies, and paid critics are no strangers to that.

And as the sibling says, the audience pays to see a movie. The audience, the people, are more politically balanced. There is no bias or selection: it’s the democratic component, including people that the “in” lobbies don’t like.

If only we could get rid of this damn audience!

jibal 5 hours ago [-]
If I weren't already well familiar with the diverse critic reviews on RT, claims that the critics are "woke" (or equivalently, have a "Californian" "political agenda" that "overrates" movies with a message) would be reason for me to value their views over self-selecting "audience" reviews, which I find to be mostly shallow and uninformed, and with a good dose of provincial bigotry as part of the "political balance". I personally am not looking for "political balance", certainly not as that currently manifests itself in the U.S.

And if paid critics are no stranger to lobbies (or the movie industry as a massive sector with lobbies ... it's a bit hard to parse), I see no particular reason to expect them to have a political agenda that overrates movies with a message--I don't think those are the ones that make big bucks for the massive sector. (I'm more interested in indie fare, or at least stuff with more character and depth and less CGI and juvenile superheroes vs. supervillains.) Much more likely is that this spew reflects a political agenda.

vintermann 3 hours ago [-]
I thought "Californian bias" was a great term precisely because it isn't quite the same (or as shallow) as "woke". How could the movie industry not have a Californian bias? So much of it is made in that very peculiar culture, peculiar even by American standards.

And yet if you hated that sort of thing, why (or how?) would you become a movie critic? Can you imagine being a classical music critic and intensely disliking Vienna? (Another damn peculiar, damn influential culture, by the way).

Gareth321 1 hours ago [-]
I agree. It is clear and self-evident that movie critics have a California bias. I cite Emilia Perez (https://www.rottentomatoes.com/m/emilia_perez), with 71% of critics recommending the movie. This movie won *91 awards* this season. It is, by any objective or subjective metric, an atrocious film. Audiences gave it 17% on Rotten Tomatoes and 5.4 on IMDb. Why did this movie win so many awards and positive reviews from critics? Because it has a trans person as the lead. That's it. The bias is on full display with this movie.
vintermann 30 minutes ago [-]
But that is more about "woke" than California. My point was that California is peculiar in far more complicated ways than merely being more trans-positive. Arnold Schwarzenegger doesn't seem especially "woke" to me, but he seems very California. Scientology is hardly "woke", but it's very California. Steve Jobs, same. Utterly weird culture, if we hadn't been so extremely exposed to it. We think of so many things as normal, even though they're not normal at all in our actual lives where we are, but they're normal in California. (Well, more normal). That was Vienna too. It goes way beyond a simple culture war dichotomy.
actionfromafar 7 hours ago [-]
Your OP just gave examples of bias.
Hendrikto 4 hours ago [-]
Obviously individuals still have preferences and biases. The idea is that they are less aligned than critics, so they average each other out somewhat.
rurban 6 hours ago [-]
> Humanity Has Stopped Producing Bad Art: After a century of trial and error, mankind has perfected the art of cinema, as proven by recent masterworks like Cats, Space Jam: A New Legacy, the live-action Snow White, Red One, and Joker: Folie à Deux. Critics, who were once joyless automatons thriving on takedowns of human creativity, now bask in this golden age of moviemaking, lavishing praise upon the timeless artistry of The Walt Disney Company and Warner Bros. Discovery.

This really should appear in professional film reviews.

neilv 9 hours ago [-]
> To account for this influx of reviewers, Rotten Tomatoes has created a "Top Critic" designation reserved for established media outlets, such as The New York Times and The Atlantic. However, this label has no special bearing on a film's top-line Tomatometer score and is largely incorporated into ancillary aspects of the site.

Just last night, I noticed that I could access the two percentage scores for critic reviews.

If you go to "https://www.rottentomatoes.com/m/the_dilemma", and click on the critic reviews percentage (25%), you get a popup that lets you select between seeing the All Critics score (25%) and Top Critics score (28%).

(And if I'd thought to check Rotten Tomatoes first, when selecting what looked like a fun light comedy on Netflix, I wouldn't have wasted an hour of my life before I said WTF, checked RT, and continued to be in a bad mood.)

Incidentally, I'd love to have the Tomatometer score integrated into my UI for video streaming services. The services seem to prefer their own very generous scoring instead. (When they show any score at all. Some like to suppress the ratings for new shows they produced, presumably to avoid shooting down their own poor shows before people watch them by default.) Rotten Tomatoes is a much better predictor of how I'll like a show than the streaming service scores are. But maybe the streaming services don't want to expose that the majority of the movies and series offered at any time now range from mediocre to outright bad.

bruce511 9 hours ago [-]
>> But maybe the streaming services don't want to expose that the majority of the movies and series offered at any time now range from mediocre to outright bad.

There is no "now" necessary in that sentence.

All media production in all eras is mostly terrible. Music wasn't better in the 80s, or 70s, or 60s; it's just that the 80s music you hear today is heavily curated down to the good stuff.

It seems like streaming has made it worse, but only because you're watching so much more. In the past movies took effort to watch. You went to the cinema, or video shop. By the time they made it to TV they were curated, or at the very least you knew about them.

There was plenty of dross that went direct to video and never made it to cinema or TV. (In 1989 I lived for a year at a place with no broadcast TV. We watched 2 videos a night from the local blockbuster-type store. They had a LOT of very crap movies.)

To blame streamers for delivering a lot of mediocre content is to miss the root cause. Most new content is mediocre. Or bad. It has always been the way. Streaming just makes it easier to watch.

freddie_mercury 8 hours ago [-]
Rotten Tomatoes licensing for a major streaming platform would probably be millions of dollars a year.

https://www.reddit.com/r/webdev/comments/4649rw/comment/d03a...

michaeljx 9 hours ago [-]
I do my movie selection via Plex, which has both Rotten Tomatoes and IMDb scores in the movie description.
rainingmonkey 2 hours ago [-]
Jellyfin also has this
brabel 8 hours ago [-]
You guys are opening yourselves to manipulation. Why not just be open to surprises? That’s what I do. Most of the time it’s bad surprises, but the occasional masterpiece that everyone else seems to hate makes up for it. I do seem not to share the taste of most people writing reviews, though. I almost always find other people’s opinions ludicrous.
maximus_01 8 hours ago [-]
Power to you, but you probably miss out on masterpieces in the other direction: the ones you would be more likely to find if you spent your limited time watching those 'conventionally' liked movies.
incone123 7 hours ago [-]
I read some of the IMDb reviews and a person making a case for a film with a low score can tip the scales for me.
FuturisticLover 4 hours ago [-]
I find IMDb better than RT. RT, even though it has an audience score, tends to give priority to the critics' score, which in most cases we know is influenced one way or another.

The IMDb score doesn't rely on a small group of people but on all users. It may give a little more weight to US users, but that's fine. Its top movies and TV shows make a lot more sense, unlike RT's.

cantor_S_drug 4 hours ago [-]
Tangentially related: it is possible to deanonymize users from a Kaggle dataset or the Netflix Prize competition.

https://medium.com/@EmiLabsTech/data-privacy-the-netflix-pri...

Compared to the example of the medical records, Netflix had been very careful not to add any data that could identify a user, like zip-code, birthdate, and of course name, personal IDs, etc. Nevertheless, only a couple of weeks after the release, another PhD student, Arvind Narayanan, announced that they (together with his advisor Vitaly Shmatikov), had been able to connect many of the unique IDs in the Netflix dataset to real people, by cross referencing another publicly available dataset: the movie ratings in the IMDB site, where many users post publicly with their own names.

https://www.cs.utexas.edu/~shmat/shmat_oak08netflix.pdf

https://courses.csail.mit.edu/6.857/2018/project/Archie-Gers...

Gareth321 1 hours ago [-]
I agree. I stopped relying on RT scores years ago. There are far fewer occasions when the audience score doesn't align with my preferences. Today I am so far away from most critics in taste and preference, it's like they live on another planet.
daft_pink 12 hours ago [-]
I will say that back when I used to go to the theater, which was before the pandemic and before I started a family, I used Metacritic.

I found that any time I went to something that was red, I absolutely regretted it and it was terrible. Yellow was more hit or miss, and top green scores were pretty good.

Exceptions were comedy where a lower score could still mean a good film, and politics oriented films, where a bad film with a media approved message could get a really good score even if it sucked.

It’s sad not to have a reliable indicator like that; someone should just resurrect the old score and call it Bad Apples. Since the actual scoring seems transparent, why not develop a competitor?

gboss 9 hours ago [-]
Theaters still exist; you should go back. I go two or three times a month. If your kid is 4 or older they’ll have a great time. It’s good and healthy to get out of the house!
SwtCyber 6 hours ago [-]
Some genres just don't track well with critic metrics
mxxx 12 hours ago [-]
Regardless of the introduction of sycophantic reviewers, the 3/5 = fresh thing has always been a pretty half-assed threshold imo, and the fact that a film can be "100% fresh" on RT on the basis of every single reviewer saying "yeah, it's nothing special but it's fine, 3 stars" is fairly easy to misinterpret.
padraigfl 6 hours ago [-]
I think it's fine as a metric if you read it correctly.

100% means a film is extremely agreeable with whatever audience it has managed to get to. For major releases this can ultimately mean it's actually lacking anything particularly bold or interesting. This results in things like Frost/Nixon or Knives Out having higher ratings than broadly acclaimed films like Mulholland Drive or even There Will Be Blood; I know which ones I'd be more likely to put on with my extended family even if I don't especially like them.

But yeah, it's amazing how many people still don't grasp it after decades of getting angry about it.

pbsds 12 hours ago [-]
a mean rating of 3 can only be 100% fresh if the variance is 0
bhaney 11 hours ago [-]
That is what "every single reviewer saying '[...] 3 stars'" means, yeah.
og_kalu 10 hours ago [-]
Yeah and his point is that's never going to happen lol. People bring up the 100% point a lot and it's a bit silly because a movie with a significant number of ratings is never going to have that kind of distribution.

That's why it's always a hypothetical, never backed with actual examples. It's one of those things that sounds plausible until you look at the numbers. Movies close to 100% have pretty high average scores, and movies with a majority of 3/5s are nowhere near 100%.

Yeah 100% for RT doesn't mean 10/10, but that's it.

foobarqux 9 hours ago [-]
It’s just not true in practice: it’s pretty typical to find films with high Rotten Tomatoes scores and not very high Metacritic scores; Rotten Tomatoes scores are pretty much useless unless you are not very discerning.

Examples: Sovereign, How to Make a Million…, Count of Monte Cristo, etc.

chongli 11 hours ago [-]
Wouldn't we expect the most truly mediocre movies to have the lowest variance in opinions?
throwup238 9 hours ago [-]
That might be why the Marvel movies score so highly.
kjkjadksj 11 hours ago [-]
Maybe the rating system should be changed from fresh and rotten into clonally propagated and heirloom to reflect this nuance
zdc1 7 hours ago [-]
Yeah, I'd love a personalised Tomatometer. I'd only get out of bed for something 4 stars (8/10) or above, so I'd love to know the percentage of audience/reviewers that scored like that.

This would also give "cult classics" and interesting/creative films that are more love-it-or-hate-it a bit more of an edge in ratings over the lukewarm Marvel slop we see these days.

cubefox 12 hours ago [-]
Yeah. Pixar movies are often close to 100%. IMDb ratings are usually far more reasonable.
seemaze 10 hours ago [-]
I just attend my local independent cinema. Sometimes I’m blown away. Sometimes the film sucks, but there are only 3-4 options at any given time. Simplifies the decision, and at the very least, they serve beer.
ildon 10 hours ago [-]
I only rely on IMDb scores, and they're still reliable if enough time has passed since the movie's release. A movie averaging above 6 is usually enjoyable, while below 6 it's not worth watching. Over 7 is usually very good; over 8 is a masterpiece.
mjamil 10 hours ago [-]
It’s interesting that people pay close attention to one-size-fits all number (regardless of the pros and cons of the methodology used to derive said number). I find RT really useful for collating the reviews from “top” reviewers in one place: over the years, I know how my interests align with the tastes of particular reviewers, and I don’t have to look in multiple places to get a snapshot view of their opinions.
cainxinth 1 hours ago [-]
I’ve always preferred Metacritic because it assigns a weighted average. RT only tells you the “approval rate,” the percentage of favorable ratings. I don’t care if 100% of critics and fans agreed it was better than total crap.
s_dev 6 hours ago [-]
I've always found IMDBs rating to be far far better.

Anything that's 7+ is generally good; anything below that is flawed. The Tomatometer just comes off as random and an unreliable indicator for me.

bazmattaz 5 hours ago [-]
Yes, I was going to say something similar. I often use a blend of the IMDb score and Rotten Tomatoes to judge whether I should watch a movie.

The only thing I will say is that IMDb scores are also (likely) gamed by movie studios to artificially inflate them. You also need to be careful with certain genres and series on IMDb that attract positive ratings from the archetypal IMDb user. This is why Marvel movies and such receive such sky-high ratings.

Basically, if the movie is not a mainstream superhero-type blockbuster full of CGI, you can use the IMDb score in your judgement.

retox 13 hours ago [-]
Obvious to anyone using the site or aware of its ratings, including the author, but it is good to see some analysis as evidence.
DantesKite 7 hours ago [-]
Highly recommend MovieLens if you have eclectic or niche movie tastes. It can be a bit of a nerd-snipe though. One of my favorite activities is rating movies and watching the predicted ratings (what rating it thinks I'll give a movie) update overnight.

At the very least, it has a better-than-average chance of predicting which movies you will like.

firtoz 3 hours ago [-]
There needs to be a "score from people who produce scores like me" for everything. E.g. for coffee shops I really don't care if the staff are rude but I care if there are tables and it's clean and the WiFi is good.

I was thinking of granular scoring, e.g. having people rate WiFi, rate cleanliness, etc., but that won't work because people are lazy. So some kind of Amazon-style "people who buy similar things" algorithm, but for movie, cafe, etc. ratings.

To bring it back on topic: for movies I care about a good story and visuals, but not so much how "woke or not" it is; I prefer intelligent over dumb, except if it's a comedy, then dumb is fine; and so on.

hermitcrab 2 hours ago [-]
I am very confused by the second chart (the bar chart). Shouldn't that be a scatter plot?
sirnicolaz 9 hours ago [-]
I still miss the times when I would decide to watch a movie based on the cover at the VHS store or based on a recommendation. Much faster, way more serendipity.
bmacho 1 hours ago [-]
Netflix shows movie posters and Reddit has recommendation threads.
qwertytyyuu 12 hours ago [-]
Would there be some selection bias as well? As info about movies becomes readily available, the people who go see a movie will generally have decided that they would probably enjoy it, and so write favourable reviews.
mxxx 12 hours ago [-]
If they're actual movie reviewers then their job is to go see films regardless of whether they think they'd personally enjoy them. Some of the best reviews come from reviewers who have to go and see something they absolutely hate and would never go see for their own entertainment.
nzealand 11 hours ago [-]
Fun fact. You can click into Popcornmeter and see verified audience reviews versus all reviews.

E.g. the audience who actually went to see 2025's Snow White loved it. Those who haven't seen it hated it. Who is more biased?

garyclarke27 5 hours ago [-]
Interesting, over the past few years, I've ignored the critics score and just looked at the audience score - this explains why.
fifticon 8 hours ago [-]
Audience scores, as I view them, are very much an indicator of "did the marketing connect with the intended audience?"

A great movie that met the wrong audience will reasonably get a low score.

jamiek88 7 hours ago [-]
That’s a pretty good point. Lots of films find an audience post marketing phase and have a long tail. If the marketing phase had connected better their tail might be shorter.
SwtCyber 6 hours ago [-]
What really gets me is how effective the "Certified Fresh" badge still is, even for people who know the system is flawed
defrost 9 hours ago [-]
There are many metrics that can be sampled, many challenges to normalization.

eg: https://ext.to/browse/?sort=seeds&order=desc&cat=1&q=2019

is a listing of 2019 movie torrents ranked by seeds (number of clients holding full copies of a torrent version).

A normalization challenge is to group torrent variations (1080p rips and 720p rips and WEB-DL's and BluRay and etc.) and tally up and rank interest in various films over time.

Clearly Ne Zha (2019), a Chinese Animation, Fantasy, Adventure movie was a global pirate star of that year .. should it be "normalized" by population of country of origin to smooth out the home team having a billion+ in population "bias" ?

One advantage of ranking films by year and pirate copies is that it provides a pragmatic measure of "staying power".

https://ext.to/browse/?sort=seeds&order=desc&cat=1&q=1964

My Fair Lady, Dr Strangelove, Rudolph the Red-Nosed Reindeer, Mary Poppins, A Fistful of Dollars, and Goldfinger are still being hoarded 60 years after their release.

* https://www.themoviedb.org/movie/615453

* https://www.rottentomatoes.com/m/ne_zha

cubefox 2 hours ago [-]
I would like to see the movies with the largest discrepancy between critic and audience score.

Edit: An example is Star Wars - The Last Jedi. https://www.rottentomatoes.com/m/star_wars_the_last_jedi

Critic score: 91%

Audience score: 41%

baronblackmore 6 hours ago [-]
So, the best movie is the one we sell
Joel_Mckay 8 hours ago [-]
In my experience of film reviews, Rotten Tomatoes' high positive scores are not always representative of how entertaining some content will be. However, the negative skew is almost always accurate for how bad something will be for all viewers.

The bimodal distribution of professional critics versus community opinions obviously describes what is happening behind the data. Recycled astroturf for 1980s cult films has little appeal to modern viewers, even with maximal pandering to nostalgia.

Good Hollywood writers likely starved to death and were replaced with LLM-interpreted Nielsen Media Research data. Most video games offer better writing now... lol =3

nebula8804 10 hours ago [-]
It's absolutely hopeless. The rankings are gamed because Rotten Tomatoes still carries some semblance of credibility (although in this post-Trump-winning era, sources of credibility can get eroded more quickly).

It does not get clearer than when a political movie comes out. 2018 was an interesting year, two movies came out that really allowed me to get a clearer picture of what was going on: "Knock Down The House" and "Death of a Nation".

When "Knock Down The House" (a documentary featuring the left-wing US politician AOC) came out, I got interested in scraping the data off Rotten Tomatoes and studying it.

Before making any moves, I first watched the movie for myself in a theater (and also got to see a live Q&A with the director to understand her thought process).

At the time, the movie had a 100% rating from critics and ~80% from viewers. After watching it, I concurred with the viewer ranking but felt the critic ranking was unusually high. Seriously? 100%? (It has since gone down to 99%, but still.) As for the viewer ranking, I conceded that I was probably biased, which is why I also ran this experiment on "Death of a Nation" (which I also saw in a theater, with only one other moviegoer in the room).

"Knock Down The House" eventually got featured on Tucker Carlson about a year after release (I think it coincided with Netflix making it free on YouTube). I watched in real time as the audience score kept going down and down and down to where it is now (11%). Dumping the scraped data, I ran a simple analysis and discovered that a large portion of the raters left no comments, or only simplistic ones about how stupid AOC is, and many had no ratings other than this movie or the one featuring Ilhan Omar (another politician hated by the right).

For "Death of a Nation", the scores were flipped: a whopping 0% critic score (12 reviews), while the user score stood at a respectable 87% (again, at the time I did my scraping, and again with tons of one-movie reviewers). Yeah, the movie royally sucked and was painful to watch, but 0%? That was a bit fishy. This essentially killed any credibility Rotten Tomatoes had with me.
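For what it's worth, the check I ran is roughly this kind of thing (field names and data here are made up, not the actual scraped schema):

```python
# Flag reviews from accounts with little or no history and no written comment,
# the pattern described above. Review dicts and thresholds are illustrative.
from collections import Counter

def suspicious_reviews(reviews, max_history=1):
    """Return reviews by users with <= max_history total ratings and no comment."""
    counts = Counter(r["user"] for r in reviews)
    return [r for r in reviews
            if counts[r["user"]] <= max_history and not r.get("comment")]

reviews = [
    {"user": "a", "movie": "knock_down_the_house", "score": 0.5, "comment": ""},
    {"user": "b", "movie": "knock_down_the_house", "score": 4.5, "comment": "Loved it"},
    {"user": "b", "movie": "other_film", "score": 4.0, "comment": "Good"},
]
print(suspicious_reviews(reviews))  # only user "a" is flagged
```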

I started to trust places like /r/movies and /r/AMCsAList, only to get burned there as well: movies recommended in the comments would end up being terrible, and when I went back to criticize them I would get downvoted into invisibility. It wasn't a definite signal, but it gave me the feeling there is a lot of astroturfing going on there too.

Furthermore, these movies promoted on Reddit would typically sit in the 3/5 range on Rotten Tomatoes, which further made me think there is no real way to get a signal on whether a film is likely to be good.

What I started to do is not a great metric, but it has helped cut down on the cruft: follow specific actors/directors I really like and who I feel are in it to make good films. For example, actresses like Mary Elizabeth Winstead have turned down big roles in favor of indie films or other interesting scripts, to the detriment of their careers, but the films have been more enjoyable and interesting. In each film I also find other actors to follow, and if I start to see heavy studio promotion of a specific person (for example Anya Taylor-Joy after "The Queen's Gambit") I get cautious and sometimes just drop that actor from the list. In her case I stuck to films where she worked with people I had already decided I liked (like Edgar Wright) before feeling there was too much promotion going on and dropping her from the list. Beyond this, I fill out my list with franchises I like or subjects I always give a chance (science/space, etc.).

I know I am leaving out a lot of potentially good films, but the noise has become unbearable to the point where I don't want to waste my time anymore.

A few years later Rotten Tomatoes introduced "Verified Reviews". I thought this would be amazing, since it would only include people with skin in the game (i.e., verified to have paid for a ticket)... except now this has been completely hijacked as well.

Going back to the example of political films: what the right wing now does is have a billionaire finance a film through some intermediary group, then give out free tickets to the movie at conservative events. People book seats, promote the movie on social media, post a "Verified Review", and then often don't show up to the screening. I have found sold-out screenings of some of these movies, but when I went to the theater to see some other film, I'd often peek into the screening only to find it almost empty. The movie plays regardless of whether anyone shows up. Furthermore, some of these films show a code at the end to gift a free ticket to a friend, so the box office numbers are inflated too; it's all a bunch of hogwash.

Like I said, it's hopeless. In a way, we saw the rise of this new fake world play out in Hollywood before it took over social media and the rest of the internet. How will it end? People trusting only what they already know they like, or recommendations from trusted friends, and ignoring everything else.

This article has got me thinking of an interesting idea though: what if we determine which critics are known to give reviews that reflect our tastes (maybe by reading their reviews of movies we enjoyed), then pull only those critics' review data and compute a new Tomatometer score based solely on them? We could toss the official Rotten Tomatoes Tomatometer in the trash and get back a legitimate score you could use as a positive signal again.
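A minimal sketch of how that personalized score could work; all names, data structures, and the agreement threshold are illustrative assumptions, not anything Rotten Tomatoes exposes:

```python
# Keep only critics whose past fresh/rotten verdicts agree with yours often
# enough, then recompute a fresh-percentage from just those critics.

def trusted_critics(history, my_verdicts, min_agreement=0.75):
    """history: {critic: {movie: fresh_bool}}; my_verdicts: {movie: liked_bool}."""
    trusted = []
    for critic, verdicts in history.items():
        shared = [m for m in verdicts if m in my_verdicts]
        if not shared:
            continue  # no overlap with my viewing history, can't judge
        agreement = sum(verdicts[m] == my_verdicts[m] for m in shared) / len(shared)
        if agreement >= min_agreement:
            trusted.append(critic)
    return trusted

def personal_score(reviews, trusted):
    """reviews: {critic: fresh_bool} for one new film -> personalized percentage."""
    votes = [fresh for critic, fresh in reviews.items() if critic in trusted]
    return round(100 * sum(votes) / len(votes)) if votes else None

history = {"critA": {"m1": True, "m2": False}, "critB": {"m1": False, "m2": True}}
my_verdicts = {"m1": True, "m2": False}
trusted = trusted_critics(history, my_verdicts)
print(trusted, personal_score({"critA": True, "critB": False}, trusted))
```

A real version would need the per-critic review feeds scraped or licensed, which is the hard part; the scoring itself is this simple.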

bufferoverflow 11 hours ago [-]
Rotten Tomatoes has been rotten for over a decade. IMDB user ratings are much more useful, but still far from perfect.
relwin 10 hours ago [-]
IMDB also had a ratings inflation of about 0.5 points over roughly the same period (anecdata): back then, movies over 5.5 were watchable; now 6.0+ marks a watchable movie. Or maybe my threshold has changed in the past 10 years or so...
aaron695 12 hours ago [-]
[dead]
dcreater 11 hours ago [-]
Stopped reading when I saw a bar chart being used for correlating critic and audience scores.
ziotom78 10 hours ago [-]
It's not the best way to visualise a correlation, I agree, but the article is interesting and full of valuable information, so why stop reading?
mvdtnz 11 hours ago [-]
You don't need to comment if you didn't read.