2011 LemonMeter (Critics) vs. FringeMeter (Community): An Analysis

The FringeMeter.

Some loved it. Some hated it. Some simply scratched their heads.

What the hell was it?

It was an experiment. It was created through a partnership between the Hollywood Fringe Festival and us here at the Lemon. The idea was to create an engine that would take the user reviews from the Fringe Site and translate them into something that resembled our LemonMeter, creating an actual number score that would best represent the overall quality of the show.

How did it work?

Well, that’s what we’re going to take a look at now.

The major difference between the LemonMeter and the FringeMeter is that with the LemonMeter a human (me) is actually reading the reviews and judging whether they are Sweet (positive), Bitter (negative) or Bittersweet (something in between). The FringeMeter worked purely on an automatic metric that took in the star ratings and the number of actual written reviews on the Fringe site, put them through a top-secret algorithm, and then spat them back out on our site. Automatically. Untouched by human hands or minds.
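For the curious, the LemonMeter side of that process could be sketched roughly like this. To be clear, the real judging is done by a human reading each review, and the half-credit weighting for Bittersweet below is purely my own guess, not the Lemon's actual formula:

```python
# Rough sketch of a LemonMeter-style tally. The 0.5 weight for a
# Bittersweet review is an illustrative assumption, not the real method.
WEIGHTS = {"sweet": 1.0, "bittersweet": 0.5, "bitter": 0.0}

def lemonmeter(judgments):
    """judgments: a list of 'sweet', 'bitter', or 'bittersweet' calls."""
    if not judgments:
        return None  # no reviews, no score
    total = sum(WEIGHTS[j.lower()] for j in judgments)
    return round(100 * total / len(judgments))

# Three Sweet reviews and one Bitter one:
print(lemonmeter(["sweet", "sweet", "sweet", "bitter"]))  # 75
```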

Here was our FringeMeter Update on the final day of the Fringe. Just to help you compare.

So basically what we’re going to be taking a look at is man versus metric, Lemon Heads. Or even better, Critic vs. Community. LemonMeter vs. FringeMeter. Whatever.

Why are we doing this?

Well, because we are considering creating a User Review engine here at the Lemon, a place where people can anonymously offer their take on a show creating a number that can be compared to the Critical Consensus. So this admittedly unscientific analysis – and your thoughts as well – will go a long way to helping us perfect the system.

The shows I’ve chosen to compare from the Fringe were the ones that garnered at least 3 Critic Reviews and also showed up on our FringeMeter.

I’ve also included two Fringe shows that extended beyond the Fringe and received a lot of critical praise but didn’t even show up on the FringeMeter. Just because.

Take a look below and I’ll meet you on the backend.

FRINGEMETER VS. LEMONMETER

THE TROUBLE WITH WORDS: FringeMeter: 80%; LemonMeter: 88%

FOUR CLOWNS: ROMEO AND JULIET: FringeMeter: 78%; LemonMeter: 67%

THE BARKING PIG: FringeMeter: 81%; LemonMeter: 80%

FIVE UNEASY PIECES: FringeMeter: 75%; LemonMeter: 100%

INEFFABLE: FringeMeter: 79%; LemonMeter: 100%

100 SAINTS YOU SHOULD KNOW: FringeMeter: 75%; LemonMeter: 100%

LIFE IN THE MIDDLE AGES: FringeMeter: 81%, LemonMeter: 75%

THE LAST FIVE YEARS: FringeMeter (No Score); LemonMeter: 100%

GIRL BAND IN THE MEN’S ROOM: FringeMeter: 78%; LemonMeter: 100%

LOST MOON RADIO: FringeMeter: 80%; LemonMeter: 100%

COWBOY MOUTH: FringeMeter: 77%; LemonMeter: 25%

WORKING: THE MUSICAL: FringeMeter: (No Score); LemonMeter: 80%

Okay. So what can we take from all this?

Keeping in mind that some of these LemonMeter ratings came from only 3 reviews while the FringeMeter scores may have come from a dozen or more – how do the two different types of scores compare?

Not too badly, I’d say. With less data to work with, you’d expect the LemonMeter scores to be higher – and for the most part they are.

And clearly there is ample room for manipulation on the Fringe site – friends of the show flooding the review section – so the FringeMeter scores are never really going to be that low. And that seemed to play out as well. Only three shows scored less than 60% on the FringeMeter page; everything else was 70% and above.

Another interesting fact: of these twelve shows, four won Freak Awards at the Fringe. Basically, these were the Community Awards, voted on by selected balloteers through the Fringe site. All four of those shows landed in the 70–80% range on the FringeMeter, but all except Four Clowns did better on the LemonMeter. The critics knew better even though the Community voted them the winners? A little strange.

But the most interesting fact of all of this: the Four Clowns scores. The show that swept all three of the Freak Awards for which it was nominated received almost exactly the same score on the FringeMeter (78%) as it did on the LemonMeter (75%). What the hell do we take away from that? Probably that the show – in its current incarnation – is about a 76% show any way you look at it. [UPDATE: That LemonMeter score has since shrunk to 67% Sweet]

Final conclusions on the FringeMeter? In my opinion, a deeply flawed but worthy experiment.

Our future plan is to simplify the mechanics of the whole thing. Ben Hill, the Fringe Director, agrees, and has been exploring a simple A) Liked it B) Loved it C) Not my thing D) Skip it model. I like the elegance of that. How that would map onto our LemonMeter scoring I’m still not sure. But it’s a model that we might use here as well if we ever launch a Lemon Head user review arm to the LemonMeter.
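For what it's worth, here is one way those four options could be folded into a percentage score. The numeric weights below are purely my own guesses for illustration, not anything Ben or the Fringe has actually proposed:

```python
# Hypothetical scoring for the Liked it / Loved it / Not my thing /
# Skip it model. The weights are illustrative assumptions only.
RESPONSE_WEIGHTS = {
    "loved it": 1.0,
    "liked it": 0.75,
    "not my thing": 0.25,
    "skip it": 0.0,
}

def community_score(responses):
    """responses: a list of the four answer strings; returns a 0-100 score."""
    if not responses:
        return None  # no responses, no score
    total = sum(RESPONSE_WEIGHTS[r.lower()] for r in responses)
    return round(100 * total / len(responses))

print(community_score(["loved it", "liked it", "skip it"]))  # 58
```

One nice property of a model like this is that it still produces a single number comparable to a LemonMeter percentage, while asking the voter a much simpler question than a star rating does.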

Stay tuned. And please, let us know your thoughts and suggestions. The evolution continues here at the Lemon and you are an integral part of that transformation.

Filed Under: Colin Mitchell, Featured, Ponderings


Colin Mitchell About the Author: COLIN MITCHELL: Actor/Writer/Director/Producer/Father, award-winning playwright and screenwriter, Broadway veteran, Marvel comics scribe, Van Morrison disciple, Zen-Catholic, a proud U.S. Army Brat conceived in Scotland and born in Frankfurt, Germany, currently living in Los Angeles and doing his best to piss off as many people as possible.


  1. Kat Primeau says:

    thank you for the more nuanced look at this system! as producer of “girl band” i was thoroughly confused by what we were looking at on the fringe meter. appreciate the accountability, and all you do for theatre in LA :)