Earlier this month, after roughly a 20-year wait, audiences finally had the opportunity to see Quentin Tarantino’s Kill Bill: The Whole Bloody Affair.
Initially released in 2003 and 2004, Kill Bill: Volumes 1 and 2 together comprised Tarantino’s then long-awaited fourth film, originally envisioned by the auteur as a single work but split by producer Harvey Weinstein to avoid releasing either a movie with a run-time of over four hours that might deter the casual moviegoer or a greatly pared-down version that would severely compromise Tarantino’s vision.
Hence, Volume 1 introduced viewers to “The Bride,” a young female assassin beaten, gunned down, and left for dead on the day of her wedding (or, more accurately, her wedding rehearsal) by the Deadly Viper Assassination Squad, the team of trained killers led by the titular Bill, The Bride’s former lover and the father of her unborn child.
In Volume 1 we see The Bride awaken from a coma after several years and win a knife fight against one of her former co-workers. Yet the bulk of the volume focuses on The Bride’s acquisition of a legendary Hattori Hanzō sword and the series of stylized battles she must fight through before facing O-Ren Ishii, a former teammate who has risen to the head of the Tokyo yakuza.
Slower and more methodical, Volume 2 better develops the remaining characters, further exploring their backstories and relationships with one another while gradually building towards The Bride’s final confrontation with Bill, which manages to both subvert and exceed expectations.
Although both volumes can be viewed as individual masterpieces, for Millennial cinephiles a single film called Kill Bill came to hold a status something like that of the original theatrical release of George Lucas’ Star Wars. Unlike the four-hour cut of David Lynch’s Blue Velvet or the lost pie-fight scene from Stanley Kubrick’s Dr. Strangelove, it was known to still exist: Tarantino had screened it at Cannes in 2006 and again for a special showing in 2011. He just wasn’t releasing it for general audiences.
Then, finally, on December 5, 2025, Kill Bill: The Whole Bloody Affair quietly hit theaters, taking the number-six spot for its opening weekend, an impressive showing for a largely unadvertised four-hour-and-thirty-five-minute remix of a pair of films from more than 20 years ago.
When I learned of its release by chance while checking the movie listings for my local AMC, I promptly cleared an evening to ensure I could experience Kill Bill as intended. And I am glad I did.
The experience of watching the film as a single whole in a single sitting is simply different from watching it as two separate films months apart. It was also a reminder of what movies used to be, and still could be.
Every scene is expertly crafted. Every shot is perfectly framed. Every color is carefully chosen. Every line of dialogue, no matter how seemingly insignificant, reveals something about the characters and their relationships with one another. The construction of the narrative is a masterclass in storytelling.
Moreover, after more than twenty years, watching The Bride embark on her globe-trotting, blood-soaked quest for revenge was just as captivating as it ever was. Watching her battle through O-Ren Ishii’s henchmen at the House of Blue Leaves was no less exciting. Watching her training under the mystical Pai Mei pay off as she punches her way out of her grave was no less triumphant. Watching her final confrontation with Bill was no less suspenseful.
Yet, throughout the film, I could not help but be troubled by a couple of nagging thoughts no matter how much I tried to cast them aside.
They Just Don’t Make Them Like They Used to
The first nagging thought, to which I already alluded, was that movies really have changed since 2004, undoubtedly for the worse. It seems strange thinking about Kill Bill in 2025 the way people did about Lawrence of Arabia or The Godfather in 2003, but they just don’t make movies like that anymore, and it’s hard to imagine something like Kill Bill getting made in recent years by anyone other than someone with Tarantino’s clout.
In a narrow sense, Kill Bill features a strong, multilingual female antihero, proficient in samurai swordplay and Chinese kung fu, fighting against several formidable female foes, as well as against a much older man who happened to be her boss, with whom she once had a romantic relationship, and who eventually tried to kill her after she left him.
However, neither The Bride nor any of the female antagonists she faced ever came off as annoying girl-bosses with unearned skills. The film never condescendingly lectured its audience about the patriarchy, toxic masculinity, or why workplace romances are never appropriate. The Bride never claimed to be an innocent woman manipulated into a life she didn’t want. Her only real gripe with Bill was that he tried to kill her when she realized that working as an assassin while raising a child might be a bad idea, as might raising that child with a man who would very likely push the kid into the family business.
Moreover, given the heavy influence of Asian cinema (not to mention the occasional influence of ’70s Blaxploitation movies), it is difficult to believe something like Kill Bill could have been made by most directors within the past ten or so years without being flagged over fears that someone might cry cultural appropriation.
Unlike films released between 2016 and 2024, Kill Bill never felt beholden to some woke Hays Code requiring films to conform to the delicate sensibilities of a thirty-five-year-old cat mom with a grievance studies degree.
Additionally, in a broader sense, given Hollywood’s current obsession with strip-mining old IPs for disposable content, it is equally difficult to believe that something like Kill Bill would have been made in recent years without at least a nominal built-in audience beyond diehard fans of a single director and maybe the former Roger Ebert At the Movies crowd.
The second thought I couldn’t quite shake, though, was one that perturbed me far more: not only have movies gotten considerably worse over the past ten years or so, but so has the moviegoing experience.
The Enshittification of Modern Moviegoing
The impetus for this second thought came on my most recent multiplex excursion when I got to the ticket counter to purchase my ticket and was informed that AMC no longer takes cash, but that I could use a machine in the lobby to convert my cash into a pre-paid gift card. The guy at the counter could tell I was frustrated. He told me a lot of people were and said it might have something to do with Trump’s ill-planned elimination of the penny. In any case, AMC theaters apparently have gone cashless (even if the company website, as of that night, still said otherwise).
Now, like most functioning adults, I do have credit and debit cards. But I also like the option of using cash and generally avoid cashless brick-and-mortar businesses as a matter of principle. Hence, I reluctantly let him swipe my credit card because it was Kill Bill, told him this would likely be my last time at an AMC (maybe a bit of hyperbole on my part), and spent my 30 minutes of previews ruminating on how movie theaters and the moviegoing experience they offer have become prime examples of “enshittification,” or, to use the more PG-13 term, “reverse progress.”
Seeing movies in movie theaters was something I did throughout much of my life. Growing up in the ’90s, I saw most of the major comedies, action movies, and summer blockbusters of the era in theaters with my father, who spent most of his working life in low- or mid-level management positions at various theater chains (meaning movies were basically free for us).
In the 2000s, when I was a little older and could get to a theater on my own, I managed to assemble a small group of young cinephiles with whom I saw films like Eternal Sunshine of the Spotless Mind, Spider-Man 2, and Kill Bill: Volume 2, always using a seemingly endless supply of “We’re REEL Sorry” passes obtained by my dad. Later, when I left home to pursue my first graduate degree in the 2010s, I continued to catch the major blockbusters at the local chains of my new college town while also finding what practically became a second home at the Normal Theater, which doubled as a historical landmark and specialized in classic, independent, and foreign films.
Yeah, by 2015 or so, some people may have come to feel that seeing movies in movie theaters was unnecessary or inconvenient, but plenty still went. Plus, it was always my preference, and it never seemed burdensome, at least not until the fall of 2016.
It was around that time I started to see the early signs of enshittification at the movies. I had once again moved and found most theaters, including the lone chain theater in my new location, imposing a suite of features that were supposedly improvements but made frequenting them increasingly inconvenient.
Specifically, these included a tiered entry system, reserved seating, and comfy recliners. Superficially, these features may all sound like good ideas. But, in practice, they made trips to movie theaters more complicated and less predictable.
For years, the process of going to a movie basically came down to entering the theater, completing a 30-second transaction at the ticket counter, and heading to your movie. On a moderately busy night, maybe you’d have to wait in line for five minutes, but doing so was generally manageable.
Once these improvements were put into place, however, unless you were willing to pay a nominal monthly fee (and presumably part with a certain amount of personal data), you could get stuck in a line that never moved if enough people paying for priority entry were already there or came in after you.
Furthermore, even the priority line was slower than what most moviegoers would have experienced before there was a priority line. Because multiple improvements were being rolled out simultaneously, once any customer was allowed to advance to the ticket counter, they had to stand through a brief tutorial on how to select their reserved seat and, if they were not already paying for priority entry, endure a brief sales pitch for priority entry, as if the downside of not paying for it hadn’t already been made clear.
Thus, a 30-second process that might require waiting in a two-to-five-minute line on a bad day became a one-to-two-minute process that might require waiting in a five-to-ten-minute line on a good day. Then, once you finally got to your reserved seat, there was a roughly 20 percent chance you would find some old food or used napkins jammed in the cushions of your comfy recliner, a discovery that really made you appreciate the classic, functional design of the traditional movie theater seat, which folded upwards and allowed theater staff to easily clean up whatever mess the seat’s prior occupant left behind.
In subsequent years, as patrons became accustomed to the inconveniences of their enhanced moviegoing experience and the kids at the ticket counter trimmed their sales pitches and tutorials, other, less official practices, seemingly rooted in poor management, became increasingly common, further making the moviegoing experience feel like a trip to a failing amusement park run by the TSA.
Increasingly, I saw theater employees rifling through women’s purses and asking customers to lift or shake out winter coats to ensure no one was illicitly trafficking a bottle of water or a six-inch sub.
Additionally, at some point, theater managers seemed to have inexplicably agreed that it made sense to partially block theater exits with moveable garbage cans and less moveable theater employees shortly before a film ended, forcing customers to squeeze past an obese usher with questionable hygiene and physically move his accompanying trashcan before they could go home, burn their clothes, and take a 20-minute shower.
(My sense is that this latter issue stemmed from an attempt to enhance patrons’ perception of hospitality by having ushers hold exit doors for them while speeding up cleaning times. However, because young theater employees didn’t know how to properly hold a door for someone, or what to do with their trashcans, both usher and receptacle often ended up right in the middle of the exit.)
Then Covid happened (or, better put, the Covid response happened), which forced theaters to shut down before reopening with a collection of nebulous, ever-changing safety-theater policies whose enforcement seemed ultimately determined by how paranoid an individual theater’s manager was about Covid and whether he had any strong martinet tendencies.
To offer some reminder of life during those years: I remember watching a bad Bad Boys sequel in a packed room at my local AMC in February 2020; opting against seeing Christopher Nolan’s Tenet in the summer of 2020 because I knew I would be unable to enjoy it masked for two and a half hours; seeing Spiral and several other mid-level films at my local AMC in the summer of 2021 without any Covid restrictions; and then being denied entry to Edgar Wright’s Last Night in Soho at the same AMC in the winter of 2021 because I refused to wear a mask.
Fortunately, by that time, I had found a small independently owned theater near me that wasn’t enforcing Corona Law, where I could see most of the bigger tentpole movies. Yet, even after Corona Law was lifted, my already strained relationship with chain theaters had soured even further.
Much attention has been given to why audiences have turned away from theaters in recent years. Largely, the consensus is a combination of lower-quality content from Hollywood, the rise of streaming, shorter waits between theatrical releases and home-viewing options post-Covid, a generation of moviegoers that became accustomed to not seeing movies in theaters during Covid, and perennial complaints about ticket prices, endless previews, and having to put up with other patrons who may not know basic movie theater etiquette. Many of these explanations probably have some validity. The Critical Drinker (aka The Drinker, aka Drinker) summarized them quite nicely in a recent video.
Yet I can’t help but think the long series of enshittifying inconveniences and indignities, ones that would have seemed foreign in the olden days of 2004, has also played a role.
Returning to my long-awaited excursion earlier this month to experience Kill Bill as Quentin Tarantino intended: as I sat there in the theater, I couldn’t help but feel there was something bittersweet about the experience. As much as I enjoyed the film, I couldn’t help but wonder how much worse going to the movies would get thanks to future improvements that I and other moviegoers will be expected to begrudgingly accept. As I drove home that night, I couldn’t help but ask myself: how much longer would I be going to the movies at all?