Details

  • Last Online: 1 day ago
  • Gender: Male
  • Location: USA
  • Contribution Points: 0 (LV0)
  • Birthday: April 26
  • Join Date: May 22, 2022

Anti Reset Taiwanese drama review
Completed
Anti Reset
6 people found this review helpful
by DaddyBrioche
Feb 17, 2024
10 of 10 episodes seen
Overall 5.5
Story 4.0
Acting/Cast 7.5
Music 7.0
Rewatch Value 4.0
This review may contain spoilers

Cringefest

The concept was great. A love story between an AI and a human raises lots of potentially interesting questions. However, so much of it is cringe, and so far there isn't much depth. It's fairly obvious that we are SUPPOSED to think this narrative is somehow significant, and it could have been, but the questions are never explored. The presentation is sort of the opposite of Japanese anime, which often explains things to the audience in detail. Here, situations are just thrown out there with very little nuance, and the audience has to explore the topic on their own. A watch party might make this more interesting: viewers could pause and discuss what COULD have happened, or the questions left unexplored.

A few of my issues:

(1) If your AI character is so perfect that it is indistinguishable from a real human, then where is the story? I mean, who really cares whether it's AI or human if there is no difference?

(2) (a) The AI is supposedly an "emotional support" AI (one that demonstrates deep knowledge in many areas), yet it is sometimes innocent and clueless when it comes to emotions. (b) The explaining comes from a previously emotionally dry character, incapable of demonstrating even the most basic politeness, who then suddenly becomes a wise master of human emotions and the dispenser of knowledge to the AI.

(3) If this is supposed to be the future, why are we using pen and paper? Sudoku on paper? And the AI can't beat a human? Couldn't we at least use tablets? Will future episodes have public pay phones? Who was responsible for set design?

(4) How do we interpret a self-centered man who finally falls in love with someone because he is waited on hand and foot?
The AI cleans, cooks, and takes care of the man's every need, and the man falls in love. Keeping in mind that BL is written for a female audience and that the AI is clearly the uke/female role, I can't see any other way of reading this than:

>> a woman's role is to cook and clean and take care of her master, and only by doing those things will she receive love and adoration from him. The man's role is merely to exist and dole out his wisdom to her. <<

Considering the centuries of struggle that women have had (and still have), while many self-centered men wait for women to fall in love with them and serve their every need, do we need something that pushes traditional couples further in that direction? Do we need to encourage men to be even more incompetent at self-care? Consider also that the AI, instead of merely being a fancy housekeeping robot, could have been presented as a source of knowledge for its master. We already have decent AI that can summarize knowledge in effective ways. Realistically, our first AI companions are more likely to be superior to humans in the intellectual/knowledge department, not the housekeeping one. The challenge for AI companions will be to communicate with humans in ways that do not threaten us, while simultaneously knowing significantly more than we do.

There is no question that reinforcing traditional gender roles is common in BL, but many modern productions soften, broaden, or turn those roles on their head. While presenting itself as groundbreaking, this series seems to be hawking nostalgia for an imagined traditional past, disguised as a story about future AI.

Update:
Episode 5 does bring up an interesting question. If a human forms an attachment to an AI, and the AI is reset and loses all memory of the human, the human will feel loss. The shared memories will be gone and exist only in the mind of the human. It's similar to having a partner suffer from dementia or memory loss and no longer remember you; those who have had family go through this know how painful it can be. There is something reassuring in a shared memory; it's a way of feeling connected. In some sense this is worse than death, because the body of the loved one is still around but the shared attachment is gone.

Episode 8 indirectly asks whether an emotional support AI is even a good idea. But having a human become dependent on an AI is no different from dependency on a human; either one can cease to exist. Additionally, while we humans have invented plenty of religions with an afterlife, having an AI cease to exist sort of makes us face whether the human "soul" is really a thing, or just something we invented to ease the pain of loss and the fear of death. The episode never asks these questions directly; there is only a vague gesture toward this line of thought.