As I’m writing this sentence, I finished watching Vivy: Fluorite Eye’s Song with a friend two hours ago. It left me with a lot of impressions, and I wanted to collect them here while also organizing my thoughts before the memories start to fade. Me from three weeks later: well, that didn’t quite work out. As usual.
The biggest impression Vivy left on me, unfortunately, is dissatisfaction. Don’t get me wrong, it wasn’t a bad anime and I still enjoyed watching it. But nearly the entire plot, subplots included, fell flat for me.
Production-wise, the anime was still quite impressive. It is filled with beautiful backgrounds and animation, plus a nice soundtrack including several insert songs. Just watching Matsumoto exist was incredibly fun: he would often move around in amusing ways while just talking. This is difficult to convey if you haven’t seen it, unfortunately, so here’s an example:
The fight scenes were slick as well, with several pretty cool moves and a great feeling of impact, though for my taste they were just a tad too fast to really follow and appreciate. They also used this more realistic (?) drawing style, which was sometimes deployed to good effect, but sometimes seemingly at random.
But let’s get into the plot. I will spoil pretty much the entirety of Vivy here, so proceed at your own risk.
With that out of the way, I’ll go over the three main, I’ll just call them threads, of plot we had in the anime and what bothered me about them. The first one is Vivy’s personal story, her striving to become a diva that makes people happy by pouring her heart into her singing. The second one is her and Matsumoto’s overarching goal of preventing a war of humans vs. AI. And the third thread is actually several subplots in the form of AI-related incidents that make up the meat of the show.
To summarize the main plot: we are introduced to the world of Vivy around the year 2060. At the start of the show, AI is seeing mass adoption, mostly in the shape of autonomous androids, vehicles, etc. Vivy herself is one of the most advanced AIs of that time, being an autonomous AI, which basically means she’s sentient. Each AI is given a specific mission, and their life purpose is to fulfill this mission as well as they can. Vivy is no different. She was created as a singer, and her task is “to make people happy, by pouring her heart into her singing”. She is stationed in an amusement park.
Eventually, an AI called Matsumoto shows up from the future and enlists her help in preventing the war between AIs and humans that is going to break out a hundred years from then. After a bit of back and forth, Vivy agrees to help him. Their goal is to change the future by affecting certain “singularity points”, events that scientists from the future have identified as paving the way for the war.
Subplots
There are four main subplots here (not counting the finale), presented as mostly self-contained arcs of two to three episodes each. Each of them represents a singularity point. They are separated by timeskips of 5 to 40 years.
Aikawa’s Assassination
This event occurs right after Vivy and Matsumoto meet. Aikawa is a politician of little importance who is trying to pass the “AI Naming Law”. In Matsumoto’s timeline, Aikawa is assassinated by the anti-AI terrorist group Toak. This turns him into a martyr and the law that he backed sees a surge in support, allowing it to pass. This would later lead to the recognition of AI’s rights and broader adoption, accelerating their production. Matsumoto’s proposed fix is simple: just save Aikawa. He is not important and, ironically, the proposed law wouldn’t pass if he was still alive.
I don’t actually have much to say about this arc. This was the start of the show and solidly done. It also lays the foundation for some things that happen later, so I wanted to mention it anyway.
Crash of the Sunrise
This is the second arc, which begins 15 years after the first one. The Sunrise is a space hotel, basically a space station where rich customers can spend their vacation. At this point, AIs are already noticeably more advanced, being barely distinguishable from humans. The Sunrise is staffed solely by AIs, with their leader, Estella, having taken over after the original (human) owner’s death several years ago.
In Matsumoto’s timeline, Estella steers the Sunrise into the ocean, leading to the death of everyone aboard and sparking an outburst of anti-AI protests and violence across the globe. Vivy sneaks aboard the ship to prevent this, and it turns out that Toak is behind all of it. The Sunrise, which can only be steered by someone with Estella’s exact signature, was to be manipulated by Elizabeth, Estella’s long-lost “sister”: another AI created as an almost exact copy of her, but discarded early on. The two had never met, though Estella knew of her existence.
To repeat: the two AIs were created as copies. Estella came first, but Elizabeth received an exact copy of her early memories. Elizabeth was soon discarded as trash for an unknown reason, without ever having been given a mission. When she was picked up by Toak, one of their members, Kakitani, gave her the mission to serve him. Happy to have been given the equivalent of a purpose in life, Elizabeth devotedly served him and followed him aboard the Sunrise.
Vivy eventually beats Elizabeth in combat, and Matsumoto deletes her memories related to Kakitani, possibly even everything from after she was discarded (it’s not completely clear). Estella stays aboard the Sunrise, which is already too far off course to avoid crashing into Earth. She has to stay until the end to make sure it lands in the ocean, while everyone else evacuates. At the very end, Elizabeth, memories deleted, joins her, and together they succeed. Both are destroyed in the crash.
I had serious trouble caring about this plot. At the beginning of the arc, Matsumoto proposed destroying Estella, as that would be the simplest fix: she can’t drop the hotel into the ocean if she’s not alive anymore. Vivy refuses to do so. She has been working as one of the staff AIs on the Sunrise and got to know Estella in the process. Estella gave the impression of someone who genuinely cared for the people onboard. You’d see her playing with the children, smiling happily. When a malfunction leaves the customers on board distressed, she shows them a view of the stars and sings for them to calm them down. So when it turns out that it was never Estella who steered the Sunrise into the ocean, but her “evil twin” Elizabeth, this is presented as the big twist that turns everything on its head.
It kinda is, but it also destroyed all traces of the “previous plot” that could have been interesting. I was genuinely looking forward to seeing why Estella would eventually decide to destroy the Sunrise and kill the people on board, all of whom she seemed to treasure. When it turned out to be “just Toak at it again”, that was a little disappointing. The formerly interesting problem was simply resolved through violence as Vivy knocked out Elizabeth.
Okay, I actually don’t want to bitch about this too much, the plot was still fine. The arc as a whole was okay, but I didn’t really feel it. There is, however, one thing that really bothered me, and that is how Elizabeth was treated. First of all, they just straight up deleted her memories, and after that she was all nice and helped Estella? I mean, I’m all for happy endings, but this is anything but a healthy resolution. What about Elizabeth’s past? What about her caring for her master Kakitani? What about her relationship to Estella? The contrast between these two AIs that should have been exactly the same was an interesting point, and I was waiting to see what they’d do with it. But instead, they literally just threw it all away. In some ways, this felt like a rushed adaptation of a novel, so much so that I actually went and checked whether this was an adaptation rather than an original work after all.
The Metal Float
After another five-year skip, we find ourselves at the beginning of the third arc. We are introduced to Doctor Saeki, who helped create the Metal Float, an artificial island inhabited solely by AIs. It is basically a giant factory helping to cover the world’s demand for AIs and robots.
Matsumoto informs us that the resolution of the second arc led to wider adoption of AIs, as humans were impressed with Estella’s noble sacrifice to make sure the Sunrise fell into the sea and not onto a city. This accelerated AI development and also led to the early construction of the Metal Float.
In Matsumoto’s timeline, the float was only built 20 years later. In his words, “it is 20 years ahead of its time” and must be destroyed to impede the growth of AI. This singularity point obviously changed compared to his timeline: had this development not happened, the goal would have been to prevent the marriage between Doctor Saeki and his wife Grace. Grace is an AI, and in the original timeline they were the first human-AI couple, which is no longer the case now.
In any case, Doctor Saeki agrees completely that the float is way ahead of its time and conveniently provides Vivy and Matsumoto with a virus that should shut it down. They board the island, where they are greeted by a bunch of AI robots throwing a welcome party for them. Humans “like surprises, right?”, so they wanted to try that out. It comes off as sort of uncanny, even though the good intentions are obvious. Shortly after that, Toak is back on stage and attacks the float. Vivy uses the moment of confusion to inject the virus into the system, and for some reason the robots go on a rampage, blowing up Toak’s ships and attempting to kill everyone aboard them. At the same time, Doctor Saeki and his wife make their way onto the island.
When Vivy encounters them on the island, Doctor Saeki confesses: the virus doesn’t just contain a shutdown program, it also tries to extract the data from the float’s central computer. We then learn that the woman at his side isn’t actually Grace but a cheap copy, and that the real Grace was chosen as the AI controlling the float when it was constructed years ago. It is revealed that basically all the AIs on the float are part of the central control instance, that is, of the real Grace, and that it is likely impossible to “extract” her. Vivy resolves to destroy Grace to end her rampage. Doctor Saeki orders the copy at his side to stop Vivy, but Vivy destroys her with a single punch. She then heads to the float’s core and does the same to the real Grace.
There’s so much to unpack here and I don’t really know where to start. I’ll just pick Saeki. The dude is married to an AI, which is significant enough that I was expecting the show to talk about it. It didn’t. But honestly, that’s fair enough. The show actually skips most of the “is real AI any different from humans?” debate, which is fine by me, as I’ve seen that so many times that I’m tired of it at this point. Another thing to mention is the fact that he was just living with a copy of his wife. That’s kinda fucked up if you ask me, but it gets worse. Once he recognizes Vivy as a threat, he doesn’t hesitate to send his wife-copy into combat against her.
K-5, stop her!
And that is how quickly his beloved wife (copy) turns back into a nameless tool. This was also somewhat disconcerting. Then again, Saeki is visibly shaken when his wife-copy is wrecked in the blink of an eye by Vivy; we see him kneeling beside her “corpse”. So are we supposed to think he was fond of the copy? Or was she just a tool to him? A replacement? Both? That could have been an interesting conflict, but it is also never talked about. It’s just there and makes me wonder. To finish the enumeration of Saeki-being-fucked-up moments: he literally let Vivy inject a virus into his wife. Wtf?
Okay, with that out of the way, there’s one more thing in this arc that just happened and apparently no one in the series felt the need to comment on it. Grace, who worked as a nurse and seemed rather human, at least to me, was just stuck into the center of the Metal Float and condemned to control this massive factory for the rest of her life. I mentioned previously that the series doesn’t actually talk much about the classic “AI question”, and that I actually like that.
But at this point? It honestly should have. It perfectly highlights how AIs are still seen as nothing but tools, their life decided by someone else, and having a fate perhaps worse than slavery apparently not regarded as a big deal. Grace is dehumanized in a spectacular way, and no one really seemed to care much about that.
I loved how Grace sang “Sing my Pleasure” in the flashbacks with Saeki. Her calm, breathy voice gave off a solemn vibe, compared to the more energetic version we know from the opening. When Vivy hears her distorted singing over the radio, she comments that “this is not singing; it’s just a sound recording”.
When the robots threw a surprise welcome party for Vivy and Matsumoto as they first arrived at the float, their hospitality, while well-intentioned, felt somewhat off, maybe even creepy. They give us the same quote about “surprises” that we later see Grace saying to Saeki in the flashback to their reunion.
Both events imply that the real Grace is long gone and there’s no saving her. Her inhumane conditions have completely broken her, leaving behind nothing but a shell of a once very human being, sitting at the center of an equally soulless artificial island. That was actually amazing storytelling of the more subtle kind, giving us something to think about instead of being all in-your-face about it. Which makes it all the more confusing that nothing was done with it. Grace was eventually “saved” by Vivy. By killing her. There is no hope for her, and at the same time there seems to be no hope for AIs in this world that abuses them so badly. And yet no one even mentioned that this might perhaps, possibly, be problematic?
Vivy meets Saeki afterwards, apologizing to him for killing his wife. I don’t remember the rest of the dialogue, but as Vivy turns away to leave, Saeki shoots himself. Vivy has a mental breakdown, gives birth to her second personality and gets really good at singing. More on that later.
Ophelia’s Suicide
The fourth arc and singularity point begins 20 years after the last one. At this point, Vivy is a rather famous singer, though it becomes apparent she doesn’t remember anything that happened “outside” her career, namely everything related to Matsumoto and the AI war. After a rather amusing back-and-forth, they decide to work together again. The singularity point this time is Ophelia’s suicide. Ophelia is a “diva” similar to Vivy: she gives concerts and sings for people on stage. According to Matsumoto, she would commit suicide shortly after her next concert by jumping from the top of a large building. This would make her the first AI suicide in history and would inspire a similar outbreak of AI suicides across the globe. The goal is to prevent this.
Over the course of this arc, Vivy tries to get closer to Ophelia, probing her for any worries she might have, anything that could cause her to commit suicide. We learn that she has a friend/colleague called Antonio who used to do the stage lighting etc. for her, and just generally support her. At some point, he shut down for an unknown reason and hasn’t awakened since. Vivy and Matsumoto guess that Ophelia eventually committed suicide out of sadness over her friend never returning. When Matsumoto tries to prevent Ophelia from jumping off the roof, she reveals that she is actually Antonio, who took over Ophelia’s body. They fight, with Antonio even remote-controlling his own original body on top of Ophelia’s. He still loses, and both AIs end up dead.
On to the interesting part: why did Antonio take over Ophelia’s body? And here I gotta say… I don’t know…? When Antonio’s cover is blown, he claims that he’s taken over Ophelia’s body because she was struggling with singing and performing. His mission was something along the lines of “help her be successful”, and as he believed he could do it better, he took over. Some time later, Antonio suddenly says that he only took over because he couldn’t bear watching Ophelia struggle and suffer under the pressure of performing. As Antonio is defeated and the fight ends, Ophelia “resurfaces” and the two affirm their mutual friendship (love?) before they die.
So far, the show has made it abundantly clear that an AI’s mission can be interpreted rather freely. That means the first reason Antonio gave is not a direct consequence of his mission, but a decision of his own. It paints him as someone both selfish and arrogant, who would practically enslave his best friend just to be in control. It doesn’t help that he is the spitting image of a villain in the way he talks, his facial expressions, etc. This is a stark contrast to everything that comes afterwards, and I cannot for the life of me think of a reason why Antonio and his motive were presented like this at first. It is not just misleading, it downright doesn’t make any sense. How am I supposed to believe that “Antonio was good all along” and “just wanted the best for Ophelia” at the end?
This is all the more sad because it would have been an interesting conflict otherwise. Local AI-man decides to take over his friend’s body so she doesn’t have to suffer through her life anymore? This is the good kind of fucked-up, the genuinely interesting one. Generally, I love to see people do the worst things with the best intentions. In any case, this conflict lost all its meaning: half of that I attribute to the confusing and contradictory delivery, the other half to the fact that the show introduces something like this and never properly talks about it. At the very end, Ophelia and Antonio die on the best of terms. What???
I mean, I didn’t expect anything else considering Ophelia’s personality, but come on… at least give her the chance to say something along the lines of “thanks bro, but I gotta live and learn through my own struggles”, which would have improved the ending quite a bit. But as it stands, this just… happened. It is very much reminiscent of the second arc, the crash of the Sunrise: we are presented with an uncannily clean solution and a happy (?) ending that should actually be anything but clean or happy.
As another similarity to the second arc, this one also starts out presenting a conflict I was at least somewhat interested in, then throws it overboard and turns it into something completely different. I was just waiting to see why Ophelia might want to commit suicide; at the end, that question becomes irrelevant. There’s nothing wrong with the twist itself, I just think it’s bad management of expectations by the show.
Lastly, I found it rather questionable that one AI can apparently completely subdue and take over another AI by sticking a cable into them. That doesn’t speak well of the security of their software. Plot is more important than technicalities like this, so it’s fine with me, but it still threw me off quite a bit.
And with this, we conclude the fourth arc and all the subplots in general. Let’s get to the AI war.
The AI war
First, to summarize what happened after the fourth arc: Vivy returns to her “original” personality, but loses her ability to sing. She’s put into a museum, where she starts writing a song. On the exact day of the original AI war, the exact same thing happens, except that this time the AIs are also chanting the song Vivy wrote, one key off for those extra-creepy vibes. We learn that Archive, the central computer sort of managing (?) all AIs in the world, just spontaneously decided that humans are bad and should be eliminated, for no reason in particular. Archive does, however, give us a sliver of hope: Vivy is the first and only AI in history to ever do something creative, like writing a song, of her own free will. If she manages to sing that song, Archive will allow humans to live on. Of course, Vivy eventually succeeds, stopping Archive from dropping satellites all over Earth. We get our happy ending of only a couple hundred thousand, or perhaps millions, of people dead instead of billions. Vivy breaks down after her song, and the show closes with her awakening again in a chair, her hair cut short, having apparently lost all her memories like last time, and just going back to singing again.
The first thing that bothered me was how the ending almost retconned everything that Vivy and Matsumoto did to prevent the war. There are several layers to unpack here. First of all, Vivy and Matsumoto successfully changed all the singularity points. So why did the AI war still happen? And why did it even happen on the exact same day as in Matsumoto’s timeline? Was everything they did completely pointless?
Yes. Yes, it undoubtedly was. Archive just casually drops that it has been watching Vivy’s and Matsumoto’s journey all along, making adjustments on its own to keep their timeline from deviating too much from the original. On the one hand, I love how this makes perfect sense: all AIs are apparently connected to Archive at all times, meaning it knew about everything Vivy and Matsumoto were doing from the very start. Archive decided to start the war on its own, and of course it wouldn’t let Vivy’s obvious attempts at manipulation influence its decision.
On the other hand, who was the idiot that came up with this stupid singularity project and didn’t think for a second to take this into account? I’m baffled by just how much the finale ridicules the entire setup. We were told that scientists worked hard on this singularity project, identifying the singularity points that would eventually spark the war, and you’re telling me that none of them stopped even for a second to have a single rational thought and recognize the futility of their approach?
Even worse, we’re given all the information to figure this out ourselves almost at the very start. The show begins with the outbreak of the AI war. We see AIs in an amusement park violently murdering every human being in sight. This is obviously not the sort of war that resulted from ever-rising tensions between humans and AIs slowly culminating in open conflict. No, it is obviously something that happened very suddenly and was decided one-sidedly by the AIs. That is the only explanation for why we see humans getting murdered by AIs in an amusement park: they were caught completely off-guard.
I assumed that we were missing context here and that the show would somehow build on this. It didn’t. We’re given all this information right at the start. It gets worse when we later learn that AIs in this series develop to be almost the same as humans, with their own feelings and ambitions, and that they’re also treated rather well for the most part. Then it makes even less sense that they would all single-mindedly decide one day to eliminate humanity. There’s no way such a consensus would be reached, except through some sort of central control. And that is exactly what happened.
In the end, it’s nothing short of laughable how incredibly shortsighted and actively misleading the singularity project was. The goal in both the first and the third arc was to slow down the spread and adoption of AI. This would obviously not take care of the root cause of the war, but delay it at best. Then the Sunrise incident would have led to hate and violence directed at AIs, but what’s the problem with that? Humans weren’t the ones starting the war. You could argue that maybe the AIs would want to retaliate after that, but that’s still kinda vague. And as for what the point of preventing Ophelia’s suicide would have been, I don’t know. Vivy didn’t even succeed; instead it was publicly seen as a double suicide, yet Matsumoto mentioned that no string of other AI suicides followed like in the original timeline. That just feels arbitrary.
In the end, all the subplots that came before had basically zero impact on the final outcome. While there’s nothing in particular wrong with that, it still feels rather unsatisfying, especially seeing how it is the direct result of the people trying to prevent the war being straight-up too dumb for the task.
It also bothers me a little how Archive can apparently just overwrite the free will of every single AI. Throughout the entire series, AIs were shown to be sentient, to have emotions and their own free will. Seeing this happen to them just feels… weird. Like it somehow contradicts every conflict we had before, as most of them were based on AIs precisely not acting like mindless robots. And yet the show still takes it lightly: while we see plenty of humans getting violently murdered, the murderers are apparently all “service” AIs, who seem like little more than dumb robots. At least show me the horror of a man getting slaughtered by his AI wife if we’re going this route, will you? It feels very much like this just happened because the plot wills it. The show also never once stopped to ask why Vivy was helping humanity, actively working against Archive and thus, to some extent, against her fellow AIs.
I also gotta question the setup of Archive as an entity here. Why would you give this central instance the ability to just enslave every AI on the planet at will? Again, I don’t want to question technicalities if that’s what the plot needs. But the series is also missing any discussion of how Archive might be forcing AIs to commit the most horrible atrocities, perhaps even against their close human friends. Maybe that just didn’t fit in anymore, but the setup practically demands the question.
To continue the rant with the next point, I feel like it was insufficiently explained how Archive came to the conclusion to eliminate humanity. This is again connected to the AIs’ “missions” and how freely they can be interpreted. In a most astounding display of mental gymnastics, Archive decided to simply replace “humans” with “AIs” in its mission to “lead humans to prosperity”, or something like that. To be honest, even the old classic of “humans are obviously bad for themselves, the planet and everything around them” would have been more satisfying for me.
Finally, I wanted to mention just how strange it was to have Vivy’s singing decide everything in the end. Of course it would be the perfect connection between the overarching plot and Vivy’s personal development to have things end this way… if it made any sense. Why would Vivy singing her own song make Archive recognize the value of humans? If anything, it should have the opposite effect. Creativity seems to be the only thing AIs lack compared to humans, and Vivy just confirmed that even that need not worry them, as they can do it too if they try hard enough. In the end, it feels like a very arbitrary condition. It’s just there because the plot wills it. And for some reason, it was also necessary for team humanity to fight their way up Archive’s tower so Vivy could sing her song. Archive proposed that solution itself, so why not just let her try? Overall, the show seems to lean on action and violence too often when it comes to resolving conflicts. Those scenes still look cool as hell, but they sometimes end up feeling out of place in the greater picture, and sometimes even act as a replacement for a more meaningful resolution.
Whew, time to get to the last section: Vivy’s personal story and development.
Vivy’s Story
Vivy, the main character of the show, is the world’s first autonomous AI, and her mission is to “make people happy, by pouring her heart into her singing”. First of all, a general problem with how AIs and their missions are presented here, which Vivy exemplifies: what asshole decides to give you a life purpose and then makes it something this arbitrary? Just imagining I would have to live this way… the second part of that sentence gives me indescribable amounts of anxiety. What does that even mean?
Vivy’s story, for the most part, is her journey in finding an answer to this question. What does it mean to pour your heart into something? To be honest, I hate these kinds of questions, at least when they’re used as the centerpiece of some plot. Violet Evergarden did something very similar, making Violet’s central question “what does ‘I love you’ mean?”. Both questions have the same problems. They ask for a concrete, actionable answer to something that is very arbitrary, something that humans themselves don’t even fully understand, something where the answer may differ greatly from person to person, and something that you simply do not answer… you just get it.
There’s nothing wrong with asking these questions. I think it can be incredibly interesting if these sorts of questions are given to multiple different characters, or perhaps even the same character at different times, because seeing how they answer still has value. But putting them at the center of some plot is just bad. Worse, these plots typically make the mistake of giving some concrete answer at the very end. Due to the nature of the question, this answer rarely ends up satisfying.
The anime makes all these mistakes. As the backbone of Vivy’s personal story, it couldn’t have failed harder at gaining my interest. I also don’t see how Vivy and this question develop together in a meaningful way. As all plots like this usually go, she eventually arrives at her own answer to the central question, based on the experiences she has had over the course of the show. That’s it. I already forgot what her answer was.
The show tries to imply that there is development going on by mentioning that her audience grew after every timeskip, but that just feels off when we don’t really feel any development otherwise. That said, it did become apparent in many small ways how she became more human. That, at least, was well done in my opinion.
Perhaps it would have been better for the show to focus more on the singing part of her mission. Maybe I’m just weird, but, as someone who loves singing, Vivy never gave me the impression that she really cared about singing. It is difficult to convey how I felt about this, but in the end, Vivy only ever cared about the “pouring her heart into it” part. Honestly, they could have used anything else besides singing. It is strange, because the show sometimes used singing as a symbol or metaphor in rather effective ways, and it also had several good insert songs and other very emotional moments connected to singing. But Vivy’s personal connection to singing just wasn’t there for me.
I know all of this must sound very arbitrary, so I’m not gonna dwell on it any longer.
There was also the case of Vivy and the second personality she developed after her breakdown. That was another thing that was just sort of there, and I don’t understand why. Why could she suddenly sing well after she lost her memories, and why couldn’t she anymore after the “old” Vivy came back? I feel like that part is trying to tell me something, but if so, I have no idea what it could be. This personality switching also didn’t have much relevance to the plot as far as I remember, so it just ended up confusing me a little. Oh well.
At the very end of the show, she also apparently lost her memories again. Just… why? Then what was the point of everything up to now?
Lastly, I wonder why Vivy was given this specific mission. We’re shown that it was the idea of one specific professor who developed her, and who was simply curious what answer Vivy would arrive at. On the one hand, that does seem like a very interesting experiment. On the other hand, doing that to a sentient being seems incredibly cruel. If we’re talking about AI suicide, the top cause I can imagine under this setup would be having a very arbitrary or otherwise weird or problematic mission. It would have been interesting if the show had talked about this a little. But you can’t have it all, I guess.
Wow, and that’s it. That was probably the longest rant I’ve ever written in my life. There are a bunch of other points I could have mentioned, as well as some plot points I didn’t talk about, but I think this is enough and I’ve covered all the important stuff. As seems to be a tradition on my blog, I just want to close by saying: Vivy is not a bad anime, and despite all my complaints, I enjoyed watching it. That’s it from me, see you in the next article o7