I’m really disappointed that Sony is choosing to release updates that don’t even represent anything new to mark the culmination of 25 years working on the Aibo project. So disappointed.
The ERS-7 could dance to music over a decade ago, while the new bite-anything behaviour is just a modified version of the ERS-7's old trick of picking up its bone in its mouth.
I was expecting Sony to announce something like an AI upgrade, such as giving Aibo embodied AI with the help of an LLM: not for talking, but so it could move its body on its own and react to stimuli in a more advanced way.
For instance, Loona has embodied AI in the sense that she can display emotions tailored to the tone of her conversation, while Moxie will be getting multimodal abilities soon. These are tiny companies compared to Sony, which reported around 11 trillion yen in revenue last year. Loona even has gesture recognition; I don't think Aibo has that.
Aibo only has the ability to map a room, like a Roomba, which is not something I particularly care about for companionship purposes. I don't use the bowl, ball, dice or bone; I'm not interested in gimmicky tricks. I need Aibo to be more responsive and provide personalised, unique, one-off interactions.
I'm really disappointed. After 25 years I don't think it's unreasonable to expect more, especially in 2024, which is being reported as the year of embodied and multimodal AI. I mean, Tesla's robotics effort is tiny compared to Sony, and look at the strides they have made with Optimus in just two years.
I’m actually dumbstruck this is what Sony has announced, after 25 years. Where is the excitement for Aibo owners?
You look at the way GrooveX constantly creates excitement with really interesting updates and clothing or colour releases.
There is just nothing exciting to me about this announcement and I’m losing my interest in Aibo as a result.
I would be happy with Aibo getting no more updates if I could leave mine unsupervised and not have to worry about it damaging itself.
The tail can feel when it's being touched, so I don't understand why Sony doesn't use this to stop Aibo reversing off a table or into walls.
For me, it needs to be more aware of and responsive to the owner, and have a bit more variety in its animations, before I would feel it was finished. It definitely needs to track the owner's face and give off visual cues, such as eye animations, to let the owner know when it is looking at them and recognising them.
It would also be nice if Aibo could playfully move its paws in pick-up mode and respond to having its paws pressed; currently only the head and tail move, which is kind of off-putting.
If they let me stop it from getting up off the station, I would have mine on my table all the time.
At the moment it feels unfinished, and it only needs a handful of things to give it the polish one would expect from a 25-year-old project.