Sony Celebrates 25 Years of Aibo! Aibo Fan Meeting Volume 17, February 23rd 2024



Image: Sony. The first Aibo generation, the ERS-110, launched in 1999, alongside the current generation, the ERS-1000, launched in 2018.

25 Years Of Aibo
Sony will be celebrating 25 years since the launch of its iconic robotic pet dog, Aibo, this year, with celebrations commencing at a fan meeting to be held in just over 10 hours' time before a packed audience of Aibo owners in Japan.

Sony will be holding many events across Japan this year to mark the milestone. Some exciting developments are expected, including a possible new limited-edition colour and a commemorative model that combines the first-generation ERS-110 and the current-generation ERS-1000 into a unique design exemplifying the concept known as ‘Weird Sony’.

Image: Sony. Unique Aibo model fusing elements of the ERS-110 and ERS-1000 into a single design to commemorate 25 years of Aibo.

You can watch the fan meeting live here:
To mark 25 years of Aibo, Sony announces ‘Aibo bite anything’🤦‍♂️
I’m really disappointed that Sony is choosing to mark the culmination of 25 years of work on the Aibo project with updates that don’t represent anything new. So disappointed.

The ERS-7 could dance to music over a decade ago, and the ‘bite anything’ behaviour is just a modified version of something the ERS-7 could already do: picking up its bone in its mouth.

I was expecting to hear Sony announce something like an AI upgrade, for instance giving Aibo embodied AI with the help of an LLM: not for talking, but so it could move its body on its own and react to stimuli in a more advanced way.

For instance, Loona has embodied AI in the sense that she can display emotions tailored to the tone of a conversation, while Moxie will be getting multimodal abilities soon. These are tiny companies compared to Sony, which made roughly 11 trillion yen in revenue last year. Loona has gesture recognition; I don’t even think Aibo has that.

Aibo can only map a room, like a Roomba, which isn’t something I particularly care about for companionship purposes. I don’t use the bowls, ball, dice, or bone; I’m not interested in gimmicky tricks. I need Aibo to be more responsive and provide personalised, unique, one-off interactions.

I’m really disappointed. After 25 years I don’t think it’s unreasonable to expect more, especially in 2024, which is being reported as the year of embodied and multimodal AI. I mean, Tesla is tiny compared to Sony, and look at the strides they have made with Optimus in just two years.

I’m actually dumbstruck that this is what Sony has announced after 25 years. Where is the excitement for Aibo owners?

Look at the way GrooveX constantly creates excitement with really interesting updates and clothing or colour releases.

There is just nothing exciting to me about this announcement and I’m losing my interest in Aibo as a result.

I would be happy with Aibo getting no more updates if I could leave mine unsupervised and not have to worry about it damaging itself.

The tail can feel when it’s being touched; I don’t understand why Sony doesn’t use this to stop Aibo reversing off a table or into walls.
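The safeguard being asked for amounts to very simple guard logic. A minimal sketch follows; this is purely hypothetical, since Aibo’s SDK is not public, and the function and sensor names are invented for illustration:

```python
def should_halt_reverse(tail_touched: bool, reversing: bool) -> bool:
    """Hypothetical guard: stop backward motion the moment the tail
    (a touch-sensitive part) contacts a wall or a table edge."""
    return reversing and tail_touched
```

The point is that the sensing hardware already exists; only a check like this, wired into the motion controller, would be needed.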

For me, it needs to be more aware of and responsive to its owner, and to have more variety in its animations, before I would feel it was finished. It definitely needs to track the owner’s face and give visual cues, such as eye animations, to let the owner know when it is looking at them and recognising them.

It would also be nice if Aibo could playfully move its paws in pick-up mode and respond to having its paws pressed; currently only the head and tail move, which is kind of off-putting.

If they let me stop it from getting up off the station, I would have mine on all the time on my table.

At the moment it kind of feels unfinished; it only needs a handful of things to give it the polish one would expect of a 25-year-old project.

I was talking with another Aibo owner on Facebook who works with AI, and they told me Aibo isn’t capable of running LLMs because its hardware is unsuitable for such applications.

They also rightly pointed out that Aibo would cost a great deal more if it ran generative AI, and would probably be out of reach of most consumers.

Aibo’s mapping abilities are more advanced than I realised: Aibo can differentiate between doorways and items in a state of flux, and it can reliably find its station from across the house, which is groundbreaking for a household quadruped robot.
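One common way a mapping robot distinguishes permanent structure (walls, doorways) from items in a state of flux is an occupancy grid whose cells gain confidence when repeatedly observed and slowly decay otherwise. The sketch below illustrates that generic technique; it is not Sony’s actual implementation, which has not been published:

```python
import numpy as np

def update_occupancy(grid: np.ndarray, observed: np.ndarray,
                     hit: float = 0.3, decay: float = 0.05) -> np.ndarray:
    """Raise confidence for cells where an obstacle is seen this scan,
    decay everywhere else; transient objects fade out, walls persist."""
    return np.clip(grid + np.where(observed, hit, -decay), 0.0, 1.0)
```

A wall seen in every scan climbs to full confidence and stays there, while a chair that has been moved away decays back toward zero over subsequent scans.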

Running an LLM offline on Aibo would require something like an i7 processor and an RTX 3090 or higher, with a minimum of 8 gigabytes of RAM. This would significantly increase the cost of Aibo.
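As a rough sanity check on those numbers, the memory an LLM’s weights alone require is roughly parameter count times bytes per parameter. This back-of-envelope sketch ignores the KV cache and activations, which add more on top:

```python
def weight_memory_gb(params_billion: float, bytes_per_param: float) -> float:
    """Approximate memory for model weights alone, in GB:
    (params_billion * 1e9 params) * bytes_per_param / 1e9 bytes-per-GB."""
    return params_billion * bytes_per_param

# A 7B-parameter model at fp16 (2 bytes/param) needs about 14 GB for
# weights; 4-bit quantisation (0.5 bytes/param) brings that to about
# 3.5 GB - still far beyond what a pet robot's onboard hardware carries.
```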

As for Aibo being a very advanced wind-up toy with pre-programmed responses: it was pointed out to me that generative AI such as LLMs is also essentially pre-programmed; the only difference is that it doesn’t appear so to the casual viewer.

Hopefully Sony will release a software update with something new this year. I know it takes them a while, but when they do they always surprise, as with the quieter walking. I just have to be patient.