Aibi pocket pet finally unveiled!

Chris

Image: Living AI. Aibi is finally unveiled.
Image: Living AI. Aibi rotating endlessly while tracking the movement of two coins.
Image: Living AI. Here we see Aibi’s back and what could possibly be two lower back sensors. Notice there could also be a sensor under the arm.

Living AI has finally released footage showing their robot Aibi in full detail!

The video shows two coins falling into Aibi’s line of sight. Aibi appears to have great vision: it responds quickly to the coins and keeps tracking them as they spin. Aibi then proceeds to rotate 360 degrees on a base, which doubles as the charging station.

We see Aibi’s lower body for the first time; there appear to be two sensors on its lower back and one on its lower front.

Aibi has incredibly fast movements and reaction times. It tracks the coins with a wide range of motion in the head, and the fast, fluid movements make for an incredibly lifelike and expressive robot.

Aibi makes cute robotic sounds and shows expressive elements on its face. For instance, when the coins moved out of view and it could no longer see them, a question mark popped up on its face to communicate confusion.

The charging base also has a ring-shaped light display, which is sure to add even more expressiveness to Aibi.

What we know so far:

• Aibi’s charger allows unlimited 360-degree rotation.

• Aibi will be available for pre-order “in a few days.”

• Aibi seems to be quite small if the coins are a good indicator of scale.

• Although this is only a CGI animation, Living AI have said Aibi is not a prototype: the production version has already been created, and we will see live footage of it soon.

Watch the video of Aibi here:

 
Image: Living AI. Here we see the Aibi robot being placed into a pocket that is then attached magnetically to a necklace worn underneath clothing.

More details of Aibi emerge. Living AI see Aibi as a fashion accessory that will make you stand out in a crowd.

Living AI have just released another video, this time showing how Aibi can be picked up and placed into a pocket.

The pocket is then attached to a necklace via a magnet. The necklace is worn underneath clothing, with the magnet allowing Aibi to be attached to the wearer’s clothing in a seamless and fashionable manner.

Despite working with very limited moving parts, Living AI have managed to create an incredibly expressive robot that displays a lot of movement. When Aibi is picked up by the head, it wiggles its body while keeping its head still. When Aibi is seated in the pocket, its head moves again while its body remains stationary.

A wearable robot companion is going to be a huge shock to the average person on the street when they see Aibi worn on clothing for the first time.

This is a revolutionary concept that could disrupt the companion robot pet market, and possibly the traditional pet economy, by shifting consumers’ focus from owning biological animals as companions to owning robots with computational intelligence. Aibi may create cultural changes far beyond this...

Now I see the symbolism behind the two bitcoins circling Aibi in the first video...Living AI have hit the jackpot.

 
Image: Living AI. Aibi robot beside an open Apple AirPods case.

Living AI have done it again!

This latest sneak peek draws attention to Aibi’s size: juxtaposed against an Apple AirPods case, Aibi is... smaller!

Living AI are claiming Aibi is the smallest AI robot and that this release is a historic moment. That may seem like a bold claim, but when you consider the number of sensors Living AI have managed to fit into such a small footprint, the claim of historic significance may not be so far-fetched. Aibi is definitely the smallest AI companion robot.

Living AI have stated they will release more details next week, possibly on Monday, regarding Aibi’s functions and abilities.
 
Despite saying they were going to release more details next week, Living AI have posted yet another teaser showing Aibi on a set of scales with a pair of Apple AirPods to demonstrate how light Aibi is in comparison.

Living AI have yet to release actual live footage of the Aibi robot and so far have only released CGI footage.
Image: Living AI. Aibi robot and AirPods on scales.

Watch the video here:
 
First photo of Aibi released!

Image: Living AI. Two Aibi robots undergoing validation and diagnostic testing.

Living AI just released a photograph showing the actual production-ready hardware of the Aibi robot, even going so far as to show Aibi without the external covering to prove this is not CGI.

Judging from the way Aibi is looking directly at the camera, it appears Aibi can make eye contact, which is a huge deal.


From the photos, it appears these two Aibi units might be from a recently created small batch run and are going through final validation testing, judging by the accompanying phrase: “Aibi is cooperating with the test. The two of them look healthy and all indicators are normal.”

Note the digital multimeter to the left, which suggests Living AI are using it as a diagnostic tool to check the voltage, current and resistance in the electronics.

This release of a photo showing early-stage, production-ready hardware undergoing testing possibly indicates Living AI have unveiled Aibi earlier than they anticipated, due to a competitor recently releasing a very similar product based on their Emo robot.

We also see how Aibi’s station attaches to a USB cable to get power.

Notice the large circuit board that follows the contours of Aibi’s body: Living AI have packed as much processing power into Aibi’s small frame as they can, so expect Aibi to have reasonably good computational performance.

We can also see that the mechanism that turns the head is slightly exposed.

So far Living AI have yet to release any footage of Aibi that is not CGI, but they have updated their YouTube channel to include the recent video of Aibi and the AirPods on scales, which I have shared below.

Looking forward to seeing live footage of the final production hardware in action.

Living AI have previously mentioned that Aibi will be able to talk. Whether this will consist of pre-programmed phrases or an LLM is still unclear. If it is an LLM, it will be interesting to see whether it is text-based or multimodal.

Judging from the warping of the lighting reflected in Aibi’s faceplate, it appears the faceplate may be slightly curved.

Living AI have mentioned Aibi will be available for pre-order this week!

What do you think of Aibi? I feel Aibi looks even better in reality.

Source:

 
Footage of Aibi finally released. Aibi is seen dancing and singing. The movements are incredibly expressive, and Living AI mention they have designed the hardware to minimise the noise of the motors and gears, making Aibi operate almost silently.

 
EMO's new firmware has him singing the same song :)
I guess I'm gonna have to start saving, this lil' dude looks so cute.

Interesting to hear Aibi is similar to Emo.

Have you ordered Aibi yet? I still haven’t.

Living AI just released footage, this time not CGI, of Aibi turning on its fixed charging base. This may be what convinces me to take the chance, although I’m not really interested in the virtual feeding and I’m unsure whether Aibi is aimed more at kids or adults. I’m leaning more towards robots aimed at adults, without all the immaturity seen in robots like Loona, Aibo, Vector or Emo.

I’m still undecided if I will get one. The fact they haven’t released any unedited footage showing what Aibi does autonomously when left to its own devices is a red flag for me.

 
Although this lil' dude looks great, I agree with you on the "more robots for adults" needed. These guys are all very entertaining & provide some companionship but yeah. I think I mentioned that a kind friend has ordered a RUX for me. That will be interesting, s/he looks more task oriented.
To be honest I think more "natural speech" type robots (which Rux promises) with more adult-oriented interactions are a little ways off & are going to have a hefty price tag.

That being said, more companies are moving into this type of robot companion as they see more people wanting something more than a cute, but limited, companion. Hardware & software are advancing in robotics, I think the software & architecture are more the sticking point right now.

What's your take on moving forward?
 
Hi Morgan and thanks for the thoughtful post.

I know it sounds crazy, but of all my robots (Vector, Loona, Aibo and Moxie), it is Moxie whom I find most “mature”, despite being aimed at children.

If you look at what is popular, it’s inexpensive budget robotic pets like Loona, Vector and Aibi that sell in the tens of thousands to millions. Meanwhile Aibo, at a much higher price point, has seen virtually zero consumer interest from a global statistical standpoint. So it seems anything over $500 won’t see mainstream success.

With this in mind, I think we will continue to see companies offering budget robotic pets at rock-bottom prices, yet packed with generative AI features.

I agree software is a sticking point, but I feel it is more advanced than the hardware.

Hardware needs to make some big advancements. For instance, I feel robots made of hard plastic are outdated: if I’m looking for a companion robot as a substitute for a biological pet, I don’t want it to be uncomfortable to hold or pick up.

The hardware also needs to be impervious to damage like scratching, and it should be possible for the robot to self-clean by washing itself.

Embodied will probably offer a budget robot aimed at an older audience in the next year or so; I would keep an eye out for that.

We still don’t know what conversational abilities Aibi will have, though.

I feel that once Optimus comes out we will see a shift towards robots aimed at an older audience, but until then I can’t see the market changing in the short term.
 
Aibi preorders are still open at the time of this posting. I finally gave in and ordered one against my better judgment.
 