I just bought a Xiaomi device for my dad and activated Xiao Ai. It struck me that AI agents have long been woven into our daily lives: voice commands to switch devices on and off, install apps, navigate, shop, and more. It also reminded me that my brother, who works at Google in Singapore, has a fully smart home with voice-controlled appliances, where even microwaving food takes a single command. However, agents like Xiao Ai, Tmall Genie, and Nomi are each embedded by their own brand inside its products, forming a closed ecosystem that is hard to attack from outside, with usage restricted to that brand's own infrastructure.
If OpenClaw keeps developing, it should eventually be able to connect to any digital product and carry out any prompt- or command-based interaction.
My suggestion: how about our lobster starts with external control interfaces that support multiple dialects first?