The Rabbit R1 might resemble a Game Boy or a Pokédex, but it’s so much more. Much like our friends Siri and Alexa, it’s a personal assistant device. The difference is that it’s built around generative AI, and it has been heavily criticised for its inability to keep your data safe.
Rabbit R1: The AI assistant that could put your privacy at risk
What is it?
The TL;DR of the Rabbit R1 is that it’s a less efficient version of your phone. It has a 1,000mAh battery, about a fifth of the capacity of a modern smartphone. It doesn’t run a conventional OS, either. Instead, it relies on an AI model trained to operate apps on your behalf. Rabbit’s founder, Jesse Lyu, says it’s like handing your phone to a friend so they can order food for you. You can hold down the side button and talk into it like a walkie-talkie, or use the scroll wheel on the side to select an app.
You can ask it about the weather or for restaurant recommendations, and use it to interact with the world around you. If you’re out and about, you can snap a pic and ask questions about what’s in front of you. You can order food, call an Uber, and generate AI images with Midjourney (for some reason).
While there’s a slew of complaints about the Rabbit R1, chief among them is that it fails to complete the main set of tasks it advertises. It only comes installed with four apps – DoorDash, Uber, Spotify, and Midjourney. Good luck getting a ride if your town doesn’t support Uber.
If you want to use it while you’re out and about (which is the expectation if you’re calling an Uber), you’ll need a data plan. However, you can connect it to your phone’s hotspot if you don’t want to fork out for a plan on a device that’s basically a bunch of apps encased in some hardware.
Privacy issues
The Rabbit R1 has found itself in hot water ever since it launched. Not only has it been subject to several data breaches, but a group of community researchers has uncovered a massive security flaw in its programming: API keys are hardcoded into the source code.
To get a better idea of what this actually means, I talked to Deyan Georgiev from Rapid Seed Box.
“So, imagine this: an API key is like a secret password that lets different software pieces talk to each other. When a device like the Rabbit R1 has this key hardcoded into its system—meaning the same key is baked into every single unit—it creates a huge vulnerability. If someone manages to uncover that key (which isn't as hard as it might sound), they don't just have access to one device; they potentially have access to all of them. It's like having one master key that opens every locker in a school,” he told SafeWise.
“I've personally seen how dangerous this can be. In one of my projects, we found that hardcoded keys allowed unauthorised users to tap into devices, access sensitive data, and even control device functions remotely. In some cases, hackers could turn these devices into part of a botnet, using them for malicious activities without the owners ever knowing.”
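To make that a little more concrete, here’s a minimal sketch in Python of what a hardcoded key looks like next to the safer pattern of loading it from the environment. The service name, endpoint, and key are made up for illustration; this is not Rabbit’s actual code.

```python
import os

import requests  # standard HTTP client, used here against a made-up endpoint

# BAD: a hardcoded key like this ships identically in every build.
# Anyone who reads the source (or pulls apart the device) gets the same
# credential, and rotating it means updating every unit in the field.
HARDCODED_API_KEY = "sk_live_example_do_not_do_this"  # placeholder, not a real key


def fetch_speech_bad(text: str) -> bytes:
    # Every device authenticates with the same baked-in secret.
    response = requests.post(
        "https://api.example-tts.com/v1/speech",  # hypothetical endpoint
        headers={"Authorization": f"Bearer {HARDCODED_API_KEY}"},
        json={"text": text},
    )
    return response.content


# BETTER: read the key from the environment (or a secrets manager) at runtime,
# so it never lives in the source tree and can be rotated on the server side.
def fetch_speech_better(text: str) -> bytes:
    api_key = os.environ.get("EXAMPLE_TTS_API_KEY")
    if not api_key:
        raise RuntimeError("EXAMPLE_TTS_API_KEY is not set")
    response = requests.post(
        "https://api.example-tts.com/v1/speech",
        headers={"Authorization": f"Bearer {api_key}"},
        json={"text": text},
    )
    return response.content
```

The point of the second version is that the secret never ships inside the device itself, so leaking the source code doesn’t leak the credential.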
Scary. This means whoever has access to the key could listen to every response the device has ever given. Yep, even those that include personal information. We’re talking home addresses, phone numbers, and location data.
The team of researchers identified the security flaw on the 16th of May. On the 25th of June, Rabbit issued a statement claiming “the API keys continue to be valid as of writing”. Rabbit told users in its Discord server that the four API keys had been removed from the source code. However, shortly after this, the researchers uncovered a fifth hardcoded key that hadn’t been publicly disclosed during the initial investigation. By the time they published a follow-up, that key was still active.
The fifth key was for SendGrid, the service Rabbit uses to send email. Anyone with that key could access the emails and user information handled through the r1.rabbit.tech subdomain, as well as the R1’s spreadsheet functions. Yikes.
Final word
The R1 has received some serious criticism for lacking basic functionality and putting its users’ privacy at risk. Though Rabbit has emphasised its commitment to securing its systems, it’s likely the R1 will find itself in hot water again. Exercise caution, and think carefully about the implications of handing your data to an AI-led company like Rabbit.