This might be a flashback to Google Glass. At an event in San Francisco this week, Google announced a new camera you can “clip” onto anything, called Clips. Say you want to take photos at a birthday party. Just attach the camera to the back of a chair and let the machine learning take over. The camera can identify people and even snap photos when there’s a smiling birthday-candle-blowing kidlet in the scene, recording your memory forever.

Which all sounds great, except for the fact that this is the type of gadget that could be used for nefarious purposes. And because it works autonomously, you won't really know someone is taking pictures unless you spot the camera itself, or notice it making a noise or flashing a light.

That said, a Google representative says the device is intended for intimate settings like birthday parties and family gatherings, not surveillance.

“Google Clips is designed specifically with parents and pet owners in mind. It’s a camera and made to be used intentionally to capture more moments — seven-second clips — of the people that are important to you,” Google product manager Juston Payne told VentureBeat. “It doesn’t work well as a ‘set it and forget device’ or something you wear — it is designed to give you more spontaneous moments using machine learning of the people you’ve taught it are familiar to you and are smiling, and [it] looks for clear and stable shots.”

Payne added that Clips looks like a camera, not a spying device. The clips are stored locally and are transmitted only when you connect the camera to your phone. If another phone tries to pair with Clips, the device will wipe the local data.

Security experts still have concerns, though.

“First of all, are we as humans that vain or that apathetic that we need so many pictures of ourselves and our moments, yet don’t want to actually participate in the function of capturing them?” asked Chris Roberts, chief security architect at Acalvio, speaking to VentureBeat.

Roberts questioned whether a device that sits in a corner and can snap photos all day is really that well protected from intrusion. Even if the images and short video clips seem innocuous to most consumers, there's still a question of how the device could be used by a hacker in a public place.

Roberts argues that the device further erodes a sense of privacy in public places, a scenario portrayed in movies like The Circle (albeit without a lot of subtlety).

“Are we really that pathetically [unconcerned] about our privacy and that of our children (whom this seems to be targeted at capturing) that we willingly invite and pay for more and more technology into our homes that continually erodes our privacy? The answer to both, thanks to Google Clips, seems to be yes,” added Roberts, who wondered how long it would take someone to deconstruct and reverse engineer the device.

Worse, this is just another hint at a future surveillance society. If hundreds of people are using “clipped” cameras all over schools, parks, and restaurants, it's just a matter of time before a compromising photo, captured automatically by an AI, is used against its subject. AI could come into play there as well, cutting through swaths of data and comparing location information against social media accounts to find an easy target.

Another analyst mostly had questions about how the device will be used.

“Could these start showing up in locker rooms or other private places?” wondered Dan Lohrmann, chief security officer at Security Mentor. “How will this AI know to filter private moments? What will the software find interesting? Do people have a right to the content if these are used to record in public places? What happens if someone else says delete my video? Remember that these can be used places where a human is not [obviously controlling the camera directly].”