I wrote to Flock's privacy contact to opt out of their domestic spying program (honeypot.net)

by speckx 258 comments 670 points

[−] kstrauser 31d ago
I wrote this. I had/have absolutely no expectation that Flock would comply with my request, but figured I should try anyway For Science. Their reply rubbed me wrong, though. They seem to claim that there are no restrictions on their collection and processing of PII because other people pay them for it. They say:

> Flock Safety’s customers own the data and make all decisions around how such data is used and shared.

which seems to directly oppose the CCPA. It's my data, not their customers'.

Again, I didn't really expect this to work. And yet, I'm still disappointed with the path by which it didn't work.

[−] carefree-bob 31d ago
They were saying "don't write to us, talk to the people who own the cameras and ask them to delete the data". A company that manufactures video cameras is not the one to talk to when someone records you; talk to the person who recorded you.

But a reasonable person would say -- the data is stored on Flock servers, not with the camera owners. And Flock would say, just because we sell data storage functionality to camera owners doesn't mean we own the data, any more than a storage service you rent a space from owns what you put in that space.

But then an even more reasonable person would say: the infrastructure is designed in such a way as to create inadvertent sharing, and the system has vulnerabilities that compromise the data, so Flock has responsibility for setting up the system in such a way that it's basically designed to violate privacy.

And that is the main criticism of Flock. You need to have a more nuanced criticism. It would be really interesting to see this litigated.

[−] fudgy73 31d ago
AFAIK Flock owns the cameras and leases them out [0].

[0] https://www.flocksafety.com/blog/flock-safety-does-my-neighb...

[−] ldoughty 31d ago
But the data collected is property of the government and flock is not allowed to use that data for additional business gain (according to their statements)...

So they can't sell the fact that you're at Target at 8:00 p.m. on Thursday to anybody... Nor build profiles to sell to advertisers... And if that's the case that's very similar to cloud storage vendors.

If I access Hacker News, and the record of my visit is stored in an AWS S3 bucket, I can't ask AWS to delete my visitor record. Even though the server, network cards, wires, and storage medium are AWS property, it was Hacker News' website that generated that record, and it's their responsibility to handle my request to delete it. AWS' stance would rightly be "talk to the website operator for CCPA requests".

[−] valeriozen 30d ago
The AWS analogy breaks down because AWS doesn't encourage customers to pool their S3 buckets into a nationwide searchable index.

Flock operates a federated network. If you drive past an unmarked camera, you have absolutely no way of knowing which specific HOA or town leased it, so how are you realistically supposed to know who the "data controller" is to send your CCPA or deletion request to?

[−] giancarlostoro 30d ago
Start standing in front of the cameras looking sketchy long enough till police are sent out to ya, then ask the cop who called.
[−] chaps 30d ago
Someone once dropped some fireworks not too far from me at 3am a few years back. They were loud and, yeah, cops were called. A few minutes later about five cars drove past me about 30mph over the limit. Not sure how they didn't see me or try to see me. But I know they didn't catch the BRIGHT orange and lifted car.

Me being me, I submitted a FOIA request for the dashcam footage of the five cop cars and the dispatch logs.

Instead of pulling over the easily identifiable car, they pulled over some random guy. They were behind him the whole time; five cop cars pulled in behind him thinking he'd fired a gun a few minutes back.

He was let go without a citation, but the official reason, despite the stop being paired with the dispatch for the firecracker, was a broken headlamp.

[−] giancarlostoro 30d ago
I may or may not know a business owner who got criminals off their business' street by saying he thought he saw a gun any time criminals showed up to do things, everything from prostitution to selling drugs. Cops showed up immediately. Eventually the criminals stopped coming by altogether; it's probably the safest street in quite a rough part of town.

It's crazy how cops just rush to very specific and nuanced crimes. Someone likely said they heard gun shots, and then they scrambled to find them.

[−] hrimfaxi 30d ago

> It's crazy how cops just rush to very specific and nuanced crimes. Someone likely said they heard gun shots, and then they scrambled to find them.

Is it crazy? Shouldn't the response be proportional?

[−] giancarlostoro 30d ago
Contrast to someone being shot dead, if the killer drives away, they might be there half an hour later.
[−] Forgeties79 30d ago
The police should do a lot of things they fail to do.
[−] fwipsy 30d ago
Police prioritizing responses to violent crimes where lives may be in danger seems reasonable to me.
[−] jnovek 30d ago
I don’t care. I don’t care who owns the data. If I can’t easily get private information like my movements removed from a database like this, the legislation does not sufficiently protect me.

It should absolutely be Flock’s responsibility to remove my data and we should absolutely require it by law. Full stop.

[−] _DeadFred_ 30d ago
The legal term is 'distinction without a difference'. Flock/others can't create a weaselly scenario to pretend it's something else. Otherwise people could bypass all kinds of laws/rules just by giving some weaselly description to everything.

This also falls under the 2026 rule "Everyone Is 12 Now". Flock is literally acting like a 12 year old to get out of following the rules. My 12 year old tried to use this same dumb parsing of things to avoid rules/consequences.

[−] rjmunro 30d ago
The problem with this is where do you draw the line? If I film you with my iPhone (e.g. you walk past in the background of my video), should Apple delete my video from my phone and iCloud account based only on your instructions?

Apple holds the data in iCloud, and Apple (or a phone network) may be leasing me the phone. That sounds pretty similar to the Flock situation.

I guess the difference is that Flock might be sharing the data from a customer's camera with other customers. Then they are definitely controlling it.

I think the bigger problem with Flock is the fact that their cyber security is so laughably bad that non-customers can easily access the data.

[−] psychoslave 30d ago
Not pronouncing on which path is the most dystopian, just for the fun of the exercise, what if we push in that direction:

Given the rule, I would expect (IANAL) that Apple should not deal with data stored on phones they sold.

People are responsible for what they store on their device. When I take a photo in the street, if someone comes to me asking me to erase a photo with them or their kids in the background, I'll tell them I don't publish any photos online, which is generally the concern people have in mind, and it usually stops there. But if they insist, I will remove it from my phone, because I'm too lazy to actually edit the photo and remove them from the picture, even if that is certainly doable with a simple prompt by now.

Now if Apple automatically stores photos on some remote server they own, they are the ones responsible for making sure they aren't storing something illegally. Microsoft, Google, and Apple use PhotoDNA to detect known CSAM, if I'm not mistaken, though legally they only have to remove it once they get a notice about it. In the same way, they could proactively blur the faces of anyone not whitelisted for the uploading account. And, by that logic, they should certainly remove the information regarding a person if they get a notice, just as they wouldn't keep CSAM data once notified, would they?

Anyway, the underlying issue is not who stores what, but what societies lose by letting mass surveillance infrastructure be deployed, no matter how the ownership/responsibility dilution game is played on top of it.

[−] jnovek 30d ago
Are you using your phone photographs to track my movements? I don't care about the photographs part, I care about the "collecting data that can track my movements" part.

I don't mean my movements on the internet either. I understand that those things are easy to track. I mean in real life.

As far as responsibility for the data goes, you're right, it's not clear. Therefore, anyone who uses the data -- Flock or their customer -- should be required to delete it on my request.

That seems like a pretty clear delineation, no?

[−] cute_boi 29d ago
If Apple collects all the data and tracks movement, then yes, they should be liable.
[−] lazide 30d ago
A reasonably nuanced defense could likely claim that being able to do what you want would have much worse side effects on privacy.

For example, would you want to be able to tell Public Storage (or some other storage unit place) to remove any naked photos of you stored anywhere in their storage units?

For them to actually be able to do that would require that they have nigh omniscience over everything stored by/for everyone in every one of their storage units. Even inside closed boxes.

Now, it's not the same thing of course - but hopefully you understand what I'm referring to?

[−] LadyCailin 30d ago
Except that the analogy is that they already have, or can easily create, that list. If they couldn’t, their value proposition would be lame. “We know you’re looking for a specific license plate, here’s a million hours of footage from all over the city, have at looking through it all.”
[−] lazide 30d ago
Only for paying customers, which you aren't of course. If those customers paid public storage to inventory their stuff, then that inventory is their property. Surely it would be inappropriate to use their inventory data to find your naked photos. A violation of privacy even. (/s, kinda)

I was enumerating the likely defense, not that it's valid.

[−] shakna 30d ago
"Existing capability" removes the argument against onerous requirements, in a legal setting.
[−] tptacek 30d ago
The law cares about lots of things we don't care about.
[−] jnovek 30d ago
The law is there to serve society. If it is not effectively serving society, it should be changed.
[−] tptacek 31d ago
This is also true according to their contracts (we were one of the first munis in the country to ostentatiously cancel our Flock contract, and the lead up to that was a bunch of progressive legal experts poring over that contract looking for holes.)
[−] fsckboy 30d ago

> a bunch of progressive legal experts poring over that contract looking for holes

all attorneys represent their clients; your attorney does not have to share your opinion of the law or public policy, they can still interpret what the law means to you.

if you are afraid your attorney might have a bias (they are human), you may get better advice from the "misaligned" POV: the flaws/holes in a privacy law found by a pro-business conservative attorney are more likely to find sympathy in the courts from both conservative and progressive judges.

[−] shermantanktop 30d ago
As a practical matter, this may be good advice. But it also places a demand on someone with a legitimate concern that they go find an ideological "beard" to make themselves more palatable and sympathetic.

It's not hard to see how this enables an institution to gate itself from criticism.

[−] thaumaturgy 31d ago
Except that Flock very clearly benefits financially from having direct access to this data: owning (and in their own documentation, they very clearly do own it) a network of 80,000 surveillance devices across the country, and owning every single transit point for the data they collect, is what gets them to a $7.5 billion valuation from investors.

The fact of the matter is that Flock is playing two-step with the concept of "ownership" of data. They disclaim ownership as a way to leave local agencies holding the bag for liabilities, but they fight tenaciously to retain complete and unfettered access to that data.

(After organizing a community group that won Flock contract cancellations in multiple jurisdictions in Oregon, I went on to coauthor state legislation regulating ALPRs. I am very well familiar with all the dirty ball they play.)

Also, Flock's cameras collect more data than is provided to police agencies. Who owns that data, I wonder?

[−] necovek 30d ago
That makes them a data broker in my reading, and at least in California, Data Broker legislation should apply. CA Data Broker registry gives me access denied, but that could be because I am outside US.
[−] marcus_holmes 30d ago

> But the data collected is property of the government

I thought this was the get-out clause from the constitutional problems with Flock? That because Flock is a non-government organisation it isn't restricted by the Constitution (i.e. the Constitution only restricts what the government can do).

They can't have it both ways - if Flock are collecting the data then they are subject to the privacy laws. If it's the government collecting the data via Flock as just a service, then they are subject to constitutional restrictions.

[−] unethical_ban 30d ago
This is worth validating independently, but to be clear:

Are you saying Flock itself does not have access to any of the data, and that the data they store on behalf of local governments is not fed into any central datalake? That every organization's data is completely, unalterably separate from everyone else's?

If so, that makes the panopticon slightly less powerful.

[−] gorgoiler 30d ago
That’s a pretty compelling argument, but what if I went round to AWS’ house, peeked into their kitchen, and saw a crate of photos on their table with me in them?

I’d absolutely say:

“Hey, that’s me! Give me those right now!”

I’d also be pretty angry if they told me:

“Sorry we’re storing those for Corp Inc. Go ask them.”

To refute my own point though, this only sounds annoying because the data processor is being irritating by manually referring me to the data controller. In practice, it would be trivial for them to automatically forward communications between me and the controller.

That’s what feels amiss with the top-level article.

[−] eagleinparadise 30d ago
If I lease a property to a tenant (apartment, retail, industrial use, whatever) and that tenant commits illegal activity on the property, would the landlord be liable for knowing about it? Or not?

"Sorry FBI, the tenant renting my warehouse to manufacture cocaine is not my responsibility. I won't do anything about it. You deal with them."

Nope, that's a failure of a duty to act, and aiding and abetting criminal activity if you have constructive knowledge.

[−] wavefunction 30d ago
I assume they are building "meta-data" profiles of people based on the data they say they can't use directly. That seems like an easy work-around that satisfies the lip-service they've given to the issue.
[−] samrus 30d ago
Yeah, but their argument is that if someone takes a photo of you with their iPhone and it's uploaded to iCloud, you can't ask Apple to delete the photo; you need to ask the person who took it.
[−] mminer237 31d ago
If you go to Rent-A-Center and rent a DSLR, that doesn't make Rent-A-Center responsible for the pictures taken by their cameras.
[−] robot-wrangler 30d ago

> They were saying "don't write to us, talk to the people who own the cameras and ask them to delete the data".

The response to this should just be, "Yes, very well, please divulge a complete list of your customers, their contact information, and information about camera locations so I will be able to pursue this per instructions".

When that obviously doesn't work either, we can all agree the law as written is completely useless, and feel great about rewriting it in a way that's calculated for maximum damage to both the vendor and their customers, with collateral damage to the whole panopticon. Or, just spitballing here, we can skip to the punchline and do all that anyway.

[−] halJordan 31d ago
It easily goes both ways. But we do sue American gun makers for deaths caused by lunatics. We sue drug makers for drugs prescribed by a doctor. We sue cloud providers for not reporting illegal photos. Printers are forced to ID every printed page to combat counterfeiting. Banks are forced to close accounts even though it's not their dirty money.
[−] mistrial9 31d ago
the way this has been addressed in complex product liability cases in the past in the USA is that the public-facing brand owner has certain legal liability for the product, despite contractors or supply chains. In this case, it appears that Flock is the brand owner and is public-facing.
[−] dozerly 30d ago
I don’t think you’re informed on the topic. They do not just manufacture cameras.
[−] beambot 30d ago
Isn't this the equivalent of asking Google to delete your image off every Android phone (not just yours)?
[−] justinclift 30d ago
Wonder if Flock could be DMCA'd to remove the data they're hosting then? :)
[−] Glyptodon 31d ago
I think you should write them back and ask that they provide you with a customer list and continually update you as they get new customers so that you may follow the advice they've given you.
[−] tptacek 31d ago
Wait, is it your data? If you drive your car in front of a Ring camera on my house (I don't have a Ring camera don't @ me), is it your claim that you own the data on that camera?
[−] rkagerer 30d ago
It would be revealing to see a judge scrutinize the degree of control Flock maintains over the system and deliberate on whether the company truly is as hands-off as it claims when it comes to privacy obligations.

Personally I feel if you're going to build so turnkey a system to facilitate collection of personal data by your customers at the scale Flock seeks, then at a minimum you should build an equally turnkey method to handle requests like the one you made. It would be a service both to their customers and to consumers at large who wish to exercise their rights under legislation to opt out.

The fundamental reason we ended up in a world where companies just pay lip service to privacy and don't take it seriously is because consumers / voters put up with it. I'm heartened when I see individuals keyed into the issue.

Is there some way to contribute funds toward your legal fees?

[−] gowld 30d ago
Lawsuit challenging Flock's illegal data brokerage:

https://www.courthousenews.com/california-drivers-accuse-flo...

[−] themafia 31d ago
These laws get complicated quickly. There's a specific ALPR law in the CA civil code which seems to carve out several exceptions for a business like Flock:

https://leginfo.legislature.ca.gov/faces/codes_displayText.x...

The enforcement provisions are rather bleak as well and afford no opportunity to bring a case directly against the agency that operates the system, but only against the individual who misuses it.

I think one of the more direct attacks would be going after jurisdictions that chronically have officers misusing the system. I think you're going to have to create precedent in this way to foment actual change.

[−] everdrive 31d ago
Nice work all the same. These systems need to be prodded and tested. Even unsuccessful results such as this tell us something about the situation we're in.
[−] necovek 30d ago
This answer is relevant: https://oag.ca.gov/privacy/ccpa#collapse6d

In short, Flock is a "service provider" and not the entity doing the recording.

Perhaps you can make a case that they are a "data broker" instead (https://oag.ca.gov/privacy/ccpa#collapse1i), but that is a separate law, and what you are really looking at is a combination of license plate, time, and location being collected and sold as data without your consent.

Obviously, I am not a lawyer (and not even US-based), but I like when privacy is respected :)

[−] wakamoleguy 30d ago
The data ownership is really interesting, as many threads here are going into. I wonder if it's possible to sidestep that entirely, though! Under the CCPA, "personal information" is defined as information that identifies, relates to, describes, is reasonably capable of being associated with, or could reasonably be linked — directly or indirectly — with a particular consumer or household. That says nothing about ownership.

To the extent that Flock is only storing the data on behalf of their customers, I'd understand they wouldn't be required to delete it. But to the extent that they are indexing it, deriving from it, aggregating it across customers, and sharing it via their platform, it seems they should be required to remove that data from those services.

But then again, I am not a lawyer!

[−] lmkg 30d ago

> which seems to directly oppose the CCPA.

I have some background in data privacy compliance.

It sounds like they are claiming to be a Service Provider under CCPA, which is similar to a Processor under GDPR. Long story short, a Controller is the one legally responsible for ensuring the rights of the data subject, and a service provider/processor is a "dumb pipe" for a Controller that does what they're told. So IF they are actually a Service Provider, they're correct that the legal responsibility for CCPA belongs to their customers and not them.

That's a big IF, though.

Being a Processor/Service Provider means trade-offs. The data you collect isn't yours, and you're not allowed to benefit from it. If Flock aggregates data from one customer and sells that aggregate to a different customer, they're no longer just a service provider. They're using data for their own purposes, and cannot claim to be "just" a service provider.

[−] snowwrestler 31d ago
It’s not clear to me that it is actually your data. If I take a picture of you in a public place, I own the picture, not you.

But maybe I am unclear on how Flock works.

[−] AlBugdy 30d ago
Read a few of your posts. Just wanted to comment on how I like your to-the-point succinct style and how you care about privacy. :)

As a suggestion, I saw you have RSS:

https://honeypot.net/feed.xml

I didn't see it mentioned in the main page or About or Archive. Maybe add it to a more visible place?

[−] ratdragon 31d ago
maybe eff.org would be able to help you lawyer up or otherwise to push this forward. good luck!
[−] empathy_m 31d ago
I noticed that the company is glossed as "Flock" and not "Flock Safety (YC S17)" in posts like this and last week's "US cities are axing Flock Safety surveillance technology", https://news.ycombinator.com/item?id=47689237.

Did YC house style change a while back to drop the "(YC xxx)" annotation since so many popular firms participate, or because it's well known?

[−] wcv 31d ago
Flock has stonewalled with the "we are not the controllers" excuse here in MN too. We have similar rights to opt-out and delete under the MCDPA [0].

[0] https://ag.state.mn.us/Data-Privacy/Consumer/

[−] dsr_ 30d ago
Remember that the difference between "Flock can do whatever the hell it wants" and "Flock is required to delete your data at your request" is a law. Citizens vote for legislators. If you want this to be a higher priority for your legislators, buy them off.

Or vote for/against them, that might work too.

[−] ldoughty 31d ago
I think you're going to have a hard time with this...

Flock seems to leave the data in ownership of the government. They are just providing the service of being custodians for storing and accessing that data.

You would probably get a similar response by submitting your request to Amazon Web Services or Google Cloud or whoever hosts Flock's data: "sorry, we're just holding the data on behalf of Flock."

In either my example case or your stated case, you would have a very hard time convincing the host business to destroy their customers' data without a court order or a court case showing their policy is invalid and they must comply.

Not a lawyer, just noting the parallel.

I do appreciate that Flock's response says they cannot use the data they've collected for other purposes, which further reinforces my cloud storage analogy -- the cloud vendor can't look at the data you upload to storage to e.g. build profiles on you/your business.

[−] tedggh 30d ago
“United States v. Jones, 565 U.S. 400 (2012), was a landmark United States Supreme Court case in which the court held that installing a Global Positioning System (GPS) tracking device on a vehicle and using the device to monitor the vehicle's movements constitutes a search under the Fourth Amendment”

https://en.wikipedia.org/wiki/United_States_v._Jones_(2012)

[−] hmokiguess 30d ago
https://www.flocksafety.com/legal/lpr-policy

> In accordance with its Terms and Conditions, Flock Safety may access, use, preserve and/or disclose the LPR data to law enforcement authorities, government officials, and/or third parties, if legally required to do so or if Flock has a good faith belief that such access, use, preservation or disclosure is reasonably necessary to comply with a legal process, enforce the agreement between Flock and the customer, or detect, prevent or otherwise address security, privacy, fraud or technical issues. Additionally, Flock uses a fraction of LPR images (less than one percent), which are stripped of all metadata and identifying information, solely for the purpose of improving Flock Services through machine learning.

In this document, to which they linked in their reply, it says clearly "address ... privacy ... issues."

Does your case not constitute a privacy issue? I would say so.

Continuing further down, their "Trust Us" claim about how they employ machine learning would need some proper transparency into how that can be guaranteed.

[−] nekusar 31d ago
The only opt-out the citizenry has is with any of the following:

    2x4
    rebar
    spraypaint
    spray foam
    battery powered metal cutter
And bash those pieces of shit to chunks, or completely ruin the lens and solar panel.

Republican community? They love corporate surveillance. Democrat community? They too love corporate surveillance.

There is no "Peoples' Party" that rejects this garbage.

[−] calmbonsai 31d ago
Per my understanding of the law for these sorts of data collectors, at least in the U.S., you need to contact the local municipalities (Flock's customers) for this redaction and the jurisprudence is governed at the state and municipal level.

The best source of this information is https://deflock.org/ . FWIW, this is run by a neighbor in Boulder, CO which has been wrestling with the use of these cameras.

[−] gsleblanc 30d ago
How did we get to allowing this in the USA? I remember the zeitgeist used to be to make fun of China's mass surveillance / social credit system, and ten years ago proposing to build something like this in the USA would be unthinkable. It's wild that we're just willingly sliding into the same system here too.
[−] barelysapient 31d ago
If that's a valid excuse, then the CCPA isn't worth the paper it's written on.
[−] deepsun 31d ago
If Flock collects and processes PII data, then all their customers are "subprocessors". Flock should really have a Data Processing Agreement with their subprocessors, to legally ensure they follow the same PII handling controls as Flock does.

For example, if Flock receives a legitimate request to delete some data, then Flock must forward that request to all their Data Processors (e.g. including AWS/GCP/Cloudflare) and they must delete it as well.

[−] bix6 30d ago
[−] tptacek 30d ago
I'm not sure how this request even makes sense. By the logic of the demand made here, a municipality can set up an ALPR system to record evidence of crimes (after the grim failure of Flock's alerting in Oak Park where I live, we rolled Flock back to a configuration where that was all it was used for), and random people can mail Flock to be exempted from that evidence collection.

It's good to want things, and I get why people want this specific thing, but the logic that says this is a viable demand under current law seems like it would prove way too much.

[−] pugworthy 30d ago
An interesting quandary here is that they'd need to constantly scan for you and your vehicle, etc., so that they could know it was you and then delete you. So to ensure they don't observe you, they need to observe you.
[−] gguncth 30d ago
It’s fascinating how America could completely get rid of Flock cameras by sending criminals to prison and leaving them there, but we won’t do that so we have these endless arguments about these cameras.
[−] rz2k 30d ago
I don’t really understand this response. I thought the entire business model of Flock was about circumventing the Fourth Amendment by posing as a separate vendor selling information it has collected, rather than acting as an agent of the government.

Are they describing third entities that are between Flock and the government end consumers, when they talk about customers that own the data?

[−] rdiddly 30d ago
Flock's customers own the data the same way Uber drivers are independent contractors, i.e. it's designed for weaseling out of obligations.
[−] annoyingnoob 30d ago
I've had the same kind of response from email providers like SendGrid; they claim it's not their data. There is no way to have SendGrid block you across their entire network; you have to play whack-a-mole with their customers. Seems like a flaw in these privacy laws when you can't ask the actual record holder to remove the records.
[−] _moof 30d ago
They seem to be implying that because they are a "service provider," they aren't responsible for complying with CCPA rules even though they are the ones with the data.

Does this hold water? I'm reading the CCPA rules now but if anyone knows, it would save me some tedious research.

[−] cold_tom 30d ago
Feels like a classic “we’re just the processor” answer. But in reality you have no way to find or contact whoever actually controls the data, so it doesn’t really help. Kind of shows the gap between how the law works on paper vs. how these systems work in practice.
[−] pext 30d ago
This reminds me of Andrew Yang's "Data Dividend" project that ideally would have paid end users for their data rather than knowingly giving it away for free. IMO, it was a great idea but flawed execution against all the lobbying.
[−] atmosx 30d ago
Back in 2018, CloudFormation data of mine leaked through a public gist (misconfigured gist plugin; I thought the gist was private but it wasn't, since I had changed the default config) and showed up on an obscure website being served via Cloudflare. When I contacted CF, they claimed they couldn’t remove the cached content because their system “doesn’t work like that". I pushed back, and then they said that they're not responsible for the content and that I should send another email to abuse@cf... to get data about the hosting provider and deal with the content provider (e.g. VPS, ISP, whatever). After a few back-and-forth messages, I made it clear that if the data wasn’t taken down within a week or so, I would escalate the issue to the local and German GDPR authorities (see https://www.ombudsman.europa.eu/en/european-network-of-ombud...).

And what do you know? I got no reply, but the content disappeared in ~48 hrs.

[−] mmmlinux 30d ago
Lot of Flock Defenders in here.
[−] sklargh 30d ago
The concept of what constitutes a sale under CCPA is pretty expansive. An exchange of value can be a sale that occurs outside of a processing relationship. I’d say their note is inaccurate.
[−] carabiner 30d ago
It's not much worse than all the tracking adtech used by FAANG industry. Smartest people in the world working on these systems.
[−] kube-system 31d ago
I don't think they need your permission to use ALPR on your publicly displayed license plate.

> (2) (A) “Personal information” does not include publicly available information [...]

> (B) (i) For purposes of this paragraph, “publicly available” means any of the following:

> (I) Information that is lawfully made available from federal, state, or local government records.

> (II) Information that a business has a reasonable basis to believe is lawfully made available to the general public by the consumer

[−] ranger_danger 31d ago
To me this sounds like the equivalent of visiting a website that sells your data, and then asking AWS to delete your personal data when it actually belongs to a customer of theirs and only resides within their private storage.

Would you ask your local ISP to delete data they provided to Tinder like your IP address? That doesn't make sense to me.

[−] lacker 30d ago
Isn't that how it should work?

If you write the police and ask them to delete all their data about you, that isn't a thing that they do. It shouldn't matter if the police store their data on AWS or their own servers.

Flock is a tool used by the police so it should work the same way.

[−] CSMastermind 30d ago
We truly need a constitutional amendment granting a formal right to privacy.
[−] waterproof 30d ago
Happy to contribute to your lawyer fund. I want to see how this plays out.
[−] rbbydotdev 30d ago
it would be nice if flock did not and could not exist
[−] thangalin 30d ago
Sent to Benn Jordan:

https://i.ibb.co/WWWYznHX/flock-future.png

See also a poster from IBM’s German subsidiary, circa 1934. The approximate translation: “See everything with Hollerith punch cards.”

https://www.clevelandjewishnews.com/opinion/op-eds/new-detai...

[−] nour833 30d ago
[dead]