I wrote this. I had/have absolutely no expectation that Flock would comply with my request, but figured I should try anyway, For Science. Their reply rubbed me the wrong way, though. They seem to claim that there are no restrictions on their collection and processing of PII because other people pay them for it. They say:
> Flock Safety’s customers own the data and make all decisions around how such data is used and shared.
which seems to directly oppose the CCPA. It's my data, not their customers'.
Again, I didn't really expect this to work. And yet, I'm still disappointed with the path by which it didn't work.
They were saying "don't write to us, talk to the people who own the cameras and ask them to delete the data". A company that manufactures video cameras is not the one to talk to when someone records you; talk to the person who recorded you.
But a reasonable person would say -- the data is stored on Flock servers, not with the camera owners. And Flock would say, just because we sell data storage functionality to camera owners doesn't mean we own the data, any more than a storage service you rent a space from owns what you put in that space.
But then an even more reasonable person would say: the infrastructure is designed in such a way as to create inadvertent sharing, and the system has vulnerabilities that compromise the data, so Flock has responsibility for setting up the system in such a way that it's basically designed to violate privacy.
And that is the main criticism of Flock. You need to have a more nuanced criticism. It would be really interesting to see this litigated.
I think you should write them back and ask that they provide you with a customer list and continually update you as they get new customers so that you may follow the advice they've given you.
Wait, is it your data? If you drive your car in front of a Ring camera on my house (I don't have a Ring camera don't @ me), is it your claim that you own the data on that camera?
It would be revealing to see a judge scrutinize the degree of control Flock maintains over the system and deliberate on whether the company truly is as hands-off as it claims when it comes to privacy obligations.
Personally I feel if you're going to build so turnkey a system to facilitate collection of personal data by your customers at the scale Flock seeks, then at a minimum you should build an equally turnkey method to handle requests like the one you made. It would be a service both to their customers and to consumers at large who wish to exercise their rights under legislation to opt out.
The fundamental reason we ended up in a world where companies just pay lip service to privacy and don't take it seriously is because consumers / voters put up with it. I'm heartened when I see individuals keyed into the issue.
Is there some way to contribute funds toward your legal fees?
These laws get complicated quickly. There's a specific ALPR law in the CA civil code which seems to carve out several exceptions for a business like Flock:
The enforcement provisions are rather bleak as well and afford no opportunity to directly bring a case against the agency that operates the system but instead just the individual who misuses it.
I think one of the more direct attacks would be going after jurisdictions that chronically have officers misusing the system. I think you're going to have to create precedent in this way to foment actual change.
Nice work all the same. These systems need to be prodded and tested. Even unsuccessful results such as this tell us something about the situation we're in.
In short, Flock is a "service provider" and not the entity doing the recording.
Perhaps you can make a case that they are a "data broker" instead (https://oag.ca.gov/privacy/ccpa#collapse1i), but that is a separate law, and what you are really looking at is a combination of license plate, time, and location data being collected and sold without your consent.
Obviously, I am not a lawyer (and not even US-based), but I like when privacy is respected :)
The data ownership is really interesting, as many threads here are going into. I wonder if it's possible to sidestep that entirely, though! Under the CCPA, "personal information" is defined as information that identifies, relates to, describes, is reasonably capable of being associated with, or could reasonably be linked — directly or indirectly — with a particular consumer or household. That says nothing about ownership.
To the extent that Flock is only storing the data on behalf of their customers, I'd understand they wouldn't be required to delete it. But to the extent that they are indexing it, deriving from it, aggregating it across customers, and sharing it via their platform, it seems they should be required to remove that data from those services.
But then again, I am not a lawyer!
I have some background in data privacy compliance.
It sounds like they are claiming to be a Service Provider under CCPA, which is similar to a Processor under GDPR. Long story short, a Controller is the one legally responsible for ensuring the rights of the data subject, and a service provider/processor is a "dumb pipe" for a Controller that does what they're told. So IF they are actually a Service Provider, they're correct that the legal responsibility for CCPA belongs to their customers and not them.
That's a big IF, though.
Being a Processor/Service Provider means trade-offs. The data you collect isn't yours, and you're not allowed to benefit from it. If Flock aggregates data from one customer and sells that aggregate to a different customer, they're no longer just a service provider. They're using data for their own purposes, and cannot claim to be "just" a service provider.
But maybe I am unclear on how Flock works.
I noticed that the company is glossed as "Flock" and not "Flock Safety (YC S17)" in posts like this and last week's "US cities are axing Flock Safety surveillance technology", https://news.ycombinator.com/item?id=47689237.
Did YC house style change a while back to drop the "(YC xxx)" annotation since so many popular firms participate, or because it's well known?
Remember that the difference between "Flock can do whatever the hell it wants" and "Flock is required to delete your data at your request" is a law. Citizens vote for legislators. If you want this to be a higher priority for your legislators, buy them off.
I think you're going to have a hard time with this...
Flock seems to leave the data in ownership of the government. They are just providing the service of being custodians for storing and accessing that data.
You probably would get a similar response by submitting your request to Amazon Web Services or Google Cloud or whoever has Flock's data: "sorry, we're just holding the data on behalf of Flock".
In either my example case or your stated case, you would have a very hard time convincing the host business to destroy their customers' data without a court order or court case that shows their policy is invalid and they must comply.
Not a lawyer, just noting the parallel.
I do appreciate that Flock's response says that they cannot use the data they've collected for other purposes, which further reinforces my cloud storage analogy: the cloud vendor can't look at the data you upload to storage to, e.g., build profiles on you/your business.
“United States v. Jones, 565 U.S. 400 (2012), was a landmark United States Supreme Court case in which the court held that installing a Global Positioning System (GPS) tracking device on a vehicle and using the device to monitor the vehicle's movements constitutes a search under the Fourth Amendment.”
https://en.wikipedia.org/wiki/United_States_v._Jones_(2012)
> In accordance with its Terms and Conditions, Flock Safety may access, use, preserve and/or disclose the LPR data to law enforcement authorities, government officials, and/or third parties, if legally required to do so or if Flock has a good faith belief that such access, use, preservation or disclosure is reasonably necessary to comply with a legal process, enforce the agreement between Flock and the customer, or detect, prevent or otherwise address security, privacy, fraud or technical issues. Additionally, Flock uses a fraction of LPR images (less than one percent), which are stripped of all metadata and identifying information, solely for the purpose of improving Flock Services through machine learning.
In this document, to which they linked in their reply, it says clearly "address ... privacy ... issues."
Does your case not constitute a privacy issue? I would say so.
Continuing further down, their "trust us" claim about how they employ machine learning would need some proper transparency into how that can be guaranteed.
Per my understanding of the law for these sorts of data collectors, at least in the U.S., you need to contact the local municipalities (Flock's customers) for this redaction and the jurisprudence is governed at the state and municipal level.
The best source of this information is https://deflock.org/ . FWIW, this is run by a neighbor in Boulder, CO which has been wrestling with the use of these cameras.
How did we get to allowing this in the USA? I remember the zeitgeist used to be to make fun of China's mass surveillance / social credit system, and ten years ago proposing to build something like this in the USA would be unthinkable. It's wild that we're just willingly sliding into the same system here too.
If Flock collects and processes PII data, then all their customers are "subprocessors". Flock should really have a Data Processing Agreement with their subprocessors, to legally ensure they follow the same PII handling controls as Flock does.
For example, if Flock receives a legitimate request to delete some data, then Flock must forward that request to all their Data Processors (e.g. including AWS/GCP/Cloudflare) and they must delete it as well.
I'm not sure how this request even makes sense. By the logic of the demand made here, a municipality can set up an ALPR system to record evidence of crimes (after the grim failure of Flock's alerting in Oak Park where I live, we rolled Flock back to a configuration where that was all it was used for), and random people can mail Flock to be exempted from that evidence collection.
It's good to want things, and I get why people want this specific thing, but the logic that says this is a viable demand under current law seems like it would prove way too much.
An interesting quandary here is that they'd need to constantly scan for you and your vehicle, etc., so that they could know it was you and then delete you. So to ensure they don't observe you, they need to observe you.
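For what it's worth, opt-outs like this are usually handled with a suppression list checked at ingest rather than with ongoing tracking: each capture is hashed and compared against the opted-out set, and matches are discarded before anything is retained. A minimal, purely hypothetical sketch in Python (the salt, plate values, and pipeline shape are all made up; nothing here reflects how Flock actually works):

```python
import hashlib

# Hypothetical suppression list: salted hashes of opted-out plates,
# so the list itself doesn't store plates in the clear.
SALT = b"per-deployment-salt"
opted_out = {hashlib.sha256(SALT + b"7ABC123").hexdigest()}

def ingest(plate: str, image: bytes, store: list) -> bool:
    """Store a capture unless the plate has opted out.

    The plate is 'observed' only for the instant needed to compute
    the hash; a match is dropped with no image or metadata retained.
    """
    digest = hashlib.sha256(SALT + plate.encode()).hexdigest()
    if digest in opted_out:
        return False  # suppressed: nothing written to storage
    store.append((plate, image))
    return True

captures = []
ingest("7ABC123", b"<jpeg>", captures)  # suppressed
ingest("8XYZ999", b"<jpeg>", captures)  # retained
```

The point of the sketch is that the paradox is narrower than it first appears: the system momentarily reads every plate either way, but an opted-out plate can be discarded at capture time rather than stored and then searched for.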
It’s fascinating how America could completely get rid of Flock cameras by sending criminals to prison and leaving them there, but we won’t do that so we have these endless arguments about these cameras.
I don’t really understand this response. I thought the entire business model of Flock was about circumventing the Fourth amendment by posing as a separate vendor selling information it has collected, rather than acting as an agent of the government.
Are they describing third entities that are between Flock and the government end consumers, when they talk about customers that own the data?
I've had the same kind of response from email providers like SendGrid; they claim it's not their data. There is no way to have SendGrid block you across their entire network; you have to play whack-a-mole with their customers. Seems like a flaw in these privacy laws when you can't ask the actual record holder to remove the records.
They seem to be implying that because they are a "service provider," they aren't responsible for complying with CCPA rules even though they are the ones with the data.
Does this hold water? I'm reading the CCPA rules now but if anyone knows, it would save me some tedious research.
Feels like a classic “we’re just the processor” answer.
But in reality you have no way to find or contact whoever actually controls the data, so it doesn’t really help. Kind of shows the gap between how the law works on paper vs how these systems work in practice.
This reminds me of Andrew Yang's "Data Dividend" project that ideally would have paid end users for their data rather than knowingly giving it away for free. IMO, it was a great idea but flawed execution against all the lobbying.
Back in 2018, CloudFormation data of mine leaked through a public gist (misconfigured gist plugin; I thought the gist was private but it wasn't, because I hadn't changed the default config) and showed up on an obscure website being served via Cloudflare. When I contacted CF, they claimed they couldn't remove the cached content because their system “doesn't work like that". I pushed back, and then they said that they're not responsible for the content and that I should send another email to abuse@cf... to get data about the hosting provider and deal with the content provider (e.g. VPS, ISP, whatever). After a few back-and-forth messages, I made it clear that if the data wasn't taken down within a week or so, I would escalate the issue to the local and German GDPR authorities (see https://www.ombudsman.europa.eu/en/european-network-of-ombud...).
And what do you know? I got no reply, but the content disappeared in ~48hrs.
The concept of what constitutes a sale under CCPA is pretty expansive. An exchange of value can be a sale that occurs outside of a processing relationship. I’d say their note is inaccurate.
To me this sounds like the equivalent of visiting a website that sells your data, and then asking AWS to delete your personal data when it actually belongs to a customer of theirs and only resides within their private storage.
Would you ask your local ISP to delete data they provided to Tinder like your IP address? That doesn't make sense to me.
If you write the police and ask them to delete all their data about you, that isn't a thing that they do. It shouldn't matter if the police store their data on AWS or their own servers.
Flock is a tool used by the police so it should work the same way.
https://www.courthousenews.com/california-drivers-accuse-flo...
https://leginfo.legislature.ca.gov/faces/codes_displayText.x...
As a suggestion, I saw you have RSS:
https://honeypot.net/feed.xml
I didn't see it mentioned in the main page or About or Archive. Maybe add it to a more visible place?
Or vote for/against them, that might work too.
Republican community? They love corporate surveillance. Democrat community? They too love corporate surveillance.
There is no "Peoples' Party" that rejects this garbage.
And Newsom killed this for some reason
> (2) (A) “Personal information” does not include publicly available information [...]
> (B) (i) For purposes of this paragraph, “publicly available” means any of the following:
> (I) Information that is lawfully made available from federal, state, or local government records.
> (II) Information that a business has a reasonable basis to believe is lawfully made available to the general public by the consumer
https://i.ibb.co/WWWYznHX/flock-future.png
See also a poster from IBM’s German subsidiary, circa 1934. The approximate translation: “See everything with Hollerith punch cards.”
https://www.clevelandjewishnews.com/opinion/op-eds/new-detai...