Stop Flock (stopflock.com)

by cdrnsf 308 comments 993 points

[−] scarmig 30d ago
Although I oppose the surveillance state, it's important to understand the motivations and incentives involved in the move toward Flock (and its eventual successors); until those are resolved, governments are going to keep implementing Flock-style programs with relatively tepid opposition.

Police departments are seriously understaffed in many major cities, and officers are much less efficient than they used to be. This has led to the decline of the beat cop, who provided a kind of ambient authority that helped create both a sense and a reality of public order. People really want the sense (even more than the reality!) of public order; without it, they will jump to faddish solutions that promise it, regardless of the data for or against them.

The best counter to Flock is to provide alternatives to it, not just to reject it while keeping the status quo going. We need a new, revitalized police culture that shares mutual trust and engagement with the community.

[−] bearjaws 30d ago
Most cities are spending 9-10 figures on police staffing, and somehow they're understaffed?

We simply aren't getting effective policing, and technology isn't the solution.

The reality is that cops have become police report writers, traffic accident helpers, and domestic abuse arbiters; that is over half the job.

[−] edmundsauto 29d ago
Law enforcement has also burned its goodwill with the public, making everything harder.
[−] scarmig 30d ago
I agree with everything you say; my "police departments are understaffed" is too generic, and might be better stated as "too few resources are devoted to traditional beat policing." Which isn't to say that the other things are unnecessary or pointless, but it's a different set of skills.
[−] nswest23 29d ago
[flagged]
[−] aldebran 30d ago
Police departments aren't understaffed. It's a priority problem, not a lack-of-resources problem. I live in a tech-heavy, wealthy city. It's small. No violent crime. Barely any crime at all. There are occasional break-ins and car break-ins. When one happens, it's a big deal.

We had one last year. Everyone around has cameras. The cops refused to do anything about it. They refused to get the recordings. The neighbor went door to door and gathered them herself. The cops still refused to do anything, even though you can see the car and the plates in multiple videos, from multiple angles.

Guess what the cops always have resources for? Hiding behind bushes and trees to ticket people going 5 over, or at turns where they know they'll catch people before the people see the cop car.

Our HOA came together and asked the police department about this. They gave us bullshit about how chain of custody of evidence and so on is hard, and how even if they put people in jail, the lenient judges will let them go anyway. It was fucked up.

Our HOA was going in hard on installing Flock cameras everywhere. I had to fight hard to keep that from happening. One of the reasons I won wasn't privacy: it was that the cops literally said that unless they could directly pull video feeds from the cameras, they wouldn't do much. And that access wasn't available to our police department, at least at the time.

There have been many other such stories I’ve personally witnessed in the cities I’ve lived in.

Cops seem to have plenty of resources to bully people of color, seize assets, hide behind trees and bushes to ticket people, shorten yellow lights so people get more tickets, etc., but never enough to actually do their jobs.

[−] scarmig 30d ago

> I live in a tech heavy, wealthy city. It’s small. No violent crime. Barely any crime at all.

Compare to e.g. Oakland, which recently approved a Flock expansion:

https://oaklandside.org/2025/12/17/oakland-flock-safety-coun...

Why?

https://sfstandard.com/2023/06/09/oakland-crime-police-respo...

https://oaklandside.org/2025/10/08/oakland-watchdog-audit-po...

https://www.cbsnews.com/sanfrancisco/news/oakland-police-off...

Now, will Flock help with this? No. But the visceral lack of safety people feel makes them more likely to see it as a necessary evil, not snake oil.

[−] burnt-resistor 30d ago
Baltimore has metric shittons of police cameras but it's still a dangerous shithole. Cameras don't stop idiot criminals from criming. Baltimore's police suck and the area sucks because there's massive unemployment, cameras aren't a panacea.
[−] MisterTea 30d ago
Yeah I don't get the camera supporters. Cameras do nothing. Desperate people aren't concerned about cameras or stopped by them.
[−] JuniperMesos 28d ago
Cameras make it easier to gather evidence that will let the criminal justice system try and incarcerate those desperate people. It's a lot harder to commit a crime if you are currently locked in prison.
[−] archon810 29d ago
We're talking about license plate recognition cameras, not cameras that show a blurry picture of a criminal committing a crime. These LPR cameras absolutely do result in meaningful law enforcement action and help catch criminals.

See San Francisco and their use of LPR cameras like Flock combined with drone surveillance.

https://fb.watch/GvATBMj1bj/

https://abc7news.com/post/how-sfpds-new-investigation-center...

Etc.

[−] tptacek 30d ago
I agree that police department staffing is less of a real issue than people claim it is, and that many departments have target staffing levels that are artificially elevated. But I'm struck by your comment about cops "hiding behind bushes to ticket people going 5 over", because in the ultra-ultra-progressive inner-ring Chicago suburb in which I live, one of the chief complaints about policing over the last couple years has been the lack of traffic enforcement.
[−] gbear605 30d ago
On a daily basis, I see several cars that are going forty+ miles per hour over the speed limit, weaving between lanes. They go right past the cop and the cop doesn't care. Then someone going five over goes past the cop and the cop gives them a ticket. They go after the easy ones that fill their quotas, not the ones that actually make anyone safer.
[−] lesuorac 30d ago

> one of the chief complaints about policing over the last couple years has been the lack of traffic enforcement.

Which traffic enforcement though?

I really do not like the fact that left turns on red are not enforced. I have numerous times seen people run a red light in front of a cop car with no enforcement.

That said, people going 35 in a 30? Like I care. People weaving in between lanes? Yeah that seems much more dangerous.

[−] JuniperMesos 28d ago
Are you taking the perspective of a car-driver risking getting a ticket for technically violating a traffic law, or a resident concerned about drivers technically violating traffic laws on the roads near where they live?
[−] tptacek 28d ago
Neither, but what I'm describing is the resident concern.
[−] gunsle 29d ago
Police departments in a lot of major cities are absolutely understaffed relative to population growth, after decades of demonization in the media. You are ignorant.
[−] lopsotronic 29d ago
To add to some of what others are saying, another problem is the measurement problem.

DAs and police in general are almost universally evaluated based on arrest numbers. Only very rarely on actual crime rates, and never on something as abstract as quality of life or local revenues or property values.

Arrest numbers are probably the wrong dial to be looking at when gauging how good law enforcement is.

[−] TZubiri 30d ago
https://news.ycombinator.com/item?id=47124169

I've noted this in the age verification debate, and in the Android developer verification debate as well.

Just denying the tradeoffs isn't productive; if the tradeoffs affect others, pushing your position while dismissing them as fake or unimportant is divisive. In actuality, I think both parties become incentivized to solve the other group's problems too. But as a centrist, that position often gets pushback from both sides, who seem to collaborate only indirectly, from a place of adversarial competition and good-versus-evil framing. I think that's less productive than just recognizing the conflict and negotiating, but perhaps it's more engaging...

[−] contagiousflow 30d ago
If you take a look externally to other countries and cities, do you attribute their relative safety to good policing?
[−] gunsle 29d ago
No that would be cultural homogeneity and extremely low immigration.
[−] contagiousflow 28d ago
Give "correlation between immigration and crime" one search. Or don't and just continue to hit on your bigoted dogwhistles
[−] gunsle 29d ago
You're not going to get nuanced law enforcement discussion on this site, considering the commentary here is Reddit-tier these days. I agree with you though - the mayor and city council here in Minneapolis continuously defunding the police and refusing to give them resources predictably led to sharp increases in crime. It's baffling to me that large liberal cities have demonized the police and gaslit their base into thinking everything is just fine.
[−] bmitch3020 30d ago
I don't want to stop Flock the company. I want to stop Flock the business model, along with all the other mass surveillance, and the data brokers. If the business models can't be made illegal, they should at least come with liabilities so high that no sane business would want to hold data that is essentially toxic waste.

Without that, we are quickly spiraling into the dystopia where privacy is gone, and when the wrong person gets access to the data, entire populations are threatened.

[−] stevemk14ebr 30d ago
You want to stop the source, which is that the government and other agencies can purchase surveillance data that would otherwise be disallowed by the 4th amendment. We need to end this 'laundering' of information through third parties, and enforce the constitution by its intent.
[−] RHSeeger 30d ago
Not just the government. It shouldn't be possible for any random stalker to find someone's daily movements.
[−] nullc 30d ago
They're also one and the same, generally: at least if the stalker has money or the right friends, most kinds of law enforcement access mean stalker access. It's not unheard of for an officer themselves to be the stalker, and there are so many people who work in law enforcement that bribing, impersonating, or persuading your way to access is not that big a deal. Not to mention that enabled stalkers can just file a federal lawsuit and issue subpoenas for records.

The only safe thing is for the records to never exist in the first place.

[−] soulofmischief 30d ago
Further reading:

AP: Across US, police officers abuse confidential databases

https://apnews.com/general-news-699236946e3140659fff8a2362e1...

[−] Tangurena2 30d ago

> It's not unheard of for an officer themselves to be the stalker

This was one of the motivations for passage of the Driver's Privacy Protection Act of 1994. Nowadays, officers need a legitimate reason to run a plate - unless the patrol car is fitted with automatic cameras[1] that look up every plate of every car they drive past.

> The Virginia state police used license plate readers to track people’s attendance at political events;
> The New York Police Department used license plate readers to keep track of who visited certain places of worship, and how often;

> Despite all this surveillance, ALPR technology has been repeatedly shown to be unreliable; like other police technologies, ALPRs can and do make mistakes.[2]

Generally, court decisions have held that you have zero expectation of privacy when you are in public spaces. Current license plate standards[3] aim for plates that are not cluttered and are easily read by the human eyeball, even when wrapped with license plate frames (which usually make the state hard or impossible to read, the most common failure mode for ALPR[4]). If the reflective material (traditionally called "ScotchLite"[5]) is worn out (or defaced), most states require the plate to be replaced.

Notes:

0 - https://en.wikipedia.org/wiki/Driver%27s_Privacy_Protection_... Prior to passage, a slang term for running/looking up the plate/registration of a car with a pretty woman driver was "running a date".

1 - https://sls.eff.org/technologies/automated-license-plate-rea...

2 - https://www.aclum.org/publications/what-you-need-know-about-...

3 - https://www.aamva.org/getmedia/646bcc8a-219b-47d8-b5cd-72624...

4 - https://www.aamva.org/getmedia/0063bf88-cb44-4ab9-90b6-200c8...

5 - https://www.3m.com/3M/en_US/scotchlite-reflective-material-u...

Disclaimers:

I used to work for my state's motor vehicle department and had database/developer access to driving licenses and motor vehicle registration records.

I graduated from a police academy when I was a youngster.

[−] jojobas 30d ago
How is that achievable? PIs can legally do it. Random people can keep tabs on you and exchange gossip. It's the sudden scale and low cost that doesn't sit well with the freedom not to be tracked in public 24/7 that we took for granted.
[−] ethbr1 30d ago

> How is that achievable?

The core ill is aggregated data, because that's what puts the "mass" in mass surveillance, data mining, etc.

The collection actions are almost immaterial. Without persistence they must be re-performed for each request, which naturally provides a throughput bottleneck and makes "for everyone" untenable.

If we agree the aggregated data at rest is the problem, then addressing it would look like this:

1. Classify all data holders at scale into a regulated group

2. Apply initial regulations

   - To respond to queries for copies of personal data held
   - To update data or be liable in court for failing to do so
   - To validate counterparties apply basic security due diligence before transferring data (or the transferer also faces liability)
   - To maintain a *full* chain of custody of data (from originator through every intermediate party to holder) so that leaks / misuse can be traced
   - To file a yearly, publicly released report with the federal government on the types and amounts of data held and the counterparties it was transferred to
The initial impediment to regulatory action is Google, Meta, Equifax, etc. saying "This problem is too complex and you don't understand it."

It's not. But the first step is classifying and documenting the problem.

[−] rdevilla 30d ago
It's not achievable.

The only way is through - everybody should get into the practice of stalking and gossiping about each other in a Molochian environment, where the people who do not do so suffer from the losing side of an information asymmetry.

Expect AI, especially post-Mythos, to just enable this at even further scale. Consumer grade wireless networking gear as a whole is a very wide attack surface and is basically never updated.

[−] RHSeeger 30d ago
Sorry, I was ambiguous in what I meant.

It is not realistic to say that no person is allowed to keep track of another person; watch where they go, when, with who, etc.

It should not be acceptable for a company to gather information on "everyone"; where they have been going, when, with who, how often, etc. And it should not be acceptable for them to sell that information (to government agencies OR private citizens).

It's a matter of scale.

- Making the first one illegal/impossible would be difficult/costly; and not doing so has a limited impact (to society, not to the single person affected).

- Making the second one illegal is much easier, and it's much easier to shut down a large company doing it than it is 1,000 individual stalkers. The impact of making it illegal is much wider and better for society as a whole.

We don't want anyone being stalked. But in a cost/benefit analysis, we can do something about one of them but not the other.

[−] buzer 30d ago
If PIs can "legally" do it then it sounds like there is a law which allows them to do it. That law can be revoked (unless the power comes from Constitution which would make it effectively impossible to revoke).

Note that PIs are effectively illegal under GDPR by default. They would generally need to provide Article 13 notice, i.e. you would become aware of them unless they were just asking around without actually following you. Member states can make them legal though (via Article 23) and likely in many cases they have done so.

[−] jojobas 30d ago
In the US, PI licensing is only about PIing for hire. The actual acts of going through public records, following cars, and whatnot do not require a license; you can spy on anyone without a license as long as you don't get paid for it.

The EU is more complicated, but Article 14(5)(b) allows withholding notice if it would impair or defeat the purpose of the processing. The PI must, however, apply "safeguards", whatever that could mean.

[−] fc417fc802 30d ago

> following cars and whatnot do not require a license, you can spy on anyone without a license as long as you don't get paid for it.

Pretty sure that would be considered stalking and is broadly illegal in the US, PIs being an exception.

[−] buzer 30d ago
Article 14(5)(b) does, but that only applies for Article 14 notice (personal data not directly obtained from data subject). Article 13 (personal data obtained directly from data subject) does not have such exception in GDPR itself.

This becomes extremely relevant when you read it in the light of the C-422/24 decision. In that personal data collected via body worn cameras was determined to be "directly obtained". Paragraph 41 from the judgement:

> If it were accepted that Article 14 of the GDPR applies where personal data are collected by means of a body camera, the data subject would not receive any information at the time of collection, even though he or she is the source of those data, which would allow the controller not to provide information to that data subject immediately. Therefore, such an interpretation would carry the risk of the collection of personal data escaping the knowledge of the data subject and giving rise to hidden surveillance practices. Such a consequence would be incompatible with the objective, referred to in the preceding paragraph, of ensuring a high level of protection of the fundamental rights and freedoms of natural persons.

Given this it's very unlikely that PI observing (especially if they record) could be considered to be Article 14 instead of Article 13 type of collection as it's exactly "hidden surveillance practice" that the Court warned about.

Member states do have a right to restrict the Article 13 disclosure obligations via Article 23 restriction, but that requires specific law in the member state & the law itself must fulfill the obligations that Article 23 requires. Article 23(2) essentially forbids leaving everything up to the controller.

And as far as PI in the US goes, actions between stalking and PI "for self" tend to be so similar that I wouldn't necessarily recommend anyone to try it.

[−] NoSalt 30d ago
Government ... random stalker ... same thing.
[−] therobots927 30d ago
Means of Control by Byron Tau and Surveillance Valley by Yasha Levine. Can’t recommend these books enough for anyone who is skeptical of the above claim.
[−] caconym_ 30d ago
Honestly it should probably just be illegal for anyone, private or public, to engage in mass surveillance (or "data gathering", whatever) of anybody who didn't expressly consent to it. As long as the data exist, they will be abused.
[−] nandomrumber 30d ago
But I did expressly consent.

When I installed the SoundCloud app and it told me by continuing I agree to them sharing my data with their 954 partners.[1]

1. I'm not even joking. When I most recently installed the SoundCloud app - for the first time on a new device - that's what it said: 954 partners. How can anyone reasonably understand what they're agreeing to in that scenario?

[−] Perseids 30d ago
This is the important point. You need the right not to be discriminated against when you withhold your consent; otherwise your consent is effectively meaningless, as it is forced on you by your impossible bargaining position. This is one of the central pillars of the GDPR, without which it wouldn't work at all. Be advised to also make it illegal to ask customers for consent that doesn't directly benefit them, lest you risk creating another wave of malicious cookie banners.
[−] Joker_vD 30d ago

> You need the right not to be discriminated against when you withhold your consent; otherwise your consent is effectively meaningless, as it is forced on you by your impossible bargaining position.

Which is why "we don't serve patrons without shoes and pants" policy is unconstitutional, yeah.

If you don't want to agree to a business's demands, you're welcome to not deal with them and look for an alternative. All the alternatives have the same (or even worse) demands? Unless you can prove collusion, that's just how the invisible hand of the market worked its magic. Go petition your congressman to violate laissez-faire even more than it already is, I guess.

[−] inetknght 30d ago
Not only that, but it should be illegal (e.g. fines for the company and potential jail time for executives) to tie consent to the use or purchase of services or products.

Consent should be _voluntary_, not mandatory.

[−] brazzy 30d ago
You mean something like a... general regulation for data protection?
[−] intended 30d ago
A significant chunk of the infrastructure that farms data is now from private organizations, who sell that information because it is a source of revenue.

Government is the bogeyman we are afraid of, but ad tech is doing the actual heavy lifting.

[−] dzhiurgis 30d ago
Yeah nah I’d rather stop the criminals.
[−] eru 30d ago
100 miles around your border is a constitution free zone anyway.
[−] Terr_ 30d ago
Necessary, but not sufficient.

Even if we somehow, perhaps via magic genie-wish, made the government totally disinterested... these systems would still enable dystopian levels of private surveillance and manipulation.

[−] neya 30d ago
This should ironically start at the VC level - and that includes YC et al. Someone comes and says "hey, we've got this idea, we collect facial recognition data for training proprietary AI models", and the response from the VC should be "I'm gonna stop you right there. This is unethical."

Not "Did you say I can 5x my ROI? Here, shut up and take my money!"

[−] 0x10ca1h0st 30d ago
This is a great sentiment. Companies can be stopped, and then the hydra grows another head. Kill the business model, make the brokering of data illegal, and, if a company is caught, have fines paid directly to those affected. This would go a long way toward promoting privacy first.
[−] someothherguyy 30d ago

> we are quickly spiraling into the dystopia where privacy is gone

we are essentially already in that dystopia.

it is now more of a question of how bad it gets, and if the population will ever stand against it in any meaningful fashion.

[−] Tangurena2 30d ago
One simple remedy would be to make companies (that collect such private data) and their directors/executives jointly & severally liable[0] for any identity theft. It should come with "forever" liability equivalent to Superfund sites[1].

Notes:

0 - Financial penalties would not be limited to "your share" of the penalty. If you have money, and the other parties don't, the plaintiffs can collect from whichever defendant has money.

1 - Everyone who ever owned the site with the toxic waste is liable for the cleanup. This is why when a gas station is sold (in the US), all of the fuel tanks are dug up and replaced - this way, none of the future leakage can be attributed to the previous owners.

[−] 01100011 30d ago
There is a weird fetish with Flock right now. Privacy advocates have been screaming at the public for 25 years now and suddenly the public cares and is obsessed with this one very specific company.

Never mind that license plate readers have been collecting your data for decades. Never mind that you literally carry a tracking device on your person, likely 24/7.

I mean, cool, stop Flock, but don't stop there. Flock is very much not the final boss in this fight. The cynic in me says we will all get bored once Flock is off the radar though.

[−] razster 27d ago
When you have dedicated groups of people on IRC and newsgroups showing how easy it is to access these systems and teaching others, you know it's a bad thing. There are dozens of these groups, and most of the people gathering this information are not using it for good. It's a flawed system.
[−] King-Aaron 30d ago

> I don't want to stop Flock the company. I want to stop Flock the business model, along with all the other mass surveillance, and the data brokers.

Then you want to stop the company.

Which is reasonable.

[−] itomato 30d ago
Without their unique business model, what is the company? The product?

What is the addressable market for ubiquitous public surveillance devices? Who is the customer?

[−] NegativeK 30d ago
Make HIPAA include PII.

That hits your toxic waste goal real fast.

[−] boriskourt 30d ago
If we can set a legal precedent then this can cascade into policy, or an enforceable standard much faster.
[−] heyethan 30d ago
[dead]
[−] jimmar 30d ago
I followed the shooting at Brown University last year very closely. Brown's leadership was heavily criticized for having camera blind spots and not being able to track the shooter's exact movements through campus. I can understand why people with stewardship over the safety of their students/customers/constituents would make decisions to err on the side of tracking. I'm not saying I agree with it, but I understand it.
[−] jedberg 30d ago
We need a law that says if you hold any data about a person, they must be notified when anyone accesses it, including law enforcement.

I used to work in criminal investigations. I understand how this might make investigation of real crime more difficult. But so does the fact that you need a warrant to enter someone's home, and yet we manage to investigate crime anyway.

Your data should be an extension of your home, even if it's held by another company. It should require a warrant and notification. You could even make the notification be 24 hours after the fact. But it should be required.

[−] enaaem 30d ago
Btw, Flock employees have been caught watching kids' gymnastics classes and pools.

https://substack.com/home/post/p-193593234

Why does the VP of strategic relations need to watch the kids gymnastics class?

[−] himata4113 30d ago
This is just reiterating the same points DeFlock does, including mentioning DeFlock and images from DeFlock?

DeFlock: https://deflock.org/

Also: https://haveibeenflocked.com/

[−] khuston 30d ago
I’m all for mass surveillance of roadways, but I want to see results. Every day I see and hear people breaking laws with their vehicles in ways that make life worse for others around them.
[−] LurkandComment 30d ago
What might be more productive is to suggest legal ways law enforcement can prevent and address the problems this was designed to stop. That will help you see where the laws are limited and what needs to be updated, so you can balance the risk to privacy against the need to enforce laws (or the need for new laws altogether). Right now we're oscillating between gaps criminals exploit and infringing on fundamental privacy rights.
[−] amazingamazing 30d ago
I'm curious: if some consortium of private businesses with their own surveillance cams decided to aggregate their footage, could it be stopped?
[−] tbojanin 30d ago
As much as I dislike Flock, there's definitely a case to be made for the business model, where first-responder companies can generate huge amounts of taxpayer-funded ARR in "sanctuary cities" because criminals never actually serve their sentences. If criminals are stuck in the

"Commit crime -> brief retention period -> activist judge lets you go" loop, then you can definitely build a business model that just capitalizes on the fact that there are no consequences to committing a crime, and local/state governments have to waste copious amounts of money due to the incredibly inflated demand for first-responder services.

[−] teraflop 30d ago
Here's a modest proposal: what if we made it a serious crime for anyone to retain automatically-recorded surveillance footage, or data derived from it, for longer than some limited period of time (say, 7 days) unless said footage is released to the public within that timeframe?

That is, you can put up cameras wherever you want, but you can't gain any kind of competitive advantage by doing so.

I think the public would be more alert to the dangers of mass surveillance if the magnitude of that surveillance were more obvious. And if everyone were watching everyone, at least it wouldn't be as easily abused for purposes such as selective prosecution or blackmail.

[−] beloch 30d ago
For the Canadians sitting at home, tut-tutting at more American foolishness that could never happen up here... Flock started its expansion into Ontario this very month[1].

We should probably oppose this.

_________

[1]https://www.theguardian.com/technology/2026/apr/07/toronto-r...

[−] otterley 30d ago
Why do people consistently and falsely believe that they have privacy in public settings? You are literally out in public. If you don't want your behavior in public to be observed, then either don't behave in such a way that you wouldn't want observed, or stay home.

UPDATE: don't conflate stalking with observation. These are not the same. You can observe, but you cannot intimidate.

[−] 0ldblu3 30d ago
Lawmakers won't care until it is them and their kids being tracked by "hackers". Flock APIs are poorly protected, and the individual cameras are hackable, as shown by Benn Jordan in multiple YouTube videos.
[−] czhu12 30d ago
This movement reminds me of the "protecting democracy" message that was run on the national stage and paled against the backdrop of rising inflation.

Privacy is important just as democracy is important, but crime and lawlessness feel more immediate and will always take center stage.

Any message that tries to address the spread of Flock and Flock-like business models has to address what replaces it. If the only choice people are given is either having Flock or having car break-ins, then I think we can probably guess what people will choose.

SFPD, at least, has credited Flock and Flock-like tech for why property crime has dropped so much in recent years.

[−] burnt-resistor 30d ago
It's taxpayer-funded, dragnet, warrantless surveillance, data harvesting, and stalking. And it is local elected officials' deliberate, voluntary (or bribed) choice to invite Big Mommy into the lives of their constituents and of people passing through. The anti-privacy doomers don't get to force their invasive anti-values on everyone else who wants respect for, and to preserve, their right not to be autonomously surveilled and monitored every single second when not on a business's or organization's premises.
[−] stopachka 30d ago
I am surprised how overwhelmingly negative the comments are here. I would have expected at least a few voices defending Flock.

I'll step in and add a voice. Ultimately, Flock is solving a real problem with crime. This is why police departments want them.

Stopping Flock doesn't address the need that got police departments to use them. If you want to "stop Flock", you need to address that need better.

[−] SonOfKyuss 30d ago
I could be convinced to support public cameras if access to the footage was tightly controlled and only used for solving serious crimes, but government officials and flock themselves have repeatedly shown that they can’t be trusted to use them in a responsible manner. It’s too powerful of a tool to put in the hands of untrustworthy individuals
[−] diogenes_atx 30d ago
To the list of references provided by this post in the section "Further Reading," I would add the following book:

Sarah Brayne (2020) Predict and Surveil: Data, Discretion, and the Future of Policing, Oxford University Press

https://www.amazon.com/Predict-Surveil-Discretion-Future-Pol...

An academic study of the use of surveillance technology at the Los Angeles Police Department, the book documents the LAPD's use of data-brokerage firms (e.g., Palantir) that collect and aggregate information from public records and private sources, as well as automatic license plate readers like Flock, and Suspicious Activity Reports generated by police and civilians, which include reports of mundane activities such as using binoculars, drawing diagrams, or taking pictures or "video footage with no apparent aesthetic value." All this data ultimately gets parked in Fusion Center facilities, built in the aftermath of 9/11, where federal, state, and local law enforcement agencies collaborate to collect, aggregate, analyze, and share information.

As the author observes, "The use of data in law enforcement is not new. For almost a century, police have been gathering data, e.g., records of citations, collisions, warrants, incarcerations, sex offender and gang registries, etc. What is new and important about the current age of big data is the role in public policing of private capitalist firms who provide database systems with huge volumes of information about people, not just those in the criminal justice system."

[−] jimnotgym 30d ago
Every time I see Flock mentioned, I think it will be about the Linux file-locking tool flock(1), and I wonder why people want to stop it
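(For anyone who hasn't met the other flock: here's a minimal sketch of flock(1) from util-linux; the lock-file path and command are just examples.)

```shell
# flock(1) takes an advisory lock on a file, serializing whatever runs
# under it. Classic use: keep overlapping cron runs of the same job
# from stepping on each other.
# -n = fail immediately instead of blocking if the lock is already held.
flock -n /tmp/myjob.lock -c 'echo "lock acquired, doing work"'
```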
[−] fffernan 30d ago
I live in a neighborhood that privately installed cameras. The city also installed cameras. Before the cameras, cars used to come and raid houses and workers' trucks; it was quite common. The neighborhood keeps statistics and tracks how many stolen cars/plates drove through. All crime statistics dropped, and police show up and arrest people. Saying it's an illusion of safety is bullshit. It's all fake until you're a victim of a crime. We need our own way to fight back against gangs, etc. I'd rather have more cameras and less police. Also, let's get some drones as well; they work fantastically in SF. The city owns the data; it's not getting sold, and it gets erased unless a crime is reported.
[−] reboot81 30d ago
So with 100k cameras crime is down how much?
[−] nullc 30d ago
I am somewhat skeptical that either the ACLU or the EFF is an effective organization for this cause. The ACLU in particular has drifted significantly from a civil-liberties focus, and the EFF's track record on corporate-run surveillance has never been the best; of late they seem to be following the ACLU away from civil liberties.
[−] freakynit 30d ago
We are heading towards the exact future shown in the show "Person of Interest".
[−] eemax 30d ago

> The Illusion of Security

> Flock advertises a drop in crime, but the true cost is a culture of mistrust and preemptive suspicion. As the EFF warns, communities are being sold a false promise of safety - at the expense of civil rights (EFF).

...

> True safety comes from healthy, empowered communities; not automated suspicion. Community-led safety initiatives have demonstrated significant results: North Lawndale saw a 58% decrease in gun violence after READI Chicago began implementing their program there. In cities nationwide, the presence of local nonprofits has been statistically linked to reductions in homicide, violent crime, and property crime (Brennan Center, The DePaulia, American Sociological Association).

These are incredibly weak arguments. I haven't personally looked into how good Flock cameras are at actually preventing crime and catching criminals, but if this is the best counterargument their detractors can come up with, it makes me suspect they're actually pretty good.

Crime is extremely bad. Mass surveillance is bad too, especially if abused, but being glib or dismissive about the real trade-offs is counterproductive.

Also, recording in public spaces (or private spaces that you own) is an important and fundamental right just like the right to privacy; simply banning this kind of surveillance would also infringe on civil liberties in a different way. I agree that laws and norms need adjusting in light of new technology, but that discussion needs more nuance than this.

[−] ccorcos 27d ago
It's worth checking out a recent interview with the Flock CEO on Cheeky Pint (Stripe's podcast with John Collison) [1]

As much as I fear the Orwellian future this could turn into, I think it's important to recognize that the burden of preventing that falls on lawmakers, not companies.

He explains in the interview that he would love for laws to be passed creating tighter controls over how these systems are used. The problem is that police departments don't want that! Perhaps companies can elect to restrain themselves, but then they expose themselves to competitors with "more powerful features".

Something like 70% of homicides go unsolved! I think it's cool that Flock is trying to help police departments fight crime. But it's time to yell at your lawmakers. Don't hate the player, hate the game!

[1] https://www.youtube.com/watch?v=UYm2nXFDaB0

[−] solaarphunk 30d ago
Counterpoint: vehicle related crime is way down in SF. Which is great.
[−] hariseldom 30d ago
A missed opportunity, going with StopFlock instead of FlockOff dot com
[−] josefritzishere 30d ago
Flock should be illegal. Full stop. These people should be in prison.
[−] self_awareness 30d ago
I understand the privacy argument. There are a few questions though:

1) Suppose there's another shooting. Don't you want to know exactly what happened before you go to the protest? Suppose your child is hurt. Wouldn't you do anything to catch the culprit? How exactly would you feel if the police told you they couldn't get the video of the culprit's face, because watching it would violate someone's privacy?

2) Everyone has a camera in their pocket, and someone is filming all the time. Police can seize that video. Isn't that a privacy risk? Should we ban cameras in smartphones?

3) Should we even be private in public? Doesn't privacy in public spaces encourage crime? I will fight to the death to keep privacy in my home, but in public? I would personally rather be safe than private in public.

4) What about private cameras near homes filming 24/7? Are those privacy risks too?

5) People in power will always be corrupt, have bad intentions, and use public goods for personal gain. Should we disregard the broader benefits because there will be isolated cases where those benefits are exploited?

Happy downvoting.

[−] VladVladikoff 30d ago
Boy, would it just be terrible if someone hacked into the Flock network and manipulated all the camera results ever so slightly. A letter here, a number there; license plates and matches never quite lining up. It would take years for them to find the source of the "bugs". Not saying I know anyone doing this or anything, just saying it would be oh soooooo terrible.
[−] 5255652 29d ago
I misread the title
[−] chris_wot 30d ago
Jeremy Bentham's Panopticon (by way of Michel Foucault) is alive and well, I see.
[−] mike_d 30d ago
The "Take Action" section is missing the most obvious solution. Everyone just goes and takes down a camera. We as a society do not consent to this use of public space and simply have a national "Take out the trash day."

There is no way Flock could practically ramp up production or manpower to replace the entire fleet fast enough to keep meeting the contractual requirements that keep customer money flowing in.

[−] Lio 30d ago
The best way to block Flock is probably to find a way to track billionaire celebs like Taylor Swift and Elon Musk.

If they were able to successfully change the laws so they could fly anonymously, they can probably change the laws on cameras in public places, at least for themselves.

Privacy for the special people.

https://gizmodo.com/taylor-swift-and-elon-can-finally-fly-pr...

[−] ianpenney 30d ago
Obligatory reference to PIPEDA and GDPR.

Edit: this is not a low-effort comment. This is something you should all read and demand for yourselves. I agonized over how not to call your regime moronic. It _is_ moronic that you don't have these basic protections, and we keep having to listen to you all whine about it.

[−] mark124mj 30d ago
[dead]
[−] arcanemachiner 30d ago
[flagged]