Hey there, Joseph here. So Flock has AI-enabled cameras all across the U.S., with cops regularly tapping into them for all manner of crimes (or sometimes helping ICE). Now we've found that Flock hires overseas workers to classify footage for its AI systems. Think: people manually reviewing footage to identify cars or people. It raises all sorts of questions about who has access to Flock footage and when.
This article was produced with support from WIRED. Flock, the automatic license plate reader (ALPR) and AI-powered camera company, uses overseas workers from Upwork to train its machine learning algorithms, with training material telling workers how to review and categorize footage, including images of people and vehicles in the U.S., according to material reviewed by 404 Media that was accidentally exposed by the company.

The findings raise questions about who exactly has access to footage collected by Flock surveillance cameras and where the people reviewing that footage may be based. Flock has become a pervasive technology in the U.S., with its cameras present in thousands of communities; cops use them every day to investigate crimes like carjackings. Local police have also performed numerous lookups in the system on behalf of ICE.
Companies that use AI or machine learning regularly turn to overseas workers to train their algorithms, often because the labor is cheaper than hiring domestically. But Flock's business is building a surveillance system that constantly monitors U.S. residents' movements, which means its footage may be more sensitive than a typical AI training job.

💡 Do you work at Flock or know more about the company? I would love to hear from you. Using a non-work device, you can message me securely on Signal at joseph.404 or send me an email at joseph@404media.co.

Flock's cameras continuously scan the license plate, color, brand, and model of every vehicle that drives by. Law enforcement agencies can then search cameras nationwide to see where else a vehicle has driven. Authorities typically dig through this data without a warrant, which recently led the American Civil Liberties Union (ACLU) and the Electronic Frontier Foundation (EFF) to sue a city blanketed in nearly 500 Flock cameras.

Broadly, Flock uses AI or machine learning to automatically detect license plates, vehicles, and people, including what clothes they are wearing, from camera footage. A Flock patent also mentions cameras detecting "race."

Screenshots from the exposed material. Redactions by 404 Media.

Multiple tipsters pointed 404 Media to an exposed online panel that showed various metrics associated with Flock's AI training.