
Secret Live Facial-Recognition Surveillance Network Exposed: What It Means for Privacy, Civil Rights, and Potential Legal Claims

  • Writer: Justice Watchdog
  • Nov 22
  • 3 min read
[Photo: People walk along a busy New Orleans street lined with colorful historic buildings, including The Old Absinthe House Bar, with U.S. flags hanging nearby.]

A Covert Facial-Recognition Surveillance System Comes to Light


A Washington Post investigation uncovered that the New Orleans Police Department (NOPD) had access to a citywide network of real-time facial-recognition cameras, despite a 2022 ordinance that sharply limited the use of such technology. Source: Washington Post

The cameras — over 200 of them — were installed and managed by a private nonprofit, Project NOLA, and used software capable of scanning thousands of faces in public spaces every day. When the system detected a potential match against a “watch list,” officers received immediate alerts on their phones.

The public was never informed that the system was operating in real time, nor that it was being used far beyond the boundaries of city policy.

Civil-rights groups, including the ACLU and the Center for Democracy & Technology (CDT), have sharply criticized the facial-recognition surveillance program, calling it a dangerous expansion of mass surveillance. Sources: ACLU, CDT


How the Secret Facial-Recognition Surveillance Program Worked


Live Alerts to Police Officers

  • Cameras scanned and analyzed faces continuously

  • Software compared faces against a police-curated database

  • Possible “matches” triggered direct alerts to officers

  • No human analyst reviewed these results

  • Many alerts involved non-violent offenses, violating local policy
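
In software terms, the reported setup amounts to a continuous match-and-alert loop: extract a face signature from each camera frame, compare it to every entry on the watch list, and push a notification the moment a score clears a threshold. The sketch below is illustrative only; the function names, the 128-dimension embeddings, and the 0.75 similarity cutoff are assumptions, not details from the investigation. What it makes concrete is the design critics objected to: nothing sits between the algorithm's score and the officer's phone.

```python
import numpy as np

SIMILARITY_THRESHOLD = 0.75  # hypothetical cutoff; the real system's value was not reported


def extract_embedding(face_image: str) -> np.ndarray:
    """Placeholder for a face-embedding model (an assumption, not the actual software used)."""
    rng = np.random.default_rng(abs(hash(face_image)) % (2**32))
    vec = rng.standard_normal(128)
    return vec / np.linalg.norm(vec)


def send_alert(officer_phone: str, person_id: str, score: float) -> None:
    """Stand-in for the push notification officers reportedly received."""
    print(f"ALERT to {officer_phone}: possible match {person_id} (score {score:.2f})")


# Hypothetical watch list: person ID -> stored face embedding
watch_list = {pid: extract_embedding(pid) for pid in ["person_A", "person_B"]}


def process_detected_face(face_image: str, officer_phone: str) -> None:
    """Compare one detected face against the watch list and alert on any hit.

    Note what is missing: no human analyst confirms the match before the
    alert goes out, which is the gap civil-rights groups highlighted.
    """
    probe = extract_embedding(face_image)
    for person_id, stored in watch_list.items():
        score = float(probe @ stored)  # cosine similarity (embeddings are unit-normalized)
        if score >= SIMILARITY_THRESHOLD:
            send_alert(officer_phone, person_id, score)


# Example: faces captured in public stream in continuously.
# The last frame re-uses a watch-list ID so the demo triggers one alert.
for frame in ["face_001", "face_002", "person_A"]:
    process_detected_face(frame, officer_phone="555-0100")
```

Under the 2022 ordinance, a trained, certified analyst's review would sit between the score and the alert; according to the investigation, it did not.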


The Law vs. What Actually Happened

The 2022 City Council rules required:

  • Use of facial recognition only for violent crime suspects

  • Review by trained, certified analysts

  • Quarterly reports documenting every use

But according to investigators, none of these requirements were followed.


Why This Technology Is So Dangerous


1. High Error Rates — Especially for Vulnerable Groups

Independent studies consistently show that facial-recognition algorithms misidentify certain people at higher rates, including:

  • Black and brown individuals

  • Women

  • Younger and older people

  • People captured in poor lighting or at unusual camera angles

This increases the risk of wrongful arrests, traumatic police encounters, or surveillance targeting innocent people.
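
The scale of the scanning compounds those error rates. As a purely hypothetical illustration (the numbers below are assumptions chosen for the arithmetic, not figures from the investigation), even a modest false-positive rate applied to thousands of daily scans produces dozens of mistaken alerts, so most alerts can point at the wrong person even when a few true matches exist.

```python
# Hypothetical arithmetic only; these are not reported figures.
daily_scans = 5_000            # faces scanned per day across the camera network
false_positive_rate = 0.01     # assume 1% of innocent passers-by are wrongly flagged
true_watchlist_passersby = 2   # assume 2 genuinely watch-listed people pass the cameras

expected_false_alerts = daily_scans * false_positive_rate  # 50 mistaken alerts per day
wrong_alert_share = expected_false_alerts / (expected_false_alerts + true_watchlist_passersby)

print(f"Expected false alerts per day: {expected_false_alerts:.0f}")
print(f"Share of alerts naming the wrong person: {wrong_alert_share:.0%}")  # about 96%
```

And when the error rates are higher for Black and brown residents, women, and younger and older people, those mistaken alerts do not fall evenly across the community.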


2. “Dragnet Surveillance” of Entire Communities

Real-time scanning transforms ordinary public spaces into biometric checkpoints. The CDT warns this normalizes continuous tracking, erodes anonymity, and chills free movement and expression.



3. Little to No Oversight

Because the program operated through a private nonprofit, there were:

  • No formal contracts

  • No public hearings

  • No mandatory audits

  • No public disclosures

  • No review processes

This creates enormous risk of abuse or expansion without democratic oversight.


Potential Legal Exposure for Police Departments and Cities


Victims of misidentification or wrongful arrest resulting from facial recognition may have legal claims, such as:


Civil-Rights Violations

  • Unlawful seizure under the Fourth Amendment

  • Equal protection violations if the technology disproportionately misidentifies certain groups

  • Due process violations from reliance on unreliable or unverified technology


Municipal Liability (Monell Claims)

Cities may be liable when wrongful actions result from:

  • A pattern or practice

  • Failure to supervise

  • Failure to train

  • Use of unreliable technology as a primary investigative tool


Negligent Investigation

If officers relied solely on algorithmic matches without proper verification, victims may have a negligence claim for the resulting harm.


Emotional and Physical Damages

Misidentification may cause:

  • Arrest

  • Detention

  • Physical injury

  • Emotional trauma

  • Reputational harm

  • Loss of employment or income

These injuries may support personal-injury claims.

[Photo: Neon signs from bars illuminate a quiet city street at dusk.]

If You Were Misidentified, Wrongfully Arrested, or Harmed by Facial Recognition — You Have Rights

Real-time facial recognition is powerful, but it is not infallible. And when police rely on bad data, innocent people get hurt.

If you or someone you know experienced:

✔ False accusation

✔ Wrongful arrest

✔ Use of force

✔ Illegal detention

✔ Surveillance without cause

✔ Emotional trauma from a police encounter

✔ A misidentification by facial-recognition or security-camera systems used by airports, stores, or police

you may be entitled to compensation and legal protection.


Key Issues Identified

  • New Orleans police used real-time facial-recognition alerts in violation of city oversight rules.

  • A private nonprofit operated the cameras, circumventing government transparency.

  • The system was used for non-violent offenses, contrary to policy.

  • No certified analyst reviewed algorithm matches, raising reliability concerns.


Legal Theories for Potential Claims

  • Fourth Amendment: Unreasonable seizure from false matches

  • Fourteenth Amendment: Equal protection concerns due to algorithmic bias

  • Section 1983: Claims for constitutional violations by police

  • Monell Liability: For systemic misuse of faulty technology

  • Negligence: Failure to verify algorithm results

  • Emotional/Physical Injury: Supporting personal-injury damages


Bottom Line

Secret real-time facial recognition represents a dangerous expansion of government power with minimal oversight and high potential for error. When misidentification harms someone, strong legal remedies may be available.
