Editor’s Note: This story was produced in partnership with the Pulitzer Center’s AI Accountability Network.
Over the past several years, the Texas Department of Public Safety (DPS) has quietly built out an expansive surveillance apparatus—one that’s increasingly powered by artificial intelligence. Many of these technology acquisitions have been made under the auspices of Governor Greg Abbott’s Operation Lone Star, an $11 billion program that has supercharged the state’s decades-long border militarization.
The powerful and well-funded state police agency has not only expanded its existing surveillance capabilities, which include a fleet of spy planes, unmanned drones, and a network of wildlife game cameras deployed across the Texas borderlands, but is also increasingly using AI-powered software for intelligence gathering.
DPS records obtained and reviewed by the Texas Observer in recent months shed new light on the scope of the state police’s surveillance toolbox. The agency has spent millions acquiring an array of powerful—and controversial—artificial intelligence software tools that can mine billions of images to provide facial recognition, track vehicle locations from automatic license plate readers, monitor phone conversations of inmates in Texas prisons and jails, break into seized cell phones and computers to search for digital evidence, and even track cell phones without a warrant.
This all comes as Texas lawmakers are considering how to regulate the use of artificial intelligence (AI) in the private sector and by state government. It’s not clear, though, whether any currently proposed legislation would restrict DPS’ use of AI tools for policing or provide meaningful transparency or oversight.
Lawmakers took a first step last legislative session by establishing an AI advisory council to review how Texas state agencies are deploying artificial intelligence. Under the law, agencies were required to produce an inventory of their automated decision systems and an overview of how each tool is used. DPS’ inventory, which the Observer obtained through an open records request, provides a rare glimpse of its full AI arsenal.
Various AI-powered software programs were purchased under the governor’s border disaster declaration or in response to Abbott’s executive orders to prevent mass attacks, agency records show. Already this year, the state police force has shelled out several million dollars more on contracts that extend its access to these tools for several more years.
The Legislature has supported this expansion with ample new funding for DPS. Last session, the biennial budget gave DPS $22.2 million to acquire “Advanced Analytics & Threat Detection Software”; over $6 million for Intelligence and Counterterrorism Division (ICT) “technology projects”; and $17 million to expand its Operation Drawbridge network of surveillance cameras along the border.
This session, DPS may receive around $10 million to fund “contract services to improve investigative capabilities” and acquire “four cellular tracking vehicles,” plus $10 million more to continue expanding the Drawbridge network.
As the Observer first reported last year, one of DPS’ key tools is an AI-powered intelligence software called Tangles, which scrapes information from social media platforms and the open, deep, and dark web and includes an add-on that gives police the ability to conduct warrantless cell phone location tracking using commercial data. DPS first acquired Tangles in 2021 through an emergency purchase order issued under the governor’s border security declaration; in total, the agency spent more than $900,000 on Tangles licenses over three years.
Last month, the agency signed a $5.3 million contract to use the Tangles software for the next five years. Per that new contract, the software is needed to “identify and disrupt potential domestic terrorism and other mass casualty threats,” in response to the governor’s executive orders issued after the 2019 mass shootings in El Paso and Odessa.
Tangles gives the DPS ICT division the “ability to identify potential threats and … to create leads to forward to law enforcement partners for further investigation and actions,” an agency contract acquisition document states.

It is unclear how exactly DPS has used the software, including its cell phone tracking powers; in testimony to the AI advisory council last year, DPS officials said the platform is used for “lead generation on active law enforcement investigations” but did not provide any specific examples.
DPS did not respond to the Observer’s questions regarding the agency’s use of Tangles and other AI investigative tools.
Civil liberties advocates warn that DPS’ use of Tangles and other tools without a warrant may violate constitutional privacy rights under the Fourth Amendment.
When used in concert, DPS’ current tech capacity for open-source web intelligence gathering, license plate reading, facial recognition, and phone location tracking gives the agency the ability to look up a person or a car, figure out who they are, what they’re doing, where they’ve been, and who they associate with—all without a warrant, said Savannah Kumar, a staff attorney at the ACLU of Texas.
“Many of these technologies are eroding that reasonable expectation of privacy that people have, and [are] kind of leading to this police surveillance state, where the government ends up having … information about people without having to get a warrant,” Kumar told the Observer.
In Texas, and around the country, advocacy organizations across the ideological spectrum have sounded the alarm about how the creeping expansion of police surveillance, coupled with powerful artificial intelligence systems, poses a risk to privacy and civil liberties.
“We’ve kind of forgotten about how big a behemoth the government has become as a surveiller of its populace,” said David Dunmoyer, a tech policy director for the Texas Public Policy Foundation (TPPF), the state’s influential right-wing think tank. “We’re not by any stretch of the imagination anti-technology, but we’re seeing more and more evidence that technology is not serving the benefits of humanity and, long term, national security and trust in the agency authorities overseeing that.”
One of DPS’ other key artificial intelligence tools is biometric facial recognition software from Clearview AI, a company that works closely with law enforcement. The company’s facial recognition system operates by drawing on “over 40 billion publicly available images,” per the DPS inventory report, that police can search against their own images.
Clearview “enables law enforcement investigators to more quickly and effectively generate investigative leads that can potentially lead to the arrest of criminals,” the DPS report says, and its use “is strictly limited to open investigations.”
In response to questions from the Observer, Clearview provided an emailed statement from its general counsel Jack Mulcaire, stating: “Clearview AI is used by law enforcement for after-the-fact investigations. It is an investigative tool, not a surveillance tool. Clearview AI only collects completely public online images. Any images stored with Clearview AI by clients are not used for training algorithms.”
Contract records show that DPS acquired Clearview in 2019 and issued purchase orders for software licenses from 2021 through 2024 under the governor’s Operation Lone Star disaster declaration. In January 2025, DPS awarded Clearview a $1.2 million contract to extend its use through 2030.
Facial recognition software has long been controversial among both privacy advocates and some public officials, including because of the way companies like Clearview mine personal data—without consent—to fuel their platforms.
Texas also has its own laws restricting biometric data mining without consent. Google, which is being sued by the Texas Attorney General for alleged violations of that law, has used the state’s contracts with Clearview—which also uses biometric data—to argue that Texas is selectively enforcing the law. Last year, Google subpoenaed Clearview for records about its contracts with Texas. A recent report from 404 Media also revealed that Clearview attempted to purchase hundreds of millions of nationwide arrest records, including mugshots and sensitive information like Social Security numbers and email addresses.
Critics also contend that facial recognition can reinforce existing racial inequities because it can be less accurate when identifying people with darker skin. Clearview has said that its algorithm is more than 99 percent accurate in identifying people of all demographics. Data analysis conducted by the U.S. National Institute of Standards and Technology shows that Clearview AI’s algorithm has a higher false match rate for white women, Black men, and Black women than it has for white men, though for all those tested demographics, the false match rate was still below 1 percent. The company’s recently departed CEO has long disputed the concept of facial recognition bias, which he called a “myth.”
Clearview AI’s general counsel told the Observer that the false match rate was “below 0.00001” for all demographics tested. “Clearview AI’s software uses its facial recognition algorithm to generate law enforcement leads, which are independently verified by a human-in-the-loop process. It is not used in court, but only to help law enforcement with their investigations which include saving current and future child trafficking victims,” Mulcaire wrote in a statement.
In addition to biometric facial recognition, DPS uses Cellebrite, a mobile forensic data extraction tool that can bypass phone passwords and encryption. According to DPS’ AI inventory report, Cellebrite “uses AI to analyze images obtained by digital forensic analysis of cellular phones or other electronic devices.” DPS has used the tech for years and last year issued a $2.7 million contract to extend its license through 2027.
DPS’ AI arsenal taps into a vast surveillance network of automatic license plate readers as well. The agency pays for access to Motorola Solutions’ “LEARN” database, which allows DPS officers to “search through captured license plate information and [utilize] limited facial recognition capabilities to generate investigative leads for active law enforcement investigations,” according to the agency’s survey. DPS signed a contract in 2023 for $1.5 million to use the Motorola license plate database for five years, records show.
DPS also has an agreement with the surveillance company Flock Safety, allowing state police to search through data captured by its growing network of license plate reader cameras across Texas, per its inventory report. The company provides plate readers for several major cities; the Houston Police Department, for instance, has more than 3,800 Flock cameras throughout the city. The company’s technology can track vehicles by their license plates and also create a “fingerprint” of a car by its color, make, model, and other details, according to the Electronic Frontier Foundation, a watchdog group focused on civil liberties related to privacy and technology.
Texas state police don’t just have eyes in the skies, along major thoroughfares and highways, and on the internet. Through Operation Drawbridge, Texas has also funded the installation of thousands of wildlife cameras on private ranches in South Texas, and ever since, DPS has had a constant live feed across the Texas borderlands. More than 9,000 wildlife cameras generate roughly 250,000 images daily, according to an agency PowerPoint obtained via a public records request.
Since the program began in 2009, Drawbridge cameras have detected over 2.1 million people, and assisted with the apprehension of over 1.1 million people and with the seizure of over 640,000 pounds of marijuana, per the PowerPoint.
For many years, Operation Drawbridge required human analysts to monitor the massive network of cameras. Now, machines do that. Since at least 2023, DPS has opted for automated image analysis, according to contracting records. That year, the agency signed a contract worth up to $6 million with Deloitte Consulting LLP to create and train an AI system to ingest, process, and categorize up to 175,000 images per day from DPS’ Drawbridge cameras. DPS says the algorithm correctly identifies objects 95 percent of the time. Deloitte’s Drawbridge AI system classifies objects including people, weapons, wildlife, and suspected drugs, and notifies DPS when it identifies images on a “watch list,” according to contracting documents.
DPS may have other plans for Deloitte’s Drawbridge system: “The algorithm could be used to find likely recurrences of potential suspects, also for predictive analysis,” the agency’s automated systems inventory reads.
DPS also has a $4.8 million contract with Deloitte Consulting to maintain Spart-N, the agency’s own analytical platform that taps into its various intel tools, records show. Through the Texas Department of Criminal Justice, DPS also has access to intel from Verus, a software that uses AI to monitor and transcribe inmate phone calls. And the agency has expanded its social media monitoring capabilities: In September 2024, it issued a contract for up to $7.3 million for licenses to ShadowDragon LLC’s social media investigative platform SocialNet through 2029, and it has committed more than $8.5 million for licenses to use Dataminr’s FirstAlert system, which can monitor, capture, and analyze data, audio, and images from social media platforms, including X (formerly Twitter). Between Tangles, SocialNet, and FirstAlert, DPS’ contracts for social media surveillance tools could cost more than $20 million over the next five years.
As DPS expands the scope of its AI policing capabilities, lawmakers have put forth legislation aimed at regulating the use of the rapidly advancing technology in Texas.
At the forefront is state Representative Giovanni Capriglione, a Southlake Republican, who was formerly co-chair of the Texas AI Advisory Council. He made waves early on in this session by filing House Bill 1709, dubbed the Texas Responsible AI Governance Act (TRAIGA), which would create sweeping new regulations to, among other things, protect against potential discrimination or privacy violations involving AI deployed by private companies and, to a lesser extent, state government. State Senator Tan Parker, a Flower Mound Republican and the other former AI Council co-chair, has filed Senate Bill 1964, which is more squarely directed at regulating the state government’s use of AI.
Capriglione’s HB 1709 includes a provision that experts said could outlaw the use of facial recognition software like Clearview AI by private companies or government entities—including police—though Capriglione disputed that assessment in an interview, telling the Observer that was not the intent of his legislation. More than a dozen states have various laws about how and when police can use facial recognition—but none prohibit it altogether.

HB 1709, however, is now likely obsolete. On March 14, the bill filing deadline, Capriglione quietly filed an overhauled version of TRAIGA as House Bill 149, which removed many of the regulatory provisions aimed at the private sector and gutted many of the proposed oversight powers. The new bill retains the creation of a permanent AI council focused on state agencies’ AI use, but it would limit the council’s power to merely providing evaluations and explicitly states that it cannot “interfere with or override state agency operations.”
Capriglione also rewrote the section that could affect facial recognition technology.
Like the original TRAIGA bill, HB 149 states that government entities may not deploy AI systems that capture biometric data “for the purpose of uniquely identifying a specific individual,” but it adds a carveout limiting the restriction to uses that would “infringe, constrain, or otherwise chill any right guaranteed by the United States Constitution, the Texas Constitution, federal law, or Texas law.”
That leaves the door open for both tech companies and state agencies like DPS to argue over when facial recognition counts as an infringement of someone’s rights, experts said. “There is no explicit ban in this bill,” said Paromita Shah, an attorney and co-founder of Just Futures Law, a legal advocacy group focused on the intersection of immigration and tech issues.
Shah said the new TRAIGA bill “removed all the meaningful guardrails” to protect average Texans against AI and seems to have been heavily influenced by tech industry lobbyists. Neither version of the bill, Shah noted, provides individuals a specific right to sue tech companies over harms caused by AI.
HB 149 was heard before the House Delivery of Government Efficiency Committee Wednesday and was left pending.
Several AI companies, including those that sell tech to DPS, have registered lobbyists in Texas this session, according to state records, including Clearview AI, Flock Safety, and LEO Technologies, which sells the Verus surveillance software. One company also has connections to state law enforcement in Texas: Skylor Hearn, a former DPS deputy director, was a registered lobbyist for Clearview AI in 2020 and 2021 and joined the company as its government affairs director in 2022. During his tenure at the firm, he testified in other states against banning or limiting police use of facial recognition tech. This session, Clearview AI has three registered lobbyists in Texas.
Capriglione has previously expressed concerns about law enforcement employing facial recognition and how the use of software like Clearview AI might violate Texans’ civil liberties.
“There are, in your words, billions of images that are looked at and these obviously are not people that have been found guilty,” Capriglione said, directly addressing DPS’ top data officials at an AI advisory council meeting in 2024 regarding Clearview AI. “Almost assuredly, most of them have not committed any crimes.”
The Republican lawmaker cautioned that, while he would not necessarily call the agency’s capabilities a “dragnet,” he had concerns about protecting Texans’ privacy: “It does come into question whether we are creating a wide area of study of people who have not committed a crime and trying to use that for law enforcement purposes.”
In her testimony before the council, DPS Chief Information Officer Jessica Ballew said:
“It’s not something we use to proactively go out and just see if the technology thinks there’s a bad actor out there or there’s a picture of somebody doing something that … they shouldn’t be doing. That’s not how it’s used.”
It is unclear whether DPS has official internal policies for how officers should—and should not—use facial recognition software like Clearview AI.
“I don’t believe the department has one policy in place for the use of Clearview but follow our normal guidelines for using databases, tools, and resources for law enforcement purposes only,” agency employee Lexi Quinney wrote in an email to DPS chief data officer Eric Baker last May, which was obtained in an open records request. (DPS did not answer an Observer question about this email.)
DPS has withheld a memo relating to its use of Clearview AI in response to an open records request, citing attorney-client privilege. The Observer also requested audits and reports relating to how DPS uses Clearview AI and Tangles, and the agency said there were no responsive records.
In their testimony to the AI advisory council, Ballew and Baker said that some of the agency’s AI-powered investigative software had built-in audit logs or search histories.
“At a very basic level, for any technology—no matter what it is—there should be a log that says who used it, when they used it, what their search parameters were, and what the reason was for doing it, and somebody could be able to come back and verify the person wasn’t using it to stalk their ex-wife,” said Dave Maass, director of investigations at the Electronic Frontier Foundation.
Meanwhile, Senator Parker’s bill, SB 1964, would require Texas agencies to more thoroughly report on how they use AI and what risks of “unlawful harm” these systems have. Under the bill, state agencies would be required to create impact assessments of any AI-powered tools they deploy—though the reports would be considered confidential and exempt from the Texas Public Information Act.
Progressive and right-wing advocacy groups have found common ground lobbying for stronger civil liberty protections amid the rapid growth of AI tech—including in police surveillance—with both the ACLU of Texas and TPPF pushing legislators to sponsor anti-surveillance bills.
“Texas’ founding documents speak to liberty as its highest ideal, and I think that that is reflected on the left and right and everywhere in between,” said Nick Hudson, a policy strategist at the ACLU of Texas. “People want to make sure that the government isn’t just surveilling people who aren’t doing anything wrong just because they can. … I think that there is a lot of work to be done, and we are hopeful that we’ll see some incremental progress this session, at the very least.”

Across the political spectrum, civil liberties-minded policy wonks and lawmakers have expressed concerns about warrantless police surveillance, whether via license plate readers, facial recognition, or social media monitoring that enables cops to track phone locations. While Republicans have enthusiastically supported the multi-billion-dollar Operation Lone Star, which has helped facilitate the expansion of DPS’ high-tech intelligence network, some GOP lawmakers have also raised concerns about potential constitutional abuses as a result of mass surveillance.
“Our founding fathers did not actually want a police state,” Capriglione said at a TPPF event in November. “They did not want cameras on every single road or on every single post. They didn’t want to have dragnets … which is what’s happening in Texas today, where there’s license plate readers everywhere … The most dangerous part of all of this is the ability for government to have their hands on this and use it against us.”
After that event, Capriglione affirmed to the Observer that his critique extends to warrantless phone tracking: “I don’t think anybody, not even companies, let alone the government, should be able to track you without a warrant for no reason whatsoever.”
In August, after the Observer published a story about DPS’ use of Tangles and the tool’s capability to engage in warrantless cellphone tracking, two other Republican state lawmakers indicated concern in response.
State Representative Brian Harrison, an outspoken right-winger from Midlothian, publicly pledged to investigate: “Texas must lead in the defense of individual liberty, and we must never become a police state. Government actions with significant privacy implications need #Txlege involvement,” Harrison wrote on X. “I will be reaching out to [DPS] for more information.”
Harrison’s office did not respond to the Observer’s inquiries about what information he had received from DPS, though the North Texas legislator filed a bill this session that would bar police from using automatic license plate readers without a warrant or court order, along with other restrictions. “Texas must never become a police state, and I have real concern about an unholy marriage between big government and big tech,” Harrison said in a written statement to the Observer. “I appreciate local governments working to keep citizens safe. However, if they want to use these tools to collect data on innocent citizens, they should get a warrant.”
Shah, the attorney from Just Futures Law, said the dangers of surveillance technologies are easily overlooked because they are not viewed as inherently or imminently violent.
“It’s just that it’s creating the infrastructure in which you can be harmed,” Shah said. Plus, she added, many surveillance tools were originally designed for warfare, or by former military intelligence personnel, and should be viewed through that lens and not as the “soft side” of policing, which is how some AI companies market the tools.
“These are wartime technologies that are now in the hands of local cops,” she said. “We should be really worried.”
We’re reporting on AI-powered surveillance in the U.S.-Mexico borderlands with support from the Pulitzer Center’s AI Accountability Network. We’re seeking tips about how police and prosecutors use automated surveillance tech tools. In particular, we’d love to hear from defense attorneys, public defenders, immigration lawyers, and police officers.
You can contact reporter Francesca D’Annunzio on Signal at +1-512-270-8604 or via email at francesca.dannunzio@proton.me or dannunzio@texasobserver.org. The Texas Observer’s mailing address is: P.O. Box #11554 Austin, TX 78711.
The post Texas’ AI-Powered Surveillance Arsenal Has Ballooned. Proposed Laws Provide Few Guardrails. appeared first on The Texas Observer.