Because the algorithms for these systems are often not disclosed, a judge would have no way of evaluating the likelihood of a false match when presented with investigative evidence about a suspect’s crime. Civil liberties experts find this especially disconcerting given that machine learning systems make probabilistic, rather than binary, judgments. Amazon mistakenly predicting that you want more toilet paper has vastly different implications for individual liberty than a private technology company’s cloud mistakenly telling an officer, with unwarranted certainty, to react lethally to a seemingly aggressive suspect.
During a real-life scenario training, officers learn how to work in and around vehicles when protecting an Internationally Protected Person.
While much of the training concentrates on proactive tactics and mitigating threats, it also prepares officers for what to do if something goes wrong.
As privately owned policing tactics become increasingly black-boxed, citizens will have no recourse to uncover how they ended up on their city’s list of suspicious persons or the logic guiding an algorithm’s decisions. In “RoboCop,” for instance, a secret rule prohibits the robot from arresting any of the owner-corporation’s board members.
Baltimore City Police Commissioner Kevin Davis, at podium, shows a sample of footage from a body camera worn by a police officer during a news conference at police headquarters on Dec. 21, 2015.
Yet body-worn cameras show the police point of view by design. Their footage will also likely be labeled by officers rather than civilians, meaning that systems could be taught to classify certain civilians’ behaviors as aggressive if such categorizations helped support the officer’s narrative in a use-of-force encounter.
The new course is part of a wider initiative to modernize the RCMP's close protection training. Work is underway to develop and update courses related to other aspects of close protection work such as driving, site security, and working abroad.
The problem with any suspicious activity reporting, automated or not, is that suspicion always lies in the eye of the beholder. As The Intercept reported in February, the Transportation Security Administration’s own research showed that the agency’s program to detect suspicious behavior in travelers was unscientific, unreliable, and dependent on racial stereotypes.
"By the time they're done, their skills have sky-rocketed and I would work with any of them any day," says Cpl. Sylvie Nault, who spent a decade working with the Prime Minister Protective Detail before joining the National Close Protection Training Unit in 2018.
The extensive National Close Protection Officer Course replaces a one-week program and allows candidates to become proficient in applying close protection knowledge, skills, and abilities.
This is why artificial intelligence experts fear that the human decisions that shape the way the data is collected, labeled, and perceived might not just reinforce the racial biases of the criminal justice system, but automate them. Dextro’s deep learning system, for instance, learns to pick out objects, like stop signs, guns, and license plates, and to discern actions, like the difference between a jogger and a suspect fleeing the police.
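As a concrete illustration of the general technique (a minimal sketch, not Dextro’s actual system), the following Python snippet runs an off-the-shelf pretrained object detector from torchvision over a single stand-in video frame; the random tensor takes the place of a decoded frame from real footage.

```python
# A minimal sketch of deep-learning object detection on one video frame.
# This uses a generic pretrained torchvision model, NOT Dextro's system;
# the random tensor below stands in for a decoded body-camera frame.
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn

model = fasterrcnn_resnet50_fpn(weights="DEFAULT").eval()
frame = torch.rand(3, 480, 640)  # stand-in for a real RGB frame in [0, 1]

with torch.no_grad():
    detections = model([frame])[0]  # dict of boxes, labels, and scores

# Each detection pairs a bounding box with a class label (e.g., stop sign,
# person) and a confidence score; real systems threshold on the score.
for box, label, score in zip(detections["boxes"],
                             detections["labels"],
                             detections["scores"]):
    if score > 0.5:
        print(label.item(), [round(v, 1) for v in box.tolist()])
```

On a random frame this prints nothing; the point is the shape of the output. Every frame becomes a machine-readable list of objects, which is what makes footage searchable at scale.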
According to Piché, after taking the course many participants prepare for full-time work with a protective policing unit or for part-time support on protection teams when an IPP visits their region.
Despite prominent civil rights groups highlighting the need for comprehensive policies, state and local level legislation has lagged in regulating who can access body-worn camera footage, how long it is stored, and who gets to see it. But the biggest impediment to making sure body-worn camera footage remains accountable might be the manufacturers themselves.
"The instructor cadre has such a diverse background and everybody is able to bring something different to the table," says Nault.
A police officer wears a body camera during an anti-Trump protest in Cleveland, Ohio, near the Republican National Convention, July 18, 2016.
Exercises such as mock meetings allow participants to see how events with IPPs operate, and help them understand their role before, during, and after an event. Volunteers and actors sometimes form crowds during mock emergency and non-emergency scenarios to keep participants on their toes.
Predictions aside, the mere ability to trawl for evidence from body-worn camera footage also widens the range of “potentially suspicious persons” who can be contacted by law enforcement, according to Joh, the legal scholar of policing. “It’s a pretty radical expansion of the kind of discretion law enforcement has.” At such an indiscriminate scale, all kinds of insights and individuals get swept into an automated investigation process. “Once you’ve created a giant video database, it’s possible to search and re-search it, it’s not clear that there are any legal limits,” she said, since the Fourth Amendment focuses on the point of collection. “Generally speaking, there aren’t too many rules on what the police can do after they collect the information.”
Taser’s corporate ethos has long been inspired by cinematic science fiction. The company’s LinkedIn page describes its Seattle headquarters as “a mix of Star Wars, James Bond, Get Smart and Star Trek.” It even boasts eye scanners and sliding doors lifted from “Men in Black.”
Taser responded that it believes body camera “video represents an important step closer to what happened at an event.” When asked about racially disparate policing practices, the spokesperson said that the “huge gain in information fidelity and transparency in video (versus text) is something that we believe can identify such bias.”
RCMP Cst. Mike Park, who completed the course in the fall, signed up to add to his existing experience and try a new type of police work.
But the company took its sci-fi references to the next level in a little-publicized Law Enforcement Technology Report released earlier this year. In one of the interviews featured in the report, Arizona State University scientist George Poste explains that while artificially intelligent policing has yet to realize “the fully futuristic dimension of ‘RoboCop’ where you essentially have someone wearing an exoskeleton linked to advanced artificial intelligence capabilities,” or “the Tom Cruise ‘Minority Report’-level of cognitive prediction, … patterns of individual behavior will become increasingly informative in revealing the probability that an individual will act in a particular fashion.”
Overall, the report sells departments on how Taser will leverage its cloud of data “to anticipate criminal activity” and “predict future events.” “Imagine,” the report tells officers, that “you can find out if someone has a criminal record instantly — or be notified if someone’s demeanor has changed and may now be a threat.” While a tool like emotion detection is more marketing hype than imminent reality, such goals reveal the ambitions of Taser’s long-term blueprint.
“Body cameras are really just a story about private influence on public policing,” Joh said. “Whoever captures the audience first wins. And Taser is capturing the entire market. They get to shape the language that we use, they get to set the agenda, they get to say ‘this is possible’ and therefore the police can do it.”
Deep learning works by teaching computers to recognize patterns. The system is not given if-then rules; instead, it’s asked to infer associations from large batches of data. Whereas a rule-based algorithm learns that a “cat equals two ears, narrow body, and a tail, but isn’t a rat” — and incrementally makes progress as it’s given increasingly specific rules — a deep learning system ingests a training set of hundreds of thousands of images that have been labeled as cats, lynxes, wolves, and so on. Layers of “neural networks” loosely mimic the structure of a human brain, strengthening or weakening associations based on whether each guess is correct. But exactly how the deep learning system ultimately grasps the essence of a cat is not known; like the judicial test for obscenity, it just knows it when it sees it.
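As a rough sketch of that training loop (a generic illustration, not any vendor’s system), the Python snippet below trains a tiny PyTorch classifier on randomly generated stand-in “images” with human-assigned labels; no if-then rules appear anywhere, only weights nudged toward the labels.

```python
# A generic sketch of deep-learning training: associations are inferred from
# labeled examples, not programmed as rules. Data here is random stand-in
# "images"; labels 0/1/2 play the role of cat/lynx/wolf tags.
import torch
import torch.nn as nn

images = torch.randn(300, 1, 64, 64)   # toy 64x64 grayscale images
labels = torch.randint(0, 3, (300,))   # human-assigned class labels

# Layered network whose connection strengths are adjusted during training.
model = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(8 * 32 * 32, 3),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(5):
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)  # error against the human labels
    loss.backward()                        # strengthen/weaken associations
    optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.3f}")
```

Everything the network ends up “knowing” is distilled from the labels it was given, which is why the labeling step carries so much weight.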
“Everyone refers to ‘Minority Report’ … about how they use facial recognition and iris recognition,” said Ron Kirk, director of the West Virginia Intelligence Fusion Center, which uses both technologies, in an interview with Vocativ. “I actually think that that is the way of the future.”
Participants like Rousseau praised the instructors for creating a supportive learning environment where mistakes are corrected rather than criticized.
Taser is collecting an unprecedented video archive of law enforcement encounters — and it wants to use AI and “deep learning” to predict criminal behavior.
“The ‘RoboCop’ narrative,” said Marcus Womack, an executive vice president for software and services, “doesn’t align with our mission and is a poor example of how technology can impact policing. In particular, we are not using AI technology to make decisions for officers. We see the real impact being that this technology will make police officers more human.”
But looking to the past is just the beginning: Taser is betting that its artificial intelligence tools might be useful not just to determine what happened, but to anticipate what might happen in the future.
The report repeatedly compares Taser’s repurposing of its video data not just to pre-crime, but to the efforts of Wal-Mart, Google, Facebook, and Microsoft, all of which scrape their respective user data to anticipate purchases, tailor text, monitor activities, and optimize search results. Taser’s AI unit is using the same cutting-edge technique as these major technology companies: deep learning.
Taser CEO Rick Smith discussed a similar vision in a recent FastCompany profile, explaining that real-time artificial intelligence technology could have aided the officer who killed Philando Castile, the 32-year-old African-American man driving with his girlfriend and her 4-year-old daughter, by alerting him to the fact that Castile had a gun license and no violent criminal record.
Tactical training is an important component of the new close protection training course. During the training, officers learn how to respond if there's a threat to a VIP.
In some scenarios, participants have the chance to act as protection officers, an IPP and even an aggressor so they can see how the scene unfolds from all perspectives. The exercises are repeated and new elements, such as changes to the size of the group, are added as they develop their skills.
A group of bodyguards-in-training practise a real-life scenario during the RCMP's close protection officer training. Training in a public setting gives officers a taste of what it's like on the job.
When it comes to programs like stop and frisk in New York City or traffic violations in Ferguson, Missouri, courts have determined that decisions about who, what, and where to police can have a racially disparate impact. In her book “Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy,” Cathy O’Neil argues that unjust decisions are reinforced when they’re programmed into computer systems that make claims to objectivity. She discusses the example of PredPol, the controversial predictive policing software first used in Los Angeles in 2009. PredPol is careful to advertise the fact that it uses geographic, rather than demographic, inputs to predict where nuisance crimes like loitering will occur. But because such crimes are already over-policed in black neighborhoods, the data fed to the algorithm is already skewed. By then sending more police to the computer-generated “loitering hotspots,” the system reinforces what O’Neil calls a “pernicious feedback loop,” whereby it justifies the initial assumptions it was fed. Any crime-predicting algorithm, O’Neil emphasizes, has the power to bring into being the world it predicts.
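A toy simulation makes the feedback loop concrete. The sketch below is an illustration of O’Neil’s argument, not PredPol’s actual model: two neighborhoods have identical true crime rates, but a crime is recorded only when a patrol is present to observe it, and patrols are then reallocated according to the recorded counts.

```python
# A toy simulation of O'Neil's "pernicious feedback loop" (an illustration,
# not PredPol's model). Both neighborhoods offend at the same true rate.
import random

random.seed(0)
TRUE_RATE = 0.1                  # identical underlying crime rate
patrols = {"A": 0.6, "B": 0.4}   # A starts out slightly over-policed
recorded = {"A": 0, "B": 0}

for day in range(1000):
    for hood in ("A", "B"):
        # A crime enters the data only if it occurs AND a patrol observes it.
        if random.random() < TRUE_RATE and random.random() < patrols[hood]:
            recorded[hood] += 1
    # Patrols are reallocated in proportion to recorded, not true, crime.
    total = recorded["A"] + recorded["B"]
    if total:
        patrols = {h: recorded[h] / total for h in recorded}

print(recorded)  # A's small initial edge snowballs into a large recorded gap
```

Because the skewed records drive the next day’s deployments, the neighborhood that began with more patrols ends with far more recorded crime, vindicating the initial assumption even though the true rates never differed.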
Plainclothes RCMP officers use prop firearms in a training exercise to protect an Internationally Protected Person during the RCMP's close protection officer training.
Taser’s investments in artificial intelligence, she added, seem like a more “scientific-sounding version of broken windows policing.” The expectation of finding crime may influence what the officers end up finding.
"They say there's over a hundred different jobs with the RCMP and I want to try out as many as possible throughout my career," says Park, who also worked general duty (patrolling the streets in a police cruiser and responding to service calls) and participated with the Musical Ride. "The course was difficult, but the instructors and support staff went above and beyond."
Piché credits the collaboration between the National Close Protection Training Unit, National Learning Services, Federal Policing Training Services, and National Division Training for the course's success.
"We need to ensure they have the tools, skills and knowledge on both the proactive and reactive sides of a potential incident," says Piché.
Or take the case of the criminal justice consulting firm Northpointe. A ProPublica investigation showed that Northpointe’s algorithm for calculating the risk of recidivism was twice as likely to incorrectly flag black defendants as being at higher risk of committing future crimes. But while reporters were able to analyze the questionnaires used by the company, which disputed ProPublica’s findings, they were unable to analyze Northpointe’s proprietary software.
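The disparity at issue is a gap in false positive rates: among defendants who never reoffended, how many were nonetheless labeled high risk. A minimal sketch of that calculation appears below, using made-up counts for illustration rather than Northpointe’s actual data.

```python
# Hypothetical counts for illustration only (NOT Northpointe's data):
# defendants who did not reoffend, and how many were flagged anyway.
did_not_reoffend = {"black": 1000, "white": 1000}
labeled_high_risk = {"black": 440, "white": 220}

for group in did_not_reoffend:
    fpr = labeled_high_risk[group] / did_not_reoffend[group]
    print(f"{group}: false positive rate = {fpr:.0%}")
# A roughly 2x gap of this kind is the disparity ProPublica reported.
```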
With an estimated one-third of departments using body cameras, police officers have been generating millions of hours of video footage. Taser stores terabytes of such video on Evidence.com, in private servers operated by Microsoft, to which police agencies must continuously subscribe for a monthly fee. Data from these recordings is rarely analyzed for investigative purposes, though, and Taser — which recently rebranded itself as a technology company and renamed itself “Axon” — is hoping to change that.
"We try to bring an element of realism into the course," says Luke Ward, an instructional designer with the RCMP's National Learning Services who worked on creating the course. "It takes a lot of co-ordination, but the learning value for the participants is invaluable."
During the close protection training, officers learn how to respond if there's a threat to an Internationally Protected Person while in a vehicle.
Legal experts and surveillance watchdogs caution, however, that any company that automates recommendations about threat assessments and suspicions may transform policing tactics for the worse.
As video analytics and machine vision have made rapid gains in recent years, the future long dreaded by privacy experts and celebrated by technology companies is quickly approaching. No longer is the question whether artificial intelligence will transform the legal and lethal limits of policing, but how and for whose profits.
This raw data fed into video analytics systems is itself captured and created by the police, said Elizabeth Joh, a law professor and policing expert at the University of California, Davis. “If you think about it,” she said, “some of the factors that algorithms use are products of human discretion. Crime reporting, contact cards, and arrest rates are not neutral. … You get analog facts transformed into unassailable, objective truths, and we have to be pretty skeptical about that.” Teaching the machine to look for “hoodies” may already be a reflection of human assumptions, not criminal propensity.
In an interview in Taser’s future of policing report, a senior data architect at Microsoft envisions a future in which officers receive alerts when “an individual has a known criminal record, or propensity to violence. Even if [the suspect] has not yet adopted a threatening posture, it heightens the overall threshold of awareness.”
“We’ve got all of this law enforcement information with these videos, which is one of the richest treasure troves you could imagine for machine learning,” Taser CEO Rick Smith told PoliceOne in an interview about the company’s AI acquisitions. “Imagine having one person in your agency who would watch every single one of your videos — and remember everything they saw — and then be able to process that and give you the insight into what crimes you could solve, what problems you could deal with. Now, that’s obviously a little further out, but based on what we’re seeing in the artificial intelligence space, that could be within five to seven years.”
Nondisclosure agreements allow private companies like Taser to defend their proprietary computing systems from public scrutiny, Joh explained. “Typically we think we have oversight into what police can do,” said Joh. “Now we have [a] third-party intermediary, they have a kind of privacy shield, they’re not subject to state public record laws, and they have departments sign contracts that they are going to keep this secret.”
The training also includes mock outings such as trips to public venues. Trainees plan the outing themselves by following protective services protocol, meeting with facility staff to learn about site security, and researching potential threats. During the outing, actors with speaking roles are often embedded in a crowd to prompt certain scenarios, allowing instructors to see how participants respond to unexpected circumstances.
He says instructors kept a close eye on the trainees throughout the course, and regularly provided pointers and advice, sharing insights from their own experiences in protective operations.
Taser isn’t the only company selling agencies on its powers of speculation. A spokesperson for the Russian company Ntechlab told me that its high-performing facial recognition algorithm is able to detect “abnormal and suspicious behavior of people in certain areas.” Several major face recognition companies have already been teaching their systems to detect anomalous behaviors in crowds. Earlier this year, IBM, which has spent over $14 billion on predictive policing, advertised that its Deep Learning Engine could pinpoint the location and identity of suspects in real time. And for the last several years, researchers funded by the Defense Advanced Research Projects Agency have been developing “automated suspicion algorithms” to predict and analyze behavior from videos, text, and online images. But as the market leader for video recording hardware, having relationships with an estimated 17,000 of the country’s 18,000 police departments, Taser’s research investments have an outsized influence on law enforcement tactics.
When questioned about the potential for predictive policing discussed in other interviews and advertised at several moments throughout the company’s 34-page report, a Taser spokesperson was more circumspect and said the company would only be using machine learning to improve “workflow” at this time. The spokesperson stated, contrary to the 2017 Taser technology report’s detailed speculations, that “Axon is not building predictive policing and will not make predictions on behalf of our customers. In addition, all Axon machine learning work is under the oversight of our AI Ethics Board that we are finalizing.”
Cst. Patrice Rousseau transferred to protective operations after 17 years of general duty policing and completed the new course in the fall of 2021.
Although intended as a grim allegory of the pitfalls of relying on untested, proprietary algorithms to make lethal force decisions, “RoboCop” has long been taken by corporations as a roadmap. And no company has been better poised than Taser International, the world’s largest police body camera vendor, to turn the film’s ironic vision into an earnest reality.
Hamid Khan, lead organizer for the Stop LAPD Spying Coalition, contends that feeding police real-time information about an individual’s prior records may only encourage more aggressive conduct with suspects. “We don’t have to go very far into deep learning,” he said, for evidence of this phenomenon. “We just have to look at the numbers that already exist for suspicious activity reporting, which doesn’t even require [advanced] analytics.” He noted that when the LAPD’s Suspicious Activity Reporting program, which relied on analog human tips, was audited by the city’s inspector general, the audit determined that black women residents were being disproportionately flagged.
When civil liberties advocates discuss the dangers of new policing technologies, they often point to sci-fi films like “RoboCop” and “Minority Report” as cautionary tales. In “RoboCop,” a massive corporation purchases Detroit’s entire police department. After one of its officers gets fatally shot on duty, the company sees an opportunity to save on labor costs by reanimating the officer’s body with sleek weapons, predictive analytics, facial recognition, and the ability to record and transmit live video.
Police stand guard as demonstrators, marking the one-year anniversary of the shooting of Michael Brown, protest along West Florissant on August 10, 2015, in Ferguson, Missouri.
But while the complex associations of a deep learning system are opaque even to its programmers, the training labels for its datasets are human-generated. They can also be subject to bias. Many neural networks have already been found to reveal the geographical, racial, and socio-economic positions of their human trainers even as their complexity lends them an appearance of greater objectivity. Studies show that facial recognition neural nets trained on white faces, for instance, have trouble recognizing the faces of African-Americans.
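One standard way to surface this problem is a per-group audit: compute accuracy separately for each demographic group instead of reporting a single overall number. The sketch below uses a stand-in model and toy samples; all names and data are hypothetical.

```python
# A minimal per-group accuracy audit. The model and samples are stand-ins;
# the point is that one overall accuracy number can hide subgroup gaps.
from collections import defaultdict

class StandInModel:
    """Placeholder for a trained face recognizer (illustration only)."""
    def predict(self, image):
        return image["id_guess"]

def audit(model, samples):
    """samples: iterable of (image, true_identity, group) tuples."""
    correct, total = defaultdict(int), defaultdict(int)
    for image, identity, group in samples:
        total[group] += 1
        correct[group] += int(model.predict(image) == identity)
    return {g: correct[g] / total[g] for g in total}

samples = [
    ({"id_guess": "alice"}, "alice", "white"),
    ({"id_guess": "bob"},   "bob",   "white"),
    ({"id_guess": "carol"}, "dawn",  "black"),   # misidentification
    ({"id_guess": "erin"},  "erin",  "black"),
]
print(audit(StandInModel(), samples))  # {'white': 1.0, 'black': 0.5}
```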
In 2010, Taser’s longtime vice president Steve Tuttle “proudly predicted” to GQ that once police can search a crowd for outstanding warrants using real-time face recognition, “every cop will be RoboCop.” Now Taser has announced that it will provide any police department in the nation with free body cameras, along with a year of free “data storage, training, and support.” The company’s goal is not just to corner the camera market, but to dramatically increase the video streaming into its servers.
Over the course of the training, officers also learn close protection-specific firearms tactics, specialized police defensive techniques, and teamwork in tense situations, along with other skills that build on their previous police training and experience.
Taser has started to get into the business of making sense of its enormous archive of video footage by building an in-house “AI team.” In February, the company acquired a computer vision startup called Dextro and a computer vision team from Fossil Group Inc. Taser says the companies will allow agencies to automatically redact faces to protect privacy, extract important information, and detect emotions and objects — all without human intervention. This will free officers from the grunt work of manually writing reports and tagging videos, a Taser spokesperson wrote in an email. “Our prediction for the next few years is that the process of doing paperwork by hand will begin to disappear from the world of law enforcement, along with many other tedious manual tasks.” Analytics will also allow departments to observe historical patterns in behavior for officer training, the spokesperson added. “Police departments are now sitting on a vast trove of body-worn footage that gives them insight for the first time into which interactions with the public have been positive versus negative, and how individuals’ actions led to it.”
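Of the capabilities Taser describes, automatic face redaction is the most routine computer vision task. The sketch below shows one plausible approach, assuming OpenCV’s stock Haar-cascade face detector and a Gaussian blur; it is a generic illustration, not Axon’s actual pipeline, and the file name footage.mp4 is hypothetical.

```python
# A generic face-redaction sketch (assumed approach, NOT Axon's pipeline):
# detect faces in a frame, then blur each detected region beyond recognition.
# Requires: pip install opencv-python
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def redact_faces(frame):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in cascade.detectMultiScale(gray, 1.1, 5):
        roi = frame[y:y + h, x:x + w]
        frame[y:y + h, x:x + w] = cv2.GaussianBlur(roi, (51, 51), 0)
    return frame

# Usage with a hypothetical clip: redact the first frame and save it.
cap = cv2.VideoCapture("footage.mp4")
ok, frame = cap.read()
if ok:
    cv2.imwrite("redacted.png", redact_faces(frame))
```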
Christoph Musik, an expert in computer vision from the University of Vienna, has written extensively about the human assumptions built into such systems. Hunches are always subjective, he points out, unlike the question of whether or not an object is a cat. “It is extremely difficult to formulate universal laws of behavior or suspicious behavior, especially if we focus on everyday behavior on a micro level,” Musik wrote in an email. “‘Smart’ or ‘intelligent’ systems claiming to recognize suspicious behavior are not as objective or neutral as they [seem].”
The RCMP operates a number of units under the protective policing umbrella including the Governor General Protective Detail, the Prime Minister Protective Detail and the Divisional Protective Services. The units are responsible for safeguarding Canadian and foreign dignitaries, as well as visiting Internationally Protected Persons (IPPs).
The updated training was more than three years in the making, according to Piché. Staff studied global best practices, met with Canadian and international policing partners and researched the history and statistics of threats to IPPs to ensure close protection officers have the most up-to-date and evidence-based knowledge.
Participants come to the training from a variety of policing backgrounds, ranging from general duty to specialized roles. While an online module covers some basics in advance, much of the training consists of hands-on scenarios to simulate the reality of the job.