Google Tracking Location, Prosecutors Dropping Cases, and Facial Recognition
The last couple of weeks have been eventful for news related to the world of digital evidence. The media often doesn’t talk about the forensic implications of tech news, so let me shed some light on what the media missed.
New York Times says Google is a Dragnet for Police
https://www.nytimes.com/interactive/2019/04/13/us/google-location-tracking-police.html
In this top story, the New York Times profiles an Arizona case in which law enforcement requested information from Google about all of the phones present in a particular area on the date and time of a murder. That request led to the arrest of a warehouse worker. The problem was that the worker turned out to be innocent. He spent a week in jail, accused of murder, before police learned new information and dropped the case.
Location data can make or break a case. It’s no surprise that the police are getting warrants to obtain it from Google. The problem comes when lawyers or investigators use that information incorrectly.
Google gets location data from its users’ phones. If you have an Android phone, or an iOS device with Google Maps, Google likely knows where you are. There were even reports last year that Google tracked users who had turned off location services on their devices. This information is highly accurate. Google uses a combination of GPS, cell tower, and Wi-Fi information to determine the location of a phone. That accuracy is how Google can notify you of useful information like sales at a grocery store when you walk in, or the menu at the restaurant where you just sat down.
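To make the general idea concrete, here is a minimal sketch in Python of how several location signals can be blended into a single fix by trusting the more precise sources more. Everything in it, including the data structure, the inverse-error weighting, and the sample coordinates, is an assumption for illustration only; Google’s actual fusion methods are proprietary and far more sophisticated.

```python
# Illustrative toy example: an accuracy-weighted average of position estimates.
# This is NOT Google's algorithm; the sources, weights, and numbers are
# assumptions chosen only to show how multiple signals can be combined.

from dataclasses import dataclass


@dataclass
class PositionEstimate:
    source: str        # e.g., "gps", "wifi", "cell"
    lat: float         # latitude in degrees
    lon: float         # longitude in degrees
    accuracy_m: float  # estimated error radius in meters (smaller = more trusted)


def fuse(estimates: list[PositionEstimate]) -> tuple[float, float]:
    """Combine estimates, weighting each by the inverse of its error radius."""
    weights = [1.0 / e.accuracy_m for e in estimates]
    total = sum(weights)
    lat = sum(w * e.lat for w, e in zip(weights, estimates)) / total
    lon = sum(w * e.lon for w, e in zip(weights, estimates)) / total
    return lat, lon


if __name__ == "__main__":
    readings = [
        PositionEstimate("gps",  33.4484, -112.0740, accuracy_m=5),    # precise outdoors
        PositionEstimate("wifi", 33.4490, -112.0735, accuracy_m=20),   # good indoors
        PositionEstimate("cell", 33.4500, -112.0700, accuracy_m=500),  # coarse fallback
    ]
    print(fuse(readings))  # fix dominated by the GPS and Wi-Fi readings
```

The takeaway for a case team is that a reported location is an inference blended from several signals, each with its own error radius, which is why the accuracy value attached to each data point matters as much as the coordinates themselves.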
Problems arise when investigators rely solely on this information. How do we know that the owner had the phone? What if they left it in their car and then loaned the car to a friend or family member? What if they let someone borrow their phone? What if the phone was stolen?
Location information is useful, and there are many ways to get it. Just make sure you combine it with other evidence in your case to get the complete picture.
Prosecutors Dismissing Child Pornography Cases
If you have worked a child pornography case, chances are you are aware that police use automated monitoring software to look for computers sharing child images on the Internet. This practice has been the subject of litigation for years. More recently, law enforcement has relied on newer programs to obtain warrants to search people’s homes and computers. In several cases, those searches revealed no child pornography.
These empty searches raise the question: does the software work? Defense lawyers have started requesting access to the programs to see how they function. Instead of allowing defense teams to inspect the software, prosecutors have dismissed the cases. That, of course, only raises more questions about what this software is actually doing.
Facial Recognition at Airports
DHS announced it will use facial recognition technology on 97% of departing airline passengers within the next four years. While this may sound like a good way to catch potential criminals and terrorists, facial recognition has a lot of problems, including a high rate of false positives for minorities and people of color. Famously, Google’s image recognition could not tell the difference between gorillas and people of color, and Google “fixed” the problem by removing gorillas from its image-labeling technology.
While AI may sound like the answer to our problems, it instead introduces new ones. Like any new technology, it can be a tool to assist investigators, but it should not be the sole basis for decisions to arrest or charge individuals with a crime.
If you are facing a new digital challenge in your case, contact us today.