Law Enforcement

Caught in the Digital Dragnet: The Controversy Over Geofence Warrants and Privacy Rights

Yaoyu Tang, MJLST Staffer

Picture this: A sunny Saturday afternoon at a bustling shopping mall. Children’s laughter echoes as they pull their parents toward an ice cream stand. Couples meander hand-in-hand past glittering storefronts, while teenagers crowd the food court, joking and snapping selfies. It’s a portrait of ordinary life—until chaos quietly unfolds. A thief strikes a high-end jewelry store and vanishes into the crowd, leaving no trail behind. Frustrated and out of options, law enforcement turns to a geofence warrant, demanding Google provide location data for every smartphone within a quarter-mile radius during the heist. In the days that follow, dozens of innocent shoppers, workers, and passersby find themselves under scrutiny, their routines disrupted simply for being in the wrong place at the wrong time.

This story is not hypothetical—it mirrors real-life cases where geofence warrants have swept innocent individuals into criminal investigations, raising significant concerns about privacy rights and constitutional protections.

Geofence warrants are a modern investigative tool used by law enforcement to gather location data from technology companies.[1] These warrants define a specific geographic area and time frame, compelling companies like Google to provide anonymized location data from all devices within that zone.[2] Investigators then sift through this data to identify potential suspects or witnesses, narrowing the scope to relevant individuals whose movements align with the crime scene and timeline.[3]
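
Mechanically, the request amounts to a spatial-and-temporal filter over stored location records. The Python sketch below is a minimal illustration of that filtering step; the record fields, coordinates, and quarter-mile radius are illustrative assumptions, not Google's actual schema or process.

```python
from dataclasses import dataclass
from datetime import datetime
from math import asin, cos, radians, sin, sqrt

# Hypothetical anonymized location record; the field names are
# illustrative, not Google's actual schema.
@dataclass
class Ping:
    device_id: str  # anonymized identifier
    lat: float
    lon: float
    time: datetime

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 3958.8 * asin(sqrt(a))  # Earth's mean radius is ~3,958.8 miles

def geofence_hits(pings, center_lat, center_lon, radius_miles, start, end):
    """Return every device observed inside the zone during the window."""
    return {
        p.device_id
        for p in pings
        if start <= p.time <= end
        and haversine_miles(p.lat, p.lon, center_lat, center_lon) <= radius_miles
    }

# Example: all devices within a quarter mile of the store during the heist.
# hits = geofence_hits(all_pings, 44.9778, -93.2650, 0.25,
#                      datetime(2024, 6, 1, 14, 0), datetime(2024, 6, 1, 15, 0))
```

Note that the resulting set contains every device in the zone, whether it belonged to the thief, a shopper, or a store clerk; the narrowing to relevant individuals happens only afterward, through human review.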

The utility of geofence warrants is undeniable. They have been instrumental in solving high-profile cases, such as identifying suspects in robberies, assaults, and even the January 6 Capitol riots.[4] By providing a way to access location data tied to a specific area, geofence warrants enable law enforcement to find leads in cases where traditional investigative techniques might fail.[5] These tools are particularly valuable in situations where there are no direct witnesses or physical evidence, allowing law enforcement to piece together events and identify individuals who were present during criminal activity.[6]

However, the benefits of geofence warrants come with significant risks. Critics argue that these warrants are overly broad and invasive, sweeping up data on innocent bystanders who happen to be in the area.[7] Civil liberties organizations such as the ACLU and the Electronic Frontier Foundation (EFF) have strongly criticized geofence warrants on this ground.[8] They argue that geofence warrants infringe on privacy rights and that, without strict limitations, they could become tools of mass surveillance, disproportionately targeting marginalized communities and chilling free movement and association.[9] Moreover, this indiscriminate collection of location data raises serious Fourth Amendment concerns, as it can be seen as a form of digital general warrant—a modern equivalent to the invasive searches that the Framers sought to prevent.[10] The tension between their investigative utility and their potential for privacy violations has made geofence warrants one of the most controversial tools in modern law enforcement.

The legality of geofence warrants is far from settled, with courts offering conflicting rulings. In United States v. Smith, the Fifth Circuit declared geofence warrants unconstitutional, holding that they amount to general searches.[11] The court emphasized the massive scope of the data collected and likened the practice to rummaging through private information without sufficient cause.[12] The decision relied heavily on Carpenter v. United States, where the Supreme Court held that accessing historical cell-site location information without a warrant violates the Fourth Amendment.[13] In Carpenter, the Court recognized that cell-site location information (CSLI) provides an intimate record of a person's movements, revealing daily routines, frequent locations, and close personal associations.[14] Accessing this information, the Court held, constitutes a "search" within the meaning of the Fourth Amendment, requiring a warrant supported by probable cause.[15] Conversely, the Fourth Circuit in United States v. Chatrie upheld the use of geofence warrants, reasoning that users implicitly consent to data collection by agreeing to tech companies' terms of service.[16] The court leaned on the third-party doctrine, which holds that individuals have reduced privacy expectations in information shared with third parties.[17] These conflicting rulings illustrate the broader struggle to apply traditional Fourth Amendment principles to digital technologies. The Fifth Circuit's ruling reflects discomfort with the vast reach of geofence warrants, pointing to their lack of Fourth Amendment particularity,[18] while the Fourth Circuit's reliance on the third-party doctrine broadens law enforcement access, framing user consent as a waiver of privacy.[19] This split leaves courts struggling to reconcile privacy with evolving surveillance technology and underscores the urgent need for clearer standards.

Tech companies like Google play a pivotal role in the geofence warrant debate. Historically, Google stored user location data in a vast internal database known as Sensorvault.[20] This database served as a central repository for location data collected from various Google services, including Google Maps, Search, and Android devices.[21] Law enforcement agencies frequently sought access to this data in response to geofence warrants, making Sensorvault a crucial point of contention in the legal and privacy debates surrounding this technology.[22] In 2023, however, Google announced significant changes to its data policies: location data would be stored on user devices instead of in the cloud; backed-up data would be encrypted to prevent unauthorized access, including by Google itself; and default auto-delete settings would reduce location-history retention from 18 months to three months.[23] These changes significantly limit the location data available to law enforcement agencies seeking to execute geofence warrants.[24] By storing data locally on user devices and implementing robust encryption and auto-deletion, Google has effectively reduced the amount of location data accessible to law enforcement,[25] highlighting the significant influence that corporate data policies can exert on law enforcement practices.[26] Other companies, like Apple, have adopted even stricter privacy measures, declining to produce data in response to geofence warrant requests altogether.[27]
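
The practical effect of the shorter retention default is easy to see in miniature. The sketch below is a toy model, not Google's code: records older than the roughly three-month window are simply pruned, so by the time a geofence warrant issues there is far less history left to produce.

```python
from datetime import datetime, timedelta

RETENTION = timedelta(days=90)  # assumed ~3-month default, down from ~18 months

def prune_history(records, now):
    """Keep only location records newer than the retention cutoff (toy model)."""
    cutoff = now - RETENTION
    return [r for r in records if r["time"] >= cutoff]

history = [
    {"time": datetime(2024, 1, 5), "lat": 44.97, "lon": -93.26},
    {"time": datetime(2024, 11, 20), "lat": 44.98, "lon": -93.27},
]
# Only the November record survives a December 1 pruning pass.
print(prune_history(history, now=datetime(2024, 12, 1)))
```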

The debate surrounding the legality and scope of geofence warrants remains contentious. Courts grapple with varying interpretations, legislators struggle to enact comprehensive legislation, and public opinion remains divided. This uncertainty necessitates authoritative guidance. Whether through judicial precedent, legislative reform, or technological advancements that mitigate privacy concerns, achieving a consensus on the permissible use of geofence warrants is crucial. Only with such a consensus can society navigate the delicate balance between public safety and individual privacy rights in the digital era.

Notes:

[1] Ronald J. Rychlak, Geofence Warrants: The New Boundaries, 93 MISS. L. REV. 957, 957-59 (2024).

[2] Id.

[3] Id.

[4] Mark Harris, A Peek Inside the FBI’s Unprecedented January 6 Geofence Dragnet, WIRED (Nov. 28, 2022, 7:00 AM), https://www.wired.com/story/fbi-google-geofence-warrant-january-6/.

[5] Jeff Welty, Recent Developments Concerning Geofence Warrants, N.C. CRIM. L. (Nov. 4, 2024), https://nccriminallaw.sog.unc.edu/recent-developments-concerning-geofence-warrants/.

[6] Prathi Chowdri, Emerging Tech and Law Enforcement: What Are Geofences and How Do They Work?, POLICE1 (Nov. 16, 2023, 9:06 PM), https://www.police1.com/warrants/google-announces-it-will-revoke-access-to-location-history-effectively-blocking-geofence-warrants.

[7] Jennifer Lynch, Is This the End of Geofence Warrants?, ELECTRONIC FRONTIER FOUND. (Dec. 2023), https://www.eff.org/deeplinks/2023/12/end-geofence-warrants.

[8] ACLU, ACLU Argues Evidence From Privacy-Invasive Geofence Warrants Should Be Suppressed, https://www.aclu.org/press-releases/aclu-argues-evidence-from-privacy-invasive-geofence-warrants-should-be-suppressed.

[9] Lynch, supra note 7.

[10] Id.

[11] United States v. Smith, 110 F.4th 817 (5th Cir. 2024).

[12] Id. at 28-30.

[13] Id. at 27-29.

[14] Carpenter v. United States, 585 U.S. 296 (2018).

[15] Id.

[16] United States v. Chatrie, 107 F.4th 319 (4th Cir. 2024).

[17] Id. at 326-57.

[18] Smith, 110 F.4th at 27-30.

[19] Chatrie, 107 F.4th at 326-57.

[20] Jennifer Lynch, Google’s Sensorvault Can Tell Police Where You’ve Been, ELECTRONIC FRONTIER FOUND., https://www.eff.org/deeplinks/2019/04/googles-sensorvault-can-tell-police-where-youve-been.

[21] Id.

[22] Id.

[23] Skye Witley, Google’s Location Data Move Will Reshape Geofence Warrant Use, BLOOMBERG L. (Dec. 20, 2023, 4:05 AM), https://news.bloomberglaw.com/privacy-and-data-security/googles-location-data-move-will-reshape-geofence-warrant-use.

[24] Id.

[25] Id.

[26] Id.

[27] APPLE, Apple Transparency Report: Government and Private Party Requests, https://www.apple.com/legal/transparency/pdf/requests-2022-H1-en.pdf.


Privacy at Risk: Analyzing DHS AI Surveillance Investments

Noah Miller, MJLST Staffer

The concept of widespread surveillance of public areas monitored by artificial intelligence (“AI”) may sound like it comes right out of a dystopian novel, but key investments by the Department of Homeland Security (“DHS”) could make this a reality. Under the Biden Administration, the U.S. has acted quickly and strategically to adopt artificial intelligence as a tool to realize national security objectives.[1] In furtherance of President Biden’s executive goals concerning AI, the Department of Homeland Security has been making investments in surveillance systems that utilize AI algorithms.

Despite the substantial interest in protecting national security, Patrick Toomey, deputy director of the ACLU National Security Project, has criticized the Biden administration for allowing national security agencies to “police themselves as they increasingly subject people in the United States to powerful new technologies.”[2] Notably, these investments have not been tailored towards high-security locations, like airports. Instead, they include surveillance of “soft targets”—high-traffic areas with limited security: “Examples include shopping areas, transit facilities, and open-air tourist attractions.”[3] Currently, because of the number of people required to review footage, surveilling most public areas is infeasible; emerging AI algorithms, however, would allow this work to be done automatically. While enhancing security protections at soft targets is a noble and possibly desirable initiative, the potential privacy ramifications of widespread autonomous AI surveillance are extreme. Current Fourth Amendment jurisprudence offers little resistance to this form of surveillance, and the DHS has been both developing this surveillance technology itself and outsourcing these projects to private corporations.

To foster innovation to combat threats to soft targets, the DHS has created a center called Soft Target Engineering to Neutralize the Threat Reality (“SENTRY”).[4] One of SENTRY’s research areas is the “real-time management of threat detection and mitigation.”[5] One project in this area seeks to create AI algorithms that can detect threats in public and crowded areas.[6] Once the algorithm detects a threat, the incident would be sent to a human for confirmation.[7] This would be a substantially more efficient form of surveillance than is currently widely available.

Along with the research conducted through SENTRY, the DHS has been investing in private companies to develop AI surveillance technologies through the Silicon Valley Innovation Program (“SVIP”).[8] Through the SVIP, the DHS has awarded funding to three companies to develop AI surveillance technologies that can detect “anomalous events via video feeds” to improve security in soft targets: Flux Tensor, Lauretta AI, and Analytical AI.[9] First, Flux Tensor has a pilot-ready prototype that applies “flexible object detection algorithms” to video feeds to track and pinpoint movements of interest.[10] The technology is used to distinguish human movements and actions from the environment—e.g., weather, glare, and camera movements.[11] Second, Lauretta AI is adapting its established activity-recognition AI to utilize “multiple data points per subject to minimize false alerts.”[12] The technology periodically generates automated reports of detected incidents, categorized by relative severity.[13] Third, Analytical AI is in the proof-of-concept demo phase with AI algorithms that can autonomously track objects in relation to people within a perimeter.[14] The company has already created algorithms that can screen for prohibited items and “on-person threats” (i.e., weapons).[15] All of these technologies are in early stages, so the DHS is unlikely to deploy them in the imminent future.
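
None of the vendors have published their methods, but background subtraction is a standard building block for the kind of motion-versus-environment separation Flux Tensor describes. The OpenCV sketch below illustrates only that generic technique; it is not any vendor's algorithm, and the video path is a placeholder.

```python
import cv2  # pip install opencv-python

# Generic background-subtraction sketch: flags regions of sustained motion
# in a video feed while ignoring the static background.
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, detectShadows=True)

cap = cv2.VideoCapture("crowd_footage.mp4")  # placeholder path
while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)  # foreground = pixels unlike the learned background
    # Drop shadow pixels (marked 127) and keep confident foreground only.
    _, mask = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        if cv2.contourArea(c) > 800:  # ignore small flicker (glare, weather, jitter)
            x, y, w, h = cv2.boundingRect(c)
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 0, 255), 2)
    cv2.imshow("detections", frame)
    if cv2.waitKey(30) == 27:  # press Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```

Even this crude version makes the scale problem visible: everything that moves gets boxed, and deciding which boxes are "threats" is the hard, error-prone step the vendors' classifiers are meant to automate.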

Assuming these AI algorithms prove effective and come to fruition, current Fourth Amendment protections seem insufficient to protect against rampant use of AI surveillance in public areas. In Kyllo v. United States, the Court placed an important limit on law enforcement’s use of new technologies: when new sense-enhancing technology, not in general public use, is utilized to obtain information from a constitutionally protected area, the use of that technology constitutes a search.[16] Unlike in Kyllo, where the police used thermal imaging to obtain temperature readings on various areas of a house, people subject to AI surveillance in public areas would not be in constitutionally protected areas.[17] Because people subject to this surveillance would be in public places, they would not have a reasonable expectation of privacy in their movements; this form of surveillance therefore likely would not constitute a search under the prevailing Fourth Amendment search analysis.[18]

While the scope and accuracy of this new technology are still to be determined, policymakers and agencies need to implement proper safeguards and proceed cautiously. In the best scenario, this technology can keep citizens safe while mitigating the impact on the public’s privacy interests. In the worst scenario, this technology could effectively turn our public spaces into security checkpoints. Regardless of how relevant actors proceed, this new technology would likely result in at least some decline in the public’s privacy interests. Policymakers should not make a Faustian bargain for the sake of maintaining social order.

Notes

[1] See generally Joseph R. Biden Jr., Memorandum on Advancing the United States’ Leadership in Artificial Intelligence; Harnessing Artificial Intelligence to Fulfill National Security Objectives; and Fostering the Safety, Security, and Trustworthiness of Artificial Intelligence, The White House (Oct. 24, 2024), https://www.whitehouse.gov/briefing-room/presidential-actions/2024/10/24/memorandum-on-advancing-the-united-states-leadership-in-artificial-intelligence-harnessing-artificial-intelligence-to-fulfill-national-security-objectives-and-fostering-the-safety-security/ (explaining how the executive branch intends to utilize artificial intelligence in relation to national security).

[2] ACLU Warns that Biden-Harris Administration Rules on AI in National Security Lack Key Protections, ACLU (Oct. 24, 2024, 12:00 PM), https://www.aclu.org/press-releases/aclu-warns-that-biden-harris-administration-rules-on-ai-in-national-security-lack-key-protections.

[3] Jay Stanley, DHS Focus on “Soft Targets” Risks Out-of-Control Surveillance, ACLU (Oct. 24, 2024), https://www.aclu.org/news/privacy-technology/dhs-focus-on-soft-targets-risks-out-of-control-surveillance.

[4] See Overview, SENTRY, https://sentry.northeastern.edu/overview/#VSF.

[5] Real-Time Management of Threat Detection and Mitigation, SENTRY, https://sentry.northeastern.edu/research/real-time-threat-detection-and-mitigation/.

[6] See An Artificial Intelligence-Driven Threat Detection and Real-Time Visualization System in Crowded Places, SENTRY, https://sentry.northeastern.edu/research-project/an-artificial-intelligence-driven-threat-detection-and-real-time-visualization-system-in-crowded-places/.

[7] See id.

[8] See, e.g., SVIP Portfolio and Performers, DHS, https://www.dhs.gov/science-and-technology/svip-portfolio.

[9] Id.

[10] See Securing Soft Targets, DHS, https://www.dhs.gov/science-and-technology/securing-soft-targets.

[11] See pFlux Technology, Flux Tensor, https://fluxtensor.com/technology/.

[12] See Securing Soft Targets, supra note 10.

[13] See Security, Lauretta AI, https://lauretta.io/technologies/security/.

[14] See Securing Soft Targets, supra note 10.

[15] See Technology, Analytical AI, https://www.analyticalai.com/technology.

[16] Kyllo v. United States, 533 U.S. 27, 33 (2001).

[17] Cf. id.

[18] See generally Katz v. United States, 389 U.S. 347, 361 (1967) (Harlan, J., concurring) (explaining the test for whether someone may rely on an expectation of privacy).


AI and Predictive Policing: Balancing Technological Innovation and Civil Liberties

Alexander Engemann, MJLST Staffer

To maximize their effectiveness, police agencies are constantly looking to use the most sophisticated preventative methods and technologies available. Predictive policing is one such technique that fuses data analysis, algorithms, and information technology to anticipate and prevent crime. This approach identifies patterns in data to anticipate when and where crime will occur, allowing agencies to take measures to prevent it.[1] Now, engulfed in an artificial intelligence (“AI”) revolution, law enforcement agencies are eager to take advantage of these developments to augment controversial predictive policing methods.[2]

In precincts that use predictive policing strategies, ample amounts of data are used to categorize citizens with basic demographic information.[3] Now, machine learning and AI tools are augmenting this data, which, according to one vendor, “identifies where and when crime is most likely to occur, enabling [law enforcement] to effectively allocate [their] resources to prevent crime.”[4]
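
Vendors rarely disclose their models, but the simplest form of "where and when" forecasting is a grid-and-count exercise over past incident reports. The Python sketch below is a deliberately crude illustration, not any vendor's product, and it makes the core criticism visible: the forecast can only echo wherever incidents were previously recorded.

```python
from collections import Counter

def hotspot_forecast(incidents, cell_size=0.01, top_k=3):
    """Rank map-grid cells by historical incident count (toy model).

    incidents: list of (lat, lon) tuples from past reports.
    Returns the top_k cells most likely to be flagged for patrol.
    """
    counts = Counter(
        (round(lat / cell_size), round(lon / cell_size)) for lat, lon in incidents
    )
    return counts.most_common(top_k)

# Toy data: cells that were heavily policed before get flagged again,
# which is the feedback loop critics warn about.
past_reports = [(44.9780, -93.2650), (44.9781, -93.2648), (44.9532, -93.2870)]
print(hotspot_forecast(past_reports))
```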

Both predictive policing and AI have faced significant challenges concerning equity and discrimination. In response to these concerns, the European Union has taken proactive steps, promulgating sophisticated rules governing AI applications within its territory and continuing its tradition of leadership in regulatory initiatives.[5] In the regulation dubbed the “Artificial Intelligence Act,” the Union clearly outlined its goal of promoting safe, non-discriminatory AI systems.[6]

Back home, we’ve failed to keep a similar legislative pace, even with certain institutions sounding the alarm.[7] Predictive policing methods have faced similar criticism. In an issue brief, the NAACP emphasized, “[j]urisdictions who use [Artificial Intelligence] argue it enhances public safety, but in reality, there is growing evidence that AI-driven predictive policing perpetuates racial bias, violates privacy rights, and undermines public trust in law enforcement.”[8] This technological and ideological marriage clearly poses discriminatory risks in a nation where a Black person is already far more likely to be stopped without just cause than their white counterparts.[9]

Police agencies are bullish about the technology. Police Chief Magazine, the official publication of the International Association of Chiefs of Police, paints these techniques in a more favorable light, stating, “[o]ne of the most promising applications of AI in law enforcement is predictive policing. . . . Predictive policing empowers law enforcement to predict potential crime hotspots, ultimately aiding in crime prevention and public safety.”[10] In this space, facial recognition software is gaining traction among law enforcement agencies as a powerful tool for identifying suspects and enhancing public safety. Clearview AI, for example, stresses that its product “[helps] law enforcement and governments in disrupting and solving crime.”[11]

Predictive policing methods enhanced by AI technology show no signs of slowing down.[12] The obvious advantages of these systems cannot be ignored: they allow agencies to better allocate resources and manage their staff. However, as law enforcement agencies adopt these technologies, it is important to remain vigilant in holding them accountable for the ethical implications and biases embedded within their systems. A comprehensive framework for accountability and transparency, similar to the European Union’s guidelines, must be established to ensure that deploying predictive policing and AI tools does not come at the expense of marginalized communities.[13]

Notes

[1] Andrew Guthrie Ferguson, Predictive Policing and Reasonable Suspicion, 62 Emory L.J. 259, 265-67 (2012).

[2] Eric M. Baker, I’ve Got My AI on You: Artificial Intelligence in the Law Enforcement Domain 47 (Mar. 2021) (master’s thesis).

[3] Id. at 48.

[4] Id. at 49 (citing Walt L. Perry et al., Predictive Policing: The Role of Crime Forecasting in Law Enforcement Operations, RR-233-NIJ (Santa Monica, CA: RAND, 2013), 4, https://www.rand.org/content/dam/rand/pubs/research_reports/RR200/RR233/RAND_RR233.pdf).

[5] Commission Regulation 2024/1689 of the European Parliament and of the Council of 13 June 2024 laying down harmonized rules on artificial intelligence and amending Regulations (Artificial Intelligence Act), 2024 O.J. (L 1689) 1.

[6] Lukas Arnold, How the European Union’s AI Act Provides Insufficient Protection Against Police Discrimination, Penn. J. L. & Soc. Change (May 14, 2024), https://www.law.upenn.edu/live/news/16742-how-the-european-unions-ai-act-provides#_ftn1.

[7] See Margaret Hu, Algorithmic Jim Crow, 86 Fordham L. Rev. 633, 664 (2017), https://ir.lawnet.fordham.edu/cgi/viewcontent.cgi?article=5445&context=flr (“Database screening and digital watchlisting systems, in fact, can serve as complementary and facially colorblind supplements to mass incarcerations systems. The purported colorblindness of mandatory sentencing… parallels the purported colorblindness of mandatory database screening and vetting systems.”).

[8] NAACP, Issue Brief: The Use of Artificial Intelligence in Predictive Policing, https://naacp.org/resources/artificial-intelligence-predictive-policing-issue-brief (last visited Nov. 2, 2024).

[9] Will Douglas Heaven, Predictive Policing Algorithms Are Racist. They Need to Be Dismantled, MIT Tech. Rev. (July 17, 2020), https://www.technologyreview.com/2020/07/17/1005396/predictive-policing-algorithms-racist-dismantled-machine-learning-bias-criminal-justice/ (citing OJJDP Statistical Briefing Book, Estimated Number of Arrests by Offense and Race, 2020 (released July 8, 2022), https://ojjdp.ojp.gov/statistical-briefing-book/crime/faqs/ucr_table_2).

[10] See The Police Chief, Int’l Ass’n of Chiefs of Police, https://www.policechiefmagazine.org (last visited Nov. 2, 2024); Brandon Epstein, James Emerson & ChatGPT, Navigating the Future of Policing: Artificial Intelligence (AI) Use, Pitfalls, and Considerations for Executives, Police Chief Online (Apr. 3, 2024).

[11] Clearview AI, https://www.clearview.ai/ (last visited Nov. 3, 2024).

[12] But see Nicholas Ibarra, Santa Cruz Becomes First US City to Approve Ban on Predictive Policing, Santa Cruz Sentinel (June 23, 2020), https://evidentchange.org/newsroom/news-of-interest/santa-cruz-becomes-first-us-city-approve-ban-predictive-policing/.

[13] See also Roy Maurer, New York City to Require Bias Audits of AI-Type HR Technology, Society for Human Resource Management (Dec. 19, 2021), https://www.shrm.org/topics-tools/news/technology/new-york-city-to-require-bias-audits-ai-type-hr-technology.


The StingRay You’ve Never Heard Of: How One of the Most Effective Tools in Law Enforcement Operates Behind a Veil of Secrecy

Dan O’Dea, MJLST Staffer

One of the most effective investigatory tools in law enforcement has operated behind a veil of secrecy for over 15 years. “StingRay” cell phone tower simulators are used by law enforcement agencies to locate and apprehend violent offenders, track persons of interest, monitor crowds when intelligence suggests threats, and intercept signals that could activate devices. When operating passively, StingRays mimic cell phone towers, forcing all nearby cell phones to connect to them while extracting data in the form of call metadata, text messages, internet traffic, and location information, even when a connected phone is not in use. They can also inject spying software into phones and prevent phones from accessing cellular data. StingRays were initially used overseas by federal law enforcement agencies to combat terrorism before spreading into the hands of the Department of Justice and Department of Homeland Security, and they are now actively used by local law enforcement agencies in 27 states to solve everything from missing persons cases to thefts of chicken wings.

The use of StingRay devices is highly controversial due to their intrusive nature. Not only does the use of StingRays raise privacy concerns, but tricking phones into connecting to tower-mimicking StingRays prevents them from accessing legitimate cell service towers, which can obstruct access to 911 and other emergency hotlines. Perplexingly, the use of StingRay technology by law enforcement is almost entirely unregulated. Local law enforcement agencies frequently cite secrecy agreements with the FBI and the need to protect an investigatory tool as a means of denying the public information about how StingRays operate, and criminal defense attorneys have almost no means of challenging their use without this information. While the Department of Justice now requires federal agents to obtain a warrant to use StingRay technology in criminal cases, an exception is made for matters relating to national security, and the technology may have been used to spy on racial-justice protestors during the summer of 2020 under this exception. Local law enforcement agencies are almost completely unrestricted in their use of StingRays and may even conceal their use in criminal prosecutions by tagging their findings as those of a “confidential source” rather than admitting the use of a controversial investigatory tool. Doing so allows prosecutors to avoid battling Fourth Amendment arguments characterizing data obtained by StingRays as unlawful searches and seizures.

After the technology had existed in a “legal no-man’s land” since its inception, Senator Ron Wyden (D-OR) and Representative Ted Lieu (D-CA) sought to put an end to the secrecy of StingRays by introducing the Cell-Site Simulator Warrant Act of 2021 in June of that year. The bill would have mandated that law enforcement agencies obtain a warrant to investigate criminal activity before deploying StingRay technology, while also requiring law enforcement agencies to delete the data of phones other than those of investigative targets. Further, the legislation would have required agencies to demonstrate a need to use StingRay technology that outweighs any potential harm to the community impacted by the technology. Finally, the bill would have limited authorized use of StingRay technology to the minimum amount of time necessary to conduct an investigation. However, the Cell-Site Simulator Warrant Act of 2021 appears to have died in committee after failing to garner significant legislative support.

Ultimately, no device with the intrusive capabilities of StingRays should be allowed to operate free from the constraints of regulation. While StingRays are among the most effective tools utilized by law enforcement, they are also among the most intrusive into the privacy of the general public. It logically follows that agencies seeking to operate StingRays should be required to make a showing of a need to utilize such an intrusive investigatory tool. In certain situations, it may be easy to establish the need to deploy a StingRay, such as doing so to further the investigation of a missing persons case. In others, law enforcement agencies would correctly find their hands tied should they wish to utilize a StingRay to catch a chicken wing thief.


Mind Over Matter: Needed Changes to the Use of Hypnosis in the Criminal Justice System

Jordan Hughes, MJLST Staffer

When most people think of hypnosis today, they imagine stage-show demonstrations and overdramatized mind tricks. Perhaps they picture people lined up, making ridiculous noises and actions seemingly without control of their own bodies at the behest of an entertainer. Despite such popular images, hypnosis has a wide range of psychological and medical applicability outside of entertainment. Trained professionals have found hypnotherapy useful as a tool to treat pain, depression, phobias, habit disorders, skin conditions, and many other psychological and medical problems. Clinical researchers lament that the public expectations of hypnosis, built up by its use for entertainment and its dramatization in media, make it more difficult to take advantage of a psychological tool that people throughout society could be benefitting from.

One group of people was quick to accept and explore the untapped potential of hypnosis in their work: criminal investigators. In the 1950s, the now partially declassified MKUltra program began conducting hypnosis experiments on mental health patients, including experiments “hypnotically increasing ability to observe and recall a complex arrangement of physical objects.” This practice was generally considered “experimental” until a highly publicized case in 1976. A bus driver and 26 children were abducted and buried alive; after the driver escaped, a hypnotist helped him accurately recall the license-plate numbers on the vans used in the abduction, leading to the apprehension of all three kidnappers. After this case, police departments across the country began using forensic hypnosis as a part of investigations.

Since the 70s and 80s, the scientific validity of forensic hypnosis has been called into question. Studies have revealed that hypnotically recovered memories may be inaccurate, incomplete, or based on a leading suggestion. False memories introduced through hypnosis can be “hardened,” so that subjects cannot distinguish them from genuine memories. Courts have been split on the admissibility of hypnotically enhanced testimony at trial, and are becoming increasingly wary of its use. See Sims v. Hayette, 914 F.3d 1078, 1090 (7th Cir. 2019) (“The concealed hypnosis . . . calls into question everything [the hypnotized witness] said at trial.”).

Despite these hesitations and the scientific backlash, the Department of Justice maintains that there is a use for hypnosis in criminal investigations. According to the DOJ Criminal Resource Manual, while hypnosis should only be used “on rare occasions” and recalled memories should be corroborated, forensic hypnosis is considered an aid that investigators may employ. The DOJ states that hypnosis may be used where there is a “clear need for additional information,” and where hypnosis “can be useful” in aiding a witness’s memory.

Hypnotherapy, as described above, has been found useful in other contexts. And many of those contexts could be of help in the world of criminal justice. The things that make hypnosis dangerous for establishing facts in a courtroom—a subject’s openness to suggestion and confidence that the hypnosis will work—make the practice valuable in clinical settings.

In the clinical world, the field of hypnotherapy was pioneered by Milton H. Erickson, who founded the American Society for Clinical Hypnosis in 1957. Hypnotherapy has since been found effective as a tool for overcoming narcotic addictions, managing pain, fighting depression, and curing all kinds of anxieties and phobias. Hypnotherapy has also shown promise in helping survivors of domestic and sexual abuse overcome complex PTSD, helping adults to overcome childhood traumas, and providing a means to deal with traumatic grief. Different people are receptive to different types of hypnotic intervention, and trained hypnotherapists are able to tailor their interventions to the individual patient.

Addictions, pain, anxiety and depression, PTSD and other forms of trauma . . . all of these are conditions that are known to influence criminal behavior. A criminal justice system focused on prevention of crime would employ hypnotherapy with a public health approach, exploring the potential of hypnotic interventions to help people mold the physical and psychological conditions that can lead to criminal activity. Instead of featuring it in the DOJ Criminal Resource Manual as an investigation technique, we should be seeing hypnotherapy embraced by the Bureau of Prisons, probation officers, and case managers as a means of creating “correctional facilities” that live up to their name. Unfortunately, the will to explore this tool as a curative measure has not found its way to the prison system.

The problems with where hypnosis is used in the criminal justice system underscore a broader systemic issue. There is an overemphasis in the system on using innovative techniques to catch criminals. Whether a behavioral science that promises to “unlock” memories, or a piece of military tech that allows for dragnet-style spying on unsuspecting civilians, zealous investigators are often keen to employ novel tools to get ahead of the suspects they are after. This comes at the expense of innocent civilians, whose constitutional and natural rights are inevitably contravened.

By and large, this desire for innovation has not crept into the world of those focused on helping to rehabilitate past convicts. In one nine-year study, 83% of the state prisoners released were rearrested for committing new crimes. Arrest data tells us that over two-thirds of state drug offenders are rearrested within five years of their release. 24% of sex offenders commit another sex crime within fifteen years of release, and a much higher percentage of sex offenders are estimated to recidivate by committing non-sexual crimes that are nonetheless sexually motivated. These high rearrest rates are part of why America has the largest per-capita prison population of any country in the world.

But it does not have to be that way. Hypnotherapy is one of many techniques that, with investment and proper oversight, could prove essential to curing drug addictions and affecting long-term behavioral change. Federal courts in Minnesota have already created a unique one-on-one mentorship program to help rehabilitate offenders as they reenter society. An investment in this and similar programs, and a commitment to developing novel ways of helping people avoid criminal activity, could be the fundamental change that we need in order to see a criminal justice system that does more protecting of our society than punishing it.


Inconceivable! How the Fourth Amendment Failed the Dread Pirate Roberts in United States v. Ulbricht

Emily Moss, MJLST Staffer

It is not an overstatement to claim that electronic devices, such as laptops and smartphones, have “altered the way we live.” As Chief Justice Roberts stated, “modern cell phones . . . are now such a pervasive and insistent part of daily life that the proverbial visitor from Mars might conclude they were an important feature of human anatomy.” Riley v. California, 573 U.S. 373, 385 (2014). These devices create new digital records of our everyday lives. United States v. Ulbricht, 858 F.3d 71 (2d Cir. 2017) is one of many cases that grapple with when the government should gain access to these records.

In February 2015, a jury found Ross William Ulbricht (aka “Dread Pirate Roberts” or “DPR”) guilty on seven counts related to his creation and operation of Silk Road. United States v. Ulbricht, 858 F.3d 71, 82 (2d Cir. 2017). Silk Road was an online criminal marketplace where, using the anonymous currency Bitcoin, “users principally bought and sold drugs, false identification documents, and computer hacking software.” Id. Government trial evidence showed that, hoping to protect Silk Road anonymity, DPR commissioned the murders of five people. Id. at 88. However, there is no evidence that the murders actually transpired. Id.

On appeal, the Second Circuit upheld both the conviction and Ulbricht’s two life sentences. Ulbricht, 858 F.3d at 82. Ulbricht argued, inter alia, that “the warrant[] authorizing the government to search his laptop . . . violated the Fourth Amendment’s particularity requirement.” Id. at 95. The warrant authorized “opening or ‘cursorily reading the first few’ pages of files to ‘determine their precise contents,’ searching for deliberately hidden files, using ‘key word searches through all electronic storage areas,’ and reviewing file ‘directories’ to determine what was relevant.” Id. at 101–02. Ulbricht claimed that the warrant violated the Fourth Amendment’s particularity requirement because it “failed to specify the search terms and protocols” that the government was required to employ while searching Ulbricht’s laptop. Id. at 102.

The court acknowledged that particularity is especially important when the warrant authorizes the search of electronic data, as the search of a computer can expose “a vast trove of personal information” including “sensitive records.” Id. at 99. It noted that “a general search of electronic data is an especially potent threat to privacy because hard drives and e-mail accounts may be ‘akin to a residence in terms of the scope and quantity of private information [they] may contain’ . . . Because of the nature of digital storage, it is not always feasible to ‘extract and segregate responsive data from non-responsive data,’. . . creating a ‘serious risk that every warrant for electronic information will become, in effect, a general warrant.’” Id. (internal citations omitted).

Nonetheless, the court rejected Ulbricht’s claim that the laptop warrant failed to meet the Fourth Amendment’s particularity requirement. It reasoned that it would be impossible to identify how relevant files would be named before the laptop search began, which the government reasonably anticipated when requesting the laptop warrant. Id. at 102 (emphasizing examples where relevant files and folders had misleading names such as “aliaces” or “mbsobzvkhwx4hmjt”). Further, the court held that broad search protocols were appropriate given that the alleged crime involved sophisticated technology and masking identity. Id. Ultimately, the court emphasized that the “fundamental flaw” in Ulbricht’s argument was that it equated a broad warrant with a violation of the particularity requirement. Id. Using the analogy of searching an entire home where there is probable cause to believe that there is relevant evidence somewhere in the home, the court illustrated that a warrant can be both broad and still satisfy the particularity requirement. Id. (citing U.S. Postal Serv. v. C.E.C. Servs., 869 F.2d 184, 187 (2d Cir. 1989)). The court therefore upheld the constitutionality of the warrant, and the Supreme Court denied Ulbricht’s petition for a writ of certiorari.

Orin Kerr’s equilibrium-adjustment theory of the Fourth Amendment argues that as new tools create imbalanced power on either the side of privacy or the side of law enforcement, the Fourth Amendment must adjust to restore its original balance. The introduction of computers and the internet created an immense change in the tools that both criminals and law enforcement use. Without minimizing the significance of Ulbricht’s crimes, United States v. Ulbricht illustrates this dramatic change. While computers and the internet did create new avenues for crime, computer and internet searches—such as the ones employed by the government—do far more to disrupt the Fourth Amendment’s balance.

Contrary to the court’s argument in Ulbricht, searching a computer is entirely unlike searching a home. First, it is easy to remove items from your home, but the same is not true of computers. Even deleted files often linger on computers where the government can access them. Similarly, when law enforcement finds a file in someone’s home, it still does not know how that file was used, how often it has been viewed, or who has viewed it. But computers do store such information. These and many other differences demonstrate why particularity, in the context of computer searches, is even more important than the court in Ulbricht acknowledged. Given the immense amount of information available on an individual’s electronic devices, Ulbricht glosses over the implications for personal privacy posed by broad search warrants directed at computers. And with the rapidly changing nature of computer technology, the Fourth Amendment balance will likely continue to stray further from equilibrium at a speed with which the courts will struggle to keep up.
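
The point about usage records is concrete: mainstream filesystems stamp every file with timestamps recording when it was modified and accessed, metadata no filing cabinet keeps. A minimal Python illustration, with a placeholder file name:

```python
from datetime import datetime
from pathlib import Path

path = Path("journal.txt")        # placeholder file for illustration
path.write_text("weekend plans")  # ensure the file exists

st = path.stat()
print("last modified:", datetime.fromtimestamp(st.st_mtime))
print("last accessed:", datetime.fromtimestamp(st.st_atime))  # some systems update this lazily
print("metadata changed:", datetime.fromtimestamp(st.st_ctime))
```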

Thus, adjusting the Fourth Amendment power balance related to electronic data will continue to be an important and complicated issue. See, e.g., Proposal 2 (Mich. 2020) (amending the state’s constitution “to require a search warrant to access a person’s electronic data or electronic communications,” passing with unanimous Michigan Senate and House of Representatives approval, then with 88.8% of voters voting yes on the proposal); People v. Coke, 461 P.3d 508, 516 (Colo. 2020) (“Given modern cell phones’ immense storage capacities and ability to collect and store many distinct types of data in one place, this court has recognized that cell phones ‘hold for many Americans the privacies of life’ and are, therefore, entitled to special protections from searches.”) (internal citations omitted). The Supreme Court has ruled on a number of Fourth Amendment and electronic data cases. See, e.g., Carpenter v. United States, 138 S. Ct. 2206 (2018) (warrantless attainment of cell-site records violates the Fourth Amendment); Riley v. California, 134 S. Ct. 2473 (2014) (warrantless search and seizure of the digital contents of a cell phone during an arrest violates the Fourth Amendment). However, new issues seem to appear faster than they can be resolved. See, e.g., Nathan Freed Wessler, Jennifer Stisa Granick & Daniela del Rosario Wertheimer, Our Cars Are Now Roving Computers. Is the Fourth Amendment Ready?, ACLU (May 21, 2019, 3:00 PM), https://www.aclu.org/blog/privacy-technology/surveillance-technologies/our-cars-are-now-roving-computers-fourth-amendment. The Fourth Amendment therefore finds itself in eel-infested waters. Is rescue inconceivable?

Special thanks to Professor Rozenshtein for introducing me to Ulbricht and inspiring this blog post in his course Cybersecurity Law and Policy!


The “Circuit Split” That Wasn’t

Sam Sylvan, MJLST Staffer

Earlier this year, the Fourth Circuit punted on an opportunity to determine the constitutional “boundaries of the private search doctrine in the context of electronic searches.” United States v. Fall, 955 F.3d 363, 371 (4th Cir. 2020). The private search doctrine, crafted by the Supreme Court in the 80’s, falls under the Fourth Amendment’s umbrella. The doctrine makes it lawful for law enforcement to “search” something that was initially “searched” by a private third party, because the Fourth Amendment is “wholly inapplicable to a search or seizure, even an unreasonable one, effected by a private individual not acting as an agent of the Government or with the participation or knowledge of any government official.” United States v. Jacobsen, 466 U.S. 109, 113 (1984).

An illustration: Jane stumbles upon incriminating evidence on John’s laptop that implicates John in criminal activity (the “initial private search”), Jane shows the police what she found on the laptop (the “after-occurring” search), and the rest is history for John. But for law enforcement’s after-occurring search to avoid violating the Fourth Amendment, its search must not exceed the scope of the initial private search. “The critical measures [to determine] whether a governmental search exceeds the scope of the private search that preceded it,” United States v. Lichtenberger, 786 F.3d 478, 485 (6th Cir. 2015), include whether “there was a virtual certainty that nothing else of significance was in the [property subjected to the search]” and whether the government’s search “would not tell [law enforcement] anything more than [it] already had been told” or shown by the private searcher. Jacobsen, 466 U.S. at 119.

Of course, the Supreme Court’s holdings from the 80’s that speak to the scope of the Fourth Amendment are often difficult to reconcile with modern-day Fourth Amendment fact patterns that revolve around law enforcement searches of modern electronic devices (laptops; smartphones; etc.). In the key Supreme Court private search doctrine case, Jacobsen (1984), the issue was the constitutionality of a DEA agent’s after-occurring search of a package after a FedEx employee partially opened the package (upon noticing that it was damaged) and saw a white powdery substance.

Since the turn of the millennium, courts of appeals have stretched to apply Jacobsen to rule on the private search doctrine’s application to, and scope of, law enforcement searches of electronics. In 2001, the Fifth Circuit addressed the private search doctrine in a case where the defendant’s estranged wife took a bunch of floppy disks, CDs, and zip disks from the defendant’s property. She and her friend then discovered evidence of defendant’s criminal activity on those disks while searching some of them and turned the collection over to the police, which led to the defendant’s conviction. United States v. Runyan, 275 F.3d 449 (5th Cir. 2001).

There are two crucial holdings in Runyan regarding the private search doctrine. First, the court held that “the police exceeded the scope of the private search when they examined the entire collection of ‘containers’ (i.e., the disks) turned over by the private searchers, rather than confining their [warrantless] search to the selected containers [that were actually] examined by the private searchers.” Id. at 462. Second, the court held that the “police search [did not] exceed[] the scope of the private search when the police examine[d] more items within a particular container [i.e., a particular disk] than did the private searchers” who searched some part of the particular disk but not its entire contents. Id. at 461, 464. Notably absent from this case: a laptop or smartphone.

Eleven years after Runyan, the Seventh Circuit held that the police did not exceed the scope of the private searches conducted by a victim and her mother. Rann v. Atchison, 689 F.3d 832 (7th Cir. 2012) (relying heavily on Runyan). In Rann, the police’s after-occurring search included viewing images (on the one memory card brought to them by the victim and the one zip drive brought to them by the victim’s mother) that the private searchers themselves had not viewed. Id. Likening computer storage disks to containers (as the Runyan court did), the Rann court concluded “that a search of any material on a computer disk is valid if the private [searcher] viewed at least one file on the disk.” Id. at 836 (emphasis added). But notably absent from this case like Runyan: a laptop or smartphone.

Two years after Rann, the Supreme Court decided Riley v. California—a landmark case where the Court unanimously held that the warrantless search of a cellphone during an arrest was unconstitutional. Specific reasoning from the Riley Court is noteworthy insofar as assessing the Fourth Amendment’s (and, in turn, the private search doctrine’s) application to smartphones and laptops. The Court stated:

[W]e generally determine whether to exempt a given type of search from the warrant requirement by assessing, on the one hand, the degree to which it intrudes upon an individual’s privacy and, on the other, the degree to which it is needed for the promotion of legitimate governmental interests. . . . [Smartphones] are in fact minicomputers that also happen to have the capacity to be used as a telephone. They could just as easily be called cameras, video players, rolodexes, calendars, tape recorders, libraries, diaries, albums, televisions, maps, or newspapers. One of the most notable distinguishing features of [smartphones] is their immense storage capacity.

573 U.S. 373, 385, 393 (2014). Riley makes crystal clear that when the property at issue is a laptop or smartphone, the balance between a person’s privacy interests and the governmental interests tips heavily in favor of the individual’s privacy interests. In simpler terms, law enforcement needs a warrant to search a laptop or smartphone unless it has an extremely compelling reason for failing to comply with the Fourth Amendment’s warrant requirement.

One year after Riley, the Sixth and Eleventh Circuits—armed with Riley’s insights regarding modern electronic devices—decided Lichtenberger and United States v. Sparks, respectively. The two Circuits held that in both cases the police, in conducting their after-occurring warrantless searches of a laptop (Lichtenberger) and a smartphone (Sparks), exceeded the scope of the initial private searches, reaching these conclusions in large part due to Riley. In Lichtenberger, the police exceeded the scope of the initial private search when, without a warrant, they looked at photographs on the laptop that the private searcher had not looked at, despite the private searcher’s initial viewing of other photographs on the laptop. 786 F.3d 478 (6th Cir. 2015). In Sparks, the police exceeded the scope of the initial private search when, without a warrant, they viewed a video within the same album on the smartphone that the private searcher had scrolled through but which the private searcher did not actually view. 806 F.3d 1323 (11th Cir. 2015), overruled on other grounds by United States v. Ross, 963 F.3d 1056 (11th Cir. 2020) (overruling Sparks “to the extent that [Sparks] holds that [property] abandonment implicates Article III standing”).

At first glance, Lichtenberger and Sparks seem irreconcilable with Runyan and Rann, leading many commentators to conclude there is a circuit split regarding the private search doctrine: the “container” approach versus the “file”/“narrow” approach. But I disagree. And there is a rather simple explanation for reaching this conclusion—Riley merely heightened Jacobsen’s “virtual certainty” requirement in determining whether law enforcement exceed the scope of initial private searches of laptops and smartphones. In other words, “virtual certainty” is significantly elevated in the context of smartphones and laptops because of the heightened privacy interests at stake stemming from their immense storage capacities and unique qualities—i.e., they contain information and data about all aspects of our lives to a much greater extent than floppy disks, CDs, zip drives, and camera memory cards. Thus, the only apparent sure way for law enforcement to satisfy the private search doctrine’s “virtual certainty” requirement when a laptop or smartphone is involved (and thereby avoid inviting defendants to invoke the exclusionary rule) is to view exactly what the private searcher viewed.

In contrast, the “virtual certainty” requirement in the context of old school floppy disks, CDs, zip drives, and memory cards is quite simply a lower standard of certainty because the balance between privacy interests and legitimate governmental interests is not tipped heavily in favor of privacy interests.

While floppy disks, CDs, and zip drives somewhat resemble “containers,” such as the package in Jacobsen, smartphones and laptops are entirely different Fourth Amendment beasts. Accordingly, the four cases should all be analyzed through the lens that the particular electronic device at issue in each case is most significant because it guides the determination of whether the after-occurring search fell within the scope of the initial private search. Looking at the case law this way makes it so that it is not the container approach versus the file approach. Rather, it is (justifiably) the container approach for certain older electronic storage devices and the file approach for modern electronic devices that implicate weightier privacy concerns.


Hailstorms in Baltimore: The Fourth Circuit’s Opportunity to Create Oversight and Accountability for a Secretive Police Technology

Jordan Hughes, MJLST Staffer

The past several months have once again shone a spotlight on the difficulty of holding police and law enforcement accountable for their actions. The American public has become more aware than ever of the unions and structures in place to shield officers from liability. Despite years of DOJ investigations and investigative reporting into the procedures of departments around the country, many regular police practices remain hidden from the public eye, including the use of secretive new technologies that allow for unprecedented levels of discretion and unprecedented potential for abuse.

The Hailstorm is one such dragnet-style electronic capture device that over 85 federal and state enforcement agencies have used largely in secret for more than two decades. This past spring, the 4th Circuit joined the fledgling ranks of federal courts asked to grapple with constitutional questions raised by the elusive technology. Baltimore police used a Hailstorm in 2014 to locate Kerron Andrews, who had an outstanding arrest warrant. Andrews v. Balt. City Police Dep’t, No. CCB-16-2010, 2018 U.S. Dist. LEXIS 129523, at *4 (D. Md. Aug. 1, 2018). The device enabled Baltimore police to pinpoint the apartment building where Andrews was sitting, despite having been unable to find him using standard location information released to them by his phone carrier. The police never disclosed the device during their surveillance, citing instead a “pen register order” as authorization for its use. A Maryland state court held that the government violated Andrews’ Fourth Amendment rights through use of the Hailstorm, and a state appellate court upheld that decision. Andrews then sued the police department in a federal district court, but the federal court considered the search constitutional and granted summary judgment against him. Andrews appealed.

The 4th Circuit, in Andrews v. Balt. City Police Dep’t, No. 18-1953, 2020 U.S. App. LEXIS 9641 (4th Cir. Mar. 27, 2020), both acknowledged the serious constitutional questions at stake and declined to make a ruling on them due to a lack of information. The district court was directed on remand to make findings concerning the Baltimore Police Department’s practice regarding Hailstorm technology, as well as the extent of constitutional intrusions involved in the search. Whatever the outcome, the 4th Circuit is likely to hear this case again. When it finally does, the court will have to decide how to apply the Fourth Amendment to a technology that may be fully incompatible with the freedom from broad and general searches that it typically guarantees.

What is a Hailstorm?

The “Hailstorm” is a model of “cell site simulator” technology sold by Harris Corporation. Other commonly used Harris models include the “StingRay,” “TriggerFish,” and “KingFish.” Generically, these devices are known as international mobile subscriber identity (“IMSI”) catchers.

IMSI catchers essentially mimic a wireless carrier’s base station, causing cell phones to communicate their unique identifiers and location data to the device even when they’re not in use. They function as a dragnet, capturing the unique numerical identifiers of all wireless devices within a particular area. The technology provides both identification and location data for devices. It is precise enough for law enforcement to narrow a device’s location to six feet, and to identify the exact unit a device is in from outside a large apartment complex. IMSI catchers are also capable of capturing the contents of communications, although there has not been a disclosed instance yet of law enforcement using an IMSI catcher in this fashion. IMSI catchers are small, and can easily be handheld or mounted on vehicles or drones.
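
The dragnet behavior follows from a basic rule of cellular protocols: a handset attaches to whatever base station presents the strongest signal, and it discloses its identifier when it attaches. The Python sketch below is a toy simulation of that attachment logic, not radio code; the signal values and IMSIs are fabricated for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class BaseStation:
    name: str
    signal_strength: float  # simplified scalar; real RF behavior is far messier
    log: list = field(default_factory=list)

@dataclass
class Phone:
    imsi: str  # the unique subscriber identifier an IMSI catcher harvests

    def attach(self, stations):
        # Handsets camp on the strongest visible cell; they cannot tell
        # a legitimate tower from a simulator.
        best = max(stations, key=lambda s: s.signal_strength)
        best.log.append(self.imsi)  # identifier disclosed on attachment
        return best.name

carrier = BaseStation("carrier tower", signal_strength=0.6)
hailstorm = BaseStation("cell-site simulator", signal_strength=0.9)

phones = [Phone(f"310150123456{i:03d}") for i in range(5)]  # fabricated IMSIs
for p in phones:
    p.attach([carrier, hailstorm])

print(hailstorm.log)  # every phone in range: suspects and bystanders alike
```

Because the rule is "strongest signal wins," the simulator does not need to target anyone; capturing everyone is the default, which is precisely the general-search concern discussed below.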

What is the concern?

The Hailstorm raises a number of concerns under the Fourth Amendment—the constitutional provision meant to protect Americans from unreasonable searches and seizures. The ACLU, in a 2014 guide for defense attorneys, outlined the major Fourth Amendment questions that arise with the use of any IMSI catcher. These include:

  1. Level of scrutiny: IMSI catchers are almost certainly intrusive enough to violate both reasonable expectations of privacy and property interests, thus giving rise to Fourth Amendment scrutiny. When used in connection with a residence, the devices reveal critical details about the inside of the property, which constitutes a search under any framework. While the Supreme Court has held that there is no reasonable expectation of privacy in outgoing phone numbers voluntarily sent to a third party, that analysis likely cannot extend to data that is redirected and captured by a Hailstorm without the phone owner’s knowledge or consent.
  2. General search: There is an argument that any search conducted by an IMSI catcher constitutes a general search and thus should be categorically barred by the Fourth Amendment. An IMSI catcher indiscriminately gathers all signaling information from a captured phone, which seems incompatible with the constitutional requirement that surveillance minimize the collection of information unsupported by probable cause. Further, the dragnet functionality performs this collection on every device in the vicinity, including those of innocent third parties whom the government lacks probable cause to search.
  3. Inaccurate warrants: When law enforcement does apply for a warrant to use an IMSI catcher, those warrants are very likely inaccurate. Warrant applications, driven by federal policies of non-disclosure, typically (a) omit the fact that the government intends to use an IMSI catcher, (b) mislead the court by saying the government intends to use less intrusive devices (like a pen register) instead, or (c) fail to provide any information on what the technology is and how it operates. In each scenario, the warrant is predicated on a material omission that deprives a court of its constitutional obligation to balance government interests against intrusions into private rights.
  4. Invalid warrants: If a warrant accurately states law enforcement’s intended use of an IMSI catcher, it may be facially invalid due to the necessarily general nature of the search. The entire purpose of the warrant requirement is to force law enforcement to state with particularity the area to be searched and the persons or things to be seized. It remains an open question whether warrant particularity requirements can ever be compatible with intrusive dragnet surveillance technologies.

A separate and perhaps more troubling concern is the extreme lengths to which the government has gone, as only recently uncovered, to keep this technology secret. The federal government uses extensive non-disclosure agreements to prevent federal, state, and even local law enforcement from disclosing any details on the capabilities and usage of IMSI catchers. In a few instances, judges have demanded that police disclose possible use of an IMSI catcher at trial. Prosecutors in these instances have dropped the evidence, offered plea bargains without jail time, or voluntarily dismissed the case altogether rather than disclose the device’s usage. Law enforcement agents have also demonstrated a willingness to offer alternative explanations for evidence obtained by an IMSI catcher. In one case where the FBI used a StingRay, for example, a discovered email from a special agent read: “we need to develop independent probable cause for the search warrant . . . FBI does not want to disclose the [redacted] (understandably so).”

IMSI catchers in the courts so far

The first reported decision dealing with an IMSI catcher came in 1995. In re United States, 885 F. Supp. 197 (C.D. Cal. 1995). The court, which had difficulty applying existing law to the new surveillance technology, demanded that law enforcement develop stronger safeguards before permitting its use. Since 1995, nationwide police practices of avoiding disclosure of the devices have largely shielded them from the view of courts. More recent orders from even the most tech-savvy magistrate judges suggest that judicial officers across the country still have little exposure to or understanding of IMSI technology. That lack of exposure and understanding is critical to sustaining the law enforcement practice of applying for approval to use a “pen register” device instead.

Among the courts that have faced the question of IMSI catcher use, several, including the 7th Circuit in 2016, have declined to answer questions concerning the devices’ constitutionality. United States v. Patrick, 842 F.3d 540 (7th Cir. 2016). In her dissent, Chief Judge Wood described the avoidance strategies of law enforcement as “bad faith” that could justify suppression, and closed by writing that “it is time for the Sting[R]ay to come out of the shadows, so that it can be subject to the same kind of scrutiny as other mechanisms.”

The 7th Circuit ultimately did revisit the question of StingRays in Sanchez-Jara in 2018. United States v. Sanchez-Jara, 889 F.3d 418 (7th Cir. 2018). The court rejected the “general search” argument and upheld a warrant that referred generally to “electronic investigative techniques” without specifying the use of IMSI catcher technology. The other federal circuits have yet to reach a decision on the issue.

Andrews v. Balt. City Police Dep’t will almost certainly appear before the 4th Circuit again. While the question in that case deals with whether a pen register application can cover use of a Hailstorm device, deeper questions surrounding the constitutionality of a Hailstorm search underlie every aspect of the litigation. The court will be faced with a police department that has a history of abusing its discretion and that has shielded the courts from its use of IMSI catchers for years, at a moment of increased public scrutiny of police practices and procedures. The 4th Circuit thus has a unique opportunity to impose greater accountability on law enforcement and to change the trajectory of police surveillance strategies for years to come.


Hacking the Circuit Split: Case Asks Supreme Court to Clarify the CFAA

Kate Averwater, MJLST Staffer

How far would you go to make sure your friend’s love interest isn’t an undercover cop? Would you run an easy search on your work computer? Unfortunately for Nathan Van Buren, his friend was part of an FBI sting operation and his conduct earned him a felony conviction under the Computer Fraud and Abuse Act (CFAA), 18 U.S.C. § 1030.

Van Buren, formerly a police sergeant in Georgia, knew Andrew Albo from Albo’s previous brushes with law enforcement. Albo, who had turned informant for the FBI, asked Van Buren to run the license plate number of a dancer, claiming he was interested in her and wanted to make sure she wasn’t an undercover cop. Trying to better his financial situation, Van Buren told Albo he needed money. Albo gave him a fake license plate number and $6,000, and Van Buren ran the fake number through the Georgia Crime Information Center (GCIC) database. Albo recorded their interactions, and the trial court convicted Van Buren of honest-services wire fraud (18 U.S.C. §§ 1343, 1346) and felony computer fraud under the CFAA.

Van Buren appealed and the Eleventh Circuit vacated and remanded the honest-services wire fraud conviction but upheld the felony computer fraud conviction. His case is currently on petition for review before the Supreme Court.

The relevant portion of the CFAA criminalizes obtaining “information from any protected computer” by “intentionally access[ing] a computer without authorization or exceed[ing] authorized access.” Van Buren’s defense was that he had authorized access to the information, though he admitted that he used it for an improper purpose. This disagreement over access restrictions versus use restrictions is the crux of the circuit split, and Van Buren’s petition emphasizes the need for the Supreme Court to resolve these discrepancies.

Most favorable to Van Buren is the Ninth Circuit’s reading of the CFAA. The court previously held that the CFAA did not criminalize abusing authorized access for impermissible purposes. Recently, the Ninth Circuit reaffirmed this interpretation. The Second and Fourth Circuits align with the Ninth in interpreting the CFAA narrowly, declining to criminalize conduct similar to Van Buren’s.

In affirming his conviction, the Eleventh Circuit rested on its previous decision in Rodriguez, which adopted a much broader reading of the CFAA. The First, Fifth, and Seventh Circuits join the Eleventh in interpreting the CFAA to cover inappropriate use.

Van Buren’s case has sparked a bit of controversy and prompted multiple organizations to file amicus briefs. They are pushing the Supreme Court to interpret the CFAA in a narrow way that does not criminalize common activities. Broad readings of the CFAA lead to criticism of the law as “a tool ripe for abuse.”

Whether or not the Supreme Court agrees to hear the case, next time someone offers you $6,000 to do a quick search on your work computer, say no.


United States v. Microsoft Corp.: A Chance for SCOTUS to Address the Scope of the Stored Communications Act

Maya Digre, MJLST Staffer

On October 16, 2017, the United States Supreme Court granted the Federal Government’s petition for certiorari in United States v. Microsoft Corp. The case concerns a warrant issued to Microsoft ordering it to seize and produce the contents of a customer’s e-mail account that the government believed was being used in furtherance of narcotics trafficking. Microsoft produced the non-content information that was stored in the U.S., but moved to quash the warrant with respect to the information stored abroad in Ireland. Microsoft claimed that the only way to access the information was through the Dublin data center, even though that data center could also be accessed by its database management program located at some of its U.S. locations.

The federal district court in New York held Microsoft in civil contempt for not complying with the warrant. The 2nd Circuit reversed, stating that “Neither explicitly nor implicitly does the statute envision the application of its warrant provision overseas” and that “the application of the Act that the government proposes – interpreting ‘warrant’ to require a service provider to retrieve material from beyond the borders of the United States – would require us to disregard the presumption against extraterritoriality.” The court used traditional tools of statutory interpretation in its opinion, including plain meaning, the presumption against extraterritoriality, and legislative history.

The issue in the case, according to SCOTUSblog, is “whether a United States provider of email services must comply with a probable-cause-based warrant issued under 18 U.S.C. § 2703 by making disclosure in the United States of electronic communications within that provider’s control, even if the provider has decided to store that material abroad.” Essentially, the dispute centers on the scope of the Stored Communications Act (“SCA”) with respect to information that is stored abroad. The larger issue is the tension between international privacy laws and the absolute nature of warrants issued in the United States. According to the New York Times, “the case is part of a broader clash between the technology industry and the federal government in the digital age.”

I think the broader issue is something the Supreme Court should address, but I am not certain that this is the best case in which to do so. The fact that Microsoft can access the information from data centers in the United States with its database management program seems to weaken its claim. The case may be stronger for companies that cannot access information stored abroad from within the United States. Regardless of this weakness, the Supreme Court should rule in favor of the government to preserve the force of warrants of this nature. It was Microsoft’s choice to store the information abroad, and I don’t think the choices of companies should impede the legitimate crime-fighting goals of the government. Additionally, if the Court ruled that the warrant does not reach information stored abroad, it could incentivize companies to keep information out of the reach of U.S. warrants by storing it overseas. That is not a favorable policy choice for the Supreme Court to make; the justices should side with the government.

Unfortunately, the Court will not get the chance to rule on this case: Microsoft decided to drop it after the DOJ agreed to change its policy.