Shaadie Ali, MJLST Staffer
A controversial proposed rule from the SEC on AI and conflicts of interest is generating significant pushback from brokers and investment advisers. The proposed rule, dubbed “Reg PDA” by industry commentators in reference to its focus on “predictive data analytics,” was issued on July 26, 2023.[1] Critics claim that, as written, Reg PDA would require broker-dealers and investment advisers to effectively eliminate the use of almost all technology when advising clients.[2] The SEC claims the proposed rule is intended to address the potential for AI to harm more investors, more quickly, than ever before, but some critics argue that the proposal would reach far beyond generative AI, covering nearly all technology. Critics also highlight the requirement that conflicts of interest be eliminated or neutralized as nearly impossible to meet and a departure from traditional principles of informed consent in financial advising.[3]
The SEC’s 2-page fact sheet on Reg PDA describes the 239-page proposal as requiring broker-dealers and investment advisers to “eliminate or neutralize the effect of conflicts of interest associated with the firm’s use of covered technologies in investor interactions that place the firm’s or its associated person’s interest ahead of investors’ interests.”[4] The proposal defines covered technology as “an analytical, technological, or computational function, algorithm, model, correlation matrix, or similar method or process that optimizes for, predicts, guides, forecasts, or directs investment-related behaviors or outcomes in an investor interaction.”[5] Critics have described this definition of “covered technology” as overly broad, with some going so far as to suggest that a calculator may be “covered technology.”[6] Despite commentators’ insistence, this particular contention is implausible – in its Notice of Proposed Rulemaking, the SEC stated directly that “[t]he proposed definition…would not include technologies that are designed purely to inform investors.”[7] More broadly, though, the SEC touts the proposal’s breadth as a strength, noting it “is designed to be sufficiently broad and principles-based to continue to be applicable as technology develops and to provide firms with flexibility to develop approaches to their use of technology consistent with their business model.”[8]
This move by the SEC comes amidst concerns raised by SEC chair Gary Gensler and the Biden administration about the potential for the concentration of power in artificial intelligence platforms to cause financial instability.[9] On October 30, 2023, President Biden signed an Executive Order that established new standards for AI safety and directed the issuance of guidance for agencies’ use of AI.[10] When questioned about Reg PDA at an event in early November, Gensler defended the proposed regulation by arguing that it was intended to protect online investors from receiving skewed recommendations.[11] Elsewhere, Gensler warned that it would be “nearly unavoidable” that AI would trigger a financial crisis within the next decade unless regulators intervened soon.[12]
Gensler’s explanatory comments have done little to curb criticism by industry groups, who have continued to submit comments via the SEC’s notice-and-comment process long after the SEC’s October 10 deadline.[13] In addition to highlighting the potential impacts of Reg PDA on brokers and investment advisers, many commenters questioned whether the SEC had the authority to issue such a rule. The American Free Enterprise Chamber of Commerce (“AmFree”) argued that the SEC exceeded its authority under both its organic statutes and the Administrative Procedure Act (APA) in issuing a blanket prohibition on conflicts of interest.[14] In its public comment, AmFree argued the proposed rule was arbitrary and capricious, pointing to the SEC’s alleged failure to adequately consider the costs associated with the proposal.[15] AmFree also invoked the major questions doctrine to question the SEC’s authority to promulgate the rule, arguing “[i]f Congress had meant to grant the SEC blanket authority to ban conflicts and conflicted communications generally, it would have spoken more clearly.”[16] In his scathing public comment, Robinhood Chief Legal and Corporate Affairs Officer Daniel M. Gallagher alluded to similar APA concerns, calling the proposal “arbitrary and capricious” on the grounds that “[t]he SEC has not demonstrated a need for placing unprecedented regulatory burdens on firms’ use of technology.”[17] Gallagher went on to condemn the proposal’s apparent “contempt for the ordinary person, who under the SEC’s apparent world view [sic] is incapable of thinking for himself or herself.”[18]
Although investor and broker industry groups have harshly criticized Reg PDA, some consumer protection groups have expressed support through public comment. The Consumer Federation of America (CFA) endorsed the proposal as “correctly recogniz[ing] that technology-driven conflicts of interest are too complex and evolve too quickly for the vast majority of investors to understand and protect themselves against, there is significant likelihood of widespread investor harm resulting from technology-driven conflicts of interest, and that disclosure would not effectively address these concerns.”[19] The CFA further argued that the final rule should go even further, citing loopholes in the existing proposal for affiliated entities that control or are controlled by a firm.[20]
More generally, commentators have observed that the SEC’s prescriptive requirement that firms eliminate or neutralize potential conflicts of interest marks a departure from traditional securities law, under which disclosure of potential conflicts of interest has historically been sufficient.[21] Conflicts of interest stemming from AI and technology have traditionally been regulated like any other conflict of interest: brokers must disclose their conflicts, and their conduct is primarily policed through their fiduciary duty to clients. In turn, some commentators have suggested that the legal basis for the proposed regulations is well-grounded in the investment adviser’s fiduciary duty to always act in the best interest of its clients.[22] Some analysts note that “neutralizing” the effects of a conflict of interest arising from such technology does not necessarily require advisers to discard the technology, but rather to change how firm-favorable information is analyzed or weighed; even so, the requirement marks a significant departure from the disclosure regime. Given the widespread and persistent opposition to the rule, both in the notice-and-comment process and from outside commentators and analysts, it is unclear whether the SEC will make significant revisions in a final rule. While the SEC could conceivably narrow the definitions of “covered technology,” “investor interaction,” and “conflicts of interest,” it is difficult to imagine how it could modify the “eliminate or neutralize” requirement in a way that would bring it into line with the existing disclosure-based regime.
For its part, the SEC under Gensler is likely to continue pursuing regulations on AI regardless of the outcome of Reg PDA. Gensler has long expressed his concerns about the impacts of AI on market stability. In a 2020 paper analyzing regulatory gaps in the use of deep learning in financial markets, Gensler warned, “[e]xisting financial sector regulatory regimes – built in an earlier era of data analytics technology – are likely to fall short in addressing the risks posed by deep learning.”[23] Regardless of how the SEC decides to finalize its approach to AI in conflict-of-interest issues, it is clear that brokers and advisers are likely to resist broad-based bans on AI in their work going forward.
Notes
[1] Press Release, Sec. & Exch. Comm’n, SEC Proposes New Requirements to Address Risks to Investors From Conflicts of Interest Associated With the Use of Predictive Data Analytics by Broker-Dealers and Investment Advisers (July 26, 2023).
[2] Id.
[3] Jennifer Hughes, SEC faces fierce pushback on plan to police AI investment advice, Financial Times (Nov. 8, 2023), https://www.ft.com/content/766fdb7c-a0b4-40d1-bfbc-35111cdd3436.
[4] Sec. & Exch. Comm’n, Fact Sheet: Conflicts of Interest and Predictive Data Analytics (2023).
[5] Conflicts of Interest Associated with the Use of Predictive Data Analytics by Broker-Dealers and Investment Advisers, 88 Fed. Reg. 53960 (proposed July 26, 2023) (to be codified at 17 C.F.R. pts. 240, 275) [hereinafter Proposed Rule].
[6] Hughes, supra note 3.
[7] Proposed Rule, supra note 5.
[8] Id.
[9] Stefania Palma and Patrick Jenkins, Gary Gensler urges regulators to tame AI risks to financial stability, Financial Times (Oct. 14, 2023), https://www.ft.com/content/8227636f-e819-443a-aeba-c8237f0ec1ac.
[10] Fact Sheet, White House, President Biden Issues Executive Order on Safe, Secure, and Trustworthy Artificial Intelligence (Oct. 30, 2023).
[11] Hughes, supra note 3.
[12] Palma, supra note 9.
[13] See Sec. & Exch. Comm’n, Comments on Conflicts of Interest Associated with the Use of Predictive Data Analytics by Broker-Dealers and Investment Advisers (last visited Nov. 13, 2023), https://www.sec.gov/comments/s7-12-23/s71223.htm (listing multiple comments submitted after October 10, 2023).
[14] Am. Free Enter. Chamber of Com., Comment Letter on Proposed Rule regarding Conflicts of Interest Associated With the Use of Predictive Data Analytics by Broker-Dealers and Investment Advisers (Oct. 10, 2023), https://www.sec.gov/comments/s7-12-23/s71223-270180-652582.pdf.
[15] Id. at 14-19.
[16] Id. at 9.
[17] Daniel M. Gallagher, Comment Letter on Proposed Rule regarding Conflicts of Interest Associated With the Use of Predictive Data Analytics by Broker-Dealers and Investment Advisers (Oct. 10, 2023), https://www.sec.gov/comments/s7-12-23/s71223-271299-654022.pdf.
[18] Id. at 43.
[19] Consumer Fed’n of Am., Comment Letter on Proposed Rule regarding Conflicts of Interest Associated With the Use of Predictive Data Analytics by Broker-Dealers and Investment Advisers (Oct. 10, 2023), https://www.sec.gov/comments/s7-12-23/s71223-270400-652982.pdf.
[20] Id.
[21] Ken D. Kumayama et al., SEC Proposes New Conflicts of Interest Rule for Use of AI by Broker-Dealers and Investment Advisers, Skadden (Aug. 10, 2023), https://www.skadden.com/insights/publications/2023/08/sec-proposes-new-conflicts.
[22] Colin Caleb, ANALYSIS: Proposed SEC Regs Won’t Allow Advisers to Sidestep AI, Bloomberg Law (Aug. 10, 2023), https://news.bloomberglaw.com/bloomberg-law-analysis/analysis-proposed-sec-regs-wont-allow-advisers-to-sidestep-ai.
[23] Gary Gensler and Lily Bailey, Deep Learning and Financial Stability (MIT Artificial Intel. Glob. Pol’y F., Working Paper 2020) (in which Gensler identifies several potential systemic risks to the financial system, including overreliance and uniformity in financial modeling, overreliance on concentrated centralized datasets, and the potential of regulators to create incentives for less-regulated entities to take on increasingly complex functions in the financial system).