Initial Thoughts on July 26th SEC Release: Conflicts of Interest Associated with the Use of Predictive Data Analytics by Broker-Dealers and Investment Advisers

Josh Stone, InHouseEsq.com – 7/28/2023


I’ve taken a quick look at the July 26th SEC release, Conflicts of Interest Associated with the Use of Predictive Data Analytics by Broker-Dealers and Investment Advisers (https://www.sec.gov/files/rules/proposed/2023/34-97990.pdf), and below are my initial thoughts. I looked at it from the perspective of someone advising investment advisers, not broker-dealers. This list was not generated by AI. Given the volume of new rules (this one runs 243 pages), perhaps the SEC is generating them using AI.
• The gist of the proposed rule is that registered investment advisers must:
o First, evaluate any use or reasonably foreseeable potential use of a “covered technology” in any “investor interaction” to identify any “conflict of interest” (including by testing each such covered technology to determine whether its use is associated with a conflict of interest);
o Next, determine whether any identified conflict places the interest of the investment adviser ahead of the interests of investors;
o Then, eliminate or neutralize any such conflict. This is a strict standard; disclosure alone is not enough.
o Adopt policies and procedures governing the use of these “covered technologies” in “investor interactions”; and
o Comply with expanded recordkeeping requirements covering all of the above.
• The release uses some confusing language and terms, like “PDA-like technologies,” “investor interaction” and “covered technology.” Here, PDA refers to predictive data analytics; “PDA-like” is an awkward adjective.
o “Covered technology” means an analytical, technological, or computational function, algorithm, model, correlation matrix, or similar method or process that optimizes for, predicts, guides, forecasts, or directs investment-related behaviors or outcomes. What doesn’t fall under this definition? (For a sense of how little would escape it, see the hypothetical sketch at the end of this post.)
o “Investor interaction” means engaging or communicating with an investor, including by exercising discretion with respect to an investor’s account; providing information to an investor; or soliciting an investor; except that the term does not apply to interactions solely for purposes of meeting legal or regulatory obligations or providing clerical, ministerial, or general administrative support. This also seems extremely broad.
o A “conflict of interest” exists when an investment adviser uses a covered technology that takes into consideration the interest of the investment adviser.
• The release notes that “Broker-dealers and investment advisers operate within regulatory frameworks that in many cases require them to, as applicable, disclose, mitigate, or eliminate conflicts.” If that is already the case, why is the SEC singling out this set of conflicts for more specific regulation? Investment advisers have been using models, automated systems, and AI tools for years; why a specific rule now? We all know that artificial intelligence is dominating the media right now. There must be a great deal of pressure on regulators to appear proactive and to address AI in some manner, given that AI is being presented as a potentially existential risk (e.g., warnings of autonomous machines destroying humanity).
• Have these risks been flagged in examinations? Have there been enforcement actions in this area? If the risk is more predicted or theoretical, could it simply have been the subject of a risk alert?
• As per the release, “The Commission is aware that some more complex covered technologies lack explainability as to how the technology functions in practice, and how it reaches its conclusions (e.g., a “black box” algorithm where it is unclear exactly what inputs the technology is relying on and how it weights them). The proposed conflicts rules would apply to these covered technologies, and firms would only be able to continue using them where all requirements of the proposed conflicts rules are met, including the requirements of the evaluation, identification, testing, determination, and elimination or neutralization sections.” This puts registered U.S. firms at a disadvantage relative to other competitors in the market, including offshore advisers and their clients.
• As usual, Commissioner Hester Peirce’s dissenting statement is insightful; I typically read her statements before reading newly proposed rules. https://www.sec.gov/news/statement/peirce-statement-predictive-data-analytics-072623
• Technology can be an equalizer, allowing smaller advisers to compete with larger advisers without hiring expensive personnel. This proposal will impose compliance costs on those smaller advisers. There will be more jobs for attorneys and compliance personnel.
• I’m sure that the SEC will get lots of comment letters on this proposal. AI is a huge business, and a lot of money has flowed into it. Although the proposal pertains only to certain specific conflicts of interest, there are no limits on what comment letters may address, so the release presents an opportunity for the public to essentially brainstorm about how AI and other technologies should be regulated by the SEC. Companies selling this technology to investment firms will likely want their technologies excluded. People with a (maybe healthy) paranoia about AI will likely want to throw additional protections into this brainstorming exercise. Perhaps commenters will want to go beyond the mitigation of specific conflicts and address additional risks related to the use of covered technology, such as the risk that covered technology introduces bias (based on traditional protected-class status or other factors) into the investment process. For example, a system might recommend more or less risky investments based on an investor’s gender.
• This proposal needs to be read in the context of the deluge of rules targeting investment advisers, especially fund managers. I try to keep up with proposed rules (including to contribute to no-action letters submitted by not-for-profits) and want to say PLEASE NO MORE! Those of us in the industry want to have personal lives as well.
• Crazy Prediction for 2024 – The SEC brings an enforcement action against OpenAI claiming ChatGPT is acting as an unregistered investment adviser (where users pay for the service).
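
A hypothetical footnote on the breadth of “covered technology”: the sketch below is my own illustration, not anything from the release, and every name, number, and weight in it is invented. It shows a toy risk-scoring routine of the kind advisers have run for years; because it guides and directs an investment-related outcome, it would seem to fall within the proposed definition, and if an adviser’s own interest crept into the weighting, the evaluation, testing, and elimination-or-neutralization requirements would appear to attach.

# Hypothetical sketch (my illustration, not from the release): even a toy
# risk-scoring routine like this arguably "optimizes for, predicts, guides,
# forecasts, or directs investment-related behaviors or outcomes" and so
# would appear to be a "covered technology" under the proposed definition.

def risk_score(age: int, annual_income: float, horizon_years: int) -> float:
    """Toy risk-tolerance score; every input and weight here is invented."""
    score = 0.0
    score += max(0, 70 - age) * 0.5          # younger investors score higher
    score += min(annual_income / 50_000, 5)  # income adds capacity for risk
    score += min(horizon_years, 30) * 0.2    # longer horizon adds tolerance
    return score

def recommend(age: int, annual_income: float, horizon_years: int) -> str:
    """Guides an investment-related outcome based on the score."""
    if risk_score(age, annual_income, horizon_years) > 25:
        return "equity-heavy model portfolio"
    return "balanced model portfolio"

if __name__ == "__main__":
    # Example: a 35-year-old with $120,000 of income and a 25-year horizon.
    print(recommend(age=35, annual_income=120_000, horizon_years=25))

The point is not the particulars of this formula but that nothing in the proposed definition appears to distinguish a simple routine like this from a complex machine-learning model.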