The Securities and Exchange Commission will seek input on whether digital customer engagement innovations used by financial firms should be governed by existing rules or may need new ones, SEC chair Gary Gensler told Reuters.
While the SEC's thinking on the subject is at an "early stage," its rules may need updating to account for an artificial intelligence-led revolution in predictive analytics, differential marketing and behavioral prompts designed to optimize customer engagement, he said.
The SEC plans to launch a sweeping consultation in coming days that could have major ramifications for retail brokers, wealth managers and robo-advisers, which increasingly use such tools to drive customers to higher-revenue products.
"We're at a transformational time. I really believe data analytics and AI can bring a lot of positives, but it means we should look back and think about what does this mean for user interface, user engagement, fairness and bias," said Gensler. "What does it mean about rules written in an earlier era?"
The consultation was partly sparked by January's meme stock saga, which resulted in intense scrutiny of retail broker practices, including "gamification" -- game-like prompts designed to optimize customer engagement.
Gensler told Congress in a May hearing about the saga that the SEC would seek public input on gamification.
He now says the agency should examine the gamut of digital-engagement practices. While such features can increase consumers' access to capital markets, they may also expose them to increased risks.
Certain behavioral prompts could potentially be considered investment advice and regulated as such, he added.
"These digital engagement practices raise questions as to when marketing becomes advice, when is it a recommendation, what's the duty of care?" said Gensler, who was previously a professor at MIT where he taught classes on financial technology.
Gensler echoed a growing worry among regulators that such tools may perpetuate discriminatory behavior. With some marketing practices, for example, companies customize product offerings and prices to customers' preferences and profile.
"The data that's coming into these data analytics, whether it be machine learning or deep learning, will represent the biases in society, as they exist already," he said.