
Engage, Educate and Empathise with Compliance on AI/ML

  • Writer: Adam Howard
  • Apr 18, 2023
  • 2 min read

Compliance is in for a regulatory nightmare if their product and technology leads try to ram in new AI and generative platforms - so tread carefully if you want to maintain momentum on any AI/ML-powered financial experience.


Engage


It’s tempting, sitting in a lofty AI tower, to design and develop AI/ML features in isolation, away from the prying eyes of scrutiny and oversight, dreaming of investment banking bonus rounds. Just remember the ivory tower will come crashing down the moment the head of compliance isn’t ‘comfortable’. The best approach is to engage compliance at the earliest stage of model development to discuss what you want the model to do and how you will manage its development. Don’t assume asking for forgiveness later will work - it will get you fired.


Educate


The pace of change in AI is such that even experienced practitioners can’t keep up, so why would the Compliance Officer or DPO? You must educate every step of the way to mitigate the fear emanating throughout the industry, particularly around the dreaded ‘black box’. It’s more likely that your black box doesn’t even use ML and instead runs on rules-based statistical modelling, so don’t hide behind it - educate the compliance and legal teams on what your software is doing and what others are doing.
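To make that point concrete, here is a rough sketch (in Python, with field names and thresholds I’ve made up purely for illustration) of the kind of rules-based scoring that often sits behind a supposed ‘black box’ - every line of it can be walked through with compliance:

    def nudge_score(customer: dict) -> float:
        """Score how relevant a savings nudge is for a customer, using plain rules."""
        score = 0.0
        # Rule 1: a recent salary credit makes a savings prompt timely.
        if customer.get("days_since_salary_credit", 99) <= 3:
            score += 0.4
        # Rule 2: a healthy buffer over average monthly spend suggests capacity to save.
        if customer.get("balance", 0) > 1.5 * customer.get("avg_monthly_spend", 0):
            score += 0.4
        # Rule 3: back off if the customer recently opted out of prompts.
        if customer.get("recent_opt_out", False):
            score -= 0.5
        return max(0.0, min(1.0, score))

    print(nudge_score({"days_since_salary_credit": 1,
                       "balance": 12_000,
                       "avg_monthly_spend": 4_000}))
    # -> 0.8, and every point of that score can be read out loud in a compliance review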


Empathise


Compliance teams are in a very difficult spot - on one hand they have desk and technology heads, product managers, and sales and marketing teams looking to deploy any shiny new toy that comes along in search of market share and competitive advantage. On the other hand they have regulations that are purposely broad, sweeping and vague - a catch-all approach to stop mum and dad consumers from being hypnotised out of their life savings. So empathise with them, and do the first two as best you can.


We don’t yet know how regulation for the far end of the Financial Experience Maturity Model - Generative - will play out. Some countries will look to ban it outright, as Italy has done with ChatGPT, but within finance I think it more likely that regulators will look to control how these large language models are used. A general rule of thumb: if you can explain and evidence how a specific financial experience element, such as a generative summary or chat, was created, then you will be OK. And for God’s sake, don’t let your new AI recommend any trades, and have appropriate risk messaging baked into any conversations - or compliance will lose it.
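As a purely illustrative sketch of that ‘explain and evidence’ rule of thumb (the function and field names here are mine, not any particular vendor’s API), logging enough context to reconstruct every generated summary or chat turn might look something like this:

    import hashlib
    import json
    import time

    def log_generation(prompt, source_doc_ids, model_id, output, audit_log):
        """Append an audit record that lets compliance trace how an output was produced."""
        audit_log.append({
            "timestamp": time.time(),
            "model_id": model_id,                    # exact model and version used
            "source_doc_ids": source_doc_ids,        # the data the summary drew on
            "prompt_sha256": hashlib.sha256(prompt.encode()).hexdigest(),
            "output_sha256": hashlib.sha256(output.encode()).hexdigest(),
            "risk_disclaimer_shown": True,           # risk messaging baked into the conversation
            "contains_trade_recommendation": False,  # hard rule: the AI never recommends trades
        })

    audit_log = []
    log_generation("Summarise account activity for March 2023",
                   ["statement-2023-03"],
                   "summary-model-v1",
                   "You spent less on travel than usual this month.",
                   audit_log)
    print(json.dumps(audit_log[0], indent=2))

The exact fields matter less than the principle: when compliance asks where a generated sentence came from, you can answer.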



