Key components of real-time personalisation
- Adam Howard
- Mar 7, 2023
- 11 min read
Updated: Jul 28, 2023
What you need to consider when creating a real-time financial experience platform.
Almost every strategic deck I have seen in finance in the last few years has personalisation at its heart - or hyper-personalisation, the same thing as far as I'm concerned. I do preach a particular brand of real-time customer experience: one that generates a personalised view of the markets at any given moment, rather than the traditional Martech version where the user journey or the product offering is tailored to the profile stored in the CRM.
As discussed in a previous article, ‘Financial experiences suck’, I think most financial experiences lag customer expectations - most people are getting the radio when they should be streaming from Spotify. So how would you go about building a platform capable of generating an AI-driven experience of the markets?
- Natural Language Processing & Named Entity Recognition
- Large Language Models
- Behaviour Capture
- Mapping
- Customer Data Security
- Financial Knowledge Graph
- Machine Learning
- Dynamic Data Storage
- Headless Architecture
- Regulation & Compliance
- Content & Data Sources
Natural Language Processing
The word on everyone's lips (machine-read or otherwise) at the moment: NLP! I don't think there is anyone, certainly not anyone involved in tech, who hasn't signed up to ChatGPT from OpenAI. It is what's known as an LLM, or Large Language Model - super interesting tech that is already having a substantial disruptive influence on a load of sectors and jobs. You will need a smaller version of this capability in order for a machine to understand and enrich the content and data needed to create a real-time personalised experience.
Machine-reading real-time content and data gives you the key capability of NER, or Named Entity Recognition, which means you can ingest a load of content and data and enrich it with tags, allowing you to provide the following:
- Curate content from a content inventory - for example, if I need to curate tweets from the Twitter API regarding the $MSFT investment in OpenAI and the calamity of Alphabet's launch of Bard, I can either deliver them to a customer who is interested in those stocks or create a topical trading page for AI stocks.
- Analyse instrument noise within content - for example, identifying unusual behaviour within a content source around a particular instrument. Using the above example, we would have seen a serious spike in Microsoft and AI mentions in a Twitter or public web content feed that would have highlighted something was afoot in the world of AI.
Without an NLP component in the stack you can't read and enrich the high-velocity, high-variety content and data needed to create a unique experience for the user based on their behaviour, interests and preferences in the market.
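To make that concrete, here is a minimal sketch of the enrichment step using spaCy's off-the-shelf NER. A production stack would use a finance-tuned model and map entities to tickers; the content shape and tag filter here are illustrative assumptions.

```python
# Minimal sketch: NER-based content enrichment with spaCy
# (assumes `python -m spacy download en_core_web_sm` has been run).
import spacy

nlp = spacy.load("en_core_web_sm")

def enrich(content: dict) -> dict:
    """Attach named-entity tags to a piece of ingested content."""
    doc = nlp(content["text"])
    # Keep organisations, people, places and money references - the
    # entity types most likely to map to instruments and topics.
    content["tags"] = sorted({
        (ent.text, ent.label_)
        for ent in doc.ents
        if ent.label_ in {"ORG", "PERSON", "GPE", "MONEY"}
    })
    return content

tweet = {"id": "t1", "text": "Microsoft deepens its OpenAI investment as Alphabet stumbles with Bard."}
print(enrich(tweet)["tags"])  # e.g. [('Alphabet', 'ORG'), ('Microsoft', 'ORG'), ...]
```

Every downstream component - curation, behaviour capture, the knowledge graph - hangs off these tags, which is why NLP sits first in the list.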
Large Language Models
LLMs weren't so pervasive when I first wrote this article back in February 2023. They certainly are now, and you can't ignore the capability they bring to real-time financial experiences, both in a generative sense - content summarisation, for example - and for conversational trading and analysis.
There are some constraints on how successful integrating an LLM like ChatGPT or Bard into a financial experience platform can be, mostly due to how much context an LLM can take. For example, it is currently difficult to get real-time content considered by the model as it needs to be embedded first, and the more content required, the greater the cost implications.
Techniques for this are coming through, however, with orchestration frameworks such as LangChain and vector databases such as Pinecone allowing for increasingly real-time implementations of the models. Less real-time use cases are absolutely in play too - embedding your education content behind a ChatGPT-style interface, for example, would create quite a good education chatbot.
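The retrieval pattern those tools enable is simple at heart: embed your content, fetch only the pieces nearest the question, and pass those into the context window. Here is a minimal self-contained sketch of that pattern; `embed` is a toy hashed bag-of-words stand-in and `ask_llm` a placeholder, both assumptions you would swap for a real embeddings API, a vector database and an LLM call.

```python
# Minimal sketch of retrieval-augmented generation over market content.
import numpy as np

def embed(text: str) -> np.ndarray:
    """Toy hashed bag-of-words embedding - swap in a real embeddings API."""
    vec = np.zeros(256)
    for word in text.lower().split():
        vec[hash(word) % 256] += 1.0
    return vec

def ask_llm(prompt: str) -> str:
    """Placeholder - call your LLM's chat/completion API here."""
    return prompt

def answer_with_context(question: str, documents: list[str], k: int = 3) -> str:
    doc_vecs = np.array([embed(d) for d in documents])
    q = embed(question)
    # Cosine similarity between the question and every document.
    sims = (doc_vecs @ q) / (np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(q) + 1e-9)
    context = "\n".join(documents[i] for i in np.argsort(sims)[-k:])
    return ask_llm(f"Using this market content:\n{context}\n\nAnswer: {question}")
```

A vector database like Pinecone replaces the brute-force similarity scan here, which is what makes the pattern viable at real-time content volumes.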
Behaviour Capture
A critical component, and maybe not what you think. You must be able to link user behaviour to the markets - otherwise we are really dealing with typical digital analytics. I will discuss the difference between behaviour and digital analytics in more detail in another article, but what's key here is to understand that an interaction with content or data on an app or platform should link to an instrument or a topic.
So you need a method to associate an interaction with an entity that was attached (enriched) to that content or data by the previous component (NLP/NER). Let's say it's a curated article about the Reserve Bank of Australia and interest rates; with our NLP and our Knowledge Graph (see below) up to scratch, we have tagged the content as ‘RBA’, ‘AUD/USD’ and ‘Interest Rates’, and a user interacts with that content within a content feed on a mobile app. You need technology to map that click or hover to those named entities, which elevates the interaction from a click-through to a weighted interest in the FX instrument Aussie Dollar, the topics associated with the RBA and interest rates, and the asset class Foreign Exchange.
In order to do this you will need to tag up your channels - websites, apps, email - and all of their components and UI elements to capture the interaction and pass the content ID, which will contain the entity data. This is similar to how Google Analytics works and can be a JavaScript tagging library that tracks everything from a click to a filter, drop-down or input.
When you understand what a customer has read and ingested about the markets, you move from knowing what time they read the news to knowing which instruments and topics interest them. This is the key data for delivering a good financial experience.
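On the backend, the RBA example above boils down to a small lookup-and-weight step. This is a minimal sketch; the event names, weights and in-memory stores are illustrative assumptions, not a reference schema.

```python
# Minimal sketch: elevating a raw click into weighted instrument interest.
CONTENT_TAGS = {  # populated by the NLP/NER enrichment step
    "article-123": ["RBA", "AUD/USD", "Interest Rates", "Foreign Exchange"],
}
EVENT_WEIGHTS = {"impression": 0.1, "click": 1.0, "read": 2.0, "share": 3.0}

def capture(user_id: str, content_id: str, event: str, profile: dict) -> None:
    """Map an interaction on a content ID to the entities tagged on it."""
    weight = EVENT_WEIGHTS.get(event, 0.0)
    scores = profile.setdefault(user_id, {})
    for entity in CONTENT_TAGS.get(content_id, []):
        scores[entity] = scores.get(entity, 0.0) + weight

profile: dict = {}
capture("u-42", "article-123", "click", profile)
print(profile["u-42"])  # {'RBA': 1.0, 'AUD/USD': 1.0, 'Interest Rates': 1.0, ...}
```

Note that only a user ID, a content ID and an event type cross the wire - a point that matters again in the customer data security section below.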
Mapping
I have had countless conversations about mapping - what it is and why it's important. In essence, if you want to create a real-time financial experience from the available content and data inventory, you need to make sure your apples are apples and your pears are pears.
Internal mapping isn't typically a drama until you need to align all instruments under a single ID. This is very important when you bring together multiple content and data sources - say a premium news provider, charts, prices, tweets, video and some alerts. All of these sources may identify an instrument using a different unique identifier, so it's important to have a way of mapping an external UID to an internal UID. The more complex the financial experience, the more important mapping becomes, especially when you consider the monthly roll-over of new instrument products.
Mapping also critically needs to happen from behaviour capture, to ensure that if a user interacts with any of the content and data sources on a channel, that interaction is mapped to the instrument or topic mentioned within the text. Otherwise the interaction has no context - it's just a click.
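At its simplest, this is a symbology table keyed by source and external identifier. A minimal sketch follows; the vendor codes shown (a RIC-style code, a cashtag, an ISIN) are illustrative assumptions.

```python
# Minimal sketch: every vendor identifier resolves to one internal instrument ID.
INTERNAL = {
    "inst-0001": {"name": "Microsoft Corp", "asset_class": "Equity"},
}
VENDOR_MAP = {  # (source, external_uid) -> internal UID
    ("news_vendor", "MSFT.O"): "inst-0001",
    ("twitter", "$MSFT"): "inst-0001",
    ("isin", "US5949181045"): "inst-0001",
}

def resolve(source: str, external_uid: str) -> str | None:
    """Map an external identifier to the single internal instrument ID."""
    return VENDOR_MAP.get((source, external_uid))

# A tweet and a news article about the same company now land on one ID.
assert resolve("twitter", "$MSFT") == resolve("news_vendor", "MSFT.O")
```

The real work is keeping this table current as instruments are added and retired each month, which is why mapping deserves its own component rather than being buried in ingestion code.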
Customer data security
Get this bad boy sorted right out of the gate. When you are operating in a regulated sector such as the financial markets, compliance with all legislation related to customer data is incredibly important. What seems like a typical implementation of front-end data capture, transport and storage will get very complicated very quickly once the organisation's compliance and secops teams get involved - they will block your build quickly, and you will be out the door if you cannot demonstrate superb handling of customer data.
The first thing to consider is that capturing, transporting and storing data should be kept to an absolute minimum - if you don't need it, don't touch it. So if you are capturing customer data such as a behaviour, make sure you are only capturing a unique identifier for the customer and the interaction; you shouldn't need anything else from the front end other than the interaction event, as you can map everything to the instruments without customer data. The key thing to remember when delivering personalisation is: I don't care who you are, I care what you do.
If you can, always build the data store in the cloud and within the organisation's perimeter - secops teams will appreciate cloud capabilities more than storing it on tin. Data residency is still important when dealing with cross-border regulation, such as serving both US and UK customers, so make sure you can tell compliance where the data is stored.
Pseudonymise if you can: always map a UID from the front end to a different UID in the database, to further reduce the risk of a breach linking an identifier to an actual person. Better still, don't store any raw UID at all if you can avoid it.
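One common way to do this is a keyed hash, so the raw front-end ID never reaches the database. A minimal sketch, assuming the key lives in an environment variable (in practice a KMS or secrets manager, which is out of scope here):

```python
# Minimal sketch: pseudonymising a front-end UID with a keyed HMAC.
import hashlib
import hmac
import os

SECRET_KEY = os.environ["PSEUDONYM_KEY"].encode()  # never hard-code this

def pseudonymise(front_end_uid: str) -> str:
    """Derive a stable internal UID that cannot be reversed to the user
    without the key - the behaviour store only ever sees this value."""
    return hmac.new(SECRET_KEY, front_end_uid.encode(), hashlib.sha256).hexdigest()

internal_uid = pseudonymise("user-42")
```

Because the HMAC is deterministic, the same customer always maps to the same internal UID, so behaviour still accumulates correctly - you just can't walk it back to a person without the key.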
Financial Knowledge Graph
It's relatively easy to build a curation service for public financial content. Say you want to show Nvidia tweets or YouTube videos about this week's Bitcoin chart setup - you can pull them from the API using the NLP component mentioned above, or even more simply just filter for those words in the API response. However, that won't give you the edge many financial customers are looking for in alternative content, and it won't build a more interconnected, market-aware financial experience.
Every instrument, particularly an equity, is connected or related to a load of other entities, and a change in any of them can have an impact on price. Good personalisation won't just surface an instrument; it will surface the information that surrounds it. In simple terms, people, products and places can impact the value of a stock. Some recent examples:
- Google botches the launch of its ChatGPT competitor, Bard, and Alphabet promptly loses $100 billion in market value.
- Disney brings back Bob Iger and the stock jumps 10%.
- Turkey closes its stock market for a week after an uncontrolled selloff following the earthquake.
All companies, and therefore all equities, have dependencies and relationships that can and should be understood when trading and investing - and they can also be exploited for engagement purposes. For example, it's reasonable to assume investors in Tesla would also be more likely to engage with stocks involved in battery technology, lithium mining and sustainability. This highlights a key dependency of any good recommender system: it needs to understand the relationships between companies, from being in the same peer group to being a hardware provider within the LIDAR system used by Tesla.
Now, you can get a pretty good graph going by mining data sources that define market relationships, or search engines, which already have great knowledge graphs about companies - related companies, related people, product information. But you can also look to create a graph from the content and data you have ingested into your newly minted personalisation platform. Ultimately it will be a blend of semantic data and relationships discovered by the graph technology.
What you are trying to achieve with this component is depth of information: a fuller, more engaging picture of the markets that is demonstrably related to the interests of the customer. It also gives you a much broader palette from which to draw the financial experience, and it will definitely improve any recommender systems you have developed, because they can test more diverse entities.
If you want to know more about Knowledge Graphs, here is a great resource: https://www.ontotext.com/knowledgehub/fundamentals/what-is-a-knowledge-graph/.
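To show the shape of the thing, here is a minimal sketch of the Tesla example as a graph using networkx. The relationships are illustrative assumptions, not verified supply-chain data; a production graph would live in a dedicated graph database.

```python
# Minimal sketch: a financial knowledge graph for recommendations.
import networkx as nx

G = nx.Graph()
G.add_edge("Tesla", "Panasonic", relation="battery supplier")
G.add_edge("Tesla", "Lithium Miners", relation="upstream commodity")
G.add_edge("Tesla", "Rivian", relation="peer group")
G.add_edge("Panasonic", "Lithium Miners", relation="upstream commodity")

def related(entity: str) -> list[tuple[str, str]]:
    """Entities one hop away - candidates for curation and recommendation."""
    return [(n, G.edges[entity, n]["relation"]) for n in G.neighbors(entity)]

print(related("Tesla"))
# [('Panasonic', 'battery supplier'), ('Lithium Miners', 'upstream commodity'),
#  ('Rivian', 'peer group')]
```

One-hop neighbours give you the "information that surrounds" an instrument; walking further out is how the experience broadens beyond what the customer has already clicked on.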
Machine Learning
I believe you can get a long way in creating a real-time personalised financial experience without complex machine learning models; good rules-based systems can give you a great basis for personalisation and recommendation. I'll discuss personalisation and recommender systems in another article, but the ML capability can start quite light and develop more features as you move forward. At the very least you will need a blend of content-based and collaborative filtering: recommend related instruments, say in the same peer group or supply chain (content-based filtering), and, if you have enough user data, recommend instruments and content that other users in a similar cohort responded positively to but this user hasn't seen yet - she likes these, I'm like her, so I should like these too (collaborative filtering).
There are some great platforms out there now that give you access to hundreds of open-source ML models, where you can create systems that would work for finance either as a universal model or as component-based models, such as a search or watchlist recommender.
You can deliver a highly personalised financial experience without a deep neural network - but it depends on your scale, your resources and what the customer experience is trying to achieve.
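To illustrate how light the starting point can be, here is a minimal sketch blending the two filters described above. The feature vectors and interaction matrix are toy assumptions; real systems would learn these from the knowledge graph and the behaviour store.

```python
# Minimal sketch: blending content-based and collaborative filtering.
import numpy as np

instruments = ["TSLA", "ALB", "NVDA", "XOM"]
# Content features per instrument, e.g. [EV exposure, battery/lithium, AI].
features = np.array([[1, 1, 0], [0, 1, 0], [0, 0, 1], [0, 0, 0]], float)
# Rows = users, columns = instruments, values = interaction strength.
interactions = np.array([[3, 1, 0, 0], [2, 0, 3, 0], [0, 0, 0, 2]], float)

def cosine(m: np.ndarray) -> np.ndarray:
    unit = m / np.clip(np.linalg.norm(m, axis=1, keepdims=True), 1e-9, None)
    return unit @ unit.T

content_sim = cosine(features)       # "related instruments" (content-based)
user_sim = cosine(interactions)      # "users like me" (collaborative)
# Blend: what similar users did, plus what resembles what this user liked.
blended = 0.5 * (user_sim @ interactions) + 0.5 * (interactions @ content_sim)
print(dict(zip(instruments, blended[0])))  # user 0's scores per instrument
```

Even this toy version surfaces ALB (the lithium play) for the Tesla-heavy user, which is exactly the peer-group and supply-chain behaviour described above.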
Dynamic Data Storage
When you are offering a personalised real-time financial experience, you could be required to process thousands of instruments in near real time across multiple content and data sources. Depending on your implementation, one of the big benefits of this platform is the ability to gather data from all of the entities you are tracking and the interactions with them.
I refer to these as behaviour verticals, where you monitor the behaviour of the content (what the market is talking about), the customer (what the users are interacting with) and the markets (what the market is buying and selling). I will cover these concepts in more detail in another article, but the key takeaway is that once you track all three behaviour verticals there is so much data to leverage that you need historical data kept in a colder, slower, more cost-effective data store, while the real-time requirements are handled in hotter, faster, more expensive stores.
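The tiering logic itself is simple; the hard part is the stores either side of it. A minimal sketch follows, with plain lists standing in for what would realistically be a streaming/in-memory hot store and object storage or a warehouse for cold. The 24-hour window is an illustrative assumption.

```python
# Minimal sketch: routing behaviour events between hot and cold storage.
from datetime import datetime, timedelta

HOT_WINDOW = timedelta(hours=24)
hot_store: list[dict] = []    # stand-in for an in-memory/streaming store
cold_store: list[dict] = []   # stand-in for object storage or a warehouse

def write_event(event: dict) -> None:
    """New events always land hot - they drive the real-time experience."""
    hot_store.append(event)

def tier_events(now: datetime) -> None:
    """Periodically move events older than the hot window to cheap storage."""
    global hot_store
    cold_store.extend(e for e in hot_store if now - e["ts"] > HOT_WINDOW)
    hot_store = [e for e in hot_store if now - e["ts"] <= HOT_WINDOW]
```

The cold store is far from dead weight: it is where the historical data for training recommender models and spotting behavioural anomalies comes from.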
Headless Architecture
Your personalised CX should be delivered to the channels (web, app, platforms) via a headless architecture. This decouples the backend and the front end, which you will definitely need, because you will be doing a lot of variant testing of the UI, UX, data and models, and the personalisation will be driven from the backend integrations.
In short, there will be a lot of change and optimisation going on as you fine-tune your platform and the experiences. You will need the flexibility to develop new features in the platform and deploy them to the front end via API, and you will want every channel to be able to draw on these features so that each customer experiences multi-channel personalisation. It would be a shame to build the business logic, the models and the APIs, and only deploy to one asset via one team.
The ultimate goal here is having great personalised experiences running on all channels - say a personalised news email and a personalised search recommendation in the trading platform, both coming from the same data and business logic via a headless microservices API.
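In practice that can be as simple as one endpoint every channel consumes. Here is a minimal sketch using FastAPI; the route shape and the recommender call are hypothetical placeholders, not a reference design.

```python
# Minimal sketch: a headless personalisation endpoint with FastAPI.
from fastapi import FastAPI

app = FastAPI()

def recommend_for(user_id: str) -> list[dict]:
    """Placeholder - call your recommender/personalisation service here."""
    return [{"instrument": "AUD/USD", "reason": "RBA interest-rate activity"}]

@app.get("/v1/users/{user_id}/recommendations")
def recommendations(user_id: str) -> dict:
    # Web, app and the email service all hit this same endpoint, so the
    # business logic lives once in the backend rather than per channel.
    return {"user_id": user_id, "items": recommend_for(user_id)}

# Run with: uvicorn main:app (assuming this file is main.py).
```

Each front end then only decides how to render the items, never which items to show - that decision stays behind the API where it can be tested and optimised centrally.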
Regulation & Compliance
A personal favourite of mine, having spent years working closely with compliance and legal teams on deploying AI into regulated financial entities in multiple regions. This one in particular can derail all the work you have put into designing and building your platform. Make sure you develop the policy framework, the reporting framework and the customer data framework at the same time as the technical implementation, because compliance and security teams will want to know these key things:
- What data is collected (customer data & privacy)
- What data is created (AI & models)
- How and why data is created (AI & models)
- How data is transported and stored (data security)
You will need a framework to engage with compliance and legal in order to explain what your platform is doing and why, and you must be able to evidence this with data. Scrutiny of model explainability in regulated entities runs deep, so be ready to open the black box. The policy framework is also important - work with the teams to create and update policies, for internal and external use, that detail exactly what this personalised financial experience is and what the risks are to the customer.
Content & Data Sources
Each personalised CX is generated from the available content inventory that has been ingested, processed and curated by the platform - the better the inventory, the better the experience. Make sure you have a good mix of long and short form, technical and fundamental, social and video, and don't forget you also need data sources such as market fundamentals, price and technical analysis.
Don’t think of content as just a single experience - you are looking to craft a generated experience by combining all available information. For example, an economic event is just a scheduled date and some data, but combined with targeting, date-proximity scoring, summarisation of analysis, sentiment, anomaly detection and curation, the retail sales figures become an engagement beast that will bring in the punters.
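Date-proximity scoring, for instance, can be a one-line decay. A minimal sketch, where the half-life and the exponential shape are illustrative assumptions you would tune per event type:

```python
# Minimal sketch: date-proximity scoring for an economic event.
from datetime import datetime, timezone

def proximity_score(event_time: datetime, now: datetime,
                    half_life_hours: float = 24.0) -> float:
    """Score decays the further the event sits from 'now', before or after."""
    hours_away = abs((event_time - now).total_seconds()) / 3600.0
    return 0.5 ** (hours_away / half_life_hours)

retail_sales = datetime(2023, 7, 18, 12, 30, tzinfo=timezone.utc)
now = datetime(2023, 7, 18, 9, 0, tzinfo=timezone.utc)
score = proximity_score(retail_sales, now)  # ~0.90 - imminent, rank it high
```

Multiply that by sentiment, anomaly and targeting signals and the humble calendar entry becomes the ranked, personalised piece of experience described above.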
Human Centric!
As with all things technology, it's not just what you build, it's how you use it. These platforms can offer financial institutions capabilities that will dramatically improve the lifetime value of many of their hard-won customers - but you need to understand the shift from manual, campaign-driven sales, marketing and product to automated, AI-generated and personalised financial experiences. That shift will put your business on a human-centred footing.
I can help with that.
Cheers
Adam Howard
Founder