
Trust is the invisible foundation that keeps our lives moving.
When you step into a taxi, you trust the driver to get you to your destination safely. When you visit a doctor, you trust their years of education and experience. When you pick up a product at the supermarket, you trust the suppliers, regulators, and processes behind it. Without trust, every decision would demand scrutiny, every action would stall under doubt. Society, as we know it, would grind to a halt.
And yet, when it comes to artificial intelligence (AI), that trust is not given so readily, which is why explainability appears to be vital for its adoption. Explainability is about understanding why an AI system arrived at a specific decision, recommendation, or prediction. It matters because humans do not inherently trust AI: we know it can be prone to biases and errors, so many of us want to see how it reaches its conclusions. According to McKinsey, companies that see significant returns from AI, attributing at least 20% of their profits to it, are more likely to follow best practices that promote explainability, leading to greater trust and better outcomes.
We had the pleasure of speaking with Dr. Tristan Fletcher, co-founder of ChAI, an AI platform designed to help businesses mitigate commodity price volatility through forecasting. Tristan explained that unless a model is infallible, humans will always seek to understand how its conclusions are drawn. That’s why, when launching ChAI, they knew that to gain any traction at all, they’d have to build in explainability from the beginning.
“We knew that when predicting the direction of a commodity’s price, it’s essential for our clients to understand why those predictions have been made – not least because they might need to explain those decisions to their stakeholders. That’s why our customers can drill down into as much detail as they need to fully understand what is driving their price forecast.
Sometimes there’s a trade-off between accuracy and explainability, and where there has been one, we prioritise explainability every time because, at the end of the day, we want people to be able to act on our forecasts. Take, for example, a decision not to mitigate a $100m exposure to aluminium price volatility based on a signal from ChAI: it’s going to be a lot easier for a client to justify if it is backed up by a detailed explanation of all the factors our model has taken into account and how heavily it has weighted them.”
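ChAI has not published the internals of its models, but the idea of pairing a forecast with a breakdown of how much each driver contributed can be illustrated with a minimal sketch. The example below is purely hypothetical: the feature names (inventory levels, USD index, energy costs) and the data are invented, and a simple linear model stands in for whatever ChAI actually uses. It shows the simplest form of the attribution the quote describes, where each factor's contribution to a single price prediction is its value multiplied by the weight the model has learned for it.

```python
# Hypothetical illustration: decomposing a single price forecast into
# per-feature contributions with a linear model. Feature names and data
# are invented for this example and do not reflect ChAI's actual model.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(seed=42)
feature_names = ["inventory_levels", "usd_index", "energy_costs"]

# Toy training data: 200 observations of the three drivers and a price.
X = rng.normal(size=(200, 3))
y = 2400 + 80 * X[:, 0] - 120 * X[:, 1] + 60 * X[:, 2] + rng.normal(scale=10, size=200)

model = LinearRegression().fit(X, y)

# Explain one new forecast: each feature's contribution is its value
# times the learned coefficient, added on top of the model's baseline.
x_new = np.array([[0.5, -1.2, 0.3]])
forecast = model.predict(x_new)[0]
contributions = model.coef_ * x_new[0]

print(f"Forecast price: {forecast:.1f}")
print(f"Baseline (intercept): {model.intercept_:.1f}")
for name, contrib in zip(feature_names, contributions):
    print(f"  {name}: {contrib:+.1f}")
```

For non-linear models, the same kind of per-feature breakdown is typically produced with attribution methods such as SHAP, but the principle of showing clients which factors moved the forecast, and by how much, is the same.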
So, the big question is: Is AI the Future of Trading?
Well, yes and no. The emphasis businesses already place on explainable AI suggests that humans will be around for a while longer. Explainability, at its core, ensures that humans remain integral to the decision-making process and, with that, accountable.
Because, at the end of the day, whether you’re getting in a taxi, visiting the doctor, or trying a new product at the supermarket, if something goes wrong, someone will be liable. Likewise, even though a human trader’s decisions are influenced by factors beyond their control or awareness – such as fear or greed – if a trader makes the wrong decision, they’ll either get less capital to trade with or lose their job.
So, while AI may one day ingest and make sense of structured and unstructured data beyond human comprehension, making it a valuable tool in trading, whether or not it replaces human traders isn’t just about capability – it’s about accountability.
Machines don’t bear the weight of a bad call. And that’s why human traders will remain at the helm, because in financial markets, accountability is everything. And that, quite simply, is something AI can’t replace.
Looking for more insights?
Get exclusive insights from industry leaders, stay up-to-date with the latest news, and explore the cutting-edge tech shaping the sector by subscribing to our newsletter, Commodities Tech Insider.