Ever wondered what the banks do with all our data? You’d think that they’d be having a field day with algorithms working around the clock to analyse where our money goes. If only it were that simple though, as the Fintech Connect Live* event in London highlighted recently. During a panel discussion, flanked by some of the newer players in finance and fintech, Harry Powell, Head of Advanced Data Analytics at Barclays* Analytics Centre of Excellence, took a more practical view on how big data, machine learning, and automation can function in banking.
There’s no clean slate for Powell to work with: Barclays has existed for 327 years, and even the bank’s less antiquated information systems carry numerous legacy businesses and masses of legacy data.
“I think we have 248 different systems, most of which now integrate with each other. So while everyone in the bank gets the idea that we have data, we can connect it together and there’s clever stuff you can do with it, the problem is always domain,” he says. “To make things happen in an old organisation takes a lot more because of where we start.”
“Big data analytics is coming into this field with a bigger view on risk”
Similarly circumspect is Clemens Baader, Chief Analytics Officer of 4finance*, an online and mobile consumer lending organisation that processes its data to avoid perceived biases, helping to make financial services available in emerging markets.
“Any machine learning has input factors and anything that’s ethically questionable we would not have it as an input. It’s not part of the algorithm,” he says, stressing that although ML can make decisions easier for managers, “We don’t want to be in a situation where machine learning makes a judgement. That’s where privacy and common sense come into play.” It’s the lack of accountability in black box decision making that irks Baader: “‘Oh, the model decided,’ is not a good answer. It’s not one you can get away with,” he says.
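In practice, the approach Baader describes amounts to stripping questionable fields from the data before it ever reaches an algorithm. A rough sketch in Python makes the idea concrete; the field names and excluded inputs here are invented for illustration, not 4finance’s actual schema or policy:

```python
# Illustrative sketch only: field names are hypothetical, not 4finance's schema.

# Inputs a model is never allowed to see, however predictive they might be.
EXCLUDED_INPUTS = {"gender", "postcode"}

def permitted_features(application: dict) -> dict:
    """Strip ethically questionable fields before they reach any algorithm."""
    return {k: v for k, v in application.items() if k not in EXCLUDED_INPUTS}

application = {
    "income": 2500,
    "existing_debt": 400,
    "gender": "f",        # excluded: protected characteristic
    "postcode": "A1 2B",  # excluded: can act as a proxy for protected traits
}

print(permitted_features(application))
# {'income': 2500, 'existing_debt': 400}
```

Because the exclusion happens at the input stage, the model cannot base a judgement on those factors even indirectly at training time, which is what makes “the model decided” an answerable question rather than a black box shrug.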
Dan Somers, CEO of Warwick Analytics*, felt there was a need to be more open to exploring analytics initiatives and to experiment. He remarked: “This is not so much technical, it’s more about mindsets. It’s about people trying to justify a business case to do something that doesn’t have a business case. People who are in big organisations that are able to do this are very crafty and clever – doing something small, making a big impact and building on it.”
He may well have a point, as Powell later described ongoing work at Barclays on financial vulnerability. It involves a shift away from a monthly perspective to 12 or 18 data points on income and expenditure, a level of granularity that risk assessment historically hasn’t investigated.
“Data science and big data analytics are coming into this field with a slightly bigger view on risk,” says Powell. “This is not just about saving the bank from losing money but helping our customers make smarter decisions to improve their lives.”
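To see why that finer granularity matters, consider a toy example. The figures below are entirely invented, not Barclays data: a customer whose monthly net position looks healthy, but whose within-month running balance dips into the red before a second payment arrives. A single monthly figure averages that vulnerability away; multiple data points per month expose it.

```python
# Illustrative sketch: all transaction data below is invented, not Barclays'.
from datetime import date

# A month of (date, amount) transactions; negative amounts are spending.
transactions = [
    (date(2017, 1, 2), 1800.0),   # salary
    (date(2017, 1, 3), -650.0),   # rent
    (date(2017, 1, 10), -500.0),  # bills
    (date(2017, 1, 15), -700.0),  # car repair
    (date(2017, 1, 25), 420.0),   # second income
]

# Monthly view: one net figure, which looks comfortably positive.
monthly_net = sum(amount for _, amount in transactions)

# Finer-grained view: the running balance after each transaction
# reveals the mid-month dip the monthly total hides.
balance, running = 0.0, []
for day, amount in sorted(transactions):
    balance += amount
    running.append((day, balance))

print(monthly_net)                  # 370.0 - looks fine
print(min(b for _, b in running))   # -50.0 - overdrawn on 15 January
```

The monthly total says this customer ended January £370 up; the running balance shows they were briefly overdrawn mid-month, which is exactly the kind of signal a vulnerability-focused view of risk is after.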
You never know, banks undergoing a transformational approach to risk could be just the sort of progress that allows tomorrow’s ingenious analytics start-up to add an extra Intel® Xeon® processor E5 server to its shopping list.
Indeed, Intel has established a hardware and software portfolio that is ideally suited to making the sorts of connections that financial services are building new business upon. The Saffron Natural Intelligence Platform SDK for developers and data scientists is used by major players across industry to detect patterns in data that can be used to manage risk, make predictions, and identify trends to develop new products.
Intel® Xeon® and Intel® Xeon Phi™ processor servers are already a match for big data and analytics, but that’s not all. Intel’s acquisition of Nervana Systems* last year brings to the table a new AI chip architecture specifically designed for deep learning. And when it comes to predictions, expect to hear more about Intel’s optimised AI solutions for enterprise in 2017.
*Trademarks are the property of their owners