It has been going on for decades on the mainframe.
The future is about doing it better in multi-platform environments –
and in real time.
Given the meteoric rise of smartphones, the increasing inter-connectivity between previously non-connected devices (IoT), and more business-relevant data becoming available than ever before, you need to ask yourself: “Okay, just how exactly do I find the proverbial needle in the haystack?”
That needle is business-specific. In Finance: finding the elusive alpha and then making smarter trades before the competition does. In Insurance: preventing a fraudulent transaction before it gets processed for payment, or providing appropriate premium rates for “just in time/as needed” insurance. In Healthcare: coming up with quicker and more accurate diagnoses, enabling patients to get treatment before physical effects manifest and preventing the unnecessary progression of an illness before it is literally too late.
Concerns like these are making previously distinct disciplines strike up new (and not always comfortable!) partnerships – just look at how, in Finance, traditional “fundamental-based” trading operations and quantitative trading firms are joining forces.
It seems that both sides of this particular coin have, perhaps begrudgingly, agreed that they need to respect each other, figure out how best to co-operate, and come up with strategies that will work more often than not. These so-called “Quantamental” approaches are being adopted by such luminaries as BlackRock Inc., Point72 Asset Management, Third Point and Tudor Investment – as well as Man Group.
Indeed, more and more Wall Street firms have gone so far as to actively pursue Machine Learning, developing advanced statistical routines to ensure they do not fall foul of the SEC’s soon-to-be-implemented regulations for liquidity risk models.
Similarly, Healthcare and Insurance providers are increasingly focusing on how best to amass the huge amounts of data now available to them, and on how to analyze and modify their current (dated) business and/or treatment strategies.
In Wall Street trading environments, for example, the traditional reliance upon human skills – research, face-to-face meetings and that (as yet…) un-programmable “gut feel” – doesn’t always provide the consistency of old. This is especially true when measured against tireless machines grinding away at ever-increasing amounts of data, identifying previously unknown patterns and/or sentiment, and then generating (sometimes huge) numbers of trades in literally nanoseconds.
The volumes of data involved in these circumstances can be huge. Firms need access to Research, Exchange Rates, insights from emails/Social Media – possibly voice calls and instant messaging too.
Some banks are already experimenting with how best to deal with these challenges, and seem to be converging on a common pattern: get all of the data into one place, and then apply Machine Learning techniques to make smarter, better-informed decisions.
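As a toy illustration of that “consolidate first, then learn” pattern – where every feed name, field and figure below is invented for the example, and a simple nearest-centroid rule stands in for a real Machine Learning library – the sketch merges records from two mock transaction feeds into one table and then flags a suspicious-looking transaction:

```python
# Toy sketch: consolidate two mock data feeds into one table, then apply a
# simple nearest-centroid classifier. All feed names, fields and figures are
# invented for illustration; a real pipeline would use a proper ML library.

def consolidate(*feeds):
    """Merge records from several feeds into one list with a common schema."""
    merged = []
    for feed in feeds:
        for rec in feed:
            merged.append({"amount": float(rec["amount"]),
                           "velocity": float(rec["velocity"]),
                           "label": rec["label"]})
    return merged

def centroids(rows):
    """Compute the mean (amount, velocity) point for each label."""
    sums = {}
    for r in rows:
        a, v, n = sums.get(r["label"], (0.0, 0.0, 0))
        sums[r["label"]] = (a + r["amount"], v + r["velocity"], n + 1)
    return {lbl: (a / n, v / n) for lbl, (a, v, n) in sums.items()}

def classify(model, amount, velocity):
    """Assign the label of the nearest centroid (squared Euclidean distance)."""
    return min(model, key=lambda lbl: (model[lbl][0] - amount) ** 2 +
                                      (model[lbl][1] - velocity) ** 2)

# Two mock feeds that previously lived in separate systems.
card_feed = [{"amount": 40, "velocity": 1, "label": "ok"},
             {"amount": 9000, "velocity": 12, "label": "fraud"}]
wire_feed = [{"amount": 55, "velocity": 2, "label": "ok"},
             {"amount": 8700, "velocity": 10, "label": "fraud"}]

model = centroids(consolidate(card_feed, wire_feed))
print(classify(model, amount=8500, velocity=11))  # prints "fraud"
```

The point of the sketch is the shape of the pipeline, not the model: only once the card and wire feeds share one schema in one place can any learning step see patterns that span both.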
But before we boldly go where no man (or machine!) has gone before – are you sure that your existing systems are performing adequately and that you’re not wasting valuable system resources? If not, you really should take a look at the recent proliferation of new IT tools that allow non-technical personnel to replace the oftentimes “geeky” labels used to define various database fields with their own – and start to imagine how to truly optimize things and generally become more effective.
A mixture of the best “brawn” (the tried and tested) and “brains” (the new insights garnered from Artificial Intelligence/Machine Learning), when combined strategically, could give firms the opportunity to really stand out from the old, traditional (and tired) approaches of yesteryear.
Think about this – if you can make better business decisions, manage risk more proactively, profit from others’ inefficiencies, lower your costs of entry into new markets, potentially reduce your staff headcount and reduce the costs of human error – then why not at least check things out – or risk becoming the victim rather than the victor?
If you’re still uncertain as to whether you want to be a pioneer or a laggard – maybe you should consider forming consortia with other peers in your segment. Such an approach can significantly defray the costs of your going it alone – although you’ll be unlikely to realize the full (potential) rewards from being the lone pioneer.
Maybe you should consider a hybrid approach – e.g. one where a select few carefully chosen peers fund the initial development/IT costs, with a view to then jointly marketing and providing the services to others on a future royalty basis that remunerates the “founding fathers” of the consortium.
Whatever you decide, always ensure that you keep abreast of the latest business and regulatory developments – and, first and foremost, of the cybersecurity and technical developments – and of how to get seemingly disparate data from various (increasingly mobile) device sources, in real time, to quickly gain the upper hand on your competitors.
The days of looking back at “stale” data are all but gone. Today’s most successful Wall Street AI trading strategies, for example, utilize up-to-the-second (or faster!) data from such seemingly diverse and unrelated sources as satellites, transport systems and patterns, traffic conditions and more. The only way to make good use of it all is to ensure your systems can handle both the volume and the oftentimes unstructured nature of the various data formats.
Even more importantly, the successful companies will be the ones that figure out how to capture these disparate datasets in real time and consolidate them into one environment for more accurate and timely results.
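A minimal sketch of what that capture-and-consolidate step can look like in practice – with the source names, payload formats and field names all invented for illustration – is a normalizer that maps each incoming record, whatever its shape, onto one common schema before anything downstream sees it:

```python
# Toy sketch: normalize heterogeneous real-time records into one common schema.
# The source names, payload formats and field names are invented for
# illustration; real feeds would each need their own mapping.
import json

def normalize(source, raw):
    """Map a source-specific record onto a common (source, ts, value) schema."""
    if source == "satellite":          # JSON payload with an epoch timestamp
        rec = json.loads(raw)
        return {"source": source, "ts": int(rec["epoch"]), "value": rec["reading"]}
    if source == "traffic":            # CSV-style line: timestamp,count
        ts, count = raw.split(",")
        return {"source": source, "ts": int(ts), "value": float(count)}
    raise ValueError(f"unknown source: {source}")

# Two records arriving in different formats from different feeds.
stream = [("satellite", '{"epoch": 1700000000, "reading": 0.87}'),
          ("traffic", "1700000001,412")]

# Once unified, everything can be ordered and analyzed on one timeline.
unified = sorted((normalize(s, r) for s, r in stream), key=lambda r: r["ts"])
print([r["source"] for r in unified])  # prints ['satellite', 'traffic']
```

The design choice worth noting is that all format-specific knowledge lives in one mapping function at the edge; everything downstream – storage, analytics, any learning step – only ever deals with the single consolidated schema.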