Investment management both relies on and creates enormous amounts of data. Trading platforms, prime brokers, custodians, index providers–these are just a few of the myriad data pipelines feeding investment firms every day. Then there are portfolio management tools, performance measurement systems, and websites like Morningstar and Lipper that monitor investment managers’ products. In all, a typical investment manager can use up to 50 data sources every day.
This means investment firms face a big problem: how to get all these data sources, many of them recent additions, to play nicely with their legacy apps. In a landscape of rapidly growing data volumes, increasing fee compression, and tight regulation, these competing interests present a huge challenge.
Why has data analytics become so important in the finance industry?
Finance has digitized rapidly in recent years, and data now plays a more crucial role in the services that firms provide to their clients. Moreover, clients and regulators both increasingly demand more transparency–especially around ESG (Environmental, Social, and Governance) data as well as traditional factor data.
Taken together, the rapid growth of data and the rising demand for visibility into the back office mean that many firms’ technology has not caught up to the mandate. High-priority tasks like reporting and delivering client insights depend on the ability to swiftly pull, analyze, and present this data. Often, managers’ tools aren’t up to the task.
Why is data analytics a challenge for the finance industry?
With data streaming in from various sources, keeping it all organized and accessible becomes a gargantuan challenge on its own. A firm may have data coming in from investment management software, portfolio analytics tools, market reporting tools, a CRM, and even email. Often, this data is scattered across separate spreadsheets and word-processing docs saved in a traditional document storage solution–most likely just a file structure–and usually in several different formats.
This organizational “system” was never meant to interface with the diverse source data now so crucial to investment operations. Consequently, each source operates as a data silo, its information accessible only to the department that uses it. For managers, it means manually gathering data and plugging it into reports, a time- and labor-intensive activity. It also makes collaboration between teams chaotic and creates huge opportunities for human error.
Then there’s Compliance
Investment managers must routinely meet constantly shifting compliance benchmarks. Among them is keeping tabs on exactly what client data the firm stores, where it is, and who can access it.
As many finance professionals know, failure to meet any compliance requirement is a particularly costly mistake. Identifying potential compliance weaknesses before violations occur is a top priority for many firms, but the shifting territory and the disconnectedness of data tools and sources translates to an alarming amount of work just to stay afloat.
And none of this addresses the very real problem of internal checks and balances: who changed what, and when? In a mixed system of legacy applications, static file structures, and more modern data sources, answering these questions is thorny at best.
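To make “who changed what, and when” concrete, even a simple append-only audit log kept alongside existing tools can answer those questions. The sketch below is illustrative only–the field names, record identifiers, and change descriptions are hypothetical, not a reference to any particular product:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AuditEntry:
    user: str        # who made the change
    record: str      # which report or holding was touched (hypothetical ID)
    change: str      # short description of what changed
    at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

class AuditLog:
    """Append-only log: entries are added, never edited or removed."""

    def __init__(self):
        self._entries = []

    def record_change(self, user, record, change):
        self._entries.append(AuditEntry(user, record, change))

    def history(self, record):
        """Answer 'who changed this record, what, and when?'"""
        return [(e.user, e.change, e.at)
                for e in self._entries if e.record == record]

log = AuditLog()
log.record_change("analyst_1", "Q3-performance-report", "updated benchmark returns")
log.record_change("manager_2", "Q3-performance-report", "approved final figures")
for user, change, at in log.history("Q3-performance-report"):
    print(user, "-", change)
```

The design choice that matters here is the append-only rule: because entries are never edited in place, the log itself becomes the compliance trail.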
The Way Forward
As with many problems, it’s tempting to suggest this or that new technology will clear everything up. But that response neglects the tightening belt that got investment managers here in the first place. Legacy apps are still in use because training, implementation, and maintenance are costly.
Instead of throwing new technology at data problems, look at the most important data processes you have, the ones most critical to your daily operations, and make them work better for you.
Compartmentalizing projects means you can streamline one process at a time while planning for the long term. For example, spinning up a data warehouse to ingest and store data coming from your record keepers, then plugging that into your most used reporting interface, can save an immense amount of time in just one process, while also setting the stage for future processes to join.
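As a minimal sketch of the warehouse idea above–the table name, file layout, and column names are all hypothetical, and a real implementation would ingest files delivered by your record keepers rather than an inline string–a nightly job might load extracts into a small SQL store that your reporting interface then queries:

```python
import csv
import io
import sqlite3

# Hypothetical record-keeper extract; in practice this arrives as a
# nightly CSV file, and the columns here are illustrative only.
RECORD_KEEPER_CSV = """account_id,asset_class,market_value
A-100,Equity,250000.00
A-100,Fixed Income,100000.00
A-200,Equity,500000.00
"""

def ingest(conn, csv_text):
    """Load one extract into a simple 'positions' warehouse table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS positions ("
        "account_id TEXT, asset_class TEXT, market_value REAL)"
    )
    rows = csv.DictReader(io.StringIO(csv_text))
    conn.executemany(
        "INSERT INTO positions VALUES "
        "(:account_id, :asset_class, :market_value)",
        rows,
    )
    conn.commit()

def report_by_account(conn):
    """The kind of query a reporting interface would run against the store."""
    return conn.execute(
        "SELECT account_id, SUM(market_value) FROM positions "
        "GROUP BY account_id ORDER BY account_id"
    ).fetchall()

conn = sqlite3.connect(":memory:")
ingest(conn, RECORD_KEEPER_CSV)
print(report_by_account(conn))  # [('A-100', 350000.0), ('A-200', 500000.0)]
```

The point is not the specific tools–any database and any reporting front end will do–but that once one feed lands in a single queryable store, each additional feed you add joins the same store instead of creating another silo.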
Or, look at the legacy apps you rely on most and innovate ways to make them better. Often, investing a small amount–whether it’s in consulting fees, outsourced development, or light third-party add-ins–can stretch the value of legacy software for years to come.
Ultimately, your goal should be to break down silos, consolidate data, and bring the tools you use to access and manage the data much closer to the sources. Where firms often go wrong is thinking they have to completely overhaul everything at once, throwing out all legacy software and replacing it with something new. But not only is that a herculean endeavor, it disregards years of knowledge using tools that, while possibly less modern, are nonetheless completely viable given the right approach to technology.
This is the perspective we take when considering any technology problem, and especially complex data consolidations and integrations. There is often a huge amount of training and knowledge preceding us, so we look to that knowledge and process as a guide for how best to reduce the pain of data management. Frequently, the best way forward is not kicking one app out in favor of another, but instead making an app in which you’ve already invested work better for everyone.
Bill Erickson is Interject’s Director of Communications. Contact him or the Interject Solutions Team at info@gointerject.com