Information Arbitrage… Better Served With Analytics
Few industries rely on data as much as investment banking. From the early days of the trading floor pits, with hand signals flashed back and forth (known as open outcry), to the vast amounts of data flashing across hundreds of thousands of Bloomberg terminals around the world, data and information are king.[/vc_column_text][vc_single_image image=”26878″ img_size=”full” el_class=”responsive-img”][vc_column_text]
There are many more facets to an investment bank than just its trading (e.g. corporate finance, research), but they all rely on data in some way, shape or form. However, getting access to the right types of data, and doing so in a timely manner, has been a challenge. The problem mainly comes down to the fact that many workers lack the skills or knowledge to tackle their data problems in a better way.
Many people I’ve spoken to in the industry believe that only data scientists or quantitative analysts have the know-how to work with data, but this is a myth that most certainly needs busting.
Additionally, some organisations in the investments space have already heeded the call and begun adopting better reporting and analytics tools. Many have not, however, so the widespread use of tools like Power BI, Qlik and Tableau across an organisation can still be transformative if applied in the right way.[/vc_column_text][vc_single_image image=”26876″ img_size=”full” el_class=”responsive-img”][vc_column_text]
Various data analytics tools.
Connecting your data sources
One of the ways your organisation can make better use of its datasets is simply by connecting databases or sources of information that may previously have been unconnected. When they sit separately, they require more time to give your users anything insightful. When a number of datasets are connected, by contrast, users can locate the information they need much faster, which saves time and money.
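As a minimal sketch of the idea, here is a join of two previously separate customer datasets using pandas (the data, column names and figures are all hypothetical, invented for illustration):

```python
import pandas as pd

# Hypothetical extracts from two previously unconnected systems
crm = pd.DataFrame({
    "customer_id": [101, 102, 103],
    "name": ["Acme Fund", "Borealis Capital", "Cedar Partners"],
})
trades = pd.DataFrame({
    "customer_id": [101, 101, 103],
    "notional": [250_000, 120_000, 75_000],
})

# One join gives a combined snapshot that previously required
# separate lookups in separate systems
snapshot = crm.merge(trades, on="customer_id", how="left")
total_by_client = snapshot.groupby("name")["notional"].sum()
print(total_by_client)
```

Once the sources are joined like this, a question such as “total notional per client” becomes a one-line aggregation rather than a manual cross-referencing exercise.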
A good example of this was when I was at Canaccord Genuity in London, where we built dashboards that connected different sources of customer information to provide a more complete snapshot of each client. The vision was that this would lead to a much better ability to provide unique and bespoke services than before. Instead of relying on the few users who knew what to look for and where, we empowered more of our team by making this information available to them on their own machines.[/vc_column_text][vc_video link=”https://www.youtube.com/watch?v=WASBBR0E974″][vc_column_text]
Example of Qlik Sense being used to connect to Google Analytics data which tracks usage details on your apps.
There are a number of benefits to be gained just by connecting your data, and you might already know this but simply not have had the tools to embark on that journey. Depending on the complexity of your datasets and the solution required, there are very likely tools out there that can help. In our situation at the time, the requirements were simple and we could use QlikView to extract, transform and load the data into a number of dashboards and visualisations. For more complex work, a client may need advanced tools like Alteryx to do their data preparation before it is loaded into an application.[/vc_column_text][vc_single_image image=”26875″ img_size=”full” alignment=”center”][vc_column_text]
Using Alteryx to manage a data flow.
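The extract-transform-load pattern behind tools like QlikView and Alteryx can be sketched in a few lines of Python. This is a toy illustration only, not how either product works internally, and the input data is invented:

```python
import pandas as pd

def run_etl(raw: pd.DataFrame) -> pd.DataFrame:
    """A toy transform step: normalise column names, drop rows
    with missing prices, and derive a notional field."""
    tidy = raw.rename(columns=str.lower).dropna(subset=["price"])
    tidy["notional"] = tidy["price"] * tidy["quantity"]
    return tidy

# Hypothetical raw extract with inconsistent casing and a gap
raw = pd.DataFrame({
    "Price": [101.5, None, 99.0],
    "Quantity": [10, 5, 20],
})
print(run_etl(raw))
```

The value of dedicated data-preparation tools is that steps like these become visual, repeatable workflows rather than code that only one person understands.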
Better tools can also unlock further value in investment banks through software that automates how data is processed, or that makes it easier to create reports spanning large amounts of data. Typically this takes the form of a dashboard or visual report, and whilst the typical analyst tools of choice like Excel, or even R/Python for the more advanced, prove quite useful, they all have limitations.[/vc_column_text][vc_single_image image=”26879″ img_size=”full” alignment=”center” el_class=”responsive-img”][vc_column_text]
Python code example courtesy of GitHub.
Excel can only handle a limited amount of data before files become unmanageable. R and Python can batch process vast amounts of data quickly, but they don’t give users much flexibility with filtering and are not designed as reporting tools. Business intelligence tools, on the other hand, solve this: they are built to handle very large amounts of data and give users the flexibility to filter on a number of dimensions to get the answers they need. They even integrate with R and Python, giving users the best of both worlds.
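To make the multi-dimension filtering point concrete, here is what slicing a dataset on several dimensions at once looks like in pandas (the trade blotter below is entirely hypothetical; a BI tool would expose the same filters as clickable panes rather than code):

```python
import pandas as pd

# Hypothetical trade blotter with two filterable dimensions
trades = pd.DataFrame({
    "desk":     ["Equities", "Equities", "Rates", "Rates", "FX"],
    "region":   ["EU", "US", "EU", "US", "EU"],
    "notional": [100, 250, 400, 150, 80],
})

# Filtering on several dimensions at once, the way a dashboard
# user would click through filter panes
view = trades.query("desk == 'Equities' and region == 'US'")
print(view)
```

The difference with a BI tool is that end users get this slicing interactively, without writing the query themselves.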
In my work during that time, I looked to move any repeated Excel process into a business intelligence tool.[/vc_column_text][vc_single_image image=”26877″ img_size=”full”][vc_column_text]
An example of correlation analysis being done in Microsoft Excel – this is limited to single-stock analysis, and with multiple stocks the files get quite large.
For example, for a particular dataset I’d get from Bloomberg, I would create a spreadsheet that extracted the data (using Bloomberg’s Excel API) and then use a BI tool like Qlik Sense, built to ingest that information, to perform a number of calculations on the dataset. Because it was in Qlik Sense, it was easy for me to explore the data through a number of different filters, and the visualisations could dynamically surface new insights each time I refreshed the information.
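The multi-stock correlation case that strains Excel is a one-liner once the data is in a frame. In this sketch the daily returns are randomly generated stand-ins with made-up tickers; in practice they would come from a Bloomberg extract:

```python
import numpy as np
import pandas as pd

# Hypothetical daily returns for four stocks over a trading year;
# a real workflow would load these from a Bloomberg extract
rng = np.random.default_rng(0)
returns = pd.DataFrame(
    rng.normal(0, 0.01, size=(250, 4)),
    columns=["AAA", "BBB", "CCC", "DDD"],
)

# One call produces the full pairwise correlation matrix -
# the multi-stock case that single-stock spreadsheets struggle with
corr = returns.corr()
print(corr.round(2))
```

Adding a fifth or fiftieth stock just means adding a column, rather than rebuilding a spreadsheet.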
Automating tasks like this saves a large amount of time for even one employee, so imagine the benefits of making it available to many more. More importantly, your data can yield a whole new world of insights that would previously have been impossible to glean without analytics software.
Another development in the world of investment analysis is the rise of self-service analytics. This is important because instead of having to wait for your business analysts or IT team to send you the data (guided analytics), there are now mechanisms that let you create your own reports and access data much more easily and quickly (self-service analytics).
In the best self-service scenarios, end users can access the data they need without writing any of the expressions or code behind a report, while at the other end of the benefit spectrum, the business analysts and IT teams no longer have to respond to every request for new information.[/vc_column_text][vc_single_image image=”26926″ img_size=”full”][vc_column_text]
An example of how easy it is to build a self-service filter driven report with charts in Qlik Sense.
In my early days at QMG, I would be asked to create various types of analysis and reports for our analysts. What helped was building an internal system in QlikView (an older product) that allowed our users to select the dimensions and measures they needed for their report. This self-service application let users get data when they needed it and freed me to focus on more advanced forms of analytics on our data. That would have happened much more slowly had I serviced each and every request myself.
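The core of a dimension-and-measure picker like that can be illustrated in a few lines. This is a toy stand-in for the QlikView system described above, not its actual implementation, and the dataset is invented:

```python
import pandas as pd

# Hypothetical dataset sitting behind a self-service report builder
data = pd.DataFrame({
    "sector":  ["Tech", "Tech", "Energy", "Energy"],
    "region":  ["EU", "US", "EU", "US"],
    "revenue": [120, 300, 90, 150],
})

def build_report(dimension: str, measure: str) -> pd.DataFrame:
    """The end user only picks names from a list; the grouping
    expression is generated for them, with no code to write."""
    return data.groupby(dimension, as_index=False)[measure].sum()

print(build_report("sector", "revenue"))
```

Swapping `"sector"` for `"region"` produces a different report from the same data, which is exactly the flexibility that stops every new cut of the numbers becoming a request to the analytics team.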
Ecosystems like this can be delivered by the BI tools on the market, though some are a better choice for the job than others. It depends on each organisation’s situation, but all these tools do their bit in helping companies reach a stage where employees, and their output, are much enhanced by having access to data.
Putting it all together
So how does a business even get started on building these capabilities? A common mistake is the belief that they require expensive software delivered by expensive consultants, with expensive rework whenever changes need to be made. The truth is far from this: many organisations I saw in Europe and the US gained significant benefits from working with better analytical tools. These tools enabled the enhanced collection and scientific analysis of data, yielding insights that would previously have been impossible to glean.
The best way to begin a journey like this is through investigation, and this is helped along when the consultants you work with are agnostic in their approach to software choice and data strategy. Such consultants help businesses succeed because they find the best way to solve data problems within the client’s budget and strategy.
At ABM, we aim to provide similar value across our various services and through the consultants who are out in the field. We are upfront about our process of reviewing your unique situation, and our recommendations don’t push the agenda of any one type of software. Rather, we work with you to ensure that the data solutions are fit for purpose and give you great value. Better yet, we deliver a low-cost review and recommendation package that lets you clearly see the roadmap of what could be done with your data and how you can benefit more than you currently do.[/vc_column_text][/vc_column][/vc_row][vc_row][vc_column width=”1/2″][vc_column_text]
ABM Systems[/vc_column_text][vc_single_image image=”26834″ img_size=”full” onclick=”custom_link” link=”https://www.linkedin.com/in/markmonfort/”][/vc_column][vc_column width=”1/2″][/vc_column][/vc_row]