Are fund selectors sitting on a vast data treasure trove?
The majority of investment research and due diligence concludes with a qualitative recommendation. No matter how much structured quantitative data is provided, the process of refining information into actionable insight is done through qualitative enrichment, created by the analysts at the investing firm. This process itself creates a highly valuable data set in the form of notes, reviews, and findings that sit in databases or files and are rarely utilized to their full potential.
This treasure trove of internal data could be used to continuously improve the investment model, if only it were structured in an optimal way through a proper data journey. This article shares best practices and elaborates on the steps of an investor's data journey when assessing new fund managers or monitoring their current portfolio. It also covers the challenges investors face, areas for potential improvement, and the role of technology in transforming this process.
What does the fund selector’s data journey look like?
The data journey goes through five main components: sourcing, consolidating, structuring, analyzing, and reporting the data.
Here is an overview of the key elements to be considered when constructing an efficient data governance project to establish better investment decisions.
- Data Sourcing: Collecting data from different sources (e.g., directly from managers, from the investor's own staff, from specialized news channels, from data providers, emails, etc.).
- Consolidation: Storing relevant data in a single place and being able to manipulate different types of data from one point of access rather than from emails and dispersed folders.
- Structure: There are two types of data:
- Structured data, which is often clearly defined, standardized, and classified.
- Unstructured data, which is often sitting in files and cannot be easily processed or analyzed.
- Analysis: Investors analyze the data to extract useful information necessary to make decisions (e.g., calculations and comparisons).
- Reporting: Based on internal or external stakeholder needs, reports are needed in a document format. The report typically outlines key characteristics and recommendations based on the analysis done.
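As a minimal sketch, the five stages above can be modeled as a simple pipeline. All sample records, field names, and the digit-based classification rule here are hypothetical illustrations, not a production design:

```python
from dataclasses import dataclass

# Hypothetical raw inputs from different sources (managers, analyst notes, emails).
raw_inputs = [
    {"source": "manager", "text": "AUM: 250 (USD m); strategy: long/short equity"},
    {"source": "analyst_note", "text": "Key-person risk flagged during onsite visit"},
]

@dataclass
class Record:
    source: str
    category: str   # classification added during the structuring stage
    content: str

def consolidate(inputs):
    """Consolidation: bring all sources into one uniform list of records."""
    return [Record(source=i["source"], category="unclassified", content=i["text"])
            for i in inputs]

def structure(records):
    """Structuring: tag each record so it can later be filtered and analyzed.
    (Toy rule: anything containing a digit is treated as quantitative.)"""
    for r in records:
        r.category = "quantitative" if any(c.isdigit() for c in r.content) else "qualitative"
    return records

def analyze(records):
    """Analysis: e.g., count how much of the data is qualitative vs quantitative."""
    counts = {}
    for r in records:
        counts[r.category] = counts.get(r.category, 0) + 1
    return counts

def report(counts):
    """Reporting: render the analysis for stakeholders."""
    return ", ".join(f"{k}: {v}" for k, v in sorted(counts.items()))

records = structure(consolidate(raw_inputs))
print(report(analyze(records)))  # → qualitative: 1, quantitative: 1
```

In practice each stage would be far richer, but the point stands: once the stages are connected, any note captured at sourcing time flows automatically through to analysis and reporting.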
What are the challenges of digging through the data avalanche?
The main challenges allocators face relate to the following:
- Data volume: Too much data is being collected and left sitting in emails or shared folders. In many cases, much of it is either unnecessary or cannot meaningfully be classified as data.
- Unstructured data: Unstructured data makes analysis tedious, and poor data quality makes any analysis inaccurate. While structured quantitative data gives you a bird's-eye view of fund managers, unstructured qualitative data can provide asset owners with a much deeper understanding of a fund manager's behavior and performance. Finding the insight buried within unstructured data is not an easy task.
- Technology tools: Failing to implement proper tools to manage the whole data journey, or implementing multiple disconnected tools, makes investors lose control and impacts efficiency.
How to get started on improving your data journey?
Many questions should be asked before and during the journey.
- How should collected data be structured, how can its quality be improved, and how can this be done easily and at scale?
Some companies attempt to do this manually, assigning employees to the cumbersome task of reviewing and categorizing unstructured data points. But manual data processing is prone to human error and is time-consuming.
- When should we start structuring the data?
Ideally, data should be structured during the collection phase.
- What are we collecting and why?
Not all data is equal. Many investors collect any and all types of data, which is stored and often unused. Investors might be asking questions and collecting data that are irrelevant to the process. This increases chaos and makes retrieving the relevant parts more complicated. It's not about eliminating “nice to haves” from the equation, but about proper data management. Avoiding unnecessary tasks reduces the noise associated with the inputs and outputs.
Only after completing the sourcing, structuring, and quality assurance can analysis be conducted. But how can this be achieved, and how can it be easily adjusted to new requirements or new ideas?
- Top-down or bottom-up approach?
Collecting data just for the sake of doing it is ineffective. Missing valuable data points within the sourcing process will make the process incomplete.
The data collection and structuring should be driven by reporting and analysis needs. Important questions to ask here are:
- Who is the audience?
- What is the relevant data?
- How will it be presented?
- How will it be generated easily?
The process should be a dynamic and continuous cycle. Reporting needs will drive the definition of the sourcing. The more sources are utilized, the more analysis and reporting can be accomplished.
How do such considerations impact your choice of technology tools?
You might think that speeding up the collection process is the key, but the mission only starts here. As we mentioned earlier, the process is cumbersome and involves recurring tasks; therefore, it is only part of the equation.
The solution is to implement a holistic approach that allows the data to move seamlessly from the sourcing to the reporting.
Connectivity starts when you source data through your due diligence process. Although users often focus on the automation element of due diligence, that is secondary. What matters most is connecting the data from the start so you can derive insights on the back end and generate reports.
An end-to-end solution is key to a data transformation project.
Structure from the Source
Investors should structure data from the start. These activities include:
- Predefining the type of data collected (e.g., numerical, text, multiple-choice, etc.)
- Setting up validation rules (e.g., red flags or a scoring system)
- Mapping the data in a flexible way to the fund profile
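The three activities above can be sketched as a small schema: each field predefines its type, carries a validation rule that raises a red flag, and maps to a section of the fund profile. All field names, thresholds, and the `SCHEMA` layout here are hypothetical, not a vendor API:

```python
# Hypothetical due diligence schema: predefined field types, validation
# rules (red flags), and a flexible mapping to fund-profile sections.
SCHEMA = {
    "aum_usd_m": {
        "type": "numerical",
        "profile_section": "firm_overview",
        # Red flag if assets under management fall below a (made-up) threshold.
        "red_flag": lambda v: v < 100,
    },
    "has_independent_admin": {
        "type": "multiple_choice",
        "choices": ["yes", "no"],
        "profile_section": "operations",
        "red_flag": lambda v: v == "no",
    },
    "key_person_notes": {
        "type": "text",
        "profile_section": "team",
        "red_flag": lambda v: "departure" in v.lower(),
    },
}

def validate(response):
    """Apply each field's validation rule; return the red flags raised."""
    return [field for field, value in response.items()
            if SCHEMA[field]["red_flag"](value)]

def map_to_profile(response):
    """Map validated answers into the fund profile, grouped by section."""
    profile = {}
    for field, value in response.items():
        section = SCHEMA[field]["profile_section"]
        profile.setdefault(section, {})[field] = value
    return profile

answers = {"aum_usd_m": 80, "has_independent_admin": "yes",
           "key_person_notes": "Stable senior team, no changes"}
print(validate(answers))  # → ['aum_usd_m']
```

Because the rules live in the schema rather than in the analysts' heads, a new question or a changed threshold is a one-line edit, and every answer arrives already typed, validated, and mapped.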
Future-Proofing Due Diligence
Flexibility is the vital element, as requirements will keep evolving; it’s the best way to future-proof your process.
The due diligence process isn’t static, and you should expect your data structuring process to evolve. Once it’s in place, you’ll need to keep an eye on how it’s working, where it adds value, and where it falls short of your expectations.
Regularly reviewing what is included in the process is a good practice that can improve the efficiency of all the tasks.
Implementing a technology tool that adapts quickly to changes in the process is a crucial factor in the decision to carry out a data transformation project.
A sound data management framework provides a format to conduct top-down and bottom-up analysis.
It’s essential to remain flexible and adjust your process as you learn what works best for your firm.
Maximize the Value of the Information You Collect
Structuring your qualitative data can yield valuable insights about risks and opportunities, and you don’t have to do it alone.
The right tool can help you automate the process by designing a system in which information flows directly from your firm’s investment questionnaires into a CRM database, where it can then be analyzed, benchmarked, and compared.
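As a rough illustration of that flow, questionnaire answers can land directly in a database table where managers are then benchmarked against peers. The table layout, fund names, and fee figures below are invented for the example (using Python's built-in `sqlite3` as a stand-in for a CRM database):

```python
import sqlite3

# Hypothetical flow: questionnaire answers flow straight into a database
# table, where managers can then be compared and benchmarked.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE responses (manager TEXT, question TEXT, answer REAL)")

# Invented questionnaire responses: each manager's management fee in percent.
questionnaires = [
    ("Fund A", "mgmt_fee_pct", 1.5),
    ("Fund B", "mgmt_fee_pct", 2.0),
    ("Fund C", "mgmt_fee_pct", 1.2),
]
conn.executemany("INSERT INTO responses VALUES (?, ?, ?)", questionnaires)

# Benchmark: compare each manager's fee against the peer average.
avg = conn.execute(
    "SELECT AVG(answer) FROM responses WHERE question = 'mgmt_fee_pct'"
).fetchone()[0]
above_avg = conn.execute(
    "SELECT manager FROM responses WHERE question = 'mgmt_fee_pct' AND answer > ?",
    (avg,),
).fetchall()
print(round(avg, 4), [m for (m,) in above_avg])  # → 1.5667 ['Fund B']
```

Once the answers sit in one queryable store rather than scattered documents, benchmarking a new manager against the peer group becomes a query rather than a project.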
A dedicated due diligence data vendor like Dasseti can provide a seamless approach that is tailored to your needs and adapts over time to fit evolving requirements.
Tap into your data goldmine with Dasseti
Don’t let unstructured data go to waste. Tap into the treasure trove of information at your fingertips with automated data analytics.
Find out how Dasseti can help by scheduling a demo today.