I briefly covered this topic in Custodians, Data, and Digital Assets, but I wanted to expand a bit on the importance of data infrastructure in fintech and the future of data pipelines. Hope you enjoy this blog, and please know that it only scratches the surface of the data exchange industry. This blog was inspired and informed by a white paper published earlier this year by the DTCC, titled Data Strategy & Management in Financial Markets, and by my experience working in the fintech industry.
What is data exchange?
In order for financial applications to operate efficiently on demand, financial technology companies must create robust networks of data pipelines that ferry large packets of data from one organization to another. No single fintech organization can be the sole provider of all data. Instead, organizations collaborate openly to exchange data with one another. The benefit of this collaboration is enhanced data flow from disparate sources, which ultimately leads to better outcomes for the consumers interacting with the tech.
Seamless data exchange can lead to the creation of operationally efficient tools that enhance the user’s ability to make informed decisions. Conversely, when data exchange rails break, consumers and businesses can experience interruptions with material business implications. For example, if a trader cannot view accurate position data for her portfolios, she may not have the complete picture needed to operate in the marketplace.
Though speed and efficiency are primary concerns for data engineers across organizations, establishing robust security measures and diligent monitoring ought to be prioritized as well. Data exchange tools may be targeted by bad actors because of the richness of information that passes through the pipelines continuously. A strong understanding of the security options available in the marketplace is a must for any data infrastructure engineer.
Data exchange technology used by fintech companies
Application Programming Interfaces (APIs) are perhaps the most common type of data exchange tool used today. APIs are protocols that allow different software applications to communicate data with one another, providing standardized methods for requesting and sharing data. API integration is important for any software platform run by a business that wants to conduct cross-application information exchange.
APIs allow for modular improvement of software applications and encourage the reuse of already-deployed technology. Consistent API protocols let developers focus on application enhancements rather than spend time building data exchange rails from scratch for every engineering effort. Additionally, APIs allow for cross-functionality between web apps and mobile apps, which affords users the ability to use the same application on different devices.
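To make this concrete, here is a minimal sketch in Python of what requesting position data over a REST API might look like, using the requests library. The endpoint, token, and response shape are hypothetical stand-ins for illustration, not any real provider’s API:

```python
import requests

# Hypothetical endpoint and token -- illustrative only, not a real service.
BASE_URL = "https://api.example-custodian.com/v1"
API_TOKEN = "demo-token"

def fetch_positions(account_id: str) -> list[dict]:
    """Request current position data for an account over a standard REST API."""
    response = requests.get(
        f"{BASE_URL}/accounts/{account_id}/positions",
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=10,
    )
    response.raise_for_status()  # surface HTTP errors instead of failing silently
    return response.json()["positions"]

if __name__ == "__main__":
    for position in fetch_positions("ACC-12345"):
        print(position["symbol"], position["quantity"])
```

Because the request and response follow a standardized shape, any application that speaks the same protocol can reuse this pipeline rather than building its own rails.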
Data integration platforms are another method through which data can be exchanged. A data integration platform provides a venue where data stored in cloud applications, APIs, databases, spreadsheets, and more can be connected to other sources. Data is often transformed and enriched on these platforms to normalize it for future processing and exchange. Data mapping, batching, and profiling are also conducted to configure, group, and assess the data before it continues its journey.
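As a rough illustration of that normalization step, here is a sketch that maps two hypothetical upstream formats into one canonical schema; the sources, field names, and values are invented for the example:

```python
from datetime import datetime

# Two hypothetical upstream sources, each with its own field names and formats.
SOURCE_A = [{"ticker": "AAPL", "qty": "100", "asof": "2024-06-30"}]
SOURCE_B = [{"symbol": "MSFT", "quantity": 250, "as_of_date": "06/30/2024"}]

def normalize_a(record: dict) -> dict:
    """Map source A's fields and formats into the canonical schema."""
    return {
        "symbol": record["ticker"],
        "quantity": int(record["qty"]),
        "as_of": datetime.strptime(record["asof"], "%Y-%m-%d").date(),
    }

def normalize_b(record: dict) -> dict:
    """Map source B's fields and formats into the same canonical schema."""
    return {
        "symbol": record["symbol"],
        "quantity": int(record["quantity"]),
        "as_of": datetime.strptime(record["as_of_date"], "%m/%d/%Y").date(),
    }

# Every record leaves the platform in one shared shape, ready for downstream exchange.
canonical = [normalize_a(r) for r in SOURCE_A] + [normalize_b(r) for r in SOURCE_B]
print(canonical)
```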
Finally, middleware solutions act as intermediaries between different software applications. Middleware is often used when consistent data pipelines need to be established between disparate systems, providing standardized communication to ensure that data flows properly and on an established schedule. Technologists often turn to middleware when building data bridges to and from legacy technology. Middleware systems are also efficient managers of distributed systems with multiple APIs and data streams, and they can be customized to fit any organization’s operating schedule.
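A toy sketch of the scheduled-bridge pattern might look like the following, with invented connector functions standing in for a legacy extract and a modern delivery step:

```python
import time

def read_from_legacy_system() -> list[dict]:
    """Pretend extract: in practice this might read a fixed-width file or a database."""
    return [{"trade_id": 1, "symbol": "IBM", "price": 185.50}]

def push_to_modern_api(records: list[dict]) -> None:
    """Pretend load: in practice this might POST to a downstream REST API."""
    print(f"delivered {len(records)} record(s) downstream")

def run_bridge(interval_seconds: int = 60, cycles: int = 3) -> None:
    """Poll the legacy side on a fixed schedule and relay records downstream."""
    for _ in range(cycles):
        records = read_from_legacy_system()
        push_to_modern_api(records)
        time.sleep(interval_seconds)

if __name__ == "__main__":
    run_bridge(interval_seconds=1)
```

The value of the middleware layer is that neither side has to know about the other; the bridge owns the schedule and the translation.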
What happens when data exchanges break down?
You have probably experienced the issue of broken or interrupted data exchange while using an app. Data exchange failures can be caused by communication issues, data format incompatibility, network congestion, and other issues. Impacts from exchange issues can range from the mildly annoying to the extremely costly depending on the purpose of the app and requirements established by the user.
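One common defensive pattern is to retry transient failures with exponential backoff so that a brief hiccup on the wire never reaches the user. Here is a minimal, simulated sketch; the flaky_fetch function and error class are stand-ins, not a real service:

```python
import random
import time

class TransientExchangeError(Exception):
    """Stand-in for a timeout, congestion, or format hiccup on the wire."""

def flaky_fetch() -> dict:
    # Simulated remote call that fails roughly half the time.
    if random.random() < 0.5:
        raise TransientExchangeError("upstream did not respond")
    return {"status": "ok"}

def fetch_with_retries(max_attempts: int = 5) -> dict:
    """Retry with exponential backoff; only surface the error once retries are exhausted."""
    for attempt in range(1, max_attempts + 1):
        try:
            return flaky_fetch()
        except TransientExchangeError:
            if attempt == max_attempts:
                raise
            time.sleep(2 ** attempt * 0.1)  # back off: 0.2s, 0.4s, 0.8s...

print(fetch_with_retries())
```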
Connecting your personal accounts to apps and services can be a laborious task, and it can often be frustrating when the data exchange fails. Since the advent of the internet age, banks have become adept at creating data exchange pipelines for consumers looking to connect their accounts to the many vendors that operate online businesses.
I’d wager that if your bank terminated its data exchange services tomorrow, you would seriously consider switching to another bank immediately. The configuration of the data exchange between our banks and other apps is a significant reason why we, as consumers, do not often switch banks. Familiarity with the banking app and its embedded tools is another.
Many in the finance industry do not fully appreciate the amount of maintenance and organization required to run an efficient data storage operation within any given business. Heterogeneous datasets ingested from multiple sources, each with its own data standards and practices, can be challenging to manage without a team of data professionals armed with powerful data management tools.
Data translation, for example, can be a major undertaking for many organizations because of the time and analysis it takes to create automated translation protocols that account for the vast majority of ingestible data points. Even then, there will always be significant work needed to manage the outlier data points that disrupt the flow of data exchange between organizations.
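A simplified sketch of that pattern: translate the codes you recognize automatically and route the outliers to an exception queue for human review. The translation table and codes below are hypothetical:

```python
# Hypothetical table mapping a counterparty's asset-class codes to internal ones.
ASSET_CLASS_MAP = {
    "EQ": "equity",
    "FI": "fixed_income",
    "FX": "foreign_exchange",
}

def translate(records: list[dict]) -> tuple[list[dict], list[dict]]:
    """Translate what the protocol covers automatically; queue outliers for review."""
    translated, exceptions = [], []
    for record in records:
        code = record.get("asset_class")
        if code in ASSET_CLASS_MAP:
            translated.append({**record, "asset_class": ASSET_CLASS_MAP[code]})
        else:
            exceptions.append(record)  # outlier: unknown code, needs a human
    return translated, exceptions

ok, review = translate([{"asset_class": "EQ"}, {"asset_class": "CRYPTO?"}])
print(f"{len(ok)} translated, {len(review)} flagged for review")
```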
As with many things in the financial industry, the quality of the information you are presented with can always be called into question if you are a diligent researcher. Datasets with questionable components, which describes virtually all datasets, present decision makers with data-informed signals that are not always clear.
Organizations will continue to need highly capable individuals with data analysis training to help guide management when making decisions informed by the data that the organization manages. Focus on data management will continue to increase as tech platforms built on top of the data exchange infrastructure thrive in the modern world where consumers want seamless and instantaneous data transfer.
What happens going forward?
Standardized data exchange through APIs, data clouds, middleware, and other technologies allows for plug-and-play among fintech companies, which opens up many new avenues for growth in the industry. Data connections between businesses create an infrastructure layer for the financial industry on which new apps can be built. In the end, consumers will benefit as businesses continue the trend of sharing open data exchange protocols to sift through vast pools of data.
Distributed ledger technology (DLT), often referred to as blockchain, could disrupt data transformation and storage norms. DLT spreads normalized data across many nodes to create a single source of truth. It preserves stored datasets by preventing changes to the string of data entries, also known as the ledger. DLT creates an audit trail of activity and obviates the need for reconciliation of the data.
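The tamper-evidence property is easy to demonstrate with a toy hash-chained ledger. This is a bare-bones sketch of the idea, not production DLT, and every name in it is invented for illustration:

```python
import hashlib
import json

def entry_hash(entry: dict) -> str:
    """Hash an entry's contents, including the previous hash, to chain entries together."""
    return hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()

def append_entry(ledger: list[dict], data: dict) -> None:
    """Append a new entry whose hash commits to everything that came before it."""
    prev = ledger[-1]["hash"] if ledger else "genesis"
    entry = {"data": data, "prev_hash": prev}
    entry["hash"] = entry_hash(entry)
    ledger.append(entry)

def verify(ledger: list[dict]) -> bool:
    """Any edit to an earlier entry breaks every hash after it -- the audit trail."""
    prev = "genesis"
    for entry in ledger:
        body = {"data": entry["data"], "prev_hash": entry["prev_hash"]}
        if entry["prev_hash"] != prev or entry_hash(body) != entry["hash"]:
            return False
        prev = entry["hash"]
    return True

ledger: list[dict] = []
append_entry(ledger, {"trade": "BUY 100 AAPL"})
append_entry(ledger, {"trade": "SELL 50 MSFT"})
print(verify(ledger))  # True
ledger[0]["data"]["trade"] = "BUY 1000 AAPL"  # tamper with history
print(verify(ledger))  # False -- the chain exposes the edit
```

Because every participant can run this verification independently, the ledger itself becomes the reconciliation.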
As more and more of our lives are conducted online, data packets will continue to grow and the pressure on data infrastructure will increase. Fast and efficient data exchange will become ever more valuable to businesses and individuals. Digital alternative investment vehicles, such as interests in private equity partnerships, environmental credits, digital assets, and non-fungible tokens, will challenge data infrastructure engineers to create data thoroughfares that can process complex data at high speed.
If they are not already, these challenges will soon be at the doorstep of all financial technology companies. This reality underscores the necessity for strong in-house data organization fundamentals and standards of practice. Data demand from customers and organizations should encourage decision makers to continue to invest in data management teams and software that can automate some of the heavy lifting needed to create data environments that allow for smooth business operations and growth.