How Capital Markets Firms Can Mitigate Risk in Periods of Uncertainty

The following is a sponsored post from Michael Hom, Head of Financial Solutions at InterSystems, a Gold Sponsor of FinovateFall Digital 2020, September 14 through 18, 2020.


External factors like the COVID-19 pandemic have made the global economy increasingly volatile, and capital markets firms are working harder than ever to ensure that users, both retail and institutional, can continue to trade without interruption.

As these financial organizations look to mitigate risk in this period of uncertainty, gaining operational resilience, implementing risk mitigation strategies, and having the right technology in place will be crucial if they are to continue delivering value to customers, comply with regulations, get ahead of the competition and, most importantly, maintain trust.

Given this, the pressure on incumbents to upgrade infrastructure is only increasing, but challenges remain in doing so. While the pandemic may have been the catalyst for organizations to start embracing new technologies, there are still barriers to overcome and best practices to put into play, not only to mitigate risk but also to prepare capital markets firms for what’s to come:

Replacing legacy technology

Critical to mitigating risk is ensuring data is quickly available and easily accessible. Many capital markets firms struggle in this area due to the significant amount of legacy technology in their infrastructure and, consequently, data silos.

Connecting these disparate systems will be vital, not only to address the performance issues firms face today, such as adapting to mass remote working, but also to ensure their infrastructure is capable of growing with them into the future.

This requires them to adopt solutions that can seamlessly run, scale, and expand into the cloud. Replacing legacy infrastructure also gives new technologies and innovations access to their wealth of valuable data.

These solutions should also be location agnostic, allowing capital markets firms to be agile and to take advantage of new technology and services by bringing them into their existing infrastructure.

Investment in the future

As these institutions look to replace their legacy technology, they should focus their investments on two key areas.

First, they should invest in platform scalability: being able to scale up as the market spikes is crucial and can be a major differentiator. Some firms have recently gained market share solely due to their ability to scale up.

The second area of investment should be analytics and automation that can support and, in some cases, reduce manually intensive workloads. We’ve already seen increases in algorithmic trading and customer chatbot technologies, while many organizations within the financial services industry use AI to automate processes such as fraud checks and compliance.

With less time spent on time-intensive manual tasks, capital markets firms will be able to direct their attention to more value-adding services for their clients. The use of AI will help to spot patterns and anomalies in those patterns much faster for fraud prevention, while also reducing the risk of human error.
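
As a loose illustration of the pattern-and-anomaly spotting described above (a generic sketch, not any particular vendor’s tooling), an unsupervised outlier detector can flag transactions that deviate from normal behaviour. The features, thresholds, and library choice below are assumptions:

```python
# Hypothetical sketch: flagging anomalous transactions for fraud review.
# Assumes scikit-learn is available; features and thresholds are illustrative.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Simulated transaction features: [amount, seconds since previous transaction]
normal = rng.normal(loc=[50.0, 3600.0], scale=[20.0, 600.0], size=(1000, 2))
suspicious = np.array([[5000.0, 5.0], [4200.0, 8.0]])  # large, rapid transfers
transactions = np.vstack([normal, suspicious])

model = IsolationForest(contamination=0.01, random_state=0)
labels = model.fit_predict(transactions)  # -1 marks likely anomalies

for idx in np.where(labels == -1)[0]:
    amount, gap = transactions[idx]
    print(f"Review transaction {idx}: amount={amount:.2f}, gap={gap:.0f}s")
```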

Gaining access to real-time data

Within capital markets firms, there is a growing requirement to be able to access real-time data so these organizations can simplify their stack and get access to transactions that are happening in the moment. This will allow them to produce more time-sensitive reporting so they can make appropriate business decisions and better comply with regulatory requirements.
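
A minimal sketch of time-sensitive reporting over in-the-moment transactions, assuming a simple rolling one-minute window over a trade stream (all names are illustrative):

```python
# Toy sketch: a rolling one-minute notional total over a stream of trades.
import time
from collections import deque

WINDOW_SECONDS = 60
window = deque()  # (timestamp, notional) pairs, oldest first

def record_trade(notional, now=None):
    """Record a trade and return the notional traded over the last minute."""
    now = time.time() if now is None else now
    window.append((now, notional))
    while window and window[0][0] < now - WINDOW_SECONDS:
        window.popleft()  # evict trades that have aged out of the window
    return sum(n for _, n in window)

print(record_trade(1_000_000.0, now=0.0))   # 1,000,000.0
print(record_trade(250_000.0, now=30.0))    # 1,250,000.0, both in window
print(record_trade(500_000.0, now=120.0))   # 500,000.0, earlier trades aged out
```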

Data fabric

Data fabrics are fast becoming a key trend in data management across the board, helping to reduce friction and improve the accuracy, availability, and accessibility of data. They should also be a consideration as capital markets firms weather this period of uncertainty and beyond.

A data fabric that uses the latest technology will help organizations get a better grip on data governance, ensure their data is clean and accurate, harmonize that data where appropriate, and make it more accessible. All of this will help them derive more value and better insights from their data, driving their enterprises and those of their customers forward.
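
To make the harmonization step concrete, here is a minimal sketch of mapping records from disparate source schemas onto one canonical schema, the kind of normalization a data fabric automates. The field names and source systems are hypothetical:

```python
# Hypothetical sketch: harmonizing trade records from two source schemas.
CANONICAL_FIELDS = ("trade_id", "symbol", "quantity", "price")

FIELD_MAPS = {
    "legacy_oms": {"TradeRef": "trade_id", "Ticker": "symbol",
                   "Qty": "quantity", "Px": "price"},
    "cloud_feed": {"id": "trade_id", "instrument": "symbol",
                   "size": "quantity", "price": "price"},
}

def harmonize(record, source):
    """Map a source record onto the canonical schema, checking completeness."""
    mapping = FIELD_MAPS[source]
    out = {mapping[k]: v for k, v in record.items() if k in mapping}
    missing = [f for f in CANONICAL_FIELDS if f not in out]
    if missing:
        raise ValueError(f"{source} record missing fields: {missing}")
    return out

print(harmonize({"TradeRef": "T1", "Ticker": "IBM", "Qty": 100, "Px": 124.5},
                "legacy_oms"))
```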

How can capital markets firms not only survive, but also thrive?

As capital markets firms look beyond this period of volatility to thrive long term, it’s vital they embrace agility by implementing modern technology with a focus on analytics and automation. This will allow them to quickly adapt to new and changing business needs by helping them make use of their data, analyze it, monetize it, and turn it into actionable intelligence.

In an increasingly competitive landscape, where new market entrants aren’t weighed down by legacy technology and architectures, this will be a key differentiator and enable capital markets firms to take advantage of new opportunities within the market faster.


If you want to hear more about this subject, listen to this webinar in which InterSystems takes a deep dive into the challenges facing capital markets firms and how they can mitigate risk, alongside a panel of industry experts from Northern Trust, Westwood Group, and SIX Securities & Exchanges. Or read InterSystems’ latest blog posts on Data Excellence.

Lending in the New Normal: The Digitalization Challenge

The following is a sponsored blog post by Chris Papathanassi, Global Solution Lead, Lending with Finastra. Papathanassi discusses two challenges facing lenders: ensuring data quality and a true “golden source”, and leveraging real value through data connections. Find out more in the full report >>

Today, digital is the only way to do business. But even though everything they do can be expressed in ones and zeros, most financial services organizations simply aren’t set up to be truly digital. In the context of the current disrupted, volatile and remote-working global economy, doing digital brilliantly is now a matter of survival and urgency for many financial firms – no longer simply a ‘nice to have’.

Digital transformation is difficult for even the simplest business models, and lending in particular poses a real challenge. There is a continued dependency in lending on paper documentation and face-to-face contact, and when it can take up to three months to get cash out of the door, it’s hard to see how any bank can keep up with the digital shift.

Despite this, the challenges of digitalization are more than balanced out by the potential benefits. You’re likely aware of a few of these already: 

  • Increased efficiency – removing repetitive, non-value-added work and moving towards real-time processing 
  • Personalization – delivering relevant customer service even in a socially-distanced context 
  • Improved credit management – providing integrated, rules-based systems for greater decision speed and transparency 
  • Proactive risk management – using APIs and platforms to “join up” the risk and sales processes
  • Self-service for corporates – providing a digital channel that empowers corporate customers 
  • Unlocking the value of data – bringing data together from disparate sources so its true value as a commodity can be leveraged

So, what needs to happen for lending to get there? 

One of the key issues is data quality and the “golden source”.  The bespoke nature of lending makes it hard to maintain data quality and consistency. Lenders have their own individual nuances and conventions. And corporate borrowers that have lending relationships with many different organizations will download and manipulate data so it’s in a format they can work with.

Can you trust the data?

As one major bank asked us: “How can we ensure what the source of truth is across different applications?”

What’s more, as data moves through different systems in a digitalized and connected world, it changes too.  

This points to the second challenge, which is that digitalized lending data is only valuable when it can be connected to the other pieces of the puzzle, to provide the big picture lenders and borrowers need. Right now, firms are still downloading data into Excel, manipulating it, and re-sending it.

Digitalization plus API capabilities, however, makes it possible for stakeholders to see the same pieces of data in the same state. It’s this connectivity that is key to realizing the full benefits of digitalization and addressing the “source of truth” issue. 
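
As a hedged sketch of that connectivity (not Finastra’s actual API), a service can serve one canonical record along with a version and content hash, so every stakeholder can verify it is seeing the same data in the same state:

```python
# Illustrative sketch: one canonical loan record with a verifiable checksum.
import hashlib
import json

LOAN_BOOK = {
    "LN-001": {"borrower": "Acme Corp", "facility": 25_000_000,
               "drawn": 10_000_000, "version": 7},
}

def get_loan(loan_id):
    """Return the canonical record plus a hash consumers can compare."""
    record = LOAN_BOOK[loan_id]
    payload = json.dumps(record, sort_keys=True).encode()
    checksum = hashlib.sha256(payload).hexdigest()
    # Stakeholders compare checksums instead of re-keying data into Excel.
    return {"data": record, "checksum": checksum}

print(get_loan("LN-001")["checksum"][:16])
```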

Digitalization also opens data to new technologies such as AI, machine learning, and robotic process automation, which can create new efficiencies and value for banks and customers. And for processes such as syndicated lending that have multiple players, it can be combined with cloud technology to enable more collaboration and better access to a single source of truth.

APIs in the cloud can make innovation more accessible to banks, overcoming the challenges of integrating in-house and external products. In essence, on platforms, banks have access to pre-integrated, interoperable solutions and better access to the broader financial services ecosystem, where they can explore innovations and consume them at speed. 

This potentially changes the shape of the lending industry, opening up interesting questions. What do banks want to be? Leaders in the lending business or providers of specialist products? With digitalization both options are possible, creating an opportunity for lenders to add value and build their lending businesses – or to disintermediate healthily. 

Interested in learning more? Download Finastra’s new report, Lending in the New Normal >>

Customer Knowledge Augmentation and Activation

Today we feature a sponsored post from Celfocus. Henrique Cravo also sits down with Greg Palmer on the Finovate Podcast to talk about intelligent use of data, the importance of tailored experiences, and overcoming barriers left by legacy systems. To get a deeper insight into the topic register for the webinar: Delivering Customer Knowledge Augmentation and Activation here >>

There aren’t many industries where organisations have so much data about customers over such a long period of time. For example, I have had the same bank for the last 25 years, which, by the way, is the same as my father’s.

My bank has been my partner when I wanted to go to university or buy my first apartment. It knows how much I make and where I spend it, but I never truly felt they used that knowledge to either enrich my experience or deliver tailored offers. Why is that?

There is a wealth of untapped value to explore in customers’ financial behaviour, and banks are in a prime position to reap the benefits, but they need to adapt.

The transformation has already begun with banks introducing more channels, learning best practices from digital native banks and fintechs, and even creating new digital business models to test what works, aiming to later integrate those learnings into the core business.

Still, banks are drowning in data and have very little insight on how to transform it into actionable knowledge to better serve customers, personalise offers, and deliver a consistent customer experience.

Furthermore, through segmentation, banks can use their knowledge about current customers to define campaigns and other initiatives that fulfill one of their main objectives: attracting new customers. Their continuous appetite for growth stems from delivering unique and innovative value propositions to current and future customers, which today can be, in many situations, challenging.

From risk-taking, tech-savvy customers hungry for innovation to tech-avoiders who value the human touch, banks must accommodate different engagement approaches and use insights to differentiate customer profiles. Where they fall short, it is not because they don’t have the data, but because they can’t mine it.

It’s clear that very soon ‘customer intelligence’ will be the most important predictor of revenue growth and profitability. The use of behavioural analytics will be key to identifying customer friction points, and there will be a surge in building technological capabilities to gain more insight into customers’ needs.

A New Engagement Model for the Digital Age

By nature, financial products are complex and both companies and individuals are deeply affected by their financial choices, so there’s a foreseeable need for contact, ensuring a correct understanding of what is at stake.

Bank tellers, financial advisers, and other resources are key in accommodating customers’ requests and providing value-added and timely information. They benefit first-hand from customer insights, which enable them to provide not only a better service, but also to increase the customer value by offering the best solutions.

In addition to assisted channels, self-service applications are emerging that aim to let customers engage on their own terms, when they want, going as far as allowing customers to configure product features, including pricing. When the human factor is eliminated, the need for accuracy is even greater; otherwise, the sale may fail or the inquiry may go unanswered.

Customer expectations have changed mainly due to the experience offered by digital-native organisations, whether or not they come from the financial sector. Easy interactions, tailored offers, integration between physical and digital channels, and unmatched service create a gap between what many financial institutions can deliver and what customers are getting elsewhere.

Responding to the pressure to change, banks must find a balance between opening up and guaranteeing that trust remains paramount at all levels. Up to this point, the perspectives presented have argued that banks need not only to gain insights and knowledge from the data they already have, but also to adjust to new customer demands and to how customers choose to engage.

However, the biggest challenge is how to orchestrate these two dimensions and provide customers with experiences that leverage the knowledge banks have, delivered in a seamless way across whichever channel customers choose.

The holy grail of an enhanced experience in the banking sector is to have a holistic, end-to-end perspective of the customer experience.

Introducing Celfocus Customer Knowledge Augmentation and Activation

Celfocus Customer Knowledge Augmentation and Activation is a modular, integrated framework designed to leverage banks’ customer knowledge and deliver tailored services.

This framework is anchored in two main modules. The first comprises the tools and technologies to augment customer knowledge by activating every single customer through automated AI and cognitive data insights; the second aims at delivering tailored experiences that trigger new targets, portfolios, and customer lock-in.

By encompassing the Customer Value Augmentation and Enhance Customer Experience modules, the solution gives banks full control of the customer journey from planning to execution, focusing on building the technology capabilities to gain more intelligence about customers’ needs and how best to serve them.

Download the full whitepaper from Celfocus, Customer Knowledge Augmentation and Activation >>

Mastering Digital Collaboration in the Financial Industry

Financial organizations manage massive amounts of information on a daily basis.

Whether it’s a loan application, credit approval, or new customer records, sharing documents securely is key for effective task completion and departmental collaboration. 

With a variety of document formats needed for each of these tasks, professionals must often switch from application to application to complete processes. Standard processes are often outdated and inefficient. 

Discover how financial organizations can streamline their workflows and collaborate more effectively within their current applications.

Read the Accusoft infographic to learn more.

MyLife’s Jeff Tinsley on Creating a “Reputation Score” and the Future of Personal Data

It’s the FraudTech day of the Finovate Fintech Halftime Review, and we welcome Jeff Tinsley, CEO of MyLife, to talk fraud management and prevention, and how MyLife can be used by financial institutions to educate and add value for their consumers.

David Penn, our own Finovate Analyst, asks what goes into creating a Reputation Score and how MyLife protects people from fraud.

Watch the full interview.

Find out more about MyLife and get in touch with Tim (timp@mylife-inc.com) for any questions or partnership inquiries.

ITSCREDIT’s João Lima Pinto on the Genie Advisor App and a New Direction for 2020

As part of our Finovate FinTech Halftime Review, Finovate Analyst David Penn sat down with João Lima Pinto, Chairman of ITSCREDIT. With nearly 20 years of solid experience in the financial sector, actively participating in the design and implementation of innovative omnichannel and credit solutions, Pinto has garnered much success by leading a variety of business development, product and project management, business analysis, and product operations functions.

Topics discussed include ITSCREDIT’s Genie Advisor app, how the company has seen the COVID-19 crisis impact its customer base, and its plan to address the challenges and move forward in 2020.

Watch the full interview now.

Improving Payable Processes: An Implementation Primer

This is a sponsored post by Accusoft. For more information on sponsored contributions please email sponsor@finovate.com.

Accounts payable (AP) processes remain a sticking point for many organizations. Caught between the efficiency issues of paper-based solutions and the potential complexity of adopting technology-driven services, stagnation often results. Accusoft explores its top five tips to smooth out your system and reap the rewards.

Businesses now recognize the necessity of change, but many aren’t sure where to start. When it comes to improving payable processes, a roadmap is invaluable. Here’s a look at five key forms-completion and invoice-processing improvements to help companies account for evolving AP expectations.

1. Identifying errors

Staff remain the biggest source of AP errors. There’s no malice here; humans simply aren’t the ideal candidates for repetitive data entry. In this case, effective implementation of new processes depends on customizable software tools capable of accurately capturing forms data and learning over time to better identify and avoid common errors. The benefit? Staff are free to work on time-sensitive AP approval and reviews rather than double-checking basic forms data.
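
A minimal sketch of this kind of rule-based error identification, assuming hypothetical field names and formats rather than Accusoft’s actual API:

```python
# Hypothetical sketch: validating captured invoice fields for common errors.
import re
from datetime import date

def validate_invoice(fields):
    """Return a list of human-readable problems found in captured fields."""
    errors = []
    if not re.fullmatch(r"INV-\d{6}", fields.get("invoice_number", "")):
        errors.append("invoice_number: expected format INV-######")
    try:
        if float(fields["total"]) <= 0:
            errors.append("total: must be positive")
    except (KeyError, ValueError):
        errors.append("total: missing or not a number")
    try:
        date.fromisoformat(fields.get("due_date", ""))
    except ValueError:
        errors.append("due_date: not a valid ISO date")
    return errors

print(validate_invoice({"invoice_number": "INV-12345",  # one digit short
                        "total": "-40", "due_date": "2020-13-01"}))
```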

2. Improving invoice routes

Invoice routing is time-consuming and often confusing for AP staff. To avoid potential oversights, most companies use two to three approvers per invoice, creating multiple approval workflows. While this reduces total error rates, it also introduces new complexity. What happens if invoice versions don’t match or approvers don’t agree on their figures? In the best-case scenario, your company needs extra time to process every invoice. Worst case? Invoices are paid twice, or payments miss critical deadlines. Here, a single-application approach to invoice processing helps improve invoice routes and reduce redundant approval steps.
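
A hedged sketch of how a single-application workflow can surface those mismatches before payment; the data model is an assumption for illustration:

```python
# Hypothetical sketch: routing an invoice through a two-approver workflow.
def route_invoice(invoice, approvals):
    """Decide the next routing step for an invoice given its approvals."""
    versions = {a["version_seen"] for a in approvals}
    if versions != {invoice["version"]}:
        return "HOLD: approvers reviewed different invoice versions"
    amounts = {a["approved_amount"] for a in approvals}
    if amounts != {invoice["amount"]}:
        return "HOLD: approved amounts disagree with the invoice"
    if len(approvals) < invoice["required_approvals"]:
        return "PENDING: awaiting further approval"
    return "APPROVED: release for payment"

invoice = {"id": "INV-000123", "amount": 9_800.00,
           "version": 3, "required_approvals": 2}
approvals = [
    {"approver": "ap_clerk", "version_seen": 3, "approved_amount": 9_800.00},
    {"approver": "manager",  "version_seen": 2, "approved_amount": 9_800.00},
]
print(route_invoice(invoice, approvals))  # flags the version mismatch
```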

3. Integrating data location

Where is your accounts payable data located? For many companies, there’s no easy answer; some invoices are paper, others are digitally stored on secure servers, and there are still more trapped in emails and messages across your organization. Instead of chasing down AP data, implement an invoice rehoming process. Solutions like Accusoft’s FormSuite for Invoices support thousands of invoice formats and keep them all in the same place.

4. Innovating at speed and scale

Complexity holds back many accounts payable programs. If new technologies complicate existing processes, employee error rates will go up, and there’s a chance staff will avoid digital deployments altogether in favor of familiar paper alternatives. In this case, automation is the key to implementation: speedy solutions capable of scanning paper forms, identifying key data, and then digitally converting this information at scale.

5. Increasing visibility

You can’t fix what you can’t see. Paper-based invoice processing naturally frustrates visibility by making it difficult to find key documents and assess total financial liabilities. Integrated APIs that work with your existing accounts payable applications can help improve inherent visibility by creating a single source of AP data under the secure umbrella of your corporate IT infrastructure.
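
As a loose illustration of that single source of AP data (with hypothetical sources standing in for paper scans, servers, and email extracts), de-duplicating invoices across stores yields one view of outstanding liability:

```python
# Hypothetical sketch: one view of AP liability across several data sources.
def total_liability(*sources):
    """Sum unpaid invoice amounts, de-duplicating invoices across sources."""
    seen, total = set(), 0.0
    for source in sources:
        for inv in source:
            if inv["id"] in seen:  # same invoice captured in two places
                continue
            seen.add(inv["id"])
            if not inv.get("paid", False):
                total += inv["amount"]
    return total

scanned = [{"id": "INV-1", "amount": 1200.0}]
erp = [{"id": "INV-1", "amount": 1200.0},
       {"id": "INV-2", "amount": 560.0, "paid": True}]
email = [{"id": "INV-3", "amount": 310.0}]
print(total_liability(scanned, erp, email))  # 1510.0
```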

Want to learn more about the potential pathways available for companies to improve their AP processes and reduce total complexity? Check out Volume 1 of our Accounts Payable eGuide series, No Pain, No Gain?

Mission-Critical, Concurrent Transactional, and Analytic Processing at Scale

This is a sponsored blog post by InterSystems, a financial data technology company based in Cambridge, Massachusetts.

Successful financial services organizations today must be able to simultaneously process transactional and analytic workloads at high scale – accommodating billions of transactions per day while supporting thousands of analytic queries per second from hundreds of applications – without incident. The consequences of dropped trades, or worse, a system failure, can be severe, incurring financial losses and reputational damage to the firm.

The InterSystems IRIS Data Platform is a hybrid transactional/analytic processing (HTAP) database platform that delivers the performance of an in-memory database with the reliability and built-in durability of a traditional operational database.

InterSystems IRIS is optimized to concurrently accommodate both very high transactional workloads and a high volume of analytical queries on the transactional data. It does so without compromise, incident, or performance degradation, even during periods of extreme volatility, and it requires fewer DBAs than other databases. In fact, many installations do not need a dedicated DBA at all.

An open environment for defining business logic and building mobile and/or web-based user interfaces enables rapid development and agile business innovation.

For one leading global investment bank, the InterSystems data platform is processing billions of daily transactions, resulting in a 3x to 5x increase in throughput, a 10x increase in performance, and a 75% reduction in operating costs. The application has operated without incident since its inception.

Traditionally, online transaction processing (OLTP) and online analytical processing (OLAP) workloads have been handled independently, by separate databases. However, operating separate databases creates complexity and latency because data must be moved from the OLTP environment to the OLAP environment for analysis. This has led to the development of a new kind of database. In 2014, Gartner coined the term hybrid transaction/analytical processing, or HTAP, for this new kind of database, which can process both OLTP and OLAP workloads in a single environment without having to copy the transactional data for analysis.
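
To make the HTAP idea concrete, here is a toy sketch of one store serving transactional writes and analytic reads over the same live data, with no copy to a separate OLAP system. SQLite is only an illustrative stand-in here, not an HTAP database and not InterSystems IRIS:

```python
# Toy illustration of the HTAP pattern: one store, both workload types.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE trades (symbol TEXT, qty INTEGER, price REAL)")

# Transactional workload: individual trades written as they occur.
db.executemany("INSERT INTO trades VALUES (?, ?, ?)",
               [("IBM", 100, 124.50), ("IBM", 50, 124.60),
                ("MSFT", 200, 210.10)])
db.commit()

# Analytic workload: an aggregate query over the same, live data.
for row in db.execute(
        "SELECT symbol, SUM(qty), AVG(price) FROM trades GROUP BY symbol"):
    print(row)  # e.g. ('IBM', 150, 124.55)
```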

At the core of InterSystems IRIS is the industry’s only comprehensive, multi-model database that delivers fast transactional and analytic performance without sacrificing scalability, reliability, or security. It supports relational, object-oriented, document, key-value, and hierarchical data types, all in a common persistent storage tier.

InterSystems IRIS offers a unique set of features that make it attractive for mission-critical, high-performance transaction management and analytics applications, including:

  • High performance for transactional workloads with built-in guaranteed durability
  • High performance for analytic workloads
  • Lower total cost of ownership

InterSystems IRIS is enabling financial services organizations to process high transactional and analytic workloads concurrently, without compromising either type – using a single platform – with the highest levels of performance and reliability, even when transaction volumes spike.

Founded in 1978, InterSystems is a privately held company headquartered in Cambridge, Massachusetts (USA), with offices worldwide, and its software products are used daily by millions of people in more than 80 countries. For more information, visit: Financial.InterSystems.com

Synthetic Data Can Conquer FinServ’s Fear of Data Security and Privacy

This is a sponsored blog post by Randy Koch, CEO of ARM Insight, a financial data technology company based in Portland, Oregon. Here, he explores what synthetic data is, and why financial institutions should start taking note.

You’ve heard it before – data is invaluable. The more data your company possesses, the more innovation and insights you can bring to your customers, partners, and solutions. But financial services organizations, which handle extremely sensitive card data and personally identifiable information (PII), face a difficult data management challenge. These organizations have to navigate how to use their data as an asset to increase efficiencies or reduce operational costs, all while maintaining the privacy and security protocols necessary to comply with stringent industry regulations like the Payment Card Industry Data Security Standard (PCI DSS) and the General Data Protection Regulation (GDPR).

It’s a tall order.

We’ve found that by accurately finding and converting sensitive data into a revolutionary new category – synthetic data – financial services organizations can finally use sensitive data to maximize business value and power cutting-edge technologies, like artificial intelligence and machine learning solutions, without having to worry about compliance, security, and privacy.

But first, let’s examine the traditional types of data categorizations and dissect why financial services organizations shouldn’t rely on them to make data safe and usable.

Raw and Anonymous Data – High Security and Privacy Risk

The two most traditional data categorization types – raw and anonymous – come with pros and cons. With raw data, all the PII fields for both the consumer (name, social security number, email, phone, etc.) and the associated transaction remain tagged to the data. Raw data carries considerable risk – institutional regulations and customer terms and conditions mandate strict privacy standards for raw data management. If a hacker or an insider threat were to exfiltrate this type of data, the compliance violations and breach headlines would be dire. Using raw data widely across your organization borders on negligence – regardless of the security solutions you have in place.

With anonymous data, PII is removed, but the real transaction data remains unchanged. It’s lower risk than raw data and is used more often for both external and internal data activities. However, if a data breach occurs, it is very possible to reverse engineer anonymous data to reveal PII. The security, compliance, and privacy risks still exist.

Enter A New Data Paradigm – Synthetic Data

Synthetic data is fundamentally new to the financial services industry. It is a breakthrough data type that addresses privacy, compliance, reputational, and breach-headline risks head-on. Synthetic data mimics real data while removing the identifiable characteristics of the customer, banking institution, and transaction. When properly synthesized, it cannot be reverse engineered, yet it retains all the statistical value of the original data set. Minor, random field changes made to the original data set completely protect the consumer identity and transaction.
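
A deliberately naive sketch of the idea: drop identifiers and randomly perturb the remaining fields so that no record maps back to a real transaction while aggregate statistics stay close. Production synthesis (including ARM Insight’s) is far more sophisticated, and the field names here are assumptions:

```python
# Hypothetical sketch: naive synthesis of a transaction record.
import random

def synthesize(txn):
    """Return a synthetic record: identifiers dropped, fields perturbed."""
    return {
        # PII and account identifiers are dropped entirely, not masked.
        "amount": round(txn["amount"] * random.uniform(0.95, 1.05), 2),
        "merchant_category": txn["merchant_category"],  # keep the signal
        "day_offset": txn["day_offset"] + random.randint(-2, 2),
    }

real = {"name": "Jane Doe", "account": "4111-0000-0000-0001",
        "amount": 82.40, "merchant_category": "grocery", "day_offset": 120}
print(synthesize(real))
```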

With synthetic data, financial institutions can freely use sensitive data to bolster product or service development with virtually zero risk. Organizations that use synthetic data can dig deep into analytics, including spending by small business users, customer segmentation for marketing, fraud detection trends, or customer loan likelihood, to name just a few applications. Additionally, synthetic data can safely rev up machine learning and artificial intelligence engines with an influx of valuable data to innovate new products, reduce operational costs, and produce new business insights.

Most importantly, synthetic data helps fortify internal security in the age of the data breach. Often, the single largest data security risk for financial institutions is employee misuse or abuse of raw or anonymous data. Organizations can render such misuse moot by using synthetic data.

An Untapped Opportunity

Compared to other industries, financial institutions haven’t jumped on the business opportunities that synthetic data enables. Healthcare technology companies use synthetic data modeled on actual cancer patient data to facilitate more accurate, comprehensive research. In scientific applications, volcanologists use synthetic data to reduce false positives for eruption predictions from 60 percent to 20 percent. And in technology, synthetic data is used for innovations such as removing blur in photos depicting motion and building more robust algorithms to streamline the training of self-driving automobiles.

Financial institutions should take cues from other major industries and consider leveraging synthetic data. This new data categorization type can help organizations effortlessly adhere to the highest security, privacy and compliance standards when transmitting, tracking and storing sensitive data. Industry revolutionaries have already started to recognize how invaluable synthetic data is to their business success, and we’re looking forward to seeing how this new data paradigm changes the financial services industry for the better.