The Five D’s of Fintech: Debiasing

Fintech: Creating a Level Playing Field

Much of the modern financial services edifice has been built on structures and arrangements that rely on certain participants having exclusive access to information, on third-party human involvement justified by the complex and risky nature of financial services, and on an opacity of transactions that preserves the business models of certain parties.  Such structures and arrangements have worked well enough until now, but the balance is tilting in favor of a more open environment in which trust and transparency are improved, and bias and conflicts of interest are mitigated.  Toward this end, a variety of regulations are being proposed and enforced, and technology is increasingly creating opportunities as well.  There is also a focus on simplification and enablement to empower traditionally disadvantaged parties, and on automation to minimize the principal-agent problem.  Fintechs are at the forefront of driving a large part of this.  This last installment in the Fintech series (The Five D’s of Fintech) focuses on how opacity of information, bias, and conflicts of interest in financial services are being targeted by fintechs that want to create an open, level playing field through technology and new business models.

Opacity of Information, Lack of Trust & Transparency

Trade execution is one area where issues of trust and transparency are being targeted by fintechs.  The information asymmetry between large institutional players such as market makers and small retail investors inevitably leads to inefficient outcomes for the latter – on certain exchanges, high-speed traders receive market information sooner and can jump in front of retail investors during order execution, giving themselves a price advantage.  IEX Exchange, founded in 2012, introduced a new model for trade execution services with the goal of leveling the playing field and providing transparency.  For example, IEX’s “speed bump” is an innovation that introduces a delay into its trades with the goal of negating high-frequency traders’ speed advantage.  IEX also does not engage in “payment for order flow” – an arrangement in which HFTs and market makers pay brokers for retail orders – which can create conflicts of interest and ultimately shortchange retail investors.  And it actively publishes a comprehensive set of metrics and KPIs to demonstrate execution quality.

In other cases, there is a lack of transparency around asset prices, which gives the seller an unfair advantage – as is the case in the digitally archaic corporate bond market, where in many if not most instances there is no visibility of order books across markets, and dealers unilaterally set bond prices (since they may be the only ones carrying those securities on their books), leaving buyers as meek price takers.  Electronic bond trading platforms such as MarketAxess and Tradeweb have recently introduced “all to all” trading in corporate bond markets, whereby buyers get broad market visibility and price transparency and can trade with any dealer or asset manager in the network.  New outfits such as LiquidityBook are jumping in with solutions centered on pre-trade and post-trade transparency.

Pre-trade and post-trade transparency and ‘best execution’ are becoming the norm, thanks to regulations like MiFID II.  Investors, retail and institutional asset managers alike, want to know more about where brokers are sending their orders, yet for many years full broker transparency on order routing has been difficult to attain.  These issues have increasingly become the focus of both regulators and fintech players.  As full order-routing transparency continues to top the ever-growing list of concerns for asset managers, trading technology companies are slowly but surely emerging to provide such solutions to the buy side.  Dash Financial Technologies, a technology and services company that provides order routing and customized trading solutions for the buy side, enables full order-routing transparency; the firm has seen considerable growth in recent years, in large part due to its transparency services.  Another example is Luminex, a consortium formed by major buy-side players that provides order execution transparency.

Bias

Portfolio manager and financial advisor bias in asset and wealth management has been the focus of study and research for some time now, with well-known biases and subtle preferences factoring into investment decisions.  For example, portfolio managers are prone to confirmation and overconfidence bias, which may ultimately lead to investment underperformance.  Financial advisors, despite their fiduciary focus, can be susceptible to a range of cognitive biases.  After all, portfolio managers and financial advisors are only human, and as behavioral economics has shown, no matter how rational advisors may think they are, subtle biases often creep into decision making.

The application of behavioral economics to personal finance – dubbed “behavioral finance” – is the latest development in the wealth management industry intended to help drive investor outcomes.  Behavioral finance techniques can help reduce bias in investment decisions.  While bias in the context of wealth management has traditionally meant end-investor/client bias, the biases of financial advisors themselves are equally important – indeed, biases on either side can reinforce and influence each other.  Time-tested, automated, quantitative strategies can help take human emotion out of investment decision-making, both for financial advisors and for their clients.  As financial advisors increasingly focus on financial planning and on driving investor satisfaction and outcomes, they recognize the importance of behavioral finance techniques that help them consistently apply the wisdom of proven investment strategies across clients and across market cycles.  In recent surveys, however, advisors cite difficulty translating theory into practice and a lack of software and tools as the primary reasons preventing adoption (Charles Schwab survey) – until now.

Many technology-savvy advisors and wealthtech firms are focused squarely on enabling behavioral finance capabilities through technology.  United Capital, for example, recently unveiled FinLife Partners, a tool for advisors that taps into clients’ views on money and spending and how they prioritize their decision-making (United Capital, a well-known financial advisor, was recently acquired by Goldman Sachs).  Advisor Software has released Behavioral IQ, which aims to build a clearer picture of six behavioral traits influencing clients’ approach to risk and decision-making through a series of questions and follow-ups.  The tool, which essentially weighs biases by analyzing such factors as confidence and loss aversion, lets advisors make more appropriate recommendations.  Lirio’s Finworx likewise applies insight into clients’ risk tolerance and decision-making approaches derived from questionnaires.

As technology commoditizes parts of wealth management (tax-loss harvesting, rebalancing, and notifications, to name a few), it is moving increasingly upstream to functions such as asset allocation and financial planning that have been the sole preserve of human financial advisors, thanks to machine learning and, more generally, AI.  As technology revolutionizes wealth management, it reduces the risk of mistakes and bias, intentional or not.  Although AI can introduce bias and discrimination of its own, new techniques and methods to manage and mitigate such issues are already paving the way to widespread AI adoption.

Conflicts of Interest

The retirement industry is increasingly delivering on its fiduciary responsibility toward investors, thanks to a string of successful class action lawsuits that have forced large plan sponsors and providers in the 401(k) space to clearly disclose fees and conflicts of interest.  However, there are segments of the industry where the problem is still rife.  For example, the small 401(k) plan market is not only underserved and overpriced, it also pays high compliance fines due to failed DOL audit checks.  403(b) plans, offered to schoolteachers and employees of tax-exempt organizations, are particularly notorious for shirking fiduciary responsibility.  Unlike 401(k) plans, many 403(b) plans are not subject to ERISA mandates, which have been key to ensuring fiduciary treatment of plan design and operation.  Indeed, many 403(b) plans suffer from high costs, a lack of transparency, and conflicts of interest – for example, administrators who receive kickbacks from asset managers promote expensive, complex investment products (such as annuities) to sponsors and participants.  No wonder New York’s Department of Financial Services recently launched an investigation into the 403(b) practices of brokers and investment providers such as insurers, the well-known purveyors of annuity products.

It is these segments that retiretech firms are beginning to target, bringing new value propositions and models that provide not just ease of use and fiduciary support, but also transparency and the elimination of conflicts of interest.  Backed by private equity, these firms are targeting the RIA channel, leveraging cloud, APIs, machine learning, low-cost passive investments, and the latest in onboarding practices to make plans cheap, easy, and low-risk enough for small-company employers to sponsor.  Fintech firms like Vestwell offer RIAs a full white-label retirement platform including investment services, trading, administration, and recordkeeping, and take on fiduciary responsibility through the so-called 3(38) and 3(16) arrangements that relieve advisors of the financial cost and legal liability of providing retirement services.  Outfits such as Dream Forward and ForUsAll provide advisor-focused retirement platform services and solutions by partnering with low-cost asset managers and third-party recordkeepers, eliminating the conflicts of interest that typically arise with traditional bundled retirement plan service providers.  BidMoni offers a retirement plan marketplace that lets sponsors and advisors manage provider bids and RFPs and provides comparisons and ongoing fee analysis – all with the purpose of reducing the fiduciary burden on sponsors and advisors.  Furthermore, many of these innovators approach retirement planning within the context of a broader financial needs assessment and plan, which helps align retirement products and services with the participant’s true financial picture.

As the wealth management and retirement industries take on the fiduciary challenge, industry structures favoring arrangements such as commissions and revenue sharing are slowly giving way to ones where investors devise their financial future in full freedom and transparency – and fintech, starting in small corners of a vast industry, is leading the way for a broader transformation.

Conclusion

It is still early days, but financial services are slowly but surely becoming more accessible, more equitable, and more focused on the end consumer, and credit goes in large part to fintechs that are demonstrating what is possible and motivating staid incumbents to change.  Driving transparency, reducing bias, and mitigating conflicts of interest are laudable goals; achieving perfection, however, is neither practical nor desirable.  Industry structures and business models do need to evolve from where they are today, and fintechs are demonstrating what is feasible while still preserving the economics of the industry.

The Five D’s of Fintech: Decentralization

The recent rout in the cryptocurrency markets has not been kind to crypto enthusiasts, but it has hardly put a damper on the underlying technology, the Blockchain.  Blockchain is moving out of the labs and into real-world applications for capital markets and securities firms.  Over the past four years, global banks have successfully piloted Blockchain and are now planning to scale those pilots into production.  Government jurisdictions and regulators have been critical of cryptocurrencies but are embracing Blockchain, a type of distributed ledger technology, as a way to streamline the financial markets.  Infrastructure service providers such as exchanges and data vendors have teamed up with startups and banks to develop consortium-based solutions.  Various other parts of financial services are beginning to look at Blockchain.  Decentralization is the topic of this post, the third installment in the fintech series (see The Five D’s of Fintech: Introduction), in which we look at where and how Blockchain is transforming financial services, and in particular the retirement industry.

Modern developed financial markets are an intricate patchwork of intermediaries connecting transacting parties: middlemen that connect entities, assist in drawing up contracts, and provide monitoring and verification of transactions (centralized clearing and settlement agencies are examples of such intermediaries).  While this arrangement has worked well so far, it suffers from various issues: one example is the information disclosure that the intermediary requires of the transacting parties, which opens up the potential for conflicts of interest and privacy risk.  Distributed ledger technology such as Blockchain, which reduces the cost of networking and the cost of transaction verification, has the potential to fundamentally alter this intermediary-based structure of modern financial markets.

Why Blockchain in Financial Services

At the heart of financial markets operations is how data is used to exchange information, store records of ownership and obligation, and manage the transfer of assets.  Complexity of products and markets, a legacy of old IT infrastructures, and organizational factors make managing such data a hugely complex undertaking.  This is evident in the number of reconciliations that typically happen in a given transaction across the value chain, the time it takes to settle transactions, and the preponderance of security breaks.  Financial institutions spend a large amount of resources managing such data, and yet still take on significant operational and regulatory risk.  Blockchain can help alleviate these operational issues, provide transparency, and enhance security.

Blockchain is a type of distributed ledger technology: a shared, immutable database that is the system of record for all transactions, current and historic.  Based on advanced cryptography, Blockchain provides a set of foundational capabilities.

[Figure: Blockchain’s foundational capabilities]

Blockchain has broad applicability in financial services precisely because it can address the industry challenges mentioned above all at once.  Blockchain provides data efficiency: the immutable ledger provides access to golden-source historical information, its decentralized nature ensures availability of data, and shared access supports accurate data provenance and enrichment.  The decentralized peer-to-peer model allows trustless parties to engage in secure, streamlined transactions, obviating the need for centralized processing.  Blockchain’s smart contract functionality enables a network of value for managing the transfer of assets, e.g., currencies and securities.
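
To make the immutability property concrete, the sketch below (a simplified Python illustration, not any particular DLT implementation) shows an append-only, hash-linked ledger: each block commits to its predecessor’s hash, so rewriting any historical record invalidates every block that follows.

```python
import hashlib
import json
import time

def block_hash(block: dict) -> str:
    """Deterministically hash a block's contents (excluding its own hash field)."""
    payload = {k: v for k, v in block.items() if k != "hash"}
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

def new_block(prev_hash: str, transactions: list) -> dict:
    block = {"timestamp": time.time(), "prev_hash": prev_hash, "transactions": transactions}
    block["hash"] = block_hash(block)
    return block

def verify_chain(chain: list) -> bool:
    """A chain is valid only if every block's stored hash matches its contents
    and links to the previous block -- tampering anywhere breaks the chain."""
    for i, block in enumerate(chain):
        if block["hash"] != block_hash(block):
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

# Build a tiny ledger and show that tampering with history is detectable.
genesis = new_block("0" * 64, [{"from": "treasury", "to": "alice", "amount": 100}])
chain = [genesis, new_block(genesis["hash"], [{"from": "alice", "to": "bob", "amount": 40}])]
assert verify_chain(chain)
chain[0]["transactions"][0]["amount"] = 1_000_000   # attempt to rewrite history
assert not verify_chain(chain)                      # the tampering is immediately evident
```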

Retirement: An Industry in Transformation 

At $28.2 trillion, the US retirement industry, comprising advisors, asset managers and retirement providers, is a large part of the financial services ecosystem.  Closely associated with asset and wealth management, retirement is becoming increasingly important due to broader trends in financial markets.  In the aftermath of the financial crisis, asset management has taken on new market significance, and given the evolving dynamics, it will take on a leadership role in setting the technology agenda and driving change.  The retirement industry is encountering additional pressures.  There is an excess supply of retirement providers in the market, leading to consolidation.  Participants are expecting full financial well-being solutions.  There is emerging competition from “wealthtech” firms now squarely focusing on retirement solutions.  The retirement industry thus faces the triple challenge of improving cost efficiency, providing transparency, and improving services to drive engagement.

Efforts undertaken by asset managers and retirement providers to drive cost efficiency and improve transparency may not be enough, given consumer expectations, regulatory evolution and strengthening competition.  Barriers to realizing fundamental cost efficiencies and improving services stem from the lack of a trusted, shared data environment that facilitates the creation and management of the retirement plan – essentially a contract guiding the investment and distribution of retirement assets.  A big chunk of the industry cost structure results from having to make do without such an environment.  The lack of a true marketplace where sponsors and participants can choose advisors and provider services limits innovation, choice, and scale.  Further, regulators lack on-demand granular transparency and the data access needed to understand systemic risk, which limits the efficacy of regulatory enforcement and policy making.  Beyond driving local optimizations, the retirement industry has an opportunity to fundamentally remake itself.

How Blockchain Can Revolutionize Retirement

The retirement industry needs to acknowledge the Blockchain movement not just because the operational backbone of financial services will one day move to Blockchain, but because Blockchain can help the industry tackle its issues here and now.  It can help drive cost efficiency by streamlining operational processes and reducing manual work effort; it can improve transparency by enabling all stakeholders to securely view and report on data; it can drive innovation and engagement by enabling new products, improving processes, and empowering users.

There are value propositions for all players in the retirement marketplace.  For participants, it can provide a comprehensive view of assets, analytics and advisory; for plan sponsors, it can reduce overhead and provide transparency into fees and performance; for providers, it can bring operational efficiency and enable value-added solutions; and for regulators, it can enable easier enforcement of regulations and access to data for future policy making.  Blockchain can enable all this because it facilitates the creation of an open data environment for industry players on which additional services and innovation can be enabled.

Forward thinking retirement industry players have started experimenting with Blockchain already.  Asset managers such as Vanguard, BlackRock and Fidelity have Blockchain based research projects, pilots and product roadmaps for data sourcing, custody operations and crypto investment products.  Insurance outfits such as Prudential and John Hancock are partnering with Blockchain startups to run POCs for trading and recordkeeping  platforms.  Even payroll providers such as ADP and Kronos are actively evaluating Blockchain for their payroll solutions and services.

A Strategic Imperative

A typical retirement provider has operational pain points and challenges in each part of the functional value chain.  Being a general-purpose technology, Blockchain has application areas across all retirement provider functions.  User identity and fraud management is a key area: Blockchain-based solutions for digital identities, KYC/AML, and fraud detection are a proven use case.  Further, empowering users with control of their data will become increasingly critical due to regulation and customer expectations, which Blockchain can also help address.

[Figure: Blockchain application areas across the retirement provider value chain]

Since Blockchain is a rapidly evolving technology, providers should take a long-term view of adoption, moving through successive stages to learn the technology and prove business value.  Organizations should start out by testing basic Blockchain-based data sharing, gradually moving transactions and whole processes to the new platform.  There are various use cases the industry can focus on in the immediate term on a proofing/piloting basis.  Such use cases can help providers evaluate “ways to play” and develop institutional expertise with the technology.

[Figure: Candidate Blockchain use cases for the retirement industry]

Blockchain is Here To Stay

Blockchain is taking the financial services world by storm.  Financial services firms are spending about $1.7 billion annually on Blockchain, a 67% annual increase in 2017, and one in ten financial institutions now reports Blockchain budgets in excess of $10 million.  Now well beyond initial experimentation, Blockchain-related interest within financial services has reached critical mass.  In a recent survey, 14% of banks and other companies claimed to have successfully deployed a production Blockchain solution.  Headcount dedicated to Blockchain initiatives doubled in 2017, and a typical top-tier bank now has 18 employees dedicated to Blockchain development.  Financial institutions are coming together in Blockchain-based industry utilities and mutualizing commercial interests by working with Blockchain startups.

While initial applications have focused on cross-border payments and trade finance, it is capital markets where Blockchain has the most disruptive potential, promising to revamp the industry’s operational backbone. Not surprisingly, capital markets infrastructure providers are at the forefront of Blockchain innovation, closely followed by commercial and investment banks, with asset management a distant third. As infrastructure providers, banks, and broker-dealers prove out use cases, uptake is expected to increase in asset management.

All financial services organizations should be paying close attention to Blockchain because of its transformative potential.  Since Blockchain reduces the need for financial intermediation, traditional intermediaries such as banks, counterparties and distributors will need to develop new value propositions.  Blockchain is the enabling technology behind cryptocurrencies and the crypto finance markets now emerging to challenge traditional ones.  Furthermore, Blockchain enables the creation of new digital assets and services, which will unleash a wave of financial innovation and a corresponding support ecosystem.

Time for Action

With a game-changing technology such as Blockchain, the temptation for most industry players will be to adopt a wait-and-watch attitude.  However, such a stance may come at a price: innovative incumbents or new fintech players may prove hard to beat, or a Blockchain infrastructure solution may simply be unavoidable due to broad industry adoption.  More fundamental, however, is the prospect of other industry players setting the agenda for the future Blockchain architecture, which may prove strategically disadvantageous for the laggards.  The retirement industry should get ahead of the game with a proactive stance toward innovating with Blockchain.

Providers need to develop a sense of urgency and action, creating a case for change and highlighting the opportunity costs of doing nothing. Providers should take the following actionable steps:

  • Define How To Innovate: Identify the scale and scope of adoption, understand capabilities required, and define the roadmap
  • Build The Foundation: Undertake “no regret” preparation e.g., technology modernization, data quality and governance, API/cloud architecture
  • Test The Waters: Test Blockchain applications and evaluate “ways to play” in the future through partnerships and M&A opportunities
  • Be A Part Of The Movement: Participate in industry forums, consortiums, and innovation efforts on Blockchain and develop institutional expertise

The industry is expected to go through the growing pains of Blockchain adoption: new economic opportunities will emerge, regulatory overhaul and industry participation will need to happen, and rules of play and governance will need to be defined. For retirement, as for many in the financial services world, it is not a question of “if” but “when” for Blockchain, and providers should start preparing for the eventual transition today.

 

Token Economics 101

Background

Much has been discussed about how modern internet-based digital platforms such as Google, Facebook, and eBay, while they have created market value and consumer choice, are prone to rent seeking due to monopolistic winner-take-all effects (There is No Single Solution To Making Internet More Decentralized).  Beyond causing economic distortions, platforms are prime targets for cyber attackers, as recent incidents such as the Equifax and Sony security breaches have demonstrated.  Further, although such platforms have created consumer surplus, they have also introduced consumer deficits of their own by taking away privacy and control of one’s data.  This view of the platforms is part of a larger story of the internet’s balkanization: the emergence of independent domains thanks to platforms that do not interoperate, proprietary networks resulting from regulations that target net neutrality, and geographic islands created by governments restricting the free flow of information.  Platform companies, which have built their platforms atop the internet, get blamed the most for this failure.  The real failure, however, is that of the market, which failed to create effective structures to govern and incentivize appropriate development of the internet and the applications sitting on top of it.  Rather serendipitously, the world may have discovered an alternative way to better manage the creation and development of efficient market systems: cryptocurrency or token economies.  By aligning the incentives and needs of suppliers, consumers and service providers, token economies create value for all participants in an efficient manner.  Indeed, token economies can bring efficiencies not just to internet-based solutions but to any market where goods and services are exchanged.

“Token Economy” As In Applied Psychology?

Token markets are those whose workings are based on a cryptocurrency or token, such as Bitcoin.  On one hand, such markets are guided by an explicitly defined crypto-economic system or token economy; on the other hand, they also evolve dynamically based upon participants’ interactions.  Originating in applied psychology, the term “token economy” refers to a system of incentives that reinforces and builds desirable behaviors; essentially, the token economy implements the theory of incentives used to explain the origins of motivation.  Token economies are a useful tool in behavioral economics, where they have been applied both in the lab and in the real world to study human decision making within economic contexts.  In crypto markets, a token economy is implemented using digital assets as tokens and a framework of rules as the market protocol, implemented in code using cryptography.  Token economics in this sense refers to the study, design, and implementation of economic systems based on cryptocurrencies.  Supporting the token economies of cryptocurrency markets is distributed ledger technology (DLT), such as Blockchain.  Tokenization is the process of converting some asset into a token unit that is recorded on the DLT: thus anything of economic value – real estate, commodities, currencies, and so on – can be tokenized, giving rise to a variety of different token economies.
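
As a toy illustration of tokenization (a hypothetical sketch, not any specific platform’s design), consider dividing an asset into fungible token units whose ownership is tracked as ledger balances:

```python
from dataclasses import dataclass, field

@dataclass
class TokenizedAsset:
    """Toy model: divide an asset into fungible units and track balances.
    A real token economy would record these balances on a distributed ledger."""
    name: str
    total_units: int
    balances: dict = field(default_factory=dict)

    def issue(self, owner: str) -> None:
        # All units initially belong to the issuer/owner of the underlying asset.
        self.balances = {owner: self.total_units}

    def transfer(self, sender: str, receiver: str, units: int) -> None:
        if self.balances.get(sender, 0) < units:
            raise ValueError("insufficient token balance")
        self.balances[sender] -= units
        self.balances[receiver] = self.balances.get(receiver, 0) + units

# Tokenize a property into 1,000,000 units and sell a 5% stake.
property_token = TokenizedAsset("123MainStreet", total_units=1_000_000)
property_token.issue("originator")
property_token.transfer("originator", "investor_a", 50_000)
print(property_token.balances)   # {'originator': 950000, 'investor_a': 50000}
```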

What is Different?

By enabling costless verification and reducing the cost of networking, token economies streamline the operational functioning of markets and open them up for broad innovation (Some Simple Economics of the Blockchain).  Markets are formed when buyers and sellers come together to exchange goods and services.  Effective functioning of markets depends upon effective verification of transactions between sellers and buyers, a function that market intermediaries take on as market complexity increases.  The intermediary structure, however, brings its own set of issues: intermediaries can misuse information disclosed to them by buyers and sellers; conflicts of interest, agency problems, and moral hazard may still persist; intermediaries can abuse their market power (as has happened with some tech giants recently); and their presence increases the overall cost structure of the market.  Further, hurdles to developing trust between parties limit the extent to which business relationships develop, thus limiting innovation: it is costly and time-consuming to develop and enforce contracts to guide the exchange of property rights.  Through the use of DLT, token economies provide costless verification (and thus mitigate the need for intermediaries) and reduce the cost of networking (and thus provide a way to efficiently exchange property rights).  But what is fundamentally different about token economies is that they create a framework in which participants mutualize their interests and have strong incentives to continually improve the functioning of the economy.

Flywheels, Loops, and Knock On Effects

A key part of token economy design is “mechanism design”: the design of a system of incentives to encourage fruitful development of the token economy infrastructure as well as the overlying applications and services.  Mechanism design addresses how tokens are used to pay participants for maintaining the DLT network, for service usage, profit sharing, governance and so on.  An optimally designed token economy creates the appropriate mix of incentives to ensure a high-performing market infrastructure, valuable services, and smooth running of the platform.  As overlay protocols, applications and APIs are built on top of the base DLT (The Blockchain Application Stack), mechanism design ensures value is distributed equitably not just within a layer but across the layers.  An appropriately designed token system unleashes powerful feedback loops that perpetuate desirable behaviors and actions in the marketplace – indeed, mechanism design can either make the token economy the El Dorado of all crypto economies, or be the death knell of the token even before it has had a fighting chance.  Depending upon the specific market it is trying to make, each token economy will have a unique token design and its own expected feedback loops, but there are some common fundamentals to how incentives work in a token economy.

[Figure: The token economy’s positive feedback loop]

A new token system often comes to market through an ICO (initial coin offering), which allows the founding team to distribute tokens to raise capital for funding the enterprise.  Much of the initial pull for the token depends upon the team’s vision and the soundness of the token economy.  As the market’s perceived value of the token increases, participants, customers and speculators take note and invest in the token, increasing its demand.  Since the token supply typically has a ceiling, increasing demand leads to increasing value (assuming token holders hold on to the token for at least some period), which attracts even more participants, customers and speculators.  A growing number of participants strengthens the network in two ways: the network becomes more decentralized and secure, and participants work to improve it and develop services on the platform.  This increases the utility of the system, which attracts more customers and further strengthens the feedback loop.

These same mechanisms can work in reverse, leading to the token economy’s death spiral if there is a perceived loss of value in the token economy.

[Figure: The token economy’s negative feedback loop]

A token’s perceived utility can take a hit for a variety of reasons: the token may not create enough utility in the market, or a security weakness may open it up to a cyberattack.  A perceived loss in utility can lead to a selloff in the market, which depreciates the value of the token.  As the token value depreciates, more speculators start dumping the token.  A depreciating token value also disincentivizes participants from developing the token protocol and services, which reduces the token’s utility for consumers, leading to reduced demand.  An increased supply of tokens in the market further reduces the token’s value, perpetuating the negative feedback loop.
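
Both loops can be illustrated with a deliberately simple toy model (the update rule and all numbers below are assumptions for illustration only): with supply capped, price follows net demand, and net demand feeds on sentiment about the token’s utility plus price momentum, so small shifts in perception compound in either direction.

```python
def simulate_token_market(price: float, sentiment: float, steps: int = 8) -> list:
    """Toy flywheel: with supply capped, price moves with net demand, and demand
    itself feeds on perceived utility (sentiment) plus recent price momentum."""
    prices = [price]
    for _ in range(steps):
        momentum = prices[-1] / prices[-2] - 1 if len(prices) > 1 else 0.0
        net_demand = sentiment + 0.5 * momentum          # speculators chase momentum
        prices.append(max(0.01, prices[-1] * (1 + net_demand)))
    return prices

print(simulate_token_market(price=1.0, sentiment=+0.10))  # virtuous cycle: prices ratchet up
print(simulate_token_market(price=1.0, sentiment=-0.10))  # death spiral: prices decay
```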

Token Economies Galore

Token markets originated with digital currencies for P2P payments, the first to market being Bitcoin, which was then followed by a number of alternative digital currencies (so-called “alt-coins”) such as Zcash, Litecoin and Monero that address use cases for which Bitcoin was not the best solution.  Tokenization moved to computing when Ethereum was launched for decentralized computing through smart contracts, and markets such as Storj, Filecoin, and Sia came into being for decentralized storage, a nascent challenge to centralized cloud providers such as Amazon.  Blockstack wants to go further by providing an entire decentralized infrastructure for building the decentralized applications of the future, effectively replacing the current internet-based architecture.  Not even current-day platforms are safe: OpenBazaar offers a decentralized e-commerce marketplace, and Lazooz is a protocol for real-time ride sharing.  Most of the tokens these economies operate with are “utility tokens” used to access a service those economies provide, e.g., using decentralized storage or buying something in an e-commerce marketplace.  Attention is now turning to “security/asset-backed tokens”, which represent ownership in a company through cash flows, equities, futures, etc., or hard assets such as commodities or real estate.  Asset-backed tokenization is in full swing, targeting commodities, precious metals and real estate (Digix and Goldmint for precious metals, D1 and Cedex for diamonds, etc.).  Security token offerings, like those enabled by Polymath, will provide investors exposure to token economies.

This is Just the Beginning

Just as the Cambrian period enabled the creation of a multitude of new life forms, the emergence of token economies is opening up a wide range of previously unavailable markets as well as new ways to compete against entrenched incumbents.  Sure, many of the new token economies coming into being today will die out, but many of those that survive will be epic.  Designing the right economic model for tokens is crucial, and token economics provides the necessary groundwork and framework for devising such models.  More study of token economics is required, especially since it involves technically complex mechanisms and market designs that are entirely new.  As the industry experiments with tokens across various markets, it will learn more about what is just great economic theory versus what really works in the complexity of real-world interactions.

Understanding The Building Blocks of a Distributed Ledger System

Introduction to DLTs

Distributed ledger technology (DLT) is being hailed as a transformative technology, with comparisons drawn to the Internet for its potential to transform and disrupt industries.  As a “platform” technology for decentralized, trust-based peer-to-peer computing, DLT helps shape new “domain” capabilities, just as computer networking enabled the Internet and the creation of capabilities across communication, collaboration and commerce.  Like the Internet, it will have far-reaching consequences for the enterprise architectures of the future.  Not only will DLT transform the technology stack of established domains (witness how Blockchain is transforming identity management infrastructure in the enterprise), but it will also give rise to new architecture paradigms as computing moves to decentralized, trust-based networks – for example, in how an enterprise interacts with its business partners, suppliers and buyers.  The Internet took 30 years to have disruptive effects in the enterprise, and DLT’s full impact is expected to play out over a similar time frame.

DLT represents a generic class of technologies (Blockchain is a prominent example), but all DLTs share the concept of the distributed ledger: a shared, immutable database that is the system of record for all transactions, current and historic, maintained by a community of participating nodes that have some sort of incentive (usually a token or a cryptocurrency) to keep the ledger in good standing.  The emergence of DLTs can be traced back to the original blockchain applications, Bitcoin and Ethereum.  Various other distributed ledger applications have emerged to solve specific industry/domain issues: R3’s Corda in financial services, Ripple for payments, etc.  Innovation in the DLT space is proceeding at a feverish pace.  The well-established DLT-based networks can essentially be segmented along two dimensions: how ledger integrity is guaranteed through validation, and whether the ledger is private or public.

DLT and Enterprise Architecture

As participants in DLT-based networks developed by industry utilities or consortiums, organizations may not have a strong need to master the internal architecture design and trade-offs associated with such a platform.  However, the architecture community in those organizations will still need to understand how the networks they participate in work, at least well enough to grasp the implications for their organizations.  Furthermore, as intra-company applications of DLT become mainstream, enterprise architects will increasingly be called upon to provide perspectives on the optimal design of the underlying technology.  As DLT moves from innovation labs into the mainstream enterprise, architects will need to start preparing their organizations to accept DLT-based applications into the organizational landscape.  A good place to start for enterprise architects is to understand just what the DLT technical architecture encompasses: what building blocks comprise a DLT system, and what architectural decisions need to be made.

The Building Blocks of a DLT System

To understand a complex technology such as DLT, it may be helpful to draw parallels to the TCP/IP stack for computer networking, to which Blockchain has been compared in the past (The Truth About Blockchain).  While there may not be a straight one-to-one correspondence between the Internet’s OSI model and the DLT architecture, drawing the parallel helps one understand conceptually how the building blocks fit together.  The OSI model is a generic architecture that represents the several flavors of networking that exist today, ranging from closed and proprietary to open and standards-based.  The DLT building blocks likewise provide a generic architecture that represents the several flavors of DLTs that exist today, and ones yet to be born.

In theory, it should be possible to design each building block independently, with well-defined interfaces, so that the whole DLT system comes together as one, with higher-level building blocks abstracted from the lower-level ones.  In reality, architectural choices in one building block influence those in other building blocks, e.g., the choice of a DLT’s data structure influences the consensus protocol most suitable for the system.  As common industry standards for DLT architecture and design develop (Hyperledger is an early development spearheaded by The Linux Foundation) and new technology is proved out in the marketplace, a more standardized DLT architecture stack will perhaps emerge, again following how computer networking standards emerged.  There is value, nevertheless, in being able to conceptually view a DLT system as an assembly of these building blocks to understand the key architecture decisions that need to be made.

Key Architectural Tradeoffs in DLT Systems

Architecting a DLT system involves making a series of decisions and tradeoffs across key dimensions.  These decisions optimize the DLT for the specific business requirement: for some DLT applications, performance and scalability may be key, while for some others, ensuring fundamental DLT properties (e.g., immutability and transparency) may be paramount.   Inherent in these decisions are architectural tradeoffs, since the dimensions represent ideal states seldom realized in practice.  These tradeoffs essentially involve traversing the triple constraint of Decentralization, Scalability, and Security.

Decentralization reflects the fundamental egalitarian philosophy of the original Bitcoin/Blockchain vision, i.e., that the distributed ledger should be accessible, available and transparent to all at all times, and that all participating nodes in the network should validate the ledger and thus hold the full ledger data.  Decentralization enables trustless parties to participate in the network without the need for central authorization.  Scalability refers to the goal of achieving an appropriate level of transaction throughput, sufficient storage capacity to record transaction data, and acceptable latency for a transaction to be validated and recorded once it is submitted.  Scalability ensures that appropriate performance levels are maintained as the size of the network grows.  Finally, Security is the ability to maintain the integrity of the ledger by warding off attacks and making it impossible to maliciously change the ledger for one’s own benefit.  Fundamentally, this dimension reflects a security design built into the fabric of how the ledger operates, rather than reliance on external ‘checking’ to ensure safety.

Bringing It Together: DLT Building Block Decisions and Architectural Tradeoffs

Applying these architectural decisions allows one to come up with different flavors of DLT systems, each making tradeoffs to navigate the triple constraint described above.  Traversing the sides of the triangle moves one across different DLT architecture styles, with the vertices denoting pure architectural states seldom realized in practice.  For example, systems like Bitcoin and Ethereum tend toward Vertex A: they maximize Decentralization through their decentralized P2P trustless model, and Security through consensus building and validation methods that prevent malicious attacks (although both Bitcoin and Ethereum have been shown to have other security vulnerabilities), but sacrifice much in terms of Scalability (Bitcoin’s scalability woes are well known, and Ethereum is only slightly better).  On the other hand, permissioned DLTs, such as Corda, tend toward Vertex C: they maximize Scalability and guarantee Security, but sacrifice Decentralization (by definition, permissioned DLTs are not transparent, since they restrict access and validation is provided only by a set of pre-authorized validating nodes) and may suffer other security issues (both the trusted nodes and the central authority in a permissioned DLT system can be attacked by a nefarious party).  DLT variations such as Bitcoin’s Lightning Network and Ethereum’s Raiden tend toward Vertex B, aiming to use off-chain capabilities to improve the Scalability of the traditional Bitcoin and Ethereum networks while preserving Decentralization (despite some recent concerns that these networks tend to become centralized in the long run), although their off-chain capabilities may require additional Security capabilities (they also partially move away from the Blockchain’s decentralized security apparatus).  Let’s examine how these tradeoffs come into play at the level of the DLT building blocks.

Layer 3: Ledger Data Structure

Ledger Data Structure encapsulates decisions around how the distributed ledger is actually structured and linked at a physical level, e.g., as a chain of blocks, a graph, etc.  Additionally, it captures decisions around how many ledger chains there are, and whether nodes carry the entire ledger or just a part of it.  In traditional Blockchain, the ledger is structured as a global sequential linked list of blocks, instances of which are replicated across all participating nodes.  This design goes hand in hand with traditional Blockchain’s Proof of Work consensus protocol in ensuring high levels of Decentralization and Security – each node has the current instance of the global ledger chain, and there is decentralized consensus building for block validation (although a few security vulnerabilities with Blockchain have come to the forefront, and Proof of Work is susceptible to centralization due to economies of scale in mining).  As we know, this design takes a toll on Scalability – Bitcoin’s Blockchain can process only a handful of transactions per second, and the time required to process a block is high (Bitcoin generates a new block roughly every 10 minutes).

Some new designs are emerging with alternative data structures that improve Scalability and Performance, such as NXT’s and SPECTRE’s DAG (directed acyclic graph) of blocks, which mine DAG blocks in parallel to allow more throughput and lower transaction times, and IOTA’s Tangle, a so-called “blockless” DLT that gets rid of block mining altogether and relies on a DAG of transactions to maintain system state and integrity.  These new designs have yet to be implemented and used at scale, and many have their own set of challenges (some claim they will continue to rely on some form of centralization to gain scale, and they also have security-related challenges).  However, the DLT community’s interest has been high: IOTA’s Tangle has been creating a buzz in DLT circles as a possible serious contender in the IoT world (since its data structure and protocol are well suited to handling continual streams of data), and several blockless DLT startups have been born lately.

Tinkering with how ledger data is stored across nodes represents another opportunity to gain Scalability.  For example, sharding, a concept fairly well established in the distributed database world, is coming to DLTs.  Applied to DLTs, sharding enables the overall Blockchain state to be split into shards that are then stored and processed by different nodes in the network in parallel, allowing higher transaction throughput (Ethereum’s roadmap pairs sharding with Casper to drive scalability and speed).  Similarly, Scalability can be improved by having multiple chains, possibly private, to enable separation of concerns: “side chains” enable processing to happen on a separate chain without overloading the original main chain.  While such designs improve Scalability, they move away from the DLT vision of enabling democratic access and availability to all participants at all times, and they also present Security-related challenges – part of the reason why widespread adoption of sidechains has been slow.
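
The routing idea behind sharding can be sketched as follows (an illustrative toy, not Ethereum’s actual sharding design): transactions are deterministically assigned to one of N shards so that different groups of nodes can validate disjoint subsets in parallel.

```python
import hashlib

NUM_SHARDS = 4

def shard_for(account: str) -> int:
    """Deterministically route an account (and its transactions) to a shard."""
    digest = hashlib.sha256(account.encode()).digest()
    return int.from_bytes(digest[:4], "big") % NUM_SHARDS

# Each shard's validators only process the transactions routed to them,
# so total throughput scales (roughly) with the number of shards.
shards = {i: [] for i in range(NUM_SHARDS)}
transactions = [{"from": "alice", "to": "bob"}, {"from": "carol", "to": "dan"}, {"from": "erin", "to": "frank"}]
for tx in transactions:
    shards[shard_for(tx["from"])].append(tx)
print({i: len(txs) for i, txs in shards.items()})
```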

Layer 2: Consensus Protocol

The consensus protocol determines how transactions are validated and added to the ledger, and the decision-making in this building block involves choosing a specific protocol based on the underlying data structure and the objectives related to the triple constraint.  Proof of Work, the traditional Blockchain consensus protocol, requires transactions to be validated by all participating nodes; it enables a high degree of Decentralization and Security, but suffers on Scalability.  Alternative protocols, such as Proof of Stake, provide somewhat better Scalability by changing the incentive mechanism to align more closely with the good operation of the ledger.  Protocols based on Byzantine Fault Tolerance (BFT), which have been successfully applied to other distributed systems, are applicable to private ledgers and depend upon a collection of pre-trusted nodes.  Such protocols sacrifice Decentralization to gain Scalability.
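
The tradeoff is easiest to see in a minimal Proof of Work sketch (illustrative only, not Bitcoin’s implementation): producing a valid block requires a costly brute-force search for a nonce, while verifying the result takes a single hash.

```python
import hashlib

def mine(block_data: str, difficulty_bits: int = 18):
    """Brute-force a nonce so that sha256(block_data + nonce) falls below a target
    with `difficulty_bits` leading zero bits; higher difficulty means more work."""
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if int(digest, 16) < target:
            return nonce, digest
        nonce += 1

# Producing the proof takes roughly 2**18 hash attempts on average...
nonce, digest = mine("block 42: alice pays bob 5")
# ...but any node can verify it with a single hash.
assert int(hashlib.sha256(f"block 42: alice pays bob 5{nonce}".encode()).hexdigest(), 16) < (1 << (256 - 18))
# Proof of Stake replaces this hash race with validator selection weighted by staked tokens,
# trading some of PoW's brute-force work for higher throughput.
```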

Ethereum’s Raiden and Bitcoin’s Lightning Network are innovations that add scalability to Ethereum and Bitcoin, respectively, by securely moving transactions off the main chain to a separate transacting channel and then returning to the main chain for settlement – the so-called “Layer 2” innovations.  This design moves load off the main ledger; however, since transactions occurring on the channel are not recorded on the ledger, it sacrifices Security (the transacting channels need additional security apparatus that is not part of the original chain) as well as Decentralization (channel transactions are not accessible to other participants).
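
The channel concept can be sketched as follows (a conceptual toy, not the Lightning or Raiden protocol; it omits signatures and dispute handling): two parties lock funds, exchange many off-chain balance updates, and settle only the final net position on the main chain.

```python
class PaymentChannel:
    """Toy two-party channel: only open (the constructor) and close() would touch
    the main chain; every pay() in between is an off-chain balance update."""
    def __init__(self, a: str, b: str, deposit_a: int, deposit_b: int):
        self.balances = {a: deposit_a, b: deposit_b}
        self.offchain_updates = 0

    def pay(self, sender: str, receiver: str, amount: int) -> None:
        if self.balances[sender] < amount:
            raise ValueError("insufficient channel balance")
        self.balances[sender] -= amount
        self.balances[receiver] += amount
        self.offchain_updates += 1      # exchanged between the two parties, never broadcast

    def close(self) -> dict:
        # Only this final net position is settled back on the main ledger.
        return dict(self.balances)

channel = PaymentChannel("alice", "bob", deposit_a=5_000, deposit_b=5_000)
for _ in range(1_000):
    channel.pay("alice", "bob", 1)      # 1,000 micro-payments, zero main-chain transactions
print(channel.close())                  # {'alice': 4000, 'bob': 6000} settled in one on-chain step
```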

A number of other protocols and schemes to improve scalability and security are in the works, many of which are variations of the basic PoW and PoS and which envision a future comprising not one single ledger chain but a collection of chains.  Examples include Kadena, which uses PoW on a braid of chains; EOS, which uses delegated PoS; and Cosmos’s Tendermint, which uses BFT-based PoS across a universe of chains.

Layer 1:  Computation and App Data

DLT resources such as storage and computation come at a premium, and it costs real money to submit transactions in a DLT system.  In the topmost layer, therefore, the architectural decisions deal with providing flexibility and functionality related to data storage and computation – essentially, how much of each should reside on-chain and how much off-chain.  Additionally, this layer deals with decisions around how to integrate the DLT with events from the real world.

For computation, the Bitcoin Blockchain and Ethereum provide constructs for putting data and business logic on-chain for execution, and Ethereum is far more advanced than Bitcoin in this regard since it offers “smart contracts” – essentially code that is executed on the chain when certain conditions are met.  There are obvious advantages to doing all computation on-chain: interoperability between parties and immutability of code, which facilitates trust building.  There is, however, a practical limit to how complex smart contracts can be, and that limit is easily reached.  Offloading complex calculations to off-chain capabilities allows one to leverage DLT capabilities in a cost-effective and high-performing manner.  TrueBit, an online marketplace for computation, enables a pattern in which complex, resource-intensive computation can be offloaded to a community of miners who compete to complete the computation for a reward and provide results that can be verified on-chain for authenticity.  While this provides upside in terms of Scalability and Decentralization, there are Security-related implications of using off-chain computation, an area of active research and development.
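
The off-chain compute / on-chain verify pattern can be sketched with a toy example (illustrative only, not TrueBit’s actual protocol), using integer factoring as a stand-in for any computation that is expensive to perform but cheap to check:

```python
def offchain_factor(n: int):
    """Expensive off-chain work (a stand-in for any heavy computation): trial-division factoring."""
    for p in range(2, int(n ** 0.5) + 1):
        if n % p == 0:
            return p, n // p
    raise ValueError("no non-trivial factors found")

def onchain_verify(n: int, claimed) -> bool:
    """Cheap on-chain check: the contract only multiplies, it never has to factor."""
    p, q = claimed
    return p > 1 and q > 1 and p * q == n

task = 999_983 * 1_000_003          # the problem posted to the computation marketplace
result = offchain_factor(task)      # solved off-chain by a worker competing for the reward
assert onchain_verify(task, result) # verified on-chain in O(1) before payment is released
```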

What applies to computation also applies to data storage in the DLT world.  While Blockchain and Ethereum provide basic capabilities for storing data elements, a more suitable design for managing large data sets in DLT transactions is to use off-chain data infrastructure or cloud storage providers while maintaining hashed pointers to these data sets on-chain.  Solutions like Storj, Sia, and IPFS aim to provide a P2P, decentralized, secure data management infrastructure that can hook into DLTs through tokens and smart contracts, and manage data and computation securely through technologies such as secure MPC (multi-party computation).  As with off-chain computation, off-chain storage has upside in terms of Scalability and Decentralization; however, there are security- and durability-related implications.
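
A minimal sketch of the hashed-pointer pattern (illustrative only; Storj, Sia, and IPFS each have their own protocols): the bulky document lives off-chain, while the ledger stores only its content hash, which is enough to later prove that the stored copy has not been altered.

```python
import hashlib

def content_hash(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# Off-chain: the full document lives with a storage provider (cloud, IPFS node, etc.).
offchain_store = {}
document = b"401(k) plan agreement, 48 pages of terms and fee schedules..."
doc_id = content_hash(document)
offchain_store[doc_id] = document

# On-chain: the ledger records only the hash pointer, keeping the chain lean.
onchain_record = {"tx": "plan-document-registered", "document_hash": doc_id}

# Later, anyone can fetch the document off-chain and prove it matches the on-chain pointer.
retrieved = offchain_store[onchain_record["document_hash"]]
assert content_hash(retrieved) == onchain_record["document_hash"]   # tamper-evidence
```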

What provides immutability to the distributed ledger (its deterministic method of recording transactions) is also its Achilles’ heel: it is difficult for the ledger to communicate with and interpret data from the outside, non-deterministic world.  Oracles – services that act as middlemen between the distributed ledger and the non-DLT world – bridge that gap and make it possible for smart contracts to be put to real-world use.  Various DLT oracle infrastructures are in development (ChainLink, Zap, Oraclize, etc.), each providing different features; choosing the right oracle architecture for the specific use case under consideration is thus crucial.  As with off-chain data, oracles provide upside in terms of Scalability and Decentralization; however, there are security- and data-verifiability-related concerns.
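
The oracle pattern can be sketched as follows (a hypothetical toy, not any specific oracle network; real oracles use public-key signatures, for which an HMAC with a shared key stands in here for simplicity): an off-chain service attests to a real-world fact, and the contract acts only on data whose provenance it can verify.

```python
import hashlib
import hmac

ORACLE_KEY = b"hypothetical-shared-secret"   # stand-in for the oracle's signing key

def oracle_report(observation: str) -> dict:
    """Off-chain oracle service: observes a real-world fact and signs its report."""
    sig = hmac.new(ORACLE_KEY, observation.encode(), hashlib.sha256).hexdigest()
    return {"observation": observation, "signature": sig}

def contract_accepts(report: dict) -> bool:
    """On-chain smart contract logic: only acts on data attributable to the trusted oracle."""
    expected = hmac.new(ORACLE_KEY, report["observation"].encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, report["signature"])

report = oracle_report("3M-benchmark-rate=2.31%")   # e.g., a rate feed a contract cannot fetch itself
assert contract_accepts(report)                     # provenance is verified before settlement
```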

[Figure: DLT building blocks and key architectural decisions]

Conclusion

These are still early days for DLT, and many of the improvements needed to make it commercially implementable are yet to come.  Beyond scalability and security, DLTs face a number of hurdles in enterprise adoption, such as interoperability, complexity and a lack of developer-friendly toolkits.  The future will probably hold not just one ledger technology here or there, but a multitude, each optimized for a specific use case within an organization, and even superstructures such as chains of chains connected with oracles, middleware and the like.  Nor will these structures replace existing technology architecture; they will exist alongside and will need to be integrated with legacy technologies.  Like networking, DLTs will give rise to new processes, teams, and management structures.  Enterprise architects will play a central role in facilitating the development of DLT as a true enterprise technology.

The Five D’s of Fintech: Disintermediation

Finance, as the adage goes, is the art of passing money from hand to hand until it finally disappears.  Like giant spiders in the middle of a system of webs, banks have played a key role in intermediating money flows across the vast capitalistic system, and doing so in a highly profitable manner.  Banking of yore – taking deposits from customers and making loans to borrowers – has given way to a lending system dominated by non-bank financial institutions and capital markets.  This “banking disintermediation” – banks no longer holding the loans they originate on their balance sheets but selling them off; borrowers going directly to the capital markets rather than to banks to obtain credit; savers investing directly in securities – refers to credit disintermediation and has been in the making for decades: banks moved from the boring, low-profitability business of credit provisioning to high-margin fee-based businesses (investment banking, M&A, insurance, etc.).  “Disintermediation” – the third theme in The Five D’s of Fintech – has taken on a renewed significance in the wake of the rising tide of fintech: this time around, it refers to how banks may be disintermediated from their customers in their fee-based businesses.  The customer relationship, the bedrock of the origination and sales activity that dominates such fee-based businesses, is now under attack, and banks have started to take note.

Fintech-led disintermediation has been palpable in the areas into which venture investment has traditionally poured: peer-to-peer lending, remittances, payments, equity crowdfunding and even robo-advisory.  Fintech use cases on the disintermediation of traditional payment players are old news.  By comparison, the impact of fintech-led disintermediation of banks in capital markets appears relatively small.  By some estimates, only $4 billion of the $96 billion (around 4%) of fintech investment since the beginning of the millennium has been directed to capital markets, and of the 8,000+ fintechs globally, only around 570 (around 7%) operate in capital markets (Fintech in Capital Markets: A Land of Opportunity).

This may be beginning to change, not least because of the juicy returns that can potentially be plucked from investment banks in origination and sales – the key activity in fee-based businesses such as investment banking, M&A/payments and asset management – which, at an estimated 22% ROE (Cutting through the noise around financial technology), is a much more profitable business than credit provisioning, the key activity in core banking (lending, deposits, etc.), with an estimated ROE of only 6%.  Entry barriers to capital markets and investment banking are significant: the highly regulated nature of the business, the oligopolistic nature of the industry, vicious competition from incumbent banks, and a complex operating environment.  However, regulatory pressures, compressed margins and technology-enabled opportunities are forcing incumbent banks to acknowledge fintech-led opportunities.

The investment banking world is being buffeted by the forces of digitization, regulation, and competition.  Continued electronification of OTC markets, for example, has meant greater transparency around the price discovery process for investors.  Dealers have pulled back from markets where regulatory requirements have rendered them unprofitable, providing an opening wedge for non-bank players such as hedge funds and institutional investors to step in.  The financial crisis has tarnished banks’ reputations.  It is against this backdrop that fintech outfits are evolving from just providing solutions to automate and streamline bank transactions to posing serious threats across asset classes (equity/debt/FX), markets (primary/secondary), and the value chain (origination through distribution and dealing).  In the world of supply chain financing, technology-led disintermediation has begun to make inroads.  Regulatory pressures, decreasing margins, and KYC/AML challenges have made it difficult for commercial banks to scale their supply chain finance business: the market is underserved and rapidly growing.  This is changing with the emergence of platforms such as PrimeRevenue, an American supply chain financing platform that connects 70 lenders, including 50-odd banks, to 25,000 suppliers with $7bn-worth of invoices a month.  While such platforms are not overthrowing banks’ supply chain financing business completely, they are slowly but surely inserting themselves into direct relationships with end customers: witness how C2FO has offered Citi more lending opportunities in return for access to its supply chain financing customers.

The origination market for new securities issuance is an area where incumbents are still strong but evolving conditions may favor fintech-led innovation. Some large asset managers want to ensure that corporations structure the securities those asset managers want to buy, not necessarily those a bank may prefer to structure: this may be just the kind of solution a fintech develops. Platforms that connect investors directly with issuers are few and far between in the equity and debt primary markets, and even those that exist keep dealers in the mix. But slowly and surely, outfits such as Origin (a private placement platform aiming to simplify and automate the issuance of Medium Term Notes) and Overbond (a Canadian platform provider aiming to streamline bond issuance for both private placements and public offerings) are going to market with automation and auditing solutions today, and may in the future choose to position themselves more prominently between investors and issuers by offering data assets and related value-added services.

There are, however, more worrisome developments for banks in the securities origination market. Investors are discovering new avenues for investing, such as equity crowdfunding, which may affect banks’ traditional business in private placements and investment banking fees. There are indications that institutional investors have already started to use crowdfunding websites to gain stakes in new businesses, for example in the real estate sector. One website enables crowdfunders to pool their capital and compete with institutional investors or co-invest with venture capital funds. There is a perceived threat that more buy-side firms will tap crowdfunding sites. Most worrisome for banks is perhaps the specter of alternative fund-raising models in which the conventional IPO gives way to less intermediated mechanisms such as electronic auctions, crowdfunding and initial coin offerings (ICOs).

New capital raising models represent the latest way in which disintermediation is playing out at the very heart of the capitalistic system. Equity crowdfunding has the potential to disintermediate early stage financing and democratize access to capital and investment opportunities (in spite of the regulatory and reporting burdens imposed by the JOBS Act): witness the rise of platforms such as Kickstarter and Indiegogo. Equity crowdfunding has been thriving for a few years now, but it is the initial coin offering (ICO) that has attracted investor attention of late.

In contrast to traditional venture funding or an IPO, where investors typically get a slice of the equity and a measure of control in the business, an ICO raises money by selling a cryptocurrency – called a “token” – at a discount to investors, who can then trade the tokens on a crypto exchange. ICOs differ from traditional fundraising and even equity crowdfunding: they are largely unregulated, and startups do not have to fork over a slice of their equity. Indeed, ICOs have been on a tear: a quarter of blockchain investment over the past couple of years (approximately $250 million) has come from ICOs, two recent ICOs (Qtum and OMG) passed the unicorn mark in a matter of mere months, and storied VCs are investing in ICO-funded startups (Cryptocurrency Hedge Fund Polychain Raises $10 million). ICOs offer yet another fund-raising avenue for startups that are skittish about handing control to outsiders, that do not want the regulatory hassle of taking their company public, or that have been neglected by the market. For investors, ICOs provide the lure of massive returns, although they are extremely high risk and many are scams.

Capital markets’ “Uber moment” may not be around the corner, but the capitalistic middlemen will increasingly have to define what value they bring in light of the disintermediated banking models being championed by fintechs. It is not just in capital raising that fintech startups are mushrooming, but in other areas of the capital markets as well – for example clearing and settlement, where blockchain-based startups are providing capabilities for fully automated settlement without the need for a central counterparty. Symbiont, a New York based blockchain startup, has demonstrated how “smart securities” can streamline post-trade processing and settlement. This has huge implications for middle and back office functions in banks, and for the wider capital markets ecosystem including custodians, clearing houses and depository institutions. This redundancy of centralized processing in the fintech utopia will be the theme of the next installment of the Five D’s of Fintech series.

The Five D’s of Fintech: Disaggregation


“Death by a thousand cuts”, “sandblasting”, “mastodons attacked by ants” and similar metaphors have been used to describe the scourge of fintech and insurtech insurgents and their impact on incumbent banks and insurance companies. “Disaggregation”, or unbundling, of products and services lies behind fintech’s poaching of customers from banks – not single-handedly but collectively, one product or service at a time. Stories and studies abound on how technology is unbundling not just products and services but entire value chains. This disaggregation is now well established in traditional banking (see CB Insights’ blog post as an example), but is only now emerging in insurance. Disaggregation is the topic of this post, the second installment in the fintech series (see The Five D’s of Fintech: Introduction), in which we look at where and how such disaggregation is taking place in the insurance industry.

The insurance industry is experiencing a technological double whammy of sorts: not only does technology enable the creation and use of new behavioral context that fuels new competition, it also threatens the underlying business economics of the industry. Insurance experts talk about “behavior disaggregation” to describe how consumer behaviors can be tracked and analyzed to engage with consumers directly in real time, price risk accurately and discretely, and provide frictionless services. For example, it is not hard to imagine a “connected home” where the various safety measures one might take – installing a burglar alarm, say – are instantly used to recalibrate risk and adjust the homeowner’s or renter’s insurance. Tech-savvy startups are already leveraging such behavior disaggregation: pay-as-you-go car insurance from Metromile is an example where driver behavior can be tracked to provide fit-for-purpose policies. Oscar, a health insurer in New York, gives all its policyholders a fitness tracker; whenever they hit a set goal (walking 10,000 steps in a day, say) they get a refund of a dollar. Where incumbent insurers put their consumers in a policy straitjacket based on blunt risk indicators such as age and occupation, insurtech players leverage technology to study behaviors down to the “last mile” and offer highly flexible solutions, customized not just by the asset type insured but also by duration, for example through “micro duration” policies. These companies have the focus and the agility to build this specialization: as Metromile has done for car insurance, Trov has done for general merchandise, Ladder for life, and Hippo for home. Increasingly, traditional insurers may find that insurtech’s win-win value propositions are hard to beat with traditional product bundles – hence the case for unbundling. But unbundling existing products is not enough. Unfortunately for traditional insurers, insurtech’s micro-targeting of risk shrinks incumbents’ existing risk and profit pools: they are thus forced not just to compete with the upstarts but also to seek out new revenue sources.

Insurtech is forcing disaggregation at an industry-level scale. Incumbents have traditionally managed the entire value chain spanning product and service distribution, underwriting, claims management and investment/risk management. Using independent agency networks, conventional underwriting models and reinsurance for risk management, carriers have thrived in the marketplace. This value chain is now coming apart as each link in the chain is impacted by technology. New digital sources of distribution are threatening to disintermediate carriers from end consumers – witness the rise of CoverHound for auto, PolicyGenius for life and disability, Abaris for annuities, and various others. Traditional risk pools are shrinking, risk is migrating from consumers to products, and the nature of risk is evolving thanks to self-driving cars, IoT technologies and the sharing economy – all of which has led to the emergence of new competitors offering alternative underwriting models, such as Lemonade, Guevara, and Friendsurance. Upstarts such as Claimable seek to intermediate by providing an easy claims settlement experience to end consumers. New arrangements such as Managing General Agents, catastrophe bonds and collateralized reinsurance are disaggregating the carrier/reinsurer relationship: carriers can now go directly to capital markets, and reinsurers can strike up business arrangements with startups focused on customer acquisition. The neatly linked insurance value chain is slowly moving to a horizontal, stack-based structure (see BCG’s Philip Evans’ idea of stacks here).

Insurance differs from traditional financial services in that its players, unless they are pure brokers, have to take on an element of risk and hold the associated capital, all of which comes with a heavy load of regulatory requirements. For these reasons, insurtech has been slow to penetrate the $6 trillion insurance industry goliath. The pace, however, may accelerate. According to McKinsey, automation could leave up to 25 percent of the insurance industry’s current full-time positions consolidated or replaced over the next decade (see Automating the Insurance Industry). If nothing else, carriers should do everything in their power to prevent disintermediation, which will be the topic of the next installment in the fintech series.

The Five D’s of Fintech: Introduction

“Fintech” (a portmanteau of financial technology that refers to the disruptive application of technology to processes, products and business models in the financial services industry) is coming of age: two of the most prominent Fintechers, OnDeck and Lending Club, have gone public, many more are processing transactions on the order of billions of dollars, and outfits providing market intelligence in fintech are cropping up – there is even a newly minted index to track activity in marketplace lending. Banks are increasingly taking note of the Fintech movement, partnering with startups, investing in them or even acquiring them outright. Venture funding in fintech grew by 300% in one year to $12 billion in 2014. According to Goldman Sachs’s “Future of Finance” report, the total value of the market that can potentially be disrupted by Fintechers is an estimated $4.3 trillion.

Fintech is a complex market, spanning a broad swath of finance across individual and institutional markets and including market infrastructure providers as well. It is a broadly defined category for upstarts who have a different philosophy around how finance should function and how it should serve individuals and institutions. While some Fintechers seek to reduce transaction fees and improve customer experience, others exist to provide more visibility into the inner workings of finance. In spite of this diversity, there are common threads and recurring themes around why Fintech firms exist and what their market philosophy is. The Five D’s of Fintech – Democratization, Disaggregation, Disintermediation, Decentralization and De-biasing – represent common themes around the mission, business models, values, and goals of many of these firms. In this series of posts on Fintech, we will look at each of the Five D’s, starting with Democratization – the mission of many a Fintech firm.

The Five D’s of Fintech


Democratization

Technology has long enabled democratized access to financial services; Fintech, however, is taking the movement to another level by targeting specific market niches with customized value propositions. A central appeal of many Fintechers is their promise to bring to the masses resources and capabilities that heretofore have been the preserve of the wealthy, the elite, or the privileged. This has been made possible both by market opportunity and by internal capability: the opportunity of serving a market whitespace, and the ability to do so economically through the use of data and advanced technologies.

The financial inclusion that Fintechers are now enabling is driven by their ability to clear obstacles, remove barriers, and enable access where none existed before – whether it is serving the unserved or underserved SMBs that have typically been shunned by traditional banks (Funding Circle), providing credit to the underbanked segment lacking traditional credit scores (Kreditech), enabling investment advice without the need to rely on expensive financial advisors (Nutmeg or Betterment), or facilitating access to the capital markets by offering low-cost brokerage services (Robinhood). Financial services are now “for the people” and “by the people” as well: Quantiacs, a fintech startup with the aim of revolutionizing the hedge fund industry, is essentially a marketplace for quantitative trading strategies that enables anyone to market their quantitative skills and trading strategies. Or OpenFolio, an online community that allows one to link portfolios and measure investment performance against one’s community and relevant benchmarks. Wealth management is perhaps the market ripest for democratization, as shown by the rapid emergence of a raft of outfits such as HedgeCoVest and iBillionaire (platforms that allow investors to mirror the trades of hedge funds and billionaires, respectively), Loyal3 (which offers no-fee access to IPOs), and Algomi and True Potential (which undo trading obstacles for investors).

As Vikas Raj of Accion Venture Lab notes, the real potential of fintech lies in democratizing access to finance for the billions of low-income, unbanked people in emerging markets. The high-complexity, low-scale nature of this market is exactly the kind Fintechers are good at capitalizing on, as is evident from the long list of companies emerging in this market beyond Silicon Valley and New York. Where traditional finance and government agencies have failed, Fintech has the promise and the potential to excel.

Other industries can learn a lot by observing how Fintech is driving democratization in finance. Whether it is healthcare, education, media or government services, there is potential value in market segments that are currently un- or under-served which a Fintech-like movement can unlock. Adopting the technologies underlying Fintech is part of the story; what is needed first is the recognition of the potential for change, support from the markets, and an entrepreneurial spirit to lead the movement.


Paths to Power: What can Architects Learn from Sources of America’s Global Dominance

In discussing American exceptionalism and influence in the modern world, political scientists and academics talk about various sources of American power – US military might or economic dominance, for example. Walter Russell Mead, an academic and foreign policy analyst, defines four kinds of American power: the sharp power of the US military that provides the foundation; the soft power of American pop culture and the sticky power of an American-led economic world order that build atop that foundation; and, at the very top, the hegemonic power of the US – the prerogative and privilege to provide credence and confidence to all things worldly, whether they be economic coalitions or other global pursuits. Organizational theorists and researchers have extensively studied the power and influence of individuals and groups within organizations, but the political scientists’ lens provides an interesting framework for understanding how power should be built and exercised. One such application is looking at how the architecture function wields and exercises power within an IT organization.

Architects typically exercise their power and exert influence through formal governance processes and mechanisms – the infamous “architecture review board” is one of the various architecture forums that have given many an IT program manager sleepless nights. Architecture governance can be characterized as the architects’ sharp power, exercised by poking and prodding IT teams to behave in a certain manner. Just as a nation’s sharp military power serves as an important foundation for its power structure, so too does the architecture governance framework for the architecture group. But a foundation is not enough to create and sustain power. The US’s overwhelming military power is a powerful deterrent for many, but it is far from enough to ensure American dominance in world affairs. This has been demonstrated time and again in IT organizations where, in spite of strong governance processes, the architecture team fails to have appropriate influence in decision-making. What is needed to complement the sharp power of governance are the architecture team’s sticky and soft powers.

America’s sticky power rests on two foundations: a dollar-based global monetary system and free trade. A dollar-based system not only allows America to reap direct economic benefits – e.g., seigniorage from issuing bank notes to lenders all over the world – but also allows it to exert the influence of its monetary and fiscal policies far and wide. Globalization and free trade, championed by America-based multinationals, have created an economic framework in which benefits accrue to all countries that participate. There are strong incentives for a country to participate in such an economic system, and strong disincentives to leave. Architects have an opportunity to wield sticky power by creating a system in which their work creates benefits for their key stakeholders – architects should reorient their work to facilitate the outcomes and key decisions IT management would like to have but cannot achieve due to political, organizational or technical constraints. Be it brokering talks, striking alliances, or negotiating trade-offs, architects in this role will need to function as first-class deal makers. In this role, architects will be seen as belonging to an elite group that knows how to get things done. In an ideal state, IT managers will have no choice but to engage with the architects, because of the assurance that such an engagement provides for achieving the outcomes they desire. Initial success will create a self-reinforcing system in which stakeholders are increasingly willing to work with the architecture team and increasingly hesitant to work against it. Sticky power, however, needs an inducing agent – something that draws others in. That is where soft power comes in.

American soft power emanates from America’s ability to successfully export its culture, values, ideals, and ideas to the rest of the world. Architecture’s ability to successfully project its image across the entire organization is critical to garnering soft power. The typical perception of architecture work as narrowly technical is a result of the architecture team’s failure to project an image commensurate with the stature of true architecture work. Architects need to build a culture – a set of ideas, principles and work habits – that is unique and suited to the problem-solving, deal-making, and relationship and influence building that architecture work demands. It starts with hiring the best and the brightest: not just the technically savvy, but true leaders, strategic thinkers and doers. But creating such a culture is not enough – it needs to be packaged and “exported” to the rest of the organization, with the goal of creating an environment where others actually want to join the architecture team because it has a certain cool factor. On this scale, the typical architecture team fails miserably – one has yet to come across an IT organization where the architecture team is considered one of the cool teams to join.

Hegemonic power, the holy grail on the road to building power and influence, is power that derives from all the other kinds – it is the cherry on top that allows America to set the agenda and rules for world affairs, to be the “gyroscope of the world”. The architecture team is in a prime position to be the gyroscope of the IT organization. By combining the power of governance with the ability to create real value for stakeholders and to attract the best talent, the architecture team can influence decision-making at the highest levels – it can set the agenda to facilitate its own goals and outcomes and thus perpetuate its power and influence. The nature of architecture work is such that sustaining power and influence is crucial to the long-term success of architects. Maintaining power on an ongoing basis, however, takes effort, wise decision-making, and moving with the times – witness Britain, which only around a hundred years ago was by far the world’s leading power, but which was gradually supplanted by America as it clung to stubborn positions and made fatal policy mistakes.

Testing in a Continuous Delivery Environment

In his book Out of the Crisis, W. Edwards Deming cautioned: “Cease dependence on inspection to achieve quality. Eliminate the need for inspection on a mass basis by building quality into the product in the first place.” Ever since, “building quality in” has been a central tenet of quality-focused lean initiatives, including lean software development. Testing in software development is an example of inspection: inspection to find bugs and faults in the developed software; static code analysis is another example. Quality matters in software development because bugs cost both users and software providers dearly: a study conducted on behalf of the US National Institute of Standards and Technology estimated the cost of software bugs to the US economy at around $60 billion. The extent of this scourge is perhaps not surprising, since in many organizations software testing is not effective: testing/QA teams run “quality gates” as an afterthought, and even then testing does not necessarily translate into quality. When Agile came around, practitioners devised new approaches to testing, aptly described under the banner of “Agile Testing”, that provided some improvement by driving more collaboration across teams and moving testing earlier in the development cycle. Now, with the advent of DevOps, testing has taken on a new level of significance, since continuous delivery is not just about delivering software rapidly, but about delivering software that works. A few have even coined a term for this new discipline: continuous testing. All that is well, but what does testing mean in a continuous integration/delivery environment?


In a continuous delivery (CD) environment, quality becomes the responsibility of all. This does not mean that the QA and testing teams have no role to play; on the contrary, the QA and testing function moves into a strategic role, providing oversight, direction and leadership for driving overall quality. For example, instead of spending countless hours running manual tests, QA teams will invest resources and time in developing and implementing a comprehensive test automation strategy, or in putting in place governance processes, metrics and incentives to drive quality at every step. An example of how quality becomes everybody’s responsibility is what the development staff do in such an environment. Development teams in a CD environment are empowered to take on quite a bit of testing themselves. In addition to a ‘test first’ approach, developers may also be required to run pre-commit testing that exercises a suite of unit, component and integration tests. Indeed, many CI servers provide the capability for ‘private builds’, which allow an individual developer to see whether their code changes can be integrated into the main trunk for a successful build. Pre-commit testing should enable developers to conduct a quick ‘smoke test’ to ensure that their work will not break the code in the main trunk; it may therefore contain a selection of integration and acceptance tests as well. Once the developer checks code in to the CI server after pre-commit testing, the CI server runs the commit stage tests, which include any required static code analysis, component and integration testing, followed by system testing. Commit stage results are immediately fed back to the development team so that any errors or bugs can be addressed. Successful commit stage testing increases confidence that the build is a candidate for acceptance testing; builds failing the commit stage do not progress to the next stage: acceptance testing.
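To make the split between fast commit-stage checks and slower downstream suites concrete, here is a minimal sketch using pytest markers; the marker names, the toy add_interest function, and the commands are illustrative assumptions rather than a prescribed setup.

    # A minimal sketch: separating fast commit-stage tests from slower
    # acceptance tests with pytest markers. The marker names ("commit",
    # "acceptance") are illustrative and would normally be registered in
    # pytest.ini so that runs stay strict about unknown markers.
    import pytest

    def add_interest(balance: float, rate: float) -> float:
        """Toy function standing in for the code under test."""
        return round(balance * (1 + rate), 2)

    @pytest.mark.commit
    def test_add_interest_unit():
        # Fast unit check suitable for pre-commit and commit-stage runs.
        assert add_interest(100.0, 0.05) == 105.0

    @pytest.mark.acceptance
    def test_add_interest_across_accounts():
        # Slower, broader check deferred to the acceptance stage.
        assert [add_interest(b, 0.05) for b in (100.0, 250.0, 0.0)] == [105.0, 262.5, 0.0]

A developer’s pre-commit run might then be "pytest -m commit", while the CI server’s later stages run "pytest -m acceptance" against the same codebase.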

Acceptance testing is the domain of the business analysts and business representatives assigned to the project team. However, this does not mean that development staff have no involvement in acceptance testing. Successful testing in a CD environment gives developers more ownership of quality by allowing them to run automated acceptance tests in their development environments; common obstacles to enabling this, such as insufficient licenses or manual deployment and setup processes, need to be removed. Acceptance testing is a critical step in the deployment pipeline: a release is deemed acceptable for deployment only if it passes the acceptance test stage, and the entire team should focus on fixing acceptance testing issues for a given release. A CD environment requires acceptance testing to be automated as much as possible: a fully automated acceptance testing suite enables the tests to be run for a build as and when needed – this speeds up the development process and also creates a powerful suite of regression tests that can be run over and over again. Some tools even offer capabilities to encode acceptance criteria and to programmatically drive the creation of acceptance tests from those criteria: testing, and hence the delivered software, need never be out of sync with evolving acceptance criteria and requirements.
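As a rough illustration of encoding acceptance criteria as data that drives automated tests, the sketch below parameterizes a single test with a criteria table; the quote_premium function and the numbers in the table are invented for illustration and are not taken from any particular tool.

    # A sketch of acceptance criteria encoded as data and executed
    # automatically on every build. The criteria table and the
    # quote_premium() stand-in are illustrative assumptions.
    import pytest

    ACCEPTANCE_CRITERIA = [
        # (age, coverage, maximum acceptable monthly premium)
        (25, 100_000, 180.0),
        (40, 250_000, 420.0),
        (60, 100_000, 390.0),
    ]

    def quote_premium(age: int, coverage: int) -> float:
        """Stand-in for the system under test."""
        return 0.001 * coverage + 2.0 * age

    @pytest.mark.parametrize("age,coverage,max_premium", ACCEPTANCE_CRITERIA)
    def test_quote_meets_acceptance_criteria(age, coverage, max_premium):
        # Each row of the criteria table becomes an executable acceptance check.
        assert quote_premium(age, coverage) <= max_premium

Because the criteria live in one table, a change to a requirement and a change to the test suite are the same edit, which is the point of keeping tests in sync with evolving acceptance criteria.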

If the system under development is a high-performance system, some capacity and performance testing may become part of acceptance testing as well. Usually, however, capacity testing and testing for other ‘non-functional requirements’ form a separate stage in the CD deployment pipeline. Although a CD environment requires such tests to be as automated as possible – e.g., through the use of recorded interaction templates and similar devices – the success criteria for such tests are somewhat subjective: a release may technically fail automated capacity testing and still be greenlighted based on human judgment. Once the release completes the non-functional testing stage gate, it may then be put through more traditional manual testing. This is where human testers excel, applying their expertise in UI testing, exploratory testing, and in creating unique testing conditions that automated tests may not have covered. Manual testing is thus one of the last stages in the testing pipeline in a CD environment.
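A capacity check in such a pipeline can be as simple as timing a representative workload against an agreed budget; the sketch below does exactly that, with the 50 ms-per-order budget and the process_order stand-in being assumptions for illustration.

    # A minimal sketch of an automated capacity check: time a batch of calls
    # and compare the average against a latency budget. The budget and
    # process_order() are illustrative assumptions.
    import time

    def process_order(order_id: int) -> str:
        # Stand-in for the real transaction path.
        return "processed-%d" % order_id

    def test_order_processing_latency_budget():
        start = time.perf_counter()
        for order_id in range(1_000):
            process_order(order_id)
        avg_ms = (time.perf_counter() - start) * 1000 / 1_000  # average ms per order
        # The stage fails automatically on a miss, though, as noted above,
        # a marginal miss may still be greenlighted by human judgment.
        assert avg_ms < 50.0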

If testing is indeed to become ‘continuous’, several critical factors need to be in place. Perhaps the most critical is test automation, which is often dismissed by some practitioners as too difficult or as adding little value. Whatever the reservations, testing in a CD environment cannot be efficient and effective without automation – especially since testing is done in big numbers and done often. Automation is just one of the test design and execution strategies needed to make testing run efficiently and thus succeed in a CD environment: CD practitioners recommend, for example, a commit testing stage lasting no more than 10 minutes – a hurdle that can be met only by adopting such strategies. Automation also applies to the provisioning of, and deployment to, environments. ‘Push button’ deployment and provisioning of test environments is critical if developers are to run quick smoke and acceptance tests on their work. Similarly, test data needs to be managed effectively. Test design and test isolation should keep data requirements fit for purpose and parsimonious: wholesale replication of production data is neither feasible nor recommended in a CD environment. Data management, like environment management, needs to be fully automated, with configurable design and push-button techniques.
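One way to keep test data parsimonious, sketched below, is to generate the handful of synthetic, deterministic records a test actually needs inside a fixture rather than copying production data; the field names and the fixture are illustrative assumptions.

    # A sketch of fit-for-purpose test data: a small, deterministic set of
    # synthetic records generated per run instead of a production copy.
    # Field names are illustrative assumptions.
    import random
    import pytest

    @pytest.fixture
    def synthetic_accounts():
        rng = random.Random(42)  # fixed seed keeps runs repeatable
        return [
            {"account_id": i, "balance": round(rng.uniform(0, 10_000), 2)}
            for i in range(10)
        ]

    def test_no_account_is_overdrawn(synthetic_accounts):
        assert all(acct["balance"] >= 0 for acct in synthetic_accounts)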

Testing thus has the opportunity to move from being a reactive, purely inspection-based function to being a proactive, quality-focused initiative. Achieving this requires a number of tough decisions about processes, division of responsibilities and the organization of the testing effort. In a traditional environment, making those decisions is often optional; moving to a CD environment mandates them – which should be reason enough for organizations to start examining today how they can evolve and improve their testing efforts toward that ultimate model.

Transparency in Pharma and Drug Industries

With the passing of the Affordable Care Act and the recent ratification provided by the US Supreme Court in King v. Burwell for federal tax subsidies, there is now a strengthening movement toward more transparency and accountability in the American health care industry. Pharma and drug companies, an important part of the health care value chain, suffer from a number of transparency issues. While the ACA made some progress by mandating that pharma companies make public drug-related payments made to doctors, much remains to be done on other fronts, such as the publication of clinical trials data and full disclosure of drug side effects. A few, such as Dr. Ben Goldacre, the founder of AllTrials, have launched public campaigns for more data transparency in the pharma and drug industry. More data transparency, however, can be a double-edged sword: there are benefits, but there are practical considerations as well.


Providing transparency around clinical data can be valuable. When clinical data on a class of antidepressants called selective serotonin-reuptake inhibitors (SSRIs) was analyzed, an increased risk of suicide among adolescents using SSRIs was discovered. Similarly, when the raw clinical data on Tamiflu was analyzed, Tamiflu’s efficacy in fighting viral infections and reducing hospital admission rates was brought into question. Like any large-scale statistical analysis, the clinical data analysis on which drug companies, regulators and government agencies depend for risk evaluation and approvals can contain anything from egregious mistakes to subtle biases. These can stem from a number of factors, including selection bias in the controlled trials or mistakes in interpreting statistical significance. The latter – where the statistical model lacks statistical power (increasing the likelihood of false negatives) and/or the threshold for detecting significance is set at too few standard deviations (increasing the likelihood of false positives) – is fairly common in the scientific research community. Couple this with other exacerbating factors – research scientists lacking appropriate skills in advanced statistical analysis, a prevalent tendency toward publishing positive hypotheses rather than negative ones, and ineffective peer review of clinical research findings – and one has a perfect storm in which such mistakes can be fairly common. Greater transparency of clinical data allows any external third party to review and validate the research findings, bringing to light issues and insights that may have escaped the research team’s due diligence or the regulator’s scrutiny. Thousands of clinical trials have never been registered with oversight agencies, and results from around half of all clinical trials remain unpublished; making that data available to statisticians would almost certainly lead to new discoveries and clinically useful findings (as an article in The Economist notes).
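To make the power-versus-threshold point concrete, here is a back-of-the-envelope sketch using only Python’s standard library; the effect size, standard deviation, sample sizes and alpha are purely illustrative assumptions, not figures from any trial.

    # A rough sketch of the trade-off between sample size, significance
    # threshold and statistical power for a two-sided two-sample z-test.
    # All numbers are illustrative assumptions.
    from statistics import NormalDist

    def two_sample_power(effect: float, sd: float, n: int, alpha: float) -> float:
        """Approximate probability of detecting a true effect of the given size."""
        z = NormalDist()
        se = sd * (2 / n) ** 0.5                # standard error of the difference in means
        z_crit = z.inv_cdf(1 - alpha / 2)       # significance threshold in standard deviations
        return 1 - z.cdf(z_crit - effect / se)

    # An underpowered trial: a real but modest effect is likely to be missed
    # (a false negative) with 50 subjects per arm.
    print(two_sample_power(effect=0.2, sd=1.0, n=50, alpha=0.05))   # roughly 0.17
    # The same effect with 800 subjects per arm clears the conventional 80% power bar.
    print(two_sample_power(effect=0.2, sd=1.0, n=800, alpha=0.05))  # roughly 0.98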

The noble intention behind the push for greater transparency, however, may not translate into the desired effects and, worse, may have unintended consequences. One of the biggest fears is inappropriate analysis and interpretation of the clinical datasets. In a litigious environment, with pharma and drug companies already battling an image of greedy corporations unconcerned with people’s ability to afford exorbitant drug prices, this can spell disaster. It may also serve as a strong disincentive to innovation in the long term, with the opportunity cost of not experimenting with novel treatment techniques ultimately borne by consumers in the form of shortened life spans and/or degraded quality of life. Even where there is less room for misinterpretation, practical challenges with replication may prevent others from exactly reproducing the results of clinical trials: it is well established that experimenters employ tacit knowledge and improvisations that are not always captured in the documented experimental setup and process steps. Furthermore, many research teams use proprietary models to analyze and interpret raw clinical data – models they may be averse to sharing with the public, but which are nonetheless critical to arriving at the proper conclusions. And the cost of full data disclosure for drug companies is rarely discussed, yet there is a non-trivial cost to retaining and disclosing such data to the public at large.

Mandating full disclosure of raw clinical data is thus just one item on an entire menu that needs to be put in place if the objective is indeed to improve the safety, efficacy and efficiency of the pharma and drug industry. The field of biostatistics can go a long way in educating researchers on correctly employing and interpreting clinical datasets. Independent data monitoring committees that oversee the construction and execution of clinical trials could be put in place to ensure the appropriate application of analytic techniques and to provide guidance as experiments are being conducted. Big data and modern statistical techniques could be developed further to give researchers the means to analyze data more effectively. If, in the process of doing all this, we can help prevent even minor mistakes or incorrect interpretations of drug data, we will have made medicine that much safer for mankind.