As the world recovers from yet another international stock-market roller-coaster — an outright crash probably only prevented by safety algorithms installed after the Singapore flash crash of 2013 — it might be helpful for us to remember a couple of lessons from history.

In the eight years since the eruption of the 2007-2008 global financial crisis, the 14 years since the tech-bubble implosion, and the 18 years since the Asian financial crisis, we’ve learned, re-learned and re-re-learned that a lot of what we thought about the financial markets just wasn’t true. We’ve learned that when we check to see who has our back in the financial arena, there might not be anyone there. We’ve learned that banks and financial markets are not as fiscally secure as we’d like. We’ve all learned that much of Wall Street (or London’s Square Mile or Shanghai’s Pudong district or Hong Kong’s, well, entirety) is not a paragon of probity, integrity or adherence to rock-solid rules.

We’ve learned that a lot of the financial markets’ assurances and guarantees are backed by nothing more than smiles and poll-tested platitudes, Don Draper style. In fact, much of the 2007-2008 crisis was the direct result of many institutions not having the funds to back their promises and their bets.

We’ve learned (or re-learned) that financial markets are managed by humans who, like the rest of us, are deeply flawed, limited in their knowledge and awareness of the future (and even the present), and too often self-interested at the expense of the people for whom they claim to work.

Economists call the last one the principal-agent problem, and boy, is it a problem. It matters especially when funds are held in reserve or some other form of fiduciary trust by a third party, as is the case with various forms of insurance, derivatives, and other financial arrangements.

The world would be greatly improved if agreements were enforced by automated, open-source code (pre-programmed digital robots, basically) operating across a transparent system — a system that, like Bitcoin (and more generally, like much of the Internet), has no central point of failure, control or censorship, and once up and running, is hard to erase, shut down or for an individual to otherwise muck up.

Funds wouldn’t be held in custody by an individual or bank — they’d be held by an algorithm that reliably and dispassionately follows pre-set, transparent “if-this-then-that” orders in real time.
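To make the idea concrete, here is a minimal, purely illustrative sketch of that "if-this-then-that" logic. The names (`Escrow`, `report_condition`, `settle`) are hypothetical, not any real smart-contract API; the point is only that funds are locked at creation and released by a preset rule, with no human discretion in the loop.

```python
# Hypothetical sketch of "if-this-then-that" escrow logic.
# Funds are locked when the contract is created and released only
# according to a preset, transparent rule.

from dataclasses import dataclass

@dataclass
class Escrow:
    payer: str
    payee: str
    amount: float
    condition_met: bool = False   # set by an agreed-upon data feed
    released: bool = False

    def report_condition(self, met: bool) -> None:
        """An agreed-upon data feed reports whether the trigger occurred."""
        self.condition_met = met

    def settle(self) -> str:
        """Release funds strictly according to the preset rule."""
        if self.released:
            return "already settled"
        self.released = True
        if self.condition_met:
            return f"pay {self.amount} to {self.payee}"
        return f"refund {self.amount} to {self.payer}"

# Funds are committed up front; payout happens only if the condition fires.
e = Escrow(payer="alice", payee="bob", amount=100.0)
e.report_condition(True)
print(e.settle())   # -> "pay 100.0 to bob"
```

A real decentralized system would replace the single `condition_met` flag with consensus among many independent reporters, but the settlement step is just as mechanical.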

Since the start of the global financial crisis (which, in some ways, has never really ended), there has been the recurring threat of counterparties fully or partially backing out of agreements to pay holders of credit default swaps. These are supposed to work as insurance against defaults. They’re supposed to pay holders when a default happens.
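The counterparty problem disappears if the swap itself must be fully funded before it activates. The sketch below is a simplified, hypothetical model (the class name `FundedCDS` and its methods are invented for illustration): the protection seller locks the entire notional payout as collateral up front, so "backing out" becomes impossible by construction.

```python
# Hypothetical sketch: a credit default swap written as a pre-funded
# contract. The seller must collateralize the full payout before the
# contract activates, so it cannot be reneged on.

class FundedCDS:
    def __init__(self, notional: float, premium: float):
        self.notional = notional    # payout owed to the buyer on default
        self.premium = premium      # fee the buyer pays the seller
        self.collateral = 0.0
        self.active = False

    def fund(self, seller_deposit: float) -> None:
        """Activate only once the maximum payout is fully collateralized."""
        self.collateral += seller_deposit
        if self.collateral >= self.notional:
            self.active = True

    def settle(self, default_occurred: bool) -> dict:
        """Mechanical settlement: no discretion, no counterparty risk."""
        if not self.active:
            raise RuntimeError("contract never activated: underfunded")
        if default_occurred:
            return {"buyer": self.notional,
                    "seller": self.collateral - self.notional + self.premium}
        return {"buyer": 0.0, "seller": self.collateral + self.premium}

cds = FundedCDS(notional=1_000_000.0, premium=20_000.0)
cds.fund(1_000_000.0)
print(cds.settle(default_occurred=True))   # buyer receives the full notional
```

Real swaps involve margin schedules and partial collateral rather than full pre-funding; the point of the sketch is that when settlement is mechanical and pre-funded, the "insurance" cannot fail to pay.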

You shouldn’t have to worry that the very risk-avoidance tools you’re buying — to minimize your exposure to risk — have risks of their own. Risk of the contract being reneged on for the very reason you bought it. Risk that “banks” aren’t really banks with — like, ya know? — funds. Risk that “insurance” isn’t really insurance.

This kind of risk has distorted the derivatives market, significantly undermining its vital role as a form of risk management in the modern global economy. Maybe that’s one reason why growth in the global economy, in productivity and in average wages has been unusually subpar in recent years.

In an automated, open, decentralized market, people could have 100 percent confidence that the system would not “break.” We could be confident it could not be rigged or gamed. Confident that agreements couldn’t be broken or reneged on. Confident that there was little or no counterparty risk. Confident that all contracts would remain enforced by cryptographically secured, automated processes, not human discretion.

On an algorithm-enforced market run on a decentralized Bitcoin-like network, everything is out in the open for the entire world to see (including all of the code that runs it), and contracts have to be digitally funded and held in custody by the algorithms to be activated and enabled. Funds aren’t released until a completely disinterested program dispassionately executes the preset terms and conditions for releasing them. The funds are padlocked into the network by consensus and cryptography until that happens.

Recent credit default swap fiascos show that, at key moments, when billions of dollars are at stake, humans err and fail. That does not happen on an automated system designed to run beyond human discretion and control after agreements have been “signed.”

Payments, not promises. Funds, not fluff. Escrow, not Jell-O. No IOUs. No empty promises. No Potemkin Villages. Oh, and sorry, Popeye fans: no “I’ll gladly pay you Tuesday” Wimpys.

Peronet Despeignes runs special ops for Augur.