On May 25, 2018, the European Union (EU) will begin enforcing the General Data Protection Regulation (GDPR), which requires companies to behave responsibly in their collection and management of personal data. Penalties for noncompliance include fines of up to four percent of global revenue, adding to the damage that data breaches can do to customer confidence and institutional reputation. (Interestingly, some studies suggest breaches don’t hurt a company’s stock price.)

At the same time, the rate of change in IT is only accelerating, and businesses need to move fast to keep up with innovation and stay competitive. While there is general consensus on the need for data privacy, only some companies have shifted away from speed and growth at all costs toward building software that respects user privacy.

Users are no longer ignoring privacy settings. In fact, they’re demanding accountability and organizing movements to quit offending platforms. As Benedict Evans recently noted in the wake of the Facebook/Cambridge Analytica scandal, “A great deal of what seemed normal in the past might not seem normal now.”

Given that change in user sentiment, don’t focus simply on complying with GDPR — or any other privacy regulation for that matter. Regulations can be amended, repealed, or superseded any time. Instead, design radically private software that fundamentally respects the privacy of users. When privacy is a top priority, your customers will trust you, and that trust will enable you to move faster.

Here are some tips on how to build radically private software:

1. Acquire data progressively and only when you genuinely need it

Only collect data you genuinely need, and only collect it when you need it. It’s no longer okay for software to ask users to grant every permission under the sun without explaining why. For example, disclose at the outset why you’re asking for a user’s contacts, rather than asking them to grant you blanket permission to email those contacts however, and whenever, your business wants.

You should be able to honestly answer “why,” “when,” and “how” you’re using a user’s data, and specifically “what” data you’re using. Imagine a social media company explaining why it needs to scrape a user’s phone call metadata, including names, phone numbers, and the length of each call made and received: “Well, we presumed that a 14-minute call with your cousin means you’re closer to them than a 12-minute call with your aunt.” Does the company really need to know the length of your calls, or does it just need to know whom you talk to most often?
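The call-metadata example above boils down to data minimization: derive only the signal the feature actually needs and discard the rest. Here is a minimal sketch in Python (the field names and call-log structure are hypothetical, invented for illustration):

```python
from collections import Counter

def top_contacts(call_log, n=5):
    """Keep only what the feature needs: who the user talks to most often.

    Sensitive details in each entry (call duration, timestamps, phone
    numbers) are deliberately never read, so they never enter our system.
    """
    counts = Counter(entry["contact"] for entry in call_log)
    return [contact for contact, _ in counts.most_common(n)]

# Hypothetical call log: only the "contact" field is ever touched.
log = [
    {"contact": "cousin", "duration_min": 14},
    {"contact": "aunt", "duration_min": 12},
    {"contact": "cousin", "duration_min": 3},
]
print(top_contacts(log, n=1))
```

The point of the design is that the unused fields could be stripped at the client before anything is sent to a server, so the durations never have to be collected at all.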

2. Clearly state what you’ll be using the data for and how that benefits users

GDPR doesn’t allow consent requests wrapped in illegible terms and hard-to-understand legalese. If a shopping app asks for a user’s name, address, and email address, it should tell that user, in plain, understandable terms, that it needs this information because it needs to know where to make deliveries. What’s more, it should state clearly that it won’t use the data to send spam and won’t sell user email addresses to a third party that will try to market stuff to them.

As Tim Berners-Lee tweeted in March: “General rules for us all: Any data about me, wherever it is, is mine and mine alone to control. If you are given the right to use data for one purpose, use it for that purpose alone. Get users’ active and informed consent for these uses of their data.”

3. Get active and informed consent for all data use

Radically private software means that if users don’t give informed consent, you can’t use their data at all. It also means that if users give consent for one use, you can’t apply that consent to some other use.

What’s more, radically private software means that consent must be active. Offering a consent box that’s pre-checked is not okay and is unlikely to be acceptable under GDPR because the user is not actively selecting that box. Users need to give active consent by clicking on and checking the box themselves.

Additionally, consent must be informed. No more UI tricks like making the button to give consent big and red while the button to withhold it is small and gray; those antics will no longer cut it. Being open and transparent about consent is critical to building trust with your customers and helps you navigate privacy issues when they arise.
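The two rules above, consent must be actively given and is bound to a single purpose, can be enforced in code. The following is a minimal sketch (class and method names are my own, not from any standard library):

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Consent:
    user_id: str
    purpose: str           # one consent record per specific purpose
    granted_at: datetime   # recorded only when the user actively opts in

class ConsentStore:
    """No record exists until the user explicitly grants consent,
    so there is no equivalent of a pre-checked box."""

    def __init__(self):
        self._records = {}

    def grant(self, user_id, purpose):
        # Called only from an explicit user action (e.g. checking a box).
        key = (user_id, purpose)
        self._records[key] = Consent(user_id, purpose,
                                     datetime.now(timezone.utc))

    def allows(self, user_id, purpose):
        # Consent for one purpose never carries over to another.
        return (user_id, purpose) in self._records

store = ConsentStore()
store.grant("user-42", "deliveries")
print(store.allows("user-42", "deliveries"))  # consent was actively given
print(store.allows("user-42", "marketing"))   # a different purpose: denied
```

Because `allows` is keyed on the (user, purpose) pair, a new purpose always requires a new, explicit grant.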

4. Consider issuing a data receipt for each consent transaction

When users agree to share their data or give some other consent, consider emailing them a receipt for that consent, similar to a sales receipt. The benefit to you, the developer, is that you have a record of the consent and can reference it via the email you sent the user. A data receipt can also ensure that both you and the user are on the same page when it comes to understanding privacy policies.
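A consent receipt like the one described above might carry the who, what, why, and when of the grant, plus an identifier both sides can reference later. Here is one possible shape, a sketch of my own rather than any formal receipt standard, using a hash of the contents as a tamper-evident identifier:

```python
import hashlib
import json
from datetime import datetime, timezone

def make_consent_receipt(user_email, purpose, data_fields):
    """Build a receipt both the company and the user can keep on file."""
    receipt = {
        "user": user_email,
        "purpose": purpose,
        "data_fields": sorted(data_fields),   # exactly what was consented to
        "granted_at": datetime.now(timezone.utc).isoformat(),
    }
    # A hash of the canonical contents: if either copy is altered later,
    # recomputing the hash will no longer match the receipt_id.
    payload = json.dumps(receipt, sort_keys=True).encode("utf-8")
    receipt["receipt_id"] = hashlib.sha256(payload).hexdigest()[:16]
    return receipt

r = make_consent_receipt("user@example.com", "deliveries",
                         ["name", "address", "email"])
print(r["receipt_id"], r["purpose"])
```

The receipt itself is plain JSON, so it can be emailed to the user verbatim and stored alongside the consent record.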

5. Make opting out easy — and let users change their minds and take their data with them

As the privacy landscape evolves, we’ll see more people deciding to opt out of some consents, and developers should offer that option. People are now choosing to #DeleteFacebook rather than opt out of data sharing because deleting is easier and they lack more granular options for what to opt in to or out of. Make it easy for people to opt out, and tell them what opting out means.

And if people decide to leave your service, they should be able to take their data with them. This is happening already in the European banking industry. A regulation requires banks to make customer account data available in easy-to-use formats so they can change banks more seamlessly. For example, a banking app might contain a helpful budgeting feature that shows customers how much they spend on food, rent, entertainment, etc. The customer owns this data and, if they switch banks, can bring it to their new bank.

Enabling this sort of data portability is important in radically private software.
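In practice, portability can be as simple as an export endpoint that serializes everything held about a user into an open, documented format. A minimal sketch (the record structure, echoing the budgeting example above, is hypothetical):

```python
import json

def export_user_data(user_record):
    """Serialize a user's complete record to portable, human-readable JSON.

    sort_keys keeps the output stable across exports, and default=str
    covers values (dates, decimals) that JSON can't encode natively.
    """
    return json.dumps(user_record, indent=2, sort_keys=True, default=str)

# Hypothetical banking record, including the derived budgeting data
# the customer should be able to take to a new bank.
record = {
    "customer_id": "c-1001",
    "monthly_spend": {"food": 420.50, "rent": 1200.00, "entertainment": 85.25},
}
exported = export_user_data(record)
print(exported)
```

Because the export is plain JSON, a receiving service can re-import it losslessly, which is the whole point of portability.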

6. Encrypt data in transit, and always go with the strongest option available

Should you encrypt all data, even when it’s in transit? The answer is yes. The barriers to entry for TLS are now so low that it’s unacceptable for a developer to put up a non-secure website. And everybody knows you shouldn’t write your own encryption; use strong industry standards for security instead.
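“Go with the strongest option available” usually means leaning on your platform’s vetted defaults rather than hand-rolling anything. In Python’s standard library, for example, `ssl.create_default_context()` already enables certificate and hostname verification; a sketch that additionally refuses anything older than TLS 1.2:

```python
import ssl

def strict_tls_context():
    """A client-side TLS context using the standard library's safe defaults.

    create_default_context() verifies the server certificate chain and
    hostname; on top of that we pin a floor of TLS 1.2, so legacy
    protocol versions are never negotiated.
    """
    ctx = ssl.create_default_context()
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    return ctx

ctx = strict_tls_context()
print(ctx.verify_mode, ctx.minimum_version)
```

A context like this can be passed to `http.client`, `urllib`, or any socket wrapper; the key design choice is starting from the library’s hardened default instead of building a context from scratch.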

7. Educate users about how automated decisions are made

A great example of educating users on automated decisions is the Stitch Fix Algorithms Tour. It transparently shows how “Your online personal stylist” uses data. It tells customers why it’s gathering their data: Because it wants to be a matchmaker, connecting customers with styles they’ll love and probably would not have discovered on their own. No one can accuse Stitch Fix of hiding its data collection efforts and intents. This helps the company build trust and win over customers.

This type of transparency has actually been a legal requirement in certain industries for a while. For instance, when a company makes a credit decision about a customer, it needs to explain the factors that went into the decision — for example, “The reason you’ve been denied credit is you have too many credit cards and you’re overdrawn too often.” Or words to that effect. Such explanations are not new; what is new is the complexity of the decision-making process and the resulting complexity of the explanations.

8. Understand that truly anonymizing data is almost impossible, and any data you release is open to misuse

A recent example of anonymized data causing problems comes from Strava, a mobile app that tracks your exercise via your phone’s GPS. Last November, it released a heat map showing the activity of all users around the world. One analyst quickly pointed out that the map could be cross-referenced with a map of military bases to discern regular jogging routes, patrols, even the location of forward operating bases in Afghanistan.

Organizations often release anonymous data with good intentions. They want people to be able to do cool stuff with it. But it’s almost always possible to de-anonymize the data and trace individuals within it, so removing classic personal data such as names, addresses, and phone numbers is not enough. With data points like location and time of day, it’s still possible to construct a pretty good picture of a user.
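The re-identification attack described above is just a join on quasi-identifiers. A toy illustration in Python (all names, areas, and times are fabricated for the example): the “anonymized” release has names stripped, but an attacker holding public auxiliary data can still link records by location and time of day.

```python
# "Anonymized" release: names removed, but quasi-identifiers kept.
anonymized = [
    {"record": "a", "home_area": "94107", "leaves_home": "07:55"},
    {"record": "b", "home_area": "94107", "leaves_home": "09:30"},
]

# Auxiliary data an attacker might already hold (e.g. public posts).
auxiliary = [
    {"name": "Alice", "home_area": "94107", "leaves_home": "07:55"},
]

def reidentify(anonymized, auxiliary):
    """Link anonymized records to named people via shared quasi-identifiers."""
    matches = {}
    for aux in auxiliary:
        for rec in anonymized:
            if (rec["home_area"], rec["leaves_home"]) == \
               (aux["home_area"], aux["leaves_home"]):
                matches[rec["record"]] = aux["name"]
    return matches

print(reidentify(anonymized, auxiliary))
```

With only two fields, record "a" is uniquely tied back to a named person, which is why stripping names, addresses, and phone numbers alone is not enough.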

9. Communicate clearly with users about steps you’ll take if a data breach occurs

Clear communication is the key to trust. Breaches are almost inevitable these days, and while almost no company goes this far today, here’s what true transparency could look like: you tell users up front what you’ll do when a breach happens to you. Here’s the process you’ll implement in the first 24 hours, here’s the team you’ll pull together, here’s where you’ll post information to keep customers informed and up to date, and here’s how customers can opt out quickly if a breach occurs.

Privacy is not a policy you should implement just to meet a legal requirement like GDPR. And it’s not enough to do the minimum. Privacy is now front of mind for every consumer and, done well, it can work to your advantage. You can win the trust of customers and get a leg up on your competition if you build radically private software.

Ian Huston is Data Scientist and Associate Director at Pivotal Dublin.