Technology and big data are changing how markets work. From how customers are acquired and retained, to how prices are set, the changes are revolutionary. The result is new challenges for policy makers and regulators.
Target marketing – that is, the marketing of products and services to consumers who are likely to be interested in the offer, or profitable for the business – is not new. But more sophisticated technology is turbocharging marketing in the 21st century. Businesses are better able to access consumers’ personal information, particularly online, and utilise complex systems to predict an individual’s behaviour.
One area of marketing that has not had a lot of attention is lead generation, that is, the process of identifying people who are potential sales targets or ‘leads’. Earlier this year, Consumer Action Law Centre published a short report examining this sector, and we found that the marketing strategy has re-engineered itself in the digital landscape.
Our report found that businesses are using techniques like online surveys or tricky consent practices to collate and then sell lists of customer ‘leads’ to third-party businesses. These third-party businesses then contact the customer and, in many cases, use high-pressure tactics to make a sale. This has occurred across a range of sectors, including vocational training, payday lending and even in the solar power industry.
But it is not just marketing that is changing. Big data is being used to price differently as well.
The policy director from the UK’s Citizens Advice has written about this recently. He notes that in the 20th century, prices were a characteristic of the product itself, reflecting intuitive things like the cost of manufacture or labour. Today, however, firms can more easily personalise their prices, asking different customers to pay different amounts for the same thing.
Price discrimination is not always a problem. We are used to it in areas like airline pricing where you can get a deal if you book early or are willing to travel at an inconvenient time. However, the ability to price discriminate is changing in key markets like insurance and credit.
Traditionally, the business of insurance was about pooling risk for the benefit of all. The 19th century American insurance entrepreneur Jacques DR described insurance as “a great fund” to “share the burden of suffering and calamity”.
Today, however, decisions such as whether an insurer will insure someone’s house, car or even life, and at what price and on what terms, are made by algorithms. This creates efficiencies for the insurer, but outsourcing decision-making to machines risks bizarre and even discriminatory outcomes.
Insurers are teaming up with banks, retailers, airlines and many others to access information about our finances, lifestyle and diets. Right now, insurers are setting our car insurance premiums based on the exponentially increasing personal information that they have about us.
You might pay more if you did not finish high school, eat a lot of pasta and rice, or fill up on petrol late at night. Insurers justify this because the data they hold tells them such people are more likely to make a claim. In the UK, it was reported that one insurer charges you more if you have a Hotmail address, and several will hike up the price if your name is Mohammed.
As insurers rely more on increasingly personal data to individualise and fragment risk profiles, we risk losing the mutual nature of insurance, and the hard realities of financial and social exclusion will emerge.
Credit pricing is changing as well, particularly through data sharing systems like credit reporting. The mandating of more comprehensive credit reporting, as proposed by legislation currently before Federal Parliament, is being trumpeted by Treasurer Scott Morrison as a boon to consumers by enhancing competition from new providers in the fintech sector.
The flipside, however, is that some lenders will use this information to charge customers more for credit. In countries like the UK or US that have this level of data sharing, it’s not uncommon to see credit cards priced at 50 per cent per annum, including those marketed as a “credit rate builder”.
Here in Australia, the NAB recently announced changes to the way it prices personal loans. Rates now vary between 10.69 per cent and 18.69 per cent, based on individual customer risk. But the basis of that assessment isn’t transparent – and there is little recourse if you believe the bank has judged your risk incorrectly. Moreover, these arrangements are likely to mean higher interest rates for those who are already struggling.
Credit scores are also becoming more ubiquitous, even outside the area of lending. Some real estate agents are asking applicants for their credit score when applying for a rental property. What’s next – an employer asking for yours? This is common in the US, with some employers arguing that good credit is an attribute of a responsible person. The risk is that a bad credit rating becomes a signal for shortcomings that have nothing to do with paying bills.
So how are policy-makers to respond? I’m not suggesting rash decisions, but we do need to think through these problems and take steps to address them where there is unfairness or harm.
There are some policies that may help. The recent announcement from the Federal Government that it will legislate for a Consumer Data Right is one. Depending on how this operates, it may put greater power into consumers' hands. Consumer advocates have pushed for a right of deletion to be part of the new right, so as to remedy the imbalance of power between big business and everyday people.
We also need stronger and more active regulators that work together to promote good public outcomes. To support this, we need strong and cross-cutting consumer advocacy to make sure consumers are well-represented in important decisions, balancing the well-funded lobbying power of industry.
We also need greater willingness to intervene, in targeted ways, to limit extortionate prices or unfair conduct affecting some groups of consumers. This sort of intervention relies on an acceptance of behavioural market failure, that is, problems that arise because of biases that mean people don’t make good decisions. Capping the costs charged by payday lenders is an example of such an intervention – a sensible intervention given that those attracted to such lending aren’t making decisions on price, but on the basis of fulfilling an immediate need like paying rent.
And we must make these decisions with an eye on what sort of society and markets we want. At Consumer Action, we seek a just marketplace, where people have power and business plays fair. Most Australians, I think, would stand behind that vision for the future.