Using algorithms to determine character

Is data-driven personality analysis fairer than standard measures?

Paul Gu, co-founder and head of product at Upstart, one of a new generation of loan companies applying mathematical models to determine whether you will pay back a loan or stay in a job. “It’s not whether you can pay. It’s a question of how important you see your obligation,” Gu said. Photograph: Jim Wilson/The New York Times

Computers aren’t just doing hard maths problems and showing us cat videos. Increasingly, they judge our character.

Maybe we should be grateful.

A company in Palo Alto, California, called Upstart has over the last 15 months lent $135 million to people with mostly negligible credit ratings. Typically, they are recent graduates without mortgages, car payments or credit card settlements.

Those are among the things that normally earn a good or bad credit score, but these people haven’t been in the working world that long. So Upstart looks at their school examination results and what colleges they attended. As much as job prospects, the company is assessing personality.

"It's not whether you can pay. It's a question of how important you see your obligation," said Paul Gu, Upstart's co-founder and head of product.

The idea, validated by data, is that people who did things like double-checking their homework, or studying extra in case there was a pop quiz, are thorough, and thorough people are likely to honour their debts.

Analytics, meet judgment of people. “I guess you could call it character, though we haven’t used that label,” said Mr Gu, who is 24.

The same personality dynamic holds for people who go to great schools or have top grades. Douglas Merrill, the founder and chief executive of ZestFinance, is a former Google executive whose company writes loans to subprime borrowers through non-standard data signals.

One signal is whether someone has ever given up a prepaid wireless phone number. Where housing is often uncertain, those numbers are a more reliable way to find you than addresses; giving one up may indicate you are willing (or have been forced) to disappear from family or potential employers. That is a bad sign.
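The kind of model the article describes can be sketched in miniature. The following is purely illustrative, not ZestFinance's actual method: it hand-weights a few invented signals (including the prepaid-number one mentioned above) and maps them through a logistic function to a 0-to-1 "willingness to pay" score. The feature names, weights and bias are all assumptions made up for the example.

```python
import math

# Hypothetical signals and illustrative weights; a real lender would fit
# these from repayment data rather than set them by hand.
WEIGHTS = {
    "gave_up_prepaid_number": -1.5,   # abandoning a number reads as a bad sign
    "years_at_current_address": 0.3,  # stability reads as a good sign
    "completed_degree": 0.8,
}
BIAS = -0.5

def willingness_score(signals):
    """Map a dict of signal values to a 0-1 score via a logistic function."""
    z = BIAS + sum(WEIGHTS[name] * value for name, value in signals.items())
    return 1.0 / (1.0 + math.exp(-z))

# Two toy applicants: one with stable signals, one without.
stable = willingness_score(
    {"gave_up_prepaid_number": 0, "years_at_current_address": 4, "completed_degree": 1})
transient = willingness_score(
    {"gave_up_prepaid_number": 1, "years_at_current_address": 0, "completed_degree": 0})
print(round(stable, 2), round(transient, 2))
```

The point of the sketch is only that such a score rests entirely on which signals are chosen and how they are weighted, which is where the design questions raised later in the article come in.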

Zest recently branched into “near prime” borrowers, who have either fallen from the prime category or risen from subprime. The question is why these people have changed categories, and Zest tries to figure out if a potentially reliable borrower has had some temporary bad luck, like a one-time medical expense.

“‘Character’ is a loaded term, but there is an important difference between ability to pay and willingness to pay,” said Mr Merrill. “If all you look at is financial transactions, it’s hard to say much about willingness.”

Mr Merrill, who also has a PhD in psychology (from Princeton, in case Mr Gu wants to lend him money), thinks that data-driven analysis of personality is ultimately fairer than standard measures.

“We’re always judging people in all sorts of ways, but without data we do it with a selection bias,” he said. “We base it on stuff we know about people, but that usually means favouring people who are most like ourselves.” Familiarity is a crude form of risk management, since we know what to expect. But that doesn’t make it fair.

Character (though it is usually called something more neutral-sounding) is now judged by many other algorithms. Workday, a company offering cloud-based personnel software, has released a product that looks at 45 employee performance factors, including how long a person has held a position and how well the person has done. It predicts whether a person is likely to quit and suggests interventions, such as a new role or a transfer, that might persuade that person to stay.
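A toy version of such a flight-risk predictor might look like the sketch below. This is not Workday's product: the two factors, the thresholds and the weights are invented for illustration, standing in for the 45 factors the article mentions.

```python
from dataclasses import dataclass

@dataclass
class Employee:
    name: str
    years_in_role: float  # long tenure in one position can signal stagnation
    performance: float    # 0-1; strong performers have more outside options

def quit_risk(e: Employee) -> float:
    """Crude 0-1 risk score: stagnation plus marketability (invented weights)."""
    stagnation = min(e.years_in_role / 5.0, 1.0)  # saturates at five years
    return 0.6 * stagnation + 0.4 * e.performance

def suggestion(e: Employee) -> str:
    """Suggest a retention step when the risk score crosses a threshold."""
    return "consider a new role or transfer" if quit_risk(e) > 0.5 else "no action"

star = Employee("A", years_in_role=4.0, performance=0.9)
newcomer = Employee("B", years_in_role=0.5, performance=0.6)
print(suggestion(star), "|", suggestion(newcomer))
```

Even in this crude form, the sketch shows the shape of the product described: a score over observable factors, plus a rule that turns the score into a suggested intervention.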

It can also characterise managers as what Workday informally calls “rainmakers” or “terminators”, depending on how well they hold talent. Inside Workday, the company has analysed its own sales force to see what makes for success. The top indicator is tenacity.

"We all have biases about how we hire and promote," said Dan Beck, Workday's head of technology strategy. "If you can leverage data to overcome that, great."

People studying these traits will be encouraged to adopt them, he said, since “if you know there is a pattern of success, why wouldn’t you adopt it?”

In a sense, it’s no different from the way people read the biographies of high achievers, looking for clues for what they need to do differently to succeed. It’s just at a much larger scale, based on observing everybody.

There are reasons to think that data-based character judgments are more reasonable. Jure Leskovec, a professor of computer science at Stanford, is finishing up a study comparing the predictions of data analysis against those of judges at bail hearings, who have just a few minutes to size up prisoners and decide if they could be risks to society. Early results indicate that data-driven analysis is 30 per cent better at predicting crime, Mr Leskovec said.

“Algorithms aren’t subjective,” he said. “Bias comes from people.”

That is only true to a point: algorithms do not fall from the sky; they are written by human beings. Even if the data aren’t biased, the design can be, and we could end up with the flawed belief that maths is always truth.

Upstart’s Mr Gu, who said he had perfect examination results but dropped out of Yale, wouldn’t have qualified for an Upstart loan using his own initial algorithms.

He has since changed the design, and he said he is aware of the responsibility of the work ahead.

“Every time we find a signal, we have to ask ourselves, ‘Would we feel comfortable telling someone this was why they were rejected?’ ” he said.

© The New York Times 2015