> As of 2025, the time needed to earn $1 is 63 minutes in the US.
Confused, I clicked one of the links and tried to understand. Found this:
> The time to get $1 refers to a day of life for anyone at any age and in any circumstance, not just the hours worked by someone with a job.
Clicking another link took me to the abstract at https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4785458 but that didn't answer any questions either.
I can't find anything really of substance in this, other than someone trying to redefine a lot of terms in confusing ways.
$1 every 63 minutes would be $8,343/year. I cannot think of any way to reconcile that with the US average household income or any other related figure.
https://www.census.gov/data/tables/time-series/demo/income-p...
That table gives counts of people in $2,500 income intervals. For each interval, compute (people in interval) × 1/(income / minutes per year), sum across intervals, and divide by the total number of people with income; doing that, I get an average poverty of 49 minutes.
I think he might be using after-tax income, and may be calculating from household income divided by household members (or similar), which would explain the discrepancy (since children don't work).
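For concreteness, here is a minimal sketch of that interval calculation in Python, using made-up bin counts rather than the actual census table:
```python
# Minimal sketch of the calculation described above, using made-up counts of
# people per $2,500 income interval (NOT the actual census table).

MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 calendar minutes

# (interval midpoint income in $/year, number of people) -- hypothetical figures
bins = [
    (1_250, 10_000_000),
    (6_250, 15_000_000),
    (13_750, 20_000_000),
    (31_250, 60_000_000),
    (76_250, 70_000_000),
    (151_250, 25_000_000),
]

total_people = sum(count for _, count in bins)
# "Average poverty" = mean over people of the time (in minutes) needed to earn $1
total_minutes = sum(count * MINUTES_PER_YEAR / income for income, count in bins)
print(f"average poverty: {total_minutes / total_people:.0f} minutes per $1")

# Sanity check on the headline figure: 63 minutes per $1, counted over the whole year
print(f"63 min/$ implies ${MINUTES_PER_YEAR / 63:,.0f}/year")  # ~$8,343
```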
You have to reconcile it with the unemployed and with children.
The idea of measuring an average time effort across the employed, the sick, and the unemployed is not a bad one, but I'm not sure adding children to the mix is a good idea.
That just creates a measure where children equal poverty.
That's because it is the average of the "time to earn $1" per individual.
So let's say you're Elon Musk, and it takes you so little time to earn $1 that we can say t_Elon = 0.
Now say you are way below the poverty line and earn $6,000/year. This means t_Poor ≈ 87 minutes.
If we average 80 people at t_Poor and 20 at t_Elon, we get 0.8 × 87 minutes ≈ 70 minutes, even though the average income in this case would be 0.2 × income_Elon, something like $7 billion/year.
I hope this shows why you can't just take the inverse to get the average income. That would only be true if everyone earned exactly the same income.
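A quick numerical sketch of that point (income_Elon here is a made-up $35B/year, chosen so the average comes out near the $7 billion figure above):
```python
# Toy population: 80 people earning $6,000/year and 20 hypothetical Elons.
# Shows that the average "time to earn $1" is NOT the inverse of average income.

MINUTES_PER_YEAR = 365 * 24 * 60

incomes = [6_000] * 80 + [35_000_000_000] * 20  # made-up figures

avg_income = sum(incomes) / len(incomes)
avg_minutes_per_dollar = sum(MINUTES_PER_YEAR / x for x in incomes) / len(incomes)

print(f"average income:                ${avg_income:,.0f}/year")            # ~$7 billion
print(f"average time to earn $1:       {avg_minutes_per_dollar:.0f} minutes")  # ~70
print(f"inverse of the average income: {MINUTES_PER_YEAR / avg_income:.5f} minutes")  # ~0
```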
Why is this a better metric?
The average income is biased towards big earners, while this metric is more centered around the mode of the distribution (poor people).
It captures the income distribution much better than average income.
Well, the average is indeed a worthless metric, and that's why everyone uses the median for these statistics unless they're arguing in bad faith.
If you do want to use the average, you'd at least need to remove 10% from both the top and the bottom before calculating it, but it's still going to be super untrustworthy.
Not sure what to take away from your comment; I'm still unsure what kind of metric you're pitching and why it'd be a valuable thing to track.
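If it helps, here is a small sketch of that kind of trimmed average (the incomes are invented for illustration):
```python
# Trimmed mean: drop the lowest and highest 10% of incomes before averaging.

def trimmed_mean(values, cut=0.10):
    xs = sorted(values)
    k = int(len(xs) * cut)
    kept = xs[k:len(xs) - k]
    return sum(kept) / len(kept)

incomes = [800, 20_000, 35_000, 45_000, 55_000, 65_000, 80_000, 95_000, 120_000, 800_000_000]
print(f"plain mean:       ${sum(incomes) / len(incomes):,.0f}")  # dominated by the billionaire
print(f"10% trimmed mean: ${trimmed_mean(incomes):,.0f}")
```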
I'm not pitching any metric. I'm describing the metric being discussed.
The average is certainly not worthless. The average gives the expectation, which is more meaningful than the median, which is just the arbitrary line at the 50th percentile.
So if we're trying to find the expected value of how rich someone is, the average income is the answer. And if we want the expected value of how poor someone is, this new metric (average poorness) is the answer.
If we want to learn about poverty, obviously the poorness is more important.
But of course, you could also calculate the median poorness in this case, but that would actually just be the inverse of the median income, so no new information would be gained.
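A tiny check of that last claim, with made-up incomes and an odd sample size so the median is an actual data point and no interpolation happens:
```python
# For positive incomes, x -> MINUTES_PER_YEAR / x is strictly decreasing, so it
# maps the median income to the median "time to earn $1".

import statistics

MINUTES_PER_YEAR = 365 * 24 * 60
incomes = [6_000, 25_000, 48_000, 90_000, 800_000_000]  # hypothetical, odd count

times = [MINUTES_PER_YEAR / x for x in incomes]
print(statistics.median(times))                       # ~10.95 minutes
print(MINUTES_PER_YEAR / statistics.median(incomes))  # same number
```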
It's focused on the very poorest, who are not the mode. (Income distribution is approximately lognormal; see https://www.researchgate.net/figure/The-lognormal-distributi...)
Say you have 10 people: one making $800/year, 8 making $80k/year, and one evil billionaire making $800 million. Their times to earn $1 are respectively about 10 hours, 0.1 hours, and essentially zero. If you take the arithmetic mean of that you get about 1.1 hours, and it's dominated by the single poor person. If you double that person's income to $1,600, they're at 5 hours to earn $1, and the overall average is nearly cut in half, to 0.58. Meanwhile you can cut the income of all the middle-class people to $40k and not much changes; the average time to earn $1 would be (5 + 8×0.2 + 0)/10 = 0.66.
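Spelling out the same arithmetic, using the rounded per-person times from the example:
```python
# Hours needed to earn $1 for each of the 10 people, using the rounded figures
# above (10 h for the $800 earner, 0.1 h for the $80k earners, ~0 for the billionaire).

def avg(xs):
    return sum(xs) / len(xs)

base            = [10.0] + [0.1] * 8 + [0.0]  # $800, 8 x $80k, $800M
poorest_doubled = [5.0]  + [0.1] * 8 + [0.0]  # poorest person's income doubled to $1,600
middle_halved   = [5.0]  + [0.2] * 8 + [0.0]  # middle class cut to $40k each

print(avg(base), avg(poorest_doubled), avg(middle_halved))  # ~1.1, 0.58, 0.66
```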
> It captures the income distribution much better than average income.
Not really, and certainly not better than median income, which is what people typically use. It tries to measure exactly how little income the very poor make, which is not normally what people mean when they talk about inequality or poverty, and it is also hard to measure at the accuracy you need when small changes produce huge swings in the result. In particular, I don't believe he has correctly accounted for government benefits; hardly anyone in the US is consuming less than $8,000/year.
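The median income is not a very good measure at all; you would need more quantiles to capture the distribution.
This metric is much better at capturing the distribution than the average income.
It's true that the mode is not at the very bottom of the distribution, but it is much more aligned with the lower tail than with the upper tail.
And I don't know why you mistrust this figure so much; it's based on this World Bank dataset, which you can verify yourself: https://datacatalog.worldbank.org/search/dataset/0064304/100...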
Thanks for the comment, I was trying to parse the meaning of "time needed to earn $1" for a bit. This just boils down to what countries have the highest floor for their poorest members.
I did the same math. The closest guess I have is that it is derived from the poverty line for a family of four, $32,150 (which divided by four is about $8,037).
I’m as frustrated as anybody else with how the economy is going in the US. But we should be skeptical about a new metric with an intuitive name that seems to confirm exactly what we all suspect but is sort of complex to interpret/measure, right?
In particular, it seems weird that we were the only ones with a massive change during COVID.
Also seems a little odd that Germany was always better than the US, even in the 90’s when things were pretty good here.
Putting it together, we need to have COVID all the time here, so we can match the economic development of Germany immediately post-reunification.
Interesting. I hope this catches on -- it's tough to visualize poverty in concrete terms, but assessing how long it takes a population, on average, to make a given unit of purchasing power is a clever idea. It's sobering to realize that it would take a friend across the sea over 100 hours to assemble the funds for a $100 bill that I wouldn't look at twice...
Although I wish this sparked a conversation on how we can do better instead of national dick measuring contests. Those don't help.
He finds that “average poverty is substantially higher in the US, even though average incomes are higher than in most Western European countries”.
That seems like a complicated way to "talk about median income without talking about median income". By the end, they do describe the basic situation: the US has greater total wealth and total income, but that wealth and income are so unequally distributed that more people are poor.
This metric makes a lot of intuitive sense and reflects the consumer sentiment I hear from neighbors. "Working more for less" isn't a new complaint, but something that measures that is interesting.
I would be very interested to find out how those stats relate to things like the Gini coefficient or old pre-GDP economic measures of raw production.
Average != median. This measure seems to be so high because there are so many low-paid workers in the US due to the low minimum wage.
Median workers in the US have some of the highest hourly wages at PPP in the rich world, and they have been increasing, but they are pretty similar to those in Germany. The big difference in annual pay at PPP is down to hours worked.
For 2022, average annual hours worked per worker in the US is 1,790, while in Germany it is 1,340 [1]. Meanwhile, average hourly wages at PPP are $34.9 in the US vs $34.6 in Germany [2].
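[1] https://ourworldindata.org/grapher/annual-working-hours-per-...
[2] https://ourworldindata.org/grapher/average-hourly-earnings
(Multiplying those figures out as a rough check: 1,790 h × $34.9 ≈ $62,500/year in the US versus 1,340 h × $34.6 ≈ $46,400/year in Germany.)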
> The $1 is measured in international dollars. This means it buys the same amount of goods and services in any country as a US dollar does in the United States. It is often used alongside purchasing power parity (PPP) data. The “time” refers to a day of life for anyone, at any age and in any circumstance — not just the hours worked by someone with a job.
So IIUC this "average poverty" (measured in time per international dollar) includes people living off social welfare? Otherwise, if it only included the working population, wouldn't we have
average poverty ≝ (average yearly income* of the working population / 1yr)⁻¹
and so it should be inversely proportional to the average yearly income* metric mentioned in the article?
*) Adjusted for purchasing power, i.e. measured in international dollars.
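(A side note on that formula: averaging "time per dollar" over people gives 1 yr × E[1/income], which is the inverse of the harmonic mean income rather than of the average income. Since the harmonic mean is always at most the arithmetic mean, the metric would come out at least as high as (average yearly income / 1 yr)⁻¹ even if it only covered the working population.)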
The declining standard of living in the USA has become painfully obvious. I think we're past solutions. The question is whether it will go the way of Italy or the way of Yugoslavia.
OK, the idea is interesting, but the numbers seem completely bogus. In what world does it take the average American 63 minutes to earn $1, even one "international" dollar?
I get the "international" part - purchasing power. The number still seems way off, though.
In a time when minimum wage is $7/hr, how is the average American earning $1/hr?
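Can anyone make that number make any sense?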
It seems biased to ignore things like growth in housing prices and the stock market, where we have seen some massive gains in recent years. If it's easy to invest in property or companies or bonds or treasuries or whatever to make a dollar, that should count.
It doesn't make sense to talk about poverty in the US without talking about race (or, ok, for you libertarians in the audience: "geography"). It certainly doesn't make sense to average the economic standing of a population split between imperial core and colonized enclaves.
Is the measure they are using inflation-adjusted over time? If not, this shows an enormous loss in purchasing capacity over time for the average person, which is certainly how it's felt over the past decades as inflation has outrun wages for most people.
I would guess this is because places like Germany have incredibly low annual working hours. [1] The bottom of the list is populated entirely by European countries.
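[1] https://en.wikipedia.org/wiki/List_of_countries_by_average_a...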