Visualizing the gap between what humans can do and what needs to be done
January 18, 2025
The overwhelming volume of work that needs to be done
Every minute, millions of security events flow through corporate networks. Thousands of telescopes capture asteroids that could threaten Earth. Medical researchers analyze countless genetic sequences looking for disease patterns. And millions of hours of video are captured—many of which include crimes being committed.
But nobody's paying attention.
Not because we don't care, but because there are just too many things to watch. To do. To monitor. To take action on.
Let's call these Work Tasks.
All the different work tasks that we could potentially be doing
This mock-up shows the size of the problem. The x-axis represents Work Task volume—how many tasks, how much data, how many decisions. The y-axis represents Work Task difficulty—the complexity, expertise required, and cognitive load.
The area under this curve? That's everything that should be monitored, analyzed, and acted upon to maintain and improve our civilization.
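To make the framing concrete, here is a toy numerical sketch of the idea. The curve shape and all the numbers are made up purely for illustration; the point is just that "total work" is an area under a volume-vs-difficulty curve, which we can approximate numerically.

```python
# Toy model of the essay's framing (illustrative numbers only):
# the "work curve" says how many tasks exist at each difficulty level,
# and the total work is the area under that curve.

def work_curve(difficulty: float) -> float:
    """Hypothetical task volume at a given difficulty (0.0 to 1.0).
    Easy tasks are plentiful; very hard tasks are rarer."""
    return 1_000_000 * (1.0 - difficulty) ** 2

def area_under(curve, lo: float, hi: float, steps: int = 1000) -> float:
    """Trapezoidal approximation of the area under `curve` on [lo, hi]."""
    dx = (hi - lo) / steps
    total = 0.0
    for i in range(steps):
        x0 = lo + i * dx
        total += (curve(x0) + curve(x0 + dx)) / 2 * dx
    return total

total_work = area_under(work_curve, 0.0, 1.0)
print(f"total work (arbitrary units): {total_work:,.0f}")
```

Any curve shape would do here; what matters is that the area, not just the width or the height, is the quantity we care about.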
Now let's overlay what humans can actually accomplish:
We're only doing a tiny amount of what needs to be done
That tiny blue area? That's us. That's the sum total of human capacity to process information, make decisions, and take action.
Think about it practically from a trained professional's perspective:
And there are only so many of these people in the first place. They're not easy to train, hire, or keep available in a talent pipeline.
We're not failing; we're just finite. And the bigger our society gets, the harder it becomes to cover all the tasks that require human-level analysis.
Also keep in mind that part of the answer here is automation, for sure. Some things can be, and have been, done with tooling that finds basic patterns in data. But what we're talking about here is work that can't be done by that kind of automation because it requires intelligence.
So this is where AI comes in. Yes, of course, one way to look at AI is that it's going to disrupt millions of jobs and be a huge problem for society. And I do think that's the case.
But another way to think about this is to compare AI not to the work that a human would have done, but to the work that a human never would have gotten to.
And not just any particular human, but any human.
AI taking on some of the work that never would've been done
On this view, AI isn't about replacing humans, but rather doing the work that was never getting done at all.
There are two dimensions that we're actually getting help with here: the amount of work, and the difficulty of the work.
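Those two dimensions can be sketched as a toy calculation. The coverage figures below are invented for illustration only: humans review a small slice of the volume but can handle very hard tasks, while AI-augmented coverage extends much further along the volume axis at a somewhat lower difficulty ceiling.

```python
# Toy comparison of coverage along the two dimensions the essay names:
# volume (how many tasks get looked at) and difficulty (how hard a task
# can be and still get handled). All numbers are made up for illustration.

def coverage(volume_fraction: float, difficulty_ceiling: float) -> float:
    """Fraction of a uniform volume-by-difficulty work rectangle covered."""
    return volume_fraction * difficulty_ceiling

# Humans: tiny slice of the volume, but up to high difficulty.
humans_only = coverage(volume_fraction=0.02, difficulty_ceiling=0.9)

# AI-augmented: far more volume, somewhat lower difficulty ceiling.
with_ai = coverage(volume_fraction=0.60, difficulty_ceiling=0.7)

print(f"humans alone: ~{humans_only:.1%} of the work covered")
print(f"AI-augmented: ~{with_ai:.1%} of the work covered")
```

The exact percentages are meaningless; the design point is that coverage is a product of both dimensions, so expanding either one grows the covered area.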
And this isn't theoretical; it's already happening all over the world.
This has been happening for decades already with Machine Learning. But now with modern AI we can do things that require more advanced human-level intelligence.
Let's make this concrete with actual examples:
| Domain | Total Work Needed | Human Capacity | AI-Augmented Reality |
|---|---|---|---|
| Cybersecurity | Monitor billions of events/day | Review n alerts/day | Analyze millions, flag critical threats |
| Medical Imaging | Process 100,000s of waiting scans | Read n scans/day | Pre-screen all, prioritize urgent cases |
| Fraud Detection | Check millions of transactions | Investigate n cases/day | Real-time analysis of all transactions |
| Space Safety | Track millions of objects | Monitor n objects/day | Continuous tracking of all debris |
The gap between what humans can cover and what they need to cover is vast
The whole idea is based on this concept of Work Tasks, where each Work Task has a difficulty level. And there are simply some number of Work Tasks that exist in the world, some number that are being covered by humans, and some number that are not.
The area under the curve framework reveals three critical insights:

1. **The work exists whether we do it or not.** Those security breaches happen. Those diseases go undiagnosed. Those asteroids keep flying. The work is real, with real consequences.

2. **Human capacity has hard limits.** We can't just "try harder" our way out of this. The gap is too vast. It's not about effort; it's about fundamental biological constraints.

3. **AI can help cover the gap.** Look at the visualizations again and notice that AI doesn't necessarily have to shrink the human portion of the coverage. I think that will also happen, honestly, but for different reasons, and that point is unrelated to this one.
Understanding work as area under the curve—combining both volume and difficulty—gives us another frame to think about human-AI collaboration.
There are so many instances of fraud, corruption, subversion of democracy, malicious actors in the cyber world, criminals caught on camera, criminals recorded in audio, plainly obvious dangerous health signals, and so on, that are just never acted upon.
Not everyone can hire a security team. Not everyone can hire a team of journalists to investigate a company.
This essential human work doesn't scale.
And AI can be part of the solution.