Who is actually running your company?
Imagine you are the boss of a very large company. You sit in a big office. You have many employees. Every day, you make big choices. You decide who to hire. You decide which companies to buy from. You decide how to follow the law. You feel like you are in control. You feel like the "captain of the ship." But there is a quiet crisis happening right now. It is happening in your office and in offices all over the world.
The way we lead organizations is changing. We are using more computer programs to help us. These programs use "algorithms." An algorithm is just a set of rules for a computer to follow. We use them because they are fast. They can look at more data than any human ever could. But as we use these tools, something strange is happening. The power to make choices is moving away from people.
It is moving into the code of the software we use every day.
If we needed a label to identify this phenomenon quickly, I would call it "Structural Agency Displacement." That is a big name for a simple idea. "Agency" is just your power to make a choice. "Displacement" means moving something out of its place. So, this means the power to choose is being moved out of human hands. It is being put into the "structure" of the software, and into the hands of the machine.
And no, this is not a mistake or a bug in the software. It is how the systems were built. They were built to be fast and to make money. But in the process, they started making the choices for us. We think we are still the bosses. We think we are just using tools. But the tools are often the ones setting the path. They decide what we see. They decide what we think is "normal." They even decide how fast we have to work.
The Three-Way Choice: Speed, Money, or Freedom
When a company builds a computer system, they face a hard choice. This choice typically has three parts. We can call it a "trilemma." Think of it like a triangle. You can pick two sides, but it is very hard to have all three. The three sides are:
System Efficiency: This is about speed and scale. It means doing things fast and doing them for many people at once. It also means saving money.
Company Profit: This is about making money. It means keeping users happy so they keep coming back.
User Autonomy: This is the most important one for humans. It means the person using the computer is still the one making the final choice. Their "agency" is safe.
In the world of apps we use for fun, the choice is easy. Think about Netflix or TikTok. Netflix uses an algorithm to suggest movies. It is very good at it. In fact, 80% of what people watch on Netflix comes from these suggestions. This makes Netflix a lot of money. It keeps people watching. TikTok is the same. Its "For You" feed is amazing at showing you what you like.
But there is a catch. To make the system that fast and that profitable, they had to give up on "User Autonomy." You aren't really choosing what to watch. The machine is choosing for you. You are just going along for the ride. This is fine when you are just watching a movie. It is not fine when you are running a business.
In a business, things like hiring and law-following are vital. If a computer makes a choice that breaks the law, the company is still in trouble. A human being is still legally responsible. But if the human didn't really "choose," how can they be responsible? This is the big problem we face today.
How the Machine Takes Over
How does a piece of software take away a human's power? It doesn't use force (well, not yet). It doesn't have a mind of its own. Instead, it uses the way it is built. It uses "architecture." There are three main ways this happens.
1. The Power of the "Default"
Have you ever noticed that you usually pick the first option a computer gives you? We all do it. It is easier. I have written many times about "Institutional Inertia." It does not appear out of thin air; it is rooted in the natural human reluctance to change, and here we see it at work in a different context.
If a computer program picks a vendor for you, it is "the default." To pick a different vendor, you might have to click five more buttons. You might have to write a note explaining why. You might have to wait for your boss to approve it. The computer makes its choice the "easy path." It makes your choice the "hard path."
Most people are busy. They are tired. They will take the easy path almost every time. In this way, the computer makes the choice for them without ever saying "you must do this." It just makes "doing this" the path of least resistance.
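The imbalance described above can be sketched in a few lines of code. This is a hypothetical illustration, not any real procurement system: all function and vendor names are invented to show how a design can make agreement cost one step and disagreement cost three.

```python
# A minimal sketch of "asymmetric friction" in a vendor-selection tool.
# All names here are hypothetical, for illustration only.

def accept_default(recommended_vendor: str) -> str:
    """Taking the system's pick costs one step: a single confirmation."""
    return recommended_vendor  # one click, done

def override_default(chosen_vendor: str, justification: str,
                     manager_approved: bool) -> str:
    """Choosing differently costs extra steps: pick, justify, get approval."""
    if not justification:
        raise ValueError("Override requires a written justification.")
    if not manager_approved:
        raise PermissionError("Override requires manager approval.")
    return chosen_vendor

# The easy path: one call, no questions asked.
vendor = accept_default("Acme Corp")

# The hard path: extra work at every step.
vendor = override_default("Beta Ltd",
                          justification="Better delivery terms",
                          manager_approved=True)
```

Nothing in this sketch forces the user to take the default; it just makes the default the path of least resistance, which is usually enough.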
2. Controlling What You See
If you are hiring someone, a computer might look at 1,000 resumes. It picks the top 5 to show you. You look at those five and pick the best one. You feel like you made a choice.
But did you? The computer already threw away 995 people. You never saw them. You don't know why they were thrown away. The computer did the real work of choosing before you even logged in. You are just picking from a tiny list that the machine pre-vetted.
If the machine is biased against certain people, you will never know. You only see what the machine wants you to see. It feels like a full picture, but it is just a small window.
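The shortlisting step can be made concrete with a tiny sketch. The scoring rule below is invented for illustration; real systems learn their scoring from past data, which is exactly where hidden bias can enter.

```python
# A minimal sketch of a resume-screening shortlist.
# The scoring function is hypothetical; real systems learn it from history.

def score(resume: dict) -> float:
    # Illustrative only: any bias baked into this function is invisible
    # to the manager, who only ever sees the final shortlist.
    return float(resume["years_experience"])

def shortlist(resumes: list[dict], k: int = 5) -> list[dict]:
    """Return the top-k resumes; the other len(resumes) - k are never shown."""
    return sorted(resumes, key=score, reverse=True)[:k]

resumes = [{"name": f"Candidate {i}", "years_experience": i % 30}
           for i in range(1000)]

visible = shortlist(resumes)          # the 5 the manager sees
hidden = len(resumes) - len(visible)  # the 995 silently discarded
print(hidden)  # 995
```

The manager's "choice" happens entirely inside `visible`; everything in the other 995 records was decided before any human logged in.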
3. Moving Too Fast to Think
Human brains are good at deep thinking. But we need time to do it. Computer systems move very fast. They send us alerts. They send us emails. They give us "pings" all day long.
This constant stream of information breaks our focus. It makes us move faster. When we move fast, we don't stop to ask "why?" We just react. We click "Accept." We click "Approve." By keeping the pace high, the system prevents us from slowing down to challenge how it works. We become like parts in a machine instead of people in charge.
Real Stories: Banks and Job Hunting
To understand this, let's look at real life. Many banks now use AI to decide who gets a loan. About 95% of "FinTech" lenders use these tools. They say the AI is better because it doesn't have human bias. But is that true?
A study of a car lender in China showed something interesting. At first, the computer just gave the loan officers a "score." The humans could look at the score and decide what to do. In this setup, the humans didn't change much. They still had the same old biases they always had.
Then, the bank changed the "architecture" of the software. They made the computer's choice the "default." If the computer said "Give the loan," the human had to sign off on it. If they wanted to say "No," they had to do a lot of extra work.
What do you think happened? The bias went down! The machine was actually more fair than the humans. But the humans weren't "choosing" anymore. They were just clicking "OK" on what the machine said. This shows that the way software is built can change how we act, even if we don't notice it.
In Human Resources (HR), the story is different. Sometimes these programs learn from the past. If a company only hired men for 20 years, the machine might learn that "men are better." It will then start throwing away resumes from women.
The HR manager sees the list of candidates. They see only men. They think, "Well, I guess only men applied." They approve the list. They think they are being fair. But they are just following a biased machine. This creates a loop. The past mistakes of humans get "baked into" the code of the future.
The "Human-in-the-Loop" Lie
Many companies say they have a "human-in-the-loop." This is a fancy way of saying a person still signs off on everything. Often they use this just as a shield. They say, "Don't worry, a person is still in charge."
But is that person really "in the loop"? If they don't understand how the computer made its choice, they aren't in charge. If they are too busy to check the work, they aren't in charge. If the computer makes it hard to say "no," they aren't in charge.
A "human-in-the-loop" only works if the human has the power, the time, and the information to say "No." Without those three things, the human is just a rubber stamp. They are there to take the blame if things go wrong, but they don't have the power to make things go right.
How to Take Control Back
We cannot just throw away our computers. We need them to stay competitive. But we can change how we use them. We need a new way to govern these tools. Here are three ideas for how to do it.
1. Preference Declaration Systems
Instead of letting the computer guess what we want, we should tell it. Before the machine starts working, humans should write down their rules. We should say, "These are the things we value." The machine should then have to follow those rules. If the machine wants to do something else, it must ask for permission first.
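One simple way to picture a preference declaration is as a gate the machine must pass before acting. The rule names and thresholds below are made up for illustration; the point is only that the humans write the rules first, and anything outside them escalates to a person.

```python
# A minimal sketch of a "preference declaration" gate.
# The rules and field names are hypothetical, for illustration only.

DECLARED_PREFERENCES = {
    "max_loan_amount": 50_000,
    "allowed_regions": {"EU", "US"},
}

def machine_proposal_allowed(proposal: dict) -> bool:
    """The machine may act only inside the declared rules; anything
    outside them must go back to a human for explicit permission."""
    if proposal["amount"] > DECLARED_PREFERENCES["max_loan_amount"]:
        return False
    if proposal["region"] not in DECLARED_PREFERENCES["allowed_regions"]:
        return False
    return True

proposal = {"amount": 80_000, "region": "EU"}
if not machine_proposal_allowed(proposal):
    print("Escalate to a human decision-maker")
```

The design choice that matters here is the order of operations: the values come first, and the algorithm operates inside them, not the other way around.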
2. Friction Symmetry
This is a very important rule for design. "Friction" is how hard it is to do something. If it takes one click to say "Yes" to a computer, it should take one click to say "No."
Right now, most systems have "Asymmetric Friction." It is easy to agree and hard to disagree. We need to make it balanced. If a manager wants to override the computer, it shouldn't be a mountain of paperwork. It should be a normal part of the job.
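A symmetric design can be sketched just as briefly. In this hypothetical example, agreeing and disagreeing with the machine are the same single call, and a note is optional for both paths rather than a penalty reserved for disagreement.

```python
# A minimal sketch of "friction symmetry": approving and overriding
# the machine cost the same number of steps. Names are hypothetical.

def record_decision(recommendation: str, decision: str, note: str = "") -> dict:
    """One call either way; the override is just recorded, not punished."""
    return {
        "recommended": recommendation,
        "decided": decision,
        "overridden": decision != recommendation,
        "note": note,
    }

agree = record_decision("approve", "approve")
disagree = record_decision("approve", "reject", note="Income unverified")

assert agree["overridden"] is False
assert disagree["overridden"] is True
```

Logging the override still gives the company an audit trail; what it removes is the extra paperwork that quietly teaches managers to stop disagreeing.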
3. Algorithmic Impact Assessments (AIA)
Before a company buys a new piece of software, they should check it. They should ask: "Who is making the choices here?" They should map out where the power sits. They should also set a "sunset clause." This means the system has to be re-checked every year to make sure it is still doing what it was supposed to do.
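The "sunset clause" part of an assessment can be as simple as a date check. This is a hypothetical sketch assuming each deployed system records the date of its last assessment; the one-year interval is the example interval from the text.

```python
# A minimal sketch of a "sunset clause" check for deployed software,
# assuming each system records its last assessment date. Hypothetical names.

from datetime import date, timedelta

REVIEW_INTERVAL = timedelta(days=365)

def needs_reassessment(last_assessed: date, today: date) -> bool:
    """A system past its sunset date must be re-checked before further use."""
    return today - last_assessed > REVIEW_INTERVAL

print(needs_reassessment(date(2023, 1, 1), date(2024, 6, 1)))  # True
```

The value of the check is not the code; it is that re-approval becomes the default, and silent indefinite operation becomes the exception.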
A Message to Leaders
If you are a senior leader, you might feel like you are still the master of your domain. You have the title. You have the corner office. But if you look closely at how your company works, you might see a different story.
Many leaders are now "bound" by their technology. They feel in control, but they are only choosing from a menu the computer wrote. They are going at a speed the computer set. They are validating choices that a machine already made.
This shift is not a law of nature. It is happening because we want things to be fast and cheap. But we are paying a price. The price is our own authority. We are giving away the "locus of control."
If we don't pay attention, this will happen unnoticed. We will wake up one day and realize that the humans are no longer running the company. The software is. And if the software is running the company, who is looking out for the people? Who is looking out for the future?
The only way forward is to pay attention. We must be disciplined. We must look at the "underlying architecture" of our tools. We must make sure that when we use a computer, we are still the ones in the driver's seat. Governance is not about checking boxes after the work is done. It is about how we build the systems in the first place.
If authority is going to move, it should not do so in secret. We must see it happening. We must decide if that is what we want. Because if we don't decide, the machine will decide for us. And once the machine decides, it is very hard to change its mind.