Skin in the Game: Agency Cost and the Value of Advice
There is a concept in asset management known as GP commitment. It represents the portion of a fund’s capital (typically 1%-5%) that managers personally invest alongside outside investors. Another common mechanism is carried interest, whereby a meaningful share of profits (often 20%), earned above a predefined hurdle rate, is allocated to the manager rather than to investors. The intent is to encourage managers not merely to perform, but to outperform.
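The arithmetic of carried interest can be sketched in a few lines. This is an illustrative toy (a simple European-style carry with no catch-up or clawback; all figures are hypothetical, not terms of any actual fund):

```python
def carried_interest(fund_return, hurdle_rate, carry_rate=0.20):
    """Manager's profit share on returns above the hurdle.

    Simplified: no catch-up clause, no clawback. All values are
    fractions of committed capital.
    """
    excess = max(fund_return - hurdle_rate, 0.0)
    return carry_rate * excess

# A fund returning 15% against an 8% hurdle: the manager keeps 20%
# of the 7-point excess, i.e. roughly 1.4 points of return.
print(carried_interest(0.15, 0.08))

# At or below the hurdle, the manager earns no carry at all.
print(carried_interest(0.05, 0.08))
```

The asymmetry is the point: carry pays only beyond the hurdle, so a manager's marginal incentive is concentrated on outperformance rather than mere participation.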
Both mechanisms aim to achieve the same thing: skin in the game, an alignment of interests between investors, or principals, and managers, or agents. In other words, they are designed so that a manager acting in their own self-interest will, by extension, also act in investors' interests.
The importance of skin in the game extends well beyond ethics. Asset managers are already legally and professionally duty-bound to act in their clients' best interests. Yet even the most well-intentioned agent remains vulnerable to perception biases and distorted incentives. As the saying goes, the road to hell is paved with good intentions.
The gap between outcomes when interests are perfectly aligned and when they are not is known as agency cost. It is omnipresent. Think of a Fortune 500 CEO protected by a golden parachute, whose personal payoff is largely decoupled from the long-term performance of the company they steward. Or a physician prescribing an aggressive treatment out of fear of scrutiny for inaction, with limited regard for the balance between short-term risk and long-term benefit. Or the executives of major banks during the Global Financial Crisis, who kept the bonuses earned in the years leading up to the collapse. Or, more trivially, the way your risk tolerance suddenly changes when driving a rental car covered by comprehensive insurance.
But the absence of skin in the game does more than increase agency costs. By separating decision-making from consequences, it encourages agents to repeatedly take unnecessary risks, since they are asymmetrically exposed to upside and downside outcomes, thereby increasing the fragility of the entire system.
The same asymmetry appears when advice is given without exposure to its consequences. When someone can benefit from being right but is insulated from being wrong, the informational value of their advice degrades. Errors are cheap to make and slow to correct, or worse, explained away in hindsight. Confidence is often mistaken for competence, and sophistication for insight.
In such environments, the cost of bad advice is borne elsewhere, by investors, employees, patients, or citizens, while the advisor moves on, reputation largely intact. Feedback loops break. Learning slows. Fragility accumulates until a shock forces reality back into the system.
The heuristic:
If someone benefits from being right but is largely unharmed by being wrong, discount the signal.
If someone shares in the downside, financially, reputationally, or personally, pay closer attention.
If a system systematically separates decision-making from consequences, assume hidden fragility, regardless of how compelling the narrative.
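The heuristic above can be encoded as a toy weighting rule. This is a deliberately crude sketch (the names, scales, and the multiplicative discount are all hypothetical choices, not a calibrated model): weight an advisor's signal by how much of the downside they actually bear.

```python
from dataclasses import dataclass


@dataclass
class Advice:
    signal: float          # advisor's stated confidence, in [0, 1]
    downside_share: float  # fraction of the downside the advisor bears, in [0, 1]


def discounted_signal(advice: Advice) -> float:
    """Discount the signal by the advisor's exposure to being wrong."""
    return advice.signal * advice.downside_share


# An emphatic advisor with almost no skin in the game carries little weight...
confident_outsider = Advice(signal=0.9, downside_share=0.1)

# ...while a more modest advisor who shares in the losses carries more.
modest_insider = Advice(signal=0.6, downside_share=0.8)

print(discounted_signal(confident_outsider))
print(discounted_signal(modest_insider))
```

The design choice mirrors the first two rules: confidence alone contributes nothing unless multiplied by exposure, so a loud signal with no downside is discounted toward zero.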
In other words, when building systems, always consider whether agents have adequate skin in the game.