
🎧 If you prefer listening, bring your ears over here: Entropy Theory (Audio Edition)

Entropy comes from thermodynamics in physics. It is one of the state parameters of matter, used to describe the degradation of energy, and it broadly measures the state of a material system. With the development of statistical physics and information theory, the essence of entropy was gradually explained: it is a measure of the internal disorder of a system. It has important applications in many fields, such as cybernetics, probability theory, life science, and astrophysics. In information theory, entropy measures the amount of information in an event drawn from a distribution: the higher the entropy, the more information can be transmitted; the lower the entropy, the less. Information entropy is a useful tool for quantifying the information carried by messages and random variables; it also measures the complexity of a system and is widely used in signal processing, system analysis, and other related fields (a small worked example appears at the end of this piece).

Everything tends towards chaos and disorder. This isn’t a quarantine-induced panic thought, it’s an inexorable fact of the universe. The Second Law of Thermodynamics states that all closed systems tend to maximize entropy. The universe tends to get messier and more disordered all the time.

Every industry is on a parallel journey of increasing, accelerating entropy. Like the universe, the market also gets messier and more chaotic all the time. More choice, speed, and flexibility create more chaos, which in turn creates opportunities for companies to capture value by temporarily bringing order to the ever-increasing entropy. The companies and people that create order from the chaos are Entropy Wranglers. The upward-sloping push and pull between entropy and its opposite, negentropy, is responsible for humanity’s forward progress. Each new burst of entropy creates more surface area for innovation.

This idea explains so much - from business theories to industry evolution to company success - that I’m giving it a name: Entropy Theory.

Entropy Theory explains industry evolution as a story of ever-increasing chaos and suggests that the most successful businesses are those that use the latest technology to wrangle that chaos, until entropic forces unleash the next set of opportunities.

Entropy Theory sits on top of and connects so many of the other theories that we talk about here: Aggregation Theory, Disruption Theory, Creative Destruction, Coase’s Theory of the Firm, The Law of Conservation of Attractive Profits, and more.

Entropy Theory explains global progress, industry trends, and company success and failure. Ever-increasing entropy gives work verve. It adds a directional vector to Jim Barksdale’s oft-repeated quote: “There’s only two ways I know to make money: bundling and unbundling.” Bundling and unbundling is Sisyphean - bundle, unbundle, start over again, repeat. We’re not just bundling and unbundling, but unleashing energy, organizing it, and then unleashing new energy on the next thing.
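As a coda, here’s the worked example promised above. In information theory, the entropy of a discrete distribution is H(X) = -Σ p(x) log₂ p(x), measured in bits. The short Python sketch below is my own illustration, not something from the original piece: it shows that a fair coin (maximum disorder over two outcomes) carries a full bit of information per flip, while a heavily biased, more ordered coin carries less.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(X) = -sum(p * log2(p)) of a discrete
    distribution, in bits. Zero-probability outcomes are skipped,
    since they contribute nothing to the sum."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally disordered for two outcomes: 1 bit per flip.
print(shannon_entropy([0.5, 0.5]))    # 1.0

# A biased coin is more ordered, so each flip carries less information.
print(shannon_entropy([0.9, 0.1]))    # ~0.47

# A uniform 8-sided die maximizes entropy over eight outcomes: 3 bits.
print(shannon_entropy([1 / 8] * 8))   # 3.0
```

That’s the quantitative version of the intuition above: more equally likely states means more disorder, and more disorder means more information for an Entropy Wrangler to organize.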
