Machine Intelligence

In practice, mathematicians prove theorems in a social context. It is a socially conditioned body of knowledge and techniques.
— William Thurston (1946-2012)

William Thurston, who received the Fields Medal in 1982 for his work on Haken manifolds, wrote a wonderful 1994 essay, “On Proof and Progress in Mathematics,” describing his vision for how to do mathematics. Thurston advocated a free-form, intuitive style of mathematical discourse with less emphasis on conventional proofs, in part a reaction to the formal style in which he was trained, which emphasized rigorous proofs at the expense of exposition. Thurston was a gifted mathematician; his instincts about what is true in mathematics were often described as remarkable. “He kind of had a truth filter; …his mind rejected false mathematics,” recalled one of Thurston’s students. Thurston believed that mathematicians needed to improve their ability to communicate mathematical ideas rather than just the details of formal proofs. This human understanding was what gave mathematics not only its utility but its beauty.

In truth, the increasing complexity of modern mathematics is already bumping up against the limits of human understanding. A case in point is Andrew Wiles’s 109-page proof of Fermat’s Last Theorem, published in 1995 in the Annals of Mathematics – 358 years after the theorem was conjectured – after corrections and two years of intensive peer review by the small number of mathematicians then capable of fully understanding all the details of what Wiles had done. Apparently Fermat had underestimated the length of this marvelous proof by a rather wide margin.


Cogito, ergo sum.

Practitioners are increasingly relying on computers to test conjectures, visualize mathematical objects, and construct proofs. On August 10, 2014, Thomas Hales announced the completion of a computer-verified proof of the Kepler conjecture, a claim dating back to 1611 that describes the most space-efficient way to pack spheres in a box (commonly known as the “fruit-packing problem”). What is perhaps unusual is that 15 years earlier, Hales had already presented a computer-assisted proof that Kepler’s intuition was correct. Hales’ original 1998 proof involved 40,000 lines of custom computer code and ran to 300 pages. It took 12 reviewers four years to check it for errors. Even when the 121-page proof was finally published in the Annals of Mathematics in 2005, the reviewers could say only that they were “99 percent certain” the proof was correct; they were too exhausted to check it any further.
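For reference, the density the conjecture singles out – achieved by the familiar face-centered cubic arrangement a grocer uses to stack oranges – works out to:

```latex
% Maximal packing density asserted by the Kepler conjecture:
% no arrangement of equal spheres fills space more densely than this.
\[
  \delta_{\mathrm{fcc}} \;=\; \frac{\pi}{3\sqrt{2}} \;\approx\; 0.74048
\]
```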

But not this time. Hales’ new proof was verified by a pair of “formal proof assistants” – the software programs Isabelle and HOL Light, used in the Flyspeck Project. In effect, what Hales had done was to transform his earlier paper into a form that could be completely checked by machine – in a mere 156 hours of runtime. “This technology cuts the mathematical referees out of the verification process,” said Hales. “Their opinion about the correctness of the proof no longer matters.” Score one for the machines – in proof checking.
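To give a flavor of what “completely checked by machine” means, here is a minimal sketch in Lean (a different proof assistant from the two used by Flyspeck, and a toy statement rather than anything from Hales’ proof): once the statement and its proof are written in the assistant’s language, the kernel either accepts or rejects them mechanically.

```lean
-- Toy example of a machine-checked proof (assumes a recent Lean 4 toolchain):
-- the sum of two even natural numbers is even.  The kernel either accepts
-- this proof or rejects it; no human referee's opinion is involved.
theorem even_add_even (m n : Nat)
    (hm : ∃ a, m = 2 * a) (hn : ∃ b, n = 2 * b) :
    ∃ c, m + n = 2 * c := by
  cases hm with
  | intro a ha =>
    cases hn with
    | intro b hb =>
      -- Witness c := a + b; the arithmetic is discharged by `omega`.
      exact ⟨a + b, by omega⟩
```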

Not everyone wants to be a mathematician. Some might just want to play chess, for example – with a computer. Garry Kasparov came up with the idea of “collaborative chess” after he was defeated by Deep Blue in a 1997 rematch. In what is now called “freestyle” chess, humans are allowed unrestricted use of computers during tournaments. The idea is to create the highest level of chess ever played, a synthesis of the best of man and machine. Kasparov played the first public game of human-computer freestyle chess in June 1998 in León, Spain, against Veselin Topalov, a top-rated grandmaster.

Each used a regular computer with off-the-shelf chess software and a historical database of hundreds of thousands of chess games, including some of the best ever played. The man-machine hybrid team is often called a “centaur,” after the mythical half-human, half-horse creature. Topalov fought Kasparov to a 3-3 draw, even though Kasparov was the stronger player and had trounced Topalov 4-0 a month before. The centaur play evened the odds. Kasparov's advantage in calculating tactics had been nullified by the machine.

“Human strategic guidance combined with the tactical acuity of a computer,” Kasparov concluded, “was overwhelming.” Having a computer partner meant never having to worry about making a tactical blunder. The humans could then concentrate on strategic planning instead of spending so much time on calculations. Under these conditions, human creativity was even more paramount. “In chess, as in so many things, what computers are good at is where humans are weak, and vice versa,” observed Kasparov. “Weak human + machine + better process was superior to a strong computer alone and, more remarkably, superior to a strong human + machine + inferior process.”

Mechanical Turk: good old days of chess.

Kasparov vs. Deep Blue: The Brain's Last Stand.

But machines are rapidly encroaching on skills that used to belong to humans alone. For example, as Tyler Cowen recently noted, a turning point in freestyle chess may be approaching, as centaurs are starting to lose ground against the bots. This phenomenon is both broad and deep, and it has profound economic implications. It is very likely that progress in information technology will exceed our expectations and surprise us in unexpected ways. How far off is the day when machines start building hypotheses all by themselves – unassisted by humans – and then proceed to test them automatically? It makes us wonder: what are humans still good for?

I can change light bulbs, too!

References:

  1. Thurston, William (1994, April). On Proof and Progress in Mathematics. Bulletin of the American Mathematical Society, Vol. 30, No. 2, pp. 161-177. Retrieved from: http://www.ams.org/journals/bull/1994-30-02/S0273-0979-1994-00502-6/S0273-0979-1994-00502-6.pdf
  2. Hales, Thomas (2012). Dense Sphere Packings: A Blueprint for Formal Proofs. Cambridge University Press. Retrieved from: https://code.google.com/p/flyspeck/source/browse/trunk/kepler_tex/DenseSpherePackings.pdf
  3. Ellenberg, Jordan (2014, August 29). Will Machines Take Over Mathematics? Wall Street Journal. Retrieved from: http://online.wsj.com/articles/will-machines-put-mathematicians-out-of-work-1409336701
  4. McClain, Dylan Loeb (2005, June 21). In Chess, Masters Again Fight Machines. The New York Times. Retrieved from: http://www.nytimes.com/2005/06/21/arts/21mast.html
  5. Mueller, Tom (2005, December 12). Your Move. The New Yorker, pp. 62-69. Retrieved from: http://www.tommueller.co/s/New-Yorker-Your-Move.pdf
  6. Kasparov, Garry (2007). How Life Imitates Chess. Macmillan.
  7. Freeman, Richard B. (2014, May). Who Owns the Robots Rules the World. IZA World of Labor. Retrieved from: http://www.sole-jole.org/Freeman.pdf
  8. Thompson, Clive (2013). Smarter Than You Think: How Technology is Changing Our Minds for the Better. Penguin Press.
  9. Cowen, Tyler (2013). Average is Over: Powering America Beyond the Age of the Great Stagnation. Dutton Adult.
  10. Brynjolfsson, Erik and McAfee, Andrew (2012). Race against the Machine: How the Digital Revolution is Accelerating Innovation, Driving Productivity, and Irreversibly Transforming Employment and the Economy. Digital Frontier Press.

Cauldron of Changes

Divine inspirations from Cerridwen's cauldron of Awen (Image Credit: Mara Freeman).

“Cerridwen is the shape-shifting Celtic goddess of knowledge, transformation and rebirth. The Awen, cauldron of poetic inspiration, is one of her main symbols. In one part of the Mabinogion, which is the cycle of myths found in Welsh legend, Cerridwen brews up a potion in her magical cauldron to give to her son Afagddu (Morfran). She puts young Gwion in charge of guarding the cauldron, but three drops of the brew fall upon his finger, blessing him with the knowledge held within. Cerridwen pursues Gwion through a cycle of seasons until, in the form of a hen, she swallows Gwion, disguised as an ear of corn. Nine months later, she gives birth to Taliesen, the greatest of all the Welsh poets.”

“Witchcraft to the ignorant, … simple science to the learned” is how Leigh Brackett, a science fiction writer, put it in her 1942 story “The Sorcerer of Rhiannon.” Indeed, what’s brewing in today’s cauldron of changes could easily have made Cerridwen green with envy. After all, we know very well that “any sufficiently advanced technology is indistinguishable from magic,” as Arthur C. Clarke observed in 1962.

Hear the story of how bridges are built... ==>

We live in an age of great technological change. Affordable access to ever greater processing speed and storage capacity is readily available through the cloud. The marginal costs of producing information goods in our networked society are rapidly falling. So here is an interesting question to ponder: what are some of the hardest problems of our times that can be solved with advanced financial technologies that work like magic?

You never change things by fighting the existing reality. To change something, build a new model that makes the existing model obsolete.
— R. Buckminster Fuller (1895-1983)

References:

  1. Heimans, Jeremy and Timms, Henry (2014, December). Understanding New Power. Harvard Business Review. Retrieved from: https://hbr.org/2014/12/understanding-new-power
  2. Piketty, Thomas (2014). Capital in the Twenty-First Century. Belknap Press.
  3. Rifkin, Jeremy (2014). The Zero Marginal Cost Society: The Internet of Things, the Collaborative Commons, and the Eclipse of Capitalism. Palgrave Macmillan. Talk at: https://www.youtube.com/watch?v=5-iDUcETjvo

Stone Soup

Use brings overflowing abundance.
— Dan Bricklin (“Cornucopia of the Commons”, 2000)

A weary traveler came upon a small village, asking for a warm meal and shelter for the night. “We’ve had no food for ourselves,” said the villagers, looking as hungry as they could. “It has been a poor harvest.”

“Well then, seeing that you have nothing, I’ll have to make stone soup,” the traveler said loudly. The villagers stared. A large iron pot was brought to the village square, and it took many buckets of water to fill. A fire was built and the pot was set to boil. With great ceremony, the traveler produced three round, smooth stones and dropped them into the pot. The villagers’ eyes grew round.

“Stones like these generally make good soup.” The traveler smacked his lips in anticipation. “But if there were carrots, it would be much better.” Soon a boy appeared, holding a bunch of carrots retrieved from their hiding place. “A good stone soup should have cabbage,” said the traveler as he sliced the carrots into the pot. “But no use asking for what you don’t have.”

A girl returned with two small cabbages in her hands. “If we only had a bit of onions, this soup would be fit for a prince.” The villagers found a sack of onions, and then some barley, potatoes, and even sides of beef. Soon there was a steaming pot of delicious broth for everyone in the village to share – and all from a few stones. It seemed like magic!

The Stone Soup Paradigm: A Unified Blueprint for How Everything Fits Together (aka the Industry Model Canvas).

A collaborative network has a systematic advantage over markets and firms in matching the best available human capital to the best available information inputs to produce new information goods, according to Yochai Benkler. He posits that the same framework that explains the emergence of property and firms could, in principle, also explain the emergence of information production organized around a collaborative network. In particular, a collaborative network will emerge when the cost of organizing an activity on a peer basis is lower than the cost of using the market or a hierarchical organization. Based on a similar rationale, one could say that as long as the cost of implementing and enforcing property rights in a given resource is higher than the value of increased efficiency in resource utilization due to the property regime, the resource will operate without property rights, i.e., as a commons.

Examples of successful collaborative networks include those that brought the world Linux, Apache, Mozilla, Perl, Wikipedia, Project Gutenberg, and others. Under certain circumstances, a collaborative network can be a more cost-effective institutional form than either markets or hierarchical organizations. In a networked information economy, the characteristics of the resources required for information production, as well as the cost and efficiency of communication among human participants in the productive enterprise, naturally favor the institution of the collaborative network over the alternatives of markets or hierarchical organizations. Specifically, Benkler identified four attributes of the networked information economy as contributing factors: (i) the object of production – information – is a public good and feeds into further production as an input at almost zero social cost; (ii) the physical capital costs of information production have declined dramatically; (iii) human capital – creative talent – is central to production but highly variable; and (iv) the dramatic decline in communication costs permits more efficient coordination of distributed efforts and aggregation of results. Taken together, these factors allow substantially cheaper movement of information inputs to human beings, human talent to resources, and modular contributions to projects, so that widely dispersed contributions can be integrated into finished information goods.

In a sense, we can think of the different modes of organizing production as information processes with different strategies for reducing the uncertainty that agents face in evaluating different courses of action. For example, markets reduce uncertainty regarding allocation decisions by producing a clear set of price signals; firms or hierarchical organizations resolve uncertainty by instituting an ordered set of organizational commands. A collaborative network, in comparison, permits extensive communication and feedback among participants about what needs to be done, who is doing what, and how people value any given outcome. The substantial information gain from a collaborative network thus lies in its capacity to collect and process information about human capital. After all, given the variability of human creativity, an organizational model that does not require the contractual specification of human intellectual effort but allows individuals to self-identify for tasks will be better at gathering and utilizing information about who should be doing what than a system that does require such specification.

In addition, a collaborative network enjoys an allocation gain made possible by the large sets of available resources, agents, and projects. This gain is cumulative, and there are increasing returns to the size of a collaborative network. In contrast, markets and firms rely on property and contracts to secure access to bounded sets of agents and resources in the pursuit of specific projects. The decision costs in a firm or the transaction costs in a market can thus be a limiting factor, unlike peer production organized through a collaborative network, with its completely unbounded availability of all agents to all resources for all projects. In principle, a world in which all agents can act effectively on all resources will be substantially more productive in creating information goods than a world in which firms divide the universe of agents and resources into bounded sets. Furthermore, any redundancy from duplication of efforts will likely lead to an evolutionary model of innovation where alternate solutions present themselves for natural selection.

In general, one can state that production organized around a collaborative network is limited not by the total cost or complexity of a project, but by its modularity, granularity, and the cost of integration. Hence, the key to large-scale production is the assembly of many fine-grained contributions, i.e., how a project can be broken down into a large number of small components that can be independently and asynchronously produced before they are combined into a whole. In fact, a project will likely be more efficient if it can accommodate variously sized contributions, so that people with diverse motivations and different levels of commitment can easily collaborate by making smaller- or larger-grained contributions. Approaches to integration include technology embedded in the collaborative network (e.g., NASA Clickworkers), social norms (e.g., Wikipedia), and market or hierarchical mechanisms (e.g., the Linux kernel community). Often, provisioning of the integration function itself presents yet another level of opportunities for innovative use of the collaborative network in a radically complementary way (e.g., Slashdot, Feynman’s “sum over histories”).
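As a rough illustration (our own toy model, not Benkler’s), the limiting condition can be expressed as a simple check: a project qualifies for peer production when its pieces are small enough for volunteers to take on and the overhead of stitching the pieces back together stays affordable.

```python
# Toy model (illustrative only): peer production is limited by modularity,
# granularity, and integration cost rather than by total project size.

def peer_producible(modules, volunteer_budget,
                    integration_cost_per_module, integration_budget):
    """modules: effort required for each independently producible component.

    Fine granularity: every module fits within the effort a typical volunteer
    is willing to donate.  Cheap integration: the total cost of combining the
    modules into a whole stays within what the project can absorb.
    """
    fine_grained = all(effort <= volunteer_budget for effort in modules)
    cheap_integration = (len(modules) * integration_cost_per_module
                         <= integration_budget)
    return fine_grained and cheap_integration

# A large project split into many small pieces can qualify...
print(peer_producible([2] * 500, volunteer_budget=5,
                      integration_cost_per_module=0.1, integration_budget=100))
# ...while a smaller but monolithic project does not.
print(peer_producible([400], volunteer_budget=5,
                      integration_cost_per_module=0.1, integration_budget=100))
```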

For example, as Dan Bricklin noted in his essay “The Cornucopia of the Commons,” a good architecture of participation is one in which every user of a service automatically helps to build the value of the shared database in small increments. This architectural insight may actually explain the runaway success of open-source Linux, the Internet, and the World Wide Web better than a spirit of volunteerism, observed Tim O’Reilly in his article “The Architecture of Participation.” The relative struggle of GNU HURD after two decades of effort, as compared to the early success enjoyed by the horde of Linux developers, highlights the importance of a good architecture of participation; technically competent contributory efforts alone do not guarantee success. To wit, a good architecture of participation allows users pursuing their own “selfish” interests to build collective value as an automatic byproduct, as if led by an “invisible hand” in the collaborative network that would have made Adam Smith proud. In other words, a desired network effect can be induced in a new collaborative network simply by good design, or alternatively be overlaid on top of an existing collaborative network by the application of consistent effort (e.g., the Amazon Associates program).

The Stone Soup paradigm separates production methodology from ownership concept, two ideas that easily get mixed up in certain debates. For instance, the descriptive statement “given enough eyeballs, all bugs are shallow” resides in the realm of production methodology, whereas “free speech, not free beer” is a normative statement that properly belongs in the realm of ownership. Even when the object of information production by different groups belongs in the same category, e.g., a Unix clone, the mixing of these two ideas by proponents of Open Source vs. Free Software can sometimes be counterproductive to the even greater collaboration that one might otherwise contemplate.

The Stone Soup paradigm also decouples value creation from value capture, as these two stages have shown a high degree of interesting separation in the various “Clothesline Paradox” economies described by Tim O’Reilly. He illustrated an instance of value capture by the web-hosting industry based on value created in open-source software: ISPs can be viewed as essentially offering open-source DNS, Apache, MySQL, and WordPress to their customers and charging a monthly service fee. Similarly, companies like Google, Facebook, and Twitter are known to have captured enormous value created by the pioneers of the World Wide Web, but in a roundabout way (e.g., as advertising revenue and stock market capitalization). So what of the hidden economies of value creation without value capture? Do such opportunities exist and, more practically, where do we find them?

To paraphrase Clayton Christensen’s “Law of Conservation of Attractive Profits”: when something that used to be valuable becomes commoditized, something that is adjacent in the value chain suddenly becomes valuable. As an example, when IBM made PC hardware a commodity, Microsoft figured out a way to make PC software proprietary and valuable. As the Internet and open-source movement commoditized software, companies like Google in turn figured out how to make data and algorithms proprietary and valuable. In short, companies make good profits when they solve the hardest problems of their times. In the case of financial trading, it could be the challenge of dealing with provisioning and allocation of information goods acting as proxies for market inefficiencies, which have trading capacity limitation and are thus semi-rival in nature. We think new trading opportunities will arise when meaning can be extracted from vast corpuses of data, financial or otherwise. There is thus an incredible opportunity for new financial trading business models to emerge in the world of open data access. What do you think?

One question that I wondered about was why the ant trails look so straight and nice. The ants look as if they know what they’re doing, as if they have a good sense of geometry. I put some sugar on the other end of the bathtub… and behind where the ant went I drew a line so I could tell where his trail was. The ant wandered a little bit wrong to get back to the hole, so the line was quite wiggly, unlike a typical ant trail.

When the next ant to find the sugar began to go back, … he followed the first ant’s return trail back, rather than his own incoming trail. Already it was apparent that the second ant’s return was slightly straighter. With successive ants the same “improvement” of the trail by hurriedly and carelessly “following” it occurred. I followed eight or ten ants with my pencil until their trails became a neat line right along the bathtub.
— Richard Feynman (“Surely You’re Joking, Mr. Feynman!”, 1985)
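Feynman’s observation reads like an algorithm, and a hypothetical toy simulation (ours, not Feynman’s) captures the effect: each successive ant roughly follows the previous trail but carelessly cuts corners, and the trail’s wiggliness shrinks with every pass.

```python
# Toy simulation of Feynman's ant-trail observation: each ant roughly follows
# the previous ant's trail but cuts corners, so the trail straightens over time.
import random

def first_wandering_trail(steps=50, wiggle=1.0):
    """The first ant wanders: y drifts randomly around the straight line y = 0."""
    y, trail = 0.0, []
    for _ in range(steps):
        y += random.uniform(-wiggle, wiggle)
        trail.append(y)
    return trail

def follow_and_smooth(trail, carelessness=0.5):
    """The next ant follows the existing trail, but 'carelessly' averages each
    point toward its neighbours, pulling the path toward a straight line."""
    smoothed = trail[:]
    for i in range(1, len(trail) - 1):
        neighbour_avg = (trail[i - 1] + trail[i + 1]) / 2
        smoothed[i] = (1 - carelessness) * trail[i] + carelessness * neighbour_avg
    return smoothed

def wiggliness(trail):
    """Total up-and-down movement: a perfectly straight trail scores near zero."""
    return sum(abs(b - a) for a, b in zip(trail, trail[1:]))

random.seed(42)
trail = first_wandering_trail()
for ant in range(1, 11):
    trail = follow_and_smooth(trail)
    print(f"after ant {ant:2d}: wiggliness = {wiggliness(trail):.2f}")
```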

References:

  1. Raymond, Eric S. (1997, May). The Cathedral and the Bazaar. Retrieved from: http://www.catb.org/~esr/writings/cathedral-bazaar/cathedral-bazaar/ or http://www.unterstein.net/su/docs/CathBaz.pdf
  2. Raymond, Eric S. (1999, June). The Magic Cauldron. Retrieved from: http://www.catb.org/~esr/writings/magic-cauldron/magic-cauldron.html
  3. Benkler, Yochai (2002, December). Coase’s Penguin, or, Linux and The Nature of the Firm. Yale Law Journal, Vol. 112, No. 3, pp. 369-446. Retrieved from: http://www.yalelawjournal.org/pdf/354_t5aih5i1.pdf
  4. Hardin, Garrett (1968, December). The Tragedy of the Commons. Science, Vol. 162, No. 3859, pp. 1243-1248. Retrieved from: http://www.sciencemag.org/content/162/3859/1243.full
  5. Bricklin, Dan (2000, August). The Cornucopia of the Commons. Retrieved from: http://www.bricklin.com/cornucopia.htm and http://www.bricklin.com/speeches/c-of-c/
  6. O’Reilly, Tim (2004, June). The Architecture of Participation. Retrieved from: http://archive.oreilly.com/pub/a/oreilly/tim/articles/architecture_of_participation.html
  7. Baer, Steve (1975). The Clothesline Paradox. The CoEvolution Quarterly, Winter 1975. Retrieved from: http://www.wholeearth.com/issue/2008/article/358/the.clothesline.paradox
  8. O’Reilly, Tim (2012, July 18). The Clothesline Paradox: How Sharing Economies Create Value. OSCON 2012. Retrieved from: http://www.slideshare.net/timoreilly/the-clothesline-paradox-and-the-sharing-economy-pdf-with-notes-13685423 and http://edge.org/conversation/-39the-clothesline-paradox-39-

Collective Invention

Organizations which design systems are constrained to produce designs which are copies of the communication structures of these organizations.
— Melvin Conway (“How Do Committees Invent?”, 1968)

According to what is now celebrated as Conway’s Law, “if you have four groups working on a compiler, you’ll get a 4-pass compiler.” In other words, process becomes product. In an illustrative example, Melvin Conway describes a contract research organization with eight people who were to produce a COBOL and an ALGOL compiler. After some initial estimates of difficulty and time, five people were assigned to the COBOL job and three to the ALGOL job. Not surprisingly, the resulting COBOL compiler ran in five phases, and the ALGOL compiler ran in three.

In a similar fashion, the communication structure of an organization is shaped by its administrative structure and thus mirrors it. An example in minicomputer hardware design is offered by Tracy Kidder. Excerpted from his classic book, The Soul of a New Machine, is the following narrative: “Looking into the VAX, West had imagined he saw a diagram of DEC’s corporate organization. He felt that VAX was too complicated. He did not like, for instance, the system by which various parts of the machine communicated with each other; for his taste, there was too much protocol involved. He decided that VAX embodied flaws in DEC’s corporate organization. The machine expressed that phenomenally successful company’s cautious, bureaucratic style.” In other words, organization design becomes product design.

Conway advised that a design effort should be organized according to the need for communication. We know from experience that the first design is almost never the best possible. Since the need for communication changes as the design evolves over time, it is important to keep organizations lean and flexible. Conway therefore suggested that one must not naively assume that adding manpower adds productivity. Instead, with great prescience in 1968, he advised answering basic questions about the value of resources and the techniques of communication as a first step toward a technology (i.e., a process innovation) of building systems with confidence.

The process of developing a new technology through open discussion is called collective invention. According to Peter Meyer, “it is a process in which improvements or experimental findings about a production process or tool are regularly shared.” Meyer documented five episodes of collective invention from historical experience: (1) the steam engine (1811-1904); (2) the iron blast furnace in Britain’s Cleveland district (1850s-1870s); (3) early steel production in the U.S. (1865-1880); (4) microcomputer clubs (1975-1985); and (5) Linux (1991-present).

In each case, there was no single user and no single inventor. Instead, one central figure played a key role in coordinating the success of collective invention: (1) Joel Lean, who edited Lean’s Engine Reporter; (2) Isaac Lowthian Bell and others, who published technical information about blast furnaces in operation; (3) Alexander Holley, who consulted widely and published technical reports for his steel-industry clients; (4) Lee Felsenstein, who moderated the Homebrew Computer Club; and (5) Linus Torvalds, who started Linux and guided its development. They offered the valuable service of information brokerage from the center of a star-shaped social network of experimenters, thus reducing the cost of searching for innovations in the network. This enabled innovations to accumulate over time through experimentation and sharing across the collective invention network.
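A back-of-the-envelope calculation (ours, not Meyer’s) shows why such a hub matters: with a central broker, every experimenter needs only one channel to stay within two hops of everyone else, whereas direct pairwise contact requires a number of channels that grows quadratically with the size of the group.

```python
# Toy comparison: information channels needed so every experimenter can reach
# every other, with and without a central information broker (star network).

def channels_fully_connected(n):
    """Every pair maintains its own channel: n * (n - 1) / 2 links."""
    return n * (n - 1) // 2

def channels_star(n):
    """Each experimenter only talks to the hub (editor, consultant, maintainer):
    n links, and any finding reaches anyone else in at most two hops."""
    return n

for n in (10, 100, 1000):
    print(f"{n:5d} experimenters: {channels_fully_connected(n):7d} pairwise "
          f"channels vs {channels_star(n):5d} via a central broker")
```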

There are many explanations as to why individuals or firms would want to participate in sharing. A state of technological uncertainty and opportunity contributes to most of them. Sharing and experimentation are in fact complementary activities; it would be inefficient to do one without the other within the collective search process. After all, without a venue in which to share findings or learn from each other, some experiments might not have been carried out in the first place. Among the many diverse motivations that Robert Allen has identified as drivers of information sharing, two bear closer examination: (i) there are advantages in establishing engineering standards by giving away designs or software; and (ii) while firms compete locally against other firms, collectively they compete against other regions, and thus have an incentive to work together to make local production as efficient as possible and remote regions irrelevant. Taken as a whole and reinterpreted in the modern context, one might say that agreeing on common engineering standards is a first step towards building a shared knowledge infrastructure that enables a proximate cluster of emerging firms to build a new and more efficient value network, in the process displacing the incumbents entrenched in the older value network. Economists would recognize this phenomenon and characterize it as follows: experimentation creates productive capital through sharing.

Individuals or firms have diverse resources, opportunities, insights, abilities, interests, skills, and agendas. Each one may have something unique to contribute to the collective search process. Those who want to make progress in the collective search process experiment more and find that it is optimal to share their findings. And those who participate in sharing find it optimal to experiment more. This summarizes the underlying self-reinforcing dynamics that drive a collective invention network. In contrast, a hierarchically organized system suffers from what is commonly known as the “Peter Principle,” where the selection of a candidate for a position is based on the candidate’s performance in their current role rather than on abilities relevant to the intended role, and as a result “managers rise to the level of their incompetence” throughout the system. Hierarchy impedes performance.

The emergence of the Internet and the Web offers an excellent example of large-scale collective invention in action, where the decentralized nature of its communication structure initially took shape as a “network of networks”, and subsequently evolved into what is now a “network of platforms.” This occurred over the past few decades within the collective invention network nurtured by DARPA, NSF, and eventually the open Internet and the Web, and through exposure to economic experimentation and community feedback from usage.

The structure of the Internet itself was unanticipated; its development started at a time when “packet switching” and “network of networks” were budding theoretical concepts, and nobody knew where implementation would lead. Not only was the early Internet a radical technological departure from existing practice; the geographical dispersal of its diverse research participants was another major departure from DARPA’s typical centralized program administration. Even the governance structure of the Internet was unprecedented in its openness and transparency, led by a surprising set of technological leaders, many of whom were graduate students at the time. Was it any coincidence, then, that the Internet became the underlying structure for decentralized exploration, which created massive market value over time by aggregating innovations from its diverse participants as it continued to evolve?

The accumulated knowledge enabled the further creation of value in myriad applications, e.g., the new, ongoing “Internet of Things,” that continue to shape the world around us. Perhaps in the not-too-distant future we might even see a “networked intelligence” arising through this process of collective invention. “It would take off on its own, and re-design itself at an ever-increasing rate,” is one such scenario anticipated by Stephen Hawking. In his recent remarks about the future of artificial intelligence, Hawking also predicted that “humans, who are limited by slow biological evolution, couldn’t compete, and would be superseded.”

Original Plan: The Mermaid (by John Reinhard Weguelin).

Unanticipated Outcome: The Collective Invention (by Rene Magritte).

Interestingly, collective invention is most valuable when there is uncertainty in a large search space along many dimensions, as there is the possibility of truly major innovations on the horizon. Collective invention may simply be an essential phase of technology improvement at its earliest stage. Early automobiles and airplanes seem to have developed along a collective invention path before industries started to form. When searching for market inefficiencies in the financial universe, we often wonder if sharing across the network – beyond academic publishing – may be a way of searching more efficiently given the accessibility of present cloud-computing infrastructure, and how that might impact the incumbents of today’s financial markets. What do you think?

Everything we see hides another thing, we always want to see what is hidden by what we see.
— Rene Magritte (1898-1967)

References:

  1. Conway, Melvin (1968). How do Committees Invent? Datamation (April, 1968). Retrieved from: http://www.melconway.com/Home/Committees_Paper.html and http://www.melconway.com/Home/pdf/committees.pdf
  2. Kidder, Tracy (1981). The Soul of a New Machine. Little Brown and Company.
  3. Allen, Robert C. (1983). Collective Invention. Journal of Economic Behavior and Organization, Vol. 4, pp. 1-24. Retrieved from: http://www.nuffield.ox.ac.uk/users/allen/collinvent.pdf
  4. Cowan, Robin and Jonard, Nicolas (2000). The Dynamics of Collective Invention. Journal of Economic Behavior and Organization, Vol. 52, No. 4, pp. 513-532. Retrieved from: http://arno.unimaas.nl/show.cgi?fid=279
  5. Meyer, Peter B. (2003). Episodes of Collective Invention. U.S. Bureau of Labor Statistics. Retrieved from: http://www.bls.gov/ore/pdf/ec030050.pdf
  6. Greenstein, Shane (2009). Nurturing the Accumulation of Innovations: Lessons from the Internet. In: Accelerating Energy Innovation: Insights from Multiple Sectors (2011), Rebecca M. Henderson and Richard G. Newell, editors (pp. 189-223). Retrieved from: http://www.nber.org/chapters/c11755.pdf
  7. Raymond, Eric S. (1996). A Brief History of Hackerdom. Retrieved from: http://www.catb.org/~esr/writings/cathedral-bazaar/hacker-history/
  8. Raymond, Eric S. (1999). Revenge of the Hackers. Retrieved from: http://www.catb.org/~esr/writings/cathedral-bazaar/hacker-revenge/
  9. Kessler, Andy (2005). How We Got Here: A Slightly Irreverent History of Technology and Markets. Harper Collins.
  10. Isaacson, Walter (2014). The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution. Simon and Schuster.