The Second Industrial Revolution, 1870–1914
The mutual feedbacks of science and technology

Parts of this summary are based on the book "The Lever of Riches" (1990) as well as on a number of subsequent essays by Professor Joel Mokyr (Economics and History, Northwestern University, USA).

The second Industrial Revolution is usually dated between 1870 and 1914, although a number of its characteristic events can be dated to the 1850s. Technology is knowledge. Modern economic growth, Simon Kuznets (1965) argued more than thirty years ago, depends on the growth of useful knowledge. The first Industrial Revolution, and most technological developments preceding it, had little or no scientific base. It created a chemical industry with no chemistry, an iron industry without metallurgy, and power machinery without thermodynamics. Engineering, medical technology, and agriculture until 1850 were pragmatic bodies of applied knowledge in which things were known to work, but it was rarely understood why they worked. This meant that people often did not know which things did not work: enormous amounts of energy and ingenuity were wasted on alchemy, perpetual motion machines, the philosopher's stone, and fountains of youth. Only when science demonstrated that such pipe dreams were impossible did research move in a different direction. Moreover, even when things were known to work, they tended to be inflexible and slow to improve. It was often difficult to remove bugs, improve quality, and make products and processes more user-friendly without a more profound understanding of the natural processes involved. It was in this regard that the inventions after 1870 differed from those that preceded them.

The period 1859–1873 has been characterized as one of the most fruitful and dense in innovations in history (Mowery and Rosenberg, 1989, p. 22). From the point of view of useful knowledge that mapped into new technology, this view is certainly correct. The second Industrial Revolution accelerated the mutual feedbacks between these two forms of knowledge, that is, between science (very broadly defined) and technology. It extended the rather limited and localized successes of the first Industrial Revolution to a much broader range of activities and products. Living standards and the purchasing power of money increased rapidly, as the new technologies reached like never before into the daily lives of the middle and working classes.

The other aspect of the second Industrial Revolution worth stressing is the changing nature of the organization of production. It witnessed the growth in some industries of huge economies of scale, and some vast concerns emerged, far larger than anything seen before. This change occurred because of ever more important economies of scale in manufacturing. Some of these were purely physical: in chemicals, for instance, the cost of constructing containers and cylinders is proportional to their surface area, while capacity is proportional to their volume. Since the former depends on the square of the diameter and the latter on the cube, costs per unit of output decline with output.
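To make the scaling argument concrete, the short sketch below (Python, with a purely hypothetical cylindrical vessel and an assumed height-to-diameter ratio, neither taken from the text) doubles the diameter and compares surface area, a rough proxy for construction cost, with volume, a proxy for capacity.

```python
import math

def cylinder_scaling(diameter, height_to_diameter=1.0):
    """Surface area and volume of a closed cylinder, used here as rough
    proxies for construction cost and capacity respectively."""
    r = diameter / 2
    h = height_to_diameter * diameter
    area = 2 * math.pi * r * (r + h)   # grows with the square of the diameter
    volume = math.pi * r ** 2 * h      # grows with the cube of the diameter
    return area, volume

small = cylinder_scaling(1.0)  # hypothetical one-unit vessel
large = cylinder_scaling(2.0)  # same shape, diameter doubled

print("surface area grows by a factor of", large[0] / small[0])   # ~4
print("volume grows by a factor of", large[1] / small[1])         # ~8
print("cost per unit of capacity falls to",
      (large[0] / large[1]) / (small[0] / small[1]))              # ~0.5
```

Doubling the diameter roughly quadruples the construction cost but multiplies capacity by eight, so cost per unit of capacity is halved; this is the physical source of the economies of scale described above.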
With the rise of the chemical industry, oil refining, and other industries using containers, as well as engines of various types, size began to matter more and more. Some economies of scale were organizational, such as mass production by interchangeable parts technology. Others were more in the nature of marketing advantages, or even the ruthless pursuit of monopolies. Yet it should be stressed that even with the rise of giant corporations such as Carnegie Steel, Dupont, Ford Motors, and General Electric in the U.S. and their equivalents in Europe, these firms employed but a small fraction of the labor force, and the typical firm in the industrialized West by 1914 remained relatively small: a niche player, often specialized yet flexible, and catering more often than not to a localized or specific section of the market.

Railroad and telegraph networks, and in large cities gas, water supply, and sewage systems, were already in existence. These systems expanded enormously after 1870, and a number of new ones were added, electrical power and the telephone being the most important. The second Industrial Revolution turned the large technological system from an exception into a commonplace. Systems required a great deal of coordination that free markets did not always find easy to supply, and hence governments or other leading institutions ended up stepping in to determine railroad gauges, electricity voltages, the layout of typewriter keyboards, rules of the road, and other forms of standardization.

Steel

By 1850, the age of iron had become fully established. But for many uses, wrought iron was inferior to steel. The wear and tear on wrought-iron machine parts and rails made them expensive in use, and for many uses, especially in machines and construction, wrought iron was insufficiently tenacious and elastic. The problem was not to make steel; the problem was to make cheap steel. As is well known, this problem was definitively solved by Henry Bessemer in 1856. The growth of the steel industry following his invention has come in the popular mind to symbolize the technology of the second Industrial Revolution, and while steel was of course of great significance, such emphases tend to blur the advances in many other industries. Cheap steel soon found many uses beyond its original spring and dagger demand; by 1880 buildings, ships, and railroad tracks were increasingly made of steel. It became the fundamental material from which machines, weapons, and implements were made, as well as the tools that made them. Steel's spectacular success after 1860 should not obscure important advances in other stages of the iron industry.

Chemicals

In chemistry, the Germans took the lead. Although Britain was still capable of achieving the occasional lucky masterstroke that opened a new area, the patient, systematic search for solutions by people with formal scientific and technical training suited the German traditions better. In 1840 Justus von Liebig, a chemistry professor at Giessen, published his Organic Chemistry in Its Applications to Agriculture and Physiology, which explained the importance of fertilizers and advocated the application of chemicals in agriculture. Other famed German chemists, such as Friedrich Wöhler, Robert Bunsen, Leopold Gmelin, August von Hofmann, and Friedrich Kekulé von Stradonitz, jointly created modern organic chemistry, without which the chemical industry of the second half of the nineteenth century would not have been possible. It was one of the most prominent examples of how formal scientific knowledge came to affect production techniques. German chemists began the search for artificial dyes, and almost all subsequent successes in this area were scored by them.
German chemists succeeded in developing indigotin (synthetic indigo, perfected in 1897) and an improved process for making sulphuric acid (1875). Soda making
had been revolutionized by the Belgian Ernest Solvay in the 1860s. In explosives, dynamite, invented by Alfred Nobel, was used in the construction of tunnels, roads, oil wells, and quarries. In the production of fertilizer, developments began to accelerate in the 1820s. Some of them were the result of resource discoveries, like Peruvian guano, which was imported in large quantities to fertilize the fields of England. Others were by-products of industrial processes. Because the physical and chemical processes in agriculture are far more complex than in manufacturing, better theoretical knowledge was required, and serendipity eventually ran into diminishing returns. In Germany, especially Saxony, state-supported institutions subsidized agricultural research, and the results eventually led to vastly increased yields. Nitrogen fertilizers were produced from the caliche (natural sodium nitrate) mined in Chile. The famous Haber process to make ammonia, developed by Fritz Haber and the BASF chemists Carl Bosch and Alwin Mittasch, together with the discovery around 1908 of how to convert ammonia into nitric acid, made it possible for Germany to continue producing nitrates for fertilizers and explosives during World War I after its supplies of Chilean nitrates were cut off.

Chemistry also began its road toward the supply of new artificial materials. Charles Goodyear, the American tinkerer, invented in 1839 the vulcanization process that made widespread industrial use of rubber possible. Another American, John Wesley Hyatt, succeeded in creating the first synthetic plastic in 1869, which he called celluloid. Its economic importance was initially modest because of its inflammability, and it was primarily used for combs, knife handles, piano keys, and baby rattles, but it was a harbinger of things to come. The breakthrough in synthetic materials came only in 1907, when the Belgian-born American inventor Leo Baekeland discovered Bakelite. The reason for the long delay in the successful development of Bakelite was simply that neither chemical theory nor practice could cope with such a substance before (Bijker, 1987, p. 169). Yet Baekeland did not fully understand his own process, as the macromolecular chemical theories that explain synthetic materials were not developed until the 1920s.

Perhaps the classic instance of a "free lunch," in which large gains in well-being were achieved at low cost, was in the fine chemical industry, which after 1870 began to rationalize the hitherto chaotic business of pharmaceutics. The use of anesthetics became widespread after Queen Victoria used chloroform when she gave birth to Prince Leopold in 1853. Disinfectants and antiseptics, particularly phenol and bromines, were produced in large quantities after Joseph Lister's rediscovery of the role of microbes in the infection of wounds. One of the most remarkable inventions was the acetyl compound of salicylic acid. The medicinal properties of willow bark had been known since antiquity, and in 1897 a Bayer chemist by the name of Felix Hoffmann, on a hunch, took the old compound off the shelf for his father, who could not tolerate the side effects of sodium salicylate. It immediately became clear that the acetyl compound of salicylic acid, later known as aspirin, was a true wonder drug: effective, without serious negative side effects, and cheap to produce. Within a few years Bayer sent samples to 30,000 German doctors, and the new drug was soon used universally.

Electricity
Like chemistry, electricity was a field in which totally new knowledge was applied to solve economic problems. The economic potential of electricity had been suspected since the beginning of the nineteenth century. Humphry Davy had demonstrated its lighting capabilities as early as 1808. Relying on the scientific discoveries of scientists such as the Dane Hans Oersted and the American Joseph Henry, Michael Faraday invented the electric motor in 1821 and the dynamo in 1831. The first effective
application of electricity was not in power transmission but in communication. The telegraph was associated with a string of inventors, the most important of whom were S. T. von Soemmering, a German, who demonstrated its capabilities in 1810; William Cooke, an Englishman, who patented a five-needle system to transmit messages (1837); and Samuel Morse, an American, who invented the code named after him that made the single-needle system feasible. The first successful submarine cable was laid by Thomas Crampton's company between Dover and Calais in 1851, and became a technological triumph that lasted thirty-seven years. The telegraph, together with the railroads, was an early example of a technological system, a combination of separate inventions that had to be molded together. Just as the strength of a chain can never be greater than that of its weakest link, the efficiency and reliability of a system can never be greater than that of its weakest component. The idea of utilizing electrical current to affect a magnetized needle and so transmit information at a speed much faster than anything previously possible was a classic macroinvention. Long-distance telegraphy, however, required many subsequent microinventions. Submarine cables were found to be a difficult technology to master. Signals were often weak and slow, and the messages distorted. Worse, cables were at first subject to intolerable wear and tear. Of the 17,700 kilometers of cable laid before 1861, only 4,800 kilometers were operational in that year; the rest had been lost. The transatlantic cable, through which Queen Victoria and President Buchanan famously exchanged messages in August 1858, ceased to work three months later. The techniques of insulating and armoring the cables properly had to be perfected, and the problem of capacitance (increasing distortion on long-distance cables) had to be overcome. Before the telegraph could become truly functional, the physics of the transmission of electric impulses had to be understood. Physicists, above all William Thomson (later Lord Kelvin), made fundamental contributions to the technology. Thomson invented a special galvanometer and a technique of sending short reverse pulses immediately following the main pulse to sharpen the signal (Headrick, 1989, pp. 215–218). In this close collaboration between science and technology, too, telegraphy was clearly a second-generation technology.

The use of electricity as a prime means of transmitting and using energy was technically even more difficult than the development of the telegraph. Before it could be made to work, an efficient way had to be devised to generate electric power using other sources of energy; devices had to be created to transform electricity back into kinetic power, light, or heat at the receiving end; and a way of transmitting current over large distances had to be developed. In addition, electricity came in two forms, alternating and direct current, and a decision had to be made as to which of the two was to dominate. Electric generators were crucial. Although Davy had shown as early as 1808 how electricity could drive an arc lamp, apart from lighthouses it was not widely used in lighting. Following the discovery in the mid-1860s of the principle of the self-excited generator by C. F. Varley and Werner von Siemens, the Belgian Z. T. Gramme built in 1870 a ring dynamo, which produced a steady continuous current without overheating. Gramme's machine substantially reduced the cost of alternating current.
The vacuum problem was solved in 1865, when Hermann Sprengel designed a much-improved vacuum pump; only then could the incandescent lamp eventually be made practical. In 1876 a Russian inventor, Paul N. Jablochkoff, invented an improved arc lamp (or "candle"), which used alternating current. Subsequently factories, streets, railway stations, and similar public places began to replace gaslight with arc light. In 1878, Charles F. Brush of Ohio invented a high-tension direct-current lamp, which by the mid-1880s had come to dominate arc lighting. Inventors such as Thomas Edison and George Westinghouse realized that
electricity was a technological network, a system of closely interconnected, compatible inventions. The use of electricity expanded quickly in the 1870s. A miniature electric railway was displayed at the Berlin exhibition in 1879; electric blankets and hot plates appeared at the industrial exhibition of Vienna in 1883; and electric streetcars were running in Frankfurt and Glasgow by 1884. The early 1880s saw the invention of the modern lightbulb by Joseph Swan in England and Thomas A. Edison in the United States. An electric polyphase motor using alternating current was built by the Croatian-born American Nikola Tesla in 1889 and improved subsequently by Westinghouse. Led by Westinghouse and Tesla, the forces for alternating current defeated those advocating direct current, led by Edison. By 1890, the main technical problems had been solved; electricity had been tamed. What followed was a string of microinventions that increased reliability and durability and reduced cost. In 1900, an incandescent lightbulb cost one-fifth of what it had twenty years earlier and was twice as efficient.

Transportation
By 1870 the application of steam power to transportation was hardly a novelty; railroads and steamships were, properly speaking, products of the first Industrial Revolution (though the screw propeller and the marine steam engine were both perfected in the 1850s). Railroads became faster, safer, and more comfortable during the second Industrial Revolution, but these gains resulted from microinventions rather than from big breakthroughs. The only truly discontinuous changes to railroads in this period were the application of new power sources: the Diesel engine, invented in 1897 by Rudolf Diesel, and the use of electric locomotives. Rudolf Diesel was a good specimen of the new inventor, an engineer trained in science, a "rational" inventor in search of efficiency above all else. Although some electric railroads were in operation by 1914, wholesale electrification and the conversion to Diesel occurred much later.

Changes in ships were more drastic. Despite the rather amazing improvements in sailing ships that produced the famous clippers, wind power was destined for niches in sport and leisure boating. First, after 1870 ships were increasingly built of steel. This made it possible to build larger ships. Since the maximum speed of a ship is proportional to the square root of its waterline length, and iron and steel ships could be made much larger than wooden ships, ships grew bigger, more powerful, and faster at unprecedented rates. The invention of the steam turbine by Gustav de Laval and Charles Parsons in 1884, and its subsequent improvement, led to a revolution at sea: the rotary motion of the turbine could develop enormous speed (the prototype that Parsons built in 1884 ran at 18,000 rpm and had to be geared down), and it was far more efficient, faster, cleaner, and quieter than the old reciprocating marine steam engines; its adoption after 1900, when most of the bugs had been removed, was led by naval ships. While the typical ship of 1815 was not much different from the typical ship of 1650, by 1910 both merchant ships and men-of-war had little in common with their steam-operated predecessors of half a century earlier. The result was a sharp decline in transportation costs. In the first half of the nineteenth century freight rates fell by 0.88 percent a year, which reflected mostly improvements in sailing ships. After 1850 the decline accelerated to 1.5 percent a year, a rate that is all the more impressive in view of persistently rising labor costs. Despite some organizational improvements, there can be little doubt that the decline in transatlantic freight rates was the result of technological improvements (Harley, 1988).
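As a rough check on the two quantitative claims just made, the short Python sketch below compounds the stated annual freight-rate declines and illustrates the square-root scaling of maximum speed with waterline length. The fifty-year horizon and the example of a doubled waterline are my own illustrative assumptions, not figures from the text.

```python
# 1. Compounding the stated annual declines in freight rates.
years = 50  # assumed horizon, roughly half a century
early = (1 - 0.0088) ** years  # ~0.88 percent per year before 1850
late = (1 - 0.015) ** years    # ~1.5 percent per year after 1850
print(f"After {years} years, freight rates stand at {early:.0%} vs {late:.0%} "
      "of their starting level")
# Roughly 64% versus 47%: the faster decline compounds into a much larger fall.

# 2. Maximum (hull) speed scaling with the square root of waterline length.
def relative_hull_speed(waterline_ratio):
    """If maximum speed is proportional to the square root of waterline
    length, this returns the speed ratio implied by a length ratio."""
    return waterline_ratio ** 0.5

print(f"Doubling the waterline raises attainable speed by a factor of "
      f"{relative_hull_speed(2.0):.2f}")  # ~1.41
```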
On some occasions a technical solution looked simple, and it may seem surprising at first sight that it took so long before producers got it right. Throughout the entire nineteenth century, mechanics experimented with a device that would allow individuals to propel themselves rapidly while seated. A variety of velocipedes and "penny-farthing" types of bicycles were experimented with, largely for recreational purposes. Yet it was not until John K. Starley, a Coventry mechanic, built the Rover safety cycle in 1885 that the balanced position and easy steering of today's bicycles became feasible. The case of the bicycle illustrates that neither purely technical factors nor purely economic factors, nor even a combination of the two, can fully account for technological change. The bicycle was a novelty in the deepest sense of the word; it did not replace an existing technique with a similar, more efficient one. The people who adopted the bicycle in the 1890s had previously walked or used public transportation. The bicycle became a means of mass transportation with incalculable effects on urban residential patterns, especially after the invention of the pneumatic tire in 1888 by a Belfast veterinary surgeon, J. B. Dunlop, who was unhappy with the comfort of his ten-year-old son's tricycle ride. After a few years of further improvements, the design of the bicycle stabilized, and few significant improvements were introduced after 1900.

The classic case of a novel combination of known techniques, laced with a number of important original contributions, was the development of the automobile. The internal combustion engine was first suggested by Huygens in the seventeenth century. In 1824, Sadi Carnot had described the limitations of the steam engine as an energy source and pointed to heated air as the best potential means to generate motive power. Despite prolonged research efforts, it turned out to be difficult to employ steam power for carriages. During the nineteenth century dozens of inventors, realizing the advantages of an internal combustion engine over steam, tried their hand at the problem. A working model of a gas engine was first constructed by the Belgian Jean Etienne Lenoir in 1859 and perfected in 1876, when a German traveling salesman, Nicolaus August Otto, built a gas engine using the four-stroke cycle. Otto worked on the problem from 1860 on, after he read about Lenoir's machine in a newspaper. He was an inspired amateur, without formal technical training. Otto initially saw the four-stroke engine as a makeshift solution to the problem of achieving high enough compression, and only later was his four-stroke principle, which is still the heart of most automobile engines, acclaimed as a brilliant breakthrough (Bryant, 1967, pp. 650–657). The "silent Otto," as it became known (to distinguish it from a noisier and less successful earlier version), was a huge financial success. The advantage of the gas engine was not its silence, but that, unlike the steam engine, it could be turned on and off at short notice.

Otto's gas engine was soon to adopt a new fuel. Somewhat earlier, in the 1860s, the process of refining crude oil by a method called cracking was developed. At that time the main interest was in lubricants, paraffins, and heavy oils, with petrol or gasoline considered a dangerously inflammable byproduct. In 1885 two Germans, Gottlieb Daimler and Karl Benz, succeeded in building an Otto-type, four-stroke, gasoline-burning engine, employing a primitive surface carburetor to mix the fuel with air. Benz's engine used an electrical induction coil powered by an accumulator, foreshadowing the modern spark plug.
The Dunlop pneumatic tire, first made for bicycles, soon found application to the automobile. In 1893 Wilhelm Maybach, one of Daimler's employees, invented the modern float-feed carburetor. Other technical improvements added around 1900 included the radiator, the differential, the crank starter, the steering wheel, and pedal-brake control. The effect of the automobile and the bicycle on technology was similar to that of the mechanical clock five centuries earlier: mechanics involved in making and repairing the devices acquired the skills and the ideas to extend the principles involved. Yet these techniques also combined with late nineteenth-century
ideas of interchangeable parts and mass production, and by 1914 Henry Ford was selling almost a quarter of a million Model T automobiles per year.

The conquest of the air is an excellent example of how formal knowledge about nature and pragmatic experience combined to produce one of the most dramatic macroinventions of all time, namely the Wright brothers' celebrated heavier-than-air flight at Kitty Hawk in 1903. The Wright brothers had access to, and used, the knowledge on aerodynamics that had been accumulating since the pathbreaking work of George Cayley early in the nineteenth century, in part through the advice of Octave Chanute, one of the leading aeronautical engineers of his time. At the same time they were skilled mechanics who had earned their spurs as bicycle repairmen in Dayton. The development of the airplane is in many ways paradigmatic of the new mode of technological progress that emerged with the second Industrial Revolution: formal and informal knowledge combining to produce a discontinuous event, followed by decades of microinventions that eventually produced a major industry, with further technological progress stagnating once most of the obvious improvements had been exhausted.
Production Engineering

From a purely economic point of view, it could be argued that the most important invention was not another chemical dye, a better engine, or even electricity, since, with the exception of steel, most of the inventions described above had serviceable, albeit less efficient and more expensive, substitutes. There is one innovation, however, for which "social savings" calculations from the vantage point of the twentieth century are certain to yield large gains. The so-called American System of manufacturing assembled complex products from mass-produced individual components. Modern manufacturing would be unthinkable without interchangeable parts. The term American is somewhat misleading: the idea that interchangeability had enormous advantages in production and maintenance had occurred to Europeans in the eighteenth century. Moreover, what was regarded in the 1850s as the American System was not exactly interchangeability, but the application of high-quality, specialized machine tools to a sequence of operations, particularly in woodworking, as well as higher operating speeds and sequential movements of materials. As Ferguson (1981) has pointed out, mechanized mass production and interchangeable parts were not identical, and the former did not imply the latter. Interchangeability was not an "invention." It was eventually to become a vastly superior mode of producing goods and services, facilitated by the work of previous inventors, especially the makers of accurate machine tools and cheap steel. To be truly interchangeable, the parts had to be identical, requiring high levels of accuracy and quality control in their manufacture. It is now realized that full interchangeability was more difficult to achieve than had previously been believed. The use of interchangeable parts grew slowly after 1850, and recent research has shown that the American System was adopted far more haltingly and hesitantly than had hitherto been thought. Many American firms, such as McCormick, Singer, and Colt, owed their success to factors other than complete interchangeability (Hounshell, 1984). At first, goods made with interchangeable parts were more expensive and were adopted mostly by government armories, which considered quality more important than price (Howard, 1978; M. R. Smith, 1977). Only after the Civil War did U.S. manufacturing gradually adopt mass production methods, followed by Europe. First in firearms, then in clocks, pumps, locks, mechanical reapers, typewriters, sewing machines, and eventually engines and bicycles, interchangeable parts technology proved
superior and replaced the skilled artisan working with chisel and file. Although in the long run true interchangeability was inexorable, its diffusion in Europe was slowed down by two factors: the demand for distinctive high-quality goods, which long kept consumers faithful to skilled artisans, and the resistance of labor, which realized that mass production would make its skills obsolete.

Of related importance was the development of continuous-flow production, in which workers remained stationary while the tasks were moved to them. In this way, the employer could control the speed at which operations were performed and minimize the time wasted by workers between operations. In the last third of the nineteenth century continuous-flow processes were adopted on a large scale, especially in the great stockyards of Chicago and Cincinnati. Henry Ford's automobile assembly plant combined the concept of interchangeable parts with that of continuous-flow processes, and it allowed him to mass-produce a complex product and yet keep its price low enough that it could be sold as a people's vehicle. As Giedion (1948, p. 117) points out, Ford's great success was rooted in the fact that, unlike Oliver Evans, he came at the end of a long development of interchangeability and continuous-flow processes. Success depends not only on the ingenuity and energy of the inventor, but also on the willingness of contemporaries to accept the novelty.

Agriculture and Food Processing

The standard of living of the population depended, above all, on food supply and nutrition. The new technologies of the nineteenth century affected food supplies through production, distribution, preservation, and eventually preparation. In agriculture, the adoption of the new husbandry based on fodder crops and stall-fed livestock continued apace, though in France and in most of eastern Europe progress was slow. New implements and tools appeared on the scene, but here the traditional obstacles to technological progress in agriculture retarded growth: inventions that were useful in some environments failed elsewhere. A few general-purpose advances, such as barbed wire (invented in 1868), did appear, but the bulk of the technology was site- and crop-specific. Agricultural productivity owed much to the extended use of fertilizers. Farmers learned to use nitrates, potassium, and phosphates produced by the chemical industries. In addition to the guano already mentioned, the large American stockyards produced fertilizers made from animal bones combined with sulphuric acid. The productivity gains in European agriculture are hard to imagine without the gradual switch from natural fertilizer, produced mostly on the spot by farm animals, to commercially produced chemical fertilizers. Fertilizers were not the only scientific success in farming: the use of fungicides, such as the Bordeaux mixture invented in 1885 by the French botanist Millardet, helped conquer the dreaded potato blight that had devastated Ireland forty years earlier. Technological progress outside agriculture affected food supplies in many ways. Steel implements, drainage and irrigation pipes, steam-operated threshers, seed drills, and mechanical reapers slowly but certainly improved productivity and expanded the supply of food and raw materials. Yet here more than anywhere else the old resisted the new, and modern tools and techniques continued to coexist with manual operations that had not changed in centuries. Mechanizing agriculture involved overcoming some technical difficulties.
Much work in agriculture, such as weeding, picking, and milking, was carried out by movements of the human fingers, as opposed to the sweeping or beating motions of the human arm, and was therefore far harder to mechanize. Furthermore, the mechanization of agriculture was slow (compared to manufacturing) because of the lack of power substitutes. In most industrial processes, the act of production can occur at the site of the
power source. The utilization of more efficient energy sources was thus rather simple. In agriculture, the power sources had to be brought to the production site (i.e., the land) for most activities, and thus plowing, harrowing, reaping, raking, and binding remained dependent on draft animals long after manufacturing and transportation had adopted the steam engine. The application of steam power to agriculture was not a success. Only where the work could be carried out near the power source did mechanization come relatively early: the threshing machine built in 1784 by the Scotsman Andrew Meikle spread quickly, as did the winnowing machine built in 1777 by a London mechanic, James Sharp. These machines were attached to steam engines in the first half of the nineteenth century, but they remained something of an exception. The internal combustion engine solved all that, and by the eve of World War I the first tractors and combines were being introduced on both sides of the Atlantic. In 1880, it still took 20 man-hours on average to harvest an acre of wheat in the United States; by 1935 this figure had fallen to 6.1.

Of special interest to the historian interested in economic welfare is the development of food preparation and preservation. Much human suffering has been caused over the ages by nutritional deficiencies and by the unwitting consumption of contaminated foods. Food canning had been invented in 1795, but because the process was not understood, the food was overprocessed and tasted poor. Only after Louis Pasteur's pathbreaking discoveries was it understood why canning worked, and not until the end of the century did it become clear that the optimal cooking temperature was about 240°F. Canned food played an important role in provisioning the armies in the American Civil War, and led to vastly increased consumption of vegetables, fruit, and meat in the rapidly growing cities. Other food preservation techniques were also coming into use. Gail Borden invented condensed milk in the 1850s and helped win the Civil War for the Union and a fortune for himself. By the end of the century his dehydration idea was also successfully applied to eggs and soups. The centrifugal cream separator, invented by Gustav de Laval in 1877, soon became the cornerstone of the dairy cooperatives in Denmark, the Netherlands, and Ireland. Pasteur himself showed how to sterilize bottled cow's milk.

Cooling was an alternative form of preservation. In the eighteenth century, ice was preserved in special icehouses, and an international market in ice emerged in the early nineteenth century, though in the warm seasons the price of ice was so high that only the rich could afford it. Mechanical refrigeration was gradually developed and improved upon between 1834 (when the first patent for the manufacture of ice was issued in Britain) and 1861 (when the first frozen-beef plant was set up in Sydney, Australia). By 1870, beef transported from the United States to England was preserved by chilling (29–30°F). The efficient method of preserving beef, however, was by deep freezing at about 14°F. In 1876 the French engineer Charles Tellier built the first refrigerated ship, the Frigorifique, which sailed from Buenos Aires to France with a load of frozen beef. By the 1880s, beef, mutton, and lamb from South America and Australia were supplying European dinner tables. Farming in Europe suffered from this competition, but the consumer, the ultimate and final arbiter of all questions of economic progress, benefitted greatly.
Technological changes reduced the price of food in general to the point where after 1870 in many countries farmers, rather than consumers, turned to their governments for help. The decline of the price of proteins relative to carbohydrates helped to augment and improve European diets.

Other Manufacturing Sectors
In textiles, progress after 1870 was gradual and not marked by great breakthroughs. One major innovation that came into its own in this period was the sewing machine. Apparel making had lagged behind the rest of the textile industry during the early stages of the Industrial Revolution, despite an international search for a machine that would replace the motion of the human hand in the stitching process. These machines were at first technically unworkable, but after 1830 a solution began to appear on the horizon. Elias Howe, an American, is usually credited with the invention, but he merely perfected one crucial feature, the lockstitch, patented in 1846, and made his machine of metal parts. The man who deserves the most credit for perfecting the sewing machine is Isaac Merritt Singer, who powered his machine with a foot treadle. A conservative estimate of the increase in productivity resulting from the sewing machine puts it at 500 percent (Schmiechen, 1984, p. 26). It was later adapted to make shoes (the McKay shoe-sewing machine dates from 1861) and carpets (1880). Annual production of sewing machines went from 2,200 in 1853 to half a million in 1870. Unlike many other inventions in textiles, the sewing machine did not lead to factories, as it did not require a centralized power source. Instead, it kept struggling domestic workers (mostly women) occupied, and created a system of notorious sweatshops. The sewing machine was slow in adopting interchangeable parts; the Singer Company was successful primarily because of brilliant marketing, and it still used skilled fitters long after its competitors had switched to fully interchangeable parts (Hounshell, 1984, pp. 67–123).

In the rest of the textile industry, the period after 1870 had the character of a mopping-up operation. The combing of wool, which had long defied mechanization, was further improved by the introduction of the Lister-Donisthorpe nip machine in the 1850s, reviving the fortunes of the Yorkshire worsted industries. The Heilmann combing machine was used widely on the Continent, especially in Alsace. In spinning, the throstle that had dominated the scene until the 1860s was slowly superseded by ring spinning. Unlike mule spinning, in which the twist was imparted by the combined action of rollers and revolving spindles, ring spinning twisted the yarn by the rapid circular movement of a small clip, called the traveler, around a ring. The traveler guided the thread and ensured its winding onto the spindle. Ring spinning was continuous rather than intermittent, and required less skill and strength than mule spinning. It produced a slightly inferior product, however, especially for fine yarns. Ring spinning did not spread widely until the second half of the nineteenth century, but then it conquered the U.S. textile industry rapidly. Lancashire in Britain remained loyal to mules, however, a fact that has long intrigued economic historians, as it seemingly indicates a reluctance on the part of British industry to adopt a superior technology. In weaving, the power loom continued to replace handloom weavers after 1850, but automation came to weaving only after J. H. Northrop built the first automatic loom in 1894; it was widely adopted in the U.S. and on the Continent over the next two decades. By this time, Britain's textile industries had lost their position at the cutting edge of technology, and British adoption of the Northrop loom was slow.

Finally, I turn briefly to what may best be called information technology.
Following the telegraph of the 1830s, a "modern" pattern started to emerge, in which practice followed theory. Thus Hermann Helmholtz, a German physicist, experimented with the reproduction of sound, which inspired a Scottish-born speech teacher at Boston University named Alexander Graham Bell to work on what became the telephone (1876). Supplementary inventions such as the switchboard (1878) and the loading coil (1899) made the telephone one of the most successful inventions of all time. Wireless telegraphy is another outstanding example of the new order of things, in which science
led technology rather than the other way around (Aitken, 1976). The principle of wireless telegraphy, as yet unsuspected, was implicit in the theory of electromagnetic waves proposed on purely theoretical grounds by James Clerk Maxwell in 1865. The electromagnetic waves suggested by Maxwell were finally demonstrated to exist by a set of brilliant experiments conducted by Heinrich Hertz in 1888. The Englishman Oliver Lodge and the Italian Guglielmo Marconi turned the work of these ivory-tower theorists into wireless telegraphy in the mid-1890s, and in 1906 the Americans Lee De Forest and R. A. Fessenden showed how wireless radio could transmit not only Morse signals but sound waves as well.

Little science but a lot of experimentation was necessary for another invention that had an enormous effect on information technology: the typewriter. The idea of the typewriter is conceptually obvious, but a number of minor technical bugs, such as bars clashing when two letters were typed in quick succession, bedeviled its perfection. These problems were finally solved by Christopher L. Sholes of Milwaukee, reputedly the 52nd person to invent the typewriter. Sholes sold his patent to the Remington Company in 1874, and a small revolution in the office began. The technical problems in the printing industry were more complex. Typesetting had always been laborious and slow work, and the need for improvement was becoming acute as literacy rose and the thirst for information grew in the late eighteenth and nineteenth centuries. The first rotary press was built in Philadelphia in 1846. A horizontal cylinder carried the material to be printed and rotated in contact with smaller cylinders, each of which corresponded to a page, with automatic grippers guiding the pages from cylinder to cylinder. This machine, originally conceived by Robert Hoe, found its way to Europe, where many leading newspapers adopted it. It was fast but labor intensive. Typecasting, equally revolutionary in that it recast the type anew each time using an automatic process, was perfected in the United States in 1838; by 1851 it had spread all over Europe. An alternative technique, invented by Henry Bessemer (of steel fame), was the piano type, in which the operator worked at a keyboard. For a while, a confusing multitude of automatic typesetting techniques were in operation at the same time. Between 1886 and 1890, a German immigrant to the United States, Ottmar Mergenthaler, invented the Linotype machine, which cast and set a whole line at a time using a keyboard controlling hundreds of matrices, from which the letter molds were made. Linotype machines were primarily used for newspapers; for books, a related machine, the Monotype typesetter, was developed in the 1890s. With the increase in demand for paper, new raw materials became necessary and, after much experimentation, the use of wood pulp was perfected in about 1873.

Household Technology and Human Welfare
From a purely material point of view, the technological changes described above must have increased the well-being of the populations of Western Europe and North America, with somewhat delayed and smaller effects being felt in Eastern Europe and in a few areas of Asia and South America. Yet one could argue that it is not easy to establish the net effect of technological progress on human well-being. Industrialization led to urbanization, to the concentration of large numbers of workers in dangerous and unpleasant factories and mines, to alienation, and to the breaking up of traditional communities, compounded by large waves of emigration. Moreover, the half century of technical advance I have called the second Industrial Revolution was punctuated by a cataclysm of unprecedented dimensions. The sheer massiveness of the destruction of the First World War was in large part attributable to the power of the new technology: steel, chemicals,
high explosives, barbed wire, internal combustion engines, mass production. The nightmare of 1914–18 faithfully reflected the achievements of the previous decades. And yet it is clear that until 1914 life was getting better: incomes were rising, work hours slowly declining, some forms of social insurance emerging, nutrition and housing slowly improving. The statistical evidence from demography seems to bear this out without any question. Between 1870 and 1914 infant mortality declined by about 50%: in France, which was fairly typical, the rate fell from 201 per thousand in 1870 to 111 in 1914; in Germany the corresponding numbers were 298 and 164. Life expectancy at birth increased accordingly; in Britain it went from about 40 years to 50. This decline was in part due simply to rising incomes: as people enjoyed higher incomes, they could buy more and better food, live in less congested and better-heated dwellings, own better clothes, and gain access to running water, sewerage, and medical care.

Yet the period of the second Industrial Revolution witnessed another technological development that has not been properly appreciated for its huge impact on human welfare. We tend to think of technology as occurring in manufacturing and service firms and on the land, but economists have long understood that household technology in many ways resembles production technology, in that knowledge leads us to employ techniques and routines by which we manipulate natural regularities to better our material conditions: economic agents cooked food, sewed clothing, took care of babies and the elderly, and so on, based on what they knew. The knowledge used in household production went through its greatest transformation ever in the years surveyed here. Once again, the change was not quite as sharp as this suggests: some important precedents can be discerned long before, and the process was far from complete in 1914. Yet changes in household technology were crucial in reducing mortality after 1870 (see Mokyr, 1996; Mokyr and Stein, 1997; and Easterlin, 1996). The knowledge underlying these changes was the sudden growth in the understanding of the nature of infectious disease due to the work of Pasteur, Koch, and their associates. Within a few decades the medical profession managed to work out a more or less complete theory of infectious disease in which many of the causative agents were identified and their modes of transmission established. Medical practice before 1914 improved only in isolated areas, and had but little effect. The main impact of the new bacteriology was in preventive techniques. Households increasingly realized that by following certain simple recipes they could reduce the incidence of infectious disease. Germs could not be seen, but they could be fought by simple household techniques, available at relatively low cost. Once water was established as a carrier of certain diseases, people began to realize the importance of filtering, boiling, and, later, chlorination. When insects were identified as carriers of malaria and yellow fever, a war against insects erupted. Foodborne diseases could be reduced by proper cooking, cleaning, and preservation. All this had to be taught, and the teaching took time. Many mistakes were made, wrong turns taken, causal mechanisms misidentified, and false recommendations issued. Yet when all is said and done, the effects of this technological revolution on human welfare are the most unequivocal: the sharp decline in mortality rates in this period speaks for itself.
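The quoted national figures can be checked against the "about 50%" claim with a few lines of arithmetic. The sketch below (Python, using only the numbers cited above; the compound-annual-rate framing is my own illustrative addition) computes the proportional declines and the implied average yearly rate over the 44 years from 1870 to 1914.

```python
# Infant mortality per thousand live births, as quoted in the text.
rates = {"France": (201, 111), "Germany": (298, 164)}
years = 1914 - 1870  # 44 years

for country, (start, end) in rates.items():
    total_decline = 1 - end / start            # proportional fall over the period
    annual = 1 - (end / start) ** (1 / years)  # implied compound annual decline
    print(f"{country}: {total_decline:.0%} total decline, "
          f"about {annual:.1%} per year")
# Both countries show a fall of roughly 45%, broadly consistent with the
# statement that infant mortality declined by about 50% between 1870 and 1914.
```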
Conclusions

The second Industrial Revolution was, in many ways, the continuation of the first. In many industries there was direct continuity. Yet it differed from it in a number of crucial respects. First, it had a direct effect on real wages and standards of living, which by 1914 clearly differed significantly from those of 1870. Second, it shifted the geographical
focus of technological leadership away from Britain to a more dispersed locus, though leadership remained firmly the monopoly of the industrialized Western world. Finally, by changing the relationship between knowledge of nature and technological practice, it irreversibly changed the way technological change itself occurs. In so doing, what was learned in these years prepared the way for many more Industrial Revolutions to come.

Joel Mokyr