A Short History of Steel

Blast furnaces were first developed by the Chinese in the 6th century B.C., but the technology saw wider use in Europe during the Middle Ages, where it increased the production of cast iron. At very high temperatures, iron begins to absorb carbon, which lowers the melting point of the metal; the result is cast iron, with a carbon content of 2.5 percent to 4.5 percent.

Cast iron is strong, but it suffers from brittleness due to its carbon content, making it less than ideal for working and shaping. As metallurgists became aware that the high carbon content in iron was central to the problem of brittleness, they experimented with new methods for reducing the carbon content in order to make iron more workable.

Modern steelmaking evolved from these early methods of making iron and from subsequent developments in technology.

Wrought Iron

By the late 18th century, ironmakers had learned how to transform cast pig iron into low-carbon wrought iron using puddling furnaces, developed by Henry Cort in 1784. Pig iron is the molten iron run out of blast furnaces and cooled in a main channel and adjoining molds. It got its name because the large central ingot and the smaller ones adjoining it resembled a sow and suckling piglets.

To make wrought iron, the furnaces heated molten iron, which puddlers stirred with long, oar-shaped tools, allowing oxygen to combine with the carbon and slowly remove it.

As the carbon content decreases, iron's melting point rises, so masses of decarburized iron would agglomerate in the furnace. The puddler removed these masses and worked them with a forge hammer before they were rolled into sheets or rails. By 1860, there were more than 3,000 puddling furnaces in Britain, but the process remained hindered by its heavy demands for labor and fuel.

Blister Steel

Blister steel, one of the earliest forms of steel, began production in Germany and England in the 17th century. It was made by increasing the carbon content of wrought iron using a process known as cementation, in which bars of wrought iron were layered with powdered charcoal in stone boxes and heated.

After about a week, the iron would absorb the carbon in the charcoal. Repeated heating would distribute carbon more evenly, and the result, after cooling, was blister steel. The higher carbon content made blister steel much more workable than pig iron, allowing it to be pressed or rolled.

Blister steel production advanced in the 1740s when English clockmaker Benjamin Huntsman found that the metal could be melted in clay crucibles and refined with a special flux to remove slag that the cementation process left behind. Huntsman was trying to develop a high-quality steel for his clock springs. The result was crucible—or cast—steel. Due to the cost of production, however, both blister and cast steel were only ever used in specialty applications.

As a result, wrought iron made in puddling furnaces remained the primary structural metal in industrializing Britain during most of the 19th century.

The Bessemer Process and Modern Steelmaking

The growth of railroads during the 19th century in both Europe and America put great pressure on the iron industry, which still struggled with inefficient production processes. Steel was still unproven as a structural metal and production was slow and costly. That was until 1856 when Henry Bessemer came up with a more effective way to introduce oxygen into molten iron to reduce the carbon content.

In what is now known as the Bessemer Process, Bessemer designed a pear-shaped receptacle, referred to as a converter, in which iron could be heated while oxygen was blown through the molten metal. As the oxygen passed through the molten metal, it reacted with the carbon, carrying it off as gas and leaving a purer iron.

The process was fast and inexpensive, removing carbon and silicon from iron in a matter of minutes, but it suffered from being too successful: too much carbon was removed, and too much oxygen remained in the final product. Bessemer ultimately had to repay his investors until he could find a method to raise the carbon content and remove the unwanted oxygen.

At about the same time, British metallurgist Robert Mushet acquired and began testing spiegeleisen, an alloy of iron, carbon, and manganese. Manganese was known to remove oxygen from molten iron, and the carbon in the spiegeleisen, if added in the right quantities, would solve Bessemer's problems. Bessemer began adding it to his conversion process with great success.

One problem remained. Bessemer had failed to find a way to remove phosphorus—a deleterious impurity that makes steel brittle—from his end product. Consequently, only phosphorus-free ores from Sweden and Wales could be used.

In 1876 Welshman Sidney Gilchrist Thomas came up with a solution by adding a chemically basic flux—limestone—to the Bessemer process. The limestone drew phosphorus from the pig iron into the slag, allowing the unwanted element to be removed.

This innovation meant that iron ore from anywhere in the world finally could be used to make steel. Not surprisingly, steel production costs began decreasing significantly. Prices for steel rail dropped more than 80 percent between 1867 and 1884, initiating growth of the world steel industry.

The Open Hearth Process

In the 1860s, German engineer Karl Wilhelm Siemens further enhanced steel production with his open hearth process, which produced steel from pig iron in large, shallow furnaces.

The process used high temperatures to burn off excess carbon and other impurities, relying on heated brick chambers below the hearth. Later regenerative furnaces recycled the furnace's own exhaust gases to maintain high temperatures in those brick chambers.

This method allowed for the production of much larger quantities (50-100 metric tons in one furnace), periodic testing of the molten steel so it could be made to meet particular specifications, and the use of scrap steel as a raw material. Although the process itself was much slower, by 1900 the open hearth process had largely replaced the Bessemer process.

Birth of the Steel Industry

The revolution in steel production that provided cheaper, higher-quality material was recognized by many businessmen of the day as an investment opportunity. Capitalists of the late 19th century, including Andrew Carnegie and Charles Schwab, invested and made millions (billions in the case of Carnegie) in the steel industry. The U.S. Steel Corporation, formed in 1901 when J.P. Morgan merged Carnegie's steel holdings with several rivals, was the first corporation ever valued at more than $1 billion.

Electric Arc Furnace Steelmaking

Just after the turn of the century, Paul Heroult's electric arc furnace (EAF) was designed to pass an electric current through charged material, producing exothermic oxidation and temperatures up to 3,272 degrees Fahrenheit (1,800 degrees Celsius), more than sufficient for steelmaking.

Initially used for specialty steels, EAFs grew in use, and by World War II they were being used to manufacture steel alloys. The low investment cost involved in setting up EAF mills allowed them to compete with major U.S. producers like US Steel Corp. and Bethlehem Steel, especially in carbon steels and long products.

Because EAFs can produce steel from 100 percent scrap (cold ferrous) feed, they require less energy per unit of production. Unlike basic oxygen furnaces, EAF operations can also be stopped and started with little associated cost. For these reasons, production via EAFs has been steadily increasing for more than 50 years; as of 2017, it accounted for about 33 percent of global steel production.

Oxygen Steelmaking

The majority of global steel, about 66 percent, is produced in basic oxygen facilities. The development of a method to separate oxygen from nitrogen on an industrial scale in the 1960s allowed for major advances in the development of basic oxygen furnaces.

Basic oxygen furnaces blow oxygen into large quantities of molten iron and scrap steel and can complete a charge much more quickly than open-hearth methods. Large vessels holding up to 350 metric tons of iron can complete conversion to steel in less than one hour.

The cost efficiencies of oxygen steelmaking made open-hearth factories uncompetitive, and following its advent in the 1960s, open-hearth operations began closing. The last open-hearth facility in the U.S. closed in 1992; in China, the last one closed in 2001.
