What is the history of Wrought Iron?

The exact date when the technique of smelting iron ore to produce usable metal was discovered is unknown.  Archeologists have found iron implements in Egypt dating to about 3000 B.C., and iron ornaments were used even earlier.  The advanced technique of hardening iron weapons by heat treating was known to the Greeks by about 1000 B.C.

Iron is one of the most useful metals ever discovered, but it is also one of the more difficult metals to trace through history, especially medieval history.  Iron comes in several forms, and the complications involved in producing each of them foster further confusion.

The alloys produced by early ironworkers, and all of the iron alloys made until about the 14th century A.D., would be classified today as wrought iron.  The process of making this tough, malleable alloy differs from other forms of steelmaking.

Because the process of making wrought iron required a great deal of hand labor, producing it in tonnage quantities was impossible.  Wrought iron is no longer produced commercially.

It can be effectively replaced in nearly all applications by low-carbon steel, which is less expensive to produce and is typically of more uniform quality than wrought iron.

Wrought iron was made by heating a mass of iron ore and charcoal in a forge or furnace with a forced draft.  Under this treatment the ore was reduced to a sponge of metallic iron filled with a slag composed of metallic impurities and charcoal ash.  The sponge iron was removed from the furnace while still incandescent and beaten with heavy sledges to drive out the slag and to weld and consolidate the iron.  The product made under these conditions usually contained about 3% slag particles and 0.1% other impurities.  Occasionally this technique of ironmaking produced, by accident, a true steel rather than wrought iron.  Ironworkers learned to make steel deliberately by heating wrought iron and charcoal in clay boxes for several days; by this process the iron absorbed enough carbon to become a true steel.

Carbon content is the major variable that distinguishes wrought iron, steel, and cast iron.  Too little carbon (under roughly 0.1%) and you have wrought iron; too much (above about 2%) and the metal melts and flows readily, as cast iron; an intermediate amount (roughly 0.1% to 2%) and you have steel.
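This carbon-based distinction can be sketched as a small classifier.  The thresholds below are illustrative approximations only; real metallurgical boundaries are fuzzy and depend on other alloying elements as well as carbon.

```python
def classify_iron_alloy(carbon_pct: float) -> str:
    """Classify an iron alloy by carbon content (percent by weight).

    Thresholds are rough, illustrative values, not precise
    metallurgical boundaries.
    """
    if carbon_pct < 0.1:
        return "wrought iron"  # too little carbon to harden appreciably
    elif carbon_pct <= 2.0:
        return "steel"         # enough carbon to harden by heat treating
    else:
        return "cast iron"     # so much carbon the metal flows when molten

print(classify_iron_alloy(0.05))  # wrought iron
print(classify_iron_alloy(0.8))   # steel
print(classify_iron_alloy(3.5))   # cast iron
```

Historically, of course, early ironworkers had no way to measure carbon content; they judged the alloy by how it behaved under the hammer and in the fire.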

After the 14th century, the furnaces used in smelting were increased in size, and a stronger draft was used to force the combustion gases through the “charge,” the mixture of raw materials.  In these larger furnaces, the iron ore in the upper part of the furnace was first reduced to metallic iron and then took on more carbon from the gases forced through it by the blast.  The product of these furnaces was pig iron, an alloy that melts at a lower temperature than steel or wrought iron.  Pig iron was then further refined to make steel.

Modern steelmaking employs blast furnaces that are merely refinements of the furnaces used by the old ironworkers.  The process of refining molten iron with blasts of air was developed by the British inventor Sir Henry Bessemer, who introduced the Bessemer converter in 1855.  Since the 1960s, several so-called mini-mills have been producing steel from scrap metal in electric furnaces.  Such mills are an important component of total U.S. steel production, but the giant steel mills remain essential for producing steel from iron ore.