How a WWII Disaster—and Cover-up—Led to a Cancer Treatment Breakthrough





On the night of December 2, 1943, the Germans bombed a key Allied port in Bari, Italy, sinking 17 ships and killing more than 1,000 American and British servicemen and hundreds of civilians. Caught in the surprise World War II air raid was the John Harvey, an American Liberty ship carrying a secret cargo of 2,000 mustard bombs to be used in retaliation if Hitler resorted to gas warfare.

The Luftwaffe’s lucky strike, which released a poisonous cloud of sulfur mustard vapor over the harbor—and liquid mustard into the water—prompted an Allied cover-up of the chemical weapons disaster. But it also led to an army doctor’s serendipitous discovery of a new treatment for cancer.

Eisenhower and Churchill Coordinated an Immediate Cover-up, Thwarting Victims’ Treatment

In the devastating aftermath of the attack, which the press dubbed a “little Pearl Harbor,” U.S. General Dwight D. Eisenhower and British Prime Minister Winston Churchill moved to conceal the truth about the shipment of poison gas, for fear Germany might use it as an excuse to launch an all-out chemical war. As a result of the military secrecy, medical personnel weren’t alerted to the danger of contamination from the liquid mustard that spread insidiously over the harbor, mixing with the tons of fuel oil from the damaged ships.

In the crush of casualties that first night, hundreds of survivors, who had jumped or been blown overboard and swum to safety, were mistakenly believed to be suffering only from shock and immersion. They were given morphine, wrapped in warm blankets and left to sit in their oil-soaked uniforms for as long as 12 and even 24 hours, while the seriously wounded were attended to first. It was tantamount to marinating in mustard gas. But all remained ignorant of the peril.


By dawn, the patients had developed red, inflamed skin and blisters on their bodies “the size of balloons.” Within 24 hours, the wards were full of men with eyes swollen shut. The doctors suspected some form of chemical irritant, but the patients did not present typical symptoms or respond to standard treatments. The staff’s unease only deepened when notification came from headquarters that the hundreds of burn patients with unusual symptomatology would be classified “Dermatitis N.Y.D.”—not yet diagnosed.

Then without warning, patients in relatively good condition began dying. These sudden, mysterious deaths left the doctors baffled and at a loss as to how to proceed. Rumors spread that the Germans had used an unknown poison gas. With the daily death toll rising, British officials in Bari placed a “red light” call alerting Allied Force Headquarters (AFHQ) in Algiers to the medical crisis. Lieutenant Colonel Stewart Francis Alexander, a young chemical warfare specialist attached to Eisenhower’s staff, was dispatched immediately to the scene of the disaster.

The Investigator’s Findings Were Censored

Despite the British port authorities’ denials, Alexander quickly diagnosed mustard gas exposure. Convinced that preoccupation with military security had compounded the tragedy, he doggedly pursued his own investigation to identify the source of the chemical agent and determine how it had poisoned so many men.

After carefully studying the medical charts, he plotted the destroyed cargo ships’ positions relative to the gas victims and succeeded in pinpointing the John Harvey as the epicenter of the chemical explosion. When divers pulled up fragments of fractured gas shells, the casings were identified as being from 100-pound American mustard bombs.

On December 11, 1943, Alexander informed headquarters of his initial findings. Not only was the gas from the Allies’ own supply, but the victims labeled “Dermatitis N.Y.D.” had suffered prolonged exposure as a result of being immersed in a toxic solution of mustard and oil floating on the surface of the harbor.

The response Alexander received was shocking. While Eisenhower accepted his diagnosis, Churchill refused to acknowledge the presence of mustard gas in Bari. With the war in Europe entering a critical phase, the Allies agreed to impose a policy of strict censorship on the chemical disaster: All mention of mustard gas was stricken from the official record, and Alexander’s diagnosis deleted from the medical charts.


The Bari Disaster Led to an Epiphany About the Chemical’s Effect on White Blood Cells

Alexander’s “Final Report of the Bari Mustard Casualties” was immediately classified, but not before his startling discovery of the toxic effects on white blood cells caught the attention of his boss in the Chemical Warfare Service (CWS), Colonel Cornelius P. “Dusty” Rhoads. In civilian life, Rhoads served as head of New York’s Memorial Hospital for the Treatment of Cancer and Allied Diseases.

Of the more than 617 casualties who suffered from gas exposure at Bari, 83 died, all demonstrating mustard’s suppressive effect on cell division—suggesting it might be used to inhibit the fast-multiplying malignant white cells that can invade and destroy healthy tissue. Alexander had extracted invaluable data from the morgue full of case studies, pointing to a chemical that could possibly be used as a weapon in the fight against certain types of cancer.

Based on Alexander’s landmark Bari report, and a top-secret Yale University clinical trial that demonstrated that nitrogen mustard (a more stable cousin of sulfur mustard) could shrink tumors, Rhoads was convinced the harmful substance—in tiny, carefully calibrated doses—could be used to heal. In 1945, he persuaded the General Motors tycoons Alfred P. Sloan and Charles F. Kettering to fund the Sloan Kettering Institute for Cancer Research (SKI), creating a state-of-the-art laboratory, staffed by wartime scientists, to synthesize new mustard derivatives and develop the first medicine for cancer—known today as chemotherapy.

In 1949, Mustargen (mechlorethamine) became the first experimental chemotherapeutic drug approved by the FDA and was used successfully to treat non-Hodgkin’s lymphoma. This triumph galvanized the search for other chemical agents that specifically targeted malignant cells but spared normal ones, leading the American Cancer Society to credit the Bari disaster with initiating “the age of cancer chemotherapy.”

Jennet Conant is the bestselling author of The Great Secret: The Classified World War II Disaster That Launched the War on Cancer as well as numerous other books on World War II.




A Rough Start for Presidential Primaries

Early 20th-century politicians advocated for primaries by saying they’d make the nominating process more democratic, even if that wasn’t always their main reason for supporting them. In 1912, former president Theodore Roosevelt—who’d previously opposed primaries—publicly supported them when he realized it might be the only way to wrest the Republican Party nomination from the sitting president (and his handpicked successor) William Howard Taft.

Only 13 of the 48 states held Republican primaries in the 1912 election, so although Roosevelt won most of the races, he didn’t secure enough delegates to win the nomination. He responded by breaking from the Republicans and starting the Progressive Party or “Bull Moose Party” so he could run for president on its ticket. The new party’s nominating process, however, was deeply undemocratic: the Progressive convention refused to seat Black delegates, including those …read more


1. Germany Repelled on Two Fronts


After storming across Europe in the first three years of the war, overextended Axis forces were put on the defensive after the Soviet Red Army rebuffed them in the brutal Battle of Stalingrad, which lasted from August 1942 to February 1943. The fierce battle for the city named after Soviet dictator Joseph Stalin resulted in nearly two million casualties, including the deaths of tens of thousands of Stalingrad residents.

As Soviet troops began to advance on the Eastern Front, the Western Allies invaded Sicily and southern Italy, causing the fall of Italian dictator Benito Mussolini’s government in July 1943. The Allies then opened a Western Front with the amphibious D-Day invasion of Normandy on June 6, 1944. After gaining a foothold in northern France, Allied troops liberated Paris on August 25 followed by Brussels …read more


How a Chemical Weapons Disaster in WWII Led to a U.S. Cover-Up—and a New Cancer Treatment

The old port town of Bari, on Italy’s Adriatic coast, was bustling. It was December 2, 1943. The British had taken Puglia’s capital in September, and though the front now lay just 150 miles to the north, the medieval city, with its massive cliffs cradling the sea, had escaped the fighting almost unscathed.

Only a few miles outside of town, lines of women and children begged for black-market food, but here shop windows were full of fruit, cakes and bread. Young couples strolled arm in arm. Even ice cream vendors were doing a brisk trade.

Bari was a Mediterranean service hub, supplying the 500,000 Allied troops engaged in driving the Germans out of Italy. Grand waterfront buildings had recently been designated the headquarters of the United States Fifteenth Air Force. The Allied air forces had already chased the Luftwaffe from the skies over Italy, and the British, who controlled the port, were so confident they had won the air war that Air Marshal Sir Arthur Coningham announced that Bari was all but immune from attack. “I would regard it as a personal affront and insult if the Luftwaffe should attempt any significant action in this area,” he said that day at a press conference.

Four days earlier, the American Liberty ship John Harvey had pulled in with a convoy of nine other merchantmen, and some 30 Allied ships were crammed into the harbor, packed against the seawall and along the pier. Their holds were laden with everything from food and medical gear to engines, corrugated steel for landing strips, and 50-gallon drums of aviation fuel. Visible on the upper decks were tanks, armored personnel carriers, jeeps and ambulances. Bright lights winked atop huge cranes that hoisted baled equipment up and out.

At 7:35 p.m.—a blinding flash followed by a terrific bang.

The ancient port’s single antiaircraft battery opened fire. Then came an earsplitting explosion, then another, and another. German Junkers Ju-88s flew in low over the town, dropping bombs short of the harbor. Smoke and flames rose from the city’s winding streets.

As incendiaries rained down on the harbor, turning night into day, gunners aboard the anchored ships scrambled to shoot down the enemy—too late. The attacking German airplanes fled into the night. The raid lasted less than 20 minutes.

Soon a tremendous roar came from the harbor. An exploding ammunition tanker sent a huge rolling mass of flames a thousand feet high. A reporter for Time magazine noted a “fiery panorama.” Eight ships were already “burning fiercely,” he wrote, and the “entire center of the harbor was covered with burning oil.”

A ruptured bulk-fuel pipeline sent thousands of gallons gushing into the harbor, where it ignited into a gigantic sheet of flame, engulfing the entire north side of the port. Flames leapt from ship to ship. Crews worked frantically to free ships before raging fires forced them to jump overboard and swim for it.

The attack on Bari, which the press called “a little Pearl Harbor,” shook the complacency of the Allied forces, who had been convinced of their air superiority in Italy. All told, the Nazis sank 17 Allied ships and destroyed more than 31,000 tons of valuable cargo. More than 1,000 American and British servicemen were killed, and almost as many wounded, along with hundreds of civilians.

In the crucial days that followed, the task of treating gravely injured sailors would be made even more difficult by wartime secrecy. It would be almost 30 years before the world would learn the truth about what really took place that night, and even today few are aware of the surprising role the disaster has played in the lives of ordinary Americans.

Lt. Col. Stewart Francis Alexander, asleep in his quarters at Allied Force Headquarters in Algiers, was awake at the first harsh jangle of the telephone. There appeared to be a developing medical crisis in Bari. Too many men were dying, too quickly, of unexplained causes. The symptoms were unlike anything military physicians had seen before, and they had begun to suspect that the Germans had used an unknown poison gas. There was an urgent request for assistance. Alexander, a medical officer attached to the staff of Gen. Dwight D. Eisenhower at AFHQ, had received special training in chemical warfare. He was being dispatched immediately to the scene.

Lt. Col. Stewart Alexander, a physician and cardiologist turned chemical weapons expert, who led the investigation into the Bari disaster. (Stewart F. Alexander Papers)

Alexander looked young for a combat physician. Five-foot-eight and skinny, he was 29, and only the hair thinning at his temples lent him an air of authority. He was popular with the troops, though some patients kidded that his gentle bedside manner was best suited to a pediatrician. But he had been through the brutal invasion of North Africa under Maj. Gen. George S. Patton, and despite a quiet modesty Alexander had proven himself determined and resourceful.

He could have sat out the war in a stateside hospital or research laboratory, but the desire to serve ran deep. He was descended from self-made immigrants, part of a wave of Eastern European Jews who, fleeing famine and persecution, journeyed to the United States in the 1880s and ’90s and were forever grateful for the opportunity afforded them in their new home. Alexander’s father was an old-fashioned family practitioner in Park Ridge, New Jersey, and Alexander’s one ambition was to follow in his footsteps. After excelling at the Staunton Military Academy, in Virginia, he entered Dartmouth College at age 15. A standout in his science courses, he was allowed to advance directly to medical school in his senior year, graduating at the top of his class in 1935. After completing Dartmouth’s two-year program, he earned his medical degree from Columbia University, and did his residency training in New York. Then Alexander returned home, where he proudly hung his shingle next to his father’s. They enjoyed their shared dream of practicing medicine together for only a few months.

In the spring of 1940, Alexander notified the draft board that he was “available any time.” He was called up in November and spent time with the 16th Infantry Regiment, stationed at Gunpowder Military Reservation, in Maryland, not far from Edgewood Arsenal, home of the Chemical Warfare Service, or CWS. Before long, he contacted CWS with an innovative new design for spectacles that fit within the facepiece of a gas mask. (He was granted a patent on the spectacles, but he turned the rights over to the Army.)

Dugway Proving Ground, in Utah, where the U.S. Army tested chemical weapons during World War II. (David Maisel / INSTITUTE)

Transferred to Edgewood, Alexander underwent a crash course in poison gases, consulting specialists and experimenting on animals to evaluate toxic agents and forms of treatment; he even investigated the agents’ medicinal potential. After Pearl Harbor, he taught Army medical personnel how to treat chemical casualties. He was promoted to director of the Medical Division of CWS’s Research Laboratory at age 27, and when General Patton embarked in October 1942 with 35,000 troops to attack the coast of Morocco, the first time U.S. ground forces would face Axis armies, Alexander accompanied him as the Consultant in Chemical Warfare Medicine to the Western Task Force.

Now, at 5 p.m. on December 7, 1943, five days after the attack on Bari, Alexander’s plane touched down at the city airfield. Waiting for him on the tarmac were the district’s senior British Royal Army Medical Corps officer and a group of hospital directors. “Their agitation was immediately apparent,” Alexander recalled, “and I was taken to the hospital at once.”

The 98th British General Hospital, located in a large complex of brick buildings 15 minutes from the harbor, had been spared. Built on the monumental scale beloved by the Fascists, the Bari Polyclinic housed sizable medical wards, a surgical block and laboratories.

“With every fresh explosion, the building creaked and rattled, rocking like a ship in a storm,” E. M. Somers Cocks, a nurse from New Zealand, recalled of the attack. “Doors were wrenched off hinges, windows were shattered, and the bricked-up windows scattered their bricks like hail.” A concussion blast knocked out the power, plunging the hospital into darkness. They were still sweeping up glass when the wounded began to arrive—hundreds of bloodied sailors suffering from shock, burns and exposure. Almost all of them were covered in thick, black crude oil. The litter-bearers brought up the rear, carrying the grievously injured. These were sailors who had jumped from blazing ships, or swum through pools of flaming oil, and were horribly burned.

Left, Bari, on Italy’s southeastern coast, in November 1943. The British had captured the strategic port city two months earlier. Right, a rescue boat searches for survivors in Bari Harbor after the December 1943 attack. Fuel from damaged freighters and a ruptured pipeline flooded the harbor. (George Kaye / Alexander Turnbull Library / National Library of New Zealand; U.S. Army Signal Corps / National Archives)

With so many patients needing urgent attention, there was no time to get many sailors out of their soiled clothes, so the ward matrons did what they could. The “immersion” cases received a shot of morphine, blankets to keep them warm, and strong, hot, sweet tea. Then they were left to rest. A British nurse, Gwladys Rees, remembered trying to fix an intravenous line by the light of a match while the wind blew through shattered windows. “We worked by the dim glow of hurricane lamps, long into the night and early morning,” she recalled. “Intravenous bottles were dripping from every third bed, and the corridors were crammed with patients for whom we could find no accommodation.”

The first "unusual" indication, the doctors informed Alexander, was that casualties did not present typical symptoms or respond to treatment in the typical manner. Many patients, despite a thready pulse and low blood pressure, did not appear to be in clinical shock. Instead of being restless or anxious, they were apathetic—some even said they felt “rather well”—and their extremities were warm rather than cold.

After dawn, nurses observed that a few of the men complained of being thirsty, even though orderlies had just gone around with the drinks cart. Suddenly there were so many men clamoring for water the whole ward was in an uproar. Patients were yelling about the intense heat, tearing off their clothing, and, in their frenzy, trying to rip off their bandages.

Overnight, the majority of immersion cases had developed red and inflamed skin, with blisters “as big as balloons and heavy with fluid,” Rees recalled. This, together with widespread nausea and vomiting, led doctors to think the cause might be poisonous fumes, possibly from the fuel oil and explosives. “We began to realize that most of our patients had been contaminated by something beyond all imagination,” she said.

Six hours after the attack, patients who had managed to fall asleep awoke complaining of eye pain. They said their eyes felt “gritty, as though sand particles had gotten in,” Alexander wrote in his report. Within 24 hours, the wards were full of men with eyes swollen shut. As the staff’s unease deepened, British naval headquarters sent a notification that there was a “possibility of blister gas exposure” among the casualties. The hundreds of burn patients with unusual symptoms were to be classified “Dermatitis N.Y.D.”—not yet diagnosed—pending further instructions.

Given the crush of casualties that first night, nonurgent cases who had appeared in “good condition” were sent away, sometimes in their wet uniforms. The next morning many returned, clearly needing treatment. Nurses tried to clean them up, scrubbing the black scum from patients’ skin with kerosene, but many took a turn for the worse. “We did everything humanly possible, but it was no good,” Rees said. “It was horrible to see these boys, so young and in so much obvious pain. We couldn’t even give them strong sedatives, since we weren’t quite sure how they would react with whatever had poisoned them.”

The first unexplained death occurred 18 hours after the attack. Within two days, there were 14. Alexander noted the startling downward spiral. “Individuals that appeared in rather good condition in a matter of minutes would become moribund and die,” the doctors told him. The British doctors were mystified. The symptoms did not correspond to case histories of mustard gas poisoning from World War I, or to manuals issued by the Chemical Warfare Service. If the toxic agent was mustard—named for its unpleasant garlicky odor—respiratory complications should have been more prominent.

A World War II-era poster, with an apparent caricature of Mussolini, to help U.S. troops identify mustard gas, a weapon named for its unpleasant odor. (Otis Historical Archives / National Museum of Health and Medicine)

Several days later, patients with no previous respiratory problems became congested and developed very sore throats, making it hard to swallow. These patients died not as a result of broncho-pneumonia, as might have been expected, but from cardio-circulatory failure.

Alexander walked the crowded wards. He examined patients, gently lifting blankets to study their wounds. With extraordinary delicacy, he probed the strange patches of thickened red skin. He spoke with each patient in turn, asking how he had come by his injuries. Which ship was he on? How did he come to be rescued? Did he receive first aid at the docks? What about at the hospital? One sailor after another told of being caught in the firestorm, of the pandemonium, of somehow making it to the hospital. There they had waited for as long as 12 and even 24 hours before receiving treatment.

Drawing back the covers from one patient, Alexander studied the burns on an otherwise healthy body. The sailor said he had been aboard a PT boat in the harbor when the German bombers flew over. He heard a loud boom as a nearby ship blew up, and the boat was hightailing it back to shore when he felt a spray of oily liquid land on his neck and run down his chest and back. Alexander observed the outline of raw, raised skin, shiny with ointment, delineating where he had been sprayed, as if the splash had been imprinted on his flesh. The burns Alexander had seen on other patients were varied, but already he could distinguish between chemical burns and those caused by fire and heat: “Certain patterns were present depending on how the individual had been exposed.”

It appeared to Alexander that sailors who had been thrown overboard and completely immersed in the harbor were burned extensively, while those in boats had comparatively superficial burns wherever the toxic soup had hit them. Several men who had sat in the solution, possibly in lifeboats, had only local burns of the buttocks and groin. A few lucky souls who took it upon themselves to wipe off the oily mixture that first night sustained only minor injuries.

As he made his rounds, it was increasingly clear to Alexander that most of these patients had been exposed to a chemical agent. His sense of smell supported his hypothesis. Entering the hospital, he had noticed something different from the usual cloying mix of sweat, urine and disinfectant. “Traces of an odor implanted in my mind said mustard gas,” he later recalled.

He knew that the three most common blister agents were sulfur mustard, lewisite and nitrogen mustard. Although generally referred to as “gas,” all three agents were liquids at room temperature. And all three produced skin injuries resembling burns and serious eye injuries. Particularly worrying was the new, pure-grade nitrogen mustard developed by the Germans, which Alexander had studied the previous year, at Edgewood, after two classified samples were smuggled out of Germany. Its effects were reportedly more rapid than sulfur mustard, and it could penetrate intact skin and cause systemic poisoning. Practically colorless and odorless, apart from a faint fishy smell, it was not easily detected in the field. The Germans were also known to use mixtures of blister agents, so any combination was a real possibility.

Declassified photographs of test subjects in U.S. military trials who were exposed to toxic agents such as nitrogen mustard during the war. (Courtesy Naval Research Laboratory)

It had been five days since the initial exposure, and if there was any chance of saving the hundreds of Allied sailors lying in hospitals all over Bari, plus countless Italian civilians, he would need to act swiftly.

He decided to put the question directly to the commanding officer of the 98th General Hospital, Col. Wellington J. Laird. “I feel these men may have been exposed to mustard in some manner, Colonel,” Alexander said tentatively. “Do you have any idea how this might have happened?”

As the chemical warfare consultant to AFHQ, Alexander was cleared to the "highest degree." He knew the Allies had begun secretly stockpiling poison gas in the Mediterranean, in case Germany, with its back against the wall, resorted to all-out chemical warfare. But he was skeptical that the Allies would have shipped mustard shells into a busy port like Bari and allowed the toxic cargo to sit there as a prime target for an enemy strike. Still, Alexander couldn’t rule it out. Tactfully, he tried again. "Have you checked with the port authorities?" he asked Laird. "Could the ships in the harbor have been carrying mustard?"

Laird replied, “I have, and they tell me they have no such information available.”

The burden of proof rested on him. He ordered a series of tests for the patients who were still alive, and insisted on “careful and complete autopsies” on patients who had died under mysterious circumstances. He ordered samples of the harbor waters collected and analyzed. He borrowed personnel from displaced hospital units and put them to work gathering data, performing lab tests on tissue samples and compiling pathology reports.

Suspecting that Laird had dodged his question, Alexander visited Navy House, the British admiralty’s local headquarters. Weary after the long day, he was blunt: Was there mustard gas in Bari Harbor? This was again “absolutely denied.”

Alexander left unconvinced. What he needed was proof. But this was not the familiar menace he had studied at Edgewood. This was a new horror, “mustard gas poisoning though in a different guise than that recognized from WWI,” he wrote later.

At first light, Stewart Alexander headed for the harbor. He picked his way through mounds of rubble and surveyed the twisted skeletal remains of the Allied convoys. Out on the mole, men were working like ants, removing jagged chunks of concrete and scrap metal. The port, which had been closed for five days and swept for mines, had partially reopened that morning. A number of burned-out vessels had already been towed out to sea and sunk or blown apart. A coal barge still smoldered on a quay close by, and the fly ash stung his nostrils.

The dark oil-slimed water in the harbor basin looked sinister. One sailor had recalled that the floating oil had been a foot thick on the surface of the water after the raid. It was a mixture of high-octane gasoline and fuel from two dozen Allied ships and, Alexander suspected, mustard gas or a derivative, possibly dropped by the Germans among the incendiary bombs. Alexander wondered what other agents might have been thrown into the mix. The Germans possessed phosphorus and magnesium bombs, both of which would have caused deep chemical burns and eye injuries. Another possibility was that an Allied cargo ship had been carrying white phosphorus shells and smoke pots—designed to mask approaches and unnerve the enemy—which were released when the vessel was hit.

If it was an aerial gas attack, determining which ships were hit and in what order would help him understand which crews suffered the most direct exposure. Even men not on the water would have inhaled significant doses of the noxious vapor as it spread across the harbor—some of it sinking, some burning, some mixing with the tons of oil floating on the surface, and some evaporating and mingling with the clouds of smoke and flame. German planes could have dropped time-fused mustard bombs that would burst open approximately 200 feet above the water or, in a low-altitude “spray attack,” could have released liquid mustard from tanks that would then have been transformed by the slipstream into tiny droplets resembling a vapor. Alexander reasoned that in either case the attack would have contaminated all the ships in the inner harbor, including the crippled vessels that remained afloat, and drenched all the men on the docks below.

Yet Alexander had found no evidence of mustard contamination in his survey of the dock area. And the Royal Navy personnel he interviewed appeared shocked at the suggestion that poison gas might have been released in the air raid. “Mustard?” one British officer repeated in surprise, shaking his head. “That’s impossible. There’s no mustard here.”

When he spoke with British port authorities, they continued to “state categorically that there was no mustard in the area.” Undeterred, Alexander described in detail the ghastly burns he had seen at the hospital, and he insisted there was no way those injuries could have been sustained by anything except chemical exposure. Of the 534 men admitted to the Allied hospitals following the attack, 281 were suffering from symptoms consistent with mustard poisoning. That day, 45 had died. These were just the documented cases. Many more fatalities could be expected if they did not receive proper treatment urgently. The vast majority of the victims were British—their own countrymen.

The authorities began to waver. They allowed that if mustard gas was present in the harbor, “it could only have come from the German planes.” Alexander considered the ramifications of the charge that Hitler, in a desperate gamble, had risked a gas offensive. But coming as it did after a string of firm denials of so much as a whiff of mustard in Bari, it seemed to Alexander too neat an explanation.

For days he pored over the clinical records. “Reading the reports,” he wrote, “is to take a journey into the nightmare of the effects of chemical contamination.”

From his training, Alexander knew that agents such as mustard are toxic in vapor or liquid form when they reach the eyes, nose, lungs or gastrointestinal tract. But the chemicals can also be absorbed by the skin. And any toxic agent in contact primarily with the epidermis would, therefore, result in delayed clinical signs—as was the case with the baffling Bari victims.

These were the symptoms he bore in mind as he studied the case of Seaman Philip Henry Stone, a patient who had abruptly died after asking for a drink. The doctors had pointed to him as an example of one of the inexplicable “early deaths.” The pathologist noted “a generalized dusky erythema,” or reddened skin, on the chest, abdomen and thighs, and many blisters on the face, ears, arms, back and external genitalia. “The lips were dull black in color,” he wrote.

During the autopsy, the pathologist also found that the esophagus displayed a “curious black longitudinal streaking,” probably due to dead cells and tissue. The lungs, a mottled blackish-red color, were congested, the bronchi were filled with pus, and the trachea engorged with fluid. The stomach showed the same black areas, and there were necrotic areas near the opening, most likely caused by swallowing a diluted solution of mustard mixed with oil.

After studying the reports, Alexander concluded that many sailors who sustained blast injuries would not have succumbed to the hemorrhages were it not for other complications: “The serious consequences of imposing the mustard vapor injury upon a lung partially damaged or bruised by blast is at once apparent.”

Alexander was still trying to decide how best to proceed, given the official resistance to his diagnosis, when he received stunning news. A diver he had ordered to search the harbor floor had found fractured gas shells. Tests performed on-site revealed traces of mustard. Ordnance officers from the U.S. Army Air Forces identified the casings as belonging to a 100-pound M47A2 mustard gas bomb. German mustard gas bombs were always marked with the distinctive Gelb Kreuz, or yellow cross. This bomb was definitely American.

Alexander’s instincts were right—an Allied ship, later identified as the John Harvey, had been carrying a cargo of mustard gas. The secret shipment had most likely been destined for a chemical stockpile at Foggia, 75 miles away, in order to improve the U.S. capability to retaliate against a German chemical attack.

As Alexander knew from his training, the M47 bomb was made of simple sheet metal, designed to hold white phosphorus or liquid sulfur mustard. Although the M47A2 model was coated inside with an oil to protect it from corrosion caused by the agent, the bombs were still fragile. They would have been blown to pieces in the German bombardment, releasing lethal mustard into the atmosphere and the oily harbor water.

Alexander found it hard to believe that this was the first time British officials were learning of the chemical weapons. The circumstances of the accident would need further investigating, as would the extent to which the military authorities had covered up the escaped gas. By failing to alert hospital staff to the risk of contamination, they had greatly added to the number of fatalities. At that moment, however, Alexander’s patients took precedence. Now that his suspicions were confirmed, he could advise the staff at Allied hospitals on proper treatment for mustard exposure and try to reduce the number of deaths.

Instead of bringing matters to a close, however, Alexander’s discovery that mustard gas had come from the Allies’ own supply had made a difficult job that much more complicated. The British port officials’ attempts to obfuscate rankled, but that paled in comparison to their effort to shift responsibility to the Luftwaffe. It was not a harmless fabrication. Alexander shuddered to think about the “grave political implications.” He later recalled thinking, “If they were going to accuse the Germans of dropping mustard when the Germans had not…”

Earlier that year, President Roosevelt had issued a stern warning that any Axis use of chemical weapons would be followed by the “fullest possible retaliation.” The significance of “any error in interpreting the factor of and source of mustard gas in Bari,” Alexander recalled, “was horrendous.” If the Allied leaders drew the faulty conclusion that the enemy had deployed chemical weapons, it could trigger widespread chemical warfare.

Adding to his anxiety, the daily death toll from mustard contamination, which had started to decline, suddenly spiked, demonstrating the secondary effects of pneumonia on patients already weakened by chemical exposure. There seemed no way to predict how many more men would die.

Nine days after the bombing, Alexander gave his initial findings to AFHQ in Algiers. “The burns in the hospitals in this area labeled ‘dermatitis N.Y.D.’ are due to mustard gas,” he asserted. “They are unusual types and varieties because most of them are due to mustard which has been mixed into the surface oil in the harbour.”

A survivor of the Bari attack. Widespread symptoms of contamination quickly led Stewart Alexander to deduce that poison gas had become mixed in the harbor water. (Stewart F. Alexander Papers)

Alexander felt growing urgency that his diagnosis be recognized at the highest levels. Some British medical personnel appeared to be waiting for an official stamp of approval before implementing his treatment strategies. More important, there could be no misunderstanding the source of the mustard. He sent high-priority cables to both the American president and the British prime minister, informing them of the nature of the casualties at Bari and the almost certain origin of the gas on an American Liberty ship. Roosevelt appeared to accept his findings, and responded: “Please keep me fully informed.”

Churchill, however, sent a terse reply: He did not believe there was mustard gas in Bari.

Alexander was speechless. He admired Churchill, and he speculated that the British leader’s overriding concern was that the Allies “not acknowledge we had poison gas in that theater of operation because if the Germans retaliated they would be dropping poison gas on England.” There was no questioning the wisdom of this command decision, but Churchill’s opposition undermined Alexander’s credibility and ability to do his job.

Alexander sent a second telegram. He cited his findings at much greater length, stating “beyond any doubt” that these casualties were due to mustard exposure. He was informed that Churchill maintained that “the symptoms do not sound like mustard gas,” which Churchill had witnessed firsthand during World War I. His instructions were the same: “The doctor should reexamine his patients.”

Flummoxed, and unsure how a “lowly, lonely American medical officer” was supposed to respond, Alexander appealed to the liaison officer for advice. The man advised him: One did not argue with the prime minister.

After a sleepless night, Alexander returned early to the hospital determined to prove there had been no mistake about his diagnosis. Churchill was a brilliant man, with an uncanny instinct for the salient fact, and he had put his finger on the most important question about the Bari victims: Why were the toxic effects so much more serious than any other recorded in military history? Far more patients were dying of mustard symptoms at Bari than on the battlefields of WWI, when the fatality rate had been around 2 percent. The death rate in Bari was more than six times higher—and climbing.

The difference, he believed, was the amount of mustard absorbed through the skin from the unprecedented, intimate and lengthy contact as a result of being immersed in the oily harbor water, and then left to sit in soaked uniforms. “In this group of cases,” Alexander postulated, “the individuals, to all intents and purposes, were dipped into a solution of mustard-in-oil, and then wrapped in blankets, given warm tea, and allowed a prolonged period for absorption.”

Alexander’s medical inquiry into mustard’s effects on the victims was just beginning. As he sat reviewing the case sheets and pathology reports, one recurring observation leapt out at him: the devastating effects on the patients’ white blood cells. He flipped through a stack of records. There it was again and again—the white blood cell counts fell off sharply. In patients who recovered, white blood cell concentrations corrected by the second or third day, but in some cases the white blood cell count dropped precipitously beginning on the third or fourth day. He noted that lymphocytes, the white blood cells found in the lymph organs and of importance to the immune system, “were the first to disappear.” What he was looking at made the hair on the back of his neck stand on end. Alexander had seen these exact results before, but never in human beings.

In March 1942, the authorities at Edgewood, having received the nitrogen mustard compounds smuggled out of Germany, turned the samples over to Alexander to investigate their impacts on the body. Alexander and his colleagues immediately began detailed experimental protocols on animals. The first studies, which recorded the effects of exposure on the skin, eyes and respiratory tracts of rabbits, showed results that were completely in line with exposure to sulfur mustard in the past and with what was expected from a highly toxic agent of this kind.

Next, they set up an experiment to determine the effects on the blood and blood-forming organs. Twenty healthy rabbits were exposed to lethal doses of the agent. To the research team’s astonishment, the white blood cell count of the rabbits dropped to zero or points very close to zero. No one at the lab had ever seen such rapid destruction of white blood cells and the accompanying deterioration of lymph nodes and bone marrow. The researchers consulted the literature and found no reports of the same kind of reduction of white cells in the blood, known as leucopenia, or anything that had the same effect. Alexander’s first thought was that they must have a “bad batch of rabbits.” But when they repeated the experiment with a new group, the results were the same.

The first chemotherapy based on nitrogen mustard was approved in 1949. Several chemotherapeutic drugs based on Alexander’s research remain in wide use today. (Richard Lautens / Toronto Star via Getty Images)

Alexander ordered the tests repeated with other lab animals to rule out the possibility of poor stock or species sensitivity. They tried guinea pigs, rats, mice and goats. Each time, they achieved the same dramatic effects: sudden, severe leucopenia, severe lymphopenia, lymph node depletion and marrow depression. After exposure, the white blood cell counts rapidly disappeared, and the lymph nodes were almost completely dissolved, left as “shrunken little shells” of what they had been.

While still at Edgewood, Alexander was fascinated by the idea that mustard interfered with the body’s mechanism for producing blood cells, especially white blood cells. Because of the dramatic and reproducible effects, he could not help but wonder about the possibility of using the compounds directly, or in modified forms, on human beings with diseases of the blood. If nitrogen mustard attacked white blood cells, perhaps carefully chosen dosages could be used to control leukemia, the most common type of cancer in children, marked by unrestrained white blood cell growth, by destroying some but not all of the excess cells without killing the patient. But when Alexander proposed an ambitious set of experiments into mustard’s medicinal properties, he was told first by his chief, and then, on appeal, by the National Research Council, that this was not the remit of the Edgewood laboratory. There was not enough time or money to pursue collateral lines of investigation that did not facilitate the national defense. He was ordered to put the project aside and return to his work on mustard casualty management, treatment and decontamination. Chasing miracle cures would have to wait until after the war.

Now, sitting in an Allied military hospital 6,000 miles away, not even two years later, Alexander held in his hands incontrovertible evidence: “mustard gas did, in truth, selectively destroy blood cells and blood-forming organs,” he wrote. Doctors and medical researchers had never before encountered such an extraordinary level of sulfur mustard toxicity, which, when it mixed with the oil dumped into Bari Harbor, approximated the damage done by the experimental nitrogen mustard compounds—and allowed its systemic effects to be seen clearly for the first time. It had taken a freak accident, and the massive exposures of wartime, to verify in people the phenomenon demonstrated in laboratory rabbits. “It all added up to the same conditions I had seen in my prewar animal work,” Alexander later recalled. “Blood cells disappeared, and lymph nodes just melted away.” He remembered thinking, “If nitrogen mustard could do this, what could it do for a person with leukemia or lymphosarcoma?”

Alexander could not save the worst of the Bari mustard gas casualties, he knew, but perhaps he could make their deaths count for something. A one-in-a-million chance had landed him, one of the few doctors in the world who had studied mustard’s curative potential, in the middle of a disaster with a morgue full of case studies. It was an unthinkably rare chance to perform a pioneering investigation into the toxin’s biological effects on the human body—the kind that would be impossible with living volunteers.

He ran down the hall, yelling for more blood tests. He made sure special care was taken in preparing specimen samples to send to Edgewood for microscopic examination, and improvised a fixative solution, hoping the tissue specimens would withstand the long journey. The hematological analysis would not be as complete as he would like. The heavy burden carried by Allied combat hospitals, and the limited facilities, would prevent them from conducting important tests, including studies of bone marrow and blood chemistry. Alexander would need to be scrupulous in gathering as much data as possible, and in badgering lab technicians to do what he felt was necessary. This time, he wanted to make absolutely sure that his insight into the systemic effects of mustard was entered into the medical record, with an eye toward seeing whether the substance could be used not to destroy, but to heal.

On December 27, 1943, Lt. Col. Stewart Alexander submitted his preliminary report on his ten-day investigation of the Bari Harbor catastrophe. It was immediately classified. Eisenhower and Churchill acted in concert to keep the findings secret so there was no chance Hitler could use the incident as an excuse to launch a gas offensive. Any mention of mustard gas was stricken from the official record, and the medical staff of the British hospitals in Bari were instructed to alter the patients’ charts. Alexander’s diagnosis of toxic exposure was deleted and replaced with the generic terminology for combat casualties—burns, lung complications, all other injuries and deaths “due to enemy action.”

The feared German chemical attack never came. The Wehrmacht was deterred by logistical constraints, combined with Allied air superiority and the threat of massive retaliatory strikes. Ironically, the Germans had known all along about the source of the poison gas in the harbor. Nazi spies in the port had suspected that the Allies might be concealing mustard bombs among the munitions they were stockpiling in Italy. After the air strike, they sent down their own diver, an Italian frogman loyal to the Fascists, who recovered a fragment of an M47 bomb casing, which confirmed the chemical weapons were American.

British officials never acknowledged Alexander’s Bari report, but it garnered high praise from Eisenhower’s senior medical advisers. They lauded the exceptional job Alexander had done under challenging conditions, but told him that a commendation was withheld for fear of “offending the Prime Minister.” Nevertheless, Col. Cornelius P. “Dusty” Rhoads, chief of the Medical Division of the Chemical Warfare Service, hailed Alexander’s meticulous investigation as so complete, and of such immense value to medicine, that it represented almost “a landmark in the history of mustard poisoning.”

Rhoads was eager to explore the toxic agent’s therapeutic potential. Like Alexander, he believed the Bari data pointed the way toward a promising new chemical targeting white blood cells that could be used as a weapon in the fight against cancer. Rhoads, who in civilian life was head of New York’s Memorial Hospital for the Treatment of Cancer and Allied Diseases, seized on the wealth of new information provided by the Bari victims as a breakthrough. His ambitious plans for Memorial Hospital now converged with Alexander’s report and crystallized into a single mission—to exploit military research into poison gas to find a chemical that could selectively kill cancer cells.

Cornelius “Dusty” Rhoads, center, former medical chief of the Chemical Warfare Service and director of the Sloan Kettering Institute for Cancer Research. (Courtesy Memorial Sloan Kettering Cancer Center)

Armed with the Bari report, and the results of a top-secret Yale University trial that demonstrated for the first time that a regimen of intravenous nitrogen mustard—in tiny, carefully calibrated doses—could result in human tumor regression, Rhoads went in search of funding to develop this experimental treatment, known today as chemotherapy. He persuaded Alfred P. Sloan Jr., the chairman of General Motors, along with the company’s wizard engineer, Charles F. Kettering, to endow a new institute that would bring together leading scientists and physicians to make a concentrated attack on cancer. On Tuesday, August 7, 1945, the day the world learned that an atom bomb had been dropped on Japan, they announced their plans for the Sloan Kettering Institute for Cancer Research. World War II was over, but the war on cancer had just been launched.

The official secrecy surrounding the Bari disaster continued for decades. The military refused to acknowledge the chronic effects of mustard exposure on hundreds of surviving sailors, naval personnel and civilians, resulting in years of suffering, controversy and lawsuits for medical compensation in both the United States and Britain. In 1961, Alexander volunteered to help the National Academy of Sciences conduct a study of the American survivors, but the project stalled when identifying victims of contamination proved too difficult. “All the records said ‘burns due to enemy action,’” recalled Alexander.

Alexander was discharged from the Chemical Warfare Service in June 1945, and returned home with a chest full of medals and battle ribbons, as well as a new bride, Lt. Col. Bernice “Bunny” Wilbur, the highest-ranking Army nurse in the Mediterranean Theater. He turned down Rhoads’ offer to work at the fledgling Sloan Kettering Institute. Instead, he kept his promise to his father to continue their family practice in Park Ridge, New Jersey, where he became a much beloved physician and cardiologist, and where he raised two daughters with Bunny. He served for 18 years as director of the Bergen Pines County Hospital, and taught at the medical schools of Columbia and New York University. He never boasted of his wartime exploits, but he always took quiet pride in his unique contribution to medicine, and did not mind that while many textbooks eventually traced the modern age of chemotherapy to the Bari disaster, the details of his investigation remained enshrouded in secrecy. He died on December 6, 1991, of a malignant melanoma—skin cancer—but not before the U.S. Army belatedly commended him, three years earlier, for his actions during the Bari episode. “Without his early diagnosis and rapid initiation of appropriate and aggressive treatment, many more lives would have been lost and the severity of injuries would have been much greater,” the commendation read. “His service to the military and civilians injured during this catastrophe reflects the finest measure of a soldier and physician.”

Adapted from The Great Secret: The Classified World War II Disaster That Launched the War on Cancer, by Jennet Conant. Copyright © 2020 by Jennet Conant. Used by permission of W. W. Norton & Company, Inc.


Inventing Reasons for War

There had to be a pretext for starting a war, a cloak to cover naked British aggression. Sihayo kaXongo, a Zulu border chief, had the misfortune of having adulterous wives, and his domestic difficulties provided Frere with an excuse for war. Two of the wives fled with their lovers into Natal, but the British colony did not prove a refuge. Mehokazulu, one of Sihayo’s sons, took a party that crossed the border, tracked the fugitives down, and dragged them back for execution. It was said the adulterous wives were clubbed to death.

The incident gave Frere two “reasons” for war. First, Mehokazulu had been guilty of violating the border, “invading” Natal with a force of indeterminate size. The wives had been killed without trial or due process, another “violation” of British—though not Zulu—moral principles. Over the years European missionaries in Zululand had complained of Cetshwayo’s rule, generally denouncing him as a bloodthirsty tyrant who arbitrarily killed his victimized subjects. These tales, of course, played into Frere’s hands.

By the fall of 1878 Frere’s statements were becoming more shrill and outrageous. He spoke darkly of Cetshwayo’s “faithless and cruel character” and “atrocious barbarity,” even though he had never met the king and most of the stories were hearsay. In truth Cetshwayo wanted peace with the British. He knew that Queen Victoria’s empire, the realm of the “Great White Queen,” stretched around the globe. Much of the misunderstanding stemmed from cultural, not political, differences. The king did execute people on occasion, but such “barbarities” were well within the norms of Zulu society.


NGO Contributions to Community Health and Primary Health Care: Case Studies on BRAC (Bangladesh) and the Comprehensive Rural Health Project, Jamkhed (India)

Non-governmental organizations (NGOs) working in developing countries are chiefly a post-World War II phenomenon. Though they have made important contributions to health and development among impoverished people throughout the world, the documentation of these contributions has been limited. Even though BRAC and the Jamkhed Comprehensive Rural Health Project (CRHP) are but two of 9.7 million NGOs registered around the world, they are unique. Established in 1972 in Bangladesh, BRAC is now the largest NGO in the world in terms of population served, reaching 130 million people in 11 different countries. Its programs are multi-sectoral but focus on empowering women and improving the health of mothers and children. By generating income through its own social enterprises, BRAC is able to cover 85% of its $1 billion budget from self-generated funds. This innovative approach to funding has enabled BRAC to grow and to sustain that growth as its social enterprises have also prospered. The Jamkhed CRHP, founded in 1970 and located in the Indian state of Maharashtra, is notable for its remarkable national and global influence. It is one of the world’s early examples of empowering communities to address their health problems and the social determinants of those problems, in part by training illiterate women to serve as community health workers. The Jamkhed CRHP served as a major influence on the vision of primary health care that emerged at the 1978 International Conference on Primary Health Care at Alma-Ata, Kazakhstan. Its Institute for Training and Research in Community Health and Population has provided on-site training in community health for 45,000 people from 100 different countries. The founders’ book, Jamkhed: A Comprehensive Rural Health Project, which describes this pioneering approach, has been translated into five languages beyond English and is one of the most widely read books on global health. These two exemplary NGOs provide a glimpse of the breadth and depth of NGO contributions to improving the health and well-being of impoverished people throughout the world.

Introduction

An unprecedented amount of progress in human development has been seen since the 1990s. Barring a few unfortunate exceptions, such progress has been experienced across the world by most nations and populations, including the low-income countries. The Millennium Development Goals (MDGs) were the first expression of the determination of the comity of nations, and they helped to raise interest in, and to target resources around, a selection of development goals for every country. By 2015, the MDG movement had turned out to be an enormous success, with great improvement recorded for most countries around the world. According to a United Nations report:

Remarkable gains have been made in the reduction of extreme poverty, increasing primary education access in the developing regions, ensuring gender parity in schools, improving health and disease outcomes, and access to improved sources of water.

This progress was possible because of the determination of different stakeholders to achieve the desired results. These stakeholders included governments, civil society, development partners, UN agencies, non-governmental organizations (NGOs), academia, the media, and, most importantly, the people themselves. NGOs have played an important role in this journey, and in some countries their role has been extraordinary. This article analyzes the contributions of NGOs with particular reference to community health and primary health care. Starting with a short history of NGO engagement in development work, it discusses the many diverse roles they play, the sources of NGO strength, their “love/hate” relationships with governments, and the constraints NGOs often face in carrying out their mission.

In this article we first provide a brief overview of NGOs in development and health. We then describe the work of NGOs in Bangladesh and their contribution to national health and development; NGOs in Bangladesh are noted for their effective and sustained work, and we highlight the example of BRAC, Bangladesh’s largest and most celebrated NGO, with an emphasis on its health-related programs. Finally, we turn to one of the world’s pioneering and most influential NGOs in community health and primary health care—the Comprehensive Rural Health Project (CRHP), Jamkhed, in rural Maharashtra, India.

This article is not a comprehensive review of NGOs globally; such a task would be enormous and beyond the scope of what the authors could accomplish. Since the beginning of the MDG era, much attention has been given to development in Africa, where NGOs are playing an increasingly important role; unfortunately, that topic is not covered here. The article also suffers from its emphasis on two NGOs—one of which has become the largest NGO in the world, and the other of which is now one of the oldest and most influential. This is largely because of the authors’ own first-hand knowledge of them. It is acknowledged that there are many other NGOs in South Asia and elsewhere—large and small—that are highly effective. The goal here is to present BRAC and CRHP Jamkhed as noteworthy examples of what the broader NGO community has contributed to development and health, with a particular focus on community empowerment, community health, and primary health care.

A Historical Overview of NGOs in Development and Health

“When someone perceives a need, an NGO is likely to follow.”

Fox (1987)

What is meant by NGOs? They have been defined in various ways. According to Edwards and Hulme (1996), NGOs are “intermediary organizations engaged in funding or offering other forms of support to communities and other organizations that seek to promote development.” The World Bank has defined NGOs as “private organizations that pursue activities to relieve suffering, promote the interests of the poor, protect the environment, provide basic social services, or undertake community development” (Delisle, Roberts, Munro, Jones, & Gyorkos, 2005). Wikipedia has defined NGOs in the following way:

[NGOs] are usually non-profit and sometimes international organizations independent of governments and international governmental organizations (though often funded by governments) that are active in humanitarian, educational, health care, public policy, social, human rights, environmental, and other areas to effect changes according to their objectives . . . Sometimes the term is used as a synonym of “civil society organization” to refer to any association founded by citizens.

International NGOs working to promote global development are chiefly a post-World War II phenomenon. NGOs active before World War II were mostly Christian mission organizations, such as the Salvation Army or the Catholic Relief Services (Fox, 1987). Immediately following World War II, a new generation of NGOs emerged that were secular in nature and initially provided relief in war-torn Europe and later in low-income countries. CARE (originally the Cooperative for American Remittances to Europe), for example, began as a grassroots program to send packages of food to poverty-stricken Europeans after World War II. Although the NGO movement started in the high-income countries, NGOs’ active presence is now seen and felt in most countries across the world, particularly in low-income countries. Even countries following communist ideologies, such as China and Vietnam, are no exception. Since the beginning of the current century, NGOs born in the South (that is, in the developing countries) have started extending their footprints into other Southern countries, replicating their successful experiences. BRAC, as we will discuss, is one such NGO.

As governments in the North began to recognize the limits of UN agencies and of the governments of the South (especially those of newly independent countries), they started directing their development assistance through NGOs. These governments and their citizens “turned to NGOs out of pragmatic considerations, seeing them as more efficient conduits for development inputs than the often discredited official agencies” (Masoni, 1985). Over the past few decades, increasing numbers of NGOs have been established by citizens in low-income countries to serve unmet local needs. These are now referred to as national NGOs. Poverty, disasters, war, and other misfortunes provided fertile ground for NGOs to flourish in low-income countries. As Fox (1987) said, “when someone perceives a need, an NGO is likely to follow.”

In 2002, according to the United Nations (UNDP, 2002), there were 37,000 NGOs in the world. According to more recent estimates, there are now 9.7 million registered NGOs in 196 countries (Quora, 2018). Most of the NGOs working in developing countries are based in rural areas and urban shanty towns and slums, which are often unserved or only minimally served by the existing government service structure. Through their grassroots origins, NGOs understand the problems of development more clearly than other agencies. As Drabek (1987) said, “the development failures of the past have revealed that to pour money into dealing with the symptoms of poverty is not enough—it is the underlying problems of poverty which require actions.” As such, NGOs are involved in almost everything that is known to cause poverty—lack of access to health care, lack of education, undernutrition, lack of decent jobs, gender discrimination and disempowerment of women, exclusion of the poor from access to fair market transactions, and lack of institutions that promote the common good. NGOs began to flourish in the 1980s because of the severe poverty and unmet needs of poor people around the world and also because of the dysfunction, as well as the lack of commitment, of government and other segments of society in addressing these needs (Abed & Chowdhury, 1989). Such a stalemate called for “means of getting benefits more directly and cheaply to the poor than governments have been able to accomplish on their own” (Korten, 1987).

In the health sector, for example, NGOs are considered to successfully implement effective programs where governments have failed. Why? According to Chatterjee (1988), “flexibility and dynamism, dedicated leadership, professionalism, intensive management, and people’s participation” are some of the reasons. Appreciating the contributions of NGOs in Pakistan, a government representative stated that “health promotion and health education is their art, because they are rooted into the societies by virtue of their work and because they enjoy a better rapport and trust of the community” (Ejaz, Shaikh, & Rizvi, 2011).

NGOs have been playing an increasingly influential role in global policy-making. In a position paper prepared on behalf of NGOs for the International Conference on Primary Health Care in Alma-Ata, the World Federation of Public Health Associations (WFPHA) endorsed the concept of primary health care as envisaged by WHO and UNICEF back in 1978 by emphasizing the role that NGOs could play in taking the primary health care agenda forward. The paper said:

NGOs provide important links between community and government. They possess certain strengths and characteristics that enable them to function as effective agents in this process. In addition, they have exhibited a special capacity to work within the community in response to expressed needs

The paper went on to emphasize the role of human development and multi-sectoral engagement and the centrality of poverty elimination for health development:

[NGOs] support the view that the promotion of primary health care must be closely tied to a concern for total human development. The totality of human development and, in fact, a holistic view of health encompasses the physical, mental, social, and spiritual well-being of the individual

How large should an NGO aspire to become? There is no consensus on the optimal size of an NGO. Traditionally they have been small and have worked with small populations. Today, however, many work nationally, such as the Self Employed Women’s Association (SEWA) in India and The AIDS Support Organisation (TASO) in Uganda. Others work at the multinational regional level, such as the African Medical and Research Foundation (AMREF) in East Africa. Others work globally, such as BRAC, Save the Children, CARE, Oxfam, and Doctors without Borders. Advocates of small-scale NGOs maintain that an NGO should be kept small in order to remain effective and efficient, which helps it stay politically independent and allows it to be innovative and cost-efficient. Annis (1987), on the other hand, challenged such contentions, saying that, in the face of pervasive poverty, “‘small-scale’ can merely mean insignificant, ‘politically independent’ can mean powerless or disconnected, ‘low cost’ can mean underfinanced or poor quality, and ‘innovative’ can mean simply temporary or unsustainable.” Sir Fazle Hasan Abed, founder and chairperson of BRAC, has famously said:

Small is beautiful. Big is necessary.

An Overview of NGO Work in Bangladesh

Bangladesh has been a “positive deviant” among the low- and middle-income countries, in the sense that its achievements in addressing poverty and improving health have been extraordinary relative to its peers, given its severely impoverished condition at the time of independence in 1971. Reviewing the NGO contributions to Bangladesh’s development success vis-à-vis the state, White (1999) said of Bangladesh:

But it is not the state, which still receives over 85 per cent of total Overseas Development Assistance, which is responsible for this international acclaim. Instead it is actors outside the state . . . who produce programmes that are widely praised as models to be replicated.

Rahman and Khan (2017), in a recent conference presentation, analyzed the growing role and increasing influence of NGOs in Bangladesh. They quoted statistics released by the government’s NGO Affairs Bureau indicating that there are 2,533 registered NGOs that receive international donor support. Of these, 10% (253) are foreign international NGOs and the rest are national. This does not include thousands of other NGOs that are registered under the government’s Social Welfare Ministry, do not receive foreign donations, and are thus not registered with the NGO Affairs Bureau. The authors classified the work of NGOs in Bangladesh under six different categories as follows:


The 24 biggest disasters and tragedies the London Underground has ever seen

Just like any train network anywhere in the world, the London Underground is sometimes hit by disaster and tragedy.

The Tube network is an incredible gift that most of us probably don't appreciate, but there's no doubting how much we need it when it carries one billion passengers a year.

There is one fatal accident for every 300 million journeys made. Of these, only five passenger deaths have been caused by train operation in the nearly 80 years since the London Passenger Transport Board was established.

All other fatalities have been caused by wartime and terrorist bombings, as well as station fires.

1. Charing Cross - 1938

Not one but two accidents happened in the same year near Charing Cross station, which is where Embankment now is. On March 10, two Northern Line trains crashed between Waterloo and the station, and 12 people suffered minor injuries. In the May 17 crash, two District Line trains collided and six people were killed.

2. Bounds Green - 1940

It was in the heart of the War, on October 13 1940, that a German bomb fell at Bounds Green station and 16 people were killed.

3. Balham - 1940

A bomb fell in the road above the station on October 14, the blast big enough to penetrate into the tunnel nine metres below.

The water mains and sewage pipes broke, causing flooding, and a total of 68 people died, including 64 shelterers and four railway staff.

4. Bank - 1941

On January 11, in a similar tragedy, 56 people were killed when a bomb hit the Central Line ticket hall of Bank station and the road collapsed into the station.

5. Bethnal Green - 1943

Again in the war, 173 people died when a crowd rushed into the station thinking they had heard bombing. A woman tripped, causing many others to fall, and around 300 people were crushed in the stairwell.

6. Northwood crash - 1945

On December 31, 1945, two trains crashed on the Metropolitan line because of fog.

The driver of the second train had passed a danger signal under the Stop and Proceed rule but didn't see the preceding train soon enough to stop.

A fire, started by electrical arcing, broke out and, unfortunately, three people were killed.

7. Edgware buffer stop collision - 1946

Not long after, on July 27, 1946, a Northern Line train hit the buffers at Edgware. No passengers were killed. The driver died, but it was found he had suffered a heart attack before the collision.

8. Stratford crash - 1953

In a horrific turn of events, on April 8, two Central Line trains crashed in a tunnel during disruption caused by a signal failure. Twelve people lost their lives.

9. Holland Park train fire - 1958

A fire occurred on a Central Line train after electrical short circuits caused arcing. The carriages had to be evacuated, and both passengers and crew members suffered from smoke inhalation. Sadly, one passenger died.

10. Redbridge train fire - 1960

In a very similar but not quite so severe train fire two years later, this time in East London, no one died, but people did suffer from smoke inhalation.

11. Neasden crash - 1968

The crash happened in North West London when a ballast train passed three signals at danger and collided with the back of a Bakerloo Line passenger train standing at the platform. The driver of the ballast train died before he could be cut free. No one else lost their lives, but an inspector and the train's guard were rushed to hospital.

12. Moorgate collision - 1975

On February 28, 1975, a complete tragedy hit the London Underground, ending in 43 deaths.

A southbound Northern City Line train crashed into the tunnel end beyond the platform at Moorgate station in the City. It was the greatest loss of life on the Underground in peacetime.

The driver was one of those who died, so the cause of the accident was never fully determined and an accidental death verdict was recorded at the official inquest.

13. Oxford Circus fire - 1984

In another train fire, flames started raging inside the station at 9.50pm on November 23 and were not extinguished until 3am the next day. Fourteen people suffered smoke inhalation but luckily no one died. It is thought the fire was caused by a smoker's materials being pushed through a ventilation grille into the materials store, which caught alight.

14. Kilburn crash - 1984

On December 11 a northbound Metropolitan line train passed a signal at danger in fog. The driver reset the controls and moved forward but was killed when his train crashed into a train stopped in front.

15. King&aposs Cross fire - 1987

A huge fire broke out in King's Cross St Pancras station on November 18, in which 31 people died from the toxic fumes and the extreme heat. It was caused by a discarded match or cigarette igniting debris and grease beneath the wooden escalators. The incident caused the widely ignored smoking ban to be more rigorously enforced, and eventually the wooden escalators were replaced.

16. Gunnersbury Triangle - 1999

On April 24 a District Line train derailed soon after leaving Gunnersbury on its way to Richmond, at the Gunnersbury Triangle junction points where the line diverged from the Railtrack route of the North London Line. The trailing car derailed and ended up strewn across the track. Luckily no one was seriously hurt.

17. Chancery Lane - 2003

A Central Line train derailed on January 25 at Chancery Lane and 32 passengers were injured after a motor became detached from the train. The entire Central line and the Waterloo & City line were closed for around three months while the cause of the failure was determined and rectified, as both lines used 1992 stock trains.

18. Hammersmith derailment - 2003

In the same year, on October 17, the last carriage of a six-car eastbound Piccadilly line train came off the rails east of Hammersmith station due to a broken rail, but luckily none of the 70 passengers was hurt.

19. Camden Town derailment - 2003

Just two days later, on October 19, the last carriage of a Northern Line train derailed as it approached Camden Town station. Seven passengers were injured; six of the injuries were minor and one passenger suffered a broken femur.

20. White City derailment - 2004

Then on May 11 the following year, the leading bogie of the seventh car of a Central Line train came off the tracks on the approach to White City, but fortunately, once again, no one on board was injured.

21. 7 July 2005 London bombings

Often referred to as the 7/7 bombings, this tragedy was one that shook the nation. In a series of coordinated terrorist attacks on London's public transport, 56 people died, including the four bombers who carried out the attacks. A further 784 people were injured, and the attacks hit London Underground trains as well as a London bus.

22. Mile End derailment - 2007

On July 5 two cars of a westbound Central Line train derailed at 65km/h between Bethnal Green and Mile End stations. Some 520 passengers were trapped underground for two hours until they were escorted from the derailed train, forming lines and following each other along the tracks all the way to Mile End station.

Eight people needed hospital treatment and a further 13 were treated at the scene for minor injuries. Most of the injuries were sustained on the walk to the station.

23. Runaway train - 2010

Early in the morning of August 13 a broken-down track machine became uncoupled from the locomotive towing it and rolled southwards from Archway station. The runaway train passed through several stations until it reached Warren Street, where a rising gradient finally brought it to a stop.

Then on February 28, 2013, London Underground, Tube Lines and the German company Schweerbau were each fined £100,000 at the Old Bailey for health and safety breaches.

24. Passenger dragged at Holborn station - 2014

In the most recent tragedy recorded on the London Underground, on February 3, a passenger had to be rushed to hospital after being dragged along the platform by a departing Piccadilly line train when her scarf got caught in a closing Tube door.



1880s–1924: The origin of IBM

The roots of IBM date back to the 1880s, tracing from four predecessor companies: [8] [9] [10] [11]

  • The Bundy Manufacturing Company was the first manufacturer of time clocks. The company was founded in 1889 by Harlow Bundy in Binghamton, New York.
  • The Tabulating Machine Company was the first manufacturer of punch card based data processing machines. Herman Hollerith started building the machines as early as 1884, and founded the Tabulating Machine Company in 1896 in Washington, D.C.
  • The International Time Recording Company was founded in 1900 by George Winthrop Fairchild in Jersey City, New Jersey, and reincorporated in 1901 in Binghamton. The company relocated in 1906 to nearby Endicott, New York.
  • The Computing Scale Company of America was founded in 1901 in Dayton, Ohio.

On June 16, 1911, these four companies were amalgamated into a new holding company named the Computing-Tabulating-Recording Company (CTR), based in Endicott. [12] [13] [14] [15] The amalgamation was engineered by noted financier Charles Flint. Flint remained a member of the board of CTR until his retirement in 1930. [16] At the time of the amalgamation, CTR had 1,300 employees and offices and plants in Endicott and Binghamton, New York; Dayton, Ohio; Detroit, Michigan; Washington, D.C.; and Toronto, Ontario.

After amalgamation, the individual companies continued to operate using their established names, as subsidiaries of CTR, until the holding company was eliminated in 1933. [17] The divisions manufactured a wide range of products, including employee time-keeping systems, weighing scales, automatic meat slicers, coffee grinders, and punched card equipment. The product lines were very different; Flint stated that the "allied" consolidation:

. . . instead of being dependent for earnings upon a single industry, would own three separate and distinct lines of business, so that in normal times the interest and sinking funds on its bonds could be earned by any one of these independent lines, while in abnormal times the consolidation would have three chances instead of one to meet its obligations and pay dividends. [18]

Of the companies amalgamated to form CTR, the most technologically significant was The Tabulating Machine Company, founded by Herman Hollerith, which specialized in the development of punched card data processing equipment. Hollerith's series of patents on tabulating machine technology, first applied for in 1884, drew on his work at the U.S. Census Bureau from 1879–82. Hollerith was initially trying to reduce the time and complexity needed to tabulate the 1890 Census. His development of punched cards in 1886 set the industry standard for the next 80 years of tabulating and computing data input. [19]

In 1896, The Tabulating Machine Company leased some machines to a railway company [20] but quickly focused on the challenges of the largest statistical endeavor of its day – the 1900 US Census. After winning the government contract, and completing the project, Hollerith was faced with the challenge of sustaining the company in non-Census years. He returned to targeting private businesses in the United States and abroad, attempting to identify industry applications for his automatic punching, tabulating and sorting machines. In 1911, Hollerith, now 51 and in failing health, sold the business for $2.3 million (of which he received $1.2 million) to Flint, who then founded CTR. When the diversified businesses of CTR proved difficult to manage, Flint turned for help to the former No. 2 executive at the National Cash Register Company (NCR), Thomas J. Watson, Sr. Watson became General Manager of CTR in 1914 and President in 1915. By drawing upon his managerial experience at NCR, Watson quickly implemented a series of effective business tactics: generous sales incentives, a focus on customer service, an insistence on well-groomed, dark-suited salesmen, and an evangelical fervor for instilling company pride and loyalty in every worker. As the sales force grew into a highly professional and knowledgeable arm of the company, Watson focused their attention on providing large-scale tabulating solutions for businesses, leaving the market for small office products to others. He also stressed the importance of the customer, a lasting IBM tenet. The strategy proved successful, as, during Watson's first four years, revenues doubled to $2 million, and company operations expanded to Europe, South America, Asia, and Australia.

At the helm during this period, Watson played a central role in establishing what would become the IBM organization and culture. He launched a number of initiatives that demonstrated an unwavering faith in his workers. He hired the company's first disabled worker in 1914, he formed the company's first employee education department in 1916 and in 1915 he introduced his favorite slogan, "THINK", which quickly became the corporate mantra. Watson boosted company spirit by encouraging any employee with a complaint to approach him or any other company executive – his famed Open Door policy. He also sponsored employee sports teams, family outings, and a company band, believing that employees were most productive when they were supported by healthy and supportive families and communities. These initiatives – each deeply rooted in Watson's personal values system – became core aspects of IBM culture for the remainder of the century.

"Watson had never liked the clumsy hyphenated title of the CTR" and chose to replace it with the more expansive title "International Business Machines". [21] First as a name for a 1917 Canadian subsidiary, then as a line in advertisements. Finally, on February 14, 1924, the name was used for CTR itself.

Key events

  • 1890-1895: Hollerith's punched cards used for 1890 Census. The U.S. Census Bureau contracts to use Herman Hollerith's punched card tabulating technology on the 1890 United States Census. That census was completed in six years and estimated to have saved the government $5 million. [22] The prior census, in 1880, had required eight years. The years required are not directly comparable; the two differed in population size, data collected, resources (census bureau headcount, machines, etc.), and reports prepared. The total population of 62,947,714 (the family, or rough, count) was announced after only six weeks of processing (punched cards were not used for this tabulation). [23][24] Hollerith's punched cards become the tabulating industry standard for input for the next 70 years. Hollerith's The Tabulating Machine Company is later consolidated into what becomes IBM.
  • 1906: Hollerith Type I Tabulator. The first tabulator with an automatic card feed and control panel. [25]
  • 1911: Formation. Charles Flint, a noted trust organizer, engineers the amalgamation of four companies: The Tabulating Machine Company, the International Time Recording Company, the Computing Scale Company of America, and the Bundy Manufacturing Company. The amalgamated companies manufacture and sell or lease machinery such as commercial scales, industrial time recorders, meat and cheese slicers, tabulators, and punched cards. The new holding company, Computing-Tabulating-Recording Company, is based in Endicott. Including the amalgamated subsidiaries, CTR had 1,300 employees with offices and plants in Endicott and Binghamton, New York; Dayton, Ohio; Detroit, Michigan; and Washington, D.C. [26][27]
  • 1914: Thomas J. Watson arrives. Thomas J. Watson Sr., with a one-year jail sentence pending – see NCR – is made general manager of CTR. Less than a year later the court verdict was set aside. A consent decree was drawn up, which Watson refused to sign, gambling that there would not be a retrial. He becomes president of the firm on Monday, March 15, 1915. [28]
  • 1914: First disabled employee. CTR companies hire their first disabled employee. [29]
  • 1915: "THINK" signs. "THINK" signs, based on the slogan coined by Thomas J. Watson, Sr. while at NCR and promoted by John Henry Patterson (NCR owner) are used in the companies for the first time. [30]
  • 1916: Employee education. CTR invests in its subsidiary's employees, creating an education program. Over the next two decades, the program would expand to include management education, volunteer study clubs, and the construction of the IBM Schoolhouse in 1933. [31]
  • 1917: CTR in Brazil. Invited by the Brazilian Government to conduct the census, CTR opens an office in Brazil in 1917. [32]
  • 1920: First Tabulating Machine Co. printing tabulator. With prior tabulators the results were displayed and had to be copied by hand. [33]
  • 1923: CTR Germany. CTR acquires majority ownership of the German tabulating firm Deutsche Hollerith Maschinen GmbH (Dehomag).
  • 1924: International Business Machines Corporation. "Watson had never liked the clumsy hyphenated title of Computing-Tabulating-Recording Company" and chose the new name both for its aspirations and to escape the confines of "office appliance". The new name was first used for the company's Canadian subsidiary in 1917. On February 14, 1924, CTR's name was formally changed to International Business Machines Corporation (IBM). [21] The subsidiaries' names did not change; there would be no IBM-labeled products until 1933 (below), when the subsidiaries were merged into IBM.

1925–1929: IBM's early growth

Our products are known in every zone. Our reputation sparkles like a gem. We've fought our way through and new fields we're sure to conquer too. For the ever-onward IBM

Watson mandated strict rules for employees, including a dress code of dark suits, white shirts and striped ties, and no alcohol, whether working or not. He led the singing at meetings of songs such as "Ever Onward" from the official IBM songbook. [34] The company launched an employee newspaper, Business Machines, which unified coverage of all of IBM's businesses under one publication. [35] IBM introduced the Quarter Century Club, [36] to honor employees with 25 years of service to the company, and launched the Hundred Percent Club, to reward sales personnel who met their annual quotas. [37] In 1928, the Suggestion Plan program – which granted cash rewards to employees who contributed viable ideas on how to improve IBM products and procedures – made its debut. [38]

IBM and its predecessor companies made clocks and other time recording products for 70 years, culminating in the 1958 sale of the IBM Time Equipment Division to Simplex Time Recorder Company. [40] IBM manufactured and sold such equipment as dial recorders, job recorders, recording door locks, time stamps and traffic recorders. [41] [42]

The company also expanded its product line through innovative engineering. Behind a core group of inventors – James W. Bryce, Clair Lake, [43] Fred Carroll, [44] and Royden Pierce [45] – IBM produced a series of significant product innovations. In the optimistic years following World War I, CTR's engineering and research staff developed new and improved mechanisms to meet the broadening needs of its customers. In 1920, the company introduced the first complete school time control system, [46] and launched its first printing tabulator. [47] Three years later the company introduced the first electric keypunch, [48] and 1924's Carroll Rotary Press produced punched cards at previously unheard of speeds. [35] In 1928, the company held its first customer engineering education class, demonstrating an early recognition of the importance of tailoring solutions to fit customer needs. [49] It also introduced the 80-column punched card in 1928, which doubled its information capacity. [49] This new format, soon dubbed the "IBM Card", became and remained an industry standard until the 1970s.

Key events

  • 1925: First tabulator sold to Japan. In May 1925, Morimura-Brothers entered into a sole agency agreement with IBM to import Hollerith tabulators into Japan. The first Hollerith tabulator in Japan was installed at Nippon Pottery (now Noritake) in September 1925, making it IBM customer #1 in Japan. [50][51][52]
  • 1927: IBM Italy. IBM opens its first office in Italy in Milan, and starts selling and operating with National Insurance and Banks.
  • 1928: A Tabulator that can subtract, Columbia University, 80-column card. The first Hollerith tabulator that could subtract, the Hollerith Type IV tabulator. [53] IBM begins its collaboration with Benjamin Wood, Wallace John Eckert and the Statistical Bureau at Columbia University. [54][55] The Hollerith 80-column punched card is introduced. Its rectangular holes are patented, ending vendor compatibility with the prior 45-column card (Remington Rand would soon introduce a 90-column card). [56]

1930–1938: The Great Depression

The Great Depression of the 1930s presented an unprecedented economic challenge, and Watson met the challenge head-on, continuing to invest in people, manufacturing, and technological innovation despite the difficult economic times. Rather than reduce staff, he hired additional employees in support of President Franklin Roosevelt's National Recovery Administration plan – not just salesmen, for whom he joked he had a lifelong weakness, but engineers too. Watson not only kept his workforce employed, but he also increased their benefits. IBM was among the first corporations to provide group life insurance (1934), survivor benefits (1935), and paid vacations (1936). He further invested in his workforce by opening the IBM Schoolhouse in Endicott to provide education and training for IBM employees. And he greatly increased IBM's research capabilities by building a modern research laboratory on the Endicott manufacturing site.

With all this internal investment, Watson was, in essence, gambling on the future. It was IBM's first ‘Bet the Company’ gamble, but the risk paid off handsomely. Watson's factories, running full tilt for six years with no market to sell to, created a huge inventory of unused tabulating equipment, straining IBM's resources. To reduce the cash drain, the struggling Dayton Scale Division (the food services equipment business) was sold in 1933 to Hobart Manufacturing for stock. [57] [58] When the Social Security Act of 1935 – labeled as "the biggest accounting operation of all time" [59] – came up for bid, IBM was the only bidder that could quickly provide the necessary equipment. Watson's gamble brought the company a landmark government contract to maintain employment records for 26 million people. IBM's successful performance on the contract soon led to other government orders, and by the end of the decade, IBM had not only safely negotiated the Depression but risen to the forefront of the industry. Watson's Depression-era decision to invest heavily in technical development and sales capabilities, education to expand the breadth of those capabilities, and his commitment to the data processing product line laid the foundation for 50 years of IBM growth and successes.

His avowed focus on international expansion proved an equally key component of the company's 20th-century growth and success. Watson, having witnessed the havoc the First World War wrought on society and business, envisioned commerce as an obstacle to war. He saw business interests and peace as being mutually compatible. In fact, he felt so strongly about the connection between the two that he had his slogan "World Peace Through World Trade" carved into the exterior of IBM's new World Headquarters (1938) in New York City. [60] The slogan became an IBM business mantra, and Watson campaigned tirelessly for the concept with global business and government leaders. He served as an informal, unofficial government host for world leaders when they visited New York, and received numerous awards from foreign governments for his efforts to improve international relations through the formation of business ties.

Key events

  • 1931: The first Hollerith punched card machine that could multiply, and the first Hollerith alphabetical accounting machine. The multiplying machine was the Hollerith 600 Multiplying Punch. [61] The first alphabetical accounting machine, the Alphabetic Tabulator Model B, did not yet offer a complete alphabet and was quickly followed by the full-alphabet ATC. [56]
  • 1931: Super Computing Machine. The term Super Computing Machine is used by the New York World newspaper to describe the Columbia Difference Tabulator, a one-of-a-kind special purpose tabulator-based machine made for the Columbia Statistical Bureau, a machine so massive it was nicknamed Packard. [62][63] The Packard attracted users from across the country: "the Carnegie Foundation, Yale, Pittsburgh, Chicago, Ohio State, Harvard, California and Princeton." [64]
  • 1933: Subsidiary companies are merged into IBM. The Tabulating Machine Company name, and others, disappear as subsidiary companies are merged into IBM. [65][66]
  • 1933: Removable control panels. IBM introduces removable control panels. [67]
  • 1933: 40-hour week. IBM introduces the 40-hour week for both manufacturing and office locations.
  • 1933: Electromatic Typewriter Co. purchased. The company was purchased primarily to get important patents safely into IBM's hands; electric typewriters would become one of IBM's most widely known products. [68] By 1958 IBM was deriving 8% of its revenue from the sale of electric typewriters. [69]
  • 1934: Group life insurance. IBM creates a group life insurance plan for all employees with at least one year of service. [70]
  • 1934: Elimination of piece work. Watson, Sr., places IBM's factory employees on salary, eliminating piece work and providing employees and their families with an added degree of economic stability. [71]
  • 1934: IBM 801. The IBM 801 Bank Proof machine to clear bank checks is introduced. A new type of proof machine, the 801 lists and separates checks, endorses them, and records totals. It dramatically improves the efficiency of the check clearing process. [72]
  • 1935: Social Security Administration. During the Great Depression, IBM keeps its factories producing new machines even while demand is slack. When Congress passes the Social Security Act in 1935, IBM – with its overstocked inventory – is consequently positioned to win the landmark government contract, which is called "the biggest accounting operation of all time." [73]
  • 1936: Supreme Court rules IBM can only set punched card specifications. IBM initially required that its customers use only IBM manufactured cards with IBM machines, which were leased, not sold. IBM viewed its business as providing a service and the cards as part of the machine. In 1932 the government took IBM to court on this issue. IBM fought all the way to the Supreme Court and lost in 1936, the court ruling that IBM could only set card specifications. [74]
  • 1937: Scientific computing. The tabulating machine data center established at Columbia University, dedicated to scientific research, is named the Thomas J. Watson Astronomical Computing Bureau. [75]
  • 1937: The first collator, the IBM 077 Collator. [76]
  • 1937: IBM produces 5 to 10 million punched cards every day. By 1937, IBM had 32 presses at work in Endicott, N.Y., printing, cutting and stacking five to 10 million punched cards every day. [77]
  • 1937: IBM 805 test scoring machine. IBM's Rey Johnson designs the IBM 805 Test Scoring Machine to greatly speed the process of test scoring. The 805's innovative pencil-mark sensing technology gives rise to the ubiquitous phrase, "Please completely fill in the oval". [78]
  • 1937: Berlin conference. As president of the International Chamber of Commerce, Watson Sr., presides over the ICC's 9th Congress in Berlin. While there he accepts a Merit Cross of the German Eagle with Star medal from the Nazi government honoring his activities on behalf of world peace and international trade (he later returned it). [79][80]
  • 1937: Paid holidays, paid vacation. IBM announces a policy of paying employees for six annual holidays and becomes one of the first U.S. companies to grant holiday pay. Paid vacations also begin. [81]
  • 1937: IBM Japan. Japan Wattoson Statistics Accounting Machinery Co., Ltd. (日本ワットソン統計会計機械株式会社, now IBM Japan) was established. [51]
  • 1938: New headquarters. When IBM dedicates its new World Headquarters on 590 Madison Avenue, New York, New York, in January 1938, the company has operations in 79 countries. [60]

1939–1945: World War II

In the decades leading up to the onset of World War II, IBM had operations in many countries that would be involved in the war, on both the Allied and the Axis sides. IBM had a lucrative subsidiary in Germany, of which it was the majority owner, as well as operations in Poland, Switzerland, and other countries in Europe. As with most other enemy-owned businesses in Axis countries, these subsidiaries were taken over by the Nazis and other Axis governments early in the war. The headquarters in New York, meanwhile, worked to help the American war effort.

IBM in America

IBM's product line [82] shifted from tabulating equipment and time recording devices to Sperry and Norden bombsights, the Browning Automatic Rifle and the M1 Carbine, and engine parts – in all, more than three dozen major ordnance items and 70 products overall. Watson set a nominal one percent profit on those products and used the profits to establish a fund for widows and orphans of IBM war casualties. [83]

Allied military forces widely utilized IBM's tabulating equipment for mobile records units, ballistics, accounting and logistics, and other war-related purposes. There was extensive use of IBM punched-card machines for calculations made at Los Alamos during the Manhattan Project for developing the first atomic bombs. [84] During the War, IBM also built the Automatic Sequence Controlled Calculator, also known as the Harvard Mark I, for the U.S. Navy – the first large-scale electromechanical calculator in the U.S.

In 1933 IBM had acquired the rights to Radiotype, an IBM Electric typewriter attached to a radio transmitter. [85] "In 1935 Admiral Richard E. Byrd successfully sent a test Radiotype message 11,000 miles from Antarctica to an IBM receiving station in Ridgewood, New Jersey." [86] Selected by the Signal Corps for use during the war, Radiotype installations handled up to 50,000,000 words a day. [87]

To meet wartime product demands, IBM greatly expanded its manufacturing capacity. IBM added new buildings at its Endicott, New York plant (1941), and opened new facilities in Poughkeepsie, New York (1941), Washington, D.C. (1942), [88] and San Jose, California (1943). [89] IBM's decision to establish a presence on the West Coast took advantage of the growing base of electronics research and other high technology innovation in the southern part of the San Francisco Bay Area, an area that came to be known many decades later as Silicon Valley.

IBM was, at the request of the government, the subcontractor for the Japanese internment camps' punched card project. [90]

IBM equipment was used for cryptography by US Army and Navy organizations – Arlington Hall and OP-20-G – and by similar Allied organizations using Hollerith punched cards (the Central Bureau and the Far East Combined Bureau).

IBM in Germany and Nazi Occupied Europe

The Nazis made extensive use of Hollerith equipment, and IBM's majority-owned German subsidiary, Deutsche Hollerith Maschinen GmbH (Dehomag), supplied this equipment from the early 1930s. This equipment was critical to Nazi efforts to categorize citizens of both Germany and other nations that fell under Nazi control through ongoing censuses. This census data was used to facilitate the round-up of Jews and other targeted groups, and to catalog their movements through the machinery of the Holocaust, including internment in the concentration camps.

As with hundreds of foreign-owned companies that did business in Germany at that time, Dehomag came under the control of Nazi authorities prior to and during World War II. A Nazi, Hermann Fellinger, was appointed by the Germans as an enemy-property custodian and placed at the head of the Dehomag subsidiary.

Historian and author Edwin Black, in his best-selling book on the topic, maintains that the seizure of the German subsidiary was a ruse. He writes: "The company was not looted, its leased machines were not seized, and [IBM] continued to receive money funneled through its subsidiary in Geneva." [91] In his book he argues that IBM was an active and enthusiastic supplier to the Nazi regime long after it should have stopped dealing with them. Even after the invasion of Poland, IBM continued to service and expand services to the Third Reich in Poland and Germany. [91] The seizure of IBM's German subsidiary came after Pearl Harbor and the US declaration of war, in 1941.

IBM responded that the book was based upon "well-known" facts and documents that it had previously made publicly available and that there were no new facts or findings. [92] IBM also denied withholding any relevant documents. [93] Writing in the New York Times, Richard Bernstein argued that Black overstates IBM's culpability. [94]

Key events

  • 1942: Training for the disabled. IBM launches a program to train and employ disabled people in Topeka, Kansas. The next year classes begin in New York City, and soon the company is asked to join the President's Committee for Employment of the Handicapped. [95]
  • 1943: First female vice president. IBM appoints its first female vice president. [96]
  • 1944: ASCC. IBM introduces the world's first large-scale calculating computer, the Automatic Sequence Controlled Calculator (ASCC). Designed in collaboration with Harvard University, the ASCC, also known as the Mark I, uses electromechanical relays to solve addition problems in less than a second, multiplication in six seconds, and division in 12 seconds. [97]
  • 1944: United Negro College Fund. IBM President Thomas J. Watson, Sr., joins the Advisory Committee of the United Negro College Fund (UNCF), and IBM contributes to the UNCF's fund-raising efforts. [98]
  • 1945: IBM's first research lab. IBM's first research facility, the Watson Scientific Computing Laboratory, opens in a renovated fraternity house near Columbia University in Manhattan. In 1961, IBM moves its research headquarters to the T.J. Watson Research Center in Yorktown Heights, New York. [99]

1946–1959: Postwar recovery, rise of business computing, space exploration, the Cold War

IBM had expanded so much by the end of the War that the company faced a potentially difficult situation – what would happen if military spending dropped sharply? One way IBM addressed that concern was to accelerate its international growth in the years after the war, culminating with the formation of the World Trade Corporation in 1949 to manage and grow its foreign operations. Under the leadership of Watson's youngest son, Arthur K. ‘Dick’ Watson, the WTC would eventually produce half of IBM's bottom line by the 1970s.

Despite introducing its first computer a year after Remington Rand's UNIVAC in 1951, within five years IBM had 85% of the market. A UNIVAC executive complained that "It doesn't do much good to build a better mousetrap if the other guy selling mousetraps has five times as many salesmen". [34] With the death of Founding Father Thomas J. Watson, Sr. on June 19, 1956 at age 82, IBM experienced its first leadership change in more than four decades. The mantle of chief executive fell to his eldest son, Thomas J. Watson, Jr., IBM's president since 1952.

The new chief executive faced a daunting task. The company was in the midst of a period of rapid technological change, with nascent computer technologies – electronic computers, magnetic tape storage, disk drives, programming – creating new competitors and market uncertainties. Internally, the company was growing by leaps and bounds, creating organizational pressures and significant management challenges. Lacking the force of personality that Watson Sr. had long used to bind IBM together, Watson Jr. and his senior executives privately wondered if the new generation of leadership was up to the challenge of managing a company through this tumultuous period. [100] "We are," wrote one longtime IBM executive in 1956, "in grave danger of losing our 'eternal' values that are as valid in electronic days as in mechanical counter days."

Watson Jr. responded by drastically restructuring the organization mere months after his father died, creating a modern management structure that enabled him to more effectively oversee the fast-moving company. [101] He codified well known but unwritten IBM practices and philosophy into formal corporate policies and programs – such as IBM's Three Basic Beliefs, Open Door, and Speak Up! Perhaps the most significant was his shepherding of the company's first equal opportunity policy letter into existence in 1953, one year before the U.S. Supreme Court decision in Brown vs. Board of Education and 11 years before the Civil Rights Act of 1964. [102] He continued to expand the company's physical capabilities – in 1952 IBM San Jose launched a storage development laboratory that pioneered disk drives. Major facilities would later follow in Rochester, Minnesota; Greencastle, Indiana; Kingston, New York; and Lexington, Kentucky. Concerned that IBM was too slow in adopting transistor technology, Watson requested a corporate policy regarding their use, resulting in this unambiguous 1957 product development policy statement: "It shall be the policy of IBM to use solid-state circuitry in all machine developments. Furthermore, no new commercial machines or devices shall be announced which make primary use of tube circuitry." [103]

Watson Jr. also continued to partner with the United States government to drive computational innovation. The emergence of the Cold War accelerated the government's growing awareness of the significance of digital computing and drove major Department of Defense supported computer development projects in the 1950s. Of these, none was more important than the SAGE interceptor early detection air defense system.

In 1952, IBM began working with MIT's Lincoln Laboratory to finalize the design of an air defense computer. The merger of academic and business engineering cultures proved troublesome, but the two organizations finally hammered out a design by the summer of 1953, and IBM was awarded the contract to build two prototypes in September. [104] In 1954, IBM was named as the primary computer hardware contractor for developing SAGE for the United States Air Force. Working on this massive computing and communications system, IBM gained access to pioneering research being done at Massachusetts Institute of Technology on the first real-time, digital computer. This included working on many other computer technology advancements such as magnetic core memory, a large real-time operating system, an integrated video display, light guns, the first effective algebraic computer language, analog-to-digital and digital-to-analog conversion techniques, digital data transmission over telephone lines, duplexing, multiprocessing, and geographically distributed networks. IBM built fifty-six SAGE computers at the price of US$30 million each, and at the peak of the project devoted more than 7,000 employees (20% of its then workforce) to the project. SAGE had the largest computer footprint ever and continued in service until 1984. [105]

More valuable to IBM in the long run than the profits from governmental projects, however, was the access to cutting-edge research into digital computers being done under military auspices. IBM neglected, however, to gain an even more dominant role in the nascent industry by allowing the RAND Corporation to take over the job of programming the new computers, because, according to one project participant, Robert P. Crago, "we couldn't imagine where we could absorb two thousand programmers at IBM when this job would be over someday, which shows how well we were understanding the future at that time." [106] IBM would use its experience designing massive, integrated real-time networks with SAGE to design its SABRE airline reservation system, which met with much success.

These government partnerships, combined with pioneering computer technology research and a series of commercially successful products (IBM's 700 series of computer systems, the IBM 650, the IBM 305 RAMAC (with disk drive memory), and the IBM 1401) enabled IBM to emerge from the 1950s as the world's leading technology firm. Watson Jr. had answered his self-doubt. In the five years since the passing of Watson Sr., IBM was two and a half times bigger, its stock had quintupled, and of the 6000 computers in operation in the United States, more than 4000 were IBM machines. [107]

Key events

  • 1946: IBM 603. IBM announces the IBM 603 Electronic Multiplier, the first commercial product to incorporate electronic arithmetic circuits. The 603 used vacuum tubes to perform multiplication far more rapidly than earlier electromechanical devices. It had begun its development as part of a program to make a "super calculator" that would perform faster than 1944's IBM ASCC by using electronics. [108]
  • 1946: Chinese character typewriter. IBM introduces an electric Chinese ideographic character typewriter, which allowed an experienced user to type at a rate of 40 to 45 Chinese words a minute. The machine utilizes a cylinder on which 5,400 ideographic type faces are engraved. [109]
  • 1946: First black salesman. IBM hires its first black salesman, 18 years before the Civil Rights Act of 1964. [110]
  • 1948: IBM SSEC. IBM's first large-scale digital calculating machine, the Selective Sequence Electronic Calculator, is announced. The SSEC is the first computer that can modify a stored program and featured 12,000 vacuum tubes and 21,000 electromechanical relays. [111]
  • 1950s: Space exploration. From developing ballistics tables during World War II to the design and development of intercontinental missiles to the launching and tracking of satellites to manned lunar and shuttle space flights, IBM has been a contractor to NASA and the aerospace industry. [112]
  • 1952: IBM 701. IBM throws its hat into the computer business ring by introducing the 701, its first large-scale electronic computer to be manufactured in quantity. The 701, IBM President Thomas J. Watson, Jr., later recalled, is "the machine that carried us into the electronics business." [113]
  • 1952: Magnetic tape vacuum column. IBM introduces the magnetic tape drive vacuum column, making it possible for fragile magnetic tape to become a viable data storage medium. The use of the vacuum column in the IBM 701 system signals the beginning of the era of magnetic storage, as the technology becomes widely adopted throughout the industry. [114]
  • 1952: First California research lab. IBM opens its first West Coast lab in San Jose, California: the area that decades later will come to be known as "Silicon Valley." Within four years, the lab begins to make its mark by inventing the hard disk drive. [113]
  • 1953: Equal opportunity policy letter. Thomas J. Watson, Jr., publishes the company's first written equal opportunity policy letter: one year before the U.S. Supreme Court decision in Brown vs. Board of Education and 11 years before the Civil Rights Act of 1964. [102]
  • 1953: IBM 650. IBM announces the IBM 650 Magnetic Drum Data-Processing Machine, an intermediate size electronic computer, to handle both business and scientific computations. A hit with both universities and businesses, it was the most popular computer of the 1950s. Nearly 2,000 IBM 650s were marketed by 1962. [115]
  • 1954: NORC. IBM develops and builds the fastest, most powerful electronic computer of its time, the Naval Ordnance Research Computer (NORC), for the U.S. Navy Bureau of Ordnance. [116]
  • 1956: First magnetic hard disk drive. IBM introduces the world's first magnetic hard disk for data storage. The IBM 305 RAMAC (Random Access Method of Accounting and Control) offers unprecedented performance by permitting random access to any of the million characters distributed over both sides of 50 two-foot-diameter disks. Produced in California, IBM's first hard disk stored about 2,000 bits of data per square inch and cost about $10,000 per megabyte. By 1997, the cost of storing a megabyte had dropped to around ten cents. [117]
  • 1956: Consent decree. The United States Justice Department enters into a consent decree with IBM in 1956 to prevent the company from becoming a monopoly in the market for punched-card tabulating machines and, later, electronic data-processing machines. The decree requires IBM to sell its computers as well as lease them, and to service and sell parts for computers that IBM no longer owned. [118]
  • 1956: Corporate design. In the mid-1950s, Thomas J. Watson, Jr., was struck by how poorly IBM was handling corporate design. He hired design consultant Eliot Noyes to oversee the creation of a formal Corporate Design Program and charged Noyes with creating a consistent, world-class look and feel at IBM. Over the next two decades, Noyes hired a host of influential architects, designers, and artists to design IBM products, structures, exhibits, and graphics. The list of Noyes contacts includes such iconic figures as Eero Saarinen, Marcel Breuer, Mies van der Rohe, John Bolles, Paul Rand, Isamu Noguchi and Alexander Calder. [119]
  • 1956: First European research lab. IBM opens its first research lab outside the United States, in the Swiss city of Zurich. [120]
  • 1956: Changing hands. Watson Sr. retires and hands IBM to his son, Watson Jr.; the elder Watson dies soon after. [121]
  • 1956: Williamsburg conference. Watson Jr. gathered some 100 senior IBM executives together for a special three-day meeting in Williamsburg, Virginia. The meeting resulted in a new organizational structure that featured a six-member corporate management committee and delegated more authority to business unit leadership. It was the first major meeting IBM had ever held without Thomas J. Watson Sr., and it marked the emergence of the second generation of IBM leadership. [122]
  • 1956: Artificial intelligence. Arthur L. Samuel of IBM's Poughkeepsie, New York, laboratory programs an IBM 704 to play checkers (English draughts) using a method in which the machine can "learn" from its own experience. It is believed to be the first "self-learning" program, a demonstration of the concept of artificial intelligence. [123] (A minimal modern sketch of the self-play idea appears after this list.)
  • 1957: FORTRAN. IBM revolutionizes programming with the introduction of FORTRAN (Formula Translator), which soon becomes the most widely used computer programming language for technical work. FORTRAN is still the basis for many important numerical analysis programs. [124]
  • 1958: SAGE AN/FSQ-7. The SAGE (Semi-Automatic Ground Environment) AN/FSQ-7 computer is built under contract to MIT's Lincoln Laboratory for the North American Air Defense System. [125]
  • 1958: IBM domestic Time Equipment Division sold to Simplex. IBM announces the sale of the domestic Time Equipment Division (clocks et al.) business to Simplex Time Recorder Company. The IBM time equipment service force will be transferred to the Electric Typewriter Division. [126]
  • 1958: Open Door program. First implemented by Watson Sr. in the 1910s, the Open Door was a traditional company practice that granted employees with complaints a hearing with senior executives, up to and including Watson Sr. himself. IBM formalized this practice into policy in 1958 with the creation of the Open Door Program. [127]
  • 1959: Speak up! A further example of IBM's willingness to solicit and act upon employee feedback, the Speak Up! Program was first created in San Jose. [128]
  • 1959: IBM 1401. IBM introduces the 1401, the first high-volume, stored-program, core-memory, transistorized computer. Its versatility in running enterprise applications of all kinds helped it become the most popular computer model in the world in the early 1960s. [129]
  • 1959: IBM 1403. IBM introduces the 1403 chain printer, which launches the era of high-speed, high-volume impact printing. The 1403 will not be surpassed for print quality until the advent of laser printing in the 1970s. [130]
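
The following short Python sketch is illustrative only, not Samuel's algorithm or code: it shows the underlying idea of a program improving its own position evaluations purely through self-play, using a tiny take-away game and a simple temporal-difference style update (the game, the constants, and the update rule are all assumptions chosen for brevity).

    # Illustrative sketch only (assumed toy setup, not Samuel's checkers program):
    # learn position values for a take-away game purely from self-play.
    import random

    N_STONES = 21            # starting pile size (arbitrary choice for the demo)
    ALPHA, EPSILON = 0.1, 0.1
    # value[s] ~ how good it is to be the player to move with s stones left
    value = {s: 0.5 for s in range(N_STONES + 1)}
    value[0] = 0.0           # no stones left: the player to move has already lost

    def choose_move(pile):
        """Mostly pick the move that leaves the opponent the worst pile."""
        moves = [m for m in (1, 2, 3) if m <= pile]
        if random.random() < EPSILON:
            return random.choice(moves)        # occasional exploration
        return min(moves, key=lambda m: value[pile - m])

    def play_one_game():
        """Play one game against itself, updating values as it goes."""
        pile = N_STONES
        while pile > 0:
            nxt = pile - choose_move(pile)
            target = 1.0 - value[nxt]          # the opponent's gain is my loss
            value[pile] += ALPHA * (target - value[pile])
            pile = nxt

    for _ in range(20000):
        play_one_game()

    print({s: round(value[s], 2) for s in range(13)})

With enough self-play games, the learned values for piles that are multiples of four should drift toward zero (losing for the player to move), which matches the game's known optimal strategy; the point of the sketch is only that the evaluations come from the program's own experience rather than from a human-supplied table.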

1960–1969: The System/360 era, Unbundling software and services

On April 7, 1964, IBM introduced the revolutionary System/360, the first large "family" of computers to use interchangeable software and peripheral equipment, a departure from IBM's existing product line of incompatible machines, each of which was designed to solve specific customer requirements. [131] The idea of a general-purpose machine was considered a gamble at the time. [132]

Within two years, the System/360 became the dominant mainframe computer in the marketplace and its architecture became a de facto industry standard. During this time, IBM transformed from a medium-sized maker of tabulating equipment and typewriters into the world's largest computer company. [133]

In 1969 IBM "unbundled" software and services from hardware sales. Until then, customers did not pay for software or services separately: their cost was folded into the very high price of the hardware. Software was provided at no additional charge, generally in source code form, and services (systems engineering, education and training, system installation) were provided free of charge at the discretion of the local IBM branch office. The practice existed throughout the industry. IBM's unbundling is widely credited with leading to the growth of the software industry. [134] [135] [136] [137] After the unbundling, IBM software was divided into two main categories: System Control Programming (SCP), which remained free to customers, and Program Products (PP), which carried a separate charge. This transformed the customer's value proposition for computer solutions, assigning a significant monetary value to something that had hitherto been essentially free, and helped enable the creation of an independent software industry. Similarly, IBM services were divided into two categories: general information, which remained free and was provided at IBM's discretion, and on-the-job assistance and training of customer personnel, which were subject to a separate charge and were open to non-IBM customers. This decision vastly expanded the market for independent computing services companies.

The company began four decades of Olympic sponsorship with the 1960 Winter Games in Squaw Valley, California. It became a recognized leader in corporate social responsibility, joining federal equal opportunity programs in 1962, opening an inner-city manufacturing plant in 1968, and creating a minority supplier program. It led efforts to improve data security and protect privacy. It set environmental air and water emissions standards that exceeded those dictated by law and brought all its facilities into compliance with those standards. It opened one of the world's most advanced research centers in Yorktown, New York. Its international operations grew rapidly, producing more than half of IBM's revenues by the early 1970s and, through technology transfer, shaping the way governments and businesses operated around the world. Its personnel and technology played an integral role in the space program and in landing the first men on the Moon in 1969. In that same year, it changed the way it marketed its technology to customers, unbundling hardware from software and services and effectively launching today's multibillion-dollar software and services industry (see the discussion of unbundling above). It was massively profitable, with a nearly fivefold increase in revenues and earnings during the 1960s.

In 1967 Thomas John Watson, Jr., who had succeeded his father as chairman, announced that IBM would open a large-scale manufacturing plant at Boca Raton to produce its System/360 Model 20 midsized computer. On March 16, 1967, a headline in the Boca Raton News [138] announced “IBM to hire 400 by year’s end.” The plan was for IBM to lease facilities to start making computers until the new site could be developed. A few months later, hiring began for assembly and production control trainees. IBM's Juan Rianda moved from Poughkeepsie, New York, to become the first plant manager at IBM's new Boca operations. To design its new campus, IBM commissioned internationally renowned architect Marcel Breuer (1902–1981), who worked closely with American architect Robert Gatje (1927–2018). In September 1967, the Boca team celebrated a milestone, shipping its first IBM System/360 Model 20 to the City of Clearwater – the first computer in its production run. A year later, IBM 1130 Computing Systems were being produced and shipped from the 203 building. By 1969, IBM's Boca workforce had reached 1,000. That employment number grew to around 1,300 in the next year as a Systems Development Engineering Laboratory was added to the division's operations.

Key events

  • 1961: IBM 7030 Stretch. IBM delivers its first 7030 Stretch supercomputer. Stretch falls short of its original design objectives, and is not a commercial success. But it is a visionary product that pioneers numerous revolutionary computing technologies which are soon widely adopted by the computer industry. [139][140]
  • 1961: Thomas J. Watson Research Center. IBM moves its research headquarters from Poughkeepsie, NY, to Westchester County, NY, opening the Thomas J. Watson Research Center, which remains IBM's largest research facility, centering on semiconductors, computer science, physical science, and mathematics. The lab that IBM established at Columbia University in 1945 was closed, and its operations moved to the Yorktown Heights laboratory in 1970. [141]
  • 1961: IBM Selectric typewriter. IBM introduces the Selectric typewriter product line. Later Selectric models feature memory, giving rise to the concepts of word processing and desktop publishing. The machine won numerous awards for its design and functionality. Selectrics and their descendants eventually captured 75 percent of the United States market for electric typewriters used in business. [142] IBM replaced the Selectric line with the IBM Wheelwriter in 1984 and transferred its typewriter business to the newly formed Lexmark in 1991. [143]
  • 1961: Report Program Generator. IBM offers its Report Program Generator, an application that allows IBM 1401 users to produce reports. This capability was widely adopted throughout the industry, becoming a feature offered in subsequent generations of computers. It played an important role in the successful introduction of computers into small businesses. [144]
  • 1962: Basic beliefs. Drawing on established IBM policies, Thomas J. Watson, Jr., codifies three IBM basic beliefs: respect for the individual, customer service, and excellence. [145]
  • 1962: SABRE. Two IBM 7090 mainframes formed the backbone of the SABRE reservation system for American Airlines. As the first airline reservation system to work live over phone lines, SABRE linked high-speed computers and data communications to handle seat inventory and passenger records. [146]
  • 1964: IBM System/360. In the most important product announcement in company history to date, IBM introduces the IBM System/360: a new concept in computers which creates a "family" of small to large computers, incorporating IBM Solid Logic Technology (SLT) microelectronics and using the same programming instructions. The concept of a compatible "family" of computers transforms the industry. [147]
  • 1964: Word processing. IBM introduces the IBM Magnetic Tape Selectric Typewriter, a product that pioneered the application of magnetic recording devices to typewriting, and gave rise to desktop word processing. Referred to then as "power typing," the feature of revising stored text improved office efficiency by allowing typists to type at "rough draft" speed without the pressure of worrying about mistakes. [148]
  • 1964: New corporate headquarters. IBM moves its corporate headquarters from New York City to Armonk, New York. [149]
  • 1965: Gemini space flights. A 59-pound onboard IBM guidance computer is used on all Gemini space flights, including the first spaceship rendezvous. IBM scientists complete the most precise computation of the Moon's orbit and develop a fabrication technique to connect hundreds of circuits on a silicon wafer. [150]
  • 1965: New York World's Fair. The IBM Pavilion at the New York World's Fair closes, having hosted more than 10 million visitors during its two-year existence. [151]
  • 1966: Dynamic Random-Access Memory (DRAM). IBM invents one-transistor DRAM cells which permit major increases in memory capacity. DRAM chips become the mainstay of modern computer memory systems: the "crude oil" of the information age is born. [152]
  • 1966: IBM System/4 Pi. IBM ships its first System/4Pi computer, designed to meet U.S. Department of Defense and NASA requirements. More than 9000 units of the 4Pi systems are delivered by the 1980s for use in the air, sea, and space. [153]
  • 1966: IBM Information Management System (IMS). IBM designed the Information Management System (IMS) with Rockwell and Caterpillar starting in 1966 for the Apollo program, where it was used to inventory the very large bill of materials (BOM) for the Saturn V moon rocket and Apollo space vehicle.
  • 1967: Fractal geometry. IBM researcher Benoit Mandelbrot conceives fractal geometry – the concept that seemingly irregular shapes can have identical structure at all scales. This new geometry makes it possible to mathematically describe the kinds of irregularities existing in nature. The concept greatly impacts the fields of engineering, economics, metallurgy, art, health sciences, and computer graphics and animation. [154]
  • 1968: IBM Customer Information Control System (CICS). IBM introduces the CICS transaction monitor. CICS remains to this day the industry's most popular transactions monitor. [155]
  • 1969: Antitrust. The United States government launches what would become a 13-year-long antitrust suit against IBM. The suit becomes a draining war of attrition, and is eventually dropped in 1982, [156] after IBM's share of the mainframe market declined from 70% to 62%. [157]
  • 1969: Unbundling. IBM adopts a new marketing policy that charges separately for most systems engineering activities, future computer programs, and customer education courses. This "unbundling" gives rise to a multibillion-dollar software and services industry. [158]
  • 1969: Magnetic stripe cards. The American National Standards Institute makes the IBM-developed magnetic stripe technology a national standard, jump starting the credit card industry. Two years later, the International Organization for Standardization adopts the IBM design, making it a world standard. [159]
  • 1969: First moon landing. IBM personnel and computers help NASA land the first men on the Moon.

1970–1974: The challenges of success

The Golden Decade of the 1960s was a hard act to follow, and the 1970s got off to a troubling start when CEO Thomas J. Watson Jr. suffered a heart attack and retired in 1971. For the first time since 1914 – nearly six decades – IBM would not have a Watson at the helm. Moreover, after just one leadership change over those nearly 60 years, IBM would endure two in two years. T. Vincent Learson succeeded Watson as CEO, then quickly retired upon reaching the mandatory retirement age of 60 in 1973. Following Learson in the CEO office was Frank T. Cary, a 25-year IBMer [160] who had run the very successful data processing division in the 1960s.

Datamation in 1971 stated that "the perpetual, ominous force called IBM rolls on". [161] The company's dominance let it keep prices high and rarely update products, [162] all built with only IBM components. [163] The defining product of the period was the IBM System/370, introduced in 1970 as IBM's new mainframe. The S/370 did not prove as technologically revolutionary as its predecessor, the System/360, but from a revenue perspective it more than sustained the cash cow status of the 360. [164] A less successful effort to replicate the 360 mainframe revolution was the Future Systems project. Between 1971 and 1975, IBM investigated the feasibility of a new revolutionary line of products designed to make obsolete all existing products in order to re-establish its technical supremacy. This effort was terminated by IBM's top management in 1975, but by then it had consumed most of the high-level technical planning and design resources, jeopardizing progress on the existing product lines (although some elements of FS were later incorporated into actual products). [165] Another IBM innovation of the early 1970s was the IBM 3340 disk unit, introduced in 1973 and known as "Winchester" after IBM's internal project name, an advanced storage technology that more than doubled the information density on disk surfaces. Winchester technology was adopted by the industry and used for the next two decades.

Some 1970s-era IBM technologies emerged to become familiar facets of everyday life. IBM developed magnetic stripe technology in the 1960s, and it became a credit card industry standard in 1971. The IBM-invented floppy disk, also introduced in 1971, became the standard for storing personal computer data during the first decades of the PC era. IBM Research scientist Edgar 'Ted' Codd wrote a seminal paper describing the relational database, an invention that Forbes magazine described as one of the most important innovations of the 20th century. The IBM 5100, 50 lbs. and $9,000 of personal mobility, was introduced in 1975 and presaged – at least in function, if not size, price, or units sold – the personal computer of the 1980s. IBM's 3660 supermarket checkout station, introduced in 1973, used holographic technology to scan product prices from the now-ubiquitous UPC bar code, which itself was based on a 1952 IBM patent and became a grocery industry standard. Also in 1973, bank customers began making withdrawals, transfers and other account inquiries via the IBM 3614 Consumer Transaction Facility, an early form of today's automatic teller machines.

IBM had an innovator's role in pervasive technologies that were less visible as well. In 1974, IBM announced Systems Network Architecture (SNA), a networking protocol for computing systems. SNA is a uniform set of rules and procedures for computer communications, intended to free computer users from the technical complexities of communicating through local, national, and international computer networks. SNA became the most widely used system for data processing until more open architecture standards were approved in the 1990s. In 1975, IBM researcher Benoit Mandelbrot conceived fractal geometry, a new geometrical concept that made it possible to describe mathematically the kinds of irregularities existing in nature. Fractals had a great impact on engineering, economics, metallurgy, art and health sciences, and are integral to the field of computer graphics and animation.
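
As a purely illustrative aside (not IBM's or Mandelbrot's work), the kind of measurement fractal geometry makes possible, a non-integer dimension describing how detail fills space, can be sketched in a few lines of Python; the choice of the Sierpinski triangle, the point count, and the box sizes below are arbitrary assumptions made for the demonstration.

    # Illustrative sketch only: estimate the box-counting (fractal) dimension
    # of the Sierpinski triangle, generated here with the "chaos game".
    import math, random

    corners = [(0.0, 0.0), (1.0, 0.0), (0.5, math.sqrt(3) / 2)]
    x, y, points = 0.1, 0.1, []
    for _ in range(200000):
        cx, cy = random.choice(corners)        # jump halfway toward a random corner
        x, y = (x + cx) / 2, (y + cy) / 2
        points.append((x, y))

    def box_count(pts, eps):
        """Count the eps-sized grid boxes that contain at least one point."""
        return len({(int(px / eps), int(py / eps)) for px, py in pts})

    # Fit log N(eps) ~ d * log(1/eps) by least squares to estimate the dimension d.
    data = [(math.log(2 ** k), math.log(box_count(points, 1 / 2 ** k))) for k in range(2, 8)]
    n = len(data)
    sx = sum(a for a, _ in data); sy = sum(b for _, b in data)
    sxx = sum(a * a for a, _ in data); sxy = sum(a * b for a, b in data)
    print("estimated dimension:", round((n * sxy - sx * sy) / (n * sxx - sx * sx), 2))

The printed slope should come out near log 3 / log 2, about 1.58, the triangle's exact fractal dimension: a shape more than a line but less than a filled area, which is exactly the sort of "irregularity" the paragraph above describes.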

A less successful business endeavor for IBM was its entry into the office copier market in the 1970s, after turning down the opportunity to purchase the xerography technology. [34] The company was immediately sued by Xerox Corporation for patent infringement. Although Xerox held the patents for the use of selenium as a photoconductor, IBM researchers perfected the use of organic photoconductors, which avoided the Xerox patents. The litigation lasted until the late 1970s and was ultimately settled. Even so, IBM never gained traction in the copier market and withdrew from it in the 1980s. Organic photoconductors are now widely used in copiers.

Throughout this period, IBM was litigating the antitrust suit filed by the Justice Department in 1969. But in a related bit of case law, the landmark Honeywell v. Sperry Rand U.S. federal court case was concluded in April 1973. The 1964 patent for the ENIAC, the world's first general-purpose electronic digital computer, was found both invalid and unenforceable for a variety of reasons, thus putting the invention of the electronic digital computer into the public domain. Further, IBM was ruled to have created a monopoly via its 1956 patent-sharing agreement with Sperry Rand.

American antitrust laws did not affect IBM in Europe, where as of 1971 it had fewer competitors and more than 50% market share in almost every country. Customers preferred IBM because it was, Datamation said, "the only truly international computer company", able to serve clients almost anywhere. Rivals such as ICL, CII, and Siemens began to cooperate to preserve a European computer industry. [161]

Key events

  • 1970: System/370. IBM announces System/370 as successor to System/360.
  • 1970: Relational databases. IBM introduces the concept of the relational database, which calls for information stored within a computer to be arranged in easy-to-interpret tables so that large amounts of data can be accessed and managed. Today, nearly all database structures are based on the IBM concept of relational databases. (A brief illustrative example follows this list.)
  • 1970: Office copiers. IBM introduces the first of its three models of xerographic copiers. These machines mark the first commercial use of organic photoconductors, which have since grown to become the dominant copier technology.
  • 1971: Speech recognition. IBM achieves its first operational application of speech recognition, which enables engineers servicing equipment to talk to and receive spoken answers from a computer that can recognize about 5,000 words. Today, IBM's ViaVoice recognition technology has a vocabulary of 64,000 words and a 260,000-word back-up dictionary. [166]
  • 1971: Floppy disk. IBM introduces the floppy disk. Convenient and highly portable, the floppy becomes a personal computer industry standard for storing data. [167]
  • 1973: Winchester storage technology. The IBM 3340 disk unit—known as "Winchester" after IBM's internal project name—is introduced, an advanced technology which more than doubled the information density on disk surfaces. It featured a smaller, lighter read/write head that was designed to ride on an air film only 18 millionths of an inch thick. Winchester technology was adopted by the industry and used for the next two decades. [168]
  • 1973: Nobel Prize. Dr. Leo Esaki, an IBM Fellow who joined the company in 1960, shares the 1973 Nobel Prize in physics for his 1958 discovery of the phenomenon of electron tunneling. His discovery of the semiconductor junction called the Esaki diode finds wide use in electronics applications. More importantly, his work in the field of semiconductors lays a foundation for further exploration in the electronic transport of solids. [169]
  • 1974: SNA. IBM announces Systems Network Architecture (SNA), a networking protocol for computing systems. SNA is a uniform set of rules and procedures for computer communications, intended to free computer users from the technical complexities of communicating through local, national, and international computer networks. SNA becomes the most widely used system for data processing until more open architecture standards were approved in the 1990s. [170]
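
To make the relational idea concrete, the short sketch below uses Python's built-in sqlite3 module (not an IBM product; the table names and rows are invented for illustration): data lives in simple tables, and the relationship between them is expressed declaratively in a query rather than by navigating pointers.

    # Illustrative sketch of the relational model using Python's standard sqlite3.
    # The schema and rows are made up purely for demonstration.
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
    con.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
    con.executemany("INSERT INTO customers VALUES (?, ?)",
                    [(1, "Acme Corp"), (2, "Globex")])
    con.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                    [(10, 1, 250.0), (11, 1, 75.5), (12, 2, 300.0)])

    # A join states the relationship between the tables; the engine works out how.
    rows = con.execute("""
        SELECT c.name, SUM(o.total)
        FROM customers AS c JOIN orders AS o ON o.customer_id = c.id
        GROUP BY c.name ORDER BY c.name
    """).fetchall()
    print(rows)   # [('Acme Corp', 325.5), ('Globex', 300.0)]

The query states what result is wanted (a total per customer) rather than how to traverse the data, which is the essence of the relational approach described in the entry above.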

1975–1992: Information revolution, rise of software and PC industries

Year Gross income (in $m) Employees
1975 14,430 288,647
1980 26,210 341,279
1985 50,050 405,535
1990 69,010 373,816
1995 71,940 225,347

President of IBM John R. Opel became CEO in 1981. [171] His company was one of the world's largest and had a 62% share of the mainframe computer market that year. [157] While frequently relocated employees and families still joked that IBM stood for "I've Been Moved", and employees of acquired companies feared that hordes of formally dressed IBM employees would invade their more casual offices, [172] IBM no longer required white shirts for male employees, who still wore conservative suits when meeting customers. Former employees such as Gene Amdahl used their training to found and lead many competitors [34] and suppliers. [173]

Expecting Japanese competition, IBM in the late 1970s began investing in manufacturing to lower costs, offering volume discounts and lower prices to large customers, and introducing new products more frequently. [162] The company also sometimes used non-IBM components in products, [163] and sometimes resold others' products as its own. [174] In 1980 it introduced its first computer terminal compatible with non-IBM equipment, [175] and Displaywriter was the first new product less expensive than the competition. [157] IBM's share of the overall computer market, however, declined from 60% in 1970 to 32% in 1980. [176] Perhaps distracted by the long-running antitrust lawsuit, [34] the "Colossus of Armonk" completely missed the fast-growing minicomputer market during the 1970s, [174] [177] [178] [179] and was behind rivals such as Wang, Hewlett-Packard (HP), and Control Data in other areas. [176]

In 1979 BusinessWeek asked, "Is IBM just another stodgy, mature company?" By 1981 its stock price had declined by 22%. [176] IBM's earnings for the first half of the year grew by 5.3% (one third of the inflation rate), while those of minicomputer maker Digital Equipment Corporation (DEC) grew by more than 35%. [175] The company began selling minicomputers, [180] but in January 1982 the Justice Department ended the antitrust suit because, The New York Times reported, the government "recognized what computer experts and securities analysts had long since concluded: I.B.M. no longer dominates the computer business". [157]

IBM wished to avoid the same outcome with the new personal computer industry. [179] The company studied the market for years, and, as with UNIVAC, others such as Apple Computer entered it first. [34] IBM did not want a product with a rival's logo on corporate customers' desks. [181] The company opened its first retail store in November 1980, [182] and a team in the Boca Raton, Florida office built the IBM PC using commercial off-the-shelf components. The new computer debuted on August 12, 1981, [163] from the Entry Systems Division led by Don Estridge. IBM immediately became more of a presence in the consumer marketplace, thanks to the memorable Little Tramp advertising campaign. Though not a spectacular machine by the technological standards of the day, the IBM PC brought together the most desirable features of a computer in one small machine. It had 128 kilobytes of memory (expandable to 256 kilobytes), one or two floppy disk drives and an optional color monitor. And it had the prestige of the IBM brand. It was not cheap, but with a base price of US$1,565 it was affordable for businesses – and many businesses purchased PCs. Reassured by the IBM name, they began buying microcomputers on their own budgets, aimed at numerous applications that corporate computer departments did not, and in many cases could not, accommodate. Typically, these purchases were not made by corporate computer departments, as the PC was not seen as a "proper" computer. They were often instigated by middle managers and senior staff who saw the potential, especially once the revolutionary VisiCalc spreadsheet, the killer app, had been surpassed by a far more powerful and stable product, Lotus 1-2-3.

IBM's dominance of the mainframe market in Europe and the US encouraged existing customers to buy the PC, [181] [183] and vice versa: as sales of what had begun as an experiment in a new market became a substantial part of IBM's financials, the company found that PC customers also bought larger IBM computers. [184] [177] [172] Unlike the BUNCH and other rivals, IBM quickly adjusted to the retail market, [181] [185] with its own sales force competing with outside retailers for the first time. [172] By 1985 IBM was the world's most profitable industrial company, [172] and its sales of personal computers were larger than those of its minicomputers, despite having been in the minicomputer market since the early 1970s. [180]

By 1983 industry analyst Gideon Gartner warned that IBM "is creating a dangerous situation for competitors in the marketplace". [34] The company helped others by defining technical standards and creating large new software markets, [184] [186] [162] but the new aggressiveness that began in the late 1970s helped it dominate areas like computer leasing and computer-aided design. [162] Free from the antitrust case, IBM was present in every computer market other than supercomputers, and entered communications [186] by purchasing ROLM – its first acquisition in 18 years – and 18% of MCI. [172] The company was so important to component suppliers that it urged them to diversify; when IBM (the source of 61% of Miniscribe's revenue) abruptly reduced orders from Miniscribe, shares of not only Miniscribe but also of uninvolved companies that sold to IBM fell, as investors feared their vulnerability. [173] IBM was itself vulnerable when suppliers could not fulfill orders, [187] and customers and dealers likewise feared becoming overdependent. [181] [162]

The IBM PC AT's 1984 debut startled the industry. Rivals admitted that they did not expect the low price of the sophisticated product. IBM's attack on every area of the computer industry and entry into communications caused competitors, analysts, and the press to speculate that it would again be sued for antitrust. [188] [189] [172] Datamation and others said that the company's continued growth might hurt the United States, by suppressing startups with new technology. [162] Gartner Group estimated in 1985 that of the 100 largest data-processing companies, IBM had 41% of all revenue and 69% of profit. Its computer revenue was about nine times that of second-place DEC, and larger than that of IBM's six largest Japanese competitors combined. The 22% profit margin was three times the 6.7% average for the other 99 companies. Competitors complained to Congress, ADAPSO discussed the company with the Justice Department, and European governments worried about IBM's influence but feared affecting its more than 100,000 employees there at 19 facilities. [162]

However, the company soon lost its lead in both PC hardware and software, thanks in part to its unprecedented (for IBM) decision to contract PC components to outside companies like Microsoft and Intel. Up to this point in its history, IBM had relied on a vertically integrated strategy, building most key components of its systems itself, including processors, operating systems, peripherals, databases and the like. In an attempt to accelerate the time-to-market for the PC, IBM chose not to build a proprietary operating system and microprocessor; instead, it sourced these vital components from Microsoft and Intel respectively. Ironically, in a decade that marked the end of IBM's monopoly, it was this fateful decision that passed the sources of IBM's monopolistic power (the operating system and the processor architecture) to Microsoft and Intel, paving the way for the rise of PC compatibles and the creation of hundreds of billions of dollars of market value outside of IBM.

John Akers became IBM's CEO in 1985. During the 1980s, IBM's significant investment in building a world-class research organization produced four Nobel Prize winners in physics, achieved breakthroughs in mathematics, memory storage and telecommunications, and made great strides in expanding computing capabilities. In 1980, IBM Research legend John Cocke introduced reduced instruction set computing (RISC). Cocke received both the National Medal of Technology and the National Medal of Science for his innovation, but IBM itself failed to recognize the importance of RISC and lost the lead in RISC technology to Sun Microsystems. In 1984 the company partnered with Sears to develop a pioneering online home banking and shopping service for home PCs that launched in 1988 as Prodigy. Despite a strong reputation, and despite anticipating many of the features, functions, and technology that characterize the online experience of today, the venture was plagued by extremely conservative management decisions and was eventually sold in the mid-1990s. The IBM token-ring local area network, introduced in 1985, permitted personal computer users to exchange information and share printers and files within a building or complex. In 1988, IBM partnered with the University of Michigan and MCI Communications to create the National Science Foundation Network (NSFNet), an important step in the creation of the Internet. But within five years the company backed away from this early lead in Internet protocols and router technologies in order to support its existing SNA cash cow, thereby missing a boom market of the 1990s. Still, IBM investments and advances in microprocessors, disk drives, network technologies, software applications, and online commerce in the 1980s set the stage for the emergence of the connected world in the 1990s.

But by the end of the decade, IBM was clearly in trouble. It was a bloated organization of some 400,000 employees that was heavily invested in low-margin, transactional, commodity businesses. Its positions in technologies it had invented or commercialized – DRAM, hard disk drives, the PC, electric typewriters – were starting to erode. The company had a massive international organization characterized by redundant processes and functions; its cost structure couldn't compete with smaller, less diversified competitors. And then back-to-back revolutions – the PC and client-server computing – did the unthinkable: they combined to dramatically undermine IBM's core mainframe business. The PC revolution placed computers directly in the hands of millions of people. It was followed by the client/server revolution, which sought to link all of those PCs (the "clients") with larger computers that labored in the background (the "servers" that served data and applications to the client machines). Both revolutions transformed the way customers viewed, used and bought technology, and both fundamentally rocked IBM. Businesses' purchasing decisions were put in the hands of individuals and departments – not the places where IBM had long-standing customer relationships. Piece-part technologies took precedence over integrated solutions. The focus was on the desktop and personal productivity, not on business applications across the enterprise. As a result, earnings, which had been at or above US$5 billion since the early 1980s, dropped by more than a third to US$3 billion in 1989. A brief spike in earnings in 1990 proved illusory as corporate spending continued to shift from high-margin mainframes to lower-margin microprocessor-based systems. In addition, corporate downsizing was in full swing.

Akers tried to stop the bleeding – desperate moves and radical changes were considered and implemented. As IBM assessed the situation, it was clear that competition and innovation in the computer industry were now taking place along segmented, versus vertically integrated lines, where leaders emerged in their respective domains. Examples included Intel in microprocessors, Microsoft in desktop software, Novell in networking, HP in printers, Seagate in disk drives and Oracle Corporation in database software. IBM's dominance in personal computers was challenged by the likes of Compaq and later Dell. Recognizing this trend, management, with the support of the Board of Directors, began to implement a plan to split IBM into increasingly autonomous business units (e.g. processors, storage, software, services, printers, etc.) to compete more effectively with competitors that were more focused and nimble and had lower cost structures.

IBM also began shedding businesses that it felt were no longer core. It sold its typewriter, keyboard, and printer business – the organization that created the popular "Selectric" typewriter with its floating "golf ball" type element in the 1960s – to the investment firm Clayton, Dubilier & Rice Inc., and the business became an independent company, Lexmark Inc.

These efforts failed to halt the slide. A decade of steadily widening corporate acceptance of local area networking technology – a trend led by Novell Inc. and other vendors – and the ensuing decline in mainframe sales brought about a wake-up call for IBM. After two consecutive years of reporting losses in excess of $1 billion, on January 19, 1993, IBM announced a US$8.10 billion loss for the 1992 financial year, which was then the largest single-year corporate loss in U.S. history. [190] All told, between 1991 and 1993, the company posted net losses of nearly $16 billion. IBM's three-decade-long Golden Age, triggered by Watson Jr. in the 1950s, was over. The computer industry now viewed IBM as no longer relevant, an organizational dinosaur. And well over one hundred thousand IBMers lost their jobs, including CEO John Akers.

Key events

  • mid-1970s: IBM VNET. VNET was an international computer networking system deployed in the mid-1970s, providing email and file-transfer for IBM. By September 1979, the network had grown to include 285 mainframe nodes in Europe, Asia, and North America.
  • 1975: Fractals. IBM researcher Benoit Mandelbrot conceives fractal geometry—the concept that seemingly irregular shapes can have identical structure at all scales. This new geometry makes it possible to describe mathematically the kinds of irregularities existing in nature. Fractals later make a great impact on engineering, economics, metallurgy, art, and health sciences, and are also applied in the field of computer graphics and animation. [191]
  • 1975: IBM 5100 Portable computer. IBM introduces the 5100 Portable Computer, a 50 lb. desktop machine that put computer capabilities at the fingertips of engineers, analysts, statisticians, and other problem-solvers. More "luggable" than portable, the 5100 can serve as a terminal for the System/370 and costs from $9000 to $20,000. [192]
  • 1976: Space Shuttle. The Enterprise, the first vehicle in the U.S. Space Shuttle program, makes its debut at Palmdale, California, carrying IBM AP-101 flight computers and special hardware built by IBM.
  • 1976: Laser printer. The first IBM 3800 printer is installed. The 3800 is the first commercial printer to combine laser technology and electrophotography. The technology speeds the printing of bank statements, premium notices, and other high-volume documents, and remains a workhorse for billing and accounts receivable departments. [193]
  • 1977: Data Encryption Standard. The IBM-developed Data Encryption Standard (DES), a cryptographic algorithm, is adopted by the U.S. National Bureau of Standards as a national standard. DES is a symmetric block cipher built on a Feistel structure; a toy illustration of that structure appears after this list. [194]
  • 1979: Retail checkout. IBM develops the Universal Product Code (UPC) in the 1970s as a method for embedding pricing and identification information on individual retail items. In 1979, IBM applies holographic scanner technology in IBM's supermarket checkout station to read the UPC stripes on merchandise, one of the first major commercial uses of holography. IBM's support of the UPC concept helps lead to its widespread acceptance by retail and other industries around the world. [195]
  • 1979: Thin film recording heads. Instead of using hand-wound wire structures as coils for inductive elements, IBM researchers substitute thin film "wires" patterned by optical lithography. This leads to higher performance recording heads at a reduced cost and establishes IBM's leadership in "areal density": storing the most data in the least space. The result is higher-capacity and higher-performance disk drives. [196]
  • 1979: Overcoming barriers to technology use. Since 1946, with its announcement of Chinese and Arabic ideographic character typewriters, IBM has worked to overcome cultural and physical barriers to the use of technology. As part of these ongoing efforts, IBM introduces the 3270 Kanji Display Terminal; the System/34 Kanji System with an ideographic feature, which processes more than 11,000 Japanese and Chinese characters; and the Audio Typing Unit for sight-impaired typists.
  • 1979: First multi-function copier/printer. A communication-enabled laser printer and photocopier combination was introduced, the IBM 6670 Information Distributor. This was the first multi-function (copier/printer) device for the office market.
  • 1980: Thermal conduction modules. IBM introduces the 3081 processor, the company's most powerful to date, which features Thermal Conduction Modules. In 1990, the Institute of Electrical and Electronics Engineers, Inc., awards its 1990 Corporate Innovation Recognition to IBM for the development of the Multilayer Ceramic Thermal Conduction Module for high performance computers. [197]
  • 1980: Reduced instruction set computing (RISC) architecture. IBM successfully builds the first prototype computer employing IBM Fellow John Cocke's RISC architecture. RISC simplified the instructions given to computers, making them faster and more powerful. Today, RISC architecture is the basis of most workstations and widely viewed as the dominant computing architecture. [198]
  • 1981: IBM PC. The IBM Personal Computer goes mass market and helps revolutionize the way the world does business. A year later, Time Magazine gives its "Person of the Year" award to the Personal Computer. [199]
  • 1981: LASIK surgery. Three IBM scientists invent the excimer laser surgical procedure that later forms the basis of LASIK and PRK corrective eye surgeries. [200]
  • 1982: Antitrust suit. The United States antitrust suit against IBM, filed in 1969, is dismissed as being "without merit." [201]
  • 1982: Trellis-coded modulation. Trellis-coded modulation (TCM) is first used in voice-band modems to send data at higher rates over telephone channels. Today, TCM is applied in a large variety of terrestrial and satellite-based transmission systems as a key technique for achieving faster and more reliable digital transmission. [202]
  • 1983: IBM PCjr. IBM announces the widely anticipated PCjr., IBM's attempt to enter the home computing marketplace. The product, however, fails to capture the fancy of consumers due to its lack of compatibility with IBM PC software, its higher price point, and its unfortunate ‘chiclet’ keyboard design. IBM terminates the product after 18 months of disappointing sales. [203]
  • 1984: IBM 3480 magnetic tape system. The industry's most advanced magnetic tape system, the IBM 3480, introduces a new generation of tape drives that replace the familiar reel of tape with an easy-to-handle cartridge. The 3480 was the industry's first tape system to use "thin-film" recording head technology.
  • 1984: Sexual orientation non-discrimination. IBM adds sexual orientation to the company's non-discrimination policy. IBM becomes one of the first major companies to make this change. [204]
  • 1984: ROLM partnership/acquisition. IBM acquires ROLM Corporation, based in Santa Clara, CA, for $1.25 billion, [172] subsequent to an existing partnership. [205] IBM intended to develop digital telephone switches to compete directly with Northern Telecom and AT&T. [206] Two of the most popular systems were the large-scale PABX known as the ROLM CBX and the smaller PABX known as the ROLM Redwood. ROLM was later sold to Siemens AG between 1989 and 1992. [207][208]
  • 1985: MCI. IBM acquires 18% of MCI Communications, the second-largest long-distance carrier in the United States, in June 1985. [172]
  • 1985: RP3. Sparked in part by national concerns over the United States losing its technology leadership crown in the early 1980s, IBM re-enters the supercomputing field with the RP3 (IBM Research Parallel Processor Prototype). IBM researchers worked with scientists from New York University's Courant Institute of Mathematical Sciences to design the RP3, an experimental computer consisting of up to 512 processors, linked in parallel and connected to as many as two billion characters of main memory. Over the next five years, IBM provides more than $30 million in products and support to a supercomputer facility established at Cornell University in Ithaca, New York. [209]
  • 1985: Token Ring Network. IBM's Token Ring technology brings a new level of control to local area networks and quickly becomes an industry standard for networks that connect printers, workstations and servers. [210]
  • 1986: IBM Almaden Research Center. IBM Research dedicates the Almaden Research Center in California. Today, Almaden is IBM's second-largest laboratory, focused on storage systems, technology and computer science. [211]
  • 1986: Nobel Prize: Scanning tunneling microscopy. IBM Fellows Gerd K. Binnig and Heinrich Rohrer of the IBM Zurich Research Laboratory win the 1986 Nobel Prize in physics for their work in scanning tunneling microscopy. Drs. Binnig and Rohrer are recognized for developing a powerful microscopy technique which permits scientists to make images of surfaces so detailed that individual atoms may be seen. [212]
  • 1987: Nobel Prize: High-Temperature Superconductivity. J. Georg Bednorz and IBM Fellow Alex Müller of the IBM Zurich Research Laboratory receive the 1987 Nobel Prize for physics for their breakthrough discovery of high-temperature superconductivity in a new class of materials. They discover superconductivity in ceramic oxides that carry electricity without loss of energy at much higher temperatures than any other superconductor. [213]
  • 1987: Antivirus tools. As personal computers become vulnerable to attack from viruses, a small research group at IBM develops, practically overnight, a suite of antivirus tools. The effort leads to the establishment of the High Integrity Computing Laboratory (HICL) at IBM. HICL goes on to pioneer the science of theoretical and observational computer virus epidemiology. [214]
  • 1987: Special needs access. IBM Researchers demonstrate the feasibility for blind computer users to read information directly from computer screens with the aid of an experimental mouse. And in 1988 the IBM Personal System/2 Screen Reader is announced, permitting blind or visually impaired people to hear the text as it is displayed on the screen in the same way a sighted person would see it. This is the first in the IBM Independence Series of products for computer users with special needs. [215]
  • 1988: IBM AS/400. IBM introduces the IBM Application System/400 (AS/400), a new family of easy-to-use computers designed for small and intermediate-sized companies. As part of the introduction, IBM and IBM Business Partners worldwide announce more than 1,000 software packages in the biggest simultaneous applications announcement in computer history. The AS/400 quickly becomes one of the world's most popular business computing systems. [216]
  • 1988: National Science Foundation Network (NSFNET). IBM collaborates with the Merit Network, MCI Communications, the State of Michigan, and the National Science Foundation to upgrade and expand the 56K bit per second NSFNET to 1.5M bps (T1) and later 45M bps (T3). This partnership provides the network infrastructure and lays the groundwork for the explosive growth of the Internet in the 1990s. The NSFNET upgrade boosts network capacity, not only making it faster, but also allowing more intensive forms of data, such as the graphics now common on the World Wide Web, to travel across the Internet. [217]
  • 1989: Silicon germanium transistors. The replacing of expensive and exotic materials like gallium arsenide with silicon germanium (known as SiGe), championed by IBM Fellow Bernie Meyerson, creates faster chips at lower costs. Introducing germanium into the base layer of an otherwise all-silicon bipolar transistor allows for significant improvements in operating frequency, current, noise and power capabilities. [218]
  • 1990: System/390. IBM makes its most comprehensive product announcement in 25 years by introducing the System/390 family. IBM incorporates complementary metal-oxide-semiconductor (CMOS) based processors into the System/390 Parallel Enterprise Server in 1995, and in 1998 the System/390 G5 Parallel Enterprise Server 10-way Turbo model smashed the 1,000 MIPS barrier, making it the world's most powerful mainframe. [219]
  • 1990: RISC System/6000. IBM announces the RISC System/6000, a family of nine workstations that are among the fastest and most powerful in the industry. The RISC System/6000 uses Reduced instruction set computing technology, an innovative computer design pioneered by IBM that simplifies processing steps to speed the execution of commands. [220]
  • 1990: Moving individual atoms. Donald M. Eigler, a physicist and IBM Fellow at the IBM Almaden Research Center demonstrated the ability to manipulate individual atoms using a scanning tunneling microscope, writing I-B-M using 35 individual xenon atoms. [221]
  • 1990: Environmental programs. IBM joins 14 other leading U.S. corporations in April to establish a worldwide program designed to achieve environmental, health and safety goals by continuously improving environmental management practices and performance. IBM has invested more than $1 billion since 1973 to provide environmental protection for the communities in which IBM facilities are located. [222]
  • 1991: Services business. IBM reenters the computer services business through the formation of the Integrated Systems Solution Corporation (ISSC). While still complying with the provisions of the 1956 consent decree, in just four years ISSC becomes the second-largest provider of computer services. The new business becomes one of IBM's primary revenue streams. [223]
  • 1992: ThinkPad. IBM introduces a new line of notebook computers. Housed in a distinctive black case and featuring the innovative TrackPoint device nestled in the middle of the keyboard, the ThinkPad is an immediate hit and goes on to collect more than 300 awards for design and quality. [224]
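
As a purely illustrative aside on the structure mentioned in the 1977 DES entry above (this is not DES itself, it is not secure, and every constant below is an arbitrary demo value), the following Python sketch shows the Feistel construction DES is built on: the block is split in half, one half is repeatedly mixed with a keyed round function, and running the same rounds with the keys reversed undoes the encryption.

    # Toy Feistel-network sketch for illustration only -- NOT DES and not secure.
    def round_fn(half, key):
        """Stand-in round function; real DES uses expansions, S-boxes and permutations."""
        return ((half * 0x9E3779B1) ^ key) & 0xFFFFFFFF

    def feistel(block, round_keys):
        left, right = block >> 32, block & 0xFFFFFFFF        # split the 64-bit block
        for k in round_keys:
            left, right = right, left ^ round_fn(right, k)   # one Feistel round
        return (right << 32) | left                          # final half-swap

    keys = [0xA5A5A5A5, 0x3C3C3C3C, 0x0F0F0F0F, 0x12345678]  # arbitrary demo round keys
    plaintext = 0x0123456789ABCDEF
    ciphertext = feistel(plaintext, keys)
    assert feistel(ciphertext, list(reversed(keys))) == plaintext   # decrypts correctly
    print(hex(ciphertext))

The appeal of this construction, in DES as in the toy above, is that the round function never needs to be inverted; simply reversing the order of the round keys is enough to decrypt.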

1993–2018: IBM's near disaster and rebirth

Year Gross income (in $m) Employees
1985 50,050 405,535
1990 69,010 373,816
1995 71,940 225,347
2000 85,090 316,303
2005 91,400 329,373
2010 99,870 426,751

In April 1993, IBM hired Louis V. Gerstner, Jr. as its new CEO. For the first time since 1914, IBM had recruited a leader from outside its ranks. Gerstner had been chairman and CEO of RJR Nabisco for four years, and had previously spent 11 years as a top executive at American Express. Gerstner brought with him a customer-oriented sensibility and the strategic-thinking expertise he had honed through years as a management consultant at McKinsey & Co. Recognizing that his first priority was to stabilize the company, he adopted a triage mindset and took quick, dramatic action. His early decisions included recommitting to the mainframe, selling the Federal Systems Division to Loral in order to replenish the company's cash coffers, continuing to shrink the workforce (which reached a low of 220,000 employees in 1994), and driving significant cost reductions within the company. Most importantly, Gerstner decided to reverse the move to spin off IBM business units into separate companies. He recognized that one of IBM's enduring strengths was its ability to provide integrated solutions for customers – to offer more than piece parts or components. Splitting the company would have destroyed that unique IBM advantage. [225]

These initial steps worked. IBM was in the black by 1994, turning a profit of $3 billion. Stabilization was not Gerstner's endgame – the restoration of IBM's once-great reputation was. To do that, he needed to devise a winning business strategy. [226] Over the next decade, Gerstner crafted a business model that shed commodity businesses and focused on high-margin opportunities. IBM divested itself of low-margin businesses (DRAM, the IBM Network, personal printers, and hard drives). The company regained the business initiative by building upon the decision to keep the company whole: it unleashed a global services business that rapidly rose to become a leading technology integrator. Crucial to this success was the decision to become brand agnostic – IBM would integrate whatever technologies the client required, even if they came from an IBM competitor. [227] IBM augmented this services business with the 2002 acquisition of the consultancy division of PricewaterhouseCoopers for US$3.5 billion. [228]

Another high margin opportunity IBM invested heavily in was software, a strategic move that proved equally visionary. Starting in 1995 with its acquisition of Lotus Development Corp., IBM built up its software portfolio from one brand, DB2, to five: DB2, Lotus, WebSphere, Tivoli, and Rational. Content to leave the consumer applications business to other firms, IBM's software strategy focused on middleware – the vital software that connects operating systems to applications. The middleware business played to IBM's strengths, and its higher margins improved the company's bottom line significantly as the century came to an end. [229]

Not all software that IBM developed was successful. While OS/2 was arguably technically superior to Microsoft Windows 95, OS/2 sales were largely concentrated in the networked computing used by corporate professionals, and the system failed to develop much penetration in the consumer and stand-alone desktop PC segments. There were reports that it could not be installed properly on IBM's own Aptiva series of home PCs. [230] Microsoft offered in 1994 that if IBM ended development of OS/2 completely, it would receive the same terms as Compaq for a license of Windows 95. IBM refused and instead pursued an "IBM First" strategy of promoting OS/2 Warp and disparaging Windows, as IBM aimed to drive sales of its own software and hardware. The already difficult Windows 95 licensing negotiations between IBM and Microsoft stalled in 1995 when IBM purchased Lotus Development, whose Lotus SmartSuite would have competed directly with Microsoft Office. As a result, IBM received its Windows 95 license later than its competitors, which hurt sales of IBM PCs. IBM officials later conceded that OS/2 would not have been a viable operating system to keep them in the PC business. [231] [232]

While IBM hardware and technologies were relatively de-emphasized in Gerstner's three-legged business model, they were not relegated to secondary status. The company brought its world-class research organization to bear more closely on its existing product lines and development processes. While Internet applications and deep computing overtook client-server computing as key business technology priorities, mainframes returned to relevance. IBM reinvigorated its mainframe line with CMOS technologies, which made the machines among the most powerful and cost-efficient in the marketplace. [233] Investments in microelectronics research and manufacturing made IBM a world leader in specialized, high-margin chip production – it developed 200 mm wafer processes in 1992, and 300 mm wafers within the decade. [234] IBM-designed chips were used in the PlayStation 3, Xbox 360, and Wii game consoles. IBM also regained the lead in supercomputing with high-end machines based upon scalable parallel processor technology.

Equally significant in IBM's revival was its successful reentry into the popular mindset. Part of this revival was based on IBM technology. On October 5, 1992, at the COMDEX computer expo, IBM announced the first ThinkPad laptop computer, the 700C. The ThinkPad, a premium machine which then cost US$4,350, included a 25 MHz Intel 80486SL processor, a 10.4-inch active matrix display, a removable 120 MB hard drive, 4 MB of RAM (expandable to 16 MB) and a TrackPoint II pointing device. [235] The striking black design by noted designer Richard Sapper made the ThinkPad an immediate hit with the digerati, and the cool factor of the ThinkPad brought back some of the cachet the IBM brand had lost in the PC wars of the 1980s. Instrumental to this popular resurgence was the 1997 chess match between IBM's chess-playing computer system Deep Blue and reigning world chess champion Garry Kasparov. Deep Blue's victory was a historic first for a computer over a reigning world champion. Also helping the company reclaim its position as a technology leader was its annual domination of supercomputer rankings [236] and patent leadership statistics. [237] Ironically, a serendipitous contributor to the revival of the company's reputation was the dot-com bubble collapse in 2000, in which many of the edgy technology high-flyers of the 1990s failed to survive the downturn. Those collapses discredited some of the more fashionable Internet-driven business models against which stodgy IBM had previously been compared.

Another part of the successful reentry into the popular mindset was the company's revival of the IBM brand. The company's marketing during the economic downturn was chaotic, presenting many different, sometimes discordant voices in the marketplace. This brand chaos was attributable in part to the company having 70 different advertising agencies in its employ. In 1994, IBM eliminated this chaos by consolidating its advertising in one agency. The result was a coherent, consistent message to the marketplace. [238]

As IBM recovered its financial footing and its industry leadership position, the company remained aggressive in preaching to the industry that it was not the Old IBM, that it had learned from its near-death experiences, and that it had been fundamentally changed by them. It sought to redefine the Internet age in ways that played to traditional IBM strengths, couching the discussion in business-centric terms with initiatives like e-commerce and On Demand. [239] And it supported open source initiatives, forming collaborative ventures with partners and competitors alike. [240]

Change was manifested in IBM in other ways as well. The company revamped its varied philanthropic practices to bring a sharp focus on improving K-12 education. It ended its 40-year technology partnership with the International Olympic Committee after a successful engagement at the 2000 Olympic Games in Sydney, Australia. On the human resources front, IBM's adoption and integration of diversity principles and practices was cutting edge. It added sexual orientation to its non-discrimination practices in 1984, created executive diversity task forces in 1995, and offered domestic partner benefits to its employees in 1996. The company is routinely listed among the best places for employees, employees of color, and women to work. [241] And in 1996, the Women in Technology International Hall of Fame inducted three IBMers as part of its inaugural class of 10 women: Ruth Leach Amonette, the first woman to hold an executive position at IBM; Barbara Grant, PhD, the first woman to be named an IBM site general manager; and Linda Sanford, the highest-placed technical woman at IBM. Fran Allen, an early software pioneer and another IBM hero for her innovative work in compilers over the decades, was inducted in 1997. [242]

Gerstner retired at the end of 2002, and was replaced by long-time IBMer Samuel J. Palmisano.

Key events

  • 1993: Billion-dollar losses. IBM misreads two significant trends in the computer industry, personal computers and client-server computing, and as a result loses more than $8 billion in 1993, its third straight year of billion-dollar losses. Since 1991, the company has lost $16 billion, and many feel IBM is no longer a viable player in the industry. [243]
  • 1993: Louis V. Gerstner, Jr. Gerstner arrives as IBM's chairman and CEO on April 1, 1993. For the first time since the arrival of Thomas J. Watson, Sr., in 1914, IBM has a leader pulled from outside its ranks. Gerstner had been chairman and CEO of RJR Nabisco for four years and had previously spent 11 years as a top executive at American Express. [244]
  • 1993: IBM Scalable POWERparallel system. IBM introduces the Scalable POWERparallel System, the first in a family of microprocessor-based supercomputers using RISC System/6000 technology. IBM pioneers the breakthrough scalable parallel system technology of joining smaller, mass-produced computer processors rather than relying on one larger, custom-designed processor. Complex queries can then be broken down into a series of smaller jobs that are run concurrently ("in parallel") to speed their completion (see the sketch after this list). [245]
  • 1994: Turnaround. IBM reports a profit for the year, its first since 1990. Over the next few years, the company successfully charts a new business course, one that focuses less on its traditional strengths in hardware, and more on services, software, and its ability to craft technology solutions. [246]
  • 1994: IBM RAMAC Array Storage Family. The IBM RAMAC Array Family is announced. With features like highly parallel processing, multi-level cache, RAID 5, and redundant components, RAMAC represents a major advance in information storage technology. Consisting of the RAMAC Array Direct Access Storage Device (DASD) and the RAMAC Array Subsystem, the products become one of IBM's most successful storage product launches ever, with almost 2,000 systems shipped to customers in its first three months of availability. [247]
  • 1994: Speech recognition. IBM releases the IBM Personal Dictation System (IPDS), the first wave of speech recognition products for the personal computer. It is later renamed VoiceType, and its capabilities are expanded to include control of computer applications and desktops simply by talking to them, without touching a keyboard. In 1997 IBM announces ViaVoice Gold, software that gives people a hands-free way to dictate text and navigate the desktop with the power of natural, continuous speech. [248]
  • 1995: Lotus Development Corporation acquisition. IBM acquires all of the outstanding shares of the Lotus Development Corporation, whose pioneering Notes software enables greater collaboration across an enterprise and whose acquisition makes IBM the world's largest software company. [249]
  • 1995: Glueball calculation. IBM scientists complete a two-year calculation – the largest single numerical calculation in the history of computing – to pin down the properties of an elusive elementary particle called a "glueball." The calculation was carried out on GF11, a massively parallel computer at the IBM Thomas J. Watson Research Center. [250]
  • 1996: IBM Austin Research Laboratory opens. Based in Austin, Texas, the lab is focused on advanced circuit design as well as new design techniques and tools for very high performance microprocessors. [251]
  • 1996: Atlanta Olympics. IBM suffers a highly public embarrassment when its IT support of the Olympic Games in Atlanta experiences technical difficulties. [252]
  • 1996: Domestic partner benefits. IBM announces Domestic Partner Benefits for gay and lesbian employees. [253]
  • 1997: Deep Blue. The 32-node IBM RS/6000 SP supercomputer, Deep Blue, defeats World Chess Champion Garry Kasparov in the first known instance of a computer vanquishing a reigning world champion chess player in a tournament-style competition. [254]
  • 1997: eBusiness. IBM coins the term and defines an enormous new industry by using the Internet as a medium for real business and institutional transformation. e-business becomes synonymous with doing business in the Internet age. [255]
  • 1998: CMOS Gigaprocessor. IBM unveils the first microprocessor that runs at 1 billion cycles per second. IBM scientists develop new silicon-on-insulator chips to be used in the construction of a mainstream processor. The breakthrough ushers in new circuit designs and product groups. [256]
  • 1999: Blue Gene. IBM Research starts a computer architecture cooperative project with the Lawrence Livermore National Laboratory, the United States Department of Energy (which is partially funding the project), and academia to build new supercomputers capable of more than one quadrillion operations per second (one petaflop). Nicknamed "Blue Gene," the new supercomputers perform 500 times faster than other powerful supercomputers and can simulate the folding of complex proteins. [257]
  • 2000: Quantum mirage nanotechnology. IBM scientists discover a way to transport information on the atomic scale that uses electrons instead of conventional wiring. This new phenomenon, called the Quantum mirage effect, enables data transfer within future nanoscale electronic circuits too small to use wires. The quantum mirage technique is a unique way of sending information through solid forms and could do away with wiring that connects nanocircuit components. [258]
  • 2000: IBM ASCI White – Fastest supercomputer. IBM delivers the world's most powerful computer to the US Department of Energy, powerful enough to process an Internet transaction for every person on Earth in less than a minute. IBM built the supercomputer to accurately test the safety and effectiveness of the nation's aging nuclear weapons stockpile. This computer is 1,000 times more powerful than Deep Blue, the supercomputer that beat Garry Kasparov in chess in 1997. [259]
  • 2000: Flexible transistors. IBM created flexible transistors, combining organic and inorganic materials as a medium for semiconductors. This technology enables things like an "electronic newspaper", so lightweight and inexpensive that leaving one behind on the airplane or in a hotel lobby is no big deal. By eliminating the limitations of etching computer circuits in silicon, flexible transistors make it possible to create a new generation of inexpensive computer displays that can be embedded into curved plastic or other materials. [260]
  • 2000: Sydney Olympics. After a successful engagement at the 2000 Olympic games in Sydney, IBM ends its 40-year technology partnership with the International Olympic Committee. [261]
  • 2001: Holocaust controversy. A controversial book, IBM and the Holocaust: The Strategic Alliance Between Nazi Germany and America's Most Powerful Corporation by Edwin Black, accuses IBM of having knowingly assisted Nazi authorities in the perpetuation of the Holocaust through the provision of tabulating products and services. Several lawsuits are filed against IBM by Holocaust victims seeking restitution for their suffering and losses. All lawsuits related to this issue were eventually dropped without recovery. [262]
  • 2001: Carbon nanotube transistors. IBM researchers build the world's first transistors out of carbon nanotubes – tiny cylinders of carbon atoms that are 500 times smaller than silicon-based transistors and 1,000 times stronger than steel. The breakthrough is an important step in finding materials that can be used to build computer chips when silicon-based chips can't be made any smaller. [263]
  • 2001: Low power initiative. IBM launches its low-power initiative to improve the energy efficiency of IT and accelerates the development of ultra-low power components and power-efficient servers, storage systems, personal computers and ThinkPad notebook computers. [264]
  • 2001: Greater density & chip speeds. IBM is first to mass-produce computer hard disk drives using a revolutionary new type of magnetic coating – "pixie dust" – that eventually quadruples data density of current hard disk drive products. IBM also unveils "strained silicon," a breakthrough that alters silicon to boost chip speeds by up to 35 percent. [265][266]
  • 2002: The hard disk drive business is sold to Hitachi. [267]
  • 2003: Blue Gene/L. The Blue Gene team unveils a prototype of its Blue Gene/L computer, roughly the size of a standard dishwasher, that ranks as the 73rd most powerful supercomputer in the world. This cubic-meter machine is a small-scale model of the full Blue Gene/L built for the Lawrence Livermore National Laboratory in California, which will be 128 times larger when it is unveiled two years later. [268]
  • 2005: Crusade Against Cancer. IBM joins forces with Memorial Sloan-Kettering Cancer Center (MSKCC), the Molecular Profiling Institute and the CHU Sainte-Justine Research Center to collaborate on cancer research by building state-of-the-art integrated information management systems. [269]
  • 2005: The PC division is sold. The PC division (including ThinkPads) is sold to Chinese manufacturer Lenovo. [270]
  • 2006: Translation software. IBM delivers an advanced speech-to-speech translation system to U.S. forces in Iraq using bidirectional English to Arabic translation software that improves communication between military personnel and Iraqi forces and citizens. The breakthrough software offsets the current shortage of military linguists. [271]
  • 2007: Renewable energy. IBM is recognized by the US EPA for its leading green power purchases in the US and for its support and participation in EPA's Fortune 500 Green Power Challenge. IBM ranked 12th on the EPA's list of Green Power Partners for 2007. IBM purchased enough renewable energy in 2007 to meet 4% of its US electricity use and 9% of its global electricity purchases. IBM's commitment to green power helps cut greenhouse gas emissions. [272]
  • 2007: River watch using IBM Stream Computing. In a unique collaboration, The Beacon Institute and IBM created the first technology-based river monitoring network. The River and Estuary Observatory Network (REON) allows for minute-to-minute monitoring of New York's Hudson River via an integrated network of sensors, robotics and computational technology. This first-of-its-kind project is made possible by IBM's "Stream Computing," a fundamentally new computer architecture that can examine thousands of information sources to help scientists better understand what is happening as it happens. [273][274]
  • 2007: Patent power. IBM has been granted more US patents than any other company. From 1993 to 2007, IBM was awarded over 38,000 US patents, and it has invested about $5 billion a year in research, development, and engineering since 1996. IBM's current active portfolio of about 26,000 patents in the US and over 40,000 patents worldwide is a direct result of that investment. [275]
  • 2008: IBM Roadrunner No.1 Supercomputer. For a record-setting ninth consecutive time, IBM takes the No.1 spot in the ranking of the world's most powerful supercomputers with the IBM computer built for the Roadrunner project at Los Alamos National Laboratory. It is the first in the world to operate at speeds faster than one quadrillion calculations per second and remains the world speed champion for over a year. The Los Alamos system is twice as energy-efficient as the No. 2 computer at the time, using about half the electricity to maintain the same level of computing power. [276]
  • 2008: Green power. IBM opens its "greenest" data center in Boulder, Colorado. The energy-efficient facility is part of a $350 million investment by IBM in Boulder to help meet customer demand for reducing energy costs. The new data center features leading-edge technologies and services, including high-density computing systems with virtualization technology. Green Power centers allow IBM and its customers to cut their carbon footprint. [277]
  • 2011: Watson. IBM's supercomputer Watson competed on the TV show Jeopardy! against Ken Jennings and Brad Rutter and won convincingly. The competition was presented by PBS. [278]
  • June 16, 2011: IBM founded 100 years ago. Mark Krantz and Jon Swartz state in USA Today that IBM "has remained at the forefront through the decades ... the fifth-most-valuable U.S. company [today] ... demonstrated a strength shared by most 100-year-old companies: the ability to change ... survived not only the Depression and several recessions, but technological shifts and intense competition as well." [279]
  • October 28, 2018: Red Hat acquisition. IBM announces its intent to acquire Red Hat for US$34 billion, in one of its largest-ever acquisitions. Red Hat will operate out of IBM's Hybrid Cloud division. [280][281][282][283][284]
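
The scalable parallel approach described in the 1993 POWERparallel entry above rests on a simple idea: split one large job into many smaller jobs, run them concurrently, and combine the results. The following minimal Python sketch illustrates that general idea only; the function names and workload are hypothetical and have no connection to IBM's actual SP implementation.

from multiprocessing import Pool

def partial_sum(chunk):
    # One "smaller job": sum the squares of a slice of the data.
    return sum(x * x for x in chunk)

def parallel_sum_of_squares(data, workers=4):
    # Split the input across workers, run the pieces concurrently, and combine the results.
    chunk_size = max(1, len(data) // workers)
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    with Pool(processes=workers) as pool:
        partials = pool.map(partial_sum, chunks)  # the smaller jobs run "in parallel"
    return sum(partials)                          # recombine the partial answers

if __name__ == "__main__":
    print(parallel_sum_of_squares(list(range(1_000_000))))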

2019–present

The 2019 acquisition of Red Hat enabled IBM to shift its focus to future platforms, according to IBM Chief Executive Arvind Krishna. [285]

In October 2020, IBM announced that it would split itself into two public companies. [286] IBM will focus on high-margin cloud computing and artificial intelligence, built on the foundation of the 2019 Red Hat acquisition. The legacy Managed Infrastructure Services unit will be spun off into a new public company, Kyndryl, which will manage clients' IT infrastructure and serve 4,600 clients in 115 countries, with a backlog of $60 billion. [287] [288]

The spin-off, part of this new focus on hybrid cloud, is larger than any of IBM's previous divestitures and was welcomed by investors. [289] [290] [291]

IBM dominated the electronic data processing market for most of the 20th century, initially controlling over 70 percent of the punch card and tabulating machine market and then achieving a similar share in the computer market. [292] IBM asserted that its success in achieving and maintaining such market share was due to its skill, industry, and foresight; governments and competitors asserted that the maintenance of such large shares was at least in part due to anti-competitive acts such as unfair prices, terms and conditions, tying, product manipulations, and the creation of FUD (fear, uncertainty, and doubt) in the marketplace. [293] IBM was thus the defendant in more than twenty government and private antitrust actions during the 20th century. IBM lost only one of these matters but did settle others in ways that profoundly shaped the industry, as summarized below. By the end of the 20th century, IBM was no longer so dominant in the computer industry. Some observers suggest that management's attention to the many antitrust lawsuits of the 1970s was at least in part responsible for its decline. [292]

1936 Consent Decree

In 1932, U.S. Government prosecutors charged that IBM's practice of requiring customers who leased its tabulating equipment to purchase the punched cards used on that equipment constituted anti-competitive tying. IBM lost, [294] and in the resulting 1936 consent decree it agreed to no longer require the use of IBM cards and to assist alternative card suppliers in starting production facilities that would compete with its own, thereby creating a separate market for punched cards and, in effect, for subsequent computer supplies such as tapes and disk packs. [295]

1956 Consent Decree

On January 21, 1952, the U.S. Government filed a lawsuit that resulted in a consent decree entered as a final judgment on January 25, 1956. [296] The government's goal of increasing competition in the data processing industry was effected through several provisions in the decree: [297]

  • IBM was required to sell equipment on terms that would not place purchasers at a disadvantage with respect to customers leasing the same equipment from IBM. Prior to this decree, IBM had only rented its equipment. This both created a market for used IBM equipment [297] and enabled lease financing of IBM equipment by third parties (leasing companies). [297]
  • IBM was required to provide parts and information to independent maintainers of purchased IBM equipment, [297] enabling and creating a demand for such hardware maintenance services.
  • IBM was required to sell data processing services through a subsidiary that could be treated no differently than any company independent of IBM, enabling competition in the data processing services business.
  • IBM was required to grant non-exclusive, non-transferable, worldwide licenses for any and all patents at reasonable royalty rates to anyone, provided the licensee cross-licensed its patents to IBM on similar terms. [296] This removed IBM patents as a barrier to competition in the data processing industry and enabled the emergence of manufacturers of equipment plug compatible to IBM equipment.

While the decree did little to limit IBM's future dominance of the then-nascent computer industry, it did enable competition in segments such as leasing, services, maintenance, and equipment attachable to IBM systems and reduced barriers to entry through mandatory reasonable patent cross-licensing.

The decree's terms remained in effect until 1996; they were then phased out over the next five years. [298]

1968–1984 Multiple Government and Private Antitrust Complaints

In 1968 the first of a series of antitrust suits against IBM was filed by Control Data Corp (CDC). It was followed in 1969 by the US government's antitrust complaint, then by 19 private US antitrust complaints and one European complaint. In the end IBM settled a few of these matters but mainly won. The US government's case, sustained by four US presidents and their attorneys general, was dropped as "without merit" in 1982 by William Baxter, President Reagan's Assistant Attorney General in charge of the Antitrust Division of the Department of Justice. [299]

1968–1973 Control Data Corp. v. IBM

CDC filed an antitrust lawsuit against IBM in Minnesota's federal court, alleging that IBM had monopolized the market for computers in violation of Section 2 of the Sherman Act by, among other things, announcing products it could not deliver. [300] A 1965 internal IBM memo by an IBM attorney noted that Control Data had publicly blamed its declining earnings on IBM "and its frequent model and price changes. There was some sentiment that the charges were true." [301] In 1973 IBM settled the CDC case for about $80 million in cash and the transfer of assets, including the IBM Service Bureau Corp., to CDC. [300]

1969–1982 U.S. v. IBM

On January 17, 1969, the United States of America filed a complaint in the United States District Court for the Southern District of New York, alleging that IBM had violated Section 2 of the Sherman Antitrust Act by monopolizing or attempting to monopolize the general-purpose electronic digital computer system market, specifically computers designed primarily for business. The US government subsequently alleged that IBM also violated the antitrust laws through actions directed against leasing companies and plug-compatible peripheral manufacturers.

In June 1969 IBM unbundled its software and services, a move many observers believed was made in anticipation of, and as a direct result of, the 1969 US antitrust lawsuit. Overnight, a competitive software market was created. [302]

Among the major violations asserted [303] were:

  • Anticompetitive price discrimination such as giving away software services.
  • Bundling of software with "related computer hardware equipment" for a single price.
  • Predatory pricing and preannouncement of specific hardware "fighting machines".
  • Developing and announcing specific hardware products primarily to discourage customers from acquiring competing products.
  • Announcing certain future products while knowing that it was unlikely to be able to ship them within the announced time frame.
  • Engaging in below-cost and discount conduct in selected markets in order to injure peripheral manufacturers and leasing companies.

It was in some ways one of the great single-firm monopoly cases of all time. IBM produced 30 million pages of materials during discovery and submitted its executives to a series of pretrial depositions. Trial began six years after the complaint was filed, and the parties then battled in court for another six years. The trial transcript contains over 104,400 pages, with thousands of documents placed in the record. The case ended on January 8, 1982, when William Baxter, then the Assistant Attorney General in charge of the Antitrust Division of the Department of Justice, dropped it as "without merit." [299]

1969–1981 Private antitrust lawsuits

The US government's 1969 antitrust lawsuit was followed by about 18 private antitrust complaints, all but one of which IBM ultimately won. Some notable lawsuits include:

Greyhound Computer Corp.

Greyhound, a leasing company, filed a case under Illinois' state antitrust law in Illinois state court. [304] The case went to trial in federal court in Arizona in 1972, ending with a directed verdict for IBM on the antitrust claims; however, the court of appeals reversed the decision in 1977. Just before the retrial was to start in January 1981, IBM and Greyhound settled the case for $17.7 million. [300]

Telex Corp.

Telex, a peripherals equipment manufacturer, filed suit on January 21, 1972, charging that IBM had monopolized and had attempted to monopolize the worldwide manufacture, distribution, sales, and leasing of electronic data processing equipment, including the relevant submarket of plug-compatible peripheral devices. After a non-jury trial in 1973, IBM was found guilty of "possessing and exercising monopoly power" over the "plug-compatible peripheral equipment market" and ordered to pay triple damages of $352.5 million and provide other relief, including disclosure of peripheral interface specifications. Separately, Telex was found guilty of misappropriating IBM trade secrets. [305] The judgment against IBM was overturned on appeal, and on October 4, 1975, both parties announced they were terminating their actions against each other. [306]

Other private lawsuits

Other private lawsuits ultimately won by IBM include those brought by California Computer Products Inc., [307] Memorex Corp., [308] Marshall Industries, Hudson General Corp., Transamerica Corporation, [309] and Forro Precision, Inc.

1980–1984 European Union

The European Economic Community's Commission on Monopolies initiated proceedings against IBM under Article 86 of the Treaty of Rome for exploiting its domination of the continent's computer business and abusing its dominant market position through business practices designed to protect its position against plug-compatible manufacturers. The case was settled in 1984, with IBM agreeing to change its business practices with regard to disclosure of device interface information. [310]

Evolution of IBM's computer hardware

The story of IBM's hardware is intertwined with the story of the computer industry – from vacuum tubes, to transistors, to integrated circuits, to microprocessors and beyond. The following systems and series represent key steps:

  • IBM 700/7000 series – overview
  • IBM SSEC – 1948, the first operational machine able to treat its instructions as data
  • IBM 700 series – 1952–1958
  • IBM NORC – 1954, the first supercomputer [311]
  • IBM 650 – 1954, the world's first mass-produced computer
  • SAGE AN/FSQ-7 – 1958, half an acre of floor space, 275 tons, up to three megawatts; the largest computers ever built
  • IBM 7000 series – 1959–1964, transistorized evolution of the IBM 700 series
  • IBM 1401 – 1959, "by the mid-1960s nearly half of all computer systems in the world were 1401-type systems." [312]
  • IBM System/360 – 1964, the first family of computers designed to cover the complete range of applications, small to large, commercial and scientific
  • IBM RS/6000 – RISC processor-based systems
  • IBM System i – was earlier the IBM AS/400, then the IBM eServer iSeries
  • IBM System z – was earlier the IBM System/390

Components

Evolution of IBM's operating systems

IBM operating systems have paralleled hardware development. On early systems, operating systems represented a relatively modest level of investment, and were essentially viewed as an adjunct to the hardware. By the time of the System/360, however, operating systems had assumed a much larger role, in terms of cost, complexity, importance, and risk.

Mainframe operating systems include:

  • OS family, including: OS/360, OS/MFT, OS/MVT, OS/VS1, OS/VS2, MVS, OS/390, z/OS
  • DOS family, including: DOS/360, DOS/VS, DOS/VSE, z/VSE
  • VM family, including: CP/CMS (See: History of CP/CMS), VM/370, VM/XA, VM/ESA, z/VM
  • Special purpose systems, including: TPF, z/TPF

Other significant operating systems include OS/2 and PC DOS on personal computers, AIX on UNIX workstations and servers, and OS/400 (later IBM i) on midrange systems.

High-level languages

Early IBM computer systems, like those from many other vendors, were programmed using assembly language. Computer science efforts through the 1950s and early 1960s led to the development of many new high-level languages (HLL) for programming. IBM played a complicated role in this process. Hardware vendors were naturally concerned about the implications of portable languages that would allow customers to pick and choose among vendors without compatibility problems. IBM, in particular, helped create barriers that tended to lock customers into a single platform.

IBM had a significant role in the following major computer languages:

  • FORTRAN – for years, the dominant language for mathematics and scientific programming
  • PL/I – an attempt to create a "be all and end all" language
  • COBOL – eventually the ubiquitous, standard language for business applications
  • APL – an early interactive language with a mathematical notation
  • PL/S – an internal systems programming language proprietary to IBM
  • RPG – an acronym for 'Report Program Generator', developed on the IBM 1401 to produce reports from data files; the General Systems Division enhanced the language to HLL status on its midrange systems to rival COBOL
  • SQL – a relational query language developed for IBM's System R, now the standard RDBMS query language (see the sketch after this list)
  • REXX – a macro and scripting language based on PL/I syntax, originally developed for the Conversational Monitor System (CMS) and authored by IBM Fellow Mike Cowlishaw
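
As a quick illustration of the SQL entry above, the sketch below runs a small declarative relational query of the kind SQL standardized. It is a generic example using Python's built-in sqlite3 module; the table, columns, and data are hypothetical and have no connection to System R or any IBM product.

import sqlite3

# Build a tiny in-memory relational table with hypothetical data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("Acme", 120.0), ("Acme", 80.0), ("Globex", 45.5)])

# The SQL statement says *what* result is wanted (totals per customer);
# the database engine decides *how* to compute it.
query = "SELECT customer, SUM(amount) AS total FROM orders GROUP BY customer ORDER BY total DESC"
for customer, total in conn.execute(query):
    print(customer, total)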

IBM and AIX/UNIX/Linux/SCO

IBM developed a conflicted relationship with the UNIX and Linux worlds. The importance of IBM's large-computer business placed unusual pressures on all of IBM's attempts to develop other lines of business, and all such projects faced the risk of being seen as competing against company priorities: if a customer decided to build an application on an RS/6000 platform, that also meant a decision had been made against a mainframe platform. So despite having some excellent technology, IBM often placed itself in a compromised position.

A case in point is IBM's GFIS products for infrastructure management and GIS applications. Despite long having a dominant position in such industries as electric, gas, and water utilities, IBM stumbled badly in the 1990s trying to build workstation-based solutions to replace its old mainframe-based products. Customers were forced to move on to new technologies from other vendors, and many felt betrayed by IBM.

IBM embraced open source technologies in the 1990s. It later became embroiled in complex litigation with the SCO Group over intellectual property rights related to the UNIX and Linux platforms.

BICARSA (Billing, Inventory Control, Accounts Receivable, & Sales Analysis)

1983 saw the announcement of the System/36, the replacement for the System/34. And in 1988, IBM announced the AS/400, intended to represent a point of convergence for both System/36 customers and System/38 customers. The 1970s had seen IBM develop a range of Billing, Inventory Control, Accounts Receivable, & Sales Analysis (BICARSA) applications for specific industries: construction (CMAS), distribution (DMAS), and manufacturing (MMAS), all written in the RPG II language. By the end of the 1980s, IBM had almost completely withdrawn from the BICARSA applications marketplace. Because of developments in the antitrust cases against IBM brought by the US government and European Union, IBM sales representatives were now able to work openly with application software houses as partners. (For a period in the early 1980s, a 'rule of three' operated, which obliged IBM sales representatives, if they were to propose a third-party application to a customer, to also list at least two other third-party vendors in the IBM proposal. This caused some amusement to the customer, who would typically have engaged in intense negotiations with one of the third parties and probably not have heard of the other two vendors.)

Non-computer lines of business

IBM is largely known for overtaking UNIVAC's early-1950s public fame and then leading the computer industry for much of the latter part of the century. However, it has also had roles, some significant, in other industries, including:

  • IBM was the largest supplier of unit record equipment (punched cards, keypunches, accounting machines, and the like) in the first part of the 20th century.
  • Food services (meat and coffee grinders, computing cheese slicers, computing scales) – founding to 1934, sold to Hobart Manufacturing Co. [313]
  • Time recorders (punch clocks, school, and factory clocks) – founding to 1958, sold to Simplex Time Recorder Company. [40] See IBM: History of the Time Equipment Division and its Products and the 1935 International Time Recording Catalog.
  • Typewriters and personal printers – see IBM Electric typewriter and IBM Selectric typewriter; divested in 1991, now part of Lexmark. [314]
  • Copiers – 1970 to 1988, sold to Eastman Kodak in 1988.
  • Other office products such as dictation machines, word processors.
  • Military products (Browning Automatic Rifle, bombsights) – IBM's World War II production
  • Digital telephone switches – partnership (1983), acquisition (1984), and sale (1989–1992) of ROLM to Siemens AG[205][206][207][208]
  • Stadium scoreboards
  • Real estate (at one time owning vast tracts of undeveloped land on the U.S. east coast)
  • Medical instruments: heart-lung machine, prostheses, IBM 2991 Blood Cell Washer, IBM 2997 Blood Cell Separator, IBM 5880 Electrocardiograph System

CEOs, Notable IBMers

For IBM's corporate biographies of former CEOs and many others, see the IBM Archives' biographies in the Builders reference room.

