Philosophy
Deceased--John McCarthy
John McCarthy
September 4th, 1927 to October 24th, 2011
"John McCarthy, 84, Dies; Computer Design Pioneer"byJohn MarkoffOctober 25th, 2011The New York Times John McCarthy, a computer scientist who helped design the foundation of today’s Internet-based computing and who is widely credited with coining the term for a frontier of research he helped pioneer, Artificial Intelligence, or A.I., died on Monday at his home in Stanford, Calif. He was 84.The cause was complications of heart disease, his daughter Sarah McCarthy said.Dr. McCarthy’s career followed the arc of modern computing. Trained as a mathematician, he was responsible for seminal advances in the field and was often called the father of computer time-sharing, a major development of the 1960s that enabled many people and organizations to draw simultaneously from a single computer source, like a mainframe, without having to own one.By lowering costs, it allowed more people to use computers and laid the groundwork for the interactive computing of today.Though he did not foresee the rise of the personal computer, Dr. McCarthy was prophetic in describing the implications of other technological advances decades before they gained currency.“In the early 1970s, he presented a paper in France on buying and selling by computer, what is now called electronic commerce,” said Whitfield Diffie, an Internet security expert who worked as a researcher for Dr. McCarthy at the Stanford Artificial Intelligence Laboratory.And in the study of artificial intelligence, “no one is more influential than John,” Mr. Diffie said.While teaching mathematics at Dartmouth in 1956, Dr. McCarthy was the principal organizer of the first Dartmouth Conference on Artificial Intelligence.The idea of simulating human intelligence had been discussed for decades, but the term “artificial intelligence” — originally used to help raise funds to support the conference — stuck.In 1958, Dr. McCarthy moved to the Massachusetts Institute of Technology, where, with Marvin Minsky, he founded the Artificial Intelligence Laboratory. It was at M.I.T. that he began working on what he called List Processing Language, or Lisp, a computer language that became the standard tool for artificial intelligence research and design.Around the same time he came up with a technique called garbage collection, in which pieces of computer code that are not needed by a running computation are automatically removed from the computer’s random access memory.He developed the technique in 1959 and added it to Lisp. That technique is now routinely used in Java and other programming languages.His M.I.T. work also led to fundamental advances in software and operating systems. In one, he was instrumental in developing the first time-sharing system for mainframe computers.The power of that invention would come to shape Dr. McCarthy’s worldview to such an extent that when the first personal computers emerged with local computing and storage in the 1970s, he belittled them as toys.Rather, he predicted, wrongly, that in the future everyone would have a relatively simple and inexpensive computer terminal in the home linked to a shared, centralized mainframe and use it as an electronic portal to the worlds of commerce and news and entertainment media.Dr. McCarthy, who taught briefly at Stanford in the early 1950s, returned there in 1962 and in 1964 became the founding director of the Stanford Artificial Intelligence Laboratory, or SAIL. 
Its optimistic, space-age goal, with financial backing from the Pentagon, was to create a working artificial intelligence system within a decade. Years later he developed a healthy respect for the challenge, saying that creating a “thinking machine” would require “1.8 Einsteins and one-tenth the resources of the Manhattan Project.”

Artificial intelligence is still thought to be far in the future, though tremendous progress has been made in systems that mimic many human skills, including vision, listening, reasoning and, in robotics, the movements of limbs. From the mid-’60s to the mid-’70s, the Stanford lab played a vital role in creating some of these technologies, including robotics, machine vision and natural language.

In 1972, the laboratory drew national attention when Stewart Brand, the founder of The Whole Earth Catalog, wrote about it in Rolling Stone magazine under the headline “SPACEWAR: Fanatic Life and Symbolic Death Among the Computer Bums.” The article evoked the esprit de corps of a group of researchers who had been freed to create their own virtual worlds, foreshadowing the emergence of cyberspace. “Ready or not, computers are coming to the people,” Mr. Brand wrote.

Dr. McCarthy had begun inviting the Homebrew Computer Club, a Silicon Valley hobbyist group, to meet at the Stanford lab. Among its growing membership were Steven P. Jobs and Steven Wozniak, who would go on to found Apple. Mr. Wozniak designed his first personal computer prototype, the Apple 1, to share with his Homebrew friends.

But Dr. McCarthy still cast a jaundiced eye on personal computing. In the second Homebrew newsletter, he suggested the formation of a “Bay Area Home Terminal Club” to provide computer access on a shared Digital Equipment computer. He thought a user fee of $75 a month would be reasonable.

Though Dr. McCarthy would initially miss the significance of the PC, his early thinking on electronic commerce would influence Mr. Diffie at the Stanford lab. Drawing on those ideas, Mr. Diffie began thinking about what would replace the paper personal check in an all-electronic world. He and two other researchers went on to develop the basic idea of public key cryptography, which is now the basis of all modern electronic banking and commerce, providing secure interaction between a consumer and a business.

A chess enthusiast, Dr. McCarthy had begun working on chess-playing computer programs in the 1950s at Dartmouth. Shortly after joining the Stanford lab, he engaged a group of Soviet computer scientists in an intercontinental chess match after he discovered they had a chess-playing computer. Played by telegraph, the match consisted of four games and lasted almost a year. The Soviet scientists won.

John McCarthy was born on Sept. 4, 1927, into a politically engaged family in Boston. His father, John Patrick McCarthy, was an Irish immigrant and a labor organizer. His mother, the former Ida Glatt, a Lithuanian Jewish immigrant, was active in the suffrage movement. Both parents were members of the Communist Party.
The family later moved to Los Angeles in part because of John’s respiratory problems. He entered the California Institute of Technology in 1944 and went on to graduate studies at Princeton, where he was a colleague of John Forbes Nash Jr., the Nobel Prize-winning economist and subject of Sylvia Nasar’s book “A Beautiful Mind,” which was adapted into a movie.

At Princeton, in 1949, he briefly joined the local Communist Party cell, which had two other members: a cleaning woman and a gardener, he told an interviewer. But he quit the party shortly afterward. In the ’60s, as the Vietnam War escalated, his politics took a conservative turn as he grew disenchanted with leftist politics.

In 1971 Dr. McCarthy received the Turing Award, the most prestigious award given by the Association for Computing Machinery, for his work in artificial intelligence. He was awarded the Kyoto Prize in 1988, the National Medal of Science in 1991 and the Benjamin Franklin Medal in 2003.

Dr. McCarthy was married three times. His second wife, Vera Watson, a member of the American Women’s Himalayan Expedition, died in a climbing accident on Annapurna in 1978. Besides his daughter Sarah, of Nevada City, Calif., he is survived by his wife, Carolyn Talcott, of Stanford; another daughter, Susan McCarthy, of San Francisco; and a son, Timothy, of Stanford.

He remained an independent thinker throughout his life. Some years ago, one of his daughters presented him with a license plate bearing one of his favorite aphorisms: “Do the arithmetic or be doomed to talk nonsense.”

John McCarthy [Wikipedia]
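The obituary describes garbage collection in a single sentence: storage that a running computation can no longer reach is reclaimed automatically. As a minimal sketch of that idea, assuming nothing about McCarthy's original Lisp collector, here is a toy mark-and-sweep pass in Python; the names (Obj, mark, sweep, the explicit heap list and root set) are purely illustrative, and a real collector would discover reachability through the language runtime rather than a hand-built object list.

```python
class Obj:
    """A toy heap object that may reference other heap objects."""
    def __init__(self, name):
        self.name = name
        self.refs = []       # outgoing references to other Obj instances
        self.marked = False  # set during the mark phase

def mark(roots):
    """Mark every object reachable from the root set."""
    stack = list(roots)
    while stack:
        obj = stack.pop()
        if not obj.marked:
            obj.marked = True
            stack.extend(obj.refs)

def sweep(heap):
    """Keep marked (reachable) objects, drop the rest, and clear marks."""
    survivors = []
    for obj in heap:
        if obj.marked:
            obj.marked = False
            survivors.append(obj)
    return survivors

# Usage: 'a' and 'c' are reachable from the roots; 'b' is garbage.
a, b, c = Obj("a"), Obj("b"), Obj("c")
a.refs.append(c)
heap = [a, b, c]
mark([a])            # the running computation can still reach only 'a' (and, through it, 'c')
heap = sweep(heap)
print([o.name for o in heap])  # ['a', 'c']; 'b' was unreachable and is reclaimed
```

The two phases mirror the description above: mark walks everything the computation can still reach, and sweep frees whatever was never marked, which is the basic contract modern collectors in Java and other languages still honor, however different their machinery.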