Internet Invention

The Sputnik Scare

On October 4, 1957, the Soviet Union launched the world’s first manmade satellite into orbit. The satellite, known as Sputnik, did not do much: It tumbled aimlessly around in outer space, sending blips and bleeps from its radio transmitters as it circled the Earth. Still, to many Americans, the beach-ball-sized Sputnik was proof of something alarming: While the brightest scientists and engineers in the United States had been designing bigger cars and better television sets, it seemed, the Soviets had been focusing on less frivolous things—and they were going to win the Cold War because of it.
After Sputnik’s launch, many Americans began to think more seriously about science and technology. Schools added courses on subjects like chemistry, physics and calculus. Corporations took government grants and invested them in scientific research and development. And the federal government itself formed new agencies, such as the National Aeronautics and Space Administration (NASA) and the Department of Defense’s Advanced Research Projects Agency (ARPA), to develop space-age technologies such as rockets, weapons and computers.

The Birth of the ARPAnet

Scientists and military experts were especially concerned about what might happen in the event of a Soviet attack on the nation’s telephone system. Just one missile, they feared, could destroy the whole network of lines and wires that made efficient long-distance communication possible. In 1962, a scientist from M.I.T. and ARPA named J.C.R. Licklider proposed a solution to this problem: a “galactic network” of computers that could talk to one another. Such a network would enable government leaders to communicate even if the Soviets destroyed the telephone system.
In 1965, another M.I.T. scientist developed a way of sending information from one computer to another that he called “packet switching.” Packet switching breaks data down into blocks, or packets, before sending it to its destination. That way, each packet can take its own route from place to place. Without packet switching, the government’s computer network—now known as the ARPAnet—would have been just as vulnerable to enemy attacks as the phone system.
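The idea behind packet switching can be sketched in a few lines of code. This is a toy illustration, not real networking code; the function names and the message are invented for the example. A message is split into small numbered packets, the packets may arrive out of order after taking different routes, and the sequence numbers let the receiver reassemble the original data.

```python
def to_packets(data: bytes, size: int = 4):
    """Split data into (sequence number, chunk) packets."""
    return [(seq, data[i:i + size])
            for seq, i in enumerate(range(0, len(data), size))]

def reassemble(packets):
    """Rebuild the message regardless of arrival order."""
    return b"".join(chunk for _, chunk in sorted(packets))

message = b"LOGIN"
packets = to_packets(message, size=2)
packets.reverse()  # simulate packets arriving out of order
assert reassemble(packets) == message
```

Because each packet carries its own sequence number, no single route (or single destroyed line) is essential to delivering the whole message.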


In 1969, ARPAnet delivered its first message: a “node-to-node” communication from one computer to another. (The first computer was located in a research lab at UCLA and the second was at Stanford; each one was the size of a small house.) The message—“LOGIN”—was short and simple, but it crashed the fledgling ARPA network anyway: The Stanford computer received only the note’s first two letters.

The Network Grows

By the end of 1969, just four computers were connected to the ARPAnet, but the network grew steadily during the 1970s. In 1971, it added the University of Hawaii’s ALOHAnet, and two years later it added networks at University College London and Norway’s NORSAR. As packet-switched computer networks multiplied, however, it became more difficult for them to integrate into a single worldwide “Internet.”
By the end of the 1970s, computer scientists Vinton Cerf and Robert Kahn had begun to solve this problem by developing a way for all of the computers on all of the world’s mini-networks to communicate with one another. They called their invention “Transmission Control Protocol,” or TCP. (Later, they added an additional protocol, known as “Internet Protocol.” The acronym we use to refer to these today is TCP/IP.)  One writer describes Cerf’s protocol as “the ‘handshake’ that introduces distant and different computers to each other in a virtual space.”
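For readers curious what that “handshake” looks like in practice, here is a minimal sketch using Python’s standard socket module, which speaks TCP/IP. Both endpoints run on one machine, and the echoed message is arbitrary; real servers and clients are usually on different computers, but the connection setup is the same.

```python
import socket
import threading

def server(listener):
    conn, _ = listener.accept()       # TCP handshake completes here
    conn.sendall(conn.recv(1024))     # echo whatever arrives
    conn.close()

listener = socket.socket()
listener.bind(("127.0.0.1", 0))       # port 0: let the OS pick a free port
listener.listen(1)
threading.Thread(target=server, args=(listener,), daemon=True).start()

client = socket.socket()
client.connect(listener.getsockname())  # client side of the handshake
client.sendall(b"hello")
reply = client.recv(1024)
client.close()
assert reply == b"hello"
```

Once `connect` returns, the two endpoints have been “introduced” and can exchange bytes reliably, exactly the service TCP was designed to provide on top of packet switching.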

The World Wide Web

Cerf’s protocol transformed the Internet into a worldwide network. Throughout the 1980s, researchers and scientists used it to send files and data from one computer to another. However, in 1991 the Internet changed again. That year, a computer programmer working in Switzerland named Tim Berners-Lee introduced the World Wide Web: an Internet that was not simply a way to send files from one place to another but was itself a “web” of information that anyone on the Internet could retrieve. Berners-Lee created the Web that we know today.
Since then, the Internet has changed in many ways. In 1993, a group of students and researchers at the University of Illinois released a sophisticated browser that they called Mosaic. (Its lead developers went on to build Netscape Navigator.) Mosaic offered a user-friendly way to browse the Web: It was among the first browsers to show words and pictures on the same page, and it let users navigate using scrollbars and clickable links. A year earlier, in 1992, Congress had decided that the Web could be used for commercial purposes. As a result, companies of all kinds hurried to set up websites of their own, and e-commerce entrepreneurs began to use the Internet to sell goods directly to customers. More recently, social networking sites like Facebook have become a popular way for people of all ages to stay connected.

Famous Scientists

Abu Nasr Al-Farabi

Early Life:

Al-Farabi completed his early education at Farab and Bukhara, but later went to Baghdad for higher studies, where he studied and worked for a long time. During this period he acquired mastery over several languages as well as various branches of knowledge and technology. Farabi contributed considerably to science, philosophy, logic, sociology, medicine, mathematics and music, but his major contributions were in philosophy, logic and sociology, for which he stands out as an encyclopedist.

Contributions and Achievements:

As a philosopher, Farabi was the first to separate philosophy from theology. It is difficult to find a philosopher in either the Muslim or the Christian world from the Middle Ages onwards who has not been influenced by his views. He believed in a Supreme Being who had created the world through the exercise of balanced intelligence. He also asserted this same rational faculty to be the sole part of the human being that is immortal, and thus he set the development of that rational faculty as the paramount human goal. He gave considerably more attention to political theory than any other Islamic philosopher.
Later in his work, Al-Farabi laid down in Platonic fashion the qualities necessary for the ruler: he should be inclined to rule by the good qualities of a native character and exhibit the right attitude for such rule. At the heart of Al-Farabi’s political philosophy is the concept of happiness, in which people cooperate to gain contentment. He followed the Greek example, and the highest rank of happiness was allocated to his ideal sovereign, whose soul was ‘united as it were with the Active Intellect’. Farabi thus served as a tremendous source of inspiration for intellectuals of the Middle Ages and made enormous contributions to the knowledge of his day, paving the way for the later philosophers and thinkers of the Muslim world.
Farabian epistemology has both a Neoplatonic and an Aristotelian dimension. The best source for al-Farabi’s classification of knowledge is his Kitab ihsa al-ulum. This work neatly illustrates Al-Farabi’s beliefs, both esoteric and exoteric. Through all of them runs a primary Aristotelian stress on the importance of knowledge. Thus al-Farabi’s epistemology, from what has been described may be said to be encyclopedic in range and complex in articulation, using both a Neoplatonic and an Aristotelian voice.
Farabi also wrote books on early Muslim sociology and a notable book on music titled Kitab al-Musiqa (The Book of Music), which is in reality a study of the theory of Persian music of his day, although in the West it has been introduced as a book on Arab music. He invented several musical instruments, besides contributing to the knowledge of musical notes. It has been reported that he could play his instrument so well as to make people laugh or weep at will. Al-Farabi’s treatise Meanings of the Intellect dealt with music therapy, discussing the therapeutic effects of music on the soul.

Later Life:

Farabi traveled to many distant lands throughout his life and gained a wealth of experience, from which he made the many contributions for which he is still remembered and acknowledged. In spite of facing many hardships, he worked with full dedication and made his name among the notable scientists of history. He died a bachelor in Damascus in 339 A.H./950 A.D. at the age of 80.

Invention of the PC

Invention of the PC: The Computer Age

The earliest electronic computers were not “personal” in any way: They were enormous and hugely expensive, and they required a team of engineers and other specialists to keep them running. One of the first and most famous of these, the Electronic Numerical Integrator and Computer (ENIAC), was built at the University of Pennsylvania to do ballistics calculations for the U.S. military during World War II. ENIAC cost $500,000, weighed 30 tons and took up nearly 2,000 square feet of floor space. On the outside, ENIAC was covered in a tangle of cables, hundreds of blinking lights and nearly 6,000 mechanical switches that its operators used to tell it what to do. On the inside, almost 18,000 vacuum tubes carried electrical signals from one part of the machine to another.

Invention of the PC: Postwar Innovations

ENIAC and other early computers proved to many universities and corporations that the machines were worth the tremendous investment of money, space and manpower they demanded. (For example, ENIAC could solve in 30 seconds a missile-trajectory problem that could take a team of human “computers” 12 hours to complete.) At the same time, new technologies were making it possible to build computers that were smaller and more streamlined. In 1948, Bell Labs introduced the transistor, an electronic device that carried and amplified electrical current but was much smaller than the cumbersome vacuum tube. Ten years later, scientists at Texas Instruments and Fairchild Semiconductor came up with the integrated circuit, an invention that incorporated all of the computer’s electrical parts–transistors, capacitors, resistors and diodes–into a single silicon chip.

But one of the most significant of the inventions that paved the way for the PC revolution was the microprocessor. Before microprocessors were invented, computers needed a separate integrated-circuit chip for each one of their functions. (This was one reason the machines were still so large.) Microprocessors were the size of a thumbnail, and they could do things the integrated-circuit chips could not: They could run the computer’s programs, remember information and manage data all by themselves.

The first microprocessor on the market was developed in 1971 by an engineer at Intel named Ted Hoff. (Intel was located in California’s Santa Clara Valley, a place nicknamed “Silicon Valley” because of all the high-tech companies clustered around the Stanford Industrial Park there.) Intel’s first microprocessor, a 1/16-by-1/8-inch chip called the 4004, had the same computing power as the massive ENIAC.

The Invention of the PC

These innovations made it cheaper and easier to manufacture computers than ever before. As a result, the small, relatively inexpensive “microcomputer”–soon known as the “personal computer”–was born. In 1974, for instance, a company called Micro Instrumentation and Telemetry Systems (MITS) introduced a mail-order build-it-yourself computer kit called the Altair. Compared to earlier microcomputers, the Altair was a huge success: Thousands of people bought the $400 kit. However, it really did not do much. It had no keyboard and no screen, and its output was just a bank of flashing lights. Users input data by flipping toggle switches.

In 1975, MITS hired Bill Gates, then a Harvard student, and his friend Paul G. Allen to adapt the BASIC programming language for the Altair. The software made the computer easier to use, and it was a hit. In April 1975 the two young programmers took the money they made from “Altair BASIC” and formed a company of their own—Microsoft—that soon became an empire. 

The year after Gates and Allen started Microsoft, two engineers in the Homebrew Computer Club in Silicon Valley named Steve Jobs and Stephen Wozniak built a homemade computer that would likewise change the world. This computer, called the Apple I, was more sophisticated than the Altair: It had more memory, a cheaper microprocessor and video output that could drive a screen. In April 1977, Jobs and Wozniak introduced the Apple II, which had a keyboard and a color screen. Also, users could store their data on an external cassette tape. (Apple soon swapped those tapes for floppy disks.) To make the Apple II as useful as possible, the company encouraged programmers to create “applications” for it. For example, a spreadsheet program called VisiCalc made the Apple a practical tool for all kinds of people (and businesses)–not just hobbyists.

The PC Revolution

The PC revolution had begun. Soon companies like Xerox, Tandy, Commodore and IBM had entered the market, and computers became ubiquitous in offices and eventually homes. Innovations like the “Graphical User Interface,” which allows users to select icons on the computer screen instead of writing complicated commands, and the computer mouse made PCs even more convenient and user-friendly. Today, laptops, smartphones and tablet computers allow us to have a PC with us wherever we go.