Wednesday 15 June 2011

Assignment 3 (6 Al-Farabi) 2011


The Development Of The Computer

The ideas and inventions of many engineers, mathematicians and scientists led to the development of the computer.

The first computer was developed in 1642 and consisted of gears and wheels. The first wheel would count from 1 to 9, the second from 10 to 99, the third from 100 to 999, and so on. The only problem with this first computer was that it could only add and subtract. Its inventor was a French mathematician and scientist by the name of Blaise Pascal.
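
The wheel-and-carry idea can be sketched in a few lines of modern code. This is an illustrative model only, not a description of Pascal's actual mechanism:

```python
# Model of a gear-and-wheel adder: each wheel holds one decimal digit,
# and turning a wheel past 9 carries one step into the next wheel.

def add_on_wheels(wheels, amount):
    """Add `amount` to a number stored as digit wheels, least significant first."""
    carry = amount
    for i in range(len(wheels)):
        total = wheels[i] + carry
        wheels[i] = total % 10   # the wheel can only show 0-9
        carry = total // 10      # a full rotation turns the next wheel
    return wheels

# Three wheels can count from 0 to 999, just as described above.
wheels = [0, 0, 0]
add_on_wheels(wheels, 57)
add_on_wheels(wheels, 48)
print(wheels)  # [5, 0, 1], i.e. the number 105
```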

In 1670, the German mathematician Leibniz improved Pascal's invention so that it could multiply and divide as well. Leibniz also developed a system of counting other than decimal, called binary, which made the machine easier to use.
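
Binary counting writes every number using only the digits 0 and 1. A short sketch of converting a decimal number to binary by repeated division by two:

```python
# Convert a decimal number to its binary representation.

def to_binary(n):
    if n == 0:
        return "0"
    bits = []
    while n > 0:
        bits.append(str(n % 2))  # remainder is the next binary digit
        n //= 2
    return "".join(reversed(bits))

print(to_binary(13))  # "1101" = 8 + 4 + 0 + 1
```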

In the 1800s, George Boole perfected binary mathematics, now known as Boolean algebra, and could logically work out complex binary calculations in his head, which later helped greatly to move the computer industry forward.
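
Boolean algebra combines the values 0 and 1 with just three operations: AND, OR and NOT. This is an illustrative sketch in modern code, not Boole's own notation:

```python
# The three basic Boolean operations on the values 0 and 1.
def AND(a, b): return a & b
def OR(a, b):  return a | b
def NOT(a):    return 1 - a

# Every other logic operation can be built from these three; for example,
# exclusive-or (XOR) is true when exactly one input is true:
def XOR(a, b):
    return OR(AND(a, NOT(b)), AND(NOT(a), b))

for a in (0, 1):
    for b in (0, 1):
        print(a, "XOR", b, "=", XOR(a, b))
```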

The French textile weaver Joseph Jacquard made his contribution to the computer in 1801 with his loom, a machine that used punched cards to weave patterns. Holes were punched in patterns on cards, which were then placed between the rising needles and threads, so that the punched pattern was woven into the fabric. By changing cards and alternating patterns, Jacquard could create complex woven designs.
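
A punched card can be modelled as a row of holes: a hole (1) lets a needle pass and raises a thread, while no hole (0) blocks it. A hypothetical sketch of "weaving" a pattern from a stack of cards:

```python
# Each card is a row of hole positions; 1 = hole (thread raised), 0 = no hole.

def weave_row(card):
    """Render one card as a row of the woven pattern."""
    return "".join("#" if hole else "." for hole in card)

cards = [
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [1, 1, 0, 0],
]

for card in cards:
    print(weave_row(card))
# Prints:
# #.#.
# .#.#
# ##..
```

Swapping in a different stack of cards produces a different pattern, which is exactly how Jacquard varied his designs.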

Charles Babbage was inspired by these punched cards and, during the 1830s, developed the idea of a mechanical computer. He worked on this idea for 40 years but, unfortunately, the technology of his day could not produce the precision parts needed to build such a computer.

Hollerith, an American inventor, invented a punched-card machine called the Tabulator in 1888. His machine used electrically charged pins that, when passed through a hole punched in a card, completed a circuit. The circuit would then advance a counter on another part of the machine, where the result was read and recorded. He founded the Tabulating Machine Company in 1896.
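
The tabulating idea can be sketched as follows (a simplified, illustrative model; the category names are invented): each card carries a set of punched positions, and every hole that closes a circuit advances one counter.

```python
# Simplified model of Hollerith-style tabulation: every hole on a card
# closes one circuit, which advances the counter for that category.
from collections import Counter

cards = [
    {"male", "age_20_29"},
    {"female", "age_30_39"},
    {"male", "age_30_39"},
]

counters = Counter()
for card in cards:
    for hole in card:        # a pin passes through each hole...
        counters[hole] += 1  # ...and advances the matching counter

print(counters["male"])       # 2
print(counters["age_30_39"])  # 2
```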

Over the next few years, Hollerith continued to improve the machine. He sold his shares in 1911 and the company's name was changed to the Computing-Tabulating-Recording Company. Then, in 1924, the name was changed to International Business Machines Corporation, or IBM.

An American electrical engineer, Vannevar Bush, started work on a computer that would help scientists do long and complex calculations. He built a differential analyser to solve equations involving physical quantities such as weight, voltage or speed. Computers of this kind became known as analog computers. Analog computers are not as accurate as digital computers. Examples of analog devices are thermometers, thermostats, speedometers and simulators.

Scientists saw greater potential in computer electronics. John Atanasoff built the first special-purpose electronic digital computer in 1939. This design was improved on in 1944 with machines that used switching devices called electromechanical relays. In 1946, the ENIAC (Electronic Numerical Integrator And Computer) was developed. Instead of electromechanical relays, it used 18,000 vacuum tubes (valves). This computer weighed more than 27 metric tons, occupied more than 140 square metres of floor space and used 150 kilowatts of power during operation. It was able to do 5,000 additions and 1,000 multiplications per second. The only problem was that it took a very long time to set the computer up for each calculation, as it could not store a program.

Stored-program techniques were developed by an American team, which produced the EDVAC (Electronic Discrete Variable Automatic Computer) in 1951. At the same time, two of the team members worked on a more advanced computer that could handle both numbers and letters. This was the UNIVAC I (UNIVersal Automatic Computer), the first computer available for sale to individuals and businesses.

The invention of the transistor in 1947 meant that computers could be made faster and more reliable. The first fully transistorized computer was introduced in 1958 by Control Data Corporation, followed by IBM in 1959.

Technology advancements in the 1960s saw the creation of the integrated circuit, which contained thousands of transistors and other parts on a silicon chip. This meant that computers could become smaller. During the early 1970s, many different kinds of circuits were available, some of which could even hold memory as well as computer logic. This resulted in smaller computers becoming available, and the central chip that controlled the computer became known as the microprocessor.

Today, the technology has become so good that it is possible to hold a computer in the palm of your hand.

Telecommunication and Networking
A telecommunications network is a collection of terminals, links and nodes which connect together to enable telecommunication between users of the terminals. Networks may use circuit switching or message switching. Each terminal in the network must have a unique address so messages or connections can be routed to the correct recipients. The collection of addresses in the network is called the address space. 
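
The idea of routing by unique address can be sketched as a simple lookup: the network delivers a message by finding the destination address in its address space. The addresses and terminal names below are invented for illustration:

```python
# Toy model of address-based routing: the address space maps each
# unique address to the terminal that owns it.

address_space = {
    "10.0.0.1": "terminal-A",
    "10.0.0.2": "terminal-B",
}

def route(message, destination):
    """Deliver a message to the terminal that owns the destination address."""
    if destination not in address_space:
        raise ValueError("unknown address: " + destination)
    return (address_space[destination], message)

print(route("hello", "10.0.0.2"))  # ('terminal-B', 'hello')
```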



 
  
Stand-Alone Computer And Computer Network 

Stand-Alone Computer
A desktop or laptop computer that is used on its own without requiring a connection to a local area network (LAN) or wide area network (WAN). Although it may be connected to a network, it is still a stand-alone PC as long as the network connection is not mandatory for its general use.

In offices throughout the 1990s, millions of stand-alone PCs were hooked up to the local network for file sharing and mainframe access. Today, computers are commonly networked in the home so that family members can share an Internet connection as well as printers, scanners and other peripherals. When the computer is running local applications without Internet access, the machine is technically a stand-alone PC.



   

Computer Network

A computer network, often simply referred to as a network, is a collection of computers and devices interconnected by communications channels that facilitate communication and allow sharing of resources and information among the interconnected devices. Computer networking, or data communications (datacom), is the engineering discipline concerned with computer networks. Computer networking is sometimes considered a sub-discipline of electrical engineering, telecommunications, computer science, information technology and/or computer engineering, since it relies heavily upon the theoretical and practical application of these scientific and engineering disciplines.
    
Wired And Wireless Communication 

Wired Connection
A wired network connection is made by plugging a network cable into a jack, which looks much like a phone jack except for its size. Wired networks allow fast and easy access to the Internet: modern wired connections are switched and full duplex, commonly running at about 100 Mbps, and are much faster than dial-up connections; a DSL connection can be many times faster than ordinary dial-up. The drawback is that wired networks require an extensive cable infrastructure, built from copper, coaxial and sometimes fibre-optic cable.

Wireless Connection
A wireless network is any type of computer network that is not connected by cables of any kind; the connections are made using radio waves instead. It is a method by which telecommunications networks and business installations avoid the costly process of introducing cables into a building, or as a connection between various equipment locations. This implementation takes place at the physical level (layer) of the network structure.

Description of Dell, Apple Computers, Microsoft and Intel


Dell
The world's largest mail-order computer vendor. Founded by Michael Dell in 1984, Dell Computer has built a reputation for delivering quality PCs at competitive prices. 

Apple Computers 
A personal computer company founded in 1976 by Steven Jobs and Steve Wozniak. Throughout the history of personal computing, Apple has been one of the most innovative influences. In fact, some analysts say that the entire evolution of the PC can be viewed as an effort to catch up with the Apple Macintosh.
In addition to inventing new technologies, Apple also has often been the first to bring sophisticated technologies to the personal computer. Apple's innovations include:

  • Graphical user interface (GUI). First introduced in 1983 on its Lisa computer. Many components of the Macintosh GUI have become de facto standards and can be found in other operating systems, such as Microsoft Windows.

  • Color. The Apple II, introduced in 1977, was the first personal computer to offer color monitors.

  • Built-in networking. In 1985, Apple released a new version of the Macintosh with built-in support for networking (LocalTalk).

  • Plug & play expansion. In 1987, the Mac II introduced a new expansion bus called NuBus that made it possible to add devices and configure them entirely with software.

  • QuickTime. In 1991, Apple introduced QuickTime, a multi-platform standard for video, sound, and other multimedia applications.

  • Integrated television. In 1993, Apple released the Macintosh TV, the first personal computer with built-in television and stereo CD.

  • RISC. In 1994, Apple introduced the Power Mac, based on the PowerPC RISC microprocessor. 


Microsoft

    (Microsoft Corporation, Redmond, WA, www.microsoft.com) The most successful software company in the industry. Microsoft's software and Intel's hardware pioneered the PC and revolutionized the computer industry. Founded in 1975 by Bill Gates and Paul Allen, its Windows operating systems are the de facto standards on the desktop and major contenders in the server arena. Microsoft Office is the most successful application suite in history. The company also has a thriving business in programming languages, which are its roots, as well as in numerous other software categories.

    Gates and Allen were two college students when they wrote the first BASIC interpreter for the Intel 8080 microprocessor. MBASIC was licensed to Micro Instrumentation and Telemetry Systems to accompany its Altair 8800 kit. By the end of 1976, more than 10,000 Altairs had been sold, and versions were licensed to Radio Shack, Apple and others. Although the company became a leader in microcomputer programming languages, its outstanding success came from supplying DOS for the IBM PC in 1981 and MS-DOS for non-IBM PCs. In 1990, Windows 3.0, its third version of Windows, was enormously popular. Later, Windows 95 and Windows NT cemented Microsoft's leadership.

    After the explosion of the Web, Microsoft worked feverishly to gain a foothold. By giving away its Internet Explorer browser and then integrating it into Windows 98, Internet Explorer became the leading on-ramp to the Internet. The Microsoft Network (MSN) ISP division is also a growing part of the company, and although many do not think of it as such, Microsoft is a very large hardware company. Its 2003 revenues for mice, keyboards, Xboxes and other devices exceeded five billion dollars. See Microsoft trial, Microsoft and IBM, Windows, DOS, Microsoft Office, Internet Explorer, Microsoftie and Altair.


    William H. Gates, III
    Bill Gates has become the most widely known business entrepreneur in the world, regardless of industry. (Image courtesy of Microsoft Corporation.)







    The Microsoft Campus


    Intel 

    The world's largest manufacturer of computer chips. Although it has been challenged in recent years by newcomers AMD and Cyrix, Intel still dominates the market for PC microprocessors. Nearly all PCs are based on Intel's x86 architecture.

    Intel was founded in 1968 by Bob Noyce and Gordon Moore. Strategically, it is closely allied with Microsoft: the Windows 3.x and 95 operating systems are designed for x86 microprocessors. The popularity of Windows creates a demand for Intel or Intel-compatible microprocessors. Many people refer to this alliance as Wintel (short for Windows-Intel). 

    WHO ARE THESE PEOPLE???

    TIM BERNERS-LEE
    Sir Timothy John "Tim" Berners-Lee, OM, KBE, FRS, FREng, FRSA (born 8 June 1955), also known as "TimBL", is a British physicist, computer scientist and MIT professor, credited with the invention of the World Wide Web (not the Internet), having made the first proposal for it in March 1989. On 25 December 1990, with the help of Robert Cailliau and a young student at CERN, he implemented the first successful communication between a Hypertext Transfer Protocol (HTTP) client and server via the Internet.
    Berners-Lee is the director of the World Wide Web Consortium (W3C), which oversees the Web's continued development. He is also the founder of the World Wide Web Foundation, and is a senior researcher and holder of the 3Com Founders Chair at the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL). He is a director of The Web Science Research Initiative (WSRI), and a member of the advisory board of the MIT Center for Collective Intelligence. In 2004, Berners-Lee was knighted by Queen Elizabeth II for his pioneering work. In April 2009, he was elected as a member of the United States National Academy of Sciences, based in Washington, D.C.

    MARK DEAN

    Dr. Mark Dean
    As a child, Mark Dean excelled in math. In elementary school, he took advanced-level math courses and, in high school, Dean even built his own computer, radio, and amplifier. Dean continued his interests and went on to obtain a bachelor's degree in electrical engineering from the University of Tennessee, a master's degree in electrical engineering from Florida Atlantic University and a Ph.D. in electrical engineering from Stanford. He is one of the most prominent black inventors in the field of computers.
    Dr. Mark Dean started working at IBM in 1980 and was instrumental in the invention of the Personal Computer (PC). He holds three of IBM's original nine PC patents and currently holds more than 20 patents in total. The famous African-American inventor never thought the work he was doing would end up being so useful to the world, but he has helped IBM make instrumental changes in areas ranging from the research and application of systems technology circuits to operating environments. One of his most recent contributions came while leading the team that produced the 1-Gigahertz chip, which contains one million transistors and has nearly limitless potential.

    DOUGLAS ENGELBART 
    Douglas Engelbart has always been ahead of his time, having ideas that seemed far-fetched at the time but later were taken for granted. For instance, as far back as the 1960s he was touting the use of computers for online conferencing and collaboration. Engelbart's most famous invention is the computer mouse, also developed in the 1960s, but not used commercially until the 1980s. Like Vannevar Bush and J.C.R. Licklider, Engelbart wanted to use technology to augment human intellect. He saw technology, especially computers, as the answer to the problem of dealing with the ever more complex modern world, and has dedicated his life to the pursuit of developing technology to augment human intellect.
     
    LINUS TORVALDS 

    Linus Torvalds, AKA Linus Benedict Torvalds
    Born: 28-Dec-1969
    Birthplace: Helsinki, Finland
    Gender: Male
    Religion: Atheist
    Race or Ethnicity: White
    Sexual orientation: Straight
    Occupation: Computer Programmer
    Nationality: Finland
    Executive summary: Created the kernel for the GNU/Linux OS
    By the time he was 10, Linus Torvalds was programming his grandfather's Commodore VIC-20. At 21, he wrote the first version of the Linux operating system.
    Torvalds earned his master's degree in computer science at the University of Helsinki, where the computers ran UNIX, an operating system designed by Bell Labs. UNIX was common on huge computers with many users, but it was bulky, expensive, and impractical for personal computers. Torvalds had a PC that came with Microsoft's crappy and crash-prone operating system, MS-DOS, and of course, he hated it. He installed Minix, a PC-compatible mini-mimic of UNIX, but he wanted something more flexible and user-friendly, so in 1991 Torvalds spent several months writing a compact operating system for his PC. He almost called it Freax, but decided on Linux.
    He posted an announcement to the Minix group on USENET, and made the Linux source code available to other nerds free of charge. Programmers everywhere started adding their own improvements, and eventually companies like Red Hat, Corel, Caldera, and TurboLinux began selling their own versions of Linux.
    The open-source nature of Linux is its greatest strength. Instead of having paid programmers devising improvements and looking for bugs from 9-to-5 with tight deadlines and budgets and memos from bosses, Linux is perpetually being tinkered with by the most obsessed and enthusiastic high-tech hobbyists and experts. Some would say it's the difference between manufacturing and art. As a result, Linux rarely crashes -- no blue screen of death -- and ongoing improvements have made it easy to install, even if you're not an expert. It's more stable, reliable, and secure than Windows, and its users are largely immune to the gazillion worms and viruses designed to exploit Microsoft's myriad holes and bugs.
    The kernel written by Torvalds comprises about 2% of the current Linux, but he still makes the ultimate decisions about which modifications are added and which aren't. As more and more Linux applications have become available, Linux surpassed Macintosh in 2003 to become the second most popular desktop operating system: Microsoft 94%, Linux 3%, Macintosh a bit less, followed by the fractional "other".
    Torvalds's grandfather was a noted Finnish poet, Ole Torvalds. His father was a radical and a card-carrying member of the Communist Party in the '60s and is now a reporter for Finnish radio and TV. His parents divorced when Linus was young, and he was raised by his mother and grandparents. The Torvalds primarily speak Swedish, which makes them part of a small minority in Finland. He now lives near San Jose, California.
    "I've tried to stay out of the Microsoft debate," he says. "If you start doing things because you hate others and want to screw them over the end result is bad."
    Father: Nils Torvalds (reporter)
    Mother: Anna Torvalds (translator)
    Wife: Tove Torvalds (kindergarten teacher, karate champion, three daughters)
    Daughter: Patricia Miranda Torvalds (b. 5-Dec-1996)
    Daughter: Daniela Yolanda Torvalds (b. 16-Apr-1998)
    Daughter: Celeste Amanda Torvalds (b. 20-Nov-2000)
        University: MS Computer Science, University of Helsinki
        Employer: Transmeta (1997-2003)
        Member: World Technology Network
        Award: EFF Pioneer Award (1998)
        Animal bite: fairy penguin, Australia (c. 1993)
        Naturalized US Citizen: 2010
        FILMOGRAPHY AS ACTOR
        Revolution OS (15-Feb-2002), as himself
    Official Website:
    http://www.cs.helsinki.fi/u/torvalds/
    Author of books:
    Just for Fun: The Story of an Accidental Revolutionary
    (2001, memoir, with David Diamond)

     HOPE MY BLOG HERE WOULD BENEFIT THOSE IT ENTHUSIASTS OUT THERE! THIS IS MY FIRST BLOG, SO FEEL FREE TO COMMENT IF THERE ARE ANY MISTAKES!
     
     
