September 9, 2012

Development of the Computer

The ideas and inventions of many engineers, mathematicians and scientists led to the development of the computer, which in turn let people work faster and more accurately. Every computer supports some form of input, processing, and output. This is less obvious on a primitive device such as the abacus, where input, output and processing are simply the act of moving the pebbles into new positions, seeing the changed positions, and counting. Regardless, this is what computing is all about, in a nutshell: we input information, the computer processes it according to its basic logic or the program currently running, and it outputs the results.
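To make that input-process-output cycle concrete, here is a minimal sketch in Python (purely an illustration of the idea, not anything from the machines described below): the user supplies two numbers, the program applies its logic, and the result is printed.

```python
# Input: two numbers supplied by the user
a = int(input("First number: "))
b = int(input("Second number: "))

# Processing: apply the machine's logic (here, simple addition)
result = a + b

# Output: report the result back to the user
print("Sum:", result)
```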


The first computer was developed in 1642 and consisted of gears and wheels. The first wheel would count from 1 to 9, the second wheel from 10 to 99, the third wheel from 100 to 999, and so on. The only problem with this first computer was that it could only add and subtract. Its inventor was a French mathematician and scientist by the name of Blaise Pascal. While the abacus may technically have been the first computer, most people today associate the word “computer” with the electronic computers invented in the last century, which have evolved into the modern computers we know today.
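Pascal's machine added numbers much the way a mechanical odometer does: each wheel holds one decimal digit, and when a wheel rolls past 9 it resets to 0 and nudges the next wheel forward by one. The short Python sketch below shows that carry-propagation idea in miniature; the class and method names are purely illustrative, not taken from any historical source.

```python
class DigitWheelCounter:
    """A toy model of Pascal-style digit wheels: one decimal digit per wheel,
    with a carry passed to the next wheel whenever a wheel rolls past 9."""

    def __init__(self, num_wheels=6):
        # wheels[0] is the ones wheel, wheels[1] the tens wheel, and so on
        self.wheels = [0] * num_wheels

    def add(self, amount):
        """Add a non-negative integer by turning the ones wheel one step at a time."""
        for _ in range(amount):
            self._advance(0)

    def _advance(self, position):
        # Turn one wheel forward a single step; past 9 it wraps to 0
        # and pushes a carry into the next wheel, just as the gears did.
        self.wheels[position] += 1
        if self.wheels[position] == 10:
            self.wheels[position] = 0
            if position + 1 < len(self.wheels):
                self._advance(position + 1)

    def value(self):
        return sum(digit * 10 ** i for i, digit in enumerate(self.wheels))


counter = DigitWheelCounter()
counter.add(195)
counter.add(7)
print(counter.value())  # 202 -- the carry ripples from the ones wheel upward
```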


In 1670, the German mathematician Leibniz improved Pascal's invention so that it could multiply and divide as well. Leibniz also found a system of counting other than decimal, called binary, which made the machine easier to use. George Boole, in the 1800s, perfected binary mathematics and could logically work out complex binary calculations in his head, which later helped greatly to move the computer industry forward. The French textile weaver Joseph Jacquard made his contribution to the computer in 1801 with the loom, a machine that used punched cards to weave patterns. Holes were punched in patterns on cards, and the cards were then placed between the rising needle and thread, creating the pattern that had been punched. By changing cards and alternating patterns, Jacquard could create complex woven patterns. Charles Babbage was inspired by these punched cards and, during the 1830s, developed the idea of a mechanical computer. He worked on this idea for 40 years but, unfortunately, he did not have the technology to provide the precision parts needed to build this computer.

Hollerith, an American inventor, built a punched-card machine called a Tabulator in 1888. His machine used electrically charged nails that, when passed through a hole punched in a card, completed a circuit. The circuit would then register on another part of the machine, where it was read and recorded. He founded the Tabulating Machine Company in 1896 and over the next few years continued to improve the machine. He sold his shares in 1911 and the name was changed to the Computing-Tabulating-Recording Company. Then, in 1924, the name was changed to International Business Machines Corporation, or IBM.

An American electrical engineer, Vannevar Bush, started work on a computer that would help scientists do long and complex calculations. He built a differential analyser to solve equations involving quantities such as weight, voltage or speed. These machines became known as analog computers; they are not as accurate as digital computers, and examples include thermometers, thermostats, speedometers and simulators. Scientists saw greater potential in computer electronics. John Atanasoff built the first special-purpose electronic computer in 1939, and the design was improved on in 1944 with switching devices called electromechanical relays.

The first electronic computers used vacuum tubes, and they were huge and complex. In 1946 the ENIAC (Electronic Numerical Integrator And Computer), the first general-purpose electronic computer, was developed. Instead of electromechanical relays it used 18,000 electric valves (vacuum tubes), along with crystal diodes, relays, resistors and capacitors. It weighed more than 27 metric tons, occupied more than 140 square metres of floor space and used 150 kilowatts of power during operation, and it was able to do 5,000 additions and 1,000 multiplications per second. ENIAC was digital, although it did not operate with binary code, and it was reprogrammable to solve a complete range of computing problems: it was programmed using plugboards and switches, took input from an IBM card reader and sent output to an IBM card punch. The only problem was that it took a very long time to program the computer for its calculations, as it could not store the information.

Stored-program techniques were worked on by an American team, which developed the EDVAC (Electronic Discrete Variable Automatic Computer) in 1951. At the same time, two of the team members worked on a more advanced computer that could handle both numbers and letters. This was called the UNIVAC 1 (UNIVersal Automatic Computer), and it was the first computer available for sale to people and businesses.

The second generation of computers came about thanks to the invention of the transistor in 1947, which meant that computers could be faster and more reliable and which began replacing vacuum tubes in computer design. Transistor computers consumed far less power, produced far less heat and were much smaller than the first generation, albeit still big by today's standards. The first transistor computer was created at the University of Manchester in 1953, and the most popular of the transistor computers was the IBM 1401. IBM also created the first disk drive, the IBM 350 RAMAC, in 1956. The first fully transistorized computer was introduced in 1958 by Control Data Corporation, followed by IBM in 1959. Technology advancements in the 1960s saw the creation of the integrated circuit, which contained thousands of transistors and other parts on a silicon chip. This meant that computers could become smaller still.
During the early 1970s, many different kinds of circuits were available, some of which could hold memory as well as computer logic. This resulted in smaller computers becoming available, and the central chip that controlled the computer became known as the microprocessor.
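To make the binary counting system mentioned earlier (the one Leibniz described and Boole formalised) concrete: where Pascal's wheels each held ten positions, a binary digit holds only two, so any count can be written using only 0s and 1s. The small Python sketch below is purely illustrative, not tied to any particular historical machine; it converts a few decimal values into their binary form by repeated division by two.

```python
def to_binary(n):
    """Repeatedly divide by 2, collecting remainders, to express n in base 2."""
    if n == 0:
        return "0"
    bits = []
    while n > 0:
        bits.append(str(n % 2))   # the remainder is the next binary digit
        n //= 2
    return "".join(reversed(bits))


for value in (5, 42, 1946):
    print(value, "->", to_binary(value))
# prints: 5 -> 101, 42 -> 101010, 1946 -> 11110011010
```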

Today, the technology has become so good that it is possible to hold a computer in the palm of your hand. There are many more forms of computers these days; desktops, laptops and tablet PCs are some of them. In earlier days a computer was as big as a room, but now we can carry one in our pockets and bags. All of this is due to the new and far-reaching technologies that have been developed for the sake of mankind.

September 7, 2012

Development of Technology

The use of the term technology has changed significantly over the last 200 years. Before the 20th century, the term was uncommon in English, and usually referred to the description or study of the useful arts. The term was often connected to technical education, as in the Massachusetts Institute of Technology (chartered in 1861). "Technology" rose to prominence in the 20th century in connection with the Second Industrial Revolution. The meanings of technology changed in the early 20th century when American social scientists, beginning with Thorstein Veblen, translated ideas from the German concept of Technik into "technology." In German and other European languages, a distinction exists between Technik and Technologie that is absent in English, as both terms are usually translated as "technology." By the 1930s, "technology" referred not to the study of the industrial arts, but to the industrial arts themselves.[4]

In 1937, the American sociologist Read Bain wrote that "technology includes all tools, machines, utensils, weapons, instruments, housing, clothing, communicating and transporting devices and the skills by which we produce and use them." Bain's definition remains common among scholars today, especially social scientists. But equally prominent is the definition of technology as applied science, especially among scientists and engineers, although most social scientists who study technology reject this definition. More recently, scholars have borrowed from European philosophers of "technique" to extend the meaning of technology to various forms of instrumental reason, as in Foucault's work on technologies of the self ("techniques de soi").

Dictionaries and scholars have offered a variety of definitions. The Merriam-Webster dictionary offers a definition of the term: "the practical application of knowledge especially in a particular area" and "a capability given by the practical application of knowledge". Ursula Franklin, in her 1989 "Real World of Technology" lecture, gave another definition of the concept: it is "practice, the way we do things around here". The term is often used to imply a specific field of technology, or to refer to high technology or just consumer electronics, rather than technology as a whole. Bernard Stiegler, in Technics and Time, 1, defines technology in two ways: as "the pursuit of life by means other than life" and as "organized inorganic matter."

Technology can be most broadly defined as the entities, both material and immaterial, created by the application of mental and physical effort in order to achieve some value. In this usage, technology refers to tools and machines that may be used to solve real-world problems. It is a far-reaching term that may include simple tools, such as a crowbar or wooden spoon, or more complex machines, such as a space station or particle accelerator. Tools and machines need not be material; virtual technology, such as computer software and business methods, falls under this definition of technology.
Technology is developing at such a rapid pace these days that it is amazing to think about life 100 or 200 years ago. This is all due to the emerging technologies that have enabled people to step on the moon. In the history of technology, emerging technologies are contemporary advances and innovations in various fields of technology. Various converging technologies have emerged in the technological convergence of different systems evolving towards similar goals. Convergence can refer to previously separate technologies such as voice (and telephony features), data (and productivity applications) and video that now share resources and interact with each other, creating new efficiencies. Emerging technologies are those technical innovations which represent progressive developments within a field for competitive advantage;[1] converging technologies represent previously distinct fields which are in some way moving towards stronger inter-connection and similar goals. However, opinions on the degree of impact, status and economic viability of several emerging and converging technologies vary.