December 19, 2012

Has NASA predicted the blackout?

A rumor has been circulating for a long time that there will be a blackout from 22nd to 25th December 2012, and that the Earth will undergo a new alignment and shift to another dimension. It was claimed that NASA had predicted this blackout, but NASA itself has since declared the rumor false. NASA has predicted no such thing. The whole story is given below.

NASA predicts total blackout on 23-25 Dec 2012 during alignment of Universe.

US scientists predict Universe change, total blackout of planet for 3 days from Dec 22nd 2012.

It is not the end of the world; it is an alignment of the Universe, in which the Sun and the Earth will align for the first time. The Earth will shift from the current third dimension to the zero dimension, then shift to the fourth dimension. During this transition, the entire Universe will undergo a great change, and we will see an entirely new world.

The 3-day blackout is predicted to happen on 23, 24 and 25 December. During this time, staying calm is most important: hug each other, pray, and sleep for the 3 nights. Those who survive will face a brand new world; many of those who are unprepared will die of fear. Be happy and enjoy every moment now. Don't worry; pray to God every day. There is a lot of talk about what will happen in 2012, but many people don't believe it and don't want to talk about it for fear of creating panic.

We don't know what will happen, so we must be prepared for it. It is therefore worth listening to NASA's preparation video.


But NASA says that the rumor is false. NASA has not predicted any of these things; the blackout, the alignment and the dimensional shift have no scientific basis. The linked NASA page addresses these false rumors.
You can view the original content of the video here.





October 25, 2012

Top ten mobile companies

According to the market analyst firm Gartner, the current top ten mobile phone companies are listed below:

1. Samsung
Samsung Group is a South Korean multinational conglomerate company headquartered in Samsung Town, Seoul. It comprises numerous subsidiaries and affiliated businesses, most of them united under the Samsung brand, and is the largest South Korean chaebol.
Today, Samsung is well known for its mobile phone technology. It produces various models of mobile phones, including Android-based handsets, Samsung tablets, and other notable products.

2. Nokia
3. Apple
4. ZTE
5. LG Electronics
6. Huawei
7. TCL Communications
8. HTC
9. Motorola
10. Research In Motion

October 12, 2012

Mobile in Past and Present

The first mobile telephone call was made on 17 June 1946 from a car in St. Louis, Missouri, USA, using the Bell System's Mobile Telephone Service. This was followed in 1956 by the world’s first partly automatic car phone system, Mobile System A (MTA) in Sweden. The MTA phones were composed of vacuum tubes and relays, and had a weight of 88.2 pounds (40 kg).

John F. Mitchell, Motorola's chief of portable communication products and Martin Cooper's boss in 1973, played a key role in advancing the development of handheld mobile telephone equipment. Mitchell successfully pushed Motorola to develop wireless communication products that would be small enough to use anywhere and participated in the design of the cellular phone. Martin Cooper, a Motorola researcher and executive, was the key researcher on Mitchell's team that developed the first hand-held mobile telephone for use on a cellular network. Using a somewhat heavy portable handset, Cooper made the first call on a handheld mobile phone on April 3, 1973 to his rival, Dr. Joel S. Engel of Bell Labs.

The new invention sold for $3,995 and weighed two pounds, leading to a nickname "the brick". The world's first commercial automated cellular network was launched in Japan by NTT in 1979, initially in the metropolitan area of Tokyo. In 1981, this was followed by the simultaneous launch of the Nordic Mobile Telephone (NMT) system in Denmark, Finland, Norway and Sweden. Several countries then followed in the early-to-mid 1980s including the UK, Mexico and Canada.

On 6 March 1983, the DynaTAC mobile phone launched on the first US 1G network, operated by Ameritech. It cost $100m to develop and took over a decade to reach the market. The phone had a talk time of just half an hour and took ten hours to charge. Despite the battery life, weight, and low talk time, consumer demand was strong, and waiting lists ran into the thousands.

In 1991, the second generation (2G) cellular technology was launched in Finland by Radiolinja on the GSM standard, which sparked competition in the sector as the new operators challenged the incumbent 1G network operators. Ten years later, in 2001, the third generation (3G) was launched in Japan by NTT DoCoMo on the WCDMA standard. This was followed by 3.5G, 3G+ or turbo 3G enhancements based on the high-speed packet access (HSPA) family, allowing UMTS networks to have higher data transfer speeds and capacity.

By 2009, it had become clear that, at some point, 3G networks would be overwhelmed by the growth of bandwidth-intensive applications like streaming media. Consequently, the industry began looking to data-optimized 4th-generation technologies, with the promise of speed improvements up to 10-fold over existing 3G technologies. The first two commercially available technologies billed as 4G were the WiMAX standard (offered in the U.S. by Sprint) and the LTE standard, first offered in Scandinavia by TeliaSonera.

There are several reasons why mobiles have gained such tremendous popularity worldwide. Imagine the days when there were no mobiles and the world talked on landlines. If you lived through those days, you will be aware of the problems people faced: crackling voices, wrong numbers, and being tethered to wherever the phone happened to be.

Then mobiles came along, and everything changed drastically. All of a sudden, crackling voices, wrong numbers and the inability to talk while on the move became things of the past. Mobiles made phone conversation smooth and a pleasure, and because they are easy to carry, talking on the move became possible.

It was not only calls that mobile phones offered; there was a whole range of other benefits to be reaped. For the first time, people saw a phone that was also a music player. It became possible to listen to music on a phone, play games, perform calculations, send and receive SMS, e-mails, video and still photographs, and download pictures, videos, music and ringtones.

Mobiles come in various colours and sleek designs, and there is a huge variety to select from. It should therefore come as no surprise that mobiles enjoy a mammoth following across the world.

October 10, 2012

Mobile Technology

Mobile technology is the technology used for cellular communication. Mobile code division multiple access (CDMA) technology has evolved rapidly over the past few years. Since the start of this millennium, a standard mobile device has gone from being no more than a simple two-way pager to being a mobile phone, GPS navigation device, embedded web browser, instant messaging client, and handheld game console. Many experts argue that the future of computer technology rests in mobile computing with wireless networking. Mobile computing by way of tablet computers is becoming more popular. The most popular tablet at the moment is the iPad, by Apple. Tablets are available on the 3G and 4G networks.

4G networking
One of the most important features of 4G mobile networks is the domination of high-speed packet transmissions, or burst traffic, in the channels. If the same codes used in 2G-3G networks are applied to future 4G mobile or wireless networks, the detection of very short bursts will be a serious problem due to their very poor partial correlation properties. Recent studies have indicated that the traditional multi-layer network architecture based on the Open Systems Interconnection (OSI) model may not be well suited to 4G mobile networks, where transactions of short packets will make up the major part of the channel traffic. Because packets from different mobiles carry completely different channel characteristics, the receiver must execute all necessary algorithms, such as channel estimation and interactions with all upper layers, within a very short time in order to detect each packet reliably and reduce congestion.
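The per-packet channel estimation mentioned above can be illustrated with a toy example. The sketch below is not any specific 4G algorithm; it assumes a simple flat-fading, single-tap channel and shows a least-squares estimate of the channel gain from known pilot symbols, the kind of step a receiver must repeat for every short packet:

```python
import numpy as np

def estimate_channel(pilots_tx, pilots_rx):
    """Least-squares estimate of a flat-fading channel gain h.
    Model: pilots_rx = h * pilots_tx + noise; the LS solution is
    h = (x^H y) / (x^H x), computed here with np.vdot."""
    return np.vdot(pilots_tx, pilots_rx) / np.vdot(pilots_tx, pilots_tx)

rng = np.random.default_rng(0)
h_true = 0.8 - 0.3j                          # "unknown" channel gain
pilots = np.array([1, -1, 1, 1, -1, 1, -1, -1], dtype=complex)
noise = 0.01 * (rng.standard_normal(8) + 1j * rng.standard_normal(8))
received = h_true * pilots + noise           # what the receiver observes

h_est = estimate_channel(pilots, received)
print(round(abs(h_est - h_true), 3))         # estimation error is small
```

In a real burst-traffic network this estimate must be recomputed from scratch for each packet, which is why very short bursts leave so little time for the receiver's algorithms.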

Operating systems
Many mobile operating systems (OSs) are available for smartphones, including Android, BlackBerry OS, webOS, iOS, Symbian, Windows Mobile Professional (touch screen), Windows Mobile Standard (non-touch screen), and Bada. Among the most popular devices are the Apple iPhone and, most recently, Android phones. Android is a mobile OS developed by Google and is the first completely open-source mobile OS, meaning it is free for any cell phone carrier to use. The Apple iPhone, which has shipped in several models such as the 3G and 3GS, is the most popular smartphone at this time, thanks to an OS through which users can download applications ("apps") such as games, GPS tools and utilities. Any user can also create their own apps and publish them to Apple's App Store. The Palm Pre, which runs webOS, has Internet functionality and supports web technologies such as Cascading Style Sheets (CSS), HTML, and JavaScript. The Research In Motion (RIM) BlackBerry is a smartphone with a multimedia player and third-party software installation. Windows Mobile Professional smartphones (Pocket PC or Windows Mobile PDA) are like a personal digital assistant (PDA) and have touchscreen abilities, while Windows Mobile Standard devices have no touch screen and instead use a trackball, touchpad, rockers, etc.

The original smartphone OS is Symbian, which has a rich history and held the largest market share until 2011. Although no single Symbian device has sold as many units as the iPhone, Nokia and other manufacturers (currently including Sony Ericsson and Samsung, and previously Motorola) release a wide variety of Symbian models each year, which gave Symbian the greatest market share.

Channel hogging and file sharing
File sharing will take a hit. A typical web surfer wants to load a new page every minute or so, and at 100 kbit/s a page loads quickly; but because of changes to wireless network policies, users will be unable to make huge file transfers, since service providers want to reduce channel use. AT&T stated that it would ban any users caught running peer-to-peer (P2P) file-sharing applications on its 3G network, and it later became apparent that this would also keep users from using programs such as iTunes; such users would be forced to find a Wi-Fi hotspot to download files. The limits of wireless networking will not be cured by 4G, as there are too many fundamental differences between wireless networking and other means of Internet access. If wireless vendors do not acknowledge these differences and bandwidth limits, future wireless customers will be disappointed and the market may suffer a setback.

Future of smartphone
The next generation of smartphones will be context-aware, taking advantage of the growing availability of embedded physical sensors and data-exchange abilities. A key feature is that phones will start keeping track of your personal data and adapt to anticipate the information you will need based on your intentions. All-new applications will come out with the new phones, one of which is an "X-ray" app that reveals information about any location at which you point your phone. Companies are also developing software to take advantage of more accurate location-sensing data; the goal has been described as making the phone a virtual mouse that can click on the real world. For example, pointing the phone's camera at a building with the live feed open could overlay text about the building and save its location for future use.
Along with the future of the smartphone comes the future of another device. OmniTouch is a device with which applications can be viewed and used on your hand, arm, wall, desk, or any other everyday surface. It uses a touch-sensing interface that lets the user access all functions by finger touch. Developed at Carnegie Mellon University, the device uses a projector and camera worn on the person's shoulder, with no controls other than the user's fingers.

September 9, 2012

Development of Computer

The ideas and inventions of many engineers, mathematicians and scientists led to the development of the computer, which in turn made people's work both faster and more accurate. Every computer supports some form of input, processing, and output. This is less obvious on a primitive device such as the abacus, where input, output and processing are simply the acts of moving the pebbles into new positions, seeing the changed positions, and counting. Regardless, this is what computing is all about, in a nutshell: we input information, the computer processes it according to its basic logic or the program currently running, and it outputs the results.
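That input-process-output cycle can be shown in a few lines. A minimal sketch follows; the squaring "program" is just a placeholder chosen for illustration:

```python
# A computer, in a nutshell: take input, process it, output the result.

def process(value):
    """The processing step: the 'program' currently running.
    Here it simply squares a number."""
    return value * value

user_input = "7"                    # input (e.g. typed by a user)
result = process(int(user_input))   # processing
print(result)                       # output -> 49
```

Every device discussed in this post, from the abacus to the smartphone, follows this same three-stage pattern; only the scale and speed differ.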


The first mechanical calculating machine, often called the first computer, was developed in 1642 and consisted of gears and wheels. The first wheel would count from 1 to 9, the second from 10 to 99, the third from 100 to 999, and so on. The only problem with this machine was that it could only add and subtract. Its inventor was the French mathematician and scientist Blaise Pascal. While the abacus may technically have been the first computer, most people today associate the word "computer" with the electronic computers invented in the last century, which have evolved into the modern computers we know today.
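The carrying behaviour of those counting wheels can be sketched in code. The model below is illustrative only: the real Pascaline's carry was mechanical gearing, and the function name here is ours, not a historical term.

```python
def add_on_wheels(wheels, amount):
    """Simulate Pascal-style counting wheels, least significant first.
    Each wheel holds a digit 0-9; when a wheel completes a full turn
    it resets to 0 and nudges the next wheel forward (the carry)."""
    for _ in range(amount):
        i = 0
        while i < len(wheels):
            wheels[i] += 1
            if wheels[i] < 10:
                break               # no overflow: this increment is done
            wheels[i] = 0           # wheel completed a full turn...
            i += 1                  # ...so carry into the next wheel
    return wheels

counter = [0, 0, 0]                 # three wheels: units, tens, hundreds
add_on_wheels(counter, 275)
print(counter)                      # [5, 7, 2] -> reads as 275
```

Reading the wheels from most significant to least significant gives the total, exactly as on the dials of Pascal's machine.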


In 1670, the German mathematician Leibniz improved Pascal's invention so that it could multiply and divide as well. Leibniz also explored a system of counting other than decimal, called binary, which would later make machines easier to build. George Boole, in the 1800s, perfected binary mathematics and could logically work out complex binary calculations, which greatly helped move the computer industry forward.

The French textile weaver Joseph Jacquard made his contribution to the computer in 1801 with his loom, a machine that used punched cards to weave patterns. Holes were punched in patterns on cards, which were then placed between the rising needles and thread, creating the punched pattern in the fabric. By changing cards and alternating patterns, Jacquard could create complex woven designs. Charles Babbage was inspired by these punched cards, and during the 1830s he developed the idea of a mechanical computer. He worked on this idea for 40 years, but unfortunately the technology of his day could not provide the precision parts needed to build it.

Herman Hollerith, an American inventor, built a punched-card machine called the Tabulator in 1888. His machine used electrically charged nails that, when passed through a hole punched in a card, completed a circuit; each circuit was then registered, read and recorded by another part of the machine. He founded the Tabulating Machine Company in 1896 and continued to improve the machine over the following years. He sold his shares in 1911, and the company was renamed the Computing-Tabulating-Recording Company; in 1924 the name was changed again, to International Business Machines Corporation, or IBM.

An American electrical engineer, Vannevar Bush, then started work on a computer that would help scientists do long and complex calculations. He built a differential analyser to solve equations involving quantities such as weight, voltage or speed. These machines became known as analog computers.
Analog computers are not as accurate as digital computers; examples include thermometers, thermostats, speedometers and simulators. Scientists saw greater potential in electronic computing. John Atanasoff built the first special-purpose electronic computer in 1939, and it was improved in 1944 using switching devices called electromechanical relays.

In 1946, the ENIAC (Electronic Numerical Integrator And Computer) was developed, the first general-purpose electronic computer. Instead of electromechanical relays it used 18,000 electric valves (vacuum tubes), along with crystal diodes, relays, resistors and capacitors. It weighed more than 27 metric tons, occupied more than 140 square metres of floor space, and used 150 kilowatts of power during operation. It could perform 5,000 additions and 1,000 multiplications per second. It was digital, although it did not operate with binary code, and it was reprogrammable to solve a complete range of computing problems; it was programmed using plugboards and switches, supported input from an IBM card reader, and produced output on an IBM card punch. Its main drawback was that programming it took a very long time, as it could not store its programs.

Stored-program techniques were developed by an American team that built the EDVAC (Electronic Discrete Variable Automatic Computer) in 1951. At the same time, two of the team members worked on a more advanced computer that could handle both numbers and letters. This was the UNIVAC I (UNIVersal Automatic Computer), the first computer available for sale to people and businesses.

The second generation of computers came about thanks to the invention of the transistor, which began replacing vacuum tubes in computer design. Transistor computers consumed far less power, produced far less heat, and were much smaller than the first generation, although still big by today's standards. The first transistor computer was created at the University of Manchester in 1953, and the most popular transistor computer was the IBM 1401. IBM also created the first disk drive in 1956, the IBM 350 RAMAC.
The invention of the transistor in 1947 meant that computers could become faster and more reliable. The first fully transistorized computer was introduced by Control Data Corporation in 1958, followed by IBM in 1959. Technology advances in the 1960s brought the integrated circuit, which packed thousands of transistors and other parts onto a silicon chip, allowing computers to become smaller still.
During the early 1970s, many different kinds of circuits became available, some of which could hold memory as well as computer logic. This resulted in smaller computers, and the central chip that controlled the computer became known as the microprocessor.








Today, technology has advanced so far that it is possible to hold a computer in the palm of your hand. There are many forms of computers these days: desktops, laptops and tablet PCs are some of them. Computers that once filled a room can now be kept in our pockets and bags, thanks to the new and far-reaching technologies developed for the benefit of mankind.

September 7, 2012

Development of Technology

The use of the term technology has changed significantly over the last 200 years. Before the 20th century, the term was uncommon in English, and usually referred to the description or study of the useful arts. The term was often connected to technical education, as in the Massachusetts Institute of Technology (chartered in 1861). "Technology" rose to prominence in the 20th century in connection with the Second Industrial Revolution. The meanings of technology changed in the early 20th century when American social scientists, beginning with Thorstein Veblen, translated ideas from the German concept of Technik into "technology." In German and other European languages, a distinction exists between Technik and Technologie that is absent in English, as both terms are usually translated as "technology." By the 1930s, "technology" referred not to the study of the industrial arts, but to the industrial arts themselves.[4] In 1937, the American sociologist Read Bain wrote that "technology includes all tools, machines, utensils, weapons, instruments, housing, clothing, communicating and transporting devices and the skills by which we produce and use them." Bain's definition remains common among scholars today, especially social scientists. But equally prominent is the definition of technology as applied science, especially among scientists and engineers, although most social scientists who study technology reject this definition. More recently, scholars have borrowed from European philosophers of "technique" to extend the meaning of technology to various forms of instrumental reason, as in Foucault's work on technologies of the self ("techniques de soi"). Dictionaries and scholars have offered a variety of definitions. The Merriam-Webster dictionary offers a definition of the term: "the practical application of knowledge especially in a particular area" and "a capability given by the practical application of knowledge".
Ursula Franklin, in her 1989 "Real World of Technology" lecture, gave another definition of the concept; it is "practice, the way we do things around here". The term is often used to imply a specific field of technology, or to refer to high technology or just consumer electronics, rather than technology as a whole. Bernard Stiegler, in Technics and Time, 1, defines technology in two ways: as "the pursuit of life by means other than life", and as "organized inorganic matter." Technology can be most broadly defined as the entities, both material and immaterial, created by the application of mental and physical effort in order to achieve some value. In this usage, technology refers to tools and machines that may be used to solve real-world problems. It is a far-reaching term that may include simple tools, such as a crowbar or wooden spoon, or more complex machines, such as a space station or particle accelerator. Tools and machines need not be material; virtual technology, such as computer software and business methods, falls under this definition of technology.
Technology is developing so rapidly these days that it is amazing to think about life 100 or 200 years ago. This is all due to emerging technologies, which have enabled people to step on the Moon. In the history of technology, emerging technologies are contemporary advances and innovations in various fields of technology. Various converging technologies have emerged in the technological convergence of different systems evolving towards similar goals. Convergence can refer to previously separate technologies such as voice (and telephony features), data (and productivity applications) and video that now share resources and interact with each other, creating new efficiencies. Emerging technologies are those technical innovations which represent progressive developments within a field for competitive advantage;[1] converging technologies represent previously distinct fields which are in some way moving towards stronger inter-connection and similar goals. However, opinions on the degree of impact, status and economic viability of several emerging and converging technologies vary.

March 19, 2012

AGE OF WI-FI


Wi-Fi, sometimes spelled Wifi or WiFi, is a popular technology that allows an electronic device to exchange data wirelessly (using radio waves) over a computer network, including high-speed Internet connections. The Wi-Fi Alliance defines Wi-Fi as any "wireless local area network (WLAN) products that are based on the Institute of Electrical and Electronics Engineers' (IEEE) 802.11 standards". However, since most modern WLANs are based on these standards, the term "Wi-Fi" is used in general English as a synonym for "WLAN".

A device using Wi-Fi, such as a personal computer, video game console, smartphone, tablet, or digital audio player, can connect to a network resource such as the Internet via a wireless network access point. Such an access point (or hotspot) has a range of about 20 meters (65 ft) indoors and a greater range outdoors. Hotspot coverage can comprise an area as small as a single room with walls that block radio signals or a large area up to many square miles, which can be covered by multiple overlapping access points.

"Wi-Fi" is a trademark of the Wi-Fi Alliance and the brand name for products using the IEEE 802.11 family of standards. Only Wi-Fi products that complete Wi-Fi Alliance interoperability certification testing successfully may use the "Wi-Fi CERTIFIED" designation and trademark.

Wi-Fi has had a checkered security history. Its earliest encryption system, WEP, proved easy to break. Much higher-quality protocols, WPA and WPA2, were added later. However, an optional feature added in 2007, called Wi-Fi Protected Setup (WPS), has a flaw that allows a remote attacker to recover the router's WPA or WPA2 password within a few hours on most implementations. Some manufacturers have recommended turning off the WPS feature. The Wi-Fi Alliance has since updated its test plan and certification program to ensure all newly certified devices resist brute-force AP PIN attacks.
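Why is the WPS flaw so serious? A back-of-the-envelope count of the attacker's search space, based on the widely reported analysis of the WPS PIN exchange, makes it clear: the protocol confirms each half of the 8-digit PIN separately, and the last digit is a checksum of the first seven.

```python
# Naive view: an 8-digit PIN should give 100,000,000 possibilities.
naive_space = 10 ** 8

# Flawed protocol: the router reveals whether each half of the PIN
# is correct independently, and the 8th digit is only a checksum.
first_half = 10 ** 4       # 4 free digits to guess
second_half = 10 ** 3      # 3 free digits (plus the checksum digit)

flawed_space = first_half + second_half
print(flawed_space)                  # 11000 attempts at most

print(naive_space // flawed_space)   # roughly four orders of magnitude easier
```

At most 11,000 attempts instead of 100 million is why an attack that "should" take years completes in hours, and why disabling WPS (or requiring brute-force lockouts, as the updated certification program does) is the recommended defence.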