What will computers be like in 5 years?

What will computer hardware look like in the next five years? With everything we know today, we can certainly suss some of it out. For all that we poke fun at smart toasters, smart condoms, and smart locks, the IoT is advancing by leaps and bounds.

Even after the iPhone launched, it took years for smartphones to evolve into something like their modern iterations, with app stores and video streaming capabilities.

If data need to travel only a few millimeters from one function on a chip to another on the same chip, the delay times can be reduced to picoseconds (trillionths of a second). Higher-density chips also allow data to be processed 64 bits at a time, as opposed to the eight-, 16- or, at best, 32-bit processing available in Pentium-type personal computers. This approach allows several phases of data processing to happen at once, again increasing the rate of data throughput.
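To put rough numbers on that claim, here is a back-of-the-envelope sketch; the 5 mm path length and the assumption that on-chip signals travel at about half the speed of light are illustrative figures, not values from the text.

    # Back-of-the-envelope estimate of on-chip signal delay.
    # Assumptions: signals propagate at roughly half the speed of light,
    # and the 5 mm path length is purely illustrative.
    SPEED_OF_LIGHT = 3.0e8               # meters per second
    signal_speed = 0.5 * SPEED_OF_LIGHT  # assumed on-chip propagation speed
    path_length = 5e-3                   # 5 millimeters, in meters

    delay = path_length / signal_speed
    print(f"Delay over {path_length * 1e3:.0f} mm: {delay * 1e12:.0f} ps")
    # prints roughly 33 ps -- i.e. tens of picoseconds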

In another, very different approach, manufacturers are working on integrating the entire computer--including all memory, peripheral controls, clocks and controllers--on the same centimeter-square piece of silicon. This new 'superchip' would be a complete computer, lacking only the human interface.

Palm-size computers that are more powerful than our best desktop machines will become commonplace; we can also expect that prices will continue to drop.

A surprising statistic is that some 90 percent of the time, the newest desktop computers run in virtual 8086 mode--that is, they are made to run as if they were ancient eight-bit machines--despite all their fancy high-speed, 32-bit buses and super color graphics capability. This limitation occurs because most commercial software is still written for that older architecture.

Windows NT, Windows 95 and the like are among the few attempts at utilizing PCs as 32-bit, high-performance machines. Fiber optics and light-based systems would make computers more immune to noise, but light travels at essentially the same speed as electromagnetic pulses on a wire. There might be some benefit from capitalizing on phase velocities to increase the speed of data transfer and processing.

Phase velocities can be much greater than the speed of the host carrier wave itself. Utilizing this phenomenon would open up an entirely new technology, one that would employ very different devices and ways of transporting and processing data.
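As a concrete illustration of how a phase velocity can exceed the speed of the wave that hosts it, consider a hollow rectangular waveguide; the 10 GHz cutoff and 12 GHz operating frequency below are assumed values for a hypothetical guide, not figures from the text. The phase velocity comes out above the speed of light, while the group velocity, which is what actually carries information, stays below it.

    # Phase vs. group velocity in a hypothetical rectangular waveguide.
    # Assumed values: 10 GHz cutoff frequency, 12 GHz operating frequency.
    import math

    C = 3.0e8          # speed of light in vacuum, m/s
    f_cutoff = 10e9    # waveguide cutoff frequency, Hz (assumed)
    f = 12e9           # operating frequency, Hz (assumed)

    factor = math.sqrt(1.0 - (f_cutoff / f) ** 2)
    v_phase = C / factor   # exceeds c
    v_group = C * factor   # below c; this is what carries the information

    print(f"phase velocity: {v_phase / C:.2f} c")   # about 1.81 c
    print(f"group velocity: {v_group / C:.2f} c")   # about 0.55 c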

For more than 40 years, electrical engineers and physicists have been working on the technologies of analog and digital optical computing, in which the information is carried primarily by photons rather than by electrons. Optical computing could, in principle, result in much higher computer speeds. Much progress has been achieved, and optical signal processors have been successfully used for applications such as synthetic aperture radars, optical pattern recognition, optical image processing, fingerprint enhancement and optical spectrum analyzers.

In the past two decades, however, a lot of effort has been expended on the development of digital optical processors.

Much work remains before digital optical computers will be widely available commercially, but the pace of research and development has picked up in recent years. One long-standing difficulty has been the limited accuracy of optical computation, and recent research has shown ways around it: digital partitioning algorithms, which break matrix-vector products into lower-accuracy subproducts, working in tandem with error-correction codes, can substantially improve the accuracy of optical computing operations.
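A minimal sketch of the partitioning idea, in pure Python: each 8-bit operand is split into two 4-bit digits, the four low-precision subproducts (standing in here for what the analog optical stage would compute) are formed separately, and the exact result is reassembled digitally by shifting and adding. The splitting base and the sample values are arbitrary choices for illustration.

    # Digital partitioning of a matrix-vector product into
    # low-precision subproducts that are recombined exactly.
    BASE = 16  # split 8-bit values into two 4-bit digits

    def split(v):
        """Return the (high, low) base-16 digits of v."""
        return v // BASE, v % BASE

    def matvec(matrix, vector):
        """Plain matrix-vector product (used for the low-precision stage)."""
        return [sum(a * x for a, x in zip(row, vector)) for row in matrix]

    def partitioned_matvec(matrix, vector):
        m_hi = [[split(a)[0] for a in row] for row in matrix]
        m_lo = [[split(a)[1] for a in row] for row in matrix]
        v_hi = [split(x)[0] for x in vector]
        v_lo = [split(x)[1] for x in vector]

        # Four low-accuracy subproducts (the "optical" stage).
        hh, hl = matvec(m_hi, v_hi), matvec(m_hi, v_lo)
        lh, ll = matvec(m_lo, v_hi), matvec(m_lo, v_lo)

        # Digital recombination: A.x = hh*B^2 + (hl + lh)*B + ll.
        return [hh[i] * BASE ** 2 + (hl[i] + lh[i]) * BASE + ll[i]
                for i in range(len(matrix))]

    A = [[200, 17], [45, 250]]
    x = [99, 130]
    print(partitioned_matvec(A, x))                  # [22010, 36955]
    print(partitioned_matvec(A, x) == matvec(A, x))  # True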

Holographic data storage also offers a lot of promise for high-density optical storage in future optical computers, or for other applications such as archival data storage. The promise of all-optical computing remains highly attractive, and the goal of developing optical computers continues to be a worthy one.

Then there is Moore's law, the famous observation that the number of transistors on a chip doubles roughly every couple of years. Since transistors are the workhorses of a computer, doubling the transistors generally means doubling the computer's processing power.

And it's not just CPUs that are improving at an exponential rate. Every couple of years, storage devices like memory and hard drives get bigger and faster, displays get better, and cameras capture better images. As Wikipedia puts it, Moore's law "describes a driving force of technological and social change, productivity, and economic growth." From Boyle to Newton, the best laws are self-explanatory, come with a catchy phrase and a cool drawing. There is probably some research that goes along with it, but I imagine that can be exhausting.
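Just to make the doubling concrete, here is a quick projection; the ten-billion-transistor starting point and the two-year doubling period are assumptions for illustration, not figures from the text.

    # Project transistor counts under Moore's-law-style doubling.
    # Assumptions: ~10 billion transistors today, doubling every 2 years.
    start_transistors = 10e9
    doubling_period_years = 2

    for years_from_now in range(0, 11, 2):
        doublings = years_from_now / doubling_period_years
        projected = start_transistors * 2 ** doublings
        print(f"in {years_from_now:2d} years: "
              f"~{projected / 1e9:.0f} billion transistors")
    # by year 10 the assumed chip has grown from 10 to ~320 billion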

Based on my personal computer experiences, I propose a new technology-predicting law. Today's computers operate using semiconductors, metals and electricity. Future computers might use atoms, DNA or light. Moore's Law predicts doubling, but when computers go from quartz to quantum, the factor will be off the scale.
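One rough way to see why the quantum factor goes "off the scale": a classical n-bit register holds one of 2^n values at a time, while describing an n-qubit register takes 2^n amplitudes at once, so the state space itself doubles with every added qubit. The qubit counts below are arbitrary, and this is only a counting argument, not a speed claim.

    # Size of the state description for registers of n qubits.
    # A classical n-bit register stores one of 2**n values at a time;
    # an n-qubit register is described by 2**n complex amplitudes.
    for n_qubits in (8, 16, 32, 64):
        amplitudes = 2 ** n_qubits
        print(f"{n_qubits:2d} qubits -> {amplitudes:,} amplitudes in the state description")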

What would the world be like if computers the size of molecules became a reality? These are the types of computers that could be everywhere, yet never be seen: nano-sized bio-computers that could target specific areas inside your body.

Giant networks of computers in your clothing, your house, your car--entrenched in almost every aspect of our lives, and yet you may never give them a single thought.

Ubiquitous computers are in the works. Understanding the theories behind these future computer technologies is not for the meek. My research into quantum computers was made all the more difficult after I learned that, in light of her constant interference, it is theoretically possible my mother-in-law could be in two places at once.

If you have the heart, take a gander at the most promising new computer technologies. If not, dare to imagine the ways that billions of tiny, powerful computers will change our society. I search the internet daily for new articles from around the world that interest me or that I think will interest you. My hope is that this saves you time or helps students with their assignments.

Articles are listed with the most recent first; hit the NEXT button for more. ZDNet posted "Programming languages: Python just took a big jump forward."

From Axios: "OpenAI's Codex turns written language into computer code." ZDNet posted "Supercomputing can help address blockchain's biggest problem. Here's how." And a Lenovo video: "The ultimate at-home entertainment tablet."

What happens when computers can literally do everything? EurekAlert posted "Gold digger: Neural networks at the nexus of data science and electron microscopy."


