By Dr. S. S. VERMA; Department of Physics, S.L.I.E.T., Longowal; Distt.-Sangrur (Punjab)-148 106
Computer era
Advances in computing and computing systems are driving the transformation of entire systems of production, management, and governance. We still picture computers as rectangular objects on a desk or, these days, in our pockets; but computers are in our cars, our thermostats, and our refrigerators. Today, computers are in virtually everything we touch, all day long. Computing is now moving into the fabric of our clothing and even into our bodies. We are refining prosthetics that not only help people reach for something but, in reaching, send a message back to the brain. Computers are also going to become very tiny, approaching the size of an atom, and that will make a huge difference: nano-computing promises computers small enough to swallow in a pill, which could then learn about our illness and set about curing it. That draws in biological computing as well, where we can print parts of the body. So while we may be approaching the limits of what silicon chips can do, technology itself is still accelerating, and it is unlikely to stop being the driving force in modern life. If anything, its influence will only increase as new computing technologies push robotics, artificial intelligence, virtual reality, nanotechnology, and other world-shaking advances past today's accepted limits. In short, exponential growth in computing may not be able to go on forever, but its end is still much farther in the future. Future computing will be a convergence of the biological, the physical, and the digital (and the post-digital quantum). Taking 3D printing, biotechnology, robotics for prosthetics, the Internet of Things, autonomous vehicles, and other kinds of artificial intelligence as examples, we can see the extent to which life will change. We need to make sure that these developments benefit all of society.
Emerging computing technologies
In fact, the industry keeps finding new ways to pack more power onto tinier chips. Unfortunately, it hasn't found ways to cut costs on the same exponential curve. As Fast Company reported in February 2016, the worldwide semiconductor industry is no longer basing its R&D plans for silicon chips on the notion of doubling their power every two years, because it simply cannot afford to keep purchasing the incredibly complex manufacturing tools and processes that pace demands. Besides, current manufacturing technology may not be able to shrink silicon transistors much more than it already has. And in any event, transistors have become so tiny that they may no longer reliably follow the usual laws of physics, which raises questions about how much longer we'll dare to use them in medical devices or nuclear plants.
But that does not mean that the era of exponential tech-driven change is about to come to a screeching halt. Even if silicon chips are approaching their physical and economic limits, there are other ways to continue the exponential growth of computing performance, from new materials for chips to new ways to define computing itself. We are already seeing technological advances that have nothing to do with transistor speed, like smarter software driven by deep learning and the ability to achieve greater computing power by leveraging cloud resources. And that's only the tiniest hint of what's coming next. Here are a few of the emerging technologies that promise to keep computing performance rocketing ahead:
In-memory computing: Throughout computing history, the slowest part of processing has been getting the data from the hard disks where it’s stored to random access memory (RAM), where it can be used. A lot of processor power is wasted simply waiting for data to arrive. By contrast, in-memory computing puts massive amounts of data into RAM where it can be processed immediately. Combined with new database, analytics, and systems designs, it can dramatically improve both performance and overall costs.
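The idea can be sketched in a few lines of Python; the table contents and record format below are invented purely for illustration. The data is read from "storage" once, kept in RAM, and every later query is answered from memory instead of going back to disk:

```python
# Toy sketch of in-memory computing: load the dataset from "storage" once,
# keep it in RAM, and answer every later lookup without further I/O.
# The table contents and record format are invented for illustration.
import csv
import io

RAW = "id,total\n1,10\n2,35\n3,7\n"  # stands in for a table on disk

def load_into_memory(raw: str) -> dict:
    # One pass over storage; afterwards every lookup is a RAM hit.
    reader = csv.DictReader(io.StringIO(raw))
    return {int(row["id"]): int(row["total"]) for row in reader}

table = load_into_memory(RAW)
print(table[2])  # 35, served from RAM with no further reads
```

A real in-memory platform adds columnar layouts, compression, and persistence on top of this basic move, but the performance win comes from the same shift: processors stop waiting for disks.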
Graphene-based microchips: Graphene, a sheet of carbon just one atom thick and more conductive than any other known material, can be rolled up into tiny tubes or combined with other materials to move electrons faster, in less space, than even the smallest silicon transistor.
Quantum computing: Even the most sophisticated conventional computer can only assign a one or a zero to each bit. Quantum computing, by contrast, uses quantum bits, or qubits, which can exist in a superposition: partly zero and partly one at the same time, until they are measured. Theoretically, a quantum computer will be able to solve highly complex problems, like analyzing genetic data or testing aircraft systems, millions of times faster than is currently possible. Google researchers announced in 2015 that they had developed a new way for qubits to detect and protect against errors, but that's as close as we've come so far.
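The difference from a classical bit can be shown with a toy state-vector simulation in Python (a bare mathematical sketch, not tied to any real quantum hardware or library): a qubit's state is a pair of complex amplitudes, a Hadamard gate puts a zero state into an equal superposition, and the squared magnitudes of the amplitudes give the measurement probabilities.

```python
# Toy state-vector simulation of a single qubit (illustration only).
import math

def hadamard(state):
    # The Hadamard gate turns |0> into an equal superposition of 0 and 1.
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def probabilities(state):
    # Measurement probabilities are the squared magnitudes of the amplitudes.
    a, b = state
    return (abs(a) ** 2, abs(b) ** 2)

zero = (1 + 0j, 0 + 0j)        # a classical-style |0> state
superposed = hadamard(zero)    # now partly 0 and partly 1 at once
p0, p1 = probabilities(superposed)
print(round(p0, 2), round(p1, 2))  # 0.5 0.5
```

The power of a real quantum computer comes from entangling many such qubits, so that n qubits track 2^n amplitudes at once, which is exactly what conventional machines cannot do efficiently.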
Molecular electronics: Researchers have used nanotechnology to build a “biocomputer” that can perform parallel calculations by moving multiple protein filaments simultaneously along nanoscopic artificial pathways. This biocomputer is faster than conventional electrical computers that operate sequentially, approximately 99 percent more energy-efficient, and cheaper than both conventional and quantum computers to produce and use. It’s also more likely to be commercialized soon than quantum computing is.
DNA data storage: A little bit of DNA stores a whole lot of information. In fact, researchers speculate that about a teaspoon of DNA could hold all the data humans have generated to date, from the first cave drawings to yesterday's Facebook status updates. By converting data to base 4, we can encode it on synthetic DNA. It currently takes a lot of time and money, but gene editing may be the future of big data. It is reported that Microsoft is investigating the use of synthetic DNA for secure long-term data storage and has been able to encode and recover 100 percent of its initial test data.
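The base-4 idea can be sketched as follows. The two-bits-per-nucleotide mapping here is chosen purely for illustration; real schemes, including the Microsoft work mentioned above, add error correction and avoid sequences that are hard to synthesise or read back.

```python
# Illustrative base-4 encoding for DNA storage: each pair of bits maps to
# one of the four nucleotides (A, C, G, T). Mapping chosen arbitrarily.
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(data: bytes) -> str:
    # Flatten the bytes to a bit string, then map every 2 bits to a base.
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> bytes:
    # Reverse the mapping and regroup the bits into bytes.
    bits = "".join(BASE_TO_BITS[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

strand = encode(b"Hi")
print(strand)          # CAGACGGC
print(decode(strand))  # b'Hi'
```

Because each nucleotide carries two bits, a strand stores data four times as densely per symbol as binary, which is one reason DNA's physical information density is so extreme.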
Neuromorphic computing: The goal of neuromorphic technology is to create a computer that works like the human brain, able to process and learn from data as quickly as the data is generated. A neuromorphic chip, for example, consists of 1,024 neurons, each one a 256-byte SRAM-based memory combined with 3,000 logic gates, all interconnected and working in parallel. So far, chips that train and execute neural networks for deep learning have been developed, and that's a step in the right direction.
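As a loose illustration of the neuron-like units described above (a textbook leaky integrate-and-fire model, not the actual chip's circuitry), each unit keeps a small local state and emits a spike when its accumulated input crosses a threshold:

```python
# Leaky integrate-and-fire neuron: potential decays each step, input is
# accumulated, and crossing the threshold produces a spike and a reset.
# All constants here are arbitrary, chosen only to show the behaviour.
def step(potential, spike_input, weight=0.5, leak=0.9, threshold=1.0):
    potential = potential * leak + weight * spike_input
    fired = potential >= threshold
    return (0.0 if fired else potential), fired

p = 0.0
fired_at = []
for t in range(6):             # drive the neuron with a constant input
    p, fired = step(p, spike_input=1)
    if fired:
        fired_at.append(t)
print(fired_at)                # [2, 5]
```

A neuromorphic chip wires thousands of such stateful units together so they all update in parallel, rather than funnelling every operation through one central processor.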
Passive Wi-Fi: A team of computer scientists and electrical engineers has developed a way to generate Wi-Fi transmissions that use 10,000 times less power than the current battery-draining standard. While this isn't technically an increase in computing power, it is an exponential increase in connectivity, which will enable other types of advances. Passive Wi-Fi, one of the breakthrough technologies, will not only save battery life but also enable a minimal-power Internet of Things, allowing previously power-hungry devices to connect via Wi-Fi for the first time and potentially creating entirely new modes of communication.