The computer boom continues: scientists have introduced new ways to improve semiconductors and recreate thought processes
How do you make a device smaller and more powerful, and teach it to think like a human?
In recent years, the computer boom has reached unprecedented proportions. Our technological needs continue to grow, and the scientific community is actively working to improve computational methods. Two research papers by a team led by Professor Jean Anne Incorvia of the Cockrell School of Engineering are now drawing particular attention. The research proposes ways to improve semiconductors and to develop next-generation computers that operate on the principles of the human brain.
“We are on the cusp of a new class of computers, and recreating the thought processes of our brains is becoming a huge scientific challenge,” Incorvia says. “At the same time, the computing methods that exist today are not going away, so it is important to keep innovating the devices that power the technologies we rely on.”
One of the studies, published in ACS Nano, is devoted to modernizing transistor design. The researchers found a way to connect so-called logic gates, the key elements that process digital signals inside chips. The novelty of the approach lies in the gates’ ability to conduct both electrons and holes, the positively charged vacancies left behind when electrons move within the material.
“Speaking of the future of computing, if we can take advantage of the natural behavior of these two-dimensional materials and scale them up, we can cut the number of transistors we need in our circuits by half,” Incorvia said.
The innovation could increase the efficiency and power of computers, since more transistors could fit in the same area. The freed-up space could also allow the devices themselves to shrink.
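As a rough illustration of the scaling claim (not the authors’ circuit analysis): in conventional CMOS logic, every gate needs complementary pairs of transistors, one kind conducting electrons (n-type) and one conducting holes (p-type). A device that conducts both carriers could, in principle, do both jobs. The back-of-the-envelope sketch below assumes one dual-carrier device replaces each n/p pair:

```python
# Illustrative arithmetic behind the "half the transistors" claim.
# The CMOS counts are the standard static-CMOS figures; the "ambipolar"
# column simply assumes one dual-carrier device per complementary pair
# (an assumption for illustration, not the paper's actual design).
cmos_transistors = {"NOT": 2, "NAND": 4, "NOR": 4}

for gate, count in cmos_transistors.items():
    ambipolar = count // 2
    print(f"{gate}: CMOS {count} transistors -> ambipolar {ambipolar}")
```

Summed over the billions of gates on a modern chip, halving the per-gate count is what frees the area the article describes.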
Another paper, published in Applied Physics Letters, is dedicated to a new generation of computers capable of “thinking”. The researchers created artificial neurons based on magnetic materials. These respond to electrical impulses probabilistically, which helps when processing noisy data.
Noisy data is information that has been distorted or “contaminated” by random changes.
“Because the device itself reacts unpredictably to input data, it can handle noisy information better,” Incorvia explained.
The artificial neurons outperformed conventional neural networks at image interpretation, especially when processing blurry images.
The new technologies could find application in edge computing, as well as in the space industry, where radiation tolerance and robust handling of noisy data are required.