“Posits”, a new form of numbers that will revolutionize AI

"Posits", a new form of numbers that will revolutionize AI

Artificial intelligence is part of our daily lives, even if we don't always realize it. Did you know, for example, that AI systems are hidden behind automatic translation tools and Google's email service, Gmail?

If how they work is obscure to the layman, it is partly because they require immense computing power. Training GPT-3, OpenAI's most advanced language model, required on the order of a million billion billion operations and cost around 5 million dollars.

However, according to IEEE Spectrum, there is a way to reduce these costs: a different way of representing numbers, called "posits".

We owe this invention to engineers John Gustafson and Isaac Yonemoto, who conceived of posits as an alternative to traditional floating-point arithmetic. The idea is to find a new way to encode real numbers.

Since then, a research team at the Complutense University of Madrid has tested this standard in a brand-new CPU core, and the results are very encouraging: the accuracy of a basic calculation task is said to have quadrupled.

A possible revolution in mathematics

To understand the technological progress that posits represent, keep in mind that real numbers can never all be encoded exactly, because there are infinitely many of them.

In the classical system, most real numbers must be rounded to fit into a fixed number of bits, a bit being the smallest unit of information in a computer. With posits, however, the same number of bits can represent numbers more accurately than floating point does.
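To make that rounding concrete, here is a minimal Python snippet (not from the article) showing that the decimal value 0.1 has no exact binary representation, so a standard 64-bit float silently stores the nearest representable number instead:

```python
from decimal import Decimal

# 0.1 cannot be written as a finite sum of powers of two, so the
# float literal 0.1 actually stores the nearest representable value.
print(Decimal(0.1))
# 0.1000000000000000055511151231257827021181583404541015625
```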

Posits also concentrate their accuracy in the range of numbers that calculations use most. "It's a better match for the natural distribution of numbers in a calculation," Gustafson explains. "It's the right precision, right where you need it. Floating point has many bit patterns that nobody ever uses; it's a waste."
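To illustrate the mechanism, here is a minimal, hypothetical sketch of posit decoding in Python; it is a teaching aid, not the researchers' implementation. It assumes an 8-bit posit with zero exponent bits (es = 0) for readability, whereas the 2022 Posit Standard uses wider words and es = 2. The run of identical "regime" bits after the sign sets the coarse scale, and whatever bits remain form the fraction, which is why precision tapers off for very large and very small magnitudes:

```python
def decode_posit(bits: int, n: int = 8, es: int = 0) -> float:
    """Decode an n-bit posit with es exponent bits into a Python float."""
    mask = (1 << n) - 1
    bits &= mask
    if bits == 0:
        return 0.0
    if bits == 1 << (n - 1):           # 1000...0 encodes NaR ("not a real")
        return float("nan")

    sign = bits >> (n - 1)
    if sign:                           # negative posits are two's-complement
        bits = (-bits) & mask

    # Regime: a run of identical bits after the sign. A run of m ones
    # means k = m - 1; a run of m zeros means k = -m. Longer runs reach
    # larger scales but leave fewer bits for the fraction.
    body = (bits << 1) & mask          # drop the sign bit
    first = body >> (n - 1)
    run = 0
    while run < n - 1 and ((body >> (n - 1 - run)) & 1) == first:
        run += 1
    k = run - 1 if first else -run
    body = (body << (run + 1)) & mask  # drop regime bits and terminator

    # es exponent bits (none here when es = 0), then the fraction
    # with an implicit leading 1, as in floating point.
    exp = body >> (n - es) if es else 0
    body = (body << es) & mask
    fraction = 1 + body / (1 << n)

    value = 2.0 ** (k * (1 << es) + exp) * fraction
    return -value if sign else value


for pattern in (0b01000000, 0b01010000, 0b01111111, 0b00000001):
    print(f"{pattern:08b} -> {decode_posit(pattern)}")
# 01000000 -> 1.0
# 01010000 -> 1.5
# 01111111 -> 64.0
# 00000001 -> 0.015625
```

Note how the pattern 01111111 spends all its bits on the regime: it reaches the format's maximum of 64, but with no fraction bits left, neighboring values are far apart, while patterns near 1.0 keep five fraction bits of precision.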

In their tests, the Complutense University team compared calculations made using 32-bit floats and 32-bit posits. They concluded that the improvement in accuracy did not come at the expense of computation time, only at the cost of increased chip area and power consumption.

Despite the undeniable gains in numerical accuracy, it remains to be seen whether the training of large AIs will actually benefit from this new standard. "It's possible that posits speed up training because you lose less information along the way, but we don't know yet," explains David Mallasén Quintana, a researcher at the Complutense University of Madrid. "People have tried them in software; now we want to try them in hardware."
