New EPFL Algorithm Developed and Found in Deep Learning Software | Datafloq

PyTorch, an open-source machine learning library, is used primarily for applications in natural language processing (NLP) and computer vision.

Beyond that, the library is a major player in artificial intelligence (AI) and deep learning, and it positions itself as a research-first library. It has powered systems such as Tesla's Autopilot and Facebook's translation software. The latest release of Facebook's translation software includes EPFL's communication-efficient training algorithm, an algorithm that may also help the planet.

Deep learning is part of the broader family of machine learning methods. Built on artificial neural networks, it powers applications ranging from drug discovery and toxicology to voice and image recognition and financial fraud detection.

As machine learning applications have grown wider and larger, neural networks have become increasingly complex, with some models containing nearly a trillion connections. To train these models faster, researchers distribute the training workload across many graphics processing units (GPUs) and computers, much as humans divide a large task among collaborators. This distribution, however, introduces communication overhead.

What is the reason behind communication overhead?

During intensive training of neural networks, the communication required for the participating computers to converge on an accurate model can amount to many petabytes. Researchers have therefore been working to compress the bandwidth needed while training is in progress.
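To see how those petabytes add up, here is a back-of-the-envelope sketch; the model size, step count, and worker count below are illustrative assumptions, not figures from the article:

```python
# Illustrative numbers (assumptions, not from the article):
params = 1_000_000_000           # a 1-billion-parameter model
bytes_per_grad = 4 * params      # fp32 gradients: 4 GB sent per step per worker
steps = 100_000                  # training steps
workers = 16                     # machines exchanging gradients

# Total gradient traffic over the whole training run
total = bytes_per_grad * steps * workers
print(total / 1e15, "petabytes")  # 6.4 petabytes
```

Even these modest assumptions yield several petabytes of gradient traffic, which is why compressing the communicated gradients pays off.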

The development of the EPFL algorithm

Created by EPFL Ph.D. students, PowerSGD repeatedly multiplies each gradient matrix by a vector to capture its main directions; the name derives from this power-iteration method. Alongside this, the EPFL researchers adjusted the neural network model so that the communication needed during training could be reduced.
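The core idea can be sketched in a few lines of NumPy. This is a minimal rank-1 illustration under stated assumptions: the function name and shapes are invented for the example, and the real PowerSGD additionally uses error feedback and warm-started vectors, which are omitted here.

```python
import numpy as np

def compress_rank1(grad, q, num_iters=1):
    """Approximate a gradient matrix as an outer product p @ q.T
    via power iteration; only p and q need to be communicated."""
    for _ in range(num_iters):
        p = grad @ q                    # capture the dominant direction
        p /= np.linalg.norm(p) + 1e-12  # normalize to keep iteration stable
        q = grad.T @ p                  # project back to refine q
    return p, q

rng = np.random.default_rng(0)
# A nearly rank-1 "gradient" matrix plus small noise
g = np.outer(rng.standard_normal(64), rng.standard_normal(32))
g += 0.01 * rng.standard_normal(g.shape)

p, q = compress_rank1(g, rng.standard_normal((32, 1)), num_iters=2)
approx = p @ q.T

sent = p.size + q.size   # 96 floats communicated
full = g.size            # 2048 floats for the raw gradient
err = np.linalg.norm(g - approx) / np.linalg.norm(g)
```

For a gradient with strong low-rank structure, the approximation error stays small while the number of values communicated drops by more than an order of magnitude.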

When applied to deep learning models such as the transformer models used for text or image recognition, the algorithm saved 99 percent of the communication while retaining the model's accuracy.

As machine learning models continue to grow, developing new training algorithms that scale well and reduce energy consumption will only become more important. Beyond PyTorch, the researchers were also pleased to learn that their algorithm was used in OpenAI's DALL-E, which can generate creative images from text.




PyTorch 1.8 and PowerSGD

An open-source machine learning library, PyTorch is used in nearly 80 percent of academic publications involving deep learning. The newest version, 1.8, is now available and, for the first time, includes the EPFL-developed PowerSGD.

As a result, a much more communication-efficient training scheme is now readily available to users and researchers. The communication compression can be activated with a simple software switch.
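In PyTorch 1.8 that switch takes the form of a DistributedDataParallel communication hook. A rough sketch of enabling it follows, assuming a distributed process group has already been initialized; the model and hyperparameter values here are placeholders:

```python
import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.distributed.algorithms.ddp_comm_hooks import powerSGD_hook as powerSGD

# Assumes torch.distributed.init_process_group(...) has already been called.
model = DDP(nn.Linear(128, 10))

state = powerSGD.PowerSGDState(
    process_group=None,            # use the default process group
    matrix_approximation_rank=1,   # rank of the low-rank gradient approximation
    start_powerSGD_iter=10,        # warm up with uncompressed all-reduce first
)
# Gradients are now compressed with PowerSGD before being all-reduced.
model.register_comm_hook(state, powerSGD.powerSGD_hook)
```

Training then proceeds exactly as before; only the gradient communication changes.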

In addition to these benefits, the algorithm communicates less and therefore consumes less energy, a saving that can contribute to the fight against climate change.

Offers decentralized learning

The team that developed PowerSGD has also been working on a principle that extends it to decentralized training. As a result, agents can train a deep learning model together without requiring a central server, which also helps prevent data from leaking.
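The article does not detail the decentralized scheme, but the general idea behind serverless training is gossip averaging: each agent repeatedly mixes its parameters with those of its neighbors, so no central server ever aggregates everything. A toy NumPy sketch (topology and names are illustrative assumptions):

```python
import numpy as np

def gossip_step(params, neighbors):
    """One gossip round: each agent replaces its parameter vector with
    the average of its own and its neighbors' vectors."""
    return {
        a: np.mean([vec] + [params[b] for b in neighbors[a]], axis=0)
        for a, vec in params.items()
    }

# Four agents in a ring, each starting from different local parameters
params = {i: np.full(3, float(i)) for i in range(4)}
ring = {i: [(i - 1) % 4, (i + 1) % 4] for i in range(4)}

for _ in range(50):
    params = gossip_step(params, ring)

# After repeated rounds, every agent converges to the global mean (1.5)
# without any agent ever seeing all the others' data at once.
```

Because each round only touches an agent and its neighbors, the raw local data never has to leave its owner, which is what makes the approach attractive for privacy-sensitive settings.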

Such a technique could be ideal for privacy-sensitive applications, such as personal mobile devices or medical use cases.

Conclusion

With these features, PyTorch remains one of the most popular deep learning frameworks in AI.
