Research Hub: The role of algorithms, data, and hardware in the AI revolution
This blog post was provided by Cognition BI
Nowadays, AI seems to be omnipresent in every aspect of our lives: spam filtering; recommender systems (for movies, clothes, friends, customers) built on thousands or even millions of features; logistics; financial services; and even assistance in diagnosing and treating diseases and creating vaccines. These incredible advances in AI were initially driven by very complex algorithms requiring enormous hardware resources. However, they would not have been possible without a silent player: data, in huge amounts. Some people call this the data-centric era.
The amazing results we are witnessing today in image processing, game playing, and text understanding require large amounts of data and processing hardware, resources available only to a few players. Moreover, these enormous and expensive models achieve very poor performance in specialized contexts (e.g., NLP in biomedicine). This creates a huge gap between what AI seems to achieve under certain general conditions and what is actually possible in real-life applications. Real-life users cannot obtain these benefits because the typical user/application reality is not well aligned with the data-centric paradigm: limited, distributed, and unstructured data (and often limited hardware).
To fill this gap and spark the next AI revolution, in which every user, business, and industry will benefit from AI's full potential, algorithms optimized for modest amounts of data, and also for specialized hardware, will be a must.
If we are indeed in a data-centric era of AI, this era will be followed, sooner rather than later, by a new “hardware-centric” era. Cloud service providers might still play a crucial role, but “on-premise” hardware will also be mandatory for many applications. In any case, custom algorithmic design and optimization for the hardware best suited to the application at hand will be a cross-cutting need.