At first, it may not be easy to grasp what it means to share an AI model. How large are AI models in real-world applications? The largest published AI model to date is a language model called PaLM (Pathways Language Model), announced by Google in April 2022, with 540 billion parameters.
The EXAM project presented in the previous article (AI-Sharing without Sharing Data (2)) is based on a deep learning model for image recognition called ResNet-34, with 63.5 million parameters. Each of the 20 hospitals trains an AI model of at least 63.5 million parameters, and a weighted average of those models becomes the global shared model. The scale of these models is hard to grasp intuitively, and that is where the power of AI-Sharing lies: the combination of all 20 large models performs better than any single one of them.
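The weighted-average step described above can be sketched in a few lines. This is only a toy illustration of the idea: each "model" below is a small parameter vector standing in for the 63.5 million ResNet-34 parameters, and the hospital count, data sizes, and values are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
n_hospitals = 20
n_params = 5  # stand-in for ~63.5 million real parameters

# Each hospital trains locally and reports its parameters and its data size.
local_params = [rng.normal(size=n_params) for _ in range(n_hospitals)]
data_sizes = rng.integers(100, 1000, size=n_hospitals)

# Global shared model = average of the local models,
# weighted by how much data each hospital contributed.
weights = data_sizes / data_sizes.sum()
global_params = sum(w * p for w, p in zip(weights, local_params))
```

No raw patient data moves between hospitals in this scheme; only the parameter vectors are exchanged and averaged.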
AI-Sharing algorithms will also continue to improve. Harex InfoTech’s User-Centric Artificial Intelligence Research Lab recently developed two AI-sharing algorithms, IPA (Iterative Parallel Average) and ISA (Iterative Serial Average), and filed patent applications for them. The IPA method outperformed existing federated learning algorithms and performs well even in business environments where Korean and English data are mixed.
The results of this study will be presented at the 23rd International Conference on Electronic Commerce (ICEC 2022), held June 22–23 at the Interburgo Daegu Hotel. To use an analogy from the study: suppose we want an artificial intelligence with the intelligence of a person who has worked for 20 years in total, five years each at department stores in Korea, Japan, the United States, and China. The existing federated learning algorithm is like combining the intelligence of four people who each worked at one of those department stores for five years. The IPA algorithm, by contrast, is like combining the intelligence of four people who each worked at their department store for one year, then repeating that process five times to build up global intelligence. The lab also recently tested the ISA algorithm, which builds global intelligence resembling that of a single person working continuously for 20 years across the department stores in Korea, Japan, the United States, and China, and which can yield even better results.
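The contrast between the two analogies can be illustrated with a toy loop. The patented algorithms themselves are not public, so the code below only shows the iteration patterns suggested by the names: a parallel train-then-average round (IPA-style) versus a serial hand-off where the model visits one site after another (ISA-style). The scalar "models", the four-site setup, and the learning rate are all assumptions for illustration.

```python
def local_train(model, site_target, lr=0.5):
    """Toy local training step: nudge the model toward the site's own optimum."""
    return model + lr * (site_target - model)

# Each site's local optimum, standing in for four countries' department stores.
sites = [1.0, 2.0, 3.0, 4.0]

# IPA-style pattern: every round, all sites train in parallel from the same
# global model, then the results are averaged ("one year each, five times").
global_ipa = 0.0
for _ in range(5):
    updates = [local_train(global_ipa, t) for t in sites]
    global_ipa = sum(updates) / len(updates)

# ISA-style pattern: the model is handed from site to site in sequence,
# like one person working at each department store in turn for 20 years.
global_isa = 0.0
for _ in range(5):
    for t in sites:
        global_isa = local_train(global_isa, t)
```

In the parallel pattern every site always starts each round from the same shared model; in the serial pattern each site continues from where the previous site left off.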
Kyung Hee University’s AI&BM Lab and Harex InfoTech’s User-Centered Artificial Intelligence Research Lab also found that a glocalization strategy works better for AI sharing. Rather than sharing each economic entity’s entire AI model, it is better to let the parameters of the input and output parts of the model reflect each entity’s specific characteristics, and to share only the common internal base model.
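A minimal sketch of this glocalization idea follows, assuming a model represented as a dictionary of weight arrays. The layer names and shapes are invented for illustration, not the labs' actual architecture: only the internal "body" is averaged across entities, while each entity's input and output heads stay local.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_model():
    return {
        "input_head": rng.normal(size=(4, 8)),   # local: adapts to the entity's own data
        "body": rng.normal(size=(8, 8)),         # shared: the common base intelligence
        "output_head": rng.normal(size=(8, 2)),  # local: entity-specific outputs
    }

SHARED_KEYS = {"body"}
models = [make_model() for _ in range(3)]  # e.g. three economic entities

# Average only the shared internal layers; the local heads are left untouched.
for key in SHARED_KEYS:
    avg = sum(m[key] for m in models) / len(models)
    for m in models:
        m[key] = avg.copy()
```

After this step all entities hold the same base model but keep their own specialized input and output layers, which is what lets the shared part stay language- and domain-neutral.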
With the development of natural language processing deep learning models represented by the Transformer, combined with federated learning methods and newly developed AI-sharing algorithms, it becomes possible to create synergy by sharing the basic intelligence of economic entities around the world, regardless of their language or individual characteristics.
As the potential of such an AI-Sharing platform grows, the major AI engines in each domain, such as health, commerce, transportation, finance, smart farms, manufacturing, robotics, and smart cities, should be built using AI-Sharing methods. This approach has the great advantage of enabling cooperation without economic entities having to sign data agreements with one another.
AI-Sharing will not only be about sharing artificial intelligence: a new type of platform, providing various services based on AI engines developed and maintained in an AI-Sharing way, will emerge in the private and government sectors as well as in global business environments. The AI-Sharing platform will enable many individuals, companies, and institutions to collaborate in ways that were previously impossible, and AI-Sharing, though it will start small, will gradually grow, generating a great positive change in the socio-economic system.
Professor Kyoung Jun Lee, School of Business and Big Data Analytics at Kyung Hee University.
Professor Kyoung Jun Lee has worked on various university-industry cooperation projects with Samsung Electronics, LG Electronics, Naver, BC Card, SKT, KT, Shinhan Investment, Busan Bank, Hyundai Motors, and others, and he is currently focusing his research on User-Centered Artificial Intelligence with Harex InfoTech. Professor Lee was a member of the Korean Government 3.0 Committee and the 4th Industrial Revolution Strategy Committee. He has lectured on Internet Business, IoT, Future Technology and Consumer Revolution, Artificial Intelligence, Business Models, and related topics on the EBS Sunday Invitation Special Lecture (2000), Oh My Future (2016), My Job in Moon (2022), KBS Jang Young Sil Show (2015), CBS Sebasi Talk (2015), YTN Science (2018), MKYU Seven Tech (2021), and many more.
Original source: Reproduced from the Korean version of IT Chosun dated June 22, 2022
© Korea IT Times. Unauthorized reproduction and redistribution prohibited.