Hugging Face, well known for its open-source deep learning libraries for natural language processing, has raised a $40M Series B investment. Hugging Face Transformers is probably one of the most popular open-source projects in natural language processing, currently with around 42,000 stars and 10,000 forks.
With Hugging Face Transformers, you can easily try out well-known language models such as BERT, GPT, XLNet, and T5, and experiment with a variety of tasks such as text classification, keyword extraction, question answering, and summarization. About 5,000 companies, including Microsoft's search engine Bing, are known to use the Hugging Face library. The basic functionality is provided as open source, while the company offers a paid product that combines models specialized for a particular service, a hosted inference API, and technical support.
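As a minimal sketch of how the library is typically used (assuming the `transformers` package and a deep learning backend such as PyTorch are installed, and using default pretrained models), a task can be run in a few lines with the pipeline API:

```python
# Minimal sketch using the transformers pipeline API.
# Assumes `pip install transformers` plus a backend such as PyTorch;
# the example text below is illustrative, not from the article.
from transformers import pipeline

# Text classification (sentiment) with a default pretrained model.
classifier = pipeline("sentiment-analysis")
print(classifier("Hugging Face just raised a $40M Series B."))

# Summarization of a longer passage.
summarizer = pipeline("summarization")
article = (
    "Hugging Face, known for its open-source NLP libraries, has raised a "
    "$40M Series B. The Transformers library lets developers try models "
    "such as BERT, GPT, XLNet, and T5 on tasks like text classification, "
    "question answering, and summarization."
)
print(summarizer(article, max_length=40, min_length=10))
```

The hosted inference API mentioned above exposes essentially the same tasks over HTTP, so the open-source library and the paid service share a common workflow.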
A similar business model can be seen in Google's and Microsoft's cloud API services, and Naver and Kakao offer comparable products. GPT-3 is also expected to be offered through a paid API. In Hugging Face's case, however, its active open-source policy appears to create a strong lock-in effect among researchers, and in the natural language field its features tend to reflect much more recent research trends. What is clear is that competition among these AI frameworks will ultimately drive technological progress and lower the barriers to building AI services.
We look forward to seeing how Hugging Face fares at generating profits, not just attracting investment. Here is a link to a more detailed article: