
✨ About KREW

<aside> <img src="https://prod-files-secure.s3.us-west-2.amazonaws.com/1de4fedb-1460-4e83-a174-75e21bb61ae9/b54e3925-99d1-432f-8d8d-2e8ec9b471a0/hfkr_logo.png" alt="Hugging Face KREW logo" width="40px" /> The Hugging Face KREW research team strives to contribute to open source and to build a deep understanding of Hugging Face.

</aside>

🚀 Team Vision

<aside> <img src="notion://custom_emoji/1de4fedb-1460-4e83-a174-75e21bb61ae9/146f51a7-c117-8051-b67c-007a50ab1b1d" alt="KREW emoji" width="40px" /> Let's lead positive social change by enabling everyone in Korea to utilize machine learning!

</aside>

Team Members

KREW 10th Builders

KREW Activities

Pseudo Lab 6th <Hugging Face Documentation Translation>


The reason for starting this project was simple. Transformers is one of the most widely used libraries in natural language processing and deep learning, with many users in Korea as well. However, since most of the documentation was available only in English, it was not easily accessible to Korean users. We believed that translating the official documentation into Korean would help many of them.

Rather than simply submitting contributions, team members studied together by sharing and presenting the translated documents, growing through the process. This gave us a deeper understanding of Hugging Face and its technology.

After we had translated most of the documentation, the visitor statistics showed that the Korean translation had become the most-read documentation among the non-English versions. This was a valuable achievement, showing that our team's efforts genuinely helped many users.

Pseudo Lab 7th <2023 Open Source Software Contribution Academy>

Since Hugging Face is open source, we wanted to contribute together with many people. For this reason, in 2023 we formed a team called <Hugging Face Localization> for OSSCA and participated as mentors. OSSCA, the Open Source Software Contribution Academy, is a program hosted by the Ministry of Science and ICT of the Republic of Korea. Many people interested in artificial intelligence applied, and we ran activities for about four months with a total of 20 mentees.

During this period, our team did more than translation work: we shared with the mentees the contribution methods and know-how we had accumulated while translating Transformers, and spent time learning how to actually contribute together. Beyond translating documents, we put a lot of effort into helping participants learn and grow in making meaningful contributions within the Hugging Face ecosystem.

We attempted to contribute in various ways beyond translation. For example, we created model resources to help users effectively utilize the models uploaded to Hugging Face. We also improved our Hugging Face skills by studying the book Natural Language Processing with Transformers together. Through this process, we were able to gain a deeper understanding of Hugging Face's various features and learn how to actually apply them.

Furthermore, mentees formed teams based on shared interests and carried out their own projects using Hugging Face. Through these projects, participants were able to move beyond simple learning and experience creating meaningful results with Hugging Face tools!


2023 Pseudo Con <Hugging Face Hackathon>


In November 2023, we held the <Hugging Face Hackathon: AI in Daily Life> at Pseudo Con with support from the Hugging Face OSS Team.

This hackathon was organized for people who wanted to create something with AI but had been unable to try due to a lack of resources. We provided participants with the resources they needed to implement their projects using Hugging Face, and we also shared the process and the results together at Pseudo Con.

Furthermore, we conducted Hugging Face tutorials for newcomers, helping more people get hands-on experience with Hugging Face.

Pseudo Lab 8th <Hugging Face Official Documentation Translation>


In Pseudo Lab's 8th term, we translated official documentation beyond Transformers. We translated the Hub Python Library, which enables interaction with the Hugging Face Hub, and the Audio Course, one of the educational materials on Hugging Face Learn, and we also contributed to the Vision Course by writing documents directly in English!