Hugging Face open-sources SmolLM3, a top-performing small-parameter model
Jin10 Data, July 9th: Early this morning, Hugging Face, the globally renowned open-source large-model platform, open-sourced SmolLM3, a top small-parameter model. SmolLM3 has only 3 billion parameters, yet its performance significantly surpasses comparable open-source models such as Llama-3.2-3B and Qwen2.5-3B. It features a 128k context window and supports six languages, including English, French, Spanish, and German. It offers dual reasoning modes, deep thinking and non-thinking, which users can switch between flexibly.
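As a rough illustration of the dual-mode idea, the sketch below builds a chat message list that tags the system prompt with a reasoning-mode flag. The `/think` and `/no_think` flags follow a common chat-template convention and are an assumption here, not something stated in this article; check the model card for the actual mechanism.

```python
# Hypothetical sketch: toggling a deep-thinking vs. non-thinking mode
# by tagging the system prompt. The "/think" and "/no_think" flags are
# an assumed convention, not confirmed by the article above.

def build_messages(user_prompt: str, thinking: bool) -> list[dict]:
    """Return a chat message list with the reasoning-mode flag set."""
    mode_flag = "/think" if thinking else "/no_think"
    return [
        {"role": "system", "content": mode_flag},
        {"role": "user", "content": user_prompt},
    ]

msgs = build_messages("Summarize SmolLM3 in one sentence.", thinking=True)
print(msgs[0]["content"])  # → /think
```

Such a message list would then be passed through the model's chat template before generation; the flag simply tells the model which reasoning mode to use for that conversation.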