
DeepSeek released an updated version of its DeepSeek-V3 model on March 24. The new version, DeepSeek-V3-0324, has 685 billion parameters, a slight increase from the original V3 model's 671 billion. The company has not yet released a system card for the updated model. DeepSeek has also changed the model's open-source license to an MIT license, aligning it with the DeepSeek-R1 model.

The original DeepSeek-V3 gained worldwide attention for its cost-effectiveness. In multiple benchmark tests, it outperformed other open-source models such as Qwen2.5-72B and Llama-3.1-405B, while delivering performance comparable to top proprietary models like GPT-4o and Claude-3.5-Sonnet. DeepSeek investor High-Flyer Quant has emphasized in a published paper that the model was trained at exceptionally low costs. By optimizing algorithms, frameworks, and hardware, the total training cost of DeepSeek-V3 was just $5.576 million – assuming an H800 GPU rental price of $2 per GPU per hour. [Cailian, in Chinese] 
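The cost figure above implies a total compute budget that can be checked with simple arithmetic. The sketch below uses only the two numbers cited in the article (the $5.576 million total and the assumed $2 per H800 GPU per hour rental rate) to derive the implied GPU-hours:

```python
# Back-of-the-envelope check of the training-cost figure cited above.
# Both inputs come from the article: $5.576M total cost, at an assumed
# rental rate of $2 per H800 GPU per hour.
total_cost_usd = 5_576_000
rate_usd_per_gpu_hour = 2.0

implied_gpu_hours = total_cost_usd / rate_usd_per_gpu_hour
print(f"Implied H800 GPU-hours: {implied_gpu_hours:,.0f}")  # 2,788,000
```

At $2 per GPU-hour, the quoted budget corresponds to roughly 2.788 million H800 GPU-hours of training compute.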

