
Uploaded 6 days ago





Phi4-Mini on M4 mac Mini Pro 24GB

In this video, I put the Phi4-Mini small language model from Microsoft through its paces on a base-model Mac Mini M4 Pro with 24GB RAM and a 512GB SSD.

Useful links:
• Install Ollama and DeepSeek R1 on Mac: DeepSeek R1 Apple MacBook Install (video)
• Ollama: https://ollama.com/
• Microsoft blog: https://techcommunity.microsoft.com/b...
• Gear I use: https://www.head4space.com/gear/

Key points covered:
• Walkthrough of Phi4-Mini using Ollama
• Real performance testing with various prompts
• Response-time analysis with verbose mode
• Comparison against Phi4 14B

The model shows surprisingly good performance, with response times ranging from 1 to 30 seconds depending on task complexity. Even running alongside screen-recording software, which uses some GPU resources, the model maintains usable performance.

This video demonstrates that Phi4-Mini runs easily on a 24GB Mac Mini M4 Pro, leaving plenty of system resources for other tasks. For anyone interested in running AI models locally as part of a wider AI toolkit, it is a great option.

Thanks to our YouTube community for suggesting this test! If you're interested in running AI locally, check out our other videos on language models.
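For reference, the Ollama walkthrough in the video can be reproduced with a couple of commands. This is a minimal sketch, assuming Ollama is installed and that the model is published under the tag `phi4-mini` in the Ollama library (check with `ollama list` or on https://ollama.com/); it requires a running local Ollama daemon.

```shell
# Download the Phi4-Mini model (tag assumed; verify on the Ollama library page)
ollama pull phi4-mini

# Start an interactive chat; --verbose prints timing stats (total duration,
# eval rate in tokens/s) after each response, as shown in the video
ollama run phi4-mini --verbose
```

The `--verbose` timing output is what the response-time analysis in the video is based on: the eval rate line gives generation speed in tokens per second, which you can compare against the larger Phi4 14B model on the same machine.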
