How to Run DeepSeek on Your Mac [2 Methods]
This blog post will introduce what DeepSeek is and guide you on how to run DeepSeek on your Mac. Check it out now.
Recently, an AI app named DeepSeek has gained significant attention. If you're interested in it and want to know how to run it on your Mac computer, this quick guide is for you.
What Is DeepSeek?
DeepSeek is an open-source AI language model developed by a Chinese company. Similar to ChatGPT, it supports multiple languages and provides advanced conversation understanding and generation capabilities.
In January 2025, DeepSeek was launched on the App Store (iPhone only). However, as it gained popularity, concerns about privacy and security also arose. Before using DeepSeek, make sure to comply with local regulations and assess its security for your needs.
If you're looking for an alternative to ChatGPT, DeepSeek is well worth a try.
DeepSeek Features
- Open-source & free: Unlike many closed AI services, DeepSeek's model weights are openly released, so developers can use and build on them freely.
- Local deployment: You can run DeepSeek locally on Mac, Windows, and Linux, without relying on the cloud.
- High-performance inference: Supports models of different sizes, such as 7B, 70B, 671B, and more, catering to various needs.
How to Run DeepSeek on Your Mac
Currently, DeepSeek is available as an app for iOS and Android. Mac users can access DeepSeek via the web or run it locally by installing Ollama. Below, we’ll walk you through both methods in detail.
Method 1. Use DeepSeek-R1 Online
The easiest way to use DeepSeek-R1 is through its web version. Follow these steps:
- Go to the DeepSeek official website.
- Click "Start Now".
- Register or log in (Google account sign-in is available).
- Start chatting with DeepSeek.
Method 2. Run DeepSeek Locally on Mac
DeepSeek does not have an official Mac app, but you can run it locally using Ollama, an open-source framework for AI models.
Before You Start
Running DeepSeek locally requires plenty of free storage on your Mac: the distilled models range from roughly 1 GB to over 40 GB. Consider using the professional Mac cleaner BuhoCleaner to free up storage for smooth installation and operation.
Once your Mac has enough storage space, follow the steps below to run DeepSeek locally on your Mac.
Step #1: Install Ollama on Your Mac
Ollama allows you to run AI models locally, including DeepSeek-R1, Llama 3.3, Phi-4, Mistral, and Gemma 2.
- Go to the Ollama official website and download the macOS version.
- Open the downloaded file and follow the installation instructions.
Step #2: Run DeepSeek-R1 in Mac Terminal
Once Ollama is installed, proceed to run DeepSeek-R1 in Terminal.
- Open Terminal via Finder > Applications > Utilities > Terminal.
- Copy and paste one of the commands below into Terminal and press Return.
| Model Name | Command |
| --- | --- |
| DeepSeek-R1 | `ollama run deepseek-r1:671b` |
| DeepSeek-R1-Distill-Llama-70B | `ollama run deepseek-r1:70b` |
| DeepSeek-R1-Distill-Qwen-32B | `ollama run deepseek-r1:32b` |
| DeepSeek-R1-Distill-Qwen-14B | `ollama run deepseek-r1:14b` |
| DeepSeek-R1-Distill-Llama-8B | `ollama run deepseek-r1:8b` |
| DeepSeek-R1-Distill-Qwen-7B | `ollama run deepseek-r1:7b` |
| DeepSeek-R1-Distill-Qwen-1.5B | `ollama run deepseek-r1:1.5b` |
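If you'd rather download a model first and chat later, Ollama also supports a separate pull step and one-off prompts. The commands below assume Ollama is installed and use the 8B tag as an example:

```shell
# Download the model without starting a chat session
ollama pull deepseek-r1:8b

# Start an interactive chat (type /bye to exit)
ollama run deepseek-r1:8b

# Or ask a single question non-interactively
ollama run deepseek-r1:8b "Summarize what a distilled model is."
```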
Tip: The `671b` tag in the first command downloads the full 671B-parameter model, which is extremely large. Which model you should download depends on your Mac's hardware. Here is a table for your reference:
| Model | Size | Mac Configuration Reference |
| --- | --- | --- |
| deepseek-r1:1.5b | 1.1 GB | M2/M3 (8 GB RAM+) |
| deepseek-r1:7b | 4.7 GB | M2/M3/M4 (16 GB RAM+) |
| deepseek-r1:8b | 4.9 GB | M2/M3/M4 (16 GB RAM+) |
| deepseek-r1:14b | 9.0 GB | M2/M3/M4 Pro (32 GB RAM+) |
| deepseek-r1:32b | 20 GB | M2 Ultra Mac Studio (64 GB RAM+) |
| deepseek-r1:70b | 43 GB | Server-grade devices (e.g., M2 Ultra Mac Studio + external GPU dock) |
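As an illustrative sketch (not part of the official tooling), the thresholds in the table above can be turned into a tiny shell helper that suggests a tag based on how much RAM you have:

```shell
# Illustrative helper: suggest a deepseek-r1 tag for a given amount of RAM (GB).
# Thresholds follow the reference table above; adjust them to your needs.
pick_model() {
  ram_gb=$1
  if   [ "$ram_gb" -ge 64 ]; then echo "deepseek-r1:32b"
  elif [ "$ram_gb" -ge 32 ]; then echo "deepseek-r1:14b"
  elif [ "$ram_gb" -ge 16 ]; then echo "deepseek-r1:8b"
  else                            echo "deepseek-r1:1.5b"
  fi
}

# On a Mac you could feed it your actual RAM (sysctl is macOS/BSD-specific):
# ram_gb=$(( $(sysctl -n hw.memsize) / 1024 / 1024 / 1024 ))
pick_model 16   # prints "deepseek-r1:8b"
```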
- Once downloaded, you can chat with DeepSeek offline.
- For a better experience, you can use Open WebUI (run via Docker) to get a ChatGPT-like interface on top of Ollama.
- To delete DeepSeek models you no longer need, run `ollama list` to see what's installed, then `ollama rm deepseek-r1:7b` (substituting the tag you installed). The model files themselves live in the ~/.ollama/models folder.
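The Open-WebUI tip above boils down to a single command, assuming Docker Desktop is installed and Ollama is running locally. The image name and flags follow Open WebUI's published quick-start at the time of writing:

```shell
# Start Open WebUI in the background; it auto-detects Ollama on the host.
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui --restart always \
  ghcr.io/open-webui/open-webui:main
```

Then open http://localhost:3000 in your browser to chat with your local DeepSeek models.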
DeepSeek Comparison: Online vs Local
| | Online (Web) | Local Deployment |
| --- | --- | --- |
| Security | Data stored on remote servers | Data stored on your Mac (Ollama's own security should still be considered) |
| Offline Access | Requires an internet connection | Works completely offline |
| Customization | No custom models | Supports different AI models |
| Speed | Affected by network speed | Depends on Mac hardware |
Conclusion
Now you know what DeepSeek is and how to run it on your Mac—whether using the web version or installing it locally with Ollama.
If this guide was helpful, feel free to share it with others. Happy exploring!
Clare Phang has been captivated by the world of Apple products since the launch of the iconic iPhone in 2007. With a wealth of experience, Clare is your go-to source for iPhone, iPad, Mac, and all things Apple.