How to Run DeepSeek on Your Mac [2 Methods]

Clare Phang
Last updated: Feb 25, 2025

This blog post will introduce what DeepSeek is and guide you on how to run DeepSeek on your Mac. Check it out now.

Recently, an AI app named DeepSeek has gained significant attention. If you're interested in it and want to know how to run it on your Mac computer, this quick guide is for you.


What Is DeepSeek?

DeepSeek is an open-source AI language model developed by a Chinese company. Similar to ChatGPT, it supports multiple languages and provides advanced conversation understanding and generation capabilities.

In January 2025, DeepSeek was launched on the App Store (iPhone only). However, as it gained popularity, concerns about privacy and security also arose. Before using DeepSeek, make sure to comply with local regulations and assess its security for your needs.

If you're looking for an alternative to ChatGPT, DeepSeek is a strong option worth trying.

DeepSeek on App Store

DeepSeek Features

  • Open-source & free: Unlike many closed AI services, DeepSeek allows developers to use it freely.
  • Local deployment: You can run DeepSeek locally on Mac, Windows, and Linux, without relying on the cloud.
  • High-performance inference: Supports models of different sizes, such as 7B, 70B, 671B, and more, catering to various needs.

How to Run DeepSeek on Your Mac

Currently, DeepSeek is available as an app for iOS and Android. Mac users can access DeepSeek via the web or run it locally by installing Ollama. Below, we’ll walk you through both methods in detail.

Method 1. Use DeepSeek-R1 Online

The easiest way to use DeepSeek-R1 is through its web version. Follow these steps:

  1. Go to the DeepSeek official website.
  2. Click "Start Now".
    DeepSeek Web Version
  3. Register or log in (Google account sign-in is available).
  4. Start chatting with DeepSeek.
    Use DeepSeek Online

Method 2. Run DeepSeek Locally on Mac

DeepSeek does not have an official Mac app, but you can run it locally using Ollama, an open-source framework for AI models.

Before You Start

Running DeepSeek locally requires enough free storage on your Mac for the model you plan to download (roughly 1 GB to over 40 GB, depending on the model; see the table in Step #2). Consider using the professional Mac cleaner BuhoCleaner to free up storage for smooth installation and operation.

No Space Left on Mac Terminal
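
Not sure how much free space you have? You can check in Terminal with the standard macOS df command (a quick sketch; the output columns may look slightly different on your system):

    df -h /          # shows the size, used, and available space on your startup disk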


Once your Mac has enough storage space, follow the steps below to run DeepSeek locally on your Mac.

Step #1: Install Ollama on Your Mac

Ollama allows you to run AI models locally, including DeepSeek-R1, Llama 3.3, Phi-4, Mistral, and Gemma 2.

  1. Go to the Ollama official website and download the macOS version.
    Download Ollama for Mac
  2. Open the downloaded file and follow the installation instructions.
    Use Ollama to Run DeepSeek-R1
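
Once the installation finishes, you can optionally confirm that Ollama's command-line tool is available by running a quick check in Terminal:

    ollama --version     # prints the installed Ollama version if setup succeeded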

Step #2: Run DeepSeek-R1 in Mac Terminal

Once Ollama is installed, proceed to run DeepSeek-R1 in Terminal.

  1. Open Terminal by clicking Finder > Applications > Utilities > Terminal.
  2. Copy and paste one of the commands below into Terminal and press Return.
Model Name | Command
DeepSeek-R1 | ollama run deepseek-r1:671b
DeepSeek-R1-Distill-Llama-70B | ollama run deepseek-r1:70b
DeepSeek-R1-Distill-Qwen-32B | ollama run deepseek-r1:32b
DeepSeek-R1-Distill-Qwen-14B | ollama run deepseek-r1:14b
DeepSeek-R1-Distill-Llama-8B | ollama run deepseek-r1:8b
DeepSeek-R1-Distill-Qwen-7B | ollama run deepseek-r1:7b
DeepSeek-R1-Distill-Qwen-1.5B | ollama run deepseek-r1:1.5b
Run DeepSeek R1 on Mac in Terminal

Tips:

The 671b in the first command downloads the full 671B-parameter model, which is far larger than the others and needs more memory than typical Mac hardware provides. Which model to download depends on your Mac's hardware. Here is a table for reference:

Model | Size | Mac Configuration Reference
deepseek-r1:1.5b | 1.1 GB | M2/M3 (8 GB RAM+)
deepseek-r1:7b | 4.7 GB | M2/M3/M4 (16 GB RAM+)
deepseek-r1:8b | 4.9 GB | M2/M3/M4 (16 GB RAM+)
deepseek-r1:14b | 9.0 GB | M2/M3/M4 Pro (32 GB RAM+)
deepseek-r1:32b | 20 GB | M2 Ultra Mac Studio (64 GB RAM+)
deepseek-r1:70b | 43 GB | Server-grade devices (e.g., M2 Ultra Mac Studio + external GPU dock)
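
If you're unsure which chip and how much memory your Mac has, check About This Mac, or run the standard macOS commands below in Terminal (a small sketch; the exact wording of the output varies by model):

    sysctl -n machdep.cpu.brand_string                        # chip name, e.g. "Apple M2"
    echo "$(($(sysctl -n hw.memsize) / 1073741824)) GB RAM"   # installed memory in GB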
  3. Once the model has downloaded, you can chat with DeepSeek offline.
How to Use DeepSeek R1 in Terminal
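
Chatting happens right in Terminal. The lines below are a small sketch of typical Ollama usage with the 7B model as an example; swap in whichever tag you downloaded:

    ollama run deepseek-r1:7b                                        # start an interactive chat; type /bye to exit
    ollama run deepseek-r1:7b "Explain what a distilled model is."   # one-off prompt: prints a reply, then exits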
Tips:
  1. For a better experience, you can use Open WebUI (which can be run with Docker or Docker Compose) to create a ChatGPT-like interface for DeepSeek; see the sketch after these tips.
  2. If you want to delete DeepSeek models stored on your Mac, open the ~/.ollama/models folder and move the models you no longer need to the Trash.
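
For the two tips above, here is a rough, unofficial sketch. The docker run line mirrors the Open WebUI quick-start as documented at the time of writing (the image tag, port, and flags may change, so check the Open WebUI docs), and ollama list / ollama rm are Ollama's built-in way to inspect and remove downloaded models without editing ~/.ollama/models by hand:

    # ChatGPT-like web interface for local models; then open http://localhost:3000
    docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway \
      -v open-webui:/app/backend/data --name open-webui --restart always \
      ghcr.io/open-webui/open-webui:main

    # Manage downloaded models with Ollama itself
    ollama list                  # list models stored on your Mac and their sizes
    ollama rm deepseek-r1:7b     # remove a model you no longer need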

DeepSeek Comparison: Online vs Local

Aspect | Online (Web) | Local Deployment
Security | Data stored on DeepSeek's servers | Data stored on your Mac (Ollama's own security should still be considered)
Offline Access | Requires an internet connection | Works completely offline
Customization | No custom models | Supports different AI models
Speed | Affected by network speed | Depends on your Mac hardware

Conclusion

Now you know what DeepSeek is and how to run it on your Mac—whether using the web version or installing it locally with Ollama.

If this guide was helpful, feel free to share it with others. Happy exploring!

Clare Phang has been captivated by the world of Apple products since the launch of the iconic iPhone in 2007. With a wealth of experience, Clare is your go-to source for iPhone, iPad, Mac, and all things Apple.