Generative AI Publication
How To Get DeepSeek R-1 Running Locally

Here's a full setup guide on how to install and use the DeepSeek R-1 language model on your local PC, so you can use this reasoning model safely without an internet connection.

Jim Clyde Monge
Jan 25, 2025 ∙ Paid


Everyone seems to be talking about DeepSeek R-1, the new open-source AI language model from the Chinese AI firm DeepSeek. Some users claim it's on par with, or even better than, OpenAI's o1 in terms of reasoning capabilities.

Currently, DeepSeek is free to use, which is great news for users, but it does raise some questions. With the surge in user volume, how are they managing the server costs?

Hardware and running costs can't be cheap, right?

The one logical answer here is data. Data is the lifeblood of AI models, so they're probably collecting user data in some way that benefits their quant trading models or some other form of monetization.

So, if you’re concerned about data privacy but still want to use R1 without sharing your data, the best option is to run the model locally.

What is DeepSeek R-1?

A couple of days ago, DeepSeek R-1 was unveiled as a fully open-source model, meaning anyone can take the underlying codebase, adapt it, and even fine-tune it to their own needs.

From a technical standpoint, DeepSeek R-1 (often abbreviated as R1) stems from a large base model called DeepSeek-V3. The lab then refined this model through a combination of supervised fine-tuning (SFT) on high-quality human-labeled data and reinforcement learning (RL).

The result is a chatbot that can handle intricate prompts, reveal the reasoning steps behind complex questions (sometimes more transparently than other models), and even render code in the chat interface for quick testing.

It’s honestly very impressive, especially for a model that’s open-source.


How To Run It Locally
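Before getting into the full walkthrough, here is a minimal sketch of what local inference typically looks like. It assumes you run R1 through Ollama (a common local-model runner, not necessarily the setup this guide uses) and that a model tagged `deepseek-r1` has already been pulled; the port and endpoint are Ollama's documented defaults. The snippet only builds the HTTP request for Ollama's `/api/generate` endpoint, so nothing leaves your machine until you actually send it.

```python
# Sketch: querying a locally running DeepSeek R-1 via Ollama's HTTP API.
# Assumptions: Ollama is installed and `ollama pull deepseek-r1` has been
# run; "deepseek-r1" and port 11434 are Ollama defaults, not from this post.
import json
import urllib.request


def build_request(prompt: str,
                  model: str = "deepseek-r1",
                  host: str = "http://localhost:11434") -> urllib.request.Request:
    """Build a POST request for Ollama's /api/generate endpoint."""
    body = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return urllib.request.Request(
        f"{host}/api/generate",
        data=body.encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


req = build_request("Why is the sky blue?")
print(req.full_url)                    # the local endpoint the call targets
print(json.loads(req.data)["model"])   # which model tag will be loaded
```

To actually get a completion, pass the request to `urllib.request.urlopen(req)` while the Ollama server is running; everything stays on localhost, which is the whole point of the privacy argument above.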

© 2025 Jim Clyde Monge