How to Use Claude Code for Free
If you’ve been eyeing Claude Code but don’t fancy paying for yet another AI subscription, I’ve got good news. You can run it locally on your machine for absolutely nothing.
The trick? A tool called Ollama. It lets you run open-source language models on your own computer, and Claude Code can plug straight into it. No API key. No monthly bill. No cloud.
Here’s how to set it up in about 10 minutes.
What’s Actually Happening Here
Claude Code is the tool - it’s the interface you type into, the thing that writes and edits code for you. Normally it connects to Anthropic’s servers in the cloud, which costs money.
Ollama is like a mini AI server that runs on your laptop. It downloads an open-source model and serves it up locally. So instead of Claude Code talking to the cloud, it talks to Ollama sitting right there on your machine.
Same experience. Zero cost. The trade-off? Your computer does the heavy lifting, so you’ll want decent RAM. And the local models aren’t quite as sharp as the cloud versions. But for learning, experimenting, and building stuff? More than good enough.
Step 1: Install Ollama
Head to the Ollama website and download it. Works on both Windows and Mac.
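Once the installer finishes, it's worth a quick sanity check that the command is actually available before moving on. A minimal sketch:

```shell
# Confirm Ollama is installed and on your PATH
if command -v ollama >/dev/null 2>&1; then
  ollama --version
  found=yes
else
  echo "ollama not found - try reopening your terminal or re-running the installer"
  found=no
fi
```

If it's not found, a fresh terminal window usually picks up the new PATH.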
Step 2: Download a Model
Open up your terminal (PowerShell on Windows, Terminal on Mac) and run:
ollama run gpt-oss:(size)

Replace (size) with whichever variant suits your machine. The general rule: bigger model = smarter responses = needs more RAM. If you've got 16GB of RAM, you'll be fine with the mid-sized options. If you've got 32GB+, go large.
You can browse all available models on the Ollama website. Pick one with a large context length - that’s what lets it handle bigger codebases without losing the plot halfway through.
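Not sure how much RAM you've actually got? Here's a rough sketch for checking from the terminal on macOS or Linux (Windows users can just look in Task Manager); the 32GB threshold mirrors the rule of thumb above:

```shell
# Report installed RAM in GB to help pick a model size (macOS/Linux only)
if [ "$(uname)" = "Darwin" ]; then
  ram_gb=$(( $(sysctl -n hw.memsize) / 1073741824 ))
else
  ram_gb=$(( $(awk '/MemTotal/ {print $2}' /proc/meminfo) / 1048576 ))
fi
echo "You have ~${ram_gb}GB of RAM"
if [ "$ram_gb" -ge 32 ]; then
  echo "Plenty of headroom - the larger variants are worth a try"
else
  echo "Stick to the small or mid-sized variants"
fi
```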
Step 3: Install Claude Code
On Mac:
curl -fsSL https://claude.ai/install.sh | sh

On Windows (PowerShell):

irm https://claude.ai/install.ps1 | iex

Windows users - if the claude command doesn't work after install, you might need to add it to your system path. Go to System Properties → Environment Variables → System Variables → Path → New → then paste your install directory (usually C:\Users\<your-username>\.local\bin).
After that, typing claude in your terminal should fire it up.
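Mac users can do the equivalent PATH check from the terminal. A sketch, assuming the installer used ~/.local/bin (verify against your actual install directory):

```shell
# Check whether claude's likely install dir is on your PATH
install_dir="$HOME/.local/bin"
case ":$PATH:" in
  *":$install_dir:"*) echo "$install_dir is on PATH - you're good" ;;
  *) echo "Not on PATH - append 'export PATH=\"$install_dir:\$PATH\"' to ~/.zshrc" ;;
esac
```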
Step 4: Connect Claude Code to Your Local Model
This is where the magic happens. You’re telling Claude Code to stop looking at the cloud and start talking to your local Ollama instead.
The easiest way:
ollama launch claude

Then select the model you downloaded earlier. Done.
If that doesn’t work for whatever reason, you can do it manually with environment variables:
On Mac:
export ANTHROPIC_AUTH_TOKEN=ollama
export ANTHROPIC_BASE_URL=http://localhost:11434
claude --model gpt-oss:(size)

On Windows (PowerShell):
$env:ANTHROPIC_AUTH_TOKEN="ollama"
$env:ANTHROPIC_BASE_URL="http://localhost:11434"
claude --model gpt-oss:(size)

The first time you run it, Claude Code will ask you to confirm that your project files are trustworthy. Say yes and you're in.
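If Claude Code can't connect, the first thing to check is whether Ollama is actually serving on its default port (11434). A quick sketch using Ollama's /api/tags endpoint, which lists your local models:

```shell
# Sanity check: is the Ollama server listening on its default port?
if curl -s http://localhost:11434/api/tags >/dev/null; then
  status="Ollama is up"
else
  status="Ollama isn't running - start it with: ollama serve"
fi
echo "$status"
```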
That’s It. You’re Running Claude Code for Free.
No subscription. No API credits ticking down. No anxiety about how many prompts you’ve got left this month. Just you, your machine, and a local AI ready to build stuff.
Want to Test It? Build a Snake Game.
Seriously. Once you’re set up, paste this into Claude Code and watch what happens:
“Build a classic Snake game in Python with Pygame. Grid-based movement, arrow key controls, food spawning, score counter, increasing speed as the snake grows, wall and self-collision detection, game over screen with final score. Single file, clean code.”
If it spits out a working Snake game, your setup is solid.
The Honest Trade-Offs
I’m not going to pretend this is identical to the paid cloud experience. It’s not. Local models are generally less capable than the latest cloud models. They can be slower depending on your hardware. And if you’re working on a massive codebase, you might hit context limits faster.
But for learning how AI-assisted coding works, for building side projects, for experimenting without worrying about cost - this is genuinely brilliant. It’s also a great way to figure out if the paid version is worth it for you before committing.
And there’s something satisfying about the whole thing running on your machine. No internet required. No data leaving your laptop. Just pure local AI vibes.
If you found this useful, send it to someone who keeps saying AI tools are too expensive to try.