In brief
- You can give local AI models web access using free Model Context Protocol (MCP) servers—no corporate APIs, no data leaks, no fees.
- Setup is simple: Install LM Studio, add Brave, Tavily, or DuckDuckGo MCP configs, and your offline model becomes a private SearchGPT.
- The result: real-time browsing, article analysis, and data fetching—without ever touching OpenAI, Anthropic, or Google’s clouds.
So you want an AI that can browse the web and think the only options are from OpenAI, Anthropic, or Google?
Think again.
Your data doesn’t need to travel through corporate servers every time your AI assistant needs timely information. With Model Context Protocol (MCP) servers, you can give even lightweight consumer models the ability to search the web, analyze articles, and access real-time data—all while maintaining complete privacy and spending zero dollars.
The catch? There isn’t one. These tools offer generous free tiers: Brave Search provides 2,000 queries monthly, Tavily offers 1,000 credits, and some options require no API key at all. For most users, that’s enough runway to never hit a limit; few people make anywhere near 1,000 searches in a month.
Before getting technical, two concepts need explanation. “Model Context Protocol” is an open standard released by Anthropic in November 2024 that lets AI models connect to external tools and data sources. Think of it as a universal adapter that snaps Tinkertoy-like modules onto your AI model, each one adding utility and functionality.
Instead of telling your AI exactly what to do (which is what an API call does), you tell the model what you need, and it figures out on its own how to achieve that goal. MCPs are not as accurate as traditional API calls, and you’ll likely spend more tokens to make them work, but they are far more versatile.
“Tool calling”—sometimes called function calling—is the mechanism that makes this work. It’s the ability of an AI model to recognize when it needs external information and invoke the appropriate function to get it. When you ask, “What’s the weather in Rio de Janeiro?”, a model with tool calling can identify that it needs to call a weather API or MCP server, format the request properly, and integrate the results into its response. Without tool-calling support, your model can only work with what it learned during training.
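To make that loop concrete, here’s a minimal Python sketch. The `get_weather()` helper is hypothetical, a stand-in for a real weather API or MCP server, and in practice apps like LM Studio handle this routing for you:

```python
# Minimal sketch of the tool-calling loop. get_weather() is a
# hypothetical stand-in for a real weather API or MCP server.

def get_weather(city: str) -> str:
    # A real implementation would call out to an external service.
    return f"28°C and sunny in {city}"

# The registry the host app exposes to the model: each entry describes
# a tool and maps it to the local function that actually runs it.
TOOLS = {
    "get_weather": {
        "description": "Get current weather for a city",
        "parameters": {"city": "string"},
        "handler": get_weather,
    }
}

def dispatch(tool_call: dict) -> str:
    """Route a model-emitted tool call to the matching local function."""
    tool = TOOLS[tool_call["name"]]
    return tool["handler"](**tool_call["arguments"])

# What a model might emit for "What's the weather in Rio de Janeiro?"
call = {"name": "get_weather", "arguments": {"city": "Rio de Janeiro"}}
print(dispatch(call))
```

The model never executes anything itself: it only emits a structured request, and the host application runs the matching function and feeds the result back.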
Here’s how to give superpowers to your local model.
Technical requirements and setup
The requirements are minimal: You need Node.js and Python installed on your computer, a local AI application that supports MCP (such as LM Studio version 0.3.17 or higher, Claude Desktop, or Cursor IDE), and a model with tool-calling capabilities.
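If you want to confirm the prerequisites are in place before continuing, a quick standard-library Python check might look like this (npx ships with Node.js, and uvx ships with the uv package manager):

```python
import shutil

def check_tools(commands=("node", "npx", "uvx")):
    """Report whether each required command-line tool is on the PATH."""
    return {cmd: shutil.which(cmd) is not None for cmd in commands}

for cmd, found in check_tools().items():
    print(f"{cmd}: {'found' if found else 'missing'}")
```

Anything reported as missing needs to be installed before the corresponding MCP servers will start.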
Some good models with tool-calling that run on consumer-grade machines are GPT-oss, DeepSeek R1 0528, Jan-v1-4b, Llama-3.2 3b Instruct, and Pokee Research 7B.
To install a model, go to the magnifying glass icon on the left sidebar in LM Studio and search for one. The models that support tools will show a hammer icon near their name. Those are the ones that you’ll need.

Most modern models above 7 billion parameters support tool calling—Qwen3, DeepSeek R1, Mistral, and similar architectures all work well. The smaller the model, the more you might need to prompt it explicitly to use search tools, but even 4-billion parameter models can manage basic web access.
Once you have downloaded the model, you need to “load” it so LM Studio knows to use it. You don’t want your erotic roleplay model doing the research for your thesis.
Setting up search engines
Configuration happens through a single mcp.json file. The location depends on your application: LM Studio uses its settings interface to edit this file, Claude Desktop looks in specific user directories, and other applications have their own conventions. Each MCP server entry requires just three elements: a unique name, the command to run it, and any required environment variables like API keys.
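As a rough sketch, a single entry follows this shape (the server name and package below are placeholders, not a real server):

```json
{
  "mcpServers": {
    "my-server": {
      "command": "npx",
      "args": ["-y", "some-mcp-server-package"],
      "env": {
        "SOME_API_KEY": "your_key_here"
      }
    }
  }
}
```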
But you don’t really need to know that: Just copy and paste the configuration that developers provide, and it will work. If you don’t want to mess with manual edits, then at the end of this guide you’ll find one configuration, ready to copy and paste, so you can have some of the most important MCP servers ready to work.
The three best search tools available through MCP bring different strengths. Brave focuses on privacy, Tavily is more versatile, and DuckDuckGo is the easiest to implement.
To add DuckDuckGo, simply go to lmstudio.ai/danielsig/duckduckgo and click on the button that says “Run in LM Studio.”
Then go to lmstudio.ai/danielsig/visit-website and do the same: click “Run in LM Studio.”

And that’s it. You have just given your model its first superpower. Now you have your own SearchGPT for free—local, private, and powered by DuckDuckGo.
Ask it to find you the latest news, the price of Bitcoin, the weather, and so on, and it will give you updated and relevant information.

Brave Search is a bit harder to set up than DuckDuckGo, but offers a more robust service, running on an independent index of over 30 billion pages and providing 2,000 free queries monthly. Its privacy-first approach means no user profiling or tracking, making it ideal for sensitive research or personal queries.
To configure Brave, sign up at brave.com/search/api to get your API key. Signing up requires payment verification, but there is a free plan, so don’t worry.
Once there, go to the “API Keys” section, click on “Add API Key” and copy the new code. Do not share that code with anyone.

Then go to LM Studio, click the little wrench icon in the top right corner, open the “Program” tab, and under Install, click “Edit mcp.json.”

Once there, paste the following text into the field that appears. Remember to put the secret API key you just created inside the quotation marks where it says “your brave api key here”:
{
  "mcpServers": {
    "brave-search": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-brave-search"],
      "env": {
        "BRAVE_API_KEY": "your_brave_api_key_here"
      }
    }
  }
}
That’s all. Now your local AI can browse the web using Brave. Ask it anything and it will give you the most up-to-date information it can find.
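If you’re curious what the MCP server does on your behalf, it boils down to authenticated HTTP requests against Brave’s Search API. Here’s a rough Python sketch (the endpoint follows Brave’s API documentation, the key is a placeholder, and the request is built but not actually sent):

```python
import urllib.parse
import urllib.request

API_KEY = "your_brave_api_key_here"  # placeholder, use your real key

# Brave's Search API takes the query as a URL parameter and the key
# in the X-Subscription-Token header.
params = urllib.parse.urlencode({"q": "latest bitcoin price", "count": 5})
request = urllib.request.Request(
    f"https://api.search.brave.com/res/v1/web/search?{params}",
    headers={
        "Accept": "application/json",
        "X-Subscription-Token": API_KEY,
    },
)

print(request.full_url)
# To actually send it: urllib.request.urlopen(request)
```

The MCP server wraps this round trip, so your model only has to say what it wants to search for.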

Use case: A journalist researching breaking news needs current information from multiple sources. Brave’s independent index means results aren’t filtered through other search engines, providing a different perspective on controversial topics.
Tavily is another great tool for web browsing. It gives you 1,000 credits per month and specialized search capabilities for news, code, and images. It’s also very easy to set up: create an account at app.tavily.com, generate your MCP link from the dashboard, and you’re ready.

Then copy and paste the following configuration into LM Studio, just as you did with Brave, replacing YOUR_API_KEY_HERE with your Tavily key:
{
  "mcpServers": {
    "tavily-remote": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "https://mcp.tavily.com/mcp/?tavilyApiKey=YOUR_API_KEY_HERE"]
    }
  }
}
Use case: A developer debugging an error message can ask their AI assistant to search for solutions, with Tavily’s code-focused search returning Stack Overflow discussions and GitHub issues automatically formatted for easy analysis.
Reading and interacting with websites
Beyond search, MCP Fetch handles a different problem: reading full articles. Search engines return snippets, but MCP Fetch retrieves complete webpage content and converts it to markdown format optimized for AI processing. This means your model can analyze entire articles, extract key points, or answer detailed questions about specific pages.
Simply copy and paste this configuration. No need to create API keys or anything like that:
{
  "mcpServers": {
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"]
    }
  }
}
This one runs through uvx, a tool that comes with the uv package manager, so you’ll need to install uv first. Just follow this guide and you’ll be done in a minute or two.
This is great for summarization, analysis, iteration, or even mentorship. A researcher could feed it a technical paper URL and ask, “Summarize the methodology section and identify potential weaknesses in their approach.” The model fetches the full text, processes it, and provides detailed analysis that’s impossible with search snippets alone.
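Under the hood, fetch-style servers boil HTML down to clean, readable text. This toy Python extractor, a deliberately crude stand-in for the real server, shows the idea:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Crude HTML-to-text converter, a toy version of what MCP Fetch does."""

    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip = 0  # depth inside <script>/<style> tags we ignore

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1
        elif tag == "h1":
            self.parts.append("# ")  # mimic markdown heading output

    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.parts.append(data.strip() + "\n")

html = "<h1>Title</h1><script>junk()</script><p>Body text.</p>"
parser = TextExtractor()
parser.feed(html)
print("".join(parser.parts))
```

The real mcp-server-fetch does far more (full markdown conversion, pagination, content negotiation), but the core job is the same: turn noisy webpage markup into text a model can reason over.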
Want something simpler? This prompt is perfectly understandable even by your dumbest local AI:
“Summarize this in three paragraphs and let me know why it is so important: https://decrypt.co/346104/ethereum-network-megaeth-350m-token-sale-valuing-mega-7-billion”

There are a lot of other MCP tools to explore, each giving your models different capabilities. For example, MCP Browser and Playwright enable interaction with any website—form filling, navigation, even JavaScript-heavy applications that static scrapers can’t handle. There are also servers for SEO audits, learning with Anki flashcards, and enhancing your coding abilities.

The complete configuration
If you don’t want to configure your LM Studio mcp.json piece by piece, here’s a complete file integrating all these services.
Copy it, add your API keys where indicated, paste it into your configuration, and restart your AI application. Just remember to install the proper dependencies:
{
  "mcpServers": {
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"]
    },
    "brave-search": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-brave-search"],
      "env": {
        "BRAVE_API_KEY": "YOUR_API_KEY_HERE"
      }
    },
    "browsermcp": {
      "command": "npx",
      "args": ["@browsermcp/mcp@latest"]
    },
    "tavily-remote": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "https://mcp.tavily.com/mcp/?tavilyApiKey=YOUR_API_KEY_HERE"]
    }
  }
}
This configuration gives you access to Fetch, Brave, Tavily, and MCP Browser—no coding required, no complex setup procedures, no subscription fees, and no data handed to big corporations. Just working web access for your local models.
You’re welcome.