Portals works immediately, with no setup: Pollinations is enabled by default and requires no API key. When you’re ready to use a different model or provider, open Settings → AI Configuration and choose from the supported providers below.
API keys are saved locally on your device. Portals does not transmit your keys to any external server.
Supported providers
Pollinations — free, no key required
Pollinations is the default provider in Portals. It runs a free inference service and requires no sign-up or API key. Portal Agent uses it automatically the first time you open the app.
Default model: selected automatically by Pollinations
How to use it:
No configuration needed. If you previously switched to another provider and want to return to Pollinations, open Settings → AI Configuration, select Pollinations, and save.
OpenAI
Use GPT-4o and other OpenAI models with your own API key. You can optionally set an Organization ID and a custom Base URL if you are using a self-hosted or Azure OpenAI deployment.
Default model: gpt-4o
Get an API key:
Go to platform.openai.com
Sign in or create an account at platform.openai.com.
Create an API key
Navigate to API keys in the left sidebar and click Create new secret key. Copy the key — you won’t be able to view it again.
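Once you have a key, a request can be assembled against OpenAI's chat completions endpoint. A minimal sketch in Python (standard library only); the optional `org_id` and `base_url` parameters mirror the Organization ID and custom Base URL fields mentioned above, and the key values shown are placeholders:

```python
import json

def build_openai_request(api_key, model="gpt-4o", org_id=None,
                         base_url="https://api.openai.com/v1"):
    """Assemble the URL, headers, and body for a chat completion call."""
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    if org_id:  # optional Organization ID from Settings
        headers["OpenAI-Organization"] = org_id
    body = {
        "model": model,
        "messages": [{"role": "user", "content": "Hello"}],
    }
    return f"{base_url}/chat/completions", headers, json.dumps(body)

url, headers, body = build_openai_request("sk-placeholder")
print(url)  # https://api.openai.com/v1/chat/completions
```

Passing a custom `base_url` is how the same shape reaches a self-hosted or Azure deployment.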
Anthropic (Claude)
Use Claude models, including the claude-3-5-sonnet series, with your Anthropic API key.
Default model: claude-3-5-sonnet-20241022
Get an API key:
Go to console.anthropic.com
Sign in or create an account at console.anthropic.com.
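With a key in hand, requests go to Anthropic's Messages API. A minimal sketch, assuming the publicly documented endpoint and headers (`x-api-key`, `anthropic-version`); note that `max_tokens` is a required field in this API:

```python
import json

def build_anthropic_request(api_key, model="claude-3-5-sonnet-20241022"):
    """Assemble a Messages API call; max_tokens is required by Anthropic."""
    headers = {
        "x-api-key": api_key,
        "anthropic-version": "2023-06-01",
        "content-type": "application/json",
    }
    body = {
        "model": model,
        "max_tokens": 1024,
        "messages": [{"role": "user", "content": "Hello"}],
    }
    return "https://api.anthropic.com/v1/messages", headers, json.dumps(body)
```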
Google Gemini
Use Gemini models via the Google AI Studio or Vertex AI API. You can optionally provide a Project ID for Vertex AI deployments.
Default model: gemini-1.5-pro
Get an API key:
Go to aistudio.google.com
Sign in at aistudio.google.com with your Google account.
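An AI Studio key is passed as a query parameter to the `generateContent` endpoint. A minimal sketch, assuming the public Google AI Studio REST API shape (model name in the path, key in the query string):

```python
import json

GEMINI_BASE = "https://generativelanguage.googleapis.com/v1beta"

def build_gemini_request(api_key, model="gemini-1.5-pro", prompt="Hello"):
    """Assemble a generateContent call for the Google AI Studio API."""
    url = f"{GEMINI_BASE}/models/{model}:generateContent?key={api_key}"
    body = {"contents": [{"parts": [{"text": prompt}]}]}
    return url, json.dumps(body)
```

Vertex AI deployments use a different, project-scoped URL, which is where the optional Project ID comes in.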
Groq
Groq provides fast, low-latency inference for open-weight models.
Default model: set during configuration
Get an API key:
Go to console.groq.com
Sign in or create an account at console.groq.com.
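Groq's API is OpenAI-compatible, served under an `/openai/v1` path. A minimal sketch; the model identifier shown in the test is an example, and the one you pass is whichever you set during configuration:

```python
import json

def build_groq_request(api_key, model):
    """Groq exposes an OpenAI-compatible chat API under /openai/v1."""
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = {
        "model": model,
        "messages": [{"role": "user", "content": "Hello"}],
    }
    return "https://api.groq.com/openai/v1/chat/completions", headers, json.dumps(body)
```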
Ollama — local models
Ollama runs models entirely on your machine. No API key is required, and no data leaves your device.
Default model: llama3.2
Default base URL: http://localhost:11434
Set up Ollama:
Install Ollama
Download and install Ollama from ollama.com. Start the Ollama service.
Portals must be able to reach your Ollama server over HTTP. If you’re running Portals in a browser on a remote machine, make sure the Ollama endpoint is accessible from that origin.
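The reachability requirement above can be checked programmatically. A minimal sketch, assuming Ollama's documented `/api/chat` and `/api/tags` endpoints at the default base URL:

```python
import json
from urllib import request, error

OLLAMA_BASE = "http://localhost:11434"  # Ollama's default base URL

def build_ollama_chat(model="llama3.2", prompt="Hello"):
    """Build a non-streaming chat request for Ollama's /api/chat endpoint."""
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }
    return f"{OLLAMA_BASE}/api/chat", json.dumps(body).encode()

def ollama_reachable(timeout=2.0):
    """Return True if a local Ollama server answers on /api/tags."""
    try:
        with request.urlopen(f"{OLLAMA_BASE}/api/tags", timeout=timeout) as r:
            return r.status == 200
    except (error.URLError, OSError):
        return False
```

If `ollama_reachable()` returns False, confirm the Ollama service is running and that nothing blocks HTTP access to port 11434.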
OpenRouter
OpenRouter routes requests to hundreds of models from multiple providers using a single API key. You can optionally provide a Site URL and Site Name that OpenRouter will include in request attribution.
Default model: anthropic/claude-3.5-sonnet
Get an API key:
Go to openrouter.ai
Sign in or create an account at openrouter.ai.
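Requests use OpenRouter's OpenAI-compatible endpoint; the optional Site URL and Site Name map to the `HTTP-Referer` and `X-Title` headers OpenRouter documents for attribution. A minimal sketch:

```python
import json

def build_openrouter_request(api_key, model="anthropic/claude-3.5-sonnet",
                             site_url=None, site_name=None):
    """Chat completion via OpenRouter; attribution headers are optional."""
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    if site_url:
        headers["HTTP-Referer"] = site_url   # optional Site URL
    if site_name:
        headers["X-Title"] = site_name       # optional Site Name
    body = {
        "model": model,
        "messages": [{"role": "user", "content": "Hello"}],
    }
    return "https://openrouter.ai/api/v1/chat/completions", headers, json.dumps(body)
```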
OpenRouter Free
OpenRouter Free gives you access to a rotating selection of models at no cost. It requires an OpenRouter API key but uses only the free-tier models that OpenRouter makes available.
How it works: Portals fetches the current list of available free models from OpenRouter and lets you select one. The available models change over time as OpenRouter updates its free tier.
Get an OpenRouter API key
Follow the same steps as the OpenRouter provider above to create an account and generate a key at openrouter.ai.
Select OpenRouter Free in Portals
Open Settings → AI Configuration, select OpenRouter Free, paste your key, and click Fetch free models.
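The Fetch free models step boils down to filtering OpenRouter's model list by price. A minimal sketch, assuming the `/api/v1/models` response shape (a `data` array with string-valued `pricing` fields); the sample payload is illustrative, not live data:

```python
def free_models(models_payload):
    """Filter an OpenRouter /api/v1/models response down to zero-cost models.

    Assumes the documented response shape: {"data": [{"id": ..., "pricing":
    {"prompt": "0", "completion": "0", ...}}, ...]} with prices as strings.
    """
    out = []
    for m in models_payload.get("data", []):
        pricing = m.get("pricing", {})
        if float(pricing.get("prompt", "1")) == 0 and \
           float(pricing.get("completion", "1")) == 0:
            out.append(m["id"])
    return out

# Illustrative sample of the response shape, not real pricing data.
sample = {"data": [
    {"id": "meta-llama/llama-3.1-8b-instruct:free",
     "pricing": {"prompt": "0", "completion": "0"}},
    {"id": "anthropic/claude-3.5-sonnet",
     "pricing": {"prompt": "0.000003", "completion": "0.000015"}},
]}
print(free_models(sample))  # ['meta-llama/llama-3.1-8b-instruct:free']
```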
OpenAI-compatible providers
Connect any OpenAI-compatible endpoint
If you run a self-hosted inference server, such as LM Studio, vLLM, or a custom deployment, that exposes an OpenAI-compatible API, you can connect it using the OpenAI Compatible provider type.
Required fields:
- Base URL — the root URL of your API, for example https://api.example.com/v1
- API Key — your authentication token (required by most servers, even local ones)
- Model — the model identifier to pass in requests
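A minimal sketch of validating and normalizing the three required fields, assuming the server follows the standard OpenAI path layout (`/chat/completions` under the base URL); the example values are placeholders:

```python
def compat_config(base_url, api_key, model):
    """Normalize the three required fields for an OpenAI-compatible endpoint."""
    base = base_url.rstrip("/")  # tolerate a trailing slash in the Base URL
    if not base.startswith(("http://", "https://")):
        raise ValueError("Base URL must include the scheme, e.g. https://")
    return {
        "chat_url": f"{base}/chat/completions",
        "headers": {
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        "model": model,
    }

cfg = compat_config("https://api.example.com/v1/", "sk-local", "my-model")
print(cfg["chat_url"])  # https://api.example.com/v1/chat/completions
```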