cloudflare
tcp/443 tcp/80 tcp/8443
Open service 104.21.71.79:8443 · info.smol.services
2026-01-26 12:03
HTTP/1.1 200 OK
Date: Mon, 26 Jan 2026 12:03:50 GMT
Content-Type: text/html
Transfer-Encoding: chunked
Connection: close
CF-Cache-Status: MISS
Cache-Control: public, max-age=0, must-revalidate
Nel: {"report_to":"cf-nel","success_fraction":0.0,"max_age":604800}
Vary: accept-encoding
Server-Timing: cfCacheStatus;desc="MISS"
Server-Timing: cfEdge;dur=208,cfOrigin;dur=0
Report-To: {"group":"cf-nel","max_age":604800,"endpoints":[{"url":"https://a.nel.cloudflare.com/report/v4?s=1EEAEvr%2F5tpCOGSA3CI9wLcrtYv8j7U%2FUOTE3WN3CRNBZoEuFntU%2BYaPosFmISHhHQmMiLQ1SJUQV4yvuPqyN1o548hR3EyYOwS%2B4Y2Cqt3e7A%3D%3D"}]}
Server: cloudflare
CF-RAY: 9c3fde8edbc8cb6e-BOM
alt-svc: h3=":8443"; ma=86400
Page title: Smolproxy
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Smolproxy</title>
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/water.css@2/out/dark.css">
<link rel="stylesheet" href="https://fonts.googleapis.com/css2?family=Inter:wght@400;500;600&display=swap">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/styles/atom-one-dark.min.css">
<style>
body {
max-width: 900px;
margin: 0 auto;
padding: 20px;
}
</style>
</head>
<body>
<h1 id="smolproxy" tabindex="-1"><a class="header-anchor" href="#smolproxy" aria-hidden="true">#</a> Smolproxy</h1>
<p class="center"><strong>Service info: <a href="https://smolproxy.org">https://smolproxy.org</a><br></strong></p>
<section><h3 id="what-is-smolproxy%3F" tabindex="-1"><a class="header-anchor" href="#what-is-smolproxy%3F" aria-hidden="true">#</a> What is Smolproxy?</h3>
<p>Smolproxy is a service that provides API access to multiple LLMs. LLMs have many uses; a few examples:</p>
<ul>
<li>Play around and experiment with the new technology</li>
<li>Use LLMs to help you with studying or programming</li>
<li>Roleplay with AI models</li>
<li>Use the provided APIs to create your own custom Discord/Telegram/etc bots for personal use</li>
</ul>
<p>You can use Smolproxy with any application, library, or frontend that lets you specify a custom endpoint URL. Just to name a few:</p>
<ul>
<li>OpenAI libraries for Python, JS, and other languages</li>
<li>SillyTavern</li>
<li>Lobe Chat</li>
<li>LibreChat</li>
<li>big-AGI</li>
</ul>
<p>Current stable providers: OpenAI, Gemini, Mistral, Deepseek.</p>
<p><strong>There are no refunds. Claude access on the proxy is not guaranteed. If you want to buy this service just for stable Claude, do not do it.</strong></p>
<p>For example, if your software supports a custom OpenAI base URL and you want to use Smol, simply set the base URL to <code>https://smolproxy.org/proxy/openai/v1</code> and set the API key to your token. See <a href="#llm-endpoints">the full list of available API endpoints</a>.</p>
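The base-URL pattern above generalizes to the other providers. A minimal sketch; only the <code>/proxy/&lt;provider&gt;</code> paths come from this page, the helper function itself is illustrative:

```python
# Sketch: building Smolproxy endpoint URLs from the provider paths documented
# on this page. The helper is hypothetical, not part of the service itself.
BASE_URL = "https://smolproxy.org"

def proxy_endpoint(provider: str, suffix: str = "") -> str:
    """Return the full Smolproxy URL for a provider path, e.g. 'openai'."""
    return f"{BASE_URL}/proxy/{provider}{suffix}"

# OpenAI-compatible base URL, as used in the configuration example above:
openai_base = proxy_endpoint("openai", "/v1")
print(openai_base)  # https://smolproxy.org/proxy/openai/v1

# With the official OpenAI Python client this would be passed as (not executed here):
#   client = OpenAI(base_url=openai_base, api_key="YOUR_SMOL_TOKEN")
```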
<p>If you have any questions regarding service usage or payment, feel free to <a href="#contact">contact me</a>.</p>
</section><section><h3 id="newest-changes" tabindex="-1"><a class="header-anchor" href="#newest-changes" aria-hidden="true">#</a> Newest changes</h3>
<p>Jan 12, 2026 - The old domain <a href="https://smol.services">https://smol.services</a> appears to have been disabled; switched to a new domain for now: <a href="https://smolproxy.org">https://smolproxy.org</a></p>
<p>Dec 28, 2025 - Proxy was down for ~2 hours due to issues with the new VPS host. Added 1 day to all tokens as compensation.</p>
<p>Dec 22, 2025 - Added GLM 4.7.</p>
<p>Dec 17, 2025 - Added Gemini 3 Flash.</p>
<p>Dec 12, 2025 - Added GPT-5.2.</p>
<p>Nov 21, 2025 - Added <a href="https://gen.smol.services">https://gen.smol.services</a> - small frontend for Gemini 3 Pro image gen.</p>
<p>Nov 18, 2025 - Added Gemini 3 Pro (preview) to the proxy.</p>
<p>Nov 12, 2025 - Added GPT-5.1 to the proxy.</p>
<p>Oct 3, 2025 - Added GLM (ported from reanon, thanks for the implementation).</p>
<p>Sep 24, 2025 - Added Grok 4 Fast and Grok Code Fast 1.</p>
</section><section><h3 id="llm-endpoints" tabindex="-1"><a class="header-anchor" href="#llm-endpoints" aria-hidden="true">#</a> LLM endpoints</h3>
<ul>
<li>OpenAI: <code>/proxy/openai</code></li>
<li>Gemini: <code>/proxy/google-ai</code></li>
<li>Mistral: <code>/proxy/mistral-ai</code></li>
<li>Deepseek: <code>/proxy/deepseek</code>; <code>deepseek-reasoner</code> is Deepseek V3.2 (thinking) and <code>deepseek-chat</code> is Deepseek V3.2. Prefills work automatically (Claude-style: a trailing assistant message is treated as a prefill).</li>
<li>GLM: <code>/proxy/glm</code> - thinking is disabled by default. To enable it in SillyTavern, add <code>reasoning_effort: "high"</code> in "Additional Parameters" -> "I
Open service 172.67.143.246:443 · info.smol.services
2026-01-26 12:03
HTTP/1.1 200 OK
Date: Mon, 26 Jan 2026 12:03:49 GMT
Content-Type: text/html
Transfer-Encoding: chunked
Connection: close
CF-Cache-Status: HIT
Cache-Control: public, max-age=0, must-revalidate
Nel: {"report_to":"cf-nel","success_fraction":0.0,"max_age":604800}
Vary: accept-encoding
Server-Timing: cfCacheStatus;desc="HIT"
Server-Timing: cfEdge;dur=27,cfOrigin;dur=0
Report-To: {"group":"cf-nel","max_age":604800,"endpoints":[{"url":"https://a.nel.cloudflare.com/report/v4?s=pVeAE8nw8ht6MS8mpb14%2ByiJP2E4dPAtxO64pgi0zgLjn397Wtgnkwy8%2FsHUASD8M69RF2u7uzX51IqikN3gBKNbYtKrJdhB145bDpgla1o%3D"}]}
Server: cloudflare
CF-RAY: 9c3fde8a1ff8ab50-SIN
alt-svc: h3=":443"; ma=86400
Page title: Smolproxy
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Smolproxy</title>
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/water.css@2/out/dark.css">
<link rel="stylesheet" href="https://fonts.googleapis.com/css2?family=Inter:wght@400;500;600&display=swap">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/styles/atom-one-dark.min.css">
<style>
body {
max-width: 900px;
margin: 0 auto;
padding: 20px;
}
</style>
</head>
<body>
<h1 id="smolproxy" tabindex="-1"><a class="header-anchor" href="#smolproxy" aria-hidden="true">#</a> Smolproxy</h1>
<p class="center"><strong>Service info: <a href="https://smolproxy.org">https://smolproxy.org</a></br></strong></p>
<section><h3 id="what-is-smolproxy%3F" tabindex="-1"><a class="header-anchor" href="#what-is-smolproxy%3F" aria-hidden="true">#</a> What is Smolproxy?</h3>
<p>Smolproxy is a service that provides API access to multiple LLMs. There are a lot of uses for LLMs, a few examples:</p>
<ul>
<li>Play around and experiment with the new technology</li>
<li>Use LLMs to help you with studying or programming</li>
<li>Roleplay with AI models</li>
<li>Use the provided APIs to create your own custom Discord/Telegram/etc bots for personal use</li>
</ul>
<p>You can use Smolproxy with any application/library/frontend where it is possible to specify a custom endpoint URL. Just to name a few:</p>
<ul>
<li>OpenAI libraries for Python, JS, and other languages</li>
<li>SillyTavern</li>
<li>Lobe Chat</li>
<li>LibreChat</li>
<li>big-AGI</li>
</ul>
<p>Current stable providers: OpenAI, Gemini, Mistral, Deepseek.</p>
<p><strong>There are no refunds. Claude access on the proxy is not guaranteed. If you want to buy this service just for stable Claude, do not do it.</strong></p>
<p>For example, if the software that you use supports a custom OpenAI base URL and you want to use Smol, you can simply set the base URL to <code>https://smolproxy.org/proxy/openai/v1</code> and set the API key to your token. See <a href="#llm-endpoints">the full list of available API endpoints</a>.</p>
<p>If you have any questions regarding service usage or payment, feel free to <a href="#contact">contact me</a>.</p>
</section><section><h3 id="newest-changes" tabindex="-1"><a class="header-anchor" href="#newest-changes" aria-hidden="true">#</a> Newest changes</h3>
<p>Jan 12, 2026 - Old domain <a href="https://smol.services">https://smol.services</a> seems to have been disabled, switched to the new domain for now: <a href="https://smolproxy.org">https://smolproxy.org</a></p>
<p>Dec 28, 2025 - Proxy was down for ~2 hours due to issues with the new VPS host. Added 1 day to all tokens as compensation.</p>
<p>Dec 22, 2025 - Added GLM 4.7.</p>
<p>Dec 17, 2025 - Added Gemini 3 Flash.</p>
<p>Dec 12, 2025 - Added GPT-5.2.</p>
<p>Nov 21, 2025 - Added <a href="https://gen.smol.services">https://gen.smol.services</a> - small frontend for Gemini 3 Pro image gen.</p>
<p>Nov 18, 2025 - Added Gemini 3 Pro (preview) to the proxy.</p>
<p>Nov 12, 2025 - Added GPT-5.1 to the proxy.</p>
<p>Oct 3, 2025 - Added GLM (ported from reanon, thanks for the implementation).</p>
<p>Sep 24, 2025 - Added Grok 4 Fast and Grok Code Fast 1.</p>
</section><section><h3 id="llm-endpoints" tabindex="-1"><a class="header-anchor" href="#llm-endpoints" aria-hidden="true">#</a> LLM endpoints</h3>
<ul>
<li>OpenAI: <code>/proxy/openai</code></li>
<li>Gemini: <code>/proxy/google-ai</code></li>
<li>Mistral: <code>/proxy/mistral-ai</code></li>
<li>Deepseek: <code>/proxy/deepseek</code>, <code>deepseek-reasoner</code> is Deepseek V3.2 (thinking) and <code>deepseek-chat</code> is Deepseek V3.2. Prefills work automatically (last assistant message - Claude-style).</li>
<li>GLM: <code>/proxy/glm</code> - thinking is disabled by default. To enable it in SillyTavern, add <code>reasoning_effort: "high"</code> in "Additional Parameters" -> "I
Open service 2606:4700:3030::ac43:8ff6:443 · info.smol.services
2026-01-26 12:03
HTTP/1.1 200 OK
Date: Mon, 26 Jan 2026 12:03:49 GMT
Content-Type: text/html
Content-Length: 12610
Connection: close
CF-Cache-Status: HIT
Cache-Control: public, max-age=0, must-revalidate
ETag: "2bd3982c1f4c7336acf8375c83edbebc"
Vary: accept-encoding
Report-To: {"group":"cf-nel","max_age":604800,"endpoints":[{"url":"https://a.nel.cloudflare.com/report/v4?s=BWlP4k3hC2RBHcUhnjSsZLdvwqlDKDfJqNKte0mkZXMd3wVfXDpMcjPQaQGJ3%2BQ05cekzMGzr4AwIj0%2F5U0La4oekfpHHDDSog49ABvYa8fVKR3OwdwRxz71lcBiZw%3D%3D"}]}
Nel: {"report_to":"cf-nel","success_fraction":0.0,"max_age":604800}
Server: cloudflare
CF-RAY: 9c3fde8b5b6fdc62-FRA
alt-svc: h3=":443"; ma=86400
Page title: Smolproxy
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Smolproxy</title>
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/water.css@2/out/dark.css">
<link rel="stylesheet" href="https://fonts.googleapis.com/css2?family=Inter:wght@400;500;600&display=swap">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/styles/atom-one-dark.min.css">
<style>
body {
max-width: 900px;
margin: 0 auto;
padding: 20px;
}
</style>
</head>
<body>
<h1 id="smolproxy" tabindex="-1"><a class="header-anchor" href="#smolproxy" aria-hidden="true">#</a> Smolproxy</h1>
<p class="center"><strong>Service info: <a href="https://smolproxy.org">https://smolproxy.org</a></br></strong></p>
<section><h3 id="what-is-smolproxy%3F" tabindex="-1"><a class="header-anchor" href="#what-is-smolproxy%3F" aria-hidden="true">#</a> What is Smolproxy?</h3>
<p>Smolproxy is a service that provides API access to multiple LLMs. There are a lot of uses for LLMs, a few examples:</p>
<ul>
<li>Play around and experiment with the new technology</li>
<li>Use LLMs to help you with studying or programming</li>
<li>Roleplay with AI models</li>
<li>Use the provided APIs to create your own custom Discord/Telegram/etc bots for personal use</li>
</ul>
<p>You can use Smolproxy with any application/library/frontend where it is possible to specify a custom endpoint URL. Just to name a few:</p>
<ul>
<li>OpenAI libraries for Python, JS, and other languages</li>
<li>SillyTavern</li>
<li>Lobe Chat</li>
<li>LibreChat</li>
<li>big-AGI</li>
</ul>
<p>Current stable providers: OpenAI, Gemini, Mistral, Deepseek.</p>
<p><strong>There are no refunds. Claude access on the proxy is not guaranteed. If you want to buy this service just for stable Claude, do not do it.</strong></p>
<p>For example, if the software that you use supports a custom OpenAI base URL and you want to use Smol, you can simply set the base URL to <code>https://smolproxy.org/proxy/openai/v1</code> and set the API key to your token. See <a href="#llm-endpoints">the full list of available API endpoints</a>.</p>
<p>If you have any questions regarding service usage or payment, feel free to <a href="#contact">contact me</a>.</p>
</section><section><h3 id="newest-changes" tabindex="-1"><a class="header-anchor" href="#newest-changes" aria-hidden="true">#</a> Newest changes</h3>
<p>Jan 12, 2026 - Old domain <a href="https://smol.services">https://smol.services</a> seems to have been disabled, switched to the new domain for now: <a href="https://smolproxy.org">https://smolproxy.org</a></p>
<p>Dec 28, 2025 - Proxy was down for ~2 hours due to issues with the new VPS host. Added 1 day to all tokens as compensation.</p>
<p>Dec 22, 2025 - Added GLM 4.7.</p>
<p>Dec 17, 2025 - Added Gemini 3 Flash.</p>
<p>Dec 12, 2025 - Added GPT-5.2.</p>
<p>Nov 21, 2025 - Added <a href="https://gen.smol.services">https://gen.smol.services</a> - small frontend for Gemini 3 Pro image gen.</p>
<p>Nov 18, 2025 - Added Gemini 3 Pro (preview) to the proxy.</p>
<p>Nov 12, 2025 - Added GPT-5.1 to the proxy.</p>
<p>Oct 3, 2025 - Added GLM (ported from reanon, thanks for the implementation).</p>
<p>Sep 24, 2025 - Added Grok 4 Fast and Grok Code Fast 1.</p>
</section><section><h3 id="llm-endpoints" tabindex="-1"><a class="header-anchor" href="#llm-endpoints" aria-hidden="true">#</a> LLM endpoints</h3>
<ul>
<li>OpenAI: <code>/proxy/openai</code></li>
<li>Gemini: <code>/proxy/google-ai</code></li>
<li>Mistral: <code>/proxy/mistral-ai</code></li>
<li>Deepseek: <code>/proxy/deepseek</code>, <code>deepseek-reasoner</code> is Deepseek V3.2 (thinking) and <code>deepseek-chat</code> is Deepseek V3.2. Prefills work automatically (last assistant message - Claude-style).</li>
<li>GLM: <code>/proxy/glm</code> - thinking is disabled by default. To enable it in SillyTavern, add <code>reasoning_effort: "high"</code> in "Additional Parameters" -> "I
Open service 172.67.143.246:8443 · info.smol.services
2026-01-26 12:03
HTTP/1.1 200 OK
Date: Mon, 26 Jan 2026 12:03:49 GMT
Content-Type: text/html
Content-Length: 12610
Connection: close
CF-Cache-Status: HIT
Cache-Control: public, max-age=0, must-revalidate
ETag: "2bd3982c1f4c7336acf8375c83edbebc"
Vary: accept-encoding
Report-To: {"group":"cf-nel","max_age":604800,"endpoints":[{"url":"https://a.nel.cloudflare.com/report/v4?s=Noy1nwwd8vyOJtcs%2BuaAyATLoS%2F3TBEfWUVHxevfCQJsimj1VdDPSVdtf1v%2BpLFB3VDT0N5%2FRHrU%2BkxDZQJ6OBYlyFediybo3YseEmFIdFjbHA%3D%3D"}]}
Nel: {"report_to":"cf-nel","success_fraction":0.0,"max_age":604800}
Server: cloudflare
CF-RAY: 9c3fde8a0c899b5b-FRA
alt-svc: h3=":8443"; ma=86400
Page title: Smolproxy
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Smolproxy</title>
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/water.css@2/out/dark.css">
<link rel="stylesheet" href="https://fonts.googleapis.com/css2?family=Inter:wght@400;500;600&display=swap">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/styles/atom-one-dark.min.css">
<style>
body {
max-width: 900px;
margin: 0 auto;
padding: 20px;
}
</style>
</head>
<body>
<h1 id="smolproxy" tabindex="-1"><a class="header-anchor" href="#smolproxy" aria-hidden="true">#</a> Smolproxy</h1>
<p class="center"><strong>Service info: <a href="https://smolproxy.org">https://smolproxy.org</a></br></strong></p>
<section><h3 id="what-is-smolproxy%3F" tabindex="-1"><a class="header-anchor" href="#what-is-smolproxy%3F" aria-hidden="true">#</a> What is Smolproxy?</h3>
<p>Smolproxy is a service that provides API access to multiple LLMs. There are a lot of uses for LLMs, a few examples:</p>
<ul>
<li>Play around and experiment with the new technology</li>
<li>Use LLMs to help you with studying or programming</li>
<li>Roleplay with AI models</li>
<li>Use the provided APIs to create your own custom Discord/Telegram/etc bots for personal use</li>
</ul>
<p>You can use Smolproxy with any application/library/frontend where it is possible to specify a custom endpoint URL. Just to name a few:</p>
<ul>
<li>OpenAI libraries for Python, JS, and other languages</li>
<li>SillyTavern</li>
<li>Lobe Chat</li>
<li>LibreChat</li>
<li>big-AGI</li>
</ul>
<p>Current stable providers: OpenAI, Gemini, Mistral, Deepseek.</p>
<p><strong>There are no refunds. Claude access on the proxy is not guaranteed. If you want to buy this service just for stable Claude, do not do it.</strong></p>
<p>For example, if the software that you use supports a custom OpenAI base URL and you want to use Smol, you can simply set the base URL to <code>https://smolproxy.org/proxy/openai/v1</code> and set the API key to your token. See <a href="#llm-endpoints">the full list of available API endpoints</a>.</p>
<p>If you have any questions regarding service usage or payment, feel free to <a href="#contact">contact me</a>.</p>
</section><section><h3 id="newest-changes" tabindex="-1"><a class="header-anchor" href="#newest-changes" aria-hidden="true">#</a> Newest changes</h3>
<p>Jan 12, 2026 - Old domain <a href="https://smol.services">https://smol.services</a> seems to have been disabled, switched to the new domain for now: <a href="https://smolproxy.org">https://smolproxy.org</a></p>
<p>Dec 28, 2025 - Proxy was down for ~2 hours due to issues with the new VPS host. Added 1 day to all tokens as compensation.</p>
<p>Dec 22, 2025 - Added GLM 4.7.</p>
<p>Dec 17, 2025 - Added Gemini 3 Flash.</p>
<p>Dec 12, 2025 - Added GPT-5.2.</p>
<p>Nov 21, 2025 - Added <a href="https://gen.smol.services">https://gen.smol.services</a> - small frontend for Gemini 3 Pro image gen.</p>
<p>Nov 18, 2025 - Added Gemini 3 Pro (preview) to the proxy.</p>
<p>Nov 12, 2025 - Added GPT-5.1 to the proxy.</p>
<p>Oct 3, 2025 - Added GLM (ported from reanon, thanks for the implementation).</p>
<p>Sep 24, 2025 - Added Grok 4 Fast and Grok Code Fast 1.</p>
</section><section><h3 id="llm-endpoints" tabindex="-1"><a class="header-anchor" href="#llm-endpoints" aria-hidden="true">#</a> LLM endpoints</h3>
<ul>
<li>OpenAI: <code>/proxy/openai</code></li>
<li>Gemini: <code>/proxy/google-ai</code></li>
<li>Mistral: <code>/proxy/mistral-ai</code></li>
<li>Deepseek: <code>/proxy/deepseek</code>, <code>deepseek-reasoner</code> is Deepseek V3.2 (thinking) and <code>deepseek-chat</code> is Deepseek V3.2. Prefills work automatically (last assistant message - Claude-style).</li>
<li>GLM: <code>/proxy/glm</code> - thinking is disabled by default. To enable it in SillyTavern, add <code>reasoning_effort: "high"</code> in "Additional Parameters" -> "I
Open service 104.21.71.79:80 · info.smol.services
2026-01-26 12:03
HTTP/1.1 301 Moved Permanently
Date: Mon, 26 Jan 2026 12:03:49 GMT
Content-Length: 0
Connection: close
Location: https://info.smol.services/
Report-To: {"group":"cf-nel","max_age":604800,"endpoints":[{"url":"https://a.nel.cloudflare.com/report/v4?s=6IELr07FNcKG%2FAdKA9vgWpt7i08Hccez9pYAbg%2Ft4bMarOSrKOQDUZwrdXEMrLtpjCF5iBc7f2cXpy7yT4wMjS%2BQZpak8Frw129VOWiNmWkB6A%3D%3D"}]}
Nel: {"report_to":"cf-nel","success_fraction":0.0,"max_age":604800}
Server: cloudflare
CF-RAY: 9c3fde89bf3b93dc-LHR
alt-svc: h3=":443"; ma=86400
Open service 2606:4700:3030::ac43:8ff6:8443 · info.smol.services
2026-01-26 12:03
HTTP/1.1 200 OK
Date: Mon, 26 Jan 2026 12:03:49 GMT
Content-Type: text/html
Content-Length: 12610
Connection: close
CF-Cache-Status: HIT
Cache-Control: public, max-age=0, must-revalidate
ETag: "2bd3982c1f4c7336acf8375c83edbebc"
Vary: accept-encoding
Report-To: {"group":"cf-nel","max_age":604800,"endpoints":[{"url":"https://a.nel.cloudflare.com/report/v4?s=7y9NhhiVipg6vJPlN2sZ%2B3Z6zuoJ3i%2BLXj4awFYg%2BkhrqWmhUQhxpnpwlP3JPkWyGjy3p9%2BkpIZVt3nHDHLWkoX1WxRzgufRRu3LFdWWboezFB2lRONQoZN9OcwcQw%3D%3D"}]}
Nel: {"report_to":"cf-nel","success_fraction":0.0,"max_age":604800}
Server: cloudflare
CF-RAY: 9c3fde89c873974b-FRA
alt-svc: h3=":8443"; ma=86400
Page title: Smolproxy
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Smolproxy</title>
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/water.css@2/out/dark.css">
<link rel="stylesheet" href="https://fonts.googleapis.com/css2?family=Inter:wght@400;500;600&display=swap">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/styles/atom-one-dark.min.css">
<style>
body {
max-width: 900px;
margin: 0 auto;
padding: 20px;
}
</style>
</head>
<body>
<h1 id="smolproxy" tabindex="-1"><a class="header-anchor" href="#smolproxy" aria-hidden="true">#</a> Smolproxy</h1>
<p class="center"><strong>Service info: <a href="https://smolproxy.org">https://smolproxy.org</a></br></strong></p>
<section><h3 id="what-is-smolproxy%3F" tabindex="-1"><a class="header-anchor" href="#what-is-smolproxy%3F" aria-hidden="true">#</a> What is Smolproxy?</h3>
<p>Smolproxy is a service that provides API access to multiple LLMs. There are a lot of uses for LLMs, a few examples:</p>
<ul>
<li>Play around and experiment with the new technology</li>
<li>Use LLMs to help you with studying or programming</li>
<li>Roleplay with AI models</li>
<li>Use the provided APIs to create your own custom Discord/Telegram/etc bots for personal use</li>
</ul>
<p>You can use Smolproxy with any application/library/frontend where it is possible to specify a custom endpoint URL. Just to name a few:</p>
<ul>
<li>OpenAI libraries for Python, JS, and other languages</li>
<li>SillyTavern</li>
<li>Lobe Chat</li>
<li>LibreChat</li>
<li>big-AGI</li>
</ul>
<p>Current stable providers: OpenAI, Gemini, Mistral, Deepseek.</p>
<p><strong>There are no refunds. Claude access on the proxy is not guaranteed. If you want to buy this service just for stable Claude, do not do it.</strong></p>
<p>For example, if the software that you use supports a custom OpenAI base URL and you want to use Smol, you can simply set the base URL to <code>https://smolproxy.org/proxy/openai/v1</code> and set the API key to your token. See <a href="#llm-endpoints">the full list of available API endpoints</a>.</p>
<p>If you have any questions regarding service usage or payment, feel free to <a href="#contact">contact me</a>.</p>
</section><section><h3 id="newest-changes" tabindex="-1"><a class="header-anchor" href="#newest-changes" aria-hidden="true">#</a> Newest changes</h3>
<p>Jan 12, 2026 - Old domain <a href="https://smol.services">https://smol.services</a> seems to have been disabled, switched to the new domain for now: <a href="https://smolproxy.org">https://smolproxy.org</a></p>
<p>Dec 28, 2025 - Proxy was down for ~2 hours due to issues with the new VPS host. Added 1 day to all tokens as compensation.</p>
<p>Dec 22, 2025 - Added GLM 4.7.</p>
<p>Dec 17, 2025 - Added Gemini 3 Flash.</p>
<p>Dec 12, 2025 - Added GPT-5.2.</p>
<p>Nov 21, 2025 - Added <a href="https://gen.smol.services">https://gen.smol.services</a> - small frontend for Gemini 3 Pro image gen.</p>
<p>Nov 18, 2025 - Added Gemini 3 Pro (preview) to the proxy.</p>
<p>Nov 12, 2025 - Added GPT-5.1 to the proxy.</p>
<p>Oct 3, 2025 - Added GLM (ported from reanon, thanks for the implementation).</p>
<p>Sep 24, 2025 - Added Grok 4 Fast and Grok Code Fast 1.</p>
</section><section><h3 id="llm-endpoints" tabindex="-1"><a class="header-anchor" href="#llm-endpoints" aria-hidden="true">#</a> LLM endpoints</h3>
<ul>
<li>OpenAI: <code>/proxy/openai</code></li>
<li>Gemini: <code>/proxy/google-ai</code></li>
<li>Mistral: <code>/proxy/mistral-ai</code></li>
<li>Deepseek: <code>/proxy/deepseek</code>, <code>deepseek-reasoner</code> is Deepseek V3.2 (thinking) and <code>deepseek-chat</code> is Deepseek V3.2. Prefills work automatically (last assistant message - Claude-style).</li>
<li>GLM: <code>/proxy/glm</code> - thinking is disabled by default. To enable it in SillyTavern, add <code>reasoning_effort: "high"</code> in "Additional Parameters" -> "I
Open service 2606:4700:3037::6815:474f:8443 · info.smol.services
2026-01-26 12:03
HTTP/1.1 200 OK
Date: Mon, 26 Jan 2026 12:03:49 GMT
Content-Type: text/html
Content-Length: 12610
Connection: close
CF-Cache-Status: HIT
Cache-Control: public, max-age=0, must-revalidate
ETag: "2bd3982c1f4c7336acf8375c83edbebc"
Vary: accept-encoding
Report-To: {"group":"cf-nel","max_age":604800,"endpoints":[{"url":"https://a.nel.cloudflare.com/report/v4?s=lwukPbIqcleB%2F1meogBBzylKRe7t52jbJA0NwVrv7mCtrq3kh161klgOIhaOhC2aoSz9ywXWFKbpLBv7bT4425ZVa8fBH4pLl0W0pgqxz8ywvdvFJ77%2FJrnGYT%2BwGA%3D%3D"}]}
Nel: {"report_to":"cf-nel","success_fraction":0.0,"max_age":604800}
Server: cloudflare
CF-RAY: 9c3fde89dc3a2bc1-FRA
alt-svc: h3=":8443"; ma=86400
Page title: Smolproxy
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Smolproxy</title>
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/water.css@2/out/dark.css">
<link rel="stylesheet" href="https://fonts.googleapis.com/css2?family=Inter:wght@400;500;600&display=swap">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/styles/atom-one-dark.min.css">
<style>
body {
max-width: 900px;
margin: 0 auto;
padding: 20px;
}
</style>
</head>
<body>
<h1 id="smolproxy" tabindex="-1"><a class="header-anchor" href="#smolproxy" aria-hidden="true">#</a> Smolproxy</h1>
<p class="center"><strong>Service info: <a href="https://smolproxy.org">https://smolproxy.org</a></br></strong></p>
<section><h3 id="what-is-smolproxy%3F" tabindex="-1"><a class="header-anchor" href="#what-is-smolproxy%3F" aria-hidden="true">#</a> What is Smolproxy?</h3>
<p>Smolproxy is a service that provides API access to multiple LLMs. There are a lot of uses for LLMs, a few examples:</p>
<ul>
<li>Play around and experiment with the new technology</li>
<li>Use LLMs to help you with studying or programming</li>
<li>Roleplay with AI models</li>
<li>Use the provided APIs to create your own custom Discord/Telegram/etc bots for personal use</li>
</ul>
<p>You can use Smolproxy with any application/library/frontend where it is possible to specify a custom endpoint URL. Just to name a few:</p>
<ul>
<li>OpenAI libraries for Python, JS, and other languages</li>
<li>SillyTavern</li>
<li>Lobe Chat</li>
<li>LibreChat</li>
<li>big-AGI</li>
</ul>
<p>Current stable providers: OpenAI, Gemini, Mistral, Deepseek.</p>
<p><strong>There are no refunds. Claude access on the proxy is not guaranteed. If you want to buy this service just for stable Claude, do not do it.</strong></p>
<p>For example, if the software that you use supports a custom OpenAI base URL and you want to use Smol, you can simply set the base URL to <code>https://smolproxy.org/proxy/openai/v1</code> and set the API key to your token. See <a href="#llm-endpoints">the full list of available API endpoints</a>.</p>
<p>If you have any questions regarding service usage or payment, feel free to <a href="#contact">contact me</a>.</p>
</section><section><h3 id="newest-changes" tabindex="-1"><a class="header-anchor" href="#newest-changes" aria-hidden="true">#</a> Newest changes</h3>
<p>Jan 12, 2026 - Old domain <a href="https://smol.services">https://smol.services</a> seems to have been disabled, switched to the new domain for now: <a href="https://smolproxy.org">https://smolproxy.org</a></p>
<p>Dec 28, 2025 - Proxy was down for ~2 hours due to issues with the new VPS host. Added 1 day to all tokens as compensation.</p>
<p>Dec 22, 2025 - Added GLM 4.7.</p>
<p>Dec 17, 2025 - Added Gemini 3 Flash.</p>
<p>Dec 12, 2025 - Added GPT-5.2.</p>
<p>Nov 21, 2025 - Added <a href="https://gen.smol.services">https://gen.smol.services</a> - small frontend for Gemini 3 Pro image gen.</p>
<p>Nov 18, 2025 - Added Gemini 3 Pro (preview) to the proxy.</p>
<p>Nov 12, 2025 - Added GPT-5.1 to the proxy.</p>
<p>Oct 3, 2025 - Added GLM (ported from reanon, thanks for the implementation).</p>
<p>Sep 24, 2025 - Added Grok 4 Fast and Grok Code Fast 1.</p>
</section><section><h3 id="llm-endpoints" tabindex="-1"><a class="header-anchor" href="#llm-endpoints" aria-hidden="true">#</a> LLM endpoints</h3>
<ul>
<li>OpenAI: <code>/proxy/openai</code></li>
<li>Gemini: <code>/proxy/google-ai</code></li>
<li>Mistral: <code>/proxy/mistral-ai</code></li>
<li>Deepseek: <code>/proxy/deepseek</code>, <code>deepseek-reasoner</code> is Deepseek V3.2 (thinking) and <code>deepseek-chat</code> is Deepseek V3.2. Prefills work automatically (last assistant message - Claude-style).</li>
<li>GLM: <code>/proxy/glm</code> - thinking is disabled by default. To enable it in SillyTavern, add <code>reasoning_effort: "high"</code> in "Additional Parameters" -> "I
Open service 2606:4700:3030::ac43:8ff6:80 · info.smol.services
2026-01-26 12:03
HTTP/1.1 301 Moved Permanently
Date: Mon, 26 Jan 2026 12:03:49 GMT
Content-Length: 0
Connection: close
Location: https://info.smol.services/
Report-To: {"group":"cf-nel","max_age":604800,"endpoints":[{"url":"https://a.nel.cloudflare.com/report/v4?s=6b1geyMyAq3O1ORiLtapSi0cXc30rA4U0j3487xSqXOlxxXLP6hiUPL0AQBsSIEil0VjqrfBCRks7gNf0VapcYLwfLPjyO%2BclGXzFMO2NjwRvF%2BTfy%2BTTm8Y7%2BKpfA%3D%3D"}]}
Nel: {"report_to":"cf-nel","success_fraction":0.0,"max_age":604800}
Server-Timing: cfEdge;dur=11,cfOrigin;dur=0
Server: cloudflare
CF-RAY: 9c3fde897f357291-EWR
alt-svc: h3=":443"; ma=86400
Open service 2606:4700:3037::6815:474f:443 · info.smol.services
2026-01-26 12:03
HTTP/1.1 200 OK
Date: Mon, 26 Jan 2026 12:03:49 GMT
Content-Type: text/html
Transfer-Encoding: chunked
Connection: close
CF-Cache-Status: HIT
Cache-Control: public, max-age=0, must-revalidate
Nel: {"report_to":"cf-nel","success_fraction":0.0,"max_age":604800}
Vary: accept-encoding
Server-Timing: cfCacheStatus;desc="HIT"
Server-Timing: cfEdge;dur=14,cfOrigin;dur=0
Report-To: {"group":"cf-nel","max_age":604800,"endpoints":[{"url":"https://a.nel.cloudflare.com/report/v4?s=WiZ2i7WofjZxOwbtuIwtfcDy8nJPCNSNYaEHr%2BX6VpF913LDO9XElpOaMC85zDu1WCVfPpsiGwznn2jAg6VjfrZHA%2FIQDzM%2BnhfcH5NJ8GVHiykzTW%2BV9zr3rUqCtA%3D%3D"}]}
Server: cloudflare
CF-RAY: 9c3fde893d90b544-EWR
alt-svc: h3=":443"; ma=86400
Page title: Smolproxy
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Smolproxy</title>
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/water.css@2/out/dark.css">
<link rel="stylesheet" href="https://fonts.googleapis.com/css2?family=Inter:wght@400;500;600&display=swap">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/styles/atom-one-dark.min.css">
<style>
body {
max-width: 900px;
margin: 0 auto;
padding: 20px;
}
</style>
</head>
<body>
<h1 id="smolproxy" tabindex="-1"><a class="header-anchor" href="#smolproxy" aria-hidden="true">#</a> Smolproxy</h1>
<p class="center"><strong>Service info: <a href="https://smolproxy.org">https://smolproxy.org</a></br></strong></p>
<section><h3 id="what-is-smolproxy%3F" tabindex="-1"><a class="header-anchor" href="#what-is-smolproxy%3F" aria-hidden="true">#</a> What is Smolproxy?</h3>
<p>Smolproxy is a service that provides API access to multiple LLMs. There are a lot of uses for LLMs, a few examples:</p>
<ul>
<li>Play around and experiment with the new technology</li>
<li>Use LLMs to help you with studying or programming</li>
<li>Roleplay with AI models</li>
<li>Use the provided APIs to create your own custom Discord/Telegram/etc bots for personal use</li>
</ul>
<p>You can use Smolproxy with any application/library/frontend where it is possible to specify a custom endpoint URL. Just to name a few:</p>
<ul>
<li>OpenAI libraries for Python, JS, and other languages</li>
<li>SillyTavern</li>
<li>Lobe Chat</li>
<li>LibreChat</li>
<li>big-AGI</li>
</ul>
<p>Current stable providers: OpenAI, Gemini, Mistral, Deepseek.</p>
<p><strong>There are no refunds. Claude access on the proxy is not guaranteed. If you want to buy this service just for stable Claude, do not do it.</strong></p>
<p>For example, if the software you use supports a custom OpenAI base URL and you want to use Smol, simply set the base URL to <code>https://smolproxy.org/proxy/openai/v1</code> and set the API key to your token. See <a href="#llm-endpoints">the full list of available API endpoints</a>.</p>
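<p>A minimal sketch of that setup using only the Python standard library. The model id <code>gpt-5.2</code> and <code>YOUR_TOKEN</code> are placeholders; substitute a model the proxy currently serves and your own token. The helper name <code>build_chat_request</code> is ours, not part of any SDK:</p>

```python
import json
from urllib import request

BASE_URL = "https://smolproxy.org/proxy/openai/v1"  # Smolproxy OpenAI endpoint
TOKEN = "YOUR_TOKEN"  # placeholder: your Smolproxy token, not an OpenAI key

def build_chat_request(model, user_message):
    """Build an OpenAI-style chat completion request aimed at the proxy."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }).encode()
    return request.Request(
        BASE_URL + "/chat/completions",
        data=body,
        headers={
            "Authorization": "Bearer " + TOKEN,  # token goes in the usual Bearer header
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request("gpt-5.2", "Hello!")  # "gpt-5.2" is illustrative
print(req.full_url)  # https://smolproxy.org/proxy/openai/v1/chat/completions
# To actually send it: request.urlopen(req)
```

<p>Any OpenAI-compatible client or library works the same way; only the base URL and the key change.</p>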
<p>If you have any questions regarding service usage or payment, feel free to <a href="#contact">contact me</a>.</p>
</section><section><h3 id="newest-changes" tabindex="-1"><a class="header-anchor" href="#newest-changes" aria-hidden="true">#</a> Newest changes</h3>
<p>Jan 12, 2026 - The old domain <a href="https://smol.services">https://smol.services</a> appears to have been disabled; switched to the new domain for now: <a href="https://smolproxy.org">https://smolproxy.org</a></p>
<p>Dec 28, 2025 - Proxy was down for ~2 hours due to issues with the new VPS host. Added 1 day to all tokens as compensation.</p>
<p>Dec 22, 2025 - Added GLM 4.7.</p>
<p>Dec 17, 2025 - Added Gemini 3 Flash.</p>
<p>Dec 12, 2025 - Added GPT-5.2.</p>
<p>Nov 21, 2025 - Added <a href="https://gen.smol.services">https://gen.smol.services</a> - small frontend for Gemini 3 Pro image gen.</p>
<p>Nov 18, 2025 - Added Gemini 3 Pro (preview) to the proxy.</p>
<p>Nov 12, 2025 - Added GPT-5.1 to the proxy.</p>
<p>Oct 3, 2025 - Added GLM (ported from reanon, thanks for the implementation).</p>
<p>Sep 24, 2025 - Added Grok 4 Fast and Grok Code Fast 1.</p>
</section><section><h3 id="llm-endpoints" tabindex="-1"><a class="header-anchor" href="#llm-endpoints" aria-hidden="true">#</a> LLM endpoints</h3>
<ul>
<li>OpenAI: <code>/proxy/openai</code></li>
<li>Gemini: <code>/proxy/google-ai</code></li>
<li>Mistral: <code>/proxy/mistral-ai</code></li>
<li>Deepseek: <code>/proxy/deepseek</code>; <code>deepseek-reasoner</code> is Deepseek V3.2 with thinking and <code>deepseek-chat</code> is Deepseek V3.2 without it. Prefills work automatically (a trailing assistant message is treated as a Claude-style prefill).</li>
<li>GLM: <code>/proxy/glm</code> - thinking is disabled by default. To enable it in SillyTavern, add <code>reasoning_effort: "high"</code> in "Additional Parameters" -> "I