SPEC Prompt Compactor (Beta)

Reduce prompt filler while protecting code, URLs, emails, paths, and quoted strings. Output stays plain English (works with any LLM). Lossy by design.

Runs entirely in your browser. We don’t receive or store your text.


Live SPEC Demo

Paste a prompt into the Prompt / Text box and the Compacted Output updates live, along with: character counts before and after, reduction (%), characters saved, protected segments, processing time (ms), and estimated tokens (before → after).

Try These Examples

Polite Filler

"Can you please summarize this information and generate a short response for the assistant?"

Try: Balanced

Code + URL Safe

"Explain this snippet: `SELECT * FROM users WHERE id = 1;` and check https://ottobot.org/projects/spec"

Keeps protected segments

Paths + Tags

"Run llama3.1:8b-instruct-q4_K_M on /data/ai-system/kiah.py and summarize the output."

Leaves paths/tags intact

Most Aggressive

"Would you please help me figure out what I should do next with this configuration?"

Try: Aggressive

How SPEC Works

1. Polite Prefix Trimming

Removes common prompt boilerplate like “can you please…” (lossy, but usually safe)

can you please (removed)
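
A minimal TypeScript sketch of how a trimming pass like this could look; the prefix patterns are illustrative assumptions, not SPEC's actual phrase list:

```typescript
// Illustrative polite-prefix trimming. Patterns are assumptions,
// not SPEC's real rule set.
const POLITE_PREFIXES: RegExp[] = [
  /^\s*(?:can|could|would)\s+you\s+please\s+/i,
  /^\s*(?:can|could|would)\s+you\s+/i,
  /^\s*please\s+/i,
];

function trimPolitePrefix(prompt: string): string {
  for (const pattern of POLITE_PREFIXES) {
    if (pattern.test(prompt)) {
      return prompt.replace(pattern, "");
    }
  }
  return prompt;
}

// trimPolitePrefix("Can you please summarize this report?")
//   -> "summarize this report?"
```
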
2. Stopword Removal

Drops low-information words to reduce token waste (profile-dependent)

the, and, with (removed)
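
A rough sketch of the same idea, assuming a small hard-coded stopword list (SPEC's real lists vary by profile):

```typescript
// Illustrative stopword removal. The word list is an assumption;
// the actual lists are profile-dependent.
const STOPWORDS = new Set(["the", "a", "an", "and", "with", "of", "that"]);

function removeStopwords(text: string): string {
  return text
    .split(/\s+/)
    .filter((word) => !STOPWORDS.has(word.toLowerCase()))
    .join(" ");
}

// removeStopwords("summarize the report and list the findings")
//   -> "summarize report list findings"
```
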
3. Protected Segments

Leaves structured content untouched (code blocks, inline code, URLs, emails, paths, quoted strings)

`code`, https://…, /data/…, "quoted" (unchanged)
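
One plausible way to implement this is to mask protected spans with placeholders before compaction and restore them afterwards; the patterns below are simplified assumptions, not SPEC's exact matchers:

```typescript
// Illustrative protect/restore pass. Patterns are simplified assumptions.
const PROTECTED_PATTERNS: RegExp[] = [
  /`{3}[\s\S]*?`{3}/g, // fenced code blocks (three backticks)
  /`[^`]+`/g,          // inline code
  /https?:\/\/\S+/g,   // URLs
  /\S+@\S+\.\S+/g,     // emails
  /(?:\/[\w.-]+)+/g,   // unix-style paths
  /"[^"]*"/g,          // double-quoted strings
];

function protectSegments(text: string): { masked: string; segments: string[] } {
  const segments: string[] = [];
  let masked = text;
  for (const pattern of PROTECTED_PATTERNS) {
    masked = masked.replace(pattern, (match) => {
      segments.push(match);
      return `\u0000${segments.length - 1}\u0000`; // placeholder token
    });
  }
  return { masked, segments };
}

function restoreSegments(masked: string, segments: string[]): string {
  return masked.replace(/\u0000(\d+)\u0000/g, (_, i) => segments[Number(i)]);
}
```
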
4. Profiles

safe keeps the most words, balanced removes more, and aggressive removes the most.

safe (least lossy)
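
A sketch of how profiles could be expressed as plain configuration driving the passes above (it reuses the hypothetical helpers from steps 1–3); field names and word lists are illustrative assumptions, not SPEC's actual settings:

```typescript
// Illustrative profile configuration; values are assumptions.
type ProfileName = "safe" | "balanced" | "aggressive";

interface Profile {
  trimPrefixes: boolean;
  stopwords: Set<string>;
}

const PROFILES: Record<ProfileName, Profile> = {
  safe: {
    trimPrefixes: true,
    stopwords: new Set(["really", "just", "very"]),
  },
  balanced: {
    trimPrefixes: true,
    stopwords: new Set(["really", "just", "very", "the", "a", "an"]),
  },
  aggressive: {
    trimPrefixes: true,
    stopwords: new Set([
      "really", "just", "very", "the", "a", "an", "and", "with", "of", "that",
    ]),
  },
};

// End-to-end pass: protect, trim, drop stopwords, restore.
function compact(text: string, name: ProfileName): string {
  const profile = PROFILES[name];
  const { masked, segments } = protectSegments(text);                 // step 3 sketch
  let out = profile.trimPrefixes ? trimPolitePrefix(masked) : masked; // step 1 sketch
  out = out
    .split(/\s+/)
    .filter((word) => !profile.stopwords.has(word.toLowerCase()))
    .join(" ");
  return restoreSegments(out, segments);                              // step 3 sketch
}

// compact("Can you please summarize https://ottobot.org/projects/spec", "aggressive")
//   -> "summarize https://ottobot.org/projects/spec"
```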