“If you can’t explain it to a friend who skipped the lecture, you can’t explain it.”
You know that paragraph that sounds smart but lands like wet concrete? The one you reread five times and somehow get dumber? Cool—feed it through a filter that keeps the math, dumps the costume, and finally makes the point bite.
Keep the math; kill the fluff.
Use the paste-and-go prompt to get a 10th-grade rewrite + two analogies.
You “get it” when a friend can repeat it back in one minute.
What this is (and why you need it)
Academese is cosplay. It signals tribe membership, not comprehension. The Professor Filter strips glitter off the idea so you can see the wiring. You’ll keep the terms and the equations. You’ll lose the fog machine.
Paste-and-go prompt
Copy this into any LLM. Swap in your paragraph.
Professor Filter v1
Task: Rewrite the following paragraph at ~10th-grade reading level WITHOUT losing technical accuracy.
Rules:
- Keep official terms, symbols, and numbers correct (don’t “simplify” equations or definitions).
- Define jargon in plain English the first time it appears (one short clause).
- Give me exactly 2 analogies from different domains (one everyday, one technical).
- If any claim is uncertain, mark it UNKNOWN instead of guessing.
- Return in three parts: (1) Plain-English rewrite (≤150 words), (2) Two analogies, (3) Original terms preserved (comma list).
Text:
[PASTE PARAGRAPH HERE]
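Prefer to script it instead of pasting by hand? Here's a minimal Python sketch, assuming you have the openai package installed and OPENAI_API_KEY set in your environment. The model name and the professor_filter helper are my placeholders, not part of the prompt itself.

# Minimal sketch: run Professor Filter v1 through the OpenAI Python SDK.
# Assumptions: openai package installed, OPENAI_API_KEY set, and the model
# name below is a placeholder; swap in whatever model you actually use.
from openai import OpenAI

PROFESSOR_FILTER_V1 = """Task: Rewrite the following paragraph at ~10th-grade reading level WITHOUT losing technical accuracy.
Rules:
- Keep official terms, symbols, and numbers correct (don't "simplify" equations or definitions).
- Define jargon in plain English the first time it appears (one short clause).
- Give me exactly 2 analogies from different domains (one everyday, one technical).
- If any claim is uncertain, mark it UNKNOWN instead of guessing.
- Return in three parts: (1) Plain-English rewrite (under 150 words), (2) Two analogies, (3) Original terms preserved (comma list).
Text:
{paragraph}"""

def professor_filter(paragraph: str, extra_rule: str = "") -> str:
    """Send one gnarly paragraph through the filter and return the rewrite."""
    prompt = PROFESSOR_FILTER_V1.format(paragraph=paragraph)
    if extra_rule:  # e.g. "Use shorter sentences." on a rerun
        prompt = prompt.replace("Rules:", f"Rules:\n- {extra_rule}", 1)
    client = OpenAI()
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(professor_filter("PASTE PARAGRAPH HERE"))

The extra_rule hook is there for step 2 of "Do it now" below: if the first pass still reads stiff, rerun with "Use shorter sentences."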
Quick demo (so you trust it)
Original (too fancy):
“Entropy in information theory quantifies the expected surprise of a random variable’s outcomes. For a discrete distribution p(x), Shannon entropy is H(X) = -\sum_x p(x)\log_2 p(x). Higher entropy implies greater uncertainty and increased average code length for optimal compression.”
Professor-filtered output:
Plain English: Entropy tells you how unpredictable something is. If every outcome is equally likely, you’ll be more surprised each time, so you need more bits to describe what happened. For a set of outcomes with probabilities p(x), the entropy is H(X) = -\sum_x p(x)\log_2 p(x). That formula adds up “how likely” times “how much that outcome would surprise you” across all possibilities. More entropy = more uncertainty = longer messages, on average, if you compress them the best possible way.
Analogies:
• Everyday: Shuffle a deck. Before you flip the top card, you truly don’t know—so explaining which card it was needs more words.
• Technical: In coding, entropy is the lower bound on your average file size; random-looking data can’t be compressed below it.
Terms preserved: entropy, random variable, probability distribution, Shannon entropy, H(X), \log_2, compression.
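Want to see the formula earn its keep? Here’s a tiny Python sketch of my own (not part of the filter’s output) that computes Shannon entropy for a few distributions; the comments show roughly what it prints.

import math

def shannon_entropy(probs):
    # H(X) = -sum over x of p(x) * log2 p(x); zero-probability outcomes add nothing
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))    # fair coin: 1.0 bit
print(shannon_entropy([1/52] * 52))   # shuffled deck's top card: ~5.70 bits
print(shannon_entropy([0.9, 0.1]))    # biased coin: ~0.47 bits, so it compresses well

That deck line is the everyday analogy in numbers: 52 equally likely cards cost about 5.7 bits to name, while a heavily biased coin costs less than half a bit per flip on average.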
Do it now (3 steps, 90 seconds)
- Paste the gnarly paragraph into the Professor Filter v1 prompt above.
- Read the 150-word rewrite and the two analogies out loud. If you stumble, rerun with “shorter sentences.”
- Screenshot the output and stash it with your notes. If the model says UNKNOWN, flag that sentence in your doc—don’t cite it.
Win condition
Explain the idea to a classmate who missed the lecture—in one minute, no notes. If they can repeat the point and one analogy back to you, you’re done. If not, run it again and force shorter, plainer lines.
When not to use this
Poetry, legal disclaimers, or anything where wording is the whole point. Also: if the paragraph is already clear and short, don’t fix what isn’t broken. You’re learning, not doing brand management.
Keep yourself honest (receipts)
Before/after screenshots. Note how long you spent rereading the original versus whether the one-minute explanation worked on the first try. That’s your delta. That’s the point.
Last Note
You don’t need a decoder ring; you need a filter. Paste the ugly paragraph, keep the math and the terms, strip the performative fog. The Professor Filter turns “I sound smart” into “I actually get it” without sanding off the meaning.
Want extra melt on stubborn jargon? Run the same text through Jargon Melt right after. It’s the combo move: one punch for clarity, one for context. Receipts over vibes, every time.