
Reddit Outage Map

The map below depicts the most recent cities worldwide where Reddit users have reported problems and outages. If you are having an issue with Reddit, make sure to submit a report below.

[Heatmap: recent Reddit outage reports by city]

The heatmap above shows where the most recent user-submitted and social media reports are geographically clustered. The density of these reports is depicted by the color scale as shown below.

Reddit users affected (report density): Less → More
Check Current Status

Reddit is a social news aggregation, web content rating, and discussion website. Reddit's registered community members can submit content, such as text posts or direct links.

Most Affected Locations

Outage reports and issues in the past 15 days originated from:

Location                        Reports
Ballinasloe, Connaught          1
Copenhagen, Capital Region      1
Stoke-on-Trent, England         1
Sebring, FL                     1
Chicago, IL                     1
Melbourne, VIC                  3
New Orleans, LA                 2
Pune, MH                        5
Toronto, ON                     1
Barcelona, Catalonia            1
Algiers, Algiers                1
Cape Breton County, NS          1
Cochin, KL                      2
Parker, CO                      1
Chandler, OK                    1
Delano, CA                      1
Seattle, WA                     1
Gabriola, BC                    2
New York City, NY               1
New Delhi, NCT                  2
Newark, NJ                      1
Mumbai, MH                      5
Bengaluru, KA                   5
Mangalore, KA                   1
Amritsar, PB                    1
Vijayawada, AP                  1
Vellore, TN                     1
Kolkata, WB                     3
Coimbatore, TN                  1
Rājkot, GJ                      1

Community Discussion

Tips? Frustrations? Share them here. Useful comments include a description of the problem, city and postal code.

Beware of "support numbers" or "recovery" accounts that might be posted below. Make sure to report and downvote those comments. Avoid posting your personal information.

Reddit Issues Reports

Latest outage, problem, and issue reports from social media:

  • arpitpandey735
    Arpit Pandey (@arpitpandey735) reported

    Day 1: I will handle your marketing across Reddit, X, and LinkedIn. Every founder I talk to has the same problem: a good product, inconsistent marketing. They don't have the time or bandwidth to do it every single day. I am keeping it extremely affordable right now for new clients.

  • DavidGQuaid
    David Quaid - AI SEO (@DavidGQuaid) reported

    Another great example of GEO disinformation written by LLMs on a single-use Reddit bot account. This is an incredible position that should require someone in a "thought leadership" position, but instead GEO agencies are just using repetition to make it seem like it's common knowledge. Tell-tale factors: emphasis on E-E-A-T and Schema, because this is now confirmation bias. E-E-A-T cannot make you rank if you have nothing else. That's the problem with it and with "good content". This means that SEO strategies founded on LLM "ideas" are complete BS.

  • barkmeta
    Bark (@barkmeta) reported

    Let me explain what just happened… An AI just launched that eliminates all marketing jobs. Not some of them. All of them. SEO. Social media. Content writing. Ad creation. Brand design. Pitch decks. Community management. Reddit posts. Email campaigns. All of it.

    A marketing team costs $200K to $500K a year. An agency costs $10K to $20K a month. A freelance designer charges $5K per project. This does all of it. Every single function. For almost nothing.

    Backed by General Catalyst. Jeffrey Katzenberg. Executives from Dropbox, Stripe, and Google. $7.5 million in funding. Thousands already using it. And it has an API. Meaning other AI agents feed it work automatically. AI writes the copy. AI designs the assets. AI posts it. AI optimizes it. No human ever touches it. A full marketing department. End to end. Automated.

    A week ago AI replaced coders. Before that writers. Before that customer service. Now every marketing job. All at once. From one launch. Every single week another AI drops and another career becomes a subscription. And it's not slowing down. It's speeding up…

  • godofprompt
    God of Prompt (@godofprompt) reported

    🚨 BREAKING: Every prompt you're using right now is built on what worked months ago. Someone just open-sourced the solution. A Claude Code skill that scans Reddit and X from the last 30 days on any topic you give it, then writes you a fully structured, copy-paste-ready prompt based on what the community has actually figured out. Not last quarter. Not last year. This month.

    You type "/last30days prompting techniques for ChatGPT for legal questions" and it returns the exact patterns real lawyers and power users are running right now. Complete with a deployment-ready prompt you drop in and use immediately. It works for anything:
    → Midjourney techniques the Discord discovered this week
    → Cursor rules that actually work in 2026
    → Claude prompting patterns from builders shipping real products
    → Suno, Runway, code generation, whatever is moving right now

    This is what prompt engineering looks like when it stops being a static document and starts being live intelligence. The entire AI prompting landscape updates faster than any guide, course, or thread can keep up with. This skill treats that reality as a feature, not a problem. Most people are still Googling for prompts that got patched out three updates ago. This pulls from the source: real users solving real problems in real time. 100% open source. MIT License. The gap between people using fresh prompts and people recycling old ones is about to get very visible.

  • TooRareToCare
    TooRareToCare (@TooRareToCare) reported

    @AtomicErectus @Sorelle_Arduino Agreed. I swear half the OPs in these Reddit groups make their stories up just to create issues that don't exist

  • xerusHQ
    Xerus (@xerusHQ) reported

    @bcherny Bug in Claude Code: it's eating my Max plan limit in less than 10 minutes. Not only that, Claude usage shows 15% but the CLI keeps giving rate limit errors. The issue is also all over Reddit.

  • nyxreigns
    0xNyx ☠️ (@nyxreigns) reported

    @zeeesol Go to TikTok or Reddit to download the picture

  • robcunningham25
    Robert Cunningham (@robcunningham25) reported

    Wispr Flow is a $700M-valued piece of buggy trash right now. Constant issues, no real support, privacy red flags, jet-engine fans on my laptop for no reason. Uninstalling soon if nothing changes. Other users on Reddit are complaining too, but into an echoey void #WisprFlow @StevenBartlett @WisprFlow

  • NandaKishoreHT
    Nanda Kishore (@NandaKishoreHT) reported

    I built 4 AI agents that scrape Reddit, Hacker News, GitHub, arXiv, and Product Hunt every morning, analyze the data through 3 competing LLMs, and auto-draft content for me. Here's what I actually learned wiring together Groq, Anthropic, NVIDIA Nemotron, Ollama, Brave Search, and Baileys into one system running on a $12/month droplet.

    The Agents: The setup is 4 agents, each with a specific job:
    1. Kiran (Data Ops) — scrapes 8 sources via Brave Search API + direct Reddit/HN/GitHub APIs. Runs on Groq's Llama 3.3 70B because it handles structured JSON extraction well at zero cost.
    2. Priya (Intelligence) — takes Kiran's raw trends and scores them. Runs on Claude Sonnet because pattern recognition across 50+ data points needs real reasoning, not just summarization.
    3. John (Content Writer) — auto-generates X threads, LinkedIn posts, and video scripts from Priya's analysis. Also on Claude because writing quality matters.
    4. Maya (Chief of Staff) — morning briefs, chat interface, coordinates the other three. Reads from a shared memory layer so she knows what everyone did.

    Multi-Model Competition: The most interesting part: I made 3 models compete on the same scraping task. Groq (Llama 3.3 70B), NVIDIA Nemotron 3 Super 120B via OpenRouter, and a local Ollama instance all get the exact same prompt with the exact same data. I score them purely on output quality: how many trends they found, how complete the fields are. No speed bonus. No default winner. Best analysis wins each run. The scoring was rigged at first — I gave Groq a 5-point advantage and a latency bonus. Removed both once I realized: this runs twice a day. Speed is irrelevant.

    Memory Layer: Agents that forget yesterday are useless. Built a JSON-based shared memory system: each agent writes observations and reads what others wrote. Kiran stores trend snapshots (30 days of history). Priya writes strategy decisions. John reads both before drafting. No database. No vector store. Just a JSON file with caps per section (200 messages per agent, 100 shared insights). Syncs daily to GitHub as markdown. The entire persistence layer is one file under 300 lines.

    Key lessons learned:
    1) Brave Search API as a universal scraper. Reddit, HN, GitHub trending, arXiv, Product Hunt, even X posts — one API handles it all. Way simpler than maintaining 8 different scrapers.
    2) Free-tier models are real competitors. Nemotron 120B via OpenRouter's free tier beat Groq on multiple runs when I stopped handicapping it. NVIDIA is not messing around.
    3) Baileys for WhatsApp instead of whatsapp-web.js. No Chrome dependency, no Puppeteer, runs on a 1-vCPU droplet. The CJS/ESM interop on Node 22 was painful though — dynamic imports are the fix.
    4) pm2 + node-cron for scheduling. Cron at 7am scrapes, 8am briefs, 9pm GitHub sync. No Lambda, no Kubernetes, no queue system. Just a droplet.

    How much did it cost? Not much, actually. Total monthly cost: $12 droplet + API calls (Groq free, OpenRouter free tier, Brave ~$3, Anthropic ~$5 for Claude calls). Under $20/month for a system that scrapes the internet, runs multi-model analysis, writes content drafts, and sends them to my Telegram every morning.

    The repo is openclaw2 on my GitHub if you want to look at the architecture. What models are you running in your pipelines?

  • QM_Ashwin
    Ashwin Saxena (@QM_Ashwin) reported

    Reddit is down. Can't open any thread.

  • RAKSWALKER
    RAKS WALKER (@RAKSWALKER) reported

    @GamerQueenZoe Oh damn Maybe it's better to find some articles on reddit or somewhere to see if this is a common issue

  • Arkasiraee
    Ammanichanda (@Arkasiraee) reported

    @anvisha The breakdown to see what this really means: an AI just launched that eliminates all marketing jobs. Not some of them. All of them. SEO. Social media. Content writing. Ad creation. Brand design. Pitch decks. Community management. Reddit posts. Email campaigns. All of it. All these jobs are going away.

    A marketing team costs $200K to $500K a year. An agency costs $10K to $20K a month. A freelance designer charges $5K per project. This does all of it. Every single function. For almost nothing. Like 10% of the cost.

    Backed by General Catalyst. Jeffrey Katzenberg. Executives from Dropbox, Stripe, and Google. $7.5 million in funding. Thousands already using it. And it has an API. Meaning other AI agents feed it work automatically. AI writes the copy. AI designs the assets. AI posts it. AI optimizes it. No human ever touches it. It just gets better and better with time, and eventually the curve will be too tough even for humans to read. A full marketing department. End to end. Automated.

    A week ago AI replaced coders. Before that writers. Before that customer service. Now every marketing job. All at once. From one launch. Just one AI as of now. Every single week another AI drops and another career becomes a subscription. And it's not slowing down. It's speeding up. You are less relevant with each passing minute.

  • migs_sr
    Migs_Sr (@migs_sr) reported

    Reddit needs shut down. Nothing but hate on that platform

  • Naick16
    Naick (@Naick16) reported

    @theandreboso Yeah lots of subreddits have really gone downhill. Some interesting ones I followed for a bit got overrun by 'subtle' promo posts. A made up AI slop story and a magical tool that suddenly fixed all their issues. Hard situation to be in for big subreddits and reddit in general

  • MauricioJFlorez
    Mauricio Florez (@MauricioJFlorez) reported

    @claudeai @antrophic What is happening with free Claude? It is reaching the limit after just one prompt. Please read on Reddit, where there are hundreds of complaints about this limit problem from the past two days. The Pro and Max versions have the same problem after a few prompts.
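
One of the reports above describes a JSON-based shared memory layer: each agent appends observations to its own log, reads what the others wrote, and a single file is capped at 200 messages per agent and 100 shared insights. The sketch below illustrates that idea in Python; it is a minimal interpretation of the post, and the file path, function names, and agent names are hypothetical (not taken from the openclaw2 repo).

```python
import json
from pathlib import Path

MEMORY_FILE = Path("memory.json")  # hypothetical location of the shared file
AGENT_CAP = 200                    # messages kept per agent (from the post)
SHARED_CAP = 100                   # shared insights kept (from the post)

def _load() -> dict:
    """Load the memory file, or start empty if it doesn't exist yet."""
    if MEMORY_FILE.exists():
        return json.loads(MEMORY_FILE.read_text())
    return {"agents": {}, "shared": []}

def _save(mem: dict) -> None:
    MEMORY_FILE.write_text(json.dumps(mem, indent=2))

def write_observation(agent: str, text: str) -> None:
    """Append one observation to an agent's log, trimming to the cap."""
    mem = _load()
    log = mem["agents"].setdefault(agent, [])
    log.append(text)
    mem["agents"][agent] = log[-AGENT_CAP:]  # keep only the newest entries
    _save(mem)

def write_shared(text: str) -> None:
    """Append an insight visible to every agent, trimming to the cap."""
    mem = _load()
    mem["shared"] = (mem["shared"] + [text])[-SHARED_CAP:]
    _save(mem)

def read_others(agent: str):
    """Return the other agents' logs plus the shared insight list."""
    mem = _load()
    others = {a: log for a, log in mem["agents"].items() if a != agent}
    return others, mem["shared"]
```

With a run cadence of twice a day, a capped flat file like this avoids any database or vector store: a writer appends, a reader scans, and the slices keep the file bounded.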
