Reddit Outage Map
The map below shows the cities worldwide where Reddit users have most recently reported problems and outages. If you are having an issue with Reddit, be sure to submit a report below.
The heatmap above shows where the most recent user-submitted and social media reports are geographically clustered. The density of these reports is indicated by the color scale.
Reddit is a social news aggregation, web content rating, and discussion website. Reddit's registered community members can submit content, such as text posts or direct links.
Most Affected Locations
Outage reports and issues in the past 15 days originated from:
| Location | Reports |
|---|---|
| Rapid City, SD | 1 |
| Edmonton, AB | 1 |
| San Francisco, CA | 1 |
| Pune, MH | 4 |
| Saint-Pierre, Réunion | 1 |
| Melbourne, VIC | 4 |
| Kensington, England | 1 |
| Vancouver, BC | 1 |
| Marseille, Provence-Alpes-Côte d'Azur | 2 |
| San Jose, CA | 1 |
| Thiruvananthapuram, KL | 1 |
| Ottawa, ON | 1 |
| Enguera, Valencia | 1 |
| Benalmádena, Andalusia | 1 |
| Sydney, NSW | 1 |
| Township of Evan, KS | 1 |
| Ballinasloe, Connaught | 1 |
| Copenhagen, Capital Region | 1 |
| Stoke-on-Trent, England | 1 |
| Sebring, FL | 1 |
| Chicago, IL | 1 |
| New Orleans, LA | 2 |
| Toronto, ON | 1 |
| Barcelona, Catalonia | 1 |
| Algiers, Algiers | 1 |
| Cape Breton County, NS | 1 |
| Cochin, KL | 2 |
| Parker, CO | 1 |
| Chandler, OK | 1 |
| Delano, CA | 1 |
Community Discussion
Tips? Frustrations? Share them here. Useful comments include a description of the problem, along with your city and postal code.
Beware of "support numbers" or "recovery" accounts that might be posted below. Make sure to report and downvote those comments. Avoid posting your personal information.
Reddit Issue Reports
Latest outage, problem, and issue reports from social media:
- Cheick Kan (@kocheick_) reported: While building it, I would check different places (twitter, reddit, fb/ig) to see if people talk about it just to get an idea of whether my product can solve their problems or not, and so far that's been my way of confirming demand.
- Taiwo Oladosu (@TaiTechSolution) reported: @WorkWomp @tibo_maker Yeah, that’s exactly how Reddit behaves this isn’t a “you” problem, it’s a system problem. Job search subreddits are some of the strictest on the platform. New account + product mention = instant removal almost every time. Waiting for 30 days helps a bit, but it won’t fix the core issue on its own. What actually moves things forward is changing the approach: Right now, you’re trying to post into strict subs What works better is to enter conversations inside them first For your case (job search tool), the highest-converting entry points are usually: people frustrated with job boards “I applied to 200 jobs and nothing” threads resume / interview struggles “what tools are you using to find jobs?” If you go in as a product, you get removed If you go in as someone helping solve the problem, you build visibility And here’s the part most people miss: Even if your post gets deleted, comments still drive traffic + interest if they’re written properly. Also, trying to rely only on jobsearch subs early is a trap. They’re high traffic, but very defensive. You’ll get more traction faster by mixing in: founder / indie hacker spaces career transition communities productivity / workflow discussions Same problem, less resistance. If you keep pushing posts, you’ll keep hitting the same wall. If you shift to comment-first + problem-led positioning, you’ll start seeing traction even before your account “ages out.” If you want, I can show you how to structure a few replies specifically for job search threads so they don’t get flagged but still pull people in.
- hoeem (@hooeem) reported: this guy saw people placing their iPhones on their water bottles and made a product. what if you could scan through X + Reddit + YouTube comments to find a 100x idea? another example: Nivea found that people complained about deodorant marks on black & white clothes, they created one of their best selling products Nivea invisible black & whites from this. the ideas are out there on a random YouTube video, Reddit discussion, or on a shitpost on X. the problem is you’re a human. you only have so much bandwidth. build your own 100x idea machine with AI by following this article I wrote a few days ago… 👇
- Ben Hatch (@benhatch44) reported: @seraleev Did she attempt to upgrade an existing individual account to organization? I attempted to do this and also got stuck in limbo. After two weeks I canceled the upgrade request because it seemed to be going nowhere. From everything I researched on Reddit and other places, it seems others had this problem as well. If an organization account is needed best move is to just create a new developer account from scratch i think.
- Erik_22 (@erik_S22) reported: @bobpockrass Yeah that's what I thought, I got down voted like crazy on Reddit for saying it but the smt doesn't lie.
- Devesh | Reddit Marketing (@deveshlogs) reported: Why does your competitor show up in ChatGPT recommendations? Not backlinks Not SEO Not more content They show up where AI looks: • Reddit • Communities • Answer-first threads They stay consistent Others quit too early That’s the advantage Not effort Placement I broke it down in the Reddit Playbook: • What to post • Where to post • How to get cited by AI Comment “REDDIT” and I’ll send it
- Stephen (@0xSMW) reported: @SlykePhoxenix @kevinrose @digg use reddit or x. the last community iteration of digg was terrible.
- Jacob Klug (@Jacobsklug) reported: This Reddit post just went viral. 1,800+ upvotes. And it's the exact problem I see every week. The app works. Users are happy. Revenue is coming in. But the repo is untouchable. Duplicate functions. No structure. Touching one thing breaks something unrelated. This is what happens when you use AI without architecture. They don't think about what comes next. They add files, duplicate logic, solve the same problem three different ways. Not because the tools are bad. Because nobody gave them a system to build inside. The fix isn't "stop vibe coding." It's: → Define your data model before you prompt → Use a design system so the AI stays consistent → Build in modules, not one giant tangled app → Refactor every 2 weeks, not every 6 months We've rebuilt 250+ apps at Creme that started exactly like this. The rewrite is always more expensive than doing it right the first time. Vibe coding without architecture isn't building. It's accumulating debt.
- Oh, Sup Y'all (@OhSupYall) reported: Has Reddit acknowledged their bot problem? It’s wild to me that almost half of all posts I see when I open the app are obviously bots, but somehow people are being fooled, unless the comments are bots too. It’s wild to think I used to get value out of this app. $RDDT @Reddit
- Rick Decker (@1750agreed) reported: Hey @TheSpoilerGirl Please read this from our dear friends at Reddit: "Steffy is a hypocritical male-centered pick me according to you Hope should shut up and deal with whatever you give her with no problem because she's lucky to have HFTF and even be in the building." Part one
- tommy bologna (@therealtb404) reported: @unusual_whales Reddit is not profitable. Y'all are going to have to fix your America derangement syndrome if you want to stay in the green.
- 𝜗℘ ◟ ͜ Sylvie ,🍰 . Vincent's ♡ (. ❛ ᴗ ❛.) (@vincentsbelle) reported: How many accounts do you have? Twitter: - 2 Discord: - 4 Instagram: - 1 Facebook: - 1 Snapchat: - 0 (im not being tracked down by highschool chavs mate) TikTok: - 6 Twitch: - 1 Steam: - 1 YouTube: - 1 Spotify: 1 Pinterest: - 1 Reddit: - 1 Gmail: - 6 Telegram: - 0
- Traddoc (Tdoc) (@Jclearfield2) reported: @toddboese @and_catch_fire @MattWalshBlog Reddit is down the hall and to the left.
- Bobby Shuttlecock (@womprat48) reported: @_24kin I feel like I’m losing my mind. A post about this blew up on the baseball sub on Reddit, too. This is such a non-issue. He’s a rehabbing pitcher who let one get away from him trying to go up and in.
- Sean G (@SeanG882) reported: As someone who has spent years grinding in AI search at a big tech company, I have always been against GEO visibility reports. Recently, I saw a comment from PearlsSwine in the Reddit r/AEO . This guy basically said exactly what I’ve been thinking, and he laid out the reasons very clearly. Ask ChatGPT the same prompt at 9:00 a.m., then ask it again at 9:01, and you may get different answers. There are many reasons for this: temperature, sampling, server side routing between different model variants, invisible A/B tests, and so on. A single observation is just noise. To get a real signal, you would need heavy sampling, and that costs real money. That’s why most of these tools sample very thinly, then pretend the number is stable. It isn’t. Some people might say: “Then I’ll just spend more money and brute force it.” But the problem is, these tools cannot truly simulate how real users query. Real users may have memory turned on. They may have custom instructions. Their current conversation may already contain previous context. The wrapper app they’re using may inject its own system prompts. The platform may also apply different layers of personalization. These tools, meanwhile, are just firing naked queries from a brand new session. On top of that, ChatGPT routes users across different underlying models depending on plan tier, query type, and server load. Perplexity also has its own model selection mechanism. So the model your tracking tool is hitting is not necessarily the same model your prospect is using. You are measuring a product that is different from the one being consumed in the real world. Even worse, the models themselves can be refreshed, retrained, or replaced without notice. If your visibility score goes up 14% on a random Tuesday, is it because your content strategy worked, or because OpenAI quietly rolled out a new checkpoint? You can’t know. You will never know. Of course, the tool will attribute it to your actions, because that is the story that keeps you subscribed. And what about its prompt set? That is curated by the tool vendor, not dictated by your buyer journey. Real query distribution is extremely long tail. Synthesizing 500 so called “representative” prompts and claiming that this measures how AI sees your brand is like sticking a ruler into a puddle and saying you’ve measured sea level. On top of that, extracting brand mentions and sentiment from the answers is itself an imperfect process. So there is noise at the input layer: stochastic LLM outputs. And there is noise at the parsing layer: another layer of NLP analysis stacked on top of that. For all these reasons, I believe selling GEO visibility reports is basically selling snake oil. Is there a solution? From a technical perspective, not really, at least not for now. The only practical path is to rely on indirect signals, such as monitoring changes in target site traffic, interviewing users, and using similar methods to roughly estimate the impact of GEO.