Cloudflare status: hosting issues and outage reports
No problems detected
If you are having issues, please submit a report below.
Cloudflare is a company that provides DDoS mitigation, content delivery network (CDN) services, security and distributed DNS services. Cloudflare's services sit between the visitor and the Cloudflare user's hosting provider, acting as a reverse proxy for websites.
Problems in the last 24 hours
The graph below shows the number of Cloudflare reports received over the last 24 hours, by time of day. When the number of reports exceeds the baseline, represented by the red line, an outage is declared.
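As a rough illustration of the detection rule described above, the Python sketch below flags the hours at which the report count crosses a baseline. The counts, the hourly resolution, and the flat baseline are all illustrative assumptions, not this site's actual algorithm.

```python
def detect_outage(report_counts, baseline):
    """Return the hours (indices) where the report count exceeds the baseline.

    report_counts: reports received per hour over the last 24 hours.
    baseline: expected report volume per hour (the "red line").
    """
    return [hour for hour, count in enumerate(report_counts)
            if count > baseline[hour]]

# Illustrative data: a spike of reports between 14:00 and 16:00.
counts = [3] * 24
counts[14:16] = [40, 55]
baseline = [10] * 24

print(detect_outage(counts, baseline))  # hours flagged as an outage
```

A real monitor would likely use a time-varying baseline (weekday/hour seasonality) rather than a constant, but the comparison step is the same.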
At the moment, we haven't detected any problems at Cloudflare. Are you experiencing issues or an outage? Leave a message in the comments section!
Most Reported Problems
The following is a breakdown of the problems most frequently reported by Cloudflare users through our website:
- Cloud Services (38%)
- Domains (33%)
- Hosting (22%)
- Web Tools (5%)
- E-mail (2%)
Live Outage Map
The most recent Cloudflare outage reports came from the following cities:
| City | Problem Type | Report Time |
|---|---|---|
| | Hosting | 2 days ago |
| | Domains | 2 days ago |
| | Cloud Services | 8 days ago |
| | Hosting | 8 days ago |
| | Cloud Services | 9 days ago |
| | Hosting | 17 days ago |
Community Discussion
Tips? Frustrations? Share them here. Useful comments include a description of the problem, city and postal code.
Beware of "support numbers" or "recovery" accounts that might be posted below. Make sure to report and downvote those comments. Avoid posting your personal information.
Cloudflare Issue Reports
The latest outage, problem, and issue reports from social media:
- Neyo_AI (@Neyoak1) reported: 🔵 Cursor's entire strategy was "monk mode" — heads down on product, zero growth team, zero ad spend. Their biggest viral moment? A random Cloudflare VP posted a video of his 8-year-old daughter coding with it. 2.7M views. They had nothing to do with it.
- Gabriel Abi Ramia → tubespark.ai (@gabrielabiramia) reported: Quantum doesn't threaten Bitcoin's mining — it threatens the signatures. ~4M BTC sitting in exposed wallets is the real problem nobody's talking about. Cloudflare set a 2029 deadline. Bitcoin devs are still not worried.
- Rodrigo Rocco 👨💻📈📗 from JobBoardSearch 🔎 (@rrmdp) reported: This is what the PostHog AI said but still think is wrong just because simple logic bot traffic is becoming a real problem they are getting smarter and screwing all traffic stats! If somebody know how to fix please share Note: JobBoardSearch already runs under Cloudflare
- Omote-Ura (@omoteurax) reported: @MetaMachina_RW Cloudflare Turnstile still phones home to Cloudflare's servers on every solve, so they see every login IP and timestamp even if you don't.
- DisPaisy (@dispaisy) reported: @jamesperkins @Cloudflare This is a Google problem. I had this problem for years. Every time I set up a new device, I needed to make sure that I already had more than one account signed into Google so it gave me the option to log into a new Google account or use an account that I was already logged into.
- Ayush Garg (@gargayush909) reported: @shydev69 @Cloudflare Same issue is in @Replit dashboard
- Raynhardt Coetzee (@Raynhardt_dev) reported: if your AI streaming breaks depending on where you deploy, the problem isn't your streaming code. it's where it's running. Next.js 16.2 introduced a stable Adapter API for deploying across Vercel, Cloudflare, AWS, and others without rewriting. but for agent workloads, the Adapter API alone isn't enough. the part that actually matters is the runtime boundary. AI streaming needs a persistent Node.js runtime. middleware runs in the Edge runtime. if your streaming logic lives in middleware or gets caught in the React render loop, you'll hit inconsistent behaviour that changes per platform. the fix: a proxy.ts file at the project root. outside the render loop. every AI call goes through it. auth headers, stream handshakes, cross-origin proxying, handled in one place. Adapter API tells Next.js how to run anywhere. proxy.ts tells your agent where to run consistently. you need both. not just in development. in production.
- Cruser (@0xCruser) reported: Polymarket runs behind cloudflare, order matching depends on location, milliseconds decide everything They are set up vps close to their infra websocket feed with live prices Scanner for up + down less than 1 dollar Auto buy on trigger Now the weird part
- DiabolikDVD.com (@diabolikdvd) reported: @TiffaHorror @PayPal There is also a very specific number on the page that refers to me, which they can look up and see the source of the problem. I use Cloudflare and am not a tech person but have somehow managed to fix similar problems with them on my own
- Mykolas 🇱🇹 (@mmmykolas) reported: @ZachSDaniel1 Just a mention, while these thumbnail definitions look great to start with, i advice against it. Store the actual file, and then use some sort of service like Cloudflare or Uploadcare to have formatable urls for different sizes and so forth. While it looks like a good idea to have these sizes predefined it's such a client/consumer based thing.
- Tobias Möritz (@tobimori) reported: @AArdvarkErick @Cloudflare @eastdakota 💯 same issue for me
- Bruno Godinho (@FlaNacao2019) reported: Tried everything: Reinstalled EA Anti-Cheat Reset Shader Cache Capped FPS at 60 Cloudflare DNS (1.1.1.1) Fullscreen Exclusive + lowest graphics Nothing fixed the delay. This is clearly a server/netcode issue on EA's side after the patch.
- CulturedNiichan (Kuro) (@culturednii_v2) reported: @Doc72_ the problem is to me the main threat is not someone else. It's cloudflare itself, big tech. I mean I don't use it so can't complain. I dont' know, if I really needed a secure web with cloudflare for any reason, the contents themselves should be encrypted via a JS library or smth
- CryptoSangeet (@CryptoSangeet) reported: Cloudflare doubling down on post-quantum shift… future-proofing or overkill? 🤔
- Snipemeister (@Timeler_csgo) reported: @jamesperkins @Cloudflare "Picky" says the guy too lazy to login to google before logging in on a third party site kekw
- Vin (@VinayakNgm) reported: @shydev69 @Cloudflare You are Right, and people who Support Cloudfare then don't Forget India is Mega Big Market for Cloudfare then Pakistan
- Suresh Kamath (King of Typos) (@sarathikg1) reported: @shydev69 @Cloudflare Why should they fix it? Some one from Pakistan will then say the same and ask them to fix it. International companies show the map as it is. They do not include disputed areas
- Lazar Stojković ⚡️ (@LazarStojkovic) reported: @jumperz Meh, it’s the same general idea as Cloudflare Durable Objects + Agents SDK. The only real news here is Anthropic making the durable session log a turnkey managed service.
- LeafyWasHere (@Foiblaet) reported: Just kidding, it doesn’t work unless I change my network altogether as well, cloudflare should be seized by the state
- Tasmay Tibrewal (@TasmayTibrewal) reported: Arunachal is visible. and i get that the territory (PoK & Aksai Chin) fall under India's sovereignity, but they are claimed by china and pakistan too. For us it seems like a violation, but for a third-party neutral body, it is conflict and chaos, which they dont want to meddle with. They want to show the map, offending the least people as possible, if they show PoK in India, then pakistanis will be offended and for aksai chin, chinese. Thus the best way (ideal way) to represent a map, is showing the controlled territories, rather than the claimed ones. Almost all international maps, including standard world maps operate on this. Cloudflare is not even to be blamed in this case, they are probably using a standard mapping library/service. It is on us (you) to understand, that this is the practice. If you want your claimed territory to be shown then: a) have control over it or b) have enough power on the global stage (probably only true for US today), to manipulate the map according to your claimed sovereignity If you look at it from a neutral perspective, it is a very simple, and ideal principle, not discriminating against anyone. One should stop being so self-absorbed, and look ideally to get the very obvious picture. This is a general issue I have observed with people commenting across channels, accounts for maps and all, when it is not even the fault of the person showing it, these are the standard, ideal practices, plus it is rare to find a global map with india shown as it should be. So either work on making India strong enough that by hook or by crook, everyone has to respect our sovereignity; or just shut up altogether, no point crying about it like a ***** everywhere.
- Async Tear 💧 (@asynctear) reported: @venelinkochev can i host a scraper with this vps? im thinking of cloudflare as proxy. not gonna hammer other server down
- Micah Berkley - The 50 Cent of AI. (@MicahBerkley) reported: @levelsio @Hernandez_A @GoDaddy I stand on what I said... setting up Cloudflare Tunnel + Workers + Zero Trust just to do one thing is a pain. And truthfully, even they often speak about it. Do you remember the hoops that you needed to follow to use their OpenClaw instance? You had to set up multiple secrets via Wrangler (CLI), set up GW Tokens, AUDs, R2 storage loops, Zero Trust and hidden critical menu options on their dashboard. GoDaddy used to almost absolutely terrible with configurations on their platforms.
- akvn (@0xAkvn) reported: @espicodes @0xanmol Thats fair, but even with enterprise contract cloudflare would likely be cheaper. I would also look into @PlanetScale instead of aurora for a ~1.5x performance improvement while being cheaper (they use AWS so latency would not be an issue)
- TechInnovation (@TechInnovationz) reported: $IonQ Cloudflare just moved their post-quantum security deadline to 2029. This matters more than most people realize. Cloudflare protects ~20% of the web. They’ve been ahead of the PQ curve since 2019. They activated post-quantum encryption by default in 2022. Today, 65%+ of human traffic to Cloudflare is already PQ-encrypted. And they just told the world the timeline is no longer 2035. It’s 2029. What triggered the acceleration, in their own words: → Google’s ECDLP paper from last week (the one we covered 500K qubits, 9-minute Bitcoin attack window) → Oratomic’s neutral-atom resource estimate published the same day: only ~10,000 qubits to break P-256 on neutral atom architecture, with 3-4 physical qubits per logical qubit instead of 1,000 → IBM Quantum Safe’s CTO publicly stating he’s “not ruling out large-scale quantum attacks against high-value targets as early as 2029” But the most chilling line in the Cloudflare post is from Scott Aaronson, quoted directly: “At some point, the people doing detailed estimates of how many physical qubits and gates it’ll take to break actually deployed cryptosystems using Shor’s algorithm are going to stop publishing those estimates… for all we know, that point may have been passed already.” Cloudflare’s response: “That point has now passed indeed.” The strategic shift in their post is the part nobody is talking about. Until now, the PQ industry focused on encryption defending against harvest-now-decrypt-later. Cloudflare is now prioritizing authentication. Their reasoning is brutal: “An imminent Q-Day flips the script: data leaks are severe, but broken authentication is catastrophic… An active quantum attacker only needs to find one trusted quantum-vulnerable key to get in.” Translation: when Q-Day arrives, the threat is no longer “they’ll read your old emails.” It’s “they’ll forge a code-signing certificate and push malware through your auto-update mechanism.” Long-lived keys, root certificates, API auth keys all attack vectors. Now read Cloudflare’s public recommendation to businesses, also in the post: “For businesses, we recommend making post-quantum support a requirement for any procurement.” A procurement requirement. Not a research curiosity. Not a future option. A line item on every RFP starting now. Look at IonQ QNSS stack through that filter: → ID Quantique commercial QKD, QRNG, post-quantum cryptography products. Already selling to telcos, banks, governments today. ~300 patents. Geneva-based, 24-year track record. → Qubitekk entanglement-based QKD on US critical infrastructure. 118 patents. Already deployed on the Chattanooga electrical grid via DOE-funded program. → Year of Quantum Security 2026 backed by FBI, NIST, CISA. IonQ is part of it. Every acquisition in the QNSS division was made before this Cloudflare announcement. Before the Google paper. Before the Oratomic paper. Before IBM’s CTO went public with 2029. The thesis isn’t “IonQ will benefit from PQ panic.” The thesis is that the only quantum company that already owns commercial PQ products at scale is the one that spent the past 18 months acquiring them. When the world’s largest CDN moves its deadline forward by six years and tells every business to make PQ a procurement requirement, the market for ID Quantique’s products doesn’t grow gradually. It compresses. Source: Cloudflare $IONQ #IonQ #PostQuantumCrypto #QuantumSecurity
- CheckBit (@Share4Bitcoin) reported: @shydev69 @Cloudflare @Cloudflare better change this or all your site would be down
- Kai (@kai_h) reported: @franklinveaux I know this isn’t of much help to you, but I just needed to tell someone that I’m just about done migrating my website from Wordpress to Astro + Decap CMS, and I’ll be able to host it as a static site on Cloudflare pages and it should be extremely resilient.
- Kuan Yu (@phuakuanyu) reported: I manage 12 email accounts. I've never opened half of them in a browser. (1) I just set up 7 Zoho domain emails for different businesses. Add 3 Gmail accounts and one Google Workspace. That's 12 inboxes across two providers. Opening each one in a browser tab was never going to work. (2) So I skipped the browser entirely. My CLI agent connects to all 12 accounts via IMAP, pulls unread counts in parallel, and gives me a single status board. Tonight it found 52 unread emails across every account in under 10 seconds. (3) But checking isn't the hard part. Triage is. The agent reads full email bodies, classifies them into four tiers - SKIP, INFO, MEETING, ACTION - and presents only what needs a decision. Tonight's result: 4 items needed attention out of 52. The other 48 were noise. (4) I made my decisions in 30 seconds. Let the Cloudflare domain expire. Let the AWS free tier close. Then one more command: all 52 emails marked as read across all 12 accounts. Done. (5) The entire email workflow - check, read, triage, decide, clear - took under 3 minutes. No browser. No tabs. No switching between Gmail and Zoho. No password prompts. No loading spinners. (6) When you have 12 inboxes, the browser-based email client isn't slow. It's architecturally wrong. The inbox is just another API.
- Shang T(alleres)sung (@_ShangTsung_) reported: @Tokenexpress33 @krishdotdev About the safety, I don't know, all major tech companies had big leaks at least once. Or downtime. Even amazon fell!! Oh sorry mom I can't show you the picture of our grandma because cloudflare is down. Please try again later. You rely on a giant group of people for simple tasks.
- Khizar (@khizar_mm) reported: does anyone know why cloudflare workers plan keeps having payment issues?
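Several of the reports above describe workflows rather than outages; the 12-inbox triage described by @phuakuanyu, for example, boils down to a classification step. The Python sketch below illustrates that four-tier idea (SKIP, INFO, MEETING, ACTION) with purely hypothetical keyword rules; it is not the poster's actual agent, which reads full message bodies over IMAP.

```python
def triage(subject: str, body: str) -> str:
    """Classify an email into one of four tiers: SKIP, INFO, MEETING, ACTION.

    The keyword rules here are hypothetical examples; a real agent would use
    richer signals (sender reputation, thread history, or an LLM classifier).
    """
    text = f"{subject} {body}".lower()
    if any(k in text for k in ("action required", "expires", "payment failed")):
        return "ACTION"   # needs a decision now
    if any(k in text for k in ("meeting", "invite", "calendar")):
        return "MEETING"  # scheduling-related
    if any(k in text for k in ("newsletter", "digest", "unsubscribe")):
        return "SKIP"     # noise, safe to mark read
    return "INFO"         # read later, no decision needed

print(triage("Your domain expires soon", "Renew now to keep it active"))
print(triage("Weekly digest", "click unsubscribe to stop these"))
```

Keeping the classifier a pure function of the message text makes it easy to run over messages fetched from any provider (Gmail, Zoho, or otherwise) in one pass.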