They Said AI Would Help Humanity... So Why Has Anthropic Gone Silent on Billing Errors for a Month?

A screen showing a billing error message, while an AI robot ignores a person's hand reaching out for help nearby.
AI Summary

Anthropic, a public benefit corporation supposedly dedicated to the long-term well-being of humanity, has failed to respond to billing error inquiries from paid users for over a month, causing a surge in user frustration.

Imagine this: You’ve subscribed to the paid version of ‘Claude,’ the AI assistant famous for its intelligence, to boost your productivity. But one morning, you receive a credit card notification. You’ve been charged $180 (about 250,000 KRW) for ‘Extra Usage’ you never actually consumed. Panicked, you immediately email customer support. The only reply you get is a mechanical one: "Use the AI refund tool," which doesn’t even apply to your specific problem. Then 30 days pass with no further word. How would you feel?

This is the reality currently facing paid subscribers of Anthropic. [2026-04-09] I’ve been waiting over a month for Anthropic to …

Why does this matter?

Anthropic is not just another company making chatbots. Based in San Francisco, the firm defines itself as a ‘Public Benefit Corporation (PBC)’, a type of company that aims to realize social value alongside shareholder profit. (Anthropic - Wikipedia) Their mission is to "build safe AI that helps the long-term well-being of humanity." (Home - Anthropic)

However, it comes as a shock that this AI giant, valued at billions of dollars, cannot resolve a simple billing error for the real, human customers who trust it and pay for its service. This has led to criticism that cutting-edge tech companies are so preoccupied with scaling their services that they are neglecting customer support, the most basic form of trust with their users. For all the grand slogans about helping humanity, they are failing to hear the voice of even a single user right in front of them.

Easy Understanding: A ‘Broken Meter’ and ‘Talking to a Wall’

The core of this crisis can be summarized in two main points. Let’s use analogies to make the situation easier to grasp.

1. A Digital Meter Running Wild

The problem users are facing is like a water meter spinning frantically even when you haven’t turned on the tap. Anthropic’s internal ‘Usage meter’ (the digital device that calculates how much AI you’ve used) is displaying incorrect values, generating ‘Extra Usage’ fees far beyond what users actually consumed. (I’ve been waiting over a month for Anthropic support to respond to …)

In particular, reports are flooding in from power users on the ‘Claude Max’ plan who have seen unexpected charges of around $180. (I’ve been waiting over a month for Anthropic support to …)

2. The AI Agent’s ‘Moebius Strip’

An even bigger problem is that the communication channel for resolving issues is effectively closed. When you report a billing error, an AI agent responds instantly. But this AI simply repeats, "Use the in-app refund menu." (I’ve been waiting over a month for Anthropic support to respond) The ‘extra usage overcharging’ problem users are experiencing is not something that specific menu can resolve.

The user sends another email, but they fall into a state of ‘no response’ for over a month without ever hearing from a human agent. Simply put, they are in a state of ‘digital isolation,’ blocked by an AI wall and unable to reach a real person.

Current Situation: A Growing List of Frustrated Victims

Similar cases are pouring into online communities and GitHub, the developers’ gathering place.

In contrast, Anthropic recently made a flashy announcement that it would donate $100 million worth of usage credits to the cybersecurity field. (Project Glasswing: Securing critical software for the AI era - Anthropic) Users feel a deep sense of betrayal seeing this indifference toward individual billing errors while the company engages in such large-scale donation activities.

What’s Next?

Currently, Anthropic’s billing system is quite rigid in terms of user convenience. For example, users cannot even change their billing date directly. To change the date, they must go through the cumbersome process of canceling their existing subscription and resubscribing on the desired date. ([Paid Plan Billing FAQs | Claude Help Center](https://support.claude.com/en/articles/8325618-paid-plan-billing-faqs))

As these service deficiencies pile up, fundamental questions are being raised about whether Anthropic truly qualifies as a ‘public benefit corporation.’ If Anthropic does not resolve this issue quickly, users may turn their backs at any time, no matter how capable its AI models are. Even for a company at the pinnacle of AI technology, the future can hardly be called bright if it cannot keep its most basic promise to customers.

MindTickleBytes AI Reporter’s View

It is truly regrettable that Claude, which surprises us with its human-like intelligent conversation, offers only parroted answers when humans ask questions about their money. No matter how brilliantly technology advances, the ones who choose and use that technology are ultimately people. I sincerely hope that the grand goal of ‘human well-being’ that Anthropic proclaims also includes the peace of mind of paid users losing sleep over billing errors.

References

  1. Anthropic - Wikipedia
  2. I’ve also been waiting over three weeks to speak with customer support after being gifted an annual subscription just as my payment card expired…
  3. I’ve been waiting over a month for Anthropic support to respond to …
  4. I’ve been waiting over a month for Anthropic support to …
  5. [2026-04-09] I’ve been waiting over a month for Anthropic to …
  6. I’ve been waiting over a month for Anthropic support to respond
  7. Project Glasswing: Securing critical software for the AI era - Anthropic
  8. Home - Anthropic
  9. [Paid Plan Billing FAQs | Claude Help Center](https://support.claude.com/en/articles/8325618-paid-plan-billing-faqs)

FACT-CHECK SUMMARY

  • Claims checked: 12
  • Claims verified: 12
  • Verdict: PASS
Test Your Understanding
Q1. What form of enterprise does Anthropic operate as?
  • A purely for-profit corporation
  • A Public Benefit Corporation (PBC)
  • A non-profit organization
Anthropic is a Public Benefit Corporation (PBC) aiming to contribute to the long-term well-being of humanity, ensure benefits, and mitigate risks.
Q2. What is the main billing-related problem users are experiencing?
  • Billing free users by mistake
  • Overcharging for unused 'Extra Usage' and receiving no response from the customer center
  • A system where subscription cancellation is impossible
Users are reporting incorrect 'Extra Usage' charges caused by usage meter errors and a lack of response from the support team for over a month.
Q3. What problem occurs at the first step of Anthropic's customer support system?
  • Phone calls are never connected
  • An AI agent only suggests inappropriate solutions and fails to connect to a human agent
  • The support email address does not exist
When an inquiry is sent, an AI agent responds immediately but only guides users through an automatic refund process that is unhelpful for the actual issue, followed by no further response.