Tell HN: Fiverr left customer files public and searchable

by morpheuskafka 232 comments 831 points
[−] evmaki 30d ago
Extremely bad stuff here. Can't believe it's been 7 hours now and you can still pull up people's complete prepared tax returns right from a Google search. This should be a business-ending breach of trust and good practices, but I worry there's probably a lack of regulatory might or will to make anything happen.
[−] morpheuskafka 30d ago
The company put out its first statement:

> “Fiverr does not proactively expose users’ private information. The content in question was shared by users in the normal course of marketplace activity to showcase work samples, under agreements and approvals between buyers and sellers. This type of content requires the buyer’s explicit consent before it can be uploaded. As always, any request to remove content is handled promptly by our team."

https://sqmagazine.co.uk/fiverr-security-flaw-private-docume...

It sounds like they are trying to claim the users involved published the links and that's why they are on Google? But how could anyone believe that multiple users intentionally published their SSN?

Re the takedown, I'm also guessing it's from Cloudinary. Maybe based on the HTTP Referer header?

[−] janoelze 30d ago
The DMCA takedown also suggests at least one user was not aware of that file being public. This all comes down to what that "sharing" action specifically looked like.

ChatGPT recently had a similar case with the sharing feature on conversations leading to publicly indexed convos. That incident would have also matched the implied definition of sharing here.

[−] deepserket 30d ago
It looks like they (cloudinary?) blocked the content.

Each result from the query site:fiverr-res.cloudinary.com form 1040 now returns a 404

[−] abustamam 30d ago
Yikes! It should not require the service provider to block PII, but at least someone plugged the leak.
[−] TkTech 30d ago
It's very unfortunate but a significant amount of the most damaging stuff in this is from the underprivileged and those with minimal means who were trying to find help they could afford. Non-profits trying to get website help, confidential reports for charities trying to get translations, children seeking therapy (fiverr has a therapy category!?) for some truly dark stuff.

Utterly inexcusable that this is still up after so many hours.

[−] mellosouls 30d ago
Technically, 40 days and 7 hours!
[−] ChrisMarshallNY 30d ago
...and forty nights...
[−] kkarpkkarp 30d ago
...since you leaked my data away
[−] applfanboysbgon 30d ago
Software development jobs are too accessible. Jobs with access to/control over millions of people's data should require some kind of genuine software engineering certification, and there should be business-cratering fines for something as egregious as completely ignoring security reports. It is ridiculous how we've completely normalised leaks like this on a weekly or almost-daily basis.
[−] morpheuskafka 30d ago
At my last job, I opened up Shodan in my free time and clicked through our ASN with the free filters. In two minutes I found multiple iDRACs online. Surprisingly, none had the default password. But one had a years-old vulnerability with a public exploit allowing takeover...

Turns out during a firewall hardware migration years ago, several units' firewalls were switched to audit mode (not enforcing rules). So an entire institute (health research!) had its whole subnet public with zero firewalls, both the server OS and the iDRAC interfaces. Per Dell, iDRAC isn't even supposed to be on the same VLAN, let alone on the internet.

To top it off, after filing some tickets from Shodan findings (admittedly not all as serious, e.g. MFP web UIs on the internet), I got pushback from the firewall team for making units submit too many changes.

I also got in trouble with our Qualys analyst for undermining his work because he hadn't gotten to that unit's annual review yet, even though I didn't even have a Qualys login. (And even if I had found it there, since when do we wait for annual reviews to fix something like that?)

It took at least three weeks internally to get it fixed, and by that I mean only the iDRAC IP blocked with the server itself still wide open.

And that's only because I mentioned it to my manager (awesome guy, and not formally responsible for firewall rules) after an unrelated incident involving another host with no firewall came through and he authorized an emergency rule.

[−] bblb 30d ago
Huawei Enterprise devices tend to have a CAPTCHA by default on their BMC/OOB GUIs or the other various system/infrastructure service GUIs (such as the HuaweiCloud/FusionCloud products). I'm guessing the reason is that people leave the management ports and GUIs wide open to the public Internet, so the CAPTCHA is protecting at least from the very basic script kiddie bots.
[−] morpheuskafka 30d ago
They may be part of it, but as a publicly traded company, there have got to be at least a few people there with a fancy pedigree (not that that actually means they are good at their job or care). But if such a test existed, they presumably would have passed it.

They also have an ISO 27001 certificate (they try to claim a bunch of AWS's certs by proxy on their security page, which is ironic, as they say AWS stores most of their data while apparently all uploads live on this Cloudinary domain).

[−] trollbridge 30d ago
A while ago I had a customer come to me who had a simple Shopify site and fell for a phishing type of attack where someone simply had an email like "shopify_security at gmail" and kept telling her she needed to apply all kinds of changes. They laundered the payments through Fiverr.

Then they would install WordPress plugins to make the site worse and claim even more "work" was needed.

I documented the entire thing, including my own credentials, and sent it off to Fiverr. Fiverr's response was everything was fine and there was nothing they could do about it, even though it was obvious fraud.

Google never did anything about it either, nor did Shopify.

Given how they handled such a minor situation like that... I guess it shouldn't be surprising they're just asleep at the switch for a major one like this.

[−] bluefirebrand 30d ago

> But if such a test existed, they presumably would have passed it

Sure, and now they could have their credentials revoked, potentially be legally liable, and never find work in this field again, which would prevent them from cocking up another company this way

[−] Aurornis 30d ago

> should require some kind of genuine software engineering certification

Wouldn't change a thing, other than add another hassle you have to pay for to do your job.

This is the result of carelessness, not someone who didn't know that private data should be private because they weren't certified.

[−] applfanboysbgon 30d ago
This is the result of somebody who has no idea how the fuck the tech they're using works. They surely knew it should be private, but they did not know that they were making it publicly available, because they were blindly fumbling their way around in a job beyond their competence level. There is a 0% chance this was ordinary carelessness in the form of "I know better but don't care enough"; this is so clearly a case of "I don't know what I'm doing".
[−] Aurornis 30d ago
Any time someone tries to suggest certification as a solution I ask the same question: How would it have solved this problem?

Would the certification require someone to take an official certification test for the framework used?

And therefore we’re only allowed to use frameworks which have certification tests available?

If you want to write some new software, do you have to generate a certification for it and get that approved so people are allowed to use it?

Sounds like a great way to force us all to use Big Company approved software because they’re the only ones with pockets deep enough to play all of the certification games

[−] fsflover 30d ago
How did the original engineering certifications prevent dangerous constructions? Did they force everyone to use a Big Company?
[−] applfanboysbgon 30d ago
The fact that you're thinking purely in frameworks is the exact problem that plagues the software industry. Framework-focused development is why we're in this mess; frameworks make it easy for people who don't understand how to program to publish shitty software by copying-and-pasting code and fudging around a few strings or variables to match their use case. That kind of accessibility is great for low-stakes software, letting anyone make interesting toys, but should be completely unacceptable in a professional environment with, for example, people's fucking tax documentation at stake.

If I had my way, the certification process starts at the bottom of the stack, ie. you should be expected to have a functional knowledge of assembly instructions, memory management, registers, the call stack, and build up from there. Not that we need to write assembly on a daily basis, but all of the abstractions are built on top of that, and you cannot realistically engineer secure software if you don't understand what is being abstracted away. If you do understand the things being abstracted away, you have the fundamentals necessary to do good work with any programming language or framework. Throw in another certification starting from networking fundamentals if your job involves that. 30 years ago, most professional programmers had this level of understanding as table stakes, so we can hardly say it's an unrealistic burden that's impossible to meet.

Would it be a higher barrier to entry that massively cuts the size of the field working on sensitive software and slows software development down? Yes. That is exactly what we need. There was a time when people built bridges that collapsed; then we implemented standards and expected engineers to do real work to make sure that didn't happen. Is that work expensive and expertise-intensive? Yes. Do bridges still collapse? Only very rarely. We are witnessing software bridge collapses on a weekly basis, which should be seen as completely unacceptable. The harm is less obvious than when everyone on a bridge dies, but I do think that routinely leaking millions of people's sensitive data causes serious harm and likely does lead to people dying through second-order effects.

[−] bruce511 30d ago
I follow your logic here, and it's certainly a coherent argument.

That said, there are perhaps some factors you are overlooking which matter.

The first is that no amount of certification solves the actual problem (which is that security mistakes are made, often in new and novel ways.)

Secondly, the amount of software needed (and produced) is immense. Bridges require engineers, but the demand for new bridges is tiny. The demand for new software is enormous, and the current rate of production requires many more people than could ever be certified.

In other words, say you only allowed comp-sci graduates with a proper 4 year degree, covering assembly upwards etc. The supply of programmers would drop to what colleges could produce. Which is not nearly enough.

The analogy also falls down a bit on penalty-for-failure, a collapsed bridge kills people, bugs in my notepad app might lead to information leaks? Thats not the same thing.

In truth, at least for the last 35 years, the number of unqualified developers exceed qualified ones by orders of magnitude. And there still seems to be no limit to software demand.

Finally, there have been no studies I am aware of that suggest security flaws are added more frequently by non comp-sci grads compared to comp-sci grads. Anecdotally I don't see that distinction myself. (From my observation, security outcomes correlate with the degree to which the individual considers security to be important.)

And, of course, security issues are not limited to programmers; management has a role to play as well. Should they be certified too?

So, I'm not convinced that your suggestion, however desirable, would solve the problem. And since it's clearly unimplementable in the real world it's a moot argument anyway.

[−] applfanboysbgon 30d ago
"Bridges" are shorthand. There is no shortage of need for new infrastructure. Any kind of construction needs engineers involved to ensure what's being built doesn't collapse from a gust of wind. Apparently, in the US, there are about 1.5 million engineers and 4.5 million software developers. Well, I think in the short term, certifying only 1.5 million "software engineers" would be fine, actually. Note that my argument pertains only to sensitive software. If you want to make software that doesn't pose a danger to its users, you don't need an "engineer". This should have the second-order benefit of making PII toxic waste: if you need a real engineering team to process PII, companies that don't need PII will stop scraping every last fucking thing and leaking it. The majority of software in the world doesn't actually need PII to function; companies could just be incentivized to stop hoarding it and use a regular "software development" team if they want to deliver cheap and fast.

I also wouldn't specifically associate this with college degrees. In fact I think universities are doing a shockingly bad job of producing functional software developers. But, on the other hand, you don't need a university to produce a good programmer. Software development is possibly the most open, information-available discipline in the world. Self-motivated learners can absolutely become competent on their own. The certification should be merit-based, and provide a clear path to learning the material the certification is based on. Many people will go through the effort to educate themselves and learn the required skills, especially if certified software engineers are in high demand and command a higher salary.

Regarding the penalty-for-failure, as I said, the harm is not as immediately apparent as when people die in a bridge collapse. But leaking sensitive information still leads to people dying, even if the connection is not as direct. Doxxing and blackmail frequently lead to suicide, and there are other damages that could lead to a butterfly effect culminating in a higher death rate, or, even if not death, tangible harm. This leak contained birth certificates, IDs, passports, tax documentation, passwords, all kinds of information that could be used to ruin someone's life with identity fraud. There is also, of course, some software in the world that is directly safety-critical, much of the software used in the health field for instance, which is also currently being written by the lowest bidder in many cases.

Regarding management, they don't need a certification but rather consequences for their actions. Currently the incentive structure is such that management is rewarded for cutting costs and is never punished for harming customers. Fiverr, for instance, should be facing an investigation that threatens to shut down the business given that not only did this happen in the first place, and not only did they ignore it for 40 days, but even after it went public the sensitive files were still accessible for 12+ hours (notably, after they were definitely made aware of it, given reports in this thread of people receiving replies from Fiverr about it). Maybe throw in some criminal liability for the people most responsible for a situation this horrible. Management would tighten up real quick.

I don't agree that this is unimplementable in the real world at all. If anything it's a complete abnormality that software development is the way it is, when most other skilled professions are licensed and regulated.

[−] joseangel_sc 30d ago
i have bad news for you
[−] ryandrake 30d ago
The certification obviously would have to have teeth. A certification that you needed in order to do work as a software professional, which could be revoked for cases of carelessness or negligence, would disincentivize carelessness and negligence.

This is how airline pilot certificates work. And in that career, certification actually works. It's not a miracle or unexplainable.

[−] throwanem 30d ago

> Would the certification require someone to take an official certification test for the framework used?

> And therefore we’re only allowed to use frameworks which have certification tests available?

When it's safety-critical, yes, absolutely. A service that handles sensitive PII, such as the one whose "engineers" should be prosecuted for this incident, is definitionally safety-critical.

If you're afraid in that world you'd be unable to work, maybe you deserve to be.

[−] hilariously 30d ago
It's so much worse in the industry than people realize. The truth is that many people literally have no idea how to secure things, what to secure, or why to secure it. They pay no attention, are plainly ignorant of the state of the world, and are oftentimes just stupid.

I worked at a company where a customer called, confused, because when they googled our company (as they did every day to log in to their portal) they found that the drivers' licenses we stored were available on the public internet.

The devs literally didn't know about insecure direct object references and thought obfuscation was enough, didn't know how robots.txt worked, didn't know about the Google webmaster stuff, didn't know about sitemaps. They were just the cheapest labor the company could find who could do the thing.

This is a huge portion of outsourced labor in my experience, not because they are worse overseas in any respect, but because the people looking for cheap labor were always looking for the cheapest labor and had no idea how that applied to the actual technical work of running their business.
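To make the point above concrete, here is a minimal sketch of the authorization check those devs skipped (the names and IDs are hypothetical, not anyone's actual code). An obfuscated ID is still a public URL: a crawler, or anyone holding the link, can fetch the object unless every request is checked against ownership:

```python
# Minimal sketch of the missing ownership check (hypothetical names;
# not the commenter's or Fiverr's actual code). An unguessable ID only
# makes the URL harder to guess; once the link leaks or gets crawled,
# nothing stops retrieval unless each request is authorized.

DOCUMENTS = {
    # object_id -> owning user
    "a3f9c2e1": "alice",
    "7bd0415f": "bob",
}

def fetch_document(requesting_user, object_id):
    """Serve a document only to its owner; obfuscation alone doesn't help."""
    owner = DOCUMENTS.get(object_id)
    if owner is None:
        return "404 Not Found"
    if requesting_user != owner:  # the authorization step obfuscation skips
        return "403 Forbidden"
    return f"contents of {object_id}"

# A search-engine crawler has no session, so it gets refused even with
# the exact URL in hand:
print(fetch_document(None, "a3f9c2e1"))     # 403 Forbidden
print(fetch_document("alice", "a3f9c2e1"))  # contents of a3f9c2e1
```

The whole point is that the check runs on every request, server side; robots.txt and unlisted URLs only affect well-behaved crawlers.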

[−] jval43 30d ago

> they were just the cheapest labor the company could find who could do the thing.

That's the problem right there. The company doesn't care. No amount of personal certifications is going to fix that.

It MUST be on the companies. They should be fined out of existence for such breaches and they would quickly change tune.

[−] ChrisMarshallNY 30d ago
> They should be fined out of existence for such breaches and they would quickly change tune.

Looks like this is a great opportunity for an object lesson. Let’s see how it goes…

As far as certification stuff…

Civil engineering has had licensing forever. That’s because Bad Things Happen, when they make mistakes.

I do think that it would be a good idea to score/certify critical infrastructure stuff. That might involve certification of the people that make it, but it should certainly involve penalties for the people responsible. That might include the authors, but it should probably also include the folks that decide to use the bad code.

I know that ISO 9000 is an attempt to address this kind of thing. In my opinion, it’s kind of a mess. I’ve worked in ISO 9000 shops, and it’s not much fun. The thing you learn, pretty quickly, is how to end-run the process, as it’s so heavy, that it basically stops all forward progress. It doesn’t have to, but often does.

Mistakes get made. If you design carefully, these mistakes won’t cause real damage.

I just figured out that an app I wrote, that’s been out for two years, has an embarrassing bug (mea culpa). I’ll get it fixed today.

Because I’m pretty careful, it doesn’t affect stuff like user privacy. It just introduces performance overhead, in one operation, so the fix will mean that the app will suddenly speed up.

I’m not sure that certification would have solved it. My security mindset is why user privacy wasn’t affected, and that comes from experience.

> Good judgment comes from experience. Experience comes from bad judgement.

[−] Orygin 30d ago
Also, if you are personally liable for gross negligence, you will:

1. Get paid more (as less fake "engineers" are available for the responsibility).

2. Push back harder (or at least document in detail) on malpractice during development. Manager did not listen to your warnings? Document it and when shit hits the fan, the manager gets the stick instead of you.

Hitting companies with monetary fines does not work. Hitting the employees with jail time will make sure they don't sign on dangerous or known problematic systems.

Manager not listening? Remind them they will face a trial if the issue does surface.

[−] ailef 29d ago

> Hitting companies with monetary fines does not work. Hitting the employees with jail time will make sure they don't sign on dangerous or known problematic systems.

What!? So, when you can't switch jobs because the market is bad or for any other reason, your choices are: 1) quit and lose the income (which you can't afford) or 2) sign on whatever and accept the risk of jail time?

[−] Orygin 29d ago
The job market in such society would not be the same as it is now.

If you are certified, chances are you will have lots of choices to work.

[−] ChrisMarshallNY 29d ago
Sounds like every other vocation out there.

Software devs have been insanely privileged, for the last couple of decades. That seems to be changing.

[−] seemaze 30d ago

> Wouldn't change a thing...

That's exactly what certification or licensure does; it imposes financial, civil, and criminal penalties for malpractice.

The liability of incurring penalties quickly outweighs the benefit of arbitraging costs with an unqualified practitioner.

[−] hurflmurfl 30d ago
I think just putting it on the companies is enough. If the fines are serious and can put your company out of business, and are enforced, then the companies themselves will probably work out processes for not doing stupid stuff. Whether that be creating some sort of certifications that would be prized by the companies, knowing to hire a specialized team for a security review, or anything else.

If everyone knows that messing up security gets you in real trouble and the company loses real money, and it happens all the time, and it's not just "Facebook fined $x million for doing shady stuff", then I think the industry will adapt.

Like when GDPR got released and no matter if I thought we are or are not handling PII, I had to read up and double-check my assumptions just because it was being talked about all over the place and it would be embarrassing to be caught with your pants down when you didn't actually intend to do a shady thing.

[−] Orygin 30d ago

> I think just putting it on the companies is enough. If the fines are serious and can put your company out of business

They don't care. It's either never enough to make them care, or the company can just go bankrupt and you go do something else.

If you or your manager has the threat of jail in the back of their mind, it's no longer just someone else's money being lost, it's personal.

> If everyone knows that messing up security gets you in real trouble and the company loses real money

There are already huge fines on paper for this, but the fines are never enough. It's always factored into the "cost of doing business". Also, it's still someone else's money; why would an engineer care?

Please show me a GDPR fine that hit hard enough to scare companies into not fucking up. Evidently it was not enough for Fiverr.

Edit: Just to provide an example, Takata airbags have been recalled massively (if you don't know why, look it up), but the company is now bankrupt, and who is footing the bill? Their customers.

You cannot impose a fine on them, as the company is bankrupt (now, but that was always the plan). They deliberately sold dangerous airbags, and now what can you do so it doesn't happen again? Fine them some more? Or maybe throw a few execs in jail, because they knew of the problem and continued as usual.

[−] userbinator 30d ago
some kind of genuine software engineering certification

That only gives those in power another way to push people into toeing the line. There's enough corporate authoritarianism these days as it is already. Give Stallman's "Right to Read" a read. His dystopia is exactly where we're going to be headed quickly if we keep demanding someone to "do something".

"The optimal amount of fraud is nonzero."

"Those who give up freedom for security deserve neither."

[−] computably 30d ago
You're responding to literally 7 words out of context.

> Jobs with access to/control over millions of people's data should require some kind of genuine software engineering certification

FAANG, Fortune 500, etc., almost universally go out of their way to violate user freedom in pursuit of profit. Regulation is practically the only way to force megacorps to respect users' rights and improve their security, as evidenced by right-to-repair, surveillance/privacy, and so on.

And none of that has anything to do with users' individual rights to create, run, and modify their own software.

(Yes, regulatory capture exists, no, it doesn't mean all regulation is bad.)

[−] userbinator 29d ago
If the megacorps are going in that direction of being strictly regulated, the rest of the industry will follow. It's the general movement of the Overton Window that's the underlying issue.
[−] subscribed 29d ago
No, they won't. No one in their right mind "wants" ISO27001, ISO9001, SOC or multiple PITA certifications.

Companies do that because they want to attract a certain kind of customer and have enough spare manpower and money to go through this all year long.

...or they want to hold very sensitive data that requires *proven* processes, training, and skills.

My firm has several of these and we have to keep full compliance team and *always* have some auditor on site.

No one does it just because.

[−] victorbjorklund 30d ago
I once worked at a company and noticed that customer financial statements were publicly accessible. I took it to the software team and got the reply that no one had told them it should be behind authentication. Some people really don't use their own brains.
[−] 21asdffdsa12 30d ago
If you do, you get into trouble with the hierarchy: all those middle managers and responsibility-distributing committees would be unemployed.
[−] Loughla 30d ago
Teachers have to be licensed and keep up on licensing.

Plumbers. Electricians. Lawyers. Doctors. Hell, I have to get a license to run my own business.

Why shouldn't software come with a branch for licenses if you're working with sensitive data?

[−] coldtea 30d ago
We're going the other way: now any random vibe coded slop is the norm.
[−] bad_haircut72 30d ago
Normalize "vibe-plumbing"
[−] bombcar 30d ago
Both plumbing and wiring are "easier" in a way than programming, as they'll violently and potentially explosively let you know if you messed up, whereas programming lets you be blissfully unaware until you see your data plastered across the nightly news.
[−] bradleyankrom 30d ago
Hairdressers!
[−] ge96 30d ago
People at my company don't even lock their computer when they walk away from their desk. Which yeah it's in a controlled environment but still.
[−] fnimick 30d ago
At least I'm sure LLM tools deploying code to production won't result in this happening more frequently. "Make sure it's secure. Make no mistakes."
[−] philip1209 30d ago
good thing it's getting easier to code - nothing bad can come of this :-)
[−] borplk 29d ago
Unfortunately everything is going in the opposite direction.

We are in the age of AI-slop AI-everything AI-break-it AI-fix-it.

Software companies are competing with each other on how low they can push the quality and still get away with it.

There's no reward or incentive for paying attention to the details or the quality. In fact you will get penalised for it.

[−] pesus 30d ago
Wow, the other comments weren't exaggerating. This is really bad. If my tax returns or other data were part of this, I might consider legal action.

I wonder if somewhere like Wired/Ars Technica/404media might pick this up?

[−] mtmail 30d ago
You followed the correct reporting instructions.

https://www.fiverr.com/.well-known/security.txt only has "Contact: security@fiverr.com" and in their help pages they say "Fiverr operates a Bug Bounty program in collaboration with BugCrowd. If you discover a vulnerability, please reach out to security@fiverr.com to receive information about how to participate in our program."
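For what it's worth, RFC 9116 also makes an Expires field mandatory, which that file omits. A minimal conforming security.txt looks something like this (the Policy URL below is illustrative, not Fiverr's real page):

```
# Contact is required; Expires is also mandatory under RFC 9116
Contact: mailto:security@fiverr.com
Expires: 2026-12-31T23:59:00.000Z
# Optional but useful (Policy URL here is hypothetical)
Policy: https://www.fiverr.com/security-policy
Canonical: https://www.fiverr.com/.well-known/security.txt
```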

[−] viaredux 30d ago
I am a freelancer on Fiverr, and this is VERY concerning. The amount of PII that I have sent over Fiverr, after sending NDAs, is potentially all out in the public. I hope there will be accountability for this. IMO Fiverr has had terrible management for years! They simply do not care about their freelancers (and apparently also not about their customers).
[−] HeliumHydride 30d ago
It seems that someone sent a DMCA complaint months ago relating to this: https://lumendatabase.org/notices/53130362
[−] gregsadetsky 30d ago
I wrote to security@fiverr.com and they just replied:

"You’re the second person to flag this issue to us

Please note that our records show no contact with Fiverr security regarding this matter ~40 days ago unlike the poster claims. We are currently working to resolve the situation"

[−] wxw 30d ago
Wow, surprised this isn't blowing up more. Leaking form 1040s is egregious, let alone getting them indexed by Google...
[−] viaredux 30d ago
Update: Fiverr denies allegations of a cybersecurity incident on X.

“To be clear, this is not a cyber incident. Fiverr does not proactively expose users’ private information. The content in question was shared by users in the normal course of marketplace activity to showcase work samples, under agreements and approvals between buyers and sellers. This type of content requires the buyer’s consent before it can be uploaded. As always, any request to remove content is handled promptly by our team.”

[−] qingcharles 30d ago
That's wild. Thousands of SSNs in there. Also a lot of Fiverr folks selling digital products and all their PDF courses are being returned for free in the search results.
[−] npilk 30d ago
Remember, if you use Google to access any of this “private” information, you’re a hacker and the state of Missouri might try to arrest you!

https://missouriindependent.com/2021/10/14/missouri-governor...

[−] janoelze 30d ago
really bad stuff in the results. very easy to find API tokens, penetration test reports, confidential PDFs, internal APIs. Fiverr needs to immediately block all static asset access until this is resolved. business continuity should not be a concern here.
[−] Barbing 30d ago
@dang the example query feels incredibly doxxy, and it feels like bad form to link directly to full copies of people's [stuff] and [personal info] as seen on this page :/

I know this is all Fiverr's fault for allegedly ignoring the responsible disclosure, but is this now the ideal way for us to discuss it, with these particular examples? I ask not to spare Fiverr, but I would be so mad if mine were the first result in OP or my personal info were linked directly...

[−] 101008 30d ago
There's health stuff in there too... and they are not even paying attention to this matter

https://fiverr-res.cloudinary.com/image/upload/f_pdf,q_auto/...

[−] janoelze 30d ago
it's been 5 hours. even manual action to take down the most sensitive files should have completed about 3 hours ago at most. what is happening.
[−] johnmlussier 30d ago
Probably not in scope but maybe https://bugcrowd.com/engagements/cloudinary will care?

This is bad.

[−] viaredux 30d ago
I tried to alert other freelancers on the Fiverr forums about this, but my post got deleted for 'violating community rules'. I don't see how it violated the rules; suspicious to say the least.
[−] psygn89 30d ago
I guess they used Fiverr for security
[−] pcblues 30d ago
In spite of how the pollies sell it, regulation is the friend of anyone earning less than one million dollars per year. Regulation would fix this. Get on it.
[−] deepserket 30d ago
Answer from Fiverr:

"To be clear, this is not a cyber incident. Fiverr does not proactively expose users' private information. The content in question was shared by users in the normal course of marketplace activity to showcase work samples, under agreements and approvals between buyers and sellers. This type of content requires the buyer's consent before it can be uploaded. As always, any request to remove content is handled promptly by our team."

https://x.com/fiverr/status/2044389801495773339?s=20

[−] sergiotapia 30d ago
This is really bad: people's income, SSNs, and worse, straight up right there in the search results, even on Brave Search.
[−] fudged71 30d ago
It's been 10 hours and all the links in this comment section still work...
[−] ttrinity 25d ago
Non-dev here, I think the small business I work for was involved in this data leak. I started a project with a developer on Fiverr - he is very well rated and has an extensive portfolio, so I felt comfortable working with him. Last week, he gave me a requirement document that I filled out with our info (Stripe integration API keys, domain access, etc.), and a few days after I sent the PDF over Fiverr we received a fraudulent charge for 4 tickets to (of all places) SEAWORLD. It had the owner's full name, address, email, phone number, and debit card. At first I thought this was the dev himself, but after seeing this recent news I believe it could be Fiverr directly - any thoughts would be appreciated!
[−] ebbi 30d ago
I've been boycotting Fiverr, so I'm glad I'm not caught up in this. And judging by their response to this issue, I'm glad I've been boycotting it.
[−] figassis 30d ago
From what I’ve seen, this always ends in some small fine/settlement and “no admission of guilt”. This type of protection is the source of these mishaps.
[−] MyUltiDev 30d ago
The Cloudinary fix that nobody in this thread is naming is actually two lines: upload the asset with type set to authenticated instead of the default upload type, and generate a signed URL server-side with sign_url true whenever a logged-in user requests it. Once the asset is authenticated, the public URL stops resolving entirely, so even the Google-indexed copies go cold.

The reason Fiverr cannot just turn this on now is that they have years of stored messages where every reference uses the default public delivery type, and switching the existing media library from public to authenticated breaks every existing URL across the whole platform. That is the architectural brittleness someone upthread was pointing at, and it is also why the only realistic path forward for them is rotating new uploads to authenticated and accepting that the historical exposure is permanent.

What would actually catch this category of mistake earlier? An SDK default that refuses to upload anything as public unless you explicitly opt in.
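For concreteness, here's a minimal sketch of the signed-URL half, based on Cloudinary's documented s--signature-- URL scheme (the cloud name, public ID, transformation string, and secret below are all illustrative, not Fiverr's actual values):

```python
import base64
import hashlib

def signed_delivery_url(cloud_name: str, public_id: str, api_secret: str,
                        transformation: str = "f_pdf,q_auto") -> str:
    # Cloudinary signs the transformation string plus the public ID together
    # with the account's API secret, so a crawler cannot derive a working URL
    # without knowing the secret.
    to_sign = f"{transformation}/{public_id}{api_secret}"
    digest = hashlib.sha1(to_sign.encode()).digest()
    # First 8 characters of the URL-safe base64 digest, wrapped as s--...--
    sig = base64.urlsafe_b64encode(digest)[:8].decode()
    return (f"https://res.cloudinary.com/{cloud_name}/image/upload/"
            f"s--{sig}--/{transformation}/{public_id}")
```

With the asset stored as authenticated, the unsigned .../image/upload/f_pdf,q_auto/... form stops resolving, which is exactly what would make the already-indexed links go dead.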
[−] impish9208 30d ago
This is crazy! So many tax and other financial forms out in the open. But the most interesting file I’ve seen so far seems to be a book draft titled “HOOD NIGGA AFFIRMATIONS: A Collection of Affirming Anecdotes for Hood Niggas Everywhere”. I made it to page 27 out of 63.
[−] cleaning 30d ago
Wow, this is really, really bad. Insane that this hasn't been fixed yet; media outlets are going to have a fun time with this story.
[−] hoofhearted 30d ago
Still publicly available.

Just forwarded this post to a few members of Congress.

[−] yellow_lead 30d ago
The files are deleted now, that was fun while it lasted!
[−] rapfaria 30d ago
How big of a client is Fiverr? Surely Cloudinary would have alerts for an enterprise client leaking stuff?

Just insane

[−] dbg31415 30d ago
Fiverr probably hired someone from Fiverr to do the web build security.
[−] mraza007 30d ago
Woah, that's brutal. All that sensitive information just sitting out in public.
[−] nslsm 30d ago
I don't really see what the issue is here? Someone with access to the URL (either the freelancer or the client) leaked it to Google, Google crawled it. How is this Fiverr's fault? I mean, okay, they could sign URLs, but it's not like they left a folder with directory listing on. Someone with access to the URLs, not them, willingly made the URLs public.
[−] Nekorosu 30d ago
I was scammed on Fiverr myself, so I may be biased, but this feels consistent with the platform incentives I saw firsthand. The dispute process did not seem designed to deal well with coordinated abuse, and weak controls around sensitive files would point to the same broader issue: user safety and data handling do not appear to be high priorities.
[−] mellosouls 30d ago
For anybody who missed OP's note at the end:

Responsible Disclosure Note -- 40 days have passed since this was reported to the designated vulnerability email (security@fiverr.com). The security team did not reply.

[−] morpheuskafka 30d ago
Files are now returning 404s as of 09:00 UTC, 4/15.

Would be interesting if someone with an account can check if they are visible to intended users or not, and if so, if their mitigation is robust (signed URLs?).
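A quick outside-in check is just probing the status code (a sketch; the fetch hook is only there so the decision logic is separable from the network call, and the URL in the usage line is hypothetical):

```python
from urllib import error, request

def asset_is_public(url: str, fetch=None) -> bool:
    """Return True if the URL resolves (HTTP 200) with no credentials at all."""
    if fetch is None:
        def fetch(u):
            try:
                # HEAD is enough: we only care about the status, not the body.
                with request.urlopen(request.Request(u, method="HEAD")) as r:
                    return r.status
            except error.HTTPError as e:
                return e.code
    return fetch(url) == 200

# asset_is_public("https://fiverr-res.cloudinary.com/image/upload/.../some.pdf")
```

If logged-in users still see the files while this returns False, that points at some access gate rather than deletion; whether that gate is signed URLs or something weaker like Referer checks is the interesting part.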

[−] epaga 30d ago
I tried posting a warning to /r/fiverr but the admins removed the post. And the files are STILL public...how in the world is "sitting it out" their course of action?

Edit: I'm beginning to wonder if they might be locked out of their own site at this point. How hard could it be to just shut down the asset server until they get it sorted?

[−] yieldcrv 30d ago
this is a bad leak, appreciate the attempts at disclosure before this
[−] unkl_ 30d ago
Looks like the cloudinary links are returning 404 now
[−] smashah 30d ago
They bought and.co and then dropped it. strange company
[−] BoredPositron 30d ago
Just from scrolling through it, this is really rough.
[−] csomar 30d ago
Given the existing DMCA requests and the fact that Google has become way less aggressive about indexing this stuff, it's clear this has been going on for a while. My guess is they've gutted enough of their internal processes that they literally can't restrict access to these files without breaking their own platform.

You really can't make this shit up: https://www.linkedin.com/feed/update/urn:li:activity:7445526...

The real question is: will Fiverr be the first company to truly crash and burn from an "AI-first" approach? Go LLM, go mayhem!

[−] vswaroop04 30d ago
Obvious! They don't care about freelancers
[−] eudamoniac 30d ago
The stock hasn't moved. Puts?