Besides the PDF-processing value-add, Cloudinary effectively acts like S3 here, serving assets directly to the web client. Like S3, it supports signed/expiring URLs. However, Fiverr opted to use public URLs, not signed ones, for sensitive client-worker communication.
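For anyone unfamiliar with the pattern: a signed/expiring URL embeds an expiry timestamp plus an HMAC over the path, so possessing the link alone isn't enough once it expires or if it's tampered with. Here's a minimal generic sketch of the idea (S3 and Cloudinary each implement their own variants; the key and function names below are hypothetical, not either vendor's API):

```python
import hashlib
import hmac
import time
from urllib.parse import urlencode

SECRET = b"hypothetical-signing-key"  # server-side secret, never sent to clients

def signed_url(base, path, ttl=3600, now=None):
    """Return a URL that stops working `ttl` seconds from `now`."""
    expires = int(now if now is not None else time.time()) + ttl
    msg = f"{path}?expires={expires}".encode()
    sig = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
    return f"{base}{path}?{urlencode({'expires': expires, 'sig': sig})}"

def verify(path, expires, sig, now=None):
    """Server side: recompute the HMAC and check the expiry."""
    now = now if now is not None else time.time()
    msg = f"{path}?expires={int(expires)}".encode()
    expected = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, sig) and now < int(expires)
```

With this scheme, a leaked or Google-indexed link goes dead on its own after the TTL, instead of serving the PDF forever.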
Moreover, it seems like they may be serving public HTML somewhere that links to these files. As a result, hundreds are in Google search results, many containing PII.
Example query: site:fiverr-res.cloudinary.com form 1040
In fact, Fiverr actively buys Google Ads for keywords like "form 1234 filing" despite knowing that it does not adequately secure the resulting work product, causing the preparer to violate the GLBA/FTC Safeguards Rule.
Responsible Disclosure Note -- 40 days have passed since this was reported to the designated vulnerability email (security@fiverr.com). The security team did not reply. Therefore, this is being made public: it doesn't seem eligible for CVE/CERT processing since it isn't really a code vulnerability, and I don't know anyone else who would care about it.
I wonder if somewhere like Wired/Ars Technica/404media might pick this up?
This is too funny
https://fiverr-res.cloudinary.com/image/upload/f_pdf,q_auto/...
They also have an ISO 27001 certificate (they try to claim a bunch of AWS's certs by proxy on their security page, which is ironic, as they say AWS stores most of their data while apparently all uploads live on Cloudinary).
Then they would install WordPress plugins to make the site worse and claim even more "work" was needed.
I documented the entire thing, including my own credentials, and sent it off to Fiverr. Fiverr's response was that everything was fine and there was nothing they could do about it, even though it was obvious fraud.
Google never did anything about it either, nor did Shopify.
Given how they handled a minor situation like that... I guess it shouldn't be surprising that they're asleep at the switch for a major one like this.
Wouldn't change a thing, other than add another hassle you have to pay for to do your job.
This is the result of carelessness, not someone who didn't know that private data should be private because they weren't certified.
Would the certification require someone to take an official certification test for the framework used?
And therefore we’re only allowed to use frameworks which have certification tests available?
If you want to write some new software, do you have to generate a certification for it and get that approved so people are allowed to use it?
Sounds like a great way to force us all to use Big Company approved software because they’re the only ones with pockets deep enough to play all of the certification games
If I had my way, the certification process would start at the bottom of the stack, i.e. you should be expected to have a functional knowledge of assembly instructions, memory management, registers, and the call stack, and build up from there. Not that we need to write assembly on a daily basis, but all of the abstractions are built on top of that, and you cannot realistically engineer secure software if you don't understand what is being abstracted away. If you do understand the things being abstracted away, you have the fundamentals necessary to do good work with any programming language or framework. Throw in another certification starting from networking fundamentals if your job involves that. Thirty years ago, most professional programmers had this level of understanding as table stakes, so we can hardly say it's an unrealistic burden that's impossible to meet.
Would it be a higher barrier to entry that massively cuts the size of the field working on sensitive software and slows software development down? Yes. That is exactly what we need. There was a time when people built bridges that collapsed; then we implemented standards and expected engineers to do real work to make sure that didn't happen. Is that work expensive and expertise-intensive? Yes. Do bridges still collapse? Only very rarely. We are witnessing software bridge collapses on a weekly basis, which should be seen as completely unacceptable. The harm is less obvious than when everyone on a bridge dies, but I do think that routinely leaking millions of people's sensitive data causes serious harm and likely leads to people dying through second-order effects.
I worked at a company where a customer called, confused, because when they googled our company (as they did every day to log in to their portal), they found that driver's licenses we stored were available on the public internet.
The devs literally didn't know about insecure direct object references and thought obfuscation was enough; they didn't know how robots.txt worked, didn't know about the Google webmaster shit, didn't know about sitemaps. They were just the cheapest labor the company could find who could do the thing.
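The insecure-direct-object-reference mistake described above boils down to treating an unguessable ID as authorization. A minimal sketch of the difference (all names and data here are hypothetical, purely for illustration):

```python
# Hypothetical in-memory store standing in for a real database.
DOCUMENTS = {
    "a1b2c3": {"owner": "alice", "body": "alice's driver's license scan"},
}

def fetch_insecure(doc_id):
    # "Obfuscation" approach: anyone who obtains the ID -- a coworker,
    # a crawler that finds it linked in public HTML -- gets the document.
    return DOCUMENTS[doc_id]["body"]

def fetch_secure(doc_id, requesting_user):
    # Authorization check: knowing the ID is never enough by itself;
    # the authenticated requester must actually own the document.
    doc = DOCUMENTS.get(doc_id)
    if doc is None or doc["owner"] != requesting_user:
        raise PermissionError("not authorized")
    return doc["body"]
```

Once a URL containing the bare ID leaks, the first version is indistinguishable from a public file, which is exactly the Cloudinary situation in the OP.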
This describes a huge portion of outsourced labor in my experience: not because overseas workers are worse in any respect, but because the people looking for cheap labor were always looking for the cheapest labor and had no idea how that applied to the actual technical work of running their business.
Still get a few a week, but at least it’s public and amusing.
Plumbers. Electricians. Lawyers. Doctors. Hell, I have to get a license to run my own business.
Why shouldn't software come with a licensing requirement if you're working with sensitive data?
https://www.fiverr.com/.well-known/security.txt only has "Contact: security@fiverr.com" and in their help pages they say "Fiverr operates a Bug Bounty program in collaboration with BugCrowd. If you discover a vulnerability, please reach out to security@fiverr.com to receive information about how to participate in our program."
"You’re the second person to flag this issue to us
Please note that our records show no contact with Fiverr security regarding this matter ~40 days ago unlike the poster claims. We are currently working to resolve the situation"
(Technically, I guess that doesn't prove anything other than that it's in my Sent folder? It has a Message-ID, but I guess only the Purelymail admin could confirm that.)
In any event, this should never have required an outside reminder. The indexing issue may be something non-obvious, but the core decision not to use signed/expiring URLs is nothing less than good old security by obscurity.
Basically, they aren't set up for anyone to actually contact them and expect a resolution.
I don't think it even comes down to "lying". It's possible that they genuinely believe they didn't receive contact, but given that they are verifiably completely and totally incompetent and have no right to be employed in their current role, they've earned exactly zero benefit of doubt.
https://missouriindependent.com/2021/10/14/missouri-governor...
I know this is all Fiverr's fault for allegedly missing the responsible disclosure, but is this the ideal way for us to discuss it, with these particular examples? I ask not to spare Fiverr, but I would be so mad if I were the first result in the OP or had my personal info linked directly...
This is bad.
(Fiverr itself uses Bugcrowd but is private, having to first email their SOC as I did.)
https://fiverr-res.cloudinary.com/image/upload/f_pdf,q_auto/...
I found the author on Amazon and the book still hasn't been released
this is sad
I will say that the title is the best part
Just insane
You really can't make this shit up: https://www.linkedin.com/feed/update/urn:li:activity:7445526...
The real question is: will Fiverr be the first company to truly crash and burn from an "AI-first" approach? Go LLM, go mayhem!
This is not how Google works.
Today, a photo file might be hosted at:
But it used to be a little closer to:

And no auth required, URL only!

You need links to pages, either from your own website or backlinks from other websites. Alternatively, if the page is in your sitemap then Google will typically pick it up, or you can manually submit it for indexing. For important pages you would typically want internal links, backlinks, and inclusion in your sitemap.
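For reference, the sitemap mentioned above is just an XML list of URLs that you host and register with Google. A quick sketch of generating one with the standard library (the example URLs are hypothetical):

```python
from xml.etree import ElementTree as ET

def make_sitemap(urls):
    """Build a minimal sitemap.xml document listing the given page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    root = ET.Element("urlset", xmlns=ns)
    for u in urls:
        url_el = ET.SubElement(root, "url")
        ET.SubElement(url_el, "loc").text = u
    return ET.tostring(root, encoding="unicode")

sitemap = make_sitemap([
    "https://example.com/",
    "https://example.com/important-page",
])
```

You would serve the result at something like https://example.com/sitemap.xml and point to it from robots.txt or Search Console; none of this indexes a page that has no links, no sitemap entry, and no manual submission, which is the commenter's point.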