Backblaze has stopped backing up OneDrive and Dropbox folders and maybe others (rareese.com)

by rrreese 693 comments 1130 points

[−] julianozen 31d ago
We are going to drop Backblaze over this.

We discovered this change recently because my dad was looking for a file that Dropbox accidentally overwrote. At first we said, “No problem, this is why we pay for Backblaze.”

We then learned that this policy had changed a few months ago, and that we were never notified. The file was unrecoverable.

If anyone at Backblaze is reading this: I pay for your product so I can install you on my parents' machine and never worry about it again. You decided saving on cloud storage was worth breaking this promise. Bad, bad call.

[−] wafflebot 31d ago
I'm going to drop Backblaze for my entire company over this.

I need it to capture local data, even though that local data is getting synced to Google Drive. Where we sync our data really has nothing to do with Backblaze backing up the endpoint. We don't wholly trust sync, that's why we have backup.

On my personal Mac I have iCloud Drive syncing my desktop, and a while back iCloud ate a file I was working on. Backblaze had it captured, thankfully. But if they are going to exclude iCloud Drive synced folders, and it sounds like that is their intention, Backblaze is useless to me.

[−] moffkalast 31d ago
Bidirectional auto file sync is a fundamentally broken pattern and I'm tired of pretending it's not. It's just complete chaos, with the wrong files constantly getting overwritten on both ends.

I have no clue why people still use it and I'd cut my losses if I were you, either backup to the cloud or pull from it, not both at the same time like an absolute tictac.

[−] Aurornis 31d ago

> I have no clue why people still use it

This is an instance of someone familiar with complex file access patterns not understanding the normal use case for these services.

The people using these bidirectional sync services want last-writer-wins behavior. The mildly and moderately technical people I work with all get it and work with it. They know how to use the UI to look for old versions if someone accidentally overwrites their file.

Your characterization of it as complete chaos with constant problems does not mesh with the reality of the countless low-tech teams I've seen use Dropbox-type services since they launched.
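For readers unfamiliar with the model, "last writer wins" is easy to sketch. A toy reconciliation in Python (the `(mtime, content)` representation and all names here are illustrative, not any vendor's actual design):

```python
def merge(local: dict, remote: dict) -> dict:
    """Last-writer-wins reconciliation of two replicas.

    Each replica maps filename -> (mtime, content). For every file,
    whichever side has the newer timestamp wins; the other side's
    version silently disappears (server-side version history, where
    offered, is the only way back).
    """
    merged = {}
    for name in local.keys() | remote.keys():
        candidates = [v for v in (local.get(name), remote.get(name)) if v is not None]
        merged[name] = max(candidates, key=lambda entry: entry[0])
    return merged
```

Concurrent edits are exactly where this bites: if both machines change the same file before syncing, one edit wins and the other is dropped without any error, which is why these services pair the model with version history in the UI.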

[−] Anamon 31d ago
This would be half OK if it worked, but you can't trust it to. OneDrive, for instance, has had an open bug for years where it will randomly revert some of your files to a revision from several months earlier. You can detect this and recover from the history, but only if you know that it happened and where, which you usually won't, because it happens silently. I only noticed because it happened to an append-only text file I use daily.
[−] jcgl 30d ago
A specific implementation (OneDrive) doing something dumb doesn't invalidate the entire paradigm though. Things work just fine elsewhere (Dropbox, Google Drive, Nextcloud, and Seafile are all solutions I've had good experiences with).
[−] lobsterthief 30d ago
Agreed, I’ve been using Dropbox for 15 years with minimal issues. The key is to ensure it’s running and syncing with the proper settings on both machines.

What can get things into a weird state is if both machines are editing the same file while only one of them is actively syncing. But for basic backup and sync, this is extremely rare.

[−] sunnybeetroot 30d ago
Even crazier: OneDrive has a limit on the total length of a file path. How is this even a thing that exists?
[−] ahhhhnoooo 30d ago
Unlimited strings are a problem. People will use it as storage.

No, I'm not joking. We used to allow arbitrary paths in a cloud API I owned. Within about a month, someone had figured out that the cost to store a single-byte file was effectively zero, and that they could encode arbitrary files into the paths of those objects. It wasn't long before there was a library on GitHub to do it. We had to put limits on it because otherwise people would store their data in the path, not the file.
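The trick is simple enough to sketch in a few lines of Python (purely illustrative; I have no idea how the actual library worked): encode the payload, slice it into name-sized pieces, and create one zero-byte object per piece.

```python
import base64

SEGMENT = 200  # stay under a typical 255-char per-segment limit, leaving room for the index prefix

def encode_as_paths(data: bytes) -> list[str]:
    """Smuggle arbitrary bytes into the *names* of zero-byte objects."""
    text = base64.urlsafe_b64encode(data).decode("ascii")
    # A zero-padded index prefix lets an unordered listing be reassembled.
    return [f"{i:08d}-{text[j:j + SEGMENT]}"
            for i, j in enumerate(range(0, len(text), SEGMENT))]

def decode_from_paths(paths: list[str]) -> bytes:
    """Reassemble the payload from a (possibly shuffled) object listing."""
    text = "".join(p.split("-", 1)[1] for p in sorted(paths))
    return base64.urlsafe_b64decode(text.encode("ascii"))
```

If storing an object's name is free while its bytes are billed, this turns the namespace itself into free storage, hence the path-length limits.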

[−] jamesfinlayson 30d ago
I remember someone telling me that S3 used to be similarly abused - people were creating empty files and using S3 like a key-value store somehow, so AWS just jacked up the price of the S3 HeadObject API call to push people back to DynamoDB or whatever.
[−] garaetjjte 30d ago
Just include filename size in file size for billing purposes?
[−] ahhhhnoooo 30d ago
Not sufficient, unfortunately. The strings for file paths are stored in wholly different infrastructure with wholly different optimizations. It probably lives in your database. You really don't want people just stuffing gigabytes into that, payment or no payment. Odds are you didn't plan your control plane around, "what if someone uses our strings as encoded data?"
[−] OrangeMusic 30d ago
They won't do it if it's not free
[−] Barbing 30d ago
In the fine print, only to be used against bad actors (w/guarantee that filenames under x chars would never be charged), or that too problematic? building good faith into policy + "hiding" info...

Reason - to not overcomplicate or give appearance of nickel-and-diming

[−] fn-mote 30d ago
No, just charge for the amount of storage they use on your server. Not the amount of data you think you’re storing. In non-special cases these will be the same number.
[−] sunnybeetroot 30d ago
Wow alright I have learnt something thank you
[−] dns_snek 30d ago
What do you expect to happen when your cloud storage file path is 5000 characters long and your local filesystem only supports a maximum of 4096?
[−] zrm 30d ago
You expect the files to still be accessible using relative paths. What do you expect to happen if your cloud storage file path is 50 characters long and is mounted in a folder which is 4050 characters long when PATH_MAX is 4096?

The sync application itself can handle this using openat(2) or similar and should probably be using that regardless to avoid races.
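For the curious, the component-at-a-time approach is straightforward; a sketch using Python's `os.open` with `dir_fd`, which wraps `openat(2)` on POSIX systems:

```python
import os

def open_deep(segments: list[str], flags: int = os.O_RDONLY) -> int:
    """Open a file by walking one path component at a time.

    No single syscall ever sees more than one segment, so PATH_MAX
    constrains individual names but not total depth; holding an fd at
    each step also avoids races with concurrent renames.
    """
    fd = os.open(".", os.O_RDONLY | os.O_DIRECTORY)
    try:
        for seg in segments[:-1]:
            nxt = os.open(seg, os.O_RDONLY | os.O_DIRECTORY, dir_fd=fd)
            os.close(fd)
            fd = nxt
        return os.open(segments[-1], flags, dir_fd=fd)
    finally:
        os.close(fd)
```

A sync client built this way cares about per-segment limits only, no matter how deep the local mount point sits.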

[−] dns_snek 29d ago
Ah, I forgot that the maximum path length is usually limited by PATH_MAX; it's the path segment length that's usually limited by the filesystem.

Point taken, although I still think it's better for cloud storage services to err on the side of compatibility, i.e. what's the lowest common denominator between Linux, macOS, Android, iOS from 10 years ago and Windows 7?

[−] jamesfinlayson 30d ago
Oh yeah... I remember Windows behaving weirdly when I tried to copy some files with long names into a deeper directory tree. And it was just weird behaviour - no useful error message.
[−] fluoridation 30d ago
Windows in particular supports, at the API level, paths tens of thousands of characters long, much longer than Linux. The problem is that applications need to explicitly support such paths using the long-path syntax; otherwise they're limited to MAX_PATH, 260 characters.
[−] jamesfinlayson 29d ago
Yeah I thought there was some way of doing it, but weirdly it was explorer.exe that was behaving in odd ways.
[−] sunnybeetroot 30d ago
Great point I stand corrected
[−] thebrain 30d ago
Everything needs limits; otherwise someone will figure out how to break it, or break it accidentally.
[−] sunnybeetroot 30d ago
I stand corrected you’re right
[−] pseudohadamard 30d ago
Except the GNU stuff, which has as a design principle "no arbitrary limits". Meaning no limits at all, not "no sane limits":

  Avoid arbitrary limits on the length or number of any data structure, including filenames, lines, files, and symbols, by allocating all data structures dynamically.
I assume they're relying on the OOM Killer and quotas to prevent DoSes all over the place.
[−] omnimus 31d ago
I also have no clue why people use it.

You can build such a system yourself quite trivially by getting an FTP account, mounting it locally with curlftpfs, and then using SVN or CVS on the mounted filesystem. From Windows or Mac, this FTP account could be accessed through built-in software.

[−] snowwrestler 30d ago
This reference is 19 years old this month, in case anyone who recognized it was still feeling young.
[−] morganf 30d ago
noooooooooooo!!!!!!!!
[−] j1elo 31d ago
Wait a moment, you just gave me an idea for a product
[−] burnte 31d ago
One out of a thousand people might do that; the rest will buy the product. That's why people use it: most people don't want to build everything themselves.
[−] srdjanr 31d ago
[−] Dylan16807 30d ago
But as usual it forgets the "For a Linux user" part.

If we remove the whole Linux section and just ask "why not map a folder in Explorer?", it's a reasonable question, probably even more reasonable in 2026 than in 2007. The network got faster and more reliable, and Dropbox access got slower.

[−] burnte 29d ago
LOL I should have remembered. :D Sorry!
[−] zymhan 31d ago
This cannot be a serious proposal. You should probably talk to people who don't use technology because they love it, but because they need it.
[−] jibal 30d ago
[−] jkl5xx 30d ago
It’s kind of wild to read through these comments and realize HN is still riffing on the same ideas. Is it E2EE? Does it run on Linux? Who would pay for something you can slap together in a weekend with a few bash scripts? It really highlights this community’s values, skills, and blind spots. Also a bit of a bummer that the privacy and open-source situations today are even worse in many ways.
[−] pseudohadamard 30d ago
The equivalent of this is advice from a friend of mine who likes different teas, "just learn to read hanzi like I did and then you can select the ones you like". Apparently there's one called "government tea" (in the original) which he expected to taste of old leather, musty paperwork, and stale cigarette smoke.
[−] johnthescott 30d ago
or use rsync.net and a cronjob.
[−] lazide 30d ago
Obvious. Explorer even has support built in for transparent ‘native’ gui support. I’m not even sure why you felt the need to explain it in detail. Next you’ll be explaining how to walk. (/s, I loved it)
[−] PunchyHamster 31d ago
Slow as fuck compared to 2 synced dirs
[−] babypuncher 31d ago
I think this is a case of people using bidirectional file sync wrong. The point is to make the most up to date version of a file available across multiple devices, not to act as a backup or for collaboration between multiple users.

It works perfectly fine as long as you keep how it works in mind, and probably most importantly don't have multiple users working directly on the same file at once.

I've been using these systems for over a decade at this point and never had a problem. And if I ever do have one, my real backup solution has me covered.

[−] senkora 31d ago
+1. It works perfectly if your mental model is:

“Every file is only ever written to from a single client, and will be asynchronously made available to all other clients, and after some period of time has elapsed you can safely switch to always writing to the file from a different client”.

[−] PunchyHamster 31d ago
It works perfectly fine if you're a user who knows how it works. I use it with Syncthing, and it works because I know not to edit the same file at the same time on two devices (my third and fourth devices are always online, so changes propagate reasonably fast even if the two devices aren't on at the same time).

But the moment that hits normal users, yeah, mess

[−] Backblaze-Jim 29d ago
Hello, Jim from Backblaze here. I wanted to offer some insight into what happened with backing up cloud-synced folders.

It is true that we recently updated how Backblaze Computer Backup handles cloud-synced folders. This decision was driven by a consistent set of technical issues we were seeing at scale, including unreliable backups and incomplete restores when backing up files managed by third-party sync providers.

To give a bit more context on the “why”: these cloud storage providers now rely heavily on OS-level frameworks to manage sync state. On Windows, for example, files are often represented as reparse points via the Cloud Files API. While they can appear local, they are still system-managed placeholders, which makes it difficult to reliably back them up as standard on-disk files.
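As a rough illustration (a sketch, not Backblaze's actual client code): Python exposes the relevant Windows attribute, so a scanner can flag these placeholders before deciding how to handle them. On non-Windows platforms the attribute is simply absent.

```python
import os
import stat

def is_reparse_point(path: str) -> bool:
    """Detect Windows reparse points (e.g. Cloud Files API placeholders).

    OneDrive/Dropbox-style providers mark managed files with
    FILE_ATTRIBUTE_REPARSE_POINT: the directory entry looks like a
    normal file, but the payload may exist only in the cloud. On
    non-Windows platforms st_file_attributes is absent, so this
    returns False.
    """
    st = os.stat(path, follow_symlinks=False)
    attrs = getattr(st, "st_file_attributes", 0)
    return bool(attrs & stat.FILE_ATTRIBUTE_REPARSE_POINT)
```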

Moreover, we built our product not to back up reparse points, for two reasons:

1. We wanted the backup client to be light on the system and to back up only needed user-generated files.

2. We wanted the service to be unlimited, so following reparse points would lead to us backing up tons of data in the cloud.

We’ve made targeted investments where we can, for example, adding support for iCloud Drive by working within Apple’s model and supporting Google Drive, but extending that same level of support to third-party providers like Dropbox or OneDrive is more complex and not included in the current version.

We are currently exploring building an add-on that either follows reparse points or backs up the tagged data in another way.

We also hear you clearly on the communication gap. Both the sync providers and Backblaze should have been more proactive in notifying customers about a change with this level of impact. Please don't hesitate to reach out to me or our support team directly if you have any questions. https://help.backblaze.com/hc/en-us/requests/new

We are here to help.

[−] lazide 31d ago
Throw some clock skew into the mix and it’s even more hilarious!
[−] Gud 31d ago
Why is this downvoted?
[−] therealpygon 30d ago
Same. Specifically, I was considering Backblaze for our company’s backups (both products: Computer Backup for endpoints and B2 buckets for server backups). That is no longer the case as of this news.
[−] NR-backblaze 29d ago
Natasha from Backblaze here. Just wanted to let you know that we do back up iCloud data, as long as the files are stored locally on your device (not just in iCloud as “optimize storage” / cloud-only files).

If iCloud is set to keep full copies on disk, Backblaze will treat those like normal files and back them up.

More details here: https://www.backblaze.com/computer-backup/docs/en/back-up-ic...

Happy to answer any questions if anything’s unclear.

[−] huflungdung 30d ago
[dead]
[−] qsi 30d ago
This is very well put, and echoes my sentiments! I had installed Backblaze on my own home machine many, many years ago, and it has saved my bacon a few times. Since then I've also installed it on any family members' machines that required backup and recommended it to friends. And I've been happy to pay for the service.

The deal was that Backblaze backs things up and I don't have to worry about it. Learning that it does not back things up is a punch to the gut. I am familiar with the exclusions, and I have a look at that list to make sure I'm not missing anything from my backups. I had always thought the exclusions list was exhaustive.

Excluding other files and folders without telling me about it breaks the deal. Dropbox is important to several of the users I installed it for. Ignoring .git folders is another one that affects me and I had not known about. Ouch.

I will now have to look for alternatives. It has to be easy to install, run seamlessly on non-technical users' machines and be reliable.

I find it hard to think of a worse breach of trust for a backup service than not backing up files!

[−] petercooper 31d ago
I'm going to join the exodus, though for a different reason. I switched to OrbStack, and ever since, Backblaze has refused to back up, saying "disk full", because OrbStack uses an 8TB sparse disk image. You can exclude it, but if they won't (very easily) fix a known issue by measuring file sizes properly, I don't feel confident about the product.
[−] julianozen 31d ago
Also taking recommendations for a simple service I can install on my dad's Windows machine and my mom's Mac that will just automatically back up the main drive to the cloud, just in case.
[−] thescriptkiddie 30d ago
My parents also lost data that was supposed to be backed up on Backblaze, because they didn't use that computer for a month, and when they turned it back on the hard drive was dead. Apparently Backblaze silently deletes backups more than 30 days old, even if they are the newest backup, and then happily keeps billing you for not storing your data.
[−] NR-backblaze 29d ago
Natasha from Backblaze here. This is a situation people rely on backups for, and I can understand how frustrating it is to run into this when you need a restore.

I also want to be clear: this wasn’t about saving on storage. It came from cases where backing up cloud-synced folders (like Dropbox) was leading to unreliable or incomplete restores because of how those files are managed under the hood.

When Dropbox began using reparse points for synced files, those files no longer behaved like standard local files. Because of that, Backblaze Computer Backup can’t reliably back them up or restore them. The current behavior is focused on ensuring we only back up data we can reliably restore, and we are actively exploring ways to better support Dropbox and data touched by other sync services.

[−] ibizaman 31d ago
Why not just use backblaze as cold storage and use restic or another tool with a GUI to backup to it?

[−] varenc 30d ago
Dropbox itself should already keep version history of files for 30 days with the free plan, or more if you pay.
[−] azalemeth 31d ago
I guess the problem with Backblaze's business model with respect to Backblaze Personal is that it is "unlimited". They specifically exclude linux users because, well, we're nerds, r/datahoarders exists, and we have different ideas about what "unlimited" means. [1]

This is, in disguise, another example of two people disagreeing about what "unlimited" means in the context of backup, even if they do claim to have "no restrictions on file type or size" [2].

[1] https://www.reddit.com/r/backblaze/comments/jsrqoz/personal_... [2] https://www.backblaze.com/cloud-backup/personal

[−] Neil44 31d ago
The issue with a client app backing up Dropbox and OneDrive folders on your computer is the Files On-Demand feature: you could sync a 1TB OneDrive to your 250GB laptop, and that's OK because of smart/selective sync, aka Files On-Demand. Then the Backblaze client tries to back the folder up, requests a download of every single file, and now you have zero bytes free, still no backup, and a sick laptop. You could OAuth the Backblaze app to access OneDrive directly, but if you want to back your OneDrive up, you need a different product IMO.
[−] noirscape 31d ago
I can understand in theory why they wouldn't want to back up .git folders as-is. Git has a serious object count bloat problem if you have any repository with a good amount of commit history, which causes a lot of unnecessary overhead in just scanning the folder for files alone.

I don't quite understand why it's still like this; it's probably the biggest reason why git tends to play poorly with a lot of filesystem tools (not just backups). If it'd been something like an SQLite database instead (just an example really), you wouldn't get so much unnecessary inode bloat.

At the same time Backblaze is a backup solution. The need to back up everything is sort of baked in there. They promise to be the third backup solution in a three layer strategy (backup directly connected, backup in home, backup external), and that third one is probably the single most important one of them all since it's the one you're going to be touching the least in an ideal scenario. They really can't be excluding any files whatsoever.

The cloud service exclusion is similar, but much worse. Imagine getting hit by a cryptoworm. Your cloud storage tool will dutifully sync everything encrypted, junking up your entire storage across devices, and because restoring old versions is both painful and near impossible at scale, you need an actual backup solution for that situation. Backblaze excluding files in those folders feels like a complete misunderstanding of what their purpose should be.

[−] klausa 31d ago
Exclusions are one thing, but I've had Backblaze _fail to restore a file_. I pay for unlimited history.

I contacted the support asking WTF, "oh the file got deleted at some point, sorry for that", and they offered me 3 months of credits.

I do not trust my Backblaze backups anymore.

[−] nstj 31d ago
As an FYI you can recover from force pushes to GitHub using the GitHub UI[0] or their API[1]. And if you force push to one of your own machines you can use the reflog[2]. [0]: https://stackoverflow.com/a/78872853 [1]: https://stackoverflow.com/a/48110879 [2]: https://stackoverflow.com/a/24236065
[−] KingMachiavelli 31d ago
They 100% should have communicated this change, absolutely unacceptable to change behavior without an extremely visible warning.

However, backing up these kinds of directories has always been ill-defined. Dropbox/Google Drive/etc. files are not actually present locally - at least not until you access the file or the client decides to cache it. Should backup software force you to download all 1TB+ of your cloud storage? What if the local system is low on space? What if the network is too slow? What if the actual data is in an already excluded %AppData% location?

Similar issue with VCS, should you sync changes to .git every minute? Every hour? When is .git in a consistent state?

IMO .git and other VCS directories should just be synced X times per day, waiting for .git to be unchanged for Y minutes before syncing. Hell, I bet Claude could write a special Git-aware backup script.
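That quiescence idea is only a few lines of Python (all names and the settle window here are illustrative; a real tool would also check for `.git` lock files):

```python
import os
import tarfile
import time

def quiescent(path: str, settle_seconds: float) -> bool:
    """True if no file under `path` was modified in the last `settle_seconds`."""
    newest = 0.0
    for root, _dirs, files in os.walk(path):
        for name in files:
            try:
                newest = max(newest, os.stat(os.path.join(root, name)).st_mtime)
            except FileNotFoundError:
                return False  # a file vanished mid-scan: git is clearly busy
    return (time.time() - newest) >= settle_seconds

def backup_git_dir(repo: str, archive: str, settle_seconds: float = 300) -> bool:
    """Archive .git only once it has been idle for `settle_seconds`."""
    git_dir = os.path.join(repo, ".git")
    if not quiescent(git_dir, settle_seconds):
        return False  # touched too recently; retry on the next scheduled run
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(git_dir, arcname=".git")
    return True
```

Run it from cron every hour or so; it simply declines to snapshot a repository that git touched within the settle window.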

But Google Drive and Dropbox mount points are not real. It’s crazy to expect backup software to handle that unless explicitly advertised.

[−] mcherm 31d ago
Some companies are in the business of trust. These companies NEED to understand that trust is somewhat difficult to earn, but easy to lose and nearly IMPOSSIBLE to regain. After reading this article I will almost certainly never use or recommend Backblaze. (And while I don't use them currently, they WERE on the list of companies I would have recommended due to the length of their history.)
[−] AegirLeet 31d ago
At some point, Backblaze just silently stopped backing up my encrypted (VeraCrypt) drives. Just stopped working without any announcement, warning or notification. After lots of troubleshooting and googling I found out that this was intentional from some random reddit thread. I stopped using their backup service after that.
[−] ncheek 31d ago
It looks like the following line has been added to /Library/Backblaze.bzpkg/bzdata/bzexcluderules_mandatory.xml which excludes my Dropbox folder from getting backed up:

That is the exact path to my Dropbox folder, and I presume if I move my Dropbox folder this xml file will be updated to point to the new location. The top of the xml file states "Mandatory Exclusions: editing this file DOES NOT DO ANYTHING".

.git files seem to still be backing up on my machine, although they are hidden by default in the web restore (you must open Filters and enable Show Hidden Files). I don't see an option to show hidden files/folders in the Backblaze Restore app.

[−] fuckinpuppers 31d ago
I noticed this (thankfully before it was critical) and I’ve decided to move on from BB. Easily an over-10-year customer. Totally bogus. Not only did it stop backing it up, the old history is totally gone as well.

The one thing they have to do is backup everything and when you see it in their console you can rest assured they are going to continue to back it up.

They’ve let the desktop client linger, it’s difficult to add meaningful exceptions. It’s obvious they want everyone to use B2 now.

[−] SCdF 31d ago
After mucking around with various easy to use options my lack of trust[1] pushed me into a more-complicated-but-at-least-under-my-control-option: syncthing+restic+s3 compatible cloud provider.

Basically it works like this:

- I have syncthing moving files between all my devices. The larger the device, the more stuff I move there[2]. My phone only has my keepass file and a few other docs, my gaming PC has that plus all of my photos and music, etc.

- All of this ends up on a raspberry pi with a connected USB harddrive, which has everything on it. Why yes, that is very shoddy and short term! The pi is mirrored on my gaming PC though, which is awake once every day or two, so if it completely breaks I still have everything locally.

- Nightly a restic job runs, which backs up everything on the pi to an s3 compatible cloud[3], and cleans out old snapshots (30 days, 52 weeks, 60 months, then yearly)

- Yearly I test restoring a random backup, both on the pi, and on another device, to make sure there is no required knowledge stuck on there.
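That retention schedule maps onto restic's `forget --keep-*` flags. The underlying grandfather-father-son selection is simple; a sketch of the logic (my reimplementation for illustration, not restic's code):

```python
from datetime import datetime

def keep_snapshots(snaps: list, daily: int = 30, weekly: int = 52, monthly: int = 60) -> set:
    """Grandfather-father-son retention: for each granularity, keep the
    newest snapshot in each of the most recent N distinct periods;
    yearly is uncapped ("then yearly")."""
    rules = [
        (lambda t: (t.year, t.month, t.day), daily),
        (lambda t: t.isocalendar()[:2], weekly),   # (ISO year, ISO week)
        (lambda t: (t.year, t.month), monthly),
        (lambda t: t.year, None),                  # None = no cap
    ]
    keep = set()
    for period_of, limit in rules:
        seen = []
        for t in sorted(snaps, reverse=True):      # newest first
            p = period_of(t)
            if p in seen:
                continue                           # not the newest in its period
            seen.append(p)
            if limit is not None and len(seen) > limit:
                break                              # this rule's quota is spent
            keep.add(t)
    return keep
```

Everything not returned gets pruned; a snapshot survives if any one rule claims it.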

This was somewhat of a pain to set up, but since the pi is never off it just ticks along, and I check it periodically to make sure nothing has broken.

[1] there is always weirdness with these tools. They don't sync how you think, or when you actually want to restore it takes forever, or they are stuck in perpetual sync cycles

[2] I sync multiple directories, broadly "very small", "small", "dumping ground", and "media", from smallest to largest.

[3] Currently Wasabi, but it really doesn't matter. Restic encrypts client-side; you just need to trust the provider enough that they don't completely collapse at the same time that you need your backups.

[−] peteforde 31d ago
Weirdly, reading this had the net impact of me signing up to Backblaze.

I had no idea that it was such a good bargain. I used to be a Crashplan user back in the day, and I always thought Backblaze had tiered limits.

I've been using Duplicati to sync a lot of data to S3's cheapest tape-based long term storage tier. It's a serious pain in the ass because it takes hours to queue up and retrieve a file. It's a heavy enough process that I don't do anything nearly close to enough testing to make sure my backups are restorable, which is a self-inflicted future injury.

Here's the thing: I'm paying about $14/month for that S3 storage, which makes $99/year a total steal. I don't use Dropbox/Box/OneDrive/iCloud so the grievances mentioned by the author are not major hurdles for me. I do find the idea that it is silently ignoring .git folders troubling, primarily because they are indeed not listed in the exclusion list.

I am a bit miffed that we're actively prevented from backing up the various Program Files folders, because I have a large number of VSTi instruments that I'll need to ensure are rcloned or something for this to work.

[−] kameit00 31d ago
I once had to restore around 2 TB of RAW photos. The app was a mess; it crashed every few hours. I ended up manually downloading single folders over a span of two weeks to restore my data. Support only apologized and could not help with my restore problem. After this I cancelled my subscription immediately, and I now use local drives for my backups, drives which I rotate (both in use and in location).

I will never trust them with my data again.

[−] benguild 31d ago
The fact that they’d exclude “.git” and other things without being transparent about it is scandalous
[−] donatj 31d ago
I can almost almost understand the logic behind not backing up OneDrive/Dropbox. I think it's bad logic but I can understand where it's coming from.

Not backing up .git folders however is completely unacceptable.

I have hundreds of small projects where I use git to track history locally, with no remote at all. The intention is never to push them anywhere. I don't like to say these sorts of things, and I don't say it lightly: someone should be fired over this decision.

[−] Hendrikto 31d ago

> My first troubling discovery was in 2025, when I made several errors then did a push -f to GitHub and blew away the git history for a half decade old repo. No data was lost, but the log of changes was.

I know this is somewhat beside the point, but: learn your tools, people. The commit history could probably have been easily restored without involving any backup. The commits are not just instantly gone.

[−] gck1 30d ago
Backblaze is such a weird case. On one hand, it became the most trusted personal backup provider on Reddit and HN; on the other, their software is absolute junk, and as some comments in this thread are highlighting, even their restore can't be trusted.

I've never needed to restore anything, so can't say anything about this, but once, one of my devices deleted a file in Syncthing, and I went into Backblaze to see if they have any logs of deletions/file modifications (had it disabled in syncthing).

I don't remember the exact details, but I remember clearly that I felt like the entire thing was done by a junior engineer straight out of college. Trying to understand the names of some variables used there, I stumbled upon a reddit thread where the person who worked on the client was trying to explain why things were done the way they were - and I felt like it was me in my first 3 months of software engineering.

How did Backblaze gain this trust in the first place? Is it because nobody is offering "unlimited" storage at the same price point?

[−] minebreaker 31d ago
I just checked the Backblaze app and found that .iso is on the exclusion list. Just in case anyone here is as dumb as I am...
[−] andybak 31d ago
I had a back and forth with them about .git folders a couple of years back and their defence was something like "we are a consumer product - not a professional developer product. Pay for our business offering"

But if that's truly their stance, then they are being deceptive about their non-business offering at the point of sale.

EDIT - see my other comment where I found the actual email

[−] nippoo 31d ago
It's ironic that Backblaze themselves wrote a blog post a couple of years ago explaining why Dropbox isn't enough as a backup service and you need Backblaze as an additional layer of protection: https://www.backblaze.com/blog/whats-wrong-with-google-drive...

That aged well...

[−] Vegenoid 31d ago
AFAICT Backblaze does back up .git directories. I have many repos backed up. The .git directory is hidden by default in the web UI (along with all other hidden files), but there is an option to show them.

You should try downloading one of your backed-up git repos to see if it actually does contain the full history. I just checked several and everything looks good.

[−] dathinab 31d ago
Ironically, Dropbox and OneDrive folders I can still somewhat understand, as they are "backed up" in other ways (though potentially not reliably, so I also understand why people don't like this).

But .git? Having a .git folder does not mean it's synced to GitHub or anywhere else reliable.

If anything, back up only the .git folder and not the checkout.

But backing up the checkout and not the .git folder is crazy.
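For anyone who wants the history captured regardless of what their backup tool excludes, one approach (a sketch only; the paths are placeholders) is to periodically snapshot the repo into a single bundle file, which any backup tool will treat as an ordinary local file:

```shell
# Snapshot a repo's full history into one file (paths are placeholders).
cd ~/projects/myrepo
git bundle create ~/backups/myrepo.bundle --all   # all branches and tags
git bundle verify ~/backups/myrepo.bundle         # sanity-check the snapshot

# Restore later by cloning straight from the bundle:
#   git clone ~/backups/myrepo.bundle myrepo-restored
```

Unlike a raw .git directory, a bundle is a single opaque file, so it can't be half-synced or partially excluded.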

[−] patates 31d ago
I think this should not be attributed to malice, however unfortunate. I once developed a sync app myself, and OneDrive folders were indeed problematic, causing cyclic updates on access and random metadata changes for no apparent reason.

The complete lack of communication (outside of release notes, which nobody really reads, as the article also states) is incompetence, and indeed worrying.

Just show a red status bar that says "these folders will not be backed up anymore", why not?

[−] conception 31d ago
A lot of personal "nerd" options are listed in the thread (and the likes of restic/borg are really good!), but nothing really centralized. Backblaze was a great fire-and-forget option for deploying as a last-resort backup. I don't think there are any competitors in that space if you're looking for continuous backup, centralized management, and good pricing that is pay-as-you-go and doesn't require talking to a salesperson to get going.
[−] devnulled 31d ago
I highly recommend switching to something more like Arq and then using whatever backend storage you want. There are probably some other open-source ways to do it, but Arq scratches the itch of having control over your backups and putting them where you want, with a GUI to easily configure and keep track of what is going on.

Maybe there's something newer/better now (and I bought lifetime licenses of it long ago), but it works for me.

That said, I use Arq + Backblaze storage and my monthly bill is very low, like under $5. I haven't backed up much media there yet, but I do have control over what is being backed up.

[−] venzaspa 31d ago
On the topic of backing up data from cloud platforms such as OneDrive, I suspect this is to stop the client machine from actively downloading 'files on demand', which are just pointers in Explorer until you go to open them.

If you've got huge amounts of files in OneDrive and the backup client starts downloading every one of them (before it can reupload them again), you're going to run into problems.

But ideally, they'd give you a choice.

[−] stratts 31d ago
I think this is a risk with anything that promotes itself as "unlimited", or otherwise doesn't specify concrete limits. I'm always sceptical of services like this as it feels like the terms could arbitrarily change at any point, as we've found out here.

(As a side note, it's funny to see them promoting their native C app instead of using Java as a "shortcut". What I wouldn't give for more Java apps nowadays.)

[−] tomkaos 31d ago
I’ve been using it for years, and the one time I needed to restore a file, I realized that VMware VM files were excluded from the backup. There are so many exclusions that I started doing physical backups again.
[−] Vingdoloras 31d ago
Unrelated to the main point, and probably too late to matter, but you can access repo activity logs via GitHub's API. I had to clean up a bad push before and was able to find the old commit hash in the logs, then reset the branch to that commit, similarly to how you'd fix local messes using reflog.
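For reference, a sketch of that recovery flow using the public GitHub Events API (the OWNER/REPO and SHAs below are placeholders; PushEvent payloads carry the pre-push "before" and post-push "head" commit SHAs):

```shell
# Fetch recent repo events (placeholder repo name):
#   curl -s https://api.github.com/repos/OWNER/REPO/events > events.json
#
# Demo with a canned payload so the parsing step is runnable offline:
printf '%s' '[{"type":"PushEvent","payload":{"before":"aaa111","head":"bbb222"}}]' > events.json

# Print the before/head SHA pair of each push:
python3 -c '
import json
for e in json.load(open("events.json")):
    if e["type"] == "PushEvent":
        print(e["payload"]["before"], "->", e["payload"]["head"])
'

# Then point the branch back at the pre-push commit:
#   git reset --hard aaa111
```

Note the events feed only covers recent activity, so this works for fresh mistakes, not ancient history.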
[−] palata 31d ago
My takeaway is that for data that matters, don't trust the service. I back up with Restic, so that the service only sees encrypted blobs.
[−] yard2010 31d ago
Use restic with resticprofile and you won't need anything else. Point it at a Hetzner storage box, the best value you can get. Don't trust Fisher-Price backup plans.
[−] philjohn 31d ago
For those looking for something at a decent price for up to 5TB, take a look at JottaCloud, which is supported by rclone, and then you can layer restic on top for a complete backup solution.

JottaCloud is "unlimited" for $11.99 a month (your upload speed is throttled after 5TB).

I've been using them for a few years for backing up important files from my NAS (timemachine backups, Immich library, digitised VHS's, Proxmox Backup Server backups) and am sitting at about 3.5TB.

[−] dashesyan 31d ago
Time Machine has a similar issue. OneDrive silently corrupted hundreds of my files, replacing their content with binary zeros while retaining the original file size. I have Time Machine backups going back years, but it turns out TM does not back up cloud files, even if you have them pinned to local storage! So I lost those files, including some irreplaceable family photos.

I’ve added restic to my backup routine, pointed at cloud files and other critical data

[−] basilgohar 31d ago
This is really disturbing to hear as I've incorporated B2 into a lot of my flow for backups as well as a storage backend for Nextcloud and planned as the object store for some upcoming archival storage products I'm working on.

I know the post is talking about their personal backup product but it's the same company and so if they sneak in a reduction of service like this, as others have already commented, it erodes difficult-to-earn trust.

[−] hiisukun 31d ago
I think the target of the anger here should be (at least in part): OneDrive.

My understanding is that a modern, default onedrive setup will push all your onedrive folder contents to the cloud, but will not do the same in reverse -- it's totally possible to have files in your cloud onedrive, visible in your onedrive folder, but that do not exist locally. If you want to access such a file, it typically gets downloaded from onedrive for you to use.

If that's the case, what is Backblaze or any other provider to do? Constantly download your OneDrive files (which might have been modified on another device) and upload them to Backblaze? Or only back up the files that actually exist locally? The latter option certainly would not please consumers, who would expect the files they can 'see' to just get magically backed up.

It's a tricky situation and I'm not saying Backblaze handled it well here, but the whole transparent cloud storage situation is a bit of a mess for lots of people. If Dropbox works the same way (no guaranteed local file for something you can see), that's the same ugly situation.

[−] morpheuskafka 31d ago
Everyone is acting like this is obviously wrong, and they clearly should have communicated the change and made it visible in the exclusion settings.

However, there is a very good reason for not backing up what is in effect network-attached storage. Particularly for OneDrive, as it often adds company SharePoint sites you open files from as mountpoints under your OneDrive folder (business OneDrive is basically a personal SharePoint site under the hood). Trying to back them up would result in downloading potentially hundreds of gigabytes of files to the desktop only to then reupload them to Backblaze. That would also likely trigger data exfiltration flags at your corporate IT.

A Dropbox/OneDrive/Drive/etc folder is a network mount point by another name. (Many of them are now implemented as FUSE mounts or equivalent OS APIs, not folders on disk.) It's fundamentally reasonable for software that promises to back up the local disk not to back up whatever network drives you happen to have signed in/mounted.

[−] keitmo 31d ago
It seems to me that Backblaze does NOT exclude ".git". It's not shown by default in the restore UI -- you must enable "show hidden files" to see it -- but it's there. I just did a test restore of my top-level Project directory (container for all of my personal Git projects) and all .git directories are included in the produced .zip file.
[−] robertjpayne 30d ago
While there may be some issues with Backblaze there's no real trusted alternative with such a long history.

Regardless of the OP's issues:

- On macOS, since 9.0.2.784 (released in 2023), all .git folders are included in backups.

- Cloud drives are problematic to back up because they all use extension plugins to hide the network, and your local disk only contains stubs instead of actual files. If Backblaze scans one fully, it'll download everything and exhaust your disk space; there's no easy solution here.

I don't buy for a minute that they were trying to be "sneaky" to save some $$. Instead, I feel like for the majority of users they felt it was misleading to back up stubs only, and would rather not brick users' computers by downloading all the files. Remember, they can't access your cloud disk directly, so the only way they can get the file contents is by doing an fread and letting the cloud drive client sync the content on demand.

[−] simon_bitwise 31d ago
Yeah, this is the core problem with how most backup tools handle Dropbox / iCloud / OneDrive now. Those folders aren't really "normal files" anymore: a lot of the time they're just placeholders, and touching them can trigger downloads or other weird behavior depending on the client.

That said, just skipping the entire folder is kind of the worst possible outcome. Backup should be predictable. If something is on disk, it should get backed up. If it's not, you should at least know that, not find out later when you need it.

I've been working on Duplicati (https://github.com/duplicati/duplicati) and one thing we've tried to be careful about is not silently ignoring data. If something can't be backed up, it should be visible to the user.

Feel free to reach out to me if you have any questions about setting up duplicati.