We discovered this change recently because my dad was looking for a file that Dropbox accidentally overwrote. At first we said, “No problem, this is why we pay for Backblaze.”
We then learned that this policy had changed a few months ago, and we were never notified. The file was unrecoverable.
If anyone at Backblaze is reading this: I pay for your product so I can install you on my parents' machine and never worry about it again. You decided saving on cloud storage was worth breaking this promise. Bad, bad call.
I'm going to drop Backblaze for my entire company over this.
I need it to capture local data, even though that local data is getting synced to Google Drive. Where we sync our data really has nothing to do with Backblaze backing up the endpoint. We don't wholly trust sync, that's why we have backup.
On my personal Mac I have iCloud Drive syncing my desktop, and a while back iCloud ate a file I was working on. Backblaze had it captured, thankfully. But if they are going to exclude iCloud Drive synced folders, and it sounds like that is their intention, Backblaze is useless to me.
This is very well put, and echoes my sentiments! I had installed Backblaze on my own home machine many, many years ago, and it has saved my bacon a few times. Since then I've also installed it on any family members' machines that required backup and recommended it to friends. And I've been happy to pay for the service.
The deal was that Backblaze backs things up and I don't have to worry about it. Learning that it does not back things up is a punch to the gut. I am familiar with the exclusions, and I look at that list to make sure I'm not missing anything from my backups. I had always thought the exclusions list was exhaustive.
Excluding other files and folders without telling me about it breaks the deal. Dropbox is important to several of the users I installed it for. Ignoring .git folders is another one that affects me and I had not known about. Ouch.
I will now have to look for alternatives. It has to be easy to install, run seamlessly on non-technical users' machines and be reliable.
I find it hard to think of a worse breach of trust for a backup service than not backing up files!
I'm going to join the exodus, though for a different reason. I switched to OrbStack, and ever since, Backblaze refuses to back up, saying "disk full", because OrbStack uses an 8 TB sparse disk image. You can exclude it, but if they won't (very easily) fix a known issue by measuring file sizes properly, I don't feel confident about the product.
Also taking recommendations for a simple service I can install on my dad's Windows machine and my mom's Mac that will just automatically back up the main drive to the cloud, just in case.
My parents also lost data that was supposed to be backed up on Backblaze, because they didn't use that computer for a month, and when they turned it back on the hard drive was dead. Apparently Backblaze silently deletes backups more than 30 days old, even if they are the newest backup, and then happily keeps billing you for not storing your data.
Natasha from Backblaze here. This is a situation people rely on backups for, and I can understand how frustrating it is to run into this when you need a restore.
I also want to be clear, this wasn’t about saving on storage. It came from cases where backing up cloud-synced folders (like Dropbox) was leading to unreliable or incomplete restores because of how those files are managed under the hood.
When Dropbox began using reparse points for synced files, those files no longer behaved like standard local files. Because of that, Backblaze Computer Backup can’t reliably back them up or restore them. The current behavior is focused on ensuring we only back up data we can reliably restore, and we are actively exploring ways to better support Dropbox and data touched by other sync services.
Hello, Jim from Backblaze here. I wanted to offer some insight into what happened with backing up cloud-synced folders.
It is true that we recently updated how Backblaze Computer Backup handles cloud-synced folders. This decision was driven by a consistent set of technical issues we were seeing at scale, most of them caused by changes third-party sync tools make to how files are stored: unreliable backups and incomplete restores when backing up files those tools manage.
To give a bit more context on the “why”: these cloud storage providers now rely heavily on OS-level frameworks to manage sync state. On Windows, for example, files are often represented as reparse points via the Cloud Files API. While they can appear local, they are still system-managed placeholders, which makes it difficult to reliably back them up as standard on-disk files.
Moreover, we built our product not to back up reparse points, for two reasons:
1. We wanted the backup client to be light on the system and only back up needed user-generated files.
2. We wanted the service to be unlimited, so following reparse points would lead to us backing up tons of data in the cloud.
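For readers curious what a "reparse point" means in practice: on Windows, cloud placeholders carry a file attribute a scanner can test before deciding to read a file. A minimal sketch in Python (my own illustration, not Backblaze's code):

```python
import os
import stat

def is_reparse_point(path):
    """True if `path` is a Windows reparse point, e.g. a OneDrive/Dropbox
    cloud placeholder, a junction, or a symlink stub.

    st_file_attributes only exists on Windows; on other platforms we
    conservatively return False."""
    st = os.lstat(path)
    attrs = getattr(st, "st_file_attributes", 0)
    return bool(attrs & stat.FILE_ATTRIBUTE_REPARSE_POINT)
```

On macOS and Linux the attribute doesn't exist, so the check degrades to False; a real client would use the platform-specific placeholder APIs there.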
We’ve made targeted investments where we can, for example, adding support for iCloud Drive by working within Apple’s model and supporting Google Drive, but extending that same level of support to third-party providers like Dropbox or OneDrive is more complex and not included in the current version.
We are currently exploring building an add-on that either follows reparse points or backs up the tagged data in another way.
We also hear you clearly on the communication gap. Both the sync providers and Backblaze should have been more proactive in notifying customers about a change with this level of impact. Please don't hesitate to reach out to me or our support team directly if you have any questions. https://help.backblaze.com/hc/en-us/requests/new
We are here to help.
I guess the problem with Backblaze's business model with respect to Backblaze Personal is that it is "unlimited". They specifically exclude linux users because, well, we're nerds, r/datahoarders exists, and we have different ideas about what "unlimited" means. [1]
This is another example in disguise of two people disagreeing about what "unlimited" means in the context of backup, even if they do claim to have "no restrictions on file type or size" [2].
[1] https://www.reddit.com/r/backblaze/comments/jsrqoz/personal_... [2] https://www.backblaze.com/cloud-backup/personal
The issue with a client app backing up Dropbox and OneDrive folders on your computer is the Files On Demand feature: you could sync a 1 TB OneDrive to your 250 GB laptop, and it's fine because of smart/selective sync, aka Files On Demand. Then Backblaze tries to back the folder up, requests a download of every single file, and now you have zero bytes free, still no backup, and a sick laptop.
You could OAuth the Backblaze app to access OneDrive directly, but if you want to back your OneDrive up, you need a different product IMO.
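A crude way to see the problem from the outside: a files-on-demand placeholder advertises its full logical size while occupying almost nothing on disk. A heuristic sketch (illustrative only; real clients use the platform's Cloud Files / File Provider APIs rather than this trick):

```python
import os

def looks_dehydrated(path):
    """Heuristic: a 'files on demand' placeholder claims its full logical
    size (st_size) but has little or no space actually allocated on disk.

    st_blocks is a POSIX field counted in 512-byte units; it is absent on
    Windows, where a real client would query file attributes instead."""
    st = os.stat(path)
    if not hasattr(st, "st_blocks"):
        return False
    allocated = st.st_blocks * 512
    return st.st_size > 0 and allocated < st.st_size
```

The same check flags any sparse file (like the OrbStack 8 TB disk image mentioned above), which is exactly the kind of size-measurement nuance a backup scanner has to get right.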
I can understand in theory why they wouldn't want to back up .git folders as-is. Git has a serious object count bloat problem if you have any repository with a good amount of commit history, which causes a lot of unnecessary overhead in just scanning the folder for files alone.
I don't quite understand why it's still like this; it's probably the biggest reason why git tends to play poorly with a lot of filesystem tools (not just backups). If it'd been something like an SQLite database instead (just an example really), you wouldn't get so much unnecessary inode bloat.
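To get a feel for the scale on a real repository, `git count-objects -v` reports the loose-object count directly; the same number can be read straight off the filesystem by walking the 256 fan-out directories under .git/objects. A rough sketch (function name is mine):

```python
import os

def count_loose_objects(git_dir):
    """Count loose objects in a .git directory. Each commit typically adds
    several of these tiny files (commit, trees, blobs), which is why merely
    scanning a history-heavy repo is slow for generic filesystem tools."""
    objects = os.path.join(git_dir, "objects")
    total = 0
    for entry in os.listdir(objects):
        # loose objects live in 256 fan-out dirs named 00..ff;
        # skip "pack" and "info"
        if len(entry) == 2:
            total += len(os.listdir(os.path.join(objects, entry)))
    return total
```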
At the same time Backblaze is a backup solution. The need to back up everything is sort of baked in there. They promise to be the third backup solution in a three layer strategy (backup directly connected, backup in home, backup external), and that third one is probably the single most important one of them all since it's the one you're going to be touching the least in an ideal scenario. They really can't be excluding any files whatsoever.
The cloud service exclusion is similarly bad, although much worse. Imagine getting hit by a cryptoworm. Your cloud storage tool is dutifully going to sync everything encrypted, junking up your entire storage across devices and because restoring old versions is both ass and near impossible at scale, you need an actual backup solution for that situation. Backblaze excluding files in those folders feels like a complete misunderstanding of what their purpose should be.
They 100% should have communicated this change, absolutely unacceptable to change behavior without an extremely visible warning.
However, backing up these kinds of directories has always been ill-defined. Dropbox/Google Drive/etc. files are not actually present locally - at least not until you access a file or the sync client decides to cache it. Should backup software force you to download all 1 TB+ of your cloud storage? What if the local system is low on space? What if the network is too slow? What if the actual data is in an already excluded %AppData% location?
Similar issue with VCS, should you sync changes to .git every minute? Every hour? When is .git in a consistent state?
IMO .git and other VCS folders should just be synced X times per day, after waiting for .git to be unchanged for Y minutes. Hell, I bet Claude could write a special Git-aware backup script.
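The "unchanged for Y minutes" idea is straightforward to sketch: walk .git, take the newest mtime, and only snapshot once it has settled. Illustrative only; names and thresholds are mine:

```python
import os
import time

def is_quiescent(repo_git_dir, settle_seconds=600):
    """True if nothing under .git has been modified for `settle_seconds`,
    i.e. the repository is probably in a consistent state to snapshot."""
    newest = 0.0
    for dirpath, _dirnames, filenames in os.walk(repo_git_dir):
        for name in filenames:
            try:
                newest = max(newest, os.path.getmtime(os.path.join(dirpath, name)))
            except OSError:
                pass  # file vanished mid-walk (gc, repack); ignore it
    return (time.time() - newest) >= settle_seconds
```

A real Git-aware tool would also want to avoid snapshotting mid-`git gc` or mid-repack; checking for lock files like .git/index.lock is the usual extra guard.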
But Google Drive and Dropbox mount points are not real. It’s crazy to expect backup software to handle that unless explicitly advertised.
Some companies are in the business of trust. These companies NEED to understand that trust is somewhat difficult to earn, but easy to lose and nearly IMPOSSIBLE to regain. After reading this article I will almost certainly never use or recommend Backblaze. (And while I don't use them currently, they WERE on the list of companies I would have recommended due to the length of their history.)
At some point, Backblaze just silently stopped backing up my encrypted (VeraCrypt) drives. Just stopped working without any announcement, warning or notification. After lots of troubleshooting and googling I found out that this was intentional from some random reddit thread. I stopped using their backup service after that.
It looks like the following line has been added to /Library/Backblaze.bzpkg/bzdata/bzexcluderules_mandatory.xml which excludes my Dropbox folder from getting backed up:
That is the exact path to my Dropbox folder, and I presume if I move my Dropbox folder this xml file will be updated to point to the new location. The top of the xml file states "Mandatory Exclusions: editing this file DOES NOT DO ANYTHING".
.git files seem to still be backing up on my machine, although they are hidden by default in the web restore (you must open Filters and enable Show Hidden Files). I don't see an option to show hidden files/folders in the Backblaze Restore app.
I noticed this (thankfully before it was critical) and I’ve decided to move on from BB. Easily an over-10-year customer. Totally bogus. Not only did it stop backing it up, the old history is totally gone as well.
The one thing they have to do is back up everything, and when you see it in their console you can rest assured they are going to continue to back it up.
They’ve let the desktop client linger, it’s difficult to add meaningful exceptions. It’s obvious they want everyone to use B2 now.
After mucking around with various easy-to-use options, my lack of trust[1] pushed me into a more-complicated-but-at-least-under-my-control option: syncthing + restic + an S3-compatible cloud provider.
Basically it works like this:
- I have syncthing moving files between all my devices. The larger the device, the more stuff I move there[2]. My phone only has my keepass file and a few other docs, my gaming PC has that plus all of my photos and music, etc.
- All of this ends up on a Raspberry Pi with a connected USB hard drive, which has everything on it. Why yes, that is very shoddy and short-term! The Pi is mirrored on my gaming PC though, which is awake once every day or two, so if it completely breaks I still have everything locally.
- Nightly a restic job runs, which backs up everything on the Pi to an S3-compatible cloud[3], and cleans out old snapshots (30 days, 52 weeks, 60 months, then yearly).
- Yearly I test restoring a random backup, both on the Pi and on another device, to make sure there is no required knowledge stuck on there.
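The snapshot cleanup in the nightly step maps onto restic's retention flags (roughly `restic forget --keep-daily 30 --keep-weekly 52 --keep-monthly 60 --prune`, plus a yearly policy). For anyone curious how that selection behaves, a simplified sketch of the keep logic (my own approximation, not restic's code):

```python
from datetime import date

def snapshots_to_keep(dates, keep_daily=30, keep_weekly=52, keep_monthly=60):
    """Given snapshot dates, mimic a restic-style keep-daily/weekly/monthly
    policy: walking newest-first, keep the newest snapshot of each of the
    last N days, ISO weeks, and months. Returns the set of dates to keep."""
    keep = set()
    seen_days, seen_weeks, seen_months = set(), set(), set()
    for d in sorted(set(dates), reverse=True):
        day = d.toordinal()
        week = tuple(d.isocalendar()[:2])  # (ISO year, ISO week)
        month = (d.year, d.month)
        if day not in seen_days and len(seen_days) < keep_daily:
            seen_days.add(day)
            keep.add(d)
        if week not in seen_weeks and len(seen_weeks) < keep_weekly:
            seen_weeks.add(week)
            keep.add(d)
        if month not in seen_months and len(seen_months) < keep_monthly:
            seen_months.add(month)
            keep.add(d)
    return keep
```

A snapshot kept by any bucket survives; everything else is forgotten and pruned.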
This was somewhat of a pain to set up, but since the Pi is never off it just ticks along, and I check it periodically to make sure nothing has broken.
[1] there is always weirdness with these tools. They don't sync how you think, or when you actually want to restore it takes forever, or they are stuck in perpetual sync cycles
[2] I sync multiple directories, broadly "very small", "small", "dumping ground", and "media", from smallest to largest.
[3] Currently Wasabi, but it really doesn't matter. Restic encrypts client-side; you just need to trust the provider enough that they don't completely collapse at the same time that you need your backups.
Weirdly, reading this had the net impact of me signing up to Backblaze.
I had no idea that it was such a good bargain. I used to be a Crashplan user back in the day, and I always thought Backblaze had tiered limits.
I've been using Duplicati to sync a lot of data to S3's cheapest tape-based long-term storage tier. It's a serious pain in the ass because it takes hours to queue up and retrieve a file. It's a heavy enough process that I don't do anywhere near enough testing to make sure my backups are restorable, which is a self-inflicted future injury.
Here's the thing: I'm paying about $14/month for that S3 storage, which makes $99/year a total steal. I don't use Dropbox/Box/OneDrive/iCloud so the grievances mentioned by the author are not major hurdles for me. I do find the idea that it is silently ignoring .git folders troubling, primarily because they are indeed not listed in the exclusion list.
I am a bit miffed that we're actively prevented from backing up the various Program Files folders, because I have a large number of VSTi instruments that I'll need to ensure are rcloned or something for this to work.
I once had to restore around 2 TB of RAW photos.
The app was a mess. It crashed every few hours. I ended up manually downloading single folders over a timespan of 2 weeks to restore my data. Support only apologized and could not help with my restore problem. After this I cancelled my subscription immediately and use local drives for my backups now, drives which I rotate (both in use and in storage location).
I can almost, almost understand the logic behind not backing up OneDrive/Dropbox. I think it's bad logic, but I can understand where it's coming from.
Not backing up .git folders however is completely unacceptable.
I have hundreds of small projects where I use git to track history locally, with no remote at all. The intention is never to push it anywhere. I don't like to say these sorts of things, and I don't say it lightly: someone should be fired over this decision.
> My first troubling discovery was in 2025, when I made several errors then did a push -f to GitHub and blew away the git history for a half decade old repo. No data was lost, but the log of changes was.
I know this is beside the point somewhat, but: learn your tools, people. The commit history could probably have been easily restored without involving any backup (e.g. from the local reflog); the commits are not just instantly gone.
Backblaze is such a weird case. On one hand, it became the most trusted personal backup provider on reddit and HN, on another - their software is absolute junk, and as some comments in this thread are highlighting - even their restore can't be trusted.
I've never needed to restore anything, so can't say anything about this, but once, one of my devices deleted a file in Syncthing, and I went into Backblaze to see if they have any logs of deletions/file modifications (had it disabled in syncthing).
I don't remember the exact details, but I remember clearly that I felt like the entire thing was done by a junior engineer straight out of college. Trying to understand the names of some variables used there, I stumbled upon a reddit thread where the person who worked on the client was trying to explain why things were done the way they were - and I felt like it was me in my first 3 months of software engineering.
How did Backblaze gain this trust in the first place? Is it because nobody is offering "unlimited" storage at the same price point?
I had a back and forth with them about .git folders a couple of years back and their defence was something like "we are a consumer product - not a professional developer product. Pay for our business offering"
But if that's truly their stance, then they are being deceptive about their non-business offering at the point of sale.
EDIT - see my other comment where I found the actual email
It's ironic that Backblaze themselves wrote a blog post a couple of years ago explaining why Dropbox isn't enough as a backup service and you need Backblaze as an additional layer of protection: https://www.backblaze.com/blog/whats-wrong-with-google-drive...
AFAICT Backblaze does back up .git directories. I have many repos backed up. The .git directory is hidden by default in the web UI (along with all other hidden files), but there is an option to show them.
You should try downloading one of your backed up git repos to see if it actually does contain the full history, I just checked several and everything looks good.
Ironically, Dropbox and OneDrive folders I can still somewhat understand, as they are "backed up" in other ways (though potentially not reliably, so I also understand why people do not like that).
But .git? Having a .git folder does not mean you have it synced to GitHub or anywhere reliable.
If anything, back up only the .git folder and not the checkout.
But backing up the checkout and not the .git folder is crazy.
I think this should not be attributed to malice, however unfortunate. I also developed a sync app once, and OneDrive folders were indeed problematic, causing cyclic updates on access and random metadata changes for no explicit reason.
Complete lack of communication (outside of release notes, which nobody really reads, as the article too states) is incompetence and indeed worrying.
Just show a red status bar that says "these folders will not be backed up anymore", why not?
A lot of personal “nerd” options are listed in the thread (and tools like restic/borg are really good!) but nothing really centralized. Backblaze was a great fire-and-forget option for deploying as a last-resort backup. I don’t think there are any competitors in that space if you are looking for continuous backup, centralized management, and good pricing that doesn’t require talking to a salesperson to get going and is pay as you go.
I highly recommend switching to something more like Arq and then using whatever backend storage that you want. There are probably some other open source ways to do it, etc, but Arq scratches the itch of having control over your backups and putting them where you want with a GUI to easily configure/keep track of what is going on.
Maybe there's something newer/better now (and I bought lifetime licenses of it long ago), but it works for me.
That said, I use Arq + Backblaze storage and I think my monthly bill is very low, like under $5. I haven't backed up much media there yet, but I do have control over what is being backed up.
On the topic of backing up data from cloud platforms such as OneDrive, I suspect this is to stop the client machine from actively downloading 'files on demand', which are just pointers in Explorer until you go to open them.
If you've got huge amounts of files in OneDrive and the backup client starts downloading every one of them (before it can reupload them again), you're going to run into problems.
I think this is a risk with anything that promotes itself as "unlimited", or otherwise doesn't specify concrete limits. I'm always sceptical of services like this as it feels like the terms could arbitrarily change at any point, as we've found out here.
(as a side note, it's funny to see them promoting their native C app instead of using Java as a "shortcut". What I wouldn't give for more Java apps nowadays)
I’ve been using it for years, and the one time I needed to restore a file, I realized that VMware VM files were excluded from the backup. There are so many exclusions that I started doing physical backups again.
Unrelated to the main point, and probably too late to matter, but you can access repo activity logs via Github's API. I had to clean up a bad push before and was able to find the old commit hash in the logs, then reset the branch to that commit, similarly to how you'd fix local messes using reflog.
Use restic with resticprofile and you won't need anything else. Point it to a Hetzner Storage Box, the best value you can get. Don't trust Fisher-Price backup plans.
For those looking for something at a decent price for up to 5TB, take a look at JottaCloud, which is supported by rclone, and then you can layer restic on top for a complete backup solution.
JottaCloud is "unlimited" for $11.99 a month (your upload speed is throttled after 5TB).
I've been using them for a few years for backing up important files from my NAS (timemachine backups, Immich library, digitised VHS's, Proxmox Backup Server backups) and am sitting at about 3.5TB.
Time Machine has a similar issue. OneDrive silently corrupted hundreds of my files, replacing their content with binary zeros while retaining the original file size. I have Time Machine backups going back years, but it turns out TM does not back up cloud files, even if you have them pinned to local storage! So I lost those files, including some irreplaceable family photos.
I’ve added restic to my backup routine, pointed at cloud files and other critical data
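Corruption of that shape (size preserved, contents zeroed) is at least cheap to scan for after the fact, e.g. over a restored tree. A sketch of the check I mean (my own illustration; function name is mine):

```python
def looks_zeroed(path, chunk=1 << 16):
    """Scan a file and report True if every byte is 0x00 -- the corruption
    pattern described above, where content is replaced with zeros but the
    original file size is retained. Empty files return False."""
    nonempty = False
    with open(path, "rb") as f:
        while True:
            block = f.read(chunk)
            if not block:
                break
            nonempty = True
            if block.count(0) != len(block):
                return False  # found a non-zero byte; file looks intact
    return nonempty
```

Run over a photo library, any True is a file worth pulling from an older snapshot.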
This is really disturbing to hear as I've incorporated B2 into a lot of my flow for backups as well as a storage backend for Nextcloud and planned as the object store for some upcoming archival storage products I'm working on.
I know the post is talking about their personal backup product but it's the same company and so if they sneak in a reduction of service like this, as others have already commented, it erodes difficult-to-earn trust.
I think the target of the anger here should be (at least in part): OneDrive.
My understanding is that a modern, default onedrive setup will push all your onedrive folder contents to the cloud, but will not do the same in reverse -- it's totally possible to have files in your cloud onedrive, visible in your onedrive folder, but that do not exist locally. If you want to access such a file, it typically gets downloaded from onedrive for you to use.
If that's the case, what is Backblaze or another provider to do? Constantly download your onedrive files (that might have been modified on another device) and upload them to backblaze? Or just sync files that actually exist locally? That latter option certainly would not please a consumer, who would expect the files they can 'see' just get magically backed up.
It's a tricky situation and I'm not saying Backblaze handled it well here, but the whole transparent cloud storage situation thing is a bit of a mess for lots of people. If Dropbox works the same way (no guaranteed local file for something you can see), that's the same ugly situation.
Everyone is acting like this is obviously wrong, and they clearly should have communicated the change and made it visible in the exclusion settings.
However, there is a very good reason for not backing up what is in effect network-attached storage. Particularly for OneDrive, as it often adds company SharePoint sites you open files from as mount points under your OneDrive folder (business OneDrive is basically a personal SharePoint site under the hood). Trying to back them up would result in downloading potentially hundreds of gigabytes of files to the desktop, only to then reupload them to Backblaze. That would also likely trigger data exfiltration flags at your corporate IT.
A Dropbox/OneDrive/Drive/etc. folder is a network mount point by another name. (Many of them are now implemented as FUSE mounts or an equivalent OS API, not folders on disk.) It's fundamentally reasonable for software that promises backing up the local disk not to back up whatever network drives you happen to have signed in/mounted.
It seems to me that Backblaze does NOT exclude ".git". It's not shown by default in the restore UI -- you must enable "show hidden files" to see it -- but it's there. I just did a test restore of my top-level Project directory (container for all of my personal Git projects) and all .git directories are included in the produced .zip file.
While there may be some issues with Backblaze, there's no real trusted alternative with such a long history.
Regardless to the OP's issues:
- on macOS since 9.0.2.784 released in 2023 all .git folders are included in backups
- Cloud drives are problematic to back up because they all use extension plugins to hide the network, and your local disk only contains stubs instead of actual files. If Backblaze scans it fully it'll download everything and exhaust your disk space; there's no easy solution here.
I don't buy for a minute that they were trying to be "sneaky" to save some $$. I instead feel like, for the majority of users, they felt it was misleading to back up stubs only, and would rather not brick user computers by downloading all the files. Remember, they can't access your cloud disk directly, so the only way they can get the file contents is by doing an fread and letting the cloud drive client sync the content on demand.
Yeah this is the core problem with how most backup tools handle Dropbox / iCloud / OneDrive now.
Those folders aren’t really “normal files” anymore — a lot of the time they’re just placeholders, and touching them can trigger downloads or other weird behavior depending on the client.
That said, just skipping the entire folder is kind of the worst possible outcome. Backup should be predictable. If something is on disk, it should get backed up. If it’s not, you should at least know that, not find out later when you need it.
I’ve been working on Duplicati (https://github.com/duplicati/duplicati) and one thing we’ve tried to be careful about is not silently ignoring data. If something can’t be backed up, it should be visible to the user.
Feel free to reach out to me if you have any questions about setting up duplicati.
I contacted support asking WTF; "oh, the file got deleted at some point, sorry for that", and they offered me 3 months of credits.
I do not trust my Backblaze backups anymore.
However, backing up these kinds of directories has always been ill-defined. Dropbox/Google Drive/etc. files are not actually present locally - at least not until you access the file or the client decides to cache it. Should backup software force you to download all 1TB+ of your cloud storage? What if the local system is low on space? What if the network is too slow? What if the actual data is in an already excluded %AppData% location?
Similar issue with VCS: should you sync changes to .git every minute? Every hour? When is .git in a consistent state?
IMO .git and other VCS data should just be synced X times per day, waiting for .git to be unchanged for Y minutes before syncing it. Hell, I bet Claude could write a special Git-aware backup script.
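That quiescence check is simple to sketch. This is illustrative only (the function name and 5-minute default are mine, not any existing tool's behavior): snapshot `.git` only once nothing inside it has changed for the quiet period.

```shell
# back_up_git_if_quiet REPO DEST [QUIET_MINUTES]
# Copies REPO/.git to DEST only if no file inside .git was modified in the
# last QUIET_MINUTES minutes (default 5); otherwise skips this cycle.
back_up_git_if_quiet() {
    repo="$1"; dest="$2"; quiet="${3:-5}"
    if [ -z "$(find "$repo/.git" -type f -mmin -"$quiet" 2>/dev/null)" ]; then
        mkdir -p "$dest"
        cp -R "$repo/.git/." "$dest/"   # stand-in for the real backup step
        echo "backed up"
    else
        echo "repo busy, skipping this cycle"
    fi
}
```

Run from cron every X hours, this gives you "synced X times per day, but only when quiescent" with no git-internals knowledge at all.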
But Google Drive and Dropbox mount points are not real. It’s crazy to expect backup software to handle that unless explicitly advertised.
That is the exact path to my Dropbox folder, and I presume if I move my Dropbox folder this xml file will be updated to point to the new location. The top of the xml file states "Mandatory Exclusions: editing this file DOES NOT DO ANYTHING".
.git files seem to still be backing up on my machine, although they are hidden by default in the web restore (you must open Filters and enable Show Hidden Files). I don't see an option to show hidden files/folders in the Backblaze Restore app.
The one thing they have to do is back up everything, and when you see it in their console you can rest assured they are going to continue to back it up.
They’ve let the desktop client linger; it’s difficult to add meaningful exceptions. It’s obvious they want everyone to use B2 now.
Basically it works like this:
- I have syncthing moving files between all my devices. The larger the device, the more stuff I move there[2]. My phone only has my keepass file and a few other docs, my gaming PC has that plus all of my photos and music, etc.
- All of this ends up on a raspberry pi with a connected USB harddrive, which has everything on it. Why yes, that is very shoddy and short term! The pi is mirrored on my gaming PC though, which is awake once every day or two, so if it completely breaks I still have everything locally.
- Nightly a restic job runs, which backs up everything on the pi to an s3 compatible cloud[3], and cleans out old snapshots (30 days, 52 weeks, 60 months, then yearly)
- Yearly I test restoring a random backup, both on the pi, and on another device, to make sure there is no required knowledge stuck on there.
This was somewhat of a pain to set up, but since the pi is never off it just ticks along, and I check it periodically to make sure nothing has broken.
[1] there is always weirdness with these tools. They don't sync how you think, or when you actually want to restore it takes forever, or they are stuck in perpetual sync cycles
[2] I sync multiple directories, broadly "very small", "small", "dumping ground", and "media", from smallest to largest.
[3] Currently Wasabi, but it really doesn't matter. Restic encrypts client side; you just need to trust the provider enough that they don't completely collapse at the same time that you need backups.
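A rough sketch of the nightly job described above (the repository URL, paths, and the yearly count are placeholders I invented; restic reads the repo location and password from these environment variables):

```shell
# Nightly restic job: one snapshot, then apply the retention policy.
# Repo URL and paths below are illustrative placeholders.
export RESTIC_REPOSITORY="s3:s3.wasabisys.com/my-backups"
export RESTIC_PASSWORD_FILE="$HOME/.restic-password"

# 30 daily, 52 weekly, 60 monthly snapshots, then yearly (yearly count is a guess).
KEEP_FLAGS="--keep-daily 30 --keep-weekly 52 --keep-monthly 60 --keep-yearly 10"

run_nightly_backup() {
    restic backup /mnt/usb-drive &&
    restic forget --prune $KEEP_FLAGS
}
```

Point cron at `run_nightly_backup`, and the annual `restic restore` drill described above covers the verification half.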
I had no idea that it was such a good bargain. I used to be a Crashplan user back in the day, and I always thought Backblaze had tiered limits.
I've been using Duplicati to sync a lot of data to S3's cheapest tape-based long term storage tier. It's a serious pain in the ass because it takes hours to queue up and retrieve a file. It's a heavy enough process that I don't do anything nearly close to enough testing to make sure my backups are restorable, which is a self-inflicted future injury.
Here's the thing: I'm paying about $14/month for that S3 storage, which makes $99/year a total steal. I don't use Dropbox/Box/OneDrive/iCloud so the grievances mentioned by the author are not major hurdles for me. I do find the idea that it is silently ignoring .git folders troubling, primarily because they are indeed not listed in the exclusion list.
I am a bit miffed that we're actively prevented from backing up the various Program Files folders, because I have a large number of VSTi instruments that I'll need to ensure are rcloned or something for this to work.
I will never trust them again with my data.
Not backing up .git folders however is completely unacceptable.
I have hundreds of small projects where I use git to track history locally, with no remote at all. The intention is never to push it anywhere. I don't like saying these sorts of things, and I don't say it lightly: someone should be fired over this decision.
> My first troubling discovery was in 2025, when I made several errors then did a push -f to GitHub and blew away the git history for a half decade old repo. No data was lost, but the log of changes was.
I know this is beside the point somewhat, but: learn your tools, people. The commit history could probably have been easily restored without involving any backup. The commits are not just instantly gone.
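For reference, the usual recovery path after a bad `push -f`: the old commits still exist in the local repo, and the reflog records where HEAD used to point. The branch name `rescued` and the helper are just examples:

```shell
# After a bad force-push, the pre-push commits are usually still local;
# the reflog records every place HEAD has pointed.
rescue_previous_tip() {
    git reflog -n 5                  # inspect recent HEAD movements
    git branch rescued "HEAD@{1}"    # pin the old tip so GC can't drop it
}
# A plain `git push origin rescued` then puts the history back on GitHub.
```

In practice you'd read the reflog first and pick the right `HEAD@{n}` entry rather than blindly taking `HEAD@{1}`.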
I've never needed to restore anything, so can't say anything about this, but once, one of my devices deleted a file in Syncthing, and I went into Backblaze to see if they have any logs of deletions/file modifications (had it disabled in syncthing).
I don't remember the exact details, but I remember clearly that I felt like the entire thing was done by a junior engineer straight out of college. Trying to understand the names of some variables used there, I stumbled upon a reddit thread where the person who worked on the client was trying to explain why things were done the way they were - and I felt like it was me in my first 3 months of software engineering.
How did Backblaze gain this trust in the first place? Is it because nobody is offering "unlimited" storage at the same price point?
But if that's truly their stance, then they are being deceptive about their non-business offering at the point of sale.
EDIT - see my other comment where I found the actual email
That aged well...
You should try downloading one of your backed up git repos to see if it actually does contain the full history, I just checked several and everything looks good.
But .git? Having it does not mean it is synced to GitHub or anywhere reliable.
If you do anything then only backup the .git folder and not the checkout.
But backing up the checkout and not the .git folder is crazy.
Complete lack of communication (outside of release notes, which nobody really reads, as the article too states) is incompetence and indeed worrying.
Just show a red status bar that says "these folders will not be backed up anymore", why not?
Maybe there's something newer/better now (and I bought lifetime licenses of it long ago), but it works for me.
That said, I use Arq + Backblaze storage and I think my monthly bill is very low, like under $5. I haven't backed up much media there yet, but I do have control over what is being backed up.
If you've got huge amounts of files in OneDrive and the backup client starts downloading every one of them (before it can reupload them again), you're going to run into problems.
But ideally, they'd give you a choice.
(as a side note, it's funny to see them promoting their native C app instead of using Java as a "shortcut". What I wouldn't give for more Java apps nowadays)
JottaCloud is "unlimited" for $11.99 a month (your upload speed is throttled after 5TB).
I've been using them for a few years for backing up important files from my NAS (timemachine backups, Immich library, digitised VHS's, Proxmox Backup Server backups) and am sitting at about 3.5TB.
I’ve added restic to my backup routine, pointed at cloud files and other critical data.
I know the post is talking about their personal backup product but it's the same company and so if they sneak in a reduction of service like this, as others have already commented, it erodes difficult-to-earn trust.
My understanding is that a modern, default onedrive setup will push all your onedrive folder contents to the cloud, but will not do the same in reverse -- it's totally possible to have files in your cloud onedrive, visible in your onedrive folder, but that do not exist locally. If you want to access such a file, it typically gets downloaded from onedrive for you to use.
If that's the case, what is Backblaze or another provider to do? Constantly download your onedrive files (that might have been modified on another device) and upload them to backblaze? Or just sync files that actually exist locally? That latter option certainly would not please a consumer, who would expect the files they can 'see' just get magically backed up.
It's a tricky situation and I'm not saying Backblaze handled it well here, but the whole transparent cloud storage thing is a bit of a mess for lots of people. If Dropbox works the same way (no guaranteed local file for something you can see), it's the same ugly situation.
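One heuristic a backup tool could use for this (a sketch only; this is not how Backblaze actually decides): a dehydrated placeholder typically reports its full logical size while occupying almost no blocks on disk, exactly like a sparse file.

```shell
# is_placeholder FILE -- true if FILE's on-disk blocks are far smaller than
# its logical size (GNU stat syntax first, BSD/macOS stat as fallback).
is_placeholder() {
    size=$(stat -c %s "$1" 2>/dev/null || stat -f %z "$1")     # logical bytes
    blocks=$(stat -c %b "$1" 2>/dev/null || stat -f %b "$1")   # 512-byte blocks
    [ "$size" -gt 4096 ] && [ $((blocks * 512)) -lt "$size" ]
}
```

Real clients expose this more directly (Windows marks files-on-demand stubs with attributes like FILE_ATTRIBUTE_RECALL_ON_DATA_ACCESS), but the upshot is the same: the bytes simply aren't there until something reads them.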
However, there is a very good reason for not backing up what is in effect network attached storage. Particularly for OneDrive, as it often adds company SharePoint sites you open files from as mountpoints under your OneDrive folder (business OneDrive is basically a personal SharePoint site under the hood). Trying to back them up would result in downloading potentially hundreds of gigabytes of files to the desktop only to then reupload them to the backup provider. That would also likely trigger data exfiltration flags at your corporate IT.
A Dropbox/OneDrive/Drive/etc folder is a network mount point by another name. (Many of them are implemented as FUSE mounts or an equivalent OS API, not plain folders on disk.) It's fundamentally reasonable for software that promises backing up the local disk not to back up whatever network drives you happen to have signed in/mounted.
Regardless of the OP's issues:
- On macOS, since 9.0.2.784 (released in 2023), all .git folders are included in backups.
- Cloud drives are problematic to back up because they all use extension plugins to hide the network, and your local disk only contains stubs instead of actual files. If Backblaze scans it fully, it'll download everything and exhaust your disk space; there's no easy solution here.
I don't buy for a minute that they were trying to be "sneaky" to save some $$. Instead, I feel like for the majority of users they felt it was misleading to back up stubs only, and would rather not brick users' computers by downloading all the files. Remember, they can't access your cloud disk directly, so the only way they can get the file contents is by doing an fread and letting the cloud drive client sync the content on demand.
Feel free to reach out to me if you have any questions about setting up duplicati.