
Backup/version control



I've recently had the experience of a complex pathway of repeated refinements, dealing with several stakeholders, resulting in a set of final designs using AP & AD. I've also been checking the forums, and there appears to be a growing number of corrupt file saves being reported. Thankfully that hasn't happened to me.

I'm just wondering if there is scope for a setting to keep a number of generations of backups of a design file, getting bumped on each save? I was doing this manually, but every so often I'd accidentally save over the current version, and it made me wonder: if whatever is causing the corrupt saves hit me, I'd be in a real pickle. Version control would be even better, but all the systems out there wouldn't be able to detect changes other than by checksum/size/datestamp, which would make it a more tedious job (IMO) to maintain.
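Pending such a setting in the apps themselves, the generational bumping can be scripted externally. A minimal sketch (this is not an Affinity feature; the filenames and the `KEEP` count are illustrative) that keeps the newest backup as `.1` and the oldest as `.$KEEP`:

```shell
#!/bin/sh
# Sketch of a generational backup rotation done outside the app.
# The newest backup is design.afdesign.1, the oldest design.afdesign.$KEEP.
set -eu

KEEP=3
FILE="design.afdesign"

WORK=$(mktemp -d)          # demo area so the sketch is self-contained
cd "$WORK"
printf 'save 1' > "$FILE"

rotate() {
    i=$KEEP
    while [ "$i" -gt 1 ]; do
        prev=$((i - 1))
        if [ -f "$FILE.$prev" ]; then
            mv "$FILE.$prev" "$FILE.$i"   # shift each generation up by one
        fi
        i=$prev
    done
    cp "$FILE" "$FILE.1"   # bump the current save into generation 1
}

rotate                     # run this just before each new save...
printf 'save 2' > "$FILE"  # ...which the app then overwrites in place
rotate
```

Run the `rotate` step (via a file watcher, a hotkey, or by hand) before each save, and an accidental save-over only costs you the newest generation.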

Is anyone out there successfully using version control with Affinity products? Is there any other strategy to consider other than using a well defined folder/file naming scheme?


2 hours ago, Paul Mc said:

Version control would be even better, but all the systems out there wouldn't be able to detect changes other than by checksum/size/datestamp, which would make it a more tedious job (IMO) to maintain.

Somehow I don't understand why there should be a problem with SVN-like systems. The only problem, given the nature of the Affinity data files, is that the efficiency of the incremental backups will be low and the version database will grow very quickly.

Affinity Store: Affinity Suite (ADe, APh, APu) 1.10.5.1342, 2.0.0.
Dell OptiPlex 7060, i5-8500 3.00 GHz, 16 GB, Intel UHD Graphics 630, Dell P2417H 1920 x 1080, Windows 11 Pro, Version 22H2, Build 22621.819.
Dell Latitude E5570, i5-6440HQ 2.60 GHz, 8 GB, Intel HD Graphics 530, 1920 x 1080, Windows 10 Pro, Version 21H1, Build 19043.2130.
Intel NUC5PGYH, Pentium N3700 2.40 GHz, 8 GB, Intel HD Graphics, EIZO EV2456 1920 x 1200, Windows 10 Pro, Version 21H1, Build 19043.2130.


I'm playing around with Git-based versioning, and so far it is working well in my small, single-person, testing.

As @Pšenda says, though, the files are all binary blobs, so the database will probably grow quickly. But the basic versioning techniques should work fine, with commits, reverts, etc. possible.

So far my approach has been to direct the Affinity application to Open or Save files into the git repository directory, and at this early stage it's going fine. Commits, etc. are then done via the git client. I have not yet extended this to using remote repositories, with cloning, etc. but I foresee no difficulties in doing so.
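The workflow described above can be sketched as shell commands (a sketch only; the repository path, filenames, and git config values are illustrative, and the `printf` lines stand in for the app's own Save):

```shell
#!/bin/sh
# Sketch: save from the app into a git work tree, commit via the git client.
set -eu

REPO=$(mktemp -d)
cd "$REPO"
git init -q .
git config user.email "demo@example.com"   # needed on a fresh machine
git config user.name  "Demo"

# 1. In the Affinity app: File > Save into this directory.
printf 'binary blob v1' > poster.afdesign  # stand-in for a real save

# 2. In the git client: stage and commit the new save.
git add poster.afdesign
git commit -qm "First draft of poster"

# 3. Later saves become further commits; any commit can be checked out again.
printf 'binary blob v2' > poster.afdesign
git add poster.afdesign
git commit -qm "Client revision 1"

git log --oneline          # shows both commits, newest first
```

Reverting to an earlier save is then `git checkout <commit> -- poster.afdesign`, and a remote adds off-machine backup on every push.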

-- Walt

Desktop:  Windows 11 Home, version 21H2 (22000.613) 64GB memory, AMD Ryzen 9 5900 12-Core @ 3.00 GHz, NVIDIA GeForce RTX 3090 
Laptop:  Windows 10 Home, version 21H2 (19044.1706) 32GB memory, Intel Core i7-10750H @ 2.60GHz, Intel UHD Graphics Comet Lake GT2 and NVIDIA GeForce RTX 3070 Laptop GPU.
        Affinity Photo 1.10.6 (.1665) and 2.0.0 / Affinity Designer 1.10.6 (.1665)  and 2.0.0 / Affinity Publisher 1.10.6 (.1665)  and 2.0.0
iPad Pro M1, 12.9", iPadOS 16.1.1, Apple Pencil 2, Magic Keyboard

      Affinity Photo 1.10.6 and 2.0.2 / Affinity Designer 1.10.6 and 2.0.2 / Affinity Publisher 2.0.2


As @Pšenda and @walt.farrell have mentioned, Affinity is not a good fit for anything like Mercurial, Git, CVS, etc., because of the binary-blob nature of the Affinity files.

There are tools around which can handle binary differences. With my FreeBSD hat on I highlight bsdiff. It is a good tool at what it does but it uses a lot of memory to do it. From the man page: 'The bsdiff utility uses memory equal to 17 times the size of oldfile, and requires an absolute minimum working set size of 8 times the size of oldfile.' So if you have a 1GB Affinity file and 8GB of RAM you are going to struggle. As for reliability I recently remotely updated a FreeBSD installation using a tool which depends on bsdiff to work. This required 9,277 patches, nearly all of them binary. I was confident enough in the process to press the return key, and sure enough I have a new version of the OS installed.

So binary files can be used in version control systems, subject to some caveats. If you are working on a large file and keep checking it in, you will eat your disk space quickly.

If you are on a Mac, 'cp -c' uses clonefile(2) to make the copy. clonefile is a copy-on-write system call; I will let you look it up if you are interested. The copy is instantaneous because the new file shares the old file's blocks on disk. Any subsequent writes or modifications to your new file create new blocks on disk. This is very space-efficient. So if you decide to roll your own file-versioning scheme for Affinity files, use 'cp -c' and not just 'cp'.
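A sketch of that roll-your-own versioning step (filenames are illustrative; `cp -c` only does a clonefile-backed copy on macOS/APFS, so this falls back to a plain byte copy elsewhere):

```shell
#!/bin/sh
# Sketch: timestamped snapshots of a design file, copy-on-write when possible.
set -eu

WORK=$(mktemp -d)
cd "$WORK"
printf 'big design data' > design.afdesign

snapshot() {
    dest="design.$(date +%Y%m%d-%H%M%S).$$.afdesign"
    # Try a clonefile(2)-backed copy first (macOS); fall back to a byte copy.
    cp -c design.afdesign "$dest" 2>/dev/null || cp design.afdesign "$dest"
    echo "$dest"
}

snap=$(snapshot)
cmp -s design.afdesign "$snap" && echo "snapshot matches original"
```

On APFS the clone costs almost no disk space until the original is modified, which makes frequent snapshots of multi-gigabyte files practical.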

A different option would be to keep your files on a file system with built-in snapshots at the block level, e.g. on a NetApp, or Solaris/FreeBSD's ZFS. These are, to repeat some words from the previous paragraph, copy-on-write file systems, which is why they can do this sort of thing efficiently.

Sorry for the digression into the benefits of file systems, but I have long had an interest in files and file systems; it's one of my pet subjects. At a place where I worked a few years ago we created several million new files a day, and in all that time I can think of only one file that actually suffered a problem (due to the underlying hardware).


On 6/3/2022 at 3:28 PM, Pšenda said:

Somehow I don't understand why there should be a problem with SVN-like systems. The only problem, given the nature of the Affinity data files, is that the efficiency of the incremental backups will be low and the version database will grow very quickly.

Thanks @Pšenda, there isn't a problem with using SVN as such - my point came from a software-developer perspective, where branches and merges are a clear benefit but appear impossible with the current offerings. And, yes, snapshots are really the only use I can see, which, of course, will make the database grow.


On 6/3/2022 at 4:23 PM, walt.farrell said:

So far my approach has been to direct the Affinity application to Open or Save files into the git repository directory, and at this early stage it's going fine. Commits, etc. are then done via the git client. I have not yet extended this to using remote repositories, with cloning, etc. but I foresee no difficulties in doing so.

Thanks @walt.farrell I might just give this a try. This is what I was thinking of doing when I posted the original message but was curious about how others might do this.


Thanks @LondonSquirrel, I hadn't considered the use of something like bsdiff until now. I used a patch builder under Windows many years ago, and I can see there are several equivalent offerings on GitHub, so I might investigate that option. I'm Windows based so ATM I'm not sure how your discussion of the cp command could be applied. I've just acquired a new QNAP NAS, so disk space isn't a problem, and QTS is a close relative of Linux, so maybe I could push that functionality across to that box. It's early days on that one, though, and I need to do some more homework before I'd feel comfortable having my paid work rely on it. Thanks for the digression - it's all interesting stuff.


1 hour ago, Paul Mc said:

I'm Windows based so ATM I'm not sure how your discussion of the cp command could be applied.

Most easily, probably, via WSL (or, better, WSL2), but also via other methods of running UNIX/Linux tools under Windows. Possibly even in Git Bash, which the Git installations I've used for Windows provide. 

 

-- Walt



10 hours ago, walt.farrell said:

Most easily, probably, via WSL (or, better, WSL2), but also via other methods of running UNIX/Linux tools under Windows. Possibly even in Git Bash, which the Git installations I've used for Windows provide. 

 

Thanks @walt.farrell I will take a look.


On 6/3/2022 at 1:43 PM, Paul Mc said:

Is anyone out there successfully using version control with Affinity products? Is there any other strategy to consider other than using a well defined folder/file naming scheme?

Have you tried to save your work history – with branches – within your document(s)?

https://affinity.help/designer/English.lproj/index.html?page=pages/Panels/historyPanel.html?title=History panel

This allows you to keep a lot of the changes "active" without bloating disk usage.

As @walt.farrell points out, placing large binary blobs into a VCS like Git does work, but it takes up a lot of disk space. From a "hacker's" point of view, I'd try to keep the major work history within the project document and only place significant milestones, e.g. a customer delivery, under external version control (most likely with Git), where you can mark them with tags.
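That milestone-tagging approach might look like this (a sketch; repository, filenames, tag names, and git config values are all illustrative):

```shell
#!/bin/sh
# Sketch: commit only significant milestones and mark them with annotated tags.
set -eu

REPO=$(mktemp -d)
cd "$REPO"
git init -q .
git config user.email "demo@example.com"
git config user.name  "Demo"

printf 'delivery blob' > brochure.afpub     # stand-in for the saved file
git add brochure.afpub
git commit -qm "Milestone: customer delivery 1"

# An annotated tag gives the milestone a stable, human-readable name.
git tag -a delivery-1 -m "Sent to customer for approval round 1"

git tag --list                              # lists: delivery-1
```

Later, `git checkout delivery-1 -- brochure.afpub` recovers exactly the file that went to the customer, without wading through intermediate saves.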


1 hour ago, Andreas Scherer said:

Have you tried to save your work history – with branches – within your document(s)?

https://affinity.help/designer/English.lproj/index.html?page=pages/Panels/historyPanel.html?title=History panel

This allows you to keep a lot of the changes "active" without bloating disk usage.

One problem with that approach is that sometimes (far too often, actually) it seems that after one Saves an Affinity document one cannot re-Open it. If all your "backups" are within that one file, they're all gone.

Saving into a Git-managed folder, and then staging/committing the change, will give you multiple levels of recovery. As will using Save As under a new name (which is easier than setting up a Git repository :) ).

Ideally, though, after a Save or a Save As one will then try to Open that file, to make sure it wasn't corrupted during the Save process. But the only real way to do that and know, for sure, you have a good copy is (I think) to do Save As twice:

  1. Save As using new name name1. This saves a copy, and leaves you editing name1.
  2. Save As using new name name2. This saves another copy, and leaves you editing name2.
  3. Open name1, to prove that it was saved successfully.
  4. If step 3 works, you can Close name2, and if you want, you can delete it. Or, you can keep it as another level of backup, in case you manage to corrupt name1 somehow while working on it.

-- Walt



On 6/3/2022 at 1:43 PM, Paul Mc said:

I'm just wondering if there is scope for a setting to keep a number of generations of backups of a design file, getting bumped on each save?

Do you mean something like that?

 



Thanks @Andreas Scherer

I knew about this feature but hadn't considered using it in this context. I've used it where there is a mainline of development with minor tweaks that may need to be removed. It seemed to work OK, but the final file didn't allow easy traversal and review of all the "final nodes" of the design without exhaustively cycling through all the decision points.

For the moment I'm trying out the @walt.farrell method above and will suffer the disk hit, as I think that will be the most robust in the face of client change requests.


1 minute ago, Pšenda said:

Do you mean something like that?

 

Yes! It has not happened to me recently but I think it's only a matter of time.

Separate from this, a recent project had me managing a complex tree of design alternatives and tweaks across several product briefs which were unique but had some elements of consistency. It was really hard work each time a change request came through. Reviewing it and working backwards, it was easy to see what I should have done 😊. I have a good backup system running here, but even that felt too infrequent to save me should there have been a failure - or a mistake.


Side note: For the time being, it seems I'm one Affinity user who hasn't encountered file corruption with any of the three apps (versions 1.8.4 to 1.10.5) on any of the Mac Minis in use. Fingers crossed! 🤓

For quite some time I've been using single Affinity Designer project files with multiple artboards, but I've tried the “history” feature just today. Obviously, its interface is far from being comparable to a “version control system” (maybe the “snapshot panel” can be of help). Provided that the internal document structure is in one form or another “text-based” (XML?), it would be a nice (but major) extension to have a genuine “VCS panel“ with tagging and branching etc.


17 minutes ago, Andreas Scherer said:

Provided that the internal document structure is in one form or another “text-based” (XML?),

It's not, as far as I can see.

-- Walt



Well, version control systems were not originally meant to deal with huge binary data in the same fine-grained manner as text/code files, but there are some tools and concepts available which at least make versioning of huge binary data workable too.

People interested in this topic should take a look at Git LFS and lakeFS.
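As a sketch of how Git LFS is wired up: running `git lfs track "*.afdesign"` (with Git LFS installed) records filter rules in a `.gitattributes` file, so the large blobs go to LFS storage and the repository itself only holds small pointer files. The patterns below are illustrative:

```
# .gitattributes - route Affinity files through Git LFS
*.afdesign filter=lfs diff=lfs merge=lfs -text
*.afphoto  filter=lfs diff=lfs merge=lfs -text
*.afpub    filter=lfs diff=lfs merge=lfs -text
```

This keeps clones fast and the main object database small, at the cost of needing an LFS-capable remote.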

☛ Affinity Designer 1.10.5 ◆ Affinity Photo 1.10.5 ◆ OSX El Capitan

