
Save Failed because ownership of the file could not be verified


2 minutes ago, walt.farrell said:

Of course they use buffers. But they have told us they do not load all of the file into memory, nor transfer it all into a local file. That's why the loss of the connection to a network file makes continuing impossible.

OK, then transfer just what you need for a graceful startup, or, if you determine that the file to load is not local but comes over a network connection, fetch it first to local temp storage and then load as much as you need from that. Either way they have to deal with this, since they seem to have problems reading/writing files on NAS devices, cloud drives, etc. So they need some mechanism around that which ensures correct file handling. I believe there are different possible scenarios for somehow overcoming the current limited-accessibility situation.

12 minutes ago, walt.farrell said:

They have a local file, but it's the auto-recovery file, which just saves changes to the original file to allow recovery after a restart if the program crashes.

That would be where I would start looking: (re)use some of the recovery machinery to handle failures when writing across network boundaries, so that the current state can be safely recovered in case of timeouts and failures. The point is simply not to lose the work (the editing state) a user has already done in the app's working memory if the data can't be written over the network. Usually you don't want to close your document, and possibly lose your last edits, because of a network-connectivity failure or an unreachable network drive. For reading a file there shouldn't be any data loss anyway: either a file can be read in (loaded) or not. As long as the file isn't altered or damaged, one can retry until some time interval is reached; if timeouts keep occurring, abort and let the user re-initiate the load operation on demand.
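The retry-until-deadline idea described in words above could look roughly like this (a minimal sketch only; the function name, deadline, and delay values are invented for illustration and are not anything Affinity actually implements):

```python
import time
from pathlib import Path

def read_with_retry(path, deadline_s=30.0, delay_s=2.0):
    """Try to read a (possibly network-backed) file, retrying transient
    I/O errors until a deadline is reached; then give up so the user can
    re-initiate the load on demand."""
    start = time.monotonic()
    while True:
        try:
            return Path(path).read_bytes()
        except OSError as exc:  # covers timeouts, unreachable shares, etc.
            if time.monotonic() - start >= deadline_s:
                raise TimeoutError(
                    f"could not read {path!r} within {deadline_s}s") from exc
            time.sleep(delay_s)
```

Crucially, nothing in the app's working memory is discarded while this loop runs; the document state stays intact and only the read operation is abandoned.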


☛ Affinity Designer 1.7.1 ◆ Affinity Photo 1.7.1 ◆ OSX El Capitan

7 hours ago, v_kyr said:

OK, then transfer just what you need for a graceful startup, or, if you determine that the file to load is not local but comes over a network connection, fetch it first to local temp storage and then load as much as you need from that.

There is no way to know in advance how much of the file would be needed for any particular purpose so the only alternative is to download the whole thing. But there are several issues to consider about doing that. The most obvious one is there would be two copies of the file, one local & one remote, that need to be kept in sync somehow at various times, & that would require more CPU time, more network traffic, more local storage space, & so on. 

7 hours ago, v_kyr said:

For reading a file there shouldn't be any data loss anyway: either a file can be read in (loaded) or not. As long as the file isn't altered or damaged, one can retry until some time interval is reached; if timeouts keep occurring, abort and let the user re-initiate the load operation on demand.

But the file is not loaded all at once into working memory, nor is it normally completely rewritten on each save (because the change data is serialized). If it helps, think of the document somewhat like a large database file -- you would not read the whole thing sequentially into memory or rewrite the whole thing on each change because that would be extremely inefficient & slow. Instead, you structure it so that you can fetch just the data you need at any particular time from it & write just the changes you make to that back to the file. We still have to do a lot of guessing about how this actually works, so the analogy will only stretch so far, but I think it might help explain why they recommend working on local copies.


Affinity Photo 1.7.1, Affinity Designer 1.7.1, Affinity Publisher 1.7.1; macOS High Sierra 10.13.6 iMac (27-inch, Late 2012); 2.9GHz i5 CPU; NVIDIA GeForce GTX 660M; 8GB RAM
Affinity Photo 1.7.1.143 & Affinity Designer 1.7.1.1 for iPad; 6th Generation iPad 32 GB; Apple Pencil; iOS 12.3.1

46 minutes ago, R C-R said:

but I think it might help explain why they recommend working on local copies.

They recommend working on local files for the sole reason that their file system/software etc does not currently play nice with "networked" drives.

In the corporate world of networked drives and cloud services, it's a bit naïve for a software firm with a new product to effectively say "only use it on documents stored on local drives".

In the corporate world I used to work in, there was no way you could ever copy a file from a networked drive onto your local drive, for reasons of security, version control, backups and easy file sharing, to name just a few.

Multiple threads and discussions on why they recommend working only on local drives ignore the elephant in the room: their software should be capable of working on all drives, so it's either a bug or a failing in their implementation of the software which needs urgent fixing.

Which, hopefully, they are working on as a high priority.


Due to the ongoing Brexit negotiations, punctuation, spelling and grammar will be used sparingly until further notice.

7 minutes ago, carl123 said:

Their software should be capable of working on all drives so it's either a bug or a failing in their implementation of the software which needs urgent fixing.

But at what price should it be made capable of doing that?

Do you think users would be happy if, for example, they had to wait an unknown period of time for the network connection to be restored before they could continue to edit a document stored on a remote server? Would they be happy if VM use shot through the roof & performance took a huge hit because the 'fix' was to download temp copies of the complete contents of every file they wanted to work on simultaneously from drives accessed over an uncertain connection? How much RAM & available scratch space do you think they would need to mitigate that?

As for the corporate world, how many corporate Photoshop users do you think try to work directly on files stored on remote servers instead of downloading local copies & working on them?

You may not be happy with the limitations imposed by the way Affinity accesses document files, but it offers certain benefits I suspect most users would not want to do without, at least if they understood the tradeoffs involved. Consider, for instance, everything users may need to do to optimize Photoshop performance, most of which is a non-issue for Affinity users.

So while, like everyone else, I hope they can do something to mitigate the 'must close' issue, I do not want them to do so at the expense of throwing the baby out with the bathwater, so to speak.


Affinity Photo 1.7.1, Affinity Designer 1.7.1, Affinity Publisher 1.7.1; macOS High Sierra 10.13.6 iMac (27-inch, Late 2012); 2.9GHz i5 CPU; NVIDIA GeForce GTX 660M; 8GB RAM
Affinity Photo 1.7.1.143 & Affinity Designer 1.7.1.1 for iPad; 6th Generation iPad 32 GB; Apple Pencil; iOS 12.3.1

2 hours ago, R C-R said:

There is no way to know in advance how much of the file would be needed for any particular purpose so the only alternative is to download the whole thing. But there are several issues to consider about doing that. The most obvious one is there would be two copies of the file, one local & one remote, that need to be kept in sync somehow at various times, & that would require more CPU time, more network traffic, more local storage space, & so on. 

Look at your web browsers: they do this all the time; every piece of content you see is transferred to your local temp directories over the net. When you watch that mighty long YouTube video, it is transferred in chunks and resides in your temp cache (one local copy and one remote), etc. But you don't notice it; the browser doesn't get slower or show obvious memory problems. Generally, a lot of apps shift things around in temp stores and cache locations; that's common practice. Further, you can remove/delete the locally cached copy after a successful transfer and load into the app: just empty the local cache after a successful read operation.
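The browser-style spool-to-local-temp approach described here can be sketched in a few lines (illustrative only; the function name, suffix, and chunk size are arbitrary choices, not anything from Affinity's code):

```python
import shutil
import tempfile

def fetch_to_local_temp(src_path, chunk_size=1 << 20):
    """Spool a (possibly network-resident) file into local temp storage
    in fixed-size chunks, the way a browser spools a download, and
    return the local path. The caller loads from the local copy and can
    delete it after a successful read, emptying the cache."""
    tmp = tempfile.NamedTemporaryFile(delete=False, suffix=".cache")
    with open(src_path, "rb") as src, tmp:
        shutil.copyfileobj(src, tmp, length=chunk_size)
    return tmp.name
```

The chunked copy keeps peak memory use bounded by the chunk size regardless of how large the remote file is.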

 

2 hours ago, R C-R said:

But the file is not loaded all at once into working memory, nor is it normally completely rewritten on each save (because the change data is serialized). If it helps, think of the document somewhat like a large database file -- you would not read the whole thing sequentially into memory or rewrite the whole thing on each change because that would be extremely inefficient & slow. Instead, you structure it so that you can fetch just the data you need at any particular time from it & write just the changes you make to that back to the file. We still have to do a lot of guessing about how this actually works, so the analogy will only stretch so far, but I think it might help explain why they recommend working on local copies.

That's all speculation; you don't know for sure how the file format is structured, even if it is built like a compound backup file (initial state plus increments, etc.) with some LZW compression around it. It also doesn't matter: you can build additional transfer strategies around the usual read/write mechanisms without altering the format.


☛ Affinity Designer 1.7.1 ◆ Affinity Photo 1.7.1 ◆ OSX El Capitan

16 minutes ago, v_kyr said:

Look at your web browsers ...

I don't understand why you keep trying to compare this to web browsers. It is not a valid comparison. Among other things, video buffers are relatively tiny & don't store much more than a few seconds of content. Over a slow or intermittent connection the video is likely to stall, usually with a 'buffering' indicator of some kind. They are emptied or deleted automatically at some point, as are all caches & other temp files. Besides, browsers do in fact become slower to respond when data has to be paged in & out of working memory, as do other apps that might be running concurrently. It is never just about the performance of one app; it is always about all of the processes running on the computer, including system-level ones that apps or users have no direct control over.

32 minutes ago, v_kyr said:

That's all speculation; you don't know for sure how the file format is structured.

It is true that we don't know all of the details, but they have told us in general terms enough to know that the entire document is not loaded into memory all at once, that native format files are in some unspecified way serialized, that this is done in part so incremental saves can be done very quickly in the background (& that the misleadingly named file recovery feature depends on this to avoid being intrusive), & that there is a threshold for when the files are 'de-serialized' & the whole file is rewritten.

We also know that what eventually became the Affinity range of software began life as a research project at Serif to test some ideas about how much could be done in a graphics editing app with very little available memory (& was tested using older generation iPads), & that they tout "the most advanced memory management system available" as a hallmark feature of Affinity.

So while I am in no way trying to deny that some of this is completely speculative, neither is it all merely conjecture based on wild guesses.


Affinity Photo 1.7.1, Affinity Designer 1.7.1, Affinity Publisher 1.7.1; macOS High Sierra 10.13.6 iMac (27-inch, Late 2012); 2.9GHz i5 CPU; NVIDIA GeForce GTX 660M; 8GB RAM
Affinity Photo 1.7.1.143 & Affinity Designer 1.7.1.1 for iPad; 6th Generation iPad 32 GB; Apple Pencil; iOS 12.3.1

7 minutes ago, R C-R said:

I don't understand why you keep trying to compare this to web browsers. It is not a valid comparison.

Because it's a pretty good example of implementations that load/transfer data across network boundaries. Further, all of these include and make use of download-manager code that does continuous data transfer on a file basis from point A to B. So I used web browsers here just as an example of over-the-network data transfer, wondering whether some underlying principle or technique of theirs could be reused to implement a safer, network-capable file load/save mechanism. And BTW, the Affinity apps also make use of browser components in their "Welcome" panel.

In terms of the file format, that's all fine and well, as long as you can use it across networks, which does not always seem to be the case. Therefore, you may have to put a transport layer around it to ensure secure handling of the file transfer without data loss, or some other mechanism that ensures stability.


☛ Affinity Designer 1.7.1 ◆ Affinity Photo 1.7.1 ◆ OSX El Capitan

46 minutes ago, v_kyr said:

Therefore, you may have to put a transport layer around it to ensure a secure handling of the file transfer without data loss.

Just so there is no misunderstanding about what you mean, you are not suggesting that the app itself should provide the process to do that, right? Allowing any application level process to do that would create a huge security risk.

That's part of why I suggested just adding a warning about opening network files & allowing the user to decide if they want to download them or not.


Affinity Photo 1.7.1, Affinity Designer 1.7.1, Affinity Publisher 1.7.1; macOS High Sierra 10.13.6 iMac (27-inch, Late 2012); 2.9GHz i5 CPU; NVIDIA GeForce GTX 660M; 8GB RAM
Affinity Photo 1.7.1.143 & Affinity Designer 1.7.1.1 for iPad; 6th Generation iPad 32 GB; Apple Pencil; iOS 12.3.1

16 minutes ago, R C-R said:

That's part of why I suggested just adding a warning about opening network files & allowing the user to decide if they want to download them or not.

That's a bit like having a car that warns you every time you are about to join the motorway/highway that it might unexpectedly crash.

Not what you would want to hear or have to make a decision about


Due to the ongoing Brexit negotiations, punctuation, spelling and grammar will be used sparingly until further notice.

Just now, carl123 said:

That's a bit like having a car that warns you every time you are about to join the motorway/highway that it might unexpectedly crash.

Not what you would want to hear or have to make a decision about

Or it's like Windows popping up the UAC warning: a bit annoying at times, but extremely useful!
I agree with this: if there isn't any elegant solution to FIX it (fix = make it work as it should), then a warning with the possibility of disabling it (don't show it anymore) would be the common-sense thing.
Why? Because many (most) users would expect the software to work with a networked drive, not knowing there might be interruptions in communication that could lead to their work being lost. So, a warning would... warn.

1 minute ago, derei said:

So, a warning would... warn.

And a proper fix would...fix


Due to the ongoing Brexit negotiations, punctuation, spelling and grammar will be used sparingly until further notice.

2 minutes ago, carl123 said:

And a proper fix would...fix

From what I see here, people are debating a lot in terms of fixing... opinions are strong on both sides.
It looks very much like Brexit ... to make a reference to your signature :))))) ... so, we'll wait and see. In the meantime, I'll hire an Egyptian Scribe to sync my files every time.

7 minutes ago, carl123 said:

And a proper fix would...fix

The problem with a lot of so-called "proper" fixes is they break things more important or useful than whatever they are intended to fix. <Insert snide comment about Brexit here>

EDIT: Dang! @derei beat me to it. xD


Affinity Photo 1.7.1, Affinity Designer 1.7.1, Affinity Publisher 1.7.1; macOS High Sierra 10.13.6 iMac (27-inch, Late 2012); 2.9GHz i5 CPU; NVIDIA GeForce GTX 660M; 8GB RAM
Affinity Photo 1.7.1.143 & Affinity Designer 1.7.1.1 for iPad; 6th Generation iPad 32 GB; Apple Pencil; iOS 12.3.1

45 minutes ago, R C-R said:

Just so there is no misunderstanding about what you mean, you are not suggesting that the app itself should provide the process to do that, right? Allowing any application level process to do that would create a huge security risk.

Of course the app should initiate, handle, and control this process; by "secure handling of the file transfer" I meant the procedure of loading/writing/sharing files across network boundaries in some graceful manner. Also, "security risk" is always a pretty relative term here, considering how much third-party code from libraries, APIs, etc. is reused in the apps without knowing how many security holes all of those might contain.

Quote

That's part of why I suggested just adding a warning about opening network files & allowing the user to decide if they want to download them or not.

If they can't fix or enhance their file load/save mechanisms in this direction, due to whatever software-design (the underlying code) or programming reasons, that would be the last thing to do: namely, leaving things as they are and saying "sorry Dave, I can't do that" because you might encounter possible misbehavior.


☛ Affinity Designer 1.7.1 ◆ Affinity Photo 1.7.1 ◆ OSX El Capitan

4 hours ago, v_kyr said:

Of course the app should initiate, handle and control this process ...

No app should ever be allowed to control any file or memory process directly. Only the OS should be able to do that. Anything else creates a huge security vulnerability for the entire system.

4 hours ago, v_kyr said:

If they can't fix or enhance their file load/save mechanisms in this direction, due to whatever software-design (the underlying code) or programming reasons, that would be the last thing to do.

No, the last thing they should do is apply any 'fix' that has a net negative effect on the usability or value of the software. A basic principle that applies no less to software design than to any other engineering discipline is that good design is the result of finding the right balance among mutually conflicting goals. This is sometimes referred to as the "pick two" principle but of course it really involves much more than considering just three mutually conflicting goals.


Affinity Photo 1.7.1, Affinity Designer 1.7.1, Affinity Publisher 1.7.1; macOS High Sierra 10.13.6 iMac (27-inch, Late 2012); 2.9GHz i5 CPU; NVIDIA GeForce GTX 660M; 8GB RAM
Affinity Photo 1.7.1.143 & Affinity Designer 1.7.1.1 for iPad; 6th Generation iPad 32 GB; Apple Pencil; iOS 12.3.1

6 minutes ago, R C-R said:

No app should ever be allowed to control any file or memory process directly. Only the OS should be able to do that. Anything else creates a huge security vulnerability for the entire system. 

Not sure what you are talking about; maybe you are misinterpreting what I initially meant. By "process" I meant the sequence (the procedure, the flow), not any operating-system processes or the like. Nor did I mean the way the OS handles memory management.

26 minutes ago, R C-R said:

No, the last thing they should do is apply any 'fix' that has a net negative effect on the usability or value of the software...

Nobody said it should have a negative effect on the software; who said or demanded that?

Further, software design and programming are always iterative processes; you will hardly ever have a perfect design or a bug-free program from the ground up. Requirements change, extensions are required and demanded, etc. Anyone claiming otherwise about software engineering is pretty naïve, or has never worked on and gained experience with even slightly complex real-world software projects.


☛ Affinity Designer 1.7.1 ◆ Affinity Photo 1.7.1 ◆ OSX El Capitan

25 minutes ago, v_kyr said:

Nobody said it should have a negative effect on the software; who said or demanded that?

Of course it should not have an overall negative effect. The issue is if it is practical to do so, taking into account real & virtual memory use (which is not under application control), network connectivity/stability (also not under application control) & everything else that has been mentioned in the last dozen plus posts.

For example, what specifically should the app do if it (actually, the OS on behalf of the app) can't fetch data from the network drive or write back to it?


Affinity Photo 1.7.1, Affinity Designer 1.7.1, Affinity Publisher 1.7.1; macOS High Sierra 10.13.6 iMac (27-inch, Late 2012); 2.9GHz i5 CPU; NVIDIA GeForce GTX 660M; 8GB RAM
Affinity Photo 1.7.1.143 & Affinity Designer 1.7.1.1 for iPad; 6th Generation iPad 32 GB; Apple Pencil; iOS 12.3.1

33 minutes ago, R C-R said:

For example, what specifically should the app do if it (actually, the OS on behalf of the app) can't fetch data from the network drive or write back to it?

At least not this here (the bottom text line of "doc must be closed") ...

[attached screenshot: failed.png]

... Also, there were fair improvement suggestions made back in 2017 in (Fail to save document after network drive reconnects), but I haven't heard so far that it has been solved or touched at all ...

Quote

I think we could be doing better with regards to losing the work even when the drive has been reconnected, and have made a bug report for this.

I've already mentioned further suggestions above in this thread and I see no reason to repeat myself here again!

 


☛ Affinity Designer 1.7.1 ◆ Affinity Photo 1.7.1 ◆ OSX El Capitan

7 hours ago, v_kyr said:
8 hours ago, R C-R said:

For example, what specifically should the app do if it (actually, the OS on behalf of the app) can't fetch data from the network drive or write back to it?

At least not this here (the bottom text line of "doc must be closed") ...

What should not happen is in no way an answer to what should happen, specifically or otherwise.


Affinity Photo 1.7.1, Affinity Designer 1.7.1, Affinity Publisher 1.7.1; macOS High Sierra 10.13.6 iMac (27-inch, Late 2012); 2.9GHz i5 CPU; NVIDIA GeForce GTX 660M; 8GB RAM
Affinity Photo 1.7.1.143 & Affinity Designer 1.7.1.1 for iPad; 6th Generation iPad 32 GB; Apple Pencil; iOS 12.3.1

On 1/18/2019 at 2:54 AM, v_kyr said:

...so that the current state can be safely recovered in case of timeouts and failures. The point is simply not to lose the work (the editing state) a user has already done in the app's working memory if the data can't be written over the network. Usually you don't want to close your document, and possibly lose your last edits, because of a network-connectivity failure or an unreachable network drive. For reading a file there shouldn't be any data loss anyway: either a file can be read in (loaded) or not. As long as the file isn't altered or damaged, one can retry until some time interval is reached; if timeouts keep occurring, abort and let the user re-initiate the load operation on demand.

 


☛ Affinity Designer 1.7.1 ◆ Affinity Photo 1.7.1 ◆ OSX El Capitan

1 hour ago, v_kyr said:

The point here is simply not to lose the work (the editing state) a user has already done in the app's working memory if the data can't be written over the network.

No, the point here is how to do that, considering all that is involved, including that for performance & other reasons all of a document file is not loaded into working memory at the same time or into local temp files.

You keep assuming that either it is all loaded at once or that there would be no significant downside in doing so. Neither is true.


Affinity Photo 1.7.1, Affinity Designer 1.7.1, Affinity Publisher 1.7.1; macOS High Sierra 10.13.6 iMac (27-inch, Late 2012); 2.9GHz i5 CPU; NVIDIA GeForce GTX 660M; 8GB RAM
Affinity Photo 1.7.1.143 & Affinity Designer 1.7.1.1 for iPad; 6th Generation iPad 32 GB; Apple Pencil; iOS 12.3.1

1 hour ago, R C-R said:

No, the point here is how to do that, considering all that is involved, including that for performance & other reasons all of a document file is not loaded into working memory at the same time or into local temp files.

You keep assuming that either it is all loaded at once or that there would be no significant downside in doing so. Neither is true.

Guys, this debate has become way too generic. There are two types of interruptions: short (milliseconds to seconds) and long (minutes, or even permanent). Two different approaches for two different situations:

  • if the interruption is short, the drive is most probably "sleeping" and needed a wake-up signal, or there was a short network drop, or something else. But this also means it's back in no time. So, after the first failed attempt, the software should warn the user WITHOUT CLOSING THE DOCUMENT, and the user can check the connection to the drive (in Explorer, for example) and then try again to save.
  • if the interruption is long or permanent (someone simply unplugged the drive, or the internet is down), there isn't much to do. Obviously the document has to be closed, but maybe a better recovery system could be implemented so that all tools and operations can be recovered since the last successful save to disk.

Now, as long as there isn't any "observer" or "listener" checking the connection continuously, I see no reason why there would be any negative impact on performance. Just don't close the document until the user performs another attempt to save. Keep everything on standby and there is a very good chance that the connection will be re-established with the help of the user. I really don't see where the issue is here, conceptually speaking.


Also, having a cached copy locally would be a safety measure, a redundancy point. Yes, more space occupied on the local disk, and I see no problem with that; storage media are meant to be used. I'd welcome a setting in Affinity letting me decide on which drive to create a "cached copy" of the current documents. I'd be happy to reserve ~10GB or so for that while Affinity software is open, just to make sure my files are safe. Of course, there are issues here: what if you have Photo, Designer and Publisher open at the same time? Well, whoever wants to do graphics usually invests in a decent machine with decent storage... there has to be a compromise somewhere.
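The "warn, keep the document open, let the user retry the save" behaviour described above could be prototyped roughly like this (a sketch only; `try_save` and its write-to-temp-then-rename strategy are my own illustration, not Affinity's actual save path):

```python
import os
import tempfile

def try_save(document_bytes, target_path):
    """Attempt one save. Return True on success, False on a transient
    I/O failure so the caller can keep the document open, warn the user,
    and let them retry. Writing to a sibling temp file first means a
    failed save never truncates or corrupts the original."""
    try:
        dir_name = os.path.dirname(target_path) or "."
        fd, tmp_path = tempfile.mkstemp(dir=dir_name)
        with os.fdopen(fd, "wb") as f:
            f.write(document_bytes)
            f.flush()
            os.fsync(f.fileno())       # make sure the bytes hit the disk
        os.replace(tmp_path, target_path)  # atomic rename on POSIX & NTFS
        return True
    except OSError:
        return False  # drive unreachable: document stays open, user retries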

20 minutes ago, derei said:

I see no reason why there would be any negative impact on performance. Just don't close the document until the user performs another attempt to save.

This seems reasonable to me. I haven't had any problem saving documents to my desktop's hard drive (I try to keep it as clean and empty as possible) and I don't use 'the cloud'. Networked drives are for backing up stuff I am working on.


MacBook Pro (13-inch, Mid 2012) Mac OS 10.12.6 || Mac Pro (Late 2013) Mac OS 10.14.5

Affinity Designer 1.7.1 | Affinity Photo 1.7.1 | Affinity Publisher 1.7.1 | Affinity Photo Beta 1.7.2.146 | Affinity Publisher Beta 1.7.2.422


@derei Yep: since you can't read and write (i.e., sync with network-based services) until network connectivity is established, retry at some given frequency. Once reconnected, the app could then asynchronously update the data to be loaded/saved; this could also be managed as a separate background process. In the case of saving, it can use local caching until the connection is re-established.

Quote

 

When using network services, an application may encounter conflicts when attempting to save data ...

  • An application must be able to resolve these conflicts in a way that provides the best user experience.
  • Since it is not always possible to avoid data conflicts, an application should make every effort to handle conflicts such that the user's data is preserved and the user has a good experience.
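The cache-locally-until-reconnected idea could be sketched like this (a name such as `DeferredSync` is invented for illustration; a real implementation would also need the conflict resolution the quoted guidance calls for, which this sketch omits):

```python
import os
import shutil

class DeferredSync:
    """Spool saves to a local cache directory and flush them to their
    network targets once those become reachable again. Simplified: keys
    pending entries by basename, so colliding names are not handled."""

    def __init__(self, cache_dir):
        self.cache_dir = cache_dir
        self.pending = {}  # target path -> cached local copy

    def save(self, data, target_path):
        """Always succeeds locally; the network copy is deferred."""
        local = os.path.join(self.cache_dir, os.path.basename(target_path))
        with open(local, "wb") as f:
            f.write(data)
        self.pending[target_path] = local

    def flush(self):
        """Retry all pending transfers (e.g. from a background task).
        Returns True once nothing is left queued."""
        for target, local in list(self.pending.items()):
            try:
                shutil.copyfile(local, target)
            except OSError:
                continue  # still unreachable; leave it queued
            del self.pending[target]
        return not self.pending
```

`flush()` would be driven by the retry-at-some-frequency loop or background process mentioned above.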

 

 


☛ Affinity Designer 1.7.1 ◆ Affinity Photo 1.7.1 ◆ OSX El Capitan

6 hours ago, derei said:

There are two types of interruptions: short (milliseconds to seconds) and long (minutes, or even permanent).

There should be no short interruptions due to the drive sleeping -- remember, as it works now, chunks of the file are read as needed. It's an open, active file, so the drive it is stored on should not go into deep sleep mode. I don't have a network drive to test this with, but I have some local external drives (& my Mac is set to allow drives to sleep when not active), & from what I can tell from opening files from one of them, they will not sleep as long as the file is open. That's by no means definitive, but maybe it's something someone with a locally connected network drive can test?

As for other brief interruptions, say those that last for no more than a second or two, I would be surprised if the app did not already have a built-in delay before deciding the connection was lost. If not, it should be simple enough to build that into the app.

Dealing with longer delays is more problematic. Again, it is not just about saving back to the drive, it is also about maintaining read access to it. When some chunk of the file not currently in working memory is needed by the app to complete some operation, all work on it must stop until it can be read into working memory. This would effectively hang the app (at least for work on that file), so there needs to be some 'graceful' way to deal with that.

This brings us back to the locally cached copy idea. This is also not without some problems that would need to be sorted out to make it practical. Something to keep in mind is that the frontmost app is just one of dozens of processes running on the machine, all of which need working memory to run. In the Mac OS memory management system, there are four types of real memory: active, inactive, wired, & free. (I assume there is something similar for Windows.) How it works is complicated, but wired memory cannot be paged out, & inactive memory can either be paged out or discarded, depending on whether any changes to the data it contains need to be or already have been saved back to the file.

Here is where it starts to get tricky. When there is not enough active memory to run all the processes that need it, & all inactive memory has been freed & reallocated, the memory management system starts paging in & out to the backing store (more or less VM), which clobbers performance, not just for the frontmost app but for everything. Apps have no say in this. Thus, it is important to keep the memory 'footprint' of apps as small as possible & to free up any memory they don't need ASAP.

Backup copies of document files cached locally on the boot drive would help here, since presumably this offers the fastest & highest-priority access to the file's data. The same cannot be assumed for other locally connected drives, making them a less desirable choice, & it cannot be assumed that every system even has other locally connected drives, so there still needs to be a 'Plan B' alternative for that.

There are also other 'Plan B' issues that can't be ignored. The most obvious is that there could be several very large files open at any time, so full backup copies of them would use a lot of drive space, in addition to whatever temp files the app creates & everything else that needs caches or temp working files. At least on Macs, when the boot drive begins to get full, everything slows down, & in the worst-case scenario the OS itself crashes when it can't find enough free drive space to page whatever is required out of working memory to make room for the OS-level processes it needs to run. (I think this is why, at least in the Mac versions, there is an adjustable "Disk Usage Warning" preference.)

Anyway, it obviously would not be acceptable to implement anything in the Affinity apps that could, even in extreme cases, crash the OS!

There are some other potential issues with relying on large local caches, but this is already so long I won't get into them. What I will close with is that I can see only two reasonably easy, mostly problem-free ways to avoid this issue. One is the previously mentioned 'proceed at your own risk' warning. The other is effectively the same as how this works in the iPad apps: download the file itself to the device's storage (instead of a cached version), work with that, & let the user decide if or when they want to copy it back to the remote drive.
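The second option, the iPad-style download/work/copy-back model, reduces to something like this (purely illustrative function names; the "checkout/checkin" framing is mine, not Serif's):

```python
import os
import shutil
import tempfile

def checkout(remote_path):
    """Download the document itself (not just a cache) to local storage,
    the way the iPad apps do; all editing then happens on the local copy,
    so a lost network connection cannot interrupt work in progress."""
    local = os.path.join(tempfile.mkdtemp(), os.path.basename(remote_path))
    shutil.copyfile(remote_path, local)
    return local

def checkin(local_path, remote_path):
    """Copy the edited local file back to the remote drive -- only if and
    when the user explicitly decides to."""
    shutil.copyfile(local_path, remote_path)
```

The key property is that the remote drive is touched only at two well-defined moments (checkout and checkin), both of which can fail safely and be retried without losing the local working copy.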


Affinity Photo 1.7.1, Affinity Designer 1.7.1, Affinity Publisher 1.7.1; macOS High Sierra 10.13.6 iMac (27-inch, Late 2012); 2.9GHz i5 CPU; NVIDIA GeForce GTX 660M; 8GB RAM
Affinity Photo 1.7.1.143 & Affinity Designer 1.7.1.1 for iPad; 6th Generation iPad 32 GB; Apple Pencil; iOS 12.3.1

