
Bug report: huge file size compared to assets included in document


Recommended Posts

I have a simple Affinity Designer document with one image and some line work.

 

When saved without the image, it is a 13 KB .afdesign file.

 

When I add a 750 KB JPG to the document, the saved .afdesign file is almost 16 MB.

 

Why is this?

 

Is there anything I can do to make the .afdesign file size closer to the size of the JPG it contains?

 

PS: I am not saving history with the file.

 

happens with:

1.5.4

1.5.5 - Beta 1



  • Staff

The Designer file includes a losslessly compressed version of the JPEG for performance reasons (Photoshop does a similar optimisation). We could add an option not to store the losslessly compressed version, but it would be slower. This isn't a bug, but a possible new feature.
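To put rough numbers on the gap: a JPEG routinely achieves 20:1 or better lossy compression on photos, while general-purpose lossless compression of the decoded pixels rarely does much better than 2:1. A back-of-the-envelope sketch, where the pixel dimensions and compression ratio are assumptions for illustration only, not values read from the attached file:

```python
# Rough estimate of why a ~750 KB JPEG can become a ~16 MB .afdesign file once
# the decoded pixels are kept in a losslessly compressed form.
# All figures below are assumptions for illustration only.

width, height, channels = 3000, 2000, 4   # hypothetical RGBA raster
jpeg_size = 750 * 1024                    # the placed file, ~750 KB

raw_size = width * height * channels      # decoded pixels, no compression
lossless_ratio = 0.65                     # assumed zlib-style ratio on photographic data
stored_size = raw_size * lossless_ratio

print(f"JPEG on disk:        {jpeg_size / 1e6:5.1f} MB")
print(f"decoded raster:      {raw_size / 1e6:5.1f} MB")
print(f"lossless compressed: {stored_size / 1e6:5.1f} MB")
```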


Hi lexislav,

Linked files should be introduced with Affinity Publisher and will eventually be ported back to Designer/Photo.

 

Yes, I know about that info.

I have recreated my current Affinity file (2.6 GB) in Illustrator with the same PSDs etc.; the .ai file is less than 305 MB…


As soon as Publisher and multiple pages arrive, this is likely going to be an issue, even when you're not embedding but just linking, assuming Affinity stores a preview in the document. You'll get people with terabyte-sized files (not to mention photographers who edit hundreds of high-res pictures per shoot and store layered files).

 

Or take InDesign as an example – if you place a lot of images that have a DPI value of, say, 96 set in the file metadata, it will assume the image is really big (in terms of physical area) and thus store a close-to-full-resolution preview, resulting in extremely large, bloated document files even though the image files are only linked.
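As a purely illustrative sketch of that DPI effect (the pixel dimensions are made up, and the preview-sizing behaviour is only as described above, not verified against InDesign internals):

```python
# The same pixel data implies a much larger physical area at 96 DPI than at
# 300 DPI; a layout app that sizes previews by placed area would therefore
# keep far more preview pixels for the 96 DPI file.
pixels_w, pixels_h = 4000, 3000   # hypothetical image

for dpi in (300, 96):
    inches_w = pixels_w / dpi
    inches_h = pixels_h / dpi
    print(f"{dpi:3d} DPI -> placed at {inches_w:.1f} x {inches_h:.1f} inches")
```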

 

I know of course nothing about the Affinity architecture, but I'd expect that file I/O bandwidth would really be the limiting factor and that decompression could basically happen in the downtime when the processor is waiting for the data to arrive from the drive, so the speed penalty should theoretically be minimal. Of course this might be different on platforms like iOS where storage is quite fast.
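A minimal sketch of that idea, assuming (purely for illustration, since the real file format isn't documented here) that the embedded data is stored as a sequential zlib stream read in fixed-size chunks: the main thread keeps reading from disk while a worker thread decompresses the previous chunk, so the CPU work hides inside the I/O wait.

```python
import zlib
from concurrent.futures import ThreadPoolExecutor

CHUNK = 1 << 20  # read 1 MiB of compressed data at a time (arbitrary choice)

def load_overlapped(path):
    """Decompress a zlib stream while the next chunk is still being read."""
    decomp = zlib.decompressobj()
    parts = []
    with open(path, "rb") as f, ThreadPoolExecutor(max_workers=1) as pool:
        pending = None
        while chunk := f.read(CHUNK):                 # disk read on the main thread
            if pending is not None:
                parts.append(pending.result())        # finished while we were reading
            pending = pool.submit(decomp.decompress, chunk)  # CPU work off-thread
        if pending is not None:
            parts.append(pending.result())
    return b"".join(parts) + decomp.flush()
```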

 

Recent Linux and now also macOS versions have a module that losslessly compresses memory pages before the system starts swapping, because it's still faster to compress and decompress data in memory on the fly than to save and load uncompressed chunks from disk. I've always wondered why Adobe hasn't added anything like that to Photoshop's memory manager, but I guess the dev team is busy with things like coding HTML5 skins such as Design Space.

 

Unless there is really a major slowdown associated with this, I'd definitely say there should be an option (or, alternatively, a good behind-the-scenes mechanism) to store the original compressed version in Affinity files instead of the full uncompressed raster data. That way, it would also be possible to "unembed" the original image data rather than a huge uncompressed file or a lower-quality re-compressed one.


If file size is a problem, I have found that simply rasterizing the images (even at their original size) will make the file smaller.

 

That is a good tip, and I practice it myself.

However, the issues with rasterization limit this option, or at least make it… complicated. Objects positioned partially outside the artboard are clipped to the artboard when rasterized, even the parts in the bleed area. That really sucks, as data you may still need is lost, which means a lot of repositioning before rasterizing… hopefully the devs are going to fix those issues soon.

And it is not possible at all to rasterize objects placed on the canvas outside an artboard :-/


  • 2 weeks later...

Objects positioned partially outside the artboard are clipped to the artboard when rasterized, even the parts in the bleed area.

 

A current workaround for rasterisation of objects in the bleed area is to temporarily increase the document size by the bleed size (this can be done quite quickly with maths expressions when redefining the size without scaling, e.g. something like 210mm+2*3mm for a 3 mm bleed).

 

Another approach is to always work with a document size that includes the bleed, with guides showing the intended trim box, and then size the document back down to the trim box only when exporting.

 

Eagerly waiting for proper bleed handling on the canvas.

