
Live Equirectangular Projection in Develop Persona



I have started doing some 360° work, and my camera/software exports equirectangular 360° photos in raw DNG format.  The first step with any raw image is to grade and develop it in the Develop Persona, but unfortunately I have found no option to view the raw image with a live equirectangular projection whilst in that persona.

When developing, the ability to toggle between normal and live equirectangular projection is critical, especially when scrolling/zooming/examining close-up details and stitched areas seamlessly, without borders or distortion.

The Layer > Live Projection > Equirectangular Projection command works great in the Photo Persona.  There seems to be no reason that live equirectangular projection shouldn't also be supported in Develop, for the standard, Split and Mirror views (and presumably, activating this view in Develop would automatically apply live equirectangular projection to the base layer after developing).

Am I missing some simple view setting, or is this a current limitation of the Photo app?  If the latter, is it on the roadmap?


Checking for stitching errors and blending seams sounds like something you would want to examine in your stitching software prior to making your final stitch for output.  If your stitching software does not permit you to do this, then that is a deficiency in your stitching software.  The Develop persona in AP is for adjusting tone, color and detail/noise (general image attributes, typically for raw image files).

kirk


5 minutes ago, Lee D said:

The Develop persona doesn't have the Live Projection options available, and the developers currently have no plans to add it.

What is the process to request this capability and get it on the roadmap?  It would be immensely valuable to my workflow, and I’d think others’ as well.

Is there a compelling reason not to support this view?  (Seems like the capability already exists in the app and would just need to be turned on...)


3 minutes ago, kirkt said:

Checking for stitching errors and blending seams sounds like something you would want to examine in your stitching software prior to making your final stitch for output. […]

Hi Kirk,

Not checking for stitching errors and blending: as you say, that is already done in the stitching software before exporting the raw DNG.  I’m talking about viewing the pre-stitched DNG projected equirectangularly in Develop mode so that I can see undistorted detail seamlessly and make better color/curve/detail decisions, even along the stitch lines.

The view is supported in Photo persona—it would be extremely helpful to view it the same way while developing.


Hi @dkallan - I do not understand how you could view a "pre-stitched" DNG spherical panorama - it hasn't been stitched.  I apologize in advance, I must be missing something here.

It sounds like you mean that you want to be able to take the equirectangular DNG file that has already been stitched and rendered to DNG, open it in the Develop persona in AP (as you would a raw file) and be able to view it in the same Live Projection as you can in the Photo persona when making your raw conversion choices.  I would propose that a more effective way to accomplish your goal, and provide many other side benefits, would be for AP to implement something similar to Smart Objects - people have been asking for this for years now.

With a Smart Object, you could bring your equirectangular DNG into AP, open it in the Develop persona, make your initial Develop decisions, render the result to an RGB file, and inspect it in the Photo persona in the Live Projection mode.  If you need to change things, you would just go back to the Develop persona, change the RETAINED settings from the previous conversion(s) (yes, the entire conversion history could be saved in the smart object), and then rerender the RGB result from the original raw file (i.e., a Smart Object raw workflow).

As it stands now, when you bring a raw (DNG) file into AP and convert it to an RGB image in the Develop persona, none of the settings are retained and any work you did during conversion (global and local overlay edits) is gone.  Implementing a Live Projection view in the Develop persona might be helpful, but if you need to change the resulting initial RGB render,  you will have to start over from scratch in the Develop persona.  I think what you want to do would be better served by a Smart-Object-like workflow in AP.  That would also help A LOT of other users with the various benefits that SOs give image editors.

Kirk


Rather than get caught up in terminology, which I may get wrong, I’ll show you what I mean.

Recent Insta360 cameras, while not exactly the paragon of high quality like my DSLRs, do produce raw photos.  As with all cameras, sensor data passes through some sort of signal processing path before becoming a “raw” file.  In this weird new Insta360 world, raw file processing occurs in two stages: pre-stitched and stitched.  Maybe the Insta360 stitched DNG should be called a “pseudo-raw” format, but as far as image editing apps go, they see it as a raw image complete with lens info and metadata.  Here’s what happens:

When the camera takes a photo while the capture mode is set to “JPG+RAW”, it creates two files on the camera card: a DNG file containing the concatenated information from both sensors, and a proprietary “INSP” file containing the compressed JPEG+metadata with all sorts of automatic color, tone and detail processing performed in-camera and in the off-board camera app/studio.  (In JPEG-only mode, only the compressed INSP file gets created, and the raw data is discarded.)  The Insta360 smartphone app can easily extract and process the JPEG from the INSP file, but it does not deal with DNG files.  DNG files can only be handled by photo apps and the Insta360 Studio desktop app—and the desktop app is needed for stitching.

Taken immediately off the camera card, the DNG file is unstitched, flat in color, and looks something like this:

  [SEE PHOTO 1]

The DNG must be loaded into Insta360’s computer-based stitching app, where stitching is first attempted automatically and then subject to manual inspection and adjustment (stitching calibration, lens distance, horizontal correction).  From the studio, the user exports a stitched “raw” DNG in 360° equirectangular with no color correction—or call it a “pseudo-raw” DNG, as though coming from a virtual single 360° camera sensor that projects spherical data equirectangularly.

  [SEE PHOTO 2]

As far as any photo post-editing application is concerned, this pseudo-raw DNG is a raw file.  Affinity Photo and Photoshop will only open it in Develop mode.  In the Develop persona, the equirectangular projection is displayed flat, with the inherent distortion toward the poles and a hard split at the left/right edges - that is, the way the image is stored, not the way it is meant to be viewed (seamless and distortion-corrected in a live scrolling viewer window or VR headset).
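For anyone unfamiliar with the format: the storage mapping is simple - longitude maps linearly to x and latitude to y - which is exactly why pixels near the poles look stretched and the two halves of the scene at ±180° end up on opposite edges of the file.  A rough Python sketch of the standard equirectangular convention (not Affinity's internal code, just the textbook mapping):

```python
def equirect_pixel(lon_deg, lat_deg, width, height):
    """Map a spherical direction (longitude, latitude in degrees) to (x, y)
    pixel coordinates in a standard equirectangular image.
    Longitude -180..180 spans the full width; latitude 90..-90 the height."""
    x = (lon_deg + 180.0) / 360.0 * width
    y = (90.0 - lat_deg) / 180.0 * height
    return x, y

# A 10-degree step in longitude covers the same number of pixels near the
# poles as at the horizon, even though on the sphere it covers far less
# surface -- hence the stretching toward the top and bottom of the file.
x0, y0 = equirect_pixel(0, 0, 7680, 3840)    # scene center maps to image center
x1, y1 = equirect_pixel(180, 0, 7680, 3840)  # the seam: wraps back to x = 0
```

A subject standing at longitude ±180° straddles that seam, which is why it appears cut in half in the flat Develop view but is continuous in the live projection.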

In developing this photo, I have started playing around with color, tone and detail, but in doing so I really want to pay attention to the couple and the table they’re by, as well as the tree tops overhead.  However, note the red circles: the couple/table happens to be cut off by the edge of the flat image, and the tree tops show the effects of polar distortion.  I cannot see both sides of the table seamlessly, and I have to pan to either extreme edge of the image to view one part of it or the other.

  [SEE PHOTO 3]

After developing, however, I can view the photo with live projection.  The red line approximates where the sides of the file actually occur.  In live projection, the table is now seamless, the treetops fairly undistorted.  Granted, this lame photo isn’t a perfect example, but hopefully it hints at why it is so valuable to preview an image with live projection seamlessly during Development.

  [SEE PHOTOS 4, 5, 6]

@kirkt, all of what you propose with Smart Objects sounds cool and quite transformative.  But for the sake of my simple workflow, all I’m looking for is really basic: just the ability to preview and scroll an equirectangular image seamlessly and without distortion in Develop, the way I can with Live Equirectangular Projection in the Photo persona.  The live projection ability already exists in AP; it just isn’t enabled as a preview in the Develop persona.

1 - On-Camera DNG File.jpg

2 - Stitched DNG File.jpg

3 - AP Opens DNG in Develop.jpg

4 - Seamless Live View Only in Photo Persona.jpg

5 - Seamless Live View Only in Photo Persona.jpg

6 - Undistorted Live View Only in Photo Persona.jpg


@dkallan - Thank you for the explanation.  I understand what you are saying.  In terms of editing/previewing the non-Live-Projected image, there are a few things you can do and definitely should not do when opening and editing the DNG in AP.

1) In your stitching software, if there is a function to offset the edge/split of the equirectangular image, you should use that to shift the image content so that the left and right edges do not split important areas of the scene, like the people at the table in your example.  If you just bring that image into AP, do a basic conversion and render it to an RGB image, you can use the Affine transform (called Offset in Photoshop) to wrap the image horizontally so that less important content falls at the left and right edges.
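On a rendered RGB image, that offset/Affine wrap amounts to a horizontal roll with wraparound, which is lossless for an equirectangular image because the left and right columns are adjacent on the sphere - it only relocates where the split falls.  A minimal NumPy illustration of the idea (array-level, not the AP UI):

```python
import numpy as np

def wrap_equirect(img, shift_px):
    """Shift an equirectangular image horizontally with wraparound.
    img: H x W x C array; positive shift_px moves content rightward.
    Because column 0 and column W-1 are adjacent on the sphere, the roll
    introduces no seam -- it only changes where the split lands."""
    return np.roll(img, shift_px, axis=1)

# Example: move the split a quarter-turn so it no longer cuts the subject.
img = np.arange(2 * 8 * 3).reshape(2, 8, 3)
shifted = wrap_equirect(img, 8 // 4)
```

Any whole-pixel shift works; fractional shifts would need resampling and are better avoided before raw development.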

I know this does not help you in your quest to have a Live Projection in the Develop module, but it will give you a better view of important image content in the flattened view that AP will currently show you in the Develop module.  I was going to suggest that you simply open one of the raw/DNG files that went into making the panoramic composite, determine the best raw conversion settings in AP with that file, then apply the same settings to the composite DNG - but the Insta360 camera takes such distorted fisheye images that that strategy may not help much if the important parts of the scene are significantly distorted by the optics of the camera.

2) Regardless of how you bring your DNG or rendered RGB image into AP, you should not apply any local contrast or similar local adjustments to the image until it is in a Live Projection - presumably Live Projection mode automagically mirrors the edges of the image so that the local enhancements know about what is on the other side of the image.  Does this make sense?  For example, if you add a local contrast enhancement (like HiRaLoAm sharpening) or anything that needs to know about local pixel values at the left and right edges, the filter will not know about the image data on the other side of the seam, which is actually continuous in the scene.  If you process the image with a local enhancement anyway, there is a chance that when you view it in a Live Projection you will see a distinct discontinuity where the left and right edges should join seamlessly - caused by the local enhancement gone wrong.

If AP does not handle this automagically in Live Projection mode, you can add some of the left edge to the right edge of the image and vice versa (the canvas/document will need to be made bigger to do this); once the local enhancement is performed, crop the image back to its original extent and you are good to go (you will probably also need to mirror the top and bottom pixels a little, too).  You would not be able to do this in the Develop persona unless there were a check box labeled something like "360 Panorama Image" - some software has this option to make such operations aware of the continuity of pixels in the actual scene (I have seen it in a couple of HDR applications).
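That pad-filter-crop workaround can be sketched in a few lines: wrap-pad the columns (because the left and right edges are continuous in the scene), reflect-pad the rows as a crude stand-in for proper pole handling, run the local filter, then crop back.  Just an illustration of the idea, with a simple 3x3 box blur standing in for whatever local enhancement you actually apply:

```python
import numpy as np

def mean3x3(img):
    """3x3 box blur via shifted sums.  Its edge behavior does not matter
    here because we pad before calling it and crop the border afterwards."""
    acc = sum(np.roll(np.roll(img, dy, axis=0), dx, axis=1)
              for dy in (-1, 0, 1) for dx in (-1, 0, 1))
    return acc / 9.0

def filter_seamlessly(img, pad=1):
    """Run a local filter on an equirectangular image (H x W, grayscale for
    brevity) without breaking the left/right seam: wrap-pad horizontally,
    reflect-pad vertically, filter, then crop back to the original extent."""
    p = np.pad(img, ((0, 0), (pad, pad)), mode="wrap")    # seam is continuous
    p = np.pad(p, ((pad, pad), (0, 0)), mode="reflect")   # crude pole handling
    out = mean3x3(p)
    return out[pad:-pad, pad:-pad]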

Anyway, I see what you are getting at - you can always post a request, with a link to this thread, in the Feature Requests section of the forum and see what feedback you get. I have no idea how much muscle would be required to render a preview of a Live Projection from a raw file, or whether that is even realistic to achieve, especially because panoramic images can become extremely large and require a lot of resources.

I hope it works out!

Kirk


Thanks, @kirkt.  I will use the Feature Requests forum.

Appreciate your advice on caution with adjustments.  The raw 360° photos I have developed in Affinity Photo look surprisingly good for what seems like a toy camera (no issues with mismatches along the "seamless" edges).  But that 360° preview capability in Develop would make life easier from a decision-making perspective.  Because the live projection is layer-based, there are some special considerations with adjustment layers, but that's later in the workflow anyway.

Yes, the stitching software is very basic for raw photos: take in an unstitched DNG, check the stitching and horizon, spit out a stitched DNG or stitched DNG+JPEG (no color adjustment, no offset/shifting/recentering of image content).  After it spits out the stitched DNG, that software has no more role in the workflow—it's all in the hands of other photo software to grade/develop the DNG and do other adjustments.  Although there's no "feedback loop" for this particular stitching software, your Smart Objects idea would be useful in so many other applications.  Keep up the crusade!

