
  • 4 weeks later...

Great Video, Jochen.

I liked what you said about small steps, and that you always tend to overdo something, so mixing back with the original or the previous step is a good way to preserve detail that would otherwise be lost in the process.

I started out very similarly to you: DSLR, small refractor and Deep Sky Stacker. I soon replaced Deep Sky Stacker with Maxim and PixInsight (see workflow below); to be honest, I don't use much of this software universe anymore, just Maxim, PixInsight, Affinity Photo and Fotos (the one that comes with OS X). In time the scopes became larger and the DSLR was replaced by a cooled CCD. One-shot color became filtered monochrome images, recombined later, but all that doesn't matter much. Honestly, you get some of the best-looking images with small equipment like that. More equipment and more technology just means more complications. With a decent refractor you don't have to struggle with seeing limitations. Ok, wind and stray light may be a problem, as you said. And if you are travelling (very seldom these days) or have the fortune of good skies at home, then small equipment is just fine because you will use it more often. You think twice about setting up if that takes an hour.

On the processing topic: I'd go into Affinity only after stretching in Maxim or, preferably, in PixInsight, e.g. with masked stretch or histogram transformation. Don't throw away the 32-bit data until you are done with stretching! I know there isn't much you can do about that in Deep Sky Stacker. Let's say it is cheap and fast, but it doesn't register properly, especially at short focal lengths (below 1000 mm), because it has no field-distortion correction in registration, and that is bad for rejection during integration. It costs you tiny stars, detail and contrast. However, DSS is free, I know ... doesn't matter for the moment; I used DSS for the better part of a year myself.
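For illustration only, here is a minimal numpy sketch (my own, not PixInsight code) of the midtones transfer function that histogram-style stretches are built on; the point is that it runs on the 32-bit float data before anything is quantized down to 16 bit:

    import numpy as np

    def midtones_transfer(x, m=0.25):
        # Histogram-transformation-style stretch: maps 0 -> 0, 1 -> 1
        # and the midtones balance m -> 0.5. Apply while the data is
        # still 32-bit float in [0, 1], then export for Affinity.
        x = np.clip(x.astype(np.float32), 0.0, 1.0)
        return ((m - 1.0) * x) / ((2.0 * m - 1.0) * x - m)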

You mentioned, Jochen, that you removed stars manually because they kept you from stretching the Orion Nebula properly. I'm a little reluctant to delete stars manually, at least as long as I don't add them back later. I use the Dust and Scratches filter for that. It's Jukka's method; his page is https://astroanarchy.blogspot.com (I know him from CEDIC 2013 in Linz). However, I don't push the tone mapping to such extremes; I keep it in a more natural range.

I prepared a tutorial that might be seen as an add-on to your video. It covers tone mapping, Jukka's starless processing with Affinity Photo and the use of selective color. I appreciate Affinity Photo for its layer processing, good algorithms and user interface at a decent price. Affinity Photo is something you want for the last steps of astro processing, even if you prefer doing most of the preprocessing in PixInsight nowadays, especially when it comes to removing gradients with Dynamic Background Extraction or automatic, database-driven image solving and annotation, functions you will probably never see in Affinity Photo.

The Affinity Photo tutorial is in German, but the images will show you what I am doing. And there is always Google Translate. I believe you won't have difficulty reading German anyway, Jochen ;_)

Best Sighard

P.S.: my homepage had to move, astro.square7.ch is no more, I relocated to Sternwarte Hofheim

The tutorial:
https://www.sternwarte-hofheim.de/galerie/schraebler/Pixinsight/!!HDR Tone Mapping und starless Bearbeitung in Affinity Photo 1.8 V01 S. Schraebler.pdf
 

Folie29.jpeg


The Cocoon Nebula IC 5146, photographed with a 12-inch Newtonian telescope over two nights of collecting photons. The right side is what I get from preprocessing, i.e. calibration, registration, integration, LRGB combination and nonlinear stretching; the left side is what I made of it in Affinity Photo with tone mapping, starless processing, selective colors and adding the stars back in. A detailed description can be found in the tutorial above.

Best Sighard

IC 5146, also known as the Cocoon Nebula, is a nebula with an embedded open cluster in the constellation Cygnus; the star cluster is called Collinder 470. The nebula has an apparent extent of around 10 arcminutes and is approximately 3000 light years from Earth; its physical extent is about 10 light years. The nebula is a star-forming region with ionized atomic hydrogen and has emitting, reflecting and absorbing components. From Earth, IC 5146 is seen at the eastern end of the elongated dark cloud Barnard 168, and together with it is part of an extended molecular cloud. A little to the west lies the reflection nebula vdB 147, which probably also belongs to the complex.
The active star formation inside the cloud created the young open cluster Collinder 470, with an apparent brightness of 7.2 mag and an estimated age of a few hundred thousand years. This cluster, and in particular the brightest star at its center (type B0 V, 10 mag), is responsible for the ionization of the nebula and thus for its glow. It has carved a "cave" into the dust and gas of the molecular cloud that opens up a view of its interior.
Text: Wikipedia

IC 5146 side-by-side comparison


Hey @S. Schraebler and @Amaroun,

 

thanks for the comments! Appreciated!

 

Sighard, thanks for that extensive insight! I do a lot of things in PixInsight as well, but I just love going back and forth with Affinity, because it's just much more hands-on and visual compared to PI. And for the tutorial, I wanted to stay 100% in Affinity, just to give that overview of what's possible with basic tools.

Working on NGC 7822 right now, doing some processing on the first 8 hours of H-Alpha data I collected. Will redo lots of it due to mediocre imaging conditions though.
The processing was done in PI (deconvolution, gradient removal, star attenuation, Dynamic PSF), with some final tweaks in Affinity (minor curve stretches, blending).

As soon as I get clear skies again I'll collect some more and then go for OIII and RGB data. :)

Might do a video on that as well.
 

200412-HA-montage-16bit.jpg


Dear Jochen, 

Oh yes, NGC 7822 (the cluster) with its associated nebulosity (Cederblad 214, or Ced 214), that's a good one! Some see Medusa's face in it. It contains cometary globules, well known from Hubble's Pillars of Creation in M16, but Ced 214 is much closer, so you do not need a large telescope to photograph them. And I like the object's high declination, which lets you observe it all night from 50° northern latitude. I have to say I "found" this object, because it wasn't on my beginner's star map in late 1991. Back then I used a 5.5-inch Comet Catcher on a GPDX mount and took three-minute exposures on gas-hypered film. On two of those images I saw a red blob; there was something! I went to a library (no internet back then), searched better maps and learned about the Cederblad catalogue, which I had never heard of before. Today I am happy we didn't blast the house away experimenting with hydrogen/nitrogen mixtures under heat and pressure.

Nowadays, astrophotography is so much easier. The quantum efficiency of detectors is close to the limit; you simply cannot work with half or quarter photons, and never will. Dark current is vastly diminished, readout noise is down to a few electrons, chips are becoming larger and cheaper, fields are more uniform, and cooling is achieved with stacked Peltier elements that just draw some current: no liquid cooling anymore, no ice cubes or liquid nitrogen needed. And there are correctors for the optics, so you can have "fast" optics with wide usable image circles, f/2 or f/3.8 instead of f/8 or f/15.

And the digital darkroom is fascinating too. In the beginning I had my struggles with PixInsight as well. I believe it is especially difficult when you come from Photoshop: the workflow is quite different, with triangles and icons you have to drag and drop onto images or other icons. You need to rename images to simple letters to reference them easily in formulas. Everything is a process or a function, and there are no layers! Almost all layer computations have to be written as a formula in PixelMath. You do it because the algorithms and the computational quality are superior. You get acquainted with the workflow, but it slows down trial and error (see link). There are previews for some functions, and a number of functions are outstanding: Batch Preprocessing, Dynamic Background Extraction, Richardson-Lucy Deconvolution, Drizzle Integration, Image Solving and Annotation, to name a few. But I understand that for the fine tuning you want layers. I just found the macro functions in Affinity Photo yesterday, and I am very happy to have them back (see video). I liked them very much in my Photoshop days, but I have been PS-free for some years now; I didn't like the "upload everything to the cloud and rent the software" trap. Serif is wise to act differently.
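To give a flavour of what "a formula instead of a layer" means: a 20% blend-back of the original, which you would do with an opacity slider in Affinity, becomes a one-line expression. Sketched here in equivalent numpy terms, not actual PixelMath syntax:

    import numpy as np

    def mix_back(processed, original, amount=0.2):
        # The layer trick "blend 20% of the original back in" becomes
        # a single expression: 0.8*processed + 0.2*original
        processed = np.asarray(processed, dtype=np.float32)
        original = np.asarray(original, dtype=np.float32)
        return (1.0 - amount) * processed + amount * original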

However, here is Medusa's face in NGC 7822; I can already see it in your image. Of course it's no face, just our imagination and some thin gas & dust illuminated by stars that emerged from denser regions of the same gas & dust. Since then it has been stars against mountains of gas & dust, and the stars are winning: by sheer radiation they blast holes into the molecular cloud, it puffs up like a blister and becomes visible to us. If you look at astro images long enough, they become a Rorschach test for everyone, because we are human; our brains want to see something familiar. So I can very well forgive the elders who saw figures in the constellations, same situation.

Two nights of collecting photons for NGC 7822/Ced 214. ONTC 12-inch photo Newtonian, effective focal length 1163 mm, QSI 583wsg with KAF-8300 chip, (14+16+12)x300s SHO, bin212. QSI now belongs to Atik, and the realm of CCD chips will shrink; they are no longer being produced except for professional astronomers. There is a crossover point where CCD beats CMOS, somewhere between 120 s and 300 s of exposure time. Let's say: when a single exposure runs beyond 300 s, the CCD is better; otherwise modern CMOS wins.

The secret of a good signal-to-noise ratio (SNR) is 3 nm narrowband filters. Those are easily more expensive than the camera itself, so it pays to have a camera with an integrated filter wheel: if the filters sit closer to the sensor, they can be smaller and thus much less expensive. That was the fundamental thought when I decided on the Quantum Scientific Imaging QSI 583 cooled CCD camera: well-designed electronics, a compact body, 1.25-inch filters by Astrodon (imho the best supplier of narrowband filters), and Maxim & PixInsight for capture and preprocessing. The workflow grows, meanders and changes over the years. Now it has brought me to Affinity Photo; since V1.8 I really like it, and I like that you can buy the product for a fair price. The same is true for PixInsight.

The second image is van den Bergh 141, or vdB 141, in Cepheus, also two nights of observation, 3.6 h of usable exposure in total, but in true color. True color is much more difficult, since it requires a dark sky. If the sky isn't dark you can still do something with narrowband imaging (SII, Ha, OIII), but that won't help for object classes like galaxies or the Integrated Flux Nebulae (IFN); they require broadband imaging (LRGB). The IFN are molecular clouds illuminated only by the Milky Way's stars; they are not excited into emission because there is no intense UV radiation in sufficient abundance. Imagination strikes again: it looks like a ghost chasing a fleeing moon, which is why it is called the Ghost Nebula, but again it's just thin gas & dust. The instrument was a 12-inch Newtonian, custom made by TS, on a Hungarian G53F mount. I use an off-axis guider, i.e. a second camera looking through the main instrument and taking short one-second exposures for guiding. And the supreme discipline is galaxies, star streams and Einstein rings: really faint and remote stuff. You need dark skies, larger instruments, good cameras and skills, all at once: collimation, guiding, calibration, processing.

My most beloved list is the AINTNO 100 by Barbara Wilson & Larry Mitchell, a list of real objects you will never observe, at least not visually. I try to capture some of them with the camera, though you won't qualify for a certificate that way. Is it useful? My experience is that you cannot avoid discovering something if you observe long enough and thoroughly compare your images with older photographs. You will do this anyway, to understand what you are seeing, or to check whether the new equipment is an improvement over the old when you revisit well-known objects. What you find depends on the timescale of the comparison: compare over years and you might detect something like a variable nebula; compare over seconds and you might catch an impact on the Moon.

Here is a more elaborate description of how to take and process astrophotos: instrument choices (imagers, optics, mechanics), setup and maintenance, sky quality vs. transportability considerations, cameras and filters, preprocessing and post-processing:

https://www.sternwarte-hofheim.de/galerie/schraebler/2015Sternwarten/201410_Deep_Sky_Astrofotografie.html

And here is a timelapse video with some results. I am happy that macros are available in Affinity Photo; I was missing them. Most of the effects here were done with macros applied to batches.

Best Sighard

P.S.: About large and expensive instruments: give it a thought, you don't need to own what you want to use. Join an astronomical association, be friendly, be helpful, get acquainted with the capabilities, and use their instruments before you spend insane amounts of money. Together you can use telescopes you could never afford on your own. And you don't need to buy what you can rent. Perhaps you can observe remotely, but the feeling won't be the same: as an amateur you want a piece of the action; as a professional you want your name on the discovery paper. And if you are lucky, you can have both.

vdB141.jpg

NGC7822 aka Cederblad 214 Schraebler.jpg


 

 

Hi Giles,

I did some stacking as described back in 2011 in Photoshop, Affinity Photo's counterpart; you can see that in the video above. Both programs can apply macros to batches, I suppose. I used Registar to generate a lunar mosaic out of the images themselves and, in a second run, registered every single piece of the lunar puzzle to that mosaic. Then I recorded two macros and applied them to the sequence. The first copies a loaded image onto a second image of the same size that starts out black, combines them like a digital filter (80% new frame, 20% background), copies the result back to replace the original 100%, saves it, leaves the second image modified and stops. Sounds more complicated than it is. This is what creates the sweep over the Moon.

In the second macro I did nearly the same, but brightened into the second image instead of averaging; that is the part where it looks like the frames are being "integrated". It does a poor integration job, no averaging, no rejection, but somehow it looks right. Then I appended the properly integrated mosaic from Registar and finally a processed version with intense colors, revealing the metal content of the regolith, the stone and dust on the Moon's surface.
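In rough terms, the two macros boil down to a running blend and a running lighten. A small numpy sketch of the idea (my own reconstruction with a hypothetical list of frames, not the actual Affinity macros):

    import numpy as np

    def sweep(frames, new_weight=0.8):
        # First macro: running 80/20 blend against an initially black
        # background; each saved frame keeps a fading trail of the
        # previous pieces, which creates the "sweep" over the Moon.
        background = np.zeros_like(frames[0], dtype=np.float32)
        out = []
        for f in frames:
            f = np.asarray(f, dtype=np.float32)
            background = new_weight * f + (1.0 - new_weight) * background
            out.append(background.copy())   # replaces the original frame
        return out

    def build_up(frames):
        # Second macro: brighten (lighten) instead of averaging, so the
        # mosaic appears to be "integrated" piece by piece.
        acc = np.zeros_like(frames[0], dtype=np.float32)
        out = []
        for f in frames:
            acc = np.maximum(acc, np.asarray(f, dtype=np.float32))  # keep the brighter pixel
            out.append(acc.copy())
        return out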

A similar macro was applied to the star sequences from Namibia; there I brightened into the second image to keep the stars where they already were. Babak Tafreshi told me, about that camel thorn tree sequence, that there is a moment where the tree apparently touches the Magellanic Cloud and a star looks like an eye. So I gave him the frame for his TWAN site and we called it "Touch the Sky" ;_) Those are not stacked images, though, just single frames.

There is no single program that does it all. Normally I would advise having at least a second program for the SIFT/RANSAC and sigma-reject integration work. SIFT/RANSAC means you compute scale-invariant features in two images and then look for a RANdom SAmple Consensus between them (see below). In our case the features are craters on the Moon or simply the stars. Each feature is "oriented" and described by its surroundings and intensity, a set of extra numbers that lets it be recognized in the other image. In rare cases you need a different registration method, correlation for example: you use that when you want to register a lunar eclipse, and also for lucky imaging of planetary discs.
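A sketch of the same idea with OpenCV (assuming opencv-python 4.4 or newer, where SIFT is included, and 8-bit grayscale frames):

    import cv2
    import numpy as np

    def register_sift_ransac(moving, reference):
        # Scale-invariant features: craters or stars, each with a
        # descriptor of its surroundings so it can be matched.
        sift = cv2.SIFT_create()
        k1, d1 = sift.detectAndCompute(moving, None)
        k2, d2 = sift.detectAndCompute(reference, None)
        matches = cv2.BFMatcher(cv2.NORM_L2, crossCheck=True).match(d1, d2)
        matches = sorted(matches, key=lambda m: m.distance)[:200]
        src = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
        dst = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
        # RANSAC keeps only the consistent matches and rejects the rest
        H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
        h, w = reference.shape[:2]
        return cv2.warpPerspective(moving, H, (w, h))

    # For a lunar eclipse or planetary lucky imaging, feature matching
    # fails; a correlation method such as cv2.phaseCorrelate on float32
    # single-channel images is the better choice.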

If you scroll up to the earlier post, you will find the software ecosystem I have encountered so far. You do not need much of it, just a workflow through it, a path from the left side to the right. The arrows indicate mine, but many others are possible. It mostly depends on your operating system of choice and on whether your equipment "speaks" ASCOM, INDI, ST4 or something proprietary.

There are four categories where you want solutions: planning; controlling the instrumentation and observatory; calibration-registration-integration; and post-processing & archiving. Maybe you have neither a mount nor an observatory and need no planning; then your workflow is even shorter. For the calibration-registration-integration part in particular, a lot of programs are available.

Having tested some of them, I would advise against Deep Sky Stacker (fast, free, but lousy registration) and for PixInsight. CCDStack, Registar and Maxim are great too, but not as cheap. If you are looking for a free tool, try Regim or Siril. And for TWAN-style images, those with some Earth and sky in them, Babak's discipline, there is Sequator: it handles the two halves of the frame separately, registers the stars while keeping the Earth in place, and lets you set the horizon line.

Best Sighard

SIFT/RANSAC feature matching (using vlfeat)

Lunar eclipse: SIFT/RANSAC will fail

Lunar eclipse: phase correlation works well

Touch the Sky


Center Region AF-D Nikkor 85mm f/1.8 @ f/2 adapted to EOS20Da, 22x120s ISO800 (registered with SIFT/RANSAC Star Alignment algorithm in PixInsight) 

Maybe a little overprocessed; I might do it differently today ...

https://www.sternwarte-hofheim.de/galerie/schraebler/2013deepsky/


@Gilescooperuk @outgettingsubs

@JohnB62 @Dan C @John Rostron

Hello everyone,

My 85mm AF-D Nikkor has an exceptionally flat field, so I gave it a try: the complete processing done in Affinity Photo (laugh if you like, that's exactly what I suggested not to do). In this example everything worked out neatly. Registration went well, there were no satellites or airplanes to remove, and the gradients were on holiday. I copied the stacking result above the stack and combined all 22 frames with average.
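For reference, the "combine with average" step is just a per-pixel mean, and a sigma-clipped version (the rejection-style integration mentioned earlier) is only a few lines more. A numpy sketch, assuming the 22 registered frames are float arrays of the same shape:

    import numpy as np

    def average_stack(frames):
        return np.mean(np.stack(frames).astype(np.float32), axis=0)

    def sigma_clipped_stack(frames, kappa=3.0):
        # Reject outliers (satellites, planes, hot pixels) beyond
        # kappa * sigma before averaging.
        stack = np.stack(frames).astype(np.float32)
        mean, std = stack.mean(axis=0), stack.std(axis=0)
        keep = np.abs(stack - mean) <= kappa * std
        return np.where(keep, stack, 0.0).sum(axis=0) / np.maximum(keep.sum(axis=0), 1)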


A Stack averaged. Nope, dragging the histogram and saturation sliders is not the first step; don't, there are better ways to get what you want. In the image there is a foreground of bright stars and clusters and a background of distant star clouds, emission nebulae and dark molecular clouds. I plan to process them separately so I can drag the sliders even further, but I want to keep the star colors, bring out the structure in the background, and not increase the noise along the way. Impossible? Not quite.


B HDR Tone Mapping


C 5x Dust and Scratches. It's Jukka's starless method, with radii of 30 px, 15 px, 9 px, 5 px and 2.3 px. Try to preserve bright features in the background.

(No fear, we don't lose anything; we get the stars back in later!)
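For those curious what the filter is doing: Dust and Scratches behaves roughly like a median filter that only replaces pixels deviating from the local median by more than a threshold. A per-channel numpy/scipy sketch of the idea (my approximation, not Jukka's exact recipe):

    import numpy as np
    from scipy.ndimage import median_filter

    def remove_stars(channel, radii=(30, 15, 9, 5, 2), threshold=0.02):
        # channel: one color channel of the stretched image, float32 in [0, 1].
        # Decreasing radii eat stars of decreasing size, while larger
        # nebular structures mostly survive each pass.
        starless = channel.astype(np.float32).copy()
        for r in radii:
            med = median_filter(starless, size=2 * r + 1)
            stars = (starless - med) > threshold   # stars poke up above the local median
            starless[stars] = med[stars]
        return starless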


D A neat trick to preserve dark details: darken with the previous layer!


E The difference to the HDR tone mapping generates the so-called residuals; we will add them back later. (A small disappointment with the Macro tool: apparently it is not possible to record the movement of layers. You have to stop recording to be able to drag layers! I didn't expect that; Photoshop had no trouble with it. That's bad, because all the funny stuff I used in the video a few posts back relies on it, and Jukka's starless method cannot be recorded in full either. Or at least I don't see how to include manipulations of the layer stack, and there is no layer-shifting item in the menu either. Perhaps something Forth-style with dup, rot, swap? If somebody out there can explain, it would be highly appreciated.) Back to the starless version: put it on top of the layer stack.
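Steps D, E and I in one small sketch: the darken trick is a per-pixel minimum, the residuals are whatever the starless pass removed (mostly the stars), and adding them back at the end restores the stars on top of the processed background. In numpy terms, my own notation rather than Affinity layers:

    import numpy as np

    def split_and_recombine(tonemapped, starless, process):
        # D: "darken with previous layer" = per-pixel minimum, so dark
        #    details eaten by Dust and Scratches come back.
        starless = np.minimum(starless, tonemapped)
        # E: residuals = everything the starless pass took away (the stars).
        residuals = np.clip(tonemapped - starless, 0.0, 1.0)
        # F-H: denoise / selective colour only the starless layer.
        processed = process(starless)
        # I: add the residuals back and the stars reappear.
        return np.clip(processed + residuals, 0.0, 1.0)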


F We can apply selective colors, make the violet more light blue (that's the very old trick from the magazines, you have seen it a thousand times and wondered why you don't record more blue ...)


G Still selective colors: intensify the yellow old stars in the Milky Way's center (that's the very old trick ..., you have seen it a thousand times ... and wondered what's wrong with your camera :_)


H Denoise the starless version; you cannot ruin the stars, they are not in the image yet.


I Just add the residuals: the stars are back again, but what a background!


J Recombine with earlier versions to make it more natural. Very often you overdo the processing, and recombination heals that. I use negation or screen ("negative multiplication") and soft light, sometimes iteratively in many steps and tiny doses: brighter, darker, brighter, darker ...
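The blend modes used here have simple per-pixel formulas; a numpy sketch for reference (the soft-light variant shown is one common formula, Affinity's own may differ in detail):

    import numpy as np

    def screen(a, b):
        # "negative multiplication": brightens, never darkens either input
        return 1.0 - (1.0 - a) * (1.0 - b)

    def soft_light(a, b):
        # one common (Pegtop) soft-light formula, a gentler contrast push
        return (1.0 - 2.0 * b) * a * a + 2.0 * b * a

    def recombine(current, earlier, mode, dose=0.2):
        # tiny doses: mix only a fraction of the blended result back in
        return (1.0 - dose) * current + dose * mode(current, earlier)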


K Almost done! Center, 85mm f/2, 22x120s, ISO 800. Every step in Affinity Photo so far; if the registration goes well, the rest is no problem. Export as JPG at 85% quality.

L Intensified result (for the almost color-blind): Center, 85mm f/2, 22x120s, ISO 800


M Yes - shame on me, I did it in Fotos (OSX)

Best Sighard


12 hours ago, S. Schraebler said:

apparently it is not possible to record the movement of layers

No, a macro will not record the movement of layers done by dragging with a mouse. However, you should be able to move layers up and down via the Arrange menu (Arrange > Move Back One or Arrange > Move Forward One), and this is recordable in a macro.

John

Windows 10, Affinity Photo 1.10.5 Designer 1.10.5 and Publisher 1.10.5 (mainly Photo), now ex-Adobe CC

CPU: AMD A6-3670. RAM: 16 GB DDR3 @ 666MHz, Graphics: 2047MB NVIDIA GeForce GT 630


On 4/19/2020 at 9:24 AM, John Rostron said:


Thank you, John, I'll have a look.

Meanwhile, I annotated the image; as you might imagine, I didn't do it by hand. Instead I used PixInsight's image solving and annotation capabilities ...

Best Sighard


Image Solver


Image Annotation


Result: Center annotated, Nikkor 85mm f/1.8 @ f/2, EOS20Da 22x120s ISO800, portable AstroTrac TT320 mount. S.Schraebler, Hakos, Namibia

"Confusion of sources" could be the image's title. It shows the center of the Milky Way galaxy: there are billions of stars in this region, but the image has only 8.3 million pixels. From Earth it is about 30,000 light years to the center; the star clouds are just in front of it. A dark band of molecular clouds blocks direct view in visible light. Here and there, young stars excite the clouds with their UV radiation into red Ha and blue OIII emission. On the right side you find M8, the Lagoon Nebula, and M20, the Trifid Nebula.

The Galactic Center, Sgr A*, is at RA 17h 46m and Dec −29° 00′, almost exactly where the ° of "-30°" sits in the annotated image.

Image Plate Solver script version 4.2.2
===============================================================================
Referentiation Matrix (Gnomonic projection = Matrix * Coords[x,y]):
            +0.00158606         -0.00408442            +2.00261
             +0.0040796          +0.0015866            -9.04557
                     +0                  +0                  +1
Projection origin.. [1760.769373 1174.146611]pix -> [RA:+17 59 39.73 Dec:-29 04 04.92]
Spline order ...... 2
Num. ControlPoints. 88
Resolution ........ 15.766 arcsec/pix
Rotation .......... 111.206 deg
Focal ............. 83.73 mm
Pixel size ........ 6.40 um
Field of view ..... 15d 25' 27.4" x 10d 16' 58.3"
Image center ...... RA: 17 59 39.999  Dec: -29 04 02.36
Image bounds:
   top-left ....... RA: 18 09 43.447  Dec: -38 02 37.11
   top-right ...... RA: 18 32 33.410  Dec: -23 31 23.58
   bottom-left .... RA: 17 23 13.373  Dec: -34 03 17.45
   bottom-right ... RA: 17 51 12.856  Dec: -20 02 56.13
===============================================================================
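As a quick sanity check on the solver output, the plate scale follows directly from the pixel size and focal length reported above:

    scale = 206.265 * pixel size [um] / focal length [mm]
          = 206.265 * 6.40 / 83.73
          = 15.77 arcsec/pixel (approximately)

which matches the solver's 15.766 arcsec/pix.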

 

Just to go back over the first steps in Affinity Photo, which I didn't mention before:


This is how you start: a new stack ...


... add some images


... choose how to combine them

I just copied the images above the stack and used a lighten (brighten) blend to see whether the registration worked. You can do the same thing with maximum.

Registration won't work with fisheye lenses or very dark images; here I was lucky. There are specialized programs that do it better, but it's a starting point.

The good thing is that Affinity Photo imports modern DSLR raw images well. However, if you have FITS images, you need another converter.
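If anyone needs that converter step, here is a small Python sketch using astropy and tifffile (both are assumptions on my part, as are the file names), writing a 32-bit float TIFF that Affinity Photo can then open:

    import numpy as np
    from astropy.io import fits   # assumption: astropy is installed
    import tifffile               # assumption: tifffile is installed

    data = fits.getdata("stack.fits").astype(np.float32)  # hypothetical input file
    tifffile.imwrite("stack_32bit.tif", data)             # 32-bit float TIFF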


On 3/17/2020 at 9:25 AM, outgettingsubs said:

Hey guys,

as a first video on my new channel, I created a tutorial for a basic astrophotography editing workflow in Photo. Hope some of you will find this helpful - any feedback appreciated! :)

 

Thank you for posting that video. In case it is of interest: in the January edition of Astronomy Now magazine, the experienced astrophotographer Nik Szymanek described Affinity Photo as an excellent alternative to Adobe Photoshop (that was part 1 of a series on the use of Affinity Photo; the latest is part 3 in the April issue).


  • 1 year later...
On 4/11/2020 at 2:10 PM, S. Schraebler said:


Oh man, it would have been amazing to have this full tutorial in English as well!!

