dfcollin

DxO Has Acquired The NIK Collection From Google


Nice to see they're keeping the current version free.


Dave Straker

Cameras: Sony A7R2, RX100V

Computers: Win10: Chillblast Photo with i7-3770 + 16Gb RAM + Philips 40in 4K; Surface Pro 4 i5

Favourite word: Aha. For me and for others.

I see that DxO is planning to incorporate U-Point technology into its mainline products. Looks like Affinity missed an opportunity there.

 


Windows 10, Affinity Photo 1.7 and Designer 1.7, (mainly Photo), now ex-Adobe CC

CPU: AMD A6-3670. RAM: 16 GB DDR3 @ 666MHz, Graphics: 2047MB NVIDIA GeForce GT 630


They have already placed Nik's U-Point technology (probably via Viveza) into their DxO PhotoLab tool, so it now offers a workflow similar to what Nikon Capture NX2 had in the past!


☛ Affinity Designer 1.7.1 ◆ Affinity Photo 1.7.1 ◆ OSX El Capitan


A challenge for Affinity is to match PS in plugin integration. The original Nik connection is good. LR is more of a TIFF bounce. DxO lets you specify 8 or 16 bit, but still via TIFF. Tighter integration would be good, but it will perhaps need friendly conversations with the other software suppliers.




@dmstraker All good points. I think if Affinity Photo were a bit faster and had, as you say, better plugin support (I really need it to work with DFX), then it would be better than PS for sure. Regarding RAW editing, the new DxO PhotoLab is the best IMO. They have the best noise reduction technology, period. And they have the best lens/camera profiles. Then for DAMs... we're damned. LR is OK but too slow. If Affinity developed one with the same sexy interface and complete metadata support, that would be amazing.

2 hours ago, rainwilds said:

Regarding RAW editing, the new DxO PhotoLab is the best IMO. […]

$100 question: I've got the free version of DxO 11 (not Elite). I'm using it (and sometimes LR) as a front end before AP.

Is it worth upgrading from 11 to PhotoLab? Yes, I know. It probably depends. If so, what are the 'depends'?

 



6 minutes ago, dmstraker said:

$100 question

 

It's usually a $64,000 question or a $64 question. Are you working in octal, Dave? :P

 


Alfred
Affinity Designer/Photo/Publisher 1.7.1.404 • Windows 10 Home (4th gen Core i3 CPU)
Affinity Photo for iPad 1.7.1.143 • Designer for iPad 1.7.1.1 • iOS 12.4 (iPad Air 2)


@dmstraker

How is anybody supposed to answer this question for you without knowing your personal needs, workflows and all the rest? I think the best course is to give PhotoLab a deeper tryout yourself first, and then make your own decision about whether the upgrade is worth it at all!



29 minutes ago, dmstraker said:

Is it worth my upgrading to PhotoLab from 11? […]


I think it's worth upgrading. For one, you get PRIME noise reduction. Two, though not a complete DAM, it now has star ratings and flagging for basic culling. And three, they have implemented Nik's U-Point technology in RAW editing. But it does depend on what you're actually trying to achieve. Affinity does a great job with RAW, and its noise reduction is really good.


Well, as always, such decisions depend highly on personal criteria, and also very much on the camera/lens base you use and how well it is supported.

For example, star ratings and flagging are common nowadays in RAW processors and image browsers, and are even supported by the OS (macOS) itself, so that's nothing earth-shaking; most freeware RAW converters and image viewers/browsers support them too.

Noise reduction, in turn, depends more on the algorithm and its controls, on how finely the steps can be tuned, and on the quality of the initial image data. Some solutions allow finer, less aggressive steps and do a better pixel pre-analysis, while others take a cruder approach.

The Nik U-Points offer a neat way to manipulate and adjust selected image areas. IMO that is the main attraction and the difference from software without such a feature.
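Conceptually, a U-Point-style control point is just a soft mask built from spatial distance and colour similarity to the picked pixel, with the adjustment blended in proportionally. A toy sketch of that idea in Python (my own illustration of the principle, not DxO's actual algorithm; all names and tolerances here are made up):

```python
def upoint_weight(px, py, pcolor, x, y, color, radius, color_tolerance=60.0):
    """Blend weight (0..1) for pixel (x, y) under a control point at (px, py).

    The weight falls off with spatial distance from the point and with
    colour distance from the picked reference colour, roughly imitating
    how U-Point-style local adjustments select "similar" nearby pixels.
    """
    # Spatial falloff: 1 at the point, 0 at the edge of the radius.
    dist = ((x - px) ** 2 + (y - py) ** 2) ** 0.5
    spatial = max(0.0, 1.0 - dist / radius)

    # Colour similarity: 1 for an identical colour, 0 beyond the tolerance.
    cdist = sum((a - b) ** 2 for a, b in zip(pcolor, color)) ** 0.5
    similarity = max(0.0, 1.0 - cdist / color_tolerance)

    return spatial * similarity


def apply_point(value, weight, delta):
    """Apply an exposure-like shift 'delta' to one channel, scaled by the mask weight."""
    return min(255.0, max(0.0, value + weight * delta))
```

A real implementation works on whole images and in a better colour space than raw RGB, but the principle is the same: a per-pixel weight from position and colour, then a weighted blend of the adjustment.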

As for RAW and lens correction support, many RAW software solutions are mediocre at best here and rely heavily on third-party libraries like dcraw. Meaning: as long as the original library maintainers don't update their code base, don't expect newer, better or fixed support for particular camera and lens models.




The noise management is tempting, though it's expensive for just that. Even in octal. Maybe I'll tinker with the free version for a while to see if I can get into it enough. It does seem to have a particular flavour on RAW that is different to the more familiar LR setup.

 

One wonders what DxO actually bought. Was it just the code, or did they get engineers too? If just the code, having worked on old software myself, I don't envy the folks who have to get their heads around it. Much will depend on how well documented it is and how clean and consistent the coding is. It's easy for code written by different engineers to vary widely in style. I'd guess there are large 'black box' chunks where you call in, get stuff back, and hope to goodness you never have to do deep bug fixing inside.




Hard to tell. Google acquired Nik back in September 2012, mainly for the Snapseed mobile app and the technology behind it. The Nik PS plugin collection obviously wasn't really maintained or updated by Google over the years; the only thing added to the collection in that time was Analog Efex Pro. According to press sources, some of the most talented Nik developers had already moved on, either to the Adobe LR team or to other Google projects.

So I think DxO mainly bought the source code and probably did not take over any of the engineers who originally developed the software. As for maintaining code written by others: yes, it's always harder to work through it that way and understand what certain routines and code parts are meant to do and how to use them correctly. But all that depends highly on how well the code and its documentation were written and maintained over time, and on the complexity of the code base per se.

 




The worst code I ever had to maintain was written in C but #-defined to look like another language that the programmer was fond of (APL, I think).

 

Second prize goes to a large chunk of assembler written by a hardware engineer, with no procedures, no comments and neat tricks such as setting a flag in a register and doing a conditional jump on it several hundred lines of code later.




Well, it always depends!

I recall one company project, one of those where they call you in quite late to assist with development, when things are already on fire because of the estimated release timeline. It was mainly a huge Java-based project for a three-tier ERP system, with nearly 800 MB of sources accumulated over the years. The system had to be renewed with new functionality, a completely new user interface, distributed database connectivity and so on. Of course there was no real documentation besides the code itself, no specs, no module API docs. A bunch of different devs had worked on the code base, most of whom had left the evolving project long ago. So the code itself was the documentation, and only one or two project people vaguely remembered what certain code parts might have been used for, or what the overall intention of certain modules was at all.

It was pretty hard to get an overall picture of that much code and of how the parts were intended to work together, to determine what was dead and/or outdated/unused code and what played an important role in critical software behaviour. I recall we had to do a lot of refactoring just to find out where certain classes and method calls were used at all, and what their purpose was. Sometimes you thought some piece was unused, dead ballast that could safely be renamed or thrown out during refactoring. Then, at some deeper level, you realised it was still used somewhere through a very tricky dynamic call, one the Java parser couldn't recognise earlier when collecting references for the refactoring. You then had to undo all the modifications and revert to the last halfway-working CVS revision, losing the time spent on the changes and the clean-up.
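That trap isn't unique to Java reflection; it appears in any language where a call target is assembled from a string at runtime. A minimal Python illustration (hypothetical names, my own sketch) of why a rename refactoring can silently break such code:

```python
class ReportExporter:
    """A class whose methods are partly invoked dynamically."""

    def export_pdf(self):
        return "pdf written"

    def export_csv(self):
        return "csv written"


def run_export(exporter, fmt):
    # The call target is built from a string at runtime, so a
    # rename-refactoring tool that only follows static references
    # will not see that export_csv is used here. Renaming it to
    # write_csv would pass every static check in the codebase and
    # then fail at runtime with an AttributeError.
    handler = getattr(exporter, "export_" + fmt)
    return handler()
```

The Java equivalent is a reflective `getMethod("export" + fmt).invoke(...)` call: the IDE's refactoring engine sees only a string literal, which is exactly the situation described above.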

In short, such worst-case projects are always a nightmare to work on!



