This is incredible. There are soooo many color editing features that DaVinci already handles so damn well that I only wish they existed in photo editors. To the point that people were posting videos on YouTube about hacky workflows for editing RAW photo files in Resolve and exporting each one as a JPG, haha.
Only Darktable seemed to push the technical capabilities of photo editing forward (AgX, parametric masks, tone equalizer, etc.), while the rest of the "industry standard" software lagged behind for so long, stagnant. Even more so when it comes to "creative" ways of editing, which video editing software adopted years ago but photo editors didn't (relighting, straightforward LUT usage, film emulation, halation, other aesthetic effects like VHS film damage, etc.).
There's so much we can do. To me, it seems like a conservative culture (photography) vs. a progressive one (video editing). I've been in both worlds, and for some reason video editing software and professionals were much more eager to try new stuff and celebrate new ways to shape visuals, compared to photographers.
The short of it is that there’s no money in photography, compared to videography.
Movies routinely have 8 or 9 digit budgets, with teams of hundreds of people who have to collaborate to make footage coming from dozens of different cameras look seamless and consistent. Meanwhile, $1M would be an insane budget for a photo shoot.
You can see this in the actual skills of people working in the field as well. Anyone working in video has a solid understanding of the technical underpinnings of their craft. On the other hand, it’s not uncommon for working photographers to not understand some really basic stuff about color science/data formats/etc.
Fundamental misunderstanding of the market dynamics here.
There are at least an order of magnitude more people making a professional salary as photographers (i.e., enough to justify a software purchase) than as professional videographers.
Outside of film, videographers are generally paid a day rate about half as high as photographers, with enormously higher equipment costs.
Film - hollywood, streaming, TV etc, combined actually employ a relatively small number of people. Sure there's enormously more budget for any given TV show than say a wedding photoshoot, but think about how many people get married, how many corporate photo sessions there are etc etc.
Basically by conflating videography and cinematography you've obscured the issue. Source - I'm a videographer that also works as a cinematographer / director on smaller budget projects.
Also, on anything bigger than a very low budget short, it's editors and post people who are using the editing software, not the videographers / camera operators / DOP. Bear in mind DaVinci does not own the film industry. It's very much still Avid's game, with Nuke for colour, and a small percentage of Adobe Suite.
100% agree. Photo is a much muuuch bigger market.
Counterpoint: most of a movie's budget is usually spent on the actors and on the filming, not on the editing team. There are also copious amounts of money in photography; a lot of advertising is still static images and print.
Yes, but if the budget of the whole thing is high(er), they don't tend to cheap out on details that could make or break it.
Or phrased differently: if your shoot costs a million a day, it doesn't matter whether your camera costs 400 bucks a day or 40. In fact they may ask you whether you really wanna go with the 40-buck camera.
But there are a couple orders of magnitude more photo shoots than movies, and since you only have to write software once and can then copy it for free, investing in creating photo editing software still makes sense.
> Meanwhile, $1M would be an insane budget for a photo shoot.
Photo shoots for automotive advertising are regularly around that price point.
> Only Darktable seemed to push the technical capabilities of photo editing forward (AgX, parametric masks, tone equalizer, etc)
As a casual photographer, I wanted to love darktable and I'm sure it's extremely capable. But the UI is just so hard to get to grips with. I've put a few hours into it, tried following some tutorials etc. but I have no idea what I'm doing there.
I do have a fairly decent grasp of color science from working in 3d graphics so it's not that I'm lacking there. I guess it's like blender of yore. It could become mainstream but it would require a full UI overhaul and in the meantime it's for experts only, or determined people with a lot more time on their hands than I have.
There is even a Darktable fork, Ansel, where they try to roll back a lot of these UX mishaps.
Once you care only about editing and not cataloging, RawTherapee ends up being the better editor for me.
> There is even a Darktable fork, Ansel, where they try to roll back a lot of these UX mishaps.
AFAIK, the reasons Ansel exists are:
1. To yank out internals for code purity reasons.
2. Its (talented) developer worked better by himself than in a group.
He was vehemently opposed to any idea containing the words "intuitive" or "UX".
Yeah, the UI in darktable is not good enough to go through a large shoot. When I've tried to use it I always end up doing all my selection in PhotoMechanic and then in darktable I just do editing. But even that UI/UX is terrible.
The Blender metaphor is spot on. I am a software engineer, I spent 2 years living in 3ds Max in my teens, writing tutorials for it, and I am unable to make a basic scene in Blender; it's like alien-made software.
The GP referred to "Blender of yore". Blender went through major UI overhauls, and recent versions are very intuitive.
Still, Blender and 3ds Max are pretty much on two different ends of some spectrum, not sure which yet, but they seem to follow two very different axioms of UI and UX philosophy. I've spent most of my 3D-ing time in 3ds Max, yet Blender is more intuitive to me; I also know others in the same position who favor 3ds Max.
This was always absolutely inexplicable to me. A lot of photographers are just resistant to better color tools (as in, actively arguing against them!) or are in deep denial about their existence. Photography is well behind videography in that regard.
Having done professional design work, photography, video editing, 3D animation, yada, yada, yada: I can't think of a time where I've been unable to achieve my color goals in still photography with PS's or Lightroom's tools.

For people to bother learning new professional tools, there needs to be a more concrete reason than "but this one is technically better." For hobbyists who are really into the tech? Sure. For professionals who need real precision and consistency, e.g. product photographers shooting a lot of stuff with precisely defined brand colors, or wedding photographers whose photos will frequently be looked at in series? Sure. For most, the ROI on the time spent just isn't there.

The use case for more precise and consistent color grading in movies or other professional video is obvious: all the frames are sitting next to each other, and subtle color changes can so drastically affect the mood of the piece as a whole because it's so immersive. But most professional images are seen in specific contexts with other elements, often through unpredictable media, so those tools just aren't as useful there. And they're also more complicated. Simplicity is a huge boon for efficiency, and efficiency is really important for professional work.
I think this has been imprinted in the photographer world due to long-standing requirements from AP, Reuters, etc. on avoiding post-processing. Video has never had these constraints; post-processing is required to publish the works.
That’s interesting - how do they define that? Surely they don’t publish raw rgb?
Reuters banned photos processed from RAW over 10 years ago. They will only accept JPEGs straight from the camera.
https://signalprocessingsociety.org/community-involvement/in...
AP has had these rules since the late 90s:
"Only the established norms of standard photo printing methods such as burning, dodging, black-and-white toning and cropping are acceptable. Retouching is limited to removal of normal scratches and dust spots."
https://niemanreports.org/aps-policy-banning-photo-manipulat...
Darktable is great, but notably, it doesn't have any neural network-based denoising, even though that's now standard in Lightroom, Capture One, and other apps. Darktable only has rather outdated wavelet and non-local means denoising. So a photo that would be perfectly fine at ISO 6400 in other apps will still look grainy, or worse, splotchy in Darktable.
To give DarkTable credit, neural-network-based denoising will be in the next major release (5.6).
And even without neural networks, DarkTable denoising is better than open-source competitors, due to the database of camera sensor noise shipped with it. For each supported camera and ISO setting, it contains the measured values of Poissonian and Gaussian components of the sensor noise, so proper denoising becomes a one-click operation. That's as opposed to the much more complicated "drag the luminance and chrominance noise sliders until the noise disappears, then drag two more sliders to recover detail" workflow found, e.g., in ART.
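To sketch what such a profile buys you (toy code, not darktable's actual implementation, and the `a`/`b` values below are made up): with the Poissonian gain `a` and Gaussian variance `b` known for a given camera/ISO, a variance-stabilizing transform (the generalized Anscombe transform) turns the signal-dependent noise into roughly unit-variance Gaussian noise, so a single denoiser setting works across the whole brightness range:

```python
import numpy as np

def gat(y, a, b):
    # Generalized Anscombe transform: for raw values y whose noise variance
    # is a*y + b (Poissonian gain a, Gaussian variance b), this maps the
    # noise to approximately unit-variance Gaussian, so one plain Gaussian
    # denoiser works at every brightness level.
    return (2.0 / a) * np.sqrt(np.maximum(a * y + (3.0 / 8.0) * a**2 + b, 0.0))

# Simulated sensor with a made-up profile a=2, b=4, mean signal 100.
rng = np.random.default_rng(0)
y = 2.0 * rng.poisson(100.0 / 2.0, 200_000) + rng.normal(0.0, 2.0, 200_000)
print(y.std())                  # signal-dependent noise, roughly sqrt(204)
print(gat(y, 2.0, 4.0).std())   # stabilized, roughly 1.0
```

That's the sense in which a shipped noise profile makes denoising one-click: the sliders are effectively precomputed from `a` and `b`.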
Darktable has a "neural restore" algorithm [0] in the development version (intended for midsummer release). Note:
- It appears to be an out-of-band pre-processing stage (run the image through denoise to produce an intermediary TIFF), unlike most other parts of the program.
- All AI features are gated behind compile-time flags which default to off.
As a professional photographer and mostly stills editor, I am really excited to learn more advanced colour editing using this software; I'm already using it for some video at a novice level. Thankfully I don't get much video work to do, but learning the skills on stills is going to really improve my skills in motion. I'll wait for the reviews, but I'm really looking forward to cancelling my Adobe sub.
> handles so damn well when it comes to color editing
I know it sounds shocking to criticise the color editing capabilities of a dedicated colorist tool, but...
Resolve only got HDR output support on Windows recently! Up to version 18 or 19 it output gibberish that only specialised (super expensive) monitors could display. So you could have a HDR OLED 4K monitor and you'd get a washed out mess unless you also spent a ton of money on SDI cards for no good reason.
Sure, they fixed that now, but the pedigree of "we're a hardware company first, software company second" remains. They're not a photo editing company and have no idea what makes Lightroom "the" industry standard.
> conservative culture (photography) vs progressive (video editing)
I've found the exact opposite to be true!
Lightroom has used "scene referred" (correct) color management since forever. 32-bit float ultra-wide-gamut HDR throughout. This is a "new" feature in Resolve! [1]
Similarly, I just tried Resolve 21 photo export and it exports... SDR. Probably in sRGB, who knows? Appears to be totally uncalibrated.
Meanwhile Lightroom can export 16-bit PNGs, wide-gamut, true HDR, HDR gain maps, JPEG XL, etc, etc.
Resolve is way behind on the basics.
[1] There are excuses for this, mostly to do with performance when editing real-time footage vs a still image.
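To make the scene-referred point concrete, here's a minimal illustration (toy code, not either app's actual pipeline): in a scene-referred workflow, edits operate on unbounded linear-light values, and clipping to the display only happens at the final encode, which is why highlight detail survives an exposure pull:

```python
import numpy as np

def exposure(rgb_linear, stops):
    # Scene-referred: pixels are linear light and may exceed 1.0.
    # An exposure change is a plain multiply, like opening the lens.
    return rgb_linear * (2.0 ** stops)

def encode_srgb(rgb_linear):
    # Display-referred encode: clip to the display's [0, 1] range,
    # then apply the sRGB transfer curve.
    c = np.clip(rgb_linear, 0.0, 1.0)
    return np.where(c <= 0.0031308, 12.92 * c, 1.055 * c ** (1 / 2.4) - 0.055)

highlights = np.array([2.0, 4.0])   # distinct scene intensities, both over 1.0
# Scene-referred: pull exposure down first; the two highlights stay distinct.
print(encode_srgb(exposure(highlights, -2.0)))
# Display-referred: clip first; both were flattened to 1.0, detail is gone.
print(encode_srgb(exposure(np.clip(highlights, 0.0, 1.0), -2.0)))
```

The first print gives two different values; the second gives the same value twice, which is the whole argument for keeping the pipeline scene-referred until output.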
I tried Resolve just now for Photos, and I'm not impressed.
The Sony RAW file rendered terribly compared to Lightroom.
I found the interface unintuitive and didn't even manage to locate the much-praised color grading features; that tab opens with a video view.
This needs some work to compete with Lightroom for photos. I see that it's Beta 1, just saying.
I guarantee that it won't improve significantly even after several major releases.
Resolve is designed to be controlled with their "panels", which have lots of dials and knobs to turn.
The software only interface is clunky at best, and they steadfastly refuse to fix basic usability issues lest that undermine the justification for buying their hardware.
For example, cropping and rotating media in Lightroom is a totally different experience compared to Resolve (photo or video, they're both bad!).
Lightroom lets you fine-adjust sliders by pressing shift so that instead of rotating an image by HUGE AMOUNTS BACK AND FORTH you can easily remove a 0.4% tilt without having to type in the numbers into an "angle" text input box like a savage.
Lightroom's crop and rotate controls do a "constrained crop" by default so that you don't get black wedges in the corners of the image. When the background is already mostly (but not perfectly) black, this can be infuriating to fix in Resolve by alternately rotating, cropping (numerically!), rotating, cropping, etc.
While I'm complaining about Resolve issues, it gets the color temperature scale wrong, as per this video, to the point where I find it nearly unusable: https://www.youtube.com/watch?v=WADuXiMZxq4
I know you have a whole narrative going, but there have got to be millions of "make my picture look analog" filters. That was the whole premise of Instagram, and you can get specific effects to make pictures look like all kinds of specific cameras, so mentioning VHS-like aesthetics as something that doesn't exist is very strange.
An Instagram filter is to a 3D LUT as a PB&J sandwich is to a Michelin star meal...
Let alone the other things listed.
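To put some substance behind that comparison: a 3D LUT is a lattice over the whole RGB cube, applied by trilinear interpolation, so it can encode an arbitrary smooth color transform (e.g. a measured film stock) rather than a canned recipe. A minimal NumPy sketch (simplified; real .cube handling also involves domain ranges, shaper LUTs, etc.):

```python
import numpy as np

def apply_lut_3d(rgb, lut):
    # Apply a 3D LUT (shape n x n x n x 3, indexed by R, G, B) to RGB data
    # in [0, 1] via trilinear interpolation over the lattice cell that
    # contains each pixel.
    n = lut.shape[0]
    idx = np.clip(rgb, 0.0, 1.0) * (n - 1)
    lo = np.floor(idx).astype(int)
    hi = np.minimum(lo + 1, n - 1)
    f = idx - lo                      # fractional position inside the cell
    out = np.zeros(rgb.shape[:-1] + (3,))
    # Weighted sum over the 8 corners of the surrounding cell.
    for cr in (0, 1):
        for cg in (0, 1):
            for cb in (0, 1):
                r = (hi if cr else lo)[..., 0]
                g = (hi if cg else lo)[..., 1]
                b = (hi if cb else lo)[..., 2]
                w = ((f[..., 0] if cr else 1 - f[..., 0]) *
                     (f[..., 1] if cg else 1 - f[..., 1]) *
                     (f[..., 2] if cb else 1 - f[..., 2]))
                out += w[..., None] * lut[r, g, b]
    return out
```

A consumer "filter" is usually a fixed curve-plus-overlay recipe; a LUT like this reproduces whatever mapping you actually measured.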
I'm saying the things mentioned exist, and I gave an example of one of the most popular consumer applications in the whole world already offering an entry-level version of the same feature, since that's what most people know about.
You have all those features in professional photo software already as well. DaVinci is cool, but it doesn't unlock anything like "make my photo look like VHS" that hasn't existed for decades by now.
Is there even a working definition of what a "filter" is in Instagram, or in mobile photo editors targeting social media users (which is approximately all of the mobile photo editors), beyond "a script that fucks up your photo in some trivial but also undocumented ways"?
I've yet to see a filter that makes a photo look like it was taken with a specific camera (old or otherwise). Smearing colors and sticking on a frame that imitates a film border does not count.
But that was a fad meant to hide the poor quality of photos taken with the smartphones of that era.
Nowadays the default is that everybody cranks saturation and vibrance way too high so that it looks good on a small screen covered in fingerprints, with a scratched screen protector, under sunlight. The same way music is dynamically overcompressed because the baseline is that it still needs to sound half decent in hostile, noisy environments with crappy speakers/headphones.
Photoshop has been able to do anything you mentioned for many years now.
I wish I used darktable more, but its defaults are terrible. It's one of those pieces of software developed by enthusiastic programmers who ignore the actual needs of photographers. You don't need tons of demosaic algorithms when there's not one reliable selection tool.
Photoshop itself, without ACR, is light years behind in color processing. It's a dinosaur at this point. It had only one remotely competent grading plugin (Firegrade), but it seems abandoned.
Name some color process that cannot be done in PS. To be precise, I'm recommending PS for color grading.
You can do anything in a hex editor. The question is how convenient it is.