ChicagoSeb Posted October 25, 2023
I take and work with lots of cityscape photos, and some have reflections in the windows of the buildings. My Samsung Galaxy S23 has what it calls an "object remover" function: you tap a button and the reflections are removed instantly, and it usually does an excellent job, with few exceptions. The problem is that I have to take my Affinity Photo edits and finish them on my smartphone, which adds extra work. Will Affinity ever come close to having a function similar to Samsung's, so I don't have to go back and forth between devices?
NotMyFault Posted October 25, 2023 (edited)
Looking into my crystal ball, I don't think this will happen, at least not any time soon. (See carl123's post later in this thread for an update.)
Anyhow, Affinity has multiple tools that let you remove unwanted parts, though you may need a bit of practice to achieve the desired result quickly and effortlessly:
- Inpainting Brush Tool
- Patch Tool
- Clone Brush Tool
- Covering areas with a solid colour or simple gradient
- For complex cases, any of the above combined with frequency separation
If you can provide example images of the before/after state, we can show some quick edit workflows.
Edited October 26, 2023 by NotMyFault
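For anyone curious what the inpainting option above does conceptually, here is a minimal sketch using OpenCV's classical (non-AI) inpainting, which fills a user-masked area from the surrounding pixels. This is only an illustration of the general technique, not how Affinity's Inpainting Brush Tool is actually implemented; the file names and the mask are placeholders.

```python
import cv2
import numpy as np

# Load the photo and a hand-painted mask of the reflection:
# white (255) where the reflection is, black (0) everywhere else.
photo = cv2.imread("cityscape.jpg")
mask = cv2.imread("reflection_mask.png", cv2.IMREAD_GRAYSCALE)
mask = (mask > 127).astype(np.uint8) * 255  # force a clean binary 8-bit mask

# Telea's algorithm fills the masked region from nearby pixels,
# the same basic idea as a one-click "remove" brush without AI.
result = cv2.inpaint(photo, mask, inpaintRadius=5, flags=cv2.INPAINT_TELEA)
cv2.imwrite("cityscape_inpainted.jpg", result)
```

In Affinity itself, the equivalent workflow is to brush over the reflection with the Inpainting Brush Tool; large, structured reflections will usually still need touch-up with the Clone or Patch tools.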
ChicagoSeb Posted October 25, 2023 (edited)
12 minutes ago, NotMyFault said: Looking into my crystal ball, I don't think this will happen, at least not any time soon. […]
Oh, that's disappointing, but oh well! Thank you for your reply! Here's an example of the after and the before:
Edited October 25, 2023 by ChicagoSeb
PaulEC Posted October 25, 2023
The best thing, if possible, is to avoid the reflection in the first place. Either change the angle of the camera to the window, or go closer to the window.
ChicagoSeb Posted October 25, 2023
20 minutes ago, PaulEC said: The best thing, if possible, is to avoid the reflection in the first place. […]
Precisely what I do. But sometimes, as in this case, that's not possible, especially when the time of day is particularly important. Thank you!
NotMyFault Posted October 25, 2023
There is no way to achieve the same result in Affinity without lengthy editing. This Android object-removal tool is probably AI-based, and so far Affinity has given no hint that AI-based tools will be included. If you inspect the image closely, you can see typical AI artifacts: for example, the middle horizontal line at the height of your hand is simply missing, whereas a human editor would have cloned it from the lower areas.
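For context on what "AI-based" removal generally means: such tools are widely assumed to use generative inpainting, where a model repaints the masked region from learned image statistics rather than cloning nearby pixels, which is also why structural details like that horizontal line can simply vanish. The sketch below is a hypothetical example using the open-source diffusers library, not Samsung's or any Affinity implementation; the checkpoint name, file names and prompt are placeholders.

```python
import torch
from diffusers import StableDiffusionInpaintPipeline
from PIL import Image

# One publicly available inpainting checkpoint; any similar model would do.
pipe = StableDiffusionInpaintPipeline.from_pretrained(
    "stabilityai/stable-diffusion-2-inpainting",
    torch_dtype=torch.float16,
).to("cuda")

image = Image.open("window_shot.jpg").convert("RGB").resize((512, 512))
mask = Image.open("reflection_mask.png").convert("RGB").resize((512, 512))  # white = repaint

result = pipe(
    prompt="city skyline seen through clean glass, no reflections",
    image=image,
    mask_image=mask,
    num_inference_steps=30,
).images[0]
result.save("window_shot_cleaned.jpg")
```

Because the model invents plausible content rather than copying it, the artifacts described above (missing or hallucinated structure) are exactly what to check for in the result.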
NotMyFault Posted October 25, 2023
When I shoot through windows (with an iPhone), I always put the lens directly on the glass (and, where possible, clean both surfaces with a tissue). If you need to shoot at an angle, use your hand or any dark cloth to cover the area around the lens and block all light coming from behind; it is enough to cover the area around the phone itself, no need to get right up to the lens. If you shoot this kind of image often, consider buying a soft black rubber ring or a mini bellows (of the kind used on folding cameras).
thomaso Posted October 25, 2023
4 hours ago, ChicagoSeb said: I take and work with lots of cityscape photos and some have reflections in the windows of the buildings.
These two aspects alone suggest using a polarizing filter when shooting the photos. Not only can it prevent window reflections (and reveal the 'reality' behind them), it is also useful for other types of glare (e.g. on plant leaves and grass, water such as lakes and rivers, chrome, varnish, etc.), and therefore for improving contrast or intensifying the sky, for instance.
https://www.2020mag.com/article/the-physics-of-polarizing-filters
https://www.lenstip.com/115.2-article-Polarizing_filters_test_About_light_and_polarization.html
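Roughly, the physics in those two articles comes down to two standard relations: light reflected off glass or water near Brewster's angle is almost entirely polarised in one plane, so a polariser rotated to block that plane suppresses the reflection.

```latex
% Brewster's angle for an air-to-glass boundary (n_1 ≈ 1.0, n_2 ≈ 1.5)
\tan\theta_B = \frac{n_2}{n_1} \approx 1.5
\quad\Longrightarrow\quad
\theta_B \approx 56^\circ

% Attenuation of the polarised component by the filter (Malus's law),
% where \theta is the angle between the filter axis and the polarisation plane
I = I_0 \cos^2\theta
```

This is why rotating a circular polariser while watching the preview lets you dial the reflection down, and why the effect is strongest when the camera is at roughly 50–60° to the glass rather than square-on.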
ChicagoSeb Posted October 26, 2023
4 hours ago, thomaso said: These two aspects alone suggest using a polarizing filter when shooting the photos. […]
Thank you so much, that is very helpful. I have a Canon camera that I sometimes use; I will look into a polarizing filter for it, that's cool! However, I usually use my Samsung S23 to take my photos, so I'll have to keep using that phone as a last step sometimes. That reflection remover tool works really well. Thanks again!
ChicagoSeb Posted October 26, 2023
5 hours ago, NotMyFault said: If you inspect the image closely, you can see typical AI artifacts […]
Oh, I see; I never thought of that. Thanks for the new perspective. I'll have to inspect my final images more closely now that you've pointed that out. Thank you!
ChicagoSeb Posted October 26, 2023
5 hours ago, NotMyFault said: When I shoot through windows (with an iPhone), I always put the lens directly on the glass […]
I neglected to mention that I use my Samsung S23 most of the time when I shoot, but I'll take your suggestions and prepare more thoroughly before I leave the house, so that I have things ready in case I need to shoot at a moment's notice. On the occasions I do have time to use my Canon camera, I'm going to get the ring and bellows, because I often shoot buildings from the windows of other buildings in the city, which causes reflections. Thank you so much, you have given me a lot to think about and upgrade!
thomaso Posted October 26, 2023
1 hour ago, ChicagoSeb said: I usually use my Samsung S23 to take my photos so I'll have to continue using that phone as a last step sometimes. That reflection remover tool works really well.
Unlike image editing after the shot, a filter can prevent reflections from reaching the sensor in the first place. So, just in case: there are also various third-party filter/lens adapters for mobile phones, with various mounts (e.g. clip or magnet).
carl123 Posted October 26, 2023
8 hours ago, NotMyFault said: so far Affinity has given no hint that AI-based tools will be included.
They have, but you need to be able to read Japanese:
https://realsound.jp/tech/2023/09/post-1446012_2.html
NotMyFault Posted October 26, 2023
Excerpt, translated by Safari reader mode:
Interviewer: I think the next update will be version 2.3. What features are planned for 2.3 or subsequent updates?
Ashley: We plan to release 2.3 by the end of the year. I can't tell you now specifically which functions will be implemented, but we plan to provide "cloud services" as a future feature, so that tool assets and colour palettes can be shared across devices. There are also AI-based features planned for the future. In fact, a professional AI team has been working on this for about two years, and we are creating some exciting functions. It will be something wonderful that will surely surprise users.
R C-R Posted October 26, 2023
9 hours ago, NotMyFault said: Excerpt, translated by Safari reader mode:
Just curious, but what version of Safari are you using that has translation capabilities in reader mode?
NotMyFault Posted October 26, 2023
It has been built in for years. I don't know exactly when it was introduced, but it exists at least since Safari on macOS 11.0:
https://support.apple.com/guide/safari/webpage-translation-in-safari-on-mac-ibrw6ea421e3/16.0/mac/11.0
On iPhone/iPad, you may need to install the Apple Translate app and add additional languages.
R C-R Posted October 26, 2023
1 hour ago, NotMyFault said: It has been built in for years. I don't know exactly when it was introduced […]
Thanks. Since I am using Catalina (macOS 10.15), I suppose that is why it is not there for me.