Everything posted by iconoclast

  1. Don't think so. Look, the main light on the guy on the left comes from somewhere on the right side. The main light on the woman comes from the left side. And there are no plausible sources for these lights. If there were some sort of lanterns somewhere outside the boundaries of the image, they would affect everything around them. You are trying to make a very complex collage. The only ways to get this done are to find images that fit each other, or to retouch, not to say paint, quite a lot. And that is not easy without enough knowledge about light.
  2. My first impression is that the light angle on the woman in the foreground doesn't match that of the man on the left. I think she will not work. Maybe if you flip her horizontally.
  3. I think the lamp would also cast light on the ground, not only on the people. It wouldn't only throw a shadow. As I already said: no light without shadow and vice versa. And the brighter the light, the darker the shadow. It's even a little more complex, because light has a colour cast most of the time (e.g. a bluish one at lunchtime and a more red one in the evening; fire and old light bulbs emit slightly warm, yellowish to reddish light, while LED and neon light is often more bluish...). And so do shadows, in the complementary colour of the light colour (see the little sketch below this post). But I think this is not so important in this case, because the difference is too small at night. These things are very good design tools for composition, to produce expression, plasticity and suspense. Light is very important in horror movies. But after all: if nobody detects the flaw, it might not be so important. It is all a matter of aspiration.
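Just to illustrate what I mean by the complementary colour of the light: a rough sketch in Python. The light colour is invented, and this is only a simplification of the idea, not anything Affinity does internally:

```python
# Illustration only: shadows tend toward the complement of the light colour.
warm_light = (255, 220, 160)   # e.g. an old light bulb, slightly yellowish

# A simple RGB complement: invert each channel
shadow_tint = tuple(255 - c for c in warm_light)

print(shadow_tint)             # (0, 35, 95) -> a cool, bluish tint for the shadows
```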
  4. To drag the pipette to the point where I want to pick a colour, release the mouse button there and then click on the round field to the right of the pipette works at least for me. In that case, the colour is activated in the main colour field and you should be able to add it to the swatches too. But not by simply dragging it, rather by simply clicking the icon to the left of the colour history (on your screenshot, on the right side of the menu in the Colour panel where "Colours" is activated - three squares and a plus sign). If that doesn't work in your case, there must be something wrong. But I agree with you that Affinity's way of handling this is a little bit confusing. Edit: By the way, in your screen recording, if you clicked on the green sphere to the right of the pipette in the Colour panel, the main colour field, the big sphere on the left, should turn green. An additional note for clarification: when I said "drag the pipette...", I meant: click on the pipette symbol and hold the left mouse button to drag it to the pixels. Then release the mouse button and click on the small sphere to the right of the pipette to activate the chosen colour that should now be visible there. And a final annotation: I have consistently ignored that you are trying to transfer a colour from another app. Sorry, I'm actually lying in bed with influenza. I don't remember if I have tried that in the past, and I can't try it at the moment, but in other apps (e.g. GIMP) I'm used to dragging colours from outside the app with the pipette of the app I want to transfer the colour to. Can't remember any app that does it the other way round. Maybe this somehow helps.
  5. It definitely looks good, but my problem is still that the lighting isn't plausible. E.g., if the paramedics are lit so strongly, why is there no light on the ground? But it is definitely progress that you darkened the house. That's absolutely OK, in my opinion.
  6. Hey, it's a horror movie! Three arms are very spooky. Your version looks more credible, in my opinion. We should possibly differentiate "credible" from "realistic" in this case, because Hollywood lighting usually isn't realistic either, but it is well composed. If the paramedic at the front of the scene is important for the plot, I think they would have set some more light on him from behind, or a weak light from the front (maybe blue light).
  7. One thing I'm actually thinking about: shouldn't color picking work by dragging the pipette from the color panel to the pixels you want to pick up? And after that you need to click on the field to the right of the pipette to confirm the choice? And to choose a color from the swatches, simply click the swatch you want. That's not what I see in the screen recordings. Possibly I'm interpreting the problem wrongly, but that's my last idea for this topic. By the way, this is a thing that confused me too in the beginning. That was the reason for my post above.
  8. OK, that's possible. But if he knows other Windows or Mac apps, it shouldn't be too difficult to understand the structure of the Affinity apps. Otherwise it should be no problem to ask more precise questions. Because we also need to know where we have to start explaining.
  9. Just to put in my two cents on this topic: I like the way this is solved in GIMP. There is of course a color bucket and also a pipette, but you can also easily pick colors with a brush if you hold the Ctrl key (in GIMP it's the same key you use for the clone tool), and you can drag colors directly from the color field of the color dialogue (panel) to fill. That's very easy and I don't understand why it should be more complicated.
  10. What is so confusing about the Help and the tutorials? I suppose the amount of options confuses you. In fact there is a range of options, e.g. for retouching. Depending on what you want to retouch and how, you can use e.g. the Clone, Heal or Inpainting Tool (to replace e.g. dust grain or lint), the Dodge or Burn Tool (to lighten or darken areas), the Sponge (to increase or decrease the saturation of certain areas) and the Sharpen, Median and Smudge Tools (to change the sharpness of certain areas). These tools are only for targeted use. For global actions, you should use the filters. For the Clone and Heal Tools, hold the Alt key and click on a point in the image to gather the colour information of that point. Then release the Alt key and click on the point where you want the colour information to be painted. This way you can cover artifacts and whatever you want (see the little sketch below this post for the idea). But take care that you use a suitable point to gather the colour information. And take care to use a suitable brush preset, brush size, brush hardness etc. Concerning the other tools: simply test them out. And if you don't understand something specific, ask us. By the way, it is not very difficult in general, but it is something you must spend a little time on to learn. My apprenticeship took about two years. It contained a little more than this, but I needed more than a few hours to learn retouching. Finally, it is also a matter of aspiration. And another good tip: talk to your printer before he prints it. Printing is expensive and can't be undone after it's done. Depending on the print shop you choose, there can be different requirements that must be fulfilled, like e.g. colour conversion, printable resolutions, file formats...
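To make the Clone Tool idea a little more concrete: conceptually it just copies pixels from the sampled point (the Alt+click) to the point you paint on. A minimal sketch of that concept in Python/NumPy - the coordinates and the radius are invented, and this is of course not how Affinity implements it:

```python
import numpy as np

def clone_stamp(img, src, dst, radius):
    """Copy a round patch of pixels from src (y, x) to dst (y, x)."""
    h, w = img.shape[:2]
    out = img.copy()
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            if dy * dy + dx * dx > radius * radius:
                continue                            # outside the round "brush"
            sy, sx = src[0] + dy, src[1] + dx       # sampled pixel (Alt+click point)
            ty, tx = dst[0] + dy, dst[1] + dx       # painted pixel
            if 0 <= sy < h and 0 <= sx < w and 0 <= ty < h and 0 <= tx < w:
                out[ty, tx] = img[sy, sx]
    return out

# Hypothetical usage: cover an artifact at (120, 200) with clean pixels from (90, 180)
image = np.zeros((300, 300, 3), dtype=np.uint8)
retouched = clone_stamp(image, src=(90, 180), dst=(120, 200), radius=15)
```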
  11. As it seems, the Transform panel doesn't show the dimensions of selections, only those of the areas on a layer that contain pixels. But if you click on the layer of the shape, it should show its dimensions, on the condition that there are no hidden pixels somewhere on that layer that distort the measurement.
  12. It is possibly caused by the alpha channel. As I noticed, in the case I described above, I can see the dimensions of the selection in the Info panel. But this is new to me. I never noticed that before. Edit: And, weirdly enough, if I transform the shape and redo that, suddenly the dimensions even show up in the Transform panel. Seems to be a bug.
  13. Ah, I see, you created your shape with the Shape tool and rasterized it. And then you selected it using the Magic Wand? Right? In that case the dimensions don't show up even on my computer. Don't think that it is intended behavior. Probably a bug.
  14. Are we talking about Affinity Photo or Designer? In fact, if I rasterize a shape in the Pixel Persona of Designer and select it, the dimensions don't show up in the Transform panel. But if I change to the Designer Persona, it works. Don't know if this is intended behavior. In my Photo version (2.2 on Windows 10) it works.
  15. It's a step in the right direction, I think, but there are still some things that irritate me. First, before I forget it again, there are dark edges around the lightning flashes. Not the most important thing, but I already wanted to point it out some time ago and forgot. Then the grass in front of the house. It looks too much like daylight. I'm not sure how it should look, but I think I would try a more bluish tone. And the house is lit from the front, while the lightning is behind it. Normally it would be dark at the front, and it would throw a shadow on the area in front of the house. And the rule says: the brighter the light, the darker the shadow. And vice versa. Concerning the policeman: there may be a plausible light source somewhere that lights him, but why doesn't it light the dog and the back of the ambulance too? And how would it affect the shadow under the ambulance? It's not easy to solve those problems by imagination alone, but for me it is somehow obvious that the lighting doesn't work correctly in this scene. Sorry, that may be disheartening, but it's my point of view. As I already said, it's not easy. Edit: One tip: try a Google search for images of thunderstorm scenes at night. You will see that the scenes are very dark, almost black. Only the sky is lit. If there are also houses and other things visible, there must be a plausible light source that is strong enough. Another inspiration could possibly be movies with thunderstorms at night. They will have Hollywood lighting. E.g. Tobe Hooper's "Poltergeist" contains a thunderstorm at night.
  16. That's an important point: the balancing of lights. In reality, the lights inside the ambulance would be weaker than the flashes of lightning. So you will have to dim those lights and the reflected light that comes from them. If you want to print your images, you will have the problem that the white of the paper is the limit in lightness, while the monitor of your computer is a light source and can create much brighter white. But in printed images you can use some tricks. One is to dim the weaker lights, another is to use colours to fake more lightness with a colour contrast. The classic case is to mix a tiny little bit of blue into the white of the light (a little sketch of that follows below this post). This will work even better if the background has a tinge of red, or at least a warm colour. I would use the lighting to emphasize the important parts of the scene anyway.
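A tiny sketch of the "mix a little blue into the white" trick. The values are invented, and in practice you would of course do this with an adjustment, not with code:

```python
def mix(a, b, t):
    """Linear blend of two RGB colours; t is the share of b."""
    return tuple(round((1 - t) * x + t * y) for x, y in zip(a, b))

white = (255, 255, 255)
blue  = (0, 80, 255)

# 5% blue is enough: (242, 246, 255) reads "brighter" next to warm background tones
cool_white = mix(white, blue, 0.05)
```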
  17. Just a tiny annotation from my side: the important point is not what the objects are that you want to release from the background. The important point is whether they have characteristics that distinguish them enough from the background. Characteristics like hue or brightness. Often the background and/or the objects don't have a continuous hue and/or brightness over their complete area, so the app can't decide which pixels belong to the objects and which to the background, even though our eyes seem to be able to do so. In fact our brains make a lot of those decisions, not only the eyes. So maybe AI will one day decide which pixels belong to certain image objects and which don't. In the case of your image, it seems that brightness is the most important characteristic. So if the Selection Brush is not effective enough, you could try to increase the contrast between fore- and background with a Live Filter (e.g. "Brightness/Contrast" - increase the contrast and the brightness cautiously) and then use the Selection Brush. Refine the result. Then deactivate the Live Filter. Then you have a selection that should surround the trees as well as possible (see the little sketch below this post for the general idea). Activate a mask, and the background should disappear (but of course non-destructively - it is still there, you only have to deactivate the mask to get it back). If there are still unclean parts at the edges, you can remove them by hand: paint with black on those parts of the mask. To restore parts that shouldn't be removed, paint with white on the mask. But take care that you do it on the mask, not on the image itself!
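The general idea behind that workflow (boost the contrast, select by brightness, put the result on a mask) could be sketched like this in Python/NumPy. The threshold and contrast values are arbitrary, and Affinity of course does all of this interactively:

```python
import numpy as np

def brightness_mask(rgb, contrast=1.5, threshold=0.5):
    """Build a rough mask from brightness: white = keep, black = hide."""
    img = rgb.astype(np.float32) / 255.0
    # Approximate luminance from the RGB channels
    lum = 0.2126 * img[..., 0] + 0.7152 * img[..., 1] + 0.0722 * img[..., 2]
    # Boost the contrast around mid grey to separate fore- and background
    lum = np.clip((lum - 0.5) * contrast + 0.5, 0.0, 1.0)
    # Everything brighter than the threshold ends up in the selection
    return (lum > threshold).astype(np.uint8) * 255

# Hypothetical usage on a random "photo"; refine the edges by painting black/white on it
photo = np.random.randint(0, 256, (400, 600, 3), dtype=np.uint8)
mask = brightness_mask(photo)
```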
  18. I love the horror genre, but in relation to other genres, there exist only a few very good horror movies. That's at least my opinion. My all-time favorite is "Alien" (Ridley Scott), which offers the ultimate horror vision - in my opinion. Another very good one is "The Descent" (Neil Marshall), which is possibly the most frightening movie I have ever seen. Even as a little child, I crept furtively out of my bed at night to watch "The Thing from Another World" (1951, Christian Nyby/Howard Hawks), one of the most fascinating movies of my childhood. Don't know what went wrong with me. 😄 To be honest, I was a little bit disappointed by the new "It" movies. Possibly because I read the book thirty years ago or so and was definitely hooked. Not so with these movies. But some aspects of them are really good anyway, especially the child actors. Werewolf and vampire movies aren't my favorite kinds of horror movies. These topics are a little bit worn out, I think. Almost the same with zombies. My favorite werewolf movie is "The Company of Wolves" (Neil Jordan), I think. My favorite vampire movie is "The Fearless Vampire Killers" (Roman Polanski), followed by "Interview with the Vampire" (Neil Jordan). But it's of course all a matter of taste. Edit: I liked the old TV adaptations of "It" when I first saw them, and I have them on DVD, but it's not easy for me to watch them today. They have lost a lot of their fascination since then. Especially the second part, with the spider etc., looks a little bit ridiculous. But Tim Curry as Pennywise was great.
  19. The intensity of light is better in this last example, but the problem is still the lighting angle. The brightest light sources in this scene are the lightning and the lights inside the ambulance. But those lights don't have any effect on the people in the foreground of this picture. They are lit by something somewhere at the top left in front of the scene, which only affects them but nothing around them. That's irritating. I think at least the light inside the ambulance must show up in the form of thin reflections on the edges of the silhouettes of the paramedics. And on the back of the guy on the left side there should be a reflection caused by the lightning. But you could only paint them. That's not easy.
  20. Usually you can switch to a preview mode. In this mode, guides, edge markings, grids and even the objects outside the document are hidden, so that you can see your document as it will actually look. I don't know how it can be activated in the iPad version, but in the desktop version the shortcut is Ctrl+W (on Windows). On Mac probably Cmd+W.
  21. If you have a fully opaque, sharp brush (all settings at 100%) and you lower the "Hardness", as I already said above, you get a more and more blurred brush. And if you set "Hardness" to 0%, it will only be fully opaque in the very center. So when I said that you soften the edges of the brush with it, that may not be the right description, because the soft edge in fact reaches from the center to the outer edge. Because of this you need the "Flow" to adjust the range of the blur. "Flow" and "Hardness" are two settings you can use in relation to each other to adjust the appearance of the brush. "Opacity" is a third option to do this, and the only one that also makes the center of the brush more transparent. So you always have to understand those settings in relation to each other. E.g. if you release an image object from a background by working with a slightly soft brush on a mask, you will have the problem that sometimes some parts of the object are more detailed than others. So you will probably change the brush size to be able to trace all those details. Of course, smaller brushes with the same "Hardness" setting have a smaller range from the center to the edge, because the brush is smaller. So they will create sharper edges around the image object in those areas where you use smaller brushes. For that reason, among others, you can adapt the brush to your needs with the "Flow". Simply play a little with the brush and look at what happens if you only change one of the settings. Then check how the settings behave in relation to each other (a little sketch of the idea follows below this post).
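If it helps, here is a rough sketch of how I picture a single brush dab, with "Hardness" setting where the falloff starts and "Flow" scaling how much each dab deposits. The formula is my own simplification, not Affinity's actual brush engine:

```python
import numpy as np

def brush_stamp(size, hardness, flow):
    """One round brush dab as an alpha map between 0 and 1."""
    r = size / 2.0
    y, x = np.mgrid[0:size, 0:size]
    # Distance from the center, normalised: 0 at the center, 1 at the edge
    dist = np.sqrt((x - r + 0.5) ** 2 + (y - r + 0.5) ** 2) / r
    # Inside the "hard" core the dab is fully opaque, outside it fades to 0 at the edge
    falloff = np.clip((1.0 - dist) / max(1.0 - hardness, 1e-6), 0.0, 1.0)
    alpha = np.where(dist <= hardness, 1.0, falloff)
    return alpha * flow          # "Flow" scales how much a single dab deposits

hard  = brush_stamp(64, hardness=1.0, flow=1.0)  # sharp-edged, fully opaque dab
soft  = brush_stamp(64, hardness=0.0, flow=1.0)  # fades from the very center out to the edge
light = brush_stamp(64, hardness=0.5, flow=0.3)  # same shape, but each dab deposits less
```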
  22. Yes, that may be an option, but I don't know if it's worth it. It only happens here in this thread, with this particular post. And I don't know if the OP is still interested.
  23. You can easily test it yourself: choose a Basic brush with a sharp edge (all settings at 100%), click on the canvas and paint a stroke. Then reduce only the "Hardness" to e.g. 50% and do the same. Finally reduce the "Hardness" to 0% and repeat what you did with the other settings. The lower the "Hardness", the thinner the strokes will look, but if you measure the blended edges of the strokes by moving the cursor over these areas and looking at the Info panel, you will see that there are pixels with low opacity even in outer areas where you can't see them. So "Hardness" defines the sharpness of the brush. And the softest brush blends the colour from fully opaque pixels in the center to fully transparent pixels at the edge of the brush. But strictly speaking, the "Flow" can also have an influence on the sharpness of the stroke. As smadell already said, 0% "Flow" doesn't make any sense, because the brush doesn't emit any colour in that case. The developers could cut the scale off at 1% "Flow", but I think that would cause confusion too.
  24. Thanks for that tip, but even that didn't work. Can't really imagine what in my post should be forbidden. Maybe it is simply a bad tip. Sorry, it seems that I can't help. Possibly google "GDI-Scaling on Windows".