Everything posted by deepz

  1. Depends on your budget, of course. For most hobbyists an RTX 3080 or RTX 4080 is out of budget. Nevertheless, your RTX 3080 will still outperform an RTX 4060. I would call it a "high-end GPU" (which is subjective again). Providers may consider doing the calculations for their end-users, but with >1000 users they are faced with additional challenges (beyond the GPU itself). e.g. you can't just put 8 RTX 3080/RTX 4080 GPUs in a single computer; you need enough PCIe lanes and an absurd PSU (which may cost as much as a GPU). Anyway, according to Nvidia's EULA, cloud providers are not even allowed to use RTX 3080 GPUs. What you can get, though, are K40s (or K80s, which are just twin K40s). To give you an idea, here are the prices that AWS uses. Which I would call absurd prices, given that these GPUs won't even outperform a GTX 1080 Ti. This shows that even the cheapest cloud GPU servers are expensive. (Un)fortunately, there are auction websites like vast.ai that allow you to rent consumer-grade GPU servers at lower prices. Having used that service for about 2 years, I can tell you that it's unreliable and not worth the pain. Just too many variables. When I needed GPU servers for my company, I ended up self-hosting them, which is also quite expensive (think fail-overs, multiple internet providers, smart routers, security, high energy costs). But it's more reliable than anything else I tried. PS: Having said that, if you do the calculations client-side (on customer premises), you are faced with 50 different types of consumer GPUs, and you are going to disappoint some customers. e.g. CUDA won't work on an AMD GPU. GPU driver versions start to matter, which is something you can't ship with your software. Bigger networks perform better but won't fit in the memory of smaller GPUs. You won't be able to run it on a tablet, and laptops will suffer from cooling issues (see the capability-check sketch after this list). All of which has happened before, with other AI software.
  2. Fact is that AI consumes a lot of energy and has challenging hardware requirements. A server can easily handle 1000 simultaneous web requests, but it can't handle 30 simultaneous AI jobs. So we're going to see a lot of delays, credit systems, subscription services with monthly billing, ... Goodbye one-time purchases and fixed prices. --- Alternatively, it would be cool if Affinity just had some kind of "AI server software" to self-host on a server as an "agent" (a minimal sketch of that idea follows this list). For companies, it is becoming more economical to buy a couple of servers with a couple of high-end GPUs. (Maybe it sounds far-fetched. However, other companies are doing this stuff already. e.g. JetBrains is exploring similar solutions with their IDE software, i.e. you can install an agent on a strong server and then connect to it from your laptop. It's all integrated in their software. The software looks as if it runs locally, but it's actually executing all commands on a remote server. e.g. my laptop with just 8GB RAM connects to a server with 128GB RAM and 12 CPU cores. And that server cost me just 2500€ and will easily last 10 years. The concept reminds me of the "mainframe systems" of the 90s; then again, that concept never really stopped making sense.)
  3. Signed up for the beta and finally tried it yesterday. It can generate "a pointy clown's hat", but it can't generate "a Japanese eboshi hat in the style of Hiroshige". To me, it feels like searching and adding clip-art. It doesn't feel like a generative AI at all. Having said that, I love the effort and initiative, and as it stands now, I'm sure they'll get there first. I fear that it will take another 2 years for Affinity to catch up. And I guess most of us (not you, nickbatz) are just worried about this scenario. We don't want to switch back to Adobe. I really love Affinity (... in the same way that I loved Macromedia Fireworks, a long time ago).
  4. We're going to be redefining what intelligence, art and self-awareness are. The closer innovation comes, the more creative those definitions will get. And when we run out of measurable arguments, we will be looking at spiritual ones.
  5. Exactly, as you put it: "They always seem to need fiddling with after the fact". Midjourney generated this one for me last week. Not bad, but let's be honest: it's too dark at the bottom, and the eyes point inwards. So I generated some more variations and got this one. But now it looks like there's glue sticking to its fur. No matter how many times you generate, there's always something unwanted. It's great to have something that generates art. However, the next step is tools to edit the results with the same ease, as shown in the Firefly video (which isn't real, just a concept, a marketing video). Smart tools, not just generation. But it could be a first step.
  6. Midjourney runs on a Discord server right now. 🤨 It's screaming for a decent UI. Something like Affinity Photo should integrate it. It would lure all those Midjourney hobbyists towards Affinity and would make Affinity really big. Having said that, we're obviously missing AIs that are able to correct themselves and make changes to previously generated images in an orchestrated way, e.g. to just regenerate a small portion of an image (see the inpainting sketch after this list). But once those are in place, I want it embedded in Affinity. I won't be doing that on some Discord server.
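
On the client-side point in post 1: below is a minimal sketch of the kind of capability check a vendor would need before shipping GPU inference to customers. It assumes PyTorch as the inference runtime; the model variant names and VRAM thresholds are made up for illustration.

```python
import torch

def pick_model_variant() -> str:
    """Choose a model variant based on what the customer's machine offers."""
    if not torch.cuda.is_available():
        # Covers AMD GPUs without a CUDA runtime, missing/old drivers, tablets.
        return "cpu-tiny"                      # hypothetical fallback network
    vram_gb = torch.cuda.get_device_properties(0).total_memory / 1024**3
    if vram_gb >= 16:
        return "full"                          # hypothetical large network
    if vram_gb >= 8:
        return "reduced"                       # hypothetical mid-size network
    return "cpu-tiny"                          # too little VRAM: fall back to CPU

if __name__ == "__main__":
    print("Selected variant:", pick_model_variant())
```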
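On the "AI server software" agent idea in post 2: a minimal sketch of what such a self-hosted agent could look like, assuming Flask for the HTTP layer. The endpoints, the run_inference() placeholder and the port are all illustrative assumptions, not anything Affinity actually ships.

```python
import uuid
from flask import Flask, jsonify, request

app = Flask(__name__)
jobs = {}  # job_id -> result; a real agent would queue work and persist state

def run_inference(payload: dict) -> dict:
    # Placeholder for the GPU-bound model call that runs on the strong server.
    return {"echo": payload}

@app.post("/jobs")
def submit_job():
    # The desktop client posts a job here instead of running it locally.
    job_id = str(uuid.uuid4())
    jobs[job_id] = run_inference(request.get_json(force=True))
    return jsonify({"job_id": job_id}), 201

@app.get("/jobs/<job_id>")
def fetch_job(job_id):
    # The client polls this endpoint for the finished result.
    if job_id not in jobs:
        return jsonify({"error": "unknown job"}), 404
    return jsonify(jobs[job_id])

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```

The desktop app would then talk to the server over HTTP while the UI keeps feeling local, which is essentially the JetBrains remote-agent pattern described above.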
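On "regenerate a small portion of an image" in post 6: that workflow exists today as masked inpainting in open-source tooling. Here is a sketch assuming the Hugging Face diffusers library and a Stable Diffusion inpainting model; the file names and prompt are illustrative, and this is not something Midjourney or Affinity expose at the moment.

```python
import torch
from PIL import Image
from diffusers import StableDiffusionInpaintPipeline

# Load an inpainting model once; needs a CUDA GPU with enough VRAM.
pipe = StableDiffusionInpaintPipeline.from_pretrained(
    "runwayml/stable-diffusion-inpainting", torch_dtype=torch.float16
).to("cuda")

image = Image.open("generated_cat.png").convert("RGB")   # previous generation
mask = Image.open("fur_mask.png").convert("RGB")         # white = region to redo

# Only the masked region is regenerated; the rest of the image stays untouched.
result = pipe(prompt="clean fluffy fur", image=image, mask_image=mask).images[0]
result.save("generated_cat_fixed.png")
```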