• planish382@aiparadise.moe · 3 years ago

          I think it’s that you need to be able to throw parallel processing at a lot of RAM. If you want to do that on a PC you need a GPU that has a lot of RAM, which you can only really buy as part of a beefy GPU. You can’t buy a midrange GPU and duct tape DIMMs onto it.

          The Apple Silicon architecture has an OK GPU in it, but because the CPU and GPU share one pool of unified memory, all the RAM in the system is GPU RAM. So Apple Silicon Macs can really punch above their weight for AI applications, because they can bring a lot more RAM to bear.
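          A quick back-of-the-envelope shows why this matters. A sketch of the arithmetic (the model size and machine configurations below are illustrative assumptions, not benchmarks):

```python
def model_ram_gib(params_billion: float, bytes_per_param: int = 2) -> float:
    """Approximate RAM needed just to hold the weights (fp16 = 2 bytes/param)."""
    return params_billion * 1e9 * bytes_per_param / 1024**3

# A 7B-parameter model at fp16 needs roughly 13 GiB for the weights alone,
# before activations or caches -- already more than a typical 8-12 GB midrange
# GPU, but comfortable on a 32 GB unified-memory Mac where the GPU can see
# all of it.
print(f"{model_ram_gib(7):.1f} GiB")  # -> 13.0 GiB
```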

        • ChipthensfwMonk@lemmynsfw.comOP · 3 years ago

          It’s a ridiculously powerful machine. Running AI stuff caused the fan to spin up for the first time. It destroys everything else.

            • ChipthensfwMonk@lemmynsfw.comOP · 3 years ago

              It’s a merge of Realistic Vision 4, LazyMix+, and URPM, weighted toward Realistic Vision 4. There are probably better merges and maybe even better models to use, but these have helped me generate some realistic stuff. I also do a fair amount of experimenting with LoRAs and ControlNets, along with dynamic prompts to test variables (prompt components or parts).
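              A merge like that is usually just a weighted average of the checkpoints' weight tensors, key by key. A minimal sketch, using plain dicts of floats as stand-ins for real state dicts (the merge weights and single key here are hypothetical, not the actual recipe):

```python
def merge_checkpoints(models: list[dict], weights: list[float]) -> dict:
    """Weighted average of state dicts that share the same keys.

    `models` are state dicts (plain {name: float} dicts here for clarity);
    `weights` give each model's contribution and must sum to 1.0.
    """
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    merged = {}
    for key in models[0]:
        merged[key] = sum(w * m[key] for m, w in zip(models, weights))
    return merged

# Hypothetical example: three checkpoints, weighted toward the first one,
# the way the post weights toward Realistic Vision 4.
rv4  = {"layer.w": 1.0}
lazy = {"layer.w": 0.0}
urpm = {"layer.w": 0.5}
merged = merge_checkpoints([rv4, lazy, urpm], [0.6, 0.2, 0.2])
print(merged["layer.w"])  # -> 0.7
```

Real tools apply the same averaging per-tensor over the full checkpoint, so the interesting choices are just the merge weights themselves.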

        • HoldingMyDick@lemmynsfw.com · 3 years ago

          The M2 is a damn good chip, and this is coming from someone who has not bought and probably never will buy an Apple product.

          Yes, they’re generated offline.