In my opinion, AI slop is low-effort content made with AI. People can feel the quality of something, and if it feels like slop, it feels like slop.

Using AI is not a deal-breaker for me as long as the product is good. AI isn’t magic; it takes skill and knowledge to use it and to judge what the model outputs.

You may see it differently, though. Some believe that using any AI tools at all turns an app into slop the moment the first commit containing AI code is made.

What do you think? What does AI slop mean to you?

  • audaxdreik@pawb.social · ↑40 ↓2 · 23 hours ago

    AI slop is the output of all generative AI, full stop.

    Slop itself is anything produced for the sake of being produced. Something without feeling or soul, just more content for the content machine.

    Like, yes, it does take some level of skill to “prompt engineer” the AI and get it to show you the thing you want, but it’s still not a distinctive style; it’s still not your style. If you say “sloth astronaut”, I can imagine that in my head in my own way; there’s no value in producing an AI image. As far as I’m concerned, an AI image narrows all the possibilities of my own imagination down to one specific piece of slop from the slop machine. If I wanted to see it at all, the point would be to see an interpretation in someone’s style.

    I can’t remember where I saw this argument recently; it was something about Capcom saying they’d use AI for background details, with people citing specific examples from Pragmata, maybe? Things like vending machines and environmental details that could be streamlined with the help of AI. But even these small details are places for environmental artists to shine: showing off their skills, hiding small details and world-building, and little in-jokes. It may not be much, but it adds to the overall texture and flavor of the product. It does matter.

    AI is slop, is slop, is slop. There’s absolutely no reforming it and if I detect even a whiff of it, I’m out.

    • lime!@feddit.nu · ↑4 ↓7 · 21 hours ago

      i agree with you, but i’ll give you a counterexample.

      i generate ai images for myself. i use only local models and i never share the output publicly, because the output is not really the point; the process is. here is my process:

      i come up with a concept, usually a person or a scene. i then take random images from the internet, cut out the parts i think fit together, and add them as layers in a client called “invoke ai”. if needed i color match the parts in krita first. then i describe the scene i’ve made in a prompt, adding the normal positive and negative keywords to steer generation. i also pull the “blur image” ratio down to 40%.

      the model then makes an image with my digital scrapbook as a base, melting together disjunct elements into a scene. invoke then allows me to move all the elements around and regenerate the scene, or select a few specific elements to regenerate, or paint on top of the scene and generate something new from that, or select part of the scene and change the prompt for that area. it’s a fun little game, and it feels like collaborating with people who know lighting and perspective better than i do.

      the most time i’ve spent on one of these is half a day, just iterating and tweaking and filling in details. i’m in no way an artist, i’m just playing around. it’s basically as resource-intensive as playing a video game, and i’m not one to share gameplay footage either.

        • funkless_eck@sh.itjust.works · ↑2 ↓5 · 19 hours ago

          plagiarism implies a benefit to the plunderer, though. It’s perfectly legal to take images from the internet for your own use - e.g. a sonic the hedgehog themed birthday party for a kid.

          • Rothe@piefed.social · ↑2 · 13 hours ago

            It is the AI companies who used the images for training in the first place who are the plagiarisers. People using the AI are just using a plagiarising tool made by plagiarisers. Whether they are fine with that is entirely up to their own conscience.

              • lime!@feddit.nu · ↑1 · 10 hours ago

                i also think that. and i love having that discussion with people. like, remember the google robot that drew dogs on every picture you fed it? deepdream, i think it was called? was that evil? because it’s the same tech, just trained on… dogs.

              • That’s okay, you can disagree if you like. It’s not like you’re responsible for the creation, upkeep, and various abuses resulting from this modern pseudo-AI stupidity. You’re just a useful idiot to them.

                • lime!@feddit.nu · ↑1 · 10 hours ago

                  i don’t think you should start calling third parties useful idiots, i’m the one who started this.

        • lime!@feddit.nu · ↑2 ↓4 · 20 hours ago

          well yeah, sure. not denying that. but i don’t do it for people to see it. i think of it a bit like cutting stuff out of magazines and gluing it together. it’s remix culture.

          i’m not a creator in any way, i’m a consumer. i just like blending things together.

      • soratoyuki@piefed.social · ↑2 · 16 hours ago

        Your local chat bot is still trained on data stolen at an astronomical scale, and, even if we accept your use case as ‘less’ bad, it still drives demand for ‘worse’ chat bots owned by oligarchs that want to destroy the world.

        The tool is evil even if it has interesting niche applications.

        • lime!@feddit.nu · ↑1 ↓1 · edited · 10 hours ago

          correct. though stable diffusion was initially trained on LAION-5B, a free and open dataset compiled for scientific research that itself contained copyrighted material. that didn’t come out until after the model was released, because before then it didn’t matter to anyone; scientists had been using it for deep learning tasks for years at that point.

          and the source images i take, i also take without payment or copyright acknowledgement. as are the shows i watch on my jellyfin server, some of the computer games i play, the music i used to record off the radio onto cassettes, and the comic characters i used to cut out of magazines as a child to make my own comics with.

          i’m not saying those things are all equivalent. i’m saying that this problem isn’t new. i’ll stand up and defend small artists against big corporations any day, and i’ll gladly pay people for their time. my patreon bill alone is proof of that. i’ve canceled netflix and spotify over their unfair treatment of artists. i’ve written to my members of parliament, both local and in the eu, about how the copyright system is broken, how the slogan “information wants to be free” doesn’t automatically mean that meta can leech eighty terabytes of copyrighted material or that the us government can use any work they want in their propaganda material without paying. i’ve driven discussion about rightsholders and the unfairness of payouts to the little guy both at work and during my off time. i’ve voted against corporate control of media for years. i’ve voted pirate. i’ve voted socialist. i’ve voted green.

          but sometimes i just want to dick around.

          believe you me i’ve thought about this.

  • cally [he/they]@pawb.social · ↑8 · 18 hours ago

    anything made with AI, especially if it’s undisclosed. i make an exception for AI narration (text-to-speech), but i still don’t like it when it’s not disclosed as AI-generated; i’d rather hear a person.

    • Rhynoplaz@lemmy.world · ↑4 · 18 hours ago

      I think AI narration gives me the most unreasonably negative feelings. Especially since it’s usually the same couple of voices used over and over; I just cringe when I hear them and automatically assume that none of what they are saying can be trusted.

      • PunnyName@lemmy.world · ↑4 · 16 hours ago

        Narration for the blind or visually impaired is always a good idea, and A.I. narration can help with that. But most A.I.-narrated slop is also A.I.-written, which is why it has terrible grammar and sentence structure.

  • melsaskca@lemmy.ca · ↑7 · 19 hours ago

    Paying $15,000.00 for an internet-connected math calculator that may or may not give you the correct answer, when a $20.00 calculator is right every single time and needs no connectivity.

  • i_stole_ur_taco@lemmy.ca · ↑3 · 16 hours ago

    I call it slop when it was generated by AI and not carefully reviewed (and probably tweaked) by a human that understands the output.

    Code written not by a software developer? Slop.

    Code written by a software developer that just shipped it without understanding it? Slop.

    Code written by a software developer that went through subsequent review, testing, and adjustment? “AI-assisted”, maybe?

    Replace code with any other industry and the same principle applies.

    I generally equate slop with human laziness, even though the actual “quality” of the slop still varies.

  • SkyNTP@lemmy.ml · ↑5 · 19 hours ago

    Remember clip art and WordArt? It was colourful and flashy and easy, and everywhere in PowerPoint presentations and Word documents and even online. For a few years. Then it vanished.

    Turns out, easy and flashy doesn’t have a lot of staying power because when something is easy, it is ubiquitous, and when it is ubiquitous it stops being impressive.

    AI slop is easy and flashy, and will probably run its course as people become tired of it.

    There will still exist AI content, but it will not resemble the slop we see today.

  • Pamasich@kbin.earth · ↑13 · 23 hours ago

    AI slop is AI + slop.

    Slop is low effort garbage with zero quality assurance put into it.

  • Deestan@lemmy.world · ↑9 · 22 hours ago

    Slop is anything made by people who don’t know what they are doing or why anyone would want it. The kind of people who make laws necessary requiring “chocolate” products to contain X% cocoa, because to them it is just “brown fatty thing”, so they add brown to fat and ask someone to make compelling packaging.

    These people believe generative AI works because they can’t tell. If you gave them hot mud and called it coffee, they would think it was coffee.

    People who can tell butter from lard, do not think generative AI works at all.

    So yeah genAI can only make slop. Some people believe it is useful and I hate them.

  • kboos1@lemmy.world · ↑3 · 18 hours ago

    If it does something helpful for me at low risk and doesn’t dull my skills, then it’s just another tool. If someone tries to sell me AI content, or give it to me, then I can’t trust them and they are the enemy.

  • Rhynoplaz@lemmy.world · ↑4 ↓1 · 18 hours ago

    I have less hate for AI than the general population of Lemmy but I’m certainly not a fan.

    To me, AI slop is a random schmuck typing in “make an image of two people laughing together”, taking the first result (riddled with errors), and posting it to their blog, article, or social media to supplement whatever story they are telling or point they are trying to make.

    If the image has purpose, or a good concept, then I might give it some credit. If the creator has a great idea or joke and doesn’t have the ability to draw it themselves, I’m not opposed to them using AI to help create their vision.

    For example, my dad really enjoys Far Side comics, and he often comes up with his own, but he can’t draw, so he just has to explain to us what the picture would look like and what the caption would be. If he used AI to create these visions of his, it’s probably the only way he could share his jokes with the world. And it’s not like he’s taking money from an artist who needs the work, because he’d never pay to have them professionally created. If he can’t share them, they will die with him.

    If it’s a tool to help you effectively tell your story, I’m ok with it. But if you’re just cranking out fake doorbell videos and implying they are real, for hits on social, GTFO.

  • HubertManne@piefed.social · ↑1 · 15 hours ago

    Yeah. Basically, AI has made it easy enough to do things crappily that a lot of crap gets created. People who would have had to think for a while and spend time writing a paragraph can get one written from a one-sentence prompt; they can get a whole story from a paragraph. Similarly, someone with no art talent might have tried some stuff anyway, but even making a crappy thing took time and effort.

    Honestly, I got sick of memes because they are low effort, so I stay away from them; that started because Photoshop made it easy to make those simple little cartoon-type things. AI creations are like that. Someone using LLMs in their work is no different from someone using Google search while working on something, but if they took the first link from the first search and copy-pasted it like it was some great thing, it would be quite annoying.

  • MousePotatoDoesStuff@piefed.social · ↑2 · 17 hours ago

    Content (images, audio, text…) generated by GenAI. It’s slop not quality-wise, but because it’s mass-produced (as opposed to artisanal creations).

    Not all slop is AI-generated, though - we had mass-produced content before GenAI, too. GenAI is just turning it up to eleven.

  • Soulifix@piefed.world · ↑5 ↓3 · 18 hours ago

    I think the word ‘slop’ has really been overused at this point. It’s dead to me.

  • slazer2au@lemmy.world · ↑7 ↓2 · 24 hours ago

    If it’s something a semi-skilled person can see is incorrect, then it is slop.

    An incorrect number of appendages is the simplest one.