What I have watched happen in my profession in the last two years, I am still struggling to describe. The first time I knew something was wrong, roughly a year and a quarter ago, I noticed a colleague replying to me using AI…

archive.org mirror

  • insight06@lemmy.world (OP) · 6 days ago

    Some quotes that resonated with me:

    In any previous era, the quality of a piece of work was a more or less reliable signal of the competence of the person who produced it. A novice essay read like a novice essay; novice code crashed in novice ways. AI has severed that relationship.

    The skills of producing work and judging it were deliberately distinct, but accomplishing the work itself used to teach the judgment. The first skill now belongs, in large part, to the machines. The second still belongs to us, though fewer are bothering to acquire or utilize it.

    The slowness was not a tax on the real work; the slowness was the real work. It was how the work got good, and how the people producing the work got good.

    The current generation of agentic systems is built around the premise that the human is the bottleneck — that the loop runs faster and cleaner without the awkward delay of someone reading what is about to happen and deciding whether it should. This is, in a great many cases, exactly backwards.

    • randy@lemmy.ca · 6 days ago

      The slowness was not a tax on the real work; the slowness was the real work. It was how the work got good, and how the people producing the work got good.

      This line reminded me of a couple of articles, linked below, that I read on AI use in astrophysics. Developing junior researchers is a big part of the point of their work, so they really are going to have to limit their AI use to make sure that development happens. But I worry that industry won't care; it has been hollowing out junior positions for years, because there's no value in training a senior who will just jump ship to a company that doesn't train juniors. That's an existing problem, but AI seems likely to make it worse.

  • MagicShel@lemmy.zip · 6 days ago

    We (my company) are trying to create agents that read a user story, translate it into prompts, execute those prompts, and then review the output. The only piece missing is accepting the merge.

    I’m not anti-AI, but a human needs to be involved at every step because a minor mistake made at the first step will amplify through the agentic pipeline.

    A human should review every single thing that comes out of AI — especially if it is to be fed back into AI.
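    The pipeline described above could be sketched as a chain of stages with a human approval gate between each one. This is a minimal illustration, not the poster's actual system: the stage functions and the `approve` callback are hypothetical stand-ins, and a real reviewer would be a person rather than a lambda.

    ```python
    # Sketch of a human-in-the-loop agent pipeline. Each stage's output
    # must be approved before it feeds the next stage, so a mistake at
    # the first step cannot amplify unchecked through the pipeline.
    from typing import Callable, Optional

    def run_pipeline(story: str,
                     stages: list[tuple[str, Callable[[str], str]]],
                     approve: Callable[[str, str], bool]) -> Optional[str]:
        """Run each stage on the previous artifact; stop when review fails."""
        artifact = story
        for name, stage in stages:
            artifact = stage(artifact)
            if not approve(name, artifact):  # human gate at every step
                print(f"Rejected at stage: {name}")
                return None
        return artifact

    # Toy stand-ins for the real agents (assumptions, not real code).
    stages = [
        ("translate_to_prompts", lambda s: f"PROMPTS({s})"),
        ("execute_prompts",      lambda p: f"OUTPUT({p})"),
        ("review_output",        lambda o: f"REVIEWED({o})"),
    ]

    # Auto-approving reviewer for demonstration only; in practice this
    # callback would block until a human has read the artifact.
    result = run_pipeline("user story", stages, lambda name, art: True)
    print(result)  # REVIEWED(OUTPUT(PROMPTS(user story)))
    ```

    The point of routing every artifact through `approve` is exactly the one made above: the loop is deliberately slower, because the human check is where errors get caught before they compound.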