Guardian investigation finds almost 7,000 proven cases of cheating – and experts say these are the tip of the iceberg

Thousands of university students in the UK have been caught misusing ChatGPT and other artificial intelligence tools in recent years, while traditional forms of plagiarism show a marked decline, a Guardian investigation can reveal.

A survey of academic integrity violations found almost 7,000 proven cases of cheating using AI tools in 2023-24, equivalent to 5.1 for every 1,000 students. That was up from 1.6 cases per 1,000 in 2022-23.

Figures up to May suggest that number will increase again this year to about 7.5 proven cases per 1,000 students – but recorded cases represent only the tip of the iceberg, according to experts.

The data highlights a rapidly evolving challenge for universities: trying to adapt assessment methods to the advent of technologies such as ChatGPT and other AI-powered writing tools.

  • Melvin_Ferd@lemmy.world · 9 days ago

    Maybe we need a new way to approach school. I don’t agree with turning education into a competition where the difficulty is curved towards the most competitive students, creating a system so difficult that students need to edge each other out any way they can.

    • atomicbocks@sh.itjust.works · 8 days ago

      I guess what I don’t understand is what changed? Is everything homework now? When I was in school, even college, a significant percentage of learning was in class work, pop quizzes, and weekly closed book tests. How are these kids using LLMs so much for class if a large portion of the work is still in the classroom? Or is that just not the case anymore? It’s not like ChatGPT can handwrite an essay in pencil or give an in person presentation (yet).

        • wewbull@feddit.uk · 8 days ago

        University was always guided self-learning, at least in the UK. The lecturers are not teachers. They provide and explain material, but they’re not there to hand-hold you through it.

        University education is very different to what goes on at younger ages. It has to be when a class is 300 rather than 30 people.

          • atomicbocks@sh.itjust.works · 8 days ago

          WTF? 300? There were barely 350 people in my graduating class of high school and that isn’t a small class for where I am from. The largest class size at my college was maybe 60. No wonder people use LLMs. Like, that’s just called an auditorium at that point, how could you even ask a question? Self-guided isn’t supposed to mean “solo”.

            • Pieisawesome@lemmy.dbzer0.com · 8 days ago

            You can ask questions in auditorium classes.

            The 300+ student courses were typically high-volume courses, like intro or freshman courses.

            Second year cuts down significantly in class size, but it also depends on the subject.

            3rd and 4th year courses, in my experience, were 30-50 students.

              • atomicbocks@sh.itjust.works · 8 days ago

              You can ask questions in auditorium classes.

              I am going to be honest; I don’t believe you. I genuinely don’t believe that in a class with more people than minutes in the session, a person could legitimately have time to interact with the professor.

              The 60-person class I referred to was a required freshman science class with a lecture portion and a smaller lab portion. Being able to ask questions in the lab was the only reason 60 people was okay in the lecture, and even then the professor said he felt it was too many people.

  • raspberriesareyummy@lemmy.world · 8 days ago

    Surprise, motherfuckers. Maybe don’t give grant money to LLM snake-oil fuckers, and maybe don’t allow mass for-profit copyright violations.

  • Flamekebab@piefed.social · 9 days ago

    I’m shocked. Shocked! Well, not that shocked.

    Ultimately it seems pretty dumb. If you’re not going to actually learn while you’re there, why bother? University isn’t mandatory.

    That was actually my biggest disappointment with my degree - the course didn’t teach anywhere near enough for my tastes. However, I would hope that I was an outlier in that respect!

  • venusaur@lemmy.world · 9 days ago

    If ChatGPT can effectively do the work for you, then is it really necessary to do the work? Nobody is saying to go to the library and find a book instead of letting a search engine do the work for you. Education has to evolve, and so does the testing. There are a lot of things GPTs can’t do well. Grade on that.

    • myfavouritename@lemmy.world · 9 days ago

      The “work” that LLMs are doing here is “being educated”.

      Like, when a prof says “read this book and write a paper answering these questions”, they aren’t doing that because the world needs another paper written. They are inviting the student to go on a journey, one that is designed to change the person who travels that path.

        • Warl0k3@lemmy.world · 8 days ago

          Hands on, like engage with prior material on the subject and formulate complex ideas based on that…?

          Sarcasm aside, asking students to do something in lab often requires them to have gained an understanding of the material, an understanding they utterly lack if they use AI to do their work. Although tbf, this lack of understanding in person is really the #1 way we catch students who are using AI.

          • venusaur@lemmy.world · 8 days ago

            Class discussion. Live presentations with question and answer. Save papers for supplementing hands-on research.

            • myfavouritename@lemmy.world · 8 days ago

              Have you seen the size of these classrooms? It’s not uncommon for lecture halls to seat 200+ students. You’re thinking that each student is going to present? Are they all going to create a presentation for each piece of info they learn? 200 presentations a day every day? Or are they each going to present one thing? What does a student do during the other 199 presentations? When does the teacher (the expert in the subject) provide any value in this learning experience?

              There’s too much to learn to have people only learning by presenting.

              • venusaur@lemmy.world · 8 days ago

                Have you seen the cost of tuition? Hire more professors and have smaller classes.

                Anyways, undergrad isn’t even that important in the grand scheme of things. Let people cheat and let that show when they apply for entry-level jobs or higher education. If they can be successful after cheating in undergrad, then does it even matter?

                What really matters is grad school and beyond. Speaking from a US perspective.