
The UK Home Office has responded to questions raised by Bell Ribeiro-Addy MP on its use of AI tools in the asylum decision-making process, informed by ORG's work.

openrightsgroup@social.openrightsgroup.org (#1)

The UK Home Office has responded to questions raised by Bell Ribeiro-Addy MP on its use of AI tools in the asylum decision-making process, informed by ORG's work.

The answers raise serious concerns. These systems are being rolled out without meaningful transparency or governance.

Read more ⬇️

Second AI system for asylum caseworkers to be deployed this month as ministers vow ‘decision-makers cannot use the tool by itself to decide a claim’

An automated platform for creating a summary from the transcripts of interviews is to be introduced alongside an existing tool for helping officials to search for and understand policy considerations. All asylum caseworkers and decision-makers will this month be equipped with a second artifici…

PublicTechnology (www.publictechnology.net)

#AI #asylum #immigration #homeoffice #ukpolitics #ukpol

openrightsgroup@social.openrightsgroup.org (#2)

AI tools in UK asylum decision-making are being deployed first, while safeguards, oversight and transparency are treated as secondary.

This approach carries serious risks to fairness, accountability, and the protection of rights.

Training alone is no replacement for proper governance frameworks.

#AI #asylum #immigration #homeoffice #ukpolitics #ukpol

openrightsgroup@social.openrightsgroup.org (#3)

AI is not neutral. It can discriminate and make mistakes.

It shouldn't be used to change information that informs life-changing asylum assessments. Without adequate safeguards, there's a risk that unlawful or unfair decisions may result.

Ask your MP (UK) to stand against the use of AI tools in asylum decisions ⬇️

Ban AI tools in asylum decision making

Take action! What’s the problem? The Home Office is using two AI tools: The Asylum Case Summarisation (ACS) tool uses ChatGPT-4 to summarise asylum interview transcripts. The Asylum Policy Search (APS) tool summarises Country Policy and Information Notes (CPINs), guidance documents, and Country of Origin Information (COI) reports. The Home Office’s own evaluation revealed that…

Open Rights Group (action.openrightsgroup.org)

#AI #asylum #immigration #homeoffice #ukpolitics #ukpol

openrightsgroup@social.openrightsgroup.org (#4)

The key issues with the use of AI tools in the UK asylum system are:

🔴 No published Data Protection Impact Assessments.
🔴 No procedures governing the use of AI tools.
🔴 Roll-out ahead of transparency measures.
🔴 Reliance on post-hoc oversight.
🔴 References to “human in the loop” without clarity over what power human decision-makers actually retain.

#AI #asylum #immigration #homeoffice #ukpolitics #ukpol

openrightsgroup@social.openrightsgroup.org (#5)

At a minimum, the use of AI tools must involve:

✅️ Clear and published safeguards
✅️ Compliance with the government's AI playbook
✅️ Defined accountability structures
✅️ Meaningful human oversight
✅️ Full transparency on how these systems are used

Without this, claims of responsible AI use remain unsubstantiated.

#AI #asylum #immigration #homeoffice #ukpolitics #ukpol

dalereardon@mastodon.social (#6)

@openrightsgroup Out of interest, the exact same thing is happening with AI tools in the Aged Care and Disability Care systems in Australia, with all the same concerns, and in the same way nothing is being done about the huge problems! #AusPol #Disability #NDIS

benhm3@saint-paul.us (#7)

@openrightsgroup

The first “customers” for any oppression-tech are those without rights or protections from it: prisoners, the poor, and those with different abilities.