How AI Can Alleviate Administrative Burden on Social Workers

Social workers are integral to the fabric of our society. As trained professionals who help a wide variety of people through difficult situations, they provide a diverse range of services, from counseling to case management to advocating for their clients across different spheres. Yet research has found that social workers spend as much as 45% of their time on administrative work, which includes tasks such as:

  • Taking notes to record interactions such as meetings and phone calls with clients.
  • Updating and managing a client’s case file and records.
  • Preparing reports for supervisors or for legal purposes.
  • Completing general paperwork or forms for legal and administrative purposes.
  • Scheduling meetings and visits on a daily basis.

And much more. From digital case management systems, to telehealth and online counseling, to fully online degrees such as an online MSW, technology has come a long way in transforming the social work sector, and we now have access to tools that could meaningfully cut the time social workers spend at their desks. By integrating AI into social work, we have the opportunity to create more efficient and accurate systems that reduce the administrative burden, freeing social workers to do more of what they do best: spend time helping people.

AI administrative support for social workers

At the moment, 28 councils in England are using a specialized AI tool called Magic Notes, which records meetings between clients and social workers and produces a meeting summary. The tool works much like other AI meeting-summary assistants (you might be familiar with Microsoft's Copilot, Google Gemini's "Take notes for me" function, or Zoom's AI assistant): it listens to the session and produces a complete transcript, then uses AI to condense that transcript into easily digestible notes. At the end of the session, Magic Notes emails the practitioner the transcript, a summary, and AI-powered suggestions for what to include in case notes.
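
Magic Notes' internals aren't public, but the general record-transcribe-summarize pattern that it and similar assistants follow is easy to sketch. Below is a minimal, illustrative Python sketch, not the actual product: transcribe_audio is a hypothetical stand-in for a real speech-to-text service, and the summarizer is a toy extractive heuristic rather than the large language model a production tool would use.

```python
# Illustrative sketch of the record -> transcribe -> summarize pipeline.
# transcribe_audio() is a hypothetical stand-in for a speech-to-text service;
# summarize_transcript() is a toy heuristic, not a production LLM call.
from dataclasses import dataclass

@dataclass
class SessionArtifacts:
    transcript: str
    summary: str

def transcribe_audio(audio_path: str) -> str:
    """Stand-in: a real tool would call a speech-to-text backend here."""
    raise NotImplementedError("plug in a transcription service")

def summarize_transcript(transcript: str, max_sentences: int = 3) -> str:
    """Toy extractive summary: keep only the first few sentences."""
    sentences = [s.strip() for s in transcript.split(".") if s.strip()]
    return ". ".join(sentences[:max_sentences]) + "."

def process_session(audio_path: str) -> SessionArtifacts:
    transcript = transcribe_audio(audio_path)
    return SessionArtifacts(transcript, summarize_transcript(transcript))
```

A real service would then email these artifacts to the practitioner, as Magic Notes does at the end of each session.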

Early results show that implementing Magic Notes has produced substantial efficiency gains and reductions in time spent on administrative tasks, with one council reporting that the AI cut admin time by a dramatic 48%. The AI has also helped social care workers produce reports that are more accurate and evidence-based, resulting in a higher quality of output that communicates findings more effectively to supervisors and other readers. While Magic Notes is still in its testing stages in the UK and has yet to reach our shores here in the US, its success overseas suggests that it, or something like it, may soon come to America.
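
As a rough back-of-the-envelope check, and assuming both figures quoted in this article held simultaneously, combining the 45% admin-time estimate with a 48% reduction suggests admin work would fall to under a quarter of the working week:

```python
# Back-of-the-envelope arithmetic combining the two figures in this article.
admin_share = 0.45  # share of time spent on admin (research estimate above)
reduction = 0.48    # admin-time reduction reported by one council

new_admin_share = admin_share * (1 - reduction)  # 0.234
freed_share = admin_share - new_admin_share      # 0.216

print(f"Admin time falls from {admin_share:.0%} to {new_admin_share:.1%}")
print(f"About {freed_share:.1%} of total working time is freed for clients")
```

On a 40-hour week, that works out to roughly 8.6 hours, close to a full working day returned to client-facing work.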

Things to keep in mind

While the growth of AI in the social care space has the potential to reduce practitioners' workloads, it must also be used with caution to avoid errors and other potential problems.

Data privacy and security

Social care is often also healthcare, and with healthcare rapidly becoming a top target for cybercriminals, it's more important than ever that the confidential patient data handled by AI tools and companies is held securely and privately. AI companies offering note-taking or schedule-supporting tools should be able to reassure clients that data is held onshore here in the States wherever possible, that it is retained only long enough to generate the relevant artifacts emailed out to practitioners, and that the initial unprocessed data is deleted afterward.
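
That "process, deliver, then delete" retention policy maps to a simple pattern in code. The sketch below is illustrative only: it reuses the hypothetical process_session from the earlier sketch, and deliver_to_practitioner stands in for whatever secure email or delivery service a real vendor would use.

```python
# Sketch of the "process, deliver, then delete" retention pattern.
# deliver_to_practitioner() is a hypothetical secure-delivery stand-in;
# process_session() is the pipeline sketched earlier in this article.
import os

def deliver_to_practitioner(email: str, artifacts) -> None:
    """Stand-in: a real tool would send the artifacts via a secure channel."""
    ...

def handle_recording(audio_path: str, practitioner_email: str) -> None:
    try:
        artifacts = process_session(audio_path)  # transcribe + summarize
        deliver_to_practitioner(practitioner_email, artifacts)
    finally:
        # Delete the raw recording whether or not processing succeeded,
        # so unprocessed client audio never lingers on the server.
        if os.path.exists(audio_path):
            os.remove(audio_path)
```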

Accuracy

AI, like any tool, is fallible: it can make mistakes, and it can "hallucinate", inventing information that doesn't exist but sounds plausible. It's crucial that practitioners not rely on AI wholesale, but either use it to support their own work (for example, using the summary notes to write their own reports, or using AI-suggested templates to speed up report writing) or read and closely check AI-generated reports before submission.
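
One lightweight way a tool could enforce that discipline, sketched here purely as an illustration rather than any existing product's behavior, is to treat AI output as a draft that cannot be submitted until a named practitioner has signed off on it:

```python
# Sketch of a human-in-the-loop gate: AI text stays a draft until reviewed.
from dataclasses import dataclass

@dataclass
class DraftReport:
    body: str                       # AI-generated draft text
    reviewed_by: str | None = None  # practitioner who checked it, if anyone

def submit_report(report: DraftReport) -> None:
    if report.reviewed_by is None:
        raise ValueError("AI-generated reports must be reviewed before submission")
    print(f"Submitting report signed off by {report.reviewed_by}")
```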

Biases

AI is trained on historical data, and that data can reflect existing biases related to race, gender, status, or disability. When using AI, practitioners should be aware that AI tools could unfairly prioritize or de-prioritize specific clients; for example, an AI algorithm used by a Pittsburgh-area child protective services agency was accused of disproportionately flagging families with disabled members for investigation. Other potential problems include an AI lacking crucial context for a particular case and therefore producing irrelevant or overly simplistic results.

At the end of the day, although AI has the potential to change the lives of many social workers, the technology is still new and should be used carefully.