Australasian Sonographers Association

AI in Sonography

Wednesday, 29 January 2025

Associate Professor Chris Edwards, FASA, a member of the ASA’s SPAC, discusses the importance and role of AI in both sonography practice and education.

The Rise of AI

Lately, it’s been hard to escape discussions about artificial intelligence (AI) and its potential to reshape our lives. Conversations often focus on its influence on work and whether it will meaningfully impact the role of sonographers. How will it affect education and the training of future professionals? Beyond our field, the media is full of warnings about rogue AIs fuelling social unrest by spreading misinformation or aiding criminals in creating elaborate scams. Recently, AI-generated content – images and videos – has flooded the internet. While some of it is amusing, much of it has little value, earning the nickname ‘slop’ or ‘AI-generated spam’.

The recent surge in public interest in AI can be traced back to the release of OpenAI’s ChatGPT in November 2022. However, the technology behind it – specifically the transformer, the ‘T’ in ChatGPT – has been around since 2017.1 Over the past year, there has been a rapid rise in user-friendly interfaces, making the technology accessible to a much broader section of the public. Today, numerous tools allow users to generate high-quality text and create images and videos with an ever-expanding range of possibilities.

In September, Google released an audio add-on to NotebookLM,2 their personalised AI collaborator, allowing users to create a podcast from text-based inputs. Could this be a handy study tool? Imagine students uploading their ultrasound physics notes and listening to AI-generated podcasters discuss parallel beamforming on their commute, complete with ‘ums’ and ‘ahs’. There’s even evidence suggesting that, in specific contexts, AI can explain content more effectively than professors.3

Applications that use transformers will continue to evolve. A transformer is an AI model trained via machine learning on vast amounts of data. The training employs a technique called ‘backpropagation’, a type of feedback loop that improves the model’s accuracy.

"Some tools may work in the background to improve image quality but others will require direct interactions with the sonographer"

During backpropagation, the model calculates the error between its predicted and the actual outputs and then uses this error to adjust its parameters, improving performance over time. The training data could be text, in the case of large language models (LLMs), or images and videos.
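To make that loop concrete, here is a minimal sketch of a single backpropagation step written in PyTorch. The tiny linear model, random data and settings are illustrative placeholders, not anything described in this article.

```python
# Minimal sketch of one backpropagation step (PyTorch).
# The model, inputs and targets below are illustrative placeholders.
import torch
import torch.nn as nn

model = nn.Linear(10, 2)                              # stand-in for a transformer
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

inputs = torch.randn(8, 10)                           # a batch of training examples
targets = torch.randint(0, 2, (8,))                   # the 'actual' outputs

predictions = model(inputs)                           # forward pass: predicted outputs
loss = loss_fn(predictions, targets)                  # error between predicted and actual

optimizer.zero_grad()
loss.backward()                                       # backpropagation: compute gradients of the error
optimizer.step()                                      # adjust parameters to reduce the error
```

Repeating this step over vast amounts of data is what gradually improves the model’s accuracy.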

Some of the work currently being done at QUT, analysing ultrasound images, uses a type of transformer called a Swin Transformer. The Swin Transformer, short for Shifted Window Transformer, is designed to deal with the complexity of image processing. Unlike text, where information flows sequentially, images encode information spatially. For example, in medical images, the diagnosis isn’t just in individual pixels but in how these pixels relate to one another across the entire visual field. Ultrasound images are particularly complex and include various acoustic features that relate to one another, such as their echotexture and position in the image. The correct characterisation of an image may even include features that extend across frames.
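As a rough illustration of how such a model is set up in practice, the sketch below adapts torchvision’s off-the-shelf Swin Transformer to an image-classification task. The class labels and the single random ‘frame’ are assumptions made for the example; this is not the QUT model or its training data.

```python
# Sketch: adapting a pretrained Swin Transformer to classify ultrasound images.
# Requires torchvision >= 0.13; downloads ImageNet-pretrained weights.
# The class names and input are hypothetical placeholders.
import torch
import torch.nn as nn
from torchvision.models import swin_t, Swin_T_Weights

classes = ["normal", "cyst", "solid_lesion"]          # hypothetical labels

weights = Swin_T_Weights.DEFAULT
model = swin_t(weights=weights)
model.head = nn.Linear(model.head.in_features, len(classes))  # new classifier head

preprocess = weights.transforms()                     # resize/normalise as the model expects
frame = torch.rand(3, 480, 640)                       # stand-in for one ultrasound frame

logits = model(preprocess(frame).unsqueeze(0))        # add a batch dimension and classify
print(classes[logits.argmax(dim=1).item()])
```

The shifted-window attention inside the model is what lets it relate features across the whole visual field rather than pixel by pixel.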

Once training is complete, the model doesn’t ‘relearn’ each time it gets a new prompt; instead, it leverages its pre-learned knowledge to respond in real time. It does this quickly because it’s designed to process information in parallel rather than sequentially, making it highly efficient. Some commercial ultrasound systems have already deployed this technology, with onboard AI assistants that highlight particular anatomy or the correct scan plane – for example, targeting a nerve before a pain block or identifying fetal structures during obstetric scanning.
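The sketch below shows what that inference step can look like in code: the trained model is frozen, gradient tracking is switched off, and a whole batch of frames is classified in one parallel forward pass. The placeholder model and anatomy labels are hypothetical, not taken from any commercial system.

```python
# Sketch of inference after training: no relearning, just a fast forward pass.
# The model and labels are illustrative placeholders.
import torch
import torch.nn as nn

labels = ["median_nerve", "artery", "tendon"]         # hypothetical structures to highlight

model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 224 * 224, len(labels)))
model.eval()                                          # freeze training-only behaviour

frames = torch.rand(16, 3, 224, 224)                  # a batch of incoming frames

with torch.no_grad():                                 # no backpropagation at inference time
    logits = model(frames)                            # whole batch processed in parallel
    predictions = logits.argmax(dim=1)

for i, p in enumerate(predictions.tolist()):
    print(f"frame {i}: {labels[p]}")
```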

Professional Context – AI’s Role in Sonography Practice

Predicting how this technology will ultimately impact the sonographer’s role is challenging. Some tools may work in the background to improve image quality, but others will require direct interactions with the sonographer. Viewed from a purely technical or capability standpoint, AI is a system that employs machine learning to predict an output from text, images, audio or video. With the current level of AI technology, it is perhaps reasonable to suggest that, within a short period, an ultrasound system will be equipped with advanced image analysis tools, possibly automatically labelling anatomy and highlighting a range of pathologies. Further, it may be able to analyse sets of images or videos and compute a report. Regarding audio, perhaps our machines may be equipped with an inbuilt audio interface that interacts with the patient directly, reassuring them that everything is in hand and answering any questions they may have in a professional, empathetic tone. The system may even prompt you if the patient mentions relevant clinical history, conveniently prefilling these details into an electronic worksheet before you have left the room. What about wholly automated systems using advanced robotics?

The world was promised the widespread adoption of driverless cars; Morgan Stanley predicted in 2013 that this would occur by 2026.4 Current predictions have now pushed that out by more than another two decades. This is partly due to the messiness and unpredictability of real life and the difficulty of designing models to cope. In the case of driverless cars, that means sudden changes in the weather, roadworks and, of course, other human drivers behaving irrationally on the road. Healthcare is similarly complex, filled with ambiguity, human emotion and multiple competing elements, and AI models must be robust enough to handle these realities. In sonography, this means dealing with everyday realities like extremes of body habitus, low patient acuity, and challenging patient interactions. Sonographers currently address these issues using techniques such as non-standardised plans and advanced communication skills tailored to each patient’s condition and clinical history.

Various systems will be developed and deployed as technology advances, and it will be interesting to follow how they are implemented and adapted to current healthcare challenges. What is clear is that AI systems that make predictions or assist with decision making cannot be viewed simply in a transactional sense. They are not like calculators, with a simple input and output; instead, each interaction involves a relationship between the user and the patient (both humans) and the machine. As mentioned above, the success of these systems will depend not only on various patient characteristics but also on how AI relates to individual practitioners – that is, on knowing whether the tool produces the correct response in the correct setting. In this way, the success of the AI will depend on the practitioner’s experience just as much as on its ability to interpret the ultrasound machine’s output. In one scenario, an expert sonographer may quickly identify an error in the AI output, whereas a student or newly qualified sonographer may take the output on trust. AI interactions will require a delicate balance between trust and scepticism, and successfully integrating AI into sonography will mean navigating this new reality.

"As AI becomes more integrated into sonography and education, the focus must remain on maintaining the integrity of the profession and upholding community trust in practitioners."

Sonography Education – Balancing AI Tools and Integrity

The impact of AI on the sonography profession is likely to be gradual. However, its effect on education, including sonography education, has more immediate implications. Generative AI (GenAI), the term used to describe AI that can create new content, is capable of completing many student tasks with high accuracy, making it one of the most disruptive technologies ever seen in education. The latest ChatGPT release, o1-preview, designed explicitly for advanced logic and reasoning, is reported to perform at PhD level.5 Multiple authorities are wrestling with the impacts. A parliamentary Senate inquiry report, ‘Study Buddy or Influencer’, was released in August 2024, recommending that GenAI in education be made a national priority.6 The Tertiary Education Quality and Standards Agency (TEQSA), the national regulator of universities, has requested urgent action, asking all institutions to report on how they manage GenAI, particularly how they address the risk GenAI poses to award integrity.7

Universities themselves also understand the reputational risks and the potential for significant community backlash – for example, if graduates from a particular university enter the workforce without the competence to perform the job effectively and safely. This is especially critical in fields like sonography and other health degrees, where universities self-certify their students; once they graduate, there is no post-qualification check to ensure that the skills they reportedly gained meet community expectations. With these considerations in mind, and with TEQSA keeping one eye firmly on risk, an urgent overhaul of assessment practices is underway. Of course, some assessments are designed to assist and promote learning, and AI tools will become welcome additions. Other, high-stakes assessments designed to verify student learning, which directly lead to certifying a degree like sonography, will require rigorous direct supervision – for example, a return of traditional hall-style written exams to verify knowledge acquisition, oral assessment tasks to demonstrate understanding and showcase communication skills, and face-to-face practical evaluations to demonstrate technical skills. Many online assessment methods developed during COVID-19 are now obsolete. Numerous tips and YouTube videos are circulating on how students can use AI tools during online assessments, even when the assessor is present via Teams or Zoom. Some may have already seen videos of job candidates using AI tools to convert an employer’s questions into text responses displayed on a hidden monitor during interviews.

Risks must be countered with rewards. In the professional world, it is not unreasonable to expect the question ‘why did you use an AI tool?’ to become ‘why not?’, especially when there are workflow and efficiency gains and improvements to overall patient care.

It is therefore incumbent on educators to encourage the use of these tools for relevant tasks. In the sonography context, this might include written communication tasks where students produce artefacts (reports, oral case presentations, ePosters) to showcase and share their knowledge with colleagues.

The use of AI in learning is also an untapped resource. Feedback literacy, an emerging topic in education aimed at improving student learning outcomes, is one area in which AI may assist. The idea focuses not on how supervisors frame feedback for students but on teaching students to respond positively and integrate feedback into their practice. Feedback comes in various forms: sometimes it is for evaluation, other times it is an aid to help the student improve, and at other times it is praise for a job well done, all of which are important for growth. How students manage their emotional responses to these is an important skill to develop. More research is needed, but perhaps AI chatbots could be a good source of feedback literacy development – just a thought.

As AI becomes more integrated into sonography and education, the focus must remain on maintaining the integrity of the profession and upholding community trust in practitioners. While these tools offer potential, ensuring that they enhance rather than replace human judgement and accountability is essential to preserving the core values of patient care and professional competence.

References

  1. Vaswani A, et al. Attention is all you need. Advances in Neural Information Processing Systems. 2017.

  2. Google. NotebookLM. 2024.

  3. Chiasson RM, et al. Does the human professor or artificial intelligence (AI) offer better explanations to students? Evidence from three within-subject experiments. Communication Education. 2024;1–28. https://www.tandfonline.com/doi/full/10.1080/03634523.2024.2398105

  4. Shanker R, et al. Autonomous cars: Self-driving the new auto industry paradigm. Morgan Stanley blue paper. 2013;1–109.

  5. OpenAI. Introducing OpenAI o1-preview. 2024.

  6. Parliament of Australia. Inquiry into the use of generative artificial intelligence in the Australian education system ‘Study Buddy or Influencer’. 2024.

  7. Tertiary Education Quality and Standards Agency. Artificial intelligence request for information – next steps. 2024.