“We are moving from a model where the consumer or the user bears the burden of security to those who best can handle it – the developers and companies selling the software.”
Using short essays to check whether pupils have absorbed information has always been open to abuse, whether that meant asking an older sibling for help, copying from an encyclopedia, or using a search engine. To get pupils thinking about how LLMs work, ACS has asked some students to use ChatGPT to draft an essay on a topic, fact-check it for “hallucinations” (factual errors), and then rewrite the text to correct them.