Earlier this year, the Supreme Court of British Columbia released a costs decision in the case of Zhang v. Chen, flowing from a family law dispute relating to parenting time.

But that’s not why I am writing about it this weekend.

In seeking costs on the outcome of the application, Nina Zhang also sought special costs against her former spouse’s lawyer for citing two non-existent cases that were later discovered to have been invented by ChatGPT.

After receiving Chen’s Notice of Application in December 2023, Zhang’s counsel advised that they could not locate two cases cited in his application. Chen’s counsel apologized, indicating they would look into it. Zhang’s counsel continued to demand copies of the two cases referenced in the notice of application. They were never produced.

To confirm the cases didn’t exist at all, Zhang’s counsel even hired a legal researcher to seek out the two cases. The researcher, too, determined they didn’t exist.

On the date of the hearing of the application, Chen’s counsel provided an email to the court admitting to using ChatGPT “without verifying the source of information.”

She also said: “I had no idea these two cases could be erroneous.”

While I have previously written about a New York lawyer who was outed for relying on ChatGPT research in a legal brief, this is the first Canadian example I have found of ChatGPT making a dangerous example of itself.

In responding to the request for special costs against her, the lawyer at issue swore an affidavit in which she deposed, “I am now aware of the dangers of relying on AI generated materials.”

In deciding whether costs should be awarded against her, the court found that citing fake cases in court filings and other materials handed up to the court is “an abuse of process and is tantamount to making a false statement to the court.”

The court noted, though, that Zhang had a well-resourced legal team, that the cases were withdrawn before the hearing and that “there was no chance here that the two fake cases would have slipped through.”

While the court didn’t order special costs against Chen’s lawyer, it did find that additional expense and effort had been incurred, and it ordered that she personally bear those costs.

This is a very public example of what is likely happening in every industry, even among trained and educated professionals.

And while one can sympathize with the lawyer here in some respects, I can’t help but reflect on what the impact would have been if the two fictitious cases had slipped through.

Our Canadian legal system would be seriously imperiled if fake ChatGPT cases were quoted in courts and accepted by less vigilant counsel and judges. On second thought, it likely already has happened.

So what’s the call to action?

The call is to people like you, fair reader, to take the reins of your own AI use. Use it judiciously, not reflexively. Treat AI as you would a brilliant young child – one that can see the world much differently than most, but still needs help crossing the street.

Chen’s lawyer was remorseful and obviously naive about the pitfalls of using generative AI.

Have a workplace question? Maybe I can help! Email me at sunira@worklylaw.com and your question may be featured in a future column.

The content of this article is general information only and is not legal advice.
