AI & The Duty of Technology Competence
As Robert Ambrogi points out over on Law Sites, there are now 26 states that have adopted the duty of technology competence for lawyers - first noted in Comment 8 to ABA Model Rule 1.1.
The ABA version states:
To maintain the requisite knowledge and skill, a lawyer should keep abreast of changes in the law and its practice, including the benefits and risks associated with relevant technology, engage in continuing study and education and comply with all continuing legal education requirements to which the lawyer is subject. (Emphasis added.)
While the states may differ in the exact language of their rules, these rules will likely have an ongoing effect on a lawyer's duty to learn various aspects of ever-changing technology.
Down the road, there may be a time when attorneys must know and understand how artificial intelligence works to be able to rely on technology to perform the more sophisticated functions of law practice.
As lawyers begin to use ROSS, say, to perform legal research or even draft simple memos, it is not unreasonable to presume that a lawyer would need to understand how ROSS decided on a particular issue to have true algorithmic accountability. Because something like ROSS cannot be subject to the same professional responsibility rules as a living, breathing lawyer, it is up to the lawyer to maintain a duty of technological competence to understand and vet the work of the software.
This is tricky because most algorithms are currently proprietary, and there is little transparency about how results are generated. This tension is unlikely to be resolved anytime soon.
Until AI developers disclose the decision trees behind how an algorithm arrives at a particular result, law librarians can help teach lawyers the current state of AI technology. During legal research instruction, we should offer lawyers pointers on evaluating the results generated and on spotting possible issues, such as bias.