AI and Law: Issues and Concerns
There is an interesting shift underway in AI and law: a move away from reliance on legal precedent and toward algorithms for case dispositions and decision making. This shift has created interesting issues for judges and lawyers alike.
This is because, in theory, AI could allow a computer to replace the jury or the judge: it would analyze probabilities, apply them to the facts and the law, and produce a decision. Several pilot programs throughout the country are already testing this approach, currently limited to smaller matters such as car accident cases and misdemeanors. We are keeping a close eye on where it goes.
There is a place for AI in our judicial system. It could make the resolution of small claims cases and limited civil cases, those valued at, for example, less than $25,000, more expedient, along with certain criminal offenses. But we are not at a point where we can use AI or algorithms for major cases: large civil matters, intellectual property decisions, or major felony prosecutions. Still, AI has a place in our judicial system, and we look forward to seeing how it all shakes out.
Automated Driving Liability
The issues surrounding AI and liability are profound, most of all in the context of automated driving, which is very popular and all over the news right now. If there is an incident or an accident, the judge, jury, and lawyers involved will have many questions. Who exactly is liable? The car manufacturer? The software manufacturer or programmer? The owner of the vehicle? Legislation and legal precedent will need to resolve these questions in time. The other clear issue is of paramount importance: safety.
How can we properly regulate automated vehicles? Regulation needs to account for all the safety issues before we can have confidence in automated vehicles hitting the road. So far there have been a few hiccups in the process. More technical development is needed, both to reach a point where the general public feels secure and to allow the judicial system to catch up.
Open Source Coders: AI and Law
An interesting issue is the potential liability of open source coders in the AI context. Right now there is not much case law on this kind of liability, so I would say any coder should proceed with caution when contributing to an automated vehicle or automated system, whether that is a drone system, a robotic system, or even a purely computer-based system. Liability exposure should be part of a coder's decision-making process.
I would compare the situation to a products liability case. Take, for example, a car that is allegedly manufactured with a defect or has a design defect. It is common for the plaintiff to sue not only the car manufacturer but everybody else in the chain of commerce, including the car dealer and specific component suppliers.
The same logic applies to AI and law: the manufacturer of the vehicle would face the lawsuit first, then the component providers, then the software programmer. The lawsuit could even name freelance programmers, because the plaintiff's goal is to reach as many potential pockets as possible. It is wise for any programmer who contributes to an AI platform to proceed with caution and to make sure appropriate contractual protections are in place.