How AI Saved JA

  • By Mr. Ryan D. Oakley
Disclaimer: This is a fictional article written as part of a writing contest and should not be taken as guidance for the current practice of law or use of artificial intelligence.

 
Innovation Writing Competition:
What does the JAG Corps look like in 2049?
 

How AI Saved JA: Transforming the Base Legal Office

It seems probable that once the machine thinking method had started, it would not take long to outstrip our feeble powers … [Computers] would be able to converse with each other to sharpen their wits. At some stage, therefore, we should have to expect the machines to take control.

~Alan Turing[1]
 
[Video: The AI Advantage, 06:12]

 
Note: The following is an excerpt from The First 100 Years: The Department of the Air Force Judge Advocate General’s Corps: 1949-2049, Chapter 8, Origins of JAG Corps Artificial Intelligence (AI) Legal Operations, AFJAGS Press (2050), pp. 135-142.

Prelude

When I arrived at my first base in the spring of ‘24, everyone at the Wing was overwhelmed. Working evenings and weekends was de rigueur just to keep your head above water. Leadership was responsible for 100-plus inspection items while wrangling disparate data systems to track cases, schedule meetings, check email, review metrics, assign taskers, clear suspenses, and process performance reports. Paralegals ran endless checklists (and don’t get me started on additional duties). Attorneys triaged incoming contracts, fiscal, environmental, and ethics reviews. Courts-martial awaited preferral, referral, and docketing, with proof analyses, motions, and discovery responses coming due. And we all complained about legal assistance. Make no mistake: we worked hard to draft thousands of wills, powers of attorney (POA), and related documents, saving military families millions of dollars in attorneys’ fees. But even the brightest-eyed optimists burned out, struggling with finite manpower, resources, and time. We didn’t have time to think beyond the next suspense or pop-up crisis.

Paradoxically, as busy as we were, legal assistance client numbers and office visits were declining department-wide every year. It was easy to blame the aftermath of COVID-19, yet the drop started before the pandemic. What happened? There was plenty of speculation, but I think the simplest explanation is that our clients valued their time too. Why wait for an appointment, or sit in a crowded lobby for hours, only to be referred to the local bar association because your issue was too complex to resolve in 30 minutes? Why not cut out the middleman? And if you couldn’t afford a civilian attorney, you could roll the dice and Google your legal question. Certainly, do-it-yourself legal assistance risked disinformation and bad outcomes, but Airmen and Guardians are innovators—they were onto something.

Starting in 2018, the Department of Defense’s Artificial Intelligence Strategy sought “to accelerate the adoption of AI and the creation of a force fit for our time.”[2] By 2030, Legal Information Services (AF/JAS) had begun replacing our legacy systems with AI-enabled capabilities “to augment the capabilities of JAG Corps personnel by offloading tedious cognitive or physical tasks and introducing new ways of working.”[3] It was a game changer. The JA-AI program was guided by best practices proposed by the Artificial Intelligence Acquisition Guidebook, published by the Department of the Air Force and the Massachusetts Institute of Technology’s Artificial Intelligence Accelerator.[4] After successful implementation, base offices saw improved productivity, efficiency, and accuracy across all domains.[5] But we first had to overcome myths and misconceptions about AI.

 
Certainly, do-it-yourself legal assistance risked disinformation and bad outcomes, but Airmen and Guardians are innovators—they were onto something.


 

Rage Against the Machines

Frankly, JA-AI was first met with heavy skepticism and sarcasm. Following the first Online News Service announcement, there were tons of Terminator and Matrix jokes. Cynics predicted it was another technological boondoggle, paying millions for products we didn’t need and that wouldn’t work. Like past inventions (e.g., desktop computers, email, pagers, and smartphones) which promised time savings and reduced stress (meet George Jetson), they said, AI would only make our workdays more frenzied and frazzled. But as details trickled out, the “too cool for school” mood turned to fear, particularly among career military and civilian personnel, mirroring a deep resistance in the private sector.

Since the inception of the Industrial Age, an argument has persisted over whether automation, fueled by technology, displaces more jobs than it creates. Consider the impact the Internet had on travel agents, cashiers, journalists, Blockbuster Video, and brick-and-mortar store owners. Complacently, we used to think attorneys were safe because of the required right-brained skills—chiefly, the creative spark. “Humans use their life experiences, their emotions, and their creativity to bring things to life,” while “Robotics and AI uses data to learn and improve,” stated Tom Pickersgill, the leader of a company using AI to connect job seekers and employers.[6] Dramatic advances in machine learning and natural language processing applied to legal research and writing, however, upset our false sense of security.[7] Legal jobs were no longer off-limits.

In his 2008 book, The End of Lawyers?, author Richard Susskind foretold that the traditional legal expert would yield to “legal knowledge engineers” who oversee the development of standardized “packaged” services, delivered through automated systems accessible to clients.[8] Yet Susskind lamented that “success stories remain[ed] exceptional” in the development and use of transformative technology to deliver legal services, due largely to a stubbornness among attorneys wedded to staid traditions and billable hours.[9] A decade later in Forbes, Rob Toews observed that despite billion-dollar-plus enterprise software businesses built to fuel productivity in marketing, sales, customer service, finance, accounting, and talent recruitment, the legal field remained a “glaring” and “profoundly underdigitized” outlier.[10] For example, in 2020, Microsoft Word and Outlook email still remained the “dominant digital tools that legal teams use[d] to carry out their work.”[11] Still, change was afoot. By 2024, an estimated 23 percent of the work done by lawyers was automated, and civilian legal departments had replaced one-fifth of their workforce with “technologists.”[12]

While we didn’t share the profit-based anxieties of private practice, nevertheless, we worried about the end of the legal world as we knew it. Was the AI initiative a budgetary Trojan horse to cut manpower costs by outsourcing billets to Skynet? Would future squadron commanders be reduced to “asking Alexa” for advice on Article 15 punishments? And why did commanders need to make the final call (since computers were now able to win 97 percent of the time in war games like Stratego and Diplomacy)?[13] What about our professional responsibility rules—are bots exempt from malpractice? But mainly we asked: where does it stop?



 
Artificial intelligence in the practice of law is “the theory and development of processes performed by software instead of a legal practitioner whose outcome is the same as if a legal practitioner had done the work.”
 

About AI

Artificial intelligence in the practice of law is “the theory and development of processes performed by software instead of a legal practitioner whose outcome is the same as if a legal practitioner had done the work.”[14] AI mimics functions of the human brain.[15] The technical term for this is machine learning.[16] While there’s a ton of math involved, the basic idea is simple: specific data (A) is input and used to quickly generate a simple response (B), such as recognizing a person’s face from a photograph or translating a sentence from English to Spanish.[17] These A→B software programs learn as they go, improving their performance through continual feedback. This allows the automated completion of data-intensive tasks and the recognition of patterns in the relationships between words or data points, thereby identifying relevant information and highlighting errors and inconsistencies—all faster, and usually better, than humans can.[18] Andrew Ng, the co-founder of Google Brain, put it this way: “If a typical person can do a mental task with less than one second of thought, we can probably automate it using AI either now or in the near future.”[19] In fact, most of us were already using commercially available programs powered by AI (e.g., LinkedIn), but just didn’t know it.
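The A→B pattern described above can be sketched in a few lines of code. The example below is purely illustrative (a toy perceptron on invented data, not any actual JAG Corps system): the program is fed inputs (A) with known answers (B), and adjusts its internal weights after every wrong prediction, which is the "continual feedback" loop in miniature.

```python
# Illustrative sketch only: a minimal A->B learner. Given input features (A),
# it produces a response (B) and improves with each round of feedback.

def train(examples, labels, epochs=200, lr=0.1):
    """Learn weights for a tiny linear classifier by iterative feedback."""
    w = [0.0] * len(examples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(examples, labels):
            # Predict B from A, compare with the known answer, and adjust.
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def predict(model, x):
    w, b = model
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Toy task (invented): flag a document as "relevant" (1) when it mentions a
# key term. Features: [mentions_contract, mentions_weather].
data = [[1, 0], [1, 1], [0, 1], [0, 0]]
labels = [1, 1, 0, 0]  # relevant iff the contract term appears
model = train(data, labels)
print([predict(model, x) for x in data])  # prints [1, 1, 0, 0]
```

Real systems use vastly more data and far richer models, but the feedback loop is the same: every corrected error nudges the A→B mapping closer to the behavior a human reviewer would expect.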

Impact on JAG Corps

Ultimately, the benefits of AI were undeniable—and unavoidable. Skeptics gradually realized that AI was “more likely to aid than replace” attorneys and paralegals. As Matthew Stepka notes, AI approaches the creative process in a fundamentally different way than humans do, through what is called emergent behavior.[20] Thus, AI can help identify new ideas, strategies, and courses of action for commanders and clients to consider. Throughout the next two decades, we saw tangible benefits throughout the Corps.

Leadership and Office Management: At Wing-level and General Court-Martial Convening Authority (GCMCA) offices, AI now manages our self-assessments, making everyday processes much easier. No more “preparing” for inspections or multi-day visits to the field to review three-ring binders. Every Article 6 checklist item is updated continuously, with substantiating records instantly uploaded and any deficiencies clearly highlighted with corrective action plans proposed (of course, our higher headquarters offices receive these updates, which motivates quick action on our part). AI is also able to pull data across varied systems to provide commanders and Staff Judge Advocate-Law Office Superintendent (SJA-LOS) teams with a clear “heads-up” display. By automating repetitive tasks, routine reports, and “additional duties,” we freed up time for JA members to advise commanders and mission partners, prepare cases for trial, conduct training, and simply have more meaningful interactions with colleagues outside the legal office.[21]

Legal Assistance: The Air Force Legal Assistance website (LAWS), first launched in 2009, was upgraded with enhanced customer service capabilities which increased annual client engagements while reducing the administrative workload of attorney-paralegal teams. Today, wills are prepared automatically based on client questionnaires (while flagging errors and state-specific issues for follow up before the signing ceremony). It’s a vast improvement over navigating the choose-your-own-adventure of past will-drafting programs, and we can now accomplish special needs trusts. Since their inception, these AI solutions have significantly expanded access to legal assistance from basic training throughout retirement, while simultaneously easing the load of walk-in visits, appointment scheduling, and last-minute emergencies.

Through expertise automation, eligible clients can engage with JA offices from their home or workplace worldwide. They can pose questions about specific legal problems to a chatbot and receive custom-tailored information to discuss with an attorney. Here we followed civilian best practices: legal chatbots had long been used by private law firms to book client appointments, relay information, answer frequently asked questions, file claims, draft forms and agreements, and assist victims of crime.[22]
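At its simplest, a rule-based FAQ chatbot of the kind long used by private firms is just topic matching with a human-escalation fallback. Everything in the sketch below (the topics, the responses, and the routing message) is invented for illustration and does not reflect any actual LAWS capability.

```python
# Hypothetical sketch of a rule-based legal-assistance chatbot. Topics and
# responses are invented; real systems use far more robust intent matching.

FAQ = {
    "power of attorney": "A POA can be prepared online; bring photo ID to the signing.",
    "lease": "Ask about SCRA lease termination if you have PCS or deployment orders.",
    "landlord": "Document the issue in writing and request a legal assistance review.",
}

def answer(question):
    """Match a client question against known topics; escalate if no match."""
    q = question.lower()
    for topic, response in FAQ.items():
        # Naive substring matching; production chatbots use NLP intent models.
        if topic in q:
            return response
    return "No match found; routing you to an attorney for follow-up."

print(answer("How do I get a power of attorney before I deploy?"))
```

The design choice worth noting is the fallback: when no rule fires, the bot hands the client to a human rather than guessing, mirroring the article’s point that these tools triage routine questions so attorneys can focus on the clients most in need.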

Collaboratively, LAWS also enables the drafting of state-specific documents for divorce, child custody, adoption, immigration, landlord-tenant, civilian protection order, and Servicemembers Civil Relief Act cases.[23] Similar to tax assistance, our online programs are more convenient for clients, offering easier access, faster service, and accurate results. By handling the vast majority of garden-variety questions and routine services, LAWS has enabled base programs to respond rapidly to the clients most in need, focusing on deploying members, emergent issues, and Exceptional Family Member Program (EFMP) support. These innovations have also been appreciated by retired members who previously waited months for in-person appointments at overloaded offices.

Civil Law: Air Force and Space Force installations now rely on AI to draft and review multi-million-dollar contracts, leases, support agreements, and other important legal documents. Daily, we use the latest legal research software, which can perform sophisticated analysis of specific issues for final “spot-check” review by flesh-and-blood lawyers. AI has also drastically improved document management, file plans, and knowledge management for base offices, saving hundreds of thousands of man-hours.[24]

Military Justice and Litigation: In addition to expanding legal research and drafting capabilities, e-discovery software enables a vast number of documents to be surveyed, and those relevant to the search criteria to be identified quickly and accurately, in a fraction of the time. Post-trial processing is a breeze. Similar AI tools help highlight correct case precedents and potential evidentiary issues for trial and defense counsel. Using predictive analytics, litigation teams are also able to more accurately assess the likelihood of a successful outcome in a high-profile trial or civil lawsuit, allowing the Air Force to decide whether to settle.[25] AI also improved the ability of the Disciplinary Case Management System (DCMS) (and its predecessor, the Automated Military Justice Administration and Management System (AMJAMS)) to identify emerging misconduct trends across commands, effective rehabilitative techniques, and potential racial or gender disparities in military justice.[26] In planning the update of DCMS, we realized the computer isn’t a neutral or objective arbiter, because the data analyzed may be embedded with inherent bias, which is in turn echoed by AI.[27] However, with this understanding of AI’s limitations, we were able to better see these biases ourselves, remove blinders, and respond to issues and inequities that otherwise might have been missed in a sea of data.[28]
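The relevance ranking at the heart of e-discovery review can be illustrated with a simple term-weighting sketch. This is a toy example under stated assumptions (invented document snippets and a basic TF-IDF weighting), not the actual software described above; real e-discovery platforms layer machine learning and human review on top of such scoring.

```python
# Illustrative sketch only: scoring documents against search criteria the way
# basic e-discovery tools rank relevance, using term-frequency weighting.
import math
from collections import Counter

def score(query_terms, documents):
    """Rank documents by a TF-IDF-style relevance score for the query."""
    n = len(documents)
    tokenized = [doc.lower().split() for doc in documents]
    # Inverse document frequency: rare terms count more than common ones.
    idf = {
        t: math.log((n + 1) / (1 + sum(t in doc for doc in tokenized)))
        for t in query_terms
    }
    scores = []
    for doc in tokenized:
        tf = Counter(doc)  # term frequency within this document
        scores.append(sum(tf[t] * idf[t] for t in query_terms))
    return scores

# Hypothetical document snippets, invented for the example.
docs = [
    "motion to suppress evidence under article 31",
    "weather report for the air show",
    "discovery response listing evidence logs and evidence tags with the article 31 advisement",
]
relevance = score(["evidence", "article"], docs)
best = max(range(len(docs)), key=lambda i: relevance[i])
print(best)  # prints 2: the discovery response mentions the terms most often
```

The same mechanics, scaled up and trained on attorney feedback, are what let review teams surface the responsive fraction of a massive document set rather than reading everything.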

AI and Professional Responsibility

There were extensive conversations in the field and at higher headquarters about the ethics of AI. We argued over whether we were allowing computers to engage in the unauthorized practice of law. Working groups debated who was responsible if AI malfunctioned: the creator, the contractor, the network administrator, the end user, the agency, or the attorney?[29] And who were the supervising attorneys for bots? Eventually, standardized AI governance helped fill in these gaps. Meanwhile, interagency teams did a deep dive into the details of data collection, privacy rights restrictions, systems of record notices, and Federal Register public comments. Last but not least, how much did the Air Force Rules of Professional Conduct need to change? The answer: surprisingly little.

“The ethical issues raised by AI are in many ways not that different from the ethical issues that lawyers have faced before,” stated David Curle, the Director of the Technology and Innovation Platform at the Legal Executive Institute of Thomson Reuters.[30] Professional responsibility rules recognize that most legal professionals are not computer whizzes. As Curle counsels, “What this means in practice is that lawyers need to find trusted providers of AI-based solutions, and they need to pose smart questions to the providers whose AI tools they are considering.”[31]

Under the duty to supervise, JAG Corps professionals must be competent to select and oversee the proper use of AI solutions.[32] This is not a new ethical standard. As of 2022, the commentary to the ABA Model Rules of Professional Conduct advised that, “[t]o maintain the requisite knowledge and skill, a lawyer should keep abreast of changes in the law and its practice, including the benefits and risks associated with relevant technology, engage in continuing study and education and comply with all continuing legal education requirements to which the lawyer is subject.”[33] Failing to use, or to properly use, commonly available technology (like old-fashioned email and e-discovery software) can be grounds for suspension by the bar, as Professor Lauri Donahue of Harvard University notes.[34] Anthony Davis adds that the employment of AI involves “not just a matter of the duty to supervise what goes on, and what tools are used within the law firm, but what third-party tools are used and how.”[35]

Legal offices must appropriately select AI vendors and effectively manage the use of similar technological solutions in everyday practice to ensure sufficient communication with, and support to, clients, while protecting confidentiality.[36] Thus, a key part of attorney-paralegal foundational training, and continuing legal education, is to understand, “at a basic level,” how AI solutions work and how they were developed.[37] This boils down to retaining and exercising independent judgment.[38]

I Am Not A Robot

Over the last 25 years, AI has changed how civil society and our legal profession operate, impacting everything from how law school is taught to the speed at which we work, think, and make decisions daily. While AI continues to transform numerous industries, “it’s not magic” and has inherent limitations.[39] On the surface, AI takes tons of data to use, and the quality of its end products is tied to the accuracy, relevancy, timeliness, and bias of the inputs analyzed. More profoundly, there is the question of whether self-aware AI will ultimately save or sink humanity. This debate is still ongoing, as AI continues to evolve exponentially. How we decide to use emerging technology to improve our legal practice, and how we choose to respond to change, remains within our control. Approaching 2050, we’ve witnessed myriad applications improving our legal support to commanders, Airmen, and Guardians, while enhancing the creativity, proficiency, and resiliency of front-line legal professionals. “As more and more artificial intelligence is entering the world,” notes Dr. Amit Ray, an AI scientist, “more and more emotional intelligence must enter into leadership.”[40] Thus, even if AI can pass the Turing test, we must understand it cannot replace or replicate the value of human JAG Corps professionals, and the pivotal role we all play in mission accomplishment.[41] Here’s to the next fifty years!

Postscript: As of December 2022, most of the artificial intelligence capabilities discussed in this article already exist and are being used today, transforming business practices. Over 4,500 law firms in the United States have invested in AI-driven legal research.[42] This is not science fiction.
 

About the Author

 
Mr. Ryan D. Oakley

Mr. Ryan D. Oakley

(B.A., Huntingdon College, Montgomery, Alabama; J.D., Samford University, Homewood, Alabama; M.A., Air Command and Staff College, Maxwell Air Force Base, Alabama) is the Chief of General Law, 12th Air Force (Air Forces Southern), Davis-Monthan Air Force Base, Arizona.
 
Edited by: Major Allison K.W. Johnson (Editor-in-Chief), Major Victoria Clarke, Major Victoria Smith, and Major Andrew H. Woodbury
Layout by: Thomasa Huffstutler
 

Endnotes

[1] Alan Turing (1912-1954) was a highly influential mathematician and a leading pioneer in the development of theoretical computer science and artificial intelligence.
[2] Dep’t of Defense, Summary of the 2018 Department of Defense Artificial Intelligence Strategy (Feb. 12, 2019), https://media.defense.gov/2019/Feb/12/2002088963/-1/-1/1/SUMMARY-OF-DOD-AI-STRATEGY.PDF.
[3] Id.
[4] Andrew Bowne & Ryan Holte, How to Buy AI, Contract Management (Dec. 2022), https://ncmahq.org/Shared_Content/CM-Magazine/CM-Magazine-December-2022/How-to-Buy-AI.aspx. See Dep’t of the Air Force and Massachusetts Inst. of Tech., Artificial Intelligence Acquisition Guidebook, https://aia.mit.edu/wp-content/uploads/2022/02/AI-Acquisition-Guidebook_CAO-14-Feb-2022.pdf.
[5] Andrew Bowne & Ryan Holte, Acquiring Machine-Readable Data for an AI-Ready Department of the Air Force, The JAG Reporter (Nov. 29, 2022), https://www.jagreporter.af.mil/Portals/88/2022%20Articles/Documents/20221129_Bowne2_r.pdf. Maj Bowne and Capt Holte’s article presented contracting and program management best practices on how to negotiate for the delivery of and rights to AI-ready data, including sample clauses that can be used in all contracts and agreements.
[6] Michael Grothaus, These are the few jobs that robots won’t take from us, Fast Company (Aug. 20, 2018), https://www.fastcompany.com/90221230/these-jobs-are-safe-from-being-replaced-by-automation.
[7] Elizabeth C. Tippett & Charlotte Alexander, Opinion: Lawyers and their jobs are no longer safe from AI and automation, Market Watch (Aug. 10, 2021), https://www.marketwatch.com/story/lawyers-and-their-jobs-are-no-longer-safe-from-ai-and-automation-11628599753/.
[8] Richard Susskind, The End of Lawyers?: Rethinking the Nature of Legal Services (2008); see also Richard Moorhead, Book Review: The End of Lawyers?, 29 Legal Studies 692 (2009).
[9] Susskind, at 21. 
[10] Rob Toews, AI Will Transform the Field of Law, Forbes (Dec. 19, 2019), https://www.forbes.com/sites/robtoews/2019/12/19/ai-will-transform-the-field-of-law/?sh=3e2083807f01.
[11] Id.
[12] Oren Bareket, The legal knowledge engineer - the answer to lawyers who want to lead technology innovation, LinkedIn (July 22, 2022), https://www.linkedin.com/pulse/legal-knowledge-engineer-answer-lawyers-who-want-lead-oren-bareket. See also Rob van der Meulen, 5 Legal Technology Trends that are Changing In-House Legal Departments, Gartner (Feb. 24, 2022), https://www.gartner.com/smarterwithgartner/5-legal-technology-trends-changing-in-house-legal-departments/.
[13] Jeremy Hsu, DeepMind AI uses Deception to beat human players in war game Stratego, New Scientist (Dec. 1, 2022), https://www.newscientist.com/article/2349484-deepmind-ai-uses-deception-to-beat-human-players-in-war-game-stratego/.
[14] Sergio David Becerra, The Rise of Artificial Intelligence in the Legal Field: Where We Are Going, 11 J. Bus Entrepreneurship & L. 27, 38 (2018).
[15] Lauri Donahue, A Primer on Using Artificial Intelligence in the Legal Profession, Jolt Digest (Jan. 3, 2018), https://jolt.law.harvard.edu/digest/a-primer-on-using-artificial-intelligence-in-the-legal-profession
[16] Machine learning is a branch of artificial intelligence (AI) and computer science which focuses on the use of data and algorithms to imitate the way that humans learn, gradually improving its accuracy. See Machine Learning, IBM (July 15, 2020), https://www.ibm.com/cloud/learn/machine-learning.
[17] Andrew Ng, What AI Can and Can’t Do Right Now, Harvard Bus. Rev. (Nov. 2016), https://hbr.org/2016/11/what-artificial-intelligence-can-and-cant-do-right-now.
[18] Avaneesh Marwaha, Seven Benefits of Artificial Intelligence for Law Firms, Legal Tech. Today (July 13, 2017), https://www.lawtechnologytoday.org/2017/07/seven-benefits-artificial-intelligence-law-firms/ [since being cited, this link appears to no longer be available]
[19] Ng, supra note 17.
[20] Matthew Stepka, Law Bots: How AI Is Reshaping the Legal Profession, Business Law Today (Feb. 22, 2022), https://businesslawtoday.org/2022/02/how-ai-is-reshaping-legal-profession/.
[21] Olga V. Mack, AI For Lawyers: Understanding and Preparing For the Future of Law, Above the Law (Sept. 19, 2022), https://abovethelaw.com/2022/09/ai-for-lawyers-understanding-and-preparing-for-the-future-of-law/
[22] David Lat, The Ethical Implications of Artificial Intelligence, Above the Law (2020), https://abovethelaw.com/law2020/the-ethical-implications-of-artificial-intelligence/.
[23] Anthony E. Davis, The Future of Law Firms (and Lawyers) in the Age of Artificial Intelligence, A.B.A. (Oct. 2, 2020), https://www.americanbar.org/groups/professional_responsibility/publications/professional_lawyer/27/1/the-future-law-firms-and-lawyers-the-age-artificial-intelligence/.
[24] See Hugh Son, JPMorgan Software Does in Seconds What Took Lawyers 360,000 Hours, Bloomberg (Feb. 27, 2017), https://www.bloomberg.com/news/articles/2017-02-28/jpmorgan-marshals-an-army-of-developers-to-automate-high-finance.
[25] Stepka, supra note 20.
[26] Approved in 1971, AMJAMS replaced the manual reporting of courts-martial and Article 15 nonjudicial punishment statistics. See Patricia A. Kerns, The First Fifty Years of the USAF JAG Department, 106 (2003).
[27] Stepka, supra note 20.
[28] Id.
[29] Press Release, Dep’t of Defense, DOD Adopts Ethical Principles for Artificial Intelligence (Feb. 24, 2020), https://www.defense.gov/News/Releases/release/article/2091996/dod-adopts-ethical-principles-for-artificial-intelligence/. See also Mary K. Pratt, AI accountability: Who’s responsible when AI goes wrong?, Tech Target (Aug. 19, 2021), https://www.techtarget.com/searchenterpriseai/feature/AI-accountability-Whos-responsible-when-AI-goes-wrong.
[30] Lat, supra note 22.
[31] Id.
[32] Dep’t of the Air Force Inst. 51-110, Professional Responsibility Program, Attachment 2 (Dec. 11, 2018).
[33] See Model Rules of Pro. Conduct 1.1 (Am. Bar. Ass’n 2020).
[34] Donahue, supra note 15.
[35] Davis, supra note 23.
[36] Id.
[37] Lat, supra note 22.
[38] Id.
[39] Ng, supra note 17.
[40] Amit Ray, Compassionate Artificial Intelligence, Compassionate AI Lab (2018), https://amitray.com/compassionate-ai-lab/.
[41] Developed by Alan Turing, the Turing test or “imitation game” is a method of determining whether a computer is capable of thinking and acting like a human, to the point it can fool a third-party interrogator.
[42] Toews, supra note 10.
 