Generative AI has moved from future consideration to a clear boardroom priority for legal teams. Using generative AI inside a company carries significant risk, and boards are looking to legal counsel to understand and manage it.

I recently participated in a webinar on what GCs need to know about AI transformation in 2026, where I was joined by two expert panelists with deep, high-profile experience shaping AI, privacy, and governance at some of the world’s most recognized companies.

  • Jim Shaughnessy is the Chief Legal Officer at Docusign, where he oversees the company’s global legal and corporate affairs functions. He has more than 30 years of experience, including a decade in senior leadership roles at Workday.

  • Nuala O’Connor is a global Fortune 10 technology governance leader and former SVP and Chief Counsel for Digital Citizenship at Walmart, where she led both legal and operational AI governance. She previously developed Amazon’s early global privacy and compliance strategy and served as the first Chief Privacy Officer at the U.S. Department of Homeland Security, among other roles.

They offered practical, actionable advice for GCs navigating AI.

10 Tips for GCs To Lead AI Transformation in 2026

  1. Create a Safe Test Environment for Experimentation (With Reminders and Guidance)

    Give employees a sanctioned place to try generative AI so experimentation happens inside your controls, not out on public models. Couple the sandbox with short, role-specific rules and just-in-time reminders in the interface so people get guidance where they work.

    100%, the ‘walled garden approach’ is the right step for where we are right now in the evolution of this technology.

    Nuala O’Connor
    Formerly Walmart
  2. Give Employees AI Tools They'll Actually Use

    If your team lacks easy, approved tools, employees will reach for consumer apps on their phones. As our whitepaper on AI maturity found, surveys show that a large percentage of employees knowingly use AI improperly and will turn to banned AI tools.

    If you don’t, they’ll use their phones outside the firewall and they’ll put in sensitive data … and it’s gone.

    Jim Shaughnessy
    Chief Legal Officer at Docusign
  3. Address Fear Directly and Honestly

    AI adoption is emotional: people worry about job security. Make sure you confront those concerns openly.

    The people in the legal and compliance and operations departments who were the most tech savvy still had discomfort. I said, ‘As long as you’re willing to work hard and learn new things, there will always be a job for you.’

    Nuala O’Connor
    Formerly Walmart
  4. Keep Humans in the Loop for All Consequential Outputs

    Treat generative AI as a drafting assistant, not an approver; require human review for anything that has legal, regulatory, or external impact. Put approval gates and clear sign-off rules in place so responsibility is never ambiguous.

    For things that matter, you have humans in the loop [for AI output] in the right time, in the right place … think of it as a good first draft.

    Jim Shaughnessy
    Chief Legal Officer at Docusign
  5. Ask Vendors the Hard Questions

    Vendor demos can dazzle, but look for proof. Demand evidence on data provenance, auditability, hallucination controls, update cadence, and governance before procurement.

    Neither be afraid nor bamboozled. Ask the hard questions, ask the dumb questions.

    Nuala O’Connor
    Formerly Walmart
  6. Know When Not To Use Generative AI

    Generative models are probabilistic and can give different answers to the same prompt. For workflows that must return identical results every time, use deterministic or retrieval-based systems instead. Map use cases by how much repeatability they require and choose the right technology accordingly.

    If you need the same answer every single time, you want AI, but you don’t need it to be generative.

    Nuala O’Connor
    Formerly Walmart
  7. Treat Hallucinations as Legal Liabilities

    Generative models can create false citations and invented facts. These hallucinations can create reputational or legal exposure. Test for them, require human fact-checking, and log outputs so you can trace and correct errors.

    And it happens more often than people would think.

    Jim Shaughnessy
    Chief Legal Officer at Docusign
  8. Set a Clear, Memorable Rule for AI Inputs

    Protect sensitive information by enforcing a simple input rule. If you wouldn’t want it published or leaked, don’t paste it into AI.

    Don’t put anything there that you don’t want to see on the front cover of the New York Times.

    Nuala O’Connor
    Formerly Walmart
  9. Prepare for a Multi-Agent Future

    Expect a diverse ecosystem of agents and embedded AI across vendors and applications. Governance will need to scale across your entire AI stack, with clear ownership, data flow mapping, and oversight.

    You're going to have agents from Salesforce or agents from Workday or agents from Docusign ... interacting with another system and it may end up being evaluated by other agents. There's going to be a lot of complexity in these systems.

    Jim Shaughnessy
    Chief Legal Officer at Docusign
  10. Start Small With Low-Risk Internal Agents

    Pilot low-risk agents like knowledge-retrieval or contract lookup tools to deliver quick value and build governance muscle memory. Use those pilots to refine accuracy requirements, review workflows, and adoption metrics before expanding.

    The low-risk internal … ‘Has this contract been written before?’ … ‘What’s our policy on X?’ … I think it’s a great way to start out.

    Nuala O’Connor
    Formerly Walmart

For more in-depth insights, examples, and discussion from the panel, watch the full webinar.


Author

Ben Tingo

Chief Legal Officer

Ben Tingo is Chief Legal Officer at Casepoint, where he shapes the company’s legal strategy and its use of AI. Before joining the company in 2016, he spent a decade in public and private practice as a civil and criminal defense attorney, representing both individual and corporate clients across diverse practice areas. Ben holds a J.D. from Brooklyn…
