
Mind Over Machine

Think Independently – Sep 14, 2025

“Small is the number of them that see with their own eyes and feel with their own hearts.”
– Albert Einstein

 

In today’s fast-moving digital world, artificial intelligence (AI) is transforming how we work, learn, and make decisions. For board directors, AI tools like ChatGPT and other generative platforms offer unprecedented convenience – summarizing lengthy reports, generating questions, and even drafting board materials.

But as these tools become more embedded in boardroom routines, a critical question arises. Are directors outsourcing their thinking – and what does that mean for governance, accountability, and due diligence?

This blog post explores the crucial importance of independent thinking in the age of AI. It’s a timely reminder that while technology can support directors, it must never replace their judgment. As Einstein suggested, seeing with your own eyes – and feeling with your own heart – is rare but essential. Especially now.

In boardrooms everywhere, a quiet shift is taking place. It’s not about strategy, market disruption, or financial performance – at least not directly. It’s about how directors prepare for the decisions that will shape their organizations.

 

Independent Thinking – The 5th Key Habit of the Savvy Director

At DirectorPrep, we believe that great directors aren’t born – they’re developed through habits that strengthen their ability to serve effectively. In our Savvy Director Framework, Independent Thinking is the 5th key habit.

Independent Thinking is about more than forming your own opinions.

You can read more about Key Habit #5 – Think Independently. For now, let’s appreciate that AI makes this habit more important than ever. Why? Because AI can subtly steer what we see – and therefore what we think – without us even realizing it.

 

The Risk of Outsourcing Thinking to AI

The greatest danger of AI in the boardroom isn’t that it will replace directors; it’s that directors will replace themselves by outsourcing their thinking.

When directors rely wholly on AI-generated summaries instead of reading the material themselves, they risk:

  • Losing nuance. Summaries can’t convey tone, emphasis, or subtle cues that often signal deeper issues.
  • Missing inconsistencies. Only by reading and cross-referencing can you spot contradictions between sections.
  • Eroding critical thinking discipline. The less you practice, the weaker the skill becomes.
  • Failing fiduciary duty. Delegating comprehension to a machine is not an acceptable defense if a decision is later challenged.

The slippery slope is real. Once a director starts letting AI decide what’s important, the director’s own independent lens begins to fade.

 

The New AI Reality in Board Materials

Imagine this scenario. A director receives a complex financial report ahead of a board meeting. Instead of reading it, they upload it into an unregulated chatbot and skim the summary. At the meeting, they nod along, ask a few surface-level questions, and vote on a major decision.

What’s missing?

  • The director hasn’t engaged with the full context.
  • They’ve missed subtle signals – tone, emphasis, contradictions.
  • They’ve outsourced their thinking.

This raises a serious governance issue. Can a director fulfill their legal obligation for due diligence if they haven’t read the materials themselves? And if their preparation relied on AI, how is that captured in the board minutes? (Keep reading for ideas about how to record the use of AI in the minutes.)

Until recently, most directors assumed that the board and committee materials they received were entirely human-generated. Today, that’s no longer a safe assumption.

Management teams are under pressure to be efficient and concise. AI is an obvious tool to help draft reports, conduct analysis, and prepare summaries. After all, the use of AI can often improve clarity and readability, reduce repetitive manual work, and produce faster turnaround times.

But using AI also comes with risks, including:

  • Oversimplification. Cutting nuance and subtlety that may be critical to a decision.
  • Bias. AI models reflect the biases in their training data.
  • Omissions. Important but less obvious details may be left out entirely.

The challenge for directors is to distinguish between beneficial AI assistance and AI overreach. Regardless of how the material was prepared, or whether you’re a volunteer or a paid director, your legal duty of due diligence is the same.

 

Due Diligence in the AI Era

Directors have a legal obligation to exercise due diligence in decision-making. That means:

  • Reading the materials thoroughly
  • Asking relevant, probing questions
  • Understanding the issues
  • Seeking additional information when needed
  • Challenging assumptions and testing conclusions
  • Making informed judgments

Ultimately, AI doesn’t bear responsibility. Directors do.

Boards should consider developing policies around AI use, clarifying what’s acceptable, what’s not, and how it should be disclosed. This protects the integrity of the board and ensures that decisions are made with care.

The legal duty of care requires directors to make informed decisions, exercising the same level of care that a reasonably prudent person would in similar circumstances.

If a report or analysis was prepared, in whole or in part, by AI, you are still accountable for the decision you make based on it. The fact that “AI said so” is not a defense.

In fact, AI involvement may increase the diligence required, because directors must assess not only the content but also the process used to create it. That means asking:

  • Who prepared this report?
  • Was AI used in drafting or analysis?
  • If so, how was the AI tool selected and governed?
  • Were the AI’s outputs verified by a qualified human?

This isn’t about mistrusting management. It’s about ensuring that the decision-making process meets the legal and ethical standards expected of you as a director.

 

Capturing Due Diligence in the Minutes

Minutes are the official record of the board’s deliberation and decision-making. They should reflect the process, not just the outcome. They’re a critical piece of evidence if a board decision is ever challenged. The minutes can show that the board recognized and addressed the unique risks associated with AI-generated content.

If directors rely on AI to prepare, that use should be transparent, especially if it influences their judgment. It may be appropriate to put a general note in the minutes that AI tools were used to support board preparation, but not to replace it.

If a report or analysis considered by the board was AI assisted, minutes might record:

  • Disclosure by management of AI involvement in preparing the material
  • Questions asked by directors to clarify how the AI was used and whether its outputs were verified
  • Any steps taken to seek additional information or independent advice
  • The board’s discussion and deliberation before making the decision

This creates a clear evidentiary trail showing that directors actively engaged with the material, rather than passively accepting it.

 

The Director’s Thinking Partner

The solution is not to ban AI from your board toolkit – it’s to use AI as a partner, not a proxy.

Generic, public AI tools may be useful for broad, non-confidential questions, but they’re not designed with a director’s governance duties in mind. Many free AI tools store your inputs, may use them to train future models, and generate outputs without clear accountability for accuracy.

By contrast, DirectorPrep’s ChatDPQ is purpose-built for board directors and those who support their work. It’s a custom-coded, governance-aligned “thinking partner” that:

  • Opens up your thinking, prompting new angles, perspectives, and questions you may not have considered.
  • Enhances relevant insights, analyzing content for what’s already there and identifying what’s missing.
  • Provides instant ‘Learning in the Moment’ on governance concepts, board processes, training topics, and director responsibilities.
  • Operates within a secure environment. You’re in full control.
  • Delivers safe, reliable responses, governed by Responsible AI principles, with coded guardrails to avoid “making stuff up.”
  • Aligns with legal duties, supporting preparation rather than replacing it, and reinforcing your independence of mind.
  • Respects confidentiality. It can be used without fear of uploading sensitive board documents to unregulated platforms.

If ChatDPQ doesn’t know the answer, it says so – a critical distinction from many public tools that will confidently hallucinate, guess, or bluff.

 

Seeing With Your Own Eyes

Einstein’s words echo through today’s governance environment: “Small is the number of them that see with their own eyes and feel with their own hearts.”

Your greatest value in the boardroom is not just the expertise you bring; it’s your willingness to engage deeply, challenge assumptions, and form your own conclusions.

AI can be a powerful ally when used responsibly. Tools like ChatDPQ can expand your thinking, strengthen your preparation, and help you fulfill your legal duties without compromising confidentiality or independence.

But no AI can replace the human director who reads carefully, thinks critically, and decides with integrity.

So, as you prepare for your next meeting, ask yourself:

  • Am I seeing this with my own eyes?
  • Is AI helping me think better, or think less?

Let’s use technology wisely. Let’s prepare with purpose. And let’s never forget the value and power of independent thinking.

 

Your Takeaways:

Here are some practical tips for directors:

  • Read the source meeting material – don’t rely solely on AI summaries.
  • Use AI to clarify, not to replace your own thinking.
  • Consider your own questions, then consult AI for perspective.
  • Discuss AI use openly with fellow directors.
  • Encourage your board to develop shared norms around using AI tools.

AI can be a powerful ally when used with intention and integrity.

 

Note: Both ChatDPQ and Microsoft Copilot were used to create an outline for this week’s edition of The Savvy Director. Further prompts in ChatDPQ provided new information on the legal ramifications of using AI in my board prep. I’m grateful for the insights on how to record the use of AI in the board meeting’s minutes. How-to questions about board minutes are popular with many DirectorPrep members.

 


Thank you.

Scott

Scott Baldwin is a certified corporate director (ICD.D) and co-founder of DirectorPrep.com – an online membership with practical tools for board directors who choose a learning and growth mindset.

We Value Your Feedback: Share your suggestions for future Savvy Director topics.

 
