1. Purpose
This policy sets out Deacon Marriner LLP’s approach to the responsible use of artificial intelligence (AI) in the delivery of Principal Designer duties under the Construction (Design and Management) Regulations 2015 (CDM).
The aim is to ensure AI tools are used to improve efficiency, quality, and record-keeping, while safeguarding statutory responsibilities under CDM, wider health and safety duties under the Health and Safety at Work etc. Act 1974 (HSWA), and obligations under UK data protection law (the UK GDPR and the Data Protection Act 2018).
2. Scope
This policy applies to all AI tools used by the firm in support of Principal Designer functions, compliance audits, and related administrative tasks. It covers both directly procured systems (e.g. AI Solutions’ CDM Toolkit) and licensed third-party platforms (e.g. ChatGPT Business, Otter.ai, Fathom, Cortana, bespoke GPTs).
3. General Principles
- AI is an assistant, not a substitute for professional judgement and competence. The Principal Designer retains full responsibility for all statutory CDM duties.
- All AI outputs must be reviewed and validated before being relied upon in professional advice or reports.
- Confidential and sensitive project data may only be uploaded where risks have been assessed and judged acceptable.
- The firm will maintain a central AI Register listing all tools in use, their purpose, risk classification, and due diligence status.
- The firm opts out of sharing data for model training on all tools, to protect client confidentiality.
4. Categories of Use
Non-material impact uses (low risk)
- Meeting support: Drafting and maintaining structured notes, action trackers, and reminders from project meetings.
- Background research: Gathering publicly available information on legislation, best practice, sustainability standards, and industry news.
- Report formatting: Improving grammar, readability, and layout of reports and audits without altering technical conclusions.
- Knowledge support: Providing overviews of technical frameworks or regulatory changes for rapid familiarisation.
Reasoning: These activities support efficiency but do not replace professional design risk management or CDM compliance responsibilities.
Material impact uses (managed with controls)
Examples include:
- CDM Toolkit (AI Solutions) – supporting risk management, statutory compliance, and design coordination.
- Bespoke GPT (Business edition) – used for research, peer review of compliance reports, audits, and maintaining structured records of actions and decisions.
Controls:
- Documented supplier due diligence (data protection, compliance, environmental considerations).
- Risk assessment completed before uploading any confidential or sensitive data.
- Outputs logged in project records where relevant.
- Final responsibility always rests with the Principal Designer.
Prohibited uses
- Delegating design risk decisions or CDM compliance judgements to AI tools.
- Uploading confidential project or client data without prior anonymisation and documented risk assessment.
- Using AI outputs uncritically or without human validation.
5. Data Protection & Confidentiality
- Confidential data must be anonymised before upload unless a documented risk assessment concludes that the residual risk is acceptable.
- Use of AI must comply with GDPR and the firm’s existing data protection policies.
- Only the minimum necessary data should ever be shared with AI tools.
6. Due Diligence & Risk Management
- Each AI tool will be assessed for:
  - Purpose and scope of use.
  - Data protection and GDPR compliance.
  - Risks of bias, inaccuracy, and over-reliance.
  - Environmental and stakeholder impacts (where relevant).
- Outcomes are recorded in the firm’s AI Register.
7. Governance & Training
- Staff using AI must be trained on:
  - The distinction between material and non-material uses.
  - Risks of bias, “hallucination” (plausible but false output), and over-reliance.
  - Confidential data handling.
- This policy will be reviewed annually, or sooner if regulations or industry standards change.
8. Record Keeping
- AI Register maintained and updated for all tools.
- Risk assessments completed and stored where tools have material impact.
- Evidence of “reasonable steps” retained for audit or external inspection.
9. Accountability
- The Principal Designer retains full professional accountability.
- AI is a support mechanism, not a replacement for human expertise or decision-making.
Reviewed by: Katrina Deacon MAPM IMaPS NEBOSH
October 2025