TL;DR
Led the design, strategy, and implementation of an AI-powered knowledge assistant ("Excel Wiki Agent") that transforms how 10+ designers access institutional knowledge—reducing onboarding time by ~40% and saving an estimated 3 hours per designer per week through instant, cited answers to common workflow questions.
Key Outcomes
~40% Onboarding Time Reduction
90% Team Adoption Rate
2,147 Sessions in 4-Week Pilot
87% Query Resolution Rate
 

Context

The Copilot EXCEL Wiki agent is a purpose-built AI assistant designed to streamline the daily workflow of Excel designers. It centralizes knowledge, automates information retrieval, and proactively guides designers with actionable insights, references, and next steps—making them more efficient and up to date with Excel features and best practices.
 
The Excel Design team at Microsoft operates across multiple geographies (Redmond, IDC) with 10+ designers working on one of the most complex productivity applications in the world. Design knowledge was scattered across:
  • SharePoint portals and OneDrive folders
  • Loop workspaces with guidelines and how-tos
  • Figma files, libraries, and comment threads
  • Teams channels and meeting recordings
  • PowerPoint decks and Word documents
  • Azure DevOps work items

Pain Point

 
💬 Designer Friction Points (from 1:1 interviews)
"I spend 20+ minutes just finding the right Figma library for my feature."
"Every time there's a new hire, I become the human wiki for a month."
"I know we've solved this problem before, but I can't find where we documented it."
 
Problem Statement
How might we enable Excel designers to instantly access the right knowledge, resources, and guidance—without disrupting colleagues or searching through scattered systems?

Background & Motivation

 
The journey began with a simple directory of links—portals, files, presentations, Loop pages, Figma files—intended to help Excel designers quickly find resources. However, this approach was static, required heavy manual maintenance, and was dependent on others for updates. Recognizing these limitations, the vision evolved:
  • Initial Attempts:
    • Aggregated existing Wiki links and guidelines.
    • Experimented with AI Notebook in Copilot (limited by reference cap).
    • Tried grounding the agent on Loop workspace (blocked by compliance/features).
  • Key Insight:
    • Designers needed more than a directory—they needed an intelligent, context-aware agent that could answer nuanced questions, surface the latest resources, and reduce manual searching.

Brainstorming & Iterative Development

  • Research:
    • Conducted 1:1 and whiteboarding sessions with designers to gather real-world questions and expectations.
    • Collected and categorized prompts into five main buckets (e.g., design craft, process, handoff, accessibility, onboarding).
  • Pain Points Identified:
    • Existing Wiki was hard to maintain, lacked excitement, and didn’t scale.
    • Manual updates to SharePoint and Loop were tedious.
  • Solution Evolution:
    • Rebuilt the Wiki structure in SharePoint for compliance and scalability.
    • Hard-coded tables, links, and authored how-to guides for key workflows (icon creation, bug bash, research, etc.).
    • Iteratively fine-tuned the agent’s responses for clarity, relevance, and actionable output.
💡 Insight #1: Presentations Were a Bigger Pain Point Than Expected
Designers spent significant time preparing for reviews and LT presentations. The need wasn't just "where are templates" but "which past presentations landed well with stakeholders like Venkat?"
 
💡 Insight #2: "Find Prior Art" Was the Hidden Superpower
Senior designers wanted to reference how problems were solved before. This shifted the agent from a "link finder" to a "design memory" system.
 
💡 Insight #3: Trust Requires Citations
Designers would only trust agent answers if they could verify the source. This informed the TL;DR + Citations response format.

Development Phases


Attempt 1: Static Link Directory (Failed)

  • What: Excel spreadsheet with categorized links to resources
  • Why it failed: High maintenance burden; dependency on others to update; not searchable
  • Learning: Designers need answers, not links. The cognitive load of navigating links was the original problem.

Attempt 2: AI Notebook (Ceiling Hit)

  • What: Copilot's AI Notebook feature with ~20 reference documents
  • Why limited: The 20-reference ceiling was too restrictive for our knowledge base; it couldn't cover all theme areas
  • Learning: Need a more powerful grounding solution—Copilot Studio with enterprise connectors.

Attempt 3: Loop Workspace (Compliance Blocker)

  • What: Created a structured Wiki in a Loop workspace, hoping to ground the agent on it
  • Why blocked: Copilot Studio did not support grounding on a Loop workspace due to compliance/feature limitations; only SharePoint was accepted
  • Learning: Platform constraints matter. Had to migrate everything to SharePoint—a week of re-work.

Agent Capabilities & Instructions

The Copilot EXCEL Wiki agent is designed to be:
  • Helpful, Concise, and Actionable:
    • Answers questions on design craft, process, handoff, accessibility, onboarding, references, collaboration, quality, research, telemetry, and impact.
  • Source Prioritization:
    • Curated Knowledge Bank (canonical files/pages).
    • Enterprise sources: SharePoint/OneDrive → Figma → Teams/Loop → ADO → Calendar/Email (on request).
    • Org-wide knowledge graph (Azure AI Search, Viva Topics).
  • Evidence-Based Summarization:
    • Summarizes only from retrieved evidence.
    • Clearly states when information is missing or conflicting.
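The tiered source prioritization above can be sketched as a simple fallback loop. This is an illustrative model only, not the agent's actual Copilot Studio configuration; the retriever names and the "first grounded hit wins" policy are assumptions.

```python
# Hypothetical sketch of the tiered source-prioritization logic.
# Source names and ordering mirror the list above; the retriever
# interface is illustrative.
from typing import Callable, Optional

# Ranked sources: curated knowledge bank first, then enterprise
# sources, then the org-wide knowledge graph.
SOURCE_PRIORITY = [
    "knowledge_bank",
    "sharepoint_onedrive",
    "figma",
    "teams_loop",
    "ado",
    "knowledge_graph",
]

def answer(query: str, retrievers: dict[str, Callable[[str], Optional[str]]]) -> str:
    """Walk sources in priority order; return the first grounded hit with its citation."""
    for source in SOURCE_PRIORITY:
        retriever = retrievers.get(source)
        if retriever is None:
            continue
        evidence = retriever(query)
        if evidence:  # summarize only from retrieved evidence
            return f"{evidence} [{source}]"
    # Clearly state when information is missing rather than guessing.
    return "No grounded answer found; this topic may be missing from the knowledge bank."
```

The key property is that a lower-priority source is consulted only when every higher-priority source returns nothing, which keeps answers anchored to the curated knowledge bank whenever possible.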
  • Response Formatting:
    • TL;DR summary.
    • Numbered action steps.
    • References table (Title, Owner, Last Updated, Link).
    • Platform-specific notes (Win/Web/Mac/Mobile).
    • Inline citations and proactive next steps.
  • Tone & Style:
    • Friendly, concise, and designer-centric.
    • Uses bullets, tables, and short paragraphs for clarity.
 

Final Response Template

1. TL;DR — 1-2 sentence summary
2. Action Steps — Numbered, concise instructions
3. References Table — Title | Owner | Last Updated | Link
4. Inline Citations — [Source] after each key claim
5. Platform Notes — Win/Web/Mac/Mobile callouts where relevant
6. Next Step — Proactive suggestion or owner to contact
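As a concrete illustration, the six-part template above can be rendered from structured fields. The field names and function below are assumptions for demonstration; the real agent's format is defined in its prompt instructions, not in code.

```python
# Illustrative renderer for the six-part response template.
# Field names (tldr, steps, references, next_step) are assumptions.
def render_response(tldr, steps, references, next_step, platform_notes=None):
    """Assemble a response in the TL;DR / steps / references / next-step format."""
    lines = [f"TL;DR: {tldr}", ""]
    # Numbered, concise action steps.
    lines += [f"{i}. {step}" for i, step in enumerate(steps, 1)]
    # References table: Title | Owner | Last Updated | Link.
    lines += ["", "References:", "Title | Owner | Last Updated | Link"]
    lines += [" | ".join(ref) for ref in references]
    if platform_notes:  # Win/Web/Mac/Mobile callouts, only where relevant.
        lines += ["", f"Platform notes: {platform_notes}"]
    lines += ["", f"Next step: {next_step}"]
    return "\n".join(lines)
```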
 
| Feedback | Action Taken | Result |
| --- | --- | --- |
| Responses too verbose | Refined prompt instructions; added TL;DR requirement | Response time ↓34% |
| Missing accessibility docs | Authored 5 new a11y guides; added to knowledge bank | A11y queries resolved ↑ to 91% |
| Can't verify citations | Added clickable hyperlinks in references table | Trust score ↑ (qualitative) |
| Need Figma file search | Explored MCP/API integration (see the Figma Integration section below) | Parked for future (technical constraints) |

Figma Integration: MCP or API?

Designer feedback consistently requested the ability to search Figma files and access comment threads directly through the Wiki agent. This would eliminate the manual labor of updating the Wiki when designs change and enable queries like:
  • "Find the Checkout flow file and summarize recent comments"
  • "What components are used in the Dashboard redesign?"
  • "Show me files updated in the last week in the Charts project"
Two Approaches Evaluated
I explored two technical paths to integrate Figma with the Copilot agent:

Option A: Official Figma MCP Server

  • What it is: Figma's official MCP server (Dev Mode) for design-to-code workflows
  • Primary goal: Extract CSS/React props from a user-provided file link
  • Tools exposed: get_design_context, get_variable_defs, get_code_connect_map
  • Critical limitation: Link-based only; cannot browse team files or search projects. The user must provide a specific file URL first.
  • Missing capabilities: No list_team_projects, search_files, or get_file_comments tools

Option B: Custom Figma REST API Connector

  • What it is: Power Platform custom connector wrapping the Figma REST API with OAuth 2.0
  • Available endpoints: GET /v1/teams/:team_id/projects, GET /v1/files/:file_key/comments, etc.
  • Advantages: Can browse team files, search projects, read/post comments, list components
  • Blocking constraints: Requires an HTTPS-hosted MCP server with a valid certificate; rate-limiting concerns for team-wide queries; OAuth app registration with MS org compliance review
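To make the custom-connector path concrete, here is a minimal sketch against Figma's public REST API. The endpoints shown (team projects, project files, file comments) are real Figma REST endpoints; the token handling, IDs, and the search-by-name helper are illustrative assumptions, and a production version would sit behind the Power Platform connector with rate-limit handling.

```python
# Sketch of the custom-connector path using Figma's public REST API.
# figma_get() is a thin illustrative wrapper; IDs and token are placeholders.
import json
import urllib.request

FIGMA_API = "https://api.figma.com"

def figma_get(path: str, token: str) -> dict:
    """GET a Figma REST endpoint with an OAuth 2.0 bearer token."""
    req = urllib.request.Request(
        FIGMA_API + path,
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def find_files_by_name(project_files: list[dict], query: str) -> list[dict]:
    """Case-insensitive name match over a project's file listing."""
    q = query.lower()
    return [f for f in project_files if q in f.get("name", "").lower()]

# Intended usage (requires a valid token and real team/project IDs):
#   projects = figma_get("/v1/teams/TEAM_ID/projects", token)["projects"]
#   files = figma_get(f"/v1/projects/{projects[0]['id']}/files", token)["files"]
#   checkout = find_files_by_name(files, "checkout")
#   comments = figma_get(f"/v1/files/{checkout[0]['key']}/comments", token)["comments"]
```

This is what enables queries like "Find the Checkout flow file and summarize recent comments"; the blocking constraints listed above apply to hosting and compliance, not to the API calls themselves.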

Comparative Assessment

| Capability | Official MCP | Custom Connector |
| --- | --- | --- |
| Find files by name/search | ❌ No | ✅ Yes |
| Browse team projects | ❌ No | ✅ Yes |
| Read/post comments | ❌ No | ✅ Yes |
| Design-to-code context | ✅ Yes | ⚠️ Limited |
| Setup complexity | Low | High |
| Meets our use case | ❌ No | ⚠️ Possible but complex |
 
Neither option met our immediate needs with acceptable effort. The Official MCP lacks the required capabilities entirely. The Custom Connector requires infrastructure (HTTPS-hosted MCP server, OAuth app compliance review, rate limit management) that would take 4-6 weeks to implement properly. Given the strong baseline value already delivered by the SharePoint-grounded agent, we chose to park Figma integration for the "Run" phase.

Impact & Value

 
Quantitative Outcomes
| Metric | Before Agent | After Agent |
| --- | --- | --- |
| Onboarding time (new designer) | ~5 weeks | ~3 weeks (↓40%) |
| Weekly active users | N/A | 9/10 designers (90%) |
| Total pilot sessions | N/A | 2,147 sessions |
| Query resolution rate | N/A | 87% |
| Est. time saved per designer/week | N/A | ~3 hours |
| Avg. response time | N/A | <40 seconds |
 
Qualitative Outcomes
  • Reduced "pings": Senior designers report fewer interruptions for FAQ-type questions
  • Documentation culture: Team now contributes to Wiki knowing it feeds the agent
  • Consistent answers: Everyone gets the same canonical guidance, backed by sources
  • Stakeholder interest: Other Office app teams have expressed interest in similar agents

Challenges & Lessons Learned

 

What Worked Well

  • User research before building: The card sorting exercise revealed unexpected pain points (presentations) that shaped the MVP
  • Iterative response format tuning: Multiple rounds of prompt engineering led to the TL;DR + Citations format that designers trust
  • Phased rollout: Crawl-Walk-Run allowed learning without over-investing in wrong approaches
  • Stakeholder involvement: Regular team sync demos built buy-in and surfaced feature requests early

What I Would Do Differently

  • Earlier platform research: Would have discovered Loop workspace limitations sooner if I'd tested Copilot Studio grounding earlier
  • Baseline metrics: Should have measured onboarding time more rigorously before launch for cleaner comparison
  • Figma integration PoC earlier: Starting the MCP exploration in parallel would have given more time to validate the approach

Future Vision

Near-Term
  1. Expand knowledge base with recently authored content (12+ new docs in pipeline)
  2. Roll out to the wider Excel design org (Redmond-based designers)
  3. Implement thumbs-up/down feedback mechanism for response quality tracking
Medium-Term
  1. Figma integration PoC using Cursor/Lovable for a custom MCP connector
  2. Teams meeting transcript search (where permitted by compliance)
  3. Prompt library / conversation starters based on the most common queries
Long-Term Vision
"How is my day looking?"
The ultimate vision: a designer asks this one question and receives a synthesized brief including:
  • Key emails and mentions requiring attention
  • Today's meetings with links to prep docs
  • Figma files updated overnight + unread comments
  • ADO work items requiring design input
  • Suggested priorities based on deadlines and dependencies

Conclusion

The Copilot EXCEL Wiki agent represents a leap forward in how Excel designers access, use, and contribute to organizational knowledge. By combining user-centered design, enterprise-grade knowledge management, and AI-powered assistance, it sets a new standard for efficiency and collaboration in design teams.