GS-12 · Qualifications & FMLA · Self-paced · 4 Phases
Copilot for M365 is not a search engine and it does not know federal law. It is an AI assistant that reads and works with your own content — your emails, your Word documents, your Teams meetings, your SharePoint files — within your government tenant.
Think of Copilot as a very capable assistant who has read only what's in your office filing cabinet — and nothing else. Specifically, it can access the content you already have permission to see in your government tenant: your Outlook email and calendar, your Word documents, your OneDrive and SharePoint files, and your Teams chats and meeting transcripts.
It cannot access other agencies' systems, the internet, or any database it hasn't been connected to.
1. You need to verify whether an employee qualifies for FMLA. You ask Copilot "Does this employee qualify for FMLA?" What should you expect?
2. Which of these is a realistic, appropriate use of Copilot for an HR Specialist?
3. Where does Copilot for M365 get the information it uses to help you?
Copilot appears in Outlook in two main ways: summarizing email threads and helping you draft replies.
Summarizing a thread: Open a long email thread. Look for the Copilot icon or "Summary by Copilot" at the top of the thread. Click it and Copilot will give you the key points without making you read every message.
Drafting a reply: In a message, click Reply, then look for the Copilot icon in the compose toolbar. You can ask it to draft a reply and give it a direction — formal, brief, detailed. Always read and edit before sending.
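A reply prompt can carry both the direction and a guardrail in one line. The wording below is illustrative, not template language:

```text
Draft a brief, professional reply confirming I received the documents
and will follow up by Friday. Do not add any details I didn't mention.
```

Copilot will offer the draft for review before anything is sent; you still read and edit it.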
The most useful Teams feature for HR work is meeting recap. After a Teams meeting that was recorded or transcribed, Copilot can give you a summary of what was discussed, decisions made, and action items.
You can also use Copilot during a live meeting to catch up if you joined late — click the Copilot icon and ask "What have I missed?"
Note: Meeting transcription must be enabled for this to work. If it isn't available, check with your IT or supervisor.
In a Word document, you'll see a Copilot icon in the left margin or in the toolbar. You can ask Copilot to draft a section, rewrite something for clarity, or summarize a long document you're reading.
For HR correspondence, the most useful approach is to give Copilot the structure and let it fill in language — then edit for accuracy and policy compliance.
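A structure-first prompt for a Word draft might look like this. Every detail shown is a placeholder for your own facts:

```text
Draft a letter with three sections: (1) acknowledge the employee's
leave request, (2) explain the next step and its deadline, (3) provide
HR contact information. Use only the facts I have listed below.
Keep it under one page.
```

Because you supplied the structure and the facts, your editing pass is a check for accuracy and policy compliance rather than a rewrite.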
In Excel, Copilot can help you analyze data, add columns, filter, and create summaries. For HR work this might mean sorting case tracking data, identifying patterns, or generating a quick summary of a dataset.
Note: Copilot in Excel works best when your data is formatted as a Table (Insert → Table). If your data isn't in a Table, Copilot may not be able to work with it.
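Once your data is in a Table, requests can reference columns by name. The column names below are illustrative, not from any real tracker:

```text
Add a column that calculates the days elapsed between "Request Date"
and "Notice Sent Date," then highlight any row where that number
is greater than 5.
```

If Copilot reports it can't work with the data, the most common cause is that the range was never converted to a Table.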
1. You open a 40-message email thread about an employee's case. What is the most efficient first step using Copilot?
2. You joined a Teams meeting 15 minutes late. Copilot can help you by:
3. Why does Copilot in Excel work best when data is formatted as a Table?
FMLA administration involves legal determinations, strict timelines, and employee rights under federal law (29 CFR Part 825). Errors — missing a 5-business-day notice deadline, mischaracterizing a serious health condition, incorrect eligibility determination — carry real consequences for employees and for the agency.
Copilot is a drafting and organizing tool. It does not know the FMLA regulations unless you put them in front of it. It cannot tell you whether an employee is eligible. That determination is yours, based on the law.
1. Organizing case notes
If you've taken notes in a Word document or Teams chat about a case, Copilot can help you organize and format them into a chronological case summary.
2. Drafting standard notices
FMLA has several required notices: the eligibility notice, rights and responsibilities notice, and designation notice. Copilot can help you draft these using language you provide from your agency's approved templates.
3. Summarizing long email chains on a case
FMLA cases often involve many emails over weeks. Copilot can summarize the thread and pull out key dates and actions.
4. Checking your own writing
Paste a draft letter and ask Copilot to check it for clarity, neutral tone, or consistency.
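For the first use above, a case-summary prompt might read as follows. The constraints at the end do the real work; the wording is illustrative:

```text
Organize these notes into a chronological case summary with the date,
event, and action taken for each entry. Use only what is in the notes.
Flag any entry that is missing a date.
```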
1. An employee submits an FMLA request. You ask Copilot: "Is this employee eligible for FMLA?" What is the problem with this approach?
2. Which of these is an appropriate use of Copilot in FMLA case management?
3. You want Copilot to draft a designation notice. What should you do to get a useful, accurate result?
Copilot produces output based on what you ask and what context you give it. A vague prompt produces a vague result. A specific prompt with clear constraints produces something actually useful.
You are now doing this after learning the tool — which is the right order. Prompts only make sense when you understand what Copilot can and can't do.
The most important habit to build: always tell Copilot what not to do. This prevents it from inventing details, adding information you didn't provide, or drifting outside what you need.
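A constraint can be a single sentence appended to any drafting prompt, for example:

```text
Use only the information I have provided. If something is missing,
insert [NEEDS INFO] rather than guessing.
```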
Don't start over. Refine. Tell Copilot what was wrong:
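Refinement prompts are short and direct. Two illustrative examples:

```text
The tone is too formal. Rewrite the second paragraph in plainer
language and remove the phrase "per our conversation."
```

```text
You added a date I did not give you. Remove it and leave a
placeholder instead.
```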
1. Which prompt is more likely to produce a useful result?
2. Copilot drafts a letter but includes a detail you didn't provide and that may be incorrect. What should you do?
3. What is the purpose of including constraints in a prompt (like "do not add information I didn't provide")?
Out of the box, Copilot writes the way AI writes — formal to the point of being stiff, padded with filler phrases, and generic. For HR correspondence that has to sound like it came from a person, this creates extra work. The fix isn't to edit every draft — it's to give Copilot your voice before it starts writing.
Create a Word document or OneNote page titled something like My Copilot Style Instructions. Keep it open whenever you're working. Copy and paste the style block at the start of any prompt that involves drafting.
Here is a starting point built for federal HR work. Edit it to match how you actually write:
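One possible starting point is below. Every line is illustrative, not agency-approved language; edit it until it sounds like you:

```text
STYLE INSTRUCTIONS (apply to everything you draft for me):
- Write in plain language. Short sentences, one idea per sentence.
- Neutral, respectful tone. No exclamation points.
- No filler phrases such as "please be advised," "in order to,"
  or "at your earliest convenience."
- Do not add facts, dates, or names I have not provided.
  Use [BRACKETED PLACEHOLDERS] for anything missing.
```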
Paste that block at the top of any drafting prompt, then add your specific request below it. Copilot will apply the style instructions to everything it writes in that session.
If you find yourself deleting the same phrases repeatedly, add them explicitly to your style prompt:
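A banned-phrase line can simply be appended to the style block. The phrases below are common examples; substitute the ones you actually keep deleting:

```text
Never use these phrases: "please be advised," "pursuant to,"
"do not hesitate to," "I hope this email finds you well."
```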
Build this list over time as you notice patterns in what Copilot produces that you always change.
The simplest system: keep a file called Copilot Prompts.docx on your SharePoint or OneDrive. One section for your style block, another for your most-used prompt templates (FMLA acknowledgment, eligibility notice, etc.). Open it alongside whatever you're working on. This becomes your personal prompt library — no special technology needed.
Some M365 tenants — depending on the agency's Copilot license and IT policy — include access to Copilot Studio, which allows building a custom agent with your style instructions baked in permanently. You would not need to paste your style block every time.
Ask your IT helpdesk a direct question along these lines: "Does our Microsoft 365 Copilot license include access to Copilot Studio, and am I permitted to build a personal agent?"
If the answer is yes and it's available to you, a custom agent can hold your style instructions, your most-used prompt templates, and even links to your agency's approved FMLA template language — so every draft starts from your baseline, not Copilot's default.
1. What is the most practical way to make Copilot drafts sound more like you, without needing IT involvement?
2. You keep deleting the phrase "please be advised" from every Copilot draft. What's the most efficient fix?
3. What does Copilot Studio allow that a standard personal style prompt does not?
When Copilot is new, you use it carefully. Once it becomes a daily habit, it's easy to use it the way you use any other tool — quickly, without thinking twice. That's when professional and privacy risks appear.
This isn't about distrust. It's about applying the same professional standards to Copilot that already govern your email, your case files, and your phone calls.
Microsoft 365 Copilot has audit logging built in. In a federal agency tenant, it is reasonable to assume that your prompts and Copilot's responses are logged at the administrator level. This is standard enterprise IT practice and consistent with federal records management requirements.
You don't need to change how you use Copilot for legitimate work tasks. The boundary is the same one that already applies to all your work communications: if you wouldn't put it in a work email, don't type it into a prompt.
As an HR Specialist, you handle information protected under the Privacy Act of 1974. That protection doesn't pause when you're using an AI tool. Any employee information you put into a Copilot prompt — names, medical details, leave history, disciplinary records — is subject to the same handling rules as that information in any other context.
Instead of using real names and case details in prompts, use placeholders. Fill in the real information after Copilot produces the draft.
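For example, instead of naming the employee, a prompt might read as follows. Every bracketed item is a placeholder you fill in after the draft comes back:

```text
Draft an FMLA eligibility notice for [EMPLOYEE], who requested leave
on [DATE] for [REASON CATEGORY]. Use the template language below.
Do not add any other details.
```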
This produces the same quality output, keeps real employee information out of the prompt log, and takes about five seconds to adapt.
Your government M365 account and its Copilot license are for official use. Using it for personal tasks — writing personal letters, helping with personal financial questions, anything unrelated to your job — is outside the terms of your government IT use agreement and potentially a federal IT policy violation.
1. You're frustrated with an employee who you believe is misusing FMLA. Is it appropriate to type that frustration into a Copilot prompt to help you "think it through"?
2. You need Copilot to draft an FMLA denial letter for a specific employee. What is the correct approach?
3. Which of these is an appropriate use of your government Copilot license?