Wednesday, August 13, 2025

Governing Feedback Data Sharing in Microsoft 365 Apps and Copilot



Microsoft treats user feedback as confidential and uses it to improve the product experience. That does, however, involve real people reading what’s submitted. For many organizations that’s fine; for others it’s something they want to control—particularly when feedback is sent from Copilot chat, where a message might contain sensitive business context.

Good news: you can govern what can (and cannot) be sent in the feedback dialog.

Note: Prompts and responses used inside a Copilot conversation are not viewed by Microsoft. What I’m covering here is only the optional feedback form users can submit.

Where to configure Feedback policies


These settings live in the Microsoft 365 Apps admin center as Cloud Policy:

  • Go to config.office.com → Customization → Policy Management.

  • Create or edit a policy configuration for Microsoft 365 Apps and scope it to the users/groups you want.

  • Search for Feedback to find the policies below.

Official docs for Cloud Policy: https://learn.microsoft.com/microsoft-365-apps/admin-center/overview-cloud-policy.

These policies apply to apps and web experiences that use the standard “Send feedback to Microsoft” UX from Microsoft 365 Apps (including Copilot surfaces that use that dialog). Teams uses its own policy model for feedback, so manage Teams separately.

Also important: these settings are not restrictive by default. Unless you explicitly set them to Disabled, users can include screenshots, logs, and content samples with their feedback.

See https://learn.microsoft.com/en-us/microsoft-365/admin/manage/manage-feedback-ms-org for the list of products covered by feedback policies set at config.office.com.

The four knobs that matter (set to Disabled to restrict)

I’m listing them in the order I recommend you evaluate them.

1) Allow Microsoft to follow up on feedback submitted by users

  • What changes: the small text at the bottom of the dialog that says Microsoft may contact the user.

  • Why it matters: disabling it removes the follow-up consent, so Microsoft won’t contact users about their feedback. Many orgs prefer no outbound contact to end users.

2) Allow users to include screenshots and attachments when they submit feedback to Microsoft

  • What changes: hides the Include a screenshot control in the form.

  • Why it matters: screenshots often contain customer content.

  • Gotcha: this does not cover log/content attachments (that’s the next policy).

3) Allow users to include log files and relevant content samples when feedback is submitted to Microsoft

  • What changes: removes the option that shares the prompt, generated response, relevant content samples, and additional log files with the feedback.

  • Why it matters: this is the big one for Copilot. When enabled, users can (and by default will) include parts of the conversation and context data. Setting it to Disabled prevents those from being sent.

4) Allow users to submit feedback to Microsoft

  • What changes: blocks the feedback dialog from appearing at all.

  • Why it matters: the nuclear option. If you don’t want any feedback sent from the product UI, turn this off.

What users will see

Here’s how the UI shifts as you apply the policies:

  • With default settings, users can add a screenshot and (by default) include prompt/response + logs/content samples.


  • Disable follow-up → the small “Microsoft may contact you…” text is removed/changed.       


  • Disable screenshots → the screenshot checkbox/button disappears (users cannot attach images), but log/content sharing may still be available unless you disable it too.


  • Disable log files and content samples → the “Share prompt, generated response, … and additional log files” option is removed, so no conversation context is shared.


  • Disable submit feedback → the dialog doesn’t show.
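To keep the four policies straight, here’s a tiny Python sketch of the policy-to-dialog mapping described above. It’s an illustrative model only: the parameter and element names are my own shorthand, not official setting names or an API.

```python
def feedback_dialog(allow_feedback=True, allow_followup=True,
                    allow_screenshots=True, allow_logs_and_samples=True):
    """Model of the feedback dialog: each argument mirrors one Cloud Policy
    (True = default/allowed, False = set to Disabled). Returns the set of
    dialog elements a user would see, or None if the dialog is blocked."""
    if not allow_feedback:
        return None  # "nuclear option": the dialog doesn't show at all
    elements = {"text_box"}  # the free-text feedback field always remains
    if allow_followup:
        elements.add("contact_me_notice")        # "Microsoft may contact you…"
    if allow_screenshots:
        elements.add("include_screenshot")       # screenshot control
    if allow_logs_and_samples:
        elements.add("share_prompt_response_and_logs")  # the big one for Copilot
    return elements

# Defaults: everything is available.
print(feedback_dialog())
# Screenshots disabled, but log/content sharing still on (the gotcha from policy 2):
print(feedback_dialog(allow_screenshots=False))
```

Note how disabling screenshots alone still leaves `share_prompt_response_and_logs` in the result, which is exactly why policies 2 and 3 should be evaluated together.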

Recommended approaches

Pick the level that matches your risk tolerance:

  • Leave as is: The defaults let Microsoft capture valuable feedback to improve products and experiences for the people using them.

  • Balanced: Disable screenshots and log files/content samples, allow feedback, and optionally disable follow-ups.

  • Strict: Disable screenshots, log files/content samples, and follow-ups.

  • Locked down: Disable submit feedback entirely.
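If it helps, the three restrictive profiles can be written down as checklists of the four policies, where False stands for “set to Disabled.” Again, the keys below are shorthand I invented for this sketch, not official policy names.

```python
# Each profile maps my shorthand policy keys to allowed (True) / Disabled (False).
BALANCED = {"submit_feedback": True, "followup": True,
            "screenshots": False, "logs_and_samples": False}
STRICT = {**BALANCED, "followup": False}          # Balanced, plus no follow-ups
LOCKED = {**STRICT, "submit_feedback": False}     # no feedback dialog at all

def can_leak_content(profile):
    """Conversation content can only leave via feedback if the dialog is
    available AND screenshots or logs/content samples can be attached."""
    return profile["submit_feedback"] and (
        profile["screenshots"] or profile["logs_and_samples"])

for name, p in [("Balanced", BALANCED), ("Strict", STRICT), ("Locked down", LOCKED)]:
    print(name, "- content can be attached:", can_leak_content(p))
```

All three restrictive profiles block content sharing; they differ only in whether plain-text feedback and follow-up contact are still possible.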

If you do allow feedback, consider disabling the log files and relevant content samples option for Copilot users. Microsoft handles this data securely, but turning the option off ensures that no conversation snippets or contextual information are included in feedback, which some organizations prefer for peace of mind or to align with their internal data handling practices.

Final thoughts

I work at Microsoft, and I know there are actual people reading feedback to make our products better. That’s a feature, not a bug—but each organization has different requirements for customer data. With Cloud Policy you can decide what’s appropriate for your tenant, from light filtering to full lockdown.

Again, this doesn’t change how Copilot processes prompts during normal use; those aren’t viewed by Microsoft. We’re only talking about the separate, optional act of sending product feedback, and letting you govern that to your level of comfort is exactly why these options exist.

If you’ve got a mixed environment (e.g., Teams), remember to set feedback controls where that app expects them.

Happy governing!