# Power Automate Governance with FlowStudio MCP

Classify, tag, and govern Power Automate flows at scale through the FlowStudio MCP cached store — without Dataverse, without the CoE Starter Kit, and without the Power Automate portal.
This skill uses `update_store_flow` to write governance metadata and the monitoring tools (`list_store_flows`, `get_store_flow`, `list_store_makers`, etc.) to read tenant state. For monitoring and health-check workflows, see the `flowstudio-power-automate-monitoring` skill.
Start every session with `tools/list` to confirm tool names and parameters. This skill covers workflows and patterns — things `tools/list` cannot tell you. If this document disagrees with `tools/list` or a real API response, the API wins.
## Critical: How to Extract Flow IDs

`list_store_flows` returns `id` as a compound value containing both the environment name and the flow name. Split it into `environmentName` and `flowName` before calling any per-flow tool (`get_store_flow`, `update_store_flow`, `set_store_flow_state`).

Governance fields writable via `update_store_flow`:

| Field | Type | Description |
|---|---|---|
| `tags` | string | Classification tags (hashtags) |
| `businessImpact` | string | Low / Medium / High / Critical |
| `businessJustification` | string | Why the flow exists, what process it automates |
| `businessValue` | string | Business value statement |
| `ownerTeam` | string | Accountable team |
| `ownerBusinessUnit` | string | Business unit |
| `supportGroup` | string | Support escalation group |
| `supportEmail` | string | Support contact email |
| `critical` | bool | Designate as business-critical |
| `tier` | string | Standard or Premium |
| `security` | string | Security classification or notes |
**Caution with `security`:** The `security` field on `get_store_flow` contains structured JSON (e.g. `{"triggerRequestAuthenticationType":"All"}`). Writing a plain string like `"reviewed"` will overwrite this. To mark a flow as security-reviewed, use `tags` instead.
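Before writing governance metadata, it can help to validate the payload first. A minimal sketch, assuming a Python-side agent; the helper itself is hypothetical (not an MCP tool), and the field list is taken from the tables in this skill:

```python
import json

# Hypothetical helper: validate a governance update before calling
# update_store_flow. WRITABLE lists the fields this skill documents;
# the JSON guard mirrors the caution about the `security` field.
WRITABLE = {
    "description", "tags", "businessImpact", "businessJustification",
    "businessValue", "ownerTeam", "ownerBusinessUnit", "supportGroup",
    "supportEmail", "critical", "tier", "security", "monitor",
    "rule_notify_onfail", "rule_notify_email", "rule_notify_onmissingdays",
}

def build_governance_update(environment_name, flow_name, **fields):
    unknown = set(fields) - WRITABLE
    if unknown:
        raise ValueError(f"not writable via update_store_flow: {unknown}")
    if "security" in fields:
        # The cached security field holds structured JSON; refuse plain
        # strings so it is never clobbered. Use tags for review markers.
        json.loads(fields["security"])  # raises ValueError on plain text
    return {"environmentName": environment_name, "flowName": flow_name,
            **fields}
```

Pass the resulting dict as the arguments of the `update_store_flow` call; extend `WRITABLE` if `tools/list` reveals more parameters.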
## Governance Workflows

### 1. Compliance Detail Review
Identify flows missing required governance metadata — the equivalent of
the CoE Starter Kit's Developer Compliance Center.
1. Ask the user which compliance fields they require
(or use their organization's existing governance policy)
2. list_store_flows
3. For each flow (skip entries without displayName or state=Deleted):
- Split id → environmentName, flowName
- get_store_flow(environmentName, flowName)
- Check which required fields are missing or empty
4. Report non-compliant flows with missing fields listed
5. For each non-compliant flow:
- Ask the user for values
- update_store_flow(environmentName, flowName, ...provided fields)
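Steps 3-4 can be sketched in Python against cached records. The record shape is an assumption based on the fields this workflow reads; in practice each record comes from `get_store_flow` via MCP:

```python
# Report which required governance fields are missing per flow.
# Default field list matches the example policy table in this skill.
REQUIRED = ["description", "businessImpact", "ownerTeam", "supportEmail"]

def missing_fields(record, required=REQUIRED):
    # A field counts as missing when absent or empty.
    return [f for f in required if not record.get(f)]

def compliance_report(records, required=REQUIRED):
    report = {}
    for r in records:
        # Skip entries without displayName or with state=Deleted.
        if not r.get("displayName") or r.get("state") == "Deleted":
            continue
        gaps = missing_fields(r, required)
        if gaps:
            report[r["id"]] = gaps
    return report
```

The report maps flow `id` to its missing fields, ready to present to the user before any `update_store_flow` writes.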
Fields available for compliance checks:

| Field | Example policy |
|---|---|
| `description` | Every flow should be documented |
| `businessImpact` | Classify as Low / Medium / High / Critical |
| `businessJustification` | Required for High/Critical impact flows |
| `ownerTeam` | Every flow should have an accountable team |
| `supportEmail` | Required for production flows |
| `monitor` | Required for critical flows (note: standard plan includes 20 monitored flows) |
| `rule_notify_onfail` | Recommended for monitored flows |
| `critical` | Designate business-critical flows |
Each organization defines its own compliance rules. The fields above are suggestions based on common Power Platform governance patterns (CoE Starter Kit). Ask the user what their requirements are before flagging flows as non-compliant.
**Tip:** Flows created or updated via MCP already have a `description` (auto-appended by `update_live_flow`). Flows created manually in the Power Automate portal are the ones most likely to be missing governance metadata.
### 2. Orphaned Resource Detection
Find flows owned by deleted or disabled Azure AD accounts.
1. list_store_makers
2. Filter where deleted=true AND ownerFlowCount > 0
Note: deleted makers have NO displayName/mail — record their id (AAD OID)
3. list_store_flows → collect all flows
4. For each flow (skip entries without displayName or state=Deleted):
- Split id → environmentName, flowName
- get_store_flow(environmentName, flowName)
- Parse owners: json.loads(record["owners"])
- Check if any owner principalId matches an orphaned maker id
5. Report orphaned flows: maker id, flow name, flow state
6. For each orphaned flow:
- Reassign governance: update_store_flow(environmentName, flowName,
ownerTeam="NewTeam", supportEmail="new-owner@contoso.com")
- Or decommission: set_store_flow_state(environmentName, flowName,
state="Stopped")
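Steps 2-4 amount to a cross-reference between deleted makers and flow owners. A minimal sketch, with record shapes assumed from the fields this workflow reads:

```python
import json

def orphaned_maker_ids(makers):
    # Deleted makers that still own flows; only the id (AAD OID) survives.
    return {m["id"] for m in makers
            if m.get("deleted") and m.get("ownerFlowCount", 0) > 0}

def orphaned_flows(flows, orphan_ids):
    hits = []
    for record in flows:
        # Skip entries without displayName or with state=Deleted.
        if not record.get("displayName") or record.get("state") == "Deleted":
            continue
        # owners is a JSON string on the cached record.
        owners = json.loads(record.get("owners") or "[]")
        if any(o.get("principalId") in orphan_ids for o in owners):
            hits.append((record["id"], record.get("state")))
    return hits
```

Feed `list_store_makers` output to the first function and the per-flow `get_store_flow` records to the second; the result is the step-5 report of (flow id, state) pairs.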
`update_store_flow` updates governance metadata in the cache only. To transfer actual Power Automate ownership, an admin must use the Power Platform admin center or PowerShell.
**Note:** Many orphaned flows are system-generated (created by `DataverseSystemUser` accounts for SLA monitoring, knowledge articles, etc.). These were never built by a person — consider tagging them rather than reassigning.
**Coverage:** This workflow searches the cached store only, not the live PA API. Flows created after the last scan won't appear.
### 3. Archive Score Calculation
Compute an inactivity score (0-7) per flow to identify safe cleanup
candidates. Aligns with the CoE Starter Kit's archive scoring.
1. list_store_flows
2. For each flow (skip entries without displayName or state=Deleted):
- Split id → environmentName, flowName
- get_store_flow(environmentName, flowName)
3. Compute archive score (0-7), add 1 point for each:
+1 lastModifiedTime within 24 hours of createdTime
+1 displayName contains "test", "demo", "copy", "temp", or "backup"
(case-insensitive)
+1 createdTime is more than 12 months ago
+1 state is "Stopped" or "Suspended"
+1 json.loads(owners) is empty array []
+1 runPeriodTotal = 0 (never ran or no recent runs)
+1 parse json.loads(complexity) → actions < 5
4. Classify:
   - Score 5-7: Recommend archive — report to user for confirmation
   - Score 3-4: Flag for review — read existing tags from the get_store_flow
     response, append #archive-review, then write the merged list back with
     update_store_flow(environmentName, flowName, tags=...) so the flow is
     discoverable for future cleanup

Actual deletion requires the Power Automate portal or admin PowerShell — it cannot be done via MCP tools.
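The seven scoring rules above translate directly into code. A sketch over a cached record (timestamps assumed to be ISO 8601 strings with timezone):

```python
import json
from datetime import datetime, timedelta, timezone

NAME_HINTS = ("test", "demo", "copy", "temp", "backup")

def archive_score(record, now=None):
    # One point per inactivity signal, 0-7 total, per the rules above.
    now = now or datetime.now(timezone.utc)
    created = datetime.fromisoformat(record["createdTime"])
    modified = datetime.fromisoformat(record["lastModifiedTime"])
    score = 0
    score += modified - created <= timedelta(hours=24)          # never revised
    score += any(h in record.get("displayName", "").lower()
                 for h in NAME_HINTS)                           # throwaway name
    score += now - created > timedelta(days=365)                # over 12 months old
    score += record.get("state") in ("Stopped", "Suspended")    # not running
    score += json.loads(record.get("owners") or "[]") == []     # no owners
    score += record.get("runPeriodTotal", 0) == 0               # no recent runs
    score += json.loads(record.get("complexity") or "{}").get(
        "actions", 99) < 5                                      # trivially small
    return score
```

Booleans add as 0/1, so the result is the 0-7 score used for classification.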
### 4. Connector Audit
Audit which connectors are in use across monitored flows. Useful for DLP
impact analysis and premium license planning.
1. list_store_flows(monitor=true)
(scope to monitored flows — auditing all 1000+ flows is expensive)
2. For each flow (skip entries without displayName or state=Deleted):
- Split id → environmentName, flowName
- get_store_flow(environmentName, flowName)
- Parse connections: json.loads(record["connections"])
Returns array of objects with apiName, apiId, connectionName
- Note the flow-level tier field ("Standard" or "Premium")
3. Build connector inventory:
- Which apiNames are used and by how many flows
- Which flows have tier="Premium" (premium connector detected)
- Which flows use HTTP connectors (apiName contains "http")
- Which flows use custom connectors (non-shared_ prefix apiNames)
4. Report inventory to user
- For DLP analysis: user provides their DLP policy connector groups,
agent cross-references against the inventory
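Step 3's inventory can be sketched as a single pass over the per-flow records; the `connections` shape follows the parse step above, and the rest of the record layout is an assumption:

```python
import json
from collections import Counter

def connector_inventory(records):
    # usage: how many flows use each apiName; the other lists hold flow ids.
    usage, premium, http, custom = Counter(), [], [], []
    for record in records:
        if not record.get("displayName") or record.get("state") == "Deleted":
            continue
        conns = json.loads(record.get("connections") or "[]")
        names = {c["apiName"] for c in conns}   # set: count once per flow
        usage.update(names)
        if record.get("tier") == "Premium":     # premium connector detected
            premium.append(record["id"])
        if any("http" in n for n in names):     # HTTP connectors
            http.append(record["id"])
        if any(not n.startswith("shared_") for n in names):  # custom connectors
            custom.append(record["id"])
    return {"usage": usage, "premium": premium, "http": http, "custom": custom}
```

The `usage` counter is what the user cross-references against their DLP policy connector groups.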
Scope to monitored flows: each flow requires a `get_store_flow` call to read the `connections` JSON. Standard plans have ~20 monitored flows — manageable. Auditing all flows in a large tenant (1000+) would be very expensive in API calls.
`list_store_connections` returns connection instances (who created which connection) but NOT connector types per flow. Use it for connection counts per environment, not for the connector audit.
DLP policy definitions are not available via MCP. The agent builds the
connector inventory; the user provides the DLP classification to
cross-reference against.
### 5. Notification Rule Management
Configure monitoring and alerting for flows at scale.
Enable failure alerts on all critical flows:
1. list_store_flows(monitor=true)
2. For each flow (skip entries without displayName or state=Deleted):
- Split id → environmentName, flowName
- get_store_flow(environmentName, flowName)
- If critical=true AND rule_notify_onfail is not true:
update_store_flow(environmentName, flowName,
rule_notify_onfail=true,
rule_notify_email="oncall@contoso.com")
- If NO flows have critical=true: this is a governance finding.
Recommend the user designate their most important flows as critical
using update_store_flow(critical=true) before configuring alerts.
Enable missing-run detection for scheduled flows:
1. list_store_flows(monitor=true)
2. For each flow where triggerType="Recurrence" (available on list response):
- Skip flows with state="Stopped" or "Suspended" (not expected to run)
- Split id → environmentName, flowName
- get_store_flow(environmentName, flowName)
- If rule_notify_onmissingdays is 0 or not set:
update_store_flow(environmentName, flowName,
rule_notify_onmissingdays=2)
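Both procedures reduce to "compute the updates, then apply them". A combined sketch — the on-call address is a placeholder, and the record fields match those checked above:

```python
def alert_updates(records, email="oncall@contoso.com"):
    # Returns (flow id, update_store_flow kwargs) pairs to apply.
    updates = []
    for r in records:
        if not r.get("displayName") or r.get("state") == "Deleted":
            continue
        # Failure alerts for critical flows that lack them.
        if r.get("critical") and not r.get("rule_notify_onfail"):
            updates.append((r["id"], {"rule_notify_onfail": True,
                                      "rule_notify_email": email}))
        # Missing-run detection for scheduled flows still expected to run.
        if (r.get("triggerType") == "Recurrence"
                and r.get("state") not in ("Stopped", "Suspended")
                and not r.get("rule_notify_onmissingdays")):
            updates.append((r["id"], {"rule_notify_onmissingdays": 2}))
    return updates
```

Split each returned id into `environmentName` and `flowName` and pass the kwargs to `update_store_flow`.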
`critical`, `rule_notify_onfail`, and `rule_notify_onmissingdays` are only available from `get_store_flow`, not from `list_store_flows`. The list call pre-filters to monitored flows; the detail call checks the notification fields.
**Monitoring limit:** The standard plan (FlowStudio for Teams / MCP Pro+) includes 20 monitored flows. Before bulk-enabling `monitor=true`, check how many flows are already monitored: `len(list_store_flows(monitor=true))`.
### 6. Classification and Tagging
Bulk-classify flows by connector type, business function, or risk level.
Auto-tag by connector:
1. list_store_flows
2. For each flow (skip entries without displayName or state=Deleted):
- Split id → environmentName, flowName
- get_store_flow(environmentName, flowName)
- Parse connections: json.loads(record["connections"])
- Build tags from apiName values:
shared_sharepointonline → #sharepoint
shared_teams → #teams
shared_office365 → #email
Custom connectors → #custom-connector
HTTP-related connectors → #http-external
- Read existing tags from get_store_flow response, append new tags
   - update_store_flow(environmentName, flowName, tags=...)

Tags set via `update_store_flow(tags=...)` write to a separate field in the Azure Table cache, not to the flow's description (where hashtags such as `#operations` appear in the PA portal). They are independent — writing store tags does not touch the description, and editing the description in the portal does not affect store tags.
**Tag merge:** `update_store_flow(tags=...)` overwrites the store tags field. To avoid losing tags from other workflows, read the current store tags from `get_store_flow` first, append new ones, then write back.
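The read-merge-write pattern can be sketched as follows; it assumes store tags are a whitespace-separated string of hashtags, so adjust the split if `get_store_flow` returns a different format:

```python
def merge_tags(existing, new_tags):
    # Append new hashtags to the existing store tags without duplicates,
    # preserving order so earlier workflows' tags survive.
    current = (existing or "").split()
    merged = current + [t for t in new_tags if t not in current]
    return " ".join(merged)
```

Write the merged string back via `update_store_flow(environmentName, flowName, tags=merged)`.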
`get_store_flow` already has a `tier` field (Standard/Premium) computed by the scanning pipeline. Only use `update_store_flow(tier=...)` if you need to override it.
### 7. Maker Offboarding

When an employee leaves, identify their flows and apps, and reassign FlowStudio governance contacts and notification recipients.
1. get_store_maker(makerKey="