• 21

    Regarding the restoration of Dataverse tables

    Suggested by Takuto Sakuma New  0 Comments

    Under the current Dataverse specifications, it is not possible to restore only an individual deleted table.

    To return a Dataverse table to its state prior to deletion, an environment backup taken before the deletion must be used, which requires restoring the entire environment. As a result, restoring only the deleted table individually is not supported at this time.

    The ability to restore individual deleted tables would significantly improve customer convenience. We kindly ask that you consider this enhancement.


  • 7

    Reflect default environment storage quota in tenant-level capacity summary and alerts

    Suggested by Jacob Huynh New  2 Comments

    Problem Statement


    The tenant-level storage capacity summary in the Power Platform Admin Center calculates entitled capacity based solely on purchased license SKUs. This creates a blind spot: tenants that operate exclusively through Microsoft 365 licensing (Business Basic, Business Standard, Office 365 E1/E3) receive 0 GB of Dataverse File, Database, and Log entitlement on the summary page, despite the fact that every default environment is provisioned with a built-in storage quota.

     

    Per Microsoft Learn: 

    "The default environment has the following included storage capacity: 3 GB Dataverse database capacity, 3 GB Dataverse file capacity, and 1 GB Dataverse log capacity."

     

    The automated capacity alert system uses the license-based entitlement figure, not the real available capacity, to calculate threshold breaches. The result is false-positive over-capacity warnings.


    Example


    A small business using Microsoft 365 Business Standard licenses has one default Dataverse environment. They use two Canvas apps and a few Power Automate flows, accumulating 1.67 GB of file storage (mainly from platform-managed web resources and note attachments). Their tenant-level summary shows:

    Type    Entitled    Used       Status
    File    0.00 GB     1.67 GB    100% Over-Capacity

     

    The admin receives a weekly email: "You're out of File capacity. Your tenant has used 100 percent of available File storage. Please act immediately to continue operating without disruptions."

    In reality, the default environment holds a 3 GB included file quota. Actual usage is only 56% of the true limit. No operational risk exists. The customer opens a support case, only to learn the alert was a cosmetic overstatement.
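The mismatch in the example above reduces to simple arithmetic. The sketch below (illustrative Python; the figures come from the example, the function name is hypothetical, not part of any product API) contrasts the current license-only calculation with one that counts the default environment's included quota:

```python
def usage_pct(used_gb, capacity_gb):
    """Usage as a percentage of capacity; zero capacity counts as fully consumed."""
    if capacity_gb <= 0:
        return 100.0
    return used_gb / capacity_gb * 100

# Figures from the example above (GB).
license_entitled = 0.0   # M365 Business Standard grants no standalone Dataverse file capacity
included_quota = 3.0     # default environment's built-in file quota
used = 1.67              # actual file storage consumed

# Current summary: entitlement from licenses only -> 100%, over-capacity alert.
current = usage_pct(used, license_entitled)

# Proposed summary: include the default environment's quota -> ~56%, no alert.
proposed = usage_pct(used, license_entitled + included_quota)
```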

     

    This is not an edge case but the standard experience for every M365-only tenant that uses Dataverse in a default environment.


    Why This Happens


    The root cause is a data presentation gap between two layers:

    1. License layer (entitlement engine): Calculates capacity from license SKUs. M365 SKUs like Business Basic and Office 365 E1 grant CDS Lite capacity for Teams environments but contribute 0 GB to the standard Dataverse capacity pools. The entitlement engine correctly reports 0 GB - these licenses genuinely do not purchase standalone Dataverse capacity.
    2. Environment layer (included quota): Every default environment is allocated 3 GB DB / 3 GB file / 1 GB log at provisioning time, independent of licensing. This quota is visible only when drilling into the environment-level details view. The tenant-level summary does not aggregate it.

    The alert pipeline reads from layer 1. The real capacity exists in layer 2. The disconnect produces warnings that are technically accurate at the license level but meaningless for customers who operate within the included quota.


    Suggestion


    Include the default environment's built-in quota in the tenant-level capacity calculation. Specifically:

    1. Summary tab: Add a "Default environment included capacity" row under "Storage capacity, by source" - alongside "Org (tenant) default", "User licenses", and "Additional storage". This shows the 3/3/1 GB allocation transparently.
    2. Alert threshold calculation: Factor the default environment's included quota into the over-capacity check. A tenant using 1.67 GB of file storage with a 3 GB default quota should show 56% usage, not 100%.
    3. Alert email wording: When a tenant's overage is driven entirely by the absence of license-based entitlement rather than actual consumption exceeding the included quota, the email could state: "Your default environment's included storage is being used. If your usage exceeds the included 3 GB, consider purchasing additional capacity add-ons."
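The three suggestions above could be combined in the alert pipeline roughly as follows. This is a hedged sketch, not the product's actual logic: the function, threshold, and message strings are assumptions for illustration, with the proposed wording taken from point 3.

```python
INCLUDED_FILE_QUOTA_GB = 3.0  # default environment's included file capacity (assumed constant)

def build_alert(used_gb, license_entitled_gb, threshold_pct=100):
    """Decide whether to warn, factoring the default environment's included quota
    into the effective capacity (suggestion 2) and picking gentler wording when
    the tenant has no license-based entitlement (suggestion 3)."""
    effective_capacity_gb = license_entitled_gb + INCLUDED_FILE_QUOTA_GB
    pct = used_gb / effective_capacity_gb * 100
    if pct < threshold_pct:
        return None  # within the included quota: no over-capacity warning
    if license_entitled_gb == 0:
        # Overage exists only relative to the included quota, not purchased capacity.
        return ("Your default environment's included storage is being used. "
                "If your usage exceeds the included 3 GB, consider purchasing "
                "additional capacity add-ons.")
    return "You're out of File capacity. Please act to continue without disruptions."
```

With this logic, the example tenant (1.67 GB used, 0 GB license entitlement) triggers no alert, while a tenant that genuinely exceeds the 3 GB included quota still receives one.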


    What This Would Not Change


    • The StorageDriven capacity model and overflow rules would remain intact.
    • Tenants whose usage genuinely exceeds the 3 GB included quota would continue to receive over-capacity notifications as they do today.
    • Tenants with paid Dataverse capacity add-ons would see no change.
    • The default environment list view behavior (showing only consumption beyond included quota) would remain unchanged - this suggestion addresses the summary-level presentation only.


    Additional Consideration


    The same gap likely affects the environment creation capacity check. Per the documentation: "The capacity check conducted before creating new environments excludes the default environment's included storage capacity when calculating whether you have sufficient capacity." If a tenant has 0 GB license-based capacity, the included quota is already factored into the new-environment check - but the summary page does not show this deduction, creating an inconsistency between what the admin sees and what the platform enforces.

     

    The fix would bring the summary presentation in line with the provisioning logic that already accounts for the included quota.


  • 2

    SharePoint Connector, Only allow specific sites

    Suggested by Adam Macaulay New  0 Comments

    For the SharePoint connector, allow only specific sites to be selected under the data privacy policies for that connector, much like we can do for the HTTP connector.


  • 0

    Increase the 100-Character schemaname Limit for Catalog Template Item Solution Components

    Suggested by Marck Anthony Payanay New  0 Comments

    When installing a solution as a Template Item via the Power Platform Catalog, the installation can fail because the generated Dataverse schemaname exceeds the current 100-character limit.

    This is especially limiting for solutions that include components such as Copilot Studio bots, topics, triggers, cloud flows, and other named solution assets. These components often require meaningful business names so makers, admins, and support teams can clearly understand their purpose.

    The issue is that the failing schemaname is system-generated, immutable, and includes values such as the publisher prefix, topic/flow name, component details, action name, and GUID. Makers have no direct control over the final generated value, yet the Catalog installation can fail because of it.


    Requested improvement:

    Please improve this behavior by implementing the following:

    • Increase the current 100-character schemaname limit for Catalog Template Item scenarios.
    • Add pre-validation before publishing or installing a Catalog Template Item so makers are warned before installation fails.
    • Use a safer schema name generation pattern.
    • Provide official documentation explaining how schema names are generated for Catalog Template Items and what naming limits makers should follow.
    • Confirm whether this limitation applies only to Copilot Studio botcomponent records or to other Dataverse-backed solution components as well.


    Business impact:

    The current limitation forces makers to shorten component names in ways that reduce clarity and maintainability. This is not ideal for enterprise-ready solutions where naming standards, governance, ALM, supportability, and reusability are important.

    A platform-level fix or clear validation guidance would reduce deployment failures, improve Catalog adoption, and make Template Items more reliable for reusable Power Platform solution distribution.