Suggest a new Idea

  • 239

    Flow/Logic app triggers for D365FO

    Suggested by Sunil Garg · Under Review · 20
    Category: Data Management

    Have the ability in the D365FO flow/logic app connector to register for callbacks when certain events happen in D365FO. Events can be defined by users in D365FO and exposed in the connector for registration. When the specific event happens in D365FO, D365FO calls back into the connector, thus triggering the flow/logic app. Sample events could be "When a PO is approved" or "When a vendor is created".
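    For illustration only, a minimal sketch of the requested pattern in Python, assuming a hypothetical registration endpoint (registerEvent does not exist today) and an invented callback payload:

        # Hypothetical sketch of the requested callback pattern: register a listener URL
        # for a user-defined D365FO event, then handle the callback D365FO would send
        # when that event fires. The registerEvent endpoint and payload shape are
        # invented for illustration only.
        import json
        from http.server import BaseHTTPRequestHandler, HTTPServer

        import requests

        D365FO_URL = "https://contoso.operations.dynamics.com"   # environment URL (placeholder)
        LISTENER_URL = "https://my-flow.example.com/hook"        # flow/logic app callback URL

        def register_callback(token: str) -> None:
            # Register the listener for a user-defined event (hypothetical endpoint).
            payload = {"eventId": "PurchaseOrderApproved", "callbackUrl": LISTENER_URL}
            resp = requests.post(f"{D365FO_URL}/api/connector/registerEvent",   # assumed, not a real endpoint
                                 headers={"Authorization": f"Bearer {token}"},
                                 json=payload, timeout=30)
            resp.raise_for_status()

        class CallbackHandler(BaseHTTPRequestHandler):
            # Receives the POST D365FO would make when the event happens.
            def do_POST(self):
                body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
                event = json.loads(body or b"{}")
                print("Event received:", event.get("eventId"), event.get("businessKey"))
                self.send_response(200)
                self.end_headers()

        if __name__ == "__main__":
            HTTPServer(("0.0.0.0", 8080), CallbackHandler).serve_forever()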

  • 71

    Export Data from all companies within the same job

    Suggested by Milinda Vitharana · Under Review · 6
    Category: Data Management

    Exporting entities must currently be done per company. When exporting entities to build a data mart (BYOD) or for an integration scenario, this limitation forces multiple export jobs to be defined for the same entity.


    Enable a single job that can export data for all companies.
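    Until that exists, a common workaround is to loop one export per company from code; a minimal sketch, assuming the standard ExportToPackage action of the Data management package API (verify the parameter names on your version):

        # Workaround sketch: loop the DMF package export action over every legal entity,
        # since an export job today runs in the context of a single company.
        import requests

        D365FO_URL = "https://contoso.operations.dynamics.com"   # placeholder environment URL
        ACTION = ("/data/DataManagementDefinitionGroups"
                  "/Microsoft.Dynamics.DataEntities.ExportToPackage")

        def export_all_companies(token: str, definition_group: str, companies: list[str]) -> None:
            headers = {"Authorization": f"Bearer {token}"}
            for company in companies:
                body = {
                    "definitionGroupId": definition_group,   # the export data project
                    "packageName": f"{definition_group}-{company}",
                    "executionId": "",                       # let D365FO generate one
                    "reExecute": False,
                    "legalEntityId": company,
                }
                resp = requests.post(D365FO_URL + ACTION, headers=headers, json=body, timeout=60)
                resp.raise_for_status()
                print(company, "-> execution id", resp.json().get("value"))

        # export_all_companies(token, "CustomerExport", ["USMF", "DEMF", "FRRT"])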


     

  • 40

    Allow Azure SQL as Data source in DMF

    Suggested by gijs Walraven · Under Review · 2
    Category: Data Management

    In order to support more complex data migration and integration scenarios, I would like to have the option to use an Azure SQL database as a data source for importing data via DMF (Data Management Framework).


    In the current situation, Azure SQL databases can only be used for export scenarios. The option of using an ODBC connection as a source is also not yet supported/working.
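    As a stop-gap, data can be pulled from Azure SQL outside DMF and pushed through the target entity's OData endpoint; a minimal sketch, where the staging table, entity, and field names are placeholders to adjust:

        # Workaround sketch while Azure SQL is export-only in DMF: read rows from an
        # Azure SQL staging table with pyodbc and push them through the entity's
        # OData endpoint. Table, entity, and column names are placeholders.
        import pyodbc
        import requests

        SQL_CONN = ("Driver={ODBC Driver 18 for SQL Server};"
                    "Server=tcp:myserver.database.windows.net,1433;"
                    "Database=staging;Uid=loader;Pwd=...;Encrypt=yes;")
        D365FO_URL = "https://contoso.operations.dynamics.com"   # placeholder environment URL

        def load_customers(token: str) -> None:
            headers = {"Authorization": f"Bearer {token}", "Content-Type": "application/json"}
            with pyodbc.connect(SQL_CONN) as conn:
                cursor = conn.cursor()
                cursor.execute("SELECT AccountNum, Name, CustGroup FROM dbo.CustomerStaging")
                for account, name, group in cursor.fetchall():
                    payload = {
                        "dataAreaId": "USMF",            # target company (placeholder)
                        "CustomerAccount": account,      # field names: verify against your entity
                        "OrganizationName": name,
                        "CustomerGroupId": group,
                    }
                    resp = requests.post(f"{D365FO_URL}/data/CustomersV3",
                                         headers=headers, json=payload, timeout=60)
                    resp.raise_for_status()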

  • 39

    [Data management] Integration configuration (Recurring data jobs/Queues) movement across environments

    Suggested by Hari Kiran Varre · Under Review · 4
    Category: Data Management

    Issue description:

    Integration configuration (data projects, source formats, recurring data jobs) is the key element for successful data exchange between applications. Being able to configure it once and move it across environments would save a lot of manual effort and time for every environment change/repointing, and it would also reduce the possibility of mistakes due to manual intervention.

    Three configuration objects in data management are crucial for successful integration/data exchange with external applications:
    Form Name: Source data formats
    Form Navigation: Dashboard -> Data management -> Configure data source (Tile)
    Remarks: (none)

    Form Name: Data projects
    Form Navigation: Dashboard -> Data management -> Data projects (Tab/Grid) -> Select any data project
    Remarks: A feature is required to carry/move the mapping, default values/auto-generate settings, sequencing, sample files, etc. that are set up as part of the project in the source system.

    Form Name: Recurring data jobs
    Form Navigation: Dashboard -> Data management -> Data projects (Tab/Grid) -> Select any data project -> Manage (Action pane) -> Manage recurring data jobs -> All data jobs for the data project
    Remarks: A feature is required to carry/move the recurrence setup on the batch jobs behind the recurring data job and the authorization policy data that is set up as part of it in the source system.


    Source data formats: These are simple, straightforward tables, so data entities can be developed to move this data.

    Data projects: There is a Download button on the form that downloads the settings into a zip file, including a shell of the Excel sheet (or XML). It stores the mappings, sequencing, auto-generate settings, filters, etc. Open the Manifest.xml file in the downloaded zip to see what is stored. This was updated in platform update 7, and more was added in platform update 11 for the new Copy into legal entity feature.

    After you import or export, you can use Download package from the execution summary page. That exports all of the sequences, etc., plus the data that goes with it.
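    That same package can also be fetched programmatically; a minimal sketch, assuming the standard GetExportedPackageUrl action (check the action name and response shape on your version):

        # Sketch: fetch the exported package for a given execution id.
        import requests

        D365FO_URL = "https://contoso.operations.dynamics.com"   # placeholder environment URL
        ACTION = ("/data/DataManagementDefinitionGroups"
                  "/Microsoft.Dynamics.DataEntities.GetExportedPackageUrl")

        def download_package(token: str, execution_id: str, target: str) -> None:
            headers = {"Authorization": f"Bearer {token}"}
            resp = requests.post(D365FO_URL + ACTION, headers=headers,
                                 json={"executionId": execution_id}, timeout=60)
            resp.raise_for_status()
            package_url = resp.json()["value"]          # short-lived blob SAS URL
            with open(target, "wb") as f:
                f.write(requests.get(package_url, timeout=300).content)

        # download_package(token, "CustomerExport-20240101-0001", "package.zip")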


     


    Recurring data jobs: There is no option or feature to move this data across environments. This is a key missing feature, and closing the gap would add great value for customers and partners implementing D365FO.

    Potential solution/design:

    Recurring data jobs: This is really important from the D365FO and integrator (e.g. BizTalk) perspective; otherwise both parties end up spending days just recreating these jobs and updating the GUIDs for every environment change/repointing until all interfaces are promoted to production. As data projects and recurring data jobs go hand in hand, it would be great if the Download option could include the recurring data jobs as well, so that both can be moved across environments together.
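    To make the current pain concrete, here is a minimal sketch of the per-environment bookkeeping an integrator maintains today: the activity GUIDs are kept in configuration and the enqueue URL is rebuilt after every repointing (GUIDs, entity names, and URLs below are placeholders; endpoint shape follows the recurring integrations enqueue API as commonly documented):

        # Per-environment activity GUID map that this idea would make unnecessary.
        import requests

        ENVIRONMENTS = {
            "test": {
                "url": "https://contoso-test.operations.dynamics.com",
                "customer_import_activity": "00000000-0000-0000-0000-000000000001",
            },
            "prod": {
                "url": "https://contoso.operations.dynamics.com",
                "customer_import_activity": "00000000-0000-0000-0000-000000000002",
            },
        }

        def enqueue_file(env: str, token: str, entity: str, path: str) -> str:
            cfg = ENVIRONMENTS[env]
            url = (f"{cfg['url']}/api/connector/enqueue/"
                   f"{cfg['customer_import_activity']}?entity={entity}")
            with open(path, "rb") as f:
                resp = requests.post(url, headers={"Authorization": f"Bearer {token}"},
                                     data=f.read(), timeout=120)
            resp.raise_for_status()
            return resp.text   # message id returned by the connector

        # enqueue_file("test", token, "Customers", "customers.xml")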

  • 37

    XSD from Data management

    Suggested by Stefffen Daugaard Pedersen · Under Review · 2
    Category: Data Management

    It should be possible to generate an XSD file for a standard data entity.
    I need this on a D365FO project using standard entities; the XML is just fine, but the XSD is needed by the external system.
    Br
    Steffen Daugaard Pedersen
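    Until this is supported, one rough workaround is to derive a skeleton XSD from the entity definition in the OData $metadata (EDMX); a minimal sketch, where the Edm-to-XSD type mapping and the element layout are simplified assumptions to adjust to the files DMF actually produces:

        # Rough workaround: build a skeleton XSD for one entity from the OData $metadata.
        import xml.etree.ElementTree as ET

        import requests

        EDM_NS = "{http://docs.oasis-open.org/odata/ns/edm}"
        TYPE_MAP = {"Edm.String": "xs:string", "Edm.Int32": "xs:int",
                    "Edm.Int64": "xs:long", "Edm.Decimal": "xs:decimal",
                    "Edm.DateTimeOffset": "xs:dateTime", "Edm.Boolean": "xs:boolean"}

        def build_xsd(d365fo_url: str, token: str, entity_type: str) -> str:
            metadata = requests.get(f"{d365fo_url}/data/$metadata",
                                    headers={"Authorization": f"Bearer {token}"},
                                    timeout=120).text
            root = ET.fromstring(metadata)
            lines = ['<?xml version="1.0"?>',
                     '<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">',
                     f'  <xs:element name="{entity_type}"><xs:complexType><xs:sequence>']
            for et in root.iter(f"{EDM_NS}EntityType"):
                if et.get("Name") != entity_type:
                    continue
                for prop in et.findall(f"{EDM_NS}Property"):
                    xsd_type = TYPE_MAP.get(prop.get("Type"), "xs:string")  # enums etc. fall back to string
                    lines.append(f'    <xs:element name="{prop.get("Name")}" type="{xsd_type}"/>')
            lines += ["  </xs:sequence></xs:complexType></xs:element>", "</xs:schema>"]
            return "\n".join(lines)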

  • 32

    Enable import of attachments (document management)

    Suggested by Nicholas Cram · New · 3
    Category: Data Management

    It should be possible to import document management records along with file attachments for any data entity with a natural key present. This would be a generic solution, with the natural key stored on the document management record, and linked once in the target environment.
    This would allow any files to be migrated from legacy systems, and also allow files to be attached through integrations with third party systems, without the need to customise each data entity to which the file will be attached.
    Currently only a very small number of data entities are enabled to import file attachments, e.g. Products, Purchase order headers.
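    For illustration, the generic design described above could be driven by a simple mapping file of entity, natural key, and file; a minimal sketch, where the GenericDocumentAttachments entity and its fields are purely hypothetical (no such generic endpoint exists today):

        # Illustration of the proposed generic design: a mapping file drives the upload,
        # with the natural key stored on the attachment record and resolved in the target
        # environment. The entity and field names are hypothetical placeholders.
        import base64
        import csv

        import requests

        D365FO_URL = "https://contoso.operations.dynamics.com"   # placeholder environment URL

        def import_attachments(token: str, mapping_csv: str) -> None:
            headers = {"Authorization": f"Bearer {token}"}
            with open(mapping_csv, newline="") as f:
                # mapping file columns: Entity, NaturalKey, FileName, FilePath
                for row in csv.DictReader(f):
                    with open(row["FilePath"], "rb") as doc:
                        payload = {
                            "TargetEntity": row["Entity"],      # e.g. CustomersV3
                            "NaturalKey": row["NaturalKey"],    # e.g. customer account
                            "FileName": row["FileName"],
                            "FileContents": base64.b64encode(doc.read()).decode(),
                        }
                    resp = requests.post(f"{D365FO_URL}/data/GenericDocumentAttachments",  # hypothetical entity
                                         headers=headers, json=payload, timeout=120)
                    resp.raise_for_status()
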
  • 29

    Recurring data jobs export empty file with no data

    Suggested by Hari Kiran Varre · Completed · 4
    Category: Data Management

    Data management entity exports produce empty files, i.e. even when there is no data to export, a blank file is still exported.


    Problem statement: Assume 30 interfaces per legal entity scheduled to export every minute, which results in a huge export history per hour (30 * number of legal entities * 60 minutes). Manually finding the one execution where data was actually exported is next to impossible, the empty files consume space on blob storage, and an integrating application like BizTalk that consumes these messages directly using Dequeue incurs execution cost for every message.


    As mentioned above, this causes issues at two levels, especially for recurring jobs.



    • In the recurring data job execution history, finding the execution that actually contains data is next to impossible.

    • An integrating application like BizTalk that consumes these messages directly using Dequeue incurs execution cost even for empty messages.


    Potential solution/design: A condition that checks whether the record count is greater than 0 would prevent empty files from being exported. This condition should be controlled by the user: for example, like the skip staging option on a data project, there should be another parameter that lets the user choose to export a file only if it contains data. That would be flexible and suitable for any customer scenario. Implementing this check at the core API level would control the flow of messages with minimal change and a simple design, addressing both problems above.
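    Until such a parameter exists, the filtering can only happen on the consumer side; a minimal sketch that discards empty payloads, assuming the recurring integrations dequeue/ack endpoints behave as documented (verify the response shape on your version):

        # Consumer-side sketch of the requested behaviour: dequeue, download, and drop
        # payloads that contain no data. Activity GUID is a placeholder.
        import requests

        D365FO_URL = "https://contoso.operations.dynamics.com"   # placeholder environment URL
        ACTIVITY_ID = "00000000-0000-0000-0000-000000000001"     # placeholder GUID

        def dequeue_non_empty(token: str):
            headers = {"Authorization": f"Bearer {token}"}
            resp = requests.get(f"{D365FO_URL}/api/connector/dequeue/{ACTIVITY_ID}",
                                headers=headers, timeout=60)
            if resp.status_code == 204 or not resp.content:
                return None                                  # nothing queued
            message = resp.json()
            payload = requests.get(message["DownloadLocation"], timeout=300).content
            # Acknowledge either way so the message leaves the queue.
            requests.post(f"{D365FO_URL}/api/connector/ack/{ACTIVITY_ID}",
                          headers=headers, json=message, timeout=60)
            # Simplified emptiness check; a real check would parse the file and count records.
            if len(payload.strip()) == 0:
                return None
            return payload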

  • 27

    View Messages Sent to Inbound Recurring Integration Queue (DMF)

    Suggested by John Konrad · Under Review · 5
    Category: Data Management

    When inbound messages fail to process, it is really helpful to view the incoming file to see if there are any major issues. Without the ability to access the file, it is really hard to troubleshoot. This was possible in previous versions of AX through AIF and was frequently used by our clients.
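    Until the file is viewable in D365FO, a common mitigation is for the sender to archive every payload next to the message id returned by the enqueue call; a minimal sketch, assuming the recurring integrations enqueue endpoint (activity GUID and entity name are placeholders):

        # Sender-side mitigation: keep a copy of each inbound payload keyed by the
        # message id returned by the enqueue call, so failed messages can be inspected.
        import pathlib

        import requests

        D365FO_URL = "https://contoso.operations.dynamics.com"   # placeholder environment URL
        ACTIVITY_ID = "00000000-0000-0000-0000-000000000001"     # placeholder GUID
        ARCHIVE_DIR = pathlib.Path("sent-messages")

        def enqueue_with_archive(token: str, entity: str, payload: bytes) -> str:
            resp = requests.post(
                f"{D365FO_URL}/api/connector/enqueue/{ACTIVITY_ID}?entity={entity}",
                headers={"Authorization": f"Bearer {token}"},
                data=payload, timeout=120)
            resp.raise_for_status()
            message_id = resp.text.strip('"')
            ARCHIVE_DIR.mkdir(exist_ok=True)
            (ARCHIVE_DIR / f"{message_id}.xml").write_bytes(payload)   # copy kept for troubleshooting
            return message_id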

  • 25

    Update-a-record action in Azure Logic Apps should support cross-company behaviour

    Suggested by Wouter Mijle · New · 1
    Category: Data Management

    Cross-company behaviour for D365FO is supported in OData queries when using a get-record or get-records action in Azure Logic Apps.

    "method": "get",
    "path": "/datasets/@{encodeURIComponent(parameters('Dyn365fOURL'))}/tables/@{encodeURIComponent(encodeURIComponent('AVAItems'))}/items",
    "queries": {
        "$filter": "HasBeenExported eq Microsoft.Dynamics.DataEntities.NoYes'No' and MustBeExported eq Microsoft.Dynamics.DataEntities.NoYes'Yes' and dataAreaId eq '@{parameters('Dyn365fODataAreaId_STU')}'",
        "cross-company": true
    }


     


    --> This returns only the records from the specified company!


     


    However, the cross-company query filter does not seem to be taken into account in the update-a-record action...

    "method": "patch",
    "path": "/datasets/@{encodeURIComponent(parameters('Dyn365fOURL'))}/tables/@{encodeURIComponent(encodeURIComponent('AVAItems'))}/items/@{encodeURIComponent(item()?['dataAreaId'],',',item()?['ItemId'])}",
    "queries": {
        "$filter": "ItemId eq '@{item()?['ItemId']}' and dataAreaId eq '@{item()?['dataAreaId']}'",
        "cross-company": true
    }


    --> This results in an error indicating that the record could not be found. This is probably because the system looks only in the default company of the executing user, even though the cross-company filter is specified.
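    As a workaround, the update can be done against the OData endpoint directly, with cross-company=true and the full key (including dataAreaId) in the URL; a minimal sketch reusing the AVAItems entity from the example above (key field names assumed from that example):

        # Workaround: PATCH the entity via OData directly instead of the connector's
        # update-a-record action, passing cross-company=true and the company in the key.
        import requests

        D365FO_URL = "https://contoso.operations.dynamics.com"   # placeholder environment URL

        def mark_exported(token: str, data_area_id: str, item_id: str) -> None:
            url = (f"{D365FO_URL}/data/AVAItems"
                   f"(dataAreaId='{data_area_id}',ItemId='{item_id}')"
                   "?cross-company=true")
            resp = requests.patch(
                url,
                headers={"Authorization": f"Bearer {token}",
                         "Content-Type": "application/json"},
                json={"HasBeenExported": "Yes"},   # enum as plain string; adjust if your metadata differs
                timeout=60,
            )
            resp.raise_for_status()

        # mark_exported(token, "STU", "ITEM-001")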

  • 24

    Generate Data Entity Workbook including field data types and length/attributes automatically from data management workspace

    Suggested by Ivan Cole · Under Review · 3
    Category: Data Management

    Data entity exports do not show field data types, lengths, and attributes. They also do not show a logical relationship to the related tables/entities. We have been asked by almost every client to provide these details up front so they can map their source data without trial and error. What we found is that once an initial export is completed, this data is populated in the table


    DMFDefinitionGroupEntityXMLFields


    You can access this table in a specific environment using this URL template.


    https://Your.D365.URL.Here/?mi=SysTableBrowser&TableName=DMFDefinitionGroupEntityXMLFields&cmp=USRT&limitednav=true&lng=en-us


    We were able to extract the necessary details from this table manually in SQL. Using that data, we created the attached sample workbook. We would like a feature that generates this workbook automatically from the Data management workspace without manual intervention (a rough extraction sketch follows the list below). That would include:


    1) Summary Workflow Sheet


    2) Logical Data View - Context Diagram


    3) Data model with field data types, lengths, descriptions, help text, etc. from the example


    4) Sample data extract tab
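
    As referenced above, a rough sketch of the extraction step, assuming direct SQL access to the AxDB of a Tier-1/dev environment; the column names on DMFDefinitionGroupEntityXMLFields are placeholders to adjust for your version:

        # Sketch: pull the entity field metadata captured after an initial export and
        # write it to a worksheet (CSV here) for the data model tab of the workbook.
        import csv

        import pyodbc

        SQL_CONN = ("Driver={ODBC Driver 18 for SQL Server};"
                    "Server=localhost;Database=AxDB;Trusted_Connection=yes;")

        def export_entity_field_sheet(definition_group: str, target_csv: str) -> None:
            with pyodbc.connect(SQL_CONN) as conn:
                cursor = conn.cursor()
                cursor.execute(
                    "SELECT ENTITY, XMLFIELD, TYPE, FIELDLENGTH, ISMANDATORY "  # placeholder column names
                    "FROM dbo.DMFDEFINITIONGROUPENTITYXMLFIELDS "
                    "WHERE DEFINITIONGROUP = ?",
                    definition_group)
                with open(target_csv, "w", newline="") as f:
                    writer = csv.writer(f)
                    writer.writerow(["Entity", "Field", "Data type", "Length", "Mandatory"])
                    writer.writerows(cursor.fetchall())

        # export_entity_field_sheet("CustomerExport", "DM-006-Customers.csv")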


    Thank you


    Ivan


     DM - 006 - Customers-Example