  • BYOD & Data Lake Security

    There should be a better way for us to secure the communication from Finance and Operations to Azure SQL or Azure Data Lake. Currently the communication happens over the default port 1433. We are able to configure a network security group (NSG) scoped to the IP range of the Azure region in which our F&O environment is hosted, and although this removes the public threat, it leaves crucial business data open to threats originating from within Azure. For example, if any virtual machine within Azure is compromised and runs an exploit against the Azure SQL/Data Lake endpoint, it may gain access. The alternative would be a way to limit access to just our F&O instance, which would greatly reduce the threat. Azure supports such access using VNET peering or VPNs. https://dagfileshare01.blob.core.windows.net/public-blob-dag-calafell/FO to ASQL.png Please provide a better way to secure critical business data.
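    As a partial workaround under the current design, one option is an Azure SQL virtual network rule backed by a service endpoint, which restricts server access to a specific subnet rather than a whole region's IP range. A hedged sketch using the Azure CLI (the resource group, VNet, subnet, and server names below are placeholders, not values from the original request):

```shell
# Enable the Microsoft.Sql service endpoint on the subnet
# (all names here are placeholders)
az network vnet subnet update \
  --resource-group my-rg \
  --vnet-name my-vnet \
  --name my-subnet \
  --service-endpoints Microsoft.Sql

# Allow only that subnet to reach the Azure SQL logical server
az sql server vnet-rule create \
  --resource-group my-rg \
  --server my-sql-server \
  --name allow-fo-subnet \
  --vnet-name my-vnet \
  --subnet my-subnet
```

    This only helps where the client sits in a subnet you control; it does not by itself address traffic originating from the managed F&O environment, which is the gap the request is about.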
  • View obsolete data entities in all data projects

    Currently you can only view obsolete data entities on the data entities page in the Data Management workspace (System administration > Workspaces > Data management > Data entities tile > filter for Obsolete). It would be incredibly useful to see which data entities are obsolete across all data projects, rather than drilling down into each individual data project. This could be implemented as a column on the Data projects > All projects table, so it can be filtered to see which data projects are affected.
  • Add data entity for Opportunities and also consider other sales and marketing entities which are missing.

    Add a data entity for Opportunities, and also consider other sales and marketing entities which are missing, such as call lists and campaigns. Customers moving from a CRM solution to the integrated sales and marketing module in D365FO need to load legacy opportunities into D365FO, but no entity exists for this or for other related elements of sales and marketing.
  • Unable to import the WMSLOCATION on PurchPurchaseOrderLineV2Entity

    A user added the inventdim.WMSLOCATION column to PurchPurchaseOrderLineV2Entity on the form. However, no mapping was created for inventdim.WMSLOCATION. Could you please consider adding a feature so that WMSLOCATION is mapped?
  • Need to be able to specify Exchange Rate on Free Text Invoices both via the UI and via OData

    We hedge foreign currency transactions (i.e. we buy currency in advance so the exchange rate is fixed at a particular value for a particular contract and/or customer, etc.), so we need to be able to specify the exchange rate on all invoice and payment transactions. In particular, the OData entity has an Exchange Rate field, but it cannot be used except in legal entities in a limited set of countries; even in those countries, the functionality is limited. Also, there does not seem to be a way to override the exchange rate from the UI. Please extend the exchange rate handling so the rate can be specified for all invoices and journals, independently of the current default rate.
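    To illustrate what the integration side of this would look like if the field were honored everywhere, here is a minimal sketch of building a free text invoice payload that pins the hedged rate explicitly. The entity path and field names are assumptions for illustration, not a confirmed D365FO contract:

```python
import json

def build_invoice_payload(customer, currency, rate):
    """Build a free text invoice header body that carries an explicit
    exchange rate (field names are illustrative assumptions)."""
    return {
        "InvoiceCustomerAccountNumber": customer,
        "CurrencyCode": currency,
        # The rate fixed with the bank when the currency was bought,
        # overriding whatever the default rate table would supply.
        "ExchangeRate": rate,
    }

payload = build_invoice_payload("US-001", "EUR", 1.0875)
body = json.dumps(payload)
# An integration would then POST `body` to the free text invoice
# OData endpoint with the usual OAuth headers, e.g.:
#   requests.post(f"{base_url}/data/FreeTextInvoiceHeaders",
#                 headers=auth_headers, data=body)
```

    The point of the sketch is that the caller, not the rate table, is the source of truth for the rate; today that only works in a limited set of legal entities.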
  • Add type column for fields of a data entity

    When we personalize a form, we can see each field together with its type (Text, Date, Number, etc.) for the tables behind the form. Similar functionality would be beneficial if a "Type" column were added to the Target fields of a data entity. Alongside the field list of the data entity, this new "Type" column would indicate the type of each field (Number, Text, Date, Option, etc.). This is beneficial in many ways, as we would not have to look in the Application Explorer to see field types. Dynamics users involved in data migration would benefit from this.

  • DMF Entity for Sequence Groups and Sequence Values

    An entity that could be used to import the data necessary to populate the sequence groups and sequence values for production order sequencing in master planning would be extremely useful. Currently there are two entities (Production order sequencing sequences, Production order sequencing sequence values) that can be used to import the sequences themselves, but there is nothing for sequence groups or for assigning the sequence values to an item. The value assignments in particular can be painful, as they can run to tens of thousands of records if the customer wants to assign sequences to items rather than to groups of items. Having to configure this manually is a daunting task.
  • Add data entity for WHSLicensePlateLabel , WHSContainerTable , WHSContainerLine , WHSWorkTable ,WHSWorkLine , WHSLoadLine

    The tables license plate label, container table, container line, work, work line, and load line do not have data entities. These are needed for some integrations.
  • Change start time and end time in Data Management to real running time

    Currently, the Start time and End time shown when we import and export data entities are the times of the internal import/export process, not the elapsed time the customer actually experiences. For example, a customer starts an import project that takes about one hour, but according to the Start time and End time it only took 10 minutes. This is confusing our customers. Fixing this would help customers in two aspects:

    * Batch queue planning - Knowing that a project takes an hour to run means it cannot be scheduled every 45 minutes. Currently, the only way to learn its duration for batch planning is to physically time it, which is not practical for lengthy data integration projects.
    * Auditing - As already indicated, understanding how long a process takes is key to the performance management of integrations.

  • Link to Line in file required when Import Fails due to bad / unclean data

    When importing data into an entity, if there is bad data (examples below), the exceptions are recorded in Staging log details. These lines will never make it into the staging tables. The errors are usually of type "value violates integrity", and the column of the table that threw the error is highlighted. However, there is no way of finding where in the imported file the error occurs. This becomes important when the file has many lines, say 1,000: finding where the error happened becomes a manual and extremely time-consuming task. This list of errors against the lines also needs to be available via the REST API. Bad data examples:

    1. The value "TEN" instead of "10" for the Quantity field, which expects a number and not a string
    2. An invalid date such as 15/15/2019
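    Until the staging log can point back to the source line, one workaround is a pre-validation pass over the file before import: scan the rows and report the line number of each value that would fail. A minimal sketch covering the two example failure modes above (the column names and rules are illustrative assumptions, not the entity's real schema):

```python
import csv
import io
from datetime import datetime

def validate_rows(csv_text):
    """Return (line_number, message) pairs for rows that would fail import."""
    errors = []
    reader = csv.DictReader(io.StringIO(csv_text))
    # Data rows start at line 2; line 1 is the header.
    for line_no, row in enumerate(reader, start=2):
        # Quantity must be numeric ("TEN" instead of "10" would fail).
        try:
            float(row["Quantity"])
        except ValueError:
            errors.append((line_no, f"Quantity '{row['Quantity']}' is not a number"))
        # Date must be a real date (15/15/2019 has no 15th month).
        try:
            datetime.strptime(row["Date"], "%d/%m/%Y")
        except ValueError:
            errors.append((line_no, f"Date '{row['Date']}' is not a valid date"))
    return errors

sample = "Quantity,Date\n10,01/02/2019\nTEN,01/02/2019\n5,15/15/2019\n"
for line_no, msg in validate_rows(sample):
    print(f"line {line_no}: {msg}")
```

    Running this on the sample flags lines 3 and 4 with their exact positions in the file, which is the information the staging log (and the REST API) should carry natively.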