When dealing with performance, there is a relatively new addition to the AX query system: the Query::insert_recordset method. This allows multiple records to be inserted into a table with a single SQL call, in the same way the insert_recordset statement works.
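As a rough illustration, here is a sketch of how the method can be called. The table names (SourceTable, TargetTable) are placeholders, and the shape of the field-mapping Map is from my recollection of the API, so verify it against the documentation before relying on it.

```
// Sketch: set-based copy of ItemId from SourceTable into TargetTable
// using Query::insert_recordset. Table and field names are hypothetical.
static void insertRecordsetExample(Args _args)
{
    Query                query = new Query();
    QueryBuildDataSource qbds;
    Map                  fieldMapping = new Map(Types::String, Types::Container);
    TargetTable          targetTable;

    qbds = query.addDataSource(tableNum(SourceTable));
    qbds.addSelectionField(fieldNum(SourceTable, ItemId));

    // Map the target field to [data source id, source field name]
    fieldMapping.insert(fieldStr(TargetTable, ItemId),
                        [qbds.uniqueId(), fieldStr(SourceTable, ItemId)]);

    // One SQL statement inserts all matching records
    Query::insert_recordset(targetTable, fieldMapping, query);
}
```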
While this method is great for performance, there is a problem with the standard cross-company control when using Query::insert_recordset. The cross-company logic only builds the extended select statement to include the company control (DataAreaId) when the QueryRun class is used. Since Query::insert_recordset does not use the QueryRun class, the cross-company control does not work and will not filter the data based on any company rules you apply; it will instead extract data for all companies.
The only workaround I have found is to add DataAreaId into the query, selecting the companies you want to use. This includes the company control in the SQL statement, ensuring only the companies you are interested in are processed.
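The workaround above can be sketched as an explicit range on DataAreaId, so the company filter is baked into the generated SQL. The table name and the company id 'ceu' are examples only.

```
// Sketch: explicitly range on DataAreaId so only the chosen companies
// are touched by the set-based insert. SourceTable is a placeholder.
QueryBuildDataSource qbds = query.dataSourceTable(tableNum(SourceTable));
QueryBuildRange      companyRange;

companyRange = qbds.addRange(fieldNum(SourceTable, DataAreaId));
companyRange.value(queryValue('ceu'));   // first company of interest
// add further ranges (or a comma-separated value) for more companies
```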
There are a few things to think about when writing code: design requirements and performance. If we include performance in our thought process while developing, we can often improve the overall performance of our code at the time of writing.
When you need to import large numbers of item records into AX2012, you will find the time it takes to load them fairly slow. In my example I have 200,000 records to load. First I need to create the product masters, then release the products to the different companies. I found the creation of the product masters took about 3 hours to complete, with the release to each company taking around 3 hours as well, and I had two companies to release to.
The use of RecordInsertList to increase the performance of inserts has been well documented, but I was curious how RecordInsertList would scale and what impact that would have on performance.
To understand the impact on performance and scalability, I created a table with one field. I then wrote a job that would insert X number of records into that table, first using the standard insert method and then using the RecordInsertList method. Each pass recorded the number of milliseconds taken to complete. I did this for 100 records, then 1,000, 5,000, 10,000, 20,000, 40,000, 100,000, 1,000,000 and 10,000,000.
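The benchmark job can be sketched roughly as below. BenchmarkTable, with its single string field Value, is the hypothetical one-field table from the text; WinAPI::getTickCount() runs client-side, so a server-side equivalent would be needed when running in batch.

```
// Sketch of the benchmark job: time a standard per-record insert loop
// against RecordInsertList for the same number of rows.
static void benchmarkInserts(Args _args)
{
    BenchmarkTable   benchTable;
    RecordInsertList insertList;
    int              rows = 10000;    // vary per pass: 100, 1000, 5000, ...
    int              i;
    int              start;

    // Pass 1: standard insert, one SQL statement per record
    start = WinAPI::getTickCount();
    ttsBegin;
    for (i = 1; i <= rows; i++)
    {
        benchTable.clear();
        benchTable.Value = int2str(i);
        benchTable.insert();
    }
    ttsCommit;
    info(strFmt("insert():         %1 ms", WinAPI::getTickCount() - start));

    // Pass 2: RecordInsertList, records buffered and sent in blocks
    insertList = new RecordInsertList(tableNum(BenchmarkTable));
    start = WinAPI::getTickCount();
    ttsBegin;
    for (i = 1; i <= rows; i++)
    {
        benchTable.clear();
        benchTable.Value = int2str(i);
        insertList.add(benchTable);
    }
    insertList.insertDatabase();   // flush any remaining buffered records
    ttsCommit;
    info(strFmt("RecordInsertList: %1 ms", WinAPI::getTickCount() - start));
}
```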
The use of SANs in today's network environments is almost commonplace, but are the performance requirements of the systems using those SANs considered in the design of the SAN setup?
A typical problem I find is that people do not consider how the SAN should be set up to get the best performance. For DAX we are primarily talking about the considerations for SQL Server, regardless of whether SQL is virtualized or physical.
The standard ‘out-of-the-box’ system will perform reasonably for many companies, but there is always a need to test performance and tune the system to your requirements.
If your implementation uses inventory, and if you are using batch and/or serial numbers, then you might find your inventory performance sluggish. You might want to look at the indexes on the tables InventDim and InventSum, as these tables are used in almost every query involving inventory.
Using the Database Engine Tuning Advisor to analyse a SQL query can give you an insight into where to look for performance problems. Recently I had a problem opening the advisor from a query window, receiving the error: “Failed to connect to an IPC Port: The system cannot find the file specified”.
The problem was caused by a previous use of the system that had not been shut down cleanly. The simplest way to resolve it is to open Windows Task Manager and end the process DTAShell.exe.