Try recycling your app pools again, and make sure you close all the way out of Epicor before you go back in and try again. Also verify in UD Column Maintenance that it says the table is in sync. I’ve had issues with UD fields in the past where I had to recycle the app pool multiple times before it worked correctly.
I do not mix different versions of Epicor on the same server running the Kinetic application pool; I use a separate virtual machine for each version. If you have a SQL Server in development that is separate from your application server, you can have two databases at different Kinetic versions, as long as both Kinetic versions support the Windows and SQL versions running them. So you can have a single SQL Server running two Kinetic databases at different versions, with two different servers running the application pools, one for each Kinetic version.
We’ve got two 2019 servers running version 2023.1.6 and both are running fine. I did need to install the .NET prerequisites, but that was it. Are you having specific errors? We did have some issues when installing 2022 because we hadn’t installed the right .NET items. Here’s what I used for the 2023.1 install… Also, the order in which you install .NET and IIS does matter; this is from the new install guide…
I have 2023.1 installed in a Test environment on 2019 servers and I didn’t have any issues installing it. I would go back to the new install guide and verify that you have all the prerequisites installed. If you are still getting errors, there’s a good chance someone else has also seen them… search for the error on https://www.epiusers.help/
I know this is an old post, but I would like to offer a suggestion in case anyone comes across this. The APInvHed table does have a ReceivedDate field, but that field is not from Receipt Entry; its description is “Received Date. Used for Poland Localization.” I’m not sure what it is actually for, but it usually doesn’t have data in it. Since the APInvHed table doesn’t have the ReceiptDate on it, you will need to query the RcvHead table to get that, and since APInvHed doesn’t have the PackNum on it either, you will also need to include the APInvDtl table in this query. To do this without custom code you will need to be on a version of Epicor that supports BPM queries; I know for sure 10.1.600 and newer does. To try this with a BPM query… Create a Data Directive on the APInvDtl table, because this table contains the PackNum and is created after the APInvHed table. Add a Condition widget to this directive and add the “Numbe
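The join the post describes (APInvDtl carries the PackNum, RcvHead carries the receipt date) can be sketched in plain Python. This is only an illustration of the lookup logic with made-up sample rows, not the BPM widget syntax or Epicor's full key structure (RcvHead is really keyed by vendor and pack slip together):

```python
# Hypothetical sample data standing in for the Epicor tables.
# RcvHead, simplified to a PackNum -> row lookup.
rcv_head = {
    "PS-1001": {"ReceiptDate": "2024-03-05"},
    "PS-1002": {"ReceiptDate": "2024-03-09"},
}

# APInvDtl rows carry the PackNum that APInvHed lacks.
ap_inv_dtl = [
    {"InvoiceNum": "INV-1", "PackNum": "PS-1001"},
    {"InvoiceNum": "INV-2", "PackNum": "PS-1002"},
]

def receipt_date_for_invoice_line(dtl_row):
    """Follow APInvDtl -> RcvHead via PackNum to recover the real receipt date."""
    head = rcv_head.get(dtl_row["PackNum"])
    return head["ReceiptDate"] if head else None
```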
Very recently we had this same discussion about which way to use this, and what you are saying, Bruce, is probably why the buyers have been reluctant to use the Promise Date. The trouble is buyers don’t remember to populate the Promise Date with the original due date either, so the Promise Date doesn’t work at all right now for our reporting. I’m thinking maybe we need a BPM to populate the Promise Date when they first enter a Due Date.
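The seed-the-Promise-Date idea is simple enough to sketch. This is just the decision logic, not BPM code; the field names mirror Epicor's DueDate/PromiseDt but the function itself is hypothetical:

```python
from datetime import date

def default_promise_date(due_date, promise_date):
    """When a Due Date is first entered and Promise Date is still empty,
    copy the Due Date in so the original commitment is preserved.
    Otherwise leave the existing Promise Date alone."""
    if promise_date is None and due_date is not None:
        return due_date
    return promise_date
```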
For the Due Date, I think the best way is to use the Promise Date to store the date the supplier says they can deliver, and not move the Due Date. But we struggle with this because our buyers like to move the Due Date instead. One problem with moving the Due Date is that supplier performance reporting is then inaccurate, because we use the Due Date to calculate Days Early/Late on our reporting. If the buyer and supplier both agree it’s OK to move the Due Date, that’s fine. But the Due Date often gets moved when we get an update from the supplier that it’s going to be late and the buyer wants to reflect that; when this happens the reporting will not be correct, since the Due Date is no longer the date we wanted… in this situation they should be using the Promise Date. The Promise Date does display in other areas, like Time Phase. For supplier performance reporting we have created our own reports that show Days Early/Late for each release and a summary of to show the
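The Days Early/Late measure referred to above is just the gap between the receipt date and the Due Date. A minimal sketch, assuming the convention that positive means late and negative means early (the post doesn't say which sign its reports use):

```python
from datetime import date

def days_early_late(due_date, receipt_date):
    """Supplier performance metric: receipt_date - due_date in days.
    Positive = delivered late, negative = delivered early, 0 = on time.
    (Sign convention is an assumption, not taken from the post.)"""
    return (receipt_date - due_date).days
```

This is why moving the Due Date distorts the metric: if the buyer slides the Due Date out to match a late delivery, the difference collapses to zero and the lateness disappears from the report.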
You have a specific user (or users) set up for DMT, correct? And only those users are allowed to create a ShipHead record, and they are only going to be entering ShipHead records through DMT, correct? If this is true, then you can create a BPM Method Directive on the CustShip.GetNewShipHead method… Then, using Pre-Processing, add the condition "The method <is not called> by <specified user>" (if you have multiple DMT users you can add multiple conditions with "or" between them)… Then on the "True" side of the condition, link a Raise Exception widget to prevent them from starting a new ShipHead record. Or… if you have multiple DMT users, you can create a group like I said before and, instead of using the condition above, use the condition "The user who called the method <does not belong> to <DMT> group", and on the "True" side of the condition link the Raise Exception. I have attached these two BPM examples above… in the first example I'm using System Manager as the user, so if your
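The group-based variant of that directive boils down to a gate check before the record is created. A sketch of the logic only (the group name and user IDs are made up, and the real thing is built with widgets, not code):

```python
class BpmException(Exception):
    """Stand-in for the BPM Raise Exception widget."""

# Hypothetical members of a "DMT" security group.
DMT_GROUP = {"dmt.user1", "dmt.user2"}

def get_new_ship_head(caller):
    """Mimics the Pre-Processing condition: if the caller does not belong
    to the DMT group, raise an exception and block the new ShipHead record."""
    if caller not in DMT_GROUP:
        raise BpmException("Only DMT users may create ShipHead records.")
    return {"caller": caller, "table": "ShipHead"}
```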
Try using the CustShip.Update method instead. Even though it is a new record, I believe it still uses the Update method to initially save the record. Post-Processing directives are typically used with GetNew methods. If that doesn't work, try using a Data Directive BPM. Another tip: you can create a security group for DMT, put the DMT users in it, and use that security group as a criterion in your BPM.

------------------------------
Jeff Martinson
TEAM Industries
------------------------------
I'm guessing you mean a way to log when a BAQ is run, correct? If so, I did this with a Method Directive BPM on the DynamicQuery.Execute method, Pre-Processing. I have attached the QueryLog BPM I'm using… In this BPM, every time a non-system query (one that does not begin with "z") is run, it writes a record to table UD35. Before you enable this BPM, make sure you are not already using table UD35… if you are, modify the BPM to use a different table. FYI, I created this BPM just to run temporarily before a migration… if I were going to run it all the time in a Live environment, I would probably modify it to keep a counter each time a user runs a query, instead of creating a new record every time. This would limit the number of log records created. So, for example, each time this BPM runs it would first check whether there is already a UD35 record for this user and query, and if there is, it would update the record and add 1 to
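The counter variant described above is an upsert: look for an existing (user, query) record, bump it if found, otherwise create it. A sketch using a dict in place of UD35; the field mapping (Key1 = user, Key2 = query, Number01 = run count) is my assumption about how one might lay this out on a UD table:

```python
# Stand-in for the UD35 table, keyed by (user, query).
ud35 = {}

def log_query_run(user_id, query_id):
    """Upsert a per-user, per-query run counter and return the new count."""
    key = (user_id, query_id)
    if key in ud35:
        ud35[key]["Number01"] += 1   # existing record: bump the counter
    else:
        ud35[key] = {"Key1": user_id, "Key2": query_id, "Number01": 1}
    return ud35[key]["Number01"]
```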
Carol, unfortunately the Top - Set Rows option doesn't work the way you would expect. It finds the top rows in the subquery before applying any joins to that subquery, which often results in a null value when the subquery is joined to another table. This is kind of hard to explain, but I'll do my best; I've attached a BAQ example of this. The best way I've found to do first/last on a table, like you could in E9, is to create an InnerSubQuery, then add a calculated field that uses the row_number() expression. What row_number() does is sequentially number each returned row, starting at 1. Then, when you join this subquery to another table, you create a criteria on top of the subquery and set the row_number calculated field = 1; this limits the results to the first record. The advantage of the row_number expression over E9's first/last is that you have full control over how the data is partitioned (grouped), which field it is sorted by, and the direction. In E9
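The row_number() trick can be mimicked outside SQL to see what it does: partition the rows, order them within each partition, and keep only row number 1. A sketch in plain Python with made-up sample columns (PartNum, ActionDate), relying on Python's stable sort to keep the within-partition ordering:

```python
from itertools import groupby

rows = [
    {"PartNum": "A", "ActionDate": "2024-01-03"},
    {"PartNum": "A", "ActionDate": "2024-01-10"},
    {"PartNum": "B", "ActionDate": "2024-02-01"},
]

def first_per_partition(rows, partition_key, order_key, descending=False):
    """Equivalent of filtering on
    row_number() over (partition by partition_key order by order_key) = 1."""
    ordered = sorted(rows, key=lambda r: r[order_key], reverse=descending)
    # Stable sort: rows stay in order_key order within each partition.
    ordered = sorted(ordered, key=lambda r: r[partition_key])
    return [next(grp) for _, grp in groupby(ordered, key=lambda r: r[partition_key])]
```

Flipping `descending` is what turns "first" into "last", which is exactly the control over sort direction the post is pointing at.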