What is the best way to remove or retire an asset that was imported by accident but has accumulated depreciation?
Thanks
We have a few A/P vendors that still have a credit balance on their accounts. We have asked each vendor to send us a check, and when the check arrives I am unsure of the best way to get the credit balance off of the vendor's account. Do I create an invoice in the Payables Transaction Entry window and change the distribution?
The client is using Microsoft Dynamics GP 2015 and Office 2013. The problem is that when we try to export the customer list from SmartList we get the following error, but exporting the vendor list from SmartList works fine:
Excel cannot open the file '.xlsx' because the file format or file extension is not valid. Verify that the file has not been corrupted and that the file extension matches the format of the file.
I have already checked there are no viruses on the server.
Need help resolving this problem.
I have an existing integration with Great Plains importing JE's from a source Enterprise Asset Management system. These journal entries record receiving and inventory transactions.
Since the integration went live last year, I have been using the actual journal entry number from the source system as the Journal Number in GP, because I liked the idea of having the actual number from the source system as the JE number in GP.
This works fine, as the JE's from the source system are in the 100000 range and the journal numbering in GP is already at 240000, so there is little chance of them stepping on each other year-over-year. There is also a unique alphanumeric ID coming from the source system that I place in the Reference field of the journal entry, but the actual way to tie back to the entry from the source system is the imported JE number.
So, everything went fine for a year. The client stores backup images for the JE's in ImageSilo, and using an app provided by the ImageSilo vendor, we update metadata on the ImageSilo documents with data from GP.
Some of the metadata in ImageSilo from an older year got stomped on by data from the new year, because the incoming JE numbers for the current year also exist in a previous year. This is fine in GP, of course, but my contact at the company did not like it, and I could not convince him that if we added a year key to the query that populates the ImageSilo metadata, everything would be fine. I could not get him to understand that GP is built to allow the same JE number in a different year, because the year is part of the key.
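For what it's worth, the year-keyed lookup I proposed for the ImageSilo metadata update is along these lines (a rough sketch only, against the open-year table; closed years in GL30000/HSTYEAR would need a UNION, and the column names are from memory):

-- One row per journal entry, keyed by year + journal number, so the same
-- JRNENTRY value from two different years can never overwrite the other's metadata.
SELECT DISTINCT
    OPENYEAR AS TrxYear,        -- the year is part of GP's key
    JRNENTRY AS JournalNumber,  -- imported from the source EAM system
    REFRENCE AS SourceRef,      -- the unique alphanumeric ID from the source system
    TRXDATE
FROM GL20000
WHERE SOURCDOC = 'GJ';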
So I am testing a switch in SmartConnect to use the GP Rolling Column for JE numbers instead of the incoming JE number, and I don't like losing that key. I feel like it opens me up to more duplicates: if there is a hiccup in the system for some reason, it could end up importing the same GJ's over and over and just assign them new JE numbers. The design of the interface is good and this scenario is highly unlikely, but it does not sit well with me. I can check against the other value in the Reference field, but that field is editable, and if a user deletes or alters that value, the connection is lost.
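The duplicate check I have in mind against the Reference field is roughly this (untested sketch; it just flags source reference IDs that ended up under more than one JE number in the same year):

-- A source reference mapped to more than one JE number in a year would
-- indicate the same GJ was imported twice and assigned a new number each time.
SELECT OPENYEAR, REFRENCE, COUNT(DISTINCT JRNENTRY) AS JECount
FROM GL20000
WHERE SOURCDOC = 'GJ'
GROUP BY OPENYEAR, REFRENCE
HAVING COUNT(DISTINCT JRNENTRY) > 1;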
The other proposed option is to append a zero to the incoming JE number so that it is well above the number series in GP, but that also has its problems.
Has anyone done a General Journal import project like this and have some advice?
Thanks
Using GP 2013 R2 with the enhanced PO system. I would like to set a permanent destination path for the output of two PO reports and cannot locate where in GP to do so. These two reports are not in the standard GP Posting Setup window (Tools > Setup > Posting > Posting > Purchasing > Origin...). They are also not in the SY02200 SQL table, which appears to be the table that stores all other GP report destinations.
Any and all assistance is greatly appreciated!
Hi all,
Hope you can help me with this problem.
My client has an implementation of GP Analysis Cubes for GP 10.0. The cubes are the 2005 version.
We have a problem where, for some accounts, the open-year transaction debits and credits in the GP DataWarehouse do not match the corresponding amounts in GP source table GL20000. The amounts in GL20000 itself are correct.
In some instances, not only are the transaction amounts wrong, even the transaction dates are wrong in the GP DataWarehouse, which is really worrying.
The nightly DW SSIS update runs with no problems.
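To show what I'm comparing: on the GP side I'm pulling open-year totals with something like this (a rough sketch, GL20000 column names from memory; the DW side would need the matching aggregation over its GLTransactions table, whose column names I can only guess at since the packages are locked):

-- Open-year debit/credit totals per account from the GP source table,
-- to line up against the same grouping in the GPDataWarehouse GLTransactions table.
SELECT OPENYEAR,
       ACTINDX,
       SUM(DEBITAMT) AS TotalDebits,
       SUM(CRDTAMNT) AS TotalCredits
FROM GL20000
GROUP BY OPENYEAR, ACTINDX
ORDER BY OPENYEAR, ACTINDX;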
I am very puzzled by this. Questions:
1. Has anyone experienced this problem? Any idea under which circumstances this could happen (e.g. some stray late update of the source tables)?
2. Is there a way to force re-population of transactions in the GPDataWarehouse for the offending company and table (e.g. GLTransactions)?
3. Can the GPDataWarehouse table LastUpdated be used to force re-population, by somehow changing the LastRow and DateUpdated columns, given that the data in the source table is correct?
4. What would happen if I delete the offending records in the GPDataWarehouse? Would they be re-inserted from the source tables, presumably with correct values and dates?
5. More generally, could someone explain how the record inserts/updates in the GPDataWarehouse via the SSIS packages actually work for GP Analysis Cubes? What is the logic for record inserts/updates? I couldn't find any information on this important aspect, nor can I inspect the SSIS packages involved, as they are password protected.
I really hope that someone can help me.
Many thanks.
Regards,
Davor
Hi everyone!
A client of ours is having trouble when vendors reply to the auto-generated emails originating from Great Plains. This is a sporadic issue, but we've observed it across numerous mail hosts running Exchange, IMAP, and POP, and the NDR is always the same. I'll add it at the bottom of the post so it reads better.
This issue also only happens when the emails are generated from Great Plains, not when the user sends a normal email. If you reply to an auto-generated email, all of the recipients other than the user who created the email in GP will receive the reply. Also, if you reply to an auto-generated email, remove the user's email address, and then re-add the same address (to rule out GP mangling the address), the results are the same. This is how we have narrowed it down to an issue in the body of the email that Dynamics GP generates.
These emails are pre-composed templates that the user can view under Setup -> Email Message Setup. The issue still persists if we remove the user's email from the Have Replies Sent To option in Email Message Setup. We use these template emails on all orders, and they all return the same NDRs. Also, under System Preferences we are using MAPI for the server type.
We really do think this is either an issue with the way GP creates these emails or an issue with our email templates in GP. Any insight is greatly appreciated.
Hopefully I've explained this all well, but I do have access to this user's computer with a way to recreate a sale and auto-generate an email. If I've left anything out that would help, please let me know!
Thanks everyone in advance.
NDR: "Remote Server returned '<DM2PR0601MB1166.namprd06.prod.outlook.com #5.2.0 smtp;554 5.2.0 STOREDRV.Deliver.Exception:CorruptDataException.EndOfStreamException; Failed to process message due to a permanent exception with message A participant ENTRYID is malformed and cannot be read. EndOfStreamException: A participant ENTRYID is malformed and cannot be read. [Stage: OnPromotedEvent][Agent: Conversations Processing Agent]>'"
Hi,
We are upgrading from GP 10 to GP 2013.
We use eConnect to create PO batches. eConnect is installed on a different server (not on our upgraded GP server), and we have a job that runs nightly on that server and creates PO batches in GP. We use the same job to create a monthly journal entry as well.
I was told that since we are upgrading the GP environment we have to upgrade eConnect too, and obviously use the new DLLs and change the custom code accordingly.
My question is: since eConnect is installed on a different server and not on our GP server, do we really have to upgrade eConnect?
Please advise. Any information or help would be much appreciated. Thanks
I am just getting started with eConnect (so this may be an obvious problem), and I am following the "CSHARP Requester Console Application" example to test whether I can connect to GP using eConnect.
My code works up to the requester.GetEntity(...) call, which never returns - I waited well over a minute before terminating the process.
I would assume that if I did anything wrong, eConnect would throw an exception I could catch, but that does not seem to be happening.
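In case it matters, this is roughly what my test looks like, pared down (the GetEntity call and connection string follow the SDK sample; the server/database names are placeholders, and the Task wrapper is just something I added so the console gives up instead of sitting there forever):

using System;
using System.Threading.Tasks;
using Microsoft.Dynamics.GP.eConnect;   // eConnect .NET assembly from the SDK

class RequesterTest
{
    static void Main()
    {
        // Placeholder connection string - points at the GP company database.
        string connectionString =
            "data source=MYGPSQL;initial catalog=TWO;integrated security=SSPI;persist security info=False;packet size=4096";

        // Requester XML built per the CSHARP Requester Console Application sample
        // (omitted here - mine is copied straight from the SDK example).
        string requestXml = BuildRequestXmlFromSample();

        using (eConnectMethods requester = new eConnectMethods())
        {
            try
            {
                // Run the call on a worker task so the console gives up after
                // 30 seconds instead of blocking indefinitely, which is what I see now.
                Task<string> call = Task.Run(() => requester.GetEntity(connectionString, requestXml));
                if (call.Wait(TimeSpan.FromSeconds(30)))
                {
                    Console.WriteLine(call.Result);
                }
                else
                {
                    Console.WriteLine("GetEntity did not return within 30 seconds.");
                }
            }
            catch (AggregateException ex)
            {
                // An eConnect exception thrown inside the task surfaces here
                // as the inner exception - this is what I expected to be catching.
                Console.WriteLine("Call failed: " + ex.GetBaseException().Message);
            }
        }
    }

    static string BuildRequestXmlFromSample()
    {
        // Placeholder - in my test this returns the serialized requester
        // document built exactly as in the SDK sample.
        return "<eConnect> ... </eConnect>";
    }
}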
The customers started getting errors when generating a report, so I had them check the Event Viewer, and they are receiving the following error. They restarted the server and were able to run reports, but this morning the error started again. Any suggestions on what may be causing it?
“System.ServiceModel.FaultException`1[Microsoft.Dynamics.Performance.Reporting.Common.Service.ServiceFault]: The operation could not be completed because the item is protected. (Fault Detail is equal to Microsoft.Dynamics.Performance.Reporting.Common.Service.ServiceFault).”
Thank you,
Angie
Has anyone else thought about this, or am I the only one using Business Portal for employee time cards/self service?
I am working with our developer on creating a migration process, but it is becoming quite involved.
Updating our test environment with current data requires a very time-consuming recreation of users/access/security/timecards, etc.
Any ideas? Thoughts?
Thank you....
Michael
We recently upgraded a customer to GP 2013 R2, and the user notified me that since the upgrade she is not getting the message to change the user date when GP is left open from the day before.
The first thing I did was check the DEX.INI for the suppress-change-date setting (the DEX.INI changed substantially in GP 2013), but it was not there. Is this set up differently in this version? How can I get the message to pop up after midnight again?
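For reference, the Dex.ini line I was looking for is the one below (the setting name is as I remember it from earlier builds, so treat it as an assumption; as I understand it, TRUE suppresses the prompt and FALSE, or leaving the line out entirely, should let the after-midnight message appear):

SuppressChangeDateDialog=FALSE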
Help is always appreciated :)
Over the last month I have been working and learning in a test company in GP. I've set it up to be exactly like what I want my actual company to be. When I get to the point where I want to start my new company and set it up, what is the best way to transfer all of the information from the test company over to the real company? For instance, the categories, accounts, vendors, customers, etc. Also, can you move transactions/historical amounts as well? You guys are always so helpful; I learn something new every day I get on here!
Thanks!
Anyone seen or figured out why the web client in GP 2015 always defaults to Template as the report type? In the desktop client it respects the choice in the Template Configuration window. But the web client seems to ignore that and always defaults to Template no matter the Configuration setting.
After migrating from FRx to MR 2012 CU11, if we run any of the reports using the tree we get the message "invalid report definition data". If we run the reports without the trees, they run fine. We checked the trees and they all look fine. What should we look at to correct this?
Hi all,
So I'm trying to set up the Business Analyzer app for GP. It asks for my username, password, and service connection - where can I find this last piece? Also, a different topic, but does anyone know how to get Management Reporter reports to show up inside GP?
Thanks,
Kimberly
Have been all around this and can't find an answer. In the SSRS GL Trial Balance Summary report, we have accounts that have had no activity this year but do have a beginning balance. They even have a transaction dated in the future (4/30/15), just nothing yet in the period. Our fiscal years run 4/1 - 3/31, so in GP fiscal year 2016 is 4/1/2015 to 3/31/2016. If I run the report for 4/1/15 - 4/24/15, two accounts are missing. If I rerun it and choose "Include Zero Balance", they show up, but that also pulls in a lot of other accounts that our accounting department does not want on the report. If I run it with a date range that includes the future transaction, they show up.
Found another post that referenced a similar issue and a stored procedure to fix it, but that did not work for me, so I put the procedure back to the original.
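For what it's worth, while digging I was sanity-checking the beginning balances directly in the summary table with something like this (assuming I have the table and period convention right - GL10110 as the open-year account summary, with period 0 holding the beginning balance):

-- Beginning balance rows for our fiscal 2016 (4/1/2015 - 3/31/2016).
SELECT ACTINDX, YEAR1, PERIODID, PERDBLNC
FROM GL10110
WHERE YEAR1 = 2016
  AND PERIODID = 0   -- period 0 = beginning balance
ORDER BY ACTINDX;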
GP 2013 R2 version 12.00.1826. SSRS on SQL Server 2014 12.0.2000.8.
Does anyone know the role/task to assign a user to allow access to the MRP Quantities Inquiry screen?
We have a user assigned to MFG Admin who cannot access this screen.
Not sure why.
Tami
In GP we have a 3 dimension format - 000-0000-00.
We use the first three digits to differentiate our products. The next four are the account descriptions, and we truly don't use the last two as of now.
I'm trying to build a basic reporting tree where each account falls under its product, then its category. For instance, Product A's cash would fall under Product A - Assets - Cash.
For some reason, every time I try to set up anything I always get crazy results.
First, notice I'm showing 4 dimensions instead of 3 - is this supposed to happen?
Next, when I type in just account categories, they appear perfectly... except for the fact that I have categories listed that I no longer have (in this test environment I added and deleted some categories).
How can I go about fixing this?
Okay, next, I try to set up the account like I want (see the earlier example). Instead I get this result.
Note that the bottom few are picking up some accounts - after some investigating, I noticed they were the only two accounts I had used to make some fake transactions.
When I go back to the menu to change the order of things, I notice that the dimensions I had checked to show had changed... odd.
So, what am I doing wrong? Does anyone have some pointers? I'm certainly lost!
Thanks!
I just turned on direct deposit, and for some reason the updates to the GL are by individual, not in a lump sum as they were previously. It is doing this for both checks and direct deposit transactions. I did configure the direct deposit setup as follows:
Is there another setting driving the update to the GL?