This probably happened because of a bug in the SOLIDWORKS 2019 PDM Administration tool, triggered when we switched the variable from a mapped one to a version-free one.
Now my VAR insists that to update all the data in the vault that uses that variable, I must check all the files out, rebuild them, and check them back in. The variable was initially mapped to a custom property that we no longer need, and we do not care if it stays in our files unused.
Switching to a version-free variable deletes the link between the mapped property and the PDM version-free variable in the dbo.attributes table, so I see no reason to check out hundreds of files just to rebuild them.
I understand that the existing data (with its history) is not transitioned to the version-free variable unless the files are checked out, but that is not the problem here.
I also asked my VAR whether I can use the XML import tool from PDM Administration to update our version-free variables in batch. They said the tool is not intended for that purpose and that we still need to check out our data.
I would like to import the variable values from our ERP and match them against our unique file names, so it seemed like a good fit to me.
If I were in your shoes and needed it fixed, I’d write a script to fix the VariableValue table. I know SolidWorks says not to touch the tables, but that table doesn’t have triggers, and editing it in a controlled way is OK. In fact, during our migration, the consulting company that performed it added the data card variables directly to the table, as that was much faster than using the API.
This takes some T-SQL knowledge, but you could change the version to “0” for the latest version of the variable and then delete the rest. Obviously this needs to be done in a test environment and rigorously checked. Then take a backup before going to production, and check again afterwards.
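A minimal T-SQL sketch of that idea, assuming the dbo.VariableValue columns are DocumentID, VariableID, and RevisionNo, and that @VarID holds the ID of the variable in question (all of these names are assumptions from memory and must be verified against your own schema, on a restored test copy of the vault database, before anything else):

```sql
-- HYPOTHETICAL SKETCH ONLY: run against a restored TEST copy of the vault
-- database, and verify table/column names against your schema first.
DECLARE @VarID int = 123;  -- assumed: the ID of the variable to flatten

BEGIN TRANSACTION;

-- Re-tag each document's latest value of the variable as version 0
UPDATE vv
SET vv.RevisionNo = 0
FROM dbo.VariableValue AS vv
WHERE vv.VariableID = @VarID
  AND vv.RevisionNo = (SELECT MAX(v2.RevisionNo)
                       FROM dbo.VariableValue AS v2
                       WHERE v2.DocumentID = vv.DocumentID
                         AND v2.VariableID = @VarID);

-- Delete the remaining (older) versioned values
DELETE FROM dbo.VariableValue
WHERE VariableID = @VarID
  AND RevisionNo > 0;

-- Inspect the result first, then change this to COMMIT; until then:
ROLLBACK TRANSACTION;
```

Note that if a document already has a RevisionNo = 0 row alongside versioned rows, the UPDATE above would create a duplicate, so check for that case first.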
jcapriotti I agree with your suggestion, but we do not have the skills in-house yet, and our VAR (which I hope to soon call our “former” VAR) is not reliable.
They even suggested using cold storage for version control, while the official KB explicitly says “do not do it; it is not what cold storage is designed for”.
This is why I am looking here for something that can run a batch variable input, similar to an add-in we use to batch-input mapped properties.
The XML import seemed good at first glance, but it appears you need to integrate it into workflow actions, while we need something “stand-alone”, basically for one run.
I’ve done something similar to repair variables. I have a separate workflow called “variable repair”. It has one state, with a transition out to another workflow link. Then, in my file’s workflow, I create a transition to a workflow link into the repair workflow. Set your transition action to set the variable value. Then you can push files through in bulk, as long as you can search and limit the file set to one particular value.
Are your servers local or in Azure? I often do these kinds of bulk operations directly on the archive server; I have the PDM client set up on it. This takes as much latency out of the equation as possible.
Local server, bare metal.
The test server is also a very old and slow machine running both SQL and the archive all in one (about six slow cores under 2 GHz, 32 GB of RAM).
I fear it may not be a latency problem; from the SQL performance monitor in SSMS, it looked like the query was simply running forever.
Do you have SQL maintenance plans for rebuild, reorganize, integrity check, backup, and backup cleanup? If not, you may have some fragmentation slowing your database down.
Yes, we do maintenance, and the backup I am using for the tests was optimized only about a week ago. I will try to optimize it manually again on the test server before running the state change again, but I am not very confident it will help.
All the documentation about PDM SQL maintenance (from Microsoft and SolidWorks) basically told us we do not need maintenance, which is not true: after running maintenance (statistics, index, and table rebuilds), SQL response became a lot faster.
The reasoning seems to be that below X pages and a certain fragmentation percentage, you apparently do not need to run that kind of maintenance, and we are always under all the suggested thresholds. (On top of that, the Japanese document from SW was plainly wrong, and not from a translation problem: it confused rebuild and reorganize, as if written by someone who does not understand SQL. I reported it to my VAR and to SW, and they confirmed their Japanese QA is plainly wrong.) There are a couple of QAs in the KB with a query that checks all the tables for fragmentation, and it never returned “maintenance needed” for us; our parameters are apparently always under the suggested thresholds. Still, if you run maintenance it gets better, and in some cases it got a LOT better.
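For reference, a fragmentation check along the lines of what those KB QAs describe can be done with the standard SQL Server DMV (this is a generic query, not the KB’s own; Microsoft’s common rule of thumb is to reorganize above roughly 5–30 % fragmentation, rebuild above ~30 %, and usually ignore indexes under about 1,000 pages):

```sql
-- List index fragmentation in the current database, worst first.
SELECT OBJECT_NAME(ips.object_id)        AS TableName,
       i.name                            AS IndexName,
       ips.avg_fragmentation_in_percent  AS FragPct,
       ips.page_count                    AS Pages
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN sys.indexes AS i
  ON i.object_id = ips.object_id AND i.index_id = ips.index_id
WHERE i.name IS NOT NULL  -- skip heaps
ORDER BY ips.avg_fragmentation_in_percent DESC;
```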
We have an integrity check before our daily backup; it runs every day.
We launch a reorganization every month and, offset by two weeks, we also run a rebuild with a statistics update.
I had a brief exchange with Tor Iveroth from the PDM dev team, and he sent me, via 3DEXPERIENCE, a version-free variable importer that reads from Excel.
I overwrote some hundreds of file cards with it, and it works.
It is sample code, so there are some issues, but it is quite usable and I can work around them.
I recall the following:
An empty value is ignored, so to “delete” a version-free variable you need at least a space character.
The log file counter seems a little buggy, not taking failed files into account. The log is overwritten every time the exe is started, and appended to if the exe is kept open and multiple Excel files are loaded and processed.
The full path is required; if the file is not found or has been moved, the variable is not processed.
Article ID: QA00000111578
Abstract: Is there an API example that demonstrates how to bulk update (import) card variable values from an Excel spreadsheet to SOLIDWORKS® PDM?
He’s more than a support guy, but still a great support guy. I get the feeling he just wants everyone to use Conisio to its fullest. I like the SPRs written by him; you can usually tell his by their format and clarity.