Advice about how to return data to PDM

I am looking for an opinion, or a tested way, to handle this issue with EPDM.

Our departments are asking to be able to pack-and-go some PDM data and outsource the job to external designers and design offices, who would finish or modify our projects and return them to PDM once done. Externals are cheaper, but they are equipped at the bare minimum and cannot afford PDM.
At the moment the process is complete anarchy and some data ends up outside the vault, so this needs to be fixed somehow, and in a decent way.

I was thinking about the simplest concept to do it:

  1. Check out the data to be outsourced (not even done at the moment).
  2. Pack and Go the files and send them to the outsourcer.
  3. When the data comes back, run a sanity check on the file properties and clean them up outside the vault.
  4. Replace all the variables with the ones from the corresponding checked-out files in the vault.
  5. Overwrite the checked-out files in the vault with the new ones from outside the vault.
  6. Check in the files (the file version increases).
  7. New files in the vault are reset by our default workflow, so they need little extra effort.
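The reconciliation in steps 3–5 could be scripted. A minimal sketch in plain Python (no PDM API involved; the folder and file names are made up) that compares the returned package against the set of files checked out of the vault, before anything gets overwritten:

```python
from pathlib import Path

def reconcile(returned_dir: str, checked_out: set[str]) -> dict:
    """Compare files returned by the outsourcer against the names
    checked out of the vault. Only matching names are candidates to
    copy over the checked-out originals; extras need manual review."""
    returned = {p.name for p in Path(returned_dir).rglob("*") if p.is_file()}
    return {
        "overwrite": sorted(returned & checked_out),  # safe to copy back
        "missing":   sorted(checked_out - returned),  # never came back
        "extras":    sorted(returned - checked_out),  # new files, review first
    }
```

Anything in "extras" would then go through the default new-file workflow from step 7 instead of overwriting a checked-out file.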

While viable for a few dozen files, this process must be automated, or replaced by a better workflow.
It is only a rough idea at this stage.

One drawback for assemblies is that their parts and subassemblies could be in a lot of different states, not necessarily locked, and I likely won’t be able to roll the outsourced assembly back: our legacy workflow permanently saves only official releases, and we have a very small cold storage (10 versions).


thanks in advance

I don’t recommend this if you have a complex product and process with lots of changes happening. Our company forced us to try this years ago and it was a major flop. We spent more time gathering the files, packing them up, sending them to the remote group, then receiving them and checking that everything was OK. There was a 12-hour time zone difference (US to India), so questions back and forth were often delayed a day. The external company also seemed to have a constant turnover of new people, so they were constantly training, resulting in more errors.

The ‘hourly’ savings that upper management touted quickly dissolved. There were multiple errors in the work, and the external group’s “repository of copies” quickly became out of date, resulting in errors they didn’t see when the files were added back to PDM.

That said, management still pushes this and has since opened our network and PDM to external contractors via VPN. This alleviates some of the problems. This time, they chose a company in a similar time zone (South America). The remaining challenges are latency (we don’t have replication local to them) and the language barrier.

This all depends on how complex the product and process are and the abilities of the external group. Ours is extremely complex and requires constant communication.

Thank you, jcapriotti.

Unfortunately VPN is not an option for various reasons.
We already have some data (some GBs) that was packed outside the vault, and I am trying to figure out how to make it useful again.

You could look at using Branch and Merge for these cases.

  1. Branch the files to a separate folder in PDM - ensure all files that should be modified and any references they rely on are branched
  2. Pack and Go the branched files and give these to the outsource company.
  3. When the files come back, check out the branched files and copy the updated files over the branch files.
  4. Verify that the files are working ok in the branch.
  5. Merge the files back to the source. During the merge you can select which branched files should overwrite the source files and for which files you may want to keep the original source (e.g. files that did not change in the update).

There are merge settings in the user settings where you can specify how to handle variables in the merge. You can select to use values from the source file or from the branch file. In your case, you will probably want to set it so that it will take the source file variable values so you don’t have to worry about these getting changed by the outsource company. These settings will also allow you to keep the source file revision on the merge so that your revision history stays intact.
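The variable handling can be pictured as a simple per-variable selection policy. A sketch in plain Python (this is not the PDM API, just the logic the merge setting applies; the variable names are hypothetical):

```python
def merge_variables(source_vars: dict, branch_vars: dict,
                    keep_from_source: set) -> dict:
    """Mimic the PDM merge setting: for variables listed in
    keep_from_source, keep the vault (source) value; otherwise
    accept the branch (outsourced) value."""
    merged = dict(branch_vars)
    for name in keep_from_source:
        if name in source_vars:
            merged[name] = source_vars[name]
    return merged
```

So with Revision pinned to the source, the outsourcer can change whatever they like in their copies and your revision history still stays intact after the merge.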

I second the branch and merge option. I had a conversation with people about this at a previous job and pushed back on the idea for the very reasons that jcapriotti brought up, plus there would be an opacity to the design intent, as the version history would show a huge step change (which could be addressed with good documentation).

Thank you for the feedback. I need to look into the branch and merge for our needs.
Anyway, it looks a lot better than the current state of things, but my main concern is who is going to handle branching and merging, as the PDM literacy of our engineers is bare minimum and the vault is quite a mess.

I feel ya!

I don’t use PDM, but why does “some” data end up somewhere outside the vault..? Newly created files by them..?

Copy-and-paste or Pack and Go outside the vault onto some department server (alongside documents and other stuff), and quite a lot onto personal workstation desktops, from what I have seen. Then sent to other departments or suppliers.

Why can’t they be brought back in afterwards? Does it need to be done manually? Couldn’t something go through your item list with a true/false check, compare each item number to the existing ones inside the vault to know whether it is already there, and add it if it isn’t?
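In other words, a membership check like this sketch in plain Python (the item numbers are made up, and a real version would query the vault rather than take a list):

```python
def partition_items(external_items: list, vault_items: list) -> tuple:
    """True/false check: split the external item numbers into those
    already present in the vault and those that would need adding."""
    vault = set(vault_items)
    present = [i for i in external_items if i in vault]
    to_add  = [i for i in external_items if i not in vault]
    return present, to_add
```

The hard part in practice is not the lookup but trusting that the external copies haven't diverged from the vault versions of the same item numbers.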

I would like to bring up this post of mine as things are moving again, a couple of years later, and the outsourcing brigade is knocking at my door again.

Another big issue floating in my head is how the outsourcing company is going to deal with our data.
They have a couple of people working inside our teams, so they know the basics of the design; that is OK. System-wise, though, their company does not run SW yet, as they normally use other CAD platforms, but they will buy SW to work for us.

I cannot imagine how it would be to work with a dozen people without PDM on that department’s machinery, with some 15,000 components (and growing), and without SW properly set up to work with our environment.

Our SW settings, design libraries, catalog parts, templates, and macros are inside PDM, and I personally refuse to handle them all outside, as a lot of work was put into them. And we all know how well an NDA stops people from copying your code and using it for free…
That aside, setting up SW outside PDM to work “close” to what PDM allows means I would have to work closely with them for some weeks. Then there is no data card without PDM, so even writing all the info inside the drawings (to avoid configuration-specific entries we do not use) requires another lot of work, maybe building a Property Tab Builder template (we do not use it right now).
I thought about Design Checker to bring back the data, and about the nice advice from the commenters in this thread, then… I stopped thinking about it.

I think it is simply crazy to go along with this kind of nonsense. We have over 2 million files inside PDM and a lot of data that our departments produce every year; outsourcing means a mess, and honestly it is going to create more harm than good IMHO.
If SW and the design environment are not properly set up, we receive garbage data to be fixed (time, costs, performance loss, etc.), and to a certain extent the received data must be fixed or synchronized with our PDM data… which is the main merit of having a PDM system in place IMHO. At the end of the day, the data brought back into PDM must be checked somehow, otherwise our teams will just end up saving duplicate data (with a temporary file name, I know their tricks) inside PDM, and re-usability goes down the toilet drain.

How do I explain that to the management without “being part of the problem”?

We have done it both ways (Package files to give to outsourced company, give outsource company access to network and PDM). The first failed, the second is more successful.

For the outsourced contracted company, I have them in their own group and limit which content they have access to. They do have access to all our libraries, templates, macros, etc., as they need to do the same work as an internal resource. I suppose they could steal code from our macros, but they are lite and we don’t have many. We have much more complex customizations, but those are compiled .NET programs and no one outside of IT has the source.

I think outsourcing has its place; you just have to segregate your work and protect as much as you can.

Be aware that compiled .NET programs are 100% decompilable using freely available tools such as JetBrains dotPeek. Unless you are using a third party code obfuscation tool, the source code is basically right there in the compiled program.


Good to know; I didn’t know it was that easy. The users don’t typically see the EXEs and DLLs, as they are called from a transition action in PDM at a “secret” server location the users aren’t aware of. We have also started to shift some programs to run server-side instead of client-side; the client side just puts the task in a server queue. We started doing this for performance, as we no longer have local onsite servers.

OK. So these aren’t plain vanilla PDM add-ins, then? Because those are downloaded directly to every client and stored in %AppData%\Local\SolidWorks\SOLIDWORKS PDM\Plugins\<vault name>\<addin-guid>.

We have a mix of PDM add-ins and EXEs run from transition actions or a command button.

While I agree with your analysis and think it is the right path, we have ruled out the second option due to bandwidth and traffic issues on our side.

IT says they do not want gigabyte peaks in upload and download during working hours, for their own reasons, so we are forced into packaging, which is going to create more issues and is on track to fail.
The risk, again, is that PDM could take all the blame for the failure.

Unfortunately the CTO culture is not strong in this country, and non-technical managers (who are common in all technical positions) tend to ignore and downplay technical issues like external data sync with PDM, automation breaking down, concurrent work on the same 3D project with multiple users, etc.

The PDM admins are me and another person who checks and fixes 3D catalog parts on request from our departments. We also support all departments as an internal call center: training new hires, their 3D modeling, PDM and SW installs and upgrades, server upgrades and maintenance.
We had an SQL “expert”, but he is going to retire soon, so we are effectively looking after all the issues caused by 5,000 new drawings per year registered into our vault with two people.
I should add our PDM was set up sub-optimally (the revision workflow is completely wrong), a lot of bugs have bitten us back multiple times, and very few of them were documented.

Outsourcing also means shepherding a company that has no clue about administering SW.
I agree with giving them the minimum access to the vault, but they want to Pack and Go and take home the whole project, like they do for another group company, and the PDM issue was never their problem.

Hmm, definitely not ideal. We tried the packaging method to outsource to another country, to save money on lower labor rates for simple changes. In the end, the hourly-rate savings were lost to errors, packaging time, and constant back-and-forth communication inefficiencies.

The other company you are working with must follow a strict process to ensure they aren’t saving files they shouldn’t. This means creating a folder structure that matches the PDM folder structure, then setting the Windows “Read-only” flag on files they shouldn’t be changing as part of the work. There is a lot of room for error, and they will need to check that they are following it. I would make them deliver a plan and SOPs, which you both sign off on, describing how they will accomplish this.
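The read-only step could even be scripted on the contractor’s side. A sketch in plain Python (the folder and file names are assumptions), approximating the Windows “Read-only” flag by clearing the write bit on everything outside an agreed editable list:

```python
import os
import stat
from pathlib import Path

def lock_untouchables(root: str, editable: set) -> list:
    """Clear the write bit on every file under root that is not in
    the agreed-editable list, so reference files can't be saved over
    by accident. Returns the names of the files that were locked."""
    locked = []
    for p in Path(root).rglob("*"):
        if p.is_file() and p.name not in editable:
            os.chmod(p, p.stat().st_mode & ~stat.S_IWRITE)
            locked.append(p.name)
    return sorted(locked)
```

Running this right after unpacking the package, against the list of files they are contracted to modify, would make the SOP enforceable rather than just documented.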
