For this month’s hardware tips, Rob Jamieson takes a look at CAD in a networked environment, how to avoid network collisions, and the importance of virus protection and verified backup.
There are very few of us today who work in isolation, and whether it’s a cable or a wireless link, the network is our primary means of transferring information. We are used to the idea that we can connect to the Web or just send email, but does this infrastructure affect the performance of CAD?
I don’t intend to go into detail on topologies and protocols (TCP/IP etc) as this is layered away from most of us – plus I want you to keep reading and not fall asleep.
Virtually all companies today have a network connected to a server for storing files, or perhaps to a mail server. Most of this is wired with twisted-pair cable running at anything from 10Mbit/s (now out of date) to 1,000Mbit/s (remember this is a theoretical maximum, and plenty of other factors affect real-world performance).
The data is still stored on a hard disk somewhere, whether that’s a traditional PC server, a rackmount PC or a NAS (Network Attached Storage) box. A NAS is a computer (generally running Linux) that you configure via a web interface. Servers are designed to be reliable, with fast disk access and, of course, network ports. A typical CAD server should have a reasonable amount of RAM so it can cache plenty of data without having to read the disk all the time. In an ideal world, unless you already have a good infrastructure, you should have a separate CAD server. Why?
When you work on a network drive the data is pulled across and cached locally, in temp files but mainly in RAM. Some data management software adds check-in/check-out routines as well, to give some protection if you have a crash. As you open Xrefs or sub-assemblies, that data needs to end up in your local RAM.
The most common problem I have seen is an IT department that doesn’t realise CAD pushes genuinely large files around the network. If there is a lot of other traffic, this can slow the loading and saving of files. A CAD network should ideally have its own server, with a router to keep out other traffic; a properly set up router can filter out traffic not destined for your workstations. I had a graphic example of this when I visited a customer who was complaining of poor performance loading, saving and running their 3D CAD software. The workstations themselves were set up fine, but talking further with the designers it emerged that the problems mostly happened between 11am and 2pm.
The IT support was remote, and on a conference call with them it turned out that the backup and the consolidation of the stock reports ran at exactly that time. Ethernet is a collision-detect protocol: imagine trying to join a main road in a large truck when there is constant traffic – you need a gap big enough to pull out, and if the gaps are too small you end up waiting for ages. After a router was installed and the accounts updates were moved outside working hours, all was well and the blame moved off the software.
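The truck analogy maps onto Ethernet’s CSMA/CD behaviour: a station that detects a collision backs off for a random number of slot times, and the range it picks from doubles after each successive collision (capped after ten). A minimal sketch – the function name and the loop are my own illustration, not anything from a real network stack:

```python
import random

def backoff_slots(attempt: int) -> int:
    """Truncated binary exponential backoff, as used by classic
    CSMA/CD Ethernet: after the n-th collision, wait a random
    number of slot times in the range [0, 2^min(n, 10) - 1]."""
    return random.randint(0, 2 ** min(attempt, 10) - 1)

# The busier the segment, the more collisions and the longer the waits:
for attempt in (1, 4, 10):
    print(f"collision {attempt}: waiting {backoff_slots(attempt)} slot times")
```

This is why a congested flat segment degrades so sharply: every extra collision widens the random wait, so heavy background traffic (like a midday stock-report consolidation) punishes everyone on the wire.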
A typical layout has the CAD server supporting a local connection (drop cable) to each workstation, with the CAD server itself connected to the main network by a higher-capacity “backbone”. These days the backbone should be gigabit (1,000Mbit/s) at least.
If everything is instead connected in a flat structure through a hub, collisions happen far more often. A different customer I was consulting for was also complaining of loading problems after a recent office move. The IT department had tested a single workstation loading a dataset in five minutes, then connected a cheap network hub with ten drop cables to the rest. The hub itself was connected to the server by a single 100Mbit/s cable. Every morning it took 25 minutes to load the data on all eight workstations, because everything went through that one cable and the cheap hub at the same time. The IT man failed the backup question badly as well. But what’s the backup question, you may ask?
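The arithmetic behind that morning slowdown is easy to sketch. Assuming – purely for illustration – that all eight machines started loading together and shared the single uplink evenly, each would see roughly an eighth of the bandwidth:

```python
# Back-of-envelope: N clients sharing one uplink each get ~1/N of the
# bandwidth, so the combined load takes roughly N times longer.
# These figures are illustrative assumptions, not measurements.

SINGLE_LOAD_MIN = 5.0  # one workstation loading the dataset alone

def shared_load_minutes(clients: int) -> float:
    """Worst case: all clients start together and the uplink is
    divided evenly, so each transfer runs at 1/clients speed."""
    return SINGLE_LOAD_MIN * clients

print(shared_load_minutes(8))  # → 40.0
```

The observed 25 minutes sits below this worst case – local caching and staggered start times help – but it is the same order of magnitude, and a long way from the five minutes a lone workstation managed.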
I asked him, “Have you ever tested a restore from your backup?” Loads of data is lost every year because nobody checked that what they were backing up was the right data, or that it wasn’t corrupted in some way. If you don’t back up, go and buy a backup device now! The advanced backup question is “what happens if the building burns down?” Give your IT man a tape and his restore device and get him to show you the data working on a new workstation. This is the best test, and you would be surprised how often it fails. In my IT managing days long ago (when systems were more likely to fail) I lost half a day’s data in seven years, which is not bad for a poor mid-sized engineering company.
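One way to make the restore test concrete is to compare checksums of the original and the restored files – if the hashes match, the bytes came back intact. A short sketch (the helper names are mine):

```python
import hashlib
from pathlib import Path

def sha256(path: Path) -> str:
    """Hash a file in chunks so even large CAD files fit in memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_restore(original: Path, restored: Path) -> bool:
    """A restore only counts if the restored bytes match the original."""
    return sha256(original) == sha256(restored)
```

Run something like this over a sample of restored files after every backup test; a silent mismatch here is exactly the corruption that otherwise only surfaces the day you actually need the tape.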
Virus protection is very important today, and a virus is one of the major causes of data loss. There is still a lot of scanning software that tests every file on loading. Now I might be wrong, but I don’t need the DWG or part file checked for viruses, as there aren’t any. Any server virus scan should run when the design office isn’t using it, and likewise defragmentation, which can affect performance greatly. One of the corporate companies I worked for had a policy that whenever you connected your laptop to the server it ran a complete virus scan of the hard disk. Of course this made the machine unusable for the first hour, and we couldn’t stop it either, as we didn’t have admin rights to the network. Our solution was to reinstall the OS, which is a bit drastic. My point is that if the policy is too draconian, people will find a way round it.
Can’t get your network guy to improve the speed? Copy the data locally. I can hear all the document management people groaning. I’m not talking about the current working files, but standard fixings and the like. Keep the master copies on the server, but as long as the local copies are refreshed regularly it should be fine to run them from the local disk.
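A refresh like that can be as simple as copying anything on the server share that is newer than, or missing from, the local cache. A sketch, with hypothetical paths you would replace with your own:

```python
import shutil
from pathlib import Path

# Hypothetical locations -- substitute your own server share and local cache.
SERVER_LIB = Path(r"\\cad-server\library\fixings")
LOCAL_LIB = Path(r"C:\cadlib\fixings")

def refresh_local_copy(server: Path, local: Path) -> int:
    """Copy any server file that is newer than (or absent from) the
    local cache, preserving the tree; returns how many were copied."""
    local.mkdir(parents=True, exist_ok=True)
    copied = 0
    for src in server.rglob("*"):
        if not src.is_file():
            continue
        dst = local / src.relative_to(server)
        if not dst.exists() or src.stat().st_mtime > dst.stat().st_mtime:
            dst.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dst)  # copy2 preserves timestamps
            copied += 1
    return copied
```

Run it at login or overnight and the library stays current while the big daily reads come off the local disk instead of the wire.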
Wireless is great, but pulling large CAD files over it is not always a good idea. Use encryption keys so that you don’t give access to the outside world. I travel a lot in central London and can connect to my server via a VPN (Virtual Private Network) – an encrypted link that needs a key to access the mail server. You would be surprised at how many open links there are in London that you can browse for free. Networks are great; just make sure your work is safe when it’s on one.
Robert Jamieson works for workstation graphics specialist, ATI.