PC Workstations: Removable media


In the latest instalment of his series of articles on workstation components, Robert Jamieson looks at how best to make use of your CDs/DVDs, what to look for in the latest monitor technology, and gives the lowdown on PCI Express.

Now you might be thinking that, for workstations, who cares about CDs and DVDs, as they're not part of day-to-day performance. But if you're installing a new set of software on, say, 20 workstations, CD read speed is very important!

In the last few years CDs (Compact Discs) have started to be replaced by DVDs (Digital Versatile Discs). As software packages get bigger, they increasingly have to span multiple CDs. Software companies would love to put everything on DVD to reduce distribution costs, but not every workstation has a DVD-ROM yet.

Hardware

Your basic CD-ROM goes up to 52x speed; this is the read-speed multiple over a standard music CD's rate of 150KB/sec. Standard CDs hold 650MB but you can overburn to 800MB, though this cannot always be read by every CD-ROM. Some computer manufacturers will not ship anything over 32x speed, as supposedly the disks can spin apart in the drive and therefore be dangerous!

For CD reading or writing, anything above 32x only saves you seconds, as the reader has to slow down to cope with the limited data density at the centre of the disk. Now of course burnable CDs are everywhere, but will any CD drive read any burnt CD? As the writing speed increases, the ability of other CD-ROM drives to read them decreases. The pits, or holes, are cut differently at higher speeds; burn one CD at 52x and another at 4x and you can sometimes see the difference with the naked eye in the way the surface refracts light. A lot of read problems are down to the quality of the disks and the format used. CD burners are now so cheap that £20 is a typical price, and a DVD-ROM combined with a CD burner is only £25.

DVD write speeds
As there is a lot more information to copy onto a DVD, the X factor matters more; for example, do you like waiting around for an hour? The rough arithmetic behind the figures is sketched after the lists below.

Single Layer (4.7GB) write speeds

  • 1x (CLV) = about 58 minutes
  • 2x (CLV) = about 29 minutes
  • 2.4x (CLV) = about 24 minutes
  • 4x (CLV) = about 14.5 minutes
  • 6x (CLV/ZCLV) = about 10-12 minutes
  • 8x (PCAV/ZCLV) = about 8-10 minutes
  • 12x (PCAV/ZCLV) = about 6.5-7.5 minutes
  • 16x (CAV/ZCLV) = about 6-7 minutes

Dual/Double Layer (8.5GB) write speeds

  • 1x CLV = about 105 minutes
  • 2.4x CLV = about 44 minutes
  • 4x CLV = about 27 minutes
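To put the X factor into perspective, here is a rough back-of-envelope calculation. It assumes 1x CD = 150KB/sec and 1x DVD = roughly 1,385KB/sec, and a constant (CLV) transfer rate throughout the burn; real drives at the higher speeds use PCAV/ZCLV, so the actual times in the lists above come out a little longer. The figures and the snippet are only a sketch, not a benchmark.

```python
# Rough estimate of CD/DVD transfer times from the "X factor".
# Assumes a constant (CLV) rate; real drives at 8x and above use
# PCAV/ZCLV, so actual times are a little longer than these figures.

CD_1X_KBPS = 150        # 1x CD rate in KB/sec
DVD_1X_KBPS = 1385      # 1x DVD rate in KB/sec (roughly 1.385MB/sec)

def transfer_minutes(size_mb: float, x_factor: float, base_kbps: int) -> float:
    """Minutes to transfer size_mb at the given speed multiple."""
    rate_kbps = x_factor * base_kbps
    return (size_mb * 1024) / rate_kbps / 60

# A full 4.7GB (about 4,482MB) single-layer DVD:
for x in (1, 2.4, 4, 8, 16):
    print(f"{x:>4}x DVD: about {transfer_minutes(4482, x, DVD_1X_KBPS):.0f} min")

# A full 650MB CD at 32x vs 52x - well under a minute in it, and real
# CAV drives only reach their top speed at the very edge of the disk:
print(f"32x CD: about {transfer_minutes(650, 32, CD_1X_KBPS) * 60:.0f} sec")
print(f"52x CD: about {transfer_minutes(650, 52, CD_1X_KBPS) * 60:.0f} sec")
```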

DVD-ROMs use a different colour laser and more tightly packed pits, so they hold more information. Your typical DVD-ROM drive costs £15 today. With all the new standards for burnt DVDs, your typical older DVD-ROM might not cope. You could buy a later generation DVD-ROM, but a DVD burner only costs around £50 and burns 4.7GB. The DVDs you watch on the telly hold up to 8.5GB, and a burnable dual-layer DVD9 standard is available in the latest DVD burners. The media cost is a bit high (£5 per disk) and they only really play in dual-layer burners, but 8.5GB makes a good cheap backup device.

-R +R RW etc

Each burnable CD comes in write-once (CD-R) and re-writeable (CD-RW) flavours. If you don't close a CD-R you can keep writing more information to it and 'delete' data already there, but only up to the capacity: if you delete a 1MB file and write a new 1MB file you have still used 2MB of the capacity. This is what write once means. With CD-RW disks you can write and delete data again and again, up to a limit. I'm not a fan of RW because there is a limit to the number of times you can write to the disk before you get an error, and when the disk decides it can't write any more you can damage the existing data on it. Write-once disks are so cheap (10p each) that I prefer to just use another one; after all, how much is your data worth?

In terms of DVD media there are different camps of manufacturers pushing different standards. For basic write-once I prefer DVD-R (25p each in bulk); these play in almost all DVD-ROMs and DVD players. This format is backed by www.dvdforum.com, as is DVD-RW. The competitor is the +R format from www.dvdrw.com, a different bunch of companies. +R works like -R but is supported by fewer readers. This camp also backs DVD+RW, which is a better read/write standard but is generally used more for recording video in the living room. DVD-RAM is an older standard which can record live TV while playing something else from the same disk. Not many DVD players or burners support it, but that's not so important for computer use. Again, this is from the www.dvdforum.com camp.

For the new 8.5GB +R DL (Dual Layer) disks you are in the www.dvdrw.com camp again, but the disks cost £5 each. Ouch! A modern DVD burner, the NEC 3520a for example, will play and record all of these in a computer (except DVD-RAM) for £50. For media I like www.lynxdv.com, as they know their stuff.

There are other formats, DVD10, Blu-ray and so on, which promise to hold more data, but there is no definite winner yet. I have seen a Sony Blu-ray drive hold 27GB at Research Machines, which makes it a good backup device.

Today CDs and DVDs use the IDE interface. Make sure you have an 80-conductor cable to support the full speed of the interface. You can mix a DVD-ROM and a burner on the same cable, but not a hard disk, as this would slow it down. SATA DVD drives are coming out but don't really give any great advantage yet, except neater cabling.

Usability advice

One of the common problems is reading data somebody has sent you on a DVD or CD; if you are using an older drive, perhaps you need an upgrade! Any burner has better support for reading media than a standard reader drive. You can also get updated firmware for most drives, which will give better support for more media; see www.cdrinfo.com for advice. Updating firmware is not an easy task, though, and there is always a risk of damaging your drive.

If you are sending disks to somebody, don't burn at your maximum speed; this gives them a better chance of reading the disk and makes errors in the data stream less likely. On the same point, don't burn data straight off the network or off slower media such as USB devices; always copy it locally first. And don't run intensive applications while burning: if there is any problem with the data stream, the cache on the drive might not cope and errors will appear on the disk.

What you copy also affects how well it reads. If you are copying lots of drawings or model files it is often better to ZIP them first; being compressed there is less data to copy off the disk, and the archive can be expanded locally from a fast hard drive. Overloading the Windows cache will slow things down as well, so copying a directory at a time can be faster than selecting the whole disk and copying it off in one go.
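If you want to bundle a folder of drawings into one archive before burning, something as simple as this will do it. This is a minimal sketch using Python's standard library; the folder and archive names are just examples, not anything from a real project.

```python
# Minimal sketch: zip a folder of drawings into a single archive before
# burning it to disc, so the recipient copies one compressed file and
# expands it locally on a fast hard drive.
import shutil

# Example paths - substitute your own project folder and output name.
source_folder = r"C:\Projects\SiteLayout"
archive_name = r"C:\Burn\SiteLayout_issue01"   # ".zip" is added automatically

shutil.make_archive(archive_name, "zip", source_folder)
print(f"Created {archive_name}.zip")
```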

The software you use to create disks also affects the quality of the disk. Nero (www.nero.com) on the Windows platform is my current favourite. The new interface is a bit annoying, as they are trying to make it easy and wizard-driven, but the disks are very reliable as long as you apply the constant free updates.

Having the correct IDE motherboard drivers can affect read performance dramatically. One of the systems a reseller had was really slow installing software; after installing the motherboard IDE driver it took 10 minutes to install instead of 40. All new PCs should ship with this driver, but what if somebody re-installs the operating system without it?

Laptop CD/DVD drives are generally quite expensive to upgrade through the manufacturer. The actual drives themselves are standard slim drives with an IDE interface, and you only need to swap the cage from the existing drive onto the new one before fitting it to your laptop. Buying the bare drive mechanism in the UK is harder than in the US, but possible. See www.itx-warehouse.co.uk/productlist.aspx?Cid=21, but check it fits before you buy.

A lot of people now work on the move, from home or with laptops on site. When you take data away locally you often leave Xrefs or standard parts behind. The software manufacturers have remote access software or ways to pack data up, but it doesn't always work. With a 4.7GB DVD you can copy a lot of data off your network drive and then work with it locally. As the DVD is read-only, you know you are working with the data exactly as it was the last time you were in the office, which gives you a limited form of issue control. Just don't lose the disk!

It's quite easy today to buy inkjet printers that print directly onto the face of special CDs and DVDs; Canon and Epson both do good all-round printers for £150. This gives a quality feel to anything you send out to customers, and they often don't realise you did it yourself.

Finally, the other day I dug out a five-year-old burnt data CD to find that the gold coating had become damaged and my data was gone. I rubbed the surface of the disk and the gold just came off! CDs are meant to have a life of about 10 years, but I have had well-used disks that only lasted six months. It's good to have a backup, but it's better to have two!

Monitors

Monitors for CAD fall into two main camps: CRT (Cathode-Ray Tube) and LCD (Liquid Crystal Display). Plasma screens are used for display purposes at shows and so on, but don't have the clarity for day-to-day CAD work.

CRTs have been around for a long time (over a hundred years) and are therefore highly developed. They are also cheap in comparison to LCDs. LCDs are coming down in price, but are they good enough? Before I look at LCDs, let's see what technology has made CRTs the dominant force in computing for so many years.

Apple's new 30-inch Cinema Display is one of a new breed of high resolution flat panel monitors which require a specialist 'Dual Link' graphics card where both outputs are linked together.

The CRT monitor

The CRT offers excellent colour saturation. I have lots of LCDs, but when laying out an advert something that looks fine on an LCD often reveals its colour gradients much more clearly as soon as I take it to a CRT, and I frequently have to correct something. A customer of mine in the motion picture industry only uses Sony CRTs for this reason. Good viewing angles and excellent brightness are the other plus points of CRTs. So if they are so good, why does everybody want an LCD?

CRTs are physically big and take up valuable desk space, suck lots of power and suffer from distortions at the fringes. If not set up properly (or if of the cheaper variety), the scanning technology may cause flickering, eye strain and fatigue. So how can you get the best out of your CRT if your IT department does not want to change?

Most (nearly all) CRTs are analogue and therefore use a VGA cable. The monitor will support a maximum resolution at a certain refresh rate; the resolution can vary from 800 x 600 to 1,920 x 1,600. The higher the resolution, the smaller the text but the greater the detail. Installing the monitor driver in Windows will show the available refresh rates for each resolution. Not all monitors have a driver, or it may not be installed, in which case Windows will use a generic VESA setting. Refresh rates range from 60Hz to 120Hz.

Anything below 75Hz gives visible flickering and is not advisable (analogue input to an LCD also uses a refresh rate, but it does not have the same effect). You can set any refresh rate you like, but the screen might not work. Most monitors have a “sweet spot” that gives a good image, and this is not always the highest resolution. If you set something not supported, the screen will be distorted and Windows will reset after 15 seconds. For example, a 17-inch monitor should run at 1,280 x 1,024 at 75 or 85Hz with no problem.
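The “sweet spot” comes from the monitor's maximum horizontal scan rate: the more lines you ask it to draw per frame, the fewer complete frames per second it can manage. Here is a rough feasibility check. The 96kHz figure is just a hypothetical example for a decent 17-inch monitor, and the 5% allowance for vertical blanking is an approximation; check your own monitor's specification.

```python
# Rough check of the refresh rates a CRT can manage at a given vertical
# resolution, given its maximum horizontal scan rate. The ~5% allowance
# for vertical blanking is an approximation, and 96kHz is only an example
# figure - read the real numbers off your monitor's specification sheet.

MAX_H_SCAN_KHZ = 96.0          # example maximum horizontal scan rate
VERTICAL_BLANKING = 1.05       # roughly 5% of each frame is blanking

def max_refresh_hz(vertical_resolution: int) -> float:
    total_lines = vertical_resolution * VERTICAL_BLANKING
    return (MAX_H_SCAN_KHZ * 1000) / total_lines

for lines in (600, 768, 1024, 1200):
    print(f"{lines:>5} lines: up to about {max_refresh_hz(lines):.0f}Hz")

# 1,024 lines (1,280 x 1,024) comes out around 89Hz, so 85Hz is comfortable;
# push up to 1,200 lines and the same tube only manages roughly 76Hz.
```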


LCD flat panels

The original LCD panels were passive (DSTN etc.) screens with slow response times of around 300 milliseconds. They have been replaced by TFT or Active Matrix panels with response times of 25ms or less. Response times are quite important: without a fast monitor you can see motion blur when rotating 3D objects on the screen.

LCDs are designed to run at a certain resolution. Older monitors have real problems at resolutions they were not designed for and can look fuzzy. Modern LCDs are better at non-“native” resolutions, but the monitor driver will only let you set resolutions up to the native one, which is the resolution the panel really wants to run at.

If you decide to go for dual setup and use two or more monitors make sure you get quality models with colour adjustment. Graphic card drivers can do some of this but the best solution is two identical monitors from the same manufacturer with thin bezels.

Special LCDs that support stereo input can display a pseudo 3D image. These work with shutter glasses and viewing software to display your files. For product design and presentation purposes this is user-friendly 3D; it's not priced for the mainstream yet, but it has its uses.

If you are using a VGA cable to connect to the monitor you will have an option to set the refresh rate (the default will be 60 to 70Hz); keep it there, as it makes little difference with LCDs, and setting higher rates just might slow your graphics card down for no gain. All workstation graphics cards have DVI (Digital Video Interface) connections, which give better results, as VGA is an analogue signal. The downside is that the DVI interface can also carry output from a DVD player and the like, so all LCDs with DVI have been classed as televisions!

Euroland has added a nice 15% tax because of this, so unless you specify it you might get a VGA-only connection. You also need the correct DVI cable to get the full quality. HDMI, a smaller version of the DVI connector, is starting to appear on some monitors and standard graphics cards.

Some of the newer large panels support super high resolutions, up to 3,840 x 2,400, and require both outputs linked together (Dual Link) with a special graphics card, the FireGL V5000 (£359), for example. The monitors are not that cheap, but would look great in my living room. HD (High Definition) TV needs resolutions of 1,920 x 1,080, so more of these high quality screens will become available as LCDs take over from Plasma TVs. Sky is broadcasting in HD next year, so start saving now!
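The reason these panels need Dual Link is pixel clock: a single DVI link tops out at a 165MHz pixel clock, which with tight timings is only enough for around 1,920 x 1,200 at 60Hz. Here is a rough feasibility check; the 15% allowance for blanking is an approximation (reduced-blanking timings need less, standard timings more), and the 25Hz refresh for the 3,840 x 2,400 panel is only an illustrative figure, since panels that big run at reduced refresh rates over DVI.

```python
# Rough pixel-clock feasibility check for DVI. A single DVI link is limited
# to a 165MHz pixel clock; Dual Link doubles that to 330MHz. The ~15%
# blanking allowance is an approximation, and the resolutions/refresh rates
# below are only examples.

SINGLE_LINK_MHZ = 165.0
DUAL_LINK_MHZ = 330.0
BLANKING = 1.15   # approximate overhead for blanking intervals

def pixel_clock_mhz(width: int, height: int, refresh_hz: float) -> float:
    return width * height * refresh_hz * BLANKING / 1e6

for w, h, hz in [(1920, 1200, 60), (2560, 1600, 60), (3840, 2400, 25)]:
    clk = pixel_clock_mhz(w, h, hz)
    if clk <= SINGLE_LINK_MHZ:
        link = "single link"
    elif clk <= DUAL_LINK_MHZ:
        link = "dual link"
    else:
        link = "more than dual link"
    print(f"{w} x {h} @ {hz}Hz needs about {clk:.0f}MHz: {link}")
```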

PCI Express

ATI’s FireGL V5100, one of the new generation of PCI Express graphics cards currently available.

When I covered motherboards and graphics cards in an earlier edition I looked briefly at PCI Express. PCI Express is a standard promoted by Intel, but now that the AMD camp also has solutions it is becoming the standard across all platforms. Several readers suggested that I revisit it in more detail, which I am doing now.

A brief history

The original IBM XT came with 8-bit ISA slots; a 16-bit version was added for the 286 and true 16-bit computing arrived. 386s mainly had 16-bit ISA slots, and it was the 486 and its clones that introduced VL-Bus and PCI. PCI came into its own with the Pentium and Athlon, giving greater bandwidth and a defined standard. PCI is a good interface, but graphics wanted more, so AGP was developed and its speed slowly increased: 2x, 4x and finally 8x. Other enhancements to PCI were developed, including 64-bit slots and PCI-X, but parallel information transfer had started to reach its limit, so a brand new standard was required to support not only graphics but other devices requiring fast interfaces.

PCI Express uses a serial connection and can scale speed by combining multiple lanes; the physical layer supports x1, x2, x4, x8, x12, x16 and x32 lane widths. Most current graphics cards use x16. It may be called PCI Express, but it is really a different technology from PCI and is not interchangeable, i.e. you cannot use a PCI Express card in a PCI slot and vice versa.

As with most new buses, current motherboards have both PCI Express and PCI slots. Typically a board has an x16 slot for a graphics card and an x4 slot, plus several PCI slots for “legacy” cards. Some new boards have two PCI Express x16 slots so you can fit multiple graphics cards, but with the current bottleneck being the CPU for most CAD and design applications this looks like overkill. Both Nvidia and ATI have multiple graphics card solutions, but my view on this, for CAD, is that unless you are doing something pretty special a single top-end card will suffice.

AGP vs PCI Express

PCI Express is a reasonably new interface designed to replace PCI and AGP in workstations. One of its greatest benefits is with graphics cards, as PCI Express allows more bandwidth between the motherboard and the card. While AGP 8x offers around 18MB/sec/pin, PCI Express x16 will offer 4GB/sec (8GB/sec concurrent) bandwidth, and the request pipe depth goes from 32 to 256. Basically this means that the interconnect is not going to be the limiting factor, and GPUs can get faster while realising this performance on the screen.
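The headline x16 figure comes straight from the lane arithmetic: a first-generation PCI Express lane signals at 2.5Gbit/sec, and its 8b/10b encoding leaves 250MB/sec of usable bandwidth per lane in each direction. Here is a quick sketch of where the 4GB/sec (and 8GB/sec concurrent) numbers come from, with AGP 8x's roughly 2.1GB/sec for comparison.

```python
# Where the PCI Express bandwidth figures come from. A first-generation
# PCI Express lane signals at 2.5Gbit/sec and uses 8b/10b encoding, so
# 8 of every 10 bits carry data: 250MB/sec per lane, per direction.

LANE_GBIT = 2.5
ENCODING_EFFICIENCY = 8 / 10           # 8b/10b encoding overhead

def pcie_bandwidth_mb(lanes: int) -> float:
    """Usable bandwidth in MB/sec, one direction."""
    return lanes * LANE_GBIT * 1e9 * ENCODING_EFFICIENCY / 8 / 1e6

for lanes in (1, 4, 8, 16):
    one_way = pcie_bandwidth_mb(lanes)
    print(f"x{lanes:<2}: {one_way:,.0f}MB/sec each way, "
          f"{2 * one_way:,.0f}MB/sec concurrent")

# AGP 8x for comparison: a 32-bit bus at an effective 533MHz,
# and only one direction at a time.
agp8x_mb = 32 / 8 * 533e6 / 1e6
print(f"AGP 8x: about {agp8x_mb:,.0f}MB/sec")
```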

Some of the other highlights of PCI Express are that it can deliver up to 60 watts of power, compared to 25 watts for AGP 8x. This means that well designed low and mid range graphics cards don't need extra power connectors. As most graphics cards today have a lot of transistors, getting power to them is important.

Graphics cards are the main beneficiaries of the increased bandwidth, but other devices can use PCI Express today. Several PCI Express network cards have become available for the new 10Gbit network standard, and other types of card that need fast access to the bus will follow. The point-to-point connection in PCI Express gives each device a dedicated link without shared bandwidth, so cards can talk to each other without too much CPU intervention, which is a good thing, as the more the CPU has to help out, the more the workstation slows down.

PCI Express also offers low power consumption and power management features, useful for saving the planet, but it also allows devices to switch off unnecessary functions when they are not needed. Say you had a graphics card with 16 pipelines but were only browsing the web and didn't need all that power: the PCI Express bus might drop to x4 transfer and turn off 12 pipelines. This has the added effect of reducing the temperature inside the case, and the knock-on from this is that the fans don't have to work so hard. If a fan spins slower (most new workstations have temperature-sensing fans) the noise is reduced. I hate the hum of noisy workstations, so this is all good stuff.

All change

When new technology is upon us there is always the question of when to adopt it. Anything new has teething problems, but what also happens is that when there is a big change, other things get added into the equation. The press is talking a lot about “dual core” technology, with two CPUs on one die; should I wait for this before getting a new PCI Express workstation? My answer is that PCI Express has been around long enough for the drivers to iron out any bugs, but shows enough performance to be worth the small premium. Most graphics cards are now second generation pure PCI Express chips and therefore stable. Practically, most CAD applications are not multi-threaded (they support only one CPU) so won't take advantage of dual core for a while.

The top-end AMD and P4 processors have capped out in speed (Intel cancelled the 4GHz P4), and the dual core CPUs are likely to be 3.2GHz for Intel. This means there is a plateau of performance, at least for the short term, in the CAD arena. If you need multiple processing power there are good dual-CPU solutions out there, if you are prepared to pay for them. Most software developers are making sure they support 64-bit Windows first before developing dual core support. One notable software developer is saying it will have dual core support within two years. That's a long way away in my view, and we all know how reliable software developers are about hitting deadlines!

Robert Jamieson works for workstation graphics specialist, ATI

