Background and Motivation

Fibre Channel is a 1.0625-gigabit bidirectional serial link that can be used in several interesting ways:
  1. As a "Storage Area Network" or SAN, where multiple computers are all connected to a set of disks, tapes, etc. Used to provide redundancy and performance. Quite common in high-end computer centers.
  2. As a replacement for SCSI. FC allows 126 connected devices in loop mode, and millions in "fabric" mode. Since it can easily move 90-100MB/sec in both directions at once, FC is an excellent choice for high-speed peripherals.
  3. As a TCP/IP network (quite rare in my experience)

(Actually, there's now 2.125-gigabit Fibre Channel as well, but it's pretty new and correspondingly expensive for the time being, so I'll not discuss it. Check out Gadzoox, Vixel or Ancot for more details.)

Note that the technology should not be confused with the medium! Despite the name, Fibre Channel can be run over copper or fiber optic media. In either case, the bitrate is the same, and there is no difference to the operating system or software.

Historically, FC has been extremely expensive, with adapter cards costing more than an average PC. Thanks to mass-market effects and the magical economies of scale, prices have now dipped to the point where it is affordable to the average turbo-nerd. More to the point, it's a fabulous I/O channel with bandwidth to spare, minuscule CPU usage, and unmatched expandability.

Getting Started with Fibre Channel

The minimal FC setup has 4 parts:
  1. An adapter card.
  2. Cabling from the machine to the enclosure.
  3. A so-called "T-card" adapter that takes the FC cable and power and conveys them onto the drives' 40-pin SCA-type connector.
  4. A hard drive.

(Pictures of all of these can be found on the "home results" page.)

At this point, there are some interrelated choices that you have to make. They all, of course, relate to how much you want to spend.

  1. Point-to-point versus Hub.
    The cheapest and simplest connection is daisy-chaining, or point-to-point: one cable from the adapter to the disk, and if you get more disks, you connect them to the output of the first disk. Exactly akin to 10base2 (thinnet). Cheap and simple, but no fault tolerance.

    A 7-port hub costs about $300 on eBay as of March 2001 and requires media adapters called "GBICs" (Gigabit Interface Converters), but you can mix and match copper and optical. Additionally, you get better reliability, since the failure of one GBIC or device does not break the loop.

    There are some hubs that do not take GBICs, e.g. the Gadzoox Bitstrip series. These use DB9 exclusively, and are useful if that's what you need.

  2. Optical versus copper cabling.
    Again, the cheapest is copper. It's limited to 10-30m, depending on the quality of the wire, but that's usually adequate for a home machine. Recommended unless you have extra money!

    There are DB9 and HSSDC connectors for copper; DB9 is more common on enclosures, but HSSDC is the new standard. You can get cabling that is HSSDC<->DB9, so this is mostly a matter of preference.

    Optical allows you to run long distances (500m to 20km or so, depending on the GBIC/fiber), but then you either need a media converter or hub, since most enclosures and all t-cards use copper. Nice if you can afford it!

    Fiber GBICs and media converters come in two flavors: multimode and single-mode. Multimode is good for about 500m and is usually cheaper. Single-mode (also called long-wavelength or long-haul) will go kilometers, and is both uncommon and expensive. For both types, the connectors are the 'SC' (square-housing) type.

  3. T-card versus FC enclosure.
    A t-card connects a single drive to the FC loop. There are also FC enclosures that have 1-9 FC slots. These allow you to simply plug in multiple FC drives. Alas, they're expensive, since they incorporate a simple FC hub on the backplane of the device. If you happen to get one used, enjoy it. Otherwise, you'll need one T-card per drive that you want to connect.

    The resources page has links to vendors of t-cards, and a link to overclockers explaining how to build your own from cheap parts.

    T-card cabling: there are 3 different cables for daisy-chaining from one T-card to another.
    1. DB9. Same DB9 4-pin cable used to connect most enclosures. Sold by CSI.
    2. PTP. A short-length 3-pin cable with a smaller connector. Nice inside cabinets, but you'll need an adapter from DB9-PTP. Sold by CSI.
    3. RJ45. Uses standard category-5 cable to connect drives. Cheap, and if you're still reading this, you probably can get cat5 cables. Requires an adapter, sold by Cinonics.
  4. New versus used
    I've bought most of my FC gear used, via ebay. This relates to your personal risk tolerance, but I've had zero problems. Your call.

Recommended intro setup and vendors

The following are my personal choices, with notes as to recommendations.
  1. Adapter card

    QLogic 2100

    Available new from TeamExcess for $150. The 2100 model lacks the ability to do TCP/IP; for that you can get the 2200 if you so desire. Available with DB9, HSSDC or fiber connectors.

    It works great with Linux, though the couple of times I've tried, I could not get Windows 2000 to install, even though it's supposed to be possible.

    These are the best that I've found - nice SCSI-like BIOS that offers loop queries, selectable boot settings, speed tuning, low-level formatting, and so on. These are what big-iron vendors like SGI and Sun sell with their enterprise systems, so take that as another ringing endorsement.

    If you can afford it, the 2200 is a bit nicer and seems to be a bit more durable. For some reason that might just be a statistical fluke, I've lost two 2100s to transceiver failure, and we had some show up DOA at Fermilab. However, there is a 3-year warranty, and QLogic will repair them for US$145.
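    Once the card is in a Linux box, a quick sanity check is to confirm that the driver claimed the adapter and that attached drives show up as ordinary SCSI disks. A rough sketch; the module name qla2x00 is an assumption that depends on your kernel and driver version:

```shell
# Load the QLogic driver (module name is an assumption; yours may differ)
modprobe qla2x00

# Check that the kernel found the adapter
dmesg | grep -i qlogic

# Attached FC drives appear as ordinary SCSI devices
cat /proc/scsi/scsi
```

    If the loop is up, each FC drive is listed like any other SCSI disk (e.g. "Vendor: SEAGATE") and gets a normal /dev/sd* device node.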


  2. Emulex LP-6000/7000/8000 series

    These have drivers available on the Emulex website, but the driver relies on a binary-only link library. So it will never be in the stock Linux kernel unless Emulex GPLs it, and will probably never work on non-x86 boxes due to the binary library. Also, the driver currently does not support putting your root partition on it, and I have yet to get one to work. Your mileage will, I hope, vary, but at the present time I cannot recommend these.

    I don't yet know if these are bootable or not. Their BIOS update procedure is stone-axe crude.

    In their favor, the 8000 series uses GBICs for its connection; this is very convenient for connections to disks. One can use a copper GBIC (DB9 or HSSDC) and go directly to the t-card or chassis. Most other cards have a hardwired connector.

    Interphase 5526

    These use the HP Tachyon chipset and can do TCP/IP as well as SCSI. Support is improving: as of the 2.5 kernels there is an IP driver in the stock kernel, but SCSI support still requires the driver from Interphase. These can be had on eBay for twenty bucks; I bought four to play with in my cluster. I've not yet tried them out (long story), but they are by far the least expensive option to try.
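    For the TCP/IP side, an IP-over-FC interface is configured like any other network interface once the driver is loaded. A hedged sketch: the module name (iph5526), the interface name (fc0), and the addresses are all assumptions that depend on your kernel and network:

```shell
# Load the Interphase driver (module name is an assumption)
modprobe iph5526

# Bring up the IP-over-FC interface like any other NIC
# (interface name fc0 and addresses are examples only)
ifconfig fc0 192.168.2.1 netmask 255.255.255.0 up

# Verify the link by pinging a machine at the other end of the loop
ping -c 3 192.168.2.2
```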
  3. Cabling
    I'd recommend HSSDC on the QLogic, DB9 on the enclosure, and Cinonics RJ45 T-cards. You then need:
    1. An HSSDC->DB9 cable - eBay or CSI.
    2. A DB9-RJ45 adapter - Cinonics, ~$30.
    3. Shielded RJ45-RJ45 cable(s) - nearly free from your nearest net guru.
  4. T-card
    I bought the CSI PTP t-card ($72 in early 2000), and then the Cinonics stuff arrived. I'd get the Cinonics card, since it's less money and RJ45 cabling is cheaper.
  5. Hard drive
    You need a fibre channel disk, such as the Seagate ST19171FC (9GB, 7200RPM, 3.5-inch, half-height, Barracuda series). There are a ton of these on eBay, and if you're patient, you can get one for under $30. I ended up with a 9G for $20.50, an 18G for $70, and a 36G for $219. Patience is the key!

    You can get new drives from any number of sources: Dirt Cheap Drives, TeamExcess, etc. Most vendors have them if you ask.
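    Once the drive is cabled up and detected, it behaves exactly like a SCSI disk, so setup is the usual partition/format/mount dance. A sketch, assuming the FC drive showed up as /dev/sda (check dmesg and /proc/scsi/scsi for the actual device name on your system):

```shell
# Partition the drive (device name /dev/sda is an assumption)
fdisk /dev/sda

# Make an ext2 filesystem on the first partition
mke2fs /dev/sda1

# Mount it like any other disk
mkdir -p /mnt/fcdisk
mount /dev/sda1 /mnt/fcdisk
```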

  6. Enclosure
    For minimal cost, just put the disk in your PC's case, and loop the cabling through an unused cutout. Works fine. For my t-card drive, I used an old SCSI case, and used a serial cutout to attach the DB9 to the SCSI case. (See the pictures for details.)

    Don't spend a lot of money on this. As long as you have adequate cooling, most cases will do fine.

Other Recommendations

From here, you can expand nicely: daisy-chain in more drives, buy a hub, etc. One very nice feature is the ability to move the disks out of one's PC. I put mine in a closet, and my room is much quieter. Even the 10m copper cabling is enough to move the drives somewhere else.

If you want to play with optical cabling, you have two methods: there are media converters (CSI, ~$400) or you can use a hub with GBICs. I recommend a hub, since the total cost is about the same and a hub is more flexible. Note also that most media converters require a powered DB9 connection, which is both unusual and difficult to retrofit.

Further Reading

The Resources page is chockablock with more information: vendors, tech info, tutorials, drivers, etc. If you're more interested in a sample setup, the home results page covers my setup.

Pictures of various pieces can be found on the home results page.

Navigation Links

  • Introduction to Fibre Channel
  • FC at CDF page
  • FC at home page
  • Resources page
  • Back to home page