Monday, March 10, 2014

Difference Between A Tap And A Splitter



I have two different devices... a Regal DS2DGH10 120dB EMI Isolation 2-way splitter (other specs: 5-1000MHz, 3.5dB, 3.5dB); and a Regal ZDRDCT10-6 110dB Isolation Tap (other specs: 5-1000MHz, 6dB).
What is the intended use of the tap vs. splitter?
Also, I intend to use one of these to split the incoming cable: one output to a cable modem; the other output to a cable amplifier and then to a 4-way (later to be 6-way) splitter feeding the TV outlets.
Thanks.

A tap is used when a cable needs to feed TVs in one location and then continue downstream to more locations. Hallways in schools are a good example. The cable (called a trunk at this point) will hit a tap to feed a block of four rooms. The cable connected to the output side of the tap will run down the hall to the next block of four rooms where another tap will be inserted, and so on to the end of the hall. The closest taps have the highest attenuation, while taps at the end of the hall have the lowest attenuation.
A splitter divides the input between two or more outputs. It is a dead-end device. In the above example it would be used at the end of the hall to feed the last rooms.
Taps are rarely used in homes. Most home systems use a single splitter near the demarc (point at which the cable enters the house) to feed all of the drops.
A splitter cuts the signal by 3.5dB at each output for every doubling of ports. IOW, a two-way cuts -3.5dB at each port; a four-way, -7dB; and so on. For this reason, a broadband amplifier is used before the input of the splitter to compensate for the splitter's insertion loss.
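To put numbers on that rule of thumb, here's a minimal Python sketch. The 3.5dB-per-doubling figure comes from the post above; the function name and the per-doubling default are just illustrative choices.

```python
import math

def splitter_loss_db(ports, loss_per_doubling=3.5):
    """Approximate loss at each output port of a balanced splitter,
    per the rule of thumb above: ~3.5 dB per doubling of ports
    (3 dB for halving the power plus ~0.5 dB of device loss)."""
    return loss_per_doubling * math.log2(ports)

for ports in (2, 4, 8):
    print(f"{ports}-way: -{splitter_loss_db(ports):.1f} dB at each port")
# 2-way: -3.5, 4-way: -7.0, 8-way: -10.5 dB (real 8-way splitters are
# often rated nearer 11-12 dB, as noted later in the thread)
```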

Originally Posted by Rick Johnston
The closest taps have the highest attenuation, while taps at the end of the hall have the lowest attenuation.
So a Tap somehow splits the signal, but not evenly? It allows more signal to continue down the trunk... is that it? You say the closest ones have highest attenuation... is that due to different specs on the taps in different locations or just a natural outcome of siphoning off a small amount of signal at each location?

Not to jump on RJ's reply...but, I'm waiting for an e-mail...so...
David... I think what he means is that you put a tap with higher attenuation closest to the signal source, and as you go down the trunk line you use taps with lower and lower attenuation, since the signal level on the trunk decreases the farther you get from the source.
You want a certain signal level at each device connected to the tap, so you may have to adjust the attenuation values if more taps are added to the trunk or the trunk is extended. Savvy?
So a Tap somehow splits the signal, but not evenly? It allows more signal to continue down the trunk... is that it?
Yes, that's it.
You say the closest ones have highest attenuation... is that due to different specs on the taps in different locations or just a natural outcome of siphoning off a small amount of signal at each location?
Different specs, and the reason you need diff specs is the second part of your statement.
We had to do this in the Navy when running CATV to the berthing and lounge areas from up in the electronics shop where the VCRs, switches, and amplifiers were. Normally two trunks, one down each side of the ship, probably 500 ft each, with maybe 20-30 taps off each side (small ship...lol).
Hope I helped
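To make the "higher taps near the source" idea concrete, here's a hedged Python sketch. The tap ratings list, the 6dB/100ft cable loss, the 1dB through-loss, the +9dB target, and the +31dB amp level are all illustrative assumptions, not specs from this thread.

```python
# Hypothetical trunk walk: subtract cable loss to each tap, pick the
# standard tap rating that lands the drop closest to the target level,
# then subtract the tap's through-loss and keep going downstream.
STANDARD_TAPS = [3, 7, 11, 14, 17, 20, 23, 26, 30]  # common ratings, dB
INSERTION_LOSS = 1.0        # assumed ~1 dB through-loss per tap
CABLE_DB_PER_100FT = 6.0    # assumed cable loss
TARGET_DROP_DB = 9.0        # assumed level wanted at each drop

def choose_tap(trunk_level):
    """Pick the standard tap whose drop level is closest to the target."""
    return min(STANDARD_TAPS,
               key=lambda t: abs((trunk_level - t) - TARGET_DROP_DB))

level = 31.0                              # assumed level leaving the amp
for n, segment_ft in enumerate((100, 50, 50), start=1):
    level -= CABLE_DB_PER_100FT * segment_ft / 100   # cable to this tap
    tap = choose_tap(level)
    print(f"tap {n}: trunk {level:+.1f} dB -> {tap} dB tap, "
          f"drop {level - tap:+.1f} dB")
    level -= INSERTION_LOSS   # through-loss before continuing downstream
```

Running it picks 17dB, 11dB, and 7dB taps in order, i.e., the attenuation values fall as you move away from the source, exactly as described above.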

A broadband video system is designed by starting at the end of the line and working backwards.
Different sizes of cable (RG59, RG6, RG11, etc.) attenuate the signal at different rates. The loss is specified in dB per hundred feet.
Installing a tap in the line at any point causes an insertion loss of roughly 1dB between its input and output. Insert ten taps right next to each other and you'll see a difference of 10dB between the input of the first and the output of the last. This has nothing to do with the dB rating of the tap or the number of TV drops on it. (Or whether there are any TVs connected to it.) Input-to-output is simply a pass-through.
The ratings of taps range from 3dB to 30dB. This is the attenuation that will be found between the input side of the tap and each port on the tap (plus 1dB insertion loss).
In terms of dB, it's simple addition and subtraction.
In the example of the school hallway, we're told that we need 12 TVs on three four-port taps, the hallway is 100 feet long, and it's 100 feet from the beginning of the line (the head end) to the first tap. The cable loss is 6dB per 100 feet. Each TV needs a signal of +9dB, but we'll ignore that for now.
200 feet of cable = 12dB loss. Three taps = 3dB loss. There will be a total loss of 15dB at the end of the line. The smallest 4-port tap available is a 7dB, so we need to have a total of +22dB at the head end to overcome all of the losses at the end of the line.
This is accomplished by inserting an amplifier at the head end with enough power to make up the difference: +22dB.
Now that the EOL is done, we work backwards to determine the ratings of the taps at 150 and 100 feet.
At 150 feet, we have a difference of one tap and 50 feet of cable = 4dB less attenuation. We need a bigger tap value to match the tap at the EOL: 4dB + the EOL tap of 7dB = 11dB.
At 100 feet we have a difference of two taps (2dB) and 100 feet of cable (6dB) = 8dB. 8dB + the 7dB tap at the EOL equals 15dB. Our choices for tap ratings are 14dB or 17dB; 14 is closer.
To this point we haven't addressed the +9dB we need at each TV. All we've done is balance the system for the insertion losses. At this point it's a simple matter of increasing the signal by 9dB at the input or the output of the amp (or both).
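The arithmetic in this example can be replayed end to end. Here's a short Python sketch that just re-computes the numbers above (6dB/100ft cable, 1dB insertion loss per tap, 7dB EOL tap); nothing in it goes beyond the example itself.

```python
# Replay of the hallway example: taps at 100, 150, and 200 ft from the
# head end, with the last (EOL) tap rated 7 dB.
CABLE_DB_PER_100FT = 6.0
TAP_INSERTION_DB = 1.0

# Loss from the head end to the end of the line:
eol_cable_loss = CABLE_DB_PER_100FT * 200 / 100        # 12 dB
eol_insertion = TAP_INSERTION_DB * 3                   # 3 dB, three taps
eol_tap = 7.0                                          # smallest 4-port tap
amp_gain = eol_cable_loss + eol_insertion + eol_tap    # 22 dB to break even
print(f"head-end gain needed: +{amp_gain:.0f} dB")

# Working backwards: each upstream tap should be bigger by the
# attenuation it *doesn't* see (less cable, fewer taps passed through).
for dist, taps_passed in ((150, 2), (100, 1)):
    saved_cable = CABLE_DB_PER_100FT * (200 - dist) / 100
    saved_taps = TAP_INSERTION_DB * (3 - taps_passed)
    ideal = eol_tap + saved_cable + saved_taps
    print(f"tap at {dist} ft: ideal rating {ideal:.0f} dB")
# 150 ft -> 11 dB; 100 ft -> 15 dB (so choose the standard 14 dB tap)
```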
One more thing before I close this book: Cable loss specs include the frequencies at which the attenuation is specified. This is because the signal is attenuated more at high frequencies than it is at low frequencies. To make up for this lop-sided loss, most amplifiers have a slope control which equalizes the signal. Calculating slope over distance gets a bit more complicated, but uses the same methods.
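A hedged illustration of that tilt, using rough RG6-class loss figures; the per-frequency numbers here are typical ballpark values I'm assuming for the sketch, not figures from this thread or any datasheet.

```python
# Cable loss rises with frequency, so a long run arrives "tilted".
# Approximate RG6-class losses, dB per 100 ft (illustrative values):
LOSS_DB_PER_100FT = {55: 1.6, 550: 5.3, 1000: 7.0}

run_ft = 200
for freq_mhz, loss in LOSS_DB_PER_100FT.items():
    print(f"{freq_mhz:>4} MHz: -{loss * run_ft / 100:.1f} dB over {run_ft} ft")

tilt = (LOSS_DB_PER_100FT[1000] - LOSS_DB_PER_100FT[55]) * run_ft / 100
print(f"high/low tilt: {tilt:.1f} dB (what the amp's slope control corrects)")
```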

Thanks Rick, that's very helpful and educational, no pun intended.
Regarding insertion loss, is that spec'd, or does it apply to any device (including a cable-extender through-connector)? For a splitter, is it just the dB rating marked on the device, or is there also a -1dB loss just for having a device in the line?
Also, on the splitter each leg is marked 3.5dB; am I correct in thinking this splitter causes a -7dB loss in total signal attenuation?
Finally, in your example you refer to all three distribution points as taps; is the last device actually a splitter, as your first reply indicated?
Thanks again.

When you split a signal you cut the power in half. 1/2 the power = -3dB.
A two-way splitter therefore splits the signal in half, but it's rated at -3.5dB at each output port because of the insertion loss of the device itself. The port losses are not cumulative: you'll see -3.5dB at each output, not -7dB.
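The -3dB figure falls straight out of the power ratio; here's a quick check in Python. The extra ~0.5dB of device loss is an assumption that gets you from the theoretical split to the -3.5dB port rating.

```python
import math

# Halving the power, expressed in decibels:
half_power_db = 10 * math.log10(0.5)
print(f"half power = {half_power_db:.2f} dB")        # -3.01 dB

device_loss_db = 0.5   # assumed through-loss of the splitter itself
print(f"per-port rating ~ {half_power_db - device_loss_db:.1f} dB")  # -3.5
```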
An F-to-F coupler will have some insertion loss, but it's negligible and can be ignored.
Although a splitter is an end-of-line device, a tap is normally used at the end of the line in applications like the one I described. The output side of the EOL tap is terminated with a 75-ohm resistor to prevent noise from entering the system. This allows some leeway for expanding the system.
A different star distribution method is most commonly used in homes: Everything is split (not tapped) at the demarc where the cable enters the home. This head-end is simply a single splitter with enough ports to handle every TV drop in the house. All TV connections are home-run to the head end.
If a single splitter can't handle it, additional splitters are added in a balanced configuration until all of the TV drops are accommodated.
For example, a typical subscriber drop carries a signal of +10 to +15dB from the cable company. There are 16 TVs in the house. 16-way splitters are rare, so a two-way is used to feed two 8-way splitters. The insertion loss of a two-way is 3.5dB, and the insertion loss of an 8-way is roughly 12dB.
Starting from the +15dB end of that range, the signal from the cable company is attenuated by 15.5dB, leaving -0.5dB at each output of the 8-way splitters. (Remember, the loss is at each port and is not cumulative.)
An amplifier is added before the splitters to make up the difference. That difference also has to make up for the cable loss for the longest run of cable in the house. This usually isn't a major factor because TVs can produce a good picture with between +5dB and +15dB of signal. So the amp needs to make up the loss of the splitters and give each TV a decent signal to work with. A 15dB amplifier will do the job nicely.
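Putting the 16-TV example into numbers: a minimal Python sketch using the +15dB end of the quoted subscriber-drop range, and ignoring cable-run loss as the post suggests.

```python
# Signal budget for the 16-TV star system described above.
INPUT_DB = 15.0          # subscriber drop, top of the +10..+15 dB range
AMP_GAIN_DB = 15.0       # amplifier inserted before the splitters
TWO_WAY_LOSS = 3.5       # dB at each port of the two-way
EIGHT_WAY_LOSS = 12.0    # dB at each port of the 8-way (roughly)

at_each_tv = INPUT_DB + AMP_GAIN_DB - TWO_WAY_LOSS - EIGHT_WAY_LOSS
print(f"level at each TV: {at_each_tv:+.1f} dB")     # +14.5 dB
print("within the +5..+15 dB window a TV wants:", 5 <= at_each_tv <= 15)
```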
This star distribution pattern is not only easier to install, it's a whole lot easier to troubleshoot than a distributed system with taps all over the place.
Going back to your original question, a two-way splitter should feed (1) the cable modem, and (2) the rest of the home's cable system. The amplifier should be after the split on the TV side of the system.
Another vote for the star system: If you ever move the cable modem to another room, you can simply swap that line to the first splitter.
Here is a PDF that has a chart of the losses of splitters. An example of cable attenuation for RG6 is in a chart in this PDF.

Originally Posted by Gunguy45
We had to do this in the Navy when running CATV to the berthing and lounge areas from up in the electronics shop where the VCR's, switches and amplifiers were. Normally 2 trunks, one down each side of the ship, probably 500 ft each,(with maybe 20-30 taps off each side (small ship...lol).
You must have been in the more recent, modern Navy. In the '80s we didn't have 20 TVs on either of my ships. I was an IC. You?

Integrator97...PM'd you





