Had a thought.. 200Mbit couldn't get abused by downloaders!


DigitalShadow
24-07-2009, 13:34
As per the other thread I started the other day, "50Mbit is not enough, discuss... (http://www.cableforum.co.uk/board/12/33653056-50mbit-is-not-enough-discuss.html)", we had some posts where the thought of "downloading the internet" via torrents or newsgroups was floated (more through finger-pointing than actual discussion), and I thought: well, is that actually sustainable?

50Mbit can in theory download about 500GB a day, which is 3500GB a week. At today's prices a 1.5TB HDD can be bought for £95, so every two weeks you would need to spend £475 on hard drives alone; that is roughly £12,350 a year if you wanted to store everything you downloaded.

That might be possible for some people, but when you bring 200Mbit into the picture it becomes a bit more complicated financially, even for those who are well off.

You would need roughly ten 1.5TB drives a week to store it all: £950 a week on hard drives, which is £49,400 a year just on drives.
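
For anyone who wants to check the numbers, here is a rough back-of-the-envelope sketch (Python; the drive price and capacity are just my assumptions above):

def annual_drive_cost(mbit_per_s, drive_tb=1.5, drive_price_gbp=95):
    # Mbit/s -> GB per day (decimal units), then drives and cost per year
    gb_per_day = mbit_per_s / 8 * 86400 / 1000
    drives_per_year = gb_per_day * 365 / 1000 / drive_tb
    return gb_per_day, drives_per_year, drives_per_year * drive_price_gbp

for speed in (50, 200):
    gb_day, drives, cost = annual_drive_cost(speed)
    print(f"{speed}Mbit: ~{gb_day:.0f}GB/day, ~{drives:.0f} drives/year, ~£{cost:,.0f}/year")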

Now that is without taking into account the servers you would need to build to have it online, and also the electricity required.

You would need about 18 servers to house all those hard drives, and that is without any RAID, remember.

It starts to get expensive very quickly and soon becomes impossible. So unless people were to download EVERYTHING and then delete it, a 200Mbit service is simply too fast to archive in full. I don't think offering a 200Mbit service would cause problems for VM, as it couldn't be abused 24/7: people couldn't fund the storage space required.

The only area where they might suffer is uploaders, but with some investment that shouldn't be a problem.

People could only download in fits and starts, as they couldn't afford the hard drives. The service would be better without STM: with STM, people might all try to do their downloading at the same time, whereas without it the fits and starts of downloading would be spread through the day...


As an addendum: part of my job is to deal with the storage and archiving of data in a way that makes it easy to locate every time. To manage 2TB of new data a day, they would need a large RAIDed download volume of about 10TB that multiple download clients would feed, so that films went to one folder, distros to another, music to another, etc. If you tried to sort a 2TB dump by hand daily it wouldn't be possible.
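
To illustrate the kind of sorting I mean, a minimal sketch (Python; the folder names and extension mapping are made up for the example):

import shutil
from pathlib import Path

# Hypothetical mapping of file extensions to archive folders
DESTINATIONS = {
    ".mkv": "films", ".avi": "films",
    ".iso": "distros",
    ".mp3": "music", ".flac": "music",
}

def sort_dump(dump_dir, archive_dir):
    # Move each file in the daily dump into its category folder
    for f in Path(dump_dir).iterdir():
        if f.is_file():
            dest = Path(archive_dir) / DESTINATIONS.get(f.suffix.lower(), "unsorted")
            dest.mkdir(parents=True, exist_ok=True)
            shutil.move(str(f), str(dest / f.name))

# sort_dump("/mnt/downloads", "/mnt/archive")  # example paths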


Discuss...

Sephiroth
24-07-2009, 13:49
I think the issue is that thousands of people per area would each be downloading a fraction of the maximum volumes you were theorising about. So all that multi-TB capacity would be spread across those thousands of people.

But the quantity of electricity consumed in that area would be as you say; the load on that area's network would be as you say.

But the per-person example you were putting forward isn't what happens, IMHO.

DigitalShadow
24-07-2009, 13:52
True, but IMO the load wouldn't be much more than what is currently placed on the 50Mbit network download wise..

Not many people can afford to buy the HDDs required for keeping up with 50Mbit downloads.

Sephiroth
24-07-2009, 14:21
Not sure I agree with you. You can download more films overnight on 50Mb than you can on 20Mb, provided that your area's rate holds up and the other end can serve at 50Mb/s to every taker. That last point will make the difference in the short term, because not every provider wants to shell out unless there's a purpose.

DigitalShadow
24-07-2009, 14:34
How many people do you think could sustain the cost involved in archiving at 50Mbit 24/7? Very few, IMO. So even if they increased their headline speed to 200Mbit, I don't think the move would cause that much of an increase in total throughput on the network. There would be more peaks of high-speed use, correct, but as a trend I don't believe the increase would be fourfold over what VM are dealing with on 50Mbit.

(assuming all 50Mbit customers moved to 200Mbit)

Ignitionnet
24-07-2009, 14:34
The only thing that matters is consumption of bandwidth in peak times unfortunately, which is why STM is present on other tiers to reduce peak load. 100GB downloaded at 3am has a very different effect to 100GB downloaded at 8pm.

With current network technology VM can only deliver 200Mbit to an area; the other, say, 400 customers in that area might get a tad upset if one person is saturating it at peak.

Also remember that if someone is using several TCP connections, such as a 20-thread NNTP download or a heavily seeded torrent, they can take a disproportionate share of the bandwidth.
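
To put some illustrative numbers on that last point (Python; the user counts and area capacity are made up):

# TCP shares bandwidth roughly per connection, not per user
AREA_CAPACITY_MBPS = 200  # assumed shared downstream for the area
users = {"heavy user (20-thread NNTP)": 20, **{f"user {i}": 1 for i in range(1, 11)}}

total_connections = sum(users.values())
for name, conns in users.items():
    share = AREA_CAPACITY_MBPS * conns / total_connections
    print(f"{name}: {conns} connection(s) -> ~{share:.1f}Mbps")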

DigitalShadow
24-07-2009, 14:36
In my opinion, STM on 200Mbit would be suicide. Can you imagine all the downloaders releasing their clients at the end of the STM window? The peak in traffic would swamp the network.

Ignitionnet
24-07-2009, 14:40
Better that than they swamp the network for the hours beforehand though.

A combination of a few chats with persistently heavy users, some limited protocol shaping on upstream only, and per-subscriber shaping during periods of congestion would work well though.

http://www.sandvine.com/news/pr_detail.asp?ID=224 as used by Comcast perhaps.

DigitalShadow
24-07-2009, 14:43
So my cab in the street has roughly 200Mbit of bandwidth available?


On balance, based on current STM, 200Mbit would drop the speed to 50Mbit, not too much of a worry :)

Ignitionnet
24-07-2009, 14:51
So my cab in the street has roughly 200Mbit of bandwidth available?

Nah, check your modem. It is synced to 4 downstream channels, each of which carries 50Mbps after overheads.

The cabinet has no intelligence in it; it's a layer 1 device.

DigitalShadow
24-07-2009, 14:53
What I meant was: how much bandwidth could the cab put through before its link to the head end was swamped...

tweetiepooh
24-07-2009, 14:56
But not every bit you download is kept. True, you'd need massive storage if you downloaded and kept everything, but as speeds increase more folk will simply stream a feed for immediate consumption and not keep a local copy.

Then you could also download stuff to store on removable media. Most standard films, once stripped of extras, will fit inside 4GB, so you can get 10 films onto one 50GB Blu-ray disc. And storage costs will drop with multi-terabyte devices coming in.

Still the general point does stand.

DigitalShadow
24-07-2009, 15:02
I agree, but 200Mbit is just too fast to swamp. I can't even see how it would be possible to swamp the downstream for long periods without a large, empty array of disks. Upstream, yes: there is never enough upstream bandwidth.

If you were to stream a few iPlayer HD streams, download a compressed 1080p movie from a source that could sustain ~180Mbit, and play some games at the same time, you would probably swamp the connection, but for how long? An hour tops!

A 200Mbit service could not be maxed for long periods by "heavy users" as it would just be too much data.

The current top 5% (or whatever it is) of people who heavily use their 20Mbit connection would struggle to make a dent on a 200Mbit downstream for long periods. So if the network can cope with a large number of people on 50Mbit, it will cope with the same users on 200Mbit.

If they are saying that they require STM on 20Mbit to ensure QoS for everyone, then they would have to place some harsh limits on the STM for 200Mbit, as a 200Mbit user with STM in place would still be downloading faster than a 20Mbit user without it.

I believe bonded channels are lifting some pressure from the "local network", but still: what VM need to make sure of is that they can cope with 50Mbit being heavily used before they release 200Mbit. Otherwise the STM values for 200Mbit will have to be really harsh in comparison: not a cut of 75% of the headline speed, but more like a cut of 85 to 90% (if they continue with their current 5-hour-block STM technique).
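
Rough numbers to show what I mean (Python, purely illustrative):

# How big an STM cut would 200Mbit need before a throttled user is no
# faster than an unthrottled 20Mbit customer?
headline = 200
for cut in (0.75, 0.85, 0.90):
    print(f"cut of {cut:.0%} -> throttled speed {headline * (1 - cut):.0f}Mbit")
# A 75% cut still leaves 50Mbit; even an 85% cut leaves 30Mbit. Only a 90%
# cut brings a 200Mbit user down to the 20Mbit tier's full speed.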

Ignitionnet
24-07-2009, 15:10
What I meant was: how much bandwidth could the cab put through before its link to the head end was swamped...

That's an arbitrary figure, as the link is a two-way RF link over optics on the HFC network.

The DOCSIS 3 net as built feeds each area with 200Mbps downstream.

200Mbps as a headline speed is a while away yet mind you.

DigitalShadow
24-07-2009, 16:32
Going back to what I was saying about a home user needing some good infrastructure to cope with archiving at that kind of speed, here is a screenshot taken from one of the servers when it was receiving a backup from another server. I am a true geek and can literally sit and watch the switches and routers flashing away when the servers are doing a backup routine.

http://www.digitalshadow.co.uk/wpimages/storagemaxed.jpg

caph
24-07-2009, 18:08
Don't forget data deduplication. You'll need to revise your cost estimates down tenfold if you take into account this technology. It's still pricey at the moment, but it's getting cheaper all the time, and by the time we're all on 200Mbit it could well be common for homes to have a data-deduplicating storage device in the house.

As broadband speeds go up so HDD costs fall and compression techniques improve. Nothing stays static in this game.

DigitalShadow
24-07-2009, 18:11
I agree completely; however, it was just a rough model for the costs.

And regarding data deduplication: as long as they didn't download the same thing twice, it would be of no use.

caph
24-07-2009, 18:21
It's a lot more intelligent than that these days!

We use a Quantum DXi 3500 at work, backing up mainly compressed data, and we currently get about 10 terabytes of data on a 1 terabyte disk.

It truly is a magical technology!!!

DigitalShadow
24-07-2009, 18:30
I would have thought that it is based around the idea that if the hash or CRC of a file matches another's, it creates a log of the duplicates, and Bob's your uncle, etc. Perhaps some compression thrown in for good measure.

I could see how there would be a lot of similar data in an office environment; I deal with a lot of these backups and I know it to be true.

However, media material is surely vastly different, with no common ground to be found, and as such the technology would yield little saving?

Please correct me if I am wrong as I love to learn, but I can't see how it could even get a 2x saving on media files and rar archives, let alone a 10x saving.

caph
24-07-2009, 18:36
You may well be right, but as I understand it the deduplication happens at the block level on the disk and is independent of the source file. In essence, the more you chuck at it, the higher the deduplication chance. However, as you say, we don't back up a lot of media, so I don't really know how that would affect things.
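
A minimal sketch of the block-level idea (Python; fixed-size blocks and SHA-256 are just one simple way to do it, real appliances are far cleverer):

import hashlib, os

BLOCK_SIZE = 4096  # real appliances tune this carefully

def deduplicate(data):
    # Store each distinct block once; keep a list of hashes to rebuild the data
    store, recipe = {}, []
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)  # only the first copy is kept
        recipe.append(digest)
    return store, recipe

backup_like = b"office document\n" * 12800  # repetitive, like nightly backups
media_like = os.urandom(200_000)            # effectively random, like video/rar
for name, data in (("backup-like", backup_like), ("media-like", media_like)):
    store, recipe = deduplicate(data)
    print(f"{name}: {len(recipe)} blocks -> {len(store)} stored")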

DigitalShadow
24-07-2009, 18:41
I'm interested now in how the technology would react to media files and rar archives etc...

I might contact some of the leading companies in this area and see what they have to say regarding expected space saving.

If anyone has any knowledge in this area please shed some light.

RyanB
24-07-2009, 18:45
When it comes out in a few years' time, I guess you wouldn't need to store it, as you could just go back online and download it again in the same amount of time...

An example would be an HD film... 10GB? 12GB? Providing the server held up at full speed on 200Mbit, you could just download the file again...

On the other hand, HDDs have significantly increased in size year on year, so in 2 years' time we could have 5TB or bigger drives for £100... (still doesn't really solve the OP's point about costs... :) )

DigitalShadow
24-07-2009, 18:53
My point was really that 200Mbit can't be maxed 24/7 like 20Mbit can, or maybe even 50Mbit could.

So a 200Mbit rollout wouldn't require 4x the bandwidth available.

I know I'm putting it simply, but you get the idea.

Zaim7890
24-07-2009, 19:09
As per the other thread I started the other day, "50Mbit is not enough, discuss... (http://www.cableforum.co.uk/board/12/33653056-50mbit-is-not-enough-discuss.html)", we had some posts where the thought of "downloading the internet" via torrents or newsgroups was floated (more through finger-pointing than actual discussion), and I thought: well, is that actually sustainable?

50Mbit can in theory download about 500GB a day, which is 3500GB a week. At today's prices a 1.5TB HDD can be bought for £95, so every two weeks you would need to spend £475 on hard drives alone; that is roughly £12,350 a year if you wanted to store everything you downloaded.


http://www.ebuyer.com/product/166989

£80 for 1.5TB; prices will soon drop as bigger HDDs come out.

DigitalShadow
24-07-2009, 19:16
Sorry, I based my prices on an HDD that I would buy.

I've got nearly 200 Seagates and not had a single failure.

So my prices were based on ST31500341AS

After checking Ebuyer (http://www.ebuyer.com/product/149459), the hard drive is priced at £89.69.

spiderplant
24-07-2009, 19:31
Please correct me if I am wrong as I love to learn, but I can't see how it could even get a 2x saving on media files and rar archives, let alone a 10x saving.
You are correct. Data that is already compressed can't be compressed much (or any) more. Unless you want to get into lossy compression. (Download HD, transcode to SD, store... :erm: )
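
A quick way to see this for yourself (Python's zlib; random bytes standing in for already-encoded media):

import os, zlib

random_like = os.urandom(1_000_000)      # stands in for H.264 video or a rar
repetitive = b"the same frame again. " * 50_000

for name, data in (("random-like", random_like), ("repetitive", repetitive)):
    out = zlib.compress(data, 9)
    print(f"{name}: {len(data):,} -> {len(out):,} bytes")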

My point was really that 200Mbit can't be maxed 24/7 like 20Mbit can, or maybe even 50Mbit could.
As you've shown, it can't realistically be maxed if you are storing it, but it can if you are consuming it in real time. Maybe streaming ultra hi-def, or filtering the data in real time? (You already have around 38 streams of 38Mbps each coming into your home that your STB filters in real-time)

I'd love someone to come up with a killer application that needs this kind of bandwidth.

DigitalShadow
24-07-2009, 19:36
The problem with such large streams then lies with the content providers. If we were all able to stream a 20Mbit high-def stream, imagine the load on the servers.

Ignitionnet
24-07-2009, 19:56
The problem with such large streams then lies with the content providers. If we were all able to stream a 20Mbit high-def stream, imagine the load on the servers.

That is why the Gods of the Internet came up with CDNs (http://en.wikipedia.org/wiki/Content_delivery_network) :)

DigitalShadow
24-07-2009, 20:18
I do agree with you, but even so, as speeds and bitrates increase, the loads placed on these will increase. I suppose they could increase the number of servers at each exchange/head end, but the potential load placed on them in a city would still be really amazing.

Ignitionnet
24-07-2009, 21:28
I do agree with you, but even so, as speeds and bitrates increase, the loads placed on these will increase. I suppose they could increase the number of servers at each exchange/head end, but the potential load placed on them in a city would still be really amazing.

True, but Akamai have simply grown and grown. ISPs host Akamai because it saves them a ton of transit and peering capacity, Akamai make money delivering content at high quality, and customers get a better experience thanks to the servers being closer. Everyone wins.

DigitalShadow
24-07-2009, 21:46
Well roll on 30Mbit HD Movie streams :)

I presume iPlayer uses similar technology...

Ignitionnet
24-07-2009, 22:35
30Mbit movie streams are a long way away yet, unfortunately. With BT Openretch only offering 'up to' 40Mbps to 40% of the country, and VM's own services available to 50% or so (total coverage around 60%), it'll be a little while.

Streams of that quality aren't going to be free, I don't think; the bandwidth usage is too high. iPlayer actually has its own server farm; the Beeb only use CDNs when they need to take load off their own servers. They use Akamai, actually :)

RyanB
25-07-2009, 06:59
I was highly disappointed when I saw my first server farm... I expected to see lots of magic and gizmos, but all I saw was a big cabinet with a few flashing lights... Highly disappointed... anyways...

IF the servers could take it and everyone was on 200meg, then surely the optics would get saturated?

Ignitionnet
25-07-2009, 09:06
I was highly disappointed when I saw my first server farm... I expected to see lots of magic and gizmos, but all I saw was a big cabinet with a few flashing lights... Highly disappointed... anyways...

IF the servers could take it and everyone was on 200meg, then surely the optics would get saturated?

If you mean the optics in the core networks, not really; you can shove 1.6Tbps down a single fibre pair now.

The local bits of VM's network would be saturated by a single person on 200Mbps right now.

bubblegun
25-07-2009, 10:29
you can shove 1.6Tbps down a single fibre pair now

Wow, that's quite a lot I guess. So how many fibre pairs have they got?

Ignitionnet
25-07-2009, 10:55
Wow, that's quite a lot I guess. So how many fibre pairs have they got?

I had it written on the back of a ciggie packet somewhere, right next to who actually shot JFK. :p:

supered
25-07-2009, 12:46
The only thing that matters is consumption of bandwidth in peak times unfortunately, which is why STM is present on other tiers to reduce peak load. 100GB downloaded at 3am has a very different effect to 100GB downloaded at 8pm.

With current network technology VM can only deliver 200Mbit to an area; the other, say, 400 customers in that area might get a tad upset if one person is saturating it at peak.

Also remember that if someone is using several TCP connections, such as a 20-thread NNTP download or a heavily seeded torrent, they can take a disproportionate share of the bandwidth.

It seems that for the internet to progress it is going to need massive investment, government money I think, because it must be able to deliver the speeds when people want to use the net. Telling someone they can use the net as much as they want at 2am is useless when you sleep at night. After all, you wouldn't expect the electric company to cut you off in the evening because you had used too much electricity in the previous few hours; they simply increase the capacity to deal with peak loads, and the internet must do the same.
The cost would be high; that's why I think the government is going to have to pay for the investment.

digitalspace
25-07-2009, 13:02
How does it all work, Broadbandings? I take it there's a fibre link that comes from a headend somewhere (where would this be? One central location, or somewhere quite local?) straight to the green cab in the street, which serves a few houses down my road? From what I can tell, there are three green cabinets on my road, serving probably 120 homes.

Does anyone have any pictures of the inside of a cab?

Ignitionnet
25-07-2009, 13:24
It seems that for the internet to progress it is going to need massive investment, government money I think, because it must be able to deliver the speeds when people want to use the net. Telling someone they can use the net as much as they want at 2am is useless when you sleep at night. After all, you wouldn't expect the electric company to cut you off in the evening because you had used too much electricity in the previous few hours; they simply increase the capacity to deal with peak loads, and the internet must do the same.
The cost would be high; that's why I think the government is going to have to pay for the investment.

The electricity company and what they do is irrelevant. You pay them for what you use, so it's in their interest to ensure you always have capacity. With VM cable, for example, you pay a fixed charge whether you use a few gig a month or a few gig an hour.

If you have no problems with your taxes being used so people can download and stream porn faster, that's your prerogative. I am violently opposed to paying 50p a month to increase the spread of next-generation services, and think that the national government should keep its nose out; people who want these services should put their hands in their pockets and pay up. My taxes are for things like a social security safety net, health care and transport infrastructure; they are not there to ensure that people can connect to the internet at higher speeds. Connecting to it at all, fine, but bumping people up from 2Mbit to 40Mbit... if they want it that badly they can pay for it, or alternatively move somewhere like here and pay through the nose for everything while sucking up pollution.

If local councils want their populations to have the services that badly, that's a very different matter; but certainly nationally the government has more to worry about than deploying its own infrastructure. If they were to stop taxing fibre and make it so you don't have to fellate half the country just to be allowed to dig, private companies would be more inclined to deploy the infrastructure themselves.

caph
25-07-2009, 13:25
You are correct. Data that is already compressed can't be compressed much (or any) more. Unless you want to get into lossy compression. (Download HD, transcode to SD, store... :erm: )


You are quite right, but I wasn't talking about compression; I was talking about block-level data deduplication, which is a completely different technology, still in its infancy. Get the block sizes right and you can achieve a 90% saving in disk space.

spiderplant
25-07-2009, 13:53
You are quite right, but I wasn't talking about compression; I was talking about block-level data deduplication, which is a completely different technology, still in its infancy. Get the block sizes right and you can achieve a 90% saving in disk space.
Data block deduplication works great for things like regular backups, where there are many similar blocks over time, but encoded media files are effectively random data and there is no duplication to deduplicate.

DigitalShadow
24-12-2009, 14:16
https://www.cableforum.co.uk/images/local/2009/12/8.png (http://www.speedtest.net)

Ha, something must have gone wrong somewhere, although it would be nice if it were real

:)

Welshchris
24-12-2009, 14:33
Try emptying your browser's cache and re-running it.