September 19, 2020

How Much Bandwidth is Enough?

It is a matter of faith among infotech experts that (1) the supply of computing and communications will increase rapidly according to Moore’s Law, and (2) the demand for that capacity will grow roughly as fast. This mutual escalation of supply and demand causes the rapid change we see in the industry.

It seems to be a law of physics that Moore’s Law must terminate eventually – there are fundamental physical limits to how much information can be stored, or how much computing accomplished in a second, within a fixed volume of space. But these hard limits may be a long way off, so it seems safe to assume that Moore’s Law will keep operating for many more cycles, as long as there is demand for ever-greater capacity.

Thus far, whenever more capacity comes along, new applications are invented (or made practical) to use it. But will this go on forever, or is there a point of diminishing returns where more capacity doesn’t improve the user’s happiness?

Consider the broadband link going into a typical home. Certainly today’s homeowner wants more bandwidth, or at least can put more bandwidth to use if it is provided. But at some point there is enough bandwidth to download any reasonable webpage or program in a split second, or to provide real-time ultra-high-def video streams to every member of the household. When that day comes, do home users actually benefit from having fatter pipes?

There is a plausible argument that a limit exists. The human sensory system has limited (though very high) bandwidth, so it doesn’t make sense to direct more than a certain number of bits per second at the user. At some point, your 3-D immersive stereo video has such high resolution that nobody will notice any improvement. The other senses have similar limits, so at some point you have enough bandwidth to saturate the senses of everybody in the home. You might want to send information to devices in the home; but how far can that grow?

Such questions may not matter quite yet, but they will matter a great deal someday. The structure of the technology industries, not to mention technology policies, is built around the idea that people will keep demanding more-more-more and the industry will be kept busy providing it.

My gut feeling is that we’ll eventually hit the point of diminishing returns, but it is a long way off. And I suspect we’ll hit the bandwidth limit before we hit the computation and storage limits. I am far from certain about this. What do you think?

(This post was inspired by a conversation with Tim Brown.)

Comments

  1. This gets discussed a bit. My own guess for the bandwidth of the optic nerve is about 8 gigabits per second. It contains a bit over 1 million fibers, each firing in a range from about 2 Hz to 100 Hz. If only the frequency matters, the bandwidth is actually a lot less.

    This is of course a very specialized bandwidth, with most detail in the fovea, and with some motion detection and other processing already done. In theory, however, it should be possible someday to produce systems that map from real visual inputs to that, and compress the data to match the entire visual field.

    The other senses are less bandwidth than the eye by a good margin, I believe.

    If you don’t want to compress, the eye has a resolution of about 1 minute of arc. That means 311 million eye-pixels in a full sphere, or call it a billion if you prefer to have sub-minute resolution. With 32 bits per pixel, and two eyepoints, and 60 frames/second, that’s two terabits. But you can easily compress to vastly smaller than that.
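
    The arithmetic in that last paragraph can be checked with a short sketch (Python here; every input is the commenter's assumed figure, not a measurement):

    ```python
    # Uncompressed full-sphere video bit rate under the comment's assumptions.
    BITS_PER_PIXEL = 32
    EYEPOINTS = 2
    FRAMES_PER_SECOND = 60

    def raw_bits_per_second(pixels):
        """Raw (uncompressed) bit rate for a given eye-pixel count."""
        return pixels * BITS_PER_PIXEL * EYEPOINTS * FRAMES_PER_SECOND

    low = raw_bits_per_second(311e6)   # ~1 arc-minute resolution
    high = raw_bits_per_second(1e9)    # sub-minute resolution

    print(f"{low / 1e12:.2f} Tbit/s")   # 1.19 Tbit/s
    print(f"{high / 1e12:.2f} Tbit/s")  # 3.84 Tbit/s
    ```

    The quoted "two terabits" sits inside this one-to-four-terabit range, before any compression.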

  2. I think you’re looking at the bandwidth issue from the wrong direction: while there may be a realistically reachable cap on how much data you might want to transfer into your home at one time, we are so far from reaching the point where there is sufficient bandwidth to transfer data out of your home that I don’t think it’s even a point of conversation. Reaching that point requires that each individual user have enough bandwidth to support a good chunk of the rest of the world wanting to pull several hours of the highest-quality recording of reality available simultaneously — or at the very least, to be able to be the primary BitTorrent seed without impacting local latencies.

    That’s still a lot of bandwidth. Most corporations can’t pull it off today. The White House got clobbered just posting a relatively small document during the Clinton impeachment.

  3. Brad,

    Also, consider that you may not want to constrain your input bandwidth to what the human body can perceive, but what your computer can analyze for you, reducing the portion of interest into the highest quality the body can perceive. A batch of three-dimensional data that can be zoomed in upon for greater detail will consume far greater bandwidth than that, and if it is changing in real time, it must be transmitted in real time. (As I understand it, something similar to this is already in use on Internet2 links between universities, though obviously not at anywhere near the quality we’re talking about.)

  4. The bandwidth isn’t limited by the human’s ability to ingest it. No doubt there will be zillions of agents doing the bidding of the humans who happen to reside in the house, scouring the nets, searching through the ever-expanding pile of crap that hides what the humans will deem worthy of interacting with – all kinds of stuff. Maybe most of this will be offloaded to Google, Inc. Maybe it will continuously slosh between client and server, as it always has in the past. The point is, looking at it from a human sensory POV seems to miss the fact that most consumers of bandwidth will likely not be human.

  5. To expand upon the computational-analysis point of view, one possibility for the future is people doing more data mining of the web, or perhaps one day even the semantic web.

    And as people put more information on the web, there is more information to be searched. Imagine a program that filters each picture or video returned by a Google search to see which ones match an image the user has supplied. Where the processing gets done determines where the bandwidth is needed.

  6. Carl Witty says:

    Your computations based on human perception assume non-interactive usage (watching a movie). For interactive usage (playing a video game, exploring a virtual world), if the remote server does not want to do the rendering, then the local machine needs enough information to render a scene. If the remote server is (for instance) 200 milliseconds away (round-trip), then I believe rendering must be local (I think a 200ms control lag is unacceptable), and the local machine must be fed enough information to be able to render anything the user might do in the next 200ms; this could be significantly more information than a raw video stream would require.

  7. There is a limit to how much bandwidth a human can consume directly, and I’m sure we will reach that limit in the near future.
    Your problem is that you are limiting bandwidth usage and needs to what a human can consume. As more bandwidth becomes available, more and more appliances, machinery, and devices will use that bandwidth. So the limit isn’t what a human can consume, but what a machine can consume. I’m currently working on a project to add real-time monitoring to receipt printers at banks and stores. There isn’t a lot of data being passed on that pipe, but that is just one device on your connection. Soon every device we own will be capable of real-time monitoring and updating.
    That doesn’t take into account all the devices that will continuously download, filter, and organize data streams for us: data the human consumer may never look at but wants available, organized and filtered, at a moment’s notice.
    Bandwidth needs in the future will not be limited by how much a human can consume, but by how much the machines the human uses (whether to consume media or simply to run the house) can consume.

  8. There are two reasons I think that a normal person’s home bandwidth needs could exceed the theoretical maximum presented by the human sensory system:

    – Latency. Modern network links already achieve within a factor of two of the theoretical minimum latency determined by the speed of light, with a major exception for many common last-mile technologies. Absent a major revolution in physics, there is no foreseeable way to get a serious improvement here. Interactive content from distant servers will be noticeably laggy, and one way to fix this is to send much more data than is necessary, so that the local processor can send the human the part that he’s interested in at the moment.

    – Caching. Unless your human-sensory-level bandwidth is everywhere, people are going to want to save things to portable devices. If I have just enough bandwidth to support full sensory perception in real time, then that means it will take approximately two hours to download a new movie to my iPod 3000. If I’m leaving for work in five minutes but I suddenly decide I want to grab a copy of Star Trek Nineteen to watch on the bus, then it’s too late. I need a connection about twenty five times faster to be able to pull this off.
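
    The "twenty five times" figure is easy to sanity-check: on a link running at exactly real-time sensory rate, a two-hour movie takes two hours to transfer, so fitting it into a five-minute window needs the ratio of the two durations (a sketch using the commenter's hypothetical numbers):

    ```python
    # Speed-up needed to grab a real-time-rate movie before leaving the house.
    movie_minutes = 2 * 60   # two-hour movie at full sensory bit rate
    window_minutes = 5       # minutes before the bus leaves

    speedup = movie_minutes / window_minutes
    print(speedup)  # 24.0 -- roughly the "twenty five times" above
    ```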

  9. Lawrence D'Oliveiro says:

    One obvious future bandwidth-intensive application is 3D printing. Downloading the plans for fabricating parts could take quite a lot of bandwidth.

    Personally I don’t see any end to the demand for bandwidth until we can do direct matter transmission.

  10. I doubt it, simply because in order for bandwidth to surpass human need first it would have to surpass our present technology. The bandwidth needed to stream high definition video is still beyond us, and by the time we acquire it, we’ll have larger screens with higher resolutions.

    Our technology has always been able to do more locally, so even when I’m having Star Wars/Trek style conversations with a life-size high resolution hologram of my friend, there will probably be things I can do locally that consume too much bandwidth to stream from another location.

    Considering that we like to fill our RAM, hard drives, and video capacity with more and more demanding programs, I wouldn’t expect this trend to change.

  11. Scott Rose says:

    We reached the point of diminishing returns as soon as, or even before, we moved past 110-baud teletypes, the minimally useful rate for written information. Surely each incremental B/s brings us less value than those first dozen B/s. That’s not really the interesting question; what’s interesting is “will we reach the point where we won’t pay any more for another increment of bandwidth, because we can’t find a use for it justified by the incremental cost?” And if so, will that condition persist?

    I can’t see that the bandwidth of a human being’s sensory apparatus is key here. Already our TiVos are recording content in anticipation that we might eventually choose to watch it; if we had the storage capacity and processing power, some of us would choose to record all the content that’s on the cable. Just in case.

    Presumably there will be applications for downloading massive data sets in anticipation of the possibility that we might eventually want to analyze them. All the video from all the nightclub webcams, so that, armed with facial recognition software, we can see which ones by-then-creaky old Paris Hilton graced last Tuesday. If it cost less than a nickel, surely somebody would want that sort of data. If we still have nickels then.

  12. Brask Mumei says:

    The real bandwidth demand of the household is sufficient bandwidth that one can access remote files with the same speed as local files. In other words, I want to be able to throw out my terabyte disk array in exchange for remote storage.

    Since I am quite content with using Gigabit to access my file system, I suspect that for now that is a good guess for where I’d stop looking for the next increment of bandwidth.

    I personally doubt we’ll hit the bandwidth wall before the CPU or storage wall. The reason is simple: upgrading CPUs and storage is cheap and easy – you only have to pay for the new hardware. Upgrading bandwidth is far more expensive – you need to dig, lay new cable, get rights-of-way, and so on. For example, I am already at the bandwidth limit in my house – I don’t have any desire to look into 10 Gigabit. I am nowhere near that to the rest of the ’net, however.

  13. Lior Silberman says:

    The best bound on the amount of information that can be stored in a volume of space (this also depends on the amount of energy there) is so absurdly large as to be irrelevant to this discussion (at least for the coming century). The same holds for physics-based limits on bandwidth.

    As Mr. Mumei points out above, however, there’s a fundamental difference between Moore’s law for hardware and for bandwidth. The infrastructure costs of delivering high bandwidth are very different from those of delivering bigger and faster CPU or RAM chips. In particular, I think the R&D aspect (and, to a lesser extent, factory construction) dominates infrastructure costs for hardware, while legal and political concerns have a large effect on the costs of providing bandwidth.

    Since customers buy their computers individually, but (in the US) obtain bandwidth from highly regulated conglomerates, the two futures will be very different. Whether Intel will design a next-gen chip is mostly a question of market profit. Whether Comcast will lay next-gen fiber optics depends on Congress, the FCC, and thousands of municipalities.

  14. We used to have these discussions years ago at Bell Labs – depending on models of how good the human eye is and what fraction of a sphere you need to fill (there are latency issues based on point of view changes) the video numbers ranged from a few hundred Gbps to 1 or 2 Tbps. The audio problem is serious, but we assumed a few dozen channels should be good enough for a reasonable approximation to the original sound field (you also hit the limit of how many transducers you can place in a room). The other three senses were considered lower rate as we didn’t understand them (other than sensor density and rough number of bps required for each sensor)…

    I suspect you are right … there will be constant growth. We already have 4K cameras and projectors, and 16K has been shown experimentally.

    The amount required to trigger the mind and allow a story to proceed is much lower. A good book or storyteller (say, Garrison Keillor over the radio) is at least as rich as the 4K experimental movies I’ve seen.

  15. We’re talking bandwidth to the home. If that’s scarce, the way you do this is you put processors at the network nexus and do rendering there. That nexus is only a few milliseconds latency from the home, and we presume it has the real bandwidth. To the home you only need what a person needs. You only have to render and transmit what the person is looking at, so you only need the bandwidth of the human nervous system.

    There will be thick bundles of fiber running between the network nexus points, each fiber today already capable of terabits, who knows what in the future.

    If it’s not hard to get bandwidth to the home and have rendering processes in the home, then go for it. If, however, that first mile is the chokepoint, then you should not need more than the bandwidth of the human sensory nervous system coming in.

    Going out could be more complex. As noted, for minimal bandwidth you only render and transmit what’s at the focus of human attention as long as latency is low. To remote receivers, however, you have to transmit anything they might possibly look at (which is not everything) and that may require more bandwidth if you want a perfect sensory duplication of what’s in my house.

  16. I’ve made video games for a living, and I can tell you directly that there is a huge ocean of possibility in bandwidth that will take decades to drain away.

    Imagine a distributed simulation; bandwidth = shareable state, roughly speaking.

    There is a big market for this kind of system. Killer apps, if you will. The binding constraint is already bandwidth, and this is growing more acute as Moore’s law does its work on the other parts of the system.

  17. Others have made a similar point here, but…

    It’s not just about the bandwidth the user will actually perceive, but also the bandwidth of stuff that they don’t care about, but has to come along with the good stuff.

    I may focus on just one 32K page of a document, but I am forced to retrieve the full document of 700Mb.

  18. Manfred Macx says:

    Surely Classen’s law applies here (see http://www.edn.com/article/CA56722.html, as I haven’t had time to craft the initial Wikipedia entry on this). If Usefulness = log(Technology) [or in this case, log(bandwidth)], then you can never have too much, but you need something like Moore’s law just to get linear improvement in usefulness over time.
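
    Taken at face value, the relation says exponential growth in capacity buys only linear growth in usefulness. A small sketch (assuming, as an illustration, one doubling of bandwidth per generation):

    ```python
    import math

    # Classen's law as stated: usefulness = log(bandwidth).
    # If bandwidth doubles every generation, each generation adds the same
    # constant increment log(2) of usefulness -- linear growth overall.
    bandwidth = 1.0
    gains = []
    for generation in range(5):
        before = math.log(bandwidth)
        bandwidth *= 2               # exponential growth in capacity
        gains.append(math.log(bandwidth) - before)

    print(all(abs(g - math.log(2)) < 1e-12 for g in gains))  # True
    ```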

  19. You can never be too rich or too thin, or have too much bandwidth.

    The problem is that you have to create or provide content at the given bandwidth. It will take an hour to record an hour symphony, days to produce a music video, and a year for a movie. Then if I want to start halfway through, or scan through it, whatever is at the far end of the bandwidth has to be agile enough to shift.

    Consider that we seem to have more than enough electricity to power all the computers and communications – and the appliances – but could put more to use if it were provided, or simply waste it. That won’t limit things.

    Nor will bandwidth per se, but the ultimate computing power will. What if you want to “channel surf” something with thousands of sensory-saturating bandwidth options?

    Finding something in existing DVD collections is still hard, and often faster done manually. And it might still be, even if the data were on a disk farm in a library media room, where there would be no bandwidth limits today.

    Your idea is one of passive consumption. Does the problem change when consumers actively change or adapt what can be provided? When the firehose delivers clay to be modeled?

  20. Spoiler warning…

    Key points of future computing are revealed…

    Sorry… got a bit carried away there 😛

    I believe that it is impractical to have a personal computer processing your data in the future… Take the story behind The Matrix / Neuromancer if you will…

    Our bandwidth need is covered once we have enough bandwidth, computing power, and the ability to interface our senses 100%. This will allow us to move our presence from the real world into the virtual world. Then centralized computing will be far more cost-effective and far cheaper for everyone.

  21. With the ultimate dream of distributed computing, no bandwidth will ever be enough. Users all over the world utilizing the processor power, software, and storage of computers around the world, or of a centralized group of computers, over the Internet will require enormous bandwidth. And to make it truly efficient, it must be about the same as the bus speed of internal computer parts.

  22. I’ve been arguing for a long time that what users want is low bandwidth, high reliability, and low per-month cost. They also want portability and long battery life. In simple terms, the industry is defined by the lowest available cost of a reliable, permanent connection to the Internet (regardless of the speed). We can see that this cost has not dropped much in the last five years and probably won’t drop. However, the cheapest connection that you can buy has been increasing in bandwidth, so a large segment of the market is being expected to buy more bandwidth than it either needs or wants.

    At first glance, connection sharing would be the answer but take a look at any ISP contract and reselling bandwidth is expressly forbidden (the legality of such a restraint of trade is questionable but it seems to stick for the most part).

    As for portability… consider the huge popularity of SMS, especially amongst young people. What is SMS but an ultra-low-bandwidth permanent connection with portability? The actual per-byte cost of the SMS data is high, but the per-month cost is usually low, because you can say a lot in only a few bytes.

    Sure there is an advantage in being able to send photos and sometimes even videos but the advantage over plain text is usually small. Even when you do send photos, you don’t need to send a lot.

    I agree with the people who argue for low latency, but latency and bandwidth are quite different. Satellite networks have high bandwidth and very high latency, while a simple RS-232 cable has low bandwidth but excellent (very low) latency. I also agree that there’s a place for things like remote desktops and similar interactive applications, which require approximately 64 kbit/s to 128 kbit/s before they are usable. Somewhere around this level is the most that I can ever see real justification for.

    As for the cable-TV industry and all the video-on-demand hype… it goes along with the set-top box as something that everyone wants to sell and no one wants to buy. There just isn’t any content out there… the news repeats the same story from five or ten different angles and is honest in none of those iterations… the movies made by Hollywood long ago lost any creativity, and the industry has completely run out of ideas… the mainstream music industry keeps churning out pop clone after pop clone as they find themselves targeting younger and younger audiences, while their market hits total boredom after about two years. We have people telling us that we “must have” high-definition digital television, but the shows on TV don’t even warrant the current analog technology.

    By the way, the same rule is true for desktop computing — 90% of users are perfectly happy buying the lowest cost desktop machine that they can find simply because they don’t need a powerful machine.

  23. There is no reason anyone would want a computer in their home.
    Ken Olson, president, chairman and founder
    of Digital Equipment Corp., 1977.

    We know what has happened for Digital, the former high-flyer, in the last decade, and might be tempted to conclude that it was the result of poor predictions, such as the one above. However, other, more successful, companies have been guided by leaders who were not much better at predicting the future. For example:

    640K ought to be enough for anybody.
    Bill Gates, 1981

    The difficulties of predicting developments in technology have discouraged many.

    I confess that in 1901, I said to my brother Orville that man
    would not fly for fifty years … Ever since, I have distrusted
    myself and avoided all predictions.
    Wilbur Wright, 1908

  24. JOhnson Paul says:

    Please, my systems are too slow in browsing. I want to know how to increase the speed.

    Thanks.

  26. Consider ISPs offering 10-terabit connections for home users.

    Now tell me: on a home user’s PC with a 3.0 GHz dual processor, a 300 GB SATA hard drive, and 5 MB of RAM, what bandwidth can that system handle?

    I don’t know much about hard-drive write speeds, but I’m sure they are not in terabits, and 5 MB of RAM is not capable of handling terabits of incoming bandwidth, so what is the processor alone to do?

    Here’s the answer… “That much bandwidth is needed which can satisfy a machine, not a human.”

    Because humans can think and process in many more ways, and can put many bandwidth-consuming applications to work, whatever bandwidth is available (and whatever machine handles it), humans are going to make it insufficient for their needs.

    Thanks…
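
    The mismatch this comment points at can be made concrete with rough figures: a first-generation SATA interface tops out around 1.5 Gbit/s, so a hypothetical 10 Tbit/s home link would outrun the disk by several thousand times (both numbers here are illustrative assumptions):

    ```python
    # Hypothetical home link vs. local disk interface throughput.
    link_bps = 10e12    # assumed 10 Tbit/s ISP connection
    sata_bps = 1.5e9    # first-generation SATA interface limit (assumed)

    ratio = link_bps / sata_bps
    print(f"link outruns the disk interface by ~{ratio:,.0f}x")
    ```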

  27. There can never be enough bandwidth for me! I can prove it too, since I am getting the maximum bandwidth with my high-speed internet provider.